CN111402348B - Lighting effect forming method and device and rendering engine

Info

Publication number: CN111402348B
Application number: CN201910004659.7A
Authority: CN (China)
Prior art keywords: color, illumination, channel, image, value
Legal status: Active
Other versions: CN111402348A (application publication, in Chinese)
Inventor: 郑宇琦
Assignee: Beijing Baidu Netcom Science and Technology Co Ltd
Filing: application CN201910004659.7A filed by Beijing Baidu Netcom Science and Technology Co Ltd
Publications: CN111402348A (application), CN111402348B (grant)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/001: Texturing; Colouring; Generation of texture or colour
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00: Commerce
    • G06Q 30/02: Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0241: Advertisements
    • G06Q 30/0276: Advertisement creation
    • G06T 7/00: Image analysis
    • G06T 7/90: Determination of colour characteristics
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B 20/00: Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B 20/40: Control techniques providing energy savings, e.g. smart controller or presence detection

Abstract

An embodiment of the invention provides a method and a device for forming an illumination effect, and a rendering engine. The method comprises the following steps: monitoring the scrolling state of an information stream to obtain the position of a cell in the information stream; determining the direction of illumination using the position of the cell in the information stream; determining the color of illumination using the color of an image displayed in the cell; and rendering on the image using the direction and color of the illumination to form an illumination effect. The illumination effect of the embodiment of the invention is related to the position of the cell in the scrolling information stream and to the color of the image included in the cell, so a scrolling illumination effect can be rendered on the image displayed in the information stream, and the illumination effect can differ as the scrolling position of the information stream and the color of the image differ. Thus, the illumination effect can better adapt to the current display content of the information stream.

Description

Lighting effect forming method and device and rendering engine
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a method and an apparatus for forming an illumination effect, and a rendering engine.
Background
A Feed combines several message sources that a user actively subscribes to into a content aggregator, helping the user continuously obtain up-to-date Feed content. Many information applications (apps) push news, advertisements, and other information to users in the form of Feed streams. On portable intelligent terminals such as mobile phones and handheld computers, more and more apps display all kinds of information as Feed streams.
With the demand for innovation in Feed stream advertising, more and more creatives containing special effects and movable elements are being proposed. Implementing the required special effects in a Feed stream is often difficult and requires mathematical and graphics knowledge as support.
At present, illumination effects can be realized on ordinary static images, but there is no good method for realizing illumination effects in a Feed stream.
Disclosure of Invention
The embodiments of the present invention provide a method and a device for forming an illumination effect, and a rendering engine, to solve one or more technical problems in the prior art.
In a first aspect, an embodiment of the present invention provides a method for forming an illumination effect, including:
monitoring the scrolling state of an information stream to obtain the position of a cell in the information stream;
determining the direction of illumination using the position of the cell in the information stream;
determining the color of illumination using the color of an image displayed in the cell; and
rendering on the image using the direction and color of the illumination to form an illumination effect.
In one embodiment, determining the color of illumination using the color of the image displayed in the cell comprises:
obtaining the normalized color values of the image; and
mapping each normalized color value of the image using a first offset and a first slope to obtain the first color values of illumination.
In one embodiment, obtaining the normalized color values of the image includes:
obtaining a histogram of the image using a shader library;
separating the histogram into N dimensions per RGB channel, where each dimension of each channel corresponds to a color value and N is a positive integer, to obtain the number of pixels in the image for each color value of each channel; and
calculating a normalized color value from the color value with the largest pixel count in each channel.
In one embodiment, mapping each normalized color value of the image using a first offset and a first slope to obtain a first color value of illumination includes:
mapping each standardized color value of the image by adopting the formulas 1, 2 and 3 to obtain a first color value of illumination;
r1=k1 maxr+a1 formula 1,
g1 =k1×maxg+a1 formula 2,
b1 =k1×maxb+a1 formula 3,
wherein R1, G1 and B1 respectively represent first color values corresponding to the R channel, the G channel and the B channel, maxR, maxG and maxB respectively represent standardized color values corresponding to the R channel, the G channel and the B channel of the image, k1 represents a first slope, and a1 represents a first offset.
In one embodiment, determining the color of illumination using the color of the image displayed in the cell further comprises:
calculating the average of the normalized color values of the image; and
if the average exceeds a color threshold, mapping each first color value again using the average, the color threshold, a second offset and a second slope to obtain the second color values of illumination.
In one embodiment, this second mapping to obtain the second color values of illumination includes:
mapping using Formulas 4, 5 and 6 to obtain the second color values of illumination:
r2 = r1 × (avgRGB - t) × k2 + a2    (Formula 4)
g2 = g1 × (avgRGB - t) × k2 + a2    (Formula 5)
b2 = b1 × (avgRGB - t) × k2 + a2    (Formula 6)
where r2, g2 and b2 respectively represent the second color values corresponding to the R, G and B channels, r1, g1 and b1 respectively represent the first color values corresponding to the R, G and B channels, avgRGB represents the average of the normalized color values maxR, maxG and maxB, t represents the color threshold, k2 represents the second slope, and a2 represents the second offset.
In a second aspect, an embodiment of the present invention provides an apparatus for forming an illumination effect, including:
a monitoring module, configured to monitor the scrolling state of an information stream to obtain the position of a cell in the information stream;
a direction determination module, configured to determine the direction of illumination using the position of the cell in the information stream;
a color determination module, configured to determine the color of illumination using the color of an image displayed in the cell; and
a rendering module, configured to render on the image using the direction and color of the illumination to form an illumination effect.
In one embodiment, the color determination module includes:
a normalization sub-module, configured to obtain the normalized color values of the image; and
a first mapping sub-module, configured to map each normalized color value of the image using a first offset and a first slope to obtain the first color values of illumination.
In one embodiment, the normalization sub-module is further configured to:
obtain a histogram of the image using a shader library;
separate the histogram into N dimensions per RGB channel, where each dimension of each channel corresponds to a color value and N is a positive integer, to obtain the number of pixels in the image for each color value of each channel; and
calculate a normalized color value from the color value with the largest pixel count in each channel.
In one embodiment, the first mapping sub-module is further configured to map each normalized color value of the image using Formulas 1, 2 and 3 to obtain the first color values of illumination:
r1 = k1 × maxR + a1    (Formula 1)
g1 = k1 × maxG + a1    (Formula 2)
b1 = k1 × maxB + a1    (Formula 3)
where r1, g1 and b1 respectively represent the first color values corresponding to the R, G and B channels, maxR, maxG and maxB respectively represent the normalized color values corresponding to the R, G and B channels of the image, k1 represents the first slope, and a1 represents the first offset.
In one embodiment, the color determination module further comprises:
an average sub-module, configured to calculate the average of the normalized color values of the image; and
a second mapping sub-module, configured to, if the average exceeds a color threshold, map each first color value again using the average, the color threshold, a second offset and a second slope to obtain the second color values of illumination.
In one embodiment, the second mapping sub-module is further configured to map using Formulas 4, 5 and 6 to obtain the second color values of illumination:
r2 = r1 × (avgRGB - t) × k2 + a2    (Formula 4)
g2 = g1 × (avgRGB - t) × k2 + a2    (Formula 5)
b2 = b1 × (avgRGB - t) × k2 + a2    (Formula 6)
where r2, g2 and b2 respectively represent the second color values corresponding to the R, G and B channels, r1, g1 and b1 respectively represent the first color values corresponding to the R, G and B channels, avgRGB represents the average of the normalized color values maxR, maxG and maxB, t represents the color threshold, k2 represents the second slope, and a2 represents the second offset.
In a third aspect, an embodiment of the present invention provides an apparatus for forming an illumination effect. The functions of the apparatus may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above.
In one embodiment, the apparatus structurally includes a processor and a memory, where the memory stores a program that supports the apparatus in executing the above method for forming an illumination effect, and the processor is configured to execute the program stored in the memory. The apparatus may also include a communication interface for communicating with other devices or communication networks.
In a fourth aspect, an embodiment of the present invention provides a rendering engine including any one of the above apparatuses for forming an illumination effect.
In a fifth aspect, an embodiment of the present invention provides a computer-readable storage medium storing computer software instructions used by the apparatus for forming an illumination effect, including a program for executing the above method for forming an illumination effect.
One of the above technical solutions has the following advantage or beneficial effect: the illumination effect is related to the position of the cell in the scrolling state of the information stream and to the color of the image included in the cell; therefore, a scrolling illumination effect can be rendered on the image displayed in the information stream, and the illumination effect may differ with the scrolling position of the information stream and the color of the image. Thus, the illumination effect can better adapt to the current display content of the information stream.
Another technical solution has the following advantage or beneficial effect: the mapping from the colors of the image to the color of illumination is controlled by setting a certain slope, offset and threshold, so the illumination can adapt to the colors of the image. For example, when the image color is below a certain threshold, the illumination color may brighten as the image color brightens, making the illumination effect more noticeable; when the image color exceeds a certain threshold, the illumination color can quickly darken as the image color becomes brighter, to prevent overexposure.
The foregoing summary is for the purpose of the specification only and is not intended to be limiting in any way. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features of the present invention will become apparent by reference to the drawings and the following detailed description.
Drawings
In the drawings, the same reference numerals refer to the same or similar parts or elements throughout the several views unless otherwise specified. The figures are not necessarily drawn to scale. It is appreciated that these drawings depict only some embodiments according to the disclosure and are not therefore to be considered limiting of its scope.
Fig. 1 shows a flowchart of a method of forming a lighting effect according to an embodiment of the present invention.
Fig. 2 shows a flowchart of a method of forming a lighting effect according to an embodiment of the present invention.
Fig. 3 shows a flowchart of a method of forming a lighting effect according to an embodiment of the present invention.
Fig. 4 shows a schematic diagram of position calculation in a method for forming an illumination effect according to an embodiment of the present invention.
Fig. 5 shows a flowchart of an application example of a method of forming a lighting effect according to an embodiment of the present invention.
Fig. 6a, 6b and 6c show effect diagrams of a method of forming an illumination effect according to an embodiment of the present invention.
Fig. 7 shows a block diagram of a structure of an apparatus for forming a lighting effect according to an embodiment of the present invention.
Fig. 8 shows a block diagram of a structure of an apparatus for forming a lighting effect according to an embodiment of the present invention.
FIG. 9 illustrates an example diagram of one rendering cycle in a rendering engine according to an embodiment of the present invention.
FIG. 10 illustrates an internal structural diagram of canvas objects in a rendering engine in accordance with an embodiment of the present invention.
FIG. 11 shows a schematic diagram of a rendering flow of a rendering engine according to an embodiment of the invention.
Fig. 12 shows a block diagram of a structure of an apparatus for forming a lighting effect according to an embodiment of the present invention.
Detailed Description
Hereinafter, only certain exemplary embodiments are briefly described. As will be recognized by those of skill in the pertinent art, the described embodiments may be modified in various different ways without departing from the spirit or scope of the present invention. Accordingly, the drawings and description are to be regarded as illustrative in nature and not as restrictive.
Fig. 1 shows a flowchart of a method of forming a lighting effect according to an embodiment of the present invention. As shown in fig. 1, the method may include:
Step S11, monitoring the scrolling state of the information stream to obtain the position of a cell in the information stream;
Step S12, determining the direction of illumination using the position of the cell in the information stream;
Step S13, determining the color of illumination using the color of the image displayed in the cell;
Step S14, rendering on the image using the direction and color of the illumination to form an illumination effect.
In an information stream such as a Feed stream, a container for displaying information content may be referred to as a cell. A cell can include various forms of multimedia resources such as text, static images, dynamic images, and video.
The user can control the position of the cells of the Feed stream displayed in the display area of the screen. For example, on the touch screen of a mobile phone, sliding a finger upward scrolls the Feed stream upward, and the cells in the Feed stream scroll upward accordingly; sliding a finger downward scrolls the Feed stream downward, and so do the cells in it.
In one example, the position of a cell in the information stream may be calculated using the position of the cell in the display area of the screen. Suppose the position of the display area of the screen is normalized: the point S1 at the bottom of the vertical center line is 0, and the point S2 at the top is 1. Referring to fig. 4, the position of cell A may be calculated from the position of its center point O1. For example, if the length of the line segment from O1 to S1 is 20% of the length of the line segment from S1 to S2, the position of cell A equals 0.2. Likewise, if that proportion is 50%, the position of cell A equals 0.5; if it is 60%, the position equals 0.6.
The values 0, 0.2, 0.6 and 1 in this embodiment are merely examples and are not limiting; in practical applications they may be set as needed. For example, the maximum value of the display area of the screen may be set to 2, 5, 10, 100, or the like, and the value of the actual position of the cell is determined using the proportional position of the cell relative to the screen.
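A minimal Swift sketch of this position calculation, assuming a UIScrollView-based Feed; the function name and coordinate assumptions are illustrative, not the patent's:

```swift
import UIKit

/// Returns the cell's normalized position: 0 when its center point (O1) is at
/// the bottom edge of the visible area (S1), 1 when it is at the top edge (S2).
func normalizedPosition(of cell: UIView, in scrollView: UIScrollView) -> CGFloat {
    // Cell center expressed in the scroll view's content coordinates.
    let center = scrollView.convert(cell.center, from: cell.superview)
    let visibleTop = scrollView.contentOffset.y                // S2 side
    let visibleBottom = visibleTop + scrollView.bounds.height  // S1 side
    // Fraction of the visible height from the bottom edge up to the center.
    return (visibleBottom - center.y) / scrollView.bounds.height
}
```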
In one example, if a scrolling illumination effect needs to be added to a large-image advertisement in a Feed stream, the scrolling state of the Feed stream can be monitored to determine the position of the cell in which the large-image advertisement is located, and the direction of illumination is then determined using that position.
In one example, the position of the cell may have a certain correspondence with the direction of illumination. For example, the direction of illumination is represented by (x, y, z), where z represents the axis perpendicular to the screen, x the axis parallel to the lateral direction of the screen, and y the axis parallel to the longitudinal direction of the screen. Assume the direction of illumination is linear in the position of the cell, and that the z component of the direction points from outside the screen into the screen and remains unchanged. If the Feed stream scrolls left and right, the z and y values of the illumination direction are unchanged and the x value varies linearly with the position of the cell. If the Feed stream scrolls up and down, the z and x values are unchanged and the y value varies linearly with the position of the cell.
In addition, the range of cell positions may be set in advance. For example, when the position of the cell ranges from 0.2 to 0.6, the position value changes linearly with the actual position within this range. When the actual position of the cell is less than the lower limit 0.2, the position value is set equal to the lower limit 0.2; when it is greater than the upper limit 0.6, the position value is set equal to the upper limit 0.6.
In this way, the direction of illumination varies within a certain range. For example, when the position of the cell is between 0.2 and 0.6, the x value (or y value) of the illumination direction changes linearly with the position value. When the position of the cell is less than 0.2, the x value (or y value) is the same as when the position is 0.2; when it is greater than 0.6, the x value (or y value) is the same as when the position is 0.6.
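A minimal sketch of this clamped linear mapping for a vertically scrolling Feed; the output range [-1, 1] is an illustrative assumption, not a value fixed by the patent:

```swift
import simd

/// For a vertically scrolling Feed, z and x stay fixed while y varies
/// linearly with the clamped cell position, as described above.
func lightDirection(forCellPosition position: Float) -> SIMD3<Float> {
    let p = min(max(position, 0.2), 0.6)        // clamp to the preset range
    let y = (p - 0.2) / (0.6 - 0.2) * 2 - 1     // map [0.2, 0.6] onto [-1, 1]
    return SIMD3<Float>(0, y, -1)               // z points into the screen
}
```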
In one example, a title, content, etc. may be displayed in a cell. The title may be in text format; the content may be in the format of a static image, a dynamic image, video, etc. The image currently displayed by the cell may be a static image, a frame of a dynamic image, or a frame of video. The currently displayed image is used to calculate the color of illumination, and the color of illumination can change when the currently displayed image changes.
The embodiment of the invention does not limit the order in which the color and direction of illumination are determined; the order can be set according to actual requirements, as long as both are determined before the rendering command is executed.
In one approach, the direction of illumination may be calculated first and then the color. For example, in the calculation stage of the rendering process, the direction of illumination is calculated first using the position of the cell in the information stream, and the color of illumination is then calculated using the color of the image to be displayed in the cell.
In another approach, the color of illumination may be calculated first and then the direction. For example, after the rendering engine obtains the image to be displayed in a cell, the color of illumination is first calculated from the color of the image in a preparation stage; in the calculation stage of the rendering process, the direction of illumination is then calculated using the position of the cell in the information stream.
In one embodiment, as shown in fig. 2, step S13 includes:
Step S21, obtaining the normalized color values of the image;
Step S22, mapping each normalized color value of the image using a first offset and a first slope to obtain the first color values of illumination.
The color values of the image can be divided into three channels: R (Red), G (Green) and B (Blue). By counting per channel, the distribution of the image's pixels over the color values of each channel can be obtained, and the normalized color values can then be calculated.
In one embodiment, as shown in fig. 3, step S21 includes:
Step S31, obtaining a histogram of the image using a shader library;
Step S32, separating the histogram into N dimensions per RGB channel to obtain the number of pixels in the image for each color value of each channel, where N is a positive integer;
Step S33, calculating a normalized color value from the color value with the largest pixel count in each channel.
For example, the shader library may be MPS (Metal Performance Shaders), a set of graphics processing unit (GPU) shader libraries. MPS includes conventional filter effects such as Gaussian blur, as well as computer-vision-related functions such as image color histograms and edge detection. The Gaussian blur, histogram and other functions of MPS can be encapsulated in the current rendering engine; once the resource-interchange interface between the current rendering engine and MPS is encapsulated, the output of MPS can be used seamlessly in other rendering processes of the engine. Of course, other types of shader libraries may be employed, as long as the histogram of the image can be obtained.
In one example, each channel is separated into 256 dimensions, each dimension representing a color value: the R channel has 256 color values, as do the G and B channels. The number of pixels of each color value included in the image is then counted, and the normalized color value of each channel is calculated from the color value with the largest pixel count in that channel. For example, the color value with the largest pixel count in the R channel may be divided by the dimension N, e.g. 256, to obtain the normalized color value of the R channel, denoted maxR. The calculation of maxG and maxB is similar to that of maxR and is not repeated here.
The above N = 256 is only an example; N may take other values, and the dimensions may be divided according to multiple bytes. For example, with two bytes per channel, N is 256 squared, i.e. 65536. N can be set according to the needs of the practical application scenario.
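Steps S31 to S33 can be sketched on the CPU side as follows, assuming the per-channel histograms (N = 256 bins each) have already been read back from the shader library (for example via MPSImageHistogram) into plain arrays; the function names are illustrative, not the engine's:

```swift
/// Computes maxR, maxG and maxB from per-channel histograms of N bins each.
func normalizedColorValues(histR: [Int], histG: [Int], histB: [Int])
    -> (maxR: Float, maxG: Float, maxB: Float) {
    // Index of the bin (color value) with the largest pixel count,
    // divided by N to normalize it into [0, 1).
    func dominant(_ hist: [Int]) -> Float {
        let idx = hist.indices.max(by: { hist[$0] < hist[$1] }) ?? 0
        return Float(idx) / Float(hist.count)
    }
    return (dominant(histR), dominant(histG), dominant(histB))
}
```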
In one embodiment, step S22 includes:
mapping each normalized color value of the image using Formulas 1, 2 and 3 to obtain the first color values of illumination:
r1 = k1 × maxR + a1    (Formula 1)
g1 = k1 × maxG + a1    (Formula 2)
b1 = k1 × maxB + a1    (Formula 3)
where r1, g1 and b1 respectively represent the first color values corresponding to the R, G and B channels, maxR, maxG and maxB respectively represent the normalized color values corresponding to the R, G and B channels of the image, k1 represents the first slope, and a1 represents the first offset.
Color mapping with the first offset, the first slope and the normalized color values yields an illumination color value for each normalized color value, and prevents the illumination color of dark images from being too dark. For example, if the first slope is positive, the brighter the image color, the brighter the illumination color.
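A minimal sketch of Formulas 1 to 3 (the function name is illustrative):

```swift
/// First mapping (Formulas 1-3): linearly maps the normalized image colors
/// to the first illumination color values with slope k1 and offset a1.
func firstColor(maxR: Float, maxG: Float, maxB: Float,
                k1: Float, a1: Float) -> (r1: Float, g1: Float, b1: Float) {
    (k1 * maxR + a1, k1 * maxG + a1, k1 * maxB + a1)
}
```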
In one embodiment, as shown in fig. 2, step S13 further includes:
Step S23, calculating the average of the normalized color values of the image;
Step S24, if the average exceeds a color threshold, mapping each first color value again using the average, the color threshold, a second offset and a second slope to obtain the second color values of illumination. If the average does not exceed the color threshold, this step may be skipped.
In one embodiment, this second mapping to obtain the second color values of illumination includes:
mapping using Formulas 4, 5 and 6 to obtain the second color values of illumination:
r2 = r1 × (avgRGB - t) × k2 + a2    (Formula 4)
g2 = g1 × (avgRGB - t) × k2 + a2    (Formula 5)
b2 = b1 × (avgRGB - t) × k2 + a2    (Formula 6)
where r2, g2 and b2 respectively represent the second color values corresponding to the R, G and B channels, r1, g1 and b1 respectively represent the first color values corresponding to the R, G and B channels, avgRGB represents the average of the normalized color values maxR, maxG and maxB, t represents the color threshold, k2 represents the second slope, and a2 represents the second offset.
The second color mapping with the second offset, the second slope and the first color values adjusts the part that exceeds the threshold. For example, if the second slope is negative, the brighter the image color, the darker the illumination color; that is, the illumination color values of overly bright images are reduced.
Applying Formulas 4, 5 and 6 to the three first color values r1, g1 and b1 yields the second color values r2, g2 and b2 of the R, G and B channels, which can be used as the final illumination color.
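A minimal sketch of step S24 combining the threshold check with Formulas 4 to 6 (names are illustrative):

```swift
/// Second mapping (Formulas 4-6): darkens the illumination color when the
/// average of the normalized image colors exceeds the threshold t.
func finalColor(r1: Float, g1: Float, b1: Float,
                maxR: Float, maxG: Float, maxB: Float,
                t: Float, k2: Float, a2: Float) -> (r: Float, g: Float, b: Float) {
    let avgRGB = (maxR + maxG + maxB) / 3
    guard avgRGB > t else { return (r1, g1, b1) } // below threshold: keep r1, g1, b1
    let scale = (avgRGB - t) * k2
    return (r1 * scale + a2, g1 * scale + a2, b1 * scale + a2)
}
```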
Then, the illumination effect may be formed on the image in the cell of the information stream using the direction, color, etc. of the illumination. Other rendering parameters, such as material properties like reflection coefficients, can also be incorporated to form the illumination effect.
In one application example, a scrolling illumination effect is implemented in a Feed stream, and creative optimization may be performed for large-image advertisements. The image of the large-image advertisement is first rendered in the Feed stream using a rendering engine, and the illumination effect is then applied on the image. As shown in fig. 5, the application example may include the following steps:
Step S51, monitor the scrolling state of the Feed stream, obtain the position of the advertisement in the Feed stream, and calculate the direction of illumination in real time from the position. Calculating the direction of illumination from the position of the advertisement in the Feed stream achieves the interactive effect of the illumination changing as the Feed stream scrolls.
Step S52, the color of illumination may adapt to the color of the image. When the image color is below a certain threshold, the illumination color brightens as the image brightens; when the image color exceeds a certain threshold, the illumination color quickly darkens as the image becomes brighter, to prevent overexposure.
An example of the adaptive color algorithm is as follows:
i. Obtain the histogram of the image using MPS. Separate the histogram by RGB channel, dividing each channel into 256 dimensions, and obtain the number of pixels of each color value from 0 to 255 in each channel.
ii. Take the color value with the largest pixel count in each channel and divide it by 256 to obtain the 0-1 normalized color value of each channel: maxR, maxG and maxB respectively.
iii. Calculate the RGB average avgRGB = (maxR + maxG + maxB) / 3.
iv. Assume a color offset a1 = 0.3 and a color slope k1 = 0.2, and map the above values with them; the color value of the R channel of the illumination is calculated with the mapping rule r1 = k1 × maxR + a1. The G and B channel mapping rules are similar to R's.
Thus the original 0-1 color values are mapped into 0.3-0.5, preventing the illumination color of dark images from being too dark. Since the slope k1 is positive, the brighter the image color, the brighter the illumination color.
v. Assume a color threshold t = 0.5, a threshold offset a2 = 0.3 and a threshold slope k2 = -1.0.
vi. If avgRGB exceeds the threshold t, perform a color mapping on the excess with the rule r2 = r1 × (avgRGB - t) × k2 + a2; the G and B channel rules are consistent with R's. Substituting the r1 value of each channel into Formula 4 gives the illumination color of each channel after the second mapping. Since the slope k2 is negative, the effect is that the brighter the image color, the darker the illumination color.
If avgRGB does not exceed the threshold t, the second mapping may be skipped.
vii. The mapped r, g, b values can be used as the final illumination color: if avgRGB does not exceed the threshold, the final illumination color is (r1, g1, b1); if it exceeds the threshold, the final illumination color is (r2, g2, b2).
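As a worked example with the constants above (the normalized color values are invented for illustration): suppose maxR = 0.9, maxG = 0.7 and maxB = 0.5, so avgRGB = 0.7 > t = 0.5. The first mapping gives r1 = 0.2 × 0.9 + 0.3 = 0.48; because avgRGB exceeds the threshold, the second mapping gives r2 = 0.48 × (0.7 - 0.5) × (-1.0) + 0.3 = 0.204, noticeably darker than r1, which is exactly the overexposure protection described above.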
Step S53, the engine renders the advertisement image using parameters such as the direction of illumination, the color of illumination and the material, to form the illumination effect.
The illumination effect of the embodiment of the invention is related to the position of the cell in the scrolling state of the information stream, such as a Feed stream, and to the color of the image included in the cell. A scrolling illumination effect can therefore be rendered on the image displayed in the Feed stream, and the illumination effect can differ as the scrolling position of the Feed stream and the color of the image differ. Thus, the illumination effect can better adapt to the current display content of the Feed stream.
In addition, the illumination effect can change in response to the user's operation on the information stream, achieving an interactive effect. In figs. 6a, 6b and 6c, a circular dashed box roughly marks the trend of the illumination effect as a function of the cell position in the Feed stream.
Further, by setting a certain slope, offset and threshold, the mapping from the colors of the image to the color of illumination is controlled, so the illumination can adapt to the colors of the image. For example, when the image color is below a certain threshold, the illumination color may brighten as the image color brightens, making the illumination effect more noticeable; when the image color exceeds a certain threshold, the illumination color can quickly darken as the image color becomes brighter, to prevent overexposure.
Fig. 7 shows a block diagram of a structure of an apparatus for forming a lighting effect according to an embodiment of the present invention. As shown in fig. 7, the apparatus may include:
a monitoring module 61, configured to monitor the scrolling state of an information stream to obtain the position of a cell in the information stream;
a direction determination module 62, configured to determine the direction of illumination using the position of the cell in the information stream;
a color determination module 63, configured to determine the color of illumination using the color of an image displayed in the cell; and
a rendering module 64, configured to render on the image using the direction and color of the illumination to form an illumination effect.
In one embodiment, as shown in fig. 8, the color determination module 63 includes:
a normalization sub-module 631, configured to obtain the normalized color values of the image; and
a first mapping sub-module 632, configured to map each normalized color value of the image using a first offset and a first slope to obtain the first color values of illumination.
In one embodiment, the normalization sub-module 631 is further configured to:
obtain a histogram of the image using a shader library;
separate the histogram into N dimensions per RGB channel, where each dimension of each channel corresponds to a color value and N is a positive integer, to obtain the number of pixels in the image for each color value of each channel; and
calculate a normalized color value from the color value with the largest pixel count in each channel.
In one embodiment, the first mapping sub-module 632 is further configured to map each normalized color value of the image using Formulas 1, 2 and 3 to obtain the first color values of illumination:
r1 = k1 × maxR + a1    (Formula 1)
g1 = k1 × maxG + a1    (Formula 2)
b1 = k1 × maxB + a1    (Formula 3)
where r1, g1 and b1 respectively represent the first color values corresponding to the R, G and B channels, maxR, maxG and maxB respectively represent the normalized color values corresponding to the R, G and B channels of the image, k1 represents the first slope, and a1 represents the first offset.
In one embodiment, the color determination module 63 further includes:
an average sub-module 633, configured to calculate the average of the normalized color values of the image; and
a second mapping sub-module 634, configured to, if the average exceeds a color threshold, map each first color value again using the average, the color threshold, a second offset and a second slope to obtain the second color values of illumination.
In one embodiment, the second mapping sub-module 634 is further configured to map each first color value using Formula 4 to obtain a second color value of illumination:
r2 = r1 × (avgRGB - t) × k2 + a2    (Formula 4)
where r2 represents a second color value of illumination, r1 represents a first color value of illumination, avgRGB represents the average of the normalized color values maxR, maxG and maxB, t represents the color threshold, k2 represents the second slope, and a2 represents the second offset. The G and B channels are mapped in the same way.
The functions of each module in each device of the embodiments of the present invention may be referred to the corresponding descriptions in the above methods, and are not described herein again.
An embodiment of the invention provides a rendering engine including any one of the above apparatuses for forming an illumination effect.
In one application example, a set of graphics rendering engines is developed based on a rendering application programming interface such as Metal. The rendering engine may perform the method of forming an illumination effect of any of the above embodiments. Metal is a low-level rendering application programming interface that provides the lowest layer required by software, ensuring that software can run on different graphics chips. The rendering engine can be applied to iOS devices and is lightweight, easy to adopt, high-performance and multiply instantiable. In addition, the rendering engine can render three-dimensional graphics effects, illumination, etc. in multiple instances in a Feed stream.
In one example, the implementation of the graphics rendering engine essentially comprises:
1. The infrastructure of Metal is managed with a singleton core controller (e.g., VGMetalCore), which also manages the buffer objects (e.g., VGMetalCache objects). Rendering events are driven by the system screen-refresh notification class (e.g., CADisplayLink). For example, CADisplayLink issues one render-driver event per frame, so that canvas objects in the rendering engine draw at the same frequency as the display refreshes the screen.
2. A kernel-function core controller (e.g., VGMetalKernelCore) and an augmented reality core controller (e.g., VGARCore) serve as portals for the system's high-performance shader function tools (e.g., MetalPerformanceShaders) and augmented reality tools (e.g., ARKit).
3. VGMetalCore triggers the rendering events of each canvas object, named VGMetalCanvas, in turn.
4. The stages of a rendering cycle may be seen in FIG. 9. As shown in fig. 9, in one example, three canvas objects (abbreviated as canvases in fig. 9) trigger rendering events serially within one rendering period. The interval from the render-driver event being sent by the system screen-refresh notification class to the end of rendering of all running canvas objects can be regarded as one rendering period. Each canvas object goes through several stages in a rendering process: event dispatch, numerical computation, render preparation, graphics rendering and buffer exchange. In the event dispatch stage, the canvas object listens for the render-driver notification thrown by the singleton core controller; the notification may comprise several character strings indicating that the system screen-refresh notification class has issued a render-driver event. In the numerical computation stage, the effect parameters of each effect of the canvas object may be calculated. In the render preparation stage, GPU resources may be processed. In the graphics rendering stage, render commands may be invoked to complete the rendering of the effects of the graphics to be drawn within the canvas object. In the buffer exchange stage, the currently used buffer of the canvas object may be exchanged with the unused one, in preparation for displaying the rendering effect on the screen. For example, after the currently used buffer H1 is exchanged with the unused buffer H2, the next frame may be rendered into H2; after that rendering completes, H2 and H1 are exchanged again, making the rendering effect more continuous.
Referring to the example of fig. 9, each of the three canvas objects has its own two buffers: H1-1 and H2-1 for the first, H1-2 and H2-2 for the second, and H1-3 and H2-3 for the third. Assume the screen also has two buffers, C1 and C2: one is hidden in the background while the other is displayed in the foreground. Within one rendering period, at one frame, the rendering results of the three canvas objects are in buffers H1-1, H1-2 and H1-3 respectively, and all three can be included in the screen buffer C1; at this point C1 is displayed in the foreground and C2 is hidden. At the next frame, C2 may include the rendering effects of H2-1, H2-2 and H2-3, and C1 is then swapped with C2 to display the next frame on the screen. By exchanging the different buffers, the rendering effect is displayed continuously on the screen.
In this example, after the rendering process of the first canvas object ends, the rendering process of the second begins; after the second ends, the third begins. Fig. 9 depicts only the stages of the first canvas object's rendering; the rendering of the second and third canvas objects, although not shown, is similar.
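A minimal sketch of this per-canvas double buffering; the type and property names are illustrative, not the engine's:

```swift
/// Two buffers per canvas: one shown on screen, one being rendered into.
/// After each frame the roles are swapped, as in the H1/H2 example above.
final class CanvasBuffers<Buffer> {
    private(set) var displayed: Buffer   // e.g. H1, currently shown
    private(set) var offscreen: Buffer   // e.g. H2, target of the next frame

    init(displayed: Buffer, offscreen: Buffer) {
        self.displayed = displayed
        self.offscreen = offscreen
    }

    /// Called once rendering into `offscreen` has finished.
    func swapAfterRender() {
        (displayed, offscreen) = (offscreen, displayed)
    }
}
```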
In one example, fig. 10 is a schematic diagram of the internal structure of a canvas object (Canvas). Assuming the canvas object is named VanGogh Canvas, it may comprise system classes, for example a system layer (CAMetalLayer), a system drawable (CAMetalDrawable) and a color texture (Color Texture), where the CAMetalLayer can display content rendered by Metal in a layer.
The canvas object may also include a depth texture (Depth Texture), a pipeline descriptor (MTLRenderPassDescriptor) and an effect list (Effect List). The effect list may include several effects, and each effect may include, for example, a light source descriptor (Multi Light Descriptor), a camera (Camera) and a drawing list (Draw List). The camera may include a perspective descriptor (Perspective Descriptor), a view transform descriptor (Eye Transform Descriptor), other descriptors (Other Descriptor), etc. The drawing list includes several drawing objects (Draw), and each drawing object may include the resources required to draw one stroke of the effect, for example: a material descriptor (Material Descriptor), vertex content (Vertex Content), fragment content (Fragment Content), a pipeline state (Metal Pipeline State), a depth-stencil state (Metal Depth Stencil State), a vertex uniform buffer (Vertex Uniform Buffer) and a fragment uniform buffer (Fragment Uniform Buffer). The vertex content may include vertex buffers (Vertex Buffer), index buffers (Index Buffer) and other vertex descriptors (Other Vertex Descriptor). The fragment content may include textures (Texture) such as RGB, Y and UV maps, and other fragment descriptors (Other Fragment Descriptor). Among these, the vertex uniform buffers, fragment uniform buffers, vertex buffers, index buffers and textures such as RGB, Y and UV maps may reside on the GPU.
As shown in fig. 11, the main rendering flow of this application example may include:
Step S81, the system screen-refresh notification class, e.g. CADisplayLink, triggers a render-driver event; CADisplayLink may trigger one such event per frame. Upon receiving the event, the core controller, e.g. VGMetalCore, throws a render-driver notification. One or more canvas objects may listen for this notification. The canvas object is the host of the rendered graphics; if several canvas objects exist in the APP at the same time, they trigger rendering events serially, in order, after receiving the notification.
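A minimal sketch of this render-driving step, assuming the notification-based design described above; the notification name is an illustrative stand-in for the engine's internal one:

```swift
import UIKit

final class RenderDriver {
    static let renderNotification = Notification.Name("RenderDriverEvent") // assumed name
    private var displayLink: CADisplayLink?

    func start() {
        // The screen-refresh notification class fires once per display frame.
        displayLink = CADisplayLink(target: self, selector: #selector(onFrame))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func onFrame() {
        // The core controller throws the render-driver notification; each
        // listening canvas object then runs its rendering stages in turn.
        NotificationCenter.default.post(name: RenderDriver.renderNotification, object: self)
    }
}
```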
Step S82, a canvas object such as VGMetalCanvas (the class name of VanGogh Canvas in code) holds the effect objects, e.g. VGEffect, that should be rendered at the current time. VGEffect may be a class cluster, with different effects implemented by different subclasses. The effect objects to be rendered may consist of a single effect or of an effect list composed of several effects; the canvas object supports drawing multiple effects together. When the canvas object receives the notification, it traverses the effect objects and triggers the "calculate" event of each.
For example, if an AR effect is present, after the "calculate" event is triggered and the calculation stage is entered, the coordinates of the object tracked by ARKit may be output to the current rendering engine.
For another example, to create an illumination effect in a Feed stream, after the "calculate" event is triggered and the calculation stage is entered, the direction of illumination can be calculated from the position of the cell in the Feed stream. The color of illumination can be calculated earlier: for example, after the canvas is created, the color of illumination is pre-calculated once the colors of the image to be rendered are obtained.
Step S83, after receiving the "calculate" event, the effect objects perform different numerical calculations according to their different subclasses in the class cluster. These calculations may be performed by the central processing unit (CPU). The effect parameters calculated by different effect objects may differ; for example, depending on the specific characteristics of the effect object, the parameters may rotate it by a certain angle or move it by a certain distance.
Step S84, after the calculation is completed, the canvas object may determine from the results whether the effect object needs redrawing. For example, if the calculated effect parameters have not changed, the effect may be static, and some static effects need not be redrawn. For effects that need no redrawing, the render command can be skipped, reducing unnecessary redrawing and saving performance and power.
Step S85, for effect objects that need redrawing, the canvas object further triggers a "prepare to render" event. This event may be used to process GPU resources, such as generating GPU buffers or texture resources. One example of generating texture resources: during render preparation, the shadow depth map required when the scene renders shadows is generated.
Step S86, after the effect object has processed the "prepare to render" event, the canvas object triggers a "graphics rendering" event. This includes preparing the effect object for rendering, calculating the transformation matrices, calculating the light source descriptor, finally generating a render context structure, and passing it to the several rendering objects held inside the effect object for rendering. A rendering object may be, for example, a drawing object (Draw) in fig. 10; it may also be a class cluster, with different subclasses having different implementations.
The resources of MPS may be used in the current rendering engine. For example, after the render preparation stage is entered, MPS may be invoked to generate a Gaussian blur result, which is then used as a texture in the graphics rendering stage.
Step S87, after receiving the render context, the rendering object accesses the two objects it holds internally: the vertex content (VertexContent) and the fragment content (FragmentContent). The two objects update themselves into the render buffers shared with the GPU according to the render context; the FragmentContent also uploads textures from the CPU to the GPU at this time, and the VertexContent calls Metal's render commands to perform the final graphics rendering.
For example, to form an illumination effect in a Feed stream, parameters such as the color, direction and material of the illumination may be uploaded to the render pipeline during the graphics rendering stage, and the shader functions calculate the color values from these parameters, finally forming the illumination effect.
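A minimal sketch of uploading the lighting parameters to the render pipeline in the graphics rendering stage; the uniform struct layout and buffer index are assumptions for illustration:

```swift
import Metal
import simd

struct LightUniforms {
    var direction: SIMD3<Float>  // from the cell position (step S51)
    var color: SIMD3<Float>      // (r, g, b) from the color mapping (step S52)
    var reflectance: Float       // material parameter
}

func encodeLighting(on encoder: MTLRenderCommandEncoder,
                    direction: SIMD3<Float>, color: SIMD3<Float>, reflectance: Float) {
    var uniforms = LightUniforms(direction: direction, color: color, reflectance: reflectance)
    // Small per-draw data can be passed to the fragment shader without an
    // explicit MTLBuffer; the shader then computes the lit color values.
    encoder.setFragmentBytes(&uniforms, length: MemoryLayout<LightUniforms>.stride, index: 0)
}
```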
In this application example, the rendering application programming interface of the rendering engine uses Metal instead of the conventional OpenGL ES, and has the following features.
a) It is better suited to modern multi-core GPUs, so the rendering engine can achieve higher performance and a more stable frame rate.
b) It adopts a C/S model, making communication with the GPU easier to manage and the structure of the rendering engine clearer.
c) The rendering engine has good stability and robustness, with fewer crashes (Crash) in production. On the one hand, application programming interface (API) checks help the developer find problems during debugging; on the other hand, runtime protection prevents problems such as a GPU hang from crashing the APP directly, reducing risk.
d) The shader language MSL is extended from C++14, so the shader code of the rendering engine is more modern and performs better.
e) A pre-compilation mechanism generates a syntax tree at compile time, so the shader code of the rendering engine can be loaded faster at runtime.
With this rendering engine, advanced creative styles based on graphics rendering can be developed quickly. The eye-catching effect and premium feel of these styles can be favored by Guaranteed Delivery (GD) advertisers. In addition, the rendering engine is lightweight, powerful, easy to port and free of dependencies on third-party libraries, so it can be quickly ported to other product lines.
With the rendering engine, an illumination effect can be displayed on an image shown in a Feed stream. Because the illumination effect differs with the scrolling position of the Feed stream and the colors of the image, it can change in response to the user's operation on the Feed stream, presenting a scrolling illumination effect and interaction with the user.
Fig. 12 shows a block diagram of a structure of an apparatus for forming a lighting effect according to an embodiment of the present invention. As shown in fig. 12, the apparatus includes a memory 910 and a processor 920, where the memory 910 stores a computer program executable on the processor 920. The processor 920 implements the method for forming an illumination effect of the above embodiments when executing the computer program. There may be one or more memories 910 and processors 920.
The apparatus further comprises:
a communication interface 930, configured to communicate with external devices for interactive data transmission.
The memory 910 may include high-speed RAM, and may also include non-volatile memory, such as at least one disk memory.
If the memory 910, the processor 920 and the communication interface 930 are implemented independently, they may be connected to each other and communicate with each other through a bus. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. Buses may be classified as address buses, data buses, control buses, etc. For ease of illustration, only one thick line is shown in fig. 12, but this does not mean there is only one bus or one type of bus.
Alternatively, in a specific implementation, if the memory 910, the processor 920, and the communication interface 930 are integrated on a chip, the memory 910, the processor 920, and the communication interface 930 may communicate with each other through internal interfaces.
An embodiment of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements a method as in any of the above embodiments.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means two or more, unless explicitly defined otherwise.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present invention also includes implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order depending on the functions involved, as would be understood by those skilled in the art to which the present invention pertains.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example, an ordered listing of executable instructions for implementing logical functions, may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus or device, such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from the instruction execution system, apparatus or device. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate or transport the program for use by, or in connection with, the instruction execution system, apparatus or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection (an electronic device) having one or more wires, a portable computer diskette (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). The computer-readable medium may even be paper or another suitable medium on which the program is printed, as the program may be captured electronically, for instance by optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It is to be understood that portions of the present invention may be implemented in hardware, software, firmware or a combination thereof. In the above-described embodiments, multiple steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented using any one or a combination of the following techniques known in the art: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gates, a programmable gate array (PGA), a field-programmable gate array (FPGA), and the like.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above-described method embodiments may be implemented by a program instructing related hardware. The program may be stored in a computer-readable storage medium and, when executed, performs one or a combination of the steps of the method embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may physically exist alone, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If implemented in the form of a software functional module and sold or used as an independent product, the integrated module may also be stored in a computer-readable storage medium. The storage medium may be a read-only memory, a magnetic disk, an optical disk, or the like.
The foregoing is merely a specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any person skilled in the art can readily conceive of various changes or substitutions within the technical scope disclosed by the present invention, and such changes or substitutions shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (13)

1. A method for forming an illumination effect, comprising:
monitoring the scrolling state of an information stream to obtain a position of a cell in the information stream;
determining a direction of illumination using the position of the cell in the information stream;
determining a color of illumination using a color of an image displayed in the cell;
and rendering on the image using the direction and the color of the illumination to form an illumination effect;
wherein determining the color of illumination using the color of the image displayed in the cell comprises:
obtaining each normalized color value of the image;
and mapping each normalized color value of the image using a first offset and a first slope to obtain a first color value of the illumination.
2. The method of claim 1, wherein obtaining each normalized color value of the image comprises:
obtaining a histogram of the image using a shader library;
separating the histogram into RGB channels to obtain N dimensions, wherein each dimension of each channel corresponds to a color value, each color value of each channel records the number of pixels of the image having that color value, and N is a positive integer;
and calculating a normalized color value using the color value with the largest number of pixels in each channel.
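The histogram step of claim 2 can be illustrated with a minimal CPU-side sketch. The NumPy implementation below and the choices of an 8-bit image and N = 256 are assumptions made for illustration; the claim itself obtains the histogram with a shader library on the GPU.

```python
import numpy as np

def normalized_dominant_colors(image: np.ndarray, n: int = 256) -> tuple:
    """Return (maxR, maxG, maxB): per RGB channel, the color value with
    the largest number of pixels, normalized to [0, 1]."""
    values = []
    for channel in range(3):  # R, G, B planes of an HxWx3 uint8 array
        # Histogram with n dimensions: pixel count per color value.
        counts, _ = np.histogram(image[:, :, channel], bins=n, range=(0, n))
        # Dominant color value of this channel, normalized to [0, 1].
        values.append(int(np.argmax(counts)) / (n - 1))
    return tuple(values)
```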
3. The method of claim 1, wherein mapping each normalized color value of the image using the first offset and the first slope to obtain the first color value of the illumination comprises:
mapping each normalized color value of the image using formulas 1, 2 and 3 to obtain the first color value of the illumination:
R1 = k1 × maxR + a1 (formula 1),
G1 = k1 × maxG + a1 (formula 2),
B1 = k1 × maxB + a1 (formula 3),
wherein R1, G1 and B1 respectively represent the first color values corresponding to the R, G and B channels; maxR, maxG and maxB respectively represent the normalized color values corresponding to the R, G and B channels of the image; k1 represents the first slope; and a1 represents the first offset.
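For illustration, formulas 1 to 3 are a single linear mapping applied per channel. The sketch below assumes sample values k1 = 0.5 and a1 = 0.4; the patent does not fix these parameters, and one plausible reading of a positive offset is that it guarantees a minimum light brightness even for dark images.

```python
def first_light_color(max_rgb: tuple, k1: float = 0.5, a1: float = 0.4) -> tuple:
    """Apply formulas 1-3: map the normalized dominant color values
    (maxR, maxG, maxB) to the first light color values (R1, G1, B1)."""
    return tuple(k1 * c + a1 for c in max_rgb)
```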
4. The method according to any one of claims 1 to 3, wherein determining the color of illumination using the color of the image displayed in the cell further comprises:
calculating an average value of the normalized color values of the image;
and if the average value exceeds a color threshold, performing a further mapping using the color threshold, a second offset and a second slope to obtain a second color value of the illumination.
5. The method of claim 4, wherein performing the further mapping using the color threshold, the second offset and the second slope to obtain the second color value of the illumination comprises:
mapping each first color value using formulas 4, 5 and 6 to obtain the second color value of the illumination:
R2 = R1 × (avgRGB - t) × k2 + a2 (formula 4),
G2 = G1 × (avgRGB - t) × k2 + a2 (formula 5),
B2 = B1 × (avgRGB - t) × k2 + a2 (formula 6),
wherein R2, G2 and B2 respectively represent the second color values corresponding to the R, G and B channels; R1, G1 and B1 respectively represent the first color values corresponding to the R, G and B channels; avgRGB represents the average value of the normalized color values maxR, maxG and maxB; t represents the color threshold; k2 represents the second slope; and a2 represents the second offset.
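Formulas 4 to 6 can likewise be sketched in a few lines. The threshold t, slope k2 and offset a2 below are assumed sample values, chosen here so that an image that is bright on average receives a toned-down light color; the patent leaves these parameters open.

```python
def second_light_color(first: tuple, max_rgb: tuple, t: float = 0.6,
                       k2: float = 1.0, a2: float = 0.2) -> tuple:
    """Apply formulas 4-6 when the average normalized color exceeds t."""
    avg_rgb = sum(max_rgb) / 3.0  # avgRGB over maxR, maxG, maxB
    if avg_rgb <= t:
        # Below the threshold, the first color values are used as-is.
        return first
    # R2 = R1 * (avgRGB - t) * k2 + a2, and likewise for G2 and B2.
    return tuple(c * (avg_rgb - t) * k2 + a2 for c in first)
```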
6. A lighting effect forming apparatus, comprising:
a monitoring module configured to monitor the scrolling state of an information stream to obtain a position of a cell in the information stream;
a direction determination module configured to determine a direction of illumination using the position of the cell in the information stream;
a color determination module configured to determine a color of illumination using a color of an image displayed in the cell;
and a rendering module configured to render on the image using the direction and the color of the illumination to form an illumination effect;
wherein the color determination module comprises:
a normalization sub-module configured to obtain each normalized color value of the image;
and a first mapping sub-module configured to map each normalized color value of the image using a first offset and a first slope to obtain a first color value of the illumination.
7. The apparatus of claim 6, wherein the normalization sub-module is further configured to:
obtain a histogram of the image using a shader library;
separate the histogram into RGB channels to obtain N dimensions, wherein each dimension of each channel corresponds to a color value, each color value of each channel records the number of pixels of the image having that color value, and N is a positive integer;
and calculate a normalized color value using the color value with the largest number of pixels in each channel.
8. The apparatus of claim 6, wherein the first mapping sub-module is further configured to map each normalized color value of the image using formulas 1, 2 and 3 to obtain the first color value of the illumination:
R1 = k1 × maxR + a1 (formula 1),
G1 = k1 × maxG + a1 (formula 2),
B1 = k1 × maxB + a1 (formula 3),
wherein R1, G1 and B1 respectively represent the first color values corresponding to the R, G and B channels; maxR, maxG and maxB respectively represent the normalized color values corresponding to the R, G and B channels of the image; k1 represents the first slope; and a1 represents the first offset.
9. The apparatus according to any one of claims 6 to 8, wherein the color determination module further comprises:
an average sub-module configured to calculate an average value of the normalized color values of the image;
and a second mapping sub-module configured to, if the average value exceeds a color threshold, perform a further mapping using the color threshold, a second offset and a second slope to obtain a second color value of the illumination.
10. The apparatus of claim 9, wherein the second mapping sub-module is further configured to map each first color value using formulas 4, 5 and 6 to obtain the second color value of the illumination:
R2 = R1 × (avgRGB - t) × k2 + a2 (formula 4),
G2 = G1 × (avgRGB - t) × k2 + a2 (formula 5),
B2 = B1 × (avgRGB - t) × k2 + a2 (formula 6),
wherein R2, G2 and B2 respectively represent the second color values corresponding to the R, G and B channels; R1, G1 and B1 respectively represent the first color values corresponding to the R, G and B channels; avgRGB represents the average value of the normalized color values maxR, maxG and maxB; t represents the color threshold; k2 represents the second slope; and a2 represents the second offset.
11. A lighting effect forming apparatus, comprising:
one or more processors;
a storage means for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-5.
12. A rendering engine, comprising: the lighting effect forming device according to any one of claims 6 to 11.
13. A computer readable storage medium storing a computer program, which when executed by a processor implements the method of any one of claims 1 to 5.
CN201910004659.7A 2019-01-03 2019-01-03 Lighting effect forming method and device and rendering engine Active CN111402348B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910004659.7A CN111402348B (en) 2019-01-03 2019-01-03 Lighting effect forming method and device and rendering engine

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910004659.7A CN111402348B (en) 2019-01-03 2019-01-03 Lighting effect forming method and device and rendering engine

Publications (2)

Publication Number Publication Date
CN111402348A CN111402348A (en) 2020-07-10
CN111402348B (en) 2023-06-09

Family

ID=71428315

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910004659.7A Active CN111402348B (en) 2019-01-03 2019-01-03 Lighting effect forming method and device and rendering engine

Country Status (1)

Country Link
CN (1) CN111402348B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114998504B (en) * 2022-07-29 2022-11-15 杭州摩西科技发展有限公司 Two-dimensional image illumination rendering method, device and system and electronic device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104268922A (en) * 2014-09-03 2015-01-07 广州博冠信息科技有限公司 Image rendering method and device
CN106056661A (en) * 2016-05-31 2016-10-26 钱进 Direct3D 11-based 3D graphics rendering engine

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201102794D0 (en) * 2011-02-17 2011-03-30 Metail Ltd Online retail system
CN102306402A (en) * 2011-09-16 2012-01-04 中山大学 Three-dimensional graph processing system of mobile visual media
CA2802605A1 (en) * 2012-01-17 2013-07-17 Pacific Data Images Llc Ishair: importance sampling for hair scattering
US9531422B2 (en) * 2013-12-04 2016-12-27 Lg Electronics Inc. Mobile terminal and control method for the mobile terminal
CN104134230B (en) * 2014-01-22 2015-10-28 腾讯科技(深圳)有限公司 A kind of image processing method, device and computer equipment
CN103886628B (en) * 2014-03-10 2017-02-01 百度在线网络技术(北京)有限公司 Two-dimension code image generating method and device
JP6646936B2 (en) * 2014-03-31 2020-02-14 キヤノン株式会社 Image processing apparatus, control method thereof, and program
KR20150119515A (en) * 2014-04-15 2015-10-26 삼성디스플레이 주식회사 Method of compensating an image based on light adaptation, display device employing the same, and electronic device
US9799125B1 (en) * 2014-08-26 2017-10-24 Cooper Technologies Company Color control user interfaces
US10375800B2 (en) * 2016-04-06 2019-08-06 Signify Holding B.V. Controlling a lighting system
CN106780709B (en) * 2016-12-02 2018-09-07 腾讯科技(深圳)有限公司 A kind of method and device of determining global illumination information
CN107977946A (en) * 2017-12-20 2018-05-01 百度在线网络技术(北京)有限公司 Method and apparatus for handling image
CN108470369B (en) * 2018-03-26 2022-03-15 城市生活(北京)资讯有限公司 Water surface rendering method and device
CN108879711B (en) * 2018-06-12 2022-11-22 广西大学 Low-voltage single-phase reactive power continuous adjusting device and method

Also Published As

Publication number Publication date
CN111402348A (en) 2020-07-10

Similar Documents

Publication Publication Date Title
JP6504212B2 (en) Device, method and system
CN111400024B (en) Resource calling method and device in rendering process and rendering engine
US10410398B2 (en) Systems and methods for reducing memory bandwidth using low quality tiles
CN109603155A (en) Merge acquisition methods, device, storage medium, processor and the terminal of textures
CN113313802B (en) Image rendering method, device and equipment and storage medium
CN105023234B (en) Figure accelerated method based on embedded system storage optimization
CN114529658A (en) Graph rendering method and related equipment thereof
CN111476851A (en) Image processing method, image processing device, electronic equipment and storage medium
CN111402348B (en) Lighting effect forming method and device and rendering engine
CN111402349B (en) Rendering method, rendering device and rendering engine
CN112991143A (en) Method and device for assembling graphics primitives and computer storage medium
CN112614210A (en) Engineering drawing display method, system and related device
CN115861510A (en) Object rendering method, device, electronic equipment, storage medium and program product
CN114428573B (en) Special effect image processing method and device, electronic equipment and storage medium
CN111402375B (en) Shutter effect forming method and device and rendering engine
CN116563083A (en) Method for rendering image and related device
US10657705B2 (en) System and method for rendering shadows for a virtual environment
CN113763552A (en) Three-dimensional geographic model display method and device, computer equipment and storage medium
CN113240577B (en) Image generation method and device, electronic equipment and storage medium
US8599201B1 (en) System and method for a stencil-based overdraw visualizer
CN114367105A (en) Model coloring method, device, apparatus, medium, and program product
CN115880127A (en) Rendering format selection method and related equipment thereof
CN116966588A (en) Material management method and device, electronic equipment and storage medium
CN116740241A (en) Image processing method and electronic equipment
CN117710180A (en) Image rendering method and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant