CN111402348A - Method and device for forming illumination effect and rendering engine

Info

Publication number
CN111402348A
Authority
CN
China
Prior art keywords
color
illumination
image
channel
value
Prior art date
Legal status
Granted
Application number
CN201910004659.7A
Other languages
Chinese (zh)
Other versions
CN111402348B (en)
Inventor
郑宇琦
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201910004659.7A
Publication of CN111402348A
Application granted
Publication of CN111402348B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/001 Texturing; Colouring; Generation of texture or colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241 Advertisements
    • G06Q30/0276 Advertisement creation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Abstract

An embodiment of the invention provides a method and a device for forming an illumination effect, and a rendering engine. The method comprises the following steps: monitoring the scrolling state of an information stream to acquire the position of a cell in the information stream; determining the direction of illumination using the position of the cell in the information stream; determining the color of illumination using the color of an image displayed in the cell; and rendering the image using the direction and color of the illumination to form the illumination effect. In the embodiment of the invention, the illumination effect is related to the position of the cell in the scrolling information stream and to the color of the image contained in the cell, so a scrolling illumination effect can be rendered on an image displayed in the information stream, and the effect varies with the scrolling position of the stream and the color of the image. The illumination effect is thus better adapted to the currently displayed content of the information stream.

Description

Method and device for forming illumination effect and rendering engine
Technical Field
The invention relates to the technical field of image processing, and in particular to a method and a device for forming an illumination effect, and a rendering engine.
Background
A Feed combines several message sources to which a user actively subscribes into a content aggregator, helping the user continuously obtain the latest Feed content. Many information applications (APPs) use a Feed stream (information stream) to push news and advertising information to users. On portable intelligent terminals such as mobile phones and handheld computers, more and more APPs display all kinds of information in the form of Feed streams.
As Feed stream advertising calls for creative innovation, more and more creatives containing special effects and animated elements have been proposed. Realizing the required special effects in a Feed stream is often very difficult and requires mathematical and graphics knowledge as support.
At present, an illumination effect can be realized on an ordinary static image, but there is no good method for realizing an illumination effect in a Feed stream.
Disclosure of Invention
An embodiment of the invention provides a method and a device for forming an illumination effect, and a rendering engine, which are used to solve one or more of the above technical problems in the prior art.
In a first aspect, an embodiment of the present invention provides a method for forming an illumination effect, including:
monitoring the scrolling state of an information stream to acquire the position of a cell in the information stream;
determining the direction of illumination using the position of the cell in the information stream;
determining the color of illumination using the color of an image displayed in the cell;
and rendering the image using the direction and color of the illumination to form an illumination effect.
In one embodiment, determining the color of illumination using the color of the image displayed in the cell comprises:
obtaining the normalized color values of the image;
and mapping each normalized color value of the image with a first offset and a first slope to obtain the first color values of the illumination.
In one embodiment, obtaining the normalized color values of the image comprises:
obtaining a histogram of the image using a shader library;
separating the histogram into N dimensions per RGB channel, where each dimension of each channel corresponds to one color value,
so as to obtain the number of pixels in the image for each color value of each channel, where N is a positive integer;
and calculating the normalized color value of each channel from the color value with the largest number of pixels in that channel.
In one embodiment, mapping each normalized color value of the image with a first offset and a first slope to obtain the first color values of the illumination includes:
mapping each normalized color value of the image using Formula 1, Formula 2 and Formula 3 to obtain the first color values of the illumination;
r1 = k1 × maxR + a1 (Formula 1),
g1 = k1 × maxG + a1 (Formula 2),
b1 = k1 × maxB + a1 (Formula 3),
where r1, g1 and b1 respectively represent the first color values of the illumination on the R channel, G channel and B channel, maxR, maxG and maxB respectively represent the normalized color values of the image for the R channel, G channel and B channel, k1 represents the first slope, and a1 represents the first offset.
In one embodiment, determining the color of the illumination using the color of the image displayed in the cell further comprises:
calculating the average of the normalized color values of the image;
and if the average exceeds a color threshold, mapping again using the color threshold, a second offset and a second slope to obtain the second color values of the illumination.
In one embodiment, mapping again with the color threshold, the second offset and the second slope to obtain the second color values of the illumination includes:
mapping each first color value of the illumination using Formula 4, Formula 5 and Formula 6 to obtain the second color values of the illumination;
r2 = r1 × (avgRGB - t) × k2 + a2 (Formula 4),
g2 = g1 × (avgRGB - t) × k2 + a2 (Formula 5),
b2 = b1 × (avgRGB - t) × k2 + a2 (Formula 6),
where r2, g2 and b2 respectively represent the second color values of the illumination on the R channel, G channel and B channel, r1, g1 and b1 respectively represent the first color values of the illumination on those channels, avgRGB represents the average of the normalized color values maxR, maxG and maxB, t represents the color threshold, k2 represents the second slope, and a2 represents the second offset.
In a second aspect, an embodiment of the present invention provides an apparatus for forming an illumination effect, including:
a monitoring module, configured to monitor the scrolling state of an information stream to acquire the position of a cell in the information stream;
a direction determination module, configured to determine the direction of illumination using the position of the cell in the information stream;
a color determination module, configured to determine the color of illumination using the color of an image displayed in the cell;
and a rendering module, configured to render the image using the direction and color of the illumination to form an illumination effect.
In one embodiment, the color determination module comprises:
a normalization submodule for obtaining normalized color values of the image;
and a first mapping submodule, configured to map each normalized color value of the image with a first offset and a first slope to obtain the first color values of the illumination.
In one embodiment, the normalization submodule is further configured to:
obtaining a histogram of the image using a shader library;
separating the histogram into N dimensions per RGB channel, where each dimension of each channel corresponds to one color value,
so as to obtain the number of pixels in the image for each color value of each channel, where N is a positive integer;
and calculating the normalized color value of each channel from the color value with the largest number of pixels in that channel.
In one embodiment, the first mapping submodule is further configured to map each normalized color value of the image using Formulas 1, 2 and 3 to obtain the first color values of the illumination;
r1 = k1 × maxR + a1 (Formula 1),
g1 = k1 × maxG + a1 (Formula 2),
b1 = k1 × maxB + a1 (Formula 3),
where r1, g1 and b1 respectively represent the first color values of the illumination on the R channel, G channel and B channel, maxR, maxG and maxB respectively represent the normalized color values of the image for the R channel, G channel and B channel, k1 represents the first slope, and a1 represents the first offset.
In one embodiment, the color determination module further comprises:
an average submodule, configured to calculate the average of the normalized color values of the image;
and a second mapping submodule, configured to map again using the color threshold, the second offset and the second slope if the average exceeds the color threshold, to obtain the second color values of the illumination.
In one embodiment, the second mapping submodule is further configured to map each first color value of the illumination using Formulas 4, 5 and 6 to obtain the second color values of the illumination;
r2 = r1 × (avgRGB - t) × k2 + a2 (Formula 4),
g2 = g1 × (avgRGB - t) × k2 + a2 (Formula 5),
b2 = b1 × (avgRGB - t) × k2 + a2 (Formula 6),
where r2, g2 and b2 respectively represent the second color values of the illumination on the R channel, G channel and B channel, r1, g1 and b1 respectively represent the first color values of the illumination on those channels, avgRGB represents the average of the normalized color values maxR, maxG and maxB, t represents the color threshold, k2 represents the second slope, and a2 represents the second offset.
In a third aspect, an embodiment of the present invention provides an apparatus for forming an illumination effect, where functions of the apparatus may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above-described functions.
In one embodiment, the apparatus includes a processor and a memory, the memory storing a program that supports the apparatus in executing the above method for forming an illumination effect, and the processor being configured to execute the program stored in the memory. The apparatus may also include a communication interface for communicating with other devices or a communication network.
In a fourth aspect, an embodiment of the present invention provides a rendering engine comprising the above apparatus for forming an illumination effect.
In a fifth aspect, an embodiment of the present invention provides a computer-readable storage medium for storing computer software instructions for the above apparatus for forming an illumination effect, including a program for executing the above method for forming an illumination effect.
One of the above technical solutions has the following advantage or beneficial effect: the illumination effect is related to the position of the cell in the scrolling information stream and to the color of the image contained in the cell, so a scrolling illumination effect can be rendered on an image displayed in the information stream, and the effect varies with the scrolling position of the stream and the color of the image. The illumination effect is thus better adapted to the currently displayed content of the information stream.
Another of the above technical solutions has the following advantage or beneficial effect: by setting a suitable slope, offset and threshold, the mapping from image color to illumination color is controlled and can adapt to the image color. For example, below a certain threshold the illumination color brightens as the image color brightens, making the illumination effect more noticeable; when the image color exceeds the threshold, the illumination color quickly darkens as the image color brightens, to prevent overexposure.
The foregoing summary is provided for the purpose of description only and is not intended to be limiting in any way. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features of the present invention will be readily apparent by reference to the drawings and following detailed description.
Drawings
In the drawings, like reference numerals refer to the same or similar parts or elements throughout the several views unless otherwise specified. The figures are not necessarily to scale. It is appreciated that these drawings depict only some embodiments in accordance with the disclosure and are therefore not to be considered limiting of its scope.
Fig. 1 shows a flow chart of a method of forming a lighting effect according to an embodiment of the invention.
Fig. 2 shows a flow chart of a method of forming a lighting effect according to an embodiment of the invention.
Fig. 3 shows a flow chart of a method of forming a lighting effect according to an embodiment of the invention.
Fig. 4 is a schematic diagram illustrating position calculation in the method of forming an illumination effect according to the embodiment of the present invention.
Fig. 5 is a flowchart showing an application example of the method of forming the illumination effect according to the embodiment of the present invention.
Fig. 6a, 6b and 6c show effect diagrams of a method of forming a lighting effect according to an embodiment of the invention.
Fig. 7 is a block diagram illustrating a configuration of an apparatus for forming an illumination effect according to an embodiment of the present invention.
Fig. 8 is a block diagram illustrating a configuration of an apparatus for forming an illumination effect according to an embodiment of the present invention.
FIG. 9 illustrates an example diagram of one rendering cycle in a rendering engine according to an embodiment of the invention.
FIG. 10 is a diagram illustrating an internal structure of a canvas object in a rendering engine according to an embodiment of the present invention.
FIG. 11 illustrates a schematic diagram of a rendering flow of a rendering engine according to an embodiment of the present invention.
Fig. 12 is a block diagram showing a configuration of an apparatus for forming an illumination effect according to an embodiment of the present invention.
Detailed Description
In the following, only certain exemplary embodiments are briefly described. As those skilled in the art will recognize, the described embodiments may be modified in various different ways, all without departing from the spirit or scope of the present invention. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
Fig. 1 shows a flow chart of a method of forming a lighting effect according to an embodiment of the invention. As shown in fig. 1, the method may include:
Step S11, monitoring the scrolling state of an information stream to acquire the position of a cell in the information stream;
Step S12, determining the direction of illumination using the position of the cell in the information stream;
Step S13, determining the color of illumination using the color of an image displayed in the cell;
Step S14, rendering on the image using the direction and color of the illumination to form an illumination effect.
In an information stream such as a Feed stream, a container for presenting information content may be referred to as a cell. A cell can include various forms of multimedia resources, such as text, static pictures, dynamic pictures and videos.
The user can control the position of the cells of the Feed stream displayed in the display area of the screen. For example, on the touch screen of a mobile phone, sliding a finger upwards scrolls the Feed stream upwards, and the cells in the Feed stream scroll upwards accordingly; sliding a finger downwards scrolls the Feed stream downwards, and the cells scroll downwards accordingly.
In one example, the position of a cell in the information stream may be calculated from the position of the cell in the display area of the screen. Assume the positions in the display area of the screen are normalized, with the lowermost point S1 of the center line taken as 0 and the uppermost point S2 taken as 1. Referring to FIG. 4, the position of cell A may be calculated from the position of the center point O1 of cell A. For example, if the length of the line segment from O1 to S1 is 20% of the length of the line segment from S1 to S2, the position of cell A equals 0.2; if it is 50%, the position equals 0.5; if it is 60%, the position equals 0.6.
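The following is a minimal sketch of this position calculation, assuming iOS-style screen coordinates in which y grows downward; the function name and parameters are illustrative, not the engine's actual API. S1 (the bottom of the display area) maps to 0 and S2 (the top) to 1.

```swift
import CoreGraphics

// Normalized position of a cell's center O1 between the bottom (S1) and top (S2)
// of the display area: the length of segment O1-S1 divided by the length of S1-S2.
func normalizedCellPosition(cellCenterY: CGFloat,
                            bottomY: CGFloat,
                            topY: CGFloat) -> CGFloat {
    (bottomY - cellCenterY) / (bottomY - topY)
}
```

With the 20% example above, a cell whose center sits one fifth of the way up the display area would return 0.2.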
The parameters 0, 0.2, 0.6 and 1 representing cell or screen positions in this embodiment are merely examples and are not limiting; in practical applications they can be set as needed. For example, the maximum value of the display area of the screen may be set to 2, 5, 10, 100, etc., and the value of the actual position of the cell determined using the proportional positional relationship between the cell and the screen.
In one example, if a scrolling illumination effect needs to be added to a large-image advertisement in a Feed stream, the scrolling state of the Feed stream can be monitored to determine the position of the cell in which the large-image advertisement is located, and the direction of illumination determined from that position.
In one example, the position of a cell may correspond to the direction of illumination. For example, the direction of illumination is represented as (x, y, z), where z denotes the axis perpendicular to the screen, x the axis parallel to the horizontal direction of the screen, and y the axis parallel to the vertical direction of the screen. Assume the direction of the illumination varies linearly with the position of the cell, with the z component directed from outside the screen into the screen; the z value may remain unchanged. If the Feed stream scrolls left and right, the z and y values stay fixed and the x value varies linearly with the position of the cell. If the Feed stream scrolls up and down, the z and x values stay fixed and the y value varies linearly with the position of the cell.
In addition, a range of cell positions may be set in advance. For example, when the position range of the cell is 0.2 to 0.6, the position value varies linearly with the actual position within this range. When the actual position of the cell is less than the lower limit 0.2, the position value is set to 0.2; when it is greater than the upper limit 0.6, the position value is set to 0.6.
In this way, the direction of the illumination varies only within a certain range. For example, when the position range of the cell is 0.2 to 0.6, the direction of the illumination, such as the x value (or y value), varies linearly with the position value. When the position of the cell is less than 0.2, the direction of illumination is the same as when the position is 0.2; when it is greater than 0.6, the direction is the same as when the position is 0.6.
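A hedged sketch of this clamped position-to-direction mapping for up-and-down scrolling follows. The 0.2 to 0.6 range matches the example above; the exact linear map and the fixed z value are assumptions for illustration.

```swift
import CoreGraphics
import simd

func lightDirection(cellPosition: CGFloat,
                    range: ClosedRange<CGFloat> = 0.2...0.6) -> SIMD3<Float> {
    // Positions below 0.2 behave like 0.2; positions above 0.6 behave like 0.6.
    let clamped = min(max(cellPosition, range.lowerBound), range.upperBound)
    // x and z stay fixed; y varies linearly with the clamped position.
    let t = (clamped - range.lowerBound) / (range.upperBound - range.lowerBound)
    let y = Float(t) * 2 - 1                      // map 0.2...0.6 onto -1...1
    return simd_normalize(SIMD3<Float>(0, y, -1)) // z points from outside the screen inward
}
```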
In one example, a title, content, etc. may be displayed in a cell. The title may be in text form. The content may be a static image, a dynamic image, a video, and so on. The image currently displayed by a cell may be a static image, a frame of a dynamic image, or a frame of a video. The illumination color is calculated from the currently displayed image of the cell; if the currently displayed image changes, the illumination color changes accordingly.
The embodiment of the invention does not limit the order in which the color and the direction of illumination are determined; it can be set according to actual requirements, as long as both are determined before the rendering command is executed.
In one approach, the direction of illumination may be calculated before the color. For example, in the calculation stage of the rendering process, the direction of illumination is calculated from the position of the cell in the information stream, and the color of the illumination is then calculated from the color of the image to be displayed in the cell.
In another approach, the color of the illumination may be calculated first and the direction afterwards. For example, after the rendering engine obtains the image to be displayed in a cell, the color of the illumination is first calculated in the preparation stage from the color of the image; in the calculation stage of the rendering process, the direction of illumination is calculated from the position of the cell in the information stream.
In one embodiment, as shown in fig. 2, step S13 includes:
Step S21, obtaining the normalized color values of the image;
Step S22, mapping each normalized color value of the image with a first offset and a first slope to obtain the first color values of the illumination.
The color values of an image can be divided into three channels: R (red), G (green) and B (blue). Statistics over the three channels give the distribution of the image's pixels over the color values of each channel, from which the normalized color values are obtained.
In one embodiment, as shown in fig. 3, step S21 includes:
Step S31, obtaining a histogram of the image using a shader library;
Step S32, separating the histogram into N dimensions per RGB channel to obtain the number of pixels in the image for each color value of each channel, where N is a positive integer;
Step S33, calculating the normalized color value of each channel from the color value with the largest number of pixels in that channel.
For example, the shader library may be MPS (Metal Performance Shaders), a set of graphics processing unit (GPU) shader libraries. MPS includes conventional filter effects such as Gaussian blur, and also computer-vision-related functions such as image color histograms and edge detection. The Gaussian blur, histogram and other functions of MPS can be wrapped in the current rendering engine. After encapsulating the resources of the current rendering engine and the resource-interworking interface of MPS, the output of MPS can be used seamlessly by other rendering processes in the current rendering engine. Of course, other types of shader libraries may be used, as long as they provide the function of obtaining a histogram of an image.
In one example, each RGB channel is separated into 256 dimensions, each dimension representing one color value: the R channel has 256 color values, as do the G and B channels. The number of pixels for each color value in the image is then counted, and the normalized color value of each channel is calculated from the color value with the largest number of pixels in that channel. For example, the color value with the largest number of pixels in the R channel may be divided by the number of dimensions N, e.g. 256, to obtain the normalized color value of the R channel, denoted maxR. The calculation of maxG and maxB is similar to that of maxR and is not repeated here.
N = 256 above is merely an example; N may take other values, and the dimensions may be divided over multiple bytes. For example, over two bytes N would be 256 squared, i.e. 65536. N can be set according to the needs of the actual application scenario.
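Below is a sketch of steps S31 to S33, assuming the per-channel histograms (e.g. produced by MPS) have already been read back into plain arrays; hist[i] is the number of pixels whose channel value equals i.

```swift
func normalizedColorValues(histR: [Int], histG: [Int], histB: [Int])
    -> (maxR: Float, maxG: Float, maxB: Float) {
    // Take the color value with the most pixels in a channel and divide it by N.
    func peak(_ hist: [Int]) -> Float {
        let index = hist.indices.max(by: { hist[$0] < hist[$1] }) ?? 0
        return Float(index) / Float(hist.count)   // N = hist.count, e.g. 256
    }
    return (peak(histR), peak(histG), peak(histB))
}
```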
In one embodiment, step S22 includes:
mapping each normalized color value of the image using Formula 1, Formula 2 and Formula 3 to obtain the first color values of the illumination;
r1 = k1 × maxR + a1 (Formula 1),
g1 = k1 × maxG + a1 (Formula 2),
b1 = k1 × maxB + a1 (Formula 3),
where r1, g1 and b1 respectively represent the first color values of the illumination on the R channel, G channel and B channel, maxR, maxG and maxB respectively represent the normalized color values of the image for the R channel, G channel and B channel, k1 represents the first slope, and a1 represents the first offset.
Color mapping with the first offset, the first slope and the normalized color values yields an illumination color value for each normalized color value, and prevents the illumination color from becoming too dark for overly dark images. For example, if the first slope is a positive number, the illumination color becomes brighter as the image color becomes brighter.
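A direct transcription of Formulas 1 to 3 as a small helper, one linear map applied independently to each normalized channel value:

```swift
func firstMapping(maxR: Float, maxG: Float, maxB: Float,
                  k1: Float, a1: Float) -> (r1: Float, g1: Float, b1: Float) {
    // Formulas 1-3: slope k1 and offset a1 applied per channel.
    (k1 * maxR + a1, k1 * maxG + a1, k1 * maxB + a1)
}
```

With k1 = 0.2 and a1 = 0.3 (the sample values used in the application example later in the text), the 0 to 1 inputs land in 0.3 to 0.5, so even a very dark image still receives a visible light color.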
In one embodiment, as shown in fig. 2, step S13 further includes:
Step S23, calculating the average of the normalized color values of the image;
Step S24, if the average exceeds the color threshold, mapping again using the color threshold, the second offset and the second slope to obtain the second color values of the illumination. If the average does not exceed the color threshold, this mapping may be skipped.
In one embodiment, mapping again with the color threshold, the second offset and the second slope to obtain the second color values of the illumination includes:
mapping each first color value of the illumination using Formula 4, Formula 5 and Formula 6 to obtain the second color values of the illumination;
r2 = r1 × (avgRGB - t) × k2 + a2 (Formula 4),
g2 = g1 × (avgRGB - t) × k2 + a2 (Formula 5),
b2 = b1 × (avgRGB - t) × k2 + a2 (Formula 6),
where r2, g2 and b2 respectively represent the second color values of the illumination on the R channel, G channel and B channel, r1, g1 and b1 respectively represent the first color values of the illumination on those channels, avgRGB represents the average of the normalized color values maxR, maxG and maxB, t represents the color threshold, k2 represents the second slope, and a2 represents the second offset.
The part in excess of the threshold may be adjusted by performing a second color mapping with the second offset, the second slope and each first color value. For example, if the second slope is a negative number, the illumination color becomes darker as the image color becomes brighter; that is, the illumination color value of an overly bright portion is reduced.
Applying Formulas 4, 5 and 6 to the three first color values r1, g1 and b1 yields the second color values r2, g2 and b2 for the R, G and B channels respectively, which can be used as the final illumination color.
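A sketch of Formulas 4 to 6 follows. The operator grouping is reconstructed from the garbled source text, so treat the exact expression as an assumption; the qualitative behavior (a negative k2 darkens the light as the image brightens) follows the text.

```swift
func secondMapping(r1: Float, g1: Float, b1: Float,
                   maxR: Float, maxG: Float, maxB: Float,
                   t: Float, k2: Float, a2: Float) -> (r2: Float, g2: Float, b2: Float) {
    let avgRGB = (maxR + maxG + maxB) / 3
    guard avgRGB > t else { return (r1, g1, b1) } // below the threshold, keep the first mapping
    let excess = (avgRGB - t) * k2                // negative k2 pulls bright images down
    return (r1 * excess + a2, g1 * excess + a2, b1 * excess + a2)
}
```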
The illumination effect can then be formed on the image of that cell of the information stream using the direction, color, etc. of the illumination. Other rendering parameters such as material, for example the reflection coefficient of the material, may also be incorporated.
In one application example, implementing a scrolling illumination effect in a Feed stream can serve as a creative optimization for large-image advertisements. The image of the large-image advertisement is first rendered in the Feed stream by the rendering engine, and the illumination effect is then applied to the image. As shown in FIG. 5, the application example may include the following steps:
Step S51, monitoring the scrolling state of the Feed stream, acquiring the position of the advertisement in the Feed stream, and calculating the illumination direction in real time from that position. Calculating the illumination direction from the position of the advertisement in the Feed stream achieves the interactive effect of the illumination changing as the Feed stream scrolls.
Step S52, the color of the illumination adapts to the image color. When the image color is below a certain threshold, the illumination color becomes brighter as the image becomes brighter. When the image color exceeds the threshold, the illumination color quickly darkens as the image brightens, to prevent overexposure.
An example of the adaptive color algorithm is as follows:
i. Obtain a histogram of the image using MPS. Separate the histogram into the RGB channels, each divided into 256 dimensions, and obtain the number of pixels in the image for each color value 0-255 of each channel.
ii. Take the color value with the largest number of pixels in each channel and divide it by 256 to obtain the 0-1 normalized color value of that channel, denoted maxR, maxG and maxB respectively.
iii. Calculate the RGB average: avgRGB = (maxR + maxG + maxB) / 3.
iv. Assume the color offset a1 is 0.3 and the color slope k1 is 0.2. These two values are used to map the values above: the color value of the illuminated R channel is calculated with the mapping rule r1 = k1 × maxR + a1. The G and B channel mapping rules are similar to R's.
Thus the original 0-1 color values are mapped to 0.3-0.5, preventing the illumination color from becoming too dark for overly dark images. Since the slope k1 is positive, the brighter the image color, the brighter the illumination color.
v. Assume the color threshold t is 0.5, the threshold offset a2 is 0.3, and the threshold slope k2 is -1.0.
vi. If avgRGB exceeds the threshold t, perform color mapping again on the excess, with the mapping rule r2 = r1 × (avgRGB - t) × k2 + a2; the G and B channel mapping rules are consistent with R's. Substituting the r1 value of each channel into Formula 4 gives the illumination color of each channel after the second mapping. Since the slope k2 is negative, the brighter the image color, the darker the illumination color.
vii. If avgRGB does not exceed the threshold t, the second mapping may be skipped.
The mapped r, g, b values are used as the final illumination color: if avgRGB does not exceed the threshold, the final illumination color is r1, g1 and b1; if it exceeds the threshold, the final illumination color is r2, g2 and b2.
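An end-to-end sketch of steps i to vii, using the sample constants from steps iv and v (k1 = 0.2, a1 = 0.3, t = 0.5, k2 = -1.0, a2 = 0.3), is shown below. The final clamp to zero is an added safety assumption, since the reconstructed second mapping can go negative for very bright images.

```swift
import simd

func adaptiveLightColor(maxR: Float, maxG: Float, maxB: Float) -> SIMD3<Float> {
    let (k1, a1): (Float, Float) = (0.2, 0.3)
    let (t, k2, a2): (Float, Float, Float) = (0.5, -1.0, 0.3)
    // First mapping (Formulas 1-3): 0-1 values land in 0.3-0.5.
    var r = k1 * maxR + a1, g = k1 * maxG + a1, b = k1 * maxB + a1
    // Second mapping (Formulas 4-6), applied only to overly bright images.
    let avgRGB = (maxR + maxG + maxB) / 3
    if avgRGB > t {
        let excess = (avgRGB - t) * k2
        r = r * excess + a2
        g = g * excess + a2
        b = b * excess + a2
    }
    return SIMD3<Float>(max(r, 0), max(g, 0), max(b, 0)) // clamp: assumption, not in the text
}
```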
Step S53, the engine renders the advertisement image according to parameters such as the direction, color and material of the illumination to form the illumination effect.
The illumination effect of the embodiment of the invention is related to the position of the cell in the scrolling state of the information stream, e.g. the Feed stream, and to the color of the image contained in the cell. A scrolling illumination effect can therefore be rendered on the image displayed in the Feed stream, and the effect varies with the scrolling position of the Feed stream and the image color. The illumination effect is thus better adapted to the currently displayed content of the Feed stream.
In addition, the illumination effect changes in response to the user's operation on the information stream, realizing an interactive effect. As shown in FIGS. 6a, 6b and 6c, the trend of the illumination effect as a function of cell position in the Feed stream is roughly marked in the images with dashed circles.
Furthermore, by setting a suitable slope, offset and threshold to control the mapping from image color to illumination color, the effect can adapt to the image color. For example, below a certain threshold the illumination color brightens as the image color brightens, making the illumination effect more noticeable; when the image color exceeds the threshold, the illumination color quickly darkens as the image color brightens, to prevent overexposure.
Fig. 7 is a block diagram illustrating a configuration of an apparatus for forming an illumination effect according to an embodiment of the present invention. As shown in fig. 7, the apparatus may include:
a monitoring module 61, configured to monitor the scrolling state of an information stream to acquire the position of a cell in the information stream;
a direction determination module 62, configured to determine the direction of illumination using the position of the cell in the information stream;
a color determination module 63, configured to determine the color of illumination using the color of an image displayed in the cell;
and a rendering module 64, configured to render the image using the direction and color of the illumination to form an illumination effect.
In one embodiment, as shown in fig. 8, the color determination module 63 includes:
a normalization submodule 631 for obtaining normalized color values of the image;
the first mapping submodule 632 is configured to map each normalized color value of the image by using the first offset and the first slope to obtain a first color value of illumination.
In one embodiment, the normalization sub-module 631 is further configured to:
obtaining a histogram of the image using a shader library;
separating the histogram into N dimensions per RGB channel, where each dimension of each channel corresponds to one color value,
so as to obtain the number of pixels in the image for each color value of each channel, where N is a positive integer;
and calculating the normalized color value of each channel from the color value with the largest number of pixels in that channel.
In one embodiment, the first mapping submodule 632 is further configured to map each normalized color value of the image using Formulas 1, 2 and 3 to obtain the first color values of the illumination;
r1 = k1 × maxR + a1 (Formula 1),
g1 = k1 × maxG + a1 (Formula 2),
b1 = k1 × maxB + a1 (Formula 3),
where r1, g1 and b1 respectively represent the first color values of the illumination on the R channel, G channel and B channel, maxR, maxG and maxB respectively represent the normalized color values of the image for the R channel, G channel and B channel, k1 represents the first slope, and a1 represents the first offset.
In one embodiment, the color determination module 63 further comprises:
an average submodule 633 for calculating an average of the normalized color values of the image;
the second mapping sub-module 634, configured to map the average of the normalized color values of the image again by using the color threshold, the second offset, and the second slope if the average exceeds the color threshold, so as to obtain a second color value of the illumination.
In one embodiment, the second mapping submodule 634 is further configured to map each first color value of the illumination using Formula 4 to obtain the second color value of the illumination;
r2 = r1 × (avgRGB - t) × k2 + a2 (Formula 4),
where r2 represents the second color value of the illumination, r1 represents the first color value of the illumination, avgRGB represents the average of the normalized color values maxR, maxG and maxB, t represents the color threshold, k2 represents the second slope, and a2 represents the second offset.
The functions of each module in each apparatus in the embodiments of the present invention may refer to the corresponding description in the above method, and are not described herein again.
An embodiment of the invention provides a rendering engine comprising any of the above apparatuses for forming an illumination effect.
In one application example, a graphics rendering engine was developed based on a rendering application programming interface such as Metal. The rendering engine can execute the method for forming an illumination effect of any of the above embodiments. Metal is a low-level rendering application programming interface that provides the lowest software layer needed to run on different graphics chips. The rendering engine can be applied to iOS devices and is lightweight, easy to integrate, high-performance and multi-instantiable. In addition, the rendering engine can provide multi-instance three-dimensional, lighting and other graphical effects in the Feed stream.
In one example, the graphics-rendering engine implementation consists essentially of:
1. A singleton core controller (e.g., a singleton VGMetalCore) is used to manage the Metal infrastructure and the buffer objects (e.g., VGMetal cache objects). A system screen refresh notification class (e.g., CADisplayLink) is used to drive rendering events.
2. A kernel core controller (e.g., VGMetalKernelCore) and an augmented reality core controller (e.g., VGARCore) are used as the entry points for the system's high-performance shader function tools (e.g., MetalPerformanceShaders) and augmented reality tools (e.g., ARKit).
3. VGMetalCore triggers the rendering events of each canvas object, named VGMetalCanvas, in turn.
4. The internal structure of the canvas object can be seen in FIG. 10. As shown in FIG. 9, in one example, three canvas objects (referred to simply as canvases in FIG. 9) trigger rendering events in series within one rendering cycle. The process from the system screen refresh notification class generating the rendering driving event to the end of rendering of all running canvas objects can be regarded as one rendering cycle. Within one rendering pass, each canvas object goes through several stages: event throwing, numerical calculation, render preparation, graphics rendering and buffer exchange. In the event throwing stage, the canvas object listens for the rendering driving notification thrown by the singleton core controller; the notification may include character strings indicating that the system screen refresh notification class has issued a rendering driving event. In the numerical calculation stage, the effect parameters of the canvas object's effects are computed. In the render preparation stage, GPU resources are processed. In the graphics rendering stage, rendering commands are invoked to render the effects of the graphics within the canvas object. In the buffer exchange stage, the buffer currently used by the canvas object is swapped with an unused buffer in preparation for displaying the rendering result on screen. For example, after exchanging the currently used buffer H1 with the unused buffer H2, the next frame can be rendered into H2; H2 and H1 are exchanged again after rendering finishes, making the rendering effect more continuous.
Referring to the example of FIG. 9, the three canvas objects each have their own two buffers: the first has H1-1 and H2-1, the second H1-2 and H2-2, and the third H1-3 and H2-3. Assume the screen also has two buffers, C1 and C2: one is displayed in the foreground while the other is hidden in the background. Within one rendering cycle, at a certain frame, the rendering results of the three canvas objects are in buffers H1-1, H1-2 and H1-3 respectively, and all three results may be composited into the screen's buffer C1; at this moment C1 is displayed in the foreground and C2 is hidden. At the next frame, C2 contains the rendering results of H2-1, H2-2 and H2-3, and C1 is exchanged with C2 to display the next frame on screen. Through the exchange of the different buffers, the rendering effect is displayed continuously on the screen.
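A minimal sketch of the per-canvas double buffering described above: render into the unused buffer, then swap it with the displayed one so the finished frame becomes visible. Buffer is a placeholder type; the engine's real buffer objects differ.

```swift
final class CanvasBuffers<Buffer> {
    private var displayed: Buffer   // e.g. H1, currently shown
    private var spare: Buffer       // e.g. H2, target of the next frame

    init(displayed: Buffer, spare: Buffer) {
        self.displayed = displayed
        self.spare = spare
    }

    func renderNextFrame(_ render: (Buffer) -> Void) {
        render(spare)               // render the next frame into the unused buffer
        swap(&displayed, &spare)    // exchange H1 and H2 so the effect stays continuous
    }
}
```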
In this example, after the rendering process of the first canvas object is completed, the rendering process of the second canvas object starts; after the second completes, the third starts. FIG. 9 shows only the stages of the first canvas object's rendering process; the rendering processes of the second and third canvas objects are similar and are not shown.
Assuming the canvas object is named VanGoghCanvas, it may include system classes such as the system layer (CAMetalLayer), system drawable (CAMetalDrawable) and color texture (Color Texture), where CAMetalLayer can display the content rendered by Metal in the layer.
The canvas object may further include a depth texture (Depth Texture), a pipeline descriptor (MTLRenderPassDescriptor) and an effects list (EffectList); see FIG. 10. The effects list may include a plurality of effects (Effect), each of which may include, for example, a light source descriptor (MultiLightDescriptor), a camera (Camera) and a draw list (DrawList). The camera may include a perspective descriptor (PerspectiveDescriptor), a view transformation descriptor (EyeTransformDescriptor) and other descriptors (OtherDescriptor). The draw list includes a plurality of draw objects (Draw), each of which may include the resources required for drawing its effect, e.g., a material descriptor (MaterialDescriptor), vertex content (VertexContent), fragment source content (FragmentContent), vertex state and other metadata; these may in turn reference buffer state descriptors, texture descriptors, buffer indices and the like.
As shown in fig. 11, the main rendering flow of the application example may include:
in step S81, a system screen refresh notification class, such as CADisplay L ink, triggers a render driver event, and CADisplay L ink may trigger the render driver event once per frame.
In step S82, a canvas object such as VGMetalCanvas (where VGMetalCanvas may be the class name of VanGoghCanvas in code) holds an effect object such as VGEffect that should be currently rendered. VGEffect may be a cluster of classes, with different effects achieved by different sub-classes. The effect object that should be currently rendered may include one effect to be rendered or may include an effect list composed of a plurality of effects to be rendered. The canvas object supports drawing multiple effects together. When the canvas object receives a notification, the effect object is traversed, triggering a "compute" event for the effect object.
For example, if there is an AR effect, then after the "compute" event triggers the computation stage, the coordinates of the object tracked by ARKit can be output to the current rendering engine.
As another example, if an illumination effect needs to be formed in the Feed stream, then after the "compute" event triggers the computation stage, the direction of illumination can be computed from the position of the cell in the Feed stream. The color of the illumination may have been calculated at an earlier point; for example, after the canvas is created, once the color of the image to be rendered is obtained, the color of the illumination is pre-calculated.
Step S83, after the effect object receives the "compute" event, different numerical calculations are performed depending on the subclass of the class cluster. These calculations may be performed by the central processing unit (CPU). The effect parameters calculated for different effect objects may differ, for example: one effect object's parameters may rotate by a certain angle while another's move by a certain distance, depending on the specific characteristics of the effect object.
In step S84, after the calculation is completed, the canvas object can determine from the result whether the effect object needs to be redrawn. For example, if the calculated effect parameters are unchanged, the effect may be static, and some static effects are not redrawn. For effects that do not need redrawing, the rendering command need not be executed, reducing unnecessary redrawing and saving performance and power.
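A hedged sketch of the redraw check in step S84: if the newly computed effect parameters equal the previous frame's, the render command is skipped. EffectState is a hypothetical parameter bundle, not a class from the engine.

```swift
import simd

struct EffectState: Equatable {
    var lightDirection: SIMD3<Float>
    var lightColor: SIMD3<Float>
}

final class RedrawGate {
    private var lastState: EffectState?

    func needsRedraw(for newState: EffectState) -> Bool {
        defer { lastState = newState }
        return newState != lastState   // unchanged static effects are not redrawn
    }
}
```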
Step S85, for effect objects that need to be redrawn, the canvas object further triggers a "prepare to render" event. This event is used to process GPU resources, such as generating GPU buffers or texture resources. One example of generating a texture resource: in preparation for rendering, a shadow depth map needed when the scene renders shadows is generated.
In step S86, after the effect object has processed the "prepare to render" event, the canvas object triggers the "graphics rendering" event. This includes preparing the effect object for rendering, calculating the transformation matrices and the light source descriptor, and finally generating a rendering context structure, which is passed to the several rendering objects held inside the effect object for rendering. A rendering object may be, for example, a draw object (Draw) in FIG. 10. The rendering object may also be a class cluster, and different subclasses may have different implementations.
The resources of MPS may be used in the current rendering engine. For example, after entering the render preparation stage, MPS can be invoked to generate a Gaussian blur result; in the graphics rendering stage, the Gaussian blur result is used as a texture map.
In step S87, after receiving the rendering context, the rendering object accesses the two shader-related drawing objects it holds internally: the vertex content (VertexContent) and the fragment source content (FragmentContent). The two objects update the UniformBuffer shared with the GPU according to the rendering context; FragmentContent uploads the texture from the CPU to the GPU at this moment, and VertexContent calls Metal's rendering commands to perform the final graphics rendering.
For example, if an illumination effect needs to be formed in the Feed stream, parameters such as the color, direction and material of the illumination can be uploaded to the rendering pipeline in the graphics rendering stage; the shader function calculates the color values using these parameters and finally forms the illumination effect.
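A sketch of that upload step follows. LightUniforms and the buffer index are assumptions for illustration; the engine's actual uniform layout is not specified in the text. Small constant data can be handed to the fragment shader without an explicit MTLBuffer.

```swift
import Metal
import simd

struct LightUniforms {
    var direction: SIMD3<Float>   // from the cell position in the Feed stream
    var color: SIMD3<Float>       // from the adaptive color mapping
    var reflectance: Float        // material parameter
}

func uploadLight(_ light: LightUniforms, to encoder: MTLRenderCommandEncoder) {
    var uniforms = light
    // The shader function then computes the lit color values from these parameters.
    encoder.setFragmentBytes(&uniforms,
                             length: MemoryLayout<LightUniforms>.stride,
                             index: 1)
}
```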
In this application example, the rendering application programming interface of the rendering engine uses Metal rather than the conventional OpenGL ES, with the following features.
a) It is better suited to modern multi-core GPUs, giving the rendering engine higher performance and a more stable frame rate.
b) By adopting a C/S (client/server) model, communication with the GPU is easier to manage, and the structure of the rendering engine is clearer.
c) The rendering engine has good stability and robustness, with fewer online crashes (Crash). On the one hand, application programming interface (API) checking helps developers find problems while debugging. On the other hand, runtime protection means that when the GPU hangs or has similar problems, the APP does not crash directly, reducing risk.
d) The shader language MSL is based on C++14, making the rendering engine's shader code more modern and better performing.
e) A precompilation mechanism generates the syntax tree at compile time, so the rendering engine's shader code loads faster at runtime.
With this rendering engine, advanced creative styles based on graphics rendering can be developed rapidly. These graphically rendered creative styles, with their eye-catching effects and premium feel, can be favored by guaranteed delivery (GD) advertisers. In addition, the rendering engine is lightweight, powerful, easy to port and free of dependencies on third-party libraries, so it can be quickly ported to other product lines.
With this rendering engine, an illumination effect can be displayed on an image shown in a Feed stream. Since the illumination effect differs with the scrolling position of the Feed stream and the color of the image, it changes in response to the user's operation on the Feed stream, presenting a scrolling illumination effect and embodying interaction with the user.
Fig. 12 is a block diagram showing a configuration of an apparatus for forming an illumination effect according to an embodiment of the present invention. As shown in fig. 12, the apparatus includes: a memory 910 and a processor 920, the memory 910 having stored therein computer programs operable on the processor 920. The processor 920 implements the method for forming the lighting effect in the above embodiments when executing the computer program. The number of the memory 910 and the processor 920 may be one or more.
The device also includes:
and a communication interface 930 for communicating with an external device to perform data interactive transmission.
Memory 910 may include high-speed RAM memory, and may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
If the memory 910, the processor 920 and the communication interface 930 are implemented independently, the memory 910, the processor 920 and the communication interface 930 may be connected to each other through a bus and perform communication with each other. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 12, but this is not intended to represent only one bus or type of bus.
Optionally, in an implementation, if the memory 910, the processor 920 and the communication interface 930 are integrated on a chip, the memory 910, the processor 920 and the communication interface 930 may complete communication with each other through an internal interface.
An embodiment of the present invention provides a computer-readable storage medium, which stores a computer program, and the computer program is used for implementing the method of any one of the above embodiments when being executed by a processor.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device, such as a computer-based system, a processor-containing system, or another system that can fetch instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" may be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection having one or more wires (an electronic device), a portable computer diskette (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). The computer-readable medium could even be paper or another suitable medium on which the program is printed, as the program can be captured electronically, for instance by optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art may be used: a discrete logic circuit with logic gates for implementing logic functions on data signals, an application-specific integrated circuit with suitable combinational logic gates, a programmable gate array (PGA), a field-programmable gate array (FPGA), and so on.
It will be understood by those skilled in the art that all or part of the steps of the methods in the above embodiments may be performed by hardware directed by program instructions. The program may be stored in a computer-readable storage medium and, when executed, performs one of the steps of the method embodiments or a combination thereof.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in hardware or as a software functional module. If implemented as a software functional module and sold or used as an independent product, the integrated module may also be stored in a computer-readable storage medium. The storage medium may be a read-only memory, a magnetic disk, an optical disc, or the like.
The above description covers only specific embodiments of the present invention, but the scope of the present invention is not limited thereto. Any changes or substitutions that a person skilled in the art could readily conceive within the technical scope disclosed by the present invention shall fall within its scope of protection. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.
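
For concreteness, the following TypeScript sketch illustrates one possible realization of the method recited in claim 1 below: a scroll listener monitors the scrolling state of the information stream, the direction of illumination is derived from each cell's position in the viewport, and a renderer is invoked with that direction. This is a minimal sketch under assumed conventions (a DOM-based feed with one image per cell); all identifiers, such as computeLightDirection and renderIllumination, are hypothetical and are not taken from the patent.

// Hypothetical sketch; not the patented implementation.
interface LightDirection { x: number; y: number; }

// Map a cell's vertical position in the viewport to a light direction:
// a cell near the top of the screen is lit from above, a cell near the
// bottom from below, so the light appears to sweep as the stream scrolls.
function computeLightDirection(cell: HTMLElement): LightDirection {
  const rect = cell.getBoundingClientRect();
  const t = (rect.top + rect.height / 2) / window.innerHeight; // 0 = top, 1 = bottom
  return { x: 0, y: 1 - 2 * t }; // y ranges over [-1, 1] with the scroll position
}

function watchInfoStream(
  cells: HTMLElement[],
  renderIllumination: (cell: HTMLElement, dir: LightDirection) => void
): void {
  window.addEventListener('scroll', () => {
    for (const cell of cells) {
      renderIllumination(cell, computeLightDirection(cell)); // re-render per scroll event
    }
  }, { passive: true });
}

In such a sketch, the illumination color passed to the renderer would come from the image analysis of claims 2 to 6; a corresponding sketch follows the claims.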

Claims (15)

1. A method for forming an illumination effect, comprising:
monitoring the scrolling state of an information stream to acquire the position of a cell in the information stream;
determining a direction of illumination using the position of the cell in the information stream;
determining a color of the illumination using a color of an image displayed in the cell; and
rendering the image using the direction and the color of the illumination to form the illumination effect.
2. The method of claim 1, wherein determining the color of the illumination using the color of the image displayed in the cell comprises:
obtaining normalized color values of the image; and
mapping each normalized color value of the image using a first offset and a first slope to obtain first color values of the illumination.
3. The method of claim 2, wherein obtaining normalized color values of the image comprises:
obtaining a histogram of the image using a shader library;
separating the histogram by RGB channel into N bins, where each bin of each channel corresponds to a color value and records the number of pixels of the image having that color value, N being a positive integer; and
calculating the normalized color values using, for each channel, the color value with the largest number of pixels.
4. The method of claim 2, wherein mapping each normalized color value of the image using the first offset and the first slope to obtain the first color values of the illumination comprises:
mapping each normalized color value of the image using Formula 1, Formula 2 and Formula 3 to obtain the first color values of the illumination;
R1 = k1 × maxR + a1 (Formula 1),
G1 = k1 × maxG + a1 (Formula 2),
B1 = k1 × maxB + a1 (Formula 3),
where R1, G1 and B1 respectively represent the first color values of the illumination on the R channel, the G channel and the B channel, maxR, maxG and maxB respectively represent the normalized color values of the image for the R channel, the G channel and the B channel, k1 represents the first slope, and a1 represents the first offset.
5. The method of any of claims 2 to 4, wherein determining the color of the illumination using the color of the image displayed in the cell further comprises:
calculating the average of the normalized color values of the image; and
if the average exceeds a color threshold, mapping again using the color threshold, a second offset and a second slope to obtain second color values of the illumination.
6. The method of claim 5, wherein mapping again using the color threshold, the second offset and the second slope to obtain the second color values of the illumination comprises:
mapping the first color values of the illumination using Formula 4, Formula 5 and Formula 6 to obtain the second color values of the illumination;
R2 = R1 × (avgRGB - t) × k2 + a2 (Formula 4),
G2 = G1 × (avgRGB - t) × k2 + a2 (Formula 5),
B2 = B1 × (avgRGB - t) × k2 + a2 (Formula 6),
where R2, G2 and B2 respectively represent the second color values of the illumination on the R channel, the G channel and the B channel, R1, G1 and B1 respectively represent the first color values of the illumination on the R channel, the G channel and the B channel, avgRGB represents the average of the normalized color values maxR, maxG and maxB, t represents the color threshold, k2 represents the second slope, and a2 represents the second offset.
7. An apparatus for forming an illumination effect, comprising:
a monitoring module configured to monitor the scrolling state of an information stream to acquire the position of a cell in the information stream;
a direction determination module configured to determine a direction of illumination using the position of the cell in the information stream;
a color determination module configured to determine a color of the illumination using a color of an image displayed in the cell; and
a rendering module configured to render the image using the direction and the color of the illumination to form the illumination effect.
8. The apparatus of claim 7, wherein the color determination module comprises:
a normalization submodule configured to obtain normalized color values of the image; and
a first mapping submodule configured to map each normalized color value of the image using a first offset and a first slope to obtain first color values of the illumination.
9. The apparatus of claim 8, wherein the normalization submodule is further configured to:
obtain a histogram of the image using a shader library;
separate the histogram by RGB channel into N bins, where each bin of each channel corresponds to a color value and records the number of pixels of the image having that color value, N being a positive integer; and
calculate the normalized color values using, for each channel, the color value with the largest number of pixels.
10. The apparatus of claim 8, wherein the first mapping submodule is further configured to map each normalized color value of the image using Formula 1, Formula 2 and Formula 3 to obtain the first color values of the illumination;
R1 = k1 × maxR + a1 (Formula 1),
G1 = k1 × maxG + a1 (Formula 2),
B1 = k1 × maxB + a1 (Formula 3),
where R1, G1 and B1 respectively represent the first color values of the illumination on the R channel, the G channel and the B channel, maxR, maxG and maxB respectively represent the normalized color values of the image for the R channel, the G channel and the B channel, k1 represents the first slope, and a1 represents the first offset.
11. The apparatus of any of claims 8 to 10, wherein the color determination module further comprises:
an averaging submodule configured to calculate the average of the normalized color values of the image; and
a second mapping submodule configured to, if the average exceeds a color threshold, map again using the color threshold, a second offset and a second slope to obtain second color values of the illumination.
12. The apparatus of claim 11, wherein the second mapping submodule is further configured to map the first color values of the illumination using Formula 4, Formula 5 and Formula 6 to obtain the second color values of the illumination;
R2 = R1 × (avgRGB - t) × k2 + a2 (Formula 4),
G2 = G1 × (avgRGB - t) × k2 + a2 (Formula 5),
B2 = B1 × (avgRGB - t) × k2 + a2 (Formula 6),
where R2, G2 and B2 respectively represent the second color values of the illumination on the R channel, the G channel and the B channel, R1, G1 and B1 respectively represent the first color values of the illumination on the R channel, the G channel and the B channel, avgRGB represents the average of the normalized color values maxR, maxG and maxB, t represents the color threshold, k2 represents the second slope, and a2 represents the second offset.
13. An apparatus for forming an illumination effect, comprising:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1 to 6.
14. A rendering engine, comprising the apparatus for forming an illumination effect according to any one of claims 7 to 13.
15. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 6.
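
As an illustration of claims 2 to 6 above, the following TypeScript sketch computes a per-channel normalized color value from a histogram and applies the two-stage linear mapping of Formulas 1 to 6. It is a sketch under stated assumptions, not the patented implementation: normalization is taken here to mean scaling the dominant 8-bit color value of each channel into [0, 1], and the parameter values (N, k1, a1, t, k2, a2) are illustrative placeholders, not values from the patent.

// Hypothetical sketch of the color pipeline; all parameter values are assumed.
// Claim 3: per-channel histogram with N bins; the normalized color value is
// taken as the dominant bin's representative value scaled to [0, 1].
function normalizedValue(channel: Uint8Array, N = 256): number {
  const hist = new Array<number>(N).fill(0);
  for (const v of channel) hist[Math.floor(v * N / 256)]++; // count pixels per bin
  let best = 0;
  for (let i = 1; i < N; i++) if (hist[i] > hist[best]) best = i;
  return best / (N - 1); // dominant color value, normalized to [0, 1]
}

// Claims 4 to 6: first mapping (Formulas 1-3), then, if the average of the
// normalized values exceeds the color threshold t, the second mapping
// (Formulas 4-6 as reconstructed above).
function illuminationColor(
  maxR: number, maxG: number, maxB: number,
  k1 = 0.8, a1 = 0.1,          // first slope and offset (assumed)
  t = 0.7, k2 = 0.5, a2 = 0.05 // color threshold, second slope and offset (assumed)
): [number, number, number] {
  let r = k1 * maxR + a1;      // Formula 1
  let g = k1 * maxG + a1;      // Formula 2
  let b = k1 * maxB + a1;      // Formula 3
  const avgRGB = (maxR + maxG + maxB) / 3;
  if (avgRGB > t) {            // claim 5: remap only above the threshold
    r = r * (avgRGB - t) * k2 + a2; // Formula 4
    g = g * (avgRGB - t) * k2 + a2; // Formula 5
    b = b * (avgRGB - t) * k2 + a2; // Formula 6
  }
  return [r, g, b];
}

Under this reconstruction, the threshold branch attenuates the first color values when the image is already bright, which would keep the rendered light from washing out light-colored images.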
CN201910004659.7A 2019-01-03 2019-01-03 Lighting effect forming method and device and rendering engine Active CN111402348B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910004659.7A CN111402348B (en) 2019-01-03 2019-01-03 Lighting effect forming method and device and rendering engine

Publications (2)

Publication Number Publication Date
CN111402348A true CN111402348A (en) 2020-07-10
CN111402348B CN111402348B (en) 2023-06-09

Family

ID=71428315

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910004659.7A Active CN111402348B (en) 2019-01-03 2019-01-03 Lighting effect forming method and device and rendering engine

Country Status (1)

Country Link
CN (1) CN111402348B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114998504A (en) * 2022-07-29 2022-09-02 杭州摩西科技发展有限公司 Two-dimensional image illumination rendering method, device and system and electronic device

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102306402A * 2011-09-16 2012-01-04 中山大学 Three-dimensional graphics processing system for mobile visual media
CA2802605A1 (en) * 2012-01-17 2013-07-17 Pacific Data Images Llc Ishair: importance sampling for hair scattering
CN103886628A * 2014-03-10 2014-06-25 百度在线网络技术(北京)有限公司 Two-dimensional code image generation method and device
CN104134230A (en) * 2014-01-22 2014-11-05 腾讯科技(深圳)有限公司 Image processing method, image processing device and computer equipment
CN104268922A (en) * 2014-09-03 2015-01-07 广州博冠信息科技有限公司 Image rendering method and device
CN104702776A (en) * 2013-12-04 2015-06-10 Lg电子株式会社 Mobile terminal and control method for the mobile terminal
CN104954697A (en) * 2014-03-31 2015-09-30 佳能株式会社 Image processing apparatus and image processing method
CN105047147A (en) * 2014-04-15 2015-11-11 三星显示有限公司 Method of compensating an image based on light adaptation, display device employing the same, and electronic device
US20160284018A1 (en) * 2011-02-17 2016-09-29 Metail Limited Computer implemented methods and systems for generating virtual body models for garment fit visualisation
CN106056661A (en) * 2016-05-31 2016-10-26 钱进 Direct3D 11-based 3D graphics rendering engine
CN106780709A * 2016-12-02 2017-05-31 腾讯科技(深圳)有限公司 Method and device for determining global illumination information
WO2017174551A1 (en) * 2016-04-06 2017-10-12 Philips Lighting Holding B.V. Controlling a lighting system
US9799125B1 (en) * 2014-08-26 2017-10-24 Cooper Technologies Company Color control user interfaces
CN107977946A (en) * 2017-12-20 2018-05-01 百度在线网络技术(北京)有限公司 Method and apparatus for handling image
CN108470369A * 2018-03-26 2018-08-31 城市生活(北京)资讯有限公司 Water surface rendering method and device
CN108879711A * 2018-06-12 2018-11-23 广西大学 Low-voltage single-phase continuous reactive power regulation mechanism and method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
DAVE MOON: "Improving the user experience with adaptive displays using high-accuracy color sensors", pages 22 - 26 *
P. SHARMA: "A colour face image database for benchmarking of automatic face detection algorithms", pages 423 - 428 *
郑宇琦: "Reflections on the artistic design of lighting in interior design", pages 114 - 115 *

Also Published As

Publication number Publication date
CN111402348B (en) 2023-06-09

Similar Documents

Publication Publication Date Title
CN109603155B (en) Method and device for acquiring merged map, storage medium, processor and terminal
CN111400024B (en) Resource calling method and device in rendering process and rendering engine
CN108305228B (en) Image processing method, image processing device, storage medium and processor
CN110070551B (en) Video image rendering method and device and electronic equipment
CN113313802B (en) Image rendering method, device and equipment and storage medium
US11120591B2 (en) Variable rasterization rate
CN114669047B (en) Image processing method, electronic equipment and storage medium
CN113763856A (en) Method and device for determining ambient illumination intensity and storage medium
CN111476851A (en) Image processing method, image processing device, electronic equipment and storage medium
CN114529658A (en) Graph rendering method and related equipment thereof
CN111127603B (en) Animation generation method and device, electronic equipment and computer readable storage medium
CN111402348A (en) Method and device for forming illumination effect and rendering engine
CN111796825B (en) Bullet screen drawing method, bullet screen drawing device, bullet screen drawing equipment and storage medium
CN114285936B (en) Screen brightness adjustment method and device, storage medium and terminal
CN111402349B (en) Rendering method, rendering device and rendering engine
CN114745570B (en) Image rendering method, electronic device and storage medium
CN114764821B (en) Moving object detection method, moving object detection device, electronic equipment and storage medium
CN114428573B (en) Special effect image processing method and device, electronic equipment and storage medium
CN113763552A (en) Three-dimensional geographic model display method and device, computer equipment and storage medium
CN114782579A (en) Image rendering method and device and storage medium
CN109739403B (en) Method and apparatus for processing information
US10657705B2 (en) System and method for rendering shadows for a virtual environment
CN113744379A (en) Image generation method and device and electronic equipment
CN111402375A (en) Method and device for forming shutter effect and rendering engine
CN112070656B (en) Frame data modification method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant