CN111489429A - Image rendering control method, terminal device and storage medium - Google Patents


Info

Publication number
CN111489429A
CN111489429A (application CN202010301299.XA)
Authority
CN
China
Prior art keywords
pixel
processed
transparent
region
display interface
Prior art date
Legal status
Pending
Application number
CN202010301299.XA
Other languages
Chinese (zh)
Inventor
李帅
Current Assignee
ARCHERMIND TECHNOLOGY (NANJING) CO LTD
Original Assignee
ARCHERMIND TECHNOLOGY (NANJING) CO LTD
Priority date
Filing date
Publication date
Application filed by ARCHERMIND TECHNOLOGY (NANJING) CO LTD filed Critical ARCHERMIND TECHNOLOGY (NANJING) CO LTD
Priority to CN202010301299.XA priority Critical patent/CN111489429A/en
Publication of CN111489429A publication Critical patent/CN111489429A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/50 Lighting effects
    • G06T 15/503 Blending, e.g. for anti-aliasing
    • G06T 15/005 General purpose rendering architectures

Abstract

The invention provides an image rendering control method, a terminal device and a storage medium. The method comprises the following steps: superposing a transparent channel on a display interface to obtain a region to be processed; acquiring pixel data of the region to be processed, and calculating pixel adjusting parameters according to control pixel coefficients and the pixel data; and adjusting the pixel point state of the region to be processed according to the pixel adjusting parameters, so as to complete the pixel state switching display of the display interface. The invention renders the display interface region by region, so the rendering display mode is more flexible and efficient.

Description

Image rendering control method, terminal device and storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image rendering control method, a terminal device, and a storage medium.
Background
At present, in order to realize a three-dimensional dynamic rendering effect, the prior art mainly relies on a three-dimensional transparent channel mapping processing technology. That technology, however, mainly focuses on realizing a pixel value change display effect on three-dimensional objects of a model, such as trees, flowers, clothes and hair, through three-dimensional patches and superposed transparent channels.
In the above technique, pixel values are used to control pixel variation effects on the surface of the three-dimensional object, such as transparency and brightness, and the rendering effect is not divided into regions. Alternatively, the pixel data of each pixel point can be controlled and adjusted individually; although this can realize partitioned rendering, the partitioned rendering efficiency is low and errors easily occur, so the partitioned rendering effect is poor.
Therefore, how to render and display different regions separately, and how to improve the efficiency of such region-by-region rendering, are technical problems that need to be solved in the art.
Disclosure of Invention
The invention aims to provide an image rendering control method, a terminal device and a storage medium, which can realize region-by-region rendering of a display interface with a more flexible and efficient rendering display mode.
The technical scheme provided by the invention is as follows:
the invention provides an image rendering control method, which comprises the following steps:
superposing a transparent channel on a display interface to obtain a region to be processed;
acquiring pixel data of the area to be processed, and calculating pixel adjusting parameters according to control pixel coefficients and the pixel data;
and adjusting the pixel point state of the region to be processed according to the pixel adjusting parameter so as to complete the pixel state switching display of the display interface.
Further, the acquiring the pixel data of the region to be processed includes:
if the interface image corresponding to the display interface does not comprise transparent pixels, carrying out gray processing on the interface image;
and obtaining a target pixel value corresponding to each pixel point according to the color channel value of each pixel in the interface image after the gray processing, so as to obtain the pixel data.
Further, the step of superimposing a transparent channel on the display interface to obtain the region to be processed includes:
acquiring a grid object corresponding to the display interface, and generating a corresponding patch model according to the grid object;
and superposing a transparent channel corresponding to the identification information at the facet model, and extracting the region to be processed corresponding to the facet model through the transparent channel.
Further, the step of obtaining the pixel data of the region to be processed and calculating the pixel adjustment parameter according to the control pixel coefficient and the pixel data includes:
acquiring target pixel values corresponding to all pixel points in the region to be processed, wherein the target pixel values comprise transparent pixels and brightness pixels;
and performing product operation on the control pixel coefficient and the transparent pixel and/or the brightness pixel to obtain a pixel adjusting parameter.
Further, the multiplying the control pixel coefficient by the transparent pixel and/or the brightness pixel comprises:
acquiring a distance value of each pixel point in the region to be processed relative to a preset original point in a preset moving direction when a display interface slides;
and acquiring a corresponding control pixel coefficient according to each distance value, and performing product operation on the control pixel coefficient of each pixel point and the corresponding transparent pixel and/or brightness pixel.
The present invention also provides a terminal device, including:
the image processing module is used for superposing a transparent channel on the display interface to obtain a region to be processed;
the extraction operation module is used for acquiring pixel data of the area to be processed and calculating pixel adjustment parameters according to control pixel coefficients and the pixel data;
and the image display module is used for adjusting the pixel point state of the region to be processed according to the pixel adjusting parameter so as to complete the switching display of the pixel state of the display interface.
Further, the image processing module includes:
the patch generating unit is used for acquiring a grid object corresponding to the display interface and generating a corresponding patch model according to the grid object;
and the image processing unit is used for superposing a transparent channel corresponding to the identification information at the patch model and extracting the region to be processed corresponding to the patch model through the transparent channel.
Further, the extraction operation module includes:
the pixel extraction unit is used for acquiring a target pixel value corresponding to each pixel point in the region to be processed; the target pixel value comprises a transparent pixel and a brightness pixel;
and the operation unit is used for performing product operation on the control pixel coefficient and the transparent pixel and/or the brightness pixel to obtain a pixel adjusting parameter.
Further, the arithmetic unit includes:
the distance obtaining subunit is used for obtaining a distance value of each pixel point in the to-be-processed area relative to a preset origin in a preset moving direction when the display interface slides;
and the product operation subunit is used for acquiring the corresponding control pixel coefficient according to each distance value, and performing product operation on the control pixel coefficient of each pixel point and the corresponding transparent pixel and/or brightness pixel to obtain the pixel adjustment parameter.
The invention also provides a storage medium, wherein at least one instruction is stored in the storage medium, and the instruction is loaded and executed by a processor to realize the operation executed by the image rendering control method.
With the image rendering control method, the terminal device and the storage medium provided by the invention, the display interface can be rendered region by region, and the rendering display mode is more flexible and efficient.
Drawings
The above features, technical features, advantages and implementations of an image rendering control method, a terminal device and a storage medium will be further described in the following description of preferred embodiments in a clearly understandable manner with reference to the accompanying drawings.
FIG. 1 is a flow chart of one embodiment of an image rendering control method of the present invention;
FIG. 2 is a flow chart of another embodiment of an image rendering control method of the present invention;
FIG. 3 is a flow chart of another embodiment of an image rendering control method of the present invention;
FIG. 4 is a flow chart of another embodiment of an image rendering control method of the present invention;
FIG. 5 is a schematic diagram of image transparency adjustment using the image rendering control method of FIG. 4 according to the present invention;
FIG. 6 is a flow chart of another embodiment of an image rendering control method of the present invention;
FIG. 7 is a schematic diagram of the present invention employing the image rendering control method of FIG. 6 to switch adjustment of image transparency in three-dimensional space;
fig. 8 is a schematic structural diagram of an embodiment of a terminal device of the present invention.
Detailed Description
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the following description is made with reference to the accompanying drawings. Obviously, the drawings in the following description are only some examples of the invention; for a person skilled in the art, other drawings and embodiments can be derived from them without inventive effort.
For the sake of simplicity, the drawings only schematically show the parts relevant to the present invention and do not represent the actual structure of a product. In addition, to keep the drawings concise and understandable, components having the same structure or function are, in some drawings, only schematically illustrated or only labeled once. In this document, "one" means not only "only one" but also "more than one".
In an embodiment of the present invention, as shown in fig. 1, an image rendering control method includes:
S110, overlapping a transparent channel on a display interface to obtain a region to be processed;
Specifically, colors in the transparent channel are added or deleted by editing the transparent channel, and the mask color and transparency can be set; the value of a transparent pixel usually lies between 0 and 1. The transparent channel is a grayscale numerical channel that defines transparent, opaque and translucent regions with 256 levels of gray, where a transparent pixel value of 0 indicates fully transparent and a transparent pixel value of 1 indicates opaque. A transparent channel is superposed on the display interface to perform matting, and the region to be processed, namely the mask region, is obtained.
Schematically, a pixel point in the region to be processed is represented as (r, g, b, a, l), where r is the red channel value of the pixel point, g is the green channel value, b is the blue channel value, a is the transparent channel value (i.e., the transparent pixel), and l is the brightness value (i.e., the brightness pixel). Optionally, the values of r, g, b, a and l all range from 0 to 255 or from 0 to 1, where the 0-to-1 range is obtained by dividing the 0-to-255 value by 255. In the following embodiments, the value ranges of r, g, b, a and l are all 0 to 1, as is the value range of the control pixel coefficient. Illustratively, an opaque pixel has a transparent channel value of "1", a fully transparent pixel has a transparent channel value of "0", and a translucent pixel has a value between "0" and "1".
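As an illustrative sketch only (the function names below are not from the patent), the (r, g, b, a, l) representation and the division-by-255 conversion described above can be expressed as:

```python
def normalize(channel_255: int) -> float:
    """Convert a 0-255 channel value to the 0-1 range used in the embodiments."""
    return channel_255 / 255.0

def make_pixel(r: int, g: int, b: int, a: int, l: int) -> tuple:
    """Build an (r, g, b, a, l) pixel: red, green, blue, transparent
    channel value (alpha) and brightness, each normalized to 0-1."""
    return tuple(normalize(c) for c in (r, g, b, a, l))

# a fully opaque orange pixel: a = 1.0 after normalization
pixel = make_pixel(255, 128, 0, 255, 191)
```

Under this convention an opaque pixel carries a = 1.0 and a fully transparent one a = 0.0, matching the worked example of fig. 5 later in the description.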
S120, acquiring pixel data of a region to be processed, and calculating pixel adjusting parameters according to the control pixel coefficient and the pixel data;
Specifically, the pixel data includes the corresponding RGB values as well as the corresponding transparent pixel and brightness pixel; after the region to be processed is obtained, the terminal device can read the RGB value, transparency value and brightness value of each pixel point. Control pixel coefficients input by a user are acquired; they include a transparent pixel coefficient corresponding to the transparency and a brightness pixel coefficient, and may even include red, blue and green pixel coefficients. After the pixel data of the region to be processed is obtained, each type of pixel value in the pixel data (RGB value, transparency value and brightness value) is paired one-to-one with the control pixel coefficient of the corresponding type to obtain the corresponding pixel adjusting parameter. The pixel adjusting parameters include any one or more of a transparency adjustment parameter, a lightness adjustment parameter and a chroma adjustment parameter (e.g., a red adjustment parameter corresponding to the red pixel coefficient).
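The one-to-one pairing of control pixel coefficients with pixel values of the same type can be sketched as follows; the dictionary keys and the function name are assumptions for illustration, not part of the patent:

```python
def pixel_adjust_params(pixel_data: dict, coefficients: dict) -> dict:
    """Pair each pixel value ('a' transparency, 'l' brightness,
    'r'/'g'/'b' chroma, all in 0-1) with the control pixel coefficient
    of the same type; only types present in both produce a parameter."""
    return {k: pixel_data[k] * coefficients[k]
            for k in pixel_data if k in coefficients}

# only a transparency coefficient is supplied, so only the
# transparency adjustment parameter is computed
params = pixel_adjust_params({'a': 0.5, 'l': 0.8}, {'a': 0.5})
```

This also matches the later embodiments, where a missing coefficient type simply leaves that pixel property unadjusted.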
S130, adjusting the pixel point state of the region to be processed according to the pixel adjusting parameter so as to complete the switching display of the pixel state of the display interface.
Specifically, after the operation result is obtained in the above manner, the pixels in the region to be processed are respectively adjusted, so that the pixel data of each pixel is changed to complete the switching display of the pixel state of the display interface.
In this embodiment, the region to be processed is extracted through the transparent channel, and each pixel point of the region to be processed is rendered according to the operation result of the control pixel coefficient and the pixel data, so that the display interface is rendered region by region and the rendering display mode is more flexible and efficient.
In an embodiment of the present invention, as shown in fig. 2, an image rendering control method includes:
S210, overlapping a transparent channel on a display interface to obtain a region to be processed;
S220, if the interface image corresponding to the display interface does not comprise transparent pixels, carrying out gray level processing on the interface image;
Specifically, images in PNG, BMP, PSD and TGA formats may each include transparent pixels, i.e., a color space including RGBA data (Red color channel value, Green color channel value, Blue color channel value, and transparency information Alpha). For example, a PNG-format image is composed of a plurality of pixels, each pixel corresponding to one set of RGBA data. In contrast, images in GIF and JPEG formats include only RGB data (Red color channel value, Green color channel value, Blue color channel value) and no transparent pixels.
S230, obtaining a target pixel value corresponding to each pixel point according to the color channel value of each pixel in the interface image after the gray processing, so as to obtain the pixel data;
Specifically, interface images in formats such as GIF and JPEG are subjected to grayscale processing, which is prior art and is not described in detail here. Then, according to the color channel value of each pixel in the grayscale-processed display interface, a preset conversion table is queried, using an image algorithm for extracting pixel depth of field or image processing software such as Photoshop, to obtain the transparent pixel corresponding to each pixel point. In addition, the brightness pixel corresponding to each pixel point can be obtained by substituting the color channel values into a preset formula.
The formula for the brightness pixel is L = R × 0.30 + G × 0.59 + B × 0.11, where R is the red color channel value, G is the green color channel value, and B is the blue color channel value.
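The formula can be checked directly with normalized channel values; it is the standard weighted-average grayscale conversion, in which green dominates perceived brightness:

```python
def brightness(r: float, g: float, b: float) -> float:
    """Brightness pixel L = R*0.30 + G*0.59 + B*0.11 (channels in 0-1)."""
    return r * 0.30 + g * 0.59 + b * 0.11

white = brightness(1.0, 1.0, 1.0)  # approximately 1.0 (maximum brightness)
green = brightness(0.0, 1.0, 0.0)  # approximately 0.59
```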
S240, calculating pixel adjusting parameters according to the control pixel coefficients and the pixel data;
S250, adjusting the pixel point state of the region to be processed according to the pixel adjusting parameter so as to complete the pixel state switching display of the display interface.
In this embodiment, the region to be processed is extracted through the transparent channel, and each pixel point of the region to be processed is rendered according to the operation result of the control pixel coefficient and the pixel data, so that the display interface is rendered region by region and the rendering display mode is more flexible and efficient. In addition, because the interface image is grayscale processed, the method is applicable to region-by-region rendering of interface images in any format, and has higher compatibility and practicability.
In an embodiment of the present invention, as shown in fig. 3, an image rendering control method includes:
s310, acquiring a grid object corresponding to the display interface, and generating a corresponding patch model according to the grid object;
Specifically, all independent interface contents of the display interface are acquired to generate corresponding grid objects, and the grid vertices, order, orientation, vertex normals and the like corresponding to the display interface are determined by screening with image processing software such as Unity3D or Photoshop, so that a plurality of patch models corresponding to the display interface are created. From the acquired three-dimensional center point coordinates of the display interface, a grid object is constructed in three-dimensional space for each interface content according to its center point coordinate, and the three vertices of each interface content are connected to construct a triangular patch, thereby creating the patch model corresponding to the display interface. Illustratively, after the interface image corresponding to the display interface is grayscale processed, the grayscale-processed interface image is loaded into image processing software (such as Blender, Unity3D or Photoshop) to subdivide the interface image into a plurality of grids and create a plurality of patch models corresponding to the display interface; each patch model has a corresponding grid coordinate range, and a corresponding identity ID is generated for each patch model.
S320, superposing a transparent channel corresponding to the identification information on the patch model, and extracting a region to be processed corresponding to the patch model through the transparent channel;
Specifically, each transparent channel has identification information. The coordinates of the target area on which a transparent channel needs to be superposed are acquired, and the corresponding target grid coordinate range is obtained by query and matching according to the target area coordinates, so that the target patch model corresponding to the target grid coordinate range is found; alternatively, the identity ID of the target patch model on which the transparent channel needs to be superposed is acquired directly. After the target patch model is located in this way, the transparent channel corresponding to the identification information is superposed on the target patch model. Corresponding transparent channels can thus be superposed on a plurality of patch models as required, and the corresponding regions to be processed are obtained according to the transparent channels superposed on the plurality of patch models.
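A minimal sketch (all names invented for illustration) of locating the target patch model either by its identity ID or by matching a target point against each patch model's grid coordinate range:

```python
def find_target_patch(patches, target_id=None, target_point=None):
    """patches: list of dicts with an 'id' and a 'grid' coordinate
    range ((x0, y0), (x1, y1)). Match by identity ID first; otherwise
    match the point falling inside a patch's grid coordinate range."""
    for patch in patches:
        if target_id is not None and patch['id'] == target_id:
            return patch
        if target_point is not None:
            (x0, y0), (x1, y1) = patch['grid']
            x, y = target_point
            if x0 <= x <= x1 and y0 <= y <= y1:
                return patch
    return None  # no patch model matches: nothing to superpose on

patches = [{'id': 'menu', 'grid': ((0, 0), (10, 5))},
           {'id': 'logo', 'grid': ((0, 6), (10, 9))}]
target = find_target_patch(patches, target_point=(3, 7))  # the 'logo' patch
```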
S330, acquiring target pixel values corresponding to all pixel points in the region to be processed, wherein the target pixel values comprise transparent pixels and brightness pixels;
S340, performing product operation on the control pixel coefficient and the transparent pixel and/or the brightness pixel to obtain a pixel adjusting parameter;
S350, adjusting the pixel point state of the region to be processed according to the pixel adjusting parameter so as to complete the pixel state switching display of the display interface.
In this embodiment, corresponding patch models are generated from the grid objects corresponding to the display interface, and transparent channels corresponding to identification information are superposed on a plurality of patch models as required, so that different regions to be processed can be extracted, and the different regions to be processed on the same display interface can be rendered separately according to the control pixel coefficients corresponding to the different patch models, realizing region-by-region rendering. In addition, each pixel point in the same region to be processed is rendered according to the operation result of the control pixel coefficient and the pixel data, which further renders the display interface region by region, so the rendering display mode is more flexible and efficient.
In an embodiment of the present invention, as shown in fig. 4, an image rendering control method includes:
s410, acquiring a grid object corresponding to the display interface, and generating a corresponding patch model according to the grid object;
S420, superposing a transparent channel corresponding to the identification information at the patch model, and extracting a region to be processed corresponding to the patch model through the transparent channel;
S430, if the interface image corresponding to the display interface does not comprise transparent pixels, carrying out gray level processing on the interface image;
S440, obtaining a target pixel value corresponding to each pixel point according to the color channel value of each pixel in the interface image after the gray processing, so as to obtain the pixel data;
S450, acquiring target pixel values corresponding to all pixel points in the region to be processed, wherein the target pixel values comprise transparent pixels and brightness pixels;
S460, performing product operation on the control pixel coefficient and the transparent pixel and/or the brightness pixel to obtain a pixel adjusting parameter;
S470, adjusting the pixel point state of the region to be processed according to the pixel adjusting parameter, so as to complete the switching display of the pixel state of the display interface.
Specifically, when only the transparent pixel of each pixel point in the region to be processed has been obtained through the above embodiment, then even if both the transparent control pixel coefficient and the brightness control pixel coefficient are acquired, only the product operation between the transparent pixel of each pixel point and the transparent control pixel coefficient is performed, so as to adjust the transparency of each pixel point in the region to be processed.
Similarly, when only the brightness pixel of each pixel point in the region to be processed has been obtained through the above embodiment, then even if both the transparent control pixel coefficient and the brightness control pixel coefficient are acquired, only the product operation between the brightness pixel of each pixel point and the brightness control pixel coefficient is performed, so as to adjust the brightness (also referred to as lightness) of each pixel point in the region to be processed.
In addition, when both the transparent pixel and the brightness pixel of each pixel point in the region to be processed have been obtained through the above embodiment, if only the transparent control pixel coefficient is acquired, the product operation is performed only between the transparent pixel of each pixel point and the transparent control pixel coefficient, so as to adjust the transparency of each pixel point in the region to be processed. Similarly, when only the brightness control pixel coefficient is acquired, only the brightness of each pixel point in the region to be processed is adjusted.
Illustratively, when the transparent pixels of a plurality of pixel points are input simultaneously, the product calculation is carried out between the transparent pixel of each pixel point and the acquired transparent control pixel coefficient respectively. This prevents the original transparency information of a partially transparent surface from being overwritten when the transparency is readjusted, so that the transparency of different areas can be adjusted as required.
Referring to fig. 5, an original image (i.e., a two-dimensional interface image of a display interface according to the present invention) is input; the original image includes transparent pixels and color pixels, i.e., RGB pixels. The transparent pixels of the original image are extracted and a transparent channel is superposed on the original image, removing the unnecessary parts of the original image. In the physical representation of the transparent channel shown in fig. 5, the black part is removed to give the non-transparentization processing area, and the white part is retained to give the region to be processed. A transparency adjustment value (i.e., the transparent control pixel coefficient of the invention) is then superposed on the region to be processed, since the transparent channel has already marked the part of the original image that needs to be displayed (the region to be processed) and the part that does not (the non-transparentization processing area). Assume the transparent pixel corresponding to the petal part of the region to be processed is 0.5, the transparent pixel corresponding to the pistil part is 1, and the transparency adjustment value is 0.5. Superposing the adjustment value on a pixel point of the non-transparentization processing area produces an effect of 0, since 0 multiplied by any value is 0. The region to be processed, however, is affected by the adjustment value: the transparent pixel 0.5 of a petal pixel point is multiplied by 0.5 to obtain 0.25, and the transparent pixel 1 of a pistil pixel point is multiplied by 0.5 to obtain 0.5. The originally opaque pistil part thus becomes semi-transparent and the semi-transparent petal area becomes more transparent, presenting the transparency adjustment effect of the rightmost area in fig. 5.
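The arithmetic of the fig. 5 example can be written out directly. This is a sketch with an assumed structure: the mask value comes from the superposed transparent channel, 1 for the region to be processed and 0 for the non-transparentization processing area:

```python
def superpose(alpha: float, mask: float, adjust: float) -> float:
    """Multiply the transparent pixel by the channel mask and the
    transparency adjustment value; mask 0 forces the result to 0."""
    return alpha * mask * adjust

hidden = superpose(1.0, 0.0, 0.5)  # non-display area: 0 x anything = 0
petal = superpose(0.5, 1.0, 0.5)   # 0.5 x 0.5 = 0.25, more transparent
pistil = superpose(1.0, 1.0, 0.5)  # 1 x 0.5 = 0.5, opaque becomes semi-transparent
```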
Based on this idea of superposition by product, the transparency of different regions to be displayed can be adjusted using the product result of a plurality of transparent channels.
In this embodiment, the region to be processed is extracted through the transparent channel, and transparency or brightness rendering is performed on each pixel point of the region to be processed according to the operation result of the control pixel coefficient and the pixel data, so that the display interface is rendered region by region synchronously, and the rendering display mode is more flexible and efficient.
In an embodiment of the present invention, as shown in fig. 6, an image rendering control method includes:
s510, acquiring a grid object corresponding to the display interface, and generating a corresponding patch model according to the grid object;
S520, superposing a transparent channel corresponding to the identification information on the patch model, and extracting a region to be processed corresponding to the patch model through the transparent channel;
S530, if the interface image corresponding to the display interface does not comprise transparent pixels, carrying out gray processing on the interface image;
S540, according to the color channel value of each pixel in the interface image after the gray processing, obtaining a target pixel value corresponding to each pixel point to obtain the pixel data;
S550, acquiring a target pixel value corresponding to each pixel point in the region to be processed, wherein the target pixel value comprises a transparent pixel and a brightness pixel;
S560, when the display interface slides, acquiring the distance value of each pixel point in the region to be processed relative to the preset origin in the preset moving direction;
S570, acquiring a corresponding control pixel coefficient according to each distance value, and performing product operation on the control pixel coefficient of each pixel point and a corresponding transparent pixel and/or brightness pixel to obtain a pixel adjusting parameter;
Specifically, when the user slides the display interface, the preset origin is selected in advance according to requirements and habitual preference, and a plane coordinate system is established. The distance value of each pixel point in the region to be processed relative to the preset origin in the preset moving direction is then acquired, and a preset control coefficient change table is set, which records the correspondence between distance values and transparent control pixel coefficients, and between distance values and brightness control pixel coefficients. The corresponding control pixel coefficient can therefore be acquired according to each distance value, and the product operation performed between the control pixel coefficient of each pixel point and the corresponding transparent pixel and/or brightness pixel.
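The preset control coefficient change table can be sketched as a distance-band lookup; the band boundaries and coefficient values below are invented for illustration:

```python
# (maximum distance from the preset origin, transparent control pixel coefficient)
CONTROL_TABLE = [
    (50.0, 1.0),           # near the preset origin: transparent pixel kept in full
    (150.0, 0.6),
    (float('inf'), 0.2),   # far from the origin: strongly faded
]

def control_coefficient(distance: float) -> float:
    """Look up the transparent control pixel coefficient for a distance value."""
    for max_distance, coefficient in CONTROL_TABLE:
        if distance <= max_distance:
            return coefficient
    return 0.0

def adjusted_alpha(alpha: float, distance: float) -> float:
    """Product operation between the coefficient and the transparent pixel."""
    return alpha * control_coefficient(distance)
```

With a = 1 meaning opaque, pixel points nearer the preset origin keep a larger coefficient and so remain less transparent, matching the effect described above.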
Of course, a transparency change curve (or brightness change curve) may also be set in advance, in which the transparency control pixel coefficient (or brightness control pixel coefficient) varies nonlinearly and monotonically increases (or monotonically decreases) with the distance value; the curve may likewise increase or decrease nonlinearly with the distance value according to demand. In short, it is only necessary to achieve the effect that the closer a pixel point moves toward the preset origin, the lower its degree of transparency and the higher its degree of lightness.
S580, adjusting the pixel point state of the region to be processed according to the pixel adjustment parameter, so as to complete the switching display of the pixel state of the display interface.
Specifically, with the popularization of Virtual Reality (VR), VR application interaction scenes are becoming more and more common, so development and innovation of the Human-Machine Interface (HMI) are inevitably needed when using VR, and various, more easily understood graphic interaction schemes are developed as required. In the three-dimensional space of a virtual reality scene, displaying various types of menus and interfaces has become a basic capability that every virtual reality application must have. A virtual menu performs various dynamic-effect operations, such as hiding/showing, transitioning from opaque to completely transparent, and, in many scenes, effects such as position movement and color change.
In displaying a VR menu, the following technical effect is desired: some regions of the menu in the three-dimensional space are more translucent than others. Fig. 7 illustrates the dynamic effect when the VR menu interface is switched: on both sides of the main picture or logo, part of each side object is hidden so that it is only partially visible, prompting the user to slide left or right to display it fully.
During VR menu switching, if the region of the left interface body (a3) that is already in a semi-transparent state moves from left to right, approaching the preset origin, the middle interface body (a2) moves from left to right, away from the preset origin. At this time, the semi-transparent region of the left interface body (a3) gradually becomes opaque, and the non-transparent region of the middle interface body (a2) gradually becomes transparent.
Likewise, if the region of the right interface body (a1) that is already in a semi-transparent state moves from right to left, approaching the preset origin, the middle interface body (a2) moves from right to left, away from the preset origin. At this time, the semi-transparent region of the right interface body (a1) gradually becomes opaque, and the non-transparent region of the middle interface body (a2) gradually becomes transparent.
Conversely, if the semi-transparent region of the left interface body (a3) moves from right to left, away from the preset origin, it gradually becomes highly transparent or even fully transparent; the semi-transparent region of the right interface body (a1) likewise becomes highly transparent or even fully transparent when it moves from left to right, away from the preset origin. The above describes only the left and right interface bodies; adjusting upper and lower interface bodies by sliding up and down works in the same way and is not described in detail here.
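These transitions amount to re-evaluating each interface body's control coefficient from its current distance to the preset origin as the slide progresses: moving toward the origin raises the coefficient toward opaque, moving away lowers it toward fully transparent. A minimal sketch with an assumed linear ramp (the radii and names are hypothetical, not from the patent):

```python
def coeff_from_distance(distance: float, opaque_radius: float = 100.0,
                        vanish_radius: float = 600.0) -> float:
    """Opacity control coefficient as a function of distance to the
    preset origin: 1.0 inside opaque_radius, 0.0 beyond vanish_radius,
    and a linear ramp in between."""
    if distance <= opaque_radius:
        return 1.0
    if distance >= vanish_radius:
        return 0.0
    return 1.0 - (distance - opaque_radius) / (vanish_radius - opaque_radius)

# Left body (a3) sliding toward the origin: its distance shrinks from
# 400 to 200 units, so its coefficient rises and it becomes more opaque.
before, after = coeff_from_distance(400.0), coeff_from_distance(200.0)
# The middle body (a2) makes the opposite journey and fades out.
```

Evaluating this per pixel point (rather than per interface body) yields the gradual, region-by-region change the passage describes.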
With this method, a menu surface, or any other surface carrying a transparency map in a virtual application, can receive multiple control pixel coefficients at the same time; their product influences the finally rendered surface, so that locally transparent and semi-transparent surfaces are controlled by the control pixel coefficients repeatedly. Regions of a three-dimensional object's surface that are cut out by the transparency channel map remain completely transparent under the control of that map. The method is mainly applied to simple processing of semi-transparent animations of user interface menus in the three-dimensional space of VR devices, and enables region-by-region rendering of the user interface menu, so that some regions are more translucent than others.
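Because the coefficients combine by product, a surface can receive several control pixel coefficients simultaneously, and any region already cut out by the transparency channel map (alpha = 0) stays fully transparent regardless of the other coefficients, since zero times anything is zero. A hypothetical sketch of this multiplicative combination:

```python
from functools import reduce

def final_alpha(map_alpha: float, *coeffs: float) -> float:
    """Final rendered alpha = the alpha from the transparency channel
    map multiplied by every active control pixel coefficient."""
    return reduce(lambda a, c: a * c, coeffs, map_alpha)

# A semi-transparent texel under two simultaneous controls:
semi = final_alpha(0.8, 0.5, 0.9)   # 0.8 * 0.5 * 0.9 = 0.36
# A texel cut out by the channel map stays fully transparent
# no matter how many coefficients are applied:
cut_out = final_alpha(0.0, 0.5, 0.9)
assert cut_out == 0.0
```

The function and parameter names are assumptions; the point is only that multiplication preserves full transparency of cut-out regions while letting several controls attenuate semi-transparent ones.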
In this embodiment, the transparency or brightness of each region to be processed is adjusted separately according to the sliding track of the user's interface operation. In particular, in edge regions the transparency usually changes gradually or abruptly; in cooperation with the user's sliding operation, a smooth and natural linking animation can be presented on the plane of the terminal device, making it more convenient for the user to view the menu content. A different transparency can be set for each region to be processed, achieving abrupt or even gradual changes of transparency or brightness and thus a spatial movement effect. This improves the visual experience of menu switching on devices with a VR/AR/MR display function, achieves a rich and vivid interface switching effect, improves visual quality, and improves the user experience.
In an embodiment of the present invention, as shown in fig. 8, a terminal device includes:
the image processing module 10 is used for superposing a transparent channel on the display interface to obtain a region to be processed;
the extraction operation module 20 is configured to obtain pixel data of the region to be processed, and calculate a pixel adjustment parameter according to the control pixel coefficient and the pixel data;
and the image display module 30 is configured to adjust the state of the pixel point in the region to be processed according to the pixel adjustment parameter, so as to complete switching and displaying of the pixel state of the display interface.
Specifically, this embodiment is a device embodiment corresponding to the method embodiment, and specific effects refer to the method embodiment, which is not described in detail herein.
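Read as a pipeline, the three modules hand data to one another: the image processing module yields the region to be processed, the extraction operation module computes the pixel adjustment parameters, and the image display module applies them. The toy sketch below illustrates that flow; all class and method names, and the (value, alpha) pixel layout, are assumptions made for illustration:

```python
class ImageProcessingModule:
    def region_to_process(self, interface_pixels):
        # Superimpose a transparency channel: keep only pixels whose
        # alpha is non-zero, i.e. the region to be processed.
        return [(i, px) for i, px in enumerate(interface_pixels)
                if px[1] > 0.0]

class ExtractionOperationModule:
    def adjustment_params(self, region, coeff):
        # Product operation: control pixel coefficient times each
        # pixel's transparent (alpha) pixel.
        return {i: coeff * alpha for i, (value, alpha) in region}

class ImageDisplayModule:
    def apply(self, interface_pixels, params):
        # Adjust the pixel point state of the region to be processed;
        # pixels outside the region keep their original alpha.
        return [(value, params.get(i, alpha))
                for i, (value, alpha) in enumerate(interface_pixels)]

# Pixels as (brightness value, alpha); the last one is fully cut out.
pixels = [(200, 1.0), (180, 0.5), (120, 0.0)]
region = ImageProcessingModule().region_to_process(pixels)
params = ExtractionOperationModule().adjustment_params(region, 0.5)
out = ImageDisplayModule().apply(pixels, params)
```

In a real renderer these roles would be filled by texture masks and shader stages rather than Python lists, but the division of labor between the three modules is the same.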
Based on the foregoing embodiment, the image processing module 10 includes:
the patch generating unit is used for acquiring a grid object corresponding to the display interface and generating a corresponding patch model according to the grid object;
and the image processing unit is used for superposing the transparent channel corresponding to the identification information at the patch model and extracting the region to be processed corresponding to the patch model through the transparent channel.
Specifically, this embodiment is a device embodiment corresponding to the method embodiment, and specific effects refer to the method embodiment, which is not described in detail herein.
Based on the foregoing embodiment, the extraction operation module 20 includes:
the pixel extraction unit is used for acquiring a target pixel value corresponding to each pixel point in the region to be processed; the target pixel value includes a transparent pixel and a brightness pixel;
and the operation unit is used for performing product operation on the control pixel coefficient and the transparent pixel and/or the brightness pixel.
Specifically, this embodiment is a device embodiment corresponding to the method embodiment, and specific effects refer to the method embodiment, which is not described in detail herein.
Based on the foregoing embodiment, the arithmetic unit includes:
the distance obtaining subunit is used for obtaining a distance value of each pixel point in the region to be processed in the preset moving direction relative to a preset original point when the display interface slides;
and the product operation subunit is used for acquiring the corresponding control pixel coefficient according to each distance value and performing product operation on the control pixel coefficient of each pixel point and the corresponding transparent pixel and/or brightness pixel.
Specifically, this embodiment is a device embodiment corresponding to the method embodiment, and specific effects refer to the method embodiment, which is not described in detail herein.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of program modules is illustrated as an example; in practical applications, the above functions may be allocated to different program modules as required, that is, the internal structure of the apparatus may be divided into different program units or modules to perform all or part of the functions described above. The program modules in the embodiments may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one processing unit; the integrated unit may be implemented in the form of hardware or in the form of a software program unit. In addition, the specific names of the program modules are only for distinguishing them from one another and do not limit the protection scope of the present application.
In one embodiment of the invention, a terminal device comprises a processor and a memory, wherein the memory is used for storing a computer program; and the processor is used for executing the computer program stored on the memory and realizing the image rendering control method in the embodiment of the method.
The terminal device may be a desktop computer, notebook, palmtop computer, tablet computer, mobile phone, human-machine interaction screen, or similar device. The terminal device may include, but is not limited to, a processor and a memory. Those skilled in the art will appreciate that the foregoing is merely an example of a terminal device and does not limit it; the terminal device may include more or fewer components than those shown, combine some of the components, or include different components, such as input/output interfaces, display devices, network access devices, communication buses, and communication interfaces. The processor, the memory, the input/output interface, and the communication interface communicate with one another through the communication bus. The memory stores a computer program, and the processor executes the computer program stored in the memory to implement the image rendering control method in the foregoing method embodiments.
The Processor may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory may be an internal storage unit of the terminal device, such as the terminal device's hard disk or memory. The memory may also be an external storage device of the terminal device, such as a plug-in hard disk, Smart Media Card (SMC), Secure Digital (SD) card, or Flash Card equipped on the terminal device. Further, the memory may include both an internal storage unit and an external storage device of the terminal device. The memory is used to store the computer program and the other programs and data required by the terminal device, and may also be used to temporarily store data that has been output or is to be output.
A communication bus is a circuit that connects the described elements and enables transmission between them. For example, the processor receives commands from other elements through the communication bus, decrypts the received commands, and performs calculations or data processing according to the decrypted commands. The memory may include program modules such as a kernel, middleware, an Application Programming Interface (API), and applications. The program modules may be composed of software, firmware or hardware, or at least two of them. The input/output interface forwards commands or data entered by a user via an input/output device (e.g., a sensor, keyboard, or touch screen). The communication interface connects the terminal device with other network devices, user equipment, and networks. For example, the communication interface may be connected to a network by wire or wirelessly to reach other external network devices or user equipment. The wireless communication may include at least one of: Wireless Fidelity (WiFi), Bluetooth (BT), Near Field Communication (NFC), Global Positioning System (GPS), cellular communication, and the like. The wired communication may include at least one of: Universal Serial Bus (USB), High-Definition Multimedia Interface (HDMI), Recommended Standard 232 (RS-232), and the like. The network may be a telecommunication network or a communication network; the communication network may be a computer network, the Internet of Things, or a telephone network. The terminal device may be connected to the network via the communication interface, and the protocol used by the terminal device to communicate with other network devices may be supported by at least one of an application, an Application Programming Interface (API), middleware, a kernel, and the communication interface.
In an embodiment of the present invention, a storage medium stores at least one instruction, and the instruction is loaded and executed by a processor to implement the operations performed by the corresponding embodiments of the image rendering control method. For example, the computer readable storage medium may be a read-only memory (ROM), a random-access memory (RAM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
The steps described above may be implemented in program code executable by a computing device, so that they can be executed by the computing device; alternatively, they may be fabricated separately as individual integrated circuit modules, or multiple modules or steps among them may be fabricated as a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or recited in detail in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the methods in the embodiments of the present invention may also be implemented by a computer program instructing relevant hardware; the computer program may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable storage medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable storage medium may be increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunication signals.
It should be noted that the above embodiments can be freely combined as necessary. The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and decorations can be made without departing from the principle of the present invention, and these modifications and decorations should also be regarded as the protection scope of the present invention.

Claims (10)

1. An image rendering control method characterized by comprising the steps of:
superposing a transparent channel on a display interface to obtain a region to be processed;
acquiring pixel data of the area to be processed, and calculating pixel adjusting parameters according to control pixel coefficients and the pixel data;
and adjusting the pixel point state of the region to be processed according to the pixel adjusting parameter so as to complete the pixel state switching display of the display interface.
2. The image rendering control method according to claim 1, wherein the acquiring pixel data of the region to be processed includes the steps of:
if the interface image corresponding to the display interface does not comprise transparent pixels, carrying out gray processing on the interface image;
and obtaining a target pixel value corresponding to each pixel point according to the color channel value of each pixel in the interface image after the gray processing to obtain the pixel data.
3. The image rendering control method according to claim 1 or 2, wherein the step of superimposing a transparent channel on the display interface to obtain the area to be processed comprises the steps of:
acquiring a grid object corresponding to the display interface, and generating a corresponding patch model according to the grid object;
and superposing a transparent channel corresponding to the identification information at the facet model, and extracting the region to be processed corresponding to the facet model through the transparent channel.
4. The image rendering control method according to claim 3, wherein the step of obtaining pixel data of the region to be processed and calculating pixel adjustment parameters according to control pixel coefficients and the pixel data comprises the steps of:
acquiring target pixel values corresponding to all pixel points in the region to be processed, wherein the target pixel values comprise transparent pixels and brightness pixels;
and performing product operation on the control pixel coefficient and the transparent pixel and/or the brightness pixel to obtain a pixel adjusting parameter.
5. The image rendering control method of claim 4, wherein the multiplying the control pixel coefficients by the transparent pixels and/or the luma pixels comprises:
acquiring a distance value of each pixel point in the region to be processed relative to a preset original point in a preset moving direction when a display interface slides;
and acquiring a corresponding control pixel coefficient according to each distance value, and performing product operation on the control pixel coefficient of each pixel point and the corresponding transparent pixel and/or brightness pixel.
6. A terminal device, comprising:
the image processing module is used for superposing a transparent channel on the display interface to obtain a region to be processed;
the extraction operation module is used for acquiring pixel data of the area to be processed and calculating pixel adjustment parameters according to control pixel coefficients and the pixel data;
and the image display module is used for adjusting the pixel point state of the region to be processed according to the pixel adjusting parameter so as to complete the switching display of the pixel state of the display interface.
7. The terminal device according to claim 6, wherein the image processing module comprises:
the patch generating unit is used for acquiring a grid object corresponding to the display interface and generating a corresponding patch model according to the grid object;
and the image processing unit is used for superposing a transparent channel corresponding to the identification information at the patch model and extracting the region to be processed corresponding to the patch model through the transparent channel.
8. The terminal device of claim 7, wherein the extraction operation module comprises:
the pixel extraction unit is used for acquiring a target pixel value corresponding to each pixel point in the region to be processed; the target pixel value comprises a transparent pixel and a brightness pixel;
and the operation unit is used for performing product operation on the control pixel coefficient and the transparent pixel and/or the brightness pixel to obtain a pixel adjusting parameter.
9. The terminal device according to claim 8, wherein the arithmetic unit includes:
the distance obtaining subunit is used for obtaining a distance value of each pixel point in the to-be-processed area relative to a preset origin in a preset moving direction when the display interface slides;
and the product operation subunit is used for acquiring the corresponding control pixel coefficient according to each distance value, and performing product operation on the control pixel coefficient of each pixel point and the corresponding transparent pixel and/or brightness pixel to obtain the pixel adjustment parameter.
10. A storage medium having stored therein at least one instruction, which is loaded and executed by a processor to perform an operation performed by the image rendering control method according to any one of claims 1 to 5.
CN202010301299.XA 2020-04-16 2020-04-16 Image rendering control method, terminal device and storage medium Pending CN111489429A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010301299.XA CN111489429A (en) 2020-04-16 2020-04-16 Image rendering control method, terminal device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010301299.XA CN111489429A (en) 2020-04-16 2020-04-16 Image rendering control method, terminal device and storage medium

Publications (1)

Publication Number Publication Date
CN111489429A true CN111489429A (en) 2020-08-04

Family

ID=71794831

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010301299.XA Pending CN111489429A (en) 2020-04-16 2020-04-16 Image rendering control method, terminal device and storage medium

Country Status (1)

Country Link
CN (1) CN111489429A (en)


Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0962852A (en) * 1995-08-30 1997-03-07 Kubota Corp Graphic processor
CN101078982A (en) * 2006-05-24 2007-11-28 北京壁虎科技有限公司 Screen display method based on drawing engine
CN101529495A (en) * 2006-09-19 2009-09-09 奥多比公司 Image mask generation
CN102393970A (en) * 2011-12-13 2012-03-28 北京航空航天大学 Object three-dimensional modeling and rendering system as well as generation and rendering methods of three-dimensional model
CN103618886A (en) * 2013-12-13 2014-03-05 厦门美图网科技有限公司 Shooting method for intelligently decoloring according to main color tone
US20160133046A1 (en) * 2014-11-12 2016-05-12 Canon Kabushiki Kaisha Image processing apparatus
CN108295467A (en) * 2018-02-06 2018-07-20 网易(杭州)网络有限公司 Rendering method, device and the storage medium of image, processor and terminal
CN108475330A (en) * 2015-11-09 2018-08-31 港大科桥有限公司 Auxiliary data for there is the View synthesis of pseudomorphism perception
CN109859303A (en) * 2019-01-16 2019-06-07 网易(杭州)网络有限公司 Rendering method, device, terminal device and the readable storage medium storing program for executing of image
CN110163831A (en) * 2019-04-19 2019-08-23 深圳市思为软件技术有限公司 The object Dynamic Display method, apparatus and terminal device of three-dimensional sand table
CN110503704A (en) * 2019-08-27 2019-11-26 北京迈格威科技有限公司 Building method, device and the electronic equipment of three components
CN110503599A (en) * 2019-08-16 2019-11-26 珠海天燕科技有限公司 Image processing method and device
CN110796725A (en) * 2019-08-28 2020-02-14 腾讯科技(深圳)有限公司 Data rendering method, device, terminal and storage medium
CN110796721A (en) * 2019-10-31 2020-02-14 北京字节跳动网络技术有限公司 Color rendering method and device of virtual image, terminal and storage medium
CN111009026A (en) * 2019-12-24 2020-04-14 腾讯科技(深圳)有限公司 Object rendering method and device, storage medium and electronic device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GAN Lan et al.: "Tumor Cell Image Recognition", Xi'an Jiaotong University Press, pages 11-12 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112017110A (en) * 2020-08-26 2020-12-01 广州拓想家科技有限公司 Image multi-face perspective adjustment method and system
CN112153408A (en) * 2020-09-28 2020-12-29 广州虎牙科技有限公司 Live broadcast rendering method and device, electronic equipment and storage medium
WO2022142875A1 (en) * 2020-12-31 2022-07-07 北京字跳网络技术有限公司 Image processing method and apparatus, electronic device, and storage medium
CN113377476A (en) * 2021-06-23 2021-09-10 北京百度网讯科技有限公司 Interface display method, related device and computer program product
CN113377476B (en) * 2021-06-23 2023-07-25 北京百度网讯科技有限公司 Interface display method, related device and computer program product
CN114003163A (en) * 2021-10-27 2022-02-01 腾讯科技(深圳)有限公司 Image processing method and apparatus, storage medium, and electronic device
CN114003163B (en) * 2021-10-27 2023-10-24 腾讯科技(深圳)有限公司 Image processing method and device, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
CN111489429A (en) Image rendering control method, terminal device and storage medium
US10699491B2 (en) Virtually representing spaces and objects while maintaining physical properties
KR102478606B1 (en) Image display device and method for displaying image
CN109064390B (en) Image processing method, image processing device and mobile terminal
US9576397B2 (en) Reducing latency in an augmented-reality display
US8467604B2 (en) Color enhancement
CN108765520B (en) Text information rendering method and device, storage medium and electronic device
CN112288665A (en) Image fusion method and device, storage medium and electronic equipment
CN108038916B (en) Augmented reality display method
WO2004107765A1 (en) 3-dimensional video display device, text data processing device, program, and storage medium
JP6336406B2 (en) Image composition apparatus, image composition method, image composition program, and recording medium storing the program
CN105094615A (en) Information processing method and electronic equipment
CN111105474A (en) Font drawing method and device, computer equipment and computer readable storage medium
US10650488B2 (en) Apparatus, method, and computer program code for producing composite image
CN113645476A (en) Picture processing method and device, electronic equipment and storage medium
CN110782387A (en) Image processing method and device, image processor and electronic equipment
KR101214674B1 (en) Apparatus and method for generating mosaic image including text
JP2014120014A (en) Image processing apparatus, image processing method, and image processing program
US11842236B2 (en) Colored visual markers for variable use
US20210082155A1 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP7102844B2 (en) Frame interpolation device, frame interpolation method and frame interpolation program
EP2816521A1 (en) Editing method of the three-dimensional shopping platform display interface for users
EP2706508A1 (en) Reducing latency in an augmented-reality display
CN115908596B (en) Image processing method and electronic equipment
KR101849384B1 (en) 3D image display system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination