CN114049425B - Illumination simulation method, device, equipment and storage medium in image - Google Patents

Illumination simulation method, device, equipment and storage medium in image

Info

Publication number: CN114049425B (application CN202111268299.5A; earlier publication CN114049425A)
Authority: CN (China)
Prior art keywords: image, target, pixel point, transparency, light source
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN202111268299.5A
Other languages: Chinese (zh)
Other versions: CN114049425A
Inventor: 袁佳平
Current Assignee: Tencent Technology Shenzhen Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd; priority to CN202111268299.5A
Publication of application CN114049425A, followed by grant and publication of CN114049425B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/50 Lighting effects
    • G06T 15/506 Illumination models
    • G06T 15/10 Geometric effects
    • G06T 15/20 Perspective computation
    • G06T 15/205 Image-based rendering
    • G06T 7/00 Image analysis
    • G06T 7/90 Determination of colour characteristics

Abstract

The application provides an illumination simulation method, apparatus, device, storage medium, and computer program product for images. The method includes: acquiring an aperture image corresponding to a light source to be simulated, the aperture image indicating the transparency of each pixel point within the irradiation area of the image under the light source to be simulated; acquiring the target pixel value of each pixel point in a target image in a target illumination state, and creating, based on the target image, an image container for placing the aperture image; adjusting the transparency of each pixel point in the target image based on the aperture image and the image container to obtain the target transparency of each pixel point in the target image under the light source to be simulated; and rendering and outputting an illumination simulation image corresponding to the target image under the light source to be simulated, based on the target pixel value and target transparency of each pixel point in the target image. The method reduces device performance overhead and the occupation of device processing resources, and improves picture rendering speed.

Description

Illumination simulation method, device, equipment and storage medium in image
Technical Field
The present application relates to the field of image processing technologies, and in particular to an illumination simulation method, apparatus, device, storage medium, and computer program product for images.
Background
With the development of image processing technology, in the related art an illuminated image may be placed in a 3D scene by a 3D engine, after which a light source is added to the 3D scene and aimed at the image, thereby simulating an illumination effect in which a region of the image is lit up and can be viewed. However, this solution requires a 3D engine, which not only brings additional performance overhead to the device and occupies device processing resources, but may also cause stuttering when rendering the picture.
Disclosure of Invention
The embodiments of the application provide an illumination simulation method, apparatus, device, storage medium, and computer program product for images, which can reduce device performance overhead and the occupation of device processing resources, and improve picture rendering speed.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides an illumination simulation method in an image, which comprises the following steps:
acquiring an aperture image corresponding to a light source to be simulated, wherein the aperture image is used for indicating the transparency of each pixel point in an irradiation area of the image corresponding to the light source to be simulated;
acquiring a target pixel value of each pixel point in a target image in a target illumination state, and creating an image container for placing the aperture image based on the target image, wherein the transparency of each pixel point in the image container is lower than a transparency threshold;
based on the aperture image and the image container, adjusting the transparency of each pixel point in the target image to obtain the target transparency of each pixel point in the target image under the light source to be simulated;
and rendering and outputting an illumination simulation image corresponding to the target image under the light source to be simulated based on the target pixel value and the target transparency of each pixel point in the target image.
The embodiment of the application also provides an illumination simulation device in an image, which comprises:
the first acquisition module is used for acquiring an aperture image corresponding to the light source to be simulated, and the aperture image is used for indicating the transparency of each pixel point in the irradiation area of the image corresponding to the light source to be simulated;
the second acquisition module is used for acquiring target pixel values of all pixel points in a target image in a target illumination state, and creating an image container for placing the aperture image based on the target image, wherein the transparency of all pixel points in the image container is lower than a transparency threshold;
the adjusting module is used for adjusting the transparency of each pixel point in the target image based on the aperture image and the image container to obtain the target transparency of each pixel point in the target image under the light source to be simulated;
and the rendering module is used for rendering and outputting an illumination simulation image corresponding to the target image under the light source to be simulated based on the target pixel value and the target transparency of each pixel point in the target image.
In the above scheme, the first obtaining module is further configured to obtain light intensity variation information of the light source to be simulated;
determining the transparency of each pixel point in the irradiation area of the image corresponding to the light source to be simulated based on the light intensity change information;
and generating an aperture image corresponding to the light source to be simulated based on the transparency of each pixel point in the irradiation area of the image corresponding to the light source to be simulated.
In the above solution, the second obtaining module is further configured to perform, for each pixel point in the target image in the target illumination state, the following processing respectively:
acquiring the pixel value of the pixel point on the red (R), green (G), and blue (B) color channels;
and determining the pixel value of the pixel point on the red (R), green (G), and blue (B) color channels as the target pixel value of the pixel point.
In the above solution, the second obtaining module is further configured to obtain an image shape and an image size of the target image;
a blank image conforming to the image shape and image size is created as an image container for placing the aperture image.
In the above aspect, the image container is identical in shape and size to the target image;
the adjusting module is further used for respectively determining the aperture image, the image container and the image level corresponding to the target image;
superposing the aperture image, the image container and the target image according to the image level to obtain a superposed image;
and extracting the transparency of each pixel point in the superimposed image, and taking the transparency of each pixel point in the superimposed image as the target transparency of the corresponding pixel point in the target image under the light source to be simulated.
In the above scheme, the image levels corresponding to the aperture image, the image container and the target image are respectively a first level, a second level and a third level in order from the upper layer to the bottom layer;
the adjusting module is further used for superposing the aperture image on the image container to obtain an intermediate superposition image;
and superposing the intermediate superposition image on the target image to obtain the superposition image.
In the above scheme, the adjusting module is further configured to determine a target pixel point corresponding to the aperture image in the superimposed image when the transparency of each pixel point in the image container is zero;
and extracting the transparency of each target pixel point in the superimposed image, taking the transparency of the target pixel point as the transparency of the corresponding pixel point in the target image under the light source to be simulated, and determining the transparency of other pixel points in the target image to be zero so as to obtain the target transparency of each pixel point in the target image under the light source to be simulated.
In the above scheme, the rendering module is further configured to create a blank bitmap for image drawing;
drawing each pixel point into the blank bitmap according to the pixel point drawing sequence based on the target pixel value and the target transparency of each pixel point in the target image to obtain a drawing image;
rendering the drawing image to output an illumination simulation image corresponding to the target image under the light source to be simulated.
In the above solution, the rendering module is further configured to execute, according to the pixel point drawing order, the following processing for each pixel point in the target image, so as to obtain a drawn image:
determining coordinate information of the pixel point in the blank bitmap, and acquiring a target pixel value and target transparency of the pixel point;
and calling a graph drawing interface, and drawing the pixel point into the blank bitmap at a target position indicated by the coordinate information based on the target pixel value and the target transparency of the pixel point.
In the above solution, the target image is an image frame of a virtual scene, and the apparatus further includes:
the display module is used for responding to the illumination instruction of the to-be-simulated light source corresponding to the image picture and displaying an illumination simulation image taking the first area of the image picture as the illumination area of the to-be-simulated light source;
when receiving an adjustment instruction for the irradiation area of the light source to be simulated, controlling the irradiation area of the light source to be simulated to be adjusted from the first area to a second area indicated by the adjustment instruction, and
presenting an illumination simulation image taking the second area as the irradiation area of the light source to be simulated.
In the above aspect, the display module is further configured to determine the aperture position at which the aperture image is placed on the image container when the irradiation area is the first area, and obtain the target position, indicated by the adjustment instruction, relative to the image container;
and controlling the aperture image to move from the aperture position to the target position so as to control the irradiation area of the light source to be simulated to be adjusted from the first area to a second area indicated by the adjustment instruction.
The embodiment of the application also provides electronic equipment, which comprises:
a memory for storing executable instructions;
and the processor is used for realizing the illumination simulation method in the image provided by the embodiment of the application when executing the executable instructions stored in the memory.
The embodiment of the application also provides a computer readable storage medium which stores executable instructions, wherein the executable instructions realize the illumination simulation method in the image provided by the embodiment of the application when being executed by a processor.
The embodiment of the application also provides a computer program product, which comprises a computer program or instructions, wherein the computer program or instructions realize the illumination simulation method in the image provided by the embodiment of the application when being executed by a processor.
The embodiment of the application has the following beneficial effects:
by applying the embodiment of the application, when illumination simulation in an image is performed, firstly, an aperture image corresponding to a light source to be simulated and a target pixel value of each pixel point in a target image in a target illumination state are obtained; then creating an image container for placing the aperture image based on the target image; because the aperture image is used for indicating the transparency of each pixel point in the irradiation area of the image corresponding to the light source to be simulated, and each pixel point in the image container also has the transparency lower than the transparency threshold, the transparency of each pixel point in the target image can be adjusted based on the aperture image and the image container, so that the target transparency of each pixel point in the target image under the light source to be simulated is obtained; and rendering and outputting an illumination simulation image corresponding to the target image under the light source to be simulated based on the target pixel value and the target transparency of each pixel point in the target image. Therefore, the illumination simulation of the image is realized by adjusting the transparency of the pixel points in the target image, the performance cost of the device and the occupation of the processing resources of the device are reduced, and the picture rendering speed is improved.
Drawings
FIG. 1 is a schematic architecture diagram of an illumination simulation system 100 in an image provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of an electronic device 500 for implementing an illumination simulation method in an image according to an embodiment of the present application;
FIG. 3 is a flow chart of an illumination simulation method in an image provided by an embodiment of the present application;
FIG. 4 is a schematic view of displaying an aperture image according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of the layer relationship between an aperture image, an image container, and a target image provided in an embodiment of the present application;
fig. 6 is a schematic diagram of an illumination simulation image corresponding to a target image under a light source to be simulated according to an embodiment of the present application;
FIG. 7 is a schematic illustration of a display of an illumination simulation image provided by an embodiment of the present application;
FIG. 8 is a schematic illustration of a display of an illumination simulation image provided by an embodiment of the present application;
fig. 9 is a flowchart of an illumination simulation method in an image according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions, and advantages of the application clearer, the application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be construed as limiting the application, and all other embodiments obtained by those skilled in the art without creative effort fall within the scope of protection of the application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is to be understood that "some embodiments" can be the same subset or different subsets of all possible embodiments and can be combined with one another without conflict.
In the following description, the terms "first", "second", "third" and the like are merely used to distinguish similar objects and do not represent a specific ordering of the objects, it being understood that the "first", "second", "third" may be interchanged with a specific order or sequence, as permitted, to enable embodiments of the application described herein to be practiced otherwise than as illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the present application.
Before further describing the embodiments of the application in detail, the terms involved in the embodiments of the application are introduced; these terms are subject to the following explanations.
1) Client: an application program running on the terminal to provide various services, such as a browser client, a video playback client, or a virtual scene client.
2) "In response to": used to indicate the condition or state on which a performed operation depends. When the condition or state is satisfied, one or more operations may be performed, either in real time or with a set delay; unless otherwise specified, no limitation is placed on the order in which multiple such operations are executed.
3) Virtual scene: a scene that an application program displays (or provides) when running on a terminal. The virtual scene may be a simulation of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. It may be a two-dimensional, 2.5-dimensional, or three-dimensional virtual scene; the embodiments of the application do not limit the dimension of the virtual scene. For example, a virtual scene may include sky, land, and sea; the land may include environmental elements such as deserts and cities; and a user may control a virtual object to move in the virtual scene.
Based on the above explanation of the terms involved in the embodiments of the application, the illumination simulation system in an image provided by the embodiments of the application is described below. Referring to FIG. 1, FIG. 1 is a schematic architecture diagram of an illumination simulation system 100 in an image provided in an embodiment of the present application. To support an exemplary application, a terminal 400 is connected to a server 200 through a network 300, where the network 300 may be a wide area network, a local area network, or a combination of the two, and uses wireless or wired links for data transmission.
The terminal 400 is configured to send an illumination request of the target image corresponding to the light source to be simulated to the server 200 in response to an illumination instruction of the target image corresponding to the light source to be simulated;
the server 200 is configured to receive an illumination request of a target image corresponding to a light source to be simulated, and obtain an aperture image corresponding to the light source to be simulated; acquiring target pixel values of all pixel points in a target image in a target illumination state, and creating an image container for placing an aperture image based on the target image; based on the aperture image and the image container, adjusting the transparency of each pixel point in the target image to obtain the target transparency of each pixel point in the target image under the light source to be simulated; transmitting a target pixel value of each pixel point in the target image and a target transparency of each pixel point in the target image under the light source to be simulated to the terminal 400; the aperture image is used for indicating the transparency of each pixel point in the irradiation area of the image corresponding to the light source to be simulated;
The terminal 400 is configured to receive a target pixel value of each pixel point in the target image and a target transparency of each pixel point in the target image under the light source to be simulated; and rendering and outputting an illumination simulation image corresponding to the target image under the light source to be simulated based on the target pixel value and the target transparency of each pixel point in the target image.
In practical applications, the server 200 may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, big data, and artificial intelligence platforms. The terminal 400 may be, but is not limited to, a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart television, or a smart watch. The terminal 400 and the server 200 may be directly or indirectly connected through wired or wireless communication, which is not limited herein.
Referring to fig. 2, fig. 2 is a schematic structural diagram of an electronic device 500 for implementing an illumination simulation method in an image according to an embodiment of the present application. In practical applications, the electronic device 500 may be a server or a terminal shown in fig. 1, and the electronic device 500 is taken as an example of the terminal shown in fig. 1, to describe an electronic device implementing the illumination simulation method in the image of the embodiment of the present application, where the electronic device 500 provided in the embodiment of the present application includes: at least one processor 510, a memory 550, at least one network interface 520, and a user interface 530. The various components in electronic device 500 are coupled together by bus system 540. It is appreciated that the bus system 540 is used to enable connected communications between these components. The bus system 540 includes a power bus, a control bus, and a status signal bus in addition to the data bus. The various buses are labeled as bus system 540 in fig. 2 for clarity of illustration.
The processor 510 may be an integrated circuit chip with signal processing capabilities, such as a general-purpose processor (for example, a microprocessor or any conventional processor), a digital signal processor (DSP, Digital Signal Processor), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
The user interface 530 includes one or more output devices 531 that enable presentation of media content, including one or more speakers and/or one or more visual displays. The user interface 530 also includes one or more input devices 532, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
The memory 550 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard drives, optical drives, and the like. Memory 550 may optionally include one or more storage devices physically located remote from processor 510.
Memory 550 includes volatile memory or nonvolatile memory, and may also include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM, Read Only Memory), and the volatile memory may be a random access memory (RAM, Random Access Memory). The memory 550 described in the embodiments of the application is intended to comprise any suitable type of memory.
In some embodiments, memory 550 is capable of storing data to support various operations, examples of which include programs, modules and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 551 including system programs for handling various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and handling hardware-based tasks;
network communication module 552 is used to reach other computing devices via one or more (wired or wireless) network interfaces 520, exemplary network interfaces 520 include: bluetooth, wireless compatibility authentication (WiFi), and universal serial bus (USB, universal Serial Bus), etc.;
a presentation module 553 for enabling presentation of information (e.g., a user interface for operating a peripheral device and displaying content and information) via one or more output devices 531 (e.g., a display screen, speakers, etc.) associated with the user interface 530;
the input processing module 554 is configured to detect one or more user inputs or interactions from one of the one or more input devices 532 and translate the detected inputs or interactions.
In some embodiments, the illumination simulation apparatus in the image provided in the embodiments of the present application may be implemented in a software manner, and fig. 2 shows the illumination simulation apparatus 555 in the image stored in the memory 550, which may be software in the form of a program, a plug-in, or the like, including the following software modules: the first acquisition module 5551, the second acquisition module 5552, the adjustment module 5553, and the rendering module 5554 are logical, and thus may be arbitrarily combined or further split according to the implemented functions, the functions of each module will be described below.
In other embodiments, the illumination simulation apparatus in the image provided by the embodiments of the present application may be implemented in a combination of hardware and software, and by way of example, the illumination simulation apparatus in the image provided by the embodiments of the present application may be a processor in the form of a hardware decoding processor that is programmed to perform the illumination simulation method in the image provided by the embodiments of the present application, for example, the processor in the form of a hardware decoding processor may employ one or more application specific integrated circuits (ASIC, application Specific Integrated Circuit), DSP, programmable logic device (PLD, programmable Logic Device), complex programmable logic device (CPLD, complex Programmable Logic Device), field programmable gate array (FPGA, field-Programmable Gate Array), or other electronic component.
In some embodiments, the terminal or the server may implement the illumination simulation method in an image provided by the embodiments of the application by running a computer program. For example, the computer program may be a native program or a software module in an operating system; a native application (APP), i.e., a program that must be installed in the operating system to run, such as a video playback APP or a virtual scene APP; an applet, i.e., a program that only needs to be downloaded into a browser environment to run; or an applet that can be embedded in any APP. In general, the computer program described above may be any form of application, module, or plug-in.
Based on the above description of the illumination simulation system and the electronic device in the image provided by the embodiment of the present application, the illumination simulation method in the image provided by the embodiment of the present application is described below. In some embodiments, the method for simulating illumination in an image provided in the embodiments of the present application may be implemented by a server or a terminal alone or in conjunction with the server and the terminal, and the method for simulating illumination in an image provided in the embodiments of the present application is described below with reference to the implementation of the terminal. Referring to fig. 3, fig. 3 is a flow chart of an illumination simulation method in an image provided by an embodiment of the present application, where the illumination simulation method in an image provided by the embodiment of the present application includes:
Step 101: and the terminal acquires an aperture image corresponding to the light source to be simulated.
The aperture image is used for indicating the transparency of each pixel point in the irradiation area of the image corresponding to the light source to be simulated.
Here, the terminal may be installed with an application client, such as a client supporting a virtual scene (e.g., a game client) or a client supporting video playback. By running the application client, the terminal can display the illumination simulation image of the target image under the light source to be simulated, thereby simulating, through the illumination simulation image, the effect of a certain area of the target image being observed when illuminated by the light source to be simulated (such as a flashlight or a spotlight). In the embodiments of the application, when the terminal outputs the illumination simulation image, it needs to acquire the aperture image corresponding to the light source to be simulated. Here, the aperture image is used to indicate the transparency of each pixel point within the irradiation area of the image under the light source to be simulated, that is, the transparency of the pixel points in the illuminated area of the image under the simulated light source. The original transparency of the pixel points within the irradiation area is thus changed based on the aperture image, simulating the illumination effect of the light source to be simulated on the image.
In practical implementation, the transparency of a pixel point lies in the range 0 to 1 and represents a percentage. When the transparency is 0 (0%), the pixel point within the irradiation area of the image under the light source to be simulated is completely transparent, i.e., entirely invisible to the user; when the transparency is 1 (100%), the pixel point is completely opaque, i.e., displayed most clearly to the user; values between 0 and 1 mean the pixel point is partially transparent, appearing to the user with a glass-like translucency.
In some embodiments, the terminal may acquire the aperture image corresponding to the light source to be simulated by: acquiring light intensity change information of a light source to be simulated; determining the transparency of each pixel point in the irradiation area of the image corresponding to the light source to be simulated based on the light intensity change information; and generating an aperture image corresponding to the light source to be simulated based on the transparency of each pixel point in the irradiation area of the image corresponding to the light source to be simulated.
Here, in order to simulate, through the aperture image, the illumination effect of the light source to be simulated on the target image, the terminal first obtains the light intensity variation information of the light source to be simulated, where the light intensity variation information indicates how the light intensity of the light source to be simulated varies within the irradiation area of the image.
Then, based on the light intensity variation information, the transparency of each pixel point within the irradiation area of the image under the light source to be simulated is determined, such that the transparency variation of these pixel points conforms to the light intensity variation information. Taking a point light source as an example, the transparency of the pixel point at the center of its circular irradiation area is 1, and the transparency values of the pixel points decrease progressively to 0 along the radius direction outward from the center, so that the attenuation of the light intensity of the light source to be simulated is simulated by the transparency variation of the pixel points in the aperture image.
And finally, generating an aperture image corresponding to the light source to be simulated based on the transparency of each pixel point in the irradiation area of the image corresponding to the light source to be simulated.
As an example, referring to FIG. 4, FIG. 4 is a schematic display diagram of an aperture image provided in an embodiment of the present application. As shown in diagram A in FIG. 4, the aperture image describes the transparency of each pixel point of the image within the irradiation area under the light source to be simulated, so that the change in transparency simulates the spread of the light intensity of the light source to be simulated (illustrated as a point light source) from the center outward. To make the aperture image more clearly visible, the background of the aperture can be painted black; as shown in diagram B in FIG. 4, the white part is the aperture image, and in actual implementation the color can be set as required. As shown in diagram C in FIG. 4, for the aperture image corresponding to a point light source, the transparency of the pixel point at the center of the circular irradiation area is 1, and the transparency values decrease progressively to 0 along the radius direction outward from the center, so that the attenuation of the light intensity of the light source to be simulated is simulated by the transparency variation of the pixel points in the aperture image.
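As a minimal sketch of this step, the radial falloff described above can be baked into an aperture image's alpha channel. The snippet below uses the browser Canvas 2D API as a stand-in for the patent's unspecified image facilities; all function and variable names are illustrative, not taken from the patent.

```typescript
// Hypothetical sketch: build a circular aperture image whose transparency
// (alpha) falls off linearly from 1 at the center to 0 at the rim,
// matching the point-light example above.
function createApertureImage(radius: number): HTMLCanvasElement {
  const canvas = document.createElement("canvas");
  canvas.width = canvas.height = radius * 2;
  const ctx = canvas.getContext("2d")!;
  const image = ctx.createImageData(radius * 2, radius * 2);
  for (let y = 0; y < radius * 2; y++) {
    for (let x = 0; x < radius * 2; x++) {
      const d = Math.hypot(x - radius, y - radius); // distance from center
      const alpha = Math.max(0, 1 - d / radius);    // light intensity falloff
      const i = (y * radius * 2 + x) * 4;
      image.data[i] = image.data[i + 1] = image.data[i + 2] = 255; // white
      image.data[i + 3] = Math.round(alpha * 255);  // transparency channel
    }
  }
  ctx.putImageData(image, 0, 0);
  return canvas;
}
```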
Step 102: and acquiring target pixel values of all pixel points in the target image in the target illumination state, and creating an image container for placing the aperture image based on the target image.
Wherein the transparency of each pixel point in the image container is below a transparency threshold.
After acquiring the aperture image of the light source to be simulated, the terminal needs to realize illumination simulation of the target image under that light source, so it further acquires the target pixel value of each pixel point in the target image in the target illumination state. In actual implementation, the target illumination state may be an illumination state under natural light, or any illumination state (not limited to a particular illuminant) different from a no-illumination state (a state with no light at all).
Meanwhile, the terminal also needs to create, based on the target image, an image container for placing the aperture image; the image container is used to shield the target image and to hold the aperture image, so that the irradiation area of the light source to be simulated on the target image is controlled through the aperture image. The image container does not contain any content, and the transparency of each of its pixel points is lower than a transparency threshold. For example, the transparency may be zero, so that the image container completely masks the target image and simulates the target image in a dark state (i.e., a no-illumination state); or the transparency may be non-zero but below the transparency threshold, so that the image container lets the target image show through dimly, simulating the target image in a low-light state (i.e., under a simulated illumination intensity lower than that of the target illumination state).
When the image container shields the target image and is provided with the aperture image of the light source to be simulated, displaying corresponding content of the target image in the placement area of the aperture image, so as to control the irradiation area of the light source to be simulated corresponding to the target image through the aperture image.
In some embodiments, the terminal may obtain the target pixel value of each pixel point in the target image in the target illumination state by performing the following processing for each pixel point of the target image in the target illumination state: acquiring the pixel value of the pixel point on the red (R), green (G), and blue (B) color channels; and determining that pixel value as the target pixel value of the pixel point.
Here, when the target pixel value of each pixel point of the target image in the target illumination state is acquired, the RGB color value of each pixel point may be obtained and then determined as the target pixel value of the corresponding pixel point. In practical implementation, the RGB color values of the pixel points can be obtained in order from left to right and top to bottom.
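A minimal sketch of this read-out, again assuming Canvas 2D as the underlying image facility (illustrative names only): getImageData already returns pixels in left-to-right, top-to-bottom order.

```typescript
// Hypothetical sketch: read the target pixel values (RGB) of every pixel
// of the target image in the target illumination state.
function getTargetPixelValues(img: HTMLImageElement): Uint8ClampedArray {
  const canvas = document.createElement("canvas");
  canvas.width = img.width;
  canvas.height = img.height;
  const ctx = canvas.getContext("2d")!;
  ctx.drawImage(img, 0, 0);
  // RGBA data in row-major (left-to-right, top-to-bottom) order; the
  // R, G, B channels are the target pixel values, and the alpha channel
  // will later be replaced by the target transparency.
  return ctx.getImageData(0, 0, img.width, img.height).data;
}
```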
In some embodiments, based on the target image, the terminal may create an image container for placing the aperture image by: acquiring an image shape and an image size of a target image; a blank image conforming to the image shape and image size is created as an image container for placing the aperture image.
Here, the terminal acquires the image shape and the image size of the target image at the time of creating the image container for placing the aperture image, and then creates a blank image in conformity with the image shape and the image size of the target image as the image container for placing the aperture image. Thus, the image container can be completely overlapped with the target image to play a role of shielding the target image. When the image container shields the target image and is provided with the aperture image of the light source to be simulated, displaying corresponding content of the target image only in the placement area of the aperture image, so as to control the irradiation area of the light source to be simulated corresponding to the target image through the aperture image.
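Under the same Canvas 2D assumption, creating the image container reduces to allocating a blank image of the target image's shape and size whose per-pixel transparency sits below the threshold; this is a sketch with illustrative names, and containerAlpha is a hypothetical parameter.

```typescript
// Hypothetical sketch: create a blank image container matching the target
// image's width and height. With containerAlpha = 0 every pixel is fully
// transparent (dark state); a small non-zero value below the transparency
// threshold simulates a low-light state instead.
function createImageContainer(
  width: number,
  height: number,
  containerAlpha = 0
): HTMLCanvasElement {
  const container = document.createElement("canvas");
  container.width = width;
  container.height = height;
  const ctx = container.getContext("2d")!;
  const data = ctx.createImageData(width, height);
  for (let i = 3; i < data.data.length; i += 4) {
    data.data[i] = Math.round(containerAlpha * 255); // transparency channel
  }
  ctx.putImageData(data, 0, 0);
  return container;
}
```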
Step 103: and adjusting the transparency of each pixel point in the target image based on the aperture image and the image container to obtain the target transparency of each pixel point in the target image under the light source to be simulated.
After the aperture image of the light source to be simulated is acquired and the image container is created, the transparency of each pixel point in the target image is adjusted based on the aperture image and the image container, so that the target transparency of each pixel point in the target image under the light source to be simulated is obtained.
In some embodiments, the image container is consistent in shape and size with the target image; based on the aperture image and the image container, the terminal can adjust the transparency of each pixel point in the target image in the following manner to obtain the target transparency of each pixel point in the target image under the light source to be simulated: respectively determining an aperture image, an image container and an image level corresponding to a target image; superposing the aperture image, the image container and the target image according to the image level to obtain a superposed image; and extracting the transparency of each pixel point in the superimposed image, and taking the transparency of each pixel point in the superimposed image as the target transparency of the corresponding pixel point in the target image under the light source to be simulated.
In some embodiments, the image levels corresponding to the aperture image, the image container, and the target image are, in order from the upper layer to the bottom layer, a first level, a second level, and a third level, respectively; the terminal can superimpose the aperture image, the image container and the target image according to the image level to obtain a superimposed image by the following modes: overlapping the aperture image on the image container to obtain an intermediate overlapped image; and superposing the intermediate superposition image on the target image to obtain a superposition image.
In practical applications, the image container is consistent in shape and size with the target image. In the embodiments of the application, the aperture image, the image container, and the target image are all preset with corresponding image levels. When the target transparency of each pixel point in the target image under the light source to be simulated is determined, the aperture image, the image container, and the target image are superimposed according to their image levels to obtain a superimposed image, and the transparency of each pixel point in the superimposed image is extracted and used as the target transparency of the corresponding pixel point in the target image under the light source to be simulated. Because the image container is consistent in shape and size with the target image, the superimposed image is also consistent in shape and size with the target image; on this basis, each pixel point in the superimposed image coincides in position with a pixel point in the target image, so the transparency of a target pixel point in the superimposed image can be used as the target transparency, under the light source to be simulated, of the coinciding pixel point in the target image.
In practical application, when the aperture image, the image container, and the target image are superimposed according to their image levels to obtain the superimposed image, the aperture image can first be superimposed on the image container to obtain an intermediate superimposed image, and the intermediate superimposed image is then superimposed on the target image to obtain the superimposed image. As an example, referring to FIG. 5, FIG. 5 is a schematic diagram of the layer relationship among the aperture image, the image container, and the target image provided in an embodiment of the present application. Here, the aperture image lies above the image container (i.e., the aperture container), the image container lies above the target image (i.e., the bright image), and the target image (i.e., the bright image) is at the bottom layer.
In some embodiments, the terminal may extract the transparency of each pixel point in the superimposed image, and use the transparency of each pixel point in the superimposed image as the target transparency of the corresponding pixel point in the target image under the light source to be simulated, by: when the transparency of each pixel point in the image container is zero, determining a target pixel point of a corresponding aperture image in the superimposed image; and extracting the transparency of each target pixel point in the superimposed image, taking the transparency of the target pixel point as the transparency of the corresponding pixel point in the target image under the light source to be simulated, and determining the transparency of other pixel points in the target image to be zero so as to obtain the target transparency of each pixel point in the target image under the light source to be simulated.
Here, when the transparency of each pixel point in the image container is zero, the transparency of the pixel points of the superimposed image outside the region of the aperture image is also zero. In that case, only the transparency of the target pixel points corresponding to the aperture image in the superimposed image needs to be extracted and used as the transparency of the corresponding pixel points in the target image under the light source to be simulated; meanwhile, the transparency of the other pixel points in the target image is determined to be zero, so as to obtain the target transparency of each pixel point in the target image under the light source to be simulated.
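The superimposition and alpha extraction can be sketched as follows (Canvas 2D assumed; drawing the aperture onto the fully transparent container with default source-over compositing leaves exactly the aperture's alpha, and zero everywhere else; names are illustrative):

```typescript
// Hypothetical sketch: superimpose the aperture image on the image
// container at (apertureX, apertureY), then read back the alpha of every
// pixel as the target transparency of the corresponding pixel of the
// target image under the light source to be simulated.
function computeTargetAlpha(
  aperture: HTMLCanvasElement,
  container: HTMLCanvasElement,
  apertureX: number,
  apertureY: number
): Uint8ClampedArray {
  const ctx = container.getContext("2d")!;
  ctx.clearRect(0, 0, container.width, container.height); // alpha 0 everywhere
  ctx.drawImage(aperture, apertureX, apertureY); // first level over second
  const { data } = ctx.getImageData(0, 0, container.width, container.height);
  const alpha = new Uint8ClampedArray(container.width * container.height);
  for (let p = 0; p < alpha.length; p++) {
    alpha[p] = data[p * 4 + 3]; // pixels outside the aperture remain 0
  }
  return alpha;
}
```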
Step 104: and rendering and outputting an illumination simulation image corresponding to the target image under the light source to be simulated based on the target pixel value and the target transparency of each pixel point in the target image.
After determining the target transparency of each pixel point in the target image under the light source to be simulated, the terminal may render and output the illumination simulation image corresponding to the target image under the light source to be simulated, based on the target pixel value and target transparency of each pixel point in the target image. In practical application, the target pixel value of each pixel point in the target image is its RGB color value, and the target transparency of each pixel point under the light source to be simulated is its Alpha channel value. The RGB color value of each pixel point is therefore combined with the target transparency (Alpha channel value) of the corresponding pixel point to obtain the RGBA value of each pixel point of the target image under the light source to be simulated, and the illumination simulation image corresponding to the target image under the light source to be simulated is then rendered based on these RGBA values; the illumination simulation image is the target image as seen under the light source to be simulated.
As an example, referring to FIG. 6, FIG. 6 is a schematic diagram of an illumination simulation image corresponding to a target image under a light source to be simulated according to an embodiment of the present application. Diagram A in FIG. 6 shows the target image in the target illumination state; diagram B in FIG. 6 shows the illumination simulation image corresponding to the target image under the light source to be simulated, in which the irradiation area of the light source to be simulated in the target image is the image area M.
In some embodiments, based on the target pixel value and the target transparency of each pixel point in the target image, the terminal may render and output an illumination simulation image corresponding to the target image under the light source to be simulated in the following manner: creating a blank bitmap for image drawing; drawing each pixel point into a blank bitmap according to the drawing sequence of the pixel points based on the target pixel value and the target transparency of each pixel point in the target image to obtain a drawing image; rendering the drawing image to output an illumination simulation image corresponding to the target image under the light source to be simulated.
In some embodiments, based on the target pixel value and the target transparency of each pixel point in the target image, according to the pixel point drawing sequence, the terminal may draw each pixel point into the blank bitmap in the following manner, to obtain a drawn image: according to the pixel point drawing sequence, the following processing is executed for each pixel point in the target image to obtain a drawing image: determining coordinate information of the pixel points in the blank bitmap, and acquiring target pixel values and target transparency of the pixel points; and calling a graph drawing interface, and drawing the pixel point into a blank bitmap at a target position indicated by the coordinate information based on the target pixel value and the target transparency of the pixel point.
When rendering the illumination simulation image corresponding to the target image under the light source to be simulated, the terminal first creates a blank bitmap for image drawing, which need not contain any content. Then, based on the target pixel value and target transparency of each pixel point in the target image, it draws each pixel point into the blank bitmap in the pixel point drawing order to obtain a drawn image. Finally, it renders the drawn image, thereby outputting the illumination simulation image corresponding to the target image under the light source to be simulated.
In practical implementation, the target pixel value of each pixel point in the target image is its RGB color value, and the target transparency of each pixel point under the light source to be simulated is its Alpha channel value; the RGB color value of each pixel point is combined with the target transparency (Alpha channel value) of the corresponding pixel point to obtain the RGBA value of each pixel point of the target image under the light source to be simulated, and these RGBA values are stored as the RGBA data corresponding to the target image under the light source to be simulated.
In actual implementation, the following processing is performed for each pixel point in the pixel point drawing order: the terminal determines the coordinate information of the pixel point in the blank bitmap, and obtains the target pixel value and target transparency of the pixel point from the stored RGBA data of the target image under the light source to be simulated; it then calls the graph drawing interface and, based on the target pixel value and target transparency of the pixel point, draws the pixel point into the blank bitmap at the target position indicated by the coordinate information.
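A minimal sketch of this drawing step, with Canvas 2D's putImageData standing in for the patent's generic graph drawing interface (a batched write rather than a strict per-pixel call; names are illustrative):

```typescript
// Hypothetical sketch: merge each pixel's RGB color value with its target
// transparency (Alpha channel value) into RGBA data, then draw the result
// into a blank bitmap to obtain the illumination simulation image.
function renderIlluminationImage(
  rgb: Uint8ClampedArray,         // RGBA data of the target image (alpha ignored)
  targetAlpha: Uint8ClampedArray, // one target transparency per pixel
  width: number,
  height: number
): HTMLCanvasElement {
  const bitmap = document.createElement("canvas"); // blank bitmap
  bitmap.width = width;
  bitmap.height = height;
  const ctx = bitmap.getContext("2d")!;
  const out = ctx.createImageData(width, height);
  for (let p = 0; p < width * height; p++) {
    out.data[p * 4] = rgb[p * 4];         // R
    out.data[p * 4 + 1] = rgb[p * 4 + 1]; // G
    out.data[p * 4 + 2] = rgb[p * 4 + 2]; // B
    out.data[p * 4 + 3] = targetAlpha[p]; // target transparency
  }
  ctx.putImageData(out, 0, 0); // the graph drawing call
  return bitmap;
}
```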
In some embodiments, the target image is an image picture of a virtual scene, and the terminal responds to an illumination instruction of the image picture corresponding to the light source to be simulated, and presents an illumination simulation image taking a first area of the image picture as an illumination area of the light source to be simulated; when an adjustment instruction for the irradiation area of the light source to be simulated is received, controlling the irradiation area of the light source to be simulated to be adjusted from the first area to a second area indicated by the adjustment instruction, and displaying an illumination simulation image taking the second area as the irradiation area of the light source to be simulated.
Here, the target image is an image picture of a virtual scene; the virtual scene may be an electronic game scene, and the image picture a game picture in that scene. In the virtual scene, when the image picture is in a no-illumination state, the user can trigger an illumination instruction of the light source to be simulated for the image picture, and the terminal, in response to the illumination instruction, presents an illumination simulation image taking a first area of the image picture as the irradiation area of the light source to be simulated. In practical application, the illumination instruction can be triggered through a preset illumination button, or through trigger operations such as a click, double-click, or long press.
The user may also adjust the irradiation area of the light source to be simulated for the image picture, in order to view areas of the image picture other than the first area. An adjustment instruction for the irradiation area of the light source to be simulated can be triggered by sliding on the screen or by triggering an irradiation area adjustment function item. When the terminal receives the adjustment instruction for the irradiation area of the light source to be simulated, it controls the irradiation area to be adjusted from the first area to a second area indicated by the adjustment instruction, and presents an illumination simulation image taking the second area as the irradiation area of the light source to be simulated.
In some embodiments, the terminal may control the irradiation area of the light source to be simulated to be adjusted from the first area to the second area indicated by the adjustment instruction as follows: determining the aperture position at which the aperture image is placed on the image container when the irradiation area is the first area, and acquiring the target position relative to the image container indicated by the adjustment instruction; and controlling the aperture image to move from the aperture position to the target position, so as to control the irradiation area of the light source to be simulated to be adjusted from the first area to the second area indicated by the adjustment instruction.
Here, the terminal first determines the aperture position at which the aperture image is placed on the image container when the irradiation area is the first area, and acquires the target position relative to the image container indicated by the adjustment instruction. It then controls the aperture image to move from the aperture position to the target position, thereby adjusting the irradiation area of the light source to be simulated from the first area to the second area indicated by the adjustment instruction.
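Under the assumptions of the earlier sketches, adjusting the irradiation area is just re-running the superimposition and rendering with a new aperture position (illustrative names; the helpers are the hypothetical ones sketched above):

```typescript
// Hypothetical sketch: move the aperture to the target position indicated
// by the adjustment instruction and re-render, so the irradiation area
// shifts from the first area to the second area.
function moveIlluminatedArea(
  aperture: HTMLCanvasElement,
  container: HTMLCanvasElement,
  rgb: Uint8ClampedArray,
  target: { x: number; y: number } // target position relative to the container
): HTMLCanvasElement {
  const alpha = computeTargetAlpha(aperture, container, target.x, target.y);
  return renderIlluminationImage(rgb, alpha, container.width, container.height);
}
```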
As an example, referring to FIG. 7, FIG. 7 is a schematic display diagram of an illumination simulation image provided in an embodiment of the present application. As shown in diagram A in FIG. 7, the aperture image is located at position X in the image container; at this time, as shown in diagram B in FIG. 7, the terminal presents an illumination simulation image in which the first area of the image picture is the irradiation area of the light source to be simulated.
In response to an adjustment instruction for the irradiation area of the light source to be simulated, the aperture image is controlled to move from the aperture position (the position of the aperture image on the image container when the irradiation area is the first area), position X, to the target position (the position relative to the image container indicated by the adjustment instruction), position Y, as shown in diagram C in FIG. 7; at this time, as shown in diagram D in FIG. 7, the terminal presents an illumination simulation image with the second area of the image picture as the irradiation area of the light source to be simulated.
By applying the embodiment of the application, when performing illumination simulation in an image, an aperture image corresponding to the light source to be simulated and the target pixel value of each pixel point in the target image in the target illumination state are first acquired; an image container for placing the aperture image is then created based on the target image. Because the aperture image indicates the transparency of each pixel point in the irradiation area of the image corresponding to the light source to be simulated, and each pixel point in the image container has a transparency lower than the transparency threshold, the transparency of each pixel point in the target image can be adjusted based on the aperture image and the image container, yielding the target transparency of each pixel point in the target image under the light source to be simulated; the illumination simulation image corresponding to the target image under the light source to be simulated is then rendered and output based on the target pixel value and the target transparency of each pixel point in the target image. In this way, illumination simulation of the image is realized by adjusting the transparency of pixel points in the target image, without a 3D engine: device performance overhead and the occupation of device processing resources are reduced, the picture rendering speed is improved, stuttering is avoided, and the smoothness of the visual effect is improved. Moreover, art staff need not perform 3D modeling, which saves production cost, and programmers need no 3D expertise, which reduces development cost.
An exemplary application of the embodiments of the present application in a practical application scenario will be described below.
With the development of image processing technology, in the related art an illuminated image may be placed in a 3D scene by a 3D engine such as three.js; a light source (such as a spotlight or a flashlight) is then added to the scene and directed at the image to simulate an illumination effect, so that certain areas of the image are lit up and can be viewed by the user. However, this solution requires a 3D engine, which brings additional performance overhead, is unfriendly to low- and mid-range devices, may cause stuttering when rendering a picture, and degrades the user experience. Moreover, during production, art staff must first perform 3D modeling before programmers can carry out the subsequent programming work, which increases production and development costs.
Based on this, the embodiment of the application provides an illumination simulation method in an image to at least solve the above problems. In the embodiment of the application, the local illumination effect is simulated by changing the transparency of pixel points in the target image, reducing the 3D rendering problem to a 2D one instead of relying on a 3D engine, which places high demands on device performance. First, no extra performance overhead is imposed on the device, low- and mid-range devices are better accommodated, and more users can smoothly experience the visual interaction of simulated illumination; second, developers are not required to have 3D expertise, and no 3D modeling by art staff is needed, which reduces the workload of both programmers and artists and lowers development cost while preserving the visual effect.
Next, an illumination simulation method in an image provided in an embodiment of the present application is described, including:
Step 1, placing the bright image (namely the target image).
As shown in diagram B in fig. 6, the target image in the target illumination state is the bright image. First, the target image in the target illumination state (for example, under natural light) is placed at the bottom layer, representing the target image in a fully lit state. Then, the RGB color values of the pixel points in the target image are extracted sequentially, from top to bottom and from left to right, giving the red, green, and blue color values of every pixel that makes up the target image. Finally, the RGB color values of each pixel point in the target image are recorded. Each of red, green, and blue takes an integer value in the range 0-255.
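The extraction in step 1 maps directly onto the Canvas 2D API. Below is a minimal TypeScript sketch, assuming the bright image has already been drawn into a canvas; the function and variable names are illustrative and not prescribed by the patent.

function extractRgb(bright: HTMLCanvasElement): Uint8ClampedArray {
  const ctx = bright.getContext('2d')!;
  // getImageData returns RGBA bytes in row-major order: top to bottom, left to right
  const { data } = ctx.getImageData(0, 0, bright.width, bright.height);
  const rgb = new Uint8ClampedArray((data.length / 4) * 3);
  for (let i = 0, j = 0; i < data.length; i += 4, j += 3) {
    rgb[j] = data[i];         // red,   integer 0-255
    rgb[j + 1] = data[i + 1]; // green, integer 0-255
    rgb[j + 2] = data[i + 2]; // blue,  integer 0-255
  }
  return rgb; // recorded RGB values, reused in step 7
}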
Step 2, preparing an aperture image.
Referring to fig. 4, fig. 4 is a schematic display diagram of an aperture image according to an embodiment of the present application. As shown in diagram A in fig. 4, the aperture image describes the transparency of each pixel point of the image within the irradiation area under the light source to be simulated, so that the change in transparency simulates the attenuation of the light intensity of the light source to be simulated (illustrated as a point light source) from the center outward. To make the aperture image more clearly visible, its background can be painted black; as shown in diagram B in fig. 4, the white part is the aperture image, and in actual implementation the color can be set as required. As shown in diagram C in fig. 4, for the aperture image corresponding to a point light source, the transparency of the pixel point at the center of the circular irradiation area is 1, and, taking the center as the origin, the transparency of the pixel points decreases monotonically to 0 outward along the radius, so that the transparency change of the pixel points in the aperture image simulates the attenuation of the light intensity of the light source to be simulated.
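Such a radial alpha falloff can be produced with a standard radial gradient. The following sketch draws a point-light aperture into its own canvas; the names, and the choice of a linear falloff, are assumptions, since the description above only requires transparency decreasing from 1 at the center to 0 at the rim.

function makeApertureImage(radius: number): HTMLCanvasElement {
  const canvas = document.createElement('canvas');
  canvas.width = canvas.height = radius * 2;
  const ctx = canvas.getContext('2d')!;
  // radial gradient: alpha 1 at the center, alpha 0 at the rim
  const g = ctx.createRadialGradient(radius, radius, 0, radius, radius, radius);
  g.addColorStop(0, 'rgba(255, 255, 255, 1)');
  g.addColorStop(1, 'rgba(255, 255, 255, 0)');
  ctx.fillStyle = g;
  ctx.fillRect(0, 0, canvas.width, canvas.height);
  return canvas;
}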
Step 3, creating an aperture container (namely the image container) and placing an aperture image.
First, a blank container matching the shape and size of the bright image (namely the target image) is created as the aperture container. Then, the aperture image from step 2 is placed in the aperture container; the aperture image can move within the aperture container, and at any given moment the transparency of the aperture container is greater than 0 only at the pixel points covered by the aperture image, thereby simulating the irradiation area of the image corresponding to the light source to be simulated.
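A newly created canvas already has alpha 0 at every pixel, which matches the container's required initial state. The sketch below (illustrative names, not the patent's API) creates such a container and stamps the aperture image into it at a given position.

function makeApertureContainer(bright: HTMLCanvasElement): HTMLCanvasElement {
  const container = document.createElement('canvas');
  container.width = bright.width;   // same size and shape as the bright image
  container.height = bright.height;
  return container; // a new canvas is fully transparent: alpha 0 everywhere
}

function placeAperture(container: HTMLCanvasElement,
                       aperture: HTMLCanvasElement,
                       x: number, y: number): void {
  const ctx = container.getContext('2d')!;
  ctx.clearRect(0, 0, container.width, container.height); // reset to all-transparent
  ctx.drawImage(aperture, x, y); // only pixels under the aperture now have alpha > 0
}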
Step 4, placing the aperture container.
Here, the aperture container holding the aperture image obtained in step 3 is placed above the bright image, with the aperture container and the bright image completely overlapping. Referring to fig. 5, fig. 5 is a schematic diagram of the layer relationship among the aperture image, the image container, and the target image according to an embodiment of the present application. The aperture image is located above the image container (i.e., the aperture container), the image container is located above the target image (i.e., the bright image), and the target image is at the bottom layer.
Step 5, monitoring screen touch events.
Here, screen touch events of the electronic device may be monitored. When the user slides on the screen of the electronic device, the position coordinates of the contact point between the user's finger and the screen are recorded, the aperture image is moved to the position indicated by those coordinates, and the aperture image is continuously controlled to follow the finger; for example, it moves with the user's finger from position A in the upper-left corner to position B in the lower-right corner, as shown in diagrams A and C in fig. 7.
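A minimal sketch of such a listener follows, assuming pointer events are available and that the aperture should stay centered under the finger; the helper name and callback shape are assumptions.

function trackAperture(screen: HTMLCanvasElement,
                       aperture: HTMLCanvasElement,
                       onMove: (x: number, y: number) => void): void {
  screen.addEventListener('pointermove', (e: PointerEvent) => {
    const rect = screen.getBoundingClientRect();
    // convert viewport coordinates to canvas-local coordinates and
    // center the aperture on the contact point
    onMove(e.clientX - rect.left - aperture.width / 2,
           e.clientY - rect.top - aperture.height / 2);
  });
}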
Step 6, reading the transparency of each pixel point in the aperture container.
Here, after the aperture image, the aperture container, and the bright image are superimposed, before each screen rendering the transparency of each pixel point in the superimposed aperture container is read and recorded sequentially, from top to bottom and from left to right. Since the position of the aperture image changes with the position at which the user touches the screen, the transparency must be re-read each time a frame is rendered, to ensure that the transparency data reflects the current moment.
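Reading the alpha channel in that order amounts to walking every fourth byte of the container's pixel buffer. A minimal sketch, with illustrative names:

function readAlphas(container: HTMLCanvasElement): Uint8ClampedArray {
  const ctx = container.getContext('2d')!;
  const { data } = ctx.getImageData(0, 0, container.width, container.height);
  const alphas = new Uint8ClampedArray(data.length / 4);
  // every fourth byte of the RGBA buffer is the alpha channel, 0-255
  for (let i = 3, j = 0; i < data.length; i += 4, j++) {
    alphas[j] = data[i];
  }
  return alphas; // re-read before every frame, top to bottom, left to right
}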
Step 7, combining the RGB color values of the bright image with the read transparency values to obtain RGBA color values.
Here, an image is composed of pixels arranged in a rectangle. Each pixel has three color channels: red (R), green (G), and blue (B); pixels in some picture formats (such as PNG) additionally carry a transparency channel (alpha). The embodiment of the application requires the transparency channel, and a color value carrying a transparency channel is called an RGBA color value. The RGBA color value here is the RGBA value of each pixel point in the target image (i.e., the bright image) under the light source to be simulated.
In step 1 the RGB color value of each pixel in the bright image was determined, and in step 6 the transparency of each pixel point in the superimposed aperture container at the current moment was read; combining the two yields the RGBA color value. In practical applications, the RGBA color value can be stored in the format "#RRGGBBAA", where each character is a hexadecimal digit and every two digits encode one channel, in the order red, green, blue, alpha, each channel ranging from 0 to 255. For example, "#FF00007F" represents red at about 50% opacity: the FF after the # sets the red channel to 255, the green and blue channels are 00, so the mixed color is pure red; in the alpha channel 255 would represent 100%, and the last two digits, 7F, equal 127, roughly half of 255, i.e., about 50% opacity.
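Merging a recorded RGB triple with a freshly read alpha into such a string takes one hexadecimal conversion per channel. A small sketch, with an illustrative helper name:

function toRgbaHex(r: number, g: number, b: number, a: number): string {
  const hex = (v: number) => v.toString(16).padStart(2, '0').toUpperCase();
  return `#${hex(r)}${hex(g)}${hex(b)}${hex(a)}`;
}

// the worked example above: pure red at roughly 50% opacity
toRgbaHex(255, 0, 0, 127); // "#FF00007F"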
Step 8, drawing the illumination simulation image in the Canvas according to the RGBA color values.
Canvas provides the putImageData API, which can map a set of RGBA color values into a picture. Specifically, CanvasRenderingContext2D.putImageData() draws the data of a given ImageData object onto the bitmap, i.e., via context.putImageData(imageData, dx, dy).
The parameter imageData is an ImageData object containing image pixel information, namely the RGBA color value data to be written. The ImageData interface represents the pixel data of a region of a <canvas> element; such an object can be returned by the createImageData() and getImageData() methods of the CanvasRenderingContext2D object. ImageData has the following attributes, per the specification:
1) ImageData.data: read-only; a Uint8ClampedArray containing the RGBA pixel information, in which every value is an integer in the range 0-255;
2) ImageData.height: read-only; an unsigned long integer representing the actual pixel height of the ImageData;
3) ImageData.width: read-only; an unsigned long integer representing the actual pixel width of the ImageData.
Here, dx is the x-coordinate in the target canvas (bitmap) at which replacement with the image data (i.e., the RGBA color value data) starts, and dy is the corresponding starting y-coordinate.
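Putting steps 7 and 8 together, the sketch below fills an ImageData object from the recorded RGB values and the per-frame alpha values, then writes it to the output canvas with putImageData; the function names refer back to the earlier sketches and are assumptions.

function drawLitFrame(output: HTMLCanvasElement,
                      rgb: Uint8ClampedArray,
                      alphas: Uint8ClampedArray): void {
  const ctx = output.getContext('2d')!;
  const frame = ctx.createImageData(output.width, output.height);
  for (let p = 0; p < alphas.length; p++) {
    frame.data[p * 4] = rgb[p * 3];         // R from the recorded bright image
    frame.data[p * 4 + 1] = rgb[p * 3 + 1]; // G
    frame.data[p * 4 + 2] = rgb[p * 3 + 2]; // B
    frame.data[p * 4 + 3] = alphas[p];      // A read from the aperture container
  }
  ctx.putImageData(frame, 0, 0); // dx = 0, dy = 0: replace the whole bitmap
}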
The color of the illumination simulation picture drawn in the above steps is consistent with the bright image, but its transparency changes in real time with the position of the aperture image: the closer a pixel's transparency is to 1, the more clearly the content of the bright image shows through in the illumination simulation image; the closer it is to 0, the darker that content appears. Referring to fig. 8, fig. 8 is a schematic display diagram of an illumination simulation image provided in an embodiment of the present application. Here, for the illumination simulation image of a point light source applied to the bright image, in the area near the center position N of the aperture image the transparency is close to 1, and the content of the bright image is displayed clearly; in areas far from the center of the aperture image the transparency approaches 0, and the content of the bright image appears dark.
Finally, each drawn illumination simulation image is rendered to the screen in real time, so that the user sees the effect of a local area of the target image being lit up, with the lit area following the movement of the user's finger; this interactively simulates the visual effect of a local area being illuminated by the target light source (such as a flashlight).
Next, an illumination simulation method in an image provided by an embodiment of the present application is described. Referring to fig. 9, fig. 9 is a schematic flowchart of the illumination simulation method provided by the embodiment of the present application; the method includes:
step 201: placing the bright image at the bottommost layer;
step 202: reading the red, green, and blue channel values (RGB values) of each pixel point of the bright image sequentially, from top to bottom and from left to right, and recording them;
step 203: creating a blank container (the aperture container) with the same width and height as the bright image, and placing the aperture image into it;
step 204: placing the aperture container on top of the bright image so that the two overlap completely;
step 205: monitoring screen touch events;
step 206: when the user slides on the screen, recording the coordinates of the touch position during the slide, and moving the aperture image to the position indicated by those coordinates;
step 207: each time a frame is rendered to the screen, traversing and recording the transparency (alpha) of each pixel point in the superimposed aperture container;
step 208: combining the RGB values of each pixel point in the bright image from step 202 with the transparency read in step 207 to obtain the RGBA color values;
step 209: drawing a picture in the Canvas according to the RGBA values;
step 210: rendering the drawn picture to the screen in real time.
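The whole flow of steps 201-210 can be driven by a per-frame loop. The sketch below reuses the illustrative helpers from the earlier sketches; the aperture radius of 100 pixels and an output canvas the same size as the bright image are assumptions.

function startSimulation(bright: HTMLCanvasElement,
                         output: HTMLCanvasElement): void {
  const rgb = extractRgb(bright);                  // steps 201-202
  const container = makeApertureContainer(bright); // step 203
  const aperture = makeApertureImage(100);         // step 203, assumed radius
  let x = 0, y = 0;
  trackAperture(output, aperture, (nx, ny) => { x = nx; y = ny; }); // steps 205-206

  const frame = () => {
    placeAperture(container, aperture, x, y); // step 204, aperture follows the finger
    const alphas = readAlphas(container);     // step 207
    drawLitFrame(output, rgb, alphas);        // steps 208-210
    requestAnimationFrame(frame);             // redraw on every screen frame
  };
  requestAnimationFrame(frame);
}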
By applying the embodiment of the application, the illumination effect of a light source (such as a flashlight) on the bright image can be simulated on the device with only a bright image and an aperture image, avoiding the extra performance overhead of a 3D engine, improving rendering speed, preserving the visual effect, and making the user experience smoother. Moreover, art staff need not perform 3D modeling, which saves production cost, and programmers need no 3D expertise, which reduces development cost.
Continuing with the description of an exemplary architecture of the in-image illumination simulation device 555 implemented as software modules provided in an embodiment of the present application: in some embodiments, as shown in fig. 2, the software modules of the in-image illumination simulation device 555 stored in the memory 550 may include:
the first obtaining module 5551 is configured to obtain an aperture image corresponding to a light source to be simulated, where the aperture image is used to indicate transparency of each pixel point in an illumination area of the image corresponding to the light source to be simulated;
a second obtaining module 5552, configured to obtain a target pixel value of each pixel point in a target image in a target illumination state, and create, based on the target image, an image container for placing the aperture image, where transparency of each pixel point in the image container is lower than a transparency threshold;
the adjusting module 5553 is configured to adjust the transparency of each pixel point in the target image based on the aperture image and the image container, so as to obtain a target transparency of each pixel point in the target image under the light source to be simulated;
and the rendering module 5554 is configured to render and output an illumination simulation image corresponding to the target image under the light source to be simulated based on the target pixel value and the target transparency of each pixel point in the target image.
In some embodiments, the first obtaining module 5551 is further configured to obtain information about a light intensity variation of the light source to be simulated;
determining the transparency of each pixel point in the irradiation area of the image corresponding to the light source to be simulated based on the light intensity change information;
and generating an aperture image corresponding to the light source to be simulated based on the transparency of each pixel point in the irradiation area of the image corresponding to the light source to be simulated.
In some embodiments, the second obtaining module 5552 is further configured to perform, for each of the pixel points in the target image in the target illumination state, the following processing:
acquiring a pixel value of the pixel point on a red R-green G-blue B color channel;
and determining the pixel value of the pixel point on the red R-green G-blue B color channel as a target pixel value of the pixel point.
In some embodiments, the second acquisition module 5552 is further configured to acquire an image shape and an image size of the target image;
a blank image conforming to the image shape and image size is created as an image container for placing the aperture image.
In some embodiments, the image container is consistent in shape and size with the target image;
the adjustment module 5553 is further configured to determine image levels corresponding to the aperture image, the image container, and the target image, respectively;
superposing the aperture image, the image container and the target image according to the image level to obtain a superposed image;
and extracting the transparency of each pixel point in the superimposed image, and taking the transparency of each pixel point in the superimposed image as the target transparency of the corresponding pixel point in the target image under the light source to be simulated.
In some embodiments, the image levels corresponding to the aperture image, the image container, and the target image are a first level, a second level, and a third level in order from an upper layer to a bottom layer, respectively;
the adjustment module 5553 is further configured to superimpose the aperture image onto the image container to obtain an intermediate superimposed image;
and superposing the intermediate superimposed image on the target image to obtain the superimposed image.
In some embodiments, the adjusting module 5553 is further configured to determine a target pixel point in the superimposed image corresponding to the aperture image when the transparency of each pixel point in the image container is zero;
and extracting the transparency of each target pixel point in the superimposed image, taking the transparency of the target pixel point as the transparency of the corresponding pixel point in the target image under the light source to be simulated, and determining the transparency of other pixel points in the target image to be zero so as to obtain the target transparency of each pixel point in the target image under the light source to be simulated.
In some embodiments, the rendering module 5554 is further configured to create a blank bitmap for image rendering;
drawing each pixel point into the blank bitmap according to the pixel point drawing sequence based on the target pixel value and the target transparency of each pixel point in the target image to obtain a drawing image;
rendering the drawing image to output an illumination simulation image corresponding to the target image under the light source to be simulated.
In some embodiments, the rendering module 5554 is further configured to perform, for each pixel in the target image, the following processing according to the pixel drawing order, so as to obtain a drawn image:
determining coordinate information of the pixel point in the blank bitmap, and acquiring the target pixel value and target transparency of the pixel point;
and calling a graph drawing interface, and drawing the pixel point into the blank bitmap at a target position indicated by the coordinate information based on the target pixel value and the target transparency of the pixel point.
In some embodiments, the target image is an image picture of a virtual scene, the apparatus further comprising:
the display module is used for responding to the illumination instruction of the to-be-simulated light source corresponding to the image picture and displaying an illumination simulation image taking the first area of the image picture as the illumination area of the to-be-simulated light source;
when receiving an adjustment instruction for the irradiation area of the light source to be simulated, controlling the irradiation area of the light source to be simulated to be adjusted from the first area to a second area indicated by the adjustment instruction, and
presenting an illumination simulation image taking the second area as an illumination area of the light source to be simulated.
In some embodiments, the display module is further configured to determine the aperture position at which the aperture image is placed on the image container when the irradiation area is the first area, and to obtain the target position relative to the image container indicated by the adjustment instruction;
and to control the aperture image to move from the aperture position to the target position, so as to control the irradiation area of the light source to be simulated to be adjusted from the first area to the second area indicated by the adjustment instruction.
By applying the embodiment of the application, when performing illumination simulation in an image, an aperture image corresponding to the light source to be simulated and the target pixel value of each pixel point in the target image in the target illumination state are first acquired; an image container for placing the aperture image is then created based on the target image. Because the aperture image indicates the transparency of each pixel point in the irradiation area of the image corresponding to the light source to be simulated, and each pixel point in the image container has a transparency lower than the transparency threshold, the transparency of each pixel point in the target image can be adjusted based on the aperture image and the image container, yielding the target transparency of each pixel point in the target image under the light source to be simulated; the illumination simulation image corresponding to the target image under the light source to be simulated is then rendered and output based on the target pixel value and the target transparency of each pixel point in the target image. In this way, illumination simulation of the image is realized by adjusting the transparency of pixel points in the target image, without a 3D engine: device performance overhead and the occupation of device processing resources are reduced, the picture rendering speed is improved, stuttering is avoided, and the smoothness of the visual effect is improved.
The embodiment of the application also provides electronic equipment, which comprises:
a memory for storing executable instructions;
and the processor is used for realizing the illumination simulation method in the image provided by the embodiment of the application when executing the executable instructions stored in the memory.
Embodiments of the present application also provide a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the illumination simulation method in the image provided by the embodiment of the application.
The embodiment of the application also provides a computer readable storage medium which stores executable instructions, and when the executable instructions are executed by a processor, the illumination simulation method in the image provided by the embodiment of the application is realized.
In some embodiments, the computer-readable storage medium may be an FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disc, or CD-ROM, or may be any of various devices including one of, or any combination of, the above memories.
In some embodiments, the executable instructions may be in the form of programs, software modules, scripts, or code, written in any form of programming language (including compiled or interpreted languages, or declarative or procedural languages), and they may be deployed in any form, including as stand-alone programs or as modules, components, subroutines, or other units suitable for use in a computing environment.
As an example, the executable instructions may, but need not, correspond to files in a file system, and may be stored as part of a file that holds other programs or data, for example, in one or more scripts in a HyperText Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, subprograms, or portions of code).
As an example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices located at one site or, alternatively, distributed across multiple sites and interconnected by a communication network.
The foregoing is merely exemplary embodiments of the present application and is not intended to limit the scope of the present application. Any modifications, equivalent substitutions, improvements, etc. that are within the spirit and scope of the present application are intended to be included within the scope of the present application.

Claims (12)

1. A method of illumination simulation in an image, the method comprising:
acquiring an aperture image corresponding to a light source to be simulated, wherein the aperture image is used for indicating the transparency of each pixel point in an irradiation area of the image corresponding to the light source to be simulated;
acquiring a target pixel value of each pixel point in a target image in a target illumination state, and acquiring an image shape and an image size of the target image;
creating a blank image consistent with the image shape and the image size as an image container for placing the aperture image, wherein the transparency of each pixel point in the image container is lower than a transparency threshold;
determining respective image levels of the aperture image, the image container and the target image;
superposing the aperture image, the image container and the target image according to the image level to obtain a superposed image;
extracting the transparency of each pixel point in the superimposed image, and taking the transparency of each pixel point in the superimposed image as the target transparency of each pixel point in the target image under the light source to be simulated;
and rendering and outputting an illumination simulation image corresponding to the target image under the light source to be simulated based on the target pixel value and the target transparency of each pixel point in the target image.
2. The method of claim 1, wherein the acquiring the aperture image corresponding to the light source to be simulated comprises:
acquiring light intensity change information of the light source to be simulated;
determining the transparency of each pixel point in the irradiation area of the image corresponding to the light source to be simulated based on the light intensity change information;
and generating an aperture image corresponding to the light source to be simulated based on the transparency of each pixel point in the irradiation area of the image corresponding to the light source to be simulated.
3. The method of claim 1, wherein the obtaining the target pixel value of each pixel point in the target image in the target illumination state comprises:
for each pixel point in the target image in the target illumination state, the following processing is executed respectively:
acquiring a pixel value of the pixel point on a red R-green G-blue B color channel;
and determining the pixel value of the pixel point on the red R-green G-blue B color channel as a target pixel value of the pixel point.
4. The method of claim 1, wherein the respective image levels of the aperture image, the image container, and the target image are, in order from an upper layer to a lower layer, a first level, a second level, and a third level, respectively;
the step of superposing the aperture image, the image container and the target image according to the image level to obtain a superposed image comprises the following steps:
superposing the aperture image on the image container to obtain an intermediate superposition image;
and superposing the intermediate superposition image on the target image to obtain the superposition image.
5. The method of claim 1, wherein the extracting the transparency of each pixel in the superimposed image and taking the transparency of each pixel in the superimposed image as the target transparency of the corresponding pixel in the target image under the light source to be simulated comprises:
when the transparency of each pixel point in the image container is zero, determining a target pixel point corresponding to the aperture image in the superimposed image;
and extracting the transparency of each target pixel point in the superimposed image, taking the transparency of the target pixel point as the transparency of the corresponding pixel point in the target image under the light source to be simulated, and determining the transparency of other pixel points in the target image to be zero, so as to obtain the target transparency of each pixel point in the target image under the light source to be simulated.
6. The method of claim 1, wherein rendering and outputting an illumination simulation image corresponding to the target image under the light source to be simulated based on the target pixel value and the target transparency of each pixel point in the target image comprises:
creating a blank bitmap for image drawing;
drawing each pixel point into the blank bitmap according to the pixel point drawing sequence based on the target pixel value and the target transparency of each pixel point in the target image to obtain a drawing image;
rendering the drawing image to output an illumination simulation image corresponding to the target image under the light source to be simulated.
7. The method of claim 6, wherein the drawing each pixel point into the blank bitmap according to a pixel point drawing order based on the target pixel value and the target transparency of each pixel point in the target image, to obtain a drawn image, comprises:
According to the pixel point drawing sequence, the following processing is executed for each pixel point in the target image so as to obtain a drawing image:
determining coordinate information of the pixel point in the blank bitmap, and acquiring a target pixel value and target transparency of the pixel point;
and calling a graph drawing interface, and drawing the pixel point into the blank bitmap at a target position indicated by the coordinate information based on the target pixel value and the target transparency of the pixel point.
8. The method of claim 1, wherein the target image is an image picture of a virtual scene, the method further comprising:
responding to an illumination instruction of the to-be-simulated light source corresponding to the image picture, and presenting an illumination simulation image taking a first area of the image picture as an illumination area of the to-be-simulated light source;
when receiving an adjustment instruction for the irradiation area of the light source to be simulated, controlling the irradiation area of the light source to be simulated to be adjusted from the first area to a second area indicated by the adjustment instruction, and
presenting an illumination simulation image taking the second area as an illumination area of the light source to be simulated.
9. The method of claim 8, wherein the controlling the illumination area of the light source to be emulated to be adjusted by the first area to a second area indicated by the adjustment instruction comprises:
determining an aperture position of the aperture image on the image container when the irradiation area is the first area, and acquiring a target position relative to the image container indicated by the adjustment instruction;
and controlling the aperture image to move from the aperture position to the target position so as to control the irradiation area of the light source to be simulated to be adjusted from the first area to a second area indicated by the adjustment instruction.
10. An illumination emulation apparatus in an image, said apparatus comprising:
the first acquisition module is used for acquiring an aperture image corresponding to the light source to be simulated, and the aperture image is used for indicating the transparency of each pixel point in the irradiation area of the image corresponding to the light source to be simulated;
the second acquisition module is used for acquiring target pixel values of all pixel points in a target image in a target illumination state and acquiring the image shape and the image size of the target image; creating a blank image consistent with the image shape and the image size as an image container for placing the aperture image, wherein the transparency of each pixel point in the image container is lower than a transparency threshold;
the adjusting module is used for respectively determining the image levels of the aperture image, the image container and the target image; superposing the aperture image, the image container and the target image according to the image level to obtain a superposed image; extracting the transparency of each pixel point in the superimposed image, and taking the transparency of each pixel point in the superimposed image as the target transparency of each pixel point in the target image under the light source to be simulated;
and the rendering module is used for rendering and outputting an illumination simulation image corresponding to the target image under the light source to be simulated based on the target pixel value and the target transparency of each pixel point in the target image.
11. An electronic device, the electronic device comprising:
a memory for storing executable instructions;
a processor for implementing the illumination simulation method in an image according to any of claims 1 to 9 when executing executable instructions stored in said memory.
12. A computer readable storage medium storing executable instructions which, when executed by a processor, implement the illumination simulation method in an image according to any one of claims 1 to 9.
CN202111268299.5A 2021-10-29 2021-10-29 Illumination simulation method, device, equipment and storage medium in image Active CN114049425B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111268299.5A CN114049425B (en) 2021-10-29 2021-10-29 Illumination simulation method, device, equipment and storage medium in image

Publications (2)

Publication Number Publication Date
CN114049425A CN114049425A (en) 2022-02-15
CN114049425B true CN114049425B (en) 2023-06-09

Family

ID=80207048

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111268299.5A Active CN114049425B (en) 2021-10-29 2021-10-29 Illumination simulation method, device, equipment and storage medium in image

Country Status (1)

Country Link
CN (1) CN114049425B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101226644A (en) * 2008-02-18 2008-07-23 朱东晖 Method for tracing shade and shadow of space two-dimension image
CN104794699A (en) * 2015-05-08 2015-07-22 四川天上友嘉网络科技有限公司 Image rendering method applied to games
CN110503725A (en) * 2019-08-27 2019-11-26 百度在线网络技术(北京)有限公司 Method, apparatus, electronic equipment and the computer readable storage medium of image procossing
CN110782391A (en) * 2019-09-10 2020-02-11 腾讯科技(深圳)有限公司 Image processing method and device in driving simulation scene and storage medium
CN112153303A (en) * 2020-09-28 2020-12-29 广州虎牙科技有限公司 Visual data processing method and device, image processing equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7847800B2 (en) * 2004-04-16 2010-12-07 Apple Inc. System for emulating graphics operations
CN110570505B (en) * 2019-09-11 2020-11-17 腾讯科技(深圳)有限公司 Image rendering method, device and equipment and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Mohan Muppidi, Paul Rad, Sos S. Agaian, Mo Jamshidi. Container based parallelization for faster and reliable image segmentation. IEEE, 2015, full text. *
Research on halftone image reflectance based on Markov chains; Liu Zhen, Zhang Yixin, Gong Ye; Packaging Engineering (No. 07); full text *
Illumination simulation method for remote sensing imaging based on geometric mapping; Wang Chenhao; Journal of System Simulation (No. 03); full text *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant