CN116012511A - Texture display method and device and electronic terminal - Google Patents


Info

Publication number
CN116012511A
Authority
CN
China
Prior art keywords
graffiti
texture
sphere
point
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211287929.8A
Other languages
Chinese (zh)
Inventor
杨初喜
姜锐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202211287929.8A priority Critical patent/CN116012511A/en
Publication of CN116012511A publication Critical patent/CN116012511A/en
Pending legal-status Critical Current

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The disclosure provides a texture display method and device and an electronic terminal, relating to the technical field of image rendering and addressing the technical problem in the prior art that graffiti textures are displayed poorly on the surface of a three-dimensional sphere. The method comprises the following steps: in response to a graffiti operation on a sphere on the graphical user interface, determining the three-dimensional coordinates, in the world coordinate system of the virtual scene space, of the operation point on the sphere corresponding to the graffiti operation; converting the three-dimensional coordinates into first UV coordinates on the surface of the sphere using a specified conversion matrix; and displaying the graffiti texture corresponding to the graffiti operation on the spherical surface based on the first UV coordinates and a preset texture display size.

Description

Texture display method and device and electronic terminal
Technical Field
The disclosure relates to the technical field of image rendering, and in particular relates to a texture display method and device and an electronic terminal.
Background
Currently, with the development of image rendering technology, two-dimensional images can be displayed on three-dimensional models. For example, in a game or animation, a user may write or draw graffiti on a sheet of paper or a wall, and the written strokes or graffiti are displayed as two-dimensional images on the three-dimensional paper model or wall model.
However, in the prior art, two-dimensional images are displayed on the surface of a three-dimensional sphere with low accuracy: graffiti textures cannot be displayed well on the spherical surface, resulting in the technical problem of a poor display effect when displaying graffiti textures on the surface of a three-dimensional sphere.
Disclosure of Invention
The present disclosure aims to provide a texture display method and device and an electronic terminal, so as to solve the technical problem in the prior art that the display effect of displaying graffiti textures on the surface of a three-dimensional sphere is poor.
In a first aspect, an embodiment of the present disclosure provides a method for displaying textures, where a graphical user interface is provided by a terminal device, where content displayed by the graphical user interface includes a sphere located in a virtual scene space; the method comprises the following steps:
in response to a graffiti operation on the graphical user interface for the sphere, determining three-dimensional coordinates of a corresponding operation point of the graffiti operation on the sphere in a world coordinate system of the virtual scene space;
converting the three-dimensional coordinates into first UV coordinates on the surface of the sphere using a specified conversion matrix;
and displaying the graffiti texture corresponding to the graffiti operation on the spherical surface based on the first UV coordinates and the display size of the preset texture.
In a second aspect, an embodiment of the present disclosure provides a texture display apparatus, where a graphical user interface is provided by a terminal device, and the content displayed by the graphical user interface includes a sphere located in a virtual scene space; the apparatus comprises:
a determining module, configured to determine, in response to a graffiti operation on the graphical user interface for the sphere, three-dimensional coordinates of an operation point corresponding to the graffiti operation on the sphere in a world coordinate system of the virtual scene space;
the conversion module is used for converting the three-dimensional coordinates into first UV coordinates on the spherical surface of the sphere by using a specified conversion matrix;
and the display module is used for displaying the graffiti texture corresponding to the graffiti operation on the spherical surface based on the first UV coordinates and the display size of the preset texture.
In a third aspect, an embodiment of the present disclosure further provides an electronic terminal, including a memory and a processor, where the memory stores a computer program executable on the processor, and the processor implements the steps of the method described in the first aspect when executing the computer program.
In a fourth aspect, embodiments of the present disclosure further provide a computer-readable storage medium storing computer-executable instructions that, when invoked and executed by a processor, cause the processor to perform the method of the first aspect.
The embodiment of the disclosure brings the following beneficial effects:
according to the method and the device for displaying the textures, the three-dimensional coordinates of the corresponding operating points of the graffiti operation on the sphere in the world coordinate system of the virtual scene space are determined in response to the graffiti operation on the sphere on the graphical user interface, then the three-dimensional coordinates are converted into the first UV coordinates on the sphere of the sphere by using the appointed conversion matrix, and accordingly the graffiti textures corresponding to the graffiti operation are displayed on the sphere based on the first UV coordinates and the display size of the preset textures. In this scheme, through appointed conversion matrix with the three-dimensional coordinate conversion of on the spheroid graffiti operating point in the world coordinate system of virtual scene space on the spheroid sphere, can be based on this UV coordinate and the UV coordinate that presets texture display size and directly show the graffiti texture on the sphere for the two-dimensional texture that shows on three-dimensional spheroid surface is more accurate, thereby can comparatively clearly show the scope of graffiti brush on three-dimensional spheroid surface, has improved the display effect of three-dimensional spheroid surface demonstration graffiti texture, alleviates the technical problem that the display effect of three-dimensional spheroid surface demonstration graffiti texture is relatively poor among the prior art, improves user's use experience.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the prior art, the drawings required in the detailed description are briefly introduced below. It is apparent that the drawings in the following description show some embodiments of the present disclosure, and that a person of ordinary skill in the art may obtain other drawings from them without inventive effort.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of an electronic terminal according to an embodiment of the disclosure;
fig. 3 is a schematic view of a usage scenario of an electronic terminal according to an embodiment of the present disclosure;
fig. 4 is a flowchart illustrating a method for displaying textures according to an embodiment of the present disclosure;
FIG. 5 is a graphical user interface schematic provided by an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of another graphical user interface provided by an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of another graphical user interface provided by an embodiment of the present disclosure;
FIG. 8 is a schematic diagram illustrating the intersection of a ray and a sphere according to an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of spatial coordinate transformation according to an embodiment of the disclosure;
FIG. 10 is a schematic diagram of another graphical user interface provided by an embodiment of the present disclosure;
FIG. 11 is a flowchart illustrating another texture display method according to an embodiment of the present disclosure;
fig. 12 is a structural view of a texture display device according to an embodiment of the present disclosure.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions of the present disclosure will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present disclosure, but not all embodiments. Based on the embodiments in this disclosure, all other embodiments that a person of ordinary skill in the art would obtain without making any inventive effort are within the scope of protection of this disclosure.
The terms "comprising" and "having" and any variations thereof, as referred to in the embodiments of the disclosure, are intended to cover non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed but may optionally include other steps or elements not listed or inherent to such process, method, article, or apparatus.
In the prior art, a two-dimensional image can be displayed well on a flat surface of a three-dimensional virtual object, for example as a map on one face of a virtual cube; for a three-dimensional sphere, however, the two-dimensional image cannot be displayed well on the curved surface. For example, when graffiti is drawn on a three-dimensional sphere, the graffiti strokes cannot be displayed well.
Based on the above, the embodiments of the present disclosure provide a texture display method and device and an electronic terminal, by which the extent of a graffiti brush can be displayed clearly on the surface of a three-dimensional sphere, alleviating the technical problem in the prior art that graffiti textures are displayed poorly on the surface of a three-dimensional sphere, and improving the user experience.
In one embodiment of the present disclosure, the texture display method may be executed on an electronic terminal such as a terminal device or a server device. The terminal device may be a local terminal device. When the texture display method runs on a server, it may be implemented and executed based on a cloud interaction system, which comprises a server and a client device.
In an alternative embodiment, various cloud applications, such as cloud games, may run under the cloud interaction system. Taking a cloud game as an example, a cloud game is a game mode based on cloud computing. In the running mode of a cloud game, the subject that runs the game program is separated from the subject that presents the game picture: the storage and execution of the texture display method are completed on the cloud game server, while the client device receives and sends data and presents the game picture. For example, the client device may be a display device close to the user side with a data transmission function, such as a mobile terminal, a television, a computer, or a handheld computer, while the information processing is performed by the cloud game server. When playing, the player operates the client device to send operation instructions to the cloud game server; the cloud game server runs the game according to the instructions, encodes and compresses the game pictures and other data, and returns them over the network to the client device, which finally decodes the data and outputs the game pictures.
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores the game program and presents the game picture. The local terminal device interacts with the player through the graphical user interface, that is, the game program is conventionally downloaded, installed, and run on the electronic terminal device. The local terminal device may provide the graphical user interface to the player in a variety of ways; for example, it may be rendered on a display screen of the terminal, or provided to the player by holographic projection. For example, the local terminal device may include a display screen for presenting a graphical user interface including the game picture, and a processor for running the game, generating the graphical user interface, and controlling its display on the display screen.
In a possible implementation manner, the embodiment of the present disclosure provides a method for displaying a texture, and a graphical user interface is provided through a first terminal device, where the first terminal device may be the aforementioned local terminal device or the aforementioned client device in the cloud interaction system.
For example, as shown in fig. 1, fig. 1 is a schematic view of an application scenario provided in an embodiment of the present disclosure. The application scenario may include a terminal device 102 and a server device 101, where the terminal device 102 may communicate with the server device 101 through a wired network or a wireless network. The electronic terminal in this embodiment may be the terminal device 102 or the server device 101.
The electronic terminal of this embodiment is described taking the terminal device 102 as an example. As shown in fig. 2, the terminal device 102 includes a memory 1021 and a processor 1022; the memory stores a computer program executable on the processor, which, when executed, implements the steps of the method provided by the above embodiments.
Referring to fig. 2, the terminal device 102 further includes: bus 1023 and communication interface 1024, processor 1022, communication interface 1024, and memory 1021 are connected by bus 1023; processor 1022 is configured to execute executable modules, such as computer programs, stored in memory 1021.
The memory 1021 may include a high-speed random access memory (RAM), and may further include non-volatile memory, such as at least one magnetic disk memory. The communication connection between this system element and at least one other element is implemented via at least one communication interface 1024 (which may be wired or wireless), using the internet, a wide area network, a local area network, a metropolitan area network, or the like.
Bus 1023 may be an ISA bus, a PCI bus, an EISA bus, or the like, and may be classified as an address bus, a data bus, a control bus, etc. For ease of illustration, only one bi-directional arrow is shown in FIG. 2, but this does not indicate only one bus or one type of bus.
The memory 1021 is configured to store a program, and the processor 1022 executes the program after receiving an execution instruction, where a method executed by the apparatus defined by the process disclosed in any embodiment of the disclosure may be applied to the processor 1022 or implemented by the processor 1022.
Processor 1022 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the methods described above may be performed by integrated hardware logic circuitry in processor 1022 or by software instructions. The processor 1022 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), etc.; it may also be a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present disclosure may be implemented or performed by such a processor. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of a method disclosed in connection with the embodiments of the present disclosure may be embodied directly in hardware, in a decoding processor, or in a combination of hardware and software modules in a decoding processor. The software modules may be located in a storage medium well known in the art, such as random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or registers. The storage medium is located in the memory 1021; the processor 1022 reads the information in the memory 1021 and performs the steps of the method in combination with its hardware.
Of course, the electronic terminal in this embodiment may also be a local computer device without a network connection. As shown in fig. 3, the computer device 103 includes a processor 1031, a memory 1032, and a bus. The memory 1032 stores machine-readable instructions executable by the processor 1031; when the computer device runs, the processor 1031 and the memory 1032 communicate over the bus, and the processor 1031 executes the machine-readable instructions to perform the steps of the texture display method.
Specifically, the above-described memory 1032 and processor 1031 can be general-purpose memories and processors, and are not particularly limited herein, and the display method of textures can be performed when the processor 1031 runs a computer program stored in the memory 1032.
Embodiments of the present disclosure are further described below with reference to the accompanying drawings.
Fig. 4 is a flowchart illustrating a texture display method according to an embodiment of the present disclosure. The method can be applied to terminal equipment capable of presenting a graphical user interface, the graphical user interface is provided through the terminal equipment, and the content displayed by the graphical user interface comprises spheres positioned in a virtual scene space. As shown in fig. 4, the method includes:
In step S410, in response to the graffiti operation on the sphere on the graphical user interface, three-dimensional coordinates of the corresponding operation point on the sphere for the graffiti operation in the world coordinate system of the virtual scene space are determined.
By way of example, a user may click the surface of a sphere in the graphical user interface by controlling a mouse cursor, so that the system determines the three-dimensional coordinates of the graffiti start point in the world coordinate system of the virtual scene space; the user may then move the mouse cursor to perform a graffiti operation on the sphere, and the system determines in real time, in response to the user's graffiti operation, the three-dimensional coordinates in the world coordinate system of the operation point on the sphere corresponding to the graffiti operation. Besides clicking with a mouse cursor as in the above example, the graffiti operation may take other forms, such as a touch click operation or a long-press operation, which are not limited here.
In step S420, the three-dimensional coordinates are converted into first UV coordinates on the sphere of the sphere using the specified conversion matrix.
In practical applications, graffiti textures are displayed on the surface of the sphere by means of texture mapping, where a texture uses a picture to control the appearance of a model. With texture mapping techniques, a map can be attached to the model surface to control the model's color pixel by pixel. During art modeling, texture mapping coordinates are typically stored on each vertex in the modeling software using texture unwrapping techniques. The texture mapping coordinates define the two-dimensional coordinate in the texture corresponding to each vertex. The coordinate is typically represented as a two-dimensional pair (U, V), where U is the horizontal coordinate and V is the vertical coordinate, so texture mapping coordinates are also referred to as UV coordinates. Because UV coordinates are two-dimensional, the three-dimensional coordinates need to be converted into first UV coordinates on the surface of the sphere using a specified conversion matrix.
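As a concrete illustration of how a UV coordinate addresses a texture, the sketch below maps a (U, V) pair in [0, 1] × [0, 1] to a pixel index in a W × H texture. The bottom-row origin for V and the function name are illustrative assumptions, not taken from the patent; conventions differ between engines.

```python
def uv_to_pixel(u, v, width, height):
    """Map a UV coordinate in [0, 1] x [0, 1] to an integer pixel index.

    V is measured from the bottom row here (an assumed convention;
    some engines measure from the top instead).
    """
    # Clamp to the last valid index so u == 1.0 or v == 0.0 stays in range.
    px = min(int(u * width), width - 1)
    py = min(int((1.0 - v) * height), height - 1)
    return px, py
```

For a 256 × 256 texture, for example, the UV coordinate (0.5, 0.5) addresses the center pixel (128, 128).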
Step S430, displaying the graffiti texture corresponding to the graffiti operation on the spherical surface based on the first UV coordinates and the display size of the preset texture.
For example, the user may also set the shape, size, etc. of the brush for graffiti, for example, as a square brush as shown in fig. 5, or a circular brush as shown in fig. 6, etc.; the texture size may be varied, for example 256 by 256 or 1024 by 1024, etc., and the system may display the graffiti texture corresponding to the graffiti operation on the sphere based on the first UV coordinates and the display size of the graffiti texture.
In the embodiment of the present disclosure, the three-dimensional coordinates, in the world coordinate system of the virtual scene space, of the graffiti operation point on the sphere are converted into UV coordinates on the spherical surface through the specified conversion matrix, and the graffiti texture can be displayed directly on the spherical surface based on those UV coordinates and the preset texture display size. This makes the two-dimensional texture displayed on the surface of the three-dimensional sphere more accurate, so that the extent of the graffiti brush is shown clearly on the spherical surface, improving the display effect of displaying graffiti textures on the surface of a three-dimensional sphere, alleviating the corresponding technical problem in the prior art, and improving the user experience.
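The final step, S430, stamping a brush of a preset display size into the texture at the first UV coordinate, can be sketched as follows. This is a minimal illustration assuming a square brush, a texture stored as nested lists, and simple overwrite blending; the names are illustrative, not from the patent.

```python
def stamp_square_brush(texture, uv, brush_size, color):
    """Write a square brush of brush_size x brush_size pixels into a
    2D texture (list of rows), centered on the pixel addressed by uv."""
    height = len(texture)
    width = len(texture[0])
    cx = int(uv[0] * width)   # horizontal pixel of the brush center
    cy = int(uv[1] * height)  # vertical pixel of the brush center
    half = brush_size // 2
    # Clamp the brush rectangle to the texture bounds.
    for y in range(max(0, cy - half), min(height, cy + half + 1)):
        for x in range(max(0, cx - half), min(width, cx + half + 1)):
            texture[y][x] = color
    return texture
```

In a real renderer this write would target the sphere's albedo or decal texture on the GPU rather than a Python list, but the indexing logic is the same.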
The above steps are described in detail below.
In some embodiments, after the system obtains the user's operation position on the sphere, for example the position clicked with the mouse, a ray can be cast from that position into the virtual scene space along a preset direction, and the intersection point of the ray with the sphere is determined as the operation point. This allows the three-dimensional coordinates in the world coordinate system of the operation point corresponding to the graffiti operation to be determined accurately, so that the graffiti texture can be displayed more accurately on the three-dimensional sphere. As an example, step S410 may specifically include the following steps:
step a), responding to the graffiti operation aiming at the sphere on the graphical user interface, and determining the corresponding operation position of the graffiti operation in the graphical user interface.
And b), determining an intersection point of the ray taking the operation position as a starting point and along the preset direction and the sphere in the virtual scene space.
And c), determining the corresponding operation point of the graffiti operation on the sphere and the three-dimensional coordinates of the operation point in the world coordinate system based on the intersection point.
For example, as shown in fig. 7, the black box in the drawing is the position clicked by the mouse cursor, that is, the operation position in the graphical user interface corresponding to the user's graffiti operation, which is not located directly on the surface of the sphere. The operation position may be understood as the position point of the user's operation on the graphical user interface (i.e., the position of the black box); this position point corresponds to a point in the virtual scene space. Taking this point as the starting point, a ray is cast into the virtual scene space along a preset direction, where the preset direction is the same as the orientation of the lens center of the virtual camera. The system may then calculate the intersection of the ray with the sphere in the virtual scene space and determine that intersection as the operation point on the sphere corresponding to the graffiti operation, thereby determining the three-dimensional coordinates of the operation point in the world coordinate system.
The system responds to the user's graffiti operation on the graphical user interface and determines the corresponding operation position in the graphical user interface; it then determines the intersection point, with the sphere in the virtual scene space, of the ray cast from the operation position along the preset direction, and based on the intersection point determines the operation point on the sphere corresponding to the graffiti operation and its three-dimensional coordinates in the world coordinate system. In this way, the three-dimensional coordinates in the world coordinate system of the operation point corresponding to the graffiti operation are determined accurately, and graffiti textures can be displayed better on the three-dimensional sphere.
Based on steps a), b), and c), the ray cast from the operation position may have zero, one, or two intersection points with the sphere. When there are two intersection points, the system can determine the intersection point closest to the virtual camera, i.e., the intersection point closest to the user, as the operation point on the sphere corresponding to the graffiti operation, so that the graffiti texture is displayed on the side facing the user, improving the display effect and making it easy for the user to observe. As an example, step c) may specifically include the following steps:
And d), if the number of the intersection points is one, determining the intersection points as corresponding operation points of the graffiti operation on the sphere.
And e), if the number of intersection points is two, determining the intersection point closest to the virtual camera corresponding to the graphical user interface as the operation point on the sphere corresponding to the graffiti operation.
For the virtual camera corresponding to the graphical user interface in step e), it should be noted that the virtual camera is correlated with the content displayed by the graphical user interface: the image displayed in the graphical user interface is the content captured by the virtual camera in the virtual scene space. It can be understood that, as the virtual camera moves in the virtual scene space, the content displayed by the graphical user interface changes correspondingly.
For example, as shown in fig. 8, a ray cast from the operation position along the preset direction may have zero, one, or two intersection points with the sphere. If, as shown in fig. 8 (a), the ray does not intersect the sphere, it can be determined that the operation point is not on the sphere, and no graffiti texture is displayed. If there is one intersection point, as shown in fig. 8 (b), that intersection point may be determined as the operation point on the sphere corresponding to the graffiti operation. If there are two intersection points, as shown in fig. 8 (c), the intersection point closest to the virtual camera corresponding to the graphical user interface may be determined as the operation point; that is, the one of the two intersection points closer to the user is taken as the operation point.
When there is one intersection point, the system determines it as the operation point on the sphere corresponding to the graffiti operation; when there are two, the system determines the intersection point closest to the virtual camera corresponding to the graphical user interface as the operation point, so that the graffiti texture is displayed on the side facing the user, which is convenient for the user to observe and draw.
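The intersection test in steps a)–c) can be sketched with the standard quadratic ray-sphere formula. This is a minimal sketch under assumed names, not the patent's implementation; it distinguishes the zero-, one-, and two-intersection cases of fig. 8 and returns the hit closest to the ray origin (the virtual camera).

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the ray-sphere intersection closest to the ray origin
    (the virtual camera), or None when the ray misses the sphere."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:                      # fig. 8 (a): no intersection
        return None
    sqrt_disc = math.sqrt(disc)
    t_near = (-b - sqrt_disc) / (2.0 * a)
    t_far = (-b + sqrt_disc) / (2.0 * a)
    # fig. 8 (c): two hits -> keep the nearer one in front of the camera.
    t = t_near if t_near >= 0 else t_far
    if t < 0:
        return None
    return tuple(o + t * d for o, d in zip(origin, direction))
```

A tangent ray gives disc == 0 and a single coincident hit, matching the one-intersection case of fig. 8 (b).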
In some embodiments, the display shape of the graffiti texture may be of multiple types, and different display shapes correspond to different UV coordinate conversion methods, which improves coordinate conversion accuracy and enriches functionality. As one example, the display shape of the preset texture is a square; step S420 may specifically include the following steps:
And f), converting the three-dimensional coordinates in the world coordinate system into coordinates in the observation space by using the observation matrix.
And g), dividing the coordinates in the observation space by the scaling parameter of the sphere to obtain the first UV coordinates on the surface of the sphere.
For the step f), the observation space is a space with the virtual camera corresponding to the graphical user interface as the observer.
For example, as shown in fig. 9, the View Space often corresponds to a game's virtual camera (Camera), so it is sometimes also referred to as Camera Space or Eye Space. The view-space transformation converts coordinates from the object's world space into coordinates relative to the observer's field of view; the viewing space is thus the space observed from the perspective of the virtual camera. The transformation is typically a combination of translations and rotations that move and rotate the scene so that a particular object ends up in front of the virtual camera. These combined transformations are typically stored in a View Matrix, which is used to transform three-dimensional coordinates in the world coordinate system into coordinates in the viewing space. The coordinate transformation is thus performed by means of the View Matrix, for example using the following formula:
CameraPos=DiffPos*ViewMatrix;
obtaining the coordinates (CameraPos) in the observation space, where DiffPos is the three-dimensional coordinate to be converted; then, using the formula:
LocalPos=CameraPos/ScaleXY;
the local coordinates (LocalPos) are obtained by dividing the coordinates (CameraPos) in the observation space by the scaling parameter (ScaleXY) of the sphere. The local coordinates are an expression of UV coordinates, i.e., the first UV coordinates on the spherical surface of the sphere.
The system converts the three-dimensional coordinates in the world coordinate system into coordinates in the observation space by using the observation matrix, and divides the coordinates in the observation space by the scaling parameter of the sphere to obtain the first UV coordinates on the spherical surface of the sphere. In this way, when the display shape is square, the corresponding UV coordinates can be obtained by converting the three-dimensional coordinates with the observation matrix and dividing by the sphere's scaling parameter.
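The two formulas above can be sketched as follows, assuming a row-vector convention for the view matrix as in CameraPos = DiffPos * ViewMatrix. Since DiffPos is an offset rather than an absolute position, only the rotational part of the matrix affects it. The names (WorldOffsetToViewUV, Mat4) are illustrative, not from the patent:

```cpp
#include <array>

struct Vec3 {
    double x, y, z;
};

// 4x4 view matrix in row-major storage, used with the row-vector convention
// CameraPos = DiffPos * ViewMatrix, matching the formula above.
using Mat4 = std::array<std::array<double, 4>, 4>;

// Transform a world-space offset into view space, then divide by the sphere's
// scaling parameter to obtain UV-style local coordinates. Because DiffPos is
// an offset, the translation row of the view matrix does not apply; only the
// 3x3 rotational part is used.
Vec3 WorldOffsetToViewUV(const Vec3& diffPos, const Mat4& view, double scaleXY) {
    Vec3 cameraPos{
        diffPos.x * view[0][0] + diffPos.y * view[1][0] + diffPos.z * view[2][0],
        diffPos.x * view[0][1] + diffPos.y * view[1][1] + diffPos.z * view[2][1],
        diffPos.x * view[0][2] + diffPos.y * view[1][2] + diffPos.z * view[2][2],
    };
    return {cameraPos.x / scaleXY, cameraPos.y / scaleXY, cameraPos.z / scaleXY};
}
```

With an identity view matrix the function simply divides the offset by ScaleXY, which is the degenerate case of the formula LocalPos = CameraPos / ScaleXY.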
In some embodiments, the display shape of the graffiti texture may be of multiple types, with different display shapes corresponding to different UV coordinate conversion modes, which improves the coordinate conversion accuracy and enriches the available functions. When the display shape is circular, the three-dimensional coordinates can be converted directly into UV coordinates through the model matrix. As one example, the display shape of the preset texture is a circle; in this case, the step S420 may specifically include the following step:
Step h), converting the three-dimensional coordinates in the world coordinate system into the first UV coordinates on the spherical surface of the sphere by using the model matrix.
For example, as shown in fig. 9, when the display shape of the preset texture is a circle, the system may directly inverse-transform the three-dimensional coordinates in the world coordinate system into the local space (Local Space) through the inverse of the Model Matrix, obtaining the local coordinates (LocalPos); the local coordinates are an expression of UV coordinates, i.e., the first UV coordinates on the spherical surface of the sphere. The conversion formula can be as follows:
LocalPos=DiffPos*InverseModelMatrix;
where InverseModelMatrix is the inverse of the model matrix.
When the display shape of the preset texture is circular, the system can directly convert the three-dimensional coordinates in the world coordinate system into the first UV coordinates on the spherical surface of the sphere by using the model matrix, which effectively improves the conversion efficiency and further improves the display effect of the graffiti texture.
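Under the simplifying assumption that the sphere's model matrix consists only of a uniform scale s and a translation t (no rotation), the inverse transform above reduces to subtracting the translation and dividing by the scale. The helper name WorldToLocal is illustrative:

```cpp
struct Vec3 {
    double x, y, z;
};

// Inverse-transform a world-space point into the sphere's local space under
// the simplifying assumption that the model matrix is a uniform scale s
// followed by a translation t (no rotation). The inverse of that transform
// is local = (world - t) / s.
Vec3 WorldToLocal(const Vec3& world, const Vec3& t, double s) {
    return {(world.x - t.x) / s, (world.y - t.y) / s, (world.z - t.z) / s};
}
```

A full InverseModelMatrix would additionally undo any rotation, but the subtract-then-divide structure is the same.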
In some embodiments, the graffiti texture is a two-dimensional picture, so its size needs to be converted into a length on the sphere before the texture can be displayed properly. For example, the display range of the graffiti texture can be determined by combining the UV coordinates with the display size of the preset texture; specifically, the display range can be refined into individual target points, so that the graffiti texture is displayed within a well-defined range. As an example, the step S430 may specifically include the following steps:
Step i), determining a target point within the range to be displayed of the graffiti texture from the points on the spherical surface, based on the first UV coordinates and the display size of the preset texture.
Step j), displaying the graffiti texture corresponding to the graffiti operation at the position of the second UV coordinates of the target point on the spherical surface.
For step i) above, the distance between a point on the spherical surface and the operation point is the difference between the UV coordinates of that point and the first UV coordinates.
For example, as shown in fig. 10, the system may determine, from the points on the spherical surface, a target point within the range to be displayed of the graffiti texture based on the first UV coordinates and the display size of the preset texture. The circle 1001 in the figure is the operation point, i.e., the position clicked by the player through the mouse cursor, and the position of its center point is the position (WorldPos) of point A in the figure. Point O in fig. 10 is a point on the spherical surface, i.e., a point for which it is to be determined whether the graffiti texture should be displayed, that is, whether the point lies within the range to be displayed of the graffiti texture. To judge whether point O is located within that range, the display size of the preset texture is obtained, and the distance between point O and point A (i.e., the center point) is compared with the display size of the preset texture.
For example, suppose the display size of the preset texture is a circle with a radius of 5; the range indicated by the circle 1002 in fig. 10 is the display size of the preset texture. Subtracting the position (WorldPos) of the center point (point A) from the world coordinates (AbsoluteWorldPos) of point O gives the difference between point O and the center point, i.e., the distance between point O and the center point (point A):
DiffPos=AbsoluteWorldPos-WorldPos。
if the difference distance is greater than the radius 5 of the circle 1002, determining that the O-point is outside the range to be displayed of the graffiti texture; if the difference distance is less than or equal to the radius 5 of the circle 1002, it is determined that the O-point is located within the range of the graffiti texture to be displayed, i.e., the O-point is a target point within the range of the graffiti texture to be displayed.
Similarly, subtracting the position of the center point from the world coordinates of any other point on the spherical surface gives the difference between that point and the center point, so whether the point lies within the range to be displayed of the graffiti texture can be judged by comparing this difference with the display size of the preset texture, thereby determining the target points within the range to be displayed from the points on the spherical surface.
The system first determines the target point within the range to be displayed of the graffiti texture from the points on the spherical surface based on the first UV coordinates and the display size of the preset texture, and then displays the graffiti texture corresponding to the graffiti operation at the position of the second UV coordinates of the target point on the spherical surface, so that the graffiti texture can be displayed better and the display effect is improved.
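Steps i) and j) amount to a per-point distance test against the brush radius. A minimal sketch follows, with the function name IsTargetPoint chosen here for illustration:

```cpp
#include <cmath>

struct Vec3 {
    double x, y, z;
};

// Decide whether a sphere-surface point (AbsoluteWorldPos) lies inside the
// brush range centered at the operation point (WorldPos) with the preset
// display radius: form DiffPos = AbsoluteWorldPos - WorldPos and compare its
// length with the radius, as in the DiffPos formula above.
bool IsTargetPoint(const Vec3& absoluteWorldPos, const Vec3& worldPos, double radius) {
    double dx = absoluteWorldPos.x - worldPos.x;
    double dy = absoluteWorldPos.y - worldPos.y;
    double dz = absoluteWorldPos.z - worldPos.z;
    double distance = std::sqrt(dx * dx + dy * dy + dz * dz);
    return distance <= radius;  // on or inside the boundary counts as a target point
}
```

Running this test for every candidate surface point yields exactly the set of target points on which the texture is drawn.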
Based on the above steps i) and j), when determining the target point within the range to be displayed of the graffiti texture, the scaling to be displayed of the graffiti texture can be adjusted according to the display size of the preset texture. Adjusting the scaling guarantees the accuracy of the display size of the graffiti texture on the sphere, so that the display effect of the graffiti texture displayed on the sphere is the same as that set by the user. As an example, the step i) may specifically include the following steps:
Step k), adjusting the scaling to be displayed of the graffiti texture based on the display size of the preset texture, to obtain the scaling to be scaled of the graffiti texture.
Step l) determining a target point within the range to be displayed of the graffiti texture from points on the sphere based on the scale to be scaled and the first UV coordinates.
The user may set the display size of the preset texture, for example, the display size of the graffiti brush. The system may then adjust the scaling to be displayed of the graffiti texture according to the value set by the user to obtain the scaling to be scaled of the graffiti texture, determine the target point within the range to be displayed of the graffiti texture from the points on the spherical surface based on the scaling to be scaled and the first UV coordinates, and display the graffiti texture by displaying the individual target points.
The system first adjusts the scaling to be displayed of the graffiti texture based on the display size of the preset texture to obtain the scaling to be scaled of the graffiti texture, and then determines the target point within the range to be displayed of the graffiti texture from the points on the spherical surface based on the scaling to be scaled and the first UV coordinates. This ensures that the graffiti texture is displayed on the sphere with the same display effect as that set by the user, guaranteeing the display effect of the graffiti texture on the sphere.
Based on the above steps k) and l), the display shape of the graffiti texture may be of multiple types, with different display shapes corresponding to different scaling modes, which enriches the available functions. When the display shape of the graffiti texture is square, the user needs to set only the side length of the square, so that the scaling to be displayed of the graffiti texture can be adjusted based on that side length, ensuring the accuracy of the display size of the graffiti texture on the sphere. As an example, the display shape of the preset texture is square, and the display size of the preset texture includes a preset side length of the square; in this case, the step k) may specifically include the following step:
Step m), adjusting the scaling to be displayed of the graffiti texture based on the preset side length, to obtain the scaling to be scaled of the graffiti texture.
For example, when the display shape of the preset texture is square, the user can set the brush radius, which the system then converts to a length on the sphere. Since the texture is a two-dimensional square picture, the radius is converted to the side length of the circle's inscribed square, and this side length is taken as the preset side length of the square. The side length is then combined with the width SizeX and height SizeY of the picture to obtain the required scaling factors AlphaBrushScaleX and AlphaBrushScaleY of the picture. This can be implemented with the following formulas:
float MaxSize = FMath::Sqrt(2.0f) * Radius;
float AlphaBrushScale = MaxSize / (float)FMath::Max<int32>(SizeX, SizeY);
AlphaBrushScaleX = 1.0f / (AlphaBrushScale * SizeX);
AlphaBrushScaleY = 1.0f / (AlphaBrushScale * SizeY);
The system adjusts the scaling to be displayed of the graffiti texture based on the preset side length to obtain the scaling to be scaled of the graffiti texture, thereby guaranteeing the display effect of the graffiti texture on the sphere.
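The Unreal-style snippet above can be restated as a self-contained sketch using only the standard library; FMath::Sqrt and FMath::Max are replaced by their std equivalents, and the function name ComputeBrushScales is illustrative:

```cpp
#include <algorithm>
#include <cmath>

// Compute per-axis UV scale factors for a square brush picture of
// sizeX x sizeY pixels from the brush radius on the sphere. The radius is
// first converted to the side length of the circle's inscribed square
// (side = sqrt(2) * radius), mirroring the formulas above.
void ComputeBrushScales(double radius, int sizeX, int sizeY,
                        double& scaleX, double& scaleY) {
    double maxSize = std::sqrt(2.0) * radius;  // inscribed-square side length
    double alphaBrushScale = maxSize / (double)std::max(sizeX, sizeY);
    scaleX = 1.0 / (alphaBrushScale * sizeX);
    scaleY = 1.0 / (alphaBrushScale * sizeY);
}
```

When SizeX equals SizeY, both factors reduce to 1 / (sqrt(2) * Radius), i.e. the reciprocal of the inscribed-square side length.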
Based on the above steps k) and l), the UV coordinates can also be shifted so that the display center of the graffiti texture remains coincident with the center of the operation point; the graffiti texture then follows the position clicked by the user's mouse and is displayed better on the sphere. As an example, after step i) above, the method may further include the following step:
Step n), performing translation adjustment based on the second UV coordinates of the target point, so that the operation point is located at the central position of the range to be displayed.
Illustratively, during art modeling, texture mapping coordinates are typically stored on each vertex in the modeling software using texture unwrapping techniques. The texture mapping coordinates define the two-dimensional coordinates corresponding to the vertex in the texture. These coordinates are usually represented by a two-dimensional variable (U, V), where U is the horizontal coordinate and V is the vertical coordinate. Vertex UV coordinates are usually normalized to the range [0, 1], so a translation adjustment is performed by adding 0.5 to the UV coordinates, which allows the square graffiti texture picture to be displayed centered at the mouse click, for example:
U=LocalPos.x*AlphaBrushScaleX+0.5;
V=LocalPos.y*AlphaBrushScaleY+0.5。
By performing translation adjustment based on the second UV coordinates of the target point so that the operation point is located at the central position of the range to be displayed, the system can ensure that the display center of the graffiti texture remains coincident with the center of the operation point, so that the graffiti texture follows the position clicked by the user's mouse and is displayed better on the sphere.
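The centering formulas above can be sketched as follows; a zero local offset (the click point itself) maps to the texture center (0.5, 0.5). The names UV and CenterUV are illustrative:

```cpp
struct UV {
    double u, v;
};

// Map a scaled local offset into texture UV space, adding 0.5 on each axis
// so that the brush texture is centered on the click point
// (UV is normalized to [0, 1], with the center at 0.5).
UV CenterUV(double localX, double localY, double scaleX, double scaleY) {
    return {localX * scaleX + 0.5, localY * scaleY + 0.5};
}
```

Offsets at the brush boundary land near 0 or 1, so the picture spans the full normalized UV range around the click position.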
Based on the above steps k) and l), the display shape of the graffiti texture may be of multiple types, with different display shapes adjusted by different scaling modes, which enriches the available functions. When the display shape of the graffiti texture is circular, the user can directly set the radius of the circle, so that the scaling to be displayed of the graffiti texture can be adjusted directly, ensuring the accuracy of the display size of the graffiti texture on the sphere. As one example, the display shape of the preset texture is a circle; in this case, the step k) may specifically include the following step:
Step o), adjusting the scaling to be displayed of the graffiti texture by adjusting the radius of the circle based on the display size of the preset texture, to obtain the scaling to be scaled of the graffiti texture.
For example, when the display shape of the preset texture is a circle, the radius of the circle can be directly adjusted, so that the scaling to be displayed of the graffiti texture is adjusted, and the scaling to be scaled of the graffiti texture is obtained.
The system adjusts the scaling to be displayed of the graffiti texture by adjusting the radius of the circle based on the display size of the preset texture to obtain the scaling to be scaled of the graffiti texture, so that the operation efficiency of a user can be effectively improved, and the display effect of the system is improved.
Based on the above step o), for a circular graffiti texture, a point on the boundary of the range may be determined first, so that the distance between the point on the boundary and the center point is determined as the radius of the circle; the target points within the range to be displayed of the graffiti texture are then determined from the points on the spherical surface, and the circular graffiti texture is displayed on the sphere, which improves the accuracy of the display range of the circular graffiti texture. As one example, the operation point is the center point of the circle; in this case, the step l) may specifically include the following steps:
Step p), determining a point on the boundary of the range to be displayed based on the scaling to be scaled, and determining the distance between the point on the boundary and the center point as the radius of the circle.
Step q) determining a target point within the range to be displayed of the graffiti texture from the points on the sphere based on the radius of the circle and the first UV coordinates.
For example, the system may determine a point on the boundary of the range to be displayed and determine the distance (Distance) between the point on the boundary and the center point as the radius of the circle. The Distance can be calculated by the chord length formula between the two points, i.e., the length of the straight segment connecting them, as follows:
Distance=Sqrt(DiffPos.x*DiffPos.x+DiffPos.y*DiffPos.y+DiffPos.z*DiffPos.z);
By determining this Distance as the radius of the circle, the target points within the circular range can then be displayed.
The system first determines the point on the boundary of the range to be displayed based on the scaling to be scaled and determines the distance between the point on the boundary and the center point as the radius of the circle; it then determines the target points within the range to be displayed of the graffiti texture from the points on the spherical surface based on the radius of the circle and the first UV coordinates, and displays the circular graffiti texture on the sphere based on the target points, thereby improving the display effect.
Based on the above steps p) and q), the system can also check the display size of the circular graffiti texture: if the display size of the texture exceeds the display range of the sphere, the graffiti texture is not displayed, which avoids the situation where the graffiti texture is larger than the sphere and improves the display effect. As an example, the above step q) may specifically include the following steps:
Step r), determining the distance between a point on the spherical surface and the center point through the chord length formula between the two points, based on the first UV coordinates.
Step s), if the distance between the point on the spherical surface and the center point is less than or equal to the radius of the circle, determining the point on the spherical surface as a target point within the range to be displayed.
Step t), if the distance between the point on the spherical surface and the center point is greater than the radius of the circle, determining the point on the spherical surface as an out-of-circle point outside the range to be displayed.
Illustratively, as shown in fig. 11, the user sets the display size of the preset texture, i.e., sets the Radius of the circle, and the system determines whether the texture needs to be displayed at a given position by comparing the Radius with the previously calculated Distance. If Distance > Radius, the picture is not displayed; otherwise, it is displayed.
By comparing the distance between each point on the spherical surface and the center point with the radius of the circle, the display size of the circular graffiti texture is checked; if the display size of the texture exceeds the display range of the sphere, the graffiti texture is not displayed, which avoids the situation where the graffiti texture is larger than the sphere and improves the display effect.
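Steps p) to t) for a circular brush can be sketched together: the circle's Radius is taken as the distance from the center point to a boundary point, and only the surface points whose Distance to the center does not exceed that Radius are kept for display. The names ChordDistance and CircleTargetPoints are illustrative:

```cpp
#include <cmath>
#include <vector>

struct Vec3 {
    double x, y, z;
};

// Chord length between two points: the length of the straight segment
// connecting them.
static double ChordDistance(const Vec3& a, const Vec3& b) {
    double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Take the circle's radius as the distance from the center point to a point
// on the range boundary, then keep only the surface points whose distance to
// the center does not exceed that radius (Distance > Radius means the point
// is not displayed).
std::vector<Vec3> CircleTargetPoints(const Vec3& center, const Vec3& boundaryPoint,
                                     const std::vector<Vec3>& surfacePoints) {
    double radius = ChordDistance(center, boundaryPoint);
    std::vector<Vec3> targets;
    for (const Vec3& p : surfacePoints) {
        if (ChordDistance(p, center) <= radius) targets.push_back(p);
    }
    return targets;
}
```

Points exactly on the boundary are kept, matching step s), while points farther than the radius are treated as out-of-circle points, matching step t).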
Fig. 12 is a schematic structural diagram of a texture display device according to an embodiment of the disclosure. The device can be applied to terminal equipment capable of presenting a graphical user interface, the graphical user interface is provided through the terminal equipment, and the content displayed by the graphical user interface comprises spheres. As shown in fig. 12, the texture display apparatus 1200 includes:
A determining module 1201, configured to determine, in response to a graffiti operation on the graphical user interface for the sphere, three-dimensional coordinates of an operation point corresponding to the graffiti operation on the sphere in a world coordinate system of the virtual scene space;
a conversion module 1202 for converting the three-dimensional coordinates into first UV coordinates on the sphere of the sphere using a specified conversion matrix;
the display module 1203 is configured to display a graffiti texture corresponding to the graffiti operation on the spherical surface based on the first UV coordinate and a display size of the preset texture.
In some embodiments, the determining module 1201 is specifically configured to:
responsive to a graffiti operation on the graphical user interface for the sphere, determining a corresponding operational location of the graffiti operation in the graphical user interface;
determining an intersection point of a ray taking the operation position as a starting point and along a preset direction and the sphere in the virtual scene space;
and determining a corresponding operation point of the graffiti operation on the sphere and three-dimensional coordinates of the operation point in the world coordinate system based on the intersection point.
In some embodiments, the determining module 1201 is specifically configured to:
if the number of the intersection points is one, determining the intersection points as corresponding operation points of the graffiti operation on the sphere;
And if the number of the intersection points is two, determining the closest intersection point of the virtual camera corresponding to the graphical user interface as the corresponding operation point of the graffiti operation on the sphere.
In some embodiments, the display shape of the preset texture is square; the conversion module 1202 is specifically configured to:
converting three-dimensional coordinates in a world coordinate system into coordinates in an observation space by using an observation matrix; the observation space is a space taking a virtual camera corresponding to the graphical user interface as an observer;
dividing the coordinates in the viewing space by the scaling parameters of the sphere yields the first UV coordinates on the sphere of the sphere.
In some embodiments, the display shape of the preset texture is circular; the conversion module 1202 is specifically configured to:
three-dimensional coordinates in the world coordinate system are converted to first UV coordinates on the sphere of the sphere using the model matrix.
In some embodiments, the display module 1203 is specifically configured to:
determining a target point in a to-be-displayed range of the graffiti texture from points on the spherical surface based on the first UV coordinates and the display size of the preset texture; wherein the distance between the point on the sphere and the operating point is the difference between the UV coordinates of the point on the sphere and the first UV coordinates;
and displaying the graffiti texture corresponding to the graffiti operation at the position of the second UV coordinate of the target point on the spherical surface.
In some embodiments, the display module 1203 is specifically configured to:
adjusting the scaling to be displayed of the graffiti texture based on the display size of the preset texture to obtain the scaling to be scaled of the graffiti texture;
based on the to-be-scaled scale and the first UV coordinates, a target point within a to-be-displayed range of the graffiti texture is determined from points on the sphere.
In some embodiments, the display shape of the preset texture is square, and the display size of the preset texture includes a preset side length of the square; the display module 1203 is specifically configured to:
adjusting the scaling of the graffiti texture to be displayed based on the display size of the preset texture to obtain the scaling of the graffiti texture to be scaled, comprising:
and adjusting the scaling to be displayed of the graffiti texture based on the preset side length to obtain the scaling to be scaled of the graffiti texture.
In some embodiments, the apparatus further comprises:
and the adjusting module is used for carrying out translation adjustment based on the second UV coordinates of the target point after determining the target point in the to-be-displayed range of the graffiti texture from the points on the spherical surface based on the to-be-scaled ratio and the first UV coordinates so as to enable the operating point to be positioned at the central position in the to-be-displayed range.
In some embodiments, the display shape of the preset texture is circular; the display module 1203 is specifically configured to:
Adjusting the scaling of the graffiti texture to be displayed based on the display size of the preset texture to obtain the scaling of the graffiti texture to be scaled, comprising:
and adjusting the scaling to be displayed of the graffiti texture by adjusting the radius of the circle based on the display size of the preset texture, so as to obtain the scaling to be scaled of the graffiti texture.
In some embodiments, the operating point is a center point of a circle; the display module 1203 is specifically configured to:
determining points on the boundary of the range to be displayed based on the scale to be zoomed, and determining the distance between the points on the boundary and the center point as the radius of a circle;
based on the radius of the circle and the first UV coordinates, a target point within the range to be displayed of the graffiti texture is determined from the points on the sphere.
In some embodiments, the display module 1203 is specifically configured to:
determining the distance between a point on the spherical surface and a center point through a chord length formula between two points based on the first UV coordinates;
if the distance between the point on the sphere and the center point is less than or equal to the radius of the circle, determining the point on the sphere as a target point in the range to be displayed;
if the distance between the point on the sphere and the center point is greater than the radius of the circle, the point on the sphere is determined to be an out-of-circle point outside the range to be displayed.
The texture display device provided by the embodiment of the present disclosure has the same technical features as the texture display method provided by the foregoing embodiments, so it can solve the same technical problems and achieve the same technical effects. In response to a user's graffiti operation on the sphere on the graphical user interface, the system determines the three-dimensional coordinates, in the world coordinate system of the virtual scene space, of the operation point corresponding to the graffiti operation on the sphere; it then converts the three-dimensional coordinates into the first UV coordinates on the spherical surface using the specified conversion matrix, and displays the graffiti texture corresponding to the graffiti operation on the spherical surface based on the first UV coordinates and the display size of the preset texture. The range of action of the graffiti brush can thus be clearly displayed on the surface of the three-dimensional sphere, which solves the technical problem in the prior art that the display effect of graffiti textures displayed on the surface of a three-dimensional sphere is poor, and improves the user experience.
The embodiment of the disclosure also provides an electronic terminal, which includes a processor, a storage medium and a bus. The storage medium stores machine-readable instructions executable by the processor; when the electronic terminal runs, the processor and the storage medium communicate over the bus, and the processor executes the machine-readable instructions to perform the texture display method of the foregoing embodiments, including the following steps:
In response to a graffiti operation on the graphical user interface for the sphere, determining three-dimensional coordinates of a corresponding operation point of the graffiti operation on the sphere in a world coordinate system of the virtual scene space;
converting the three-dimensional coordinates into first UV coordinates on the sphere of the sphere using a specified conversion matrix;
and displaying the graffiti texture corresponding to the graffiti operation on the spherical surface based on the first UV coordinates and the display size of the preset texture.
Reference may be made to the corresponding processes in the above method embodiments for the specific embodiments and specific working processes, which are not described herein.
In one possible embodiment, the processor, when executing the response to the graffiti operation for the sphere, is specifically configured to:
responsive to a graffiti operation on the graphical user interface for the sphere, determining a corresponding operational location of the graffiti operation in the graphical user interface;
determining an intersection point of a ray taking the operation position as a starting point and along a preset direction and the sphere in the virtual scene space;
And determining a corresponding operation point of the graffiti operation on the sphere and three-dimensional coordinates of the operation point in the world coordinate system based on the intersection point.
In a possible embodiment, the processor, when executing the determining, based on the intersection point, a corresponding operation point of the graffiti operation on the sphere, is specifically configured to:
if the number of the intersection points is one, determining the intersection points as corresponding operation points of the graffiti operation on the sphere;
and if the number of the intersection points is two, determining the closest intersection point of the virtual camera corresponding to the graphical user interface as the corresponding operation point of the graffiti operation on the sphere.
In a possible embodiment, the processor, when executing the conversion of the three-dimensional coordinates into first UV coordinates on the sphere of the sphere using a specified conversion matrix, is specifically configured to:
converting three-dimensional coordinates in the world coordinate system into coordinates in an observation space by using an observation matrix; the observation space is a space taking a virtual camera corresponding to the graphical user interface as an observer;
dividing the coordinates in the viewing space by the scaling parameters of the sphere to obtain first UV coordinates on the sphere of the sphere.
In a possible embodiment, the display shape of the preset texture is a circle; the processor is specifically configured to, when executing the conversion of the three-dimensional coordinates into first UV coordinates on the sphere of the sphere using a specified conversion matrix:
three-dimensional coordinates in the world coordinate system are converted to first UV coordinates on the sphere of the sphere using a model matrix.
In a possible implementation manner, when the processor executes the display size based on the first UV coordinate and a preset texture, the processor is specifically configured to:
determining a target point in a to-be-displayed range of the graffiti texture from points on the spherical surface based on the first UV coordinates and a display size of a preset texture; wherein the distance between the point on the sphere and the operating point is the difference between the UV coordinates of the point on the sphere and the first UV coordinates;
and displaying the graffiti texture corresponding to the graffiti operation at the position of the second UV coordinate of the target point on the spherical surface.
In a possible embodiment, the processor is specifically configured to, when executing the determining, from the points on the sphere, the target point within the range to be displayed of the graffiti texture based on the first UV coordinate and the display size of the preset texture:
Adjusting the scaling to be displayed of the graffiti texture based on the display size of the preset texture to obtain the scaling to be scaled of the graffiti texture;
and determining a target point in a range to be displayed of the graffiti texture from points on the spherical surface based on the to-be-scaled scale and the first UV coordinates.
In a possible implementation manner, the display shape of the preset texture is square, and the display size of the preset texture comprises a preset side length of the square; the processor is specifically configured to, when executing the scaling to be displayed for the graffiti texture based on the display size of the preset texture to obtain the scaling to be displayed for the graffiti texture:
and adjusting the scaling to be displayed of the graffiti texture based on the preset side length to obtain the scaling to be scaled of the graffiti texture.
In one possible embodiment, after said determining a target point within a range to be displayed of said graffiti texture from points on said sphere based on said scale to be scaled and said first UV coordinates, said processor is further configured to perform:
and carrying out translation adjustment based on the second UV coordinates of the target point so as to enable the operating point to be positioned at the central position in the range to be displayed.
In a possible embodiment, the display shape of the preset texture is a circle; the processor is specifically configured to, when executing the adjusting of the scaling to be displayed of the graffiti texture based on the display size of the preset texture to obtain the scaling to be scaled of the graffiti texture:
and adjusting the scaling to be displayed of the graffiti texture by adjusting the radius of the circle based on the display size of the preset texture to obtain the scaling to be scaled of the graffiti texture.
In a possible embodiment, the operating point is the center point of the circle; the processor is specifically configured to, when executing the determining, from points on the sphere, a target point within a range to be displayed of the graffiti texture based on the to-be-scaled scale and the first UV coordinate:
determining a point on a boundary of the range to be displayed based on the to-be-scaled scale, and determining the distance between the point on the boundary and the center point as the radius of the circle;
and determining a target point in a range to be displayed of the graffiti texture from points on the spherical surface based on the radius of the circle and the first UV coordinates.
In a possible embodiment, the processor is specifically configured to, when executing the determining, from the points on the sphere, of a target point within the range to be displayed of the graffiti texture based on the radius of the circle and the first UV coordinates:
determining the distance between the point on the spherical surface and the center point through a chord length formula between two points, based on the first UV coordinates;
If the distance between the point on the spherical surface and the center point is smaller than or equal to the radius of the circle, determining the point on the spherical surface as a target point in the range to be displayed;
and if the distance between the point on the spherical surface and the central point is larger than the radius of the circle, determining the point on the spherical surface as an out-of-circle point outside the range to be displayed.
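The in-circle test above compares a two-point straight-line (chord-length) distance in UV space against the circle's radius. A minimal sketch, with illustrative function and parameter names (the patent specifies no implementation):

```python
import math

def classify_point(point_uv, center_uv, radius):
    """True if the point lies within the circular range to be displayed.

    The distance is the chord length between the two UV coordinates;
    points at a distance less than or equal to the radius are target
    points, points farther away are out-of-circle points.
    """
    du = point_uv[0] - center_uv[0]
    dv = point_uv[1] - center_uv[1]
    dist = math.hypot(du, dv)  # chord length between the two points
    return dist <= radius      # inside (or on) the circle -> target point

center = (0.5, 0.5)  # first UV coordinates of the operation point
assert classify_point((0.52, 0.5), center, 0.05) is True   # target point
assert classify_point((0.7, 0.7), center, 0.05) is False   # out-of-circle
```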
By means of the above method, the system can respond to a user's graffiti operation on the sphere on the graphical user interface, determine the three-dimensional coordinates, in the world coordinate system of the virtual scene space, of the operation point corresponding to the graffiti operation on the sphere, and then convert the three-dimensional coordinates into first UV coordinates on the spherical surface of the sphere using a specified conversion matrix, so that the graffiti texture corresponding to the graffiti operation is displayed on the spherical surface based on the first UV coordinates and the display size of the preset texture. In this way, the range of the graffiti brush can be displayed clearly on the surface of the three-dimensional sphere, which solves the technical problem in the prior art that the display effect of graffiti textures displayed on the surface of a three-dimensional sphere is poor, and improves the user experience.
Corresponding to the above-described texture display method, the disclosed embodiments also provide a program product, such as a computer-readable storage medium, storing computer-executable instructions that, when invoked and executed by a processor, cause the processor to perform the steps of:
In response to a graffiti operation on the graphical user interface for the sphere, determining three-dimensional coordinates of a corresponding operation point of the graffiti operation on the sphere in a world coordinate system of the virtual scene space;
converting the three-dimensional coordinates into first UV coordinates on the sphere of the sphere using a specified conversion matrix;
and displaying the graffiti texture corresponding to the graffiti operation on the spherical surface based on the first UV coordinates and the display size of the preset texture.
For specific implementations and working processes, reference may be made to the corresponding processes in the above method embodiments, which are not described again herein.
In one possible embodiment, the processor, when executing the determining, in response to a graffiti operation on the graphical user interface for the sphere, of the three-dimensional coordinates of the corresponding operation point of the graffiti operation on the sphere in the world coordinate system of the virtual scene space, is specifically configured to:
responsive to a graffiti operation on the graphical user interface for the sphere, determining a corresponding operational location of the graffiti operation in the graphical user interface;
determining an intersection point of a ray taking the operation position as a starting point and along a preset direction and the sphere in the virtual scene space;
And determining a corresponding operation point of the graffiti operation on the sphere and three-dimensional coordinates of the operation point in the world coordinate system based on the intersection point.
In a possible embodiment, the processor, when executing the determining, based on the intersection point, of a corresponding operation point of the graffiti operation on the sphere, is specifically configured to:
if the number of the intersection points is one, determining the intersection points as corresponding operation points of the graffiti operation on the sphere;
and if the number of the intersection points is two, determining the intersection point closest to the virtual camera corresponding to the graphical user interface as the corresponding operation point of the graffiti operation on the sphere.
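The intersection logic above is a standard ray-sphere test: solve the quadratic in the ray parameter and, when the ray pierces the sphere twice, keep the root nearest the ray origin (the virtual camera). A minimal sketch with illustrative names; the patent does not prescribe an implementation:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Nearest intersection of a ray with a sphere, or None on a miss.

    Solves ||o + t*d - c||^2 = r^2 for t >= 0 and keeps the smaller
    non-negative root, i.e. the intersection closest to the camera
    when there are two intersection points.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                        # ray misses the sphere
    t1 = (-b - math.sqrt(disc)) / (2 * a)  # nearer root
    t2 = (-b + math.sqrt(disc)) / (2 * a)  # farther root
    t = t1 if t1 >= 0 else t2              # nearest non-negative hit
    if t < 0:
        return None
    return tuple(origin[i] + t * direction[i] for i in range(3))

# Camera at the origin looking down +z at a unit sphere centred at z = 5:
hit = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
print(hit)  # -> (0.0, 0.0, 4.0), the front surface (closest intersection)
```

The returned point corresponds to the operation point whose three-dimensional coordinates in the world coordinate system are then converted to UV coordinates.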
In a possible embodiment, the processor, when executing the conversion of the three-dimensional coordinates into first UV coordinates on the sphere of the sphere using a specified conversion matrix, is specifically configured to:
converting three-dimensional coordinates in the world coordinate system into coordinates in an observation space by using an observation matrix; the observation space is a space taking a virtual camera corresponding to the graphical user interface as an observer;
dividing the coordinates in the viewing space by the scaling parameters of the sphere to obtain first UV coordinates on the sphere of the sphere.
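A hedged sketch of this two-step conversion: the observation (view) matrix re-expresses the world-space point relative to the virtual camera, and dividing the view-space coordinates by the sphere's scaling parameters normalises them into a size-independent UV range. The 4x4 column-vector matrix convention and all names below are assumptions, not taken from the patent.

```python
import numpy as np

def world_to_screen_uv(world_point, view_matrix, sphere_scale):
    """World coordinates -> view space -> first UV coordinates.

    view_matrix: 4x4 observation matrix (column-vector convention).
    sphere_scale: the sphere's scaling parameters along x and y.
    """
    p = np.append(np.asarray(world_point, dtype=float), 1.0)  # homogeneous
    view = view_matrix @ p                                    # into view space
    u = view[0] / sphere_scale[0]  # divide by the scaling parameters
    v = view[1] / sphere_scale[1]
    return u, v

# Identity camera, sphere scaled by 2 along each axis (assumed values):
u, v = world_to_screen_uv((1.0, -0.5, 3.0), np.eye(4), (2.0, 2.0))
print(u, v)  # -> 0.5 -0.25
```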
In a possible embodiment, the display shape of the preset texture is a circle; the processor is specifically configured to, when executing the conversion of the three-dimensional coordinates into first UV coordinates on the sphere of the sphere using a specified conversion matrix:
three-dimensional coordinates in the world coordinate system are converted to first UV coordinates on the sphere of the sphere using a model matrix.
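One possible reading of the model-matrix path is sketched below: the inverse model matrix brings the world-space operation point back into the sphere's local space, and spherical angles of the local point then yield UV. The patent only names the model matrix, so the equirectangular parameterisation and all names are illustrative assumptions.

```python
import math
import numpy as np

def world_to_sphere_uv(world_point, model_matrix):
    """World coordinates -> sphere-local space -> spherical UV.

    The inverse model matrix undoes the sphere's placement in the world;
    longitude maps to U and latitude to V (an assumed parameterisation).
    """
    p = np.append(np.asarray(world_point, dtype=float), 1.0)
    local = np.linalg.inv(model_matrix) @ p
    x, y, z = local[:3] / np.linalg.norm(local[:3])  # onto the unit sphere
    u = 0.5 + math.atan2(z, x) / (2.0 * math.pi)     # longitude -> U
    v = 0.5 - math.asin(y) / math.pi                 # latitude  -> V
    return u, v

# Sphere at the world origin (identity model matrix), point on the +x axis:
u, v = world_to_sphere_uv((1.0, 0.0, 0.0), np.eye(4))
print(round(u, 3), round(v, 3))  # -> 0.5 0.5
```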
In a possible implementation manner, the processor, when executing the displaying of the graffiti texture corresponding to the graffiti operation on the spherical surface based on the first UV coordinates and the display size of the preset texture, is specifically configured to:
determining a target point within a range to be displayed of the graffiti texture from points on the spherical surface based on the first UV coordinates and a display size of a preset texture; wherein the distance between a point on the spherical surface and the operation point is determined by the difference between the UV coordinates of that point and the first UV coordinates;
and displaying the graffiti texture corresponding to the graffiti operation at the position of the second UV coordinate of the target point on the spherical surface.
In a possible embodiment, the processor is specifically configured to, when executing the determining, from the points on the sphere, the target point within the range to be displayed of the graffiti texture based on the first UV coordinate and the display size of the preset texture:
Adjusting the scaling to be displayed of the graffiti texture based on the display size of the preset texture to obtain the scaling to be scaled of the graffiti texture;
and determining a target point in a range to be displayed of the graffiti texture from points on the spherical surface based on the to-be-scaled scale and the first UV coordinates.
In a possible implementation manner, the display shape of the preset texture is square, and the display size of the preset texture comprises a preset side length of the square; the processor is specifically configured to, when executing the adjusting of the scaling to be displayed of the graffiti texture based on the display size of the preset texture to obtain the scaling to be scaled of the graffiti texture:
and adjusting the scaling to be displayed of the graffiti texture based on the preset side length to obtain the scaling to be scaled of the graffiti texture.
In one possible embodiment, after said determining a target point within a range to be displayed of said graffiti texture from points on said sphere based on said scale to be scaled and said first UV coordinates, said processor is further configured to perform:
and carrying out translation adjustment based on the second UV coordinates of the target point so as to enable the operating point to be positioned at the central position in the range to be displayed.
In a possible embodiment, the display shape of the preset texture is a circle; the processor is specifically configured to, when executing the adjusting of the scaling to be displayed of the graffiti texture based on the display size of the preset texture to obtain the scaling to be scaled of the graffiti texture:
and adjusting the scaling to be displayed of the graffiti texture by adjusting the radius of the circle based on the display size of the preset texture to obtain the scaling to be scaled of the graffiti texture.
In a possible embodiment, the operating point is the center point of the circle; the processor is specifically configured to, when executing the determining, from points on the sphere, a target point within a range to be displayed of the graffiti texture based on the to-be-scaled scale and the first UV coordinate:
determining a point on a boundary of the range to be displayed based on the to-be-scaled scale, and determining the distance between the point on the boundary and the center point as the radius of the circle;
and determining a target point in a range to be displayed of the graffiti texture from points on the spherical surface based on the radius of the circle and the first UV coordinates.
In a possible embodiment, the processor is specifically configured to, when executing the determining, from the points on the sphere, of a target point within the range to be displayed of the graffiti texture based on the radius of the circle and the first UV coordinates:
determining the distance between the point on the spherical surface and the center point through a chord length formula between two points, based on the first UV coordinates;
If the distance between the point on the spherical surface and the center point is smaller than or equal to the radius of the circle, determining the point on the spherical surface as a target point in the range to be displayed;
and if the distance between the point on the spherical surface and the central point is larger than the radius of the circle, determining the point on the spherical surface as an out-of-circle point outside the range to be displayed.
By means of the above method, the system can respond to a user's graffiti operation on the sphere on the graphical user interface, determine the three-dimensional coordinates, in the world coordinate system of the virtual scene space, of the operation point corresponding to the graffiti operation on the sphere, and then convert the three-dimensional coordinates into first UV coordinates on the spherical surface of the sphere using a specified conversion matrix, so that the graffiti texture corresponding to the graffiti operation is displayed on the spherical surface based on the first UV coordinates and the display size of the preset texture. In this way, the range of the graffiti brush can be displayed clearly on the surface of the three-dimensional sphere, which solves the technical problem in the prior art that the display effect of graffiti textures displayed on the surface of a three-dimensional sphere is poor, and improves the user experience.
The texture display device provided by the embodiments of the present disclosure may be specific hardware on a device, or software or firmware installed on a device, or the like. The device provided by the embodiments of the present disclosure has the same implementation principle and technical effects as the foregoing method embodiments; for brevity, where the device embodiments are silent, reference may be made to the corresponding content in the foregoing method embodiments. It will be clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system, apparatus and units described above may refer to the corresponding processes in the above method embodiments and are not described in detail herein.
In the embodiments provided in the present disclosure, it should be understood that the disclosed apparatus and method may be implemented in other manners. The apparatus embodiments described above are merely illustrative; for example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation. For example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the mutual coupling, direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some communication interfaces, devices or units, and may be in electrical, mechanical or other forms.
As another example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments provided in the present disclosure may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present disclosure, in essence, or the part thereof contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the texture display method according to the embodiments of the present disclosure. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that like reference numerals and letters denote similar items in the following figures; thus, once an item is defined in one figure, it need not be further defined or explained in subsequent figures. Furthermore, the terms "first", "second", "third", etc. are used merely to distinguish one description from another and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the foregoing embodiments are merely specific implementations of the present disclosure, used to illustrate rather than limit its technical solutions, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art will appreciate that, within the technical scope disclosed herein, any person familiar with the art may still modify the technical solutions described in the foregoing embodiments, readily conceive of changes thereto, or make equivalent substitutions for some of their technical features; such modifications, changes or substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present disclosure, and shall all be covered within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (15)

1. A method for displaying textures, characterized in that a graphical user interface is provided through a terminal device, and the content displayed by the graphical user interface comprises spheres in a virtual scene space; the method comprises the following steps:
in response to a graffiti operation on the graphical user interface for the sphere, determining three-dimensional coordinates of a corresponding operation point of the graffiti operation on the sphere in a world coordinate system of the virtual scene space;
converting the three-dimensional coordinates into first UV coordinates on the sphere of the sphere using a specified conversion matrix;
and displaying the graffiti texture corresponding to the graffiti operation on the spherical surface based on the first UV coordinates and the display size of the preset texture.
2. The method of claim 1, wherein the determining, in response to a graffiti operation on the graphical user interface for the sphere, three-dimensional coordinates of a corresponding operation point of the graffiti operation on the sphere in a world coordinate system of the virtual scene space comprises:
responsive to a graffiti operation on the graphical user interface for the sphere, determining a corresponding operational location of the graffiti operation in the graphical user interface;
Determining an intersection point of a ray taking the operation position as a starting point and along a preset direction and the sphere in the virtual scene space;
and determining a corresponding operation point of the graffiti operation on the sphere and three-dimensional coordinates of the operation point in the world coordinate system based on the intersection point.
3. The method of claim 2, wherein the determining, based on the intersection point, a corresponding operating point of the graffiti operation on the sphere comprises:
if the number of the intersection points is one, determining the intersection points as corresponding operation points of the graffiti operation on the sphere;
and if the number of the intersection points is two, determining the intersection point closest to the virtual camera corresponding to the graphical user interface as the corresponding operation point of the graffiti operation on the sphere.
4. The method of claim 1, wherein the display shape of the predetermined texture is square;
the converting the three-dimensional coordinates into first UV coordinates on the sphere of the sphere using a specified conversion matrix, comprising:
converting three-dimensional coordinates in the world coordinate system into coordinates in an observation space by using an observation matrix; the observation space is a space taking a virtual camera corresponding to the graphical user interface as an observer;
Dividing the coordinates in the viewing space by the scaling parameters of the sphere to obtain first UV coordinates on the sphere of the sphere.
5. The method of claim 1, wherein the display shape of the preset texture is circular;
the converting the three-dimensional coordinates into first UV coordinates on the sphere of the sphere using a specified conversion matrix, comprising:
three-dimensional coordinates in the world coordinate system are converted to first UV coordinates on the sphere of the sphere using a model matrix.
6. The method of claim 1, wherein displaying the graffiti texture corresponding to the graffiti operation on the sphere based on the first UV coordinates and a display size of a preset texture, comprises:
determining a target point within a range to be displayed of the graffiti texture from points on the spherical surface based on the first UV coordinates and a display size of a preset texture; wherein the distance between a point on the spherical surface and the operation point is determined by the difference between the UV coordinates of that point and the first UV coordinates;
and displaying the graffiti texture corresponding to the graffiti operation at the position of the second UV coordinate of the target point on the spherical surface.
7. The method of claim 6, wherein the determining the target point within the graffiti texture to be displayed from the points on the sphere based on the first UV coordinates and a display size of a preset texture comprises:
adjusting the scaling to be displayed of the graffiti texture based on the display size of the preset texture to obtain the scaling to be scaled of the graffiti texture;
and determining a target point in a range to be displayed of the graffiti texture from points on the spherical surface based on the to-be-scaled scale and the first UV coordinates.
8. The method of claim 7, wherein the display shape of the predetermined texture is square, and the display size of the predetermined texture includes a predetermined side length of the square;
the adjusting the scaling of the graffiti texture to be displayed based on the display size of the preset texture to obtain the scaling of the graffiti texture to be scaled comprises the following steps:
and adjusting the scaling to be displayed of the graffiti texture based on the preset side length to obtain the scaling to be scaled of the graffiti texture.
9. The method of claim 7, wherein after the determining the target point within the range to be displayed of the graffiti texture from the points on the sphere based on the scale to be scaled and the first UV coordinates, the method further comprises:
And carrying out translation adjustment based on the second UV coordinates of the target point so as to enable the operating point to be positioned at the central position in the range to be displayed.
10. The method of claim 7, wherein the display shape of the preset texture is circular;
the adjusting the scaling of the graffiti texture to be displayed based on the display size of the preset texture to obtain the scaling of the graffiti texture to be scaled comprises the following steps:
and adjusting the scaling to be displayed of the graffiti texture by adjusting the radius of the circle based on the display size of the preset texture to obtain the scaling to be scaled of the graffiti texture.
11. The method of claim 10, wherein the operating point is a center point of the circle;
the determining, based on the to-be-scaled scale and the first UV coordinate, a target point within a to-be-displayed range of the graffiti texture from points on the sphere, including:
determining a point on a boundary of the range to be displayed based on the to-be-scaled scale, and determining a distance between the point on the boundary and the center point as a radius of the circle;
and determining a target point in a range to be displayed of the graffiti texture from points on the spherical surface based on the radius of the circle and the first UV coordinates.
12. The method of claim 11, wherein the determining a target point within a range to be displayed of the graffiti texture from points on the sphere based on the radius of the circle and the first UV coordinates comprises:
determining the distance between the point on the spherical surface and the center point through a chord length formula between two points based on the first UV coordinates;
if the distance between the point on the spherical surface and the center point is smaller than or equal to the radius of the circle, determining the point on the spherical surface as a target point in the range to be displayed;
and if the distance between the point on the spherical surface and the central point is larger than the radius of the circle, determining the point on the spherical surface as an out-of-circle point outside the range to be displayed.
13. A display device of textures, characterized in that a graphical user interface is provided by a terminal device, and the content displayed by the graphical user interface comprises spheres in a virtual scene space; comprising the following steps:
a determining module, configured to determine, in response to a graffiti operation on the graphical user interface for the sphere, three-dimensional coordinates of an operation point corresponding to the graffiti operation on the sphere in a world coordinate system of the virtual scene space;
The conversion module is used for converting the three-dimensional coordinates into first UV coordinates on the spherical surface of the sphere by using a specified conversion matrix;
and the display module is used for displaying the graffiti texture corresponding to the graffiti operation on the spherical surface based on the first UV coordinates and the display size of the preset texture.
14. An electronic terminal comprising a memory, a processor, the memory having stored therein a computer program executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the method of any of the preceding claims 1 to 12.
15. A computer readable storage medium storing computer executable instructions which, when invoked and executed by a processor, cause the processor to perform the method of any one of claims 1 to 12.
CN202211287929.8A 2022-10-20 2022-10-20 Texture display method and device and electronic terminal Pending CN116012511A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211287929.8A CN116012511A (en) 2022-10-20 2022-10-20 Texture display method and device and electronic terminal


Publications (1)

Publication Number Publication Date
CN116012511A true CN116012511A (en) 2023-04-25

Family

ID=86021833

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211287929.8A Pending CN116012511A (en) 2022-10-20 2022-10-20 Texture display method and device and electronic terminal

Country Status (1)

Country Link
CN (1) CN116012511A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination