CN117519686A - Three-dimensional UI development method, three-dimensional UI development device, and storage medium - Google Patents

Three-dimensional UI development method, three-dimensional UI development device, and storage medium

Info

Publication number
CN117519686A
CN117519686A (application number CN202311572959.8A)
Authority
CN
China
Prior art keywords
instruction
dimensional
development
interface
component
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311572959.8A
Other languages
Chinese (zh)
Inventor
谭述安
肖兰菲
康玉路
王秀琴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Chaoxiang Digital Technology Co ltd
Original Assignee
Shenzhen Chaoxiang Digital Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Chaoxiang Digital Technology Co ltd filed Critical Shenzhen Chaoxiang Digital Technology Co ltd
Priority to CN202311572959.8A priority Critical patent/CN117519686A/en
Publication of CN117519686A publication Critical patent/CN117519686A/en

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 — Arrangements for software engineering
    • G06F 8/30 — Creation or generation of source code
    • G06F 8/33 — Intelligent editors
    • G06F 8/34 — Graphical or visual programming
    • G06F 8/38 — Creation or generation of source code for implementing user interfaces
    • G06F 9/00 — Arrangements for program control, e.g. control units
    • G06F 9/06 — Arrangements for program control using stored programs
    • G06F 9/44 — Arrangements for executing specific programs
    • G06F 9/451 — Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a three-dimensional UI development method, a three-dimensional UI development device, and a storage medium. The method comprises the following steps: in response to a UI blueprint construction instruction and a World Space UI instruction received by a UI development interface, outputting a UI element corresponding to the UI blueprint construction instruction in a three-dimensional screen editing interface; when an editing instruction for the UI element is received, updating the state of the input device to an unlocked state; in response to an operation instruction input by the input device, determining layout information and an interaction component of the UI element corresponding to the operation instruction; and determining a rendering effect corresponding to the interaction component and adding the rendering effect to the UI element. By responding to three-dimensional development instructions received by the UI interface and editing and rendering three-dimensional UI elements according to the operation instructions of the input device, the method and device improve the interaction effect of UI products.

Description

Three-dimensional UI development method, three-dimensional UI development device, and storage medium
Technical Field
The present invention relates to the field of software design, and more particularly, to a three-dimensional UI development method, a three-dimensional UI development apparatus, and a storage medium.
Background
With the technical support that UE5 (Unreal Engine 5) provides in fields such as the metaverse, digital twins, data visualization, and AR/VR (Augmented Reality/Virtual Reality), users place higher demands on interactive experience and UI design.
However, the interface editing system built into UE5 (UMG, Unreal Motion Graphics UI Designer) only supports editing two-dimensional panels and cannot meet the development requirements of interactive 3D UIs in fields such as the metaverse, digital twins, data visualization, and AR/VR, resulting in poor three-dimensional interaction effects in products.
The foregoing is provided merely for the purpose of facilitating understanding of the technical solutions of the present invention and is not intended to represent an admission that the foregoing is prior art.
Disclosure of Invention
The invention mainly aims to provide a three-dimensional UI development method, a three-dimensional UI development device, and a storage medium, solving the problem in the prior art that the editing system built into UE5 cannot support three-dimensional UI development.
To achieve the above object, the present invention provides a method for developing a three-dimensional UI, the method comprising the steps of:
responding to a UI blueprint construction instruction and a World Space UI instruction received by a UI development interface, and outputting a UI element corresponding to the UI blueprint construction instruction in a three-dimensional screen editing interface;
when an editing instruction for the UI element is received, updating the state of the input device to an unlocked state;
responding to an operation instruction input by the input equipment, and determining layout information and an interaction component of the UI element corresponding to the operation instruction;
and determining a rendering effect corresponding to the interaction component, and adding the rendering effect to the UI element.
Optionally, before the step of outputting the UI element corresponding to the UI blueprint construction instruction in the three-dimensional screen editing interface in response to the UI blueprint construction instruction and the World Space UI instruction received by the UI development interface, the method further includes:
outputting the three-dimensional development interface when detecting a starting instruction of the real-time engine;
responding to a plug-in call instruction received by the three-dimensional development interface, determining development content corresponding to the plug-in call instruction, and restarting the real-time engine when the development content is UI development and a reload instruction is received;
and after detecting that the real-time engine is restarted, outputting the UI development interface.
Optionally, the step of determining development content corresponding to the plug-in call instruction in response to the plug-in call instruction received by the three-dimensional development interface, and restarting the real-time engine when the development content is UI development and a reload instruction is received includes:
responding to the plug-in call instruction received by the three-dimensional development interface, and determining the development content corresponding to the plug-in call instruction;
acquiring a loading state of the blueprint life cycle behavior of the real-time engine, and generating and outputting the reload instruction when the loading state is not loaded;
and restarting the real-time engine when the development content is UI development and the reloading instruction is received.
Optionally, the step of outputting the UI element corresponding to the UI blueprint construction instruction in the three-dimensional screen editing interface in response to the UI blueprint construction instruction and the World Space UI instruction received by the UI development interface includes:
responding to a UI blueprint construction instruction and a World Space UI instruction received by the UI development interface, and determining pivot information, picture source information, and color information corresponding to the UI blueprint construction instruction;
and generating the UI element according to the pivot information, the picture source information, and the color information, and outputting the UI element corresponding to the UI blueprint construction instruction in a three-dimensional screen editing interface.
Optionally, the step of determining layout information of the UI element and the interaction component corresponding to the operation instruction in response to the operation instruction input by the input device includes:
responding to an operation instruction input by the input equipment, and outputting a component editing interface when the operation instruction is a component editing instruction;
and determining the layout information and the interaction component according to the editing action received by the component editing interface.
Optionally, after the step of updating the state of the input device to the unlocked state when receiving the editing instruction of the UI element, the method further includes:
responding to the operation instruction input by the input equipment, and determining an execution action corresponding to the operation instruction;
the UI element is moved, rotated, or scaled based on the execution action.
Optionally, the step of determining the rendering effect corresponding to the interaction component and adding the rendering effect to the UI element includes:
when the interaction component is a visualization component, determining the rendering special effect of rich text, background blurring, pixel blurring, invalidation-box caching, or automatic line-feed processing corresponding to the visualization component;
and rendering the UI element in three dimensions based on the rendering special effect and a rendering pipeline of the UI renderer.
Optionally, after the step of determining the rendering effect corresponding to the interaction component and adding the rendering effect to the UI element, the method further includes:
when a debugging instruction of the UI element is received, determining a debugging behavior corresponding to the debugging instruction;
and rotating, zooming, switching the view angle, modifying the color and displaying the special effect on the UI element according to the debugging behavior.
In addition, in order to achieve the above purpose, the invention also provides a three-dimensional UI development device, wherein the three-dimensional UI development device comprises a mesh rendering pipeline, a global shader, entity classes of UI components, a component editing interface, and blueprint scripts; the mesh rendering pipeline (Mesh Drawing Pipeline) provides a drawing foundation for the UI design, the global shader provides post effects for mesh drawing, the entity classes of the UI components provide the functional attributes of each component and the code APIs exposed to the blueprint scripts, the component editing interface provides layout and attribute configuration for editing components, and the blueprint scripts provide APIs with interaction functions; the three-dimensional UI development device further comprises a memory, a processor, and a three-dimensional UI development program stored on the memory and capable of running on the processor, wherein the three-dimensional UI development program, when executed by the processor, implements the steps of the three-dimensional UI development method described above.
In addition, in order to achieve the above object, the present invention also provides a computer-readable storage medium having stored thereon a development program of a three-dimensional UI, which when executed by a processor, implements the steps of the development method of a three-dimensional UI as described above.
The embodiment of the invention provides a three-dimensional UI development method, a three-dimensional UI development device, and a storage medium. In response to a UI blueprint construction instruction and a World Space UI instruction received by a UI development interface, a UI element corresponding to the UI blueprint construction instruction is output in a three-dimensional screen editing interface; when an editing instruction for the UI element is received, the state of the input device is updated to an unlocked state; in response to an operation instruction input by the input device, layout information and an interaction component of the UI element corresponding to the operation instruction are determined; finally, the rendering special effect corresponding to the interaction component is determined and added to the UI element. By applying the corresponding layout modification and special-effect rendering to the current UI element, development of the three-dimensional UI is completed in the three-dimensional screen editing interface, and the interactive effect of the UI product is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention. In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the description of the embodiments will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a schematic diagram of a technical framework of three-dimensional UI development of the development method of the three-dimensional UI of the invention;
FIG. 2 is a schematic diagram of a development framework of a mesh rendering pipeline of the development method of the three-dimensional UI of the invention;
FIG. 3 is a schematic view angle switching diagram of a three-dimensional UI of the development method of the three-dimensional UI of the invention;
FIG. 4 is a schematic illustration of one of the three-dimensional developments of the development method of the three-dimensional UI of the invention;
FIG. 5 is another schematic diagram of three-dimensional development of the development method of the three-dimensional UI of the invention;
FIG. 6 is a flow chart of a second embodiment of a method of developing a three-dimensional UI of the invention;
FIG. 7 is a schematic illustration of the effects of the animation of the development method of the three-dimensional UI of the invention;
fig. 8 is a schematic diagram of a terminal hardware structure of various embodiments of a method of developing a three-dimensional UI of the present invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
In order to better understand the above technical solution, exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Referring to fig. 1, fig. 1 is a schematic diagram of the technical framework of three-dimensional UI development according to the present invention, in which the Mesh Drawing Pipeline is a mesh rendering pipeline that provides a drawing foundation for the UI through custom mesh drawing. Further, the three-dimensional UI development method of the invention performs UI drawing based on the real-time engine's own mesh drawing mode; the real-time engine may be Unreal Engine 4, Unreal Engine 5, or another engine usable for UI development. The development framework of the mesh rendering pipeline (Mesh Drawing Pipeline) is shown in fig. 2: FPrimitiveSceneProxy is the starting point of mesh drawing and is the render-thread representation of the game thread's UPrimitiveComponent; it is responsible for the GetDynamicMeshElements and DrawStaticElements callbacks and submits FMeshBatch structures to the renderer. FMeshBatch decouples the implementation (user code) from the mesh passes (the private renderer module) and contains the final shader and all rendered content. FMeshDrawCommand is converted from FMeshBatch, serves as the interface between FMeshBatch and the RHI, and contains all the information needed for mesh rendering, including the shaders used, the resource bindings, and the draw parameters.
The Global Shader provides post effects for the mesh, such as pixel blur effects, as well as debugging aids such as auxiliary drawing lines. Global Shaders perform operations on fixed geometry (e.g., geometry used for rendering fonts) and do not require the real-time engine to interact with materials (Material) in memory; there is only one shader for any given Global Shader type. The main shader families used are Vertex Factories, which implement different mesh types, and Material Shaders, which access material properties. The perspective switching of the three-dimensional UI may be as shown in fig. 3.
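The pixel blur (pixelation) post effect mentioned above can be illustrated with a minimal CPU-side sketch; the real effect runs in a GPU global shader, so this self-contained C++ function is an assumption-laden illustration of the idea only, not engine code:

```cpp
#include <cstddef>
#include <vector>

// Illustrative sketch of a pixelation post effect: the image is a row-major
// grayscale buffer, and each `block`-by-`block` tile is replaced by the
// average of its pixels, producing the blocky "pixel blur" look.
std::vector<int> pixelate(const std::vector<int>& img,
                          std::size_t w, std::size_t h, std::size_t block) {
    std::vector<int> out(img.size());
    for (std::size_t by = 0; by < h; by += block) {
        for (std::size_t bx = 0; bx < w; bx += block) {
            long sum = 0;
            std::size_t n = 0;
            // Accumulate the tile, clamping at the image borders.
            for (std::size_t y = by; y < by + block && y < h; ++y)
                for (std::size_t x = bx; x < bx + block && x < w; ++x) {
                    sum += img[y * w + x];
                    ++n;
                }
            const int avg = static_cast<int>(sum / static_cast<long>(n));
            // Write the tile average back over the whole tile.
            for (std::size_t y = by; y < by + block && y < h; ++y)
                for (std::size_t x = bx; x < bx + block && x < w; ++x)
                    out[y * w + x] = avg;
        }
    }
    return out;
}
```

A GPU implementation would perform the same averaging per fragment in a full-screen pass, which is what makes it a natural global-shader operation.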
Components are the entity classes of each UI component and mainly provide the functional attributes of each component and the code API exposed to Blueprint for developers to use. All components inherit from U3DUILifeCycleBehaviour, which in turn inherits from the real-time engine's UActorComponent and manages the life cycle of the whole component. Components provide a number of functional attributes; for example, the picture component provides pivot (Pivot) setting, picture (Sprite) source setting, color (Color) setting, and so on.
The Editor Interface is the user editing interface, which mainly provides component editing interfaces such as layout settings so that developers can quickly edit the layout and attribute configuration of components. The main framework used by the Editor Interface is Slate, which brings component-based visual interface editing to users. Two editing and presentation modes are available: Screen Space UI and World Space UI. The Screen Space UI is mainly oriented to two-dimensional UI editing with a fixed lens view angle, while the World Space UI is mainly oriented to three-dimensional UI editing; the invention therefore provides users with a three-dimensional development mode for the UMG interface through the World Space UI editing mode.
Blueprint is the blueprint script, the scripting language of the real-time engine, which enables developers to develop UI interaction logic quickly.
In addition, the three-dimensional UI development system further comprises a visualization composition component, event callbacks, a layout setting component, a 3D UI canvas, a UE renderer, and a 3D UI renderer. The visualization composition component provides visual editing management; the event callbacks process and respond to UI interaction operations; the layout setting component implements pivot setting, animation, component clipping, DPI scaling, and UI style setting for the UI interface; the 3D UI canvas provides the canvas base classes; the UE renderer renders UI elements through the rendering pipeline; and the 3D UI renderer renders the UI interface through the 3D UI rendering pipeline. The visualization composition component comprises a rich text component, a background blur control, a pixel blur control, an invalidation box, and an auto line-feed frame. The rich text component provides various component style options and user-defined options; the background blur control surrounds UI content with an adjustable frame and adds a Gaussian blur post effect; the pixel blur control adds a pixelation effect to the background; the invalidation box caches the geometry it encloses so as to accelerate the rendering speed of the editing frame; and the auto line-feed frame arranges sub-controls from left to right and places sub-controls exceeding a preset width on the next line.
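The auto line-feed frame's behavior can be sketched as a simple wrapping-layout pass; the following self-contained C++ is an illustrative assumption about the algorithm, not the plugin's actual code:

```cpp
#include <cstddef>
#include <vector>

// Illustrative sketch of the auto line-feed frame: children are placed left
// to right, and a child that would exceed the preset width starts a new row.
// Returns the row index assigned to each child.
std::vector<std::size_t> wrapChildren(const std::vector<double>& widths,
                                      double maxWidth) {
    std::vector<std::size_t> rows(widths.size());
    std::size_t row = 0;
    double used = 0.0;
    for (std::size_t i = 0; i < widths.size(); ++i) {
        // Wrap only if the row already has content and this child overflows.
        if (used > 0.0 && used + widths[i] > maxWidth) {
            ++row;
            used = 0.0;
        }
        rows[i] = row;
        used += widths[i];
    }
    return rows;
}
```

A child wider than `maxWidth` still occupies a row of its own rather than being clipped, which matches the usual behavior of wrapping containers.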
The three-dimensional development process may be as shown in fig. 4 and 5, and in the editor state, the entire three-dimensional UI interface may embed a three-dimensional object therein and be anchored by a lens.
Based on the technical architecture of three-dimensional UI development shown in FIG. 1, the three-dimensional UI development method of the invention is provided. Referring to fig. 6, the method for developing a three-dimensional UI of the present invention includes the following steps:
Step S10, responding to a UI blueprint construction instruction and a World Space UI instruction received by a UI development interface, and outputting a UI element corresponding to the UI blueprint construction instruction in a three-dimensional screen editing interface;
in this embodiment, the UI development interface refers to a UMG editing interface after the real-time engine is started, which includes a three-dimensional screen editing interface. The World Space UI instruction is based on Editor Interface, a three-dimensional development visual Interface is provided for a user, and the UI blueprint construction instruction is a target element selected by the user and needing to be subjected to three-dimensional editing processing.
In an alternative implementation of outputting UI elements, the component corresponding to the UI element to be edited is the picture component. In the development architecture, the picture component of Components provides attributes such as pivot setting, picture source setting, and color setting. Therefore, in response to the UI blueprint construction instruction and the World Space UI instruction received by the UI development interface, the pivot information, picture source information, and color information corresponding to the UI blueprint construction instruction are determined; the UI element corresponding to the UI blueprint construction instruction is then generated according to the pivot information, the picture source information, and the color information, and output in the three-dimensional screen editing interface.
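Building a picture element from the three attribute groups above can be sketched as follows; the struct and function names are hypothetical illustrations, not the LGUI or UE5 API:

```cpp
#include <string>

// Hedged sketch of the picture-component attributes the text lists:
// pivot, sprite source, and color. Names are illustrative assumptions.
struct Color {
    int r, g, b, a;
};

struct PictureElement {
    double pivotX, pivotY;     // normalized pivot point (0..1)
    std::string spriteSource;  // picture source, e.g. an asset path
    Color color;               // tint color applied to the sprite
};

// Assemble a UI element from the attribute groups resolved from the
// blueprint construction instruction.
PictureElement makePictureElement(double px, double py,
                                  const std::string& src, Color c) {
    return PictureElement{px, py, src, c};
}
```

In the real editor these attributes would be edited through the component editing interface rather than constructed directly in code.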
In another alternative implementation of outputting UI elements, when the UI element to be edited is a rich text component, it is necessary to determine inline attributes of the rich text component such as style changes, inline images, and hyperlinks, and then select a target element based on these attributes.
When editing UI elements, it is usually necessary to call the LGUI (local graphical user interface) plug-in. Therefore, when starting the real-time engine requires calling LGUI and the current real-time engine has not loaded the plug-in, the plug-in needs to be selected and the real-time engine reloaded. That is, before step S10, the method further includes:
step S40, outputting the three-dimensional development interface when a starting instruction of the real-time engine is detected;
step S50, determining development content corresponding to the plug-in call instruction in response to the plug-in call instruction received by the three-dimensional development interface, and restarting the real-time engine when the development content is UI development and a reload instruction is received;
and step S60, outputting the UI development interface after detecting that the real-time engine is restarted.
In this embodiment, after the user clicks the application program file of the real-time engine, the three-dimensional development system responds to the start instruction and outputs the UMG interface, and then responds to the plug-in call instruction selected by the user and received by the interface, and determines the development content corresponding to the plug-in call instruction. Because the plug-in call instruction is an LGUI plug-in, whether the plug-in is loaded or not needs to be determined, namely, the loading state of the blueprint life cycle behavior of the real-time engine needs to be obtained, and when the loading state is not loaded, a reload instruction of the real-time engine is generated and output, and when development content is UI development and the reload instruction is received, the real-time engine is restarted, so that the LGUI plug-in is added to the real-time engine. And finally, after detecting that the real-time engine is restarted, outputting the UI development interface.
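The reload decision described above can be sketched as a small predicate; this self-contained C++ is an assumption about the control flow only, not the actual UE5 or LGUI plugin-manager API:

```cpp
#include <string>

// Illustrative sketch of the reload decision: a reload instruction is
// generated only when the blueprint life-cycle behaviour is not yet loaded,
// and the engine restarts only when the development content is UI development.
enum class LoadState { NotLoaded, Loaded };

bool needsEngineRestart(const std::string& developmentContent,
                        LoadState pluginState) {
    const bool reloadInstruction = (pluginState == LoadState::NotLoaded);
    return developmentContent == "UI development" && reloadInstruction;
}
```

If the plug-in is already loaded, no reload instruction is generated and the editor proceeds directly to the UI development interface.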
In an exemplary embodiment, after the user starts the real-time engine, a simple 3D scene is obtained; the user then selects the plug-in in the editing options, selects the content to be created, and clicks the reload button, after which the real-time engine restarts.
Step S20, when an editing instruction for the UI element is received, updating the state of the input device to an unlocked state;
in this embodiment, after the UI element is output, the current editing interface and the UI element are in a protection state, based on which, after the user clicks the corresponding editing instruction, the locking state is released, so that the user can three-dimensionally edit the UI element through an input device such as a mouse and a keyboard. By setting the locking state of the input device, the user can be prevented from greatly changing the UI element due to operations such as false touch.
After updating the state of the input device to the touch-lock state, the user can control the UI element to move, rotate, flip or zoom through the mouse selection and based on the keys of the keyboard, so as to control or manage the UI element from the three-dimensional aspect. Thus, in response to an operation instruction input by the input device, an execution action corresponding to the operation instruction is then determined, and then the UI element is moved, rotated, or scaled based on the execution action. Control of the UI elements in three-dimensional space is achieved.
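The lock/unlock gating of edit actions can be sketched as follows; the enum, struct, and function names are illustrative assumptions, not the patented implementation:

```cpp
#include <cmath>

// Illustrative sketch: edit actions apply only once an editing instruction
// has moved the input device into the unlocked state.
enum class DeviceState { Locked, Unlocked };

struct Element2D {
    double x, y;       // position in the editing plane
    double angleDeg;   // rotation in degrees
    double scale;      // uniform scale factor
};

// Each action returns false (and leaves the element untouched) while locked.
bool applyMove(DeviceState s, Element2D& e, double dx, double dy) {
    if (s != DeviceState::Unlocked) return false;
    e.x += dx;
    e.y += dy;
    return true;
}

bool applyRotate(DeviceState s, Element2D& e, double deg) {
    if (s != DeviceState::Unlocked) return false;
    e.angleDeg = std::fmod(e.angleDeg + deg, 360.0);
    return true;
}

bool applyScale(DeviceState s, Element2D& e, double factor) {
    if (s != DeviceState::Unlocked) return false;
    e.scale *= factor;
    return true;
}
```

Gating every action on the state, rather than filtering input events upstream, keeps the accidental-touch protection in one place.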
Step S30, responding to an operation instruction input by the input device, and determining layout information and an interaction component of the UI element corresponding to the operation instruction;
in this embodiment, the operation instruction input by the user through the input device may include layout adjustment and special effect rendering of the UI element. When special effect rendering is performed based on the operation instruction, a user can surround UI content by using an adjustable frame based on a background blurring control corresponding to the operation instruction, then a Gaussian blurring post special effect is added to the content in the background of the UI element, and a pixelized effect can be added to the background of the UI element through a pixel blurring component. Based on the method, three-dimensional special effect rendering of the UI element is achieved.
In another alternative implementation manner of determining the layout information, when the operation instruction is a component editing instruction, outputting a corresponding component editing interface, and further determining the layout information of the UI element and a corresponding interaction component according to an editing action input by a user and received in the component editing interface. For example, the UI element is a text element, and the user selects a supplemental animation corresponding to the text element in the component editing interface, and then performs layout adjustment on the text element based on the supplemental animation. The animation effect of the text element may be as shown in fig. 7, and the processing such as moving, rotating or scaling of the three-dimensional UI component may be implemented based on the animation. In addition, a visual component capable of clicking interaction by a user can be added for the UI element, so that the rendering effect of the UI element is improved.
And step S40, determining a rendering special effect corresponding to the interaction component, and adding the rendering special effect to the UI element.
In this embodiment, when the interaction component is a visualization component, the component corresponds to different rendering special effects, such as background blur, pixel blur, and so on. It is therefore necessary to determine the rendering special effect of rich text, background blur, pixel blur, invalidation-box caching, or automatic line-feed processing corresponding to the visualization component, and then render the UI element in three dimensions based on the rendering special effect and the rendering pipeline of the UI renderer. For example, the 3D UI renderer may use the 3D UI rendering pipeline, in which elements are rendered directly to the viewport after the real-time engine renders the world, unaffected by post-processing.
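The per-component effect selection described in this step can be sketched as a simple dispatch; the component names come from this document's component list, while the function and the effect strings are illustrative assumptions:

```cpp
#include <string>

// Illustrative mapping from a visualization component to its rendering
// special effect, as the step above describes.
std::string renderEffectFor(const std::string& component) {
    if (component == "rich text")        return "style/inline-image rendering";
    if (component == "background blur")  return "gaussian blur post effect";
    if (component == "pixel blur")       return "pixelation post effect";
    if (component == "invalidation box") return "cached geometry";
    if (component == "auto wrap")        return "line-feed layout";
    return "none";  // unknown components get no special effect
}
```

In a real implementation this dispatch would more likely be virtual methods on the component entity classes than a string lookup.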
In an optional implementation manner after the rendering special effect is added to the UI element, the user can debug or display the UI element in the current operation interface, and specific actions can include rotation, scaling and visual angle switching of the UI element, and can also modify the color of the UI element and display the special effect. That is, when a debug instruction of a UI element is received, determining a debug behavior corresponding to the debug instruction, and performing rotation, scaling, view angle switching, color modification and special effect display on the UI element according to the debug behavior. Based on the method, three-dimensional development and three-dimensional display of the UI elements are realized, and the interaction effect of the UI products is improved.
In the technical scheme disclosed by the embodiment, based on the technical framework of three-dimensional UI development, an operation instruction input by a user in a three-dimensional UI development interface of a real-time engine is responded, and then the reloading of the engine, the addition of a three-dimensional component of a UI element, the rendering of a three-dimensional special effect of the UI element and the like are completed according to the operation instruction, based on the operation instruction, the development of the three-dimensional UI is realized in a UMG interface, and the three-dimensional interaction effect of a product is improved through the three-dimensional component.
Referring to fig. 8, fig. 8 is a schematic diagram of a terminal structure of a hardware running environment according to an embodiment of the present invention.
As shown in fig. 8, the terminal may include: a processor 1001 (such as a CPU), a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002. The communication bus 1002 is used to enable connected communication between these components. The user interface 1003 may include a display (Display) and an input unit such as a keyboard (Keyboard); optionally, the user interface 1003 may further include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a stable non-volatile memory such as disk storage. The memory 1005 may optionally also be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the terminal structure shown in fig. 8 does not constitute a limitation of the terminal, which may include more or fewer components than shown, combine certain components, or arrange the components differently.
As shown in fig. 8, the memory 1005, as a computer storage medium, may include an operating system, a network communication module, a user interface module, and a development program of the three-dimensional UI.
In the terminal shown in fig. 8, the network interface 1004 is mainly used to connect to a background server and perform data communication with it; the processor 1001 may call the development program of the three-dimensional UI stored in the memory 1005 and perform the following operations:
responding to a UI blueprint construction instruction and a World Space UI instruction received by a UI development interface, and outputting a UI element corresponding to the UI blueprint construction instruction in a three-dimensional screen editing interface;
when an editing instruction for the UI element is received, updating the state of the input device to an unlocked state;
responding to an operation instruction input by the input equipment, and determining layout information and an interaction component of the UI element corresponding to the operation instruction;
and determining a rendering effect corresponding to the interaction component, and adding the rendering effect to the UI element.
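The four operations above can be sketched as one session object. The class and method names below are illustrative only; they are not part of the patent's implementation or of any engine API.

```python
# Minimal sketch of the four processor operations above. The input device
# starts locked and is unlocked by an edit instruction before layout and
# interaction-component information can be determined.
class UIDevelopmentSession:
    def __init__(self):
        self.input_locked = True   # input device starts in the locked state
        self.elements = []

    def build_element(self, blueprint_instruction, world_space=True):
        # step 1: respond to the UI blueprint construction / World Space UI
        # instructions and output a UI element in the 3D screen editor
        element = {"name": blueprint_instruction, "world_space": world_space,
                   "layout": None, "component": None, "effect": None}
        self.elements.append(element)
        return element

    def begin_edit(self, element):
        # step 2: an editing instruction unlocks the input device
        self.input_locked = False

    def handle_operation(self, element, layout, component):
        # step 3: determine layout information and the interaction component
        if self.input_locked:
            raise RuntimeError("input device is still locked")
        element["layout"], element["component"] = layout, component

    def add_render_effect(self, element, effects_by_component):
        # step 4: pick the rendering effect matching the component, attach it
        element["effect"] = effects_by_component[element["component"]]
```

A typical pass through the flow builds an element, unlocks input, assigns a layout and component, then attaches the matching effect.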
Further, the processor 1001 may call a development program of the three-dimensional UI stored in the memory 1005, and further perform the following operations:
outputting the three-dimensional development interface when detecting a starting instruction of the real-time engine;
responding to a plug-in call instruction received by the three-dimensional development interface, determining development content corresponding to the plug-in call instruction, and restarting the real-time engine when the development content is UI development and a reload instruction is received;
and after detecting that the real-time engine is restarted, outputting the UI development interface.
Further, the processor 1001 may call a development program of the three-dimensional UI stored in the memory 1005, and further perform the following operations:
responding to the plug-in call instruction received by the three-dimensional development interface, and determining the development content corresponding to the plug-in call instruction;
acquiring the loading state of the blueprint life cycle behavior of the real-time engine, and generating and outputting the reload instruction when that state indicates the behavior is not loaded;
and restarting the real-time engine when the development content is UI development and the reload instruction is received.
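The reload check above can be sketched as a single gate: a reload instruction is generated only when the blueprint life-cycle behavior is not yet loaded, and the engine is restarted only when the development content is UI development. The function and flag names below are illustrative assumptions, not engine API.

```python
# Sketch of the reload gating described above. `restart_engine` stands in for
# whatever actually restarts the real-time engine; after the restart the UI
# development interface would be output.
def maybe_restart_engine(development_content, lifecycle_loaded, restart_engine):
    reload_requested = not lifecycle_loaded   # generate the reload instruction
    if development_content == "ui_development" and reload_requested:
        restart_engine()
        return True
    return False
```

Note that a restart happens only when both conditions hold: scene development with an unloaded life cycle, or UI development with an already-loaded life cycle, both leave the engine running.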
Further, the processor 1001 may call a development program of the three-dimensional UI stored in the memory 1005, and further perform the following operations:
responding to a UI blueprint construction instruction and a World Space UI instruction received by the UI development interface, and determining dotting information, picture source information and color information corresponding to the UI blueprint construction instruction;
and generating the UI element according to the dotting information, the picture source information and the color information, and outputting the UI element corresponding to the UI blueprint construction instruction in a three-dimensional screen editing interface.
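The element-generation step above combines three attribute groups into one record. A minimal sketch, with hypothetical field names:

```python
# Sketch of UI element generation from the three attribute groups named above:
# dotting (point-drawing) information, picture source information, and color
# information. Field names are illustrative.
def generate_ui_element(dotting_info, picture_source, color):
    return {
        "points": list(dotting_info),   # dotting information
        "texture": picture_source,      # picture source information
        "color": color,                 # color information
    }
```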
Further, the processor 1001 may call a development program of the three-dimensional UI stored in the memory 1005, and further perform the following operations:
responding to an operation instruction input by the input equipment, and outputting a component editing interface when the operation instruction is a component editing instruction;
and determining the layout information and the interaction component according to the editing action received by the component editing interface.
Further, the processor 1001 may call a development program of the three-dimensional UI stored in the memory 1005, and further perform the following operations:
responding to the operation instruction input by the input equipment, and determining an execution action corresponding to the operation instruction;
the UI element is moved, rotated, or scaled based on the execution action.
Further, the processor 1001 may call a development program of the three-dimensional UI stored in the memory 1005, and further perform the following operations:
when the interaction component is a visualization component, determining the rendering special effect of rich text, background blur, pixel blur, block invalidation, or automatic line-wrap processing corresponding to the visualization component;
and rendering the UI element in three dimensions based on the rendering special effect and a rendering pipeline of the UI renderer.
Further, the processor 1001 may call a development program of the three-dimensional UI stored in the memory 1005, and further perform the following operations:
when a debugging instruction of the UI element is received, determining a debugging behavior corresponding to the debugging instruction;
and rotating, scaling, switching the viewing angle of, modifying the color of, or displaying special effects on the UI element according to the debugging behavior.
Furthermore, those of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing related hardware. The computer program comprises program instructions and may be stored in a storage medium, which is a computer-readable storage medium. The program instructions are executed by at least one processor in the control terminal to carry out the steps of the embodiments of the method described above.
Accordingly, the present invention also provides a computer-readable storage medium storing a development program of a three-dimensional UI, which when executed by a processor, implements the respective steps of the development method of a three-dimensional UI as described in the above embodiments.
It should be noted that, because the storage medium provided in the embodiments of the present application is used to implement the methods of the embodiments of the present application, a person skilled in the art can understand its specific structure and variations based on the methods described; its description is therefore omitted here. All storage media used in the methods of the embodiments of the present application fall within the scope of protection intended by the present application.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing apparatus create means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, and so on does not indicate any ordering; these words may be interpreted as names.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.
The foregoing description covers only the preferred embodiments of the present invention and is not intended to limit the scope of the invention; any equivalent structural or process transformation made using the contents of this specification, or any direct or indirect application in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (10)

1. A method for developing a three-dimensional UI, the method comprising:
responding to a UI blueprint construction instruction and a World Space UI instruction received by a UI development interface, and outputting a UI element corresponding to the UI blueprint construction instruction in a three-dimensional screen editing interface;
when an editing instruction for the UI element is received, updating the state of the input device to an unlocked state;
responding to an operation instruction input by the input equipment, and determining layout information and an interaction component of the UI element corresponding to the operation instruction;
and determining a rendering effect corresponding to the interaction component, and adding the rendering effect to the UI element.
2. The method for developing a three-dimensional UI according to claim 1, wherein before the step of outputting the UI element corresponding to the UI blueprint construction instruction in the three-dimensional screen editing interface in response to the UI blueprint construction instruction and the World Space UI instruction received by the UI development interface, the method further comprises:
outputting the three-dimensional development interface when detecting a starting instruction of the real-time engine;
responding to a plug-in call instruction received by the three-dimensional development interface, determining development content corresponding to the plug-in call instruction, and restarting the real-time engine when the development content is UI development and a reload instruction is received;
and after detecting that the real-time engine is restarted, outputting the UI development interface.
3. The method for developing a three-dimensional UI according to claim 2, wherein the step of determining development content corresponding to the plug-in call instruction in response to the plug-in call instruction received by the three-dimensional development interface, and restarting the real-time engine when the development content is UI development and a reload instruction is received comprises:
responding to the plug-in call instruction received by the three-dimensional development interface, and determining the development content corresponding to the plug-in call instruction;
acquiring the loading state of the blueprint life cycle behavior of the real-time engine, and generating and outputting the reload instruction when that state indicates the behavior is not loaded;
and restarting the real-time engine when the development content is UI development and the reload instruction is received.
4. The method for developing a three-dimensional UI according to claim 1, wherein the step of outputting the UI element corresponding to the UI blueprint construction instruction in the three-dimensional screen editing interface in response to the UI blueprint construction instruction and the World Space UI instruction received by the UI development interface comprises:
responding to a UI blueprint construction instruction and a World Space UI instruction received by the UI development interface, and determining dotting information, picture source information and color information corresponding to the UI blueprint construction instruction;
and generating the UI element according to the dotting information, the picture source information and the color information, and outputting the UI element corresponding to the UI blueprint construction instruction in a three-dimensional screen editing interface.
5. The method for developing a three-dimensional UI according to claim 1, wherein the step of determining layout information of the UI element and the interactive component corresponding to the operation instruction in response to the operation instruction input by the input device comprises:
responding to an operation instruction input by the input equipment, and outputting a component editing interface when the operation instruction is a component editing instruction;
and determining the layout information and the interaction component according to the editing action received by the component editing interface.
6. The method for developing a three-dimensional UI according to claim 1, wherein after the step of updating the state of the input device to an unlocked state when an editing instruction for the UI element is received, the method further comprises:
responding to the operation instruction input by the input equipment, and determining an execution action corresponding to the operation instruction;
the UI element is moved, rotated, or scaled based on the execution action.
7. The method for developing a three-dimensional UI according to claim 1, wherein the step of determining the rendering effect corresponding to the interaction component and adding the rendering effect to the UI element comprises:
when the interaction component is a visualization component, determining the rendering special effect of rich text, background blur, pixel blur, block invalidation, or automatic line-wrap processing corresponding to the visualization component;
and rendering the UI element in three dimensions based on the rendering special effect and a rendering pipeline of the UI renderer.
8. The method for developing a three-dimensional UI according to claim 1, wherein after the step of determining the rendering effect corresponding to the interaction component and adding the rendering effect to the UI element, the method further comprises:
when a debugging instruction of the UI element is received, determining a debugging behavior corresponding to the debugging instruction;
and rotating, scaling, switching the viewing angle of, modifying the color of, or displaying special effects on the UI element according to the debugging behavior.
9. A three-dimensional UI development device, characterized in that the three-dimensional UI development device comprises: a mesh rendering pipeline, a global shader, entity classes of UI components, a component editing interface, and blueprint scripts; the mesh rendering pipeline (Mesh Drawing Pipeline) provides a drawing foundation for the UI design, the global shader provides post-processing effects for drawing the mesh, the entity classes of the UI components provide the functional attributes of each component and the code APIs exposed to the blueprint scripts, the component editing interface provides layout and attribute configuration for editing components, and the blueprint scripts provide APIs with interaction functions;
the three-dimensional UI development device further includes: a memory, a processor, and a development program of a three-dimensional UI stored on the memory and executable on the processor, which when executed by the processor, implements the steps of the development method of a three-dimensional UI as claimed in any one of claims 1 to 8.
10. A computer-readable storage medium, wherein a development program of a three-dimensional UI is stored thereon, which when executed by a processor, implements the steps of the development method of a three-dimensional UI according to any one of claims 1 to 8.
CN202311572959.8A (filed 2023-11-22): Three-dimensional UI development, three-dimensional UI development device, and storage medium. Publication CN117519686A, legal status pending.


Publications (1)

CN117519686A, published 2024-02-06



Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination