CN117908871A - Hybrid development method, apparatus, electronic device, and computer-readable storage medium - Google Patents

Hybrid development method, apparatus, electronic device, and computer-readable storage medium

Info

Publication number
CN117908871A
Authority
CN
China
Prior art keywords
event
item
project
executable file
dimensional
Prior art date
Legal status
Pending
Application number
CN202311841021.1A
Other languages
Chinese (zh)
Inventor
夏波
杨若鹄
Current Assignee
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date
Filing date
Publication date
Application filed by Ubtech Robotics Corp
Priority to CN202311841021.1A
Publication of CN117908871A



Abstract

The embodiment of the application discloses a hybrid development method, a hybrid development apparatus, an electronic device and a computer-readable storage medium, which are used for realizing embedded display of a graphical user interface application development framework interface and a three-dimensional development tool interface. The method comprises the following steps: separating the project code of the three-dimensional development tool into a first part and a second part, the first part comprising code of the two-dimensional user interface, the second part comprising business logic code and code of the three-dimensional interface; compiling the first part into a website project, compiling the second part into an executable file project, and accessing the website project and the executable file project into a target project, wherein the target project is a graphical user interface application development framework project; after the target project is started, starting the executable file project through the target project, and displaying the three-dimensional interface of the executable file project through a window component of the target project; and starting the website project through the browser engine of the target project and displaying the two-dimensional user interface.

Description

Hybrid development method, apparatus, electronic device, and computer-readable storage medium
Technical Field
The application belongs to the technical field of software development, and particularly relates to a hybrid development method, a hybrid development device, electronic equipment and a computer readable storage medium.
Background
Three-dimensional development tools (e.g., Unity) are real-time three-dimensional (3D) interactive content authoring and operation platforms that provide a complete set of software solutions for authoring and operating any real-time interactive two-dimensional (2D) and 3D content. A graphical user interface (GUI) application development framework (e.g., QT) can be used to develop both GUI programs and non-GUI programs.
In the current hybrid development scheme based on a graphical user interface application development framework (such as QT) and a three-dimensional development tool (such as Unity), the three-dimensional development tool is usually compiled into a corresponding executable file (exe) and embedded into the graphical user interface application development framework; the graphical user interface application development framework then starts the application of the three-dimensional development tool (e.g., UnityApp) via a process, and the interface of the application of the graphical user interface application development framework (e.g., QTApp) and the interface of the application of the three-dimensional development tool are displayed superimposed on each other.
However, the application of the three-dimensional development tool and the application of the graphical user interface application development framework are essentially two different applications. In terms of interface display and interface layering, either the interface of the application of the three-dimensional development tool can only be displayed on top of the interface of the application of the graphical user interface application development framework, or the interface of the application of the graphical user interface application development framework can only be displayed on top of the interface of the application of the three-dimensional development tool, so that embedded display of the graphical user interface application development framework interface and the three-dimensional development tool interface cannot be realized.
Disclosure of Invention
The embodiment of the application provides a hybrid development method, a hybrid development apparatus, an electronic device and a computer-readable storage medium, which can solve the problem that embedded display of the interface of a graphical user interface application development framework and the interface of a three-dimensional development tool cannot be realized in the conventional hybrid development of a graphical user interface application development framework and a three-dimensional development tool.
In a first aspect, an embodiment of the present application provides a hybrid development method, including:
Acquiring project code and separating the project code into a first part and a second part, wherein the first part comprises code of a two-dimensional user interface of a three-dimensional development tool, the second part comprises business logic code of the three-dimensional development tool and code of the three-dimensional interface, and the project code is project code based on the three-dimensional development tool;
compiling the first part into a website project, compiling the second part into an executable file project, and accessing the website project and the executable file project into a target project, wherein the target project is a graphical user interface application program development framework project;
After the target project is started, starting the executable file project through the target project, and displaying a three-dimensional interface of the executable file project through a window component of the target project;
and starting the website project through a browser engine of the target project, and displaying a two-dimensional user interface through the browser engine.
From the above, the embodiment of the application separates the project code of the three-dimensional development tool into the first part and the second part, wherein the first part comprises the two-dimensional user interface and the second part comprises the three-dimensional interface, that is, the interface of the three-dimensional development tool is separated; the first part and the second part are accessed into a target project, and the target project controls the starting and displaying of the first part and the second part, so that the two-dimensional user interface of the three-dimensional development tool is displayed on the browser engine of the target project and the three-dimensional interface of the three-dimensional development tool is displayed on the window component of the target project. In this way, interface display and interface layering are controlled by the business logic of a single application (namely, the target project), so that embedded display of the graphical user interface application development framework interface, the three-dimensional interface of the three-dimensional development tool and the two-dimensional user interface of the three-dimensional development tool can be realized, and the three-dimensional development tool interface and the graphical user interface application development framework interface can fit together seamlessly.
In some possible implementations of the first aspect, after launching the executable file item and the website item, the method further includes:
obtaining a first user event captured by a two-dimensional user interface of a three-dimensional development tool;
transmitting the first user event to the executable file item through the target item;
Carrying out business logic processing on a first user event through an executable file item to obtain a first processed event;
and transferring the first processed event to the website item through the target item so as to enable the website item to respond to the event.
In this implementation, focus problems between multiple processes may be avoided by capturing three-dimensional development tool events through a two-dimensional user interface of the three-dimensional development tool.
In some possible implementations of the first aspect, transmitting the first user event to the executable file item through the target item, obtaining the first processed event after performing business logic processing on the first user event through the executable file item, and forwarding the first processed event to the website item through the target item includes:
if the first user event is determined to be transferred to the target project or the executable file project through the website project, after the first user event is converted into the first event, the first event is sent to the script of the three-dimensional development tool, the script of the three-dimensional development tool transfers the first event to the target project through a communication channel, and the communication channel is a channel between the target project and the website project;
if it is determined that the first event needs to be transferred to the executable file item through the target item, converting the first event into a second event, and then sending, by the target item, the second event to a shared memory, wherein the shared memory is a memory shared by the target item and the executable file item;
Reading a second event from the shared memory through a stack machine of the executable file item, carrying out business logic processing on the second event by the executable file item to obtain a first processed event, and sending the first processed event to the shared memory by the executable file item;
Reading the first processed event from the shared memory through a stack machine of the target project, determining that the executable file project needs to call business logic of the website project by the target project, converting the first processed event into a third event, and forwarding the third event to a script of the three-dimensional development tool through a communication channel by the target project;
and forwarding the third event to the website project through the script of the three-dimensional development tool.
In some possible implementations of the first aspect, after launching the executable file item and the website item, the method further includes:
Acquiring a second user event captured by the target item;
Transmitting the second user event to the executable file item;
carrying out business logic processing on a second user event through the executable file item to obtain a second processed event;
And transferring the second processed event to the website item through the target item so as to enable the website item to respond to the event.
In some possible implementations of the first aspect, transmitting the second user event to the executable file item, obtaining the second processed event after performing business logic processing on the second user event by the executable file item, and forwarding the second processed event to the website item by the target item includes:
if the target item determines that the second user event needs to be transferred to the executable file item, converting the second user event into a fourth event, and then sending, by the target item, the fourth event to a shared memory, wherein the shared memory is a memory shared by the target item and the executable file item;
reading a fourth event from the shared memory through a stack machine of the executable file item, carrying out business logic processing on the fourth event by the executable file item to obtain a second processed event, and sending the second processed event to the shared memory by the executable file item;
Reading a second processed event from the shared memory through a stack machine of the target project, determining that the executable file project needs to call business logic of the website project by the target project, converting the second processed event into a fifth event, and forwarding the fifth event to a script of the three-dimensional development tool through a communication channel by the target project;
And forwarding the fifth event to the website project through the script of the three-dimensional development tool.
In some possible implementations of the first aspect, the first user event or the second user event is a keyboard operation event or a mouse operation event.
In a second aspect, an embodiment of the present application provides a hybrid development apparatus, including:
The code separation module is used for acquiring the project code and separating the project code into a first part and a second part, wherein the first part comprises the code of the two-dimensional user interface of the three-dimensional development tool, the second part comprises the business logic code of the three-dimensional development tool and the code of the three-dimensional interface, and the project code is based on the project code of the three-dimensional development tool;
The compiling access module is used for compiling the first part into a website project, compiling the second part into an executable file project, and accessing the website project and the executable file project into a target project, wherein the target project is a graphical user interface application program development framework project;
The first starting module is used for starting the executable file item through the target item after starting the target item, and displaying a three-dimensional interface of the executable file item through a window component of the target item;
and the second starting module is used for starting the website project through a browser engine of the target project and displaying a two-dimensional user interface through the browser engine.
In some possible implementations of the second aspect, the apparatus further includes:
A capture module for acquiring a first user event captured by a two-dimensional user interface of a three-dimensional development tool;
The data interaction module is used for transmitting the first user event to the executable file item through the target item; carrying out business logic processing on a first user event through an executable file item to obtain a first processed event;
And the event response module is used for transferring the first processed event to the website item through the target item so as to enable the website item to perform event response.
In a third aspect, an embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the method according to any one of the first aspects when executing the computer program.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing a computer program which, when executed by a processor, implements a method as in any of the first aspects above.
In a fifth aspect, embodiments of the present application provide a computer program product, which, when run on a chip, causes the chip to perform the method of any of the first aspects above.
It will be appreciated that the advantages of the second to fifth aspects may be found in the relevant description of the first aspect, and are not described here again.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow diagram of a hybrid development method based on a three-dimensional development tool and a graphical user interface application development framework provided by an embodiment of the present application;
FIG. 2 is a schematic diagram of the structure and interaction mode of Unity provided by the embodiment of the application;
FIG. 3 is a schematic diagram of the structure and data interaction of QT provided by an embodiment of the present application;
FIG. 4 is another flow diagram of a hybrid development method based on a three-dimensional development tool and a graphical user interface application development framework provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of the interaction logic among the UnityUI part, the UnityDemo part, and QT provided by an embodiment of the present application;
FIG. 6 is a block diagram of a hybrid development device based on a three-dimensional development tool and a graphical user interface application development framework provided by an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in the present description and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "upon", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Furthermore, the terms "first," "second," "third," and the like in the description of the present specification and in the appended claims, are used for distinguishing between descriptions and not necessarily for indicating or implying a relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
Referring to fig. 1, which is a schematic flow diagram of a hybrid development method based on a three-dimensional development tool and a graphical user interface application development framework according to an embodiment of the present application, the method may include the following steps:
Step S101, acquiring project codes and separating the project codes into a first part and a second part, wherein the first part comprises codes of a two-dimensional user interface of a three-dimensional development tool, the second part comprises business logic codes of the three-dimensional development tool and codes of the three-dimensional interface, and the project codes are project codes based on the three-dimensional development tool.
Illustratively, the three-dimensional development tool may be Unity and the graphical user interface application development framework may be QT. In this case, the project code is Unity project code. In hybrid development based on Unity and QT, the entire project can be divided into a Unity part and a QT part. The Unity part is developed based on the Unity engine, so that the electronic device can obtain the Unity project code; the QT part is developed based on the QT framework, so that the electronic device can obtain the code of the QT part.
For a three-dimensional development tool, the three-dimensional development tool may include a business logic portion and a User Interface (UI) portion, and the UI portion may be divided into a 3D portion and a 2D portion. For example, taking Unity as an example, for the Unity portion, the Unity itself can be divided into a business logic portion and a UI portion, which in turn can be divided into a 2D portion and a 3D portion. Typically, the 3D portion is shown at the bottom and the event response is based on the ray response.
In the embodiment of the present application, the code of the three-dimensional development tool (for example, the Unity project code of the Unity part) is separated into a first part and a second part. The first part may include code of the 2D-related UI, may be used to display the 2D-related UI interface, and may also receive various input events (e.g., user operation events). The second part may include the business logic part and the 3D presentation part, and may be used to present the 3D interface and perform business logic processing; for example, the second part may receive an input event captured by the first part and perform business logic processing on that input event.
Illustratively, taking Unity as the three-dimensional development tool, separating the Unity part may refer to: placing the business logic and the design resources of the two scenes of the framework layer into designated directories respectively, and then packaging them separately to obtain the first part and the second part.
Step S102, compiling the first part into a website project, compiling the second part into an executable file project, and accessing the website project and the executable file project into a target project, wherein the target project is a graphical user interface application program development framework project.
The website item refers to a web item, i.e. the first part may be compiled into a web item. Further, the web item may be a WASM item.
The graphical user interface application development framework may be QT, for example, in which case the target item may be referred to as a QT item. After the first part is compiled into a WASM item and the second part is compiled into an EXE item, both are accessed into the QT item according to the respective access modes of WASM and EXE, so that the startup of the WASM part and the EXE part is controlled by the QT item.
In the embodiment of the application, when the three-dimensional development tool is Unity, the first part can be called UnityUI part, and the second part can be called UnityDemo part. For example, referring to the schematic diagram of the structure and interaction manner of Unity provided by the embodiment of the present application shown in fig. 2, for the Unity portion, the structure may be split (or separated) into a UnityUI portion and a UnityDemo portion.
The UnityUI part can be used to display and respond on the UI interface (the 2D interface) and to perform event capture. Event capture may, for example, capture user operation events, which refer to events generated by a user operating a keyboard or a mouse.
The UnityDemo part can be further divided into the 3D interface, event processing, resource processing, and business logic. The 3D interface may be used to perform the underlying 3D environment and 3D object rendering; event processing may be used to process data from the QT part; resource processing is used to invoke the required resources.
And step S103, after the target item is started, starting the executable file item through the target item, and displaying a three-dimensional interface of the executable file item through a window component of the target item.
The window component of the target item may be QWidget of the QT item, for example.
When the three-dimensional development tool is Unity, Unity has a development mode and a run mode. The development mode refers to running the Unity project under the editor. The run mode refers to separating and packaging Unity into the UnityUI part and the UnityDemo part, accessing the UnityUI part and the UnityDemo part into a QT project, and starting the QT project to control the UnityUI part and the UnityDemo part to start running. After the QT item is launched, the QT item may launch the exe item through the command line. QWidget of the QT item is the parent class of all window component classes in QT and is an abstraction of window components; each window component is a QWidget. It can draw itself and process user input, and is often used as a parent or top-level component.
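Illustratively, a minimal sketch of this launching and embedding step is given below. The embodiment does not prescribe a specific embedding mechanism; launching "UnityDemo.exe", locating its window by title via FindWindowW, and wrapping it with QWindow::fromWinId / QWidget::createWindowContainer are assumptions made for illustration only.
```cpp
#include <QApplication>
#include <QProcess>
#include <QThread>
#include <QVBoxLayout>
#include <QWidget>
#include <QWindow>
#include <windows.h>   // FindWindowW, assuming a Windows build

// Sketch only: the QT item launches the exe item (UnityDemo) and hosts its
// native window inside a QWidget, so that the QT item controls geometry and
// stacking order. The exe name and window title are hypothetical.
static QWidget *embedUnityDemo(QWidget *parent)
{
    auto *process = new QProcess(parent);
    process->start(QStringLiteral("UnityDemo.exe"), QStringList());
    process->waitForStarted();

    // Poll briefly until the exe item has created its top-level window.
    HWND demoWindow = nullptr;
    for (int i = 0; i < 100 && !demoWindow; ++i) {
        demoWindow = FindWindowW(nullptr, L"UnityDemo");
        QThread::msleep(50);
    }

    // Wrap the foreign window so it behaves like an ordinary child widget.
    QWindow *foreign = QWindow::fromWinId(reinterpret_cast<WId>(demoWindow));
    QWidget *container = QWidget::createWindowContainer(foreign, parent);

    auto *layout = new QVBoxLayout(parent);
    layout->addWidget(container);
    return container;
}

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    QWidget mainWindow;            // window of the target (QT) item
    embedUnityDemo(&mainWindow);
    mainWindow.resize(1280, 720);
    mainWindow.show();
    return app.exec();
}
```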
Step S104, starting the website project through a browser engine of the target project, and displaying a two-dimensional user interface through the browser engine.
Illustratively, the browser engine of the target item may be QWebEngineView of the QT item. In this case, the WASM item may be started by QWebEngineView of the QT item, and the Unity two-dimensional user interface may be displayed by QWebEngineView.
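Illustratively, a minimal sketch of starting and displaying the WASM item through QWebEngineView is given below; the local path of the compiled UnityUI build is a hypothetical example, and in practice a WASM build may need to be served over HTTP.
```cpp
#include <QApplication>
#include <QUrl>
#include <QWebEngineView>

// Sketch only: display the two-dimensional user interface (the WASM item)
// inside the QT item via QWebEngineView. The path below is an assumption.
int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QWebEngineView view;
    view.load(QUrl::fromLocalFile(QStringLiteral("/opt/app/unity_ui/index.html")));
    view.resize(1280, 720);
    view.show();

    return app.exec();
}
```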
WASM (WebAssembly) is a portable, compact and fast-loading binary format that is compatible with the Web. A WASM item refers to an item developed using WebAssembly technology. Note that the WASM item can be regarded as a Web application, while the exe item can be regarded as an exe application.
QWebEngineView of QT is a QWidget that can embed Web content in a QT application. It is a browser engine added in QT 5.4 for editing and viewing web content.
From the above, it can be seen that the embodiment of the present application separates the project code of the three-dimensional development tool into a first part and a second part; the first part and the second part are accessed into a target project, and the target project controls the starting and displaying of the first part and the second part, so that the two-dimensional user interface of the three-dimensional development tool is displayed on the browser engine of the target project and the three-dimensional interface of the three-dimensional development tool is displayed on the window component of the target project. In this way, interface display and interface layering are controlled by the business logic of a single application, so that embedded display of the graphical user interface application development framework interface, the three-dimensional interface of the three-dimensional development tool and the two-dimensional user interface of the three-dimensional development tool can be realized, and the three-dimensional development tool interface and the graphical user interface application development framework interface can fit together seamlessly.
Illustratively, when the three-dimensional development tool is Unity and the graphical user interface application development framework is QT, the embodiment of the present application separates Unity into the UnityUI part and the UnityDemo part, i.e., separates the Unity interface; the UnityUI part and the UnityDemo part are accessed into the QT project, and the QT project controls the starting and displaying of the UnityUI part and the UnityDemo part, so that the Unity two-dimensional user interface is displayed on QWebEngineView of the QT project and the Unity three-dimensional interface is displayed on QWidget of the QT project. In this way, by having the business logic of the QT project control interface display and interface layering, embedded display of the QT interface, the Unity three-dimensional interface and the Unity two-dimensional user interface can be realized, and the Unity interface and the QT interface can fit together seamlessly.
In addition, by separating Unity into the UnityUI part and the UnityDemo part, simple UIs with low performance requirements (e.g., 2D interface UIs) are displayed by the UnityUI part, while complex models and performance-intensive 3D scenes are displayed by the UnityDemo part, so that both the interface display and the performance stability can be ensured. Specifically, the Web incurs a larger performance loss and delivers lower performance when performing 3D rendering; the embodiment of the present application performs 3D interface display through the UnityDemo part (i.e., the exe part), so the display effect is better than displaying on QT or the Web, and the performance consumption is lower.
As shown in fig. 2, in the development mode, the UnityUI part and the UnityDemo part may interact with each other directly; in the run mode, the UnityUI part is packaged into a web application (e.g., a WASM project) for QT to use via the Web, and the UnityDemo part is packaged into an exe application for QT to start by invoking a process. The Web application and the exe application perform data interaction with QT as the medium. Specifically, the web application and the QT item interact through JS (JavaScript) events, the exe application and the QT item interact through the stack machine in the shared memory, and the web application and the exe application do not interact directly.
Specifically, in the development mode, the UnityUI part and the UnityDemo part belong to the same project but to different scenes (Scenes) and different directory structures, and in the development mode the UnityUI part and the UnityDemo part interact directly through test code. Specifically, the two scenes can be copied under the same launcher scene in the Unity editor, and the mutual calls between the UnityUI part and the UnityDemo part are then performed through the code of the launcher, so as to realize the data interaction between the UnityDemo part and the UnityUI part.
In the run mode, bridging communication between the UnityUI part and the UnityDemo part may be performed via the QT item. For example, referring to the schematic diagram of the structure and data interaction of QT provided by the embodiment of the present application shown in fig. 3, the QT part may include its own logic part and a part that needs to interact with Unity. For the interface display of the QT part, QWidget of QT shows the interface of the UnityDemo part (i.e., the 3D interface), and the interface of the UnityUI part (i.e., the 2D interface) is displayed by QWebEngineView of QT.
For the data interaction of the QT part, after the QT item starts the UnityDemo part through the command line, the QT item performs cross-process data interaction with the UnityDemo part through the stack machine under the shared memory. The shared memory is a memory shared by the QT item and the UnityDemo part.
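Illustratively, a minimal sketch of such a shared-memory channel is given below, assuming the stack machine is modeled as a single message slot protected by QSharedMemory's lock; the key name, the fixed slot size and the message layout are assumptions, and a real implementation would add a ready flag or a semaphore for synchronization.
```cpp
#include <QByteArray>
#include <QSharedMemory>
#include <cstring>

// Sketch only: write a serialized event (type + parameters) into the shared
// memory for the exe item to read, and read the reply back. "demo_channel"
// and the fixed 4 KiB slot are assumptions.
class SharedEventChannel
{
public:
    SharedEventChannel() : m_memory(QStringLiteral("demo_channel"))
    {
        if (!m_memory.create(4096))
            m_memory.attach();   // already created by the other process
    }

    bool push(const QByteArray &event)
    {
        if (event.size() + int(sizeof(qint32)) > m_memory.size())
            return false;
        m_memory.lock();
        const qint32 len = event.size();
        std::memcpy(m_memory.data(), &len, sizeof(len));
        std::memcpy(static_cast<char *>(m_memory.data()) + sizeof(len),
                    event.constData(), size_t(len));
        m_memory.unlock();
        return true;
    }

    QByteArray pop()
    {
        m_memory.lock();
        qint32 len = 0;
        std::memcpy(&len, m_memory.constData(), sizeof(len));
        QByteArray event(static_cast<const char *>(m_memory.constData()) + sizeof(len), len);
        m_memory.unlock();
        return event;
    }

private:
    QSharedMemory m_memory;
};
```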
After the QT item starts the UnityUI part by starting QWebEngineView, the QT item performs data interaction with the JS part of UnityUI through QWebChannel, and the JS part of UnityUI in turn performs data interaction with the C# part through WASM, so that communication between the QT item and the UnityUI part is realized.
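Illustratively, a minimal QT-side sketch of this QWebChannel bridge is given below; the bridge object name, slot and signal names are assumptions, and the JS part of UnityUI would attach to the registered object via `new QWebChannel(qt.webChannelTransport, ...)`.
```cpp
#include <QObject>
#include <QString>
#include <QWebChannel>
#include <QWebEnginePage>
#include <QWebEngineView>

// Sketch only: expose a bridge object to the web side so that UnityJS can
// send UI-QT events to the QT item and receive QT-UI events back.
class UnityBridge : public QObject
{
    Q_OBJECT
public:
    // Called by QT business logic to push a QT-UI event to the page.
    void pushQtUiEvent(const QString &eventJson) { emit qtUiEvent(eventJson); }

public slots:
    // Called from UnityJS with a UI-QT event serialized as JSON text.
    void onUiEvent(const QString &eventJson) { emit uiEventReceived(eventJson); }

signals:
    void uiEventReceived(const QString &eventJson); // consumed by QT business logic
    void qtUiEvent(const QString &eventJson);       // received by UnityJS in the page
};

void installBridge(QWebEngineView *view, UnityBridge *bridge)
{
    auto *channel = new QWebChannel(view);
    channel->registerObject(QStringLiteral("unityBridge"), bridge);
    view->page()->setWebChannel(channel);
}
```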
The business logic of the QT part may include the UI interface, event capture, event processing, resource processing, and the like. The UI interface refers to the interface of QT itself; event capture refers to QT capturing user events (e.g., keyboard operation events, mouse operation events, etc.); event processing refers to processing received events; resource processing refers to invoking the required resources.
In the run mode, after the QT item starts the WASM item and the exe item, the QT item, the WASM item and the exe item can call each other according to certain interaction logic. The interaction logic among the three is described below.
Referring to fig. 4, which is another schematic flow diagram of a hybrid development method based on a three-dimensional development tool and a graphical user interface application development framework according to an embodiment of the present application, the method may include the following steps:
Step S401, acquiring project codes, and separating the project codes into a first part and a second part, wherein the first part comprises codes of a two-dimensional user interface of a three-dimensional development tool, the second part comprises business logic codes of the three-dimensional development tool and codes of the three-dimensional interface, and the project codes are project codes based on the three-dimensional development tool.
Step S402, compiling the first part into a website project, compiling the second part into an executable file project, and accessing the website project and the executable file project into a target project, wherein the target project is a graphical user interface application program development framework project.
And step S403, after the target item is started, starting the executable file item through the target item, and displaying a three-dimensional interface of the executable file item through a window component of the target item.
Step S404, starting the website project through a browser engine of the target project, and displaying a two-dimensional user interface through the browser engine.
After the target item controls the website item and the executable file item to start and display their interfaces, the target item, the website item and the executable file item can perform data interaction so as to realize mutual calls among the three parts of code.
In the interaction process among the target item, the website item and the executable file item, user events are captured by the first part of the three-dimensional development tool and by the target item, and are transmitted to the second part through the bridging of the target item; the second part performs business logic processing on the user event, and the processed result is then transferred to the first part through the target item so that the first part is called for display and response.
Illustratively, after the QT item controls the WASM item and the exe item to start and display their interfaces, the QT item, the WASM item and the exe item can perform data interaction so as to realize mutual calls among the three parts of code. In the interaction process among the QT item, the WASM item and the exe item, user events are captured by the UnityUI part and by the QT item, and are transmitted to the UnityDemo part through the bridging of the QT item; the UnityDemo part performs business logic processing on the user event, and the processed result is then transferred to the UnityUI part through the QT item so that the UnityUI part is called for display and response.
Step S405, a first user event captured by a two-dimensional user interface of a three-dimensional development tool is acquired.
When the three-dimensional development tool is Unity, user events may be captured through a two-dimensional user interface of Unity.
The first user event is a user operation event, which may include, for example, a mouse operation event and a keyboard operation event. For example, when a user operates the software, user events such as mouse or keyboard events are captured by the UI of the UnityUI part.
It is worth noting that in the related art, UnityApp.exe and QTApp.exe are two different applications, which causes great trouble in focus handling. In the embodiment of the present application, Unity events are captured through the Unity two-dimensional user interface, so that the focus problem among multiple processes can be avoided.
Step S406, the first user event is transmitted to the executable file item through the target item.
Optionally, after capturing the first user event, the website item may first determine whether the first user event needs to be processed by itself or needs to be transferred to the target item or the executable file item for processing; if the website item determines that the first user event needs to be transferred to the target item or the executable file item, the first user event is converted into a first event and the first event is sent to the script of the three-dimensional development tool, and the script of the three-dimensional development tool transfers the first event to the target item through a communication channel. If the website item determines that the first user event is handled by itself, there is no need to forward the first user event to the target item or the executable file item. The communication channel is a channel between the target item and the website item.
Illustratively, the script of the three-dimensional development tool may be UnityJS. The communication channel may be QWebChannel, which may be a channel of WASM item and QT item.
It should be noted that the embodiment of the present application may, for example, use five event types: a user operation event, a UI-QT event, a QT-DEMO event, a DEMO-QT event, and a QT-UI event. The UI-QT event, the QT-DEMO event, the DEMO-QT event and the QT-UI event are predefined event types. The first event may be a UI-QT event, and the WASM item may send the custom UI-QT event, parameters, etc. to UnityJS.
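Illustratively, these five event types can be modeled as a small enumeration together with a minimal payload structure; the field names are assumptions.
```cpp
#include <QString>
#include <QVariant>

// Sketch only: the five event types named in this embodiment, plus a
// minimal payload structure. Field names are assumptions.
enum class EventType {
    UserOperation, // raw keyboard / mouse event captured by UnityUI or QT
    UiToQt,        // UI-QT: forwarded from the WASM item to the QT item
    QtToDemo,      // QT-DEMO: forwarded from the QT item to the exe item
    DemoToQt,      // DEMO-QT: processed result returned by the exe item
    QtToUi         // QT-UI: result relayed from the QT item to the WASM item
};

struct Event {
    EventType   type;
    QString     name;    // e.g. "mousePress"
    QVariantMap params;  // event parameters
};
```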
After receiving the first user event captured by the WASM item, the QT item may also make a business logic judgment according to its own business logic, so as to determine whether the event needs to be transmitted to the exe item for processing. If the QT item determines that it processes the event itself, there is no need to transmit it to the exe item; if the QT item determines that the event needs to be transferred to the exe item, the event is transferred to the exe item.
Optionally, the target item and the executable file item interact with each other through the shared memory. In this case, if it is determined that the first event needs to be transferred to the executable file item through the target item, the target item converts the first event into a second event and then sends the second event to the shared memory, where the shared memory is a memory shared by the target item and the executable file item. The second event may, for example, be a QT-DEMO event.
Step S407, after performing business logic processing on the first user event through the executable file item, obtaining a first processed event.
Because the executable file item and the target item can perform data interaction through the shared memory, after the target item sends an event to the shared memory, the executable file item can acquire the event from the shared memory through the stack machine, and a series of business logic processes are performed on the event to obtain the processed event. And finally, the executable file item sends the processed event to the shared memory so as to transmit the event processing result to the target item.
Optionally, the stack machine of the executable file item reads the second event (i.e. reads related information such as functions and parameters) from the shared memory, and the executable file item performs service logic processing on the second event to obtain the first processed event, and the executable file item sends the first processed event to the shared memory. The first post-processing event may be, for example, a DEMO-QT event.
Step S408, the first processed event is transferred to the website item through the target item, so that the website item responds to the event.
For example, when the target item is the QT item and the website item is the WASM item, after the exe item performs business logic processing on the event, it may determine whether the QT item and the WASM item need to be called; if the QT item or the WASM item needs to be called, the event processing result is sent to the shared memory. The QT item can acquire the event processing result from the shared memory through the stack machine, and then judge, through its own business logic, whether the event processing result needs to be transmitted to the WASM item; if so, the result is forwarded to UnityJS through QWebChannel, and UnityJS then forwards it through WASM to Unity C#, handing it over to the WASM item (i.e., the UnityUI part) for event response.
Optionally, reading the first processed event from the shared memory through a stack machine of the target item, determining that the executable file item needs to call service logic of the website item by the target item, converting the first processed event into a third event, and forwarding the third event to a script of the three-dimensional development tool through a communication channel by the target item; and forwarding the third event to the website project through the script of the three-dimensional development tool. The third event may be, for example, a QT-UI event.
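Illustratively, a minimal sketch of this relay step is given below, reusing the hypothetical SharedEventChannel and UnityBridge types from the earlier sketches; the JSON wrapping and the business-logic check are placeholders.
```cpp
#include <QByteArray>
#include <QString>
// Reuses the hypothetical SharedEventChannel and UnityBridge types from the
// sketches above.

// Placeholder for the QT item's own business-logic check on whether the
// exe item's result must be relayed to the website item.
static bool needsWebsiteCallback(const QByteArray &) { return true; }

// Sketch only: read the processed DEMO-QT event from the shared memory,
// convert it into a QT-UI event, and push it to UnityJS over QWebChannel
// so that the WASM item (the UnityUI part) can make the event response.
void relayProcessedEvent(SharedEventChannel &channel, UnityBridge &bridge)
{
    const QByteArray demoQtEvent = channel.pop();   // first processed event
    if (!needsWebsiteCallback(demoQtEvent))
        return;                                     // QT handles it itself

    const QString qtUiEvent =
        QStringLiteral("{\"type\":\"QT-UI\",\"payload\":%1}")
            .arg(QString::fromUtf8(demoQtEvent));
    bridge.pushQtUiEvent(qtUiEvent);                // received by UnityJS
}
```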
It is worth pointing out that, when the three-dimensional development tool is Unity, the target item is the QT item, and the website item is the WASM item, if the QT item calls the exe item directly, the exe item can only be called through a process, so that the QT interface and the Unity interface cannot realize embedded display; this also increases the development difficulty and the cost of direct joint-debugging communication between QT and Unity. In the embodiment of the present application, Unity is separated into the UnityUI part and the UnityDemo part, so that not only can embedded display of the QT interface, the Unity two-dimensional user interface and the Unity three-dimensional interface be realized, but also bridging communication between the UnityUI part and the UnityDemo part can be realized through QT during development and operation; no separate joint debugging with the QT part is needed, and QT only needs to relay, which reduces the communication cost and the development difficulty.
The foregoing exemplarily describes the calling process among the first part, the second part and the target item after a user event is captured by the two-dimensional user interface of the three-dimensional development tool. The following continues with the interaction process among the first part, the second part and the target item after a user event is captured by the target item.
After the executable file item and the website item are started, if a second user event captured by the target item is obtained, the target item can transmit the second user event to the executable file item, then the executable file item carries out business logic processing on the second user event to obtain a second processed event, and finally the second processed event is transferred to the website item through the target item so as to enable the website item to carry out event response. The second user event is a user operation event, which may include, for example, a keyboard operation event, a mouse operation event, and the like.
It will be appreciated that data interactions may be performed between the target item and the executable file item via a shared memory, while data interactions may be performed between the target item and the website item via a communication channel (e.g., QWebChannel) and a script of a three-dimensional development tool (e.g., unityJS).
Based on the above, after the target item acquires the second user event, it can judge, based on its own business logic, whether the event is processed by the target item itself or handed over to the executable file item for processing; if it is determined that the event needs to be handed over to the executable file item, the second user event can be converted into a fourth event, and the target item then sends the fourth event to the shared memory. The shared memory is a memory shared by the target item and the executable file item. The fourth event may, for example, be a QT-DEMO event; in this case the target item sends the QT-DEMO event type, parameters, and the like to the shared memory.
After the target item sends the event to the shared memory, the executable file item can read the fourth event from the shared memory through the stack machine, the executable file item carries out business logic processing on the fourth event to obtain a second processed event, and the executable file item sends the second processed event to the shared memory. Wherein the second post-processing event may be, for example, a DEMO-QT event. The executable file items can read functions, parameters and the like from the shared memory, and then carry out a series of business logic processing on the event to obtain the processed event; judging whether a target item and a website item need to be called or not; and if the target item or the website item needs to be called, sending an event processing result to the shared memory.
The target item can acquire the event processing result from the shared memory through the stack machine, and then judge, through its own business logic, whether the event processing result needs to be transmitted to the website item; if so, the event processing result is forwarded to the script of the three-dimensional development tool through the communication channel, and the script of the three-dimensional development tool then forwards it to the website item (for example, the UnityUI part) for event response.
That is, after the executable file item sends the second processed event to the shared memory, the target item reads the second processed event from the shared memory through the stack machine; if the target item determines that the executable file item needs to call the business logic of the website item, the second processed event is converted into a fifth event, and the target item forwards the fifth event to the script of the three-dimensional development tool through the communication channel; the script of the three-dimensional development tool forwards the fifth event to the website item; and the website item makes the event response. The fifth event may, for example, be a QT-UI event.
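Illustratively, a minimal sketch of capturing the second user event on the QT side is given below, assuming an application-wide event filter; the dispatch function is a placeholder for the business-logic judgment and the write into the shared memory.
```cpp
#include <QApplication>
#include <QEvent>
#include <QObject>

// Sketch only: capture the second user event (keyboard or mouse) at the QT
// item and hand it to the dispatch step described above. dispatchToDemo()
// is a placeholder for converting the event into a QT-DEMO event and writing
// it into the shared memory when the QT item's business logic requires it.
static void dispatchToDemo(QEvent *) { /* business-logic check + shared-memory write */ }

class QtEventCapture : public QObject
{
protected:
    bool eventFilter(QObject *watched, QEvent *event) override
    {
        if (event->type() == QEvent::KeyPress ||
            event->type() == QEvent::MouseButtonPress) {
            dispatchToDemo(event);   // second user event enters the pipeline
        }
        return QObject::eventFilter(watched, event);   // do not swallow the event
    }
};

// Usage inside the QT item's setup code:
//   qApp->installEventFilter(new QtEventCapture);
```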
Illustratively, the three-dimensional development tool may be Unity, the target item may be the QT item, the website item may be the WASM item, the communication channel may be QWebChannel, and the script of the three-dimensional development tool may be UnityJS. The window component may be QWidget and the browser engine may be QWebEngineView. In this case, the first part may be the UnityUI part and the second part may be the UnityDemo part. To better understand the interaction logic among the UnityUI, UnityDemo and QT parts of the code, the following description is given in conjunction with the schematic diagram of the interaction logic among UnityUI, UnityDemo and QT provided by the embodiment of the present application shown in fig. 5.
As shown in the left-hand flow in fig. 5, QT starts the UnityDemo part through the command line and displays the interface of the UnityDemo part (i.e., the 3D interface) through QWidget; the interface of UnityUI (i.e., the 2D interface) is started and displayed through QWebEngineView.
After the UnityUI part and the UnityDemo part are started, user events may be captured via UnityUI and QT, and data interaction may be performed according to the right-hand flow in fig. 5.
The interaction logic following UnityUI capture of the user event may be as follows:
After UnityUI captures the user event, the user event is handed to the logic of the UnityUI part for processing. The UnityUI part includes its own business logic, through which it judges whether the UnityUI part processes the user event by itself or needs to transfer it to the QT part or the UnityDemo part for processing. When it is determined that the event needs to be transferred to the QT part or the UnityDemo part for processing, the UI-QT event type, parameters, etc. are sent to UnityJS according to the user event.
After the UnityUI part sends the event to UnityJS, UnityJS relays the event to QT through QWebChannel.
After QT receives the event, it sends the data to the shared memory after a business logic judgment. Through its business logic, QT judges whether the event sent by the UnityUI part is handled by QT itself or handed to the UnityDemo part for processing; if the event is handled by QT, QT processes the event accordingly; if it is handed to the UnityDemo part for processing, QT converts the event into the QT-DEMO event type and parameters, and then sends the related data to the shared memory.
The UnityDemo part reads related data such as functions and parameters from the shared memory through its stack machine, processes the event through its own business logic, and sends the event to the shared memory after the event processing is finished. After processing the event, the UnityDemo part can determine whether the business logic of the QT part and the UnityUI part needs to be called, and if so, the event can be sent to the shared memory. The processed event is a DEMO-QT event.
QT acquires the event processing result of the UnityDemo part from the shared memory through its own stack machine, and then judges, through its own business logic processing, whether the UnityDemo part needs to call the business logic of the UnityUI part. If a call is required, the result is forwarded to UnityJS through QWebChannel, and UnityJS then forwards it through WASM to Unity C# for UnityUI to make the event response.
Thus, through the interaction logic of QT, the UnityDemo part and the UnityUI part, the following process can be realized: the UnityUI part captures the user event, the event information and the like are transmitted to the UnityDemo part through QT, the UnityDemo part performs business logic processing according to the event information, and the processing result is then transferred to the UnityUI part through QT.
As shown in fig. 5, the interaction logic after QT captures a user event may be as follows:
After QT captures the user event, it sends the data to the shared memory after judging the user event through its own business logic. Through its business logic, QT judges whether the event is handled by QT itself or handed to the UnityDemo part for processing; if the event is handled by QT, QT processes the event accordingly; if it is handed to the UnityDemo part for processing, QT converts the event into the QT-DEMO event type and parameters, and then sends the related data to the shared memory.
The UnityDemo part reads related data such as functions and parameters from the shared memory through its stack machine, processes the event through its own business logic, and sends the event to the shared memory after the event processing is finished. After processing the event, the UnityDemo part can determine whether the business logic of the QT part and the UnityUI part needs to be called, and if so, the event can be sent to the shared memory. The processed event is a DEMO-QT event.
QT acquires the event processing result of the UnityDemo part from the shared memory through its own stack machine, and then judges, through its own business logic processing, whether the UnityDemo part needs to call the business logic of the UnityUI part. If a call is required, the result is forwarded to UnityJS through QWebChannel, and UnityJS then forwards it through WASM to Unity C# for UnityUI to make the event response.
Thus, through the interaction logic of QT, the UnityDemo part and the UnityUI part, the following process can be realized: QT captures the user event, the event information and the like are transmitted to the UnityDemo part through QT, the UnityDemo part performs business logic processing according to the event information, and the processing result is then transferred to the UnityUI part through QT.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Corresponding to the hybrid development method based on the three-dimensional development tool and the graphical user interface application development framework described in the above embodiments, fig. 6 shows a block diagram of a hybrid development apparatus based on a three-dimensional development tool and a graphical user interface application development framework provided by an embodiment of the present application; for convenience of explanation, only the portions related to the embodiment of the present application are shown.
Referring to fig. 6, the apparatus includes:
A code separation module 61, configured to obtain a project code, and separate the project code into a first portion and a second portion, where the first portion includes a code of a two-dimensional user interface of a three-dimensional development tool, and the second portion includes a business logic code of the three-dimensional development tool and a code of the three-dimensional interface, and the project code is a project code based on the three-dimensional development tool;
A compiling access module 62, configured to compile the first portion into a website item, compile the second portion into an executable file item, and access the website item and the executable file item into a target item, where the target item is a graphical user interface application development framework item;
the first starting module 63 is configured to start the executable file item through the target item after starting the target item, and display a three-dimensional interface of the executable file item through a window component of the target item;
The second starting module 64 is configured to start the website item through the browser engine of the target item, and display a two-dimensional user interface through the browser engine.
In some possible implementations, the apparatus further includes:
A capture module for acquiring a first user event captured by a two-dimensional user interface of a three-dimensional development tool;
The data interaction module is used for transmitting the first user event to the executable file item through the target item; carrying out business logic processing on a first user event through an executable file item to obtain a first processed event;
And the event response module is used for transferring the first processed event to the website item through the target item so as to enable the website item to perform event response.
In some possible implementations,
The data interaction module is specifically configured to: if it is determined through the website item that the first user event needs to be transferred to the target item or the executable file item, convert the first user event into a first event and then send the first event to the script of the three-dimensional development tool, where the script of the three-dimensional development tool transfers the first event to the target item through a communication channel, and the communication channel is a channel between the target item and the website item; if it is determined through the target item that the first event needs to be transferred to the executable file item, convert the first event into a second event and then send, by the target item, the second event to a shared memory, where the shared memory is a memory shared by the target item and the executable file item; and read the second event from the shared memory through a stack machine of the executable file item, perform business logic processing on the second event by the executable file item to obtain the first processed event, and send the first processed event to the shared memory by the executable file item;
The event response module is specifically configured to: read the first processed event from the shared memory through a stack machine of the target item; if the target item determines that the executable file item needs to call the business logic of the website item, convert the first processed event into a third event, and forward, by the target item, the third event to the script of the three-dimensional development tool through the communication channel; and forward the third event to the website item through the script of the three-dimensional development tool.
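The shared-memory hand-off performed by the data interaction module and the event response module could, on the QT side, be sketched as follows; this is a simplified assumption (a single fixed-size, null-terminated text payload and no signalling beyond QSharedMemory's built-in lock), and the key name "hybrid-dev-events" is hypothetical.

// Sketch of the shared-memory hand-off on the QT side; the segment key,
// size and payload layout are assumptions for illustration.
#include <QByteArray>
#include <QSharedMemory>
#include <cstring>

static const int kSegmentSize = 4096;

// Write a converted event so that the executable file item can read it and
// perform business logic processing.
bool writeEventToSharedMemory(QSharedMemory &shm, const QByteArray &eventData) {
    if (!shm.isAttached() && !shm.create(kSegmentSize) && !shm.attach())
        return false;                              // segment unavailable
    if (eventData.size() >= kSegmentSize)
        return false;                              // payload too large
    shm.lock();
    std::memset(shm.data(), 0, kSegmentSize);
    std::memcpy(shm.data(), eventData.constData(), eventData.size());
    shm.unlock();
    return true;
}

// Read the processed event written back to the shared memory by the executable file item.
QByteArray readProcessedEvent(QSharedMemory &shm) {
    if (!shm.isAttached() && !shm.attach())
        return QByteArray();
    shm.lock();
    QByteArray result(static_cast<const char *>(shm.constData()));
    shm.unlock();
    return result;
}

// Usage (key is hypothetical): QSharedMemory shm(QStringLiteral("hybrid-dev-events"));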
In some possible implementations, the apparatus further includes:
The event capturing module is used for acquiring a second user event captured by the target item;
The transmission module is used for transmitting the second user event to the executable file item; carrying out business logic processing on a second user event through the executable file item to obtain a second processed event;
And the response module is used for transferring the second processed event to the website item through the target item so as to enable the website item to respond to the event.
In some possible implementations,
The transmission module is specifically configured to: if the target item determines that the second user event needs to be transferred to the executable file item, convert the second user event into a fourth event and then send, by the target item, the fourth event to a shared memory, where the shared memory is a memory shared by the target item and the executable file item; and read the fourth event from the shared memory through a stack machine of the executable file item, perform business logic processing on the fourth event by the executable file item to obtain the second processed event, and send the second processed event to the shared memory by the executable file item;
The response module is specifically configured to: read the second processed event from the shared memory through a stack machine of the target item; if the target item determines that the executable file item needs to call the business logic of the website item, convert the second processed event into a fifth event, and forward, by the target item, the fifth event to the script of the three-dimensional development tool through the communication channel; and forward the fifth event to the website item through the script of the three-dimensional development tool.
In some possible implementations, the first user event or the second user event is a keyboard operation event or a mouse operation event.
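Since the first user event or the second user event is a keyboard operation event or a mouse operation event, the event capturing could be realized on the QT side with an application-wide event filter such as the sketch below; the EventCapture class, its serialization format and the forwarding callback are assumptions for illustration only.

// Sketch of capturing keyboard and mouse operation events in the target item;
// the class name, payload format and callback are assumed for illustration.
#include <QByteArray>
#include <QEvent>
#include <QKeyEvent>
#include <QMouseEvent>
#include <QObject>
#include <functional>
#include <utility>

class EventCapture : public QObject {
public:
    using Forward = std::function<void(const QByteArray &)>;

    explicit EventCapture(Forward forward, QObject *parent = nullptr)
        : QObject(parent), m_forward(std::move(forward)) {}

protected:
    bool eventFilter(QObject *watched, QEvent *event) override {
        if (event->type() == QEvent::KeyPress) {
            auto *key = static_cast<QKeyEvent *>(event);
            m_forward("key:" + QByteArray::number(key->key()));
        } else if (event->type() == QEvent::MouseButtonPress) {
            auto *mouse = static_cast<QMouseEvent *>(event);
            m_forward("mouse:" + QByteArray::number(int(mouse->button())));
        }
        return QObject::eventFilter(watched, event);   // do not swallow the event
    }

private:
    Forward m_forward;
};

// Usage: qApp->installEventFilter(new EventCapture(writeCallback, qApp));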
It should be noted that, because the information interaction and execution processes between the above devices/units are based on the same concept as the method embodiments of the present application, for their specific functions and technical effects, reference may be made to the method embodiment section, and details are not described herein again.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 7, the electronic apparatus of this embodiment includes: at least one processor 70 (only one is shown in fig. 7), a memory 71 and a computer program 72 stored in the memory 71 and executable on the at least one processor 70, the processor 70 implementing the steps in any of the method embodiments described above when executing the computer program 72.
The electronic device may include, but is not limited to, a processor 70 and a memory 71. It will be appreciated by those skilled in the art that fig. 7 is merely an example of the electronic device 7 and does not constitute a limitation on the electronic device 7; the electronic device may include more or fewer components than shown, combine certain components, or have different components, and may further include, for example, input/output devices, network access devices, etc.
The processor 70 may be a central processing unit (Central Processing Unit, CPU), and the processor 70 may also be another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 71 may, in some embodiments, be an internal storage unit of the electronic device 7, such as a hard disk or a memory of the electronic device 7. In other embodiments, the memory 71 may also be an external storage device of the electronic device 7, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, or a flash card (Flash Card) equipped on the electronic device 7. Further, the memory 71 may also include both an internal storage unit and an external storage device of the electronic device 7. The memory 71 is used for storing an operating system, application programs, a boot loader (BootLoader), data, and other programs, such as the program code of the computer program. The memory 71 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
Embodiments of the present application also provide a computer readable storage medium storing a computer program which, when executed by a processor, performs the steps of the respective method embodiments described above.
Embodiments of the present application provide a computer program product which, when run on a chip, causes the chip to perform the steps of the method embodiments described above.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the methods in the above embodiments of the present application may be implemented by instructing related hardware through a computer program. The computer program may be stored in a computer readable storage medium, and when executed by a processor, the computer program may implement the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include at least: any entity or device capable of carrying the computer program code to an apparatus/electronic device, a recording medium, a computer memory, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example, a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, according to legislation and patent practice, computer readable media may not include electrical carrier signals and telecommunications signals.
In the foregoing embodiments, the description of each embodiment has its own emphasis. For parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/electronic device and method may be implemented in other manners. For example, the apparatus/electronic device embodiments described above are merely illustrative; for instance, the division of the modules or units is merely a logical function division, and there may be other divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and replacements do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. A hybrid development method, the method comprising:
Acquiring a project code, and separating the project code into a first part and a second part, wherein the first part comprises code of a two-dimensional user interface of a three-dimensional development tool, the second part comprises business logic code of the three-dimensional development tool and code of the three-dimensional interface, and the project code is project code based on the three-dimensional development tool;
Compiling the first part into a website project, compiling the second part into an executable file project, and accessing the website project and the executable file project into a target project, wherein the target project is a graphical user interface application development framework project;
after the target item is started, starting the executable file item through the target item, and displaying the three-dimensional interface of the executable file item through a window component of the target item;
and starting the website item through a browser engine of the target item, and displaying the two-dimensional user interface through the browser engine.
2. The method of claim 1, wherein after launching the executable file item and the website item, the method further comprises:
obtaining a first user event captured by a two-dimensional user interface of the three-dimensional development tool;
transmitting the first user event to the executable file item via the target item;
carrying out business logic processing on the first user event through the executable file item to obtain a first processed event;
and transferring the first processed event to the website item through the target item so as to enable the website item to respond to the event.
3. The method of claim 2, wherein the transmitting the first user event to the executable file item through the target item, performing business logic processing on the first user event through the executable file item to obtain the first processed event, and transferring the first processed event to the website item through the target item comprises:
If it is determined through the website item that the first user event needs to be transferred to the target item or the executable file item, converting the first user event into a first event and then sending the first event to a script of the three-dimensional development tool, and transferring the first event to the target item through a communication channel by the script of the three-dimensional development tool, wherein the communication channel is a channel between the target item and the website item;
If it is determined through the target item that the first event needs to be transferred to the executable file item, converting the first event into a second event and then sending, by the target item, the second event to a shared memory, wherein the shared memory is a memory shared by the target item and the executable file item;
Reading the second event from the shared memory through a stack machine of the executable file item, carrying out business logic processing on the second event by the executable file item to obtain the first processed event, and sending the first processed event to the shared memory by the executable file item;
Reading the first processed event from the shared memory through a stack machine of the target item, determining, by the target item, that the executable file item needs to call the business logic of the website item, converting the first processed event into a third event, and forwarding the third event to a script of the three-dimensional development tool through the communication channel by the target item;
forwarding the third event to the website project through the script of the three-dimensional development tool.
4. A method according to any one of claims 1 to 3, wherein after launching the executable file item and the website item, the method further comprises:
acquiring a second user event captured by the target item;
Transmitting the second user event to the executable file item;
carrying out business logic processing on the second user event through the executable file item to obtain a second processed event;
And transferring the second processed event to the website item through the target item so as to enable the website item to respond to the event.
5. The method of claim 4, wherein the transmitting the second user event to the executable file item, performing business logic processing on the second user event through the executable file item to obtain the second processed event, and transferring the second processed event to the website item through the target item comprises:
If it is determined through the target item that the second user event needs to be transferred to the executable file item, converting the second user event into a fourth event and then sending, by the target item, the fourth event to a shared memory, wherein the shared memory is a memory shared by the target item and the executable file item;
Reading the fourth event from the shared memory through a stack machine of the executable file item, carrying out business logic processing on the fourth event by the executable file item to obtain the second processed event, and sending the second processed event to the shared memory by the executable file item;
Reading the second processed event from the shared memory through a stack machine of the target item, determining, by the target item, that the executable file item needs to call the business logic of the website item, converting the second processed event into a fifth event, and forwarding the fifth event to a script of the three-dimensional development tool through the communication channel by the target item;
forwarding the fifth event to the website project through the script of the three-dimensional development tool.
6. The method of claim 4, wherein the first user event or the second user event is a keyboard operation event or a mouse operation event.
7. A hybrid development device, comprising:
the code separation module is used for acquiring a project code and separating the project code into a first part and a second part, wherein the first part comprises code of a two-dimensional user interface of a three-dimensional development tool, the second part comprises business logic code of the three-dimensional development tool and code of the three-dimensional interface, and the project code is project code based on the three-dimensional development tool;
the compiling access module is used for compiling the first part into a website project, compiling the second part into an executable file project, and accessing the website project and the executable file project into a target project, wherein the target project is a graphical user interface application program development framework project;
The first starting module is used for starting the executable file item through the target item after starting the target item, and displaying the three-dimensional interface of the executable file item through a window component of the target item;
and the second starting module is used for starting the website item through a browser engine of the target item and displaying the two-dimensional user interface through the browser engine.
8. The apparatus of claim 7, wherein the apparatus further comprises:
a capture module for acquiring a first user event captured by a two-dimensional user interface of the three-dimensional development tool;
The data interaction module is used for transmitting the first user event to the executable file item through the target item; carrying out business logic processing on the first user event through the executable file item to obtain a first processed event;
and the event response module is used for transferring the first processed event to the website item through the target item so as to enable the website item to respond to the event.
9. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 6 when executing the computer program.
10. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the method according to any one of claims 1 to 6.
CN202311841021.1A 2023-12-28 2023-12-28 Hybrid development method, apparatus, electronic device, and computer-readable storage medium Pending CN117908871A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311841021.1A CN117908871A (en) 2023-12-28 2023-12-28 Hybrid development method, apparatus, electronic device, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311841021.1A CN117908871A (en) 2023-12-28 2023-12-28 Hybrid development method, apparatus, electronic device, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
CN117908871A true CN117908871A (en) 2024-04-19

Family

ID=90696747

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311841021.1A Pending CN117908871A (en) 2023-12-28 2023-12-28 Hybrid development method, apparatus, electronic device, and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN117908871A (en)

Similar Documents

Publication Publication Date Title
CN109918055B (en) Application program generation method and device
US7545386B2 (en) Unified mobile display emulator
CN110188044B (en) Software error processing method, device, storage medium and equipment
CN110209967B (en) Page loading method and device, terminal equipment and computer readable medium
CN106547580B (en) Method and device for hooking function, mobile terminal and storage medium
CN104038657B (en) Information Processing System, Device And Information Processing Method
CN113407086B (en) Object dragging method, device and storage medium
CN113127361B (en) Application development method and device, electronic equipment and storage medium
CN110968331A (en) Method and device for running application program
CN113157345A (en) Automatic starting method and device for front-end engineering
CN111506368B (en) Method, device, equipment and storage medium for converting asynchronous call into synchronous call
CN110727581A (en) Collapse positioning method and electronic equipment
CN112631649A (en) Intelligent contract management method, device, terminal equipment and medium
CN113835569A (en) Terminal device, quick start method for internal function of application and storage medium
CN111949491A (en) SQL extraction method and device for MyBatis application program
US7898552B2 (en) Method and editing processor for adding graphics object with simple manner
CN117908871A (en) Hybrid development method, apparatus, electronic device, and computer-readable storage medium
KR101412465B1 (en) Verification system and verification method of code block for separating execution based contents
CN114489429B (en) Terminal equipment, long screen capturing method and storage medium
CN112052377A (en) Resource recommendation method, device, server and storage medium
CN111708519B (en) Service component processing method, device, equipment and storage medium
EP2966565A1 (en) Method for automatically converting android application to tizen installable package
CN117724726B (en) Data processing method and related device
CN111198721A (en) Application program running method and device
CN116991380B (en) Application program construction method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination