CN116820651A - Interface display method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN116820651A
Authority
CN
China
Prior art keywords
interface
operation data
dimensional
image area
shared memory
Prior art date
Legal status
Pending
Application number
CN202310797090.0A
Other languages
Chinese (zh)
Inventor
龚岗华
谢晨旭
Current Assignee
Hangzhou Hikrobot Co Ltd
Original Assignee
Hangzhou Hikrobot Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Hikrobot Co Ltd
Priority to CN202310797090.0A
Publication of CN116820651A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/54 Interprogram communication
    • G06F 9/544 Buffers; Shared memory; Pipes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/54 Interprogram communication
    • G06F 9/546 Message passing systems or structures, e.g. queues

Abstract

An embodiment of the present application provides an interface display method and apparatus, an electronic device, and a storage medium, relating to the technical field of interface development. The method includes: receiving an interface operation of a user through a first process of two started processes, where the first process is either of the two processes, and the two processes are developed based on different integrated development environments and are respectively used for rendering pictures in different image areas of a display interface; sending operation data of the interface operation to a second process of the two processes through the first process; and performing processing according to the operation data through the second process. Based on the above processing, a display interface can be implemented by combining two processes developed in two different integrated development environments. Developers can therefore select a different integrated development environment for each of the two processes, so that the development advantages of different integrated development environments are combined and the display effect of the whole interface is improved.

Description

Interface display method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of interface development technologies, and in particular, to an interface display method, an apparatus, an electronic device, and a storage medium.
Background
With the development of software technology, software is applied in more and more scenarios, and users' requirements for the interfaces displayed by software are increasingly rich. In some application scenarios, an interface that displays only two-dimensional (2D) images or only three-dimensional (3D) images cannot meet the needs of the user. For example, in an intelligent driving scenario, an interface that displays only a two-dimensional image cannot present the stereoscopic spatial relationship between the driving vehicle and the driving environment, while an interface that displays only a three-dimensional image is inconvenient for batch configuration operations.
In the related art, when an interface needs to display a two-dimensional image and a three-dimensional image at the same time, a developer can only select an integrated development environment (IDE, Integrated Development Environment) suited to two-dimensional development, use it to develop the image area of the interface that displays the two-dimensional image (which may be referred to as the two-dimensional image area), and develop the image area that displays the three-dimensional image (which may be referred to as the three-dimensional image area) through a control provided by that integrated development environment; or the developer can only select an integrated development environment suited to three-dimensional development, use it to develop the three-dimensional image area of the interface, and develop the two-dimensional image area through a control provided by that integrated development environment.
That is, in the related art only one integrated development environment can be selected for interface development, and that integrated development environment is suited to developing only one of the image areas (i.e., the two-dimensional image area or the three-dimensional image area); the other image area, developed through a control of that integrated development environment, has a poor display effect. In other words, the development advantages of different integrated development environments cannot be combined, and the display effect of the whole interface is therefore poor.
Disclosure of Invention
The embodiments of the present application aim to provide an interface display method and apparatus, an electronic device, and a storage medium, so as to combine the development advantages of different integrated development environments and improve the display effect of the whole interface. The specific technical solutions are as follows:
in a first aspect of the present application, there is provided an interface display method, including:
receiving an interface operation of a user through a first process of two started processes; the first process is either of the two processes, and the two processes are developed based on different integrated development environments and are respectively used for rendering pictures in different image areas of a display interface;
transmitting operation data of the interface operation to a second process of the two processes through the first process;
and processing according to the operation data through the second process.
Optionally, the sending, by the first process, the operation data of the interface operation to a second process of the two processes includes:
writing operation data of the interface operation into a shared memory of the first process through the first process;
transmitting, by the first process, a first signaling to a second process of the two processes based on the communication link; the first signaling is used for indicating the second process to read data from the shared memory;
the method further comprises the steps of:
and when the first signaling is received through the second process, reading the operation data from the shared memory through the second process.
Optionally, the writing, by the first process, the operation data of the interface operation into the shared memory of the first process includes:
when the state of the shared memory of the first process is that reading is completed, writing operation data of the interface operation into the shared memory through the first process;
After the operation data of the interface operation is written into the shared memory of the first process through the first process, the method further comprises:
setting the state of the shared memory to reading through the first process;
after the operation data is read from the shared memory by the second process when the first signaling is received by the second process, the method further includes:
transmitting, by the second process, second signaling indicating completion of reading to the first process based on the communication link;
and when the second signaling is received through the first process, setting the state of the shared memory to be read completion through the first process.
Optionally, the shared memory includes: a space for storing a type field representing a type of the interface operation, a space for storing a length field representing a data length of the operation data, and a space for storing the operation data.
Optionally, the sending, by the first process, the operation data of the interface operation to a second process of the two processes includes:
and transmitting operation data of the interface operation to a second process in the two processes through the first process based on a communication link.
Optionally, the communication link is a pipe, a message queue, or a socket.
Optionally, the first process is a main process, and the second process is a sub-process of the first process; the second process is used for rendering a picture in an area characterized by the area handle in the display interface based on the area handle defined in the first process.
Optionally, the different integrated development environments include: an integrated development environment for developing an image area displaying a two-dimensional image, and an integrated development environment for developing an image area displaying a three-dimensional image.
Optionally, the display interface includes a first image area for displaying a two-dimensional image and a second image area for displaying a three-dimensional image;
a process developed based on the integrated development environment for developing an image area that displays a two-dimensional image is configured to display, in the first image area, a two-dimensional control for setting attribute information of a three-dimensional virtual object in the second image area and the current attribute information of the three-dimensional virtual object in the second image area, and to update the attribute information of the three-dimensional virtual object displayed in the first image area according to operation data of an interface operation received in the second image area;
a process developed based on the integrated development environment for developing an image area that displays a three-dimensional image is configured to display a three-dimensional virtual object in the second image area, and to update the three-dimensional virtual object displayed in the second image area according to operation data of an interface operation received in the first image area.
Optionally, the method further comprises:
and when a control operation for the display interface is received, synchronously adjusting the first image area and the second image area according to operation data of the control operation.
In a second aspect of the present application, there is also provided an interface display device, including:
the interface operation receiving module is used for receiving the interface operation of the user through a first process in the started two processes; the first process is any one of the two processes, and the two processes are developed based on different integrated development environments and are respectively used for rendering pictures in different image areas in a display interface;
the operation data sending module is used for sending the operation data of the interface operation to a second process in the two processes through the first process;
And the operation data processing module is used for processing according to the operation data through the second process.
Optionally, the operation data sending module includes:
the operation data writing sub-module is used for writing the operation data of the interface operation into the shared memory of the first process through the first process;
a first sending sub-module, configured to send, by the first process, a first signaling to a second process of the two processes based on a communication link; the first signaling is used for indicating the second process to read data from the shared memory;
the apparatus further comprises:
and the operation data reading module is used for reading the operation data from the shared memory through the second process when the first signaling is received through the second process.
Optionally, the operation data writing sub-module is specifically configured to write, when the state of the shared memory of the first process is that reading is completed, operation data of the interface operation into the shared memory through the first process;
the apparatus further comprises:
the first setting module is used for setting the state of the shared memory as read through the first process after the operation data of the interface operation is written into the shared memory of the first process through the first process;
The second sending module is used for sending a second signaling representing that reading is completed to the first process through the second process based on the communication link after the operation data is read from the shared memory through the second process when the first signaling is received through the second process;
and the second setting module is used for setting the state of the shared memory to be read through the first process when the second signaling is received through the first process.
Optionally, the shared memory includes: a space for storing a type field representing a type of the interface operation, a space for storing a length field representing a data length of the operation data, and a space for storing the operation data.
Optionally, the operation data sending module is specifically configured to send, by the first process, operation data of the interface operation to a second process of the two processes based on a communication link.
Optionally, the communication link is a pipe, a message queue, or a socket.
Optionally, the first process is a main process, and the second process is a sub-process of the first process; the second process is used for rendering a picture in an area characterized by the area handle in the display interface based on the area handle defined in the first process.
Optionally, the different integrated development environments include: an integrated development environment for developing an image area displaying a two-dimensional image, and an integrated development environment for developing an image area displaying a three-dimensional image.
Optionally, the display interface includes a first image area for displaying a two-dimensional image and a second image area for displaying a three-dimensional image;
a process developed based on the integrated development environment for developing an image area that displays a two-dimensional image is configured to display, in the first image area, a two-dimensional control for setting attribute information of a three-dimensional virtual object in the second image area and the current attribute information of the three-dimensional virtual object in the second image area, and to update the attribute information of the three-dimensional virtual object displayed in the first image area according to operation data of an interface operation received in the second image area;
a process developed based on the integrated development environment for developing an image area that displays a three-dimensional image is configured to display a three-dimensional virtual object in the second image area, and to update the three-dimensional virtual object displayed in the second image area according to operation data of an interface operation received in the first image area.
Optionally, the apparatus further includes:
and the image area adjusting module is used for synchronously adjusting the first image area and the second image area according to the operation data of the control operation when the control operation for the display interface is received.
In a third aspect of the present application, there is provided an electronic device, including:
a memory for storing a computer program;
and the processor is used for realizing any one of the interface display methods when executing the program stored in the memory.
In yet another aspect of the present application, there is also provided a computer-readable storage medium, in which a computer program is stored, where the computer program, when executed by a processor, implements any one of the interface display methods described above.
The embodiment of the application also provides a computer program product containing instructions, which when run on a computer, cause the computer to execute any of the interface display methods described above.
According to the interface display method provided by the embodiment of the application, interface operation of a user is received through a first process in the started two processes; the first process is any one of the two processes, and the two processes are developed based on different integrated development environments and are respectively used for rendering pictures in different image areas in the display interface; transmitting operation data of interface operation to a second process in the two processes through a first process; and processing according to the operation data through a second process.
Based on the above processing, whichever of the two processes developed based on different integrated development environments receives an interface operation of the user, it can send the operation data to the other process when the other process needs to respond to the interface operation; that is, the two processes can interact through operation data. Moreover, since the two processes are respectively used for rendering pictures in different image areas of the display interface, interaction between the different image areas of the display interface can be realized; that is, the display interface can be implemented by combining two processes developed in two different integrated development environments. Developers can therefore select a different integrated development environment for each of the two processes, so that the development advantages of different integrated development environments are combined and the display effect of the whole interface is improved.
Of course, it is not necessary for any one product or method of practicing the application to achieve all of the advantages set forth above at the same time.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained from these drawings by those skilled in the art.
FIG. 1 is a first flowchart of an interface display method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a display interface according to an embodiment of the present application;
FIG. 3 is a second flowchart of an interface display method according to an embodiment of the present application;
FIG. 4 is a third flowchart of an interface display method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a shared memory according to an embodiment of the present application;
FIG. 6 is an interactive schematic diagram of two processes in an interface display method according to an embodiment of the present application;
FIG. 7 is a fourth flowchart of an interface display method according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an interface display device according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art based on the embodiments of the present application fall within the scope of protection of the present application.
With the development of software technology, software is applied in more and more scenarios, and users' requirements for the interfaces displayed by software are increasingly rich. In some application scenarios, an interface that displays only two-dimensional images or only three-dimensional images may not meet the needs of the user. For example, in an intelligent driving scenario, an interface that displays only a two-dimensional image cannot present the stereoscopic spatial relationship between the driving vehicle and the driving environment, while an interface that displays only a three-dimensional image is inconvenient for batch configuration operations.
In the related art, when an interface needs to display a two-dimensional image and a three-dimensional image at the same time, a developer can only select an integrated development environment suited to two-dimensional development, use it to develop the image area of the interface that displays the two-dimensional image (which may be referred to as the two-dimensional image area), and develop the image area that displays the three-dimensional image (which may be referred to as the three-dimensional image area) through a control provided by that integrated development environment; or the developer can only select an integrated development environment suited to three-dimensional development, use it to develop the three-dimensional image area of the interface, and develop the two-dimensional image area through a control provided by that integrated development environment.
That is, in the related art only one integrated development environment can be selected for interface development, and that integrated development environment is suited to developing only one of the image areas (i.e., the two-dimensional image area or the three-dimensional image area); the other image area, developed through a control of that integrated development environment, has a poor display effect. That is, only the development advantage of one integrated development environment can be utilized, and the development advantages of different integrated development environments cannot be combined, so that the display effect of the whole interface is poor.
In order to improve the display effect of the whole interface, the embodiment of the application provides an interface display method which is applied to a client running in electronic equipment. For example, the client may be three-dimensional drawing software or three-dimensional modeling software. Referring to fig. 1, fig. 1 is a first flowchart of an interface display method according to an embodiment of the present application, where the interface display method may include the following steps:
step S101: and receiving interface operation of the user through the first process in the started two processes.
The first process is any one of the two processes, and the two processes are developed based on different integrated development environments and are respectively used for rendering pictures in different image areas in the display interface.
Step S102: and sending operation data of the interface operation to a second process in the two processes through the first process.
Step S103: and processing according to the operation data through a second process.
Based on the above processing, whichever of the two processes developed based on different integrated development environments receives an interface operation of the user, it can send the operation data to the other process when the other process needs to respond to the interface operation; that is, the two processes can interact through operation data. Moreover, since the two processes are respectively used for rendering pictures in different image areas of the display interface, interaction between the different image areas of the display interface can be realized; that is, the display interface can be implemented by combining two processes developed in two different integrated development environments. Developers can therefore select a different integrated development environment for each of the two processes, so that the development advantages of different integrated development environments are combined and the display effect of the whole interface is improved.
For step S101 and step S102, the first process is either of the two started processes, and accordingly the other of the two processes may be referred to as the second process. The display interface may include an image area for displaying the picture rendered by the first process and an image area for displaying the picture rendered by the second process. When the user performs an interface operation in the image area of the display interface corresponding to the first process, the first process can, upon receiving the interface operation, respond to it and update the picture it renders, so that the picture displayed in the image area corresponding to the first process is updated. Optionally, an interface operation performed by the user in the image area corresponding to the first process may be an interface operation responded to entirely inside the first process; that is, only the first process needs to respond to it, the second process does not, and the first process does not need to send operation data of that interface operation to the second process. When an interface operation affects the display effect of the image area corresponding to the second process, that is, when the second process needs to respond to the interface operation, the first process may send the operation data of the interface operation to the second process upon receiving it. Accordingly, the second process may respond to the interface operation according to the received operation data and update the picture it renders, so that the picture displayed in the image area corresponding to the second process is updated. For example, when receiving an interface operation, the first process may determine whether the interface operation belongs to a preset type or is directed at a preset position; if so, it can be determined that the second process needs to respond to the interface operation and that the first process needs to send the operation data of the interface operation to the second process.
For example, the operation data may be used to instruct the second process how to respond to the interface operation so as to update the display content of its image area, and/or may carry parameters associated with the interface operation. That is, when either of the two started processes receives an interface operation of the user, it may send the operation data of the interface operation to the other of the two processes. In this way, interaction of operation data between the two started processes is realized, and display synchronization of the different image areas of the display interface can be ensured.
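As an illustration of this forwarding decision, the following C++ sketch shows how the first process might judge whether an interface operation belongs to a preset type or targets a preset position before sending its operation data to the second process. The type names, field names, and the particular preset types are assumptions of this sketch, not details given by the application.

```cpp
#include <cstdint>
#include <string>

enum class OpType : uint32_t { Click, Drag, Scale, Move, Delete };   // assumed types

struct InterfaceOperation {
    OpType      type;     // what the user did
    int         x, y;     // position of the operation in the display interface
    std::string payload;  // serialized parameters (sizes, ids, ...)
};

// Rectangle of the image area rendered by the *other* process (assumed known).
struct Rect { int left, top, right, bottom; };

bool inside(const Rect& r, int x, int y) {
    return x >= r.left && x < r.right && y >= r.top && y < r.bottom;
}

// Forward when the operation belongs to a preset type or targets a preset position.
bool needsForwarding(const InterfaceOperation& op, const Rect& otherArea) {
    const bool presetType = op.type == OpType::Scale || op.type == OpType::Move ||
                            op.type == OpType::Delete;         // assumed preset types
    const bool presetPosition = inside(otherArea, op.x, op.y); // targets the other area
    return presetType || presetPosition;
}
```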
In one embodiment, the different integrated development environments include: an integrated development environment for developing an image area displaying a two-dimensional image, and an integrated development environment for developing an image area displaying a three-dimensional image.
In an embodiment of the present application, the display interface may include an image area for displaying a two-dimensional image (i.e., a two-dimensional image area, i.e., a first image area in the subsequent embodiment) and an image area for displaying a three-dimensional image (i.e., a three-dimensional image area, i.e., a second image area in the subsequent embodiment). For example, a two-dimensional image area may be used to display two-dimensional controls such as user menus, and a three-dimensional image area may be used to display a three-dimensional model built according to user operations.
For convenience of description, a process developed based on an integrated development environment for developing an image area displaying a two-dimensional image may be referred to as a two-dimensional process in the present application; a process developed based on an integrated development environment for developing an image area displaying a three-dimensional image may be referred to as a three-dimensional process.
Based on the above processing, when the two-dimensional process and the three-dimensional process receive the interface operation of the user, respectively, the two-dimensional process and the three-dimensional process can send operation data to the other process under the condition that the other process needs to respond to the interface operation, that is, the two-dimensional process and the three-dimensional process can perform interaction of the operation data. And because the two-dimensional process is used for rendering pictures in the two-dimensional image area in the display interface, and the three-dimensional process is used for rendering pictures in the three-dimensional image area in the display interface, the interaction between the two-dimensional image area and the three-dimensional image area in the display interface can be realized, namely, the display interface can be realized by combining the two-dimensional process and the three-dimensional process. Therefore, a developer can select the integrated development environment suitable for two dimensions to develop a two-dimensional process, and select the integrated development environment suitable for three dimensions to develop a three-dimensional process, so that the development advantages of different integrated development environments can be combined, and further, the display effect of the whole interface can be improved.
In one embodiment, the display interface includes a first image area for displaying a two-dimensional image and a second image area for displaying a three-dimensional image.
A process developed based on the integrated development environment for developing an image area that displays a two-dimensional image is configured to display, in the first image area, a two-dimensional control for setting attribute information of a three-dimensional virtual object in the second image area and the current attribute information of the three-dimensional virtual object in the second image area, and to update the attribute information of the three-dimensional virtual object displayed in the first image area according to operation data of an interface operation received in the second image area.
A process developed based on the integrated development environment for developing an image area that displays a three-dimensional image is configured to display a three-dimensional virtual object in the second image area, and to update the three-dimensional virtual object displayed in the second image area according to operation data of an interface operation received in the first image area.
In the embodiment of the present application, the process developed based on the integrated development environment for developing the image area for displaying the two-dimensional image is the two-dimensional process in the above embodiment. The process developed based on the integrated development environment for developing the image area for displaying the three-dimensional image is the three-dimensional process in the above embodiment.
When the two-dimensional process receives the interface operation for the two-dimensional control, operation data can be sent to the three-dimensional process to instruct the three-dimensional process to update the three-dimensional virtual object displayed in the second image area. For example, the two-dimensional controls may include controls for generating three-dimensional virtual objects (may be referred to as generating controls), controls for modifying attribute information of three-dimensional virtual objects that have been displayed in the second image area (may be referred to as modifying controls), and controls for deleting three-dimensional virtual objects that have been displayed in the second image area (may be referred to as deleting controls).
When the user triggers the generation control in the first image area through interface operation, the two-dimensional process can send attribute information of the three-dimensional virtual object to be generated to the three-dimensional process, and correspondingly, the three-dimensional process can display the three-dimensional virtual object in the second image area according to the received attribute information.
When the user triggers the modification control in the first image area through interface operation, the two-dimensional process can send attribute information of the three-dimensional virtual object to be modified to the three-dimensional process, and correspondingly, the three-dimensional process can modify the three-dimensional virtual object displayed in the second image area according to the received attribute information. For example, the user may adjust the size of the three-dimensional virtual object to be adjusted by modifying the control, and correspondingly, the operation data may include the adjusted size and identification information of the three-dimensional virtual object to be adjusted, and the three-dimensional process may adjust the size of the three-dimensional virtual object to be adjusted displayed in the second image area according to the operation data; the user can also adjust the position of the three-dimensional virtual object to be adjusted through modifying the control, correspondingly, the operation data can comprise the adjusted position and identification information of the three-dimensional virtual object to be adjusted, and the three-dimensional process can adjust the position of the three-dimensional virtual object to be adjusted displayed in the second image area according to the operation data.
When the user triggers the deletion control in the first image area through interface operation, the two-dimensional process can send identification information of the three-dimensional virtual object to be deleted to the three-dimensional process, and correspondingly, the three-dimensional process can determine the three-dimensional virtual object to be deleted according to the received identification information and delete the three-dimensional virtual object determined in the second image area.
When the user selects one three-dimensional virtual object in the second image area, the three-dimensional process can send identification information of the three-dimensional virtual object to the two-dimensional process. Accordingly, the two-dimensional process may determine attribute information of the three-dimensional virtual object according to the received identification information and display current attribute information of the three-dimensional virtual object in the first image area.
When the three-dimensional process receives the interface operation for the three-dimensional virtual object, operation data may be transmitted to the two-dimensional process to instruct the two-dimensional process to update attribute information of the three-dimensional virtual object displayed in the first image area. For example, when the three-dimensional process receives a scaling operation for the three-dimensional virtual object, the three-dimensional process may send identification information of the scaled three-dimensional virtual object, and the size of the scaled three-dimensional virtual object, to the two-dimensional process. The two-dimensional process may update the size of the three-dimensional virtual object displayed in the first image area according to the size of the scaled three-dimensional virtual object. When the three-dimensional process receives the moving operation for the three-dimensional virtual object, the three-dimensional process may send identification information of the moving three-dimensional virtual object and a position of the moving three-dimensional virtual object to the two-dimensional process. The two-dimensional process may update the position of the three-dimensional virtual object displayed in the first image area according to the position of the three-dimensional virtual object after the movement.
For example, as shown in fig. 2, fig. 2 is a schematic diagram of a display interface according to an embodiment of the present application. The display interface 20 is an interface of indoor-model modeling software, and the area in the dashed box represents the second image area for displaying the three-dimensional model. The area of the display interface 20 other than the second image area may be the first image area, in which a plurality of two-dimensional controls may be arranged; the user may click a two-dimensional control to trigger the corresponding function. For example, a three-dimensional model may be composed of a plurality of three-dimensional virtual objects. The two-dimensional controls may include: a control for adding a three-dimensional virtual object to the second image area, i.e., the generation control in the above embodiment (for example, the three-dimensional virtual objects may be walls, tables, posts, chairs, and shelves); a save control for saving the three-dimensional model in the second image area; a parameter setting control for adjusting parameters of each three-dimensional virtual object composing the three-dimensional model (for example, a new parameter may be added, through the parameter setting control, to the three-dimensional virtual object selected by the user); a cancel control for canceling an interface operation and a restore control for restoring an interface operation; and a control for modifying attribute information of a three-dimensional virtual object in the second image area, i.e., the modification control in the above embodiment. Attribute information of the three-dimensional virtual object may be displayed in the modification control in the first image area; for example, for object X, each piece of attribute information of object X may be displayed, and each piece of attribute information may be represented by a parameter, such as parameter one, parameter two, ..., parameter N in fig. 2.
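For illustration only, the operation data exchanged in the interactions described above might be shaped roughly as follows; every enumerator and field name here is an assumption, since the application does not prescribe a concrete data structure.

```cpp
#include <cstdint>
#include <string>

enum class OpKind : uint32_t {
    GenerateObject,    // 2D -> 3D: generation control, add a new three-dimensional virtual object
    ModifyObject,      // 2D -> 3D: modification control, change size/position of an object
    DeleteObject,      // 2D -> 3D: deletion control, remove an object by its identification
    ObjectSelected,    // 3D -> 2D: an object was picked, show its current attribute information
    ObjectTransformed  // 3D -> 2D: an object was scaled or moved in the second image area
};

struct Vec3 { float x, y, z; };

struct OperationData {
    OpKind      kind;
    uint64_t    objectId;   // identification information of the three-dimensional virtual object
    Vec3        size;       // adjusted size (used by Generate/Modify/Transformed)
    Vec3        position;   // adjusted position
    std::string attributes; // additional attribute information, serialized
};
```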
Based on the above processing, the two-dimensional process and the three-dimensional process can send operation data to another process when receiving the interface operation of the user, respectively, under the condition that the other process needs to respond to the interface operation. Accordingly, another process, upon receiving the operation data, may update the content displayed in the image area (the first image area or the second image area) in accordance with the operation data. The interaction of the operation data can be performed between the two-dimensional process and the three-dimensional process, so that the interaction of the first image area and the second image area in the display interface can be realized, and the display effect of the whole interface is further improved.
In one embodiment, the first process is a main process and the second process is a sub-process of the first process. The second process is used for rendering the picture in the area characterized by the area handle in the display interface based on the area handle defined in the first process.
In the embodiment of the present application, the first process may be started as the main process, and a sub-process of the first process may be started as the second process. For example, after the first process is started, the sub-process of the first process may be started as the second process based on a cmd (command). After the second process is started, it may send a message indicating that startup is completed to the first process, so that the first process starts preparing to communicate with the second process. The main process is responsible for the presentation of the entire display interface and defines, in the display interface, an area (which may also be referred to as a window) for displaying the picture rendered by the second process. This area has a corresponding area handle (which may also be referred to as a window handle) defined in the main process; the sub-process may obtain the area handle defined in the main process and render its picture in the area characterized by the area handle. For example, when the area characterized by the area handle needs to be operated on, the sub-process may use the obtained area handle as a parameter of the called operation function, so that the operation on the area characterized by the area handle can be implemented.
Based on the processing, two processes developed by combining two different integrated development environments can be used for rendering the display interface, so that the most suitable integrated development environment can be selected for different image areas in the display interface, the display effect of each image area is improved, and the display effect of the whole display interface is further improved.
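A minimal sketch of this handle-passing scheme on Windows is given below, assuming a Win32 environment: the main process spawns the sub-process and passes the window handle of the reserved area on the command line, and the sub-process attaches its render window to that area. The executable name, the argument format, and the helper names are assumptions of this sketch.

```cpp
#include <windows.h>
#include <cstdint>
#include <string>

// Main (2D) process: spawn the 3D renderer and pass the HWND of the reserved area.
PROCESS_INFORMATION launchRenderer(HWND regionHandle) {
    std::wstring cmd = L"Renderer3D.exe --parent-hwnd=" +                 // assumed exe/flag
                       std::to_wstring(reinterpret_cast<uintptr_t>(regionHandle));
    STARTUPINFOW si{}; si.cb = sizeof(si);
    PROCESS_INFORMATION pi{};
    // CreateProcessW may modify the command-line buffer, so pass a writable one.
    CreateProcessW(nullptr, cmd.data(), nullptr, nullptr, FALSE, 0,
                   nullptr, nullptr, &si, &pi);
    return pi;
}

// Sub-process (3D): attach its render window to the area characterized by the handle.
void attachToRegion(HWND ownWindow, HWND regionHandle) {
    SetParent(ownWindow, regionHandle);                       // render inside the 2D window
    RECT rc; GetClientRect(regionHandle, &rc);
    MoveWindow(ownWindow, 0, 0, rc.right, rc.bottom, TRUE);   // fill the whole area
}
```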
In one embodiment, the interface display method further includes: when a control operation for the display interface is received, the first image area and the second image area are synchronously adjusted according to operation data of the control operation.
In the embodiment of the application, two processes render pictures in different image areas in the same display interface. When receiving the control operation of the user on the display interface, the two image areas and the display interface can be synchronously controlled according to the operation data of the control operation. When receiving a control operation for the display interface, the main process may control the display of the entire display interface according to the operation data of the control operation, and send the operation data of the control operation to the sub-process. The subprocess can control the display of the area for displaying the picture rendered by the subprocess in the display interface according to the received operation data of the control operation. For example, when a scaling operation is received for the display interface, the operation data may include a scaling scale, and the main process may scale the entire display interface according to the scaling scale and transmit the operation data to the sub-process. Accordingly, the sub-process may scale the area for displaying the picture rendered by the sub-process according to the scaling ratio. Or when a movement operation is received for the display interface, the operation data may include position change information, and the main process may move the entire display interface according to the position change information and send the operation data to the sub-process. Accordingly, the sub-process may move an area for displaying the picture rendered by the sub-process according to the position change information.
Based on the processing, two different image areas in the display interface can be adaptively controlled, so that the display synchronization of the two different image areas in the display interface can be further ensured, the display without perception of a user is realized, and the display effect of the display interface is further improved.
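As a rough illustration under the same Win32 assumption, the main process might handle a resize of the display interface as follows, re-laying out the reserved region and forwarding the new geometry to the sub-process; the half-and-half layout and the sendToSubProcess helper are assumptions of this sketch.

```cpp
#include <windows.h>

// Stub: in a real build this would serialize the rectangle as operation data of the
// control operation and send it over the communication link. (Assumed helper.)
void sendToSubProcess(const RECT& newRegionRect) { (void)newRegionRect; }

// Main process, on WM_SIZE: keep the 2D area and the reserved 3D region in step.
LRESULT onSize(HWND regionWnd, LPARAM lParam) {
    const int width  = LOWORD(lParam);
    const int height = HIWORD(lParam);
    // Assumed layout: left half is the first (2D) image area, right half the 3D region.
    MoveWindow(regionWnd, width / 2, 0, width / 2, height, TRUE);
    RECT rc; GetClientRect(regionWnd, &rc);
    sendToSubProcess(rc);   // the sub-process resizes its render target to match
    return 0;
}
```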
For step S103, the second process, upon receiving the operation data, may perform processing according to the operation data. For example, the self-rendered picture may be updated and/or corresponding operating parameters recorded. The interface operation response received by the first process can be realized, and the operation interaction between two different image areas in the display interface can be realized, so that the display synchronization of the two different image areas in the display interface is ensured.
For example, in a three-dimensional modeling scene, a two-dimensional legend for each three-dimensional model may be displayed in a two-dimensional image area, and a user may select and move the two-dimensional legend in the two-dimensional image area to the three-dimensional image area, at which time the two-dimensional process receives a movement operation of the user and transmits operation data of the movement operation to the three-dimensional process. For example, the operation data may include the size of the selected two-dimensional legend and the location of the movement. Further, when the three-dimensional process receives operation data of the moving operation, the display effect of the two-dimensional legend in the three-dimensional image area can be determined according to the operation data, and the picture is rendered based on the determined display effect.
In one embodiment, step S102 may include: and transmitting operation data of the interface operation to a second process in the two processes through the first process based on the communication link.
In the embodiment of the present application, the two started processes may interact through operation data over a communication link. That is, data interaction between two processes developed in different integrated development environments can be achieved through a communication link. On this basis, the whole interface can be implemented by combining two processes developed in two different integrated development environments, and the display effect of the whole interface can be improved. In addition, because the data interaction between the two processes is realized through the communication link, no additional memory space needs to be occupied, so memory space can be saved.
In one implementation, after two processes are started, a communication link between the two processes may be created by either of the two processes. For example, a communication link between the two processes may be created by a master process of the two processes. For example, a master process of the two processes may call a function for creating a communication link to create a communication link between the two processes. Subsequently, the two processes may interact with data via a communication link.
In one embodiment, the communication link is a pipe, message queue, or socket.
In the embodiment of the present application, the communication link may be any one of a pipe, a message queue, and a socket; inter-process communication can be realized through a pipe, a message queue, or a socket, that is, data interaction between the two processes can be realized through the communication link. The communication link in the present application may also be implemented in other ways capable of realizing inter-process communication, which is not specifically limited. On this basis, a display interface can be implemented by combining two processes developed in two different integrated development environments. Developers can therefore select a different integrated development environment for each of the two processes, so that the development advantages of different integrated development environments are combined and the display effect of the whole interface is improved.
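For example, with a named pipe as the communication link on Windows, the main process could create the link and the sub-process could open it roughly as follows; the pipe name and buffer sizes are assumptions, and error handling is omitted.

```cpp
#include <windows.h>

static const wchar_t* kPipeName = L"\\\\.\\pipe\\ui_bridge_signal";  // assumed name

// Main process: create the link; call ConnectNamedPipe() afterwards to wait for the client.
HANDLE createLink() {
    return CreateNamedPipeW(kPipeName,
                            PIPE_ACCESS_DUPLEX,                // both sides send signaling
                            PIPE_TYPE_MESSAGE | PIPE_READMODE_MESSAGE | PIPE_WAIT,
                            1,                                 // a single sub-process client
                            4096, 4096,                        // small buffers: only signaling flows here
                            0, nullptr);
}

// Sub-process: open the other end of the link.
HANDLE openLink() {
    return CreateFileW(kPipeName, GENERIC_READ | GENERIC_WRITE, 0, nullptr,
                       OPEN_EXISTING, 0, nullptr);
}
```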
In one embodiment, referring to fig. 3, fig. 3 is a second flowchart of an interface display method according to an embodiment of the present application.
Step S102, including:
step S1021: and writing the operation data of the interface operation into the shared memory of the first process through the first process.
Step S1022: the first signaling is sent by the first process to a second process of the two processes based on the communication link.
The first signaling is used for indicating the second process to read data from the shared memory.
The interface display method further comprises the following steps:
step S104: and when the first signaling is received through the second process, reading the operation data from the shared memory through the second process.
In the embodiment of the application, the interaction of operation data can be performed between two processes through the shared memory. The shared memory is a memory space in which both processes can access data, that is, both processes can read data in the shared memory or write data into the shared memory.
Since the first process is either of the two processes that have been started, both processes that have been started can create their own shared memory. For either of the two processes that have been started, the shared memory of that process is used to store the data that the process needs to send to the other process.
Furthermore, when the first process receives the interface operation of the user, the operation data of the interface operation can be written into the shared memory of the first process. After the operation data writing is completed, the first process may send a first signaling to the second process through the communication link to instruct the second process to read the operation data from the shared memory. That is, for any one of the two processes that have been started, when receiving the interface operation of the user, the process may write the operation data of the interface operation into its own shared memory. When the operation data writing is completed, the process may send a first signaling to another process through the communication link. Upon receiving the first signaling, another process may read the operational data from the shared memory of the process.
Based on the above processing, the shared memory stores the operation data which needs to be sent to another process by one process, and the first signaling is sent to instruct the other process to read the operation data in the shared memory by the communication link, so that the data interaction between the two processes can be realized by combining the communication link and the shared memory. Since the communication link is only responsible for transmitting the first signaling, the transmission of operation data through the communication link is not required, and the congestion of the communication link caused by the transmission of a large amount of operation data can be avoided. Therefore, the efficiency of transmitting the operation data can be improved. In addition, the speed of reading the data in the shared memory by the process is also high, so that the efficiency of transmitting the operation data can be further improved. Therefore, the efficiency of data interaction between two processes can be improved, and the display effect of the interface is further improved. Meanwhile, the time delay of data interaction between two processes is reduced, and the operation experience of a user is ensured.
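Combining the two mechanisms, the write-then-signal flow of steps S1021, S1022, and S104 might look roughly like the following sketch, assuming the shared block has already been mapped into both processes and the pipe above carries only the signaling; the structure layout anticipates the Fig. 5 layout described later, and the signaling value is an assumption.

```cpp
#include <windows.h>
#include <cstdint>
#include <cstring>

struct SharedBlock {          // anticipates the Fig. 5 layout described later
    uint32_t type;            // type field of the interface operation
    uint32_t length;          // valid length of the operation data that follows
    char     data[4096];      // operation data
};

// First process, step S1021: write the operation data into its shared memory.
void writeOperation(SharedBlock* shm, uint32_t opType, const void* opData, uint32_t len) {
    if (len > sizeof(shm->data)) len = static_cast<uint32_t>(sizeof(shm->data));
    shm->type = opType;
    shm->length = len;
    std::memcpy(shm->data, opData, len);
}

// First process, step S1022: send the first signaling over the communication link.
void sendFirstSignal(HANDLE pipe) {
    const uint8_t kReadRequest = 1;               // assumed signaling value
    DWORD written = 0;
    WriteFile(pipe, &kReadRequest, sizeof(kReadRequest), &written, nullptr);
}

// Second process, step S104: on the first signaling, read from the shared memory.
uint32_t readOperation(const SharedBlock* shm, void* out, uint32_t capacity) {
    const uint32_t n = shm->length < capacity ? shm->length : capacity;
    std::memcpy(out, shm->data, n);
    return n;
}
```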
In one embodiment, referring to fig. 4, fig. 4 is a third flowchart of an interface display method according to an embodiment of the present application. Writing operation data of the interface operation into the shared memory of the first process through the first process (step S1021), including:
Step S10211: and when the state of the shared memory of the first process is that reading is completed, writing operation data of interface operation into the shared memory through the first process.
After writing the operation data of the interface operation into the shared memory of the first process (step S1021), the method further includes:
step S105: the state of the shared memory is set to read by the first process.
After the operation data is read from the shared memory by the second process when the first signaling is received by the second process (step S104), the method further includes:
step S106: and sending, by the second process, second signaling indicating completion of the reading to the first process based on the communication link.
Step S107: when the second signaling is received through the first process, the state of the shared memory is set to be read completion through the first process.
In the embodiment of the present application, the user may perform multiple interface operations, and accordingly the first process may receive multiple interface operations of the user. That is, the first process needs to send operation data of interface operations to the second process multiple times. In order to ensure the accuracy of the operation data read by the second process, the shared memory stores the operation data of only one interface operation at a time.
When the operation data of a new interface operation is written into the shared memory, the operation data of the previous interface operation is overwritten. If the first process writes the operation data of a new interface operation into the shared memory before the operation data of the previous interface operation has been read, the operation data read by the second process may be erroneous. The second process then cannot respond to the previous interface operation, and accordingly, operation interaction between the two different image areas of the display interface cannot be realized.
Therefore, in order to ensure the accuracy of the operation data read by the second process, the first process may set the state of the shared memory. After writing the operation data of an interface operation into the shared memory, the first process may set the state of the shared memory to reading. If the state of the shared memory is reading, it means that the operation data of the previous interface operation stored in the shared memory has not yet been read, and the first process will not write the operation data of a new interface operation into the shared memory when it receives a new interface operation.
After finishing reading the operation data, the second process may send a second signaling indicating that reading is completed to the first process over the communication link. Upon receiving the second signaling, the first process sets the state of the shared memory to read-completed. If the state of the shared memory is read-completed, the operation data of the previous interface operation stored in the shared memory has already been read, and when the first process receives a new interface operation, it can write the operation data of the new interface operation into the shared memory.
Furthermore, before writing the operation data of the interface operation into the shared memory, the first process may determine the state of the shared memory, and when the state of the shared memory is that reading is completed, the operation data may be written into the shared memory.
Based on the above processing, it can be avoided that the operation data of a new interface operation is written while the operation data of the previous interface operation in the shared memory has not yet been read, which would cause the operation data read by the second process to be erroneous. That is, the accuracy of the operation data read by the second process can be ensured; furthermore, the second process can obtain accurate display content according to the accurate operation data, which further improves the display effect of the interface and ensures the operation experience of the user.
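One way to realize this read-completed / reading handshake, still under the Win32 assumption, is to keep a state word alongside the shared block and use interlocked operations on it. In this sketch the check of the read-completed state and the transition to reading are combined into a single compare-and-exchange, and the signaling values are assumptions.

```cpp
#include <windows.h>
#include <cstdint>

constexpr LONG READ_DONE = 0;   // reading of the previous operation data is completed
constexpr LONG READING   = 1;   // operation data written, not yet read

// First process, steps S10211/S105: only write when the state is "read completed";
// the check and the transition to "reading" are combined into one atomic step here.
bool tryBeginWrite(volatile LONG* state) {
    return InterlockedCompareExchange(state, READING, READ_DONE) == READ_DONE;
}

// Second process, step S106: after reading, send the second signaling over the link.
void sendSecondSignal(HANDLE pipe) {
    const uint8_t kReadComplete = 2;              // assumed signaling value
    DWORD written = 0;
    WriteFile(pipe, &kReadComplete, sizeof(kReadComplete), &written, nullptr);
}

// First process, step S107: on the second signaling, mark the shared memory reusable.
void onSecondSignal(volatile LONG* state) {
    InterlockedExchange(state, READ_DONE);
}
```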
In one embodiment, the shared memory comprises: a space for storing a type field indicating a type of an interface operation, a space for storing a length field indicating a data length of operation data, and a space for storing operation data.
In the embodiment of the application, the shared memory can be divided into a plurality of spaces, each of which is used for storing different data. When writing operation data into the shared memory, the first process writes, in accordance with the spaces divided in the shared memory, the type field into the space for storing the type field indicating the type of the interface operation, the length field into the space for storing the length field indicating the data length of the operation data, and the operation data into the space for storing the operation data.
When reading data from the shared memory, the second process can likewise read according to the divided spaces, reading the operation data according to the type field and the length field.
Based on the above processing, valid operation data can be read according to the type field and the length field, so the reliability of reading the operation data can be improved. Furthermore, the second process can be ensured to obtain accurate display content according to accurate operation data, the display effect of the interface is further improved, and the operation experience of the user is ensured.
In one embodiment, referring to fig. 5, fig. 5 is a schematic structural diagram of a shared memory according to an embodiment of the present application. As shown in fig. 5, the shared memory includes: a data type (i.e., a space for storing a type field indicating a type of interface operation in the above-described embodiment), a data length description (i.e., a space for storing a length field indicating a data length of operation data in the above-described embodiment), a data content (i.e., a space for storing operation data in the above-described embodiment), and a reserved space.
The data type space may be used to store a description of the data type, i.e., to store the type field. The data length description space may be used to store the valid data length, i.e., to store the length field. Reading the operation data based on the length field avoids out-of-range reads during data reading and improves the accuracy of data reading. The data content space may be used to store the specific data content, i.e., to store the operation data, and may be used to populate a specific data structure. The reserved space, which may also be referred to as reserved memory space, may be used to store corresponding data when function expansion is required.
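Only as an illustration, reading and writing the layout of fig. 5 might look like the sketch below. The 4-byte field widths, the little-endian byte order, and the function names are assumptions made for this example and are not specified in the patent.

```python
import struct

HEADER = struct.Struct("<II")    # assumed layout: 4-byte type field, 4-byte length field

def pack_message(buf, op_type: int, payload: bytes) -> None:
    """Write the type field, the length field, and the data content into the shared block."""
    HEADER.pack_into(buf, 0, op_type, len(payload))
    buf[HEADER.size:HEADER.size + len(payload)] = payload
    # Bytes beyond HEADER.size + len(payload) are left untouched (reserved space).

def unpack_message(buf) -> tuple[int, bytes]:
    """Read back only the valid bytes, bounded by the length field."""
    op_type, length = HEADER.unpack_from(buf, 0)
    return op_type, bytes(buf[HEADER.size:HEADER.size + length])

if __name__ == "__main__":
    block = bytearray(4096)                                  # stands in for the shared memory
    pack_message(block, 1, b'{"op": "rotate", "deg": 30}')   # hypothetical operation data
    print(unpack_message(block))                             # (1, b'{"op": "rotate", "deg": 30}')
```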
In one embodiment, referring to fig. 6, fig. 6 is an interaction schematic diagram of two processes in an interface display method according to an embodiment of the present application. The method can comprise the following steps:
Step S601: the 3D interface is started. That is, process A (i.e., the two-dimensional process in the above embodiment) is started as the main process, and a sub-process of process A is started based on cmd as process B (i.e., the three-dimensional process in the above embodiment).
Step S602: loading succeeds. That is, after the three-dimensional process finishes starting, it may send a message indicating that starting is complete to the two-dimensional process, so that the two-dimensional process starts preparing to communicate with the three-dimensional process.
Step S603: a communication link is established. That is, the two-dimensional process may call a function for creating a communication link to create a communication link with the three-dimensional process. For example, the communication link may be any of a pipe, a message queue, and a socket.
Step S604: an interface operation is received. That is, the two-dimensional process receives an interface operation of the user.
Step S605: process A sends the first signaling and a specific message to process B. That is, the two-dimensional process writes the specific message (i.e., the operation data of the interface operation) into the shared memory of the two-dimensional process, and sends the first signaling to the three-dimensional process based on the communication link after the writing is completed.
Step S606: process B updates its internal data information. That is, upon receiving the first signaling, the three-dimensional process reads the operation data from the shared memory and performs processing according to the operation data. For example, it may update the picture it renders and/or record corresponding operating parameters.
Step S607: message consumption is completed. That is, when the reading of the operation data is completed, the three-dimensional process sends a message (i.e., the second signaling) indicating that the reading of the operation data is complete to the two-dimensional process.
Step S608: the next interface operation is received. That is, after receiving the second signaling, the two-dimensional process may perform processing according to steps S605 to S607 when receiving the next interface operation of the user.
Step S609: an interface operation is received. That is, the three-dimensional process receives an interface operation of the user.
Step S610: process B sends the first signaling and a specific message to process A. That is, the three-dimensional process writes the specific message (i.e., the operation data of the interface operation) into the shared memory of the three-dimensional process, and sends the first signaling to the two-dimensional process based on the communication link after the writing is completed.
Step S611: process A updates its internal data information. That is, upon receiving the first signaling, the two-dimensional process reads the operation data from the shared memory and performs processing according to the operation data. For example, it may update the picture it renders and/or record corresponding operating parameters.
Step S612: message consumption is completed. That is, when the reading of the operation data is completed, the two-dimensional process sends a message (i.e., the second signaling) indicating that the reading of the operation data is complete to the three-dimensional process.
Step S613: the next interface operation is received. That is, after receiving the second signaling, the three-dimensional process may perform processing according to steps S610 to S612 when receiving the next interface operation of the user.
Based on the above processing, when the two-dimensional process and the three-dimensional process each receive an interface operation of the user, either process can send operation data to the other process when the other process needs to respond to that interface operation; that is, the two-dimensional process and the three-dimensional process can interact through operation data, and the display interface can be realized by combining the two-dimensional process and the three-dimensional process. In this way, when selecting technologies for developing the two-dimensional image area and the three-dimensional image area, a developer can choose different integrated development environments and different development languages, each suited to developing its respective image area, and develop the two-dimensional process and the three-dimensional process separately. The two developed processes are then effectively fused, that is, data interaction between the two-dimensional process and the three-dimensional process is realized, so that the development advantages of different integrated development environments can be fully utilized and the display effect of the whole interface can be improved. In addition, interacting operation data through the shared memory and signaling through the communication link can improve the real-time performance of interaction between the two processes and reduce the delay of user operations.
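As a further illustration only, the receiving role in the fig. 6 exchange (played by either process) might be sketched as the loop below. It assumes a multiprocessing Connection as a stand-in for the pipe, message queue, or socket link; the strings "DATA_READY" and "READ_DONE" stand in for the first and second signaling, and handle_operation() is a hypothetical callback rather than anything defined in the patent.

```python
import struct
from multiprocessing import shared_memory
from multiprocessing.connection import Connection

HEADER = struct.Struct("<II")            # assumed layout: type field, length field

def handle_operation(op_type: int, payload: bytes) -> None:
    # Placeholder: update the picture rendered by this process and/or record parameters.
    print(f"operation type={op_type}, {len(payload)} bytes")

def serve_peer(link: Connection, peer_shm_name: str) -> None:
    """Loop of the receiving process: first signaling, read, second signaling."""
    shm = shared_memory.SharedMemory(name=peer_shm_name)   # attach to the peer's shared block
    try:
        while link.recv() == "DATA_READY":                  # first signaling from the peer
            op_type, length = HEADER.unpack_from(shm.buf, 0)
            payload = bytes(shm.buf[HEADER.size:HEADER.size + length])
            handle_operation(op_type, payload)
            link.send("READ_DONE")                          # second signaling: block is reusable
    finally:
        shm.close()
```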
In one embodiment, referring to fig. 7, fig. 7 is a fourth flowchart of an interface display method according to an embodiment of the present application. The interface display method may include the steps of:
Step S701: the 2D process is started. Namely, the two-dimensional process is started as the main process, and a sub-process of the two-dimensional process is started as the three-dimensional process based on cmd.
Step S702: the data address block of the 2D process is set. The address block is used for storing data which needs to be sent to the three-dimensional process by the two-dimensional process. That is, the two-dimensional process creates a shared memory (which may be referred to as a two-dimensional shared memory) of the two-dimensional process, and the two-dimensional shared memory is used to store data that the two-dimensional process needs to send to the three-dimensional process.
Step S703: setting a data address block of the 3D process. The address block is used for storing data which needs to be sent to the two-dimensional process by the three-dimensional process. That is, the three-dimensional process creates a shared memory (which may be referred to as a three-dimensional shared memory) of the three-dimensional process, and the three-dimensional shared memory is used to store data that the three-dimensional process needs to send to the two-dimensional process.
Step S704: determine whether the 2D process notifies the 3D process. If the 2D process notifies the 3D process, step S705 is executed; if not, that is, if the 3D process notifies the 2D process, step S709 is executed. In other words, if the interface operation of the user is received by the two-dimensional process, step S705 is executed; if the interface operation of the user is received by the three-dimensional process, step S709 is executed.
Step S705: the 2D process sends a message to the 3D process through the communication link and sets the address block as in use. That is, after writing the operation data into the two-dimensional shared memory, the two-dimensional process sends the first signaling to the three-dimensional process based on the communication link and, at the same time, sets the state of the two-dimensional shared memory to reading. For example, the communication link may be a pipe, message queue, or socket.
Step S706: after receiving the message, the 3D process reads the fixed memory address block and parses the data according to the protocol format of the address block. That is, upon receiving the first signaling, the three-dimensional process reads the operation data from the two-dimensional shared memory.
Step S707: after the reading is completed, the 2D process is notified through the communication link. That is, after the reading of the operation data is completed, the three-dimensional process sends the second signaling indicating that the reading is complete to the two-dimensional process based on the communication link.
Step S708: the 2D process sets the address block as released, and the next data interaction can be performed. That is, upon receiving the second signaling, the two-dimensional process sets the state of the two-dimensional shared memory to read completion. Then the two-dimensional process may write new operation data into the two-dimensional shared memory.
Step S709: the 3D process sends a message to the 2D process through the communication link and sets the address block as in use. That is, after writing the operation data into the three-dimensional shared memory, the three-dimensional process sends the first signaling to the two-dimensional process based on the communication link and, at the same time, sets the state of the three-dimensional shared memory to reading.
Step S710: after receiving the message, the 2D process reads the fixed memory address block and parses the data according to the protocol format of the address block. That is, upon receiving the first signaling, the two-dimensional process reads the operation data from the three-dimensional shared memory.
Step S711: after the reading is completed, the 3D process is notified through the communication link. That is, after the reading of the operation data is completed, the two-dimensional process sends the second signaling indicating that the reading is complete to the three-dimensional process based on the communication link.
Step S712: the 3D process sets the address block as released, and the next data interaction can be performed. That is, upon receiving the second signaling, the three-dimensional process sets the state of the three-dimensional shared memory to read completion. Then the three-dimensional process may write new operation data into the three-dimensional shared memory.
In the embodiment of the application, the two-dimensional process and the three-dimensional process run as two independent processes, and fusion of the two processes, that is, data interaction between the two processes, is realized. Data interaction between the two-dimensional process and the three-dimensional process can be performed by combining the shared memory with the pipeline, which avoids the pipeline congestion that would be caused by transmitting a large amount of data directly through the pipeline. The two-dimensional shared memory and the three-dimensional shared memory can also be read quickly, so the efficiency of transmitting operation data between the two processes, that is, the efficiency of data interaction between them, can be improved, the timeliness of interaction between the two-dimensional image area and the three-dimensional image area can be effectively improved, and the display effect of the interface is further improved. Meanwhile, improving the real-time performance of data interaction between the two processes reduces the delay in responding to user operations and improves the operation experience of the user.
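A minimal end-to-end sketch of the 2D-notifies-3D direction of this flow is given below, purely for illustration. It assumes Python's multiprocessing module as a stand-in for the cmd-launched sub-process, the shared memory, and the communication link; only the small signaling messages travel over the Pipe, while the payload goes through the shared block. The message strings, field widths, and the JSON-like payload are invented for the example.

```python
import struct
from multiprocessing import Pipe, Process, shared_memory

HEADER = struct.Struct("<II")                     # assumed layout: type field, length field
SHM_SIZE = 4096

def run_3d_process(link, shm_name):
    """Stand-in for the 3D sub-process: wait for signaling, read, acknowledge."""
    shm = shared_memory.SharedMemory(name=shm_name)
    try:
        link.recv()                                             # first signaling
        op_type, length = HEADER.unpack_from(shm.buf, 0)
        payload = bytes(shm.buf[HEADER.size:HEADER.size + length])
        print(f"3D process got type={op_type}, payload={payload!r}")
        link.send("READ_DONE")                                  # second signaling
    finally:
        shm.close()

if __name__ == "__main__":
    shm = shared_memory.SharedMemory(create=True, size=SHM_SIZE)   # the 2D shared memory
    parent_link, child_link = Pipe()                               # the communication link
    child = Process(target=run_3d_process, args=(child_link, shm.name))
    child.start()                                                  # launch the sub-process

    payload = b'{"op": "set_angle", "value": 30}'                  # hypothetical operation data
    HEADER.pack_into(shm.buf, 0, 1, len(payload))
    shm.buf[HEADER.size:HEADER.size + len(payload)] = payload
    parent_link.send("DATA_READY")                                 # first signaling
    assert parent_link.recv() == "READ_DONE"                       # block may now be reused

    child.join()
    shm.close()
    shm.unlink()
```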
Based on the same inventive concept, the embodiment of the present application further provides an interface display device, referring to fig. 8, fig. 8 is a schematic structural diagram of the interface display device provided by the embodiment of the present application, where the device includes:
an interface operation receiving module 801, configured to receive an interface operation of a user through a first process of the two started processes; the first process is any one of the two processes, and the two processes are developed based on different integrated development environments and are respectively used for rendering pictures in different image areas in a display interface;
an operation data sending module 802, configured to send, by using the first process, operation data of the interface operation to a second process of the two processes;
an operation data processing module 803, configured to perform processing according to the operation data through the second process.
According to the interface display device provided by the embodiment of the application, when two processes developed based on different integrated development environments respectively receive interface operation of a user, under the condition that the other process needs to respond to the interface operation, operation data are sent to the other process, namely, interaction of the operation data can be carried out between the two processes. And because the two processes are respectively used for rendering pictures in different image areas in the display interface, the interaction of the different image areas in the display interface can be realized, namely, the display interface can be realized by combining the two processes developed by two different integrated development environments. Therefore, developers can select different integrated development environments respectively to develop the two processes respectively, and the development advantages of the different integrated development environments can be combined, so that the display effect of the whole interface can be improved.
In one embodiment, the operation data sending module 802 includes:
the operation data writing sub-module is used for writing the operation data of the interface operation into the shared memory of the first process through the first process;
a first sending sub-module, configured to send, by the first process, a first signaling to a second process of the two processes based on a communication link; the first signaling is used for indicating the second process to read data from the shared memory;
the apparatus further comprises:
and the operation data reading module is used for reading the operation data from the shared memory through the second process when the first signaling is received through the second process.
In an embodiment, the operation data writing sub-module is specifically configured to write, when the state of the shared memory of the first process is that reading is completed, the operation data of the interface operation into the shared memory through the first process;
the apparatus further comprises:
the first setting module is used for setting, through the first process, the state of the shared memory to reading after the operation data of the interface operation is written into the shared memory of the first process through the first process;
The second sending module is used for sending a second signaling representing that reading is completed to the first process through the second process based on the communication link after the operation data is read from the shared memory through the second process when the first signaling is received through the second process;
and the second setting module is used for setting, through the first process, the state of the shared memory to read completion when the second signaling is received through the first process.
In one embodiment, the shared memory includes: a space for storing a type field representing a type of the interface operation, a space for storing a length field representing a data length of the operation data, and a space for storing the operation data.
In one embodiment, the operation data sending module 802 is specifically configured to send, by the first process, operation data of the interface operation to a second process of the two processes based on a communication link.
In one embodiment, the communication link is a pipe, message queue, or socket.
In one embodiment, the first process is a main process and the second process is a sub-process of the first process; the second process is used for rendering a picture in an area characterized by the area handle in the display interface based on the area handle defined in the first process.
In one embodiment, the different integrated development environments include: an integrated development environment for developing an image area displaying a two-dimensional image, and an integrated development environment for developing an image area displaying a three-dimensional image.
In one embodiment, the display interface includes a first image area for displaying a two-dimensional image and a second image area for displaying a three-dimensional image;
a process developed based on the integrated development environment for developing an image area displaying a two-dimensional image is used for displaying, in the first image area: a two-dimensional control for setting attribute information of the three-dimensional virtual object in the second image area, and the current attribute information of the three-dimensional virtual object in the second image area; and is used for updating the attribute information of the three-dimensional virtual object displayed in the first image area according to operation data of the interface operation received in the second image area;
a process developed based on the integrated development environment for developing an image area displaying a three-dimensional image is used for displaying the three-dimensional virtual object in the second image area, and for updating the three-dimensional virtual object displayed in the second image area according to operation data of the interface operation received in the first image area.
In one embodiment, the apparatus further comprises:
and the image area adjusting module is used for synchronously adjusting the first image area and the second image area according to the operation data of the control operation when the control operation for the display interface is received.
The embodiment of the application also provides an electronic device, as shown in fig. 9, including:
a memory 901 for storing a computer program;
the processor 902 is configured to execute the program stored in the memory 901, thereby implementing the following steps:
receiving interface operation of a user through a first process in the started two processes; the first process is any one of the two processes, and the two processes are developed based on different integrated development environments and are respectively used for rendering pictures in different image areas in a display interface;
transmitting operation data of the interface operation to a second process in the two processes through the first process;
and processing according to the operation data through the second process.
The electronic device may further include a communication bus and/or a communication interface, where the processor 902, the communication interface, and the memory 901 communicate with each other via the communication bus.
The communication bus mentioned above for the electronic device may be a peripheral component interconnect standard (Peripheral Component Interconnect, PCI) bus or an extended industry standard architecture (Extended Industry Standard Architecture, EISA) bus, etc. The communication bus may be classified into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in the figure, but this does not mean that there is only one bus or only one type of bus.
The communication interface is used for communication between the electronic device and other devices.
The memory may include a random access memory (Random Access Memory, RAM), or may include a non-volatile memory (Non-Volatile Memory, NVM), such as at least one magnetic disk memory. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), and the like; it may also be a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In yet another embodiment of the present application, there is also provided a computer readable storage medium having stored therein a computer program which, when executed by a processor, implements the steps of any of the interface display methods described above.
In yet another embodiment of the present application, a computer program product containing instructions that, when run on a computer, cause the computer to perform any of the interface display methods of the above embodiments is also provided.
In the above embodiments, the implementation may be realized in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), or a solid-state storage medium (e.g., a Solid State Disk (SSD)), etc.
It is noted that relational terms such as first and second are used herein solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
In this specification, each embodiment is described in a related manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for the apparatus, electronic device, and storage medium embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and references to the parts of the description of the method embodiments are only needed.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application are included in the protection scope of the present application.

Claims (13)

1. An interface display method, characterized in that the method comprises:
receiving interface operation of a user through a first process in the started two processes; the first process is any one of the two processes, and the two processes are developed based on different integrated development environments and are respectively used for rendering pictures in different image areas in a display interface;
transmitting operation data of the interface operation to a second process in the two processes through the first process;
and processing according to the operation data through the second process.
2. The method of claim 1, wherein the sending, by the first process, the operation data of the interface operation to the second process of the two processes comprises:
writing operation data of the interface operation into a shared memory of the first process through the first process;
Transmitting, by the first process, a first signaling to a second process of the two processes based on the communication link; the first signaling is used for indicating the second process to read data from the shared memory;
the method further comprises the steps of:
and when the first signaling is received through the second process, reading the operation data from the shared memory through the second process.
3. The method of claim 2, wherein writing, by the first process, the operation data of the interface operation to the shared memory of the first process comprises:
when the state of the shared memory of the first process is that reading is completed, writing operation data of the interface operation into the shared memory through the first process;
after the operation data of the interface operation is written into the shared memory of the first process through the first process, the method further comprises:
setting, through the first process, the state of the shared memory to reading;
after the operation data is read from the shared memory by the second process when the first signaling is received by the second process, the method further includes:
Transmitting, by the second process, second signaling indicating completion of reading to the first process based on the communication link;
and when the second signaling is received through the first process, setting, through the first process, the state of the shared memory to read completion.
4. The method of claim 2, wherein the shared memory comprises: a space for storing a type field representing a type of the interface operation, a space for storing a length field representing a data length of the operation data, and a space for storing the operation data.
5. The method of claim 1, wherein the sending, by the first process, the operation data of the interface operation to the second process of the two processes comprises:
and transmitting operation data of the interface operation to a second process in the two processes through the first process based on a communication link.
6. The method of claim 2 or 5, wherein the communication link is a Pipe, a message queue, or a Socket.
7. The method of claim 1, wherein the first process is a main process and the second process is a sub-process of the first process; the second process is used for rendering a picture in an area characterized by the area handle in the display interface based on the area handle defined in the first process.
8. The method of claim 1, wherein the different integrated development environments comprise: an integrated development environment for developing an image area displaying a two-dimensional image, and an integrated development environment for developing an image area displaying a three-dimensional image.
9. The method of claim 8, wherein the display interface includes a first image area for displaying a two-dimensional image and a second image area for displaying a three-dimensional image;
a process developed based on the integrated development environment for developing an image area displaying a two-dimensional image is used for displaying, in the first image area: a two-dimensional control for setting attribute information of the three-dimensional virtual object in the second image area, and the current attribute information of the three-dimensional virtual object in the second image area; and is used for updating the attribute information of the three-dimensional virtual object displayed in the first image area according to operation data of the interface operation received in the second image area;
a process developed based on the integrated development environment for developing an image area displaying a three-dimensional image is used for displaying the three-dimensional virtual object in the second image area, and for updating the three-dimensional virtual object displayed in the second image area according to operation data of the interface operation received in the first image area.
10. The method according to claim 9, wherein the method further comprises:
and when a control operation for the display interface is received, synchronously adjusting the first image area and the second image area according to operation data of the control operation.
11. An interface display device, the device comprising:
the interface operation receiving module is used for receiving the interface operation of the user through a first process in the started two processes; the first process is any one of the two processes, and the two processes are developed based on different integrated development environments and are respectively used for rendering pictures in different image areas in a display interface;
the operation data sending module is used for sending the operation data of the interface operation to a second process in the two processes through the first process;
the operation data processing module is used for processing according to the operation data through the second process;
the operation data transmitting module includes:
the operation data writing sub-module is used for writing the operation data of the interface operation into the shared memory of the first process through the first process;
A first sending sub-module, configured to send, by the first process, a first signaling to a second process of the two processes based on a communication link; the first signaling is used for indicating the second process to read data from the shared memory;
the apparatus further comprises:
the operation data reading module is used for reading the operation data from the shared memory through the second process when the first signaling is received through the second process;
the operation data writing sub-module is specifically configured to write, when the state of the shared memory of the first process is that reading is completed, operation data of the interface operation into the shared memory through the first process;
the apparatus further comprises:
the first setting module is used for setting, through the first process, the state of the shared memory to reading after the operation data of the interface operation is written into the shared memory of the first process through the first process;
the second sending module is used for sending a second signaling representing that reading is completed to the first process through the second process based on the communication link after the operation data is read from the shared memory through the second process when the first signaling is received through the second process;
The second setting module is used for setting, through the first process, the state of the shared memory to read completion when the second signaling is received through the first process;
the shared memory comprises: a space for storing a type field representing a type of the interface operation, a space for storing a length field representing a data length of the operation data, and a space for storing the operation data;
the operation data sending module is specifically configured to send, by using the first process, operation data of the interface operation to a second process of the two processes based on a communication link;
the communication link is a pipe (Pipe), a message queue, or a socket (Socket);
the first process is a main process, and the second process is a sub-process of the first process; the second process is used for rendering a picture in an area characterized by the area handle in the display interface based on the area handle defined in the first process;
the different integrated development environments include: an integrated development environment for developing an image area displaying a two-dimensional image, and an integrated development environment for developing an image area displaying a three-dimensional image;
The display interface comprises a first image area for displaying a two-dimensional image and a second image area for displaying a three-dimensional image;
a process developed based on the integrated development environment for developing an image area displaying a two-dimensional image is used for displaying, in the first image area: a two-dimensional control for setting attribute information of the three-dimensional virtual object in the second image area, and the current attribute information of the three-dimensional virtual object in the second image area; and is used for updating the attribute information of the three-dimensional virtual object displayed in the first image area according to operation data of the interface operation received in the second image area;
a process developed based on the integrated development environment for developing an image area displaying a three-dimensional image is used for displaying the three-dimensional virtual object in the second image area, and for updating the three-dimensional virtual object displayed in the second image area according to operation data of the interface operation received in the first image area;
the apparatus further comprises:
and the image area adjusting module is used for synchronously adjusting the first image area and the second image area according to the operation data of the control operation when the control operation for the display interface is received.
12. An electronic device, comprising:
a memory for storing a computer program;
a processor for implementing the method of any of claims 1-10 when executing a program stored on a memory.
13. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when executed by a processor, implements the method of any of claims 1-10.
CN202310797090.0A 2023-06-30 2023-06-30 Interface display method and device, electronic equipment and storage medium Pending CN116820651A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310797090.0A CN116820651A (en) 2023-06-30 2023-06-30 Interface display method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310797090.0A CN116820651A (en) 2023-06-30 2023-06-30 Interface display method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116820651A true CN116820651A (en) 2023-09-29

Family

ID=88115254

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310797090.0A Pending CN116820651A (en) 2023-06-30 2023-06-30 Interface display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116820651A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117539582A (en) * 2024-01-10 2024-02-09 中航国际金网(北京)科技有限公司 Multi-process interface fusion method and device

Similar Documents

Publication Publication Date Title
US5956028A (en) Virtual space communication system, three-dimensional image display method, and apparatus therefor
CN112614202B (en) GUI rendering display method, terminal, server, electronic equipment and storage medium
CN116820651A (en) Interface display method and device, electronic equipment and storage medium
JP4677733B2 (en) Server device, display device, and display method
CN110555900B (en) Rendering instruction processing method and device, storage medium and electronic equipment
CN107633013A (en) Page picture generation method, device and computer-readable recording medium
WO2023226371A1 (en) Target object interactive reproduction control method and apparatus, device and storage medium
WO2023173516A1 (en) Data exchange method and apparatus, and storage medium and electronic device
CN108459910A (en) A kind of method and apparatus for deleting resource
CN114581580A (en) Method and device for rendering image, storage medium and electronic equipment
CN117078888A (en) Virtual character clothing generation method and device, medium and electronic equipment
CN108733602A (en) Data processing
JP2002197490A (en) Device for displaying three-dimensional graph
CN111428453B (en) Processing method, device and system in annotation synchronization process
JPH1139507A (en) Stereoscopic image display device
US11763528B2 (en) Avatar mobility between virtual reality spaces
JP3275861B2 (en) Information presenting apparatus and computer-readable recording medium recording information presenting program
CN116567273B (en) Method for transmitting display screen of container system, server device, and storage medium
US20230325908A1 (en) Method of providing interior design market platform service using virtual space content data-based realistic scene image and device thereof
WO2023142945A1 (en) 3d model generation method and related apparatus
US20240111496A1 (en) Method for running instance, computer device, and storage medium
CN114972642A (en) Method, apparatus, device and storage medium for three-dimensional modeling
CN115984431A (en) Virtual object building animation generation method and device, storage medium and electronic equipment
CN117914957A (en) Multi-protocol interactive communication method, device, equipment, storage medium and product
JP2004334566A (en) Information processing method and information processor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination