CN112965773B - Method, apparatus, device and storage medium for information display - Google Patents

Method, apparatus, device and storage medium for information display

Info

Publication number
CN112965773B
Authority
CN
China
Prior art keywords
terminal
displayed
image
information
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110234891.7A
Other languages
Chinese (zh)
Other versions
CN112965773A (en)
Inventor
徐泽前
刘昕笛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shining Reality Wuxi Technology Co Ltd
Original Assignee
Shining Reality Wuxi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shining Reality Wuxi Technology Co Ltd
Priority to CN202110234891.7A
Publication of CN112965773A
Application granted
Publication of CN112965773B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/005 General purpose rendering architectures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present disclosure provides a method, an apparatus, a device and a storage medium for information display, including the following steps: in response to determining that a target application is in a running state, acquiring an image rendering request and parsing the image rendering request to obtain information to be displayed; in response to determining that the information to be displayed contains first information to be displayed, rendering the first information to be displayed to generate a first image, so as to display the first image on a first user interface of a first terminal; and in response to determining that the information to be displayed contains second information to be displayed, rendering the second information to be displayed to generate a second image, so as to display the second image on a second user interface of a second terminal.

Description

Method, apparatus, device and storage medium for information display
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for displaying information.
Background
With the development of computer software and hardware in recent years, wearable smart devices of various forms have appeared, such as smart watches, head-mounted electronic devices and smart sports shoes, and they have shown broad application prospects in industry, healthcare, the military, education, entertainment and many other fields.
In daily life, a wearable device is typically used together with another terminal device. For example, a head-mounted electronic device may be connected to a mobile phone and serve as an extended screen of the phone; conversely, the mobile phone may serve as the computing unit of the head-mounted electronic device and provide it with computing capability. When a wearable device is used together with another terminal device, how the two devices display information is therefore a question that deserves attention.
Disclosure of Invention
The embodiments of the present disclosure provide a method, an apparatus, a device and a storage medium for information display.
In a first aspect, an embodiment of the present disclosure provides a method for information display, including: in response to determining that a target application is in a running state, acquiring an image rendering request, and parsing the image rendering request to obtain information to be displayed; in response to determining that the information to be displayed contains first information to be displayed, rendering the first information to be displayed to generate a first image, so as to display the first image on a first user interface of a first terminal; and in response to determining that the information to be displayed contains second information to be displayed, rendering the second information to be displayed to generate a second image, so as to display the second image on a second user interface of a second terminal, where the first user interface of the first terminal is used to control the second user interface of the second terminal.
In a second aspect, an embodiment of the present disclosure provides an apparatus for information display, including: a first processing module configured to acquire an image rendering request in response to determining that a target application is in a running state; a second processing module configured to parse the image rendering request to obtain information to be displayed; a third processing module configured to, in response to determining that the information to be displayed contains first information to be displayed, render the first information to be displayed to generate a first image, so as to display the first image on a first user interface of a first terminal; and a fourth processing module configured to, in response to determining that the information to be displayed contains second information to be displayed, render the second information to be displayed to generate a second image, so as to display the second image on a second user interface of a second terminal, where the first user interface of the first terminal is used to control the second user interface of the second terminal.
In a third aspect, an embodiment of the present disclosure provides a device for information display, the device including a head-mounted display terminal and a mobile terminal that can communicate with each other, the mobile terminal including a processor, a communication interface, a memory and a communication bus, where the processor, the communication interface and the memory communicate with one another via the bus; the memory is configured to store a computer program; and the processor is configured to execute the program stored in the memory to implement the method steps for information display of the first aspect.
In a fourth aspect, an embodiment of the present disclosure provides a device for information display, the device including a head-mounted display terminal and a mobile terminal that can communicate with each other, the display terminal including a processor, a communication interface, a memory and a communication bus, where the processor, the communication interface and the memory communicate with one another via the bus; the memory is configured to store a computer program; and the processor is configured to execute the program stored in the memory to implement the method steps for information display of the first aspect.
In a fifth aspect, embodiments of the present disclosure provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method steps for information display as in the first aspect.
According to the technical solution provided by the embodiments of the present disclosure, an image rendering request is first acquired in response to determining that a target application is in a running state, and the request is parsed to obtain information to be displayed. Then, in response to determining that the information to be displayed contains first information to be displayed, the first information to be displayed is rendered to generate a first image displayed on a first user interface of a first terminal; in response to determining that the information to be displayed contains second information to be displayed, the second information to be displayed is rendered to generate a second image displayed on a second user interface of a second terminal; and the user can control the second user interface of the second terminal through the first user interface of the first terminal. When the first terminal and the second terminal are used together, both the first image of the first terminal and the second image of the second terminal can be rendered and displayed by the same target application installed on the first terminal or the second terminal, which improves image rendering efficiency and better suits interaction between different terminals.
Drawings
To describe the embodiments of the present disclosure or the technical solutions in the prior art more clearly, the drawings required by the embodiments or by the description of the prior art are briefly introduced below. The drawings described below are obviously only some embodiments of the present disclosure; a person of ordinary skill in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of a system architecture of a method for information display or an apparatus for information display suitable for use in the present application;
FIG. 2 is a flow chart of a first embodiment of a method for information display according to the present application;
FIG. 3 is a flow chart of a second embodiment of a method for information display according to the present application;
FIG. 4 is a flow chart of a third embodiment of a method for information display according to the present application;
FIG. 5 is a schematic block diagram of an apparatus for information display according to the present application;
FIG. 6 is a schematic structural diagram of an electronic device for information display according to the present application.
Legend description:
200-first terminal, 300-second terminal.
Detailed Description
The embodiments of the present disclosure provide a method, an apparatus, a device and a storage medium for information display.
For a better understanding of the technical solutions of the present application, the technical solutions of the embodiments of the present disclosure are described below clearly and completely with reference to the drawings of those embodiments. The described embodiments are obviously only some, not all, of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without inventive effort shall fall within the scope of the present disclosure.
As shown in FIG. 1, FIG. 1 is a schematic diagram of a system architecture to which the method or apparatus for information display of the present application can be applied. The system may include a first terminal 200 and a second terminal 300, which may be connected in various ways, such as wired or wireless communication links or fiber-optic cables. The first terminal 200 and the second terminal 300 may interact to send or receive information and the like.
It should be understood that the first terminal 200 in FIG. 1 may be hardware or software. When the first terminal 200 is hardware, it may be any of various electronic devices having a display screen, including but not limited to a smartphone, a tablet computer and a laptop computer. When the first terminal 200 is software, it may be installed in the electronic devices listed above and implemented as multiple pieces of software or software modules, or as a single piece of software or software module; no specific limitation is imposed here. The second terminal 300 in FIG. 1 may be any of various wearable devices, such as a head-mounted electronic device that displays a user interface to be operated at a specified location in space, including but not limited to AR glasses and VR glasses.
In this embodiment, the user interface of the first terminal 200 may be a first user interface, and the first user interface may display a first image. The user interface of the second terminal 300 may be a second user interface, which may display a second image. The first terminal 200 may interact with the second terminal 300, so that a user may manipulate the second user interface of the second terminal 300 through the first user interface of the first terminal 200.
The first terminal 200 may be an electronic device providing various service functions, for example parsing an acquired image rendering request and displaying the processing result (e.g., the generated first image), or feeding the processing result (e.g., the generated second image) back to the second terminal so that it is displayed there.
It should be noted that the method for information display provided by the embodiments of the present application is generally performed by the first terminal 200, and accordingly the apparatus for information display is generally disposed in the first terminal 200.
It should further be noted that the second terminal 300 may also be an electronic device providing various service functions, for example parsing an acquired image rendering request and feeding the processing result (e.g., the generated first image) back to the first terminal so that it is displayed there, or directly displaying the processing result (e.g., the generated second image). Accordingly, an apparatus for information display may be disposed in the second terminal 300.
It should be understood that the numbers of first terminals 200 and second terminals 300 in FIG. 1 are merely illustrative; there may be any number of each, as the implementation requires. For example, FIG. 1 may include two first terminals 200 and one second terminal 300: two first images may be generated and displayed respectively on the first user interfaces of the different first terminals 200, a second image may be generated and displayed on the second user interface of the second terminal 300, and both first terminals 200 can manipulate, based on their first images, the second image displayed on the second user interface of the second terminal 300.
As shown in FIG. 2, the first embodiment of the method for information display according to the present application includes the following steps:
Step S102, in response to determining that the target application program is in a running state, an image rendering request is acquired.
As an example, the target application may be an application installed in advance on the split device. The split device may include a first terminal and a second terminal, and the target application may be installed on either the first terminal or the second terminal.
In this embodiment, the execution body of the method for information display (for example, the first terminal 200) may acquire the image rendering request when it determines that the target application is in the running state. The running state may include the state in which the target application is being started and the state in which it keeps running after being started. The image rendering request may be a request to render images of the information to be displayed on the first terminal and/or the second terminal.
In general, after the connection between the first terminal and the second terminal is established, the target application may be started and thus put into the running state, so that the first terminal can operate the second terminal. Alternatively, the target application may stay in the running state throughout the interaction between the first terminal and the second terminal.
Step S104, parsing the image rendering request to obtain the information to be displayed.
As an example, the information to be displayed may be image data to be displayed on the split device: 2D or 3D image data to be displayed on the first terminal, or 2D or 3D image data to be displayed on the second terminal.
In this embodiment, based on the image rendering request acquired in step S102, the execution body may parse the acquired request to obtain the information to be displayed. It can be appreciated that the information to be displayed may include image data to be displayed on the first terminal and/or the second terminal; the content it specifically includes depends on the actual application scenario.
Step S106, in response to determining that the information to be displayed contains first information to be displayed, rendering the first information to be displayed to generate a first image, so as to display the first image on the first user interface of the first terminal.
In this embodiment, based on the information to be displayed obtained in step S104, the execution body may analyze it to obtain an analysis result. If the information to be displayed includes the first information to be displayed, the execution body may render the first information to be displayed to generate a first image. Here, the first information to be displayed may include image data to be displayed on the first terminal of the split device. The execution body may then control the first terminal to display the first image on the first user interface, i.e. the user interface of the first terminal. It will be appreciated that the first image may take various forms; for example, it may be a still image.
As an example, the first terminal may be a smartphone, and the first image may be a user interface displayed on the first terminal containing two touch key identifiers. The touch keys may carry different patterns to distinguish them; for example, the first image may display a shortcut identifier pattern and a home key identifier pattern to distinguish the shortcut key from the home key.
Step S108, in response to determining that the information to be displayed contains second information to be displayed, rendering the second information to be displayed to generate a second image, so as to display the second image on the second user interface of the second terminal.
In this embodiment, based on the information to be displayed obtained in step S104, if the execution body determines after analysis that the information to be displayed includes the second information to be displayed, it may render the second information to be displayed to generate a second image. Here, the second information to be displayed may include image data to be displayed on the second terminal of the split device. The execution body may then control the second image to be displayed in the user interface of the second terminal, i.e. the second user interface. It will be appreciated that the second image may take various forms; for example, it may be a three-dimensional image. The user may operate (e.g., touch) the first user interface of the first terminal, thereby controlling through the first terminal the specific content displayed by the second user interface of the second terminal.
As an example, the first terminal may be a smartphone and the second terminal a head-mounted electronic device. The second user interface may be a user interface displayed on a virtual screen of the head-mounted electronic device. By operating the first user interface on the smartphone, the user controls through the smartphone the second user interface displayed on that virtual screen.
In one application scenario of this embodiment, when the target application is in the running state, the smartphone and the head-mounted electronic device may be in an interactable state. The user may operate the first user interface displayed on the smartphone (which may, for example, include two touch keys), for instance selecting the icon of a first APP on the virtual screen of the head-mounted electronic device and double-clicking to open it. The execution body then obtains an image rendering request and parses it to obtain the information to be displayed, which here includes the image data of the initial window of the first APP to be displayed on the head-mounted electronic device. The execution body renders this information to obtain a second image (which may include the initial window of the first APP) and displays it on the virtual screen of the head-mounted electronic device, so that the user controls the second user interface of the head-mounted electronic device through the first user interface of the smartphone.
It can be understood that in different application scenarios the parsed information to be displayed may include the first information to be displayed and/or the second information to be displayed. Examples of the different contents it may include are given below.
As an example, when the execution body detects that the target application has not been started, it may start the target application to put it into the running state. In this case, the target application may generate an image rendering request, which the execution body acquires. The request may ask for rendering of the images displayed by both the first terminal and the second terminal, and parsing it yields the information to be displayed of both terminals.
As an example, while the first terminal is controlling the second terminal, the target application stays in the running state; in this scenario, the image rendering request acquired by the execution body may be an image rendering request of the second terminal, and parsing it yields the information to be displayed of the second terminal.
As an example, if the target application needs to adjust the image information displayed on the first terminal while running, the execution body may acquire an image rendering request of the first terminal, and parsing it yields the information to be displayed of the first terminal.
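For illustration, the dispatch logic of steps S102-S108 can be sketched as follows in Kotlin. All of the types here (ImageRenderRequest, DisplayInfo, Renderer and the display callbacks) are hypothetical stand-ins, not part of the disclosed system; the sketch only shows how a single handler can render whichever payloads the parsed request contains.

```kotlin
// Hypothetical data model: a render request may carry a payload for either
// terminal or for both, depending on the scenario (see the examples above).
data class DisplayInfo(val payload: ByteArray, val is3D: Boolean)

data class ImageRenderRequest(
    val firstInfo: DisplayInfo?,  // first information to be displayed (first terminal)
    val secondInfo: DisplayInfo?  // second information to be displayed (second terminal)
)

interface Renderer {
    fun render(info: DisplayInfo): ByteArray  // returns an encoded image
}

class InfoDisplayController(
    private val renderer: Renderer,
    private val showOnFirstTerminal: (ByteArray) -> Unit,
    private val showOnSecondTerminal: (ByteArray) -> Unit,
) {
    /** Steps S104-S108: parse the request and render whichever payloads exist. */
    fun handle(request: ImageRenderRequest) {
        request.firstInfo?.let { info ->   // S106: first-terminal path
            showOnFirstTerminal(renderer.render(info))
        }
        request.secondInfo?.let { info ->  // S108: second-terminal path
            showOnSecondTerminal(renderer.render(info))
        }
    }
}
```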
In the related art, each terminal device generally renders the image displayed on its own user interface: the first terminal renders the first image displayed on the first user interface, and the second terminal renders the second image displayed on the second user interface. In the solution disclosed by the present application, the first image displayed on the first user interface and the second image displayed on the second user interface are both rendered by the same target application installed on the first terminal or the second terminal, so the two images need not be rendered separately on different terminal devices. This provides a basis for interaction between the first terminal and the second terminal and improves the image rendering efficiency of the two terminals during interaction.
According to the technical solution provided by the embodiments of the present disclosure, an image rendering request is first acquired in response to determining that the target application is in a running state, and the request is parsed to obtain information to be displayed. Then, in response to determining that the information to be displayed contains first information to be displayed, the first information to be displayed is rendered to generate a first image displayed on the first user interface of the first terminal; in response to determining that the information to be displayed contains second information to be displayed, the second information to be displayed is rendered to generate a second image displayed on the second user interface of the second terminal; and the user can control the second user interface of the second terminal through the first user interface of the first terminal. When the first terminal and the second terminal are used together, both the first image of the first terminal and the second image of the second terminal can be rendered and displayed by the same target application installed on one of them, which improves image rendering efficiency and better suits interaction between different terminals.
In some alternative embodiments, before step S102 the method may further include the following step A2.
In step A2, the target application is controlled to run in response to receiving the trigger information of the target application.
When the user needs the first terminal and the second terminal to interact, the user may establish a connection between them. Whether the first terminal and the second terminal have established a communication connection may be detected by the target application. In this way, when the target application detects that the two terminals are successfully connected, the execution body receives the trigger information of the target application and may control the target application to run, putting it into the running state.
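A minimal sketch of step A2, assuming a hypothetical ConnectionMonitor that is notified of a successful connection between the two terminals:

```kotlin
// Hypothetical sketch of step A2: once the target application detects that the
// two terminals are connected, the trigger information is received and the
// target application is brought into the running state.
class ConnectionMonitor(private val startTargetApp: () -> Unit) {
    fun onConnectionEstablished() {
        // Trigger information received: put the target application into its
        // running state so that image rendering requests can be served.
        startTargetApp()
    }
}
```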
In some alternative embodiments, as shown in FIG. 3, steps S106 and S108 may be implemented in various ways; one alternative is given in steps S1062 and S1082 below.
In step S1062, in response to determining that the information to be displayed includes the first information to be displayed, a preset rendering engine is used to render the first information to be displayed according to a preset first display mode, and a first image is generated to display the first image on a first user interface of the first terminal.
In step S1082, in response to determining that the information to be displayed includes the second information to be displayed, a preset rendering engine is adopted to render the second information to be displayed according to a preset second display mode, and a second image is generated to display the second image on a second user interface of the second terminal.
The first terminal can control a second user interface of the second terminal based on the first user interface.
As an example, the preset rendering engine may be a Unity engine or the like. Thus, when the target application is in the running state, it can use the same preset rendering engine to render both the first image displayed on the first terminal and the second image displayed on the second terminal.
Alternatively, the first display mode may include a two-dimensional display mode, and the first image may include a two-dimensional image. The second display mode may include a three-dimensional display mode, and the second image may include a three-dimensional image. In this case, the target application may use the same preset rendering engine to render both the two-dimensional image displayed on the first terminal and the three-dimensional image displayed on the second terminal. Here, the display mode is not particularly limited.
In some alternative embodiments, step S1082 may be implemented in various ways; one alternative that realizes the rendering of the three-dimensional image is given in steps L2 and L4 below.
In step L2, in response to determining that the information to be displayed includes second information to be displayed, rendering the second information to be displayed according to the three-dimensional display mode by adopting a preset rendering engine, and generating a left eye view and a right eye view.
In step L4, the left eye view and the right eye view are combined into a three-dimensional image to display the three-dimensional image on the second user interface of the second terminal. The first user interface of the first terminal is used for controlling the second user interface of the second terminal.
In this embodiment, when the execution body determines that the information to be displayed includes an image to be displayed on the second terminal (such as a head-mounted electronic device), it may acquire the feature information of the image currently required to be displayed on the head-mounted electronic device. A preset rendering engine (such as a 3D engine, e.g. Unity) may then be used to render, based on that feature information and in the three-dimensional display mode, the image information to be displayed on the user interface of the second terminal, generating a left-eye view and a right-eye view that are combined into a three-dimensional image. The execution body may then send the combined three-dimensional image to the left-eye and right-eye display screens of the second terminal (e.g., the head-mounted electronic device). In this way, through the two display screens of the head-mounted electronic device, the user sees a realistic stereoscopic 3D view of the object displayed on the second user interface of the second terminal. Further, while interacting with the second terminal via the first terminal, the user may manipulate the second user interface of the second terminal based on the first user interface displayed on the first terminal.
In general, before the left-eye and right-eye views are generated, two cameras may be created in advance in the three-dimensional space provided by the 3D engine, separated according to the distance between human eyes, to render the left-eye view and the right-eye view respectively.
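As a rough illustration, the two-camera arrangement can be sketched as follows. Camera, Vector3 and the renderFrom callback are placeholders for the engine's own scene and camera objects, and the default interpupillary distance of 64 mm is an assumption:

```kotlin
// Placeholder scene types; a real implementation would use the rendering
// engine's own camera objects (e.g. two cameras in a Unity scene).
data class Vector3(val x: Float, val y: Float, val z: Float)
class Camera(val position: Vector3)

class StereoRenderer(private val ipdMeters: Float = 0.064f) {  // assumed IPD
    /**
     * Steps L2-L4: create two cameras separated by the distance between human
     * eyes and render the scene once per eye; the two views are afterwards
     * composited into one three-dimensional image for the second terminal.
     */
    fun renderStereo(
        center: Vector3,
        renderFrom: (Camera) -> ByteArray,
    ): Pair<ByteArray, ByteArray> {
        val half = ipdMeters / 2f
        val leftEye = Camera(Vector3(center.x - half, center.y, center.z))
        val rightEye = Camera(Vector3(center.x + half, center.y, center.z))
        return renderFrom(leftEye) to renderFrom(rightEye)
    }
}
```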
In some alternative embodiments, the first terminal may include a touch display screen; for example, it may be a mobile phone, a tablet or another terminal with a touch display. The second terminal may include a head-mounted electronic device, such as a head-mounted display or smart glasses. The first image may be displayed on the touch display screen of the first terminal, and the second image on the virtual display screen of the second terminal.
In some alternative embodiments, step S1062 may be implemented in various ways; one alternative is given in steps K2 and K4 below.
In step K2, in response to determining that the information to be displayed includes the first information to be displayed, a preset rendering engine is adopted to render the first information to be displayed in real time according to a preset first display mode, so as to generate a dynamic two-dimensional image.
In step K4, the dynamic two-dimensional image is determined as a first image to display the first image at the first terminal.
In this embodiment, the execution body may use the 3D rendering engine to render the two-dimensional image displayed by the first terminal in real time (for example, refreshing the rendered two-dimensional image at 60 frames per second achieves real-time rendering), so that more complex, dynamic image effects can be realized with a two-dimensional image. Specifically, because the two-dimensional image is rendered in real time by the 3D rendering engine, a dynamic display effect can be achieved by animating the display elements of the two-dimensional image. The execution body may determine this two-dimensional image as the first image to be displayed on the first terminal, so that it is displayed there.
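A minimal sketch of such a fixed-rate re-rendering loop, with renderFrame standing in for whatever the 3D rendering engine actually invokes per frame; frame pacing here is simplified to a sleep-based 60 fps target:

```kotlin
import kotlin.concurrent.thread

// Re-render the first terminal's two-dimensional image at roughly 60 frames
// per second so that its display elements can be animated (steps K2-K4).
// renderFrame is a hypothetical callback into the 3D rendering engine.
fun startRenderLoop(renderFrame: (frameTimeNanos: Long) -> Unit): Thread =
    thread(isDaemon = true, name = "2d-render-loop") {
        val frameIntervalNanos = 1_000_000_000L / 60  // 60 fps target
        while (!Thread.currentThread().isInterrupted) {
            val start = System.nanoTime()
            renderFrame(start)  // produce the next frame of the dynamic 2D image
            val remainingMs = (frameIntervalNanos - (System.nanoTime() - start)) / 1_000_000
            if (remainingMs > 0) Thread.sleep(remainingMs)  // crude frame pacing
        }
    }
```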
In some alternative embodiments, as shown in FIG. 4, before step S102 the method may further include steps S002-S008 below.
In step S002, the attitude change information of the first terminal in space is acquired.
In this embodiment, the execution body of the method for information display (e.g., the first terminal) may acquire the posture change information of the first terminal in various ways. The posture change information characterizes the change of the first terminal's posture in space, which can be understood as the first terminal moving freely in multiple directions in space.
As an example, the first terminal may directly acquire its own posture change information through a sensor mounted on it, in which case the execution body obtains the posture change information directly from the first terminal. It should be noted that the first terminal may be a terminal device with a touch display screen, for example a mobile phone, and at least two touch keys may be displayed on the first user interface shown on that touch display screen.
As an example, the at least two touch keys may include a function key. A function key is a standalone key that performs a specific operation, for example a home key: clicking the home key returns to the home screen. The touch keys may further include an auxiliary key that assists the user's touch operations, for example by receiving a click, a double click or a slide to correspondingly open an application, close an application or open an application's menu. Function keys and auxiliary keys may be displayed differently, so that the user can distinguish the two kinds of touch keys on the first user interface. It is also possible to display only an auxiliary key on the first user interface of the first terminal; the auxiliary key then receives the user's touch operations (a click, a double click, a slide, etc.) to assist functions such as selection within the interface.
In general, the first terminal may be equipped with sensors that acquire multi-degree-of-freedom (DoF) information. As an example, the first terminal may be a mobile device with a 3DoF or 6DoF sensor, where 3DoF refers to the three rotational degrees of freedom, and 6DoF adds the three translational degrees of freedom (up-down, forward-backward, left-right). The posture change information may characterize changes in both the position and the orientation of the first terminal in space; for example, the posture of the first terminal may change from horizontal to vertical, or tilt to some angle from the horizontal. The posture change information may be determined by the 3DoF or 6DoF sensor. If the first terminal carries a 6DoF sensor, it can directly collect 6DoF information while moving in space to determine the posture change information, which the execution body then acquires directly. If the first terminal carries a 3DoF sensor, a reference datum point can be set with respect to the user, and the execution body determines the posture change information of the first terminal from its 3DoF information in space and that reference point.
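As a concrete illustration, on an Android first terminal a 3DoF orientation stream could be read from the rotation-vector sensor. The sketch below uses real Android sensor APIs, but the onPoseChange callback and how its output feeds the posture-change computation are assumptions:

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Reads the first terminal's orientation from Android's rotation-vector
// sensor; successive rotation matrices describe the posture change.
// onPoseChange is a hypothetical callback consuming the 3x3 matrix.
class PoseTracker(
    context: Context,
    private val onPoseChange: (FloatArray) -> Unit,
) : SensorEventListener {
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val rotationMatrix = FloatArray(9)

    fun start() {
        sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        // Convert the rotation vector into a rotation matrix describing the
        // device's current orientation in space.
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values)
        onPoseChange(rotationMatrix)
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
}
```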
In step S004, the operation point in the second user interface, which the second terminal displays at a specified position in space, is adjusted according to the posture change information, and the target object corresponding to the adjusted operation point is determined in the second user interface.
In this embodiment, the second terminal may be a head-mounted electronic device that displays the second user interface at a specified position in space; it may of course also be another electronic device that displays a user interface in space, which is not limited here. During the interaction between the first terminal and the second terminal, the posture of the first terminal in space corresponds to the operation point in the second user interface, so that when that posture changes, the operation point in the second user interface of the second terminal changes correspondingly. After acquiring the posture change information, the execution body may analyze it and thereby adjust the operation point in the second user interface displayed by the second terminal, then determine the object indicated by the adjusted operation point and take it as the target object.
As an example, with the first terminal in its current spatial posture, the operation point in the second user interface displayed by the second terminal indicates a first APP icon. The execution body adjusts the operation point according to the posture change information, moving it from the position of the first APP icon to that of a second APP icon, so the target object corresponding to the adjusted operation point is determined to be the second APP in the second user interface.
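The mapping from a posture change to the operation point, and the hit test that yields the target object, might look as follows; the sensitivity constant, the coordinate conventions and the square hit model for icons are all illustrative assumptions:

```kotlin
import kotlin.math.abs

data class OperationPoint(var x: Float, var y: Float)
data class Icon(val name: String, val x: Float, val y: Float, val halfSize: Float)

// Step S004 as a sketch: a yaw/pitch change of the first terminal moves the
// operation point across the second user interface, and a hit test yields
// the target object under the adjusted point.
class PointerMapper(private val sensitivity: Float = 500f) {  // assumed gain
    val point = OperationPoint(0f, 0f)

    fun applyPoseDelta(deltaYawRad: Float, deltaPitchRad: Float) {
        point.x += deltaYawRad * sensitivity    // horizontal rotation moves sideways
        point.y -= deltaPitchRad * sensitivity  // pitching up moves the point upward
    }

    /** Returns the icon (e.g. the second APP) under the operation point, if any. */
    fun hitTest(icons: List<Icon>): Icon? = icons.firstOrNull {
        abs(point.x - it.x) <= it.halfSize && abs(point.y - it.y) <= it.halfSize
    }
}
```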
In step S006, a touch operation received by the first terminal is acquired, and a touch instruction is generated. The touch operation is an operation performed by the user on the first user interface, and the touch instruction is used to trigger an update of the second user interface.
In this embodiment, the user may perform a touch operation on the first user interface of the first terminal, for example, the user may perform a touch operation on a touch key displayed on the first user interface. Thus, in response to the touch operation, the execution subject may generate or invoke a touch instruction for the target object.
In step S008, a touch instruction is executed to generate an image rendering request.
In this embodiment, based on the touch instruction generated or invoked in step S006, the execution body may execute the touch instruction, so as to generate an image rendering request. It should be noted that, the image rendering request may be used to request to render the second image to be displayed on the second terminal.
In this way, in response to determining that the target application is in the running state, the execution body may acquire the image rendering request and parse it to obtain the second information to be displayed, then render the second information to be displayed to generate a second image displayed on the second user interface of the second terminal. By operating the first user interface of the first terminal, the user controls the second user interface of the second terminal so that its displayed content is updated.
Optionally, the user may perform various touch operations on the first user interface, and upon receiving them the execution body may generate corresponding, different touch instructions; see steps X2-X4 below.
In step X2, the touch operation received by the first terminal is acquired, and the touch type is determined according to the touch operation.
In this embodiment, the first terminal may receive various touch operations of the user on the first user interface and analyze them to determine their touch type. Here, a touch operation may be a slide, a click, a long press, and so on; a touch type may be a left slide, a right slide, an up slide, a down slide, a single click, a double click, a long press, and so on. As an example, the execution body may analyze the trajectory of a received slide operation to determine whether it is a left, right, up or down slide, or analyze a click operation to determine whether it is a single click, a double click, and so on.
In step X4, a touch instruction of the target object is generated according to the touch type of the touch operation.
In this embodiment, based on the touch type determined in step X2, the execution body may generate the control instruction for the target object according to that touch type. It can be understood that control instructions corresponding to the different touch types are preset, so that after determining the touch type the execution body obtains the corresponding control instruction, which is used to control the target object.
As an example, the target object may be the window of a first APP displayed on the user interface. If the first terminal receives a slide operation of the user on the touch area, the execution body may analyze it and determine the touch type to be a left slide. The execution body then generates a window-shrinking instruction according to the left-slide type, so that the shrunken window of the first APP can be rendered and displayed as the second image on the second user interface of the second terminal.
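Steps X2 and X4 can be sketched as a classification followed by a lookup; the instruction set and the slide-classification rule below are illustrative only:

```kotlin
import kotlin.math.abs

enum class TouchType { SLIDE_LEFT, SLIDE_RIGHT, SLIDE_UP, SLIDE_DOWN, CLICK, DOUBLE_CLICK, LONG_PRESS }
enum class ControlInstruction { SHRINK_WINDOW, ENLARGE_WINDOW, SCROLL_UP, SCROLL_DOWN, SELECT, OPEN, SHOW_MENU }

// Step X2: classify a slide operation from its trajectory (dx, dy in pixels).
fun classifySlide(dx: Float, dy: Float): TouchType =
    if (abs(dx) >= abs(dy)) {
        if (dx < 0) TouchType.SLIDE_LEFT else TouchType.SLIDE_RIGHT
    } else {
        if (dy < 0) TouchType.SLIDE_UP else TouchType.SLIDE_DOWN
    }

// Step X4: each touch type maps to a preset control instruction for the
// target object, e.g. a left slide shrinks the first APP's window.
fun instructionFor(type: TouchType): ControlInstruction = when (type) {
    TouchType.SLIDE_LEFT -> ControlInstruction.SHRINK_WINDOW
    TouchType.SLIDE_RIGHT -> ControlInstruction.ENLARGE_WINDOW
    TouchType.SLIDE_UP -> ControlInstruction.SCROLL_UP
    TouchType.SLIDE_DOWN -> ControlInstruction.SCROLL_DOWN
    TouchType.CLICK -> ControlInstruction.SELECT
    TouchType.DOUBLE_CLICK -> ControlInstruction.OPEN
    TouchType.LONG_PRESS -> ControlInstruction.SHOW_MENU
}
```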
Further, at least two touch keys may be displayed in the first image shown on the first user interface, and mistouches during operation could reduce the user's immersion when interacting through the first terminal and the second terminal. Therefore, only one touch key (e.g., an auxiliary key) may be displayed in the first image, while the other touch keys (e.g., preset function keys such as the home key) are displayed on the second user interface. See step Q2 below.
In step Q2, in response to receiving the preset function key calling instruction, feature information of the preset function key is obtained, and an image rendering request is generated. The preset function key calling instruction is used for controlling the preset function keys of the first terminal to be displayed on the second user interface.
When the user needs a preset function key such as the home key, the user may operate the first terminal or the second terminal so that the execution body acquires the calling instruction of the preset function key. After obtaining it, the execution body may parse the instruction to obtain the feature information of the preset function key the user needs to call; the feature information may represent the presentation form of the key. The execution body may then generate an image rendering request according to this feature information, so as to generate a second image, containing the preset function key, to be displayed on the second terminal.
The preset function keys may be determined in various ways. For example, they may be the function keys that the execution body detects the user has used more than a preset number of times within a history period (e.g., 1 day), or function keys the user has added to a whitelist in advance according to actual needs. The preset function keys may include a home key, shortcut keys and the like; neither the determination method nor the type of the preset function keys is specifically limited here.
Further, when multiple preset function keys are displayed on the second user interface, they may end up scattered and hard for the user to find. To let the user quickly locate a preset function key when invoking its function, in some alternative implementations the method may further include step M2 below.
In step M2, the plurality of preset function keys are combined according to a preset function display mode to generate a combined preset function key. In this case, the second information to be displayed may include the image information related to the combined preset function key.
In this way, while interacting through the first terminal and the second terminal, when the user needs a certain preset function key, the user can locate the combined preset function key in the second image displayed by the second terminal and easily find the desired key, which further improves the efficiency of the interaction between the first terminal and the second terminal.
Optionally, the combined preset function key may be displayed according to a preset mode.
For example, the display mode of the combined preset function key may be a 2D display mode: the execution body may use the preset rendering engine to render the second information to be displayed in 2D and generate a two-dimensional image, for instance rendering it as a wheel-disc image in which the disc is divided into several areas, each corresponding to one preset function key.
Alternatively, the display mode of the combined preset function key may be a 3D display mode: the execution body may use the preset rendering engine to render the second information to be displayed in 3D and generate a three-dimensional image as the second image, for instance an image containing a cube whose faces each display a different preset function key. The user selects the corresponding preset function key by selecting a face.
In this way, the execution body can render second images of different forms from the second information to be displayed, containing the images of several preset function keys. With the wheel-disc or cube display mode, the user can, by clicking the preset function key displayed on the first user interface, select the key corresponding to an area of the wheel disc or a face of the cube in the second image, thereby triggering that key's function and further improving the user experience.
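As one possible reading of the cube display mode, each face of the cube can simply be mapped to one combined preset function key; the sketch below is an assumption about how such a mapping might be kept, and the key names are examples:

```kotlin
// Each face of the rendered cube carries one preset function key; selecting
// a face triggers the corresponding key.
enum class CubeFace { FRONT, BACK, LEFT, RIGHT, TOP, BOTTOM }

class FunctionKeyCube(keys: List<String>) {
    // Assign up to six preset function keys (e.g. "home", "back") to faces.
    private val faceToKey: Map<CubeFace, String> = CubeFace.values().zip(keys).toMap()

    fun keyOn(face: CubeFace): String? = faceToKey[face]
}

// Usage: FunctionKeyCube(listOf("home", "back", "screenshot")).keyOn(CubeFace.FRONT)
```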
Further, to improve the efficiency of the user's interaction through the first terminal and the second terminal, in some alternative implementations the method may further include steps P2-P4 below.
In step P2, a third preset instruction is received, where the third preset instruction may be a call instruction for calling a preset function key.
The third preset instruction may be obtained by voice, or by receiving the user's touch operation on a physical key provided on the first terminal or the second terminal. The form in which the third preset instruction is received is not limited here.
In step P4, an image rendering request is generated in response to the third preset instruction. The image rendering request is used for requesting to render a second image displayed in a second user interface, and the second image can include related information of a preset function key to be called.
In this embodiment, while interacting through the first terminal and the second terminal, when the user needs a certain function, such as returning to the desktop or invoking the home key, the user may speak a related keyword ("return to desktop", "start home key", etc.) or trigger the physical key corresponding to that preset function on the first or second terminal. The execution body may receive the third preset instruction through a voice receiving module provided in the first or second terminal, or generate it from the received operation of the physical key with the preset function. It then generates, in response to the third preset instruction, an image rendering request for rendering the second image displayed in the second user interface, the second image including information related to the preset function key to be invoked.
In this way, the execution body may acquire and parse the image rendering request to obtain the second information to be displayed, and render it to generate the second image displayed on the second terminal, which may include the preset function keys the user needs to invoke.
Considering that different interface types may call for different touch-key layouts, a first image containing reasonably laid-out touch keys may be rendered according to the interface type so as to ease the user's operation. First, the execution body may determine the type of the interface to be displayed on the second terminal, and then obtain, according to that type, the layout of the touch keys to be displayed in the first image of the first terminal. Finally, when the target application is determined to be in the running state, the execution body may acquire an image rendering request, which may include the first information to be displayed, and render it into a first image whose touch keys are laid out in the obtained manner.
An applicable touch-key layout may be set for each interface type to be displayed on the second terminal, such as a game interface or a video interface: for example, a first touch-key layout for the game interface and a second touch-key layout for the video interface.
As an example, the execution body determines that the interface type to be displayed on the second terminal is a video interface and therefore obtains the second touch-key layout. When the target application is determined to be in the running state, the execution body acquires an image rendering request that may include the first information to be displayed, and renders it into a first image whose touch keys follow the second touch-key layout.
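The per-interface-type layout selection reduces to a lookup table; the interface types beyond those named above, the key names and the coordinate scheme (fractions of the touch display's width and height) are assumptions:

```kotlin
// Choose the touch-key layout of the first image from the interface type to
// be displayed on the second terminal.
enum class InterfaceType { GAME, VIDEO, DEFAULT }
data class KeyLayout(val keyPositions: Map<String, Pair<Float, Float>>)

private val layouts = mapOf(
    InterfaceType.GAME to KeyLayout(mapOf("touch_area" to (0.5f to 0.7f), "home" to (0.92f to 0.92f))),
    InterfaceType.VIDEO to KeyLayout(mapOf("touch_area" to (0.5f to 0.5f), "home" to (0.08f to 0.92f))),
)

fun layoutFor(type: InterfaceType): KeyLayout =
    layouts[type] ?: KeyLayout(mapOf("touch_area" to (0.5f to 0.5f)))
```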
In some alternative embodiments, while the first terminal and the second terminal are interacting, the execution body may make specific application software of the first terminal be displayed on the second user interface, and it may select that software in various ways. Before step S102, the method may further include steps N2-N14 below.
In step N2, a pre-stored white list of application software is obtained.
The whitelist may be pre-stored list information of the application software that is allowed to be displayed in the user interface of the second terminal. The whitelist may be determined in various ways. For example, it may be determined according to application software satisfying a preset condition. The preset condition may be that the first terminal detects that the user's usage time of a certain application within a history period (for example, 1 day) exceeds a preset duration. As an example, the whitelist may be updated in real time; specifically, application software that the first terminal detects to satisfy the preset condition may be automatically added to the whitelist in real time. Alternatively, the preset condition may be that the number of uses within a preset history period (e.g., 1 week) exceeds a preset threshold. The preset condition is not limited here.
In step N4, an application to be displayed at the second terminal is determined from the first terminal based on the white list.
In step N6, second information to be displayed on the second terminal is obtained, where the second information to be displayed includes image data such as the icon of the determined application program.
In step N8, an image rendering request is generated according to the second information to be displayed.
In step N10, in response to determining that the target application is in a running state, an image rendering request is acquired.
In step N12, the image rendering request is parsed to obtain second information to be displayed.
In step N14, a preset rendering engine is adopted to render the second information to be displayed according to a preset second display mode, and a second image is generated so as to display the second image on the second user interface of the second terminal. In this way, a second image including the icons and the like of the whitelisted applications can be displayed on the second user interface of the second terminal.
The display mode may include a 2D display mode, a 3D display mode, or the like. Alternatively, the display mode may further include an arrangement mode of the information to be displayed. For example, where the information to be displayed is application software selected from the whitelist, the display mode may divide the selected application software into different groups. The display mode is not limited here.
In this way, the first terminal determines the second information to be displayed on the second terminal by acquiring the pre-stored application software whitelist, and uses the preset rendering engine to render the second information to be displayed according to the preset display mode, thereby generating the second image. Finally, the second image is displayed on the second user interface of the second terminal, so that the user can control, through the first terminal, the application software in the second user interface displayed on the second terminal.
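To make steps N2 to N14 concrete, the following is a hedged Kotlin sketch of the whitelist path; AppInfo, the usage threshold, and the request type are illustrative assumptions rather than the embodiment's actual interfaces:

```kotlin
import java.time.Duration

// Hypothetical description of an installed application on the first terminal.
data class AppInfo(val packageName: String, val icon: ByteArray, val dailyUsage: Duration)

// Same illustrative request type as in the earlier sketch.
data class ImageRenderingRequest(val secondInfoToDisplay: Map<String, Any>)

// Step N2 (under the assumed preset condition): applications whose usage within
// the history period exceeds the threshold make up the whitelist; re-running this
// as usage changes corresponds to updating the whitelist in real time.
fun buildWhitelist(installed: List<AppInfo>, threshold: Duration): Set<String> =
    installed.filter { it.dailyUsage >= threshold }.map { it.packageName }.toSet()

// Steps N4 to N8: take the whitelisted applications and wrap their icons as the
// second information to be displayed inside an image rendering request.
fun requestForWhitelist(installed: List<AppInfo>, whitelist: Set<String>): ImageRenderingRequest =
    ImageRenderingRequest(
        mapOf("icons" to installed.filter { it.packageName in whitelist }.map { it.icon })
    )
```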
In some alternative embodiments, the execution body may select the icon of the target application software included in the second image from the first terminal not only through the whitelist but also through a target software development kit, which is not limited here.
In this way, application software determined to use the target software development kit is taken as the target application software, so that the target application software adapts better to the second terminal. The target application software can then be determined from the first terminal as the second information to be displayed on the second terminal, and the preset rendering engine renders the second information to be displayed according to the preset display mode to generate the second image. Finally, the second image is displayed on the second user interface of the second terminal, so that the user can control, through the first terminal, the application software displayed on the second user interface of the second terminal. According to this embodiment, the user can find the target application software in the second user interface displayed on the second terminal, which meets the user's diversified requirements and improves the user experience.
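Selection by development kit could be sketched as a simple filter; the usesTargetSdk flag below is a placeholder for however the first terminal actually detects the kit, which the embodiment leaves open:

```kotlin
// Hypothetical marker: whether an application was built with the target software
// development kit; how the first terminal detects this is left open here.
data class SdkAppInfo(val packageName: String, val icon: ByteArray, val usesTargetSdk: Boolean)

// Applications built with the target kit become the target application software.
fun selectTargetApps(installed: List<SdkAppInfo>): List<SdkAppInfo> =
    installed.filter { it.usesTargetSdk }
```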
In some embodiments, upon initialization the second terminal typically displays the second user interface at a designated position in space, at a fixed relative distance, orientation, and the like with respect to the second terminal. However, the user's position generally does not stay fixed while wearing the second terminal, so the relative position between the second terminal and the user interface displayed in space changes. For example, when the user wearing the second terminal moves toward the displayed user interface, the distance between the user and the user interface gradually decreases, which may affect the user's visual experience. To solve this problem, the method may further include the processing of the following step C2.
In step C2, if the first terminal receives a preset touch operation for the target touch key, a second user interface initialization instruction is generated in response to the preset touch operation. The user performs the preset touch operation on a target touch key (such as a home key) displayed in the first user interface, thereby generating the second user interface initialization instruction. This instruction can not only reset the position of the second user interface in space, so that the relative position of the second user interface and the second terminal is restored to the initial state, but also trigger re-rendering of the second image displayed on the second user interface.
A sensor of the second terminal, such as an IMU, may acquire the position of the second terminal in space (the position may include coordinate or vector information), so that the position of the second user interface in space can be determined from the position of the second terminal. Therefore, the second user interface initialization instruction can restore the second user interface and the second terminal to their initial relative positions, regardless of whether their current relative positions have changed.
Further, after generating the second user interface initialization instruction, the execution body may acquire an image rendering request, which may be used to request rendering of the second image displayed on the second terminal. It will be appreciated that the second image may be the initial user interface, or may be the same as the image displayed before the second user interface was initialized.
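A minimal Kotlin sketch of step C2, assuming a hypothetical Pose type, an IMU-backed headset pose, and an illustrative 1.5 m forward offset that the embodiment does not specify:

```kotlin
// Hypothetical pose: position in space plus a yaw angle; a real headset would
// expose richer IMU data.
data class Pose(val x: Float, val y: Float, val z: Float, val yawDeg: Float)

class SecondUserInterface(var pose: Pose)

// Same illustrative request type as in the earlier sketches.
data class ImageRenderingRequest(val secondInfoToDisplay: Map<String, Any>)

// Assumed initial placement: the interface floats 1.5 m in front of the headset.
fun initialPoseFor(headset: Pose): Pose =
    Pose(headset.x, headset.y, headset.z - 1.5f, headset.yawDeg)

// Step C2: a preset touch on the target touch key (e.g. the home key) restores the
// initial relative pose and requests re-rendering of the second image.
fun onInitializationInstruction(headsetPoseFromImu: Pose, ui: SecondUserInterface): ImageRenderingRequest {
    ui.pose = initialPoseFor(headsetPoseFromImu)
    return ImageRenderingRequest(mapOf("reinitialize" to true))
}
```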
Corresponding to the method for information display provided in the foregoing embodiments and based on the same technical concept, an embodiment of the present disclosure further provides an apparatus for information display. Fig. 5 is a schematic block diagram of the apparatus, which is used to perform the method for information display described in Figs. 1 to 4. As shown in Fig. 5, the apparatus includes: a first processing module 501, configured to acquire an image rendering request in response to determining that a target application is in a running state; a second processing module 502, configured to parse the image rendering request to obtain information to be displayed; a third processing module 503, configured to, in response to determining that the information to be displayed includes first information to be displayed, render the first information to be displayed and generate a first image, so as to display the first image on a first user interface of the first terminal; and a fourth processing module 504, configured to, in response to determining that the information to be displayed includes second information to be displayed, render the second information to be displayed and generate a second image, so as to display the second image on a second user interface of the second terminal, where the first user interface of the first terminal is used to control the second user interface of the second terminal.
Optionally, the apparatus further includes: and the fifth processing module is used for controlling the operation of the target application program in response to receiving the trigger information of the target application program.
Optionally, the third processing module 503 is configured to render the first information to be displayed according to a preset first display mode using a preset rendering engine, so as to generate the first image; and the fourth processing module 504 is configured to render the second information to be displayed according to a preset second display mode using the preset rendering engine, so as to generate the second image.
Optionally, the first display mode comprises a two-dimensional display mode, and the first image comprises a two-dimensional image; the second display mode includes a three-dimensional display mode, and the second image includes a three-dimensional image.
Optionally, the fourth processing module 503 includes: the first rendering unit is used for rendering the second information to be displayed according to a three-dimensional display mode by adopting a preset rendering engine to generate a left eye view and a right eye view; and a synthesizing unit for synthesizing the left eye view and the right eye view into a three-dimensional image.
Optionally, the first terminal comprises a touch display screen, and the second terminal comprises a head-mounted electronic device; the first image is displayed on a touch display screen of the first terminal, and the second image is displayed on a virtual display screen of the second terminal.
Optionally, the fourth processing module 503 includes: the second rendering unit is used for rendering the first information to be displayed in real time by adopting a preset rendering engine according to a preset first display mode so as to generate a dynamic two-dimensional image; and a display unit for determining the dynamic two-dimensional image as a first image to display the first image at the first terminal.
Optionally, the apparatus further includes: the acquisition module is used for acquiring the attitude change information of the first terminal in space; the adjusting module is used for adjusting the operation point in the second user interface displayed at the specified position in the space by the second terminal according to the gesture change information, and determining a target object corresponding to the adjusted operation point in the second user interface; the sixth processing module is used for acquiring touch operation received by the first terminal and generating a touch instruction, wherein the touch operation is the operation of a user aiming at the first user interface, and the touch instruction is used for triggering and updating the second user interface; and the execution module is used for executing the touch instruction to generate an image rendering request.
Optionally, the apparatus further includes: and the seventh processing module is used for responding to the received preset function key calling instruction, acquiring the characteristic information of the preset function key and generating an image rendering request, wherein the preset function key calling instruction is used for controlling the preset function key of the first terminal to be displayed on the second user interface.
The apparatus for information display provided in this embodiment of the present disclosure can implement each process in the embodiments corresponding to the method for information display; to avoid repetition, details are not described here again.
It should be noted that the apparatus for information display provided in the embodiments of the present disclosure and the method for information display provided in the embodiments of the present disclosure are based on the same inventive concept, so for the implementation of these embodiments, reference may be made to the implementation of the foregoing method for information display; repeated description is omitted.
Corresponding to the method for information display provided in the foregoing embodiments and based on the same technical concept, an embodiment of the present disclosure further provides a device for information display, which is used to perform the foregoing method for information display. Fig. 6 is a schematic structural diagram of such a device. As shown in Fig. 6, the device may vary considerably in configuration or performance, and may include one or more processors 601 and a memory 602, where the memory 602 may store one or more applications or data. The memory 602 may be transient storage or persistent storage. An application program stored in the memory 602 may include one or more modules (not shown), and each module may include a series of computer-executable instructions for the electronic device. Further, the processor 601 may be configured to communicate with the memory 602 and execute, on the electronic device, the series of computer-executable instructions in the memory 602. The electronic device may also include one or more power supplies 603, one or more wired or wireless network interfaces 604, one or more input/output interfaces 605, and one or more keyboards 606.
In particular, in this embodiment, the device includes a processor, a communication interface, a memory, and a communication bus; the processor, the communication interface, and the memory communicate with one another through the bus; the memory is used for storing a computer program; and the processor is used for executing the program stored in the memory to implement the following method steps: acquiring an image rendering request in response to determining that the target application is in a running state; parsing the image rendering request to obtain information to be displayed; in response to determining that the information to be displayed contains first information to be displayed, rendering the first information to be displayed and generating a first image, so as to display the first image on a first user interface of the first terminal; and in response to determining that the information to be displayed contains second information to be displayed, rendering the second information to be displayed and generating a second image, so as to display the second image on a second user interface of the second terminal, where the first terminal controls the second user interface of the second terminal based on the first user interface.
According to the technical solution provided in the embodiments of the present disclosure, an image rendering request is first acquired in response to determining that the target application is in a running state, and the image rendering request is then parsed to obtain the information to be displayed. In response to determining that the information to be displayed contains first information to be displayed, the first information to be displayed is rendered to generate a first image, which is displayed on the first user interface of the first terminal. In response to determining that the information to be displayed contains second information to be displayed, the second information to be displayed is rendered to generate a second image, which is displayed on the second user interface of the second terminal, and the user can control the second user interface of the second terminal through the first user interface of the first terminal. During the cooperation of the first terminal and the second terminal, both the first image of the first terminal and the second image of the second terminal can be rendered and displayed through the same target application installed on the first terminal or the second terminal, which improves image rendering efficiency and is better suited to interaction between different terminals.
Further, corresponding to the method for information display provided in the foregoing embodiments, the present disclosure also provides a computer-readable storage medium storing a computer program. When the computer program is executed by the processor 601, the steps of the foregoing method for information display are implemented and the same technical effects can be achieved; to avoid repetition, details are not described here again. The computer-readable storage medium includes a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
It should be noted that, the embodiments of the storage medium in this specification and the embodiments of the method for displaying information in this specification are based on the same inventive concept, so that the specific implementation of this embodiment may refer to the implementation of the corresponding method for displaying information, and the repetition is omitted.
The foregoing describes specific embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the present specification may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present description can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present description is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the specification. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape/magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), general-purpose processors, controllers, micro-controllers, microprocessors, other electronic units for performing the functions described herein, or a combination thereof.
For a software implementation, the techniques described in embodiments of this specification may be implemented by means of modules (e.g., procedures, functions, and so on) that perform the functions described in embodiments of this specification. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
It should also be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that includes the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general hardware platform, or of course by hardware, although in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present specification may be embodied, in essence or in the part contributing to the prior art, in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc), including several instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods of the various embodiments of the present specification.
The embodiments of the present disclosure have been described above with reference to the accompanying drawings, but the present disclosure is not limited to the above-described embodiments, which are intended to be illustrative rather than limiting. Those skilled in the art can make various modifications and variations without departing from the spirit of the disclosure and the scope of the claims, and any modifications, equivalent substitutions, improvements, or the like made within the spirit and principles of the present disclosure shall be included within the scope of the claims of the present disclosure.

Claims (12)

1. A method for information display, comprising:
Acquiring attitude change information of a first terminal in space according to a plurality of degrees of freedom information of the first terminal in space and a reference datum point set by taking a user as a reference;
According to the attitude change information, adjusting an operation point in a second user interface displayed at a specified position in space by a second terminal, and determining a target object corresponding to the adjusted operation point in the second user interface;
Acquiring touch operation received by the first terminal, and generating or calling a touch instruction aiming at the target object in response to the touch operation, wherein the touch operation is the operation of a user aiming at a first user interface, and the touch instruction is used for triggering and updating the second user interface;
executing the touch instruction to generate an image rendering request;
Acquiring an image rendering request in response to determining that the target application program is in an operating state;
analyzing the image rendering request to obtain information to be displayed;
In response to determining that the information to be displayed contains first information to be displayed, rendering the first information to be displayed, and generating a first image, so that the first image is displayed on a first user interface of a first terminal, wherein a touch key is displayed on the first image;
In response to determining that the information to be displayed contains second information to be displayed, rendering the second information to be displayed, and generating a second image so as to display the second image on a second user interface of a second terminal;
Receiving touch operation of a user on a touch key displayed on a first user interface of the first terminal, and generating a control instruction to control display content of a second user interface of the second terminal;
the first user interface of the first terminal is used for controlling the second user interface of the second terminal; the first terminal comprises a touch display screen, and the second terminal comprises a head-mounted electronic device;
The target application is for rendering a first image displayed on a first user interface and a second image displayed on a second user interface.
2. The method of claim 1, wherein, prior to obtaining the image rendering request in response to determining that the target application is in a running state, the method further comprises:
and controlling the target application program to run in response to receiving the trigger information of the target application program.
3. The method of claim 1, wherein the rendering the first information to be displayed, generating a first image, comprises:
A preset rendering engine is adopted, and the first information to be displayed is rendered according to a preset first display mode, so that the first image is generated; and
The rendering the second information to be displayed to generate a second image includes:
and rendering the second information to be displayed according to a preset second display mode by adopting the preset rendering engine to generate the second image.
4. A method according to claim 3, wherein the first display mode comprises a two-dimensional display mode and the first image comprises a two-dimensional image;
the second display mode includes a three-dimensional display mode, and the second image includes a three-dimensional image.
5. The method of claim 4, wherein the rendering the second information to be displayed in a preset second display mode with the preset rendering engine to generate the second image includes:
Rendering the second information to be displayed according to the three-dimensional display mode by adopting the preset rendering engine to generate a left eye view and a right eye view;
and synthesizing the left eye view and the right eye view into the three-dimensional image.
6. The method of claim 1, wherein the first image is displayed on a touch display of the first terminal and the second image is displayed on a virtual display of the second terminal.
7. The method of claim 4, wherein the rendering the first information to be displayed in a preset first display mode using a preset rendering engine to generate the first image comprises:
A preset rendering engine is adopted, and the first information to be displayed is rendered in real time according to a preset first display mode, so that a dynamic two-dimensional image is generated;
And determining the dynamic two-dimensional image as the first image so as to display the first image on the first terminal.
8. The method of claim 1, wherein prior to the acquiring the image rendering request, the method further comprises:
And responding to receiving a preset function key calling instruction, acquiring characteristic information of the preset function key, and generating an image rendering request, wherein the preset function key calling instruction is used for controlling the preset function key of the first terminal to be displayed on the second user interface.
9. An apparatus for information display, comprising:
The acquisition module is used for acquiring attitude change information of the first terminal in the space according to the plurality of degrees of freedom information of the first terminal in the space and a reference point set by taking a user as a reference;
The adjusting module is used for adjusting the operation point in the second user interface displayed at the specified position in the space by the second terminal according to the attitude change information, and determining a target object corresponding to the adjusted operation point in the second user interface;
The sixth processing module is used for acquiring touch operation received by the first terminal, and generating or calling a touch instruction aiming at the target object in response to the touch operation, wherein the touch operation is an operation aiming at the first user interface by a user, and the touch instruction is used for triggering updating of the second user interface;
The execution module is used for executing the touch instruction to generate an image rendering request;
the first processing module is used for acquiring an image rendering request in response to determining that the target application program is in an operating state;
the second processing module is used for analyzing the image rendering request and acquiring information to be displayed;
The third processing module is used for responding to the fact that the information to be displayed contains first information to be displayed, rendering the first information to be displayed, generating a first image, displaying the first image on a first user interface of a first terminal, and displaying a touch key on the first image;
the fourth processing module is used for rendering the second information to be displayed in response to determining that the information to be displayed contains the second information to be displayed, and generating a second image so as to display the second image on a second user interface of a second terminal;
the sixth processing module is used for receiving touch operation of a user on a touch key displayed on the first user interface of the first terminal, and generating a control instruction to control display content of the second user interface of the second terminal;
the first user interface of the first terminal is used for controlling the second user interface of the second terminal; the first terminal comprises a touch display screen, and the second terminal comprises a head-mounted electronic device;
The target application is for rendering a first image displayed on a first user interface and a second image displayed on a second user interface.
10. An apparatus for information display, the apparatus comprising a head mounted display terminal and a mobile terminal, the display terminal being communicable with the mobile terminal, the mobile terminal comprising:
A processor, a communication interface, a memory, and a communication bus; the processor, the communication interface and the memory complete communication with each other through a bus; the memory is used for storing a computer program; the processor is configured to execute a program stored in the memory, and implement the method for displaying information according to any one of claims 1 to 8.
11. An apparatus for information display, the apparatus comprising a head mounted display terminal and a mobile terminal, the display terminal being communicable with the mobile terminal, the display terminal comprising:
A processor, a communication interface, a memory, and a communication bus; the processor, the communication interface and the memory complete communication with each other through a bus; the memory is used for storing a computer program; the processor is configured to execute a program stored in the memory, and implement the method for displaying information according to any one of claims 1 to 8.
12. A computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method for information display according to any of claims 1 to 8.
CN202110234891.7A 2021-03-03 2021-03-03 Method, apparatus, device and storage medium for information display Active CN112965773B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110234891.7A CN112965773B (en) 2021-03-03 2021-03-03 Method, apparatus, device and storage medium for information display


Publications (2)

Publication Number Publication Date
CN112965773A CN112965773A (en) 2021-06-15
CN112965773B true CN112965773B (en) 2024-05-28

Family

ID=76276307

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110234891.7A Active CN112965773B (en) 2021-03-03 2021-03-03 Method, apparatus, device and storage medium for information display

Country Status (1)

Country Link
CN (1) CN112965773B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113791495B (en) * 2021-08-27 2024-09-20 优奈柯恩(北京)科技有限公司 Method, apparatus, device and computer readable medium for displaying information
CN113723614B (en) * 2021-09-01 2024-08-20 北京百度网讯科技有限公司 Method, apparatus, device and medium for aided design of quantum circuits

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107071539A (en) * 2017-05-08 2017-08-18 深圳小辣椒虚拟现实技术有限责任公司 Information resources synchronous display method and system in terminal based on VR equipment
WO2018086295A1 (en) * 2016-11-08 2018-05-17 华为技术有限公司 Application interface display method and apparatus
CN109471603A (en) * 2017-09-07 2019-03-15 华为终端(东莞)有限公司 A kind of interface display method and device
CN110347305A (en) * 2019-05-30 2019-10-18 华为技术有限公司 A kind of VR multi-display method and electronic equipment
CN111399789A (en) * 2020-02-20 2020-07-10 华为技术有限公司 Interface layout method, device and system
CN111399630A (en) * 2019-01-03 2020-07-10 广东虚拟现实科技有限公司 Virtual content interaction method and device, terminal equipment and storage medium
US10802667B1 (en) * 2019-06-03 2020-10-13 Bank Of America Corporation Tactile response for user interaction with a three dimensional rendering
US10825245B1 (en) * 2019-06-03 2020-11-03 Bank Of America Corporation Three dimensional rendering for a mobile device
CN112351325A (en) * 2020-11-06 2021-02-09 惠州视维新技术有限公司 Gesture-based display terminal control method, terminal and readable storage medium
CN112383664A (en) * 2020-10-15 2021-02-19 华为技术有限公司 Equipment control method, first terminal equipment and second terminal equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101799294B1 (en) * 2013-05-10 2017-11-20 삼성전자주식회사 Display appratus and Method for controlling display apparatus thereof


Also Published As

Publication number Publication date
CN112965773A (en) 2021-06-15


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant