CN115617165A - Display control method, display control device, head-mounted display equipment and medium - Google Patents

Display control method, display control device, head-mounted display equipment and medium

Info

Publication number
CN115617165A
Authority
CN
China
Prior art keywords
display
virtual screen
virtual
screen
identification control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211204518.8A
Other languages
Chinese (zh)
Inventor
杨明明
王丹
张方方
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Techology Co Ltd
Original Assignee
Goertek Techology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Techology Co Ltd filed Critical Goertek Techology Co Ltd
Priority to CN202211204518.8A priority Critical patent/CN115617165A/en
Publication of CN115617165A publication Critical patent/CN115617165A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 Head-up displays
    • G02B 27/017 Head mounted
    • G02B 27/0172 Head mounted characterised by optical features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 Head-up displays
    • G02B 27/017 Head mounted
    • G02B 2027/0178 Eyeglass type

Abstract

The present disclosure provides a display control method, a display control apparatus, a head-mounted display device, and a medium. The display control method includes: using a first virtual screen among a plurality of virtual screens as the main display screen of the head-mounted display device and displaying an identification control of each virtual screen, wherein the plurality of virtual screens are arranged in a stack; determining whether display content of a second virtual screen among the plurality of virtual screens is updated; when the display content of any second virtual screen is updated, determining a target identification control corresponding to that second virtual screen; and displaying the target identification control based on a set display mode.

Description

Display control method, display control device, head-mounted display equipment and medium
Technical Field
The embodiment of the disclosure relates to the technical field of wearable devices, and more particularly, to a display control method, a display control device, a head-mounted display device, and a computer-readable storage medium.
Background
In the prior art, taking smart glasses such as AR glasses as an example of a head-mounted display device, a plurality of display screens can usually be placed on the Launcher (desktop launcher) of the AR glasses at the same time, and the application on each of these display screens runs as a foreground application.
Generally, the plurality of display screens overlap and cover one another in the display area of the AR glasses, so the user can see only one main display screen and cannot see the other display screens. If another display screen has an information update, there is no way to notify the user in time, and the user can only browse the screens one by one to check whether new messages exist, which results in a poor user experience.
Disclosure of Invention
It is an object of the embodiments of the present disclosure to provide a new technical solution for display control.
According to a first aspect of embodiments of the present disclosure, there is provided a display control method, the method including:
taking a first virtual screen in a plurality of virtual screens as a main display screen of the head-mounted display device, and displaying an identification control of each virtual screen; wherein the plurality of virtual screens are arranged in a stack;
determining whether display content of a second virtual screen of the plurality of virtual screens is updated;
under the condition that the display content in any second virtual screen is updated, determining a target identification control corresponding to that second virtual screen;
and displaying the target identification control based on a set display mode.
Optionally, determining whether the display content of any second virtual screen is updated includes:
acquiring a current first display data frame and a latest second display data frame of the arbitrary second virtual screen;
comparing the first display data frame with the second display data frame to obtain a comparison result;
and under the condition that the comparison result shows that the first display data frame and the second display data frame have difference, indicating that the display content of any second virtual screen is updated.
Optionally, the determining a target identification control corresponding to the arbitrary second virtual screen includes:
acquiring target attribute information of the arbitrary second virtual screen;
acquiring set mapping data; the mapping data reflects the corresponding relation between the attribute information and the identification control of different virtual screens;
and determining the target identification control according to the mapping data and the target attribute information.
Optionally, the displaying of the target identification control based on the set display mode includes:
displaying the target identification control based on a first color; and/or,
controlling the target identification control to shake; and/or,
controlling the target identification control to deform.
Optionally, the method further comprises:
and under the condition that a first virtual screen in the plurality of virtual screens is displayed, displaying a first identification control corresponding to the first virtual screen based on a second color.
Optionally, after the displaying of the target identification control based on the set display manner, the method further includes:
receiving a first input for the target identification control;
updating a display position of the arbitrary second virtual screen in response to the first input;
wherein said updating the display position of said arbitrary second virtual screen in response to said first input comprises:
interchanging the positions of the arbitrary second virtual screen and the first virtual screen; or,
overlaying the arbitrary second virtual screen on the first virtual screen.
Optionally, the method further comprises:
responding to the first input, and acquiring attribute information of an application running on the first virtual screen;
when the attribute information of the application is in a set application attribute list, pausing the playing of the target video based on the application;
wherein the set application attribute list includes attribute information of a plurality of different video applications.
According to a second aspect of the embodiments of the present disclosure, there is provided a display control apparatus including:
the display module is used for taking a first virtual screen in the multiple virtual screens as a main display screen of the head-mounted display equipment and displaying the identification control of each virtual screen; wherein the plurality of virtual screens are arranged in a stacked manner;
a determining module, configured to determine whether display content of a second virtual screen in the plurality of virtual screens is updated;
the determining module is used for determining, under the condition that the display content in any second virtual screen is updated, a target identification control corresponding to that second virtual screen;
and the display module is used for displaying the target identification control based on a set display mode.
According to a third aspect of embodiments of the present disclosure, there is provided a head mounted display device including:
a memory for storing executable computer instructions;
a processor for executing the display control method according to the first aspect above, according to the control of the executable computer instructions.
According to a fourth aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon computer instructions which, when executed by a processor, perform the display control method of the above first aspect.
A beneficial effect of the display method and the display apparatus is that, with the plurality of virtual screens arranged in a stack, a first virtual screen among the plurality of virtual screens can be used as the main display screen of the head-mounted display device and the identification control of each virtual screen can be displayed; meanwhile, when the display content of any second virtual screen among the plurality of virtual screens is updated, the target identification control corresponding to that second virtual screen can be determined and then displayed based on the set display mode. Therefore, whenever the display content of any second virtual screen is updated, the corresponding target identification control is displayed in the set display mode, so that the user can quickly switch to that second virtual screen based on the target identification control.
Other features of the present description and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the specification and together with the description, serve to explain the principles of the specification.
Fig. 1 is a hardware configuration schematic diagram of a head-mounted display device according to an embodiment of the present disclosure;
Fig. 2 is a schematic flow diagram of a display control method according to an embodiment of the disclosure;
Fig. 3 is a display schematic of a virtual screen according to an embodiment of the present disclosure;
Fig. 4 is a schematic diagram of a display control apparatus according to an embodiment of the present disclosure;
Fig. 5 is a schematic diagram of a head-mounted display device according to an embodiment of the present disclosure.
Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of parts and steps, numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the embodiments of the present disclosure unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
< hardware configuration >
Fig. 1 is a block diagram of a hardware configuration of a head mounted display apparatus 1000 according to an embodiment of the present disclosure.
As shown in fig. 1, the head-mounted display device 1000 may be smart glasses, which may be AR glasses, but may also be other devices, which is not limited in this disclosure.
In one embodiment, as shown in fig. 1, the head mounted display apparatus 1000 may include a processor 1100, a memory 1200, an interface device 1300, a communication device 1400, a display device 1500, an input device 1600, a speaker 1700, a microphone 1800, and the like.
The processor 1100 may include, but is not limited to, a central processing unit CPU, a microprocessor MCU, and the like. The memory 1200 includes, for example, a ROM (read only memory), a RAM (random access memory), a nonvolatile memory such as a hard disk, and the like. The interface device 1300 includes, for example, various bus interfaces such as a serial bus interface (including a USB interface), a parallel bus interface, and the like. Communication device 1400 is capable of wired or wireless communication, for example. The display device 1500 is, for example, a liquid crystal display, an LED display, an OLED (Organic Light-Emitting Diode) display, or the like. The input device 1600 includes, for example, a touch screen, a keyboard, a handle, and the like. The head mounted display device 1000 may output audio information through the speaker 1700 and may collect audio information through the microphone 1800.
It should be understood by those skilled in the art that although a plurality of apparatuses of the head mounted display apparatus 1000 are illustrated in fig. 1, the head mounted display apparatus 1000 of the embodiments of the present specification may only refer to some of the apparatuses, and may also include other apparatuses, which are not limited herein.
In this embodiment, the memory 1200 of the head mounted display device 1000 is configured to store instructions for controlling the processor 1100 to operate to implement or support the implementation of a display control method according to any of the embodiments. The skilled person can design the instructions according to the solution disclosed in the present specification. How the instructions control the operation of the processor is well known in the art and will not be described in detail herein.
The head mounted display device shown in fig. 1 is merely illustrative and is in no way intended to limit the present disclosure, its application, or uses.
Various embodiments and examples according to the present disclosure are described below with reference to the drawings.
< method examples >
Fig. 2 illustrates a display control method according to an embodiment of the disclosure. The display control method may be implemented by the head-mounted display device alone, jointly by the head-mounted display device and a control device independent of it, or jointly by the head-mounted display device and a cloud server.
As shown in fig. 2, the display control method of the embodiment may include steps S2100 to S2400:
step S2100, a first virtual screen in the plurality of virtual screens is used as a main display screen of the head-mounted display device, and an identification control of each virtual screen is displayed; wherein the plurality of virtual screens are arranged in a stacked manner.
The first virtual screen is a display screen that can be visually seen by a wearer of the head-mounted display device, and the first virtual screen that can be visually seen by the user is usually used as a main display screen of the head-mounted display device. Taking the head-mounted display device as an intelligent glasses such as AR glasses as an example, referring to fig. 3, three virtual screens, namely a virtual screen 1, a virtual screen 2 and a virtual screen 3, are simultaneously placed on a Launcher of the AR glasses, and the three virtual screens are stacked on the Launcher, that is, only one main display screen can be seen at the same time for a wearer of the AR glasses. Illustratively, the virtual screen 1 is a first virtual screen that is a display screen visually seen by the wearer of the AR glasses, where the virtual screen 1 may be considered as a main display screen, and the virtual screen 2 and the virtual screen 3 are not visible to the wearer of the AR glasses.
The identification control can be a corner mark control, the identification control can distinguish different virtual screens, and the main display screen can be switched through the identification control. Referring to fig. 3, a virtual screen 1 is used as a main display screen of the AR glasses, and the main display screen displays a corner mark control 1 of the virtual screen 1, a corner mark control 2 of the virtual screen 2, and a corner mark control 3 of the virtual screen 3, where one corner mark control uniquely identifies one virtual screen. Here, the virtual screen 1 can be switched as the main display screen by the corner mark control 1, the virtual screen 2 can be switched as the main display screen by the corner mark control 2, and the virtual screen 3 can be switched as the main display screen by the corner mark control 3. It will be appreciated that the corner mark control may be placed elsewhere on the main display screen, for example, below the main display screen.
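For illustration only, the following Kotlin sketch (not part of the original disclosure; all type and function names are hypothetical) models the relation described above between stacked virtual screens and their corner mark controls, with one control uniquely identifying one screen and a click on a control switching that screen to be the main display screen.

```kotlin
// Hypothetical sketch only; these types do not come from the original disclosure.
data class VirtualScreen(
    val id: Int,                 // unique id assigned when the screen is created
    var isMainDisplay: Boolean   // true for the screen the wearer currently sees
)

data class CornerMarkControl(
    val screenId: Int            // uniquely identifies one virtual screen
)

// One corner mark control per virtual screen; clicking a control switches the
// corresponding virtual screen to be the main display screen.
class ScreenStack(private val screens: MutableList<VirtualScreen>) {
    fun switchMainDisplay(control: CornerMarkControl) {
        screens.forEach { it.isMainDisplay = (it.id == control.screenId) }
    }
}
```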
Subsequently, the process proceeds to step S2200, where it is determined whether or not the display content of the second virtual screen among the plurality of virtual screens is updated.
The second virtual screens are the virtual screens other than the main display screen. Referring to fig. 3, the virtual screen 1 is the first virtual screen, i.e. the display screen visually seen by the wearer of the AR glasses, and serves as the main display screen. Meanwhile, since the virtual screen 2 and the virtual screen 3 are displayed in a layered manner below the virtual screen 1, both the virtual screen 2 and the virtual screen 3 can serve as second virtual screens.
In one embodiment, determining whether the display content of any second virtual screen is updated may further include the following steps S3100 to S3300:
step S3100, acquiring a current first display data frame and a latest second display data frame of the arbitrary second virtual screen.
Optionally, any second virtual screen may be subjected to screen capturing according to the set screen capturing period, and a screen capturing picture of the current screen capturing period is obtained. Referring to fig. 3, the virtual screen 1 is used as a main display screen, the virtual screen 2 and the virtual screen 3 are both second virtual screens, and taking the second virtual screen as the virtual screen 2 as an example, the virtual screen 2 may be captured based on a set capture period, so as to obtain a current capture picture as a first display data frame, and obtain a latest capture picture as a second display data frame.
Alternatively, the display data frame of any second virtual screen may be captured with a data frame capture tool, such as an ImageReader tool. Of course, the display data frame of any second virtual screen may also be obtained in other manners, which is not limited in this embodiment.
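As a rough sketch of how such a capture could be wired up on an Android-based device (an assumption; the disclosure only names an ImageReader tool), a second virtual screen can be rendered into the surface of an ImageReader so that each acquired image is one display data frame. The resolution, density and flag values below are illustrative, not from the disclosure.

```kotlin
import android.graphics.PixelFormat
import android.hardware.display.DisplayManager
import android.media.ImageReader

// Illustrative only: render a second virtual screen into an ImageReader surface
// so its display data frames can be read back for comparison.
fun attachFrameReader(displayManager: DisplayManager, screenName: String): ImageReader {
    val reader = ImageReader.newInstance(1280, 720, PixelFormat.RGBA_8888, 2)
    displayManager.createVirtualDisplay(
        screenName,                 // hypothetical name of the second virtual screen
        1280, 720, 320,             // width, height, densityDpi (assumed values)
        reader.surface,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY
    )
    return reader
}
```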
Step S3200, comparing the first display data frame and the second display data frame to obtain a comparison result.
In step S3200, each pixel point of the first display data frame and the second display data frame may be compared. Continuing with the above example, each pixel point of the first display data frame and the second display data frame of the virtual screen 2 is compared.
Step S3300, when the comparison result indicates that there is a difference between the first display data frame and the second display data frame, it indicates that the display content of the arbitrary second virtual screen is updated.
Continuing with the above example, in the case that there is a difference between the pixels of the first display data frame and the second display data frame of the virtual screen 2, it indicates that the display content of the virtual screen 2 is updated.
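A minimal sketch of the pixel-level comparison in steps S3200 and S3300 follows, assuming both display data frames have already been decoded into equally formatted pixel arrays (the decoding step is omitted).

```kotlin
// Returns true when the two display data frames differ, i.e. the display
// content of the second virtual screen has been updated.
fun hasContentUpdate(firstFrame: IntArray, secondFrame: IntArray): Boolean {
    if (firstFrame.size != secondFrame.size) return true  // a size change also counts as an update
    for (i in firstFrame.indices) {
        if (firstFrame[i] != secondFrame[i]) return true  // any differing pixel is an update
    }
    return false
}
```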
Subsequently, the process proceeds to step S2300, and when the display content in any second virtual screen is updated, the target identification control corresponding to that second virtual screen is determined.
In an embodiment, the determining of the target identification control corresponding to the arbitrary second virtual screen in step S2300 may further include the following steps S2310 to S2330:
step S2310, acquiring target attribute information of the arbitrary second virtual screen.
The target attribute information may be id information of the arbitrary second virtual screen, which may uniquely identify the arbitrary second virtual screen. Continuing with the above example, the id information of the virtual screen 2 may be obtained.
Step S2320, acquires the set mapping data.
The mapping data reflects the corresponding relationship between the attribute information and the identification control of different virtual screens. Illustratively, the set mapping data may include three corresponding relationships, one corresponding relationship may be the id information of the virtual screen 1 and the identification control of the virtual screen 1, one corresponding relationship may be the id information of the virtual screen 2 and the identification control of the virtual screen 2, and one corresponding relationship may be the id information of the virtual screen 3 and the identification control of the virtual screen 3.
Further, the mapping data may be generated before step S2320. For example, when virtual screens are created on the Launcher of the AR glasses, each created virtual screen is usually assigned id information and a corresponding identification control, and the above mapping data can be obtained from the id information assigned to each created virtual screen and the identification control of that virtual screen.
And step S2330, determining the target identification control according to the mapping data and the target attribute information.
Continuing with the above example, the identification control corresponding to the id information of the virtual screen 2 may be searched from the set mapping data based on the id information of the virtual screen 2.
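The mapping data described in steps S2310 to S2330 can be pictured as a simple map from screen id (the target attribute information) to identification control. The sketch below reuses the hypothetical CornerMarkControl type from the earlier sketch and is not taken from the disclosure.

```kotlin
// Hypothetical registry of the set mapping data: screen id -> identification control.
class IdentificationControlRegistry {
    private val mapping = mutableMapOf<Int, CornerMarkControl>()

    // Called when a virtual screen is created on the Launcher and assigned its
    // id information and identification control.
    fun register(screenId: Int, control: CornerMarkControl) {
        mapping[screenId] = control
    }

    // Determine the target identification control from the mapping data and the
    // target attribute information (the id of the updated second virtual screen).
    fun targetControlFor(screenId: Int): CornerMarkControl? = mapping[screenId]
}
```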
Subsequently, step S2400 is performed, in which the target identifier control is displayed based on the set display manner.
In an embodiment, the displaying of the target identification control based on the set display manner in step S2400 may further include:
displaying the target identification control based on a first color; and/or,
controlling the target identification control to shake; and/or,
controlling the target identification control to deform.
Continuing with the above example, in the case that the display content of the virtual screen 2 is updated, referring to fig. 3, the corner mark control of the virtual screen 2 may be marked in red (shown as thick black in the figure), controlled to shake, and controlled to deform, so as to prompt the wearer that the display content of the virtual screen 2 has been updated and that the virtual screen 2 can be switched to the main display screen for viewing.
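For illustration, assuming the identification control is rendered as an Android View (an assumption; the disclosure does not fix the rendering layer), the set display mode could be realized roughly as follows; the color, duration and amplitude values are placeholders.

```kotlin
import android.animation.ObjectAnimator
import android.graphics.Color
import android.view.View

// Illustrative sketch of the set display mode: mark the target identification
// control in a first color, shake it, and deform it.
fun highlightTargetControl(controlView: View) {
    controlView.setBackgroundColor(Color.RED)           // first color (e.g. red)

    ObjectAnimator.ofFloat(controlView, "translationX", 0f, 10f, -10f, 0f).apply {
        duration = 300
        repeatCount = 3                                  // shake
        start()
    }

    ObjectAnimator.ofFloat(controlView, "scaleX", 1f, 1.2f, 1f).apply {
        duration = 300                                   // deform (brief scale pulse)
        start()
    }
}
```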
According to the embodiments of the present disclosure, with a plurality of virtual screens arranged in a stack, a first virtual screen among the plurality of virtual screens can be used as the main display screen of the head-mounted display device to display the identification control of each virtual screen; meanwhile, when the display content of a second virtual screen among the plurality of virtual screens is updated, the target identification control corresponding to that second virtual screen is determined and then displayed based on the set display mode. Therefore, whenever the display content of any second virtual screen is updated, the corresponding target identification control is displayed in the set display mode, so that the user can quickly switch to that second virtual screen.
In one embodiment, the display control method of the embodiment of the present disclosure may further include: and under the condition that a first virtual screen in the plurality of virtual screens is displayed, displaying a first identification control corresponding to the first virtual screen based on a second color.
Referring to fig. 3, when the virtual screen 1 is displayed as the main display screen, the corner mark control of the virtual screen 1 may be marked as green (not shown in the figure) to prompt the wearer that the virtual screen 1 is the main display screen.
In an embodiment, after the step S2400 is executed to display the target identifier control based on the set display manner, the display control method according to the embodiment of the present disclosure further includes the following steps S4100 to S4400:
step S4100, receiving a first input for the target identification control.
Alternatively, the first input may be a touch input to a target identification control.
Optionally, the first input may also be a ray event sent by the interactive device for the target identification control, where the interactive device may be a handle, a mouse, a mobile phone, or another device.
Optionally, the first input may also be a gesture event of the user, i.e. the wearer, for the target identification control.
Step S4200, updating the display position of the arbitrary second virtual screen in response to the first input.
Optionally, in step S4200, updating the display position of the arbitrary second virtual screen in response to the first input includes: interchanging the positions of the arbitrary second virtual screen and the first virtual screen; or, overlaying the arbitrary second virtual screen on the first virtual screen.
Continuing with the above example, the user may interchange the positions of the virtual screen 2 and the virtual screen 1, or directly overlay the virtual screen 2 on the virtual screen 1. At this time, the virtual screen 2 serves as the main display screen, so that the wearer can process new messages on the virtual screen 2 in time.
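Continuing the hypothetical sketch from earlier (again, not part of the disclosure), updating the display position in step S4200 amounts to either swapping the two screens in the stack or moving the updated second screen to the top, taking index 0 as the main display screen.

```kotlin
// Swap the second virtual screen with the first virtual screen, or overlay it
// on top; index 0 of the stack is taken to be the main display screen.
fun updateDisplayPosition(screens: MutableList<VirtualScreen>, secondScreenId: Int, swap: Boolean) {
    val secondIndex = screens.indexOfFirst { it.id == secondScreenId }
    if (secondIndex <= 0) return                        // not found, or already on top
    if (swap) {
        val top = screens[0]                            // interchange the two positions
        screens[0] = screens[secondIndex]
        screens[secondIndex] = top
    } else {
        val second = screens.removeAt(secondIndex)      // overlay on the first virtual screen
        screens.add(0, second)
    }
    screens.forEachIndexed { i, s -> s.isMainDisplay = (i == 0) }
}
```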
In one embodiment, the display control method according to the embodiment of the present disclosure may further include the following steps S5100 to S5200:
step S5100, in response to the first input, obtains attribute information of an application run by the first virtual screen.
The attribute information of the application includes, for example, but not limited to, the name of the application, the type of the application.
In this embodiment, after the above step S4100 is performed to receive the first input for the target identification control, according to the step S5100, in response to the first input, the attribute information of the application running on the first virtual screen may also be obtained. Continuing with the above example, for example, the application run by the virtual screen 1 is application 1, and the application 1 is a video-class application.
Step S5200, in a case where the attribute information of the application is in the set application attribute list, pausing the playing of the target video based on the application;
the set application attribute list includes attribute information of a plurality of different video applications, and it can be understood that a video application may be a game application. Illustratively, the set application attribute list includes application 1, application 2, and application 3.
Continuing with the above example, since application 1 is in the set application attribute list, when the user switches the virtual screen 2 to be the main display screen, the playing of the target video by application 1 running on the virtual screen 1 is paused, so that when the user next switches back to the virtual screen 1, the target video can resume directly from the paused position, thereby ensuring the playback continuity of the target video.
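A sketch of steps S5100 and S5200 follows; the application attribute list is assumed to hold package names, and the player interface is a stand-in for whatever playback component the video application exposes (both are assumptions, not details from the disclosure).

```kotlin
// Hypothetical types: attribute information of an application and a pausable player.
data class AppAttributes(val packageName: String, val type: String)

interface VideoPlayer { fun pause() }

// Pause the target video only when the application running on the first virtual
// screen is in the set application attribute list.
fun maybePauseOnSwitch(runningApp: AppAttributes, videoAppList: Set<String>, player: VideoPlayer) {
    if (runningApp.packageName in videoAppList) {
        player.pause()   // playback can later resume from the paused position
    }
}
```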
According to the embodiments of the present disclosure, when a user is watching a video and another virtual screen receives a new message, switching to that virtual screen to handle the message automatically pauses the video, and the target video can resume directly from the paused position when the user switches back, which ensures the playback continuity of the target video and improves the user experience.
< example >
Taking the head-mounted display device as AR glasses as an example, an example of a display control method is shown next, and referring to fig. 3, the display control method may include the following steps:
step S701, creating a virtual screen 1, a virtual screen 2, and a virtual screen 3 on an AR Launcher, and assigning id information and an identification control to each virtual screen to establish mapping data.
Step S702, using the virtual screen 1 as a main display screen, displaying the identifier control 1 of the virtual screen 1, and displaying the identifier control 2 of the virtual screen 2 and the identifier control 3 of the virtual screen 3.
Step S703 is to capture the screen of the virtual screen 2 and the virtual screen 3 according to the set screen capture period, and to obtain the screen capture picture of the current screen capture period.
Step S704, under the condition that the current display data frame and the latest display data frame of the virtual screen 2 are different, the corner mark control of the virtual screen 2 is marked red, or the corner mark of the virtual screen 2 is controlled to shake, or the corner mark control of the virtual screen 2 is controlled to deform.
Step S705, the user clicks the corner mark control of the virtual screen 2, and the positions of the virtual screen 1 and the virtual screen 2 are exchanged so that the virtual screen 2 becomes the main display screen. After the virtual screen 2 becomes the main display screen, the identification control of the virtual screen 2 is marked green to indicate that the virtual screen 2 is now the main display screen, and the identification control of the virtual screen 1 is changed from green back to the unmarked state.
In step S706, in the case where the attribute information of the application running on the virtual screen 1 is in the set application attribute list, the playing of the target video based on the application is suspended.
< apparatus embodiment >
Fig. 4 is a schematic diagram of a display control apparatus according to an embodiment, and referring to fig. 4, the apparatus 400 includes a display module 410 and a determination module 420.
A display module 410, configured to use a first virtual screen in the multiple virtual screens as a main display screen of the head-mounted display device, and display an identification control of each of the virtual screens; wherein the plurality of virtual screens are arranged in a stack;
a determining module 420, configured to determine whether display content of a second virtual screen in the plurality of virtual screens is updated;
the determining module 420 is configured to determine, when display content in any second virtual screen is updated, a target identification control corresponding to that second virtual screen;
the display module 410 is configured to display the target identifier control based on a set display manner.
In an embodiment, the determining module 420 is specifically configured to: acquire a current first display data frame and a latest second display data frame of any second virtual screen; compare the first display data frame with the second display data frame to obtain a comparison result; and, in a case where the comparison result indicates that the first display data frame and the second display data frame differ, determine that the display content of that second virtual screen is updated.
In an embodiment, the determining module 420 is specifically configured to: acquire target attribute information of the arbitrary second virtual screen; acquire set mapping data, the mapping data reflecting the correspondence between the attribute information and the identification controls of different virtual screens; and determine the target identification control according to the mapping data and the target attribute information.
In one embodiment, the display module 410 is specifically configured to: display the target identification control based on a first color; and/or control the target identification control to shake; and/or control the target identification control to deform.
In one embodiment, the display module 410 is further configured to: when a first virtual screen among the plurality of virtual screens is displayed, display a first identification control corresponding to the first virtual screen based on a second color.
In one embodiment, the apparatus 400 further comprises a receiving module and an updating module (not shown in the figures).
A receiving module for receiving a first input for the target identification control;
an update module for updating a display position of the arbitrary second virtual screen in response to the first input;
the updating module is specifically configured to interchange positions of the arbitrary second virtual screen and the first virtual screen; or, overlaying the arbitrary second virtual screen on the first virtual screen.
In one embodiment, the apparatus 400 further includes an acquisition module and a pause module (not shown).
The acquisition module is used for acquiring, in response to the first input, attribute information of the application running on the first virtual screen;
the pause module is used for pausing the playing of the target video based on the application under the condition that the attribute information of the application is in the set application attribute list;
wherein the set application attribute list includes attribute information of a plurality of different video applications.
According to the embodiments of the present disclosure, with a plurality of virtual screens arranged in a stack, a first virtual screen among the plurality of virtual screens can be used as the main display screen of the head-mounted display device to display the identification control of each virtual screen; meanwhile, when the display content of a second virtual screen among the plurality of virtual screens is updated, the target identification control corresponding to that second virtual screen is determined and then displayed based on the set display mode. Therefore, whenever the display content of any second virtual screen is updated, the corresponding target identification control is displayed in the set display mode, and the user can quickly switch to that second virtual screen.
< apparatus embodiment >
Fig. 5 is a hardware configuration diagram of a head-mounted display device according to an embodiment. As shown in fig. 5, the head mounted display device 500 includes a processor 510 and a memory 520.
The memory 520 may be used to store executable computer instructions.
The processor 510 may be configured to execute the display control method according to the method embodiments of the present disclosure, according to the control of the executable computer instructions.
The head-mounted display device 500 may be the head-mounted display device 1000 shown in fig. 1, or may be a device having another hardware structure, which is not limited herein.
In further embodiments, the head mounted display apparatus 500 may include the above display control device 400.
In one embodiment, the above modules of the display control apparatus 400 may be implemented by the processor 510 executing computer instructions stored in the memory 520.
< computer-readable storage Medium >
The embodiment of the present disclosure also provides a computer-readable storage medium, on which computer instructions are stored, and when the computer instructions are executed by a processor, the display control method provided by the embodiment of the present disclosure is executed.
The present disclosure may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the disclosure are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), with state information of computer-readable program instructions, which can execute the computer-readable program instructions.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are equivalent.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the present disclosure is defined by the appended claims.

Claims (10)

1. A display control method, characterized in that the method comprises:
taking a first virtual screen in a plurality of virtual screens as a main display screen of the head-mounted display equipment, and displaying an identification control of each virtual screen; wherein the plurality of virtual screens are arranged in a stacked manner;
determining whether display content of a second virtual screen of the plurality of virtual screens is updated;
under the condition that the display content in any second virtual screen is updated, determining a target identification control corresponding to that second virtual screen;
and displaying the target identification control based on a set display mode.
2. The method of claim 1, wherein determining whether the display content of any second virtual screen is updated comprises:
acquiring a current first display data frame and a latest second display data frame of any second virtual screen;
comparing the first display data frame with the second display data frame to obtain a comparison result;
and under the condition that the comparison result shows that the first display data frame and the second display data frame have difference, indicating that the display content of any second virtual screen is updated.
3. The method of claim 1, wherein determining the target identification control corresponding to the arbitrary second virtual screen comprises:
acquiring target attribute information of the arbitrary second virtual screen;
acquiring set mapping data; the mapping data reflects the corresponding relation between the attribute information and the identification control of different virtual screens;
and determining the target identification control according to the mapping data and the target attribute information.
4. The method of claim 1, wherein the displaying of the target identification control based on the set display mode comprises:
displaying the target identification control based on a first color; and/or,
controlling the target identification control to shake; and/or,
controlling the target identification control to deform.
5. The method of claim 1, further comprising:
and under the condition that a first virtual screen in the plurality of virtual screens is displayed, displaying a first identification control corresponding to the first virtual screen based on a second color.
6. The method of claim 1, wherein after the displaying of the target identification control based on the set display mode, the method further comprises:
receiving a first input for the target identification control;
updating a display position of the arbitrary second virtual screen in response to the first input;
wherein said updating the display position of said arbitrary second virtual screen in response to said first input comprises:
interchanging the positions of the arbitrary second virtual screen and the first virtual screen; or,
overlaying the arbitrary second virtual screen on the first virtual screen.
7. The method of claim 6, further comprising:
responding to the first input, and acquiring attribute information of an application running on the first virtual screen;
when the attribute information of the application is in a set application attribute list, pausing the playing of the target video based on the application;
wherein the set application attribute list includes attribute information of a plurality of different video applications.
8. A display control apparatus, characterized in that the apparatus comprises:
the display module is used for taking a first virtual screen in the multiple virtual screens as a main display screen of the head-mounted display equipment and displaying the identification control of each virtual screen; wherein the plurality of virtual screens are arranged in a stack;
a determining module, configured to determine whether display content of a second virtual screen in the plurality of virtual screens is updated;
the determining module is used for determining, under the condition that the display content in any second virtual screen is updated, a target identification control corresponding to that second virtual screen;
and the display module is used for displaying the target identification control based on a set display mode.
9. A head-mounted display device, comprising:
a memory for storing executable computer instructions;
a processor for executing the display control method according to any one of claims 1 to 7, according to the control of the executable computer instructions.
10. A computer-readable storage medium having stored thereon computer instructions which, when executed by a processor, perform the display control method of any one of claims 1-7.
CN202211204518.8A 2022-09-29 2022-09-29 Display control method, display control device, head-mounted display equipment and medium Pending CN115617165A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211204518.8A CN115617165A (en) 2022-09-29 2022-09-29 Display control method, display control device, head-mounted display equipment and medium

Publications (1)

Publication Number Publication Date
CN115617165A 2023-01-17

Family

ID=84860021

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211204518.8A Pending CN115617165A (en) 2022-09-29 2022-09-29 Display control method, display control device, head-mounted display equipment and medium

Country Status (1)

Country Link
CN (1) CN115617165A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination