CN117572960A - Control method, control device, control equipment and medium - Google Patents


Info

Publication number
CN117572960A
Authority
CN
China
Prior art keywords
container
head
mounted display
display device
virtual scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311431486.XA
Other languages
Chinese (zh)
Inventor
骆俊谕
林大鹏
张超
赵冠博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Techology Co Ltd
Original Assignee
Goertek Techology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Techology Co Ltd filed Critical Goertek Techology Co Ltd
Priority to CN202311431486.XA priority Critical patent/CN117572960A/en
Publication of CN117572960A publication Critical patent/CN117572960A/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/012 Head tracking input arrangements
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/455 Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G06F 9/45533 Hypervisors; Virtual machine monitors
    • G06F 9/45558 Hypervisor-specific management and integration aspects
    • G06F 2009/45562 Creating, deleting, cloning virtual machine instances
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Embodiments of the present disclosure disclose a control method, a control apparatus, a control device, and a medium. The method includes: creating, in a virtual scene running in a first container, a first camera group corresponding to a first head-mounted display device; controlling, according to degree-of-freedom information of the first head-mounted display device, the first camera group to capture an image of the virtual scene running in the first container, and sending the image to the first head-mounted display device; creating, in a virtual scene running in a second container, a second camera group corresponding to a second head-mounted display device, where the second container and the first container are the same container or different containers; and controlling, according to degree-of-freedom information of the second head-mounted display device, the second camera group to capture an image of the virtual scene running in the second container, and sending the image to the second head-mounted display device.

Description

Control method, control device, control equipment and medium
Technical Field
Embodiments of the present disclosure relate to the technical field of electronic devices, and more particularly to a control method, a control apparatus, a control device, and a computer-readable storage medium.
Background
The split-type design is an important trend in AR development: to reduce the power consumption of split-type AR glasses, the glasses are connected to a control device, and images rendered by the control device are transmitted to the glasses for display. In the related art, however, when the control device is connected to several different pairs of split-type AR glasses, the problem arises of how to enable the control device to serve these different glasses well.
Disclosure of Invention
Embodiments of the present disclosure aim to provide a control method, a control apparatus, a control device, and a medium.
According to a first aspect of embodiments of the present disclosure, there is provided a control method, including:
creating, in a virtual scene running in a first container, a first camera group corresponding to a first head-mounted display device;
controlling, according to degree-of-freedom information of the first head-mounted display device, the first camera group to capture an image of the virtual scene running in the first container, and sending the image to the first head-mounted display device;
creating, in a virtual scene running in a second container, a second camera group corresponding to a second head-mounted display device; wherein the second container and the first container are the same container or different containers;
and controlling, according to degree-of-freedom information of the second head-mounted display device, the second camera group to capture an image of the virtual scene running in the second container, and sending the image to the second head-mounted display device.
Optionally, the method further comprises:
receiving a first start request for starting the first head-mounted display device;
in response to the first start request, creating the first container and starting a first application in the first container;
and constructing a virtual scene in the first container when the first application is in a started state.
Optionally, the second container and the first container are the same container,
the creating of the second camera group corresponding to the second head-mounted display device in the virtual scene running in the second container includes:
receiving a second start request for starting the second head-mounted display device;
and, in response to the second start request, creating the second camera group corresponding to the second head-mounted display device in the virtual scene running in the first container.
Optionally, the second container and the first container are different containers,
the method further comprises the steps of:
receiving a second start request for starting the second head-mounted display device;
in response to the second start request, creating the second container and starting a second application in the second container;
and constructing a virtual scene in the second container when the second application is in a started state.
Optionally, the first container and the second container are docker containers.
According to a second aspect of embodiments of the present disclosure, there is provided a control apparatus comprising:
a first creating module, configured to create, in a virtual scene running in a first container, a first camera group corresponding to a first head-mounted display device;
a first control module, configured to control, according to degree-of-freedom information of the first head-mounted display device, the first camera group to capture an image of the virtual scene running in the first container and send the image to the first head-mounted display device;
a second creating module, configured to create, in a virtual scene running in a second container, a second camera group corresponding to a second head-mounted display device; wherein the second container and the first container are the same container or different containers;
and a second control module, configured to control, according to degree-of-freedom information of the second head-mounted display device, the second camera group to capture an image of the virtual scene running in the second container and send the image to the second head-mounted display device.
Optionally, the apparatus further comprises a first receiving module and a third creating module,
the first receiving module being configured to receive a first start request for starting the first head-mounted display device;
the third creating module being configured to: in response to the first start request, create the first container and start a first application in the first container; and,
construct a virtual scene in the first container when the first application is in a started state.
Optionally, the second container and the first container are the same container, and the second creating module comprises a receiving unit and a creating unit,
the receiving unit being configured to receive a second start request for starting the second head-mounted display device;
the creating unit being configured to, in response to the second start request, create the second camera group corresponding to the second head-mounted display device in the virtual scene running in the first container.
According to a third aspect of the embodiments of the present disclosure, there is provided a control apparatus including:
a memory for storing executable computer instructions;
a processor configured to execute, under control of the executable computer instructions, the control method according to the first aspect above.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer instructions which, when executed by a processor, perform the control method of the first aspect above.
The control device creates, in a virtual scene running in a first container, a first camera group corresponding to a first head-mounted display device, and controls the first camera group, according to degree-of-freedom information of the first head-mounted display device, to capture an image of that virtual scene and send it to the first head-mounted display device; likewise, it creates, in a virtual scene running in a second container, a second camera group corresponding to a second head-mounted display device, and controls the second camera group, according to degree-of-freedom information of the second head-mounted display device, to capture an image of that virtual scene and send it to the second head-mounted display device. Because the second container and the first container can be the same container or different containers, the control device can let different head-mounted display devices coexist in a virtual scene running in one container, or place them in virtual scenes running in separate containers. The control device can therefore serve different head-mounted display devices well: they can share the same virtual scene, or each can have an independent virtual scene.
Other features of the present specification and its advantages will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the specification and together with the description, serve to explain the principles of the specification.
FIG. 1 is a schematic diagram of a hardware configuration of a control system according to an embodiment of the present disclosure;
FIG. 2 is a flow diagram of a control method according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of interactions between a control device and AR glasses according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of interactions between a control device and AR glasses according to another embodiment of the present disclosure;
FIG. 5 is a functional block diagram of a control device according to an embodiment of the present disclosure;
fig. 6 is a functional block diagram of a control device according to an embodiment of the present disclosure.
Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of parts and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the embodiments of the present disclosure unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of exemplary embodiments may have different values.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
< hardware configuration >
Fig. 1 is a schematic diagram of a hardware configuration of a control system that may be used to implement the control method of one embodiment. Fig. 1 shows a head-mounted display device 100, a control device 200, and a network 300. The head-mounted display device 100 may be connected to the network 300 and may also be connected to the control device 200 by a communication means such as Bluetooth. In one embodiment, the head-mounted display devices 100 are each connected to the control device 200 only by a communication means such as Bluetooth. A plurality of servers 301, 302 may be provided in the network 300. The network 300 may be a wireless or a wired communication network, a local area network or a wide area network, and may use near-field or far-field communication.
In one embodiment, as shown in fig. 1, a head mounted display device 100 may include a processor 101 and a memory 102. The head mounted display apparatus 100 further comprises communication means 103, display means 104, user interface 105, camera means 106, audio/video interface 107, and sensor 108 etc. In addition, the head-mounted display device 100 may further include a power management chip 109, a battery 110, and the like.
The processor 101 may be any of various processors. The memory 102 may store the underlying software, system software, application software, data, etc. required for the operation of the head-mounted display device 100, and may include various forms of memory such as ROM, RAM, and flash. The communication device 103 may include, for example, a WiFi communication device, a Bluetooth communication device, 3G, 4G, and 5G communication devices, and the like; the head-mounted display device 100 may access a network through the communication device 103. The display device 104 may be a liquid crystal display, an OLED display, or the like. In one example, the display device 104 may be a touch screen, through which the user may perform input operations and, for example, fingerprint identification. The user interface 105 may include a USB interface, a Lightning interface, a keyboard, etc. The camera 106 may be a single camera or multiple cameras. The audio/video interface 107 may include, for example, a speaker interface, a microphone interface, and a video transmission interface such as HDMI. The sensor 108 may include, for example, a gyroscope, an accelerometer, a temperature sensor, a humidity sensor, a pressure sensor, and the like; for example, attitude information of the head-mounted display device can be determined by the sensor. The power management chip 109 may be used to manage the power input to the head-mounted display device 100 and to manage the battery 110 for better utilization efficiency. The battery 110 is, for example, a lithium-ion battery.
The head-mounted display device 100 may be AR (Augmented Reality) glasses, MR (Mixed Reality) glasses, or the like; the AR glasses may be split-type AR glasses and the MR glasses may be split-type MR glasses, and the embodiments of the present disclosure are not limited thereto. The various components shown in fig. 1 are merely illustrative: the head-mounted display device 100 may include one or more of the components shown in fig. 1 and need not include all of them. The head-mounted display device 100 shown in fig. 1 is merely illustrative and is in no way intended to limit the embodiments herein, their applications, or uses.
In one embodiment, as shown in FIG. 1, the control device 200 may include a processor 201 and a memory 202. The control device 200 further comprises a communication device 203, a display device 204, a user interface 205, a camera device 206, an audio/video interface 207, a sensor 208, etc. In addition, the control device 200 may further include a power management chip 209, a battery 210, and the like.
The control device 200 may be a mobile phone, a portable computer, a tablet computer, a palm computer, a wearable device, etc., which is not limited in the embodiments of the present disclosure. The control device may be an Android device, either with or without the display device 204. That is, the various components shown in FIG. 1 are merely illustrative: the control device 200 may include one or more of the components shown in fig. 1 and need not include all of them. The control device 200 shown in fig. 1 is merely illustrative and is in no way intended to limit the embodiments herein, their applications, or uses.
In this embodiment, the memory 202 of the control device 200 stores program instructions that control the processor 201 to execute the control method; a skilled person can design these instructions according to the disclosed solution. How the instructions control the processor to operate is well known in the art and will not be described in detail here.
It should be understood that although fig. 1 shows only one head-mounted display device 100 and one control device 200, this is not meant to limit their respective numbers; a control system may include a plurality of head-mounted display devices 100 and a plurality of control devices 200.
< method example >
Fig. 2 illustrates a control method of an embodiment of the present disclosure, which may be implemented by the control device illustrated in fig. 1, which may be, for example, an Android device without the display device 204. As shown in fig. 2, the control method of this embodiment may include the following steps S2100 to S2400:
In step S2100, a first camera group corresponding to a first head-mounted display device is created in a virtual scene running in a first container.
Wherein the first container may be a docker container.
The first camera group is used for scene rendering for the first head-mounted display device. Typically, the first camera group comprises two first cameras: one simulates the left eye of the wearer of the first head-mounted display device and may be called a left camera, and the other simulates the wearer's right eye and may be called a right camera.
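As a concrete illustration, such a left/right camera pair can be modeled as two cameras offset from a shared head position by half the interpupillary distance. The following Python sketch is illustrative only and is not part of the disclosed method; the class names, the `ipd` field, and the 0.064 m default are all assumptions.

```python
from dataclasses import dataclass


@dataclass
class Camera:
    """One virtual camera simulating a single eye."""
    position: tuple  # (x, y, z) in scene coordinates


@dataclass
class CameraGroup:
    """Left/right camera pair for one head-mounted display device."""
    center: tuple       # head position in the virtual scene
    ipd: float = 0.064  # interpupillary distance in meters (assumed default)

    @property
    def left(self) -> Camera:
        x, y, z = self.center
        return Camera((x - self.ipd / 2, y, z))  # half the IPD to the left

    @property
    def right(self) -> Camera:
        x, y, z = self.center
        return Camera((x + self.ipd / 2, y, z))  # half the IPD to the right


group = CameraGroup(center=(0.0, 1.6, 0.0))
print(group.left.position)
print(group.right.position)
```

Each camera would render the scene from its own position, producing the two per-eye images discussed below.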
In an alternative embodiment, before step S2100 of creating the first camera group corresponding to the first head-mounted display device in the virtual scene running in the first container is executed, the control method of the embodiment of the present disclosure may further include: receiving a first start request for starting the first head-mounted display device; in response to the first start request, creating the first container and starting a first application in the first container; and constructing a virtual scene in the first container when the first application is in a started state.
It should be noted that the control device typically deploys multiple containers in advance and packages different applications, for example different AR applications, into different containers, so that different AR applications can subsequently be launched in different containers and different virtual scenes created by them.
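The create-container / start-app / construct-scene sequence above can be sketched as follows. This is a minimal stand-in, not the patent's implementation: dictionary bookkeeping replaces real docker operations, and every name (`handle_start_request`, `containers`, the state keys) is an assumption for illustration.

```python
# In-memory stand-in for the container manager: one container per AR app,
# created lazily on the first start request for that app.
containers = {}  # container name -> state dict


def handle_start_request(app_name: str) -> dict:
    """Create the container, start its AR app, then build the virtual scene,
    mirroring the three-step sequence described in the text."""
    name = f"container_{app_name}"
    if name not in containers:
        containers[name] = {"app": app_name, "app_started": False, "scene": None}
        containers[name]["app_started"] = True             # start the AR app
        containers[name]["scene"] = {"camera_groups": []}  # scene built once app runs
    return containers[name]


state = handle_start_request("ar_app_1")
print(state["app_started"], state["scene"] is not None)
```

A second start request for the same app would find the container already present and reuse it, which is the basis for the shared-scene case discussed later.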
Referring to fig. 3 and 4, when the wearer 1 starts the AR glasses 1, the control device receives a first start request from the wearer 1 for starting the AR glasses 1, creates a Container 1 (Container1 in fig. 3), and starts the AR application 1 in the Container 1; with the AR application 1 in a started state, it constructs a virtual scene 1, in which the control device may then create the first camera group.
It should be noted that, referring to fig. 3 and 4, the control device generally comprises four levels: containers (Container), the Docker Engine, the host operating system (Host OS), and the hardware (Hardware).
After step S2100 is executed to create the first camera group corresponding to the first head-mounted display device in the virtual scene running in the first container, the method proceeds to:
Step S2200, controlling the first camera group, according to the degree-of-freedom information of the first head-mounted display device, to capture an image of the virtual scene running in the first container, and sending the image to the first head-mounted display device.
The degree-of-freedom information of the first head-mounted display device may be its 6-degree-of-freedom information (hereinafter, 6DoF information), comprising translational and rotational degrees of freedom.
In this embodiment, a streaming connection, which may be a wireless streaming connection, is established between the control device and the first head-mounted display device. Through this connection, the control device may receive the 6DoF information sent by the first head-mounted display device. Using this 6DoF information, the control device controls the pose of the first camera group, for example its displacement and rotation; the first camera group captures images of the virtual scene running in the first container and renders them to display screens. The control device transmits the images shown on the display screens to the first head-mounted display device, which displays them in the corresponding windows, so that the picture displayed by the first head-mounted display device follows the wearer's head movements. The first head-mounted display device itself only computes the 6DoF information and displays the rendered images.
Referring to fig. 3 and 4, the control device may receive the 6DoF information of the AR glasses 1 over the wireless streaming connection with the AR glasses 1 and, according to this 6DoF information, control the first camera group to displace and rotate so as to perform the 6DoF transformation. The left camera in the first camera group renders its captured image of the virtual scene 1 to the display screen 1, and the right camera renders its image to the display screen 2. The control device encodes and sends the image of the display screen 1 to the AR glasses 1, which decode and display it in the window 1; likewise, the image of the display screen 2 is encoded, sent, decoded, and displayed in the window 2.
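The per-frame loop just described — receive 6DoF data, move the camera group, render one image per eye, deliver both to the glasses — can be sketched as follows. Rendering and encoding are stubbed out with strings, and every function name here is an assumption, not the patent's API.

```python
def apply_6dof(camera_group: dict, dof: dict) -> None:
    """Displace and rotate the camera group to match the wearer's head."""
    camera_group["position"] = dof["position"]
    camera_group["rotation"] = dof["rotation"]


def render_eye(camera_group: dict, eye: str) -> str:
    # Stand-in for rendering the virtual scene from one eye's camera
    # to a display screen.
    return f"frame:{eye}:{camera_group['position']}"


def frame_for_glasses(camera_group: dict, dof: dict) -> dict:
    """One iteration of the streaming loop: pose update, then one image per
    display screen; the glasses decode each image into its own window."""
    apply_6dof(camera_group, dof)
    return {"left": render_eye(camera_group, "left"),
            "right": render_eye(camera_group, "right")}


group = {"position": (0, 0, 0), "rotation": (0, 0, 0)}
frames = frame_for_glasses(group, {"position": (1, 2, 3), "rotation": (0, 90, 0)})
print(frames["left"])
```

In a real system the two rendered frames would be video-encoded before transmission and decoded on the glasses, as the figure walkthrough describes.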
After step S2200 is executed to control the first camera group, according to the degree-of-freedom information of the first head-mounted display device, to capture an image of the virtual scene running in the first container and send it to the first head-mounted display device, the method proceeds to:
Step S2300, creating a second camera group corresponding to the second head-mounted display device in the virtual scene running in the second container.
The second container may be a docker container. The second container and the first container may be the same container, or they may be different containers; this embodiment is not limited in this respect.
The second camera group is used for scene rendering for the second head-mounted display device. Typically, the second camera group comprises two second cameras: one simulates the left eye of the wearer of the second head-mounted display device and may be called a left camera, and the other simulates the wearer's right eye and may be called a right camera.
In an alternative embodiment, the second container and the first container are the same container, and step S2300 of creating the second camera group corresponding to the second head-mounted display device in the virtual scene running in the second container may further include: receiving a second start request for starting the second head-mounted display device; and, in response to the second start request, creating the second camera group corresponding to the second head-mounted display device in the virtual scene running in the first container.
Referring to fig. 3, when the wearer 2 starts the AR glasses 2, the control device receives a second start request from the wearer 2 for starting the AR glasses 2 and may create the second camera group in the virtual scene 1 running in the container 1.
In an alternative embodiment, the second container and the first container are different containers. Before executing step S2300 to create the second camera group corresponding to the second head-mounted display device in the virtual scene running in the second container, the control method of the embodiment of the present disclosure further includes: receiving a second start request for starting the second head-mounted display device; in response to the second start request, creating the second container and starting a second application in the second container; and constructing a virtual scene in the second container when the second application is in a started state.
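The two alternatives — shared scene in one container versus an isolated scene in a new container — amount to a single branch when the second headset starts. The sketch below is purely illustrative; the `share_scene` flag, the `scenes` dictionary, and all names are assumptions about how such a dispatch could look.

```python
# State after the first headset has started: container_1 runs virtual scene 1
# with one camera group in it.
scenes = {"container_1": {"camera_groups": ["group_1"]}}


def handle_second_start(share_scene: bool) -> str:
    """Handle the second start request: either add a camera group to the
    scene already running in the first container, or create a new container
    with its own scene."""
    if share_scene:
        # Same container: both wearers coexist in one virtual scene (fig. 3).
        scenes["container_1"]["camera_groups"].append("group_2")
        return "container_1"
    # Different container: an independent scene for the second wearer (fig. 4).
    scenes["container_2"] = {"camera_groups": ["group_2"]}
    return "container_2"


print(handle_second_start(share_scene=True))  # -> container_1
```

How `share_scene` would be decided (e.g. by which AR application the request names) is left open here, matching the patent's statement that the containers may be the same or different.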
Referring to fig. 4, when the wearer 2 starts the AR glasses 2 and the control device receives a second start request from the wearer 2 for starting the AR glasses 2, a Container 2 (Container2 in fig. 4) may be created and an AR application 2 started in it; with the AR application 2 in a started state, a virtual scene 2 may be constructed, in which the control device may then create the second camera group.
After step S2300 is executed to create the second camera group corresponding to the second head-mounted display device in the virtual scene running in the second container, the method proceeds to:
Step S2400, controlling the second camera group, according to the degree-of-freedom information of the second head-mounted display device, to capture an image of the virtual scene running in the second container, and sending the image to the second head-mounted display device.
In this embodiment, a streaming connection, which may be a wireless streaming connection, is established between the control device and the second head-mounted display device. Through this connection, the control device can receive the 6DoF information sent by the second head-mounted display device. Using this 6DoF information, the control device controls the pose of the second camera group, for example its displacement and rotation; the second camera group captures images of the virtual scene running in the second container and renders them to display screens. The control device transmits the images shown on the display screens to the second head-mounted display device, which displays them in the corresponding windows, so that the picture displayed by the second head-mounted display device follows the wearer's head movements. The second head-mounted display device itself only computes the 6DoF information and displays the rendered images.
Referring to fig. 3, the control device receives the 6DoF information of the AR glasses 2 over the wireless streaming connection with the AR glasses 2 and, according to this 6DoF information, controls the second camera group to displace and rotate so as to perform the 6DoF transformation. The left camera in the second camera group renders its captured image of the virtual scene 1 to the display screen 3, and the right camera renders its image to the display screen 4. The control device encodes and sends the image of the display screen 3 to the AR glasses 2, which decode and display it in the window 3; likewise, the image of the display screen 4 is encoded, sent, decoded, and displayed in the window 4.
It should be noted that, in the same manner, when the wearer 3 activates the AR glasses 3, the control device receives a third start request for starting the AR glasses 3 and may create a third camera group in the virtual scene 1 running in the container 1. The control device receives the 6DoF information of the AR glasses 3 over the wireless streaming connection with the AR glasses 3 and, according to that information, shifts and rotates the third camera group to track the 6DoF pose. The left camera of the third camera group renders its image of the virtual scene 1 to the display screen 5, and the right camera renders its image of the virtual scene 1 to the display screen 6. The control device encodes the image of the display screen 5 and sends it to the AR glasses 3, which decode and display it in window 5; it likewise encodes the image of the display screen 6 and sends it to the AR glasses 3, which decode and display it in window 6.
That is, according to this embodiment, an AR application may be started in a single docker container to construct one virtual scene, with unified management and event distribution for several pairs of AR glasses, so that multiple users coexist in the same scene.
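The single-container mode just described, one AR application and one virtual scene serving several pairs of glasses with unified event distribution, might be organized as in the sketch below. All class and identifier names here are hypothetical, chosen only to illustrate the structure.

```python
class SharedSceneController:
    """One container runs one AR application and one virtual scene;
    every connected pair of glasses gets its own camera group in that scene."""

    def __init__(self, container_id: str):
        self.container_id = container_id   # e.g. a docker container name
        self.camera_groups = {}            # glasses id -> camera group tag
        self.events = []                   # scene events, visible to all wearers

    def register_glasses(self, glasses_id: str) -> str:
        # Each new pair of glasses gets its own camera group in the shared scene.
        tag = f"camera_group_{len(self.camera_groups) + 1}"
        self.camera_groups[glasses_id] = tag
        return tag

    def broadcast_event(self, event: str) -> None:
        # Unified event distribution: one event reaches every wearer in the scene.
        self.events.append(event)

# Two pairs of glasses coexist in the scene run by a single container.
scene = SharedSceneController("container_1")
scene.register_glasses("ar_glasses_2")
scene.register_glasses("ar_glasses_3")
scene.broadcast_event("object_moved")
```

Because both glasses share one scene object, an event broadcast once is observed by both wearers, which is the "coexist in the same scene" behavior of this embodiment.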
Referring to fig. 4, the control device receives the 6DoF information of the AR glasses 2 over the wireless streaming connection with the AR glasses 2 and, according to that information, shifts and rotates the second camera group to track the 6DoF pose. The left camera of the second camera group renders its image of the virtual scene 2 to the display screen 3, and the right camera renders its image of the virtual scene 2 to the display screen 4. The control device encodes the image of the display screen 3 and sends it to the AR glasses 2, which decode and display it in window 3; it likewise encodes the image of the display screen 4 and sends it to the AR glasses 2, which decode and display it in window 4.
It should be noted that, in the same manner, when the wearer 3 starts the AR glasses 3, the control device may create the container 3, start the AR application 3 within the container 3, and construct the virtual scene 3 once the AR application 3 is in the started state. The control device may then create a third camera group in the virtual scene 3. It receives the 6DoF information of the AR glasses 3 over the wireless streaming connection with the AR glasses 3 and, according to that information, shifts and rotates the third camera group to track the 6DoF pose. The left camera of the third camera group renders its image of the virtual scene 3 to the display screen 5, and the right camera renders its image of the virtual scene 3 to the display screen 6. The control device encodes the image of the display screen 5 and sends it to the AR glasses 3, which decode and display it in window 5; it likewise encodes the image of the display screen 6 and sends it to the AR glasses 3, which decode and display it in window 6.
That is, according to this embodiment, AR applications may be started in multiple docker containers, each AR application constructing its own virtual scene, with independent management and event distribution for each pair of AR glasses, so that multiple users act in independent scenes.
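The multi-container mode, one docker container and one virtual scene per pair of glasses, can be sketched as follows. The `docker run` invocation is shown only as a comment, and every name (`ContainerManager`, the container and scene labels) is illustrative rather than taken from the patent.

```python
class ContainerManager:
    """One docker container, and hence one independent virtual scene,
    per pair of AR glasses."""

    def __init__(self):
        self.containers = {}  # glasses id -> (container name, scene name)

    def start_for(self, glasses_id: str):
        n = len(self.containers) + 1
        container = f"container_{n}"
        scene = f"virtual_scene_{n}"
        # A real control device would launch the container here, e.g.:
        #   subprocess.run(["docker", "run", "-d", "--name", container,
        #                   "ar_app_image"])
        # and the AR application inside it would construct the scene.
        self.containers[glasses_id] = (container, scene)
        return container, scene

# Each pair of glasses receives its own container and its own scene.
mgr = ContainerManager()
c2, s2 = mgr.start_for("ar_glasses_2")
c3, s3 = mgr.start_for("ar_glasses_3")
```

Because the two glasses map to different containers, events in one scene never reach the other wearer, giving the "independent scenes" behavior of this embodiment.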
According to this embodiment, the control device creates a first camera group corresponding to the first head-mounted display device in the virtual scene run by the first container and, according to the degree-of-freedom information of the first head-mounted display device, controls the first camera group to capture an image of that virtual scene and send it to the first head-mounted display device; likewise, it creates a second camera group corresponding to the second head-mounted display device in the virtual scene run by the second container and, according to the degree-of-freedom information of the second head-mounted display device, controls the second camera group to capture an image of that virtual scene and send it to the second head-mounted display device. Because the second container and the first container may be the same container or different containers, the control device can make different head-mounted display devices coexist in the virtual scene run by one container, or place them in virtual scenes run by separate containers. The control device can therefore better serve different head-mounted display devices, whether they share the same virtual scene or occupy independent virtual scenes.
According to the embodiments of the present disclosure, an external control device can better serve multiple AR glasses users, meet different scene requirements, and improve the efficiency of rendering multiple independent scenes simultaneously through the docker container scheme.
< apparatus example >
Fig. 5 is a schematic diagram of a control device according to an embodiment, and referring to fig. 5, the control device 500 includes a first creation module 510, a first control module 520, a second creation module 530, and a second control module 540.
The first creation module 510 is configured to create a first camera group corresponding to a first head-mounted display device in a virtual scene run by a first container.
The first control module 520 is configured to control, according to the degree-of-freedom information of the first head-mounted display device, the first camera group to capture an image of the virtual scene run by the first container and send the image to the first head-mounted display device.
The second creation module 530 is configured to create a second camera group corresponding to a second head-mounted display device in a virtual scene run by a second container, wherein the second container and the first container are the same container or different containers.
The second control module 540 is configured to control, according to the degree-of-freedom information of the second head-mounted display device, the second camera group to capture an image of the virtual scene run by the second container and send the image to the second head-mounted display device.
In one embodiment, the apparatus further comprises a first receiving module and a third creating module (neither shown).
The first receiving module is used for receiving a first starting request for starting the first head-mounted display device;
The third creating module is used for creating the first container in response to the first start request and starting a first application in the first container; and, when the first application is in the started state, constructing a virtual scene in the first container.
In one embodiment, the second container and the first container are the same container, and the second creation module 530 includes a receiving unit and a creation unit (not shown in the figure).
The receiving unit is used for receiving a second starting request for starting the second head-mounted display device;
the creating unit is used for responding to the second starting request and creating a second camera group corresponding to the second head-mounted display device in the virtual scene running in the first container.
In one embodiment, the second container and the first container are different containers, and the apparatus further comprises a second receiving module and a fourth creating module (neither shown).
The second receiving module is used for receiving a second starting request for starting the second head-mounted display device;
the fourth creating module is used for responding to the second starting request, creating the second container and starting a second application in the second container; and under the condition that the second application is in a starting state, constructing a virtual scene in the second container.
In one embodiment, the first container and the second container are docker containers.
According to the embodiments of the present disclosure, the control device creates a first camera group corresponding to the first head-mounted display device in the virtual scene run by the first container and, according to the degree-of-freedom information of the first head-mounted display device, controls the first camera group to capture an image of that virtual scene and send it to the first head-mounted display device; likewise, it creates a second camera group corresponding to the second head-mounted display device in the virtual scene run by the second container and, according to the degree-of-freedom information of the second head-mounted display device, controls the second camera group to capture an image of that virtual scene and send it to the second head-mounted display device. Because the second container and the first container may be the same container or different containers, the control device can make different head-mounted display devices coexist in the virtual scene run by one container, or place them in virtual scenes run by separate containers. The control device can therefore better serve different head-mounted display devices, whether they share the same virtual scene or occupy independent virtual scenes.
< device example >
Fig. 6 is a schematic diagram of a hardware structure of a control device according to an embodiment. As shown in fig. 6, the control device 600 includes a processor 610 and a memory 620.
The memory 620 may be used to store executable computer instructions.
The processor 610 may be configured to execute a control method according to an embodiment of the method of the present disclosure, according to control of the executable computer instructions.
The control device 600 may be the second device 200 shown in fig. 1, but is not limited thereto.
In further embodiments, the control apparatus 600 may include the above control device 500.
In one embodiment, the modules of the control device 500 above may be implemented by the processor 610 executing computer instructions stored in the memory 620.
< computer-readable storage medium >
The disclosed embodiments also provide a computer-readable storage medium having stored thereon computer instructions that, when executed by a processor, perform the control methods provided by the disclosed embodiments.
The present disclosure may be a system, method, and/or computer program product. The computer program product may include a computer readable storage medium having computer readable program instructions embodied thereon for causing a processor to implement aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch cards or raised structures in a groove having instructions stored thereon, and any suitable combination of the foregoing. A computer-readable storage medium, as used herein, is not to be construed as a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (e.g., light pulses through a fiber optic cable), or an electrical signal transmitted through a wire.
The computer readable program instructions described herein may be downloaded from a computer readable storage medium to a respective computing/processing device or to an external computer or external storage device over a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmissions, wireless transmissions, routers, firewalls, switches, gateway computers and/or edge servers. The network interface card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium in the respective computing/processing device.
Computer program instructions for performing the operations of the present disclosure may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present disclosure are implemented by personalizing electronic circuitry, such as programmable logic circuitry, field-programmable gate arrays (FPGAs), or programmable logic arrays (PLAs), with state information of the computer readable program instructions, which circuitry can execute the computer readable program instructions.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are all equivalent.
The foregoing description of the embodiments of the present disclosure has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the technical improvements in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the present disclosure is defined by the appended claims.

Claims (10)

1. A control method, characterized in that the method comprises:
creating a first camera group corresponding to a first head-mounted display device in a virtual scene run by a first container;
controlling, according to degree-of-freedom information of the first head-mounted display device, the first camera group to capture an image of the virtual scene run by the first container, and sending the image to the first head-mounted display device;
creating a second camera group corresponding to a second head-mounted display device in a virtual scene run by a second container; wherein the second container and the first container are the same container or different containers;
and controlling, according to degree-of-freedom information of the second head-mounted display device, the second camera group to capture an image of the virtual scene run by the second container, and sending the image to the second head-mounted display device.
2. The method according to claim 1, wherein the method further comprises:
receiving a first activation request to activate the first head mounted display device;
in response to the first start request, creating the first container and starting a first application in the first container;
and under the condition that the first application is in a starting state, constructing a virtual scene in the first container.
3. The method of claim 1, wherein the second container and the first container are the same container,
the creating a second camera group corresponding to the second head-mounted display device in the virtual scene of the second container operation includes:
receiving a second starting request for starting the second head-mounted display device;
and responding to the second starting request, and creating a second camera group corresponding to the second head-mounted display device in the virtual scene running in the first container.
4. The method of claim 1, wherein the second container and the first container are different containers,
the method further comprising:
receiving a second starting request for starting the second head-mounted display device;
creating the second container in response to the second start request, and starting a second application in the second container;
and under the condition that the second application is in a starting state, constructing a virtual scene in the second container.
5. The method of any one of claims 1 to 4, wherein the first container and the second container are docker containers.
6. A control apparatus, characterized in that the apparatus comprises:
the first creating module is used for creating a first camera group corresponding to a first head-mounted display device in a virtual scene run by a first container;
the first control module is used for controlling, according to degree-of-freedom information of the first head-mounted display device, the first camera group to capture an image of the virtual scene run by the first container and sending the image to the first head-mounted display device;
the second creating module is used for creating a second camera group corresponding to a second head-mounted display device in a virtual scene run by a second container; wherein the second container and the first container are the same container or different containers;
and the second control module is used for controlling, according to degree-of-freedom information of the second head-mounted display device, the second camera group to capture an image of the virtual scene run by the second container and sending the image to the second head-mounted display device.
7. The apparatus of claim 6, further comprising a first receiving module and a third creating module,
the first receiving module is used for receiving a first starting request for starting the first head-mounted display device;
the third creating module is used for responding to the first starting request, creating the first container and starting a first application in the first container; the method comprises the steps of,
and under the condition that the first application is in a starting state, constructing a virtual scene in the first container.
8. The apparatus of claim 6, wherein the second container and the first container are the same container, the second creation module comprises a receiving unit and a creation unit,
the receiving unit is used for receiving a second starting request for starting the second head-mounted display device;
the creating unit is used for responding to the second starting request and creating a second camera group corresponding to the second head-mounted display device in the virtual scene running in the first container.
9. A control apparatus, characterized in that the control apparatus comprises:
a memory for storing executable computer instructions;
a processor for executing the control method according to any one of claims 1-5, according to control of the executable computer instructions.
10. A computer readable storage medium having stored thereon computer instructions which, when executed by a processor, perform the control method of any of claims 1-5.
CN202311431486.XA 2023-10-31 2023-10-31 Control method, control device, control equipment and medium Pending CN117572960A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311431486.XA CN117572960A (en) 2023-10-31 2023-10-31 Control method, control device, control equipment and medium

Publications (1)

Publication Number Publication Date
CN117572960A 2024-02-20

Family

ID=89890787



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination