CN114625248A - AR multi-window holographic interaction method and system - Google Patents


Info

Publication number
CN114625248A
CN114625248A (application CN202210207580.6A)
Authority
CN
China
Prior art keywords
terminal equipment
window
displayed
floating
fixed area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210207580.6A
Other languages
Chinese (zh)
Inventor
孙阳 (Sun Yang)
董小平 (Dong Xiaoping)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Sunrise Simcom Electronic Technology Co Ltd
Original Assignee
Shanghai Sunrise Simcom Electronic Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Sunrise Simcom Electronic Technology Co Ltd
Priority to CN202210207580.6A
Publication of CN114625248A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013: Eye tracking input arrangements
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: Interaction techniques using icons
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to an AR multi-window holographic interaction method and system. The method comprises the following steps: acquiring application data from a terminal device and displaying all applications on the terminal device in a fixed area through AR glasses; receiving a user's operation on the picture displayed in the fixed area, and feeding the received operation back to the terminal device for execution; and displaying the execution result of the terminal device above the fixed area through the AR glasses as floating windows, wherein at least two floating windows are provided and each floating window displays one application. The interaction method presents the operations and applications of the terminal device holographically in space through the AR glasses and makes them interactive there; because the floating windows allow several application-rendered pictures to be opened and displayed in space simultaneously, multiple windows can be displayed and interacted with in real space.

Description

AR multi-window holographic interaction method and system
Technical Field
The invention relates to the technical field of AR holography, and in particular to an AR multi-window holographic interaction method and system.
Background
At present, interaction with a conventional mobile phone is still limited to tapping and triggering on a flat screen; constrained by that screen, the phone cannot display multiple apps at once or support holographic interaction.
Disclosure of Invention
The invention aims to overcome the above defects of the prior art by providing an AR multi-window holographic interaction method and system, thereby solving the problem that a conventional mobile phone screen can neither display multiple apps at once nor support holographic interaction.
The technical solution that achieves this purpose is as follows:
The invention provides an AR multi-window holographic interaction method comprising the following steps:
acquiring application data from a terminal device and displaying all applications on the terminal device in a fixed area through AR glasses;
receiving a user's operation on the picture displayed in the fixed area, and feeding the received operation back to the terminal device for execution; and
displaying the execution result of the terminal device above the fixed area through the AR glasses as floating windows, wherein at least two floating windows are provided and each floating window displays one application.
The interaction method of the invention presents the operations and applications of the terminal device holographically in space through the AR glasses and makes them interactive there; because the floating windows allow several application-rendered pictures to be opened and displayed in space simultaneously, multiple windows can be displayed and interacted with in real space.
In a further improvement of the AR multi-window holographic interaction method, three floating windows are preset;
when multiple applications are opened, up to three applications are displayed in turn through the three preset floating windows, and the applications are interacted with inside the floating windows.
In a further improvement, the AR multi-window holographic interaction method further comprises the following steps:
triggering a sensor in the terminal device to emit a ray, so that the user can operate on the pictures displayed in the fixed area and the floating windows with the ray emitted by the terminal device;
capturing the pictures displayed in the fixed area and the floating windows with a camera on the AR glasses to form video data;
and calculating in real time, from the video data, the coordinates of the intersection point between the ray emitted by the terminal device and the displayed picture, combining the coordinates with the received touch instruction of the terminal device to form the corresponding operation, and feeding the operation back to the terminal device for execution.
In a further improvement, the touch instruction of the terminal device comprises a click instruction, a long-press instruction, and a slide instruction.
In a further improvement, the AR multi-window holographic interaction method further comprises the following steps:
setting up a gesture-action library;
capturing the pictures displayed in the fixed area and the floating windows, together with the user's operation gestures, with a camera on the AR glasses to form video data;
and recognizing the operation gesture in the video data by means of the gesture-action library, calculating the coordinates of the contact point between the operation gesture and the displayed picture, forming the corresponding operation, and feeding it back to the terminal device for execution.
The invention also provides an AR multi-window holographic interaction system, comprising:
an acquisition unit, installed on the terminal device and used for acquiring the application data of the terminal device;
a display unit, connected to the AR glasses and used for displaying all applications on the terminal device in a fixed area through the AR glasses; and
a processing unit, connected to the display unit, used for receiving the user's operation on the picture displayed in the fixed area and feeding the operation back to the terminal device for execution, and further used for displaying the execution result of the terminal device above the fixed area through the AR glasses as floating windows, wherein at least two floating windows are provided and each floating window displays one application.
In a further improvement of the AR multi-window holographic interaction system, three floating windows are preset; up to three applications can be displayed in turn through the three preset floating windows, and the applications are interacted with inside the floating windows.
In a further improvement, the AR multi-window holographic interaction system further comprises a triggering unit for triggering a sensor in the terminal device to emit a ray, so that the user can operate on the pictures displayed in the fixed area and the floating windows with the ray emitted by the terminal device;
the processing unit is further used for controlling a camera of the AR glasses to capture the pictures displayed in the fixed area and the floating windows to form video data; and for calculating in real time, from the video data, the coordinates of the intersection point between the ray emitted by the terminal device and the displayed picture, combining the coordinates with the received touch instruction of the terminal device to form the corresponding operation, and feeding the operation back to the terminal device for execution.
In a further improvement, the touch instruction comprises a click instruction, a long-press instruction, and a slide instruction.
In a further improvement, the AR multi-window holographic interaction system further comprises a storage unit in which a gesture-action library is stored;
the processing unit is further used for controlling a camera on the AR glasses to capture the pictures displayed in the fixed area and the floating windows, together with the user's operation gestures, to form video data; and for recognizing the operation gesture in the video data by means of the gesture-action library, calculating the coordinates of the contact point between the operation gesture and the displayed picture, forming the corresponding operation, and feeding it back to the terminal device for execution.
Drawings
FIG. 1 is a flow chart of an AR multi-window holographic interaction method of the present invention.
FIG. 2 is a system diagram of the AR multi-window holographic interaction system of the present invention.
FIG. 3 is a schematic diagram of the connection between the AR glasses and the terminal device in the AR multi-window holographic interaction method and system of the present invention.
FIG. 4 is a schematic diagram of the display of the fixed area and the floating windows in the AR multi-window holographic interaction method and system of the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings and specific embodiments.
Referring to FIG. 1, the invention provides an AR multi-window holographic interaction method and system that use AR holographic technology to present the apps operated on an everyday terminal device, such as a mobile phone, holographically in space and to make them interactive there, with support for displaying several app-rendered pictures in space simultaneously as separate windows. The method presents multiple pictures rendered by a terminal device such as a mobile phone inside the AR glasses and supports displaying and interacting with multiple windows in real space. The AR multi-window holographic interaction method and system of the present invention are described below with reference to the accompanying drawings.
FIG. 2 shows a system diagram of the AR multi-window holographic interaction system of the present invention, which is described below with reference to that figure.
As shown in FIG. 2, the AR multi-window holographic interaction system of the present invention includes an acquisition unit 21, a display unit 22, and a processing unit 23, all installed on the terminal device. The acquisition unit 21 is configured to acquire the application data of the terminal device. The display unit 22 is connected to the acquisition unit 21 and to the AR glasses, and is used for displaying all applications on the terminal device in a fixed area through the AR glasses. The processing unit 23 is connected to the display unit 22; it is configured to receive the user's operation on the picture displayed in the fixed area and feed the operation back to the terminal device for execution, and also to display the execution result of the terminal device above the fixed area through the AR glasses as floating windows, of which at least two are provided, each displaying one application.
The AR multi-window holographic interaction system is application software installed on the terminal device. To use it, the user connects AR glasses to the terminal device, puts on the glasses, and starts the application software; all applications on the terminal device can then be presented through the AR glasses. The user can control the display picture presented in the AR glasses, for example by clicking an application to open it. When the user's operation opens an application, the system renders the application's opening picture into a floating window, displays the application's opening interface through that window, and lets the user interact with the application displayed there. The system supports opening multiple applications simultaneously, each opened application being displayed in its own floating window.
In an embodiment of the present invention, the processing unit 23 is further configured to acquire system permissions on the terminal device and capture the frames rendered by the terminal device's bottom layer; the captured data covers all application data on the terminal device and is then displayed in the fixed area. When the processing unit 23 feeds an operation back to the terminal device for execution, it obtains the terminal device's rendered picture and displays it in a floating window, thereby realizing multi-window display and interaction.
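By way of a non-limiting illustration, the frame-mirroring loop described above can be sketched as follows. The sketch is not part of the patent: `capture_frame` stands in for a platform screen-capture API (such as Android's MediaProjection) and `render_to_area` for the glasses' display path; both names are hypothetical.

```python
def mirror_frames(capture_frame, render_to_area, frame_count):
    """Copy `frame_count` frames rendered by the terminal's bottom layer
    into the fixed area shown through the AR glasses."""
    shown = []
    for _ in range(frame_count):
        frame = capture_frame()      # grab one bottom-layer rendered frame
        render_to_area(frame)        # draw it in the fixed area
        shown.append(frame)
    return shown

# Toy stand-ins: a source of two frames and a list acting as the display.
frames = iter(["frame0", "frame1"])
display = []
print(mirror_frames(lambda: next(frames), display.append, 2))  # ['frame0', 'frame1']
```

In a real implementation the loop would run continuously at the display's refresh rate rather than for a fixed count.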
In a specific embodiment of the present invention, three floating windows are preset; up to three applications can be displayed in turn through the three preset floating windows and interacted with inside them.
Referring to FIG. 4, the fixed area 311 is configured to display all application icons on the terminal device. While only the icons are shown, the three floating windows remain hidden. When the user opens an application, one floating window appears and displays the opened application's interface; when the user opens another application, a further floating window appears for it. With three floating windows, three applications can be open, displayed, and interacted with at the same time.
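The show-on-open behaviour of the three preset floating windows can be sketched, purely as an illustration that is not part of the patent, with a small manager class (all names hypothetical):

```python
class FloatingWindowManager:
    """Tracks three preset floating windows; a window is hidden when its
    slot is None and visible when an application occupies it."""

    def __init__(self, max_windows=3):
        self.slots = [None] * max_windows

    def open_app(self, app):
        """Reveal the first hidden window for `app`; return its slot index."""
        for i, occupant in enumerate(self.slots):
            if occupant is None:
                self.slots[i] = app
                return i
        raise RuntimeError("all floating windows are in use")

    def close_app(self, app):
        """Hide the window that was displaying `app`."""
        self.slots[self.slots.index(app)] = None

    def visible(self):
        return [a for a in self.slots if a is not None]

wm = FloatingWindowManager()
wm.open_app("browser")
wm.open_app("maps")
print(wm.visible())  # ['browser', 'maps']
```

The patent does not specify what happens when a fourth application is opened; raising an error here is one possible policy, evicting the oldest window would be another.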
In a specific embodiment of the present invention, the system further includes a triggering unit configured to trigger a sensor in the terminal device to emit a ray, so that the user can operate on the pictures displayed in the fixed area and the floating windows with the ray emitted by the terminal device;
the processing unit is further used for controlling a camera of the AR glasses to capture the pictures displayed in the fixed area and the floating windows to form video data; and for calculating in real time, from the video data, the coordinates of the intersection point between the ray emitted by the terminal device and the displayed picture, combining the coordinates with the received touch instruction of the terminal device to form the corresponding operation, and feeding the operation back to the terminal device for execution.
Through the triggering unit, the terminal device is turned into an operating handle: the user holds the terminal device and controls the display picture with the ray it emits. The screen of the terminal device is also used to input the corresponding touch instructions, which comprise click, long-press, and slide instructions: clicking the screen forms a click instruction, pressing and holding forms a long-press instruction, and sliding forms a slide instruction. The user points the emitted ray at the corresponding application icon and performs the corresponding touch operation to form a touch instruction; the processing unit analyzes the video data to obtain the coordinates of the intersection point between the ray and the displayed picture, and feeds the coordinates and the touch instruction back to the terminal device, which executes the corresponding operation; the processing unit then obtains the rendered picture of the execution result and displays it in a floating window. For example, when an application is clicked, the processing unit obtains the application's coordinates and the user's click instruction, sends them to the terminal device for execution, and displays the execution result through the floating window.
With the ray emitted by the terminal device, a floating window can be placed at an arbitrary position, rotated, enlarged, or shrunk. To drag a window, the user inputs a long-press instruction by pressing and holding the screen; the floating window at the ray's point can then be dragged, and the position of the dragged rendered picture is mapped in space in real time. The application can also be controlled inside the floating window, for example by clicking, long-pressing, or sliding up and down, thereby realizing interaction.
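The coordinate computation at the heart of this interaction is a ray-plane intersection. The following is a simplified, non-authoritative sketch: the patent derives the intersection from the glasses' camera video, whereas here the ray pose and the window plane are assumed to already be known in one shared coordinate frame.

```python
import numpy as np

def ray_plane_intersection(ray_origin, ray_dir, plane_point, plane_normal):
    """Return the 3-D point where the phone's ray hits a display plane,
    or None if the ray is parallel to the plane or points away from it."""
    ray_origin = np.asarray(ray_origin, dtype=float)
    ray_dir = np.asarray(ray_dir, dtype=float)
    plane_point = np.asarray(plane_point, dtype=float)
    plane_normal = np.asarray(plane_normal, dtype=float)

    denom = ray_dir.dot(plane_normal)
    if abs(denom) < 1e-9:   # ray parallel to the window plane
        return None
    t = (plane_point - ray_origin).dot(plane_normal) / denom
    if t < 0:               # window plane is behind the phone
        return None
    return ray_origin + t * ray_dir

# Phone at the origin pointing straight ahead; window plane at z = 2.
hit = ray_plane_intersection([0, 0, 0], [0, 0, 1], [0, 0, 2], [0, 0, -1])
print(hit)  # [0. 0. 2.]
```

The resulting 3-D point would still have to be mapped into the 2-D coordinates of the picture shown in the window before being paired with a touch instruction.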
In a specific embodiment of the present invention, the system further includes a storage unit in which a gesture-action library is stored;
the processing unit is further used for controlling a camera on the AR glasses to capture the pictures displayed in the fixed area and the floating windows, together with the user's operation gestures, to form video data; and for recognizing the operation gesture in the video data by means of the gesture-action library, calculating the coordinates of the contact point between the operation gesture and the displayed picture, forming the corresponding operation, and feeding it back to the terminal device for execution. The invention thus realizes interaction through gestures.
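A minimal sketch of this gesture path, not part of the patent: a gesture label recognized via the gesture-action library is mapped, together with the computed contact coordinates, to the operation fed back to the terminal device. The gesture names and instruction strings below are invented for illustration.

```python
# Hypothetical gesture-action library: maps a recognized gesture label to
# the touch instruction the terminal device understands.
GESTURE_LIBRARY = {
    "pinch": "click",
    "pinch_hold": "long_press",
    "swipe_up": "slide_up",
    "swipe_down": "slide_down",
}

def gesture_to_operation(gesture_label, contact_point):
    """Turn a recognized gesture plus its contact coordinates into an
    operation dict to be executed by the terminal device."""
    instruction = GESTURE_LIBRARY.get(gesture_label)
    if instruction is None:
        return None  # unrecognized gesture: ignore this frame
    return {"instruction": instruction, "x": contact_point[0], "y": contact_point[1]}

print(gesture_to_operation("pinch", (120, 340)))
# {'instruction': 'click', 'x': 120, 'y': 340}
```

The hard part in practice, recognizing the gesture label from the camera video, is left to a vision model and is outside this sketch.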
Furthermore, the AR multi-window holographic interaction system also supports handle interaction: a handle can be connected to the terminal device, and the display picture can be controlled through the handle.
The terminal device of the invention is preferably a mobile phone, but may also be another electronic device such as a tablet.
FIG. 3 is a schematic diagram of the connection between the mobile phone 32 and the AR glasses 31. The AR multi-window holographic interaction system of the present invention is installed on the mobile phone 32; once its software is started on the mobile phone 32, all applications on the phone can be presented through the AR glasses and operated interactively. After the interface of the system software is displayed on the screen of the mobile phone 32, the desired interaction mode can be selected on that interface: phone interaction, gesture interaction, and handle interaction are displayed, and one is selected by clicking it. When phone interaction is selected, the system triggers the phone's sensor to emit a ray, and the phone becomes an operating handle with which the interaction is completed.
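The mode selection described above amounts to a simple dispatch. The sketch below is illustrative only; the handler names are hypothetical stand-ins for whatever the system software actually invokes.

```python
def select_interaction_mode(choice, system):
    """Dispatch on the interaction mode chosen on the phone's interface.
    Mode names follow the patent; the `system` handlers are hypothetical."""
    modes = {
        "phone": system.start_phone_ray,       # phone becomes an operating handle
        "gesture": system.start_gesture_tracking,
        "handle": system.start_handle_input,   # external handle connected to phone
    }
    if choice not in modes:
        raise ValueError(f"unknown interaction mode: {choice}")
    return modes[choice]()

class StubSystem:
    def start_phone_ray(self): return "phone-ray"
    def start_gesture_tracking(self): return "gesture"
    def start_handle_input(self): return "handle"

print(select_interaction_mode("phone", StubSystem()))  # phone-ray
```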
The AR multi-window holographic interaction method provided by the invention is explained below.
As shown in FIG. 1, the AR multi-window holographic interaction method of the present invention comprises the following steps:
step S11: acquire the application data of the terminal device and display all applications on the terminal device in a fixed area through the AR glasses; then proceed to step S12;
step S12: receive the user's operation on the picture displayed in the fixed area and feed the received operation back to the terminal device for execution; then proceed to step S13;
step S13: display the execution result of the terminal device above the fixed area through the AR glasses as floating windows, of which at least two are provided, each displaying one application.
The interaction method of the invention presents the operations and applications of the terminal device holographically in space through the AR glasses and makes them interactive there; because the floating windows allow several application-rendered pictures to be opened and displayed in space simultaneously, multiple windows can be displayed and interacted with in real space.
In a specific embodiment of the invention, three floating windows are preset;
when multiple applications are opened, up to three applications are displayed in turn through the three preset floating windows, and the applications are interacted with inside the floating windows.
In one embodiment of the present invention, the method further comprises:
triggering a sensor in the terminal device to emit a ray, so that the user can operate on the pictures displayed in the fixed area and the floating windows with the ray emitted by the terminal device;
capturing the pictures displayed in the fixed area and the floating windows with a camera on the AR glasses to form video data;
and calculating in real time, from the video data, the coordinates of the intersection point between the ray emitted by the terminal device and the displayed picture, combining the coordinates with the received touch instruction of the terminal device to form the corresponding operation, and feeding the operation back to the terminal device for execution.
Further, the touch instruction of the terminal device comprises a click instruction, a long-press instruction, and a slide instruction.
In a further embodiment, the method further comprises:
setting up a gesture-action library;
capturing the pictures displayed in the fixed area and the floating windows, together with the user's operation gestures, with a camera on the AR glasses to form video data;
and recognizing the operation gesture in the video data by means of the gesture-action library, calculating the coordinates of the contact point between the operation gesture and the displayed picture, forming the corresponding operation, and feeding it back to the terminal device for execution.
While the present invention has been described in detail with reference to the embodiments illustrated in the accompanying drawings, it will be apparent to those skilled in the art that various changes and modifications can be made. Therefore, certain details of the embodiments are not to be interpreted as limiting, and the scope of the invention is defined by the appended claims.

Claims (10)

1. An AR multi-window holographic interaction method, characterized by comprising the following steps:
acquiring application data from a terminal device and displaying all applications on the terminal device in a fixed area through AR glasses;
receiving a user's operation on the picture displayed in the fixed area, and feeding the received operation back to the terminal device for execution; and
displaying the execution result of the terminal device above the fixed area through the AR glasses as floating windows, wherein at least two floating windows are provided and each floating window displays one application.
2. The AR multi-window holographic interaction method of claim 1, wherein three floating windows are preset;
when multiple applications are opened, up to three applications are displayed in turn through the three preset floating windows, and the applications are interacted with inside the floating windows.
3. The AR multi-window holographic interaction method of claim 1, further comprising:
triggering a sensor in the terminal device to emit a ray, so that the user can operate on the pictures displayed in the fixed area and the floating windows with the ray emitted by the terminal device;
capturing the pictures displayed in the fixed area and the floating windows with a camera on the AR glasses to form video data;
and calculating in real time, from the video data, the coordinates of the intersection point between the ray emitted by the terminal device and the displayed picture, combining the coordinates with the received touch instruction of the terminal device to form the corresponding operation, and feeding the operation back to the terminal device for execution.
4. The AR multi-window holographic interaction method of claim 3, wherein the touch instructions of the terminal device comprise a click instruction, a long-press instruction, and a slide instruction.
5. The AR multi-window holographic interaction method of claim 1, further comprising:
setting up a gesture-action library;
capturing the pictures displayed in the fixed area and the floating windows, together with the user's operation gestures, with a camera on the AR glasses to form video data;
and recognizing the operation gesture in the video data by means of the gesture-action library, calculating the coordinates of the contact point between the operation gesture and the displayed picture, forming the corresponding operation, and feeding it back to the terminal device for execution.
6. An AR multi-window holographic interaction system, characterized by comprising:
an acquisition unit, installed on the terminal device and used for acquiring the application data of the terminal device;
a display unit, connected to the AR glasses and used for displaying all applications on the terminal device in a fixed area through the AR glasses; and
a processing unit, connected to the display unit, used for receiving the user's operation on the picture displayed in the fixed area and feeding the operation back to the terminal device for execution, and further used for displaying the execution result of the terminal device above the fixed area through the AR glasses as floating windows, wherein at least two floating windows are provided and each floating window displays one application.
7. The AR multi-window holographic interaction system of claim 6, wherein three floating windows are preset, three applications can be displayed in sequence through the three preset floating windows, and interaction with the applications can be performed within the floating windows.
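The sequential display across three preset floating windows in claim 7 could be managed with a simple slot-assignment policy. The sketch below fills empty windows first and then reuses the oldest one; this recycling policy is an assumption — the patent only states that three applications are displayed in sequence:

```python
# Sketch of claim 7's three preset floating windows (illustrative policy):
# applications fill empty windows in order, then the oldest window is reused.

from collections import deque

class FloatingWindows:
    def __init__(self, count=3):
        self.slots = [None] * count     # one application per floating window
        self.order = deque()            # window indices in order of use

    def show(self, app):
        if None in self.slots:          # fill an empty window first
            idx = self.slots.index(None)
        else:                           # otherwise reuse the oldest window
            idx = self.order.popleft()
        self.slots[idx] = app
        self.order.append(idx)
        return idx                      # which floating window shows `app`

wins = FloatingWindows()
shown = [wins.show(a) for a in ["browser", "notes", "video", "chat"]]
# → [0, 1, 2, 0]  (the fourth app replaces the oldest window)
```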
8. The AR multi-window holographic interaction system of claim 6, further comprising a triggering unit configured to trigger a sensor in the terminal device to emit a ray, so that a user can operate on the pictures displayed in the fixed area and the floating windows through the ray emitted by the terminal device;
wherein the processing unit is further configured to control a camera of the AR glasses to capture the pictures displayed in the fixed area and the floating windows to form video data; to calculate, in real time from the video data, the coordinates of the intersection point between the ray emitted by the terminal device and the displayed picture; and to combine them with the received touch instruction of the terminal device to form a corresponding operation action that is fed back to the terminal device for execution.
9. The AR multi-window holographic interaction system of claim 8, wherein the touch instructions comprise a click instruction, a long press instruction, and a slide instruction.
10. The AR multi-window holographic interaction system of claim 6, further comprising a storage unit in which a gesture action library is stored;
wherein the processing unit is further configured to control a camera on the AR glasses to capture the pictures displayed in the fixed area and the floating windows together with the user's operation gestures to form video data; and to recognize the operation gesture in the video data against the gesture action library, calculate the coordinates of the contact point between the operation gesture and the displayed picture, and form an operation action that is fed back to the terminal device for execution.
CN202210207580.6A 2022-03-04 2022-03-04 AR multi-window holographic interaction method and system Pending CN114625248A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210207580.6A CN114625248A (en) 2022-03-04 2022-03-04 AR multi-window holographic interaction method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210207580.6A CN114625248A (en) 2022-03-04 2022-03-04 AR multi-window holographic interaction method and system

Publications (1)

Publication Number Publication Date
CN114625248A true CN114625248A (en) 2022-06-14

Family

ID=81899738

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210207580.6A Pending CN114625248A (en) 2022-03-04 2022-03-04 AR multi-window holographic interaction method and system

Country Status (1)

Country Link
CN (1) CN114625248A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115529486A (en) * 2022-09-23 2022-12-27 三星电子(中国)研发中心 Suspended window type APP presentation method and apparatus
CN115834754A (en) * 2022-09-29 2023-03-21 歌尔科技有限公司 Interaction control method and device, head-mounted display equipment and medium
CN115834754B (en) * 2022-09-29 2024-05-28 歌尔科技有限公司 Interactive control method and device, head-mounted display equipment and medium

Similar Documents

Publication Publication Date Title
CN109164964B (en) Content sharing method and device, terminal and storage medium
CN106775313B (en) Split screen operation control method and mobile terminal
US9348504B2 (en) Multi-display apparatus and method of controlling the same
CN106780685B (en) A kind of generation method and terminal of dynamic picture
CN111443863A (en) Page control method and device, storage medium and terminal
KR102027879B1 (en) Menu contolling method of media equipment, apparatus thereof, and medium storing program source thereof
CN103092518B (en) A kind of mobile accurate touch control method of cloud desktop based on RDP agreement
JP2016500175A (en) Method and apparatus for realizing floating object
CN108228020B (en) Information processing method and terminal
CN112099707A (en) Display method and device and electronic equipment
WO2017113154A1 (en) System and method for operating system of mobile device
US9293108B2 (en) Transmission apparatus and system of using the same
CN104063128A (en) Information processing method and electronic equipment
CN102637127A (en) Method for controlling mouse modules and electronic device
CN108604173A (en) Image processing apparatus, image processing system and image processing method
CN114625248A (en) AR multi-window holographic interaction method and system
WO2016141597A1 (en) Touch control method, device, terminal and graphical user interface thereof
WO2024099102A1 (en) Control display method and apparatus, electronic device, and readable storage medium
CN103150116A (en) RDP-based method for magnification display of cloud desktop
CN110225182B (en) Control method for flexible screen intelligent terminal and intelligent terminal
CN113457117B (en) Virtual unit selection method and device in game, storage medium and electronic equipment
CN114995713A (en) Display control method and device, electronic equipment and readable storage medium
CN104572602A (en) Method and device for displaying message
KR102140935B1 (en) Menu controlling method of media equipment, apparatus thereof, and medium storing program source thereof
CN114442849B (en) Display equipment and display method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination