CN114529691A - Window control method, electronic device and computer readable storage medium - Google Patents

Window control method, electronic device and computer readable storage medium

Info

Publication number
CN114529691A
Authority
CN
China
Prior art keywords
window
windows
target
area
control method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011218043.9A
Other languages
Chinese (zh)
Inventor
杨婉艺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202011218043.9A
Publication of CN114529691A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/24 Constructional details thereof, e.g. game controllers with detachable joystick handles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to the field of computer technologies and discloses a window control method, an electronic device, and a computer-readable storage medium. A pointing component associated with the electronic device is used to select target windows, which may be multiple windows at the same hierarchy level or at different levels. After the target windows are selected, the electronic device performs the operation indicated by the pointing component on each of them. The indicated operation includes, but is not limited to, a scaling operation that zooms the target windows out or in, a replacement operation that swaps the selected target windows with one another, a deletion operation that deletes them, and the like. Because the user selects multiple target windows with the pointing component, the electronic device can conveniently operate on the target windows in batch, which improves the user experience.

Description

Window control method, electronic device and computer readable storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a window control method, an electronic device, and a computer-readable storage medium.
Background
Augmented Reality (AR) is a relatively new human-computer interaction technology. With AR, participants can interact with virtual objects in real time, gaining a rich visual experience, and can break through objective constraints such as space and time to have experiences that would be impossible first-hand in the real world. Virtual Reality (VR) uses computer technology to generate a simulated environment and immerses the user in a created three-dimensional, dynamic, realistic scene; it can be understood as a simulation system for the real world. The earliest VR technology was applied in the military field, where the most common product was the head-mounted display.
Interaction with AR/VR devices primarily relies on motion sensors integrated into the head-mounted display, so the scene in the field of view changes as the user turns their head. With the development of AR/VR technology, a series of pointing components has emerged, such as gloves, watches, mobile phones, and handles. A pointing component can present the user's hands in the virtual scene and lets the user move about in it. By tracking the rotation and movement of these components or of the hands themselves, their motion states are mapped to interactions with virtual objects such as moving, selecting, rotating, and scaling. At present, interaction with AR/VR devices is still fairly simple, focusing mainly on gestures, handles, and interface interaction. When the interface is operated by gestures or a handle, only a single window in a single plane can be operated on at a time, for example splitting, closing, or moving that window. When there are multiple windows in the interface, this approach is unsuitable and leads to a poor user experience.
Disclosure of Invention
The present application aims to provide a window control method, an electronic device, and a computer-readable storage medium that make it convenient to operate on multiple windows simultaneously and improve the user experience.
In a first aspect, an embodiment of the present application discloses a window control method applied to an electronic device based on virtual reality or augmented reality technology, where a pointing component is associated with the electronic device. The window control method is configured to control a plurality of window layers displayed in a visible area of the electronic device; the window layers are arranged in sequence along a direction away from the user side of the electronic device, and each window layer includes a plurality of windows. The window control method includes:
detecting movement of the pointing component;
determining a selection area based on the movement trajectory of the pointing component;
determining at least one target window from the plurality of window layers according to the selection area;
and simultaneously executing the operation indicated by the pointing component on each of the at least one target window.
According to the window control method disclosed by the embodiment of the application, the electronic device and the pointing component are associated with each other, and the electronic device can control the windows selected with the pointing component. When a user is using the electronic device, multiple windows at the same hierarchy level or at different levels are presented within its visible area. With the pointing component, the user can select some of those windows as the target windows to be controlled; the selected target windows may sit at the same level or at different levels. After the target windows are selected, the device performs the operation indicated by the pointing component on each of them simultaneously. The indicated operation includes, but is not limited to, a scaling operation that zooms the target windows out or in, a replacement operation that swaps the selected target windows with one another, a deletion operation that deletes them, and the like. Because the user selects multiple target windows with the pointing component, the electronic device can conveniently operate on the target windows in batch, improving the user experience.
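Read as an algorithm, the claimed method is a short control loop. The following Python sketch is purely illustrative; the Window type, control_step, and its callable parameters are assumed names, not part of the patent:

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

Point = Tuple[float, float]
Rect = Tuple[float, float, float, float]   # (x, y, width, height)

@dataclass
class Window:
    name: str
    layer: int        # hierarchy level: 1 for Layer-1, 2 for Layer-2, ...
    rect: Rect        # position and size within the layer's plane region
    scale: float = 1.0

def control_step(trajectory: List[Point],
                 windows: List[Window],
                 to_selection_area: Callable[[List[Point]], object],
                 to_targets: Callable[[object, List[Window]], List[Window]],
                 operation: Callable[[Window], None]) -> List[Window]:
    """The four claimed steps: the detected movement of the pointing
    component arrives as a sampled trajectory; derive a selection area
    from it, resolve the target windows across the window layers, then
    apply the indicated operation to every target as one batch."""
    area = to_selection_area(trajectory)   # click point, closed figure, or line
    targets = to_targets(area, windows)    # may span one or several layers
    for w in targets:
        operation(w)                       # e.g. zoom out/in, replace, delete
    return targets
```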
According to some embodiments provided by the first aspect of the application, the movement of the pointing component is a click on a window, and the selection area is determined based on the click position of the pointing component.
According to some embodiments provided by the first aspect of the application, the movement of the pointing component follows a predetermined trajectory, and the selection area is determined based on the trajectory formed as the pointing component moves.
According to some embodiments provided by the first aspect of the present application, the trajectory formed as the pointing component moves forms a closed figure, the electronic device determines the selection area based on the area covered by the closed figure, and the windows within that area are the selected target windows.
According to some embodiments provided in the first aspect of the present application, the scaling operation indicated by the pointing component includes zooming out, zooming in, and moving the window position of each target window.
According to some embodiments provided by the first aspect of the present application, determining at least one target window from the plurality of window layers according to the selection area comprises:
determining, based on the selection area, at least one target window in the same window layer.
According to some embodiments provided by the first aspect of the present application, determining at least one target window from the plurality of window layers according to the selection area comprises:
determining, based on the selection area, a plurality of target windows across a plurality of window layers, at least one window in each window layer being selected.
According to some embodiments provided in the first aspect of the present application, the closed figure determined by the selection area covers a plurality of windows across a plurality of window layers, and those windows are determined as the target windows.
According to some embodiments provided by the first aspect of the present application, the selection area selects a window on one of the plurality of window layers, and the selection area is mapped onto the other window layers to determine at least one further window as a target window.
According to some embodiments provided in the first aspect of the present application, in a case where the at least one target window includes at least two windows, performing the operation indicated by the pointing component on the target windows includes interchanging the positions of the at least two windows.
According to some embodiments provided by the first aspect of the application, the at least two windows are divided into two window groups, and performing the operation indicated by the pointing component on the target windows includes interchanging the positions of the two window groups.
According to some embodiments provided by the first aspect of the present application, the area of a first target window of the at least two windows is adjusted to fit the display area at the position of a second target window, or the area of the second target window is adjusted to fit the display area at the position of the first target window.
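For illustration, these replacement embodiments can be sketched as follows, reusing the Window type from the first sketch. The assumption that a moved window adopts the size of the slot it moves into follows the area-fitting embodiment above; the member-for-member pairing of groups is an assumption the patent does not specify:

```python
def swap_positions(a: Window, b: Window) -> None:
    """Replacement operation: interchange two target windows, each
    window's area being adjusted to fit the display area at the
    other's position."""
    a.rect, b.rect = b.rect, a.rect

def swap_groups(group1: List[Window], group2: List[Window]) -> None:
    """Interchange two window groups, pairing members in order."""
    for a, b in zip(group1, group2):
        swap_positions(a, b)
```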
In a second aspect, an embodiment of the present application discloses an electronic device based on virtual reality or augmented reality technology, where a pointing component is associated with the electronic device, and the electronic device includes:
a memory for storing window control instructions;
a processor, the processor implementing the following steps when executing the window control instructions:
detecting movement of the pointing component;
determining a selection area based on the movement trajectory of the pointing component;
determining at least one target window from the plurality of window layers according to the selection area;
and simultaneously executing the operation indicated by the pointing component on each of the at least one target window.
As with the window control method disclosed by the embodiment of the application, the electronic device and the pointing component are associated with each other, and the electronic device can control the windows selected with the pointing component. When a user is using the electronic device, multiple windows at the same hierarchy level or at different levels are presented within its visible area. With the pointing component, the user can select some of those windows as the target windows to be controlled; the selected target windows may sit at the same level or at different levels. After the target windows are selected, the device performs the operation indicated by the pointing component on each of them simultaneously. The indicated operation includes, but is not limited to, a scaling operation that zooms the target windows out or in, a replacement operation that swaps the selected target windows with one another, a deletion operation that deletes them, and the like. Because the user selects multiple target windows with the pointing component, the electronic device can conveniently operate on the target windows in batch, improving the user experience.
According to some embodiments provided by the second aspect of the application, the processor, when executing the window control instructions, is further configured to determine the selection area based on a click position of the pointing component.
According to some embodiments provided by the second aspect of the application, the processor, when executing the window control instructions, is further configured to determine the selection area based on the trajectory formed as the pointing component moves.
According to some embodiments provided by the second aspect of the present application, the processor, when executing the window control instructions, is further configured to perform the scaling operation indicated by the pointing component, including zooming out, zooming in, and moving the window position of each target window.
According to some embodiments provided in the second aspect of the present application, when executing the window control instructions, the processor is further configured to determine at least one target window in the same window layer based on the selection area, or determine a plurality of target windows in a plurality of window layers based on the selection area, at least one window in each window layer being selected.
According to some embodiments provided by the second aspect of the application, the memory is further configured to store the selection area to enable the processor to call the selection area.
According to some embodiments provided in the second aspect of the present application, the processor, when executing the window control instructions, is further configured to exchange the positions of at least two windows or exchange the positions of two window groups with each other.
In a third aspect, an embodiment of the present application discloses a computer-readable storage medium in which window control instructions are stored; when executed by a processor, the instructions implement the window control method of any of the embodiments above.
Additional features and corresponding advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
Fig. 1A and fig. 1B are scene diagrams of an electronic device and a pointing component according to an embodiment of the present application;
fig. 1C and fig. 1D are schematic structural diagrams of a pointing component according to an embodiment of the present application;
fig. 2a to fig. 2c are schematic diagrams illustrating a selection manner of a target window according to an embodiment of the present application;
fig. 3A to fig. 3C are schematic diagrams of selection results corresponding to selection manners of a target window according to an example of the present application;
FIG. 4 is a schematic diagram of a distribution of windows according to an example of the present application;
fig. 5A schematically illustrates a structure of a VR device disclosed in an embodiment of the present application;
fig. 5B is a schematic diagram illustrating a structure of a memory of a VR device disclosed in an embodiment of the present application;
fig. 6A to 6B are schematic diagrams illustrating the collapsing and expanding of windows at the same level according to an embodiment of the present application;
fig. 7A to 7C are schematic diagrams illustrating the collapsing and expanding of windows at different levels according to an embodiment of the present application;
8A-8C are schematic diagrams of the window being in different levels but in the same longitudinal space according to the embodiment of the present application;
fig. 9A to 9C are further schematic diagrams illustrating the collapsing and expanding of windows at different levels according to an embodiment of the present application;
10A-10C are schematic diagrams of the replacement of two windows at the same level according to an embodiment of the present application;
11A-11B are schematic diagrams of the replacement of multiple windows at different levels according to an embodiment of the present application;
12A-12B are schematic diagrams of a plurality of windows at different levels but in the same longitudinal space according to an embodiment of the present application;
fig. 13 is a schematic flowchart of applying a window control method provided in an embodiment of the present application to control windows at different levels or at the same level to be collapsed and expanded;
fig. 14A to fig. 14B are schematic flowcharts of applying a window control method provided in an embodiment of the present application to control the replacement of windows at different levels or the same level;
fig. 15 is a schematic structural diagram of a window control apparatus according to an embodiment of the present application;
fig. 16 is a schematic structural diagram of an electronic device disclosed in an embodiment of the present application;
fig. 17 is a schematic structural diagram of an SOC disclosed in an embodiment of the present application.
Detailed Description
The window control method provided by the embodiment of the application is applied to an electronic device, including but not limited to an AR device or a VR device. An AR device adopts augmented reality technology, a newer technology that seamlessly integrates real-world information with virtual-world information. A VR device adopts virtual reality technology, which mainly covers the simulated environment, perception, natural skills, and sensing devices. The simulated environment is a computer-generated, dynamic, three-dimensional, realistic image. Perception means that an ideal VR system should offer every kind of perception a person has: in addition to the visual perception generated by computer graphics technology, there are also auditory, tactile, force, and motion perception. Natural skills refer to head rotation, eye movement, gestures, or other human behaviors; the computer processes the data matched to the participant's motions, responds to the user's input in real time, and feeds the results back to the user's senses.
First, please refer to fig. 1A and 1B, which are scene diagrams of an electronic device and a pointing component according to an embodiment of the present application.
The scenes shown in fig. 1A and 1B include, but are not limited to, an electronic device 10, exemplified by a VR device; a pointing component 11, exemplified by a remote control handle; and a plurality of windows 1000 presented within a visible area 100 of the VR device (e.g., the area between dashed line 101 and dashed line 102 in fig. 1A). The VR device and the remote control handle are associated with one another so that the VR device controls the windows selected by the remote control handle. After the user puts on the VR device, a plurality of windows 1000 are presented within its visible area 100. Using the pointing component 11, the user can select some of those windows as the target windows to be controlled. After the target windows are selected, the VR device executes the operation indicated by the pointing component 11 on each target window simultaneously.
It is noted that although the pointing component 11 is exemplified above as a remote control handle, the present application is not limited thereto: the pointing component 11 may also be a glove as shown in fig. 1B, a watch as shown in fig. 1C, a bracelet or the like as shown in fig. 1D, or the user's hand. Likewise, although the pointing component 11 is described above as controlling windows displayed by a VR device, the application is not so limited; the pointing component may also serve other functions, such as moving an object or performing an action in a game.
According to the embodiment of the application, the ways of selecting a target window include, but are not limited to, point selection (clicking: the clicked window is selected as a target window), area selection (drawing a closed figure: the windows covered by the closed figure are the target windows), and line selection (drawing a line: the windows the line passes through are selected as target windows).
Fig. 2 illustrates example ways of selecting a target window according to an embodiment of the present application. As shown in fig. 2a, the leftmost panel of fig. 2, a user may click on one or more windows presented in the visible area of the VR device using keys provided on the remote control handle, or by making a tap- or click-like motion while holding the handle; the clicked windows become the selected target windows. As shown in fig. 2b, the middle panel of fig. 2, the user may select one or more windows by tapping or clicking on them with the hand, or by sweeping the hand so that the windows in the area the hand passes through become the selected target windows. The hand can move through the visible area of the VR device: when the movement track forms a closed figure, the windows covered by the closed figure are the target windows; when the track forms a curve or a straight line, the windows the curve or line passes through can be selected as target windows. As shown in fig. 2c, the rightmost panel of fig. 2, the user may grasp with the hand, and the windows covered by the grasping hand become the selected target windows.
Fig. 3A to 3C are schematic diagrams of the selection results corresponding to these ways of selecting a target window. As illustrated in fig. 3A, the user clicks on window A, window C, and window D among windows A, B, C, and D using the remote control handle or a finger. Window A, window C, and window D, as the selected target windows, may show a selection marker 1001 in their upper right corner. The selection marker 1001 may be a black dot as shown in fig. 3A; other shapes and types of marker are also possible, such as a triangle, diamond, or square. The embodiments of the present application are not limited thereto.
As illustrated in fig. 3B, the user moves a hand or the remote control handle within the visible area of the VR device. The movement track forms a closed figure, such as the regular rectangle 1002, the triangle 1003, or the irregular shape 1004 shown in fig. 3B. Window A, window B, and window C, covered respectively by the rectangle 1002, the triangle 1003, and the irregular shape 1004, are the selected target windows. Of course, the shape of the closed figure is not limited to those illustrated in fig. 3B; other shapes are possible, and the embodiments of the present application are not limited herein. It is understood that the closed figure formed by the movement track can determine the selected target windows as follows: a window is selected as a target window when it lies entirely inside the closed figure, or it may be considered a target window when part of it does (for example, a window at least 50% of whose area lies inside the closed figure is considered a target window).
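For illustration, one way to realize this rule is a point-in-polygon test plus an approximate area estimate. The sketch below uses assumed names throughout (with the Window type from the first sketch), sampling a grid over each window's rectangle and applying the 50% threshold from the example above:

```python
from typing import List, Tuple

Point = Tuple[float, float]

def point_in_polygon(p: Point, poly: List[Point]) -> bool:
    """Even-odd ray-casting test: is p inside the closed figure poly?"""
    x, y = p
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def covered_fraction(rect, poly: List[Point], n: int = 20) -> float:
    """Approximate the fraction of a window rectangle lying inside the
    closed figure by testing an n-by-n grid of sample points."""
    x, y, w, h = rect
    hits = sum(point_in_polygon((x + (i + 0.5) * w / n, y + (j + 0.5) * h / n), poly)
               for i in range(n) for j in range(n))
    return hits / (n * n)

def targets_in_closed_figure(windows, poly: List[Point], threshold: float = 0.5):
    """A window is selected when at least `threshold` of its area (50%
    in the example above) lies inside the closed figure."""
    return [w for w in windows if covered_fraction(w.rect, poly) >= threshold]
```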
As illustrated in fig. 3C, the user moves a hand or the remote control handle within the visible area of the VR device. The movement forms a track such as the straight line 1005 or the curve 1006 shown in fig. 3C. Window A, window B, and window C, through which the straight line 1005 or the curve 1006 passes, are the selected target windows.
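Line selection admits a similarly small sketch, reusing the Point, Rect, and Window types from the first sketch. Since the pointing trajectory arrives as sampled positions, a window is treated here as crossed when any sample falls inside it; that is an assumption, and a sparse trajectory could skip a very small window, which denser sampling or a proper segment-rectangle intersection test would avoid:

```python
def point_in_rect(p: Point, rect: Rect) -> bool:
    x, y, w, h = rect
    return x <= p[0] <= x + w and y <= p[1] <= y + h

def targets_on_stroke(windows: List[Window], trajectory: List[Point]) -> List[Window]:
    """Line selection: a window becomes a target when the sampled
    pointing trajectory passes through it."""
    return [w for w in windows
            if any(point_in_rect(p, w.rect) for p in trajectory)]
```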
The multiple windows 1000 within the visible area 100 of the VR device are located in three-dimensional space. Three-dimensional space is formed by the three dimensions of length, width, and height, corresponding to the x-axis (horizontal axis), the y-axis (vertical axis), and the z-axis (depth axis). The windows 1000 may lie in the same plane region of the three-dimensional space (the same level) or in different plane regions (different levels). For example, as shown in fig. 4, window A, window B, window C, and window D all lie in the same plane region (a first plane region); that is, they are windows at the same level (denoted Layer-1 in the embodiments of the present application). Window E and window F are located in a second plane region behind the first plane region; that is, they are at a second level (denoted Layer-2), different from the level of windows A, B, C, and D. Window G, window H, and window I are located in a third plane region behind the second plane region; that is, they are at a third level (denoted Layer-3). The windows in the three plane regions are thus at three different levels.
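The layered layout of fig. 4 can be modeled by tagging each window with its level and grouping on it. The coordinates below are illustrative assumptions, not taken from the patent (Window is the type from the first sketch):

```python
from collections import defaultdict
from typing import Dict, List

def by_layer(windows: List[Window]) -> Dict[int, List[Window]]:
    """Group windows into their plane regions (hierarchy levels)."""
    layers: Dict[int, List[Window]] = defaultdict(list)
    for w in windows:
        layers[w.layer].append(w)
    return dict(layers)

# The fig. 4 arrangement: Layer-1 nearest the user, Layer-3 farthest back.
fig4 = by_layer([
    Window("A", 1, (0, 0, 2, 1)), Window("B", 1, (3, 0, 2, 1)),
    Window("C", 1, (0, 2, 2, 1)), Window("D", 1, (3, 2, 2, 1)),
    Window("E", 2, (0, 0, 2, 1)), Window("F", 2, (3, 0, 2, 1)),
    Window("G", 3, (0, 0, 2, 1)), Window("H", 3, (3, 0, 2, 1)),
    Window("I", 3, (0, 2, 2, 1)),
])
```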
Of course, the multiple windows 1000 in the visible area 100 of the VR device may also be displayed in other ways, and the embodiment of the present application is not limited herein.
According to some embodiments of the present application, the VR device and the remote control handle are associated with one another, and the VR device can control the windows selected by the remote control handle. When a user is using the VR device, multiple windows 1000 at the same level or at different levels, as shown in fig. 1, are presented within the visible area 100. Using the pointing component 11, the user can select some of those windows as the target windows to be controlled; the selected target windows may sit at the same level or at different levels. After the target windows are selected, the VR device executes the operation indicated by the pointing component 11 on each target window simultaneously. The indicated operation includes, but is not limited to, a scaling operation that zooms the target windows out or in, a replacement operation that swaps the selected target windows with one another, a deletion operation that deletes them, and the like. Because the user selects multiple target windows with the pointing component, the VR device can conveniently operate on the target windows in batch, improving the user experience.
The following describes a structure of a VR device that implements the window control method exemplified in the above embodiment of the present application:
as shown in fig. 5A, the VR device 50 shown in fig. 5A includes, but is not limited to, a processor 11, and the processor 11 is configured to generate corresponding operation control signals, send the corresponding operation control signals to corresponding components in the device, read and process data in software, in particular, read and process data and programs in a memory, so as to enable each functional module in the device to execute corresponding functions, thereby controlling the corresponding components to act according to the requirements of the instructions. Such as for various media processing algorithms including human-computer interaction, motion tracking/prediction (e.g., tracking user hand movement, movement and rotation of a remote control handle, etc. in the embodiments of the present application), rendering display, audio processing, window reduction or enlargement, window replacement, and window deletion.
The sensor system 12: the system is used for collecting, acquiring or sending information, including image information and distance information, such as hand information and ray click information of a remote control handle in the embodiment of the application. The sensor system of the embodiment of the application can include a 3-axis or 6-axis sensor, and is used for acquiring motion information of the VR device, such as angular velocity and linear acceleration; simultaneously, the hand movement is positioned, tracked and identified; the sensor system also acquires static and dynamic characteristics of the hand. Static feature information such as fingertip fixation point, palm centroid, hand joints, etc. Such features typically employ single frame data acquisition. And dynamic characteristic information such as displacement vectors, motion speed and the like. Such characteristic information is generally acquired through multi-frame data. As a sensor system, some specific program instructions may also be stored therein.
The memory 13 is used for storing programs and various data, and mainly stores software units such as an operating system, applications, and functional instructions, or a subset thereof, or an extended set thereof. Non-volatile random access memory may also be included to provide processor 11 with functionality including managing hardware, software, and data resources in the computing processing device, supporting control software and applications. The method is also used for storing the selection range of the multiple windows and storing the running programs and the applications. As shown in fig. 5B, at least one storage unit may be disposed in the memory 13 of the VR device 50, and each storage unit may have a respective storage function, for example, a first storage unit is used to store software units such as an operating system, applications, and functional instructions; the second storage unit is used for storing applications, running programs and the like; the third storage unit is used for storing the selection range selected by the user for the multiple windows.
Display element 14: generally comprises a display screen and associated optics for displaying content; typically, a display interface is presented in the display screen for human-computer interaction and window browsing.
Acoustic element 15: such as a microphone, speaker, or earphone, for capturing and outputting sound.
The physical hardware 16: such as switch keys, volume keys, mechanical control keys and other physical function keys.
Besides the components 11-16 described above, the device may also comprise other components 17 that make its function and appearance more versatile and refined.
The hardware 11-16 above and some of the other components 17 may be electrically coupled and communicate via a bus.
A window control method provided in an embodiment of the present application is described below with reference to the accompanying drawings:
according to some embodiments of the application, the plurality of windows within the visible area of the VR device are windows at the same hierarchical level. The window control method is applied to controlling a plurality of windows at the same level to be stored and released.
As shown in FIG. 6A, windows at the same level (Layer-1) include, but are not limited to, window A, window B, window C, and window D. The user establishes the selection in any of the ways shown in figure 2. For example, the user establishes a rectangular movement track by using the movement of the hand, and the rectangle formed by the movement track covers the window a, the window B, the window C and the window D. And the selected window A, window B, window C and window D are used as target windows. And, the selected window a, window B, window C and window D present the selected mark 1001 for prompting the user which pages are currently in the selected state and waiting for the subsequent operation.
After the window A, the window B, the window C and the window D are selected, the user can simultaneously reduce or enlarge the window A, the window B, the window C and the window D through the hand retraction operation. As shown in FIG. 6B, when the fingers of the user's hand are closed, window A, window B, window C, and window D shrink together. The scale at which window a, window B, window C and window D are reduced may be determined by how close the user's hand fingers are relative to the original position of the fingers. For example, the VR device tracks the movement of the fingers of the user's hand in real time, and when the fingers of the user's hand are fully closed, window a, window B, window C, and window D are scaled to a minimum. When the fingers of the user's hand are closed to half of the original state, window a, window B, window C, and window D are scaled to half of the original size. The reduction ratios of the window a, the window B, the window C, and the window D may also be determined in other manners, and the embodiment of the present application is not limited herein.
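The proportional rule above (half-closed fingers give half the size, a fully closed hand the minimum) reduces to a single mapping from a sensed closure value to a scale factor. A minimal sketch, assuming a normalized closure input and the Window type from the first sketch:

```python
def apply_pinch_scale(targets: List[Window], closure: float,
                      min_scale: float = 0.1) -> None:
    """closure is 0.0 with the fingers at their original open position
    and 1.0 with the hand fully closed; the scale shrinks linearly in
    between and is clamped to min_scale (the icon-like minimum) at a
    full fist."""
    scale = max(min_scale, 1.0 - closure)
    for w in targets:
        w.scale = scale
```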
In addition, it will be appreciated that when a remote control handle is used, the reduction or enlargement of the windows may be controlled by a key of the handle or by waving the handle in a particular way.
It should be noted that the reduced windows A, B, C, and D may still be displayed in the plane areas they occupied before the reduction; that is, their levels after the reduction may be the same as before. In addition, each reduced window can be presented as an icon at an off-center position within the visible area of the VR device (e.g., the lower right or lower left corner) so that the user can enlarge it again.
Thus, when the window control method provided by the embodiment of the application is applied to multiple windows at the same level, the windows can be operated on in batch, which improves the user experience compared with operating a single window at a time.
Although the above embodiment describes selecting and reducing all of windows A to D in the first-level display area, the present application is not limited thereto; only some of the windows may be selected, such as windows A and D, or windows A, B, and C.
According to some embodiments herein, the plurality of windows within the visible area of the VR device are windows at different hierarchy levels, and the window control method is applied to collapsing and expanding those windows. Windows at a first level (denoted Layer-1 in the present embodiment), a second level (denoted Layer-2), and a third level (denoted Layer-3) are taken as an example. In the embodiment of the present application, the windows at the first, second, and third levels may be at the same position on the z-axis of the three-dimensional coordinate system. It is understood that the remaining windows in the VR device may be located at levels other than the first to third levels mentioned here, and there may be more levels within the visible area of the VR device; the embodiments of the present application are not limited thereto.
In three-dimensional space, as shown in fig. 7A, the first-level, second-level, and third-level windows may be at the same position on the z-axis. That is, the vertex Layer-10 of the plane region formed by the first-level windows, the vertex Layer-20 of the plane region formed by the second-level windows, and the vertex Layer-30 of the plane region formed by the third-level windows have the same z-coordinate, namely z1. The windows at the first level include, but are not limited to, window A, window B, window C, and window D; the windows at the second level include, but are not limited to, window E and window F; and the windows at the third level include, but are not limited to, window G, window H, and window I.
As shown in fig. 7B, the user establishes the selection area in any of the ways shown in fig. 2. For example, by clicking or by moving a finger to establish a selection area, the user selects window A, window B, and window D in the first level as the selected target windows there (a first selection area), selects window F in the second level as a selected target window (a second selection area), and selects window I in the third level as a selected target window (a third selection area). The selected windows A, B, D, F, and I present the selection marker 1001 to remind the user which pages are currently selected and awaiting a subsequent operation. It is noted that the user may also select target windows by clicking windows at different levels with a finger: the user clicks window A, window B, and window D of the first level as the selected target windows in the first level, and when a window of the second level (the level behind the first) overlaps the selected first-level windows by more than a threshold (for example, 80% of its area), that second-level window may be directly selected as a target window as well.
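For illustration, the overlap rule can be sketched with axis-aligned rectangle overlap, projecting windows of deeper layers onto the selected ones. This is a sketch under stated assumptions, not the patented implementation: the Window and Rect types come from the first sketch, and summing per-window overlaps assumes the selected windows do not overlap one another:

```python
def overlap_fraction(inner: Rect, outer: Rect) -> float:
    """Fraction of rectangle `inner`'s area covered by rectangle `outer`
    when both are projected onto the same plane."""
    x1, y1, w1, h1 = inner
    x2, y2, w2, h2 = outer
    ox = max(0.0, min(x1 + w1, x2 + w2) - max(x1, x2))
    oy = max(0.0, min(y1 + h1, y2 + h2) - max(y1, y2))
    return (ox * oy) / (w1 * h1)

def propagate_selection(selected: List[Window], windows: List[Window],
                        threshold: float = 0.8) -> List[Window]:
    """Extend a selection made on one layer to windows of other layers
    whose projected area is covered beyond `threshold` (80% in the
    text above)."""
    targets = list(selected)
    for w in windows:
        if w in targets:
            continue
        covered = sum(overlap_fraction(w.rect, s.rect)
                      for s in selected if s.layer != w.layer)
        if covered >= threshold:
            targets.append(w)
    return targets
```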
After window A, window B, window D, window F, and window I are selected, the user can zoom all of them out or in simultaneously with a hand open/close gesture. As shown in fig. 7C, when the fingers of the user's hand close, the five windows shrink together. The scale to which they are reduced may be determined by how far the fingers have closed relative to their original position. For example, the VR device tracks the movement of the fingers in real time: when the fingers are fully closed, the windows are scaled to a minimum; when the fingers are closed to half of the original state, the windows are scaled to half of their original size. The scaling of window A, window B, window D, window F, and window I may also be determined in other ways, and the embodiment of the present application is not limited herein.
It is noted that the reduced windows A, B, and D may still be displayed in the first-level plane area they occupied before the reduction; that is, their level after the reduction may be the same as before. Likewise, the reduced window F may still be displayed in the second-level plane area it occupied before the reduction, and the reduced window I in the third-level plane area. Because the first-level windows are reduced, more windows of the next level can conveniently be displayed. In addition, each reduced window can be presented as an icon at an off-center position within the visible area of the VR device (e.g., the lower right or lower left corner of the visible area) so that the user can enlarge it again.
In addition, it will be appreciated that when a remote control handle is used, the reduction or enlargement of the windows may be controlled by a key of the handle or by waving the handle in a particular way.
Thus, when the window control method provided by the embodiment of the application is applied to multiple windows at different levels, the windows can be operated on in batch, which improves the user experience compared with operating a single window at a time.
Although the above embodiment describes selecting and reducing window A, window B, window D, window F, and window I across the first- to third-level display areas, the present application is not limited thereto; only some of the windows may be selected, for example windows A and B of the first level, windows E and F of the second level, and windows H and I of the third level.
According to some embodiments herein, the plurality of windows within the visible area of the VR device are windows at different levels but in the same longitudinal space, and the window control method is applied to collapsing and expanding such windows. Windows at a first level (denoted Layer-1 in the embodiment of the present application), a second level (denoted Layer-2), and a third level (denoted Layer-3) are again taken as the example. Here, the windows at the first, second, and third levels may be at the same position on the y-axis of the three-dimensional coordinate system ("the same position" meaning that the overlap between the windows of the three levels reaches a threshold, which may be any value such as 80% or 85%; the embodiment of the present application is not limited herein). It is understood that the remaining windows in the VR device may be located at levels other than the first to third levels mentioned here, and there may be more levels within the visible area of the VR device; the embodiments of the present application are not limited thereto.
In three-dimensional space, as shown in fig. 8A, the first-level, second-level, and third-level windows may be at the same position on the y-axis. That is, the vertex Layer-40 of the plane region formed by the first-level windows, the vertex Layer-50 of the plane region formed by the second-level windows, and the vertex Layer-60 of the plane region formed by the third-level windows have the same y-coordinate, namely y1. The windows at the first level include, but are not limited to, window A, window B, window C, and window D; the windows at the second level include, but are not limited to, window E, window F, window G, and window H; and the windows at the third level include, but are not limited to, window I, window J, and window K.
As shown in fig. 8B, the user establishes the selection areas in any of the ways shown in fig. 2. For example, by clicking or by establishing a selection area, the user selects window A and window B in the first level, window E in the second level, and window I in the third level as the target windows of a first selection group (denoted group1 in this embodiment of the present application), and selects window D in the first level, window M and window G in the second level, and window K in the third level as the target windows of a second selection group (denoted group2). The selected windows A, B, E, I, D, M, G, and K present the selection marker 1001 to remind the user which pages are currently selected and awaiting a subsequent operation.
After windows A, B, E, I, D, M, G, and K are selected, the user can zoom them out or in with a hand open/close gesture. As shown in fig. 8C, when the fingers of the user's hand close, the eight windows shrink together: window A, window B, window E, and window I shrink as a first group of windows, and window D, window M, window G, and window K as a second group. The scale to which they are reduced may be determined by how far the fingers have closed relative to their original position. For example, the VR device tracks the movement of the fingers in real time: when the fingers are fully closed, the windows are scaled to a minimum; when the fingers are closed to half of the original state, the windows are scaled to half of their original size. The reduction ratios may also be determined in other ways, and the embodiment of the present application is not limited herein.
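Reusing apply_pinch_scale from the earlier sketch, the two groups of fig. 8B can collapse as independent batches. The win lookup and scene_windows below are hypothetical names for illustration only:

```python
# A hypothetical lookup from window names to Window objects in the scene:
win = {w.name: w for w in scene_windows}
groups = {
    "group1": [win["A"], win["B"], win["E"], win["I"]],
    "group2": [win["D"], win["M"], win["G"], win["K"]],
}
for members in groups.values():              # each group shrinks as one batch
    apply_pinch_scale(members, closure=0.5)  # half-closed hand: half size
```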
In addition, it will be appreciated that when a remote control handle is used, the reduction or enlargement of the windows may be controlled by a key of the handle or by waving the handle in a particular way.
It is noted that the reduced windows A, B, E, I, D, M, G, and K may still be displayed in the plane areas of the levels they occupied before the reduction; that is, their levels after the reduction may be the same as before. Because the first-level windows are reduced, more windows of the next level can conveniently be displayed. In addition, each reduced window can be presented as an icon at an off-center position within the visible area of the VR device (e.g., the lower right or lower left corner of the visible area) so that the user can enlarge it again.
Thus, when the window control method provided by the embodiment of the application is applied to multiple windows that are at different levels but in the same longitudinal space, the windows can be operated on in batch, which improves the user experience compared with operating a single window at a time.
According to some embodiments herein, the plurality of windows within the visible area of the VR device are windows at different hierarchy levels, and the window control method is applied to collapsing and expanding those windows. Windows at a first level (denoted Layer-1 in the present embodiment), a second level (denoted Layer-2), and a third level (denoted Layer-3) are again taken as the example. In this case, the windows at the first, second, and third levels may be at the same position on the x-axis of the three-dimensional coordinate system. It is understood that the remaining windows in the VR device may be located at levels other than the first to third levels mentioned here, and there may be more levels within the visible area of the VR device; the embodiments of the present application are not limited thereto.
In three-dimensional space, as shown in fig. 9A, the first-level, second-level, and third-level windows may overlap at the same position on the x-axis. That is, the x-coordinates of the vertex Layer-70 of the plane region formed by the first-level windows, the vertex Layer-80 of the plane region formed by the second-level windows, and the vertex Layer-90 of the plane region formed by the third-level windows are the same or partially coincide, namely all equal x1. For example, when the user clicks window D of the first level with a finger, window D is selected, and its coordinates match the coordinates of window F of the second level and of window I of the third level. It should be understood that when the coordinates of window F at the second level and of part of window I at the third level coincide with the coordinates of window D, the windows of the three levels may still be considered to overlap at the same position on the x-axis; the embodiment of the present application is not limited thereto. The windows at the first level include, but are not limited to, window A, window B, window C, and window D; the windows at the second level include, but are not limited to, window E and window F; and the windows at the third level include, but are not limited to, window G, window H, and window I.
As shown in fig. 9B, the user establishes the selection in any of the ways shown in fig. 2. For example, by clicking or by moving a finger to establish a selection area, the user selects window D in the first level as the selected target window there (a first selection area), selects window F in the second level as a selected target window (a second selection area), and selects window I in the third level as a selected target window (a third selection area). The selected windows D, F, and I present the selection marker 1001 to remind the user which pages are currently selected and awaiting a subsequent operation. It is noted that the user may also select target windows by clicking windows at different levels with a finger: the user clicks window D of the first level as the selected target window in the first level, and when a window of the second level (the level behind the first) overlaps the selected first-level window by more than a threshold (for example, 80% of its area), that second-level window (for example, window F) may be directly selected as a target window as well. The embodiment of the present application does not limit the way the target windows are selected.
After window D, window F, and window I are selected, the user can simultaneously reduce or enlarge them with a retracting gesture of the hand. As shown in fig. 9C, when the fingers of the user's hand close together, window D, window F, and window I shrink together. The ratio by which window D, window F, and window I are reduced may be determined by how far the fingers have closed relative to their original positions. For example, the VR device tracks the movement of the fingers of the user's hand in real time: when the fingers are fully closed, window D, window F, and window I are scaled to a minimum; when the fingers are closed to half of their original spread, window D, window F, and window I are scaled to half of their original size. The reduction ratios of window D, window F, and window I may also be determined in other manners, and the embodiment of the present application is not limited herein.
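As an illustration of the finger-tracking behavior described above, the sketch below maps the current finger spread to a scale factor and applies it to every selected window at once. It assumes, purely for illustration, that the device reports a finger-spread value and that each window remembers its original size, so closing the fingers halfway yields half the original size as in the example.

```python
MIN_SCALE = 0.1  # assumed floor so windows never collapse to zero size

def scale_from_closure(current_spread: float, original_spread: float) -> float:
    """Map finger spread to a scale factor: original spread -> 1.0 (unchanged),
    half-closed -> 0.5, fully closed -> the minimum scale."""
    ratio = current_spread / original_spread
    return max(MIN_SCALE, min(ratio, 1.0))

def apply_batch_scale(targets, current_spread: float, original_spread: float) -> None:
    """Scale every selected window simultaneously, relative to its original size.
    Assumes each target carries original_width/original_height attributes."""
    s = scale_from_closure(current_spread, original_spread)
    for w in targets:  # e.g. windows D, F, and I
        w.width = w.original_width * s
        w.height = w.original_height * s
```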
In addition, it will be appreciated that, in the scenario of using a remote control handle, the reduction or enlargement of a window may be triggered by a key of the remote control handle or by waving the remote control handle in a particular manner.
It is noted that the reduced window D may still be displayed in the plane area of the first level where window D was located before the reduction; that is, the level of window D after the reduction may be the same as its level before the reduction. Likewise, the reduced window F may still be displayed in the plane area of the second level, and the reduced window I may still be displayed in the plane area of the third level. In addition, each reduced window can be presented as an icon at a non-central location within the visible area of the VR device (e.g., the lower-right or lower-left corner of the visible area) so that the user can enlarge it again.
In addition, the selection area established for the target windows can be stored in the memory, so that it can be recalled and used directly, avoiding re-establishing the selection area and improving the efficiency of operating the windows.
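A minimal sketch of such a reusable selection store follows; the cache structure and function names are illustrative assumptions, not the patent's implementation.

```python
selection_cache: dict[str, list] = {}

def store_selection(name: str, targets: list) -> None:
    """Keep a copy of the selected target windows under a reusable name."""
    selection_cache[name] = list(targets)

def recall_selection(name: str) -> list:
    """Return a previously stored selection so it can be used directly,
    without re-establishing the selection area."""
    return selection_cache.get(name, [])
```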
Therefore, when the window control method provided by the embodiment of the application is applied to controlling a plurality of windows at different levels, the windows can be operated in batch, which improves the user experience compared with operating a single window at a time.
Although the above embodiment explains selecting and reducing window D, window F, and window I across the first to third hierarchical display areas, the present application is not limited thereto; only some of the windows may be selected, for example, window A and window B of the first hierarchy, window E and window F of the second hierarchy, and window H and window I of the third hierarchy.
According to some embodiments of the present application, the plurality of windows within the visible area of the VR device are windows at different levels or at the same level, and the window control method is applied to controlling the replacement of two windows within the same level. Windows at the first level are taken as an example to explain how the window control method provided by the embodiment of the present application is applied to controlling the replacement between two windows at the same level.
In the embodiment of the present application, windows at different levels are exemplified by windows at a first level (denoted as Layer-1 in this embodiment), a second level (denoted as Layer-2), and a third level (denoted as Layer-3). A plurality of windows at different levels means that the window at the first level, the window at the second level, and the window at the third level may be overlaid at the same position on the x-axis of the three-dimensional space coordinates. That is, the planar region formed by the windows of the first hierarchy, the planar region formed by the windows of the second hierarchy, and the planar region formed by the windows of the third hierarchy have x-axis coordinates that are the same or partially coincide. For example, when the user clicks window D of the first hierarchy with a finger, window D is selected. The coordinates of the position clicked on window D in the plane of the first hierarchy are mapped onto the plane of the second hierarchy and the plane of the third hierarchy, where they coincide with window F of the second hierarchy and with window I of the third hierarchy; thus window F of the second hierarchy and window I of the third hierarchy are also selected. It should be understood that, when the coordinates of window F of the second level and part of the coordinates of window I of the third level overlap the area covered by window D to a predetermined degree, the windows of the three levels may likewise be considered to be overlaid at the same position on the x-axis of the three-dimensional space coordinates. For example, when the user clicks window D with a finger, window D is selected; when the range covered by window D, mapped onto the planes of the second and third hierarchies, coincides with window F of the second hierarchy and with window I of the third hierarchy over more than 50% of the area covered by window D, window F and window I are also determined to be selected. The embodiments of the present application are not limited thereto. It is understood that the remaining windows in the VR device may be located at levels other than the first to third levels mentioned here, and there may be more levels in the visible area of the VR device; the embodiments of the present application are not limited thereto.
The positions of the first-level, second-level, and third-level windows in three-dimensional space may refer to the positions shown in fig. 9A, and are not described here again.
As shown in fig. 10A, the user establishes the selection in any of the ways shown in fig. 2. For example, the user selects window A in the first hierarchy as the first target window by clicking with a finger or establishing a selection area, and selects window B in the first hierarchy as the second target window in the same manner. The selected window A and window B each present a selected identifier 1001 for prompting the user which windows are currently in the selected state and awaiting subsequent operation.
After window A and window B are selected, as shown in fig. 10B, the user moves window B with a finger toward the display area where window A is located, so as to move window B to the position of window A and window A to the position of window B.
As shown in fig. 10C, after the positions of window A and window B are replaced, window B is located in the display area of window A, and window A is located in the display area of window B. It is noted that, when the area of the display region of window A differs from the area of the display region of window B, after the positions are interchanged, the window area of window B is adjusted to be the same as the area of the display region where window A was located, and the window area of window A is adjusted to be the same as the area of the display region where window B was located. In other words, the area of the first target window is adapted to the display area at the position of the second target window, and the area of the second target window is adapted to the display area at the position of the first target window.
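The position-and-area exchange described above can be sketched as follows; the Rect class and its fields are illustrative assumptions for a window's display region, and the single tuple assignment swaps both position and size so each window adapts to the other's region even when the two areas differ.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float       # position of the window's display region on its layer plane
    y: float
    width: float
    height: float

def swap_windows(first: Rect, second: Rect) -> None:
    """Exchange the display regions of two target windows: each window takes
    over the other's position and area."""
    (first.x, first.y, first.width, first.height,
     second.x, second.y, second.width, second.height) = (
        second.x, second.y, second.width, second.height,
        first.x, first.y, first.width, first.height)
```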
In the above, the replacement is triggered by moving one window toward the other with a finger. According to the embodiment of the present application, after both windows to be replaced are selected, the replacement may instead be triggered by a specific gesture, for example, rotating the palm. In addition, it will be appreciated that, in the scenario of using a remote control handle, the replacement of windows may be triggered by a key of the remote control handle or by waving the remote control handle in a specific manner.
Therefore, when the window control method provided by the embodiment of the application is applied to controlling a plurality of windows at the same level, the windows can be operated in batch, which improves the user experience compared with operating a single window at a time.
Although the above embodiment explains selecting window A and window B on the first hierarchical display area and replacing their positions with each other, the present application is not limited thereto; other windows of the first hierarchy or of other hierarchies may be selected, for example, window A and window D of the first hierarchy, window E and window F of the second hierarchy, or window H and window I of the third hierarchy.
According to some embodiments of the present application, the plurality of windows within the visible area of the VR device are windows at different levels or at the same level. The window control method of the embodiment of the application is applied to controlling the replacement among a plurality of windows at different levels. Windows at a first hierarchy and a window at a second hierarchy are taken as an example to explain how the window control method provided by the embodiment of the present application is applied to controlling the replacement between windows at different hierarchies.
In the embodiment of the present application, windows at different levels are again exemplified by windows at the first level (Layer-1), the second level (Layer-2), and the third level (Layer-3).
A plurality of windows located at different hierarchies may be selected in the manner explained above in conjunction with fig. 9A and 9B.
The positions of the first-level, second-level, and third-level windows in three-dimensional space may refer to the positions shown in fig. 9A, and are not described here again.
As shown in fig. 11A, the user establishes the selection area in any of the ways shown in fig. 2. For example, the user selects window A, window B, and window C in the first hierarchy as the first target window by clicking with a finger or establishing a selection area, and selects window E in the second hierarchy as the second target window in the same manner. The selected window A, window B, window C, and window E each show a selected identifier 1001 for prompting the user which windows are currently in the selected state and awaiting subsequent operation.
After window A, window B, window C, and window E are selected, the user moves window A, window B, and window C with the fingers toward the display area where window E is located, so that window A, window B, and window C move to the position of window E, and window E moves to the position of the display area formed by window A, window B, and window C.
As shown in fig. 11B, after the replacement is completed, window E is located in the display area where window A, window B, and window C were located, and window A, window B, and window C are located in the display area where window E was located.
It is worth noting that, when the area of the display region formed by window A, window B, and window C differs from the area of the display region of window E, after the interchange, the window area of window E is adjusted to be the same as the area of the display region where window A, window B, and window C were located, and the window area of the planar region formed by window A, window B, and window C is adjusted to be the same as the area of the display region where window E was located. That is, the area of the first target window is adapted to the display area at the position of the second target window, and the area of the second target window is adapted to the display area at the position of the first target window.
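A minimal sketch of such a group-for-window exchange follows, reusing the Rect sketch above: the group's bounding box is computed, the single window adopts it, and the group members are translated and scaled to fit the window's former region. All names are illustrative assumptions, not the patent's implementation.

```python
def bounding_box(group: list[Rect]) -> Rect:
    """Smallest rectangle enclosing every window in the group."""
    x1 = min(w.x for w in group)
    y1 = min(w.y for w in group)
    x2 = max(w.x + w.width for w in group)
    y2 = max(w.y + w.height for w in group)
    return Rect(x1, y1, x2 - x1, y2 - y1)

def fit_group_into(group: list[Rect], src: Rect, dst: Rect) -> None:
    """Translate and scale every member so the group's internal layout is
    preserved inside the destination region."""
    sx, sy = dst.width / src.width, dst.height / src.height
    for w in group:
        w.x = dst.x + (w.x - src.x) * sx
        w.y = dst.y + (w.y - src.y) * sy
        w.width *= sx
        w.height *= sy

def swap_group_with_window(group: list[Rect], window: Rect) -> None:
    box = bounding_box(group)
    old = Rect(window.x, window.y, window.width, window.height)
    # Window E adapts to the region formed by windows A, B, and C ...
    window.x, window.y, window.width, window.height = box.x, box.y, box.width, box.height
    # ... while A, B, and C are refitted into window E's former region.
    fit_group_into(group, box, old)
```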
In the above, the replacement is triggered by moving some windows toward other windows with a finger. According to the embodiment of the present application, after the windows to be replaced are selected, the replacement may instead be triggered by a specific gesture, for example, rotating the palm. In addition, it will be appreciated that, in the scenario of using a remote control handle, the replacement of windows may be triggered by a key of the remote control handle or by waving the remote control handle in a specific manner.
Therefore, when the window control method provided by the embodiment of the application is applied to controlling a plurality of windows at different levels, the windows can be operated in batch, which improves the user experience compared with operating a single window at a time.
Although the above embodiment explains selecting window A, window B, and window C on the first-hierarchy display area together with window E of the second hierarchy and replacing their positions with each other, the present application is not limited thereto; the remaining windows of the first hierarchy or windows of other hierarchies may be selected for position replacement, for example, window A, window B, window C, and window D of the first hierarchy may be replaced with window E and window F of the second hierarchy, or window G of the third hierarchy may be replaced with window E of the second hierarchy.
According to some embodiments of the present application, the plurality of windows within the visible area of the VR device are windows at different levels or at the same level. The window control method of the embodiment of the application is applied to controlling the replacement among a plurality of windows that are at different levels but within the same longitudinal space. Windows at the first level, the second level, and the third level are taken as examples to explain how the window control method provided by the embodiment of the application is applied to controlling the replacement among a plurality of windows at different levels.
In the embodiment of the present application, windows at different levels are exemplified by windows at the first level (Layer-1), the second level (Layer-2), and the third level (Layer-3). Here, windows at different levels means that the window at the first level, the window at the second level, and the window at the third level may be overlaid at the same position on the y-axis of the three-dimensional space coordinates (that is, the windows of the three levels lie within the same longitudinal space). In other words, the planar region formed by the first-level windows, the planar region formed by the second-level windows, and the planar region formed by the third-level windows have y-axis coordinates that are the same or partially coincide. For example, when the user clicks window D of the first hierarchy with a finger, window D is selected, and the coordinates of window D are the same as or partially coincide with the coordinates of window H of the second hierarchy and the coordinates of window K of the third hierarchy. It should be understood that, when the coordinates of window H at the second level and part of the coordinates of window K at the third level coincide with the coordinates of window D, the windows of the three levels may likewise be considered to be overlaid at the same position on the y-axis of the three-dimensional space coordinates; the embodiment of the present application is not limited thereto. It is understood that the remaining windows in the VR device may be located at levels other than the first to third levels mentioned here, and there may be more levels in the visible area of the VR device; the embodiments of the present application are not limited thereto.
The positions of the first-level, second-level, and third-level windows in three-dimensional space may refer to the positions shown in fig. 8A, and are not described here again.
As shown in fig. 12A, the user establishes the selection area in any of the ways shown in fig. 2. For example, the user selects window B and window C in the first hierarchy, window F in the second hierarchy, and window J in the third hierarchy as the first target window group (represented by group1 in the figure) by clicking with a finger or establishing a selection area, and selects window D of the first level, window G and window H of the second level, and window K of the third level as the second target window group (represented by group2 in the figure) in the same manner. The selected windows B, C, F, J, D, G, H, and K each show a selected identifier 1001 for prompting the user which windows are currently in the selected state and awaiting subsequent operation.
After window B, window C, window F, and window J, and window D, window G, window H, and window K are selected, the user moves window B, window C, window F, and window J with a finger toward the display area where window D, window G, window H, and window K are located, so that window B and window C move to the display area where window D is located, window F moves to the display area where window G and window H are located, and window J moves to the display area where window K is located. Correspondingly, window D moves to the display area where window B and window C were located, window G and window H move to the display area where window F was located, and window K moves to the display area where window J was located.
As shown in fig. 12B, after the replacement is completed, window B and window C are displayed in the display region at the position of window D, window F is displayed in the display region at the positions of window G and window H, and window J is displayed in the display region at the position of window K. Conversely, window D is displayed in the display region at the positions of window B and window C, window G and window H are displayed in the display region at the position of window F, and window K is displayed in the display region at the position of window J.
It is noted that, after the positions of window B, window C, and window D are interchanged, when the area of the display region formed by window B and window C differs from the area of the display region of window D, the window area of the planar region formed by window B and window C is adjusted to be the same as the area of the display region at the position of window D, and the window area of window D is adjusted to be the same as the area of the display region of the planar region formed by window B and window C.
After window F is interchanged with window G and window H, when the window area of window F differs from the area of the planar region formed by window G and window H, the window area of window F is adjusted to be the same as the area of the display region of the planar region formed by window G and window H, and the area of the planar region formed by window G and window H is adjusted to be the same as the area of the display region where window F was located.
After window K and window J are interchanged, when the window area of window K differs from the area of the display region at the position of window J, the window area of window K is adjusted to be the same as the area of the display region at the position of window J, and the window area of window J is adjusted to be the same as the area of the display region at the position of window K.
In addition, it will be appreciated that, in the scenario of using a remote control handle, the replacement of windows may be triggered by a key of the remote control handle or by waving the remote control handle in a specific manner. Further, after the position replacement of window B, window C, window F, and window J with window D, window G, window H, and window K is completed, the identifier 1001 may continue to be displayed on the selected windows in order to prompt the user which windows are currently selected and have been replaced.
Therefore, when the window control method provided by the embodiment of the application is applied to controlling a plurality of windows at different levels, the windows can be operated in batch, which improves the user experience compared with operating a single window at a time.
The following describes the flow of applying the window control method provided in the embodiment of the present application to controlling the retraction and release of windows at different hierarchies or at the same hierarchy.
Referring to fig. 13, fig. 13 is a schematic flowchart of applying the window control method provided in the embodiment of the present application to controlling the retraction and release of windows at different levels or at the same level.
The method includes steps S130 to S133.
Step S130: a batch operation instruction is received. The batch operation instruction is used for performing a retraction or release operation on a plurality of windows in the visible area of the VR device. The plurality of windows may be windows at the same level, for example, window A, window B, window C, and window D at the same level as shown in fig. 6A. The plurality of windows may also be windows at different levels, such as window A, window B, window C, and window D, window E and window F, and window G, window H, and window I at the first to third levels shown in fig. 7A; window A, window B, window C, and window D of the first hierarchy, window E, window F, window G, and window H of the second hierarchy, and window I, window J, and window K of the third hierarchy shown in fig. 8A; or window A, window B, window C, and window D of the first hierarchy, window E and window F of the second hierarchy, and window G, window H, and window I of the third hierarchy shown in fig. 9A. The batch operation instruction can be issued by the user with a gesture or with a pointing component such as a remote control handle. For example, the user triggers a zoom-in instruction or a zoom-out instruction (the batch operation instruction includes the zoom-in instruction or the zoom-out instruction) in the visible area of the VR device by spreading or closing the five fingers, or triggers such an instruction by operating keys on the remote control handle.
Step S131: the batch operation instruction is identified. The batch operation instruction includes, but is not limited to, a zoom-in instruction for enlarging a window in the visible area of the VR device or a zoom-out instruction for reducing the window. The enlargement or reduction ratio can be determined by the degree to which the user's fingers are spread or closed, or by keys on the remote control handle. For example, the VR device tracks the movement of the fingers of the user's hand in real time: when the fingers are fully closed, the selected target window is scaled to a minimum; when the fingers are closed to half of their original spread, the selected target window is scaled to half of its original size. The scaling of the selected target window may also be determined in other ways, and the embodiment of the present application is not limited herein.
Step S132: the selection range is determined. The selection range may be the planar area where the selected target windows are located. The target windows may be determined by the user using a hand or the remote control handle to create a selection area or to click on individual windows. For example, the user can click on one or more windows presented within the visible area of the VR device using keys provided on the remote control handle, or by making a tap-like or click-like motion while holding the remote control handle; the clicked windows serve as the selected target windows. For example, as shown in fig. 7B, the user selects window A, window B, and window D in the first hierarchy as the target windows selected in the first hierarchy (the first selection region) by clicking with a finger or creating a selection region, selects window F of the second level as a selected target window (the second selection region), and selects window I in the third level as a selected target window (the third selection region) in the same manner.
Alternatively, the user waves the hand, and the windows in the area swept by the hand are taken as the selected target windows. The hand can move within the visible area of the VR device; when the movement trajectory forms a closed figure, the windows covered by the closed figure are the target windows. When the movement trajectory forms a curve or a straight line, the windows crossed by the curve or straight line can be selected as the target windows. For example, as shown in fig. 6A, the user creates a rectangular movement trajectory with the hand, and the rectangle formed by the trajectory covers window A, window B, window C, and window D, which become the selected target windows.
It will be appreciated that the user may also move the remote control handle within the visible area of the VR device to create a closed figure and thereby select the target windows. The closed figure formed can be a regular rectangle 1002, a triangle 1003, or an irregular shape 1004 as shown in fig. 3B. The closed figure formed by the movement trajectory can determine the selected target window as follows: a window entirely inside the closed figure is selected as a target window, and a window partially inside the closed figure may also be regarded as a target window (for example, a window with 50% of its area inside the closed figure is regarded as a target window).
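One way to realize this coverage rule is sketched below: a standard ray-casting point-in-polygon test, combined with grid sampling to approximate how much of each window lies inside the traced closed figure. The sampling density and the 0.5 threshold (following the 50% example above) are illustrative assumptions.

```python
def point_in_polygon(px: float, py: float, polygon: list) -> bool:
    """Standard ray-casting test; polygon is a list of (x, y) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > py) != (y2 > py):
            if px < (x2 - x1) * (py - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def coverage_fraction(window, polygon: list, samples: int = 20) -> float:
    """Approximate the fraction of the window's area inside the closed figure
    by testing a grid of sample points across the window."""
    hits = 0
    for i in range(samples):
        for j in range(samples):
            px = window.x + (i + 0.5) / samples * window.width
            py = window.y + (j + 0.5) / samples * window.height
            hits += point_in_polygon(px, py, polygon)
    return hits / (samples * samples)

def select_targets(windows: list, polygon: list, threshold: float = 0.5) -> list:
    """Windows covered beyond the threshold become the selected target windows."""
    return [w for w in windows if coverage_fraction(w, polygon) >= threshold]
```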
Step S133: the operation indicated by the batch operation instruction is executed. After the selection range is determined, the target windows within the selection range are enlarged or reduced according to the enlargement or reduction operation identified in the batch operation instruction. The enlargement or reduction operation is only an example of an operation instruction; the operation instruction may further include instructions for controlling the windows in other ways, such as closing a window.
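Tying steps S130 to S133 together, a compact dispatch sketch might look as follows. It reuses select_targets from the previous sketch, and the instruction encoding (a dict with "op" and "factor" keys) is an illustrative assumption rather than the patent's format.

```python
def execute_batch(instruction: dict, windows: list, polygon: list) -> list:
    """Dispatch one batch operation over every window selected by the closed figure."""
    targets = select_targets(windows, polygon)   # step S132: determine the selection range
    op = instruction["op"]                       # step S131: identify the instruction
    if op == "zoom":                             # step S133: apply to all targets at once
        for w in targets:
            w.width *= instruction["factor"]
            w.height *= instruction["factor"]
    elif op == "close":
        for w in targets:
            windows.remove(w)
    return targets
```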
Therefore, when the window control method provided by the embodiment of the application is applied to controlling a plurality of windows at different levels or at the same level, the windows can be operated in batch, which improves the user experience compared with operating a single window at a time.
The following describes the flow of applying the window control method provided in the embodiment of the present application to controlling the replacement of windows at different hierarchies or at the same hierarchy.
Referring to fig. 14A, fig. 14A is a schematic flowchart of applying the window control method provided in the embodiment of the present application to controlling the replacement of windows at different levels or at the same level.
The method includes steps S140 to S145.
Step S140: a batch operation instruction is received. The batch operation instruction is used to instruct a replacement operation on a plurality of windows in the visible region of the VR device. The plurality of windows may be windows at the same level or at different levels, such as window A, window B, window C, and window D of the first hierarchy, window E and window F of the second hierarchy, and window G, window H, and window I of the third hierarchy shown in fig. 9A. The batch operation instruction can be issued by the user with a gesture or with a pointing component such as a remote control handle; for example, the user triggers a replacement instruction by moving a finger within the visible area of the VR device (the batch operation instruction includes the replacement instruction), or by operating keys on the remote control handle.
Step S141: the batch operation instruction is identified. The batch operation instruction includes, but is not limited to, an instruction to replace at least two windows within the visible region of the VR device with each other. For example, the VR device tracks the movement of the user's fingers in real time; when the user long-presses window B shown in fig. 10B and drags it until the finger reaches the display area where window A is located, the batch operation instruction is identified as a replacement instruction between the target windows.
Step S142: the selection range is determined. The selection range can be the planar area where the selected first target window and second target window, at the same level or at different levels, are located. The target windows may be determined by the user using a hand or the remote control handle to create a selection area or to click on individual windows. For example, the user can click on one or more windows presented within the visible area of the VR device using keys provided on the remote control handle, or by making a tap-like or click-like motion while holding the remote control handle; the clicked windows serve as the selected target windows. For example, as shown in fig. 10A, the user selects window A in the first hierarchy as the first target window and window B in the first hierarchy as the second target window by clicking or creating selection areas with a finger. Or, as shown in fig. 11A, the user selects window A, window B, and window C in the first hierarchy as the first target window, and window E in the second hierarchy as the second target window. Or, as shown in fig. 12A, the user selects window B and window C in the first hierarchy, window F in the second hierarchy, and window J in the third hierarchy as the first target window group (represented by group1 in the figure), and selects window D of the first level, window G and window H of the second level, and window K of the third level as the second target window group (represented by group2 in the figure).
Step S143: the coordinates of the first target window are identified. Referring to fig. 14B, window A and window B in the same hierarchy serve as the selected first target window and second target window. In three-dimensional space, window A and window B both lie in the region formed by the positive direction of the x-axis and the positive direction of the z-axis. For the rectangular window A, its four vertices have the coordinates A1(x1, 0, z1), A2(x1, 0, z2), A3(x2, 0, z1), and A4(x2, 0, z2) in three-dimensional space. Of course, the first target window may also be located at other positions in the three-dimensional space, and the number of first target windows is not limited to one; the embodiment of the present application is not limited herein.
Step S144: the coordinates of the second target window are identified. Referring to fig. 14B, for the rectangular window B in three-dimensional space, its four vertices have the coordinates B1(x3, 0, z3), B2(x3, 0, z4), B3(x4, 0, z3), and B4(x4, 0, z4). Of course, the second target window may also be located at other positions in the three-dimensional space, the number of second target windows is not limited to one, and the first target window and the second target window may also be windows at different levels; the embodiment of the present application is not limited herein.
Step S145: the operation indicated by the batch operation instruction is executed. After the coordinates of the first target window and the second target window within the selection range are determined, the positions of the two windows are replaced according to the replacement operation identified in the batch operation instruction; that is, the first target window is moved to the coordinate position of the second target window, and the second target window is moved to the coordinate position of the first target window. When the coordinates of the first target window do not coincide with those of the second target window, the area of the first target window is adjusted so that the coordinates of its vertices coincide with the coordinates of the vertices of the second target window; likewise, the area of the second target window is adjusted so that the coordinates of its vertices coincide with the coordinates of the vertices of the first target window.
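A minimal sketch of steps S143 to S145 follows, using the fig. 14B convention that rectangular windows lie in the x-z plane at y = 0; the attribute names and the vertices helper (mirroring A1–A4 and B1–B4) are illustrative assumptions.

```python
def vertices(w) -> list:
    """Four vertices of a rectangular window in the x-z plane (y = 0),
    mirroring A1..A4 / B1..B4 in fig. 14B."""
    return [(w.x, 0, w.z), (w.x, 0, w.z + w.height),
            (w.x + w.width, 0, w.z), (w.x + w.width, 0, w.z + w.height)]

def replace_positions(first, second) -> None:
    a = (first.x, first.z, first.width, first.height)      # step S143
    b = (second.x, second.z, second.width, second.height)  # step S144
    # Step S145: each window moves to the other's coordinates and is resized
    # so its vertices coincide with the other's former vertices.
    first.x, first.z, first.width, first.height = b
    second.x, second.z, second.width, second.height = a
```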
Therefore, when the window control method provided by the embodiment of the application is applied to controlling a plurality of windows at different levels or at the same level, the windows can be operated in batch, which improves the user experience compared with operating a single window at a time.
Referring to fig. 15, fig. 15 is a schematic structural diagram of a window control device disclosed in an embodiment of the present application.
The window control device 2 shown in fig. 15 includes: a receiving module 150, an acquisition module 151, a collision detection module 152, a region identification module 153, a grouping module 154, a display module 155, and an output module 156.
The receiving module 150 is configured to receive the batch operation instructions shown in fig. 13 and fig. 14A, or to receive the position of the selected target window in the visible area 100 as detected by the collision detection module 152. The batch operation instruction can be a window retraction/release instruction or a window position-replacement instruction issued by the user through hand movement or the remote control handle.
The acquisition module 151 is configured to acquire the gestures, motions, regions, and the like indicated by the pointing component, so as to determine the movement trajectory of the pointing component.
The collision detection module 152 is used to detect the portion of a window 1000 in the visible area 100 that is hit by a ray emitted from the remote control handle into the visible area 100 shown in fig. 1A or fig. 1B. Alternatively, the user may point at a window 1000 within the visible area 100 shown in fig. 1A or 1B by means of a glove or a gesture. The window where the ray emitted by the remote control handle collides, or the window in the direction pointed to by the user's finger, is the target window. A window covered by a closed figure as shown in fig. 3B, formed by the ray from the remote control handle, may also be identified and determined by the collision detection module 152 as a target window.
The area identification module 153 is used to identify the selection area formed when the user makes a selection with a hand or the remote control handle. For example, when the user uses hand movement to create an area as shown in fig. 3B, the area identification module 153 identifies the shape and size of the selection area formed by the closed figure and the windows covered by it. Alternatively, when the user uses hand movement to create a straight line or curve as shown in fig. 3C, the area identification module 153 identifies the length of the straight line or curve and the windows it passes through.
The grouping module 154 is configured to group and manage the windows selected by the user.
The display module 155 is configured to display the selected target window, for example with the identifier 1001 shown in fig. 3A presented on the selected target window. It is also used to display the positions of target windows after retraction or replacement is completed, such as the display positions in the visible area 100 after windows are reduced as shown in fig. 6B, 7C, 8C, and 9C, or the display positions of target windows or target window groups after replacement is completed as shown in fig. 10C, 11B, and 12B.
The output module 156 is used to execute the actions of batch window operation interactions. These actions include, but are not limited to, moving windows, splicing windows, closing windows, and maximizing and minimizing windows.
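As a schematic illustration of how the modules of fig. 15 could cooperate in one interaction cycle, consider the sketch below; the class, method names, and call order are illustrative assumptions rather than the patent's implementation.

```python
class WindowControlDevice:
    """Hypothetical composition of the modules of fig. 15."""

    def __init__(self, receiver, acquirer, collision, region, grouping, display, output):
        self.receiver, self.acquirer = receiver, acquirer
        self.collision, self.region = collision, region
        self.grouping, self.display, self.output = grouping, display, output

    def handle(self, raw_input):
        instruction = self.receiver.receive(raw_input)    # receiving module 150
        trajectory = self.acquirer.track(raw_input)       # acquisition module 151
        hits = self.collision.detect(trajectory)          # collision detection module 152
        targets = self.region.identify(trajectory, hits)  # area identification module 153
        groups = self.grouping.group(targets)             # grouping module 154
        self.display.mark_selected(targets)               # display module 155 (identifier 1001)
        self.output.execute(instruction, groups)          # output module 156
```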
In some embodiments of the present application, an electronic device is also provided, and the electronic device in the embodiments of the present application is described below with reference to fig. 16. Fig. 16 is a schematic structural diagram of an electronic device disclosed in an embodiment of the present application.
For at least one embodiment, the controller hub 804 communicates with the processor 801 via a multi-drop bus such as a front-side bus (FSB), a point-to-point interface such as QuickPath Interconnect (QPI), or a similar connection. The processor 801 executes instructions that control data processing operations of a general type. In one embodiment, the controller hub 804 includes, but is not limited to, a graphics memory controller hub (GMCH) (not shown) and an input/output hub (IOH) (which may be on separate chips) (not shown), where the GMCH includes the memory and graphics controllers and is coupled to the IOH.
The electronic device 800 may also include a coprocessor 806 and a memory 802 coupled to the controller hub 804. Alternatively, one or both of the memory 802 and the GMCH may be integrated within the processor 801 (as described herein), with the memory 802 and the coprocessor 806 coupled directly to the processor 801, and with the controller hub 804 and the IOH in a single chip.
In one embodiment, the memory 802 may be, for example, dynamic random access memory (DRAM), phase change memory (PCM), or a combination of the two. The memory 802 may include one or more tangible, non-transitory computer-readable media for storing data and/or instructions. The computer-readable storage medium stores instructions, in particular temporary and permanent copies of the instructions.
In one embodiment, the coprocessor 806 is a special-purpose processor, such as, for example, a high-throughput MIC processor, a network or communication processor, compression engine, graphics processor, GPU, embedded processor, or the like. The optional nature of coprocessor 806 is represented in FIG. 16 by dashed lines.
In one embodiment, electronic device 800 may further include a Network Interface (NIC) 803. Network interface 803 may include a transceiver to provide a radio interface for device 800 to communicate with any other suitable device (e.g., front end module, antenna, etc.). In various embodiments, the network interface 803 may be integrated with other components of the electronic device 800. The network interface 803 can realize the functions of the communication unit in the above-described embodiments.
In one embodiment, as shown in fig. 16, the electronic device 800 may further include input/output (I/O) devices 805. The input/output (I/O) devices 805 may include: a user interface designed to enable a user to interact with the electronic device 800; a peripheral component interface designed to enable peripheral components to interact with the electronic device 800; and/or sensors designed to determine environmental conditions and/or location information associated with the electronic device 800.
It is noted that fig. 16 is merely exemplary. That is, although fig. 16 shows the electronic device 800 as including a plurality of components such as the processor 801, the controller hub 804, and the memory 802, in practical applications a device using the methods of the present application may include only some of these components, for example only the processor 801 and the NIC 803. The optional components in fig. 16 are shown with dashed lines.
In some embodiments of the present application, the computer-readable storage medium of the electronic device 800 having instructions stored therein may include: instructions that, when executed by at least one unit in the processor, cause the device to implement the window control method mentioned in the above embodiments. When the instructions are run on a computer, they cause the computer to perform the window control method mentioned in the above embodiments.
Referring now to fig. 17, fig. 17 is a schematic structural diagram of an SoC according to an embodiment of the present application, showing a block diagram of an exemplary SoC (System on Chip) 1000. In fig. 17, like parts have the same reference numerals, and the dashed boxes denote optional features of more advanced SoCs. The SoC may be used in an electronic device according to an embodiment of the present application, and may implement corresponding functions according to the instructions stored therein.
In fig. 17, the SoC 1000 includes: an interconnect unit 1002 coupled to the processor 1001; a system agent unit 1006; a bus controller unit 1005; an integrated memory controller unit 1003; a set of one or more coprocessors 1007, which may include integrated graphics logic, an image processor, an audio processor, and a video processor; a static random access memory (SRAM) unit 1008; and a direct memory access (DMA) unit 1004. In one embodiment, the coprocessor 1007 comprises a special-purpose processor, such as, for example, a network or communication processor, a compression engine, a GPU, a high-throughput MIC processor, or an embedded processor.
The static random access memory (SRAM) unit 1008 may include one or more computer-readable media for storing data and/or instructions. A computer-readable storage medium may store instructions, in particular temporary and permanent copies of the instructions.
When the SoC 1000 is applied to an electronic device according to the present application, the instructions stored in the computer-readable storage medium may include: instructions that, when executed by at least one unit in the processor, cause the electronic device to implement the window control method mentioned in the above embodiments. When the instructions are run on a computer, they cause the computer to perform the window control method mentioned in the above embodiments.
In addition, the embodiment of the application also discloses a computer readable storage medium, wherein a processing program is stored on the computer readable storage medium, and when the processing program is executed by a processor, the window control method mentioned in the above embodiment is realized.
The computer readable storage medium may be a read-only memory, a random access memory, a hard disk, or an optical disk, etc.

Claims (20)

1. A window control method, applied to an electronic device based on virtual reality or augmented reality technology, wherein a pointing component is associated with the electronic device, and the window control method is used for controlling a plurality of window layers displayed in a visible area of the electronic device, the window layers being sequentially arranged along a direction away from a user side of the electronic device, and each window layer comprising a plurality of windows, wherein the window control method comprises the following steps:
detecting movement of the pointing component;
determining a selection area based on a movement trajectory of the pointing component;
determining at least one target window from the plurality of window layers according to the selection area;
and simultaneously executing the operation indicated by the pointing component on each of the at least one target window.
2. The window control method as claimed in claim 1, wherein the movement of the pointing component is a click on a window, and the selection area is determined based on the click position of the pointing component.
3. The window control method of claim 1, wherein the movement of the pointing component is a movement along a predetermined trajectory,
the selection area being determined based on the trajectory formed when the pointing component moves.
4. The window control method according to claim 3, wherein the trajectory formed when the pointing component moves forms a closed figure, the electronic device determines the selection area based on the area covered by the closed figure, and the windows within the area of the closed figure serving as the selection area are the selected target windows.
5. The window control method according to any one of claims 1 to 4, wherein the retracting operation indicated by the pointing component includes reducing, enlarging, and moving the window position of each of the target windows.
6. The window control method of any one of claims 1-4, wherein said determining at least one target window from the plurality of window layers according to a selection region comprises:
based on the selection area, at least one target window in the same window layer is determined.
7. The window control method of any one of claims 1-4, wherein said determining at least one target window from the plurality of window layers according to a selection region comprises:
based on the selection region, a plurality of target windows in a plurality of window layers is determined, at least one window in each window layer being selected.
8. The window control method according to claim 6 or 7, wherein the range of the closed figure determined by the selection area covers a plurality of windows in a plurality of window layers, the plurality of windows being determined as the target window.
9. The window control method according to claim 6 or 7, wherein the selection area is a window selected on one of the plurality of window layers, and the selection area is mapped onto the other window layers to determine at least one window as the target window.
10. The window control method according to claim 1, wherein, in a case where the at least one target window includes at least two windows, the executing the operation indicated by the pointing component on the plurality of target windows comprises:
interchanging the positions of the at least two windows.
11. The window control method according to claim 1, wherein the at least two windows are divided into two window groups, and the executing the operation indicated by the pointing component on the plurality of target windows comprises:
interchanging the positions of the two window groups.
12. The window control method according to claim 10 or 11, wherein the area of a first target window of the at least two windows is adjusted to fit the display area at the position of a second target window, or the area of the second target window of the at least two windows is adjusted to fit the display area at the position of the first target window.
13. An electronic device based on virtual reality or augmented reality technology, having a pointing component associated therewith, comprising:
a memory to store window control instructions;
a processor that, when executing the window control instructions, performs the steps of:
detecting movement of the pointing component;
determining a selection area based on a movement trajectory of the pointing component;
determining at least one target window from a plurality of window layers according to the selection area;
and simultaneously executing the operation indicated by the pointing component on each of the at least one target window.
14. The electronic device of claim 13, wherein the processor, when executing the window control instructions, is further configured to determine the selection area based on a click position of the pointing component.
15. The electronic device of claim 13, wherein the processor, when executing the window control instructions, is further configured to determine the selection area based on the trajectory formed when the pointing component is moved.
16. The electronic device of any one of claims 13-15, wherein the processor, when executing the window control instructions, is further configured such that the retraction operation indicated by the pointing component includes reducing, enlarging, and moving the window position of each of the target windows.
17. The electronic device of any of claims 13-15, wherein the processor, when executing the window control instructions, is further configured to determine at least one target window in a same window layer based on the selection region, or determine a plurality of target windows in a plurality of window layers based on the selection region, at least one window in each window layer being selected.
18. The electronic device of claim 17, wherein the memory is further configured to store the selection area to cause the processor to invoke the selection area.
19. The electronic device of any of claims 13-15, wherein the processor, when executing the window control instructions, is further configured to swap the positions of the at least two windows or swap the positions of the two window groups with each other.
20. A computer-readable storage medium storing window control instructions which, when executed by a processor, implement the window control method of any one of claims 1-12.
CN202011218043.9A 2020-11-04 2020-11-04 Window control method, electronic device and computer readable storage medium Pending CN114529691A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011218043.9A CN114529691A (en) 2020-11-04 2020-11-04 Window control method, electronic device and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011218043.9A CN114529691A (en) 2020-11-04 2020-11-04 Window control method, electronic device and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN114529691A true CN114529691A (en) 2022-05-24

Family

ID=81618878

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011218043.9A Pending CN114529691A (en) 2020-11-04 2020-11-04 Window control method, electronic device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN114529691A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115033333A (en) * 2022-07-19 2022-09-09 荣耀终端有限公司 Floating window display method, electronic equipment and storage medium
CN115421626A (en) * 2022-11-02 2022-12-02 海看网络科技(山东)股份有限公司 AR virtual window interaction method based on mobile terminal
CN115421626B (en) * 2022-11-02 2023-02-24 海看网络科技(山东)股份有限公司 AR virtual window interaction method based on mobile terminal
CN116301482A (en) * 2023-05-23 2023-06-23 杭州灵伴科技有限公司 Window display method of 3D space and head-mounted display device
CN116301482B (en) * 2023-05-23 2023-09-19 杭州灵伴科技有限公司 Window display method of 3D space and head-mounted display device

Similar Documents

Publication Publication Date Title
US20220084279A1 (en) Methods for manipulating objects in an environment
US11983326B2 (en) Hand gesture input for wearable system
KR101844390B1 (en) Systems and techniques for user interface control
CN108469899B (en) Method of identifying an aiming point or area in a viewing space of a wearable display device
US11360551B2 (en) Method for displaying user interface of head-mounted display device
CN114529691A (en) Window control method, electronic device and computer readable storage medium
US10289214B2 (en) Method and device of controlling virtual mouse and head-mounted displaying device
JP2020052991A (en) Gesture recognition-based interactive display method and device
O'Hagan et al. Visual gesture interfaces for virtual environments
KR101833253B1 (en) Object manipulation method in augmented reality environment and Apparatus for augmented reality implementing the same
JP7382994B2 (en) Tracking the position and orientation of virtual controllers in virtual reality systems
Rautaray et al. Real time multiple hand gesture recognition system for human computer interaction
CN105446481A (en) Gesture based virtual reality human-machine interaction method and system
JP2006506737A (en) Body-centric virtual interactive device and method
CN111324250B (en) Three-dimensional image adjusting method, device and equipment and readable storage medium
EP2558924B1 (en) Apparatus, method and computer program for user input using a camera
Smith et al. Digital foam interaction techniques for 3D modeling
CN111880652A (en) Method, apparatus and storage medium for moving position of AR object
CN117472189B (en) Typing or touch control realization method with physical sense
CN104820584B (en) Construction method and system of 3D gesture interface for hierarchical information natural control
CN111240483B (en) Operation control method, head-mounted device, and medium
CN110717993B (en) Interaction method, system and medium of split type AR glasses system
CN109960404B (en) Data processing method and device
Kolaric et al. Direct 3D manipulation using vision-based recognition of uninstrumented hands
KR101605740B1 (en) Method for recognizing personalized gestures of smartphone users and Game thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination