WO2023284791A1 - Virtual interface operation method, head-mounted display device, and computer-readable medium - Google Patents

Virtual interface operation method, head-mounted display device, and computer-readable medium

Info

Publication number: WO2023284791A1
Application number: PCT/CN2022/105489
Authority: WO — WIPO (PCT)
Prior art keywords: sliding, virtual interface, finger, touch
Other languages: English (en), French (fr)
Inventors: 孙超, 石文峰, 李统宇
Original assignee: 杭州灵伴科技有限公司
Application filed by 杭州灵伴科技有限公司
Publication of WO2023284791A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 — Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0487 — Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 — Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 — Interaction techniques based on GUIs using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F3/14 — Digital output to display device; cooperation and interconnection of the display device with other functional units

Definitions

  • The embodiments of the present disclosure relate to the field of computer technology, and in particular to a virtual interface operation method, a head-mounted display device, and a computer-readable medium.
  • Head-mounted display devices, such as AR (Augmented Reality) glasses or MR (Mixed Reality) glasses, allow users to view virtual images through the display screen in the glasses.
  • At present, after a head-mounted display device is connected to a computing device, the interaction mode usually adopted is: in response to detecting an application-opening operation acting on the touch-sensitive display screen of the computing device, the application window corresponding to that operation is displayed in the virtual interface shown on the display screen of the head-mounted display device.
  • Some embodiments of the present disclosure propose a virtual interface operation method, a head-mounted display device, and a computer-readable medium to solve the technical problem mentioned in the background section above.
  • In a first aspect, some embodiments of the present disclosure provide a virtual interface operation method, the method including: in response to detecting a single-finger sliding operation acting on a touch-sensitive display screen of a target device, determining a first sliding offset value corresponding to the single-finger sliding operation, wherein the target device is communicatively connected to a head-mounted display device; according to the first sliding offset value and a first preset sliding ratio, sliding an anchor point corresponding to the single-finger sliding operation in a 3D virtual interface of the head-mounted display device, and displaying in the 3D virtual interface a ray from a target starting point to the anchor point, wherein the 3D virtual interface is displayed on the display screen of the head-mounted display device; and in response to detecting a single-finger click operation acting on the touch-sensitive display screen, with the anchor point corresponding to the click operation at the position of a target control, creating a 2D virtual interface in the 3D virtual interface and displaying, in the 2D virtual interface, the application window corresponding to the click operation.
  • In a second aspect, some embodiments of the present disclosure provide a head-mounted display device, including: one or more processors; a display screen for displaying a 3D virtual interface and a 2D virtual interface; and a storage device storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method described in any implementation of the first aspect above.
  • In a third aspect, some embodiments of the present disclosure provide a computer-readable medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method described in any implementation of the first aspect above.
  • These embodiments have the following beneficial effect: the application windows displayed in the virtual interface can be controlled both inside and outside the window.
  • Specifically, the reason such internal and external control cannot otherwise be achieved is that only the display of the application window corresponding to the application-opening operation can be controlled in the virtual interface; when operations inside and outside the application window need to be performed in the virtual interface, for example sliding or click operations inside and outside the window respectively, the operations acting on the application window cannot be distinguished, so the inside and outside of the application window cannot be controlled.
  • In the virtual interface operation method of some embodiments of the present disclosure, first, in response to detecting a single-finger sliding operation acting on the touch-sensitive display screen of the target device, the first sliding offset value corresponding to the single-finger sliding operation is determined.
  • Here, the target device is communicatively connected to the head-mounted display device.
  • Then, according to the first sliding offset value and the first preset sliding ratio, the anchor point corresponding to the single-finger sliding operation is slid in the 3D virtual interface of the head-mounted display device, and the ray from the target starting point to the anchor point is displayed in the 3D virtual interface.
  • The 3D virtual interface is displayed on the display screen of the head-mounted display device.
  • Thus, the user's single-finger sliding operation on the touch-sensitive display screen can be visualized in the 3D virtual interface.
  • Finally, in response to detecting a single-finger click operation acting on the touch-sensitive display screen, with the anchor point corresponding to the click operation at the position of the target control, a 2D virtual interface is created in the 3D virtual interface, and the application window corresponding to the click operation is displayed in that 2D virtual interface.
  • Thus, after clicking the target control through the touch-sensitive display screen, the user can browse the application window corresponding to the target control in the 2D virtual interface.
  • It can be understood that the target control may be an application identification control in the desktop window, or an application control in an application window. Because the display content of the virtual interface can be controlled according to the user's single-finger sliding and single-finger click operations, the application windows displayed in the virtual interface can be controlled both inside and outside.
  • FIG. 1 is an architectural diagram of an exemplary system to which some embodiments of the present disclosure can be applied;
  • FIGS. 2-3 are schematic diagrams of an application scenario of the virtual interface operation method according to some embodiments of the present disclosure;
  • FIG. 4 is a flowchart of some embodiments of a virtual interface operating method according to the present disclosure.
  • FIG. 5 is a flowchart of other embodiments of the virtual interface operation method according to the present disclosure;
  • FIG. 6 is a schematic structural diagram of a head-mounted display device suitable for implementing some embodiments of the present disclosure.
  • FIG. 1 shows an exemplary system architecture 100 that can be applied to the embodiment of the virtual interface operating method of the present disclosure.
  • an exemplary system architecture 100 may include a head-mounted display device 11 and a target device 12 .
  • the head-mounted display device 11 may include one or two display screens 111 .
  • the above-mentioned display screen is used for displaying a 3D virtual interface and a 2D virtual interface.
  • the head-mounted display device 11 also includes a frame 112 .
  • the sensor, processing unit, memory and battery of the head-mounted display device 11 can be placed inside the frame 112 .
  • In some optional implementations of some embodiments, one or more of the sensors, processing unit, memory, and battery can also be integrated into a separate accessory (not shown) connected to the frame 112 through a data cable.
  • In some optional implementations of some embodiments, the head-mounted display device 11 may have only the display function and some sensors, with the target device 12 providing capabilities such as data processing, data storage, and power supply.
  • The target device 12 may include a touch-sensitive display screen 121. In some embodiments, the head-mounted display device 11 and the target device 12 may communicate via a wireless connection. In some optional implementations of some embodiments, the head-mounted display device 11 and the target device 12 may also be connected through a data cable (not shown).
  • It should be understood that the numbers of head-mounted display devices and target devices in FIG. 1 are merely illustrative. There may be any suitable number of head-mounted display devices and target devices, depending on implementation needs.
  • 2-3 are schematic diagrams of an application scenario of a method for operating a virtual interface according to some embodiments of the present disclosure.
  • As shown in FIG. 2, first, the target device 201 (for example, a mobile phone) may, in response to detecting the single-finger sliding operation 202 acting on the touch-sensitive display screen of the target device, determine the first sliding offset value 203 corresponding to the single-finger sliding operation 202.
  • the above-mentioned target device 201 is communicatively connected with the head-mounted display device 204 .
  • the target device 201 can slide the anchor point 207 corresponding to the one-finger sliding operation 202 in the 3D virtual interface 206 of the head-mounted display device 204 according to the first sliding offset value 203 and the first preset sliding ratio 205 , and the ray 208 from the starting point of the target to the anchor point 207 is displayed on the 3D virtual interface 206 .
  • the above-mentioned 3D virtual interface 206 is displayed on the display screen of the above-mentioned head-mounted display device 204 .
  • What is displayed in the 3D virtual interface 206 in FIG. 2 is a desktop window, and the application identification control group is displayed in the desktop window.
  • the application identification control in the application identification control group may be a control for receiving a user's selection operation to display the application window corresponding to the application identification control.
  • Finally, as shown in FIG. 3, the target device 201 may, in response to detecting the single-finger click operation 209 acting on the touch-sensitive display screen, with the anchor point 210 corresponding to the click operation 209 at the position of the target control, create a 2D virtual interface 211 in the 3D virtual interface 206 and display, in the 2D virtual interface 211, the application window 212 corresponding to the click operation 209.
  • the aforementioned target control position may be the position of the application identification control in the application identification control group in the 3D virtual interface 206 .
  • the anchor point 210 is at the position of the target control, that is, the anchor point 210 falls on the application identification control.
  • An application page (for example, the running page of XX application program) is displayed in the application window 212 .
  • It can be understood that the execution subject of the virtual interface operation method may be various kinds of software; it may also be the target device 201, a server, or a device formed by integrating the target device 201 with the server through a network.
  • the target device 201 may be various electronic devices with information processing capabilities, including but not limited to smart phones, tablet computers, e-book readers, laptop computers, desktop computers and so on.
  • When the execution subject of the virtual interface operation method is software, it can be installed in the electronic devices listed above. It may be implemented, for example, as multiple pieces of software or software modules for providing distributed services, or as a single piece of software or software module. No specific limitation is made here.
  • target devices and head-mounted display devices in FIGS. 2-3 are only illustrative. There can be any number of target devices and head-mounted display devices depending on implementation needs.
  • the virtual interface operation method includes the following steps:
  • Step 401: in response to detecting a single-finger sliding operation acting on the touch-sensitive display screen of a target device, determine a first sliding offset value corresponding to the single-finger sliding operation.
  • In some embodiments, the execution subject of the virtual interface operation method may, in response to detecting a single-finger sliding operation acting on the touch-sensitive display screen of the target device, determine the first sliding offset value corresponding to the single-finger sliding operation.
  • the above-mentioned one-finger sliding operation may be an operation of sliding on a touch-sensitive display screen through a touch point.
  • the contact point may be the contact point between the user's finger and the touch-sensitive display screen, and may also be the contact point between the stylus pen and the touch-sensitive display screen.
  • the aforementioned target device may be a computing device with a touch-sensitive display screen.
  • the above-mentioned target device may be but not limited to one of the following: mobile phone, tablet computer.
  • the above target device communicates with the head-mounted display device.
  • the above-mentioned head-mounted display device may be a head-mounted device for enabling a user to watch a virtual image, and may be but not limited to one of the following: a head-mounted enhanced display device, a head-mounted hybrid display device.
  • the aforementioned head-mounted augmented display device may be AR glasses.
  • the aforementioned head-mounted hybrid display device may be MR glasses.
  • a desktop window and at least one application window may be displayed in the head-mounted display device.
  • the above-mentioned first slide offset value may be the distance between the coordinates of the touch point at the end of the single-finger slide operation and the coordinates of the touch point in the previous frame.
  • the above distance may be a Euclidean distance.
  • The touch point of the previous frame may be the touch point as displayed in the previous frame on the display screen of the head-mounted display device.
  • the frame rate of the display screen is not limited.
  • the above-mentioned ending touch point may be a touch point at the end of the one-finger sliding operation.
  • the aforementioned coordinates may be screen coordinates of the touch-sensitive display screen.
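  • A minimal sketch of this distance computation, in Java using Android's MotionEvent, is shown below; sampling one event per display frame is assumed to happen elsewhere.

    import android.view.MotionEvent;

    final class SlideOffset {
        /**
         * First sliding offset value: the Euclidean distance between the
         * previous frame's touch point and the ending touch point, in
         * touch-screen coordinates.
         */
        static float firstSlideOffset(MotionEvent prevFrame, MotionEvent current) {
            float dx = current.getX() - prevFrame.getX();
            float dy = current.getY() - prevFrame.getY();
            return (float) Math.hypot(dx, dy);
        }
    }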
  • In some optional implementations of some embodiments, the first preset sliding ratio is determined according to the sliding speed of the single-finger sliding operation.
  • As an example, the execution subject may first determine the speed interval in which the sliding speed falls, and then determine the sliding ratio corresponding to that speed interval as the first preset sliding ratio.
  • As an example, the execution subject may determine the speed interval in which the sliding speed falls and the corresponding sliding ratio through a preset speed interval-sliding ratio lookup table.
  • The speed interval-sliding ratio lookup table includes each speed interval and the sliding ratio corresponding to it; the greater the sliding speed covered by a speed interval, the greater the first preset sliding ratio corresponding to that interval.
  • the first preset sliding ratio can be dynamically adjusted.
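  • A minimal sketch of such a lookup table follows. The interval bounds and ratio values are illustrative assumptions, not values from the patent; only the monotonic rule — faster intervals map to larger ratios — comes from the text above.

    import java.util.NavigableMap;
    import java.util.TreeMap;

    /** Maps a sliding speed to a ratio via a speed interval-sliding ratio table. */
    final class SlidingRatioTable {
        // Lower bound of each speed interval (px/s) -> sliding ratio.
        // Interval bounds and ratios are illustrative, not from the patent.
        private final NavigableMap<Float, Float> intervals = new TreeMap<>();

        SlidingRatioTable() {
            intervals.put(0f, 1.0f);    // slow swipes: 100%
            intervals.put(500f, 1.5f);  // medium swipes: 150%
            intervals.put(1500f, 2.0f); // fast swipes: 200%
        }

        /** Faster speed intervals yield larger first preset sliding ratios. */
        float ratioFor(float speedPxPerSec) {
            return intervals.floorEntry(Math.max(0f, speedPxPerSec)).getValue();
        }
    }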
  • Step 402: according to the first sliding offset value and the first preset sliding ratio, slide the anchor point corresponding to the single-finger sliding operation in the 3D virtual interface of the head-mounted display device, and display, in the 3D virtual interface, the ray from the target starting point to the anchor point.
  • In some embodiments, the execution subject may slide the anchor point corresponding to the single-finger sliding operation in the 3D virtual interface of the head-mounted display device according to the first sliding offset value and the first preset sliding ratio, and display in the 3D virtual interface a ray from the target starting point to the anchor point.
  • the above-mentioned first preset sliding ratio may be a preset ratio used to adjust the first sliding offset value of the single-finger sliding operation in the 3D virtual interface.
  • the above-mentioned first preset sliding ratio may be 200%.
  • the above-mentioned 3D virtual interface may be an interface displayed in a three-dimensional form on the display screen of the head-mounted display device.
  • the above-mentioned anchor point may be a point after the visual display of the touch point of the above-mentioned one-finger sliding operation in the 3D virtual interface.
  • As an example, the execution subject may map the coordinates of the touch point of the single-finger sliding operation to the anchor point displayed in the 3D virtual interface, such that the distance between the 3D-virtual-interface coordinates of the anchor point corresponding to the ending touch point and the 3D-virtual-interface coordinates of the anchor point corresponding to the previous frame's touch point is the product of the first sliding offset value and the first preset sliding ratio.
  • The execution subject may determine the product of the abscissa included in the coordinates of the touch point and the first preset sliding ratio as the abscissa of the 3D-virtual-interface coordinates of the mapped anchor point.
  • The execution subject may determine the product of the ordinate included in the coordinates of the touch point and the first preset sliding ratio as the ordinate of the 3D-virtual-interface coordinates of the mapped anchor point.
  • The coordinates of the 3D virtual interface may be coordinates on the display screen of the head-mounted display device. It can be understood that while the touch point continues to slide, the anchor point can be slid continuously in the 3D virtual interface, or slid at a set time interval.
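  • The product rule above reduces to a per-axis scaling. A minimal sketch with hypothetical names follows; a real implementation would additionally convert between the screen and interface coordinate spaces and clamp the result to the interface bounds.

    final class AnchorMapper {
        /** A 2D point in touch-screen or 3D-virtual-interface coordinates. */
        record Point2D(float x, float y) {}

        /** Scales abscissa and ordinate of the touch point by the first preset sliding ratio. */
        static Point2D mapTouchToAnchor(Point2D touch, float firstPresetRatio) {
            return new Point2D(touch.x() * firstPresetRatio,
                               touch.y() * firstPresetRatio);
        }
    }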
  • the aforementioned target starting point may be a coordinate point of the 3D virtual interface corresponding to a virtual camera point of the 3D virtual interface.
  • the aforementioned virtual camera point may be a viewing point.
  • the virtual camera point can be the location point of the unity Main Camera.
  • the 3D virtual interface coordinate point corresponding to the above virtual camera point may be a 3D virtual interface coordinate point offset relative to the unity Main Camera.
  • the direction and distance of the offset are not limited.
  • the above execution subject may display a ray from the starting point of the target to the anchor point during the process of sliding to display the anchor point.
  • the rendering style of the anchor point and the ray may be a preset style, which is not limited.
  • an anchor point can be a dot with a predetermined radius and a predetermined fill color.
  • Rays can be lines or arrows of predetermined width and predetermined fill color.
  • In some optional implementations, when the touch mode is the landscape mode, in a first step, the vertical offset value, in the 3D virtual interface, of the anchor point corresponding to the single-finger sliding operation is determined according to the short-side offset value of the first sliding offset value in the short-side direction of the touch-sensitive display screen and the first preset sliding ratio.
  • Here, the landscape mode may be a mode in which the long side of the touch-sensitive display screen is placed horizontally.
  • The default touch mode is the portrait mode.
  • The portrait mode may be a mode in which the long side of the touch-sensitive display screen is placed vertically.
  • the above-mentioned direction of the short side may be an upward or downward direction along the short side of the touch-sensitive display screen.
  • upward or downward is a user-centered direction.
  • The short-side offset value may be the component of the first sliding offset value, taken along the sliding direction of the single-finger sliding operation, in the short-side direction.
  • The direction of the short-side offset value corresponds to the sliding direction.
  • For example, if the positive direction of the horizontal axis of the screen coordinate system is horizontally to the right and the sliding direction is at 45 degrees to that positive direction, then the direction of the short-side offset value is upward.
  • The vertical offset value may be the offset value of the anchor point in the vertical direction of the 3D virtual interface.
  • As an example, the execution subject may determine the product of the short-side offset value and the first preset sliding ratio as the vertical offset value of the anchor point corresponding to the single-finger sliding operation in the 3D virtual interface.
  • In a second step, the horizontal offset value, in the 3D virtual interface, of the anchor point corresponding to the single-finger sliding operation is determined according to the long-side offset value of the first sliding offset value in the long-side direction of the touch-sensitive display screen and the first preset sliding ratio.
  • Here, the long-side direction may be the direction to the left or right along the long side of the touch-sensitive display screen.
  • Left or right is a direction centered on the user.
  • The long-side offset value may be the component of the first sliding offset value, taken along the sliding direction of the single-finger sliding operation, in the long-side direction.
  • The direction of the long-side offset value corresponds to the sliding direction.
  • In the example above, with the sliding direction at 45 degrees to the positive direction of the horizontal axis of the screen coordinate system (horizontally to the right), the direction of the long-side offset value is to the right.
  • The horizontal offset value may be the offset value of the anchor point in the horizontal direction of the 3D virtual interface.
  • As an example, the execution subject may determine the product of the long-side offset value and the first preset sliding ratio as the horizontal offset value of the anchor point corresponding to the single-finger sliding operation in the 3D virtual interface.
  • In a third step, the anchor point corresponding to the single-finger sliding operation is slid in the 3D virtual interface according to the vertical offset value and the horizontal offset value.
  • As an example, the execution subject may slide the anchor point in the 3D virtual interface along the directions of the vertical offset value and the horizontal offset value simultaneously, so that the anchor point slides by the number of pixels of the vertical offset value in the vertical direction of the 3D virtual interface and by the number of pixels of the horizontal offset value in the horizontal direction.
  • Thus, the 3D virtual interface can be operated in the landscape touch mode.
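  • A minimal sketch of this landscape-mode mapping follows; the class, field, and method names are illustrative assumptions. The swipe's long-side component drives the anchor's horizontal motion and its short-side component drives the vertical motion, each scaled by the first preset sliding ratio.

    /** Accumulates anchor motion in 3D-virtual-interface pixels (illustrative). */
    final class LandscapeAnchorMapper {
        private float anchorX;
        private float anchorY;

        void onSwipeDelta(float longSideDelta, float shortSideDelta, float firstPresetRatio) {
            // Horizontal offset value = long-side offset value * first preset sliding ratio.
            anchorX += longSideDelta * firstPresetRatio;
            // Vertical offset value = short-side offset value * first preset sliding ratio.
            anchorY += shortSideDelta * firstPresetRatio;
        }
    }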
  • Step 403: in response to detecting a single-finger click operation acting on the touch-sensitive display screen, with the anchor point corresponding to the click operation at the position of the target control, create a 2D virtual interface in the 3D virtual interface, and display, in the 2D virtual interface, the application window corresponding to the click operation.
  • In some embodiments, in response to detecting a single-finger click operation acting on the touch-sensitive display screen, with the anchor point corresponding to the click operation at the position of the target control, the execution subject may create a 2D virtual interface in the 3D virtual interface and display the application window corresponding to the click operation in the 2D virtual interface.
  • the above single-finger click operation may be an operation of clicking on one touch point.
  • the anchor point corresponding to the single-finger click operation may be a point after the visual display of the touch point corresponding to the single-finger click operation in the 3D virtual interface.
  • the above-mentioned 2D virtual interface may be an interface displayed on the display screen of the head-mounted display device in a two-dimensional form.
  • the aforementioned target control position may be the position of the application identification control displayed in the desktop window in the 3D virtual interface.
  • the above-mentioned application identification control may be a control for receiving a user's single-finger click operation to display a corresponding application window.
  • the execution subject may display the application window corresponding to the application identification control on the 2D virtual interface.
  • the above-mentioned target control position may also be the position of the setting control displayed in the desktop window in the 3D virtual interface.
  • the above-mentioned setting control may be a control for receiving a user's single-finger click operation to display a corresponding application window for setting related configurations.
  • the above-mentioned setting control may be a control for setting display brightness of the 3D virtual interface.
  • the above-mentioned setting control may also be a control for setting the layout of the application windows displayed in the 3D virtual interface.
  • the execution subject may display the application window corresponding to the setting control on the 2D virtual interface.
  • the position of the target control may be the position of the application control displayed in the application window.
  • the above-mentioned application control may be used to receive a user's single-finger click operation to display an application window displaying application content corresponding to the application control.
  • the above-mentioned application control may be a page refresh control, which is used to receive a user's single-finger click operation to display the application window after the page is refreshed.
  • the above-mentioned application control can also be a page jump control, which is used to receive a user's single-finger click operation to display the application window after the page is jumped.
  • the execution subject may display the application window corresponding to the application control on the 2D virtual interface.
  • the user can browse the application window corresponding to the target control in the 2D virtual interface after clicking the target control through the touch-sensitive display screen.
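  • The dispatch logic described above amounts to a hit test of the click's anchor against control bounds, followed by window creation. A minimal sketch with hypothetical types (Rect, Control, Virtual2DSurface) follows; the patent does not prescribe these structures.

    import java.util.Map;

    /** Illustrative dispatcher: opens a control's application window when the anchor hits it. */
    final class ClickDispatcher {
        record Rect(float left, float top, float right, float bottom) {
            boolean contains(float x, float y) {
                return x >= left && x < right && y >= top && y < bottom;
            }
        }

        interface Virtual2DSurface {}
        interface Control { void openWindowIn(Virtual2DSurface surface); }

        private final Map<Rect, Control> controls;

        ClickDispatcher(Map<Rect, Control> controls) { this.controls = controls; }

        void onSingleFingerClick(float anchorX, float anchorY, Virtual2DSurface newSurface) {
            for (Map.Entry<Rect, Control> e : controls.entrySet()) {
                if (e.getKey().contains(anchorX, anchorY)) {
                    // Anchor is at a target control: show its application window
                    // in the newly created 2D virtual interface.
                    e.getValue().openWindowIn(newSurface);
                    return;
                }
            }
        }
    }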
  • In some optional implementations of some embodiments, the execution subject may send the sliding operation or the click operation to the display screen through mapping.
  • The sliding operation may include, but is not limited to, one of the following: a single-finger sliding operation, a two-finger sliding operation, and a three-finger sliding operation.
  • The click operation may include, but is not limited to, one of the following: a single-finger click operation, a two-finger click operation, and a three-finger click operation.
  • As an example, the execution subject may obtain a MotionEvent object from the touch event corresponding to a sliding or click operation. Then, the setDisplayId method of InputEvent can be called through reflection to set the id of the external screen to which the MotionEvent object is to be mapped to the id of the above display screen. Finally, the injectInputEvent method of InputManager can be called through reflection to send the touch event to the display screen.
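  • A sketch of that reflective forwarding is shown below. setDisplayId and injectInputEvent are hidden Android APIs, so this is illustrative only: their signatures and accessibility vary across Android versions, and hidden-API restrictions may block the calls at runtime. The async-injection mode constant is an assumption.

    import android.hardware.input.InputManager;
    import android.view.InputEvent;
    import android.view.MotionEvent;
    import java.lang.reflect.Method;

    final class TouchForwarder {
        // Assumed value of InputManager's hidden INJECT_INPUT_EVENT_MODE_ASYNC.
        private static final int MODE_ASYNC = 0;

        static void forward(InputManager im, MotionEvent event, int targetDisplayId)
                throws ReflectiveOperationException {
            // Re-target the event at the head-mounted display's screen.
            Method setDisplayId = InputEvent.class.getMethod("setDisplayId", int.class);
            setDisplayId.invoke(event, targetDisplayId);

            // Inject the re-targeted event into the input pipeline.
            Method inject = InputManager.class.getMethod(
                    "injectInputEvent", InputEvent.class, int.class);
            inject.invoke(im, event, MODE_ASYNC);
        }
    }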
  • The embodiments of the present disclosure described above have the following beneficial effect: through the virtual interface operation method of some embodiments of the present disclosure, the application windows displayed in the virtual interface can be controlled both inside and outside the window.
  • Specifically, the reason such internal and external control cannot otherwise be achieved is that only the display of the application window corresponding to the application-opening operation can be controlled in the virtual interface; when operations inside and outside the application window need to be performed in the virtual interface, for example sliding or click operations inside and outside the window respectively, the operations acting on the application window cannot be distinguished, so the inside and outside of the application window cannot be controlled.
  • In the virtual interface operation method of some embodiments of the present disclosure, first, in response to detecting a single-finger sliding operation acting on the touch-sensitive display screen of the target device, the first sliding offset value corresponding to the single-finger sliding operation is determined.
  • Here, the target device is communicatively connected to the head-mounted display device. Then, according to the first sliding offset value and the first preset sliding ratio, the anchor point corresponding to the single-finger sliding operation is slid in the 3D virtual interface of the head-mounted display device, and the ray from the target starting point to the anchor point is displayed in the 3D virtual interface. The 3D virtual interface is displayed on the display screen of the head-mounted display device. Thus, the user's single-finger sliding operation on the touch-sensitive display screen can be visualized in the 3D virtual interface.
  • Finally, in response to detecting a single-finger click operation acting on the touch-sensitive display screen, with the anchor point corresponding to the click operation at the position of the target control, a 2D virtual interface is created in the 3D virtual interface, and the application window corresponding to the click operation is displayed in that 2D virtual interface.
  • Thus, after clicking the target control through the touch-sensitive display screen, the user can browse the application window corresponding to the target control in the 2D virtual interface.
  • It can be understood that the target control may be an application identification control in the desktop window, or an application control in an application window. Because the display content of the virtual interface can be controlled according to the user's single-finger sliding and single-finger click operations, the application windows displayed in the virtual interface can be controlled both inside and outside.
  • Further referring to FIG. 5, a flow 500 of another embodiment of the virtual interface operation method is shown.
  • the flow 500 of the virtual interface operation method includes the following steps:
  • Step 501: in response to detecting a single-finger sliding operation acting on the touch-sensitive display screen of the target device, determine a first sliding offset value corresponding to the single-finger sliding operation.
  • Step 502: according to the first sliding offset value and the first preset sliding ratio, slide the anchor point corresponding to the single-finger sliding operation in the 3D virtual interface of the head-mounted display device, and display, in the 3D virtual interface, the ray from the target starting point to the anchor point.
  • Step 503: in response to detecting a single-finger click operation acting on the touch-sensitive display screen, with the anchor point corresponding to the click operation at the position of the target control, create a 2D virtual interface in the 3D virtual interface, and display, in the 2D virtual interface, the application window corresponding to the click operation.
  • For the specific implementation of steps 501-503 and the technical effects they bring, reference may be made to steps 401-403 in the embodiments corresponding to FIG. 4, which will not be repeated here.
  • Step 504: in response to detecting a two-finger sliding operation acting on the touch-sensitive display screen, encapsulate the two-finger sliding operation into a single-finger sliding operation.
  • In some embodiments, the execution subject of the virtual interface operation method (such as the head-mounted display device 11 shown in FIG. 1 or the target device 201 shown in FIG. 2) may, in response to detecting a two-finger sliding operation acting on the touch-sensitive display screen, encapsulate the two-finger sliding operation into a single-finger sliding operation.
  • the above-mentioned two-finger sliding operation may be an operation of sliding on a touch-sensitive display screen through two contacts.
  • the above-mentioned two-finger sliding operation corresponds to the target application window displayed on the above-mentioned 3D virtual interface.
  • the aforementioned target application window may be an application window currently in a selected state.
  • As an example, the execution subject may determine the first touch point corresponding to the two-finger sliding operation as the target touch point.
  • The target touch point can then be used as the touch point of the encapsulated single-finger sliding operation, thereby realizing the encapsulation of the two-finger sliding operation.
  • The first touch point may be the left touch point; it can be understood that "left" here is from the user's perspective.
  • The first touch point may also be the touch point that first touches the touch-sensitive display screen.
  • Step 505: determine a second sliding offset value corresponding to the encapsulated single-finger sliding operation.
  • In some embodiments, the execution subject may determine the second sliding offset value corresponding to the encapsulated single-finger sliding operation.
  • The second sliding offset value may be the distance between the coordinates of the ending touch point of the encapsulated single-finger sliding operation and the coordinates of the touch point of the previous frame.
  • The ending touch point of the encapsulated single-finger sliding operation is the target touch point at the end of that operation.
  • Step 506: slide the display content of the target application window according to the second sliding offset value and the second preset sliding ratio.
  • In some embodiments, the execution subject may slide the display content of the target application window according to the second sliding offset value and the second preset sliding ratio.
  • The second preset sliding ratio may be a preset ratio used to adjust, in the target application window, the second sliding offset value of the encapsulated single-finger sliding operation.
  • As an example, the second preset sliding ratio may be 150%.
  • As an example, the execution subject may slide the display content of the target application window along the sliding direction of the encapsulated single-finger sliding operation.
  • The sliding distance of the display content is the product of the second sliding offset value and the second preset sliding ratio. In this way, after sliding the selected application window with two fingers on the touch-sensitive display screen, the user can browse the slid display content of the application window.
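  • A minimal sketch of the encapsulation and scroll computation follows; TouchPoint and its fields are illustrative assumptions. It keeps the contact that touched down first as the target touch point (the left-most contact would be the alternative the text allows) and scales the resulting offset by the second preset sliding ratio.

    import java.util.List;

    final class TwoFingerEncapsulator {
        /** A contact on the touch-sensitive screen (illustrative fields). */
        record TouchPoint(int pointerId, float x, float y, long downTimeMs) {}

        /** Reduces a multi-finger gesture to one target contact: the earliest touch-down. */
        static TouchPoint encapsulate(List<TouchPoint> contacts) {
            TouchPoint target = contacts.get(0);
            for (TouchPoint p : contacts) {
                if (p.downTimeMs() < target.downTimeMs()) target = p;
            }
            return target;
        }

        /** Content sliding distance = second sliding offset value * second preset sliding ratio. */
        static float scrollDistance(float secondSlideOffset, float secondPresetRatio) {
            return secondSlideOffset * secondPresetRatio;
        }
    }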
  • Step 507: in response to detecting a two-finger click operation acting on the touch-sensitive display screen, encapsulate the two-finger click operation into a single-finger click operation.
  • In some embodiments, the execution subject may, in response to detecting a two-finger click operation acting on the touch-sensitive display screen, encapsulate the two-finger click operation into a single-finger click operation.
  • the above two-finger click operation may be an operation of clicking through two touch points.
  • the above two-finger click operation corresponds to the above target application window.
  • As an example, the execution subject may determine the first touch point corresponding to the two-finger click operation as the target touch point.
  • The target touch point can then be used as the touch point of the encapsulated single-finger click operation, thereby realizing the encapsulation of the two-finger click operation.
  • The first touch point may be the left touch point; it can be understood that "left" here is from the user's perspective.
  • The first touch point may also be the touch point that first touches the touch-sensitive display screen.
  • Step 508: in response to the anchor point corresponding to the encapsulated single-finger click operation being at the position of the target control, update the target application window according to the encapsulated single-finger click operation.
  • In some embodiments, the execution subject may, in response to the anchor point corresponding to the encapsulated single-finger click operation being at the position of the target control, update the target application window according to the encapsulated single-finger click operation.
  • the position of the target control may be the position of the application control in the target application window.
  • the above-mentioned application control may be a page refresh control, which is used to receive a user's single-finger click operation to display the target application window after the page is refreshed.
  • the above-mentioned application control can also be a page jump control, which is used to receive the user's single-finger click operation to display the target application window after the page jumps.
  • As an example, the execution subject may display, in the target application window, the preset display content corresponding to the encapsulated single-finger click operation, so as to update the target application window.
  • The preset display content may be the display content associated with the application control corresponding to the encapsulated single-finger click operation.
  • For example, when the application control is a page refresh control, the preset display content may be the refreshed page; when the application control is a page jump control, the preset display content may be the page after the jump. In this way, after clicking an application control in the selected application window with two fingers on the touch-sensitive display screen, the user can browse the updated application window.
  • In some optional implementations of some embodiments, in response to detecting a three-finger sliding operation acting on the touch-sensitive display screen, the execution subject may determine the sliding direction, sliding distance, and sliding acceleration corresponding to the three-finger sliding operation.
  • the above-mentioned three-finger sliding operation may be an operation in which three contacts slide on the touch-sensitive display screen.
  • First, the execution subject may encapsulate the three-finger sliding operation into a single-finger sliding operation.
  • As an example, the execution subject may determine the first touch point corresponding to the three-finger sliding operation as the target touch point.
  • The target touch point can then be used as the touch point of the encapsulated single-finger sliding operation, thereby realizing the encapsulation of the three-finger sliding operation.
  • The first touch point may be the left-most touch point; it can be understood that "left" here is from the user's perspective.
  • The first touch point may also be the touch point that first touches the touch-sensitive display screen.
  • the sliding direction, sliding distance and sliding acceleration corresponding to the encapsulated single-finger sliding operation may be respectively determined as the sliding direction, sliding distance and sliding acceleration corresponding to the three-finger sliding operation.
  • As an example, the execution subject may switch to the desktop window in response to the sliding direction being upward, the sliding distance being greater than or equal to a first sliding distance, and the sliding acceleration being greater than or equal to a first sliding acceleration.
  • the above-mentioned first sliding distance may be a preset sliding distance.
  • the above-mentioned first sliding acceleration may be a preset sliding acceleration.
  • The specific settings of the first sliding distance and the first sliding acceleration are not limited.
  • the above execution subject may close the application window displayed in the above 3D virtual interface, and switch to the desktop window.
  • upward refers to a user-centered direction.
  • the user can browse the desktop window after sliding up with three fingers on the touch-sensitive display screen.
  • In some optional implementations, in response to a previous application window of the target application window existing in the 3D virtual interface, the sliding direction being to the left, the sliding distance being greater than or equal to a second sliding distance, and the sliding acceleration being greater than or equal to a second sliding acceleration, the execution subject may switch to the previous application window.
  • the target application window may be an application window in a selected state.
  • the previous application window may be the application window displayed on the left side of the target application window, or may be the last application window displayed before the target application window is displayed.
  • the above-mentioned second sliding distance may be a preset sliding distance.
  • the above-mentioned second sliding acceleration may be a preset sliding acceleration.
  • The specific settings of the second sliding distance and the second sliding acceleration are not limited.
  • Here, "left" refers to a direction centered on the user. In this way, the user can browse the previous application window after swiping left with three fingers on the touch-sensitive display screen.
  • In some optional implementations, in response to a next application window of the target application window existing in the 3D virtual interface, the sliding direction being to the right, the sliding distance being greater than or equal to a third sliding distance, and the sliding acceleration being greater than or equal to a third sliding acceleration, the execution subject may switch to the next application window.
  • the target application window may be an application window in a selected state.
  • the next application window may be the application window displayed on the right side of the target application window, or may be the application window displayed first after the target application window is displayed.
  • the above-mentioned third sliding distance may be a preset sliding distance.
  • the above-mentioned third sliding acceleration may be a preset sliding acceleration.
  • The specific settings of the third sliding distance and the third sliding acceleration are not limited.
  • Here, "right" refers to a direction centered on the user.
  • the user can browse the next application window after sliding right with three fingers on the touch-sensitive display screen.
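  • Putting the three-finger thresholds together, a routing sketch might look like the following. The threshold constants and the WindowManagerFacade interface are illustrative assumptions; the patent fixes only the direction, distance, and acceleration conditions and the resulting window switches.

    final class ThreeFingerRouter {
        enum Direction { UP, LEFT, RIGHT, OTHER }

        // First/second/third sliding distance and acceleration thresholds (illustrative).
        static final float DIST_UP = 200f, DIST_LEFT = 150f, DIST_RIGHT = 150f; // px
        static final float ACC_UP = 1000f, ACC_LEFT = 800f, ACC_RIGHT = 800f;   // px/s^2

        /** Hypothetical facade over the head-mounted display's window stack. */
        interface WindowManagerFacade {
            boolean hasPreviousWindow();
            boolean hasNextWindow();
            void switchToDesktopWindow();
            void switchToPreviousWindow();
            void switchToNextWindow();
        }

        void route(Direction dir, float distance, float acceleration, WindowManagerFacade wm) {
            if (dir == Direction.UP && distance >= DIST_UP && acceleration >= ACC_UP) {
                wm.switchToDesktopWindow();        // close app windows, show the desktop
            } else if (dir == Direction.LEFT && distance >= DIST_LEFT
                    && acceleration >= ACC_LEFT && wm.hasPreviousWindow()) {
                wm.switchToPreviousWindow();
            } else if (dir == Direction.RIGHT && distance >= DIST_RIGHT
                    && acceleration >= ACC_RIGHT && wm.hasNextWindow()) {
                wm.switchToNextWindow();
            }
        }
    }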
  • As can be seen from FIG. 5, compared with the embodiments corresponding to FIG. 4, the flow 500 of the virtual interface operation method adds steps for the two-finger sliding operation and the two-finger click operation corresponding to the target application window. The solutions described in these embodiments therefore enable the user, after sliding the selected application window with two fingers on the touch-sensitive display screen, to browse the slid display content of that window; likewise, after clicking an application control in the selected application window with two fingers, the user can browse the updated application window.
  • Referring now to FIG. 6, it shows a schematic diagram of the hardware structure of a head-mounted display device 600 (such as the head-mounted display device in FIG. 1) suitable for implementing some embodiments of the present disclosure.
  • the head-mounted display device shown in FIG. 6 is only an example, and should not limit the functions and scope of use of the embodiments of the present disclosure.
  • a head-mounted display device 600 may include a processing device (such as a central processing unit, a graphics processing unit, etc.) 601 , a memory 602 , an input unit 603 , and an output unit 604 .
  • the processing device 601 , the memory 602 , the input unit 603 and the output unit 604 are connected to each other through a bus 605 .
  • the methods according to the embodiments of the present disclosure may be implemented as computer programs and stored in the memory 602 .
  • some embodiments of the present disclosure include a computer program product, which includes a computer program carried on a computer-readable medium, where the computer program includes program codes for executing the methods shown in the flowcharts.
  • the processing device 601 in the head-mounted display device specifically implements the virtual interface operation function defined in the method of the present disclosure by calling the above-mentioned computer program stored in the memory 602 .
  • the input unit 603 may include a touch device (eg, a touch-sensitive display screen of a target device).
  • As an example, the touch device in the input unit 603 can detect whether a user operation acts on the virtual interface; in response to determining that it does, the processing device 601 can call the above computer program to execute the function of displaying the application page.
  • the output unit 604 may include a display screen for displaying a 3D virtual interface and a 2D virtual interface.
  • the computer-readable medium described in some embodiments of the present disclosure may be a computer-readable signal medium or a computer-readable storage medium or any combination of the above two.
  • A computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples of computer-readable storage media may include, but are not limited to: an electrical connection with one or more wires, a portable computer diskette, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code thereon. Such propagated data signals may take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, which can send, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted by any appropriate medium, including but not limited to wires, optical cables, RF (radio frequency), etc., or any suitable combination of the above.
  • In some embodiments, the client and the server can communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and can interconnect with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include local area networks ("LAN"), wide area networks ("WAN"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed network.
  • the above-mentioned computer-readable medium may be contained in the above-mentioned head-mounted display device; or it may exist independently without being assembled into the head-mounted display device.
  • The computer-readable medium carries one or more programs which, when executed by the head-mounted display device, cause the head-mounted display device to: in response to detecting a single-finger sliding operation acting on the touch-sensitive display screen of the target device, determine a first sliding offset value corresponding to the single-finger sliding operation, wherein the target device is communicatively connected to the head-mounted display device; according to the first sliding offset value and a first preset sliding ratio, slide the anchor point corresponding to the single-finger sliding operation in the 3D virtual interface of the head-mounted display device, and display the ray from the target starting point to the anchor point in the 3D virtual interface, wherein the 3D virtual interface is displayed on the display screen of the head-mounted display device; and in response to detecting a single-finger click operation acting on the touch-sensitive display screen, with the anchor point corresponding to the click operation at the position of a target control, create a 2D virtual interface in the 3D virtual interface and display the application window corresponding to the click operation in the 2D virtual interface.
  • Computer program code for carrying out operations of some embodiments of the present disclosure may be written in one or more programming languages, or combinations thereof, including object-oriented programming languages such as Java, Smalltalk, or C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • In cases involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
  • Each block in a flowchart or block diagram may represent a module, program segment, or portion of code that contains one or more executable instructions for implementing the specified logical functions.
  • It should also be noted that the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved.
  • Each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
  • For example, without limitation, exemplary types of hardware logic components that can be used include: Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present disclosure disclose a virtual interface operation method, a head-mounted display device, and a computer-readable medium. A specific implementation of the method includes: in response to detecting a single-finger sliding operation acting on a touch-sensitive display screen of a target device, determining a first sliding offset value corresponding to the single-finger sliding operation, wherein the target device is communicatively connected to a head-mounted display device; according to the first sliding offset value and a first preset sliding ratio, sliding an anchor point corresponding to the single-finger sliding operation in a 3D virtual interface of the head-mounted display device, and displaying in the 3D virtual interface a ray from a target starting point to the anchor point; and in response to detecting a single-finger click operation acting on the touch-sensitive display screen, with the anchor point corresponding to the click operation at the position of a target control, creating a 2D virtual interface in the 3D virtual interface and displaying, in the 2D virtual interface, the application window corresponding to the click operation. This implementation enables internal and external control of the application windows displayed in the virtual interface.

Description

Virtual interface operation method, head-mounted display device, and computer-readable medium

Technical Field

The embodiments of the present disclosure relate to the field of computer technology, and in particular to a virtual interface operation method, a head-mounted display device, and a computer-readable medium.

Background

Head-mounted display devices, for example AR (Augmented Reality) glasses or MR (Mixed Reality) glasses, allow users to view virtual images through the display screen in the AR or MR glasses. At present, after a head-mounted display device is connected to a computing device, the interaction mode usually adopted is: in response to detecting an application-opening operation acting on the touch-sensitive display screen of the computing device, displaying, in the virtual interface shown on the display screen of the head-mounted display device, the application window corresponding to the application-opening operation.
然而,当采用上述交互方式时,经常会存在如下技术问题:仅可控制在虚拟界面中显示应用开启操作对应的应用窗口,需要在虚拟界面中进行应用窗口内外的操作时,例如,需要分别在应用窗口内和应用窗口外的滑动或点击操作,无法区分作用于应用窗口的操作,导致无法实现对应用窗口内外的控制。
Summary
This summary is provided to introduce concepts in a simplified form; the concepts are described in detail in the detailed description that follows. This summary is not intended to identify key or essential features of the claimed technical solutions, nor is it intended to limit the scope of the claimed technical solutions.
Some embodiments of the present disclosure propose a virtual interface operation method, a head-mounted display device and a computer-readable medium to solve the technical problem mentioned in the background section above.
In a first aspect, some embodiments of the present disclosure provide a virtual interface operation method, the method including: in response to detecting a single-finger sliding operation acting on a touch-sensitive display screen of a target device, determining a first sliding offset value corresponding to the single-finger sliding operation, where the target device is communicatively connected to a head-mounted display device; according to the first sliding offset value and a first preset sliding ratio, sliding an anchor point corresponding to the single-finger sliding operation in a 3D virtual interface of the head-mounted display device, and displaying, in the 3D virtual interface, a ray from a target starting point to the anchor point, where the 3D virtual interface is displayed on the display screen of the head-mounted display device; in response to detecting a single-finger tap operation acting on the touch-sensitive display screen, and the anchor point corresponding to the single-finger tap operation being at a target control position, creating a 2D virtual interface in the 3D virtual interface and displaying, in the 2D virtual interface, the application window corresponding to the single-finger tap operation.
In a second aspect, some embodiments of the present disclosure provide a head-mounted display device, including: one or more processors; a display screen for displaying a 3D virtual interface and a 2D virtual interface; and a storage device on which one or more programs are stored, which, when executed by the one or more processors, cause the one or more processors to implement the method described in any implementation of the first aspect above.
In a third aspect, some embodiments of the present disclosure provide a computer-readable medium on which a computer program is stored, where the program, when executed by a processor, implements the method described in any implementation of the first aspect above.
The above embodiments of the present disclosure have the following beneficial effect: through the virtual interface operation method of some embodiments of the present disclosure, application windows displayed in the virtual interface can be controlled both inside and outside. Specifically, the reason such inside-and-outside control could not be achieved is that only the display of the application window corresponding to an application-launch operation could be controlled in the virtual interface; when operations inside and outside an application window were required (for example, separate sliding or tap operations inside and outside the application window), operations acting on the application window could not be distinguished, so control inside and outside the application window could not be achieved. On this basis, in the virtual interface operation method of some embodiments of the present disclosure, first, in response to detecting a single-finger sliding operation acting on a touch-sensitive display screen of a target device, a first sliding offset value corresponding to the single-finger sliding operation is determined, where the target device is communicatively connected to a head-mounted display device. Then, according to the first sliding offset value and a first preset sliding ratio, an anchor point corresponding to the single-finger sliding operation is slid in the 3D virtual interface of the head-mounted display device, and a ray from a target starting point to the anchor point is displayed in the 3D virtual interface, where the 3D virtual interface is displayed on the display screen of the head-mounted display device. In this way, the user's single-finger sliding operation on the touch-sensitive display screen can be visualized in the 3D virtual interface. Finally, in response to detecting a single-finger tap operation acting on the touch-sensitive display screen, and the anchor point corresponding to the single-finger tap operation being at a target control position, a 2D virtual interface is created in the 3D virtual interface, and the application window corresponding to the single-finger tap operation is displayed in the 2D virtual interface. Thus, after tapping the target control through the touch-sensitive display screen, the user can browse the application window corresponding to the target control in the 2D virtual interface. It can be understood that the target control may be an application identifier control in a desktop window, or an application control in an application window. Also, because the display content of the virtual interface can be controlled according to the user's single-finger sliding and single-finger tap operations, the application windows displayed in the virtual interface can be controlled both inside and outside.
Brief Description of the Drawings
The above and other features, advantages and aspects of embodiments of the present disclosure will become more apparent with reference to the following detailed description taken in conjunction with the accompanying drawings. Throughout the drawings, identical or similar reference numerals denote identical or similar elements. It should be understood that the drawings are schematic and that components and elements are not necessarily drawn to scale.
Fig. 1 is an architecture diagram of an exemplary system in which some embodiments of the present disclosure may be applied;
Figs. 2-3 are schematic diagrams of an application scenario of the virtual interface operation method according to some embodiments of the present disclosure;
Fig. 4 is a flowchart of some embodiments of the virtual interface operation method according to the present disclosure;
Fig. 5 is a flowchart of other embodiments of the virtual interface operation method according to the present disclosure;
Fig. 6 is a schematic structural diagram of a head-mounted display device suitable for implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be implemented in various forms and should not be construed as being limited to the embodiments set forth here. On the contrary, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of protection of the present disclosure.
It should also be noted that, for ease of description, only the parts related to the relevant disclosure are shown in the drawings. The embodiments of the present disclosure and the features in the embodiments may be combined with one another where no conflict arises.
It should be noted that concepts such as "first" and "second" mentioned in the present disclosure are only used to distinguish different devices, modules or units, and are not used to define the order or interdependence of the functions performed by these devices, modules or units.
It should be noted that the modifiers "a" and "multiple" mentioned in the present disclosure are illustrative rather than restrictive; those skilled in the art should understand that, unless the context clearly indicates otherwise, they should be understood as "one or more".
The names of messages or information exchanged between multiple devices in the embodiments of the present disclosure are for illustrative purposes only and are not used to limit the scope of these messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings and in conjunction with embodiments.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the virtual interface operation method of the present disclosure may be applied.
As shown in Fig. 1, the exemplary system architecture 100 may include a head-mounted display device 11 and a target device 12.
The head-mounted display device 11 may include one or two display screens 111. The display screens are used to display a 3D virtual interface and a 2D virtual interface. In addition, the head-mounted display device 11 also includes a frame 112. In some embodiments, the sensors, processing unit, memory and battery of the head-mounted display device 11 may be placed inside the frame 112. In some optional implementations of some embodiments, one or more of the sensors, processing unit, memory and battery may instead be integrated into a separate accessory (not shown) connected to the frame 112 via a data cable. In some optional implementations of some embodiments, the head-mounted display device 11 may have only display functionality and some of the sensors, while the target device 12 provides capabilities such as data processing, data storage and power supply.
The target device 12 may include a touch-sensitive display screen 121. In some embodiments, the head-mounted display device 11 and the target device 12 may communicate via a wireless connection. In some optional implementations of some embodiments, the head-mounted display device 11 and the target device 12 may instead be connected via a data cable (not shown).
It should be understood that the numbers of head-mounted display devices and target devices in Fig. 1 are merely illustrative. There may be any suitable number of head-mounted display devices and target devices according to implementation needs.
Figs. 2-3 are schematic diagrams of an application scenario of the virtual interface operation method according to some embodiments of the present disclosure.
As shown in Fig. 2, first, the target device 201 (for example, a mobile phone) may, in response to detecting a single-finger sliding operation 202 acting on the touch-sensitive display screen of the target device, determine a first sliding offset value 203 corresponding to the single-finger sliding operation 202, where the target device 201 is communicatively connected to a head-mounted display device 204. Then, according to the first sliding offset value 203 and a first preset sliding ratio 205, the target device 201 may slide an anchor point 207 corresponding to the single-finger sliding operation 202 in the 3D virtual interface 206 of the head-mounted display device 204, and display, in the 3D virtual interface 206, a ray 208 from a target starting point to the anchor point 207, where the 3D virtual interface 206 is displayed on the display screen of the head-mounted display device 204. What is displayed in the 3D virtual interface 206 in Fig. 2 is a desktop window, in which a group of application identifier controls is shown. An application identifier control in the group may be a control for receiving the user's selection operation so as to present the application window corresponding to that application identifier control.
Finally, as shown in Fig. 3, the target device 201 may, in response to detecting a single-finger tap operation 209 acting on the touch-sensitive display screen, and the anchor point 210 corresponding to the single-finger tap operation 209 being at a target control position, create a 2D virtual interface 211 in the 3D virtual interface 206, and display, in the 2D virtual interface 211, the application window 212 corresponding to the single-finger tap operation 209. Here, the target control position may be the position, in the 3D virtual interface 206, of an application identifier control in the group of application identifier controls. The anchor point 210 being at the target control position means that the anchor point 210 falls on the application identifier control. An application page (for example, the running page of application XX) is displayed in the application window 212.
It can be understood that the execution body of the virtual interface operation method may be various kinds of software; it may also be the target device 201 or a server, and it may also be a device formed by integrating the target device 201 and the server via a network. The target device 201 may be any electronic device with information processing capability, including but not limited to a smartphone, a tablet computer, an e-book reader, a laptop computer, a desktop computer and the like. When the execution body of the virtual interface operation method is software, it may be installed in the electronic devices listed above. It may be implemented, for example, as multiple pieces of software or software modules for providing distributed services, or as a single piece of software or a single software module. No specific limitation is made here.
It should be understood that the numbers of target devices and head-mounted display devices in Figs. 2-3 are merely illustrative. There may be any number of target devices and head-mounted display devices according to implementation needs.
Continuing to refer to Fig. 4, a flow 400 of some embodiments of the virtual interface operation method according to the present disclosure is shown. The virtual interface operation method includes the following steps:
Step 401: in response to detecting a single-finger sliding operation acting on the touch-sensitive display screen of a target device, determine a first sliding offset value corresponding to the single-finger sliding operation.
In some embodiments, the execution body of the virtual interface operation method (for example, the head-mounted display device 11 shown in Fig. 1 or the target device 201 shown in Fig. 2) may, in response to detecting a single-finger sliding operation acting on the touch-sensitive display screen of the target device, determine a first sliding offset value corresponding to the single-finger sliding operation. The single-finger sliding operation may be an operation of sliding on the touch-sensitive display screen with a single touch point. Here, the touch point may be the contact point between the user's finger and the touch-sensitive display screen, or the contact point between a stylus and the touch-sensitive display screen. The target device may be a computing device with a touch-sensitive display screen; for example, it may be, but is not limited to, one of the following: a mobile phone, a tablet computer. The target device is communicatively connected to the head-mounted display device. The head-mounted display device may be a head-mounted device that allows the user to view virtual images, and may be, but is not limited to, one of the following: a head-mounted augmented-reality display device, a head-mounted mixed-reality display device. For example, the head-mounted augmented-reality display device may be AR glasses, and the head-mounted mixed-reality display device may be MR glasses. A desktop window and at least one application window may be displayed in the head-mounted display device. The first sliding offset value may be the distance between the coordinates of the ending touch point of the single-finger sliding operation and the coordinates of the touch point in the previous frame; for example, the distance may be a Euclidean distance. The touch point of the previous frame may be the touch point displayed in the previous frame of the display screen of the head-mounted display device; the frame rate of the display screen is not limited here. The ending touch point may be the touch point at the end of the single-finger sliding operation. The coordinates may be screen coordinates of the touch-sensitive display screen.
Optionally, the first preset sliding ratio is determined according to the sliding speed of the single-finger sliding operation. In practice, the execution body may first determine the speed interval in which the sliding speed falls, and then determine the sliding ratio corresponding to that speed interval as the first preset sliding ratio. Here, the execution body may determine the speed interval in which the sliding speed falls, and the sliding ratio corresponding to that interval, through a preset speed-interval-to-sliding-ratio lookup table, which contains the speed intervals and the sliding ratio corresponding to each interval. The greater the sliding speed of a speed interval, the greater the first preset sliding ratio corresponding to that interval. In this way, the first preset sliding ratio can be adjusted dynamically (a minimal sketch of such a lookup is given below).
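By way of illustration only, such a speed-interval lookup could be implemented as follows in Java. The disclosure does not fix a table format or concrete values, so the class name, interval boundaries (500 and 1500 px/s) and ratios below are all invented for the example:

```java
import java.util.NavigableMap;
import java.util.TreeMap;

/** Illustrative speed-interval -> sliding-ratio lookup; all values are hypothetical. */
public final class SlidingRatioTable {
    // Maps the lower bound of each speed interval (px/s) to its sliding ratio.
    private static final NavigableMap<Float, Float> TABLE = new TreeMap<>();
    static {
        TABLE.put(0f,    1.0f); // [0, 500) px/s    -> 100%
        TABLE.put(500f,  1.5f); // [500, 1500) px/s -> 150%
        TABLE.put(1500f, 2.0f); // [1500, ...) px/s -> 200%: faster swipes move the anchor farther
    }

    /** Returns the first preset sliding ratio for the given sliding speed. */
    public static float ratioFor(float speedPxPerSecond) {
        return TABLE.floorEntry(Math.max(0f, speedPxPerSecond)).getValue();
    }
}
```

A sorted map keeps the interval boundaries in one place, so adding or tuning intervals does not change the lookup code.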
Step 402: according to the first sliding offset value and the first preset sliding ratio, slide the anchor point corresponding to the single-finger sliding operation in the 3D virtual interface of the head-mounted display device, and display, in the 3D virtual interface, a ray from the target starting point to the anchor point.
In some embodiments, the execution body may, according to the first sliding offset value and the first preset sliding ratio, slide the anchor point corresponding to the single-finger sliding operation in the 3D virtual interface of the head-mounted display device, and display, in the 3D virtual interface, a ray from the target starting point to the anchor point. The first preset sliding ratio may be a preset ratio for scaling the first sliding offset value of the single-finger sliding operation in the 3D virtual interface; for example, it may be 200%. The 3D virtual interface may be an interface displayed in three-dimensional form on the display screen of the head-mounted display device. The anchor point may be the point obtained by visualizing the touch point of the single-finger sliding operation in the 3D virtual interface. In practice, the execution body may display, in the 3D virtual interface, the anchor point mapped from the coordinates of the touch point of the single-finger sliding operation, such that the distance between the 3D-virtual-interface coordinates of the anchor point corresponding to the ending touch point and those of the anchor point corresponding to the touch point in the previous frame equals the product of the first sliding offset value and the first preset sliding ratio. For example, the execution body may determine the product of the horizontal coordinate of the touch point and the first preset sliding ratio as the horizontal coordinate of the mapped anchor point in 3D-virtual-interface coordinates, and the product of the vertical coordinate of the touch point and the first preset sliding ratio as the vertical coordinate of the mapped anchor point. The 3D-virtual-interface coordinates may be coordinates on the display screen of the head-mounted display device. It can be understood that, while the touch point keeps sliding, the anchor point may be slid continuously in the 3D virtual interface, or slid at a set time interval.
The target starting point may be the 3D-virtual-interface coordinate point corresponding to the virtual camera point of the 3D virtual interface. The virtual camera point may be the viewing position; for example, it may be the position of the unity Main Camera. The 3D-virtual-interface coordinate point corresponding to the virtual camera point may be a 3D-virtual-interface coordinate point offset relative to the unity Main Camera; the direction and distance of the offset are not limited here. In practice, while sliding and displaying the anchor point, the execution body may display a ray from the target starting point to the anchor point. The rendering styles of the anchor point and the ray may be preset and are not limited; for example, the anchor point may be a dot with a predetermined radius and fill color, and the ray may be a line or arrow with a predetermined width and fill color. In this way, the user's single-finger sliding operation on the touch-sensitive display screen can be visualized in the 3D virtual interface. The offset scaling described here is sketched below.
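As a minimal sketch of that scaling, in Java and under the stated assumption that consecutive anchor positions differ by the touch offset times the preset ratio (the class, method and parameter names are invented for the example; the disclosure only specifies the multiplication itself):

```java
/** Illustrative anchor mapping: per-frame touch offsets scaled into 3D-interface coordinates. */
public final class AnchorMapper {
    private final float firstPresetSlidingRatio; // e.g. 2.0f for a 200% ratio

    public AnchorMapper(float firstPresetSlidingRatio) {
        this.firstPresetSlidingRatio = firstPresetSlidingRatio;
    }

    /**
     * Moves the anchor by the touch-point offset of the current frame, scaled by the
     * preset ratio, so that the distance between consecutive anchor positions equals
     * the first sliding offset value times the first preset sliding ratio.
     */
    public float[] slideAnchor(float[] anchorXY, float touchDx, float touchDy) {
        return new float[] {
            anchorXY[0] + touchDx * firstPresetSlidingRatio,
            anchorXY[1] + touchDy * firstPresetSlidingRatio,
        };
    }
}
```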
Optionally, in response to the touch mode of the touch-sensitive display screen being landscape mode, the following steps are performed:
Step 1: determine the vertical offset value, in the 3D virtual interface, of the anchor point corresponding to the single-finger sliding operation according to the short-edge offset value of the first sliding offset value in the short-edge direction of the touch-sensitive display screen and the first preset sliding ratio. Here, landscape mode may be the mode in which the long edge of the touch-sensitive display screen lies horizontally. When the execution body is connected to the head-mounted display device, the default touch mode is portrait mode, in which the long edge of the touch-sensitive display screen is vertical. The short-edge direction may be the direction upward or downward along the short edge of the touch-sensitive display screen, where up and down are from the user's point of view. The short-edge offset value may be the component, in the short-edge direction, of the first sliding offset value taken along the sliding direction. It can be understood that the direction of the short-edge offset value corresponds to the sliding direction. For example, if the positive horizontal axis of the screen coordinate system points horizontally to the right and the sliding direction is at 45 degrees from that axis, the direction of the short-edge offset value is upward. The vertical offset value may be the offset of the anchor point in the vertical direction of the 3D virtual interface. In practice, the execution body may determine the product of the short-edge offset value and the first preset sliding ratio as the vertical offset value of the anchor point in the 3D virtual interface.
Step 2: determine the horizontal offset value, in the 3D virtual interface, of the anchor point corresponding to the single-finger sliding operation according to the long-edge offset value of the first sliding offset value in the long-edge direction of the touch-sensitive display screen and the first preset sliding ratio. The long-edge direction may be the direction leftward or rightward along the long edge of the touch-sensitive display screen, where left and right are from the user's point of view. The long-edge offset value may be the component of the first sliding offset value in the long-edge direction. It can be understood that the direction of the long-edge offset value corresponds to the sliding direction. For example, if the positive horizontal axis of the screen coordinate system points horizontally to the right and the sliding direction is at 45 degrees from that axis, the direction of the long-edge offset value is rightward. The horizontal offset value may be the offset of the anchor point in the horizontal direction of the 3D virtual interface. In practice, the execution body may determine the product of the long-edge offset value and the first preset sliding ratio as the horizontal offset value of the anchor point in the 3D virtual interface.
Step 3: slide the anchor point corresponding to the single-finger sliding operation in the 3D virtual interface according to the vertical offset value and the horizontal offset value. In practice, the execution body may slide the anchor point in the 3D virtual interface along the directions of the vertical and horizontal offset values simultaneously, so that the anchor point moves by the vertical offset value in pixels in the vertical direction of the 3D virtual interface and by the horizontal offset value in pixels in the horizontal direction. In this way, the 3D virtual interface can be operated while the touch mode is landscape mode (a minimal sketch of this mapping follows the steps above).
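A purely illustrative sketch of the landscape-mode decomposition, with invented class, method and parameter names (the disclosure only specifies that the short-edge component maps to the vertical axis and the long-edge component to the horizontal axis, each scaled by the preset ratio):

```java
/** Illustrative landscape-mode mapping of a touch offset into 3D-interface offsets. */
public final class LandscapeOffsetMapper {

    /**
     * In landscape mode the screen's long edge lies horizontally, so the long-edge
     * component of the first sliding offset drives the anchor's horizontal movement
     * and the short-edge component drives its vertical movement, each scaled by the
     * first preset sliding ratio. Returns {horizontalOffset, verticalOffset}.
     */
    public static float[] map(float longEdgeOffset, float shortEdgeOffset, float ratio) {
        return new float[] { longEdgeOffset * ratio, shortEdgeOffset * ratio };
    }
}
```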
Step 403: in response to detecting a single-finger tap operation acting on the touch-sensitive display screen, and the anchor point corresponding to the single-finger tap operation being at a target control position, create a 2D virtual interface in the 3D virtual interface, and display, in the 2D virtual interface, the application window corresponding to the single-finger tap operation.
In some embodiments, the execution body may, in response to detecting a single-finger tap operation acting on the touch-sensitive display screen, and the anchor point corresponding to the single-finger tap operation being at a target control position, create a 2D virtual interface in the 3D virtual interface and display, in the 2D virtual interface, the application window corresponding to the single-finger tap operation. The single-finger tap operation may be a tap performed at a single touch point. The anchor point corresponding to the single-finger tap operation may be the point obtained by visualizing the touch point of the single-finger tap operation in the 3D virtual interface. The 2D virtual interface may be an interface displayed in two-dimensional form on the display screen of the head-mounted display device.
When the anchor point is in the desktop window in the 3D virtual interface, the target control position may be the position, in the 3D virtual interface, of an application identifier control displayed in the desktop window. The application identifier control may be a control for receiving the user's single-finger tap operation so as to present the corresponding application window. In practice, the execution body may display, in the 2D virtual interface, the application window corresponding to the application identifier control.
When the anchor point is in the desktop window in the 3D virtual interface, the target control position may also be the position, in the 3D virtual interface, of a settings control displayed in the desktop window. The settings control may be a control for receiving the user's single-finger tap operation so as to present the corresponding application window used for configuring related settings. For example, the settings control may be a control for setting the display brightness of the 3D virtual interface, or a control for setting the layout of the application windows displayed in the 3D virtual interface. In practice, the execution body may display, in the 2D virtual interface, the application window corresponding to the settings control.
When the anchor point is in an application window in the 3D virtual interface, the target control position may be the position of an application control displayed in the application window. The application control may be a control for receiving the user's single-finger tap operation so as to present an application window showing the application content corresponding to that control. For example, the application control may be a page-refresh control, which receives the user's single-finger tap operation so as to present the application window after the page is refreshed; it may also be a page-jump control, which receives the user's single-finger tap operation so as to present the application window after the page jump. In practice, the execution body may display, in the 2D virtual interface, the application window corresponding to the application control.
Thus, after tapping the target control through the touch-sensitive display screen, the user can browse the application window corresponding to the target control in the 2D virtual interface.
Optionally, the execution body may, in response to detecting a sliding operation or tap operation acting on the touch-sensitive display screen, send the sliding operation or tap operation to the display screen by way of mapping. The sliding operation may include, but is not limited to, one of the following: a single-finger sliding operation, a two-finger sliding operation and a three-finger sliding operation. The tap operation may include, but is not limited to, one of the following: a single-finger tap operation, a two-finger tap operation and a three-finger tap operation. In practice, the execution body may obtain a MotionEvent object from the touch event corresponding to the sliding or tap operation. Then, the setDisplayId method of InputEvent may be invoked via reflection to set the id of the external screen to which the MotionEvent object is to be mapped to the id of the display screen. Finally, the injectInputEvent method of InputManager may be invoked via reflection to send the touch event to the display screen (a reflection-based sketch is given below).
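A minimal sketch of this forwarding in Java, under the following assumptions: InputEvent.setDisplayId, InputManager.getInstance and InputManager.injectInputEvent are hidden (non-SDK) Android APIs, so this presumes a privileged context and an Android version on which these members are still reachable via reflection; the constant value 0 for asynchronous injection is likewise assumed:

```java
import android.hardware.input.InputManager;
import android.view.InputEvent;
import android.view.MotionEvent;
import java.lang.reflect.Method;

/** Illustrative sketch of forwarding a touch event to an external display via reflection. */
public final class TouchEventForwarder {
    // Assumed value of the hidden InputManager.INJECT_INPUT_EVENT_MODE_ASYNC constant.
    private static final int INJECT_ASYNC = 0;

    public static void forwardToDisplay(MotionEvent event, int externalDisplayId)
            throws Exception {
        // Retarget the event at the external display (hidden API, hence reflection).
        Method setDisplayId = InputEvent.class.getMethod("setDisplayId", int.class);
        setDisplayId.invoke(event, externalDisplayId);

        // Inject the retargeted event into the input pipeline (hidden API as well).
        InputManager inputManager =
                (InputManager) InputManager.class.getMethod("getInstance").invoke(null);
        Method inject = InputManager.class
                .getMethod("injectInputEvent", InputEvent.class, int.class);
        inject.invoke(inputManager, event, INJECT_ASYNC);
    }
}
```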
With further reference to Fig. 5, a flow 500 of other embodiments of the virtual interface operation method is shown. The flow 500 of the virtual interface operation method includes the following steps:
Step 501: in response to detecting a single-finger sliding operation acting on the touch-sensitive display screen of a target device, determine a first sliding offset value corresponding to the single-finger sliding operation.
Step 502: according to the first sliding offset value and the first preset sliding ratio, slide the anchor point corresponding to the single-finger sliding operation in the 3D virtual interface of the head-mounted display device, and display, in the 3D virtual interface, a ray from the target starting point to the anchor point.
Step 503: in response to detecting a single-finger tap operation acting on the touch-sensitive display screen, and the anchor point corresponding to the single-finger tap operation being at a target control position, create a 2D virtual interface in the 3D virtual interface, and display, in the 2D virtual interface, the application window corresponding to the single-finger tap operation.
In some embodiments, for the specific implementation of steps 501-503 and the technical effects they bring, reference may be made to steps 401-403 in the embodiments corresponding to Fig. 4, which will not be repeated here.
Step 504: in response to detecting a two-finger sliding operation acting on the touch-sensitive display screen, wrap the two-finger sliding operation as a wrapped single-finger sliding operation.
In some embodiments, the execution body of the virtual interface operation method (for example, the head-mounted display device 11 shown in Fig. 1 or the target device 201 shown in Fig. 2) may, in response to detecting a two-finger sliding operation acting on the touch-sensitive display screen, wrap the two-finger sliding operation as a wrapped single-finger sliding operation. The two-finger sliding operation may be an operation of sliding on the touch-sensitive display screen with two touch points, and it corresponds to the target application window displayed in the 3D virtual interface. The target application window may be the application window currently in the selected state. In practice, the execution body may determine the first touch point of the two-finger sliding operation as the target touch point; the target touch point then serves as the touch point of the wrapped single-finger sliding operation, thereby implementing the wrapping of the two-finger sliding operation. The first touch point may be the touch point on the left (where left is from the user's point of view), or the touch point that first came into contact with the touch-sensitive display screen (a minimal sketch of this wrapping is given below).
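A purely illustrative sketch of such wrapping on Android, in Java. The class name and the choice of selection rule are assumptions (the disclosure allows either the leftmost pointer or the pointer that touched down first; the leftmost rule is shown here), and the wrapped event carries only the masked action of the source event:

```java
import android.view.MotionEvent;

/** Illustrative wrapping of a two-finger gesture into a single-finger one. */
public final class GestureWrapper {

    /** Picks the "first" pointer of a multi-touch event: here, the leftmost one. */
    private static int targetPointerIndex(MotionEvent event) {
        int index = 0;
        for (int i = 1; i < event.getPointerCount(); i++) {
            if (event.getX(i) < event.getX(index)) {
                index = i;
            }
        }
        return index;
    }

    /** Builds a single-pointer MotionEvent that follows only the target pointer. */
    public static MotionEvent wrapAsSingleFinger(MotionEvent event) {
        int index = targetPointerIndex(event);
        return MotionEvent.obtain(
                event.getDownTime(), event.getEventTime(),
                event.getActionMasked(),            // e.g. ACTION_MOVE while sliding
                event.getX(index), event.getY(index),
                event.getMetaState());
    }
}
```

The same selection-and-rebuild idea applies to the two-finger tap in step 507 and the three-finger gestures described later.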
Step 505: determine a second sliding offset value corresponding to the wrapped single-finger sliding operation.
In some embodiments, the execution body may determine the second sliding offset value corresponding to the wrapped single-finger sliding operation. The second sliding offset value may be the distance between the coordinates of the ending touch point of the wrapped single-finger sliding operation and the coordinates of the touch point in the previous frame. The ending touch point of the wrapped single-finger sliding operation is the target touch point at the end of the wrapped single-finger sliding operation.
Step 506: scroll the display content of the target application window according to the second sliding offset value and a second preset sliding ratio.
In some embodiments, the execution body may scroll the display content of the target application window according to the second sliding offset value and the second preset sliding ratio. The second preset sliding ratio may be a preset ratio for scaling the second sliding offset value of the wrapped single-finger sliding operation in the target application window; for example, the second preset sliding ratio may be 150%. In practice, the execution body may scroll the display content of the target application window along the sliding direction of the wrapped single-finger sliding operation, where the distance by which the display content scrolls is the product of the second sliding offset value and the second preset sliding ratio. In this way, after sliding two fingers on the touch-sensitive display screen over the application window in the selected state, the user can browse the scrolled display content of that window.
Step 507: in response to detecting a two-finger tap operation acting on the touch-sensitive display screen, wrap the two-finger tap operation as a wrapped single-finger tap operation.
In some embodiments, the execution body may, in response to detecting a two-finger tap operation acting on the touch-sensitive display screen, wrap the two-finger tap operation as a wrapped single-finger tap operation. The two-finger tap operation may be a tap performed with two touch points, and it corresponds to the target application window. In practice, the execution body may determine the first touch point of the two-finger tap operation as the target touch point; the target touch point then serves as the touch point of the wrapped single-finger tap operation, thereby implementing the wrapping of the two-finger tap operation. The first touch point may be the touch point on the left (where left is from the user's point of view), or the touch point that first came into contact with the touch-sensitive display screen.
Step 508: in response to the anchor point corresponding to the wrapped single-finger tap operation being at a target control position, update the target application window according to the wrapped single-finger tap operation.
In some embodiments, the execution body may, in response to the anchor point corresponding to the wrapped single-finger tap operation being at a target control position, update the target application window according to the wrapped single-finger tap operation. Here, the target control position may be the position of an application control in the target application window. For example, the application control may be a page-refresh control, which receives the user's single-finger tap operation so as to present the target application window after the page is refreshed; it may also be a page-jump control, which receives the user's single-finger tap operation so as to present the target application window after the page jump. In practice, the execution body may display, in the target application window, the preset display content corresponding to the wrapped single-finger tap operation, thereby updating the target application window. The preset display content may be the display content associated with the application control corresponding to the wrapped single-finger tap operation. For example, when the application control is a page-refresh control, the preset display content may be the refreshed page; when it is a page-jump control, the preset display content may be the page jumped to. In this way, after tapping an application control in the selected application window with two fingers on the touch-sensitive display screen, the user can browse the updated application window.
Optionally, the execution body may, in response to detecting a three-finger sliding operation acting on the touch-sensitive display screen and an application window having previously been displayed in the 3D virtual interface, determine the sliding direction, sliding distance and sliding acceleration corresponding to the three-finger sliding operation. The three-finger sliding operation may be an operation of sliding on the touch-sensitive display screen with three touch points. In practice, first, the execution body may wrap the three-finger sliding operation as a single-finger sliding operation. For example, the execution body may determine the first touch point of the three-finger sliding operation as the target touch point; the target touch point then serves as the touch point of the wrapped single-finger sliding operation, thereby implementing the wrapping of the three-finger sliding operation. The first touch point may be the first touch point on the left (where left is from the user's point of view), or the touch point that first came into contact with the touch-sensitive display screen. Then, the sliding direction, sliding distance and sliding acceleration of the wrapped single-finger sliding operation may be determined as the sliding direction, sliding distance and sliding acceleration of the three-finger sliding operation.
Optionally, the execution body may, in response to the sliding direction being upward, the sliding distance being greater than or equal to a first sliding distance, and the sliding acceleration being greater than or equal to a first sliding acceleration, switch to the desktop window. The first sliding distance may be a preset sliding distance and the first sliding acceleration a preset sliding acceleration; their specific settings are not limited. In practice, the execution body may close the application window displayed in the 3D virtual interface and switch to the desktop window. Here, upward is from the user's point of view. In this way, after swiping up with three fingers on the touch-sensitive display screen, the user can browse the desktop window.
Optionally, the execution body may, in response to there being, in the 3D virtual interface, a previous application window of the target application window, the sliding direction being leftward, the sliding distance being greater than or equal to a second sliding distance, and the sliding acceleration being greater than or equal to a second sliding acceleration, switch to the previous application window. The target application window may be the application window in the selected state. The previous application window may be the application window displayed to the left of the target application window, or the application window last displayed before the target application window was displayed. The second sliding distance may be a preset sliding distance and the second sliding acceleration a preset sliding acceleration; their specific settings are not limited. Here, leftward and left are from the user's point of view. In this way, after swiping left with three fingers on the touch-sensitive display screen, the user can browse the previous application window.
Optionally, the execution body may, in response to there being, in the 3D virtual interface, a next application window of the target application window, the sliding direction being rightward, the sliding distance being greater than or equal to a third sliding distance, and the sliding acceleration being greater than or equal to a third sliding acceleration, switch to the next application window. The target application window may be the application window in the selected state. The next application window may be the application window displayed to the right of the target application window, or the application window first displayed after the target application window was displayed. The third sliding distance may be a preset sliding distance and the third sliding acceleration a preset sliding acceleration; their specific settings are not limited. Here, rightward and right are from the user's point of view. In this way, after swiping right with three fingers on the touch-sensitive display screen, the user can browse the next application window (a minimal sketch of this three-finger dispatch follows these paragraphs).
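The disclosure leaves the concrete first, second and third distance and acceleration thresholds open, so the values below are invented for illustration, as are the type and method names; a minimal dispatch over direction, distance and acceleration might look as follows in Java:

```java
/** Illustrative three-finger gesture dispatch; all threshold values are hypothetical. */
public final class ThreeFingerGestureHandler {
    private static final float UP_DISTANCE = 300f,    UP_ACCEL = 2000f;    // "first" thresholds
    private static final float LEFT_DISTANCE = 250f,  LEFT_ACCEL = 1500f;  // "second" thresholds
    private static final float RIGHT_DISTANCE = 250f, RIGHT_ACCEL = 1500f; // "third" thresholds

    public enum Direction { UP, DOWN, LEFT, RIGHT }

    /** Callbacks onto the window manager of the 3D virtual interface (assumed interface). */
    public interface WindowSwitcher {
        boolean hasPreviousWindow();
        boolean hasNextWindow();
        void switchToDesktop();
        void switchToPreviousWindow();
        void switchToNextWindow();
    }

    public static void onThreeFingerSlide(Direction dir, float distance, float accel,
                                          WindowSwitcher windows) {
        switch (dir) {
            case UP:
                if (distance >= UP_DISTANCE && accel >= UP_ACCEL) {
                    windows.switchToDesktop(); // close the shown window, back to desktop
                }
                break;
            case LEFT:
                if (windows.hasPreviousWindow()
                        && distance >= LEFT_DISTANCE && accel >= LEFT_ACCEL) {
                    windows.switchToPreviousWindow();
                }
                break;
            case RIGHT:
                if (windows.hasNextWindow()
                        && distance >= RIGHT_DISTANCE && accel >= RIGHT_ACCEL) {
                    windows.switchToNextWindow();
                }
                break;
            default:
                break; // downward swipe: no action defined in the described embodiments
        }
    }
}
```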
As can be seen from Fig. 5, compared with the description of the embodiments corresponding to Fig. 4, the flow 500 of the virtual interface operation method in the embodiments corresponding to Fig. 5 embodies the steps that extend the two-finger sliding operation and the two-finger tap operation corresponding to the target application window. Thus, the solutions described in these embodiments allow the user, after sliding two fingers on the touch-sensitive display screen over the application window in the selected state, to browse the scrolled display content of that window, and, after tapping an application control in that window with two fingers, to browse the updated window.
Referring now to Fig. 6, a schematic diagram of the hardware structure of a head-mounted display device 600 (for example, the head-mounted display device in Fig. 1) suitable for implementing some embodiments of the present disclosure is shown. The head-mounted display device shown in Fig. 6 is merely an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in Fig. 6, the head-mounted display device 600 may include a processing apparatus (for example, a central processing unit, a graphics processor, etc.) 601, a memory 602, an input unit 603 and an output unit 604, where the processing apparatus 601, the memory 602, the input unit 603 and the output unit 604 are connected to one another via a bus 605. Here, the method according to the embodiments of the present disclosure may be implemented as a computer program and stored in the memory 602. For example, some embodiments of the present disclosure include a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for performing the method shown in the flowcharts. The processing apparatus 601 in the head-mounted display device implements the virtual interface operation functions defined in the method of the present disclosure by invoking the computer program stored in the memory 602. In some implementations, the input unit 603 may include a touch device (for example, the touch-sensitive display screen of the target device). Thus, the touch device in the input unit 603 can sense whether a user operation on the virtual interface is detected, and, in response to determining that one is, the processing apparatus 601 can invoke the computer program to perform the function of displaying the application page. The output unit 604 may include a display screen for displaying the 3D virtual interface and the 2D virtual interface.
It should be noted that the computer-readable medium described in some embodiments of the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example, but is not limited to, an electric, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection with one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In some embodiments of the present disclosure, the computer-readable storage medium may be any tangible medium that contains or stores a program which can be used by or in combination with an instruction execution system, apparatus or device. In some embodiments of the present disclosure, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. The computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium; the computer-readable signal medium may send, propagate or transmit a program for use by or in combination with an instruction execution system, apparatus or device. The program code contained on the computer-readable medium may be transmitted by any appropriate medium, including but not limited to: a wire, an optical cable, RF (radio frequency), etc., or any suitable combination of the above.
In some implementations, the client and the server may communicate using any currently known or future-developed network protocol such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication in any form or medium (for example, a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (for example, the Internet) and a peer-to-peer network (for example, an ad hoc peer-to-peer network), as well as any currently known or future-developed network.
The computer-readable medium may be included in the head-mounted display device, or it may exist separately without being assembled into the head-mounted display device. The computer-readable medium carries one or more programs which, when executed by the head-mounted display device, cause the head-mounted display device to: in response to detecting a single-finger sliding operation acting on the touch-sensitive display screen of a target device, determine a first sliding offset value corresponding to the single-finger sliding operation, where the target device is communicatively connected to the head-mounted display device; according to the first sliding offset value and a first preset sliding ratio, slide the anchor point corresponding to the single-finger sliding operation in the 3D virtual interface of the head-mounted display device, and display, in the 3D virtual interface, a ray from a target starting point to the anchor point, where the 3D virtual interface is displayed on the display screen of the head-mounted display device; in response to detecting a single-finger tap operation acting on the touch-sensitive display screen, and the anchor point corresponding to the single-finger tap operation being at a target control position, create a 2D virtual interface in the 3D virtual interface and display, in the 2D virtual interface, the application window corresponding to the single-finger tap operation.
Computer program code for performing the operations of some embodiments of the present disclosure may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions and operations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowcharts or block diagrams may represent a module, program segment, or portion of code, which contains one or more executable instructions for implementing the specified logical functions. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, may be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and so on.
The above description is only of some preferred embodiments of the present disclosure and an explanation of the technical principles employed. Those skilled in the art should understand that the scope of the disclosure involved in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalent features without departing from the above disclosed concept, for example, technical solutions formed by replacing the above features with technical features having similar functions disclosed in (but not limited to) the embodiments of the present disclosure.

Claims (12)

  1. A virtual interface operation method, comprising:
    in response to detecting a single-finger sliding operation acting on a touch-sensitive display screen of a target device, determining a first sliding offset value corresponding to the single-finger sliding operation, wherein the target device is communicatively connected to a head-mounted display device;
    according to the first sliding offset value and a first preset sliding ratio, sliding an anchor point corresponding to the single-finger sliding operation in a 3D virtual interface of the head-mounted display device, and displaying, in the 3D virtual interface, a ray from a target starting point to the anchor point, wherein the 3D virtual interface is displayed on a display screen of the head-mounted display device;
    in response to detecting a single-finger tap operation acting on the touch-sensitive display screen, and an anchor point corresponding to the single-finger tap operation being at a target control position, creating a 2D virtual interface in the 3D virtual interface, and displaying, in the 2D virtual interface, an application window corresponding to the single-finger tap operation.
  2. The method according to claim 1, wherein the method further comprises:
    in response to detecting a sliding operation or a tap operation acting on the touch-sensitive display screen, sending the sliding operation or the tap operation to the display screen by way of mapping.
  3. The method according to claim 1, wherein, before the sliding, according to the first sliding offset value and the first preset sliding ratio, the anchor point corresponding to the single-finger sliding operation in the 3D virtual interface of the head-mounted display device, the method further comprises:
    determining the first preset sliding ratio according to a sliding speed of the single-finger sliding operation.
  4. The method according to claim 3, wherein the sliding the anchor point corresponding to the single-finger sliding operation in the 3D virtual interface of the head-mounted display device comprises:
    in response to a touch mode of the touch-sensitive display screen being landscape mode, performing the following steps:
    determining a vertical offset value, in the 3D virtual interface, of the anchor point corresponding to the single-finger sliding operation according to a short-edge offset value of the first sliding offset value in a short-edge direction of the touch-sensitive display screen and the first preset sliding ratio;
    determining a horizontal offset value, in the 3D virtual interface, of the anchor point corresponding to the single-finger sliding operation according to a long-edge offset value of the first sliding offset value in a long-edge direction of the touch-sensitive display screen and the first preset sliding ratio;
    sliding the anchor point corresponding to the single-finger sliding operation in the 3D virtual interface according to the vertical offset value and the horizontal offset value.
  5. The method according to claim 1, wherein the method further comprises:
    in response to detecting a two-finger sliding operation acting on the touch-sensitive display screen, wrapping the two-finger sliding operation as a wrapped single-finger sliding operation, wherein the two-finger sliding operation corresponds to a target application window displayed in the 3D virtual interface;
    determining a second sliding offset value corresponding to the wrapped single-finger sliding operation;
    scrolling display content of the target application window according to the second sliding offset value and a second preset sliding ratio.
  6. The method according to claim 5, wherein the method further comprises:
    in response to detecting a two-finger tap operation acting on the touch-sensitive display screen, wrapping the two-finger tap operation as a wrapped single-finger tap operation, wherein the two-finger tap operation corresponds to the target application window;
    in response to an anchor point corresponding to the wrapped single-finger tap operation being at a target control position, updating the target application window according to the wrapped single-finger tap operation.
  7. The method according to any one of claims 1-6, wherein the method further comprises:
    in response to detecting a three-finger sliding operation acting on the touch-sensitive display screen and an application window having previously been displayed in the 3D virtual interface, determining a sliding direction, a sliding distance and a sliding acceleration corresponding to the three-finger sliding operation.
  8. The method according to claim 7, wherein the method further comprises:
    in response to the sliding direction being upward, the sliding distance being greater than or equal to a first sliding distance, and the sliding acceleration being greater than or equal to a first sliding acceleration, switching to a desktop window.
  9. The method according to claim 8, wherein the method further comprises:
    in response to there being, in the 3D virtual interface, a previous application window of a target application window, the sliding direction being leftward, the sliding distance being greater than or equal to a second sliding distance, and the sliding acceleration being greater than or equal to a second sliding acceleration, switching to the previous application window.
  10. The method according to claim 9, wherein the method further comprises:
    in response to there being, in the 3D virtual interface, a next application window of the target application window, the sliding direction being rightward, the sliding distance being greater than or equal to a third sliding distance, and the sliding acceleration being greater than or equal to a third sliding acceleration, switching to the next application window.
  11. A head-mounted display device, comprising:
    one or more processors;
    a display screen for displaying a 3D virtual interface and a 2D virtual interface;
    a storage device on which one or more programs are stored,
    which, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1-10.
  12. A computer-readable medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method according to any one of claims 1-10.
PCT/CN2022/105489 2021-07-13 2022-07-13 虚拟界面操作方法、头戴式显示设备和计算机可读介质 WO2023284791A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110788467.7 2021-07-13
CN202110788467.7A CN113589926B (zh) 2021-07-13 2021-07-13 虚拟界面操作方法、头戴式显示设备和计算机可读介质

Publications (1)

Publication Number Publication Date
WO2023284791A1 true WO2023284791A1 (zh) 2023-01-19

Family

ID=78247499

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/105489 WO2023284791A1 (zh) 2021-07-13 2022-07-13 虚拟界面操作方法、头戴式显示设备和计算机可读介质

Country Status (2)

Country Link
CN (1) CN113589926B (zh)
WO (1) WO2023284791A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113589926B (zh) * 2021-07-13 2022-10-25 杭州灵伴科技有限公司 虚拟界面操作方法、头戴式显示设备和计算机可读介质
CN115344152B (zh) * 2022-07-13 2023-09-01 北京奇艺世纪科技有限公司 界面操作方法、装置、电子设备及可读存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140147032A1 (en) * 2010-11-09 2014-05-29 The Provost Fellows,and Scholars of the College of the Holy and Undivided Trinity of Queen Elizabeth Method and System for Recovery of 3D Scene Structure and Camera Motion From a Video Sequence
CN106527722A (zh) * 2016-11-08 2017-03-22 网易(杭州)网络有限公司 虚拟现实中的交互方法、系统及终端设备
CN108646997A (zh) * 2018-05-14 2018-10-12 刘智勇 一种虚拟及增强现实设备与其他无线设备进行交互的方法
US20200292813A1 (en) * 2019-03-13 2020-09-17 Thirdeye Gen, Inc. Gaze-based user interface for augmented and mixed reality device
CN112764629A (zh) * 2021-01-28 2021-05-07 北京城市网邻信息技术有限公司 增强现实界面展示方法、装置、设备和计算机可读介质
CN113589926A (zh) * 2021-07-13 2021-11-02 杭州灵伴科技有限公司 虚拟界面操作方法、头戴式显示设备和计算机可读介质

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10495726B2 (en) * 2014-11-13 2019-12-03 WorldViz, Inc. Methods and systems for an immersive virtual reality system using multiple active markers
CN107133002B (zh) * 2017-03-22 2020-07-24 联想(北京)有限公司 一种信息处理方法及电子设备
US20200357183A1 (en) * 2019-05-06 2020-11-12 Michael Weber Methods, Systems and Apparatuses for Viewing Content in Augmented Reality or Virtual Reality
CN110428504B (zh) * 2019-07-12 2023-06-27 北京旷视科技有限公司 文本图像合成方法、装置、计算机设备和存储介质
CN110764675A (zh) * 2019-10-11 2020-02-07 维沃移动通信有限公司 一种控制方法及电子设备
CN112000259A (zh) * 2020-06-30 2020-11-27 深圳点猫科技有限公司 一种基于移动终端触摸事件控制摄像头的方法及装置
CN111966255B (zh) * 2020-08-25 2022-03-15 北京城市网邻信息技术有限公司 信息显示方法、装置、电子设备和计算机可读介质
CN112035028A (zh) * 2020-09-15 2020-12-04 Oppo广东移动通信有限公司 界面控制方法、界面控制装置、存储介质与电子设备

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140147032A1 (en) * 2010-11-09 2014-05-29 The Provost Fellows,and Scholars of the College of the Holy and Undivided Trinity of Queen Elizabeth Method and System for Recovery of 3D Scene Structure and Camera Motion From a Video Sequence
CN106527722A (zh) * 2016-11-08 2017-03-22 网易(杭州)网络有限公司 虚拟现实中的交互方法、系统及终端设备
CN108646997A (zh) * 2018-05-14 2018-10-12 刘智勇 一种虚拟及增强现实设备与其他无线设备进行交互的方法
US20200292813A1 (en) * 2019-03-13 2020-09-17 Thirdeye Gen, Inc. Gaze-based user interface for augmented and mixed reality device
CN112764629A (zh) * 2021-01-28 2021-05-07 北京城市网邻信息技术有限公司 增强现实界面展示方法、装置、设备和计算机可读介质
CN113589926A (zh) * 2021-07-13 2021-11-02 杭州灵伴科技有限公司 虚拟界面操作方法、头戴式显示设备和计算机可读介质

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LIU LIAN: "Analysis of VR Interface Design Patterns", COMPUTER PROGRAMMING SKILLS & MAINTENANCE, 18 January 2020 (2020-01-18), pages 138 - 139+155, XP093023386, DOI: 10.16184/j.cnki.comprg.2020.01.050 *
ZHU CHEN-GUANG,MENG XIAN-GUO,DI YAN-QIANG,ZHU YUAN-CHANG: "Research and Implementation of Large Multi-Touch Screen Applied to Virtual Training System", COMPUTER SIMULATION, vol. 30, no. 3, 15 March 2013 (2013-03-15), pages 294 - 298, XP093023389, ISSN: 1006-9348 *

Also Published As

Publication number Publication date
CN113589926B (zh) 2022-10-25
CN113589926A (zh) 2021-11-02

Similar Documents

Publication Publication Date Title
US11956528B2 (en) Shooting method using target control, electronic device, and storage medium
WO2023284791A1 (zh) 虚拟界面操作方法、头戴式显示设备和计算机可读介质
WO2022111239A1 (zh) 投屏控制方法、设备及电子设备
US20230024650A1 (en) Method and apparatus for selecting menu items, readable medium and electronic device
WO2021098537A1 (zh) 用于目标设备的视图调整方法、装置、电子设备和介质
US12003468B2 (en) Information processing method, apparatus and medium
US11861381B2 (en) Icon updating method and apparatus, and electronic device
US20220398011A1 (en) Target object display method and apparatus, electronic device and non-transitory computer-readable medium
WO2023138559A1 (zh) 虚拟现实交互方法、装置、设备和存储介质
CN110275659A (zh) 一种图像切换方法、装置、电子设备及存储介质
CN113220118B (zh) 虚拟界面展示方法、头戴式显示设备和计算机可读介质
WO2023125164A1 (zh) 页面显示方法、装置、电子设备和存储介质
WO2022183887A1 (zh) 视频编辑及播放方法、装置、设备、介质
WO2024061063A1 (zh) 通知消息的显示方法、装置、电子设备和存储介质
US20230199262A1 (en) Information display method and device, and terminal and storage medium
WO2022179409A1 (zh) 控件显示方法、装置、设备及介质
US20150326705A1 (en) Mobile Device Data Transfer Using Location Information
WO2024051639A1 (zh) 图像处理方法、装置、设备及存储介质和产品
US20230251777A1 (en) Target object display method and apparatus, electronic device and non-transitory computer-readable medium
WO2023246302A1 (zh) 字幕的显示方法、装置、设备及介质
US11935176B2 (en) Face image displaying method and apparatus, electronic device, and storage medium
EP4328725A1 (en) Display method and apparatus, electronic device, and storage medium
EP4274237A1 (en) Information display method and apparatus, and device and medium
CN114168063A (zh) 虚拟按键显示方法、头戴式显示设备和计算机可读介质
CN114397961A (zh) 头戴式显示设备控制方法、头戴式显示设备组件和介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22841424

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 18570488

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE