WO2018209572A1 - Head-mounted display device and interactive input method thereof - Google Patents

Head-mounted display device and interactive input method thereof

Info

Publication number
WO2018209572A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
display device
touch screen
display
sensing signal
Prior art date
Application number
PCT/CN2017/084567
Other languages
English (en)
French (fr)
Inventor
谢俊
Original Assignee
深圳市柔宇科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市柔宇科技有限公司
Priority to PCT/CN2017/084567 priority Critical patent/WO2018209572A1/zh
Priority to CN201780004641.6A priority patent/CN108475085A/zh
Publication of WO2018209572A1 publication Critical patent/WO2018209572A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements

Definitions

  • The present invention relates to display devices, and more particularly to a head-mounted display device and an interactive input method thereof.
  • Head-mounted display devices have gradually become popular because of their convenience and their ability to deliver stereoscopic display and stereo sound.
  • In recent years, with the advent of virtual reality (VR) technology, head-mounted display devices have become even more widely used as the hardware supporting VR.
  • However, with current head-mounted display devices, a user who wants to provide input while wearing the device usually has to operate an input device blind, which is inconvenient.
  • In some techniques, the input experience is improved by displaying friendlier input interfaces for the user to operate an input device with, but this is still not intuitive enough.
  • Embodiments of the invention disclose a head-mounted display device and an interactive input method thereof that allow a user wearing the device to provide input in a more intuitive manner, effectively improving the input experience.
  • The head-mounted display device includes a processor and a display device, and further includes a touch screen disposed on the back surface of the display device for sensing a touch object within a preset distance.
  • The processor is electrically connected to the display device and the touch screen, and is configured to control the display device to display a cursor when the touch screen senses the touch object, and to change display attributes of the cursor according to the change in the distance between the touch object and the touch screen.
  • The interactive input method disclosed in an embodiment of the present invention is applied to a head-mounted display device that includes a display device and a touch screen located on the back of the display device. The interactive input method includes the steps of: sensing a touch object within a preset distance through the touch screen; controlling the display device to display a cursor when the touch screen senses the touch object; and changing display attributes of the cursor according to a change in the distance between the touch object and the touch screen.
  • In the head-mounted display device and interactive input method of the present invention, the touch screen is set on the back of the display device, so the user can perform touch operations on the displayed content at the corresponding positions of the touch screen, which is more intuitive; the cursor is also controlled to change dynamically with the user's touch distance, guiding the user to touch within a suitable distance, which is more intuitive and convenient.
  • FIG. 1 is a block diagram showing the structure of a head mounted display device in accordance with an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of a head mounted display device in accordance with an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of a head-mounted display device displaying a cursor according to an embodiment of the invention.
  • FIG. 4 is a schematic diagram showing the relationship between a touch screen and a display device of a head mounted display device according to an embodiment of the invention.
  • FIG. 5 is an exploded perspective view of a head-mounted display device according to an embodiment of the invention.
  • FIG. 6 is a flowchart of an interactive input method according to an embodiment of the present invention.
  • FIG. 7 is a sub-flow chart of step S603 in FIG. 6.
  • FIG. 8 is a flowchart of an interactive input method according to another embodiment of the present invention.
  • FIG. 1 is a structural block diagram of a head mounted display device 100 according to an embodiment of the invention.
  • The head-mounted display device 100 includes a processor 10, a display device 20, and a touch screen 30.
  • The touch screen 30 is disposed externally on the back of the display device 20. That is, when the user wears the head-mounted display device 100, the touch screen 30 is located on the side of the display device 20 facing away from the user's eyes.
  • The touch screen 30 is configured to sense a touch object T1 within a preset distance. The touch object T1 can be a user's finger or the like, and the preset distance may be, for example, 5 cm.
  • When the touch screen 30 senses a touch object T1 within the preset distance, a corresponding sensing signal is generated.
  • Referring also to FIG. 3, a schematic diagram of the display device 20 displaying a cursor is shown.
  • The processor 10 is electrically connected to the display device 20 and the touch screen 30, and is configured to control the display device 20 to display the cursor B1 when the touch screen 30 senses the touch object T1, and to change the display attributes of the cursor B1 according to the change in the distance between the touch object T1 and the touch screen 30.
  • While the touch object T1 is within the preset distance, the farther the touch object T1 is from the touch screen 30, the larger and more transparent the processor 10 displays the cursor B1.
  • Conversely, the closer the touch object T1 is to the touch screen 30, the smaller and clearer the cursor is displayed. Obviously, the more transparent the cursor B1, the harder it is to make out; the more opaque, the clearer.
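
The behavior just described is a monotone mapping from the sensed distance to the cursor's size and opacity. The sketch below illustrates one way such a mapping could be written; the 5 cm sensing range and the size and opacity bounds are illustrative assumptions, not values prescribed by the disclosure.

```python
# A minimal sketch of the distance-to-cursor mapping described above.
# The 5 cm range and the size/opacity bounds are illustrative assumptions.

def cursor_attributes(distance_cm: float,
                      preset_cm: float = 5.0,
                      min_size_px: int = 16,
                      max_size_px: int = 64) -> dict:
    """Map the sensed distance of the touch object to cursor attributes.

    Farther object -> larger, more transparent cursor.
    Closer object  -> smaller, more opaque (clearer) cursor.
    """
    # Normalize the distance to [0, 1] across the sensing range.
    t = max(0.0, min(distance_cm / preset_cm, 1.0))
    return {
        "size_px": round(min_size_px + t * (max_size_px - min_size_px)),
        "opacity": 1.0 - t,  # 1.0 = fully opaque at contact
    }

if __name__ == "__main__":
    for d in (4.5, 2.5, 0.5):
        print(f"{d} cm -> {cursor_attributes(d)}")
```
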
  • In this way, with the touch screen 30 disposed on the back surface of the display device 20, the user can perform touch operations through the touch screen 30 on the content seen on the display device 20, which is more intuitive; moreover, the cursor B1 dynamically indicates the user's touch distance and can guide the user to move into a suitable touch-distance range, making touch input more intuitive and convenient.
  • When the touch screen 30 senses the touch object T1, the processor 10 controlling the display device 20 to display the cursor B1 specifically includes: the processor 10 controls, according to the touch position of the touch object T1, the cursor B1 to be displayed at the corresponding display position of the display device 20.
  • The sensing signal generated when the touch screen 30 senses the touch object T1 within the preset distance includes information on the distance between the touch object T1 and the touch screen 30.
  • The processor 10 acquires the sensing signal, determines the distance of the touch object T1 from the touch screen 30 according to the distance information in the signal, and determines the change in the distance between the touch object T1 and the touch screen 30 from continuously received sensing signals.
  • As before, the farther the touch object T1 is from the touch screen 30, the larger and more transparent the processor 10 displays the cursor B1; the closer it is, the smaller and clearer the cursor is displayed.
  • The sensing signal generated when the touch screen 30 senses the touch object T1 within the preset distance further includes touch position information; the processor 10 acquires the sensing signal, determines the touch position of the touch object T1 on the touch screen 30 from the touch position information in the acquired signal, and controls the cursor B1 to be displayed at the corresponding display position of the display device 20.
  • The display position on the display device 20 corresponding to a touch position on the touch screen 30 is the position that coincides with the touch position along the user's line of sight once the user puts on the head-mounted display device 100. That is, after the user puts on the head-mounted display device 100, the area of the display device 20 seen by the user and the corresponding touch position on the touch screen 30 coincide; the position where the user performs touch input on the touch screen 30 therefore coincides with the display position seen, and the user can accurately touch the icons and other elements displayed on the display device 20.
  • The display device 20 includes a display area 21 for display, and both the entire display area 21 of the display device 20 and the touch screen 30 are rectangular.
  • The touch screen 30 is positioned on the back surface of the display device 20 such that the extensions of the lines connecting the four vertices of the entire display area 21 of the display device 20 with the corresponding four vertices of the touch screen 30 converge exactly at the observation point G1 of the user's eyes.
  • When the display area 21 and the touch screen 30 extend to cover the full-screen area of both eyes, that is, when the user wears the head-mounted display device 100 and the display area 21 and the touch screen 30 completely cover the positions of the user's left and right eyes, the observation point of the user's eyes refers to the midpoint of the line connecting the left and right eyes of the user wearing the head-mounted display device 100.
  • In other embodiments, the display area 21 and the touch screen 30 are areas corresponding to the left eye or the right eye separately; that is, the touch screen 30 includes two touch screens, one corresponding to the left-eye area and one to the right-eye area, and the display area 21 includes a left-eye display area and a right-eye display area. In this case, the observation point of the user's eyes refers to the position of the left eye or the right eye of the user wearing the head-mounted display device 100.
  • With this arrangement, each corresponding pair of touch position on the touch screen 30 and display position on the display device 20 lies in two regions that coincide along the user's line of sight after the user puts on the head-mounted display device 100.
  • The user thus sees the display interface through the display device 20 and, according to where the position to be touched appears on the display interface, can accurately perform contact or contactless touch on that position with a finger; this manner of interactive input is also more intuitive.
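
The vertex-alignment rule above implies a similar-triangles relationship between the display plane and the touch plane as seen from the observation point G1. The sketch below works through that relationship for the single-observation-point case; the plane distances and coordinates are made-up example numbers, not dimensions taken from the disclosure.

```python
# A sketch of the line-of-sight mapping implied by the vertex alignment:
# a ray from the observation point G1 through a point on the display
# plane continues on to the touch plane, so corresponding points scale
# by the ratio of the planes' distances from G1 (similar triangles).
# All numeric values are illustrative assumptions.

def display_to_touch(x_d: float, y_d: float,
                     z_display_cm: float = 3.0,
                     z_touch_cm: float = 4.5) -> tuple:
    """Project a display-plane point (coordinates centered on G1's axis)
    onto the touch plane along the user's line of sight."""
    s = z_touch_cm / z_display_cm  # uniform scale factor along the ray
    return (x_d * s, y_d * s)

def touch_to_display(x_t: float, y_t: float,
                     z_display_cm: float = 3.0,
                     z_touch_cm: float = 4.5) -> tuple:
    """Inverse mapping: the display point lying under a sensed touch point."""
    s = z_display_cm / z_touch_cm
    return (x_t * s, y_t * s)

if __name__ == "__main__":
    # A corner of a 6 cm x 4 cm display area maps to the matching corner
    # of the proportionally larger touch screen, as the vertex rule requires.
    print(display_to_touch(3.0, 2.0))   # -> (4.5, 3.0)
    print(touch_to_display(4.5, 3.0))   # -> (3.0, 2.0)
```
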
  • In some embodiments, the sensing signal generated when the touch screen 30 senses the touch object T1 within the preset distance further includes touch action information.
  • The processor 10 is further configured to determine the user's touch action from the touch action information in the sensing signal, for example a double-click, a click, or a slide, determine the current touch position from the touch position information in the signal, then determine the display element displayed at the corresponding display position on the display device 20, and perform the function associated with that display element according to the touch action.
  • For example, a number of function icons P1 are displayed on the display device 20, including a browser icon, an e-book icon, a map icon, a camera icon, and so on.
  • When the processor 10 determines from the touch action information in the sensing signal that the user's touch action is a double-click, and the display element shown at the display position corresponding to the current touch position is the browser icon, the processor 10 opens the browser. Likewise, if the processor 10 determines from the touch action information that the touch action is a two-finger pinch slide, and the display element shown at the corresponding display position is a picture, the processor 10 shrinks the picture.
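
This step is essentially a dispatch on the pair (touch action, display element). A minimal sketch of such a dispatcher follows; the action names, element types, and handler functions are hypothetical stand-ins, since the disclosure does not prescribe an API.

```python
# A hypothetical dispatcher for the (touch action, display element) step.
# Action names, element types, and handlers are illustrative stand-ins.

def open_browser() -> None:
    print("opening browser")

def shrink_picture() -> None:
    print("shrinking picture")

# (touch action, element type) -> function associated with the element
DISPATCH = {
    ("double_click", "browser_icon"): open_browser,
    ("two_finger_pinch", "picture"): shrink_picture,
}

def handle_touch(action: str, element_type: str) -> None:
    handler = DISPATCH.get((action, element_type))
    if handler is not None:
        handler()  # perform the function related to the display element

handle_touch("double_click", "browser_icon")  # -> opening browser
handle_touch("two_finger_pinch", "picture")   # -> shrinking picture
```
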
  • The correspondence between display positions on the display device 20 and touch positions on the touch screen 30 may be established by dividing the display device 20 and the touch screen 30 into a number of position regions at a preset resolution in advance, and pairing one-to-one each display position of the display device 20 with the touch position on the touch screen 30 that coincides with it along the user's line of sight.
  • The correspondence may be pre-established before the head-mounted display device 100 leaves the factory.
  • The head-mounted display device 100 further includes a memory 40 that stores the correspondence between display positions of the display device 20 and touch positions on the touch screen 30.
  • After determining the current touch position, the processor 10 uses the correspondence to determine the corresponding display position on the display device 20, and further determines the display element displayed at that position.
  • Similarly, when controlling the display of the cursor B1, the processor 10 first determines the current touch position, then determines the corresponding display position on the display device 20 according to the correspondence, and controls the cursor B1 to be displayed at that corresponding display position.
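
One plausible realization of this factory-calibrated correspondence is a precomputed region-to-region lookup table, as sketched below; the grid resolution and the identity pairing are illustrative assumptions, not values from the disclosure.

```python
# A sketch of a precomputed region-to-region correspondence table.
# Grid sizes and the identity pairing are illustrative assumptions.

TOUCH_GRID = (32, 18)    # touch screen divided into 32 x 18 regions
DISPLAY_GRID = (32, 18)  # display area divided at the same resolution

# Factory calibration: touch region (i, j) is paired with the display
# region that coincides with it along the user's line of sight. With
# aligned vertices and equal grids, this is simply the identity pairing.
CORRESPONDENCE = {
    (i, j): (i, j)
    for i in range(TOUCH_GRID[0])
    for j in range(TOUCH_GRID[1])
}

def display_region_for_touch(i: int, j: int) -> tuple:
    """Look up the display region paired with touch region (i, j)."""
    return CORRESPONDENCE[(i, j)]

assert display_region_for_touch(10, 5) == (10, 5)
```
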
  • FIG. 5 is an exploded structural schematic view of the head-mounted display device 100.
  • In some embodiments, the touch screen 30 is a hover (floating) touch screen in which proximity touch sensors are distributed, so the touch screen 30 can sense the touch object T1 at close range.
  • The head-mounted display device 100 further includes a housing 50 for covering the back surface of the display device 20, and the touch screen 30 is disposed on the inner wall of the housing 50. That is, as shown in FIG. 5, when the touch screen 30 is a hover touch screen, it is disposed on the inner wall of the housing 50 and is not directly exposed.
  • In other embodiments, the touch screen 30 can be an ordinary touch screen that detects the user's direct touch, used to sense the touch object T1 in direct contact with the touch screen 30. Obviously, when the touch screen 30 senses touch objects T1 in direct contact with it, the touch screen 30 is located on the outermost surface of the display device 20.
  • The processor 10 can be a microcontroller, a microprocessor, a single-chip microcomputer, a digital signal processor, or the like.
  • The memory 40 can be a computer-readable storage medium such as a memory card, solid-state memory, a micro hard disk, or an optical disc. In some embodiments, the memory 40 stores a number of program instructions that can be executed by the processor 10 to perform the aforementioned functions.
  • The head-mounted display device 100 can be a head-mounted device such as a smart helmet or smart glasses.
  • FIG. 6 is a flowchart of an interactive input method according to an embodiment of the present invention.
  • The interactive input method is applied to the aforementioned head-mounted display device 100, and the order of execution is not limited to the order shown in FIG. 6.
  • The method includes the following steps:
  • When the touch screen 30 senses the touch object T1, control the display device 20 to display the cursor B1 (S601). In some embodiments, when the touch screen 30 senses the touch object T1, the processor 10 controls, according to the touch position of the touch object T1, the cursor B1 to be displayed at the corresponding display position of the display device 20. In some embodiments, the display position on the display device 20 corresponding to the touch position on the touch screen 30 is the position that coincides with the touch position along the user's line of sight after the user puts on the head-mounted display device 100. In some embodiments, the correspondence between the display positions of the display device 20 and the touch positions on the touch screen 30 is pre-stored in the head-mounted display device 100, and the display position corresponding to the touch position is determined according to the correspondence.
  • Change the display attributes of the cursor B1 according to the change in the distance between the touch object T1 and the touch screen 30 (S603).
  • In some embodiments, the farther the touch object T1 is from the touch screen 30, the larger and more transparent the processor 10 displays the cursor B1.
  • FIG. 7 shows a sub-flowchart of step S603 in some embodiments.
  • Step S603 includes:
  • Acquire the sensing signal generated when the touch screen 30 senses the touch object T1 (S6031).
  • Determine the distance between the touch object T1 and the touch screen 30 according to the distance information in the sensing signal (S6032).
  • Determine the change in the distance between the touch object T1 and the touch screen 30 according to the distance information in continuously received sensing signals (S6033).
  • Change the display attributes of the cursor B1 according to the change in the distance between the touch object T1 and the touch screen 30 (S6034).
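
Taken together, S6031-S6034 describe a small loop over the stream of sensing signals. A sketch of that loop is shown below; the signal format (a dict carrying a "distance_cm" field) is a hypothetical stand-in for whatever the touch controller actually reports.

```python
# A sketch of the S6031-S6034 loop over incoming sensing signals.
# The signal format is a hypothetical stand-in for the controller output.

def run_cursor_update_loop(signals) -> None:
    previous = None
    for signal in signals:                 # S6031: acquire the sensing signal
        distance = signal["distance_cm"]   # S6032: distance from its distance info
        if previous is not None:
            delta = distance - previous    # S6033: change across consecutive signals
            # S6034: change the cursor's display attributes accordingly;
            # here, opacity rises as the touch object approaches the screen.
            opacity = max(0.0, 1.0 - distance / 5.0)
            print(f"moved {delta:+.1f} cm -> cursor opacity {opacity:.2f}")
        previous = distance

run_cursor_update_loop([{"distance_cm": 4.0},
                        {"distance_cm": 2.0},
                        {"distance_cm": 1.0}])
```
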
  • FIG. 8 is a flowchart of an interactive input method in an embodiment of the present invention.
  • The interactive input method is applied to the aforementioned head-mounted display device 100, and the order of execution is not limited to the order shown in FIG. 8.
  • The method includes the following steps:
  • When the touch screen 30 senses the touch object T1, control the display device 20 to display the cursor B1 (S801). In some embodiments, when the touch screen 30 senses the touch object T1, the processor 10 controls, according to the touch position of the touch object T1, the cursor B1 to be displayed at the corresponding display position of the display device 20.
  • Change the display attributes of the cursor B1 according to the change in the distance between the touch object T1 and the touch screen 30 (S803).
  • Determine the user's touch action according to the touch action information in the sensing signal generated when the touch screen 30 senses the touch object T1 (S805).
  • Determine the current touch position according to the touch position information in the sensing signal (S806).
  • Determine the display element displayed at the corresponding display position on the display device 20 (S807).
  • In some embodiments, the head-mounted display device 100 stores the correspondence between display positions of the display device 20 and touch positions on the touch screen 30.
  • Step S807 includes: after the current touch position is determined, determining the corresponding display position on the display device 20 according to the correspondence, and further determining the display element displayed at that display position.
  • Perform the function associated with the display element according to the touch action (S808).
  • Steps S801 and S803 are the same as steps S601 and S603 in FIG. 6, respectively, and the further features they include are also the same; for details, refer to steps S601 and S603 in FIG. 6.
  • When a number of program instructions are stored in the memory 40, they are invoked and executed by the processor 10 to perform the steps of any of the methods in FIGS. 6-7.
  • In the head-mounted display device 100 and interactive input method of the present invention, the touch screen 30 is provided on the back side of the display device 20, and the user can perform touch operations on the content seen on the display device 20 at the corresponding positions of the touch screen 30; the operation is more intuitive.
  • In addition, the cursor B1 can be displayed on the display device 20 in response to the user's close-range touch, and the display attributes of the cursor B1 can be changed according to the touch distance to indicate to the user how near or far the touch is.
  • The present invention further makes the touch position and the corresponding display position coincide along the user's line of sight, so that the user grasps the display content and the touch position more accurately, which facilitates quick input.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An interactive input method, applied to a head-mounted display device. The head-mounted display device includes a display device and a touch screen located on the back of the display device. The interactive input method includes the steps of: sensing a touch object within a preset distance through the touch screen; controlling the display device to display a cursor when the touch screen senses the touch object (S601); and changing display attributes of the cursor according to a change in the distance between the touch object and the touch screen (S603). The method can provide more intuitive interactive input.

Description

Head-mounted display device and interactive input method thereof
Technical Field
The present invention relates to display devices, and more particularly to a head-mounted display device and an interactive input method thereof.
Background
Head-mounted display devices have gradually become popular because of their convenience and their ability to deliver effects such as stereoscopic display and stereo sound. In recent years, with the advent of virtual reality (VR) technology, head-mounted display devices have become even more widely used as the hardware supporting VR. However, with current head-mounted display devices, a user who wants to provide input while wearing the device usually has to operate an input device blind, which is quite inconvenient. Some techniques improve the input experience by displaying friendlier input interfaces for the user to operate an input device with, but this is still not intuitive enough.
Summary of the Invention
Embodiments of the present invention disclose a head-mounted display device and an interactive input method thereof, which allow a user wearing the head-mounted display device to provide input in a more intuitive manner and can effectively improve the input experience.
The head-mounted display device disclosed in an embodiment of the present invention includes a processor and a display device, and further includes a touch screen disposed on the back of the display device for sensing a touch object within a preset distance. The processor is electrically connected to the display device and the touch screen, and is configured to control the display device to display a cursor when the touch screen senses a touch object, and to change display attributes of the cursor according to the change in the distance between the touch object and the touch screen.
The interactive input method disclosed in an embodiment of the present invention is applied to a head-mounted display device that includes a display device and a touch screen located on the back of the display device. The interactive input method includes the steps of: sensing a touch object within a preset distance through the touch screen; controlling the display device to display a cursor when the touch screen senses the touch object; and changing display attributes of the cursor according to a change in the distance between the touch object and the touch screen.
In the head-mounted display device and interactive input method of the present invention, the touch screen is disposed on the back of the display device, so the user can perform touch operations on the content seen on the display device at the corresponding positions of the touch screen, which is more intuitive; moreover, the cursor is controlled to change dynamically with the user's touch distance, guiding the user to touch within a suitable distance, which is more intuitive and convenient.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of the present invention more clearly, the drawings needed for the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a structural block diagram of a head-mounted display device according to an embodiment of the present invention.
FIG. 2 is a schematic diagram of a head-mounted display device according to an embodiment of the present invention.
FIG. 3 is a schematic diagram of a head-mounted display device displaying a cursor according to an embodiment of the present invention.
FIG. 4 is a schematic diagram of the arrangement of the touch screen and the display device of a head-mounted display device according to an embodiment of the present invention.
FIG. 5 is an exploded schematic view of a head-mounted display device according to an embodiment of the present invention.
FIG. 6 is a flowchart of an interactive input method according to an embodiment of the present invention.
FIG. 7 is a sub-flowchart of step S603 in FIG. 6.
FIG. 8 is a flowchart of an interactive input method according to another embodiment of the present invention.
Detailed Description of the Embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Referring to FIG. 1, a structural block diagram of a head-mounted display device 100 according to an embodiment of the present invention is shown. As shown in FIG. 1, the head-mounted display device 100 includes a processor 10, a display device 20, and a touch screen 30.
Referring also to FIG. 2, the touch screen 30 is disposed externally on the back of the display device 20. That is, when the user wears the head-mounted display device 100, the touch screen 30 is located on the side of the display device 20 facing away from the user's eyes.
The touch screen 30 is used to sense a touch object T1 within a preset distance. In some embodiments, the touch object T1 may be a user's finger or the like, and the preset distance may be, for example, 5 cm. In some embodiments, when the touch screen 30 senses a touch object T1 within the preset distance, it generates a corresponding sensing signal.
Referring also to FIG. 3, a schematic diagram of the display device 20 displaying a cursor is shown. The processor 10 is electrically connected to the display device 20 and the touch screen 30, and is configured to control the display device 20 to display a cursor B1 when the touch screen 30 senses the touch object T1, and to change display attributes of the cursor B1 according to the change in the distance between the touch object T1 and the touch screen 30.
In some embodiments, while the touch object T1 is within the preset distance, the farther the touch object T1 is from the touch screen 30, the larger and more transparent the processor 10 displays the cursor B1; correspondingly, the closer the touch object T1 is to the touch screen 30, the smaller and clearer the cursor B1 is displayed. Obviously, the more transparent the cursor B1, the harder it is to make out; the more opaque, the clearer.
Thus, in the present invention the touch screen 30 is disposed on the back of the display device 20; the user can perform touch operations through the touch screen 30 on the content seen on the display device 20, which is more intuitive, and the cursor B1 can dynamically indicate the user's touch distance, guiding the user to move into a suitable touch-distance range and making input more intuitive and convenient.
In some embodiments, when the touch screen 30 senses the touch object T1, the processor 10 controlling the display device 20 to display the cursor B1 specifically includes: when the touch screen 30 senses the touch object T1, the processor 10 controls, according to the touch position of the touch object T1, the cursor B1 to be displayed at the corresponding display position of the display device 20.
In some embodiments, the sensing signal generated when the touch screen 30 senses a touch object T1 within the preset distance includes information on the distance between the touch object T1 and the touch screen 30. The processor 10 acquires the sensing signal, determines the distance of the touch object T1 from the touch screen 30 according to the distance information in the signal, and determines the change in the distance between the touch object T1 and the touch screen 30 according to continuously received sensing signals. As described above, the farther the touch object T1 is from the touch screen 30, the larger and more transparent the processor 10 displays the cursor B1; correspondingly, the closer it is, the smaller and clearer the cursor B1 is displayed.
The sensing signal generated when the touch screen 30 senses a touch object T1 within the preset distance further includes touch position information. The processor 10 acquires the sensing signal, determines the touch position of the touch object T1 on the touch screen 30 according to the touch position information in the acquired signal, and controls the cursor B1 to be displayed at the corresponding display position of the display device 20.
In some embodiments, the display position on the display device 20 corresponding to a touch position on the touch screen 30 is the position that coincides with the touch position along the user's line of sight once the user puts on the head-mounted display device 100. That is, after the user puts on the head-mounted display device 100, the area of the display device 20 seen by the user and the corresponding touch position on the touch screen 30 coincide; the position where the user performs touch input on the touch screen 30 therefore coincides with the display position seen, and the user can accurately touch the icons and other elements displayed on the display device 20.
Referring also to FIG. 4, in some embodiments the display device 20 includes a display area 21 for display, and both the entire display area 21 of the display device 20 and the touch screen 30 are rectangular. The touch screen 30 is positioned on the back of the display device 20 such that the extensions of the lines connecting the four vertices of the entire display area 21 of the display device 20 with the corresponding four vertices of the touch screen 30 converge exactly at the observation point G1 of the user's eyes.
When the display area 21 and the touch screen 30 extend to cover the full-screen area of both eyes, that is, when the user wears the head-mounted display device 100 and the display area 21 and the touch screen 30 completely cover the positions of the user's left and right eyes, the observation point of the user's eyes refers to the midpoint of the line connecting the left and right eyes of the user wearing the head-mounted display device 100.
In other embodiments, when the display area 21 and the touch screen 30 are areas corresponding to the left eye or the right eye, that is, when the touch screen 30 includes two touch screens, one corresponding to the left-eye area and one corresponding to the right-eye area, and the display area 21 of the display device 20 includes two display areas, a left-eye display area and a right-eye display area, the observation point of the user's eyes refers to the position of the left eye or right eye of the user wearing the head-mounted display device 100.
With the above arrangement of the touch screen 30 and the display area 21, each corresponding pair of touch position on the touch screen 30 and display position on the display device 20 lies in two regions that coincide along the user's line of sight after the user puts on the head-mounted display device 100.
Thus, the user sees the display interface through the display device 20 and, according to the position to be touched on the display interface, can accurately perform contact or contactless touch on that position with a finger. With this manner of touching, interactive input is also more intuitive.
In some embodiments, the sensing signal generated when the touch screen 30 senses a touch object T1 within the preset distance further includes touch action information. The processor 10 is further configured to determine the user's touch action, for example a double-click, a click, or a slide, according to the touch action information in the sensing signal, determine the current touch position according to the touch position information in the sensing signal, then determine the display element displayed at the corresponding display position on the display device 20, and perform the function associated with the display element according to the touch action.
For example, as shown in FIG. 3, several function icons P1 are displayed on the display device 20, including a browser icon, an e-book icon, a map icon, a camera icon, and so on. When the processor 10 determines, according to the touch action information in the sensing signal, that the user's touch action is a double-click, and the display element displayed at the display position corresponding to the current touch position is the browser icon, the processor 10 opens the browser. As another example, if the processor 10 determines, according to the touch action information in the sensing signal, that the user's touch action is a two-finger pinch slide, and the display element displayed at the display position corresponding to the current touch position is a picture, the processor 10 shrinks the picture.
The correspondence between display positions of the display device 20 and touch positions on the touch screen 30 may be obtained by dividing the display device 20 and the touch screen 30 into a number of position regions at a preset resolution in advance, and establishing a one-to-one correspondence between each display position of the display device 20 and the touch position on the touch screen 30 that coincides with it along the user's line of sight. The correspondence may be established before the head-mounted display device 100 leaves the factory.
As shown in FIG. 1, the head-mounted display device 100 further includes a memory 40, which stores the correspondence between display positions of the display device 20 and touch positions on the touch screen 30. After determining the current touch position, the processor 10 determines the corresponding display position on the display device 20 according to the correspondence, and further determines the display element displayed at that display position. Obviously, when controlling the display of the cursor B1, the processor 10 likewise first determines the current touch position, then determines the corresponding display position on the display device 20 according to the correspondence, and controls the cursor B1 to be displayed at that corresponding display position.
Referring also to FIG. 5, an exploded structural schematic view of the head-mounted display device 100 is shown. In some embodiments, the touch screen 30 is a hover (floating) touch screen in which proximity touch sensors are distributed, so the touch screen 30 can sense a touch object T1 at close range. The head-mounted display device 100 further includes a housing 50 for covering the back of the display device 20, and the touch screen 30 is disposed on the inner wall of the housing 50. That is, as shown in FIG. 5, when the touch screen 30 is a hover touch screen, it is disposed on the inner wall of the housing 50 and is not directly exposed.
In other embodiments, the touch screen 30 may be an ordinary touch screen that detects the user's direct touch, used to sense a touch object T1 in direct contact with the touch screen 30. Obviously, when the touch screen 30 senses touch objects T1 in direct contact with it, the touch screen 30 is located on the outermost surface of the display device 20.
The processor 10 may be a microcontroller, a microprocessor, a single-chip microcomputer, a digital signal processor, or the like.
The memory 40 may be a computer-readable storage medium such as a memory card, solid-state memory, a micro hard disk, or an optical disc. In some embodiments, the memory 40 stores a number of program instructions that can be invoked by the processor 10 to perform the functions described above.
The head-mounted display device 100 may be a head-mounted device such as a smart helmet or smart glasses.
Referring to FIG. 6, a flowchart of an interactive input method according to an embodiment of the present invention is shown. The interactive input method is applied to the aforementioned head-mounted display device 100, and the order of execution is not limited to the order shown in FIG. 6. The method includes the following steps:
When the touch screen 30 senses a touch object T1, control the display device 20 to display the cursor B1 (S601). In some embodiments, when the touch screen 30 senses the touch object T1, the processor 10 controls, according to the touch position of the touch object T1, the cursor B1 to be displayed at the corresponding display position of the display device 20. In some embodiments, the display position on the display device 20 corresponding to the touch position on the touch screen 30 is the position that coincides with the touch position along the user's line of sight after the user puts on the head-mounted display device 100. In some embodiments, the correspondence between display positions of the display device 20 and touch positions on the touch screen 30 is pre-stored in the head-mounted display device 100, and the display position corresponding to the touch position is determined according to the correspondence.
Change the display attributes of the cursor B1 according to the change in the distance between the touch object T1 and the touch screen 30 (S603). In some embodiments, the farther the touch object T1 is from the touch screen 30, the larger and more transparent the processor 10 displays the cursor B1.
Referring to FIG. 7, a sub-flowchart of step S603 in some embodiments is shown. As shown in FIG. 7, step S603 includes:
Acquire the sensing signal generated when the touch screen 30 senses the touch object T1 (S6031).
Determine the distance between the touch object T1 and the touch screen 30 according to the distance information in the sensing signal (S6032).
Determine the change in the distance between the touch object T1 and the touch screen 30 according to the distance information in continuously received sensing signals (S6033).
Change the display attributes of the cursor B1 according to the change in the distance between the touch object T1 and the touch screen 30 (S6034).
Referring to FIG. 8, a flowchart of an interactive input method according to an embodiment of the present invention is shown. The interactive input method is applied to the aforementioned head-mounted display device 100, and the order of execution is not limited to the order shown in FIG. 8. The method includes the following steps:
When the touch screen 30 senses a touch object T1, control the display device 20 to display the cursor B1 (S801). In some embodiments, when the touch screen 30 senses the touch object T1, the processor 10 controls, according to the touch position of the touch object T1, the cursor B1 to be displayed at the corresponding display position of the display device 20.
Change the display attributes of the cursor B1 according to the change in the distance between the touch object T1 and the touch screen 30 (S803).
Determine the user's touch action according to the touch action information in the sensing signal generated when the touch screen 30 senses the touch object T1 (S805).
Determine the current touch position according to the touch position information in the sensing signal (S806).
Determine the display element displayed at the display position on the display device 20 corresponding to the touch position (S807). In some embodiments, the head-mounted display device 100 stores the correspondence between display positions of the display device 20 and touch positions on the touch screen 30. Step S807 includes: after the current touch position is determined, determining the corresponding display position on the display device 20 according to the correspondence, and further determining the display element displayed at that display position.
Perform the function associated with the display element according to the touch action (S808).
Steps S801 and S803 are the same as steps S601 and S603 in FIG. 6, respectively, and the further features they include are also the same; for details, refer to steps S601 and S603 in FIG. 6.
When a number of program instructions are stored in the memory 40, the program instructions are invoked and executed by the processor 10 to perform the steps of any of the methods in FIGS. 6-7.
Thus, in the head-mounted display device 100 and interactive input method of the present invention, the touch screen 30 is disposed on the back of the display device 20, and the user can perform touch operations on the content seen on the display device 20 at the corresponding positions on the touch screen 30, which is more intuitive. In addition, the cursor B1 can be displayed on the display device 20 in response to the user's close-range touch, and the display attributes of the cursor B1 can be changed according to the touch distance to indicate to the user how near or far the touch is. Furthermore, by making the touch position and the corresponding display position coincide along the user's line of sight, the present invention lets the user grasp the display content and the touch position more accurately, facilitating quick input.
The above are preferred embodiments of the present invention. It should be noted that those of ordinary skill in the art may make several improvements and refinements without departing from the principles of the present invention, and such improvements and refinements are also regarded as falling within the protection scope of the present invention.

Claims (20)

  1. A head-mounted display device, comprising a processor and a display device, characterized in that the head-mounted display device further comprises a touch screen disposed externally on the back of the display device for sensing a touch object within a preset distance; the processor is electrically connected to the display device and the touch screen, and is configured to control the display device to display a cursor when the touch screen senses a touch object, and to change display attributes of the cursor according to a change in the distance between the touch object and the touch screen.
  2. The head-mounted display device according to claim 1, characterized in that the display attributes include display size and transparency, and the farther the touch object is from the touch screen, the larger and more transparent the processor displays the cursor.
  3. The head-mounted display device according to claim 1, characterized in that the touch screen generates a sensing signal upon sensing a touch object within the preset distance, the sensing signal including distance information between the touch object and the touch screen; the processor acquires the sensing signal, determines the distance of the touch object from the touch screen according to the distance information in the sensing signal, and determines the change in the distance between the touch object and the touch screen according to the distance information in continuously received sensing signals.
  4. The head-mounted display device according to claim 3, characterized in that the sensing signal further includes touch position information; the processor acquires the sensing signal, determines the touch position of the touch object on the touch screen according to the touch position information in the acquired sensing signal, and controls the cursor to be displayed at the corresponding display position of the display device.
  5. The head-mounted display device according to claim 4, characterized in that the display position on the display device corresponding to the touch position on the touch screen is the position that coincides with the touch position along the user's line of sight after the user puts on the head-mounted display device.
  6. The head-mounted display device according to claim 5, characterized in that the extensions of the lines connecting the four vertices of the display area of the display device with the corresponding four vertices of the touch screen converge at the observation point of the user's eyes.
  7. The head-mounted display device according to claim 6, characterized in that the display area and the touch screen are full-screen areas extending to cover both the left and right eyes, and the observation point of the user's eyes is the midpoint of the line connecting the user's left and right eyes.
  8. The head-mounted display device according to claim 6, characterized in that the touch screen comprises a touch screen corresponding to the left-eye area and a touch screen corresponding to the right-eye area, and the display area comprises a left-eye display area and a right-eye display area; the observation point of the user's eyes is the position of the left eye or the right eye of the user wearing the head-mounted display device; the extensions of the lines connecting the four vertices of the left-eye display area with the corresponding four vertices of the touch screen corresponding to the left-eye area converge at the position of the user's left eye, and the extensions of the lines connecting the four vertices of the right-eye display area with the corresponding four vertices of the touch screen corresponding to the right-eye area converge at the position of the user's right eye.
  9. The head-mounted display device according to claim 3, characterized in that the sensing signal generated when the touch screen senses a touch object within the preset distance further includes touch action information; the processor is further configured to determine the user's touch action according to the touch action information in the sensing signal, then determine the display element displayed at the display position on the display device corresponding to the touch position, and perform the function associated with the display element according to the touch action.
  10. The head-mounted display device according to claim 4 or 9, characterized in that after determining the current touch position, the processor determines the corresponding display position on the display device according to the correspondence between display positions of the display device and touch positions on the touch screen.
  11. The head-mounted display device according to claim 4 or 9, characterized in that the correspondence between display positions of the display device and touch positions on the touch screen is obtained by dividing the display device and the touch screen into a number of position regions at a preset resolution in advance, and establishing a one-to-one correspondence between the display positions of the display device and the touch positions on the touch screen that coincide along the user's line of sight.
  12. The head-mounted display device according to claim 1, characterized in that the head-mounted display device further comprises a housing for covering the back of the display device, and the touch screen is disposed on the inner wall of the housing.
  13. An interactive input method, applied to a head-mounted display device, the head-mounted display device comprising a display device and a touch screen disposed externally on the back of the display device, characterized in that the interactive input method comprises the steps of:
    sensing a touch object within a preset distance through the touch screen;
    controlling the display device to display a cursor when the touch screen senses the touch object; and
    changing display attributes of the cursor according to a change in the distance between the touch object and the touch screen.
  14. The interactive input method according to claim 13, characterized in that the display attributes include display size and transparency, and the step of changing the display attributes of the cursor according to a change in the distance between the touch object and the touch screen comprises:
    displaying the cursor larger and more transparent the farther the touch object is from the touch screen.
  15. The interactive input method according to claim 13, characterized in that the touch screen generates a sensing signal upon sensing a touch object within the preset distance, the sensing signal including distance information between the touch object and the touch screen, and the step of changing the display attributes of the cursor according to a change in the distance between the touch object and the touch screen comprises:
    acquiring the sensing signal generated when the touch screen senses a touch object within the preset distance;
    determining the distance of the touch object from the touch screen according to the distance information in the sensing signal;
    determining the change in the distance between the touch object and the touch screen according to the distance information in continuously received sensing signals; and
    changing the display attributes of the cursor according to the change in the distance between the touch object and the touch screen.
  16. The interactive input method according to claim 13, characterized in that the touch screen generates a sensing signal upon sensing a touch object within the preset distance, the sensing signal including touch position information, and the step of controlling the display device to display a cursor when the touch screen senses the touch object comprises:
    acquiring the sensing signal generated when the touch screen senses a touch object within the preset distance; and
    determining the touch position of the touch object on the touch screen according to the touch position information in the acquired sensing signal, and controlling the cursor to be displayed at the corresponding display position of the display device.
  17. The interactive input method according to claim 13, characterized in that the touch screen generates a sensing signal upon sensing a touch object within the preset distance, the sensing signal including touch action information, and the method further comprises the steps of:
    determining the user's touch action according to the touch action information in the sensing signal; and
    determining the display element displayed at the display position on the display device corresponding to the touch position, and performing the function associated with the display element according to the touch action.
  18. The interactive input method according to claim 16 or 17, characterized in that the display position on the display device corresponding to the touch position on the touch screen is the position that coincides with the touch position along the user's line of sight after the user puts on the head-mounted display device.
  19. The interactive input method according to claim 16 or 17, characterized in that the method further comprises the step of:
    determining the display position on the display device corresponding to the touch position according to the correspondence between display positions of the display device and touch positions on the touch screen.
  20. The interactive input method according to claim 19, characterized in that the method further comprises the steps of:
    dividing the display device and the touch screen into a number of position regions at a preset resolution in advance; and
    establishing a one-to-one correspondence between the display positions of the display device and the touch positions on the touch screen that coincide along the user's line of sight, to obtain the correspondence between display positions of the display device and touch positions on the touch screen.
PCT/CN2017/084567 2017-05-16 2017-05-16 Head-mounted display device and interactive input method thereof WO2018209572A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2017/084567 WO2018209572A1 (zh) 2017-05-16 2017-05-16 Head-mounted display device and interactive input method thereof
CN201780004641.6A CN108475085A (zh) 2017-05-16 2017-05-16 Head-mounted display device and interactive input method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/084567 WO2018209572A1 (zh) 2017-05-16 2017-05-16 Head-mounted display device and interactive input method thereof

Publications (1)

Publication Number Publication Date
WO2018209572A1 true WO2018209572A1 (zh) 2018-11-22

Family

ID=63266550

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/084567 WO2018209572A1 (zh) 2017-05-16 2017-05-16 Head-mounted display device and interactive input method thereof

Country Status (2)

Country Link
CN (1) CN108475085A (zh)
WO (1) WO2018209572A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110113529B (zh) * 2019-04-29 2022-03-18 努比亚技术有限公司 Photographing parameter adjustment method and device, and computer-readable storage medium
CN110362231B (zh) * 2019-07-12 2022-05-20 腾讯科技(深圳)有限公司 Head-up touch device, and image display method and apparatus

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103823550A (zh) * 2012-11-16 2014-05-28 广达电脑股份有限公司 Virtual touch method
CN104076930A (zh) * 2014-07-22 2014-10-01 北京智谷睿拓技术服务有限公司 Blind-operation control method, apparatus and system
CN106155383A (zh) * 2015-04-03 2016-11-23 上海乐相科技有限公司 Screen control method and apparatus for head-mounted smart glasses

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007310599A (ja) * 2006-05-17 2007-11-29 Nikon Corp Video display device
CN102609120A (zh) * 2007-11-30 2012-07-25 原相科技股份有限公司 Cursor control device and method on an image display device, and image system
JP5569271B2 (ja) * 2010-09-07 2014-08-13 ソニー株式会社 Information processing apparatus, information processing method, and program
JP5953963B2 (ja) * 2012-06-13 2016-07-20 ソニー株式会社 Head-mounted video display device
DE102015012720A1 (de) * 2015-10-01 2017-04-06 Audi Ag Interactive operating system and method for performing an operating action in an interactive operating system


Also Published As

Publication number Publication date
CN108475085A (zh) 2018-08-31

Similar Documents

Publication Publication Date Title
  • KR102487389B1 (ko) Foldable electronic device and method for controlling a screen using gestures
EP2972727B1 (en) Non-occluded display for hover interactions
US10133407B2 (en) Display apparatus, display system, method for controlling display apparatus, and program
  • KR102638956B1 (ko) Electronic device, augmented reality device providing an augmented reality service, and operating method thereof
US9965039B2 (en) Device and method for displaying user interface of virtual input device based on motion recognition
  • KR102243652B1 (ko) Display device and control method thereof
  • KR101608423B1 (ko) Full 3D interaction on mobile devices
  • CN108073432B (zh) User interface display method for a head-mounted display device
  • KR20150091322A (ko) Multi-touch interaction technique on eyewear
  • JP7005161B2 (ja) Electronic apparatus and control method thereof
  • KR20100027976A (ko) Gesture- and motion-based navigation and interaction with three-dimensional virtual content on a mobile device
US20210004133A1 (en) Remote touch detection enabled by peripheral device
US20140267049A1 (en) Layered and split keyboard for full 3d interaction on mobile devices
WO2019241040A1 (en) Positioning a virtual reality passthrough region at a known distance
  • KR102297473B1 (ko) Apparatus and method for providing touch input using the body
US9927914B2 (en) Digital device and control method thereof
  • WO2018209572A1 (zh) Head-mounted display device and interactive input method thereof
  • KR20140094958A (ko) Method for executing an operation of a flexible display apparatus, and the apparatus therefor
  • CN110968248A (zh) Generating a 3D model of a fingertip for visual touch detection
  • JP7005160B2 (ja) Electronic apparatus and control method thereof
US20210216146A1 (en) Positioning a user-controlled spatial selector based on extremity tracking information and eye tracking information
EP3128397B1 (en) Electronic apparatus and text input method for the same
US12008216B1 (en) Displaying a volumetric representation within a tab
US11641460B1 (en) Generating a volumetric representation of a capture region
US12008160B2 (en) Eye tracking based selection of a user interface (UI) element based on targeting criteria

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17910013

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17910013

Country of ref document: EP

Kind code of ref document: A1