WO2017161497A1 - Head-Mounted Display Device and Control Method Therefor (头戴式显示设备及其操控方法) - Google Patents

Head-Mounted Display Device and Control Method Therefor

Info

Publication number
WO2017161497A1
WO2017161497A1 (Application PCT/CN2016/076981)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
touch area
touch signal
area
display device
Prior art date
Application number
PCT/CN2016/076981
Other languages
English (en)
French (fr)
Inventor
林麒
Original Assignee
深圳市柔宇科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市柔宇科技有限公司
Priority to PCT/CN2016/076981 (WO2017161497A1)
Priority to CN201680011871.0A (CN107466396A)
Publication of WO2017161497A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 Head-up displays
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 Head-up displays
    • G02B 27/017 Head mounted

Definitions

  • the present invention relates to the field of head-mounted display devices, and in particular, to a head-mounted display device and a method for controlling the same.
  • Head Mounted Display refers to a display device that can be worn on the head.
  • the HMD uses a "near-eye optical system" to display multimedia information such as graphics and pictures on a display screen a few centimeters from the eyeball.
  • HMD types currently include transmissive HMDs, through which both the external environment and the HMD display content can be seen, and immersive HMDs, through which only the HMD display content can be seen.
  • the immersive HMD has many entertainment applications. After the device is worn, external visual reference is completely lost; when an input operation is to be performed, it is difficult to locate the finger position and to achieve precise finger touch, which makes it inconvenient for the user to operate the device.
  • the embodiments of the invention disclose a head-mounted display device and a control method thereof, which enable precise finger touch and thereby improve the controllability and operating convenience of the head-mounted display device.
  • an embodiment of the invention discloses a head-mounted display device, comprising a body, a processor, and a display module disposed in the body, the body comprising an inner side and an outer side corresponding to the inner side, the inner side comprising a viewing window through which light from the display module exits.
  • the head mounted display device further includes a first touch area and a second touch area. The first touch area is disposed on the outer side and corresponds to the viewing window.
  • the processor is configured to execute a corresponding interface control instruction according to the touch signal generated by the first touch area, and is further configured to execute, according to the touch action corresponding to the touch signal generated by the second touch area and a predefined correspondence between touch actions on the second touch area and control commands, the control command corresponding to the current touch action on the second touch area.
  • an embodiment of the invention further discloses a control method for a head-mounted display device, the head-mounted display device comprising a body, a processor, and a display module disposed in the body, the body comprising an inner side and an outer side corresponding to the inner side, the inner side comprising a viewing window through which light from the display module exits; the head-mounted display device further comprises a first touch area and a second touch area, the first touch area being disposed on the outer side and corresponding to the viewing window.
  • the control method includes:
  • executing, when the touch signal is from the first touch area, a corresponding interface control instruction according to the touch signal; and executing, when the touch signal is from the second touch area, according to the touch action corresponding to the touch signal generated by the second touch area and the predefined correspondence between touch actions on the second touch area and control commands, the control command corresponding to the current touch action on the second touch area.
  • the first touch area is disposed on the front surface of the head-mounted display device, so that the first touch area corresponds to the display interface that the user sees through the display module.
  • the head-mounted display device further includes a second touch area, and the control instruction corresponding to the current touch action on the second touch area is executed according to the correspondence between the predefined touch action and the control command.
  • the second touch area is set as a touch area for performing shortcut operations, so that controls better matching the product characteristics of the head-mounted display device and the user's operating habits can be provided, improving control efficiency. The controllability and operating convenience of the head-mounted display device are thus further improved, enhancing the user experience.
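  • As a rough illustration of the processor's two roles described above (interface control from the first touch area, predefined shortcut commands from the second), the dispatch might be sketched as below. This is a hedged reconstruction, not code from the patent; the signal representation and all names are hypothetical.

```python
# Hypothetical sketch: signals from the first touch area drive interface
# control, while signals from the second touch area are matched against a
# predefined action-to-command table for shortcut operations.

def dispatch(touch_signal, shortcut_map):
    """Route a touch signal to an interface command or a shortcut command."""
    if touch_signal["area"] == "first":
        # Interface control: e.g. move a cursor, select an item.
        return ("interface", touch_signal["action"])
    # Second touch area: look up the predefined action-to-command mapping.
    return ("shortcut", touch_signal["action"] and
            shortcut_map.get(touch_signal["action"]))

shortcuts = {"slide_up": "volume_up", "slide_down": "volume_down"}
print(dispatch({"area": "second", "action": "slide_up"}, shortcuts))
# → ('shortcut', 'volume_up')
```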
  • FIG. 1 is a perspective view of a head mounted display device according to an embodiment of the present invention.
  • FIG. 2 is a partial structural schematic view of the head mounted display device of FIG. 1;
  • FIG. 3 is a functional block diagram of the head mounted display device of FIG. 1;
  • FIG. 4 is a flowchart of a method for controlling a head mounted display device according to an embodiment of the present invention.
  • the head mounted display device in the first embodiment of the present invention may include a body 10, a display module 20 disposed in the body 10, a processor 30, a first touch area 40, and a second touch area 50.
  • the body 10 includes an inner side 11 and an outer side 12 corresponding to the inner side 11.
  • the inner side 11 is provided with a viewing window 111 through which light from the display module 20 exits.
  • the display module 20 may be an integrated structure including a display and an optical module (not shown), or may be a single display structure independent of the optical module, which is not limited in this embodiment.
  • the wearer can view the magnified virtual image formed by the light source of the display through the optical module, that is, the wearer can indirectly view the display interface of the display.
  • the display interface includes content displayed by the display, including but not limited to a main interface, a menu interface, a video playing interface, an audio playing interface, an application interface, and the like.
  • the first touch area 40 is disposed on the outer side 12 and corresponds to the viewing window 111.
  • the processor 30 executes a corresponding interface control instruction according to the touch signal generated by the first touch area 40.
  • the first touch area 40 can be a hovering (floating) touch area, or a common capacitive or resistive touch area. By the nature of hovering touch, when the touch object falls within the hovering sensing range, the hovering touch area is triggered to generate a corresponding hovering touch signal; that is, the hovering touch area can generate a corresponding touch signal without the touch object actually touching it.
  • the first touch area 40 is a flexible touch area; for example, a flexible touch panel can be used.
  • the first touch area 40 can cover the outer side 12.
  • the first touch area 40 is disposed on the front surface of the head mounted display device, so that the first touch area 40 corresponds to the display interface that the user sees through the display module 20. This provides a sensorially consistent correspondence that makes it easy for the user to locate the first touch area 40 while wearing the head mounted display device, so that the wearer can conveniently touch the first touch area 40 to control the display interface, improving the controllability of the head mounted display device.
  • the first touch area 40 is a common contact-type touch panel, including but not limited to capacitive and resistive types.
  • the processor 30 displays a graphical indicator at the position of the display interface corresponding to the position information of the touch signal generated by the first touch area 40.
  • the graphical indicator can be a graphical cursor, a mouse arrow, a focus display on an action item, and the like.
  • the touch object is usually a finger, though it may of course be any other object that can be sensed. That is, in this embodiment, when the touch object touches a position of the first touch area 40, the processor 30 displays the graphical indicator at the corresponding position of the display interface according to the position information of that position, as input guidance for the user.
  • when the processor 30 learns from the position information of the touch signal that a displacement has occurred, it moves the graphical indicator correspondingly according to the displacement; and when the touch signal generated by the first touch area 40 is a contact touch signal, it performs the corresponding operation according to the content corresponding to the graphical indicator, for example when the graphical indicator falls within the location area of an operation item. That is, after the graphical indicator is displayed, if the user moves the touch object in any direction on the first touch area 40, such as up, down, left, or right, a displacement is generated, and the processor 30 moves the display position of the graphical indicator accordingly, so that the user can judge from the movement of the graphical indicator whether the moving direction of the touch object is correct.
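  • The indicator-follows-displacement behavior just described can be sketched as below. This is an illustrative reconstruction, not the patent's implementation; the 2-D coordinate representation is an assumption.

```python
# Illustrative sketch: the on-screen graphical indicator is shifted by the
# same displacement as the touch object, keeping the two in sync.

def move_indicator(indicator_pos, prev_touch, new_touch):
    """Shift the indicator by the touch object's displacement (dx, dy)."""
    dx = new_touch[0] - prev_touch[0]
    dy = new_touch[1] - prev_touch[1]
    return (indicator_pos[0] + dx, indicator_pos[1] + dy)

# Touch object moved right 4 and up 3 units; the indicator follows.
print(move_indicator((100, 50), (10, 10), (14, 7)))  # → (104, 47)
```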
  • when the user performs a touch action such as a click or a double click, the processor 30 acquires the contact touch signal generated by the touch action and performs the corresponding operation. For example, when the graphical indicator has been moved onto an operation item such as an application, the selected application can be opened by a click operation.
  • the current position of the touch object is indicated by a graphic cursor as an input guide, which facilitates the user's manipulation of the device after wearing the device.
  • when the processor 30 determines, according to the touch signal generated by the first touch area 40, that the touch object has performed a preset operation on the first touch area 40, it executes the corresponding interface control instruction according to the preset correspondence between preset operations and interface control instructions. For example, when a two-finger spread or pinch operation is performed on the first touch area 40, a control instruction to enlarge or reduce the display interface is executed. For another example, when two fingers slide to the left on the first touch area 40, the previous interface is returned to. For another example, when three fingers simultaneously tap the first touch area 40, the home page is returned to.
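  • The multi-finger preset operations exemplified above amount to a lookup from gesture to interface control instruction, which might be tabulated as below. The gesture and command names are assumptions for illustration, not identifiers from the patent.

```python
# Hypothetical preset-operation table matching the examples in the text:
# spread/pinch -> zoom, two-finger left swipe -> back, three-finger tap -> home.

PRESET_OPERATIONS = {
    "two_finger_spread": "zoom_in",      # enlarge the display interface
    "two_finger_pinch": "zoom_out",      # reduce the display interface
    "two_finger_slide_left": "go_back",  # return to the previous interface
    "three_finger_tap": "go_home",       # return to the home page
}

def interface_command(gesture):
    """Return the interface control instruction for a preset operation."""
    return PRESET_OPERATIONS.get(gesture, "ignore")

print(interface_command("three_finger_tap"))  # → go_home
```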
  • when the first touch area 40 is a hovering touch area, the processor 30 displays the graphical indicator at the position corresponding to the position information of the hovering touch signal of the first touch area 40. That is, in this embodiment, when the touch object falls within the sensing range of the first touch area 40, the graphical indicator is displayed at the corresponding position of the display interface as input guidance for the user. Further, when the processor 30 determines from the position information of the hovering touch signal generated by the first touch area 40 that the position information has changed, that is, that a displacement has occurred, for example the touch object moving up, down, left, or right in front of the first touch area 40, it moves the graphical indicator correspondingly according to the displacement; and when a contact touch signal is generated in the first touch area 40, it performs the corresponding operation according to the content corresponding to the graphical indicator.
  • in other words, the processor 30 controls, in real time, the display position of the graphical indicator on the display interface according to the position information that changes dynamically during the displacement, realizing synchronous displacement of the graphical indicator and the touch object and making it easy for the user to judge from the movement of the graphical indicator whether the moving direction of the touch object is correct. Further, when the graphical indicator has been moved to its destination, for example onto an operation item of the current display interface such as a movie poster, an audio file, a setting option, or a menu item of a navigation interface, the user no longer moves in the direction parallel to the display interface but instead moves the touch object toward the first touch area 40, that is, in the direction perpendicular to the display screen, and touches the first touch area 40, generating a contact touch signal.
  • the processor 30 performs a corresponding operation according to the operation item corresponding to the graphic indicator. For example, if the graphic indicator falls into an application, the operation of opening the selected application is performed.
  • the current position of the touch object is indicated by a graphic indicator as an input guide, which facilitates the user's manipulation of the device after wearing the device. Further, by using the floating touch area, the touch operation can be realized without the user actually touching the touch area, thereby further improving the convenience of manipulation.
  • the operation items further include operation items that are called up by a predefined preset operation, for example hidden menus. The preset operation includes a gesture operation.
  • when the first touch area 40 is a hovering touch area and the processor 30 determines, according to the hovering touch signal generated by the first touch area 40, that the hovering touch signal corresponds to a preset operation, it executes the interface control instruction corresponding to the preset operation according to the preset correspondence between preset operations and interface control instructions. For example, when two fingers spread apart or pinch together in front of the first touch area 40, the display interface is enlarged or reduced. For another example, when two fingers move to the left in front of the first touch area 40, the previous interface is returned to.
  • the first touch area 40 when the first touch area 40 is a floating touch area, the first touch area 40 can be directly touched to implement corresponding interface control. For example, when the two fingers touch the first touch area 40 and slide to the left at the same time, the previous interface is returned.
  • corresponding control can also be performed according to a preset operation that combines a touch on the first touch area 40 with a hovering operation, with the corresponding interface control instruction executed according to the preset correspondence between preset operations and interface control instructions. For example, one finger touching the first touch area 40 while another finger performs a directional movement, or a spread/pinch, in a hovering operation may be a preset operation. The interface control instruction corresponding to the finger touch is to select an operation item, such as a picture, an audio file icon, a video file icon, or an application. The operation item may then be dragged in the corresponding direction on the display interface according to the directional movement of the hovering operation, or correspondingly enlarged/reduced, for example a picture, according to the spread/pinch of the hovering operation. In this way, more operational possibilities are provided within a limited touch area.
  • the second touch area 50 is disposed on the outer side 12 and adjacent to the first touch area 40.
  • the second touch area 50 and the first touch area 40 can be different touch areas of the same touch panel, or can be two separate touch panels.
  • the second touch area 50 may also be located at other positions of the head mounted display device, such as at the earphones 70 connected to the two ends of the body 10.
  • the head mounted display device further includes a headband 60 connected between the two earphones 70, and the second touch area 50 may also be disposed in any area on the outside of the headband 60.
  • a plurality of second touch areas 50 may be included; for example, two second touch areas 50 may be respectively disposed on the two sides of the first touch area 40, or respectively disposed on the two earphones 70.
  • the number and positions of the second touch areas 50 can be flexibly set as needed.
  • the second touch area 50 can be a floating touch area or a common capacitive or resistive touch area.
  • the second touch area 50 is a flexible touch area.
  • the head-mounted display device includes a predefined correspondence between touch actions on the second touch area 50 and control commands, and executes, according to this correspondence, the control command corresponding to the current touch action on the second touch area 50.
  • the touch action on the second touch area 50 may be a touch action performed within the hovering sensing range of the second touch area 50; it may actually touch the second touch area 50 or may not actually touch it.
  • the correspondence between the touch action on the second touch area 50 and the control command may be a factory setting or a user-defined setting.
  • the correspondence may be stored in a storage module of the head mounted display device, such as a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Random Access Memory (RAM), a disk, etc.
  • the processor 30 can read the data of the storage module.
  • the touch action can be a click, a double click, a slide operation, etc.
  • the control commands include volume adjustment, fast forward/rewind adjustment, audio/video mode switching, mute control, and/or pause/play control instructions. That is, in this embodiment, the second touch area 50 is set as a touch area for performing shortcut operations, which is further described below by way of example.
  • for example, the control command corresponding to an upward sliding action is a volume-increase command, and the control command corresponding to a downward sliding action is a volume-decrease command.
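  • The volume shortcut just described can be sketched as a small handler. The step size and the 0-100 clamping range are assumptions for illustration, not values from the patent.

```python
# Illustrative sketch: sliding up on the second touch area raises the volume,
# sliding down lowers it; the result is clamped to an assumed 0-100 range.

def adjust_volume(volume, action, step=5):
    """Apply a slide action from the second touch area to the volume."""
    if action == "slide_up":
        volume += step
    elif action == "slide_down":
        volume -= step
    return max(0, min(100, volume))  # clamp to the valid range

print(adjust_volume(50, "slide_up"))  # → 55
```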
  • the second touch area 50 is set as a touch area for performing shortcut operations, so that controls better matching the product characteristics of the head-mounted display device and the user's operating habits can be provided, improving control efficiency and the user's operating experience.
  • the head mounted display device in the present embodiment further includes a headband 60, earphones 70, and a speaker 80; the headband 60 is coupled to the body 10 and is used to wear the head mounted display device on the wearer's head.
  • the speaker 80 can be disposed in the body 10, or can be disposed at any other position of the head mounted display device.
  • the processor 30 can control the audio signal to be played through the earphone 70 or the speaker 80 according to the touch action on the second touch area 50. For example, if a touch signal from the second touch area 50 is received while audio content is being played through the earphone 70, and the touch action is determined from the touch signal to be drawing a circle on the second touch area 50, the processor 30 switches to playing the audio content through the speaker 80.
  • in some embodiments, when the touch action determined from the touch signal of the second touch area 50 is associated with a control command to switch to audio mode, the processor 30 turns off the display output and plays only audio through the earphone 70 or the speaker 80.
  • the processor 30 turning off the display output includes, but is not limited to, causing the display of the display module 20 to enter a standby state, such as a non-powered state, thereby saving power and reducing power consumption.
  • when the touch action determined from the touch signal of the second touch area 50 is associated with a control command to switch to video mode, the processor 30 turns on the display output.
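  • The audio/video mode switch described above can be sketched as a state update: switching to audio mode turns the display output off (standby) to save power, and switching back to video mode turns it on again. The state dictionary and command names are assumptions for illustration.

```python
# Hypothetical sketch of the mode-switch handling: audio mode disables the
# display output; video mode re-enables it.

def apply_mode_switch(device_state, command):
    """Update display/audio state for an audio/video mode-switch command."""
    if command == "switch_to_audio_mode":
        device_state["display_on"] = False   # display enters standby
        device_state["audio_route"] = "earphone_or_speaker"
    elif command == "switch_to_video_mode":
        device_state["display_on"] = True
    return device_state

state = apply_mode_switch({"display_on": True, "audio_route": None},
                          "switch_to_audio_mode")
print(state["display_on"])  # → False
```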
  • the above exemplary solutions combine the setting of the second touch area 50 with the product characteristics to provide better-adapted, convenient operation; of course, they are not intended to limit the specific touch actions and control commands of the second touch area 50.
  • FIG. 4 is a flowchart of a method for controlling a head mounted display device according to an embodiment of the present invention, which may include the following steps:
  • Step 401: Determine whether the touch signal is from the first touch area 40 or the second touch area 50. When the touch signal is from the first touch area 40, step 402 is performed. Otherwise, when the touch signal is from the second touch area 50, step 403 is performed.
  • Step 402: When the touch signal is from the first touch area 40, execute the corresponding interface control instruction according to the touch signal.
  • the graphical indicator is displayed at the position of the display interface corresponding to the position information of the touch signal generated by the first touch area 40.
  • the graphical indicator can be a graphical cursor, a mouse arrow, a focus display on an action item, and the like.
  • the touch object is usually a finger, though it may be any other object that can be sensed. That is, in this embodiment, when the touch object touches a position of the first touch area 40, a graphical indicator is displayed at the corresponding position of the display interface according to the position information of that position, to serve as the user's input guidance.
  • when a displacement occurs, the graphical indicator is moved correspondingly according to the displacement; when the touch signal generated in the first touch area 40 is a contact touch signal, the corresponding operation is performed according to the content corresponding to the graphical indicator, for example when the graphical indicator falls within the location area of a certain operation item. That is, after the graphical indicator is displayed, if the user moves the touch object in any direction on the first touch area 40, a displacement is generated and the display position of the graphical indicator is moved correspondingly according to the displacement, so that the user can judge from the movement of the graphical indicator whether the moving direction of the touch object is correct.
  • when the user performs a touch action such as a click or a double click, the contact touch signal is acquired and the corresponding operation is performed. For example, when the graphical indicator has been moved onto an operation item such as an application, the selected application can be opened by the click operation.
  • the current position of the touch object is indicated by a graphic cursor as an input guide, which facilitates the user's manipulation of the device after wearing the device.
  • when the touch object performs a preset operation on the first touch area 40, the corresponding interface control instruction is executed according to the preset correspondence between preset operations and interface control instructions. For example, when a two-finger spread or pinch operation is performed on the first touch area 40, a control instruction to enlarge or reduce the display interface is executed. For another example, when two fingers slide to the left on the first touch area 40, the previous interface is returned to. For another example, when three fingers simultaneously tap the first touch area 40, the home page is returned to. It is to be understood that the operations herein are merely illustrative and are not intended to limit the scope of the invention.
  • when the first touch area 40 is a hovering touch area, the graphical indicator is displayed at the position corresponding to the position information of the hovering touch signal generated by the first touch area 40. That is, in this embodiment, when the touch object falls within the sensing range of the first touch area 40, the graphical indicator is displayed at the corresponding position of the display interface as input guidance for the user. Further, when it is determined from the position information of the hovering touch signal generated by the first touch area 40 that a displacement has occurred, for example the touch object moving in any direction up, down, left, or right in front of the first touch area 40, the graphical indicator is moved correspondingly according to the displacement; and when the first touch area 40 generates a contact touch signal, the corresponding operation is performed according to the content corresponding to the graphical indicator. In other words, after the graphical indicator is displayed, if the user moves the touch object up, down, left, or right within the sensing space of the hovering touch area, the graphical indicator is moved accordingly: its display position on the display interface is controlled in real time, realizing synchronous displacement of the graphical indicator and the touch object, so that the user can judge from the movement of the graphical indicator whether the moving direction of the touch object is correct.
  • further, when the graphical indicator has been moved to its destination, for example onto an operation item of the current display interface such as a movie poster, an audio file, a setting option, or a menu item of a navigation interface, the user no longer moves in the direction parallel to the display interface but instead moves the touch object toward the first touch area 40, that is, in the direction perpendicular to the display screen, and touches the first touch area 40, generating a contact touch signal.
  • the corresponding operation is performed according to the operation item corresponding to the graphic indicator. For example, if the graphic indicator falls into an application, the operation of opening the selected application is performed.
  • the current position of the touch object is indicated by a graphic indicator as an input guide, which facilitates the user's manipulation of the device after wearing the device. Further, by using the floating touch area, the touch operation can be realized without the user actually touching the touch area, thereby further improving the convenience of manipulation.
  • the operation items further include operation items that are called up by a predefined preset operation, for example hidden menus. The preset operation includes a gesture operation.
  • when it is determined from the hovering touch signal generated by the first touch area 40 that the hovering touch signal corresponds to a preset operation, the interface control instruction corresponding to the preset operation is executed according to the preset correspondence between preset operations and interface control instructions. For example, when two fingers spread apart or pinch together in front of the first touch area 40, the display interface is enlarged or reduced. For another example, when two fingers simultaneously slide to the left in front of the first touch area 40, the previous interface is returned to.
  • the first touch area 40 when the first touch area 40 is a floating touch area, the first touch area 40 can be directly touched to implement corresponding interface control. For example, when the two fingers touch the first touch area 40 and slide to the left at the same time, the previous interface is returned.
  • when the first touch area 40 is a hovering touch area, a preset operation combining a touch on the first touch area 40 with a hovering operation can also be defined and controlled accordingly, with the corresponding interface control instruction executed according to the preset correspondence between preset operations and interface control instructions. For example, one finger touching the first touch area 40 while another finger performs a directional movement, or a spread/pinch, in a hovering operation may be a preset operation. The interface control instruction corresponding to the finger touch is to select an operation item, such as a picture, an audio file icon, a video file icon, or an application. The operation item may then be dragged in the corresponding direction on the display interface according to the directional movement of the hovering operation, or correspondingly enlarged/reduced, for example a picture, according to the spread/pinch of the hovering operation.
  • more operational possibilities are provided under limited touch areas.
  • Step 403: When the touch signal is from the second touch area 50, execute, according to the correspondence between touch actions on the second touch area 50 and control commands, the control command corresponding to the current touch action on the second touch area 50.
  • the second touch area 50 can be a floating touch area or a common touch touch area.
  • the touch action on the second touch area 50 may be a touch action performed within the suspension sensing range of the second touch area 50, and may actually touch The second touch area 50 may not actually touch the second touch area 50.
  • the correspondence between the touch action on the second touch area 50 and the control command may be a factory setting or a user-defined setting.
  • the touch action can be a click, a double click, a slide operation, etc.
  • the control commands include a volume adjustment, a fast forward/reverse adjustment, a switch audio/video mode control, a mute control, and/or a pause/play control, etc. instruction.
  • the second touch area 50 is set as a touch area for performing a shortcut operation, and the product characteristics of the head mounted display device can be further provided to better control the user's operation habits and improve the control. Efficiency, which can enhance the user's operating experience.
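The predefined correspondence between touch actions on the second touch area and control commands amounts to a lookup table. The following sketch is illustrative only; the action and command names are assumptions, not terminology taken from the patent:

```python
# Hypothetical mapping between touch actions on the second touch area
# and control commands; entries are illustrative (factory-set or
# user-defined, per the description).
SECOND_AREA_COMMANDS = {
    "swipe_up": "volume_up",
    "swipe_down": "volume_down",
    "double_tap": "pause_play",
    "swipe_left": "rewind",
    "swipe_right": "fast_forward",
    "draw_circle": "toggle_speaker_output",
}

def execute_second_area_action(action: str) -> str:
    """Return the control command mapped to a touch action.

    Returns "ignored" for actions with no predefined mapping, mirroring
    the behavior of acting only on recognized touch actions.
    """
    return SECOND_AREA_COMMANDS.get(action, "ignored")
```

A user-defined setting would simply replace or extend the dictionary before lookup.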

Abstract

A head-mounted display device includes a body (10), a processor (30), and a display module (20) disposed within the body (10). The body (10) includes an inner side (11) and an outer side (12) corresponding to the inner side (11); the inner side (11) includes a viewing window (111) through which light from the display module (20) exits. The head-mounted display device further includes a first touch area (40) and a second touch area (50). The first touch area (40) is disposed on the outer side (12) and corresponds to the viewing window (111). The processor (30) is configured to execute a corresponding interface control instruction according to a touch signal generated by the first touch area (40), and further to execute, according to the touch action corresponding to a touch signal generated by the second touch area (50) and a predefined correspondence between touch actions on the second touch area (50) and control commands, the control command corresponding to the current touch action on the second touch area (50). The device improves controllability and convenience, thereby enhancing the user experience.

Description

Head-Mounted Display Device and Control Method Therefor

Technical Field

The present invention relates to the technical field of head-mounted display devices, and in particular to a head-mounted display device and a control method therefor.

Background Art

A head-mounted display (HMD, Head Mounted Display) is a display device that can be worn on the head. An HMD typically uses a near-eye optical system to present graphics, pictures, and other multimedia information on a display screen located a few centimeters from the eye. Current HMDs include see-through types, through which the wearer can see both the outside world and the HMD content, and immersive types, through which only the HMD content is visible. Immersive HMDs have many entertainment applications, but once the device is worn the wearer loses all outside visual reference; when an input operation is required it is difficult to locate the fingers and perform precise touch input, which makes the device inconvenient to operate.

Summary of the Invention

Embodiments of the present invention disclose a head-mounted display device and a control method therefor that enable precise finger touch input, thereby improving the controllability of the head-mounted display device and the convenience of its operation.

An embodiment of the present invention discloses a head-mounted display device comprising a body, a processor, and a display module disposed within the body. The body includes an inner side and an outer side corresponding to the inner side; the inner side includes a viewing window through which light from the display module exits. The head-mounted display device further includes a first touch area and a second touch area. The first touch area is disposed on the outer side and corresponds to the viewing window. The processor is configured to execute a corresponding interface control instruction according to a touch signal generated by the first touch area, and further to execute, according to the touch action corresponding to a touch signal generated by the second touch area and a predefined correspondence between touch actions on the second touch area and control commands, the control command corresponding to the current touch action on the second touch area.

An embodiment of the present invention further discloses a control method for a head-mounted display device. The head-mounted display device comprises a body, a processor, and a display module disposed within the body; the body includes an inner side and an outer side corresponding to the inner side; the inner side includes a viewing window through which light from the display module exits; the head-mounted display device further includes a first touch area and a second touch area, the first touch area being disposed on the outer side and corresponding to the viewing window. The control method comprises:

when the touch signal comes from the first touch area, executing a corresponding interface control instruction according to the touch signal; and

when the touch signal comes from the second touch area, executing, according to the touch action corresponding to the touch signal generated by the second touch area and the predefined correspondence between touch actions on the second touch area and control commands, the control command corresponding to the current touch action on the second touch area.

In the embodiments provided by the present invention, because the first touch area corresponds to the viewing window, i.e. is disposed on the front outer surface of the head-mounted display device, the first touch area corresponds spatially to the display interface that the user sees through the display module. This sensory correspondence helps the user locate the first touch area while wearing the device, so the wearer can conveniently touch the first touch area to control the display interface, improving the controllability of the head-mounted display device. Furthermore, the head-mounted display device also includes a second touch area, and the control command corresponding to the current touch action on the second touch area can be executed according to the predefined correspondence between touch actions and control commands. In other words, the second touch area is configured as a touch area for shortcut operations, so control that better matches the user's operating habits can be provided for the product characteristics of the head-mounted display device, improving control efficiency and thus the operating experience. The controllability and convenience of the head-mounted display device are therefore further improved, enhancing the user experience.
Brief Description of the Drawings

To explain the technical solutions in the embodiments of the present invention more clearly, the drawings required for the embodiments are briefly introduced below. The drawings described below are obviously only some embodiments of the invention; a person of ordinary skill in the art can derive other drawings from them without creative effort.

Fig. 1 is a perspective view of a head-mounted display device in an embodiment of the present invention;

Fig. 2 is a partial structural view of the head-mounted display device of Fig. 1;

Fig. 3 is a functional block diagram of the head-mounted display device of Fig. 1; and

Fig. 4 is a flowchart of a control method for the head-mounted display device in an embodiment of the present invention.
Detailed Description of the Embodiments

The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. The described embodiments are obviously only some, not all, of the embodiments of the invention. All other embodiments obtained by a person of ordinary skill in the art based on these embodiments without creative effort fall within the scope of protection of the present invention.
Referring to Figs. 1 to 3, the head-mounted display device in the first embodiment of the present invention may include a body 10, a display module 20 disposed within the body 10, a processor 30, a first touch area 40, and a second touch area 50. The body 10 includes an inner side 11 and an outer side 12 corresponding to the inner side 11. The inner side 11 is provided with a viewing window 111 through which light from the display module 20 exits. The display module 20 may be an integrated structure comprising a display and an optical assembly (not shown), or a single display independent of the optical assembly; this embodiment does not limit this. The wearer sees an enlarged virtual image of the display formed through the optical assembly, i.e. indirectly views the display interface of the display. The display interface comprises the content shown by the display, including but not limited to a home interface, a menu interface, a video playback interface, an audio playback interface, an application interface, and so on.
In this embodiment, the first touch area 40 is disposed on the outer side 12 and corresponds to the viewing window 111. The processor 30 executes a corresponding interface control instruction according to the touch signal generated by the first touch area 40. The first touch area 40 may be a floating (hover) touch area, or a common capacitive, resistive, or similar contact touch area. By the nature of floating touch, when a touch object falls within the hover-sensing range, the floating touch area generates a corresponding floating touch signal; that is, a touch signal can be triggered without the area actually being touched. Preferably, the first touch area 40 is a flexible touch area, for example a flexible touch panel, so that it can conform to the outer side 12 whatever the shape of the outer side 12. In this embodiment, because the first touch area 40 corresponds to the viewing window 111, i.e. is disposed on the front outer surface of the head-mounted display device, it corresponds spatially to the display interface the user sees through the display module 20. This sensory correspondence helps the user locate the first touch area 40 while wearing the device, so the wearer can conveniently touch the first touch area 40 to control the display interface, improving the controllability of the device.
In this embodiment, when the first touch area 40 is a common contact touch panel (including but not limited to capacitive or resistive), the processor 30 displays a graphical indicator on the display interface at the position corresponding to the position information of the touch signal generated by the first touch area. The graphical indicator may be a graphical cursor, a mouse arrow, a focus highlight on an operation item, and so on. The touch object is typically a finger, though it may be any other sensible object. In other words, when the touch object touches a position on the first touch area 40, the processor 30 displays a graphical indicator at the corresponding position of the display interface according to that position information, as input guidance for the user. Further, when the processor 30 detects from the position information of the touch signal that a displacement has occurred, it moves the graphical indicator accordingly; and when the touch signal generated by the first touch area 40 is a tap signal, it performs the corresponding operation according to the content under the graphical indicator, for example when the indicator falls within the region of an operation item. That is, after the graphical indicator is displayed, if the user moves the touch object on the first touch area 40 in any direction (up, down, left, right, etc.), producing a displacement, the processor 30 moves the displayed indicator accordingly, so that the user can judge from the indicator's movement whether the touch object is moving in the intended direction. Once the indicator reaches its destination, for example the application the user wants to open, a touch action such as a single or double tap is performed on the first touch area 40. The processor 30 obtains the tap signal from the user's touch action and performs the corresponding operation; for example, if the indicator has been moved onto an application item, a single tap opens the selected application. In this embodiment, indicating the current position of the touch object with a graphical cursor as input guidance makes the device easier to control while worn.
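The indicator-tracking behavior described above (display a cursor at the touch position, then move it by the displacement of successive touch-signal positions) can be sketched as follows. This is a minimal illustration only; the class name, the 1:1 mapping from touchpad displacement to display displacement, and the coordinate convention are all assumptions:

```python
class CursorTracker:
    """Map displacement of touch-signal positions on the first touch
    area to movement of a graphical indicator on the display interface."""

    def __init__(self, start=(0.0, 0.0)):
        self.cursor = start      # indicator position on the display interface
        self.last_touch = None   # last reported touch-signal position

    def on_touch(self, pos):
        """Update the indicator from a new touch-signal position and
        return its new display position."""
        if self.last_touch is not None:
            # Move the indicator by the displacement, not to the
            # absolute touch position.
            dx = pos[0] - self.last_touch[0]
            dy = pos[1] - self.last_touch[1]
            self.cursor = (self.cursor[0] + dx, self.cursor[1] + dy)
        self.last_touch = pos
        return self.cursor

    def on_release(self):
        """Forget the last touch so a re-engaged finger does not jump
        the indicator."""
        self.last_touch = None
```

Relative (displacement-based) tracking is chosen here because the wearer cannot see the touchpad; only the indicator's motion on the display provides feedback.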
In this embodiment, when the processor 30 determines from the touch signal generated by the first touch area 40 that the touch object has performed a preset operation on the first touch area 40, it executes the corresponding interface control instruction according to the preset correspondence between preset operations and interface control instructions. For example, a two-finger spread or pinch on the first touch area 40 executes an instruction to enlarge or reduce the display interface; two fingers sliding left simultaneously on the first touch area 40 returns to the previous interface; three fingers tapping simultaneously on the first touch area 40 returns to the home page. It should be understood that these operations are merely illustrative and do not limit the scope of the invention.
In this embodiment, when the first touch area 40 is a floating touch area, the processor 30 displays a graphical indicator on the display interface at the position corresponding to the position information of the floating touch signal of the first touch area 40. That is, when the touch object falls within the sensing range of the first touch area 40, a graphical indicator is displayed at the corresponding position of the display interface as input guidance for the user. Further, when the processor 30 detects from the position information of the floating touch signal generated by the first touch area 40 that a displacement has occurred, for example when the touch object moves up, down, left, or right in front of the first touch area 40 and the position information changes accordingly, it moves the graphical indicator according to that displacement; and when the first touch area 40 generates a contact touch signal, it performs the corresponding operation according to the content under the indicator. In other words, after the indicator is displayed, if the user moves the touch object within the sensible space of the floating touch area, the processor 30 moves the indicator accordingly: it controls the indicator's display position on the interface in real time from the dynamically changing position information, achieving synchronized movement of indicator and touch object, so that the user can judge from the indicator's movement whether the touch object is moving in the intended direction. Once the indicator reaches its destination, for example an operation item on the current display interface (a movie poster, an audio file, a settings option, a menu item of the navigation interface, or any other visible operable item), the user stops moving roughly parallel to the display interface and instead moves the touch object toward and onto the first touch area 40, i.e. moves roughly perpendicular to the display picture and touches the first touch area 40, generating a contact touch signal. The processor 30 then performs the corresponding operation according to the operation item under the indicator; for example, if the indicator lies on an application, the selected application is opened. In this embodiment, indicating the current position of the touch object with the graphical indicator as input guidance makes the device easier to control while worn. Moreover, with a floating touch area the touch operation can be achieved without the user actually touching the touch area, further improving convenience.

It should be understood that operation items also include items called up by a predefined preset operation, such as a hidden menu; the preset operation includes gesture operations.
In this embodiment, when the first touch area 40 is a floating touch area and the processor 30 determines from the floating touch signal generated by the first touch area 40 that the signal corresponds to a preset operation, it executes the interface control instruction corresponding to that preset operation according to the preset correspondence between preset operations and interface control instructions. For example, when two fingers spread apart or pinch together in front of the first touch area 40, the display interface is enlarged or reduced; when two fingers move left simultaneously in front of the first touch area 40, the previous interface is restored.

Of course, it should be understood that when the first touch area 40 is a floating touch area, the corresponding interface control can also be achieved by touching the first touch area 40 directly; for example, two fingers touching the first touch area 40 and sliding left simultaneously returns to the previous interface.

It should also be understood that when the first touch area 40 is a floating touch area, control can also be performed according to a preset operation that combines an actual touch on the first touch area 40 with a floating operation. For example, under the preset correspondence between preset operations and interface control instructions, one finger may touch the first touch area 40 while another finger performs a directional movement, or two fingers spread/pinch, as the floating operation: the touching finger selects an operation item such as a picture, an audio file icon, a video file icon, or an application, while the floating operation drags the item across the display interface according to its directional movement, or enlarges/reduces the item (for example, a picture) according to its spread/pinch. In this way, more operational possibilities are provided within a limited touch area. On the one hand, fewer cluttered operation items need be shown on the display interface, giving a cleaner picture; on the other hand, the user is offered a more convenient, intuitive way of operating that matches their habits.
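The combined touch-plus-hover preset operations can be modelled as interpreting the state of two pointers at once: one touching finger that selects an item, and one hovering finger whose movement drags or zooms it. A minimal sketch, with all state and command names assumed for illustration:

```python
def interpret_combined_gesture(touching, hover):
    """Interpret a touch-plus-hover preset operation.

    touching: True if one finger is actually touching the first touch area
              (which selects the operation item under the indicator).
    hover:    the simultaneous floating operation of the other finger,
              e.g. "move_left", "move_right", "move_up", "move_down",
              "spread", "pinch", or None.
    """
    if not touching:
        return "no_selection"            # nothing selected, nothing to act on
    if hover is None:
        return "select_item"             # plain touch: select only
    if hover in ("move_up", "move_down", "move_left", "move_right"):
        # Directional hover drags the selected item across the interface.
        return "drag_" + hover.split("_", 1)[1]
    if hover == "spread":
        return "zoom_in_item"            # enlarge the selected item
    if hover == "pinch":
        return "zoom_out_item"           # reduce the selected item
    return "select_item"                 # unrecognized hover: keep selection
```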
In this embodiment, the second touch area 50 is disposed on the outer side 12 adjacent to the first touch area 40. The second touch area 50 and the first touch area 40 may be different regions of the same touch panel, or two separate touch panels. In other embodiments, the second touch area 50 may be located elsewhere on the head-mounted display device, for example at the earphones 70 at the two opposite ends of the body 10. When the head-mounted display device further includes a headband 60 connected between the two earphones 70, the second touch area 50 may also be placed anywhere on the outer side of the headband 60. Of course, in other embodiments there may be multiple second touch areas 50, for example two, placed on either side of the first touch area 40 or at the two earphones 70 respectively. The number and placement of the second touch areas 50 can be set flexibly as needed. The second touch area 50 may be a floating touch area or a common capacitive or resistive touch area; preferably, it is a flexible touch area.
In this embodiment, the head-mounted display device includes a predefined correspondence between touch actions on the second touch area 50 and control commands, and executes the control command corresponding to the current touch action on the second touch area 50 according to this correspondence. When the second touch area 50 is a floating touch area, a touch action on it may be any action performed within its hover-sensing range, whether or not the second touch area 50 is actually touched. The correspondence between touch actions on the second touch area 50 and control commands may be a factory setting or a user-defined setting. Specifically, it may be stored in a storage module of the head-mounted display device, for example read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), random-access memory (RAM), a magnetic disk, etc.; the processor 30 can read the data from the storage module. A touch action may be a single tap, a double tap, a slide, and so on; the control commands include instructions usable for volume adjustment, fast-forward/rewind adjustment, switching between audio and video modes, mute control, and/or pause/play control. In other words, in this embodiment the second touch area 50 is configured as a touch area for shortcut operations, which the following examples illustrate further.

For example, suppose an upward slide is preset to correspond to a volume-up command and a downward slide to a volume-down command. While music or video is playing, if the processor 30 determines from the signal generated by the second touch area 50 that the touch operation performed on it is an upward slide, the volume is increased; if a downward slide, the volume is decreased.

Configuring the second touch area 50 as a touch area for shortcut operations allows control better matched to the product characteristics of a head-mounted display device and to the user's operating habits, improving control efficiency and thus the user's operating experience.
The head-mounted display device in this embodiment further includes a headband 60 connected to the body 10 for wearing the device on the wearer's head, earphones 70, and a speaker 80. The speaker 80 may be disposed in the body 10 or anywhere else on the device, for example in the headband 60 or the earphones 70. The processor 30 can control, according to a touch action on the second touch area 50, whether the audio signal is played through the earphones 70 or the speaker 80. For example, if a touch signal from the second touch area 50 is received while audio is being played through the earphones 70, and the processor 30 determines from the signal that the touch action is drawing a circle on the second touch area 50, playback is switched to the speaker 80. Thus a touch operation on the second touch area 50 provides a shortcut for switching between earphone and speaker playback, further improving convenience. It may also be provided that when the touch action determined from the second touch area 50's signal is associated with a command to switch to audio mode, the processor 30 turns off the display output and plays audio only through the earphones 70 or the speaker 80. Turning off the display output includes, but is not limited to, putting the display of the display module 20 into standby or another non-operating state, such as an unpowered state, thereby saving power and reducing consumption. While in audio mode, when the touch action determined from the second touch area 50's signal is associated with a command to switch to video mode, the processor 30 turns the display output back on. It should be understood that these exemplary schemes show how the second touch area 50, combined with the product's characteristics, can provide better-adapted convenient operations; they do not limit the specific touch actions or control commands of the second touch area 50.
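The audio routing and audio/video mode switching described above can be sketched as a small state machine. The triggering action names and state fields below are illustrative assumptions, not fixed by the patent:

```python
class AudioRouter:
    """Sketch of earphone/speaker routing and audio/video mode switching
    driven by touch actions on the second touch area."""

    def __init__(self):
        self.output = "headphone"   # current output device: "headphone" or "speaker"
        self.display_on = True      # False while in audio-only mode

    def handle(self, action):
        if action == "draw_circle":
            # Toggle between earphone and speaker playback.
            self.output = "speaker" if self.output == "headphone" else "headphone"
        elif action == "switch_to_audio_mode":
            # Audio-only mode: display output is turned off (e.g. standby
            # or unpowered) to save power; audio keeps playing.
            self.display_on = False
        elif action == "switch_to_video_mode":
            # Leaving audio mode turns the display output back on.
            self.display_on = True
```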
Fig. 4 is a flowchart of a control method for the head-mounted display device in an embodiment of the present invention, which may include the following steps:

Step 401: determine whether the touch signal comes from the first touch area 40 or the second touch area 50. If the touch signal comes from the first touch area 40, perform step 402; otherwise, i.e. if the touch signal comes from the second touch area 50, perform step 403.

Step 402: when the touch signal comes from the first touch area 40, execute the corresponding interface control instruction according to the touch signal.
Specifically, when the first touch area 40 is a common contact touch area, a graphical indicator is displayed on the display interface at the position corresponding to the position information of the touch signal generated by the first touch area 40. The graphical indicator may be a graphical cursor, a mouse arrow, a focus highlight on an operation item, and so on; the touch object is typically a finger, though it may be any other sensible object. That is, when the touch object touches a position on the first touch area 40, a graphical indicator is displayed at the corresponding position of the display interface according to that position information, as input guidance for the user. Further, when the position information of the touch signal generated by the first touch area 40 shows that a displacement has occurred, the graphical indicator is moved accordingly; and when the touch signal generated by the first touch area 40 is a tap signal, the corresponding operation is performed according to the content under the indicator, for example when the indicator falls within the region of an operation item. In other words, after the indicator is displayed, if the user moves the touch object in any direction on the first touch area 40, producing a displacement, the displayed indicator moves accordingly, so that the user can judge from its movement whether the touch object is moving in the intended direction. Once the indicator reaches its destination, for example the application to be opened, a touch action such as a single or double tap is performed on the first touch area 40; the tap signal is then obtained and the corresponding operation performed, for example a single tap opening the selected application when the indicator has been moved onto an application item. In this embodiment, indicating the touch object's current position with a graphical cursor as input guidance makes the device easier to control while worn.

Specifically, when it is determined from the touch signal generated by the first touch area 40 that the touch object has performed a preset operation on the first touch area 40, the corresponding interface control instruction is executed according to the preset correspondence between preset operations and interface control instructions. For example, a two-finger spread or pinch on the first touch area 40 executes an instruction to enlarge or reduce the display interface; two fingers sliding left simultaneously on the first touch area 40 returns to the previous interface; three fingers tapping simultaneously returns to the home page. It should be understood that these operations are merely illustrative and do not limit the scope of the invention.

Specifically, when the first touch area 40 is a floating touch area, a graphical indicator is displayed on the display interface at the position corresponding to the position information of the floating touch signal generated by the first touch area 40. That is, when the touch object falls within the sensing range of the first touch area 40, an indicator is displayed at the corresponding position of the display interface as input guidance. Further, when the position information of the floating touch signal shows that a displacement has occurred, for example when the touch object moves up, down, left, or right in front of the first touch area 40 and the position information changes accordingly, the indicator is moved according to that displacement; and when the first touch area 40 generates a contact touch signal, the corresponding operation is performed according to the content under the indicator. In other words, after the indicator is displayed, if the user moves the touch object within the sensible space of the floating touch area, the indicator moves with it: its display position is controlled in real time from the dynamically changing position information, achieving synchronized movement of indicator and touch object, so that the user can judge whether the touch object is moving in the intended direction. Once the indicator reaches its destination, for example an operation item on the current display interface (a movie poster, an audio file, a settings option, a menu item of the navigation interface, or any other visible operable item), the user stops moving roughly parallel to the display interface and instead moves the touch object onto the first touch area 40, i.e. moves roughly perpendicular to the display picture and touches the first touch area 40, generating a contact touch signal. The corresponding operation is then performed for the operation item under the indicator; for example, if the indicator lies on an application, the selected application is opened. In this embodiment, indicating the current position of the touch object with the graphical indicator as input guidance makes the device easier to control while worn. Moreover, with a floating touch area the touch operation can be achieved without the user actually touching the touch area, further improving convenience.

It should be understood that operation items also include items called up by a predefined preset operation, such as a hidden menu; the preset operation includes gesture operations.

Specifically, when the first touch area 40 is a floating touch area and it is determined from the floating touch signal generated by the first touch area 40 that the signal corresponds to a preset operation, the interface control instruction corresponding to that preset operation is executed according to the preset correspondence between preset operations and interface control instructions. For example, two fingers spreading or pinching in front of the first touch area 40 enlarges or reduces the display interface; two fingers sliding left simultaneously in front of the first touch area 40 returns to the previous interface.

Of course, it should be understood that when the first touch area 40 is a floating touch area, the corresponding interface control can also be achieved by touching the first touch area 40 directly; for example, two fingers touching the first touch area 40 and sliding left simultaneously returns to the previous interface.

It should also be understood that when the first touch area 40 is a floating touch area, a preset operation combining an actual touch on the first touch area 40 with a floating operation can also be defined to perform the corresponding control. For example, under the preset correspondence between preset operations and interface control instructions, one finger may touch the first touch area 40 while another finger performs a directional movement or a two-finger spread/pinch as the floating operation: the touch selects an operation item such as a picture, an audio file icon, a video file icon, or an application, while the floating operation drags the item across the display interface according to its directional movement, or enlarges/reduces the item (for example, a picture) according to its spread/pinch. In this way, more operational possibilities are provided within a limited touch area: on the one hand, fewer cluttered operation items need be shown on the display interface, giving a cleaner picture; on the other hand, the user is offered a more convenient, intuitive way of operating that matches their habits.
Step 403: when the touch signal comes from the second touch area 50, execute the control command corresponding to the current touch action on the second touch area 50 according to the predefined correspondence between touch actions on the second touch area 50 and control commands.

Specifically, the second touch area 50 may be a floating touch area or a common contact touch area. When it is a floating touch area, a touch action on the second touch area 50 may be any action performed within its hover-sensing range, whether or not the second touch area 50 is actually touched.

Specifically, the correspondence between touch actions on the second touch area 50 and control commands may be a factory setting or a user-defined setting. A touch action may be a single tap, a double tap, a slide, and so on; the control commands include instructions usable for volume adjustment, fast-forward/rewind adjustment, switching between audio and video modes, mute control, and/or pause/play control. In other words, in this embodiment the second touch area 50 is configured as a touch area for shortcut operations, allowing control better matched to the product characteristics of a head-mounted display device and to the user's operating habits, improving control efficiency and thus the user's operating experience.
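The three-step flow of Fig. 4 (determine the signal's source area, then branch to interface control or shortcut handling) can be sketched as follows. The signal fields and returned command strings are illustrative assumptions:

```python
def dispatch_touch_signal(source, signal):
    """Steps 401-403: route a touch signal by its originating area.

    source: "first_area" or "second_area".
    signal: a dict describing the decoded touch signal (fields assumed).
    """
    if source == "first_area":
        # Step 402: interface control (preset gestures, or cursor update).
        if signal.get("preset_operation"):
            return "interface:" + signal["preset_operation"]
        return "interface:update_indicator"
    if source == "second_area":
        # Step 403: predefined shortcut mapping for the second touch area.
        shortcuts = {"swipe_up": "volume_up", "swipe_down": "volume_down"}
        return "shortcut:" + shortcuts.get(signal.get("action"), "ignored")
    raise ValueError("unknown touch source: " + repr(source))
```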
The above are preferred embodiments of the present invention. It should be noted that a person of ordinary skill in the art may make several improvements and refinements without departing from the principles of the invention, and such improvements and refinements are also regarded as falling within the scope of protection of the invention.

Claims (18)

  1. A head-mounted display device, comprising a body, a processor, and a display module disposed within the body, the body comprising an inner side and an outer side corresponding to the inner side, the inner side comprising a viewing window through which light from the display module exits, characterized in that the head-mounted display device further comprises:
    a first touch area, disposed on the outer side and corresponding to the viewing window;
    a second touch area;
    the processor being configured to execute a corresponding interface control instruction according to a touch signal generated by the first touch area, and further to execute, according to the touch action corresponding to a touch signal generated by the second touch area and a predefined correspondence between touch actions on the second touch area and control commands, the control command corresponding to the current touch action on the second touch area.
  2. The head-mounted display device according to claim 1, characterized in that the second touch area is disposed on the outer side adjacent to the first touch area.
  3. The head-mounted display device according to claim 1, characterized in that it further comprises earphones disposed at two opposite ends of the body, and a headband connected between the two earphones, the second touch area being disposed on the earphones and/or the headband.
  4. The head-mounted display device according to any one of claims 1 to 3, characterized in that the first touch area and the second touch area are flexible touch areas.
  5. The head-mounted display device according to claim 1, characterized in that the first touch area is a floating touch area, and the touch signal comprises a floating touch signal.
  6. The head-mounted display device according to claim 5, characterized in that the processor executing a corresponding interface control instruction according to the touch signal generated by the first touch area is specifically:
    when the touch signal generated by the first touch area is a floating touch signal, the processor displays a graphical indicator on the display interface at the position corresponding to the position information of the floating touch signal; when the position information of the floating touch signal undergoes a displacement, the processor moves the graphical indicator according to the displacement; and when the touch signal generated by the first touch area is a contact touch signal, the processor performs the corresponding operation according to the content corresponding to the graphical indicator.
  7. The head-mounted display device according to claim 5, characterized in that the processor executing a corresponding interface control instruction according to the touch signal generated by the first touch area is specifically:
    when the touch signal generated by the first touch area is a floating touch signal and the floating touch signal is a preset operation, the processor executes the corresponding interface control instruction according to the preset correspondence between preset operations and interface control instructions.
  8. The head-mounted display device according to claim 5, characterized in that the processor executing a corresponding interface control instruction according to the touch signal generated by the first touch area is specifically:
    when the touch signal generated by the first touch area comprises a floating touch signal and a contact touch signal, the processor determines the preset operation corresponding to the combination of the floating touch signal and the contact touch signal, and executes the corresponding interface control instruction according to the preset correspondence between preset operations and interface control instructions.
  9. The head-mounted display device according to claim 1, characterized in that the processor executing a corresponding interface control instruction according to the touch signal generated by the first touch area is specifically:
    when the touch signal generated by the first touch area is a contact touch signal, the processor displays a graphical indicator on the display interface at the position corresponding to the position information of the contact touch signal; when the position information of the contact touch signal undergoes a displacement, the processor moves the graphical indicator accordingly; and when a contact touch signal is generated again, the processor performs the corresponding operation according to the content corresponding to the graphical indicator.
  10. The head-mounted display device according to claim 9, characterized in that the processor executing a corresponding interface control instruction according to the touch signal generated by the first touch area is specifically:
    when the touch signal generated by the first touch area is a contact touch signal and the contact touch signal is a preset operation, the processor executes the corresponding interface control instruction according to the preset correspondence between preset operations and interface control instructions.
  11. The head-mounted display device according to claim 1, characterized in that the control commands comprise instructions usable for volume adjustment, fast-forward/rewind adjustment, audio/video mode switching control, mute control, and/or pause/play control.
  12. A control method for a head-mounted display device, the head-mounted display device comprising a body, a processor, and a display module disposed within the body, the body comprising an inner side and an outer side corresponding to the inner side, the inner side comprising a viewing window through which light from the display module exits, the head-mounted display device further comprising a first touch area and a second touch area, the first touch area being disposed on the outer side and corresponding to the viewing window, characterized in that the control method comprises:
    when the touch signal comes from the first touch area, executing a corresponding interface control instruction according to the touch signal; and
    when the touch signal comes from the second touch area, executing, according to the touch action corresponding to the touch signal generated by the second touch area and the predefined correspondence between touch actions on the second touch area and control commands, the control command corresponding to the current touch action on the second touch area.
  13. The control method for a head-mounted display device according to claim 12, characterized in that the first touch area is a floating touch area, and executing a corresponding interface control instruction according to the touch signal when the touch signal comes from the first touch area is specifically:
    when the touch signal generated by the first touch area is a floating touch signal, displaying a graphical indicator on the display interface at the position corresponding to the position information of the floating touch signal; when the position information of the floating touch signal undergoes a displacement, moving the graphical indicator according to the displacement; and when the touch signal generated by the first touch area is a contact touch signal, performing the corresponding operation according to the content corresponding to the graphical indicator.
  14. The control method for a head-mounted display device according to claim 12, characterized in that the first touch area is a floating touch area, and executing a corresponding interface control instruction according to the touch signal when the touch signal comes from the first touch area is specifically:
    when the touch signal generated by the first touch area is a floating touch signal and the floating touch signal is a preset operation, executing the corresponding interface control instruction according to the preset correspondence between preset operations and interface control instructions.
  15. The control method for a head-mounted display device according to claim 12, characterized in that the first touch area is a floating touch area, and executing a corresponding interface control instruction according to the touch signal generated by the first touch area is specifically:
    when the touch signal generated by the first touch area comprises a floating touch signal and a contact touch signal, determining the preset operation corresponding to the combination of the floating touch signal and the contact touch signal, and executing the corresponding interface control instruction according to the preset correspondence between preset operations and interface control instructions.
  16. The control method for a head-mounted display device according to claim 12, characterized in that executing a corresponding interface control instruction according to the touch signal when the touch signal comes from the first touch area is specifically:
    when the touch signal generated by the first touch area is a contact touch signal, displaying a graphical indicator on the display interface at the position corresponding to the position information of the contact touch signal; when the position information of the contact touch signal undergoes a displacement, moving the graphical indicator accordingly; and when a contact touch signal is generated again, performing the corresponding operation according to the content corresponding to the graphical indicator.
  17. The control method for a head-mounted display device according to claim 16, characterized in that executing a corresponding interface control instruction according to the touch signal generated by the first touch area is specifically:
    when the touch signal generated by the first touch area is a contact touch signal and the contact touch signal is a preset operation, executing the corresponding interface control instruction according to the preset correspondence between preset operations and interface control instructions.
  18. The control method for a head-mounted display device according to claim 12, characterized in that the control commands comprise instructions usable for volume adjustment, fast-forward/rewind adjustment, audio/video mode switching control, mute control, and/or pause/play control.
PCT/CN2016/076981 2016-03-22 2016-03-22 头戴式显示设备及其操控方法 WO2017161497A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2016/076981 WO2017161497A1 (zh) 2016-03-22 2016-03-22 头戴式显示设备及其操控方法
CN201680011871.0A CN107466396A (zh) 2016-03-22 2016-03-22 头戴式显示设备及其操控方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/076981 WO2017161497A1 (zh) 2016-03-22 2016-03-22 头戴式显示设备及其操控方法

Publications (1)

Publication Number Publication Date
WO2017161497A1 true WO2017161497A1 (zh) 2017-09-28

Family

ID=59899825

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/076981 WO2017161497A1 (zh) 2016-03-22 2016-03-22 头戴式显示设备及其操控方法

Country Status (2)

Country Link
CN (1) CN107466396A (zh)
WO (1) WO2017161497A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102117140A (zh) * 2009-12-30 2011-07-06 联想(北京)有限公司 一种触摸处理方法及移动终端
US20120242560A1 (en) * 2011-03-24 2012-09-27 Seiko Epson Corporation Head-mounted display device and control method for the head-mounted display device
CN204203554U (zh) * 2014-11-05 2015-03-11 昆山优力电能运动科技有限公司 头戴式显示设备
CN104503584A (zh) * 2014-12-31 2015-04-08 青岛歌尔声学科技有限公司 一种触控式头戴显示器
CN104503585A (zh) * 2014-12-31 2015-04-08 青岛歌尔声学科技有限公司 触控式头戴显示器
CN105190477A (zh) * 2013-03-21 2015-12-23 索尼公司 用于在增强现实环境中的用户交互的头戴式显示装置

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5953963B2 (ja) * 2012-06-13 2016-07-20 ソニー株式会社 頭部装着型映像表示装置
CN103677356B (zh) * 2012-09-03 2018-03-23 联想(北京)有限公司 电子设备
CN107193373B (zh) * 2012-09-03 2020-04-24 联想(北京)有限公司 一种信息处理方法及电子设备
CN103914128B (zh) * 2012-12-31 2017-12-29 联想(北京)有限公司 头戴式电子设备和输入方法
CN104063037B (zh) * 2013-03-18 2017-03-29 联想(北京)有限公司 一种操作命令识别方法、装置和穿戴式电子设备
CN104076907A (zh) * 2013-03-25 2014-10-01 联想(北京)有限公司 一种控制方法、装置和穿戴式电子设备
US20160216792A1 (en) * 2015-01-26 2016-07-28 Seiko Epson Corporation Head mounted display, and control method and control program for head mounted display

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102117140A (zh) * 2009-12-30 2011-07-06 联想(北京)有限公司 一种触摸处理方法及移动终端
US20120242560A1 (en) * 2011-03-24 2012-09-27 Seiko Epson Corporation Head-mounted display device and control method for the head-mounted display device
CN105190477A (zh) * 2013-03-21 2015-12-23 索尼公司 用于在增强现实环境中的用户交互的头戴式显示装置
CN204203554U (zh) * 2014-11-05 2015-03-11 昆山优力电能运动科技有限公司 头戴式显示设备
CN104503584A (zh) * 2014-12-31 2015-04-08 青岛歌尔声学科技有限公司 一种触控式头戴显示器
CN104503585A (zh) * 2014-12-31 2015-04-08 青岛歌尔声学科技有限公司 触控式头戴显示器

Also Published As

Publication number Publication date
CN107466396A (zh) 2017-12-12

Similar Documents

Publication Publication Date Title
JP6286599B2 (ja) 文字入力インターフェース提供方法及び装置
US9645663B2 (en) Electronic display with a virtual bezel
KR102034584B1 (ko) 포터블 디바이스 및 그 제어 방법
JP6408156B2 (ja) 多表面コントローラ
KR100832355B1 (ko) 3차원 포인팅 방법, 3차원 표시제어방법, 3차원 포인팅장치, 3차원 표시 제어장치, 3차원 포인팅 프로그램, 및3차원 표시제어 프로그램
JP5900393B2 (ja) 情報処理装置、操作制御方法及びプログラム
US20170123516A1 (en) Multi-surface controller
US9430041B2 (en) Method of controlling at least one function of device by using eye action and device for performing the method
US20080297484A1 (en) Method and apparatus for providing gesture information based on touchscreen and information terminal device having the apparatus
EP2214088A2 (en) Information processing
US20130093695A1 (en) Organizational Tools on a Multi-touch Display Device
JP2017526057A (ja) アプリケーションウィンドウの領域ベースのサイズ調節および適所配置
JP2017527882A (ja) アプリケーションウィンドウの補助的表示
TWM341271U (en) Handheld mobile communication device
KR20110081040A (ko) 투명 디스플레이를 구비한 휴대단말에서 컨텐츠 운용 방법 및 장치
US20130127731A1 (en) Remote controller, and system and method using the same
US20090235201A1 (en) Methods for controlling display of on-screen menus
JP2013114422A (ja) 情報処理装置、情報処理方法、およびコンテンツファイルのデータ構造
US20120287059A1 (en) Portable device and method for operating portable device
JP2018514865A (ja) ウェアラブル装置、そのタッチスクリーン、そのタッチ操作方法、及びそのグラフィカルユーザインタフェース
US20180181263A1 (en) Uninterruptable overlay on a display
EP2661671A1 (en) Multi-touch integrated desktop environment
TWI564780B (zh) 觸控螢幕姿態技術
WO2017161497A1 (zh) 头戴式显示设备及其操控方法
WO2018209572A1 (zh) 头戴式显示设备及其交互输入方法

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16894852

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 06/02/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 16894852

Country of ref document: EP

Kind code of ref document: A1