WO2018103040A1 - Head-mounted display device and content input method thereof - Google Patents

Head-mounted display device and content input method thereof

Info

Publication number
WO2018103040A1
Authority
WO
WIPO (PCT)
Prior art keywords
input
touch
display device
touch panel
virtual
Prior art date
Application number
PCT/CN2016/109011
Other languages
English (en)
French (fr)
Inventor
王一琦
陈爽新
徐培辰
Original Assignee
深圳市柔宇科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市柔宇科技有限公司 filed Critical 深圳市柔宇科技有限公司
Priority to CN201680039083.2A priority Critical patent/CN107980110A/zh
Priority to US16/330,215 priority patent/US20190227688A1/en
Priority to PCT/CN2016/109011 priority patent/WO2018103040A1/zh
Publication of WO2018103040A1 publication Critical patent/WO2018103040A1/zh

Classifications

    • H04R 1/1041: Mechanical or electronic switches, or control elements (earpieces; earphones)
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G02B 27/0172: Head-up displays, head mounted, characterised by optical features
    • G02B 27/0176: Head-up displays, head mounted, characterised by mechanical features
    • G02B 7/002: Mountings for optical elements; mounting on the human body
    • G06F 1/16: Constructional details or arrangements
    • G06F 1/163: Wearable computers, e.g. on a belt
    • G06F 1/1673: Arrangements for projecting a virtual keyboard
    • G06F 1/169: Integrated I/O peripherals, the peripheral being an integrated pointing device, e.g. trackball, mini-joystick, touch pads or touch stripes
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/0236: Character input methods using selection techniques to select from displayed items
    • G06F 3/03547: Touch pads, in which fingers can move on a surface
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0412: Digitisers structurally integrated in a display
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F 3/04886: Interaction techniques using a touch-screen or digitiser, by partitioning the display area or the digitising tablet surface into independently controllable areas, e.g. virtual keyboards or menus
    • H04R 1/1008: Earpieces of the supra-aural or circum-aural type
    • G02B 2027/014: Head-up displays comprising information/image processing systems
    • G06F 2203/0339: Touch strips, e.g. orthogonal touch strips to control cursor movement or scrolling; single touch strip to adjust a parameter or to implement a row of soft keys

Definitions

  • the present invention relates to a display device, and more particularly to a head-mounted display device and a content input method thereof.
  • head-mounted display devices have gradually become popular because of their convenience and their ability to deliver stereoscopic display and stereo sound.
  • in recent years, with the emergence of virtual reality (VR) technology, head-mounted display devices have become even more widely used as the hardware supporting VR. Since a user wearing a head-mounted display device cannot see the outside world, entering input with an existing input device is often inconvenient.
  • embodiments of the invention disclose a head-mounted display device and a content input method thereof that make it convenient for the user to input content.
  • the head-mounted display device disclosed in an embodiment of the present invention includes an earphone device, a display device, a touch input device, and a processor. The touch input device includes a ring-shaped first touch panel configured to detect touch operations. The processor is configured to control the display device to display a soft keyboard input interface in response to a content input request, the soft keyboard input interface including an input box and a plurality of virtual buttons arranged in a ring. The processor is further configured to determine the virtual button to be input in response to a first touch action input on the first touch panel and, when the virtual button to be input is determined to be a character, to control the character to be output and displayed in the input box.
  • the content input method disclosed in an embodiment of the present invention is applied to a head-mounted display device that includes an earphone device, a display device, and a touch input device. The method includes the steps of: in response to a content input request, controlling the display device to display a soft keyboard input interface including an input box and a plurality of virtual buttons arranged in a ring; determining the virtual button to be input in response to a first touch action input on the ring-shaped first touch panel of the touch input device; and, when the virtual button to be input is determined to be a character, controlling the character to be output and displayed in the input box.
  • the head-mounted display device and the content input method of the present invention allow the user to input character content conveniently after putting on the head-mounted display device.
  • FIG. 1 is a schematic diagram of a head mounted display device in accordance with an embodiment of the present invention.
  • FIG. 2 is a block diagram showing the structure of a head mounted display device in accordance with an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of a touch input device of a head mounted display device in accordance with an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a soft keyboard input interface displayed by a display device of a head mounted display device according to an embodiment of the invention.
  • FIG. 5 is a schematic diagram of inputting characters in a soft keyboard input interface according to an embodiment of the invention.
  • FIG. 6 is a schematic diagram of the soft keyboard input interface when the language category of its virtual buttons is the Chinese Pinyin category, according to an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of the soft keyboard input interface when the language category of its virtual buttons is the lowercase letter category, according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram of the soft keyboard input interface when the language category of its virtual buttons is the numbers and punctuation category, according to an embodiment of the present invention.
  • FIGS. 9-11 are schematic diagrams of the input process when the language category of the virtual buttons of the soft keyboard input interface is the Chinese Pinyin category, according to an embodiment of the present invention.
  • FIG. 12 is a schematic diagram of a head-mounted display device according to another embodiment of the present invention.
  • FIG. 13 is a flowchart of a content input method in an embodiment of the present invention.
  • FIG. 14 is a flowchart of the content input method when the language category of the virtual buttons of the soft keyboard input interface is the Chinese Pinyin category, according to an embodiment of the present invention.
  • FIG. 1 is a schematic diagram of a head mounted display device 100 in accordance with an embodiment of the present invention.
  • the head mounted display device 100 includes an earphone device 1 and a display device 2.
  • the earphone device 1 is for outputting sound
  • the display device 2 is for outputting a display screen.
  • FIG. 2 is a structural block diagram of the head mounted display device 100.
  • the head mounted display device 100 includes a touch input device 3 and a processor 4 in addition to the earphone device 1 and the display device 2.
  • the processor 4 is electrically connected to the earphone device 1, the display device 2, and the touch input device 3.
  • the touch input device 3 includes a first touch panel 31.
  • the first touch panel 31 is configured to detect a touch operation.
  • the first touch panel 31 is annular.
  • the processor 4 controls the display device 2 to display a soft keyboard input interface T1 in response to a content input request, and the soft keyboard input interface T1 includes a plurality of circularly arranged virtual buttons K1.
  • in one language-category input mode, each virtual button K1 is either a character or a language-category switching button/icon; each virtual button K1 lets the user input a single character of the current language category or switch from the current language category to another one.
  • the processor 4 determines the virtual button K1 selected by the user in response to the first touch action input on the first touch panel 31, that is, determines the virtual button K1 to be input by the user.
  • the first touch action is a sliding touch along the circular track of the first touch panel 31 that dwells for a preset time at the position corresponding to the virtual button K1 to be input. For example, as shown in FIG. 4, to input the letter "A", the user can slide along the first touch panel 31 and stay at the position of the letter "A" for the preset time.
  • the preset time may be 2 seconds or another suitable duration.
  • touch positions on the first touch panel 31 correspond one-to-one to the positions of the virtual buttons K1 on the soft keyboard input interface T1.
  • the processor 4 is further configured to highlight the virtual button K1 corresponding to the current touch position during a sliding touch on the first touch panel 31, to show the user which virtual button K1 can currently be selected for input.
  • when the touch dwells at the position corresponding to a virtual button K1 for the preset time, the processor 4 determines that button to be the selected virtual button K1. For example, as shown in FIG. 4, when the touch slides to the position corresponding to the letter "A", the letter "A" is highlighted.
  • if the currently highlighted virtual button K1 is the one the user wants to input, dwelling at the current touch position for the predetermined time inputs the character "A" on the selected button. Thus, as the sliding touch proceeds, each virtual button K1 whose touch position is passed is highlighted in turn, so the highlighted virtual button K1 follows the movement of the touch.
  • highlighting a virtual button K1 may mean brightening it, displaying it in a color different from the other virtual buttons K1, adding a special mark such as a circle to it, and the like.
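As a rough illustration of this slide-and-dwell selection scheme (not part of the patent text), the following Python sketch maps a touch position on the ring-shaped first touch panel to one of N virtual buttons by angle and commits a button once the touch dwells on it for a preset time. All names (`RingKeySelector`, `DWELL_TIME`, the coordinate convention) are hypothetical.

```python
import math
import time

DWELL_TIME = 2.0  # preset dwell time in seconds; the patent mentions 2 s as an example

class RingKeySelector:
    """Maps touch angles on a ring-shaped touchpad to virtual buttons and
    commits a button after the touch dwells on it for DWELL_TIME."""

    def __init__(self, keys):
        self.keys = keys            # e.g. ["A", "B", ..., "Z"], laid out around the ring
        self.highlighted = None     # index of the currently highlighted button
        self.dwell_start = None     # time when the touch entered the current button

    def key_at(self, x, y):
        """One-to-one mapping from a touch position (relative to the ring
        center) to a button index, by quantizing the polar angle."""
        angle = math.atan2(y, x) % (2 * math.pi)
        return int(angle / (2 * math.pi) * len(self.keys))

    def on_touch(self, x, y, now=None):
        """Feed touch samples; returns a committed button or None."""
        now = time.monotonic() if now is None else now
        idx = self.key_at(x, y)
        if idx != self.highlighted:   # slid onto a new button: move the highlight
            self.highlighted = idx
            self.dwell_start = now
            return None               # the UI would highlight self.keys[idx] here
        if now - self.dwell_start >= DWELL_TIME:
            self.dwell_start = now    # reset so a continued dwell can commit again
            return self.keys[idx]     # dwell reached: this button is selected
        return None
```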
  • the soft keyboard input interface T1 further includes an input box K2.
  • the processor 4 is further configured, when the selected virtual button K1 to be input is a character, to control the character of the selected virtual button K1 to be output and displayed in the input box K2. For example, as shown in FIG. 5, after determining that the letter "A" is the selected virtual button K1 to be input, the processor 4 further controls the letter "A" to be displayed in the input box K2.
  • the input box K2 is surrounded by the plurality of ring-arranged virtual buttons K1 and lies inside the ring they form. That is, the plurality of virtual buttons K1 are arranged in a ring around the input box K2.
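A companion sketch, equally hypothetical, computes where such a ring of virtual buttons could be drawn around the input box; the evenly spaced polar layout is an assumption, since the patent only specifies that the buttons form a ring around K2.

```python
import math

def ring_layout(keys, cx, cy, radius):
    """Place the virtual buttons evenly on a circle of the given radius
    around the input box centered at (cx, cy); returns {key: (x, y)}."""
    step = 2 * math.pi / len(keys)
    return {
        key: (cx + radius * math.cos(i * step), cy + radius * math.sin(i * step))
        for i, key in enumerate(keys)
    }
```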
  • as shown in FIG. 3, the touch input device 3 further includes a second touch panel 32.
  • the second touch panel 32 is likewise used to detect touch operations.
  • in some embodiments, as shown in FIG. 3, the second touch panel 32 may be a circular touch panel surrounded by the ring-shaped first touch panel 31.
  • in other embodiments, the second touch panel 32 may itself be a ring-shaped touch panel surrounded by the first touch panel 31, with the outer diameter of the second touch panel 32 substantially equal to the inner diameter of the first touch panel 31.
  • in still other embodiments, the second touch panel 32 may be located at the periphery of the first touch panel 31, that is, surrounding the first touch panel 31.
  • the processor 4 is further configured to delete characters entered in the input box K2 in response to a second touch action input on the second touch panel 32.
  • the second touch action may be a back-and-forth sliding touch along a preset direction input on the second touch panel 32.
  • the processor 4 deletes different numbers of characters depending on the distance of the back-and-forth sliding touch input on the second touch panel 32. That is, the processor 4 deletes a number of characters corresponding to the sliding distance.
  • for example, when the distance of the back-and-forth sliding touch input on the second touch panel 32 is less than a first preset distance, the processor 4 deletes one character entered in the input box K2; when that distance is greater than a second preset distance, the processor 4 deletes all the characters entered in the input box K2.
  • the second preset distance is greater than the first preset distance.
  • a back-and-forth sliding touch is a sliding touch operation that includes at least a first sliding direction and a second sliding direction, where the first and/or the second sliding direction is the same as the preset direction, and the projection onto the first sliding direction of the distance slid in the second sliding direction is greater than a preset distance, for example greater than 0. In another embodiment, the angle between the first sliding direction and the second sliding direction is less than 90°. The distance of the back-and-forth sliding touch equals the distance slid in the second sliding direction, or the projection of that distance onto the first sliding direction, or the difference between the distance slid in the first sliding direction and the projection onto the first sliding direction of the distance slid in the second sliding direction.
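The distance-dependent deletion rule could look like the sketch below; a minimal sketch, assuming hypothetical threshold values and a linear scaling between the two preset distances (the patent only fixes the behavior below the first threshold and above the second).

```python
FIRST_PRESET_DISTANCE = 1.0   # hypothetical units; below this, delete one character
SECOND_PRESET_DISTANCE = 4.0  # must exceed the first; above this, delete everything

def characters_to_delete(swipe_distance, text_length):
    """Map the distance of a back-and-forth swipe on the second touch panel
    to the number of characters to delete from the input box."""
    if swipe_distance < FIRST_PRESET_DISTANCE:
        return min(1, text_length)    # short swipe: delete a single character
    if swipe_distance > SECOND_PRESET_DISTANCE:
        return text_length            # long swipe: clear the whole input box
    # in between, scale the deletion count with the swipe distance (assumption)
    span = SECOND_PRESET_DISTANCE - FIRST_PRESET_DISTANCE
    fraction = (swipe_distance - FIRST_PRESET_DISTANCE) / span
    return min(text_length, max(1, round(fraction * text_length)))
```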
  • as shown in FIG. 3, the touch input device 3 further includes a proximity sensor 33, which may be disposed within the region of the first touch panel 31 and/or the second touch panel 32.
  • the proximity sensor 33 is configured to detect a close-range non-contact gesture of the user.
  • the processor 4 is further configured to control switching of the language category of the virtual buttons K1 displayed by the soft keyboard input interface T1 in response to a preset gesture detected by the proximity sensor 33.
  • for example, when the current language category of the virtual buttons K1 of the soft keyboard input interface T1 is the uppercase letter category shown in FIG. 4 or FIG. 5 and the proximity sensor 33 detects the preset gesture, the processor 4 switches the language category of the virtual buttons K1 to the Chinese Pinyin category shown in FIG. 6.
  • as shown in FIG. 6, the virtual buttons K1 of the Chinese Pinyin category include lowercase pinyin letters K11 arranged in a ring, and common Chinese characters K12 arranged around the outer ring of the pinyin letters.
  • the language categories of the virtual buttons displayed by the soft keyboard input interface T1 also include the lowercase letter category shown in FIG. 7.
  • when the proximity sensor 33 detects the preset gesture again, the processor 4 switches the language category of the current virtual buttons K1 of the soft keyboard input interface T1 to the lowercase letter category shown in FIG. 7.
  • the language categories of the virtual buttons displayed by the soft keyboard input interface T1 further include the numbers and punctuation category shown in FIG. 8. When the proximity sensor 33 detects the preset gesture once more, the processor 4 switches the language category of the virtual buttons K1 of the soft keyboard input interface T1 to the numbers and punctuation category shown in FIG. 8.
  • the language categories of the virtual buttons K1 displayed by the soft keyboard input interface T1 may further include other categories; the user can keep switching the language category of the virtual buttons K1 by performing the preset gesture until the required category is reached.
  • in one embodiment, the preset gesture may be a non-contact unidirectional movement or a non-contact back-and-forth movement along a direction parallel to the surface of the touch input device 3.
  • in another embodiment, the preset gesture may be a non-contact back-and-forth movement along a direction perpendicular to the surface of the touch input device 3, and the like.
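One plausible reading of this gesture-driven switching is a simple cyclic state machine over the named categories; the ordering below and the class name `CategorySwitcher` are assumptions, as the patent does not fix a cycle order.

```python
# hypothetical cycle order; the patent names these categories but no fixed order
CATEGORIES = ["uppercase", "chinese_pinyin", "lowercase", "digits_punctuation"]

class CategorySwitcher:
    """Cycles the soft keyboard's language category each time the proximity
    sensor reports the preset non-contact gesture."""

    def __init__(self):
        self._index = 0

    @property
    def current(self):
        return CATEGORIES[self._index]

    def on_preset_gesture(self):
        """Advance to the next category; the caller re-renders the keyboard."""
        self._index = (self._index + 1) % len(CATEGORIES)
        return self.current
```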
  • as shown in FIG. 5 or FIG. 7, the virtual buttons K1 of the soft keyboard input interface T1 may further include a language-category switching button q1, and the processor 4 also switches the language category of the virtual buttons K1 in response to a click on the language-category switching button q1.
  • for example, as shown in FIG. 5, when the current language category of the virtual buttons K1 is the Chinese Pinyin category, the soft keyboard input interface T1 displays a language-category switching button q1 labeled ".123", and in response to a click on it the processor 4 switches the language category of the virtual buttons K1 to the numbers and punctuation category.
  • as another example, when the current language category of the virtual buttons K1 is the numbers and punctuation category, the soft keyboard input interface T1 displays a language-category switching button q1 labeled "拼音" (Pinyin), and in response to a click on it the processor 4 switches the language category of the virtual buttons K1 to the Chinese Pinyin category.
  • please refer to FIGS. 9-11 together for an example of the input process when the current language category of the virtual buttons K1 of the soft keyboard input interface T1 is the Chinese Pinyin category.
  • the processor 4 selects the pinyin letters of a Chinese character in sequence in response to multiple first touch actions input on the first touch panel 31, for example "rou" as shown in FIG. 9.
  • for example, the processor 4 confirms selection of the pinyin letter "r" in response to the first touch action input on the first touch panel 31, then confirms selection of the pinyin letter "o" in response to the second first touch action input on the first touch panel 31, and then confirms selection of the pinyin letter "u" in response to the third first touch action input on the first touch panel 31, thereby selecting the pinyin letters "rou" in sequence.
  • the multiple first touch actions used to select the pinyin letters may be a continuous touch action that does not leave the first touch panel 31.
  • in another embodiment, the interval between the successive first touch actions used to input the pinyin letters of one Chinese character is less than a preset time; for example, the interval between the first touch actions inputting the pinyin letters "r" and "o" is less than the preset time, e.g., 2 s.
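Grouping the letter selections into one character's pinyin by their time gaps might look as follows; a minimal sketch, assuming timestamped selection events and the 2-second example interval.

```python
MAX_LETTER_GAP = 2.0  # seconds; the patent gives 2 s as an example interval

def group_pinyin_letters(events):
    """Group (timestamp, letter) selection events into pinyin syllables:
    consecutive letters whose time gap stays below MAX_LETTER_GAP belong to
    the same Chinese character, e.g. 'r', 'o', 'u' -> 'rou'."""
    syllables, current, last_t = [], [], None
    for t, letter in events:
        if last_t is not None and t - last_t >= MAX_LETTER_GAP:
            syllables.append("".join(current))   # gap too long: start a new syllable
            current = []
        current.append(letter)
        last_t = t
    if current:
        syllables.append("".join(current))
    return syllables

# group_pinyin_letters([(0.0, "r"), (0.8, "o"), (1.5, "u"), (5.0, "n"), (5.6, "i")])
# -> ["rou", "ni"]
```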
  • when the current language category of the virtual buttons K1 of the soft keyboard input interface T1 is the Chinese Pinyin category, the processor 4 further controls the soft keyboard input interface T1 to display a separation identifier F1.
  • the separation identifier F1 separates the pinyin letters of a Chinese character from the Chinese characters corresponding to those pinyin letters.
  • for example, as shown in FIG. 9, after selecting the pinyin letters "rou", the processor 4 displays the pinyin letters "rou" on the left side of the separation identifier F1, and displays the multiple Chinese characters corresponding to "rou" on its right side.
  • the separation mark F1 may be a vertical line.
  • when the first touch action on the first touch panel 31 stops or a third touch action is input on the first touch panel 31, the processor 4 determines that the pinyin letters of the current Chinese character have been completely input. For example, when the user's finger leaves the first touch panel 31, the current touch action is confirmed to have stopped.
  • in some embodiments, the third touch action is a touch action with a "√"-shaped touch track. When the pinyin letters have been input, the processor 4 controls display of the multiple Chinese characters corresponding to the input pinyin letters and highlights the first of them, indicating that the user will further select among them.
  • the processor 4 then continues to respond to a renewed first touch action input on the first touch panel 31 to determine the Chinese character to be input. For example, as shown in FIG. 10, after the pinyin letters of the current Chinese character have been input and a first touch action is input again on the first touch panel 31, the processor 4 highlights different Chinese characters as the sliding position changes, and after the touch dwells for the preset time on a certain character, such as "柔" (róu, "soft"), determines "柔" to be the Chinese character to be input. As shown in FIG. 11, the processor 4 then displays "柔" in the input box K2, completing the input of the character.
  • as shown in FIG. 11, after "柔" is input, the processor 4 is further configured to display the input character "柔" on one side of the separation identifier F1 and to display the words associated with "柔" on the other side of the separation identifier F1.
  • likewise, after the input of "柔" is completed, when the first touch action on the first touch panel 31 stops or a third touch action is input on the first touch panel 31, the processor 4 determines that the input of "柔" is finished, and continues to respond to a renewed first touch action on the first touch panel 31 to determine the next word to be input from among the associated words.
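The candidate and association displays around the separation identifier F1 could be rendered as in this sketch; the tiny lookup tables are stand-ins for a real pinyin dictionary and are purely illustrative.

```python
# hypothetical, minimal lookup tables; a real IME would use a full dictionary
CANDIDATES = {"rou": ["柔", "肉", "揉"]}    # pinyin -> matching characters
ASSOCIATIONS = {"柔": ["柔软", "柔和"]}     # character -> associated words

def render_pinyin_line(pinyin):
    """Show the typed pinyin left of the separator F1 (a vertical line) and
    the matching Chinese characters on its right."""
    return f'{pinyin} | {" ".join(CANDIDATES.get(pinyin, []))}'

def render_association_line(chosen):
    """After a character is input, show it on one side of F1 and its
    associated words on the other, ready for the next selection."""
    return f'{chosen} | {" ".join(ASSOCIATIONS.get(chosen, []))}'
```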
  • the processor 4 further controls the display device 2, in response to a fourth touch action input on the first touch panel 31, to return to the initial Chinese Pinyin interface shown in FIG. 6, with the virtual buttons K11 of all 26 pinyin letters and the common Chinese characters K12.
  • the fourth touch action may be a flick touch action in a certain direction.
  • the flick touch action may be a sweep across the first touch panel 31 in one direction with only brief touch contact with the first touch panel 31.
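Detecting such a flick might reduce to checking a short contact time together with enough travel, as in this sketch; both thresholds are invented for illustration.

```python
SHORT_CONTACT = 0.15  # seconds; hypothetical upper bound for "brief" contact
MIN_TRAVEL = 0.5      # hypothetical minimum travel distance for a flick

def is_flick(touch_down_t, touch_up_t, start_xy, end_xy):
    """Classify a touch as the fourth touch action: a quick sweep in one
    direction with only brief contact with the first touch panel."""
    contact_time = touch_up_t - touch_down_t
    dx = end_xy[0] - start_xy[0]
    dy = end_xy[1] - start_xy[1]
    travel = (dx * dx + dy * dy) ** 0.5
    return contact_time < SHORT_CONTACT and travel > MIN_TRAVEL
```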
  • the input box K2 may be an input box of a certain application software or system software, and the content input request is generated after the user clicks on the input box K2.
  • the application software may be a browser search toolbar, a short message input box, an audio and video player search bar, and the like.
  • in some embodiments, when the display device 2 is not displaying the soft keyboard input interface T1, that is, when no characters or other content are being input, the processor 4 further performs specific operations on the currently displayed page content or the currently open application in response to input from the first touch panel 31, the second touch panel 32, and/or the proximity sensor 33 of the touch input device 3.
  • for example, the processor 4 can control pointer movement, page dragging, and similar operations in response to a sliding touch on the first touch panel 31 and/or the second touch panel 32 of the touch input device 3.
  • the processor 4 can open a specific object or enter the next-level folder in response to a click operation on the first touch panel 31 or the second touch panel 32 of the touch input device 3.
  • for example, when the pointer has moved to an application, or an application has been selected by sliding, a click operation on the first touch panel 31 or the second touch panel 32 of the touch input device 3 opens that application.
  • as another example, when an audio/video player is currently open, the processor 4 can adjust parameters of the player such as volume and brightness in response to a sliding touch on the first touch panel 31 and/or the second touch panel 32 of the touch input device 3.
  • in some embodiments, when a sliding touch on the first touch panel 31 and/or the second touch panel 32 of the touch input device 3 moves the pointer to the input box K2, a click touch (single or double click) on the first touch panel 31 and/or the second touch panel 32 can generate the content input request.
  • the processor 4 may also perform specific operations on the currently displayed page content or the currently open application in response to gesture actions detected by the proximity sensor 33 of the touch input device 3.
  • for example, the processor 4 can adjust parameters of the audio/video player such as volume and brightness in response to a hovering gesture detected by the proximity sensor 33.
  • the touch input device 3 is disposed on the earphone device 1.
  • the earphone device 1 includes an earpiece 11 and an earphone loop 12.
  • the first touch panel 31 and the second touch panel 32 of the touch input device 3 are disposed on the outer surface of the earpiece 11 of the earphone device 1.
  • the first touch panel 31 is disposed in the outer ring region of the earpiece 11, and the second touch panel 32 is disposed in the central region of the earpiece 11.
  • the touch input device 3 can be a separate input device connected to the earphone device 1 or the display device 2 by wire or wirelessly.
  • the touch input device 3 can be a mouse-like device for the user to hold input.
  • the first touch panel 31 and the second touch panel 32 of the touch input device 3 detect the user's touch actions and generate touch sensing signals, and the processor 4 receives the touch sensing signals to determine the touch actions input on the first touch panel 31 and the second touch panel 32.
  • the proximity sensor 33 of the touch input device 3 detects the user's close-range gesture actions and generates sensing signals, and the processor 4 receives the sensing signals generated by the proximity sensor 33 to determine the close-range gesture actions detected by the proximity sensor 33.
  • the processor 4 can be a processing chip such as a central processing unit, a microcontroller, a microprocessor, a single chip microcomputer, or a digital signal processor.
  • the processor 4 can be located in the headset device 1 or in the display device 2.
  • FIG. 13 is a flowchart of a content input method according to an embodiment of the present invention.
  • the method is applied to the aforementioned head mounted display device 100.
  • the method includes the steps of:
  • in response to a content input request, the display device 2 is controlled to display a soft keyboard input interface T1 that includes a plurality of virtual buttons K1 arranged in a ring (S131).
  • in one embodiment, the content input request is generated by clicking on the input box K2, and the plurality of ring-arranged virtual buttons K1 are arranged around the input box K2.
  • the virtual button K1 selected for input is determined in response to a first touch action input on the first touch panel 31 of the touch input device 3 (S132). Specifically, step S132 includes: determining the virtual button K1 corresponding to the current touch position in response to a sliding touch on the first touch panel 31, and highlighting that virtual button K1; and, when the dwell time at the touch position corresponding to the highlighted virtual button K1 reaches a preset time, selecting the highlighted virtual button K1 as the virtual button K1 to be input.
  • the highlighting may be brightening the button, displaying it in a color different from the other virtual buttons K1, adding a special mark such as a circle around the virtual button K1, and the like.
  • when the virtual button K1 to be input is a character, the character, that is, the character on the virtual button K1, is output and displayed in the input box K2 (S133).
  • in some embodiments, the content input method further includes the step of: deleting characters entered in the input box K2 in response to a second touch action input on the second touch panel 32 of the touch input device 3 (S134).
  • the second touch action may be a back-and-forth sliding touch along a preset direction input on the second touch panel 32.
  • in some embodiments, step S134 includes: deleting a number of characters corresponding to the distance of the back-and-forth sliding touch input on the second touch panel 32.
  • in some embodiments, the content input method further includes the step of: switching the language category of the virtual buttons K1 displayed by the soft keyboard input interface T1 in response to a preset gesture detected by the proximity sensor 33 of the touch input device 3 (S135).
  • in some embodiments, the content input method further includes the step of: when the display device 2 is not displaying the soft keyboard input interface T1, the processor 4 further performs specific operations on the currently displayed page content or the currently open application in response to input from the first touch panel 31, the second touch panel 32, and/or the proximity sensor 33 of the touch input device 3.
  • FIG. 14 is a flowchart of a content input method when the language category of the virtual button K1 displayed by the soft keyboard input interface T1 is a Chinese pinyin category.
  • the method comprises the steps of:
  • the processor 4 selects the pinyin letters of a Chinese character in sequence in response to multiple first touch actions input on the first touch panel 31 (S141). For example, the processor 4 selects the pinyin letter "r" in response to the first touch action input on the first touch panel 31, then selects the pinyin letter "o" in response to the second first touch action input on the first touch panel 31, and then selects the pinyin letter "u" in response to the third first touch action input on the first touch panel 31, thereby selecting the pinyin letters "rou" in sequence.
  • the plurality of first touch actions for selecting the pinyin letters may be continuous touch actions without leaving the first touch pad 31.
  • when a third touch action is input on the first touch panel 31 or the touch action on the first touch panel 31 stops, the processor 4 determines that the pinyin letters of the current Chinese character have been completely input (S143). For example, when the user's finger leaves the first touch panel 31, the current touch action is confirmed to have stopped.
  • in some embodiments, the third touch action is a touch action with a "√"-shaped touch track.
  • the processor 4 controls the soft keyboard input interface T1 to display the Chinese characters matching the currently input pinyin letters (S144).
  • the processor 4 continues to respond to a renewed first touch action input on the first touch panel 31 to determine the Chinese character to be input (S145). For example, as shown in FIG. 10, after the pinyin letters of the current Chinese character have been input and a sliding touch is input again on the first touch panel 31, the processor 4 highlights the Chinese character corresponding to the sliding position as it changes, and after the touch dwells for the preset time on a certain character, such as "柔", determines "柔" to be the Chinese character to be input.
  • the processor 4 controls the Chinese character to be input to be displayed in the input box K2 (S146).
  • in some embodiments, the method further includes the steps of: the processor 4 further controls display of a separation identifier F1 in the soft keyboard input interface T1, displays the input Chinese character on one side of the separation identifier F1, and displays the words associated with the input character on the other side of the separation identifier F1; and the processor 4 continues to respond to a renewed first touch action input on the first touch panel 31 to determine the next word to be input from among the associated words.
  • in some embodiments, the method further includes the step of: the processor 4 further controls the soft keyboard input interface T1 to return to the initial Chinese Pinyin category interface in response to a fourth touch action input on the first touch panel 31.
  • the fourth touch action may be a flick touch action in a certain direction.
  • thus, the head-mounted display device 100 and the content input method of the present invention allow the user to input content conveniently while wearing the head-mounted display device.

Abstract

A content input method and a head-mounted display device. The content input method is applied to a head-mounted display device that includes an earphone device, a display device, and a touch input device. The method includes the steps of: in response to a content input request, controlling the display device to display a soft keyboard input interface that includes an input box and a plurality of virtual keys arranged in a ring (S131); determining the virtual key to be input in response to a first touch action input on a ring-shaped first touch panel of the touch input device (S132); and, when the virtual key to be input is determined to be a character, controlling the character to be output and displayed in the input box (S133). The head-mounted display device and content input method allow the user to input content conveniently while wearing the head-mounted display device.


Claims (20)

  1. A head-mounted display device, comprising an earphone device, a display device, a touch input device, and a processor, characterized in that the touch input device comprises a first touch panel configured to detect touch operations; the processor is configured to control the display device to display a soft keyboard input interface in response to a content input request, the soft keyboard input interface comprising an input box and a plurality of virtual keys arranged in a ring; and the processor is further configured to determine the virtual key to be input in response to a first touch action input on the first touch panel and, when the virtual key to be input is determined to be a character, to control the character to be output and displayed in the input box.
  2. The head-mounted display device according to claim 1, characterized in that the first touch action is a touch action that slides along a circular track of the first touch panel and dwells for a preset time at the position corresponding to the virtual key to be input.
  3. The head-mounted display device according to claim 2, characterized in that touch positions on the first touch panel correspond one-to-one to the positions of the virtual keys on the soft keyboard input interface; the processor is further configured to determine the current touch position during a sliding touch on the first touch panel and to highlight the virtual key corresponding to the current touch position; and the processor is configured to determine a highlighted virtual key to be the virtual key to be input when the touch dwells at the position corresponding to that key for the preset time.
  4. The head-mounted display device according to claim 3, characterized in that highlighting a virtual key comprises one of: brightening it, displaying it in a color different from the other virtual keys, and adding a special mark to it.
  5. The head-mounted display device according to any one of claims 1-4, characterized in that the touch input device further comprises a second touch panel, and the processor is further configured to control deletion of characters entered in the input box in response to a second touch action input on the second touch panel.
  6. The head-mounted display device according to claim 5, characterized in that the second touch action is a back-and-forth sliding touch along a preset direction input on the second touch panel, and the processor controls deletion of a number of characters corresponding to the distance of the back-and-forth sliding touch input on the second touch panel.
  7. The head-mounted display device according to claim 5, characterized in that the touch input device further comprises a proximity sensor, which may be disposed within the region of the first touch panel and/or the second touch panel and is configured to detect close-range non-contact gesture actions of the user; and the processor is further configured to control switching of the language category of the virtual keys displayed by the soft keyboard input interface in response to a preset gesture detected by the proximity sensor.
  8. The head-mounted display device according to claim 7, characterized in that the preset gesture is a non-contact unidirectional movement or a non-contact back-and-forth movement along a direction parallel to the surface of the touch input device, or a non-contact back-and-forth movement along a direction perpendicular to the surface of the touch input device.
  9. The head-mounted display device according to claim 7, characterized in that the language categories of the virtual keys displayed by the soft keyboard input interface comprise an uppercase letter category, a Chinese Pinyin category, a lowercase letter category, and a numbers and punctuation category.
  10. The head-mounted display device according to claim 9, characterized in that, when the language category of the virtual keys displayed by the soft keyboard input interface is the Chinese Pinyin category, the processor selects the pinyin letters of a Chinese character in sequence in response to multiple first touch actions on the first touch panel; the processor determines that the pinyin letters of the current Chinese character have been completely input when a third touch action is input on the first touch panel or the touch action on the first touch panel stops; the processor controls the soft keyboard input interface to display a number of Chinese characters matching the currently input pinyin letters; and the processor continues to respond to a renewed first touch action on the first touch panel to determine, from among those Chinese characters, the Chinese character to be input.
  11. The head-mounted display device according to claim 9, characterized in that the processor is further configured to control display of a separation identifier in the soft keyboard input interface, to display the input Chinese character on one side of the separation identifier, and to display the words associated with the input Chinese character on the other side of the separation identifier; and the processor is configured to continue responding to a renewed first touch action on the first touch panel to determine the next word to be input from among the associated words.
  12. The head-mounted display device according to any one of claims 1-4, characterized in that the touch input device is disposed on the earphone device, or the touch input device is independent of the display device and the earphone device and communicates with the display device via a wired or wireless connection.
  13. The head-mounted display device according to any one of claims 1-4, characterized in that the plurality of virtual keys arranged in a ring are arranged around the input box.
  14. A content input method applied to a head-mounted display device, the head-mounted display device comprising an earphone device, a display device, and a touch input device, characterized in that the method comprises the steps of:
    in response to a content input request, controlling the display device to display a soft keyboard input interface, the soft keyboard input interface comprising an input box and a plurality of virtual keys arranged in a ring;
    determining the virtual key to be input in response to a first touch action input on a first touch panel of the touch input device; and
    when the virtual key to be input is determined to be a character, controlling the character to be output and displayed in the input box.
  15. The content input method according to claim 14, characterized in that the step of determining the virtual key to be input in response to a first touch action input on the first touch panel comprises:
    in response to a touch action that slides along a circular track of the first touch panel and dwells for a preset time at the position corresponding to a certain virtual key, determining that virtual key to be the virtual key to be input.
  16. The content input method according to claim 15, characterized in that touch positions on the first touch panel correspond one-to-one to the positions of the virtual keys on the soft keyboard input interface, and the step of determining the virtual key to be input in response to a first touch action input on the ring-shaped first touch panel comprises:
    determining the current touch position during a sliding touch on the first touch panel and highlighting the virtual key corresponding to the current touch position, wherein highlighting comprises one of brightening the key, displaying it in a color different from the other virtual keys, and adding a special mark such as a circle around it; and
    when the touch dwells at the position corresponding to a highlighted virtual key for the preset time, determining the highlighted virtual key to be the virtual key to be input.
  17. The content input method according to any one of claims 14-16, characterized in that the touch input device further comprises a second touch panel, and the method further comprises:
    controlling deletion of characters entered in the input box in response to a second touch action input on the second touch panel.
  18. The content input method according to claim 17, characterized in that the second touch action is a back-and-forth sliding touch along a preset direction input on the second touch panel, and the step of controlling deletion of characters entered in the input box in response to the second touch action input on the second touch panel comprises:
    controlling deletion of a number of characters corresponding to the distance of the back-and-forth sliding touch input on the second touch panel.
  19. The content input method according to any one of claims 14-16, characterized in that the touch input device further comprises a proximity sensor disposed within the region of the first touch panel and/or the second touch panel and configured to detect close-range non-contact gesture actions of the user, and the method further comprises:
    controlling switching of the language category of the virtual keys displayed by the soft keyboard input interface in response to a preset gesture detected by the proximity sensor.
  20. The content input method according to claim 19, characterized in that the preset gesture is a non-contact unidirectional movement or a non-contact back-and-forth movement along a direction parallel to the surface of the touch input device, or a non-contact back-and-forth movement along a direction perpendicular to the surface of the touch input device.
PCT/CN2016/109011 2016-12-08 2016-12-08 Head-mounted display device and content input method thereof WO2018103040A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201680039083.2A 2016-12-08 2016-12-08 Head-mounted display device and content input method thereof
US16/330,215 2016-12-08 2016-12-08 Head mounted display device and content input method thereof
PCT/CN2016/109011 2016-12-08 2016-12-08 Head-mounted display device and content input method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/109011 2016-12-08 2016-12-08 Head-mounted display device and content input method thereof

Publications (1)

Publication Number Publication Date
WO2018103040A1 true WO2018103040A1 (zh) 2018-06-14

Family

ID=62004260

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/109011 Head-mounted display device and content input method thereof WO2018103040A1 (zh)

Country Status (3)

Country Link
US (1) US20190227688A1 (zh)
CN (1) CN107980110A (zh)
WO (1) WO2018103040A1 (zh)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2631760A1 (en) * 2012-02-24 2013-08-28 Research In Motion Limited Method and apparatus for providing a user interface on a device enabling selection of operations to be performed in relation to content
CN106873899A (zh) * 2017-03-21 2017-06-20 网易(杭州)网络有限公司 Input information acquisition method and apparatus, storage medium, and processor
CN109300478A (zh) * 2018-09-04 2019-02-01 上海交通大学 Auxiliary conversation device for the hearing-impaired
US11137908B2 (en) * 2019-04-15 2021-10-05 Apple Inc. Keyboard operation with head-mounted device
CN110351426B (zh) * 2019-05-31 2021-01-26 努比亚技术有限公司 Smart watch information input method, smart watch, and computer-readable storage medium
CN111093133B (zh) * 2019-12-13 2021-11-30 上海传英信息技术有限公司 Wireless device control method and apparatus, and computer-readable storage medium
CN111142675A (zh) * 2019-12-31 2020-05-12 维沃移动通信有限公司 Input method and head-mounted electronic device
WO2021208965A1 (zh) * 2020-04-14 2021-10-21 Oppo广东移动通信有限公司 Text input method, mobile device, head-mounted display device, and storage medium
CN112034995B (zh) * 2020-09-02 2023-09-12 中国银行股份有限公司 Display method and apparatus for an input-method input interface, storage medium, and electronic device
KR102222770B1 (ko) * 2020-11-04 2021-03-04 신고은 Message transmission apparatus and method
CN113093978A (zh) * 2021-04-21 2021-07-09 山东大学 Ring-shaped virtual keyboard input method and electronic device
CN113253908B (zh) * 2021-06-22 2023-04-25 腾讯科技(深圳)有限公司 Key function execution method, apparatus, device, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104484050A (zh) * 2015-01-04 2015-04-01 谭希韬 Method and system for character-input wearable glasses
CN104536607A (zh) * 2014-12-26 2015-04-22 广东小天才科技有限公司 Watch-based touch ring input method and device
CN105487229A (zh) * 2015-12-18 2016-04-13 济南中景电子科技有限公司 Multi-channel interactive virtual reality glasses
CN105929533A (zh) * 2015-02-18 2016-09-07 Lg电子株式会社 Head-mounted display
CN205620969U (zh) * 2016-01-11 2016-10-05 北京帕罗奥图科技有限公司 Touch processing system for a head-mounted smart device and head-mounted smart device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8661340B2 (en) * 2007-09-13 2014-02-25 Apple Inc. Input methods for device having multi-language environment
CN103824033A (zh) * 2014-01-02 2014-05-28 南京永泰电子有限公司 Touch display device for securely inputting password information and password input method
US20160202903A1 (en) * 2015-01-12 2016-07-14 Howard Gutowitz Human-Computer Interface for Graph Navigation

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104536607A (zh) * 2014-12-26 2015-04-22 广东小天才科技有限公司 Watch-based touch ring input method and device
CN104484050A (zh) * 2015-01-04 2015-04-01 谭希韬 Method and system for character-input wearable glasses
CN105929533A (zh) * 2015-02-18 2016-09-07 Lg电子株式会社 Head-mounted display
CN105487229A (zh) * 2015-12-18 2016-04-13 济南中景电子科技有限公司 Multi-channel interactive virtual reality glasses
CN205620969U (zh) * 2016-01-11 2016-10-05 北京帕罗奥图科技有限公司 Touch processing system for a head-mounted smart device and head-mounted smart device

Also Published As

Publication number Publication date
CN107980110A (zh) 2018-05-01
US20190227688A1 (en) 2019-07-25

Similar Documents

Publication Publication Date Title
WO2018103040A1 (zh) Head-mounted display device and content input method thereof
US11269575B2 (en) Devices, methods, and graphical user interfaces for wireless pairing with peripheral devices and displaying status information concerning the peripheral devices
JP6275839B2 (ja) Remote control device, information processing method, and system
US9354780B2 (en) Gesture-based selection and movement of objects
US10353550B2 (en) Device, method, and graphical user interface for media playback in an accessibility mode
KR102297473B1 (ko) Apparatus and method for providing touch input using the body
JP2014157578A (ja) Touch panel device, touch panel device control method, and program
KR102559030B1 (ko) Electronic device including a touch panel and control method thereof
TW201633106A (zh) Touch device and method for determining virtual keyboard keys
US20150220156A1 (en) Interface system for a computing device with visual proximity sensors and a method of interfacing with a computing device
JP5427940B1 (ja) Input device, angle input device, and program
WO2018112951A1 (zh) Head-mounted display device and content input method thereof
US10338692B1 (en) Dual touchpad system
TWI512592B (zh) Electronic device and user interface display method thereof
JP2018023792A (ja) Game device and program
WO2018191961A1 (zh) Head-mounted display device and content input method thereof
US20140006996A1 (en) Visual proximity keyboard
JP2014155856A (ja) Portable game device with touch-panel display
JP6126639B2 (ja) Portable game device with touch-panel display and game program
WO2019041171A1 (zh) Key operation prompting method and head-mounted display device
JP6204414B2 (ja) Game device and program
JP5769765B2 (ja) Portable game device with touch-panel display
KR20150049661A (ko) Touchpad input information processing apparatus and method
KR20180103366A (ko) Apparatus and method for providing responsive user interface
AU2013204699A1 (en) A headphone set and a connector therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16923429

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 17/10/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 16923429

Country of ref document: EP

Kind code of ref document: A1