WO2020216106A1 - A wearable computing device and a human-computer interaction method - Google Patents

A wearable computing device and a human-computer interaction method Download PDF

Info

Publication number
WO2020216106A1
Authority
WO
WIPO (PCT)
Prior art keywords
computing device
wearable computing
server
head
eyeball
Prior art date
Application number
PCT/CN2020/084815
Other languages
English (en)
French (fr)
Inventor
洪浛檩
Original Assignee
洪浛檩
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 洪浛檩 filed Critical 洪浛檩
Publication of WO2020216106A1 publication Critical patent/WO2020216106A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Definitions

  • The present invention relates to the field of human-computer interaction, and in particular to a wearable computing device and a human-computer interaction method.
  • Traditional human-computer interaction input devices (mouse, keyboard, and touch screen) were designed mainly for desktops, notebooks, and smartphones, and are therefore ill-suited to emerging wearable computing devices.
  • At present, the display component of a wearable computing device is mainly a head-mounted display, including but not limited to AR, VR, and MR glasses or headsets.
  • The human-computer interaction methods currently in use include:
  • Voice interaction: human voice is collected through a microphone and input to the wearable computing device as an operation instruction signal.
  • The advantage of voice interaction is that, unlike keyboard, mouse, and touch screen, it does not depend on the hands, so human-computer interaction can be achieved hands-free; but the disadvantages are also obvious.
  • (1) Stability is low: in a noisy environment the speech recognition rate drops sharply and recognition errors multiply, degrading the user experience; (2) voice interaction cannot be used in settings where speaking is inappropriate (such as meetings); (3) overly complex interactions, such as drawing, form-filling, and long text input, cannot be completed by voice.
  • Body interaction: the motions of the limbs, especially the arms, are captured by cameras and input to the wearable computing device as operation instruction signals.
  • The advantage of body interaction is that it suits complex information exchange, such as chart interaction and long text input, but the disadvantages are: (1) waving the hands in the air is inappropriate in some formal settings (such as meetings); (2) it occupies one or both hands to complete the interaction, so it frees the hands no more than using a mobile phone does.
  • Gesture interaction: the motion of one or more fingers is captured by a sliding touchpad on the side of the head-mounted display and input to the wearable computing device as an operation instruction signal.
  • The advantage of gesture interaction is that it resembles the touch-screen operation of a mobile phone, which ordinary users adapt to easily.
  • The disadvantages are: (1) it is not intuitive enough: on a touch screen, what you see can be touched immediately, whereas locating the screen coordinate the eyes are looking at through gesture interaction requires hand-eye coordination and a period of fumbling; (2) likewise, gesture interaction occupies one of the user's hands.
  • Brainwave interaction: human brainwave signals are collected by brainwave sensors and input to the wearable computing device as operation instruction signals. Brainwave interaction is still at the research stage; its error rate is high, its response rate low, and its stability poor, so it is unsuitable for engineering practice.
  • The purpose of the present invention is to provide a wearable computing device and a human-computer interaction method, so that users can interact with the head-mounted display more naturally and accurately, improving operation efficiency.
  • To this end, the present invention provides the following solutions:
  • A wearable computing device comprising: a bracket, a head-mounted display, eyeball cameras, a touch button ring, and a server. The head-mounted display and the server are arranged on the bracket; the eyeball cameras are arranged on the head-mounted display; the server is connected to the head-mounted display and the eyeball cameras respectively; the server is wirelessly connected to the touch button ring; and there are two eyeball cameras. The eyeball cameras send the captured eyeball images to the server for image processing, and the server extracts the position of the eye's gaze point on the screen from the eyeball images as a coordinate signal.
  • Optionally, the touch button ring includes: a microprocessor, a touch button, a sensor, and a wireless communication module.
  • The microprocessor is connected to the server through the wireless communication module, and the microprocessor is connected to the touch button through the sensor.
  • Optionally, the wireless communication module is at least one of a Bluetooth, WiFi, 4G, or 5G module.
  • Optionally, the head-mounted display is at least one of a VR box holding a mobile phone, a VR all-in-one headset, an AR projection headset, MR glasses, or a head-up display.
  • Optionally, the wearable computing device further includes an audio input device connected to the server.
  • Optionally, the wearable computing device further includes an audio output device connected to the server.
  • Optionally, the audio input device is a microphone.
  • Optionally, the audio output device is at least one of a speaker or a headset.
  • Optionally, the server includes a CPU and a memory; the CPU is connected to the memory; the CPU includes an arithmetic unit and a controller; and the memory includes an internal memory and an external memory.
  • A human-computer interaction method includes: obtaining the user's pupil coordinates from the eyeball images captured by the eyeball cameras; determining, according to the pupil coordinates, the icon of the application corresponding to the eye's gaze point on the screen; obtaining the user's operation on the touch button; obtaining the change trend of the user's pupil coordinates; and controlling the application corresponding to the icon according to the operation and the change trend.
  • The present invention discloses the following technical effects:
  • The present invention provides a wearable computing device and a human-computer interaction method.
  • By operating the touch button ring, a person can input many kinds of operation instructions to the computing device without speaking or waving the limbs, and the number of operation instructions that can be input far exceeds what voice interaction and body interaction can provide; users can interact with the head-mounted display more naturally and accurately, which improves operation efficiency.
  • Compared with gesture interaction, the interaction mode of the present invention frees both hands, so the user need not keep an arm raised to operate the touchpad on the side of the head-mounted display.
  • Compared with brainwave interaction, the method of the present invention has a lower error rate, faster response, and better stability, and is more suitable for engineering practice.
  • FIG. 1 is a first structural diagram of the wearable computing device of the present invention.
  • FIG. 2 is a second structural diagram of the wearable computing device of the present invention.
  • FIG. 3 is a schematic diagram of the hardware structure of the server of the present invention.
  • FIG. 4 is a schematic diagram of the circuit structure of the touch button ring 4 of the present invention.
  • FIG. 5 is a general structural diagram of the wearable computing device of the present invention.
  • FIG. 6 is a flowchart of the human-computer interaction method of the present invention.
  • The purpose of the present invention is to provide a wearable computing device and a human-computer interaction method.
  • The wearable computing device includes: a bracket, a head-mounted display, eyeball cameras, a touch button ring, and a server. The head-mounted display and the server are arranged on the bracket; the eyeball cameras are arranged on the head-mounted display; the server is connected to the head-mounted display and the eyeball cameras respectively; the server is wirelessly connected to the touch button ring; and there are two eyeball cameras.
  • The invention enables the user to interact with the head-mounted display more naturally and accurately, improving operation efficiency.
  • FIG. 1 is a first structural diagram of the wearable computing device of the present invention, and FIG. 2 is a second structural diagram.
  • As shown in FIG. 1 and FIG. 2, the wearable computing device includes:
  • a bracket 1, a head-mounted display 2, eyeball cameras 3, a touch button ring 4, and a server (not shown in FIG. 1 or FIG. 2); there are two eyeball cameras 3. The head-mounted display 2 and the server are arranged on the bracket 1; the eyeball cameras 3 are arranged on the head-mounted display 2; the server is connected to the head-mounted display 2 and the eyeball cameras 3 respectively; and the server is wirelessly connected to the touch button ring 4.
  • FIG. 3 is a schematic diagram of the hardware structure of the server of the present invention.
  • As shown in FIG. 3, the server includes a CPU and a memory; the CPU is connected to the memory; the CPU includes an arithmetic unit 9 and a controller 10; and the memory includes an internal memory 7 and an external memory 8.
  • The server also includes common computing-device interfaces and accessories such as a wireless communication module, USB socket, SIM card socket, SD card socket, battery, power socket, power supply cord, power button, restart button, and housing. These interfaces and accessories mainly serve to extend the functions of the device.
  • The main function of the head-mounted display 2 is to present the screen image before the user's eyes. It includes, but is not limited to, a virtual reality (VR) box holding a mobile phone, a VR all-in-one headset, an augmented reality (AR) projection headset, mixed reality (MR) glasses, a head-up display, and head-wearable display devices formed by combining these approaches.
  • The main function of the eyeball cameras 3 is to extract the position of the eye's gaze point on the screen as a coordinate signal, analogous to the position coordinates a mouse pointer feeds to a computer. The working principle is: the eyeball cameras are aimed at the eyes, and the eyeball images captured in real time are sent to the server for image processing.
  • In this embodiment, the server's chip is a Qualcomm Snapdragon 425 (64-bit quad-core).
  • The touch button ring 4 is provided with a touch button 5 and a finger ring 6, and further includes a microprocessor, a sensor, and a wireless communication module.
  • The microprocessor is connected to the server through the wireless communication module, and the microprocessor is connected to the touch button 5 through the sensor.
  • The sensor includes a movement detection module and a key detection module.
  • As shown in FIG. 4, the circuit structure of the touch button ring 4 includes: a microprocessor 41, a movement detection module 42, a key detection module 43, a wireless communication module 44, and a battery 45.
  • The microprocessor 41 is connected to the touch button 5 through the movement detection module 42 and the key detection module 43, and the microprocessor 41 is connected to the server through the wireless communication module 44.
  • The wireless communication module 44 includes, but is not limited to, Bluetooth, WiFi, 4G, and 5G modules.
  • The battery 45 supplies power to the microprocessor 41.
  • The main function of the touch button ring 4 is: worn on a finger via the finger ring 6, it lets a person operate the touch button 5 with a finger to send operation instruction signals to the wearable computing device, similar to the functions of the left button, right button, and scroll wheel of a mouse.
  • The working principle is: the touch button sends real-time "button down", "button up", and "finger movement direction" signals to the microprocessor 41, which forwards them to the server through the wireless communication module 44.
  • FIG. 5 is a general structural diagram of the wearable computing device of the present invention. As shown in FIG. 5, the wearable computing device further includes an audio input device connected to the server.
  • The audio input device includes, but is not limited to, a microphone.
  • The wearable computing device further includes an audio output device connected to the server.
  • The audio output device includes, but is not limited to, speakers and headphones.
  • The present invention provides a wearable computing device.
  • By operating the touch button ring, a person can input many kinds of operation instructions to the computing device without speaking or waving the limbs, and the number of operation instructions that can be input far exceeds what voice interaction and body interaction can provide.
  • Compared with gesture interaction, the interaction mode of the present invention frees both hands, so the user need not keep an arm raised to operate the touchpad on the side of the head-mounted display.
  • Compared with brainwave interaction, the method of the present invention has a lower error rate, faster response, and better stability, and is more suitable for engineering practice.
  • FIG. 6 is a flowchart of the human-computer interaction method of the present invention. As shown in FIG. 6, the human-computer interaction method includes:
  • Step 601: Obtain the user's pupil coordinates.
  • Step 602: Determine, according to the pupil coordinates, the application icon corresponding to the eye's gaze point on the screen.
  • Step 603: Obtain the user's operation on the touch button.
  • Step 604: Obtain the change trend of the user's pupil coordinates.
  • Step 605: Control the application corresponding to the icon according to the operation and the change trend.
  • After the device is powered on, the head-mounted display 2 presents a desktop image before the user's eyes, on which icons of several applications and buttons are displayed.
  • The server analyzes the eyeball images captured by the eyeball cameras 3, extracts the pupil coordinates in real time, resolves the line-of-sight angle, and computes the coordinates of the gaze point on the screen to determine the icon the user intends to operate. Then, from the acquired operation signals of the touch button ring 4, it determines the user's operation intention and operates the icon on the head-mounted display 2 accordingly.
  • The operation signals of the touch button ring 4 are mainly of three types: button down, button up, and finger movement direction.
  • The server determines the user's operation intention from these three signals.
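  • To make the pipeline above concrete, the following is a minimal sketch of the server-side interaction loop it implies. It is an illustrative assumption, not code from the patent: get_gaze_point, hit_test_icon, poll_ring, and dispatch are hypothetical stand-ins for the image-processing, user-interface, and wireless-receive layers.

        # Hypothetical server-side loop (a sketch under assumed interfaces):
        # map the gaze point to an icon, then act on incoming ring signals.
        import time

        def interaction_loop(get_gaze_point, hit_test_icon, poll_ring, dispatch):
            """get_gaze_point() -> (x, y) screen coordinates from the eyeball cameras;
            hit_test_icon(x, y) -> the icon under the gaze point, or None;
            poll_ring() -> 'button_down', 'button_up', ('move', dx, dy), or None;
            dispatch(icon, event) -> perform the operation on the icon."""
            while True:
                x, y = get_gaze_point()     # coordinate signal, like a mouse pointer
                icon = hit_test_icon(x, y)  # icon the user is currently looking at
                event = poll_ring()         # operation instruction signal, if any
                if icon is not None and event is not None:
                    dispatch(icon, event)   # e.g. activate, context menu, drag
                time.sleep(0.01)            # polling interval is an assumption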

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention discloses a wearable computing device and a human-computer interaction method. The wearable computing device comprises: a bracket, a head-mounted display, eyeball cameras, a touch button ring, and a server. The head-mounted display and the server are arranged on the bracket; the eyeball cameras are arranged on the head-mounted display; the server is connected to the head-mounted display and the eyeball cameras respectively; the server is wirelessly connected to the touch button ring; and there are two eyeball cameras. The invention enables users to interact with the head-mounted display more naturally and accurately, improving operation efficiency.

Description

A wearable computing device and a human-computer interaction method
Technical Field
The present invention relates to the field of human-computer interaction, and in particular to a wearable computing device and a human-computer interaction method.
Background Art
Traditional human-computer interaction input devices (mouse, keyboard, and touch screen) were designed mainly for desktops, notebooks, and smartphones, and are therefore unsuitable for emerging wearable computing devices. At present, the display component of a wearable computing device is mainly a head-mounted display, including but not limited to AR, VR, and MR glasses or headsets. The human-computer interaction methods in use include:
1. Voice interaction, in which human voice is collected through a microphone and input to the wearable computing device as an operation instruction signal. Its advantage is that, unlike keyboard, mouse, and touch screen, it does not depend on the hands, so interaction can be hands-free; but its disadvantages are also obvious: (1) stability is low, because in noisy environments the speech recognition rate drops sharply and recognition errors multiply, degrading the user experience; (2) it cannot be used in settings where speaking is inappropriate (such as meetings); (3) overly complex interactions, such as drawing, form-filling, and long text input, cannot be completed by voice.
2. Body interaction, in which the motions of the limbs, especially the arms, are captured by cameras and input to the wearable computing device as operation instruction signals. Its advantage is suitability for complex information exchange, such as chart interaction and long text input; its disadvantages are: (1) waving the hands in the air is inappropriate in some formal settings (such as meetings); (2) it occupies one or both hands to complete the interaction, so it frees the hands no more than using a mobile phone does.
3. Gesture interaction, in which the motion of one or more fingers is captured by a sliding touchpad on the side of the head-mounted display and input to the wearable computing device as an operation instruction signal. Its advantage is that it resembles the touch-screen operation of a mobile phone, so ordinary users adapt easily; its disadvantages are: (1) it is not intuitive enough: on a touch screen, what you see can be touched immediately, whereas locating the screen coordinate the eyes are looking at through gesture interaction requires hand-eye coordination and a period of fumbling; (2) likewise, it occupies one of the user's hands.
4. Brainwave interaction, in which human brainwave signals are collected by brainwave sensors and input to the wearable computing device as operation instruction signals. Brainwave interaction is still at the research stage; its error rate is high, its response rate low, and its stability poor, so it is unsuitable for engineering practice.
Because of their respective defects, none of the above interaction methods, whether used alone or in combination, lets a person interact naturally and accurately with a head-mounted display, and therefore none allows efficient operation of a wearable computing device.
Summary of the Invention
The purpose of the present invention is to provide a wearable computing device and a human-computer interaction method that let users interact with the head-mounted display more naturally and accurately, improving operation efficiency.
To achieve the above purpose, the present invention provides the following solutions:
A wearable computing device, comprising: a bracket, a head-mounted display, eyeball cameras, a touch button ring, and a server; the head-mounted display and the server are arranged on the bracket; the eyeball cameras are arranged on the head-mounted display; the server is connected to the head-mounted display and the eyeball cameras respectively; the server is wirelessly connected to the touch button ring; and there are two eyeball cameras. The eyeball cameras send the captured eyeball images to the server for image processing, and the server extracts the position of the eye's gaze point on the screen from the eyeball images as a coordinate signal.
Optionally, the touch button ring comprises: a microprocessor, a touch button, a sensor, and a wireless communication module; the microprocessor is connected to the server through the wireless communication module, and the microprocessor is connected to the touch button through the sensor.
Optionally, the wireless communication module is at least one of a Bluetooth, WiFi, 4G, or 5G module.
Optionally, the head-mounted display is at least one of a VR box holding a mobile phone, a VR all-in-one headset, an AR projection headset, MR glasses, or a head-up display.
Optionally, the wearable computing device further comprises an audio input device connected to the server.
Optionally, the wearable computing device further comprises an audio output device connected to the server.
Optionally, the audio input device is a microphone.
Optionally, the audio output device is at least one of a speaker or a headset.
Optionally, the server comprises a CPU and a memory; the CPU is connected to the memory; the CPU comprises an arithmetic unit and a controller; and the memory comprises an internal memory and an external memory.
A human-computer interaction method, comprising:
obtaining the user's pupil coordinates from the eyeball images captured by the eyeball cameras;
determining, according to the pupil coordinates, the icon of the application corresponding to the eye's gaze point on the screen;
obtaining the user's operation on the touch button;
obtaining the change trend of the user's pupil coordinates; and
controlling the application corresponding to the icon according to the operation and the change trend.
According to the specific embodiments provided, the present invention discloses the following technical effects:
With the wearable computing device and human-computer interaction method provided by the present invention, a person can, by operating the touch button ring, input many kinds of operation instructions to the computing device without speaking or waving the limbs, and the number of operation instructions that can be input far exceeds what voice interaction and body interaction can provide; users can interact with the head-mounted display more naturally and accurately, which improves operation efficiency.
Compared with gesture interaction, the interaction mode of the present invention frees both hands, so the user need not keep an arm raised to operate the touchpad on the side of the head-mounted display.
Compared with brainwave interaction, the method of the present invention has a lower error rate, faster response, and better stability, and is more suitable for engineering practice.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those of ordinary skill in the art can derive other drawings from them without creative effort.
FIG. 1 is a first structural diagram of the wearable computing device of the present invention;
FIG. 2 is a second structural diagram of the wearable computing device of the present invention;
FIG. 3 is a schematic diagram of the hardware structure of the server of the present invention;
FIG. 4 is a schematic diagram of the circuit structure of the touch button ring 4 of the present invention;
FIG. 5 is a general structural diagram of the wearable computing device of the present invention;
FIG. 6 is a flowchart of the human-computer interaction method of the present invention.
Detailed Description of the Embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
The purpose of the present invention is to provide a wearable computing device and a human-computer interaction method. The wearable computing device comprises: a bracket, a head-mounted display, eyeball cameras, a touch button ring, and a server; the head-mounted display and the server are arranged on the bracket; the eyeball cameras are arranged on the head-mounted display; the server is connected to the head-mounted display and the eyeball cameras respectively; the server is wirelessly connected to the touch button ring; and there are two eyeball cameras. The invention enables users to interact with the head-mounted display more naturally and accurately, improving operation efficiency.
To make the above purpose, features, and advantages of the present invention clearer and easier to understand, the present invention is described in further detail below with reference to the drawings and specific embodiments.
The present invention provides a wearable computing device. FIG. 1 is a first structural diagram of the wearable computing device of the present invention, and FIG. 2 is a second structural diagram. As shown in FIG. 1 and FIG. 2, the wearable computing device comprises:
a bracket 1, a head-mounted display 2, eyeball cameras 3, a touch button ring 4, and a server (not shown in FIG. 1 or FIG. 2); there are two eyeball cameras 3. The head-mounted display 2 and the server are arranged on the bracket 1; the eyeball cameras 3 are arranged on the head-mounted display 2; the server is connected to the head-mounted display 2 and the eyeball cameras 3 respectively; and the server is wirelessly connected to the touch button ring 4.
FIG. 3 is a schematic diagram of the hardware structure of the server of the present invention. As shown in FIG. 3, the server comprises a CPU and a memory; the CPU is connected to the memory; the CPU comprises an arithmetic unit 9 and a controller 10; and the memory comprises an internal memory 7 and an external memory 8. In addition, the server includes common computing-device interfaces and accessories such as a wireless communication module, USB socket, SIM card socket, SD card socket, battery, power socket, power supply cord, power button, restart button, and housing. These interfaces and accessories mainly serve to extend the functions of the device.
The main function of the head-mounted display 2 is to present the screen image before the user's eyes. It includes, but is not limited to, a virtual reality (VR) box holding a mobile phone, a VR all-in-one headset, an augmented reality (AR) projection headset, mixed reality (MR) glasses, a head-up display, and head-wearable display devices formed by combining these approaches.
The main function of the eyeball cameras 3 is to extract the position of the eye's gaze point on the screen as a coordinate signal, analogous to the position coordinates a mouse pointer feeds to a computer. The working principle is as follows: the eyeball cameras are aimed at the eyes, and the eyeball images captured in real time are sent to the server for image processing; in this embodiment, the server's chip is a Qualcomm Snapdragon 425 (64-bit quad-core).
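The patent fixes only that the eyeball images are processed on the server into a gaze coordinate; it does not name an image-processing algorithm. As one plausible sketch (assuming OpenCV, a dark-pupil thresholding step, and a pre-computed per-user affine calibration; the threshold value and the calibration scheme are assumptions, not the inventor's method):

    # Assumed pupil-to-screen pipeline (a sketch, not the patented method):
    # locate the pupil as the largest dark blob, then map its center to
    # screen coordinates with a pre-calibrated affine transform.
    import cv2
    import numpy as np

    def pupil_center(eye_gray):
        """Return the (px, py) pupil center in image coordinates, or None."""
        # The pupil is the darkest region; 50 is an assumed threshold.
        _, mask = cv2.threshold(eye_gray, 50, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        blob = max(contours, key=cv2.contourArea)   # largest dark blob
        m = cv2.moments(blob)
        if m["m00"] == 0:
            return None
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])

    def gaze_on_screen(pupil_xy, calib):
        """Map pupil coordinates to screen coordinates using a 2x3 affine
        matrix obtained beforehand by having the user fixate known
        on-screen targets (the calibration step itself is not shown)."""
        px, py = pupil_xy
        sx, sy = calib @ np.array([px, py, 1.0])
        return (sx, sy)

A line-of-sight angle model, as the description mentions, or the second camera per eye would refine this mapping, but any such refinement goes beyond what the text specifies.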
The touch button ring 4 is provided with a touch button 5 and a finger ring 6, and further comprises a microprocessor, a sensor, and a wireless communication module; the microprocessor is connected to the server through the wireless communication module, and to the touch button 5 through the sensor. The sensor comprises a movement detection module and a key detection module.
FIG. 4 is a schematic diagram of the circuit structure of the touch button ring 4. As shown in FIG. 4, the circuit comprises: a microprocessor 41, a movement detection module 42, a key detection module 43, a wireless communication module 44, and a battery 45. The microprocessor 41 is connected to the touch button 5 through the movement detection module 42 and the key detection module 43, and to the server through the wireless communication module 44. The wireless communication module 44 includes, but is not limited to, Bluetooth, WiFi, 4G, and 5G modules. The battery 45 powers the microprocessor 41.
The main function of the touch button ring 4 is as follows: worn on a finger via the finger ring 6, it lets the user operate the touch button 5 with a finger to send operation instruction signals to the wearable computing device, analogous to the left button, right button, and scroll wheel of a mouse. The working principle is: the touch button sends real-time "button down", "button up", and "finger movement direction" signals to the microprocessor 41, which forwards them to the server through the wireless communication module 44.
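The description fixes the three signal types the ring emits but not the packet format or transport protocol. A sketch of the ring-side event logic under assumed interfaces (the JSON packet format and the injected wireless send callable are illustrative choices, not the patent's):

    # Hypothetical ring-side event logic (a sketch; packet format assumed).
    # Emits the three signals named in the description: "button down",
    # "button up", and "finger movement direction".
    import json

    class RingSignaler:
        def __init__(self, send):
            self.send = send          # wireless transmit, e.g. over Bluetooth
            self.pressed = False

        def poll(self, button_pressed, dx, dy):
            # Edge-detect the key detection module's state to produce
            # discrete down/up events.
            if button_pressed and not self.pressed:
                self.send(json.dumps({"type": "button_down"}).encode())
            elif not button_pressed and self.pressed:
                self.send(json.dumps({"type": "button_up"}).encode())
            self.pressed = button_pressed
            # Any nonzero displacement reported by the movement detection
            # module becomes a "finger movement direction" signal.
            if dx or dy:
                self.send(json.dumps({"type": "move", "dx": dx, "dy": dy}).encode())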
FIG. 5 is a general structural diagram of the wearable computing device of the present invention. As shown in FIG. 5, the wearable computing device further comprises an audio input device connected to the server; the audio input device includes, but is not limited to, a microphone. The wearable computing device further comprises an audio output device connected to the server; the audio output device includes, but is not limited to, speakers and headphones.
Through the combined use of these display and input devices, a person can operate the wearable computing device the way a computer is operated with a mouse; the interaction is natural, accurate, and efficient.
When operating a wearable computing device, a person always naturally looks first at the content or icons presented by the interactive interface in the head-mounted display, and then decides whether to keep watching or to issue an interaction command. Therefore, if the user does not touch the touch button ring, the wearable computing device need not acquire or execute any operation; if the user does use the ring, the device receives the corresponding operation instruction signal and carries out actions such as opening, right-clicking, dragging, page-turning, and flipping on the user's behalf. With the wearable computing device provided by the present invention, a person can, by operating the touch button ring, input many kinds of operation instructions without speaking or waving the limbs, far more than voice interaction and body interaction can provide. Compared with gesture interaction, the interaction mode of the present invention frees both hands, so the user need not keep an arm raised to operate the touchpad on the side of the head-mounted display. Compared with brainwave interaction, it has a lower error rate, faster response, and better stability, and is more suitable for engineering practice.
The present invention also provides a human-computer interaction method. FIG. 6 is a flowchart of the method. As shown in FIG. 6, the human-computer interaction method comprises:
Step 601: Obtain the user's pupil coordinates.
Step 602: Determine, according to the pupil coordinates, the icon of the application corresponding to the eye's gaze point on the screen.
Step 603: Obtain the user's operation on the touch button.
Step 604: Obtain the change trend of the user's pupil coordinates.
Step 605: Control the application corresponding to the icon according to the operation and the change trend.
First, after the wearable computing device is powered on, the head-mounted display 2 presents a desktop image before the user's eyes, on which the icons of several applications and buttons are arrayed. The server then analyzes, in real time, the pupil coordinates from the eyeball images captured by the eyeball cameras 3, resolves the line-of-sight angle, computes the coordinates of the gaze point on the screen, and determines the icon the user intends to operate. Next, from the acquired operation signals of the touch button ring 4, it determines the user's operation intention and operates the icon on the head-mounted display 2 accordingly.
The operation signals of the touch button ring 4 are mainly of three types: button down, button up, and finger movement direction. The server determines the user's operation intention from these three signals.
For example:
(1) If the eye gazes at an icon and the user clicks the touch button 5 on the ring once, i.e., a "button down" signal is followed quickly by a "button up" signal, the application or button is activated, as with a double-click of the left mouse button.
(2) If the eye gazes at an icon and the user long-presses the touch button 5, i.e., the "button up" signal follows the "button down" signal after a delay, the extra menu of the application or button is opened, as with a right-click of the mouse.
(3) If the eye gazes at an icon and the user holds down the touch button 5 while moving the gaze point, i.e., a "button down" signal is sent while the gaze coordinates move, the icon's position on the desktop is rearranged, as when dragging with the left mouse button.
(4) If the user slides or scrolls the touch button 5 up and down on the ring, i.e., a "finger movement direction" signal is sent, the screen is scrolled up and down as with a mouse wheel.
(5) If the user holds down the touch button 5 and slides or scrolls it up, down, left, or right, i.e., a "button down" signal is followed by a "finger movement direction" signal, a 3D model on the screen is rotated, as when the mouse wheel is pressed and the mouse is moved forward, back, left, or right.
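The five behaviors amount to classifying a completed ring interaction by its timing and by whether the gaze or finger moved while the button was held. A compact sketch (the 0.5 s long-press boundary and the 30-pixel gaze tolerance are assumed numbers; only the five categories come from the description):

    # Hypothetical gesture classifier for the five examples (a sketch).
    LONG_PRESS_S = 0.5    # assumed boundary between click and long press
    DRAG_PIXELS = 30      # assumed gaze-movement tolerance during a press

    def classify(press_duration, gaze_moved_px, scrolled, scrolled_with_press):
        """Map one completed ring interaction to a mouse-like action."""
        if scrolled and scrolled_with_press:
            return "rotate_3d_model"    # (5) press + slide: flip the 3D model
        if scrolled:
            return "scroll_page"        # (4) slide alone: wheel-like scrolling
        if gaze_moved_px > DRAG_PIXELS:
            return "drag_icon"          # (3) hold + move gaze: drag the icon
        if press_duration >= LONG_PRESS_S:
            return "open_context_menu"  # (2) long press: right-click menu
        return "activate"               # (1) quick click: open the application

    # For example, classify(0.1, 2, False, False) yields "activate".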
Specific examples have been used herein to explain the principles and implementations of the present invention; the description of the above embodiments is intended only to help in understanding the method and core idea of the present invention. Meanwhile, those of ordinary skill in the art may, in accordance with the idea of the present invention, make changes to the specific implementation and scope of application. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

  1. A wearable computing device, characterized by comprising: a bracket, a head-mounted display, eyeball cameras, a touch button ring, and a server; the head-mounted display and the server are arranged on the bracket; the eyeball cameras are arranged on the head-mounted display; the server is connected to the head-mounted display and the eyeball cameras respectively; the server is wirelessly connected to the touch button ring; and there are two eyeball cameras;
    the eyeball cameras send the captured eyeball images to the server for image processing, and the server extracts the position of the eye's gaze point on the screen from the eyeball images as a coordinate signal.
  2. The wearable computing device according to claim 1, characterized in that the touch button ring comprises: a microprocessor, a touch button, a sensor, and a wireless communication module; the microprocessor is connected to the server through the wireless communication module, and the microprocessor is connected to the touch button through the sensor.
  3. The wearable computing device according to claim 2, characterized in that the wireless communication module is at least one of a Bluetooth, WiFi, 4G, or 5G module.
  4. The wearable computing device according to claim 1, characterized in that the head-mounted display is at least one of a VR box holding a mobile phone, a VR all-in-one headset, an AR projection headset, MR glasses, or a head-up display.
  5. The wearable computing device according to claim 1, characterized in that the wearable computing device further comprises an audio input device, the audio input device being connected to the server.
  6. The wearable computing device according to claim 1, characterized in that the wearable computing device further comprises an audio output device, the audio output device being connected to the server.
  7. The wearable computing device according to claim 5, characterized in that the audio input device is a microphone.
  8. The wearable computing device according to claim 6, characterized in that the audio output device is at least one of a speaker or a headset.
  9. The wearable computing device according to claim 1, characterized in that the server comprises a CPU and a memory; the CPU is connected to the memory; the CPU comprises an arithmetic unit and a controller; and the memory comprises an internal memory and an external memory.
  10. A human-computer interaction method, characterized by comprising: obtaining the user's pupil coordinates from the eyeball images captured by the eyeball cameras;
    determining, according to the pupil coordinates, the icon of the application corresponding to the eye's gaze point on the screen; obtaining the user's operation on the touch button;
    obtaining the change trend of the user's pupil coordinates; and
    controlling the application corresponding to the icon according to the operation and the change trend.
PCT/CN2020/084815 2019-04-24 2020-04-15 A wearable computing device and a human-computer interaction method WO2020216106A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910332453.7 2019-04-24
CN201910332453.7A CN110069101B (zh) 2019-04-24 2019-04-24 A wearable computing device and a human-computer interaction method

Publications (1)

Publication Number Publication Date
WO2020216106A1 true WO2020216106A1 (zh) 2020-10-29

Family

ID=67368723

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/084815 WO2020216106A1 (zh) A wearable computing device and a human-computer interaction method

Country Status (2)

Country Link
CN (1) CN110069101B (zh)
WO (1) WO2020216106A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110069101B (zh) * 2019-04-24 2024-04-02 洪浛檩 A wearable computing device and a human-computer interaction method
CN111061372B (zh) * 2019-12-18 2023-05-02 Oppo广东移动通信有限公司 Device control method and related device
CN113413585B (zh) * 2021-06-21 2024-03-22 Oppo广东移动通信有限公司 Interaction method and apparatus for a head-mounted display device, and electronic device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101840265A (zh) * 2009-03-21 2010-09-22 深圳富泰宏精密工业有限公司 Visual perception device and control method thereof
CN103838378A (zh) * 2014-03-13 2014-06-04 广东石油化工学院 Head-mounted eye control system based on pupil recognition and positioning
CN106214118A (zh) * 2016-01-28 2016-12-14 北京爱生科贸有限公司 Virtual-reality-based eye movement monitoring system
CN108985291A (zh) * 2018-08-07 2018-12-11 东北大学 Binocular tracking system based on a single camera
CN110069101A (zh) * 2019-04-24 2019-07-30 洪浛檩 A wearable computing device and a human-computer interaction method

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101853096A (zh) * 2010-05-21 2010-10-06 程喜庆 Touch-separated wireless mouse-control finger ring and touchpad
CN101890719B (zh) * 2010-07-09 2015-06-03 中国科学院深圳先进技术研究院 Robot remote control device and robot system
CN202533867U (zh) * 2012-04-17 2012-11-14 北京七鑫易维信息技术有限公司 Head-mounted eye-controlled display terminal
CN203102173U (zh) * 2013-01-21 2013-07-31 昆明理工大学 Smart ring with wireless mouse function
CN105302451A (zh) * 2014-07-23 2016-02-03 吴建伟 Gesture input device
KR101709611B1 (ko) * 2014-10-22 2017-03-08 윤영기 Smart glasses equipped with a display and camera, and spatial touch input and correction method using the same
CN105068648A (zh) * 2015-08-03 2015-11-18 众景视界(北京)科技有限公司 Head-mounted intelligent interaction system
CN205942629U (zh) * 2016-07-06 2017-02-08 童宗伟 Finger-ring wireless mouse
CN206270890U (zh) * 2016-10-18 2017-06-20 莫汝森 Ring mouse
CN106444086B (zh) * 2016-10-27 2018-12-14 吴元旦 Ring-controlled smart glasses and method of use
CN108334188A (zh) * 2017-01-20 2018-07-27 深圳纬目信息技术有限公司 Head-mounted display device with eye control
CN106980370A (zh) * 2017-03-14 2017-07-25 无锡云瞳信息科技有限公司 Wearable smart glasses with multiple interaction modes
CN107368187A (zh) * 2017-07-12 2017-11-21 深圳纬目信息技术有限公司 Head-mounted display device with dual interaction control
KR102053367B1 (ko) * 2017-10-11 2019-12-09 오익재 Wearable interface device
CN108536285B (zh) * 2018-03-15 2021-05-14 中国地质大学(武汉) Mouse interaction method and system based on eye movement recognition and control
CN209803661U (zh) * 2019-04-24 2019-12-17 洪浛檩 A wearable computing device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101840265A (zh) * 2009-03-21 2010-09-22 深圳富泰宏精密工业有限公司 Visual perception device and control method thereof
CN103838378A (zh) * 2014-03-13 2014-06-04 广东石油化工学院 Head-mounted eye control system based on pupil recognition and positioning
CN106214118A (zh) * 2016-01-28 2016-12-14 北京爱生科贸有限公司 Virtual-reality-based eye movement monitoring system
CN108985291A (zh) * 2018-08-07 2018-12-11 东北大学 Binocular tracking system based on a single camera
CN110069101A (zh) * 2019-04-24 2019-07-30 洪浛檩 A wearable computing device and a human-computer interaction method

Also Published As

Publication number Publication date
CN110069101A (zh) 2019-07-30
CN110069101B (zh) 2024-04-02

Similar Documents

Publication Publication Date Title
CN109891367B (zh) Generating virtual symbol surfaces using gestures in augmented and/or virtual reality environments
US10534447B2 (en) Multi-surface controller
WO2020216106A1 (zh) A wearable computing device and a human-computer interaction method
US10121063B2 (en) Wink gesture based control system
CN107003750B (zh) Multi-surface controller
JP5900393B2 (ja) Information processing apparatus, operation control method, and program
EP2889718A1 (en) A natural input based virtual ui system for electronic devices
US10254844B2 (en) Systems, methods, apparatuses, computer readable medium for controlling electronic devices
EP3090331B1 (en) Systems with techniques for user interface control
CN110456907A (zh) Virtual screen control method and apparatus, terminal device, and storage medium
JP6165485B2 (ja) AR gesture user interface system for mobile terminals
KR102297473B1 (ko) Apparatus and method for providing touch input using the body
US20220317776A1 (en) Methods for manipulating objects in an environment
US11782571B2 (en) Device, method, and graphical user interface for manipulating 3D objects on a 2D screen
US11567625B2 (en) Devices, methods, and graphical user interfaces for interacting with three-dimensional environments
KR101488662B1 (ko) Method and apparatus for providing an interface for interacting with a user through an NUI device
CN209803661U (zh) A wearable computing device
CN111475017A (zh) Smart glasses device and human-computer interaction method
CN114201030A (zh) Device interaction method, electronic device, and interaction system
WO2023124972A1 (zh) Display state switching method, apparatus and system, electronic device, and storage medium
WO2019227734A1 (zh) Control instruction input method and input device
CN211857417U (zh) Smart glasses device
Deepateep et al. Facial movement interface for mobile devices using depth-sensing camera
WO2023286316A1 (ja) Input device, system, and control method
US11641460B1 (en) Generating a volumetric representation of a capture region

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20794354

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20794354

Country of ref document: EP

Kind code of ref document: A1