WO2017012519A1 - Head-Operated Digital Glasses - Google Patents

Head-Operated Digital Glasses

Info

Publication number
WO2017012519A1
Authority
WO
WIPO (PCT)
Prior art keywords
angular velocity
pointer
user
head
processor
Prior art date
Application number
PCT/CN2016/090236
Other languages
English (en)
French (fr)
Inventor
谢培树
Original Assignee
谢培树
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201510436115.XA (CN105116544A)
Priority claimed from CN201610544134.9A (CN106681488A)
Application filed by 谢培树
Priority to US15/566,634 (published as US20180143436A1)
Publication of WO2017012519A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present invention relates to a display device, and more particularly to a head mounted display device.
  • Digital glasses are head-mounted display devices that can display digital signals, including augmented reality glasses, virtual reality glasses, smart glasses, and the like.
  • Mobile phones and tablets are mainstream mobile communication devices.
  • a mobile phone's display area is narrow, while a tablet is heavy.
  • in addition, mobile phones and tablets can only display two-dimensional images and require the user to change head posture, which limits their fields of application.
  • Digital glasses can use a near-eye display to output images into a wide three-dimensional space.
  • however, it is difficult to input text on current digital glasses.
  • speech-recognition input suffers from recognition errors and poor noise immunity.
  • although a touchpad can be used to input text, it occupies at least one hand. When both of the user's hands are busy, it is difficult to input text with a touchpad.
  • although an eye tracking device can operate digital glasses with eye movements, its pointer movement accuracy is low and it is susceptible to ambient-light interference.
  • the present invention is directed to a head operated digital eyewear and method of operation thereof. It allows the user to quickly and accurately manipulate the pointer through the head and output image information into a wide three-dimensional space.
  • the digital glasses include a left temple, a right temple, a left display device, a right display device, a left infrared emitter, a right infrared emitter, a left infrared receiver, a right infrared receiver, a head angular velocity detector, a torso angular velocity receiving interface, a processor, a memory, and a power supply.
  • the display device is a video output device, including a projector, a liquid crystal panel, and the like.
  • the left display device is located in front of the user's left eye and the right display device is located in front of the user's right eye.
  • the infrared emitter can continuously emit infrared rays, or it can emit infrared rays at regular intervals.
  • when the user's eyes are open, the infrared light emitted by the infrared emitter illuminates the eyes and strong infrared light is reflected.
  • when the user's eyes are closed, the light emitted by the display device illuminates the eyes and weak infrared light is reflected.
  • the infrared receiver converts reflected infrared light into a digital signal.
  • the left infrared emitter emits infrared light to the left eye of the user, and the left infrared receiver receives infrared light reflected by the left eye of the user.
  • the right infrared emitter emits infrared light to the user's right eye
  • the right infrared receiver receives infrared light reflected by the user's right eye.
  • the infrared receiver converts the received infrared light intensity into a digital signal or a set of digital signals and sends it to the processor. If the infrared receiver is a single-pixel infrared camera, it outputs 1 digital signal; if the infrared receiver is a multi-pixel infrared camera, it outputs 1 set of digital signals. Within the response range, the input infrared light intensity increases, and the digital signal output by the infrared receiver increases; conversely, the digital signal output by the infrared receiver decreases.
  • suppose 0 < p 1 < q 1 and 0 < p 2 < q 2 , and let l and r be the sums of the left and right infrared receivers' output signals, respectively. If l ∈ [q 1 , +∞), the processor determines that the user's left eye is open; if l ∈ [p 1 , q 1 ), that the left eye squints; if l ∈ [0, p 1 ), that the left eye is closed. Likewise, if r ∈ [q 2 , +∞), the processor determines that the user's right eye is open; if r ∈ [p 2 , q 2 ), that the right eye squints; if r ∈ [0, p 2 ), that the right eye is closed.
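The threshold classification above can be sketched in a few lines. The function name and the numeric thresholds below are illustrative assumptions, not values from the patent; only the interval logic (0 < p < q) comes from the text.

```python
# Sketch of the eye-state classification: map an infrared receiver's
# signal sum to "open", "squint", or "closed" using thresholds p < q.
def eye_state(signal, p, q):
    if signal >= q:
        return "open"      # strong reflection: eyelid open
    elif signal >= p:
        return "squint"    # intermediate reflection: eye narrowed
    else:
        return "closed"    # weak reflection: eyelid closed

# Hypothetical thresholds p1, q1 for the left eye (arbitrary units):
p1, q1 = 100, 300
print(eye_state(350, p1, q1))  # open
print(eye_state(150, p1, q1))  # squint
print(eye_state(40, p1, q1))   # closed
```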
  • Assume s ∈ (0, 3000). The following terms are defined.
  • Single left-eye squint-blink: the user's right eye stays open and the left eye is never closed. Meanwhile, the left eye is first open, then squints for s milliseconds, and finally opens.
  • Single right-eye squint-blink: the user's left eye stays open and the right eye is never closed. Meanwhile, the right eye is first open, then squints for s milliseconds, and finally opens.
  • Single squint-blink: a single left-eye squint-blink or a single right-eye squint-blink.
  • Double squint-blink: neither of the user's eyes is closed. Meanwhile, both eyes are first open simultaneously, then squint simultaneously for s milliseconds, and finally open simultaneously.
  • Squint-blink: a single squint-blink or a double squint-blink.
  • Single left-eye blink: the user's right eye stays open. Meanwhile, the left eye is first open, then closed for s milliseconds, and finally open.
  • Single right-eye blink: the user's left eye stays open. Meanwhile, the right eye is first open, then closed for s milliseconds, and finally open.
  • Single blink: a single left-eye blink or a single right-eye blink.
  • Double blink: both of the user's eyes are first open simultaneously, then closed simultaneously for s milliseconds, and finally open simultaneously.
  • thus, the left infrared receiver, the right infrared receiver, and the processor can recognize the user's squint-blink and single-blink commands. Squint-blinks and single blinks exclude the unconscious double blinks of the human eye, reducing misoperation. A squint-blink is easier and faster than a blink, and it does not close off the user's view.
  • Double squint-blink commands, single left-eye squint-blink commands, single right-eye squint-blink commands, single left-eye blink commands, and single right-eye blink commands can trigger different events.
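The squint-blink definitions amount to a small pattern check over sampled eye states. The sketch below assumes eye states are sampled at a fixed rate and already classified as open/squint/closed; the function name and the sampling scheme are hypothetical, only the open-squint-open pattern with the other eye held open comes from the definitions above.

```python
# Recognize a "single left-eye squint-blink" from a list of
# (left_state, right_state) samples: the right eye must stay open,
# the left eye must never be fully closed, and the left eye's state
# sequence must collapse to open -> squint -> open.
def is_single_left_squint_blink(samples):
    if any(left == "closed" or right != "open" for left, right in samples):
        return False
    lefts = [left for left, _ in samples]
    # Collapse runs of identical states, e.g. O O S S O -> O S O.
    phases = [lefts[0]]
    for st in lefts[1:]:
        if st != phases[-1]:
            phases.append(st)
    return phases == ["open", "squint", "open"]

samples = [("open", "open"), ("squint", "open"),
           ("squint", "open"), ("open", "open")]
print(is_single_left_squint_blink(samples))  # True
```

A duration check (the squint phase lasting s milliseconds) would be layered on top by counting samples in the squint run.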
  • An angular velocity detector is an instrument that detects the angular velocity of a carrier.
  • the commonly used angular velocity detector is a three-axis angular velocity gyroscope whose center serves as the origin of the gyroscope coordinate system, and its main axis, horizontal axis and vertical axis form the coordinate axis of the gyroscope coordinate system.
  • the three-axis angular velocity gyroscope can detect a three-dimensional angular velocity vector.
  • the head angular velocity detector is worn on the head, which detects the three-dimensional angular velocity vector [a 1 , a 2 , a 3 ] of the head.
  • the head angular velocity detector also picks up the torso's angular motion as noise. Therefore, a torso angular velocity detector can be added to the digital glasses to eliminate the angular velocity noise generated by torso angular motion.
  • the torso angular velocity detector is worn on the torso, which detects the three-dimensional angular velocity vector [b 1 , b 2 , b 3 ] of the torso.
  • the torso angular velocity receiving interface can receive the torso angular velocity vector directly through a wired connection to the torso angular velocity transmitter, or indirectly through a wired connection to a torso angular velocity receiver.
  • the torso angular velocity receiver can receive the torso angular velocity vector by wired communication or wireless communication.
  • the digital glasses may also include the following components: left temple, right temple, left display device, right display device, left infrared emitter, right infrared emitter, left infrared receiver, right infrared receiver, head angular velocity detector, torso angular velocity detector, processor, memory, power supply.
  • the torso angular velocity detector transmits the torso angular velocity vector to the processor by wired communication or wireless communication.
  • the digital glasses may also include the following components: left temple, right temple, left display device, right display device, left infrared emitter, right infrared emitter, left infrared receiver, right infrared receiver, head angular velocity detector, torso angular velocity detector, sweat-proof tape, processor, memory, power supply.
  • one side of the sweat-proof tape can be fixedly connected to the torso gyroscope, and the other side can be attached to the skin of the torso.
  • the sweat-proof tape prevents the torso gyroscope from slipping off when the user sweats, and can fix the torso gyroscope to the user's torso for a long time. The tape is easy to attach and to tear off, which allows the user to operate the digital glasses in bumpy environments.
  • the "head relative angular velocity vector” is defined below.
  • Head relative angular velocity vector: the three-dimensional angular velocity vector of the head relative to the torso.
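Assuming the head and torso detectors' coordinate axes are aligned, the head relative angular velocity vector is the component-wise difference between the head vector [a 1 , a 2 , a 3 ] and the torso vector [b 1 , b 2 , b 3 ], as the definitions later in the document make explicit. A minimal sketch:

```python
# Head relative angular velocity: c_i = a_i - b_i for each axis,
# assuming both detectors report in aligned coordinate systems.
def head_relative_angular_velocity(a, b):
    return [ai - bi for ai, bi in zip(a, b)]

a = [0.30, -0.10, 0.05]   # head gyroscope reading (rad/s, illustrative)
b = [0.05, -0.10, 0.00]   # torso gyroscope reading (rad/s, illustrative)
print(head_relative_angular_velocity(a, b))  # [0.25, 0.0, 0.05]
```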
  • the processor can detect the state of the torso angular velocity detector and can output it to the user via an audio or video signal.
  • the state of the torso angular velocity detector includes four types: the angular velocity detector is successfully connected, the connection failed, the detectors are successfully paired, and the pairing failed.
  • if the torso angular velocity receiving interface receives the torso angular velocity vector, the processor notifies the user that the angular velocity detector is successfully connected; otherwise, the processor notifies the user that the connection failed.
  • the head angular velocity detector can output the head angular velocity coordinate axis direction to the processor.
  • the torso angular velocity detector can output the torso angular velocity coordinate axis direction to the processor.
  • the processor can calculate the coordinate axis direction difference X of the head angular velocity detector and the torso angular velocity detector, and determine whether the angular velocity detector pairing is successful.
  • if X = 0, the coordinate axis directions of the head angular velocity detector and the torso angular velocity detector are consistent, and the processor notifies the user that the angular velocity detectors are successfully paired; if X ≠ 0, the directions are inconsistent, and the processor notifies the user that pairing failed.
  • let c 1 = a 1 − b 1 , c 2 = a 2 − b 2 , c 3 = a 3 − b 3 ; the head relative angular velocity vector is then [c 1 , c 2 , c 3 ]. If the detector connection or pairing fails, the processor sets the torso angular velocity vector [b 1 , b 2 , b 3 ] to the zero vector [0, 0, 0]; the head relative angular velocity vector is then [a 1 , a 2 , a 3 ].
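The connection/pairing status logic and the zero-vector fallback can be sketched as follows. The function name and message strings are assumptions; the decision structure (received vector means connected, X = 0 means paired, otherwise zero the torso vector) comes from the text.

```python
# Decide which torso vector to use for computing the head relative
# angular velocity. On connection or pairing failure the torso vector
# is zeroed, so the relative vector degrades to the raw head vector.
def torso_vector_for_fusion(received, torso_vec, axis_diff):
    if not received:
        print("angular velocity detector: connection failed")
        return [0.0, 0.0, 0.0]
    print("angular velocity detector: connected")
    if axis_diff != 0:
        print("angular velocity detector: pairing failed")
        return [0.0, 0.0, 0.0]
    print("angular velocity detector: paired")
    return torso_vec

print(torso_vector_for_fusion(True, [0.1, 0.0, 0.2], axis_diff=0))
```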
  • if the digital glasses include a torso angular velocity detector whose coordinate axis directions are consistent with those of the head angular velocity detector, the processor does not need to detect the state of the torso angular velocity detector.
  • the processor translates the coordinate system origin of the head angular velocity detector to the apex of the user's cervical vertebrae, and uses the coordinate axis directions of the head angular velocity detector as the coordinate axis directions of the head coordinate system.
  • the processor then creates a three-dimensional head coordinate system for the user's head. Regardless of the state of motion of the user's head, the origin of the head coordinate system is always at the apex of the user's cervical vertebra, and the coordinate axis direction of the head coordinate system is always consistent with the coordinate axis direction of the head angular velocity detector.
  • the direction straight ahead of the digital glasses coincides with the direction straight ahead of the user's eyes. Assuming that the head state when the user stands upright is the initial state, the up-and-down rotation angle α and the left-right rotation angle β of the user's head each satisfy a bounded range given by the accompanying formulas (not reproduced here).
  • the operation interface is a virtual planar object in three-dimensional space, stored as electronic data in the memory. It is located in front of the user's eyes.
  • the front here includes the front side, the front upper side, the front lower side, the front left side, and the front right side.
  • the operation interface is always stationary relative to the head.
  • the pointer is located on the two-dimensional operation interface, and its coordinates can be represented by a two-dimensional vector.
  • the pointer has two switchable states: "pointer disabled" and "pointer active". When the pointer is in the "pointer disabled" state, it can neither move nor click; in the "pointer active" state, it can be moved and clicked. The "pointer disabled" state eliminates interference from the user's non-operational head shaking.
  • to switch the pointer state, the user continues to squint for more than t milliseconds, or keeps a single eye closed for more than t milliseconds;
  • a specific video alert signal can be sent immediately to indicate the status change.
  • the user can move the pointer with the head.
  • suppose the head's left-right rotational angular velocity component is d 1 and its up-down rotational angular velocity component is d 2 , with head relative angular velocity vector [c 1 , c 2 , c 3 ]; a two-dimensional angular velocity vector [d 1 , d 2 ] can thus be extracted from the three-dimensional angular velocity vector [c 1 , c 2 , c 3 ].
  • d 1 can generate a horizontal displacement component of the pointer;
  • d 2 can generate a vertical displacement component of the pointer.
  • the method of moving the pointer by the head includes the following steps:
  • S1: the processor obtains the head relative angular velocity vector [c 1 , c 2 , c 3 ] and goes to S2;
  • S2: the processor calculates the two-dimensional vector [d 1 , d 2 ] and goes to S3;
  • S3: the processor multiplies the components of the two-dimensional vector [d 1 , d 2 ] by the scaling factors k 1 and k 2 , respectively, generating the pointer displacement vector [k 1 × d 1 , k 2 × d 2 ], and goes to S4;
  • S4: the processor adds the pointer displacement vector [k 1 × d 1 , k 2 × d 2 ] to the current pointer coordinates, thereby moving the pointer on the operation interface, and returns to S1.
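The pointer-movement steps can be condensed into a short update routine. The axis assignment (which component of [c 1 , c 2 , c 3 ] maps to d 1 and d 2 ) and the scaling factors below are assumptions for illustration.

```python
# One iteration of the pointer-movement loop: extract [d1, d2] from the
# head relative angular velocity vector, scale by k1 and k2, and add the
# displacement to the current pointer coordinates.
def move_pointer(pointer, head_rel, k1=100.0, k2=100.0):
    c1, c2, c3 = head_rel
    d1, d2 = c1, c2          # assumed: c1 = left-right, c2 = up-down
    pointer[0] += k1 * d1    # horizontal displacement component
    pointer[1] += k2 * d2    # vertical displacement component
    return pointer

pointer = [400.0, 300.0]
print(move_pointer(pointer, [0.02, -0.01, 0.0]))  # [402.0, 299.0]
```

Larger scaling factors make the pointer sweep the interface with smaller head rotations, trading speed for precision.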
  • the user can click on the pointer with the head.
  • the method of clicking the pointer with the head is: the user performs a squint-blink or a single blink.
  • the processor can immediately issue a specific video alert signal to prompt completion of the click. For example, after the pointer is clicked, the pointer flashes to prompt completion.
  • the processor can issue a variety of video alert signals to indicate that different commands have completed. For example, after a single left-eye squint-blink command completes, the processor flashes a red circle on the pointer; after a single right-eye squint-blink command completes, it flashes a circle on the pointer to indicate completion.
  • the user can click any button on the operation interface by turning the head and clicking.
  • the user can input text with the soft keyboard by turning the head and squint-blinking.
  • the display device can be a transparent display device.
  • the transparent display device can display the operation interface below the display device. This prevents the user interface from obscuring the user's line of sight, allowing the user to walk normally.
  • the outer surface of the transparent display device can be coated with an electrochromic material. Electrochromic materials allow the light transmittance to be adjusted, which shields ambient light and enhances the contrast of virtual images.
  • the head-operated digital glasses free not only the user's hands but also the user's feet.
  • Digital glasses can contain expensive decorative materials such as precious metals and jewelry. Decorative materials can decorate the user's head.
  • a camera can be added to the digital glasses to send captured real-world images to the processor. The processor can then fuse the real image and the virtual image on the display device.
  • the camera can also take photos and videos.
  • the camera can be an infrared camera to capture infrared images.
  • Digital glasses can add microphones and speakers to send and receive audio information. Digital glasses can add communication chips to enable remote communication.
  • Digital glasses can add eye tracking devices to achieve eye control.
  • Digital glasses can also be installed with a variety of software.
  • digital glasses can be equipped with speech recognition software to output the recognized text to the display device.
  • the power supply can be either a built-in power supply or an external power supply.
  • the head-operated digital glasses can completely free the user's hands and feet. They allow the user to operate the pointer quickly and precisely with the head, and can output image information into a wide three-dimensional space.
  • Figure 1 is a front view of the display module.
  • Figure 2 is a front view of the gyro tape.
  • the digital eyewear embodiment includes two modules: a display module and a gyro tape.
  • the display module includes the following components: a nose bridge (1), a processor (2), a display device (3A), a display device (3B), a nose pad (4A), a nose pad (4B), an infrared emitter (5A), an infrared emitter (5B), an infrared receiver (6A), an infrared receiver (6B), a pile head (7A), a pile head (7B), a hinge (8A), a hinge (8B), a power supply (9A), a power supply (9B), a temple (10A), a temple (10B), a head angular velocity gyroscope (11), a torso angular velocity receiver (12), and a memory (13).
  • the gyro tape includes the following components: a torso angular velocity gyroscope (14), a torso angular velocity transmitter (15), a power source (16), and a sweat-proof and breathable tape (17).
  • the display device (3A) is located in front of the user's left eye and the display device (3B) is located in front of the user's right eye.
  • the infrared emitter can continuously emit infrared rays, or it can emit infrared rays at regular intervals.
  • when the user's eyes are open, the infrared light emitted by the infrared emitter illuminates the eyes and strong infrared light is reflected.
  • when the user's eyes are closed, the light emitted by the display device illuminates the eyes and weak infrared light is reflected.
  • the infrared receiver converts reflected infrared light into a digital signal.
  • the infrared emitter (5A) emits infrared light to the left eye of the user, and the infrared receiver (6A) receives infrared light reflected by the user's left eye.
  • the infrared emitter (5B) emits infrared light to the user's right eye
  • the infrared receiver (6B) receives infrared light reflected by the user's right eye.
  • the infrared receiver (6A) and the infrared receiver (6B) convert the received infrared light intensity into digital signals and send them to the processor (2).
  • within the response range, as the input infrared light intensity increases, the digital signal output by the infrared receiver increases; conversely, it decreases.
  • suppose 0 < p 1 < q 1 and 0 < p 2 < q 2 , and let l and r be the sums of the output signals of the infrared receiver (6A) and the infrared receiver (6B), respectively. If l ∈ [q 1 , +∞), the processor (2) determines that the user's left eye is open; if l ∈ [p 1 , q 1 ), that the left eye squints; if l ∈ [0, p 1 ), that the left eye is closed.
  • if r ∈ [q 2 , +∞), the processor (2) determines that the user's right eye is open; if r ∈ [p 2 , q 2 ), that the right eye squints; if r ∈ [0, p 2 ), that the right eye is closed.
  • Single left-eye squint-blink: the user's right eye stays open and the left eye is never closed. Meanwhile, the left eye is first open, then squints for s milliseconds, and finally opens.
  • Single right-eye squint-blink: the user's left eye stays open and the right eye is never closed. Meanwhile, the right eye is first open, then squints for s milliseconds, and finally opens.
  • Single squint-blink: a single left-eye squint-blink or a single right-eye squint-blink.
  • Double squint-blink: neither of the user's eyes is closed. Meanwhile, both eyes are first open simultaneously, then squint simultaneously for s milliseconds, and finally open simultaneously.
  • Squint-blink: a single squint-blink or a double squint-blink.
  • Single left-eye blink: the user's right eye stays open. Meanwhile, the left eye is first open, then closed for s milliseconds, and finally open.
  • Single right-eye blink: the user's left eye stays open. Meanwhile, the right eye is first open, then closed for s milliseconds, and finally open.
  • Single blink: a single left-eye blink or a single right-eye blink.
  • Double blink: both of the user's eyes are first open simultaneously, then closed simultaneously for s milliseconds, and finally open simultaneously.
  • thus, the infrared receiver (6A), the infrared receiver (6B), and the processor (2) can recognize the user's squint-blink and single-blink commands. Squint-blinks and single blinks exclude the unconscious double blinks of the human eye, reducing misoperation. A squint-blink is easier and faster than a blink, and it does not close off the user's view.
  • Both the head angular velocity gyroscope (11) and the torso angular velocity gyroscope (14) belong to the angular velocity detector.
  • the head angular velocity gyroscope (11) and the torso angular velocity gyroscope (14) have the same coordinate axis direction.
  • the head angular velocity gyroscope (11) is a three-axis angular velocity gyroscope. Its holder center acts as the origin of the gyroscope coordinate system, and its main axis, horizontal axis and vertical axis form the coordinate axis of the gyroscope coordinate system.
  • the head angular velocity gyro (11) can detect a three-dimensional angular velocity vector. It is attached to the digital glasses to detect the three-dimensional angular velocity vector [a 1 , a 2 , a 3 ] of the head.
  • the torso angular velocity receiver (12) can receive the torso angular velocity coordinate axis direction and the torso angular velocity vector by wired communication or wireless communication.
  • the torso angular velocity gyroscope (14) can be attached to the anti-sweat breathable tape (17).
  • the sweat-proof and breathable tape (17) has anti-sweat and breathable functions.
  • the sweat-proof and breathable tape (17) can be attached to the skin of the torso to fix the torso angular velocity gyroscope (14).
  • Gyro tape allows users to operate digital glasses in bumpy environments.
  • the torso angular velocity gyroscope (14) can detect the three-dimensional angular velocity vector [b 1 , b 2 , b 3 ] of the torso.
  • the torso angular velocity gyroscope (14) can output the torso angular velocity coordinate axis directions and the torso angular velocity vector to the torso angular velocity transmitter (15), and the torso angular velocity transmitter (15) can transmit them by wired or wireless communication.
  • the processor (2) can detect the state of the torso angular velocity gyroscope (14) and can transmit it to the user via an audio signal or a video signal.
  • the state of the torso angular velocity gyroscope (14) includes four types: the angular velocity detector is successfully connected, the connection failed, the detectors are successfully paired, and the pairing failed.
  • if the torso angular velocity receiver (12) receives the torso angular velocity vector, the processor (2) notifies the user that the angular velocity detector is successfully connected; otherwise, the processor (2) notifies the user that the connection failed.
  • the head angular velocity gyroscope (11) can output the head angular velocity coordinate axis direction to the processor.
  • the torso angular velocity gyroscope (14) can output the torso angular velocity coordinate axis direction to the processor (2).
  • the processor (2) can calculate the coordinate axis direction difference X of the head angular velocity gyro (11) and the torso angular velocity gyro (14), and determine whether the angular velocity detector pairing is successful.
  • if X = 0, the coordinate axis directions of the head angular velocity gyroscope (11) and the torso angular velocity gyroscope (14) are consistent, and the processor (2) notifies the user that the angular velocity detectors are successfully paired; if X ≠ 0, the directions are inconsistent, and the processor (2) notifies the user that pairing failed.
  • let c 1 = a 1 − b 1 , c 2 = a 2 − b 2 , c 3 = a 3 − b 3 ; the head relative angular velocity vector is then [c 1 , c 2 , c 3 ]. If the detector connection or pairing fails, the processor (2) sets the torso angular velocity vector [b 1 , b 2 , b 3 ] to the zero vector [0, 0, 0]; the head relative angular velocity vector is then [a 1 , a 2 , a 3 ].
  • the processor (2) translates the coordinate system origin of the head angular velocity gyroscope (11) to the apex of the user's cervical vertebrae, and uses the coordinate axis directions of the head angular velocity gyroscope (11) as the coordinate axis directions of the head coordinate system.
  • the processor (2) then creates a three-dimensional head coordinate system for the user's head. Regardless of the state of motion of the user's head, the origin of the head coordinate system is always at the vertex of the user's cervical vertebra, and the coordinate axis direction of the head coordinate system is always consistent with the coordinate axis direction of the head angular velocity gyroscope (11).
  • the direction straight ahead of the digital glasses coincides with the direction straight ahead of the user's eyes. Assuming that the head state when the user stands upright is the initial state, the up-and-down rotation angle α and the left-right rotation angle β of the user's head each satisfy a bounded range given by the accompanying formulas (not reproduced here).
  • the operation interface is a virtual planar object in three-dimensional space, stored as electronic data in the memory. It is located in front of the user's eyes.
  • the front here includes the front side, the front upper side, the front lower side, the front left side, and the front right side.
  • the operation interface is always stationary relative to the head.
  • the pointer is located on the two-dimensional operation interface, and its coordinates can be represented by a two-dimensional vector.
  • the user switches the pointer state with the head.
  • the method of switching the pointer state with the head is: the user performs a single left-eye squint-blink.
  • the user can move the pointer with the head.
  • the left and right rotational angular velocity components are d 1
  • the head up and down rotational angular velocity components are d 2 .
  • the relative angular velocity vector of the head is [c 1 , c 2 , c 3 ].
  • the three-dimensional angular velocity vector [c 1 , c 2 , c 3 ] can extract a two-dimensional angular velocity vector [d 1 , d 2 ].
  • d 1 can generate a horizontal displacement component of the pointer;
  • d 2 can generate a vertical displacement component of the pointer.
  • the method of moving the pointer by the head includes the following steps:
  • S1: the processor (2) obtains the head relative angular velocity vector [c 1 , c 2 , c 3 ] and goes to S2;
  • S2: the processor (2) calculates the two-dimensional vector [d 1 , d 2 ] and goes to S3;
  • S3: the processor (2) multiplies the components of the two-dimensional vector [d 1 , d 2 ] by the scaling factors k 1 and k 2 , respectively, generating the pointer displacement vector [k 1 × d 1 , k 2 × d 2 ], and goes to S4;
  • S4: the processor (2) adds the pointer displacement vector [k 1 × d 1 , k 2 × d 2 ] to the current pointer coordinates, thereby moving the pointer on the operation interface, and returns to S1.
  • the user can click on the pointer with the head.
  • the method of clicking the pointer with the head is: the user performs a squint-blink or a single blink.
  • the user can click any button on the operation interface by turning the head and clicking.
  • the user can input text with the soft keyboard by turning the head and squint-blinking.
  • the display device can be a transparent display device.
  • the transparent display device can display the operation interface below the display device. This prevents the user interface from obscuring the user's line of sight, allowing the user to walk normally.
  • the head-operated digital glasses free not only the user's hands but also the user's feet.
  • Digital glasses can contain expensive decorative materials such as precious metals and jewelry. Decorative materials can decorate the user's head.
  • a camera can be added to the digital glasses to send captured real-world images to the processor. The processor can then fuse the real image and the virtual image on the display device.
  • the camera can also take photos and videos.
  • the camera can be an infrared camera to capture infrared images.
  • Digital glasses can add microphones and speakers to send and receive audio information. Digital glasses can add communication chips to enable remote communication.
  • Digital glasses can add eye tracking devices to achieve eye control.
  • Digital glasses can also be installed with a variety of software.
  • digital glasses can be equipped with speech recognition software to output the recognized text to the display device.
  • the power supply can be either a built-in power supply or an external power supply.
  • the head-operated digital glasses can completely free the user's hands and feet. They allow the user to operate the pointer quickly and precisely with the head, and can output image information into a wide three-dimensional space.

Abstract

Digital glasses comprising the following components: a left temple (10A), a right temple (10B), a left display device (3A), a right display device (3B), a left infrared emitter (5A), a right infrared emitter (5B), a left infrared receiver (6A), a right infrared receiver (6B), a head angular velocity detector (11), a torso angular velocity receiving interface, a processor (2), a memory (13), and a power supply (16). The glasses allow the user to operate a pointer quickly and precisely with the head, and can output image information into a wide three-dimensional space.

Description

Head-Operated Digital Glasses

Technical Field

The present invention relates to a display device, and more particularly to a head-mounted display device.

Background Art

Digital glasses are head-mounted display devices that can display digital signals, including augmented reality glasses, virtual reality glasses, smart glasses, and the like.

Mobile phones and tablets are the mainstream mobile communication devices. A mobile phone's display area is narrow, while a tablet is heavy. In addition, mobile phones and tablets can only display two-dimensional images and require the user to change head posture, which limits their fields of application.

Digital glasses can use a near-eye display to output images into a wide three-dimensional space. However, it is difficult to input text on current digital glasses. Speech-recognition input suffers from recognition errors and poor noise immunity. Although a touchpad can be used to input text, it occupies at least one hand; when both of the user's hands are busy, it is difficult to input text with a touchpad. Although an eye tracking device can operate digital glasses with eye movements, its pointer movement accuracy is low and it is susceptible to ambient-light interference.

This document describes digital glasses that allow the user to operate a pointer quickly and precisely with the head, and that can output image information into a wide three-dimensional space. No literature has yet disclosed a manufacturing method for such a product.

Summary of the Invention

The present invention aims to provide head-operated digital glasses and a method of operating them. The glasses allow the user to operate a pointer quickly and precisely with the head, and can output image information into a wide three-dimensional space.

The digital glasses comprise a left temple, a right temple, a left display device, a right display device, a left infrared emitter, a right infrared emitter, a left infrared receiver, a right infrared receiver, a head angular velocity detector, a torso angular velocity receiving interface, a processor, a memory, and a power supply.

A display device is a video output device, such as a projector or a liquid crystal panel.

This paragraph assumes the user is wearing the digital glasses. The left display device is located in front of the user's left eye, and the right display device is located in front of the user's right eye.

The infrared emitter can emit infrared light continuously or periodically at a fixed interval. When the user's eyes are open, the infrared light from the emitter illuminates the eyes and strong infrared light is reflected. When the user's eyes are closed, the light from the display device illuminates the eyes and weak infrared light is reflected. The infrared receiver converts the reflected infrared light into digital signals. The left infrared emitter emits infrared light onto the user's left eye, and the left infrared receiver receives the infrared light reflected by the left eye; the right infrared emitter emits infrared light onto the user's right eye, and the right infrared receiver receives the infrared light reflected by the right eye. The infrared receiver converts the received infrared light intensity into one digital signal or one set of digital signals and sends it to the processor: a single-pixel infrared camera outputs one digital signal, while a multi-pixel infrared camera outputs one set of digital signals. Within the response range, as the input infrared light intensity increases, the digital signal output by the infrared receiver increases; conversely, it decreases.

Suppose 0 < p1 < q1 and 0 < p2 < q2, and let l be the sum of the left infrared receiver's output signals and r the sum of the right infrared receiver's output signals. If l ∈ [q1, +∞), the processor determines that the user's left eye is open; if l ∈ [p1, q1), that the left eye squints; if l ∈ [0, p1), that the left eye is closed. If r ∈ [q2, +∞), the processor determines that the user's right eye is open; if r ∈ [p2, q2), that the right eye squints; if r ∈ [0, p2), that the right eye is closed.
Assume s∈(0,3000). The following terms are defined.
Single left squint-blink: the user's right eye stays open and the left eye never closes; meanwhile the left eye is first open, then squints for s milliseconds, and finally opens again.
Single right squint-blink: the user's left eye stays open and the right eye never closes; meanwhile the right eye is first open, then squints for s milliseconds, and finally opens again.
Single squint-blink: a single left squint-blink or a single right squint-blink.
Double squint-blink: neither of the user's eyes closes; meanwhile both eyes are first open together, then squint together for s milliseconds, and finally open together.
Squint-blink: a single squint-blink or a double squint-blink.
Single left-eye blink: the user's right eye stays open; meanwhile the left eye is first open, then closed for s milliseconds, and finally open again.
Single right-eye blink: the user's left eye stays open; meanwhile the right eye is first open, then closed for s milliseconds, and finally open again.
Single-eye blink: a single left-eye blink or a single right-eye blink.
Double blink: both of the user's eyes are first open together, then closed together for s milliseconds, and finally open together.
Thus the left infrared receiver, the right infrared receiver, and the processor can recognize the user's squint-blink and single-eye-blink commands. Squint-blinks and single-eye blinks rule out the eyes' involuntary double blinks, reducing misoperation. A squint-blink is easier and quicker than a blink, and it does not shut off the user's field of view.
The double squint-blink, single left squint-blink, single right squint-blink, single left-eye-blink, and single right-eye-blink commands can each trigger different events.
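The squint-blink definitions above amount to a small temporal pattern over the per-eye states. A hedged sketch of a single-squint-blink detector, assuming timestamped state samples (the sample format and helper name are illustrative; only the open → squint → open pattern, the "other eye stays open / this eye never closes" constraints, and the 3000 ms bound come from the definitions):

```python
def detect_single_squint_blink(samples, eye="left", max_ms=3000):
    """Return True iff the chosen eye goes open -> squinting -> open,
    the squint lasts under max_ms, the other eye stays open throughout,
    and the chosen eye never fully closes.

    samples -- list of (timestamp_ms, left_state, right_state), where
               each state is "open", "squinting", or "closed".
    """
    other = "right" if eye == "left" else "left"
    phase, squint_start = "open-before", None
    for t, *states in samples:
        s = dict(zip(("left", "right"), states))
        if s[other] != "open":        # the other eye must stay open
            return False
        if s[eye] == "closed":        # the squinting eye may not close
            return False
        if phase == "open-before" and s[eye] == "squinting":
            phase, squint_start = "squinting", t
        elif phase == "squinting" and s[eye] == "open":
            return 0 < t - squint_start < max_ms
    return False
```

A single-eye blink detector would be the same shape with "closed" substituted for "squinting" in the middle phase.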
An angular velocity detector is an instrument that measures the angular velocity of a carrier. A common angular velocity detector is the three-axis rate gyroscope: the center of its frame serves as the origin of the gyroscope coordinate system, and its spin axis, horizontal axis, and vertical axis form the coordinate axes. A three-axis rate gyroscope can therefore measure a three-dimensional angular velocity vector.
The head angular velocity detector is worn on the head and measures the head's three-dimensional angular velocity vector [a1,a2,a3]. It also picks up angular motion noise from the torso, so the digital eyeglasses may add a torso angular velocity detector to cancel the angular velocity noise produced by torso rotation. The torso angular velocity detector is worn on the torso and measures the torso's three-dimensional angular velocity vector [b1,b2,b3]. The torso angular velocity receiving interface can receive the torso angular velocity vector directly through a wired connection to a torso angular velocity transmitter, or indirectly through a wired connection to a torso angular velocity receiver; the torso angular velocity receiver may receive the torso angular velocity vector over wired or wireless communication.
The digital eyeglasses may instead comprise the following components: a left temple, a right temple, a left display device, a right display device, a left infrared emitter, a right infrared emitter, a left infrared receiver, a right infrared receiver, a head angular velocity detector, a torso angular velocity detector, a processor, a memory, and a power supply. In this case the torso angular velocity detector sends the torso angular velocity vector to the processor over wired or wireless communication.
The digital eyeglasses may also comprise the following components: a left temple, a right temple, a left display device, a right display device, a left infrared emitter, a right infrared emitter, a left infrared receiver, a right infrared receiver, a head angular velocity detector, a torso angular velocity detector, sweat-proof tape, a processor, a memory, and a power supply. In this case one side of the sweat-proof tape can be fixed to the torso gyroscope and the other side stuck to the skin of the torso. The sweat-proof tape prevents the torso gyroscope from slipping off when the user sweats, and can hold it on the user's torso for long periods. It is easy to apply and peel off, allowing the user to operate the digital eyeglasses in bumpy environments.
The "head relative angular velocity vector" is defined below.
Head relative angular velocity vector: the three-dimensional angular velocity vector of the head's rotation relative to the torso.
The processor can detect the state of the torso angular velocity detector and report it to the user by an audio or video signal. The torso angular velocity detector has four states: detector connection succeeded, detector connection failed, detector pairing succeeded, detector pairing failed.
If the torso angular velocity receiving interface receives the torso angular velocity vector, the processor notifies the user that the detector connected successfully; otherwise, the processor notifies the user that the connection failed.
The head angular velocity detector can output its coordinate-axis directions to the processor, and the torso angular velocity detector can output its coordinate-axis directions to the processor. The processor can compute the difference X between the axis directions of the two detectors and judge whether pairing succeeded. If X=0, the axis directions of the head and torso detectors agree, and the processor notifies the user that pairing succeeded; if X≠0, the axis directions differ, and the processor notifies the user that pairing failed.
Let c1=a1−b1, c2=a2−b2, c3=a3−b3; the head relative angular velocity vector is then [c1,c2,c3].
If the detector connection or pairing fails, the processor sets the torso angular velocity vector [b1,b2,b3] to the zero vector [0,0,0]. The head relative angular velocity vector is then [a1,a2,a3].
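The component-wise subtraction and the zero-vector fallback can be combined in one helper (an illustrative sketch; the function and flag names are assumptions):

```python
def head_relative_angular_velocity(head, torso, torso_ok=True):
    """Compute [c1, c2, c3] = [a1-b1, a2-b2, a3-b3].

    If the torso detector failed to connect or pair (torso_ok=False),
    its vector is treated as [0, 0, 0], so the result equals `head`.
    """
    if not torso_ok:
        torso = [0, 0, 0]
    return [a - b for a, b in zip(head, torso)]
```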
If the digital eyeglasses include a torso angular velocity detector whose coordinate-axis directions already agree with those of the head angular velocity detector, the processor need not check the torso detector's state.
The processor translates the origin of the head angular velocity detector's coordinate system to the apex of the user's cervical spine and uses the head detector's axis directions as the axes of the head coordinate system. The processor thus establishes a three-dimensional head coordinate system for the user's head: whatever the head's motion, the origin of the head coordinate system stays at the apex of the user's cervical spine, and its axis directions stay aligned with those of the head angular velocity detector.
The straight-ahead direction of the digital eyeglasses coincides with the straight-ahead direction of the user's eyes. Taking the head posture of an upright user as the initial state, the head's up-down rotation angle α satisfies
[equation image PCTCN2016090236-appb-000001, not reproduced in this text]
while the head's left-right rotation angle β satisfies
[equation image PCTCN2016090236-appb-000002, not reproduced in this text]
The operation interface is a virtual planar object in three-dimensional space, stored in the memory as electronic data. It lies in front of the user's eyes, where "in front" includes straight ahead, ahead and above, ahead and below, ahead to the left, and ahead to the right. The operation interface always stays stationary relative to the head.
The pointer lies on the two-dimensional operation interface, and the coordinates of its tip can be expressed as a two-dimensional vector. The pointer may have two switchable states, "movement disabled" and "movement enabled": when the pointer is in the "movement disabled" state it cannot move, and when in the "movement enabled" state it can. "Movement disabled" rules out interference from the user's non-operational head shaking.
The pointer may have another two switchable states, "pointer disabled" and "pointer enabled": when the pointer is in the "pointer disabled" state it can neither move nor click; when in the "pointer enabled" state it can both move and click. "Pointer disabled" likewise rules out interference from the user's non-operational head shaking.
Assume t∈[800,+∞). The user can switch the pointer state with the head, by either of two methods:
1. the user squint-blinks or blinks one eye;
2. the user keeps squinting for more than t milliseconds, or keeps one eye closed for more than t milliseconds.
After the pointer switches state, it can immediately emit a specific video cue to signal the change of state.
The user can move the pointer with the head. Let d1 be the head's left-right rotation angular velocity component and d2 its up-down rotation component. Given the head relative angular velocity vector [c1,c2,c3], the two-dimensional angular velocity vector [d1,d2] can be extracted from it. d1 generates the pointer's horizontal displacement component, and d2 its vertical displacement component.
Assume k1∈(0,+∞) and k2∈(0,+∞). Moving the pointer with the head comprises the following steps:
S1. if the pointer is in the "movement disabled" or "pointer disabled" state, go to S1; otherwise, go to S2;
S2. the processor computes the two-dimensional vector [d1,d2]; go to S3;
S3. the processor multiplies the components of the two-dimensional vector [d1,d2] by the scaling factors k1 and k2, producing the pointer displacement vector [k1·d1,k2·d2]; go to S4;
S4. the processor adds the pointer displacement vector [k1·d1,k2·d2] to the current pointer coordinates, moving the pointer on the operation interface; go to S1.
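One iteration of steps S2–S4 can be sketched as follows (the optional clamping to the interface rectangle is an assumption, not part of the method above):

```python
def move_pointer(pointer, d, k1, k2, bounds=None):
    """Apply steps S2-S4 once: scale the 2-D angular-velocity vector
    [d1, d2] by (k1, k2) and add it to the pointer coordinates."""
    x = pointer[0] + k1 * d[0]
    y = pointer[1] + k2 * d[1]
    if bounds is not None:   # keep the tip on the interface (assumption)
        x = min(max(x, 0), bounds[0])
        y = min(max(y, 0), bounds[1])
    return (x, y)
```

In a real loop this runs once per sampling period while the pointer is not in a disabled state (the check of step S1).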
The user can click the pointer with the head. The method of clicking the pointer with the head is: the user squint-blinks or blinks one eye.
After the pointer clicks, the processor can immediately emit a specific video cue to signal that the click completed; for example, the pointer flashes once. The processor can emit different video cues to signal the completion of different commands: for example, after a single left-eye-blink command completes, the processor flashes a red ring on the pointer, and after a single right-eye-blink command completes, it flashes a blue ring.
The user can therefore click any button on the operation interface just by turning the head and squint-blinking; likewise, the user can type text on a soft keyboard just by turning the head and squint-blinking.
The display devices may be transparent display devices. A transparent display device can show the operation interface in its lower portion, so the interface does not block the user's line of sight and the user can walk normally. The outer surface of a transparent display device may be coated with an electrochromic material, whose adjustable transmittance can shut out ambient light and raise the contrast of the virtual image. Head-operated digital eyeglasses free not only the user's hands but also the user's feet.
The digital eyeglasses may incorporate expensive decorative materials, such as precious metals and jewels, to adorn the user's head.
The digital eyeglasses may add a camera to send captured real-world images to the processor, which can then blend the real images with virtual images and output the result to the display devices. The camera can also take photos and record video, and may be an infrared camera for capturing infrared images.
The digital eyeglasses may add a microphone and a speaker to send and receive audio information, and a communication chip for remote communication.
The digital eyeglasses may add an eye-tracking device for eye control.
The digital eyeglasses may also install various software; for example, speech-recognition software that outputs the recognized text to the display devices.
The power supply may be built-in or external.
In summary, head-operated digital eyeglasses can completely free the user's hands and feet. They allow the user to move a pointer quickly and precisely with the head, and can output image information into a wide three-dimensional space.
Brief Description of the Drawings
Fig. 1 is a front view of the display module.
Fig. 2 is a front view of the gyro tape.
Detailed Description of the Embodiments
A preferred embodiment of the invention is provided below and described with reference to the drawings.
The digital-eyeglasses embodiment comprises two modules: a display module and a gyro tape. As shown in Fig. 1, the display module comprises the following components: a bridge (1), a processor (2), display devices (3A, 3B), nose pads (4A, 4B), infrared emitters (5A, 5B), infrared receivers (6A, 6B), end pieces (7A, 7B), hinges (8A, 8B), power supplies (9A, 9B), temples (10A, 10B), a head rate gyroscope (11), a torso angular velocity receiver (12), and a memory (13). As shown in Fig. 2, the gyro tape comprises the following components: a torso rate gyroscope (14), a torso angular velocity transmitter (15), a power supply (16), and sweat-proof breathable tape (17).
This paragraph assumes the user is wearing the digital eyeglasses throughout. Display device (3A) sits in front of the user's left eye, and display device (3B) in front of the user's right eye.
The infrared emitters may emit infrared light continuously, or periodically at a fixed interval. When the user's eye is open, the infrared light from the emitter striking the eye reflects strong infrared light; when the eye is closed, it reflects only weak infrared light. An infrared receiver converts the reflected infrared light into digital signals. Infrared emitter (5A) shines infrared light onto the user's left eye, and infrared receiver (6A) receives the infrared light reflected by the left eye; infrared emitter (5B) shines infrared light onto the user's right eye, and infrared receiver (6B) receives the infrared light reflected by the right eye. Infrared receivers (6A) and (6B) each convert the received infrared intensity into one digital signal sent to the processor (2). Within the response range, as the incoming infrared intensity increases, the receiver's digital output increases; conversely, it decreases.
Assume 0<p1<q1 and 0<p2<q2 hold. Let l be the sum of infrared receiver (6A)'s output signals and r the sum of infrared receiver (6B)'s output signals. If l∈[q1,+∞), the processor (2) judges the user's left eye open; if l∈[p1,q1), squinting; if l∈[0,p1), closed. If r∈[q2,+∞), the processor (2) judges the user's right eye open; if r∈[p2,q2), squinting; if r∈[0,p2), closed.
Assume s∈(0,3000). The following terms are defined.
Single left squint-blink: the user's right eye stays open and the left eye never closes; meanwhile the left eye is first open, then squints for s milliseconds, and finally opens again.
Single right squint-blink: the user's left eye stays open and the right eye never closes; meanwhile the right eye is first open, then squints for s milliseconds, and finally opens again.
Single squint-blink: a single left squint-blink or a single right squint-blink.
Double squint-blink: neither of the user's eyes closes; meanwhile both eyes are first open together, then squint together for s milliseconds, and finally open together.
Squint-blink: a single squint-blink or a double squint-blink.
Single left-eye blink: the user's right eye stays open; meanwhile the left eye is first open, then closed for s milliseconds, and finally open again.
Single right-eye blink: the user's left eye stays open; meanwhile the right eye is first open, then closed for s milliseconds, and finally open again.
Single-eye blink: a single left-eye blink or a single right-eye blink.
Double blink: both of the user's eyes are first open together, then closed together for s milliseconds, and finally open together.
Thus infrared receiver (6A), infrared receiver (6B), and the processor (2) can recognize the user's squint-blink and single-eye-blink commands. Squint-blinks and single-eye blinks rule out the eyes' involuntary double blinks, reducing misoperation. A squint-blink is easier and quicker than a blink, and it does not shut off the user's field of view.
The head rate gyroscope (11) and the torso rate gyroscope (14) are both angular velocity detectors, and their coordinate-axis directions agree. The head rate gyroscope (11) is a three-axis rate gyroscope: the center of its frame serves as the origin of the gyroscope coordinate system, and its spin axis, horizontal axis, and vertical axis form the coordinate axes, so the head rate gyroscope (11) can measure a three-dimensional angular velocity vector. It is fixed to the digital eyeglasses to measure the head's three-dimensional angular velocity vector [a1,a2,a3]. The torso angular velocity receiver (12) can receive the torso angular-velocity axis directions and the torso angular velocity vector over wired or wireless communication.
The torso rate gyroscope (14) can be fixed to the sweat-proof breathable tape (17), which resists sweat and lets air through. The sweat-proof breathable tape (17) can be stuck to the skin of the torso to hold the torso rate gyroscope (14) in place; the gyro tape allows the user to operate the digital eyeglasses in bumpy environments. The torso rate gyroscope (14) can measure the torso's three-dimensional angular velocity vector [b1,b2,b3], and can output its axis directions and the torso angular velocity vector to the torso angular velocity transmitter (15), which can transmit them over wired or wireless communication.
The processor (2) can detect the state of the torso rate gyroscope (14) and report it to the user by an audio or video signal. The torso rate gyroscope (14) has four states: detector connection succeeded, detector connection failed, detector pairing succeeded, detector pairing failed.
If the torso angular velocity receiver (12) receives the torso angular velocity vector, the processor (2) notifies the user that the detector connected successfully; otherwise, the processor (2) notifies the user that the connection failed.
The head rate gyroscope (11) can output its coordinate-axis directions to the processor, and the torso rate gyroscope (14) can output its coordinate-axis directions to the processor (2). The processor (2) can compute the difference X between the axis directions of the head rate gyroscope (11) and the torso rate gyroscope (14) and judge whether pairing succeeded. If X=0, the axis directions of the two gyroscopes agree, and the processor (2) notifies the user that pairing succeeded; if X≠0, the axis directions differ, and the processor (2) notifies the user that pairing failed.
Let c1=a1−b1, c2=a2−b2, c3=a3−b3; the head relative angular velocity vector is then [c1,c2,c3].
If the detector connection or pairing fails, the processor (2) sets the torso angular velocity vector [b1,b2,b3] to the zero vector [0,0,0]. The head relative angular velocity vector is then [a1,a2,a3].
The processor (2) translates the origin of the head rate gyroscope (11)'s coordinate system to the apex of the user's cervical spine and uses the gyroscope's axis directions as the axes of the head coordinate system. The processor (2) thus establishes a three-dimensional head coordinate system for the user's head: whatever the head's motion, the origin of the head coordinate system stays at the apex of the user's cervical spine, and its axis directions stay aligned with those of the head rate gyroscope (11).
The straight-ahead direction of the digital eyeglasses coincides with the straight-ahead direction of the user's eyes. Taking the head posture of an upright user as the initial state, the head's up-down rotation angle α satisfies
[equation image PCTCN2016090236-appb-000003, not reproduced in this text]
while the head's left-right rotation angle β satisfies
[equation image PCTCN2016090236-appb-000004, not reproduced in this text]
The operation interface is a virtual planar object in three-dimensional space, stored in the memory as electronic data. It lies in front of the user's eyes, where "in front" includes straight ahead, ahead and above, ahead and below, ahead to the left, and ahead to the right. The operation interface always stays stationary relative to the head.
The pointer lies on the two-dimensional operation interface, and the coordinates of its tip can be expressed as a two-dimensional vector. The pointer has two switchable states, "pointer disabled" and "pointer enabled": when the pointer is in the "pointer disabled" state it can neither move nor click; when in the "pointer enabled" state it can both move and click.
The user switches the pointer state with the head. The method of switching the pointer state with the head is: the user performs a single left squint-blink.
After the pointer becomes "pointer disabled", it is immediately covered with a cross to emphasize the change of state; after it becomes "pointer enabled", it immediately returns to its original appearance to emphasize the change of state.
The user can move the pointer with the head. Let d1 be the head's left-right rotation angular velocity component and d2 its up-down rotation component. Given the head relative angular velocity vector [c1,c2,c3], the two-dimensional angular velocity vector [d1,d2] can be extracted from it. d1 generates the pointer's horizontal displacement component, and d2 its vertical displacement component.
Assume k1∈(0,+∞) and k2∈(0,+∞). Moving the pointer with the head comprises the following steps:
S1. if the pointer is in the "pointer disabled" state, go to S1; otherwise, go to S2;
S2. the processor (2) computes the two-dimensional vector [d1,d2]; go to S3;
S3. the processor (2) multiplies the components of the two-dimensional vector [d1,d2] by the scaling factors k1 and k2, producing the pointer displacement vector [k1·d1,k2·d2]; go to S4;
S4. the processor (2) adds the pointer displacement vector [k1·d1,k2·d2] to the current pointer coordinates, moving the pointer on the operation interface; go to S1.
The user can click the pointer with the head. Clicking the pointer with the head comprises the following steps:
S1. if the pointer is in the "pointer disabled" state, go to S1; otherwise, go to S2;
S2. if the user squint-blinks, the processor (2) clicks the pointer and goes to S1; otherwise, go to S1.
The user can therefore click any button on the operation interface just by turning the head and squint-blinking; likewise, the user can type text on a soft keyboard just by turning the head and squint-blinking.
The display devices may be transparent display devices. A transparent display device can show the operation interface in its lower portion, so the interface does not block the user's line of sight and the user can walk normally. Head-operated digital eyeglasses free not only the user's hands but also the user's feet.
The digital eyeglasses may incorporate expensive decorative materials, such as precious metals and jewels, to adorn the user's head.
The digital eyeglasses may add a camera to send captured real-world images to the processor, which can then blend the real images with virtual images and output the result to the display devices. The camera can also take photos and record video, and may be an infrared camera for capturing infrared images.
The digital eyeglasses may add a microphone and a speaker to send and receive audio information, and a communication chip for remote communication.
The digital eyeglasses may add an eye-tracking device for eye control.
The digital eyeglasses may also install various software; for example, speech-recognition software that outputs the recognized text to the display devices.
The power supply may be built-in or external.
In summary, head-operated digital eyeglasses can completely free the user's hands and feet. They allow the user to move a pointer quickly and precisely with the head, and can output image information into a wide three-dimensional space.
The description and drawings above disclose a preferred embodiment of the invention. The embodiment should be taken as illustrating the invention, not as limiting it; the scope of protection of the invention is not confined to this embodiment.

Claims (16)

  1. Digital eyeglasses comprising the following components: a left temple, a right temple, a left display device, a right display device, a left infrared emitter, a right infrared emitter, a left infrared receiver, a right infrared receiver, a head angular velocity detector, a torso angular velocity receiving interface, a processor, a memory, and a power supply; characterized in that: the left display device and the right display device can display a two-dimensional operation interface and a pointer; the left infrared emitter shines infrared light onto the user's left eye and the left infrared receiver receives the infrared light reflected by the user's left eye; the right infrared emitter shines infrared light onto the user's right eye and the right infrared receiver receives the infrared light reflected by the user's right eye; the infrared intensity received by the left infrared receiver is converted into a left-eye-open or left-eye-closed command, and the infrared intensity received by the right infrared receiver is converted into a right-eye-open or right-eye-closed command; the left infrared receiver, the right infrared receiver, and the processor can recognize the user's squint-blink commands; the head angular velocity detector is worn on the head and can output the head angular velocity vector; the torso angular velocity receiving interface can receive the torso angular velocity vector directly through a wired connection to a torso angular velocity transmitter, or indirectly through a wired connection to a torso angular velocity receiver; the head angular velocity vector and the torso angular velocity vector can generate a pointer displacement vector; the processor uses the pointer displacement vector to move the pointer on the operation interface, and uses squint-blink commands to click the pointer.
  2. Digital eyeglasses comprising the following components: a left temple, a right temple, a left display device, a right display device, a left infrared emitter, a right infrared emitter, a left infrared receiver, a right infrared receiver, a head angular velocity detector, a torso angular velocity detector, a processor, a memory, and a power supply; characterized in that: the left display device and the right display device can display a two-dimensional operation interface and a pointer; the left infrared emitter shines infrared light onto the user's left eye and the left infrared receiver receives the infrared light reflected by the user's left eye; the right infrared emitter shines infrared light onto the user's right eye and the right infrared receiver receives the infrared light reflected by the user's right eye; the infrared intensity received by the left infrared receiver is converted into a left-eye-open or left-eye-closed command, and the infrared intensity received by the right infrared receiver is converted into a right-eye-open or right-eye-closed command; the left infrared receiver, the right infrared receiver, and the processor can recognize the user's single-eye-blink commands; the head angular velocity detector is worn on the head and sends the measured head angular velocity vector to the processor; the torso angular velocity detector sends the measured torso angular velocity vector to the processor; the head angular velocity vector and the torso angular velocity vector can generate a pointer displacement vector; the processor uses the pointer displacement vector to move the pointer on the operation interface; the pointer states include "pointer disabled" and "pointer enabled"; the processor uses single-eye-blink commands to click the pointer.
  3. The digital eyeglasses of claim 1 or 2, characterized in that: the digital eyeglasses comprise a torso angular velocity detector and sweat-proof tape; one side of the sweat-proof tape can be fixed to the torso gyroscope and the other side can be stuck to the skin of the torso; the sweat-proof tape prevents the torso gyroscope from slipping off when the user sweats.
  4. The digital eyeglasses of claim 1 or 2, characterized in that: the digital eyeglasses can recognize the double squint-blink, single left squint-blink, and single right squint-blink commands, and these three commands can trigger different events.
  5. The digital eyeglasses of claim 1 or 2, characterized in that: the digital eyeglasses can recognize the single left-eye-blink and single right-eye-blink commands, and these two commands can trigger different events.
  6. The digital eyeglasses of claim 1 or 2, characterized in that: the head angular velocity detector can output its coordinate-axis directions to the processor, and the processor can output the state of the torso angular velocity detector.
  7. The digital eyeglasses of claim 1 or 2, characterized in that: the display devices are transparent display devices whose outer surfaces are coated with an electrochromic material.
  8. The digital eyeglasses of claim 1 or 2, characterized in that: the pointer state can be switched between "movement disabled" and "movement enabled".
  9. The digital eyeglasses of claim 1, characterized in that: the pointer state can be switched between "pointer disabled" and "pointer enabled".
  10. The digital eyeglasses of claim 8 or 9, characterized in that: after the pointer switches state, it immediately emits a specific video cue to signal the change of state.
  11. The digital eyeglasses of claim 8 or 9, characterized in that: after the pointer clicks, the digital eyeglasses can immediately emit a specific video cue to signal that the click completed.
  12. The digital eyeglasses of claim 10 or 11, characterized in that: the digital eyeglasses can emit multiple video cues to signal the completion of multiple commands.
  13. The digital eyeglasses of claim 8 or 9, characterized in that: the method of switching the pointer state is: the user squint-blinks or blinks one eye.
  14. The digital eyeglasses of claim 8 or 9, characterized in that: the method of switching the pointer state is: the user keeps squinting for more than t milliseconds, or keeps one eye closed for more than t milliseconds.
  15. The digital eyeglasses of claim 8 or 9, characterized in that: moving the pointer with the head comprises the following steps:
    S1. if the pointer is in the "movement disabled" or "pointer disabled" state, go to S1; otherwise, go to S2;
    S2. the processor computes the two-dimensional vector [d1,d2]; go to S3;
    S3. the processor multiplies the components of the two-dimensional vector [d1,d2] by the scaling factors k1 and k2, producing the pointer displacement vector [k1·d1,k2·d2]; go to S4;
    S4. the processor adds the pointer displacement vector [k1·d1,k2·d2] to the current pointer coordinates, moving the pointer on the operation interface; go to S1.
  16. The digital eyeglasses of claim 2 or 9, characterized in that: clicking the pointer with the head comprises the following steps:
    S1. if the pointer is in the "pointer disabled" state, go to S1; otherwise, go to S2;
    S2. if the user squint-blinks or blinks one eye, the processor clicks the pointer and goes to S1; otherwise, go to S1.
PCT/CN2016/090236 2015-07-20 2016-07-16 Head-operated digital eyeglasses WO2017012519A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/566,634 US20180143436A1 (en) 2015-07-20 2016-07-16 Head-operated digital eyeglasses

Applications Claiming Priority (12)

Application Number Priority Date Filing Date Title
CN201510428409.8 2015-07-20
CN201510428409 2015-07-20
CN201510436115.X 2015-07-22
CN201510436115.XA CN105116544A (zh) 2015-05-25 2015-07-22 Head-operated electronic eyeglasses
CN201610152700.1 2016-03-17
CN201610152700 2016-03-17
CN201610344062 2016-05-22
CN201610344062.3 2016-05-22
CN201610458398 2016-06-23
CN201610458398.2 2016-06-23
CN201610544134.9A CN106681488A (zh) 2015-07-20 2016-07-11 Head-operated digital eyeglasses
CN201610544134.9 2016-07-11

Publications (1)

Publication Number Publication Date
WO2017012519A1 true WO2017012519A1 (zh) 2017-01-26

Family

ID=57833702

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/090236 WO2017012519A1 (zh) 2016-07-16 Head-operated digital eyeglasses

Country Status (1)

Country Link
WO (1) WO2017012519A1 (zh)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0933720A2 * 1998-01-29 1999-08-04 Shimadzu Corporation Input apparatus for the physically handicapped
CN103513770A (zh) * 2013-10-09 2014-01-15 中国科学院深圳先进技术研究院 Human-machine interface device and human-machine interaction method based on a three-axis gyroscope
US20140022371A1 (en) * 2012-07-20 2014-01-23 Pixart Imaging Inc. Pupil detection device
CN103777759A (zh) * 2014-02-18 2014-05-07 马根昌 Electronic glasses action recognition system
CN103777351A (zh) * 2012-10-26 2014-05-07 鸿富锦精密工业(深圳)有限公司 Multimedia glasses
US20140333521A1 (en) * 2013-05-07 2014-11-13 Korea Advanced Institute Of Science And Technology Display property determination
CN204178312U (zh) * 2014-11-20 2015-02-25 姚尧 System for acquiring human posture data
CN204695230U (zh) * 2015-05-25 2015-10-07 谢培树 Head-operated electronic eyeglasses
CN105116544A (zh) * 2015-05-25 2015-12-02 谢培树 Head-operated electronic eyeglasses
CN204855938U (zh) * 2015-05-30 2015-12-09 谢培树 Electronic eyeglasses



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16827210

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15566634

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16827210

Country of ref document: EP

Kind code of ref document: A1
