GB2495159A - A head-mounted somatosensory control and display system based on a user's body action - Google Patents

A head-mounted somatosensory control and display system based on a user's body action

Info

Publication number
GB2495159A
Authority
GB
United Kingdom
Prior art keywords
head
text
user
image
body action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1200910.6A
Other versions
GB201200910D0 (en)
Inventor
Jung Ya Hsieh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GIXIA GROUP CO
Original Assignee
GIXIA GROUP CO
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by GIXIA GROUP CO filed Critical GIXIA GROUP CO
Publication of GB201200910D0 publication Critical patent/GB201200910D0/en
Publication of GB2495159A publication Critical patent/GB2495159A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F3/0426Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected tracking fingers with respect to a virtual keyboard projected or printed on the surface
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Abstract

A head-mounted somatosensory control and display system is suitable for use as a head-mounted movie theatre in an airplane cabin. It includes a base unit, an image pick-up device which may be a micro-camera (32, figure 3), a display device which may be an LCD display module (34, figure 3), and a processing device (36, figure 3). The display device may be mounted on the base unit at a position that is viewable by the user. The image pick-up device is mounted on the base unit at a position having a visual field closely approximating that of the user's eyes. The image pick-up device captures an image of the user's body action, such as a specific hand gesture, and the processing device analyzes the body action 503, 504, thereby overlaying the body action onto virtual images displayed on the display device and even allowing the user to immersively interact with virtual objects.

Description

HEAD-MOUNTED SOMATOSENSORY CONTROL AND DISPLAY
SYSTEM AND METHOD USING SAME
Field of the Invention
The invention relates to a head-mounted somatosensory control and display system, and more particularly to a head-mounted somatosensory control and display system with high operability, and a method using the same.
Description of Prior Art
A variety of head-mounted display devices have been available in the consumer market for years. They were previously expected to replace household display devices, such as TVs, by taking advantage of their space effectiveness and their capability of providing visual pleasure comparable to that of a 60- or 70-inch display device. However, with the rapid innovation and development of projection televisions, it is now possible for consumers to achieve the same visual effect simply by using a projector to project large-size images onto a wall or a projection screen. In contrast, head-mounted display devices have a poor market share because they are too heavy to be worn on the head for a long time. In particular, the display devices, when worn on the head, block the sight of the wearers, making it difficult for wearers to find particular keys on controllers to turn the volume up or down or tune to a selected channel.
Considerable efforts were made by some manufacturers to improve the market position of such devices by mounting an additional camera onto a head-mounted display device. As shown in FIG. 1, a conventional technique involves superimposing the real-world images captured from the wearer's visual field by a head-mounted camera 12 over the virtual image data stored in an image storage unit 18 by means of a processor 16, in conjunction with monitoring of the wearer's horizontal position, vertical position, head position and body part position by a multi-functional sensor 10. The superimposed images are displayed on a head-mounted display device 14.
However, since the conventional head-mounted display device described above is of a non-transmissive type, the wearer can only observe real-world action indirectly by way of the head-mounted camera 12. Inherently, the shooting angle of the head-mounted camera slightly deviates from the three-dimensional visual field of the wearer's eyes. Unless standing still, the wearer may misjudge the distance to a certain object and end up falling down due to such deviation in observation, not to mention that the visual field of the camera is so restricted that it is much narrower than that of the human eyes. In addition, the human eye can rapidly focus on objects at different distances by using six muscles in the vicinity of the crystalline lens to move the eyeball and quickly make the crystalline lens change shape. In contrast, an auto-focusing camera takes a longer time to complete a focusing operation than human eyes do, causing a delay in response time. A camera cannot replace the human eyes in observing the real world.
Another conventional design is disclosed in US 7,573,525. As illustrated in Fig. 2, a head-mounted camera assembly 2 is provided with a head-mounted display device 24 and an external controller 25 for setting a focal length of a camera 22 and displaying the image data captured by the camera 22 on the head-mounted display device 24. The external controller 25 is also adapted to move the image data displayed on the head-mounted display device 24 along the up, down, left and right directions and to magnify or minify the displayed images.
The head-mounted assembly disclosed in US 7,573,525 is provided with a transmissive-type display device, allowing the wearer to directly observe real-world objects through the display device while displaying the camera-captured images on the display device. However, the initiation of a displaying process, as well as the positions of the images thus displayed, have to be controlled by pressing some buttons on the external controller with the wearer's finger, causing inconvenience in operation. Especially, compared to the cameras now available in the market which are of high portability and are operable with a single hand, the conventional head-mounted assembly is quite cumbersome due to being equipped with a handheld controller and an entire set of head-mounted camera and display device.
Therefore, there exists a need for a more powerful head-mounted display device, which is ergonomically operable in response to the user's body actions, thereby providing the user with enhanced operational convenience.
Summary of the Invention
An aspect of the invention is to provide a head-mounted somatosensory control and display system, which is capable of capturing an image of the user's body action in the real world and utilizing the body action as an operation instruction, thereby reducing the manufacturing cost.
Another aspect of the invention is to provide a head-mounted somatosensory control and display system, which is capable of executing an operation instruction upon identifying a specific body action, thereby achieving a simplified structure and better portability.
A still another aspect of the invention is to provide a head-mounted somatosensory control and display system, which is capable of overlaying the user's specific body action onto the virtual objects displayed on the display device, thereby providing improved operability and ergonomic advances for user comfort and convenience.
A still another aspect of the invention is to provide a head-mounted somatosensory control and display method for automatically and rapidly capturing an image of the user's body action and analyzing and displaying the captured image.
A still another aspect of the invention is to provide a head-mounted somatosensory control and display method for executing an operation instruction upon identifying a corresponding body action, thereby achieving better operability.
Therefore, the head-mounted somatosensory control and display system disclosed herein is adapted for controlling image display based on a user's specific body action present within a predetermined image pick-up scope. The specific body action corresponds to a specific instruction. The system comprises a base unit mountable on the user's head; an image pick-up device mounted on the base unit and adapted for capturing an image of the user's body action within the predetermined image pick-up scope; a display device mounted on the base unit; and a processing device for receiving the captured image from the image pick-up device and assessing the presence or absence of the specific body action and, if identifying the presence of the specific body action, outputting the specific instruction corresponding to the specific body action to have an image displayed on the display device change to a different image.
The invention further relates to a head-mounted somatosensory control and display method. The method is adapted for assessing whether a user performs a specific body action within a predetermined image pick-up scope. The specific body action is defined to serve as a specific operation instruction. The method comprises the steps of: a) placing a base unit on the user's head, wherein the base unit is provided with an image pick-up device and a display device at a position corresponding to the user's eyes; b) using the image pick-up device to capture an image within the predetermined image pick-up scope, converting the captured image into electrical signals, and transmitting the electrical signals to a processing device; c) using the processing device to assess whether the user performs the specific body action; and d) executing the specific operation instruction corresponding to the specific body action to have an image displayed on the display device change to a different image, if identifying the presence of the specific body action.
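Steps b) through d) amount to a simple capture-classify-dispatch pipeline. The following minimal Python sketch illustrates that flow; the gesture names, instruction names, and helper functions are illustrative placeholders, not terminology taken from the patent.

```python
# Hypothetical gesture-to-instruction mapping, standing in for the
# predetermined body actions stored in the storage device.
GESTURE_INSTRUCTIONS = {
    "clenched_fist": "pause",
    "left_forefinger": "adjust_frame_rate",
    "right_forefinger": "adjust_volume",
}

def classify_gesture(frame):
    """Placeholder for the processing device's image analysis (step c).
    A real implementation would run hand-shape recognition on the
    captured frame; here a frame is just a dict carrying a label."""
    return frame.get("gesture")

def process_frame(frame):
    """Steps b-d for one captured frame: classify the body action and
    return the matching instruction, or None when no predetermined
    body action is present (in which case playback simply continues)."""
    gesture = classify_gesture(frame)
    return GESTURE_INSTRUCTIONS.get(gesture)

print(process_frame({"gesture": "clenched_fist"}))  # pause
print(process_frame({"gesture": "wave"}))           # None
```

In this sketch an unrecognized gesture yields no instruction at all, which matches the method's behaviour of leaving the displayed image unchanged unless the specific body action is identified.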
By using the image pick-up device to capture images of the objects located within the predetermined image pick-up scope in front of the user, the processing device can subsequently analyze the captured images. If the processing device identifies that the user performs a predetermined body action, it correlates the body action with a predetermined instruction and then executes the instruction. As a result, the invented system no longer requires a remote control, thereby having a simplified structure, better portability and a reduced manufacturing cost. In particular, the invented system and method enable interaction of the captured images with a virtual object present in the displayed animation, such as a virtual swivel or driving lever. The invented system is more ergonomically friendly to a user who wants to adjust the displaying conditions of a film currently displayed on the display device. The invention achieves the objects described above accordingly.
Brief Description of the Drawings
Fig. 1 is a block diagram of a conventional head-mounted camera; Fig. 2 is a schematic diagram of a conventional head-mounted camera provided with a head-mounted display device; Fig. 3 is a schematic perspective view of a head-mounted somatosensory control and display system according to the first preferred embodiment of the invention; Fig. 4 is a block diagram of the head-mounted somatosensory control and display system shown in Fig. 3; Fig. 5 shows an operation chart of the head-mounted somatosensory control and display system shown in Fig. 3; Fig. 6 is a schematic diagram showing the overlay of a captured image of the wearer's hand onto an image currently displayed by the head-mounted somatosensory control and display system shown in Fig. 3, wherein the captured hand image corresponds to a "pause" instruction; Fig. 7 is another schematic diagram showing the overlay of a captured image of the wearer's hand onto an image currently displayed by the head-mounted somatosensory control and display system shown in Fig. 3, wherein the captured hand image corresponds to an instruction of "skip forward" to a given time point; Fig. 8 is another schematic diagram showing the overlay of a captured image of the wearer's hand onto an image currently displayed by the head-mounted somatosensory control and display system shown in Fig. 3, wherein the captured hand image corresponds to a volume adjustment instruction; Fig. 9 is a block diagram of a head-mounted somatosensory control and display system according to the second preferred embodiment of the invention, showing that the video and audio data are stored in an external console and transmitted wirelessly by a wireless Bluetooth module; Fig. 10 is a schematic diagram showing the overlay of a captured image of the wearer's hand onto an image currently displayed by the system shown in Fig. 9, wherein the wearer's finger is interacting with a virtual image; Fig. 11 is a schematic diagram showing that the image displayed by the system shown in Fig. 9 is changed as the electric gyro detects a rotation movement of the wearer's head; Fig. 12 is a schematic perspective view of a head-mounted somatosensory control and display system according to the third preferred embodiment of the invention; Fig. 13 is a schematic diagram of both of the wearer's hands and a virtual keyboard during use of the head-mounted somatosensory control and display system shown in Fig. 12; Fig. 14 is a top view of the head-mounted somatosensory control and display system shown in Fig. 12, showing that the projected images are focused as virtual images at farther distances; and Fig. 15 is a schematic perspective view of a head-mounted somatosensory control and display system according to the fourth preferred embodiment of the invention.
Detailed Description of the Invention
The foregoing aspects of the invention, together with other techniques, features and effects, will be described more fully hereinafter with reference to the accompanying drawings of the preferred embodiments for clear presentation.
Figs. 3 and 4 show a head-mounted somatosensory control and display system according to the first preferred embodiment of the invention. For the purpose of illustration, the system according to this embodiment is described by way of example as a head-mounted movie theater for use in an airplane cabin, which comprises a base unit configured in the form of a frame 30, an image pick-up device in the form of a micro-camera 32, a liquid crystal display (LCD) module 34, a processing device 36, a storage device 38, a video and audio display device 37, and an audio player device in the form of a pair of earphones 33.
The LCD display module 34 is mounted on the frame 30 at a position that is viewable by the user's eyes. According to Step 501 shown in Fig. 5, when the user is going to use the head-mounted somatosensory control and display system disclosed herein, the user is required to wear the frame 30 on his/her head and place a compact disc 4 loaded with video/audio data into the video and audio display device 37. In Step 502, the video and audio display device 37 displays the video and audio data through the LCD display module 34 and the earphones 33. The micro-camera 32 is mounted on the frame 30 at a position between the user's eyes such that, in Step 503, the micro-camera 32 continuously captures images within a predetermined image pick-up scope.
The captured images are then converted into electrical signals, which are in turn transmitted to the processing device 36.
If the user wants to pause a film that is currently playing, the user could make a clenched-fist gesture 5 in front of his/her face, as shown in Fig. 6.
When the user's certain body action appears within the predetermined image pick-up scope of the micro-camera 32, the processing device 36 analyzes the actual images captured by the micro-camera 32 in Step 504, to assess whether the body action matches a specific predetermined body action. If yes, a predetermined instruction corresponding to the predetermined built-in body action is retrieved from the storage device 38 in Step 505 for executing a prescribed operation. For example, the "clenched-fist gesture" described herein may correspond to a "pause" instruction for commanding the video and audio display device 37 to temporarily stop displaying the film. Otherwise, if the "clenched-fist gesture" is never identified in Step 504, the micro-camera 32 will keep capturing images and the film will be displayed continuously to its end without any interruption.
As shown in Fig. 7, when the user, for example, stretches out the forefinger of his/her left hand, such a body action corresponds to an instruction of adjusting the frame rate according to this embodiment. In this case, the processing device 36 instructs the video and audio display device 37 to display a virtual frame-rate adjustment interface 382 on the LCD module 34 at a position corresponding to the location of the forefinger. At the same time, the micro-camera 32 keeps capturing images of the forefinger, such that the processing device 36 can track the direction of the finger's movement and instruct the video and audio display device 37 to either skip backward or skip forward in accordance with the body action. In addition, the storage device 38 stores a variety of virtual control interface images. As shown in Fig. 8, when the user stretches out the forefinger of the right hand to indicate his/her wish to turn the volume up or down, the processing device 36, once recognizing the hand gesture, instructs the video and audio display device 37 to present a virtual volume adjustment interface 383 on the LCD module 34 at a position corresponding to the location of the forefinger, so that the user can adjust the volume by moving the forefinger upward or downward.
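The volume-adjustment behaviour just described, reading the tracked forefinger's vertical movement as an up or down command, can be sketched as follows. The movement threshold and the image-coordinate convention are assumptions for illustration, not parameters given in the patent.

```python
def volume_step(prev_y, curr_y, threshold=10):
    """Map the fingertip's vertical movement between two captured
    frames to a volume command: +1 (up), -1 (down) or 0 (no change).
    Image y-coordinates grow downward, so raising the finger
    decreases y."""
    delta = prev_y - curr_y
    if delta > threshold:
        return +1   # finger moved up: turn the volume up
    if delta < -threshold:
        return -1   # finger moved down: turn the volume down
    return 0        # movement too small to count as a command

print(volume_step(200, 150))  # 1  (finger rose 50 px)
print(volume_step(150, 200))  # -1 (finger dropped 50 px)
print(volume_step(150, 155))  # 0  (jitter below the threshold)
```

The same direction-of-movement logic, applied horizontally, would serve the skip-backward/skip-forward interface 382 of Fig. 7.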
It can be readily appreciated by those skilled in the art that the head-mounted somatosensory control and display system disclosed herein is not limited to use in an airplane cabin but is adapted to substitute for some in-home display devices, such as conventional somatosensory game consoles.
By virtue of connecting a camera to a television located in front of the user to capture images of the user's body actions, the user is allowed to interact with the virtual game on the television screen by overlaying the user's images onto the displayed virtual pictures. According to the second preferred embodiment of the invention shown in Fig. 9, the video and audio data stored in an external console 5' are transmitted wirelessly by a wireless Bluetooth module 59' provided in the console 5' to a corresponding wireless Bluetooth module 39', from which the data are in turn transmitted to a portable processing device 36'.
The processing device 36' then instructs a display device to display a corresponding image shown in Fig. 10.
According to this embodiment, the display device comprises a transmissive-type display device 34', allowing the wearer to directly observe real-world objects through the display device while viewing the virtual images displayed on the screen. When a kid who wears the head-mounted somatosensory control and display system disclosed herein tries to touch a virtual dairy cattle shown on the screen with a single finger, the particular hand gesture is captured by a micro-camera 32' and converted into electrical signals, which are in turn transmitted to the processing device 36'. Once identifying the hand gesture, the processing device 36' renders the virtual dairy cattle to interact with the kid by instructing an audio player device to make a cow moo and then rendering the virtual dairy cattle to slowly move forward.
Once the virtual dairy cattle moves out of the screen as shown in Fig. 11, the kid might try to track the virtual dairy cattle by rotating his/her neck. In this case, an electric gyro 361' provided in the system is employed to assess the orientation of the kid's head, which is substantially equivalent to the direction that the transmissive-type display device 34' and the micro-camera 32' face.
As a result, the images displayed on the transmissive-type display device 34' will move in accordance with the rotation movement of the wearer's head. For example, when the wearer who is viewing an image shown in Fig. 10 moves his/her head to the left, the electric gyro 361' detects a change in orientation and informs the processing device 36' of the new orientation that the wearer faces.
Such information is transmitted to the console 5' via the wireless Bluetooth module 39'. Upon receiving the information, the console 5' sends the corresponding image shown in Fig. 11 back to the processing device 36', and the image is then displayed on the transmissive-type display device 34'. In this embodiment, it is preferable that the castle and tree displayed on the transmissive-type display device 34' also change their positions in accordance with the rotation movement of the wearer's head, so as to provide the wearer with a fully immersive simulation experience of virtual reality.
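The head-tracking behaviour above, keeping the castle and tree world-fixed as the head turns, reduces to shifting the displayed scene opposite to the head's rotation. A small sketch follows; the pixels-per-degree scale factor is an assumed display parameter, not a value from the patent.

```python
def scene_offset_px(yaw_deg, px_per_deg=12.0):
    """Horizontal shift (in pixels) to apply to the world-fixed scene
    for a given head yaw reported by the electric gyro. A rotation to
    the left (negative yaw) must shift the scene to the right by the
    same visual angle so that virtual objects appear stationary in
    the world."""
    return -yaw_deg * px_per_deg

# Turning the head 10 degrees to the left shifts the scene 120 px to
# the right on the transmissive display at the assumed scale.
print(scene_offset_px(-10.0))  # 120.0
```

In the second embodiment this offset would be computed console-side: the gyro reading travels over the Bluetooth link and the console returns the already-shifted image.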
Alternatively, in the case where the wearer is playing a gun-fighting game, a handgun gesture made by the wearer with his/her forefinger and thumb will be captured by the micro-camera 32', allowing the wearer to directly use the gesture as a virtual gun in the game. The head-mounted system disclosed herein has fewer peripherals and, thus, better portability and operability.
While the display device used in the embodiment above is of the transmissive type, display devices of other types may be used in other embodiments to keep with the trend for portable electronics to be designed in a compact size. As shown in Figs. 12 and 14, the display device according to the third preferred embodiment of the invention comprises micro-projectors 43", micro-textured lenses 44" and an image pick-up device configured in the form of a pair of micro-cameras 42". Each of the micro-cameras 42" is mounted on the base unit 40" at a position right above one of the user's eyes, so as to have a solid visual angle closely approximating that of the one eye.
The base unit 40" according to this embodiment is preferably configured in the form of conventional eye spectacles, which are normally worn in front of the user's eyes at too short a distance for the user to perceive real images. As such, the display device is provided with two micro-projectors 43" at the left and right temple arms, respectively. The micro-projectors 43" are connected to a portable wireless transceiver 46" via a transmission line 41". The wireless transceiver 46" is adapted to communicate with a host 45" in a wireless manner.
The micro-projectors 43" are adapted to project the image data provided by the host 45" to where the micro-textured lenses 44" reside, so that the user will view the projected images as virtual images focused at farther distances due to deflection and reflection by the micro-textured lenses 44". A three-dimensional virtual keyboard 50" is therefore presented as shown in Fig. 13 by virtue of the difference between the images projected from the left and right micro-projectors 43".
When the micro-cameras 42" capture an image of both of the user's hands 52", a corresponding virtual keyboard 50" appears. The transmissivity of the lenses allows overlay of the actual hands 52" onto the virtual keyboard 50".
The cameras capture the motions of the user's hands, which are in turn transmitted to the processing device for analysis. Once identified, the motions serve as instructions for typing characters on the virtual keyboard 50", and the input operation is completed upon recognition of the typing motions.
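One way to picture the keystroke-recognition step is as a hit test of the detected fingertip position against the projected key rectangles. The sketch below is purely illustrative; the key layout, coordinates and function names are hypothetical, not disclosed in the patent.

```python
# Hypothetical virtual-key layout: each key occupies a rectangle
# (x_min, y_min, x_max, y_max) in display coordinates.
VIRTUAL_KEYS = {
    "A": (0, 0, 40, 40),
    "S": (40, 0, 80, 40),
    "D": (80, 0, 120, 40),
}

def key_at(fingertip):
    """Return the virtual key whose rectangle contains the detected
    fingertip position, or None when the finger is not over any key.
    A real system would also require a recognized 'press' motion
    before registering the character."""
    x, y = fingertip
    for key, (x0, y0, x1, y1) in VIRTUAL_KEYS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return key
    return None

print(key_at((50, 20)))   # S
print(key_at((200, 20)))  # None
```

With the stereo pair of micro-cameras 42", depth could additionally be estimated to distinguish hovering over a key from pressing it, though the patent does not spell out that mechanism.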
Moreover, according to the fourth preferred embodiment shown in Fig. 15, the display device comprises a pair of lenses polarized orthogonally to each other. Optical fiber projectors mounted at the two sides (not shown) are used to project images onto the lenses, such that vertically polarized images are displayed on the left lens and horizontally polarized images are displayed on the right lens. The user who wears the device accordingly perceives three-dimensional images.
By virtue of the structural arrangements and operation processes described above, the invented head-mounted somatosensory control and display system can capture a user's body action within the user's visual field and analyze the body action with a processing device, thereby overlaying the body action onto virtual images and even allowing the user to interact immersively with virtual objects. The processing device may further analyze the body action and, upon identifying that it matches a predetermined body action, generate a corresponding operation instruction and even display corresponding image data, such as a virtual operation interface and current status information, on the display device at a corresponding position. In contrast to conventional techniques, the invented system can be easily manipulated through a user-friendly interface in a fascinating manner. The invention accordingly achieves the objects described above.
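The capture-analyze-instruction-display cycle described above can be sketched as a simple loop body; the gesture names, the instruction table, and the stand-in classifier below are all assumptions, since the patent leaves these components unspecified:

```python
# Minimal sketch of the capture -> analyze -> instruction cycle.
# The gesture names and the instruction table are illustrative assumptions.
INSTRUCTIONS = {
    "open_palm": "show_virtual_keyboard",
    "swipe_left": "previous_page",
    "fist": "hide_interface",
}

def recognize(frame):
    """Stand-in for the processing device's body-action classifier.
    A real system would analyze the captured pixels here."""
    return frame.get("gesture")

def process_frame(frame):
    """Return the operation instruction for this frame, or None when the
    captured action matches no predetermined body action."""
    action = recognize(frame)
    return INSTRUCTIONS.get(action)

print(process_frame({"gesture": "open_palm"}))  # show_virtual_keyboard
print(process_frame({"gesture": "wave"}))       # None (no match, no instruction)
```

Returning None for unmatched actions mirrors the claimed behavior: the displayed image only changes when the presence of a predetermined body action is identified.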
While the invention has been described with reference to the preferred embodiments above, it should be recognized that the preferred embodiments are given for the purpose of illustration only and are not intended to limit the scope of the present invention, and that various modifications and changes, which will be apparent to those skilled in the relevant art, may be made without departing from the spirit and scope of the invention.

Claims (1)

  1. <claim-text>What is Claimed is: A head-mounted somatosensory control and display system, adapted for controlling image display based on a user's specific body action present within a predetermined image pick-up scope, wherein the specific body action corresponds to a specific instruction, the system comprising: a base unit mountable on the user's head; an image pick-up device mounted on the base unit and adapted for capturing an image of the user's body action within the predetermined image pick-up scope; a display device mounted on the base unit; and a processing device for receiving the captured image from the image pick-up device and assessing the presence or absence of the specific body action and, if identifying the presence of the specific body action, outputting the specific instruction corresponding to the specific body action to have an image displayed on the display device change to a different image.</claim-text> <claim-text>2. The head-mounted somatosensory control and display system according to claim 1, wherein the specific body action comprises a hand gesture and the control and display system further comprises a storage device stored with image data of a virtual operation interface corresponding to the hand gesture, wherein the image data of the virtual operation interface is retrievable by the processing device for being displayed on the display device.</claim-text> <claim-text>3. The head-mounted somatosensory control and display system according to claim 1, wherein the display device comprises a transmissive-type display device.</claim-text> <claim-text>4. The head-mounted somatosensory control and display system according to claim 3, wherein the display device further comprises an audio player device.</claim-text> <claim-text>5. The head-mounted somatosensory control and display system according to any one of claims 1-4, wherein the display device comprises a liquid crystal display module.</claim-text> <claim-text>6. 
The head-mounted somatosensory control and display system according to any one of claims 1-4, wherein the display device comprises at least one micro-projector and at least one micro-textured lens to which the at least one micro-projector can project images.</claim-text> <claim-text>7. The head-mounted somatosensory control and display system according to any one of claims 1-4, wherein the display device comprises a pair of lenses polarized orthogonally to each other.</claim-text> <claim-text>8. The head-mounted somatosensory control and display system according to any one of claims 1-4, wherein the image pick-up device comprises a micro-camera mounted in a manner corresponding to a position between the user's eyes.</claim-text> <claim-text>9. The head-mounted somatosensory control and display system according to any one of claims 1-4, wherein the image pick-up device comprises two micro-cameras spaced apart from each other and mounted in a manner corresponding to the respective eyes of the user.</claim-text> <claim-text>10. A head-mounted somatosensory control and display method for assessing whether a user performs a specific body action within a predetermined image pick-up scope, 
wherein the specific body action is defined to serve as a specific operation instruction, the method comprising the steps of: a) placing a base unit on the user's head, wherein the base unit is provided with an image pick-up device and a display device at a position corresponding to the user's eyes; b) using the image pick-up device to capture an image within the predetermined image pick-up scope, converting the captured image into electrical signals, and transmitting the electrical signals to a processing device; c) using the processing device to assess whether the user performs the specific body action; and d) if identifying the presence of the specific body action, executing the specific operation instruction corresponding to the specific body action to have an image displayed on the display device change to a different image.</claim-text>
GB1200910.6A 2011-09-23 2012-01-19 A head-mounted somatosensory control and display system based on a user's body action Withdrawn GB2495159A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2011102866452A CN103018905A (en) 2011-09-23 2011-09-23 Head-mounted somatosensory manipulation display system and method thereof

Publications (2)

Publication Number Publication Date
GB201200910D0 GB201200910D0 (en) 2012-02-29
GB2495159A true GB2495159A (en) 2013-04-03

Family

ID=45814245

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1200910.6A Withdrawn GB2495159A (en) 2011-09-23 2012-01-19 A head-mounted somatosensory control and display system based on a user's body action

Country Status (2)

Country Link
CN (1) CN103018905A (en)
GB (1) GB2495159A (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI563818B (en) * 2013-05-24 2016-12-21 Univ Central Taiwan Sci & Tech Three dimension contactless controllable glasses-like cell phone
TWI552565B (en) * 2013-05-24 2016-10-01 中臺科技大學 Three dimension contactless controllable glasses-like cell phone
CN103324309A (en) * 2013-06-18 2013-09-25 杭鑫鑫 Wearable computer
CN103530061B (en) * 2013-10-31 2017-01-18 京东方科技集团股份有限公司 Display device and control method
CN104750234B (en) * 2013-12-27 2018-12-21 中芯国际集成电路制造(北京)有限公司 The interactive approach of wearable smart machine and wearable smart machine
KR102229890B1 (en) 2014-05-30 2021-03-19 삼성전자주식회사 Method for processing data and an electronic device thereof
CN110275619A (en) * 2015-08-31 2019-09-24 北京三星通信技术研究有限公司 The method and its head-mounted display of real-world object are shown in head-mounted display
CN105184268B (en) * 2015-09-15 2019-01-25 北京国承万通信息科技有限公司 Gesture identification equipment, gesture identification method and virtual reality system
CN205899837U (en) * 2016-04-07 2017-01-18 贾怀昌 Use head mounted display's training system
US10469976B2 (en) * 2016-05-11 2019-11-05 Htc Corporation Wearable electronic device and virtual reality system
US10255658B2 (en) * 2016-08-09 2019-04-09 Colopl, Inc. Information processing method and program for executing the information processing method on computer
CN111045209A (en) * 2018-10-11 2020-04-21 光宝电子(广州)有限公司 Travel system and method using unmanned aerial vehicle
CN109343715A (en) * 2018-11-16 2019-02-15 深圳时空数字科技有限公司 A kind of intelligence body-sensing interactive approach, equipment, system and storage equipment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10054242A1 (en) * 2000-11-02 2002-05-16 Visys Ag Method of inputting data into a system, such as a computer, requires the user making changes to a real image by hand movement
EP1630587A1 (en) * 2004-08-27 2006-03-01 Samsung Electronics Co.,Ltd. HMD information apparatus and method of operation thereof
EP2107414A1 (en) * 2008-03-31 2009-10-07 Brother Kogyo Kabushiki Kaisha Head mount display and head mount display system
US20100156787A1 (en) * 2008-12-19 2010-06-24 Brother Kogyo Kabushiki Kaisha Head mount display
WO2010073928A1 (en) * 2008-12-22 2010-07-01 ブラザー工業株式会社 Head-mounted display
WO2010082270A1 (en) * 2009-01-15 2010-07-22 ブラザー工業株式会社 Head-mounted display
WO2011097226A1 (en) * 2010-02-02 2011-08-11 Kopin Corporation Wireless hands-free computing headset with detachable accessories controllable by motion, body gesture and/or vocal commands

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH086708A (en) * 1994-04-22 1996-01-12 Canon Inc Display device
JP5309448B2 (en) * 2007-01-26 2013-10-09 ソニー株式会社 Display device and display method

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015020888A1 (en) * 2013-08-05 2015-02-12 Microsoft Corporation Two-hand interaction with natural user interface
US9529513B2 (en) 2013-08-05 2016-12-27 Microsoft Technology Licensing, Llc Two-hand interaction with natural user interface
DE102013019574A1 (en) 2013-11-22 2015-05-28 Audi Ag Method for operating electronic data glasses and electronic data glasses
CN104144335A (en) * 2014-07-09 2014-11-12 青岛歌尔声学科技有限公司 Head-wearing type visual device and video system
EP3173848A4 (en) * 2014-07-22 2018-03-07 LG Electronics Inc. Head mounted display and control method thereof
US10217258B2 (en) 2014-07-22 2019-02-26 Lg Electronics Inc. Head mounted display and control method thereof
US20190377464A1 (en) * 2017-02-21 2019-12-12 Lenovo (Beijing) Limited Display method and electronic device
US10936162B2 (en) * 2017-02-21 2021-03-02 Lenovo (Beijing) Limited Method and device for augmented reality and virtual reality display
US11729453B2 (en) 2019-04-24 2023-08-15 Charter Communications Operating, Llc Apparatus and methods for personalized content synchronization and delivery in a content distribution network
US20210120315A1 (en) * 2019-10-16 2021-04-22 Charter Communications Operating, Llc Apparatus and methods for enhanced content control, consumption and delivery in a content distribution network
US11812116B2 (en) * 2019-10-16 2023-11-07 Charter Communications Operating, Llc Apparatus and methods for enhanced content control, consumption and delivery in a content distribution network

Also Published As

Publication number Publication date
GB201200910D0 (en) 2012-02-29
CN103018905A (en) 2013-04-03

Similar Documents

Publication Publication Date Title
GB2495159A (en) A head-mounted somatosensory control and display system based on a user's body action
GB2494940A (en) Head-mounted display with display orientation lock-on
JP6339239B2 (en) Head-mounted display device and video display system
JP6186689B2 (en) Video display system
US20170264881A1 (en) Information processing apparatus, information processing method, and program
US9411160B2 (en) Head mounted display, control method for head mounted display, and image display system
KR100943392B1 (en) Apparatus for displaying three-dimensional image and method for controlling location of display in the apparatus
US10303244B2 (en) Information processing apparatus, information processing method, and computer program
JP5073013B2 (en) Display control program, display control device, display control method, and display control system
JP6378781B2 (en) Head-mounted display device and video display system
CN104076512A (en) Head-mounted display device and method of controlling head-mounted display device
WO2010107072A1 (en) Head-mounted display
JP6094305B2 (en) Head-mounted display device and method for controlling head-mounted display device
US9846305B2 (en) Head mounted display, method for controlling head mounted display, and computer program
JP2001189902A (en) Method for controlling head-mounted display and head- mounted display device
US20180205932A1 (en) Stereoscopic video see-through augmented reality device with vergence control and gaze stabilization, head-mounted display and method for near-field augmented reality application
WO2017085974A1 (en) Information processing apparatus
CN109923868A (en) Display device and its control method
JP6303274B2 (en) Head-mounted display device and method for controlling head-mounted display device
US9648315B2 (en) Image processing apparatus, image processing method, and computer program for user feedback based selective three dimensional display of focused objects
WO2017022769A1 (en) Head-mounted display, display control method and program
JP6631014B2 (en) Display system and display control method
JP6788129B2 (en) Zoom device and related methods
US20150237338A1 (en) Flip-up stereo viewing glasses
JP2017079389A (en) Display device, display device control method, and program

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20160121 AND 20160127

WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)