CN101650624B - Operation controller of graphical user interface - Google Patents


Info

Publication number
CN101650624B
CN101650624B CN200810135157XA CN200810135157A
Authority
CN
China
Prior art keywords
image
user
graphical user
interface
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN200810135157XA
Other languages
Chinese (zh)
Other versions
CN101650624A (en)
Inventor
叶舟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xien Somatosensory Technology Co., Ltd.
Original Assignee
Cywee Group Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cywee Group Ltd filed Critical Cywee Group Ltd
Priority to CN200810135157XA priority Critical patent/CN101650624B/en
Publication of CN101650624A publication Critical patent/CN101650624A/en
Application granted granted Critical
Publication of CN101650624B publication Critical patent/CN101650624B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention discloses an operation controller of a graphical user interface, which comprises an image sensing unit and the graphical user interface (GUI). The image sensing unit comprises an infrared luminous device, an image acquisition device and a computing control module. The infrared luminous device projects rays toward a user; the image acquisition device passes and optically images the rays, including those reflected from the user, and outputs a digital image signal; and the computing control module receives and processes the digital image signal to produce operation control signals corresponding to various body motions of the user. The graphical user interface is connected with the image sensing unit and displayed on a screen; through the operation control signals produced by the image sensing unit, the user can operate and control the graphical user interface with his or her own body motions.

Description

Operation control device for a graphical user interface
Technical field
The present invention relates to an operation control device for a graphical user interface (GUI), and particularly to an operation control device that senses various motions of a user's body, so that the user can operate and control the graphical user interface with his or her own body motions.
Background art
A graphical user interface (graphical user interface) is a user interface based on graphics. It uses unified graphical elements and operation modes, such as windows, menus and cursors, as the basis of interaction between the user and a computer system, so that even users who cannot use input commands can operate the computer system. It also helps users understand and find functions, and lets them quickly become familiar with the operation of other programs or functions in the same manner.
Since the late 1980s, the application of graphical user interfaces has gradually matured, and they can be found in a wide variety of electronic products, such as desktop computers, notebook computers, mobile communication devices, personal digital assistants and mobile satellite navigation devices, providing users with a convenient, friendly and fast mode of operation. However, when using a graphical user interface to operate a computer system or interact with it, the user still needs to provide input through a keyboard, mouse, trackpad or other control device. This not only restricts the development of graphical user interfaces, but also fails to provide an immersive operating environment.
Moreover, in existing graphical user interfaces controlled by sensing the user's body motions, the user still usually needs to wear or hold a specific input device, such as a handgrip controller or a remote control. Such systems also cannot react accurately to the user's specific motions, so that the cursor in the display frame of the computer system or video game responds to the user's body motions sluggishly or with delay.
In view of the above defects, the inventor has conceived an improvement and found it necessary to propose the present invention, which is reasonably designed and effectively solves the above problems.
Summary of the invention
The object of the present invention is to provide an operation control device for a graphical user interface (graphical user interface, GUI), which senses the user's body motions to operate a corresponding graphical user interface.
To achieve the above object, the present invention provides an operation control device for a graphical user interface, comprising:
an image sensing unit, comprising an infrared light emitting device, an image acquisition device and a computing control module, wherein the infrared light emitting device emits infrared rays toward the user; the image acquisition device passes and optically images the infrared rays, including those reflected from the user, to obtain and receive an infrared image, improves the contrast between the user image and the environmental background image in the infrared image, digitizes the infrared image, and outputs a digital image signal that includes the user image and the environmental background image; and the computing control module is connected to the image acquisition device to receive the digital image signal, identify the variation of the user image over a time axis on a planar reference coordinate system, and produce operation control signals corresponding to the user's various body motions; and
a graphical user interface, displayed on a screen and connected with the image sensing unit, to receive the operation control signals and show a specific output response corresponding thereto;
wherein, after receiving the digital image signal output from the image acquisition device, the computing control module calculates the depth of field of the user image in the digital image signal, so as to provide auxiliary information for removing the environmental background image from the digital image signal.
In the operation control device for a graphical user interface described above, after receiving the digital image signal output from the image acquisition device, the computing control module calculates the depth of field of the user image in the digital image signal, so as to provide auxiliary information for removing the environmental background image from the digital image signal, locks only the user image, and then defines the locked user image in the planar reference coordinate system for computation.
In the operation control device described above, the computing control module likewise filters out images other than the locked user image that match the depth of field of the user image.
In the operation control device described above, there are also a plurality of image acquisition devices; the computing control module is connected to these image acquisition devices and receives a plurality of digital image signals from them, so as to compare the differences and variations of the depth of field of the user image among these digital image signals, calculate the position or spatial scale information of specific parts of the user's body, determine whether the user's limbs overlap, recognize actions of the user's body such as changing distance, moving, accelerating or rotating, and produce operation control signals corresponding to the user's body motions.
In the operation control device described above, the graphical user interface includes an image cursor representing the whole body or a specific part of the user's body; the image cursor corresponds to the user's body motions and serves the function of a pointer.
In the operation control device described above, the graphical user interface is presented in a two-dimensional image form or a three-dimensional image form.
The present invention has the following beneficial effects:
1. The image sensing unit of the present invention accurately recognizes the image of the user's body and has good sensitivity.
2. The image sensing unit can further compute the depth of field of the user's body image, and can thus obtain information on the scale of the user's body or its forward and backward movement, so that the user can fully operate and control the graphical user interface with his or her own body motions.
3. The user does not need to rely on additional input devices such as a keyboard, mouse, trackpad or joystick, and can naturally blend into the scene presented by the graphical user interface and operate it with his or her own body motions.
4. The present invention can fully provide the operation of various graphical user interfaces, whether the graphical user interface is presented as a flat image or as a three-dimensional scene, so as to achieve a richer virtual reality effect.
The present invention is described below in conjunction with the accompanying drawings and specific embodiments, which are not intended to limit the present invention.
Description of drawings
Fig. 1 is a block diagram of the operation control device for a graphical user interface of the present invention;
Fig. 2 is a schematic diagram of a usage state of the operation control device for a graphical user interface of the present invention;
Fig. 3 is a schematic diagram of a usage state of the operation control device for a graphical user interface of the present invention;
Fig. 4 is a schematic diagram of a usage state of another embodiment of the operation control device for a graphical user interface of the present invention;
Fig. 5 is a schematic diagram of a usage state of another embodiment of the operation control device for a graphical user interface of the present invention.
Wherein, reference numerals:
1 image sensing unit
11 infrared light emitting device; 111 infrared light emitting units
12 image acquisition device; 121 infrared filtering module
122 image sensing module; 13 computing control module
2 graphical user interface
21 pointer; 22 image cursor
Embodiment
Please refer to Fig. 1 and Fig. 2, which are respectively a block diagram and a usage-state schematic diagram of an embodiment of the operation control device for a graphical user interface (graphical user interface, GUI) of the present invention. The device includes an image sensing unit 1 and a graphical user interface 2, wherein the graphical user interface 2 is connected with the image sensing unit 1 and is presented on a screen 20.
The image sensing unit 1 comprises an infrared light emitting device 11, an image acquisition device 12 and a computing control module 13. The infrared light emitting device 11 emits infrared rays toward the user and includes a plurality of infrared light emitting units 111, such as light emitting diodes capable of emitting infrared light with wavelengths between 750 and 1300 nm; in the present embodiment the preferred infrared wavelength is about 850 nm. As shown in Fig. 2, the infrared light emitting device 11 is disposed on the outside of the image acquisition device 12, with the infrared light emitting units 111 surrounding the image acquisition device 12. In the present embodiment the infrared light emitting units 111 are arranged in a ring, but the present invention is not limited thereto; they may also be arranged in a rectangle or along a specific curve, so as to emit uniform infrared light toward the user. The infrared light emitting device 11 may be arranged on the upper side of the display screen 20, so as to emit infrared light evenly toward the user operating the graphical user interface 2.
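The ring arrangement of the emitting units lends itself to a simple geometric sketch. The following is illustrative only; the LED count and radius are assumptions, not values from the patent:

```python
import math

def ring_positions(n_leds, radius):
    """Evenly spaced (x, y) positions for infrared LEDs arranged
    in a ring around the image acquisition device, as in the
    described embodiment. Angles step by 2*pi/n_leds so the
    emitted infrared light is as uniform as possible."""
    return [(radius * math.cos(2 * math.pi * k / n_leds),
             radius * math.sin(2 * math.pi * k / n_leds))
            for k in range(n_leds)]

# Eight hypothetical LEDs on a 30 mm ring
pts = ring_positions(8, 30.0)
print(len(pts))  # 8
```

A rectangular or curved layout, as the text allows, would simply substitute a different position-generating function.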
The image acquisition device 12 includes an infrared filtering module 121 and an image sensing module 122. The infrared filtering module 121 includes a color filter that filters out light outside the infrared wavelengths and lets only infrared light pass. The image acquisition device 12 passes the infrared rays, including those reflected from the user, through the infrared filtering module 121, optically images them, and obtains and receives an infrared image. The image sensing module 122 receives the infrared image, improves the contrast between the user image and the environmental background image in the infrared image, digitizes the infrared image, and outputs a digital image signal that includes the user image and the environmental background image. In the present embodiment, improving the contrast between the user image and the environmental background image can be configured as making the brightness of the user image higher than that of the environmental background image, or making it lower. Alternatively, auxiliary information may be provided in advance, for example by setting an image reference value and dividing the digital image signal into regions: if the rate of change in a region of the digital image signal is greater than the reference value, that region is classified as the user image (foreground); otherwise it is classified as the environmental background image. In this way the user image is captured and locked, and the environmental background image is removed, to further recognize the user's body motions.
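The region-wise foreground classification just described (comparing each region's rate of change against an image reference value) can be sketched roughly as follows. The block size, threshold and frame representation are assumptions for illustration, not the patented implementation:

```python
import numpy as np

def foreground_mask(prev_frame, curr_frame, block=8, ref_value=10.0):
    """Classify each block of the frame as user image (foreground)
    or environmental background.

    A block whose mean absolute change between consecutive frames
    exceeds `ref_value` is treated as foreground, mirroring the
    region-wise comparison against an image reference value."""
    diff = np.abs(curr_frame.astype(np.float32) - prev_frame.astype(np.float32))
    h, w = diff.shape
    mask = np.zeros((h // block, w // block), dtype=bool)
    for by in range(h // block):
        for bx in range(w // block):
            region = diff[by*block:(by+1)*block, bx*block:(bx+1)*block]
            mask[by, bx] = region.mean() > ref_value
    return mask

# Example: a static background with a bright region where the user moved
prev = np.zeros((16, 16), dtype=np.uint8)
curr = prev.copy()
curr[0:8, 0:8] = 200  # the "user" entered the top-left region
mask = foreground_mask(prev, curr, block=8, ref_value=10.0)
print(mask)  # only the top-left block is foreground
```

Once such a mask is available, the background regions can be discarded and only the locked user image passed on for motion recognition.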
Because the user and the environmental background are at different distances from the image acquisition device 12, they have different depths of field. Therefore, the computing control module 13 is connected to the image acquisition device 12 to receive the digital image signal and calculate the depth of field of the user image in the digital image signal, so as to provide auxiliary information for removing the environmental background image from the digital image signal. It locks and tracks only the user image, then further filters out images other than the locked user image that likewise match the depth of field of the user image, and defines the locked user image in a planar reference coordinate system for computation, so as to identify the variation of the user image over a time axis on the planar reference coordinate system and produce operation control signals corresponding to the user's various body motions. Moreover, as shown in Fig. 2, if two image acquisition devices 12 are connected with the computing control module 13, the computing control module 13 can further compare and compute the depth of field of the user image from the digital image signals received from the different image acquisition devices 12, and thereby more easily obtain three-dimensional information about the user's body contour (such as the position or spatial scale of specific parts of the user's body), more accurately determine whether the user's limbs overlap, and recognize the user's distance from the image sensing unit 1 or the screen 20 as well as actions such as moving forward and backward, moving, accelerating or rotating.
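As a rough sketch of why two acquisition devices make depth comparison easier, the classic pinhole stereo relation depth = focal_length x baseline / disparity can be used; the patent does not specify this formula, and the numbers below are illustrative assumptions:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic pinhole stereo relation: depth = f * B / d.

    A feature of the user's body seen at different horizontal
    positions in the two cameras yields a disparity; a larger
    disparity means the feature is closer, which is how two
    image acquisition devices can separate the nearby user
    from the more distant background."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 800 px focal length, 10 cm camera baseline
near = depth_from_disparity(800, 0.10, 40)  # e.g. the user's hand
far = depth_from_disparity(800, 0.10, 4)    # e.g. a background wall
print(near, far)  # 2.0 20.0 (metres): the hand is much nearer
```

Tracking how such depths change over time then yields the forward/backward movement and acceleration information the text mentions.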
The graphical user interface 2 receives the operation control signals produced by the computing control module 13 and shows a specific output response corresponding to them. The screen 20 can be a flat-panel display, or the interface can be presented by being projected onto a projection screen by a projection device. As shown in Fig. 3, the graphical user interface 2 can be presented as a two-dimensional operation interface, for example a user interface including objects such as windows, icons, frames, menus and a pointer 21, wherein the pointer 21 can show specific reactions corresponding to given body actions of the user, such as moving up, down, left and right, selecting, or opening.
In another embodiment, as shown in Fig. 4, the graphical user interface 2 is not limited to the two-dimensional image form; it can also be presented as an immersive virtual reality operating environment, in which the objects of the graphical user interface 2 are presented as three-dimensional stereoscopic images and an image cursor 22 representing the whole body or a specific part of the user's body is included. The image cursor 22 corresponds to the user's body motions and serves the function of a pointer. For example, the image cursor 22 can demonstrate moving or waving actions corresponding to the movement of the user's body or the waving of the user's limbs. In particular, when the user moves forward or backward relative to the image sensing unit 1 or the screen 20, or when individual limbs move forward or backward, jump, or strike forward (such as a palm pushing forward or a punch), the image cursor 22 corresponding to the user's body in the graphical user interface 2 can, upon receiving the operation control signals produced by the image sensing unit 1, demonstrate a specific output response, such as selecting, clicking or opening.
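The mapping from sensed body position to cursor and click described above can be sketched as follows. The function names, the linear mapping, and the push threshold are illustrative assumptions rather than details from the patent:

```python
def to_screen(x_px, y_px, frame_w, frame_h, screen_w, screen_h):
    """Map a tracked body point from camera pixel coordinates to
    screen coordinates (the planar reference coordinate system)."""
    return (x_px * screen_w / frame_w, y_px * screen_h / frame_h)

def detect_push(depths, threshold_m=0.15):
    """Fire a 'click' when the tracked point moves toward the
    sensor by more than `threshold_m` between successive depth
    samples, i.e. a forward push or punch of a limb."""
    return any(a - b > threshold_m for a, b in zip(depths, depths[1:]))

# A hand tracked at the centre of a 640x480 frame on a 1920x1080 screen
cursor = to_screen(320, 240, 640, 480, 1920, 1080)
print(cursor)  # (960.0, 540.0)

# Depth samples in metres: the hand pushes 0.25 m toward the sensor
print(detect_push([1.0, 0.95, 0.7]))  # True -> trigger select/click
```

A real system would smooth the tracked positions over time, but the two steps above capture the pointer-plus-trigger behaviour the interface needs.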
In the above embodiments, the graphical user interface 2 can present various specific scenes, such as a living room, a meeting or a party. Moreover, as shown in Fig. 5, a plurality of image sensing units 1 can be further connected via a computer or a web server to correspond to a plurality of users, so that different users can each operate the image cursor 22 corresponding to themselves in the graphical user interface 2 and interact with one another, giving the users the feeling of being personally on the scene and achieving a richer virtual reality effect.
In summary, the present invention has the following beneficial effects:
1. The image sensing unit 1 of the present invention has a good effect of filtering out the background image, accurately recognizes the image of the user's body, and has good sensitivity.
2. The image sensing unit 1 can further compute the depth of field of the user's body image, and can thus obtain information on the scale of the user's body or its forward and backward movement, so that the user can fully operate and control the graphical user interface with his or her own body motions.
3. The user does not need to rely on additional input devices such as a keyboard, mouse, trackpad or joystick, and can naturally blend into the scene presented by the graphical user interface and operate it with his or her own body motions.
4. The present invention can fully provide the operation of various graphical user interfaces, whether the graphical user interface is presented as a flat image or as a three-dimensional scene, so as to achieve a richer virtual reality effect.
Certainly, the present invention may also have various other embodiments. Without departing from the spirit and essence of the present invention, those of ordinary skill in the art can make various corresponding changes and variations according to the present invention, but all such changes and variations shall belong to the scope of protection of the claims appended to the present invention.

Claims (6)

1. An operation control device for a graphical user interface, characterized by comprising:
an image sensing unit, comprising an infrared light emitting device, an image acquisition device and a computing control module, wherein the infrared light emitting device emits infrared rays toward the user; the image acquisition device passes and optically images the infrared rays, including those reflected from the user, to obtain and receive an infrared image, improves the contrast between the user image and the environmental background image in the infrared image, digitizes the infrared image, and outputs a digital image signal that includes the user image and the environmental background image; and the computing control module is connected to the image acquisition device to receive the digital image signal, identify the variation of the user image over a time axis on a planar reference coordinate system, and produce operation control signals corresponding to the user's various body motions; and
a graphical user interface, displayed on a screen and connected with the image sensing unit, to receive the operation control signals and show a specific output response corresponding thereto;
wherein, after receiving the digital image signal output from the image acquisition device, the computing control module calculates the depth of field of the user image in the digital image signal, so as to provide auxiliary information for removing the environmental background image from the digital image signal.
2. The operation control device for a graphical user interface according to claim 1, characterized in that the computing control module locks only the user image, and then defines the locked user image in the planar reference coordinate system for computation.
3. The operation control device for a graphical user interface according to claim 1, characterized in that the computing control module likewise filters out images other than the locked user image that match the depth of field of the user image.
4. The operation control device for a graphical user interface according to claim 1, characterized in that the device further has a plurality of image acquisition devices; the computing control module is connected to these image acquisition devices and receives a plurality of digital image signals from them, so as to compare the differences and variations of the depth of field of the user image among these digital image signals, calculate the position or spatial scale information of specific parts of the user's body, determine whether the user's limbs overlap, recognize the distance, movement, acceleration or rotation of the user's body, and produce operation control signals corresponding to the user's body motions.
5. The operation control device for a graphical user interface according to claim 1, characterized in that the graphical user interface includes an image cursor representing the whole body or a specific part of the user's body, and the image cursor corresponds to the user's body motions and serves the function of a pointer.
6. The operation control device for a graphical user interface according to claim 1, characterized in that the graphical user interface is presented in a two-dimensional image form or a three-dimensional image form.
CN200810135157XA 2008-08-13 2008-08-13 Operation controller of graphical user interface Active CN101650624B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200810135157XA CN101650624B (en) 2008-08-13 2008-08-13 Operation controller of graphical user interface

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200810135157XA CN101650624B (en) 2008-08-13 2008-08-13 Operation controller of graphical user interface

Publications (2)

Publication Number Publication Date
CN101650624A CN101650624A (en) 2010-02-17
CN101650624B true CN101650624B (en) 2012-07-18

Family

ID=41672870

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200810135157XA Active CN101650624B (en) 2008-08-13 2008-08-13 Operation controller of graphical user interface

Country Status (1)

Country Link
CN (1) CN101650624B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8818027B2 (en) * 2010-04-01 2014-08-26 Qualcomm Incorporated Computing device interface
US8670023B2 (en) * 2011-01-17 2014-03-11 Mediatek Inc. Apparatuses and methods for providing a 3D man-machine interface (MMI)
US9983685B2 (en) 2011-01-17 2018-05-29 Mediatek Inc. Electronic apparatuses and methods for providing a man-machine interface (MMI)
CN102306065A (en) * 2011-07-20 2012-01-04 无锡蜂巢创意科技有限公司 Realizing method of interactive light sensitive touch miniature projection system
CN103376887B (en) * 2012-04-23 2016-08-10 联想(北京)有限公司 A kind of method of information processing and electronic equipment
CN102999165B (en) * 2012-12-11 2015-10-28 广东威创视讯科技股份有限公司 A kind of virtual reality system based on infrared identification and exchange method
CN103777753A (en) * 2013-11-12 2014-05-07 广州新节奏智能科技有限公司 Novel portable body sensing and controlling device and application method thereof
CN106095098A (en) * 2016-06-07 2016-11-09 深圳奥比中光科技有限公司 Body feeling interaction device and body feeling interaction method
TWI649550B (en) * 2017-11-10 2019-02-01 秀育企業股份有限公司 Time-sharing multi-spectral detection device and method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7170492B2 (en) * 2002-05-28 2007-01-30 Reactrix Systems, Inc. Interactive video display system
JP2007128307A (en) * 2005-11-04 2007-05-24 Advanced Telecommunication Research Institute International Operation instruction apparatus
CN101086681A (en) * 2006-06-09 2007-12-12 中国科学院自动化研究所 Game control system and method based on stereo vision


Also Published As

Publication number Publication date
CN101650624A (en) 2010-02-17

Similar Documents

Publication Publication Date Title
CN101650624B (en) Operation controller of graphical user interface
Davis et al. Lumipoint: Multi-user laser-based interaction on large tiled displays
CN101419513B (en) Remote virtual touch system of infrared laser pen
US20180299966A1 (en) Visual collaboration interface
US20110298708A1 (en) Virtual Touch Interface
US20140075370A1 (en) Dockable Tool Framework for Interaction with Large Scale Wall Displays
CA2609155A1 (en) Free-space pointing and handwriting
CN104081307A (en) Image processing apparatus, image processing method, and program
KR20140068855A (en) Adaptive tracking system for spatial input devices
US11030980B2 (en) Information processing apparatus, information processing system, control method, and program
CN108700957A (en) Electronic system and method for text input in virtual environment
CN109313510A (en) Integrated free space and surface input device
US9201519B2 (en) Three-dimensional pointing using one camera and three aligned lights
US20090283341A1 (en) Input device and control method thereof
CN108268204A (en) Electric whiteboard system and its electronic whiteboard and operating method
Cassinelli et al. Smart laser-scanner for 3D human-machine interface
US9310851B2 (en) Three-dimensional (3D) human-computer interaction system using computer mouse as a 3D pointing device and an operation method thereof
Osunkoya et al. Gesture-based human-computer-interaction using Kinect for windows mouse control and powerpoint presentation
CN102163105A (en) Human-computer interaction touch table for screen input of surface computer
US9678583B2 (en) 2D and 3D pointing device based on a passive lights detection operation method using one camera
WO2000070551A1 (en) Stylus pen for writing on the monitor
CN201369027Y (en) Remote finger virtual touch system with infrared laser pen
Lang et al. A multimodal smartwatch-based interaction concept for immersive environments
Nancel et al. Precision pointing for ultra-high-resolution wall displays
US20100064213A1 (en) Operation device for a graphical user interface

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20160421

Address after: 15th Floor, BOC Group Life Insurance Mansion, No. 136 Des Voeux Road, Hong Kong, China

Patentee after: Xien Somatosensory Technology Co., Ltd. (Hong Kong)

Address before: Tortola, British Virgin Islands

Patentee before: CyWee Group Ltd. (British Virgin Islands)

CP02 Change in the address of a patent holder
CP02 Change in the address of a patent holder

Address after: China Hongkong gun Taishan King Road, 1 O twenty-four, building 24, O, one room

Patentee after: Xien Somatosensory Technology Co., Ltd. (Hong Kong)

Address before: 15th Floor, BOC Group Life Insurance Mansion, No. 136 Des Voeux Road, Hong Kong, China

Patentee before: Xien Somatosensory Technology Co., Ltd. (Hong Kong)

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20170601

Address after: China Hongkong gun Taishan King Road, 1 O twenty-four, building 24, O, one room

Patentee after: Xien Somatosensory Technology Co., Ltd.

Address before: China Hongkong gun Taishan King Road, 1 O twenty-four, building 24, O, one room

Patentee before: Xien Somatosensory Technology Co., Ltd. (Hong Kong)