CN102779000A - User interaction system and method - Google Patents

User interaction system and method

Info

Publication number
CN102779000A
CN102779000A (application numbers CN2012101349985A, CN201210134998A)
Authority
CN
China
Prior art keywords
user
signal
processing unit
touch
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012101349985A
Other languages
Chinese (zh)
Other versions
CN102779000B (en)
Inventor
Liu Guangsong (刘广松)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SUZHOU CHUDA INFORMATION TECHNOLOGY CO., LTD.
Original Assignee
Dry Line Consulting (Beijing) Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dry Line Consulting (Beijing) Technology Co., Ltd.
Priority to CN201210134998.5A (granted as CN102779000B)
Publication of CN102779000A
Application granted
Publication of CN102779000B
Legal status: Active
Anticipated expiration

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

An embodiment of the invention discloses a user interaction system and method. The system comprises a signal processing unit, a retina display unit and a touch screen unit. The signal processing unit provides a display signal to the retina display unit. The retina display unit projects the display signal onto the user's retina, so that the user visually perceives a virtual interface on which the display signal is shown. The touch screen unit captures the user's touch operations on its surface and sends the corresponding touch operation information to the signal processing unit. The signal processing unit then determines the interactive command corresponding to the touch operation information, executes it, and supplies the updated display signal to the retina display unit in real time. The system enhances the user experience: it blends virtual information with the real scene, provides touch-based interaction control, gives the user the sensory experience of augmented reality, and can support a large number of useful derived applications.

Description

User interaction system and method
Technical field
The present invention relates to the field of electronic applications, and in particular to a user interaction system and method.
Background
In 1959, the American scholar B. Shackel first proposed the concept of human-machine interaction engineering. Since the late 1990s, with the rapid development and popularization of high-speed processing chips, multimedia technology and Internet technology, research in human-computer interaction has shifted toward intelligent interaction, multi-modal (multi-channel) multimedia interaction, virtual interaction and human-machine cooperation; in short, toward human-centered interaction technology.
With social progress and the arrival of the information explosion era, people increasingly rely on all kinds of consumer electronic devices (such as mobile terminals and personal digital assistants (PDAs)) to obtain information: making phone calls, browsing web pages for news, checking e-mail, and so on. The human-machine interfaces in widespread use today rely on traditional hardware such as the keyboard and mouse and, more recently, on the touch screen.
These existing interaction modes no longer satisfy users, who expect a new generation of human-machine interaction that is as natural, accurate and fast as interaction between people. Research in the 1990s therefore entered the multi-modal stage known as natural human-machine interaction (Human-Computer Nature Interaction, HCNI, or Human-Machine Nature Interaction, HMNI).
At present, applying natural human-machine interaction technology to practical applications remains a great challenge.
Summary of the invention
In view of this, embodiments of the present invention propose a user interaction system to enhance the user experience.
Embodiments of the present invention also propose a user interaction method, likewise to enhance the user experience.
The technical solution of the present invention is as follows:
A user interaction system comprising a signal processing unit, a retina display unit and a touch screen unit, wherein:
the signal processing unit is configured to provide a display signal to the retina display unit;
the retina display unit is configured to project the display signal provided by the signal processing unit onto the user's retina, so that the user visually perceives a virtual interface on which the display signal is shown;
the touch screen unit is configured to capture the user's touch operations on its surface and to send touch operation information corresponding to those operations to the signal processing unit;
the signal processing unit is further configured to determine the interactive command corresponding to the touch operation information and to provide, in real time, the display signal that results from executing that command to the retina display unit.
The retina display unit may be a glasses-type display worn by the user or a direct retinal projection device.
The signal processing unit may be a mobile terminal, a computer, or a cloud-computing-based information service platform.
The signal processing unit and the retina display unit may be physically integrated into a single device.
The signal processing unit may provide a three-dimensional (3D) stereoscopic interface display signal to the retina display unit, and the retina display unit then shows the user a 3D stereoscopic interface according to that signal.
The system may further comprise a viewing-angle sensing unit worn on the user's head. The viewing-angle sensing unit senses the user's head movement and sends the head movement information to the signal processing unit, which determines the user's real-time viewing angle from that information and provides, in real time, the 3D stereoscopic interface display signal for that viewing angle to the retina display unit.
The system may further comprise a motion capture unit, configured to capture the spatial limb movements the user makes while browsing the virtual interface and to send the limb movement information to the signal processing unit. The signal processing unit determines the interactive command corresponding to the limb movement information and provides, in real time, the display signal that results from executing that command to the retina display unit.
The signal processing unit may instead provide a two-dimensional virtual interface display signal to the retina display unit, which then shows the user a two-dimensional virtual interface.
The signal processing unit may further display a virtual pointer element on the virtual interface, the trajectory of the pointer on the virtual interface being consistent with the trajectory of the user's touch operation on the touch screen unit.
A user interaction method comprising:
providing a display signal;
projecting the provided display signal onto the user's retina, so that the user visually perceives a virtual interface on which the display signal is shown;
capturing the user's touch operation on a touch screen and determining the interactive command corresponding to that operation;
providing, in real time, the display signal that results from executing the interactive command.
Providing the display signal may consist of providing a 3D stereoscopic interface display signal.
The method may further comprise: sensing the user's head movement; determining the user's real-time viewing angle from the head movement information; and providing, in real time, the 3D stereoscopic interface display signal for that viewing angle.
The method may further comprise: capturing the spatial limb movements the user makes while browsing the virtual interface; determining the interactive command corresponding to the limb movement information; and providing, in real time, the display signal that results from executing that command.
Capturing the user's limb movements may comprise capturing the precise positioning actions and/or imprecise positioning actions the user makes while browsing the virtual interface.
The user's touch operation on the touch screen may be a single-touch operation or a multi-touch operation.
Single-touch operations include taps, slides and long presses.
Multi-touch operations include rotation and zooming.
As the above technical solution shows, in embodiments of the present invention the signal processing unit provides a display signal to the retina display unit; the retina display unit projects the signal onto the user's retina so that the user perceives a virtual interface showing that signal; the touch screen unit captures the user's touch operations and sends the corresponding touch operation information to the signal processing unit; and the signal processing unit determines the matching interactive command and supplies the updated display signal to the retina display unit in real time. By coupling the virtual interface with the touch screen, embodiments of the invention thus realize a new way for a user to interact with hardware and obtain information, greatly enhancing the user experience.
Moreover, the interaction style is very natural: it relies on basic limb actions (such as gestures) that suit human nature, which lowers the cost of learning to operate the device. The split design, which separates the natural interaction controls from the mobile information-processing hardware, lets people concentrate on the information they care about rather than on the device itself.
Embodiments of the invention also propose a three-dimensional virtual-information natural interaction interface containing many 3D elements that can be manipulated naturally. Through the proposed solution, the user can naturally control with a hand the virtual pointer in that interface corresponding to the user's hand, and so interact naturally with the 3D virtual information.
In addition, the unique display mode of the embodiments is little affected by the environment, provides a high-quality sensory experience, and protects the privacy of the displayed information. Because direct retinal scanning projection blends virtual information with the real scene, it offers the sensory experience of augmented reality, from which a large number of useful applications can be derived, further improving the user experience.
Beyond this, embodiments of the invention can be used with any human-machine interactive information device; this versatility will bring great convenience.
Brief description of the drawings
Fig. 1 is a schematic structural diagram of a user interaction system according to an embodiment of the present invention;
Fig. 2 is a schematic flowchart of a user interaction method according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of gesture-based touch interaction according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of a virtual interface display according to an embodiment of the present invention;
Fig. 5 is a further schematic diagram of a virtual interface display according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of a three-dimensional interface display according to an embodiment of the present invention.
Detailed description
To make the purpose, technical solution and advantages of the embodiments clearer, the embodiments of the present invention are described in further detail below with reference to the accompanying drawings and specific examples.
In embodiments of the present invention, to address the drawback that prior-art interaction means such as keyboards impair the user experience of electronic devices (such as portable electronic devices), a direct retinal display mode is used: the user perceives a virtual screen interface appearing at a certain distance ahead, key information can be highlighted on that interface, and interaction is realized by recognizing the touch screen operations and/or spatial limb actions the user makes toward the virtual interface.
Because embodiments of the invention produce the virtual screen interface by direct retinal scanning projection, the various problems caused by a physical display screen are avoided, and the background field of view is not blocked. The generated virtual screen can serve as an enhancement of the real scene and can be widely used in augmented reality (AR) technology.
Embodiments of the invention further propose a human-oriented interaction scheme for the virtual interface based on touch screen recognition; this scheme seamlessly fuses the virtual interface with the control information of touch screen operations. By optimizing the recognition of a set of basic, typical touch screen operations, a stable interaction development platform can be formed on which developers can build applications of all kinds.
Likewise, embodiments of the invention propose a human-oriented interaction scheme for the virtual interface based on recognizing the user's limb actions (preferably hand gestures); this scheme seamlessly fuses the virtual interface with the body's control information. Again, by optimizing the recognition of a set of basic, typical operations, a stable interaction development platform is formed for developers to build applications of all kinds.
Fig. 1 is a schematic structural diagram of a user interaction system according to an embodiment of the present invention.
As shown in Fig. 1, the system comprises a signal processing unit 101, a retina display unit 102 and a touch screen unit 103, wherein:
the signal processing unit 101 provides a display signal to the retina display unit 102;
the retina display unit 102 projects the display signal provided by the signal processing unit 101 onto the user's retina, so that the user visually perceives a virtual interface on which the display signal is shown;
the touch screen unit 103 captures the user's touch operations on its surface and sends the corresponding touch operation information to the signal processing unit 101;
the signal processing unit 101 further determines the interactive command corresponding to the touch operation information and provides, in real time, the display signal that results from executing that command to the retina display unit 102, which then shows the updated signal on the virtual interface.
The signal processing unit 101 can be any device that can supply a display signal and computing power, such as a mobile terminal, a computer, or even a cloud-computing-based information service platform.
Through its built-in operating system, the signal processing unit 101 processes the corresponding interaction commands, performs the required computation (for example dialing a call or browsing a web page), updates the corresponding display signal in real time over a wired or wireless link, and outputs the signal to the retina display unit 102.
The communication between the signal processing unit 101 and the retina display unit 102 can take many concrete forms, including but not limited to wireless broadband, Bluetooth, infrared, mobile communication or wired transmission.
The retina display unit 102 receives the display signal from the signal processing unit 101 over one of the above communication modes, decodes and conditions it, and projects the resulting image directly onto the user's retina, so that the user perceives a virtual interface (preferably an augmented screen) appearing ahead. The information of interest to the user is presented through this virtual interface.
The retina display unit 102 can produce this virtual interface in several ways. For example, it can be a head-mounted glasses-type display with two ultra-micro display screens, one for each eye; precision optical lenses magnify the images on the screens, presenting a virtual, enlarged screen image to the viewer's eyes.
Alternatively, the retina display unit 102 can produce the virtual interface by direct retinal projection. In this mode the unit is a direct retinal projection device that exploits the persistence of vision: its display chip receives the display signal from the signal processing unit 101 and modulates the RGB laser produced by a miniature laser generator inside the unit, making the low-power laser scan rapidly and cyclically in the horizontal and vertical directions in a specified order. The beam strikes a small region of the retina and produces a light sensation, so the person perceives an image. This display mode does not block the background field of view; the virtual screen is superimposed on the real view, providing the sensory experience of augmented reality.
In one embodiment, once the virtual interface appears, the user browses it and triggers interaction through the touch screen unit 103.
The touch screen unit 103 can be a smart device with a touch pad or touch screen (for example a touch-screen smartphone, a tablet computer or another portable terminal). Through built-in software it records in real time the user's touches and slides on the touch screen, converts them into data, and passes this touch operation information in real time to the signal processing unit 101 through a particular communication data interface, over a wired or wireless link. The signal processing unit 101 analyzes the information and processes the interactive request the user's operation represents.
In one embodiment the touch screen unit 103 can be implemented with any of five basic technologies: vector pressure sensing, resistive, capacitive, infrared or surface acoustic wave. Classified by the working principle of the medium that transmits the touch information, touch screens fall into four types: resistive, capacitive, infrared and surface acoustic wave.
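The touch screen unit's role of recording touch events and forwarding them over a data interface can be sketched as follows. The patent does not specify an event format or encoding; the field names and the JSON serialization here are illustrative assumptions.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class TouchEvent:
    """One sampled touch point; the fields are illustrative, not from the patent."""
    timestamp: float   # seconds since the epoch
    x: float           # normalized [0, 1] horizontal position on the touch screen
    y: float           # normalized [0, 1] vertical position
    phase: str         # "down" | "move" | "up"

def serialize_events(events):
    """Pack a sequence of touch events for transmission (e.g. over Bluetooth or Wi-Fi)."""
    return json.dumps([asdict(e) for e in events])

# A left-to-right slide recorded as three samples.
events = [
    TouchEvent(time.time(), 0.10, 0.50, "down"),
    TouchEvent(time.time(), 0.60, 0.50, "move"),
    TouchEvent(time.time(), 0.90, 0.50, "up"),
]
payload = serialize_events(events)
decoded = json.loads(payload)       # what the signal processing unit 101 would parse
print(len(decoded), decoded[0]["phase"])   # 3 down
```

On the receiving side, the signal processing unit 101 would parse such a payload back into events before analyzing the trajectory.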
Exemplary application scenarios of embodiments of the present invention are described below.
For example, the user wears a glasses-type display (the retina display unit 102) that is either physically integrated with the signal processing unit 101 or connected to it wirelessly. The signal processing unit 101 provides the display signal to the glasses-type display in real time, and through it the user sees a virtual interactive interface ahead in the field of view.
By way of illustration, Fig. 4 and Fig. 5 are schematic diagrams of virtual interface displays according to embodiments of the present invention.
The user connects a touch-screen smart device (the touch screen unit 103) to the signal processing unit 101 wirelessly, for example over Bluetooth or Wi-Fi, and establishes data communication by running on it an application developed on the basis of the embodiments of the present invention.
The user opens this application on the touch-screen smart device. Suppose the user slides a finger from left to right on the touch screen: the application records the finger's movement on the screen and transmits this information to the signal processing unit 101 according to a specific data protocol interface.
From the specific data interface protocol, the signal processing unit 101 parses the user's action as a left-to-right slide of a given distance and converts it into the corresponding effective command for the interface the user is viewing: for example, moving the pointer in that interface a specific distance from left to right. The signal processing unit 101 responds to this command, updates the interface display signal in real time and sends it to the retina display unit 102, which presents the updated interface to the user in real time.
In another embodiment, the user performs a multi-touch operation on the smart device's touch pad (or touch screen). For example, the user makes a zoom-in gesture with two fingers (both fingers slide on the touch pad while the distance between the two contact points increases); the application developed for the embodiments records this action and transfers it to the signal processing unit 101 wirelessly through the specific data interface.
The signal processing unit 101 stores in advance the correspondence between the various touch operations and the concrete interactive commands. When it receives the zoom-in operation information sent by the touch screen unit, it resolves the corresponding interactive command: enlarge the picture in the current interactive display interface. The signal processing unit 101 then performs the enlargement and sends the resulting display signal to the retina display unit 102, which promptly updates the virtual interface.
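The pre-stored correspondence between touch operations and interactive commands can be sketched as a simple lookup table. All operation and command names below are hypothetical, since the patent does not enumerate them.

```python
# Hypothetical mapping: recognized touch operation -> interface command.
COMMAND_TABLE = {
    "swipe_left_to_right": "move_pointer_right",
    "swipe_right_to_left": "previous_page",
    "pinch_open": "zoom_in_image",
    "pinch_close": "zoom_out_image",
    "two_finger_rotate": "rotate_image",
}

def dispatch(operation: str) -> str:
    """Resolve a parsed touch operation to an interactive command.
    Unrecognized operations map to a no-op rather than raising."""
    return COMMAND_TABLE.get(operation, "no_op")

print(dispatch("pinch_open"))   # zoom_in_image
print(dispatch("triple_tap"))   # no_op
```

After the lookup, the unit would execute the command and push the refreshed display signal to the retina display unit 102.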
Preferably, the recognizable interactive operations on the touch screen or touch pad include single-touch and multi-touch operations.
Single-touch operations include taps, slides, long presses, and the like. Multi-touch operations include rotation (two fingers slide on the touch screen or pad while the distance between the contact points stays essentially constant but the angle between the line connecting them and the pad's edge changes) and zooming (two fingers slide while the distance between the contact points increases or decreases), and the like.
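The distinction drawn above between rotation (contact distance roughly constant, connecting-line angle changing) and zooming (contact distance growing or shrinking) can be sketched as follows; the tolerance values are illustrative assumptions.

```python
import math

def classify_two_finger(p1_start, p2_start, p1_end, p2_end,
                        dist_tol=0.05, angle_tol=0.1):
    """Classify a two-finger gesture from its start and end contact points
    (normalized coordinates). Zoom changes the contact distance; rotation
    keeps it roughly constant while the connecting-line angle changes."""
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])
    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])
    d0, d1 = dist(p1_start, p2_start), dist(p1_end, p2_end)
    a0, a1 = angle(p1_start, p2_start), angle(p1_end, p2_end)
    if abs(d1 - d0) > dist_tol:
        return "zoom_in" if d1 > d0 else "zoom_out"
    if abs(a1 - a0) > angle_tol:
        return "rotate"
    return "none"

# Fingers spread apart horizontally: a zoom-in.
print(classify_two_finger((0.4, 0.5), (0.6, 0.5), (0.2, 0.5), (0.8, 0.5)))  # zoom_in
```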
In one embodiment, the signal processing unit 101 further displays a virtual pointer element on the virtual interface, and the pointer's trajectory on the virtual interface is consistent with the trajectory of the user's touch operation on the touch screen unit 103.
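In the simplest reading, keeping the virtual pointer's trajectory consistent with the touch trajectory amounts to a proportional coordinate mapping. A minimal sketch follows, with assumed touch-screen and virtual-interface dimensions.

```python
def map_touch_to_pointer(touch_path, screen_size, virtual_size):
    """Scale a touch trajectory (pixel coordinates on the touch screen) onto
    the virtual interface so the pointer retraces the same shape. The
    dimensions are illustrative assumptions."""
    sw, sh = screen_size
    vw, vh = virtual_size
    return [(x / sw * vw, y / sh * vh) for x, y in touch_path]

# A diagonal slide on a 640x480 touch pad mapped onto a 1920x1080 virtual interface.
path = map_touch_to_pointer([(0, 0), (320, 240)], (640, 480), (1920, 1080))
print(path)  # [(0.0, 0.0), (960.0, 540.0)]
```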
Preferably, the signal processing unit 101 provides a 3D stereoscopic interface display signal to the retina display unit 102, which shows the user a 3D stereoscopic interface accordingly. The user can then browse the 3D interface and obtain a more vivid interactive experience. By way of illustration, Fig. 6 is a schematic diagram of a three-dimensional interface display according to an embodiment of the present invention.
To support 3D stereoscopic display, the retina display unit 102 can be implemented as head-mounted equipment, preferably a glasses-type 3D stereoscopic display device. Such a device creates parallax by presenting slightly different pictures on the micro-displays for the left and right eyes; the brain interprets the binocular parallax to judge the distance of objects and produce stereoscopic vision.
When applied in a 3D display scenario, the system further comprises a viewing-angle sensing unit (not shown in the figures) worn on the user's head. This unit senses the user's head movement and sends the head movement information to the signal processing unit 101, which determines the user's real-time viewing angle from that information and provides, in real time, the 3D stereoscopic interface display signal for that viewing angle to the retina display unit 102.
Concretely, the viewing-angle sensing unit can contain micro-electronic sensors such as a gyroscope, an accelerometer and an electronic compass, together with a data transmission module. Fixed on the user's head, it senses head movement and transfers the corresponding data through the transmission module to the signal processing unit 101, which analyzes it to obtain the user's real-time view direction and its changes. The viewing-angle sensing unit can be physically integrated with the retina display unit 102.
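One minimal way the signal processing unit might derive a real-time viewing angle from the gyroscope data is to integrate the angular rates over time. This is only a sketch: a real implementation would fuse the accelerometer and electronic compass readings to correct gyroscope drift, which the patent leaves unspecified.

```python
def integrate_view_angle(yaw, pitch, gyro_samples, dt):
    """Dead-reckon head orientation from gyroscope angular rates (rad/s).
    Each sample is (yaw rate, pitch rate); dt is the sample period in seconds."""
    for wz, wy in gyro_samples:
        yaw += wz * dt
        pitch += wy * dt
    return yaw, pitch

# Ten samples of a steady 0.5 rad/s head turn at 10 Hz: about 0.5 rad of yaw.
yaw, pitch = integrate_view_angle(0.0, 0.0, [(0.5, 0.0)] * 10, 0.1)
print(round(yaw, 3), round(pitch, 3))  # 0.5 0.0
```

The resulting angle would select which rendering of the 3D interface the unit streams to the retina display unit 102.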
Although the interaction process above is described in terms of a three-dimensional interface, those skilled in the art will appreciate that embodiments of the present invention are not limited to 3D interfaces but can be adapted to any virtual interface.
In a preferred implementation, once the two-dimensional or three-dimensional interface appears, the user can also trigger interaction through various limb actions, preferably hand gestures.
Correspondingly, the system further comprises a motion capture unit 104.
The motion capture unit 104 captures the spatial limb movements the user makes while browsing the virtual interface and sends the limb movement information to the signal processing unit 101.
The signal processing unit 101 further determines the interactive command corresponding to the user's limb movement information and provides, in real time, the display signal that results from executing that command to the retina display unit 102.
Concretely, the motion capture unit 104 captures the user's limb actions by filming the scene in its field of view in real time and passes the resulting image data, including depth-of-field information, to the signal processing unit 101 in real time. The signal processing unit 101 then analyzes the data with a series of software algorithms to obtain the trajectory of the user's limb action (preferably a gesture) and, from it, the intended interaction command, and provides in real time the display signal that results from executing that command to the retina display unit 102.
Fig. 3 is the mutual synoptic diagram of gesture touch-control according to embodiment of the present invention.Such as, if user's hand streaks the field of view of action capturing unit 104 from right to left, the recording and sending view data is given signal operation processing unit 101 during action capturing unit 104.Signal operation processing unit 101 is analyzed from view data through a series of software algorithms and is drawn user's gesture track and be paddling from right to left, confirms as certain interactive command (for example: return page up) through software algorithm again.
In an actual interaction process, the motion capturing unit 104 can capture a series of interactive actions, for example gestures for "start interaction/confirm/select/click", "move (up, down, left, right, forward, backward)", "zoom in", "zoom out", "rotate", and "exit/end interaction". The signal operation processing unit 101 stores in advance the correspondence between interactive actions and interactive commands. After the signal operation processing unit 101 obtains an interactive action, it first determines the interactive command corresponding to that action, executes the command, and controls the retina display unit 102 to output the resulting display state.
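The correspondence between interactive actions and interactive commands that the processing unit stores in advance can be sketched as a simple lookup table. The gesture labels and command names below are purely illustrative assumptions; the patent does not specify a concrete encoding.

```python
# Hypothetical correspondence table, stored in advance by the
# signal operation processing unit: recognized gesture -> command.
GESTURE_COMMANDS = {
    "swipe_left": "previous_page",
    "swipe_right": "next_page",
    "pinch_open": "zoom_in",
    "pinch_close": "zoom_out",
    "two_hand_rotate": "rotate",
    "palm_push": "confirm",
}

def dispatch(gesture):
    """Return the interactive command for a recognized gesture,
    or a no-op when the gesture has no stored correspondence."""
    return GESTURE_COMMANDS.get(gesture, "no_op")
```

A user-defined operation interface, as described below, would amount to letting the user add or override entries in such a table.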
Preferably, the signal operation processing unit 101 possesses a self-learning capability and certain user-defined extended operation functions: the user can improve the system's gesture recognition by training it on his or her own gesture habits, and can customize the gestures and operation modes of various operations according to personal preference. The gesture recognition software presets many parameters, for example skin color information and arm length; initially these parameters take statistical average values so as to fit most users. The self-learning capability is realized in the software algorithms: as the user keeps interacting with the system, the software corrects some of these parameters according to the user's own characteristics, becoming better tuned to that specific user and thereby improving the system's gesture recognition.
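One plausible way to realize the described parameter correction is an exponential moving average that starts from the statistical population value and drifts toward the individual user's measurements as interactions accumulate. This is a hypothetical sketch; the patent does not specify an update rule, and the class name and learning rate are invented for illustration.

```python
class AdaptiveParam:
    """A recognition parameter (e.g. arm length) that starts at a
    statistical average and adapts to the individual user."""

    def __init__(self, population_mean, rate=0.05):
        self.value = population_mean  # initial value from statistics
        self.rate = rate              # adaptation speed per sample

    def observe(self, measurement):
        # Exponential moving average: nudge the stored value toward
        # each new measurement of the user's own characteristic.
        self.value += self.rate * (measurement - self.value)
        return self.value
```

With a small rate the parameter is stable against noisy single measurements but converges to the user's characteristic over many interactions.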
In addition, the gesture recognition software should also provide a user-defined operation interface, so that, for example, a particular gesture trajectory the user likes can be made to represent a user-defined operation command, giving the system a personalized, customizable character.
More specifically, the user's interactive operations on the virtual interface fall into two types. One type is non-precise positioning operations, such as the commands "page turning", "forward", and "back". The other type is precise positioning operations, such as clicking a button in the virtual interface or selecting a specific region.
To recognize non-precise positioning operations, it suffices to record and analyze the motion trajectory of the hand. For example, non-precise positioning operations can include: a right-to-left hand swipe, a left-to-right hand swipe, a top-to-bottom hand swipe, a bottom-to-top hand swipe, two hands separating, two hands coming together, and so on.
To recognize precise positioning operations, the movement trajectory of the hand must be tracked in real time and mapped onto the virtual interface in real time, in one-to-one correspondence with the coordinate values of the virtual operation interface, thereby enabling precise operation of the interface.
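The real-time mapping of the hand trajectory onto virtual-interface coordinates can be sketched as a linear transform from a region of interest in the camera view to interface pixels, with clamping so the virtual pointer never leaves the interface. The function name, the ROI convention, and the coordinate units are assumptions for illustration, not details from the patent.

```python
def hand_to_interface(hand_xy, roi, interface_size):
    """Map a tracked hand position to virtual-interface coordinates.

    hand_xy: (x, y) hand position in camera coordinates.
    roi: (x0, y0, w, h) region of the camera view that is mapped
         onto the whole interface.
    interface_size: (W, H) of the virtual interface in pixels.
    """
    x0, y0, w, h = roi
    W, H = interface_size
    u = (hand_xy[0] - x0) / w * W  # linear map in x
    v = (hand_xy[1] - y0) / h * H  # linear map in y
    # Clamp so the pointer stays on the interface.
    return (min(max(u, 0), W), min(max(v, 0), H))
```

Keeping the ROI smaller than the full camera view gives a comfortable arm range that still reaches every interface coordinate.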
In effect, the touch screen unit 103 and the motion capturing unit 104 each constitute an interactive command input device of the system. Preferably, command priorities can be set for the touch screen unit 103 and the motion capturing unit 104, so as to avoid conflicts between commands from different input devices. For example, the touch screen unit 103 can be given a higher priority and the motion capturing unit 104 a lower one.
In this way, when the touch screen unit 103 and the motion capturing unit 104 deliver interactive commands to the signal operation processing unit 101 at the same time, the interactive command from the touch screen unit 103 is executed first, and the command conflict is avoided.
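This priority scheme can be sketched as a small arbitration step: when several input devices report simultaneously, the command from the highest-priority device wins. The priority table and device labels below are illustrative assumptions.

```python
# Hypothetical priority table: lower number = higher priority.
DEVICE_PRIORITY = {"touch_screen": 0, "motion_capture": 1}

def arbitrate(pending):
    """Pick one command to execute when several input devices report
    at the same time; pending is a list of (device, command) pairs."""
    if not pending:
        return None
    # Choose the pair whose device has the smallest priority number.
    device, command = min(pending, key=lambda p: DEVICE_PRIORITY[p[0]])
    return command
```

In a running system this would be applied per input cycle, discarding the lower-priority commands that arrived in the same cycle.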
Based on the above analysis, embodiments of the present invention also propose a user interaction method.
Fig. 2 is a schematic flow chart of the user interaction method according to an embodiment of the present invention.
As shown in Figure 2, this method comprises:
Step 201: providing a display signal.
Step 202: projecting the provided display signal onto the user's retina, so that the user visually perceives a virtual interface on which said display signal is displayed.
Step 203: capturing the user's touch operation on a touch screen, and determining the interactive operation command corresponding to this touch operation.
Step 204: providing, in real time, the display signal that results from executing this interactive operation command.
In the above flow, step 201 preferably provides a three-dimensional stereoscopic interface display signal, so that a three-dimensional stereoscopic interface can be shown in the subsequent flow. The method may further comprise: sensing the user's head movement information; determining the user's real-time viewing angle from said head movement information; and providing, in real time, the three-dimensional stereoscopic interface display signal for that real-time viewing angle.
The method may further comprise: capturing the spatial body-movement information that the user makes while browsing the virtual interface; determining the interactive operation command corresponding to this body-movement information; and providing, in real time, the display signal that results from executing this interactive operation command.
Moreover, capturing the spatial body-movement information that the user makes while browsing the virtual interface can comprise: capturing the precise positioning operations and/or non-precise positioning operations the user makes while browsing the virtual interface. The user's interactive operations on the virtual interface fall into two types: non-precise positioning operations, such as the commands "page turning", "forward", and "back", and precise positioning operations, such as clicking a button in the virtual interface or selecting a specific region.
To recognize non-precise positioning operations, it suffices to record and analyze the motion trajectory of the hand, for example a right-to-left swipe, a left-to-right swipe, a top-to-bottom swipe, a bottom-to-top swipe, or two hands separating or coming together.
To recognize precise positioning operations, the movement trajectory of the hand must be tracked in real time and mapped onto the virtual interface in real time, in one-to-one correspondence with the coordinate values of the virtual operation interface, thereby enabling precise operation of the interface.
Preferably, the recognizable interactive operations can include single-point touch operations and multi-point touch operations on the touch screen or touch pad.
Single-point touch operations can specifically include clicking, sliding, long pressing, and the like. Multi-point touch operations can specifically include rotation (two fingers slide on the touch screen or touch pad while the distance between the contact points stays basically constant but the angle of the line connecting the contact points changes) and scaling (two fingers slide on the touch screen or touch pad while the distance between the contact points increases or decreases), and the like.
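The two-finger distinction just described (distance roughly constant but connecting-line angle changing for rotation; distance growing or shrinking for scaling) can be sketched directly from two pairs of contact points. The thresholds below are illustrative assumptions, not values from the patent.

```python
import math

def classify_two_finger(p1_start, p2_start, p1_end, p2_end,
                        dist_tol=0.1, angle_tol=10.0):
    """Distinguish rotation from scaling for a two-finger gesture.

    Each argument is an (x, y) touch contact at the start or end of
    the gesture. Rotation: contact distance roughly constant but the
    angle of the connecting line changes. Scaling: distance changes.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def angle(a, b):
        return math.degrees(math.atan2(b[1] - a[1], b[0] - a[0]))

    d0 = dist(p1_start, p2_start)
    d1 = dist(p1_end, p2_end)
    da = abs(angle(p1_end, p2_end) - angle(p1_start, p2_start))
    if abs(d1 - d0) / d0 < dist_tol and da > angle_tol:
        return "rotate"      # distance steady, angle changed
    if d1 > d0 * (1 + dist_tol):
        return "zoom_in"     # contacts moved apart
    if d1 < d0 * (1 - dist_tol):
        return "zoom_out"    # contacts moved together
    return None
```

A production recognizer would also handle angle wrap-around at ±180° and accumulate the decision over several frames rather than a single start/end pair.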
In summary, in embodiments of the present invention, the signal operation processing unit provides a display signal to the retina display unit; the retina display unit projects the display signal onto the user's retina, so that the user visually perceives a virtual interface on which the display signal is displayed; the touch screen unit captures the user's touch operation actions on it and sends the touch operation information corresponding to these actions to the signal operation processing unit; and the signal operation processing unit further determines the interactive operation command corresponding to the touch operation information and provides, in real time, to the retina display unit the display signal that results from executing this command. Thus, by combining interaction on the virtual interface with interaction on the touch screen, embodiments of the present invention realize a new way for the user to interact with hardware devices and obtain information, greatly enhancing the user experience.
Moreover, the interaction mode of the present invention is very natural: it matches the basic body-movement (for example, gesture) interaction patterns innate to humans, and it reduces the user's cost of learning to operate the device. By separating the natural interaction controls from the mobile information-processing hardware, embodiments of the present invention let people concentrate on the information they care about rather than on the hardware itself.
Furthermore, embodiments of the present invention propose a three-dimensional stereoscopic virtual-information natural interaction interface for the natural interaction technology, containing numerous three-dimensional elements that support natural interaction. With the proposed solution, the user can naturally use a hand to control the virtual pointer that corresponds to the hand within this interface, and thereby interact naturally with the three-dimensional stereoscopic virtual information.
In addition, the unique display mode of embodiments of the present invention is little affected by the environment, provides a high-quality sensory experience, and protects the privacy of the displayed information. Through direct retinal-scanning projection, embodiments of the present invention can merge virtual information with the real scene, providing an augmented-reality sensory experience from which a large number of meaningful applications can be derived, further improving the user experience.
Beyond this, embodiments of the present invention can be used with any human-machine interactive information device, and this versatility will bring people great convenience.
The above are merely preferred embodiments of the present invention and are not intended to limit its protection scope. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall be included within its protection scope.

Claims (17)

1. A user interaction system, characterized in that the system comprises a signal operation processing unit, a retina display unit and a touch screen unit, wherein:
the signal operation processing unit is used to provide a display signal to the retina display unit;
the retina display unit is used to project the display signal provided by the signal operation processing unit onto the user's retina, so that the user visually perceives a virtual interface on which said display signal is displayed;
the touch screen unit is used to capture the user's touch operation actions on it, and to send the touch operation information corresponding to these actions to the signal operation processing unit;
said signal operation processing unit is further used to determine the interactive operation command corresponding to this touch operation information, and to provide, in real time, to the retina display unit the display signal that results from executing this interactive operation command.
2. The user interaction system according to claim 1, characterized in that said retina display unit is a glasses-type display or a direct retinal projection device worn by the user.
3. The user interaction system according to claim 1, characterized in that said signal operation processing unit is a mobile terminal, a computer, or a cloud-computing-based information service platform.
4. The user interaction system according to claim 1 or 3, characterized in that said signal operation processing unit and retina display unit are physically integrated into a single body.
5. The user interaction system according to claim 1 or 3, characterized in that the signal operation processing unit is used to provide a three-dimensional stereoscopic interface display signal to the retina display unit;
the retina display unit is used to show the user a three-dimensional stereoscopic interface according to said three-dimensional stereoscopic interface display signal.
6. The user interaction system according to claim 5, characterized in that the system further comprises a viewing-angle perception unit worn on the user's head;
the viewing-angle perception unit is used to sense the user's head movement information, and to send said head movement information to the signal operation processing unit;
the signal operation processing unit is further used to determine the user's real-time viewing angle from said head movement information, and to provide, in real time, to the retina display unit the three-dimensional stereoscopic interface display signal for that real-time viewing angle.
7. The user interaction system according to claim 1 or 3, characterized in that the system further comprises a motion capturing unit;
said motion capturing unit is used to capture the spatial body-movement information that the user makes while browsing the virtual interface, and to send said body-movement information to the signal operation processing unit;
the signal operation processing unit is further used to determine the interactive operation command corresponding to this body-movement information, and to provide, in real time, to the retina display unit the display signal that results from executing this interactive operation command.
8. The user interaction system according to claim 1 or 3, characterized in that
the signal operation processing unit is further used to display a spatial virtual pointer element on the virtual interface, the movement track of said spatial virtual pointer element on the virtual interface being consistent with the track of the user's touch operation on the touch screen unit.
9. The user interaction system according to claim 1 or 3, characterized in that the signal operation processing unit is used to provide a two-dimensional virtual interface display signal to the retina display unit;
the retina display unit is used to show the user a two-dimensional virtual interface according to said two-dimensional virtual interface display signal.
10. A user interaction method, characterized in that the method comprises:
providing a display signal;
projecting the provided display signal onto the user's retina, so that the user visually perceives a virtual interface on which said display signal is displayed;
capturing the user's touch operation on a touch screen, and determining the interactive operation command corresponding to this touch operation;
providing, in real time, the display signal that results from executing this interactive operation command.
11. The user interaction method according to claim 10, characterized in that said providing a display signal is: providing a three-dimensional stereoscopic interface display signal.
12. The user interaction method according to claim 11, characterized in that the method further comprises:
sensing the user's head movement information;
determining the user's real-time viewing angle from said head movement information, and providing, in real time, the three-dimensional stereoscopic interface display signal for that real-time viewing angle.
13. The user interaction method according to claim 10, characterized in that the method further comprises:
capturing the spatial body-movement information that the user makes while browsing the virtual interface;
determining the interactive operation command corresponding to this body-movement information;
providing, in real time, the display signal that results from executing this interactive operation command.
14. The user interaction method according to claim 13, characterized in that said capturing the spatial body-movement information that the user makes while browsing the virtual interface is: capturing the precise positioning operations and/or non-precise positioning operations the user makes while browsing the virtual interface.
15. The user interaction method according to any one of claims 10-13, characterized in that said touch operation of the user on the touch screen is a single-point touch operation or a multi-point touch operation.
16. The user interaction method according to claim 15, characterized in that said single-point touch operation comprises: clicking, sliding, or long pressing.
17. The user interaction method according to claim 15, characterized in that said multi-point touch operation comprises: rotation or scaling.
CN201210134998.5A 2012-05-03 2012-05-03 User interaction system and method Active CN102779000B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210134998.5A CN102779000B (en) 2012-05-03 2012-05-03 User interaction system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210134998.5A CN102779000B (en) 2012-05-03 2012-05-03 User interaction system and method

Publications (2)

Publication Number Publication Date
CN102779000A true CN102779000A (en) 2012-11-14
CN102779000B CN102779000B (en) 2015-05-20

Family

ID=47123925

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210134998.5A Active CN102779000B (en) 2012-05-03 2012-05-03 User interaction system and method

Country Status (1)

Country Link
CN (1) CN102779000B (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103246351A (en) * 2013-05-23 2013-08-14 刘广松 User interaction system and method
CN103686082A (en) * 2013-12-09 2014-03-26 苏州市峰之火数码科技有限公司 Field mapping glasses
CN103823548A (en) * 2012-11-19 2014-05-28 联想(北京)有限公司 Electronic equipment, wearing-type equipment, control system and control method
CN103914128A (en) * 2012-12-31 2014-07-09 联想(北京)有限公司 Head mounted electronic device and input method
CN103970272A (en) * 2014-04-10 2014-08-06 北京智谷睿拓技术服务有限公司 Interaction method and device and user device
CN104063039A (en) * 2013-03-18 2014-09-24 朱慧灵 Human-computer interaction method of wearable computer intelligent terminal
CN104238730A (en) * 2013-06-21 2014-12-24 上海复旦上科多媒体有限公司 Smart grid visualization platform and demonstration control method
CN104298350A (en) * 2014-09-28 2015-01-21 联想(北京)有限公司 Information processing method and wearable electronic device
CN104391575A (en) * 2014-11-21 2015-03-04 深圳市哲理网络科技有限公司 Head mounted display device
CN104995583A (en) * 2012-12-13 2015-10-21 微软技术许可有限责任公司 Direct interaction system for mixed reality environments
CN106155284A (en) * 2015-04-02 2016-11-23 联想(北京)有限公司 Electronic equipment and information processing method
CN106527696A (en) * 2016-10-31 2017-03-22 宇龙计算机通信科技(深圳)有限公司 Method for implementing virtual operation and wearable device
CN106569725A (en) * 2016-11-09 2017-04-19 北京小米移动软件有限公司 A method and device for providing input for smart glasses, and touch device
CN107179876A (en) * 2017-06-30 2017-09-19 吴少乔 Human-computer interaction device based on virtual reality system
CN108786110A (en) * 2018-05-30 2018-11-13 腾讯科技(深圳)有限公司 Gun sight display methods, equipment and storage medium in virtual environment
CN109189312A (en) * 2018-08-06 2019-01-11 北京理工大学 A kind of human-computer interaction device and method for mixed reality

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1115152A (* 1993-12-03 1996-01-17 Texas Instruments Incorporated Visual information system
CN1770063A (* 2004-10-01 2006-05-10 General Electric Company Method and apparatus for surgical operating room information display gaze detection and user prioritization for control
CN1785608A (* 2005-11-10 2006-06-14 Shanghai University Control platform of multifinger mechanical skillful closed ring real time action
CN101258436A (* 2005-09-08 2008-09-03 Swisscom Mobile AG Communication device, system and method
CN101662720A (* 2008-08-26 2010-03-03 Sony Corporation Sound processing apparatus, sound image localized position adjustment method and video processing apparatus
US20110221668A1 (en) * 2010-02-28 2011-09-15 Osterhout Group, Inc. Partial virtual keyboard obstruction removal in an augmented reality eyepiece

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1115152A (* 1993-12-03 1996-01-17 Texas Instruments Incorporated Visual information system
CN1770063A (* 2004-10-01 2006-05-10 General Electric Company Method and apparatus for surgical operating room information display gaze detection and user prioritization for control
CN101258436A (* 2005-09-08 2008-09-03 Swisscom Mobile AG Communication device, system and method
CN1785608A (* 2005-11-10 2006-06-14 Shanghai University Control platform of multifinger mechanical skillful closed ring real time action
CN101662720A (* 2008-08-26 2010-03-03 Sony Corporation Sound processing apparatus, sound image localized position adjustment method and video processing apparatus
US20110221668A1 (en) * 2010-02-28 2011-09-15 Osterhout Group, Inc. Partial virtual keyboard obstruction removal in an augmented reality eyepiece

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103823548B (en) * 2012-11-19 2019-07-26 联想(北京)有限公司 Electronic equipment, wearable device, control system and method
CN103823548A (en) * 2012-11-19 2014-05-28 联想(北京)有限公司 Electronic equipment, wearing-type equipment, control system and control method
CN104995583A (en) * 2012-12-13 2015-10-21 微软技术许可有限责任公司 Direct interaction system for mixed reality environments
CN103914128A (en) * 2012-12-31 2014-07-09 联想(北京)有限公司 Head mounted electronic device and input method
CN103914128B (en) * 2012-12-31 2017-12-29 联想(北京)有限公司 Wear-type electronic equipment and input method
CN104063039A (en) * 2013-03-18 2014-09-24 朱慧灵 Human-computer interaction method of wearable computer intelligent terminal
CN103246351B (en) * 2013-05-23 2016-08-24 刘广松 A kind of user interactive system and method
CN103246351A (en) * 2013-05-23 2013-08-14 刘广松 User interaction system and method
CN104238730A (en) * 2013-06-21 2014-12-24 上海复旦上科多媒体有限公司 Smart grid visualization platform and demonstration control method
CN103686082A (en) * 2013-12-09 2014-03-26 苏州市峰之火数码科技有限公司 Field mapping glasses
CN103970272A (en) * 2014-04-10 2014-08-06 北京智谷睿拓技术服务有限公司 Interaction method and device and user device
CN104298350A (en) * 2014-09-28 2015-01-21 联想(北京)有限公司 Information processing method and wearable electronic device
CN104298350B (en) * 2014-09-28 2020-01-31 联想(北京)有限公司 information processing method and wearable electronic equipment
CN104391575A (en) * 2014-11-21 2015-03-04 深圳市哲理网络科技有限公司 Head mounted display device
CN106155284A (en) * 2015-04-02 2016-11-23 联想(北京)有限公司 Electronic equipment and information processing method
CN106155284B (en) * 2015-04-02 2019-03-08 联想(北京)有限公司 Electronic equipment and information processing method
CN106527696A (en) * 2016-10-31 2017-03-22 宇龙计算机通信科技(深圳)有限公司 Method for implementing virtual operation and wearable device
CN106569725A (en) * 2016-11-09 2017-04-19 北京小米移动软件有限公司 A method and device for providing input for smart glasses, and touch device
CN107179876A (en) * 2017-06-30 2017-09-19 吴少乔 Human-computer interaction device based on virtual reality system
CN108786110A (en) * 2018-05-30 2018-11-13 腾讯科技(深圳)有限公司 Gun sight display methods, equipment and storage medium in virtual environment
CN109189312A (en) * 2018-08-06 2019-01-11 北京理工大学 A kind of human-computer interaction device and method for mixed reality

Also Published As

Publication number Publication date
CN102779000B (en) 2015-05-20

Similar Documents

Publication Publication Date Title
CN102779000B (en) User interaction system and method
US20220121344A1 (en) Methods for interacting with virtual controls and/or an affordance for moving virtual objects in virtual environments
US10754417B2 (en) Systems and methods for operating an input device in an augmented/virtual reality environment
Lv et al. Extending touch-less interaction on vision based wearable device
US10776618B2 (en) Mobile terminal and control method therefor
CN103246351B (en) A kind of user interactive system and method
CN102789313B (en) User interaction system and method
US9651782B2 (en) Wearable tracking device
CN102681651A (en) User interaction system and method
US20100053151A1 (en) In-line mediation for manipulating three-dimensional content on a display device
US11302086B1 (en) Providing features of an electronic product in an augmented reality environment
CN102426486B (en) Stereo interaction method and operated apparatus
CN103793060A (en) User interaction system and method
KR20140070326A (en) Mobile device providing 3d interface and guesture controlling method thereof
KR102147430B1 (en) virtual multi-touch interaction apparatus and method
US11500452B2 (en) Displaying physical input devices as virtual objects
US20190050132A1 (en) Visual cue system
JP2018142313A (en) System and method for touch of virtual feeling
US11776182B1 (en) Techniques for enabling drawing in a computer-generated reality environment
CN102508562A (en) Three-dimensional interaction system
US10171800B2 (en) Input/output device, input/output program, and input/output method that provide visual recognition of object to add a sense of distance
Zhang et al. A hybrid 2D–3D tangible interface combining a smartphone and controller for virtual reality
CN102508561A (en) Operating rod
CN110717993A (en) Interaction method, system and medium of split type AR glasses system
CN102508563A (en) Stereo interactive method and operated device

Legal Events

Date Code Title Description
DD01 Delivery of document by public notice

Addressee: Dry line consulting (Beijing) Technology Co., Ltd.

Document name: Notification of Passing Preliminary Examination of the Application for Invention

C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
ASS Succession or assignment of patent right

Owner name: SUZHOU CHUDA INFORMATION TECHNOLOGY CO., LTD.

Free format text: FORMER OWNER: TOUCHAIR (BEIJING) TECHNOLOGY CO., LTD.

Effective date: 20140211

COR Change of bibliographic data

Free format text: CORRECT: ADDRESS; FROM: 100085 HAIDIAN, BEIJING TO: 215021 SUZHOU, JIANGSU PROVINCE

TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20140211

Address after: 215021 A1503, international science and Technology Park, 1355 Jinji Lake Avenue, Suzhou Industrial Park, Suzhou, Jiangsu, China

Applicant after: SUZHOU CHUDA INFORMATION TECHNOLOGY CO., LTD.

Address before: 100085. Office building 2, building 2, No. 1, Nongda South Road, Beijing, Haidian District, B-201

Applicant before: Dry line consulting (Beijing) Technology Co., Ltd.

C14 Grant of patent or utility model
GR01 Patent grant
DD01 Delivery of document by public notice
DD01 Delivery of document by public notice

Addressee: He Xiaopan

Document name: Approval notice of fee reduction