CN105516815A - Method for controlling operation interface of display device by motion - Google Patents

Method for controlling operation interface of display device by motion Download PDF

Info

Publication number
CN105516815A
CN105516815A CN201410496355.4A
Authority
CN
China
Prior art keywords
user
operation interface
display unit
screen
action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410496355.4A
Other languages
Chinese (zh)
Inventor
吴季庭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Top Victory Investments Ltd
TPV Investment Co Ltd
Original Assignee
TPV Investment Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TPV Investment Co Ltd filed Critical TPV Investment Co Ltd
Priority to CN201410496355.4A priority Critical patent/CN105516815A/en
Publication of CN105516815A publication Critical patent/CN105516815A/en
Pending legal-status Critical Current

Links

Abstract

The invention relates to a method for controlling the operation interface of a display device by motion. The operation interface is displayed on the screen of the display device and comprises a plurality of columns extending along the horizontal direction of the screen and a plurality of layers extending along the depth direction of the screen. The method comprises: displaying the operation interface on the screen of the display device; the display device sensing the user's motion; the user moving a hand to an initial position; the display device mapping the user's motion onto the operation interface; when the user moves the hand left or right along the horizontal direction of the screen, or turns the eyes left or right, the operation interface correspondingly switching step by step between the columns; and when the user moves the hand forward or backward along the depth direction of the screen, the operation interface correspondingly switching step by step between the layers. By having the display device sense the user's motions and map them to switching and selection among the layers and columns, the invention improves convenience of use.

Description

Method for a user to control the operation interface of a display device by motion
Technical field
The present invention relates to a method for a user to control a display device, and in particular to a method for a user to control the operation interface of a display device by motion.
Background technology
An existing smart television usually provides an operation interface through which the user can perform relatively complex operations. For example, different channels are displayed in the operation interface; after the user selects a channel to watch, the selected channel is shown on the screen in full-screen mode. Because of the limited screen size, not all channels can be displayed in the operation interface at once. In the prior art, the channels are therefore presented as a plurality of columns arranged along the horizontal direction of the screen and a plurality of layers arranged along the depth direction of the screen, and a dedicated control device, such as the direction keys of the television's remote control, is used to switch between these layers and columns. This mode of operation, however, requires a remote control; if the remote control has no battery power or is simply not at hand, convenience of use suffers.
Summary of the invention
The object of the present invention is to provide a method for a user to control the operation interface of a display device by motion, in which the display device senses the user's motions and maps them to switching and selection among layers and columns, thereby improving convenience of use.
To achieve the above object, the present invention provides a method for a user to control the operation interface of a display device by motion. The operation interface is displayed on the screen of the display device and comprises a plurality of columns extending along the horizontal direction of the screen and a plurality of layers extending along the depth direction of the screen. The method comprises: displaying the operation interface on the screen of the display device; the display device sensing the user's motion; the user moving a hand to an initial position; the display device mapping the user's motion onto the operation interface; when the user moves the hand left or right along the horizontal direction of the screen, or turns the eyes left or right, the operation interface correspondingly switching step by step between the columns; and when the user moves the hand forward or backward along the depth direction of the screen, the operation interface correspondingly switching step by step between the layers.
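The column/layer model described above can be pictured as a grid with one cursor. The following minimal sketch (in Python; all names are hypothetical and not from the patent) represents the interface as columns × layers and maps each left/right or forward/backward gesture to a single-step switch:

```python
class OperationInterface:
    """Grid of columns (horizontal axis of the screen) and layers
    (depth axis of the screen), navigated one step per gesture."""

    def __init__(self, num_columns, num_layers):
        self.num_columns = num_columns
        self.num_layers = num_layers
        self.column = 0  # current column index
        self.layer = 0   # current layer index

    def move_horizontal(self, steps):
        # A left/right hand (or eye) movement switches one column per step,
        # clamped at the first and last column.
        self.column = max(0, min(self.num_columns - 1, self.column + steps))

    def move_depth(self, steps):
        # A forward/backward hand movement switches one layer per step.
        self.layer = max(0, min(self.num_layers - 1, self.layer + steps))

    def select(self):
        # The (column, layer) whose content would be shown full screen.
        return (self.column, self.layer)


ui = OperationInterface(num_columns=3, num_layers=3)
ui.move_horizontal(+1)   # hand moves right: second column
ui.move_depth(+1)        # hand moves toward the screen: second layer
print(ui.select())       # -> (1, 1)
```

The clamping at the grid edges is one plausible design choice; the patent itself does not state what happens at the boundary columns and layers.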
In an embodiment of the invention, the step of the display device mapping the user's motion onto the operation interface further comprises: when the user moves the hand left or right along the horizontal direction of the screen, the operation interface correspondingly switching step by step between the columns; or, when the user turns the eyes left or right along the horizontal direction of the screen, the operation interface correspondingly switching step by step between the columns.
In an embodiment of the invention, after the step of the display device mapping the user's motion onto the operation interface, the method further comprises: the operation interface switching to a specific column and layer; the user holding the hand stationary for longer than a predetermined time; and, after the display device senses that the user's hand has remained stationary longer than the predetermined time, the operation interface selecting the specific column and layer and displaying its content on the screen.
In an embodiment of the invention, the step of the user moving a hand to the initial position and the step of the display device mapping the user's motion onto the operation interface further comprise: the user opening the palm and facing the palm toward the display device; and, after the display device senses that the user's palm faces the display device, the display device mapping the user's motion onto the operation interface.
In an embodiment of the invention, after the step of the display device mapping the user's motion onto the operation interface, the method further comprises: the operation interface switching to a specific column and layer; the user clenching the hand into a fist; and, after the display device senses that the user has clenched a fist, the operation interface selecting the specific column and layer and displaying its content on the screen.
In an embodiment of the invention, after the step of the display device mapping the user's motion onto the operation interface, the method further comprises: the operation interface switching to a specific column and layer; the user retracting specific fingers against the palm; and, after the display device senses that the user's specific fingers are retracted against the palm, the operation interface selecting the specific column and layer and displaying its content on the screen.
In an embodiment of the invention, the operation interface selects the specific column and layer and displays the content of the specific column and layer on the screen in full-screen mode.
In an embodiment of the invention, the step of the operation interface correspondingly switching step by step between the layers when the user moves the hand forward or backward along the depth direction of the screen further comprises: the screen simultaneously showing a plurality of layers arranged along the depth direction of the screen, each layer having a tab, the tabs partly overlapping one another; and, when the user moves the hand forward or backward along the depth direction of the screen, the operation interface correspondingly switching step by step between the tabs; wherein the tab switched to produces a special effect, the special effect comprising bouncing, shifting, highlighting, emphasis, or enlargement.
In an embodiment of the invention, the step of the display device mapping the user's motion onto the operation interface further comprises: the display device identifying the user's fingerprint or palm print; and the display device analyzing a skeleton-image structure of the user.
In an embodiment of the invention, the display device senses the user's motion by image capture or by ultrasound; or the display device wirelessly senses the user's motion through a camera or sensor on a portable device.
In the method proposed by the invention, the display device senses the user's motions and maps them to switching and selection among the layers and columns, which improves convenience of use.
Accompanying drawing explanation
The following drawings only schematically illustrate and explain the present invention and do not limit its scope.
Fig. 1 is a flowchart of the method for a user to control the operation interface of a display device by motion according to an embodiment of the invention.
Fig. 2 is a schematic diagram of the user's hand and the display device according to an embodiment of the invention.
Fig. 3 is a first schematic diagram of the user controlling the operation interface of the display device by motion according to an embodiment of the invention.
Fig. 4 is a second schematic diagram of the user controlling the operation interface of the display device by motion according to an embodiment of the invention.
Fig. 5 is a third schematic diagram of the user controlling the operation interface of the display device by motion according to an embodiment of the invention.
Fig. 6 is a fourth schematic diagram of the user controlling the operation interface of the display device by motion according to an embodiment of the invention.
Fig. 7 is a block diagram of the display device according to an embodiment of the invention.
Reference numerals:
10 display device
11 screen
12 operation interface
13 arrow
14 signal computation and processing unit
15 image sensing unit
16 storage unit
20 hand
111 first column, first layer
112 first column, second layer
113 first column, third layer
121 second column, first layer
122 second column, second layer
123 second column, third layer
131 third column, first layer
132 third column, second layer
133 third column, third layer
D depth direction
H horizontal direction
S101 Display the operation interface on the screen of the display device
S103 The display device senses the user's motion
S105 Has the user moved a hand to the initial position?
S107 The display device maps the user's motion onto the operation interface
S109 Does the user move the hand left or right along the horizontal direction of the screen?
S111 The operation interface switches step by step left and right between the columns
S113 Does the user move the hand forward or backward along the depth direction of the screen?
S115 The operation interface switches step by step forward and backward between the layers
S117 Has the user's hand remained stationary longer than the predetermined time?
S119 The operation interface displays the content of the column and layer switched to on the screen.
Embodiment
To make the objects, features, and advantages of the present invention more readily understood by those of ordinary skill in the art, the invention is described in detail below in conjunction with embodiments and the accompanying drawings.
Please refer to Fig. 1, Fig. 2 and Fig. 3. Fig. 1 is a flowchart of the method for a user to control the operation interface 12 of the display device 10 by motion according to an embodiment of the invention; Fig. 2 is a schematic diagram of the user's hand 20 and the display device 10; Fig. 3 is a first schematic diagram of the user controlling the operation interface 12 of the display device 10 by motion. The method of the present embodiment is applied to a smart television, but is not limited thereto. As shown in Fig. 2, the display device 10 comprises a screen 11, the operation interface 12 of the display device 10 is shown on the screen 11, and the user who controls the operation interface 12 stands in front of the display device 10, with the hand 20 positioned as shown in Fig. 2. As shown in Fig. 2 and Fig. 3, the operation interface 12 comprises a plurality of columns extending along the horizontal direction H of the screen 11 and a plurality of layers extending along the depth direction D of the screen 11. In the present embodiment the display device 10 is a flat-panel display; in other embodiments the display device may be a stereoscopic (3D) display, either a glasses-free 3D display or a glasses-type 3D display.
Please also refer to Fig. 3 to Fig. 6; Fig. 4 to Fig. 6 are, in order, the second to fourth schematic diagrams of the user controlling the operation interface 12 of the display device 10 by motion. In the present embodiment, the screen 11 can simultaneously show a plurality of layers arranged along the depth direction D of the screen 11 and two columns side by side along the horizontal direction H of the screen 11; columns beyond the border of the screen 11 are hidden. For example, the first column's first layer 111, second layer 112, and third layer 113 shown in Fig. 5 are hidden in Fig. 3, because in the virtual space they lie to the left of the second column's first layer 121, second layer 122, and third layer 123, beyond the display range of the screen 11. Each layer of each column has a tab: the tab of the second column's first layer 121 shows "PocketMonster", the tab of the second column's second layer 122 shows "SnowWhite", the tab of the third column's first layer 131 shows "Kano", the tab of the third column's second layer 132 shows "RushHour", and so on. The text on a tab represents the content of that specific column and layer. The tabs of the different layers of the same column at least partly overlap one another, but never overlap completely, so that the tabs of all layers of each column remain visible along the depth direction D of the screen 11. In the present embodiment, the different layers of the same column are films of the same category or attribute: the films of the first column's first layer 111 through third layer 113 are science fiction, the films of the second column's first layer 121 through third layer 123 are animation, and the films of the third column's first layer 131 through third layer 133 are action films, but the invention is not limited thereto.
As shown in Fig. 1, the method for the user to control the operation interface 12 of the display device 10 by motion comprises the following steps. In step S101, after the user powers on the display device 10 and opens the operation interface 12, the operation interface 12 is shown on the screen 11 of the display device 10. In step S103, the display device 10 starts and continues to sense the user's motions, for example the user's position and facial or limb movements, to determine whether the user is about to start controlling the operation interface 12 by motion. The display device 10 senses the user's motion by image capture: for example, the display device 10 may have a built-in or external dual camera (not shown), use the dual camera to capture images of the space in front of the display device 10, and perform a three-dimensional matrix computation, thereby determining the user's position and the movement of the user's hand 20 in three-dimensional space.
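The dual-camera arrangement above recovers the hand's position in 3-D space. As an illustration only (the patent gives no formulas), the depth of a point can be recovered from the disparity between a rectified stereo image pair by standard triangulation, Z = f·B/d:

```python
def stereo_depth(focal_px, baseline_m, x_left_px, x_right_px):
    """Depth of a point from its horizontal disparity in a rectified
    stereo pair: Z = f * B / d (pinhole model, parallel cameras).
    All parameter names are illustrative, not from the patent."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_m / disparity

# A hand feature seen at x = 640 px in the left image and x = 600 px in
# the right image, with an 800 px focal length and 10 cm camera baseline:
z = stereo_depth(focal_px=800, baseline_m=0.10, x_left_px=640, x_right_px=600)
print(round(z, 2))  # -> 2.0 (metres in front of the cameras)
```

Tracking this depth value over time is what lets the device distinguish the forward/backward layer gesture from the left/right column gesture.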
Please also refer to Fig. 7, a block diagram of the display device according to an embodiment of the invention. In the present embodiment, the display device 10 comprises, in addition to the screen 11 and the operation interface 12, a signal computation and processing unit 14, an image sensing unit 15, and a storage unit 16. The image sensing unit 15 is the dual camera built into the display device 10; the video signal it senses is passed to the signal computation and processing unit 14, which, together with the storage unit 16, performs the three-dimensional matrix computation and maps the user's motion onto the operation interface 12 shown on the screen 11. In other embodiments, the display device may sense the user's motion by ultrasound, or wirelessly through a camera or sensor on a portable device such as a smartphone, a tablet computer, or smart glasses. In the case of a tablet, a dual camera capable of the three-dimensional matrix computation can be mounted beside the tablet's screen; the tablet is placed next to the user, captures and analyzes the user's motion, and transmits it to the display device. In the case of a smartphone, the user can simply hold the phone, whose built-in gravity sensor (G-sensor) senses the hand's motion and transmits it to the display device.
In step S105, the display device 10 continues sensing the user's motion to determine whether the user has moved the hand 20 to the initial position; if not, the method returns to step S103; if so, it proceeds to step S107. In the present embodiment, the initial position is defined as the position of the hand 20 relative to the body when the hand 20 is at the chest; when the user lifts the hand 20 near the chest, the display device 10 confirms that the hand 20 has been moved to the initial position. In other embodiments, step S105 may instead define the initial position by the position and orientation of the user's palm: when the user opens the palm and faces it toward the display device, the display device confirms that the hand has been moved to the initial position and proceeds to step S107. In step S107, the display device 10 maps the motion of the user's hand 20 onto the operation interface 12, so that from this point the movement of the hand 20 controls the operation interface 12; in other embodiments, the turning of the user's eyes may also be mapped onto the operation interface, so that eye movement controls it. In step S109, the display device 10 senses and confirms whether the user moves the hand 20 left or right along the horizontal direction H of the screen 11; if not, it proceeds to step S113; if so, it first performs step S111 and then proceeds to step S113. As shown in Fig. 2, the horizontal direction H of the screen 11 is the direction parallel to the screen 11. In step S111, when the user moves the hand 20 left or right along the horizontal direction H of the screen 11, the operation interface 12 correspondingly switches step by step left and right between the columns. As shown in Fig. 3, the arrow 13 of the operation interface 12 initially points at the second column's first layer 121; when the user's hand 20 moves right, the arrow 13 correspondingly moves right and, as shown in Fig. 4, points at the third column's first layer 131. If the user's hand 20 then moves left twice, the arrow 13 correspondingly moves left twice and, as shown in Fig. 5, the first column moves into the screen 11, the second column moves to the right side of the screen 11, the third column moves out of the screen 11, and the arrow 13 points at the first column's first layer 111. In other embodiments, the display device may also sense and confirm whether the user turns the eyes left or right along the horizontal direction of the screen; when the user does so, the operation interface correspondingly switches step by step left and right between the columns.
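The stepwise column switching of step S111, together with the two-column visible window of Fig. 3 to Fig. 5, can be sketched as follows (hypothetical Python, not from the patent): the arrow index moves one column per gesture, and the pair of on-screen columns scrolls only when the arrow would leave it:

```python
def step_column(arrow, window_start, num_columns, direction, visible=2):
    """Move the arrow one column left (-1) or right (+1), scrolling the
    window of `visible` on-screen columns when the arrow leaves it."""
    arrow = max(0, min(num_columns - 1, arrow + direction))
    if arrow < window_start:                  # arrow left the window: scroll left
        window_start = arrow
    elif arrow >= window_start + visible:     # arrow left the window: scroll right
        window_start = arrow - visible + 1
    return arrow, window_start

# Fig. 3 -> Fig. 4: arrow on column index 1 (second column), hand moves right.
arrow, win = step_column(arrow=1, window_start=1, num_columns=3, direction=+1)
print(arrow, win)  # -> 2 1
# Fig. 4 -> Fig. 5: hand moves left twice; the first column scrolls into view.
for _ in range(2):
    arrow, win = step_column(arrow, win, num_columns=3, direction=-1)
print(arrow, win)  # -> 0 0
```

The window size of two columns matches the embodiment's description that two columns are shown side by side while the rest are hidden beyond the screen border.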
In step S113, the display device 10 senses and confirms whether the user moves the hand 20 forward or backward along the depth direction D of the screen 11; if not, it proceeds to step S117; if so, it first performs step S115 and then proceeds to step S117. As shown in Fig. 2, the depth direction D of the screen 11 is the direction perpendicular to the screen 11. In step S115, when the user moves the hand 20 forward or backward along the depth direction D of the screen 11 (that is, toward or away from the screen 11), the operation interface 12 correspondingly switches step by step forward and backward between the layers. As shown in Fig. 3, the arrow 13 of the operation interface 12 initially points at the second column's first layer 121; when the user's hand 20 moves forward, the layer switches from the second column's first layer 121 to the second column's second layer 122; if the user's hand 20 then moves backward, it switches back to the second column's first layer 121. In the present embodiment, switching layers means switching between the layers and their tabs, but the invention is not limited thereto. The tab switched to produces a special effect, for example bouncing, shifting, highlighting, emphasis, or enlargement. In the present embodiment, if the tab switched to is that of the second column's second layer 122, then, as shown in Fig. 6, the tab of the second column's second layer 122 is slightly enlarged and partly covers the text on the tab of the second column's third layer 123, making the second column's second layer 122 stand out.
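A forward/backward hand movement must be quantized into discrete layer steps. One simple approach (an assumption for illustration; the patent does not specify the algorithm or thresholds) is to compare the change in the hand's sensed distance from the screen against a step size:

```python
def depth_to_layer_steps(prev_z_m, curr_z_m, step_m=0.10):
    """Convert a change in hand-to-screen distance into whole layer
    steps: here, moving 10 cm toward the screen = one layer forward (+1).
    int() truncates toward zero, so partial steps are ignored."""
    delta = prev_z_m - curr_z_m          # positive when moving toward the screen
    return int(delta / step_m)

print(depth_to_layer_steps(2.00, 1.88))  # -> 1  (12 cm forward: one layer)
print(depth_to_layer_steps(2.00, 2.05))  # -> 0  (5 cm back: below threshold)
print(depth_to_layer_steps(2.00, 2.21))  # -> -2 (21 cm back: two layers back)
```

The dead zone created by truncation keeps small, unintentional hand tremors from flipping layers, which matters for the dwell-based selection that follows in step S117.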
In step S117, the display device 10 senses whether the user's hand 20 has remained stationary longer than a predetermined time, for example three seconds. If not, the user has not yet selected a target, so steps S109 to S117 are repeated; if so, the user has selected a target, and the method proceeds to step S119. Taking Fig. 6 as an example, if "SnowWhite" of the second column's second layer 122 is the target the user wishes to select, the user holds the hand 20 controlling the operation interface 12 stationary for more than three seconds, and the display device 10 recognizes "SnowWhite" as the user's selected target. In other embodiments, step S117 may confirm the user's selection in other ways. For example, if the user has the display device map the motion onto the operation interface by opening the palm toward the screen, the display device may instead judge whether the user's hand clenches into a fist: when the user clenches a fist while controlling the operation interface and the display device senses it, the user's target is selected and the display device proceeds to step S119. Alternatively, the display device may judge by specific finger movements: for example, when the user retracts the index, middle, ring, and little fingers against the palm and extends only the thumb, or retracts only the thumb and extends the other four fingers, the display device judges that the user has selected the target. In step S119, the operation interface 12 selects the specific column and layer and displays its content on the screen 11. Taking Fig. 6 as an example, the display device 10 displays the content of the second column's second layer 122 on the screen 11 in full-screen mode; in other words, the display device 10 starts playing "SnowWhite".
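The dwell-to-select confirmation of step S117 can be sketched as a timer over successive hand samples (hypothetical Python; the three-second dwell comes from the embodiment, while the 2 cm jitter tolerance is an illustrative assumption):

```python
def dwell_selected(samples, dwell_s=3.0, tolerance_m=0.02):
    """samples: list of (timestamp_s, (x, y, z)) hand positions.
    Returns True once the hand has stayed within `tolerance_m` of its
    first sampled position for at least `dwell_s` seconds."""
    if not samples:
        return False
    t0, p0 = samples[0]
    for t, p in samples:
        drift = max(abs(a - b) for a, b in zip(p, p0))
        if drift > tolerance_m:
            return False                 # hand moved: dwell broken
    return samples[-1][0] - t0 >= dwell_s

still = [(0.0, (0.5, 1.2, 2.0)), (1.5, (0.51, 1.2, 2.0)), (3.2, (0.5, 1.21, 2.0))]
print(dwell_selected(still))   # -> True  (3.2 s without moving: select)
moved = [(0.0, (0.5, 1.2, 2.0)), (2.0, (0.8, 1.2, 2.0))]
print(dwell_selected(moved))   # -> False (hand drifted 30 cm: no selection)
```

In a real implementation the sample buffer would reset whenever the dwell is broken, so the user can resume navigating immediately.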
In other embodiments, if several people are in front of the display device at the same time, the display device can further analyze fingerprints or palm prints to confirm who the real user is. For example, after the camera captures each person's hand, the display device's built-in image recognition software identifies the fingerprint or palm print the user previously registered in the system, and only that user's motion is mapped onto the operation interface; everyone else is excluded, so the user is not disturbed while controlling the operation interface. In addition, to prevent the limbs of non-users or other objects from interfering, the display device can also analyze a skeleton-image structure, exclude implausible parts, and only then map the user's motion onto the operation interface. The skeleton-image structure is, for example, a structure that combines the skeleton image of a human body into a simulated body of lines, comprising principal structures such as the head, trunk, and four limbs with their joints. When a fake hand is held in the user's hand, the fake hand appears in the skeleton-image structure as an extra, implausible part, so its motion is excluded and cannot interfere with the operation.
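The skeleton-plausibility filtering above can be sketched as a per-arm endpoint check: a tracked arm chain that ends in more than one hand-like endpoint (e.g. a fake hand held in the real hand) keeps only its first endpoint (hypothetical Python; the patent names no concrete algorithm or data structure):

```python
def plausible_hands(skeleton):
    """skeleton: dict mapping each arm ('left', 'right') to the list of
    hand-like endpoints detected at the end of that arm's joint chain,
    ordered from the wrist outward. A real arm ends in exactly one hand,
    so any extra endpoint is treated as implausible and dropped."""
    kept = {}
    for arm, endpoints in skeleton.items():
        kept[arm] = endpoints[:1]  # keep the endpoint nearest the wrist
    return kept

tracked = {"left": ["hand"], "right": ["hand", "fake_hand"]}
print(plausible_hands(tracked))  # -> {'left': ['hand'], 'right': ['hand']}
```

Only the endpoints that survive this filter would feed the gesture steps S109 to S117, so a prop's motion never reaches the operation interface.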
Although the present invention is disclosed above by way of embodiments, the embodiments are not intended to limit the invention. Any person of ordinary skill in the art may make minor changes and refinements without departing from the spirit and scope of the invention; the scope of protection of the invention is therefore defined by the appended claims.

Claims (10)

1. A method for a user to control the operation interface of a display device by motion, the operation interface being displayed on the screen of the display device and comprising a plurality of columns extending along the horizontal direction of the screen and a plurality of layers extending along the depth direction of the screen, the method comprising:
displaying the operation interface on the screen of the display device;
the display device sensing the user's motion;
the user moving a hand to an initial position;
the display device mapping the user's motion onto the operation interface; and
when the user moves the hand forward or backward along the depth direction of the screen, the operation interface correspondingly switching step by step forward and backward between said layers.
2. The method of claim 1, wherein the step of the display device mapping the user's motion onto the operation interface further comprises:
when the user moves the hand left or right along the horizontal direction of the screen, the operation interface correspondingly switching step by step left and right between said columns; or
when the user turns the eyes left or right along the horizontal direction of the screen, the operation interface correspondingly switching step by step left and right between said columns.
3. The method of claim 2, further comprising, after the step of the display device mapping the user's motion onto the operation interface:
the operation interface switching to a specific column and layer;
the user holding the hand stationary for longer than a predetermined time; and
after the display device senses that the user's hand has remained stationary longer than the predetermined time, the operation interface selecting the specific column and layer and displaying the content of the specific column and layer on the screen.
4. The method of claim 2, wherein the step of the user moving a hand to the initial position and the step of the display device mapping the user's motion onto the operation interface further comprise:
the user opening the palm and facing the palm toward the display device; and
after the display device senses that the user's palm faces the display device, the display device mapping the user's motion onto the operation interface.
5. The method of claim 4, further comprising, after the step of the display device mapping the user's motion onto the operation interface:
the operation interface switching to a specific column and layer;
the user clenching the hand into a fist; and
after the display device senses that the user has clenched a fist, the operation interface selecting the specific column and layer and displaying the content of the specific column and layer on the screen.
6. The method of claim 4, further comprising, after the step of the display device mapping the user's motion onto the operation interface:
the operation interface switching to a specific column and layer;
the user retracting specific fingers against the palm; and
after the display device senses that the user's specific fingers are retracted against the palm, the operation interface selecting the specific column and layer and displaying the content of the specific column and layer on the screen.
7. The method of claim 3, 5 or 6, wherein the operation interface selects the specific column and layer and displays the content of the specific column and layer on the screen in full-screen mode.
8. The method of claim 2, wherein, when the user moves the hand forward or backward along the depth direction of the screen, the step of the operation interface correspondingly switching step by step forward and backward between said layers further comprises:
the screen simultaneously showing a plurality of layers arranged along the depth direction of the screen, each layer having a tab, said tabs partly overlapping one another; and
when the user moves the hand forward or backward along the depth direction of the screen, the operation interface correspondingly switching step by step between said tabs;
wherein the tab switched to produces a special effect, said special effect comprising bouncing, shifting, highlighting, emphasis, or enlargement.
9. The method of claim 1, wherein the step of the display device mapping the user's motion onto the operation interface further comprises:
the display device identifying the user's fingerprint or palm print; and
the display device analyzing a skeleton-image structure of the user.
10. The method of claim 1, wherein the display device senses the user's motion by image capture or by ultrasound; or the display device wirelessly senses the user's motion through a camera or sensor on a portable device.
CN201410496355.4A 2014-09-25 2014-09-25 Method for controlling operation interface of display device by motion Pending CN105516815A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410496355.4A CN105516815A (en) 2014-09-25 2014-09-25 Method for controlling operation interface of display device by motion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410496355.4A CN105516815A (en) 2014-09-25 2014-09-25 Method for controlling operation interface of display device by motion

Publications (1)

Publication Number Publication Date
CN105516815A true CN105516815A (en) 2016-04-20

Family

ID=55724400

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410496355.4A Pending CN105516815A (en) 2014-09-25 2014-09-25 Method for controlling operation interface of display device by motion

Country Status (1)

Country Link
CN (1) CN105516815A (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102981610A (en) * 2008-09-29 2013-03-20 株式会社日立制作所 Input Apparatus
CN101901050A (en) * 2009-04-23 2010-12-01 日立民用电子株式会社 Input media
CN102770828A (en) * 2010-02-09 2012-11-07 微软公司 Handles interactions for human-computer interface
US20120050273A1 (en) * 2010-08-26 2012-03-01 Samsung Electronics Co., Ltd. Apparatus and method for controlling interface
CN102917271A (en) * 2011-08-05 2013-02-06 三星电子株式会社 Method for controlling electronic apparatus and electronic apparatus applying the same
CN102789312A (en) * 2011-12-23 2012-11-21 乾行讯科(北京)科技有限公司 User interaction system and method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109416614A (en) * 2016-11-30 2019-03-01 日本聚逸株式会社 Application program controlling program, application control method and application control system
CN109416614B (en) * 2016-11-30 2023-04-04 日本聚逸株式会社 Method implemented by computer and non-volatile computer-readable medium, system
CN108939211A (en) * 2018-07-31 2018-12-07 佛山市苔藓云链科技有限公司 A kind of transfusion monitoring system
CN109124580A (en) * 2018-07-31 2019-01-04 佛山市苔藓云链科技有限公司 The shared remote control medical device of one kind

Similar Documents

Publication Publication Date Title
CN106484085B (en) The method and its head-mounted display of real-world object are shown in head-mounted display
EP2630563B1 (en) Apparatus and method for user input for controlling displayed information
EP3098689B1 (en) Image display device and image display method
CN102566747B (en) Mobile terminal and method for controlling operation of mobile terminal
CN102929388B (en) Full space posture inputs
CN104246661B (en) Interacted using gesture with device
CN102662577B (en) A kind of cursor operating method based on three dimensional display and mobile terminal
CN110476142A (en) Virtual objects user interface is shown
US20140157206A1 (en) Mobile device providing 3d interface and gesture controlling method thereof
CN116194868A (en) Apparatus, method and graphical user interface for interacting with a three-dimensional environment
CN106055090A (en) Virtual reality and augmented reality control with mobile devices
CN109313500A (en) The passive optical and inertia of very thin form factor track
US20140267637A1 (en) Hybrid stereoscopic viewing device
US20160162155A1 (en) Information processing device, information processing method, and program
CN105917291A (en) Wearable device with multi-mode display system
EP3106963B1 (en) Mediated reality
CN104571849A (en) Wearable device and method for controlling the same
CN103729054A (en) Multi display device and control method thereof
US20160371888A1 (en) Interactive information display
CN108932100A (en) A kind of operating method and head-mounted display apparatus of dummy keyboard
CN102422253A (en) Electronic apparatus including one or more coordinate input surfaces and method for controlling such an electronic apparatus
CN102880304A (en) Character inputting method and device for portable device
KR20210091739A (en) Systems and methods for switching between modes of tracking real-world objects for artificial reality interfaces
CN103744518A (en) Stereoscopic interaction method, stereoscopic interaction display device and stereoscopic interaction system
US11209903B2 (en) Rendering of mediated reality content

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20160420

WD01 Invention patent application deemed withdrawn after publication