CN105159445A - Input device - Google Patents

Input device

Info

Publication number
CN105159445A
CN105159445A (application CN201510486457.2A)
Authority
CN
China
Prior art keywords
motion
display
user interface
operator
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510486457.2A
Other languages
Chinese (zh)
Other versions
CN105159445B (en)
Inventor
松原孝志
尾崎友哉
德永龙也
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Maxell Ltd
Original Assignee
Hitachi Maxell Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Maxell Ltd filed Critical Hitachi Maxell Ltd
Publication of CN105159445A publication Critical patent/CN105159445A/en
Application granted granted Critical
Publication of CN105159445B publication Critical patent/CN105159445B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Image Analysis (AREA)

Abstract

An input device includes an input unit for inputting a predetermined motion image signal, a motion detector for detecting a motion on the basis of the motion image signal input into the input unit, a video signal processor for outputting a predetermined video signal, and a controller. The controller controls the video signal processor so that, when the motion detector detects a first motion, the video signal processor outputs a video signal that explains to the user a predetermined second motion to be detected next by the motion detector after detection of the first motion.

Description

Input device
This application is a divisional of Chinese patent application No. 201010167387.1, of the same title, filed on April 23, 2010.
Technical field
The present invention relates to an input device.
Background art
Computers and television sets that receive operations from a user (operator) via a graphical user interface (hereinafter, GUI) and at the same time give the user feedback on the result of the operation have become widespread.
Patent Document 1 discloses a portable terminal that displays operation guide information for assisting the user's operation. The user can move a finger up and down according to this guide to execute a desired function.
Patent Document 2 discloses an interface device that visually displays a gesture image representing the gesture to be recognized as the user's operation. The user can operate the device while checking the gesture image.
Patent Document 3 discloses a mobile unit that displays an illustration of the gesture constituting the user's operation and shows the operation being performed. The user can easily understand which gesture to perform.
Patent Document 4 discloses a vehicle operation input device that displays the state of the hand during driving and selection guide information for the device to be operated. By referring to this guide, the user can select the device to operate.
Patent Document 1: Japanese Patent Laid-Open No. 2007-213245
Patent Document 2: Japanese Patent Laid-Open No. 2008-52590
Patent Document 3: Japanese Patent Laid-Open No. 2001-216069
Patent Document 4: Japanese Patent Laid-Open No. 2005-250785
Each of these patent documents discloses presenting the motions or postures to be used for operation and executing the corresponding device functions when the user acts according to those instructions.
However, before the user completes the prescribed motion or assumes the prescribed posture, other motions made unconsciously along the way may also be recognized as operations, so that functions the user did not intend may be executed.
For example, suppose the user moves displayed content to the right by moving a hand to the right; the return motion of bringing the hand back to the left may then be recognized as an operation to move the content to the left, and that unintended operation may be executed.
That is, none of these patent documents considers letting the user intuitively understand, while performing a gesture for operation, whether each of the user's motions is in a state of being recognized.
Summary of the invention
Accordingly, in view of the above, an object of the present invention is to provide an input device with good usability that shows the user how the motions performed during operation are recognized, and that avoids unintended operations.
To achieve this object, the present invention provides an input device comprising: an input unit that receives an operator's motion as an image signal; a motion detector that detects motion from the image signal input to the input unit; a display unit that displays a graphical user interface; and a controller that changes the display of the graphical user interface on the display unit according to the motion detected by the motion detector. The controller displays on the display unit a motion synchronized with the motion detected by the motion detector, and changes the display of the graphical user interface accordingly.
Here, the display of the graphical user interface is a display for selecting one of a plurality of choices and for moving the display positions of the plurality of choices; the controller displays on the display unit a motion synchronized with the motion detected by the motion detector, and moves the display positions of the plurality of choices according to that motion.
Further, in the above input device, the controller displays on the display unit, for the operator before operation, which motion performs which operation, that is, the motion that needs to be detected by the motion detector.
Here, the display of the graphical user interface is a display for selecting one of a plurality of choices and for moving the display positions of the plurality of choices; the controller displays on the display unit the motion that needs to be detected by the motion detector in order to move the display positions of the plurality of choices.
According to the present invention, when the user operates a device such as a television by gestures, for example, the user can understand how the gesture currently being performed is recognized, and can adjust the gesture so that only intended operations are performed.
Brief description of the drawings
Fig. 1 is a schematic diagram showing the input device of Embodiment 1;
Fig. 2 is a block diagram showing the configuration of the input device of Embodiment 1;
Fig. 3 is a schematic diagram showing how a user operates the input device of Embodiment 1;
Fig. 4 is a schematic diagram explaining the correspondence between user operations and the behavior of the input device in Embodiment 1;
Fig. 5 is a flowchart explaining the operation of the input device of Embodiment 1;
Fig. 6 is a schematic diagram explaining the correspondence between user operations and the behavior of the input device in Embodiment 1;
Fig. 7 is a schematic diagram showing how a user operates the input device of Embodiment 2;
Fig. 8 is a schematic diagram explaining the correspondence between user operations and the behavior of the input device in Embodiment 2;
Fig. 9 is a schematic diagram explaining the correspondence between user operations and the behavior of the input device in Embodiment 2;
Fig. 10 is a schematic diagram explaining the correspondence between user operations and the behavior of the input device in Embodiment 3.
Description of reference numerals
100 input device
101 display unit
102 imaging unit
103 user
104 operation guide
200 motion detector
201 controller
202 video signal processor
300 hand-waving motion
301 operation wheel
302 choices
303 hand-rotating motion
400 animated operation guide
700 operation menu
701 hand motion
800 animated operation guide
1000 operation image
Embodiments
Embodiments of the present invention are described below.
(Embodiment 1)
The input device 100 of this embodiment is a device that detects the motion of the user's hand from a moving image of the user and changes the GUI according to the detected motion.
Fig. 1 gives an overview of the operating environment in which a user uses the input device 100, showing the display unit 101, the imaging unit 102, the user 103, and the operation guide 104.
The display unit 101 is the display device of the input device 100 and is composed of a display device such as a liquid crystal display or a plasma display. The display unit 101 consists of a display panel, a panel control circuit, and a panel drive circuit, and displays on the display panel the video data supplied from the video signal processor 202 described later. The imaging unit 102 is a device, such as a camera, for inputting a moving image to the input device 100. The user 103 is the user who operates the input device 100. The operation guide 104 is a GUI displayed on the display unit 101, consisting of text and graphics that explain to the user 103 how to operate the input device 100.
As shown in Fig. 2, the input device 100 comprises the display unit 101, the imaging unit 102, a motion detector 200, a controller 201, and a video signal processor 202.
The motion detector 200 receives a moving-image signal from the imaging unit 102 and detects motions of a person, such as reaching out, waving, or rotating a hand, from the received moving image. It then outputs a predetermined command corresponding to the detected motion. The controller 201 is composed of, for example, a microprocessor, and controls the video signal processor 202 according to the command received from the motion detector 200. The video signal processor 202 is composed of a processing device such as an ASIC, FPGA, or MPU; under the control of the controller 201, it converts the GUI video data into a form that the display unit 101 can process and outputs it.
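For illustration only (this sketch is not part of the patent text; all class and method names are hypothetical), the division of roles among these units can be expressed as a simple processing pipeline:

```python
# Hypothetical sketch of the unit roles described above; not from the patent.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Command:
    name: str           # e.g. "wave" or "rotate"
    value: float = 0.0  # e.g. a rotation angle for "rotate"

class MotionDetector:
    """Corresponds to the motion detector 200: maps a recognized hand
    motion pattern in a camera frame to a predetermined command."""
    def detect(self, frame) -> Optional[Command]:
        ...  # feature-point extraction and pattern matching would go here

class VideoSignalProcessor:
    """Corresponds to the video signal processor 202: converts GUI state
    into video data in a form the display unit 101 can process."""
    def render(self, gui_state: dict) -> bytes:
        ...

class Controller:
    """Corresponds to the controller 201: updates the GUI state from
    commands and drives the video signal processor."""
    def __init__(self, vsp: VideoSignalProcessor):
        self.vsp = vsp
        self.gui_state = {"wheel_angle": 0.0, "selected": 0}

    def on_command(self, cmd: Command) -> bytes:
        if cmd.name == "rotate":
            self.gui_state["wheel_angle"] += cmd.value
        return self.vsp.render(self.gui_state)
```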
Fig. 3 shows the correspondence between the motions of the user 103 and the behavior and display of the display unit 101 when the user 103 operates the input device 100 of the present invention. The "usual state" in Fig. 3 shows the state in which the user 103 is not operating the input device 100. The input device 100 of this embodiment is configured to recognize a left-right hand-waving motion 300 of the user 103, and recognition of the waving motion 300 triggers the input device 100 to start accepting operations from the user 103. Therefore, as shown in "start of operation" in Fig. 3, when the user 103 reaches a hand toward the input device 100 and then waves the extended hand, an operation menu is displayed on the display unit 101 as shown in "operating state" in Fig. 3. This menu consists of an operation wheel 301 and a plurality of choices 302. When the user 103 performs a hand-rotating motion 303, the operation wheel 301 rotates in concert with it and the choices 302 move around it. In this way the user 103 can select any item from the choices 302.
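The link between the hand-rotating motion 303, the operation wheel 301, and the selected choice 302 can be sketched as follows (an illustrative assumption, not a method specified by the patent; the item count and angle convention are invented for the example):

```python
import math

ITEM_COUNT = 12                      # assumed number of choices 302 on the wheel
STEP = 2 * math.pi / ITEM_COUNT      # angular spacing of the choices

def update_wheel(wheel_angle: float, hand_angle_delta: float) -> tuple[float, int]:
    """Rotate the wheel by the change in the hand's angle and return the
    new wheel angle together with the index of the choice now selected."""
    wheel_angle = (wheel_angle + hand_angle_delta) % (2 * math.pi)
    selected = int(round(wheel_angle / STEP)) % ITEM_COUNT
    return wheel_angle, selected
```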
Fig. 4 explains the correspondence between the motions of the user 103 and the display on the display unit 101 when the user 103 operates the input device 100 shown in Fig. 3, that is, the display methods and display timing of the operation guide 104 and of an animated operation guide 400. The animated operation guide 400 is an operation guide that uses moving images to explain to the user how to operate the menu displayed on the display unit 101.
Next, the operation of the input device 100 is described with reference to Figs. 1 to 4 and the flowchart of Fig. 5.
The input device 100 detects the motion of the hand of the user 103 from an input moving image of the user 103 and changes the GUI display according to the detected motion.
First, the flow of processing by which the input device 100 detects the motion of the hand of the user 103 and displays the GUI according to that motion is described with reference to Fig. 2. The user 103 starts the operation of the input device 100 by pressing a power button (not shown) or the like. In response to this start of operation, the controller 201 instructs the video signal processor 202 to display a predetermined image. In response to this instruction, the video signal processor 202 outputs a video signal suitable for the input of the display unit 101, whereby an image is displayed on the display unit 101.
In addition, in response to the start of operation, the controller 201 instructs the imaging unit 102 to start capturing a moving image. The imaging unit 102 starts capturing in response and outputs the captured moving-image data to the motion detector 200. The motion detector 200 detects the motion of the user's hand from the received moving-image data using a method such as feature-point extraction, and, when the hand motion is judged to follow a predetermined pattern, outputs a command corresponding to that pattern to the controller 201. The controller 201 instructs the video signal processor 202 to display or change the GUI according to this command. In response, the video signal processor 202 changes the output video signal, thereby changing the GUI displayed on the display unit 101.
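As one conceivable realization of this detection step (the patent mentions feature-point extraction but does not fix an algorithm), a left-right wave such as motion 300 could be detected by counting direction reversals of the hand's horizontal position within a time window; the thresholds below are assumptions:

```python
def is_wave(xs: list[float], min_reversals: int = 3, min_amplitude: float = 0.05) -> bool:
    """Heuristic wave detector. xs holds the hand's horizontal position per
    frame (normalized to 0..1); a wave is several left/right direction
    reversals with sufficient total amplitude."""
    if len(xs) < 3 or max(xs) - min(xs) < min_amplitude:
        return False
    velocities = [b - a for a, b in zip(xs, xs[1:])]
    reversals = sum(1 for v0, v1 in zip(velocities, velocities[1:]) if v0 * v1 < 0)
    return reversals >= min_reversals
```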
Next, the display methods of the operation guide 104 and the animated operation guide 400 while the user 103 operates the input device 100 are described with reference to Fig. 4 and the flowchart of Fig. 5.
First, the user 103 starts the operation of the input device 100 by a predetermined procedure, and the controller 201 of the input device 100 starts recognition of the motion (gesture) of a person's hand by the procedure described above (step 500). An example of the state of the user 103 and the input device 100 at this point is the "usual state" in Fig. 4.
Next, as in "operation guide display" in Fig. 4, the user 103 reaches a hand toward the input device 100 (step 501: Yes). The input device 100 recognizes the motion of the user 103 reaching out and displays the operation guide 104 on the display unit 101 (step 502). Here, the operation guide 104 explains the hand-waving motion 300, i.e., the motion the user 103 performs to start operating the input device 100 (the operation-start gesture). When the user 103 lowers the hand, the input device 100 removes the display of the operation guide 104 (step 501: No, or step 503: No).
Next, as in "start of operation" in Fig. 4, the user 103 waves a hand left and right toward the input device 100 according to the content explained by the operation guide 104 (step 503: Yes). The input device 100 recognizes this left-right waving motion 300 and, as shown in "animated operation guide" in Fig. 4, displays on the display unit 101 a gesture operation menu consisting of the operation wheel 301 and the choices 302 (step 504). Furthermore, the input device 100 displays the animated operation guide 400 on the display unit 101 to explain how to operate this gesture operation menu (step 505). Here, the animated operation guide 400 depicts a person whose hand touches the operation wheel 301, with the hand and the wheel rotating together. This intuitively shows the user 103 at which position to perform which motion, and to what extent to move the hand, for each operation.
Next, as shown in "operating state" in Fig. 4, the user 103 performs the hand-rotating motion 303 according to the display of the animated operation guide 400. The input device 100 recognizes this motion of the user 103 (step 506: Yes), makes the animated operation guide 400 display a motion synchronized with the recognized motion, and at the same time reflects the recognized motion on the gesture operation menu (step 507). That is, when the user performs the hand-rotating motion 303, the person shown in the animated operation guide 400 also rotates the hand touching the operation wheel 301, the operation wheel 301 rotates, and the choices 302 move around.
Thus, in step 505, before the user operates, the animated operation guide 400 acts as an asynchronous operation guide whose animation changes independently of the user's motion. In step 507, after the user starts operating, it acts as a synchronized operation guide whose animation changes in synchronization with the motion of the user 103 (see Fig. 6).
In this way, the input device 100 recognizes the motions performed by the user 103 and displays the operation guide 104 and the animated operation guide 400 according to those motions, presenting the user 103 with the motions (gestures) that are effective for operation. The user 103 can thus confirm which motion (gesture) performed at which moment achieves the desired operation, and can smoothly perform operations such as displaying the menu and selecting menu items.
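The flow of steps 500 to 507 can be summarized as a small state machine (an illustrative reading of the flowchart of Fig. 5; the state and event names are assumed):

```python
# IDLE: usual state.  GUIDE: hand raised, operation guide 104 shown.
# MENU: wave recognized, menu plus asynchronous animated guide 400 shown.
# OPERATING: rotation recognized, guide synchronized with the user's motion.
TRANSITIONS = {
    ("IDLE", "hand_raised"): "GUIDE",        # steps 501-502
    ("GUIDE", "hand_lowered"): "IDLE",       # step 501/503: No, guide removed
    ("GUIDE", "wave"): "MENU",               # steps 503-505
    ("MENU", "rotate"): "OPERATING",         # steps 506-507
    ("OPERATING", "rotate"): "OPERATING",    # guide stays synchronized
    ("MENU", "hand_lowered"): "IDLE",
    ("OPERATING", "hand_lowered"): "IDLE",
}

def next_state(state: str, event: str) -> str:
    """Return the next guide state; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)
```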
The asynchronous operation guide changes its animation for the user before operation and is effective for understanding which motion performs which operation. In this embodiment, the operation wheel 301 and the choices 302 are shown on the display; moving such drawings, figures, and images according to the guide is especially effective for a user who does not know which motion to perform.
The synchronized operation guide, on the other hand, displays by animation the motion performed by the user after the operation has started, whether or not that motion actually operates the device. It thereby lets the user intuitively understand whether each of the user's motions is in a state of being recognized. The user can therefore correct the motion so as not to perform unintended operations.
(Embodiment 2)
Next, Embodiment 2 is described. Unlike the operating method of the input device 100 in Embodiment 1, this embodiment describes the operating method by which the user can operate the input device 100 with the motion of both hands, and the corresponding display method of the operation guide. The configuration of the input device 100 is the same as in Embodiment 1; only the user's operating method differs.
This embodiment is described below with reference to the drawings. Parts identical to those of the previous embodiment are given the same reference numerals and their description is omitted.
Fig. 7 shows the motion of the user's hands when the user operates the input device 100 in this embodiment, and the state of the GUI displayed on the display unit 101. The operation menu 700 displayed on the display unit 101 arranges three choices on a round table in a simulated three-dimensional space. When this round table rotates, the item positioned at the front of the screen becomes the selected item. In addition, by overlaying the appearance of hand shapes on the round table, the operation menu 700 gives the user 103 the image of placing hands on the table and rotating it. The hand motion 701 shows the user 103 moving both hands along a circular track in order to operate the operation menu 700. By moving both hands along the circular track as if rotating the round table, the user can move the item to be selected to the front of the screen.
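One plausible way (assumed here, not prescribed by the patent) to turn the two-hand motion 701 into a rotation of the round table is to track the angle of the line connecting the two detected hands and apply its change to the table:

```python
import math

def hands_angle(left: tuple[float, float], right: tuple[float, float]) -> float:
    """Angle of the line connecting the two detected hand positions."""
    return math.atan2(right[1] - left[1], right[0] - left[0])

def rotate_table(table_angle: float, prev_angle: float, cur_angle: float) -> float:
    """Apply the change in hand-pair angle to the simulated 3D round table."""
    delta = cur_angle - prev_angle
    delta = math.atan2(math.sin(delta), math.cos(delta))  # wrap to [-pi, pi]
    return (table_angle + delta) % (2 * math.pi)
```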
The behavior of the input device 100 of this embodiment and the display method of the operation guide are described with reference to Fig. 8.
"Menu display" in Fig. 8 shows the state in which the menu is displayed, as in Embodiment 1. The input device 100 recognizes the motion of the hands of the user 103 by the same method as in Embodiment 1. When the user 103 extends both hands toward the input device 100, an animated operation guide 800 is displayed, as shown in "animated operation guide display" in Fig. 8. The animated operation guide 800 displays an image simulating the appearance of a person's two hands moving, and presents the motion the user should perform to operate the operation menu 700 with both hands. Next, when the user 103 starts operating the operation menu 700 with the motion explained by the animated operation guide 800, the display of the animated operation guide 800 is erased and the motion of the user 103 is reflected in the operation of the operation menu 700, as shown in "operation" in Fig. 8.
In this way, the input device 100 conveys an image of the operating method to the user 103 through the shape and appearance of the operation menu 700. Furthermore, at the moment the user 103 actually tries to operate based on this image, the device presents the animated operation guide 800, which explains the user's operating method more concretely.
Thus, the user can receive a concrete guide to the relevant operation at the moment of operating. The user can therefore understand whether the intended operation is effective and by which operating method to perform it, and can operate the menu items smoothly.
As shown in Fig. 9, the operation menu 700 may also be combined with the operation menu consisting of the operation wheel 301 and the choices 302 explained in Embodiment 1. In this configuration, when the user 103 extends one hand toward the input device 100, the animated operation guide for one-handed operation is displayed; when the user 103 extends both hands, the animated operation guide for two-handed operation is displayed. In this way, multiple operating methods can be conveyed as images through the shape and appearance of the menu, and the user's operating method can be guided concretely at the moment the user actually operates, as in the dispatch sketched below.
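The selection between the one-handed and two-handed animated guides described here reduces to a dispatch on the number of extended hands; a minimal sketch (the guide identifiers are hypothetical):

```python
from typing import Optional

def select_guide(hand_count: int) -> Optional[str]:
    """Pick the animated operation guide to display, per Fig. 9."""
    if hand_count == 1:
        return "one_hand_wheel_guide"   # guide for the operation wheel 301
    if hand_count == 2:
        return "two_hand_table_guide"   # guide for the round-table menu 700
    return None                         # no hand extended: no guide
```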
(Embodiment 3)
Next, Embodiment 3 is described. This embodiment describes an input device that realizes the same operating method as in Embodiment 2 but, instead of displaying the animated operation guide, displays an operation image of the user's operating motion at the corresponding position on the operation menu.
This embodiment is described below with reference to the drawings. Parts identical to those of the above embodiments are given the same reference numerals and their description is omitted.
The behavior of the input device 100 of this embodiment and the display method of the operation image are described with reference to Fig. 10.
The operation image 1000 in Fig. 10 is a figure reflecting the user's hands. In this figure, two choices on the operation menu are held by the hands, presenting to the user the operating method of rotating the round table of the operation menu 700. As in "state 1 with both hands raised" in Fig. 10, when the user 103 extends both hands toward the input device 100, the operation image 1000 is displayed. This lets the user 103 understand that the user's own operation is in an effective state and what the operating method is. "State 2 with both hands raised" in Fig. 10 shows the user 103 raising both hands to a position higher than in "state 1 with both hands raised". In "state 2 with both hands raised", the operation image 1000 is displayed in the same state as in "state 1 with both hands raised". Therefore, regardless of the position at which the hands are extended, the user 103 can see that the input device 100 recognizes the same motion as long as both hands are extended.
In this way, the input device 100 displays the operation image 1000 at the position on the operation menu where the operating motion takes effect, according to simple motions, without depending on variations in the operating position of the user 103. Thus, while operating, the user can confirm both that the operation is actually in an effective state and the position on the operation menu where the operating motion acts, and can operate without needing precise motions.
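The position independence described above can be sketched as follows (assumed logic, for illustration only): the operation image 1000 is always drawn at a fixed anchor on the operation menu, so raising the hands higher or lower changes nothing:

```python
from typing import Optional

def operation_image_position(both_hands_raised: bool,
                             menu_anchor: tuple[int, int]) -> Optional[tuple[int, int]]:
    """Draw the operation image 1000 at the menu's anchor whenever both
    hands are raised, regardless of the actual hand coordinates (states 1
    and 2 of Fig. 10 therefore look identical)."""
    return menu_anchor if both_hands_raised else None
```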

Claims (7)

1. An input device, characterized by comprising:
an input unit that receives an operator's motion as an image signal;
a motion detector that detects motion from the image signal input to said input unit;
a display unit that displays a graphical user interface; and
a controller that changes the display of said graphical user interface on said display unit according to the motion detected by said motion detector,
wherein, before the operator performs an operating motion, said controller displays a motion guide on said display unit, the motion guide showing, by a change of animation that is not synchronized with the operator's motion, the required motion of the operator and the resulting change of state of the displayed graphical user interface caused by performing the required motion, and
after the operator performs the operating motion according to the change of the animation, said controller changes the display of said graphical user interface on said display unit so that the change of the animation is synchronized with the motion of the operator detected by said motion detector.
2. The input device according to claim 1, characterized in that
the display of said graphical user interface comprises:
a first graphical user interface part having a plurality of choices;
a second graphical user interface part by which the operator changes the selected state of the choices of said first graphical user interface part; and
a third graphical user interface part indicating the required motion of the operator,
wherein said controller displays on said display unit a guide showing the manner of operation of changing said second graphical user interface part along with said third graphical user interface part and of changing the selected state of the choices of said first graphical user interface part, and
when the operator performs the operating motion, said second graphical user interface part and said third graphical user interface part move in synchronization with the motion of the operator detected by said motion detector, and said controller changes, on said display unit, the selected state of the choices of said first graphical user interface part.
3. The input device according to claim 1, characterized in that
said controller displays on said display unit, for the operator before operation, which motion performs which operation, that is, the motion that needs to be detected by said motion detector.
4. The input device according to claim 3, characterized in that
the display of said graphical user interface is a display for selecting a desired choice from a plurality of choices and for moving the display positions of the plurality of choices, and
said controller displays on said display unit the motion that needs to be detected by said motion detector in order to move the display positions of said plurality of choices.
5. The input device according to claim 3, characterized in that
the motion to be displayed is a one-handed motion of the operator.
6. The input device according to claim 5, characterized in that
the motion to be displayed is a two-handed motion of the operator.
7. The input device according to claim 3, characterized in that
said controller displays a motion guide on said display unit, the motion guide showing, by a change of animation, the required motion of the operator using the extended one hand or both hands, and the change of state of the displayed graphical user interface caused by performing the required motion.
CN201510486457.2A 2009-04-23 2010-04-23 Input unit Active CN105159445B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2009-104641 2009-04-23
JP2009104641A JP5256109B2 (en) 2009-04-23 2009-04-23 Display device
CN2010101673871A CN101901050A (en) Input device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN2010101673871A Division CN101901050A (en) Input device

Publications (2)

Publication Number Publication Date
CN105159445A true CN105159445A (en) 2015-12-16
CN105159445B CN105159445B (en) 2019-03-01

Family

ID=42320359

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201510486457.2A Active CN105159445B (en) Input device
CN2010101673871A Pending CN101901050A (en) Input device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN2010101673871A Pending CN101901050A (en) Input device

Country Status (5)

Country Link
US (3) US9164578B2 (en)
EP (1) EP2244166B1 (en)
JP (1) JP5256109B2 (en)
KR (1) KR101148484B1 (en)
CN (2) CN105159445B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109152556A (en) * 2016-06-06 2019-01-04 Maxell, Ltd. Finger movement exercise menu generation system, method, and program

Families Citing this family (71)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4569613B2 (en) 2007-09-19 2010-10-27 Sony Corp Image processing apparatus, image processing method, and program
JP5802667B2 (en) 2010-07-20 2015-10-28 Panasonic Intellectual Property Corporation of America Gesture input device and gesture input method
EP2421252A1 (en) * 2010-08-17 2012-02-22 LG Electronics Display device and control method thereof
EP2421251A1 (en) * 2010-08-17 2012-02-22 LG Electronics Display device and control method thereof
US9164542B2 (en) * 2010-08-31 2015-10-20 Symbol Technologies, Llc Automated controls for sensor enabled user interface
TW201222429A (en) * 2010-11-23 2012-06-01 Inventec Corp Web camera device and operating method thereof
US9361009B2 (en) * 2010-12-01 2016-06-07 Adobe Systems Incorporated Methods and systems for setting parameter values via radial input gestures
US9575561B2 (en) 2010-12-23 2017-02-21 Intel Corporation Method, apparatus and system for interacting with content on web browsers
KR101811909B1 (en) * 2010-12-30 2018-01-25 Thomson Licensing Apparatus and method for gesture recognition
KR20120080072A (en) * 2011-01-06 2012-07-16 Samsung Electronics Co., Ltd. Display apparatus controlled by a motion, and motion control method thereof
KR101795574B1 (en) 2011-01-06 2017-11-13 Samsung Electronics Co., Ltd. Electronic device controlled by a motion, and control method thereof
KR101858531B1 (en) 2011-01-06 2018-05-17 Samsung Electronics Co., Ltd. Display apparatus controlled by a motion, and motion control method thereof
US9058059B2 (en) 2011-03-03 2015-06-16 Omron Corporation Gesture input device and method for controlling gesture input device
US9317111B2 (en) 2011-03-30 2016-04-19 Elwha, Llc Providing greater access to one or more items in response to verifying device transfer
US8918861B2 (en) 2011-03-30 2014-12-23 Elwha Llc Marking one or more items in response to determining device transfer
US8863275B2 (en) * 2011-03-30 2014-10-14 Elwha Llc Access restriction in response to determining device transfer
US9153194B2 (en) 2011-03-30 2015-10-06 Elwha Llc Presentation format selection based at least on device transfer determination
JP2012215963A (en) 2011-03-31 2012-11-08 Hitachi Consumer Electronics Co Ltd Image display apparatus
CN103608761B (en) 2011-04-27 2018-07-27 日本电气方案创新株式会社 Input equipment, input method and recording medium
US9329673B2 (en) 2011-04-28 2016-05-03 Nec Solution Innovators, Ltd. Information processing device, information processing method, and recording medium
US10001898B1 (en) 2011-07-12 2018-06-19 Domo, Inc. Automated provisioning of relational information for a summary data visualization
US9792017B1 (en) 2011-07-12 2017-10-17 Domo, Inc. Automatic creation of drill paths
US9202297B1 (en) 2011-07-12 2015-12-01 Domo, Inc. Dynamic expansion of data visualizations
US9292112B2 (en) * 2011-07-28 2016-03-22 Hewlett-Packard Development Company, L.P. Multimodal interface
EP2555536A1 (en) 2011-08-05 2013-02-06 Samsung Electronics Co., Ltd. Method for controlling electronic apparatus based on voice recognition and motion recognition, and electronic apparatus applying the same
WO2013022222A2 (en) * 2011-08-05 2013-02-14 Samsung Electronics Co., Ltd. Method for controlling electronic apparatus based on motion recognition, and electronic apparatus applying the same
CN102360263A (en) * 2011-09-26 2012-02-22 ZTE Corp Method implemented by taking three-dimensional moving track as input and mobile terminal
KR20130078490A (en) * 2011-12-30 2013-07-10 삼성전자주식회사 Electronic apparatus and method for controlling electronic apparatus thereof
US9501152B2 (en) 2013-01-15 2016-11-22 Leap Motion, Inc. Free-space user interface and control using virtual constructs
US11493998B2 (en) 2012-01-17 2022-11-08 Ultrahaptics IP Two Limited Systems and methods for machine control
US9141197B2 (en) * 2012-04-16 2015-09-22 Qualcomm Incorporated Interacting with a device using gestures
US9448635B2 (en) * 2012-04-16 2016-09-20 Qualcomm Incorporated Rapid gesture re-engagement
TWI476706B (en) * 2012-04-30 2015-03-11 Pixart Imaging Inc Method for outputting command by detecting object movement and system thereof
KR101370022B1 (en) * 2012-05-02 2014-03-25 Macron Co., Ltd. Gesture recognition remote controller
US10360706B2 (en) 2012-05-22 2019-07-23 Sony Corporation Device method and program for adjusting a display state of a superimposed image
EP2698686B1 (en) * 2012-07-27 2018-10-10 LG Electronics Inc. Wrist-wearable terminal and control method thereof
JP5936155B2 (en) * 2012-07-27 2016-06-15 NEC Solution Innovators, Ltd. 3D user interface device and 3D operation method
SE537580C2 (en) * 2012-08-03 2015-06-30 Crunchfish Ab Improved input
KR102083918B1 (en) 2012-10-10 2020-03-04 삼성전자주식회사 Multi display apparatus and method for contorlling thereof
KR102083937B1 (en) 2012-10-10 2020-03-04 삼성전자주식회사 Multi display device and method for providing tool thereof
US20150212647A1 (en) 2012-10-10 2015-07-30 Samsung Electronics Co., Ltd. Head mounted display apparatus and method for displaying a content
KR101951228B1 (en) 2012-10-10 2019-02-22 삼성전자주식회사 Multi display device and method for photographing thereof
KR101984683B1 (en) 2012-10-10 2019-05-31 삼성전자주식회사 Multi display device and method for controlling thereof
KR102061881B1 (en) 2012-10-10 2020-01-06 삼성전자주식회사 Multi display apparatus and method for controlling display operation
KR102063952B1 (en) 2012-10-10 2020-01-08 삼성전자주식회사 Multi display apparatus and multi display method
JP2014085964A (en) * 2012-10-25 2014-05-12 Nec Personal Computers Ltd Information processing method, information processing device, and program
TWI454971B (en) * 2012-12-11 2014-10-01 Pixart Imaging Inc Electronic apparatus controlling method and electronic apparatus utilizing the electronic apparatus controlling method
JP2014127124A (en) * 2012-12-27 2014-07-07 Sony Corp Information processing apparatus, information processing method, and program
KR20140085061A (en) * 2012-12-27 2014-07-07 삼성전자주식회사 Display apparatus and Method for controlling display apparatus thereof
CN103914126A (en) * 2012-12-31 2014-07-09 腾讯科技(深圳)有限公司 Multimedia player control method and device
US9459697B2 (en) 2013-01-15 2016-10-04 Leap Motion, Inc. Dynamic, free-space user interactions for machine control
US9916009B2 (en) 2013-04-26 2018-03-13 Leap Motion, Inc. Non-tactile interface systems and methods
JP6104707B2 (en) * 2013-05-23 2017-03-29 Alpine Electronics Inc. Electronic device, operation input method, and operation input program
KR20170109077A (en) * 2013-06-25 2017-09-27 Fujitsu Ltd. Information processing device and recording medium
US10281987B1 (en) 2013-08-09 2019-05-07 Leap Motion, Inc. Systems and methods of free-space gestural interaction
US20150067603A1 (en) * 2013-09-05 2015-03-05 Kabushiki Kaisha Toshiba Display control device
US20150193111A1 (en) * 2013-10-01 2015-07-09 Google Inc. Providing Intent-Based Feedback Information On A Gesture Interface
US10152136B2 (en) * 2013-10-16 2018-12-11 Leap Motion, Inc. Velocity field interaction for free space gesture interface and control
US10126822B2 (en) 2013-12-16 2018-11-13 Leap Motion, Inc. User-defined virtual interaction space and manipulation of virtual configuration
JP6550643B2 (en) * 2014-03-14 2019-07-31 Honda Motor Co., Ltd. Motion estimation device, robot, and motion estimation method
US10845884B2 (en) * 2014-05-13 2020-11-24 Lenovo (Singapore) Pte. Ltd. Detecting inadvertent gesture controls
WO2016016674A1 (en) * 2014-07-29 2016-02-04 Umm Al-Qura University Transparent oled architectural partition and method
US20160034036A1 (en) * 2014-07-29 2016-02-04 Umm Al-Qura University Oled multi-use intelligent curtain and method
CN105516815A (en) * 2014-09-25 2016-04-20 冠捷投资有限公司 Method for controlling operation interface of display device by motion
WO2016051521A1 (en) * 2014-09-30 2016-04-07 Mitsubishi Electric Engineering Co., Ltd. Screen operation device and screen operation method
JP6293372B2 (en) * 2015-05-20 2018-03-14 Mitsubishi Electric Corp Information processing apparatus and interlock control method
US10460044B2 (en) 2017-05-26 2019-10-29 General Electric Company Methods and systems for translating natural language requirements to a semantic modeling language statement
USD855642S1 (en) * 2017-09-29 2019-08-06 Song Kug Im Portable terminal with a graphical user interface
US11875012B2 (en) 2018-05-25 2024-01-16 Ultrahaptics IP Two Limited Throwable interface for augmented reality and virtual reality environments
USD873285S1 (en) * 2018-07-24 2020-01-21 Magic Leap, Inc. Display panel or portion thereof with a graphical user interface
US11908243B2 (en) * 2021-03-16 2024-02-20 Snap Inc. Menu hierarchy navigation on electronic mirroring devices

Family Cites Families (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2256605B1 (en) * 1998-01-26 2017-12-06 Apple Inc. Method and apparatus for integrating manual input
JP2001216069A (en) 2000-02-01 2001-08-10 Toshiba Corp Operation inputting device and direction detecting method
US7665041B2 (en) * 2003-03-25 2010-02-16 Microsoft Corporation Architecture for controlling a computer using hand gestures
US8745541B2 (en) 2003-03-25 2014-06-03 Microsoft Corporation Architecture for controlling a computer using hand gestures
JP3945445B2 (en) 2003-04-21 2007-07-18 Sony Corp Display method and display device
JP2004356819A (en) * 2003-05-28 2004-12-16 Sharp Corp Remote control apparatus
JP3941786B2 (en) 2004-03-03 2007-07-04 Nissan Motor Co., Ltd. Vehicle operation input device and method
JP2006079281A (en) * 2004-09-08 2006-03-23 Sony Corp Image processor and image processing method, program and recording medium
KR100687737B1 (en) 2005-03-19 2007-02-27 Electronics and Telecommunications Research Institute Apparatus and method for a virtual mouse based on two-hands gesture
US7657849B2 (en) 2005-12-23 2010-02-02 Apple Inc. Unlocking a device by performing gestures on an unlock image
JP2007213245A (en) 2006-02-08 2007-08-23 Nec Corp Portable terminal and program
US8531396B2 (en) * 2006-02-08 2013-09-10 Oblong Industries, Inc. Control system for navigating a principal dimension of a data space
US8972902B2 (en) * 2008-08-22 2015-03-03 Northrop Grumman Systems Corporation Compound gesture recognition
JP4650381B2 (en) * 2006-09-08 2011-03-16 Victor Company of Japan Ltd Electronics
KR101304461B1 (en) * 2006-12-04 2013-09-04 Samsung Electronics Co., Ltd. Method and apparatus of gesture-based user interface
JP2008146243A (en) * 2006-12-07 2008-06-26 Toshiba Corp Information processor, information processing method and program
US7770136B2 (en) 2007-01-24 2010-08-03 Microsoft Corporation Gesture recognition interactive feedback
US20080215975A1 (en) * 2007-03-01 2008-09-04 Phil Harrison Virtual world user opinion & response monitoring
US8726194B2 (en) * 2007-07-27 2014-05-13 Qualcomm Incorporated Item selection using enhanced control
JP4569613B2 (en) * 2007-09-19 2010-10-27 Sony Corp Image processing apparatus, image processing method, and program
CN101874404B (en) 2007-09-24 2013-09-18 高通股份有限公司 Enhanced interface for voice and video communications
US8413075B2 (en) * 2008-01-04 2013-04-02 Apple Inc. Gesture movies
US8196042B2 (en) * 2008-01-21 2012-06-05 Microsoft Corporation Self-revelation aids for interfaces
US8555207B2 (en) * 2008-02-27 2013-10-08 Qualcomm Incorporated Enhanced input using recognized gestures
US9772689B2 (en) * 2008-03-04 2017-09-26 Qualcomm Incorporated Enhanced gesture-based image manipulation
US20100031202A1 (en) * 2008-08-04 2010-02-04 Microsoft Corporation User-defined gesture set for surface computing
TW201009650A (en) * 2008-08-28 2010-03-01 Acer Inc Gesture guide system and method for controlling computer system by gesture
US8610673B2 (en) * 2008-12-03 2013-12-17 Microsoft Corporation Manipulation of list on a multi-touch display
KR101979666B1 (en) * 2012-05-15 2019-05-17 Samsung Electronics Co., Ltd. Operation Method For plural Touch Panel And Portable Device supporting the same

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5594469A (en) * 1995-02-21 1997-01-14 Mitsubishi Electric Information Technology Center America Inc. Hand gesture machine control system
CN1531676A (en) * 2001-06-01 2004-09-22 Sony Corp User input apparatus
JP2007122136A (en) * 2005-10-25 2007-05-17 Sharp Corp Rotation selection system for menu item
CN101131609A (en) * 2006-08-25 2008-02-27 Toshiba Corp Interface apparatus and interface method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109152556A (en) * 2016-06-06 2019-01-04 Maxell, Ltd. Finger movement exercise menu generation system, method, and program
CN109152556B (en) * 2016-06-06 2022-06-24 麦克赛尔株式会社 Finger movement exercise menu generation system, method, and program

Also Published As

Publication number Publication date
CN101901050A (en) 2010-12-01
US20160313802A1 (en) 2016-10-27
US20160018903A1 (en) 2016-01-21
CN105159445B (en) 2019-03-01
EP2244166A3 (en) 2013-09-04
KR101148484B1 (en) 2012-05-25
JP2010257093A (en) 2010-11-11
US9164578B2 (en) 2015-10-20
US9411424B2 (en) 2016-08-09
EP2244166B1 (en) 2018-10-31
US11036301B2 (en) 2021-06-15
KR20100117036A (en) 2010-11-02
US20100275159A1 (en) 2010-10-28
EP2244166A2 (en) 2010-10-27
JP5256109B2 (en) 2013-08-07

Similar Documents

Publication Publication Date Title
CN105159445A (en) Input device
EP3461291B1 (en) Implementation of a biometric enrollment user interface
JP5333397B2 (en) Information processing terminal and control method thereof
JP4166229B2 (en) Display device with touch panel
EP2328061A2 (en) Input apparatus
CN101715087B (en) Manipulation method
AU2010366331B2 (en) User interface, apparatus and method for gesture recognition
JP5906984B2 (en) Display terminal device and program
JP2008146243A (en) Information processor, information processing method and program
US20150026619A1 (en) User Interface Method and Apparatus Using Successive Touches
US20120249542A1 (en) Electronic apparatus to display a guide with 3d view and method thereof
JP5879880B2 (en) Touch panel electronic device
JP6100497B2 (en) Information processing program, information processing apparatus, information processing system, and image display method
JP2015046949A (en) Display device and computer program
CN115280282A (en) Information terminal device and application operation mode control method thereof
KR101571734B1 (en) Mobile terminal and method for controlling the same
JP2011086035A (en) Portable device, and image display control method, and device therefor
KR101541639B1 (en) Electronic device and method for controlling electronic device
JP6252042B2 (en) Information processing system, information processing apparatus, information processing method, and program
US20140225853A1 (en) Information processing device, information processing method, and program
KR101601763B1 (en) Motion control method for station type terminal
WO2016001748A1 (en) Method and apparatus for displaying an operation on a touch screen of a device
JP2015005301A (en) Portable device, and image display control method and device therefor
JP2017224358A5 (en)
JP2008305023A (en) Projector device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20180302

Address after: Kyoto Japan

Applicant after: MAXELL, Ltd.

Address before: Osaka Japan

Applicant before: Hitachi Maxell, Ltd.

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: Kyoto Japan

Patentee after: MAXELL, Ltd.

Address before: Kyoto Japan

Patentee before: MAXELL HOLDINGS, Ltd.

CP01 Change in the name or title of a patent holder
TR01 Transfer of patent right

Effective date of registration: 20220607

Address after: Kyoto Japan

Patentee after: MAXELL HOLDINGS, Ltd.

Address before: Kyoto Japan

Patentee before: MAXELL, Ltd.

TR01 Transfer of patent right