CN101398712A - Image input control method and image input control system - Google Patents


Info

Publication number
CN101398712A
CN101398712A (application CNA2007101617761A / CN200710161776A)
Authority
CN
China
Prior art keywords
image
instruction
input operation
gesture
motion picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CNA2007101617761A
Other languages
Chinese (zh)
Inventor
赖荣基
尹俊雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Darfon Electronics Corp
Original Assignee
Darfon Electronics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Darfon Electronics Corp
Priority to CNA2007101617761A
Publication of CN101398712A
Legal status: Pending

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to an image input control method and an image input control system. The image input control system comprises an image capture unit and an electronic device. The image capture unit captures a motion image of a user and produces an object signal after recognizing the motion image. A control module of the electronic device performs a defined input operation according to the object signal.

Description

Image input control method and image input control system
Technical field
The present invention relates to an image input control method and an image input control system, and more particularly to a control method that performs input through a user's motion image and to a device using the method.
Background art
The input control of a conventional electronic product relies on an existing input device such as a mouse, a keyboard, or buttons. The user can therefore control the electronic device only by pressing the input device, which makes operation rather inconvenient. Take a computer whose main input device is a mouse as an example. Because the mouse must be placed on a desktop to be operated, the user can operate the computer only while sitting in front of it with the mouse, and cannot use the computer anywhere else. Wireless input devices are available, but the user must still hold the device at all times to control and operate the computer. The mode of operation is thus constrained by the input device, which troubles the user.
Summary of the invention
The invention provides an image input control method and an image input control system that capture a user's motion image and control the electronic device through image input.
According to a first aspect of the invention, an image input control method is provided, comprising the following steps. First, a motion image of a user is captured. Next, the motion image is recognized. When the motion image is judged to be an image instruction, an identification signal is sent. A control signal is then sent according to the identification signal. Finally, an input operation corresponding to the image instruction is performed according to the control signal.
According to a second aspect of the invention, an image input control system is provided. The image input control system comprises an image capture unit and an electronic device. The image capture unit captures a motion image of a user. The electronic device comprises a recognition unit, a judging unit, and a control module. The recognition unit recognizes the motion image and, when the motion image is judged to be an image instruction, sends an identification signal. The judging unit sends a control signal according to the identification signal. The control module performs an input operation corresponding to the image instruction according to the control signal.
According to a third aspect of the invention, an image input control method applicable to providing an input signal to an electronic device is provided, comprising the following steps. First, a motion image of a user is captured. Next, the motion image is recognized. The motion image is then parsed into an object signal in a two-dimensional space. Finally, an input operation is performed according to the object signal.
According to a fourth aspect of the invention, an image input control system is provided. The image input control system comprises an image capture unit and an electronic device. The image capture unit captures a motion image of a user and comprises a recognition unit. The recognition unit parses the motion image into an object signal. The electronic device comprises a control module and a display unit. The control module performs an input operation according to the object signal. The display unit displays the input operation.
To make the foregoing content of the invention clearer, preferred embodiments are described in detail below in conjunction with the accompanying drawings.
Brief description of the drawings
Fig. 1 is a block diagram of the image input control system according to the first embodiment of the present invention;
Fig. 2 is a flowchart of the image input control method according to the preferred embodiment of the present invention;
Fig. 3A is a schematic diagram of the first gesture of the motion image according to the first embodiment of the invention;
Fig. 3B is a schematic diagram of the second gesture of the motion image according to the first embodiment of the invention;
Fig. 3C is a schematic diagram of the third gesture of the motion image according to the first embodiment of the invention;
Fig. 4 is a schematic diagram of the first image instruction according to the first embodiment of the invention;
Fig. 5 is a schematic diagram of the second image instruction according to the first embodiment of the present invention;
Fig. 6 is a schematic diagram of an application of the image instructions according to the first embodiment of the present invention;
Fig. 7 is another schematic diagram of an application of the image instructions according to the first embodiment of the present invention;
Fig. 8 is a schematic diagram of the sixth image instruction according to the first embodiment of the present invention;
Fig. 9 is a block diagram of the image input control system according to the second embodiment of the present invention;
Fig. 10 is a flowchart of the image input control method according to the second embodiment of the present invention;
Fig. 11 is a schematic diagram of the image capture unit in the second embodiment of the invention;
Fig. 12A is a schematic diagram of the first gesture of the motion image according to the second embodiment of the invention;
Fig. 12B is a schematic diagram of the second gesture of the motion image according to the second embodiment of the invention;
Fig. 12C is a schematic diagram of the third gesture of the motion image according to the second embodiment of the invention;
Fig. 13 is a schematic diagram of the light reflected by the first gesture in Fig. 12A during recognition;
Fig. 14 is a schematic diagram of the light reflected by the second gesture in Fig. 12B during recognition;
Fig. 15 is a schematic diagram of the light reflected by the third gesture in Fig. 12C during recognition;
Fig. 16A is another schematic diagram of the first gesture of the motion image according to the second embodiment of the invention; and
Fig. 16B is another schematic diagram of the second gesture of the motion image according to the second embodiment of the invention.
Description of the reference numerals
d: fixed distance
d1: first distance
d2: second distance
I, I': motion image
Sr: identification signal
Sc: control signal
So: object signal
G1, G1': first gesture
G2, G2': second gesture
G3, G3': third gesture
100, 200: electronic device
110, 210: image capture unit
120, 220: recognition unit
122: database
130: judging unit
140, 240: control module
142: pointer
144: object
146: shortcut option
148: menu
150, 250: display unit
160: indicator element
212: light source
212a: light beam
214: light sensing unit
216a, 216a': first reflector
216b, 216b': second reflector
216c: fingerstall
222a: first object
222b: second object
222d: three objects
Embodiments
The image input control method of the invention and the image input control system using it capture a user's motion image and control the electronic device through image input. Different embodiments of the image capture unit are used below to illustrate the image input control method and system.
First embodiment
Please refer to Fig. 1, which shows a block diagram of the image input control system according to the first embodiment of the present invention. The image input control system comprises an image capture unit 110 and an electronic device 100. The image capture unit 110 captures a motion image I of a user. The electronic device 100 comprises a recognition unit 120, a judging unit 130, a control module 140, and a display unit 150. The recognition unit 120 recognizes the motion image I and, when the motion image I is judged to be an image instruction, sends an identification signal Sr. The judging unit 130 sends a control signal Sc according to the identification signal Sr. The control module 140 performs the input operation corresponding to the image instruction according to the control signal Sc, and the display unit 150 displays that input operation. In this embodiment, the electronic device 100 is, for example, an electronic product such as a computer or a television, and the image capture unit 110 is, for example, a complementary metal oxide semiconductor (CMOS) image sensing device or a charge coupled device (CCD) image sensing device. The user can thus operate the computer with motion gestures through the image capture unit 110.
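The signal chain just described (motion image I, identification signal Sr, control signal Sc, input operation) can be pictured with a minimal sketch in Python. The class names, the gesture labels, and the instruction-to-operation table below are illustrative assumptions, not part of the patent.

```python
# Minimal sketch of the Fig. 1 / Fig. 2 signal chain (illustrative names only).
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class IdentificationSignal:   # Sr: names the recognized image instruction
    instruction: str

@dataclass
class ControlSignal:          # Sc: names the input operation to perform
    operation: str

class RecognitionUnit:
    def __init__(self, classify: Callable[[object], Optional[str]]):
        self.classify = classify  # e.g. the database comparison of step 203

    def recognize(self, frame) -> Optional[IdentificationSignal]:
        name = self.classify(frame)
        return IdentificationSignal(name) if name else None

class JudgingUnit:
    # Assumed mapping from image instruction to input operation.
    MAPPING = {"first_image": "show_pointer", "third_image": "click"}

    def judge(self, sr: IdentificationSignal) -> ControlSignal:
        return ControlSignal(self.MAPPING.get(sr.instruction, "none"))

class ControlModule:
    def execute(self, sc: ControlSignal) -> None:
        print("performing input operation:", sc.operation)

def run(frames, recognizer, judging_unit, control_module):
    for frame in frames:                       # step 201: capture a motion image
        sr = recognizer.recognize(frame)       # steps 202-204: recognize, emit Sr
        if sr is None:
            continue                           # not an image instruction: keep capturing
        sc = judging_unit.judge(sr)            # step 205: emit control signal Sc
        control_module.execute(sc)             # step 206: perform the input operation

if __name__ == "__main__":
    run(["open_hand_frame"],                   # placeholder frame data
        RecognitionUnit(lambda f: "first_image"),
        JudgingUnit(), ControlModule())
```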
Please refer to Fig. 2, which shows a flowchart of the image input control method according to the preferred embodiment of the present invention. First, in step 201, the motion image I of the user is captured with the image capture unit 110. In this embodiment, the image capture unit 110 is preferably a CMOS image sensing device.
Next, in step 202, the recognition unit 120 recognizes the motion image I. In this embodiment, the recognition unit 120 preferably comprises a database 122 that stores several images, each having a corresponding image instruction. Then, in step 203, the recognition unit 120 judges whether the motion image I is an image instruction; if so, step 204 is entered, and if not, the method returns to step 201. In this step, the recognition unit 120 compares the motion image I captured by the image capture unit 110 with the images in the database 122 one by one to judge whether the captured motion image I is an image instruction. If the recognition unit 120 judges in step 203 that the motion image I is an image instruction, an identification signal Sr is sent in step 204; otherwise the method returns to step 201 until a motion image I is judged to be an image instruction. In addition, the image capture unit 110 preferably uses an indicator element 160 (shown in Fig. 4) to indicate with a sound or a light color whether the user's action is an image instruction; if it is not, the user can be informed by a sound or by a change in the light color.
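One plausible reading of the one-by-one comparison against the database 122 is a template-matching loop such as the sketch below. The sum-of-absolute-differences metric, the threshold value, and the assumption that every stored image shares the frame's resolution are illustrative choices, not requirements of the patent.

```python
# Rough sketch of steps 202-203: compare the captured frame with stored gesture
# templates and report the matching image instruction, if any (assumptions as noted).
import numpy as np

def match_gesture(frame: np.ndarray, database: dict, threshold: float = 0.15):
    """Return the name of the best-matching image instruction, or None."""
    best_name, best_score = None, float("inf")
    for name, template in database.items():
        # Mean absolute pixel difference, normalized to the range 0..1.
        diff = np.abs(frame.astype(float) - template.astype(float)).mean() / 255.0
        if diff < best_score:
            best_name, best_score = name, diff
    return best_name if best_score < threshold else None
```

Returning None corresponds to the branch of step 203 that loops back to step 201.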
Then, in step 205, the judging unit 130 sends the control signal Sc according to the identification signal Sr. In step 206, the control module 140 performs the input operation corresponding to the image instruction according to the control signal Sc. In step 207, the display unit 150 displays the input operation corresponding to the image instruction. Several examples of the user's motion image I are described below.
First, please refer to Fig. 3A, which shows a schematic diagram of the first gesture of the motion image according to the first embodiment of the invention. When the user's motion image I is the first gesture G1 shown in Fig. 3A, the first gesture G1 is defined as the thumb and the index finger being spread apart. In this embodiment, the image instruction comprises a first image instruction. When the image capture unit 110 captures the first gesture G1 in step 201, the recognition unit 120 recognizes the first gesture G1 in step 202. Then, in step 203, the recognition unit 120 compares it against the database 122 and judges that the first gesture G1 is the first image instruction, wherein the input operation corresponding to the first image instruction is to display a pointer 142.
Please refer to Fig. 4, which shows a schematic diagram of the first image instruction according to the first embodiment of the invention. Thus, as shown in Fig. 4, in step 206 the control module 140 makes the pointer 142 appear on the display unit 150.
Next, consider the case where the user's motion image I is the first gesture G1 in motion. After the image capture unit 110 captures the motion image I of the moving first gesture G1 in step 201, the recognition unit 120 continues the recognition. Then, in step 203, the recognition unit 120 compares it against the database 122 and judges that the moving first gesture G1 is the second image instruction, wherein the input operation corresponding to the second image instruction is to move the pointer 142.
In addition, please refer to Fig. 3B, which shows a schematic diagram of the second gesture of the motion image according to the first embodiment of the invention. When the user's motion image I is the second gesture G2 shown in Fig. 3B, the second gesture G2 is defined as the thumb fingertip and the index fingertip touching each other. When the image capture unit 110 captures the motion image I of the second gesture G2 in step 201, the recognition unit 120 recognizes it in step 202. Then, in step 203, the recognition unit 120 compares it against the database 122 and judges that the second gesture G2 is the third image instruction, wherein the input operation corresponding to the third image instruction is a click (single click), also called mouse down, which is equivalent to pressing the left mouse button.
Thus, in step 206 the control module 140 makes the pointer 142 perform a click on the display unit 150 of this embodiment. Please refer to Fig. 5, which shows a schematic diagram of the third image instruction according to the first embodiment of the present invention. In this embodiment, the third image instruction is a click at the pointer 142. Taking Fig. 6 as an example, the user can click the shortcut option 146 with the second gesture G2.
In addition, consider the case where the user's motion image I is the second gesture G2 in motion. When the image capture unit 110 captures the motion image I of the moving second gesture G2 in step 201, the recognition unit 120 recognizes it in step 202. Then, in step 203, the recognition unit 120 compares it against the database 122 and judges that the moving second gesture G2 is the fourth image instruction, wherein the input operation corresponding to the fourth image instruction is a drag. Thus, in step 206 the control module 140 performs a drag of the object 144 (shown in Fig. 6) on the display unit 150 of this embodiment.
Please refer to Fig. 6, which shows a schematic diagram of an application of the image instructions according to the first embodiment of the present invention. First, the pointer 142 is moved onto the object 144 with the first gesture G1; the gesture is then changed to the second gesture G2 to click the object 144; and the pointer 142 is then moved with the moving second gesture G2 to drag the object 144. In this embodiment, the object 144 is, for example, an icon or a folder in Microsoft Windows.
Please refer to Fig. 7, which shows another schematic diagram of an application of the image instructions according to the first embodiment of the present invention. In this embodiment, the fourth image instruction makes the control module 140 perform a drag on the display unit 150. Taking Fig. 7 as an example, the user can highlight and select several folders with the second gesture G2.
In addition, while a drag is being performed, if the user's motion image I changes from the second gesture G2 to the first gesture G1, the recognition unit 120 compares it against the database 122 and judges it to be the first gesture G1, and the input operation of the control module 140 at this moment is a release, also called mouse up, which is equivalent to releasing the left mouse button.
Thus, in step 206 the control module 140 makes the pointer 142 confirm the action on the display unit 150 of this embodiment. In this embodiment, the second gesture G2 represents a click; continuing to move while keeping the second gesture G2 after the click represents a drag; and changing from the second gesture G2 back to the first gesture G1 represents a release.
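The gesture semantics summarized in the preceding paragraph can be read as a small state machine; a hedged sketch follows. The gesture labels ("G1", "G2", "G3") and the returned operation names are illustrative, and the patent does not prescribe this structure.

```python
# Sketch of the gesture-to-operation mapping of the first embodiment (illustrative).
def interpret(prev_gesture, gesture, moved):
    """Map the current gesture (and whether it moved) to an input operation name."""
    if gesture == "G1":
        if prev_gesture == "G2":
            return "release"                 # G2 -> G1: mouse up
        return "move_pointer" if moved else "show_pointer"
    if gesture == "G2":
        if prev_gesture == "G2" and moved:
            return "drag"                    # keep pinching while moving: drag
        return "click"                       # fingertips touch: mouse down
    if gesture == "G3":
        return "function_switch"             # e.g. open the menu 148
    return "none"
```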
Moreover, please refer to Fig. 3C, which shows a schematic diagram of the third gesture of the motion image according to the first embodiment of the invention. When the user's motion image I is the third gesture G3 of Fig. 3C, the third gesture G3 is defined as the middle finger, the ring finger, and the little finger being spread apart while the index finger roughly touches the thumb. When the image capture unit 110 captures the motion image I of the third gesture G3 in step 201, the recognition unit 120 recognizes it in step 202. Then, in step 203, the recognition unit 120 compares it against the database 122 and judges that the third gesture G3 is the fifth image instruction, wherein the input operation corresponding to the fifth image instruction is function switching.
Please refer to Fig. 8, which shows a schematic diagram of the sixth image instruction according to the first embodiment of the present invention. Thus, in step 206 the control module 140 makes the pointer 142 perform function switching on the display unit 150 of this embodiment. In this embodiment, the function switching is, for example, the copy function of a right mouse button menu; when the control module 140 executes this instruction, a menu 148 may appear on the display unit 150 for the user to choose from.
In summary, in this embodiment the motion image I comprises the first gesture G1, the second gesture G2, and the third gesture G3. These three gestures are designed around the user's intuitive operation and are combined with one another to control the electronic device 100. For example, the operations frequently used on a computer, such as moving the cursor, clicking, dragging, releasing, and highlight selection, can all be completed with the three gestures described above.
The operating method for moving the cursor in this embodiment is described first: the user first displays the pointer 142 with the first gesture G1 (Fig. 4) and then moves the pointer 142 onto an object 144 (Fig. 6) by moving the first gesture G1. The operating method for clicking is described next: the user first moves the pointer 142 with the first gesture G1 onto the option or object 144 to be selected, such as the shortcut option 146 of Fig. 6, and then clicks with the second gesture G2. In this way the image input control method can be carried out directly with gestures, which not only increases the convenience of operation but also frees the user from the press-based control of existing input devices.
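Assuming the interpret helper sketched earlier, the move, click, drag, and release sequence described in this paragraph would play out as follows (the frame values are illustrative).

```python
# Driving the sketched state machine through the sequence described above.
sequence = [("G1", False), ("G1", True), ("G2", False), ("G2", True), ("G1", False)]
prev = None
for gesture, moved in sequence:
    print(gesture, "->", interpret(prev, gesture, moved))
    prev = gesture
# Prints: show_pointer, move_pointer, click, drag, release
```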
Second embodiment
Please refer to Fig. 9, which shows a block diagram of the image input control system according to the second embodiment of the present invention. The image input control system comprises an image capture unit 210 and an electronic device 200.
The image capture unit 210 captures a motion image I' of a user and comprises a recognition unit 220. The recognition unit 220 parses the motion image I' into an object signal So. The electronic device 200 comprises a control module 240 and a display unit 250. The control module 240 performs an input operation according to the object signal So, and the display unit 250 displays the input operation. In this example, the electronic device 200 is a computer.
Please refer to Fig. 9 together with Fig. 10; Fig. 10 shows a flowchart of the image input control method according to the second embodiment of the invention. The detailed steps of the method of this embodiment are described below with reference to Fig. 9 and Fig. 10.
First, in step 901, the motion image I' of the user is captured with the image capture unit 210. In this embodiment, the image capture unit 210 preferably further comprises a light source 212 and a light sensing unit 214. The light source 212 emits several light beams 212a (shown in Fig. 11). The light sensing unit 214 receives the light beams 212a reflected from the user's limbs (for example, the fingers) to produce the motion image I'.
Please refer to Fig. 11, which shows a schematic diagram of the image capture unit in the second embodiment of the present invention. In Fig. 11, only part of the beam paths are drawn to keep the drawing simple. In this embodiment, to improve the recognition rate, an infrared light source is used as the light source 212, and a first reflector 216a and a second reflector 216b are provided. The first reflector 216a and the second reflector 216b can be worn on the user's limbs (such as the fingers) and reflect the light beams 212a. When the light beams 212a are reflected by the first reflector 216a and the second reflector 216b, the light sensing unit 214 receives the reflected beams 212a.
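One way to picture how the reflected beams 212a become discrete objects is to threshold the sensed infrared frame and label its bright blobs, as in the sketch below. The threshold value and the use of SciPy's connected-component labelling are implementation assumptions, not details from the patent.

```python
# Sketch: turn an IR frame into reflector "objects" as (centroid_y, centroid_x, area).
import numpy as np
from scipy import ndimage

def extract_objects(ir_frame: np.ndarray, threshold: int = 200):
    mask = ir_frame >= threshold              # keep only strong reflections
    labels, count = ndimage.label(mask)       # one label per reflector blob
    objects = []
    for idx in range(1, count + 1):
        ys, xs = np.nonzero(labels == idx)
        objects.append((ys.mean(), xs.mean(), ys.size))   # area in pixels
    # Sort by area; a larger blob suggests a reflector closer to the light source,
    # in line with the area observation made for Fig. 14 below.
    return sorted(objects, key=lambda o: o[2], reverse=True)
```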
Next, in step 902, the recognition unit 220 recognizes the motion image I'. Then, in step 903, it is judged whether the motion image I' contains an object signal So; if so, step 904 is entered, and if not, steps 901 to 902 are repeated.
When an object signal So in the two-dimensional space has been parsed from the motion image I' in step 903, step 904 is entered and the control module 240 performs an input operation according to the object signal So. In this embodiment, the recognition unit 220 recognizes the motion image I' and parses the first reflector 216a and the second reflector 216b into a first object 222a and a second object 222b in the object signal (shown in Fig. 13). Then, in step 905, the control module 240 performs the corresponding input operation according to the first object 222a and the second object 222b in the object signal So transmitted by the recognition unit 220. Several examples of the user's motion image I' are described below.
First, please refer to Fig. 12A and Fig. 13. Fig. 12A shows a schematic diagram of the first gesture of the motion image according to the second embodiment of the invention, and Fig. 13 shows a schematic diagram of the light reflected by the first gesture in Fig. 12A during recognition. When the user's motion image I' is the first gesture G1' shown in Fig. 12A, the recognition unit 220 recognizes the motion image I' and determines the first object 222a and the second object 222b in the two-dimensional space. In this embodiment, the control module 240 takes the first object detected by the recognition unit 220, for example the first object 222a, as the basis of the input operation, and the input operation is then to display a pointer 142 (as in Fig. 4) according to the first object 222a.
Continuing with Fig. 12A and Fig. 13, when the first object 222a moves, the input operation of the image instruction corresponding to the motion image I' is to move the pointer 142 (shown in Fig. 6).
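A simple way to realize "move the pointer as the first object moves" is to scale the object's centroid from sensor coordinates to screen coordinates; the resolutions in the sketch below are placeholders, not values from the patent.

```python
# Illustrative centroid-to-pointer mapping (assumed sensor and screen sizes).
def object_to_pointer(centroid, sensor_size=(480, 640), screen_size=(1080, 1920)):
    cy, cx = centroid
    sensor_h, sensor_w = sensor_size
    screen_h, screen_w = screen_size
    return int(cx / sensor_w * screen_w), int(cy / sensor_h * screen_h)  # (x, y)
```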
In addition, please refer to Fig. 12B and Fig. 14. Fig. 12B shows a schematic diagram of the second gesture of the motion image according to the second embodiment of the invention, and Fig. 14 shows a schematic diagram of the light reflected by the second gesture in Fig. 12B during recognition. When the user's motion image I' is the second gesture G2' shown in Fig. 12B, the second gesture G2' is defined as the thumb fingertip touching the index fingertip. After the image capture unit 210 captures the motion image I' of the second gesture G2' with the light beams 212a in step 901, the recognition unit 220 recognizes it in step 902. First, the recognition unit 220 parses the reflected beams 212a into the first object 222a and the second object 222b, each of which has a specific area in the two-dimensional space: the closer a reflector (216a or 216b) is to the light source 212, the larger the area, and the farther a reflector is from the light source 212, the smaller the area. When the control module 240 judges that the first object 222a and the second object 222b substantially overlap or are within a first distance d1 of each other, the input operation of the control module 240 is a click (single click) on an object 144 (as in Fig. 6), also called mouse down, which is equivalent to pressing the left mouse button.
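The click decision in this paragraph reduces to a distance test between the two parsed objects; a sketch follows, using centroids in the (y, x, area) format of the earlier extraction sketch. The pixel value chosen for d1 is an assumption.

```python
# Sketch of the mouse-down decision: objects overlap or sit within the first distance d1.
import math

D1 = 10.0   # first distance d1 in pixels (illustrative)

def is_click(obj_a, obj_b, d1: float = D1) -> bool:
    (ya, xa, _), (yb, xb, _) = obj_a, obj_b
    return math.hypot(xa - xb, ya - yb) <= d1   # substantially overlapping or within d1
```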
Continuing with Fig. 12B and Fig. 14, after the user clicks an object 144 (as in Fig. 6) and then moves with the second gesture G2', the input operation of the control module 240 is to drag the object 144 (for example, an icon) on the display unit 250. In this embodiment, when the recognition unit 220 parses that the first object 222a and the second object 222b substantially overlap or are within the first distance d1 and a displacement then occurs, the input operation of the control module 240 is to drag the clicked object 144 (shown in Fig. 6).
Continuing, while a drag is being performed, when the user's motion image I' changes from the second gesture G2' of Fig. 12B to the first gesture G1' of Fig. 12A, the input operation of the control module 240 is a release, also called mouse up. In this embodiment, while a drag is being performed, when the recognition unit 220 parses that the first object 222a and the second object 222b change from substantially overlapping or being within the first distance d1 (as in Fig. 14) to being separated by a second distance d2 (as in Fig. 13), the input operation performed by the control module 240 is a release (mouse up), wherein the second distance d2 is substantially greater than the first distance d1.
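Putting the last three paragraphs together, the click, drag, and release decisions can be tracked with a small stateful helper driven only by the object distance and whether the objects moved; the distance constants and state handling below are assumptions made for illustration.

```python
# Sketch of the click -> drag -> release sequence for the two-object case.
class TwoObjectTracker:
    def __init__(self, d1: float = 10.0, d2: float = 40.0):
        self.d1, self.d2 = d1, d2   # first and second distances (assumed pixel values)
        self.pressed = False

    def update(self, distance: float, moved: bool) -> str:
        if distance <= self.d1:                    # fingertips together (as in Fig. 14)
            if self.pressed and moved:
                return "drag"                      # keep pinching and move: drag the object
            self.pressed = True
            return "click"                         # mouse down
        if distance >= self.d2 and self.pressed:   # objects separate again (as in Fig. 13)
            self.pressed = False
            return "release"                       # mouse up
        return "none"
```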
Please refer to Fig. 12C, which shows a schematic diagram of the third gesture of the motion image according to the second embodiment of the invention. In this embodiment, the image capture unit 210 further comprises three fingerstalls 216c, which are worn on the user's middle finger, ring finger, and little finger, so that the light beams 212a are reflected by the fingerstalls 216c. When the light source 212 emits the light beams 212a and they are reflected, the light sensing unit 214 receives the reflected beams 212a to produce the motion image I'. The recognition unit 220 then parses the reflected beams 212a into three objects 222d, and the control module 240 determines an input operation according to the three objects 222d.
Next, please refer to Fig. 12C together with Fig. 15. Fig. 15 shows a schematic diagram of the light reflected by the third gesture in Fig. 12C during recognition. When the user's motion image I' is the third gesture G3' of Fig. 12C, the motion image I' is defined as the third gesture G3', and the input operation of the image instruction corresponding to this motion image I' is function switching. In this embodiment, when the three objects 222d parsed by the recognition unit 220 substantially keep a fixed distance d (as in Fig. 15) from one another, the input operation performed by the control module 240 is function switching.
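A minimal check for the three-object case follows: the gesture counts as function switching when the three parsed objects keep roughly the fixed spacing d. The spacing and tolerance values are assumptions, and the objects are again taken in the (y, x, area) format used in the earlier sketches.

```python
# Sketch: detect the function-switch gesture from three objects at a fixed spacing d.
import math

def is_function_switch(objects, d: float = 30.0, tolerance: float = 5.0) -> bool:
    if len(objects) != 3:
        return False
    points = [(x, y) for y, x, _ in objects]
    spacings = [math.hypot(points[i][0] - points[j][0], points[i][1] - points[j][1])
                for i, j in ((0, 1), (1, 2))]   # spacing between adjacent fingers
    return all(abs(s - d) <= tolerance for s in spacings)
```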
Next, please refer to Fig. 16A and Fig. 16B. Fig. 16A is another schematic diagram of the first gesture of the motion image according to the second embodiment of the invention, and Fig. 16B is another schematic diagram of the second gesture of the motion image according to the second embodiment of the invention. The first reflector 216a' and the second reflector 216b' can both be worn on the user's index finger, and the recognition unit 220 parses them into the first object 222a and the second object 222b in the two-dimensional space. The control module 240 determines the corresponding input operations according to the coordinates or relative positions of the first object 222a and the second object 222b. The recognition unit 220 parses the light beams 212a reflected by the first reflector 216a' (first fingerstall) and the second reflector 216b' (second fingerstall) on the index finger into the first object 222a and the second object 222b in the two-dimensional space. The control module 240 receives the object signal So transmitted by the recognition unit 220 and performs the pointer 142 input operation according to the first detected object in the object signal So (such as the first object 222a); when the user moves the index finger, the input operation is to move the pointer 142.
Referring to Fig. 16B, when the user's index finger bends, the first reflector 216a' and the second reflector 216b' come close to each other. When the recognition unit 220 parses that the first object 222a and the second object 222b are within the first distance d1 of each other or substantially overlap, the input operation is a click (single click). If the user then continues to move the index finger after the click, the input operation is a drag. Then, when the user's index finger changes from the bent state (shown in Fig. 16B) to the extended state (shown in Fig. 16A), the recognition unit 220 recognizes that the first object 222a and the second object 222b are separated by a second distance d2, which is substantially greater than the first distance d1, and the input operation at this moment is a release (mouse up).
The image input control method and the image input control system disclosed in the above embodiments of the present invention capture and recognize the user's motion image, so that the user controls the image input system through motion. The user can thus operate the electronic device directly with the most familiar gestures, which increases the convenience of operation and frees the electronic device from the control methods of existing input devices.
In summary, although the present invention has been disclosed above with preferred embodiments, they are not intended to limit the invention. A person having ordinary knowledge in the technical field of the invention may make various modifications and variations without departing from the spirit and scope of the invention. Therefore, the protection scope of the present invention shall be defined by the appended claims.

Claims (35)

1. An image input control method, applicable to providing an input signal to a computer system, the control method comprising:
(a) capturing a motion image of a user;
(b) recognizing the motion image;
(c) sending an identification signal when the motion image is recognized as an image instruction;
(d) sending a control signal according to the identification signal; and
(e) performing an input operation corresponding to the image instruction according to the control signal.
2. The image input control method as claimed in claim 1, wherein when the motion image is a first gesture, the image instruction comprises a first image instruction, and step (c) comprises:
judging that the first gesture is the first image instruction, wherein the input operation corresponding to the first image instruction is displaying a pointer.
3. The image input control method as claimed in claim 2, wherein when the motion image is the first gesture in motion, the image instruction further comprises a second image instruction, and step (c) further comprises:
judging that the moving first gesture is the second image instruction, wherein the input operation corresponding to the second image instruction is moving the pointer.
4. The image input control method as claimed in claim 2, wherein when the motion image is a second gesture, the image instruction comprises a third image instruction, and step (c) further comprises:
judging that the second gesture is the third image instruction, wherein the input operation corresponding to the third image instruction is a click.
5. The image input control method as claimed in claim 4, wherein when the motion image is the second gesture in motion, the image instruction further comprises a fourth image instruction, and step (c) further comprises:
judging that the moving second gesture is the fourth image instruction, wherein the input operation corresponding to the fourth image instruction is a drag.
6. The image input control method as claimed in claim 5, wherein when the motion image changes from the second gesture to the first gesture, the input operation is a release.
7. The image input control method as claimed in claim 1, wherein when the motion image is a third gesture, the input operation is function switching.
8. An image input control system, comprising:
an image capture unit for capturing a motion image of a user; and
an electronic device comprising:
a recognition unit for recognizing the motion image and sending an identification signal when the motion image is judged to be an image instruction;
a judging unit for sending a control signal according to the identification signal;
a control module for performing an input operation corresponding to the image instruction according to the control signal; and
a display unit for displaying the input operation corresponding to the image instruction.
9. The image input control system as claimed in claim 8, wherein when the motion image is a first gesture, the image instruction comprises a first image instruction, and the recognition unit further judges that the first gesture is the first image instruction, wherein the input operation corresponding to the first image instruction is displaying a pointer.
10. The image input control system as claimed in claim 9, wherein when the motion image is the first gesture in motion, the image instruction further comprises a second image instruction, and the recognition unit further judges that the moving first gesture is the second image instruction, wherein the input operation corresponding to the second image instruction is moving the pointer.
11. The image input control system as claimed in claim 9, wherein when the motion image is a second gesture, the image instruction comprises a third image instruction, and the recognition unit further judges that the second gesture is the third image instruction, wherein the input operation corresponding to the third image instruction is a click.
12. The image input control system as claimed in claim 11, wherein when the motion image is the second gesture in motion, the image instruction further comprises a fourth image instruction, and the recognition unit further judges that the moving second gesture is the fourth image instruction, wherein the input operation corresponding to the fourth image instruction is a drag.
13. The image input control system as claimed in claim 12, wherein when the motion image changes from the second gesture to the first gesture, the input operation is a release.
14. The image input control system as claimed in claim 8, wherein when the motion image is a third gesture, the input operation is function switching.
15. An image input control method, applicable to providing an input signal to an electronic device, the control method comprising:
(a) capturing a motion image of a user;
(b) recognizing the motion image;
(c) parsing the motion image into an object signal in a two-dimensional space; and
(d) performing an input operation according to the object signal.
16. The image input control method as claimed in claim 15, wherein the input operation comprises performing a pointer operation, a click, a drag, or a release on the electronic device.
17. The image input control method as claimed in claim 15, wherein the object signal comprises a first object, and the input operation is moving a pointer on a display unit according to the movement of the first object.
18. The image input control method as claimed in claim 17, wherein the object signal further comprises a second object, and when the first object and the second object are substantially within a first distance of each other, the input operation is a click.
19. The image input control method as claimed in claim 18, wherein when the pointer continues to move, the input operation is dragging an object on the display unit.
20. The image input control method as claimed in claim 19, wherein when the distance between the first object and the second object changes from the first distance to a second distance, the input operation is a release, wherein the second distance is substantially greater than the first distance.
21. The image input control method as claimed in claim 15, wherein a first reflector is provided on the user's limb, and the method further comprises:
emitting a plurality of light beams;
receiving the light beams reflected by the first reflector; and
parsing the first reflector into a first object in the object signal.
22. The image input control method as claimed in claim 21, wherein the input operation is moving a pointer on a display unit according to the first object.
23. The image input control method as claimed in claim 21, wherein a second reflector is provided on the user's limb, and the method further comprises:
parsing the first reflector and the second reflector into a first object and a second object in the two-dimensional space, wherein when the first object and the second object are within a first distance of each other, the input operation is performing a click, and when the first object and the second object are separated by a second distance, the input operation is performing a release, wherein the second distance is greater than the first distance.
24. The image input control method as claimed in claim 15, wherein a first reflector, a second reflector, and a third reflector are provided on the user's limb, and when the three reflectors are parsed into three objects in the two-dimensional space, the input operation is function switching or menu selection.
25. An image input control system, comprising:
an image capture unit for capturing a motion image of a user, the image capture unit comprising:
a recognition unit for parsing the motion image into an object signal; and
an electronic device comprising:
a control module for performing an input operation according to the object signal; and
a display unit for displaying the input operation.
26. The image input control system as claimed in claim 25, wherein the image capture unit further comprises:
a light source for emitting a plurality of light beams; and
a light sensing unit for receiving the light beams reflected by the user to produce the object signal.
27. The image input control system as claimed in claim 25, wherein a first reflector is provided on the user's limb, the object signal comprises a first object corresponding to the first reflector, and the input operation is moving a pointer on the display unit according to the movement of the first object.
28. The image input control system as claimed in claim 27, wherein a second reflector is provided on the user's limb, the object signal further comprises a second object corresponding to the second reflector, and when the first object and the second object are substantially within a first distance of each other, the input operation is a click.
29. The image input control system as claimed in claim 28, wherein after the click is performed, when the first object and the second object continue to move, the input operation is dragging the clicked object on the display unit.
30. The image input control system as claimed in claim 29, wherein when the distance between the first object and the second object is a second distance, the input operation is a release, wherein the second distance is greater than the first distance.
31. The image input control system as claimed in claim 25, wherein the object signal comprises a first object, and the input operation is moving a pointer on the display unit according to the movement of the first object.
32. The image input control system as claimed in claim 31, wherein the object signal further comprises a second object, and when the first object and the second object are substantially within a first distance of each other, the input operation is a click.
33. The image input control system as claimed in claim 32, wherein when the first object and the second object continue to move, the input operation is dragging an object on the display unit.
34. The image input control system as claimed in claim 33, wherein when the distance between the first object and the second object is a second distance, the input operation is a release, wherein the second distance is greater than the first distance.
35. The image input control system as claimed in claim 25, wherein the object signal comprises a first object and a second object; when the first object and the second object are within a first distance of each other, the input operation is performing a click, and when the first object and the second object are separated by a second distance, the input operation is performing a release, wherein the second distance is substantially greater than the first distance.
CNA2007101617761A 2007-09-26 2007-09-26 Image input control method and image input control system Pending CN101398712A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNA2007101617761A CN101398712A (en) 2007-09-26 2007-09-26 Image input control method and image input control system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNA2007101617761A CN101398712A (en) 2007-09-26 2007-09-26 Image input control method and image input control system

Publications (1)

Publication Number Publication Date
CN101398712A true CN101398712A (en) 2009-04-01

Family

ID=40517300

Family Applications (1)

Application Number Title Priority Date Filing Date
CNA2007101617761A Pending CN101398712A (en) 2007-09-26 2007-09-26 Image input control method and image input control system

Country Status (1)

Country Link
CN (1) CN101398712A (en)


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102055925A (en) * 2009-11-06 2011-05-11 康佳集团股份有限公司 Television supporting gesture remote control and using method thereof
CN103809839A (en) * 2012-11-01 2014-05-21 夏普株式会社 Information displaying apparatus and information displaying method
CN103809839B (en) * 2012-11-01 2017-03-01 夏普株式会社 Information display device and method for information display
CN103118297A (en) * 2013-01-22 2013-05-22 广东星海数字家庭产业技术研究院有限公司 Fatigue-prevention digital television system based on motion identification
CN105378602A (en) * 2013-07-23 2016-03-02 罗伯特·博世有限公司 Method for operating an input device, and input device
CN105378602B (en) * 2013-07-23 2019-08-09 罗伯特·博世有限公司 For running the method and input equipment of input equipment
CN104978017A (en) * 2014-04-14 2015-10-14 冠捷投资有限公司 Method for controlling function menu of display device by user gesture
CN113678089A (en) * 2019-03-26 2021-11-19 株式会社东海理化电机制作所 Control device, system, and program

Similar Documents

Publication Publication Date Title
AU2007100827A4 (en) Multi-event input system
US7849421B2 (en) Virtual mouse driving apparatus and method using two-handed gestures
US9207806B2 (en) Creating a virtual mouse input device
EP1241616B1 (en) Portable electronic device with mouse-like capabilities
EP1980937B1 (en) Object search method and terminal having object search function
CN102855081B (en) The apparatus and method that web browser interface using gesture is provided in a device
US7358963B2 (en) Mouse having an optically-based scrolling feature
US9946372B2 (en) Pen type input device and method for character input and mouse functions
CN101398712A (en) Image input control method and image input control system
US20120127106A1 (en) Electronic device capable of executing commands therein and method for executing commands in the same
US20150177858A1 (en) Contact type finger mouse and operation method thereof
JP2008234212A (en) Coordinate input device and method of controlling coordinate input device
KR101512239B1 (en) System and method for transfering content among devices using touch command and unusual touch
JP6364790B2 (en) pointing device
CN101308421B (en) Block-free touch control operation electronic device
CN208061136U (en) A kind of gesture identification terminal
CN101308453B (en) Operation method possessing user interface
CN101308434B (en) User interface operation method
CN101308420A (en) Hand-held device and electronic device capable of switching user interface
CN112162689B (en) Input method and device and electronic equipment
KR100899864B1 (en) Pointing method of portable apparatus
CN201117000Y (en) Non-obstruction touch control operation electronic device
KR100868175B1 (en) Mobil phone having optical pointing device and thereof method
CN210466360U (en) Page control device
CN101308418A (en) Hand held device user interface operation method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Open date: 20090401