CN103294173A - Remote control system based on user actions and method thereof


Info

Publication number
CN103294173A
CN103294173A
Authority
CN
China
Prior art keywords
theta
tan
angle
image capture
capture element
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012100448998A
Other languages
Chinese (zh)
Inventor
梁家瑞
吴柏增
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TPV Investment Co Ltd
TPV Technology Co Ltd
Original Assignee
TPV Investment Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TPV Investment Co Ltd filed Critical TPV Investment Co Ltd
Priority to CN2012100448998A priority Critical patent/CN103294173A/en
Publication of CN103294173A publication Critical patent/CN103294173A/en
Pending legal-status Critical Current

Landscapes

  • Position Input By Displaying (AREA)

Abstract

The invention discloses a remote control system based on user actions and a corresponding remote control method. The system comprises a display; a first image capture element and a second image capture element that continuously capture images of the user; an image recognition unit; and a cursor position acquisition unit. The image recognition unit identifies, from the signals of the user images, the midpoint between the user's eyes and the position of a fingertip pointing at the display. The cursor position acquisition unit then computes, from the positions of the two image capture elements, the eye midpoint, and the fingertip, the coordinate at which the straight line through the eye midpoint and the fingertip intersects the display plane of the display, and takes this intersection coordinate as the cursor position coordinate. The system and method can thereby simulate the action of a user pointing a remote control at the display.

Description

Remote control system based on user actions and method thereof
Technical field
The present invention relates to a remote control system and method, and more particularly to a remote control system and method based on user actions.
Background technology
With the development of science and technology, more and more electronic products, such as air conditioners and televisions, have entered people's daily lives. Accordingly, the remote controls used to operate these products have also multiplied, and the remote control has become an almost indispensable household device.
Moreover, modern televisions grow ever more capable. A smart TV, for example, offers Internet access in addition to the usual wired and wireless channel programs. Its remote control is correspondingly more complicated than that of a conventional television, which can confuse users: they cannot quickly and intuitively find the desired function on the remote control.
On the other hand, motion-sensing technologies that provide remote-control-like functions have appeared in recent years. Taking Microsoft's Kinect as an example, it detects human body motion with a color camera and a 3D depth sensor; in other words, such technology can replace the functions of a conventional remote control.
However, such motion-sensing technology has a drawback: the camera and sensor must capture the motion of the user's whole body to achieve acceptable detection accuracy. The user may therefore have to stand, and even stand in a sufficiently spacious environment where the limbs can be stretched out, so that the camera and sensor can capture the motion.
Summary of the invention
An object of the present invention is to provide a remote control system based on user actions.
The remote control system based on user actions according to the present invention comprises a display, a first image capture element, a second image capture element, an image recognition unit, and a cursor position acquisition unit.
The front of the display comprises a peripheral portion and a display plane surrounded by the peripheral portion.
The first and second image capture elements are disposed on the peripheral portion of the display, on the left and right sides of the display plane respectively, and lie on the same horizontal plane; they continuously capture images of the user and produce corresponding signals. The two image capture elements define a horizontal line between them and are separated by a horizontal distance.
The image recognition unit identifies, from the signals of the user images, the midpoint between the user's eyes and the position of at least one fingertip pointing at the display.
The cursor position acquisition unit computes, from the relative positions of the first image capture element, the second image capture element, the eye midpoint, and the fingertip position, the coordinate at which the straight line through the eye midpoint and the fingertip intersects the display plane, and takes this intersection coordinate as the cursor position coordinate.
Another object of the present invention is to provide a remote control method based on user actions. The method comprises the following steps: (a) providing a remote control system based on user actions that comprises a display, a first image capture element, a second image capture element, an image recognition unit, and a cursor position acquisition unit, wherein the front of the display comprises a peripheral portion and a display plane surrounded by the peripheral portion, the first and second image capture elements are disposed on the peripheral portion on the left and right sides of the display plane respectively and lie on the same horizontal plane, and the two image capture elements define a horizontal line and are separated by a horizontal distance; (b) the first and second image capture elements continuously capturing images of the user and producing corresponding signals; (c) the image recognition unit identifying, from the signals of the user images, the midpoint between the user's eyes and the position of at least one fingertip pointing at the display; and (d) the cursor position acquisition unit computing, from the relative positions of the first image capture element, the second image capture element, the eye midpoint, and the fingertip position, the coordinate at which the straight line through the eye midpoint and the fingertip intersects the display plane, and taking this intersection coordinate as the cursor position coordinate.
The beneficial effect of the present invention is that, by identifying the user's eye midpoint and fingertip position and deriving the cursor position coordinate from them, the system can simulate the behavior of a user pointing a remote control at the display.
Description of drawings
Fig. 1 is a system diagram illustrating the preferred embodiment of the remote control system based on user actions according to the present invention;
Fig. 2 is a flowchart illustrating the remote control method based on user actions corresponding to the preferred embodiment;
Fig. 3 is a schematic diagram illustrating the position of the user's eye midpoint relative to the display in the preferred embodiment;
Fig. 4 is a schematic diagram illustrating the position of the user's fingertip relative to the display in the preferred embodiment;
Fig. 5 is a schematic diagram illustrating the user's eye midpoint and fingertip position in the preferred embodiment, and the point at which the straight line through them intersects the display plane of the display.
Embodiment
The present invention is described in detail below with reference to the drawings and an embodiment.
Referring to Fig. 1, Fig. 2, and Fig. 3, the preferred embodiment of the remote control system 1 based on user actions according to the present invention comprises a display 11, a first image capture element 12, a second image capture element 13, an image recognition unit 14, and a cursor position acquisition unit 15. The cursor position acquisition unit 15 comprises an angle information acquisition module 151 and a cursor coordinate computation module 152. In this preferred embodiment, the display 11 is a television, and each of the first image capture element 12 and the second image capture element 13 is either a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor. The image recognition unit 14 and the cursor position acquisition unit 15 are implemented in software, as a stored computer program; after a processor (not shown) of the display 11 loads and executes the program, the functions of the image recognition unit 14 and the cursor position acquisition unit 15 are carried out.
The front of the display 11 comprises a peripheral portion 111 and a display plane 112 (see Fig. 3) surrounded by the peripheral portion 111.
The first image capture element 12 and the second image capture element 13 are disposed on the peripheral portion 111 of the display 11, on the left and right sides of the display plane 112 respectively, and lie on the same horizontal plane. The two elements define a horizontal line L, namely the straight line connecting the first image capture element 12 and the second image capture element 13, and are separated by a horizontal distance X, the length of the line L.
It is worth mentioning that, in this preferred embodiment, the origin of a coordinate system is located at the position of the first image capture element 12. The positive x-axis points from the origin toward the second image capture element 13; the positive y-axis points straight up from the origin; and the positive z-axis points out of the display plane 112, that is, toward the user standing in front of the display 11. In short, the coordinate of the first image capture element 12 is (0, 0, 0), the coordinate of the second image capture element 13 is (X, 0, 0), and the horizontal plane that contains the horizontal line L and is perpendicular to the display plane 112 of the display 11 is regarded as the x-z plane of the coordinate system, i.e., the plane Y = 0.
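For concreteness, the coordinate conventions above can be pinned down in a few lines of Python. This is a minimal sketch only; the names and the example distance are ours, not the patent's:

```python
# Illustrative coordinate conventions (names and values are assumptions).
# Origin: first image capture element; x toward the second element;
# y straight up; z out of the screen toward the user.
X = 1.0  # horizontal distance between the two image capture elements, metres (assumed)

CAM1 = (0.0, 0.0, 0.0)  # first image capture element
CAM2 = (X, 0.0, 0.0)    # second image capture element
# Display plane: z = 0 (the x-y plane).
# Horizontal plane through the line L: y = 0 (the x-z plane).
```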
The interaction among the first image capture element 12, the second image capture element 13, the image recognition unit 14, the angle information acquisition module 151, and the cursor coordinate computation module 152 is further explained below, together with the remote control method based on user actions and an exemplary application.
First, after the user standing in front of the display 11 turns the display 11 on, the first image capture element 12 and the second image capture element 13 begin operating.
As shown in step S21, the first image capture element 12 and the second image capture element 13 continuously capture images of the user and produce corresponding signals.
As shown in step S22, the image recognition unit 14 identifies, from the signals of the user images, the midpoint A between the user's eyes (see Fig. 3) and the position B of at least one fingertip pointing at the display 11 (see Fig. 4). In this preferred embodiment the image recognition unit 14 identifies the position of only one fingertip, but it may of course identify the positions of several fingertips; the invention is not limited to this preferred embodiment. The image recognition techniques for identifying faces and limbs are well known to those skilled in the art and are not detailed here.
As shown in step S23, the angle information acquisition module 151 of the cursor position acquisition unit 15 obtains, from the relative positions of the first image capture element 12, the second image capture element 13, and the fingertip position B (see Fig. 4): a first angle $\theta_1$ between the horizontal line L and the line connecting the first image capture element 12 to the fingertip position B; a second angle $\theta_2$ between the horizontal line L and the line connecting the second image capture element 13 to the fingertip position B; and a third angle $\theta_3$ between the horizontal plane P and the line connecting the fingertip position B to its projection point M on the horizontal line L.
Likewise, from the relative positions of the first image capture element 12, the second image capture element 13, and the eye midpoint A (see Fig. 3), the angle information acquisition module 151 obtains: a fourth angle $\theta_4$ between the horizontal line L and the line connecting the first image capture element 12 to the eye midpoint A; a fifth angle $\theta_5$ between the horizontal line L and the line connecting the second image capture element 13 to the eye midpoint A; and a sixth angle $\theta_6$ between the horizontal plane P and the line connecting the eye midpoint A to its projection point N on the horizontal line L.
As shown in step S24, the cursor coordinate computation module 152 of the cursor position acquisition unit 15 computes the cursor position coordinate from the horizontal distance X and the angles $\theta_1$, $\theta_2$, $\theta_3$, $\theta_4$, $\theta_5$, and $\theta_6$ obtained by the angle information acquisition module 151.
It is worth mentioning that the cursor coordinate computation module 152 first computes the relative coordinates of the eye midpoint A and the fingertip position B, and then computes the cursor position coordinate from them. The details of the computation are explained below.
Relative coordinate of the eye midpoint:

Referring to Fig. 3, suppose the eye midpoint A lies below the horizontal plane. Let $X_3$ be the distance from the first image capture element 12 to the projection point N of A on the horizontal line L, and $X_4$ the distance from the second image capture element 13 to N, so that the horizontal distance satisfies $X = X_3 + X_4$.

Let $Z_4$ be the distance from A to N, $Y_2$ the vertical distance from A to the horizontal plane, and $Z_5$ the distance from N to the projection of A onto the horizontal plane. The relative coordinate of A is then $(X_3, -Y_2, Z_5)$.

By trigonometry,

$$\tan\theta_4 = \frac{Z_4}{X_3}, \qquad \tan\theta_5 = \frac{Z_4}{X_4}.$$

Hence

$$\tan\theta_4\,X_3 = Z_4 = \tan\theta_5\,X_4 = \tan\theta_5\,(X - X_3) = \tan\theta_5\,X - \tan\theta_5\,X_3,$$
$$\Rightarrow \tan\theta_4\,X_3 + \tan\theta_5\,X_3 = X_3(\tan\theta_4 + \tan\theta_5) = \tan\theta_5\,X,$$
$$\Rightarrow X_3 = \frac{X\tan\theta_5}{\tan\theta_4 + \tan\theta_5}.$$

Further,

$$Z_4 = \tan\theta_4\,X_3 = \frac{X\tan\theta_4\tan\theta_5}{\tan\theta_4 + \tan\theta_5}.$$

Also by trigonometry,

$$\tan\theta_6 = \frac{Y_2}{Z_5}, \qquad \sin\theta_6 = \frac{Y_2}{Z_4}.$$

Hence

$$\tan\theta_6\,Z_5 = Y_2 = \sin\theta_6\,Z_4,$$
$$\Rightarrow Z_5 = \frac{\sin\theta_6}{\tan\theta_6}\,Z_4 = \frac{X\sin\theta_6\tan\theta_4\tan\theta_5}{\tan\theta_6(\tan\theta_4 + \tan\theta_5)},$$

and

$$Y_2 = \sin\theta_6\,Z_4 = \frac{X\sin\theta_6\tan\theta_4\tan\theta_5}{\tan\theta_4 + \tan\theta_5}.$$

From the above formulas, the relative coordinate of the eye midpoint A is

$$\left(\frac{X\tan\theta_5}{\tan\theta_4 + \tan\theta_5},\; -\frac{X\sin\theta_6\tan\theta_4\tan\theta_5}{\tan\theta_4 + \tan\theta_5},\; \frac{X\sin\theta_6\tan\theta_4\tan\theta_5}{\tan\theta_6(\tan\theta_4 + \tan\theta_5)}\right).$$
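The derivation above reduces to a small triangulation routine. Below is a minimal sketch in Python (function and variable names are ours, not the patent's), assuming the angles are given in radians and the point lies below the horizontal plane, as in the embodiment:

```python
import math

def triangulate(X, theta_a, theta_b, theta_elev):
    """Recover the relative coordinate (x, y, z) of a point from:
    X          -- horizontal distance between the two image capture elements,
    theta_a    -- angle between line L and the line from the first element to the point,
    theta_b    -- angle between line L and the line from the second element to the point,
    theta_elev -- angle between the horizontal plane and the line from the point's
                  projection on L to the point.
    Matches the patent's formulas for A (theta_4..theta_6) and B (theta_1..theta_3)."""
    ta, tb = math.tan(theta_a), math.tan(theta_b)
    x = X * tb / (ta + tb)              # X_3 (or X_1)
    slant = X * ta * tb / (ta + tb)     # Z_4 (or Z_1): distance from the point to line L
    y = -math.sin(theta_elev) * slant   # -Y_2 (or -Y_1): point is below the plane
    z = math.cos(theta_elev) * slant    # Z_5 (or Z_2), since sin/tan = cos
    return (x, y, z)
```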
Relative coordinate of the fingertip position:

Referring to Fig. 4, suppose the fingertip position B likewise lies below the horizontal plane. Let $X_1$ be the distance from the first image capture element 12 to the projection point M of B on the horizontal line L, and $X_2$ the distance from the second image capture element 13 to M, so that $X = X_1 + X_2$.

Let $Z_1$ be the distance from B to M, $Y_1$ the vertical distance from B to the horizontal plane, and $Z_2$ the distance from M to the projection of B onto the horizontal plane. The relative coordinate of B is then $(X_1, -Y_1, Z_2)$.

By trigonometry,

$$\tan\theta_1 = \frac{Z_1}{X_1}, \qquad \tan\theta_2 = \frac{Z_1}{X_2}.$$

Hence

$$\tan\theta_1\,X_1 = Z_1 = \tan\theta_2\,X_2 = \tan\theta_2\,(X - X_1) = \tan\theta_2\,X - \tan\theta_2\,X_1,$$
$$\Rightarrow \tan\theta_2\,X = \tan\theta_1\,X_1 + \tan\theta_2\,X_1 = X_1(\tan\theta_1 + \tan\theta_2),$$
$$\Rightarrow X_1 = \frac{X\tan\theta_2}{\tan\theta_1 + \tan\theta_2}.$$

Further,

$$Z_1 = \tan\theta_1\,X_1 = \frac{X\tan\theta_1\tan\theta_2}{\tan\theta_1 + \tan\theta_2}.$$

Also by trigonometry,

$$\tan\theta_3 = \frac{Y_1}{Z_2}, \qquad \sin\theta_3 = \frac{Y_1}{Z_1}.$$

Hence

$$\tan\theta_3\,Z_2 = Y_1 = \sin\theta_3\,Z_1,$$
$$\Rightarrow Z_2 = \frac{\sin\theta_3}{\tan\theta_3}\,Z_1 = \frac{X\sin\theta_3\tan\theta_1\tan\theta_2}{\tan\theta_3(\tan\theta_1 + \tan\theta_2)},$$

and

$$Y_1 = \sin\theta_3\,Z_1 = \frac{X\sin\theta_3\tan\theta_1\tan\theta_2}{\tan\theta_1 + \tan\theta_2}.$$

From the above formulas, the relative coordinate of the fingertip position B is

$$\left(\frac{X\tan\theta_2}{\tan\theta_1 + \tan\theta_2},\; -\frac{X\sin\theta_3\tan\theta_1\tan\theta_2}{\tan\theta_1 + \tan\theta_2},\; \frac{X\sin\theta_3\tan\theta_1\tan\theta_2}{\tan\theta_3(\tan\theta_1 + \tan\theta_2)}\right).$$
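Since the fingertip derivation mirrors the eye-midpoint derivation, the same sketch function applies to both points. A usage example with purely illustrative angle values (roughly a user centered about 2.5 m from a 1 m camera baseline):

```python
# Eye midpoint from (theta_4, theta_5, theta_6); fingertip from (theta_1, theta_2, theta_3).
A = triangulate(X=1.0, theta_a=math.radians(78.7), theta_b=math.radians(78.7),
                theta_elev=math.radians(2.3))   # A ~ (0.50, -0.10, 2.50)
B = triangulate(X=1.0, theta_a=math.radians(74.7), theta_b=math.radians(77.4),
                theta_elev=math.radians(4.0))   # B ~ (0.55, -0.14, 2.01)
```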
The cursor position coordinate:

Referring to Fig. 5, after the relative coordinates of the eye midpoint A and the fingertip position B have been computed, let C denote the intersection of the straight line through A and B with the display plane 112 of the display 11. The cursor position coordinate is the relative coordinate of C, written $(E, F, 0)$. Further, let $L_1$ be the distance from A to B and $L_2$ the distance from B to C.

Since B divides the segment from A to C, comparing z-coordinates gives the ratio of $L_1$ to $L_2$:

$$Z_5 - (Z_5 - 0)\,\frac{L_1}{L_1 + L_2} = Z_2,$$
$$\Rightarrow Z_5 - Z_2 = \frac{Z_5 L_1}{L_1 + L_2},$$
$$\Rightarrow (Z_5 - Z_2)(L_1 + L_2) = Z_5 L_1,$$
$$\Rightarrow Z_5 L_1 + Z_5 L_2 - Z_2 L_1 - Z_2 L_2 = Z_5 L_1,$$
$$\Rightarrow L_2(Z_5 - Z_2) = Z_2 L_1,$$
$$\Rightarrow \frac{L_2}{L_1} = \frac{Z_2}{Z_5 - Z_2}.$$

Applying this ratio to the x-coordinates,

$$X_3 - (X_3 - E)\,\frac{L_1}{L_1 + L_2} = X_1 = \frac{X_3 L_2 + E L_1}{L_1 + L_2},$$
$$\Rightarrow E L_1 = X_1 L_1 + X_1 L_2 - X_3 L_2,$$
$$\Rightarrow E = X_1 + \frac{L_2}{L_1}(X_1 - X_3) = X_1 + \frac{Z_2(X_1 - X_3)}{Z_5 - Z_2}.$$

Applying it likewise to the y-coordinates,

$$-Y_2 - (-Y_2 - F)\,\frac{L_1}{L_1 + L_2} = -Y_1 = \frac{-Y_2 L_2 + F L_1}{L_1 + L_2},$$
$$\Rightarrow F L_1 = Y_2 L_2 - Y_1 L_1 - Y_1 L_2,$$
$$\Rightarrow F = -Y_1 + \frac{L_2}{L_1}(Y_2 - Y_1) = -Y_1 + \frac{Z_2(Y_2 - Y_1)}{Z_5 - Z_2}.$$

From the above formulas, the relative coordinate of the intersection C is

$$\left(X_1 + \frac{Z_2(X_1 - X_3)}{Z_5 - Z_2},\; -Y_1 + \frac{Z_2(Y_2 - Y_1)}{Z_5 - Z_2},\; 0\right).$$
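As a check, the intersection formula can be written directly in code, extending the sketch above (names remain our own, not the patent's):

```python
def cursor_position(A, B):
    """Intersect the line through A (eye midpoint) and B (fingertip) with the
    display plane z = 0, where A = (X3, -Y2, Z5) and B = (X1, -Y1, Z2)."""
    x3, y2n, z5 = A               # y2n stores -Y_2
    x1, y1n, z2 = B               # y1n stores -Y_1
    ratio = z2 / (z5 - z2)        # L_2 / L_1, from the z-coordinate comparison
    E = x1 + ratio * (x1 - x3)
    F = y1n + ratio * (y1n - y2n)  # equals -Y_1 + ratio * (Y_2 - Y_1)
    return (E, F, 0.0)
```

Equivalently, C is the point on the parametric line from A through B at the parameter where the z-component vanishes; the ratio $Z_2/(Z_5 - Z_2)$ is just that parameter expressed from B.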
It is worth mentioning that when the eye midpoint A or the fingertip position B, projected horizontally onto the display plane, falls in the first or second quadrant of the coordinate system, the y-coordinate of its relative coordinate is positive; for example, if A projects into the first quadrant, its relative coordinate is (X, Y, Z). When A or B projects into the third or fourth quadrant, the y-coordinate of its relative coordinate is negative; for example, if B projects into the fourth quadrant, its relative coordinate is (X, -Y, Z). In this preferred embodiment, A and B project into the fourth quadrant, so the y-coordinates of their relative coordinates are negative, and the z-coordinate of the cursor position coordinate is always zero.
Similarly, when A or B projects into the second or third quadrant of the coordinate system, the x-coordinate of its relative coordinate is negative; for example, if A projects into the second quadrant, its relative coordinate is (-X, Y, Z). The definitions of the first, second, third, and fourth quadrants of the coordinate system are well known to those skilled in the art and are not detailed here.
Then, after the cursor coordinate computation module 152 has computed the relative coordinate of the intersection C, the display 11 can render a graphical cursor at that coordinate on the display plane 112 for the user's reference. The user can then perform further actions based on the graphical cursor, such as select or confirm instructions. The correspondence between further actions and their instructions is well known to those skilled in the art and is not the focus of the present invention, so it is not detailed here.
As the above explanation shows, the present invention computes the relative coordinates of the user's eye midpoint and fingertip position, and then the relative coordinate of the intersection of the display plane with the straight line through them. This design has the following effects:
1. It accurately simulates the behavior of a user pointing a hand-held remote control at the display.
2. The user neither needs to operate a remote control with complicated functions nor needs a large space to stand and move about in.
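Putting the pieces together, the whole computation from the six measured angles to the on-screen cursor is only a few lines. An end-to-end sketch under the same assumptions (illustrative angle values; names are ours):

```python
X = 1.0  # horizontal distance between the image capture elements, metres (assumed)

# Step S23: angles for the eye midpoint (theta_4..theta_6) and fingertip (theta_1..theta_3).
A = triangulate(X, math.radians(78.7), math.radians(78.7), math.radians(2.3))
B = triangulate(X, math.radians(74.7), math.radians(77.4), math.radians(4.0))

# Step S24: intersect the sight line through A and B with the display plane z = 0.
E, F, _ = cursor_position(A, B)
print(f"cursor at ({E:.2f}, {F:.2f}) on the display plane")
# With these angles, roughly (0.76, -0.30): about 0.76 m right of the first
# image capture element and 0.30 m below the line between the two elements.
```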

Claims (8)

1. A remote control system based on user actions, comprising: a display, a first image capture element, and a second image capture element, wherein the front of the display comprises a peripheral portion and a display plane surrounded by the peripheral portion, and the first image capture element and the second image capture element are disposed on the peripheral portion of the display, on the left and right sides of the display plane respectively, lie on the same horizontal plane, and continuously capture images of the user and produce corresponding signals, the two image capture elements defining a horizontal line between them and being separated by a horizontal distance; characterized in that the remote control system based on user actions further comprises an image recognition unit and a cursor position acquisition unit, wherein the image recognition unit identifies, from the signals of the user images, the midpoint between the user's eyes and the position of at least one fingertip pointing at the display, and the cursor position acquisition unit computes, from the relative positions of the first image capture element, the second image capture element, the eye midpoint, and the fingertip position, the coordinate at which the straight line through the eye midpoint and the fingertip position intersects the display plane of the display, as a cursor position coordinate.
2. The remote control system based on user actions according to claim 1, characterized in that the cursor position acquisition unit comprises an angle information acquisition module which obtains, from the relative positions of the first image capture element, the second image capture element, the eye midpoint, and the fingertip position: a first angle between the horizontal line and the line connecting the first image capture element to the fingertip position; a second angle between the horizontal line and the line connecting the second image capture element to the fingertip position; a third angle between the horizontal plane and the line connecting the fingertip position to its projection point on the horizontal line; a fourth angle between the horizontal line and the line connecting the first image capture element to the eye midpoint; a fifth angle between the horizontal line and the line connecting the second image capture element to the eye midpoint; and a sixth angle between the horizontal plane and the line connecting the eye midpoint to its projection point on the horizontal line.
3. The remote control system based on user actions according to claim 2, characterized in that the cursor position acquisition unit further comprises a cursor coordinate computation module which computes the cursor position coordinate from the horizontal distance and the first, second, third, fourth, fifth, and sixth angles obtained by the angle information acquisition module.
4. The remote control system based on user actions according to claim 3, characterized in that, when the origin of a coordinate system is located at the position of the first image capture element, the positive x-axis points from the origin toward the second image capture element, the positive y-axis points straight up from the origin, and the positive z-axis points out of the display plane, the cursor position coordinate is

$$\left(X_1 + \frac{(X_1 - X_3)\,Z_2}{Z_5 - Z_2},\; -Y_1 + \frac{(Y_2 - Y_1)\,Z_2}{Z_5 - Z_2},\; 0\right),$$

wherein:

$$X_1 = \frac{X\tan\theta_2}{\tan\theta_1 + \tan\theta_2};\qquad
Y_1 = \frac{X\sin\theta_3\tan\theta_1\tan\theta_2}{\tan\theta_1 + \tan\theta_2};\qquad
Z_2 = \frac{X\sin\theta_3\tan\theta_1\tan\theta_2}{\tan\theta_3(\tan\theta_1 + \tan\theta_2)};$$
$$X_3 = \frac{X\tan\theta_5}{\tan\theta_4 + \tan\theta_5};\qquad
Y_2 = \frac{X\sin\theta_6\tan\theta_4\tan\theta_5}{\tan\theta_4 + \tan\theta_5};\qquad
Z_5 = \frac{X\sin\theta_6\tan\theta_4\tan\theta_5}{\tan\theta_6(\tan\theta_4 + \tan\theta_5)};$$

wherein $\theta_1$ is the first angle, $\theta_2$ the second angle, $\theta_3$ the third angle, $\theta_4$ the fourth angle, $\theta_5$ the fifth angle, $\theta_6$ the sixth angle, and $X$ the horizontal distance.
5. A remote control method based on user actions, characterized in that the method comprises the following steps:
(a) providing a remote control system based on user actions comprising a display, a first image capture element, a second image capture element, an image recognition unit, and a cursor position acquisition unit, wherein the front of the display comprises a peripheral portion and a display plane surrounded by the peripheral portion, and the first image capture element and the second image capture element are disposed on the peripheral portion of the display, on the left and right sides of the display plane respectively, and lie on the same horizontal plane, the two image capture elements defining a horizontal line between them and being separated by a horizontal distance;
(b) the first and second image capture elements continuously capturing images of the user and producing corresponding signals;
(c) the image recognition unit identifying, from the signals of the user images, the midpoint between the user's eyes and the position of at least one fingertip pointing at the display; and
(d) the cursor position acquisition unit computing, from the relative positions of the first image capture element, the second image capture element, the eye midpoint, and the fingertip position, the coordinate at which the straight line through the eye midpoint and the fingertip position intersects the display plane of the display, as a cursor position coordinate.
6. The remote control method based on user actions according to claim 5, characterized in that step (d) comprises: the angle information acquisition module of the cursor position acquisition unit obtaining, from the relative positions of the first image capture element, the second image capture element, the eye midpoint, and the fingertip position: a first angle between the horizontal line and the line connecting the first image capture element to the fingertip position; a second angle between the horizontal line and the line connecting the second image capture element to the fingertip position; a third angle between the horizontal plane and the line connecting the fingertip position to its projection point on the horizontal line; a fourth angle between the horizontal line and the line connecting the first image capture element to the eye midpoint; a fifth angle between the horizontal line and the line connecting the second image capture element to the eye midpoint; and a sixth angle between the horizontal plane and the line connecting the eye midpoint to its projection point on the horizontal line.
7. The remote control method based on user actions according to claim 6, characterized in that the cursor coordinate computation module of the cursor position acquisition unit computes the cursor position coordinate from the horizontal distance and the first, second, third, fourth, fifth, and sixth angles obtained by the angle information acquisition module.
8. The remote control method based on user actions according to claim 7, characterized in that step (d) comprises: when the origin of a coordinate system is located at the position of the first image capture element, the positive x-axis points from the origin toward the second image capture element, the positive y-axis points straight up from the origin, and the positive z-axis points out of the display plane, the cursor position coordinate is

$$\left(X_1 + \frac{(X_1 - X_3)\,Z_2}{Z_5 - Z_2},\; -Y_1 + \frac{(Y_2 - Y_1)\,Z_2}{Z_5 - Z_2},\; 0\right),$$

wherein:

$$X_1 = \frac{X\tan\theta_2}{\tan\theta_1 + \tan\theta_2};\qquad
Y_1 = \frac{X\sin\theta_3\tan\theta_1\tan\theta_2}{\tan\theta_1 + \tan\theta_2};\qquad
Z_2 = \frac{X\sin\theta_3\tan\theta_1\tan\theta_2}{\tan\theta_3(\tan\theta_1 + \tan\theta_2)};$$
$$X_3 = \frac{X\tan\theta_5}{\tan\theta_4 + \tan\theta_5};\qquad
Y_2 = \frac{X\sin\theta_6\tan\theta_4\tan\theta_5}{\tan\theta_4 + \tan\theta_5};\qquad
Z_5 = \frac{X\sin\theta_6\tan\theta_4\tan\theta_5}{\tan\theta_6(\tan\theta_4 + \tan\theta_5)};$$

wherein $\theta_1$ is the first angle, $\theta_2$ the second angle, $\theta_3$ the third angle, $\theta_4$ the fourth angle, $\theta_5$ the fifth angle, $\theta_6$ the sixth angle, and $X$ the horizontal distance.
CN2012100448998A 2012-02-24 2012-02-24 Remote control system based on user actions and method thereof Pending CN103294173A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2012100448998A CN103294173A (en) 2012-02-24 2012-02-24 Remote control system based on user actions and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2012100448998A CN103294173A (en) 2012-02-24 2012-02-24 Remote control system based on user actions and method thereof

Publications (1)

Publication Number Publication Date
CN103294173A true CN103294173A (en) 2013-09-11

Family

ID=49095201

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012100448998A Pending CN103294173A (en) 2012-02-24 2012-02-24 Remote control system based on user actions and method thereof

Country Status (1)

Country Link
CN (1) CN103294173A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020126090A1 (en) * 2001-01-18 2002-09-12 International Business Machines Corporation Navigating and selecting a portion of a screen by utilizing a state of an object as viewed by a camera
CN1636178A (en) * 2001-01-22 2005-07-06 皇家菲利浦电子有限公司 Single camera system for gesture-based input and target indication
CN1904806A (en) * 2006-07-28 2007-01-31 上海大学 System and method of contactless position input by hand and eye relation guiding
CN101344816A (en) * 2008-08-15 2009-01-14 华南理工大学 Human-machine interaction method and device based on sight tracing and gesture discriminating
US20100053151A1 (en) * 2008-09-02 2010-03-04 Samsung Electronics Co., Ltd In-line mediation for manipulating three-dimensional content on a display device


Similar Documents

Publication Publication Date Title
US9367138B2 (en) Remote manipulation device and method using a virtual touch of a three-dimensionally modeled electronic device
US20110118877A1 (en) Robot system and method and computer-readable medium controlling the same
US20170351338A1 (en) Input unit for controlling a display image according to a distance of the input unit and user
US20210132681A1 (en) Natural human-computer interaction system based on multi-sensing data fusion
WO2017041433A1 (en) Touch control response method and apparatus for wearable device, and wearable device
TWI471815B (en) Gesture recognition device and method
EP2677399A2 (en) Virtual touch device without pointer
KR20150107597A (en) Gesture recognition apparatus and control method of gesture recognition apparatus
KR102147430B1 (en) virtual multi-touch interaction apparatus and method
CN107145822B (en) User somatosensory interaction calibration method and system deviating from depth camera
CN103218059A (en) Three-dimensional remote control device and positioning method thereof
TWI471764B (en) Coordinate sensing system, coordinate sensing method and display system
TW201319925A (en) A three-dimensional interactive system and three-dimensional interactive method
US9292106B2 (en) Interface apparatus using motion recognition, and method for controlling same
CN106598422B (en) hybrid control method, control system and electronic equipment
US9373035B2 (en) Image capturing method for image recognition and system thereof
JPS61196317A (en) Information input system
US11353959B2 (en) Controller device
KR102278747B1 (en) Contactless Screen Pad for Vehicle
CN103294173A (en) Remote control system based on user actions and method thereof
CN103823577A (en) Sensor-based gesture remote-control method and system
CN102662495A (en) Coordinate sensing system, coordinate sensing method and display system
TWI446214B (en) Remote control system based on user action and its method
CN109214295A (en) The gesture identification method of data fusion based on Kinect v2 and Leap Motion
JP2013109538A (en) Input method and device

Legal Events

Code(s) / Description
C06 / PB01: Publication
C10 / SE01: Entry into substantive examination (entry into force of request for substantive examination)
WD01: Invention patent application deemed withdrawn after publication (application publication date: 2013-09-11)