CN105589552B - Projection interactive method based on gesture and projection interactive device - Google Patents

Projection interactive method based on gesture and projection interactive device

Info

Publication number
CN105589552B
CN105589552B CN201410601367.9A
Authority
CN
China
Prior art keywords
gesture
projection
image
camera
depth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410601367.9A
Other languages
Chinese (zh)
Other versions
CN105589552A (en)
Inventor
宋金龙
马军
王琳
李翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201410601367.9A
Publication of CN105589552A
Application granted
Publication of CN105589552B

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Projection Apparatus (AREA)

Abstract

The present invention provides a gesture-based projection interaction method and a projection interaction device. The method is applied to a projection interaction device that includes a projection unit, a first camera and a second camera, where the overlapping region of the image acquisition regions of the first camera and the second camera covers the projection desktop of the projection unit. The method includes: sequentially acquiring, within a predetermined period of time, a first infrared image frame and a first color image frame of the projection desktop through the first camera, and sequentially acquiring, within the same period, a second infrared image frame and a second color image frame of the projection desktop through the second camera; recovering the finger depth from the first infrared image frame and the second infrared image frame, and recovering the projection desktop depth from the first color image and the second color image; determining a gesture and the coordinate information of the gesture from the finger depth and the projection desktop depth; and determining the operation represented by the gesture from the gesture and its coordinate information, and performing the corresponding operation on the projection unit.

Description

Projection interactive method based on gesture and projection interactive device
Technical field
The present invention relates to the field of image processing, and more particularly to a gesture-based projection interaction method and a projection interaction device.
Background technology
With the development of science and technology, projection has come to be widely used in many settings, for example teaching and meetings. Performing gesture recognition with a projector and cameras enables better human-computer interaction and broadens the uses of projection.
In the prior art, the camera used is usually either an infrared camera or a color camera. When a color camera is used, the recovery of finger depth may be interfered with by the projector; when an infrared camera is used, the depth of the projection desktop cannot be recovered. It is therefore difficult to accurately determine the relationship between the finger depth and the projection desktop depth, and the gesture recognition effect is poor.
Summary of the invention
The present invention provides a gesture-based projection interaction method and a projection interaction device, which can accurately determine the relationship between the finger depth and the projection desktop depth and thereby determine the operation represented by a gesture, achieving a better gesture recognition effect.
In a first aspect, a gesture-based projection interaction method is provided. The method is applied to a projection interaction device that includes a projection unit, a first camera and a second camera, where the first camera and the second camera can alternately acquire infrared images and color images, and the overlapping region of the image acquisition region of the first camera and the image acquisition region of the second camera covers the projection desktop of the projection unit. The method includes: sequentially acquiring, within a predetermined period of time, a first infrared image frame and a first color image frame of the projection desktop through the first camera, and sequentially acquiring, within the predetermined period of time, a second infrared image frame and a second color image frame of the projection desktop through the second camera; recovering the finger depth from the first infrared image frame and the second infrared image frame, and recovering the projection desktop depth from the first color image and the second color image; determining a gesture and the coordinate information of the gesture from the finger depth and the projection desktop depth; and determining the operation represented by the gesture from the gesture and its coordinate information, and performing the corresponding operation on the projection unit.
In a second aspect, a projection interaction device is provided. The projection interaction device includes a projection unit, a first image acquisition unit, a second image acquisition unit and a control unit. The projection unit is used to project the display data of the projection interaction device onto a projection desktop. The first image acquisition unit and the second image acquisition unit are used to alternately acquire infrared images and color images, and the overlapping region of the image acquisition region of the first image acquisition unit and the image acquisition region of the second image acquisition unit covers the projection desktop of the projection unit. The device may further include an infrared fill lamp, which is used to provide infrared supplementary lighting for the infrared images of the first image acquisition unit and the second image acquisition unit. The control unit includes an image acquisition subunit, a depth recovery subunit, a coordinate determination subunit and a gesture operation subunit. The image acquisition subunit is used to acquire a first infrared image frame and a first color image frame of the projection desktop at the current time through the first image acquisition unit, and to acquire a second infrared image frame and a second color image frame of the projection desktop at the current time through the second image acquisition unit. The depth recovery subunit is used to recover the finger depth from the first infrared image frame and the second infrared image frame, and to recover the projection desktop depth from the first color image and the second color image. The coordinate determination subunit is used to determine a gesture and the coordinate information of the gesture from the finger depth and the projection desktop depth. The gesture operation subunit is used to determine the operation represented by the gesture from the gesture and its coordinate information, and to perform the corresponding operation on the projection unit.
In the gesture-based projection interaction method and projection interaction device of the embodiments of the present invention, two cameras that can each acquire both infrared images and color images sequentially acquire infrared images and color images. The finger depth is then recovered from the infrared images acquired by the two cameras, and the projection desktop depth is recovered from the color images acquired by the two cameras, so that the relationship between the finger depth and the projection desktop depth can be accurately determined and the operation represented by the gesture can be determined, giving a better gesture recognition effect.
Description of the drawings
Fig. 1 is a flowchart of the gesture-based projection interaction method according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of the external structure of a projection interaction device according to an embodiment of the present invention.
Fig. 3 is a schematic structural diagram of the projection interaction device according to an embodiment of the present invention.
Fig. 4 is another schematic structural diagram of the projection interaction device according to an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Fig. 1 is a flowchart of the gesture-based projection interaction method according to an embodiment of the present invention. The method is applied to a projection interaction device and is executed by the control device of the projection interaction device. The projection interaction device includes a projection unit, a first camera and a second camera; the first camera and the second camera can alternately acquire infrared images and color images, and the overlapping region of the image acquisition region of the first camera and the image acquisition region of the second camera covers the projection desktop of the projection unit. The method includes:
101. A first infrared image frame and a first color image frame of the projection desktop are sequentially acquired through the first camera within a predetermined period of time, and a second infrared image frame and a second color image frame of the projection desktop are sequentially acquired through the second camera within the predetermined period of time.
It should be understood that the projection desktop refers to the region onto which the projection unit projects and forms an image; it may be a wall, a projection screen, and so on.
It should also be understood that the predetermined period of time is a short period during which the projection desktop can be considered unchanged. In other words, the first infrared image frame, the first color image frame, the second infrared image frame and the second color image frame acquired within the predetermined period can be regarded as views of the same projection desktop captured from different angles and in different modes. For example, the predetermined period may be specified as the time occupied by 5 frames; during this period the projection desktop essentially does not change. Of course, the predetermined period may also be specified as a longer or shorter period of time.
In addition, since the first camera and the second camera can alternately acquire infrared images and color images, the first camera and the second camera can each acquire an infrared image frame and a color image frame in sequence. The first camera may first acquire the first infrared image frame and then the first color image frame, or first acquire the first color image frame and then the first infrared image frame. Similarly, the second camera may first acquire the second infrared image frame and then the second color image frame, or first acquire the second color image frame and then the second infrared image frame.
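Purely for illustration, the alternating acquisition of step 101 might look like the following sketch. The camera objects cam1 and cam2 and their capture_ir()/capture_color() methods are hypothetical names introduced here, not an API defined by the embodiment.

    # Minimal sketch of step 101, assuming hypothetical camera objects cam1 and cam2
    # that expose capture_ir() and capture_color() methods (names are illustrative).

    def acquire_frame_group(cam1, cam2):
        # Within the predetermined period the desktop is treated as unchanged,
        # so the four frames below are regarded as views of the same scene.
        ir1 = cam1.capture_ir()       # first infrared image frame
        rgb1 = cam1.capture_color()   # first color image frame
        ir2 = cam2.capture_ir()       # second infrared image frame
        rgb2 = cam2.capture_color()   # second color image frame
        return ir1, rgb1, ir2, rgb2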
102. The finger depth is recovered from the first infrared image frame and the second infrared image frame, and the projection desktop depth is recovered from the first color image and the second color image.
It should be understood that the method of recovering the finger depth from the infrared image frames through a binocular (stereo) algorithm, and the method of recovering the projection desktop depth from the color images, can follow the prior art and are not described in detail here.
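As one possible prior-art realization of such a binocular depth recovery, the sketch below uses OpenCV's semi-global block matching on a rectified image pair. The choice of StereoSGBM, the parameter values, and the assumption that rectification and calibration data (focal length in pixels, baseline in meters) are already available are illustrative only.

    import cv2
    import numpy as np

    def recover_depth(left_img, right_img, focal_px, baseline_m):
        # left_img / right_img: rectified grayscale frames from the two cameras
        # (the two infrared frames for finger depth, or the two color frames,
        # converted to grayscale, for projection-desktop depth).
        matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                                        blockSize=7)
        disparity = matcher.compute(left_img, right_img).astype(np.float32) / 16.0
        disparity[disparity <= 0] = np.nan          # mark invalid matches
        depth = focal_px * baseline_m / disparity   # depth = f * B / d
        return depth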
103. The gesture and the coordinate information of the gesture are determined according to the finger depth and the projection desktop depth.
104. The operation represented by the gesture is determined according to the gesture and its coordinate information, and the corresponding operation is performed on the projection unit.
In the embodiment of the present invention, two cameras that each have both infrared and color image acquisition capability sequentially acquire infrared images and color images. The finger depth is then recovered from the infrared images acquired by the two cameras, and the projection desktop depth is recovered from the color images acquired by the two cameras, so that the relationship between the finger depth and the projection desktop depth can be accurately determined and the operation represented by the gesture can be determined, giving a better gesture recognition effect.
Optionally, as one embodiment, determining in step 104 the operation represented by the gesture according to the gesture and its coordinate information includes: determining the operation corresponding to the gesture according to multiple continuous gestures and their corresponding coordinate information; or determining the operation corresponding to the gesture according to a single gesture and its corresponding coordinate information.
For example, a drag operation can be determined from multiple continuous gestures and their corresponding coordinate information, while a click operation can be determined from a single gesture and its corresponding coordinate information, and so on. Of course, other operations may also be determined from multiple continuous gestures and corresponding coordinate information, or from a single gesture and corresponding coordinate information; the specific implementation can follow the prior art and is not described in detail here.
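The click-versus-drag distinction described above could, purely as an illustration, be sketched as follows. The sample format, the touch flag, the movement threshold and the operation labels are assumptions introduced here and are not part of the embodiment.

    # Illustrative sketch only: classify a short history of gesture samples,
    # each given as (is_touching, x, y), into a click or a drag operation.

    def classify_operation(samples, move_thresh_px=15):
        touching = [(x, y) for is_touching, x, y in samples if is_touching]
        if not touching:
            return None
        (x0, y0), (x1, y1) = touching[0], touching[-1]
        moved = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        if len(touching) == 1 or moved < move_thresh_px:
            return ("click", x1, y1)            # single gesture plus its coordinate
        return ("drag", (x0, y0), (x1, y1))     # multiple continuous gestures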
Optionally, as one embodiment, the projection interaction device further includes an infrared fill lamp, which is used to provide infrared supplementary lighting for the infrared images of the first camera and the second camera. Supplementary infrared lighting makes the acquired infrared images clearer, which in turn helps ensure the accuracy of the recovered finger depth.
Optionally, the method may further include: determining the pixel correspondence between the image acquisition region of the first camera and the projection desktop. In that case step 103 can be implemented as follows: obtaining, from the finger depth and the projection desktop depth, the first coordinate of the gesture in the image acquisition region of the first camera; and determining, from the first coordinate and the pixel correspondence, the second coordinate of the gesture on the projection desktop, the second coordinate being the operation coordinate of the gesture. Of course, the pixel correspondence between the image acquisition region of the second camera and the projection desktop may be determined instead, which is not described in detail here.
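Assuming the pixel correspondence has been calibrated as a 3x3 homography (as in the fourth step of the example described below), mapping the first coordinate to the second coordinate could be sketched as follows; the function and variable names are illustrative.

    import numpy as np

    def camera_to_projector(x_cam, y_cam, H_pc):
        # H_pc: 3x3 pixel-correspondence matrix from the camera image to the
        # projector image, obtained by calibration.
        p = H_pc @ np.array([x_cam, y_cam, 1.0])
        # Normalize the homogeneous result; this gives the second coordinate,
        # used as the operation coordinate of the gesture.
        return p[0] / p[2], p[1] / p[2]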
In the following, the method of the embodiment of the present invention is further described with reference to a specific example.
Fig. 2 is a schematic diagram of the external structure of a projection interaction device according to an embodiment of the present invention. As shown in Fig. 2, the projection interaction device may include: infrared color camera 1, infrared color camera 2, a projector, and an infrared fill lamp. The projector is used to generate the projection desktop. Infrared color camera 1 and infrared color camera 2 can alternately acquire infrared images and color images, and the overlapping region of the image acquisition region of infrared color camera 1 and the image acquisition region of infrared color camera 2 covers the projection desktop of the projector. That is, infrared color camera 1 and infrared color camera 2 can acquire infrared images and color images of the projection desktop area. The infrared fill lamp is used to provide supplementary lighting for the infrared images acquired by infrared color camera 1 and infrared color camera 2, so that better infrared images are obtained. The specific operation steps are as follows:
First step: infrared color camera 1 and infrared color camera 2 alternately acquire infrared image frames and color image frames.
The time interval within which the two cameras acquire the infrared image frames and the color image frames is very short, so the projection desktop can be regarded as unchanged during this interval.
Second step: the finger depth is recovered from the infrared images acquired by infrared color camera 1 and infrared color camera 2 using a binocular algorithm.
It should be understood that, in the embodiment of the present invention, the infrared fill lamp can provide supplementary lighting for the infrared images acquired by infrared color camera 1 and infrared color camera 2, so that better infrared images are obtained and the finger depth can be recovered more accurately. Of course, the finger depth can also be recovered without the infrared fill lamp; the recovered finger depth may then have a larger error, but this does not affect the execution of the method shown in Fig. 1 of the embodiment of the present invention.
In addition, for the method of recovering the finger depth with a binocular algorithm, the specific implementation can refer to binocular algorithms and algorithms for recovering depth from infrared images, which are not described in detail here.
Third step: the projection desktop depth is recovered from the color images acquired by infrared color camera 1 and infrared color camera 2 using a binocular algorithm.
Similarly, for the method of recovering the projection desktop depth with a binocular algorithm, the specific implementation can refer to binocular algorithms and algorithms for recovering depth from color images, which are not described in detail here.
In addition, it should be understood that there is no restriction on the temporal order of the first step, the second step and the third step.
Fourth step: the pixel correspondence between the projector image and the camera image is obtained through calibration.
The pixel correspondence between the projector image and the camera image can be calibrated in several ways.
In one embodiment of the present invention, n defines the plane onto which the projector projects (for example, when projecting onto a wall, it is the plane equation of the wall), K_p is the intrinsic matrix of the projector, R_p is the rotation matrix from the projector to the camera, t_p is the translation matrix from the projector to the camera, K_c is the intrinsic matrix of the camera, X_c is a point coordinate in the camera image, and X_p is the mapped projector coordinate. The pixel correspondence between the projector image and the camera image is then:
X_p = K_p (R_p - t_p n^T) P = K_p (R_p - t_p n^T) K_c^(-1) X_c
where the plane equation is n^T P + 1 = 0.
That is, the relationship between the projector image and the camera image can be expressed as:
X_p = H_pc X_c
where H_pc = K_p (R_p - t_p n^T) K_c^(-1).
When R_p and t_p are given, the mapping H_pc depends only on the plane n.
It should be understood, of course, that there may be other ways of calibrating the pixel correspondence between the projector image and the camera image; the specific implementation can refer to the prior art, and the embodiment of the present invention is not limited in this respect.
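For illustration, the mapping H_pc = K_p (R_p - t_p n^T) K_c^(-1) can be evaluated directly once the calibration outputs are known. The sketch below assumes NumPy arrays for all quantities and says nothing about how the calibration itself is performed.

    import numpy as np

    def projector_camera_homography(K_p, K_c, R_p, t_p, n):
        # K_p, K_c: 3x3 intrinsic matrices of the projector and the camera
        # R_p: 3x3 rotation, t_p: 3-vector translation from projector to camera
        # n: 3-vector describing the projection plane as n^T P + 1 = 0
        H_pc = K_p @ (R_p - t_p.reshape(3, 1) @ n.reshape(1, 3)) @ np.linalg.inv(K_c)
        return H_pc  # maps a homogeneous camera point X_c to the projector point X_p

    # Note: X_p is only defined up to scale, so the result of H_pc @ X_c should be
    # divided by its third component before use.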
Fifth step: the gesture is identified using a gesture recognition algorithm and, according to the coordinate mapping relationship of the fourth step, is mapped to an operation on the projector.
After the finger depth is recovered, the action performed by the finger, in other words the gesture, can be identified by judging the variation of the finger height and its movement in the up, down, left and right directions.
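One simple way to judge the finger height, given here only as an illustrative sketch, is to compare the recovered finger depth with the recovered desktop depth at the fingertip pixel; the 10 mm contact threshold is an assumption, not a value from the embodiment.

    # Illustrative sketch: decide whether the fingertip is touching the desktop
    # by comparing the two recovered depth values at the fingertip pixel.

    def is_touching(finger_depth_m, desktop_depth_m, contact_thresh_m=0.01):
        # finger_depth_m: depth of the fingertip recovered from the infrared pair
        # desktop_depth_m: depth of the desktop at the same pixel, recovered
        #                  from the color pair
        height_above_desktop = desktop_depth_m - finger_depth_m
        return 0 <= height_above_desktop <= contact_thresh_m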
For the algorithm of identifying gestures with a gesture recognition algorithm, reference can be made to the prior art, and it is not described in detail here.
After the gesture is identified, the operation corresponding to the gesture can be determined from multiple continuous gestures and their corresponding coordinate information, or from a single gesture and its corresponding coordinate information.
Then, according to the coordinate of the gesture and the pixel correspondence between the projector image and the camera image, the corresponding projector coordinate is determined, and the gesture is thereby mapped to an operation on the projector.
Fig. 3 is a schematic structural diagram of the projection interaction device 300 of the embodiment of the present invention. As shown in Fig. 3, the projection interaction device 300 may include: a projection unit 320, a first image acquisition unit 330, a second image acquisition unit 340 and a control unit 310. Wherein,
the projection unit 320 is used to project the display data of the projection interaction device 300 onto the projection desktop;
the first image acquisition unit 330 and the second image acquisition unit 340 are used to alternately acquire infrared images and color images, and the overlapping region of the image acquisition region of the first image acquisition unit 330 and the image acquisition region of the second image acquisition unit 340 covers the projection desktop of the projection unit 320;
the control unit 310 includes an image acquisition subunit 311, a depth recovery subunit 312, a coordinate determination subunit 313 and a gesture operation subunit 314, wherein
the image acquisition subunit 311 is used to acquire the first infrared image frame and the first color image frame of the projection desktop at the current time through the first image acquisition unit 330, and to acquire the second infrared image frame and the second color image frame of the projection desktop at the current time through the second image acquisition unit 340;
the depth recovery subunit 312 is used to recover the finger depth from the first infrared image frame and the second infrared image frame, and to recover the projection desktop depth from the first color image and the second color image;
the coordinate determination subunit 313 is used to determine the gesture and the coordinate information of the gesture from the finger depth and the projection desktop depth;
the gesture operation subunit 314 is used to determine the operation represented by the gesture from the gesture and its coordinate information, and to perform the corresponding operation on the projection unit 320.
In the embodiment of the present invention, the projection interaction device 300 uses two cameras that each have both infrared and color image acquisition capability to sequentially acquire infrared images and color images, recovers the finger depth from the infrared images acquired by the two cameras, and recovers the projection desktop depth from the color images acquired by the two cameras, so that the relationship between the finger depth and the projection desktop depth can be accurately determined and the operation represented by the gesture can be determined, giving a better gesture recognition effect.
Optionally, as one embodiment, in determining the operation represented by the gesture according to the gesture and its coordinate information, the gesture operation subunit 314 is specifically used to: determine the operation corresponding to the gesture according to multiple continuous gestures and their corresponding coordinate information; or determine the operation corresponding to the gesture according to a single gesture and its corresponding coordinate information.
Optionally, as shown in Fig. 4, the projection interaction device 300 may further include an infrared fill light unit 350. The infrared fill light unit 350 is used to provide infrared supplementary lighting for the infrared images of the first image acquisition unit 330 and the second image acquisition unit 340.
Optionally, the coordinate determination subunit 313 is further used to determine the pixel correspondence between the image acquisition region of the first image acquisition unit 330 and the projection desktop. In determining the gesture and the coordinate information of the gesture from the finger depth and the projection desktop depth, the coordinate determination subunit 313 is specifically used to: obtain, from the finger depth and the projection desktop depth, the first coordinate of the gesture in the image acquisition region of the first image acquisition unit 330; and determine, from the first coordinate and the pixel correspondence, the second coordinate of the gesture on the projection desktop, the second coordinate being the operation coordinate of the gesture.
It should be understood that in practical applications the projection unit 320 may be a projector or another projection device; the first image acquisition unit 330 and the second image acquisition unit 340 may be cameras that have both infrared and color image acquisition capability, or other image capture devices with both capabilities; the infrared fill light unit 350 may be an infrared fill lamp or other infrared supplementary lighting equipment; and the control unit 310 may be a controller or control chip in the projection interaction device 300 or another control device, or may be composed of multiple devices. The embodiment of the present invention is not limited in this respect.
In addition, the projection interaction device 300 can also execute the method of Fig. 1 and has the functions of the projection interaction device of the embodiments shown in Fig. 1 and Fig. 2, which are not described in detail here.
In the present invention, when a particular component is described as being located between a first component and a second component, there may or may not be an intervening component between the particular component and the first component or the second component. When a particular component is described as being connected to another component, the particular component may be directly connected to the other component without an intervening component, or may not be directly connected to the other component, with an intervening component present.

Claims (9)

1. A gesture-based projection interaction method, characterized in that the method is applied to a projection interaction device, the projection interaction device comprising a projection unit, a first camera and a second camera, the first camera and the second camera being able to alternately acquire infrared images and color images, and the overlapping region of the image acquisition region of the first camera and the image acquisition region of the second camera covering a projection desktop of the projection unit, the method comprising:
sequentially acquiring, within a predetermined period of time, a first infrared image frame and a first color image frame of the projection desktop through the first camera, and sequentially acquiring, within the predetermined period of time, a second infrared image frame and a second color image frame of the projection desktop through the second camera;
recovering a finger depth according to the first infrared image frame and the second infrared image frame, and recovering a projection desktop depth according to the first color image and the second color image;
determining a gesture and coordinate information of the gesture according to the finger depth and the projection desktop depth; and
determining an operation represented by the gesture according to the gesture and the coordinate information of the gesture, and performing the corresponding operation on the projection unit.
2. The method according to claim 1, characterized in that
the determining the operation represented by the gesture according to the gesture and the coordinate information of the gesture comprises:
determining the operation corresponding to the gesture according to multiple continuous gestures and corresponding coordinate information; or
determining the operation corresponding to the gesture according to a single gesture and corresponding coordinate information.
3. The method according to claim 1 or 2, characterized in that the projection interaction device further comprises an infrared fill lamp, the infrared fill lamp being used to provide infrared supplementary lighting for the infrared images of the first camera and the second camera.
4. The method according to claim 1 or 2, characterized in that the method further comprises: determining a pixel correspondence between the image acquisition region of the first camera and the projection desktop; wherein
the determining the gesture and the coordinate information of the gesture according to the finger depth and the projection desktop depth comprises:
obtaining a first coordinate of the gesture in the image acquisition region of the first camera according to the finger depth and the projection desktop depth; and
determining a second coordinate of the gesture on the projection desktop according to the first coordinate and the pixel correspondence, the second coordinate being the operation coordinate of the gesture.
5. A projection interaction device, characterized by comprising: a projection unit, a first image acquisition unit, a second image acquisition unit and a control unit, wherein
the projection unit is used to project display data of the projection interaction device onto a projection desktop;
the first image acquisition unit and the second image acquisition unit are used to alternately acquire infrared images and color images, and the overlapping region of the image acquisition region of the first image acquisition unit and the image acquisition region of the second image acquisition unit covers the projection desktop of the projection unit;
the control unit comprises an image acquisition subunit, a depth recovery subunit, a coordinate determination subunit and a gesture operation subunit, wherein
the image acquisition subunit is used to acquire a first infrared image frame and a first color image frame of the projection desktop at a current time through the first image acquisition unit, and to acquire a second infrared image frame and a second color image frame of the projection desktop at the current time through the second image acquisition unit;
the depth recovery subunit is used to recover a finger depth according to the first infrared image frame and the second infrared image frame, and to recover a projection desktop depth according to the first color image and the second color image;
the coordinate determination subunit is used to determine a gesture and coordinate information of the gesture according to the finger depth and the projection desktop depth; and
the gesture operation subunit is used to determine an operation represented by the gesture according to the gesture and the coordinate information of the gesture, and to perform the corresponding operation on the projection unit.
6. The projection interaction device according to claim 5, characterized in that, in determining the operation represented by the gesture according to the gesture and the coordinate information of the gesture, the gesture operation subunit is specifically used to:
determine the operation corresponding to the gesture according to multiple continuous gestures and corresponding coordinate information; or
determine the operation corresponding to the gesture according to a single gesture and corresponding coordinate information.
7. The projection interaction device according to claim 5 or 6, characterized in that the projection interaction device further comprises an infrared fill light unit, the infrared fill light unit being used to provide infrared supplementary lighting for the infrared images of the first image acquisition unit and the second image acquisition unit.
8. The projection interaction device according to claim 5 or 6, characterized in that the coordinate determination subunit is further used to determine a pixel correspondence between the image acquisition region of the first image acquisition unit and the projection desktop;
and, in determining the gesture and the coordinate information of the gesture according to the finger depth and the projection desktop depth, the coordinate determination subunit is specifically used to: obtain a first coordinate of the gesture in the image acquisition region of the first image acquisition unit according to the finger depth and the projection desktop depth; and determine a second coordinate of the gesture on the projection desktop according to the first coordinate and the pixel correspondence, the second coordinate being the operation coordinate of the gesture.
9. The projection interaction device according to claim 5 or 6, characterized in that the first image acquisition unit and the second image acquisition unit are cameras capable of acquiring infrared images and color images.
CN201410601367.9A 2014-10-30 2014-10-30 Projection interactive method based on gesture and projection interactive device Active CN105589552B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410601367.9A CN105589552B (en) 2014-10-30 2014-10-30 Projection interactive method based on gesture and projection interactive device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410601367.9A CN105589552B (en) 2014-10-30 2014-10-30 Projection interactive method based on gesture and projection interactive device

Publications (2)

Publication Number Publication Date
CN105589552A CN105589552A (en) 2016-05-18
CN105589552B true CN105589552B (en) 2018-10-12

Family

ID=55929197

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410601367.9A Active CN105589552B (en) 2014-10-30 2014-10-30 Projection interactive method based on gesture and projection interactive device

Country Status (1)

Country Link
CN (1) CN105589552B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI653563B (en) * 2016-05-24 2019-03-11 仁寶電腦工業股份有限公司 Projection touch image selection method
CN106371594A (en) * 2016-08-31 2017-02-01 李姣昂 Binocular infrared vision portable gesture-controlled projection system and method
CN106705003A (en) * 2017-01-16 2017-05-24 歌尔科技有限公司 Projection lamp
CN107506133B (en) * 2017-08-24 2020-09-18 歌尔股份有限公司 Operation track response method and system of projection touch system
CN108227923A (en) * 2018-01-02 2018-06-29 南京华捷艾米软件科技有限公司 A kind of virtual touch-control system and method based on body-sensing technology
CN111258410B (en) * 2020-05-06 2020-08-04 北京深光科技有限公司 Man-machine interaction equipment
CN114374776B (en) * 2020-10-15 2023-06-23 华为技术有限公司 Camera and control method of camera
CN116974369B (en) * 2023-06-21 2024-05-17 广东工业大学 Method, system, equipment and storage medium for operating medical image in operation

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102945079A (en) * 2012-11-16 2013-02-27 武汉大学 Intelligent recognition and control-based stereographic projection system and method
CN102959616A (en) * 2010-07-20 2013-03-06 普莱姆森斯有限公司 Interactive reality augmentation for natural interaction
CN103093471A (en) * 2013-01-24 2013-05-08 青岛智目科技有限公司 Foreground extraction method under complicated background
CN104019761A (en) * 2014-04-15 2014-09-03 北京农业信息技术研究中心 Three-dimensional configuration obtaining device and method of corn plant

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120062170A (en) * 2010-12-06 2012-06-14 삼성전자주식회사 Method and device for controlling a virtual multi-monitor

Also Published As

Publication number Publication date
CN105589552A (en) 2016-05-18

Similar Documents

Publication Publication Date Title
CN105589552B (en) Projection interactive method based on gesture and projection interactive device
US8818027B2 (en) Computing device interface
JP2009048616A (en) Method, system, and program for dynamically controlling cursor on screen when using video camera as pointing device
CN104809347B (en) A kind of implementation method that control display outburst area is shown
US20120163661A1 (en) Apparatus and method for recognizing multi-user interactions
CN106201173A (en) The interaction control method of a kind of user's interactive icons based on projection and system
CN107102736A (en) The method for realizing augmented reality
CN109839827B (en) Gesture recognition intelligent household control system based on full-space position information
KR101256046B1 (en) Method and system for body tracking for spatial gesture recognition
JP2013186838A (en) Generation device, generation program, and generation method
JP4694957B2 (en) Information presenting apparatus, information presenting method, and program thereof
CN105227882B (en) A kind of display methods and corresponding intrument
CN101196986A (en) Three-dimensional palm print identity identifier and its identification method
CN105892637A (en) Gesture identification method and virtual reality display output device
WO2018006481A1 (en) Motion-sensing operation method and device for mobile terminal
WO2022174574A1 (en) Sensor-based bare-hand data annotation method and system
Herbert et al. Mobile device and intelligent display interaction via scale-invariant image feature matching
RU2020135294A (en) METHOD AND SYSTEM FOR ASSESSING TEETH SHADES IN UNMANAGED ENVIRONMENT
JP6632298B2 (en) Information processing apparatus, information processing method and program
Molyneaux et al. Cooperative augmentation of mobile smart objects with projected displays
CN104035661A (en) Cursor display method and device of ultrahigh resolution splice wall
CN106815825A (en) One kind fitting method for information display and display device
TW201006527A (en) Measuring object contour method and measuring object contour apparatus
Widodo et al. Laser spotlight detection and interpretation of its movement behavior in laser pointer interface
CN203630717U (en) Interaction system based on a plurality of light inertial navigation sensing input devices

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant