CN102722254A - Method and system for location interaction - Google Patents

Method and system for location interaction

Info

Publication number
CN102722254A
Authority
CN
China
Prior art keywords
projection
interaction
tip
optical projection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201210204994XA
Other languages
Chinese (zh)
Other versions
CN102722254B (en)
Inventor
周倩
乔晓蕊
倪凯
李阳
毛乐山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangxi Guihua Intelligent Manufacturing Co ltd
Shenzhen International Graduate School of Tsinghua University
Original Assignee
Shenzhen Graduate School Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Graduate School Tsinghua University filed Critical Shenzhen Graduate School Tsinghua University
Priority to CN201210204994.XA priority Critical patent/CN102722254B/en
Publication of CN102722254A publication Critical patent/CN102722254A/en
Priority to HK12112354.1A priority patent/HK1171544A1/en
Application granted granted Critical
Publication of CN102722254B publication Critical patent/CN102722254B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Projection Apparatus (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a method and a system for location interaction. The method for location interaction includes the following steps: 1) projecting to form a projection scene via a projection system, and shooting images of interacting objects and their shadows in the projection scene via a camera system; 2) processing the images of the interacting objects and their shadows to obtain location information about the tips of the interacting objects; and 3) converting the location information obtained in step 2) into projection information to be projected onto a screen. During the whole location interaction process, the user does not need to hold any specific interacting device, and the location interaction process can be realized with a common projection screen.

Description

Positioning interaction method and system
Technical field
The present invention relates to positioning interaction technology, and in particular to a positioning interaction system that enables bare-hand interaction on an ordinary projection screen.
Background technology
With the development of computer vision technology, a variety of interactive systems built on computer vision and projection display technology have appeared in recent years, offering users a natural and enjoyable interactive experience. Existing interactive systems work either by establishing contact between a hand-held interactive device, the projection screen and a computer system, or through a contactless interaction mode. For example, an interactive writing device based on pressure-sensing technology must be brought into contact with the projection screen; the contact pressure information is sent to the computer system through a communication module, and the computer system converts the pressure information at the moment of contact into handwriting information that is projected onto the screen. Interactive writing devices based on laser, electromagnetic-wave, infrared or ultrasonic technology do not need to touch the projection screen and rely mainly on the component modules inside the writing device. The contact-type devices above must touch the projection screen, so the user's range of activity is confined to the area in front of the screen; moreover, the contacted screen must be a special interactive electronic whiteboard, which limits both installation and use. In both modes a specific interactive device is required and the user must hold it at all times; if the user does not carry the device, no interaction is possible. And because the user holds the device, the hand may block its signal, causing data loss during interaction and preventing the interaction from being carried out accurately and in real time.
Summary of the invention
The technical problem addressed by the present invention is to remedy the above deficiencies of the prior art by proposing a positioning interaction method and system that enable bare-hand interaction through an ordinary projection screen, relying neither on a specific interactive device nor on a special electronic whiteboard.
The technical problem of the present invention is solved by the following technical scheme:
A positioning interaction method comprises the following steps: 1) form a projection scene by projecting with a projection system, and capture images of an interacting object and its shadow within the projection scene with a camera system; 2) process the images of the interacting object and its shadow as follows to obtain the position of the tip of the interacting object: 2-1) detect the distance on the image between the tip of the interacting object and the tip of its shadow; 2-2) compare this distance with a set value: if it is greater than the set value, compute the three-dimensional position of the tip; if it is smaller than the set value, compute the two-dimensional position of the tip; 3) convert the position information obtained in step 2) into projection information and project it onto the screen area.
The technical problem of the present invention is further solved by the following technical scheme:
A positioning interaction system comprises a projection system (100), a camera system (200) and a processing system (300). The projection system (100) is used to form the projection scene and project projection information onto the screen area. The camera system (200) captures images of the interacting object and its shadow within the projection scene. The processing system (300) receives the images of the interacting object and its shadow captured by the camera system (200), processes them according to step 2) above to obtain the position of the tip of the interacting object, converts the position information into projection information, and outputs it to the projection system.
The beneficial effect of the present invention and prior art contrast is:
With the positioning interaction method and system of the present invention, the camera system captures images of the interacting object and its shadow in the projection scene; the processing system then processes these images to obtain the position of the tip, converts the tip position into projection information and outputs it to the projection screen area, thereby realizing interaction. Throughout the positioning interaction process the user does not need to hold any specific interactive device: a finger, an ordinary thin rod, a pen and so on can all realize positioning interaction. At the same time, an ordinary projection screen suffices for the positioning interaction process, so no special interactive electronic whiteboard is needed.
Description of drawings
Fig. 1 is a flow chart of the positioning interaction method in an embodiment of the invention;
Fig. 2 is a detailed flow chart of the second step of the positioning interaction method in an embodiment of the invention;
Fig. 3 is a schematic diagram of the composition of the positioning interaction system in an embodiment of the invention;
Fig. 4 is a schematic top view of the positioning interaction system in an embodiment of the invention.
Embodiment
The present invention is explained in further detail below in conjunction with embodiments and the accompanying drawings.
As shown in Fig. 1, the positioning interaction method of this embodiment comprises the following steps:
P1) Form a projection scene by projecting with the projection system, and capture images of the interacting object and its shadow within the projection scene with the camera system.
The interacting object can be a person's finger, a thin hand-held rod, an ordinary pen, and so on. In this embodiment the interacting object is simply a person's finger, and the forefinger, usually chosen when pointing, is taken as the example. In this step, when the hand points at content in the projection screen area within the projection scene, the light projected by the projection system is blocked by the hand and a shadow is formed on the projected screen area; the camera system then captures an image of the finger and the finger shadow in the projection scene. The forefinger may indicate content either by touching the projection screen area or by pointing remotely within the projection scene without touching the screen area.
P2) Process the images of the interacting object and its shadow to obtain the position of the tip of the interacting object. Specifically, the position information is obtained according to the steps shown in Fig. 2: P21) detect the distance on the image between the tip of the interacting object and the tip of its shadow; P22) compare this distance with the set value: if it is greater than the set value, proceed to step P231) and compute the three-dimensional position of the tip; if it is smaller than the set value, proceed to step P232) and compute the two-dimensional position of the tip.
In this embodiment, after step P1) yields the image of the hand and its shadow, this step processes the image to obtain the spatial position of the forefinger tip. The distance between the forefinger tip and the tip of its shadow is judged from the image: if the distance is greater than the set value, the forefinger has not touched the projection screen area, and its position in three-dimensional space must be computed; if the distance is smaller than the set value, the forefinger has touched the projection screen area, and its two-dimensional position is computed. This step thus determines whether the forefinger is touching the projection screen. Preferably, when obtaining the fingertip shadow from the image of the finger and the finger shadow, the points on the finger shadow may be curve-fitted to obtain the fingertip-shadow image. After extracting the fingertip image information and the fingertip-shadow image information from the image obtained in step 1), steps P21) to P23) then yield the position of the fingertip. Note that, besides the curve-fitting method above, the fingertip-shadow image information may also be obtained by other image-processing operations such as a shadow-detection algorithm.
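As a rough illustration of the curve-fitting option above, the sketch below fits a parabola to shadow-contour points near the extreme of the shadow and takes the vertex as a sub-pixel tip estimate. This is only one plausible reading of the step: the patent does not specify the curve model, and the function name, the parabola, and the assumption that the shadow tip points toward increasing y are all ours.

```python
import numpy as np

def shadow_tip(points, k=15):
    """Estimate the fingertip-shadow tip from shadow-contour points.

    points: (N, 2) array of (x, y) pixel coordinates on the shadow contour.
    Assumes the shadow tip points toward increasing y. Fits a parabola
    y = a*x^2 + b*x + c through the k contour points closest to the tip
    and returns the parabola's vertex as a sub-pixel tip estimate.
    """
    pts = np.asarray(points, dtype=float)
    near_tip = pts[np.argsort(pts[:, 1])[-k:]]   # k points with the largest y
    a, b, c = np.polyfit(near_tip[:, 0], near_tip[:, 1], 2)
    x_tip = -b / (2.0 * a)                       # vertex x of the parabola
    y_tip = c - b * b / (4.0 * a)                # vertex y of the parabola
    return x_tip, y_tip
```

A shadow-detection algorithm could replace this entirely; the vertex form is just a cheap way to get a stable tip location from noisy contour pixels.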
The three-dimensional x, y, z axes are defined as follows: the z axis is the direction perpendicular to the plane of the projection screen area; the plane of the projection screen area is the x-y plane; the y axis lies in that plane along the line connecting the projection system and the camera system, and the x axis lies in that plane perpendicular to that connecting line. The two-dimensional position information is the coordinate on the x-y plane at z = 0 in this three-dimensional space. The set value used to judge touch can be set empirically by the user of the positioning interaction method: for example, measure the tip-to-shadow distance on the image when the screen is actually touched, and again when the finger appears to touch but in fact does not, and take a value between the two as the set value.
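Putting steps P21) and P22) together, the touch/hover branch can be sketched as below. This is a minimal illustration under the assumptions that the tip and shadow-tip image coordinates are already detected and that the set value is taken midway between the two empirically measured distances, as the text suggests; all names are illustrative.

```python
def set_value(d_touching, d_near_miss):
    """Empirical threshold: midpoint between the tip-shadow distance
    measured while actually touching the screen and while almost
    (but not) touching it."""
    return 0.5 * (d_touching + d_near_miss)

def classify(tip, shadow_tip, threshold):
    """Step P22: compare the image-plane tip-to-shadow distance with the
    set value. A small distance means the finger touches the screen
    (compute a 2-D position); a large one means it hovers (compute 3-D)."""
    d = ((tip[0] - shadow_tip[0]) ** 2 + (tip[1] - shadow_tip[1]) ** 2) ** 0.5
    return "touch_2d" if d < threshold else "hover_3d"
```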
P3) Convert the position information obtained in step P2) into projection information and project it onto the screen area, thereby realizing interactive input after positioning.
This embodiment also provides a positioning interaction system. Fig. 3 is a schematic diagram of its composition: the positioning interaction system comprises a projection system 100, a camera system 200 and a processing system 300.
The projection system 100 is used to form the projection scene and project projection information onto the screen area 4.
The camera system 200 captures images of the interacting object and its shadow within the projection scene of the projection system 100. The interacting object can be a person's finger, a thin hand-held rod, an ordinary pen, and so on. In this embodiment the interacting object is simply a person's finger, with the forefinger, usually chosen when pointing, as the example. That is, the camera captures an image containing the hand 5 and its shadow 6; what is used is the information about the forefinger tip of the hand 5 and the forefinger-tip shadow of the hand shadow 6 on the image.
The processing system 300 receives the images of the interacting object and its shadow captured by the camera system 200, processes them according to step P2) of the positioning interaction method above to obtain the position of the tip of the interacting object, converts the position information into projection information, and outputs it to the projection system 100.
In operation, when the hand 5, as the interacting object, points at content in the projection screen area within the projection scene, the light projected by the projection system 100 is blocked by the hand 5, forming the hand shadow 6 on the projected screen area 4; the camera system 200 then captures the image containing the hand 5 and the hand shadow 6 in the projection scene. On receiving the image from the camera system 200, the processing system 300 processes it to obtain the position of the forefinger tip of the hand 5; it then combines the position information with time information, converts it into projection information and outputs it to the projection system 100, which projects it onto the screen area 4, thereby realizing interactive input after positioning. For example, if the hand 5 draws a straight line within the projection scene, the positioning interaction system of this embodiment displays a straight line at the corresponding position of the screen area.
After obtaining the image of the interacting object and its shadow, the positioning interaction method and system of this embodiment use image processing to obtain the position information and thereby realize positioning interaction. Throughout the positioning interaction process the user does not need to hold any specific interactive device: a finger, an ordinary thin rod, a pen and so on can all realize positioning interaction, so the user need not carry a device and interaction is free and convenient. Since interaction is possible with a bare finger, there is no hand-held device whose signal the hand could block; data reception is complete, and the interaction proceeds accurately and in real time. At the same time an ordinary projection screen suffices for the positioning interaction process, no special interactive electronic whiteboard is needed, and the cost of the whole system is reduced.
Preferably, the line connecting the position of the projection system 100 and the position of the camera system 200 is parallel to the screen area 4. When the processing system 300 computes the position information from the image, the z coordinate of the three-dimensional position of the tip of the interacting object is computed according to the following formula:

z = L(y1 - y) / (w + y1 - y)    (1)

where the z axis is the direction perpendicular to the plane of the projection screen area 4; the y axis lies in that plane along the line connecting the projection system 100 and the camera system 200; the x axis lies in that plane perpendicular to that connecting line; y1 denotes the y coordinate of the tip of the shadow of the interacting object, and y denotes the y coordinate of the tip of the interacting object; L denotes the perpendicular distance from the projection system 100 to the screen area, and w denotes the distance between the projection system 100 and the camera system 200.
Fig. 4 is a schematic top view of the positioning interaction system. P is the position of the projection system 100, C is the position of the camera system 200, and A is the tip of the hand, i.e. the forefinger tip. PP1 is a projection ray emitted by the projection system 100; passing through the tip A, it forms the shadow tip P1 on the screen, the forefinger-tip shadow corresponding to the tip A. C1C is the reflected ray entering the camera system 200 through A, C1 being the point on the screen corresponding to the image of the tip A in the camera system 200. The distance PC is w, the perpendicular distance from P to the screen area 4 is L, the perpendicular distance from the tip A to the screen area 4 is z, and the distance P1C1 is s. Since the line connecting the positions of the projection system 100 and the camera system 200 is parallel to the screen area 4, i.e. PC is parallel to the screen, the triangles △PAC and △P1AC1 in Fig. 4 are similar, which gives:
w / s = (L - z) / z    (2)

Simplifying and rearranging gives z = L·s / (w + s). Substituting for s the relative distance along the y axis between the forefinger tip A and its shadow tip P1, namely s = y1 - y, yields formula (1). This gives the perpendicular distance from the forefinger tip A to the screen area 4, i.e. the z coordinate of its three-dimensional position. The x and y coordinates of the tip A can be obtained by image processing of the captured two-dimensional image and are not detailed here. The above processing thus yields the three-dimensional coordinates of the forefinger tip A, enabling trajectory-writing operations and interaction.
In this preferred scheme for computing the z coordinate, the algorithm involved is simple and the data volume small, so interaction can be carried out in real time. The computation does not suffer from the hand blocking a signal and losing data, the result is comparatively accurate, and the interaction can be located with high precision and tracked in real time at high speed, realizing precise and smooth interactive operation.
Note that under this preferred scheme for computing the z coordinate, the system must be corrected and calibrated before start-up. First ensure that the projection system 100 and the camera system 200 are reasonably positioned, with the line connecting them parallel to the screen area 4. Measure the coverage of the camera system 200 to ensure that the whole projection scene is captured. Then calibrate the positions: preferably take the center of the screen area 4 as the origin of the three-dimensional coordinate system, collect a number of uniformly distributed calibration points (for example 25 calibration points), and build an error model for later use. The error model analyzes the influence on error of the optical characteristics (including optical distortion and astigmatism) of the projection lens of the projection system 100 and the pickup lens of the camera system 200, the influence of the overall geometry of the system, and the various other variables that may affect the interaction accuracy. Based on the error between the computed measured position and the actual position of the forefinger tip, the error-influencing variables are changed one at a time to find the largest source of error, and software compensation is applied to reduce the error.
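The calibration layout described above can be sketched as a uniform grid centered on the screen. The patent only says "uniformly distributed", e.g. 25 points, with the origin at the screen center; the 5 × 5 arrangement and the function below are our assumption.

```python
def calibration_points(width, height, n=5):
    """Generate an n x n grid of calibration points uniformly distributed
    over the screen area, with the coordinate origin at the screen center
    (as the embodiment prefers). width/height are the screen dimensions."""
    pts = []
    for i in range(n):
        for j in range(n):
            x = (i / (n - 1) - 0.5) * width   # from -width/2 to +width/2
            y = (j / (n - 1) - 0.5) * height  # from -height/2 to +height/2
            pts.append((x, y))
    return pts
```

At each such point the computed tip position would be compared against the known ground truth to fit the error model and drive the software compensation.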
Another preferred scheme is to add an infrared projector: the projection system 100 also includes an infrared light emitter for emitting infrared light, and an infrared filter is mounted on the camera system 200. Because the camera system 200 carries an infrared filter, visible light is filtered out; the captured image then has a simple background with a single tone and obvious contrast, so the image of the object can be obtained very directly. With this added infrared projector, the processing system 300 processes only infrared light, which effectively removes visible-light noise.
The above content is a further detailed explanation of the present invention in conjunction with specific preferred embodiments, and it cannot be concluded that the practice of the invention is limited to these explanations. For those of ordinary skill in the technical field of the invention, substitutions or obvious variations made without departing from the concept of the invention, with identical performance or use, shall all be regarded as falling within the protection scope of the invention.

Claims (9)

1. A positioning interaction method, characterized by comprising the following steps: 1) forming a projection scene by projecting with a projection system, and capturing images of an interacting object and its shadow within the projection scene with a camera system; 2) processing the images of the interacting object and its shadow as follows to obtain the position of the tip of the interacting object: 2-1) detecting the distance on the image between the tip of the interacting object and the tip of its shadow; 2-2) comparing said distance with a set value: if it is greater than the set value, computing the three-dimensional position of the tip of the interacting object; if it is smaller than the set value, computing the two-dimensional position of the tip of the interacting object; 3) converting the position information obtained in step 2) into projection information and projecting it onto the screen area.
2. The positioning interaction method according to claim 1, characterized in that: step 1) further comprises setting the line connecting the position of said projection system and the position of said camera system parallel to the projection screen area; and in step 2) the z coordinate of the three-dimensional position of the tip of the interacting object is computed according to the following formula:

z = L(y1 - y) / (w + y1 - y)

wherein the z axis is the direction perpendicular to the plane of the projection screen area; the y axis lies in that plane along the line connecting said projection system and said camera system; the x axis lies in that plane perpendicular to that connecting line; y1 denotes the y coordinate of the tip of the shadow of the interacting object; y denotes the y coordinate of the tip of the interacting object; L denotes the perpendicular distance from said projection system to said screen area; and w denotes the distance between said projection system and said camera system.
3. The positioning interaction method according to claim 1, characterized in that step 1) further comprises providing an infrared light emitter in said projection system and mounting an infrared filter on said camera system.
4. The positioning interaction method according to claim 1, characterized in that said interacting object is a person's finger, a thin hand-held rod, or an ordinary pen.
5. The positioning interaction method according to claim 1, characterized in that: said interacting object is a person's finger, and the image of the finger and the finger shadow is captured in step 1); step 2) further comprises curve-fitting the points on the finger shadow in the captured image to obtain the fingertip-shadow image.
6. A positioning interaction system, characterized by comprising a projection system (100), a camera system (200) and a processing system (300); said projection system (100) is used to form the projection scene and project projection information onto the screen area; said camera system (200) captures images of the interacting object and its shadow within the projection scene; said processing system (300) receives the images of the interacting object and its shadow captured by said camera system (200), processes them according to step 2) of claim 1 to obtain the position of the tip of the interacting object, converts the position information into projection information, and outputs it to said projection system.
7. positioning interaction according to claim 6 system, it is characterized in that: the line of said optical projection system (100) position and said camera system (200) position is parallel to said screen area; Calculate the z axial coordinate in the most advanced and sophisticated three dimensional local information of said mutual object according to following formula in the said disposal system (300);
Figure 201210204994X100001DEST_PATH_IMAGE003
; Wherein, The z axle is and the vertical direction in plane, place, projection screen zone; The y axle is along the direction of said optical projection system (100) with said camera system (200) line in the plane, place, projection screen zone; The x axle be in the plane, projection screen zone place with said optical projection system (100) and the vertical direction of said camera system (200) line direction; Y1 representes the y coordinate of the most advanced and sophisticated shade of mutual object; Y representes the y coordinate that mutual object is most advanced and sophisticated, and L representes the vertical range of said optical projection system (100) and said screen area, and w representes the distance between said optical projection system (100) and the said camera system (200).
8. positioning interaction according to claim 6 system, it is characterized in that: said optical projection system (100) also comprises the infrared light emitter that is used to launch infrared light, said camera system is equipped with infrared fileter on (200).
9. The positioning interaction system according to claim 6, characterized in that said interacting object is a person's finger, a thin hand-held rod, or an ordinary pen.
CN201210204994.XA 2012-06-20 2012-06-20 Method and system for location interaction Active CN102722254B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201210204994.XA CN102722254B (en) 2012-06-20 2012-06-20 Method and system for location interaction
HK12112354.1A HK1171544A1 (en) 2012-06-20 2012-11-29 Locating and interacting system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210204994.XA CN102722254B (en) 2012-06-20 2012-06-20 Method and system for location interaction

Publications (2)

Publication Number Publication Date
CN102722254A true CN102722254A (en) 2012-10-10
CN102722254B CN102722254B (en) 2015-06-17

Family

ID=46948049

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210204994.XA Active CN102722254B (en) 2012-06-20 2012-06-20 Method and system for location interaction

Country Status (2)

Country Link
CN (1) CN102722254B (en)
HK (1) HK1171544A1 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103327385A (en) * 2013-06-08 2013-09-25 上海集成电路研发中心有限公司 Distance identification method and device based on single image sensor
CN104281301A (en) * 2013-07-05 2015-01-14 联想(北京)有限公司 Input method and electronic equipment
CN104581101A (en) * 2013-10-10 2015-04-29 全视科技有限公司 Projector-camera system having interaction screen
CN104714675A (en) * 2013-12-13 2015-06-17 华为终端有限公司 Gesture recognition method and device
CN106030495A (en) * 2015-01-30 2016-10-12 索弗特凯耐提克软件公司 Multi-modal gesture based interactive system and method using one single sensing system
CN106233307A (en) * 2014-04-28 2016-12-14 罗伯特·博世有限公司 Object identifying
CN107102750A (en) * 2017-04-23 2017-08-29 吉林大学 The system of selection of target in a kind of virtual three-dimensional space based on pen type interactive system
CN107396075A (en) * 2017-08-08 2017-11-24 海信集团有限公司 A kind of generation method and generating means of projection image correction information
CN112231023A (en) * 2019-07-15 2021-01-15 北京字节跳动网络技术有限公司 Information display method, device, equipment and storage medium
CN114020145A (en) * 2021-09-30 2022-02-08 联想(北京)有限公司 Method, device and equipment for interacting with digital content and readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101266530A (en) * 2008-04-04 2008-09-17 Ocean University of China Large-screen three-dimensional measuring touch screen
CN101729628A (en) * 2008-10-15 2010-06-09 LG Electronics Inc. Mobile terminal having image projection
CN201974160U (en) * 2011-01-20 2011-09-14 Shenyang Tonglian Group High-Tech Co., Ltd. Structured-light device for measuring three-dimensional shape
CN102360424A (en) * 2011-10-20 2012-02-22 Zhejiang Gongshang University Glare prevention method based on a shadow image segmentation algorithm


Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103327385B (en) * 2013-06-08 2019-03-19 Shanghai IC R&D Center Co., Ltd. Distance recognition method and device based on a single image sensor
CN103327385A (en) * 2013-06-08 2013-09-25 Shanghai IC R&D Center Co., Ltd. Distance identification method and device based on a single image sensor
CN104281301A (en) * 2013-07-05 2015-01-14 Lenovo (Beijing) Co., Ltd. Input method and electronic device
CN104281301B (en) * 2013-07-05 2017-12-29 Lenovo (Beijing) Co., Ltd. Input method and electronic device
CN104581101A (en) * 2013-10-10 2015-04-29 OmniVision Technologies, Inc. Projector-camera system having an interaction screen
CN104714675A (en) * 2013-12-13 2015-06-17 Huawei Device Co., Ltd. Gesture recognition method and device
WO2015085874A1 (en) * 2013-12-13 2015-06-18 Huawei Device Co., Ltd. Method and device for gesture recognition
CN104714675B (en) * 2013-12-13 2017-12-05 Huawei Device (Dongguan) Co., Ltd. Gesture recognition method and device
CN106233307A (en) * 2014-04-28 2016-12-14 Robert Bosch GmbH Object recognition
CN106030495A (en) * 2015-01-30 2016-10-12 Softkinetic Software Multi-modal gesture-based interaction system and method using a single sensing system
CN106030495B (en) * 2015-01-30 2021-04-13 Sony Depthsensing Solutions SA/NV Multi-modal gesture-based interaction system and method utilizing a single sensing system
CN107102750B (en) * 2017-04-23 2019-07-26 Jilin University Method for selecting targets in a virtual three-dimensional space based on a pen-type interactive system
CN107102750A (en) * 2017-04-23 2017-08-29 Jilin University Method for selecting targets in a virtual three-dimensional space based on a pen-type interactive system
CN107396075A (en) * 2017-08-08 2017-11-24 Hisense Group Co., Ltd. Method and device for generating projection image correction information
CN107396075B (en) * 2017-08-08 2019-12-31 Hisense Group Co., Ltd. Method and device for generating projection image correction information
CN112231023A (en) * 2019-07-15 2021-01-15 Beijing ByteDance Network Technology Co., Ltd. Information display method, device, equipment and storage medium
CN114020145A (en) * 2021-09-30 2022-02-08 Lenovo (Beijing) Co., Ltd. Method, device and equipment for interacting with digital content, and readable storage medium

Also Published As

Publication number Publication date
HK1171544A1 (en) 2013-03-28
CN102722254B (en) 2015-06-17

Similar Documents

Publication Publication Date Title
CN102722254B (en) Method and system for location interaction
CN102799318B (en) Human-machine interaction method and system based on binocular stereo vision
CN101231450B (en) Multi-point and object touch panel arrangement and multi-point touch positioning method
CN102253737B (en) Implementation method of a screen-vision mouse system
CN201191355Y (en) Multi-point object touch screen apparatus
CN204926047U (en) Portable touch-control projection device
CN101639746B (en) Automatic calibration method for touch screens
CN101419513A (en) Remote virtual touch system using an infrared laser pen
KR100907104B1 (en) Calculation method and system of pointing locations, and collaboration system comprising it
WO2013162235A1 (en) Apparatus for obtaining virtual 3d object information without requiring pointer
CN103729074A (en) Device, system and method for recognizing handwriting
CN109782962A (en) Projection interaction method, device, system and terminal device
CN201562264U (en) Self-adaptive intelligent interactive projector
CN204902785U (en) Hand-held laser three-dimensional scanning system with rapid calibration function
CN107682595B (en) Interactive projection method, system and computer-readable storage medium
CN100478860C (en) Electronic plane display positioning system and positioning method
CN102508575A (en) Screen writing device, screen writing system and realization method thereof
CN103176606B (en) Planar interaction system and method based on binocular vision recognition
CN111240524A (en) Laser radar touch screen
CN104133599A (en) Terminal device and method allowing projection surface to be operated
CN101446745A (en) Projection system with interactive function
CN103076969A (en) Input system for mobile terminal display screen and control method thereof
CN109901714A (en) Electronic paper-pen system and control method thereof
CN211787036U (en) Laser radar touch screen
CN202049465U (en) Combined device for recognizing infrared light signals

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 1171544

Country of ref document: HK

C14 Grant of patent or utility model
GR01 Patent grant
REG Reference to a national code

Ref country code: HK

Ref legal event code: GR

Ref document number: 1171544

Country of ref document: HK

CP01 Change in the name or title of a patent holder

Address after: Tsinghua Campus, Xili, Nanshan District, Shenzhen, Guangdong 518055

Patentee after: Tsinghua Shenzhen International Graduate School

Address before: Tsinghua Campus, Xili, Nanshan District, Shenzhen, Guangdong 518055

Patentee before: Graduate School at Shenzhen, Tsinghua University

CP01 Change in the name or title of a patent holder
TR01 Transfer of patent right

Effective date of registration: 20211228

Address after: Floor 18, Building 8, Nanning Greenland Center, No. 15, Piankaixuan Road, China (Guangxi) Pilot Free Trade Zone, Liangqing District, Nanning City, Guangxi Zhuang Autonomous Region, 530000

Patentee after: Guangxi Guihua Intelligent Manufacturing Co.,Ltd.

Address before: Tsinghua Campus, Xili, Nanshan District, Shenzhen, Guangdong 518055

Patentee before: Tsinghua Shenzhen International Graduate School

TR01 Transfer of patent right