CN102722254B - Method and system for location interaction

Method and system for location interaction

Info

Publication number
CN102722254B
CN102722254B (application CN201210204994.XA)
Authority
CN
China
Prior art keywords
projection
interaction
image
camera system
interactive object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201210204994.XA
Other languages
Chinese (zh)
Other versions
CN102722254A (en)
Inventor
周倩
乔晓蕊
倪凯
李阳
毛乐山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangxi Guihua Intelligent Manufacturing Co.,Ltd.
Original Assignee
Shenzhen Graduate School Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Graduate School Tsinghua University filed Critical Shenzhen Graduate School Tsinghua University
Priority to CN201210204994.XA priority Critical patent/CN102722254B/en
Publication of CN102722254A publication Critical patent/CN102722254A/en
Priority to HK12112354.1A priority patent/HK1171544A1/en
Application granted granted Critical
Publication of CN102722254B publication Critical patent/CN102722254B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Projection Apparatus (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a method and a system for location interaction. The method comprises the following steps: 1) projecting to form a projection scene via a projection system, and capturing images of an interactive object and of the interactive object's shadow in the projection scene via a camera system; 2) processing the images of the interactive object and of its shadow to obtain position information of the tip of the interactive object; and 3) converting the position information obtained in step 2) into projection information and projecting it onto the screen. Throughout the location interaction process, the user does not need to hold any special interactive device, and the process can be realized with an ordinary projection screen.

Description

Positioning interaction method and system
Technical field
The present invention relates to positioning interaction technology, and in particular to a positioning interaction method and system that enable bare-handed interaction on an ordinary projection screen.
Background art
With the development of computer vision technology, a variety of interactive systems based on computer vision and projection display technology have emerged in recent years, offering users a natural and enjoyable interactive experience. Existing interactive systems mainly establish a link between a hand-held interactive device, the projection screen and a computer system, either through contact or through a contactless interaction mode. For example, an interactive writing device based on pressure-sensing technology must touch the projection screen; the contact pressure information is sent to the computer system by a communication module, and the computer system converts it into text or handwriting information that is projected back onto the screen. Interactive writing devices based on laser, electromagnetic-wave, infrared or ultrasonic technology do not need to touch the projection screen and instead rely on the component modules inside the writing device. Contact-type interactive devices must touch the projection screen, which confines the user's range of activity to the area in front of the screen; moreover, the contacted screen must be a special interactive electronic whiteboard, which limits installation and use. In both modes a specific interactive device is required and the user must hold it at all times; without the device, no interaction is possible. In addition, because the user holds the device, the hand may block its signal, causing data loss and preventing the interaction from being carried out accurately in real time.
Summary of the invention
The technical problem to be solved by the present invention is to remedy the above deficiencies of the prior art by proposing a positioning interaction method and system that realize bare-handed interaction on an ordinary projection screen, without relying on a specific interactive device or a special electronic whiteboard.
The technical problem of the present invention is solved by the following technical scheme:
A positioning interaction method comprises the following steps: 1) projecting via a projection system to form a projection scene, and capturing images of an interactive object and of the interactive object's shadow in the projection scene via a camera system; 2) processing the images of the interactive object and of its shadow according to the following steps to obtain the position information of the tip of the interactive object: 2-1) detecting, in the image, the distance between the tip of the interactive object and the shadow of the tip; 2-2) judging whether the distance is greater than a set value: if it is greater, calculating the three-dimensional position information of the tip; if it is less, calculating the two-dimensional position information of the tip; 3) converting the position information obtained in step 2) into projection information and projecting it onto the screen area.
The technical problem of the present invention is further solved by the following technical scheme:
A positioning interaction system comprises a projection system (100), a camera system (200) and a processing system (300). The projection system (100) projects projection information onto the screen area to form the projection scene; the camera system (200) captures images of the interactive object and of the interactive object's shadow in the projection scene; the processing system (300) receives these images from the camera system (200), processes them according to step 2) above to obtain the position information of the tip of the interactive object, converts the obtained position information into projection information, and outputs it to the projection system.
Compared with the prior art, the beneficial effects of the present invention are:
With the positioning interaction method and system of the present invention, the camera system captures images of the interactive object and of its shadow in the projection scene, and the processing system processes these images to obtain the position information of the tip; the tip position is converted into projection information and output to the projection screen area, realizing interaction. Throughout the positioning interaction process the user does not need to hold a specific interactive device: a finger, an ordinary thin rod or a pen can all serve for positioning interaction. At the same time, the projection screen is an ordinary projection screen; no special interactive electronic whiteboard is required.
Brief description of the drawings
Fig. 1 is a flowchart of the positioning interaction method in an embodiment of the invention;
Fig. 2 is a detailed flowchart of the second step of the positioning interaction method in the embodiment;
Fig. 3 is a schematic diagram of the composition of the positioning interaction system in the embodiment;
Fig. 4 is a schematic top view of the positioning interaction system in the embodiment.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the accompanying drawings and embodiments.
Fig. 1 shows the flowchart of the positioning interaction method of this embodiment, which comprises the following steps:
P1) Form a projection scene by projecting via the projection system, and capture the images of the interactive object and of the interactive object's shadow in the projection scene via the camera system.
The interactive object may be a person's finger, a thin rod held in the hand, an ordinary writing pen, and so on. In this embodiment the interactive object is directly a person's finger, illustrated with the index finger that is usually used for pointing. In this step, when the hand points at content on the projection screen area within the projection scene, it blocks the light projected by the projection system and a shadow forms on the projected screen area; the camera system then captures the image of the finger and of the finger's shadow in the projection scene. When pointing with the index finger, the finger may touch the projection screen area, or it may point remotely within the projection scene without touching the screen.
P2) Process the images of the interactive object and of its shadow to obtain the position information of the tip of the interactive object. Specifically, the position information is obtained by the steps shown in Fig. 2: P21) detecting, in the image, the distance between the tip of the interactive object and the shadow of the tip; P22) judging whether the distance is greater than a set value: if it is greater, proceeding to step P231) to calculate the three-dimensional position information of the tip; if it is less, proceeding to step P232) to calculate the two-dimensional position information of the tip.
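As an illustration only, the branch in steps P21) and P22) can be sketched in Python as follows; the function and parameter names, and the use here of the depth relation z = sL/(w + s) that is derived later in this description, are assumptions of this sketch rather than part of the original disclosure:

```python
import numpy as np

def locate_tip(tip_xy, shadow_xy, set_value, L, w):
    """Minimal sketch of steps P21)-P22): touch vs. hover decision."""
    # P21) distance in the image between the object tip and the tip's shadow
    s = float(np.linalg.norm(np.asarray(tip_xy, dtype=float)
                             - np.asarray(shadow_xy, dtype=float)))
    if s <= set_value:
        # P232) tip touches the screen: two-dimensional position (z = 0)
        return tip_xy[0], tip_xy[1], 0.0
    # P231) tip hovers above the screen: three-dimensional position, using
    # the depth relation z = s*L/(w + s) derived later (assumed here)
    z = s * L / (w + s)
    return tip_xy[0], tip_xy[1], z
```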
In this embodiment, after step P1) obtains the images of the hand and of its shadow, this step processes the images to obtain the spatial position of the tip of the index finger. The distance between the fingertip and the fingertip's shadow is judged from the image: if this distance is greater than the set value, the finger is not touching the projection screen area and its position in three-dimensional space must be calculated; if this distance is less than the set value, the finger has touched the projection screen area and its two-dimensional position is calculated. This step thus determines whether the finger is touching the projection screen. Preferably, when obtaining the fingertip shadow from the image of the finger and its shadow, points on the finger shadow may be curve-fitted to obtain the fingertip-shadow image. In this way, after extracting the fingertip image information and the fingertip-shadow image information from the image obtained in step 1), steps P21) to P23) can be carried out to obtain the position information of the fingertip. It should be noted that, besides the above curve-fitting method, the image information of the fingertip shadow can also be obtained with other image processing operations such as shadow detection algorithms.
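The curve-fitting step can be illustrated with a short sketch; the parabolic model and the choice of its vertex as the tip are assumptions of this illustration, since the text only states that points on the finger shadow are curve-fitted:

```python
import numpy as np

def shadow_tip_from_points(points):
    """Sketch of the preferred curve-fitting step (assumed parabolic model).

    points: (N, 2) array of (x, y) image points sampled along the end of
    the finger-shadow outline.
    """
    x, y = points[:, 0], points[:, 1]
    a, b, c = np.polyfit(x, y, 2)      # fit y = a*x**2 + b*x + c
    x_tip = -b / (2.0 * a)             # extremum of the fitted parabola
    y_tip = a * x_tip**2 + b * x_tip + c
    return x_tip, y_tip
```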
The three-dimensional x, y, z axes are defined as follows: the z-axis is the direction perpendicular to the plane of the projection screen area; the plane of the projection screen area is the x-y plane; the y-axis is the direction in that plane along the line connecting the projection system and the camera system; the x-axis is the direction in that plane perpendicular to the line connecting the projection system and the camera system. The two-dimensional position information is the x-y coordinate in this three-dimensional space at z = 0. The set value for judging touch can be set empirically by the user of the positioning interaction method: for example, measure how large the image distance is when the finger actually touches the projection screen, and how large it is when the finger appears to touch the screen but does not, and take an intermediate value between the two as the set value.
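The empirical choice of the set value described above amounts to taking the midpoint of two measured distances; a sketch (the helper name is illustrative):

```python
def touch_set_value(d_touching, d_not_touching):
    """Set value for the touch test: midpoint of the image tip-shadow
    distance measured while actually touching the screen and the distance
    measured while appearing to touch but not actually touching."""
    return 0.5 * (d_touching + d_not_touching)
```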
P3) Convert the position information obtained in step P2) into projection information and project it onto the screen area, realizing interactive input after positioning.
This embodiment also provides a positioning interaction system. Fig. 3 is a schematic diagram of its composition. The positioning interaction system comprises a projection system 100, a camera system 200 and a processing system 300.
The projection system 100 projects projection information onto screen area 4 to form the projection scene.
The camera system 200 captures the images of the interactive object and of its shadow in the projection scene of projection system 100. The interactive object may be a person's finger, a thin rod held in the hand, an ordinary writing pen, and so on. In this embodiment the interactive object is directly a person's finger, illustrated with the index finger usually used for pointing. The captured image thus contains hand 5 and hand shadow 6; the information of interest is the fingertip of hand 5 and the fingertip's shadow within hand shadow 6 in the image.
The processing system 300 receives the images of the interactive object and of its shadow captured by camera system 200, processes them according to step P2) of the positioning interaction method described above to obtain the position information of the tip of the interactive object, converts the obtained position information into projection information, and outputs it to projection system 100.
In operation, when hand 5, as the interactive object, points at content on the projection screen area within the projection scene, it blocks the light projected by projection system 100, forming hand shadow 6 on the projected screen area 4; camera system 200 then captures an image containing hand 5 and hand shadow 6 in the projection scene. On receiving the image from camera system 200, processing system 300 processes it to obtain the position information of the fingertip of hand 5, combines this position information with time information, converts it into projection information and outputs it to projection system 100, which projects it onto screen area 4, realizing interactive input after positioning. For example, if hand 5 draws a straight line in the projection scene, the positioning interaction system of this embodiment displays a straight line at the corresponding position of the screen area.
With the positioning interaction method and system of this embodiment, after the images of the interactive object and of its shadow are obtained, image processing yields the position information, realizing positioning interaction. Throughout the process the user does not need to hold a specific interactive device: a finger, an ordinary thin rod or a pen all suffice, and the user need not carry any interactive device, which is free and convenient. Since interaction is realized with a bare finger and no hand-held device is required, the problem of the hand blocking a device signal does not arise; data reception is complete, and the interaction proceeds accurately in real time. The projection screen is an ordinary projection screen, no special interactive electronic whiteboard is needed, and the cost of the whole system is reduced.
Preferably, the line connecting the position of projection system 100 and the position of camera system 200 is parallel to screen area 4. When processing system 300 processes the images to calculate position information, the z-axis coordinate in the three-dimensional position information of the tip of the interactive object is obtained by the following formula:
z = L(y1 − y) / (w + (y1 − y))    (1)
where the z-axis is the direction perpendicular to the plane of projection screen area 4; the y-axis is the direction in that plane along the line connecting projection system 100 and camera system 200; the x-axis is the direction in that plane perpendicular to the line connecting projection system 100 and camera system 200; y1 denotes the y coordinate of the shadow of the tip of the interactive object; y denotes the y coordinate of the tip of the interactive object; L denotes the perpendicular distance from projection system 100 to the screen area; and w denotes the distance between projection system 100 and camera system 200.
Fig. 4 is a schematic top view of the positioning interaction system. P is the position of projection system 100, C is the position of camera system 200, and A is the tip of the index finger. PP1 is a projection ray emitted by projection system 100; passing the tip A, it forms the shadow point P1 on the screen, which is the shadow of the fingertip and corresponds to the tip A. C1C is the reflected ray entering camera system 200 through A, and C1 is the image point of the fingertip A as seen by camera system 200. The distance PC is w; the perpendicular distance from P to screen area 4 is L; the perpendicular distance from the tip A to screen area 4 is z; and the distance P1C1 is s. Because the line connecting projection system 100 and camera system 200 is parallel to screen area 4, i.e. PC is parallel to the screen, triangles △PAC and △P1AC1 in Fig. 4 are similar, which gives:
w / s = (L − z) / z    (2)
Rearranging gives z = sL / (w + s). Replacing s in this expression with the relative distance along the y-axis between the fingertip A and the tip shadow P1, i.e. s = y1 − y, yields formula (1), from which the perpendicular distance of the fingertip A from screen area 4, namely the z-axis coordinate of the three-dimensional position information, is obtained. The x and y coordinates of the fingertip A can be obtained by image processing of the captured two-dimensional image and are not described in detail here. Through the above processing and calculation, the three-dimensional coordinates of the fingertip A are obtained, so that its trajectory can be tracked for writing operations, realizing interaction.
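As a numeric check of formula (1), here is a sketch with invented example values, not figures from the patent:

```python
def tip_depth(s, L, w):
    """z-axis coordinate from formula (1): z = s*L / (w + s), with
    s = y1 - y the tip-to-shadow offset along the y-axis."""
    return s * L / (w + s)

# Example values (assumptions for illustration): projector 2.0 m from the
# screen, projector-camera baseline 0.3 m, observed offset 0.05 m.
print(tip_depth(0.05, 2.0, 0.3))   # ~0.286 m: the fingertip hovers about 29 cm away
```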
The above process is a preferred scheme for calculating the z-axis coordinate. The algorithm involved is simple and the data volume small, so interaction can be carried out in real time. Since the calculation process is not affected by a hand blocking a device signal, the results are relatively accurate, and the interaction is positioned with high precision and tracked quickly in real time, realizing precise and smooth interactive operation.
It should be noted that, in the above preferred scheme for calculating the z-axis coordinate, the system needs to be calibrated and corrected before starting. First, ensure a reasonable placement of projection system 100 and camera system 200, with the line connecting them parallel to screen area 4. Measure the field of view of camera system 200 to ensure that the whole projection scene is captured. Then calibrate the system: preferably take the centre of screen area 4 as the origin of the three-dimensional coordinate system and collect evenly distributed calibration points (for example 25 points) for later use in building an error model. The error model analyses the influence on error of the optical characteristics (including optical distortion and astigmatism) of the projection lens of projection system 100 and the lens of camera system 200, as well as the influence of the overall geometry of the system and other variables that may affect interaction accuracy. Comparing the calculated positions of the fingertip with its measured actual positions, vary one error variable at a time to identify the largest source of error, then apply software compensation and optimization to reduce the error.
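A minimal sketch of this calibration, assuming a 5 x 5 grid centred on the screen and a least-squares affine correction as the software compensation (the affine error model is an assumption of this sketch; the text only calls for analysing error sources and compensating in software):

```python
import numpy as np

def calibration_grid(width, height, n=5):
    """n*n evenly distributed calibration points, origin at the screen
    centre (25 points for n = 5, as suggested in the text)."""
    xs = np.linspace(-width / 2.0, width / 2.0, n)
    ys = np.linspace(-height / 2.0, height / 2.0, n)
    return np.array([(x, y) for y in ys for x in xs])

def fit_affine_correction(measured, actual):
    """Least-squares affine map from measured tip positions to the known
    calibration-point positions (assumed form of the compensation)."""
    A = np.hstack([measured, np.ones((len(measured), 1))])  # rows [x, y, 1]
    coeffs, *_ = np.linalg.lstsq(A, actual, rcond=None)     # (3, 2) matrix
    return coeffs

def apply_correction(coeffs, measured):
    A = np.hstack([measured, np.ones((len(measured), 1))])
    return A @ coeffs
```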
Another preferred scheme is to add an infrared projector: projection system 100 further comprises an infrared light emitter for emitting infrared light, and camera system 200 is fitted with an infrared filter. Because camera system 200 carries an infrared filter, visible light is filtered out; the captured image background is simple, the tone uniform and the contrast clear, so the image of the object is obtained very directly. With this infrared scheme, processing system 300 handles only infrared light, effectively removing visible-light noise.
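With the infrared scheme, segmenting the captured frame reduces to simple thresholding; a sketch using OpenCV (the threshold value and the library choice are assumptions of this illustration):

```python
import cv2

def segment_ir_frame(frame_gray, thresh=60):
    """Separate bright and dark regions in an IR-filtered frame. Because the
    filter removes visible light, the background is dark and uniform, so a
    fixed global threshold (value assumed here) is enough to extract the
    hand and its shadow for further processing."""
    _, mask = cv2.threshold(frame_gray, thresh, 255, cv2.THRESH_BINARY)
    return mask
```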
The above content describes the present invention in further detail with reference to specific preferred embodiments, but the specific implementation of the invention is not limited to these descriptions. For ordinary technicians in the technical field of the invention, substitutions or obvious modifications made without departing from the inventive concept, with identical performance or use, shall all be deemed to fall within the scope of protection of the present invention.

Claims (7)

1. A positioning interaction method, characterized by comprising the following steps: 1) forming a projection scene by projecting via a projection system, capturing the images of an interactive object and of the interactive object's shadow in the projection scene via a camera system, and arranging the line connecting the position of the projection system and the position of the camera system parallel to the projection screen area; 2) processing the images of the interactive object and of its shadow according to the following steps to obtain the position information of the tip of the interactive object: 2-1) detecting in the image the distance between the tip of the interactive object and the shadow of the tip; 2-2) judging whether the distance is greater than a set value: if it is greater than the set value, calculating the three-dimensional position information (x, y, z) of the tip of the interactive object; if it is less than the set value, calculating the two-dimensional position information (x, y) of the tip of the interactive object; the set value being set empirically by the user of the positioning interaction method, and the z-axis coordinate in the three-dimensional position information of the tip being obtained according to the formula z = L(y1 − y) / (w + (y1 − y)), wherein the three-dimensional x, y, z axes are defined as follows: the z-axis is the direction perpendicular to the plane of the projection screen area; the plane of the projection screen area is the x-y plane; the y-axis is the direction in that plane along the line connecting the projection system and the camera system; the x-axis is the direction in that plane perpendicular to the line connecting the projection system and the camera system; y1 denotes the y coordinate of the shadow of the tip of the interactive object; y denotes the y coordinate of the tip of the interactive object; L denotes the perpendicular distance from the projection system to the screen area; and w denotes the distance between the projection system and the camera system; 3) converting the position information obtained in step 2) into projection information and projecting it onto the screen area.
2. The positioning interaction method according to claim 1, characterized in that step 1) further comprises the step of providing an infrared light emitter in the projection system and installing an infrared filter on the camera system.
3. The positioning interaction method according to claim 1, characterized in that the interactive object is a person's finger, a thin rod held by a person, or an ordinary writing pen.
4. The positioning interaction method according to claim 1, characterized in that the interactive object is a person's finger, and in step 1) the images of the finger and of the finger's shadow are captured; step 2) further comprises the step of curve fitting points on the finger shadow in the obtained image to obtain the image of the fingertip's shadow.
5. A positioning interaction system, characterized by comprising a projection system (100), a camera system (200) and a processing system (300); the line connecting the position of the projection system (100) and the position of the camera system (200) being arranged parallel to the projection screen area; the projection system (100) projecting projection information onto the screen area to form the projection scene; the camera system (200) capturing the images of the interactive object and of the interactive object's shadow in the projection scene; the processing system (300) receiving the images of the interactive object and of its shadow captured by the camera system (200), processing them according to step 2) of claim 1 to obtain the position information of the tip of the interactive object, converting the obtained position information into projection information, and outputting it to the projection system.
6. The positioning interaction system according to claim 5, characterized in that the projection system (100) further comprises an infrared light emitter for emitting infrared light, and the camera system (200) is provided with an infrared filter.
7. The positioning interaction system according to claim 5, characterized in that the interactive object is a person's finger, a thin rod held by a person, or an ordinary writing pen.
CN201210204994.XA 2012-06-20 2012-06-20 Method and system for location interaction Active CN102722254B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201210204994.XA CN102722254B (en) 2012-06-20 2012-06-20 Method and system for location interaction
HK12112354.1A HK1171544A1 (en) 2012-06-20 2012-11-29 Locating and interacting system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210204994.XA CN102722254B (en) 2012-06-20 2012-06-20 Method and system for location interaction

Publications (2)

Publication Number Publication Date
CN102722254A CN102722254A (en) 2012-10-10
CN102722254B true CN102722254B (en) 2015-06-17

Family

ID=46948049

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210204994.XA Active CN102722254B (en) 2012-06-20 2012-06-20 Method and system for location interaction

Country Status (2)

Country Link
CN (1) CN102722254B (en)
HK (1) HK1171544A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103327385B (en) * 2013-06-08 2019-03-19 上海集成电路研发中心有限公司 Based on single image sensor apart from recognition methods and device
CN104281301B (en) * 2013-07-05 2017-12-29 联想(北京)有限公司 A kind of input method and electronic equipment
US20150102993A1 (en) * 2013-10-10 2015-04-16 Omnivision Technologies, Inc Projector-camera system with an interactive screen
CN104714675B (en) * 2013-12-13 2017-12-05 华为终端(东莞)有限公司 A kind of gesture identification method and device
DE102014207932A1 (en) * 2014-04-28 2015-10-29 Robert Bosch Gmbh object recognition
JP6539816B2 (en) * 2015-01-30 2019-07-10 ソニー デプスセンシング ソリューションズ エスエー エヌブイ Multi-modal gesture based interactive system and method using one single sensing system
CN107102750B (en) * 2017-04-23 2019-07-26 吉林大学 The selection method of target in a kind of virtual three-dimensional space based on pen type interactive system
CN107396075B (en) * 2017-08-08 2019-12-31 海信集团有限公司 Method and device for generating projection image correction information
CN112231023A (en) * 2019-07-15 2021-01-15 北京字节跳动网络技术有限公司 Information display method, device, equipment and storage medium
CN114020145A (en) * 2021-09-30 2022-02-08 联想(北京)有限公司 Method, device and equipment for interacting with digital content and readable storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101266530A (en) * 2008-04-04 2008-09-17 中国海洋大学 Large-screen three-dimensional measuring touch screen
CN101729628A (en) * 2008-10-15 2010-06-09 Lg电子株式会社 Mobile terminal having image projection
CN201974160U (en) * 2011-01-20 2011-09-14 沈阳同联集团高新技术有限公司 Device for measuring three-dimensional shape of structured light
CN102360424A (en) * 2011-10-20 2012-02-22 浙江工商大学 Glare prevention method based on shadow image cutting algorithm

Also Published As

Publication number Publication date
HK1171544A1 (en) 2013-03-28
CN102722254A (en) 2012-10-10

Similar Documents

Publication Publication Date Title
CN102722254B (en) Method and system for location interaction
CN102799318B (en) A kind of man-machine interaction method based on binocular stereo vision and system
TWI683259B (en) Method and related device of determining camera posture information
CN102508578B (en) Projection positioning device and method as well as interaction system and method
CN101231450B (en) Multipoint and object touch panel arrangement as well as multipoint touch orientation method
CN102622108B (en) A kind of interactive projection system and its implementation
TWI408587B (en) Touch system and positioning method therefor
CN109782962A (en) A kind of projection interactive method, device, system and terminal device
TW201510771A (en) Pointing direction detecting device and its method, program and computer readable-medium
CN101639746B (en) Automatic calibration method of touch screen
CN115956259A (en) Generating an underlying real dataset for a virtual reality experience
CN201562264U (en) Self-adaptive intelligent interactive projector
CN104423578A (en) Interactive Input System And Method
KR100907104B1 (en) Calculation method and system of pointing locations, and collaboration system comprising it
WO2021004412A1 (en) Handheld input device, and method and apparatus for controlling display position of indication icon thereof
KR20130119233A (en) Apparatus for acquiring 3 dimension virtual object information without pointer
CN107682595B (en) interactive projection method, system and computer readable storage medium
CN113672099A (en) Electronic equipment and interaction method thereof
CN101520707A (en) Infrared ray and camera combined multipoint positioning touch device and positioning method
CN103176606B (en) Based on plane interaction system and the method for binocular vision identification
CN112657176A (en) Binocular projection man-machine interaction method combined with portrait behavior information
CN101013349A (en) Electronic plane display positioning system and positioning method
TWI486815B (en) Display device, system and method for controlling the display device
CN102289326B (en) Optical multi-point touch screen device and method applicable to vibration and damp environment
CN203606780U (en) Multi-touch and gesture recognition fusion system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 1171544

Country of ref document: HK

C14 Grant of patent or utility model
GR01 Patent grant
REG Reference to a national code

Ref country code: HK

Ref legal event code: GR

Ref document number: 1171544

Country of ref document: HK

CP01 Change in the name or title of a patent holder
CP01 Change in the name or title of a patent holder

Address after: Tsinghua Campus, Xili University Town, Nanshan District, Shenzhen, Guangdong 518055

Patentee after: Shenzhen International Graduate School of Tsinghua University

Address before: Tsinghua Campus, Xili University Town, Nanshan District, Shenzhen, Guangdong 518055

Patentee before: GRADUATE SCHOOL AT SHENZHEN, TSINGHUA University

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20211228

Address after: 530000 floor 18, building 8, Nanning Greenland center, No. 15, piankaixuan Road, China (Guangxi) pilot Free Trade Zone, Liangqing District, Nanning City, Guangxi Zhuang Autonomous Region

Patentee after: Guangxi Guihua Intelligent Manufacturing Co.,Ltd.

Address before: Tsinghua Campus, Xili University Town, Nanshan District, Shenzhen, Guangdong 518055

Patentee before: Shenzhen International Graduate School of Tsinghua University