CN108227923A - Virtual touch system and method based on motion-sensing technology - Google Patents

Virtual touch system and method based on motion-sensing technology

Info

Publication number
CN108227923A
Authority
CN
China
Prior art keywords
view field
gesture
sensing technology
virtual touch
method based
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810001164.4A
Other languages
Chinese (zh)
Inventor
盛赞
王行
周晓军
李骊
李朔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Huajie Imi Software Technology Co Ltd
Original Assignee
Nanjing Huajie Imi Software Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Huajie Imi Software Technology Co Ltd filed Critical Nanjing Huajie Imi Software Technology Co Ltd
Priority to CN201810001164.4A priority Critical patent/CN108227923A/en
Publication of CN108227923A publication Critical patent/CN108227923A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention discloses a virtual touch system and method based on motion-sensing technology. The system comprises a main control module and, respectively connected to it, a motion-sensing device and an execution device. The motion-sensing device comprises a 3D motion-sensing chip and, respectively connected to it, an infrared transmitter, an infrared receiver and an RGB camera. The RGB camera captures an RGB image of the projection area, and the 3D motion-sensing chip obtains the depth information of the projection area through the infrared transmitter and infrared receiver. The main control module recognizes the user's gesture in the projection area from the information obtained by the motion-sensing device and accordingly controls the execution device to perform the corresponding operation. The invention does not depend on a projection recognition device and can realize complex virtual interaction.

Description

Virtual touch system and method based on motion-sensing technology
Technical field
The present invention relates to the fields of image recognition and intelligent control technology, and more particularly to a virtual touch system and method based on motion-sensing technology.
Background technology
With the widespread adoption of mobile devices, more and more people use phones, tablets and other mobile devices to take photos, browse web pages and perform similar operations. Being able to project the information on a mobile terminal onto an arbitrary surface anytime and anywhere, and to interact with the projected image, has become a current demand.
A traditional virtual touch approach is shown in Figure 1. It consists of a projection device, an infrared transmitter, an infrared camera lens and a master controller. When a finger touches an interface projected onto a surface by the projection module, it blocks and reflects infrared light back to the infrared camera lens; through the processing of the master controller, the position of the finger press point is determined, and the projection device is controlled according to this position information, thereby realizing operation by virtual touch.
For example, Chinese patent application (application number 201620832077) proposes a projection device based on virtual touch technology, implemented in exactly this way. This approach can only realize some simple virtual interactions, such as click and long-press; it cannot realize complex gestures such as mouse-style movement, sliding and zooming.
Summary of the invention
In order to solve the technical problems raised in the above background, the present invention aims to provide a virtual touch system and method based on motion-sensing technology that does not depend on a projection recognition device and can realize complex virtual interaction.
In order to achieve the above technical purpose, the technical solution of the present invention is as follows.
A virtual touch system based on motion-sensing technology comprises a main control module and, respectively connected to it, a motion-sensing device and an execution device. The motion-sensing device comprises a 3D motion-sensing chip and, respectively connected to it, an infrared transmitter, an infrared receiver and an RGB camera. The RGB camera captures an RGB image of the projection area; the 3D motion-sensing chip obtains the depth information of the projection area through the infrared transmitter and infrared receiver; the main control module recognizes the user's gesture in the projection area according to the information obtained by the motion-sensing device, and accordingly controls the execution device to perform the corresponding operation.
A virtual touch method based on motion-sensing technology includes the following steps:
(1) capture the projection area, measure the spatial three-dimensional data, and output an RGB image and a depth image;
(2) based on the depth, luminance and chrominance information of each frame, calculate the plane equation and the region where the projection lies; if no projection is identified, simulate a projection area on the plane defined by the equation;
(3) for the depth information of each frame, extract the depth information of the hand, screen out the key information of the hand, and normalize it;
(4) take the images of several consecutive normalized frames, extract feature values, normalize them to construct a multi-dimensional feature vector, and compute the Euclidean distance between this feature vector and a pre-saved gesture feature vector; if the Euclidean distance is less than a predetermined threshold, the gesture is successfully matched;
(5) calculate the position and duration of the gesture identified in step (4), and convert the gesture type, position and duration into the corresponding operation command, so as to control the execution device to perform the corresponding operation.
Further, in step (2), first, 3 points are taken in the depth map and a plane equation is calculated; then the remaining points are gradually substituted in to correct the plane equation until a stable plane equation is obtained. Then, combining the luminance of the projection area in the RGB image and its contrast and chrominance against the surrounding environment, the width and height of the projection area are calculated and its size and position on the plane are determined.
Further, in step (3), the key information of the hand includes the size of the hand, its region, and the joint information of every finger.
Further, in step (3), the normalization includes size normalization, reference-zero normalization and direction normalization.
Further, in step (4), the multi-dimensional feature vector includes the movement velocity, movement angle and movement trajectory features of the gesture.
The advantageous effects brought by the above technical solution are as follows.
The present invention can automatically identify the projected region through the motion-sensing device; moreover, if no projection is recognized, the system can simulate a projection area (the user simply does not see it), likewise realizing the user's virtual operation of the device. Meanwhile, the present invention realizes the recognition of complex gestures through its gesture recognition method, thereby realizing complex virtual interaction.
Description of the drawings
Fig. 1 is a schematic diagram of a traditional virtual touch approach;
Fig. 2 is a block diagram of the system of the present invention;
Fig. 3 is a flow chart of the method of the present invention;
Fig. 4 shows several virtual touch modes supported by the present invention.
Specific embodiment
The technical solution of the present invention is described in detail below with reference to the accompanying drawings.
As shown in Fig. 2, a virtual touch system based on motion-sensing technology comprises a main control module and, respectively connected to it, a motion-sensing device and an execution device. The motion-sensing device comprises a 3D motion-sensing chip and, respectively connected to it, an infrared transmitter, an infrared receiver and an RGB camera. The RGB camera captures an RGB image of the projection area; the 3D motion-sensing chip obtains the depth information of the projection area through the infrared transmitter and infrared receiver; the main control module recognizes the user's gesture in the projection area according to the information obtained by the motion-sensing device, and accordingly controls the execution device to perform the corresponding operation.
As shown in Figure 3, a virtual touch method based on motion-sensing technology proceeds as follows.
Step 1: Capture the projection area, measure the spatial three-dimensional data, and output an RGB image and a depth image.
Step 2: Based on the depth, luminance and chrominance information of each frame, calculate the plane equation and the region where the projection lies; if no projection is identified, simulate a projection area on the plane defined by the equation.
First, 3 points are taken in the depth map and a plane equation is calculated; then the remaining points are gradually substituted in to correct the plane equation until a stable plane equation is obtained. Then, combining the luminance of the projection area in the RGB image and its contrast and chrominance against the surrounding environment, the width and height of the projection area are calculated, and the size and position of the projection area on the plane are determined.
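This two-stage plane estimate (an exact fit to 3 depth points, then correction with the remaining points) can be sketched as follows. The patent does not specify the correction rule, so the inlier threshold and the least-squares re-fit below are assumptions, and the function names are invented:

```python
import numpy as np

def fit_plane_3pts(p1, p2, p3):
    """Plane through three non-collinear 3-D points, returned as (n, d)
    with the plane defined by n . x + d = 0 and |n| = 1."""
    n = np.cross(p2 - p1, p3 - p1)
    n = n / np.linalg.norm(n)
    return n, -np.dot(n, p1)

def refine_plane(points, n, d, thresh=0.01):
    """Substitute the remaining depth points into the initial plane:
    keep points lying near it (assumed inlier threshold, in metres) and
    re-fit by least squares for a stable plane estimate."""
    pts = points[np.abs(points @ n + d) < thresh]
    c = pts.mean(axis=0)
    # The refined normal is the direction of least variance of the inliers,
    # i.e. the right-singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(pts - c)
    n2 = vt[-1]
    return n2, -np.dot(n2, c)
```

In a real pipeline the three seed points would be chosen from stable background regions of the depth map; here any non-collinear triple works.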
The plane equation and the projection area are continuously refreshed, so however the projection area changes, the system identifies it automatically.
Step 3: For the depth information of each frame, extract the depth information of the hand, screen out the key information of the hand, and normalize it.
The key information includes the size of the hand, its region, the data of each finger joint, and so on. The normalization of the hand includes size normalization, reference-zero normalization and direction normalization. Normalization improves the reliability, stability and accuracy of hand-based operation.
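One way to read the three normalizations is: translate the joints so a reference point is at the origin (reference-zero), scale by a characteristic hand size (size), and rotate a reference direction onto a canonical axis (direction). The sketch below assumes 2-D joint coordinates and invented joint indices (palm and middle fingertip); the patent does not fix these choices:

```python
import numpy as np

def normalize_hand(joints, palm_idx=0, middle_tip_idx=1):
    """Normalize an (N, 2) array of hand joint positions.

    - reference-zero: translate so the palm joint sits at the origin
    - size: scale so the palm-to-middle-fingertip distance is 1
    - direction: rotate so that same reference vector points along +y
    """
    j = joints - joints[palm_idx]                 # reference-zero normalization
    ref = j[middle_tip_idx]
    j = j / np.linalg.norm(ref)                   # size normalization
    angle = np.arctan2(ref[0], ref[1])            # angle of ref off the +y axis
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, -s], [s, c]])               # rotation taking ref onto +y
    return j @ R.T                                # direction normalization
```

After this step, the same hand pose produces (nearly) the same coordinates regardless of where on the projection surface it appears, how large the hand is, or how it is oriented.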
Step 4: Take the images of several consecutive normalized frames, extract feature values, normalize them to construct a multi-dimensional feature vector, and compute the Euclidean distance between this feature vector and a pre-saved gesture feature vector; if the Euclidean distance is less than a predetermined threshold, the gesture is successfully matched.
The feature vector includes features such as movement velocity, movement angle and movement trajectory; the feature vector is normalized to reduce the recognition differences and instability caused by differences between hands. Gesture recognition uses the AP (affinity propagation) clustering algorithm.
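Matching a feature vector against pre-saved gesture templates by Euclidean distance, as described above, can be sketched as follows. The template names and the threshold value are illustrative, and the AP-clustering step that would produce the templates is not shown:

```python
import numpy as np

def match_gesture(feature_vec, templates, threshold=0.5):
    """Return the name of the closest gesture template if its Euclidean
    distance to feature_vec is below the threshold, otherwise None.

    feature_vec : 1-D array of stacked features (speed, angle, trajectory...)
    templates   : dict mapping gesture name -> pre-saved feature vector
    """
    best_name, best_dist = None, float("inf")
    for name, tmpl in templates.items():
        dist = np.linalg.norm(feature_vec - tmpl)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < threshold else None
```

Because all features are normalized first, a single scalar threshold can serve for every gesture class; a per-class threshold would be a straightforward extension.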
Step 5: Calculate the position and duration of the gesture identified in Step 4, and convert the gesture type, position and duration into the corresponding operation command, so as to control the execution device to perform the corresponding operation. Figure 4 illustrates several virtual touch modes supported by the present invention, including click, long-press, slide and zoom.
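Step 5's conversion of (gesture type, position, duration) into an operation command might look like the following sketch. The gesture names, the one-second long-press threshold and the command format are all invented for illustration; the patent only states that the three quantities are converted into a command:

```python
def to_command(gesture, position, duration_s):
    """Convert a recognized gesture, its position in the projection area,
    and how long it lasted into an operation command for the execution
    device (hypothetical command schema)."""
    if gesture == "press":
        # A press held long enough becomes a long-press command.
        action = "long_press" if duration_s >= 1.0 else "click"
    else:
        action = gesture  # e.g. "slide", "zoom" pass through unchanged
    return {"action": action, "x": position[0], "y": position[1]}
```

The main control module would then forward the resulting command to the execution device, which performs the corresponding operation.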
The embodiments merely illustrate the technical idea of the present invention and cannot be used to limit its scope of protection; any change made on the basis of the technical solution, in accordance with the technical idea proposed by the present invention, falls within the scope of protection of the present invention.

Claims (6)

1. A virtual touch system based on motion-sensing technology, characterized in that: it comprises a main control module and, respectively connected to it, a motion-sensing device and an execution device; the motion-sensing device comprises a 3D motion-sensing chip and, respectively connected to it, an infrared transmitter, an infrared receiver and an RGB camera; the RGB camera captures an RGB image of the projection area; the 3D motion-sensing chip obtains the depth information of the projection area through the infrared transmitter and the infrared receiver; the main control module recognizes the user's gesture in the projection area according to the information obtained by the motion-sensing device, and accordingly controls the execution device to perform the corresponding operation.
2. A virtual touch method based on motion-sensing technology, characterized in that it comprises the following steps:
(1) capturing the projection area, measuring the spatial three-dimensional data, and outputting an RGB image and a depth image;
(2) based on the depth, luminance and chrominance information of each frame, calculating the plane equation and the region where the projection lies; if no projection is identified, simulating a projection area on the plane defined by the equation;
(3) for the depth information of each frame, extracting the depth information of the hand, screening out the key information of the hand, and normalizing it;
(4) taking the images of several consecutive normalized frames, extracting feature values, normalizing them to construct a multi-dimensional feature vector, and computing the Euclidean distance between this feature vector and a pre-saved gesture feature vector; if the Euclidean distance is less than a predetermined threshold, the gesture is successfully matched;
(5) calculating the position and duration of the gesture identified in step (4), and converting the gesture type, position and duration into the corresponding operation command, so as to control the execution device to perform the corresponding operation.
3. The virtual touch method based on motion-sensing technology according to claim 2, characterized in that: in step (2), first, 3 points are taken in the depth map and a plane equation is calculated; then the remaining points are gradually substituted in to correct the plane equation, yielding a stable plane equation; then, combining the luminance of the projection area in the RGB image and its contrast and chrominance against the surrounding environment, the width and height of the projection area are calculated and its size and position on the plane are determined.
4. The virtual touch method based on motion-sensing technology according to claim 2, characterized in that: in step (3), the key information of the hand includes the size of the hand, its region, and the joint information of every finger.
5. The virtual touch method based on motion-sensing technology according to claim 2, characterized in that: in step (3), the normalization includes size normalization, reference-zero normalization and direction normalization.
6. The virtual touch method based on motion-sensing technology according to claim 2, characterized in that: in step (4), the multi-dimensional feature vector includes the movement velocity, movement angle and movement trajectory features of the gesture.
CN201810001164.4A 2018-01-02 2018-01-02 Virtual touch system and method based on motion-sensing technology Pending CN108227923A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810001164.4A CN108227923A (en) 2018-01-02 2018-01-02 Virtual touch system and method based on motion-sensing technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810001164.4A CN108227923A (en) 2018-01-02 2018-01-02 Virtual touch system and method based on motion-sensing technology

Publications (1)

Publication Number Publication Date
CN108227923A true CN108227923A (en) 2018-06-29

Family

ID=62642479

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810001164.4A Pending CN108227923A (en) 2018-01-02 2018-01-02 Virtual touch system and method based on motion-sensing technology

Country Status (1)

Country Link
CN (1) CN108227923A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110162225A (en) * 2019-05-05 2019-08-23 青岛小鸟看看科技有限公司 A kind of projection lamp and the touch control method for projection lamp
CN110825271A (en) * 2019-11-13 2020-02-21 一汽轿车股份有限公司 Vehicle-mounted AR holographic projection interaction device
CN111240486A (en) * 2020-02-17 2020-06-05 河北冀联人力资源服务集团有限公司 Data processing method and system based on edge calculation
CN111258410A (en) * 2020-05-06 2020-06-09 北京深光科技有限公司 Man-machine interaction equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102982557A (en) * 2012-11-06 2013-03-20 桂林电子科技大学 Method for processing space hand signal gesture command based on depth camera
CN103677240A (en) * 2012-09-24 2014-03-26 株式会社理光 Virtual touch interaction method and equipment
US20150244911A1 (en) * 2014-02-24 2015-08-27 Tsinghua University System and method for human computer interaction
CN105589552A (en) * 2014-10-30 2016-05-18 联想(北京)有限公司 Projection interaction method and projection interaction device based on gestures
CN106095098A (en) * 2016-06-07 2016-11-09 深圳奥比中光科技有限公司 Body feeling interaction device and body feeling interaction method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103677240A (en) * 2012-09-24 2014-03-26 株式会社理光 Virtual touch interaction method and equipment
CN102982557A (en) * 2012-11-06 2013-03-20 桂林电子科技大学 Method for processing space hand signal gesture command based on depth camera
US20150244911A1 (en) * 2014-02-24 2015-08-27 Tsinghua University System and method for human computer interaction
CN105589552A (en) * 2014-10-30 2016-05-18 联想(北京)有限公司 Projection interaction method and projection interaction device based on gestures
CN106095098A (en) * 2016-06-07 2016-11-09 深圳奥比中光科技有限公司 Body feeling interaction device and body feeling interaction method

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110162225A (en) * 2019-05-05 2019-08-23 青岛小鸟看看科技有限公司 A kind of projection lamp and the touch control method for projection lamp
CN110825271A (en) * 2019-11-13 2020-02-21 一汽轿车股份有限公司 Vehicle-mounted AR holographic projection interaction device
CN111240486A (en) * 2020-02-17 2020-06-05 河北冀联人力资源服务集团有限公司 Data processing method and system based on edge calculation
CN111240486B (en) * 2020-02-17 2021-07-02 河北冀联人力资源服务集团有限公司 Data processing method and system based on edge calculation
CN111258410A (en) * 2020-05-06 2020-06-09 北京深光科技有限公司 Man-machine interaction equipment
CN111258410B (en) * 2020-05-06 2020-08-04 北京深光科技有限公司 Man-machine interaction equipment

Similar Documents

Publication Publication Date Title
US11314335B2 (en) Systems and methods of direct pointing detection for interaction with a digital device
US9349039B2 (en) Gesture recognition device and control method for the same
US9734392B2 (en) Image processing device and image processing method
US9465444B1 (en) Object recognition for gesture tracking
WO2022166243A1 (en) Method, apparatus and system for detecting and identifying pinching gesture
US20140300542A1 (en) Portable device and method for providing non-contact interface
KR101364571B1 (en) Apparatus for hand detecting based on image and method thereof
JP5077956B2 (en) Information terminal equipment
CN108227923A (en) Virtual touch system and method based on motion-sensing technology
KR20130004357A (en) A computing device interface
US20150089453A1 (en) Systems and Methods for Interacting with a Projected User Interface
CN104364733A (en) Position-of-interest detection device, position-of-interest detection method, and position-of-interest detection program
CN111527468A (en) Air-to-air interaction method, device and equipment
CN114138121B (en) User gesture recognition method, device and system, storage medium and computing equipment
EP2996067A1 (en) Method and device for generating motion signature on the basis of motion signature information
JP2012238293A (en) Input device
CN112351188B (en) Apparatus and method for displaying graphic element according to object
JP2011118533A (en) Device and method for inputting touch position
US20160140762A1 (en) Image processing device and image processing method
KR100968205B1 (en) Apparatus and Method for Space Touch Sensing and Screen Apparatus sensing Infrared Camera
KR101961266B1 (en) Gaze Tracking Apparatus and Method
WO2018171363A1 (en) Position information determining method, projection device and computer storage medium
CN106951077B (en) Prompting method and first electronic device
CN113282164A (en) Processing method and device
CN110213407B (en) Electronic device, operation method thereof and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180629
