CN102221887B - Interactive projection system and method - Google Patents

Interactive projection system and method

Info

Publication number
CN102221887B
CN102221887B · CN201110171066.3A
Authority
CN
China
Prior art keywords
depth image
unit
interaction content
limbs
image processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201110171066.3A
Other languages
Chinese (zh)
Other versions
CN102221887A (en)
Inventor
陈大炜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Konka Group Co Ltd
Original Assignee
Konka Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konka Group Co Ltd filed Critical Konka Group Co Ltd
Priority to CN201110171066.3A priority Critical patent/CN102221887B/en
Publication of CN102221887A publication Critical patent/CN102221887A/en
Application granted granted Critical
Publication of CN102221887B publication Critical patent/CN102221887B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Controls And Circuits For Display Device (AREA)
  • Position Input By Displaying (AREA)
  • Projection Apparatus (AREA)

Abstract

The invention discloses an interactive projection system and method. The technical problem to be solved is to achieve a more realistic interaction effect while eliminating the need to mount the image-capture device at a high position, so that the system can be conveniently debugged and used. The present invention adopts the following technical solution: an interactive projection system comprising the following components: a depth image acquisition unit, a depth image processing unit, a limb target tracking unit, an interaction content control unit, and a projection display unit. The depth image acquisition unit is connected to the depth image processing unit, the depth image processing unit is connected to the limb target tracking unit, the limb target tracking unit is connected to the interaction content control unit, and the interaction content control unit is connected to the projection display unit. Compared with the prior art, the invention uses a depth image acquisition unit based on depth sensing, together with corresponding depth image processing and limb recognition units, thereby achieving a more realistic interaction effect.

Description

Interactive projection system and method
Technical field
The present invention relates to the field of stereoscopic display, and in particular to an interactive projection system and method.
Background technology
Interactive projection is a multimedia display platform that has become popular in recent years. It combines computer vision technology with projection display technology so that users can interact directly with the virtual scene in the projection area using their feet or hands, creating a dynamic interactive experience. The basic principle of an interactive projection system is as follows: first, an image-capture device photographs the target; the captured images are then analysed by an image analysis system to derive the motion of the captured object; this motion data, combined with a real-time image interaction system, produces an interactive effect between the participant and the projection area. In general, a common interactive projection system uses a projector as the imaging device, projecting the computer picture onto the floor or a wall to form an oversized display, and uses an infrared camera as the video-capture device to analyse the participant's motion within the projected picture; the motion information is fed to the computer as input to control the projected picture. However, in current interactive systems the video-capture device must be placed high up, near the projection equipment, which makes installation and debugging very inconvenient. Moreover, current image analysis systems are essentially based on motion detection: any moving object can trigger interaction with the projected content, so they do not truly sense human motion. For example, a flying insect passing near the camera can act on the system.
Furthermore, with motion-detection-based approaches, the smaller the participant's image area within the video frame, the more accurately the motion detection result can be converted into a computer input control signal, and the more realistic the interaction effect. This in turn requires the projection equipment and camera to be mounted as high as possible, which again brings considerable difficulty to engineering, installation, and debugging.
Summary of the invention
The object of the present invention is to provide an interactive projection system and method. The technical problem to be solved is to achieve a more realistic interaction effect while eliminating the need to mount the image-capture device at a high position, so that the system can be conveniently debugged and used.
The present invention adopts the following technical solution. An interactive projection system, characterised in that the system comprises the following components:
A depth image acquisition unit, for receiving and sensing the depth image signal within the projection range;
A depth image processing unit, for separating the moving human body from the depth image according to the depth image signal collected by the depth image acquisition unit;
A limb target tracking unit, for tracking the trajectory of the user's hand or foot according to the processing result of the depth image processing unit and the needs of the interaction content, and for sending the result to the interaction content control unit;
An interaction content control unit, for responding with predetermined interaction content according to the trajectory tracking result for the user's hand or foot from the limb target tracking unit, and for controlling and updating the interaction content;
A projection display unit, for projecting and displaying the interaction content.
The depth image acquisition unit is connected to the depth image processing unit, the depth image processing unit is connected to the limb target tracking unit, the limb target tracking unit is connected to the interaction content control unit, and the interaction content control unit is connected to the projection display unit.
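The processing chain of the units above (acquisition, depth processing, limb tracking, content control, display) can be sketched in code. The following is a minimal illustration assuming a background-subtraction approach to separating the moving body and a simple centroid tracker; all class names, method names, and parameters are hypothetical, not taken from the patent:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Grid = List[List[int]]  # depth in millimetres per pixel; 0 means "no reading"

@dataclass
class DepthImageProcessor:
    """Separates the moving body from the depth image by comparing each
    frame against a static background depth frame (hypothetical approach)."""
    background: Grid          # depth frame captured with no user present
    threshold_mm: int = 100   # depth change that counts as "foreground"

    def foreground_mask(self, frame: Grid) -> List[List[bool]]:
        # A pixel is foreground if it has a valid reading and its depth
        # differs from the empty-scene background by more than the threshold.
        return [[pix > 0 and abs(pix - bg) > self.threshold_mm
                 for pix, bg in zip(row, bg_row)]
                for row, bg_row in zip(frame, self.background)]

class LimbTracker:
    """Reduces the foreground mask to a single tracked coordinate
    (here simply the centroid of all foreground pixels)."""
    def track(self, mask: List[List[bool]]) -> Optional[Tuple[float, float]]:
        pts = [(x, y) for y, row in enumerate(mask)
               for x, on in enumerate(row) if on]
        if not pts:
            return None  # no moving body detected; keep sensing
        n = len(pts)
        return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)
```

In this sketch the tracker's output coordinate would be handed to the interaction content control unit, which maps it into the projection and updates the displayed content.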
The depth image acquisition unit of the present invention is a depth camera.
The projection display unit of the present invention is a projector.
The acquisition unit of the present invention is mounted at 1000 mm in the horizontal direction, within the spatial range from which it can sense the area projected by the projection display unit.
An interaction method, comprising the following steps: (1) the depth image acquisition unit senses and collects a depth image that includes the user; (2) the depth image acquisition unit judges whether the user makes a limb movement; if so, the depth image is sent to the depth image processing unit, which separates the user's limb information from the depth image and sends the result to the limb target tracking unit; otherwise, sensing and collection continue; (3) the limb target tracking unit determines the tracking target according to the needs of the interaction content, obtains the coordinate position of the tracking target's action, and sends the tracking result to the interaction content control unit; (4) after obtaining the user's coordinate position, the interaction content control unit maps it into the projector's spatial coordinate system, yielding the user's absolute coordinate position on the projected picture on the ground; (5) after obtaining the absolute coordinate position, the projection display unit projects the interactive effect onto the projection screen.
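The five steps form a sense-process-track-map-display loop. A hypothetical sketch, with each unit's role standing in as a callable (all names invented for illustration):

```python
def interaction_loop(frames, has_limb_motion, separate_limbs,
                     choose_target, map_to_projection, display):
    """Runs the five-step interaction method over a sequence of depth
    frames. Each callable argument stands in for one unit's role."""
    for frame in frames:                      # step 1: sense and acquire
        if not has_limb_motion(frame):        # step 2: gate on limb motion;
            continue                          #   otherwise keep sensing
        limbs = separate_limbs(frame)         #   separate limb information
        cam_xy = choose_target(limbs)         # step 3: tracked action coords
        abs_xy = map_to_projection(cam_xy)    # step 4: ground-plane coords
        display(abs_xy)                       # step 5: project the effect
```

The gate in step 2 is what distinguishes this method from plain motion detection: frames without recognised limb motion never reach the tracking and display stages.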
Compared with the prior art, the present invention adopts a depth image acquisition unit based on depth sensing, together with corresponding depth image processing and limb recognition units, thereby achieving a more realistic interaction effect; moreover, the image-capture device no longer needs to be mounted at a high position, and the system can be conveniently debugged and used.
Brief description of the drawings
Fig. 1 is a structural block diagram of the present invention.
Fig. 2 is a flow chart of the present invention.
Fig. 3 is a schematic diagram of the placement space area of the present invention.
Detailed description of the invention
The present invention is described in further detail below with reference to the drawings and embodiments.
As shown in Fig. 1, the interactive projection system of the present invention comprises the following components:
A depth image acquisition unit, for receiving and sensing the depth image signal within the projection range;
A depth image processing unit, for obtaining depth-of-field information within the visual range of the structured-light sensor according to the principle of structured-light coding, by comparing the structured-light coding of the image in the three-dimensional environment against the original planar structured-light coding; using the depth information of the user within the projection range, the moving human body is separated from the depth image;
A limb target tracking unit, for tracking the trajectory of the user's hand or foot according to the processing result of the depth image processing unit and the needs of the interaction content, and for sending the result to the interaction content control unit;
An interaction content control unit, for responding with predetermined interaction content according to the trajectory tracking result for the user's hand or foot from the limb target tracking unit, and for controlling and updating the interaction content;
A projection display unit, for projecting and displaying the interaction content.
The depth image acquisition unit is connected to the depth image processing unit, the depth image processing unit is connected to the limb target tracking unit, the limb target tracking unit is connected to the interaction content control unit, and the interaction content control unit is connected to the projection display unit.
The depth image acquisition unit is a depth camera, responsible for receiving and sensing the infrared depth image signal within the projection range. Depending on the depth-sensing technology applied, either structured-light coding or time-of-flight technology may be chosen. Without loss of generality, this depth camera is composed of an infrared structured-light emitting unit and an infrared structured-light sensor unit: the emitting unit emits a coded structured-light plane into the projection area, and the sensor unit receives the infrared structured light reflected as the user passes through the projection range.
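In the structured-light variant, depth is conventionally recovered by triangulation: a coded pattern feature shifts between the reference (flat-plane) pattern and the observed image in inverse proportion to its depth. A sketch of the standard relation follows; the focal length and baseline values in the test are illustrative, not taken from the patent:

```python
def depth_from_disparity(focal_px: float, baseline_mm: float,
                         disparity_px: float) -> float:
    """Standard structured-light / stereo triangulation: a pattern feature
    that shifts `disparity_px` pixels between the emitted reference pattern
    and the observed image lies at depth f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_mm / disparity_px
```

Note the inverse relation: features on nearer surfaces (such as a user stepping into the projection range) shift more, which is exactly what lets the processing unit separate the moving body from the farther background.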
The projection display unit is a projector.
The acquisition unit is mounted at 1000 mm in the horizontal direction, within the spatial range from which it can sense the projection area of the projection display unit.
The projection display unit is mounted at a high position and projects the interaction content onto an object surface, the ground, or a specially made desktop.
In the most preferred embodiment, the depth camera of the present invention is a camera produced by PrimeSense; the infrared structured-light sensor unit is a PS1080 infrared structured-light sensor chip produced by PrimeSense; the depth image processing unit uses the NITE middleware produced by PrimeSense; the limb target tracking unit is a KK-Montion limb target tracker produced by Konka Group Co., Ltd; the interaction content control unit is a KK-InterActive interaction content controller produced by Konka Group Co., Ltd; and the projection display unit is a VPL-DX11 projector produced by Sony.
As shown in Fig. 2, the present invention is realised by the following method: (1) the depth image acquisition unit senses and collects a depth image that includes the user; (2) the depth image acquisition unit judges whether the user makes a limb movement; if so, the depth image is sent to the depth image processing unit, which separates the user's limb information from the depth image and sends the result to the limb target tracking unit; otherwise, sensing and collection continue; (3) the limb target tracking unit determines the tracking target according to the needs of the interaction content, obtains the coordinate position of the tracking target's action, and sends the tracking result to the interaction content control unit; (4) after obtaining the user's coordinate position, the interaction content control unit maps it into the projector's spatial coordinate system, yielding the user's absolute coordinate position on the projected picture on the ground; (5) after obtaining the absolute coordinate position, the projection display unit projects the interactive effect onto the projection screen.
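Step four, mapping the tracked camera coordinate into the projector's spatial coordinate system, is a plane-to-plane calibration. A minimal sketch using an axis-aligned scale-and-offset model; a real system would typically fit a full homography, and all calibration numbers in the test are invented for illustration:

```python
from typing import Tuple

Vec2 = Tuple[float, float]

def camera_to_projection(uv: Vec2, scale: Vec2, offset: Vec2) -> Vec2:
    """Map a camera-image coordinate (u, v) to an absolute coordinate on
    the ground projection under a hypothetical axis-aligned calibration:
    projection = camera * scale + offset, applied per axis."""
    (u, v), (sx, sy), (ox, oy) = uv, scale, offset
    return (u * sx + ox, v * sy + oy)
```

The scale and offset would be obtained once during installation, e.g. by having a user touch known calibration points on the projected picture.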
As shown in Fig. 3, the depth image acquisition unit and the projection display unit of the present invention can be arranged at a mutual 90-degree angle, replacing the traditional arrangement in which the video-capture device must be mounted at a high position near the projection equipment.

Claims (1)

1. An interactive projection system, characterised in that the system comprises the following components:
A depth image acquisition unit, for receiving and sensing the depth image signal within the projection range;
A depth image processing unit, for separating the moving human body from the depth image according to the depth image signal collected by the depth image acquisition unit;
A limb target tracking unit, for tracking the trajectory of the user's hand or foot according to the processing result of the depth image processing unit and the needs of the interaction content, and for sending the result to the interaction content control unit;
An interaction content control unit, for responding with predetermined interaction content according to the trajectory tracking result for the user's hand or foot from the limb target tracking unit, and for controlling and updating the interaction content;
A projection display unit, for projecting and displaying the interaction content;
The depth image acquisition unit is connected to the depth image processing unit, the depth image processing unit is connected to the limb target tracking unit, the limb target tracking unit is connected to the interaction content control unit, and the interaction content control unit is connected to the projection display unit;
The depth image acquisition unit is a depth camera;
The projection display unit is a projector;
The depth image acquisition unit is mounted at 1000 mm in the horizontal direction, within the spatial range from which it can sense the area projected by the projection display unit;
The interactive projection system is realised by the following interaction method, comprising the steps of: (1) the depth image acquisition unit senses and collects a depth image that includes the user; (2) the depth image acquisition unit judges whether the user makes a limb movement; if so, the depth image is sent to the depth image processing unit, which separates the user's limb information from the depth image and sends the result to the limb target tracking unit; otherwise, sensing and collection continue; (3) the limb target tracking unit determines the tracking target according to the needs of the interaction content, obtains the coordinate position of the tracking target's action, and sends the tracking result to the interaction content control unit; (4) after obtaining the user's coordinate position, the interaction content control unit maps it into the projector's spatial coordinate system, yielding the user's absolute coordinate position on the projected picture on the ground; (5) after obtaining the absolute coordinate position, the projection display unit projects the interactive effect onto the projection screen;
The depth camera is a camera produced by PrimeSense; the infrared structured-light sensor unit is a PS1080 infrared structured-light sensor chip produced by PrimeSense; the depth image processing unit uses the NITE middleware produced by PrimeSense; the limb target tracking unit is a KK-Montion limb target tracker produced by Konka Group Co., Ltd; the interaction content control unit is a KK-InterActive interaction content controller produced by Konka Group Co., Ltd; the projection display unit is a VPL-DX11 projector produced by Sony.
CN201110171066.3A 2011-06-23 2011-06-23 Interactive projection system and method Active CN102221887B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110171066.3A CN102221887B (en) 2011-06-23 2011-06-23 Interactive projection system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201110171066.3A CN102221887B (en) 2011-06-23 2011-06-23 Interactive projection system and method

Publications (2)

Publication Number Publication Date
CN102221887A CN102221887A (en) 2011-10-19
CN102221887B true CN102221887B (en) 2016-05-04

Family

ID=44778452

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110171066.3A Active CN102221887B (en) 2011-06-23 2011-06-23 Interactive projection system and method

Country Status (1)

Country Link
CN (1) CN102221887B (en)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102375542B (en) * 2011-10-27 2015-02-11 Tcl集团股份有限公司 Method for remotely controlling television by limbs and television remote control device
DE102012206851A1 (en) * 2012-04-25 2013-10-31 Robert Bosch Gmbh Method and device for determining a gesture executed in the light cone of a projected image
CN102779359B (en) * 2012-07-13 2015-07-15 南京大学 Automatic ticket checking device for performing passage detection based on depth image
CN102968222A (en) * 2012-11-07 2013-03-13 电子科技大学 Multi-point touch equipment based on depth camera
US9047514B2 (en) * 2013-07-10 2015-06-02 Christie Digital Systems Usa, Inc. Apparatus, system and method for projecting images onto predefined portions of objects
CN104349096B (en) * 2013-08-09 2017-12-29 联想(北京)有限公司 A kind of image calibration method, apparatus and electronic equipment
US9691357B2 (en) 2013-08-09 2017-06-27 Lenovo (Beijing) Co., Ltd. Information processing method and electronic device thereof, image calibration method and apparatus, and electronic device thereof
CN103455141B (en) * 2013-08-15 2016-07-06 无锡触角科技有限公司 The calibration steps of interactive projection system and depth transducer and projector
CN104951200B (en) * 2014-03-26 2018-02-27 联想(北京)有限公司 A kind of method and apparatus for performing interface operation
US20160216778A1 (en) * 2015-01-27 2016-07-28 Industrial Technology Research Institute Interactive projector and operation method thereof for determining depth information of object
CN104915010A (en) * 2015-06-28 2015-09-16 合肥金诺数码科技股份有限公司 Gesture recognition based virtual book flipping system
CN105843378A (en) * 2016-03-17 2016-08-10 中国农业大学 Service terminal based on somatosensory interaction control and control method of the service terminal
TWI653563B (en) * 2016-05-24 2019-03-11 仁寶電腦工業股份有限公司 Projection touch image selection method
CN106251387A (en) * 2016-07-29 2016-12-21 武汉光之谷文化科技股份有限公司 A kind of imaging system based on motion capture
CN107027014A (en) * 2017-03-23 2017-08-08 广景视睿科技(深圳)有限公司 A kind of intelligent optical projection system of trend and its method
CN109816723A (en) * 2017-11-21 2019-05-28 深圳光峰科技股份有限公司 Method for controlling projection, device, projection interactive system and storage medium
CN108322639A (en) * 2017-12-29 2018-07-24 维沃移动通信有限公司 A kind of method, apparatus and mobile terminal of image procossing
CN108347558A (en) * 2017-12-29 2018-07-31 维沃移动通信有限公司 A kind of method, apparatus and mobile terminal of image optimization
CN108259740A (en) * 2017-12-29 2018-07-06 维沃移动通信有限公司 A kind of method, apparatus and mobile terminal of panoramic picture generation
CN109993835A (en) * 2017-12-31 2019-07-09 广景视睿科技(深圳)有限公司 A kind of stage interaction method, apparatus and system
CN109521879B (en) * 2018-11-19 2022-02-18 杭州易现先进科技有限公司 Interactive projection control method and device, storage medium and electronic equipment
CN111258410B (en) * 2020-05-06 2020-08-04 北京深光科技有限公司 Man-machine interaction equipment
CN112702587A (en) * 2020-12-29 2021-04-23 广景视睿科技(深圳)有限公司 Intelligent tracking projection method and system
CN114253452B (en) * 2021-11-16 2024-08-20 深圳市普渡科技有限公司 Robot, man-machine interaction method, device and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101699370A (en) * 2009-11-10 2010-04-28 北京思比科微电子技术有限公司 Depth detection based body identification control device
CN201699871U (en) * 2010-01-29 2011-01-05 联动天下科技(大连)有限公司 Interactive projector
CN102033608A (en) * 2001-06-05 2011-04-27 Reactrix Systems Inc Interactive video display system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5201096B2 (en) * 2009-07-17 2013-06-05 大日本印刷株式会社 Interactive operation device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102033608A (en) * 2001-06-05 2011-04-27 Reactrix Systems Inc Interactive video display system
CN101699370A (en) * 2009-11-10 2010-04-28 北京思比科微电子技术有限公司 Depth detection based body identification control device
CN201699871U (en) * 2010-01-29 2011-01-05 联动天下科技(大连)有限公司 Interactive projector

Also Published As

Publication number Publication date
CN102221887A (en) 2011-10-19

Similar Documents

Publication Publication Date Title
CN102221887B (en) Interactive projection system and method
CN108986189B (en) Method and system for capturing and live broadcasting of real-time multi-person actions based on three-dimensional animation
CN102262725B (en) The analysis of three-dimensional scenic
KR102077108B1 (en) Apparatus and method for providing contents experience service
US9787943B2 (en) Natural user interface having video conference controls
US9628755B2 (en) Automatically tracking user movement in a video chat application
CN100373394C (en) Petoscope based on bionic oculus and method thereof
CN106843507B (en) Virtual reality multi-person interaction method and system
CN100487636C (en) Game control system and method based on stereo vision
US20170054972A1 (en) Method and apparatus for presenting 3d scene
CN107027014A (en) A kind of intelligent optical projection system of trend and its method
CN103619090A (en) System and method of automatic stage lighting positioning and tracking based on micro inertial sensor
CN105518584A (en) Recognizing interactions with hot zones
CN103543827A (en) Immersive outdoor activity interactive platform implement method based on single camera
CN104516492A (en) Man-machine interaction technology based on 3D (three dimensional) holographic projection
KR20150094680A (en) Target and press natural user input
Yu et al. Intelligent visual-IoT-enabled real-time 3D visualization for autonomous crowd management
CN201194061Y (en) Interactive projection system
CN108615243A (en) The determination method, apparatus and system of three-dimensional multimedia messages
Zhang et al. Virtual reality aided high-quality 3D reconstruction by remote drones
US20200230488A1 (en) Pre-Visualization Device
CN103327385B (en) Based on single image sensor apart from recognition methods and device
US20200159339A1 (en) Desktop spatial stereoscopic interaction system
CN204808201U (en) Gesture recognition control system based on vision
TWM474185U (en) Remote hand gesture operating system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant