CN103945122B - Method for realizing a virtual window using a pan-tilt camera and a projector - Google Patents
- Publication number
- CN103945122B (application CN201410130607.1A)
- Authority: CN (China)
- Prior art keywords
- projector
- video camera
- human body
- controller
- pan-tilt camera
- Prior art date: 2014-04-02
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Controls And Circuits For Display Device (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
A method for realizing a virtual window using a pan-tilt camera and a projector. A controller, a pan-tilt camera, a projector and a human-position sensor are first configured. The controller has two control ports and a USB interface; the pan-tilt camera and the projector are connected to the two control ports by serial communication lines, and the human-position sensor is connected to the USB interface by a USB cable. The pan-tilt camera has a video output interface, to which the projector is connected by a video transmission line. The method comprises: setting the control instruction set and control parameters of each device; controlling the pan-tilt camera to shoot; superimposing a built-in window-frame pattern on the image signal in the projector; and projecting the result onto a wall surface to form a virtual window. The present invention can form on a wall a window picture whose projected viewing angle changes with the observer's position, thereby making the wall appear transparent.
Description
Technical field
The present invention relates in particular to a method for realizing a virtual window using a pan-tilt camera and a projector.
Background technology
With the development of science and technology, a wide variety of new electronic devices have entered people's work and daily life. The pan-tilt camera and the projector, as a video-capture device and a large-format display device respectively, are widely used in security monitoring, conferencing, education and home cinema.
A room with no view, or one where a window cannot or should not be opened, inevitably feels confined and dull. An existing solution is to install a display device such as a television and play video content; a camera may additionally be fitted so that the display shows what the camera captures (for example a monitoring picture). None of these approaches, however, gives the viewer (the observer) the sense of presence that a real window provides. When looking out through a real window, the viewing angle changes as the observer's position relative to the window changes, so the scenery seen differs from position to position, as shown in Fig. 1.
The content of the invention
The technical problem to be solved by the present invention is to provide a method for realizing a virtual window using a pan-tilt camera and a projector, which can form a virtual window on an indoor wall.
The present invention solves the above technical problem with the following technical scheme: a method for realizing a virtual window using a pan-tilt camera and a projector. A controller, a pan-tilt camera, a projector and a human-position sensor are first configured. The controller has two control ports and a USB interface; the pan-tilt camera and the projector are connected to the two control ports by serial communication lines, and the human-position sensor is connected to the USB interface by a USB cable. The pan-tilt camera has a video output interface, to which the projector is connected by a video transmission line. The method specifically comprises the following steps:
Step 1: The controller exchanges information with the pan-tilt camera, the projector and the human-position sensor connected to its respective ports, and obtains the model of each device, so as to select the control instruction set and control parameters for each of the three devices;
Step 2: According to the control instruction set and control parameters, the controller sends instructions over the serial communication line to reset the pan-tilt camera to a preset position and start shooting, converting the scenery being shot into an image signal;
Step 3: The pan-tilt camera outputs the image signal through its video output interface to a designated video port of the projector. The controller sends instructions over the serial communication line to start the projector and switch it to that video port. The projector then superimposes a built-in window-frame pattern on the image signal and projects the result onto the wall surface to form a virtual window;
Step 4: The human-position sensor acquires environmental data within its detection range and passes it to the controller. The environmental data indicates whether there is nobody or a single person within the detection range. The controller analyses the environmental data and determines which case applies: if nobody is within the detection range, go to Step 5; if a single person is within the detection range, execute Step 6;
Step 5: The human-position sensor keeps detecting; if no human body enters the detection range within a preset time, the controller sends instructions to put the pan-tilt camera and the projector into a standby state;
Step 6:
A. The human-position sensor obtains the current position of the single person and sends it to the controller. According to this position, the controller adjusts the lens direction and zoom ratio of the pan-tilt camera so that it shoots from the corresponding viewing angle, and the virtual window presents the image according to the corresponding shooting angle of the pan-tilt camera;
B. The human-position sensor keeps tracking the person's motion and continuously feeds the position changes back to the controller. According to these changes, the controller adjusts the lens direction and zoom ratio of the pan-tilt camera in real time so as to shoot from the viewing angle that simulates the person's view; the captured scene is passed to the projector in real time, and the projector projects the scene, with the window-frame pattern superimposed, onto the wall.
Further, the environmental data also indicates whether there are multiple people within the detection range of the human-position sensor, and Step 4 further comprises the following:
The controller analyses the environmental data and determines whether there are multiple people within the detection range of the human-position sensor; if there are, execute Step 7.
Step 7: The controller designates one of the multiple people as the main observer according to a preset rule, and then performs the following operations:
A. The human-position sensor obtains the current position of the main observer and sends it to the controller. According to this position, the controller adjusts the lens direction and zoom ratio of the pan-tilt camera so that it shoots from the corresponding viewing angle, and the virtual window presents the image according to the corresponding shooting angle of the pan-tilt camera;
B. The human-position sensor keeps tracking the main observer's motion and continuously feeds the position changes back to the controller. According to these changes, the controller adjusts the lens direction and zoom ratio of the pan-tilt camera in real time so as to shoot from the viewing angle that simulates the main observer's view; the captured scene is passed to the projector in real time, and the projector projects the scene, with the window-frame pattern superimposed, onto the wall.
Further, the rule in Step 7 is one of the following: the person who moves most actively is the main observer; or the person nearest to a preset position is the main observer; or a median position is calculated from the distribution of the people's positions, and the person located at, or nearest to, that median position is the main observer.
The beneficial effect of the present invention is that, using a pan-tilt camera and a projector, a window picture whose projected viewing angle changes with the observer's position can be formed on a wall, thereby making the wall appear transparent.
Description of the drawings
The invention is further described below with reference to the accompanying drawings and embodiments.
Fig. 1 is a schematic diagram of the human field of view through an existing real window.
Fig. 2 is a structural schematic diagram of the simulated window system of the invention.
Fig. 3 is a flow chart of the method for realizing a virtual window using a pan-tilt camera and a projector according to the invention.
Specific embodiment
Referring to Figs. 2-3, Fig. 2 is a structural schematic diagram of the simulated window system of the invention. In the method for realizing a virtual window using a pan-tilt camera and a projector, a controller, a pan-tilt camera, a projector and a human-position sensor are first configured. The controller has two control ports and a USB interface; the pan-tilt camera and the projector are connected to the two control ports by serial communication lines, and the human-position sensor is connected to the USB interface by a USB cable. The pan-tilt camera has a video output interface, to which the projector is connected by a video transmission line. The method specifically comprises the following steps:
Step 1: The controller exchanges information with the pan-tilt camera, the projector and the human-position sensor connected to its respective ports, and obtains the model of each device, so as to select the control instruction set and control parameters for each of the three devices.
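For illustration only (the patent does not disclose any concrete command set or port assignment), Step 1 can be pictured as a small device-discovery routine; the registry, the port names and the `query_model` helper below are assumptions, not part of the original disclosure.

```python
from dataclasses import dataclass

# Hypothetical registry mapping a reported device model to the control
# instruction set and control parameters chosen for it (assumption).
INSTRUCTION_SETS = {
    "PTZ-1000": {"protocol": "pelco_d", "baud": 9600, "address": 1},
    "PJ-2000":  {"protocol": "vendor_rs232", "baud": 9600},
    "HPS-10":   {"protocol": "usb_hid"},
}

@dataclass
class DeviceConfig:
    port: str     # serial port / USB endpoint the device is attached to
    model: str    # model string reported by the device
    params: dict  # instruction set and control parameters selected for it

def identify(port: str, query_model) -> DeviceConfig:
    """Exchange information with the device on `port`, obtain its model,
    and select its control instruction set and parameters (Step 1)."""
    model = query_model(port)           # e.g. a vendor "report model" command
    return DeviceConfig(port, model, INSTRUCTION_SETS[model])

# Illustrative usage (port names are placeholders):
# camera    = identify("/dev/ttyS0", query_model)
# projector = identify("/dev/ttyS1", query_model)
# sensor    = identify("usb:0", query_model)
```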
Step 2: According to the control instruction set and control parameters, the controller sends instructions over the serial communication line to reset the pan-tilt camera to a preset position (i.e. the lens faces a default direction with a default zoom ratio) and start shooting, converting the scenery being shot into an image signal.
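The patent does not name the PTZ control protocol. Purely as a hedged sketch, the reset-to-preset command of Step 2 could be issued over the serial line roughly as follows, assuming a Pelco-D-compatible camera and the pyserial library (both assumptions).

```python
import serial  # pyserial

def pelco_d_frame(address: int, cmd1: int, cmd2: int,
                  data1: int, data2: int) -> bytes:
    """Build a 7-byte Pelco-D frame: sync, address, command, data, checksum."""
    checksum = (address + cmd1 + cmd2 + data1 + data2) % 256
    return bytes([0xFF, address, cmd1, cmd2, data1, data2, checksum])

def goto_preset(port: serial.Serial, address: int, preset: int) -> None:
    """Pelco-D 'go to preset' (command2 = 0x07, data2 = preset number)."""
    port.write(pelco_d_frame(address, 0x00, 0x07, 0x00, preset))

# Step 2 (illustrative): reset the camera to preset position 1.
# ptz = serial.Serial("/dev/ttyS0", baudrate=9600, timeout=1)
# goto_preset(ptz, address=1, preset=1)
```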
Step 3: The pan-tilt camera outputs the image signal through its video output interface to a designated video port of the projector. The controller sends instructions over the serial communication line to start the projector and switch it to that video port. The projector then superimposes the built-in window-frame pattern on the image signal and projects the result onto the wall surface to form a virtual window.
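Projector serial command sets are vendor-specific and the patent names none; the command strings below (`PWR ON`, `SRC HDMI1`, `OSD FRAME 1`) are made up solely to illustrate the control flow of Step 3: power on, switch to the camera's video port, and enable the built-in window-frame overlay.

```python
import serial  # pyserial

# Placeholder vendor commands -- real projectors each define their own RS-232
# protocol; these strings are illustrative assumptions only.
POWER_ON     = b"PWR ON\r"
SELECT_INPUT = b"SRC HDMI1\r"   # the video port fed by the pan-tilt camera
FRAME_ON     = b"OSD FRAME 1\r" # superimpose the built-in window-frame pattern

def start_projector(port: serial.Serial) -> None:
    """Step 3: start the projector, select the designated video port and
    enable the window-frame pattern superimposed on the image signal."""
    for command in (POWER_ON, SELECT_INPUT, FRAME_ON):
        port.write(command)

# proj = serial.Serial("/dev/ttyS1", baudrate=9600, timeout=1)
# start_projector(proj)
```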
Step 4: The human-position sensor acquires environmental data within its detection range and passes it to the controller. The environmental data indicates whether there is nobody, a single person or multiple people within the detection range. The controller analyses the environmental data and determines which case applies: if nobody is within the detection range, go to Step 5; if a single person is within the detection range, execute Step 6; if multiple people are within the detection range, execute Step 7.
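The branch in Step 4 reduces to counting the people reported by the human-position sensor. A minimal dispatch sketch (the position-list format is an assumption) could be:

```python
def dispatch(people: list[tuple[float, float, float]]) -> int:
    """Map the sensor's environmental data (assumed here to be a list of
    detected (x, y, z) person positions) to the next step of the method."""
    if len(people) == 0:
        return 5   # nobody in the detection range -> Step 5 (standby logic)
    if len(people) == 1:
        return 6   # a single person in range      -> Step 6 (track that person)
    return 7       # multiple people in range      -> Step 7 (pick a main observer)
```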
Step 5: The human-position sensor keeps detecting; if no human body enters the detection range within a preset time, the controller sends instructions to put the pan-tilt camera and the projector into a standby state.
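Neither the preset time nor the standby commands are specified in the patent; the timeout logic itself might be sketched as below, where `read_people`, `send_standby` and the 300-second timeout are illustrative assumptions.

```python
import time

def wait_for_person(read_people, send_standby, timeout_s: float = 300.0) -> bool:
    """Step 5: keep detecting; if no human body enters the detection range
    within `timeout_s` seconds, put the camera and projector into standby."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if read_people():          # sensor reports at least one person
            return True            # resume normal operation (back to Step 4)
        time.sleep(0.2)            # polling interval (assumption)
    send_standby()                 # pan-tilt camera + projector -> standby state
    return False
```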
Step 6:
A. The human-position sensor obtains the current position of the single person and sends it to the controller. According to this position, the controller adjusts the lens direction and zoom ratio of the pan-tilt camera so that it shoots from the corresponding viewing angle, and the virtual window presents the image according to the corresponding shooting angle of the pan-tilt camera;
B. The human-position sensor keeps tracking the person's motion and continuously feeds the position changes back to the controller. According to these changes, the controller adjusts the lens direction and zoom ratio of the pan-tilt camera in real time so as to shoot from the viewing angle that simulates the person's view; the captured scene is passed to the projector in real time, and the projector projects the scene, with the window-frame pattern superimposed, onto the wall.
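The patent does not give the mapping from observer position to lens direction and zoom ratio. One plausible geometric sketch points the camera along the line from the observer through the window centre and widens the view as the observer approaches the window; the coordinate frame, window width and zoom model below are all assumptions made for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    pan_deg: float    # horizontal lens direction
    tilt_deg: float   # vertical lens direction
    zoom: float       # zoom ratio

def observer_to_pose(x: float, y: float, z: float,
                     window_w: float = 1.2) -> Pose:
    """Map an observer position (metres; window centre at the origin,
    z = distance from the wall) to a camera pose that simulates looking
    out through the window from that position (illustrative model only)."""
    pan = math.degrees(math.atan2(-x, z))     # aim 'through' the window centre
    tilt = math.degrees(math.atan2(-y, z))
    # Closer to the window -> wider apparent field of view -> lower zoom.
    fov = 2.0 * math.degrees(math.atan2(window_w / 2.0, z))
    zoom = max(1.0, 60.0 / fov)               # 60-degree wide end assumed
    return Pose(pan, tilt, zoom)

def track(read_position, move_camera) -> None:
    """Step 6B: feed the person's position changes to the camera in real time."""
    while True:
        pos = read_position()      # (x, y, z) from the human-position sensor
        if pos is None:            # the person left the detection range
            break                  # fall back to the Step 4 / Step 5 handling
        move_camera(observer_to_pose(*pos))
```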
Step 7: The controller designates one of the multiple people as the main observer according to a preset rule, and then performs the following operations:
A. The human-position sensor obtains the current position of the main observer and sends it to the controller. According to this position, the controller adjusts the lens direction and zoom ratio of the pan-tilt camera so that it shoots from the corresponding viewing angle, and the virtual window presents the image according to the corresponding shooting angle of the pan-tilt camera;
B. The human-position sensor keeps tracking the main observer's motion and continuously feeds the position changes back to the controller. According to these changes, the controller adjusts the lens direction and zoom ratio of the pan-tilt camera in real time so as to shoot from the viewing angle that simulates the main observer's view; the captured scene is passed to the projector in real time, and the projector projects the scene, with the window-frame pattern superimposed, onto the wall.
The rule in Step 7 is one of the following: the person who moves most actively is the main observer; or the person nearest to a preset position is the main observer; or a median position is calculated from the distribution of the people's positions, and the person located at, or nearest to, that median position is the main observer.
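The three selection rules are easy to state over the sensor's position list. The sketch below shows the "nearest to a preset position" and "nearest to the median position" variants; the position format and the preset coordinates are assumptions, and the activity-based variant would additionally require a per-person motion history that the patent leaves unspecified.

```python
import statistics

def nearest_to(people: list[tuple[float, float, float]],
               target: tuple[float, float, float]) -> int:
    """Index of the person whose position is closest to `target`."""
    return min(range(len(people)),
               key=lambda i: sum((a - b) ** 2
                                 for a, b in zip(people[i], target)))

def main_observer_by_preset(people, preset=(0.0, 0.0, 2.0)) -> int:
    """Rule: the person nearest to a preset position is the main observer."""
    return nearest_to(people, preset)

def main_observer_by_median(people) -> int:
    """Rule: compute the median of the distribution of positions and take
    the person located at, or nearest to, that median position."""
    median = tuple(statistics.median(p[k] for p in people) for k in range(3))
    return nearest_to(people, median)
```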
Using a pan-tilt camera and a projector, the present invention forms on a wall a window picture whose projected viewing angle changes with the observer's position, thereby making the wall appear transparent and giving the observer the sense of presence of looking out through a real window.
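Tying the steps together, the controller's top-level behaviour can be pictured as the loop below. It is only a sketch of the control flow described above; `wait_for_person` and `observer_to_pose` are the illustrative helpers from the earlier snippets, and the remaining callables stand in for device-specific code.

```python
def controller_loop(read_people, move_camera, send_standby, pick_main_observer):
    """Illustrative top-level control flow for Steps 4-7 (all helpers assumed)."""
    while True:
        people = read_people()                          # Step 4: environmental data
        if not people:                                  # nobody in the detection range
            wait_for_person(read_people, send_standby)  # Step 5 (may enter standby)
        elif len(people) == 1:                          # Step 6: follow the single person
            move_camera(observer_to_pose(*people[0]))
        else:                                           # Step 7: follow the main observer
            main = pick_main_observer(people)           # e.g. main_observer_by_median
            move_camera(observer_to_pose(*people[main]))
```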
Claims (3)
1. A method for realizing a virtual window using a pan-tilt camera and a projector, characterized in that: a controller, a pan-tilt camera, a projector and a human-position sensor are first configured; the controller has two control ports and a USB interface; the pan-tilt camera and the projector are connected to the two control ports by serial communication lines, and the human-position sensor is connected to the USB interface by a USB cable; the pan-tilt camera has a video output interface, to which the projector is connected by a video transmission line; the method specifically comprises:
Step 1: the controller exchanges information with the pan-tilt camera, the projector and the human-position sensor connected to its respective ports, and obtains the model of each device, so as to select the control instruction set and control parameters for each of the three devices;
Step 2: according to the control instruction set and control parameters, the controller sends instructions over the serial communication line to reset the pan-tilt camera to a preset position and start shooting, converting the scenery being shot into an image signal;
Step 3: the pan-tilt camera outputs the image signal through its video output interface to a designated video port of the projector; the controller sends instructions over the serial communication line to start the projector and switch it to that video port; the projector then superimposes a built-in window-frame pattern on the image signal and projects the result onto the wall surface to form a virtual window;
Step 4: the human-position sensor acquires environmental data within its detection range and passes it to the controller; the environmental data indicates whether there is nobody or a single person within the detection range; the controller analyses the environmental data and determines which case applies; if nobody is within the detection range, go to Step 5; if a single person is within the detection range, execute Step 6;
Step 5: the human-position sensor keeps detecting; if no human body enters the detection range within a preset time, the controller sends instructions to put the pan-tilt camera and the projector into a standby state;
Step 6: comprising Step 61 and Step 62, executed in sequence:
Step 61: the human-position sensor obtains the current position of the single person and sends it to the controller; according to this position, the controller adjusts the lens direction and zoom ratio of the pan-tilt camera so that it shoots from the corresponding viewing angle, and the virtual window presents the image according to the corresponding shooting angle of the pan-tilt camera;
Step 62: the human-position sensor keeps tracking the person's motion and continuously feeds the position changes back to the controller; according to these changes, the controller adjusts the lens direction and zoom ratio of the pan-tilt camera in real time so as to shoot from the viewing angle that simulates the person's view; the captured scene is passed to the projector in real time, and the projector projects the scene, with the window-frame pattern superimposed, onto the wall.
2. The method for realizing a virtual window using a pan-tilt camera and a projector according to claim 1, characterized in that: the environmental data also indicates whether there are multiple people within the detection range of the human-position sensor, and Step 4 further comprises:
the controller analyses the environmental data and determines whether there are multiple people within the detection range of the human-position sensor; if there are, execute Step 7;
Step 7: the controller designates one of the multiple people as the main observer according to a preset rule, and then performs the following operations:
A. the human-position sensor obtains the current position of the main observer and sends it to the controller; according to this position, the controller adjusts the lens direction and zoom ratio of the pan-tilt camera so that it shoots from the corresponding viewing angle, and the virtual window presents the image according to the corresponding shooting angle of the pan-tilt camera;
B. the human-position sensor keeps tracking the main observer's motion and continuously feeds the position changes back to the controller; according to these changes, the controller adjusts the lens direction and zoom ratio of the pan-tilt camera in real time so as to shoot from the viewing angle that simulates the main observer's view; the captured scene is passed to the projector in real time, and the projector projects the scene, with the window-frame pattern superimposed, onto the wall.
3. The method for realizing a virtual window using a pan-tilt camera and a projector according to claim 2, characterized in that: the rule in Step 7 is one of the following: the person who moves most actively is the main observer; or the person nearest to a preset position is the main observer; or a median position is calculated from the distribution of the people's positions, and the person located at, or nearest to, that median position is the main observer.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410130607.1A CN103945122B (en) | 2014-04-02 | 2014-04-02 | Method for realizing a virtual window using a pan-tilt camera and a projector |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103945122A CN103945122A (en) | 2014-07-23 |
CN103945122B true CN103945122B (en) | 2017-04-05 |
Family
ID=51192581
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410130607.1A Active CN103945122B (en) | 2014-04-02 | 2014-04-02 | Method for realizing a virtual window using a pan-tilt camera and a projector |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103945122B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106303246A (en) * | 2016-08-23 | 2017-01-04 | 刘永锋 | Real-time video acquisition methods based on Virtual Realization |
CN108153103A (en) * | 2017-12-30 | 2018-06-12 | 宁波高新区若水智创科技有限公司 | One planting sand draws synchronizing for creation and follows stereo imaging system and method |
CN108845775A (en) * | 2018-05-30 | 2018-11-20 | 王玉龙 | A kind of virtual landscape window |
TWI685252B (en) | 2018-08-15 | 2020-02-11 | 友達光電股份有限公司 | Scenario projection system and controlling method thereof |
CN111314663A (en) * | 2020-02-28 | 2020-06-19 | 青岛海信智慧家居系统股份有限公司 | Intelligent virtual window system based on 5G |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1373560A (en) * | 2001-02-28 | 2002-10-09 | 黄宝儿 | In-situ electronic scenery |
CN1588992A (en) * | 2004-10-21 | 2005-03-02 | 上海交通大学 | Entertainment system for video frequency real time synthesizing and recording |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8733938B2 (en) * | 2012-03-07 | 2014-05-27 | GM Global Technology Operations LLC | Virtual convertible tops, sunroofs, and back windows, and systems and methods for providing same |
- 2014-04-02: Application CN201410130607.1A filed in China; granted as CN103945122B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN103945122A (en) | 2014-07-23 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CP03 | Change of name, title or address | Address after: Room 530, Management Committee Building, Fuxing Economic Development Zone, 22 Fuxing Avenue, Jin'an District, Fuzhou City, Fujian Province. Patentee after: McSail Digital Image (China) Co., Ltd. Address before: 2nd Floor, Innovation Building, Quian'an Extension Zone, Fuzhou Economic and Technological Development Zone, Fujian Province, 350000. Patentee before: Hitachi Global Storage Technologies (China) Co., Ltd. |