CN103945122A - Method for forming virtual window through pan-tilt camera and projector - Google Patents

Method for forming virtual window through pan-tilt camera and projector

Info

Publication number
CN103945122A
CN103945122A
Authority
CN
China
Prior art keywords
projector
controller
human body
video camera
pan-tilt camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410130607.1A
Other languages
Chinese (zh)
Other versions
CN103945122B (en)
Inventor
出口雅晴
冲满
陈晟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
McSail Digital Image (China) Co., Ltd.
Original Assignee
Hitachi Digital Products China Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Digital Products China Co Ltd filed Critical Hitachi Digital Products China Co Ltd
Priority to CN201410130607.1A priority Critical patent/CN103945122B/en
Publication of CN103945122A publication Critical patent/CN103945122A/en
Application granted granted Critical
Publication of CN103945122B publication Critical patent/CN103945122B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Controls And Circuits For Display Device (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention discloses a method for forming a virtual window through a pan-tilt camera and a projector. A controller, the pan-tilt camera, the projector, and a human body position sensor are first arranged. The controller is provided with two control interfaces and a USB interface; the pan-tilt camera and the projector are connected to the two control interfaces through serial communication lines, and the human body position sensor is connected to the USB interface through a USB cable. The pan-tilt camera is provided with a video output interface, to which the projector is connected through a video transmission line. The method includes setting the control command set and control parameters of each device, controlling the pan-tilt camera to shoot video, superimposing a built-in window frame pattern onto the image signal in the projector, and projecting the result onto a wall surface to form a virtual window. With this method, a window whose projected viewing angle changes with the observer's position can be formed on a wall, making the wall appear transparent.

Description

Method for realizing a virtual window using a pan-tilt camera and a projector
Technical field
The present invention relates to a method for realizing a virtual window using a pan-tilt camera and a projector.
Background technology
With the development of science and technology, various new electronic devices have entered people's work and daily life. As a video capture device and a large-format display output device respectively, the pan-tilt camera and the projector are widely used in security, conferencing, education, and home theater.
A room that has no view, or in which a window cannot be installed, is inevitably oppressive and dull. An existing solution is to install a display device such as a television to play video content. A camera can additionally be installed so that the display shows what the camera captures (e.g., a surveillance picture). However, none of these approaches gives the viewer (observer) the sense of presence of a real window. For example, when observing the outside through a real window, the viewing angle changes with the observer's position relative to the window, so the scenery seen differs from position to position, as shown in Fig. 1.
Summary of the invention
The technical problem to be solved by the present invention is to provide a method for realizing a virtual window using a pan-tilt camera and a projector, capable of forming a virtual window on an indoor wall.
The present invention solves the above technical problem through the following technical scheme: a method for realizing a virtual window using a pan-tilt camera and a projector. First, a controller, a pan-tilt camera, a projector, and a human body position sensor are configured. The controller has two control ports and a USB interface; the pan-tilt camera and the projector are connected to the two control ports via serial communication lines, and the human body position sensor is connected to the USB interface via a USB cable. The pan-tilt camera has a video output interface, to which the projector is connected via a video transmission line. The method comprises the following steps:
Step 1: The controller exchanges information with the pan-tilt camera, the projector, and the human body position sensor connected at their respective ports and obtains each device's model, thereby setting the control command set and control parameters for each of the three devices.
Step 2: According to the control command set and control parameters, the controller sends instructions over the serial communication line to reset the pan-tilt camera to a preset position and start shooting; the captured view is converted into an image signal.
Step 3: The pan-tilt camera outputs the image signal through its video output interface to a designated video port of the projector. The controller sends instructions over the serial communication line to start the projector and switch it to the designated video port. The projector then superimposes a built-in window frame pattern onto the image signal and projects the result onto the wall surface to form a virtual window.
Step 4: The human body position sensor acquires environmental data within its detection range and passes it to the controller. The environmental data indicates whether the detection range is unoccupied or contains a single person. The controller analyzes the environmental data and determines which is the case. If the detection range is unoccupied, go to Step 5; if it contains a single person, go to Step 6.
Step 5: The human body position sensor keeps detecting. If no human body enters the detection range within a preset time, the controller sends instructions to switch the pan-tilt camera and the projector into standby.
Step 6:
A. The human body position sensor obtains the single person's current position and sends it to the controller. Based on this position, the controller adjusts the pan-tilt camera's lens direction and zoom ratio to shoot from the corresponding viewing angle, while the virtual window presents an image matching the camera's viewing angle.
B. The human body position sensor continuously tracks the person's movement and constantly feeds position updates back to the controller. Based on the person's changes in position, the controller adjusts the camera's lens orientation and zoom ratio in real time so as to shoot from the person's simulated viewing angle, and the captured scene is passed to the projector in real time; the projector projects the scene, superimposed with the window frame pattern, onto the wall.
Further, the environmental data also indicates whether there are multiple people within the detection range of the human body position sensor, and Step 4 further comprises:
The controller analyzes the environmental data and determines whether there are multiple people within the detection range; if so, go to Step 7.
Step 7: The controller designates one of the multiple people as the main observer according to a preset rule, and then performs the following operations:
A. The human body position sensor obtains the main observer's current position and sends it to the controller. Based on this position, the controller adjusts the pan-tilt camera's lens direction and zoom ratio to shoot from the corresponding viewing angle, while the virtual window presents an image matching the camera's viewing angle.
B. The human body position sensor continuously tracks the main observer's movement and constantly feeds position updates back to the controller. Based on the main observer's changes in position, the controller adjusts the camera's lens orientation and zoom ratio in real time so as to shoot from the main observer's simulated viewing angle, and the captured scene is passed to the projector in real time; the projector projects the scene, superimposed with the window frame pattern, onto the wall.
Further, the rule in Step 7 is one of the following: take the person who moves most actively as the main observer; or take the person nearest to a preset position as the main observer; or compute the median of the multiple people's positions and take the person located at, or nearest to, the median as the main observer.
The beneficial effect of the present invention is: using a pan-tilt camera and a projector, a window picture whose viewing angle changes with the observer's position is projected onto the wall, thereby making the wall appear transparent.
Brief description of the drawings
The invention is further described below with reference to the accompanying drawings and embodiments.
Fig. 1 is a schematic diagram of the existing human visual field.
Fig. 2 is a structural schematic diagram of the simulated window system of the invention.
Fig. 3 is a flow chart of the method for realizing a virtual window using a pan-tilt camera and a projector according to the present invention.
Embodiment
Referring to Figs. 2-3, Fig. 2 is a structural schematic diagram of the simulated window system of the invention. In this method for realizing a virtual window using a pan-tilt camera and a projector, a controller, a pan-tilt camera, a projector, and a human body position sensor are first configured. The controller has two control ports and a USB interface; the pan-tilt camera and the projector are connected to the two control ports via serial communication lines, and the human body position sensor is connected to the USB interface via a USB cable. The pan-tilt camera has a video output interface, to which the projector is connected via a video transmission line. The method comprises the following steps:
Step 1: The controller exchanges information with the pan-tilt camera, the projector, and the human body position sensor connected at their respective ports and obtains each device's model, thereby setting the control command set and control parameters for each of the three devices.
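Step 1's model-to-command-set selection can be sketched as below. This is only an illustration: the device names, model prefixes, command-set labels, and parameters are hypothetical, and the patent does not specify any particular protocol (real pan-tilt cameras typically speak a vendor serial protocol such as Pelco-D or VISCA).

```python
# Hypothetical sketch of Step 1: the controller queries each port for a
# device model string and selects a matching command set and parameters.
COMMAND_SETS = {
    # model prefix -> (command set name, default control parameters)
    "PTZ-": ("ptz_serial_v1", {"baud": 9600, "preset": 1}),
    "PRJ-": ("projector_serial_v1", {"baud": 9600, "input": "HDMI1"}),
    "HBS-": ("body_sensor_usb_v1", {"poll_hz": 30}),
}

def select_command_set(model: str):
    """Map a reported device model to a command set and parameters."""
    for prefix, (cmd_set, params) in COMMAND_SETS.items():
        if model.startswith(prefix):
            return cmd_set, dict(params)
    raise ValueError(f"unknown device model: {model}")

def handshake(ports):
    """ports maps each port name to the model string its device reports."""
    return {port: select_command_set(model) for port, model in ports.items()}
```

With three devices reporting their models, `handshake` yields one (command set, parameters) pair per port, which is all the later steps need.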
Step 2: According to the control command set and control parameters, the controller sends instructions over the serial communication line to reset the pan-tilt camera to a preset position (i.e., the lens faces a default direction, with parameters such as a preset zoom ratio) and start shooting; the captured view is converted into an image signal.
Step 3: The pan-tilt camera outputs the image signal through its video output interface to a designated video port of the projector. The controller sends instructions over the serial communication line to start the projector and switch it to the designated video port. The projector then superimposes a built-in window frame pattern onto the image signal and projects the result onto the wall surface to form a virtual window.
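The superposition in Step 3 can be understood as alpha compositing: the window frame pattern is opaque where the frame is, transparent where the "glass" is. The patent leaves the compositing to the projector's built-in function; the following is only a minimal per-pixel sketch (flattened grayscale pixels for brevity):

```python
def overlay_window_frame(image, frame_pattern, alpha_mask):
    """Composite a window-frame pattern over the camera image.

    image, frame_pattern: sequences of pixel values (grayscale here for
    simplicity); alpha_mask: per-pixel opacity of the frame in [0, 1]
    (1 = solid frame, 0 = see-through 'glass' showing the camera image).
    """
    assert len(image) == len(frame_pattern) == len(alpha_mask)
    return [a * f + (1 - a) * p
            for p, f, a in zip(image, frame_pattern, alpha_mask)]
```

Where the mask is 1 the projected pixel comes from the frame pattern; where it is 0 the camera image shows through, producing the window effect.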
Step 4: The human body position sensor acquires environmental data within its detection range and passes it to the controller. The environmental data indicates whether the detection range is unoccupied, contains a single person, or contains multiple people. The controller analyzes the environmental data and determines which case applies. If the detection range is unoccupied, go to Step 5; if it contains a single person, go to Step 6; if it contains multiple people, go to Step 7.
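Step 4's three-way decision is a simple dispatch on the occupancy count. A minimal sketch (representing the environmental data as a list of detected person positions, an assumption not fixed by the patent):

```python
def dispatch(people):
    """Route Step 4's occupancy decision to the follow-up step.

    people: list of detected person positions (empty = unoccupied).
    Returns the step number the controller should execute next.
    """
    if len(people) == 0:
        return 5   # unoccupied -> standby check (Step 5)
    if len(people) == 1:
        return 6   # single person -> track that person (Step 6)
    return 7       # multiple people -> pick a main observer (Step 7)
```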
Step 5: The human body position sensor keeps detecting. If no human body enters the detection range within a preset time, the controller sends instructions to switch the pan-tilt camera and the projector into standby.
Step 6:
A. The human body position sensor obtains the single person's current position and sends it to the controller. Based on this position, the controller adjusts the pan-tilt camera's lens direction and zoom ratio to shoot from the corresponding viewing angle, while the virtual window presents an image matching the camera's viewing angle.
B. The human body position sensor continuously tracks the person's movement and constantly feeds position updates back to the controller. Based on the person's changes in position, the controller adjusts the camera's lens orientation and zoom ratio in real time so as to shoot from the person's simulated viewing angle, and the captured scene is passed to the projector in real time; the projector projects the scene, superimposed with the window frame pattern, onto the wall.
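One way to realize Step 6-B's mapping from observer position to lens orientation is to aim the camera along the observer's sight line through the window centre, narrowing the field of view as the observer steps back (a real window subtends a smaller angle from farther away). The following top-view 2-D sketch is an assumption, not taken from the patent: the coordinate convention, zoom law, and parameter values are all illustrative.

```python
import math

def camera_pose_for_observer(x, z, base_fov_deg=60.0, ref_dist=2.0):
    """Top-view sketch: pan angle and field of view for one observer.

    The virtual window sits at the origin of the wall plane; the observer
    stands at lateral offset x and distance z (> 0) from the wall.  The
    camera pans to continue the observer's sight line through the window
    centre, and the field of view shrinks in proportion to distance
    beyond ref_dist, mimicking how a real window looks from farther away.
    """
    pan_deg = math.degrees(math.atan2(-x, z))   # sight line through centre
    zoom = max(1.0, z / ref_dist)               # no zoom-out below ref_dist
    fov_deg = base_fov_deg / zoom
    return pan_deg, fov_deg
```

The controller would re-run this on every position update from the sensor and send the resulting pan/zoom values to the camera over the serial line.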
Step 7: The controller designates one of the multiple people as the main observer according to a preset rule, and then performs the following operations:
A. The human body position sensor obtains the main observer's current position and sends it to the controller. Based on this position, the controller adjusts the pan-tilt camera's lens direction and zoom ratio to shoot from the corresponding viewing angle, while the virtual window presents an image matching the camera's viewing angle.
B. The human body position sensor continuously tracks the main observer's movement and constantly feeds position updates back to the controller. Based on the main observer's changes in position, the controller adjusts the camera's lens orientation and zoom ratio in real time so as to shoot from the main observer's simulated viewing angle, and the captured scene is passed to the projector in real time; the projector projects the scene, superimposed with the window frame pattern, onto the wall.
The rule in Step 7 is one of the following: take the person who moves most actively as the main observer; or take the person nearest to a preset position as the main observer; or compute the median of the multiple people's positions and take the person located at, or nearest to, the median as the main observer.
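The three selection rules above can be sketched as follows. The track representation (accumulated path length as the "activity" measure, 2-D positions) is an assumption for illustration; the patent does not define how activity is quantified.

```python
import statistics

def main_observer(tracks, rule="median", preset=(0.0, 0.0)):
    """Pick the main observer from several tracked people (Step 7's rules).

    tracks: dict person_id -> {"pos": (x, y), "path_len": distance moved}.
    rule: "most_active", "nearest_preset", or "median".
    """
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    if rule == "most_active":        # person who has moved the most
        return max(tracks, key=lambda p: tracks[p]["path_len"])
    if rule == "nearest_preset":     # person nearest a preset position
        return min(tracks, key=lambda p: dist(tracks[p]["pos"], preset))
    # "median": coordinate-wise median of all positions, then nearest person
    xs = [t["pos"][0] for t in tracks.values()]
    ys = [t["pos"][1] for t in tracks.values()]
    centre = (statistics.median(xs), statistics.median(ys))
    return min(tracks, key=lambda p: dist(tracks[p]["pos"], centre))
```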
Using a pan-tilt camera and a projector, the present invention projects onto the wall a window picture whose viewing angle changes with the observer's position, thereby making the wall appear transparent and giving the observer the immersive experience of looking through a real window.

Claims (3)

1. A method for realizing a virtual window using a pan-tilt camera and a projector, characterized in that: a controller, a pan-tilt camera, a projector, and a human body position sensor are first configured; the controller has two control ports and a USB interface; the pan-tilt camera and the projector are connected to the two control ports via serial communication lines, and the human body position sensor is connected to the USB interface via a USB cable; the pan-tilt camera has a video output interface, to which the projector is connected via a video transmission line; the method comprises the following steps:
Step 1: The controller exchanges information with the pan-tilt camera, the projector, and the human body position sensor connected at their respective ports and obtains each device's model, thereby setting the control command set and control parameters for each of the three devices;
Step 2: According to the control command set and control parameters, the controller sends instructions over the serial communication line to reset the pan-tilt camera to a preset position and start shooting; the captured view is converted into an image signal;
Step 3: The pan-tilt camera outputs the image signal through its video output interface to a designated video port of the projector; the controller sends instructions over the serial communication line to start the projector and switch it to the designated video port; the projector then superimposes a built-in window frame pattern onto the image signal and projects the result onto the wall surface to form a virtual window;
Step 4: The human body position sensor acquires environmental data within its detection range and passes it to the controller; the environmental data indicates whether the detection range is unoccupied or contains a single person; the controller analyzes the environmental data and determines which is the case; if the detection range is unoccupied, go to Step 5; if it contains a single person, go to Step 6;
Step 5: The human body position sensor keeps detecting; if no human body enters the detection range within a preset time, the controller sends instructions to switch the pan-tilt camera and the projector into standby;
Step 6:
A. The human body position sensor obtains the single person's current position and sends it to the controller; based on this position, the controller adjusts the pan-tilt camera's lens direction and zoom ratio to shoot from the corresponding viewing angle, while the virtual window presents an image matching the camera's viewing angle;
B. The human body position sensor continuously tracks the person's movement and constantly feeds position updates back to the controller; based on the person's changes in position, the controller adjusts the camera's lens orientation and zoom ratio in real time so as to shoot from the person's simulated viewing angle, and the captured scene is passed to the projector in real time; the projector projects the scene, superimposed with the window frame pattern, onto the wall.
2. The method for realizing a virtual window using a pan-tilt camera and a projector according to claim 1, characterized in that: the environmental data also indicates whether there are multiple people within the detection range of the human body position sensor, and Step 4 further comprises:
The controller analyzes the environmental data and determines whether there are multiple people within the detection range; if so, go to Step 7;
Step 7: The controller designates one of the multiple people as the main observer according to a preset rule, and then performs the following operations:
A. The human body position sensor obtains the main observer's current position and sends it to the controller; based on this position, the controller adjusts the pan-tilt camera's lens direction and zoom ratio to shoot from the corresponding viewing angle, while the virtual window presents an image matching the camera's viewing angle;
B. The human body position sensor continuously tracks the main observer's movement and constantly feeds position updates back to the controller; based on the main observer's changes in position, the controller adjusts the camera's lens orientation and zoom ratio in real time so as to shoot from the main observer's simulated viewing angle, and the captured scene is passed to the projector in real time; the projector projects the scene, superimposed with the window frame pattern, onto the wall.
3. The method for realizing a virtual window using a pan-tilt camera and a projector according to claim 2, characterized in that the rule in Step 7 is one of the following: take the person who moves most actively as the main observer; or take the person nearest to a preset position as the main observer; or compute the median of the multiple people's positions and take the person located at, or nearest to, the median as the main observer.
CN201410130607.1A 2014-04-02 2014-04-02 Method for realizing a virtual window using a pan-tilt camera and a projector Active CN103945122B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410130607.1A CN103945122B (en) 2014-04-02 2014-04-02 Method for realizing a virtual window using a pan-tilt camera and a projector

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410130607.1A CN103945122B (en) 2014-04-02 2014-04-02 Method for realizing a virtual window using a pan-tilt camera and a projector

Publications (2)

Publication Number Publication Date
CN103945122A true CN103945122A (en) 2014-07-23
CN103945122B CN103945122B (en) 2017-04-05

Family

ID=51192581

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410130607.1A Active CN103945122B (en) 2014-04-02 2014-04-02 Method for realizing a virtual window using a pan-tilt camera and a projector

Country Status (1)

Country Link
CN (1) CN103945122B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106303246A (en) * 2016-08-23 2017-01-04 刘永锋 Real-time video acquisition method based on virtual reality
CN108153103A (en) * 2017-12-30 2018-06-12 宁波高新区若水智创科技有限公司 Synchronized-following stereo imaging system and method for sand painting creation
CN108845775A (en) * 2018-05-30 2018-11-20 王玉龙 A virtual landscape window
CN111314663A (en) * 2020-02-28 2020-06-19 青岛海信智慧家居系统股份有限公司 Intelligent virtual window system based on 5G
US10891920B2 (en) 2018-08-15 2021-01-12 Au Optronics Corporation Scenario projection system and controlling method thereof

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1373560A (en) * 2001-02-28 2002-10-09 黄宝儿 In-situ electronic scenery
CN1588992A (en) * 2004-10-21 2005-03-02 上海交通大学 Entertainment system for video frequency real time synthesizing and recording
US20130235351A1 (en) * 2012-03-07 2013-09-12 GM Global Technology Operations LLC Virtual convertible tops, sunroofs, and back windows, and systems and methods for providing same


Also Published As

Publication number Publication date
CN103945122B (en) 2017-04-05

Similar Documents

Publication Publication Date Title
CN105681656B (en) System and method for bullet time shooting
CN103945122A (en) Method for forming virtual window through pan-tilt camera and projector
EP2978217A1 (en) Display device and visual display method for simulating holographic 3d scene
US20100271394A1 (en) System and method for merging virtual reality and reality to provide an enhanced sensory experience
US20190317490A1 (en) Control method, device, and remote control for vr apparatus
CN204741528U Three-dimensional immersive somatosensory intelligent controller
CN105684415A (en) Spherical omnidirectional video-shooting system
CN110324553B (en) Live-action window system based on video communication
EP2413607A3 (en) Mobile terminal and method of controlling a three-dimensional image therein
US10623698B2 (en) Video communication device and method for video communication
JP2008005450A (en) Method of grasping and controlling real-time status of video camera utilizing three-dimensional virtual space
CN106327583A (en) Virtual reality equipment for realizing panoramic image photographing and realization method thereof
EP3465631B1 (en) Capturing and rendering information involving a virtual environment
US10972699B2 (en) Video communication device and method for video communication
US10645340B2 (en) Video communication device and method for video communication
KR20110006976A (en) Mutimedia syncronization control system for display space from experience space
CN105991972A (en) Aerial photographing system
CN205754586U System for coordinated image shooting
JP7287798B2 (en) Remote camera system, control system, video output method, virtual camera work system, and program
KR20150103528A (en) The apparatus and method of camera placement and display for free viewpoint video capture
CN203827437U (en) Simulated window system with changeable angle
CN207319495U Interactive optical panorama training platform and interactive optical panorama training room
CN102946508A (en) Panoramic video camera
CN115076561A (en) Tele-immersion type binocular holder follow-up system and method applied to engineering machinery
US9979930B2 (en) Head-wearable apparatus, 3D video call system and method for implementing 3D video call

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: Room 530, Management Committee Building, Fuxing Economic Development Zone, 22 Fuxing Avenue, Jinan District, Fuzhou City, Fujian Province

Patentee after: McSail Digital Image (China) Co., Ltd.

Address before: 350000 Innovation Building, Quian'an Extension Zone, Fuzhou Economic and Technological Development Zone, Fujian Province, 2nd Floor

Patentee before: Hitachi Global Storage Technologies (China) Co., Ltd.