US20010056574A1 - VTV system - Google Patents

VTV system

Info

Publication number
US20010056574A1
Authority
US
United States
Prior art keywords
electronic device
video
image
panoramic
hmd
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09/891,733
Other languages
English (en)
Inventor
Angus Richards
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US09/891,733 priority Critical patent/US20010056574A1/en
Priority to PCT/US2001/049287 priority patent/WO2003001803A1/en
Priority to DE10197255T priority patent/DE10197255T5/de
Priority to JP2003508064A priority patent/JP2005500721A/ja
Publication of US20010056574A1 publication Critical patent/US20010056574A1/en
Priority to US11/230,173 priority patent/US7688346B2/en
Priority to US12/732,671 priority patent/US20100302348A1/en
Legal status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/002 Special television systems not provided for by H04N7/007 - H04N7/18
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/16 Analogue secrecy systems; Analogue subscription systems
    • H04N7/173 Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • H04N7/17309 Transmission or handling of upstream communications
    • H04N7/17318 Direct or substantially direct transmission and handling of requests
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/79 Processing of colour television signals in connection with recording
    • H04N9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/82 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N9/8205 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2215/00 Indexing scheme for image rendering
    • G06T2215/08 Gnomonic or central projection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/426 Internal components of the client; Characteristics thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/64 Circuits for processing colour signals
    • H04N9/641 Multi-purpose receivers, e.g. for auxiliary information

Definitions

  • the following patent relates to an overall hardware configuration that produces an enhanced spatial, television-like viewing experience. Unlike normal television, with this system the viewer is able to control both the viewing direction and the relative position of the viewer with respect to the movie action. In addition to a specific hardware configuration, this patent also relates to a new video format which makes this virtual-reality-like experience possible. Additionally, several proprietary video compression standards are defined which facilitate this goal.
  • the VTV system is designed to be an intermediary technology between conventional two-dimensional cinematography and true virtual reality.
  • the overall VTV system consists of a central graphics processing device (the VTV processor), a range of video input devices (DVD, VCR, satellite, terrestrial television, remote video cameras), infrared remote control, digital network connection and several output device connections.
  • in its most basic configuration, as shown in FIG. 2, the VTV unit would output imagery to a conventional television device.
  • a remote control device possibly infrared
  • the advantage of this “basic system configuration” is that it is implementable utilizing current audiovisual technology.
  • the VTV graphics standard is a forward-compatible graphics standard which can be thought of as a “layer” above that of standard video.
  • VTV video represents a subset of the new VTV graphics standard.
  • VTV can be introduced without requiring any major changes in the television and/or audiovisual manufacturers specifications.
  • VTV compatible television decoding units will inherently be compatible with conventional television transmissions.
  • the VTV system uses a wireless HMD as the display device.
  • the wireless HMD can be used as a tracking device in addition to simply displaying images.
  • This tracking information in the most basic form could consist of simply controlling the direction of view.
  • both direction of view and position of the viewer within the virtual environment can be determined.
  • remote cameras on the HMD will provide real-world images to the VTV system, which it will interpret into spatial objects; these spatial objects can then be replaced with virtual objects, thus providing an “environment aware” augmented reality system.
  • the wireless HMD is connected to the VTV processor by a wireless data link, the “Cybernet link”.
  • this link is capable of transmitting video information from the VTV processor to the HMD and transmitting tracking information from the HMD to the VTV processor.
  • the cybernet link would transmit video information both to and from the HMD in addition to transferring tracking information from the HMD to the VTV processor. Additionally certain components of the VTV processor may be incorporated in the remote HMD thus reducing the data transfer requirement through the cybernet link.
  • This wireless data link can be implemented in a number of different ways utilizing either analog or digital video transmission (in either an un-compressed or a digitally compressed format) with a secondary digitally encoded data stream for tracking information.
  • a purely digital unidirectional or bi-directional data link which carries both of these channels could be incorporated.
  • the actual medium for data transfer would probably be microwave or optical. However, either transfer medium may be utilized as appropriate.
  • the preferred embodiment of this system is one which utilizes on-board panoramic cameras fitted to the HMD in conjunction with image analysis hardware on board the HMD or possibly on the VTV base station to provide real-time tracking information.
  • retroreflective markers may also be utilized in the “real world environment”. In such a configuration, switchable light sources placed near the optical axis of the on-board cameras would be utilized in conjunction with these cameras to form a “differential image analysis” system.
  • Such a system features considerably higher recognition accuracy than one utilizing direct video images alone.
  • the VTV system will transfer graphic information utilizing a “universal graphics standard”.
  • a “universal graphics standard” will incorporate an object based graphics description language which achieves a high degree of compression by virtue of a “common graphics knowledge base” between subsystems.
  • This patent describes in basic terms three levels of progressive sophistication in the evolution of this graphics language.
  • in its most basic format, the VTV system can be thought of as a 360 degree panoramic display screen which surrounds the viewer.
  • This “virtual display screen” consists of a number of “video Pages”. Encoded in the video image is a “Page key code” which instructs the VTV processor to place the graphic information into specific locations within this “virtual display screen”.
  • the VTV graphics standard consists of a virtual 360 degree panoramic display screen upon which video images can be rendered from an external video source such as VCR, DVD, satellite, camera or terrestrial television receiver such that each video frame contains not only the video information but also information that defines its location within the virtual display screen.
  • Such a system is remarkably versatile, as it provides not only variable resolution images but also frame-rate-independent imagery. That is to say, the actual update rate within a particular virtual image (the entire virtual display screen) may vary within the display screen itself. This is inherently accomplished by virtue of each frame containing its virtual location information, which allows active regions of the virtual image to be updated quickly at the nominal perceptual cost of not updating sections of the image which have little or no change.
  • Such a system is shown in FIG. 4.
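As an illustration of the Paged update scheme described above, the following sketch (not from the patent; the page count, page dimensions and buffer layout are all assumptions) places incoming frames into a panoramic buffer according to a per-frame page key code:

```python
# Sketch: a "virtual display screen" built from video Pages. Each incoming
# frame carries a page key code that selects where in the panorama it lands.
# 4 pages of 90 degrees each is an illustrative assumption.

PAGE_COUNT = 4      # assumed number of azimuth pages
PAGE_WIDTH = 256    # assumed page width in pixels
PAGE_HEIGHT = 128   # assumed page height in pixels

def make_screen():
    """Panoramic buffer: one row list per scan line, PAGE_COUNT pages wide."""
    return [[0] * (PAGE_WIDTH * PAGE_COUNT) for _ in range(PAGE_HEIGHT)]

def place_page(screen, page_code, frame):
    """Copy one decoded frame into the slot selected by its page key code.

    Because every frame carries its own location, pages can arrive in any
    order and at independent rates - a static page simply stops updating.
    """
    x0 = (page_code % PAGE_COUNT) * PAGE_WIDTH
    for y in range(PAGE_HEIGHT):
        screen[y][x0:x0 + PAGE_WIDTH] = frame[y]
    return screen

screen = make_screen()
frame = [[7] * PAGE_WIDTH for _ in range(PAGE_HEIGHT)]
place_page(screen, 2, frame)   # refresh only one 90-degree sector
```

This is what makes the update rate frame-rate independent: a sector of the panorama with no change simply never receives a new Page.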
  • the basic VTV system can be enhanced to the format shown in FIG. 5.
  • the cylindrical virtual display screen is interpreted by the VTV processor as a truncated sphere. This effect can be easily generated through the use of a geometry translator or “Warp Engine” within the digital processing hardware component of the VTV processor.
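The cylinder-to-truncated-sphere interpretation can be sketched as a pair of texture-coordinate mappings; the image dimensions and the exact mapping law are illustrative assumptions, with only the +/-45 degree vertical extent taken from the prototype description below:

```python
import math

# Sketch of a "Warp Engine" geometry translation: the same panoramic source
# addressed either as a cylinder or as a truncated sphere.

WIDTH, HEIGHT = 1024, 256          # assumed panorama resolution
MAX_ELEV = math.radians(45.0)      # +/-45 degree vertical extent (prototype)

def cylinder_uv(azimuth, elevation):
    """Cylindrical mapping: vertical position varies with tan(elevation)."""
    u = (azimuth % (2 * math.pi)) / (2 * math.pi) * (WIDTH - 1)
    v = (math.tan(elevation) / math.tan(MAX_ELEV) + 1) / 2 * (HEIGHT - 1)
    return u, v

def sphere_uv(azimuth, elevation):
    """Truncated-sphere mapping: vertical position varies linearly with angle."""
    u = (azimuth % (2 * math.pi)) / (2 * math.pi) * (WIDTH - 1)
    v = (elevation / MAX_ELEV + 1) / 2 * (HEIGHT - 1)
    return u, v
```

The two mappings agree at the horizon and diverge toward the top and bottom of the image, which is exactly the region the geometry translator re-warps.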
  • in addition to 360 degree panoramic video, the VTV standard also supports either 4 track (quadraphonic) or 8 track (octaphonic) spatial audio.
  • a virtual representation of the 4 track system is shown in FIG. 6.
  • in the 4 track system, sound through the left and right speakers of the sound system (or headphones, in the case of an HMD based system) is scaled according to the azimuth of the view port's direction of view within the VR environment.
  • in the 8 track audio system, sound through the left and right speakers of the sound system is scaled according to both the azimuth and elevation of the view port, as shown in the virtual representation of the system in FIG. 7.
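The azimuth-dependent scaling for the 4 track case might look like the following sketch; the patent does not give the mixing law, so the cosine panning, the track azimuths and the ear offsets are all assumptions:

```python
import math

# Sketch (assumed mixing law): scaling four quadraphonic tracks into a stereo
# pair according to the view-port azimuth. Tracks are assumed to sit at
# 0, 90, 180 and 270 degrees around the viewer.

TRACK_AZIMUTHS = [0.0, 90.0, 180.0, 270.0]

def track_gains(view_azimuth_deg, ear_offset_deg):
    """Cosine-panning gain of each track for one ear.

    ear_offset_deg: -90 for the left ear, +90 for the right ear (assumed).
    """
    ear = view_azimuth_deg + ear_offset_deg
    gains = []
    for az in TRACK_AZIMUTHS:
        d = math.radians(az - ear)
        gains.append(max(0.0, math.cos(d)))  # only tracks facing the ear contribute
    return gains

def mix_stereo(samples, view_azimuth_deg):
    """samples: one sample per track -> (left, right)."""
    left = sum(g * s for g, s in zip(track_gains(view_azimuth_deg, -90.0), samples))
    right = sum(g * s for g, s in zip(track_gains(view_azimuth_deg, +90.0), samples))
    return left, right

# Facing 0 degrees, the left ear hears the 270-degree track, the right the 90:
left, right = mix_stereo([1.0, 2.0, 3.0, 4.0], 0.0)
```

The 8 track case would add a second cosine term in elevation, scaling between upper and lower track rings in the same way.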
  • the VTV standard encodes the multi-track audio channels as part of the video information in a digital/analogue hybrid format as shown in FIG. 12.
  • each audio scan line contains 512 audio samples.
  • each audio scan line contains a three bit digital code that is used to “pre-scale” the audio information. That is to say that the actual audio sample value is X*S where X is the pre-scale number and S is the sample value.
  • the dynamic range of the audio system can be extended from about 43 dB to over 60 dB.
  • this extension of the dynamic range comes at relatively “low cost” to the audio quality because we are relatively insensitive to audio distortion when the overall signal level is high.
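The pre-scale scheme can be sketched as follows. The patent gives the decode rule (sample value = X*S) but not the bit widths or the encoder's pre-scale selection, so the 7-bit sample, the 1-8 pre-scale range and the greedy encoder here are assumptions; note that with these assumptions the dynamic-range figures come out close to the quoted ones:

```python
import math

# Sketch of the hybrid audio encoding: each stored value is a 3-bit pre-scale
# X plus a sample S, decoded as X * S.

SAMPLE_BITS = 7                        # assumed per-sample resolution
SAMPLE_MAX = (1 << SAMPLE_BITS) - 1    # 127

def encode(value):
    """Pick the smallest pre-scale X (1..8) that fits value into a sample.

    A small X keeps full precision for quiet signals; a large X trades
    precision (quantization step = X) for headroom on loud signals.
    """
    for x in range(1, 9):
        if value // x <= SAMPLE_MAX:
            return x, value // x
    raise ValueError("value out of range")

def decode(x, s):
    return x * s

# Plain samples span ~42 dB; pre-scaling extends that to ~60 dB:
base_range_db = 20 * math.log10(SAMPLE_MAX)          # ~42.1
extended_range_db = 20 * math.log10(SAMPLE_MAX * 8)  # ~60.1
```

Distortion grows only on loud signals (large X), where, as noted above, the ear is least sensitive to it.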
  • the start bit is an important component in the system. Its function is to set the maximum level for the scan line (i.e. the 100% or white level). This level, in conjunction with the black level (which can be sampled just after the colour burst), forms the 0% and 100% range for each line.
  • the system thereby becomes much less sensitive to variations in black level due to AC-coupling of video sub-modules and/or recording and playback of the video media, in addition to improving the accuracy of the decoding of the digital component of the scan line.
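The per-line normalization implied by the two bullets above reduces to a simple rescale; the function below is a sketch under the assumption that the white reference comes from the start bit and the black reference from a sample taken just after the colour burst:

```python
# Sketch of per-scan-line normalization: each line is decoded relative to its
# own black (0%) and white (100%) references, so AC-coupling drift between
# lines cancels out.

def normalize_line(raw_samples, black_level, white_level):
    """Rescale raw ADC samples so black -> 0.0 and white -> 1.0."""
    span = white_level - black_level
    return [(s - black_level) / span for s in raw_samples]

# A line whose DC level has drifted still decodes to the same values:
line = normalize_line([60, 110, 160], black_level=60, white_level=160)
```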
  • an audio control bit is included in each field (at line 21). This control bit sets the audio buffer sequence to 0 when it is set. This provides a way to synchronize the 4 or 8 track audio information so that the correct track is always being updated from the current data regardless of the sequence of the video Page updates.
  • this spatial audio system/standard could also be used in audio only mode by the combination of a suitable compact tracking device and a set of cordless headphones to realize a spatial-audio system for advanced hi-fi equipment.
  • the first two standards relate to the definitions of spatial graphics objects, whereas the third graphics standard relates to a complete VR environment definition language which utilizes the first two standards as a subset and incorporates additional environment definitions and control algorithms.
  • the VTV graphic standard (in its basic form) can be thought of as a control layer above that of the conventional video standard (NTSC, PAL etc.). As such, it is not limited purely to conventional analog video transmission standards. Using basically identical techniques, the VTV standard can operate with the HDTV standard as well as many of the computer graphic and industry audiovisual standards.
  • the VTV graphics processor is the heart of the VTV system.
  • this module is responsible for the real-time generation of the graphics which is output to the display device (either conventional TV/HDTV or HMD).
  • a video media provision device such as VCR, DVD, satellite, camera or terrestrial television receiver.
  • More sophisticated versions of this module may real-time render graphics from a “universal graphics language” passed to it via the Internet or other network connection.
  • the VTV processor can also perform image analysis. Early versions of this system will use this image analysis function for the purpose of determining tracking coordinates of the HMD.
  • More sophisticated versions of this module will in addition to providing this tracking information, also interpret the real world images from the HMD as physical three-dimensional objects. These three-dimensional objects will be defined in the universal graphics language which can then be recorded or communicated to similar remote display devices via the Internet or other network or alternatively be replaced by other virtual objects of similar physical size thus creating a true augmented reality experience.
  • VTV hardware itself consists of a group of sub modules as follows:
  • VRM Virtual Reality Memory
  • Video information is digitized and placed in the augmented reality memory on a field-by-field basis, assuming an absolute Page reference of 0 degrees azimuth, 0 degrees elevation, with the origin of each Page being determined by the state of the Page number bits (P3-P0).
  • Auxiliary video information for background and/or floor/ceiling maps is loaded into the virtual reality memory on a field-by-field basis dependent upon the state of the “field type” bits (F3-F0) and Page number bits (P3-P0).
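The control-bit routing described in the two bullets above can be sketched as follows. The patent names the bits (F3-F0, P3-P0) but not their packing, so the nibble layout, the bank-selection rule and the function names here are all illustrative assumptions:

```python
# Sketch: routing an incoming field by its control bits. Field-type bits
# (F3-F0) select augmented- vs. virtual-reality memory; page bits (P3-P0)
# select the slot within the bank. The bit packing is an assumption.

def decode_control(byte):
    """Split one control byte into (field_type, page_number)."""
    field_type = (byte >> 4) & 0x0F   # F3-F0 in the high nibble (assumed)
    page = byte & 0x0F                # P3-P0 in the low nibble (assumed)
    return field_type, page

def route_field(byte):
    """Pick the destination bank and page offset for a field.

    field_type 0 = foreground video into augmented reality memory (ARM);
    any other value = background map into virtual reality memory (VRM).
    This mapping is an assumption for illustration.
    """
    field_type, page = decode_control(byte)
    bank = "ARM" if field_type == 0 else "VRM"
    return bank, page
```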
  • the digital processing hardware interprets the information held in augmented reality and virtual reality memory. Utilizing a combination of a geometry processing engine (the “Warp Engine”), digital subtractive image processing and a new, versatile form of “blue-screening”, it translates and selectively combines this data into an image substantially similar to that which would be seen by the viewer if they were standing in the same location as the panoramic camera when the video material was filmed.
  • VTV processor mode is determined by additional control information present in the source media and thus the processing and display modes can change dynamically while displaying a source of VTV media.
  • the video generation module then generates a single or pair of video images for display on a conventional television or HMD display device.
  • although the VTV image field will be updated at less than full frame rate (unless multi-spin DVD devices are used as the image media), graphics rendering will still occur at full video frame rate, as will the updates of the spatial audio. This is possible because each “Image Sphere” contains all of the required information, for both video and audio, for any viewer orientation (azimuth and elevation).
  • ADC-0 would generally be used for live panoramic video feeds and ADC-2 would generally be used for virtual reality video feeds from pre-rendered video material.
  • both video input stages have full access to both augmented reality and virtual reality memory (i.e. they use a memory pool).
  • This hardware configuration allows for more versatility in the design and allows several unusual display modes (which will be covered in more detail in later sections).
  • the video output stages (DAC-0 and DAC-1) have total access to both virtual and augmented reality memory.
  • the memory pool style of design means that the system can function with either one or two input and/or output stages (although with reduced capabilities) and as such the presence of either one or two input or output stages in a particular implementation should not limit the generality of the specification.
  • the digital processing hardware would take the form of one or more field programmable logic arrays or custom ASIC.
  • the advantage of using field programmable logic arrays is that the hardware can be updated at anytime.
  • the main disadvantage of this technology is that it is not quite as fast as an ASIC.
  • high-speed conventional digital processors may also be utilized to perform this image analysis and/or graphics generation task.
  • alternatively, the VTV base station hardware would act only as a link between the HMD and the Internet or other network, with all graphics image generation, image analysis and spatial object recognition occurring within the HMD itself.
  • a VTV image frame consists of either a cylinder or a truncated sphere. This space subtends only a finite vertical angle to the viewer (+/-45 degrees in the prototype). This is an intentional limitation designed to make the most of the available data bandwidth of the video storage and transmission media and thus maintain compatibility with existing video systems. However, as a result of this compromise, there can exist a situation in which the view port exceeds the scope of the image data. There are several different ways in which this exception can be handled. The simplest way is to make out-of-bounds video data black, which will give the appearance of being in a room with a black ceiling and floor.
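The simplest exception handler described above amounts to a bounds check before sampling; the resolution-free sampling interface below is an assumption for illustration:

```python
# Sketch: any view-port sample outside the +/-45 degree vertical extent of
# the image data is rendered black ("black ceiling and floor").

MAX_ELEV_DEG = 45.0
BLACK = (0, 0, 0)

def sample_view(image_lookup, azimuth_deg, elevation_deg):
    """Return image data inside the truncated sphere, black outside it.

    image_lookup: callable (azimuth_deg, elevation_deg) -> pixel, standing in
    for a read from augmented reality memory (an assumed interface).
    """
    if abs(elevation_deg) > MAX_ELEV_DEG:
        return BLACK
    return image_lookup(azimuth_deg % 360.0, elevation_deg)

# Looking 80 degrees up falls outside the image data and comes back black:
pixel = sample_view(lambda az, el: (200, 200, 200), 10.0, 80.0)
```

A fancier handler could substitute a floor/ceiling map from virtual reality memory at the same branch point.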
  • the basic memory map for the system, utilizing both augmented reality memory and virtual reality memory (in addition to translation memory), is shown in FIG. 8. As can be seen in this illustration, the translation memory area must have sufficient range to cover a full 360 degrees*180 degrees and ideally have the same angular resolution as the augmented reality memory bank (which covers 360 degrees*90 degrees). With such a configuration, it is possible to provide both floor and ceiling exception handling and variable-transparency imagery, such as looking through windows in the foreground and showing the background behind them.
  • the backgrounds can be either static or dynamic and can be updated in basically the same way as foreground (augmented reality memory) by utilizing a Paged format.
  • the VTV system has two basic modes of operation. Within these two modes there also exist several sub modes.
  • the two basic modes are as follows:
  • in augmented reality mode 1, selective components of “real world imagery” are overlaid upon a virtual reality background.
  • this process involves first removing all of the background components from the “real world” imagery. This can be easily done by using differential imaging techniques, i.e. by comparing current “real world” imagery against a stored copy taken previously and detecting differences between the two. After the two images have been correctly aligned, the regions that differ are new or foreground objects and those that remain the same are static background objects. This is the simplest of the augmented reality modes and is generally not sufficiently interesting, as most of the background will be removed in the process.
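The differential-imaging step above reduces to thresholded frame differencing once the images are aligned; the threshold value and the grayscale-row representation are assumptions:

```python
# Sketch of differential imaging: pixels of the current frame that differ
# from the stored background beyond a threshold are kept as foreground.

THRESHOLD = 16   # assumed per-pixel difference threshold

def extract_foreground(current, background, threshold=THRESHOLD):
    """Return a mask (1 = new/foreground object, 0 = static background).

    current, background: aligned images as lists of rows of grayscale values.
    """
    mask = []
    for cur_row, bg_row in zip(current, background):
        mask.append([1 if abs(c - b) > threshold else 0
                     for c, b in zip(cur_row, bg_row)])
    return mask

background = [[100, 100, 100]]
current    = [[100, 180, 101]]   # middle pixel changed: a new object entered
mask = extract_foreground(current, background)
```

Modes 2 and 3 then hand the masked regions to the Warp Engine for scaling and translation into the virtual environment.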
  • when operated in mobile Pan-Cam (telepresence) or augmented reality mode, the augmented reality memory will generally be updated in sequential Page order (i.e. in whole system frames) rather than by random Page updates. This is because constant variations in the position and orientation of the panoramic camera system during filming will probably cause mismatches between the image Pages if they are handled separately.
  • Augmented reality mode 2 differs from mode 1 in that, in addition to automatically extracting foreground and moving objects and placing these in an artificial background environment, the system also utilizes the Warp Engine to “push” additional “real world” objects into the background. In addition to simply adding these “real world” objects into the virtual environment the Warp Engine is also capable of scaling and translating these objects so that they match into the virtual environment more effectively. These objects can be handled as opaque overlays or transparencies.
  • Augmented reality mode 3 differs from mode 2 in that, in this case, the Warp Engine is used to “pull” the background objects into the foreground to replace “real world” objects. As in mode 2, these objects can be translated and scaled and can be handled as either opaque overlays or transparencies. This gives the user the ability to “match” the physical size and position of a “real world” object with a virtual object. By doing so, the user is able to interact and navigate within the augmented reality environment as they would in the “real world” environment. This mode is probably the most likely to be utilized for entertainment and gaming purposes, as it would allow a Hollywood production to be brought into the user's own living room.
  • Virtual reality mode is a functionally simpler mode than the previous augmented reality modes.
  • “pre-filmed” or computer-generated graphics are loaded into augmented reality memory on a random Page by Page basis. This is possible because the virtual camera planes of reference are fixed.
  • virtual reality memory is loaded with a fixed or dynamic background at a lower resolution. The use of both foreground and background image planes makes possible more sophisticated graphics techniques such as motion parallax.
  • in the case of imagery collected by mobile panoramic camera systems, the images are first processed by a VTV encoder module. This device provides video distortion correction and also inserts video Page information, orientation tracking data and spatial audio into the video stream. This can be done without altering the video standard, thereby maintaining compatibility with existing recording and playback devices.
  • although this module could be incorporated within the VTV processor, having it as a separate entity is advantageous for remote camera applications where the video information must ultimately be either stored or transmitted through some form of wireless network.
  • tracking information must comprise part of the resultant video stream in order that an “absolute” azimuth and elevation coordinate system be maintained.
  • for computer-generated material, this data is not required, as the camera orientation is a theoretical construct known to the computer system at render time.
  • the basic tracking system of the VTV HMD utilizes on-board panoramic video cameras to capture the required 360 degree visual information of the surrounding real world environment. This information is then analyzed by the VTV processor (whether it exists within the HMD or as a base station unit) utilizing computationally intensive yet relatively algorithmically simple techniques such as auto-correlation. Examples of a possible algorithm are shown in FIGS. 13-19.
  • FIG. 20 shows a simplistic representation of the tracking hardware in which the auto correlators simply detect the presence or absence of a particular movement.
  • a practical system would probably incorporate a number of auto-correlators for each class of movement (for example, there may be 16 or more separate auto-correlators to detect horizontal movement). Such a system would then be able to detect different levels or amounts of movement in all of the directions.
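A bank of correlators of this kind can be sketched in one dimension as follows; this is not the patent's algorithm from FIGS. 13-19, just a minimal correlation-based shift detector under assumed parameters (1-D rows standing in for image fields, a +/-4 pixel search range):

```python
# Sketch: one correlator per candidate shift; the best-scoring shift is taken
# as the amount of horizontal movement between two frames.

def correlate(a, b):
    """Sum of products over the overlapping region (higher = better match)."""
    return sum(x * y for x, y in zip(a, b))

def detect_shift(prev_row, cur_row, max_shift=4):
    """Return the shift (in pixels) that best aligns prev_row with cur_row."""
    best_shift, best_score = 0, float("-inf")
    for shift in range(-max_shift, max_shift + 1):
        if shift >= 0:
            score = correlate(prev_row[:len(prev_row) - shift], cur_row[shift:])
        else:
            score = correlate(prev_row[-shift:], cur_row[:shift])
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift

prev = [0, 0, 9, 0, 0, 0, 0, 0]
cur  = [0, 0, 0, 0, 9, 0, 0, 0]   # pattern moved 2 pixels to the right
```

A practical bank would run such correlators in parallel hardware for horizontal, vertical and rotational movement classes.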
  • the use of absolute reference points allows such a system to re-calibrate its absolute references and thus achieve an overall absolute coordinate system.
  • This absolute reference point calibration can be achieved relatively easily utilizing several different techniques. The first, and perhaps simplest, technique is to use color-sensitive retroreflective spots as previously described. Alternately, active optical beacons (such as LED beacons) could also be utilized.
  • a further alternative absolute reference calibration system is based on a bi-directional infrared beacon. Such a system would communicate a unique ID code between the HMD and the beacon, such that calibration would occur only once each time the HMD passed under any of these “known spatial reference points”. This is required to avoid “dead tracking regions” within the vicinity of the calibration beacons due to multiple origin resets.
  • the image can then be processed as a series of horizontal and vertical strips such that auto correlation regions are bounded between highlight points/edges. Additionally, small highlight regions can very easily be tracked by comparing previous image frames against current images and determining “closest possible fit” between the images (i.e. minimum movement of highlight points). Such techniques are relatively easy and well within the capabilities of most moderate speed micro-processors, provided some of the image pre-processing overhead is handled by hardware.
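The "closest possible fit" matching of highlight points described above can be sketched as follows; greedy nearest-neighbour pairing and the averaging of displacements are assumptions, not the patent's stated method:

```python
# Sketch: pair each highlight from the previous frame with the nearest
# highlight in the current frame ("minimum movement of highlight points") and
# report the common displacement as the tracked movement.

def closest_fit(prev_points, cur_points):
    """Greedily match highlights between frames; return average (dx, dy)."""
    moves = []
    remaining = list(cur_points)
    for px, py in prev_points:
        nearest = min(remaining,
                      key=lambda c: (c[0] - px) ** 2 + (c[1] - py) ** 2)
        remaining.remove(nearest)   # each current highlight used once
        moves.append((nearest[0] - px, nearest[1] - py))
    n = len(moves)
    return (sum(dx for dx, _ in moves) / n, sum(dy for _, dy in moves) / n)

# Two retroreflective highlights, both displaced by (3, 1) between frames:
shift = closest_fit([(10, 10), (40, 25)], [(13, 11), (43, 26)])
```

With the image pre-processing (highlight extraction) done in hardware, this matching step is cheap enough for a moderate-speed microprocessor, as the text notes.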


Priority Applications (6)

Application Number Priority Date Filing Date Title
US09/891,733 US20010056574A1 (en) 2000-06-26 2001-06-25 VTV system
PCT/US2001/049287 WO2003001803A1 (en) 2001-06-25 2001-12-21 Vtv system
DE10197255T DE10197255T5 (de) 2001-06-25 2001-12-21 VTV-System
JP2003508064A JP2005500721A (ja) 2001-06-25 2001-12-21 Vtvシステム
US11/230,173 US7688346B2 (en) 2001-06-25 2005-09-19 VTV system
US12/732,671 US20100302348A1 (en) 2001-06-25 2010-03-26 VTV System

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US21286200P 2000-06-26 2000-06-26
US09/891,733 US20010056574A1 (en) 2000-06-26 2001-06-25 VTV system

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US11/230,173 Continuation US7688346B2 (en) 2001-06-25 2005-09-19 VTV system
US11/230,173 Division US7688346B2 (en) 2001-06-25 2005-09-19 VTV system

Publications (1)

Publication Number Publication Date
US20010056574A1 true US20010056574A1 (en) 2001-12-27

Family

ID=25398728

Family Applications (3)

Application Number Title Priority Date Filing Date
US09/891,733 Abandoned US20010056574A1 (en) 2000-06-26 2001-06-25 VTV system
US11/230,173 Active 2024-09-19 US7688346B2 (en) 2001-06-25 2005-09-19 VTV system
US12/732,671 Abandoned US20100302348A1 (en) 2001-06-25 2010-03-26 VTV System

Family Applications After (2)

Application Number Title Priority Date Filing Date
US11/230,173 Active 2024-09-19 US7688346B2 (en) 2001-06-25 2005-09-19 VTV system
US12/732,671 Abandoned US20100302348A1 (en) 2001-06-25 2010-03-26 VTV System

Country Status (4)

Country Link
US (3) US20010056574A1 (ja)
JP (1) JP2005500721A (ja)
DE (1) DE10197255T5 (ja)
WO (1) WO2003001803A1 (ja)

Cited By (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020147586A1 (en) * 2001-01-29 2002-10-10 Hewlett-Packard Company Audio annoucements with range indications
US20040125044A1 (en) * 2002-09-05 2004-07-01 Akira Suzuki Display system, display control apparatus, display apparatus, display method and user interface device
WO2004088994A1 (de) * 2003-04-02 2004-10-14 Daimlerchrysler Ag Vorrichtung zur berücksichtigung der betrachterposition bei der darstellung von 3d-bildinhalten auf 2d-anzeigevorrichtungen
US20040222988A1 (en) * 2003-05-08 2004-11-11 Nintendo Co., Ltd. Video game play using panoramically-composited depth-mapped cube mapping
WO2005064440A2 (de) * 2003-12-23 2005-07-14 Siemens Aktiengesellschaft Vorrichtung und verfahren zur positionsgenauen überlagerung des realen blickfeldes
US20050231532A1 (en) * 2004-03-31 2005-10-20 Canon Kabushiki Kaisha Image processing method and image processing apparatus
US20060069591A1 (en) * 2004-09-29 2006-03-30 Razzano Michael R Dental image charting system and method
US7118228B2 (en) 2003-11-04 2006-10-10 Hewlett-Packard Development Company, L.P. Image display system
US20070268316A1 (en) * 2006-05-22 2007-11-22 Canon Kabushiki Kaisha Display apparatus with image-capturing function, image processing apparatus, image processing method, and image display system
US20080007617A1 (en) * 2006-05-11 2008-01-10 Ritchey Kurtis J Volumetric panoramic sensor systems
US20080024594A1 (en) * 2004-05-19 2008-01-31 Ritchey Kurtis J Panoramic image-based virtual reality/telepresence audio-visual system and method
US20080117288A1 (en) * 2006-11-16 2008-05-22 Imove, Inc. Distributed Video Sensor Panoramic Imaging System
US7734070B1 (en) 2002-12-31 2010-06-08 Rajeev Sharma Method and system for immersing face images into a video sequence
US20120281128A1 (en) * 2011-05-05 2012-11-08 Sony Corporation Tailoring audio video output for viewer position and needs
US20120307001A1 (en) * 2011-06-03 2012-12-06 Nintendo Co., Ltd. Information processing system, information processing device, storage medium storing information processing program, and moving image reproduction control method
US8771064B2 (en) 2010-05-26 2014-07-08 Aristocrat Technologies Australia Pty Limited Gaming system and a method of gaming
US9236000B1 (en) * 2010-12-23 2016-01-12 Amazon Technologies, Inc. Unpowered augmented reality projection accessory display device
CN105324984A (zh) * 2013-12-09 2016-02-10 Cj Cgv Co., Ltd. Method and system for generating multi-projection images
US20160127723A1 (en) * 2013-12-09 2016-05-05 Cj Cgv Co., Ltd. Method and system for generating multi-projection images
US9383831B1 (en) * 2010-12-23 2016-07-05 Amazon Technologies, Inc. Powered augmented reality projection accessory display device
CN106165402A (zh) * 2014-04-22 2016-11-23 Sony Corporation Information reproduction device, information reproduction method, information recording device, and information recording method
US9508194B1 (en) 2010-12-30 2016-11-29 Amazon Technologies, Inc. Utilizing content output devices in an augmented reality environment
US9607315B1 (en) 2010-12-30 2017-03-28 Amazon Technologies, Inc. Complementing operation of display devices in an augmented reality environment
US9721386B1 (en) 2010-12-27 2017-08-01 Amazon Technologies, Inc. Integrated augmented reality environment
US9766057B1 (en) 2010-12-23 2017-09-19 Amazon Technologies, Inc. Characterization of a scene with structured light
US20170329394A1 (en) * 2016-05-13 2017-11-16 Benjamin Lloyd Goldstein Virtual and augmented reality systems
FR3057430A1 (fr) * 2016-10-10 2018-04-13 Immersion Device for immersion in a representation of an environment resulting from a set of images
US9950262B2 (en) 2011-06-03 2018-04-24 Nintendo Co., Ltd. Storage medium storing information processing program, information processing device, information processing system, and information processing method
US9958934B1 (en) * 2006-05-01 2018-05-01 Jeffrey D. Mullen Home and portable augmented reality and virtual reality video game consoles
US20180176708A1 (en) * 2016-12-20 2018-06-21 Casio Computer Co., Ltd. Output control device, content storage device, output control method and non-transitory storage medium
US20180342267A1 (en) * 2017-05-26 2018-11-29 Digital Domain, Inc. Spatialized rendering of real-time video data to 3d space
US20190007672A1 (en) * 2017-06-30 2019-01-03 Bobby Gene Burrough Method and Apparatus for Generating Dynamic Real-Time 3D Environment Projections
US20190104282A1 (en) * 2017-09-29 2019-04-04 Sensormatic Electronics, LLC Security Camera System with Multi-Directional Mount and Method of Operation
WO2019076667A1 (en) * 2017-10-16 2019-04-25 Signify Holding B.V. METHOD AND CONTROL DEVICE FOR CONTROLLING A PLURALITY OF LIGHTING DEVICES
US20190149731A1 (en) * 2016-05-25 2019-05-16 Livit Media Inc. Methods and systems for live sharing 360-degree video streams on a mobile device
CN109996060A (zh) * 2017-12-30 2019-07-09 Shenzhen Dlodlo New Technology Co., Ltd. Virtual reality cinema system and information processing method
TWI666912B (zh) * 2017-03-22 2019-07-21 MediaTek Inc. Method and apparatus for generating and encoding projection-based frames with 360-degree content represented in projection faces packed in a segmented sphere projection layout
US10375355B2 (en) 2006-11-16 2019-08-06 Immersive Licensing, Inc. Distributed video sensor panoramic imaging system
US10712810B2 (en) * 2017-12-08 2020-07-14 Telefonaktiebolaget Lm Ericsson (Publ) System and method for interactive 360 video playback based on user location
CN112233048A (zh) * 2020-12-11 2021-01-15 Chengdu Chengdian Guangxin Technology Co., Ltd. Spherical video image correction method
US11086395B2 (en) * 2019-02-15 2021-08-10 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
CN113824746A (zh) * 2021-11-25 2021-12-21 Shandong Vocational College of Information Technology Virtual reality information transmission method and virtual reality system
US11288937B2 (en) 2017-06-30 2022-03-29 Johnson Controls Tyco IP Holdings LLP Security camera system with multi-directional mount and method of operation
US11361640B2 (en) 2017-06-30 2022-06-14 Johnson Controls Tyco IP Holdings LLP Security camera system with multi-directional mount and method of operation
US11372474B2 (en) * 2019-07-03 2022-06-28 Saec/Kinetic Vision, Inc. Systems and methods for virtual artificial intelligence development and testing
US11816757B1 (en) * 2019-12-11 2023-11-14 Meta Platforms Technologies, Llc Device-side capture of data representative of an artificial reality environment
CN117557446A (zh) * 2023-11-24 2024-02-13 Beijing Tongbu Fengyun Technology Co., Ltd. Real-time image processing and software control method for a spherical LED screen

Families Citing this family (83)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9063633B2 (en) * 2006-03-30 2015-06-23 Arjuna Indraeswaran Rajasingham Virtual navigation system for virtual and real spaces
US9101279B2 (en) 2006-02-15 2015-08-11 Virtual Video Reality By Ritchey, Llc Mobile user borne brain activity data and surrounding environment data correlation system
CN101496387B (zh) 2006-03-06 2012-09-05 Cisco Technology, Inc. System and method for access authentication in a mobile wireless network
US8570373B2 (en) * 2007-06-08 2013-10-29 Cisco Technology, Inc. Tracking an object utilizing location information associated with a wireless device
US8717412B2 (en) * 2007-07-18 2014-05-06 Samsung Electronics Co., Ltd. Panoramic image production
US8098283B2 (en) * 2007-08-01 2012-01-17 Shaka Ramsay Methods, systems, and computer program products for implementing a personalized, image capture and display system
US9703369B1 (en) 2007-10-11 2017-07-11 Jeffrey David Mullen Augmented reality video game systems
US8355041B2 (en) * 2008-02-14 2013-01-15 Cisco Technology, Inc. Telepresence system for 360 degree video conferencing
US8797377B2 (en) 2008-02-14 2014-08-05 Cisco Technology, Inc. Method and system for videoconference configuration
US10229389B2 (en) * 2008-02-25 2019-03-12 International Business Machines Corporation System and method for managing community assets
US8319819B2 (en) * 2008-03-26 2012-11-27 Cisco Technology, Inc. Virtual round-table videoconference
US8390667B2 (en) 2008-04-15 2013-03-05 Cisco Technology, Inc. Pop-up PIP for people not in picture
JP4444354B2 (ja) * 2008-08-04 2010-03-31 Toshiba Corporation Image processing apparatus and image processing method
EP2157545A1 (en) * 2008-08-19 2010-02-24 Sony Computer Entertainment Europe Limited Entertainment device, system and method
US8694658B2 (en) * 2008-09-19 2014-04-08 Cisco Technology, Inc. System and method for enabling communication sessions in a network environment
US8529346B1 (en) * 2008-12-30 2013-09-10 Lucasfilm Entertainment Company Ltd. Allocating and managing software assets
US8659637B2 (en) 2009-03-09 2014-02-25 Cisco Technology, Inc. System and method for providing three dimensional video conferencing in a network environment
US8477175B2 (en) * 2009-03-09 2013-07-02 Cisco Technology, Inc. System and method for providing three dimensional imaging in a network environment
US20130176192A1 (en) * 2011-09-30 2013-07-11 Kenneth Varga Extra-sensory perception sharing force capability and unknown terrain identification system
WO2010124074A1 (en) * 2009-04-22 2010-10-28 Terrence Dashon Howard System for merging virtual reality and reality to provide an enhanced sensory experience
US8659639B2 (en) 2009-05-29 2014-02-25 Cisco Technology, Inc. System and method for extending communications between participants in a conferencing environment
US8762982B1 (en) * 2009-06-22 2014-06-24 Yazaki North America, Inc. Method for programming an instrument cluster
US9082297B2 (en) 2009-08-11 2015-07-14 Cisco Technology, Inc. System and method for verifying parameters in an audiovisual environment
US8812990B2 (en) * 2009-12-11 2014-08-19 Nokia Corporation Method and apparatus for presenting a first person world view of content
US9225916B2 (en) 2010-03-18 2015-12-29 Cisco Technology, Inc. System and method for enhancing video images in a conferencing environment
USD626102S1 (en) 2010-03-21 2010-10-26 Cisco Tech Inc Video unit with integrated features
USD626103S1 (en) 2010-03-21 2010-10-26 Cisco Technology, Inc. Video unit with integrated features
US9313452B2 (en) 2010-05-17 2016-04-12 Cisco Technology, Inc. System and method for providing retracting optics in a video conferencing environment
US8896655B2 (en) 2010-08-31 2014-11-25 Cisco Technology, Inc. System and method for providing depth adaptive video conferencing
US8599934B2 (en) 2010-09-08 2013-12-03 Cisco Technology, Inc. System and method for skip coding during video conferencing in a network environment
US20120075466A1 (en) * 2010-09-29 2012-03-29 Raytheon Company Remote viewing
WO2012048252A1 (en) 2010-10-07 2012-04-12 Aria Glassworks, Inc. System and method for transitioning between interface modes in virtual and augmented reality applications
US8599865B2 (en) 2010-10-26 2013-12-03 Cisco Technology, Inc. System and method for provisioning flows in a mobile network environment
US8699457B2 (en) 2010-11-03 2014-04-15 Cisco Technology, Inc. System and method for managing flows in a mobile network environment
US8902244B2 (en) 2010-11-15 2014-12-02 Cisco Technology, Inc. System and method for providing enhanced graphics in a video environment
US9143725B2 (en) 2010-11-15 2015-09-22 Cisco Technology, Inc. System and method for providing enhanced graphics in a video environment
US8730297B2 (en) 2010-11-15 2014-05-20 Cisco Technology, Inc. System and method for providing camera functions in a video environment
US9338394B2 (en) 2010-11-15 2016-05-10 Cisco Technology, Inc. System and method for providing enhanced audio in a video environment
US8576276B2 (en) 2010-11-18 2013-11-05 Microsoft Corporation Head-mounted display device which provides surround video
US8542264B2 (en) 2010-11-18 2013-09-24 Cisco Technology, Inc. System and method for managing optics in a video environment
US8723914B2 (en) 2010-11-19 2014-05-13 Cisco Technology, Inc. System and method for providing enhanced video processing in a network environment
WO2012071466A2 (en) 2010-11-24 2012-05-31 Aria Glassworks, Inc. System and method for acquiring virtual and augmented reality scenes by a user
US9111138B2 (en) 2010-11-30 2015-08-18 Cisco Technology, Inc. System and method for gesture interface control
USD682293S1 (en) 2010-12-16 2013-05-14 Cisco Technology, Inc. Display screen with graphical user interface
USD678894S1 (en) 2010-12-16 2013-03-26 Cisco Technology, Inc. Display screen with graphical user interface
USD682864S1 (en) 2010-12-16 2013-05-21 Cisco Technology, Inc. Display screen with graphical user interface
USD682854S1 (en) 2010-12-16 2013-05-21 Cisco Technology, Inc. Display screen for graphical user interface
USD682294S1 (en) 2010-12-16 2013-05-14 Cisco Technology, Inc. Display screen with graphical user interface
USD678320S1 (en) 2010-12-16 2013-03-19 Cisco Technology, Inc. Display screen with graphical user interface
USD678307S1 (en) 2010-12-16 2013-03-19 Cisco Technology, Inc. Display screen with graphical user interface
USD678308S1 (en) 2010-12-16 2013-03-19 Cisco Technology, Inc. Display screen with graphical user interface
US8953022B2 (en) 2011-01-10 2015-02-10 Aria Glassworks, Inc. System and method for sharing virtual and augmented reality scenes between users and viewers
US8692862B2 (en) 2011-02-28 2014-04-08 Cisco Technology, Inc. System and method for selection of video data in a video conference environment
US8670019B2 (en) 2011-04-28 2014-03-11 Cisco Technology, Inc. System and method for providing enhanced eye gaze in a video conferencing environment
US8786631B1 (en) 2011-04-30 2014-07-22 Cisco Technology, Inc. System and method for transferring transparency information in a video environment
US8934026B2 (en) 2011-05-12 2015-01-13 Cisco Technology, Inc. System and method for video coding in a dynamic environment
US8947493B2 (en) 2011-11-16 2015-02-03 Cisco Technology, Inc. System and method for alerting a participant in a video conference
US8682087B2 (en) 2011-12-19 2014-03-25 Cisco Technology, Inc. System and method for depth-guided image filtering in a video conference environment
US20130250040A1 (en) * 2012-03-23 2013-09-26 Broadcom Corporation Capturing and Displaying Stereoscopic Panoramic Images
US9743119B2 (en) 2012-04-24 2017-08-22 Skreens Entertainment Technologies, Inc. Video display system
US11284137B2 (en) 2012-04-24 2022-03-22 Skreens Entertainment Technologies, Inc. Video processing systems and methods for display, selection and navigation of a combination of heterogeneous sources
US10499118B2 (en) * 2012-04-24 2019-12-03 Skreens Entertainment Technologies, Inc. Virtual and augmented reality system and headset display
US9179126B2 (en) * 2012-06-01 2015-11-03 Ostendo Technologies, Inc. Spatio-temporal light field cameras
US20130333633A1 (en) * 2012-06-14 2013-12-19 Tai Cheung Poon Systems and methods for testing dogs' hearing, vision, and responsiveness
US9626799B2 (en) * 2012-10-02 2017-04-18 Aria Glassworks, Inc. System and method for dynamically displaying multiple virtual and augmented reality scenes on a single display
US9681154B2 (en) 2012-12-06 2017-06-13 Patent Capital Group System and method for depth-guided filtering in a video conference environment
US10769852B2 (en) 2013-03-14 2020-09-08 Aria Glassworks, Inc. Method for simulating natural perception in virtual and augmented reality scenes
US9786246B2 (en) 2013-04-22 2017-10-10 Ar Tables, Llc Apparatus for hands-free augmented reality viewing
US9843621B2 (en) 2013-05-17 2017-12-12 Cisco Technology, Inc. Calendaring activities based on communication processing
US8982472B2 (en) * 2013-05-21 2015-03-17 Matvey Lvovskiy Method of widening of angular field of view of collimating optical systems
FR3006841B1 (fr) * 2013-06-07 2015-07-03 Kolor Fusion of several video streams
US10363486B2 (en) 2013-06-10 2019-07-30 Pixel Press Technology, LLC Smart video game board system and methods
US9579573B2 (en) * 2013-06-10 2017-02-28 Pixel Press Technology, LLC Systems and methods for creating a playable video game from a three-dimensional model
US10977864B2 (en) 2014-02-21 2021-04-13 Dropbox, Inc. Techniques for capturing and displaying partial motion in virtual or augmented reality scenes
US9392212B1 (en) 2014-04-17 2016-07-12 Visionary Vr, Inc. System and method for presenting virtual reality content to a user
US9665170B1 (en) 2015-06-10 2017-05-30 Visionary Vr, Inc. System and method for presenting virtual reality content to a user based on body posture
DE102015116868A1 (de) 2015-10-05 2017-04-06 Christoph Greiffenbach Presentation system for advertising purposes and for displaying a product
KR20180110051A (ko) * 2016-02-05 2018-10-08 Magic Leap, Inc. Systems and methods for augmented reality
US10547704B2 (en) 2017-04-06 2020-01-28 Sony Interactive Entertainment Inc. Predictive bitrate selection for 360 video streaming
US10217488B1 (en) * 2017-12-15 2019-02-26 Snap Inc. Spherical video editing
KR102157160B1 (ko) * 2018-12-27 2020-09-17 Darwintech Co., Ltd. 360-degree virtual image experience system
US11683464B2 (en) * 2018-12-28 2023-06-20 Canon Kabushiki Kaisha Electronic device, control method, and non-transitory computer readable medium
US11503227B2 (en) 2019-09-18 2022-11-15 Very 360 Vr Llc Systems and methods of transitioning between video clips in interactive videos

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3580978A (en) * 1968-06-06 1971-05-25 Singer General Precision Visual display method and apparatus
US5495576A (en) * 1993-01-11 1996-02-27 Ritchey; Kurtis J. Panoramic image based virtual reality/telepresence audio-visual system and method
US5619255A (en) * 1994-08-19 1997-04-08 Cornell Research Foundation, Inc. Wide-screen video system
US5999220A (en) * 1997-04-07 1999-12-07 Washino; Kinya Multi-format audio/video production system with frame-rate conversion

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3872238A (en) * 1974-03-11 1975-03-18 Us Navy 360 Degree panoramic television system
JPS5124211A (en) * 1974-08-23 1976-02-27 Victor Company Of Japan Audio signal frequency conversion device
JPS60141087A (ja) * 1983-12-28 1985-07-26 Tsutomu Ohashi Environment reproduction device
US5130794A (en) * 1990-03-29 1992-07-14 Ritchey Kurtis J Panoramic display system
US5130815A (en) * 1990-07-20 1992-07-14 Mti Associates Method and apparatus for encoding a video signal having multi-language capabilities
US5148310A (en) * 1990-08-30 1992-09-15 Batchko Robert G Rotating flat screen fully addressable volume display system
ES2043549B1 (es) * 1992-04-30 1996-10-01 Jp Producciones Sl Improved integrated system for recording, projecting, viewing and listening to images and/or virtual reality.
JPH06301390A (ja) * 1993-04-12 1994-10-28 Sanyo Electric Co Ltd Three-dimensional sound image control device
US5850352A (en) * 1995-03-31 1998-12-15 The Regents Of The University Of California Immersive video, including video hypermosaicing to generate from multiple video views of a scene a three-dimensional video mosaic from which diverse virtual video scene images are synthesized, including panoramic, scene interactive and stereoscopic images
US5991085A (en) * 1995-04-21 1999-11-23 I-O Display Systems Llc Head-mounted personal visual display apparatus with image generator and holder
US5703604A (en) * 1995-05-22 1997-12-30 Dodeca Llc Immersive dodecaherdral video viewing system
JPH11308608A (ja) * 1998-02-19 1999-11-05 Nippon Lsi Card Co Ltd Moving image generation method, moving image generation device, and moving image presentation device
JP3232408B2 (ja) * 1997-12-01 2001-11-26 Nippon LSI Card Co., Ltd. Image generation device, image presentation device, and image generation method
US6064423A (en) * 1998-02-12 2000-05-16 Geng; Zheng Jason Method and apparatus for high resolution three dimensional display
JP2002516121A (ja) * 1998-03-03 2002-06-04 Arena, Inc. System and method for tracking and assessing movement skills in multidimensional space
EP1099343A4 (en) * 1998-05-13 2007-10-17 Infinite Pictures Inc PANORAMIC FILMS SIMULATING A DISPLACEMENT IN A MULTI-DIMENSIONAL SPACE
JP3449937B2 (ja) * 1999-01-14 2003-09-22 Nippon Telegraph and Telephone Corporation Panoramic image creation method, surrounding-situation transmission method using panoramic images, and recording medium recording these methods
JP4453119B2 (ja) * 1999-06-08 2010-04-21 Sony Corporation Camera calibration device and method, image processing device and method, program providing medium, and camera
GB9914914D0 (en) * 1999-06-26 1999-08-25 British Aerospace Measurement apparatus for measuring the position and orientation of a first part to be worked, inspected or moved
JP2001108421A (ja) * 1999-10-13 2001-04-20 Sanyo Electric Co Ltd Three-dimensional modeling device, three-dimensional modeling method, and medium recording a three-dimensional modeling program
US6765569B2 (en) * 2001-03-07 2004-07-20 University Of Southern California Augmented-reality tool employing scene-feature autocalibration during camera motion

Cited By (71)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020147586A1 (en) * 2001-01-29 2002-10-10 Hewlett-Packard Company Audio annoucements with range indications
JP2005099064A (ja) * 2002-09-05 2005-04-14 Sony Computer Entertainment Inc Display system, display control device, display device, display method, and user interface device
US20040125044A1 (en) * 2002-09-05 2004-07-01 Akira Suzuki Display system, display control apparatus, display apparatus, display method and user interface device
US20100195913A1 (en) * 2002-12-31 2010-08-05 Rajeev Sharma Method and System for Immersing Face Images into a Video Sequence
US7826644B2 (en) 2002-12-31 2010-11-02 Rajeev Sharma Method and system for immersing face images into a video sequence
US7734070B1 (en) 2002-12-31 2010-06-08 Rajeev Sharma Method and system for immersing face images into a video sequence
WO2004088994A1 (de) * 2003-04-02 2004-10-14 Daimlerchrysler Ag Device for taking the viewer's position into account when displaying 3D image content on 2D display devices
US7256779B2 (en) * 2003-05-08 2007-08-14 Nintendo Co., Ltd. Video game play using panoramically-composited depth-mapped cube mapping
US20040222988A1 (en) * 2003-05-08 2004-11-11 Nintendo Co., Ltd. Video game play using panoramically-composited depth-mapped cube mapping
US7118228B2 (en) 2003-11-04 2006-10-10 Hewlett-Packard Development Company, L.P. Image display system
WO2005064440A2 (de) * 2003-12-23 2005-07-14 Siemens Aktiengesellschaft Device and method for positionally accurate superposition of the real field of view
WO2005064440A3 (de) * 2003-12-23 2006-01-26 Siemens Ag Device and method for positionally accurate superposition of the real field of view
US20050231532A1 (en) * 2004-03-31 2005-10-20 Canon Kabushiki Kaisha Image processing method and image processing apparatus
US7728852B2 (en) * 2004-03-31 2010-06-01 Canon Kabushiki Kaisha Image processing method and image processing apparatus
US20080024594A1 (en) * 2004-05-19 2008-01-31 Ritchey Kurtis J Panoramic image-based virtual reality/telepresence audio-visual system and method
US20060069591A1 (en) * 2004-09-29 2006-03-30 Razzano Michael R Dental image charting system and method
US20060285636A1 (en) * 2004-09-29 2006-12-21 Interactive Diagnostic Imaging, Inc. Dental image charting system and method
US9958934B1 (en) * 2006-05-01 2018-05-01 Jeffrey D. Mullen Home and portable augmented reality and virtual reality video game consoles
US10838485B2 (en) 2006-05-01 2020-11-17 Jeffrey D. Mullen Home and portable augmented reality and virtual reality game consoles
US20080030573A1 (en) * 2006-05-11 2008-02-07 Ritchey Kurtis J Volumetric panoramic sensor systems
US20080007617A1 (en) * 2006-05-11 2008-01-10 Ritchey Kurtis J Volumetric panoramic sensor systems
US8953057B2 (en) 2006-05-22 2015-02-10 Canon Kabushiki Kaisha Display apparatus with image-capturing function, image processing apparatus, image processing method, and image display system
EP1860612A3 (en) * 2006-05-22 2011-08-31 Canon Kabushiki Kaisha Image distortion correction
EP1860612A2 (en) * 2006-05-22 2007-11-28 Canon Kabushiki Kaisha Image distortion correction
US20070268316A1 (en) * 2006-05-22 2007-11-22 Canon Kabushiki Kaisha Display apparatus with image-capturing function, image processing apparatus, image processing method, and image display system
US10819954B2 (en) 2006-11-16 2020-10-27 Immersive Licensing, Inc. Distributed video sensor panoramic imaging system
US20080117288A1 (en) * 2006-11-16 2008-05-22 Imove, Inc. Distributed Video Sensor Panoramic Imaging System
US10375355B2 (en) 2006-11-16 2019-08-06 Immersive Licensing, Inc. Distributed video sensor panoramic imaging system
US8771064B2 (en) 2010-05-26 2014-07-08 Aristocrat Technologies Australia Pty Limited Gaming system and a method of gaming
US9236000B1 (en) * 2010-12-23 2016-01-12 Amazon Technologies, Inc. Unpowered augmented reality projection accessory display device
US10031335B1 (en) 2010-12-23 2018-07-24 Amazon Technologies, Inc. Unpowered augmented reality projection accessory display device
US9383831B1 (en) * 2010-12-23 2016-07-05 Amazon Technologies, Inc. Powered augmented reality projection accessory display device
US9766057B1 (en) 2010-12-23 2017-09-19 Amazon Technologies, Inc. Characterization of a scene with structured light
US9721386B1 (en) 2010-12-27 2017-08-01 Amazon Technologies, Inc. Integrated augmented reality environment
US9508194B1 (en) 2010-12-30 2016-11-29 Amazon Technologies, Inc. Utilizing content output devices in an augmented reality environment
US9607315B1 (en) 2010-12-30 2017-03-28 Amazon Technologies, Inc. Complementing operation of display devices in an augmented reality environment
US20120281128A1 (en) * 2011-05-05 2012-11-08 Sony Corporation Tailoring audio video output for viewer position and needs
US9950262B2 (en) 2011-06-03 2018-04-24 Nintendo Co., Ltd. Storage medium storing information processing program, information processing device, information processing system, and information processing method
US10471356B2 (en) 2011-06-03 2019-11-12 Nintendo Co., Ltd. Storage medium storing information processing program, information processing device, information processing system, and information processing method
US20120307001A1 (en) * 2011-06-03 2012-12-06 Nintendo Co., Ltd. Information processing system, information processing device, storage medium storing information processing program, and moving image reproduction control method
CN105324984A (zh) * 2013-12-09 2016-02-10 Cj Cgv Co., Ltd. Method and system for generating multi-projection images
US20160328824A1 (en) * 2013-12-09 2016-11-10 Cj Cgv Co., Ltd. Method and system for generating multi-projection images
US20160127723A1 (en) * 2013-12-09 2016-05-05 Cj Cgv Co., Ltd. Method and system for generating multi-projection images
CN106165402A (zh) * 2014-04-22 2016-11-23 Sony Corporation Information reproduction device, information reproduction method, information recording device, and information recording method
EP3136713A4 (en) * 2014-04-22 2017-12-06 Sony Corporation Information reproduction device, information reproduction method, information recording device, and information recording method
US20170329394A1 (en) * 2016-05-13 2017-11-16 Benjamin Lloyd Goldstein Virtual and augmented reality systems
US20190149731A1 (en) * 2016-05-25 2019-05-16 Livit Media Inc. Methods and systems for live sharing 360-degree video streams on a mobile device
FR3057430A1 (fr) * 2016-10-10 2018-04-13 Immersion Device for immersion in a representation of an environment resulting from a set of images
US20180176708A1 (en) * 2016-12-20 2018-06-21 Casio Computer Co., Ltd. Output control device, content storage device, output control method and non-transitory storage medium
US10593012B2 (en) 2017-03-22 2020-03-17 Mediatek Inc. Method and apparatus for generating and encoding projection-based frame with 360-degree content represented in projection faces packed in segmented sphere projection layout
TWI666912B (zh) * 2017-03-22 2019-07-21 聯發科技股份有限公司 具有由封包於分段球體投影設計並以投影面表示之360度內容之投影訊框生成及編碼之方法及裝置
US20180342267A1 (en) * 2017-05-26 2018-11-29 Digital Domain, Inc. Spatialized rendering of real-time video data to 3d space
US10796723B2 (en) * 2017-05-26 2020-10-06 Immersive Licensing, Inc. Spatialized rendering of real-time video data to 3D space
US11361640B2 (en) 2017-06-30 2022-06-14 Johnson Controls Tyco IP Holdings LLP Security camera system with multi-directional mount and method of operation
US11288937B2 (en) 2017-06-30 2022-03-29 Johnson Controls Tyco IP Holdings LLP Security camera system with multi-directional mount and method of operation
US20190007672A1 (en) * 2017-06-30 2019-01-03 Bobby Gene Burrough Method and Apparatus for Generating Dynamic Real-Time 3D Environment Projections
US10713811B2 (en) 2017-09-29 2020-07-14 Sensormatic Electronics, LLC Security camera system with multi-directional mount and method of operation
US20190104282A1 (en) * 2017-09-29 2019-04-04 Sensormatic Electronics, LLC Security Camera System with Multi-Directional Mount and Method of Operation
WO2019076667A1 (en) * 2017-10-16 2019-04-25 Signify Holding B.V. METHOD AND CONTROL DEVICE FOR CONTROLLING A PLURALITY OF LIGHTING DEVICES
US11234312B2 (en) 2017-10-16 2022-01-25 Signify Holding B.V. Method and controller for controlling a plurality of lighting devices
US11703942B2 (en) 2017-12-08 2023-07-18 Telefonaktiebolaget Lm Ericsson (Publ) System and method for interactive 360 video playback based on user location
US10712810B2 (en) * 2017-12-08 2020-07-14 Telefonaktiebolaget Lm Ericsson (Publ) System and method for interactive 360 video playback based on user location
CN109996060A (zh) * 2017-12-30 2019-07-09 Shenzhen Dlodlo New Technology Co., Ltd. Virtual reality cinema system and information processing method
US11086395B2 (en) * 2019-02-15 2021-08-10 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
US11914761B1 (en) * 2019-07-03 2024-02-27 Saec/Kinetic Vision, Inc. Systems and methods for virtual artificial intelligence development and testing
US11372474B2 (en) * 2019-07-03 2022-06-28 Saec/Kinetic Vision, Inc. Systems and methods for virtual artificial intelligence development and testing
US11644891B1 (en) * 2019-07-03 2023-05-09 SAEC/KineticVision, Inc. Systems and methods for virtual artificial intelligence development and testing
US11816757B1 (en) * 2019-12-11 2023-11-14 Meta Platforms Technologies, Llc Device-side capture of data representative of an artificial reality environment
CN112233048A (zh) * 2020-12-11 2021-01-15 Chengdu Chengdian Guangxin Technology Co., Ltd. Spherical video image correction method
CN113824746A (zh) * 2021-11-25 2021-12-21 Shandong Vocational College of Information Technology Virtual reality information transmission method and virtual reality system
CN117557446A (zh) * 2023-11-24 2024-02-13 Beijing Tongbu Fengyun Technology Co., Ltd. Real-time image processing and software control method for a spherical LED screen

Also Published As

Publication number Publication date
US20100302348A1 (en) 2010-12-02
WO2003001803A1 (en) 2003-01-03
JP2005500721A (ja) 2005-01-06
US7688346B2 (en) 2010-03-30
US20060082643A1 (en) 2006-04-20
DE10197255T5 (de) 2004-10-14

Similar Documents

Publication Publication Date Title
US7688346B2 (en) VTV system
US7719563B2 (en) VTV system
JP2005500721A5 (ja)
US10645369B2 (en) Stereo viewing
CN110463195B (zh) Method and device for rendering timed text and graphics in virtual reality video
EP0793392B1 (en) Method and apparatus for the transmission and the reception of three-dimensional television signals of stereoscopic images
CN113099204B (zh) Remote real-scene augmented reality method based on a VR head-mounted display device
KR20170017700A (ko) Electronic device for generating a 360-degree 3D stereoscopic image and method therefor
EP1919219A1 (en) Video transmitting apparatus, video display apparatus, video transmitting method and video display method
EP3301933A1 (en) Methods, devices and stream to provide indication of mapping of omnidirectional images
KR20200065087A (ko) Multi-viewpoint-based 360 video processing method and device therefor
KR101825063B1 (ko) Hardware system for stereoscopic image input on a flat panel
JP2018033107A (ja) Video distribution device and distribution method
Zheng et al. Research on panoramic stereo live streaming based on the virtual reality
CN114040097A (zh) Large-scene interactive motion capture system based on multi-channel image acquisition and fusion
JP3520318B2 (ja) Video composition processing device, method, and system
JP2007323481A (ja) Video data transmission system and method, transmission processing device and method, and reception processing device and method
WO2023202897A1 (en) A method and apparatus for encoding/decoding a 3d scene
JP2004048803A (ja) Video composition processing system
JP2022021886A (ja) VR video generation device and program

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION