US20070157276A1 - Web page based video service and apparatus - Google Patents

Web page based video service and apparatus

Info

Publication number
US20070157276A1
US20070157276A1 (application US11/684,704)
Authority
US
United States
Prior art keywords
user
plurality
video
users
videos
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/684,704
Inventor
Francis Maguire
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SIMULATED PERCEPTS LLC
Original Assignee
Maguire Francis J Jr
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US6323297P
Priority to US09/177,356 (US7190392B1)
Application filed by Maguire Francis J Jr
Priority to US11/684,704
Publication of US20070157276A1
Assigned to VIDEO WEB SHARE, LLC (Assignors: MAGUIRE, FRANCIS J., JR.)
Assigned to SIMULATED PERCEPTS, LLC (Assignors: VIDEO WEB SHARE, LLC)
Application status: Abandoned

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/16 Analogue secrecy systems; Analogue subscription systems
    • H04N7/173 Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • H04N7/17309 Transmission or handling of upstream communications
    • H04N7/17318 Direct or substantially direct transmission and handling of requests
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47202 End-user interface for requesting content, additional data or services; End-user interface for interacting with content for requesting content on demand, e.g. video on demand
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/65 Transmission of management data between client and server
    • H04N21/658 Transmission by the client directed to the server
    • H04N21/6587 Control parameters, e.g. trick play commands, viewpoint selection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed circuit television systems, i.e. systems in which the signal is not broadcast
    • H04N7/181 Closed circuit television systems, i.e. systems in which the signal is not broadcast for receiving images from a plurality of remote sources

Abstract

A telepresence server connects to a telecommunications network to provide a plurality of passive users with access to a reality engine. The reality engine can be controlled by an active user or a professional director through the network, or by a local professional director. Active/passive mode displays can be used by the passive users in a passive mode, and an active/passive mode display can be used by the active user in an active mode.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation of U.S. patent application Ser. No. 09/177,356, filed Oct. 23, 1998, which claims priority from U.S. Provisional Application Ser. No. 60/063,232, filed Oct. 23, 1997.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The invention relates to communication of images and, more particularly, to telepresence, including remote video monitoring.
  • 2. Discussion of Related Art
  • Remote monitoring systems are known to include remotely located video cameras positioned for monitoring from a remote site with a personal computer or display. Such cameras can be connected by any kind of connection, such as point-to-point over a telephone line, via the internet, or through an internet hub. A video server is used to capture successive real-time images from a video camera, digitize and compress them, and transfer them frame by frame over the internet, an intranet, or a point-to-point protocol direct dial-in connection.
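  • The frame-by-frame transfer just described can be illustrated by the following minimal sketch of a video-server loop, assuming a TCP connection and a placeholder capture_frame() standing in for real camera hardware; the names and the length-prefixed framing are illustrative assumptions, not taken from the text:

```python
import socket
import struct
import zlib

def capture_frame() -> bytes:
    # Stand-in for grabbing and digitizing one frame from the camera.
    return bytes(320 * 240)

def serve_frames(port: int = 8000) -> None:
    # Accept one viewer and push compressed frames one at a time.
    with socket.create_server(("", port)) as srv:
        conn, _addr = srv.accept()
        with conn:
            while True:
                frame = zlib.compress(capture_frame())       # digitize + compress
                conn.sendall(struct.pack("!I", len(frame)))  # length prefix
                conn.sendall(frame)                          # frame-by-frame transfer
```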
  • Telepresence is similar in concept to “virtual reality” except that images and other stimuli are provided to the user via a connection in a telecommunications network. One approach uses a teleoperated camera platform coupled to the head movements of a remote user wearing a head-tracked, head-mounted display (HTHMD). See U.S. Pat. No. 5,436,638 at column 1, lines 43-48 and column 3, lines 10-31. Instead of an HTHMD, a desktop display can be yoked to the movements of a user seated before the display, such as shown in FIGS. 13, 14A, 14B and 16 of U.S. Pat. No. 5,436,638. See also the PUSH desktop display and the BOOM3C head-coupled stereoscopic display, either hand-guided or hands-free (head-guided), of Fakespace, Inc., Menlo Park, Calif. Another approach is to use a remote reality engine with prerecorded scenarios for selection over the network according to monitored movements of the user. Due to the limited bandwidth typically available for such connections, the rate of frame delivery is very slow, and therefore there is a noticeable lag between the time of image capture or retrieval and display. Moreover, the amount of video information conveyed is rather limited since the technology is based on the existing NTSC infrastructure. Consequently, the above described applications for telepresence tend to be lacking in the “presence” aspect, and likewise remote viewing tends to be confined to rather static applications, e.g., industrial plant process monitoring, employee parking lot monitoring, security monitoring for plant ingress/egress, and the like.
  • However, various competing transport technologies are now being deployed to increase the bandwidth enormously and thereby speed up such connections. These include optical fiber networks, cable, satellite, and techniques to utilize the existing telephony infrastructure of twisted copper pairs as digital subscriber lines. Included in the services deliverable on the links provided according to such technologies will be HDTV. While the bandwidth of such links now being deployed to subscribers can be heavily proportioned in the downstream direction, they also provide at least a significant amount of upstream bandwidth. As a result, there will now be new opportunities for far more dynamic types of telepresence applications, including remote video monitoring, particularly on the Internet, and in ways heretofore never even contemplated. In particular, it can be foreseen that there will be extremely high demand for exciting, new telepresence applications.
  • Unfortunately, these telepresence applications suffer from an underlying assumption borrowed from the art of “virtual reality,” where the user is enabled to navigate within a virtual environment in a highly autonomous manner. The user takes command of the virtual environment and actively controls all of the responses of the reality engine according to monitored activity of the user. This dedication of the tools needed to generate the virtual environment to a single user makes the reality engine unavailable to all but that one user at a given time. A similar situation exists for a remotely located video camera. Since these tools are quite expensive, the cost of use for the single user is high. Hence the anticipated demand cannot be efficiently and economically met.
  • SUMMARY OF THE INVENTION
  • An object of the present invention is to provide a new type of telepresence, including remote monitoring, that takes advantage of the increased bandwidth on links now being deployed.
  • Another object of the present invention is to provide telepresence to more than one user at a given time.
  • According to a first aspect of the present invention, a system for providing video images comprises a video camera for providing video signals indicative of said video images captured by said video camera; a first display, responsive to said video signals, for providing said video images for viewing by a first user; an n-axis sensor, responsive to n-axis first display motions caused by said first user, for providing an n-axis attitude command signal; an n-axis platform having said video camera mounted thereon, responsive to said n-axis attitude command signal, for executing n-axis platform motions emulative of said n-axis first display motions; and one or more second displays, responsive to said video signals, for providing said video images for viewing by one or more corresponding second users, and responsive to said n-axis attitude command signal for executing n-axis second display motions emulative of said n-axis first display motions.
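  • A minimal sketch of the first aspect's signal flow follows: one sensed attitude is fanned out so that the camera platform and every second (passive) display emulate the first display's motion. The class and callback names are illustrative assumptions, not from the specification:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Attitude:
    yaw: float    # degrees
    pitch: float
    roll: float

class AttitudeFanout:
    """Distributes one n-axis attitude command to many emulating sinks."""

    def __init__(self) -> None:
        self._sinks: List[Callable[[Attitude], None]] = []

    def attach(self, sink: Callable[[Attitude], None]) -> None:
        self._sinks.append(sink)

    def publish(self, sensed: Attitude) -> None:
        # The n-axis platform and each second display receive the same
        # command and execute motions emulative of the first display's.
        for sink in self._sinks:
            sink(sensed)

# Usage (with hypothetical platform/display objects):
# fanout = AttitudeFanout()
# fanout.attach(camera_platform.move)
# fanout.attach(second_display.move)
# fanout.publish(Attitude(yaw=10.0, pitch=-5.0, roll=0.0))
```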
  • According to a second aspect of the present invention, a system comprises at least one reality engine for providing an image signal indicative of images taken from various attitudes, and a telepresence server, responsive to said image signal, for providing said image signal and an attitude control signal to at least one attitudinally actuatable display via a telecommunications network, for attitudinally actuating said display so as to guide a viewing attitude of a user, and for displaying said images for said user of said at least one attitudinally actuatable display for passively viewing said images from said various attitudes. The telepresence server can provide access to said reality engine for an active user of a display attitudinally actuatable by said active user, for providing said attitude control signal to said reality engine and to said telepresence server, wherein the active user is drawn from the general public with no special training. Alternatively, the telepresence server can provide access to said reality engine for a trained director, who can be local (not needing network access to the server) or remote (accessing the server via the network).
  • According to a third aspect of the present invention, a display device comprises an n-axis display platform, responsive in a passive mode to an attitudinal control signal, for guiding a user's head to execute attitudinal movements, and responsive in an active mode to attitudinal movements of a user's head for providing sensed signals indicative of said attitudinal movements, and a display connected to said n-axis display platform, responsive to a video signal, for displaying images corresponding to said attitudinal movements.
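  • The display device of the third aspect can be sketched as a platform with two operating modes: in active mode its sensors report the user's head movements upstream, and in passive mode its actuators guide the user's head along received movements. This is a sketch under assumed interfaces, not the patent's implementation:

```python
from enum import Enum, auto
from typing import Optional, Tuple

Att = Tuple[float, float, float]  # yaw, pitch, roll in degrees

class Mode(Enum):
    ACTIVE = auto()   # head movements are sensed and sent upstream
    PASSIVE = auto()  # received movements drive the platform to guide the head

class DualModeDisplay:
    def __init__(self) -> None:
        self.mode = Mode.PASSIVE
        self._attitude: Att = (0.0, 0.0, 0.0)

    def read_sensors(self) -> Att:
        # Placeholder for the n-axis platform's attitude sensors.
        return self._attitude

    def drive_actuators(self, attitude: Att) -> None:
        # Placeholder for the motors that guide the user's head.
        self._attitude = attitude

    def step(self, received: Optional[Att] = None) -> Optional[Att]:
        if self.mode is Mode.ACTIVE:
            return self.read_sensors()      # sensed signal goes upstream
        if received is not None:
            self.drive_actuators(received)  # attitudinal control signal applied
        return None
```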
  • These and other objects, features and advantages of the present invention will become more apparent in light of the following detailed description of a best mode embodiment thereof, as illustrated in the accompanying drawing.
  • BRIEF DESCRIPTION OF THE DRAWING
  • FIG. 1 shows a prior art remote monitoring application;
  • FIG. 2 shows plural cameras mounted on corresponding n-axis platforms for being moved in an n-axis manner to monitor remote sites under the control of corresponding plural remote viewers each using a display such as a head mounted display having its attitude and/or position monitored in a corresponding n-axis manner;
  • FIG. 3 illustrates that one or more of the remote viewers may be passive viewers whose attitudinal head movements are not monitored at all but rather are guided to emulate the attitudinal head movements of an active viewer;
  • FIG. 4 shows one type of display for use by a passive viewer, according to the present invention;
  • FIG. 5 shows a telepresence server with a reality engine under the control of an active user, a local director, or a remote director and a plurality of passive users all interconnected by a communications network;
  • FIG. 6 shows a three-axis display that is usable in an active mode or a passive mode;
  • FIG. 7 shows a monitor screen where a user can choose a reality engine located at a remote tourist site for remotely viewing the chosen site; and
  • FIG. 8 shows the three-axis display such as shown in FIG. 6 in schematic block diagram form connected to the communications network of FIG. 5 via a signal processor.
  • DETAILED DESCRIPTION OF EMBODIMENTS
  • FIG. 1 shows a prior art internet remote video monitoring application where a user 10 of a personal computer (PC) 12 can access the internet 14 via a modem 16 and a local internet service provider (ISP) 18. Another service provider 20 is connected via a router 22 and a hub 24 to a plurality of video servers 26, . . . , 28 which are in turn connected to a plurality of video cameras 30, . . . , 32. The cameras can be located in different parts of an industrial plant such as the factory 34 and the warehouse 36. In this way the user 10 can monitor various parts of the plant remotely. It is even possible to remotely control the cameras, e.g., by controlling their lenses 38, 40 to zoom in and out. It should be realized, however, that the camera or cameras can be located anywhere and that the internet 14 can be any kind of connection or connections, provided the connection is bidirectional.
  • Given the typical bandwidth limitations of existing internet access methods and similar connections, this way of remote video monitoring has been found to be effective for rather static applications such as security monitoring. For example, a security officer sits before a PC or other display (or bank of displays) and monitors the desired points in various plants of a company from a single remote monitoring site. For this sort of application, a large amount of bandwidth is not particularly important, hence the proven success of such relatively static applications. On the other hand, more dynamic remote video monitoring applications, such as entertainment or education, cannot be expected to be viable using such limited bandwidth connections.
  • Telepresence concepts are shown implemented in FIG. 2 in a wideband network. A wideband internet service provider (WISP) 42 is shown connected to a plurality of displays for users at different geographical locations. The WISP 42 may be owned by a local telephone or cable TV company, for example, and can provide broadband services on any of the competing media illustrated, such as a copper pair 44, coaxial cable 46, an optical fiber 48, or equivalents such as wireless including satellite communications. Each of these will deploy its own peculiar subscriber interface 50, 52, 54, respectively. For instance, the twisted copper pair may be used as an Asymmetric Digital Subscriber Line (ADSL) using, e.g., Discrete MultiTone (DMT) or Carrierless Amplitude Phase (CAP) modulation, or the like, and terminate with an interface 50 comprising an ADSL modem. Of course, other types of so-called xDSL technologies can be used as well. These include but are not limited to IDSL (ISDN DSL), HDSL (High-bit-rate DSL), SDSL (Symmetric DSL), RADSL (Rate-Adaptive DSL), and VDSL (Very-high-rate DSL). On the other hand, the cable interface 52 might simply be part of a set top converter, for example, also used for conventional television as modified for web access, e.g., by the WebTV Networks unit of Microsoft, or by other providers such as Oracle Corp., Sun Microsystems Inc., Netscape Communications, General Instrument, Thomson SA, WorldGate Communications Inc., or the like. The fiber interface 54 might be an optical network unit (ONU) that converts downstream optical signals to electrical signals and formats them for a copper or cable connection to the subscriber, and the reverse for upstream signals. The fiber 48 could also terminate in the user's premises. It should be mentioned that other delivery systems are possible as well, including LMDS. Similarly, the internet is presently being augmented to service increased bandwidth requirements.
  • Considering the enormously increased bandwidth provided by the WISP 42, e.g., 7 or 8 Mbit/sec compared to 33 kbit/sec for the modem 16 of FIG. 1, it is now possible to provide new and more dynamic functions for remote video monitoring. For instance, FIG. 2 shows Head Mounted Displays (HMDs) 56, 58, 60 for use by three different subscribers connected to the respective interfaces 50, 52, 54 by signal lines 62 a, 64, 66, which can be bidirectional. Bidirectionality may be employed for conveying broadband video information downstream and command data upstream in a narrowband. The command data may be entered through a PC (not shown) or by means of input devices on the HMD itself, for example. It should be realized that the example of FIG. 2 does not exclude the possible transmission of wideband information, such as video, from the subscriber to the network as well, such as by using SDSL, mentioned above, or by making ADSL behave symmetrically although with a reduced downstream rate.
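  • A rough back-of-the-envelope calculation shows why the quoted rates matter; the per-frame size below is an assumed figure for a compressed frame, not taken from the text:

```python
# How many compressed frames per second fit through each link?
FRAME_BITS = 8 * 25_000  # assume ~25 kB per compressed frame

for link, bps in [("33 kbit/s modem (FIG. 1)", 33_000),
                  ("8 Mbit/s wideband link (WISP 42)", 8_000_000)]:
    print(f"{link}: {bps / FRAME_BITS:.2f} frames/s")
# -> roughly 0.17 frames/s (one frame every ~6 s) versus about 40 frames/s
```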
  • Various head mounted displays are known. One type is a see-through display where the real world view of the user is “augmented” with imagery from an image source, called “augmented reality”. Another type completely blocks light from the outside and is for use in a completely virtual environment. Yet another type is a “video see-through” where the user wears stereo cameras on his head which provide images for perception of the surroundings using a head mounted display. All of these types of HMDs can be used to implement the present invention. However, many of these displays use bulky optics and related heavy cables which are somewhat burdensome. Moreover, presently available optics have a rather narrow field of view and present video image resolution is rather poor.
  • A particularly attractive recent innovation for the purposes of the present invention is the retinal display, which does away with the external display and the associated optics entirely. There is no comparable problem with narrow field of view and low resolution with a retinal display. A retinal display has been disclosed for providing a scanning light signal for forming images directly in the eye of a viewer: U.S. Pat. No. 5,467,104 shows the projection of a modulated scanning light signal directly onto the retina of the viewer's eye without the prior formation of any real or aerial image outside the viewer's eye. In other words, light rays do not converge in any way outside the eye to form an image. That patent shows modulated photons of the light signal reflected from one or more scanners by way of projection optics directly onto the retina. A micromechanical scanner can be used as the scanning device, as shown in U.S. Pat. No. 5,557,444 (based on U.S. patent application Ser. No. 08/329,508, filed Oct. 26, 1994). An optical fiber may be used to provide the light signal from the photon source to the scanner, as shown in U.S. Pat. No. 5,596,339, in order to promote a lightweight, head mounted, panoramic display.
  • In addition to the HMDs 56, 58, 60, a respective plurality of attitude sensors 62, 64, 66 are shown for mounting on the head of the user for sensing the rotational movements of the user's head and providing a sensed signal on a line 68, 70, 72, respectively, to interfaces 50, 52, 54 for upstream transmission. Such a device for determining the orientation of a user's head using accelerometers is shown in U.S. Pat. No. 5,615,132 to Horton et al. Another is shown in U.S. Pat. No. 5,645,077 to Foxlin. Yet another is provided by Precision Navigation, Inc., 1235 Pear Avenue, Suite 111, Mountain View, Calif. 94043. For a simple case, it is assumed that translatory position (translation) of the user's head is not measured or, if measured, is ignored. A further simplification reduces the number of rotational degrees of freedom that are measured from three to two (e.g., pan (yaw) and tilt (pitch) as described below), or even just one. This simplification does not exclude the measurement of translations, however. The WISP 42 is connected by a signal on a line 74 and via the internet 76 and a signal on a line 78 to another WISP 80, connected in turn to a plurality of video servers 82, 84, 86 by signals on lines 87, 88, 89. It should be realized that there need not be two separate WISPs 42, 80; in certain circumstances one can suffice. The video servers are connected to a corresponding plurality of cameras 90, 91, 92 by a plurality of signal lines 94, 96, 98. The cameras 90, 91, 92 send video signals via the internet 76 to the HMDs 56, 58, 60, respectively, for display.
  • In the opposite direction, the interfaces 50, 52, 54 transmit attitude command signals, in response to the corresponding sensed attitude signals on the lines 68, 70, 72 from the attitude sensors 62, 64, 66, through the WISP 42, the internet 76, the WISP 80 and the plurality of video servers 82, 84, 86 to a corresponding plurality of n-axis platforms such as three-axis platforms 100, 102, 104.
  • The platforms 100, 102, 104 need not be three-axis, i.e., including pitch, roll and yaw, but may be restricted to only two axes (e.g., pitch and yaw) or even just one (e.g., yaw). For instance, if roll is omitted, a 2-axis platform in the form of a computer controlled pan-tilt (2-axis: yaw-pitch) unit, Model PTU-46-70 or PTU-46-17.5, produced by Directed Perception, Inc., 1485 Rollins Road, Burlingame, Calif. 94010, may be used. Actuators from other manufacturers such as Densitron may be used as well. In addition to one or more of the three attitudinal degrees of freedom, one or more of the three translational degrees of freedom may also be added in any desired combination. For example, a six degree of freedom platform could be provided.
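  • Mapping sensed head attitude onto such a pan-tilt unit amounts to clamping yaw and pitch to the platform's travel limits and emitting a command. The following sketch uses illustrative limits and a hypothetical ASCII command string, not the vendor's actual protocol:

```python
def pan_tilt_command(yaw_deg: float, pitch_deg: float,
                     pan_limits=(-159.0, 159.0),
                     tilt_limits=(-47.0, 31.0)) -> str:
    def clamp(v: float, lo: float, hi: float) -> float:
        return max(lo, min(hi, v))
    pan = clamp(yaw_deg, *pan_limits)      # head yaw -> camera pan
    tilt = clamp(pitch_deg, *tilt_limits)  # head pitch -> camera tilt
    return f"PAN {pan:.1f} TILT {tilt:.1f}"  # hypothetical command format

print(pan_tilt_command(200.0, -10.0))  # clamped: "PAN 159.0 TILT -10.0"
```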
  • While some of the attitudinal or positional degrees of freedom discussed above may be added or subtracted in a given application in different combinations, it should be realized that other degrees of freedom that are different in kind from those discussed above may also be added to an n-axis platform. For instance, the attitude sensor 62, as shown in FIG. 3, can sense two axes only, e.g., yaw (pan) and pitch (tilt), while an additional eye attitude sensor 106, as shown in FIG. 3, can be added for monitoring two degrees of freedom of the eyes of the user of the HMD 56. The eye sensor 106 provides a sensed signal on a line 108 to the interface 50. In that case, a four-degree-of-freedom, i.e., 4-axis, platform 100 would be appropriate: two axes for emulating the pitch and yaw of the user's head and two axes for emulating the pitch and yaw of at least one of the user's eyes. A 4-axis platform (“the vision-head”) for carrying out the above is shown by the ESCHeR high performance stereo-head at the following internet site provided by Rougeaux Sebastian: http://www.etl.go.jp/etl/robotics/Projec...6/node4.html#SECTION000210000000000000000. See also http://www.etl.go.jp/etl/robotics/Projects/CogRobo/escher.html. Other camera motion platforms are available, for instance from Helpmate Robotics Inc., Shelter Rock Lane, Danbury, Conn. 06810-8159 under the product names BiSight/UniSight and Zebra at http://ntplx.net/~helpmate/. Another 4-axis Stereo-Vision Head (TO 40) can be obtained from Robosoft, Technopole d'Izarbel, F-64210 Bidart, France, at http://www.robosoft.fr.
  • Thus various combinations of monitoring of degrees of freedom of body parts can be used. Not only selected head and/or eye attitudinal degrees of freedom but also translatory (positional) degrees of freedom of the head can be monitored in one or more axes. All of these are then emulated on the n-axis platform. Depending on the number of body parts and spatial motions thereof monitored, any correspondingly appropriate multi-axis positioning platform can be used. A platform based on those used for conventional flight simulators but scaled down for a camera-sized application can be used. For instance, an even more scaled down version of the six degree of freedom principle demonstrated by the Polytec PI “Hexapod” can be used (Polytec PI, Inc., Suite 212, 23 Midstate Drive, Auburn, Mass. 01501 USA, the subsidiary of Physik Instrumente (PI) GmbH & Co. and Polytec GmbH, both of Polytec-Platz 5-7, 76337 Waldbronn, Germany).
  • It will now be more fully realized from the foregoing, as mentioned above, that there will now be new opportunities for far more dynamic types of telepresence applications, including remote video monitoring, particularly on the Internet, and in ways heretofore never even contemplated. In particular, it can be foreseen that there will be extremely high demand for exciting, new telepresence applications.
  • As also mentioned above, these telepresence applications suffer from an underlying assumption borrowed from the art of “virtual reality,” where the user is enabled to navigate within a virtual environment in a highly autonomous manner. The user takes command of the virtual environment and actively controls all of the responses of the reality engine according to monitored activity of the user. This has been shown extended to a wideband network in FIG. 2. Such dedication of the tools needed to generate the remote presence to a single user is quite expensive. A way to meet the anticipated high demand in an efficient and economical manner will now be shown.
  • According to the present invention, the remote monitoring carried out under the control of a remote active viewer using an HMD/attitude sensor 56, 62 combination, such as in FIG. 2, can be used to control not only a camera/n-axis platform 90, 100, but also one or more passive viewing platforms for use by one or more remote passive viewers. In other words, both “what” the passive viewers see and the “attitude” in which they see it are controlled by the active viewer. Further, for such embodiments, according to the teachings hereof, the passive viewing need not be contemporaneous with the image acquisition process, but may be delayed in time to any desired extent using a memory or other storage facility.
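  • The time-shifted passive viewing described above reduces, in sketch form, to recording each frame together with the attitude that produced it, so a later replay can steer the passive platform the same way. The storage format below is an assumption for illustration:

```python
import collections
from typing import Deque, Iterator, Tuple

Att = Tuple[float, float, float]  # yaw, pitch, roll

class TimeShiftBuffer:
    def __init__(self, capacity: int = 10_000) -> None:
        self._buf: Deque[Tuple[bytes, Att]] = collections.deque(maxlen=capacity)

    def record(self, frame: bytes, attitude: Att) -> None:
        # The active viewer's attitude is stored with each frame so a
        # passive platform can later replay the same head motions.
        self._buf.append((frame, attitude))

    def replay(self) -> Iterator[Tuple[bytes, Att]]:
        # Passive viewing delayed to any desired extent after acquisition.
        yield from self._buf
```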
  • For instance, FIG. 3 shows that one or more of the remote viewers may be passive viewers whose attitudinal head movements are not monitored at all but are instead mechanically guided to emulate the attitudinal head movements of the active viewer, whose head movements are monitored by the attitude sensor 62 already shown along with the HMD 56 in FIG. 2 and shown again in FIG. 3. Such a passive apparatus 114 is shown in FIG. 3 with a light source 115, e.g., of the type of display used for an HMD 116 (shown in FIG. 4), mounted on a head guide 118, which may be an actuatable multi-axis platform actuated in turn by an actuator 120 under the control of an actuation signal on a line 112. Several examples of such display devices for such passive use are described in more detail in copending application Ser. No. 08/794,122, filed Feb. 3, 1997, now U.S. Pat. No. 6,181,371, which is hereby incorporated by reference. The actuation signal on the line 112 is provided by an interface 122 that receives a signal on a line 124 from the WISP 42. An image signal is also provided on a line 126 from the interface 122 to the light source 115.
  • As shown in further detail in FIG. 4, if the light source 115 is part of an HMD 116, the HMD may be optionally detachable along with the light source 115 from the head guide 118 and, for that purpose, will also optionally include an attitude sensor 62 b so that the HMD and light source, while detached from the head guide, may alternatively be used in an active way by the viewer as previously described in connection with FIG. 3. It should be realized that there need not be any HMD provided with the passive apparatus 114 and that the light source alone can be used with the head guide, with or without the attitude sensor. If used with a sensor, the apparatus 114 could be built in the same way as the various devices shown in U.S. Pat. No. 5,436,638, except with actuators as well as sensors. For instance, the pivot 110 of FIG. 1 or the assembly 770 of FIG. 6 of Bolas et al could be actuated. It should also be realized, however, that such a light-source/head-guide combination can also be designed as a dual use active/passive display apparatus so the user can select to operate in either active or passive mode. The user will want to select active mode if the desired camera is not in use by anyone else. Or, if the desired camera is already in use, the user can select passive mode, and the WISP 42 can then transmit the head attitude signals on the line 68 to both the n-axis camera platform 100 and the apparatus 114, actuating both to emulate the attitudinal head motions of the user of the HMD 56.
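  • The active/passive selection rule just described can be sketched as simple first-come arbitration per camera: the first user to select a free camera controls it, and later joiners are slaved to that user's head attitude signals. The names below are illustrative:

```python
from typing import Optional, Set

class CameraSession:
    def __init__(self) -> None:
        self.active_user: Optional[str] = None
        self.passive_users: Set[str] = set()

    def join(self, user: str) -> str:
        if self.active_user is None:
            self.active_user = user
            return "active"              # may steer the n-axis platform 100
        self.passive_users.add(user)
        return "passive"                 # apparatus 114 is guided instead

    def leave(self, user: str) -> None:
        if user == self.active_user:
            self.active_user = None      # camera is selectable as active again
        else:
            self.passive_users.discard(user)
```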
  • Considering the foregoing, the systems of FIGS. 2-4 can be used in a communication network application such as, but not limited to, the internet. A user of one of the HMDs 56, 58, 60 can contact his internet service provider 42 using an internet browser and access the internet service provider 80 for gaining control of one of the cameras 90, 91, 92 for the purpose of remote viewing. These cameras can be located in different locations such as the factory, warehouse and loading dock of the prior art mentioned above in connection with FIG. 1. However, given the wide bandwidth capabilities now becoming available, the cameras could be located in places that would have wide appeal to the general public such as tourist sites, educational settings, entertainment performances, or the like.
  • A conventional display 128, responsive to a signal on a line 130 from an interface 132, can be used instead of the HMD 56 or a device such as shown in U.S. Pat. No. 5,436,638. An attitude sensor, a conventional input device 134 such as a mouse, joystick or the like, or a sensor such as shown in U.S. Pat. No. 5,436,638 can be used to provide an upstream control signal on a line 136 to the interface 132. The interface 132 interchanges a bidirectional signal on a media line 138 with a wideband internet service provider 140 connected to the internet 76 by a line 142.
  • The wideband internet service provider 80 could own and operate the remotely located cameras and provide internet access to the various active viewers of FIG. 2 through a web page of the provider 80. Or the provider 80 could allow other owners of such cameras or other reality engines to hook up and provide video services to the various active and/or passive users through his web page. Such a wideband internet service provider could become a provider of specialized video services such as remote tourism. A problem with such a remote tourist site is that the demand for active control of a given camera, such as one located at Niagara Falls, could become very high. In that case, the web page of the WISP 80 can give the user intending to use a particular site a choice: active or passive. If the camera at the desired site is not presently in use, then the intending user can choose the active option with his input device and take control of the remote camera by means of one of the attitude sensors 62, 64, 66 or the control device 134 of FIG. 2. But if the camera at the desired site is presently in use, the apparatus 114 of FIG. 3 or 4 becomes very useful because the user can opt to be a passive viewer. In that case, the control signal provided, for instance, by the active user of the HMD 56 to the n-axis camera platform 100 also has another role, i.e., to control the n-axis actuator 120. The actuator 120, in response to the signal on the line 112, causes the head guide to execute n-axis motions in emulation of the n-axis motions of the n-axis platform 100 executed in response to the signals on the line 94. Both the control signals on the lines 112 and 94 are derived from the sensed signal on the line 68 from the head attitude sensor 62. The remote camera is thus caused to execute attitudinal motions emulative of the commands of the remote user from the head attitude sensor 62, and the apparatus 114 is likewise caused to execute attitudinal motions emulative of those commands. Moreover, the video signals provided by the camera 90 via the internet 76 are provided both to the HMD 56 for viewing by the active viewer and to the light source 115 of the apparatus 114 for viewing by the passive viewer. It should be realized that although the attitudinal command signals for controlling the actuator 120 have been described as coming from the sensor 62, they could be sent first to the video server 82 for repackaging and sent to the apparatus 114 along with the video signals by the server 82.
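  • The repackaging option mentioned at the end of the paragraph above can be sketched as bundling each video frame with the attitude sample that produced it into one packet for the passive apparatus. The wire layout is an assumption for illustration, not specified in the text:

```python
import json
import struct

def repackage(frame: bytes, yaw: float, pitch: float, roll: float) -> bytes:
    # 2-byte header length + 4-byte frame length, then header and frame.
    header = json.dumps({"yaw": yaw, "pitch": pitch, "roll": roll}).encode()
    return struct.pack("!HI", len(header), len(frame)) + header + frame

def unpack(packet: bytes):
    hlen, flen = struct.unpack_from("!HI", packet)
    attitude = json.loads(packet[6:6 + hlen])
    frame = packet[6 + hlen:6 + hlen + flen]
    return frame, attitude

pkt = repackage(b"...frame...", yaw=12.5, pitch=-3.0, roll=0.0)
print(unpack(pkt)[1])  # {'yaw': 12.5, 'pitch': -3.0, 'roll': 0.0}
```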
  • FIG. 5 shows a communications network 144 connected to a telepresence server 146 with a web server connected to a plurality of generalized reality engines 148, 149a, 149b. The reality engine 148, e.g., can be one or more “live” cameras on n-axis platforms or prerecorded “virtual reality” programs. A display 150 under the control of an active user is shown in FIG. 5, as described above. The active user can access a web page of the telepresence server 146 and, if the reality engine is not being used, seize control of it with the display 150 as described above. Subsequent users, who can each be at different geographic locations and who want to use the reality engine, cannot be active users but can be passive users using displays 152, 154, 156, . . . , 158. As an alternative to the display 150 under the control of an active user drawn from the general public, the operator of the telepresence server 146 can use the services of a professional local director actively using a display 160, or of a professional remote director actively using a display 162, who accesses the server through the network.
  • It should be realized that the displays need not be the versatile active/passive displays described here. The displays 150, 160, 162 can be designed to be useable purely as active displays such as the display shown in U.S. Pat. No. 5,436,638 to Bolas et al. Likewise, the displays 152, 154, 156, . . . , 158 can be designed to be useable purely as passive displays such as the various displays shown in co-pending U.S. patent application Ser. No. 08/794,122 filed Feb. 3, 1997, now U.S. Pat. No. 6,181,371, or even the simple conventional monitor 128 of FIG. 2. However, selectable active/passive displays are preferred for the reasons explained above.
  • It should also be realized that the selectable mode (active/passive) display does not have to include a detachable helmet mounted display for use when the active mode is selected. For instance, FIG. 6 shows a selectable mode (active/passive) device 163 wherein a display 164 is attached to a shaft 166 that is rotatable 168 about a vertical z-axis 170 in both modes. The user places his hands on hand grips 172, 174 and places his eyes on display viewports 176, 178. The shaft 166 is rotatably mounted in a first platform part, e.g., a disc 180, and is driven in the passive mode by a yaw motor 182 that is fixed to the disc 180. In the active mode, rotations about the z-axis are measured by a yaw sensor 184. The disc 180 is rotatably mounted within a second platform part, e.g., an inner annulus 185, on a pair of pins 186, 188 in the inner annulus for rotating the disc about an x-axis 190. One end of the pin 186 is fixed in the disc 180 while the other end is journaled in a bearing for being rotatably driven by a pitch motor 192 fixed to or in the inner annulus. The pitch motor 192 drives one of the pins 186 as a drive shaft about the x-axis to pitch the disc 180 and the display 164 forward or backward in the passive mode. A pitch sensor 194 mounted in or on the inner annulus 185 senses rotation of the disc 180 about the x-axis in the active mode while the pitch motor is inactive. One pin 196 is shown of a pair of pins fixed in the inner annulus but journaled on bearings in a third platform part, e.g., an outer annulus 198, for rotating the inner annulus about a y-axis 200. A roll motor 202 is fixed on or in the outer annulus 198 and drives the pin 196 as a drive shaft to rotate the inner annulus about the y-axis in the passive mode. A roll sensor 204 is fixed in or on the outer annulus and senses rotation of the inner annulus about the y-axis in the active mode while the roll motor is inactive. It should be realized that the sensors can be used in the passive mode as well, to provide feedback signals for controlling the motors in closed loop fashion. If not used in this way, the attitude of the display in passive mode can be controlled in open loop fashion.
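  • The closed-loop versus open-loop distinction at the end of the paragraph above can be sketched for a single passive-mode axis as follows; the proportional gain and the interfaces are illustrative assumptions, not the device's actual circuit:

```python
from typing import Optional

def axis_setpoint(commanded_deg: float,
                  sensed_deg: Optional[float] = None,
                  gain: float = 0.5) -> float:
    """Return the next motor setpoint for one passive-mode axis."""
    if sensed_deg is None:
        # Open loop: no feedback, so the commanded angle is applied directly.
        return commanded_deg
    # Closed loop: step from the sensed angle toward the command,
    # proportional to the remaining error.
    return sensed_deg + gain * (commanded_deg - sensed_deg)

print(axis_setpoint(30.0))                   # open loop -> 30.0
print(axis_setpoint(30.0, sensed_deg=20.0))  # closed loop -> 25.0
```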
  • FIG. 7 shows a screen 206 of a monitor 208, shown in FIG. 8 connected by a line 212 to a signal processor 210 that drives the monitor. A keyboard 214 is connected to the processor by a line 216. An intending user uses a mouse 218, connected by a line 220 to the processor, to select, for example, one of the tourist sites shown in FIG. 7. The screen 206 shows nine available sites, most of which are inactive but two of which are “now active.” One of the inactive sites, such as the Grand Canyon, can be selected with the mouse, and the user is then able to use the display 164 of FIGS. 6 and 8 in an active way. In that case, the motors 182, 192, 202 are inactive while the sensors 184, 194, 204 are used to indicate the present yaw, pitch and roll, respectively, of the display 164 by providing a sensed yaw signal on a line 222, a sensed pitch signal on a line 224, and a sensed roll signal on a line 226. The signal processor 210, in response to the sensed signals on the lines 222, 224, 226, provides an output signal on a line 228 to a modem 230, which in turn provides the sensed signals on a line 232 to the communications network 144 and on to the reality engine 148 of FIG. 5 via the telepresence server 146. The server sends the sensed yaw, pitch and roll signals to the reality engine, e.g., a 3-axis camera platform such as the platform 100 of FIG. 3 located at the Grand Canyon. A camera on the platform, such as the camera 90, provides an image signal on the line 94 back to the communications network 144 and on to the display 164 through the signal processor 210. In this way, the device 163 of FIG. 8 is used like the display 150 used in active mode by an active user.
  • On the other hand, the user can instead use the mouse 218 to select one of the more popular sites that is already under active control, indicated by “(now active)”, such as Niagara Falls. In that case, the telepresence server 146 and reality engine 148 are responsive to the already active user's actions for causing images to be gathered from attitudes dictated by the active user and for providing the gathered images and the sensed yaw, pitch and roll signals to the device 163 for use in a passive way. In other words, the communications network 144 takes the gathered images and sensed yaw, pitch and roll signals from the device 150 used in an active way and provides them on the line 232 to the modem 230, which in turn provides them to the processor 210 for display on the display 164 and for driving the yaw, pitch and roll motors by control signals on lines 234, 236, 238, thereby controlling the device 163 and hence the attitude of the display 164. In this way, a camera and associated platform at a popular site can be used by more than one user although only one is active. The same principle applies to accessing any kind of popular reality engine (such as preprogrammed “virtual reality” scenarios) which might otherwise be inaccessible because of high demand.
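  • Put together, the passive use of the device 163 reduces to a receive loop: each packet carries a gathered image plus the active user's yaw, pitch and roll, which drive the motors so the display attitude tracks the active user's. The receive, motors, and display interfaces below are placeholders for illustration:

```python
def passive_loop(receive, motors, display) -> None:
    while True:
        frame, yaw, pitch, roll = receive()  # from the telepresence server
        motors.set_yaw(yaw)                  # cf. control signals on lines
        motors.set_pitch(pitch)              # 234, 236, 238 of FIG. 8
        motors.set_roll(roll)
        display.show(frame)                  # gathered image on display 164
```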
  • Although the invention has been shown and described with respect to a best mode embodiment thereof, it should be understood by those skilled in the art that the foregoing and various other changes, omissions and additions in the form and detail thereof may be made therein without departing from the spirit and scope of the invention.

Claims (20)

1. Method, comprising:
providing a web page showing a plurality of available videos from a corresponding plurality of video cameras of various owners, said cameras for acquiring said videos and made available by said owners over a telecommunications network for selection by a plurality of users using said web page on a corresponding plurality of displays in different geographic locations,
providing videos selected by said plurality of users via said telecommunications network to said plurality of users of said corresponding plurality of displays according to selection signals received over said network from said plurality of users wherein each selection signal is indicative of a particular video selected by a particular user of said web page and wherein each of said plurality of available videos is selectable for use in viewing at a same time by more than one user of said plurality of users of said web page at said different geographic locations, and
storing one or more of said plurality of available videos for display on said web page as stored videos for selection by said plurality of users so that video selection for viewing is delayed in time after video acquisition by said cameras to any extent.
2. The method of claim 1, wherein at least one user of said plurality of users is able to use a selected live video as an active user, that is, by providing a user control signal for controlling said live video and, alternatively, to use said live video as a passive user, that is, without providing any user control signal for controlling said live video.
3. The method of claim 1, wherein at least one of said one or more user control signals is provided over said network by an active user for actively controlling a corresponding live video.
4. The method of claim 3, wherein said active user is also able to use a selected live video as a passive user, that is, without providing any user control signal for controlling said live video.
5. The method of claim 3, wherein multiple users selecting said live video actively controlled by said active user only provide selection signals and are not providing any control signals and are therefore passive users of said live video.
6. The method of claim 1, wherein at least one of said one or more control signals is provided over said network by a remote director user for remotely controlling one or more corresponding live videos.
7. The method of claim 6, wherein multiple users selecting a remotely controlled one of said live videos actively controlled by said remote director user only provide selection signals and are not providing any control signals and are therefore only passive users of said remotely controlled live videos.
8. The method of claim 1, wherein at least one of said one or more control signals is provided by a local director user for locally controlling one or more corresponding live videos.
9. The method of claim 8, wherein multiple users selecting a locally controlled one of said live videos actively controlled by said local director user only provide selection signals and are not providing any control signals and are therefore only passive users of said one or more locally controlled live videos.
10. The method of claim 1, wherein said method is carried out by a server.
11. Apparatus, comprising:
a memory for storing a plurality of videos received from a corresponding plurality of video cameras of various owners at different sites; and
a server, for providing a web page for displaying a plurality of available videos on said web page indicative of said plurality of videos received from said corresponding plurality of video cameras, said server responsive to selection signals from a plurality of users in different geographic locations via a telecommunications network, for providing selected videos to said plurality of users via said telecommunications network according to said selection signals wherein each of said one or more videos is selectable by multiple users at a same time, wherein any one or more of said plurality of available videos from said plurality of video cameras are stored in said memory for display on said web page as stored videos for selection by said users so that video selection for viewing is delayed in time after video acquisition by said cameras to any extent.
12. The apparatus of claim 11, wherein a user is able to view a display of a particular selected video as a passive user, that is, without providing any active user control signal back to said server, while another user of said plurality of users provides said active user control signal for controlling said particular selected video and for viewing a display of said selected video as an active user.
13. The apparatus of claim 11, wherein at least one of one or more control signals is provided over said network by a remote director user for remotely controlling one or more corresponding videos.
14. The apparatus of claim 11, wherein at least one of one or more control signals is provided by a local director user for locally controlling one or more corresponding videos.
15. The apparatus of claim 11, wherein an active user control signal can come from any one of said plurality of users any one of which is also able to use said video as a passive user, that is, without providing any user control signal for controlling said video while another user of said plurality of users provides said active user control signal for viewing a display of said selected video as an active user.
16. The apparatus of claim 11, wherein said server is responsive to an active user control signal from one user among said plurality of users for controlling said video actively while others of said plurality of users are without active control of said video but rather use the video passively, according to the control of said one user.
17. Apparatus, comprising:
means for providing a web page showing a plurality of available videos from a corresponding plurality of video cameras of various owners, said cameras for acquiring said videos and made available by said owners over a telecommunications network for selection by a plurality of users using said web page on a corresponding plurality of displays in different geographic locations,
means for providing videos selected by said plurality of users via said telecommunications network to said plurality of users of said corresponding plurality of displays according to selection signals received over said network from said plurality of users wherein each selection signal is indicative of a particular video selected by a particular user of said web page and wherein each of said plurality of available videos is selectable for use in viewing at a same time by more than one user of said plurality of users of said web page at said different geographic locations, and
means for storing one or more of said plurality of available videos for display on said web page as stored videos for selection by said plurality of users so that video selection for viewing is delayed in time after video acquisition by said cameras to any extent.
18. The apparatus of claim 17, wherein at least one user of said plurality of users is able to use a selected live video as an active user, that is, by providing a user control signal for controlling said live video and, alternatively, to use said live video as a passive user, that is, without providing any user control signal for controlling said live video.
19. The apparatus of claim 17, wherein at least one of one or more control signals is provided over said network by a remote director user for remotely controlling one or more corresponding videos.
20. The apparatus of claim 17, wherein at least one of one or more control signals is provided by a local director user for locally controlling one or more corresponding videos.
US11/684,704 1997-10-23 2007-03-12 Web page based video service and apparatus Abandoned US20070157276A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US6323297P 1997-10-23 1997-10-23
US09/177,356 US7190392B1 (en) 1997-10-23 1998-10-23 Telepresence system and active/passive mode display for use therein
US11/684,704 US20070157276A1 (en) 1997-10-23 2007-03-12 Web page based video service and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11/684,704 US20070157276A1 (en) 1997-10-23 2007-03-12 Web page based video service and apparatus

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/177,356 Continuation US7190392B1 (en) 1997-10-23 1998-10-23 Telepresence system and active/passive mode display for use therein

Publications (1)

Publication Number Publication Date
US20070157276A1 true US20070157276A1 (en) 2007-07-05

Family

ID=37833418

Family Applications (3)

Application Number Title Priority Date Filing Date
US09/177,356 Expired - Fee Related US7190392B1 (en) 1997-10-23 1998-10-23 Telepresence system and active/passive mode display for use therein
US11/684,704 Abandoned US20070157276A1 (en) 1997-10-23 2007-03-12 Web page based video service and apparatus
US11/685,708 Expired - Fee Related US7587747B2 (en) 1997-10-23 2007-03-13 Telepresence method and apparatus for simultaneous use by multiple active/passive users

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/177,356 Expired - Fee Related US7190392B1 (en) 1997-10-23 1998-10-23 Telepresence system and active/passive mode display for use therein

Family Applications After (1)

Application Number Title Priority Date Filing Date
US11/685,708 Expired - Fee Related US7587747B2 (en) 1997-10-23 2007-03-13 Telepresence method and apparatus for simultaneous use by multiple active/passive users

Country Status (1)

Country Link
US (3) US7190392B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150215612A1 (en) * 2014-01-24 2015-07-30 Ganesh Gopal Masti Jayaram Global Virtual Reality Experience System

Families Citing this family (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7301536B2 (en) * 1993-09-10 2007-11-27 Geovector Corporation Electro-optic vision systems
US7734724B2 (en) * 2000-09-06 2010-06-08 Xanboo Inc. Automated upload of content based on captured event
DE10056217A1 (en) * 2000-11-13 2002-05-23 Siemens Ag Demand-driven logistical system acquires images of delivery unit storage point associated with supplier, passes image data via data connection including Internet for display to supplier
US7474852B1 (en) * 2004-02-12 2009-01-06 Multidyne Electronics Inc. System for communication of video, audio, data, control or other signals over fiber
US8151315B2 (en) * 2005-02-07 2012-04-03 Oklejas Robert A Hybrid audio/video entertainment system
US10219035B2 (en) 2005-02-07 2019-02-26 Robert A. Oklejas System and method for providing a television network customized for an end user
US9131079B2 (en) 2005-02-07 2015-09-08 Robert A. Oklejas System and method for providing a television network customized for an end user
US8160747B1 (en) * 2008-10-24 2012-04-17 Anybots, Inc. Remotely controlled self-balancing robot including kinematic image stabilization
US8471888B2 (en) 2009-08-07 2013-06-25 Research In Motion Limited Methods and systems for mobile telepresence
TW201123895A (en) * 2009-12-25 2011-07-01 Hon Hai Prec Ind Co Ltd Monitoring system
CN102111609A (en) * 2009-12-28 2011-06-29 鸿富锦精密工业(深圳)有限公司 surveillance system
US20110191664A1 (en) * 2010-02-04 2011-08-04 At&T Intellectual Property I, L.P. Systems for and methods for detecting url web tracking and consumer opt-out cookies
US8788096B1 (en) 2010-05-17 2014-07-22 Anybots 2.0, Inc. Self-balancing robot having a shaft-mounted head
WO2012120540A1 (en) 2011-03-10 2012-09-13 Bansal Sanjay A dynamic telepresence system and method
US8676893B1 (en) 2011-12-12 2014-03-18 Google Inc. Utilizing multiple teleo-operated devices for common teleo-experience sessions
US20130218706A1 (en) * 2012-02-22 2013-08-22 Elwha Llc Systems and methods for accessing camera systems
US20130218704A1 (en) * 2012-02-22 2013-08-22 Elwha Llc Systems and methods for accessing camera systems
US9691241B1 (en) * 2012-03-14 2017-06-27 Google Inc. Orientation of video based on the orientation of a display
US9519640B2 (en) 2012-05-04 2016-12-13 Microsoft Technology Licensing, Llc Intelligent translations in personal see through display
US9122321B2 (en) 2012-05-04 2015-09-01 Microsoft Technology Licensing, Llc Collaboration environment using see through displays
CN103425420B (en) * 2012-05-25 2016-08-03 国网山东省电力公司汶上县供电公司 Multi-monitor system and method
US9019174B2 (en) 2012-10-31 2015-04-28 Microsoft Technology Licensing, Llc Wearable emotion detection and feedback system
US9044863B2 (en) 2013-02-06 2015-06-02 Steelcase Inc. Polarized enhanced confidentiality in mobile camera applications
EP2954673A2 (en) 2013-02-07 2015-12-16 Bansal, Sanjay A graphical user interface (gui) for a conference call
TWI540534B (en) * 2015-02-26 2016-07-01 Staging Design Control system and method for virtual navigation
US9591260B1 (en) 2015-12-03 2017-03-07 Microsoft Technology Licensing, Llc Immersive telepresence

Family Cites Families (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US564677A (en) * 1896-07-28 Bradley Woodhull
US3916094A (en) * 1974-06-21 1975-10-28 Us Navy Submersible visual simulator for remotely piloted systems
US4303394A (en) * 1980-07-10 1981-12-01 The United States Of America As Represented By The Secretary Of The Navy Computer generated image simulator
EP0070336B1 (en) * 1981-07-20 1984-07-25 International Business Machines Corporation A tiltable and/or rotatable support for a display device
US4753173A (en) * 1983-12-19 1988-06-28 James Stanley D Portable turntable device
GB2193088B (en) * 1986-03-05 1990-02-14 Mitsubishi Electric Corp Swivel support structure
US4970589A (en) * 1986-07-10 1990-11-13 Varo, Inc. Head mounted video display and remote camera system
DE3712287C1 (en) * 1987-04-10 1988-09-15 Messerschmitt Boelkow Blohm Device for transmitting optical information
US4836486A (en) * 1987-04-16 1989-06-06 Anthro Corporation Adjustable support
US4918473A (en) * 1988-03-02 1990-04-17 Diamond Electronics, Inc. Surveillance camera system
US5124805A (en) * 1988-12-08 1992-06-23 Daewoo Electronics, Co., Ltd. Remote control operated moving television receiver
US5153716A (en) * 1988-12-14 1992-10-06 Horizonscan Inc. Panoramic interactive system
US4968123A (en) * 1989-04-24 1990-11-06 United Technologies Corporation Helmet mounted display configured for simulator use
JPH03292093A (en) 1990-04-10 1991-12-24 Seiko Epson Corp Three-dimensional display device
GB9015177D0 (en) * 1990-07-10 1990-08-29 Secr Defence A helmet loader for flight simulation
US5320534A (en) * 1990-11-05 1994-06-14 The United States Of America As Represented By The Secretary Of The Air Force Helmet mounted area of interest (HMAoI) for the display for advanced research and training (DART)
WO1992009904A1 (en) * 1990-11-29 1992-06-11 Vpl Research, Inc. Absolute position tracker
CA2085735A1 (en) * 1991-04-22 1992-10-23 Ralph W. Fisher Head-mounted projection display system featuring beam splitter
US5257094A (en) * 1991-07-30 1993-10-26 Larussa Joseph Helmet mounted display system
DE4130619A1 (en) * 1991-09-14 1993-03-25 Deutsche Aerospace Means for wardens
JP3031013B2 (en) * 1991-11-15 2000-04-10 Nissan Motor Co., Ltd. Visual information providing device
US5267708A (en) * 1992-09-28 1993-12-07 Rockwell International Corp. Head support apparatus
US5745128A (en) * 1992-11-30 1998-04-28 Hewlett Packard Company Method and apparatus for ink transfer printing
US5625410A (en) * 1993-04-21 1997-04-29 Kinywa Washino Video monitoring and conferencing system
US5397133A (en) * 1993-09-30 1995-03-14 At&T Corp. System for playing card games remotely
JPH07135594A (en) * 1993-11-11 1995-05-23 Canon Inc Image pickup controller
US5577981A (en) * 1994-01-19 1996-11-26 Jarvik; Robert Virtual reality exercise machine and computer controlled video system
US5436542A (en) * 1994-01-28 1995-07-25 Surgix, Inc. Telescopic camera mount with remotely controlled positioning
US5436654A (en) 1994-02-07 1995-07-25 Sony Electronics, Inc. Lens tilt mechanism for video teleconferencing unit
US5734414A (en) 1994-03-03 1998-03-31 Matsushita Electric Industrial Co., Ltd. Camera apparatus for electronic conference
JP3428151B2 (en) * 1994-07-08 2003-07-22 Sega Corp. Game device using an image display device
US5566370A (en) * 1994-11-03 1996-10-15 Lockheed Martin Corporation Simulation display system
US5684531A (en) * 1995-04-10 1997-11-04 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Ranging apparatus and method implementing stereo vision system
US5767820A (en) * 1995-05-09 1998-06-16 Virtual Research Systems Head-mounted visual display apparatus
GB2301216A (en) * 1995-05-25 1996-11-27 Philips Electronics UK Ltd Display headset
US5734421A (en) * 1995-05-30 1998-03-31 Maguire, Jr.; Francis J. Apparatus for inducing attitudinal head movements for passive virtual reality
US6020885A (en) * 1995-07-11 2000-02-01 Sony Corporation Three-dimensional virtual reality space sharing method and system using local and global object identification codes
US5634622A (en) * 1995-09-18 1997-06-03 Pye; Craig D. Remote controllable television viewing stand
US6008837A (en) * 1995-10-05 1999-12-28 Canon Kabushiki Kaisha Camera control apparatus and method
US5867210A (en) * 1996-02-09 1999-02-02 Rod; Samuel R. Stereoscopic on-screen surgical microscope systems
US6208376B1 (en) * 1996-04-22 2001-03-27 Canon Kabushiki Kaisha Communication system and method and storage medium for storing programs in communication system
US6084979A (en) * 1996-06-20 2000-07-04 Carnegie Mellon University Method for creating virtual reality
US6046712A (en) * 1996-07-23 2000-04-04 Telxon Corporation Head mounted communication system for providing interactive visual communications with a remote system
US6288891B1 (en) * 1996-11-21 2001-09-11 Canon Kabushiki Kaisha Movable display apparatus
JP3817312B2 (en) * 1996-11-29 2006-09-06 Canon Kabushiki Kaisha Control method and apparatus, imaging system, and display operation device
US5895021A (en) * 1997-07-14 1999-04-20 Morgan Marshall Industries, Inc. Rotatable platform display device
JP3085252B2 (en) * 1997-07-31 2000-09-04 NEC Corporation Remote control camera video relay system
US6027257A (en) * 1998-03-26 2000-02-22 Basic Telepresence Inc Pan and tilt unit

Patent Citations (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2955156A (en) * 1957-05-24 1960-10-04 Morton L Heilig Stereoscopic-television apparatus for individual use
US3050870A (en) * 1961-01-10 1962-08-28 Morton L Heilig Sensorama simulator
US4181408A (en) * 1977-12-05 1980-01-01 Senders John W Vision compensation
US4300818A (en) * 1978-03-13 1981-11-17 Schachar Ronald A Multifocal ophthalmic lens
US4402580A (en) * 1980-07-31 1983-09-06 Richard Ross Optical exercising device
US4405943A (en) * 1981-08-19 1983-09-20 Harris Corporation Low bandwidth closed loop imagery control and communication system for remotely piloted vehicle
US4757380A (en) * 1985-01-21 1988-07-12 Technische Hogeschool Delft Method of causing an observer to get a three-dimensional impression from a two-dimensional representation
US4672438A (en) * 1985-06-28 1987-06-09 Her Majesty The Queen In Right Of Canada Tracking simulator
US4879849A (en) * 1987-11-04 1989-11-14 Omni Films International, Inc. Point-of-view motion simulator system
US5040055A (en) * 1988-12-14 1991-08-13 Horizonscan Inc. Panoramic interactive system
US5444476A (en) * 1992-12-11 1995-08-22 The Regents Of The University Of Michigan System and method for teleinteraction
US5422653A (en) * 1993-01-07 1995-06-06 Maguire, Jr.; Francis J. Passive virtual reality
US5644324A (en) * 1993-03-03 1997-07-01 Maguire, Jr.; Francis J. Apparatus and method for presenting successive images
US5745161A (en) * 1993-08-30 1998-04-28 Canon Kabushiki Kaisha Video conference system
US5436638A (en) * 1993-12-17 1995-07-25 Fakespace, Inc. Image display method and apparatus with means for yoking viewpoint orienting muscles of a user
US5615132A (en) * 1994-01-21 1997-03-25 Crossbow Technology, Inc. Method and apparatus for determining position and orientation of a moveable object using accelerometers
US5488412A (en) * 1994-03-31 1996-01-30 At&T Corp. Customer premises equipment receives high-speed downstream data over a cable television system and transmits lower speed upstream signaling on a separate channel
US5584696A (en) * 1994-07-28 1996-12-17 Evans & Sutherland Computer Corp. Hang gliding simulation system with a stereoscopic display and method of simulating hang gliding
US6219086B1 (en) * 1994-11-30 2001-04-17 Canon Kabushiki Kaisha Terminal apparatus
US5737012A (en) * 1994-12-01 1998-04-07 Olympus Optical Co., Ltd. Head mounted image display apparatus and image forming apparatus related thereto
US5646677A (en) * 1995-02-23 1997-07-08 Motorola, Inc. Method and apparatus for interactively viewing wide-angle images from terrestrial, space, and underwater viewpoints
US6470498B1 (en) * 1995-02-23 2002-10-22 Motorola, Inc. Personal computer system, compact disk and method for interactively viewing the earth
US5850352A (en) * 1995-03-31 1998-12-15 The Regents Of The University Of California Immersive video, including video hypermosaicing to generate from multiple video views of a scene a three-dimensional video mosaic from which diverse virtual video scene images are synthesized, including panoramic, scene interactive and stereoscopic images
US5729471A (en) * 1995-03-31 1998-03-17 The Regents Of The University Of California Machine dynamic selection of one video camera/image of a scene from multiple video cameras/images of the scene in accordance with a particular perspective on the scene, an object in the scene, or an event in the scene
US5745126A (en) * 1995-03-31 1998-04-28 The Regents Of The University Of California Machine synthesis of a virtual video camera/image of a scene from multiple video cameras/images of the scene in accordance with a particular perspective on the scene, an object in the scene, or an event in the scene
US6181371B1 (en) * 1995-05-30 2001-01-30 Francis J Maguire, Jr. Apparatus for inducing attitudinal head movements for passive virtual reality
US6133944A (en) * 1995-12-18 2000-10-17 Telcordia Technologies, Inc. Head mounted displays linked to networked electronic panning cameras
US6208379B1 (en) * 1996-02-20 2001-03-27 Canon Kabushiki Kaisha Camera display control and monitoring system
US6396462B1 (en) * 1996-04-05 2002-05-28 Fakespace Labs, Inc. Gimbal-mounted virtual reality display system
US6611285B1 (en) * 1996-07-22 2003-08-26 Canon Kabushiki Kaisha Method, apparatus, and system for controlling a camera, and a storage medium storing a program used with the method, apparatus and/or system
US6675386B1 (en) * 1996-09-04 2004-01-06 Discovery Communications, Inc. Apparatus for video access and control over computer network, including image correction
US6133941A (en) * 1996-10-25 2000-10-17 Canon Kabushiki Kaisha Camera control system, apparatus, and method which includes a camera control server that receives camera control requests from clients and issues control authority and wait times for control of camera to clients
US6252989B1 (en) * 1997-01-07 2001-06-26 Board Of The Regents, The University Of Texas System Foveated image coding system and method for image bandwidth reduction
US6144402A (en) * 1997-07-08 2000-11-07 Microtune, Inc. Internet transaction acceleration
US6069655A (en) * 1997-08-01 2000-05-30 Wells Fargo Alarm Services, Inc. Advanced video security system
US6233428B1 (en) * 1997-09-17 2001-05-15 Bruce Fryer System and method for distribution of child care training materials and remote monitoring of child care centers
US6567122B1 (en) * 1998-03-18 2003-05-20 Ipac Acquisition Subsidiary I Method and system for hosting an internet web site on a digital camera
US6185737B1 (en) * 1998-06-30 2001-02-06 Sun Microsystems, Inc. Method and apparatus for providing multi media network interface
US6144375A (en) * 1998-08-14 2000-11-07 Praja Inc. Multi-perspective viewer for content-based interactivity

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150215612A1 (en) * 2014-01-24 2015-07-30 Ganesh Gopal Masti Jayaram Global Virtual Reality Experience System

Also Published As

Publication number Publication date
US20070220434A1 (en) 2007-09-20
US7190392B1 (en) 2007-03-13
US7587747B2 (en) 2009-09-08

Similar Documents

Publication Publication Date Title
Lanier Virtually there
Fisher et al. Virtual environment display system
JP5993376B2 (en) Customizable robot system
Goldberg et al. Desktop teleoperation via the world wide web
US5267775A (en) System for mounting a monitor
US7983497B2 (en) Coding method for motion-image data, decoding method, terminal equipment executing these, and two-way interactive system
Goldberg et al. Collaborative teleoperation via the internet
US6259470B1 (en) Image capture system having virtual camera
US7714895B2 (en) Interactive and shared augmented reality system and method having local and remote access
Burdea et al. Virtual reality technology
US9055234B2 (en) Navigable telepresence method and system
US6147701A (en) Image sensing apparatus
US6215498B1 (en) Virtual command post
US20040100575A1 (en) Retractable camera apparatus
EP1064783B1 (en) Wearable camera system with viewfinder means
US20140204003A1 (en) Systems Using Eye Mounted Displays
Hu et al. Internet-based robotic systems for teleoperation
EP1025696B1 (en) Apparatus for video access and control over computer network, including image correction
CA2240961C (en) Head mounted displays linked to networked electronic panning cameras
US20080007617A1 (en) Volumetric panoramic sensor systems
US20020085843A1 (en) Wearable camera system with viewfinder means
US7312766B1 (en) Method and system for time/motion compensation for head mounted displays
US6002430A (en) Method and apparatus for simultaneous capture of a spherical image
US6307526B1 (en) Wearable camera system with viewfinder means
CA2174574C (en) Method and system for panoramic viewing

Legal Events

Date Code Title Description
AS Assignment

Owner name: VIDEO WEB SHARE, LLC, CONNECTICUT

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MAGUIRE, FRANCIS J., JR.;REEL/FRAME:023115/0178

Effective date: 20090819

AS Assignment

Owner name: SIMULATED PERCEPTS, LLC, CONNECTICUT

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VIDEO WEB SHARE, LLC;REEL/FRAME:025519/0611

Effective date: 20101202

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION