WO2008058283A2 - Integrated information technology system - Google Patents

Integrated information technology system

Info

Publication number
WO2008058283A2
Authority
WO
WIPO (PCT)
Prior art keywords
technology
pixel output
sources
output stream
subsystem
Prior art date
Application number
PCT/US2007/084336
Other languages
French (fr)
Other versions
WO2008058283A3 (en)
Inventor
Theodore Mayer III
Martin Phipps
Original Assignee
Panoram Technologies, Inc.
Priority date
Filing date
Publication date
Application filed by Panoram Technologies, Inc. filed Critical Panoram Technologies, Inc.
Publication of WO2008058283A2 publication Critical patent/WO2008058283A2/en
Publication of WO2008058283A3 publication Critical patent/WO2008058283A3/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312 Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/4316 Generation of visual interfaces involving specific graphical features for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications

Definitions

  • the interface module circuit board 5030 may further be configured with a microprocessor 5032 and a connector for a power source 5034.
  • the circuit board is further configured with an output connector 5036 that is electrically connected to the module output connector 5040 that is configured for connection with a standard cable connector 5042 (for example, RS-232 compatible) for connection to the main system enclosure.
  • the module may be configured with a removable lid (not shown) and configured from plastic, metal or other suitable material for withstanding the environment in which the integrated system of the present invention is to be used.
  • the interface module may be located within the technology source or otherwise may be attached, connected or placed adjacent to the technology source. When the technology source has control functions, then the circuit board allows for input and output of control signals. When the technology source is not controllable, then the interface module merely provides for direct connection of a pixel output stream to the image processing subsystem.
  • the integrated information technology system of the present invention may be incorporated into a mining truck 2000 and shovel 3000 system.
  • the system of the present invention may be designed to address the various information technologies, video imaging and other sensor systems, which are finding their way into the cabs of mining shovels.
  • Installing the system of the present invention on a shovel means that more technology can be inserted into the operations environment in a user-beneficial, ergonomic and easy-to-use manner. It provides the shovel operator with real-time monitoring of all technology systems as well as an intuitive and simple system-source selection and operations panel. A remote access option is available for the system where mine operators may wish to monitor the shovel from a remote site.
  • the remote monitor receives the streamed remote source on his or her display and can quickly change the selection to: (a) the entire source selection panel (this function reduces the total number of system sources, since one system source input is used by this feature); (b) any source system; or (c) any source image.
  • the source selection panel provides a quick thumbnail overview of all the shovel's sources and systems. If a remote monitor is only interested in one aspect of the shovel's system sources or source image, they can simply default to that selection as they address each shovel. This all happens without any interruption, effect or required notification to the operator in the shovel itself. The operator's displays do not change regardless of what the remote user is monitoring.
  • the remote access option's greatest feature is that no matter how many shovels or how many system sources or source images a remote monitor is managing, the remote access system only imposes a single channel of streaming video on the network.
  • the only increase in load on the mine's remote network is when mine operations adds more remote monitors who wish to access more channels simultaneously.
  • FIG. 9 shows the process steps used by the method of the present invention. All technology sources are processed using the image processing subsystem and are arranged or grouped on a selection palate in step 601.
  • the operator in the operations environment selects one of the technology sources. This can be via a touch screen or via another I/O selection device connected to the controller subsystem.
  • the controller subsystem sends out commands that cause the image processing subsystem to send the pixel output stream from the selected technology source to the main operations window, whether on a separate main control screen or not. This selection replaces the previous selection in the main control window.
  • the controller subsystem will send the appropriate I/O signals to the technology source to allow the operation of that technology source in the main control window.
  • FIG. 10 shows the process steps used by the invention when a remote operator or remote monitor wishes to select a technology source.
  • the system is set up as it is in step 601 in FIG. 9.
  • the remote operator requests that the entire selection screen, with its many technology source windows arrayed into a selection palette, be sent to the remote location.
  • the remote operator can view and choose the system, sensor or video the operator is interested in viewing and/or operating.
  • the remote operator makes such a selection and the controller subsystem sends the appropriate commands to the image processing subsystem to send the selection to the system output connected to the network.
  • this causes the pixel output stream from the selected technology source to be sent to the remote operator location.
  • in step 706, if the technology source is interactive, the controller subsystem acts as an intermediary to send I/O control information to the remote operator. Meanwhile, if there is any conflict between local and remote control of a single technology source, the controller subsystem mediates the conflicting requests via step 705 (an illustrative sketch of such mediation follows this list).
  • an embodiment of the user interface subsystem 1000 may be configured with an upper control screen 1010 having a plurality of windows for displaying the pixel output streams of the available technology sources in a common operating picture.
  • the user interface subsystem is further configured with a lower screen 1020 having a main operations window.
  • the lower screen is used for displaying operator functions, such as a schedule 1022 (FIG. 11) or general menu functions 1024 (FIG. 14), which were originally in a selection window 1056 in the upper control screen but upon selection were transferred to the main operations window.
  • Both the upper screen and lower screen may be configured with touch control capabilities so that the operator may easily select functions.
  • the upper and lower screens may further be activated by a keyboard, mouse or other operator interface device.
  • the upper screen may be configured to display several controller sources in separate windows 1051-1056, and may be configured to display and allow the operator to select several image sources in other windows 1061-1066.
  • the operator may select any one of the controller sources by touching the segment of the screen associated with a controller source, for example controller source 1051.
  • the lower screen displays that source in the main operations window and at the same time the control functions 1025 of the selected controller source are mapped to the touch panel interface or other control device in the main operations window (FIG. 13).
  • as shown in FIG. 14, when an image source 1065 is selected, the lower screen displays the pixel output stream 1026 of the selected image source (FIG. 15).
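The disclosure does not specify how the controller subsystem mediates conflicting local and remote control requests in step 705. The Python sketch below illustrates one plausible policy, assumed here purely for illustration: the local operator always pre-empts a remote monitor, and a remote request is granted only while the source is otherwise uncontested. All class and function names are hypothetical and are not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ControlRequest:
    source_id: str   # which technology source the request targets, e.g. "rear_camera_array"
    origin: str      # "local" (cab operator) or "remote" (remote monitor)


class ControlMediator:
    """Illustrative mediation of conflicting control requests (cf. step 705)."""

    def __init__(self) -> None:
        self.active: Optional[ControlRequest] = None   # who currently holds control

    def request(self, req: ControlRequest) -> bool:
        """Return True if the requester is granted control of its technology source."""
        if self.active is None or self.active.source_id != req.source_id:
            self.active = req            # no conflict on this source: grant immediately
            return True
        if req.origin == "local":
            self.active = req            # assumed policy: the cab operator pre-empts remote users
            return True
        return False                     # a remote request yields while the source is held

    def release(self, source_id: str) -> None:
        if self.active and self.active.source_id == source_id:
            self.active = None


if __name__ == "__main__":
    mediator = ControlMediator()
    print(mediator.request(ControlRequest("rear_camera_array", "remote")))  # True, uncontested
    print(mediator.request(ControlRequest("rear_camera_array", "local")))   # True, local pre-empts
    print(mediator.request(ControlRequest("rear_camera_array", "remote")))  # False, local holds it
```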

Abstract

A system and method for the integration of multiple data, control and media systems from a variety of separate technology sources without requiring data or format standardization. An image processing subsystem is configured to aggregate pixel output streams from a plurality of image sources that are displayed on a local and/or remote user interface subsystem that is real-time and active. The operation of a plurality of independent controller sources is also provided through the user interface subsystem and a control subsystem. Rather than sending the full array of data from and to each of the image and controller technology sources, the integrated system of the present invention makes all technology sources available to a remote system network, but only the pixel output stream and control functions from the selected technology source are sent to the remote user interface subsystem. Accordingly, the required bandwidth for information sharing is minimized.

Description

INTEGRATED INFORMATION TECHNOLOGY SYSTEM CROSS-REFERENCES TO RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Application Ser. No. 60/857,936, filed November 9, 2006, the content of which is hereby incorporated herein by reference.
BACKGROUND OF THE INVENTION
The present invention is related to the integration of multiple data, control and media systems from a variety of separate sources, and more particularly to a method and apparatus for managing such data, control and media systems into an integrated overview and control of these separate sources, without requiring data or format standardization, in large mobile vehicles, for example, construction equipment, mining shovels, haul trucks, backhoes, compactors, dozers, loaders, pipelayers, telehandlers, cranes, drilling rigs, combines, tractors, barges, tanks, ships, rail equipment and aircraft; and for managing such data, control and media systems in large-scale static machinery, for example, drilling rigs and power generation systems.
More and more information technology is being infused into large-scale vehicles and machinery. These technologies include: (1) Imaging and Sensor Systems, such as, but not limited to, cameras, low light imagers, heat imagers, radar, laser range finders, light detection and ranging (lidar), various deployed sensor networks and others; (2) Location Systems, such as, but not limited to, global or other positioning technologies, mapping, and other party location and status indicators; and (3) Communication Systems, such as, but not limited to, voice over internet protocol (VoIP), schedules, dispatch, directions, orders, and video teleconferencing. These technologies are provided by an array of vendors and tend to be independent, proprietary and incompatible systems, which often come with their own proprietary and sometimes embedded computers, sensor technologies, display systems, Input/Output (I/O) devices and control systems. These independent proprietary systems may include touch screens, control panels, keyboards and pointing devices. As a result, the cab or operator environments in such large-scale vehicles and machinery are becoming more and more chaotic and difficult to manage rather than becoming more controlled. In many cases, this complexity defeats the value and purpose of inserting more technology into the operator environments.
Since these independent proprietary systems often come from competing suppliers, there are no shared system standards for the technologies. Systems tend to be "stovepipe" applications, wherein each system is an independent, complete and isolated solution, having sensors, computers, I/O and displays, but having little compatibility with any other system. In some implementations, the sources may be passive and not be manipulated, such as with video cameras, GPS and mapping systems. In other implementations, the sources may be interactive with user I/O capabilities including but not limited to function selections, data input, view selection and more. In still other implementations, there may be more sources or technologies than can be clearly shown at one time.
A known prior art system includes a vehicle active network that communicatively couples devices within a vehicle. Device operation is independent of the interface of the device with the active network. Additionally, the architecture of the active network provides one or more levels of communication redundancy along data transmission paths. The architecture provides for the total integration of vehicle systems and functions, and permits plug-and-play device integration and upgradeability. A deficiency of this disclosed system, one shared by most integration attempts to date, is that it defines, assumes and applies some form of standard protocol, architecture or data topology to integrate the various technologies, for example, the utilization of data packets. Accordingly, there is a need for a system that addresses the same problem area but does not require non-compatible systems to adopt some common standardization method before they can be integrated and enhanced with redundant communication paths.
Another known prior art system has been disclosed for a driver information system for a motor vehicle that includes a network that executes one of a number of application programs depending on which function of the system the driver has selected at any given point in time. In response to the driver's selection, the appropriate application program is retrieved from storage for execution. Information regarding the specific hardware interface software objects that are required during that execution is read from the retrieved application program and loaded for execution. Again, this disclosed prior art system does not address the core problem of requiring data standardization, formats, operating systems and integration. The configuration of this prior art system assumes that the software and programs being retrieved are operating on compatible hardware and compatible operating systems, which is not a typical configuration in present-day large vehicles, where the information systems are not part of a fully integrated solution from a single supplier. Only when the entire solution is provided by a single supplier, such as the vehicle manufacturer, or at such time as industry standards have been defined and accepted by all aftermarket suppliers, will the disclosed inventions work.
Accordingly, there is a need for, and what was heretofore unavailable, a method and apparatus that circumvents the requirements of data standards, of data compatibility, of CPU compatibility or of operating system compatibility of any kind. There is a need to address the integration of the technologies by aggregating the graphics, images, or display pixels from the sources rather than the data used by the computer and software application to make such images. In such a way, disparate formats and proprietary systems may be brought together into a common system view. Additionally, by routing control data from the integrated common system back to the disparate devices, the separate systems can be interfaced for control and operation. Furthermore, it would be highly beneficial if this same capability were accessible to remote operations or monitoring locations. It would be further desirable to be able to share, by publishing or by polling, many of these systems between different vehicles or machines in the operating environments. For example, there would be a great benefit in showing a mining shovel's load or bucket-position information to a haul truck driver as the driver aligns the haul truck to be loaded. The haul truck will not have the same sensor systems, computer or software application, but if the image or picture from the mining shovel's system were presented and integrated into the haul truck's system as per the invention, the driver would be able to see the graphic product of the mining shovel system on his own system. Similarly, the haul load sensors from the haul truck could be viewed by the shovel operator. The present invention addresses these and other needs without requiring common data standards.
SUMMARY OF THE INVENTION
The present invention is directed to the integration of multiple data, control and media systems from a variety of separate "technology sources," and more particularly to a method and apparatus for managing such separate technology sources into an integrated overview and control system without requiring data or operating format standardization. This integrated system specifically provides for the monitoring and operation of a plurality of independent image (for example, video) sources and controller sources. Pixel output streams from each technology source are aggregated into a composite output pixel stream, which may be connected to a local user interface. Controller functions from the technology sources also may be interfaced to the local user interface. In addition or alternatively, the pixel output streams and controller functions may be connected to a remote user interface. The system of the present invention transfers only the required images and/or controller data for a particular technology source upon demand when requested by an operator or user. Accordingly, the network does not carry the data stream for all devices and thus required bandwidth for information sharing is minimized.
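As a concrete illustration of this pixel-level approach, the sketch below composites one thumbnail per technology source into a single common-operating-picture image and forwards only the operator-selected source at full resolution. It assumes each source already delivers its pixel output as a NumPy array; the names, sizes and helper functions are illustrative and are not part of the disclosure.

```python
import numpy as np

THUMB_H, THUMB_W = 120, 160   # assumed thumbnail size for the common operating picture (COP)


def composite_cop(frames: dict, cols: int = 3) -> np.ndarray:
    """Tile one thumbnail per technology source into a single COP mosaic."""
    rows = -(-len(frames) // cols)                      # ceiling division
    canvas = np.zeros((rows * THUMB_H, cols * THUMB_W, 3), dtype=np.uint8)
    for i, name in enumerate(sorted(frames)):
        r, c = divmod(i, cols)
        thumb = frames[name][:THUMB_H, :THUMB_W]        # naive crop stands in for real scaling hardware
        canvas[r * THUMB_H:r * THUMB_H + thumb.shape[0],
               c * THUMB_W:c * THUMB_W + thumb.shape[1]] = thumb
    return canvas


def forward_on_demand(frames: dict, selected: str) -> np.ndarray:
    """Only the selected source's full-resolution pixel stream leaves the vehicle."""
    return frames[selected]


if __name__ == "__main__":
    # five hypothetical technology sources, each producing a 640x480 RGB frame
    sources = {f"source_{i}": np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8) for i in range(5)}
    print(composite_cop(sources).shape)                   # (240, 480, 3): the aggregated COP
    print(forward_on_demand(sources, "source_2").shape)   # (480, 640, 3): the one stream sent on demand
```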
The system and method of the present invention configure imaging (camera, video) systems, sensor systems, location systems, communication systems, controller systems and all available technology sources into a common operating picture (COP) that provides an overview of the technology sources in thumbnail form. These thumbnails may be real-time and active so they are simply miniature real-time displays from each technology source (system) or they may be frozen screen shots of the technology sources or even a button icon representing the technology source. The system and method of the present invention allow the user to select any of these technology sources so as to transfer the selected technology source onto a main operations window or a separate main operations touch screen where it can be controlled and operated. The technology sources are presented on one or more output devices or control screens (user interface windows) using an image processing subsystem and a control subsystem for managing the technology sources.
The challenge is that the technologies being infused into the machines in the operating environment are typically incompatible, coming from a variety of suppliers in a variety of forms and formats, including size, resolution, operating system, data format, and data type. Each component provides another enhancement for the operation of the machines, but in total they quickly become unmanageable as non-integrated components. The system and method of the present invention bring all these disparate components into one integrated solution regardless of their data, software and computer standards. This allows for custom mixing and matching of solutions from varying suppliers. Additionally, since these technologies are improving and changing rapidly, changes or upgrades of any one component do not necessitate restructuring or updating the entire integrated solution.
In one embodiment of the present invention, the operator environment is co-located with the machinery being operated. In another embodiment, another operator not co-located with the machinery being operated can also select the source to be viewed and/or controlled. In a further embodiment, the complete operator environment is remote. This embodiment allows remote monitoring of the many systems whether they are manned, remotely operated or autonomous. In yet another embodiment, operators can select or share sources with other operators of other machinery for more sophisticated coordinated efforts. One application of the system of the present invention relates to managing the data, control, sensor and camera systems in operator environments, and especially relates to large-scale industrial vehicles, such as mining shovels, haul trucks, drills, combines, dozers and cranes. The present system for integrating information technology is also applicable to static systems such as, but not limited to, drill rigs and plant equipment.
By remote access to the same integration and selection systems used in the operations environment, the required bandwidth for remote monitoring or information sharing can be dramatically reduced. Rather than sending the full array of sensors and technologies from each vehicle to the remote monitoring location, thereby requiring all system and video data to be transmitted, the "on-vehicle" system controller selection mechanism selects a single system's graphics or video stream to be sent over the network. This can be transparent to the vehicle operator, and can be configured to not affect his selections and operations. All systems and sensors are thus available to remote sites, but only the selected source is put onto the network for transmission. As such, the network does not need to be able to carry all the information at all times, but only a subset of the information that has been selected for that moment by a combination of all the remote monitors requesting information at the same time. This will dramatically reduce the bandwidth requirements from each vehicle for remote monitoring or operations.
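To make the bandwidth claim concrete, the short calculation below compares sending every source continuously with sending only the streams currently selected by remote monitors. The per-stream bit rate and fleet sizes are arbitrary illustrative numbers, not figures from the disclosure.

```python
# Illustrative back-of-the-envelope comparison (all numbers are assumptions).
STREAM_MBPS = 4.0         # assumed bit rate of one pixel output stream
SOURCES_PER_VEHICLE = 12  # cameras, sensors, location and communication displays, etc.
VEHICLES = 20
REMOTE_MONITORS = 3       # each remote monitor views one selected stream at a time

send_everything = VEHICLES * SOURCES_PER_VEHICLE * STREAM_MBPS
send_selected_only = REMOTE_MONITORS * STREAM_MBPS   # load scales with viewers, not with sources

print(f"all sources, all vehicles : {send_everything:7.1f} Mbps")
print(f"selected streams only     : {send_selected_only:7.1f} Mbps")
# With these assumptions the selective scheme needs 12 Mbps instead of 960 Mbps.
```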
In some cases where there may be more sources or technologies than can be clearly shown at one time, a subsystem that organizes groups or pages of sources and cameras would be available to the operator. In addition, conditions or sensors in the system can be monitored by the integrated solution, which can automatically change which pages are available, which pixel output stream is featured on the main operations display, or which technology source is highlighted on the selection screen. A simple example would be a rear-view camera array being automatically selected as the truck is put into reverse. Another example would be an array of highlighted and featured resources triggered by an emergency condition. Unlike the referenced prior art, which includes a network that executes one of a number of application programs depending on which function of the system the driver has selected, in this case an appropriate selection of otherwise non-integrated onboard systems required for the condition is brought up to prominence. In other implementations, a group of cameras might be arrayed into a composite view. For example, an array of three cameras might provide a wide field of view in reverse, or perhaps a full array of cameras could be set up to provide a complete look-around view or even a top-down view of the scene all around the area of the vehicle. Other features and advantages of the invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, which illustrate, by way of example, the features of the invention.
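A simple way to realise the automatic selection described above is a table of condition-to-source rules that the control subsystem evaluates against the monitored machine state. The sketch below is a minimal, hypothetical illustration; the rule names and the machine-state fields are assumptions, not elements of the disclosure.

```python
# Minimal rule-driven auto-selection sketch (all names are illustrative).
RULES = [
    # (predicate over machine state, source or page to feature, priority)
    (lambda s: s.get("emergency"),          "emergency_resource_page", 100),
    (lambda s: s.get("gear") == "reverse",  "rear_camera_array",        50),
    (lambda s: s.get("gear") == "drive",    "forward_camera",           10),
]


def featured_source(state: dict, default: str = "operator_choice") -> str:
    """Return the source/page to feature on the main operations display for this state."""
    matches = [(prio, src) for pred, src, prio in RULES if pred(state)]
    return max(matches)[1] if matches else default


if __name__ == "__main__":
    print(featured_source({"gear": "reverse"}))                      # rear_camera_array
    print(featured_source({"gear": "reverse", "emergency": True}))   # emergency_resource_page
    print(featured_source({"gear": "park"}))                         # operator_choice
```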
BRIEF DESCRIPTION OF THE DRAWINGS
FIGURE 1 is a schematic representation of an embodiment of the system of the present invention showing a user interface subsystem and depicting the interface with a plurality of information technology sources.
FIG. 2 is a schematic representation of an alternative embodiment of the system of the present invention showing a user interface subsystem further including a remote operations subsystem.
FIG. 3 is a schematic representation of one embodiment of a user interface subsystem of the present invention.
FIG. 4 is a schematic representation of an alternative embodiment of a user interface subsystem of the present invention.
FIG. 5 is a schematic representation of an embodiment of the system of the present invention showing a user interface subsystem and depicting the interface with a plurality of independent image sources and a plurality independent controller sources.
FIG. 6 is a schematic representation of an alternative embodiment of the system of the present invention including a pre-selection interface for a plurality of independent image sources.
FIG. 7 depicts one embodiment of an interface module for use with the system of the present invention.
FIG. 8 depicts a mining shovel system and transport truck that may be configured with sensor arrays for use with the system of the present invention.
FIG. 9 shows the process steps of the present invention having a local user interface.
FIG. 10 shows the process steps of the present invention having a remote user interface.
FIG. 11 is a pictorial representation of a graphical user subsystem of the present invention.
FIG. 12 is a pictorial representation of a graphical user subsystem of the present invention, wherein the operator is selecting a controller source.
FIG. 13 is a pictorial representation of a graphical user subsystem of the present invention, wherein the operator is selecting display functions of a controller source. FIG. 14 is a pictorial representation of the graphical user subsystem of the present invention, wherein the operator is selecting an image source.
FIG. 15 is a pictorial representation of the graphical user subsystem of the present invention, wherein the pixel output stream of a selected image source is shown in a main operations window.
DETAILED DESCRIPTION OF THE INVENTION
As shown in the drawings for purposes of illustration, the present invention is directed to a method and apparatus for integrating information from several independent and separate proprietary technology source systems. The information from the separate technology sources is selectable, controllable and accessible to local and/or remote operations and/or monitoring locations. The present invention integrates multiple data, control and media (image, video) systems from a variety of separate technology sources, managing such separate technology systems into an integrated overview and control of these separate sources without requiring data or format standardization. The system of the present invention integrates the various systems by aggregating the pixel output streams of the various computer systems' graphics ports rather than requiring the transmission and integration of the data streams from each technology source. In addition, the system of the present invention may be configured to also transmit one or more pixel output streams and/or control functions for a particular technology source upon demand when requested by a remote operator or user. This limited transmission of pixel output streams and control functions allows for an unlimited number of information technology sources without a proportional increase in the required data transmission bandwidth.
The present invention enables operators (users) of such systems to share, by publishing or by polling, many of these systems between operating machines and users. The system of the present invention transfers pixel output streams from at least two separate and potentially incompatible technology sources into a common operating picture (COP) that provides a display of each available technology source in thumbnail form using a graphical user interface. These thumbnails can be real-time and active so that they are simply miniature real-time displays from each separate technology system. The system of the present invention is configured to allow a local and/or remote user to select any of the image (still pictures, video) and controller technology sources so as to automatically transfer the pixel output stream and/or control functions from the selected technology source onto a second graphical user interface having a main operations window, for example, a separate touch screen. The main operations window is configured such that the selected technology source can be monitored and/or controlled (user interacts with or operates the technology source).
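The essential interface that each otherwise incompatible system must expose is therefore very thin: a pixel output and, where applicable, an opaque control channel. The sketch below expresses that boundary as a small Python protocol; it is one interpretation of the text, and the method names and the dummy camera are hypothetical.

```python
from typing import Optional, Protocol
import numpy as np


class TechnologySource(Protocol):
    """The only contract the integrated system relies on: pixels out, opaque control bytes in/out."""

    name: str

    def pixel_frame(self) -> np.ndarray: ...               # rendered output of the source's graphics port
    def send_control(self, payload: bytes) -> None: ...    # opaque I/O event forwarded to the source
    def read_control(self) -> Optional[bytes]: ...         # opaque status coming back, if any


class DummyCamera:
    """Toy stand-in for a passive, non-interactive source (structurally satisfies the protocol)."""

    def __init__(self, name: str) -> None:
        self.name = name

    def pixel_frame(self) -> np.ndarray:
        return np.zeros((480, 640, 3), dtype=np.uint8)     # black frame in place of real video

    def send_control(self, payload: bytes) -> None:
        pass                                               # passive source: control input is ignored

    def read_control(self) -> Optional[bytes]:
        return None


if __name__ == "__main__":
    cam: TechnologySource = DummyCamera("left_camera")
    print(cam.name, cam.pixel_frame().shape)
```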
Turning now to the drawings, in which like reference numerals represent like or corresponding aspects of the drawings, and with particular reference to FIG. 1, the integrated information technology system 100 of the present invention provides a custom interface between a plurality of information technology sources and an integrated monitoring and control system having a graphical user interface. Each technology source 101-105 (for example, information systems, controllers, sensors, still picture and video sources) is connected via a series of one or more cables or other mechanisms 118-122 that transfers (moves) pixels or graphical pictures from each information technology source to an image processing subsystem 107. The pixel output streams from the various technology sources are combined into a composite output stream 123 and arranged for display into separate windows by the image processing subsystem.
The composite output stream 123 from the image processing subsystem 107 is operably connected to a user interface subsystem 150 having a first display (control screen) 108 for presenting the pixel output streams from each available technology source 101-105 into separate windows or thumbnails (see FIG. 3) as a common operating picture. The image processing subsystem may be further configured to transmit one of the pixel output streams 124 to a second display (main operations window) 109. The pixel output stream presented in the main operations window may be user selected or determined due to an operating condition, such as an alarm or machine state (for example, a truck is put in reverse). Both the first and second displays of the user interface subsystem may be equipped with a touch screen interface.
One or more of the available technology sources 101-102 may include control functions (controllers). Accordingly, the integrated system of the present invention may include a control subsystem 106 configured for mediating each of the control functions from the technology sources. For example, the control subsystem may be configured for monitoring the control screen 108 via control line 117 and selecting window layouts in the image processing subsystem 107 based on conditional input from the control screen (if it is a touch screen) or alternatively from an external I/O device 108A. The control subsystem also may be configured for switching I/O control for the appropriate technology source 101-105 by tying the appropriate control line 110, 111, 112, 113 or 114 to the main operations window 109 or external I/O device using control line 116. For example, the user interface subsystem 150 and the control subsystem may be configured such that a user can select one of the technology sources so as to allow the user to interact with the selected technology source in the main operations window.
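The switching behaviour described for the control subsystem can be pictured as a simple multiplexer that routes I/O events from the main operations window (or an external I/O device) to whichever control line is currently tied to the selected technology source. The sketch below is a hypothetical illustration of that routing; the source names are invented and only loosely echo the reference numerals in the figures.

```python
class ControlSwitch:
    """Routes operator I/O to the control line of the currently selected technology source."""

    def __init__(self, control_lines: dict) -> None:
        self.control_lines = control_lines   # source name -> callable that writes to that source
        self.selected = None

    def select(self, source_name: str) -> None:
        if source_name not in self.control_lines:
            raise KeyError(f"unknown technology source: {source_name}")
        self.selected = source_name          # tie the main operations window to this source

    def forward(self, io_event: bytes) -> None:
        """Send a touch/keyboard/pointer event to the selected source, if one is selected."""
        if self.selected is not None:
            self.control_lines[self.selected](io_event)


if __name__ == "__main__":
    log = []
    switch = ControlSwitch({
        "controller_source_101": lambda e: log.append(("101", e)),
        "controller_source_102": lambda e: log.append(("102", e)),
    })
    switch.select("controller_source_102")
    switch.forward(b"touch:120,45")
    print(log)   # [('102', b'touch:120,45')] -- only the selected source receives the event
```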
Referring to FIG. 2, one aspect of the present invention includes a simultaneous remote monitoring or remote operations subsystem 300. In this embodiment, the image processing subsystem 107 sends the pixel output streams from the available technology sources as windows to selection screen 201, which is configured with selection palette windows 202-210 to display the images from the technology sources. The pixel output stream and/or control functions of a selected technology source may be provided to the main control screen 211 and displayed in the main operations window 212, so as to allow a local user to interact with (control) the selected technology source. Simultaneously, a remote operator or remote monitor 306 may request that the image processing subsystem send the pixel output stream and/or control functions associated with a technology source to him/her by sending a control command to the control subsystem 106 through a remote system network 304 (for example, an Internet or Ethernet monitoring and/or control system) using one or more control lines 307, 308. This causes the control subsystem to issue a command to the image processing subsystem to send a pixel output stream (window) through an image output line 309 connected to the remote system network so as to transmit the window to a remote display screen and/or control subsystem 306.
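One way to picture this remote request path is a small message handler in the control subsystem: a remote monitor asks for a source, the control subsystem validates the request and instructs the image processing subsystem to place that one stream on the network output. The sketch below is an interpretation with invented message fields; it is not a protocol defined by the disclosure.

```python
import json


class ImageProcessingStub:
    """Stands in for the image processing subsystem's 'send this window to the network output' command."""

    def route_to_network(self, source_id: str) -> None:
        print(f"streaming pixel output of {source_id!r} to the remote network output")


class ControlSubsystem:
    def __init__(self, image_processor: ImageProcessingStub, known_sources: set) -> None:
        self.image_processor = image_processor
        self.known_sources = known_sources

    def handle_remote_message(self, raw: str) -> str:
        """Handle one JSON request from a remote monitor, e.g. {"cmd": "select", "source": "..."}."""
        msg = json.loads(raw)
        if msg.get("cmd") == "select" and msg.get("source") in self.known_sources:
            self.image_processor.route_to_network(msg["source"])
            return json.dumps({"ok": True, "source": msg["source"]})
        return json.dumps({"ok": False, "error": "unknown command or source"})


if __name__ == "__main__":
    ctrl = ControlSubsystem(ImageProcessingStub(), {"shovel_bucket_cam", "gps_map"})
    print(ctrl.handle_remote_message('{"cmd": "select", "source": "gps_map"}'))
    print(ctrl.handle_remote_message('{"cmd": "select", "source": "unknown"}'))
```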
Remote monitoring, operation, interaction and/or control of a selected technology source 101-105 (FIG. 1), if appropriate, is via the remote system network 304 using the remote control lines 307, 308 via the local control module 106, which is operably connected to control lines 110-114 for all available technology sources (see FIG. 1). In this embodiment of the present invention, a remote operator or monitor (user) can select from any of the available technology sources, such as, but not limited to, the technology source currently monitored by the local operator in the local main operations window 212.
Referring to FIG. 3, one embodiment of the integrated system of the present invention includes a user interface subsystem 250 configured with two (upper and lower) control screens 201, 211. An upper control screen 201 is configured as a selection palette. The upper control screen presents a series of windows 202-210, each filled with the image from one technology source, for example, a still picture or video. The windows and window layout are created using the image processing subsystem 107 and composite output stream 123 shown in FIGS. 1 and 2. In one embodiment of the present invention, each window 202-210 of the upper control screen 201 is a live and real-time display of each technology source presented in a thumbnail fashion using the window processing system so that all available technology sources can be monitored at the same time in a common operating picture. When the local operator selects one of the available technology sources 101-105, either via the touch screen I/O capabilities of the upper control screen or via a keyboard or other I/O device 213 and/or a tracking device (mouse) 215, a pixel output stream and/or control functions from the selected technology source are presented in the main operations control window 212 by the image processing subsystem and the control subsystem 106 (see FIG. 1). As shown in FIG. 3, the selected technology source is presented and available for user interaction on a separate main (lower) control screen 211 having a main operations window 212. If the selected technology source is a device that has control functions rather than simply a non-interactive image source or data display, then the main operations window is controlled with the touch screen I/O capability of the lower display screen or via the keyboard or other I/O device and/or the tracking device (mouse).
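Selecting a thumbnail by touch reduces to mapping a touch coordinate on the upper control screen back to the window, and hence the technology source, drawn at that position. The sketch below shows that inverse mapping for a simple 3x3 grid of windows; the grid geometry and source ordering are assumed for illustration and are not specified in the disclosure.

```python
from typing import Optional

# Assumed selection-screen geometry: a 3x3 grid of equally sized thumbnail windows.
SCREEN_W, SCREEN_H = 1920, 1080
COLS, ROWS = 3, 3
CELL_W, CELL_H = SCREEN_W // COLS, SCREEN_H // ROWS

# Hypothetical row-major ordering of technology sources into windows 202-210.
WINDOW_SOURCES = [f"source_{n}" for n in range(202, 211)]


def source_at(x: int, y: int) -> Optional[str]:
    """Map a touch coordinate on the upper control screen to the technology source shown there."""
    if not (0 <= x < SCREEN_W and 0 <= y < SCREEN_H):
        return None
    index = (y // CELL_H) * COLS + (x // CELL_W)
    return WINDOW_SOURCES[index] if index < len(WINDOW_SOURCES) else None


if __name__ == "__main__":
    print(source_at(100, 100))     # source_202 (top-left window)
    print(source_at(1900, 1000))   # source_210 (bottom-right window)
```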
As shown in FIG. 4, a user interface subsystem 250A may be configured such that the main control screen 211A is incorporated as a window 212A in the display layout of a single control screen 201A. That single control screen also contains the selection palette of video streams in windows 202A-210A. Thus, a main control window 212A may reside on the same control screen 201A as the technology source windows 202A-210A. If the selected technology source is a device that has control functions, rather than simply a non-interactive image source or data display, then the main control window 211A is mapped as a control area for that technology source, which can be controlled with the touch screen I/O capability of the main operations window 212A or via a keyboard or other I/O device 213A and/or a tracking device (mouse) 215A.
As shown in FIG. 5, one embodiment of the integrated information technology system of the present invention includes a sensor array configured from a group of independent imaging devices (such as video or still picture cameras 401, 402, 403) that are arrayed and arranged to provide a wide field-of-view image in one or more of the windows 202-210 of the upper control screen 201. For example, the image processing subsystem 107 may be configured to provide a composite output stream 123 that aggregates the pixel output streams 451, 452, 453 from this set of technology sources into adjacent windows 202, 203, 204, or otherwise arranges them in a group corresponding to the sensor array. The group of technology source windows is treated as a single combined technology source, so that if any member of the group is selected by the user, the image processing subsystem and the controller subsystem 106 make the entire group of technology sources available in the main operations window 212 of the main control screen 211, whether on a separate control screen or on a combined screen such as shown in FIG. 4. Additional technology sources 404-407 having pixel output streams 420-426 and control function connections 410-416 are also integrated for display in the upper control screen using the image processing subsystem and provided for user interaction in the main operations window by the control subsystem, as heretofore described.
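The group-selection behavior can be illustrated with a small lookup, shown below; the grouping table and source identifiers are hypothetical and stand in for the arrayed cameras 401-403.

```python
# Sketch of treating an arrayed group of cameras as one combined technology source;
# the grouping table and identifiers are hypothetical.

GROUPS = {
    # any member window maps to the whole wide field-of-view group
    "cam_401": ("cam_401", "cam_402", "cam_403"),
    "cam_402": ("cam_401", "cam_402", "cam_403"),
    "cam_403": ("cam_401", "cam_402", "cam_403"),
}

def resolve_selection(source_id):
    """Selecting any member of an array brings the entire group to the main window."""
    return GROUPS.get(source_id, (source_id,))

if __name__ == "__main__":
    print(resolve_selection("cam_402"))    # ('cam_401', 'cam_402', 'cam_403')
    print(resolve_selection("lidar_404"))  # ('lidar_404',) - an ungrouped source stays single
```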
As shown in FIG. 6, an embodiment of the system of the present invention integrates a large number of sensors and still or video cameras. Some of the technology sources may be arranged as arrays of video cameras 501, 502, 503, 504, while others are individual video cameras 505, 506, 507, 508. With this many technology sources, a subsystem selector, for example in the form of a video matrix switcher 500, is added to the system of the present invention. The controller subsystem 106 is also configured to address all the components, including the subsystem selector 500. The subsystem selector's outputs 516, 517, 518, 519 in this case are sent to the image processing subsystem 107, either singly or in arrayed groups, to be displayed on the selection palette screen 201, sent to the master control window 212 of the master control screen 211, and/or sent out over the remote system network 304 to a remote location 515 via an intranet, the Internet or other suitable connection 309. Additional technology sources 509, 510, 511, 512 are also managed via connections 520, 522, 524, 526 by the image processing subsystem and may be represented on the selection palette screen in windows 202-210.
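As a rough sketch of the subsystem selector's role (the class, port counts and method names are assumptions for illustration, not details from the disclosure), a matrix switcher simply maps many inputs onto a smaller number of outputs that feed the image processing subsystem:

```python
# Sketch of a subsystem selector (video matrix switcher 500) with many inputs and a
# few outputs feeding the image processing subsystem; the API shown is hypothetical.

class MatrixSwitcher:
    def __init__(self, num_inputs, num_outputs):
        self.num_inputs = num_inputs
        self.num_outputs = num_outputs
        self.map = {}                          # output index -> input index

    def route(self, input_idx, output_idx):
        if not (0 <= input_idx < self.num_inputs and 0 <= output_idx < self.num_outputs):
            raise ValueError("input or output index out of range")
        self.map[output_idx] = input_idx       # a later route overwrites an earlier one

if __name__ == "__main__":
    # e.g. 16 camera inputs funneled onto 4 outputs (516-519) toward subsystem 107
    sw = MatrixSwitcher(num_inputs=16, num_outputs=4)
    sw.route(input_idx=5, output_idx=0)        # command issued by controller subsystem 106
    sw.route(input_idx=9, output_idx=1)
    print(sw.map)                              # {0: 5, 1: 9}
```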
Referring now to FIG. 7, an interface module (device) 5000 for use with the integrated information technology system of the present invention is configured with a housing 5010, an input source cable 5020, a printed circuit board 5030 and an output connector 5040. A separate interface module may be provided for each technology source to capture its pixel output stream and control functions; alternatively, one or more technology sources may share a similarly configured interface device. The interface module printed circuit board may be configured with a multi-pin connector receptacle 5021 that is removably connected to an input plug 5022, which in turn is electrically connected to a cable (wire bundle) or other communication mechanism from a technology source.
The interface module circuit board 5030 may further be configured with a microprocessor 5032 and a connector for a power source 5034. The circuit board is further configured with an output connector 5036 that is electrically connected to the module output connector 5040, which accepts a standard cable connector 5042 (for example, RS-232 compatible) for connection to the main system enclosure. The module may be configured with a removable lid (not shown) and constructed from plastic, metal or other material suitable for withstanding the environment in which the integrated system of the present invention is to be used. Alternatively, the interface module may be located within the technology source or otherwise attached, connected or placed adjacent to the technology source. When the technology source has control functions, the circuit board provides for input and output of control signals. When the technology source is not controllable, the interface module merely provides a direct connection of the pixel output stream to the image processing subsystem.
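The two roles of the interface module (pass-through of pixel data for every source, control traffic only for controllable sources) can be sketched as below; the class and the in-memory uplink are hypothetical stand-ins for the module's output connector.

```python
# Sketch of the interface module's two roles; class names and the in-memory
# uplink are hypothetical illustrations, not the module's actual firmware.

import io

class InterfaceModule:
    def __init__(self, controllable):
        self.controllable = controllable
        self.uplink = io.BytesIO()             # stands in for the module output connector

    def forward_pixels(self, frame_bytes):
        # Pixel output is always passed through toward the image processing subsystem.
        self.uplink.write(frame_bytes)

    def forward_control(self, command_bytes):
        # Control traffic is only meaningful for sources that expose control functions.
        if not self.controllable:
            raise RuntimeError("this technology source is display-only")
        self.uplink.write(command_bytes)

if __name__ == "__main__":
    cam = InterfaceModule(controllable=True)
    cam.forward_pixels(b"\x00\x01\x02")
    cam.forward_control(b"PAN +5\r\n")
    gauge = InterfaceModule(controllable=False)
    gauge.forward_pixels(b"\x10\x11")          # allowed: pixels always pass through
    # gauge.forward_control(b"...")            # would raise: no control path for this source
```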
With reference to FIG. 8, the integrated information technology system of the present invention may be incorporated into a mining truck 2000 and shovel 3000 system. The system of the present invention may be designed to address the various information technologies, video imaging and other sensor systems that are finding their way into the cabs of mining shovels. Installing the system of the present invention on a shovel allows more technology to be inserted into the operations environment in a user-beneficial, ergonomic and easy-to-use manner. It provides the shovel operator with real-time monitoring of all technology systems as well as an intuitive and simple source selection and operations panel. A remote access option is available for mine operators who wish to monitor a shovel from a remote site. It would be impractical for a remote monitor to receive all sources from even one shovel, much less all sources from multiple shovels; for example, ten shovels with four computers and four cameras each would generate eighty information channels, and the mine's remote network would quickly overload no matter how sophisticated it was. The remote access option addresses this issue: it allows the remote monitor to send a request over the network to the system on board any shovel, and, upon request, the system aboard that shovel selects a single source and streams it onto the mine's remote network.
The remote monitor receives the streamed remote source on his or her display and can quickly change the selection to: (a) the entire source selection panel (using this function consumes one system source input and therefore reduces the total number of available system sources by one); (b) any source system; or (c) any source image. The source selection panel provides a quick thumbnail overview of all of the shovel's sources and systems. If a remote monitor is interested in only one aspect of a shovel's system sources or source images, he or she can simply default to that selection when addressing each shovel. This all happens without any interruption, effect or required notification to the operator in the shovel itself; the operator's displays do not change regardless of what the remote user is monitoring. The remote access option's greatest feature is that, no matter how many shovels, system sources or source images a remote monitor is managing, the remote access system imposes only a single channel of streaming video on the network. The load on the mine's remote network increases only when mine operations adds more remote monitors who access additional channels simultaneously.
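The network-load point above can be checked with a short back-of-the-envelope calculation; the shovel and source counts are the example figures from the text, and the function names are illustrative only.

```python
# Back-of-the-envelope sketch of the network-load argument made above; the numbers
# (ten shovels, four computers and four cameras each) are the example figures from the text.

def channels_without_remote_option(shovels, sources_per_shovel):
    """Every source from every shovel streamed continuously."""
    return shovels * sources_per_shovel

def channels_with_remote_option(remote_monitors):
    """Each remote monitor pulls exactly one selected stream at a time."""
    return remote_monitors

if __name__ == "__main__":
    print(channels_without_remote_option(shovels=10, sources_per_shovel=4 + 4))  # 80
    print(channels_with_remote_option(remote_monitors=1))                        # 1
    print(channels_with_remote_option(remote_monitors=3))                        # 3
```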
FIG. 9 shows the process steps used by the method of the present invention. In step 601, all technology sources are processed by the image processing subsystem and arranged or grouped on a selection palette. In step 602, the operator in the operations environment selects one of the technology sources, either via a touch screen or via another I/O selection device connected to the controller subsystem. In step 603, the controller subsystem sends out commands that cause the image processing subsystem to send the pixel output stream from the selected technology source to the main operations window, whether or not it is on a separate main control screen; this selection replaces the previous selection in the main control window. In step 604, if the selected technology source is interactive rather than passive, the controller subsystem sends the appropriate I/O signals to the technology source to allow operation of that technology source from the main control window.
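A condensed sketch of steps 601-604 follows; the function signatures and the simplified controller object are assumptions made for illustration and are not taken from the disclosure.

```python
# Sketch of the local selection flow of FIG. 9 (steps 601-604); names are hypothetical
# and the subsystems are reduced to simple in-memory objects.

class Controller:
    def attach(self, source_id, window):
        # Step 604: wire the source's control I/O into the main control window.
        window["controls"] = f"I/O mapped for {source_id}"

def local_selection_flow(palette, selected_id, main_window, controller):
    # Step 601 happened earlier: `palette` already holds all processed sources.
    source = palette[selected_id]                 # step 602: operator picks a source
    main_window["stream"] = source["stream"]      # step 603: stream replaces the prior selection
    if source.get("interactive"):                 # step 604: only interactive sources get control I/O
        controller.attach(selected_id, main_window)
    return main_window

if __name__ == "__main__":
    palette = {"drill_monitor": {"stream": "drill_feed", "interactive": True}}
    print(local_selection_flow(palette, "drill_monitor", {"stream": None}, Controller()))
```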
FIG. 10 shows the process steps used by the invention when a remote operator or remote monitor wishes to select a technology source. In step 701, the system is set up as in step 601 of FIG. 9. In step 702, the remote operator requests that the entire selection screen, with its many technology source windows arrayed into a selection palette, be sent to the remote location, so that the remote operator can view and choose the system, sensor or video he or she is interested in viewing and/or operating. In step 703, the remote operator makes such a selection and the controller subsystem sends the appropriate commands to the image processing subsystem to send the selection to the system output connected to the network. In step 704, the pixel output stream from the selected technology source is sent to the remote operator's location. In step 706, if the technology source is interactive, the controller subsystem acts as an intermediary to relay I/O control information to and from the remote operator. Meanwhile, if there is any conflict between local and remote control of a single technology source, the controller subsystem mediates the conflicting requests in step 705.
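The remote flow of steps 701-706 can be sketched as follows; note that the "local operator wins" mediation policy shown for step 705 is an assumption for illustration only, since the text does not specify how conflicts are resolved.

```python
# Sketch of the remote selection flow of FIG. 10 (steps 701-706), including the conflict
# mediation of step 705; the mediation policy and names are illustrative assumptions.

def remote_selection_flow(palette, remote_choice, local_owner):
    # Steps 701-702: the palette of thumbnails is already streaming to the remote site.
    stream = palette[remote_choice]["stream"]                # steps 703-704: route chosen stream out
    interactive = palette[remote_choice].get("interactive", False)
    control = None
    if interactive:                                          # step 706: relay control I/O remotely
        if local_owner == remote_choice:                     # step 705: mediate local/remote conflict
            control = "denied: source under local control"
        else:
            control = "granted"
    return {"stream": stream, "control": control}

if __name__ == "__main__":
    palette = {
        "ptz_cam": {"stream": "ptz_feed", "interactive": True},
        "gps_map": {"stream": "map_view", "interactive": False},
    }
    print(remote_selection_flow(palette, "ptz_cam", local_owner="ptz_cam"))  # control denied
    print(remote_selection_flow(palette, "gps_map", local_owner="ptz_cam"))  # view only
```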
Referring now to FIGS. 11-15, an embodiment of the user interface subsystem 1000 may be configured with an upper control screen 1010 having a plurality of windows for displaying the pixel output streams of the available technology sources in a common operating picture. The user interface subsystem is further configured with a lower screen 1020 having a main operations window. The lower screen is used for displaying operator functions, such as a schedule 1022 (FIG. 11) or general menu functions 1024 (FIG. 14), which originally appear in a selection window 1056 in the upper control screen but, upon selection, are transferred to the main operations window. Both the upper screen and the lower screen may be configured with touch control capabilities so that the operator can easily select functions; the upper and lower screens may further be activated by a keyboard, mouse or other operator interface device. As shown in FIG. 11, the upper screen may be configured to display several controller sources in separate windows 1051-1056, and may be configured to display, and allow the operator to select, several image sources in other windows 1061-1066. As shown in FIG. 12, the operator may select any one of the controller sources by touching the segment of the screen associated with a controller source, for example controller source 1051. When such a controller source is selected, the lower screen displays that source in the main operations window, and at the same time the control functions 1025 of the selected controller source are mapped to the touch panel interface or other control device in the main operations window (FIG. 13). Similarly, as shown in FIG. 14, when an image source 1065 is selected, the lower screen displays the pixel output stream 1026 of the selected image source (FIG. 15).
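A small sketch of how control functions might be mapped onto touch regions of the lower screen follows; the coordinate scheme, screen dimensions and function names are hypothetical and only illustrate the mapping idea of FIGS. 12-13.

```python
# Sketch of mapping a selected controller source's functions onto touch regions of the
# lower screen; the layout scheme and the example function list are hypothetical.

def map_controls_to_touch_regions(control_functions, screen_width=1920, strip_height=200):
    """Divide a strip of the lower screen into equal touch regions, one per function."""
    if not control_functions:
        return {}
    region_width = screen_width // len(control_functions)
    return {
        name: (i * region_width, 0, (i + 1) * region_width, strip_height)
        for i, name in enumerate(control_functions)
    }

if __name__ == "__main__":
    regions = map_controls_to_touch_regions(["start", "stop", "raise", "lower"])
    for name, rect in regions.items():
        print(name, rect)   # e.g. start (0, 0, 480, 200)
```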
While particular forms of the present invention have been illustrated and described, it will also be apparent to those skilled in the art that various modifications can be made without departing from the spirit and scope of the invention. Accordingly, it is not intended that the invention be limited, except as by the appended claims.

Claims

We Claim:
1. A system for integrating technology sources, comprising: at least two technology sources, wherein each technology source has at least one pixel output stream; and an image processing subsystem connected to each pixel output stream, the image processing subsystem being configured to combine the pixel output streams into a composite output stream.
2. The system of claim 1, further including a user interface subsystem connected to the image processing subsystem and configured to display at least one of the pixel output streams to a user.
3. The system of claim 2, wherein the user interface subsystem is configured such that a user can select one of the technology sources so as to display the pixel output stream from the selected technology source in a main operations window.
4. The system of claim 2, wherein the user interface subsystem includes more than one display screen.
5. The system of claim 2, wherein the user interface subsystem includes a single display screen.
6. The system of claim 1, wherein a first technology source is configured to generate a first pixel output stream having a first size format, and a second technology source is configured to generate a second pixel output stream having a second size format different from the first size format.
7. The system of claim 1, wherein a first technology source is configured to generate a first pixel output stream having a first resolution format, and a second technology source is configured to generate a second pixel output stream having a second resolution format different from the first resolution format.
8. The system of claim 2, further including a control subsystem configured to allow a user to interact with at least one of the technology sources.
9. The system of claim 8, wherein the user interface subsystem and the control subsystem are configured such that a user can select one of the technology sources so as to allow the user to interact with the selected technology source in a main operations window.
10. A system for integrating technology sources at a remote location, comprising: at least two local technology sources, wherein each local technology source has at least one pixel output stream; an image processing subsystem connected to each pixel output stream, the image processing subsystem being configured to combine the pixel output streams into a composite output stream; and a remote user interface subsystem connected to the image processing subsystem and configured to request and receive each pixel output stream and the composite output stream at a remote location.
11. The system of claim 10, wherein the remote user interface subsystem is configured such that a remote user can select one of the technology sources so as to display the pixel output stream from the selected technology source in a remote operations window.
12. The system of claim 11, further including a control subsystem configured to allow a remote user to interact with at least one of the technology sources.
13. The system of claim 12, wherein the remote user interface subsystem and the control subsystem are configured such that a remote user can select one of the technology sources so as to allow the remote user to interact with the selected technology source in a remote operations window.
14. The system of claim 13, wherein a first technology source is configured to generate a first pixel output stream having a first format, and a second technology source is configured to generate a second pixel output stream having a second format different from the first format.
15. A system for integrating technology sources, comprising: at least two technology sources, wherein each technology source has at least one pixel output stream, wherein a first technology source is configured to generate a first pixel output stream having a first format, and a second technology source is configured to generate a second pixel output stream having a second format different from the first format; an image processing subsystem connected to each pixel output stream, the image processing subsystem being configured to combine the pixel output streams into a composite output stream; and a user interface subsystem connected to the image processing subsystem and configured to display at least one of the pixel output streams to a local user in a first screen and a second screen.
16. The system of claim 15, wherein the user interface subsystem is configured with a plurality of control windows in the first screen for displaying each pixel output stream, such that a user can select one of the technology sources from one of the control windows so as to display the pixel output stream from the selected technology source in a main operations window configured in the second screen.
17. The system of claim 16, further including a control subsystem configured to allow a user to interact with at least one of the technology sources, wherein the user interface subsystem and the control subsystem are configured such that a user can select one of the technology sources from the first screen so as to allow the user to interact with the selected technology source in the main operations window.
18. A system for integrating technology sources, comprising: at least two technology sources, wherein each technology source has at least one pixel output stream, wherein a first technology source is configured to generate a first pixel output stream having a first format, and a second technology source is configured to generate a second pixel output stream having a second format different from the first format; an image processing subsystem connected to each pixel output stream, the image processing subsystem being configured to combine the pixel output streams into a composite output stream; and a user interface subsystem connected to the image processing subsystem and configured to display at least one of the pixel output streams to a local user in a single screen.
19. The system of claim 18, wherein the user interface subsystem is configured with a plurality of control windows in the single screen for displaying each pixel output stream, such that a user can select one of the technology sources from one of the control windows so as to display the pixel output stream from the selected technology source in a main operations window configured in the single screen.
20. The system of claim 19, further including a control subsystem configured to allow a user to interact with at least one of the technology sources, wherein the user interface subsystem and the control subsystem are configured such that a user can select one of the technology sources from the plurality of control windows so as to allow the user to interact with the selected technology source in the main operations window.
21. A method for integrating a system of technology sources, comprising: providing at least two technology sources, wherein each technology source has at least one pixel output stream, wherein a first technology source is configured to generate a first pixel output stream having a first format, and a second technology source is configured to generate a second pixel output stream having a second format different from the first format; connecting an image processing subsystem to each pixel output stream, the image processing subsystem being configured to combine the pixel output streams into a composite output stream; and providing a local user interface subsystem connected to the image processing subsystem and configured to display at least one of the pixel output streams to a local user, wherein the local user interface subsystem is configured with at least one control window for displaying at least one pixel output stream, such that a local user can select one of the technology sources from a control window so as to display the pixel output stream from the selected technology source in a local operations window.
22. The method of claim 21, further including providing a control subsystem configured to allow a local user to interact with at least one of the technology sources, wherein the local user interface subsystem and the control subsystem are configured such that a local user can select one of the technology sources so as to allow the local user to interact with the selected technology source in the local operations window.
23. The method of claim 22, further including providing a remote user interface subsystem configured such that a remote user can select one of the technology sources to display the pixel output stream from the selected technology source at a remote operations location.
24. The method of claim 23, further including configuring the control subsystem and the remote user interface subsystem to allow a remote user to interact with at least one of the technology sources, wherein the remote user can select one of the technology sources so as to allow the remote user to interact with the selected technology source from the remote operations location.
PCT/US2007/084336 2006-11-09 2007-11-09 Integrated information technology system WO2008058283A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US85793606P 2006-11-09 2006-11-09
US60/857,936 2006-11-09

Publications (2)

Publication Number  Publication Date
WO2008058283A2  2008-05-15
WO2008058283A3  2008-10-16

Family ID: 39365403



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060064716A1 (en) * 2000-07-24 2006-03-23 Vivcom, Inc. Techniques for navigating multiple video streams
US20030202101A1 (en) * 2002-04-29 2003-10-30 Monroe David A. Method for accessing and controlling a remote camera in a networked system with multiple user support capability and integration to other sensor systems
US20050091311A1 (en) * 2003-07-29 2005-04-28 Lund Christopher D. Method and apparatus for distributing multimedia to remote clients
US20050246628A1 (en) * 2004-03-19 2005-11-03 Judd Peterson Facility reference system and method
US20060041181A1 (en) * 2004-06-04 2006-02-23 Viswanathan Raju R User interface for remote control of medical devices



Legal Events

Code  Description
121  Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 07845030; Country of ref document: EP; Kind code of ref document: A2)
NENP  Non-entry into the national phase (Ref country code: DE)
122  Ep: pct application non-entry in european phase (Ref document number: 07845030; Country of ref document: EP; Kind code of ref document: A2)