GB2282726A - Co-operatively slavable phased virtual image sensor array - Google Patents

Co-operatively slavable phased virtual image sensor array

Info

Publication number
GB2282726A
GB2282726A
Authority
GB
United Kingdom
Prior art keywords
virtual image
image sensor
operatively
phased
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB9323779A
Other versions
GB9323779D0 (en)
Inventor
Roger Colston Downs
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GB939317600A external-priority patent/GB9317600D0/en
Priority claimed from GB939317573A external-priority patent/GB9317573D0/en
Priority claimed from GB939317602A external-priority patent/GB9317602D0/en
Application filed by Individual filed Critical Individual
Priority to GB9323779A priority Critical patent/GB2282726A/en
Publication of GB9323779D0 publication Critical patent/GB9323779D0/en
Priority to GB9807454A priority patent/GB2320392B/en
Priority to AU74654/94A priority patent/AU7465494A/en
Priority to PCT/GB1994/001845 priority patent/WO1995006283A1/en
Priority to GB9601754A priority patent/GB2295741B/en
Priority to US08/601,048 priority patent/US6233361B1/en
Priority to GB9725082A priority patent/GB2319688B/en
Publication of GB2282726A publication Critical patent/GB2282726A/en
Withdrawn legal-status Critical Current


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • H04N5/142Edging; Contouring
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4802Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Signal Processing (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Input (AREA)
  • Facsimile Scanning Arrangements (AREA)
  • Image Processing (AREA)

Abstract

A co-operatively slavable phased virtual image sensor is an array of virtual image sensors V1, V2, V3, where each virtual image sensor is logically and physically organised to regard a common scenario from a different perspective (figure 1), such that common image detail from the different perspectives of each virtual image sensor is registered in corresponding line scans of each of the virtual image sensors, and where each virtual image sensor's frame and line sync generation is controlled to have coincidence in time. The fields of view of each virtual image sensor may be co-operatively and electronically slaved across the field of regard, afforded by the combined fields of view of the image sensors comprising each of the virtual image sensors, such that they maintain their differing perspective views of the common scenario and time coincidence of their frame and line synchronisation. The outputs of the pair of image sensors (7, 10, Fig. 2) of each virtual image sensor may be sampled in sequence to simulate an azimuthally scanned field of view 15, equal to that of each image sensor.

Description

CO-OPERATIVELY SLAVABLE PHASED VIRTUAL IMAGE SENSOR ARRAY

This invention relates to a co-operatively slavable phased virtual image sensor array. Some applications require that image sensors (CCD or equivalent) be logically and physically positioned and oriented such that they generate different perspective views of a common scenario, that common image detail is registered in corresponding line scans of all such image sensors in the array, and further that the frame and line sync generation by all such image sensors has time coincidence. It may further be required by some applications for the image sensors in such an array to accurately slave their respective fields of view to a different but still common scenario, or for the image sensors of such an array to perform an accurate and co-operatively synchronised scan of a common field of regard. Whilst mechanical slaving of image sensors may satisfy some requirements, virtual image sensors support the fast and accurate positioning of their fields of view, and allow controlled, accurate, electronically co-ordinated and synchronised scanning between sensors. For all such requirements the image information from such a multiple image sensor system may need to be passed to an external processing system.

According to the present invention there is provided a co-operatively slavable phased virtual image sensor array comprising a number of equivalent virtual image sensors, logically and physically positioned and orientated, where the boresights of their respective fields of view are parallel such that they generate images of a common scenario from their different perspectives, where common image detail is registered in corresponding line scans for each such virtual image sensor, where the frame and line sync generation for each such virtual image sensor is controlled such that a time coincidence exists between the characteristics of these signals, and where each such virtual image sensor's field of view may
be slaved or scanned in a controlled, co-ordinated and synchronised fashion such as to preserve the differing perspective views of a common scenario within their shared field of regard, and where such image information may be passed to an external processor system.

A specific embodiment of the invention will now be described by way of example with reference to the accompanying drawings, in which: Figure 1 shows a common scenario viewed by three virtual image sensors comprising ordered pairs of image sensors; Figure 2 shows a representation of the time distribution of image data from an ordered pair of image sensors comprising a virtual image sensor; Figure 3 shows a system block diagram identifying major functional areas and important signals between them; Figure 4 shows important waveforms used in the system.
Referring to the drawings, figure 1 shows a horizontally aligned phased virtual image sensor array V1, V2 and V3 comprising three ordered pairs of image sensors (CCD or equivalent) 1 and 2, 3 and 4, 5 and 6 respectively. The orientation of the image sensors comprising an ordered pair supports a field of regard comprising the adjoining fields of view of each image sensor of the pair. Each virtual image sensor is positioned at known separation in the phased array, in this example horizontally aligned, such that their orientation ensures that the boresights of the fields of view of each of the virtual image sensors are parallel, that the field of view of each virtual image sensor shares a common scenario with the other virtual image sensors, and that common image detail is contrived to register in corresponding line scans of the virtual image sensors. A representation of the time distribution of image information from each image sensor (figure 2) shows visible luminance data 7, with frame and line sync, porch and non-visible luminance information 8, and a frame begin time 9. For the ordered pairs of image sensors comprising each virtual image sensor, the relative image sensor synchronisation is organised such that the lefthand image sensor of a pair has frame begin time 9 whilst the righthand image sensor of the pair has the relative frame begin time 12. Visible luminance information for the righthand image sensor of the pair exists in the rectangle 10; the border area 11 comprises the frame and line sync, porch and non-visible luminance information of this image sensor. A time slice XX would comprise line scan information 13 and 14 from the two image sensors comprising an ordered pair, such that luminance information between them is continuous in time.
A subset of luminance information 15, comprising a field of view equivalent to the field of view of either image sensor in a pair, may be extracted and combined with synthetic frame and line sync information 16 to form a frame of composite video signal of a virtual image sensor with frame begin time 17. This field of view 15 is capable of being slaved within a field of regard as defined by the combined fields of view 7 and 10 of each image sensor in the pair, in this example allowing an azimuth scan.
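The extraction just described can be sketched as a sliding window across the combined fields of view of a sensor pair. This is a minimal illustrative model, not the patent's circuitry: it assumes each frame is a list of equal-length line scans whose luminance is continuous in time across the pair boundary, and all names are invented for the example.

```python
def extract_virtual_fov(left_frame, right_frame, offset):
    """Extract a virtual-sensor field of view, one sensor wide,
    starting `offset` pixels into the combined field of regard.

    left_frame / right_frame: lists of equal-length line scans from
    the ordered pair of image sensors; offset may range from 0 to
    the sensor width, sliding the window in azimuth.
    """
    width = len(left_frame[0])
    if not 0 <= offset <= width:
        raise ValueError("offset outside field of regard")
    virtual = []
    for left_line, right_line in zip(left_frame, right_frame):
        # Each virtual line is the tail of the left sensor's line
        # followed by the head of the right sensor's line.
        virtual.append(left_line[offset:] + right_line[:offset])
    return virtual


left = [[0, 1, 2, 3], [10, 11, 12, 13]]   # lefthand sensor luminance (7)
right = [[4, 5, 6, 7], [14, 15, 16, 17]]  # righthand sensor luminance (10)
window = extract_virtual_fov(left, right, 2)  # slaved field of view (15)
```

At offset 0 the window coincides with the lefthand sensor's field of view; at the maximum offset it coincides with the righthand sensor's, giving the azimuth scan of the text.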
With particular reference to the system block diagram (figure 3), composite video CV1,2,3,4,5,6 from the image sensors 1, 2, 3, 4, 5, 6 respectively is passed to the Frame Line Mask FLM 18 circuitry, which generates Frame begin signals F1,2,3,4,5,6 (waveform 35), Frame and line sync signals L1,2,3,4,5,6, and Luminance mask signals M1,2,3,4,5,6 (waveform 39) for each of the image sensors. Frame begin signals F1,2,3,4,5,6 are passed to the Sensor Clock Synchronisation SCS 21 circuitry, and Frame and line sync L3,4,5,6 are passed to the Fine Line Position FLP 22 circuitry. Frame and line sync information L1, L2 from image sensors 1 and 2 is normally passed unmodified via the Shifted SYnc SSY 60 circuitry, as L1' and L2' respectively, to the Fine line position FLP 22 circuitry. Frame and line sync information L1 is also passed to the Scanner SCNR 23 circuitry.
The purpose of the Sensor clock synchronisation SCS 21 circuitry is to establish and maintain the necessary relative frame and line synchronisation between the image sensors 1, 2, 3, 4, 5, 6 in the array. The SCS 21 circuitry achieves this in conjunction with the FLP 22 circuitry by controlling the clock drive to each of the image sensors in the array, thereby controlling each image sensor's frame and line sync generation, which provides the control feedback loop to the SCS 21 circuitry. In this particular example image sensor IS1 1 is used as a reference sensor, and the relative frame begin time synchronisation of the lefthand and righthand image sensors of a pair corresponds with that shown in figure 2, where the frame begin time for the lefthand image sensor of a pair (IS1, IS3 and IS5) is represented by 9, whilst the relative frame begin time synchronisation for the righthand image sensor of a pair (IS2, IS4 and IS6) is represented by 12. The SCS 21 circuitry also generates a Frame window FW signal (waveform 36) which corresponds with the frame syncing period; this signal is passed to the Scanner SCNR 23 circuitry. The purpose of the SCaNneR SCNR 23 circuitry is to allow controlled automatic or manual scanning of each virtual image sensor's field of regard by the virtual image sensor's field of view. It comprises a clock whose pulse rate is variable and which, during frame sync periods as defined by the signal FW (waveform 36), drives a cascaded count up counter whose maximum count corresponds with the number of sensor resolution periods in a luminance period. This counter automatically resets when the maximum count is reached. During any particular frame period this count up register holds a particular and fixed value which represents the Scanner's frame and line sync RS time offset (in image sensor resolution periods) relative to the reference sensor IS1 1's frame and line sync generation. The output from this counter is loaded for
every line scan of a frame into cascaded count down registers by the action of the image sensor IS2 mask signal M2 (waveform 39) going low. A clock, also gated by the mask M2 and operating at the maximum frequency of the image sensor's bandwidth, counts down the value loaded into the cascaded count down registers. When the outputs of all these cascaded count down registers reach zero, elements of frame and line sync information are generated, which over the course of a frame period form the synthetic frame and line sync information SS to be combined with the subsets of luminance signals LS1,2, LS3,4, LS5,6 extracted from the image sensors of the ordered image sensor pairs 1 and 2, 3 and 4, 5 and 6 respectively, thereby forming for each virtual image sensor a composite video signal. Because the relative frame and line sync generation between each image sensor in an ordered pair is the same, and because each lefthand image sensor of an ordered pair has synchronisation corresponding with that of image sensor IS1, the synthetic sync information SS so generated is appropriate for the combined luminance subsets of all virtual image sensors formed by each ordered image sensor pair in the phased virtual image sensor array. Since in this example the image sensors are horizontally aligned, virtual image sensor subsets of luminance signals may only be offset in a horizontal direction,
there being no sensible possibility here of scanning the combined fields of view in a vertical plane. To generate the necessary luminance subset masks, the rising edge of image sensor IS2 2's mask M2 triggers a monostable generating a signal Q whose maximum duration is greater than 0.064 ms. The presence of signal Q forces the selection of luminance from image sensors IS2 2, IS4 4 and IS6 6 for the virtual image sensors V1, V2 and V3 respectively. The monostable is reset by the action of the countdown registers producing a zero result, and the presence of signal Q's complement Q̄ is the mask signal used to select luminance subsets from image sensors IS1 1, IS3 3 and IS5 5 for the virtual image sensors V1, V2 and V3 respectively. Luminance subset signals are selected and combined through analogue switching gates controlled by the signal Q and its complement Q̄, and form the luminance signals VL1, VL2 and VL3 of the virtual image sensors V1, V2 and V3. Composite video CV7, comprising a particular virtual image sensor's processed luminance subset signal combined with shifted frame and line sync information SS through the action of the Video mixer VM 24, may be displayed on Display D 26.
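The Scanner's count up counter described above can be modelled as a register that is clocked only during frame sync periods and therefore holds one fixed offset per frame. This is a hedged sketch under that reading of the text; the function name, the uniform step size, and the parameter names are all illustrative.

```python
def scan_offset_sequence(start, step, max_count, frames):
    """Model of the Scanner SCNR's count up counter: it is advanced
    only during frame sync periods (gated by the FW signal), so the
    value it holds is fixed for the whole of each frame, steps between
    frames, and automatically resets on reaching the maximum count
    (the number of sensor resolution periods in a luminance period)."""
    offsets = []
    value = start
    for _ in range(frames):
        offsets.append(value)                # held constant over the frame
        value = (value + step) % max_count   # stepped during frame sync
    return offsets
```

Each value in the returned sequence is what would be reloaded into the count down registers at the start of every line scan of the corresponding frame, placing the synthetic sync element a fixed number of resolution periods after the mask edge.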
In this particular example an image pattern thread processor, comprising Luminance Differential Processor LDP 19 circuitry, Address generation ADG 20 circuitry and Stack pointer SP 27 circuitry, is employed. The LDP 19 circuitry essentially performs spectral analysis of the luminance signal from each image sensor. The binary event signals BE1,2,3,4,5,6 produced by the luminance differential processor from the composite video signals CV1,2,3,4,5,6 of the image sensors 1, 2, 3, 4, 5, 6 respectively are also passed to the Scanner SCNR 23 circuitry. The binary nature of these signals allows direct logic gating with the subset luminance mask signal Q and its complement to generate the virtual image sensor binary event signals VBE1, VBE2 and VBE3 of each of the virtual image sensors V1, V2 and V3 respectively.
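The logic gating of binary events with the Q mask can be sketched per resolution period. This is an illustrative model only: it assumes Q is high for the first `offset` periods of a line (between the M2 edge and the countdown reaching zero), and the function and parameter names are invented for the example.

```python
def gate_binary_events(be_left, be_right, offset):
    """Model of gating binary event signals with the luminance subset
    mask: while Q is high, events from the righthand sensor of the pair
    pass through; once the count down registers reach zero, the
    complement of Q passes the lefthand sensor's events instead,
    yielding the virtual-sensor binary event line VBE."""
    vbe = []
    for pixel in range(len(be_left)):
        q_high = pixel < offset        # Q set until the countdown hits zero
        vbe.append(be_right[pixel] if q_high else be_left[pixel])
    return vbe
```

Because the signals are binary, this selection needs only simple gates in hardware, in contrast to the analogue switching used for the luminance subsets.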
These signals may be similarly combined with the synthetic Frame and line sync information SS in the Video mixer VM 24 to form the composite video signal CV7 for display on Display D 26.
Automatic electronic co-operative scanning rates of the virtual image sensors across their field of regard are manually variable by modification of the Scanner SCNR 23's variable rate clock frequency.
This example allows the output from the count up counter to be overridden and the control and slave position XSO taken from an external processor, simplified here by the use of the digitised output from the X potentiometer demand of a Joystick 25. This feature allows synchronous and co-ordinated horizontal scanning by each of the virtual image sensors' fields of view, under manual control, across the field of regard formed by the combined fields of view of an image sensor pair. The synthetic frame and line sync SS generated by the Scanner SCNR 23, appropriate to any particular position of the slaved virtual image sensor field of view boresight, is passed to the Frame line mask FLM 18 circuitry, which generates the associated Frame begin signal VF (waveform 35), Frame and line sync signal VL (waveform 39) and Mask signal VM (waveform 39) appropriate for each of the virtual image sensors in the array; these signals are passed to the ADdress Generation ADG 20 circuitry. The SCNR 23 also generates as output the signal m4, suitable for passing across an inter processor link, which defines the actual scan position of each virtual image sensor's field of view boresight within its field of regard.
The Address generation ADG 20 circuitry supports output to an External computer system ExCs 29 of image data from the virtual image sensor array, in this example for the particular vector attributes identified by the LDP 19 circuitry. To achieve this the ADG 20 circuitry generates identities representative of the azimuth VI (position in line) and elevation SI (line in frame), within a virtual image sensor's field of view, of any possible frequency excursion events identified by the LDP 19 circuitry; both the signals VI and SI are passed to the Write memory control WMC 28 circuitry. In this particular example the memory used is physically partitioned according to frame, virtual image sensor and binary event attribute, but only logically partitioned within a frame on the basis of the line scan SI identity. The Address generation ADG 20 circuitry also generates an End marker EM signal; this signal is passed to both the Write memory control WMC 28 circuitry and the Stack pointer SP 27 circuitry, and is used to terminate line stacks at the end of each virtual image sensor's luminance period. The ADG 20 circuitry also generates the signal FO, which identifies odd and even frames and thereby allows the Write memory control WMC 28 circuitry and External computer system ExCs 29 double buffering of the memories M1A/B, M2A/B and M3A/B used in the inter processor link.
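The odd/even-frame double buffering driven by FO can be sketched as a pair of memories whose writer and reader roles swap every frame. This is a simplified software model of the idea, under the assumption that FO alternates per frame; the patent's three-state bus gating is reduced here to an index swap, and the class and method names are invented.

```python
class DoubleBuffer:
    """Two memories A/B swapped by the odd/even frame signal FO: the
    write memory control fills one while the external computer system
    reads the other, so frame-organised data can be processed
    continuously without read/write contention."""

    def __init__(self, size):
        self.mem = [[0] * size, [0] * size]   # buffers A and B

    def write(self, fo, addr, value):
        self.mem[fo % 2][addr] = value        # writer uses buffer FO

    def read(self, fo, addr):
        return self.mem[1 - fo % 2][addr]     # reader uses the other


buf = DoubleBuffer(8)
buf.write(0, 3, 99)        # even frame: writer fills buffer A
value = buf.read(1, 3)     # next (odd) frame: reader drains buffer A
```

On each new frame the roles reverse, so data written during frame n is available to the external processor throughout frame n+1.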
Binary event data output from the Luminance differential processor LDP 19 for each virtual image sensor is passed to the Stack Pointer SP 27 circuitry, which generates for each of the virtual image sensors a pointer VISP1, VISP2, VISP3 within each stack, contained in the memories M1A/B, M2A/B and M3A/B respectively, to the particular address to be used, at the next binary event from a particular virtual image sensor, to store its associated vector identity VI.
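The per-line event stacks maintained by these pointers can be sketched as follows. This is a hedged model assuming a fixed-size stack partition per line scan, with the partition base standing in for the SI identity; the function name, the dict-of-events input format, and the zero fill are all illustrative choices, not taken from the patent.

```python
def record_events(events, stack_size, num_lines):
    """Model of the line-stack writes: each line scan SI owns a
    fixed-size partition of memory; a per-sensor stack pointer VISP
    starts at zero for each line and advances with every binary event,
    so each vector identity VI (position in line) is written at the
    partition base plus the pointer. The End marker EM corresponds to
    the per-line pointer reset at the end of the luminance period.

    events: dict mapping line index SI -> list of VI values."""
    memory = [0] * (stack_size * num_lines)
    for si in range(num_lines):
        visp = 0                              # pointer reset per line (EM)
        for vi in events.get(si, []):
            wa = si * stack_size + visp       # partition base + pointer
            memory[wa] = vi
            visp += 1
    return memory
```

Reading a line's partition back then yields, in order, the azimuth identities of every event detected on that line scan.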
The Write memory control WMC 28 combines the information from the Address generation ADG 20 circuitry and Stack pointer SP 27 circuitry to generate specific addresses WA1=SI+VISP1, WA2=SI+VISP2 and WA3=SI+VISP3 in partitioned memory, into which the identities WD1=WD2=WD3=VI for frequency excursion events from a particular virtual image sensor, line scan and attribute are written. The binary event signals from a particular virtual image sensor, in conjunction with the signal FO, are used to generate the write enable signals WE1A, WE1B, WE2A, WE2B, WE3A and WE3B for the currently available memory associated with a specific image sensor. FO allows the double buffering of memory, where address and data lines are buffered for writing as well as reading by the use of sets of three state gates for the A and B memories involved, allowing an External computer system ExCs 29 continuous processing of frame organised data. The purpose of the Shifted sync SSY 60 circuitry is to override the co-operative synchronous scan of the virtual image sensors and permit, relative to the reference virtual image sensor, an asynchronous scan (i.e. non-parallel field of view boresights and non time coincidence between virtual image sensor frame and line sync signals) by the other virtual image sensors. This feature allows the fields of view of the other virtual image sensors to effectively scan the field of view of the reference virtual image sensor. This is achieved by modifying the apparent relative timing of the virtual image sensor V1's frame and line sync signal by generating time shifted sync signals L1' and L2' from the signals L1 and L2 respectively, which are sent to the Fine line position FLP 22 circuitry; this has the effect of controlling the relative separation of the frame and line sync generation by virtual image sensors V2 and V3. The Shifted sync SSY 60 circuitry operates in two main modes, either under the control of the External computer system ExCs 29 or in autonomous mode. When under the
control of ExCs, the magnitude of the time shift in the L1' and L2' characteristics relative to L1 and L2 respectively, and thereby of V1 relative to V2 and V3, is defined by the Time shift and control TSC parameters. In the autonomous mode the magnitude of the V1 frame and line sync characteristics delay with respect to V2 and V3 is controlled by a sweep generator, which allows the relative frame and line sync of virtual image sensors V2 and V3 in respect of the reference virtual image sensor V1 to change iteratively, step wise for each new frame period, from time coincidence to a maximum of one line scan luminance period.
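The autonomous-mode sweep generator can be modelled as a per-frame shift that steps from time coincidence up to one line-scan luminance period. A minimal sketch, assuming a uniform step size per frame; the 64 µs figure in the usage example is a conventional analogue-video line period used purely for illustration, and the names are invented.

```python
def sweep_shifts(line_period_us, steps):
    """Autonomous-mode sweep generator model: the frame and line sync
    of V2 and V3 relative to the reference V1 is delayed by one further
    step for each new frame period, moving iteratively from time
    coincidence (0) to a maximum of one line scan luminance period."""
    step = line_period_us / steps
    return [i * step for i in range(steps + 1)]


shifts = sweep_shifts(64.0, 4)   # assumed 64 us line period, 4 steps
```

Over one sweep, the other virtual image sensors' line scans slide across the reference sensor's line scan, which is what lets their fields of view effectively scan its field of view.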

Claims (1)

  1. A co-operatively slavable phased virtual image sensor array comprising a number of equivalent virtual image sensors, logically and physically positioned and orientated, and where the boresights of their respective fields of view are parallel and such that they generate images of a common scenario from their different perspectives, and where common image detail is registered in corresponding line scans for each such virtual image sensor, and where the frame and line sync generation for each such virtual image sensor is controlled such that a time coincidence exists between the characteristics of these signals, and where each such virtual image sensor's field of view may be slaved or scanned in a controlled, co-ordinated and synchronised fashion such as to preserve the differing perspective views of a common scenario within their shared field of regard, and where such image information may be passed to an external processor system.
  2. A co-operatively slavable phased virtual image sensor array as claimed in Claim 1 wherein control means are provided for autonomous automatic or manual definition of the virtual image sensor's field of view boresight slave position.
  3. A co-operatively slavable phased virtual image sensor array as claimed in Claim 1 wherein control means are provided for an external processor defined slave position of the virtual image sensor's field of view boresight.
  4. A co-operatively slavable phased virtual image sensor array as claimed in Claim 1 but wherein controlled co-ordinated asynchronous means are provided for an autonomous iterative inter frame time relative shift of the frame and line sync between a reference virtual image sensor and the other virtual image sensors in the virtual image sensor phased array.
    5. A co-operatively slavable phased virtual image sensor array as claimed in Claim 1 but wherein controlled co-ordinated asynchronous means is provided for control by an external processor of a specified time relative shift of the frame and line sync separation between a reference virtual image sensor and the other virtual image sensors in the virtual image sensor phased array.
    6. A co-operatively slavable phased virtual image sensor array as claimed in any preceding claim wherein communication means are provided allowing virtual image sensor derived information to be passed to an external processor system.
    7. A co-operatively slavable phased virtual image sensor array substantially as described herein with reference to figures 1-5 of the accompanying drawings.
GB9323779A 1993-08-24 1993-11-18 Co-operatively slavable phased virtual image sensor array Withdrawn GB2282726A (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
GB9323779A GB2282726A (en) 1993-08-24 1993-11-18 Co-operatively slavable phased virtual image sensor array
GB9725082A GB2319688B (en) 1993-08-24 1994-08-23 Topography processor system
US08/601,048 US6233361B1 (en) 1993-08-24 1994-08-23 Topography processor system
GB9601754A GB2295741B (en) 1993-08-24 1994-08-23 Topography processor system
PCT/GB1994/001845 WO1995006283A1 (en) 1993-08-24 1994-08-23 Topography processor system
GB9807454A GB2320392B (en) 1993-08-24 1994-08-23 Topography processor system
AU74654/94A AU7465494A (en) 1993-08-24 1994-08-23 Topography processor system

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB939317602A GB9317602D0 (en) 1993-08-24 1993-08-24 Co-operatively slavable phased virtual image sensor array
GB939317573A GB9317573D0 (en) 1993-08-24 1993-08-24 Virtual image sensor
GB939317600A GB9317600D0 (en) 1993-08-24 1993-08-24 Image pattern thread processor
GB9323779A GB2282726A (en) 1993-08-24 1993-11-18 Co-operatively slavable phased virtual image sensor array

Publications (2)

Publication Number Publication Date
GB9323779D0 GB9323779D0 (en) 1994-01-05
GB2282726A true GB2282726A (en) 1995-04-12

Family

ID=27451061

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9323779A Withdrawn GB2282726A (en) 1993-08-24 1993-11-18 Co-operatively slavable phased virtual image sensor array

Country Status (1)

Country Link
GB (1) GB2282726A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2014015A (en) * 1978-01-25 1979-08-15 Honeywell Gmbh Method and circuit arrangement for generating on a TV-monitor a partial image of an overall picture
EP0115780A2 (en) * 1983-02-03 1984-08-15 Harris Corporation Automatic registration control system for color television cameras
GB2193412A (en) * 1983-11-30 1988-02-03 Rca Corp Electrical spatial registration of solid state imagers


Also Published As

Publication number Publication date
GB9323779D0 (en) 1994-01-05

Similar Documents

Publication Publication Date Title
US6115176A (en) Spherical viewing/projection apparatus
US5990934A (en) Method and system for panoramic viewing
US3976982A (en) Apparatus for image manipulation
US6111702A (en) Panoramic viewing system with offset virtual optical centers
US5077608A (en) Video effects system able to intersect a 3-D image with a 2-D image
CN101548277B (en) The computer graphics system of multiple parallel processor
US4148070A (en) Video processing system
DE112016002043T5 (en) CIRCULAR DISPLAY OF RECORDED PICTURES
CA2068006A1 (en) Point addressable cursor for stereo raster display
CN100466720C (en) Video composition apparatus, video composition method and video composition program
KR100345591B1 (en) Image-processing system for handling depth information
CA2238768A1 (en) Multi-camera virtual set system employing still store frame buffers for each camera
JP2880168B2 (en) Video signal processing circuit capable of enlarged display
JPS62142476A (en) Television receiver
JPH02250585A (en) Inter face device for digital tv and graphic display
WO1995006283A1 (en) Topography processor system
DE69116537T2 (en) Arbiter circuit for a multimedia system
GB2282726A (en) Co-operatively slavable phased virtual image sensor array
SU834692A1 (en) Device for output of halftone images of three-dimensional objects onto television receiver screen
JPH09186932A (en) View device using virtual optical center
CA2039332A1 (en) Method for controlling a 3d patch-driven video special effects system
Harris et al. Computer-controlled multidimensional display device for investigation and modeling of physiologic systems
JP2975837B2 (en) Method for converting a part of a two-dimensional image to a three-dimensional image
Webb et al. A scalable video rate camera interface
JPH06101844B2 (en) Image processing device

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)