WO2001029649A1 - Image processing method and apparatus for synthesising a representation from a plurality of synchronised moving image camera - Google Patents


Info

Publication number
WO2001029649A1
Authority
WO
WIPO (PCT)
Prior art keywords
image processing
video
cameras
input signals
arrangement according
Prior art date
Application number
PCT/GB2000/004007
Other languages
French (fr)
Inventor
Gideon Matthew Hale
Original Assignee
Tct International Plc
Priority date
Filing date
Publication date
Application filed by Tct International Plc filed Critical Tct International Plc
Priority to AU10362/01A
Publication of WO2001029649A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/111Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • H04N13/117Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/161Encoding, multiplexing or demultiplexing different image signal components
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/167Synchronising or controlling image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/243Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/296Synchronisation thereof; Control thereof
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/12Synchronisation between the display unit and other units, e.g. other display units, video-disc players

Definitions

  • the present invention relates to an image processing method and apparatus.
  • the invention relates particularly but not exclusively to methods and apparatus for three-dimensional image processing.
  • DE-A-4,417,128 discloses a security monitoring arrangement for detecting movement in two dimensions.
  • the described embodiment utilises only one camera, although there is a reference to the possibility of using more than one camera.
  • EP-A-820,039 discloses an automatic inspection arrangement in which the images from a single camera are fed selectively into two frame stores for subsequent processing.
  • JP 10267649 discloses an arrangement employing multiple analogue video cameras whose outputs are digitised and multiplexed. The images appear to be combined into one image prior to processing.
  • An object of the present invention is to provide a method and apparatus in which the outputs of two or more (preferably three or more) cameras can be repeatedly captured in synchronism.
  • the invention provides an image processing arrangement comprising video multiplexer means arranged to multiplex a plurality of video input signals and to generate an output sync signal for synchronising retrieval of the respective video input signals, and image processing means arranged to process groups of two or more corresponding portions of the respective video input signals to produce a combined output signal.
  • the portions are preferably frames but could for example be fields or lines.
  • the arrangement enables the outputs of two or more (preferably three or more) video cameras to be fed via one port into a computer for subsequent processing, for example demultiplexing and stereoscopic processing.
  • the arrangement can be scaled up to accommodate the outputs of more video cameras without placing further demands on the computer hardware.
  • the video multiplexer means is arranged to generate and feed to the image processing means a video output comprising cycles of n synchronised frames, fields or lines of the respective video input signals wherein n is the number of video input signals.
  • the arrangement further comprises frame grabber means arranged to receive and store corresponding frames, fields or lines of the multiplexed video output of the video multiplexer means, the image processing means being arranged to read and process the stored frames, fields or lines.
  • the image processing means is arranged to generate a three-dimensional representation from two or more corresponding overlapping images.
  • the image processing means can optionally be arranged to process two or more overlapping images to generate a further image corresponding to a viewpoint different from the respective viewpoints of the overlapping images.
  • Suitable image processing algorithms for correlating image regions of such overlapping images to enable the shape of the object common to the region of overlap to be regenerated are already known - eg Gruen's algorithm (see Gruen, A W, "Adaptive least squares correlation: a powerful image matching technique", S Afr J of Photogrammetry, Remote Sensing and Cartography, Vol 14 No 3 (1985), and Gruen, A W and Baltsavias, E P, "High precision image matching for digital terrain model generation", Int Arch Photogrammetry, Vol 25 No 3 (1986), p254) and particularly the "region-growing" modification thereto which is described in Otto and Chau, "Region-growing algorithm for matching terrain images", Image and Vision Computing, Vol 7 No 2, May 1989, p83, all of which are incorporated herein by reference.
  • the video multiplexer means is arranged to generate a video output stream comprising cycles of n frames, fields or lines, wherein n is the number of cameras, the frame, field or line rate of the cameras and video multiplexer means being determined by the output sync signal.
  • the invention provides a method of processing a group of video signals wherein the signals are read in synchronism by means of a common sync signal and fed to an image processing means arranged to process groups of two or more corresponding frames or parts thereof of the respective video signals to produce a combined output signal.
  • n is the number of video input signals.
  • the video signals are acquired by respective cameras having overlapping fields of view.
  • Figure 1 is a block diagram showing an arrangement in accordance with the invention;
  • Figure 2 is a diagram of the signal waveforms and video frame structure generated in the arrangement of Figure 1;
  • Figure 3 is a diagrammatic sketch perspective view of a surveillance system incorporating the arrangement of Figure 1;
  • Figure 4 is a ray diagram conceptually illustrating the image processing performed in the computer of the arrangement of Figure 1, and
  • Figure 5 is a screenshot of an image manipulation program suitable for manipulating the 3D object representations generated by the computer of Figure 1.
  • the arrangement comprises up to six Pulnix TM 9701 or TMC 9700 progressive scan digital video cameras of which only three, namely cameras 1, 1A and 1B, are shown. Both monochrome and colour cameras can be mixed in the one installation. Camera 1 is representative of the others and its circuitry is shown in some detail.
  • the analogue video output ports and timing input ports of the cameras are connected to a multiplexer module 2 which in turn has its video output port connected to a frame grabber FG of a computer 3 and has a control input port connected to an RS 232 output port of the computer.
  • each camera comprises timing circuitry 70 (comprising conventional async generator, sync generator, phase-locked loop and timing generator blocks) arranged to receive an initialisation pulse VINIT which causes its electronic shutter (not shown) and the electronic shutters of all the other cameras to open so that all the cameras simultaneously acquire an image focussed on their CCD arrays 20 by their lenses 10.
  • the VINIT pulse also causes the cameras to synchronise their vertical sync pulses VSYNC to VINIT.
  • All the cameras receive a common horizontal sync signal HSYNC from a sync generator block 110 of the multiplexer module 2 and continually output their stored frames at 30 frames/second as analogue video from a signal processing block 80.
  • the latter receives the analogue output of a digital to analogue converter which in turn is connected to frame stores 50.
  • these frame stores are controlled by a memory control block 40 and can be arranged to store either interlaced fields or complete frames, under the control of a switch input INTERLACE/NON-INTERLACE to memory control block 40.
  • complete frames are stored in frame stores 50.
  • the signal to the frame stores 50 is derived from the digitised output of a CDS and automatic gain control block 60 which is connected to the output of CCD array 20.
  • the timing generator of block 70 controls the CCD readout by means of timing signals sent to both CDS/AGC block 60 and a CCD driver block 30.
  • Multiplexer module 2 comprises a sync generator 110 which is provided with a clock CLK and sends a common sync signal not only to the cameras but also to a multiplexer block 90.
  • the overall operation is controlled by a programmed microcontroller 100 which receives control signals (for selecting and triggering the shutters of the cameras) over an RS 232 interface from an RS 232 port (COM 1) of a PC 3.
  • Microcontroller 100 is connected to multiplexer block 90 for this purpose.
  • the microcontroller is preferably firmware controlled and has a port (not shown) for downloading suitable control software.
  • Microcontroller 100 also has an external TTL trigger input which provides an alternative (to the PC) means of camera shutter control.
  • the microcontroller 100 has bidirectional ports for interfacing with a) an auxiliary control interface (not shown) and b) other slave multiplexer modules (not shown) for controlling further cameras.
  • the auxiliary control interface can for example be a hard-wired controller which can supplement or be substituted for the PC 3.
  • the output port for connection to slave multiplexer modules carries timing signals for synchronising all the multiplexer modules and hence all the cameras irrespective of the multiplexer module to which they are connected. If a slave multiplexer module is used, its multiplexed video output signal is fed to one of the video inputs of multiplexer 90 of the master multiplexer module, which is therefore connected to up to five cameras in this mode.
  • these multiplexer modules are also each arranged to generate four projector control signals for switching on up to four pattern projectors P ( Figure 3) as will be discussed subsequently.
  • the signals from the ports to the auxiliary control interface, slave multiplexer modules and pattern projectors are all controlled via the RS 232 interface from the PC 3 at a baud rate of 9600 bits/second.
  • the following commands are provided for:
  • TRIGGER Triggers electronic shutters of all selected cameras
  • RETURN AUXILIARY REGISTER STATUS (indicates status of auxiliary interface, if used).
  • a status indicator in the form of a seven segment LED (not shown) on the multiplexer module 2 indicates the status of the module (and optionally provides diagnostic information).
  • a typical physical arrangement of the cameras 1, 1A, 1B and 1C in eg an observation or surveillance system is shown in Figure 3.
  • an optical projector P switched by multiplexer module 2 projects a pattern (eg a speckle pattern or other fractal pattern of visible or, preferably, infra-red radiation) onto the scene of interest and the cameras 1, 1A and 1B, which may be supported in elevated positions, are focussed onto regions which overlap with each other and with the region illuminated by the pattern.
  • the fields of view of cameras 1A and 1B overlap in region Q1
  • the fields of view of cameras 1 and 1A overlap in region Q2
  • the fields of view of cameras 1A and 1C overlap in region Q3.
  • One or more of the cameras eg camera 1B, as shown
  • the multiplexer module 2 operates in two modes as follows:
  • the PC 3 selects one or more cameras eg 1 and 1A (which are then rapidly synchronised to the HSYNC signal of the multiplexer module 2) and repeatedly instructs the multiplexer module to send a TRIGGER signal to operate the electronic shutters of the selected cameras. This enables the cameras to be adjusted and focussed in real time.
  • in the second mode, the multiplexer module sends a single TRIGGER signal to the electronic shutters of the selected camera(s) which consequently each acquire a single frame which is stored in the frame stores 50.
  • the stored frame of each camera is read out repeatedly to the respective video input of multiplexer 90.
  • in plot i) the horizontal sync signal HSYNC is shown. This (and also the vertical sync pulses, not shown) is common to all the cameras.
  • in plot ii) the composite video output signal of one camera (CAMERA 1 VIDEO) is shown to the same timescale, lines L1, L2, L3, L4 ... LN being shown which make up one frame F.
  • Plot iii) shows a typical composite video output signal for a second camera (CAMERA 2 VIDEO) and plot iv) shows a typical composite video output signal for a sixth camera (CAMERA 6 VIDEO).
  • The number of lines per frame F can be in accordance with any of the normal video standards.
  • CAMERA 1 reads out from its frame stores the same frame F1(1) for six frames
  • CAMERA 2 reads out from its frame stores the same frame F1(2) for six frames
  • CAMERA 6 reads out from its frame stores the same frame F1(6) for six frames
  • in general, CAMERA N reads out from its frame stores the same frame F1(N) for six frames.
  • the cameras then read out the next frames F2(1), F2(2), ..., F2(6) and in general F2(N) for the next six frames.
  • the process continues with the reading out of further cycles of new frames, each cycle being of six frames or, more generally where K cameras are employed, K frames.
  • Plot viii) (to the same timescale as plots v) to vii)) shows the multiplexed output of the multiplexer module 2 which is transmitted to the PC 3.
  • This sequence comprises successive cycles of six frames simultaneously acquired by the six cameras (or more generally, successive cycles of K frames simultaneously acquired by K cameras).
  • the computer 3 is suitably equipped with a Pentium® microprocessor 120 and, as noted above, sends control signals to the multiplexer module(s) 2 from its COM 1 port via an RS 232 interface.
  • a frame grabber FG receives the multiplexed video signals from the multiplexer module 2 (or the master multiplexer module if more than one multiplexer module is used), processes them in an ANALOGUE processing module and strips out the sync signals in a SYNC module before converting the images to digital form in an analogue to digital converter (A/D) which feeds the digitised images to video memory (RAM) whence they can be accessed and processed by the microprocessor 120. These digitised images are also reconverted to analogue form for display on a monitor 50. Suitable frame grabbers are commercially available.
  • the microprocessor 120 runs a Windows 95® operating system from hard disc 130 and is provided with conventional RAM and ROM.
  • the PC 3 is provided with a conventional keyboard and a pointing device eg mouse 60.
  • the hard disc 130 is loaded with software:
  • the software to carry out function a) can be any suitable graphics program and the software to carry out function b) can be based on the algorithms disclosed in Hu et al, "Matching Point Features with Ordered Geometric, Rigidity and Disparity Constraints", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol 16 No 10, 1994, pp 1041-1049 (and references cited therein).
  • One suitable algorithm is the Gruen algorithm, although we have found a number of improvements, as follows:
  • if a candidate matched point moves by more than a certain amount (eg 3 pixels) per iteration then it is not a valid matched point and should be rejected;
  • One or more projectors P generate a pattern (eg a speckle pattern) which provides artificial texture on the scene viewed by the cameras and aids the stereo matching process.
  • the pattern is preferably an infra-red pattern.
  • Control signals for the projector(s) are received from the multiplexer 2.
  • a person X (eg a member of the public or a player in a sports match or an athlete) is assumed to be in the scene illuminated by the projector pattern in Figure 3 and within the area of overlap of the fields of view of two cameras 1 and 1A and also within the further area of overlap of the fields of view of cameras 1A and 1B.
  • Three representative points P1, P2 and P3 on the surface of the person's face are assumed to be imaged by all three cameras through their perspective centres O1, O2 and O3 respectively onto respective conjugate points in their image planes.
  • the correlation software (eg based on the Gruen algorithm) in the PC 3 correlates the respective cameras' simultaneous images (eg the pixels of the images acquired by cameras 1 and 1A corresponding to P1) and thereby enables a 3D representation of the face of X to be reconstructed.
  • This can be visualised as a projection of the correlated points from virtual projectors PR1, PR2 and PR3 in a virtual 3D space. If the virtual projectors have the same optical characteristics as the cameras and are located at the same points in the virtual space as the cameras in the real space, with the same orientations, then the representation will be lifesize and undistorted. This process will be performed for all correlated pixels.
  • the resulting 3D representations will not be coincident and further software in the PC 3 is arranged to fit these together, optionally under the control of the user.
  • the resulting overall 3D representation can then be viewed from different angles in virtual space by eg the software program COSMO PLAYER, a web browser plug-in produced by Cosmo Software Inc, of California USA.
  • This enables a person or other subject moving in the region viewed by the cameras to be viewed from another viewpoint, eg that of the virtual viewer V shown in Figure 4.
  • an image corresponding to eg an intermediate viewpoint can be generated from the camera images.
  • This feature is useful not only in surveillance systems (in which a front view or profile of a suspicious individual may be generated) but also in sports events where a view of the game from a different viewpoint may be required.
  • Figure 5 shows a screen shot of the user interface generated by a program for manipulating the overall 3D representations generated by the process of Figure 4.
  • Buttons BN are provided which can be clicked on by the mouse under the control of the user and, when thus selected, enable the user to drag portions of the displayed representation so as to zoom, rotate and pan the view of the object, as well as to come closer to and move away from the object ("nearer" and "further" buttons respectively).
  • the interface is similar to the publicly available interface of the COSMO PLAYER web browser plug-in.
  • buttons or other means may be provided to enable distortions to be applied in a graphical fashion.
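The reconstruction sketched around Figure 4 — conjugate points back-projected from the camera perspective centres into a common 3D space — can be illustrated with a minimal ray-intersection sketch. This is not the patent's implementation: the camera positions, ray directions and the midpoint-of-closest-approach method below are illustrative assumptions only.

```python
def triangulate(o1, d1, o2, d2):
    """Midpoint of closest approach of two back-projected rays.

    o1, o2: camera perspective centres; d1, d2: ray directions through
    the conjugate image points (need not be unit vectors).
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    w0 = [p - q for p, q in zip(o1, o2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b              # zero only for parallel rays
    t1 = (b * e - c * d) / denom       # parameter along ray 1
    t2 = (a * e - b * d) / denom       # parameter along ray 2
    p1 = [o + t1 * v for o, v in zip(o1, d1)]
    p2 = [o + t2 * v for o, v in zip(o2, d2)]
    return [(x + y) / 2 for x, y in zip(p1, p2)]

# Two cameras 1 m apart on the x axis both sight a point 2 m in front
# of the baseline; the rays intersect at (0.5, 0, 2).
point = triangulate([0, 0, 0], [0.5, 0, 2], [1, 0, 0], [-0.5, 0, 2])
```

With real cameras the two rays rarely intersect exactly, which is why the midpoint of closest approach (a least-squares compromise) is used rather than a true intersection.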

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

An image processing arrangement comprises a video multiplexer means (2) arranged to multiplex a plurality of video input signals from respective cameras (1, 1A, 1B, 1C) and to generate an output sync signal (HSYNC) for synchronising retrieval of the respective video input signals, and a computer (3) arranged to process groups of two or more corresponding frames (e.g. with a stereo processing algorithm) of the respective video input signals to produce a combined output signal (e.g. a 3D representation of the object). The computer may also include software for generating images that would be acquired by a virtual camera at an intermediate viewpoint. The arrangement is useful in surveillance systems and for recording sporting events, for example.

Description

IMAGE PROCESSING METHOD AND APPARATUS FOR
SYNTHESISING A REPRESENTATION FROM A PLURALITY OF
SYNCHRONISED MOVING IMAGE CAMERA
The present invention relates to an image processing method and apparatus. The invention relates particularly but not exclusively to methods and apparatus for three-dimensional image processing.
Various computer-based image processing arrangements are known for use in production line inspection and security environments, for example.
Thus DE-A-4,417,128 discloses a security monitoring arrangement for detecting movement in two dimensions. The described embodiment utilises only one camera, although there is a reference to the possibility of using more than one camera.
EP-A-820,039 discloses an automatic inspection arrangement in which the images from a single camera are fed selectively into two frame stores for subsequent processing.
JP 10267649 discloses an arrangement employing multiple analogue video cameras whose outputs are digitised and multiplexed. The images appear to be combined into one image prior to processing.
An object of the present invention is to provide a method and apparatus in which the outputs of two or more (preferably three or more) cameras can be repeatedly captured in synchronism.
In one aspect the invention provides an image processing arrangement comprising video multiplexer means arranged to multiplex a plurality of video input signals and to generate an output sync signal for synchronising retrieval of the respective video input signals, and image processing means arranged to process groups of two or more corresponding portions of the respective video input signals to produce a combined output signal. The portions are preferably frames but could for example be fields or lines.
The arrangement enables the outputs of two or more (preferably three or more) video cameras to be fed via one port into a computer for subsequent processing, for example demultiplexing and stereoscopic processing. The arrangement can be scaled up to accommodate the outputs of more video cameras without placing further demands on the computer hardware.
Preferably the video multiplexer means is arranged to generate and feed to the image processing means a video output comprising cycles of n synchronised frames, fields or lines of the respective video input signals wherein n is the number of video input signals.
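The cyclic structure described above — a multiplexed stream carrying cycles of n synchronised frames, one frame per camera per cycle — makes demultiplexing in the computer a simple stride-indexed split. A minimal sketch (the frame labels and function name are illustrative, not from the patent):

```python
def demultiplex(stream, n):
    """Split a multiplexed stream of frames, sent in repeating cycles of
    n frames (one per camera per cycle), back into one sequence per
    camera.  Frame i of the stream belongs to camera i mod n."""
    return [stream[cam::n] for cam in range(n)]

# Two cycles from three cameras: frame F1 then F2 from each in turn.
stream = ["F1c1", "F1c2", "F1c3", "F2c1", "F2c2", "F2c3"]
per_camera = demultiplex(stream, 3)
# per_camera[0] is camera 1's sequence: ["F1c1", "F2c1"]
```

Because every frame's position in the cycle identifies its camera, no per-frame header is needed — the common sync signal guarantees the ordering.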
Preferably the arrangement further comprises frame grabber means arranged to receive and store corresponding frames, fields or lines of the multiplexed video output of the video multiplexer means, the image processing means being arranged to read and process the stored frames, fields or lines.
In one embodiment the image processing means is arranged to generate a three-dimensional representation from two or more corresponding overlapping images.
In this and other embodiments the image processing means can optionally be arranged to process two or more overlapping images to generate a further image corresponding to a viewpoint different from the respective viewpoints of the overlapping images.
Suitable image processing algorithms for correlating image regions of such overlapping images to enable the shape of the object common to the region of overlap to be regenerated are already known - eg Gruen's algorithm (see Gruen, A W, "Adaptive least squares correlation: a powerful image matching technique", S Afr J of Photogrammetry, Remote Sensing and Cartography, Vol 14 No 3 (1985), and Gruen, A W and Baltsavias, E P, "High precision image matching for digital terrain model generation", Int Arch Photogrammetry, Vol 25 No 3 (1986), p254) and particularly the "region-growing" modification thereto which is described in Otto and Chau, "Region-growing algorithm for matching terrain images", Image and Vision Computing, Vol 7 No 2, May 1989, p83, all of which are incorporated herein by reference.
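Gruen's adaptive least squares correlation iteratively refines an affine patch transformation and is too involved for a short sketch, but its core idea — scoring candidate windows in one image against a template from the other and keeping the best match — can be shown with plain normalised cross-correlation along a scanline. This is a much-simplified stand-in, not the cited algorithm; all names and parameters below are illustrative.

```python
def ncc(a, b):
    # normalised cross-correlation of two equal-length pixel lists
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def match_along_row(left_row, right_row, x, half=2, max_disp=5):
    """Find the disparity d maximising the correlation between a small
    template around column x of the left row and the window at x - d of
    the right row (epipolar-aligned rows assumed)."""
    tmpl = left_row[x - half:x + half + 1]
    best, best_d = -2.0, 0
    for d in range(max_disp + 1):
        cx = x - d
        if cx - half < 0:
            break
        score = ncc(tmpl, right_row[cx - half:cx + half + 1])
        if score > best:
            best, best_d = score, d
    return best_d

# Synthetic pair: the right row is the left row shifted by 2 pixels.
left_row = [1, 3, 7, 2, 9, 4, 8, 5, 6, 0, 2, 7]
right_row = left_row[2:] + left_row[:2]
disparity = match_along_row(left_row, right_row, x=6)
```

The patent's use of a projected speckle pattern serves exactly this step: artificial texture keeps the correlation peak sharp on otherwise featureless surfaces.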
Preferably the video multiplexer means is arranged to generate a video output stream comprising cycles of n frames, fields or lines, wherein n is the number of cameras, the frame, field or line rate of the cameras and video multiplexer means being determined by the output sync signal.
In another aspect the invention provides a method of processing a group of video signals wherein the signals are read in synchronism by means of a common sync signal and fed to an image processing means arranged to process groups of two or more corresponding frames or parts thereof of the respective video signals to produce a combined output signal.
Preferably cycles of n synchronised frames, fields or lines of the respective video input signals are fed to the image processing means, wherein n is the number of video input signals.
Preferably the video signals are acquired by respective cameras having overlapping fields of view.
Other preferred features of the invention are defined in the dependent claims.
A preferred embodiment of the invention is described below by way of example only with reference to Figures 1 to 5 of the accompanying drawings, wherein:
Figure 1 is a block diagram showing an arrangement in accordance with the invention;
Figure 2 is a diagram of the signal waveforms and video frame structure generated in the arrangement of Figure 1;
Figure 3 is a diagrammatic sketch perspective view of a surveillance system incorporating the arrangement of Figure 1;
Figure 4 is a ray diagram conceptually illustrating the image processing performed in the computer of the arrangement of Figure 1, and
Figure 5 is a screenshot of an image manipulation program suitable for manipulating the 3D object representations generated by the computer of Figure 1.
Referring to Figure 1, the arrangement comprises up to six Pulnix TM 9701 or TMC 9700 progressive scan digital video cameras of which only three, namely cameras 1,
1A and 1B are shown. Both monochrome and colour cameras can be mixed in the one installation. Camera 1 is representative of the others and its circuitry is shown in some detail. The analogue video output ports and timing input ports of the cameras are connected to a multiplexer module 2 which in turn has its video output port connected to a frame grabber FG of a computer 3 and has a control input port connected to an RS 232 output port of the computer.
Referring to camera 1, each camera comprises timing circuitry 70 (comprising conventional async generator, sync generator, phase-locked loop and timing generator blocks) arranged to receive an initialisation pulse VINIT which causes its electronic shutter (not shown) and the electronic shutters of all the other cameras to open so that all the cameras simultaneously acquire an image focussed on their CCD arrays 20 by their lenses 10. The VINIT pulse also causes the cameras to synchronise their vertical sync pulses VSYNC to VINIT.
All the cameras receive a common horizontal sync signal HSYNC from a sync generator block 110 of the multiplexer module 2 and continually output their stored frames at 30 frames/second as analogue video from a signal processing block 80. The latter receives the analogue output of a digital to analogue converter which in turn is connected to frame stores 50. These are controlled by a memory control block 40 and can be arranged to store either interlaced fields or complete frames, under the control of a switch input INTERLACE/NON-INTERLACE to memory control block 40. In the described embodiment complete frames are stored in frame stores 50. The signal to the frame stores 50 is derived from the digitised output of a CDS and automatic gain control block 60 which is connected to the output of CCD array 20. The timing generator of block 70 controls the CCD readout by means of timing signals sent to both CDS/AGC block 60 and a CCD driver block 30.
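The capture pipeline just described — shutters triggered in synchronism, one frame latched into the frame stores, then that stored frame read out repeatedly until the next trigger — can be modelled with a toy class. The class and method names are illustrative assumptions, not from the patent:

```python
class FrameStoreCamera:
    """Toy model of one camera: a trigger latches the current scene into
    the frame store; read() then returns that stored frame on every
    readout cycle until the next trigger replaces it."""

    def __init__(self, camera_id):
        self.camera_id = camera_id
        self.stored = None

    def trigger(self, scene):
        # electronic shutter opens: acquire and latch a single frame
        self.stored = f"cam{self.camera_id}:{scene}"

    def read(self):
        # continuous readout of the latched frame
        # (30 frames/second in the patent's arrangement)
        return self.stored

cam = FrameStoreCamera(1)
cam.trigger("sceneA")
frames = [cam.read() for _ in range(3)]   # same frame three times over
cam.trigger("sceneB")                     # next trigger replaces it
```

This separation of acquisition (triggered, synchronous) from readout (free-running, at the standard video rate) is what lets the multiplexer interleave six cameras into one standard video stream.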
Multiplexer module 2 comprises a sync generator 110 which is provided with a clock CLK and sends a common sync signal not only to the cameras but also to a multiplexer block 90. The overall operation is controlled by a programmed microcontroller 100 which receives control signals (for selecting and triggering the shutters of the cameras) over an RS 232 interface from an RS 232 port (COM 1) of a PC 3. Microcontroller 100 is connected to multiplexer block 90 for this purpose. The microcontroller is preferably firmware controlled and has a port (not shown) for downloading suitable control software. Microcontroller 100 also has an external TTL trigger input which provides an alternative (to the PC) means of camera shutter control.
Camera shutter speeds of from 1/125 second to 1/6000 second are supported.
Additionally the microcontroller 100 has bidirectional ports for interfacing with a) an auxiliary control interface (not shown) and b) other slave multiplexer modules (not shown) for controlling further cameras. The auxiliary control interface can for example be a hard-wired controller which can supplement or be substituted for the PC 3. The output port for connection to slave multiplexer modules carries timing signals for synchronising all the multiplexer modules and hence all the cameras irrespective of the multiplexer module to which they are connected. If a slave multiplexer module is used, its multiplexed video output signal is fed to one of the video inputs of multiplexer 90 of the master multiplexer module, which is therefore connected to up to five cameras in this mode.
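The camera capacity of such a master/slave tree reduces to simple arithmetic: each slave's multiplexed output occupies one video input of the master, leaving the remaining master inputs for directly connected cameras. A minimal sketch of that count, assuming (as in the embodiment) six video inputs per multiplexer module and a two-level tree in which slaves connect only to the master; the function name and parameters are illustrative, not from the patent:

```python
def max_cameras(slave_modules: int, inputs_per_module: int = 6) -> int:
    """Upper bound on cameras in a two-level master/slave multiplexer tree.

    Each slave's multiplexed video output consumes one input of the master,
    so the master keeps (inputs_per_module - slave_modules) direct cameras
    while each slave contributes up to inputs_per_module cameras of its own.
    (Illustrative arithmetic only; capacities are assumptions.)
    """
    if not 0 <= slave_modules <= inputs_per_module:
        raise ValueError("slave count must fit within the master's inputs")
    direct = inputs_per_module - slave_modules       # cameras on the master
    via_slaves = slave_modules * inputs_per_module   # cameras on the slaves
    return direct + via_slaves
```

With one slave the master keeps five direct cameras, matching the text's "up to five cameras in this mode", for eleven cameras in total.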
In addition to the sync and trigger signals output from the multiplexer module(s) 2, these multiplexer modules are also each arranged to generate four projector control signals for switching on up to four pattern projectors P (Figure 3) as will be discussed subsequently.
The signals from the ports to the auxiliary control interface, slave multiplexer modules and pattern projectors are all controlled via the RS 232 interface from the PC 3 at a baud rate of 9600 bits/second. The following commands are provided for:
WRITE COMMANDS
RESET (to the default state in which no cameras are selected - the multiplexer responds with an OK or ERROR status indication)
NO CAMERAS SELECTED
TRIGGER (triggers electronic shutters of all selected cameras)
SELECT CAMERA n OF MULTIPLEXER MODULE m (n = 1 - 6, m = 1, 2, 3...)
READ COMMAND
RETURN AUXILIARY REGISTER STATUS (indicates status of auxiliary interface, if used). A status indicator in the form of a seven segment LED (not shown) on the multiplexer module 2 indicates the status of the module (and optionally provides diagnostic information).
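A host-side sketch of issuing these commands might look as follows. The patent specifies the command set and the 9600 bits/second rate but not the byte-level wire format, so the ASCII line encoding below (and the function name) is a hypothetical illustration only:

```python
def encode_command(cmd: str, camera: int = 0, module: int = 0) -> bytes:
    """Encode one multiplexer command as an ASCII line (hypothetical format).

    The patent names RESET, TRIGGER and SELECT CAMERA n OF MULTIPLEXER
    MODULE m, but leaves their encoding unspecified.
    """
    if cmd == "SELECT":
        if not (1 <= camera <= 6 and module >= 1):
            raise ValueError("SELECT needs camera 1-6 and module >= 1")
        return f"SELECT {camera} {module}\r\n".encode("ascii")
    if cmd in ("RESET", "TRIGGER"):
        return f"{cmd}\r\n".encode("ascii")
    raise ValueError(f"unknown command {cmd!r}")
```

The resulting bytes would then be written to the PC's COM 1 port at 9600 baud, e.g. with a serial library such as pyserial.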
A typical physical arrangement of the cameras 1, 1A, 1B and 1C in eg an observation or surveillance system is shown in Figure 3. An optical projector P switched by multiplexer module 2 projects a pattern (eg a speckle pattern or other fractal pattern of visible or, preferably, infra-red radiation) onto the scene of interest and the cameras 1, 1A and 1B which may be supported in elevated positions are focussed onto regions which overlap with each other and with the region illuminated by the pattern. Thus the fields of view of cameras 1A and 1B overlap in region Q1, the fields of view of cameras 1 and 1A overlap in region Q2 and the fields of view of cameras 1A and 1C overlap in region Q3. One or more of the cameras (eg camera 1B, as shown) may have its attitude and/or position controllable remotely by eg a motor M in order to enable one or more of the fields of view and regions of overlap to be varied.
Referring again to Figure 1, the multiplexer module 2 operates in two modes as follows:
i) in SETUP mode the PC 3 selects one or more cameras eg 1 and 1A (which are then rapidly synchronised to the HSYNC signal of the multiplexer module 2) and repeatedly instructs the multiplexer module to send a TRIGGER signal to operate the electronic shutters of the selected cameras. This enables the cameras to be adjusted and focussed in real time.
ii) in CAPTURE mode the PC 3 instructs the multiplexer module 2 to send a single
TRIGGER signal to the electronic shutters of the selected camera(s) which consequently each acquire a single frame which is stored in the frame stores 50. The stored frame of each camera is read out repeatedly to the respective video input of multiplexer 90.
Having selected the desired regions of overlap and focussed the cameras on the scene of interest illuminated by the projected pattern in the SETUP mode, the arrangement is switched to CAPTURE mode and handles the following waveforms as shown in Figure 2:
In plot i) the horizontal sync signal HSYNC is shown. This (and also the vertical sync pulses, not shown) is common to all the cameras.
In plot ii) the composite video output signal of one camera (CAMERA 1 VIDEO) is shown to the same timescale, lines L1, L2, L3, L4 ... LN being shown which make up one frame F. Plot iii) shows a typical composite video output signal for a second camera (CAMERA 2 VIDEO) and plot iv) shows a typical composite video output signal for a sixth camera (CAMERA 6 VIDEO). The composite video waveforms
(not shown) of up to three further cameras can be similarly synchronised by one multiplexer module 2 and yet further cameras can be synchronised if a master and slave arrangement of multiplexer modules is employed, as mentioned above.
The number of lines per frame F can be in accordance with any of the normal video standards.
The corresponding frame sequences are shown (to a different timescale) in plots v), vi) and vii) and it will be seen that in an arrangement employing six cameras, CAMERA 1 reads out from its frame stores the same frame F1₁ for six frames, CAMERA 2 reads out from its frame stores the same frame F1₂ for six frames, CAMERA 6 reads out from its frame stores the same frame F1₆ for six frames and in general CAMERA N reads out from its frame stores the same frame F1ₙ for six frames. The cameras then read out the next frame F2₁, F2₂, ... F2₆ and in general F2ₙ for the next six frames. The process continues with the reading out of further cycles of new frames, each cycle being of six frames or, more generally where K cameras are employed, K frames.
Plot viii) (to the same timescale as plots v) to vii)) shows the multiplexed output of the multiplexer module 2 which is transmitted to the PC 3. This sequence comprises successive cycles of six frames simultaneously acquired by the six cameras (or more generally, successive cycles of K frames simultaneously acquired by K cameras).
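The CAPTURE-mode readout pattern of plots v) to viii) can be simulated in a few lines. In the sketch below (names are illustrative) each camera repeats its held frame for K frame periods while the multiplexer visits the cameras round-robin, so the multiplexed stream comes out as cycles of K simultaneously acquired frames, here represented as (camera, frame-cycle) pairs:

```python
def multiplexed_sequence(k_cameras: int, n_cycles: int) -> list[tuple[int, int]]:
    """Simulate the multiplexed CAPTURE-mode output stream.

    Each camera holds its latest captured frame for k_cameras frame periods;
    the multiplexer reads one camera per frame period in round-robin order,
    yielding successive cycles of k_cameras synchronised frames.
    """
    out = []
    for cycle in range(1, n_cycles + 1):      # frames F1, F2, ... per camera
        for cam in range(1, k_cameras + 1):   # CAMERA 1 .. CAMERA K
            out.append((cam, cycle))
    return out
```

For six cameras and two cycles this yields (1, 1) through (6, 1) followed by (1, 2) through (6, 2), mirroring plot viii).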
Referring again to Figure 1, the hardware and software of PC 3 which process the sequence of plot viii) will now be described briefly. The computer 3 is suitably equipped with a Pentium® microprocessor 120 and, as noted above, sends control signals to the multiplexer module(s) 2 from its COM 1 port via an RS 232 interface.
A frame grabber FG receives the multiplexed video signals from the multiplexer module 2 (or the master multiplexer module if more than one multiplexer module is used), processes them in an ANALOGUE processing module and strips out the sync signals in a SYNC module before converting the images to digital form in an analogue to digital converter (A/D) which feeds the digitised images to video memory (RAM) whence they can be accessed and processed by the microprocessor 120. These digitised images are also reconverted to analogue form for display on a monitor 50. Suitable frame grabbers are commercially available.
The microprocessor 120 runs a Windows 95® operating system from hard disc 130 and is provided with conventional RAM and ROM. The PC 3 is provided with a conventional keyboard and a pointing device eg mouse 60. The hard disc 130 is loaded with software:
a) to display images acquired from the multiplexer module 2;
b) to correlate overlapping regions of the images derived from corresponding frames of the output of frame grabber FG, optionally with the assistance of a pattern projected onto the object surface;
c) to generate at least a partial 3-dimensional reconstruction of object surfaces in the region of overlap of the fields of view of respective cameras from the above correlations and (preferably) information on the viewpoints (ie positions and orientations) of the cameras, as illustrated in Figure 4;
d) to combine one or more such 3D reconstructions derived from the images acquired by different pairs of cameras into an overall 3D reconstruction;
e) to process the 3D reconstructions of c) or d) to generate 2D images of the object scene as viewed from a viewpoint different from that of the cameras, and
f) to generate higher resolution images eg by combining two or more 3D reconstructions prior to deriving a 2D image from the combination, whether from a viewpoint identical to that of one of the cameras or from another viewpoint. The software to carry out function a) can be any suitable graphics program and the software to carry out function b) can be based on the algorithms disclosed in Hu et al "Matching Point Features with Ordered Geometric, Rigidity and Disparity Constraints" IEEE Transactions on Pattern Analysis and Machine Intelligence Vol 16 No 10, 1994 pp 1041-1049 (and references cited therein). One suitable algorithm is the Gruen algorithm, although we have found a number of improvements, as follows:
i) the additive radiometric shift employed in the algorithm can be dispensed with;
ii) if during successive iterations, a candidate matched point moves by more than a certain amount (eg 3 pixels) per iteration then it is not a valid matched point and should be rejected;
iii) during the growing of a matched region it is useful to check for sufficient contrast at at least three of the four sides of the region in order to ensure that there is sufficient data for a stable convergence - in order to facilitate this it is desirable to make the algorithm configurable to enable the parameters (eg required contrast) to be optimised for different environments, and
iv) in order to quantify the validity of the correspondences between respective patches of one image and points in the other image it has been found useful to re-derive the original grid point in the starting image by applying the algorithm to the matched point in the other image (ie reversing the stereo matching process) and measuring the distance between the original grid point and the new grid point found in the starting image from the reverse stereo matching. The smaller the distance the better the correspondence.
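Improvements ii) and iv) above are simple geometric tests that can be sketched independently of any particular matcher. In the sketch below the forward_match and backward_match callables stand in for the (Gruen-style) stereo matcher; the 3-pixel threshold follows the text's example, while the function names are illustrative:

```python
import math

def reverse_match_distance(grid_point, forward_match, backward_match):
    """Improvement iv): score a correspondence by forward-backward matching.

    Match the starting-image grid point into the other image, match that
    point back again, and measure how far the recovered point lands from
    the original; the smaller the distance, the better the correspondence.
    """
    matched = forward_match(grid_point)      # starting image -> other image
    recovered = backward_match(matched)      # other image -> starting image
    return math.dist(grid_point, recovered)

def accept_step(prev_point, new_point, max_step=3.0):
    """Improvement ii): reject a candidate matched point that moves more
    than max_step pixels between successive iterations."""
    return math.dist(prev_point, new_point) <= max_step
```

A correspondence can then be kept only if its reverse-match distance falls below a configurable threshold, in keeping with improvement iii)'s call for tunable parameters.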
The software to carry out process d) is described in principle in our UK patent GB 2,292,605B and references therein and the software needed to carry out the processes e) and f) (to the extent that it is not commercially available) can be written by graphics programmers of normal skill in this field on the basis of the principles illustrated in Figure 4.
One or more projectors P generate a pattern (eg a speckle pattern) which provides artificial texture on the scene viewed by the cameras and aids the stereo matching process. The pattern is preferably an infra-red pattern. Control signals for the projector(s) are received from the multiplexer 2.
Referring to Figure 4, a person X (eg a member of the public or a player in a sports match or an athlete) is assumed to be in the scene illuminated by the projector pattern in Figure 3 and within the area of overlap of the fields of view of two cameras 1 and 1A and also within the further area of overlap of the fields of view of cameras 1A and 1B. Three representative points P1, P2 and P3 on the surface of the person's face are assumed to be imaged by all three cameras through their perspective centres O1, O2 and O3 respectively onto respective conjugate points in their image planes.
It is assumed that the locations and orientations of the perspective centres O1 to O3 are known. The correlation software (eg based on the Gruen algorithm) in PC 3 correlates the respective camera's simultaneous images (eg the pixels of the images acquired by cameras 1 and 1A corresponding to P1) and thereby enables a 3D representation of the face X to be reconstructed. This can be visualised as a projection of the correlated points from virtual projectors PR1, PR2 and PR3 in a virtual 3D space. If the virtual projectors have the same optical characteristics as the cameras and are located at the same points in the virtual space as the cameras in the real space, with the same orientations, then the representation will be lifesize and undistorted. This process will be performed for all correlated pixels.
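The reconstruction step can be pictured as intersecting, for each correlated pixel pair, the two rays running from the known perspective centres through the conjugate image points. The patent does not prescribe a triangulation method; the sketch below uses the standard midpoint-of-closest-approach construction, with illustrative names:

```python
import numpy as np

def triangulate_midpoint(c1, d1, c2, d2):
    """Recover a 3D point from two camera rays.

    c1, c2 are the perspective centres and d1, d2 the ray directions through
    the conjugate image points.  Returns the midpoint of the rays' closest
    approach (equal to the exact intersection when the rays meet).
    """
    c1, d1, c2, d2 = (np.asarray(v, float) for v in (c1, d1, c2, d2))
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    # Solve for ray parameters t1, t2 minimising |(c1 + t1*d1) - (c2 + t2*d2)|.
    b = c2 - c1
    a11, a12, a22 = d1 @ d1, -(d1 @ d2), d2 @ d2
    t1, t2 = np.linalg.solve([[a11, a12], [a12, a22]], [d1 @ b, -(d2 @ b)])
    return 0.5 * ((c1 + t1 * d1) + (c2 + t2 * d2))
```

Applying this to every correlated pixel pair produces the cloud of 3D points that the virtual projectors of Figure 4 visualise.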
Since the region of overlap of the fields of view of cameras 1 and 1A will not be the same as the region of overlap of the fields of view of cameras 1A and 1B, the resulting 3D representations will not be coincident and further software in the PC 3 is arranged to fit these together, optionally under the control of the user. The resulting overall 3D representation can then be viewed from different angles in virtual space by eg the software program COSMO PLAYER, a web browser plug-in produced by Cosmo Software Inc, of California USA. This enables a person or other subject moving in the region viewed by the cameras to be viewed from another viewpoint, eg that of the virtual viewer V shown in Figure 4. In this manner an image corresponding to eg an intermediate viewpoint can be generated from the camera images. This feature is useful not only in surveillance systems (in which a front view or profile of a suspicious individual may be generated) but also in sports events where a view of the game from a different viewpoint may be required.
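Generating the view from an intermediate viewpoint then amounts to placing a virtual pinhole camera at the viewer V and projecting the merged 3D points into its image plane. A textbook sketch of that step (the patent delegates rendering to standard graphics software; the names and the unit focal length are illustrative assumptions):

```python
import numpy as np

def project_to_viewpoint(points, centre, rotation, focal=1.0):
    """Project world-space 3D points into the image plane of a virtual camera.

    `centre` is the virtual viewer's position V, `rotation` the 3x3
    world-to-camera rotation; a pinhole model with focal length `focal`
    is assumed.  Returns an (N, 2) array of image-plane coordinates.
    """
    pts = np.asarray(points, float)
    cam = (pts - np.asarray(centre, float)) @ np.asarray(rotation, float).T
    z = cam[:, 2:3]                  # depth along the camera's optical axis
    if np.any(z <= 0):
        raise ValueError("all points must lie in front of the virtual camera")
    return focal * cam[:, :2] / z    # perspective divide
```

Sweeping `centre` between the real camera positions yields the intermediate viewpoints described above.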
Figure 5 shows a screen shot of the user interface generated by a program for manipulating the overall 3D representations generated by the process of Figure 4. Buttons BN are provided which can be clicked on by the mouse under the control of the user and, when thus selected, enable the user to drag portions of the displayed representation so as to zoom, rotate and pan the view of the object, as well as to come closer to and move away from the object ("nearer" and "further" buttons respectively). As described this far, the interface is similar to the publicly available interface of the COSMO PLAYER web browser plug-in. However, in accordance with a feature of the present invention, "wheels" W1 and W2 are provided which are rotatable by the mouse and enable the user to adjust the separation between the virtual projectors and to vary the distance to the object respectively. The latter control is effectively a perspective control. Optionally, further buttons or other means (not shown) may be provided to enable distortions to be applied in a graphical fashion.

Claims
1. An image processing arrangement comprising video multiplexer means arranged to multiplex a plurality of video input signals and to generate an output sync signal for synchronising retrieval of the respective video input signals, and image processing means arranged to process groups of two or more corresponding portions of the respective video input signals to produce a combined output signal.
2. An image processing arrangement according to claim 1 wherein the video multiplexer means is arranged to generate and feed to the image processing means a video output comprising cycles of n synchronised frames, fields or lines of the respective video input signals wherein n is the number of video input signals.
3. An image processing arrangement according to claim 1 or claim 2, further comprising frame grabber means arranged to receive and store corresponding frames, fields or lines of the multiplexed video output of the video multiplexer means, the image processing means being arranged to read and process the stored frames, fields or lines.
4. An image processing arrangement according to any preceding claim wherein the image processing means is arranged to generate a three-dimensional representation from two or more corresponding overlapping images.
5. An image processing arrangement according to any preceding claim wherein the image processing means is arranged to process two or more overlapping images to generate a further image corresponding to a viewpoint different from the respective viewpoints of the overlapping images.
6. An image processing arrangement according to any preceding claim wherein the image processing means is arranged to combine two or more images to generate a higher resolution image.
7. An image processing arrangement according to any preceding claim further comprising a plurality of cameras arranged to generate said video input signals, the fields of view of at least two of the cameras having a region of overlap.
8. An image processing arrangement according to claim 7 wherein each camera is provided with local memory means arranged to store at least part of a frame derived from that camera, the stored frame or part thereof being arranged to be read out repeatedly to said video multiplexer means under the control of said sync signal before refreshing said local memory means with a further stored frame or part thereof subsequently derived from that camera.
9. An image processing arrangement according to claim 7 or claim 8 wherein the video multiplexer means is arranged to generate a video output stream comprising cycles of n frames, fields or lines, wherein n is the number of cameras, the frame, field or line rate of the cameras and video multiplexer means being determined by said output sync signal.
10. An image processing arrangement according to any of claims 7 to 9 which constitutes a surveillance system, the cameras being arranged to acquire images of members of the public.
11. An image processing arrangement according to any of claims 7 to 10 comprising three or more such cameras and a region of overlap common to three or more fields of view of the cameras.
12. An image processing arrangement according to any preceding claim wherein the video multiplexer means has an output port for the multiplexed video output and a communications link is arranged to feed the multiplexed video output to an input port of a computer comprising said image processing means.
13. An image processing arrangement according to claim 12 wherein said video multiplexer means is responsive to at least one control signal from the computer and is arranged to generate at least one output signal for controlling the acquisition, multiplexing or other processing of the video input signals.
14. An image processing arrangement according to claim 13 wherein said video multiplexer means is arranged to feed said output sync signal to a further video multiplexer means which is in turn arranged to multiplex and synchronise frame retrieval of a further plurality of video input signals.
15. An image processing arrangement according to any preceding claim wherein the video multiplexer means has a plurality of input ports for the respective video input signals.
16. A method of processing a group of video signals wherein the signals are read in synchronism by means of a common sync signal and fed to an image processing means arranged to process groups of two or more corresponding frames or parts thereof of the respective video signals to produce a combined output signal.
17. A method according to claim 16 wherein cycles of n synchronised frames, fields or lines of the respective video input signals are fed to said image processing means, wherein n is the number of video input signals.
18. A method according to claim 16 or claim 17 wherein the video signals are acquired by respective cameras having overlapping fields of view.
19. A method according to any of claims 16 to 18 wherein a video multiplexer means reads out the contents of local image storing means associated with the cameras and feeds a multiplexed video output to said image processing means.
20. An image processing arrangement substantially as described hereinabove with reference to Figures 1 and 2 and optionally any of Figures 3, 4 and 5 of the accompanying drawings.
21. A method of processing a group of video signals substantially as described hereinabove with reference to Figures 1 and 2 and optionally Figures 4 and 5 of the accompanying drawings.
PCT/GB2000/004007 1999-10-19 2000-10-18 Image processing method and apparatus for synthesising a representation from a plurality of synchronised moving image camera WO2001029649A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU10362/01A AU1036201A (en) 1999-10-19 2000-10-18 Image processing method and apparatus for synthesising a representation from a plurality of synchronised moving image camera

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB9924977.3 1999-10-19
GB9924977A GB2355612A (en) 1999-10-19 1999-10-19 Image processing arrangement producing a combined output signal from input video signals.

Publications (1)

Publication Number Publication Date
WO2001029649A1 (en) 2001-04-26







