EP1434184A2 - Commande d'un système multicaméra - Google Patents

Commande d'un système multicaméra (Control of a multi-camera system)

Info

Publication number
EP1434184A2
Authority
EP
European Patent Office
Prior art keywords
camera
line
sight
control unit
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP03029530A
Other languages
German (de)
English (en)
Other versions
EP1434184A3 (fr)
EP1434184B1 (fr)
Inventor
Markus Dr. Michaelis
Volker Dr. Steinbiss
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Funkwerk Video Systeme GmbH
Original Assignee
Funkwerk Plettac Electronic GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Funkwerk Plettac Electronic GmbH filed Critical Funkwerk Plettac Electronic GmbH
Publication of EP1434184A2 publication Critical patent/EP1434184A2/fr
Publication of EP1434184A3 publication Critical patent/EP1434184A3/fr
Application granted granted Critical
Publication of EP1434184B1 publication Critical patent/EP1434184B1/fr
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19639 Details of the system layout
    • G08B13/19641 Multiple cameras having overlapping views on a single scene
    • G08B13/19643 Multiple cameras having overlapping views on a single scene wherein the cameras play different roles, e.g. different resolution, different camera type, master-slave camera

Definitions

  • The invention relates to a method for controlling a multi-camera system having the features of claim 1 and to an apparatus for carrying out the method having the features of claim 16.
  • Multi-camera systems are well known. Surveillance and alarm systems in particular often consist of a combination of stationary cameras and pan-tilt-zoom cameras. Stationary cameras are cheaper and allow continuous monitoring of a predefined area. A disadvantage of stationary cameras, however, is that an intruder cannot be tracked: once an intruder leaves the predefined monitoring field, tracking becomes impossible and the intruder is often lost. Moreover, the coarse resolution of the cameras in such systems often makes it impossible to identify an intruder or to detect small objects. Pan-tilt-zoom cameras are therefore used to track and magnify an intruder detected with a stationary camera. The same problem also arises in other applications, e.g. in camera systems for traffic observation or for sports events in large halls or stadiums. For a user, manually controlling the pan-tilt-zoom camera so as to capture an object quickly and precisely is generally difficult.
  • From EP-A1-0 529 317 and EP-A1-0 714 081 it is known to hand over moving objects from one tracking camera to the next.
  • An application within closed rooms is described there, for example within a casino.
  • The nearest available pan-tilt-zoom camera is selected, which can then track the object when it leaves the monitoring area of the camera currently detecting it.
  • The applications concern interiors in which the area to be monitored is a single plane, the floor plane. For the objects, a known size (a person) and ground contact are assumed.
  • The object of the invention is to improve the detection, by a second orientable camera, of an object visible in the image of a first camera.
  • The camera system according to the invention consists of a first camera, also referred to as the master camera, and at least one second camera, also referred to as the slave camera.
  • The second camera is advantageously a pan-tilt-zoom camera.
  • In addition, an object detector is present.
  • An object detector is any kind of suitable device, method or human user that indicates an object of interest in a camera image.
  • This can, for example, be a human user who marks the object via a suitable input medium, e.g. a mouse, in a monitor image or another suitable representation of the camera image.
  • Other examples are video motion detectors that detect moving objects.
  • There are also detectors that recognize objects on the basis of characteristic features, e.g. colour, shape or face recognition. In the following, a user means a person who operates the system.
  • An automatic detector is to be understood as a detector that works without human intervention. Semi-automatic combinations are also possible, in which an automatic detector makes detection suggestions that then have to be verified by a user. Detection is also to be understood as localization, i.e. a detection in this sense generally refers to determining a position in a camera image. In the case of a user this is often done, in advantageous configurations, by a mouse click. However, a joystick, a touch-sensitive monitor or another suitable input medium can also be used.
  • The position can be a point assigned to the object, or also one or more surfaces.
  • The position is often determined by a pixel.
  • This pixel has a finite extent and may, for example, be represented by several pixels at a higher resolution. Depending on the nature of the detector, such as a motion detector or a so-called touch screen, one or more surfaces may also be selected as the position of the object. In the following, a point in the image or a pixel therefore also includes one or more surfaces. A point of the scene imaged by the camera can therefore also have an extent.
  • In the following, a monitor is to be understood as any suitable means for the temporally changeable representation of images, text and graphics.
  • These can be, for example, CRT monitors such as PC monitors, surveillance monitors or TV sets; flat screens, LCD screens or other types of display are also possible.
  • A camera image can be a monitor image actually visible to a user, but also an invisible representation of the same image, for example as digital data in a computer.
  • If a process part is carried out by a user, a monitor image is meant, and the insertion or marking of lines, points, etc. takes place in the monitor image in a form perceptible to the user.
  • If a process part is carried out by an automatic detector, a general, not necessarily visible image is meant. Drawing in or superimposing then corresponds to the corresponding positions in the image being taken into account by the automatically executed process part concerned.
  • A master camera is a camera in which an object is detected by an object detector, this detection being used to detect the same object in other, so-called slave cameras.
  • The master camera is often stationary, but it can also be equipped with a pan-tilt unit.
  • Every camera is suitable as a master camera for which, at a given time, a unique assignment between a point in the camera image and a so-called visual ray is given.
  • The visual ray belonging to a pixel is the locus of all points in space that are imaged by the camera onto this pixel. For real cameras these are straight lines for point-like pixels; for extended pixels they are corresponding cones as the union of all such lines.
  • If no unbounded straight line is given as the visual ray, a suitable line from this cone is used as a representative, usually the straight line belonging to the centre of the object or to the object base. This may be the case if a video motion detector detects an area in the camera image or a user does not enter a pixel-accurate position via a touch screen.
  • The assignment rule between pixels and visual rays, the visual rays being given in a camera-fixed coordinate system, is called the internal calibration of the camera.
  • The position and orientation of the camera relative to a world coordinate system or to another camera is called the external calibration.
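As a minimal sketch of this pixel-to-visual-ray mapping (assuming a simple pinhole model; the intrinsic matrix K and the pose R, t are illustrative names for the internal and external calibration, not terms taken from the patent):

    import numpy as np

    def visual_ray(pixel_uv, K, R, t):
        """Return the visual ray of a pixel as (origin, direction) in world coordinates.

        K    : 3x3 intrinsic matrix (internal calibration)
        R, t : rotation and translation mapping a world point X to camera
               coordinates via X_cam = R @ X + t (external calibration)
        """
        u, v = pixel_uv
        d_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])   # ray direction in camera coordinates
        origin = -R.T @ t                                   # optical centre in world coordinates
        d_world = R.T @ d_cam                               # ray direction in world coordinates
        return origin, d_world / np.linalg.norm(d_world)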
  • The property of being the master camera relates only to the current detection task. In another detection task, the same camera can also perform the function of a slave camera if its field of view is adjustable.
  • The task of the slave camera is to capture the object detected in the master camera.
  • Any camera with an adjustable field of view is suitable as a slave camera.
  • Adjusting the field of view for the detection of the object by the slave camera is understood to mean aligning the so-called camera vector with the object.
  • The camera vector is a predefined direction vector fixed to the camera; it is therefore adjusted together with the field of view of the camera.
  • Typical slave cameras are pan-tilt-zoom cameras (dome cameras or pan-tilt heads).
  • However, any other mechanism for adjusting the field of view is also suitable. A zoom capability is not necessary for all applications, but often beneficial. It is assumed that the slave camera is externally calibrated, i.e. its position and orientation relative to the master camera are known.
  • The adjustment mechanism of the field of view is likewise assumed to be calibrated, i.e. the camera vector can be set to specific directions and the zoom can be selectively influenced.
  • The function as a slave camera likewise applies only to the current detection process.
  • The same camera can also be a master camera in another acquisition process.
  • A system may consist of two or more cameras, each camera being able to operate as a master camera and one or more cameras with an adjustable field of view operating as slave cameras for a specific object detection. For the next object detection, these can be other camera pairings, or the roles of the last pairing can be reversed.
  • An object can also be detected by several master cameras at the same or different times. In the camera image of the master camera it is also possible to detect several objects, which are then captured by several slave cameras, or one object is detected which is then captured by several slave cameras. In all cases the search for the object is restricted to the visual ray of the object given by the respective master camera. For multiple objects there is a visual ray through each object.
  • This exemplary description does not represent a limitation of the invention to the listed examples. It merely serves to illustrate the invention and particularly advantageous embodiments.
  • Figures 3 to 5 allow a quick basic understanding of the invention.
  • Figure 1 shows the block diagram of the system.
  • A first camera 101 as master camera and a second camera 102 with an adjustable field of view as slave camera are shown here.
  • The master camera 101 is a stationary camera.
  • The master camera 101 is connected to a control unit 140, which connects the master camera 101 to a central control unit 142 via networks, with analog outputs, in particular, for example, for the video signal, and digital outputs, in particular, for example, for control signals. If digital cameras or digital image transmission are used, the network for image transmission may be completely or partially digital. If the master camera 101 has an adjustable zoom lens, it is adjusted from the central control unit 142 via the control unit 140.
  • The current value is either stored in the central control unit 142 or queried via the control unit 140.
  • Correspondingly, control unit 141 controls and connects the pan-tilt-zoom camera 102.
  • In addition, pan and tilt can be set and the current values can be queried.
  • The central control unit 142 thus has access to the control, the current settings and the camera images of all participating cameras 101 and 102.
  • The camera images can be displayed on the monitors 144 and 145 connected to the central control unit 142. If automatic detectors are used for the master camera 101 and/or the slave camera 102, the corresponding monitors 144 and 145 can also be connected directly to the control units 140 and 141 or directly to the cameras 101 and 102, or they can be omitted altogether.
  • The central control unit 142 is capable of superimposing information such as a mouse pointer, marked points and visual rays in the displayed images.
  • The same applies to graphic controls such as sliders.
  • A user can make the necessary inputs via an input device 143, which may advantageously be a mouse and/or keyboard and/or slider and/or control panel and/or joystick, etc.
  • The central control unit 142 also stores the data for the internal and external calibration of the cameras 101 and 102. If available, terrain models are also stored there and automatic detectors, such as motion detectors, are installed there. All information and modules for one of the cameras 101 or 102 may also be located in the local control units 140 or 141. These are advantageously modules for calibration and/or automatic detection.
  • In the central control unit 142 all inputs and information are collected and processed, i.e. for an object detected in the master camera 101 the associated visual ray 303 (in FIG. 3) is calculated in world coordinates and, insofar as the visual ray is captured by the slave camera 102, superimposed in the camera image of the slave camera 102 (line of sight 504 in Figure 5). Further overlays according to the embodiments described below are made, and the slave camera 102 is controlled, for example, according to the embodiments described below. If a user operates the system, corresponding inputs are taken into account by the central control unit 142 or corresponding controls are offered to the user.
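A minimal sketch of how such a superimposed line 504 could be obtained from the master's visual ray, assuming calibrated pinhole models for both cameras (project_point, the distance bounds and the sampling are illustrative assumptions, not the patent's prescribed method):

    import numpy as np

    def project_point(X_world, K, R, t):
        """Project a world point into a camera image (simple pinhole model)."""
        x_cam = R @ X_world + t
        x_img = K @ x_cam
        return x_img[:2] / x_img[2]

    def ray_polyline_in_slave(origin, direction, d_min, d_max, K_s, R_s, t_s, n=50):
        """Sample the master's visual ray between the distances d_min and d_max and
        project the samples into the slave image, giving a polyline drawable as line 504."""
        distances = np.linspace(d_min, d_max, n)
        return [project_point(origin + d * direction, K_s, R_s, t_s) for d in distances]

For an ideal pinhole pair the projection of the straight visual ray is itself a straight line in the slave image, so two sample points would suffice; sampling several points simply keeps the sketch robust when lens distortion is corrected point by point.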
  • FIG. 2 shows a schematic representation of a camera.
  • Cameras are often modeled as so-called projective or pinhole cameras. All visual rays pass through one point, the so-called optical center 260 of the camera.
  • Through the optical center 260 and a pixel 206 on the image plane 262 (a pixel on the CCD or CMOS chip in digital cameras), a visual ray 203 is defined, which is the locus of all points in space imaged onto the pixel 206 by the camera.
  • Many real cameras can be matched accurately to this model by algorithmically correcting distortions in the image. For other cameras there are similar models.
  • The orientation of a camera with adjustable field of view (slave camera 102 in Figure 1) does not necessarily refer to the center of the image.
  • Instead, a general camera vector 261 is defined that defines the orientation of the camera. If the camera is modeled as a projective camera, the camera vector 261 can advantageously be anchored in the optical center 260, as shown in the drawing. The camera vector 261 then corresponds to a specific visual ray, and aligning the camera vector corresponds to aligning the associated pixel with the object. For slave cameras with zoom capability, this pixel can advantageously be chosen as the zoom center, i.e. the pixel which remains stationary in the image as the zoom changes. Any other pixel, such as the center of the image, is also possible.
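To illustrate what aligning the camera vector with a world point amounts to, a minimal pan/tilt sketch (the convention that pan is the azimuth about a vertical z axis, tilt the elevation, and pan = tilt = 0 points along +x is an assumption made for illustration; real pan-tilt heads use their own conventions):

    import numpy as np

    def pan_tilt_towards(camera_position, target_point):
        """Pan/tilt angles (radians) that align the camera vector with target_point."""
        d = np.asarray(target_point, float) - np.asarray(camera_position, float)
        pan = np.arctan2(d[1], d[0])                    # azimuth about the vertical axis
        tilt = np.arctan2(d[2], np.hypot(d[0], d[1]))   # elevation above the horizontal plane
        return pan, tilt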
  • FIG. 3 shows a schematic plan view of an area to be monitored with a system of two cameras 301 and 302.
  • The camera 301 is the master camera and the camera 302 is the slave camera.
  • The two cameras 301 and 302 are, for example, surveillance cameras and serve for the visual surveillance of the area.
  • In the area are the objects 311 (house), 312 - 315 (trees) and 316 (car).
  • The master camera 301 covers, within its field of view indicated by the boundary lines 321, the monitoring area assigned to this camera. Monitoring areas often do not cover the whole field of view. This is indicated by the lines 370 and 371, which here represent, by way of example, the beginning and end of the monitoring area.
  • Within the monitoring area lie the objects 314, 315 and 316 as well as the object 311.
  • The imaged scene represents the orientation of the camera at the time of object detection.
  • The slave camera 302 has an adjustable field of view, e.g. as a pan-tilt-zoom camera.
  • Line 307 denotes the line of sight associated with the camera vector of the slave camera (or the straight line defined by the camera vector in the general case). Aligning the slave camera with an object is understood to mean aligning this line with the object.
  • FIG. 4 shows a camera image of the master camera 301 from FIG. 3 as it appears, for example, to a user on a monitor. The image shows the objects 311, 314, 315 and 316 of FIG. 3.
  • Also present is an input unit, e.g. a mouse, by means of which a user can select a pixel 406 in the displayed scene.
  • This point 406 can be marked by the system on the screen; in Figure 4 this is done by a black dot.
  • The point 406 indicates an object in the scene, in this case a corner of the house, which is to be detected by the slave camera 302 of FIG. 3. In the plan view of FIG. 3 this object corresponds to the world point 305 with the associated visual ray 303.
  • The distinction between the world point 305 and the pixel 406, and between the (world) visual ray 303 and its imaging/projection into a camera image (504 in the camera image of the slave camera in Figure 5), is essential.
  • The entire visual ray 303 is imaged in the camera image of the master camera onto the single pixel 406.
  • The detector thus marks a pixel in the camera image of the master camera and thereby a visual ray.
  • The position of the corresponding world point on the visual ray, i.e. its distance from the master camera, is initially unknown.
  • Locating the world point and efficiently controlling the slave camera 302 is the subject of the embodiments of the invention described below.
  • The point 406 in FIG. 4 or 305 in FIG. 3 is also shown, with adapted reference symbols, in FIGS. 5 to 7. If the slave camera 302 can be arranged close to the master camera 301, then with proper alignment the visual rays of the master camera 301 and the camera vector of the slave camera run quasi-parallel; the object distance is then irrelevant to the orientation of the slave camera.
  • In this case the detection by the slave camera 302 is relatively easy to solve.
  • A special case also arises if the terrain of the monitored area is known, especially if it is a very simple terrain form such as a floor plane in indoor monitoring. In this case, assuming that the object is on the ground, it is relatively easy to calculate the object distance and to aim the slave camera 302 at the object. In more complex terrain, the terrain shape could also be determined by the camera system itself. This is possible, for example, by calculating distance maps by means of known stereo or multi-camera methods.
  • A distance map is a map that stores, for each pixel in the camera image of the master camera, the distance to the imaged scene point. Another option, if the object size is known, is to use it to estimate the object distance and thus align the slave camera 302.
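For the simple floor-plane case, a sketch of estimating the object distance by intersecting the visual ray with a horizontal ground plane (the plane z = ground_z is an assumed convention; a real installation would use its own terrain model or distance map):

    import numpy as np

    def intersect_ray_with_ground(origin, direction, ground_z=0.0):
        """Intersect a visual ray with the plane z = ground_z.

        Returns (world_point, distance_from_camera), or None if the ray does not
        hit the plane in front of the camera. direction is assumed normalized."""
        if abs(direction[2]) < 1e-9:
            return None                      # ray (almost) parallel to the ground
        s = (ground_z - origin[2]) / direction[2]
        if s <= 0:
            return None                      # intersection behind the camera
        return origin + s * direction, s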
  • However, the required accuracy is generally determined by the zoom, i.e. the camera opening angle. If one assumes that the detection by the slave camera 302 requires an accuracy of, e.g., 1/10 of the camera opening angle, this means, for a large zoom with an opening angle of about 2 degrees, an accuracy of the setting of the slave camera 302 of 0.2 degrees. In most cases this accuracy is not reachable with the methods given above.
  • By the detection of the object, a pixel 406 in FIG. 4 and thus a visual ray 303 in FIG. 3 are defined.
  • For a user, entering this point means, e.g., one mouse click.
  • In addition, the object distance can be restricted to a certain range. Often this is given by the size of the monitoring area covered by the master camera 301 in Figure 3, which by way of example lies between the boundary lines 370 and 371 in Figure 3.
  • The slave camera 302 is then automatically set by the system (pan, tilt and zoom) as shown in Figure 3. The field of view indicated by the boundary lines 322 in Figure 3 captures the visual ray 303 of the pixel 406 within the monitoring area completely. If a complete capture is not possible, as much as possible is captured and a warning is displayed.
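A sketch of one way to choose such a setting, assuming the admissible distance range along the ray (between the lines 370 and 371) is known and the zoom is characterised by a horizontal opening angle; aiming at the segment midpoint and zooming out until both endpoints fit is an illustrative heuristic, not the patent's prescribed algorithm:

    import numpy as np

    def cover_ray_segment(slave_pos, origin, direction, d_min, d_max, margin_deg=2.0):
        """Return (pan, tilt, opening_angle_deg) so that the slave camera roughly
        covers the ray segment between the distances d_min and d_max."""
        p0 = origin + d_min * direction
        p1 = origin + d_max * direction
        mid = 0.5 * (p0 + p1)
        v = mid - np.asarray(slave_pos, float)
        pan = np.arctan2(v[1], v[0])
        tilt = np.arctan2(v[2], np.hypot(v[0], v[1]))
        # Angle under which the segment appears from the slave camera.
        v0, v1 = p0 - slave_pos, p1 - slave_pos
        cos_a = np.dot(v0, v1) / (np.linalg.norm(v0) * np.linalg.norm(v1))
        angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
        return pan, tilt, angle + margin_deg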
  • If required, the focus can be adjusted automatically so that objects along the visual ray appear as sharp as possible.
  • FIG. 5 shows the camera image of the slave camera 302 set in this way, as it appears to a user on a monitor or to an automatic detector.
  • In this image the visual ray 303 from FIG. 3 is advantageously superimposed (line 504 in FIG. 5) in such a way that the user can orient themselves but, depending on the application, the picture is not obscured.
  • For example, the line of sight 504 is displayed in colour, dashed and/or semi-transparent, etc.
  • The user or an automatic detector now only has to search the image along the line of sight 504 in order to find the pixel 506, which is the image of the searched world point 305 from FIG. 3 in the camera image of the slave camera. For a user this means, e.g., a second mouse click.
  • The line of sight 504 can also be an imaginary, not actually displayed line.
  • The detection in the camera image of the slave camera determines the position of the object on the visual ray 303, and the slave camera 302 can be automatically aimed at the object and set to a high zoom. Restricting the search to this line is of course advantageous for an automatic detector, but for a user the superimposition of the line of sight 504 and the restriction of the search along the line of sight 504 can also be helpful. This is the case, for example, if the object has very low contrast and the image is of poor quality, or if there are a large number of similar objects, for example a person in a crowd of spectators.
  • A great advantage in any case is that the user or detector has captured the object with two detections (for a user, e.g., two mouse clicks) and two associated movements of the slave camera 302. If the object in the camera image of the master camera was detected as an area, the search area would correspondingly be given by the sum of the visual rays.
  • FIG. 6 shows the same surveillance scene as FIG. 3 with a master camera 601 and a slave camera 602.
  • In a further embodiment, a slider is offered to the user on the screen or on the input device 143 of FIG. 1.
  • The goal is again to align the slave camera 602 with the world point 605.
  • For this, the associated pixel 406 is detected in the camera image of the master camera (FIG. 4).
  • In this embodiment the slave camera 602 is already set to a large zoom, which is indicated in the drawing by the boundary lines 622, 623 and 624 of the field of view for three different orientations of the slave camera. This has the advantage, for example, that a high resolution is already available during the search and during the detection of the object by the slave camera 602.
  • The system therefore offers the user the option of moving the slave camera along the line of sight 603 by moving the slider.
  • The search is thus much easier and faster, because only this one degree of freedom has to be operated and searched.
  • The search on the visual ray is automatically restricted to the range determined by the a priori knowledge.
  • The slider can, for example, be provided with distance information along the line of sight 603.
  • The slider can also be replaced by other input media with one degree of freedom, or by input media whose function is limited to one degree of freedom.
  • For example, the normal pan-tilt control of a camera via a joystick or keys can be changed for object detection in such a way that one of the two deflections, or one pair of keys, moves the slave camera along the line of sight, while the other deflection or the other pair of keys is switched off, controls the zoom of the slave camera, controls the deflection of the slave camera perpendicular to the line, or performs other tasks.
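A sketch of how such a one-degree-of-freedom control could be mapped onto the slave camera, assuming a slider value in [0, 1] that covers the admissible distance range along the visual ray (the range, the mapping and the pan/tilt convention are illustrative assumptions):

    import numpy as np

    def aim_along_ray(slider, slave_pos, origin, direction, d_min, d_max):
        """Map a slider position in [0, 1] to pan/tilt of the slave camera so that
        its camera vector points at the corresponding point of the visual ray."""
        d = d_min + slider * (d_max - d_min)             # distance along the ray
        v = origin + d * direction - np.asarray(slave_pos, float)
        pan = np.arctan2(v[1], v[0])
        tilt = np.arctan2(v[2], np.hypot(v[0], v[1]))
        return pan, tilt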
  • FIG. 7 shows illustrations of camera images that are recorded by the slave camera 602 of Figure 6 with the orientations 622, 623 and 624 along the line of sight 603.
  • The camera image 722 shows the object 612 of FIG. 6, the camera image 723 shows the objects 613, 614 and 616.
  • The object selected by point 406 is shown in the camera image 724 in Figure 7 from the perspective of the slave camera 602 and zoomed in (point 706).
  • The slider is then, of course, no longer needed.
  • If the object in the camera image of the master camera (Figure 4) is detected as an area, in this embodiment one of the visual rays, for example the middle one, is to be selected.
  • Figure 8 shows a camera image 833 of the slave camera 602.
  • The slave camera 602 is here still aimed too high to capture the line of sight 603, so that only the treetops of the objects 613 and 614 are visible.
  • The arrow 809 shows the direction in which the line of sight is reached on the shortest path.
  • The information could also be displayed on a control panel, e.g. in which direction a joystick for camera control is to be moved or which of the control keys is to be used.
  • If the slave camera is moved according to this information, the camera image 823 is reached. Once the slave camera has captured the visual ray, the camera image 823 is identical to the camera image 723 of FIG. 7.
  • The arrows 804 shown there indicate the direction in which the camera has to be moved in order to travel along the line of sight. Instead of the arrows, the line of sight can also be displayed as a line or in another suitable way.
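One conceivable way to derive such an arrow direction, sketched under the pan/tilt convention assumed above (sampling the ray and heading for the angularly closest sample is an illustrative heuristic, not the patent's prescribed method):

    import numpy as np

    def direction_to_ray(current_pan, current_tilt, slave_pos, origin, direction,
                         d_min, d_max, n=100):
        """Return (d_pan, d_tilt): the smallest pan/tilt change that brings some
        point of the visual ray onto the camera vector of the slave camera."""
        best = None
        for d in np.linspace(d_min, d_max, n):
            v = origin + d * direction - np.asarray(slave_pos, float)
            pan = np.arctan2(v[1], v[0])
            tilt = np.arctan2(v[2], np.hypot(v[0], v[1]))
            d_pan = (pan - current_pan + np.pi) % (2 * np.pi) - np.pi   # wrap to [-pi, pi]
            d_tilt = tilt - current_tilt
            if best is None or np.hypot(d_pan, d_tilt) < np.hypot(*best):
                best = (d_pan, d_tilt)
        return best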
  • FIG. 9 shows a further particularly advantageous embodiment.
  • For clarification, the concrete task is to capture the face of a person in high resolution.
  • FIG. 9 shows a scenario with a master camera 901, a slave camera 902 and three persons 982, 983 and 984, who are located at different positions on a visual ray 903 of the master camera 901.
  • In this embodiment the zoom is not adapted while the slave camera 902 is moved, as indicated by the fields of view 922, 923 and 924.
  • The corresponding camera images can be seen in FIG. 10. Due to the different distances from the slave camera 902, the person 982 is recorded too large in camera image 1022 (face cut off), the person 984 too small in camera image 1024.
  • The person 983 is, for the set zoom, at approximately the appropriate distance (camera image 1023).
  • In the master camera 901, for example, a suspicious person is detected in low resolution.
  • With the described embodiment, a high-resolution snapshot of the person's face can be obtained very efficiently with the help of the slave camera 902.
  • The triggering of the snapshot can be done by a user, by an automatic detector (here a face detector) or semi-automatically.
  • The autofocus can also support the detection of the object in the slave camera by focusing on the visual ray. Often the arrangement is such that the line of sight, before it hits the selected object, runs far away from other objects (through the air). The objects that lie along the line of sight in the projection into the camera image of the slave camera then have the "wrong" distance from the slave camera and are not shown in focus. Starting from the master camera, the first sharply imaged object is in this case the searched object.
  • In general, in other embodiments too, the distance from the slave camera 302, 602 or 902 can easily be calculated by triangulation for any position on the line of sight 303, 603 or 903. If the object has been detected by the slave camera on the visual ray, the distance of the object from the master camera 301, 601 or 901 can likewise easily be calculated by triangulation. This information generally serves the automatic adjustment of zoom and focus and can be used for further tasks.
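As an illustration of this triangulation, a sketch computing the point where the master's and the slave's visual rays come closest (the standard closest-point construction between two lines; the function and variable names are illustrative):

    import numpy as np

    def triangulate(o_master, d_master, o_slave, d_slave):
        """Closest point between the master ray and the slave ray.

        Returns (world_point, distance_from_master, distance_from_slave);
        the direction vectors are normalized internally."""
        d1 = d_master / np.linalg.norm(d_master)
        d2 = d_slave / np.linalg.norm(d_slave)
        w = o_master - o_slave
        b = np.dot(d1, d2)
        denom = 1.0 - b * b
        if abs(denom) < 1e-9:                         # rays (almost) parallel
            s, t = 0.0, np.dot(d2, w)
        else:
            s = (b * np.dot(d2, w) - np.dot(d1, w)) / denom
            t = (np.dot(d2, w) - b * np.dot(d1, w)) / denom
        p1 = o_master + s * d1
        p2 = o_slave + t * d2
        return 0.5 * (p1 + p2), s, t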
  • The detection or tracking of a moving object in the camera image of the master camera 301, 601 or 901 can also take place continuously.
  • The visual ray 303, 603 or 903 and the control of the slave camera 302, 602 or 902 are then automatically adjusted continuously.
  • The focus is automatically adjusted such that objects on the line of sight are sharply imaged.
  • The searched object is thereby automatically detected, in that, starting from the first camera 01, with the zoom lens automatically focused on the line of sight 03, it is the first sharply imaged or the most sharply imaged object.
  • In a further embodiment the one-degree-of-freedom controller is realized as a slider or joystick or in the form of keys, a deflection of the joystick moving the respective camera along the line of sight, and the input medium used in normal operation to control the second camera 02 is changed for the object acquisition task in such a way that it fulfils the object detection function.
  • In a further embodiment the master camera, as shown in Figure 3, is equipped with a focused light source 350, e.g. a laser.
  • The light source is adjustable in its orientation; its spectrum does not have to be in the visible range.
  • The focused light source 350 is mounted as close as possible to the master camera 301 and can thus, for a selected point 406 (FIG. 4), be oriented along the associated visual ray 303. The beam then hits the selected object and illuminates it. If the slave camera 302 is sensitive to the radiation used, this can serve to detect the object by the slave camera 302. This is especially true if the radiation, via its spectrum, its intensity, a pulsed operation, or combinations of these or other features, leads to a clearly detectable signal in the camera image of the slave camera.
  • Another embodiment is shown in FIG. 12.
  • Like Figure 4, this figure shows a camera image of the master camera 301 of Figure 3.
  • In it, the visual ray 1208 associated with the camera vector of the slave camera (visual ray 307 of Figure 3, or the straight line defined by the camera vector in the general case) is superimposed. This allows a user to immediately recognize the current orientation of the slave camera.
  • For handling multi-camera systems, the user is often offered site maps of the area on a monitor in which the cameras are shown. Since the site plans are top views, it is also easy to display the current pan of a pan-tilt-zoom camera by a corresponding symbol. This facilitates orientation for the user when controlling the camera.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Burglar Alarm Systems (AREA)
  • Cameras In General (AREA)
EP03029530A 2002-12-22 2003-12-20 Commande d'un système multicaméra Expired - Lifetime EP1434184B1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE10261146A DE10261146A1 (de) 2002-12-22 2002-12-22 Steuerung eines Multikamera-Systems
DE10261146 2002-12-22

Publications (3)

Publication Number Publication Date
EP1434184A2 true EP1434184A2 (fr) 2004-06-30
EP1434184A3 EP1434184A3 (fr) 2004-11-17
EP1434184B1 EP1434184B1 (fr) 2008-03-05

Family

ID=32404310

Family Applications (1)

Application Number Title Priority Date Filing Date
EP03029530A Expired - Lifetime EP1434184B1 (fr) 2002-12-22 2003-12-20 Commande d'un système multicaméra

Country Status (4)

Country Link
EP (1) EP1434184B1 (fr)
AT (1) ATE388457T1 (fr)
DE (2) DE10261146A1 (fr)
ES (1) ES2301752T3 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112509257A (zh) * 2020-12-29 2021-03-16 鼎力联合(深圳)高新技术有限公司 一种光束扫描防御系统及其使用方法

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102004034663A1 (de) * 2004-07-17 2006-02-09 Siemens Ag Folgekamerasteuerung
DE102016119241B4 (de) 2016-10-10 2018-08-09 Markus Blömer Operationsbestimmungsvorrichtung zum Bestimmen einer von einem Gerät durchzuführenden Operation

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1994017636A1 (fr) * 1993-01-29 1994-08-04 Bell Communications Research, Inc. Systeme de commande de cameras a poursuite automatique
DE19639728A1 (de) * 1996-09-26 1998-04-09 Siemens Ag Video-Überwachungseinrichtung
US20020102101A1 (en) * 2001-01-30 2002-08-01 Philips Electronics North America Corporation Camera system and method for operating same


Also Published As

Publication number Publication date
EP1434184A3 (fr) 2004-11-17
ATE388457T1 (de) 2008-03-15
DE10261146A1 (de) 2004-07-01
DE50309311D1 (de) 2008-04-17
EP1434184B1 (fr) 2008-03-05
ES2301752T3 (es) 2008-07-01


Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL LT LV MK

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL LT LV MK

17P Request for examination filed

Effective date: 20041124

AKX Designation fees paid

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR

17Q First examination report despatched

Effective date: 20060703

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

REF Corresponds to:

Ref document number: 50309311

Country of ref document: DE

Date of ref document: 20080417

Kind code of ref document: P

REG Reference to a national code

Ref country code: ES

Ref legal event code: FG2A

Ref document number: 2301752

Country of ref document: ES

Kind code of ref document: T3

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

REG Reference to a national code

Ref country code: IE

Ref legal event code: FD4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080805

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080605

ET Fr: translation filed
PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

Ref country code: IE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

26N No opposition filed

Effective date: 20081208

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080605

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

BERE Be: lapsed

Owner name: FUNKWERK PLETTAC ELECTRONIC G.M.B.H.

Effective date: 20081231

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20081231

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20081231

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20081231

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20081231

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20081220

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080906

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20081220

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080305

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20080606

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20130117

Year of fee payment: 10

Ref country code: ES

Payment date: 20121226

Year of fee payment: 10

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20121218

Year of fee payment: 10

REG Reference to a national code

Ref country code: NL

Ref legal event code: V1

Effective date: 20140701

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20140829

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20140701

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20131231

REG Reference to a national code

Ref country code: ES

Ref legal event code: FD2A

Effective date: 20150327

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20131221

REG Reference to a national code

Ref country code: DE

Ref legal event code: R081

Ref document number: 50309311

Country of ref document: DE

Owner name: FUNKWERK VIDEO SYSTEME GMBH, DE

Free format text: FORMER OWNER: FUNKWERK PLETTAC ELECTRONIC GMBH, 90766 FUERTH, DE

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20171117

Year of fee payment: 15

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20171221

Year of fee payment: 15

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IT

Payment date: 20171229

Year of fee payment: 15

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 50309311

Country of ref document: DE

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20181220

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190702

Ref country code: IT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181220

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181220