WO2018211057A1 - Dispositif de suivi par caméra à base de marqueurs - Google Patents
Dispositif de suivi par caméra à base de marqueurs (Marker-based camera tracker)
- Publication number
- WO2018211057A1 (application PCT/EP2018/063048)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- markers
- camera
- images
- positions
- scene
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/2224—Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
Definitions
- The present document relates to a marker-based "through-the-lens" camera tracker that determines the camera parameters based on images of markers placed in the scene.
- Virtual studios (studios with a green background) need to be calibrated so that virtual backgrounds are always displayed correctly with moving cameras, in particular with respect to real objects in the studio. For this purpose, the recording parameters of the cameras must be known. The recording parameters are properties of the recording cameras, such as their position and orientation in space or their focal length.
- The proposed method is based on measuring markers mounted in the studio by means of a camera and determining the spatial positions of these markers. The markers are then captured by a camera during the recording (production) of image or video content and used to determine the recording parameters. In this way, for example, the spatial position of the camera and its zoom parameters, such as the focal length, can be determined continuously during production without additional equipment (such as separate tracking hardware).
- A method for determining camera parameters from captured image data is disclosed. It includes capturing a plurality of images of a scene, wherein the images contain a plurality of markers arranged in the scene, and determining the positions of the markers in space. In this way, all markers in the scene can be measured.
- The scene may be a television studio or another studio suitable for producing image or video content.
- The determined positions of the markers can then be stored for later use. With this, the preparatory measures are completed and the production of image or video content can begin.
- The further images can be the produced image or video content.
- The camera parameters of interest are those in effect when capturing the further images. They are determined on the basis of the assigned spatial/image positions of the markers in the further images.
- The camera parameters thus determined may include at least one of camera position, camera orientation, and camera zoom.
- The positions of the markers in space are determined by means of a "structure from motion" method.
- The method can have a further step of using the determined camera parameters for inserting virtual content into the further images. In this way, the background of the virtual studio can be filled in visually depending on the camera parameters.
- At least some of the markers may be designed to allow unique identification of the respective marker.
- The markers may include a visual coding of their identification number and/or a reference position or direction. This allows at least some of the markers in the further images to be uniquely identified.
- The markers are arranged three-dimensionally in the space of the scene, and their respective three-dimensional positions in space are determined.
- The markers can comprise uniquely identifiable master markers and slave markers whose identifier is unique only in relation to a master marker.
- The markers can be colored in such a way that they can easily be removed from the further images by means of their color coding. This is particularly advantageous when using a green box.
- The images of the scene are taken with a calibrated surveying camera. This camera can previously be subjected to an intrinsic calibration in which its camera parameters, such as focal length, optical center and distortion, are determined based on images of a calibration object of known dimensions.
- The further images can be taken with a production camera (e.g., a broadcast camera).
- From the composite images, the positions of the markers in space can then be determined.
- An apparatus for carrying out the method described above is also proposed.
- Further proposed is a system for determining camera parameters, comprising at least one camera for recording images of a scene, a plurality of markers arranged in the scene, and an evaluation unit.
- The evaluation unit is connected to the at least one camera in order to obtain the recorded images and to determine the positions of the markers in space on the basis of the obtained images. Further images of the scene are then taken by the at least one camera and transferred to the evaluation unit.
- The evaluation unit is further configured to associate the spatial positions of the markers in the further images with their respective image positions, and to determine the camera parameters during capture of the further images based on the assigned spatial/image positions of the markers in the further images.
- In particular, the evaluation unit can perform the steps of the method described above, and all aspects described there can be transferred to the proposed system.
- Two cameras can be used: an intrinsically calibrated surveying camera for recording the markers in order to determine their spatial positions, and a production camera for recording the image or video content.
- FIG. 1 shows an example of a marker
- FIG. 2 shows a calibration object
- FIG. 3 shows the geometric relationships of a projective camera.
- The described system for dynamically determining the recording parameters of a production camera is based on a measurement of markers mounted in the studio by means of a surveying camera.
- For this measurement, the so-called "structure from motion" method can be used, which corresponds to a stereo measurement with unknown extrinsic camera parameters.
- The markers are designed in such a way that, on the one hand, they are easy to find and recognize in a recorded image and, on the other hand, they enable an exact determination of their position.
- A round basic shape is suitable, for example a circle with two filled, opposite sectors meeting at the center.
- Three markers define the coordinate system of the scene.
- The markers are preferably visually coded with an identifier so that their identity can be determined by evaluating recorded images of the markers.
- Additional structures may be provided for this coding.
- These additional structures can be filled circles (blobs), which are arranged at certain positions in a circle around the (possibly also round) basic shape.
- A binary coding of the marker ID by means of these additional structures is possible.
- The markers may also have an additional reference structure (blob). This indicates the zero direction for decoding and ID recognition.
- This reference structure may be smaller than the ID structures used for coding. If it is not recognized in an image, there is a risk that the detection of the ID structures will be unreliable and the ID may be misread. In this case, the marker should be discarded.
- The visual coding of the marker IDs can be recognized over a wide zoom range, so that the markers can be unambiguously identified during production under a wide variety of recording conditions. Therefore, as many markers as possible should be placed in the studio, at different locations and possibly also in different sizes. In this way it can be ensured that, among the markers visible in a given shot, a sufficient number can be reliably identified.
- a "master-slave" concept can be used, whereby an easily recognizable master marker (for example, with larger dimensions) and multiple slave markers used in known spatial arrangements for master marking are, for example, annular in a known direction of rotation to the master mark.
- the slave tags can be identified, even if their visual IDs are too small in the captured image for safe evaluation.
- the "master-slave" concept can also be used to simplify the necessary visual identifiers of the markers for unique identification with a large number of markers, since too many unique ones
- Production camera during the recording (production) of image or video in hold recorded and used to determine the recording parameters.
- the rough position and orientation of the production camera and its zoom parameters such as focal length can be determined continuously during production without the need for additional devices during production
- The markers are preferably executed green-in-green, so that when virtual content is displayed they can be removed from the image together with the green background.
- The markers can be removed from an image by means of their color coding.
- The proposed calibration and tracking system, and the corresponding method, are not limited to virtual studios and can also be used elsewhere, for example wherever the recording parameters of a camera are to be determined dynamically. Example applications include the recording of sporting events or vehicle crash tests.
- The through-the-lens calibration procedure can be applied to broadcast video cameras and all other common cameras (for example, for virtual reality and augmented reality productions).
- The position of the markers in the scene is almost arbitrary.
- A tracking system comprises a production video camera (e.g., a broadcast camera), a surveying camera (this can be the broadcast camera itself, but a dedicated camera is advantageous in handling and for accuracy), markers with an identification that is clearly recognizable from the image, and a processing unit.
- The processing unit receives the images of the markers located in the scene taken by the surveying camera in order to determine their 3D positions. For this purpose, at least two images of the markers from different camera positions are required, together with the measured distance between these positions.
- The processing unit performs a structure-from-motion reconstruction of the 3D positions of the markers (corresponding to a stereo measurement with unknown camera positions) using the two or more shots and the measured distance. Furthermore, the parameters of an intrinsic calibration of the surveying camera are required. This calibration can be carried out before, and independently of, the recording of the markers in the studio. In general, the surveying camera is calibrated once and the intrinsic calibration parameters are determined and stored.
- A calibration object of known dimensions is used for the intrinsic calibration of the surveying camera. One or more images of the calibration object are taken with the surveying camera for this purpose. In this way, camera parameters such as the focal length, the optical center and, where applicable, distortions can be determined, which are later used for the stereo measurement with unknown extrinsic camera calibration, i.e. the so-called "structure from motion" method, to determine the 3D positions of the markers.
- FIG. 1 shows an example of a marker with a coded identifier.
- The basic shape of the marker is circular, with two opposite solid sectors meeting at the center of the circle.
- Around the round basic shape of the marker, further visual structures (blobs), here filled circles, are arranged.
- The upper center circle identifies the orientation of the marker and marks its zero position or direction; in the example shown, this is taken as the zero-degree direction of the marker.
- The further circles can be used to encode the ID of the marker, for example by means of a binary coding.
- In the example, four circles are provided at predetermined positions of 45, 135, 225 and 315 degrees with respect to the zero-degree direction, so that the binary value 1111 is coded.
- If a circle is present at one of these positions, the corresponding binary digit is set to one (see the decoding sketch below).
- More coding points can be used to uniquely identify a larger number of markers.
- However, this reduces the recognition accuracy of the coded marker IDs.
- In this case, the master-slave principle described above can be used.
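- As an illustration of the binary ID scheme just described, the following Python sketch decodes a marker ID from detected blob centers. It is a minimal example under assumed conventions (four ID positions at 45, 135, 225 and 315 degrees, a reference blob at 0 degrees, angles measured around the detected marker center); the patent does not prescribe this exact implementation.

```python
import math

# Assumed coding positions (degrees, measured from the reference blob direction).
ID_ANGLES = [45.0, 135.0, 225.0, 315.0]
ANGLE_TOLERANCE = 10.0  # slack in degrees when matching a blob to a coding slot


def decode_marker_id(center, reference_blob, id_blobs):
    """Decode a binary marker ID from blob positions.

    center         -- (x, y) of the marker's circular basic shape
    reference_blob -- (x, y) of the zero-direction blob, or None if not detected
    id_blobs       -- list of (x, y) centers of the remaining blobs
    Returns the decoded integer ID, or None if decoding is unreliable.
    """
    if reference_blob is None:
        # Without the reference blob the zero direction is unknown and the ID
        # could be misread, so the marker is discarded (as described above).
        return None

    ref_angle = math.atan2(reference_blob[1] - center[1],
                           reference_blob[0] - center[0])

    def angle_from_reference(p):
        ang = math.atan2(p[1] - center[1], p[0] - center[0])
        return math.degrees(ang - ref_angle) % 360.0

    bits = [0] * len(ID_ANGLES)
    for blob in id_blobs:
        a = angle_from_reference(blob)
        for i, slot in enumerate(ID_ANGLES):
            if min(abs(a - slot), 360.0 - abs(a - slot)) < ANGLE_TOLERANCE:
                bits[i] = 1  # a blob at this slot sets the corresponding bit

    # Interpret the bit pattern as a binary number, first slot = most significant bit.
    marker_id = 0
    for b in bits:
        marker_id = (marker_id << 1) | b
    return marker_id
```

- For the marker of FIG. 1 with blobs at all four coding positions, this yields the binary value 1111, i.e. ID 15.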
- The markers may be designed green-in-green, so that they can easily be removed from the shots together with the green background.
- The 3D positions of the markers can be used during the production of image or video content to calibrate the position, orientation and zoom of the production camera.
- For this purpose, the images recorded by the production camera are supplied to an image processing unit. This unit determines the marker positions in the images, determines their identifiers and, via their IDs, assigns to the markers in the image their respective 3D positions in space.
- The image processing unit may be separate from the processing unit used to measure the 3D positions of the markers. Alternatively, both can be implemented on the same computing device.
- From the known 3D positions of the markers and their respective image positions, the recording geometry and thus the recording parameters of the production camera can be determined.
- The markers can then be removed from the images on the basis of their color coding.
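- A minimal sketch of such color keying with OpenCV follows: green-in-green markers are removed together with the green background by an HSV hue mask. The hue, saturation and value limits are illustrative assumptions, not values from the patent, and would be tuned to the actual studio lighting and marker paint.

```python
import cv2
import numpy as np


def green_mask(frame_bgr, hue_range=(35, 85), min_sat=40, min_val=40):
    """Return a binary mask of 'green' pixels (markers and green background)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([hue_range[0], min_sat, min_val], dtype=np.uint8)
    upper = np.array([hue_range[1], 255, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    # Close small holes so marker blobs are keyed out together with the backdrop.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)


def composite_virtual_background(frame_bgr, background_bgr):
    """Replace keyed (green) pixels, including the markers, with the rendered background."""
    mask = green_mask(frame_bgr)
    mask3 = cv2.merge([mask, mask, mask])
    return np.where(mask3 > 0, background_bgr, frame_bgr)
```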
- The knowledge of the recording parameters obtained in this way makes it possible to carry out a wide variety of image evaluations and measurements on the produced content, for example when evaluating images of a vehicle crash test. Because the recording parameters are determined from the video data captured by the production camera itself, they are of particular importance for dynamic shooting conditions, such as a moving or tilting zoom camera. They can be used in a virtual studio with a green box to visualize virtual content with the correct orientation to the recording situation.
- In the following, details about the intrinsic calibration and about further aspects that can be used in embodiments are described. It should be noted that not all details are required for carrying out the invention, and embodiments without these details are also suitable for carrying out the invention.
- The intrinsic calibration assigns to each pixel of a camera a line of sight in space.
- The intrinsic calibration is a property of the camera itself, independent of its position or orientation in the world.
- The intrinsic calibration is based on at least one image of an object with multiple calibration points and exactly known dimensions.
- FIG. 2 shows a CNC-produced calibration object with dark calibration points (filled circles) whose positions on the object are exactly known.
- The dark areas of the points are detected via an adaptive threshold method and recorded in a binary mask.
- Adaptive threshold methods do not work with a fixed threshold but find areas that are darker than their local surroundings. In this way, differences in illumination across the object can be compensated. All contiguous areas in the resulting binary mask are then examined for circularity; this also tolerates the distortion of the circles into ellipses caused by the projection.
- The calibration points are then numbered according to their position on the object, by rows and columns, and each is assigned a 3D position based on the known object dimensions.
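- This detection step can be sketched with OpenCV as follows: adaptive thresholding to obtain the binary mask, then a circularity test on each contiguous region. The parameter values (block size, area and circularity limits) are illustrative assumptions.

```python
import cv2
import numpy as np


def detect_calibration_points(gray, min_area=30, min_circularity=0.6):
    """Find centers of dark, roughly circular calibration points in a grayscale image."""
    # Adaptive threshold: compares each pixel with its local neighbourhood,
    # so illumination differences across the calibration object are compensated.
    mask = cv2.adaptiveThreshold(
        gray, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
        cv2.THRESH_BINARY_INV,  # dark points become white in the mask
        blockSize=31, C=10)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        area = cv2.contourArea(c)
        if area < min_area:
            continue
        perimeter = cv2.arcLength(c, True)
        if perimeter == 0:
            continue
        # Circularity = 4*pi*A / P^2 equals 1 for a circle; the threshold is kept
        # loose so circles projected into ellipses still pass.
        circularity = 4.0 * np.pi * area / (perimeter * perimeter)
        if circularity < min_circularity:
            continue
        m = cv2.moments(c)
        centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centers
```

- The detected centers can then be sorted into rows and columns and matched to the known 3D positions on the object, as described above.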
- The associated world coordinate system is anchored to the calibration object.
- For the intrinsic calibration, the model of a projective camera is used.
- In a projective camera, a world point X is imaged onto that pixel (u, v) on the image sensor of the camera which lies on the ray between the world point and the (virtual) focal point of the camera.
- The intrinsic camera parameters are determined by the focal point. It is described by the focal length f (its distance from the image plane) and the optical center (cu, cv) (the point at which the optical axis, which is perpendicular to the image plane, intersects the image plane).
- FIG. 3 illustrates the geometric relationships of a projective camera.
- In these formulas, r11, ..., r33 are the components of the rotation matrix R, T1, T2, T3 are the components of the translation vector T, f is the focal length, (u, v) is a pixel, and (X, Y, Z) is the world point of one of the calibration points on the calibration object.
- These formulas are also known under the name "Direct Linear Transformation". For a sufficient number N of calibration points, one obtains N equations of type (5), from which r11, r12, r13, r21, r22, r23, T1 and T2 can be determined; the remaining components of the rotation matrix follow from its orthogonality conditions.
- Using equations (3) and (4), again for the N calibration points, T3 and f can then be calculated.
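- The projection formulas (1) and (3) to (5) referred to in this passage are given as figures in the original document and are not reproduced here. A standard pinhole formulation consistent with the roles described above (rotation R with entries r_jk, translation T = (T1, T2, T3), focal length f, optical center (cu, cv), the latter taken as known, e.g. at the image center, for the linear step) is the following reconstruction; it is not the patent's original typesetting:

$$ \begin{pmatrix} u' \\ v' \\ w' \end{pmatrix} = \begin{pmatrix} f & 0 & c_u \\ 0 & f & c_v \\ 0 & 0 & 1 \end{pmatrix} \left( R \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} + T \right), \qquad u = \frac{u'}{w'}, \quad v = \frac{v'}{w'} $$

Writing out the rows gives equations of the form of (3) and (4),

$$ u - c_u = f\,\frac{r_{11}X + r_{12}Y + r_{13}Z + T_1}{r_{31}X + r_{32}Y + r_{33}Z + T_3}, \qquad v - c_v = f\,\frac{r_{21}X + r_{22}Y + r_{23}Z + T_2}{r_{31}X + r_{32}Y + r_{33}Z + T_3}, $$

and dividing the two eliminates f and T3, yielding one linear condition per calibration point in r11, ..., r23, T1, T2, i.e. the form of (5):

$$ (u - c_u)\,\bigl(r_{21}X + r_{22}Y + r_{23}Z + T_2\bigr) = (v - c_v)\,\bigl(r_{11}X + r_{12}Y + r_{13}Z + T_1\bigr). $$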
- In this way, the parameters of the surveying camera, such as focal length, optical center and, where applicable, distortions, can be determined.
- This calibration is obtained from linear equations and is therefore not yet optimal.
- For refinement, a non-linear optimization such as the Gauss-Newton method can be used.
- The Gauss-Newton method iteratively minimizes an error function of the following type (see, e.g., Wikipedia):
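- The error function, formula (6), is given as a figure in the original; the standard least-squares objective that the following explanation describes is (reconstruction):

$$ S(x_1, \ldots, x_n) = \sum_{i} \bigl( y_i - f_i(x_1, \ldots, x_n) \bigr)^2 $$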
- Here x_1, ..., x_n are the n parameters of the calibration (i.e. f, cx, cy, R, T), and the index i runs over the calibration points.
- The functions f_i in (6) are the projected image coordinates of the calibration points, and the y_i correspond to the measured pixel coordinates (u, v), i.e. two terms per calibration point.
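- A compact sketch of such a refinement is given below. It uses scipy's least-squares solver (a Levenberg-Marquardt variant of the same idea) instead of a hand-written Gauss-Newton loop; the parameterization and helper names are illustrative assumptions rather than the patent's implementation.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def project(params, world_points):
    """Project 3D calibration points with parameters (f, cx, cy, rotation vector, T)."""
    f, cx, cy = params[0:3]
    R = Rotation.from_rotvec(params[3:6]).as_matrix()
    T = params[6:9]
    cam = world_points @ R.T + T                 # calibration points in camera coordinates
    u = f * cam[:, 0] / cam[:, 2] + cx
    v = f * cam[:, 1] / cam[:, 2] + cy
    return np.column_stack([u, v])


def refine_calibration(initial_params, world_points, measured_pixels):
    """Refine (f, cx, cy, R, T) by minimizing the reprojection error.

    There are two residual terms per calibration point, one for u and one for v,
    matching the error function described above.
    """
    def residuals(p):
        return (project(p, world_points) - measured_pixels).ravel()

    result = least_squares(residuals, initial_params, method="lm")
    return result.x
```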
- The marker positions can likewise be determined using the "structure from motion" method. In this case there are N calibration points with now unknown world coordinates, which are no longer located on a precision calibration object but are instead the markers distributed in the scene. In the formulas (3) to (5), u, v (for the N points) and f are then known and the rest is unknown, including the N world points (X, Y, Z)_i.
- Equation (1) indicates for projective cameras how world points and pixels are related. Applying this consistently from a pixel in shot 1 to the world point and from there into shot 2, one can show that the corresponding pixels of the two shots are linked by a matrix F (in the usual notation, x2^T F x1 = 0 for corresponding homogeneous pixel coordinates x1, x2).
- Here R and T are the sought extrinsic calibration parameters from (1), and K is the known matrix of the intrinsic calibration (the superscript T denotes the transpose). Relation (9) links an expression in R and T to the product K^T F K.
- The right-hand side of (9) can thus be determined from point correspondences between the two images (via F and the known K). It turns out that there are methods which can factorize K^T F K (also called the essential matrix) into the matrices R and T. Such a method is described, e.g., in Richard Hartley, Andrew Zisserman, Multiple View Geometry in Computer Vision, Cambridge University Press (section "Extraction of Cameras from the Essential Matrix").
- Once R and T are known, the 3D positions of the world points can be calculated by a stereo reconstruction: from (1), each pixel directly yields a line of sight as a straight line in space, and the two lines of sight can be intersected to determine the sought 3D point.
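- A minimal sketch of this two-view reconstruction with OpenCV: the essential matrix is estimated from marker correspondences, factorized into R and T, and the 3D positions are triangulated; the measured distance between the two camera positions then fixes the scale. Function and variable names are assumptions for illustration.

```python
import cv2
import numpy as np


def reconstruct_marker_positions(pts1, pts2, K, measured_baseline):
    """Two-view 'structure from motion' for the surveying camera.

    pts1, pts2        -- Nx2 arrays of corresponding marker image positions in shots 1 and 2
    K                 -- 3x3 intrinsic matrix of the calibrated surveying camera
    measured_baseline -- measured distance between the two camera positions (fixes the scale)
    """
    pts1 = np.asarray(pts1, dtype=np.float64)
    pts2 = np.asarray(pts2, dtype=np.float64)

    # Essential matrix from point correspondences, then factorization into R, T.
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)

    # recoverPose returns the translation only up to scale; the measured baseline fixes it.
    t = t / np.linalg.norm(t) * measured_baseline

    # Triangulate: intersect the two lines of sight for each marker.
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    points_h = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    points_3d = (points_h[:3] / points_h[3]).T   # homogeneous -> Euclidean
    return points_3d, R, t
```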
- With the 3D positions of the markers known, the position R, T and the focal length of the production camera can be calculated for any shot taken with it.
- Using the formulas (3) to (5), this is basically the same task as the calibration with the calibration object, with the measured marker positions now serving as the known world points.
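- For each production-camera frame, the identified markers give 2D-3D correspondences from which pose and focal length can be estimated. The sketch below uses OpenCV's single-view calibration with an intrinsic guess as a stand-in for the linear/non-linear estimation described above; it is an illustration under assumptions (distortion ignored, principal point fixed at the image center), not the patent's exact algorithm.

```python
import cv2
import numpy as np


def estimate_frame_parameters(marker_3d, marker_2d, image_size, focal_guess):
    """Estimate R, T and focal length of the production camera for one frame.

    marker_3d   -- Nx3 array of the markers' known 3D positions (from the survey)
    marker_2d   -- Nx2 array of the markers' detected image positions in this frame
    image_size  -- (width, height) of the production camera image
    focal_guess -- rough focal length in pixels, e.g. taken from the previous frame
    """
    object_points = [np.asarray(marker_3d, dtype=np.float32)]
    image_points = [np.asarray(marker_2d, dtype=np.float32)]

    K0 = np.array([[focal_guess, 0, image_size[0] / 2],
                   [0, focal_guess, image_size[1] / 2],
                   [0, 0, 1]], dtype=np.float64)

    flags = (cv2.CALIB_USE_INTRINSIC_GUESS | cv2.CALIB_FIX_PRINCIPAL_POINT |
             cv2.CALIB_FIX_ASPECT_RATIO | cv2.CALIB_ZERO_TANGENT_DIST |
             cv2.CALIB_FIX_K1 | cv2.CALIB_FIX_K2 | cv2.CALIB_FIX_K3)

    # Single-view calibration: optimizes the focal length together with the pose.
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        object_points, image_points, image_size, K0, None, flags=flags)

    R, _ = cv2.Rodrigues(rvecs[0])   # rotation vector -> rotation matrix
    T = tvecs[0].reshape(3)
    focal_length = K[0, 0]
    return R, T, focal_length
```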
- For the display of virtual content, the virtual objects are generated as a 3D world in their own coordinate system.
- This virtual world is captured by a virtual camera with a specific position, orientation and opening angle (zoom); this process is called rendering.
- In order for the view of the objects in the rendered image to match that of the real camera, the two coordinate systems must be linked.
- The virtual objects thus receive a position and orientation in the real world, and the position of the virtual camera for rendering then corresponds to the position of the real camera.
- For this purpose, the real-world coordinate system can be defined by markers: one marker defines the origin, a second the X-axis, and a third the Y-direction. The marker for the Y-axis only indicates the plane containing the Y-axis; the actual Y-axis is calculated automatically perpendicular to the X-axis, so that an exactly orthogonal arrangement of the markers is not required.
- The Z-axis then follows from the origin and the orientation of that plane.
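- A small numpy sketch of deriving such a frame from three measured marker positions (one at the origin, one on the X-axis, one anywhere in the X-Y plane); the names are illustrative assumptions.

```python
import numpy as np


def coordinate_frame_from_markers(origin, x_marker, y_plane_marker):
    """Build an orthonormal frame (origin plus X, Y, Z axes) from three 3D marker positions."""
    x_axis = x_marker - origin
    x_axis = x_axis / np.linalg.norm(x_axis)

    # The third marker only fixes the X-Y plane; Z is perpendicular to that plane.
    in_plane = y_plane_marker - origin
    z_axis = np.cross(x_axis, in_plane)
    z_axis = z_axis / np.linalg.norm(z_axis)

    # The actual Y-axis is computed perpendicular to X (and Z), so the markers
    # do not have to be placed exactly orthogonally.
    y_axis = np.cross(z_axis, x_axis)

    rotation = np.column_stack([x_axis, y_axis, z_axis])  # world axes as columns
    return origin, rotation
```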
- The described methods can be executed on a computing device, in particular a digital image processing device, with the calculation of the respectively sought parameters carried out in software.
- Alternatively, special hardware circuits or a mixture of both can be used for this purpose.
- No changes to the production camera itself are required compared to conventional models.
- The requirements for the surveying camera are not particularly high either, so that conventional models can also be used here.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention describes a method for determining camera parameters using image data captured by means of a camera. The method comprises capturing a plurality of images of a scene, the images containing a plurality of markers arranged in the scene; determining the positions of the markers in space; capturing further images of the scene, the further images containing at least some of the markers arranged in the scene; associating the spatial positions of the markers in the further images with their respective image positions; and determining the camera parameters during the capture of the further images on the basis of the associated spatial/image positions of the markers in the further images.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102017208526.6A DE102017208526A1 (de) | 2017-05-19 | 2017-05-19 | Marker basierter Kamera-Tracker |
DE102017208526.6 | 2017-05-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018211057A1 true WO2018211057A1 (fr) | 2018-11-22 |
Family
ID=62196602
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2018/063048 WO2018211057A1 (fr) | 2017-05-19 | 2018-05-18 | Dispositif de suivi par caméra à base de marqueurs |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE102017208526A1 (fr) |
WO (1) | WO2018211057A1 (fr) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102020119601A1 (de) | 2020-07-24 | 2022-01-27 | Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg | Hintergrundwiedergabesystem |
-
2017
- 2017-05-19 DE DE102017208526.6A patent/DE102017208526A1/de not_active Withdrawn
-
2018
- 2018-05-18 WO PCT/EP2018/063048 patent/WO2018211057A1/fr active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2366463A (en) * | 1997-05-30 | 2002-03-06 | British Broadcasting Corp | Position determination |
GB2329292A (en) * | 1997-09-12 | 1999-03-17 | Orad Hi Tec Systems Ltd | Camera position sensing system |
US20100245593A1 (en) * | 2009-03-27 | 2010-09-30 | Electronics And Telecommunications Research Institute | Apparatus and method for calibrating images between cameras |
WO2014020108A1 (fr) * | 2012-08-03 | 2014-02-06 | Thorsten Mika | Dispositif et procédé destinés à déterminer la situation d'une caméra de prise de vue |
Non-Patent Citations (1)
Title |
---|
RICHARD HARTLEY; ANDREW ZISSERMAN: "Multiple View Geometry in Computer Vision", CAMBRIDGE UNIVERSITY PRESS, article "Extraction of Cameras from the Essential Matrix" |
Also Published As
Publication number | Publication date |
---|---|
DE102017208526A1 (de) | 2018-11-22 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18725518 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 11.03.2020) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18725518 Country of ref document: EP Kind code of ref document: A1 |