EP1725986A1 - Dispositif d'analyse de mouvement en temps reel - Google Patents

Dispositif d'analyse de mouvement en temps reel

Info

Publication number
EP1725986A1
Authority
EP
European Patent Office
Prior art keywords
cameras
objects
central computer
image processing
data
Prior art date
Legal status
Withdrawn
Application number
EP05715022A
Other languages
German (de)
English (en)
Inventor
Hendrik Fehlis
Thorsten Mika
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Priority claimed from DE102004044002A1
Application filed by Individual filed Critical Individual
Publication of EP1725986A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/285 Analysis of motion using a sequence of stereo image pairs
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/04 Indexing scheme for image data processing or generation, in general involving 3D image data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30221 Sports video; Sports image

Definitions

  • the present invention relates to a real-time motion analysis device for moving objects in a room.
  • the space to be monitored is recorded with the aid of cameras, preferably video cameras, and the images obtained from the cameras are processed in such a way that the movements of all moving objects can be analyzed.
  • Such methods and devices are particularly suitable for recording and analyzing sporting events e.g. on a sports field.
  • the real-time motion analysis is ideal for team sports such as soccer or volleyball.
  • the idea of being able to better analyze the actual movement activity of moving objects (players, referees, equipment ...) during sports games is not new; rather, various devices and methods that enable such an analysis are known in the prior art.
  • the athletes or their coaches are provided with data with which the performance of the athletes can be evaluated. For example, in soccer games or similar team sports, it may be important to know which routes a player has covered during the game or in certain situations. It can also be useful to analyze the areas in which the main game scenes took place.
  • real-time motion analysis devices and methods also serve to present the actual performance of the players at sports events to viewers in a more legible and understandable manner, so to speak. This includes, for example, that the viewer can identify each player at any time, which can be facilitated by displaying names and shirt numbers on a monitor or television set at any moment. Real-time motion analysis devices can also help to resolve special game situations, such as offside decisions in football, more clearly after the fact.
  • WO 02/071334 A2 describes a method and a system for collecting data relating to the movement of an athlete on a playing field.
  • the position of the player on the playing field is determined in a two-dimensional plane.
  • the device has a large number of cameras which overlook and record the playing field.
  • the object to be observed, for example a player, moves across the field and, depending on its position, is filmed or recorded by a different camera. Its position is then determined by this one camera which records it.
  • the position is only calculated two-dimensionally. If there are occlusions in a video image, a person manually determines the position of the occluded player. Alternatively, the position can also be determined using another camera for which the player is not occluded.
  • WO 03/056809 A1 discloses a similar system.
  • the system described is used for the automatic display of moving objects in real time within a limited space, for example a playing field.
  • surveillance cameras cover this area and record the moving objects in order to determine in real time data about the position of the moving objects from the video images.
  • auxiliary cameras are used which support one of the surveillance cameras if necessary. These auxiliary cameras serve in particular to allow a better determination of the objects when they are concentrated at a certain point. An auxiliary camera is then selected which is best positioned to assist the actual surveillance camera and to record this concentration point together with it.
  • Another focus of this system is to calculate the concentration points that occur in order to be able to determine an auxiliary camera in good time.
  • the data to be calculated are essentially obtained from only one surveillance camera.
  • Auxiliary cameras are only used when there is a high concentration of players in a certain area.
  • the positions are also determined using two-dimensional data.
  • WO 00/31560 describes a system specially designed for ice hockey. It only detects movements of the players in the X and Y directions; additional cameras are installed in the area of the goals in order to be able to calculate the height of the puck. With this system, however, it is necessary to mark the stick, helmet, jersey and puck with a label whose special color reflects infrared light.
  • the active illumination of the markers is realized by light sources near the cameras in conjunction with infrared narrow-band filters in front of the cameras. All cameras are aligned as perpendicular to the floor as possible, with only one camera being used per surface. With this system, too, the measurement or calculation of the positions is carried out two-dimensionally. Only in the area of the goal does a volume tracking matrix calculate the height, i.e. three-dimensional values.
  • All devices and systems described are relatively complex and yet calculate almost exclusively two-dimensional data. For example, it is not possible to determine the position of a ball permanently during an entire football game without interfering with the players and the ball. In particular, it is not possible to determine the height of the ball or, for example, the height of a player's jumps above the ground. This information is essential for a complete and meaningful analysis of a sports game.
  • the object of the present invention is to provide a device and a method for real-time motion analysis for moving objects in a room, which allow the position of each object in the room to be determined in all three dimensions at any time.
  • the device should be as simple as possible to build and inexpensive to manufacture.
  • the generated data should enable a viewer to carry out a multitude of different analyses regarding the movements of the objects.
  • the device or the method must under no circumstances have a disruptive influence on the scenes or games to be observed. The installation of equipment affecting the players or other moving objects should be avoided.
  • the object is achieved by a real-time motion analysis device for moving objects in a room with - a central computer, - at least one image processing unit, - a large number of cameras which overlook the room in such a way that each position in the room is always overlooked by at least two cameras, wherein - each image processing unit is in data exchange with at least one camera and all cameras are each in data exchange with at least one image processing unit, - each image processing unit is at least indirectly in data exchange with the central computer, - the cameras each produce video images of the room which are transmitted to the image processing unit, whereby each object is recorded by at least two cameras from different angles, - the image processing unit calculates mathematical features from the video images, for example the center of gravity coordinates in the X-Y direction and the extent of the objects, extracts the data of the objects from the data of their surroundings and forwards the results to the central computer, and - the central computer calculates the three-dimensional position of the objects in the room from the data of intersecting video images transmitted by the image processing units, taking into account the calibrated positions of the cameras in the room.
  • space in the context of this invention is to be understood merely in relation to all three dimensions. In this respect, the space results from the area covered by the cameras and does not have to be delimited by walls, for example.
  • the device according to the invention and the method according to the invention enable, for the first time, the calculation of the three-dimensional positions of all objects within the space to be monitored. It is therefore possible to include the distance of all objects from the floor in the analyses. This in turn means that the device or the method according to the invention permits considerably more extensive analysis options. It is particularly advantageous in football games if, for example, the trajectory of high balls (crosses, high long passes) can be precisely determined. Sports in which this third dimension is decisive can also be analyzed; for example, the invention is also suitable for volleyball, badminton, tennis and similar sports.
  • Every object is permanently recorded by at least two cameras. This significantly reduces the likelihood of objects occluding one another. Depending on the number and arrangement of the cameras, occlusion may even become almost impossible. Only in extremely rare cases can already identified objects lose their identification because of a very high concentration of objects in a small space, so that the identification has to be reassigned manually.
  • the invention is essentially based on the fact that as many surveillance cameras as possible, in their entirety, survey the entire area to be monitored. From the known positions of the cameras, a "beam" (viewing ray) or several beams to the objects are calculated. The three-dimensional position of an object is obtained by intersecting at least two such beams; a minimal numerical sketch of this intersection is given below. Further algorithms for the reliable tracking and identification of the objects are then added later.
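  • Purely as an illustration and not part of the patent text, the following NumPy sketch shows one common way of intersecting two such viewing rays: since two measured rays rarely cross exactly, the midpoint of the shortest segment between them is taken as the three-dimensional object position. The camera positions, directions and function names are assumed example values.

        import numpy as np

        def triangulate_two_rays(c1, d1, c2, d2):
            # c1, c2: camera projection centers; d1, d2: viewing-ray directions
            d1 = d1 / np.linalg.norm(d1)
            d2 = d2 / np.linalg.norm(d2)
            w0 = c1 - c2
            a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
            d, e = d1 @ w0, d2 @ w0
            denom = a * c - b * b            # close to 0 means the rays are almost parallel
            t1 = (b * e - c * d) / denom
            t2 = (a * e - b * d) / denom
            p1, p2 = c1 + t1 * d1, c2 + t2 * d2
            # midpoint of the shortest connecting segment and its length (residual)
            return (p1 + p2) / 2.0, np.linalg.norm(p1 - p2)

        # Two assumed cameras looking at a ball near (30, 20, 1.5) meters
        cam1, cam2 = np.array([0.0, -40.0, 12.0]), np.array([60.0, -40.0, 12.0])
        pos, residual = triangulate_two_rays(cam1, np.array([30.0, 60.0, -10.5]),
                                             cam2, np.array([-30.0, 60.0, -10.5]))
        print(pos, residual)   # approximately [30. 20. 1.5] and 0.0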
  • the optical parameters of the objective and the camera are preferably measured in advance in a laboratory.
  • the calibration of the camera parameters requires as input data an assignment of three-dimensional point coordinates of a calibration body to their two-dimensional pixel coordinates in the camera image.
  • the cameras are thus initially arranged in such a way that the entirety of the room to be monitored can be viewed. It is necessary for every point in the room to be viewed by at least two cameras. The areas monitored by the cameras must therefore overlap in such a way that an object can be detected by a next camera before leaving the surveillance area of two cameras.
  • the cameras are then calibrated on the basis of known points within the room. This can be done, for example, using existing lines on a playing field or a movable calibration body. If a movable calibration body is used, its trajectories should cover as far as possible all areas of the camera images or of the area to be monitored. During the calibration, data on the position of all cameras are recorded.
  • the three-dimensional coordinates of its projection center within the previously defined world coordinate system are output as the position of a camera.
  • the orientation of a camera is output in the form of its Euler angles (pan, tilt and roll angle) with respect to the same coordinate system.
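  • The patent leaves the concrete calibration algorithm open; as a hedged sketch only, the pose of one camera could be estimated from such 3D-to-2D point assignments with OpenCV's solvePnP, yielding exactly the outputs mentioned above (projection center and pan/tilt/roll angles). The Z-Y-X angle convention and the parameter names are assumptions.

        import cv2
        import numpy as np

        def camera_pose(world_pts, image_pts, camera_matrix, dist_coeffs=None):
            # world_pts: Nx3 reference points (e.g. pitch line intersections, meters)
            # image_pts: Nx2 pixel coordinates of the same points (N >= 4)
            if dist_coeffs is None:
                dist_coeffs = np.zeros(5)
            ok, rvec, tvec = cv2.solvePnP(np.asarray(world_pts, dtype=np.float64),
                                          np.asarray(image_pts, dtype=np.float64),
                                          camera_matrix, dist_coeffs)
            R, _ = cv2.Rodrigues(rvec)            # rotation: world -> camera
            centre = (-R.T @ tvec).ravel()        # projection center in world coordinates
            M = R.T                               # camera -> world orientation
            pan  = np.degrees(np.arctan2(M[1, 0], M[0, 0]))
            tilt = np.degrees(np.arcsin(-M[2, 0]))
            roll = np.degrees(np.arctan2(M[2, 1], M[2, 2]))
            return centre, (pan, tilt, roll)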
  • the image coordinates of the objects or groups of objects to be tracked are then determined.
  • the input data available are, for example, the color and video signals from the camera sensor.
  • the objects to be tracked can be extracted by using an image processing method.
  • the user chooses a suitable method based on the environmental conditions, e.g. a keying process.
  • suitable methods are, for example, the chroma key method, the difference key method or so-called edge detection; a brief sketch of the first two is given below.
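  • As a small, non-binding sketch of the two keying variants (the HSV range standing in for the pitch color is an assumed example value, not taken from the patent):

        import cv2
        import numpy as np

        def chroma_key_mask(frame_bgr, lower_hsv=(35, 60, 40), upper_hsv=(85, 255, 255)):
            # pixels inside the assumed "pitch green" HSV range count as background
            hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
            background = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
            return cv2.bitwise_not(background)        # 255 = potential object pixel

        def difference_key_mask(frame_gray, empty_scene_gray, threshold=30):
            # difference against a reference image of the empty scene
            diff = cv2.absdiff(frame_gray, empty_scene_gray)
            _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
            return mask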
  • after selecting the appropriate method, the user sets the required process parameters for each camera.
  • the user also determines which image areas of the respective camera are excluded from further processing. Such an exclusion does not have to be done, but it can make sense depending on the environment. For example, grandstands, buildings, forest areas or the like can be hidden for the analysis.
  • the result of the respective setting is displayed to the user as a grayscale or binary image.
  • the binarized image obtained in this way is segmented in the next step, for example, into groups of connected object pixels, which are further processed as so-called "blobs".
  • Each "blob” represents the potential image of an object (eg the ball or player) on the playing field.
  • the mathematical features, for example the center of gravity coordinates (X, Y) and the extent of each "blob", are calculated from the video images. These data are transmitted from each image processing unit to the central computer; a sketch of this step follows below.
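  • A minimal sketch of this segmentation and feature step, assuming OpenCV's connected-components labelling; the field names of the result are illustrative only:

        import cv2

        def blob_features(binary_mask, min_area=20):
            n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary_mask)
            features = []
            for i in range(1, n):                          # label 0 is the background
                if stats[i, cv2.CC_STAT_AREA] < min_area:  # suppress small noise specks
                    continue
                features.append({
                    "cx": float(centroids[i][0]),          # center of gravity, X (pixels)
                    "cy": float(centroids[i][1]),          # center of gravity, Y (pixels)
                    "width":  int(stats[i, cv2.CC_STAT_WIDTH]),   # extent of the blob
                    "height": int(stats[i, cv2.CC_STAT_HEIGHT]),
                    "area":   int(stats[i, cv2.CC_STAT_AREA]),
                })
            return features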
  • the central computer calculates in real time the three-dimensional position of the objects with respect to the world coordinate system on the basis of the result data of the image processing unit(s). Depending on the accuracy or the resolution of the data, it is also possible to calculate the position of limbs, which in turn is not possible with any previously known method. Not only can directions of movement or movements of entire objects be recorded and analyzed, it is also possible to include limbs in the analysis.
  • the matching and triangulation methods contain various consistency checks that reduce the likelihood of incorrect assignments and thus incorrect object identifications. If necessary, smoothing and/or interpolation of the object positions can be added, as sketched below.
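  • The patent only states that smoothing and interpolation may be added; a simple moving-average smoother and a linear gap filler, given purely as an assumed example, could look roughly like this:

        import numpy as np

        def smooth_positions(track_xyz, window=5):
            # moving average over an (N, 3) array of X, Y, Z positions
            kernel = np.ones(window) / window
            return np.column_stack([np.convolve(track_xyz[:, k], kernel, mode="same")
                                    for k in range(track_xyz.shape[1])])

        def fill_gaps(track_xyz):
            # linearly interpolate frames where the position is missing (NaN)
            filled = track_xyz.copy()
            frames = np.arange(len(track_xyz))
            for k in range(track_xyz.shape[1]):
                valid = ~np.isnan(track_xyz[:, k])
                filled[:, k] = np.interp(frames, frames[valid], track_xyz[valid, k])
            return filled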
  • the X, Y and Z positions of the objects within the room are calculated and can be transferred directly to a database or the like, which can then provide appropriate interfaces to other applications, for example mobile radio, television, Internet, etc.
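  • The patent does not prescribe a particular database; as a stand-in illustration only, the positions could be written to a simple table such as the following SQLite sketch (schema and names are assumptions):

        import sqlite3

        def store_positions(db_path, rows):
            # rows: iterable of (frame, object_id, x, y, z) tuples, coordinates in meters
            con = sqlite3.connect(db_path)
            con.execute("""CREATE TABLE IF NOT EXISTS positions (
                               frame INTEGER, object_id INTEGER,
                               x REAL, y REAL, z REAL)""")
            con.executemany("INSERT INTO positions VALUES (?, ?, ?, ?, ?)", rows)
            con.commit()
            con.close()

        # store_positions("match.db", [(1, 7, 30.2, 19.8, 0.0), (1, 99, 30.5, 20.1, 1.4)])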
  • a semi-automatic or automatic assignment of names or designations to the objects follows. In this step, for example, the names of the players or their shirt numbers are assigned to the objects.
  • This assignment can take place automatically depending on the room or event to be monitored, namely if, for example, the positions of certain persons or objects in the recording are known. However, the assignment can also be made manually by a user if necessary.
  • each object usually needs to be identified only once; its unique identification number is then updated automatically. According to the invention, all identified and unidentified objects can be visualized by special measures, so that a user can recognize at first glance which objects are not identified. For example, a two-dimensional top-view graphic can be used. The user can use this representation to make a new or modified assignment at any time if necessary.
  • the object names are transmitted together with the object positions and identification numbers via a network to a suitable display system.
  • every object in the room automatically receives a unique identification number.
  • each identification number can be assigned a name, for example a player name. If the assignment is missing, the user receives visual feedback and can make a manual assignment.
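  • As an assumed minimal sketch of this bookkeeping (automatic identification numbers, optional names, and a list of still-unnamed objects for the visual feedback mentioned above); class and method names are not from the patent:

        from itertools import count

        class ObjectRegistry:
            def __init__(self):
                self._next_id = count(1)
                self.names = {}                      # identification number -> name

            def new_object(self):
                # every newly detected object automatically gets a unique number
                return next(self._next_id)

            def assign_name(self, object_id, name):
                self.names[object_id] = name         # e.g. a player's name and shirt number

            def unidentified(self, active_ids):
                # objects currently in the scene that still lack a name,
                # so the user interface can highlight them for manual assignment
                return [i for i in active_ids if i not in self.names]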
  • the identification numbers are sent via the same network that sends the position data to the corresponding display or "further processing system".
  • if an identification was lost during the monitoring period, it is possible to redefine it and assign it retrospectively. This means that if, for example, the loss of an assigned identification was not immediately apparent, it can be assigned manually later, whereupon the object is identified over the entire period when the recording is viewed again. Gaps in identification can therefore be closed retrospectively.
  • the data can be disseminated via all suitable media. It is particularly suitable for mobile communications, Internet systems or television, as well as the use of set-top boxes, digital television and so on, which prepare and display the data accordingly.
  • the invention is explained in connection with sporting events, but is not limited to them. Because almost all objects can be reliably identified at any time, the invention is also particularly suitable for monitoring security-relevant areas. For example, it is conceivable to equip buildings, banks, airports, intersections or publicly accessible places with such a system. Once identified, a person can then be reliably observed and tracked at any time.
  • the device according to the invention and the method according to the invention can also help, for example, to monitor air traffic.
  • the invention is thus not limited to the sports events explained by way of example, but is rather suitable for all areas of application in which objects in a room are to be monitored and analyzed in three dimensions in real time.
  • the method according to the invention and the device according to the invention further enable a three-dimensional recording of the scenery before the actual recording begins. All further recordings are then based, so to speak, on a topography map, which in turn enables significantly improved analysis options. Such a topography recording can be useful, for example, at golf events, airport monitoring or cross-country skiing events.
  • Fig. 2 shows a schematic diagram of the real-time motion analysis device according to the invention.
  • FIG. 1 shows an example of the arrangement of cameras 10 around a soccer field 12.
  • eight cameras 10 are used here, but the number is arbitrary; it is only required that all areas of the room to be monitored, here the soccer field 12, are visible to at least two cameras.
  • soccer players and a ball are shown on the soccer field 12 as examples of the objects 14 to be monitored. Their position is determined at all times, at every location, in all three dimensions.
  • FIG. 2 illustrates the basic structure of a device according to the invention.
  • This consists of the cameras 10, each of which is connected to at least one image processing unit 16.
  • each camera 10 can be connected to an image processing unit 16, but several cameras 10 can also use only one image processing unit 16 or vice versa.
  • the image processing units 16 are in data exchange with one another. This can be ensured, for example, via a corresponding wired network or also via any other suitable technology (e.g. radio network, WLAN).
  • the image processing units 16 are in turn connected to a central computer 18 or are in data exchange with the latter.
  • the central computer 18 can be arranged spatially separate from the image processing units 16.
  • the image processing units 16 can be installed near the stadium roof, while the central computer 18 is in a suitable room in the stadium or even several thousand kilometers away in a control room.
  • the image processing units 16 can also be arranged very far apart.
  • the use of radio-based networks enables a spatially independent arrangement. It should only be ensured that the connection for the data exchange between the components is sufficiently fast and secure.
  • the number of cameras used and the size of the space can also be chosen arbitrarily.
  • Essentially, video images are exchanged between the cameras 10 and the image processing units 16, while essentially only the data of the mathematical features of the video images and control data are exchanged between the image processing units 16 and the central computer 18 (see the sketch of such a message below).
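  • One possible (assumed) shape of such a compact per-frame message, reusing the blob_features() result sketched above; the field names are illustrative and not taken from the patent:

        import json

        def feature_message(camera_id, frame_no, blobs):
            # blobs: list of feature dictionaries, e.g. from blob_features() above
            return json.dumps({
                "camera": camera_id,
                "frame": frame_no,
                "blobs": blobs,          # centroids and extents only, no pixel data
            }).encode("utf-8")

        # Such a message is typically well under a kilobyte per camera and frame,
        # whereas a raw PAL video frame is on the order of a megabyte.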
  • the image processing units 16 control the connected cameras 10, for example they influence the aperture, exposure time, gain, color balance, etc. Furthermore, the image processing units 16 read in and process the video signals from the cameras 10.
  • Video cameras can be used as cameras 10, but higher resolution or improved scanning can be achieved by higher quality cameras 10 (improved scanning frequency, higher number of pixels, etc.). This in turn leads to an immediate improvement in the temporal and spatial resolution of the object recognition.
  • the type and properties of the image acquisition sensors can also be adapted to the respective application. For example, spectral sensitivity is also possible outside the visible range. Other required properties, such as weather resistance when used outdoors, can also be taken into account.
  • the central computer 18 is connected to the image processing units 16 or is in data exchange with them. In a second embodiment variant, the central computer 18 and the image processing units 16 can also be combined in one component.
  • the central computer 18 sends commands, for example, to the image processing units 16 or to the software programs installed on them.
  • the central computer 18 also receives the data from the image processing units 16.
  • the central computer 18 is formed by two computer units.
  • the first computer unit is used to calculate the object positions.
  • the other computer unit is used to check the results, to operate the user interface and to calibrate.
  • the recorded video images of the cameras 10 can also be displayed on these via a connected monitor.
  • the calculated object positions and their identification can be faded in or out in the respective video image. All calculations and presentations take place in real time.
  • the central computer 18 calculates the object position and identifies the objects and assigns them an identifier. These data are then passed on in real time, for example to a database 20, for further processing (calculation of running paths, accelerations, ball contacts, etc.).
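  • A simple example of such further processing, sketched here under the assumption of a constant frame rate and positions in meters, is the distance covered and the speed of an object:

        import numpy as np

        def distance_and_speed(track_xyz, fps=25.0):
            # track_xyz: (N, 3) array of X, Y, Z positions sampled at fps frames per second
            steps = np.diff(track_xyz, axis=0)          # displacement per frame
            step_len = np.linalg.norm(steps, axis=1)
            total_distance = float(step_len.sum())      # running-path length in meters
            speed = step_len * fps                      # speed in meters per second
            return total_distance, speed

        # accelerations could be derived analogously, e.g. np.diff(speed) * fps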
  • the database 20 is used to communicate with the respective applications 24 (mobile radio network, interactive television, Internet, etc.) and appropriate interfaces are made available.
  • At least two cameras 10 could also be equipped with an additional control for horizontal and vertical panning, if necessary also with a zoom lens. These cameras 10 could observe particularly interesting areas, for example the player carrying the ball. Since the imaged detail is larger, i.e. the observed space is smaller, the quality of the data increases considerably. This can improve the three-dimensional reconstruction of the players (physical representation of all limbs, etc.).
  • An input medium 22 can be connected to the central computer 18 in order to be able to use it to input necessary information, such as player names, etc.
  • objects that are not identified can, for example, also be optically highlighted on the monitor.
  • An essential advantage of the invention is that only relatively small amounts of data have to be passed on from the database 20 to the further applications 24 in order to enable adequate analyses. For example, game situations can be displayed in the form of graphics on a cell phone or PDA.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a device for the real-time motion analysis of objects (14) moving in a space. The device comprises a central computer (18), at least one image processing unit (16) and a plurality of cameras (10) covering the space in such a way that every position in the space is continuously covered by at least two cameras (10). The image processing units (16) are connected to one another, to the cameras (10) and to the central computer. The cameras (10) each produce video images of the space, each object (14) being filmed by at least two cameras (10) from different viewing angles. The image processing unit (16) calculates mathematical features from the video images, for example center-of-gravity coordinates in the X-Y direction and the extent of the objects (14), and transmits the results to the central computer (18). The latter calculates the three-dimensional position of the object (14) in the space from the data of intersecting video images, taking into account the calibrated positions of the cameras (10).
EP05715022A 2004-03-08 2005-02-25 Dispositif d'analyse de mouvement en temps reel Withdrawn EP1725986A1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE102004011561 2004-03-08
DE102004044002A DE102004044002A1 (de) 2004-03-08 2004-09-09 Echtzeit-Bewegungsanalysevorrichtung
PCT/DE2005/000318 WO2005088541A1 (fr) 2004-03-08 2005-02-25 Dispositif d'analyse de mouvement en temps reel

Publications (1)

Publication Number Publication Date
EP1725986A1 (fr) 2006-11-29

Family

ID=34961340

Family Applications (1)

Application Number Title Priority Date Filing Date
EP05715022A Withdrawn EP1725986A1 (fr) 2004-03-08 2005-02-25 Dispositif d'analyse de mouvement en temps reel

Country Status (2)

Country Link
EP (1) EP1725986A1 (fr)
WO (1) WO2005088541A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010035182A1 (fr) * 2008-09-26 2010-04-01 Koninklijke Philips Electronics N.V. Appareil d'aide à l'activité
ES2319087B2 (es) * 2008-11-03 2010-02-22 Universidad De Cordoba Sistema y metodo de captura, procesamiento y representacion de localizamiento tridimensional en tiempo real de una señal optica.
US11022972B2 (en) * 2019-07-31 2021-06-01 Bell Textron Inc. Navigation system with camera assist
CN114693576B (zh) * 2022-03-28 2023-07-18 浙江大学 一种实验动物行为学三维数据记录方法

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001273500A (ja) * 2000-03-23 2001-10-05 Hitachi Ltd 運動物体計測装置および球技分析システムおよびデータサービスシステム

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2005088541A1 *

Also Published As

Publication number Publication date
WO2005088541A1 (fr) 2005-09-22

Similar Documents

Publication Publication Date Title
EP1864153B1 (fr) Systeme de poursuite d'objet et d'analyse de situation
DE112005000929B4 (de) Automatisches Abbildungsverfahren und Vorrichtung
DE69728687T2 (de) Verfahren und Vorrichtung zur Hervorhebung von Signalteilen bei der Fernseh-Direktübertragung
EP2880853B1 (fr) Dispositif et procédé destinés à déterminer la situation d'une caméra de prise de vue
DE60213526T2 (de) Verfahren und System zur Verbesserung des Situationsbewusstseins von Kommando-Kontrolleinheiten
DE102012022005A1 (de) Verfahren, Vorrichtung und System
DE102006033147A1 (de) Überwachungskamera, Verfahren zur Kalibrierung der Überwachungskamera sowie Verwendung der Überwachungskamera
DE10308525A1 (de) Vermessungssystem
EP2553660B1 (fr) Procédé pour visualiser des zones d'activité accrue dans des scènes de surveillance
DE102019133756A1 (de) Systems and methods for generating a lighting design
DE10029463A1 (de) Auswerteeinheit und Verfahren zur Auswertung von statischen Zuständen und/oder Bewegungsabläufen
DE3802541C2 (fr)
WO2021083915A1 (fr) Procédé et dispositif de détection mobile pour détecter des éléments d'infrastructure d'un réseau de conduits souterrains
EP1725986A1 (fr) Dispositif d'analyse de mouvement en temps reel
DE102012022038A1 (de) Verfahren, Vorrichtung und Programm
WO2006032552A1 (fr) Commande d'une camera asservie
DE102004044002A1 (de) Echtzeit-Bewegungsanalysevorrichtung
EP3711392B1 (fr) Procédé et dispositif de détermination de position
DE112022002520T5 (de) Verfahren zur automatischen Kalibrierung von Kameras und Erstellung von Karten
EP2943934B1 (fr) Procédé d'enregistrement et de reproduction d'une suite d'événements
DE10039384A1 (de) Verfahren und Vorrichtung zur Bestimmung von Positionsdaten von beweglichen Objekten, zur Auswertung von Ereignissen und/oder zur Bestimmung der Relationen zwischen beweglichen Objekten
EP2940624B1 (fr) Modèle virtuel tridimensionnel d'un environnement pour applications destinées à la détermination de position
WO2020065021A1 (fr) Système de détection de ballon, et procédé permettant de détecter un événement avec un ballon
EP1434184B1 (fr) Commande d'un système multicaméra
WO2009068336A2 (fr) Module de traitement d'images pour l'estimation d'une position d'un objet à surveiller, procédé de détermination de la position d'un objet à surveiller et programme informatique

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20061009

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20080807

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20090418