
CN116740130A - Method for acquiring motion information, calibration method and device

Info

Publication number
CN116740130A
Authority
CN
China
Prior art keywords
acquisition
acquisition equipment
homography matrix
target
acquisition device
Prior art date
Legal status
Pending
Application number
CN202210209801.3A
Other languages
Chinese (zh)
Inventor
李明
曹世明
张利平
王波
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202210209801.3A
Priority to PCT/CN2023/078599 (WO2023165452A1)
Publication of CN116740130A

Classifications

    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 2207/10016: Video; image sequence (indexing scheme for image acquisition modality)


Abstract

A method for acquiring motion information, a calibration method, and an apparatus are provided that eliminate the need for athletes to wear wearable sensors, improving the convenience of acquiring motion information. Athletes are tracked by deploying a plurality of acquisition devices around the playing field and then relying on homography matrices between the cameras that are calibrated in advance. Because the athletes do not need to wear any equipment, convenience is improved. The application uses a movable calibration object for calibration, which removes the dependence on the field's inherent visual features/calibration points, so the method is applicable to a wider range of scenes, including scenes with larger fields. The application can solve the problem of calibrating large fields without using wide-field-of-view cameras, which helps improve the accuracy of target detection and tracking in large scenes.

Description

Method for acquiring motion information, calibration method and device
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and apparatus for acquiring motion information, and a calibration method.
Background
Sports-assisted training is a technology that, for sports scenes, uses a multi-camera video acquisition system to synchronously capture, store, and process raw data and analyze motion data in order to generate sports statistics. In current schemes, an athlete is required to wear a wearable sensor to collect athlete information and thereby obtain the athlete's motion information. Wearing a wearable sensor to obtain an athlete's motion information is not convenient, and in some situations, such as during a competition, the athlete's motion information cannot be collected at all.
Disclosure of Invention
The embodiment of the application provides a method for acquiring motion information, a calibration method, and an apparatus, which do not require the athlete to wear a wearable sensor and improve the convenience of acquiring motion information.
In a first aspect, an embodiment of the present application provides a method for acquiring motion information, including: acquiring single-frame images captured at the same moment by a plurality of acquisition devices deployed in a set space containing a playing field; and tracking a target athlete included in the single-frame images captured at the same moment by two adjacent acquisition devices according to a first homography matrix corresponding to the two adjacent acquisition devices, to obtain motion information of the target athlete. The two adjacent acquisition devices are any two of the plurality of acquisition devices that are adjacent in the set motion direction of the playing field. The first homography matrix is used to represent the mapping relationship between the position coordinates of the same object in the single-frame images captured by the two adjacent acquisition devices at the same moment, and is obtained by calibration using multi-frame images captured synchronously by the two adjacent acquisition devices at different moments, where the positions in the playing field of a first target calibration object are different in the images captured at different moments; the first target calibration object includes a plurality of calibration points.
At present, obtaining an athlete's motion information by having the athlete wear a wearable sensor is inconvenient. In the scheme provided by the embodiment of the application, a plurality of acquisition devices are deployed around the playing field, and athletes are tracked based on homography matrices between cameras that are calibrated in advance. The athletes do not need to wear any equipment, which improves convenience. Existing methods that calibrate cameras using the field's inherent visual features/calibration points (hereinafter referred to as the prior art) depend on the calibration points across the entire playing field being sufficient in number and suitably distributed; if the feature points are few or unevenly distributed, camera calibration accuracy decreases. In contrast, the application uses a movable calibration object for calibration, removing the dependence on the field's inherent visual features/calibration points, so it is applicable to a wider range of scenes, including scenes with larger fields. Moreover, because the visual features/calibration points are more numerous and more evenly distributed, calibration accuracy is higher. In addition, to make use of the field's inherent visual features/calibration points, the prior art often needs to enlarge the field of view to cover multiple calibration points, which makes the tracked target smaller in the image and reduces detection accuracy. The embodiment of the application can solve the problem of calibrating a large field without using wide-field-of-view cameras, which helps improve the accuracy of target detection and tracking in large scenes.
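As an informal illustration (not part of the claims), the following Python sketch shows how a position detected in one camera's image can be mapped into the adjacent camera's image plane using a pre-calibrated first homography matrix; the matrix values and the helper name map_to_adjacent_camera are assumptions made for the example only.

```python
import numpy as np
import cv2

def map_to_adjacent_camera(h1: np.ndarray, point_xy: tuple) -> tuple:
    """Map a pixel coordinate observed by one camera into the image plane of the
    adjacent camera using the pre-calibrated 3x3 first homography matrix H1."""
    src = np.array([[point_xy]], dtype=np.float64)   # shape (1, 1, 2)
    dst = cv2.perspectiveTransform(src, h1)
    return float(dst[0, 0, 0]), float(dst[0, 0, 1])

# Hypothetical pre-calibrated H1 and a detected foot point of the target athlete.
H1 = np.array([[1.02, 0.01, -350.0],
               [0.00, 1.01,   -4.0],
               [0.00, 0.00,    1.0]])
print(map_to_adjacent_camera(H1, (1520.0, 840.0)))
```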
In one possible design, the plurality of calibration points included in the first target calibration object are, in the different images, at the same distance from the plane in which the ground of the playing field lies.
In one possible design, the method further comprises: acquiring multi-frame images captured at different moments by a first acquisition device and a second acquisition device, respectively, during the movement of the first target calibration object in a set area of the playing field, where the first acquisition device and the second acquisition device are any two adjacent acquisition devices in the set motion direction of the playing field, and the set area includes at least the same area of the playing field that falls within the viewing-angle ranges of both of the two adjacent acquisition devices; acquiring first position coordinate information of the plurality of calibration points included in the first target calibration object in the multi-frame images captured by the first acquisition device, and acquiring second position coordinate information of the plurality of calibration points included in the first target calibration object in the multi-frame images captured by the second acquisition device; and determining the first homography matrix according to the first position coordinate information and the second position coordinate information.
In one possible design, the method further comprises: determining a third position coordinate of the target athlete in the coordinate system corresponding to the playing field according to a second homography matrix corresponding to a third acquisition device and the position coordinates of the target athlete in a single-frame image captured by the third acquisition device. The second homography matrix corresponding to the third acquisition device is used to describe the mapping relationship between the position coordinates of a target object in an image captured by the third acquisition device and the position coordinates of that target object in the coordinate system corresponding to the playing field; the third acquisition device is any one of the plurality of acquisition devices. This design provides an efficient and simple way of determining an athlete's position.
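A minimal sketch of this design, assuming H2 is the 3x3 second homography matrix that maps pixel coordinates of the third acquisition device to plane coordinates of the playing field; the function name and the choice of the athlete's foot point as the image coordinate are illustrative assumptions.

```python
import numpy as np

def image_to_field(h2: np.ndarray, u: float, v: float) -> tuple:
    """Apply the second homography matrix to an image point (u, v):
    [x, y, w]^T = H2 @ [u, v, 1]^T, then dehomogenize to obtain the
    position of the target athlete in the field coordinate system."""
    x, y, w = h2 @ np.array([u, v, 1.0])
    return x / w, y / w

# e.g. image_to_field(H2, 960.0, 700.0) for a foot point detected in the frame.
```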
In one possible design, the third acquisition device is a reference acquisition device, and the second homography matrix of the reference acquisition device is determined according to the position coordinates, in an image captured by the reference acquisition device, of a plurality of calibration points included in a second target calibration object and the position coordinates of those calibration points in the coordinate system corresponding to the playing field, where the second target calibration object is located within the viewfinder range of the reference acquisition device in the playing field; or
the third acquisition device is not a reference acquisition device and is adjacent to a fourth acquisition device in the set motion direction of the playing field, the second homography matrix corresponding to the third acquisition device is determined according to the second homography matrix corresponding to the fourth acquisition device and the first homography matrix between the fourth acquisition device and the third acquisition device, and the fourth acquisition device is a reference acquisition device; or
the third acquisition device is not a reference acquisition device and is separated from the fourth acquisition device by at least one acquisition device in the set motion direction of the playing field, and the second homography matrix corresponding to the third acquisition device is determined according to the second homography matrix corresponding to the fourth acquisition device and the first homography matrices corresponding to every two adjacent acquisition devices between the third acquisition device and the fourth acquisition device.
In the above scheme, the second homography matrix of each camera is determined by cascading the first homography matrices of adjacent cameras, so the second homography matrix does not need to be determined by direct calibration for every camera, which reduces both the calibration complexity and the calibration time.
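A minimal sketch of the cascade described above, assuming each first homography matrix maps pixel coordinates of one camera to those of the next camera toward the reference camera, and H2_ref maps the reference camera's pixels to the field coordinate system; the function name and ordering convention are assumptions for illustration.

```python
import numpy as np
from functools import reduce

def cascade_second_homography(h2_ref: np.ndarray, h1_chain: list) -> np.ndarray:
    """Compose the reference camera's second homography matrix with the first
    homography matrices of every pair of adjacent cameras between the reference
    camera and the target camera. h1_chain is ordered starting with the pair
    closest to the reference camera."""
    return reduce(lambda acc, h1: acc @ h1, h1_chain, h2_ref)

# Example: reference camera 5, target camera 3; H1_34 maps camera 3 pixels to
# camera 4, H1_45 maps camera 4 pixels to camera 5. Then
#   H2_cam3 = H2_cam5 @ H1_45 @ H1_34
# so that H2_cam3 applied to a camera-3 pixel yields field coordinates:
# H2_cam3 = cascade_second_homography(H2_cam5, [H1_45, H1_34])
```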
In one possible design, the viewing range of the reference acquisition device includes a set landmark reference point on the ground of the playing field;
the position coordinates of the plurality of calibration points included in the second target calibration object in the coordinate system corresponding to the playing field are determined according to the position coordinates of the set landmark reference point in that coordinate system, the relative position relationship between the second target calibration object and the set landmark reference point, and the topological parameters of the second target calibration object, where the topological parameters represent the relative position relationship and relative posture between the components included in the second target calibration object.
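As an illustrative sketch of this design (all names and numeric offsets are assumptions, and the relative posture/rotation of the calibration object is omitted for brevity), the field coordinates of the calibration points can be obtained by adding the calibration object's offset from the set landmark reference point and the per-point offsets given by the topological parameters:

```python
import numpy as np

def calibration_points_in_field(landmark_xy, object_offset_xy, topology_offsets_xy):
    """Field coordinates of the second target calibration object's calibration
    points: landmark reference point (known in the field coordinate system)
    + relative position of the calibration object with respect to the landmark
    + per-point offsets from the object's topological parameters."""
    origin = np.asarray(landmark_xy, dtype=float) + np.asarray(object_offset_xy, dtype=float)
    return [origin + np.asarray(off, dtype=float) for off in topology_offsets_xy]

# Example: landmark at (50.0, 10.0) m, object placed 2 m along the track from it,
# two calibration points 0.6 m apart on the object.
pts = calibration_points_in_field((50.0, 10.0), (2.0, 0.0), [(0.0, 0.0), (0.6, 0.0)])
```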
In a second aspect, an embodiment of the present application provides a calibration method, including: acquiring multi-frame images captured at different moments by a first acquisition device and a second acquisition device among a plurality of acquisition devices deployed in a set space of a playing field, during the movement of a first target calibration object in a set area of the playing field, where the first acquisition device and the second acquisition device are any two adjacent acquisition devices in the set motion direction of the playing field, and the set area includes at least the same area of the playing field that falls within the viewing-angle ranges of both of the two adjacent acquisition devices; acquiring first position coordinate information of a plurality of calibration points included in the first target calibration object in the multi-frame images captured by the first acquisition device, and acquiring second position coordinate information of the plurality of calibration points included in the first target calibration object in the multi-frame images captured by the second acquisition device; and determining the first homography matrix according to the first position coordinate information and the second position coordinate information.
Methods that calibrate cameras using the field's inherent visual features/calibration points (hereinafter referred to as the prior art) depend on the calibration points across the entire playing field being sufficient in number and suitably distributed; if the feature points are few or unevenly distributed, camera calibration accuracy decreases. In contrast, the application uses a movable calibration object for calibration, removing the dependence on the field's inherent visual features/calibration points, so it is applicable to a wider range of scenes, including scenes with larger fields. Moreover, because the visual features/calibration points are more numerous and more evenly distributed, calibration accuracy is higher. In addition, to make use of the field's inherent visual features/calibration points, the prior art often needs to enlarge the field of view to cover multiple calibration points, which makes the tracked target smaller in the image and reduces detection accuracy. The embodiment of the application can solve the problem of calibrating a large field without using wide-field-of-view cameras, which helps improve the accuracy of target detection and tracking in large scenes.
In one possible design, the method further comprises: acquiring a first image of a second target calibration object included in a view finding range acquired by third acquisition equipment, wherein the second target calibration object comprises a plurality of calibration points, and the third acquisition equipment is any one of the plurality of acquisition equipment; identifying the position coordinates of a plurality of calibration points included in the second target calibration object in the first image respectively; and determining the second homography matrix corresponding to the third acquisition device according to the position coordinates of a plurality of calibration points respectively included in the second target calibration object in the sports field corresponding coordinate system and the position coordinates of a plurality of calibration points respectively included in the second target calibration object in the first image.
In one possible design, the third acquisition device is adjacent to the fourth acquisition device in a set direction of motion of the playing field, the method further comprising: and determining a second homography matrix corresponding to the fourth acquisition equipment according to the second homography matrix corresponding to the third acquisition equipment and the first homography matrix between the third acquisition equipment and the fourth acquisition equipment.
In one possible design, the third acquisition device is spaced from the fifth acquisition device by at least one acquisition device in a set direction of motion of the playing field, the method further comprising: and determining a second homography matrix of the fifth acquisition equipment according to the second homography matrix corresponding to the third acquisition equipment and the first homography matrix corresponding to every two adjacent acquisition equipment between the third acquisition equipment and the fifth acquisition equipment.
In one possible design, the third acquisition device is a reference acquisition device, and the view finding range of the reference acquisition device includes a set landmark reference point on the ground of the sports ground; the position coordinates of the plurality of calibration points included in the second target calibration object in the corresponding coordinate system of the sports field are determined according to the position coordinates of the set landmark reference point in the corresponding coordinate system of the sports field, the relative position relation between the second target calibration object and the set landmark reference point and the topological parameters of the second target calibration object, and the topological parameters represent the relative position relation and the relative posture between the components included in the second target calibration object.
In a third aspect, an embodiment of the present application provides an apparatus for acquiring motion information, including: an acquisition unit, configured to acquire single-frame images captured at the same moment by a plurality of acquisition devices deployed in a set space containing a playing field; and a processing unit, configured to track a target athlete included in the single-frame images captured at the same moment by two adjacent acquisition devices according to the calibrated first homography matrix corresponding to the two adjacent acquisition devices, to obtain motion information of the target athlete. The two adjacent acquisition devices are any two of the plurality of acquisition devices that are adjacent in the set motion direction of the playing field. The first homography matrix is used to represent the mapping relationship between the position coordinates of the same object in the single-frame images captured by the two adjacent acquisition devices at the same moment, and is obtained by calibration using multi-frame images captured synchronously by the two adjacent acquisition devices at different moments, where the positions in the playing field of a first target calibration object are different in the images captured at different moments; the first target calibration object includes a plurality of calibration points.
In one possible design, the plurality of calibration points included in the first target calibration object are, in the different images, at the same distance from the plane in which the ground of the playing field lies.
In one possible design, the acquisition unit is further configured to acquire multi-frame images captured at different moments by a first acquisition device and a second acquisition device, respectively, during the movement of the first target calibration object in a set area of the playing field, where the first acquisition device and the second acquisition device are any two adjacent acquisition devices in the set motion direction of the playing field, and the set area includes at least the same area of the playing field that falls within the viewing-angle ranges of both of the two adjacent acquisition devices; and the processing unit is further configured to obtain first position coordinate information of the plurality of calibration points included in the first target calibration object in the multi-frame images captured by the first acquisition device, obtain second position coordinate information of the plurality of calibration points included in the first target calibration object in the multi-frame images captured by the second acquisition device, and determine the first homography matrix according to the first position coordinate information and the second position coordinate information.
In one possible design, the processing unit is further configured to:
determining a third position coordinate of the target player in a coordinate system corresponding to the sports field according to a second homography matrix corresponding to a third acquisition device and the position coordinate of the target player in a single-frame image shot by the third acquisition device;
the second homography matrix corresponding to the third acquisition equipment is used for describing the mapping relation between the position coordinates of the target object in the image, which are shot by the third acquisition equipment, and the position coordinates of the target object in the coordinate system corresponding to the sports ground; the third acquisition device is any one of the plurality of acquisition devices.
In one possible design, the third acquisition device is a reference acquisition device, and the second homography matrix of the reference acquisition device is determined according to the position coordinates, in an image captured by the reference acquisition device, of a plurality of calibration points included in a second target calibration object and the position coordinates of those calibration points in the coordinate system corresponding to the playing field, where the second target calibration object is located within the viewfinder range of the reference acquisition device in the playing field; or
the third acquisition device is not a reference acquisition device and is adjacent to a fourth acquisition device in the set motion direction of the playing field, the second homography matrix corresponding to the third acquisition device is determined according to the second homography matrix corresponding to the fourth acquisition device and the first homography matrix between the fourth acquisition device and the third acquisition device, and the fourth acquisition device is a reference acquisition device; or
the third acquisition device is not a reference acquisition device and is separated from the fourth acquisition device by at least one acquisition device in the set motion direction of the playing field, and the second homography matrix corresponding to the third acquisition device is determined according to the second homography matrix corresponding to the fourth acquisition device and the first homography matrices corresponding to every two adjacent acquisition devices between the third acquisition device and the fourth acquisition device.
In one possible design, the viewing range of the reference acquisition device includes a set landmark reference point on the ground of the playing field; the position coordinates of the plurality of calibration points included in the second target calibration object in the coordinate system corresponding to the playing field are determined according to the position coordinates of the set landmark reference point in that coordinate system, the relative position relationship between the second target calibration object and the set landmark reference point, and the topological parameters of the second target calibration object, where the topological parameters represent the relative position relationship and relative posture between the components included in the second target calibration object.
In a fourth aspect, an embodiment of the present application provides a calibration device, including: the acquisition unit is used for acquiring multi-frame images acquired by a first acquisition device and a second acquisition device in a plurality of acquisition devices deployed in a setting space of a sports ground at different moments in the process of the movement of a first target calibration object in a setting area of the sports ground, wherein the first acquisition device and the second acquisition device are any two adjacent acquisition devices in the setting movement direction of the sports ground, and the setting area at least comprises the same areas of the adjacent two acquisition devices in the sports ground corresponding to the range of view angles respectively;
the processing unit is configured to acquire first position coordinate information of a plurality of calibration points in the multi-frame images captured by the first acquisition device and second position coordinate information of the plurality of calibration points in the multi-frame images captured by the second acquisition device, and to determine the first homography matrix according to the first position coordinate information and the second position coordinate information.
In one possible design, the acquiring unit is further configured to acquire a first image of a second target calibration object included in a viewfinder range acquired by a third acquiring device, where the second target calibration object includes a plurality of calibration points, and the third acquiring device is any one of the plurality of acquiring devices;
The processing unit is further used for identifying position coordinates of a plurality of calibration points included in the second target calibration object in the first image respectively; and determining the second homography matrix corresponding to the third acquisition device according to the position coordinates of a plurality of calibration points respectively included in the second target calibration object in the sports field corresponding coordinate system and the position coordinates of a plurality of calibration points respectively included in the second target calibration object in the first image.
In one possible design, the third collecting device is adjacent to the fourth collecting device in a set motion direction of the sports field, and the processing unit is further configured to determine a second homography matrix corresponding to the fourth collecting device according to the second homography matrix corresponding to the third collecting device and a first homography matrix between the third collecting device and the fourth collecting device.
In one possible design, the third acquisition device is spaced from the fifth acquisition device by at least one acquisition device in a set direction of motion of the playing field, the processing unit being further configured to:
and determining a second homography matrix of the fifth acquisition equipment according to the second homography matrix corresponding to the third acquisition equipment and the first homography matrix corresponding to every two adjacent acquisition equipment between the third acquisition equipment and the fifth acquisition equipment.
In one possible design, the third acquisition device is a reference acquisition device, and the view finding range of the reference acquisition device includes a set landmark reference point on the ground of the sports ground;
the position coordinates of the plurality of calibration points included in the second target calibration object in the corresponding coordinate system of the sports field are determined according to the position coordinates of the set landmark reference point in the corresponding coordinate system of the sports field, the relative position relation between the second target calibration object and the set landmark reference point and the topological parameters of the second target calibration object, and the topological parameters represent the relative position relation and the relative posture between the components included in the second target calibration object.
In a fifth aspect, an embodiment of the present application provides an apparatus for acquiring motion information, including a memory and a processor. The memory is used for storing programs or instructions; the processor is configured to invoke the program or the instruction to perform the method of the first aspect or any of the designs of the first aspect.
In a sixth aspect, an embodiment of the present application provides a calibration device, including a memory and a processor. The memory is used for storing programs or instructions; the processor is configured to invoke the program or instructions to perform the method of the second aspect or any design of the second aspect.
In a seventh aspect, the present application provides a computer-readable storage medium having stored therein a computer program or instructions which, when executed by a processor, cause the processor to perform the method of the first aspect or any of the possible designs of the first aspect, or cause the processor to perform the method of the second aspect or any of the possible designs of the second aspect.
In an eighth aspect, the application provides a computer program product comprising a computer program or instructions which, when executed by a processor, performs the method of any possible implementation of the first aspect or of the first aspect, or performs the method of any possible implementation of the second aspect or of the second aspect.
For the technical effects achieved by any one of the third aspect to the eighth aspect, refer to the description of the beneficial effects of the first aspect or the second aspect; details are not repeated here.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings that are required to be used in the description of the embodiments will be briefly described below.
FIG. 1 is a schematic diagram of an information system architecture according to an embodiment of the present application;
FIG. 2 is a schematic diagram of another information system architecture according to an embodiment of the present application;
FIG. 3 is a schematic view of a camera deployment for a ring-shaped speed skating track according to an embodiment of the present application;
FIG. 4 is a schematic view of another camera deployment for a ring-shaped speed skating track according to an embodiment of the present application;
FIG. 5 is a schematic view of still another camera deployment for a ring-shaped speed skating track according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a camera deployment for an athletic track according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a possible first target calibration object according to an embodiment of the present application;
FIG. 8 is a flowchart of a calibration method of a first homography matrix according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a moving calibration object in a common area of adjacent cameras according to an embodiment of the present application;
FIG. 10 is a schematic diagram of feature point detection of a mobile calibration object according to an embodiment of the present application;
FIG. 11 is a schematic diagram of matching points of adjacent cameras according to an embodiment of the present application;
FIG. 12 is a schematic diagram of determining a homography matrix through matching points of adjacent cameras according to an embodiment of the present application;
FIG. 13 is a flowchart of a calibration method of a second homography matrix according to an embodiment of the present application;
FIG. 14 is a schematic diagram showing the relationship between a coordinate system and a calibration object of a field according to an embodiment of the present application;
FIG. 15 is a schematic diagram of mapping relation calibration between a coordinate system of a field and an image coordinate system according to an embodiment of the present application;
FIG. 16 is a schematic diagram of a motion information acquisition flow provided in an embodiment of the present application;
FIG. 17 is a schematic diagram of an ROI and mask provided in an embodiment of the present application;
FIG. 18 is a schematic diagram of single-camera human-body tracking at two moments according to an embodiment of the present application;
FIG. 19 is a schematic diagram of adjacent camera ID relay tracking according to an embodiment of the present application;
FIG. 20 is a schematic diagram of athlete's body detection and trajectory projection to a venue according to an embodiment of the present application;
FIG. 21 is a schematic structural diagram of an apparatus for acquiring motion information according to an embodiment of the present application;
FIG. 22 is a schematic structural diagram of a calibration device according to an embodiment of the present application;
FIG. 23 is a schematic diagram of a device structure according to an embodiment of the present application.
Detailed Description
The application provides a method and an apparatus for acquiring motion information, and a calibration method and apparatus. Acquisition devices are deployed in a set space containing a playing field; the mapping relationships between the coordinate systems of the acquisition devices, and between the acquisition devices and the playing field, are calibrated; and, according to the calibration results, motion information of athletes during training or competition on the playing field is acquired, such as motion trajectory, motion speed, number of steps, or motion distance. The motion information may be used to assist athlete training. The playing field may be a ring-shaped field such as a ring-shaped running track or a ring-shaped skating track. It may also be a straight field, or take another form such as a football pitch; the embodiments of the present application are not specifically limited in this respect.
Referring to fig. 1, a schematic diagram of an information system architecture according to an embodiment of the present application is shown. The information system comprises a plurality of acquisition devices and a data processing server; fig. 1 takes N acquisition devices as an example, where N is a positive integer. The number of cameras included in the information system may be configured according to the size of the playing field. An acquisition device may be a camera, a video camera, or the like. The plurality of acquisition devices may be deployed in the set space in which the playing field is located. For example, the playing field is a ring-shaped ice track located in a speed skating stadium, and the plurality of acquisition devices are deployed in the speed skating stadium. The viewing range of each of the plurality of acquisition devices covers a portion of the playing field. Different acquisition devices have different viewing ranges, and the viewing ranges of two acquisition devices that are spatially adjacent in the direction of motion share a common view area, that is, an area captured by both of the spatially adjacent acquisition devices.
The data processing server may comprise one or more servers; if it comprises a plurality of servers, it can be understood as a server cluster made up of multiple servers. The data processing server may be configured to extract synchronized frames from the video streams captured by the plurality of acquisition devices and then process the synchronized frames frame by frame to obtain motion information. In some scenarios, the data processing server may also perform the calibration process. The calibration process may include calibrating the first homography matrix and/or the second homography matrix. The first homography matrix is used to represent the mapping relationship between the position coordinates of the same object in single-frame images captured at the same moment by two adjacent acquisition devices. The second homography matrix is used to represent the mapping relationship between the position coordinates of a target object in an image captured by an acquisition device and the position coordinates of that target object in the coordinate system corresponding to the playing field. In some embodiments, the coordinate system corresponding to the playing field may be a spatial coordinate system created based on the playing field, with the origin at a certain position point in the playing field. In some embodiments, the coordinate system corresponding to the playing field may also be a spatial coordinate system established with another location as the origin, for example a fixed reference point of the venue; the present application is not specifically limited in this respect.
In some possible scenarios, the information system may also include one or more routing devices, which may be used to transmit the images captured by the acquisition devices to the data processing server. The routing devices may be routers, switches, and the like. Taking switches as an example, referring to fig. 2, multiple layers of switches may be deployed in the information system. Taking two layers as an example, the switches deployed in the first layer may each be connected to one or more acquisition devices, and a switch deployed in the second layer may serve as the main switch, with one end connected to the first-layer switches and the other end connected to the data processing server.
In other possible scenarios, the information system also supports obtaining motion analysis data from a mobile device. Illustratively, the information system further includes a mobile front end. For example, the mobile front end includes a web server. Referring to fig. 2, the web server is connected to the data processing server. The mobile front end may also include a wireless router (or wired router) and one or more terminal devices. A terminal device may be an electronic device that supports web access, such as a desktop computer, a portable computer, or a mobile phone. One or more terminal devices may operate the data processing server by accessing the web server, for example, sending a synchronized acquisition signal or a stop-recording signal to multiple acquisition devices. The synchronized acquisition signal is used to instruct the acquisition devices to synchronously start video recording. The stop-recording signal is used to instruct the acquisition devices to stop video recording. Other operations include, for example, playing back historical video, or querying and displaying motion information.
The calibration method provided by the embodiment of the application is described in detail below with reference to the embodiments. The acquisition devices are deployed in the set space to which the playing field belongs. In the following description, a camera is taken as an example of an acquisition device. When deploying cameras in the set space of the playing field, the permitted installation positions can be determined according to whether columns, trusses, suspended ceilings, or the like are available in the set space. When cameras are deployed in the set space to which the playing field belongs, each camera can cover a partial area of the entire track, such as a certain length of the track. Spatially adjacent cameras have a common view area, for example a common view area covering 1/2 or 1/3 of the image. A truss refers to a planar or spatial structure consisting of straight bars, typically with triangular elements, used for fixing the camera mounts.
As an example, consider deploying cameras along the ring-shaped speed skating track of a speed skating stadium. The track is 400 m long, and the athletes move counterclockwise on the track. Figs. 3-5 show three possible camera deployments. Referring to fig. 3 (a), 20 camera positions are deployed along the track as an example; a camera position refers to a camera placed at a particular location. Each camera position is located above the outside of the track and shoots the track obliquely downward from a high position. In fig. 3 (a), the cameras are mounted on columns. Fig. 3 (b) is a top view of the camera deployment. Fig. 3 (c) is a side view of a camera deployed on a column. The straight-section cameras are deployed on the extension line of the straight section, on the side of the curve, and shoot the athletes from the front. Each camera covers an area about 40 meters long, two spatially adjacent cameras have a 20-meter common view range, and 20 cameras in total cover the 400-meter track (5 cameras for each of the 2 straights and 5 cameras for each of the 2 curves). In some scenarios, after the cameras are mounted at the set positions, the focus, orientation, or field of view of each camera may be adjusted so that each camera focuses on a portion of the track and adjacent cameras share a common viewing area. The cameras are connected to two switches in groups: cameras 1-10 are connected to one switch, cameras 11-20 are connected to the other switch, and the video frames captured by cameras 1-20 are sent to the data processing server through the two switches.
Referring to fig. 4 (a), 20 camera positions are deployed along the track as an example, with the cameras deployed on ceiling trusses. Each camera position is located above the track and shoots the track downward from a high position. The camera lens axis forms an acute angle with the ground rather than being perpendicular to it, so as to cover a larger shooting range. Fig. 4 (b) is a top view of the camera deployment. Fig. 4 (c) is a side view of a camera deployed on a ceiling truss. In some scenarios, after the cameras are mounted at the set positions, the focus, orientation, or field of view of each camera may be adjusted so that each camera focuses on a portion of the track and adjacent cameras share a common viewing area. The cameras are connected to two switches in groups: cameras 1-10 are connected to one switch, cameras 11-20 are connected to the other switch, and the video frames captured by cameras 1-20 are sent to the data processing server through the two switches.
Referring to fig. 5 (a), 20 camera positions are deployed along the track as an example, with the cameras deployed on columns. The 20 camera positions are deployed along the track: cameras 1-5 shoot one straight section of the non-changeover area, cameras 6-10 and 16-20 shoot the two curves, and cameras 11-15 shoot the other straight section of the non-changeover area. Fig. 5 (b) shows a side view of cameras 1-5 deployed on columns. The cameras are connected to two switches in groups: cameras 1-10 are connected to one switch, cameras 11-20 are connected to the other switch, and the video frames captured by cameras 1-20 are sent to the data processing server through the two switches.
As another example, consider deploying cameras along an athletics track. The cameras can be deployed on columns, trusses, or suspended ceilings, or at set positions in the stands. For example, referring to FIG. 6, 20 camera positions are deployed along the track; fig. 6 takes cameras deployed at set positions in the stands as an example. Fig. 6 (a) is a schematic diagram of the track and camera arrangement. Each camera position is located above the stands outside the track and shoots the track obliquely downward from a high position. Fig. 6 (b) is a top view of the camera deployment. Fig. 6 (c) is a side view of a camera deployed in the stands. The straight-section cameras are deployed on the extension line of the straight section, on the side of the curve, and shoot the athletes from the front. In some scenarios, after the cameras are mounted at the set positions, the focus, orientation, or field of view of each camera may be adjusted so that each camera focuses on a portion of the track and adjacent cameras share a common viewing area. The cameras are connected to two switches in groups: cameras 1-10 are connected to one switch, cameras 11-20 are connected to the other switch, and the video frames captured by cameras 1-20 are sent to the data processing server through the two switches. By deploying cameras along an athletics track, athletes in events such as sprints, middle-distance running, and hurdles can be analyzed to obtain motion information such as motion trajectory, motion posture, and motion speed.
It should be noted that the above camera deployments are merely examples; the specific deployment may be determined in conjunction with the actual scenario, and the embodiment of the present application is not specifically limited in this respect. The number of deployed cameras, the grouping of the cameras, and the number of deployed switches are likewise not specifically limited.
After the cameras are deployed, the conversion relation between the image coordinate systems of two adjacent cameras needs to be calibrated, namely, the first homography matrix is calibrated. The conversion relation between the image coordinate system of the camera and the sports ground coordinate system can be further calibrated, namely the second homography matrix is calibrated.
The calibration scheme for the first homography matrix provided by the embodiment of the application is described first. When calibrating the first homography matrix, the embodiment of the application captures multiple frames of images while the calibration object is moving. For convenience of description, the calibration object used for calibrating the first homography matrix is referred to as the first target calibration object. The first target calibration object includes a plurality of calibration points. The first target calibration object may comprise one calibration object or a group of calibration objects, and each calibration object in the group includes at least one calibration point. The calibration points have stable visual characteristics that do not change over time. In some possible examples, the first target calibration object carries a specific pattern, and the intersection points of lines in the pattern may be used as calibration points. In other possible examples, the first target calibration object may carry a light-emitting screen, and the displayed light-emitting points are used as calibration points. Of course, the calibration points may be set on the calibration object in other ways, which is not specifically limited by the embodiments of the present application.
As an example, the first target calibration object carries a specific pattern. FIG. 7 is a schematic diagram of a possible first target calibration object. In fig. 7, the first target calibration object comprises a group of calibration objects, each of which may be a box frame. One surface of each box frame carries a specific pattern, and the specific patterns on different calibration objects differ; fig. 7 takes two-dimensional codes as an example. The corner points in the two-dimensional code may be selected as calibration points, or the two corner points at the lower edge of the box body, or the two lower corner points of the rectangle enclosing the two-dimensional code. In the embodiment of the application, the two lower corner points of the rectangle enclosing the two-dimensional code are taken as the calibration points.
Referring to fig. 8, a flowchart of a calibration method for the first homography matrix is shown. Fig. 8 takes the calibration of the first homography matrix between a first camera and a second camera as an example. The first camera and the second camera are any two spatially adjacent cameras in the motion direction of the playing field, such as camera 1 and camera 2, or camera 2 and camera 3, etc., in fig. 3. The method shown in fig. 8 may be performed by the data processing server, or by a processor or processor system in the data processing server.
801: acquiring multi-frame images captured at different moments by the first camera and the second camera, respectively, during the movement of the first target calibration object in a set area of the playing field. The set area includes at least the common view area, that is, the same area of the playing field that falls within the viewing-angle ranges of both adjacent cameras.
In some embodiments, the data processing server may send synchronized capture signals to multiple cameras in the information system. Therefore, the cameras synchronously shoot in the moving process of the first target calibration object to respectively obtain video streams, and the video streams are sent to the data processing server. In this embodiment, the first target calibration object is moved from a starting position to an end position of the playing field.
For example, in a ring-shaped field, the first target calibration object may move one full lap around the ring, so that each of the plurality of cameras can capture the first target calibration object during some period of its movement. The synchronized shooting signal is a signal used to trigger shooting by multiple cameras: in wired synchronous triggering, a periodic pulse signal (related to the shooting frame rate) is generally used; in a wireless system, a communication synchronization protocol is generally defined, and a specific instruction is transmitted to trigger periodic shooting.
As another example, when the number of cameras is relatively large or the field is relatively long, the playing field may be divided into segments along the direction of motion. For example, a 400-meter ring-shaped ice track is divided into 2 segments of 200 meters each, namely 1-200 meters and 200-400 meters. Two first target calibration objects can then be moved over the two segments of the track respectively, so that each first target calibration object only needs to move 200 meters.
In other embodiments, the data processing server may send synchronized capture signals to the two cameras currently calibrated. Therefore, the two cameras synchronously shoot in the moving process of the first target calibration object to respectively obtain video streams and send the video streams to the data processing server. In this embodiment, in the synchronous photographing period of the two cameras, the first target calibration object moves within the set area including the common view area of the two cameras.
For example, the first target calibration object is captured synchronously by the first camera and the second camera, and the first camera captures M frames of images containing the first target calibration object; this can be understood as the first camera capturing one image at each of M moments, obtaining M images. It should be understood that the first camera and the second camera capturing one frame synchronously means that the first camera and the second camera each capture a single-frame image at the same moment. It should be noted that, within the error range allowed by the calibration, a certain error is allowed in "the same moment", for example an error on the order of milliseconds.
The positions in the playing field of the first target calibration object differ among the M images captured by the first camera. It can be understood that each of the M images captured by the first camera includes the first target calibration object, and the positions of the first target calibration object in the images captured by the first camera at different moments are different. Likewise, each of the M images captured by the second camera includes the first target calibration object, and the positions of the first target calibration object in the images captured by the second camera at different moments are different. In a possible embodiment, the plurality of calibration points included in the first target calibration object are, in the different images captured by the first camera or the second camera, at the same distance from the plane in which the ground of the playing field lies. It can be understood that, when selecting the calibration points, a plurality of calibration points parallel to the ground may be selected. In some embodiments, the calibration surface of the first target calibration object that carries the specific image may be placed parallel to the ground. For example, the first target calibration object moves in the common viewing area of the first camera and the second camera for 10 seconds; assuming recording at 25 fps, the first camera and the second camera each capture 250 frames of images in the common viewing area. The first camera and the second camera send the captured images to the data processing server. For example, referring to fig. 9, the first camera and the second camera capture images of the first target calibration object at three different moments. In fig. 9, the rectangular area of view 1 represents an image captured by the first camera, and the rectangular area of view 2 represents an image captured by the second camera. The lines represent the lane dividing lines.
802: acquiring first position coordinate information of the plurality of calibration points included in the first target calibration object in the multi-frame images captured by the first camera, and acquiring second position coordinate information of the plurality of calibration points included in the first target calibration object in the multi-frame images captured by the second camera.
After receiving the video streams sent by the first camera and the second camera, the data processing server detects the feature points of the first target calibration object (i.e., the positions of the calibration points) in each frame of the video streams, for example, the corner points at the lower edge of the two-dimensional code. Fig. 10 (a) shows the detected feature points of the moving first target calibration object. Taking as an example that the distance between the calibration points and the playing-field ground is kept constant during the movement of the first target calibration object, the feature points accumulated over consecutive frames lie in a plane parallel to the ground of the playing field, as shown in fig. 10 (b). Keeping the distance between the calibration points and the ground constant during the movement can be understood as follows: if small changes in this distance hardly affect the calibration result, the changes can be ignored, and in that case the distance between the calibration points and the ground of the playing field can be considered constant. The data processing server then determines the position coordinates of the feature points corresponding to the same calibration point in the images captured at the same moment by the adjacent camera positions, forming a pair of matching points Pi and Pi', where Pi is a feature point in the image captured by the first camera and Pi' is a feature point in the image captured by the second camera, see fig. 11. It can be understood that the position coordinates, in the image coordinate system of the first camera, of all feature points in the images captured by the first camera form the first position coordinate information, and the position coordinates, in the image coordinate system of the second camera, of all feature points in the images captured by the second camera form the second position coordinate information.
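The pairing of feature points described above could look like the following Python sketch; detect_calibration_points is a hypothetical helper (its implementation depends on the chosen marker pattern) that returns the pixel coordinates of the first target calibration object's calibration points in a frame, keyed by calibration-point ID.

```python
import numpy as np

def detect_calibration_points(frame) -> dict:
    """Hypothetical detector: returns {point_id: (u, v)} for the calibration
    points of the first target calibration object visible in the frame,
    e.g. the two lower corner points of each two-dimensional code."""
    raise NotImplementedError  # marker detection is implementation-specific

def collect_matching_points(frames_cam1, frames_cam2):
    """Pair, frame by synchronized frame, the pixel coordinates of the same
    calibration point as seen by the two adjacent cameras (Pi and Pi')."""
    pts1, pts2 = [], []
    for f1, f2 in zip(frames_cam1, frames_cam2):          # same capture moment
        d1, d2 = detect_calibration_points(f1), detect_calibration_points(f2)
        for point_id in d1.keys() & d2.keys():            # visible in both views
            pts1.append(d1[point_id])                      # Pi  (first camera)
            pts2.append(d2[point_id])                      # Pi' (second camera)
    return np.asarray(pts1, dtype=np.float64), np.asarray(pts2, dtype=np.float64)
```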
803, determining the first homography matrix according to the first position coordinate information and the second position coordinate information.
The data processing server can calculate a first homography matrix H1 between the two adjacent camera positions from the position coordinates of the matched feature points in the first position coordinate information and the second position coordinate information. Referring to fig. 12, the calculated first homography matrix H1 minimizes the average projection error Error_proj of all feature points Pi' in the image acquired by the second camera when projected to the positions of the corresponding points Pi in the image acquired by the first camera. The determined first homography matrix H1 is then saved as the calibration result of the first camera and the second camera. Pi″ in fig. 12 represents the 2D point obtained by projecting Pi' of the second camera into the image coordinate system of the first camera through the first homography matrix H1, so that Error_proj = Pi − Pi″ = Pi − H1 × Pi'.
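A minimal sketch of this estimation step, assuming OpenCV and NumPy are available; the RANSAC threshold and the max_error_px tolerance are assumed, tunable values (the latter plays the role of the tolerable deviation value discussed below) and are not prescribed by the embodiment.

```python
import cv2
import numpy as np

def calibrate_first_homography(pts_cam2, pts_cam1, max_error_px=2.0):
    """Estimate H1 mapping points Pi' (second camera) onto Pi (first camera) and
    report the mean reprojection error Error_proj over all matched points."""
    src = np.asarray(pts_cam2, dtype=np.float64).reshape(-1, 1, 2)   # Pi'
    dst = np.asarray(pts_cam1, dtype=np.float64).reshape(-1, 1, 2)   # Pi
    H1, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    proj = cv2.perspectiveTransform(src, H1)                          # Pi'' = H1 * Pi'
    error_proj = float(np.mean(np.linalg.norm(proj - dst, axis=2)))
    meets_requirement = error_proj <= max_error_px                    # compare with a tolerable deviation
    return H1, error_proj, meets_requirement
```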
It should be noted that, ideally, the re-projection Pi″ of a feature point observed from one camera view should coincide exactly with the corresponding point Pi observed from the other view, but because of limits on imaging quality, re-projecting the same feature point across different camera views inevitably causes a pixel-level deviation, i.e. Error_proj is not 0. Factors such as noise, illumination intensity, and lens distortion affect imaging quality. In the calibration process, the calibration result can be compared with a tolerable deviation value to determine whether it meets the requirements. The tolerable deviation value may be determined empirically by the user, experimentally, or in other ways. In some embodiments, the accuracy and success rate of calibration may be increased by increasing the number of calibration points or improving their distribution.
Next, a calibration scheme for the second homography matrix provided by the embodiment of the present application is described. When the second homography matrix is calibrated, a calibration object is arranged in the set area, so that the second homography matrix corresponding to a camera is calibrated from images, captured by that camera, that include the calibration object. For convenience of description, the calibration object used for calibrating the second homography matrix is referred to as the second target calibration object. The second target calibration object comprises a plurality of calibration points. The second target calibration object may comprise one calibration object or a group of calibration objects. Each calibration object in the group includes at least one calibration point. A calibration point has a stable visual characteristic that does not change over time. In some possible examples, the second target calibration object has a specific pattern on it, and intersection points of lines in the pattern may be used as calibration points. In other possible examples, the second target calibration object may carry a luminescent screen, and the displayed luminescent points are used as calibration points. Of course, the calibration points may be set on the calibration object in other manners, which is not particularly limited in the embodiments of the present application.
In some embodiments, the second homography matrix may be calibrated for all cameras.
In other embodiments, the second homography matrix may be calibrated for some cameras, and the second homography matrix for other cameras that are not calibrated may be determined by the first homography matrix between adjacent cameras and the second homography matrix for the calibrated camera.
In the following description, calibration of the second homography matrix for a third camera is taken as an example. If the second homography matrix is calibrated for all cameras, the third camera may be any one of them. If not all cameras have their second homography matrix calibrated, the third camera may be understood as any one of the cameras for which the second homography matrix is calibrated.
Referring to fig. 13, a flowchart of a calibration method of a second homography matrix according to an embodiment of the present application is shown. The method may be performed by a data processing server or by a processor or processor system in a data processing server.
1301, acquiring a first image of a second target calibration object included in a view finding range acquired by a third camera, wherein the second target calibration object includes a plurality of calibration points. The third camera is any one of the plurality of cameras.
1302, identifying position coordinates of a plurality of calibration points included in the second target calibration object in the first image respectively.
1303, determining the second homography matrix corresponding to the third camera according to the position coordinates of a plurality of calibration points included in the second target calibration object in the sports field corresponding coordinate system and the position coordinates of a plurality of calibration points included in the second target calibration object in the first image.
It should be noted that, in the embodiment of the present application, the first target calibration object and the second target calibration object may be the same or different, which is not limited in particular.
The view finding range of the third camera includes a landmark reference point set on the ground of the sports field. A landmark reference point is a point on the sports field whose coordinates can be measured repeatedly and accurately, for example the intersection of the finish line and a lane line, the intersection of the start line and a lane line, or an artificially marked point. The position coordinates, in the coordinate system corresponding to the sports field, of the plurality of calibration points included in the second target calibration object are determined according to the position coordinates of the set landmark reference point in that coordinate system, the relative position relation between the second target calibration object and the set landmark reference point, and the topological parameters of the second target calibration object, where the topological parameters represent the relative position relation and the relative posture between the parts included in the second target calibration object.
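As a loose illustration of that computation, the sketch below places the calibration points in the field coordinate system by adding measured 2-D offsets to the landmark reference point. It assumes the calibration surface is parallel to the ground so that heights can be ignored, and it omits the relative-posture part of the topological parameters; all function and parameter names are illustrative.

```python
import numpy as np

def calibration_points_in_field(landmark_xy, offset_landmark_to_object, topology_offsets):
    """Field-plane coordinates Piw of the calibration points.

    landmark_xy               : measured field coordinates of the landmark reference point
                                (e.g. intersection of the finish line and a lane line)
    offset_landmark_to_object : measured offset from the landmark to a reference point
                                on the second target calibration object
    topology_offsets          : (n, 2) offsets of each calibration point relative to that
                                reference point, taken from the object's topological parameters
    """
    base = np.asarray(landmark_xy, dtype=float) + np.asarray(offset_landmark_to_object, dtype=float)
    return base + np.asarray(topology_offsets, dtype=float)           # (n, 2) coordinates Piw
```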
Illustratively, multiple groups of three-dimensional calibration objects are placed within the viewing range of the third camera, and each group may include multiple three-dimensional calibration objects. Each three-dimensional calibration object includes at least one calibration point. Referring to fig. 14, the calibration object shown in fig. 8 is taken as an example of a three-dimensional calibration object. The physical distances between the calibration objects can be measured to obtain the position coordinates of the respective calibration points in the coordinate system of the sports field. Fig. 14 (a) shows an example of the placement of the three-dimensional calibration objects. Fig. 14 (b) shows the plane coordinate system of the sports field and the position coordinates of the calibration objects.
The position coordinates, in the first image, of the plurality of calibration points included in the second target calibration object, as identified by the data processing server, are denoted by Pi. The position coordinates of the plurality of calibration points on the sports field are denoted by Piw.
In the embodiment of the application, the distances between a plurality of calibration points and the ground are the same, and the ground is flat. Thus, each calibration point has a height c from the ground.
Thus, based on the detected feature points Pi and the known calibration points Piw, i = 1, …, n, the second homography matrix can be determined. The second homography matrix is denoted by Hw, and Piw = Hw * Pi. For example, referring to fig. 15, the two points at the lower corners of the pattern on each calibration object are detected, and the position coordinates of each point in the image are acquired. The second homography matrix Hw is then calculated from the feature points Pi and the known calibration points Piw in the coordinate system of the sports field. In fig. 15, (a) shows an image captured by the third camera, and the black dots indicate the detected feature points Pi. Fig. 15 (b) shows the coordinate points Piw corresponding to Pi in the coordinate system of the sports field. It should be appreciated that the relative physical distances between the calibration objects and the sizes of the calibration objects remain the same in fig. 15 (b).
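A minimal sketch of computing Hw with OpenCV from the detected image points Pi and their known field coordinates Piw; the choice of a plain least-squares fit (rather than RANSAC) is an assumption made here because only a handful of well-measured calibration points are involved.

```python
import cv2
import numpy as np

def calibrate_second_homography(pts_image, pts_field):
    """Estimate Hw such that Piw = Hw * Pi, mapping pixel coordinates of the
    calibration points in the third camera's image to field-plane coordinates."""
    src = np.asarray(pts_image, dtype=np.float64).reshape(-1, 1, 2)   # Pi  (pixels)
    dst = np.asarray(pts_field, dtype=np.float64).reshape(-1, 1, 2)   # Piw (field units)
    Hw, _ = cv2.findHomography(src, dst)                              # least-squares fit
    return Hw

def image_to_field(Hw, pixel_xy):
    """Map a single pixel coordinate into the coordinate system of the sports field."""
    p = np.asarray(pixel_xy, dtype=np.float64).reshape(1, 1, 2)
    return cv2.perspectiveTransform(p, Hw).reshape(2)
```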
In some possible embodiments, when the second homography matrix is calibrated in the above manner for only some of the cameras, the second homography matrices of the remaining cameras may be determined from the first homography matrices between adjacent cameras and the second homography matrices of the calibrated cameras. In this case, a camera whose second homography matrix is calibrated in the above manner may be referred to as a reference camera. When selecting a reference camera, a camera that can capture a specific landmark point may be chosen, for example the intersection of the finish line and a lane line, or the intersection of the start line and a lane line.
For example, camera 1 is a reference camera, and the second homography matrix of camera 1 is H1w. Camera 1 is adjacent to camera 2, so the second homography matrix of camera 2 may be determined in cascade from the second homography matrix of camera 1 and the first homography matrix between camera 1 and camera 2. Denoting the second homography matrix of camera 2 by H2w, then H2w = H1w * H2,1, where H2,1 represents the first homography matrix mapping the camera coordinate system of camera 2 to the camera coordinate system of camera 1.
For example, if camera 1 is spaced from camera 3 by camera 2, the second homography matrix of camera 2 may be determined in cascade from the second homography matrix of camera 1 and the first homography matrix between camera 1 and camera 2, and the second homography matrix of camera 3 may be determined from the second homography matrix of camera 1, the first homography matrix between camera 1 and camera 2, and the first homography matrix between camera 2 and camera 3. Denoting the second homography matrix of camera 3 by H3w, then H3w = H1w * H2,1 * H3,2, where H2,1 represents the first homography matrix between camera 2 and camera 1 and H3,2 represents the first homography matrix between camera 3 and camera 2.
Further, suppose camera i is i-1 camera positions away from camera 1. Then Hiw = H1w * H2,1 * H3,2 * … * Hi,i-1, where Hi,i-1 represents the first homography matrix mapping the camera coordinate system of camera i to the camera coordinate system of camera i-1, and Hiw represents the second homography matrix of camera i. It should be appreciated that the first homography matrix mapping the camera coordinate system of camera i to that of camera i-1 and the first homography matrix mapping the camera coordinate system of camera i-1 to that of camera i are inverses of each other.
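The cascade itself is a chain of matrix products; the following sketch (NumPy assumed) reflects the relation Hiw = H1w * H2,1 * … * Hi,i-1 given above, with the reverse direction available through a matrix inverse.

```python
import numpy as np

def cascade_second_homography(H1w, first_homographies):
    """Compute Hiw = H1w * H2,1 * H3,2 * ... * Hi,i-1.

    first_homographies is the ordered list [H2,1, H3,2, ..., Hi,i-1], where Hk,k-1
    maps the camera coordinate system of camera k to that of camera k-1."""
    Hiw = np.asarray(H1w, dtype=float)
    for Hk in first_homographies:
        Hiw = Hiw @ np.asarray(Hk, dtype=float)
    return Hiw

# The reverse-direction first homography is the matrix inverse,
# e.g. H_{i-1,i} = np.linalg.inv(H_{i,i-1}).
```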
In some possible scenarios, there may be multiple reference cameras, i.e. the second homography matrix is calibrated in the above manner for each of several reference cameras. The other cameras whose second homography matrices are to be determined can then cascade from a nearby reference camera. In one example, referring to fig. 3, camera 19 and camera 3 are both reference cameras, and the second homography matrix of a camera between camera 3 and camera 19 may be determined by cascading from camera 19 or by cascading from camera 3. In another example, a camera between camera 3 and camera 19 may cascade from whichever of camera 19 and camera 3 is nearer. In yet another example, the cascade error may be determined from the cascade relationship and the errors made when calibrating the first homography matrices of adjacent cameras, and the reference camera giving the smaller cascade error is selected for cascading the second homography matrix. For example, suppose the error when calibrating the first homography matrix of camera 19 and camera 20 is A, the error for camera 20 and camera 1 is B, the error for camera 1 and camera 2 is C, and the error for camera 2 and camera 3 is D. For camera 1, if A×B is greater than C×D, the second homography matrix of camera 1 is determined by cascading the first homography matrix of camera 1 and camera 2, the first homography matrix of camera 2 and camera 3, and the second homography matrix of camera 3.
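A small sketch of the path-selection rule in the example above, where the per-link calibration errors along each candidate cascade path are accumulated by multiplication and the smaller product wins; the multiplicative accumulation follows the A×B versus C×D comparison in the text, and the error values themselves are assumed inputs.

```python
import numpy as np

def pick_cascade_path(errors_path_a, errors_path_b):
    """Return 'a' or 'b', whichever cascade path has the smaller accumulated error.

    Each argument lists the calibration errors of the first homographies along one
    path, e.g. [A, B] for camera 19 -> 20 -> 1 and [C, D] for camera 3 -> 2 -> 1."""
    acc_a = float(np.prod(errors_path_a))
    acc_b = float(np.prod(errors_path_b))
    return 'a' if acc_a <= acc_b else 'b'
```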
After determining the first homography matrix and the second homography matrix for each camera, the data processing server stores the first homography matrix of each two adjacent cameras and the second homography matrix corresponding to each camera for obtaining subsequent motion information.
The flow of the method for acquiring motion information is described in detail below with reference to specific embodiments. Referring to fig. 16, a flowchart of a method for acquiring motion information according to an embodiment of the present application is shown. The method may be performed by a data processing server or by a processor or processor system in a data processing server.
1601, a single frame image acquired at the same time by a plurality of cameras disposed in a setting space including a sports field is acquired.
1602, tracking the target athlete included in the single-frame images acquired by two adjacent cameras at the same time according to the first homography matrix corresponding to the two calibrated adjacent cameras, so as to obtain the motion information of the target athlete.
The two adjacent cameras are any two of the plurality of cameras and are adjacent in a set motion direction of the sports field; the first homography matrix is used for representing the mapping relation of the position coordinates of the same object in single-frame images acquired by two adjacent cameras at the same moment, and is obtained by calibrating multi-frame images shot by the two adjacent cameras at different moments synchronously, and the positions of a first target calibration object in the sports field in the images shot at different moments are different; the first target calibration object includes a plurality of calibration points.
For example, the data processing server may send a synchronous shooting signal to each camera, so that all cameras start shooting synchronously to produce synchronized video streams. Each camera sends its acquired video stream to the data processing server. The data processing server takes the synchronized frames of the plurality of cameras and processes them with a vision algorithm, for example performing human body detection on the moving regions in each image and tracking the target athletes, such as ID tracking.
In some embodiments, detection may also be performed on low-speed moving objects in order to exclude them from further processing. In some embodiments, whether an object is moving at low speed may be determined from whether, and by how much, the position of the object changes across successive video frames of the same camera. Low-speed moving objects include stationary objects. Eliminating low-speed moving objects avoids falsely detecting them as athletes and prevents interference from warm-up athletes or coaches.
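One way such a low-speed check could look, assuming per-track centroid positions from consecutive frames of one camera are available; the 2-pixel-per-frame threshold is an arbitrary, tunable assumption.

```python
import numpy as np

def is_low_speed(track_centroids, min_pixels_per_frame=2.0):
    """Flag a tracked object as low-speed (including stationary) when its centroid
    moves less than min_pixels_per_frame on average across consecutive frames."""
    c = np.asarray(track_centroids, dtype=float)            # (T, 2) pixel positions
    if len(c) < 2:
        return True
    step = np.linalg.norm(np.diff(c, axis=0), axis=1)       # per-frame displacement
    return float(step.mean()) < min_pixels_per_frame
```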
In some embodiments, interference regions may be removed with a camera image mask before the synchronized frames are processed. To avoid interference with detection from persons outside the track area, such as referees, coaches, and warm-up athletes, a masking approach may be used for each camera, marking regions of interest (ROI) and non-regions of interest (non-ROI) in the camera view. For example, referring to FIG. 17, the black region represents the non-ROI area outside the track.
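A sketch of how a per-camera mask could be applied, assuming the mask is a binary image in which the track area (ROI) is non-zero and each detection carries a pixel centroid; the dictionary layout of a detection is an assumption for this example.

```python
import numpy as np

def keep_roi_detections(detections, roi_mask):
    """Keep only detections whose centroid lies inside the ROI (track area)."""
    kept = []
    for det in detections:                                   # det: {'centroid': (x, y), ...}
        x, y = (int(round(v)) for v in det['centroid'])
        inside = 0 <= y < roi_mask.shape[0] and 0 <= x < roi_mask.shape[1]
        if inside and roi_mask[y, x]:
            kept.append(det)
    return kept
```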
Take ID tracking between two adjacent cameras, a first camera and a second camera, as an example. The first camera and the second camera are adjacent in the set direction of the playing field. After receiving the synchronous shooting signal from the data processing server, the first camera and the second camera each send their synchronously acquired video streams to the data processing server. The data processing server performs human body detection on the video streams of the first camera and the second camera respectively. In some embodiments, the human body detection may include human body contour detection, and may also include human skeleton detection, such as hip joint detection. For example, the centroid of the human body may be represented by the hip joint. Referring to fig. 17, human body detection is performed to determine the position of a human body detection frame and extract image features. For multiple frames of images continuously shot by the same camera, a target player in those frames can be determined from the extracted image features, such as human body contour and human body posture, and the target player can be given an ID mark. The specific detection methods for human body contour, human body posture and the like are not particularly limited in the embodiments of the application. For example, referring to fig. 18, the athlete tracking results of one camera are shown in two frames taken at time t and time t+1. When the target player is located in the common view area of the two cameras, the target player can be detected both in the image acquired by the first camera at a certain moment and in the image acquired by the second camera at the same moment. In order to ensure consistency of player ID tracking over a player's full run, ID tracking, or ID inheritance, may be performed based on the matching relationship between the player detected in the image acquired by the first camera at a certain moment and the player detected in the image acquired by the second camera at that moment.
As an example, see fig. 19 (a): view 1 shows an image acquired by the first camera at time t, and view 2 shows an image acquired by the second camera at time t. When performing ID tracking between adjacent cameras, the data processing server may perform human body detection on view 1 to obtain the human body frame and the centroid point of the human body. In the movement direction of the sports field, the target player passes through the field of view of the first camera and then enters the field of view of the second camera. After the ID of the target player is determined from the image captured by the first camera, for example, see fig. 19 (b), the target player in view 1 has ID = 1, and its centroid point is A1. The data processing server detects that the human body frame of the target athlete in view 2 is BB2, and maps the centroid point of the athlete detected in view 1 to position coordinate A1' in view 2 according to the first homography matrix H12 between the camera coordinate system of the first camera and the camera coordinate system of the second camera. If A1' is located inside BB2, it can be determined that the athlete detected in view 2 is the same athlete as the athlete detected in view 1, and the ID of the athlete detected in view 2 can be assigned as 1. If A1' is outside BB2, it can be determined that the athlete detected in view 2 is not the same athlete as the athlete detected in view 1, i.e., the athlete in human body frame BB2 is another athlete.
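The following sketch mirrors this ID-inheritance check: map the centroid A1 from view 1 into view 2 with H12 and test whether the mapped point A1' falls inside a detection box in view 2. OpenCV is assumed, and the detection dictionary layout is illustrative rather than prescribed.

```python
import cv2
import numpy as np

def inherit_id(H12, centroid_view1, detections_view2):
    """Return the view-2 detection matching the view-1 athlete, or None.

    H12 maps the camera coordinate system of the first camera to that of the second
    camera; detections_view2 entries look like {'bbox': (x1, y1, x2, y2), 'id': ...}."""
    p = np.asarray(centroid_view1, dtype=np.float64).reshape(1, 1, 2)
    a1_mapped = cv2.perspectiveTransform(p, H12).reshape(2)           # A1'
    for det in detections_view2:
        x1, y1, x2, y2 = det['bbox']
        if x1 <= a1_mapped[0] <= x2 and y1 <= a1_mapped[1] <= y2:
            return det                                                # same athlete: inherit the ID
    return None                                                       # A1' outside every box
```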
In one possible implementation, after ID tracking is performed on the images acquired by the plurality of cameras, the position coordinates of the target player in the coordinate system corresponding to the sports field can be determined from the determined second homography matrix of each camera and the position coordinates of the target player in the images shot by that camera, and the position coordinates of the target player captured by the plurality of cameras are connected to form the motion trail of the target player. For example, the centroid of the target player may be selected as its position coordinate; mapping the centroid of the target player included in the image captured by each camera to position coordinates in the coordinate system corresponding to the sports field yields the motion trail of the target player, as shown in fig. 20.
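A sketch of assembling the motion trail, assuming a time-ordered list of (camera id, centroid) observations for one tracked ID and a dictionary of calibrated second homography matrices; OpenCV is used for the projective mapping.

```python
import cv2
import numpy as np

def build_trajectory(observations, second_homographies):
    """Map each observed centroid into the sports-field coordinate system and
    concatenate the results into the athlete's motion trail."""
    track = []
    for cam_id, centroid in observations:                    # time-ordered observations
        Hw = second_homographies[cam_id]                     # second homography of that camera
        p = np.asarray(centroid, dtype=np.float64).reshape(1, 1, 2)
        track.append(cv2.perspectiveTransform(p, Hw).reshape(2))
    return np.asarray(track)                                 # (T, 2) field-plane positions
```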
In one possible implementation, the number of steps taken by the athlete can also be calculated with the scheme provided by the embodiment of the application, for example from the movement of the player's leg joints and the change in joint angles, or by counting steps according to the change in distance between the two knee joints in the acquired images. The athlete's speed can also be calculated with the scheme provided by the embodiment of the application. For example, after the centroid of the target athlete in the images collected by each camera is mapped to position coordinates in the coordinate system corresponding to the sports field, the movement distance of the athlete can be calculated from those position coordinates and the movement duration can be calculated from the acquisition times of the images, so as to obtain the movement speed of the athlete. The embodiment of the application can also generate a motion video of the target athlete from the ID tracking result. After the data processing server calculates the athlete's motion information or generates the motion video, the motion information may be sent to the user's terminal device for display. In some embodiments, the data processing server may further analyze the motion data, generate an analysis result, and send the analysis result to the terminal device for display. The display form of the analysis result is not particularly limited in the embodiments of the application.
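For the speed calculation, a minimal sketch: path length in the field coordinate system divided by the elapsed acquisition time of the corresponding frames. The field coordinates are assumed to be in metres and the timestamps in seconds.

```python
import numpy as np

def average_speed(field_track, frame_times):
    """Average movement speed of the athlete along the mapped trajectory."""
    pts = np.asarray(field_track, dtype=float)               # (T, 2) field coordinates
    t = np.asarray(frame_times, dtype=float)                 # (T,) acquisition times
    distance = float(np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1)))
    duration = float(t[-1] - t[0])
    return distance / duration if duration > 0 else 0.0
```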
It will be appreciated that, in order to implement the functions of the above-described method embodiments, the data processing server includes corresponding hardware structures and/or software modules that perform the respective functions. Those of skill in the art will readily appreciate that the various illustrative modules and method steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application scenario and design constraints imposed on the solution.
As an example, referring to fig. 21, a schematic structural diagram of a motion information acquiring device according to an embodiment of the present application is shown. The apparatus may be applied to a data processing server. The apparatus includes an acquisition unit 2101 and a processing unit 2102. An acquisition unit 2101, configured to acquire a single frame image acquired at the same time by a plurality of acquisition devices disposed in a setting space including a sports field, respectively; the processing unit 2102 is used for tracking target athletes included in single-frame images acquired by two adjacent acquisition devices at the same time according to a first homography matrix corresponding to the two adjacent acquisition devices so as to obtain the motion information of the target athletes; the two adjacent collecting devices are any two of the plurality of collecting devices and are adjacent in the set motion direction of the sports ground; the first homography matrix is used for representing the mapping relation of the position coordinates of the same object in single-frame images acquired by two adjacent acquisition devices at the same moment, and is obtained by calibrating multi-frame images shot by the two adjacent acquisition devices at different moments synchronously, and the positions of a first target calibration object in the sports field in the images shot at different moments are different; the first target calibration object includes a plurality of calibration points.
In one possible implementation, the first target calibration object comprises a plurality of calibration points in different images at the same distance from a plane on which the ground of the playing field lies.
In a possible implementation manner, the acquiring unit 2101 is further configured to acquire multi-frame images acquired by a first acquiring device and a second acquiring device at different moments in a motion process of the first target calibration object in a set area of a sports field, where the first acquiring device and the second acquiring device are any two adjacent acquiring devices in a set motion direction of the sports field, and the set area at least includes the same area in the sports field respectively corresponding to the view angle ranges of the two adjacent acquiring devices; the processing unit 2102 is further configured to obtain first position coordinate information of a plurality of calibration points included in the first target calibration object in the multi-frame image acquired by the first acquisition device, and obtain second position coordinate information of a plurality of calibration points included in the first target calibration object in the multi-frame image acquired by the second acquisition device; and determine the first homography matrix according to the first position coordinate information and the second position coordinate information.
In one possible implementation, the processing unit 2102 is further configured to:
determining a third position coordinate of the target player in a coordinate system corresponding to the sports field according to a second homography matrix corresponding to a third acquisition device and the position coordinate of the target player in a single-frame image shot by the third acquisition device;
the second homography matrix corresponding to the third acquisition equipment is used for describing the mapping relation between the position coordinates of the target object in the image, which are shot by the third acquisition equipment, and the position coordinates of the target object in the coordinate system corresponding to the sports ground; the third acquisition device is any one of the plurality of acquisition devices.
In one possible implementation manner, the third acquisition device is a reference acquisition device, and the second homography matrix of the reference acquisition device is determined according to position coordinates of a second target calibration object including a plurality of calibration points in an image shot by the reference acquisition device and position coordinates of a plurality of calibration points included by the second target calibration object in a coordinate system corresponding to the sports field, where the second target calibration object is located in a view finding range of the reference acquisition device in the sports field; or alternatively, the process may be performed,
The third acquisition equipment is not reference acquisition equipment, the third acquisition equipment is adjacent to fourth acquisition equipment in the set motion direction of the sports ground, the second homography matrix corresponding to the third acquisition equipment is determined according to the second homography matrix corresponding to the fourth acquisition equipment and the first homography matrix between the fourth acquisition equipment and the third acquisition equipment, and the fourth acquisition equipment is reference acquisition equipment; or alternatively, the process may be performed,
the third acquisition equipment is not reference acquisition equipment and is spaced from the fourth acquisition equipment by at least one acquisition equipment in the set motion direction of the sports ground, and the second homography matrix corresponding to the third acquisition equipment is determined according to the second homography matrix corresponding to the fourth acquisition equipment and the first homography matrix corresponding to every two adjacent acquisition equipment between the third acquisition equipment and the fourth acquisition equipment.
In one possible implementation, the reference acquisition device includes a set landmark reference point on the ground of the playing field within a viewing range of the reference acquisition device; the position coordinates of the plurality of calibration points included in the second target calibration object in the corresponding coordinate system of the sports field are determined according to the position coordinates of the set landmark reference point in the corresponding coordinate system of the sports field, the relative position relation between the second target calibration object and the set landmark reference point and the topological parameters of the second target calibration object, and the topological parameters represent the relative position relation and the relative posture between the components included in the second target calibration object.
As another example, referring to fig. 22, a schematic structural diagram of a calibration device according to an embodiment of the present application is shown. The apparatus may be applied to a data processing server. The apparatus includes an acquisition unit 2201 and a processing unit 2202.
An acquiring unit 2201, configured to acquire multi-frame images acquired by a first acquiring device and a second acquiring device in a plurality of acquiring devices deployed in a setting space including a sports field at different moments in the movement process of the first target calibration object in a setting area of the sports field, where the first acquiring device and the second acquiring device are any two adjacent acquiring devices in a setting movement direction of the sports field, and the setting area at least includes the same area in the sports field respectively corresponding to the viewing angle ranges of the two adjacent acquiring devices;
a processing unit 2202, configured to obtain first position coordinate information of a plurality of calibration points included in the first target calibration object in the multi-frame image acquired by the first acquisition device, and obtain second position coordinate information of a plurality of calibration points included in the first target calibration object in the multi-frame image acquired by the second acquisition device, respectively; and determine the first homography matrix according to the first position coordinate information and the second position coordinate information.
In a possible implementation manner, the obtaining unit 2201 is further configured to obtain a first image of a second target calibration object included in a viewfinder range acquired by a third acquisition device, where the second target calibration object includes a plurality of calibration points, and the third acquisition device is any one of the plurality of acquisition devices;
the processing unit 2202 is further configured to identify position coordinates of a plurality of calibration points included in the second target calibration object in the first image, respectively; and determining the second homography matrix corresponding to the third acquisition device according to the position coordinates of a plurality of calibration points respectively included in the second target calibration object in the sports field corresponding coordinate system and the position coordinates of a plurality of calibration points respectively included in the second target calibration object in the first image.
In a possible implementation manner, the third collecting device is adjacent to the fourth collecting device in a set motion direction of the sports field, and the processing unit 2202 is further configured to determine a second homography matrix corresponding to the fourth collecting device according to the second homography matrix corresponding to the third collecting device and a first homography matrix between the third collecting device and the fourth collecting device.
In a possible implementation, the third acquisition device is spaced from the fifth acquisition device by at least one acquisition device in a set movement direction of the playing field, and the processing unit 2202 is further configured to:
and determining a second homography matrix of the fifth acquisition equipment according to the second homography matrix corresponding to the third acquisition equipment and the first homography matrix corresponding to every two adjacent acquisition equipment between the third acquisition equipment and the fifth acquisition equipment.
In one possible implementation manner, the third acquisition device is a reference acquisition device, and a framing range of the reference acquisition device includes a set landmark reference point located on the ground of the sports ground;
the position coordinates of the plurality of calibration points included in the second target calibration object in the corresponding coordinate system of the sports field are determined according to the position coordinates of the set landmark reference point in the corresponding coordinate system of the sports field, the relative position relation between the second target calibration object and the set landmark reference point and the topological parameters of the second target calibration object, and the topological parameters represent the relative position relation and the relative posture between the components included in the second target calibration object.
The division of the units in the embodiments of the present application is schematic and is merely a logical function division; in actual implementation there may be other division manners. In addition, the functional units in the embodiments of the present application may be integrated in one processor, may exist separately and physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or as software functional units. One or more of the units in fig. 21 and fig. 22 may be implemented in software, hardware, firmware, or a combination thereof. The software or firmware includes, but is not limited to, computer program instructions or code and may be executed by a hardware processor. The hardware includes, but is not limited to, various types of integrated circuits such as a central processing unit (CPU), a digital signal processor (DSP), a field programmable gate array (FPGA), or an application specific integrated circuit (ASIC).
Based on the above embodiments and the same concept, the embodiments of the present application further provide a device, which is used to implement the method for acquiring or calibrating the motion information provided by the embodiments of the present application. As shown in fig. 23, the apparatus may include: one or more processors 2301, a memory 2302, and one or more computer programs (not shown). As one implementation, the devices described above may be coupled by one or more communication lines 2303. Wherein the memory 2302 has stored therein one or more computer programs, the one or more computer programs comprising instructions; the processor 2301 invokes the instructions stored in the memory 2302 to cause the device to perform the method for acquiring or calibrating motion information provided by the embodiment of the application.
In the embodiment of the present application, the processor may be a general purpose processor, a digital signal processor, an application specific integrated circuit, a field programmable gate array or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or execute the methods, steps and logic blocks disclosed in the embodiments of the present application. The general purpose processor may be a microprocessor or any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in the processor for execution.
In embodiments of the present application, the memory may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be random access memory (RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM). It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory. The memory in the embodiments of the present application may also be a circuit or any other device capable of implementing a storage function.
As an implementation, the apparatus may further include a communication interface 2304 for communicating with other apparatuses via a transmission medium, for example, through the communication interface 2304, to communicate with the acquisition device, so as to receive the image frames acquired by the acquisition device. In an embodiment of the application, communication interface 2304 may be a transceiver, a circuit, a bus, a module, or other type of communication interface. In an embodiment of the present application, when the communication interface 2304 is a transceiver, the transceiver may include a separate receiver and a separate transmitter; a transceiver or interface circuit integrating the transceiver function is also possible.
In some embodiments of the present application, the processor 2301, the memory 2302, and the communication interface 2304 may be connected to each other by a communication line 2303; the communication line 2303 may be a peripheral component interconnect standard (Peripheral Component Interconnect, PCI) bus, an extended industry standard architecture (Extended Industry Standard Architecture, EISA) bus, or the like. The communication line 2303 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in fig. 23, but this does not mean that there is only one bus or only one type of bus.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the above-described communication system may refer to the corresponding process in the foregoing method embodiment, which is not repeated herein.
An embodiment of the application provides a computer readable medium for storing a computer program comprising instructions for performing the method steps in the corresponding method embodiment of fig. 4.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (25)

1. A method for acquiring motion information, comprising:
acquiring single-frame images acquired at the same time by a plurality of acquisition devices deployed in a set space comprising a sports ground;
tracking target athletes included in single-frame images acquired by two adjacent acquisition devices at the same time according to first homography matrixes corresponding to the two adjacent acquisition devices to obtain motion information of the target athletes; the two adjacent collecting devices are any two of the plurality of collecting devices and are adjacent in the set motion direction of the sports ground;
the first homography matrix is used for representing the mapping relation of the position coordinates of the same object in single-frame images acquired by two adjacent acquisition devices at the same moment, and is obtained by calibrating multi-frame images shot by the two adjacent acquisition devices at different moments synchronously, and the positions of a first target calibration object in the sports field in the images shot at different moments are different; the first target calibration object includes a plurality of calibration points.
2. The method of claim 1, wherein the first target calibration object comprises a plurality of calibration points in different images at the same distance from a plane on which the surface of the playing field lies.
3. The method of claim 1 or 2, wherein the method further comprises:
acquiring multi-frame images acquired by a first acquisition device and a second acquisition device at different moments in the motion process of a first target calibration object in a set region of a sports ground respectively, wherein the first acquisition device and the second acquisition device are any two adjacent acquisition devices in the set motion direction of the sports ground, and the set region at least comprises the same region of the sports ground, which is corresponding to the two adjacent acquisition devices in a visual angle range respectively;
acquiring first position coordinate information of a plurality of calibration points respectively in a multi-frame image acquired by the first acquisition equipment, which are included in the first target calibration object, and acquiring second position coordinate information of a plurality of calibration points respectively in a multi-frame image acquired by the second acquisition equipment, which are included in the first target calibration object;
and determining the first homography matrix according to the first position coordinate information and the second position coordinate information.
4. A method according to any one of claims 1-3, wherein the method further comprises:
determining a third position coordinate of the target player in a coordinate system corresponding to the sports field according to a second homography matrix corresponding to a third acquisition device and the position coordinate of the target player in a single-frame image shot by the third acquisition device;
the second homography matrix corresponding to the third acquisition equipment is used for describing the mapping relation between the position coordinates of the target object in the image, which are shot by the third acquisition equipment, and the position coordinates of the target object in the coordinate system corresponding to the sports ground; the third acquisition device is any one of the plurality of acquisition devices.
5. The method of claim 4, wherein the third acquisition device is a reference acquisition device, and the second homography matrix of the reference acquisition device is determined according to position coordinates of a second target calibration object including a plurality of calibration points in an image captured by the reference acquisition device and position coordinates of a plurality of calibration points included in the second target calibration object in a coordinate system corresponding to the sports field, wherein the second target calibration object is located in a view-finding range of the reference acquisition device in the sports field; or alternatively, the process may be performed,
The third acquisition equipment is not reference acquisition equipment, the third acquisition equipment is adjacent to fourth acquisition equipment in the set motion direction of the sports ground, the second homography matrix corresponding to the third acquisition equipment is determined according to the second homography matrix corresponding to the fourth acquisition equipment and the first homography matrix between the fourth acquisition equipment and the third acquisition equipment, and the fourth acquisition equipment is reference acquisition equipment; or alternatively, the process may be performed,
the third acquisition equipment is not reference acquisition equipment and is spaced from the fourth acquisition equipment by at least one acquisition equipment in the set motion direction of the sports ground, and the second homography matrix corresponding to the third acquisition equipment is determined according to the second homography matrix corresponding to the fourth acquisition equipment and the first homography matrix corresponding to every two adjacent acquisition equipment between the third acquisition equipment and the fourth acquisition equipment.
6. The method of claim 5, wherein the reference acquisition device includes a set landmark reference point on the ground of the playing field within a viewing range of the reference acquisition device;
the position coordinates of the plurality of calibration points included in the second target calibration object in the corresponding coordinate system of the sports field are determined according to the position coordinates of the set landmark reference point in the corresponding coordinate system of the sports field, the relative position relation between the second target calibration object and the set landmark reference point and the topological parameters of the second target calibration object, and the topological parameters represent the relative position relation and the relative posture between the components included in the second target calibration object.
7. A calibration method, comprising:
acquiring multi-frame images acquired by a first acquisition device and a second acquisition device in a plurality of acquisition devices deployed in a setting space of a sports ground at different moments in the motion process of a first target calibration object in a setting area of the sports ground, wherein the first acquisition device and the second acquisition device are any two adjacent acquisition devices in the setting motion direction of the sports ground, and the setting area at least comprises the same areas in the sports ground, which respectively correspond to the adjacent two acquisition devices in a visual angle range;
acquiring first position coordinate information of a plurality of calibration points respectively in a multi-frame image acquired by the first acquisition equipment, which are included in the first target calibration object, and acquiring second position coordinate information of a plurality of calibration points respectively in a multi-frame image acquired by the second acquisition equipment, which are included in the first target calibration object;
and determining the first homography matrix according to the first position coordinate information and the second position coordinate information.
8. The method of claim 7, wherein the method further comprises:
acquiring a first image of a second target calibration object included in a view finding range acquired by third acquisition equipment, wherein the second target calibration object comprises a plurality of calibration points, and the third acquisition equipment is any one of the plurality of acquisition equipment;
Identifying the position coordinates of a plurality of calibration points included in the second target calibration object in the first image respectively;
and determining the second homography matrix corresponding to the third acquisition device according to the position coordinates of a plurality of calibration points respectively included in the second target calibration object in the sports field corresponding coordinate system and the position coordinates of a plurality of calibration points respectively included in the second target calibration object in the first image.
9. The method of claim 8, wherein the third acquisition device is adjacent to a fourth acquisition device in a set direction of motion of the playing surface, the method further comprising:
and determining a second homography matrix corresponding to the fourth acquisition equipment according to the second homography matrix corresponding to the third acquisition equipment and the first homography matrix between the third acquisition equipment and the fourth acquisition equipment.
10. The method of claim 8, wherein the third acquisition device is spaced from the fifth acquisition device by at least one acquisition device in a set direction of motion of the playing surface, the method further comprising:
and determining a second homography matrix of the fifth acquisition equipment according to the second homography matrix corresponding to the third acquisition equipment and the first homography matrix corresponding to every two adjacent acquisition equipment between the third acquisition equipment and the fifth acquisition equipment.
11. The method of any of claims 8-10, wherein the third acquisition device is a reference acquisition device, the reference acquisition device including within its field of view a set landmark reference point located on the ground of the sports field;
the position coordinates of the plurality of calibration points included in the second target calibration object in the corresponding coordinate system of the sports field are determined according to the position coordinates of the set landmark reference point in the corresponding coordinate system of the sports field, the relative position relation between the second target calibration object and the set landmark reference point and the topological parameters of the second target calibration object, and the topological parameters represent the relative position relation and the relative posture between the components included in the second target calibration object.
12. A motion information acquisition apparatus, characterized by comprising:
the acquisition unit acquires single-frame images acquired at the same time by a plurality of acquisition devices deployed in a set space comprising a sports ground;
the processing unit is used for tracking target athletes included in single-frame images acquired by the two adjacent acquisition devices at the same time according to first homography matrixes corresponding to the two calibrated adjacent acquisition devices so as to acquire the motion information of the target athletes; the two adjacent collecting devices are any two of the plurality of collecting devices and are adjacent in the set motion direction of the sports ground;
The first homography matrix is used for representing the mapping relation of the position coordinates of the same object in single-frame images acquired by two adjacent acquisition devices at the same moment, and is obtained by calibrating multi-frame images shot by the two adjacent acquisition devices at different moments synchronously, and the positions of a first target calibration object in the sports field in the images shot at different moments are different; the first target calibration object includes a plurality of calibration points.
13. The apparatus of claim 12, wherein the first target calibration object comprises a plurality of calibration points in different images at the same distance from a plane on which the surface of the playing field lies.
14. The apparatus according to claim 12 or 13, wherein the acquiring unit is further configured to acquire multi-frame images acquired by a first acquiring device and a second acquiring device respectively at different moments during the movement of the first target calibration object in a set area of the playing field, where the first acquiring device and the second acquiring device are any two adjacent acquiring devices in a set movement direction of the playing field, and the set area at least includes the same area in the playing field respectively corresponding to the viewing angle ranges of the two adjacent acquiring devices;
The processing unit is further configured to obtain first position coordinate information of a plurality of calibration points included in the first target calibration object in the multi-frame image acquired by the first acquisition device, and obtain second position coordinate information of a plurality of calibration points included in the first target calibration object in the multi-frame image acquired by the second acquisition device; and determine the first homography matrix according to the first position coordinate information and the second position coordinate information.
15. The apparatus of any of claims 12-14, wherein the processing unit is further to:
determining a third position coordinate of the target player in a coordinate system corresponding to the sports field according to a second homography matrix corresponding to a third acquisition device and the position coordinate of the target player in a single-frame image shot by the third acquisition device;
the second homography matrix corresponding to the third acquisition equipment is used for describing the mapping relation between the position coordinates of the target object in the image, which are shot by the third acquisition equipment, and the position coordinates of the target object in the coordinate system corresponding to the sports ground; the third acquisition device is any one of the plurality of acquisition devices.
16. The apparatus of claim 15, wherein the third acquisition device is a reference acquisition device, the second homography matrix of the reference acquisition device is determined according to position coordinates of a second target calibration object including a plurality of calibration points in an image captured by the reference acquisition device, and position coordinates of a plurality of calibration points included in the second target calibration object in a coordinate system corresponding to the sports field, the second target calibration object being located in a viewing range of the reference acquisition device in the sports field; or alternatively, the process may be performed,
the third acquisition equipment is not reference acquisition equipment, the third acquisition equipment is adjacent to fourth acquisition equipment in the set motion direction of the sports ground, the second homography matrix corresponding to the third acquisition equipment is determined according to the second homography matrix corresponding to the fourth acquisition equipment and the first homography matrix between the fourth acquisition equipment and the third acquisition equipment, and the fourth acquisition equipment is reference acquisition equipment; or alternatively, the process may be performed,
the third acquisition device is not a reference acquisition device and is spaced from the fourth acquisition device by at least one acquisition device in the set motion direction of the sports field, and the second homography matrix corresponding to the third acquisition device is determined according to the second homography matrix corresponding to the fourth acquisition device and the first homography matrices corresponding to every two adjacent acquisition devices between the third acquisition device and the fourth acquisition device.
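A hedged reading of the composition in claim 16, assuming (the claim does not fix the convention) that a second homography maps a device's pixels onto the field plane and that the first homography maps the third device's pixels into the fourth device's image; the variable names and placeholder values are assumptions.

```python
import numpy as np

# Assumed conventions: H2_fourth maps the fourth (reference) device's pixels
# onto the field plane; H1_third_to_fourth maps the third device's pixels
# into the fourth device's image. Both are 3x3 matrices.
H2_fourth = np.eye(3)            # placeholder values for illustration
H1_third_to_fourth = np.eye(3)   # placeholder values for illustration

# Adjacent case: compose image->image with image->field.
H2_third = H2_fourth @ H1_third_to_fourth
```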
17. The apparatus of claim 16, wherein a set landmark reference point located on the ground of the sports field is included within the viewing range of the reference acquisition device;
the position coordinates of the plurality of calibration points included in the second target calibration object in the coordinate system corresponding to the sports field are determined according to the position coordinates of the set landmark reference point in that coordinate system, the relative positional relationship between the second target calibration object and the set landmark reference point, and the topology parameters of the second target calibration object, where the topology parameters represent the relative positions and relative postures between the components included in the second target calibration object.
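For illustration only, one plausible planar interpretation of claim 17 (the offsets, yaw angle, and function below are assumptions, not the patent's definitions): the field coordinates of the calibration points follow from the landmark's field position, the object's pose relative to the landmark, and the object's topology parameters via a rigid 2D transform.

```python
import numpy as np

def calibration_points_in_field(landmark_xy, object_offset_xy, object_yaw,
                                topology_offsets):
    """Place the calibration points of the second target calibration object
    in field coordinates.

    landmark_xy      : (X, Y) of the set landmark reference point on the field.
    object_offset_xy : position of the object's origin relative to the landmark.
    object_yaw       : heading of the object in the field plane, in radians.
    topology_offsets : Nx2 offsets of each calibration point in the object's
                       own frame (a simplified stand-in for the topology
                       parameters).
    """
    c, s = np.cos(object_yaw), np.sin(object_yaw)
    R = np.array([[c, -s], [s, c]])                     # planar rotation
    origin = np.asarray(landmark_xy) + np.asarray(object_offset_xy)
    return origin + np.asarray(topology_offsets) @ R.T  # rotate, then translate
```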
18. A calibration device, comprising:
an acquisition unit, configured to acquire multi-frame images respectively captured, at different moments, by a first acquisition device and a second acquisition device among a plurality of acquisition devices deployed in a set space of a sports field, while a first target calibration object moves in a set area of the sports field, where the first acquisition device and the second acquisition device are any two adjacent acquisition devices in a set motion direction of the sports field, and the set area includes at least the area of the sports field that lies within the viewing-angle ranges of both of the two adjacent acquisition devices;
a processing unit, configured to obtain first position coordinate information of a plurality of calibration points included in the first target calibration object in the multi-frame images acquired by the first acquisition device and second position coordinate information of the plurality of calibration points in the multi-frame images acquired by the second acquisition device; and to determine a first homography matrix according to the first position coordinate information and the second position coordinate information.
19. The apparatus of claim 18, wherein the acquisition unit is further configured to acquire a first image, captured by a third acquisition device, that includes a second target calibration object located within the viewing range of the third acquisition device, the second target calibration object including a plurality of calibration points and the third acquisition device being any one of the plurality of acquisition devices;
the processing unit is further configured to identify the position coordinates, in the first image, of the plurality of calibration points included in the second target calibration object; and to determine a second homography matrix corresponding to the third acquisition device according to the position coordinates of the plurality of calibration points in the coordinate system corresponding to the sports field and their position coordinates in the first image.
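A minimal sketch of how the second homography matrix of claim 19 might be fitted, assuming the calibration points' field coordinates are already known (e.g. via the landmark reference point of claim 22) and OpenCV is available; the function name is an assumption.

```python
import numpy as np
import cv2

def estimate_second_homography(img_pts, field_pts):
    """Fit the image-plane -> field-plane homography of one acquisition device.

    img_pts   : Nx2 pixel coordinates of the second target calibration object's
                calibration points in the first image (N >= 4, non-collinear).
    field_pts : Nx2 coordinates of the same points in the coordinate system
                corresponding to the sports field.
    """
    H, _ = cv2.findHomography(np.float32(img_pts), np.float32(field_pts))
    return H
```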
20. The apparatus of claim 19, wherein the third acquisition device is adjacent to a fourth acquisition device in the set motion direction of the sports field, and the processing unit is further configured to determine a second homography matrix corresponding to the fourth acquisition device according to the second homography matrix corresponding to the third acquisition device and the first homography matrix between the third acquisition device and the fourth acquisition device.
21. The apparatus of claim 19, wherein the third acquisition device is spaced from a fifth acquisition device by at least one acquisition device in the set motion direction of the sports field, and the processing unit is further configured to:
determine a second homography matrix of the fifth acquisition device according to the second homography matrix corresponding to the third acquisition device and the first homography matrices corresponding to every two adjacent acquisition devices between the third acquisition device and the fifth acquisition device.
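Illustrative only: under the same assumed direction convention as above, the second homography matrix of the fifth device in claim 21 follows by composing the pairwise first homography matrices along the camera chain.

```python
from functools import reduce

def propagate_second_homography(H2_third, pairwise_first_homographies):
    """Compose the second homography of the third device with the first
    homographies of every adjacent pair between the third and fifth devices.

    pairwise_first_homographies is ordered from the third device towards the
    fifth device; each 3x3 matrix maps the farther camera's pixels into the
    nearer camera's image (an assumed convention, not fixed by the claim).
    """
    return reduce(lambda acc, H1: acc @ H1,
                  pairwise_first_homographies, H2_third)
```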
22. The apparatus of any one of claims 19-21, wherein the third acquisition device is a reference acquisition device, and a set landmark reference point located on the ground of the sports field is included within the viewing range of the reference acquisition device;
the position coordinates of the plurality of calibration points included in the second target calibration object in the coordinate system corresponding to the sports field are determined according to the position coordinates of the set landmark reference point in that coordinate system, the relative positional relationship between the second target calibration object and the set landmark reference point, and the topology parameters of the second target calibration object, where the topology parameters represent the relative positions and relative postures between the components included in the second target calibration object.
23. A motion information acquisition device, comprising a processor and a memory;
the memory is configured to store a computer program;
the processor is configured to execute a computer program stored in the memory to implement the method of any one of claims 1-6.
24. A calibration device, comprising a processor and a memory;
the memory is configured to store a computer program;
the processor is configured to execute a computer program stored in the memory to implement the method of any one of claims 7-11.
25. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program which, when run on a processor, causes the processor to perform the method according to any one of claims 1-11.
CN202210209801.3A 2022-03-04 2022-03-04 Method for acquiring motion information, calibration method and device Pending CN116740130A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210209801.3A CN116740130A (en) 2022-03-04 2022-03-04 Method for acquiring motion information, calibration method and device
PCT/CN2023/078599 WO2023165452A1 (en) 2022-03-04 2023-02-28 Motion information acquisition method, calibration method, and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210209801.3A CN116740130A (en) 2022-03-04 2022-03-04 Method for acquiring motion information, calibration method and device

Publications (1)

Publication Number Publication Date
CN116740130A true CN116740130A (en) 2023-09-12

Family

ID=87882988

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210209801.3A Pending CN116740130A (en) 2022-03-04 2022-03-04 Method for acquiring motion information, calibration method and device

Country Status (2)

Country Link
CN (1) CN116740130A (en)
WO (1) WO2023165452A1 (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101146231A (en) * 2007-07-03 2008-03-19 浙江大学 Method for generating panoramic video according to multi-visual angle video stream
CN101853524A (en) * 2010-05-13 2010-10-06 北京农业信息技术研究中心 Method for generating corn ear panoramic image by using image sequence
CN102164269A (en) * 2011-01-21 2011-08-24 北京中星微电子有限公司 Method and device for monitoring panoramic view
CN105869166B (en) * 2016-03-29 2018-07-10 北方工业大学 A kind of human motion recognition method and system based on binocular vision
CN106991690B (en) * 2017-04-01 2019-08-20 电子科技大学 A kind of video sequence synchronous method based on moving target timing information
EP3493148A1 (en) * 2017-11-30 2019-06-05 Thomson Licensing View synthesis for unstabilized multi-view video
CN109240496B (en) * 2018-08-24 2021-07-16 中国传媒大学 Acousto-optic interaction system based on virtual reality
CN111091025B (en) * 2018-10-23 2023-04-18 阿里巴巴集团控股有限公司 Image processing method, device and equipment

Also Published As

Publication number Publication date
WO2023165452A1 (en) 2023-09-07

Similar Documents

Publication Publication Date Title
US9448067B2 (en) System and method for photographing moving subject by means of multiple cameras, and acquiring actual movement trajectory of subject based on photographed images
RU2387011C2 (en) Movement tracking based on image analysis
US10515471B2 (en) Apparatus and method for generating best-view image centered on object of interest in multiple camera images
US20130094696A1 (en) Integrated Background And Foreground Tracking
CN109151439A (en) A kind of the automatic tracing camera system and method for view-based access control model
EP1757087A2 (en) Automatic event videoing, tracking and content generation system
CN105718862A (en) Method, device and recording-broadcasting system for automatically tracking teacher via single camera
Scott et al. SoccerTrack: A dataset and tracking algorithm for soccer with fish-eye and drone videos
JP2009505553A (en) System and method for managing the insertion of visual effects into a video stream
US10922871B2 (en) Casting a ray projection from a perspective view
CN105069795B (en) Moving object tracking method and device
CN101894380A (en) Method for tracing target object in panoramic video automatically
CN103500471A (en) Method for realizing high-resolution augmented reality system
CN112330710A (en) Moving target identification tracking method, device, server and readable storage medium
CN114120165A (en) Gun and ball linked target tracking method and device, electronic device and storage medium
WO2024012405A1 (en) Calibration method and apparatus
CN110059653A (en) A kind of method of data capture and device, electronic equipment, storage medium
CN114037923A (en) Target activity hotspot graph drawing method, system, equipment and storage medium
JP2004094518A (en) Figure tracing device and figure tracing method and its program
CN109978908A (en) A kind of quick method for tracking and positioning of single goal adapting to large scale deformation
CN116740130A (en) Method for acquiring motion information, calibration method and device
CN111970434A (en) Multi-camera multi-target athlete tracking shooting video generation system and method
CN105894505A (en) Quick pedestrian positioning method based on multi-camera geometrical constraint
US11908194B2 (en) Tracking sparse objects and people in large scale environments
JP2000333080A (en) Image pickup and processing unit

Legal Events

Date Code Title Description
PB01 Publication