SE1950724A1 - System for analyzing movement in sport - Google Patents

System for analyzing movement in sport

Info

Publication number
SE1950724A1
Authority
SE
Sweden
Prior art keywords
unit
data
sensor
processing circuitry
analysis system
Prior art date
Application number
SE1950724A
Other languages
Swedish (sv)
Other versions
SE543581C2 (en)
Inventor
Kjell Bystedt
Original Assignee
Sport & Health Sensors Sweden AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sport & Health Sensors Sweden AB filed Critical Sport & Health Sensors Sweden AB
Priority to SE1950724A priority Critical patent/SE543581C2/en
Priority to DE102020115519.0A priority patent/DE102020115519A1/en
Publication of SE1950724A1 publication Critical patent/SE1950724A1/en
Publication of SE543581C2 publication Critical patent/SE543581C2/en


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003 Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1127 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using markers
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B5/1122 Determining geometric values, e.g. centre of rotation or angular range of movement of movement trajectories
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1124 Determining motor skills
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6895 Sport equipment
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/23 Recognition of whole body movements, e.g. for sport training
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00 Evaluating a particular growth phase or type of persons or animals
    • A61B2503/10 Athletes

Abstract

The present invention relates to an analysis system (1) for analysing movements associated with sports comprising at least one camera unit (2) comprising a camera (21) and having access to a communication unit (5) for wireless communication, at least one sensor unit (3) comprising an accelerometer (31), a gyroscope (32), and a communication unit (33) for wireless communication, and processing circuitry (4) having access to a communication unit (5) for wireless communication. The processing circuitry (4) is arranged to receive synchronization information, receive video frame data from the camera (21), receive sensor data from the sensor unit (3), determine a position of the sensor unit (3), transform the determined position to an absolute position in a 2- or 3-dimensional space, and increase the accuracy of the absolute position of the sensor unit (3) and the sensor data by fusion of the absolute position data and sensor data.

Description

System for analyzing movement in sport

Technical field

The present invention relates to an analysis system for analyzing movements associated with sports.
Background

In sport, there is a lot of focus on the technique of the athletes. It is not unusual to use technology to assist in perfecting techniques.
US 8622795 B2 discloses a system capturing data for analysis and qualitative conclusions regarding an object's actions, specifically the activity of an athlete. Sensors on the athlete, such as gyroscopes and accelerometers, collect activity-related data. A processor compares the data to stored activity-related data. A transmitter on the athlete sends the data to the processor. Examples of what can be calculated are the velocity of a punch, the force of the punch and the distance covered by each punch. Cameras may also be used to track the position of boxers during a boxing match. A retro-reflective material can be used as a motion capture surface on the athlete such that the camera can easily find the athlete. The document mentions that a computer can analyze camera data to detect a punch.
US 2005/0223799 A1 discloses that one or more video cameras record the motion of a subject. The data acquisition unit works in conjunction with motion sensors to capture position data of the subject. The motion of the subject may be compared against a standard ideal motion. The document also presents an environmental sensor for sensing environmental parameters such as wind and temperature.
US 8894505 B2 discloses a fitting system for a golf club. The golf club has a plurality of markers, and a plurality of cameras positioned around the golfer identify and react to the markers. The system uses the markers to triangulate the position of the golf club.
US 8941723 B2 discloses a mobile device motion capture and analysis system. Embodiments use accelerometers and gyroscopes on a golf club to determine speed or range. Several mobile phones with cameras may be used to triangulate positions of markers.
US 8477046 B2 discloses attaching sensor modules and telemetry modules on movable and stationary sources, such as players, referees, balls, goal posts, zone markers etc. The measured data may include acceleration, location, tracking, movement, speed, velocity, speed burst, impact, body temperature etc. The playing field can be mapped using for example UWB, ZigBee, Wi-Fi and GPS and triangulation of the signals. The performance metrics, data, analytics and/or statistics captured by the sensors can be exported in real time to reports or video, for superimposing and/or integration into video, or as graphs of sports performance metrics. Displaying progressive movements comprises a sequence of three-dimensional images. The sensors are for example UWB sensors, accelerometers or gyroscopes.
US 2018132748 A1 discloses a portable biomechanical assessment laboratory (PBAL). The laboratory is a portable bag with different kinds of sensors that may be used to analyze athletes' movements. Video data may be used to determine if there are anomalies in kinematic data.
The demand for technology that can be used in assisting athletes to progress is increasing. Usability and precision are important factors when choosing a technology.
Summary

An aspect of the present disclosure is to provide an alternative solution to the existing technologies. The disclosure proposes an analysis system with high usability and precision and which is cheap to produce.
More specifically, the disclosure provides an analysis system for analysing movements associated with sports.
This object is achieved by a system as defined in claim 1.
According to an embodiment of the invention, an analysis system for analysing movements associated with sports is provided. The system comprises at least one camera unit, for attaching to a sports object or athlete, comprising a camera and having access to a communication unit for wireless communication, at least one sensor unit comprising an accelerometer, a gyroscope, and a communication unit for wireless communication, and processing circuitry having access to a communication unit for wireless communication. The processing circuitry is arranged to receive synchronization information from the sensor unit and the camera unit, receive video frame data captured by the camera of the camera unit, receive sensor data from the accelerometer and gyroscope of the sensor unit for the same time as the captured video frame, determine a position of the sensor unit, transform the determined position to an absolute position in a 2- or 3-dimensional space, and increase the accuracy of the absolute position of the sensor unit and the sensor data by fusion of the absolute position data and sensor data. By this, a system which is easy to use and very precise is provided. The sensor unit can be arranged on any kind of sports object or on an athlete, and input from the camera unit and sensor unit is then used to increase the accuracy of the data.
According to some aspects, the determination of a position of the sensor unit is performed in the video frame, by image processing and/or by using the gyro and accelerometer data and a fixed starting position of the sensor unit. At least two ways to determine the position of the sensor unit are thus provided.
According to some aspects, the processing circuitry is arranged to determine movement characteristics of the sensor unit by fusing the data captured by the camera unit and the sensor unit. When an athlete can see the forces acting on the sensor unit at different positions during a movement, it aids the athlete in comparing several similar movements to determine which one is the most effective.
According to some aspects, the analysis system comprises a display for displaying the video frame data received from the camera unit, wherein the processing circuitry is arranged to process the video frame data to visualize the determined movement characteristics in 2 or 3 dimensions together with the video frame data. This provides a usable way for the user to see, in the video, the determined movement characteristics together with the video of the movement. It is therefore easy for the athlete to analyse her/his movement and assess if she/he should adjust anything in the movement.
According to some aspects, the analysis system comprises at least one additional communication unit for wireless communication to be fixed in the environment. The processing circuitry is arranged to receive synchronization signals from the additional communication unit and triangulate the position of the sensor unit using the signals from the additional communication unit, the communication unit of the camera unit and the communication unit of the sensor unit. The increasing of the accuracy of the absolute position of the sensor unit and the sensor data by fusion of the absolute position data and sensor data then also includes fusion of the triangulated position.
Triangulation is used to further increase the accuracy of the system. The position of the sensor unit can thus be even more accurately determined.
According to some aspects, one of the at least one camera units and the processing circuitry are comprised in a smart phone. By using the camera unit and processing circuitry of a smart phone, the cost of the system can be kept low, making it accessible to everyone.
According to some aspects, the analysis system comprises a marker, and the determining of a position of the sensor unit comprises identifying the marker in the video data. The marker may be anything that can be attached to a sports object or athlete and that can be used to make it easier for the processing circuitry to find what the marker is attached to in the image frame data from the camera. The marker is for example a tape or a sticker of a specific colour. If a sports object has a pronounced shape, it may be used as a marker directly. An example of such an object is the metal ball in a hammer used in hammer throwing.
According to some aspects, the increasing of the accuracy of the absolute position and the sensor data is done by filtering with a Kalman filter or equivalent. Thus, a fast way to increase the accuracy of the absolute position is provided.
According to some aspects, steps ii to vi are repeatedly performed by the processing circuitry as the camera unit and sensor unit continuously capture and send data. The system can thus be run continuously, and the above determinations can be performed as the data is being collected. The system may also then improve the accuracy of the determinations continuously.
According to some aspects, the processing circuitry is arranged to determine if there is an error in the sensor data from the accelerometer or gyroscope based on the determined absolute position, and if there is an error, to determine the quantity of the error. The determination can be used to evaluate the sensor unit.
According to some aspects, the processing circuitry is arranged to compensate for determined errors to improve the absolute position. The determination can thus also be used to improve data from the sensor unit so that a more exact absolute position can be determined.
According to some aspects, the analysis system comprises a pressure sensor unit, comprising a pressure sensor with access to a communication unit, for calibrating one axis of the accelerometer, wherein the processing circuitry is arranged to receive synchronization information and pressure sensor data from the pressure sensor, and wherein the increasing of the accuracy of the absolute position of the sensor unit (3) and the sensor data by fusion of the absolute position data and sensor data also includes fusion of the pressure sensor data. A pressure sensor that is placed correctly on an athlete or sports equipment can be used to calibrate the accelerometer since it can be used to determine when the accelerometer should give a 0 value along an axis.
Brief description of the drawings

The invention will now be explained more closely by the description of different embodiments of the invention and with reference to the appended figures.
Fig. 1 shows an example analysis system in use with a hammer thrower.
Fig. 2 shows an example analysis system in use with a runner.
Fig. 3 shows an example analysis system.
Fig. 4 shows an example analysis system.
Fig. 5 shows an example analysis system.
Fig. 6 shows an example camera unit.
Fig. 7 shows another example camera unit.
Fig. 8 shows an example of a screen showing captured video data together with determined information.
Fig. 9 illustrates examples of signals sent to the processing circuitry for processing.

Fig. 10 illustrates a box diagram of processing steps of the processing circuitry.
Detailed description

The present invention is not limited to the embodiments disclosed but may be varied and modified within the scope of the following claims.
Aspects of the present disclosure will be described more fully hereinafter with reference to the accompanying drawings. The devices and methods disclosed herein can, however, be realized in many different forms and should not be construed as being limited to the aspects set forth herein. Like numbers in the drawings refer to like elements throughout.
The terminology used herein is for the purpose of describing particular aspects of the disclosure only and is not intended to limit the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. It will be further understood that terms used herein should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The disclosure provides an analysis system 1 for analysing movements associated with sports. Examples of the analysis system 1 are illustrated in figures 1-5.
The system 1 comprises at least one camera unit 2 comprising a camera 21 and having access to a communication unit 5 for wireless communication. The camera 21 may be any kind of camera that can capture video frames. The communication unit 5 is for example a Bluetooth, BT, unit, a Bluetooth Low Energy, BLE, unit or an Ultra-Wideband, UWB, unit. The technology for communication may be any kind of communication means which facilitates wireless communication between the processing circuitry 4 and the camera unit 2.
The system 1 comprises at least one sensor unit 3, for attaching to a sports object or athlete, comprising an accelerometer 31, a gyroscope 32, and a communication unit 33 for wireless communication. Accelerometers 31 and gyroscopes 32 have been available on the market for a long time and are known to the skilled person. Any accelerometer 31 and gyroscope 32 which have a suitable size for a sensor unit 3 that is to be attached to a sports object or athlete may be used in the system 1. The at least one sensor unit is for example an Inertial Measurement Unit, IMU. The at least one sensor unit may comprise a magnetometer. The communication unit 33 may be of the same technology as the communication unit 5 of the camera unit 2 or of another technology of the ones described for the camera unit 2.
The system 1 comprises processing circuitry 4 having access to a communication unit 5 for wireless communication. The processing circuitry 4 is any kind of processing means capable of performing the below steps. The processing circuitry 4 may be a processor with one or multiple cores and/or may be several processors working together. The system 1 may also comprise storage circuitry. The storage circuitry is then any kind of memory capable of storing information that may be accessed by the processing circuitry 4. The processing circuitry 4 is thus in communication with the storage circuitry. The storage circuitry is preferably a non-volatile memory such as a Read-Only Memory, ROM, a flash memory, a magnetic computer storage device or an optical storage device.

In figure 1, a hammer thrower is illustrated in an example of the system in use. A sensor unit 3 is attached where the hammer is attached to the wire. In the illustrated example, the camera unit 2 and the processing unit are both comprised in a smart phone 6. Thus, according to some aspects, one of the at least one camera unit 2 and the processing circuitry 4 are comprised in a smart phone 6. By using the camera unit 2 and processing circuitry 4 of a smart phone 6, the cost of the system 1 can be kept low, making it accessible to everyone. In this case, the communication unit 5 that the camera unit 2 and processing circuitry 4 have access to is the communication means of the smart phone 6. The camera unit 2 and processing circuitry 4 are connected internally in the phone, and the communication unit 5 is used for wireless communication with the sensor unit 3. It may also be that an external communication means is used that can be connected to the smart phone 6, such as a BLE dongle or UWB dongle.
Aspects of the disclosure are exemplified using a smart phone 6. However, it should be appreciated that the invention is as such applicable to many types of electronic devices which have a camera unit 2. Examples of such devices may for instance be any type of mobile phone, smartphone, laptop, handheld computers, portable digital assistants, tablet computers, touch pads, gaming devices, accessories to mobile phones, e.g. wearables in the form of headphones/headsets, visors/goggles, bracelets, wristbands, necklaces, etc. For the sake of clarity and simplicity, the embodiments outlined in this specification are exemplified with a smart phone 6 only.

In figure 2, a runner is illustrated in an example of the system in use. Three sensor units 3 are attached to the runner; one at an ankle, one at a wrist and one at the waist of the runner. In the illustrated example, the camera unit 2 and the processing unit are both comprised in a smart phone 6. As shown in this example, the system 1 can be used with more than one sensor unit. One sensor unit for each point of interest on the sports object or athlete can be used.
Figures 3, 4 and 5 illustrate different examples of the system 1. In figure 3, the sensor unit 3 is in wireless communication, via its communication unit 33, with the processing circuitry 4. The processing circuitry is here illustrated as being comprised in a smart phone 6 as an example. A camera unit 2 is illustrated as a separate unit with its own communication unit 5 and in communication with the processing circuitry. It should be noted that the camera unit 2 and the sensor unit 3 may or may not be in communication with each other. In figure 4, the camera unit 2, the processing circuitry and the communication unit 5 are comprised in the smart phone 6 and in wireless communication with the sensor unit 3. In figure 5, the processing circuitry 4 and communication unit 5 are comprised in a smart phone 6 and there are two communication units 5 in this example. The reason for this is further explained below. Other combinations are possible, such as one camera unit 2 which is its own unit with a communication unit 5 and processing circuitry 4, as well as a camera unit 2 and a communication unit 5 in a smart phone 6, to provide a stereo video of the sports object or athlete with the sensor unit 3.

In figure 6, an example camera unit 2 is illustrated. The camera unit 2 comprises a camera 21 and may have its own communication unit 5 and/or processing circuitry 4. The camera unit may also have its own memory if it is its own unit and not comprised in a smart phone. The processing circuitry 4 is arranged to receive synchronization information from the sensor unit 3 and the camera unit 2. In the case where the processing unit and the camera unit 2 are comprised in a smart phone 6, the synchronization information may be implicit in that the units are automatically synchronized in the smart phone 6. The synchronization information is for example time stamps so that the information from all units can be synchronized in time.

In figure 7, another example camera unit 2 is illustrated. In this example, the camera unit 2 comprises two cameras 21 and/or an active IR sensor 22. The two cameras 21 then provide a stereo image. The active IR sensor 22 signal can then be used to give the position in three dimensions. The camera unit may comprise its own communication unit 5 and/or processing circuitry 4 and/or memory, or it may be connected to a smart phone and use one or more units of the smart phone.
The processing circuitry 4 is arranged to receive video frame data captured by the camera 21 of the camera unit 2 and to receive sensor data from the accelerometer 31 and gyroscope 32 of the sensor unit 3 for the same time as the captured video frame. The information from the camera unit 2 and the sensor unit 3 may be time stamped. The sensor data is received via the communication unit 5. The video frame data may be received via the communication unit 5 if the camera unit 2 and the processing circuitry 4 are comprised in different devices. If the camera unit 2 and the processing circuitry 4 are comprised in a smart phone 6, they are connected internally in the smart phone 6.
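The patent text leaves the synchronization mechanism open beyond time stamps. As a minimal sketch of how a time-stamped IMU sample could be paired with a video frame, the sample format, rates and function names below are illustrative assumptions and not part of the disclosure:

```python
from bisect import bisect_left
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ImuSample:
    t_ms: int                      # timestamp from the sensor unit's synchronization info
    accel: Tuple[float, float, float]   # (ax, ay, az) in m/s^2
    gyro: Tuple[float, float, float]    # (wx, wy, wz) in rad/s

def sample_for_frame(frame_t_ms: int, samples: List[ImuSample]) -> ImuSample:
    """Return the IMU sample whose timestamp is closest to the video frame's timestamp."""
    times = [s.t_ms for s in samples]
    i = bisect_left(times, frame_t_ms)
    if i == 0:
        return samples[0]
    if i == len(samples):
        return samples[-1]
    before, after = samples[i - 1], samples[i]
    return before if frame_t_ms - before.t_ms <= after.t_ms - frame_t_ms else after

# Example: a 30 fps frame at t = 66 ms paired with 100 Hz IMU data
imu = [ImuSample(t, (0.0, 0.0, 9.81), (0.0, 0.0, 0.0)) for t in range(0, 200, 10)]
print(sample_for_frame(66, imu).t_ms)   # -> 70
```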
Figure 9 illustrates examples of signals sent to the processing circuitry 4 for processing. In the illustrated example, the processing circuitry 4 receives image frame data from the camera unit/units 2, sensor data from the sensor unit/units 3 and optionally pressure data from a pressure sensor unit 7, which is further explained below. The processing circuitry 4 is arranged to determine a position of the sensor unit 3, transform the determined position to an absolute position in a 2- or 3-dimensional space, and increase the accuracy of the absolute position of the sensor unit 3 and the sensor data by fusion of the absolute position data and sensor data. In other words, the processing circuitry 4 determines the position of the sensor unit by fusing data obtained from the image with sensor data from the sensor unit 3. To do this, image processing is used for determining the position of the sensor unit in an image frame, and the position is then transformed into an absolute position in 2- or 3-dimensional space. According to some aspects, the determination of a position of the sensor unit is performed in the video frame, by image processing and/or by using the gyro and accelerometer data and a fixed starting position of the sensor unit. At least two ways to determine the position of the sensor unit are thus provided. The absolute position is then used together with sensor data from the sensor unit 3 in a sensor fusion algorithm to get an absolute position with higher accuracy than what is achievable using only image processing or sensor data. The position of the sensor unit 3 is acquired in x, y coordinates for a single camera 21 in the camera unit 2 and in x, y and z for a stereo camera 21 in the camera unit 2 or for more than one camera unit 2. The position, when transformed to an absolute position, may be denoted Hx, Hy, and optionally Hz. The sensor data from the accelerometer 31 is acceleration and may be denoted ax, ay, and optionally az, with the unit m/s². The sensor data from the gyroscope 32 is angular velocity and may be denoted ωx, ωy and optionally ωz, with the unit rad/s. Optionally a compass degree may be received from each sensor, which may be denoted Θx, Θy and optionally Θz, with the unit rad.
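How the image position is transformed to an absolute position is not spelled out in the text. One common approach, sketched here under the assumption of a calibrated pinhole camera and a known depth (for example from the stereo pair or the active IR sensor 22), back-projects the pixel coordinates; the intrinsic values below are made up for illustration:

```python
import numpy as np

def pixel_to_absolute(u: float, v: float, depth_m: float,
                      fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Back-project a pixel (u, v) with known depth into camera coordinates (Hx, Hy, Hz).

    fx, fy, cx, cy are camera intrinsics obtained from a prior calibration; depth_m is the
    distance to the sensor unit, e.g. from the stereo baseline or the active IR sensor.
    """
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Example with assumed intrinsics for a 1920x1080 smartphone camera
print(pixel_to_absolute(1100, 480, 3.0, fx=1500, fy=1500, cx=960, cy=540))
```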
There are several ways to perform the fusion. One example is to use a Kalman filter. A specific example is to use a continuous-discrete extended Kalman filter, EKF. Such a filter estimates the state of a system as time moves forward and new measurements become available. Due to noise, the estimates computed in this way are not perfect. The filter takes this into account by keeping and updating an estimate of the uncertainty in the state vector estimate. In this case, the state vector consists of orientation, angular velocity, position, velocity, acceleration, gyro bias, accelerometer bias, geomagnetic field and magnetometer bias. By keeping variables of interest in the state vector, the variables of interest are estimated using all available sensor data. Note that the measurements that are used to update the estimate of the state vector may occur at different frequencies and still be taken into account by the filter.
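A full continuous-discrete EKF with the complete state vector is beyond a short example, but the fusion principle can be illustrated with a reduced, one-axis linear Kalman filter in which the accelerometer drives the prediction and the camera-derived absolute position is the measurement; the noise tuning values are assumptions:

```python
import numpy as np

def fuse_position(cam_pos, accel, dt=0.01, q_accel=0.5, r_cam=0.05):
    """Minimal 1-D Kalman filter: accelerometer as control input, camera position as measurement."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity state transition
    B = np.array([[0.5 * dt**2], [dt]])     # effect of acceleration on position and velocity
    H = np.array([[1.0, 0.0]])              # the camera measures position only
    Q = q_accel * B @ B.T                   # process noise derived from accelerometer uncertainty
    R = np.array([[r_cam]])                 # camera measurement noise

    x = np.array([[cam_pos[0]], [0.0]])     # state: [position, velocity]
    P = np.eye(2)
    fused = []
    for z, a in zip(cam_pos, accel):
        # Predict using the IMU
        x = F @ x + B * a
        P = F @ P @ F.T + Q
        # Update using the camera-derived absolute position
        y = np.array([[z]]) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        fused.append(float(x[0, 0]))
    return fused

# Example: noisy camera positions of a sensor unit accelerating at 2 m/s^2
t = np.arange(0, 1, 0.01)
true_pos = 0.5 * 2.0 * t**2
cam = true_pos + np.random.normal(0, 0.05, t.size)
print(fuse_position(cam, np.full(t.size, 2.0))[-5:])
```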
By this, a system 1 which is easy to use and very precise is provided. The sensor unit 3 can be arranged on any kind of sports object or on an athlete, and then input from the camera unit 2 and sensor unit 3 is used to increase the accuracy of the data.
According to some aspects, the increasing of the accuracy of the absolute position and the sensor data is done by filtering with a Kalman filter or equivalent. Thus, a fast way to increase the accuracy of the absolute position is provided. The filter is for example an extended Kalman filter. According to some aspects, the analysis system 1 comprises a marker, and the determining of a position of the sensor unit 3 comprises identifying the marker in the video data. The marker may be anything that can be attached to a sports object or athlete and that can be used to make it easier for the processing circuitry 4 to find what the marker is attached to in the image frame data from the camera 21. The marker is for example a tape or a sticker of a specific colour. If a sports object has a pronounced shape, it may be used as a marker directly.
An example of such an object is the metal ball in a hammer used in hammer throwing. The marker is used to make it easier for the processing circuitry 4 to locate the sensor unit 3 in the video frame data, because the marker is chosen as something that is easily identified in the video frame data. Image processing to identify a specific object in an image is known to the skilled person and will not be discussed further.

In the example of figure 1, either a marker on the sensor unit may be used or the ball may be used as identifier in the video data. Since the sensor is, in this example case, attached to the side of the ball, parameters to compensate for the difference in position may be added to the processing circuitry. The sensors could also be mounted inside a sports object.
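Identifying a coloured tape or sticker in a frame is left to standard image processing. A minimal sketch using OpenCV colour thresholding might look as follows; the colour range, library choice and synthetic test frame are assumptions for illustration only:

```python
import cv2
import numpy as np

def find_marker_center(frame_bgr, hsv_low=(100, 120, 70), hsv_high=(130, 255, 255)):
    """Return the (x, y) pixel centre of the largest blob in the given HSV colour range,
    e.g. a blue sticker marking the sensor unit, or None if nothing is found."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

# Example: a synthetic frame with a blue square standing in for the marker
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame[200:220, 300:320] = (255, 0, 0)    # pure blue in BGR
print(find_marker_center(frame))          # -> roughly (309, 209)
```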
According to some aspects, steps ii to vi are repeatedly performed by the processing circuitry 4 as the camera unit 2 and sensor unit 3 continuously capture and send data. The system 1 can thus be run continuously, and the above determinations can be performed as the data is being collected. The system 1 may also then improve the accuracy of the determinations continuously. Since the processing will have access to information regarding the position of the sensor unit 3 and sensor data for several positions over time, it is possible to evaluate whether the sensor data is correct or not. According to some aspects, the processing circuitry 4 is arranged to determine if there is an error in the sensor data from the accelerometer 31 or gyroscope 32 based on the determined absolute position, and if there is an error, to determine the quantity of the error. The determination can be used to evaluate the sensor unit 3. For example, for a determined position of the sensor unit 3 in the video frame data, the processing also has access to the acceleration from the accelerometer 31. The processing unit will also receive the same for another position after a time has passed and the sensor object has moved. By calculating the distance the sensor unit 3 has moved and comparing that to the acceleration received from the accelerometer 31, it can be determined if there are any errors in the sensor information. If the sensor unit 3 has not moved between receiving two positions, the accelerometer 31 and gyroscope 32 should give 0 values, and it can thus be checked whether they give the right values. The sensor fusion will thus improve and correct data to represent the real values in a more correct way.
The determination can thus also be used to improve data from the sensor unit 3 so that a more exact absolute position can be determined. According to some aspects, the processing circuitry 4 is arranged to compensate for determined errors to improve the absolute position. The compensation may be to remove the calculated error from the sensor data received after the error has been determined. The processing unit may also send information to the sensor unit 3 to recalibrate it so that the error is removed for future sensor measurements.
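As an illustration of the error check and compensation described above, the displacement implied by double-integrating the accelerometer can be compared with the displacement between two camera-derived positions; the constant-bias model and the simple integration scheme below are assumptions, not the document's prescribed method:

```python
def accel_bias_estimate(pos_a, pos_b, dt, accel_samples):
    """Estimate a constant accelerometer bias on one axis.

    pos_a, pos_b:   camera-derived absolute positions (m) at the start and end of the interval
    dt:             time between the two positions (s)
    accel_samples:  accelerometer readings (m/s^2) covering the interval, assumed uniform rate
    """
    # Displacement implied by double-integrating the accelerometer (zero initial velocity assumed)
    n = len(accel_samples)
    step = dt / n
    v, s = 0.0, 0.0
    for a in accel_samples:
        v += a * step
        s += v * step
    camera_displacement = pos_b - pos_a
    # A constant bias b over time dt contributes 0.5 * b * dt^2 to the displacement error
    return 2.0 * (s - camera_displacement) / dt**2

# Example: sensor unit actually stationary (camera sees no movement), accelerometer reads 0.3 m/s^2
bias = accel_bias_estimate(0.0, 0.0, 1.0, [0.3] * 100)
print(round(bias, 3))   # -> roughly 0.3; later samples could be corrected by subtracting it
```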
More sensor types may be added to the system 1 for more data points. According to some aspects, the analysis system 1 comprises a pressure sensor unit 7, comprising a pressure sensor 71 with access to a communication unit 5, for calibrating one axis of the accelerometer 31, wherein the processing circuitry is arranged to receive synchronization information and pressure sensor data from the pressure sensor, and wherein the increasing of the accuracy of the absolute position of the sensor unit 3 and the sensor data by fusion of the absolute position data and sensor data also includes fusion of the pressure sensor data. A pressure sensor 71 that is placed correctly on an athlete or sports equipment can be used to calibrate the accelerometer 31 since it can be used to determine when the accelerometer 31 should give a 0-value along an axis. For example, if the sensor unit 3 is attached to the shoe of an athlete and the pressure sensor 71 is attached to the bottom of the shoe, the accelerometer 31 should give a 0-value when the pressure sensor 71 gives a pressure indicating that the athlete has stepped down on her/his foot. An example pressure sensor unit 7 with a pressure sensor 71 and a communication unit 5 is illustrated in figure 1 under the shoe of the athlete.
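One way to read the pressure-based calibration is sketched below: while the pressure sensor under the shoe reports ground contact, the axis in question should read 0 (following the document's statement), so any residual during contact can be treated as an offset. The contact threshold and sample values are assumed for illustration:

```python
import statistics

def estimate_axis_offset(pressure_kpa, accel_axis, contact_threshold_kpa=30.0):
    """Estimate the accelerometer offset on one axis from samples taken while the
    pressure sensor indicates that the foot is planted on the ground.

    pressure_kpa and accel_axis are synchronized sample lists; the threshold that
    separates 'foot down' from 'foot in the air' is an assumed value.
    """
    during_contact = [a for p, a in zip(pressure_kpa, accel_axis)
                      if p > contact_threshold_kpa]
    if not during_contact:
        return 0.0
    # The axis should read 0 during contact, so its mean is the offset to subtract later
    return statistics.mean(during_contact)

# Example: an offset of ~0.15 m/s^2 is visible whenever the shoe is on the ground
pressure = [5, 5, 80, 85, 82, 6, 5, 90, 88]
accel_z  = [3.1, 2.2, 0.16, 0.14, 0.15, -2.8, -3.0, 0.15, 0.16]
print(round(estimate_axis_offset(pressure, accel_z), 2))   # -> 0.15
```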
When an athlete can see the forces acting on the sensor unit 3 at different positions during a movement, it aids the athlete in comparing several similar movements to determine which one is the most effective. According to some aspects, the processing circuitry 4 is arranged to determine movement characteristics of the sensor unit 3 by fusing the data captured by the camera unit 2 and the sensor unit 3. The movement characteristics are for example the velocity of the sensor unit 3 in vx, vy, and optionally vz, with the unit m/s. Another example is that the forces acting on the sensor unit 3 can be calculated. According to some aspects, the analysis system 1 comprises a display for displaying the video frame data received from the camera unit 2, wherein the processing circuitry 4 is arranged to process the video frame data to visualize the determined movement characteristics in 2 or 3 dimensions together with the video frame data. This provides a usable way for the user to see, in the video, the determined movement characteristics together with the video of the movement. It is therefore easy for the athlete to analyse her/his movement and assess if she/he should adjust anything in the movement. The movement characteristics are for example the velocity of the sensor unit 3, which may be illustrated as a vector with a direction and a length. The vector may also illustrate the force on the sensor unit 3 or the acceleration of the sensor unit 3. The visualisation may be illustrated in the video frame data in connection to the sensor unit 3 or anywhere else in the image. It may for example be illustrated along the edge of the video frame data or in an added data window anywhere on the screen. In figure 8, an example of a screen with an image of the hammer thrower of figure 1 is shown. The square boxes are examples of where the visualized movement characteristics can be shown. The determined movement characteristics can thus be illustrated anywhere on the screen so that an onlooker easily can analyse the information. For example, if a hammer thrower looks at the video of his/her hammer throw, he/she will see, in all positions of the hammer, what movement characteristics the hammer had, when the sensor unit is arranged on the hammer. The hammer thrower can then analyse in which positions there is need for a change, for example more force or more speed, for improving the hammer throw.
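Overlaying the determined movement characteristics on the video can be done with ordinary drawing primitives. The sketch below draws a velocity vector at the sensor unit's image position using OpenCV; the pixels-per-metre display scale and the example values are made up:

```python
import cv2
import numpy as np

def draw_velocity(frame, sensor_px, velocity_mps, px_per_m=50.0):
    """Draw the sensor unit's velocity as an arrow starting at its pixel position,
    plus a text label with the speed magnitude. px_per_m is an assumed display scale."""
    vx, vy = velocity_mps
    tip = (int(sensor_px[0] + vx * px_per_m), int(sensor_px[1] - vy * px_per_m))  # image y grows downwards
    cv2.arrowedLine(frame, sensor_px, tip, (0, 255, 0), 2, tipLength=0.2)
    speed = (vx**2 + vy**2) ** 0.5
    cv2.putText(frame, f"{speed:.1f} m/s", (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    return frame

# Example: sensor unit at pixel (320, 240) moving at 4 m/s to the right and 1 m/s upwards
frame = np.zeros((480, 640, 3), dtype=np.uint8)
draw_velocity(frame, (320, 240), (4.0, 1.0))
cv2.imwrite("overlay_example.png", frame)
```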
To increase the accuracy of the positioning, triangulation may be used. According to some aspects, the analysis system 1 comprises at least one additional communication unit 5 for wireless communication to be fixed in the environment, wherein the processing circuitry 4 is arranged to receive synchronization signals from the additional communication unit 5 and triangulate the position of the sensor unit 3 using the signals from the additional communication unit 5, the communication unit 5 of the camera unit 2 and the communication unit 33 of the sensor unit 3. The increasing of the accuracy of the absolute position of the sensor unit 3 and the sensor data by fusion of the absolute position data and sensor data may then also include fusion of the triangulated position. An example of this is shown in figure 5, which has been explained above. It is possible to use the communication unit 5 of the smart phone instead of one of the individual communication units 5, but for a precise system 1 it may be beneficial to fixate the units used in the environment. In figure 9, it is also illustrated that information from the communication unit/units 5 may be used for the positioning. The position of the sensor unit 3 can thus be even more accurately determined. The additional communication unit 5 may comprise BLE and/or UWB technology for communication with BLE and/or UWB. It may also comprise another technology for wireless communication. When having at least three units communicate wirelessly, triangulation may be performed to determine positions of the three units.
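With distances to at least three fixed communication units, the position can be solved by trilateration; a least-squares sketch in two dimensions, with assumed anchor coordinates, could look like this:

```python
import numpy as np

def trilaterate_2d(anchors, distances):
    """Solve for an (x, y) position from distances to three or more fixed anchors.

    anchors:   known (x, y) positions of the communication units
    distances: measured distances (m) to each anchor, e.g. from UWB ranging
    Uses the standard linearisation against the last anchor and least squares.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    x_n, y_n = anchors[-1]
    A = 2 * (anchors[:-1] - anchors[-1])
    b = (d[-1]**2 - d[:-1]**2
         + anchors[:-1, 0]**2 - x_n**2
         + anchors[:-1, 1]**2 - y_n**2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Example: anchors at three corners of the training area, true sensor position (2, 3)
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true = np.array([2.0, 3.0])
dists = [np.linalg.norm(true - np.array(a)) for a in anchors]
print(trilaterate_2d(anchors, dists))   # -> approximately [2. 3.]
```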
For BLE, the strength of the signal may be used. The signal strength between two devices is measured and converted to a distance estimate. For UWB, in the IEEE 802.15.4-2011 standard, radio waves with very short impulse transmissions are used. The short bursts of signals, with sharp rises and drops, make the signals' starts and stops easy to measure. This means that the distance between two UWB devices can be measured precisely by measuring the time that it takes for a radio wave to pass between the two devices.
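Both distance measures can be written down directly. In the sketch below, the 1 m reference power and path-loss exponent for BLE, and the responder reply delay for UWB two-way ranging, are assumed example values:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_rssi(rssi_dbm, rssi_at_1m_dbm=-59.0, path_loss_exponent=2.0):
    """BLE: log-distance path-loss model. The 1 m reference power and exponent
    depend on hardware and environment; the defaults here are assumptions."""
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

def distance_from_tof(round_trip_s, reply_delay_s):
    """UWB: two-way ranging. Half of the round-trip time, minus the responder's
    known reply delay, multiplied by the speed of light."""
    return (round_trip_s - reply_delay_s) / 2.0 * SPEED_OF_LIGHT

print(round(distance_from_rssi(-71.0), 2))            # -> ~3.98 m
print(round(distance_from_tof(100.1e-9, 60e-9), 2))   # -> ~6.01 m
```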
Both to improve the image quality of the camera 21 and to improve the preciseness of the triangulation, the one or more camera units 2 may be fixed in the environment by the user. In other words, the user may position the camera unit 2 in the surroundings of the object to be filmed and fix it by, for example, fastening it to something or by placing it in a stand.

In the figures, boxes with a dashed line are optional. In figure 10, boxes are illustrated to show the processing steps of the processing circuitry. Please note that the steps may be taken in another order than illustrated. For example, it does not matter in which order the processing circuitry receives the information in steps i to iii.
List of references

1. System
2. Camera unit
21. Camera
22. Active IR sensor
3. Sensor unit
31. Accelerometer
32. Gyroscope
33. Communication unit
4. Processing circuitry
5. Communication unit
6. Smart phone
7. Pressure sensor unit
71. Pressure sensor

Claims (12)

1. An analysis system (1) for analysing movements associated with sports, the system (1) comprises:
- at least one camera unit (2) comprising a camera (21) and having access to a communication unit (5) for wireless communication,
- at least one sensor unit (3), for attaching to a sports object or athlete, comprising an accelerometer (31), a gyroscope (32), and a communication unit (33) for wireless communication,
- processing circuitry (4) having access to a communication unit (5) for wireless communication, the processing circuitry (4) being arranged to:
i. receive synchronization information from the sensor unit (3) and the camera unit (2),
ii. receive a video frame data captured by the camera (21) of the camera unit (2),
iii. receive sensor data from the accelerometer (31) and gyroscope (32) from the sensor unit (3) for the same time as the captured video frame,
iv. determine a position of the sensor unit (3),
v. transforming the determined position to an absolute position in a 2- or 3-dimensional space,
vi. increasing the accuracy of the absolute position of the sensor unit (3) and the sensor data by fusion of the absolute position data and sensor data.

2. The analysis system according to claim 1, wherein the determine a position of the sensor unit (3) is performed in the video frame, by image processing or/and by using the gyro and accelerometer data and a fix starting position of the sensor unit.

3. The analysis system (1) according to claim 1 or 2, wherein the processing circuitry (4) is arranged to determine movement characteristics of the sensor unit (3) by fusing the data captured by the camera unit (2) and the sensor unit (3).

4. The analysis system (1) according to claim 3, comprising a display for displaying the video frame data received from the camera unit (2), wherein the processing circuitry (4) is arranged to process the video frame data to visualize the determined movement characteristics in 2- or 3-dimensions together with the video frame data.

5. The analysis system (1) according to any preceding claim, comprising at least one additional communication unit (5) for wireless communication to be fixed in the environment, wherein the processing circuitry (4) is arranged to:
vi. receive synchronization signals from the additional communication unit (5),
vii. triangulate the position of the sensor unit (3) using the signals from the additional communication unit (5), the communication unit (5) of the camera unit (2) and the communication unit (33) of the sensor unit (3),
and wherein the increasing the accuracy of the absolute position of the sensor unit (3) and the sensor data by fusion of the absolute position data and sensor data is also including fusion of the triangulated position.

6. The analysis system (1) according to any preceding claim, wherein one of the at least one camera unit (2) and the processing circuitry (4) are comprised in a smart phone (6).

7. The analysis system (1) according to any preceding claim, comprising a marker and wherein the determining a position of the sensor unit (3) comprises identifying a marker in the video data.

8. The analysis system (1) according to any preceding claim, wherein the increasing the accuracy of the absolute position and the sensor data is done by filtering by a Kalman filter or equivalent.

9. The analysis system (1) according to any preceding claim, wherein steps ii to vi are repeatedly performed by the processing circuitry (4) as the camera unit (2) and sensor unit (3) continuously capture and send data.

10. The analysis system (1) according to any preceding claim, wherein the processing circuitry (4) is arranged to determine if there is an error in the sensor data from the accelerometer (31) or gyroscope (32) based on the determined absolute position, and if there is an error, to determining the quantity of the error.

11. The analysis system (1) according to claim 10, wherein the processing circuitry (4) is arranged to compensate for determined errors to improve the absolute position.

12. The analysis system (1) according to any preceding claim, comprising a pressure sensor unit (7), comprising a pressure sensor (71) with access to a communication unit (5), for calibrating one axis of the accelerometer (31), wherein the processing circuitry is arranged to receive synchronization information and pressure sensor data from the pressure sensor and wherein the increasing the accuracy of the absolute position of the sensor unit (3) and the sensor data by fusion of the absolute position data and sensor data is also including fusion of the pressure sensor data.
SE1950724A 2019-06-14 2019-06-14 System for analyzing movement in sport SE543581C2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE1950724A SE543581C2 (en) 2019-06-14 2019-06-14 System for analyzing movement in sport
DE102020115519.0A DE102020115519A1 (en) 2019-06-14 2020-06-10 System for analyzing movement in sports

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE1950724A SE543581C2 (en) 2019-06-14 2019-06-14 System for analyzing movement in sport

Publications (2)

Publication Number Publication Date
SE1950724A1 2020-12-15
SE543581C2 SE543581C2 (en) 2021-04-06

Family

ID=73546947

Family Applications (1)

Application Number Title Priority Date Filing Date
SE1950724A SE543581C2 (en) 2019-06-14 2019-06-14 System for analyzing movement in sport

Country Status (2)

Country Link
DE (1) DE102020115519A1 (en)
SE (1) SE543581C2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102021134460A1 (en) 2021-12-23 2023-06-29 Otto-von-Guericke-Universität Magdeburg, Körperschaft des öffentlichen Rechts Measuring unit for determining the radius and sports equipment equipped therewith

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015157808A1 (en) * 2014-04-18 2015-10-22 Catapult Group International Ltd Sports throwing measurement
US10065074B1 (en) * 2014-12-12 2018-09-04 Enflux, Inc. Training systems with wearable sensors for providing users with feedback
US10182746B1 (en) * 2017-07-25 2019-01-22 Verily Life Sciences Llc Decoupling body movement features from sensor location

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015157808A1 (en) * 2014-04-18 2015-10-22 Catapult Group International Ltd Sports throwing measurement
US10065074B1 (en) * 2014-12-12 2018-09-04 Enflux, Inc. Training systems with wearable sensors for providing users with feedback
US10182746B1 (en) * 2017-07-25 2019-01-22 Verily Life Sciences Llc Decoupling body movement features from sensor location

Also Published As

Publication number Publication date
SE543581C2 (en) 2021-04-06
DE102020115519A1 (en) 2020-12-17

Similar Documents

Publication Publication Date Title
US11150071B2 (en) Methods of determining performance information for individuals and sports objects
CN104853104B (en) A kind of method and system of auto-tracking shooting moving target
US11348255B2 (en) Techniques for object tracking
JP6814196B2 (en) Integrated sensor and video motion analysis method
US8941723B2 (en) Portable wireless mobile device motion capture and analysis system and method
EP2609568B1 (en) Portable wireless mobile device motion capture and analysis system and method
CN104834917A (en) Mixed motion capturing system and mixed motion capturing method
US10837794B2 (en) Method and system for characterization of on foot motion with multiple sensor assemblies
CN115024715B (en) Human motion intelligent measurement and digital training system
JP7317399B2 (en) Electronic device and system for guiding ball drop point
US11577125B2 (en) Sensor device-equipped golf shoes
US20140106892A1 (en) Method for swing result deduction and posture correction and the apparatus of the same
KR20180050589A (en) Apparatus for tracking object
US11460912B2 (en) System and method related to data fusing
EP3465252A1 (en) Sports officiating system
SE1950724A1 (en) System for analyzing movement in sport
WO2022257597A1 (en) Method and apparatus for flexible local tracking
JP2016127880A (en) Information recording apparatus, information recording system, information recording method and information recording program
JP2016116612A (en) Carry measurement device, hit ball direction measurement device, carry measurement system, carry measurement method, and program
CN207503173U (en) A kind of interactive holographic blackboard system
JP2018005474A (en) Mobile entity identification device, mobile entity identification system, mobile entity identification method and program
Lombardo et al. An inertial-based system for golf assessment
KR20180011947A (en) Analysis system of golf swing
CN109117000A (en) A kind of interactive holographic blackboard system