WO2024047482A1 - Fitness assessment system and methodology - Google Patents

Fitness assessment system and methodology

Info

Publication number
WO2024047482A1
WO2024047482A1 (PCT/IB2023/058403)
Authority
WO
WIPO (PCT)
Prior art keywords
person
capturing device
heart rate
image capturing
image
Prior art date
Application number
PCT/IB2023/058403
Other languages
French (fr)
Inventor
Bernard Matthee STEYN
Christiaan Maarten VAN DER WALT
Ernst-Erich DINKELMANN
Francesco Orlando JOSHUA
Horatio Benjamin MOGGEE
Ruhan COETZER
Original Assignee
Momentum Metropolitan Life Limited
Priority date
Filing date
Publication date
Application filed by Momentum Metropolitan Life Limited
Publication of WO2024047482A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/15Biometric patterns based on physiological signals, e.g. heartbeat, blood flow
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/40Document-oriented image-based pattern recognition
    • G06V30/41Analysis of document content
    • G06V30/418Document matching, e.g. of document images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40Spoof detection, e.g. liveness detection

Definitions

  • THIS invention relates to a fitness assessment method and system.
  • the invention also relates to a computer program and digital application (e.g. a mobile application or web-based application) which can implement the fitness assessment methodology when run/executed on a computing device.
  • a method of obtaining an indication of fitness includes: receiving/obtaining/retrieving images or video of a specific person which has been captured by using an image capturing device; conducting a liveness test, by using images or video captured by the image capturing device, to verify that the person captured in the images or video is a real person; verifying, by using a processor, the identity of the person by utilising an image captured by the image capturing device; and performing a fitness analysis on the person, using a processor, by utilising images/video captured by the image capturing device.
  • video typically includes a series of images.
  • the images and video may be of the person’s face.
  • the fitness analysis may be to determine/estimate a VO2 max score/rating for the person.
  • VO2 max is a well-known term and relates to a measure of the maximum amount of oxygen someone’s body can utilize during exercise.
  • VO2 max can, for example, be predicted by conducting a so-called sub-maximal test (i.e. a test which does not require maximal exertion).
  • the method may include capturing the images/video by utilising the image capturing device.
  • the image capturing device may be a camera of a computing device.
  • the computing device may be a laptop, desktop computer, smart phone, tablet or any other smart device which has a camera.
  • the step of verifying the identity of the person may include comparing the image of the person (e.g. a person’s face) with an image of the person (e.g. the person’s face) stored on a database (hereinafter referred to as the “stored image”).
  • the database may be an official/government source/database (e.g. from the Department of Home Affairs). The stored image may therefore be obtained/retrieved from the official/government source/database.
  • the step of verifying the identity of the person may include utilising machine learning in order to perform the verification.
  • the method may further include: receiving and validating a scanned/captured image of an identification card/document, a driver’s license or passport of the person which depicts the person’s image; and comparing the image of the person captured by the image capturing device with the image of the person depicted in the identification card/document, driver’s license or passport, for verification purposes.
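The comparison step described above is commonly implemented by extracting a fixed-length embedding vector from each face image and thresholding their similarity. The sketch below illustrates only that final comparison; the embedding model, the threshold value and all function names are assumptions, not taken from the publication.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_identity(captured_embedding: np.ndarray,
                    stored_embedding: np.ndarray,
                    threshold: float = 0.6) -> bool:
    """Accept the identity claim if the embedding of the captured face is
    sufficiently similar to the embedding of the stored (official) image.
    The 0.6 threshold is an illustrative placeholder."""
    return cosine_similarity(captured_embedding, stored_embedding) >= threshold
```

In practice the embeddings would come from a pretrained face-recognition model, and the threshold would be tuned on labelled verification data.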
  • the method may include performing a fitness readiness assessment prior to performing the fitness analysis on the person (to ensure that the person is healthy enough to engage in the exercise necessary for the fitness analysis).
  • the step of performing the fitness readiness assessment may include utilising user profile information received/captured from the person and/or information obtained/retrieved by conducting a scan using the image capturing device.
  • the scan may be performed when the person is in a resting/rested state (i.e. prior to exercising) (also referred to as a “resting scan”).
  • the method may include determining whether there may be a possible health-related reason(s) why the person should be discouraged from proceeding with the fitness analysis step.
  • the method may further include informing the person if there is such a reason.
  • Examples of health-related reason(s) may include (but are not limited to):
  • the step of performing the fitness analysis may include detecting a heart rate of the person by utilising a series of images or video captured by the image capturing device (captured over a period of time).
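Detecting a heart rate from a series of images is typically done with remote photoplethysmography (rPPG): small periodic colour changes in the skin are extracted from the frames, and the dominant frequency in the physiological band is read off. The publication does not disclose its detection method; the following is a minimal sketch of the frequency-analysis step only, assuming the per-frame mean green-channel intensity has already been extracted.

```python
import numpy as np

def estimate_heart_rate(green_means: np.ndarray, fps: float) -> float:
    """Estimate heart rate (bpm) from the mean green-channel intensity of
    each video frame, by locating the dominant frequency inside the
    plausible heart-rate band (0.7-4.0 Hz, i.e. 42-240 bpm)."""
    signal = green_means - np.mean(green_means)       # remove DC component
    spectrum = np.abs(np.fft.rfft(signal))            # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)            # physiological range
    dominant = freqs[band][np.argmax(spectrum[band])]
    return dominant * 60.0                            # Hz -> beats per minute
```

A production pipeline would additionally track the face region, filter out motion artefacts and average over sliding windows; this sketch shows only the core spectral estimate.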
  • the method may include communicating to the person, via a user interface, what exercise(s) or movement(s) should be performed by the person during the fitness assessment.
  • the user interface may be presented on a display screen of a computing device, such as a smart phone or laptop.
  • the method may include a second verification step whereby the identity of the person is again verified after the exercise(s) or movement(s) has/have been performed by the person, by using an image(s) captured by the image capturing device.
  • the method may include receiving an identification number/code from the person via a communication network.
  • the method may further include utilising the identification number/code to retrieve an image of the person (e.g. of a face of a person) which is associated with the identification number/code, which is then used during the verification step.
  • the method may include receiving/obtaining/retrieving user profile information from the person.
  • the user profile information may include age, sex, height and weight, and optionally a self-reported fitness level, etc.
  • the user profile information may also include a smoker status and/or details of any medication use.
  • the age and sex of the person may be extracted from the received identification number/code.
  • the method may include calculating a BMI (Body Mass Index) for the person. The calculation may be done by using the height and weight of the person. Alternatively, the BMI could be estimated by using image(s) captured by the image capturing device. The BMI may be added to, or form part of, the user profile.
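The BMI calculation from height and weight follows the standard definition (weight in kilograms divided by the square of the height in metres); a minimal sketch:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight (kg) divided by height (m) squared."""
    return weight_kg / (height_m ** 2)
```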
  • the method may include using a processor and at least some of the user profile information in order to determine what exercise(s) or movement(s) should be performed by the person during the fitness assessment.
  • the step of performing the fitness analysis may include determining, using a processor, the VO2 max score/rating for the person by measuring recovery heart rate of the person, preferably after a person has performed the exercise(s) or movement(s), by utilising images/video captured by the image capturing device.
  • the step of performing the fitness analysis may include determining, by using a processor, whether the person has reached a predetermined peak heart rate.
  • the step of performing the fitness analysis may include determining, by using a processor, if the peak heart rate has been reached, and if so, determining the VO2 max score/rating for the person by utilising a primary algorithm/machine learning model (e.g. a linear regression algorithm).
  • the method may include determining, by using a processor, if the peak heart rate has been reached, and if so, determining the VO2 max score/rating for the person by: measuring recovery heart rate for the person by utilising images/video captured by the image capturing device, and using the recovery heart rate as an input to the primary algorithm/machine learning model which then determines the VO2 max score/rating.
  • the method may include, if the peak heart rate has not been reached, determining the VO2 max score/rating for the person by utilising a secondary algorithm/machine learning model (e.g. a linear regression algorithm) which is different from the primary algorithm/machine learning model.
  • the method may include, if the peak heart rate has not been reached, determining the VO2 max score/rating for the person by: measuring a maximum heart rate achieved immediately after the person has completed the required exercise(s) or movement(s), by utilising images/video captured by the image capturing device; measuring recovery heart rate for the person by utilising images/video captured by the image capturing device, and using the maximum heart rate and recovery heart rate as inputs to the secondary algorithm/machine learning model which then determines the VO2 max score/rating, wherein the secondary algorithm/machine learning model is different from the primary algorithm/machine learning model.
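The primary and secondary models described above can be pictured as two linear regressions over different feature sets. The coefficients, intercepts and chosen profile features below are purely illustrative placeholders (the publication does not disclose them); only the structure follows the text: recovery heart rate alone for the primary model, and maximum plus recovery heart rate for the secondary model.

```python
# Hypothetical linear-regression models; all coefficients are illustrative.

def vo2max_primary(recovery_hr: float, age: int, bmi: float) -> float:
    """Primary model: used when the predetermined peak heart rate was
    reached. Takes the recovery heart rate plus profile features."""
    return 70.0 - 0.12 * recovery_hr - 0.25 * age - 0.35 * bmi

def vo2max_secondary(max_hr: float, recovery_hr: float,
                     age: int, bmi: float) -> float:
    """Secondary model: used when the peak heart rate was not reached.
    Also takes the maximum heart rate achieved right after exercise."""
    return 60.0 + 0.05 * max_hr - 0.10 * recovery_hr - 0.22 * age - 0.30 * bmi
```

In a real system both models would be fitted to reference VO2 max measurements (e.g. with ordinary least squares) rather than hand-written.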
  • the method may include calculating, using a processor, the predetermined peak heart rate for the person.
  • the calculation of the predetermined peak heart rate for the person may include utilising the person’s age.
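The publication does not state how the predetermined peak heart rate is derived from age. A common convention, used here purely as an assumption, is a fraction of the age-predicted maximum heart rate from the Fox formula (220 - age):

```python
def predetermined_peak_heart_rate(age_years: int,
                                  fraction: float = 0.85) -> float:
    """Peak heart-rate target as a fraction of the age-predicted maximum.
    Both the Fox formula (220 - age) and the 85% fraction are assumptions;
    the publication only says the calculation may utilise the person's age."""
    return fraction * (220 - age_years)
```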
  • the step of performing the fitness analysis may include classifying the person into one of a plurality of fitness categories/classes, based on the person’s VO2 max score/rating.
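Classification into fitness categories can be sketched as a simple bucketing of the VO2 max value. The cut-off values and category names below are illustrative only; the actual rules are the age- and sex-specific tables referred to as Figures 5a-5k.

```python
def fitness_category(vo2_max: float) -> str:
    """Bucket a VO2 max value (ml/kg/min) into a fitness class.
    Thresholds are illustrative placeholders, not the patent's tables."""
    if vo2_max >= 50:
        return "excellent"
    if vo2_max >= 40:
        return "good"
    if vo2_max >= 30:
        return "average"
    return "below average"
```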
  • the method may include estimating, using a processor, the person’s VO2 max score/rating by using the detected heart rate and at least some of the user profile information.
  • the method may be implemented on a smart phone, tablet or laptop.
  • the method may include determining a level of risk associated with the person, based on the VO2 max score/rating.
  • the level of risk may be a health risk.
  • a method of performing an insurance underwriting process/procedure wherein the method includes implementing the method in accordance with the first aspect of the invention.
  • the method may further include determining a health risk associated with the person, based on the VO2 max score/rating.
  • a computer program, mobile application or web-based application which includes a set of instructions which, if executed by a processor/computer, performs the following steps: receiving/obtaining/retrieving images or video of a specific person which has been captured by using an image capturing device; conducting a liveness test, by using images/video captured by the image capturing device, to verify that the person captured in the images/video is a real person; verifying, by using a processor, the identity of the person by utilising an image captured by the image capturing device; and performing a fitness analysis on the person, using a processor, by utilising images/video captured by the image capturing device.
  • the image capturing device may be an image capturing device of a mobile communication device on which the mobile application is stored/installed.
  • the mobile communication device may be a smart device, such as a smart phone or smart tablet.
  • the fitness analysis may be to determine/estimate a VO2 sub-maximal score/rating or a VO2 max score/rating for the person.
  • the set of instructions may, if executed by a processor/computer, perform a step of capturing the images/video by utilising the image capturing device.
  • the image capturing device may be a camera of a computing device on which the computer program, web-based application or mobile application is stored.
  • the computing device may be a laptop, desktop computer, smart phone, tablet or any other smart device which has a camera.
  • the step of verifying the identity of the person may include comparing the image of the person with an image of the person stored on a database (hereinafter referred to as the “stored image”).
  • the database may be an official/government source/database (e.g. from Home Affairs).
  • the stored image may therefore be obtained/retrieved from the official/government source/database.
  • the step of verifying the identity of the person may include utilising machine learning in order to perform the verification.
  • the set of instructions may, if executed by a processor/computer, perform the following steps, if the verification fails: receiving a scanned/captured image of an identification card/document, a driver’s license or passport of the person which depicts the person’s image; and comparing the image of the person captured by the image capturing device with the image of the person depicted in the identification card/document, driver’s license or passport, for verification purposes.
  • the step of performing the fitness analysis may include detecting a heart rate of the person by utilising a series of images/video captured by the image capturing device.
  • the set of instructions may, if executed by a processor/computer, perform a step of communicating to the person, via a user interface presented on a display screen of a computing device on which the computer program is stored, what exercise(s) or movement(s) should be performed by the person during the fitness assessment.
  • the computing device may be a mobile communication device and the computer program may be a mobile application which is stored on the mobile communication device.
  • the set of instructions may, if executed by a processor/computer, perform a second verification step whereby the identity of the person is again verified after the exercise has been performed by the person, by using image(s) captured by the image capturing device.
  • the set of instructions may, if executed by a processor/computer, perform the step of receiving an identification number/code from the person via a communication network.
  • the set of instructions may, if executed by a processor/computer, perform the further step of utilising the identification number/code to retrieve an image which is associated with the identification number/code, which is then used during the verification step.
  • the set of instructions may, if executed by a processor/computer, perform the step of receiving/obtaining/retrieving user profile information from the person.
  • the user profile information may include age, sex, height and weight, and optionally a self-reported fitness level, etc.
  • the user profile information may also include a smoker status and/or details of any medication use.
  • the age and sex of the person may be extracted from the received identification number/code.
  • the set of instructions may, if executed by a processor/computer, perform the step of calculating a BMI (Body Mass Index) for the person.
  • the calculation may be done by using the height and weight of the person.
  • the BMI could be estimated by using image(s) captured by the image capturing device.
  • the BMI may be added to, or form part of, the user profile.
  • the set of instructions may, if executed by a processor/computer, determine the VO2 max score/rating for the person by measuring recovery heart rate of the person, by utilising images/video captured by the image capturing device. More specifically, the set of instructions may, if executed by a processor/computer, determine the VO2 max score/rating for the person by measuring recovery heart rate of the person after the person has performed the exercise(s) or movement(s), by utilising images/video captured by the image capturing device.
  • the step of performing the fitness analysis may include determining whether the person has reached a predetermined peak heart rate.
  • the step of performing the fitness analysis may include determining, by using a processor, if the peak heart rate has been reached, and if so, determining the VO2 max score/rating for the person by utilising a primary algorithm/machine learning model (e.g. a linear regression algorithm).
  • the step of performing the fitness analysis may include determining if the peak heart rate has been reached, and if so, determining a VO2 max score/rating for the person by: measuring recovery heart rate for the person by utilising images/video captured by the image capturing device, and using the recovery heart rate as an input to the primary algorithm/machine learning model which then determines the VO2 max score/rating.
  • the set of instructions may, if executed by a processor/computer, determine the VO2 max score/rating for the person by utilising a secondary algorithm/machine learning model (e.g. a linear regression algorithm) which is different from the primary algorithm, if the peak heart rate has not been reached.
  • the set of instructions may, if executed by a processor/computer, perform the following steps if the peak heart rate has not been reached, in order to determine the VO2 max score/rating for the person by: measuring a maximum heart rate achieved immediately after the person has completed the required exercise(s) or movement(s), by utilising images/video captured by the image capturing device; measuring recovery heart rate for the person by utilising images/video captured by the image capturing device, and using the maximum heart rate and recovery heart rate as inputs to the secondary algorithm/machine learning model which then determines the VO2 max score/rating, wherein the secondary algorithm/machine learning model is different from the primary algorithm/machine learning model.
  • the step of performing the fitness analysis may include classifying the person into one of a plurality of fitness categories/classes, based on the person’s VO2 max score/rating.
  • a computing device on which the computer program/mobile application/web- based application in accordance with the third aspect of the invention is stored.
  • the computing device may be a computer, smart phone, tablet or any other smart device which has a camera.
  • a non- transitory computer-readable storage medium on which the computer program/mobile application/web-based application in accordance with the third aspect of the invention is stored.
  • a fitness assessment system for obtaining an indication of fitness, wherein the system includes: a liveness assessment module which is configured to utilise images/video of a person which has been captured by using an image capturing device, to verify that the person captured in the image/video is a real person; an identity verification module which is configured to utilise an image of a specific person which has been captured by using the image capturing device in order to verify the identity of the person; and a fitness analysis module which is configured to perform a fitness analysis on the person, using a processor, by utilising images/video captured by the image capturing device.
  • the fitness analysis may be to determine/estimate a VO2 sub-maximal score/rating or a VO2 max score/rating for the person.
  • the image capturing device may form part of the system.
  • the system may be configured to capture the images/video by utilising the image capturing device.
  • the image capturing device may be a camera of a computing device.
  • the computing device may be a laptop, desktop computer, smart phone, tablet or any other smart device which has a camera.
  • the computing device may form part of the system.
  • the verification module may be configured to compare a captured image of the person with an image of the person stored on a database (hereinafter referred to as the “stored image”).
  • the database may be an official/government source/database (e.g. from Home Affairs)
  • the verification module may be configured to query the database in order to obtain the stored image.
  • the verification module may be configured to utilise machine learning in order to perform the verification.
  • the verification module may be configured such that, if the verification fails, it implements an alternative verification process whereby the verification module: receives a scanned/captured image of an identification card/document, a driver’s license or passport of the person which depicts the person’s image; and compares the image of the person captured by the image capturing device with the image of the person depicted in the identification card/document, a driver’s license or passport, for verification purposes.
  • the fitness analysis module may be configured to detect a heart rate of the person by utilising images/video captured by the image capturing device.
  • the fitness analysis module may be configured to communicate to the person, via a user interface, what exercise(s) or movement(s) should be performed by the person during the fitness assessment.
  • the user interface may be presented on a display screen of a computing device, such as a smart phone or laptop.
  • the verification module may be configured to perform a second verification step whereby the identity of the person is again verified after the exercise has been performed by the person, by using image(s) captured by the image capturing device.
  • the verification module may be configured to: receive an identification number/code from the person via a communication network; and utilise the identification number/code to retrieve an image which is associated with the identification number/code, which is then used during the verification step/process.
  • the system may be configured to receive/obtain/retrieve user profile information from the person.
  • the user profile information may include age, sex, height and weight, and optionally, a self-reported fitness level, etc.
  • the user profile information may also include a smoker status and/or details of any medication use.
  • the age and sex of the person may be extracted from the received identification number/code.
  • the system, particularly the fitness analysis module, may be configured to calculate a BMI (Body Mass Index) for the person. The calculation may be done by using the height and weight of the person. Alternatively, the BMI could be estimated by the system (particularly the fitness analysis module) by using an image(s) captured by the image capturing device. The BMI may be added to, or form part of, the user profile.
  • the fitness analysis module may be configured to use at least some of the user profile information in order to determine what exercise(s) or movement(s) should be performed by the person during the fitness assessment.
  • the system may be configured to perform a fitness readiness assessment prior to performing the fitness analysis on the person (e.g. to ensure that the person is healthy enough to engage in the exercise necessary for the fitness analysis).
  • the system may be configured to perform the fitness readiness assessment by utilising user profile information received/captured from the person and/or information obtained/retrieved by conducting a scan using the image capturing device.
  • the scan may be performed when the person is in a resting/rested state (i.e. prior to exercising) (also referred to as a “resting scan”).
  • the system may be configured to determine whether there may be a possible health-related reason(s) why the person should be discouraged from proceeding with the fitness analysis step, by using the user profile information and information obtained/retrieved from the scan. Examples of health-related reason(s) may include (but are not limited to):
  • the fitness assessment module may be configured to determine the VO2 max score/rating for the person by measuring recovery heart rate of the person by utilising images/video captured by the image capturing device. More specifically, the fitness assessment module may be configured to determine the VO2 max score/rating for the person by measuring recovery heart rate of the person after a person has performed the exercise(s) or movement(s), by utilising images/video captured by the image capturing device.
  • the fitness assessment module may be configured to determine whether the person has reached a predetermined peak heart rate.
  • the fitness assessment module may be configured to determine if the peak heart rate has been reached, and if so, determine the VO2 max score/rating for the person by utilising a primary algorithm/machine learning model (e.g. a linear regression algorithm). More specifically, the fitness assessment module may be configured to determine if the peak heart rate has been reached, and if so, determine the VO2 max score/rating for the person by: measuring recovery heart rate for the person by utilising images/video captured by the image capturing device, and using the recovery heart rate as an input to a primary algorithm/machine learning model which then determines the VO2 max score/rating.
  • the fitness assessment module may be configured such that, if the peak heart rate has not been reached, to determine the VO2 max score/rating for the person by utilising a secondary algorithm/machine learning model (e.g. a linear regression algorithm) which is different from the primary algorithm/machine learning model.
  • the fitness assessment module may be configured, if the peak heart rate has not been reached, to determine the VO2 max score/rating for the person by: measuring a maximum heart rate achieved immediately after the person has completed the required exercise(s) or movement(s), by utilising images/video captured by the image capturing device; measuring recovery heart rate for the person by utilising images/video captured by the image capturing device; and using the maximum heart rate and recovery heart rate as inputs to the secondary algorithm/machine learning model, which is different from the primary algorithm/machine learning model.
  • the fitness assessment module may be configured to classify the person into one of a plurality of fitness categories/classes, based on the person’s VO2 max score/rating.
  • the system may be implemented on a computing device, such as a smart phone.
  • the system may however also be web-based.
  • Figure 1 shows a schematic layout of a fitness assessment system in accordance with the invention;
  • Figure 2 shows a flow diagram which illustrates the general process flow of the system shown in Figure 1 ;
  • Figure 3 shows a flow diagram which illustrates an identity verification process illustrated as block 212 in Figure 2;
  • Figure 4a shows a flow diagram of another example of a process flow of the system shown in Figure 3;
  • Figures 4b-e each show a different portion of the flow diagram shown in Figure 4a; and
  • Figures 5a-5k each show a table which sets out the bucketing rules used by the fitness assessment system in accordance with the invention for converting VO2 max values into fitness levels.
  • the present invention relates to a fitness assessment system and methodology which can typically be implemented by a computing device, such as a laptop, a server (specifically a web-based server - i.e. the system may be web-based) or a smartphone which has a camera.
  • the invention can therefore include a computer program or mobile application or web-based application which is typically installed on, or accessed via, a computer or smart phone, which can then perform the fitness assessment in conjunction with a central server (with which the computer/smart phone can communicate), by utilising images/video captured by an image capturing device (e.g. a camera) of the computing device.
  • the captured images/video are essentially used to (i) conduct a liveness analysis (to determine that the image capturing device captured images/video of a real person), (ii) verify that the person is the actual person who is associated with a particular user profile, and (iii) measure user heart rate, so that it can be used in order to conduct the fitness analysis.
  • the system 10 includes a server 22 (e.g. which includes both a web-based and application-based server) with which users 100 can communicate over a communication network 50, by using a mobile application installed on their smart phone 12 or via the web (e.g. using a laptop to access a web interface via which communication can take place).
  • the user 100 would download the mobile application (e.g. from an App store) and enter certain user profile information during a registration process via a user interface presented by the mobile application on a display screen of the smart phone 12 (see block 200 in Figure 2).
  • the user profile information may include details such as an identity (ID) number, height, weight, and a self-reported/self-determined fitness level (i.e. the user provides an indication of his/her own level of fitness).
  • the user profile information may also include a smoker status and/or details of any medication use. These details are then sent, via a communication network 50, to the server 22.
  • the entered ID number is checked against a database 26 of the Department of Home Affairs (DHA), in order to validate the authenticity of the user 100. Additional information can also be obtained from the DHA database 26, including a photo of the user (hereinafter referred to as the “official image” of the user), the user’s name, sex and status, which are then added to the user profile. The user profile is then stored on the database 24.
  • the server 22 is also able to extract the age and sex of the user from the ID number and store this information as part of the user profile information.
  • the server 22 (more specifically the fitness analysis module 18) is also configured to calculate a BMI (Body Mass Index) of the user 100 by utilising the user’s height and weight, and add the BMI to the user profile.
  • the BMI could be estimated by using image(s) captured by the image capturing device 20.
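The BMI calculation described above (from the user's height and weight) is the standard weight-over-height-squared formula; a minimal sketch, where the function name and units are illustrative rather than taken from the specification:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight (kg) divided by height (m) squared.

    Illustrative helper; the specification only states that BMI is
    calculated from the user's height and weight.
    """
    if height_m <= 0:
        raise ValueError("height must be positive")
    return weight_kg / (height_m ** 2)
```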
  • the liveness assessment module 16 performs a liveness assessment/test (see block 204) in order to confirm whether the person is in fact a real person.
  • the liveness assessment module 16 instructs the user 100 via the mobile app to utilise the camera 20 in order to capture a series of images (more specifically a video which comprises a series of images) of the person, more specifically the person’s face (i.e. similar to a selfie).
  • the images/video are then sent back to the server 22, which then utilizes them in order to determine whether the person captured in the images is in fact a real person (e.g. not merely a photo of the person which is held in front of the camera 20) (see block 208).
  • if the liveness assessment fails, then a failure notification is sent to the mobile application, which then presents the notification on the display screen of the smart phone 12 (see block 210). If the liveness assessment is successful, then one of the images captured during the liveness assessment is used by the identity verification module 14 to verify the identity of the user 100, in order to confirm that the person who will be conducting the fitness assessment is actually the person who is associated with the particular user profile (see block 212 in Figure 2, as well as the flow diagram in Figure 3).
  • This verification can be done by comparing the captured image with an authenticated/official image of the person which is associated with the particular user profile, in order to verify that the user's image which has just been captured matches the one associated with the particular user profile. In order to do so, the captured image is compared by the identity verification module 14 with the official image of the user previously obtained from the DHA database 26 (see block 212a). A verification result is then sent back to the smart phone 12.
  • the validation process can include the use of machine learning/a machine learning algorithm/model in order to perform the comparison and verification. If the validation is unsuccessful, then the mobile application instructs the user 100 to scan/capture an image of his/her identification card/document, passport or driver's license.
  • the identity verification module 14 then again performs a validation process by comparing (i) the captured image of the user 100 and (ii) the image of the identification card/document, passport or driver's license (see block 212d).
  • if the verification fails (see block 210), then a failure notification is sent to the mobile app, which is then displayed on the display screen of the smart phone 12. If the verification is successful, then the fitness analysis can be conducted. In one example, the fitness analysis can still be conducted even if the identity verification failed; however, the user or the fitness assessment will be flagged on the system 10 to indicate that the user is unverified.
  • the system 10 may be configured to perform a fitness readiness assessment 215 prior to performing the fitness analysis on the person (e.g. to ensure that the person is healthy enough to engage in the exercise necessary for the fitness analysis).
  • the system 10 may typically be configured to perform the fitness readiness assessment by utilising user profile information received/captured from the person and information obtained/retrieved by conducting a scan using the camera 20.
  • the scan may be performed when the person is in a resting/rested state (i.e. prior to exercising) (also referred to as a “resting scan”).
  • the system may then be configured to determine whether there may be a possible health-related reason(s) why the person should be discouraged from proceeding with the fitness analysis step, by using the user profile information and information obtained/retrieved from the scan. Examples of health-related reason(s) may include (but are not limited to):
  • the fitness analysis may include up to 3 potential protocols, each with its own method for estimating the user's VO2 max. These protocols are summarized in table 1 below.
  • the MAXIMUM HEART RATE must be at least 40 beats higher than the RESTING HR from the scan at rest in order for a valid result to be obtained - This feature is incorporated in software.
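The 40-beat validity rule incorporated in the software can be expressed as a simple check; a sketch, with the function and parameter names assumed:

```python
def is_valid_result(max_hr: float, resting_hr: float,
                    min_delta: float = 40.0) -> bool:
    """A result is only treated as valid if the maximum heart rate
    exceeds the resting heart rate (from the scan at rest) by at
    least min_delta beats per minute (40, per the specification)."""
    return (max_hr - resting_hr) >= min_delta
```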
  • the fitness analysis module 18 calculates/determines/extracts certain user vitals by utilizing images/video captured of the user 100 (more specifically the face of the user) by the camera 20 (see block 216). These vitals are captured in order to determine a baseline before the user performs the target heart rate protocol (A) or the fixed protocol (B). Some of the vitals which are captured/extracted from the images/video include heart rate (more specifically heart rate at rest), heart rate variability (SDNN), stress level, respiration rate, oxygen saturation and blood pressure. The extraction of these vitals from video is well-known and will therefore not be described in more detail (e.g. Binah™, Vastmindz™ and Nuralogix™ are examples of known mobile apps which can measure these vitals).
  • the main reason for the initial screening is to obtain the resting heart rate. If the VO2 max cannot be estimated/calculated with the primary or secondary algorithms (algorithms A or B) (see further below), then the final fallback could be a resting heart rate estimate of VO2 max using a resting state (passive) algorithm (see further below).
  • a peak heart rate for the particular user 100 is calculated by the fitness analysis module 18 by firstly calculating a max heart rate with the following formula:
  • Max heart rate = 220 - Age of person
  • the peak heart rate/heart rate target is then calculated by taking 80% of the max heart rate (80% is a well-defined cut-off for vigorous intensity):
  • Peak heart rate = 80% of Max heart rate
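The two formulas above can be combined into a short sketch (function names are illustrative):

```python
def max_heart_rate(age: int) -> float:
    """Theoretical maximum heart rate: 220 minus the person's age."""
    return 220 - age

def peak_heart_rate(age: int) -> float:
    """Target/peak heart rate: 80% of the theoretical maximum
    (80% being a well-defined cut-off for vigorous intensity)."""
    return 0.8 * max_heart_rate(age)
```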
  • the fitness analysis module 18 is also configured to determine what exercise protocol should be performed by the user (e.g. it determines the most appropriate exercise for best results - i.e. highest chance of success for reaching the peak heart rate), based on the user’s profile details, and then communicates the exercise protocol to the user via the user interface which is displayed on the display screen of the smart phone 12 (see block 218).
  • the fitness assessment step is preceded by a step of performing a fitness readiness assessment (see block 215) which typically includes an evaluation of the user profile information together with the information collected during a resting scan.
  • if flags are detected, then the user will be discouraged from attempting the active fitness test. Examples of flags could include, but are not limited to:
  • a user could, in one example, select whether to use the target heart rate protocol (A) or rather the fixed protocol (B) (e.g. on the user interface).
  • the aim of the fixed protocol is to have a set workload for users to follow. This will be a 5-minute step test, with a set tempo.
  • the user will be timed, with a metronome determining the tempo of stepping. Audio prompts will assist the user to keep moving and return to do a screening after 5 minutes.
  • the aim of the target heart rate protocol is to get a user to reach their peak heart rate (or target heart rate). This is based on 80% of the theoretical max heart rate (220-age).
  • the user can choose a run or step activity with a duration of 10 minutes.
  • the user will be timed with audio prompts to keep moving and return for a screening after 10 minutes. If the user has not reached the target peak heart rate, then the user can be instructed to do a few more minutes of exercise. It should be appreciated that the 10-minute period is not set in stone; it is more about the user exercising long and hard enough to get their heart rate up to the required target. The period may therefore be less than 10 minutes or more than 10 minutes.
  • a written description of the exercise(s) may be displayed on the user interface together with a number of images or a video illustrating the exercise(s).
  • the main aim of the exercise(s) is for the user to increase his/her heart rate to the calculated peak heart rate. Once this is reached, then the fitness analysis module 18 can analyze the heart rate recovery shortly thereafter, in order to calculate/estimate a VO2 max rating/score for the user 100. However, as mentioned below, if the peak heart rate is not reached then the fitness analysis module 18 can still calculate a VO2 max rating/score for the user 100 in a slightly different manner.
  • the user 100 can then perform the exercises and, once done, use the camera of the smart phone 12 to capture images/video of the user 100 (more specifically of the user’s face) immediately after the exercise.
  • the fitness analysis module 18 then utilizes the captured images/video in order to determine:
  • when measuring the heart rate after exercising, the fitness analysis module 18 is configured to filter out possible outlier measurements (e.g. possible false measurements).
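The specification does not state which filtering method is used to discard outlier heart-rate measurements; one simple, hypothetical approach is a median-deviation filter:

```python
import statistics

def filter_outliers(readings: list[float], max_dev: float = 15.0) -> list[float]:
    """Drop heart-rate readings that deviate from the median by more
    than max_dev beats per minute. This is an illustrative outlier
    rule only; the actual filtering method is not disclosed."""
    if not readings:
        return []
    med = statistics.median(readings)
    return [r for r in readings if abs(r - med) <= max_dev]
```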
  • the current SDK (software development kit) for measuring heart rate takes 10 seconds to stabilize and start returning heart rate readings. The user 100 would already have recovered a bit from the exercise, so the peak heart rate after 10 seconds would underestimate their true peak. Based on tests performed on various subjects by analysing heart rate recovery, it was determined that users 100 would recover between 5 and 10 beats per minute during the 10-second window. This is then used by the fitness analysis module 18 as a “grace margin”, and the peak heart rate is still regarded as having been reached if it is within this margin. In a slight alternative example, a shape of a heart rate curve after 10 seconds could be projected back for the first 10 seconds - e.g. using linear regression.
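The linear-regression alternative mentioned above can be sketched as an ordinary least-squares line fitted to the earliest available readings (after the 10-second stabilisation window) and extrapolated back to t = 0; this is an illustrative sketch, not the actual implementation:

```python
def project_peak_hr(times_s: list[float], readings_bpm: list[float]) -> float:
    """Fit a straight line (ordinary least squares) to early heart-rate
    readings and extrapolate back to t = 0 to approximate the true peak.

    times_s: seconds elapsed since the end of exercise (>= 10 s).
    readings_bpm: the corresponding heart-rate readings.
    """
    n = len(times_s)
    mean_t = sum(times_s) / n
    mean_y = sum(readings_bpm) / n
    num = sum((t - mean_t) * (y - mean_y) for t, y in zip(times_s, readings_bpm))
    den = sum((t - mean_t) ** 2 for t in times_s)
    slope = num / den
    intercept = mean_y - slope * mean_t
    return intercept  # estimated heart rate at t = 0
```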
  • the heart rate SDK has been adapted to check for this and filter out readings where there is low confidence. If the second identity verification is successful and it is determined that the peak heart rate has been reached (see block 222), then the fitness analysis module 18 calculates/estimates a VO2 max rating/score for the user 100 by utilizing a primary algorithm/machine learning model (also referred to as “Algorithm A”) which takes into account the heart rate recovery of the user 100 over a period of time (i.e. by analyzing images/video captured of the user 100 over the period of time shortly after exercising).
  • heart rate recovery is the main input; however, additional user profile information (age, sex and BMI) is also used in calculating/estimating the VO2 max rating/score.
  • heart rate variability, respiration rate and oxygen saturation are also tracked (i.e. by analysing the images/video) and may be used to enhance the algorithm(s)/machine learning model in future with the expanded feature set.
  • if the peak heart rate has not been reached, the fitness analysis module 18 calculates the VO2 max rating/score for the user 100 by utilizing a secondary algorithm/machine learning model (which is different from the primary algorithm/machine learning model) (also referred to as “Algorithm B”) which takes into account the same vitals mentioned above (in respect of the primary algorithm) over a period of time (i.e. again by analyzing images/video captured of the user 100 over the period of time shortly after exercising).
  • Algorithm B also takes into account the maximum heart rate directly after exercising. For more details on Algorithm B, see table 2b.
  • the fitness analysis module 18 may calculate/estimate a VO2 max rating/score for the user 100 by utilising a resting state (passive) algorithm/machine learning model (also referred to as “Algorithm C”) which takes into account the measured resting heart rate of the user and, optionally, details of the user's user profile (e.g. age, sex and BMI).
  • three models are therefore used: the primary model, for the target heart rate protocol (A); the secondary model, where exercise was performed but the maximum heart rate was not achieved (for fixed protocol B); and the passive model, for resting protocol C, where no exercise was performed, which has age, gender, BMI and resting heart rate as input variables.
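The passive model's input variables (age, gender, BMI and resting heart rate) can be illustrated with a hypothetical linear model; the coefficients below are placeholders only, since the trained model parameters are not disclosed in the specification:

```python
def passive_vo2max_estimate(age: float, is_male: bool, bmi: float,
                            resting_hr: float) -> float:
    """Illustrative resting-state (passive) VO2 max estimate using the
    input variables named in the specification. All coefficients are
    hypothetical placeholders, not the actual trained model."""
    return (56.0
            - 0.30 * age
            + 4.0 * (1.0 if is_male else 0.0)
            - 0.25 * bmi
            - 0.10 * resting_hr)
```

The signs reflect the usual qualitative relationships (older age, higher BMI and a higher resting heart rate are associated with lower cardiorespiratory fitness), but the magnitudes are invented for illustration.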
  • the fitness analysis module 18 can then classify the user 100 into one of five different fitness categories/classes. In order to perform the classification, the fitness analysis module 18 can also take into account certain other factors, such as age and sex. This can be done via a basic lookup of the calculated VO2 max in a table filtered by age and sex. Reference is in this regard specifically made to Figures 5a-k, which set out an example of the bucketing rules/lookup table. From studying Figures 5a-k it should be clear that for each specified sex and age combination, there are different VO2 max ranges which accordingly correlate to a certain fitness level. For example, if a user is a 20 year old female (F) with a VO2 max of 40, then the fitness level of the user will be Level 2 (see Figure 5a).
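The lookup-based bucketing can be sketched as a threshold scan; the actual cut-off values, and which level a given VO2 max maps to, come from the age/sex-specific table in Figures 5a-k, so the thresholds used here are purely illustrative:

```python
def classify_fitness(vo2max: float, thresholds: list[float]) -> int:
    """Bucket a VO2 max value into one of five fitness levels (1-5)
    using four ascending cut-off values for the person's age/sex
    combination. The thresholds are illustrative placeholders for
    the real lookup table in Figures 5a-k."""
    level = 1
    for cut in thresholds:
        if vo2max >= cut:
            level += 1
    return level
```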
  • Figures 4a-e show another example of a general process flow of the system 10 in accordance with the invention. The main difference is that Figures 4a-e include a scenario where a user 100 has already registered with the system 10 and no Home Affairs check or document validation is required. Figures 4a-e also show how the 2 different exercise protocols branch off (A for fit users, where peak heart rate must be reached and Algorithm A applied, and B being a standard step test, with Algorithm B applied).
  • the present invention provides an effective way of conducting a fitness assessment, without requiring a user to interact with an actual person, such as a biokineticist.
  • since the user can install the mobile application on his/her smart phone 12 or use the web-based platform, he/she can effectively conduct the fitness assessment at any place and at any time (without any additional hardware).
  • the present invention does this while at the same time also addressing reliability/fraud concerns by verifying the identity of the user during the process and also conducting a liveness assessment in order to confirm that the user captured by the camera is an actual person.
  • the fitness classification can be very useful in underwriting/pricing in a life insurance context.
  • the present invention can be extremely valuable for insurance companies who want to obtain an indication of a person’s fitness in a way which is very convenient to a user, but also at the same time has appropriate fraud prevention processes in place (similar to those described earlier in the specification). This fitness indication can then be used to calculate an appropriate premium/pricing for a life insurance policy.

Abstract

A fitness assessment system for obtaining an indication of fitness. The system includes a liveness assessment module, an identity verification module and a fitness analysis module. The liveness assessment module is configured to utilise images/video of a person which has been captured by using an image capturing device, to verify that the person captured in the image/video is a real person. The identity verification module is configured to utilise an image of a specific person which has been captured by using the image capturing device in order to verify the identity of the person. The fitness analysis module is configured to perform a fitness analysis on the person, using a processor, by utilising images/video captured by the image capturing device. The fitness analysis may more specifically be to determine/estimate a VO2 max score/rating for the person.

Description

TITLE: FITNESS ASSESSMENT SYSTEM AND METHODOLOGY
BACKGROUND OF THE INVENTION
THIS invention relates to a fitness assessment method and system. The invention also relates to a computer program and digital application (e.g. a mobile application or web-based application) which can implement the fitness assessment methodology when run/executed on a computing device.
In order to conduct a fitness assessment, people typically need to visit a biokineticist in person in order to oversee the assessment. Although it is also possible to conduct certain assessments with a biokineticist over a video call, it still requires the active involvement of the biokineticist.
Although there are wearable devices (e.g. Garmin, Fitbit or Apple watches) that can estimate VO2 max, not many people have access to these devices. Furthermore, these devices are also incapable of verifying the identity of the user when conducting any type of assessment. As a result, there is a big risk of fraud.
The Inventors wish to address at least some of the issues mentioned above.
SUMMARY OF THE INVENTION
In accordance with a first aspect of the invention there is provided a method of obtaining an indication of fitness, wherein the method includes: receiving/obtaining/retrieving images or video of a specific person which has been captured by using an image capturing device; conducting a liveness test, by using images or video captured by the image capturing device, to verify that the person captured in the images or video is a real person; verifying, by using a processor, the identity of the person by utilising an image captured by the image capturing device; and performing a fitness analysis on the person, using a processor, by utilising images/video captured by the image capturing device.
The term “fitness”, in the context of the patent specification, refers/relates to cardiorespiratory fitness (CRF). CRF can typically be determined by estimating a person's VO2 max. CRF is the capacity of the heart and lungs to deliver oxygen and fuel to the body's muscles and organs during physical activity. Higher levels of CRF enable the ability to perform more, or higher intensity, exercise, and are also associated with significantly lower cardiovascular and all-cause mortality, making it an important metric for assessing longevity and well-being.
It should be appreciated that “video” typically includes a series of images.
The images and video may be of the person’s face.
The fitness analysis may be to determine/estimate a VO2 max score/rating for the person. The term “VO2 max” is a well-known term and relates to a measure of the maximum amount of oxygen someone's body can utilize during exercise. VO2 max can, for example, be predicted by conducting a so-called sub-maximal test (i.e. a test which does not require maximal exertion).
The method may include capturing the images/video by utilising the image capturing device. The image capturing device may be a camera of a computing device. The computing device may be a laptop, desktop computer, smart phone, tablet or any other smart device which has a camera. The step of verifying the identity of the person may include comparing the image of the person (e.g. a person’s face) with an image of the person (e.g. the person’s face) stored on a database (hereinafter referred to as the “stored image”). The database may be an official/government source/database (e.g. from the Department of Home Affairs). The stored image may therefore be obtained/retrieved from the official/government source/database. The step of verifying the identity of the person may include utilising machine learning in order to perform the verification.
If the verification fails, then the method may further include: receiving and validating a scanned/captured image of an identification card/document, a driver’s license or passport of the person which depicts the person’s image; and comparing the image of the person captured by the image capturing device with the image of the person depicted in the identification card/document, driver’s license or passport, for verification purposes.
The method may include performing a fitness readiness assessment prior to performing the fitness analysis on the person (to ensure that the person is healthy enough to engage in the exercise necessary for the fitness analysis). The step of performing the fitness readiness assessment may include utilising user profile information received/captured from the person and/or information obtained/retrieved by conducting a scan using the image capturing device. The scan may be performed when the person is in a resting/rested state (i.e. prior to exercising) (also referred to as a “resting scan”). The method may include determining whether there may be a possible health-related reason(s) why the person should be discouraged from proceeding with the fitness analysis step. The method may further include informing the person if there is such a reason.
Examples of health-related reason(s) may include (but are not limited to):
• Elevated heart rate;
• Elevated blood pressure; and
• Poor VO2 max estimate based on a scan at rest (i.e. prior to any exercising), using the image capturing device.
The step of performing the fitness analysis may include detecting a heart rate of the person by utilising a series of images or video captured by the image capturing device (captured over a period of time).
The method may include communicating to the person, via a user interface, what exercise(s) or movement(s) should be performed by the person during the fitness assessment. The user interface may be presented on a display screen of a computing device, such as a smart phone or laptop.
The method may include a second verification step whereby the identity of the person is again verified after the exercise(s) or movement(s) has/have been performed by the person, by using an image(s) captured by the image capturing device.
The method may include receiving an identification number/code from the person via a communication network. The method may further include utilising the identification number/code to retrieve an image of the person (e.g. of a face of a person) which is associated with the identification number/code, which is then used during the verification step.
The method may include receiving/obtaining/retrieving user profile information from the person. The user profile information may include age, sex, height and weight, and optionally a self-reported fitness level, etc. The user profile information may also include a smoker status and/or details of any medication use. The age and sex of the person may be extracted from the received identification number/code. The method may include calculating a BMI (Body Mass Index) for the person. The calculation may be done by using the height and weight of the person. Alternatively, the BMI could be estimated by using image(s) captured by the image capturing device. The BMI may be added to, or form part of, the user profile. The method may include using, by using a processor, at least some of the user profile information in order to determine what exercise(s) or movement(s) should be performed by the person during the fitness assessment. The step of performing the fitness analysis may include determining, using a processor, the VO2 max score/rating for the person by measuring recovery heart rate of the person, preferably after a person has performed the exercise(s) or movement(s), by utilising images/video captured by the image capturing device.
The step of performing the fitness analysis may include determining, by using a processor, whether the person has reached a predetermined peak heart rate. The step of performing the fitness analysis may include determining, by using a processor, if the peak heart rate has been reached, and if so, determining the VO2 max score/rating for the person by utilising a primary algorithm/machine learning model (e.g. a linear regression algorithm). More specifically, the method may include determining, by using a processor, if the peak heart rate has been reached, and if so, determining the VO2 max score/rating for the person by: measuring recovery heart rate for the person by utilising images/video captured by the image capturing device, and using the recovery heart rate as an input to the primary algorithm/machine learning model which then determines the VO2 max score/rating.
The method may include, if the peak heart rate has not been reached, determining the VO2 max score/rating for the person by utilising a secondary algorithm/machine learning model (e.g. a linear regression algorithm) which is different from the primary algorithm/machine learning model. More specifically, the method may include, if the peak heart rate has not been reached, determining the VO2 max score/rating for the person by: measuring a maximum heart rate achieved immediately after the person has completed the required exercise(s) or movement(s), by utilising images/video captured by the image capturing device; measuring recovery heart rate for the person by utilising images/video captured by the image capturing device, and using the maximum heart rate and recovery heart rate as inputs to the secondary algorithm/machine learning model which then determines the VO2 max score/rating, wherein the secondary algorithm/machine learning model is different from the primary algorithm/machine learning model.
The method may include calculating, using a processor, the predetermined peak heart rate for the person. The calculation of the predetermined peak heart rate for the person may include utilising the person's age.
The step of performing the fitness analysis may include classifying the person into one of a plurality of fitness categories/classes, based on the person’s VO2 max score/rating.
The method may include estimating, using a processor, the person's VO2 max score/rating by using the detected heart rate and at least some of the user profile information.
The method may be implemented on a smart phone, tablet or laptop.
The method may include determining a level of risk associated with the person, based on the VO2 max score/rating. The level of risk may be a health risk.
In accordance with a second aspect of the invention there is provided a method of performing an insurance underwriting process/procedure, wherein the method includes implementing the method in accordance with the first aspect of the invention. The method may further include determining a health risk associated with the person, based on the VO2 max score/rating.
In accordance with a third aspect of the invention there is provided a computer program, a mobile application or web-based application which includes a set of instructions which, if executed by a processor/computer, performs the following steps: receiving/obtaining/retrieving images or video of a specific person which has been captured by using an image capturing device; conducting a liveness test, by using images/video captured by the image capturing device, to verify that the person captured in the images/video is a real person; verifying, by using a processor, the identity of the person by utilising an image captured by the image capturing device; and performing a fitness analysis on the person, using a processor, by utilising images/video captured from the image capturing device.
The image capturing device may be an image capturing device of a mobile communication device on which the mobile application is stored/installed. The mobile communication device may be a smart device, such as a smart phone or smart tablet.
The fitness analysis may be to determine/estimate a VO2 sub-maximal score/rating or a VO2 max score/rating for the person.
The set of instructions may, if executed by a processor/computer, perform a step of capturing the images/video by utilising the image capturing device. The image capturing device may be a camera of a computing device on which the computer program, web-based application or mobile application is stored. The computing device may be a laptop, desktop computer, smart phone, tablet or any other smart device which has a camera.
The step of verifying the identity of the person may include comparing the image of the person with an image of the person stored on a database (hereinafter referred to as the “stored image”). The database may be an official/government source/database (e.g. from Home Affairs). The stored image may therefore be obtained/retrieved from the official/government source/database. The step of verifying the identity of the person may include utilising machine learning in order to perform the verification. The set of instructions may, if executed by a processor/computer, perform the following steps, if the verification fails: receiving a scanned/captured image of an identification card/document, a driver's license or passport of the person which depicts the person's image; and comparing the image of the person captured by the image capturing device with the image of the person depicted in the identification card/document, driver's license or passport, for verification purposes.
The step of performing the fitness analysis may include detecting a heart rate of the person by utilising a series of images/video captured by the image capturing device.
The set of instructions may, if executed by a processor/computer, perform a step of communicating to the person, via a user interface presented on a display screen of a computing device on which the computer program is stored, what exercise(s) or movement(s) should be performed by the person during the fitness assessment. The computing device may be a mobile communication device and the computer program may be a mobile application which is stored on the mobile communication device.
The set of instructions may, if executed by a processor/computer, perform a second verification step whereby the identity of the person is again verified after the exercise has been performed by the person, by using image(s) captured by the image capturing device.
The set of instructions may, if executed by a processor/computer, perform the step of receiving an identification number/code from the person via a communication network. The set of instructions may, if executed by a processor/computer, perform the further step of utilising the identification number/code to retrieve an image which is associated with the identification number/code, which is then used during the verification step. The set of instructions may, if executed by a processor/computer, perform the step of receiving/obtaining/retrieving user profile information from the person. The user profile information may include age, sex, height and weight, and optionally a self-reported fitness level, etc. The user profile information may also include a smoker status and/or details of any medication use. The age and sex of the person may be extracted from the received identification number/code. The set of instructions may, if executed by a processor/computer, perform the step of calculating a BMI (Body Mass Index) for the person. The calculation may be done by using the height and weight of the person. Alternatively, the BMI could be estimated by using image(s) captured by the image capturing device. The BMI may be added to, or form part of, the user profile.
The set of instructions may, if executed by a processor/computer, determine the VO2 max score/rating for the person by measuring recovery heart rate of the person, by utilising images/video captured by the image capturing device. More specifically, the set of instructions may, if executed by a processor/computer, determine the VO2 max score/rating for the person by measuring recovery heart rate of the person after the person has performed the exercise(s) or movement(s), by utilising images/video captured by the image capturing device.
The step of performing the fitness analysis may include determining whether the person has reached a predetermined peak heart rate. The step of performing the fitness analysis may include determining, by using a processor, if the peak heart rate has been reached, and if so, determining the VO2 max score/rating for the person by utilising a primary algorithm/machine learning model (e.g. a linear regression algorithm). More specifically, the step of performing the fitness analysis may include determining if the peak heart rate has been reached, and if so, determining a VO2 max score/rating for the person by: measuring recovery heart rate for the person by utilising images/video captured by the image capturing device, and using the recovery heart rate as an input to the primary algorithm/machine learning model which then determines the VO2 max score/rating.
The set of instructions may, if executed by a processor/computer, determine the VO2 max score/rating for the person by utilising a secondary algorithm/machine learning model (e.g. a linear regression algorithm) which is different from the primary algorithm, if the peak heart rate has not been reached.
More specifically, the set of instructions may, if executed by a processor/computer, perform the following steps if the peak heart rate has not been reached, in order to determine the VO2 max score/rating for the person: measuring a maximum heart rate achieved immediately after the person has completed the required exercise(s) or movement(s), by utilising images/video captured by the image capturing device; measuring recovery heart rate for the person by utilising images/video captured by the image capturing device, and using the maximum heart rate and recovery heart rate as inputs to the secondary algorithm/machine learning model which then determines the VO2 max score/rating, wherein the secondary algorithm/machine learning model is different from the primary algorithm/machine learning model.
The step of performing the fitness analysis may include classifying the person into one of a plurality of fitness categories/classes, based on the person’s VO2 max score/rating.
In accordance with a fourth aspect of the invention there is provided a computing device on which the computer program/mobile application/web-based application in accordance with the third aspect of the invention is stored. The computing device may be a computer, smart phone, tablet or any other smart device which has a camera.
In accordance with a fifth aspect of the invention there is provided a non-transitory computer-readable storage medium on which the computer program/mobile application/web-based application in accordance with the third aspect of the invention is stored.
In accordance with a sixth aspect of the invention there is provided a fitness assessment system for obtaining an indication of fitness, wherein the system includes: a liveness assessment module which is configured to utilise images/video of a person which has been captured by using an image capturing device, to verify that the person captured in the image/video is a real person; an identity verification module which is configured to utilise an image of a specific person which has been captured by using the image capturing device in order to verify the identity of the person; a fitness analysis module which is configured to perform a fitness analysis on the person, using a processor, by utilising images/video captured by the image capturing device.
The fitness analysis may be to determine/estimate a VO2 sub maximal score/rating or a VO2 max score/rating for the person.
The image capturing device may form part of the system.
The system may be configured to capture the images/video by utilising the image capturing device. The image capturing device may be a camera of a computing device. The computing device may be a laptop, desktop computer, smart phone, tablet or any other smart device which has a camera. The computing device may form part of the system. The verification module may be configured to compare a captured image of the person with an image of the person stored on a database (hereinafter referred to as the “stored image”). The database may be an official/government source/database (e.g. from Home Affairs). The verification module may be configured to query the database in order to obtain the stored image. The verification module may be configured to utilise machine learning in order to perform the verification.
The verification module may be configured such that, if the verification fails, it implements an alternative verification process whereby the verification module: receives a scanned/captured image of an identification card/document, a driver’s license or passport of the person which depicts the person’s image; and compares the image of the person captured by the image capturing device with the image of the person depicted in the identification card/document, a driver’s license or passport, for verification purposes.
The fitness analysis module may be configured to detect a heart rate of the person by utilising images/video captured by the image capturing device.
The fitness analysis module may be configured to communicate to the person, via a user interface, what exercise(s) or movement(s) should be performed by the person during the fitness assessment. The user interface may be presented on a display screen of a computing device, such as a smart phone or laptop.
The verification module may be configured to perform a second verification step whereby the identity of the person is again verified after the exercise has been performed by the person, by using image(s) captured by the image capturing device.
The verification module may be configured to: receive an identification number/code from the person via a communication network; and utilise the identification number/code to retrieve an image which is associated with the identification number/code, which is then used during the verification step/process.
The system may be configured to receive/obtain/retrieve user profile information from the person. The user profile information may include age, sex, height and weight, and optionally, a self-reported fitness level, etc. The user profile information may also include a smoker status and/or details of any medication use. The age and sex of the person may be extracted from the received identification number/code. The system, particularly the fitness analysis module, may be configured to calculate a BMI (Body Mass Index) for the person. The calculation may be done by using the height and weight of the person. Alternatively, the BMI could be estimated by the system (particularly the fitness analysis module) by using image(s) captured by the image capturing device. The BMI may be added to, or form part of, the user profile. The fitness analysis module may be configured to use at least some of the user profile information in order to determine what exercise(s) or movement(s) should be performed by the person during the fitness assessment.
The system may be configured to perform a fitness readiness assessment prior to performing the fitness analysis on the person (e.g. to ensure that the person is healthy enough to engage in the exercise necessary for the fitness analysis). The system may be configured to perform the fitness readiness assessment by utilising user profile information received/captured from the person and/or information obtained/retrieved by conducting a scan using the image capturing device. The scan may be performed when the person is in a resting/rested state (i.e. prior to exercising) (also referred to as a “resting scan”). The system may be configured to determine whether there may be a possible health-related reason(s) why the person should be discouraged from proceeding with the fitness analysis step, by using the user profile information and information obtained/retrieved from the scan. Examples of health-related reason(s) may include (but are not limited to):
• Elevated heart rate;
• Elevated blood pressure; and
• Poor VO2max estimate (based on the resting scan using the image capturing device).
The fitness analysis module may be configured to determine the VO2 max score/rating for the person by measuring recovery heart rate of the person by utilising images/video captured by the image capturing device. More specifically, the fitness analysis module may be configured to determine the VO2 max score/rating for the person by measuring recovery heart rate of the person after the person has performed the exercise(s) or movement(s), by utilising images/video captured by the image capturing device.
The fitness analysis module may be configured to determine whether the person has reached a predetermined peak heart rate. The fitness analysis module may be configured to determine if the peak heart rate has been reached, and if so, determine the VO2 max score/rating for the person by utilising a primary algorithm/machine learning model (e.g. a linear regression algorithm). More specifically, the fitness analysis module may be configured to determine if the peak heart rate has been reached, and if so, determine the VO2 max score/rating for the person by: measuring recovery heart rate for the person by utilising images/video captured by the image capturing device, and using the recovery heart rate as an input to a primary algorithm/machine learning model which then determines the VO2 max score/rating.
The fitness analysis module may be configured such that, if the peak heart rate has not been reached, it determines the VO2 max score/rating for the person by utilising a secondary algorithm/machine learning model (e.g. a linear regression algorithm) which is different from the primary algorithm/machine learning model. More specifically, the fitness analysis module may be configured, if the peak heart rate has not been reached, to determine the VO2 max score/rating for the person by: measuring a maximum heart rate achieved immediately after the person has completed the required exercise(s) or movement(s), by utilising images/video captured by the image capturing device; measuring recovery heart rate for the person by utilising images/video captured by the image capturing device; and using the maximum heart rate and recovery heart rate as inputs to the secondary algorithm/machine learning model, which is different from the primary algorithm/machine learning model and which then determines the VO2 max score/rating.
The fitness analysis module may be configured to classify the person into one of a plurality of fitness categories/classes, based on the person’s VO2 max score/rating.
The system may be implemented on a computing device, such as a smart phone. The system may however also be web-based.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be described, by way of example, with reference to the accompanying diagrammatic drawings. In the drawings:
Figure 1 shows a schematic layout of a fitness assessment system in accordance with the invention;
Figure 2 shows a flow diagram which illustrates the general process flow of the system shown in Figure 1;
Figure 3 shows a flow diagram which illustrates an identity verification process illustrated as block 212 in Figure 2;
Figure 4a shows a flow diagram of another example of a process flow of the system shown in Figure 3;
Figures 4b-e each show a different portion of the flow diagram shown in Figure 4a; and Figures 5a-5k each show a table which sets out the bucketing rules used by the fitness assessment system in accordance with the invention for converting VO2max values into fitness levels.
DESCRIPTION OF PREFERRED EMBODIMENTS
The present invention relates to a fitness assessment system and methodology which can typically be implemented by a computing device, such as a laptop, a server (specifically a web-based server - i.e. the system may be web-based) or a smartphone which has a camera. The invention can therefore include a computer program or mobile application or web-based application which is typically installed on, or accessed via, a computer or smart phone, which can then perform the fitness assessment in conjunction with a central server (with which the computer/smart phone can communicate), by utilizing images/video captured by an image capturing device (e.g. a camera) of the computing device. The captured images/video are essentially used to (i) conduct a liveness analysis (to determine that the image capturing device captured images/video of a real person), (ii) verify that the person is the actual person who is associated with a particular user profile, and (iii) measure user heart rate, so that it can be used in order to conduct the fitness analysis.
Reference is hereinafter specifically made to a smart phone and a mobile application which is installable on the smart phone. However, it will be appreciated that the invention can easily also be implemented on another type of computing device, such as a laptop or tablet. The system 10 includes a server 22 (e.g. which includes both a web-based and application-based server) with which users 100 can communicate over a communication network 50, by using a mobile application installed on their smart phone 12 or via the web (e.g. using a laptop to access a web interface via which communication can take place).
To start, the user 100 would download the mobile application (e.g. from an App store) and enter certain user profile information during a registration process via a user interface presented by the mobile application on a display screen of the smart phone 12 (see block 200 in Figure 2). The user profile information may include details such as an identity (ID) number, height, weight, and a self-reported/self-determined fitness level (i.e. the user provides an indication of his/her own level of fitness). The user profile information may also include a smoker status and/or details of any medication use. These details are then sent, via a communication network 50, to the server 22. As an initial authentication step (see block 201) the entered ID number is checked against a database 26 of the Department of Home Affairs (DHA), in order to validate the authenticity of the user 100. Additional information can also be obtained from the DHA database 26, including a photo of the user (hereinafter referred to as the “official image” of the user), the user’s name, sex and status, which are then added to the user profile. The user profile is then stored on the database 24. The server 22 is also able to extract the age and sex of the user from the ID number and store this information as part of the user profile information. The server 22 (more specifically the fitness analysis module 18) is also configured to calculate a BMI (Body Mass Index) of the user 100 by utilising the user’s height and weight, and add the BMI to the user profile. Alternatively, the BMI could be estimated by using image(s) captured by the image capturing device 20.
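The BMI calculation referred to above is the standard one; a minimal sketch (the function name and example values are illustrative, not part of the specification):

```python
def calculate_bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight in kilograms divided by height in metres squared."""
    if height_m <= 0:
        raise ValueError("height must be positive")
    return weight_kg / (height_m ** 2)

# Example: a 70 kg user who is 1.75 m tall has a BMI of roughly 22.9
```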
It should be noted that the user profile details can however also be stored on the mobile phone 12 itself.
When the user 100 wishes to conduct a fitness assessment, he/she can do so by selecting an appropriate option on the user interface presented in the mobile application (see block 202).
During the assessment, the liveness assessment module 16 performs a liveness assessment/test (see block 204) in order to confirm whether the person is in fact a real person. In order to do this assessment, the liveness assessment module 16 instructs the user 100 via the mobile app to utilise the camera 20 in order to capture a series of images (more specifically a video which comprises a series of images) of the person, more specifically the person’s face (i.e. similar to a selfie). The images/video is then sent back to the server 22 which then utilizes the images/video in order to determine whether the person captured by the images is in fact a real person (e.g. not merely a photo of the person which is held in front of the camera 20) (see block 208). If the liveness assessment fails, then a failure notification is sent to the mobile application which then presents the notification on the display screen of the smart phone 12 (see block 210). If the liveness assessment is successful, then one of the images captured during the liveness assessment is used by the identity verification module 14 to verify the identity of the user 100, in order to confirm that the person who will be conducting the fitness assessment is actually the person who is associated with the particular user profile (see block 212 in Figure 2, as well as the flow diagram in Figure 3).
This verification can be done by comparing the captured image with an authenticated/official image of the person which is associated with the particular user profile, in order to verify that the user’s image which has just been captured is the same as the one associated with the particular user profile. In order to do so the captured image is compared by the identity verification module 14 with the official image of the user previously obtained from the DHA database 26 (see block 212a). A verification result is then sent back to the smartphone 12. The validation process can include the use of machine learning/a machine learning algorithm/model in order to perform the comparison and verification. If the validation is unsuccessful, then the mobile application instructs the user 100 to scan/capture an image of his/her identification card/document, passport or driver’s license (i.e. depicting an image of the user), by using the camera 20 (see block 212c). The identity verification module 14 then again performs a validation process by comparing (i) the captured image of the user 100 and (ii) the image of the identification card/document, passport or driver’s license (see block 212d).
If the verification fails (see block 210) then a failure notification is sent to the mobile app, which is then displayed on the display screen of the smart phone 12. If the verification is however successful, then the fitness analysis can be conducted. In one example, the fitness analysis can still be conducted even if the identity verification failed; in that case, the user or the fitness assessment will be flagged on the system 10 to indicate that the user is unverified.
It should be noted that in one example, the system 10 may be configured to perform a fitness readiness assessment 215 prior to performing the fitness analysis on the person (e.g. to ensure that the person is healthy enough to engage in the exercise necessary for the fitness analysis). The system 10 may typically be configured to perform the fitness readiness assessment by utilising user profile information received/captured from the person and information obtained/retrieved by conducting a scan using the camera 20. The scan may be performed when the person is in a resting/rested state (i.e. prior to exercising) (also referred to as a “resting scan”). The system may then be configured to determine whether there may be a possible health-related reason(s) why the person should be discouraged from proceeding with the fitness analysis step, by using the user profile information and information obtained/retrieved from the scan. Examples of health-related reason(s) may include (but are not limited to):
• Elevated heart rate;
• Elevated blood pressure; and
• Poor VO2max estimate (based on the resting scan using the image capturing device) (see resting protocol C described further below).
The fitness analysis may include up to 3 potential protocols, each with its own method for estimating the user’s VO2max. These protocols are summarized in table 1 below.
Table 1: Fitness protocols
The details of Algorithms A-C are set out in tables 2a-2c below:
Table 2a: Fitness Algorithm A
Formula: -0.38*AGE - 0.63*BMI + 5.55*SEX - 0.1*HEART_RATE_AFTER_REST - 0.14*MAXIMUM_HEART_RATE + 0.01*MHR_DEC - 0.2 + 102.86609291744699
Guardrail: The MAXIMUM HEART RATE must be at least 40 beats higher than the RESTING HR from the scan at rest in order for a valid result to be obtained - this feature is incorporated in software.
Table 2b: Fitness Algorithm B
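Read literally, the Table 2b formula and guardrail could be sketched as follows. The numeric encoding of SEX and the exact meaning of MHR_DEC are not spelled out in the text, so they are assumptions here; the function is an illustrative reading of the table, not the filed implementation:

```python
INTERCEPT = 102.86609291744699  # constant term from the Table 2b formula

def algorithm_b(age, bmi, sex, hr_after_rest, max_hr, mhr_dec, resting_hr):
    """VO2max estimate per the Table 2b formula (Algorithm B).

    sex is assumed to be numerically encoded (e.g. 1 = male, 0 = female);
    mhr_dec is assumed to be the post-exercise heart rate decline.
    Returns None when the Table 2b guardrail is not met.
    """
    # Guardrail: MAXIMUM HEART RATE must be at least 40 bpm above the
    # RESTING HR from the scan at rest for a valid result.
    if max_hr - resting_hr < 40:
        return None
    return (-0.38 * age - 0.63 * bmi + 5.55 * sex
            - 0.1 * hr_after_rest - 0.14 * max_hr
            + 0.01 * mhr_dec - 0.2 + INTERCEPT)
```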
Table 2c: Fitness Algorithm C
To start the fitness analysis, the fitness analysis module 18 calculates/determines/extracts certain user vitals by utilizing images/video captured of the user 100 (more specifically the face of the user) by the camera 20 (see block 216). These vitals are captured in order to determine a baseline before the user performs the target heart rate protocol (A) or the fixed protocol (B). Some of the vitals which are captured/extracted from the images/video include heart rate (more specifically heart rate at rest), heart rate variability (SDNN), stress level, respiration rate, oxygen saturation and blood pressure. The extraction of these vitals from video is well-known and will therefore not be described in more detail (e.g. Binah™, Vastmindz™, Nuralogix™ are examples of known mobile apps which can measure these vitals). It should however be noted that the main reason for the initial screening is to obtain the resting heart rate. If the VO2max cannot be estimated/calculated with the primary or secondary algorithms (algorithms A or B) (see further below), then the final fallback could be a resting heart rate estimate of VO2max using a resting state (passive) algorithm (see further below).
A peak heart rate for the particular user 100 is calculated by the fitness analysis module 18 by firstly calculating a max heart rate with the following formula:
Max heart rate = 220 - Age of person
It should be noted that there are various alternative ways of calculating a user’s maximum heart rate, which could also be used.
The peak heart rate/heart rate target is then calculated by taking 80% of the max heart rate (80% is a well-defined cut-off for vigorous intensity):
Peak heart rate = 80% of Max heart rate
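The two formulas above combine into a short helper (an illustrative sketch; the function name is not from the specification):

```python
def peak_heart_rate(age: int) -> float:
    """Target peak heart rate: 80% of the theoretical maximum (220 - age)."""
    max_heart_rate = 220 - age
    return 0.8 * max_heart_rate

# Example: for a 40-year-old user, max HR = 180 and the target peak is 144 bpm
```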
The fitness analysis module 18 is also configured to determine what exercise protocol should be performed by the user (e.g. it determines the most appropriate exercise for best results - i.e. highest chance of success for reaching the peak heart rate), based on the user’s profile details, and then communicates the exercise protocol to the user via the user interface which is displayed on the display screen of the smart phone 12 (see block 218).
As mentioned earlier, the fitness assessment step is preceded by a step of performing a fitness readiness assessment (see block 215) which typically includes an evaluation of the user profile information together with the information collected during a resting scan.
If flags are detected then the user will be discouraged from attempting the active fitness test. Examples of flags could include, but are not limited to:
• Elevated heart rate
• Elevated blood pressure
• Poor VO2max estimate based on a scan at rest (i.e. prior to any exercising) (Level 1 or 2)

A user could, in one example, select whether to use the target heart rate protocol (A) or rather the fixed protocol (B) (e.g. on the user interface).
Fixed protocol (B):
The aim of the fixed protocol is to have a set workload for users to follow. This will be a 5-minute step test, with a set tempo.
The user will be timed, with a metronome determining the tempo of stepping. Audio prompts will assist the user to keep moving and return to do a screening after 5 minutes.
Target heart rate protocol (A)
The aim of the target heart rate protocol is to get a user to reach their peak heart rate (or target heart rate). This is based on 80% of the theoretical max heart rate (220-age).
The user can choose a run or step activity with a duration of 10 minutes.
The user will be timed with audio prompts to keep moving and return for a screening after 10 minutes. If the user has not reached the target peak heart rate, then the user can be instructed to do a few more minutes of exercise. It should be appreciated that the 10-minute period is not set in stone and it is more about the user exercising long and hard enough to get their heart rate up to the required target. The period may therefore be less than 10 minutes or more than 10 minutes.
A written description of the exercise(s) may be displayed on the user interface together with a number of images or a video illustrating the exercise(s).
The main aim of the exercise(s) is for the user to increase his/her heart rate to the calculated peak heart rate. Once this is reached, then the fitness analysis module 18 can analyze the heart rate recovery shortly thereafter, in order to calculate/estimate a VO2 max rating/score for the user 100. However, as mentioned below, if the peak heart rate is not reached then the fitness analysis module 18 can still calculate a VO2 max rating/score for the user 100 in a slightly different manner.
After the user 100 has studied the proposed exercise(s), the user 100 can then perform the exercises and, once done, use the camera of the smart phone 12 to capture images/video of the user 100 (more specifically of the user’s face) immediately after the exercise. The fitness analysis module 18 then utilizes the captured images/video in order to determine:
(i) that it is still the same user 100. In other words, a second identity verification process is again implemented by the identity verification module 14 in the same manner as described earlier in the specification; and
(ii) if the peak heart rate has been reached (i.e. as a result of the exercise(s) performed by the user 100).
When measuring the heart rate after exercising, the fitness analysis module 18 is configured to filter out possible outlier measurements (e.g. possible false measurements). The current SDK (software development kit) for measuring heart rate takes 10 seconds to stabilize and start returning heart rate readings. The user 100 would already have recovered a bit from the exercise, so the peak heart rate after 10 seconds would underestimate their true peak. Based on tests performed on various subjects by analysing heart rate recovery, it was determined that users 100 would recover between 5 and 10 beats per minute during the 10-second window. This is then used by the fitness analysis module 18 as a “grace margin”, and the peak heart rate would still be seen as reached if it is within this margin. In a slightly different example, the shape of the heart rate curve after 10 seconds could be projected back for the first 10 seconds - e.g. using linear regression.
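A minimal sketch of the grace-margin check described above (the 10 bpm figure comes from the text; the function name and the exact comparison are illustrative):

```python
GRACE_MARGIN_BPM = 10  # users recover roughly 5-10 bpm during the 10-second SDK warm-up

def peak_reached(first_stable_hr: float, target_peak_hr: float,
                 margin: float = GRACE_MARGIN_BPM) -> bool:
    """Treat the target peak as reached if the first stable reading after
    the SDK's 10-second stabilisation window is within the recovery margin."""
    return first_stable_hr >= target_peak_hr - margin

# Example: with a 144 bpm target, a first stable reading of 136 bpm still counts
```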
Excess movement and/or light can sometimes affect the results; however, the heart rate SDK has been adapted to check for this and filter out readings where there is low confidence.

If the second identity verification is successful and it is determined that the peak heart rate has been reached (see block 222), then the fitness analysis module 18 calculates/estimates a VO2 max rating/score for the user 100 by utilizing a primary algorithm/machine learning model (also referred to as “Algorithm A”) which takes into account the heart rate recovery of the user 100 over a period of time (i.e. by analyzing images/video captured of the user 100 over the period of time shortly after exercising). The primary factor is heart rate recovery; however, additional user profile information (age, sex and BMI) is also used in calculating/estimating the VO2 max rating/score. Heart rate variability, respiration rate and oxygen saturation are also tracked (i.e. by analysing the images/video) and may be used to enhance the algorithm(s)/machine learning model in future with the expanded feature set.

If the second identity verification is successful and it is determined that the peak heart rate has not been reached (see block 224), then the fitness analysis module 18 calculates the VO2 max rating/score for the user 100 by utilizing a secondary algorithm/machine learning model (which is different from the primary algorithm/machine learning model) (also referred to as “Algorithm B”) which takes into account the same vitals mentioned above (in respect of the primary algorithm) over a period of time (i.e. again by analyzing images/video captured of the user 100 over the period of time shortly after exercising). In addition, Algorithm B also takes into account the maximum heart rate directly after exercising. For more details on Algorithm B, see table 2b.
It should be noted that if it is not possible for a person to perform the necessary exercises, the fitness analysis module 18 may calculate/estimate a VO2 max rating/score for the user 100 by utilising a resting state (passive) algorithm/machine learning model (also referred to as “Algorithm C”) which takes into account the measured resting heart rate of the user and, optionally, details of the user’s user profile (e.g. age, sex and BMI).
It should be noted that several different machine learning algorithms were tested on the VO2max dataset, and a linear regression algorithm (for Algorithms A, B and C) has been found to perform optimally. Linear regression predictions were converted to class labels by dividing the output range into quantiles and assigning each prediction to a quantile or class.
Three linear models were trained to be used under the 3 different scenarios described herein. The main difference between these models is the input variables. The primary model (for the target heart rate protocol A), where maximum heart rate was achieved, has age, gender, BMI and heart rate after rest as input variables. The secondary model, where exercise was performed but maximum heart rate was not achieved (for fixed protocol B), has age, gender, BMI, maximum heart rate and heart rate after rest as input variables. The passive model (for resting protocol C), where no exercise was performed, has age, gender, BMI and resting heart rate as input variables. For the training of each of these three models, historical data on the above-mentioned input variables for each model, together with their corresponding calculated VO2max scores (calculated historically), were used.
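The three-model setup can be sketched with an ordinary least-squares fit, together with the quantile bucketing mentioned earlier. The feature lists follow the text; the data layout, the helper names, and the use of NumPy's `lstsq` in place of whatever library was actually used are assumptions:

```python
import numpy as np

# Input variables per protocol, as listed in the text
FEATURES = {
    "A": ["age", "gender", "bmi", "hr_after_rest"],            # target HR reached
    "B": ["age", "gender", "bmi", "max_hr", "hr_after_rest"],  # fixed protocol
    "C": ["age", "gender", "bmi", "resting_hr"],               # passive/resting
}

def fit_linear(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Least-squares fit of y ~ X; returns [intercept, coefficients...]."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend intercept column
    coef, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return coef

def to_fitness_levels(preds: np.ndarray, n_classes: int = 5) -> np.ndarray:
    """Convert VO2max predictions to class labels 1..n_classes by quantile."""
    edges = np.quantile(preds, np.linspace(0, 1, n_classes + 1)[1:-1])
    return np.digitize(preds, edges) + 1
```

One such model would be fitted per protocol, each on the historical rows carrying that protocol's input variables and the historically calculated VO2max scores.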
Based on the VO2 max rating/score for the user 100, the fitness analysis module 18 can then classify the user 100 into one of five different fitness categories/classes. In order to perform the classification, the fitness analysis module 18 can also take into account certain other factors, such as age and sex. This can be done via a basic lookup of the calculated VO2max in a table filtered by age and sex. Reference is in this regard specifically made to Figures 5a-k which set out an example of the bucketing rules/lookup table. From studying Figures 5a-k it should be clear that for each specified sex and age combination, there are different VO2 max ranges which then accordingly correlate to a certain fitness level. For example, if a user is a 20-year-old female (F) with a VO2 max of 40, then the fitness level of the user will be Level 2 (see Figure 5a).
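The lookup can be sketched as below. The range boundaries here are invented for illustration (the real ones are in Figures 5a-k); they are chosen only so that the worked example from the text — a 20-year-old female with a VO2max of 40 mapping to Level 2 — holds:

```python
# (sex, age) -> upper bounds of fitness Levels 1-4; boundary values are hypothetical
FITNESS_TABLE = {
    ("F", 20): [35.0, 41.0, 47.0, 53.0],
}

def fitness_level(sex: str, age: int, vo2max: float) -> int:
    """Classify a VO2max score into one of five fitness levels."""
    for level, upper_bound in enumerate(FITNESS_TABLE[(sex, age)], start=1):
        if vo2max < upper_bound:
            return level
    return 5  # above the highest boundary
```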
It should be noted that the fitness analysis module 18 could be configured to verify the actual exercise activity performed by the user (e.g. through visual analysis using the camera 20). The fitness analysis module 18 could therefore, for example, confirm whether the required number of steps, star jumps, or squats have been performed.

Figures 4a-e show another example of a general process flow of the system 10 in accordance with the invention. The main difference is that Figures 4a-e include a scenario where a user 100 has already registered with the system 10 and no Home Affairs check or document validation is required. Figures 4a-e also show how the 2 different exercise protocols branch off (A for fit users where peak heart rate must be reached and Algorithm A applied, and B being a standard step test and Algorithm B applied).
The Inventors believe that the present invention provides an effective way of conducting a fitness assessment, without requiring a user to interact with an actual person, such as a biokineticist. In fact, since the user can install the mobile application on his/her smart phone 12 or use the web-based platform, he/she can effectively conduct the fitness assessment at any place and at any time (without any additional hardware). The present invention does this while at the same time also addressing reliability/fraud concerns by verifying the identity of the user during the process and also conducting a liveness assessment in order to confirm that the user captured by the camera is an actual person.
Since the user’s identity is verified, the fitness classification can be very useful in underwriting/pricing in a life insurance context. In other words, the present invention can be extremely valuable for insurance companies who want to obtain an indication of a person’s fitness in a way which is very convenient to a user, but also at the same time has appropriate fraud prevention processes in place (similar to those described earlier in the specification). This fitness indication can then be used to calculate an appropriate premium/pricing for a life insurance policy.

Claims

1. A method of obtaining an indication of fitness, wherein the method includes: receiving/obtaining images or video of a specific person which has been captured by using an image capturing device; conducting a liveness test, by using images or video captured by the image capturing device, to verify that the person captured in the images or video is a real person; verifying, by using a processor, the identity of the person by utilising an image captured by the image capturing device; and performing a fitness analysis on the person, using a processor, by utilising images/video captured by the image capturing device.
2. The method of claim 1, wherein the fitness analysis is to determine/estimate a VO2 max score/rating for the person.
3. The method of claim 2, which includes capturing the images/video by utilising the image capturing device.
4. The method of claim 2, wherein the step of verifying the identity of the person includes comparing the image of the person with an image of the person stored on a database.
5. The method of claim 4, wherein if the verification fails, then the method further includes: receiving and validating a scanned/captured image of an identification card/document, a driver’s license or passport of the person which depicts the person’s image; and comparing the image of the person captured by the image capturing device with the image of the person depicted in the identification card/document, driver’s license or passport, for verification purposes.
6. The method of claim 2, wherein the step of performing the fitness analysis includes detecting a heart rate of the person by utilising a series of images or video captured by the image capturing device.

7. The method of claim 6, which includes communicating to the person, via a user interface, what exercise(s) or movement(s) should be performed by the person during the fitness assessment.

8. The method of claim 7, which includes a second verification step whereby the identity of the person is again verified after the exercise(s) or movement(s) has/have been performed by the person, by using an image captured by the image capturing device.

9. The method of claim 8, which includes receiving an identification number/code from the person via a communication network and utilising the identification number/code to retrieve an image of the person which is associated with the identification number/code, which is then used during the verification step.

10. The method of claim 9, which includes receiving/obtaining user profile information from the person.

11. The method of claim 10, wherein the user profile information includes age, sex, height and weight.

12. The method of claim 11, which includes utilising, by using a processor, at least some of the user profile information in order to determine what exercise(s) or movement(s) should be performed by the person during the fitness assessment.

13. The method of claim 8, wherein the step of performing the fitness analysis includes determining, using a processor, the VO2 max score/rating for the person by measuring recovery heart rate of the person after the person has performed the exercise(s) or movement(s), by utilising images/video captured by the image capturing device.
14. The method of claim 8, wherein the step of performing the fitness analysis includes determining, by using a processor, whether the person has reached a predetermined peak heart rate, and if so, determining the VO2 max score/rating for the person by: measuring recovery heart rate for the person by utilising images/video captured by the image capturing device, and using the recovery heart rate as an input to a primary algorithm/machine learning model which then determines the VO2 max score/rating.

15. The method of claim 14, which includes, if the peak heart rate has not been reached, determining the VO2 max score/rating for the person by: measuring a maximum heart rate achieved immediately after the person has completed the required exercise(s) or movement(s), by utilising images/video captured by the image capturing device, measuring recovery heart rate for the person by utilising images/video captured by the image capturing device, and using the maximum heart rate and recovery heart rate as an input to a secondary algorithm/machine learning model which then determines the VO2 max score/rating, wherein the secondary algorithm/machine learning model is different from the primary algorithm/machine learning model.

16. The method of claim 15, wherein the step of performing the fitness analysis includes classifying the person into one of a plurality of fitness categories/classes, based on the person’s VO2 max score/rating.
17. The method of claim 9, which includes estimating, using a processor, the person’s VO2 max score/rating by using the detected heart rate and at least some of the user profile information.
18. The method of claim 15, wherein the method is implemented on a smart phone.
19. A method of performing an insurance underwriting process/procedure, wherein the method includes implementing the method as claimed in claim 1.
20. A mobile application which includes a set of instructions which, if executed by a processor, performs the following steps: receiving/obtaining images or video of a specific person which has been captured by using an image capturing device of a mobile communication device on which the mobile application is stored; conducting a liveness test, by using images/video captured by the image capturing device, to verify that the person captured in the images/video is a real person; verifying, by using a processor, the identity of the person by utilising an image captured by the image capturing device; and performing a fitness analysis on the person, using a processor, by utilising images or video captured from the image capturing device.
21. The mobile application of claim 20, wherein the fitness analysis is to determine/estimate a VO2 max score/rating for the person.
22. The mobile application of claim 21, wherein the set of instructions, if executed by a processor, performs a step of capturing the images/video by utilising the image capturing device.
23. The mobile application of claim 22, wherein the step of verifying the identity of the person includes comparing the image of the person with an image of the person stored on a database.
24. The mobile application of claim 23, wherein the set of instructions, if executed by a processor/computer, performs the following steps, if the verification fails: receiving a scanned/captured image of an identification card/document, a driver’s license or passport of the person which depicts the person’s image; and comparing the image of the person captured by the image capturing device with the image of the person depicted in the identification card/document, driver’s license or passport, for verification purposes.
25. The mobile application of claim 22, wherein the step of performing the fitness analysis includes detecting a heart rate of the person by utilising a series of images/video captured by the image capturing device.
26. The mobile application of claim 25, wherein the set of instructions, if executed by a processor, performs a step of communicating to the person, via a user interface presented on a display screen of the mobile communication device on which the mobile application is stored, what exercise(s) or movement(s) should be performed by the person during the fitness assessment.
27. The mobile application of claim 26, where the set of instructions, if executed by a processor, determines the VO2 max score/rating for the person by measuring recovery heart rate of the person after the person has performed the exercise(s) or movement(s), by utilising images/video captured by the image capturing device.
28. The mobile application of claim 26, wherein the step of performing the fitness analysis includes determining whether the person has reached a predetermined peak heart rate, and if so, determining the VO2 max score/rating for the person by: measuring recovery heart rate for the person by utilising images/video captured by the image capturing device, and using the recovery heart rate as an input to a primary algorithm/machine learning model which then determines the VO2 max score/rating.

29. The mobile application of claim 28, wherein the set of instructions, if executed by a processor/computer, determines the VO2 max score/rating, if the peak heart rate has not been reached, by: measuring a maximum heart rate achieved immediately after the person has completed the required exercise(s) or movement(s), by utilising images/video captured by the image capturing device, measuring recovery heart rate for the person by utilising images/video captured by the image capturing device, and using the maximum heart rate and recovery heart rate as an input to a secondary algorithm/machine learning model which then determines the VO2 max score/rating, wherein the secondary algorithm/machine learning model is different from the primary algorithm/machine learning model.

30. The mobile application of claim 29, wherein the step of performing the fitness analysis includes classifying the person into one of a plurality of fitness categories/classes, based on the person’s VO2 max score/rating.
31. A computing device on which a computer program or mobile application or web-based application is stored, which includes a set of instructions which, if executed by a processor, performs the following steps: receiving/obtaining images/video of a specific person which has been captured by using an image capturing device; conducting a liveness test, by using images/video captured by the image capturing device, to verify that the person captured in the images/video is a real person; verifying, by using a processor, the identity of the person by utilising an image captured by the image capturing device; and performing a fitness analysis on the person, using a processor, by utilising images/video captured from the image capturing device.
32. A non-transitory computer-readable storage medium on which the mobile application as claimed in claim 20 is stored.
33. A fitness assessment system for obtaining an indication of fitness, wherein the system includes: a liveness assessment module which is configured to utilise images/video of a person which has been captured by using an image capturing device, to verify that the person captured in the images/video is a real person; an identity verification module which is configured to utilise an image of a specific person which has been captured by using the image capturing device in order to verify the identity of the person; and a fitness analysis module which is configured to perform a fitness analysis on the person, using a processor, by utilising images/video captured by the image capturing device.
34. The system of claim 33, wherein the fitness analysis is to determine/estimate a VO2 max score/rating for the person.
35. The system of claim 34, wherein the image capturing device forms part of the system.
36. The system of claim 35, wherein the image capturing device is a camera of a computing device which forms part of the system.
37. The system of claim 36, wherein the verification module is configured to compare a captured image of the person with an image of the person stored on a database.
38. The system of claim 37, wherein the verification module is configured such that, if the verification fails, it implements an alternative verification process whereby the verification module: receives a scanned/captured image of an identification card/document, a driver’s license or passport of the person which depicts the person’s image; and compares the image of the person captured by the image capturing device with the image of the person depicted in the identification card/document, a driver’s license or passport, for verification purposes.
39. The system of claim 34, wherein the fitness analysis module is configured to detect a heart rate of the person by utilising images/video captured by the image capturing device.
40. The system of claim 39, wherein the fitness analysis module is configured to communicate to the person, via a user interface presented on a display screen of a computing device, what exercise(s) or movement(s) should be performed by the person during the fitness assessment.
41. The system of claim 40, wherein the verification module is configured to perform a second verification step whereby the identity of the person is again verified after the exercise has been performed by the person, by using image(s) captured by the image capturing device.
42. The system of claim 41 , wherein the verification module is configured to: receive an identification number/code from the person via a communication network; and utilise the identification number/code to retrieve an image which is associated with the identification number/code, which is then used during the verification step/process.
43. The system of claim 42, which is configured to receive/obtain user profile information from the person.
44. The system of claim 43, wherein the user profile information includes age, sex, height and weight.
45. The system of claim 43, wherein the fitness analysis module is configured to use at least some of the user profile information in order to determine what exercise(s) or movement(s) should be performed by the person during the fitness assessment.
46. The system of claim 45, wherein the fitness assessment module is configured to determine the VO2 max score/rating for the person by measuring recovery heart rate of the person after the person has performed the exercise(s) or movement(s), by utilising images/video captured by the image capturing device.
47. The system of claim 45, wherein the fitness assessment module is configured to determine whether the person has reached a predetermined peak heart rate.
48. The system of claim 47, wherein the fitness assessment module is configured to determine if the peak heart rate has been reached, and if so, determine the VO2 max score/rating for the person by: measuring recovery heart rate for the person by utilising images/video captured by the image capturing device, and using the recovery heart rate as an input to a primary algorithm/machine learning model which then determines the VO2 max score/rating.

49. The system of claim 48, wherein the fitness assessment module is configured, if the peak heart rate has not been reached, to determine the VO2 max score/rating for the person by: measuring a maximum heart rate achieved immediately after the person has completed the required exercise(s) or movement(s), by utilising images/video captured by the image capturing device, measuring recovery heart rate for the person by utilising images/video captured by the image capturing device, and using the maximum heart rate and recovery heart rate as an input to a secondary algorithm/machine learning model which is different from the primary algorithm/machine learning model.

50. The system of claim 49, wherein the fitness assessment module is configured to classify the person into one of a plurality of fitness categories/classes, based on the person’s VO2 max score/rating.
PCT/IB2023/058403 2022-08-30 2023-08-24 Fitness assessment system and methodology WO2024047482A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ZA202209638 2022-08-30
ZA2022/09638 2022-08-30

Publications (1)

Publication Number Publication Date
WO2024047482A1 true WO2024047482A1 (en) 2024-03-07

Family

ID=90098857


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210334570A1 (en) * 2018-06-11 2021-10-28 Laurence Hamid Liveness detection
US20220079452A1 (en) * 2017-06-02 2022-03-17 Apple Inc. Wearable computer with fitness machine connectivity for improved activity monitoring using caloric expenditure models
US20220147605A1 (en) * 2014-08-28 2022-05-12 Facetec, Inc. Method and apparatus for creation and use of digital identification



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23859565

Country of ref document: EP

Kind code of ref document: A1