GB2593847A - System, device and method - Google Patents


Info

Publication number
GB2593847A
GB2593847A
Authority
GB
United Kingdom
Prior art keywords
wearer
activity data
representation
garment
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB2109997.3A
Other versions
GB2593847B (en)
GB202109997D0 (en)
Inventor
Crofts Adam
Mahmood Tahir
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Prevayl Ltd
Original Assignee
Prevayl Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Prevayl Ltd filed Critical Prevayl Ltd
Priority to GB2109997.3A (patent GB2593847B)
Priority claimed from GB1908179.3A (patent GB2585360B)
Publication of GB202109997D0
Publication of GB2593847A
Application granted
Publication of GB2593847B
Legal status: Active

Classifications

    • G16H 40/67 ICT specially adapted for the remote operation of medical equipment or devices
    • A41D 1/002 Garments adapted to accommodate electronic equipment
    • A61B 5/1118 Determining activity level
    • A61B 5/6804 Sensor mounted on worn items; garments, clothes
    • A61B 5/6805 Vests
    • G16H 20/30 ICT for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy or exercising
    • A61B 5/01 Measuring temperature of body parts
    • A61B 5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions
    • A61B 5/021 Measuring pressure in heart or blood vessels
    • A61B 5/024 Detecting, measuring or recording pulse rate or heart rate
    • A61B 5/1112 Global tracking of patients, e.g. by using GPS
    • A61B 5/1114 Tracking parts of the body
    • A61B 5/1172 Identification of persons using fingerprinting
    • A61B 5/14532 Measuring characteristics of blood in vivo for measuring glucose
    • A61B 5/14542 Measuring characteristics of blood in vivo for measuring blood gases
    • A61B 5/318 Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • A61B 5/4519 Evaluating the musculoskeletal system: muscles
    • A61B 5/4809 Sleep detection, i.e. determining whether a subject is asleep or not
    • A61B 5/744 Displaying an avatar, e.g. an animated cartoon character
    • G16H 10/60 ICT for patient-specific data, e.g. electronic patient records

Abstract

A system 100 comprises a garment 11 and an electronic device 13. The garment comprises a sensor arranged to monitor the activity of a wearer of the garment and a communicator arranged to receive the activity data from the sensor and transmit the activity data. The electronic device comprises a receiver, a processor and a display unit. The receiver is arranged to receive the activity data. The processor is arranged to track the motion of the wearer of the garment and generate a representation of the received activity data. The display unit is arranged to simultaneously display a representation of the wearer and the representation of the received activity data. The representation of the activity data is displayed at a position determined based on the tracked motion of the wearer. The electronic device is further arranged to receive a user credential from a user, and the receiver only receives the activity data if the user is authorised based on the obtained user credential as having permission to access the activity data. The electronic device can transmit the credential to a server 15 so that the server determines if the user is authorised.

Description

SYSTEM, DEVICE AND METHOD

The present invention is directed towards a system, device and method. The present invention is directed in particular towards displaying activity data associated with a sensor incorporated into a garment in an intuitive way.
Background
Garments incorporating sensors are wearable electronics designed to interface with a wearer of the garment, and to determine information such as the wearer's heart rate, rate of respiration, activity level, and body positioning. Such properties can be measured with a sensor assembly that includes a sensor for signal transduction and/or microprocessors for analysis. Such garments are commonly referred to as 'smart clothing'.
It would be desirable to provide a mechanism for intuitively displaying activity data recorded by the sensor of the garment.
Summary
According to the present disclosure there is provided a system, electronic device, and method as set forth in the appended claims. Other features of the invention will be apparent from the dependent claims, and the description which follows.
According to a first aspect of the invention there is provided a system. The system comprises a garment. The garment comprises a sensor arranged to monitor the activity of a wearer of the garment. The garment comprises a communicator arranged to receive the activity data from the sensor and transmit the activity data. The system comprises an electronic device. The electronic device comprises a receiver arranged to receive the activity data. The electronic device comprises a processor arranged to track the motion of the wearer of the garment and generate a representation of the received activity data. The electronic device comprises a display unit arranged to simultaneously display a representation of the wearer and the representation of the received activity data. The representation of the activity data is displayed at a position determined based on the tracked motion of the wearer.
Beneficially, the representation of the received activity data is displayed at a position determined based on the tracked motion of the wearer. In this way, the display of the activity data is linked to the motion of the wearer. This approach is more intuitive to the user as the user is able to quickly ascertain the relevance of the activity data based on the displayed position of the activity data.
The position of the representation of the activity data may correspond to the position of a feature of interest on a wearer of the garment. The feature of interest may, for example, be a muscle, muscle group or organ of the wearer. The feature of interest may be a region of the wearer such as the chest, legs, arms, or head of the wearer.
The method may comprise determining the location of a feature of interest on the wearer or the garment. The representation of the activity data may be positioned relative to the determined location of the feature of interest. For example, the representation of the activity data may be positioned at a location that corresponds to the feature of interest or which is offset by a predetermined distance relative to the feature of interest.
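By way of a non-limiting illustration, the positioning described above can be sketched as follows. The function name, the 2D screen coordinates, and the fixed-offset scheme are illustrative assumptions, not taken from the claims:

```python
# Sketch: place the activity-data representation relative to a tracked
# feature of interest, optionally offset by a predetermined displacement.

def overlay_position(feature_xy, offset_xy=(0, 0)):
    """Return the display position for the activity-data representation.

    feature_xy -- (x, y) of the feature of interest (e.g. the chest region)
    offset_xy  -- predetermined displacement relative to that feature
    """
    fx, fy = feature_xy
    dx, dy = offset_xy
    return (fx + dx, fy + dy)

# A representation anchored 40 pixels above the detected heart region:
print(overlay_position((320, 240), offset_xy=(0, -40)))  # → (320, 200)
```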
The representation of the activity data may at least partially overlay the displayed representation of the wearer. In this way, the representation of the activity data may augment the displayed representation of the wearer. Beneficially, the representation of the activity data may overlay the representation of the wearer. The representation of the activity data may overlay the representation of the wearer at a position that corresponds or relates to the activity data. For example, cardiac activity data may overlay the cardiac region of the representation of the wearer and muscle activity data may overlay the relevant muscle of the representation of the wearer.
Displaying the representation of the wearer of the garment may comprise displaying a live view image of the wearer. A live view image will be understood as referring to a real-time moving image, that is, a real-time video, otherwise known as a live video feed. The electronic device may comprise or may be communicatively coupled to a camera arranged to capture the live view image. Beneficially, a user may view a live view image of the wearer which may be overlaid with a representation of the activity data. In this way, the display of the live view image is enhanced with additional information about the activity of the wearer.
Displaying the representation of the wearer of the garment may comprise displaying an avatar representation of the wearer. The avatar representation of the wearer may be a 3D representation of the wearer. The avatar representation of the wearer may be generated based on the tracked motion of the wearer. That is, the avatar may mimic the motion of the wearer.
The representation of the activity data may change in one or more of the size, shape, colour and texture based on the received activity data. This may be beneficial in conveying physiological information. The representation of the activity data may change in real time as the activity data changes.
The representation of the activity data may be in the form of an augmented reality object.
The representation of the activity data may be animated based on the activity data.
The representation of the activity data may represent a physiological or emotional state of the wearer, optionally wherein the physiological or emotional state of the wearer relates to a muscle or muscle group of the wearer, an organ of the wearer, or a condition of the wearer. The representation of the activity data may relate to neurological data of the wearer. The emotional state may relate to a stress level of the wearer for example.
The activity data may comprise activity data related to a muscle or muscle group of the wearer of the garment. The position of the representation of the activity data may be determined to correspond to an estimated location of the muscle or muscle group of the wearer as determined from the tracked motion of the wearer. The representation of the activity data may be displayed such that it overlays the representation of the wearer at the position corresponding to the estimated location of the muscle or muscle group of the wearer.
The muscle or muscle groups may comprise one or more of the triceps, deltoids, pectorals, abdominals, quadriceps, hamstrings, gluteals, and forearms. The representation of the activity data may represent a physiological state of the muscle or muscle group.
The activity data may comprise cardiac activity data. The position of the representation of the activity data may be determined to correspond to an estimated location of the heart of a wearer of the garment as determined from the tracked motion of the wearer. The representation of the activity data may be displayed such that it overlays the representation of the wearer at the position corresponding to the estimated location of the heart of the wearer.
The representation of the activity data may represent a physiological state of the heart such as the heart rate.
The representation of the activity data may be in the form of a 2D or 3D model of the heart. The model of the heart may be animated to beat at a rate corresponding to the heart rate of the wearer as determined from the activity data.
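A minimal sketch of driving such an animation from the measured heart rate is given below. The sinusoidal pulse and the amplitude value are illustrative assumptions; the patent leaves the animation scheme open:

```python
# Sketch: scale a heart model over time so that it pulses at the wearer's
# measured heart rate (beats per minute).
import math

def heart_scale(t_seconds, bpm, amplitude=0.1):
    """Scale factor for the heart model at time t, pulsing at `bpm`."""
    beats_per_second = bpm / 60.0
    phase = 2 * math.pi * beats_per_second * t_seconds
    return 1.0 + amplitude * math.sin(phase)

# At 60 bpm one full pulse takes exactly one second:
assert abs(heart_scale(0.0, 60) - 1.0) < 1e-9
assert abs(heart_scale(1.0, 60) - 1.0) < 1e-9
```

A renderer would call `heart_scale` once per frame and apply the returned factor to the 2D or 3D heart model.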
The representation of the activity data may be in the form of a 2D or 3D object representing a feature of interest of the wearer.
The representation of the activity data may relate to factors such as the movement and/or velocity of the wearer. The representation of the activity data may be in the form of instructions for the wearer generated based on the activity data. For example, the representation of the activity data may be an instruction to the wearer to slow down or speed up. The instruction may be determined by analysing the user's cardiac activity data or respiration activity data, for example.
Tracking the motion of the wearer may comprise receiving a live view image of the wearer and may comprise processing the live view image to track the motion of the wearer.
The garment may further comprise a fiducial marker located on an outside surface of the garment. Beneficially, a fiducial marker is useable as a point of reference for the garment and thus enables the position of the garment and the motion of the garment over time to be monitored simply by capturing images of the garment. In this way, the motion of the wearer of the garment is tracked by determining the location of the fiducial marker in the captured image. The fiducial marker may be in the form of a 2D image. The fiducial marker of the present invention is beneficial as it is simple, of low cost and does not negatively affect the comfort of the garment for the wearer. The fiducial marker may be an augmented reality (AR) marker.
The marker may have a limited visual footprint on the garment. This means that the marker may be sufficiently small that it is not easily visible by the naked eye but is still visible in the captured image. In this way, the marker does not affect or has a minimal effect on the appearance of the garment. In some examples, the marker is visible to the naked eye.
The method may further comprise processing the image to determine the location of the fiducial marker.
The processor may be arranged to track the motion of the wearer of the garment by obtaining a live view image of the garment and by processing the obtained live view image to determine the location of the fiducial marker. The position of the representation of the activity data on the display may be determined according to the determined location of the fiducial marker. The position of the representation of the activity data may be determined by applying a predetermined displacement to the determined location of the fiducial marker. Beneficially, this is a computationally simple way to track the motion of the wearer/garment and determine the position of the representation of the activity data.
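The marker-based positioning described above can be sketched as follows. The detector is stubbed out: in practice a library such as OpenCV's ArUco module could supply per-frame marker coordinates, but that choice, the displacement value, and all names here are illustrative assumptions:

```python
# Sketch: each captured frame yields the fiducial marker's pixel location;
# the representation's position is that location plus a predetermined
# displacement, so the overlay follows the wearer's motion.

PREDETERMINED_DISPLACEMENT = (25, -60)  # illustrative offset to the heart region

def detect_marker(frame):
    """Stub detector: here the 'frame' already carries the marker location."""
    return frame["marker_xy"]

def representation_position(frame):
    mx, my = detect_marker(frame)
    dx, dy = PREDETERMINED_DISPLACEMENT
    return (mx + dx, my + dy)

# As the wearer moves between frames, the overlay position tracks the marker:
frames = [{"marker_xy": (100, 300)}, {"marker_xy": (110, 298)}]
print([representation_position(f) for f in frames])
# → [(125, 240), (135, 238)]
```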
The fiducial marker may comprise a plurality of fiducial markers. The plurality of fiducial markers may be located at different locations on the garment. The plurality of fiducial markers may be arranged in a geometric pattern. The plurality of fiducial markers may be arranged together on the garment to form a decorative item.
The fiducial marker may be integrated into the garment. The fiducial marker may be printed onto or into the garment. Any known garment printing technique may be used such as screen printing or inkjet printing.
The fiducial marker may be incorporated into the stitching of the garment, and/or a seam of the garment, and/or a hem of the garment, and/or a neckline of the garment, and/or a collar of the garment, and/or a sleeve of the garment, and/or a cuff of the garment, and/or a pocket of the garment, and/or a body of the garment, and/or a fastener of the garment. The fastener may be a zipper, button, clasp, toggle, stud, snap fastener, popper, eyelet, buckle, tie or ribbon.
The fiducial marker may be incorporated into or form part of a visual element on the garment which may be a decorative item in the garment. The decorative item may be a logo, design, image, motif or pattern on the garment. In this way, the fiducial marker may contribute to or enhance the appearance of the garment.
The fiducial marker may further comprise a code string identifying the garment encoded into a visual symbol. The electronic device may be arranged to obtain an image of the garment. The electronic device may be arranged to process the image to generate a data string representing the visual symbol. The electronic device may be arranged to use the data string to access activity data associated with the sensor of the garment identified by the code string. Significantly, a garment comprising a marker is provided. The marker comprises a (unique) code string identifying the garment encoded into a visual symbol. When the visual symbol is imaged and the data string is obtained from the image, the data string is used to access activity data associated with the garment. In this way, access to the activity data for the garment may be controlled by imaging the garment and processing the resultant image. This enables the access of activity data to be controlled using a simple electronic device such as a portable electronic device with an integrated camera. As a result, an electronic device which may not be in direct communication with the garment is able to access activity data in a controlled way by imaging the garment and processing the resultant image. Beneficially, access to the data obtained from the sensor(s) located on the wearer's garment is controlled through a simple and intuitive procedure of imaging the garment. This approach enables different electronic devices to access activity data associated with the garment in a controlled way. Significantly, tracking of the motion of the garment and controlling access to the activity data can be controlled through the use of the same fiducial marker.
The data string may be used to access the activity data by establishing, based on the data string, the identity of the garment, and by accessing the activity data associated with the sensor of the identified garment. The establishing may be performed by the electronic device or a server in communication with the electronic device. The establishing of the identity of the garment may comprise decoding the data string so as to obtain the code string and may further comprise identifying the garment based on the code string. The data string may be a simple digitised representation of the visual symbol or may be an encrypted version of the code string. The electronic device or the server may run a decoding algorithm to generate the code string from the data string. The electronic device or server may provide a database to store one or a plurality of code strings each associated with a different garment. The identity of the garment may be established based on which of the code strings in the database the generated data string (once decoded) matches.
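A non-limiting sketch of this decode-and-match flow follows. Base64 stands in for whatever encoding the visual symbol uses, and the code string format and database are hypothetical; the patent leaves all of these open:

```python
# Sketch: recover the code string from the digitised symbol (the data
# string), match it against a database of known garments, and return the
# associated activity data.
import base64

GARMENT_DB = {"GRMNT-0001": "activity data for garment 0001"}  # hypothetical store

def decode_data_string(data_string):
    """Run the decoding algorithm on the digitised symbol representation."""
    return base64.b64decode(data_string).decode("ascii")

def access_activity_data(data_string):
    code_string = decode_data_string(data_string)
    if code_string not in GARMENT_DB:
        raise KeyError("unknown garment")
    return GARMENT_DB[code_string]

# Imaging the garment would yield the data string; here we simulate it:
data_string = base64.b64encode(b"GRMNT-0001").decode("ascii")
print(access_activity_data(data_string))  # → activity data for garment 0001
```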
The electronic device may be further arranged to obtain a user credential from a user. The activity data may only be accessed if the user is authorised, based on the obtained user credential, as having permission to access the activity data. In this way, access to the activity data is controlled such that only a user with the requisite user credentials is able to access the activity data. The activity data may only be accessed if the user is authorised as having permission to access the activity data. The user credential may be in the form of a password or passcode. The user credential may be in the form of biometric data.
The obtained user credential may be used to determine whether the user is authorised to access the activity data. The activity data may be provided to the user only if the user is authorised. The determining and providing may be performed by the electronic device or a server. Different users may have different permissions levels and thus may have permission to access different quantities or types of activity data. The permission level of the user may be determined, and the activity data may be provided to the user based on the determined permission level.
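The permission-level scheme above can be sketched as follows. The credential-to-level mapping, the level values, and the field names are all illustrative assumptions rather than anything specified by the patent:

```python
# Sketch: gate access to activity data by the user's permission level, so
# that different users receive different quantities or types of data.

PERMISSIONS = {"wearer": 2, "trainer": 1}  # hypothetical credential -> level

def provide_activity_data(user, activity_data):
    level = PERMISSIONS.get(user)
    if level is None:
        raise PermissionError("user not authorised to access activity data")
    if level >= 2:
        return activity_data  # full access for the highest level
    # Lower levels see only a restricted subset of the metrics:
    return {k: v for k, v in activity_data.items()
            if k in ("heart_rate", "respiration")}

data = {"heart_rate": 62, "respiration": 14, "hydration": 0.55}
print(provide_activity_data("trainer", data))
# → {'heart_rate': 62, 'respiration': 14}
```

In the claimed system the determining step may equally run on a server, with the electronic device forwarding the credential and receiving only the permitted data.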
The electronic device may be communicatively coupled to a server. The electronic device may transmit the data string to the server so that the server is able to establish the identity of the garment from the data string. The electronic device may receive, from the server, the activity data.
The electronic device may obtain a user credential from a user. The activity data may only be accessed if the user is authorised, based on the obtained user credential, as having permission to access the activity data. The electronic device may transmit the user credential to a server so that the server is able to determine if the user is authorised as having permission to access the activity data based on the obtained user credential. The electronic device may receive, from the server, the activity data if the user is authorised by the server as having permission to access the activity data.
The sensor may comprise a plurality of sensors. The sensor may comprise one or a combination of bioelectrical sensors, biomagnetic sensors, biochemical sensors, biomechanical sensors, bioacoustic sensors, biooptical sensors, and biothermal sensors. The bioelectrical sensor may be a biopotential and/or bioimpedance sensor. The biopotential sensor may be an ECG or EKG (electrocardiogram), EEG (electroencephalogram), or EMG (electromyogram) sensor. The bioimpedance sensor may be a plethysmography (e.g., for respiration), body composition (e.g., hydration, fat, etc.), or EIT (electroimpedance tomography) sensor.
The garment may be one of a shirt, t-shirt, blouse, dress, brassiere, shorts, pants, arm or leg sleeve, jacket/coat, glove, vest, armband, underwear, headband, hat/cap, collar, wristband, stocking, sock, or shoe.
The garment may be a shirt. The marker may be located on the collar, yoke, sleeve, gauntlet, cuff, body, pocket, placket, or fastener of the shirt. The shirt may comprise a plurality of fiducial markers. The plurality of fiducial markers may be located at a plurality of different positions on the shirt. The plurality of different positions on the shirt may comprise one or more of the collar, yoke, sleeve, gauntlet, cuff, body, pocket, placket, or fastener of the shirt.
The garment may be a T-shirt. The marker may be located on the neckline, sleeve, cuff, body or hem of the T-shirt. The T-shirt may comprise a plurality of fiducial markers. The plurality of fiducial markers may be located at a plurality of different positions on the T-shirt. The plurality of different positions on the T-shirt may comprise one or more of the neckline, sleeve, cuff, body or hem of the T-shirt.
The garment may comprise a plurality of sensors. The activity data obtained by the sensors may be transmitted by the communicator of the garment to a server. The transmission may be performed over a high throughput wireless communication protocol such as 5G.
The garment may be worn by a first person referred to as the "wearer". A second person referred to as the "user" may be in possession of the electronic device, which may be an electronic device such as a mobile phone. The second person may desire to see activity data for the wearer as
recorded by the sensors of the garment. For example, the user may be a personal trainer who may desire to view metrics such as the wearer's heart rate, respiration levels and hydration levels. The user may also be a healthcare professional such as a physiotherapist or doctor. In some examples, the "user" and the "wearer" refer to the same person. For example, the electronic device may be a television apparatus with an integrated camera. The wearer of the garment may stand in front of the television apparatus and may be captured by the camera of the television apparatus. The television apparatus may then display the activity data so that the wearer may view their activity information.
According to a second aspect of the invention, there is provided an electronic device. The electronic device comprises a receiver arranged to receive activity data relating to a garment. The electronic device comprises a processor arranged to track the motion of the wearer of the garment and generate a representation of the received activity data. The electronic device comprises a display unit arranged to simultaneously display a representation of the wearer and the representation of the received activity data, wherein the representation of the activity data is displayed at a position determined based on the tracked motion of the wearer.
The electronic device may further comprise a camera arranged to capture a live view image of the wearer. The displayed representation of the wearer may be the captured live view image of the wearer.
According to a third aspect of the invention, there is provided a method. The method comprises receiving activity data relating to a garment. The method comprises tracking the motion of the wearer of the garment. The method comprises generating a representation of the activity data.
The method comprises simultaneously displaying a representation of the wearer and the representation of the received activity data, wherein the representation of the activity data is displayed at a position determined based on the tracked motion of the wearer.
According to a fourth aspect of the invention, there is provided a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of the third aspect of the invention.
According to a fifth aspect of the invention, there is provided a computer readable storage medium comprising instructions which, when executed by a computer, cause the computer to carry out the method of the first, second, or third aspect of the invention.
Brief Description of the Drawings
Examples of the present disclosure will now be described with reference to the accompanying drawings, in which:
Figure 1 shows an example system according to aspects of the present invention;
Figure 2 shows an example user interface according to aspects of the present invention;
Figure 3 shows an example user interface according to aspects of the present invention;
Figure 4 shows an example user interface according to aspects of the present invention;
Figure 5 shows a flow diagram for an example method according to aspects of the present invention;
Figure 6 shows a flow diagram for an example method according to aspects of the present invention;
Figure 7 shows a flow diagram for an example method according to aspects of the present invention;
Figures 8A and 8B show example markers in accordance with aspects of the present invention;
Figure 9 shows a schematic diagram of an example electronic device according to aspects of the present invention; and
Figure 10 shows an example user interface according to aspects of the present invention.
Detailed Description
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein can be made without departing from the scope and spirit of the disclosure. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to the bibliographical meanings, but are merely used by the inventor to enable a clear and consistent understanding of the disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the disclosure is provided for illustration purposes only and not for the purpose of limiting the disclosure as defined by the appended claims and their equivalents.
It is to be understood that the singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise.
Referring to Figure 1, there is shown an example garment 11, electronic device 13, and server 15 in accordance with aspects of the present invention. The garment 11, electronic device 13, and server 15 form an example system 100 in accordance with aspects of the invention.
The garment 11 in the example of Figure 1 is in the form of a T-shirt. The garment 11 comprises two sensors 101, 103 arranged to monitor the activity of the wearer of the garment 11. In this example, one of the sensors 101 acts as a monitoring unit and the other sensor 103 acts as a reference unit. The sensors 101, 103 are communicatively coupled to a communicator 105 arranged to communicate activity data to the server 15. Of course, both sensors 101, 103 may have their own communicator in some examples.
While not required in all embodiments of the present invention, the garment 11 comprises a marker 107 located on the outside surface of the garment 11 and in particular on the main body of the T-shirt. The marker 107 may be a fiducial marker used in tracking the motion of the garment/wearer. The marker 107 may additionally or separately comprise a code string identifying the garment encoded into a visual symbol as shown in Figures 8A and 8B. The marker 107 is arranged such that, when imaged by an image capturing device such as the camera of the user electronic device 13, the marker 107 is useable to access activity data associated with the garment 11.
The electronic device 13 in the example of Figure 1 is a user electronic device 13 in the form of a mobile phone with an integrated camera. The user electronic device 13 comprises a receiver, a storage, a controller, a display unit 109, a camera and a user input unit. The controller provides overall control to the user electronic device 13. The receiver receives the activity data.
The receiver may receive the activity data directly from the communicator 105 of the garment or may receive the activity data from the server 15. That is, the receiver may be in direct communication with the garment 11 or in indirect communication via the server 15. The receiver may be in the form of a communicator and may transmit and receive various pieces of information required for communication with a server 15 under the control of the controller. If required, the communicator may transmit the data string to the server and may receive activity data from the server. The user input unit receives inputs from the user such as a user credential. The camera captures the image of the garment. The storage stores information for the user terminal. The display unit 109 is arranged to simultaneously display a representation of the wearer and a representation of the activity data. The display unit 109 may be a presence-sensitive display and therefore may comprise the user input unit. The presence-sensitive display may include a display component and a presence-sensitive input component. The presence-sensitive display may be a touch-screen display arranged to provide the user interface.
The user electronic device 13 may also include a biometric sensor. The biometric sensor may be used to identify a user or users of the device based on unique physiological features. The biometric sensor may be: a fingerprint sensor used to capture an image of a user's fingerprint; an iris scanner or a retina scanner configured to capture an image of a user's iris or retina; an ECG module used to measure the user's ECG; or the camera of the user electronic device 13 arranged to capture the face of the user. The biometric sensor may be an internal module of the user electronic device. The biometric module may be an external (stand-alone) device which may be coupled to the user electronic device 13 by a wired or wireless link.
Electronic devices in accordance with the present invention are not limited to mobile phones and may take the form of any electronic device which may be used by a user to perform the methods according to aspects of the present invention. The electronic device may be a mobile electronic device such as a smartphone, tablet personal computer (PC), mobile phone, smart phone, video telephone, laptop PC, netbook computer, personal digital assistant (PDA), mobile medical device, camera or wearable device. The wearable device may include a head-mounted device such as an Augmented Reality, Virtual Reality or Mixed Reality head-mounted device. The user electronic device may be a desktop PC, workstation, television apparatus or projector, e.g. arranged to project a display onto a surface.
The server 15 may be a single device or may comprise a plurality of distributed devices communicatively coupled to one another, e.g. as a cloud-based server such as a cloud server network. The server comprises a communicator, a storage, and a controller. The controller provides overall control to the server. The communicator transmits and receives various pieces of information required for communication with a user electronic device and/or garment under the control of the controller. The storage stores information for the server such as code strings identifying garments and user credential information.
In an example operation of the system shown in Figure 1, the communicator 105 of the garment 11 communicates activity data monitored by the sensors 101, 103 to the server 15. The receiver of the user electronic device 13 is in communication with the server 15 and receives the activity data from the server 15. The activity data received by the user electronic device 13 may not be the same as the activity data transmitted by the garment 11 as the server 15 may perform one or more processing operations on the received activity data prior to transmitting the processed activity data to the user electronic device 13. The camera of the user electronic device 13 obtains live view image data (a live video feed) of the garment/wearer. The processor of the user electronic device 13 tracks the motion of the garment/wearer from the live view image. The processor of the user electronic device 13 further generates a representation of the received activity data for display. The display unit 109 of the user electronic device 13 simultaneously displays a representation of the wearer in the form of a live view image and the representation of the received activity data. The representation of the received activity data is displayed at a position determined based on the tracked motion of the wearer.
Referring to Figure 2, there is shown an example user interface 200 displayed on the user electronic device 13 according to aspects of the present invention.
The user interface 200 includes a display of a representation of the wearer of the garment 201. In this example, the representation of the wearer of the garment 201 is in the form of a 3D avatar. The 3D avatar will move as the wearer moves as a result of the motion tracking performed using the fiducial markers provided on the garment 11 or performed through marker-less tracking using known image processing algorithms. A more refined motion estimate may be provided by incorporating motion sensors such as accelerometers and gyroscopes into the garment. In other examples, the displayed representation of the wearer of the garment 201 is in the form of a live view image as captured by the camera of the user electronic device 13. In this example, the additional active motion sensors are not required for the garment 11 but may still be provided.
The user interface 200 also includes a display of a representation of the activity data 203 received from the server 15. The representation of the activity data 203 in this example is in the form of an object 203 that overlays the abdominal muscles of the wearer. The representation of the activity data 203 changes colour and size depending on whether the activity data indicates that the abdominal muscles are in contraction or relaxation. In Figure 2, the abdominal muscles are in contraction and as such the object 203 has a dark colour. When the abdominal muscles are in relaxation the colour of the object 203 lightens. The representation of the activity data changes in size to mimic the contraction and relaxation of the muscles. Of course, other visual representations of the activity data relating to the abdominal muscles may be provided.
In the example of Figure 2, the object 203 is displayed at a position determined according to the location of the fiducial marker 107 (Figure 1) on the garment 11. In particular, the fiducial marker 107 acts as a reference position for the garment 11 in relation to the wearer of the garment 11. The position of the object to be displayed is determined using the position of the fiducial marker 107 indicated by the coordinate (x1, y1) in the image and a predetermined displacement indicated by the coordinate (x2, y2) from the marker to a feature of interest on the wearer such as the abdominal muscles. In particular, the position of the object 203 to be displayed can be determined as (x1, y1) + (x2, y2). Of course, the predetermined displacement is not required to be in the x and y directions. A displacement may be performed in a combination of any of the x, y and z directions as required.
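The displacement calculation described above can be expressed as a short sketch; the function name and tuple convention are illustrative assumptions rather than part of the disclosure:

```python
def overlay_position(marker_xy, displacement_xy):
    """Compute the display position of the activity-data object as the
    detected fiducial marker location (x1, y1) plus a predetermined
    displacement (x2, y2) to the feature of interest on the wearer."""
    x1, y1 = marker_xy
    x2, y2 = displacement_xy
    return (x1 + x2, y1 + y2)
```

For example, a marker detected at pixel (120, 80) with a predetermined displacement of (15, 40) to the abdominal region places the object at (135, 120). The same element-wise addition extends naturally to a third (z) component for 3D displacements.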
The user interface 200 of Figure 2 displays additional activity data for the wearer and other data for the garment 11 at positions which are not necessarily determined based on the location of the fiducial marker 107 on the garment 11. The user interface 200 includes an ECG trace 205 and heartrate data 207; the signal strength 209 for the communicator of the garment 11; the battery level 211 for a battery of the garment 11; GPS coordinate data 213; core body temperature and skin surface temperature 215; the oxygen level 217; blood pressure and blood glucose levels 219; sleep tracking, step tracking and hydration level 221; and fat level, calories burned, blood lactate level as well as an indication of the type of calories burned 223. The user interface 200 also displays warnings 225, 227 indicating the wearer's VO2 and hydration levels are concerning. Of course, the user interface 200 in Figure 2 is just one example interface and other forms of bio data may be displayed to the user in a different way.
Referring to Figure 3, there is shown another example user interface 300 displayed on the user electronic device 13. The user interface 300 displays a representation of the wearer 301 which may be a 3D avatar or a live view image of the wearer. In addition, the user interface 300 overlays the representation of the wearer 301 with two objects 303, 305. The two objects 303, 305 are displayed at positions that correspond to the location of the pectoral muscle area of the wearer. The positions of the objects 303, 305 are determined according to the location of the marker on the garment as described above in relation to Figure 2. The objects 303, 305 are representations of the activity data of the wearer relating to the pectoral muscle area. The objects 303, 305 change colour and size based on the received activity data, for example to indicate whether the muscles are in contraction or relaxation.
Referring to Figure 4, there is shown another example user interface 400 displayed on the user electronic device 13. The user interface 400 displays a representation of the wearer 401 which may be a 3D avatar or a live view image of the wearer. In addition, the user interface 400 overlays the representation of the wearer 401 with an object 403. The object 403 is displayed at a position that corresponds to the location of the heart (the representation of the wearer 401 is displayed as a mirror image in Figure 4). The position of the object 403 is determined according to the location of the marker on the garment as described above in relation to Figure 2. The object 403 is a representation of the cardiac activity data of the wearer. The object 403 is in particular an animated 3D model of a heart that beats at a rate corresponding to the heart rate of the wearer as determined from the activity data.
In addition, the user interface may display information relating to past or predicted future movements undertaken by the wearer. The garment may incorporate one or more motion sensors such as accelerometers or gyroscopes which may be used to derive position and velocity data for the wearer. This information may be displayed such that the user can view how the wearer has moved over time. Moreover, based on past motion information, a future motion of the wearer may be estimated and displayed. The motion information may be displayed as a series of points on the display such as in the form of a point cloud.
Referring to Figure 5, there is shown an example method according to aspects of the present invention that allows the user to view activity data on their user electronic device in an intuitive and easy-to-use way.
In step 501 of the method, the user electronic device receives activity data relating to a garment. The activity data may be directly received from the garment or may be indirectly received via a server.
In step 502 of the method, the user electronic device tracks the motion of the wearer of the garment. In one example, the user electronic device may capture a live view image of the wearer and process the captured live view image to track the motion of the wearer. The tracking of the motion may be marker-based or marker-less tracking.
In step 503 of the method, the user electronic device generates a representation of the activity data.
In step 504 of the method, the user electronic device simultaneously displays a representation of the wearer and the representation of the activity data. The representation of the activity data is displayed at a position determined based on the tracked motion of the wearer. The activity data may relate to one or more biosignals of the user such as the heart rate, respiration rate and hydration level of the user. In this way, the user is able to obtain biosignal data for the wearer in real time so as to enable the user to observe and assess the performance of the wearer. This is particularly useful when the user is a personal trainer or healthcare professional.
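The four steps of Figure 5 can be summarised in a hypothetical pipeline; the callables `receive`, `track`, `render` and `show` are illustrative stand-ins for the device's receiver, motion tracker, renderer and display unit, and are not part of the disclosure:

```python
def run_method_step(receive, track, render, show, frame):
    """One pass through steps 501-504 of the example method:
    receive activity data, track the wearer in the camera frame,
    generate a representation, and display it at the tracked position."""
    data = receive()               # step 501: receive activity data
    position = track(frame)        # step 502: track the wearer's motion
    representation = render(data)  # step 503: generate a representation
    show(representation, position) # step 504: display at the tracked position
    return representation, position
```

In a live system this pass would repeat for every camera frame, so the representation follows the wearer as they move.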
Referring to Figure 6, there is shown a more detailed overview of the example method of Figure 5. In Figure 6, step 601 is the same as step 501 in Figure 5. Step 502 of Figure 5 is split into two separate steps performed by the user electronic device which are referred to as steps 602 and 603. In step 602, the user electronic device obtains a live view image of the garment. The user electronic device may comprise a camera that captures the live view image of the garment. In step 603, the user electronic device processes the live view image to track the motion of the wearer of the garment. Steps 604 and 605 are the same as steps 503 and 504 in Figure 5.
Referring to Figure 7, there is shown a more detailed overview of the example method of Figure 6. In Figure 7, steps 701, 702, 704 and 705 are the same as steps 601, 602, 604 and 605 of Figure 6.
In step 703, the user electronic device processes the live view image to track the motion of the wearer of the garment by determining the location of a fiducial marker on the garment.
According to some aspects of the present invention, the garment is provided with a marker located on an outside surface of the garment. The marker comprises a code string identifying the garment encoded into a visual symbol. In more detail, a code is generated for the garment using a random number generator. The code may be in a code format with sufficient address space to enable a sufficient number of different codes to be generated. For example, the code format may be in the form of a 14-character hexadecimal number. Once the code is generated by the random number generator, a processor running an algorithm converts the code into a visual symbol which is printed or otherwise manufactured onto or into the garment. Encoding the code into a visual symbol is beneficial because the original code cannot be parsed from the visual symbol without access to the encoding algorithm. Moreover, the visual symbol is easily machine readable by providing image data of the visual symbol captured using a camera. As an added benefit the visual symbol is also useable as a fiducial marker for tracking the motion of the garment.
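The code generation step might be sketched as follows, assuming Python's standard `secrets` module as the random number generator; the function name is illustrative and not part of the disclosure:

```python
import secrets

def generate_garment_code(n_chars: int = 14) -> str:
    """Generate a random hexadecimal code string identifying a garment.
    A 14-character hexadecimal code gives 16**14 (about 7.2e16) possible
    values, providing ample address space for distinct garments."""
    return "".join(secrets.choice("0123456789ABCDEF") for _ in range(n_chars))
```

A cryptographically strong generator such as `secrets` makes the codes hard to guess, which complements the point above that the code cannot be recovered from the visual symbol without the encoding algorithm.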
Referring to Figure 8A, there is shown an example marker 801 in accordance with the present invention. The marker 801 in this example is based on the Vcode ® provided by VST Enterprises TM and comprises a visual symbol in the form of black marks upon white pathways. The black marks represent the characters in the code string. The visual symbol may additionally encode redundant information for error detection, correction, and uniqueness over different rotations of the marker.
Referring to Figure 8B, there is shown another example marker 803 in accordance with the present invention. The marker 803 in this example is derived from the AR marker system known as ARTag. The marker 803 comprises a visual symbol in the form of a 6x6 grid of black or white cells which represent 36 binary '0' or '1' symbols. The 36-bit sequence encodes the code string and may additionally encode redundant information for error detection, correction and uniqueness over the different rotations of the marker.
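The row-by-row reading of such a 6x6 grid into a 36-bit sequence might be sketched as follows. The function names are illustrative, and real ARTag-style decoding additionally handles rotation invariance and error correction, which are omitted here:

```python
def grid_to_bits(grid):
    """Read a 6x6 grid of cells (1 = black, 0 = white) row by row
    into a 36-character binary string, as in an ARTag-style marker."""
    assert len(grid) == 6 and all(len(row) == 6 for row in grid)
    return "".join(str(cell) for row in grid for cell in row)

def bits_to_grid(bits):
    """Inverse operation: lay a 36-bit string out as a 6x6 grid."""
    assert len(bits) == 36
    return [[int(bits[6 * r + c]) for c in range(6)] for r in range(6)]
```

In practice some of the 36 bits would carry the code string and the remainder the redundant bits for error detection and correction mentioned above.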
In both examples, the code string/data string may be retrieved from the marker 801, 803 by processing an image containing the visual symbol. It will be appreciated that known image processing operations such as contour extraction and edge detection may be used to read the symbol from the image.
It will be appreciated that the marker 801, 803 in accordance with the present invention is not limited to the examples of markers shown in Figures 8A and 8B. Instead, other forms of markers 801, 803 that encode a code string identifying the garment into a visual symbol may be used. In most preferred examples, the markers 801, 803 are additionally used as fiducial markers 801, 803. This means that the markers 801, 803 act as a point of reference for the garment and thus enable the position of the garment and the motion of the garment over time to be monitored simply by capturing images of the garment. Generally, the marker 801, 803 is preferred to be a bitonal marker as this means that there is no need to identify different shades of grey within the marker 801, 803 during the image processing operation to identify and decode the marker 801, 803. This beneficially helps reduce the sensitivity to lighting conditions and camera settings. Of course, in some examples the marker 801, 803 may not be bitonal and may comprise different grey levels or indeed different colours.
Further, it will be appreciated that markers/fiducial markers are not required in all embodiments, and the motion of the wearer/garment can be tracked using other known techniques. For example, obtained live view image data can be processed to track the motion of the wearer/garment using known marker-less based tracking techniques.
Referring to Figure 9, there is shown a schematic diagram of an example electronic device 900. The electronic device 900 comprises a memory 901, processor 903, display unit 905, receiver 907 and camera 909.
Referring to Figure 10, there is shown an example user interface according to aspects of the present invention. The user interface displays a live view image 1000 that is captured by a camera communicatively coupled to the electronic device. The live view image 1000 is a live video feed of the wearer wearing the garment 11. The garment 11 has a fiducial marker 107 located on an outside surface of the garment. The live view image 1000 is processed to determine the location of the fiducial marker 107. The display of the live view image 1000 is augmented with the display of an augmented reality object 1001. The position of the augmented reality object 1001 on the display is determined based on the determined location of the fiducial marker 107. That is, the augmented reality object 1001 is always displayed at a predetermined displacement from the fiducial marker 107. The effect of this is that the augmented reality object 1001 always appears on the display to overlay the cardiac region of the wearer of the garment 11. The augmented reality object 1001 provides a representation of the cardiac activity data which is recorded by one or more sensors (not visible) of the garment. The augmented reality object 1001 comprises a 3D model of the heart 1003 that is animated to beat at a rate corresponding to the heart rate of the wearer as recorded by the sensor(s) of the garment 11. The 3D model of the heart 1003 changes colour based on the heart rate of the wearer. The 3D model of the heart 1003 is green when the heart rate is at a low value (e.g. less than 100 beats per minute), yellow when the heart rate is at a medium value (e.g. between 100 and 145 beats per minute) and red when the heart rate is at a high value (e.g. greater than 145 beats per minute). Of course, other colours may be used. The 3D model of the heart may additionally or separately change size, shape or texture depending on the heart rate.
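The example colour thresholds given above can be expressed as a small mapping function; the function name and the exact handling of the boundary values are illustrative assumptions:

```python
def heart_model_colour(bpm: float) -> str:
    """Map the wearer's heart rate to the display colour of the 3D heart
    model, using the example thresholds from the description: green below
    100 bpm, yellow from 100 to 145 bpm, red above 145 bpm."""
    if bpm < 100:
        return "green"
    elif bpm <= 145:
        return "yellow"
    else:
        return "red"
```

The same lookup could equally drive the size, shape or texture changes mentioned above, with the thresholds tuned per wearer.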
The augmented reality object 1001 comprises a numerical display of the heart rate 1005. The augmented reality object 1001 comprises a display of ECG data 1007 for the wearer. The display of the heart rate 1005 and the ECG data 1007 may also change colour, size, shape or texture depending on the heart rate. Conveniently, the present invention conveys cardiac information to the observer in a way that is easy and intuitive to understand as the augmented reality object 1001 is always positioned to overlay the cardiac region of the wearer.
In summary, there is provided a system. The system comprises a garment and an electronic device. The garment comprises a sensor arranged to monitor the activity of a wearer of the garment. The garment comprises a communicator arranged to receive the activity data from the sensor and transmit the activity data. The electronic device comprises a receiver, a processor and a display unit. The receiver is arranged to receive the activity data. The processor is arranged to track the motion of the wearer of the garment and generate a representation of the received activity data. The display unit is arranged to simultaneously display a representation of the wearer and the representation of the received activity data. The representation of the activity data is displayed at a position determined based on the tracked motion of the wearer.
Examples
Example 1: A system comprising: a garment comprising a sensor arranged to monitor the activity of a wearer of the garment; and a communicator arranged to receive the activity data from the sensor and transmit the activity data; an electronic device comprising: a receiver arranged to receive the activity data; a processor arranged to track the motion of the wearer of the garment and generate a representation of the received activity data; and a display unit arranged to simultaneously display a representation of the wearer and the representation of the received activity data, wherein the representation of the activity data is displayed at a position determined based on the tracked motion of the wearer.
Example 2: The system of Example 1, wherein the position of the representation of the activity data corresponds to the position of a feature of interest on a wearer of the garment.
Example 3: The system of Example 1 or 2, wherein the representation of the activity data at least partially overlays the displayed representation of the wearer.
Example 4: The system of any preceding Example, wherein displaying the representation of the wearer of the garment comprises displaying a live view image of the wearer.
Example 5: The system of any of Examples 1 to 3, wherein displaying the representation of the wearer of the garment comprises displaying an avatar representation of the wearer, optionally wherein the avatar representation of the wearer is a 3D representation of the wearer.
Example 6: The system of Example 5, wherein the avatar representation of the wearer is generated based on the tracked motion of the wearer.
Example 7: The system of any preceding Example, wherein the representation of the activity data is in the form of an Augmented Reality object.
Example 8: The system of any preceding Example, wherein tracking the motion of the wearer comprises: receiving a live view image of the wearer; and processing the live view image to track the motion of the wearer.
Example 9: The system of any preceding Example, wherein the garment further comprises a fiducial marker located on an outside surface of the garment.
Example 10: The system of Example 9, wherein the processor is arranged to track the motion of the wearer of the garment by obtaining a live view image of the garment and by processing the obtained live view image to determine the location of the fiducial marker.
Example 11: The system of Example 10, wherein the position of the representation of the activity data on the display is determined according to the determined location of the fiducial marker.
Example 12: The system of Example 11, wherein the position of the representation of the activity data is determined by applying a predetermined displacement to the determined location of the fiducial marker.
Example 13: The system of any of Examples 9 to 12, wherein the fiducial marker further comprises a code string identifying the garment encoded into a visual symbol.
Example 14: The system of any preceding Example, wherein the representation of the activity data is animated based on the activity data.
Example 15: The system of any preceding Example, wherein the representation of the activity data represents a physiological state of the wearer, optionally wherein the physiological state of the wearer relates to a muscle or muscle group of the wearer, an organ of the wearer, or a condition of the wearer.
Example 16: The system of any preceding Example, wherein the activity data comprises activity data related to a muscle or muscle group of the wearer of the garment, wherein the position of the representation of the activity data is determined to correspond to an estimated location of the muscle or muscle group of the wearer as determined from the tracked motion of the wearer, and wherein the representation of the activity data is displayed such that it overlays the representation of the wearer at the position corresponding to the estimated location of the muscle or muscle group of the wearer.
Example 17: The system of any preceding Example, wherein the representation of the activity data represents a physiological state of the muscle or muscle group.
Example 18: The system of any preceding Example, wherein the activity data comprises cardiac activity data, wherein the position of the representation of the activity data is determined to correspond to an estimated location of the heart of a wearer of the garment as determined from the tracked motion of the wearer, and wherein the representation of the activity data is displayed such that it overlays the representation of the wearer at the position corresponding to the estimated location of the heart of the wearer.
Example 19: The system of Example 18, wherein the representation of the activity data represents a physiological state of the heart such as the heart rate.
Example 20: The system of Example 19, wherein the representation of the activity data is in the form of a 2D or 3D model of the heart, and optionally wherein the model of the heart is animated to beat at a rate corresponding to the heart rate of the wearer as determined from the activity data.
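Example 20's animation of the heart model can be sketched as a periodic scale factor driven by the measured heart rate. The function below is a hedged illustration only: the pulse amplitude (10% of rest size) and the cosine pulse shape are assumptions, not details from the patent.

```python
import math

def beat_scale(heart_rate_bpm, t_seconds):
    """Scale factor for a heart model animated to beat at the wearer's
    heart rate: one pulse cycle per beat, oscillating between 1.0 and
    1.1 of the model's rest size (amplitude chosen for illustration)."""
    beats_per_second = heart_rate_bpm / 60.0
    phase = 2 * math.pi * beats_per_second * t_seconds
    return 1.0 + 0.05 * (1 - math.cos(phase))  # ranges over 1.0 .. 1.1

# At 60 bpm the model completes one pulse per second, peaking at t = 0.5 s.
print(round(beat_scale(60, 0.5), 3))  # 1.1
```

A renderer would evaluate this each frame and scale the 2D or 3D heart model accordingly, so the on-screen pulse rate stays locked to the rate derived from the activity data.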
Example 21: The system of any preceding Example, wherein the representation of the activity data is in the form of a 2D or 3D object representing a feature of interest of the wearer.
Example 22: An electronic device comprising: a receiver arranged to receive activity data relating to a garment; a processor arranged to track the motion of the wearer of the garment and generate a representation of the received activity data; and a display unit arranged to simultaneously display a representation of the wearer and the representation of the received activity data, wherein the representation of the activity data is displayed at a position determined based on the tracked motion of the wearer.
Example 23: The electronic device of Example 22, further comprising a camera arranged to capture a live view image of the wearer, and wherein the displayed representation of the wearer is the captured live view image of the wearer.
Example 24: A method comprising: receiving activity data relating to a garment; tracking the motion of the wearer of the garment; generating a representation of the received activity data; and simultaneously displaying a representation of the wearer and the representation of the received activity data, wherein the representation of the activity data is displayed at a position determined based on the tracked motion of the wearer.
Example 25: A computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of Example 24.
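The method of Example 24 (receive activity data, track the wearer, generate a representation, display it at a position derived from the tracked motion) can be condensed into a single frame-update step. All names below are illustrative placeholders, and the data shapes are assumptions made for the sketch, not structures defined by the patent.

```python
# Hedged sketch of one display update of the method of Example 24.

def update_frame(activity_data, wearer_position, displacement=(0, 40)):
    """Generate a representation of the received activity data and place it
    relative to the tracked wearer position (here, a fixed offset is used
    as a stand-in for the estimated feature-of-interest location)."""
    representation = {"type": "heart_model",
                      "rate_bpm": activity_data["heart_rate"]}
    position = (wearer_position[0] + displacement[0],
                wearer_position[1] + displacement[1])
    return {"representation": representation, "position": position}

frame = update_frame({"heart_rate": 72}, (100, 200))
print(frame["position"])  # (100, 240)
```

A real implementation would run this per video frame, feeding in the latest tracked wearer position so the representation follows the wearer's motion.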
At least some of the example embodiments described herein may be constructed, partially or wholly, using dedicated special-purpose hardware. Terms such as 'component', 'module' or 'unit' used herein may include, but are not limited to, a hardware device, such as circuitry in the form of discrete or integrated components, a Field Programmable Gate Array (FPGA) or Application Specific Integrated Circuit (ASIC), which performs certain tasks or provides the associated functionality.

In some embodiments, the described elements may be configured to reside on a tangible, persistent, addressable storage medium and may be configured to execute on one or more processors. These functional elements may in some embodiments include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.

Although the example embodiments have been described with reference to the components, modules and units discussed herein, such functional elements may be combined into fewer elements or separated into additional elements. Various combinations of optional features have been described herein, and it will be appreciated that described features may be combined in any suitable combination. In particular, the features of any one example embodiment may be combined with features of any other embodiment, as appropriate, except where such combinations are mutually exclusive.

Throughout this specification, the term "comprising" or "comprises" means including the component(s) specified but not to the exclusion of the presence of others.
Attention is directed to all papers and documents which are filed concurrently with or previous to this specification in connection with this application and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference.
All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.
Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
The invention is not restricted to the details of the foregoing embodiment(s). The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.

Claims (1)

1. A system comprising: a garment comprising a sensor arranged to monitor the activity of a wearer of the garment; and a communicator arranged to receive the activity data from the sensor and transmit the activity data; an electronic device arranged to obtain a user credential from a user, the electronic device comprising: a receiver arranged to receive the activity data if the user is authorised, based on the obtained user credential, as having permission to access the activity data; a processor arranged to track the motion of the wearer of the garment and generate a representation of the received activity data; and a display unit arranged to simultaneously display a representation of the wearer and the representation of the received activity data, wherein the representation of the activity data is displayed at a position determined based on the tracked motion of the wearer.

2. A system as claimed in claim 1, wherein the electronic device is arranged to transmit the user credential to a server so that the server is able to determine if the user is authorised as having permission to access the activity data based on the obtained user credential, and wherein the receiver is arranged to receive the activity data from the server if the user is authorised to access the activity data.

3. A system as claimed in claim 1 or 2, wherein the position of the representation of the activity data corresponds to the position of a feature of interest on a wearer of the garment.

4. A system as claimed in any of claims 1 to 3, wherein the representation of the activity data at least partially overlays the displayed representation of the wearer.

5. A system as claimed in any preceding claim, wherein displaying the representation of the wearer of the garment comprises displaying a live view image of the wearer.

6. A system as claimed in any of claims 1 to 4, wherein displaying the representation of the wearer of the garment comprises displaying an avatar representation of the wearer, optionally wherein the avatar representation of the wearer is a 3D representation of the wearer.

7. A system as claimed in claim 6, wherein the avatar representation of the wearer is generated based on the tracked motion of the wearer.

8. A system as claimed in any preceding claim, wherein the representation of the activity data is in the form of an Augmented Reality object.

9. A system as claimed in any preceding claim, wherein tracking the motion of the wearer comprises: receiving a live view image of the wearer; and processing the live view image to track the motion of the wearer.

10. A system as claimed in any preceding claim, wherein the garment further comprises a fiducial marker located on an outside surface of the garment.

11. A system as claimed in claim 10, wherein the processor is arranged to track the motion of the wearer of the garment by obtaining a live view image of the garment and by processing the obtained live view image to determine the location of the fiducial marker.

12. A system as claimed in claim 11, wherein the position of the representation of the activity data on the display is determined according to the determined location of the fiducial marker.

13. A system as claimed in claim 12, wherein the position of the representation of the activity data is determined by applying a predetermined displacement to the determined location of the fiducial marker.

14. A system as claimed in any of claims 10 to 13, wherein the fiducial marker further comprises a code string identifying the garment encoded into a visual symbol.

15. A system as claimed in any preceding claim, wherein the representation of the activity data is animated based on the activity data.

16. A system as claimed in any preceding claim, wherein the representation of the activity data represents a physiological state of the wearer, optionally wherein the physiological state of the wearer relates to a muscle or muscle group of the wearer, an organ of the wearer, or a condition of the wearer.

17. A system as claimed in any preceding claim, wherein the activity data comprises activity data related to a muscle or muscle group of the wearer of the garment, wherein the position of the representation of the activity data is determined to correspond to an estimated location of the muscle or muscle group of the wearer as determined from the tracked motion of the wearer, and wherein the representation of the activity data is displayed such that it overlays the representation of the wearer at the position corresponding to the estimated location of the muscle or muscle group of the wearer.

18. A system as claimed in any preceding claim, wherein the representation of the activity data represents a physiological state of the muscle or muscle group.

19. A system as claimed in any preceding claim, wherein the activity data comprises cardiac activity data, wherein the position of the representation of the activity data is determined to correspond to an estimated location of the heart of a wearer of the garment as determined from the tracked motion of the wearer, and wherein the representation of the activity data is displayed such that it overlays the representation of the wearer at the position corresponding to the estimated location of the heart of the wearer.

20. A system as claimed in claim 19, wherein the representation of the activity data represents a physiological state of the heart, such as the heart rate.

21. A system as claimed in claim 20, wherein the representation of the activity data is in the form of a 2D or 3D model of the heart, and optionally wherein the model of the heart is animated to beat at a rate corresponding to the heart rate of the wearer as determined from the activity data.

22. A system as claimed in any preceding claim, wherein the representation of the activity data is in the form of a 2D or 3D object representing a feature of interest of the wearer.

23. An electronic device arranged to obtain a user credential from a user, the electronic device comprising: a receiver arranged to receive activity data relating to a garment if the user is authorised, based on the obtained user credential, as having permission to access the activity data; a processor arranged to track the motion of the wearer of the garment and generate a representation of the received activity data; and a display unit arranged to simultaneously display a representation of the wearer and the representation of the received activity data, wherein the representation of the activity data is displayed at a position determined based on the tracked motion of the wearer.

24. A method comprising: obtaining a user credential from a user; receiving activity data relating to a garment if the user is authorised, based on the obtained user credential, as having permission to access the activity data; tracking the motion of the wearer of the garment; generating a representation of the received activity data; and simultaneously displaying a representation of the wearer and the representation of the received activity data, wherein the representation of the activity data is displayed at a position determined based on the tracked motion of the wearer.

25. A computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of claim 24.
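The credential check recited in claims 1, 2 and 24 (the activity data is released only if the user is authorised, based on the obtained user credential) can be sketched as a simple server-side gate. This is a hedged illustration: the permission table, the SHA-256 digest scheme and all names are assumptions for the sketch, not mechanisms specified by the patent.

```python
import hashlib

# Hypothetical permission table: garment id -> set of authorised credential digests.
PERMISSIONS = {"garment-001": {hashlib.sha256(b"alice-token").hexdigest()}}

def fetch_activity_data(garment_id, user_credential, data_store):
    """Release activity data only if the credential is authorised for the garment."""
    digest = hashlib.sha256(user_credential).hexdigest()
    if digest in PERMISSIONS.get(garment_id, set()):
        return data_store[garment_id]
    return None  # not authorised: the activity data is not transmitted

store = {"garment-001": {"heart_rate": 72}}
print(fetch_activity_data("garment-001", b"alice-token", store))    # {'heart_rate': 72}
print(fetch_activity_data("garment-001", b"mallory-token", store))  # None
```

As in claim 2, the device would transmit the credential to the server, and only an authorised user's receiver would be sent the activity data.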
GB2109997.3A 2019-06-07 2019-06-07 System, device and method Active GB2593847B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2109997.3A GB2593847B (en) 2019-06-07 2019-06-07 System, device and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB2109997.3A GB2593847B (en) 2019-06-07 2019-06-07 System, device and method
GB1908179.3A GB2585360B (en) 2019-06-07 2019-06-07 System, device and method

Publications (3)

Publication Number Publication Date
GB202109997D0 GB202109997D0 (en) 2021-08-25
GB2593847A true GB2593847A (en) 2021-10-06
GB2593847B GB2593847B (en) 2022-04-20

Family

ID=77354030

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2109997.3A Active GB2593847B (en) 2019-06-07 2019-06-07 System, device and method

Country Status (1)

Country Link
GB (1) GB2593847B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115349738B (en) * 2022-08-31 2023-07-21 慕思健康睡眠股份有限公司 Human body monitoring data acquisition method and system and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100185398A1 (en) * 2009-01-22 2010-07-22 Under Armour, Inc. System and Method for Monitoring Athletic Performance

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Enflux Exercise Clothing: Real-time 3D Analysis", kickstarter.com, [online] available from https://www.kickstarter.com/projects/1850884998/enflux-smart-clothing-3d-workout-tracking-and-form *

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20211007 AND 20211013