WO2021038550A1 - System, method and computer readable medium for entity parameter calculation - Google Patents

System, method and computer readable medium for entity parameter calculation

Info

Publication number
WO2021038550A1
WO2021038550A1 (PCT/IL2020/050124)
Authority
WO
WIPO (PCT)
Prior art keywords
classifier
circumference
conversions
records
garment
Prior art date
Application number
PCT/IL2020/050124
Other languages
French (fr)
Inventor
Omer GORALNIK
Ariel Shapiro
Original Assignee
Myselffit Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Myselffit Ltd. filed Critical Myselffit Ltd.
Publication of WO2021038550A1 publication Critical patent/WO2021038550A1/en

Classifications

    • G06Q 50/10: Services (systems or methods specially adapted for specific business sectors)
    • A61B 5/107: Measuring physical dimensions, e.g. size of the entire body or parts thereof, for diagnostic purposes
    • A61B 5/6804: Garments; clothes (sensor mounted on worn items)
    • G01B 11/022: Measuring length, width or thickness by optical techniques, by means of tv-camera scanning
    • G01B 5/025: Measuring of circumference; measuring length of ring-shaped articles
    • G06T 7/62: Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06V 10/42: Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
    • G06V 40/103: Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • G01B 11/165: Measuring deformation in a solid by optical techniques, e.g. optical strain gauge, by means of a grating deformed by the object
    • G06N 20/00: Machine learning
    • G06V 2201/03: Recognition of patterns in medical or anatomical images

Definitions

  • The invention relates to a system, method and computer readable medium for entity parameter calculation.
  • Physical fitness refers to the extent to which the human body can perform physical tasks. Knowing the specifics can help determine fitness level and, consequently, set realistic fitness goals, monitor progress and maintain motivation.
  • Fat weight is the fat stored in fat cells throughout the body.
  • Lean weight includes all other tissues, such as organs, bones, blood, skin, and muscle. Over the years, muscle is lost in favor of fat gain. This muscle loss adversely affects a person's physical function and personal appearance.
  • The deterioration in body composition is a major health risk associated with many medical problems, including obesity, lower back pain, type II diabetes, various forms of cancer, high blood pressure, and heart disease. In fact, obesity contributes to at least half the chronic diseases in western society.
  • Body fat measurement is important for determining fitness level as weight alone is not a clear indicator of good health because it does not distinguish between pounds that come from body fat and those that come from lean body mass or muscle.
  • Knowing one's body fat percentage is crucial to maintaining a healthy lifestyle.
  • There are many techniques for body fat percentage measurement (some of which are based on determination of the physical circumference of the measured entity), such as underwater weighing, whole-body air displacement plethysmography, near-infrared interactance, dual-energy X-ray absorptiometry, body average density measurement, bioelectrical impedance analysis and anthropometric methods.
  • a form-fitting garment comprising: a reference marker, having a reference marker size that is known and substantially constant; and at least one stretchable feature, having known feature horizontal widths along at least part of a length of the garment when unworn, the stretchable feature designed to stretch upon the garment being worn, giving rise to a stretched feature having stretched horizontal widths along the at least part of the length of the garment; thereby enabling calculation of at least one of:
  • the at least one stretchable feature is in a front portion of the garment when being worn.
  • the garment is one of: a shirt, a vest, a dress, a strap, a belt, a patch or a halterneck.
  • the reference marker having a distinctive color compared to the garment.
  • a first color of the reference marker in a visible spectrum is identical to a second color of the garment.
  • the first color of the reference marker in a non-visible spectrum is different than the second color of the garment upon the garment being worn.
  • the reference marker is made of one of the following materials: linen, hemp, cotton, ramie, wool, silk, bamboo, soya, tencel, viscose, leather, suede, metal, polyester, polyolefin or polymer material.
  • the difference between the given stretched horizontal width and the respective known feature horizontal width of the feature horizontal widths is determined by capturing an image of the garment, upon the garment being worn, the image including the reference marker and the stretchable feature, and analyzing the image.
  • a method for an entity parameter calculation comprising: receiving, by a processing circuitry, a digital image of a form-fitting garment being worn, the form-fitting garment having (a) a reference marker, having a reference marker size that is known and substantially constant, and (b) at least one stretchable feature, having known feature horizontal widths along at least part of a length of the garment when unworn, the stretchable feature designed to stretch upon the garment being worn, giving rise to a stretched feature having stretched horizontal widths along the at least part of the length of the garment; identifying, by the processing circuitry, within the digital image, the reference marker and the at least one stretchable feature; determining, by the processing circuitry, a conversion ratio utilizing pixels representing the reference marker in the digital image and the reference marker size; determining, by the processing circuitry, one or more conversions of one or more respective given stretched horizontal widths of the at least one stretchable feature identified in the digital image, converted using the conversion ratio; and calculating, by the processing circuitry, an entity parameter related to an entity wearing the garment, utilizing the conversions.
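The conversion-ratio step above can be sketched in a few lines. This is an illustrative sketch only; the function names, the 5 cm marker size and all pixel values are assumptions made for the example, not values taken from the disclosure:

```python
# Sketch of the pixel-to-physical conversion described in the method claim.
# All names and numbers here are illustrative assumptions.

def conversion_ratio(marker_width_px: float, marker_width_cm: float) -> float:
    """Physical length represented by one pixel, derived from the reference
    marker, whose real-world size is known and substantially constant."""
    return marker_width_cm / marker_width_px

def convert_widths(stretched_widths_px, ratio):
    """Convert measured stretched horizontal widths from pixels to cm."""
    return [round(w * ratio, 4) for w in stretched_widths_px]

# Example: a 5 cm reference marker spans 100 px, so 1 px = 0.05 cm.
ratio = conversion_ratio(marker_width_px=100.0, marker_width_cm=5.0)
print(convert_widths([820.0, 860.0, 900.0], ratio))  # [41.0, 43.0, 45.0]
```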
  • the entity parameter is a circumference of the entity.
  • the circumference is calculated by inputting the conversions and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and determine the respective circumference.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified circumference measurement.
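The disclosure does not fix a particular model, so as a stand-in the "classifier" below is a plain least-squares regressor over records of (input parameters, conversion) against verified circumference measurements. The feature layout (age, height, weight, converted width) and every number are invented for illustration:

```python
import numpy as np

# Illustrative stand-in for the trained "classifier": an ordinary least-squares
# regressor mapping (age, height_cm, weight_kg, converted_width_cm) to a
# verified circumference. Feature layout and all values are assumptions.

X_train = np.array([
    [30, 175, 70, 41.0],
    [45, 168, 82, 45.5],
    [25, 180, 75, 42.0],
    [52, 160, 90, 48.0],
], dtype=float)
y_train = np.array([84.0, 96.0, 86.0, 104.0])  # verified circumferences (cm)

# Fit weights plus a bias term via least squares.
A = np.hstack([X_train, np.ones((len(X_train), 1))])
w, *_ = np.linalg.lstsq(A, y_train, rcond=None)

def predict_circumference(record):
    """Predict a circumference (cm) from one record of input parameters + conversion."""
    return float(np.append(np.asarray(record, dtype=float), 1.0) @ w)
```

In practice any regressor (random forest, gradient boosting, a small neural network) could fill this role; the essential point is the record structure of the training set described above.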
  • the circumference is calculated by inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and determine the respective circumference.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified circumference measurement.
  • the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective circumference.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified circumference measurement.
  • the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective circumference.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified circumference measurement.
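The "difference" inputs described above are simply the gaps between each converted (worn, stretched) width and the corresponding known unworn width. A minimal sketch, with all values invented for illustration:

```python
# Difference features: converted stretched width minus the known unworn width
# of the same stretchable feature. All values here are illustrative.

known_unworn_widths_cm = [40.0, 42.0, 44.0]  # printed feature widths, garment unworn
converted_widths_cm = [41.0, 43.5, 45.0]     # stretched widths after pixel-to-cm conversion

differences_cm = [worn - unworn
                  for worn, unworn in zip(converted_widths_cm, known_unworn_widths_cm)]
print(differences_cm)  # [1.0, 1.5, 1.0]
```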
  • The method further comprises determining a body fat percentage of the entity using a formula, wherein the formula is configured to receive the circumference and the one or more input parameters and output the body fat percentage.
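The disclosure leaves the formula unspecified. One widely published circumference-based formula that fits this description is the U.S. Navy body-fat estimate for men, shown here purely as an example of a formula taking circumferences plus an input parameter (height); units are inches:

```python
import math

# U.S. Navy circumference-based body fat estimate for men (inputs in inches).
# Shown only as an example of a circumference + input-parameter formula;
# the patent does not state which formula it uses.

def navy_body_fat_male(waist_in: float, neck_in: float, height_in: float) -> float:
    return (86.010 * math.log10(waist_in - neck_in)
            - 70.041 * math.log10(height_in)
            + 36.76)

print(round(navy_body_fat_male(waist_in=36.0, neck_in=15.0, height_in=70.0), 1))  # 21.3
```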
  • the entity parameter is a body fat percentage of the entity.
  • the body fat percentage is calculated by: inputting the conversions, and one or more input parameters into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and provide the body fat percentage.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified body fat percentage measurement.
  • the body fat percentage is calculated by: inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and provide the body fat percentage.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified body fat percentage measurement.
  • the body fat percentage is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective body fat percentage.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified body fat percentage measurement.
  • the body fat percentage is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective body fat percentage.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified body fat percentage measurement.
  • the body fat percentage is calculated by: determining a circumference of the entity; and inputting the circumference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the circumference and the input parameters and determine the respective body fat percentage.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training circumference and a respective verified body fat percentage measurement.
  • the body fat percentage is calculated by: determining a circumference of the entity; and inputting the circumference into a classifier, wherein the classifier is configured to obtain the circumference and determine the respective body fat percentage.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training circumference and a respective verified body fat percentage measurement.
  • the circumference is calculated by inputting the conversions and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and determine the respective circumference.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified circumference measurement.
  • the circumference is calculated by inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and determine the respective circumference.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified circumference measurement.
  • the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective circumference.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified circumference measurement.
  • the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective circumference.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified circumference measurement.
  • the input parameters include at least one of: age, height, weight, garment size of the garment worn by the entity, or gender.
  • the training input parameters include at least one of: age, height, weight, garment size of the garment worn by the entity, or gender.
  • a system for an entity parameter calculation comprising at least one processing circuitry configured to perform the following: receive a digital image of a form-fitting garment being worn, the form-fitting garment having (a) a reference marker, having a reference marker size that is known and substantially constant, and (b) at least one stretchable feature, having known feature horizontal widths along at least part of a length of the garment when unworn, the stretchable feature designed to stretch upon the garment being worn, giving rise to a stretched feature having stretched horizontal widths along the at least part of the length of the garment; identify within the digital image, the reference marker and the at least one stretchable feature; determine a conversion ratio utilizing pixels representing the reference marker in the digital image and the reference marker size; determine one or more conversions of one or more respective given stretched horizontal widths of the at least one stretchable feature identified in the digital image, converted using the conversion ratio; and calculate an entity parameter related to an entity wearing the garment, utilizing the conversions.
  • the entity parameter is a circumference of the entity.
  • the circumference is calculated by inputting the conversions and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and determine the respective circumference.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified circumference measurement.
  • the circumference is calculated by inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and determine the respective circumference.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified circumference measurement.
  • the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective circumference.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified circumference measurement.
  • the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective circumference.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified circumference measurement.
  • In the system, the processing circuitry is further configured to determine a body fat percentage of the entity using a formula, wherein the formula is configured to receive the circumference and the one or more input parameters and output the body fat percentage.
  • the entity parameter is a body fat percentage of the entity.
  • the body fat percentage is calculated by: inputting the conversions, and one or more input parameters into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and provide the body fat percentage.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified body fat percentage measurement.
  • the body fat percentage is calculated by: inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and provide the body fat percentage.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified body fat percentage measurement.
  • the body fat percentage is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective body fat percentage.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified body fat percentage measurement.
  • the body fat percentage is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective body fat percentage.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified body fat percentage measurement.
  • the body fat percentage is calculated by: determining a circumference of the entity; and inputting the circumference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the circumference and the input parameters and determine the respective body fat percentage.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training circumference and a respective verified body fat percentage measurement.
  • the body fat percentage is calculated by: determining a circumference of the entity; and inputting the circumference into a classifier, wherein the classifier is configured to obtain the circumference and determine the respective body fat percentage.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training circumference and a respective verified body fat percentage measurement.
  • the circumference is calculated by inputting the conversions and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and determine the respective circumference.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified circumference measurement.
  • the circumference is calculated by inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and determine the respective circumference.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified circumference measurement.
  • the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective circumference.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified circumference measurement.
  • the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective circumference.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified circumference measurement.
  • the input parameters include at least one of: age, height, weight, garment size of the garment worn by the entity, or gender.
  • the training input parameters include at least one of: age, height, weight, garment size of the garment worn by the entity, or gender.
  • a non-transitory computer readable storage medium having computer readable program code embodied therewith, the computer readable program code executable by a processing circuitry to perform a method for circumference calculation, the method comprising: receiving, by a processing circuitry, a digital image of a form-fitting garment being worn, the form-fitting garment having (a) a reference marker, having a reference marker size that is known and substantially constant, and (b) at least one stretchable feature, having known feature horizontal widths along at least part of a length of the garment when unworn, the stretchable feature designed to stretch upon the garment being worn, giving rise to a stretched feature having stretched horizontal widths along the at least part of the length of the garment; identifying, by the processing circuitry, within the digital image, the reference marker and the at least one stretchable feature; determining, by the processing circuitry, a conversion ratio utilizing pixels representing the reference marker in the digital image and the reference marker size; determining, by the processing circuitry, one or more conversions of one or more respective given stretched horizontal widths of the at least one stretchable feature identified in the digital image, converted using the conversion ratio; and calculating, by the processing circuitry, an entity parameter related to an entity wearing the garment, utilizing the conversions.
  • Figs. 1A and 1B are schematic illustrations of a T-shirt according to one example of the presently disclosed subject matter;
  • Fig. 2 is a block diagram schematically illustrating one example of an environment for performing an entity parameter calculation, according to one example of the presently disclosed subject matter
  • Fig. 3 is a block diagram schematically illustrating one example of a system for performing an entity parameter calculation, in accordance with the presently disclosed subject matter
  • Fig. 4 is a flowchart illustrating one example of a sequence of operations carried out for performing an entity parameter calculation, in accordance with the presently disclosed subject matter.
  • The term “processing circuitry” should be expansively construed to cover any kind of electronic device with data processing capabilities, including, by way of non-limiting example, a personal desktop/laptop computer, a server, a computing system, a communication device, a smartphone, a tablet computer, a smart television, a processor (e.g. a digital signal processor (DSP), a microcontroller, a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), etc.), a group of multiple physical machines sharing performance of various tasks, virtual servers co-residing on a single physical machine, any other electronic computing device, and/or any combination thereof.
  • The term “non-transitory” is used herein to exclude transitory, propagating signals, but to otherwise include any volatile or non-volatile computer memory technology suitable to the application.
  • The phrases “for example”, “such as”, “for instance” and variants thereof describe non-limiting embodiments of the presently disclosed subject matter.
  • Reference in the specification to “one case”, “some cases”, “other cases” or variants thereof means that a particular feature, structure or characteristic described in connection with the embodiment(s) is included in at least one embodiment of the presently disclosed subject matter.
  • The appearance of the phrases “one case”, “some cases”, “other cases” or variants thereof does not necessarily refer to the same embodiment(s).
  • Figs. 2 and 3 illustrate a general schematic of the system architecture in accordance with an embodiment of the presently disclosed subject matter.
  • Each module in Figs. 2 and 3 can be made up of any combination of software, hardware and/or firmware that performs the functions as defined and explained herein.
  • the modules in Figs. 2 and 3 may be centralized in one location or dispersed over more than one location.
  • the system may comprise fewer, more, and/or different modules than those shown in Figs. 2 and 3.
  • Any reference in the specification to a method should be applied mutatis mutandis to a system capable of executing the method and should be applied mutatis mutandis to a non-transitory computer readable medium that stores instructions that once executed by a computer result in the execution of the method. Any reference in the specification to a system should be applied mutatis mutandis to a method that may be executed by the system and should be applied mutatis mutandis to a non-transitory computer readable medium that stores instructions that may be executed by the system.
  • Any reference in the specification to a non-transitory computer readable medium should be applied mutatis mutandis to a system capable of executing the instructions stored in the non-transitory computer readable medium and should be applied mutatis mutandis to a method that may be executed by a computer that reads the instructions stored in the non-transitory computer readable medium.
  • Figs. 1A and 1B show an exemplary form-fitting garment 10, in accordance with the presently disclosed subject matter.
  • the form-fitting garment 10 includes a reference marker 12 and stretchable features 14(1), 14(2), 14(3), 14(4) and 14(5).
  • the form-fitting garment 10 can be made from elastic, washable and durable fabric, such as but not limited to linen, hemp, cotton, ramie, wool, silk, bamboo, soya, tencel, viscose, leather, suede, metal or any suitable polymer material.
  • an exemplary fabric is a polyester and polyolefin material having moisture-wicking and stretch characteristics.
  • the form-fitting garment 10 may be a slim or skin-tight T-shirt, vest, dress, strap, belt, patch, halterneck or the like of appropriate (fitted) size.
  • the front portion of the form-fitting garment 10 can be stretchable while the form-fitting garment's 10 back portion is not necessarily stretchable.
  • the reference marker 12 may have a reference marker size that is known and constant.
  • the reference marker 12 can be made from an unstretchable material so that wearing the form-fitting garment 10 will not cause stretching thereof. Accordingly, the reference marker's 12 size and shape do not change, irrespective of whether the form-fitting garment 10 is worn or not.
  • the reference marker 12 may have reference marker size that is known and substantially constant. That is, wearing the form-fitting garment 10 may cause stretching thereof by an insignificant extent, as further detailed hereinbelow.
  • the reference marker 12 may stretch (e.g. over time, upon wearing of the form-fitting garment 10, etc.) by up to five millimeters, or less.
  • the reference marker 12 may be located in any location on the form-fitting garment 10. It can be integrated in the form-fitting garment 10, forming one surface of fabric, or located on the exterior of the front portion or the exterior of the back portion thereof, or any other configuration as long as it is visible to a camera taking a picture of an entity wearing the form-fitting garment 10. For example, the reference marker 12 can be located on the form-fitting garment 10 using an Applique technique.
  • the reference marker 12 may have a distinctive color compared to the form-fitting garment 10.
  • the distinctiveness can be in the visible spectrum, and in such cases, the reference marker 12 may have a first color that is different than a second color of the form-fitting garment 10, so that it can be distinguished from the form- fitting garment 10.
  • the distinctiveness can be in the non-visible spectrum
  • the reference marker 12 may have a first color different than a second color of the form-fitting garment 10, in an invisible spectrum (while their color can be identical or not in the visible spectrum).
  • Such structure may be used for IR imaging for example.
  • the form-fitting garment 10 may have a distinctive color compared to background surroundings.
  • the stretchable features 14(1), 14(2), 14(3), 14(4) and 14(5) may have known feature horizontal widths along at least part of a length of the form-fitting garment 10 when unworn. Accordingly, the horizontal widths of the stretchable features at various points along the at least part of the length of the form-fitting garment 10 are known.
  • the stretchable features 14(1), 14(2), 14(3), 14(4) and 14(5) can be designed to stretch upon the form-fitting garment 10 being worn, giving rise to stretched features 14'(1), 14'(2), 14'(3), 14'(4) and 14'(5) having stretched horizontal widths along the at least part of a length of the form-fitting garment 10. Accordingly, the horizontal width of a given stretchable feature (e.g. 14'(2)) at a given point along the at least part of a length of the form-fitting garment 10 when being worn is larger than the horizontal width of the same given stretchable feature (e.g. 14(2)) at the same given point when the form-fitting garment 10 is unworn.
  • stretchable features 14(2) and 14(4) have known feature horizontal widths 15 at a given point on the form-fitting garment 10 when it is unworn, as depicted in Fig. 1A, and stretched horizontal widths 15' at the same given point on the form-fitting garment 10 when it is worn, as depicted in Fig. 1B. It is to be noted that the stretched horizontal widths 15' are larger than the known feature horizontal widths 15.
  • the stretchable features 14(1), 14(2), 14(3), 14(4) and 14(5) may be located in any location on the form-fitting garment 10.
  • the reference marker 12 and the stretchable features 14(1), 14(2), 14(3), 14(4) and 14(5) are of a rectangular/quadratic shape but other shapes may be used.
  • the reference marker 12 and the stretchable features 14(1), 14(2), 14(3), 14(4) and 14(5) may be whole or fractions of geometric shape(s), pattern(s), picture(s), design(s), dot(s), line(s), logo(s), symbol(s) and the like.
  • the reference marker 12 and the stretchable features 14(1), 14(2), 14(3), 14(4) and 14(5) enable calculation of at least one of:
  • the environment 200 includes a user equipment 204, a user 206 wearing the form-fitting garment 10, a communication network 202 and an entity parameter calculation system 208.
  • the user equipment 204 may be a portable handheld device such as a smartphone, a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a wristwatch, a desktop computer or any other device with data communication capabilities. It may include a camera unit for capturing image(s) of the form-fitting garment 10 when worn by an entity.
  • the user equipment 204 can communicate with communication network 202 via wired or wireless communication. It may also include one or more of: a video display unit (e.g. flat panel display, such as OLED, or liquid crystal display (LCD) or a cathode ray tube (CRT)), an alphanumeric input device (e.g. a keyboard), a cursor control device (e.g. a mouse), or a signal generation device (e.g. a speaker).
  • the network 202 may be a cellular network, a Personal Area Network (PAN), a Local Area Network (LAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a Virtual Private Network (VPN), an intranet, an extranet, the Internet, etc.
  • the user equipment 204 and the entity parameter calculation system 208 can communicate over the communication network 202 via wired or wireless communication.
  • the entity parameter calculation system 208 may be comprised within the user equipment 204.
  • Fig. 3 shows a block diagram schematically illustrating one example of an entity parameter calculation system 208 for performing an entity parameter calculation, in accordance with the presently disclosed subject matter.
  • the entity parameter calculation system 208 comprises at least one processing circuitry 302.
  • Processing circuitry 302 can be one or more processing units (e.g. central processing units), microprocessors, microcontrollers or any other computing devices or modules, including multiple and/or parallel and/or distributed processing units, which are adapted to independently or cooperatively process data for controlling relevant resources of the entity parameter calculation system 208 and for enabling operations related to resources of the entity parameter calculation system 208.
  • the processing circuitry 302 comprises an entity parameter calculation module 304 configured to perform an entity parameter calculation process, as further detailed herein with respect to Fig. 4.
  • the system for entity parameter calculation 208 comprises, or may be otherwise associated with, a data repository 306 (e.g. a database, a storage system, a memory including Read Only Memory - ROM, Random Access Memory - RAM, or any other type of memory, etc.) configured to store data, including inter alia user-related data relating to one or more users 206 wearing the form-fitting garment 10, and various data acquired from such users 206 (e.g.
  • digital image/s of the user 206 wearing the form-fitting garment 10, physical details of the user 206 (such as age, gender, garment size of the garment 10 worn by the user 206, medical condition, fitness level or any other parameter representing a physiological characteristic thereof), body metrics of the user 206 (such as height, weight, circumference, BMI, body composition, etc.), contact details of the user 206 (such as phone number, email, residence address, social media accounts, etc.), etc.).
  • data repository 306 can be further configured to enable retrieval and/or update and/or deletion of the stored data. It is to be noted that in some cases, data repository 306 can be distributed across multiple locations, whether within the entity parameter calculation system 208 and/or within the user equipment 204 and/or elsewhere. It is to be noted, that in some cases, the relevant information relating to the user 206 can be loaded into data repository 306 before taking a picture of user 206 wearing the form-fitting garment 10 (e.g. upon entering the system for entity parameter calculation 208 and/or periodically and/or upon the system for entity parameter calculation 208 requesting the information) or after.
  • Fig. 4 shows a flowchart illustrating one example of a sequence of operations carried out for performing an entity parameter calculation, in accordance with the presently disclosed subject matter.
  • the system for entity parameter calculation 208 can be configured to perform an entity parameter calculation process 400, e.g. utilizing the entity parameter calculation module 304.
  • the system for entity parameter calculation 208 can be configured to receive a digital image of a form-fitting garment 10 being worn (e.g. by a person whose entity parameter calculation is desired) (block 410).
  • the form-fitting garment 10 has (a) a reference marker 12, having a reference marker size that is known and substantially constant, and (b) at least one stretchable feature 14(1), 14(2), 14(3), 14(4) or 14(5), having known feature horizontal widths along at least part of a length of the garment when unworn.
  • each of the at least one stretchable features (e.g. stretchable features 14(1), 14(2), 14(3), 14(4) or 14(5)) is designed to stretch upon the garment 10 being worn, giving rise to respective stretched features (e.g. stretched features 14'(1), 14'(2), 14'(3), 14'(4) or 14'(5)) having stretched horizontal widths along the at least part of the length of the garment.
  • the digital image of the form-fitting garment 10 being worn can be acquired by a camera unit that may be comprised within the user equipment 204 or may be otherwise connected to it via a wired/wireless communication channel.
  • the camera unit may be operated by the user 206 wearing the form-fitting garment 10 (i.e. for taking a self-portrait) or by a third party (e.g. a camera unit with a timer feature, a selfie stick, a mirror reflection, another person taking the picture, etc.).
  • the system 208 can be further configured to identify, within the digital image, the reference marker 12 and at least one of the stretchable features (e.g. stretchable feature 14(1), 14(2), 14(3), 14(4) or 14(5)) (block 420).
  • the identification of the reference marker 12 and the at least one stretchable feature 14(1), 14(2), 14(3), 14(4) or 14(5) within the digital image can be done for example by an automated pattern recognition technology (e.g. by utilizing digital image processing techniques).
  • the user can be required to place the reference marker 12 within an ad hoc frame displayed on a user interface of the camera unit used for acquiring the digital image, before taking the picture of the form-fitting garment 10 being worn.
  • the system 208 is not required to identify the reference marker 12 within the digital image.
  • the operator of the camera unit is required to accurately place the reference marker 12 within an ad hoc frame displayed on a user interface of the camera unit used for acquiring the digital image, before taking the picture of the form-fitting garment 10 being worn. Accordingly, the identification of the reference marker by the system 208 is made redundant.
  • the camera unit can be configured to automatically acquire the digital image, of the form- fitting garment 10 being worn, once the reference marker 12 has been aligned with the frame displayed on the user's interface thereof.
  • the camera unit can be configured to assist the user to align the reference marker 12 with the frame displayed on the user interface of the camera unit by providing direction arrows for example.
  • the system 208 can be further configured to determine a conversion ratio utilizing pixels representing the reference marker 12 in the digital image and the reference marker size (block 430).
  • the reference marker size may be known and constant, that is, its size (e.g. its cross-section, area, horizontal latitude, etc.) may be known in centimeters for example, or in any other measurement system (i.e. the International System of Units (SI), the imperial system, the United States customary units, etc.).
  • the reference marker 12 can be designed to maintain its known original size (i.e. remain constant) irrespective of the form-fitting garment 10 being worn or not.
  • the reference marker 12 may be represented by MxN pixels in the digital image, wherein M indicates the number of pixels along a horizontal line of the reference marker's 12 contour and N indicates the number of pixels along a vertical line thereof.
  • the marker 12 may be represented in pixels merely by a one-dimensional line that may be oriented in any direction on the form-fitting garment 10.
  • the conversion ratio may be determined utilizing the reference marker size and its representation in pixels (e.g. M pixels are equal to Y centimeters so that conversion ratio can be calculated therefrom).
  • any desired object in the form-fitting garment 10 (e.g. stretchable features 14(1), 14(2), 14(3), 14(4) or 14(5), or stretched features 14'(1), 14'(2), 14'(3), 14'(4) or 14'(5)) may be converted from its pixel representation in the digital image to its corresponding metric representation, for example.
  • the reference marker 12 can be designed to substantially maintain its known original size (i.e. remain substantially constant) upon the form-fitting garment 10 being worn. In these cases, the reference marker 12 may stretch by an insignificant extent that will not affect the determination of the conversion ratio therefrom (e.g. by up to five millimeters or less).
  • the system 208 can be further configured to determine one or more conversions of one or more respective given stretched horizontal widths of the at least one stretchable feature 14(1), 14(2), 14(3), 14(4) or 14(5) identified in the digital image, converted using the conversion ratio (block 440).
  • the at least one stretchable feature 14(1), 14(2), 14(3), 14(4) or 14(5) may be stretched once the user 206 is wearing the form-fitting garment 10.
  • the stretched feature 14'(1), 14'(2), 14'(3), 14'(4) or 14'(5) may have a stretched horizontal width that can be represented in pixels in the digital image.
  • the stretched horizontal width can be converted into centimeters, for example, by utilizing the conversion ratio. This way, each stretched horizontal width along the at least part of the length of the garment 10 can have a corresponding converted width (i.e. conversions).
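The ratio-and-conversion steps of blocks 430 and 440 can be sketched in a few lines. This is an illustrative sketch only, not part of the disclosure: the function names and the numeric values are hypothetical, and it assumes the marker's horizontal extent has already been measured in pixels while its real-world width is known in centimeters.

```python
def conversion_ratio(marker_width_px: int, marker_width_cm: float) -> float:
    """Centimeters represented by a single pixel, derived from the
    reference marker's known, constant real-world width (block 430)."""
    return marker_width_cm / marker_width_px

def convert_widths(stretched_widths_px, ratio: float):
    """Convert stretched horizontal widths from pixels to centimeters
    using the conversion ratio (block 440)."""
    return [w * ratio for w in stretched_widths_px]

# Hypothetical example: a 5 cm marker spans 100 pixels, so 1 pixel = 0.05 cm.
ratio = conversion_ratio(100, 5.0)
conversions = convert_widths([400, 420, 410], ratio)  # roughly [20.0, 21.0, 20.5]
```

Because the marker's real size is constant, the same ratio applies to every object in the same image, which is why one marker suffices for all stretchable features.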
  • the system 208 can be further configured to calculate an entity parameter related to an entity wearing the garment 10, utilizing the conversions (block 450).
  • the entity parameter may be a circumference of the entity (i.e. user 206 wearing the form-fitting garment 10).
  • the circumference of the entity can be determined in various manners, some of which are exemplified herein:
  • the circumference can be calculated by inputting the conversions and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and determine the respective circumference.
  • the input parameters may include at least one of: age, height, weight, garment size of the garment worn by the entity, or gender.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified circumference measurement.
  • the training input parameters may include at least one of: age, height, weight, garment size of the garment worn by the entity, or gender.
  • the training conversions may be obtained from a conversions training data set obtained during previous measurements, or from external databases.
  • the conversions training data set may consist of previously determined stretched horizontal widths along the at least part of the length of the form-fitting garment 10 converted into centimeters by utilizing various conversion ratios.
  • the verified circumference measurements may be obtained from a verified circumference measurement training data set obtained during previous measurements, or from external databases.
  • the verified circumference measurement training data set may consist of circumference measurements made by a dietician.
  • an exemplary record of the training set can comprise, for a given entity, its input parameters (e.g. the entity’s age and/or height and/or weight and/or garment size of the garment worn by the entity and/or gender), its conversions (e.g. stretched horizontal widths of stretchable feature/s along the at least part of the length of the form-fitting garment 10 converted into centimeters), and its verified circumference measurement (e.g. measured using a measurement device operated by a dietician).
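The disclosure does not tie the classifier to any particular learning algorithm. As a minimal illustrative sketch only, a record-based predictor over (training input parameters, training conversions) → verified circumference could look like the following; the nearest-neighbour rule and all record values are assumptions, not the claimed method.

```python
def predict_circumference(records, input_params, conversions, k=1):
    """Predict a circumference from training records, each record being
    (training_input_parameters, training_conversions, verified_circumference).
    A simple nearest-neighbour rule stands in for whatever trained
    classifier is actually used."""
    def distance(rec):
        params, convs, _ = rec
        feats_a = list(params) + list(convs)
        feats_b = list(input_params) + list(conversions)
        return sum((a - b) ** 2 for a, b in zip(feats_a, feats_b))
    nearest = sorted(records, key=distance)[:k]
    return sum(rec[2] for rec in nearest) / k

# Hypothetical records: (age, height_cm, weight_kg), conversions in cm,
# and a dietician-verified circumference in cm.
records = [
    ((30, 175, 70), (20.0, 21.0, 20.5), 82.0),
    ((45, 180, 95), (24.0, 25.5, 25.0), 104.0),
]
print(predict_circumference(records, (31, 176, 72), (20.2, 21.1, 20.4)))  # -> 82.0
```

In practice the verified circumferences in the records would come from the dietician-made measurement data set described above, and any regression model could replace the nearest-neighbour rule.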
  • the circumference can be calculated by: (a) determining differences between the conversions and the respective known feature horizontal widths, and (b) inputting the differences and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the differences and the input parameters and determine the respective circumference.
  • such a classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters (e.g. age, height, weight, garment size of the garment worn by the entity, or gender), training difference and a respective verified circumference measurement.
  • the training difference may be obtained from a difference training data set obtained during previous measurements, or from external databases.
  • the difference training data set may consist of previously determined differences between conversions of stretched horizontal widths along the at least part of the length of the form-fitting garment 10 into centimeters and respective known feature horizontal widths.
  • the verified circumference measurement training data set may consist of circumference measurements made by a dietician.
  • an exemplary record of the training set can comprise, for a given entity, its input parameters (e.g. the entity’s age and/or height and/or weight and/or garment size of the garment worn by the entity and/or gender), its differences (e.g. differences between conversions of stretched horizontal widths of stretchable feature/s along the at least part of the length of the form-fitting garment 10 worn by the entity into centimeters and respective known feature horizontal widths of the respective stretchable feature/s), and its verified circumference measurement (e.g. measured using a measurement device operated by a dietician).
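The "difference" variant above feeds the classifier how much each feature stretched rather than the raw conversions. A minimal sketch of that pre-processing step (the function name and values are illustrative):

```python
def stretch_differences(conversions_cm, known_unworn_widths_cm):
    """Difference between each converted stretched width and the known
    horizontal width of the same feature at the same point when unworn."""
    return [c - k for c, k in zip(conversions_cm, known_unworn_widths_cm)]

# A feature known to be 15.0 cm wide unworn that converts to 20.0 cm worn
# stretched by 5.0 cm.
diffs = stretch_differences([20.0, 21.0], [15.0, 15.5])  # -> [5.0, 5.5]
```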
  • system 208 can be further configured to determine a body fat percentage of the entity using a formula, wherein the formula is configured to receive the circumference and the one or more input parameters and output the body fat percentage.
  • the formula may be a mathematical formula such as a U.S. Navy Formula, BMI formula, BMR formula, YMCA formula, BSA formula or any other suitable formula.
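The U.S. Navy circumference method is one such formula, estimating body fat from circumferences and height. The commonly published centimetre form is sketched below; the constants are the widely published ones and should be verified against an authoritative source before use.

```python
import math

def navy_body_fat_pct(gender, height_cm, waist_cm, neck_cm, hip_cm=None):
    """U.S. Navy circumference method, as commonly published
    (centimetre form, base-10 logarithms). Returns an estimated
    body fat percentage."""
    if gender == "male":
        return 495 / (1.0324 - 0.19077 * math.log10(waist_cm - neck_cm)
                      + 0.15456 * math.log10(height_cm)) - 450
    # female form additionally uses the hip circumference
    return 495 / (1.29579 - 0.35004 * math.log10(waist_cm + hip_cm - neck_cm)
                  + 0.22100 * math.log10(height_cm)) - 450
```

For instance, a male with height 180 cm, waist 85 cm and neck 38 cm comes out at around 16%. The circumference inputs could be the ones calculated from the conversions as described above.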
  • the entity parameter may be a body fat percentage of the entity (i.e. user 206 wearing the form-fitting garment 10).
  • the body fat percentage can be determined in various manners, some of which are exemplified herein:
  • the body fat percentage may be calculated by inputting the conversions, and one or more input parameters into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and provide the body fat percentage.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters (e.g. age, height, weight, garment size of the garment worn by the entity, or gender), training conversions and a respective verified body fat percentage measurement.
  • the training conversions may be obtained from a conversions training data set obtained during previous measurements, or from external databases.
  • the conversions training data set may consist of previously determined stretched horizontal widths along the at least part of the length of the form-fitting garment 10 converted into centimeters by utilizing various conversion ratios.
  • the verified body fat percentage measurements may be obtained from a body fat percentage training data set obtained during previous measurements, or from external databases.
  • the verified body fat percentage measurement training data set may consist of body fat percentage measurements made by a dietician.
  • Various measurement devices may be used by the dietician to measure body fat percentage such as Ultrasound devices, Dual-energy X-ray absorptiometry (DXA) scans (that have become a “gold standard” for the assessment of body composition in sports nutrition), Skinfold Calipers, etc.
  • an exemplary record of the training set can comprise, for a given entity, its input parameters (e.g. the entity’s age and/or height and/or weight and/or garment size of the garment worn by the entity and/or gender), its conversions (e.g. stretched horizontal widths of stretchable feature/s along the at least part of the length of the form-fitting garment 10 converted into centimeters), and its verified body fat percentage measurement (e.g. measured using a measurement device operated by a dietician).
  • the body fat percentage can be calculated by: (a) determining differences between the conversions and the respective known feature horizontal widths, and (b) inputting the differences and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the differences and the input parameters and determine the respective body fat percentage.
  • such a classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters (e.g. age, height, weight, garment size of the garment worn by the entity, or gender), training difference and a respective verified body fat percentage measurement.
  • the training difference may be obtained from a difference training data set obtained during previous measurements, or from external databases.
  • the difference training data set may consist of previously determined differences between conversions of stretched horizontal widths along the at least part of the length of the form-fitting garment 10 into centimeters and respective known feature horizontal widths.
  • the verified body fat percentage measurement training data set may consist of body fat percentage measurements made by a dietician.
  • an exemplary record of the training set can comprise, for a given entity, its input parameters (e.g. the entity’s age and/or height and/or weight and/or garment size of the garment worn by the entity and/or gender), its differences (e.g. differences between conversions of stretched horizontal widths of stretchable feature/s along the at least part of the length of the form-fitting garment 10 worn by the entity into centimeters and respective known feature horizontal widths of the respective stretchable feature/s), and its verified body fat percentage measurement (e.g. measured using a measurement device operated by a dietician).
  • the body fat percentage can be calculated by determining a circumference of the entity and inputting the circumference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the circumference and the input parameters and determine the respective body fat percentage.
  • the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters (e.g. age, height, weight, garment size of the garment worn by the entity, or gender), training circumference and a respective verified body fat percentage measurement.
  • the training circumference may be obtained from a circumference training data set obtained during previous measurements, or from external databases.
  • the circumference measurement training data set may consist of circumference measurements made by a dietician.
  • the verified body fat percentage measurements may be obtained from a body fat percentage training data set obtained during previous measurements, or from external databases.
  • the verified body fat percentage measurement training data set may consist of body fat percentage measurements made by a dietician.
  • an exemplary record of the training set can comprise, for a given entity, its input parameters (e.g. the entity’s age and/or height and/or weight and/or garment size of the garment worn by the entity and/or gender), its circumferences (e.g. measured using a measurement device operated by a dietician or calculated by one or more of the classifiers that are used to determine circumference as described hereinabove), and its verified body fat percentage measurement (e.g. measured using a measurement device operated by a dietician).
  • the method described in the foregoing specification can be implemented utilizing a software debounce routine, or any other anomaly filtering method and/or technique, thereby at least reducing, if not eliminating, erroneous calculations of the entity parameter(s).
  • the number of measurements can be any number of measurements higher than one. In case more than one measurement is used, each measurement is made at a different location along the length of the stretchable feature(s). In a specific non-limiting example, nine measurements can be used.
  • the stretched horizontal widths that are used are the ones that correspond to those locations, within pre-set portions (e.g. the waist area, the chest area, the basin area, etc.) along the length of the stretchable feature(s), that stretched more than other locations in the corresponding pre-set portion.
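Combining the ideas above — multiple measurements per pre-set portion, anomaly filtering, and keeping the most-stretched width per portion — might look like the following sketch; the region names, tolerance and values are illustrative assumptions, not part of the disclosure.

```python
def max_stretch_per_region(measurements_by_region, tolerance_cm=3.0):
    """For each pre-set portion (e.g. waist, chest), discard measurements
    further than `tolerance_cm` from the region's median (a crude anomaly
    filter standing in for a debounce routine), then keep the largest
    remaining stretched width."""
    result = {}
    for region, widths in measurements_by_region.items():
        s = sorted(widths)
        median = s[len(s) // 2]
        kept = [w for w in widths if abs(w - median) <= tolerance_cm]
        result[region] = max(kept)
    return result

sample = {
    "waist": [20.1, 20.4, 35.0, 20.3],   # 35.0 is a spurious reading
    "chest": [24.8, 25.1, 25.0],
}
print(max_stretch_per_region(sample))   # waist -> 20.4, chest -> 25.1
```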
  • the system 208 can take into account differences in the way men and women accumulate fat, so that the calculation of the entity parameter(s) may differ between women and men. For instance, calculation for women may be based on measurements of three stretched horizontal widths of stretchable feature(s) along the at least part of the length of the form-fitting garment 10, while for men only two measurements of stretched horizontal widths may suffice.
  • the entity parameter may be a body type of the entity wearing the form-fitting garment 10, i.e. one of the three body types known in the art: Ectomorph, Endomorph or Mesomorph.
  • the entity wearing the form-fitting garment 10 may provide the system 208 with at least one of the input parameters (e.g. height).
  • the entity parameter calculation module 304 may be configured to calculate a width of the entity wearing the form-fitting garment 10 and utilize the width and the input parameters to determine the body type of the entity wearing the form-fitting garment 10.
  • the entity parameter calculation module 304 may be configured to calculate a width of the entity wearing the form-fitting garment 10 and utilize the width to determine the body type of the entity wearing the form-fitting garment 10.
  • system 208 can utilize historical and optionally current measurements of stretched horizontal widths in order to generate a two-dimensional or three-dimensional representation showing the progress/change of the entity wearing the form-fitting garment 10.
  • some of the blocks can be integrated into a consolidated block or can be broken down to a few blocks and/or other blocks may be added. Furthermore, in some cases, the blocks can be performed in a different order than described herein (for example, block 440 can be performed before block 430, block 430 can be performed before block 420, block 420 can be performed before block 410, etc.). It is to be further noted that some of the blocks are optional. It should also be noted that whilst the flow diagram is described also with reference to the system elements that realize them, this is by no means binding, and the blocks can be performed by elements other than those described herein.
  • system can be implemented, at least partly, as a suitably programmed computer.
  • the presently disclosed subject matter contemplates a computer program being readable by a computer for executing the disclosed method.
  • the presently disclosed subject matter further contemplates a machine-readable memory tangibly embodying a program of instructions executable by the machine for executing the disclosed method.
  • Examples of the presently disclosed subject matter may be provided as a computer program product, or software, that may include a machine-readable medium having stored thereon instructions, which may be used to program a computer system (or other electronic devices) to perform a process according to the presently disclosed subject matter.
  • a machine-readable medium includes any mechanism for storing or transmitting information in a form readable by a machine (e.g. a computer).
  • a machine-readable (e.g. computer readable) medium includes a machine (e.g. a computer) readable storage medium (e.g. read only memory (“ROM”), random access memory (“RAM”), magnetic disk storage media, optical storage media, flash memory devices, etc.), a machine (e.g. computer) readable transmission medium (electrical, optical, acoustical or other form of propagated signals (e.g., infrared signals, digital signals, etc.)), etc.
  • Fig. 2 illustrates a diagrammatic representation of a system in the exemplary form of a computer system 208 including hardware (HW) and software (SW), such as e.g. a set of instructions, causing the system to perform any one or more of the above techniques.
  • the machine may be connected (e.g. networked) to other machines in a Local Area Network (LAN), an intranet, an extranet, or the Internet.
  • the system may operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • the machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a server, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • the exemplary computer system 208 may include a processor.
  • the processor represents one or more general-purpose processing devices such as a microprocessor, a central processing unit, or the like. More particularly, the processor may be a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a processor implementing other instruction sets, or processors implementing a combination of instruction sets.
  • the processor may also be one or more special purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), a network processor, or the like.
  • the computer system may further include a network interface device (NID).
  • the computer system may also include a video display unit (e.g. a flat panel display, such as an OLED or liquid crystal display (LCD), or a cathode ray tube (CRT)), an alphanumeric input device (e.g. a keyboard), a cursor control device (e.g. a mouse), and a signal generation device (e.g. a speaker).
  • the computer system may further include a memory.
  • the memory may include a machine-accessible storage medium (or more specifically a computer-readable storage medium) on which is stored one or more sets of instructions (e.g. software) embodying any one or more of the methodologies or functions described herein.
  • the software may also reside, completely or at least partially, within the memory and/or within the processor during execution thereof by the computer system, the memory and the processor also constituting machine-readable storage media.
  • the software may further be transmitted or received over a network via the network interface device. While the machine-accessible storage medium is shown in an exemplary embodiment to be a single medium, the term “machine-readable storage medium” should be taken to include a single medium or multiple media (e.g. a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
  • “machine-readable storage medium” shall also be taken to include any medium that is capable of storing or encoding a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the presently disclosed subject matter. The term “machine-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media.
  • a computer program is a list of instructions such as a particular application program and/or an operating system.
  • the computer program may for instance include one or more of: a subroutine, a function, a procedure, an object method, an object implementation, an executable application, an applet, a servlet, a source code, an object code, a shared library/dynamic load library and/or other sequence of instructions designed for execution on a computer system.
  • the computer program may be stored internally on a non-transitory computer readable medium. All or some of the computer program may be provided on computer readable media permanently, removably or remotely coupled to an information processing system.
  • the computer readable media may include, for example and without limitation, any number of the following: magnetic storage media including disk and tape storage media; optical storage media such as compact disk media (e.g., CD-ROM, CD-R, etc.) and digital video disk storage media; nonvolatile memory storage media including semiconductor-based memory units such as FLASH memory, EEPROM, EPROM, ROM; ferromagnetic digital memories; MRAM; volatile storage media including registers, buffers or caches, main memory, RAM, etc.
  • a computer process typically includes an executing (running) program or portion of a program, current program values and state information, and the resources used by the operating system to manage the execution of the process.
  • An operating system is the software that manages the sharing of the resources of a computer and provides programmers with an interface used to access those resources.
  • An operating system processes system data and user input, and responds by allocating and managing tasks and internal system resources as a service to users and programs of the system.
  • the computer system may for instance include at least one processing unit, associated memory and a number of input/output (I/O) devices.
  • the presently disclosed subject matter may also be implemented in a computer program for running on a computer system, at least including code portions for performing steps of a method according to the presently disclosed subject matter when run on a programmable apparatus, such as a computer system.
  • the examples, or portions thereof, may be implemented as soft or code representations of physical circuitry or of logical representations convertible into physical circuitry, such as in a hardware description language of any appropriate type.
  • the presently disclosed subject matter is not limited to physical devices or units implemented in nonprogrammable hardware but can also be applied in programmable devices or units able to perform the desired device functions by operating in accordance with suitable program code, such as mainframes, minicomputers, servers, workstations, personal computers, notepads, personal digital assistants, electronic games, and other embedded systems, cell phones and various other wireless devices, commonly denoted in this application as ‘computer systems’.


Abstract

A system for an entity parameter calculation, including at least one processing circuitry configured to receive a digital image of a form-fitting garment being worn, the form-fitting garment having (a) a reference marker and (b) at least one stretchable feature, identify within the digital image, the reference marker and the at least one stretchable feature, determine a conversion ratio utilizing pixels representing the reference marker in the digital image and the reference marker size, determine one or more conversions of one or more respective given stretched horizontal widths of the at least one stretchable feature identified in the digital image, converted using the conversion ratio, and calculate an entity parameter related to an entity wearing the garment, utilizing the conversions.

Description

SYSTEM, METHOD AND COMPUTER READABLE MEDIUM FOR ENTITY PARAMETER CALCULATION
CROSS-REFERENCES TO RELATED APPLICATIONS
This application claims priority from U.S. Provisional Patent Application No. 62/892,572 filed on 28 Aug 2019, incorporated herein by reference in its entirety.
TECHNICAL FIELD
The invention relates to a system, method and computer readable medium for entity parameter calculation.
BACKGROUND
Physical fitness refers to the extent to which a human body can perform physical tasks. Knowing the specifics can help to determine fitness level and, consequently, to set realistic fitness goals, monitor progress and maintain motivation.
Generally, fitness is assessed in four key areas: aerobic fitness, muscular strength and endurance, flexibility, and body composition. The human body is composed of two types of tissues, known as fat weight and lean weight. Fat weight is the fat stored in fat cells throughout the body. Lean weight includes all other tissues, such as organs, bones, blood, skin, and muscle. Over the years, muscle is lost in favor of fat gain. The muscle loss adversely affects a human's physical function and personal appearance.
The deterioration in body composition (too little muscle and too much fat) is a major health risk associated with many medical problems including obesity, lower back pain, type II diabetes, various forms of cancer, high blood pressure, and heart disease. In fact, obesity contributes to at least half the chronic diseases in western society.
There are many ways to assess body composition; the most common are circumference and body fat percentage measurements.
Body fat measurement is important for determining fitness level as weight alone is not a clear indicator of good health because it does not distinguish between pounds that come from body fat and those that come from lean body mass or muscle.
Body fat percentage is crucial to know in order to maintain a healthy lifestyle. There are many techniques for body fat percentage measurement (some of which are based on determination of the physical circumference of the measured entity), such as underwater weighing, whole-body air displacement plethysmography, near-infrared interactance, dual-energy X-ray absorptiometry, body average density measurement, bioelectrical impedance analysis and anthropometric methods.
The known techniques for circumference and body fat percentage measurements are expensive and inaccessible, and thus, there is a growing need for a new system and method for entity parameter calculation.
GENERAL DESCRIPTION
In accordance with a first aspect of the presently disclosed subject matter, there is provided a form- fitting garment, comprising: a reference marker, having a reference marker size that is known and substantially constant; and at least one stretchable feature, having known feature horizontal widths along at least part of a length of the garment when unworn, the stretchable feature designed to stretch upon the garment being worn, giving rise to a stretched feature having stretched horizontal widths along the at least part of the length of the garment; thereby enabling calculation of at least one of:
(A) a circumference of an entity wearing the garment utilizing at least (a) the reference marker size, and (b) at least one difference between a given stretched horizontal width and a respective known feature horizontal width of the feature horizontal widths; or
(B)body fat percentage of the entity wearing the garment utilizing at least (a) the reference marker size, and (b) the at least one given stretched horizontal width.
In some cases, the at least one stretchable feature is in a front portion of the garment when being worn.
In some cases, the garment is one of: a shirt, a vest, a dress, a strap, a belt, a patch or a halterneck.
In some cases, the reference marker has a distinctive color compared to the garment.
In some cases, a first color of the reference marker in a visible spectrum is identical to a second color of the garment.
In some cases, the first color of the reference marker in a non-visible spectrum is different than the second color of the garment upon the garment being worn. In some cases, the reference marker is made of one of the following materials: linen, hemp, cotton, ramie, wool, silk, bamboo, soya, tencel, viscose, leather, suede, metal, polyester, polyolefin or polymer material.
In some cases, upon the entity wearing the garment, the difference between the given stretched horizontal width and the respective known feature horizontal width of the feature horizontal widths is determined by capturing an image of the garment, upon the garment being worn, the image including the reference marker and the stretchable feature, and analyzing the image.
In accordance with a second aspect of the presently disclosed subject matter, there is provided a method for an entity parameter calculation, the method comprising: receiving, by a processing circuitry, a digital image of a form-fitting garment being worn, the form- fitting garment having (a) a reference marker, having a reference marker size that is known and substantially constant, and (b) at least one stretchable feature, having known feature horizontal widths along at least part of a length of the garment when unworn, the stretchable feature designed to stretch upon the garment being worn, giving rise to a stretched feature having stretched horizontal widths along the at least part of the length of the garment; identifying, by the processing circuitry, within the digital image, the reference marker and the at least one stretchable feature; determining, by the processing circuitry, a conversion ratio utilizing pixels representing the reference marker in the digital image and the reference marker size; determining, by the processing circuitry, one or more conversions of one or more respective given stretched horizontal widths of the at least one stretchable feature identified in the digital image, converted using the conversion ratio; and calculating, by the processing circuitry, an entity parameter related to an entity wearing the garment, utilizing the conversions.
In some cases, the entity parameter is a circumference of the entity.
In some cases, the circumference is calculated by inputting the conversions and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and determine the respective circumference.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified circumference measurement. In some cases, the circumference is calculated by inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and determine the respective circumference.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified circumference measurement.
In some cases, the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective circumference.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified circumference measurement.
In some cases, the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective circumference.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified circumference measurement.
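By way of a non-limiting sketch, the classifier of the above variants could, for instance, be a simple least-squares regressor mapping a single conversion to a verified circumference (a deliberately minimal, library-free illustration; a practical classifier would use more features, e.g. the input parameters, and many more training records):

```python
def fit_linear(xs, ys):
    """Least-squares fit of y = a*x + b over training records
    (x: converted stretched width, y: verified circumference)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

def predict(model, x):
    """Predicted circumference for a converted width x."""
    a, b = model
    return a * x + b
```

Training on records of converted widths paired with verified circumference measurements yields a model that can then estimate the circumference for a new conversion.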
In some cases, the method further comprising determining a body fat percentage of the entity using a formula, wherein the formula is configured to receive the circumference and the one or more input parameters and output the body fat percentage.
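One well-known circumference-based formula of the kind contemplated above is the U.S. Navy body-fat estimate; a sketch for a male entity, with circumferences and height in centimeters (the use of this particular formula is an illustrative assumption, not part of the disclosure):

```python
import math

def navy_body_fat_male(waist_cm, neck_cm, height_cm):
    """U.S. Navy circumference-based body fat estimate for men:
    495 / (1.0324 - 0.19077*log10(waist - neck)
           + 0.15456*log10(height)) - 450."""
    return (495 / (1.0324
                   - 0.19077 * math.log10(waist_cm - neck_cm)
                   + 0.15456 * math.log10(height_cm))) - 450
```

Here the circumference determined from the conversions serves as the waist input, while height (and, for the female variant of the formula, additional circumferences) come from the input parameters.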
In some cases, the entity parameter is a body fat percentage of the entity.
In some cases, the body fat percentage is calculated by: inputting the conversions, and one or more input parameters into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and provide the body fat percentage.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified body fat percentage measurement. In some cases, the body fat percentage is calculated by: inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and provide the body fat percentage.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified body fat percentage measurement.
In some cases, the body fat percentage is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective body fat percentage.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified body fat percentage measurement.
In some cases, the body fat percentage is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective body fat percentage.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified body fat percentage measurement.
In some cases, the body fat percentage is calculated by: determining a circumference of the entity; and inputting the circumference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the circumference and the input parameters and determine the respective body fat percentage.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training circumference and a respective verified body fat percentage measurement.
In some cases, the body fat percentage is calculated by: determining a circumference of the entity; and inputting the circumference into a classifier, wherein the classifier is configured to obtain the circumference and determine the respective body fat percentage.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training circumference and a respective verified body fat percentage measurement.
In some cases, the circumference is calculated by inputting the conversions and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and determine the respective circumference.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified circumference measurement.
In some cases, the circumference is calculated by inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and determine the respective circumference.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified circumference measurement.
In some cases, the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective circumference.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified circumference measurement.
In some cases, the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective circumference. In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified circumference measurement.
In some cases, the input parameters include at least one of: age, height, weight, garment size of the garment worn by the entity, or gender.
In some cases, the training input parameters include at least one of: age, height, weight, garment size of the garment worn by the entity, or gender.
In accordance with a third aspect of the presently disclosed subject matter, there is provided a system for an entity parameter calculation, the system comprising at least one processing circuitry configured to perform the following: receive a digital image of a form-fitting garment being worn, the form-fitting garment having (a) a reference marker, having a reference marker size that is known and substantially constant, and (b) at least one stretchable feature, having known feature horizontal widths along at least part of a length of the garment when unworn, the stretchable feature designed to stretch upon the garment being worn, giving rise to a stretched feature having stretched horizontal widths along the at least part of the length of the garment; identify within the digital image, the reference marker and the at least one stretchable feature; determine a conversion ratio utilizing pixels representing the reference marker in the digital image and the reference marker size; determine one or more conversions of one or more respective given stretched horizontal widths of the at least one stretchable feature identified in the digital image, converted using the conversion ratio; and calculate an entity parameter related to an entity wearing the garment, utilizing the conversions.
In some cases, the entity parameter is a circumference of the entity.
In some cases, the circumference is calculated by inputting the conversions and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and determine the respective circumference.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified circumference measurement.
In some cases, the circumference is calculated by inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and determine the respective circumference. In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified circumference measurement.
In some cases, the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective circumference.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified circumference measurement.
In some cases, the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective circumference.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified circumference measurement.
In some cases, the system further comprising determining a body fat percentage of the entity using a formula, wherein the formula is configured to receive the circumference and the one or more input parameters and output the body fat percentage.
In some cases, the entity parameter is a body fat percentage of the entity.
In some cases, the body fat percentage is calculated by: inputting the conversions, and one or more input parameters into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and provide the body fat percentage.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified body fat percentage measurement.
In some cases, the body fat percentage is calculated by: inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and provide the body fat percentage. In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified body fat percentage measurement.
In some cases, the body fat percentage is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective body fat percentage.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified body fat percentage measurement.
In some cases, the body fat percentage is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective body fat percentage.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified body fat percentage measurement.
In some cases, the body fat percentage is calculated by: determining a circumference of the entity; and inputting the circumference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the circumference and the input parameters and determine the respective body fat percentage.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training circumference and a respective verified body fat percentage measurement.
In some cases, the body fat percentage is calculated by: determining a circumference of the entity; and inputting the circumference into a classifier, wherein the classifier is configured to obtain the circumference and determine the respective body fat percentage. In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training circumference and a respective verified body fat percentage measurement.
In some cases, the circumference is calculated by inputting the conversions and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and determine the respective circumference.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified circumference measurement.
In some cases, the circumference is calculated by inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and determine the respective circumference.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified circumference measurement.
In some cases, the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective circumference.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified circumference measurement.
In some cases, the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective circumference.
In some cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified circumference measurement. In some cases, the input parameters include at least one of: age, height, weight, garment size of the garment worn by the entity, or gender.
In some cases, the training input parameters include at least one of: age, height, weight, garment size of the garment worn by the entity, or gender.
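By way of non-limiting illustration only, a single record of the training set described above can be sketched as follows; the field names, types and values are illustrative assumptions and are not part of the disclosed subject matter:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TrainingRecord:
    # Training input parameters (age, height, weight, garment size, gender).
    age: int
    height_cm: float
    weight_kg: float
    garment_size: str
    gender: str
    # Training conversions: stretched horizontal widths converted to cm.
    conversions_cm: List[float]
    # Verified measurement, e.g. taken manually as ground truth.
    verified_circumference_cm: float

record = TrainingRecord(age=34, height_cm=178.0, weight_kg=81.5,
                        garment_size="M", gender="male",
                        conversions_cm=[32.4, 34.1, 33.0],
                        verified_circumference_cm=91.0)
```

A classifier trained on many such records maps the input parameters and conversions to the verified measurement.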
In accordance with a fourth aspect of the presently disclosed subject matter, there is provided a non-transitory computer readable storage medium having computer readable program code embodied therewith, the computer readable program code, executable by a processing circuitry to perform a method for circumference calculation, the method comprising: receiving, by a processing circuitry, a digital image of a form-fitting garment being worn, the form-fitting garment having (a) a reference marker, having a reference marker size that is known and substantially constant, and (b) at least one stretchable feature, having known feature horizontal widths along at least part of a length of the garment when unworn, the stretchable feature designed to stretch upon the garment being worn, giving rise to a stretched feature having stretched horizontal widths along the at least part of the length of the garment; identifying, by the processing circuitry, within the digital image, the reference marker and the at least one stretchable feature; determining, by the processing circuitry, a conversion ratio utilizing pixels representing the reference marker in the digital image and the reference marker size; determining, by the processing circuitry, one or more conversions of one or more respective given stretched horizontal widths of the at least one stretchable feature identified in the digital image, converted using the conversion ratio; and calculating, by the processing circuitry, an entity parameter related to an entity wearing the garment, utilizing the conversions.
BRIEF DESCRIPTION OF THE DRAWINGS
In order to understand the presently disclosed subject matter and to see how it may be carried out in practice, the subject matter will now be described, by way of non-limiting examples only, with reference to the accompanying drawings, in which:
Figs. 1A and 1B are schematic illustrations of a T-shirt according to one example of the presently disclosed subject matter;
Fig. 2 is a block diagram schematically illustrating one example of an environment for performing an entity parameter calculation, according to one example of the presently disclosed subject matter;
Fig. 3 is a block diagram schematically illustrating one example of a system for performing an entity parameter calculation, in accordance with the presently disclosed subject matter; and
Fig. 4 is a flowchart illustrating one example of a sequence of operations carried out for performing an entity parameter calculation, in accordance with the presently disclosed subject matter.
DETAILED DESCRIPTION
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the presently disclosed subject matter. However, it will be understood by those skilled in the art that the presently disclosed subject matter may be practiced without these specific details. In other instances, well-known methods, procedures, and components have not been described in detail so as not to obscure the presently disclosed subject matter.
In the drawings and descriptions set forth, identical reference numerals indicate those components that are common to different embodiments or configurations.
Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "receiving", "displaying", "performing", "storing", “providing”, “preventing”, "identifying", "determining", "calculating" or the like, include action and/or processes of a computer that manipulate and/or transform data into other data, said data represented as physical quantities, e.g. such as electronic quantities, and/or said data representing the physical objects. The terms “computer”, “processor”, “processing resource” and “controller” should be expansively construed to cover any kind of electronic device with data processing capabilities, including, by way of non-limiting example, a personal desktop/laptop computer, a server, a computing system, a communication device, a smartphone, a tablet computer, a smart television, a processor (e.g. digital signal processor (DSP), a microcontroller, a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), etc.), a group of multiple physical machines sharing performance of various tasks, virtual servers co-residing on a single physical machine, any other electronic computing device, and/or any combination thereof.
The operations in accordance with the teachings herein may be performed by a computer specially constructed for the desired purposes or by a general-purpose computer specially configured for the desired purpose by a computer program stored in a non-transitory computer readable storage medium. The term "non-transitory" is used herein to exclude transitory, propagating signals, but to otherwise include any volatile or non-volatile computer memory technology suitable to the application.
As used herein, the phrase "for example," "such as", "for instance" and variants thereof describe non-limiting embodiments of the presently disclosed subject matter. Reference in the specification to "one case", "some cases", "other cases" or variants thereof means that a particular feature, structure or characteristic described in connection with the embodiment(s) is included in at least one embodiment of the presently disclosed subject matter. Thus, the appearance of the phrase "one case", "some cases", "other cases" or variants thereof does not necessarily refer to the same embodiment(s).
It is appreciated that, unless specifically stated otherwise, certain features of the presently disclosed subject matter, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the presently disclosed subject matter, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination.
In embodiments of the presently disclosed subject matter, fewer, more and/or different stages than those shown in Fig. 4 may be executed. In embodiments of the presently disclosed subject matter one or more stages illustrated in Fig. 4 may be executed in a different order and/or one or more groups of stages may be executed simultaneously. Figs. 2 and 3 illustrate a general schematic of the system architecture in accordance with an embodiment of the presently disclosed subject matter. Each module in Figs. 2 and 3 can be made up of any combination of software, hardware and/or firmware that performs the functions as defined and explained herein. The modules in Figs. 2 and 3 may be centralized in one location or dispersed over more than one location. In other embodiments of the presently disclosed subject matter, the system may comprise fewer, more, and/or different modules than those shown in Figs. 2 and 3.
Any reference in the specification to a method should be applied mutatis mutandis to a system capable of executing the method and should be applied mutatis mutandis to a non-transitory computer readable medium that stores instructions that once executed by a computer result in the execution of the method. Any reference in the specification to a system should be applied mutatis mutandis to a method that may be executed by the system and should be applied mutatis mutandis to a non-transitory computer readable medium that stores instructions that may be executed by the system.
Any reference in the specification to a non-transitory computer readable medium should be applied mutatis mutandis to a system capable of executing the instructions stored in the non-transitory computer readable medium and should be applied mutatis mutandis to a method that may be executed by a computer that reads the instructions stored in the non-transitory computer readable medium.
Bearing this in mind, attention is drawn to Figs. 1A and 1B, showing an exemplary form-fitting garment 10, in accordance with the presently disclosed subject matter.
As shown in figure 1A, the form-fitting garment 10 includes a reference marker 12 and stretchable features 14(1), 14(2), 14(3), 14(4) and 14(5).
The form-fitting garment 10 can be made from elastic, washable and durable fabric, such as but not limited to linen, hemp, cotton, ramie, wool, silk, bamboo, soya, tencel, viscose, leather, suede, metal or any suitable polymer material. Another non-limiting example of a fabric is polyester and polyolefin materials having moisture wicking and stretch characteristics. The form-fitting garment 10 may be a slim or skin-tight T-shirt, vest, dress, strap, belt, patch, halterneck or the like of appropriate (fitted) size. In some cases, the front portion of the form-fitting garment 10 can be stretchable while the form-fitting garment's 10 back portion is not necessarily stretchable.
The reference marker 12 may have a reference marker size that is known and constant. The reference marker 12 can be made from an unstretchable material so that wearing the form-fitting garment 10 will not cause stretching thereof. Accordingly, the reference marker's 12 size and shape do not change, irrespective of the form-fitting garment 10 being worn or not.
In some cases, the reference marker 12 may have a reference marker size that is known and substantially constant. That is, wearing the form-fitting garment 10 may cause stretching thereof by an insignificant extent, as further detailed hereinbelow. For example, the reference marker 12 may stretch (e.g. over time, upon wearing of the form-fitting garment 10, etc.) by up to five millimeters, or less.
The reference marker 12 may be located in any location on the form-fitting garment 10. It can be integrated in the form-fitting garment 10, forming one surface of fabric, or located on the exterior of the front portion or the exterior of the back portion thereof, or any other configuration as long as it is visible to a camera taking a picture of an entity wearing the form-fitting garment 10. For example, the reference marker 12 can be located on the form-fitting garment 10 using an appliqué technique.
In some cases, the reference marker 12 may have a distinctive color compared to the form-fitting garment 10.
In some cases, the distinctiveness can be in the visible spectrum, and in such cases, the reference marker 12 may have a first color that is different than a second color of the form-fitting garment 10, so that it can be distinguished from the form- fitting garment 10.
In other cases, the distinctiveness can be in the non-visible spectrum; in such cases, the reference marker 12 may have a first color different than a second color of the form-fitting garment 10, in an invisible spectrum (while their color can be identical or not in the visible spectrum). Such a structure may be used for IR imaging, for example.
In some cases, the form-fitting garment 10 may have a distinctive color compared to background surroundings.
The stretchable features 14(1), 14(2), 14(3), 14(4) and 14(5) may have known feature horizontal widths along at least part of a length of the form-fitting garment 10 when unworn. Accordingly, the horizontal widths of the stretchable features at various points along the at least part of the length of the form-fitting garment 10 are known.
The stretchable features 14(1), 14(2), 14(3), 14(4) and 14(5) can be designed to stretch upon the form-fitting garment 10 being worn, giving rise to stretched features 14'(1), 14'(2), 14'(3), 14'(4) and 14'(5) having stretched horizontal widths along the at least part of a length of the form-fitting garment 10. Accordingly, the horizontal width of a given stretchable feature (e.g. 14'(2)) at a given point along the at least part of a length of the form-fitting garment 10 when being worn is larger than the horizontal width of the same given stretchable feature (e.g. 14(2)) at the same given point when the form-fitting garment 10 is unworn. For example, stretchable features 14(2) and 14(4) have known feature horizontal widths 15 at a given point on the form-fitting garment 10 when it is unworn, as depicted in figure 1A, and stretched horizontal widths 15' at the same given point on the form-fitting garment 10 when it is being worn, as depicted in figure 1B. It is to be noted that the stretched horizontal widths 15' are larger than the known feature horizontal widths 15. The stretchable features 14(1), 14(2), 14(3), 14(4) and 14(5) may be located in any location on the form-fitting garment 10. They can be integrated in the form-fitting garment 10, forming one surface of fabric, or located on the exterior of the front portion or the exterior of the back portion thereof, or any other configuration as long as they are visible to a camera taking a picture of an entity wearing the form-fitting garment 10.
Optionally, as shown in Figs. 1A and 1B, the reference marker 12 and the stretchable features 14(1), 14(2), 14(3), 14(4) and 14(5), are of a rectangular/quadratic shape but other shapes may be used. In some cases, the reference marker 12 and the stretchable features 14(1), 14(2), 14(3), 14(4) and 14(5) may be whole or fractions of geometric shape(s), pattern(s), picture(s), design(s), dot(s), line(s), logo(s), symbol(s) and the like.
As further detailed herein, inter alia with reference to Fig. 4, the reference marker 12 and the stretchable features 14(1), 14(2), 14(3), 14(4) and 14(5) enable calculation of at least one of:
(A) a circumference of an entity wearing the garment 10 utilizing at least (a) the reference marker size, and (b) at least one difference between a given stretched horizontal width and a respective known feature horizontal width of the feature horizontal widths; wherein upon the entity wearing the garment, the difference between the given stretched horizontal width and the respective known feature horizontal width of the feature horizontal widths is determined by capturing an image of the garment, including the reference marker and the stretchable feature, and analyzing the image; or
(B) body fat percentage of the entity wearing the garment 10 utilizing at least (a) the reference marker size, and (b) the at least one given stretched horizontal width.
Turning to Fig. 2, there is shown a block diagram schematically illustrating one example of an environment 200 for performing an entity parameter calculation, in accordance with the presently disclosed subject matter. The environment 200 includes a user equipment 204, a user 206 wearing the form-fitting garment 10, a communication network 202 and an entity parameter calculation system 208. The user equipment 204 may be a portable handheld device such as a smartphone, a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a wristwatch, a desktop computer or any other device with data communication capabilities. It may include a camera unit for capturing image(s) of the form-fitting garment 10 when worn by an entity. The user equipment 204 can communicate with communication network 202 via wired or wireless communication. It may also include one or more of: a video display unit (e.g. flat panel display, such as OLED, or liquid crystal display (LCD) or a cathode ray tube (CRT)), an alphanumeric input device (e.g. a keyboard), a cursor control device (e.g. a mouse), or a signal generation device (e.g. a speaker).
The network 202 may be a cellular network, a Personal Area Network (PAN), a Local Area Network (LAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a Virtual Private Network (VPN), an intranet, an extranet, the Internet, etc.
The user equipment 204 and the entity parameter calculation system 208 can communicate over the communication network 202 via wired or wireless communication. In some cases, the entity parameter calculation system 208 may be comprised within the user equipment 204.
Turning to Fig. 3, there is shown a block diagram schematically illustrating one example of an entity parameter calculation system 208 for performing an entity parameter calculation, in accordance with the presently disclosed subject matter.
The entity parameter calculation system 208 comprises at least one processing circuitry 302. Processing circuitry 302 can be one or more processing units (e.g. central processing units), microprocessors, microcontrollers or any other computing devices or modules, including multiple and/or parallel and/or distributed processing units, which are adapted to independently or cooperatively process data for controlling relevant resources of the entity parameter calculation system 208 and for enabling operations related to resources of the entity parameter calculation system 208.
The processing circuitry 302 comprises an entity parameter calculation module 304 configured to perform an entity parameter calculation process, as further detailed herein with respect to Fig. 4.
In some cases, the system for entity parameter calculation 208 comprises, or may be otherwise associated with, a data repository 306 (e.g. a database, a storage system, a memory including Read Only Memory - ROM, Random Access Memory - RAM, or any other type of memory, etc.) configured to store data, including inter alia user-related data relating to one or more users 206 wearing the form-fitting garment 10, and various data acquired from such users 206 (e.g. digital image/s of the user 206 wearing the form-fitting garment 10, physical details of the user 206 (such as age, gender, garment size of the garment 10 worn by the user 206, medical condition, fitness level or any other parameter representing a physiological characteristic thereof), body metrics of the user 206 (such as height, weight, circumference, BMI, body composition, etc.), contact details of the user 206 (such as phone number, email, residence address, social media accounts, etc.), etc.).
In some cases, data repository 306 can be further configured to enable retrieval and/or update and/or deletion of the stored data. It is to be noted that in some cases, data repository 306 can be distributed across multiple locations, whether within the entity parameter calculation system 208 and/or within the user equipment 204 and/or elsewhere. It is to be noted, that in some cases, the relevant information relating to the user 206 can be loaded into data repository 306 before taking a picture of user 206 wearing the form-fitting garment 10 (e.g. upon entering the system for entity parameter calculation 208 and/or periodically and/or upon the system for entity parameter calculation 208 requesting the information) or after.
Turning to Fig. 4, there is shown a flowchart illustrating one example of a sequence of operations carried out for performing an entity parameter calculation, in accordance with the presently disclosed subject matter.
According to certain examples of the presently disclosed subject matter, the system for entity parameter calculation 208 can be configured to perform an entity parameter calculation process 400, e.g. utilizing an entity parameter calculation module 304.
For this purpose, the system for entity parameter calculation 208 (also referred to herein as “system”) can be configured to receive a digital image of a form-fitting garment 10 being worn (e.g. by a person whose entity parameter calculation is desired) (block 410). As indicated herein, inter alia with reference to Figs. 1A, 1B, and 2, the form-fitting garment 10 has (a) a reference marker 12, having a reference marker size that is known and substantially constant, and (b) at least one stretchable feature 14(1), 14(2), 14(3), 14(4) or 14(5), having known feature horizontal widths along at least part of a length of the garment when unworn. Each of the at least one stretchable features (e.g. stretchable features 14(1), 14(2), 14(3), 14(4) or 14(5)) is designed to stretch upon the garment 10 being worn, giving rise to respective stretched features (e.g. stretched features 14'(1), 14'(2), 14'(3), 14'(4) or 14'(5)) having stretched horizontal widths along the at least part of the length of the garment.
The digital image of the form-fitting garment 10 being worn can be acquired by a camera unit that may be comprised within the user equipment 204 or may be otherwise connected to it via a wired/wireless communication channel. The camera unit may be operated by the user 206 wearing the form-fitting garment 10 (i.e. for taking a self-portrait) or a third party (e.g. a camera unit with a timer feature, a selfie stick, mirror reflection, another person taking the picture, etc.).
The system 208 can be further configured to identify, within the digital image, the reference marker 12 and at least one of the stretchable features (e.g. stretchable feature 14(1), 14(2), 14(3), 14(4) or 14(5)) (block 420).
The identification of the reference marker 12 and the at least one stretchable feature 14(1), 14(2), 14(3), 14(4) or 14(5) within the digital image can be done, for example, by automated pattern recognition technology (e.g. by utilizing digital image processing techniques). In such cases, optionally (e.g. in order to facilitate the pattern recognition), the user can be required to place the reference marker 12 within an ad hoc frame displayed on a user interface of the camera unit used for acquiring the digital image, before taking the picture of the form-fitting garment 10 being worn.
It is to be noted that in other cases, the system 208 is not required to identify the reference marker 12 within the digital image. In such cases, the operator of the camera unit is required to accurately place the reference marker 12 within an ad hoc frame displayed on a user interface of the camera unit used for acquiring the digital image, before taking the picture of the form-fitting garment 10 being worn. Accordingly, the identification of the reference marker by the system 208 is made redundant.
Optionally, the camera unit can be configured to automatically acquire the digital image of the form-fitting garment 10 being worn once the reference marker 12 has been aligned with the frame displayed on the user interface thereof.
Moreover, the camera unit can be configured to assist the user to align the reference marker 12 with the frame displayed on the user interface of the camera unit by providing direction arrows for example.
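By way of non-limiting illustration, once the digital image has been thresholded into a binary mask flagging pixels of the reference marker's distinctive color, the marker's extent in pixels can be located with a simple bounding-box scan. The following pure-Python sketch stands in for the pattern recognition technology mentioned above and is not the disclosed implementation; the toy mask and its values are assumptions:

```python
def marker_bounding_box(mask):
    """Return (top, left, bottom, right) of the pixels flagged as the
    reference marker in a binary mask (a list of rows of 0/1 values).
    Returns None when no marker pixel is present."""
    rows = [r for r, row in enumerate(mask) if any(row)]
    if not rows:
        return None
    cols = [c for row in mask for c, v in enumerate(row) if v]
    return (rows[0], min(cols), rows[-1], max(cols))

# A 5x6 toy mask in which a 2x3 marker occupies rows 1-2, columns 2-4.
mask = [
    [0, 0, 0, 0, 0, 0],
    [0, 0, 1, 1, 1, 0],
    [0, 0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0],
]
assert marker_bounding_box(mask) == (1, 2, 2, 4)
```

The width of the box in pixels (right - left + 1) can then feed the conversion ratio determination of block 430.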
The system 208 can be further configured to determine a conversion ratio utilizing pixels representing the reference marker 12 in the digital image and the reference marker size (block 430). The reference marker size may be known and constant, that is, its size (e.g. its cross-section, area, horizontal latitude, etc.) may be known in centimeters for example, or in any other measurement system (i.e. the International System of Units (SI), the imperial system, the United States customary units, etc.). The reference marker 12 can be designed to maintain its known original size (i.e. remain constant) irrespective of the form-fitting garment 10 being worn or not.
In some cases, the reference marker 12 may be represented by MxN pixels in the digital image, wherein M indicates the number of pixels in a horizontal line of the reference marker's 12 contour and N indicates the number of pixels in a vertical line thereof.
In other cases, the marker 12 may be represented in pixels merely by a one-dimensional line that may be oriented in any direction on the form-fitting garment 10.
The conversion ratio may be determined utilizing the reference marker size and its representation in pixels (e.g. M pixels are equal to Y centimeters so that the conversion ratio can be calculated therefrom). This way, any desired object in the form-fitting garment 10 (e.g. stretchable features 14(1), 14(2), 14(3), 14(4) or 14(5) or stretched features 14'(1), 14'(2), 14'(3), 14'(4) or 14'(5)) may be converted from its pixel representation in the digital image to its corresponding metric representation for example.
It is to be noted that in some cases, the reference marker 12 can be designed to substantially maintain its known original size (i.e. remain substantially constant) upon the form-fitting garment 10 being worn. In these cases, the reference marker 12 may stretch by an insignificant extent that will not affect the determination of the conversion ratio therefrom (e.g. by up to five millimeters or less).
The system 208 can be further configured to determine one or more conversions of one or more respective given stretched horizontal widths of the at least one stretchable feature 14(1), 14(2), 14(3), 14(4) or 14(5) identified in the digital image, converted using the conversion ratio (block 440).
The at least one stretchable feature 14(1), 14(2), 14(3), 14(4) or 14(5) may be stretched once the user 206 is wearing the form-fitting garment 10. The stretched feature 14'(1), 14'(2), 14'(3), 14'(4) or 14'(5) may have a stretched horizontal width that can be represented in pixels in the digital image. The stretched horizontal width can be converted into centimeters, for example, by utilizing the conversion ratio. This way, each stretched horizontal width along the at least part of the length of the garment 10 can have a corresponding converted width (i.e. conversions).
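The determinations of blocks 430 and 440 can be illustrated, by way of non-limiting example, with the following sketch, in which the marker width in pixels, the marker size in centimeters and the stretched widths in pixels are all assumed values:

```python
def conversion_ratio(marker_pixels, marker_size_cm):
    """Centimeters per pixel, from the reference marker's known size
    and its width in pixels in the digital image (block 430)."""
    return marker_size_cm / marker_pixels

def convert_widths(stretched_widths_px, ratio):
    """Convert stretched horizontal widths from pixels to centimeters,
    yielding the 'conversions' of block 440."""
    return [w * ratio for w in stretched_widths_px]

# Assume the marker is known to be 4 cm wide and spans 64 pixels in the
# image, so each pixel corresponds to 0.0625 cm.
ratio = conversion_ratio(64, 4.0)
assert ratio == 0.0625
# Three stretched horizontal widths measured in pixels, converted to cm.
assert convert_widths([640, 672, 704], ratio) == [40.0, 42.0, 44.0]
```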
The system 208 can be further configured to calculate an entity parameter related to an entity wearing the garment 10, utilizing the conversions (block 450).
Calculating circumference of the entity:
In some cases, the entity parameter may be a circumference of the entity (i.e. user 206 wearing the form-fitting garment 10). In such cases, the circumference of the entity can be determined in various manners, some of which are exemplified herein:
In these cases, the circumference can be calculated by inputting the conversions and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and determine the respective circumference.
The input parameters may include at least one of: age, height, weight, garment size of the garment worn by the entity, or gender.
The classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified circumference measurement.
The training input parameters may include at least one of: age, height, weight, garment size of the garment worn by the entity, or gender.
The training conversions may be obtained from a conversions training data set obtained during previous measurements, or from external databases.
For example, the conversions training data set may consist of previously determined stretched horizontal widths along the at least part of the length of the form-fitting garment 10 converted into centimeters by utilizing various conversion ratios.
The verified circumference measurements may be obtained from a verified circumference measurement training data set obtained during previous measurements, or from external databases.
For example, the verified circumference measurement training data set may consist of circumference measurements made by a dietician.
Looking at an exemplary record of the training set, it can comprise, for a given entity, its input parameters (e.g. the entity’s age and/or height and/or weight and/or garment size of the garment worn by the entity and/or gender), its conversions (e.g. stretched horizontal widths of stretchable feature/s along the at least part of the length of the form-fitting garment 10 converted into centimeters), and its verified circumference measurement (e.g. measured using a measurement device operated by a dietician).
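By way of non-limiting illustration of how such records could train a predictor, the following sketch fits a single-feature least-squares line from the mean of the conversions to the verified circumference. The values are fabricated for the example, and any suitable classifier or regression model could take the place of this closed-form fit:

```python
def fit_linear(xs, ys):
    """Least-squares fit of ys ~ a*xs + b (a stand-in for training the
    classifier; returns the slope a and intercept b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical records: mean converted width (cm) -> verified
# circumference (cm); these values lie exactly on 3*w - 10.
mean_widths = [30.0, 32.0, 34.0, 36.0]
circumferences = [80.0, 86.0, 92.0, 98.0]
a, b = fit_linear(mean_widths, circumferences)
assert abs(a - 3.0) < 1e-9 and abs(b + 10.0) < 1e-9

# Predict a circumference for a new entity whose mean width is 33.0 cm.
predicted = a * 33.0 + b
assert abs(predicted - 89.0) < 1e-9
```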
Alternatively, or additionally, the circumference can be calculated by:
(A) determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and
(B) inputting the at least one difference and one or more input parameters associated with the entity into a classifier that is configured to obtain the at least one difference and the input parameters and determine the respective circumference.
In such cases, such a classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters (e.g. age, height, weight, garment size of the garment worn by the entity, or gender), training difference and a respective verified circumference measurement.
The training difference may be obtained from a difference training data set obtained during previous measurements, or from external databases.
For example, the difference training data set may consist of previously determined differences between conversions of stretched horizontal widths along the at least part of the length of the form-fitting garment 10 into centimeters and respective known feature horizontal widths.
For example, the verified circumference measurement training data set may consist of circumference measurements made by a dietician.
Looking at an exemplary record of the training set, it can comprise, for a given entity, its input parameters (e.g. the entity’s age and/or height and/or weight and/or garment size of the garment worn by the entity and/or gender), its differences (e.g. differences between conversions of stretched horizontal widths of stretchable feature/s along the at least part of the length of the form-fitting garment 10 worn by the entity into centimeters and respective known feature horizontal widths of the respective stretchable feature/s), and its verified circumference measurement (e.g. measured using a measurement device operated by a dietician).
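The difference determination in step (A) above amounts to a per-location subtraction; a minimal sketch, assuming the conversions and the known unworn feature widths are given in centimeters at matching locations:

```python
def difference_features(conversions_cm, known_widths_cm):
    """Step (A): per-location differences between the converted stretched
    widths (garment worn) and the known unworn feature widths, in cm."""
    if len(conversions_cm) != len(known_widths_cm):
        raise ValueError("conversions and known widths must align per location")
    return [c - k for c, k in zip(conversions_cm, known_widths_cm)]
```

The resulting differences, together with the input parameters, form the feature vector fed to the classifier of step (B).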
Calculating Body Fat Percentage of the entity:
In some cases, the system 208 can be further configured to determine a body fat percentage of the entity using a formula, wherein the formula is configured to receive the circumference and the one or more input parameters and output the body fat percentage.
The formula may be a mathematical formula such as a U.S. Navy Formula, BMI formula, BMR formula, YMCA formula, BSA formula or any other suitable formula.
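As a non-limiting illustration, the U.S. Navy formula (in its centimeter, log10 form) maps circumferences and height directly to a body fat percentage. The coefficients below are the commonly published ones, reproduced here for illustration only and worth verifying against an authoritative source:

```python
import math

def navy_body_fat_pct(gender, height_cm, waist_cm, neck_cm, hip_cm=None):
    """U.S. Navy body fat estimate (centimeter, log10 variant).
    gender: 'm' or 'f'; hip_cm is required for 'f'."""
    if gender == 'm':
        d = (1.0324 - 0.19077 * math.log10(waist_cm - neck_cm)
                    + 0.15456 * math.log10(height_cm))
    else:
        d = (1.29579 - 0.35004 * math.log10(waist_cm + hip_cm - neck_cm)
                     + 0.22100 * math.log10(height_cm))
    return 495.0 / d - 450.0
```

Here the waist (and, for women, hip and neck) circumferences would be the ones determined from the form-fitting garment 10 as described hereinabove.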
Having described various manners for determining the circumference of the entity, it is to be noted that in other cases, the entity parameter may be a body fat percentage of the entity (i.e. user 206 wearing the form-fitting garment 10). In such cases, the body fat percentage can be determined in various manners, some of which are exemplified herein:
In some cases, the body fat percentage may be calculated by inputting the conversions, and one or more input parameters into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and provide the body fat percentage.
In such cases, the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters (e.g. age, height, weight, garment size of the garment worn by the entity, or gender), training conversions and a respective verified body fat percentage measurement.
The training conversions may be obtained from a conversions training data set obtained during previous measurements, or from external databases.
For example, the conversions training data set may consist of previously determined stretched horizontal widths along the at least part of the length of the form-fitting garment 10 converted into centimeters by utilizing various conversion ratios.
The verified body fat percentage measurements may be obtained from a body fat percentage training data set obtained during previous measurements, or from external databases.
For example, the verified body fat percentage measurement training data set may consist of body fat percentage measurements made by a dietician. Various measurement devices may be used by the dietician to measure body fat percentage such as Ultrasound devices, Dual-energy X-ray absorptiometry (DXA) scans (that have become a “gold standard” for the assessment of body composition in sports nutrition), Skinfold Calipers, etc.
Looking at an exemplary record of the training set, it can comprise, for a given entity, its input parameters (e.g. the entity’s age and/or height and/or weight and/or garment size of the garment worn by the entity and/or gender), its conversions (e.g. stretched horizontal widths of stretchable feature/s along the at least part of the length of the form-fitting garment 10 converted into centimeters), and its verified body fat percentage measurement (e.g. measured using a measurement device operated by a dietician).
Alternatively, or additionally, the body fat percentage can be calculated by:
(A) determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and
(B) inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective body fat percentage.
In such cases, such a classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters (e.g. age, height, weight, garment size of the garment worn by the entity, or gender), training difference and a respective verified body fat percentage measurement.
The training difference may be obtained from a difference training data set obtained during previous measurements, or from external databases.
For example, the difference training data set may consist of previously determined differences between conversions of stretched horizontal widths along the at least part of the length of the form-fitting garment 10 into centimeters and respective known feature horizontal widths.
For example, the verified body fat percentage measurement training data set may consist of body fat percentage measurements made by a dietician.
Looking at an exemplary record of the training set, it can comprise, for a given entity, its input parameters (e.g. the entity’s age and/or height and/or weight and/or garment size of the garment worn by the entity and/or gender), its differences (e.g. differences between conversions of stretched horizontal widths of stretchable feature/s along the at least part of the length of the form-fitting garment 10 worn by the entity into centimeters and respective known feature horizontal widths of the respective stretchable feature/s), and its verified body fat percentage measurement (e.g. measured using a measurement device operated by a dietician).
Alternatively, or additionally, the body fat percentage can be calculated by determining a circumference of the entity and inputting the circumference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the circumference and the input parameters and determine the respective body fat percentage.
The classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters (e.g. age, height, weight, garment size of the garment worn by the entity, or gender), training circumference and a respective verified body fat percentage measurement.
The training circumference may be obtained from a circumference training data set obtained during previous measurements, or from external databases.
For example, the circumference measurement training data set may consist of circumference measurements made by a dietician.
The verified body fat percentage measurements may be obtained from a body fat percentage training data set obtained during previous measurements, or from external databases.
For example, the verified body fat percentage measurement training data set may consist of body fat percentage measurements made by a dietician.
Looking at an exemplary record of the training set, it can comprise, for a given entity, its input parameters (e.g. the entity’s age and/or height and/or weight and/or garment size of the garment worn by the entity and/or gender), its circumferences (e.g. measured using a measurement device operated by a dietician or calculated by one or more of the classifiers that are used to determine circumference as described hereinabove), and its verified body fat percentage measurement (e.g. measured using a measurement device operated by a dietician).
It is to be further noted that various calculation methods described hereinabove may be performed utilizing statistical techniques, e.g. multiple linear regression (MLR).
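A minimal MLR sketch, using only the Python standard library and the normal equations (a production system would more likely rely on a numerical library):

```python
def mlr_fit(X, y):
    """Fit y ~ b0 + b1*x1 + ... by solving the normal equations
    (A^T A) b = A^T y, where A is X with an intercept column prepended."""
    A = [[1.0] + list(row) for row in X]
    n, k = len(A), len(A[0])
    M = [[sum(A[i][p] * A[i][q] for i in range(n)) for q in range(k)]
         for p in range(k)]
    v = [sum(A[i][p] * y[i] for i in range(n)) for p in range(k)]
    # Gaussian elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, k):
            f = M[r][col] / M[col][col]
            for c in range(col, k):
                M[r][c] -= f * M[col][c]
            v[r] -= f * v[col]
    # Back substitution
    b = [0.0] * k
    for r in range(k - 1, -1, -1):
        b[r] = (v[r] - sum(M[r][c] * b[c] for c in range(r + 1, k))) / M[r][r]
    return b

def mlr_predict(b, row):
    return b[0] + sum(bi * xi for bi, xi in zip(b[1:], row))
```

In this setting, each row of X would hold an entity's conversions (or differences) plus input parameters, and y the verified circumferences or body fat percentages.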
In addition, the method described in the foregoing specification can be implemented utilizing a software debounce routine, or any other anomaly filtering method and/or technique, thereby at least reducing, if not eliminating, erroneous calculations of the entity parameter(s).
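One simple anomaly-filtering technique in the spirit of the above (the text does not mandate a particular method) is to discard readings that deviate markedly from the median of repeated measurements:

```python
from statistics import median, mean

def filter_and_average(readings, rel_tol=0.1):
    """Drop readings more than rel_tol * median away from the median of the
    repeated measurements, then average the survivors; falls back to the
    median itself if every reading was rejected."""
    m = median(readings)
    kept = [r for r in readings if abs(r - m) <= rel_tol * m]
    return mean(kept) if kept else m
```

Applied to repeated stretched-width measurements, this suppresses one-off outliers (e.g. a momentarily misdetected reference marker) before the entity parameter is calculated.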
Optionally, when reference is made herein to calculations that utilize measurements of stretched horizontal widths of stretchable feature(s) along the at least part of the length of the form-fitting garment 10 worn by the entity, the number of measurements can be any number higher than one. In case more than one measurement is used, each measurement is made at a different location along the length of the stretchable feature(s). In a specific non-limiting example, nine measurements can be used. In some cases, the stretched horizontal widths that are used are the ones that correspond to those locations, within pre-set portions (e.g. the waist area, the chest area, the basin area, etc.) along the length of the stretchable feature(s), that stretched more than other locations at the corresponding pre-set portion.
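The selection of the most-stretched location per pre-set portion can be sketched as follows; the portion names and the (portion, width) layout are illustrative assumptions:

```python
def max_stretch_per_portion(measurements):
    """measurements: iterable of (portion, stretched_width_cm) pairs,
    e.g. ('waist', 31.2). Returns the largest width seen per portion."""
    best = {}
    for portion, width in measurements:
        if portion not in best or width > best[portion]:
            best[portion] = width
    return best
```

The per-portion maxima can then serve as the conversions fed into the calculations described hereinabove.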
Optionally, the system 208 can take into account differences in the way men and women accumulate fat, so that the calculation of the entity parameter(s) may differ between women and men. For instance, calculation for women may be based on measurements of three stretched horizontal widths of stretchable feature(s) along the at least part of the length of the form-fitting garment 10, while for men only two measurements of stretched horizontal widths may suffice.
In some cases, the entity parameter may be a body type of the entity wearing the form-fitting garment 10, thereby demonstrating one of the three body types known in the art: Ectomorph, Endomorph or Mesomorph. For this purpose, the entity wearing the form-fitting garment 10 may provide the system 208 with at least one of the input parameters (e.g. height). The entity parameter calculation module 304 may be configured to calculate a width of the entity wearing the form-fitting garment 10 and utilize the width and the input parameters to determine the body type of the entity wearing the form-fitting garment 10.
In other cases, the entity parameter calculation module 304 may be configured to calculate a width of the entity wearing the form-fitting garment 10 and utilize the width to determine the body type of the entity wearing the form-fitting garment 10.
It is to be noted that in some cases, the system 208 can utilize historical and optionally current measurements of stretched horizontal widths in order to generate a two-dimensional or three-dimensional representation showing the progress/change of the entity wearing the form-fitting garment 10.
It is to be noted that, with reference to Fig. 4, some of the blocks can be integrated into a consolidated block or can be broken down to a few blocks and/or other blocks may be added. Furthermore, in some cases, the blocks can be performed in a different order than described herein (for example, block 440 can be performed before block 430, block 430 can be performed before block 420, block 420 can be performed before block 410, etc.). It is to be further noted that some of the blocks are optional. It should also be noted that whilst the flow diagram is described also with reference to the system elements that realize the blocks, this is by no means binding, and the blocks can be performed by elements other than those described herein.
It is to be understood that the presently disclosed subject matter is not limited in its application to the details set forth in the description contained herein or illustrated in the drawings. The presently disclosed subject matter is capable of other embodiments and of being practiced and carried out in various ways. Hence, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting. As such, those skilled in the art will appreciate that the conception upon which this disclosure is based may readily be utilized as a basis for designing other structures, methods, and systems for carrying out the several purposes of the presently disclosed subject matter.
It will also be understood that the system according to the presently disclosed subject matter can be implemented, at least partly, as a suitably programmed computer. Likewise, the presently disclosed subject matter contemplates a computer program being readable by a computer for executing the disclosed method. The presently disclosed subject matter further contemplates a machine-readable memory tangibly embodying a program of instructions executable by the machine for executing the disclosed method.
Examples of the presently disclosed subject matter may be provided as a computer program product, or software, that may include a machine-readable medium having stored thereon instructions, which may be used to program a computer system (or other electronic devices) to perform a process according to the presently disclosed subject matter. A machine-readable medium includes any mechanism for storing or transmitting information in a form readable by a machine (e.g. a computer). For example, a machine-readable (e.g. computer readable) medium includes a machine (e.g. a computer) readable storage medium (e.g. read only memory (“ROM”), random access memory (“RAM”), magnetic disk storage media, optical storage media, flash memory devices, etc.), a machine (e.g. computer) readable transmission medium (electrical, optical, acoustical or other form of propagated signals (e.g., infrared signals, digital signals, etc.)), etc.
Fig. 2 illustrates a diagrammatic representation of a system in the exemplary form of a computer system 208 including HW and SW such as, e.g., a set of instructions, causing the system to perform any one or more of the above techniques. In alternative examples, the machine may be connected (e.g. networked) to other machines in a Local Area Network (LAN), an intranet, an extranet, or the Internet. The system may operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a server, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines (e.g. computers) that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
The exemplary computer system 208 may include a processor. The processor represents one or more general-purpose processing devices such as a microprocessor, a central processing unit, or the like. More particularly, the processor may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, a processor implementing other instruction sets, or processors implementing a combination of instruction sets. The processor may also be one or more special purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like.
The computer system may further include a network interface device (NID).
The computer system may also include a video display unit (e.g. flat panel display, such as OLED, or liquid crystal display (LCD) or a cathode ray tube (CRT)), an alphanumeric input device (e.g. a keyboard), a cursor control device (e.g. a mouse), and a signal generation device (e.g. a speaker).
The computer system may further include a memory.
The memory may include a machine-accessible storage medium (or more specifically a computer-readable storage medium) on which is stored one or more sets of instructions (e.g. software) embodying any one or more of the methodologies or functions described herein. The software may also reside, completely or at least partially, within the memory and/or within the processor during execution thereof by the computer system, the memory and the processor also constituting machine-readable storage media. The software may further be transmitted or received over a network via the network interface device. While the machine-accessible storage medium is shown in an exemplary embodiment to be a single medium, the term “machine-readable storage medium” should be taken to include a single medium or multiple media (e.g. a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-readable storage medium” shall also be taken to include any medium that is capable of storing or encoding a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the presently disclosed subject matter. The term “machine-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media.
A computer program is a list of instructions such as a particular application program and/or an operating system. The computer program may for instance include one or more of: a subroutine, a function, a procedure, an object method, an object implementation, an executable application, an applet, a servlet, a source code, an object code, a shared library/dynamic load library and/or other sequence of instructions designed for execution on a computer system.
The computer program may be stored internally on a non-transitory computer readable medium. All or some of the computer program may be provided on computer readable media permanently, removably or remotely coupled to an information processing system. The computer readable media may include, for example and without limitation, any number of the following: magnetic storage media including disk and tape storage media; optical storage media such as compact disk media (e.g., CD-ROM, CD-R, etc.) and digital video disk storage media; nonvolatile memory storage media including semiconductor-based memory units such as FLASH memory, EEPROM, EPROM, ROM; ferromagnetic digital memories; MRAM; volatile storage media including registers, buffers or caches, main memory, RAM, etc. A computer process typically includes an executing (running) program or portion of a program, current program values and state information, and the resources used by the operating system to manage the execution of the process. An operating system (OS) is the software that manages the sharing of the resources of a computer and provides programmers with an interface used to access those resources. An operating system processes system data and user input, and responds by allocating and managing tasks and internal system resources as a service to users and programs of the system. The computer system may for instance include at least one processing unit, associated memory and a number of input/output (I/O) devices. When executing the computer program, the computer system processes information according to the computer program and produces resultant output information via I/O devices.
In the foregoing specification, the presently disclosed subject matter has been described with reference to specific examples of embodiments of the presently disclosed subject matter. It will, however, be evident that various modifications and changes may be made therein without departing from the broader spirit and scope of the presently disclosed subject matter as set forth in the appended claims.
The presently disclosed subject matter may also be implemented in a computer program for running on a computer system, at least including code portions for performing steps of a method according to the presently disclosed subject matter when run on a programmable apparatus, such as a computer system.
Moreover, the terms “front,” “back,” “top,” “bottom,” “over,” “under” and the like in the description and in the claims, if any, are used for descriptive purposes and not necessarily for describing permanent relative positions. It is understood that the terms so used are interchangeable under appropriate circumstances such that the embodiments of the presently disclosed subject matter described herein are, for example, capable of operation in other orientations than those illustrated or otherwise described herein.
Also, for example, the examples, or portions thereof, may be implemented as soft or code representations of physical circuitry or of logical representations convertible into physical circuitry, such as in a hardware description language of any appropriate type.
Also, the presently disclosed subject matter is not limited to physical devices or units implemented in nonprogrammable hardware but can also be applied in programmable devices or units able to perform the desired device functions by operating in accordance with suitable program code, such as mainframes, minicomputers, servers, workstations, personal computers, notepads, personal digital assistants, electronic games, and other embedded systems, cell phones and various other wireless devices, commonly denoted in this application as ‘computer systems’.
However, other modifications, variations and alternatives are also possible. The specifications and drawings are, accordingly, to be regarded in an illustrative rather than in a restrictive sense. While certain features of the presently disclosed subject matter have been illustrated and described herein, many modifications, substitutions, changes, and equivalents will now occur to those of ordinary skill in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the presently disclosed subject matter.

Claims

CLAIMS:
1. A form-fitting garment, comprising: a reference marker, having a reference marker size that is known and substantially constant; and at least one stretchable feature, having known feature horizontal widths along at least part of a length of the garment when unworn, the stretchable feature designed to stretch upon the garment being worn, giving rise to a stretched feature having stretched horizontal widths along the at least part of the length of the garment; thereby enabling calculation of at least one of:
(C) a circumference of an entity wearing the garment utilizing at least (a) the reference marker size, and (b) at least one difference between a given stretched horizontal width and a respective known feature horizontal width of the feature horizontal widths; or
(D) body fat percentage of the entity wearing the garment utilizing at least (a) the reference marker size, and (b) the at least one given stretched horizontal width.
2. The garment of claim 1, wherein the at least one stretchable feature is in a front portion of the garment when being worn.
3. The garment of claim 1, wherein the garment is one of: a shirt, a vest, a dress, a strap, a belt, a patch or a halterneck.
4. The garment of claim 1, wherein the reference marker has a distinctive color compared to the garment.
5. The garment of claim 1, wherein a first color of the reference marker in a visible spectrum is identical to a second color of the garment.
6. The garment of claim 5, wherein the first color of the reference marker in a non-visible spectrum is different than the second color of the garment upon the garment being worn.
7. The garment of claim 1, wherein the reference marker is made of one of the following materials: linen, hemp, cotton, ramie, wool, silk, bamboo, soya, tencel, viscose, leather, suede, metal, polyester, polyolefin or polymer material.
8. The garment of claim 1, wherein upon the entity wearing the garment, the difference between the given stretched horizontal width and the respective known feature horizontal width of the feature horizontal widths is determined by capturing an image of the garment, upon the garment being worn, the image including the reference marker and the stretchable feature, and analyzing the image.
9. A method for an entity parameter calculation, the method comprising: receiving, by a processing circuitry, a digital image of a form-fitting garment being worn, the form-fitting garment having (a) a reference marker, having a reference marker size that is known and substantially constant, and (b) at least one stretchable feature, having known feature horizontal widths along at least part of a length of the garment when unworn, the stretchable feature designed to stretch upon the garment being worn, giving rise to a stretched feature having stretched horizontal widths along the at least part of the length of the garment; identifying, by the processing circuitry, within the digital image, the reference marker and the at least one stretchable feature; determining, by the processing circuitry, a conversion ratio utilizing pixels representing the reference marker in the digital image and the reference marker size; determining, by the processing circuitry, one or more conversions of one or more respective given stretched horizontal widths of the at least one stretchable feature identified in the digital image, converted using the conversion ratio; and calculating, by the processing circuitry, an entity parameter related to an entity wearing the garment, utilizing the conversions.
10. The method of claim 9, wherein the entity parameter is a circumference of the entity.
11. The method of claim 10, wherein the circumference is calculated by inputting the conversions and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and determine the respective circumference.
12. The method of claim 11, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified circumference measurement.
13. The method of claim 10, wherein the circumference is calculated by inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and determine the respective circumference.
14. The method of claim 13, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified circumference measurement.
15. The method of claim 10, wherein the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective circumference.
16. The method of claim 15, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified circumference measurement.
17. The method of claim 10, wherein the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective circumference.
18. The method of claim 17, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified circumference measurement.
19. The method of claim 10, further comprising determining a body fat percentage of the entity using a formula, wherein the formula is configured to receive the circumference and the one or more input parameters and output the body fat percentage.
20. The method of claim 9, wherein the entity parameter is a body fat percentage of the entity.
21. The method of claim 20, wherein the body fat percentage is calculated by: inputting the conversions, and one or more input parameters into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and provide the body fat percentage.
22. The method of claim 21, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified body fat percentage measurement.
23. The method of claim 20, wherein the body fat percentage is calculated by: inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and provide the body fat percentage.
24. The method of claim 23, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified body fat percentage measurement.
25. The method of claim 20, wherein the body fat percentage is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective body fat percentage.
26. The method of claim 25, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified body fat percentage measurement.
27. The method of claim 20, wherein the body fat percentage is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective body fat percentage.
28. The method of claim 27, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified body fat percentage measurement.
29. The method of claim 20, wherein the body fat percentage is calculated by: determining a circumference of the entity; and inputting the circumference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the circumference and the input parameters and determine the respective body fat percentage.
30. The method of claim 29, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training circumference and a respective verified body fat percentage measurement.
31. The method of claim 20, wherein the body fat percentage is calculated by: determining a circumference of the entity; and inputting the circumference into a classifier, wherein the classifier is configured to obtain the circumference and determine the respective body fat percentage.
32. The method of claim 31, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training circumference and a respective verified body fat percentage measurement.
33. The method of claim 29, wherein the circumference is calculated by inputting the conversions and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and determine the respective circumference.
34. The method of claim 33, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified circumference measurement.
35. The method of claim 31, wherein the circumference is calculated by inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and determine the respective circumference.
36. The method of claim 35, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified circumference measurement.
37. The method of claim 29, wherein the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective circumference.
38. The method of claim 37, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified circumference measurement.
39. The method of claim 31, wherein the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective circumference.
40. The method of claim 39, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified circumference measurement.
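The difference-based claims above (claims 37–40) feed a classifier with the differences between the converted stretched widths and the garment's known unworn widths. A minimal sketch of that difference step, with hypothetical names and toy values — the patent does not prescribe any particular implementation:

```python
# Illustrative sketch only (not the patented implementation): compute
# the per-position "differences" between the conversions (converted
# stretched widths, in cm) and the known unworn feature widths.
# All function names and sample values are hypothetical.

def stretch_differences(conversions, unworn_widths):
    """Difference between each converted stretched width and the
    garment's known unworn width at the same position along its length."""
    if len(conversions) != len(unworn_widths):
        raise ValueError("one unworn width is needed per conversion")
    return [c - u for c, u in zip(conversions, unworn_widths)]

# Example: three measurement positions on a garment that is 28 cm wide
# everywhere when unworn.
diffs = stretch_differences([32.5, 35.0, 33.1], [28.0, 28.0, 28.0])
```

Per the claims, these differences (optionally together with input parameters) are then passed to the classifier in place of the raw conversions.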
41. The method according to any one of claims 11, 15, 19, 21, 25, 29, 33 or 37, wherein the input parameters include at least one of: age, height, weight, garment size of the garment worn by the entity, or gender.
42. The method according to any one of claims 12, 16, 22, 26, 30, 34 or 38, wherein the training input parameters include at least one of: age, height, weight, garment size of the garment worn by the entity, or gender.
43. A system for entity parameter calculation, the system comprising at least one processing circuitry configured to perform the following: receive a digital image of a form-fitting garment being worn, the form-fitting garment having (a) a reference marker, having a reference marker size that is known and substantially constant, and (b) at least one stretchable feature, having known feature horizontal widths along at least part of a length of the garment when unworn, the stretchable feature designed to stretch upon the garment being worn, giving rise to a stretched feature having stretched horizontal widths along the at least part of the length of the garment; identify within the digital image, the reference marker and the at least one stretchable feature; determine a conversion ratio utilizing pixels representing the reference marker in the digital image and the reference marker size; determine one or more conversions of one or more respective given stretched horizontal widths of the at least one stretchable feature identified in the digital image, converted using the conversion ratio; and calculate an entity parameter related to an entity wearing the garment, utilizing the conversions.
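The pipeline in claim 43 hinges on the conversion-ratio step: the reference marker's known physical size, divided by its extent in pixels, yields a ratio that converts any pixel measurement in the image into physical units. A minimal sketch under assumed names and values — not the patented implementation:

```python
# Illustrative sketch of the conversion-ratio step described in the
# claims. Marker size, pixel counts, and function names are all
# hypothetical; the patent does not disclose a concrete algorithm.

def conversion_ratio(marker_size_cm, marker_size_px):
    """Physical units per pixel, from the reference marker whose
    physical size is known and substantially constant."""
    return marker_size_cm / marker_size_px

def convert_widths(stretched_widths_px, ratio):
    """Convert stretched-feature widths from pixels to cm, giving the
    'conversions' used by the downstream entity parameter calculation."""
    return [w * ratio for w in stretched_widths_px]

# Example: a 2 cm marker spans 40 pixels, so 1 px = 0.05 cm.
ratio = conversion_ratio(marker_size_cm=2.0, marker_size_px=40)
widths_cm = convert_widths([640, 700, 660], ratio)
```

The resulting conversions are what the later claims feed, directly or as differences, into a classifier.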
44. The system of claim 43, wherein the entity parameter is a circumference of the entity.
45. The system of claim 44, wherein the circumference is calculated by inputting the conversions and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and determine the respective circumference.
46. The system of claim 45, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified circumference measurement.
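Claims 45–46 describe a classifier that maps the conversions plus input parameters to a circumference, trained on records pairing those features with verified circumference measurements. The sketch below uses a nearest-neighbour lookup purely as a stand-in for whatever model the patent contemplates; all names and training data are hypothetical:

```python
# Illustrative sketch of the training-record structure in the claims:
# each record holds training conversions, training input parameters,
# and a verified circumference. A 1-nearest-neighbour lookup stands in
# for the (unspecified) classifier. Toy data, hypothetical names.

import math

def predict_circumference(records, conversions, params):
    """Return the verified circumference of the training record whose
    features are closest to the query's conversions + input parameters."""
    query = conversions + params
    def dist(rec):
        return math.dist(rec["conversions"] + rec["params"], query)
    return min(records, key=dist)["circumference"]

records = [
    {"conversions": [30.0, 31.0], "params": [170.0, 65.0], "circumference": 78.0},
    {"conversions": [34.0, 36.0], "params": [180.0, 90.0], "circumference": 95.0},
]
pred = predict_circumference(records, [33.5, 35.0], [178.0, 88.0])
```

In practice the features would be normalized and a trained regression model used, but the record layout — features plus a verified measurement — is what the training claims recite.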
47. The system of claim 44, wherein the circumference is calculated by inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and determine the respective circumference.
48. The system of claim 47, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified circumference measurement.
49. The system of claim 44, wherein the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective circumference.
50. The system of claim 49, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified circumference measurement.
51. The system of claim 44, wherein the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective circumference.
52. The system of claim 51, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified circumference measurement.
53. The system of claim 44, further comprising determining a body fat percentage of the entity using a formula, wherein the formula is configured to receive the circumference and the one or more input parameters and output the body fat percentage.
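Claim 53 derives the body fat percentage from the circumference and input parameters "using a formula" without naming one. One widely published formula of exactly that shape is the U.S. Navy circumference method; it is shown below only as an example of such a formula, not as the one the patent uses:

```python
# Illustrative only: the claim leaves the formula unspecified. The
# U.S. Navy circumference method (male, metric form) is one well-known
# formula that takes circumferences and height and returns body fat %.

import math

def navy_body_fat_male(waist_cm, neck_cm, height_cm):
    """U.S. Navy circumference method, male, all measurements in cm."""
    return (495.0 / (1.0324
                     - 0.19077 * math.log10(waist_cm - neck_cm)
                     + 0.15456 * math.log10(height_cm))
            - 450.0)

# Example inputs: 85 cm waist, 38 cm neck, 178 cm height.
bf = navy_body_fat_male(waist_cm=85.0, neck_cm=38.0, height_cm=178.0)
```

Any formula accepting the circumference and the input parameters of claim 75 (age, height, weight, garment size, gender) would satisfy the claim's structure.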
54. The system of claim 43, wherein the entity parameter is a body fat percentage of the entity.
55. The system of claim 54, wherein the body fat percentage is calculated by: inputting the conversions, and one or more input parameters into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and provide the body fat percentage.
56. The system of claim 55, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified body fat percentage measurement.
57. The system of claim 54, wherein the body fat percentage is calculated by: inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and provide the body fat percentage.
58. The system of claim 57, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified body fat percentage measurement.
59. The system of claim 54, wherein the body fat percentage is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective body fat percentage.
60. The system of claim 59, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified body fat percentage measurement.
61. The system of claim 54, wherein the body fat percentage is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective body fat percentage.
62. The system of claim 61, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified body fat percentage measurement.
63. The system of claim 54, wherein the body fat percentage is calculated by: determining a circumference of the entity; and inputting the circumference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the circumference and the input parameters and determine the respective body fat percentage.
64. The system of claim 63, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training circumference and a respective verified body fat percentage measurement.
65. The system of claim 54, wherein the body fat percentage is calculated by: determining a circumference of the entity; and inputting the circumference into a classifier, wherein the classifier is configured to obtain the circumference and determine the respective body fat percentage.
66. The system of claim 65, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training circumference and a respective verified body fat percentage measurement.
67. The system of claim 63, wherein the circumference is calculated by inputting the conversions and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the conversions and the input parameters and determine the respective circumference.
68. The system of claim 67, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training conversions and a respective verified circumference measurement.
69. The system of claim 65, wherein the circumference is calculated by inputting the conversions into a classifier, wherein the classifier is configured to obtain the conversions and determine the respective circumference.
70. The system of claim 69, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training conversions and a respective verified circumference measurement.
71. The system of claim 63, wherein the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference and one or more input parameters associated with the entity into a classifier, wherein the classifier is configured to obtain the at least one difference and the input parameters and determine the respective circumference.
72. The system of claim 71, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training input parameters, training difference and a respective verified circumference measurement.
73. The system of claim 65, wherein the circumference is calculated by: determining at least one difference between (a) the conversions, and (b) respective known feature horizontal widths of the feature horizontal widths; and inputting the at least one difference into a classifier, wherein the classifier is configured to obtain the at least one difference and determine the respective circumference.
74. The system of claim 73, wherein the classifier is trained using a training set including a plurality of records, each of the records having respective training difference and a respective verified circumference measurement.
75. The system according to any one of claims 45, 49, 53, 55, 59, 63, 67 or 71, wherein the input parameters include at least one of: age, height, weight, garment size of the garment worn by the entity, or gender.
76. The system according to any one of claims 46, 50, 56, 60, 64, 68 or 72, wherein the training input parameters include at least one of: age, height, weight, garment size of the garment worn by the entity, or gender.
77. A non-transitory computer readable storage medium having computer readable program code embodied therewith, the computer readable program code executable by a processing circuitry to perform a method for entity parameter calculation, the method comprising: receiving, by the processing circuitry, a digital image of a form-fitting garment being worn, the form-fitting garment having (a) a reference marker, having a reference marker size that is known and substantially constant, and (b) at least one stretchable feature, having known feature horizontal widths along at least part of a length of the garment when unworn, the stretchable feature designed to stretch upon the garment being worn, giving rise to a stretched feature having stretched horizontal widths along the at least part of the length of the garment; identifying, by the processing circuitry, within the digital image, the reference marker and the at least one stretchable feature; determining, by the processing circuitry, a conversion ratio utilizing pixels representing the reference marker in the digital image and the reference marker size; determining, by the processing circuitry, one or more conversions of one or more respective given stretched horizontal widths of the at least one stretchable feature identified in the digital image, converted using the conversion ratio; and calculating, by the processing circuitry, an entity parameter related to an entity wearing the garment, utilizing the conversions.
PCT/IL2020/050124 2019-08-28 2020-02-03 System, method and computer readable medium for entity parameter calculation WO2021038550A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962892572P 2019-08-28 2019-08-28
US62/892,572 2019-08-28

Publications (1)

Publication Number Publication Date
WO2021038550A1 true WO2021038550A1 (en) 2021-03-04

Family

ID=74684973

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2020/050124 WO2021038550A1 (en) 2019-08-28 2020-02-03 System, method and computer readable medium for entity parameter calculation

Country Status (1)

Country Link
WO (1) WO2021038550A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130179288A1 (en) * 2010-11-17 2013-07-11 Upcload Gmbh Collecting and using anthropometric measurements
IN2015MU01862A (en) * 2015-05-11 2016-11-18
WO2019032982A1 (en) * 2017-08-11 2019-02-14 North Carolina State University Devices and methods for extracting body measurements from 2d images
US20190122424A1 (en) * 2017-10-23 2019-04-25 Fit3D, Inc. Generation of Body Models and Measurements
US10339706B2 (en) * 2008-08-15 2019-07-02 Brown University Method and apparatus for estimating body shape


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115830128A (en) * 2023-02-15 2023-03-21 成都全景德康医学影像诊断中心有限公司 Face positioning measurement method, device and system
CN115830128B (en) * 2023-02-15 2023-05-12 成都全景德康医学影像诊断中心有限公司 Face positioning measurement method, device and system

Similar Documents

Publication Publication Date Title
US11526983B2 (en) Image feature recognition method and apparatus, storage medium, and electronic apparatus
Ferris et al. Computer-aided classification of melanocytic lesions using dermoscopic images
CN110414631B (en) Medical image-based focus detection method, model training method and device
US20180368731A1 (en) Image processing apparatus, image processing method, and recording medium recording same
CN109934220B (en) Method, device and terminal for displaying image interest points
US20240041425A1 (en) Non-invasive determination of pennation angle and/or fascicle length
US20150105651A1 (en) Systems and methods for mri-based health management
Gonzalez et al. Volumetric electromagnetic phase-shift spectroscopy of brain edema and hematoma
US11612376B2 (en) Non-invasive determination of muscle tissue size
US11776687B2 (en) Medical examination of human body using haptics
US20190325780A1 (en) Physical movement analysis
WO2021038550A1 (en) System, method and computer readable medium for entity parameter calculation
Shepherd et al. Modeling the shape and composition of the human body using dual energy X-ray absorptiometry images
US20190053788A1 (en) Method and ultrasound apparatus for providing annotation related information
Coratella et al. Quadriceps and gastrocnemii anatomical cross-sectional area and vastus lateralis fascicle length predict peak-power and time-to-peak-power
US20200297210A1 (en) Methods and apparatus for using brain imaging to predict performance
KR20200061236A (en) Apparatus, method, and computer program product for processing tomography image
CN112309540A (en) Motion evaluation method, device, system and storage medium
US11589841B2 (en) Ultrasound imaging device, ultrasound imaging system, ultrasound imaging method and ultrasound imaging program
Gifford et al. Optimizing breast-tomosynthesis acquisition parameters with scanning model observers
Zhang et al. Missing-view completion for fatty liver disease detection
EP3337581A1 (en) Information processing apparatus, information processing method, and program
JP5292616B2 (en) Visceral fat information calculation device and visceral fat calculation program
KR20100041312A (en) Ultrasonic sebum thickness meter
US11948324B2 (en) Ultrasound imaging device, ultrasound imaging system, ultrasound imaging method, and ultrasound imaging program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20857768

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 21/04/2022)

122 Ep: pct application non-entry in european phase

Ref document number: 20857768

Country of ref document: EP

Kind code of ref document: A1