CA2831674A1 - System and method for assessment of health risks and visualization of weight loss and muscle gain - Google Patents


Info

Publication number
CA2831674A1
Authority
CA
Canada
Prior art keywords
person
factor
fat
value
data set
Prior art date
Legal status
Abandoned
Application number
CA2831674A
Other languages
French (fr)
Inventor
Mario J. Bravomalo
Russ Edward Brucks
Current Assignee
INTER-IMAGES PARTNERS LP
Original Assignee
INTER-IMAGES PARTNERS LP
Priority date
Filing date
Publication date
Application filed by INTER-IMAGES PARTNERS LP filed Critical INTER-IMAGES PARTNERS LP
Priority to CA2831674A priority Critical patent/CA2831674A1/en
Priority claimed from CA2462703A external-priority patent/CA2462703C/en
Publication of CA2831674A1 publication Critical patent/CA2831674A1/en
Abandoned legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 19/00 Teaching not covered by other main groups of this subclass
    • G09B 19/0092 Nutrition
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/30 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/60 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to nutrition control, e.g. diets
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/50 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • General Physics & Mathematics (AREA)
  • Nutrition Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Biophysics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Medical Treatment And Welfare Office Work (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The present system (fig.1) combines image morphing technology, exercise programming, supplement sales, and motivational techniques into one product. Users begin by entering their current measurements, measurement goals and current picture into the system (fig.1), preferably via a Web site. The picture is segmented into body components, and each affected segment is morphed based upon the measurements, goals, and the segment's responsiveness to weight loss in order to create a modified image representative of the user in a post-regimen condition. This system (fig.1) helps health and fitness businesses obtain new members and retain existing members by showing the members how they will look after following a specific regimen of diet and/or exercise. The system (fig.1) also predicts health risks of diabetes, heart disease, and stroke associated with the user's pre-regimen and post-regimen conditions.

Description

SYSTEM AND METHOD FOR ASSESSMENT OF HEALTH RISKS AND
VISUALIZATION OF WEIGHT LOSS AND MUSCLE GAIN
INVENTORS: MARIO J. BRAVOMALO
RUSS EDWARD BRUCKS
BACKGROUND OF THE INVENTION
1. Field of the Invention
This invention relates to visual image processing used to predict a subject's appearance after a given amount of weight loss or muscle gain and prediction of changes in health risks associated with such weight loss or muscle gain. This invention also relates to business methods employing a predictive image visualization system to attract and retain clients of service providers in the weight loss food program, fitness center, physical therapy, sports medicine, and weight control medical industries.
2. Description of the Related Art
Many people desire to decrease their body weight, especially their body fat content. Modern lifestyles include highly sedentary weekday routines such as computer-based desk jobs, low-exercise commuting routines such as transportation by private automobile, and high-fat, high-calorie food selections often eaten quickly or while "on the run." Besides genetic tendencies, these factors lead many people to be dissatisfied with their appearance.
The problem is so prevalent that billion dollar industries have evolved to help people overcome their body dissatisfaction, including packaged food programs such as Weight Watchers (TM) and Jenny Craig (TM), fitness and workout centers such as Bally's (TM), and physical therapy and sports medicine centers. This industry has also attracted medical and osteopathic doctors who specialize in the use of diet, exercise, and sometimes prescriptive regimens to help their clients and patients achieve their weight and appearance goals.
According to marketLooks.com (TM), there are currently over 24,000 health clubs in the United States with 40 million members generating over 12 billion dollars in revenue each year. In 1995, health clubs and private individuals spent 3.2 billion dollars on fitness equipment alone, and these revenues are expected to reach 4.9 billion, a 38% increase, by the year 2001. In 1996, $500 million was spent on meal replacements and protein drinks, and these sales are expected to grow by 30% over the next five years.
However, many people fail to meet their goals, despite their efforts and the amounts they spend. The two most common reasons people fail in their attempts to change their body weight and appearance are lack of understanding and motivation.
Client and Patient Education
Previous technologies, systems, and methods do not adequately provide for the education and understanding of how exercise and diet affect the physiology of a person, especially taking into consideration the person's frame size or "build"
and metabolism. Some available technologies include the ability to scan a photograph or import an image of a client or patient from a digital camera and to digitally alter the image manually to produce an estimate of the client's future appearance.
Currently available systems and methods simply "shrink" an image, such as by hand manipulation and editing of a digitized photograph, also known as digital "retouching." However, different body builds will store fat in different amounts in various portions of the body, and different exercises will reduce and/or firm up different body areas unevenly. Additionally, certain features of the body will show little or no response to weight change. For example, if the width of an image of a leg is decreased by a certain percentage, the appearance of the knee will be changed. However, knees generally do not have a significant fat layer and thus have a minimum circumference at almost any weight. So, the resulting image would predict an overall thin appearance to a leg which is not physiologically achievable. Similar factors apply to other points in the body, such as the width of shoulders and hips and the circumference of joints. As this method is highly inaccurate, it does not provide the level of education a client or patient needs to understand why particular diets and exercises have been recommended and how to adjust and apply this information in the future.
In order to accurately predict a future appearance, many physiological factors must be taken into account with diet and exercise goals. Estimating the results of these changes is typically beyond the technical and medical education
and skill sets of most staffers at weight loss packaged food program outlets and physical fitness centers, and such estimation may be highly labor intensive and expensive if performed by appropriately qualified health and medical professionals.
At present, there are a few relevant resources available on the Internet. One service, called MorphOver (TM) from eFit of New York City, New York, provides a service in which users e-mail a digital photograph in JPEG format to the company's website without any body measurements, body fat data, or indicated goals, and the service returns a "slimmed" photograph file within a few weeks.
The instructions indicate that the original or "before" photograph must be of the subject in dark clothing, in a certain position, and with a white background.
Another on-line service offered by Sound Feelings Publishing of Reseda, California, is similar in that it only requires submission of a photograph without any data as to the subject's body fat, dimensions, or goals. Additionally, the advertisement for this service states that a digital photograph artist will spend at least two hours manually manipulating the photograph.
Client and Patient Motivation
There are very few credible, non-surgical remedies for rapid weight loss.
Therefore, successful weight loss programs require months to even years of commitment and adherence to prescribed diet and exercise regimens. If a client or patient becomes unmotivated or loses confidence in a program, he or she will not continue the program. Further, this client or patient may produce negative effects on the attraction and retention of other clients and patients as they will report to their friends and acquaintances that the program is another "scam" or "doesn't work," or that a particular professional is not competent. This can lead to a decline in memberships of businesses which are membership-based.
Therefore, there is a need in the art for a system and method which accurately produces predicted images of a weight-loss client or patient. Such a system and method should be operable by persons of usual skill and education who are commonly employed in the package food program and fitness center industries.
Further, there is a need in the art for this system and method to easily and quickly produce intermediate images, such as weekly or monthly predictions, in order to
provide accurate and positive confidence reinforcement to the client or patient, thereby enhancing the likelihood that the client or patient will continue to abide by the program and ultimately achieve his or her goals. There also exists a need in the art for this system and method to be operable in a networked or an Internet-based form or in a single workstation form. Additionally, there exists a need in the art for a method of leveraging such a system to attract and retain clients and patients in this industry.
SUMMARY OF THE INVENTION
System Overview
The present system comprises a fitness profiler which helps users gain insight into their fitness plans and their projected outcomes and results. The present system combines image morphing technology, exercise programming, supplement sales, and motivational techniques into one product.
Users begin by entering their measurement goals and current picture into the system, preferably via an Internet web site. The system analyzes the user's data and produces a customized fitness plan by applying a "morphing" process to the "before view." The picture is sectionalized into body components which are highly responsive to weight loss and components which are less responsive to weight loss, and the amount of change in each body section is determined by physiological tables and formulae. The resulting modified "after view" image is then returned to the user, preferably by online communications such as e-mail.

The Image Analyzer
The combination of three-dimensional ("3-D") morphing technology with mathematical statistics is used to project fat loss and muscle gain and to produce projected fitness outcomes. The user's input data preferably includes skin fold, circumference, height, weight, BMR, and activity level. By entering the client's measurements into a mathematical formula, the user's picture can be morphed into the desired outcome. The combination of skin fold and circumference measurement produces an accurate morphing outcome for each user.
Business Method for Use of the Predictive System
The image prediction system in accordance with the present invention helps the fitness industry overcome two of its biggest problems: obtaining new members and retaining current members. People may decide to join or renew their membership with a specific health club because it offers the present image prediction system as a service. By showing members how they will look 10 pounds thinner, for example, and giving them a clear-cut, understandable plan on
how to achieve it, businesses in this industry will generate a satisfied and loyal clientele.
The present image prediction system is useful for nationwide health clubs, diet centers, and exercise equipment manufacturers. Direct marketing to Internet users may also be employed, as the technology and methods lend themselves well to interfacing to the user via common web site and browser technologies. As such, Internet users who are looking to start a fitness program may have access to the present image prediction system via a web site.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 shows the arrangement of an Internet browser computer, digital photography and scanning equipment, the Internet, and a system server in accordance with the present invention.
Fig. 2 illustrates in detail the functional organization of the system of Fig. 1.
Fig. 3 depicts a cross-sectional view of a body portion to illustrate the calculation of base circumference of a body part.
Fig. 4 sets forth an example of locating grids placed on a subject's photograph to aid the image processor in locating each body part.
Fig. 5 shows the result of the placing of a grid over a single body part during the process of finding edges of the body part.
Fig. 6 illustrates the result of the morphing to reduce the width of the body part image.
Fig. 7 shows a simulated side-by-side "before" and "after" comparison output from the system.
Fig. 8 is a graph of male optimal weight versus height.
Fig. 9 is a graph of female optimal weight versus height.
Fig. 10 is a graph of essential body fat versus age as a function of ideal weight.
Fig. 11 is a graph of male age factor versus age.
Fig. 12 is a graph of female age factor versus age.
Fig. 13 is a graph of resistance training compliance by days.
Fig. 14 is a graph of muscle gain compensation by age.
Fig. 15 is a graph of muscle gain factor by age.
Fig. 16 is a graph of muscle gain scaling by nutritional compliance.
Fig. 17 is a screen display of personal information generated by computer software in accordance with the present invention.
Fig. 18 is a variation of the screen display of Fig. 17.
Fig. 19 is another variation of the screen display of Fig. 17.
Fig. 20 is a screen display of health risks generated by computer software in accordance with the present invention.
Fig. 21 is a screen display of health risks and an associated image for a female generated by computer software in accordance with the present invention.
Fig. 22 is a screen display of health risks and an associated image for a male generated by computer software in accordance with the present invention.
Fig. 23 is a variation of the screen display of Fig. 21.
Fig. 24 is a screen display of diet and exercise goals and an associated image for a male before commencement of a desired regimen.
Fig. 25 is a screen display for the male of Fig. 24 after following the desired regimen for a period of time.
Fig. 26 is a screen display of diet and exercise goals and associated images for a male before commencement of a desired regimen and after following the desired regimen for a period of time.
Fig. 27 is a screen display of diet and exercise goals and an associated image for a female before commencement of a desired regimen.
Fig. 28 is a screen display for the female of Fig. 27 after following the desired regimen for a period of time.
DETAILED DESCRIPTION OF A PREFERRED EMBODIMENT
An image prediction system in accordance with the present invention is preferably an Internet-based fitness system and service, which helps the user meet
his or her fitness objectives. However, it may be implemented as a stand-alone workstation for use within a health club facility or medical professional's office.
In general, users enter their measurements, goals and current picture into the system. The system analyzes the user's data, generates a daily fitness program to help the customer reach his or her goal, and produces an after-fitness program image of the user. By setting the goals at an intermediate level, intermediate
results can be projected and visualized.
The system employs readily available image morphing technology, driven by specialized technology to sectionalize the image into body components and predict specific size changes based upon physiological formulae and data tables.
System Overview
In a preferred embodiment, the user, health club advisor, or medical professional may use the system via a web site using a web browser, although in an alternate embodiment he or she may use the system directly. Figure 1 illustrates the basic system components, including a browser computer (1) with Internet access (5), a digital camera (2) or digital scanner (4), and optionally a printer (3). The computer can be any of several well-known and readily available systems, such as IBM-compatible personal computers running Microsoft's Windows operating system equipped with web browser software such as Microsoft's Internet Explorer or Netscape's Navigator, and appropriately equipped with a dial-up modem, cable modem, or Internet access via a local area network interface.
Alternate computers, software and operating systems, such as Apple iMac, Unix and Linux, may be used equally well.
The system also preferably includes a computer network (6), such as the Internet or an intranet, and a server (7). This server (7) is preferably based upon any of the well-known, readily available Internet web server platforms, such as an IBM-compatible personal computer running Microsoft's Windows NT operating system and an Apache web server. The user may point his or her web browser to the address or Uniform Resource Locator ("URL") of the server (7) to access web pages and forms, such as HTML, XTML, and Common Gateway Interface ("CGI"), all of which are well-known within the art. The user may transfer his or her "before" photo in the form of any of many well-known digital photograph
formats, such as Joint Photographic Experts Group ("JPEG"), bitmap ("BMP") or tagged-image file format ("TIFF"), either by attachment to an e-mail message, retrieval by a Java client script (supplied by the server), by file transfer protocol ("FTP"), or any other suitable means.
Figure 2 shows a preferred functional organization of the server system (7), which includes a web content server (22), a mathematical analyzer (23) and an image processor (24). In a preferred embodiment, the server system interfaces directly to the Internet using any of the well-known methods, such as by modem or local area network. The image processor (24) is described in more detail infra, as is the mathematical analyzer (23). If the system is implemented as a stand-alone workstation, it may also include a Graphical User Interface ("GUI") function for user control and input, such as web browser software or a custom GUI program.
Additionally, for stand-alone use, a digital camera or scanner may be added to the system via a Universal Serial Bus ("USB") port, parallel port, or other common computer interface.
In a preferred embodiment, the user accesses the server (7) via an Internet (6) arrangement, using his or her browser computer (1). The web content server (22) transmits web pages, such as HTML and CGI forms, to the user to establish an account session and verify the user's identity, which are viewed and completed using a web browser (20). The user may then enter specific goals and measurement data and submit a "before" photographic image file. The goals and measurements may be entered using a client-side Java applet, Adobe Acrobat Portable Document Format ("PDF"), CGI form, or other suitable means, and the photo file (21) may be uploaded to the server (7) by e-mail attachment, FTP, a client-side script, or other suitable means.
The user's measurements and goals are received by the mathematical analyzer (23), wherein certain formulae and data tables are applied to determine the amount of exercise to achieve the weight loss goal, and the amount of circumferential reduction in each body section.
The user's "before" image file (21) is received by the image processor (24), as are the body segment circumference changes from the mathematical analyzer (23). The image processor (24) segments the photo into body sections, applies the
reduction changes by morphing the photo, and produces the "after" image, which is then returned to the user by the web content server (22), preferably via a web page or e-mail attachment.
In the alternative, stand-alone embodiment, the "after" image is returned to the GUI (25) so that the operator and/or client or patient may view the projected results.
In practice, the goals may be adjusted to produce the desired results and/or intermediate results, thereby providing a full fitness plan needed to achieve the user's or client's goals.
Mathematical Analysis
The present system preferably requires measurements to be taken in order to produce a customized fitness plan. The desired measurements include:
(1) The circumferences of the neck, arm, chest, waist, hips, thigh, and calf.
(2) The skin fold of the neck, biceps, triceps, chest, scapula, abdomen, low back, hip, thigh, hamstring, and calf.
(3) The user's height, weight, and age.
(4) The desired percent of body fat.
By taking the skin fold and circumference measurements, the present system utilizes the following new formula to find the circumference of the fat layer and predict the reduction in circumference for a particular body segment (all units are preferably in centimeters unless otherwise noted):
Cchange = Cafter - Cbefore
where Cchange is the change in circumference of a body part, Cafter is the final circumference of the body part after fat loss, and Cbefore is the circumference of the body part at the beginning of the program.
Figure 3 shows a cross sectional view of a body part, such as an upper arm or thigh, including a layer of fat with skin (30), a layer of muscle (31), and an underlying bone structure (32). The muscle and bone structure represent the component of the body part which will not be heavily affected by fat loss.
Thus, the circumference of the body part without fat is calculated as:

Cno fat = 2 x π x rno fat
= 2 x π x [rbefore - twice the depth of fat]
= 2 x π x [Cbefore / (2 x π) - (2 x skin_fold_measurement)]
where Cno fat represents the minimum circumference of a body part with no fat, π represents an approximation for the constant "pi," such as 3.14, Cbefore represents the starting circumference (current circumference) of the body part, and "skin_fold_measurement" is the measurement of a standard skin fold. All units are preferably in centimeters, although the formula holds for any unit of measure.
The body part circumference after a desired percentage fat loss is calculated as:
Cafter = 2 x π x {rno fat + twice the desired depth of fat}
= 2 x π x {[rbefore - twice the depth of fat] + twice the desired depth of fat}
= 2 x π x {[Cbefore / (2 x π) - (2 x skin_fold_measurement)] + [2 x skin_fold_measurement x (1 - P) x V]}
where Cafter is the circumference of the body part after the desired fat loss, P is the amount of desired fat loss expressed in decimal form (e.g., 10% desired loss would be 0.10), and V is a constant based upon the body part being analyzed. The V constant is drawn from a table, and provides the variability to account for different body parts being more responsive to weight loss than others. For example, a body part which is highly responsive to weight loss would have a V value close to unity, while other less responsive body parts would have a greater than unity V value.
Table 1 shows the preferred values for V.
Table 1: Adjustment Variable for Each Body Part

Body Part     Skin Fold    Rank Order    Variable (V)
neck          3 mm         9             1
biceps        4 mm         8             1.1
triceps       14 mm        6             1.5
chest         21 mm        5             1.8
subscapula    19 mm        4             1.7
abdomen       30 mm        1             2.5
hip           24 mm        3             1.9
thigh         27 mm        2             2.0
calf          13 mm        7             1.2

The variable number is dependent on the skin fold of each body part. The rank order gives the ability to place the skin fold measurements in an order so that the variable numbers can be assigned. By ranking the skin fold measurements from greatest to least, the variable numbers can be assigned. This table can change depending on where a person stores his or her fat.
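Purely as an illustration, the short Python sketch below evaluates the circumference formulas above for a single body part. The function name, the V_TABLE dictionary, and the sample neck measurements are assumptions added for this example and are not part of the specification; the V values themselves are the preferred values from Table 1.

import math

# Adjustment variable V for each body part (preferred values from Table 1)
V_TABLE = {
    "neck": 1.0, "biceps": 1.1, "triceps": 1.5, "chest": 1.8, "subscapula": 1.7,
    "abdomen": 2.5, "hip": 1.9, "thigh": 2.0, "calf": 1.2,
}

def circumference_after_fat_loss(c_before_cm, skin_fold_cm, desired_loss_p, body_part):
    """Evaluate Cno fat and Cafter for one body part, per the formulas above.

    c_before_cm    -- current circumference of the body part (cm)
    skin_fold_cm   -- skin fold measurement for that body part (cm)
    desired_loss_p -- P, the desired fat loss as a decimal (0.10 = 10%)
    body_part      -- key into V_TABLE
    """
    v = V_TABLE[body_part]
    r_before = c_before_cm / (2 * math.pi)
    # Minimum circumference of the body part with the fat layer removed
    c_no_fat = 2 * math.pi * (r_before - 2 * skin_fold_cm)
    # Circumference after the desired fat loss, scaled by the body part's V value
    c_after = 2 * math.pi * ((r_before - 2 * skin_fold_cm)
                             + 2 * skin_fold_cm * (1 - desired_loss_p) * v)
    return c_no_fat, c_after

# Hypothetical inputs: a 40 cm neck, a 3 mm (0.3 cm) skin fold, and a 10% desired fat loss
no_fat, after = circumference_after_fat_loss(40.0, 0.3, 0.10, "neck")
print(round(no_fat, 1), round(after, 1))   # roughly 36.2 and 39.6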
Method of Producing the Predicted Image
In a preferred embodiment, the following method is implemented in software. The programming language is of little consequence, as the required calculations can be performed by most well-known languages, including "C", Java, and "C++." The method comprises the steps of:
(a) Receive from intake data sheet all user information needed for analysis (i.e., skin fold measurements, age, height, weight, desired loss amount), and receive "before" photographic image file.
(b) Scale real-life measurements to picture size.
(c) Place photograph on grid.
(d) Convert intake data to reduction on photograph utilizing formula.
(e) Place locating grids of individual body parts on "before" image.
(f) Find outline of the individual body parts within locating grids.
(g) Apply reduction of the individual body parts using morphing function.
(h) Apply original "before" photo next to reduced "after"
photo.

In the first step, receipt of data from the intake data sheet, the software receives the user's name, age, current weight, height, and circumference measurements for the neck, arm, chest, abdomen, hips, thigh, and calf. The data also includes skin fold measurements for the neck, biceps, triceps, chest, subscapula, abdomen, hips, thigh, and calf. The data further includes the desired percent body fat goal.
In the second step, the measurements are scaled to the picture size by taking the person's height and dividing by the picture height. All other real-life measurements are then divided by this ratio (i.e., multiplied by its inverse) to produce scaled measurements.

For example, if a person is actually 5 feet 9 inches (69 inches) tall, and the photograph submitted represents a 7 inch tall image, the scaling ratio is 69/7, or approximately 9.86. So, all real-life measurements would be multiplied by the inverse of the scaling ratio to yield a scaled measurement set.
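As a minimal sketch of this scaling step (the function and measurement names are illustrative assumptions, not taken from the specification):

def scale_to_picture(real_measurements_in, real_height_in, picture_height_in):
    """Convert real-life measurements to picture units.

    The scaling ratio is the person's real height divided by the height of the
    image; dividing each real measurement by that ratio (multiplying by its
    inverse) gives the corresponding measurement at picture scale.
    """
    ratio = real_height_in / picture_height_in          # e.g., 69 / 7, roughly 9.86
    return {name: value / ratio for name, value in real_measurements_in.items()}

# Hypothetical example: a 69-inch-tall person photographed as a 7-inch-tall image
scaled = scale_to_picture({"waist": 36.0, "neck": 15.0}, 69.0, 7.0)
print(scaled)   # each value is roughly one tenth of its real-life size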
In the third step, a locating grid is used to identify each body part, as shown in Fig. 4. Locating grids are placed on the arms (40), hips and/or buttocks (42), abdomen (43), thigh (44), calf (45), chest (47), and neck (46). In a preferred embodiment, a feature extraction algorithm may be used to automatically find each body portion, aided by the placement of arrows (49) on a background behind the subject at the time of taking the photograph. Alternatively, the body-part identifying grids may be placed on the photo manually through a graphical user interface. Both implementations are within the skill of the art of software engineers with expertise in this type of image processing.
In the fourth step, a grid is overlaid on each body segment image, as shown in Fig. 5. The grid (50) is useful in the process of finding the edges (51) of the image of the body part and in applying a percent reduction to the image.
In the fifth step, the reductions for each affected body segment are applied using an image morphing function, as shown in Fig. 6 with the new edges (60) of the body part image. This yields an "after" image in which each affected body part has been analytically reduced based upon each part's responsiveness to fat loss, the estimated beginning fat layer thickness based upon the skin fold measurements, and the desired amount of reduction of fat.
Finally, simulated "before" (70) and "after" (71) images are displayed side-by-side for ease of comparison, such as shown in Fig. 7. Thus, a more accurate
system and method are provided which scale the current image of the client or patient on a segmented basis using physiological calculations.
Alternative Body Modeling Method
As an alternative to using actual photographic images of a person, a system in accordance with the present invention may utilize a stock or "canned" image of a generic person that may be customized to represent the salient features of the particular user by specifying a set of parameters. In this alternative embodiment, the present system preferably models body transformation in terms of muscle gain as well as fat loss, and the model is preferably three-dimensional. In order to make reasonable predictions, the system must begin with reasonable approximations of how much fat and muscle a person has initially. Preferably, a person's height, weight, age, and percent body fat are used to determine this starting point.
To present this information visually, the system translates these physiological attributes into units that will cause the computer graphics model to present an image that is consistent with how one would expect a person with those attributes to look. Additionally, the system uses a description of the body shape to know where on the body to place the fat. For example, if the person is "pear-shaped"
(represented by a triangle), the model will show most of the fat in the hips and thighs. If the person is shaped like an "apple" (represented by a circle), the model will concentrate the fat in the middle of the body. Finally, if the person has a "straight" body shape (represented by a rectangle), the model will distribute the fat more evenly. The overall effect is to create an image that is a good approximation of the person's initial appearance.
To build the body model, one of the first steps is preferably to determine how many pounds (or kilos) of fat the person would have if he or she had an amount of muscle consistent with that of a person who is currently not exercising or lifting weights. In terms of a preferred computer model, this means that the muscle layer is turned off or zero. This will yield a reasonable initial estimate of the person's percent body fat. If the user adjusts the percent body fat value, the system will modify the pounds (or kilos) of muscle and fat accordingly. These modifications will be translated into adjustments in the computer model's fat and muscle units. So, as the user increases the percent body fat, the model will grow "fatter," and as the user reduces the percent body fat, the model will grow more muscular.
The person's gender (male or female) is also generally an important factor because many relationships used in body modeling vary by gender. Additionally, the person's height and weight are important initial parameters. A set of equations is used to specify an optimal weight for any given height, depending on gender.
By subtracting the optimal weight from the person's current weight, the system may establish an initial estimate of how many excess pounds of fat the person has available to lose. This determination of the amount of excess fat also enables the system to determine how much to inflate the fat layer of the model.
The optimal weight for a given height is given by the following equations, in which the units are inches for height and pounds for weight.
Gender    Height Condition                           Equation
Male      All Heights                                Ideal Weight = (4 x Height) - 100
Female    Height 66 inches (5 ft 6 in) or greater    Ideal Weight = (4 x Height) - 131
Female    Height less than 66 inches (5 ft 6 in)     Ideal Weight = (3 x Height) - 65

These optimal weight equations are shown graphically in Figs. 8 and 9.
In addition to the pounds of fat that can be lost through diet and exercise, there are some additional pounds of fat that are required by the human body.
To estimate the percentage of body fat for a person, one must estimate essential fat as well as excess fat. By combining the estimated excess pounds of fat and the estimated essential pounds of fat, one arrives at an estimate of the total pounds of fat.
A person's amount of essential fat varies by age and may be estimated by the following equation:
Essential Fat = ((Age x 0.001625) + 0.0425) x (Ideal Weight) This essential fat equation is illustrated graphically in Fig. 10.
To calculate the estimated body fat percentage, divide the pounds of fat by the person's current body weight as follows:
Body Fat Percentage = ( Essential Fat + Excess Fat) / Body Weight If the person is female, add 3%.

Preferably, a computer program in accordance with the present invention will not render an initial estimate of body fat percentage that is less than 4% for males or less than 7% for females, and the maximum estimated body fat percentage is preferably 61% for males and 64% for females.
Thus, by way of example, for a male who is 6 feet (72 inches) in height, 240 pounds in weight, and 40 years of age, his initial body fat may be estimated using the above equations as follows:
Ideal Weight = (4 x 72 inches) - 100 = 188 Lbs.
Excess Pounds = 240 Lbs. - 188 Lbs. = 52 Lbs.
Essential Fat = ((40 Yrs x 0.001625) + 0.0425) x (188 Lbs) = 20.21 Lbs.
Pounds of Fat = 20.21 Lbs. + 52 Lbs. = 72.21 Lbs.
Percent Body Fat = 72.21 Lbs. / 240 Lbs. = 30%
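The following Python sketch reproduces the worked example for the 6 foot, 240 pound, 40 year old male. The function names are illustrative assumptions; the clamping limits are the preferred 4%/61% and 7%/64% bounds mentioned above, and the 66-inch cutoff for the female equations is assumed to be inclusive.

def ideal_weight_lbs(height_in, is_female):
    """Optimal weight (lbs) for a given height (inches), per the equations above."""
    if not is_female:
        return 4 * height_in - 100
    # Assumption: the taller-female equation applies at 66 inches and above
    return 4 * height_in - 131 if height_in >= 66 else 3 * height_in - 65

def essential_fat_lbs(age_yrs, ideal_wt_lbs):
    """Essential fat in pounds as a function of age and ideal weight."""
    return ((age_yrs * 0.001625) + 0.0425) * ideal_wt_lbs

def initial_body_fat_pct(height_in, weight_lbs, age_yrs, is_female):
    """Initial estimate of body fat percentage from height, weight, age, and gender."""
    ideal = ideal_weight_lbs(height_in, is_female)
    excess = max(weight_lbs - ideal, 0.0)               # excess pounds available to lose
    fat_lbs = essential_fat_lbs(age_yrs, ideal) + excess
    pct = fat_lbs / weight_lbs * 100.0
    if is_female:
        pct += 3.0                                      # female adjustment noted above
    lo, hi = (7.0, 64.0) if is_female else (4.0, 61.0)  # preferred display limits
    return min(max(pct, lo), hi)

print(round(initial_body_fat_pct(72, 240, 40, is_female=False)))   # 30, matching the example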
A preferred display screen corresponding to the foregoing example is shown in Fig. 17. The person's height, weight, age, and body fat are preferably selected by slider bars or entered in dialog boxes as shown in Fig. 17, but such parameters may be entered in any other suitable manner.
A person's body appearance as well as the amount of fat for any given weight will vary depending on the person's age. The person's age is preferably used to make a small adjustment to the amount of fat that appears on the model for any given weight. This variation is preferably described by a parabolic function that applies between the ages of 30 and 70 as reflected in the following equation for males, which is depicted graphically in Fig. 11:
Age Factor = ((-0.000438) x Age² + (0.0439) x Age) - 1
A similar age factor relationship for females is depicted graphically in Fig.
12. For men, this age factor is preferably applied to boost fat pounds by up to 10% of a person's ideal weight. For women, the age factor is preferably applied to boost fat pounds by up to 5% of their ideal weight. For a middle-aged person, these additional pounds of fat will affect the appearance of the model. However, these additional pounds preferably are not assumed to be available for loss.
Thus, by way of example, considering again a 6 foot tall male who is 40 years of age, the weight adjustment for age is preferably calculated as follows:

Age Factor = ((-0.000438)(40)² + (0.0439)(40)) - 1 = 0.0552
Age Pounds = 188 x (0.0552) = 10.3776 lbs.
This man's body model will then be shown with 10 more pounds of fat than would a similar person of age 35.
A system in accordance with the present invention also preferably allows manual adjustment of the user's body fat percentage. After an initial estimate of body fat percentage has been calculated, the user may increase or decrease the body fat percentage to the desired value by sliding the "% Body Fat" slider bar right or left (see Figs. 18 and 19). This will have the effect of making the model "flabbier" or more muscular. If the user increases the body fat percentage, the software program increases the pounds of fat assumed for the given weight to reflect the new percentage. This also increases the number of pounds of fat available for loss. The fat layer of the model is inflated to reflect these additional pounds of fat.
Pounds of Muscle Converted into Fat = (Body Weight) x (Selected Body Fat % - Initial Body Fat %)
In the previous example, the 6 foot, 240 pound, 40 year old male was estimated to have a body fat percentage of 30%. If the user selects a body fat percentage of 37%, the model will reflect 16.8 more pounds of excess fat, as follows (see Fig. 18):
Pounds of Muscle Converted into Fat = (240)(37% - 30%) = 16.8 Lbs.
If the user reduces the body fat percentage, the program converts pounds of fat (that were available for loss) into lean muscle mass to reflect the new percentage. As lean muscle is added, the muscle layer of the model is inflated to show greater muscle development. Additionally, the fat layer is reduced to reflect fewer pounds of excess fat.
Pounds of Fat Converted into Muscle = (Body Weight) x (Initial Body Fat % - Selected Body Fat %)
In the previous example, the 6 foot, 240 pound, 40 year old male was estimated to have a body fat percentage of 30%. If the user selects a body fat percentage of 25%, the model will reflect 12 more pounds of lean muscle and 12 less pounds of fat, as follows (see Fig. 19):
Pounds of Fat Converted into Muscle = (240) x (30% - 25%) = 12 Lbs.

In addition, the muscle layer of the model and the fat layer of the model preferably may be adjusted independently by sliding the Muscle Adjustment or the Fat Adjustment slider bars. Increasing the Muscle Adjustor increases the appearance of muscle without modifying the weight or body fat percentage of the subject. Similarly, moving the Fat Adjustment will change the amount of fat displayed on the model without changing the weight or the pounds of fat available for loss. Moving these sliders modifies the default assumptions about how a pound of fat or a pound of muscle appears on a particular person. All subsequent modifications to the appearance of the model for this person will reflect these manual adjustments.
Health Risk Calculations
The present system also preferably estimates the risk of Type II Diabetes, Cardiovascular Heart Disease, and Stroke. Such disease risks are preferably calculated based on Body Mass Index (BMI), body fat percentage, and family history of disease. Age, gender, smoking, exercise, high blood pressure, and high cholesterol are also preferably taken into account. The calculated risk values are based on assumptions about the relative importance of several well known risk factors. The estimates of the magnitudes of each risk factor attempt to capture the prevailing wisdom about Type II Diabetes, Cardiovascular Heart Disease, and Stroke. Although a risk value calculated in this manner does not represent the person's actual risk (it indicates only that the risk could be higher or lower if certain factors were different), these calculations dramatically demonstrate how some behaviors increase disease risk while others reduce a person's risk of disease.
As shown in Fig. 20, the person's BMI is preferably displayed with a vertical progress bar. The range of the BMI bar is preferably 18 to 40. Disease risk factors are preferably displayed with vertical progress bars having a range of 0 to 100.
The level of risk is expressed by a value between 0 and 100, which may be qualitatively categorized as follows:

Risk Value    Risk Category
0-30          Minimal
31-39         Low
40-52         Moderate
53-75         High
76-98         Very High
Over 98       Extremely High

The disease risk calculations preferably take into account the effects of body mass and body fat percentage on a person's health. The obesity factor is a combination of a person's BMI and a measure of their excess body fat. Excess body fat depends on a person's age and gender. The following table shows an acceptable amount of body fat for persons of various ages:
Age       20    40    60    80
Male      14    15    17    18
Female    17    18    19    21

To determine the excess body fat, the program subtracts the acceptable body fat percentage from the person's actual body fat percentage.
The obesity factor is preferably constructed from these basic functions:
Body Mass Index (BMI) = (Weight x 704.5) / (Height x Height)
BMI Factor = (BMI - 25) x 7.5
Acceptable Body Fat = (Age x 0.0667 - 1.3333) + 14; (add 3 more if female)
Excess Body Fat Factor = % Body Fat - Acceptable Body Fat
The obesity factor is an average of the BMI factor and the Excess Body Fat Factor:
Obesity Factor = ( BMI Factor + Excess Body Fat Factor ) /2.
Constructing the obesity factor in this way allows the present system to use the power of BMI to predict disease while also taking into account the effect of a high body fat percentage. The obesity factor is preferably scaled by the function:
Y = 0.04X - 0.28
This scaling function effectively lowers the calculated risks for athletes and others with a low percentage of body fat. If the BMI effect is a proxy for the amount of body fat a person carries, and the person's body fat is known to be lower, then the effect of this factor should be smaller. For example, using this scaling function, the obesity factor for a person with 7% body fat would be scaled to 0. However, the obesity factor of a person with 32% body fat would be scaled by 100%.
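A minimal Python sketch of the obesity factor and its body-fat scaling follows. The function and parameter names are illustrative assumptions, and flooring a negative scaler at zero is an added assumption not stated in the text.

def scaled_obesity_factor(weight_lbs, height_in, age_yrs, body_fat_pct, is_female):
    """Obesity factor used by the disease risk estimates, after body-fat scaling."""
    bmi = (weight_lbs * 704.5) / (height_in * height_in)
    bmi_factor = (bmi - 25) * 7.5
    acceptable_fat = (age_yrs * 0.0667 - 1.3333) + 14 + (3 if is_female else 0)
    excess_fat_factor = body_fat_pct - acceptable_fat
    obesity = (bmi_factor + excess_fat_factor) / 2
    scaler = max(0.04 * body_fat_pct - 0.28, 0.0)   # Y = 0.04X - 0.28; assumed floored at 0
    return obesity * scaler

# Inputs matching the diabetes example below: 5 ft 6 in, 175 lb, 36-year-old female, 35% body fat
print(round(scaled_obesity_factor(175, 66, 36, 35, is_female=True), 1))   # roughly 23.4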
Diabetes Risk Calculation
A person's risk of Type II Diabetes is the sum of six factors: family history, exercise, age, gender, BMI, and Percent Body Fat.
Family History: Add 15 if the subject has a family history of diabetes.
Exercise: Add 25 if the subject is not exercising.
Age: Add (Age / 5) - 7.
BMI: Add 7.5 for every unit of BMI over 25.
% Body Fat: Add 1 for every percent of Excess Body Fat; scales the obesity factor by (4X - 28)%.
The obesity factor is magnified by 25% for Type II Diabetes.
By way of example, consider a female subject 5 ft. 6 in. tall, 175 lbs., 36 years old who has a family history of diabetes, does not exercise, has a BMI
of 28.3, and has a body fat percentage of 35%. Her risk of Diabetes is calculated as follows (see Fig. 21):
Family History of diabetes {0, 15}:              15
Not Exercising {0, 25}:                          25
Age: (36/5) - 7 = 0                               0
BMI Factor: 7.5 x (28.3 - 25) = 24.75
Acceptable Body Fat: 17 + (36 x 0.07) - 1.3 = 18
Excess Body Fat Factor: 35 - 18 = 17
Obesity Factor: (24.75 + 17) / 2 = 20.9
Body Fat Scaler: 4(35) - 28 = 112%; 112% x 20.9 = 23.4
25% Magnifier for Diabetes: 1.25 x 23.4 = 29
Diabetes Risk:                                   69

A value of 69 places the subject in the "High" risk category, as depicted in Fig. 21.
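A corresponding Python sketch of the diabetes risk sum follows; the names are illustrative, and rounding only the final sum is an assumption about how intermediate values are handled.

def diabetes_risk(age_yrs, bmi, body_fat_pct, is_female, family_history, exercising):
    """Type II Diabetes risk as the sum of the factors listed above."""
    risk = 0.0
    risk += 15 if family_history else 0
    risk += 25 if not exercising else 0
    risk += age_yrs / 5 - 7                              # age contribution
    acceptable_fat = (age_yrs * 0.0667 - 1.3333) + 14 + (3 if is_female else 0)
    obesity = ((bmi - 25) * 7.5 + (body_fat_pct - acceptable_fat)) / 2
    obesity *= 0.04 * body_fat_pct - 0.28                # body fat scaler, (4X - 28)%
    risk += 1.25 * obesity                               # obesity factor magnified 25% for diabetes
    return round(risk)

# Worked example above: 5 ft 6 in, 175 lb, 36-year-old female, BMI 28.3, 35% body fat
print(diabetes_risk(36, 28.3, 35, is_female=True, family_history=True, exercising=False))   # 69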
Heart Disease Risk Calculation
Coronary Heart Disease Risk is the sum of nine factors: age, family history, gender, smoking, high blood pressure, exercise, high cholesterol, BMI, and body fat percentage.
Age: Add 1 for each year over 59, up to a maximum of 10 years.
Family History: Add 15 if the subject has a family history of heart disease. Add 5 if the subject has a family history of Diabetes.
Smoking: Add 25 if the subject smokes.
Gender: Add 5 if the subject is male.
High Blood Pressure: Add 15 if the subject has high blood pressure (greater than 120/80).
Exercise: Add 10 if the subject is not exercising.
High Cholesterol: Add 10 for high cholesterol (240 mg/dL or higher).
BMI: Add 7.5 for every unit of BMI over 25.
% Body Fat: Add 1 for every percent of Excess Body Fat; scales the obesity factor by (4X - 28)%.
As an example, consider a male subject 6 ft. tall, 250 lbs., 62 years old who smokes and has a family history of heart disease. He does not have high blood pressure or high cholesterol. His BMI is 34.0. His Body Fat is 36%. His risk of heart disease is calculated as follows (see Fig. 22):

Age over 59: 62 - 59 = 3                          3
Family History of heart disease {0, 15}:         15
Family History of diabetes {0, 5}:                0
Smoking {0, 25}:                                 25
Male Gender {0, 5}:                               5
No High Blood Pressure {0, 15}:                   0
Some Exercise {0, 10}:                            0
High Cholesterol {0, 10}:                         0
BMI Factor: 7.5 x (34.0 - 25) = 67.5
Acceptable Body Fat: 14 + (62 x 0.07) - 1.3 = 17
Excess Body Fat Factor: 36 - 17 = 19
Obesity Factor: (67.5 + 19) / 2 = 43.25
Body Fat Scaler: 4(36) - 28 = 116%; 116% x 43.25 = 50
Heart Disease Risk:                              98

A value of 98 places the subject in the "Very High" risk category, as depicted in Fig. 22.
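In the same spirit, a Python sketch of the heart disease risk sum (illustrative names only):

def heart_disease_risk(age_yrs, bmi, body_fat_pct, is_male, smokes, exercising,
                       high_bp, high_cholesterol, family_heart, family_diabetes):
    """Coronary heart disease risk as the sum of the factors listed above."""
    risk = 0.0
    risk += min(max(age_yrs - 59, 0), 10)       # 1 point per year over 59, capped at 10
    risk += 15 if family_heart else 0
    risk += 5 if family_diabetes else 0
    risk += 25 if smokes else 0
    risk += 5 if is_male else 0
    risk += 15 if high_bp else 0
    risk += 10 if not exercising else 0
    risk += 10 if high_cholesterol else 0
    acceptable_fat = (age_yrs * 0.0667 - 1.3333) + 14 + (0 if is_male else 3)
    obesity = ((bmi - 25) * 7.5 + (body_fat_pct - acceptable_fat)) / 2
    risk += obesity * (0.04 * body_fat_pct - 0.28)   # body fat scaler, (4X - 28)%
    return round(risk)

# Worked example above: 6 ft, 250 lb, 62-year-old male smoker with a family history of heart disease
print(heart_disease_risk(62, 34.0, 36, is_male=True, smokes=True, exercising=True,
                         high_bp=False, high_cholesterol=False,
                         family_heart=True, family_diabetes=False))   # 98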
Stroke Risk Calculation
Stroke risk is the sum of eight factors: age, family history, gender, smoking, high blood pressure, exercise, BMI, and high cholesterol.
Age: Add 1 for each year over 59.
Family History: Add 15 if the subject has a family history of stroke. Add 5 if the subject has a family history of diabetes.
Smoking: Add 15 if the subject smokes.
Gender: Add 5 if the subject is male.
High Blood Pressure: Add 25 if the subject has high blood pressure (greater than 120/80).
Exercise: Add 5 if the subject is not exercising.
High Cholesterol: Add 10 for high cholesterol (240 mg/dL or higher).
BMI: Add 7.5 for every unit of BMI over 25.
% Body Fat: Add 1 for every percent of Excess Body Fat; scales the obesity factor by (4X - 28)%.
As an example of stroke risk calculation, consider a female subject 5 ft. 3 in. tall, 165 lbs., 60 years old who smokes, has high blood pressure, is physically inactive, and has a family history of stroke. Her BMI is 29.3. Her Body Fat is 38%. Her risk of stroke is calculated as follows (see Fig. 23):
Age over 59: 60 - 59 = 1                           1
Family History of Stroke {0, 15}:                 15
Family History of diabetes {0, 5}:                 0
Smoking {0, 15}:                                  15
Male Gender {0, 5}:                                0
High Blood Pressure {0, 25}:                      25
Not Exercising {0, 5}:                             5
High Cholesterol {0, 10}:                          0
BMI Factor: 7.5 x (29.3 - 25) = 32.25
Acceptable Body Fat: 17 + (60 x 0.07) - 1.3 = 20
Excess Body Fat Factor: 38 - 20 = 18
Obesity Factor: (32.25 + 18) / 2 = 25.1
Body Fat Scaler: 4(38) - 28 = 124%; 124% x 25.1 = 31
Risk of Stroke:                                   92

A value of 92 places the subject in the "Very High" risk category, as depicted in Fig. 23.
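And a matching Python sketch for the stroke risk sum (illustrative names only):

def stroke_risk(age_yrs, bmi, body_fat_pct, is_male, smokes, exercising,
                high_bp, high_cholesterol, family_stroke, family_diabetes):
    """Stroke risk as the sum of the factors listed above."""
    risk = 0.0
    risk += max(age_yrs - 59, 0)                 # 1 point per year over 59
    risk += 15 if family_stroke else 0
    risk += 5 if family_diabetes else 0
    risk += 15 if smokes else 0
    risk += 5 if is_male else 0
    risk += 25 if high_bp else 0
    risk += 5 if not exercising else 0
    risk += 10 if high_cholesterol else 0
    acceptable_fat = (age_yrs * 0.0667 - 1.3333) + 14 + (0 if is_male else 3)
    obesity = ((bmi - 25) * 7.5 + (body_fat_pct - acceptable_fat)) / 2
    risk += obesity * (0.04 * body_fat_pct - 0.28)   # body fat scaler, (4X - 28)%
    return round(risk)

# Worked example above: 5 ft 3 in, 165 lb, 60-year-old inactive female smoker with high blood
# pressure and a family history of stroke
print(stroke_risk(60, 29.3, 38, is_male=False, smokes=True, exercising=False,
                  high_bp=True, high_cholesterol=False,
                  family_stroke=True, family_diabetes=False))   # 92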
Fat Loss Calculations
To predict how much fat will be burned over time, the present system treats the human body as a system that is constantly consuming energy. If a person takes in more energy (food) than the body can use, that surplus energy will be stored as fat. If a person uses more energy than is consumed, the body will convert stored fat into energy and that person will lose fat. A person can create an energy deficit by eating less or exercising more. It is this energy deficit that the present system calculates. When a user selects a certain number of days of compliance per week (or other suitable period of time) for eating and exercising according to plan, the present system adds up their caloric deficit for the week (or other suitable time period). According to well known data, it is assumed that one pound of fat is equal to 3,500 calories of energy. So, if a person can introduce a 3,500-calorie deficit within the course of a week, that person will lose one pound in one week.
It is assumed that the person's current physical condition is a result of their current eating and exercise habits. To recommend a dietary caloric deficit, the system first calculates how many calories are required per day to maintain the person's current weight. From that maintenance amount, subtracting an appropriate number of calories creates the desired dietary deficit. To find the initial maintenance value, the system preferably uses the standard calculation for Basal Metabolic Rate (BMR), as follows.
BMRfemale = Activity Rate x (655.1 + (9.56 x Weight) + (1.85 x Height) - (4.68 x Age));
BMRmale = Activity Rate x (66.47 + (13.75 x Weight) + (5.0 x Height) - (6.76 x Age)).
An Activity Rate of 1.3 (ambulatory) is preferably used to account for only their normal, daily activity. The effects of the exercise regimen are added separately. For example, a dietary deficit of 500 calories per day is found by subtracting 500 from the calculated BMR.
Calories per Day = BMR - 500 calories
The present system provides three basic ways to create a caloric deficit:
diet, resistance training, and cardiovascular training. Additionally, the present system allows the user to take into account the effects of dietary supplements and personal training in boosting the number of calories burned. The number of calories burned per week is the sum of five products:

Calories Burned / Week =
(Daily Dietary Deficit calories x Days on Diet)
+ (Cardiovascular Exercise session calories x Days of Cardiovascular Exercise)
+ (Resistance Training session calories x Days of Resistance Exercise)
+ (Caloric value of a day of Supplementation x Days of Supplementation)
+ (Caloric boost value of a day of Personal Assistance x Days of Personal Assistance)
The pounds of fat burned per week are found by dividing the Calories Burned per Week by the constant for Calories burned per Pound, as follows:
Pounds of Fat Burned / Week = (Calories of Fat Burned / Week) / (Calories / Pound)
The caloric value of a resistance training session or a cardiovascular exercise session varies with the intensity of the workout. Therefore, the value used in the calculations will vary depending on the type of exercise program selected.
The user can vary the values used in these calculations by selecting different goals.
In addition to the boost experienced while the personal trainer is present, the intensity of all solitary resistance training sessions is assumed to increase by some small amount.
As an example, consider the following fitness program, which is graphically depicted in Fig. 24.
For the selected goal, the preferred caloric values used are as shown in the below table.
Activity                                                                        Days of Compliance   Total Caloric Contribution
Daily Dietary Deficit = 500 calories                                            5 Days / Week        2500 calories
Caloric Value of Cardiovascular Exercise session = 300 calories                 4 Days / Week        1200 calories
Caloric Value of Resistance Training session (boosted by personal assistance) = 325 calories   3 Days / Week   975 calories
Caloric Value of Dietary Supplementation = 100 calories                         5 Days / Week        500 calories
Caloric Value of Personal Assistance (while trainer is present) = 50 calories   2 Days / Week        100 calories
Total Weekly Caloric Deficit                                                                         5275 calories

The fat burned as a result of this regimen is then calculated as follows:
Fat Burned = 5275 / 3500 = 1.5 Lbs / Week
Given a rate of pounds burned per week, it is possible to calculate the number of pounds burned over a given period of time.
Lbs = Weeks x Lbs / Week For example, Lbs. of Fat Burned in 26 Weeks = 26 Weeks x 1.5 Lbs/ Week = 39 Lbs.
It is also possible to calculate how many weeks will be required to burn a particular amount of fat:
Weeks to burn 39 Lbs. = 39 Lbs. / 1.5 = 26 Weeks
The present system will preferably calculate Weeks or Pounds, depending on which slider bar is "grabbed" by the user. If the user slides the "Pounds of Fat Burned" slider, the program will use the rate of fat burn to adjust the "Goal Timeline" slider bar. Conversely, if the user slides the "Goal Timeline"
slider bar, pounds of fat burned will be calculated. In either event, the body image will be slimmed to reflect the pounds of fat lost, as shown in Fig. 25. Also, the percent body fat display will be updated to reflect the new body composition.
The present system will preferably predict weight loss based on excess pounds of fat only. As described above, a person is assumed to have an optimal weight for any given height. Once all of the excess fat pounds are "lost,"
continued fat loss calculations are stopped. The present system preferably will not recommend weight loss below the assumed optimal weight, unless the person's body fat percentage indicates that additional excess fat pounds remain to be burned. Additionally, "age appearance adjustments," as discussed above, preferably remain in effect.
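A compact Python sketch of the caloric deficit and fat loss arithmetic described in this section follows. The function and parameter names are illustrative assumptions, and the weight and height units expected by the BMR helper are not specified in the text.

def bmr_maintenance_calories(weight, height, age, is_female, activity_rate=1.3):
    """Daily maintenance calories per the BMR formulas above (units as used by the system)."""
    if is_female:
        return activity_rate * (655.1 + 9.56 * weight + 1.85 * height - 4.68 * age)
    return activity_rate * (66.47 + 13.75 * weight + 5.0 * height - 6.76 * age)

def weekly_fat_loss_lbs(diet_cal, diet_days, cardio_cal, cardio_days,
                        resistance_cal, resistance_days, supplement_cal, supplement_days,
                        trainer_cal, trainer_days, calories_per_pound=3500):
    """Weekly caloric deficit (sum of five products) converted to pounds of fat burned."""
    deficit = (diet_cal * diet_days + cardio_cal * cardio_days
               + resistance_cal * resistance_days + supplement_cal * supplement_days
               + trainer_cal * trainer_days)
    return deficit, deficit / calories_per_pound

# Program from the example above: 500-calorie dietary deficit 5 days/week, 300-calorie cardio
# sessions 4 days/week, 325-calorie resistance sessions 3 days/week, 100-calorie supplementation
# 5 days/week, and a 50-calorie personal-assistance boost 2 days/week
deficit, lbs_per_week = weekly_fat_loss_lbs(500, 5, 300, 4, 325, 3, 100, 5, 50, 2)
print(deficit, round(lbs_per_week, 1))     # 5275 calories, about 1.5 lbs per week
print(round(26 * lbs_per_week))            # about 39 lbs of fat burned in 26 weeks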
Muscle Gain Calculations
The present system preferably predicts muscle gain based on the person's age and gender as well as the muscle building program selected. The components affecting a muscle building program are nutrition, resistance training, and dietary supplementation. Preferably, it is assumed that the maximum amount of muscle will be gained by an 18 year old male. The amount of muscle gain will depend upon the number of days of resistance training, proper nutrition, and supplementation. The present system preferably calculates muscle gain in units that are native to the body model. The calculations are scaled to cause the body model's muscle to inflate at a rate that is consistent with what one would expect given a particular fitness program, and preferably in uniform proportions. The resulting muscle gain is transformed into pounds to display the pounds of muscle gained.
Base Muscle Gain Factor
The Base Muscle Gain factor represents the change in the appearance of the model after one resistance training session, if all other factors are at maximum.
The magnitude of this factor changes depending on the goal selected by the user.
For example, if the person's goal is to lose weight, the nutritional recommendations favor fat loss at the expense of muscle gain. However, if the person's selected goal is to gain muscle, the nutritional recommendations will favor muscle gain. Therefore, the magnitude of the Base Muscle Gain factor will be greater if one's goal is to gain muscle. Preferred Base Muscle Gain factors for various goals are listed in the table below.
Goal                         Base Muscle Gain Factor (model units)
Muscle Gain                  1 / 725
Muscle Gain and Fat Loss     1 / 1087
Fat Loss                     1 / 1450
Health / Maintenance         1 / 1450

Supplementation Boost
Supplements are preferably assumed to boost muscle formation, depending on the number of days of supplementation and the number of days of resistance training. The default value for the supplement boost factor is preferably 15%.

Supplement Boost = 1.0 + ((Days of Resistance Training / 7 days) x (Days of Supplementation / 7 days) x Supplement Boost Factor)

Resistance Training Compliance
As illustrated in Fig. 13, the effect of the number of days of resistance training is preferably assumed to be linear for up to 4 days per week.
However, the effects of days 5, 6, and 7 are reduced by a function that compensates for the lack of rest days. Accordingly, Resistance Compliance is defined as follows:
If Days of Resistance training is greater than 4:
Resistance Compliance = (Days of Resistance Training / 3) + 2.56667
Otherwise:
Resistance Compliance = Days of Resistance Training

Scaling Muscle Gain by Age
A 40 year old man, for instance, cannot build as much muscle in a given amount of time as he could have when he was 18, even though he engages in an identical fitness program. Therefore, the present system preferably scales the muscle response according to age, which is preferably accomplished in two steps.
The first step preferably uses a function that peaks at age 18 and has a minimum value of 0.1, as shown in Fig. 14 and defined by the following equation. This function allocates the amount of muscle available for gain by age.
Age Compensation = Age x -0.015909 + 1.286364
A second function preferably transforms the scaling from model units into pounds gained according to the following equation:
Age Scaler = Age x -0.618182 + 65.7273
Combining the two functions yields the preferred overall muscle gain by age curve as illustrated in Fig. 15 and defined by the following equation:
Age Factor = Age² (0.009835) + Age (-1.84086) + 84.54923

Scaling Muscle Gain by Nutritional Compliance
Proper nutrition is required to achieve maximum muscle gain. Therefore, days of inadequate nutrition reduce the amount of muscle gained. A Nutrition Factor is therefore preferably calculated by the following equation, which is depicted graphically in Fig. 16:
Nutrition Factor = Days/Week on Nutrition Plan (0.035714286) + 0.75

Scaling Muscle Gain for Gender
Females are preferably assumed to be able to develop 55% as much muscle as males. For example, if a male can develop 11 pounds of muscle over one year on a particular fitness program, a female will develop 6 pounds over the same time using an identical fitness program. Accordingly, a Gender Factor is preferably established as follows:
Gender Factor female = 0.55
Gender Factor male = 1.0

Combined Scaling for Muscle Gain
As the user slides the Goal Timeline slider bar to the right, the present system calculates how much muscle is developed according to the fitness program selected, and the model image becomes more muscular. To determine how many pounds of muscle have been gained, the above scale factors are preferably multiplied together to yield the rate of muscle gain as follows:
Muscle Gained / Week = ( Resistance Compliance x Base Muscle Gain Factor ) x Supplement Boost x Age Factor x Nutrition Factor x Gender Factor
The pounds of muscle gained over a number of weeks are found by multiplying the rate of gain by the time in weeks, as follows:
Muscle Gained in Lbs = ( Muscle Gained / Week ) x Weeks
As an example of a muscle gain calculation, consider a 220 pound, 40 year old male who eats properly 5 days per week, takes supplementation 6 days per week, and lifts weights 5 days per week, as illustrated in Fig. 26. His primary goal is to gain muscle. To determine his muscle gain, the system first calculates all of the relevant scaling factors as follows:
Base Muscle Gain Factor = 1 / 725
Supplement Boost = 1.0 + ( (5 / 7 days) x (6 / 7 days) x 0.15 ) = 1.09
Resistance Compliance = ( 5 days / 3 ) + 2.56667 = 4.2333
Age Factor = (40 yrs)² (0.009835) + (40 yrs) (-1.84086) + 84.54923 = 26.6508
Nutrition Factor = (5 days) (0.035714286) + 0.75 = 0.9286
Gender Factor male = 1.0
Those factors are then combined to find the rate of muscle gain in pounds as follows:

Muscle Gained / Week = ( 4.2333 / 725 ) x ( 1.09 ) x ( 26.6508 ) x ( 0.9286 ) x ( 1.0 ) = 0.1575
Finally, the number of weeks is multiplied by the muscle gain rate to find the total number of pounds of muscle gained as follows:
Muscle Gained (lbs) = ( 0.1575 Lbs / Week) x ( 52 Weeks ) = 8 Lbs.
The pounds of muscle gained are preferably displayed for the user on a display screen showing before and after images, as illustrated in Fig. 26.
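Collecting the factors above into a single routine, the following Python sketch reproduces the worked example; the function signature, goal labels, and structure are illustrative assumptions rather than part of the described system:

def muscle_gained_lbs(age, is_female, goal, days_nutrition,
                      days_resistance, days_supplements, weeks):
    # Base Muscle Gain factor by goal (model units), per the table above.
    base_gain = {"muscle_gain": 1 / 725,
                 "muscle_gain_and_fat_loss": 1 / 1087,
                 "fat_loss": 1 / 1450,
                 "health_maintenance": 1 / 1450}[goal]
    supplement_boost = 1.0 + (days_resistance / 7) * (days_supplements / 7) * 0.15
    if days_resistance > 4:
        resistance_compliance = days_resistance / 3 + 2.56667
    else:
        resistance_compliance = days_resistance
    age_factor = age**2 * 0.009835 + age * -1.84086 + 84.54923
    nutrition_factor = days_nutrition * 0.035714286 + 0.75
    gender_factor = 0.55 if is_female else 1.0
    per_week = (resistance_compliance * base_gain) * supplement_boost \
        * age_factor * nutrition_factor * gender_factor
    return per_week * weeks

# Worked example from the text: 40 year old male, muscle gain goal, 5 days of
# nutrition, 5 days of resistance training, 6 days of supplementation, 52 weeks.
print(round(muscle_gained_lbs(40, False, "muscle_gain", 5, 5, 6, 52)))  # 8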
Body Fat Percentage Recalculation
After a user has engaged in a fitness program over a period of time, the present system's body model composition will preferably change to reflect a loss of fat and/or a gain in muscle. Therefore, the body fat percentage should be recalculated. To recalculate the body fat percentage, the system should first determine the person's new body weight. The new body weight is found by subtracting the pounds of fat lost and adding the muscle gained as follows:
Weight new = Weight initial - Fat Lost + Muscle Gained ( Lbs )
The system should also determine the pounds of body fat remaining. To do this, the known or assumed percent body fat of the person is used to determine how many pounds of fat the person currently has as follows:
Fat initial (Lbs) = Body Fat Percentage initial x Weight initial (Lbs)
The remaining pounds of fat are determined by subtracting the pounds of fat lost from the original pounds of body fat as follows:
Fat remaining = Fat initial - Fat Lost ( Lbs )
The new body fat percentage is found by dividing the pounds of fat remaining by the new body weight as follows:
Body Fat new ( % ) = Fat remaining / Weight new
As an example, consider a 180 pound female with an initial body fat percentage of 35%, as illustrated in Fig. 27. After engaging in a fitness program for 6 months, she loses 30 pounds of fat and gains 3 pounds of muscle. Her new weight is 153 Lbs.
Weight new = ( 180 Lbs ) - ( 30 Lbs ) + ( 3 Lbs ) = 153 Lbs.
Her initial pounds of fat were 63 Lbs.
Fat initial (Lbs) = ( 35 % ) x 180 (Lbs) = 63 Lbs.

Her remaining pounds of fat are 33 Lbs.
Fat remaining = ( 63 Lbs )- ( 30 Lbs ) = 33 Lbs.
Therefore, her new percent body fat is 22 %, as illustrated in Fig. 28.
Body Fat new ( % ) = ( 33 Lbs ) / ( 153 Lbs ) = 22%
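The body fat recalculation above is a short arithmetic pipeline; the Python sketch below mirrors it and reproduces the 180 pound example (the function and variable names are illustrative assumptions):

def recalculate_body_fat(weight_initial_lbs, body_fat_initial, fat_lost_lbs, muscle_gained_lbs):
    # New body weight, pounds of fat remaining, and new body fat fraction.
    weight_new = weight_initial_lbs - fat_lost_lbs + muscle_gained_lbs
    fat_remaining = body_fat_initial * weight_initial_lbs - fat_lost_lbs
    return weight_new, fat_remaining / weight_new

# Example from the text: 180 lb female at 35% body fat loses 30 lbs of fat and
# gains 3 lbs of muscle -> 153 lbs at roughly 22% body fat.
weight_new, body_fat_new = recalculate_body_fat(180, 0.35, 30, 3)
print(weight_new, round(body_fat_new * 100))  # 153 22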
Although the foregoing specific details describe a preferred embodiment of this invention, persons reasonably skilled in the art will recognize that various changes may be made in the details of this invention without departing from the spirit and scope of the invention as defined in the appended claims. Therefore, it should be understood that this invention is not to be limited to the specific details shown and described herein.

Claims (39)

The embodiments of the present invention for which an exclusive property or privilege is claimed are defined as follows:
1. A computer-implemented method for producing an image predictive of a person's appearance resulting from following a prescribed regimen, said method comprising:
receiving a first data set associated with said person;
said first data set comprising a body shape designation;
said body shape designation being representative of where fat is located on said person;
creating a first image representative of said person in a pre-regimen condition by modifying a generic image based on said first data set;
receiving a second data set comprising at least one goal desired from said regimen;
creating a second image representative of said person in a post-regimen condition by modifying said first image based on said second data set;
displaying said second image; and calculating an ideal weight and an estimated body fat percentage for said person;
wherein said estimated body fat percentage is calculated substantially according to the following equation:
Body Fat Percentage = ( Essential Fat + Excess Fat ) / Body Weight
said Essential Fat being calculated according to the following equation:
Essential Fat = ((Age x 0.001625 ) + 0.0425) (Ideal Weight).
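For illustration, the claim 1 equations can be transcribed directly into code; the helper names below are assumptions, not claim language:

def essential_fat_lbs(age, ideal_weight_lbs):
    # Essential Fat = ((Age x 0.001625) + 0.0425) x Ideal Weight
    return (age * 0.001625 + 0.0425) * ideal_weight_lbs

def estimated_body_fat(essential_fat, excess_fat, body_weight_lbs):
    # Body Fat Percentage = (Essential Fat + Excess Fat) / Body Weight
    return (essential_fat + excess_fat) / body_weight_lbs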
2. A computer-implemented method for producing an image predictive of a person's appearance resulting from following a prescribed regimen, said method comprising:
receiving a first data set associated with said person;
said first data set comprising a body shape designation;
said body shape designation being representative of where fat is located on said person;
creating a first image representative of said person in a pre-regimen condition by modifying a generic image based on said first data set;
receiving a second data set comprising at least one goal desired from said regimen;

creating a second image representative of said person in a post-regimen condition by modifying said first image based on said second data set; and displaying said second image;
wherein said creating a second image comprises calculation of an age factor;
and wherein said age factor is calculated substantially according to the following equation:
Age Factor = ( (-0.000438)Age² + (0.0439)Age ) - 1.
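As an illustrative transcription of the claim 2 equation (the function name is an assumption, not claim language):

def appearance_age_factor(age):
    # Age Factor = ((-0.000438) Age^2 + (0.0439) Age) - 1
    return (-0.000438 * age**2 + 0.0439 * age) - 1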
3. A computer-implemented method for producing an image predictive of a person's appearance resulting from following a prescribed regimen, said method comprising:
receiving a first data set associated with said person;
said first data set comprising a body shape designation;
said body shape designation being representative of where fat is located on said person;
creating a first image representative of said person in a pre-regimen condition by modifying a generic image based on said first data set;
receiving a second data set comprising at least one goal desired from said regimen;
creating a second image representative of said person in a post-regimen condition by modifying said first image based on said second data set; and displaying said second image;
wherein said at least one goal comprises muscle gain and wherein said muscle gain is calculated based on at least one of the following factors:
a base muscle gain factor;
a supplement boost factor;
a resistance compliance factor;
an age factor;
a nutrition factor; and a gender factor;
wherein said base muscle gain factor is selected from the group consisting of:
1/725 if said goal comprises muscle gain only;
1/1087 if said goal comprises muscle gain and fat loss;
1/1450 if said goal comprises fat loss only or health maintenance.
4. A computer-implemented method for producing an image predictive of a person's appearance resulting from following a prescribed regimen, said method comprising:
receiving a first data set associated with said person;
said first data set comprising a body shape designation;
said body shape designation being representative of where fat is located on said person;
creating a first image representative of said person in a pre-regimen condition by modifying a generic image based on said first data set;
receiving a second data set comprising at least one goal desired from said regimen;
creating a second image representative of said person in a post-regimen condition by modifying said first image based on said second data set; and displaying said second image;
wherein said at least one goal comprises muscle gain and wherein said muscle gain is calculated based on at least one of the following factors:
a base muscle gain factor;
a supplement boost factor;
a resistance compliance factor;
an age factor;
a nutrition factor; and a gender factor;
wherein a supplement boost is calculated substantially according to the following equation:
Supplement Boost = 1.0 + ((Days of Resistance Training / 7 days) × (Days of Supplementation / 7 days) × Supplement Boost Factor).
5. A computer-implemented method for producing an image predictive of a person's appearance resulting from following a prescribed regimen, said method comprising:
receiving a first data set associated with said person;
said first data set comprising a body shape designation;
said body shape designation being representative of where fat is located on said person;
creating a first image representative of said person in a pre-regimen condition by modifying a generic image based on said first data set;

receiving a second data set comprising at least one goal desired from said regimen;
creating a second image representative of said person in a post-regimen condition by modifying said first image based on said second data set; and displaying said second image;
wherein said at least one goal comprises muscle gain and wherein said muscle gain is calculated based on at least one of the following factors:
a base muscle gain factor;
a supplement boost factor;
a resistance compliance factor;
an age factor;
a nutrition factor; and a gender factor;
wherein said resistance compliance factor is calculated substantially according to one of the following:
(a) if said regimen comprises a number of days of resistance training per week which is greater than 4, Resistance Compliance = (Days of Resistance Training/3) + 2.56667 (b) if said regimen comprises a number of days of resistance training per week which is less than or equal to 4, Resistance Compliance = Days of Resistance Training.
6. A computer-implemented method for producing an image predictive of a person's appearance resulting from following a prescribed regimen, said method comprising:
receiving a first data set associated with said person;
said first data set comprising a body shape designation;
said body shape designation being representative of where fat is located on said person;
creating a first image representative of said person in a pre-regimen condition by modifying a generic image based on said first data set;
receiving a second data set comprising at least one goal desired from said regimen;
creating a second image representative of said person in a post-regimen condition by modifying said first image based on said second data set; and displaying said second image;

wherein said at least one goal comprises muscle gain and wherein said muscle gain is calculated based on at least one of the following factors:
a base muscle gain factor;
a supplement boost factor;
a resistance compliance factor;
an age factor;
a nutrition factor; and a gender factor;
wherein said age factor is calculated substantially according to the following equation:
Age Factor = Age² (0.00983) + Age (-1.84086) + 84.54923.
7. A computer-implemented method for producing an image predictive of a person's appearance resulting from following a prescribed regimen, said method comprising:
receiving a first data set associated with said person;
said first data set comprising a body shape designation;
said body shape designation being representative of where fat is located on said person;
creating a first image representative of said person in a pre-regimen condition by modifying a generic image based on said first data set;
receiving a second data set comprising at least one goal desired from said regimen;
creating a second image representative of said person in a post-regimen condition by modifying said first image based on said second data set; and displaying said second image;
wherein said at least one goal comprises muscle gain and wherein said muscle gain is calculated based on at least one of the following factors:
a base muscle gain factor;
a supplement boost factor;
a resistance compliance factor;
an age factor;
a nutrition factor; and a gender factor;
wherein said nutrition factor is calculated substantially according to the following equation:

Nutrition Factor = Days/Week on Nutrition Plan (0.035714286) + 0.75.
8. A computer-implemented method for producing an image predictive of a person's appearance resulting from following a prescribed regimen, said method comprising:
receiving a first data set associated with said person;
said first data set comprising a body shape designation;
said body shape designation being representative of where fat is located on said person;
creating a first image representative of said person in a pre-regimen condition by modifying a generic image based on said first data set;
receiving a second data set comprising at least one goal desired from said regimen;
creating a second image representative of said person in a post-regimen condition by modifying said first image based on said second data set; and displaying said second image;
wherein said at least one goal comprises muscle gain and wherein said muscle gain is calculated based on at least one of the following factors:
a base muscle gain factor;
a supplement boost factor;
a resistance compliance factor;
an age factor;
a nutrition factor; and a gender factor;
wherein said gender factor is calculated substantially according to one of the following equations:
(a) if said person is a female, Gender Factor female = 0.55;
(b) if said person is a male, Gender Factor male = 1.0.
9. A computer-implemented method for producing an image predictive of a person's appearance resulting from following a prescribed regimen, said method comprising:
receiving a first data set associated with said person;
said first data set comprising a body shape designation;

said body shape designation being representative of where fat is located on said person;
creating a first image representative of said person in a pre-regimen condition by modifying a generic image based on said first data set;
receiving a second data set comprising at least one goal desired from said regimen;
creating a second image representative of said person in a post-regimen condition by modifying said first image based on said second data set; and displaying said second image;
wherein said at least one goal comprises muscle gain and wherein said muscle gain is calculated based on at least one of the following factors:
a base muscle gain factor;
a supplement boost factor;
a resistance compliance factor;
an age factor;
a nutrition factor; and a gender factor;
wherein said muscle gain is calculated substantially according to the following equation:
Muscle Gained/Week = (Resistance Compliance x Base Muscle Gain Factor) x Supplement Boost x Age Factor x Nutrition Factor x Gender Factor.
10. A computer implemented method for producing an image predictive of a person's appearance resulting from following a regimen of diet, exercise, or both diet and exercise, said method comprising the steps of:
receiving as input a first image of said person in a pre-regimen condition;
receiving as input a first data set of body measurements for said person;
receiving as input a second data set indicating at least one goal for results of said regimen;
segmenting said first image into a plurality of body segment images;
automatically creating a modified set of body segment images by modifying at least one of said plurality of body segment images based upon said first and second data sets in a manner representative of predictable fat loss resulting from following said regimen; and outputting a second image predictive of the appearance of said person in a post-regimen condition using said modified set of body segment images.
11. The method of claim 10 wherein said step of receiving a first data set of body measurements comprises receiving a weight measurement for said person.
12. The method of claim 10 wherein said step of receiving a first data set of body measurements comprises receiving at least one body fat measurement for said person.
13. The method of claim 12 wherein said step of receiving at least one body fat measurement comprises receiving at least one skin fold measurement.
14. The method of claim 13 wherein said at least one skin fold measurement is associated with a body part selected from the group consisting of neck, bicep, tricep, chest, subscapula, abdomen, hip, thigh, and calf.
15. The method of claim 10 wherein said step of receiving a second data set comprises receiving a target weight value.
16. The method of claim 10 wherein said step of receiving a second data set comprises receiving a target body fat value.
17. The method of claim 10 wherein said creating step comprises the steps of:
determining an initial fat layer thickness associated with said at least one of said plurality of body segment images;
determining a reduced fat layer thickness associated with said at least one of said plurality of body segment images based upon predicted fat loss from said initial fat layer thickness; and determining a final body segment size associated with said at least one of said plurality of body segment images based upon said first data set and said reduced fat layer thickness.
18. The method of claim 10 wherein said creating step comprises calculating a final circumference of a body part associated with said at least one of said plurality of body segment images according to the following equation:

C after = 2 · π · { [ C before / 2π - ( 2 · skin_fold_measurement ) ] + [ 2 · skin_fold_measurement · ( 1 - P ) · V ] }
wherein C after represents the circumference of said body part in said post-regimen condition, C before represents the circumference of said body part in said pre-regimen condition, skin_fold_measurement represents a skin fold measurement associated with said body part in said pre-regimen condition, P represents an amount of desired fat loss expressed in decimal form, and V represents a constant associated with said body part.
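For illustration, the claim 18 equation transcribes directly to code; the function and parameter names below are assumptions, not claim language:

import math

def post_regimen_circumference(c_before, skin_fold_measurement, p, v):
    # C_after = 2*pi*{ [C_before/(2*pi) - 2*s] + [2*s*(1 - P)*V] }, where s is
    # the pre-regimen skin fold measurement, P the desired fat loss (decimal),
    # and V the body-part constant (e.g. 2.5 for the abdomen per claim 19).
    return 2 * math.pi * ((c_before / (2 * math.pi) - 2 * skin_fold_measurement)
                          + 2 * skin_fold_measurement * (1 - p) * v)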
19. The method of claim 18 wherein said body part is selected from the group consisting of neck, bicep, tricep, chest, subscapula, abdomen, hip, thigh, and calf, and wherein said constant V is respectively selected from the group consisting of 1, 1.1, 1.5, 1.8, 1.7, 2.5, 1.9, 2.0, and 1.2.
20. A computer readable medium for producing images predictive of a person's appearance resulting from following a regimen of diet, exercise, or both diet and exercise, said computer readable medium comprising instructions for:
receiving as input a first image of said person in a pre-regimen condition;
receiving as input a first data set of body measurements for said person;
receiving as input a second data set indicating at least one goal for results of said regimen;
segmenting said first image into a plurality of body segment images;
automatically creating a modified set of body segment images by modifying at least one of said plurality of body segment images based upon said first and second data sets in a manner representative of predictable fat loss resulting from following said regimen; and outputting a second image predictive of the appearance of said person in a post-regimen condition using said modified set of body segment images.
21. The computer readable medium of claim 20 wherein said instructions for receiving a first data set of body measurements include instructions for receiving a weight measurement for said person.
22. The computer readable medium of claim 20 wherein said instructions for receiving a first data set of body measurements include instructions for receiving at least one body fat measurement for said person.
23. The computer readable medium of claim 22 wherein said instructions for receiving at least one body fat measurement include instructions for receiving at least one skin fold measurement.
24. The computer readable medium of claim 23 wherein said at least one skin fold measurement is associated with a body part selected from the group consisting of neck, bicep, tricep, chest, subscapula, abdomen, hip, thigh, and calf.
25. The computer readable medium of claim 20 wherein said instructions for receiving a second data set include instructions for receiving a target weight value.
26. The computer readable medium of claim 20 wherein said instructions for receiving a second data set include instructions for receiving a target body fat value.
27. The computer readable medium of claim 20 wherein said instructions for creating a modified set of body segment images by modifying at least one of said plurality of body segment images include instructions for:
determining an initial fat layer thickness associated with said at least one of said plurality of body segment images;
determining a reduced fat layer thickness associated with said at least one of said plurality of body segment images based upon predicted fat loss; and determining a final body segment size associated with said at least one of said plurality of body segment images based upon said first data set and said reduced fat layer thickness.
28. The computer readable medium of claim 20 wherein said instructions for creating include instructions for calculating a final circumference of a body part associated with said at least one of said plurality of body segment images according to the following equation:
C after = 2 · π · { [ C before / 2π - ( 2 · skin_fold_measurement ) ] + [ 2 · skin_fold_measurement · ( 1 - P ) · V ] }

wherein C after represents the circumference of said body part in said post-regimen condition, C before represents the circumference of said body part in said pre-regimen condition, skin_fold_measurement represents a skin fold measurement associated with said body part in said pre-regimen condition, P represents an amount of desired fat loss expressed in decimal form, and V represents a constant associated with said body part.
29. The computer readable medium of claim 28 wherein said body part is selected from the group consisting of neck, bicep, tricep, chest, subscapula, abdomen, hip, thigh, and calf, and wherein said constant V is respectively selected from the group consisting of 1, 1.1, 1.5, 1.8, 1.7, 2.5, 1.9, 2.0, and 1.2.
30. A computer implemented method for attracting and retaining clients of health and fitness service providers, comprising the steps of:
receiving as input a first image representative of a client in a pre-regimen condition, a set of measurements associated with said client, and at least one goal desired by said client from following a regimen of diet, exercise, or diet and exercise;
segmenting said first image into a plurality of body segment images;
modifying at least one of said plurality of body segment images based upon said set of measurements and said at least one goal in a manner representative of predictable fat loss resulting from following said regimen; and outputting a second image predictive of said client in a post-regimen condition using the results of said modifying step, thereby allowing said client to visualize results attainable through said regimen.
31. The method of claim 30 wherein said receiving step comprises communication of information from said client through a network.
32. The method of claim 31 wherein said network comprises the Internet.
33. The method of claim 30 wherein said set of measurements comprises a weight measurement.
34. The method of claim 30 wherein said set of measurements comprises at least one fat measurement.
35. The method of claim 34 wherein said at least one fat measurement comprises at least one skin fold measurement.
36. The method of claim 35 wherein said at least one skin fold measurement is associated with a body part selected from the group consisting of neck, bicep, tricep, chest, subscapula, abdomen, hip, thigh, and calf.
37. A method of estimating a person's risk of diabetes, comprising the following steps:
receiving a first data set for said person, said first data set comprising a weight, a height, an age, a gender designation, a designation regarding family history of diabetes, and a designation regarding the person's level of exercise;
calculating the person's body mass index (BMI) according to the equation BMI = (Weight × 704.5) / (Height × Height);
calculating the person's BMI factor according to the equation BMI Factor = (BMI - 25) × 7.5;
calculating the person's acceptable body fat according to the equation Acceptable Body Fat = (Age × 0.0667 - 1.3333) + 14 (if said person is male);
Acceptable Body Fat = (Age × 0.0667 - 1.3333) + 17 (if said person is female);
calculating the person's excess body fat factor according to the equation Excess Body Fat Factor = % Body Fat - Acceptable Body Fat;
calculating the person's obesity factor according to the equation Obesity Factor = (BMI Factor + Excess Body Fat Factor) / 2;
scaling said obesity factor according to the equation Body Fat Scaler = [4 × (% Body Fat) - 28] × (Obesity Factor);
magnifying said Body Fat Scaler according to the equation Magnifier = 1.25 × Body Fat Scaler;
assigning a family history value, said family history value being equal to zero if said person has no family history of diabetes, said family history value being equal to 15 if said person has a family history of diabetes;
assigning an exercise value, said exercise value being equal to zero if said person is exercising, said exercise value being equal to 25 if said person is not exercising;
assigning an age value according to the equation Age value = (Age/5) - 7 (if (Age/5) - 7 is greater than or equal to zero) Age value = 0 (if (Age/5) - 7 is less than zero);
calculating said risk of diabetes according to the equation Diabetes Risk = Family History Value + Exercise Value + Age Value + Magnifier.
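Claim 37 defines an additive score; the Python sketch below is a literal transcription for illustration (parameter names are assumptions, and % Body Fat is treated as a whole-number percentage, as written in the claim):

def diabetes_risk(weight_lbs, height_in, age, is_male, body_fat_pct,
                  family_history_diabetes, exercises):
    bmi = weight_lbs * 704.5 / (height_in * height_in)
    bmi_factor = (bmi - 25) * 7.5
    acceptable_body_fat = (age * 0.0667 - 1.3333) + (14 if is_male else 17)
    obesity_factor = (bmi_factor + (body_fat_pct - acceptable_body_fat)) / 2
    body_fat_scaler = (4 * body_fat_pct - 28) * obesity_factor
    magnifier = 1.25 * body_fat_scaler
    family_history_value = 15 if family_history_diabetes else 0
    exercise_value = 0 if exercises else 25
    age_value = max(age / 5 - 7, 0)
    return family_history_value + exercise_value + age_value + magnifier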
38. A method of estimating a person's risk of heart disease, comprising the following steps:
receiving a first data set for said person, said first data set comprising a weight, a height, an age, a gender designation, a designation regarding family history of heart disease, a designation regarding family history of diabetes, a designation regarding the person's level of exercise, a designation regarding the person's blood pressure, a designation regarding the person's smoking activity, a designation regarding the person's cholesterol level;
calculating the person's body mass index (BMI) according to the equation BMI = (Weight × 704.5) / (Height × Height);
calculating the person's BMI factor according to the equation BMI Factor = (BMI - 25) × 7.5;
calculating the person's acceptable body fat according to the equation Acceptable Body Fat = (Age × 0.0667 - 1.3333) + 14 (if said person is male);
Acceptable Body Fat = (Age × 0.0667 - 1.3333) + 17 (if said person is female);
calculating the person's excess body fat factor according to the equation Excess Body Fat Factor = % Body Fat - Acceptable Body Fat;
calculating the person's obesity factor according to the equation Obesity Factor = ( BMI Factor + Excess Body Fat Factor ) / 2;
scaling said obesity factor according to the equation Body Fat Scaler = [4 × (% Body Fat) - 28] × (Obesity Factor);
assigning an age value according to the equation Age Value = Age - 59 (if (Age - 59) is greater than or equal to zero) Age Value = 0 (if (Age - 59) is less than zero);
assigning a heart disease family history value, said heart disease family history value being equal to zero if said person has no family history of heart disease, said heart disease family history value being equal to 15 if said person has a family history of heart disease;

assigning a diabetes family history value, said diabetes family history value being equal to zero if said person has no family history of diabetes, said diabetes family history value being equal to 5 if said person has a family history of diabetes;
assigning a smoking value, said smoking value being equal to zero if said person is not smoking, said smoking value being equal to 25 if said person is smoking;
assigning a gender value, said gender value being equal to zero if said person is female, said gender value being equal to 5 if said person is male;
assigning a blood pressure value, said blood pressure value being equal to zero if said person does not have high blood pressure, said blood pressure value being equal to 15 if said person has high blood pressure;
assigning an exercise value, said exercise value being equal to zero if said person is exercising, said exercise value being equal to 10 if said person is not exercising;
assigning a cholesterol value, said cholesterol value being equal to zero if said person does not have high cholesterol, said cholesterol value being equal to 10 if said person has high cholesterol;
calculating said risk of heart disease according to the equation Heart Disease Risk = Age Value + Heart Disease Family History Value + Diabetes Family History Value + Smoking Value + Gender Value + Blood Pressure Value + Exercise Value + Cholesterol Value + Body Fat Scaler.
39. A method of estimating a person's risk of stroke, comprising the following steps:
receiving a first data set for said person, said first data set comprising a weight, a height, an age, a gender designation, a designation regarding family history of stroke, a designation regarding family history of diabetes, a designation regarding the person's smoking activity, a designation regarding the person's level of exercise, a designation regarding the person's blood pressure, a designation regarding the person's cholesterol level;
calculating the person's body mass index (BMI) according to the equation BMI = (Weight x 704.5) / (Height x Height);
calculating the person's BMI factor according to the equation BMI Factor = (BMI - 25) x 7.5;
calculating the person's acceptable body fat according to the equation Acceptable Body Fat = (Age x 0.0667 - 1.3333) + 14 (if said person is male);
Acceptable Body Fat = (Age x 0.0667 - 1.3333) + 17 (if said person is female);
calculating the person's excess body fat factor according to the equation Excess Body Fat Factor = % Body Fat - Acceptable Body Fat;
calculating the person's obesity factor according to the equation Obesity Factor = (BMI Factor + Excess Body Fat Factor ) / 2;
scaling said obesity factor according to the equation Body Fat Scaler = [4 x (% Body Fat) - 28] x (Obesity Factor);
assigning an age value according to the equation Age Value = Age - 59 (if (Age - 59) is greater than or equal to zero) Age Value = 0 (if (Age - 59) is less than zero);
assigning a stroke family history value, said stroke family history value being equal to zero if said person has no family history of stroke, said stroke family history value being equal to 15 if said person has a family history of stroke;
assigning a diabetes family history value, said diabetes family history value being equal to zero if said person has no family history of diabetes, said diabetes family history value being equal to 5 if said person has a family history of diabetes;
assigning a smoking value, said smoking value being equal to zero if said person is not smoking, said smoking value being equal to 15 if said person is smoking;
assigning a gender value, said gender value being equal to zero if said person is female, said gender value being equal to 5 if said person is male;
assigning a blood pressure value, said blood pressure value being equal to zero if said person does not have high blood pressure, said blood pressure value being equal to 25 if said person has high blood pressure;
assigning an exercise value, said exercise value being equal to zero if said person is exercising, said exercise value being equal to 5 if said person is not exercising;
assigning a cholesterol value, said cholesterol value being equal to zero if said person does not have high cholesterol, said cholesterol value being equal to 10 if said person has high cholesterol;
calculating said risk of stroke according to the equation Stroke Risk = Age Value + Stroke Family History Value + Diabetes Family History Value + Smoking Value + Gender Value + Blood Pressure Value + Exercise Value + Cholesterol Value + Body Fat Scaler.
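Claims 38 and 39 share the same obesity scaling and differ only in the point weights assigned to smoking, blood pressure, and exercise; the Python sketch below factors that out for illustration (the names and the kind switch are assumptions, not claim language):

def body_fat_scaler(weight_lbs, height_in, age, is_male, body_fat_pct):
    # Shared scaling used in claims 37-39 (% Body Fat as a whole-number percentage).
    bmi = weight_lbs * 704.5 / (height_in * height_in)
    bmi_factor = (bmi - 25) * 7.5
    acceptable_body_fat = (age * 0.0667 - 1.3333) + (14 if is_male else 17)
    obesity_factor = (bmi_factor + (body_fat_pct - acceptable_body_fat)) / 2
    return (4 * body_fat_pct - 28) * obesity_factor

def heart_or_stroke_risk(kind, weight_lbs, height_in, age, is_male, body_fat_pct,
                         family_history, diabetes_family_history, smokes,
                         high_blood_pressure, exercises, high_cholesterol):
    # kind is "heart" for claim 38 or "stroke" for claim 39; only the smoking,
    # blood pressure, and exercise weights differ between the two claims.
    smoking_pts, bp_pts, exercise_pts = (25, 15, 10) if kind == "heart" else (15, 25, 5)
    return (max(age - 59, 0)
            + (15 if family_history else 0)
            + (5 if diabetes_family_history else 0)
            + (smoking_pts if smokes else 0)
            + (5 if is_male else 0)
            + (bp_pts if high_blood_pressure else 0)
            + (0 if exercises else exercise_pts)
            + (10 if high_cholesterol else 0)
            + body_fat_scaler(weight_lbs, height_in, age, is_male, body_fat_pct))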
CA2831674A 2003-10-10 2003-10-10 System and method for assessment of health risks and visualization of weight loss and muscle gain Abandoned CA2831674A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CA2831674A CA2831674A1 (en) 2003-10-10 2003-10-10 System and method for assessment of health risks and visualization of weight loss and muscle gain

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CA2462703A CA2462703C (en) 2003-10-10 2003-10-10 System and method for assessment of health risks and visualization of weight loss and muscle gain
CA2831674A CA2831674A1 (en) 2003-10-10 2003-10-10 System and method for assessment of health risks and visualization of weight loss and muscle gain

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CA2462703A Division CA2462703C (en) 2003-10-10 2003-10-10 System and method for assessment of health risks and visualization of weight loss and muscle gain

Publications (1)

Publication Number Publication Date
CA2831674A1 true CA2831674A1 (en) 2005-04-10

Family

ID=49628140

Family Applications (1)

Application Number Title Priority Date Filing Date
CA2831674A Abandoned CA2831674A1 (en) 2003-10-10 2003-10-10 System and method for assessment of health risks and visualization of weight loss and muscle gain

Country Status (1)

Country Link
CA (1) CA2831674A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111009304A (en) * 2019-12-16 2020-04-14 杭州启达医疗技术有限公司 Deep squat exercise monitoring method and device
CN112837783A (en) * 2021-01-28 2021-05-25 北京大学第一医院 Patient nutrition assessment method and device
CN114514562A (en) * 2019-09-26 2022-05-17 亚马逊技术有限公司 Predictive personalized three-dimensional body model
CN118173228A (en) * 2024-03-29 2024-06-11 暨南大学附属第一医院(广州华侨医院) Individualized nutrition evaluation system based on health condition of old people

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114514562A (en) * 2019-09-26 2022-05-17 亚马逊技术有限公司 Predictive personalized three-dimensional body model
CN114514562B (en) * 2019-09-26 2023-06-13 亚马逊技术有限公司 Predictive personalized three-dimensional body model
US11836853B2 (en) 2019-09-26 2023-12-05 Amazon Technologies, Inc. Generation and presentation of predicted personalized three-dimensional body models
CN111009304A (en) * 2019-12-16 2020-04-14 杭州启达医疗技术有限公司 Deep squat exercise monitoring method and device
CN112837783A (en) * 2021-01-28 2021-05-25 北京大学第一医院 Patient nutrition assessment method and device
CN112837783B (en) * 2021-01-28 2024-02-06 北京大学第一医院 Nutritional assessment method and device for patient
CN118173228A (en) * 2024-03-29 2024-06-11 暨南大学附属第一医院(广州华侨医院) Individualized nutrition evaluation system based on health condition of old people
CN118173228B (en) * 2024-03-29 2024-08-09 暨南大学附属第一医院(广州华侨医院) Individualized nutrition evaluation system based on health condition of old people

Similar Documents

Publication Publication Date Title
US8891839B2 (en) System and method for assessment of health risks and visualization of weight loss and muscle gain
US6643385B1 (en) System and method for weight-loss goal visualization and planning and business method for use therefor
US12106841B2 (en) System and methods for calculating, displaying, modifying, and using single dietary intake score reflective of optimal quantity and quality of consumables
US20050240444A1 (en) System and method of individualized mass diagnosis and treatment of obesity
US6872077B2 (en) System and method for generating personalized meal plans
Schulz et al. Aortic calcification and the risk of osteoporosis and fractures
US7090638B2 (en) System and method for optimized dietary menu planning
Standing Committee on the Scientific Evaluation of Dietary Reference Intakes et al. Dietary reference intakes: applications in dietary planning
US12002589B2 (en) Body composition prediction tools
US20140287384A1 (en) Method, system and apparatus for improved nutritional analysis
US20110009708A1 (en) System and apparatus for improved nutrition analysis
Kris-Etherton et al. Validation for MEDFICTS, a dietary assessment instrument for evaluating adherence to total and saturated fat recommendations of the National Cholesterol Education Program Step 1 and Step 2 diets
US20160035248A1 (en) Providing Food-Portion Recommendations to Facilitate Dieting
US20100055271A1 (en) Processes and systems based on metabolic conversion efficiency
US8170807B2 (en) Managing body composition
US9144404B2 (en) Managing body composition
Neuwalder et al. A review of computer-aided body surface area determination: SAGE II and EPRI's 3D Burn Vision
CA2462703C (en) System and method for assessment of health risks and visualization of weight loss and muscle gain
JP2002233507A (en) Health care system, method for reflecting the health condition in character image in the system and recording medium for recording the program of the method
KR20190125157A (en) Information processing apparatus and operation method thereof
CA2831674A1 (en) System and method for assessment of health risks and visualization of weight loss and muscle gain
Cloud et al. “Drawing” conclusions about perceptions of ideal male and female body shapes
JP6069598B1 (en) Health condition evaluation system, health condition evaluation apparatus, and health condition evaluation method
JP2003203123A (en) Dieting assistance system
US20170273618A1 (en) Use of dexa scans to assess body fat outside of healthcare settings

Legal Events

Date Code Title Description
EEER Examination request

Effective date: 20131031

FZDE Discontinued

Effective date: 20180814
