AU2013203345A1 - A method and apparatus for facial aging assessment and treatment management - Google Patents

A method and apparatus for facial aging assessment and treatment management

Info

Publication number
AU2013203345A1
Authority
AU
Australia
Prior art keywords
face
person
executable code
software executable
facial model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2013203345A
Inventor
Steven Liew
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2012900108A external-priority patent/AU2012900108A0/en
Application filed by Individual filed Critical Individual
Priority to AU2013203345A priority Critical patent/AU2013203345A1/en
Publication of AU2013203345A1 publication Critical patent/AU2013203345A1/en
Abandoned legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • G06T7/0014Biomedical image inspection using an image reference approach
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30088Skin; Dermal
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

A computer implemented apparatus (200) for performing a method (300) of facilitating cosmetic treatment of a person's face, the method comprising inputting (305) a generic facial model, capturing (307) images of the person's face, extracting (309) facial features, determining (311), based upon the extracted facial features and the generic facial model, an idealised facial model of the person's face, and determining (325) variations between the person's face and the idealised facial model; wherein the variations are used by a medical practitioner (115) to determine (a) a dosage of neuro-modulators and a location on the person's face for application of the neuro-modulators and (b) an amount of Dermal filler and a location on the person's face for application of the filler. (Fig. 3)

Description

A METHOD AND APPARATUS FOR FACIAL AGING ASSESSMENT AND TREATMENT MANAGEMENT

Technical Field of the Invention

The present invention relates generally to facial cosmetic procedures, and in particular to non-surgical procedures involving the application of neuro-modulators (botulinum toxins) and Dermal fillers to counter the effects of ageing. The present invention also relates to a method and apparatus for facilitating performance of such procedures, and to a computer program product including a computer readable medium having recorded thereon a computer program for facilitating performance of such procedures.

Background

Facial shape and signs of aging are key indicators of vitality and beauty in people. As a result, people are often prepared to undergo cosmetic procedures in order to achieve a more ideal face shape and/or to remove or reduce the signs of aging. Current cosmetic techniques used in this regard include injections of neuro-modulators, such as Botox and related drugs, in order to relax muscles and thereby reduce the prominence of facial features such as wrinkles. Dermal fillers are also applied in order to add volume to facial features where appropriate. The skill and experience of the individual medical practitioner performing these procedures have a direct impact upon the aesthetic results achieved, and accordingly there can be a significant lack of uniformity in those results.

Summary

It is an object of the present invention to substantially overcome, or at least ameliorate, one or more disadvantages of existing arrangements.

Disclosed are arrangements, referred to as Extracted Feature Contour Based Comparison (EFCBC) arrangements, which seek to address the above problems by extracting facial features from images of a patient's face, constructing an idealised facial model for the patient based both upon (a) the extracted features and (b) a generic facial model embodying commonly accepted features regarded as aesthetically beautiful, and determining variations between the patient's face, as characterised by the extracted features, and the idealised facial model, these variations then being made available to a medical practitioner to guide procedures for applying neuro-modulators and/or Dermal filler to the patient's face.

The EFCBC arrangements also provide an adjunct capability in the form of a treatment management system that tracks patient visits, treatment plans, and procedures conducted under the treatment plan, including before, during and after images of the treatments identifying injection points for Botox and Dermal filler. The EFCBC arrangements also include the ability to generate patient proposals and contracts based on patient assessments, patient progress reports, and statistical reports of single-patient behaviours, group patient behaviours and performance metrics for the whole practice.
According to a first aspect of the present invention, there is provided a computer implemented apparatus comprising:

a computer processor;

an image capture device; and

a memory storing a software program comprising a plurality of computer executable code modules configured to direct the computer processor to perform a method of facilitating cosmetic treatment of a person's face, the software program comprising:

software executable code for inputting a plurality of parameters and mathematical relationships defining a generic facial model, and storing the facial model in the memory;

software executable code for capturing, using the image capture device, a plurality of images of the person's face and storing the images in the memory;

software executable code for extracting a plurality of facial features from the stored images;

software executable code for determining, based upon the extracted facial features and the generic facial model, an idealised facial model of the person's face; and

software executable code for determining variations between the person's face and the idealised facial model; wherein:

said variations are adapted to be used by a medical practitioner to (i) determine treatment parameters comprising at least one of (a) a dosage of neuro-modulators and a location on the person's face for application of the neuro-modulators and (b) an amount of Dermal filler and a location on the person's face for application of the filler; and (ii) apply the at least one of the neuro-modulators and the Dermal filler to the person's face in accordance with the determined parameters.
According to another aspect of the present invention, there is provided a computer program product including a computer readable medium having recorded thereon a computer program for implementing any one of the methods performed by the apparatus above.

Other aspects of the invention are also disclosed.

Brief Description of the Drawings

At least one embodiment of the present invention will now be described with reference to the drawings and appendices, in which:

Fig. 1 is a functional block diagram of a system upon which the disclosed EFCBC arrangements can be practiced;

Figs. 2A and 2B depict a general-purpose computer system 200, upon which the various EFCBC arrangements described can be practiced;

Fig. 3 is a flow chart depicting a process that can be used to perform the EFCBC method using the system depicted in Fig. 1;

Fig. 4 shows a process according to which the step 311 in Fig. 3 may be performed;

Fig. 5 shows a front-facing image of the patient used for face shape construction;

Fig. 6 shows an image of the patient and feature points used to determine an actual facial model;

Fig. 7 depicts graphical representations of variation results;

Figs. 8A and 8B depict feature points relating to face angle;

Fig. 9 depicts an S-curve;

Fig. 10 shows an idealised generic S-curve;

Figs. 11A and 11B to 18A and 18B show screen shots of EFCBC patient management system Graphical User Interface screens;

Fig. 19 shows a flow chart of a process in which the EFCBC arrangements are used as part of a patient management system; and

Appendix A shows an illustrative table of typical cosmetic concerns held by patients.

Detailed Description including Best Mode

Where reference is made in any one or more of the accompanying drawings to steps and/or features which have the same reference numerals, those steps and/or features have for the purposes of this description the same function(s) or operation(s), unless the contrary intention appears.

It is to be noted that the discussions contained in the "Background" section and that above relating to prior art arrangements relate to discussions of arrangements that may form public knowledge through their use. Such discussions should not be interpreted as a representation by the inventor or the patent applicant that such arrangements in any way form part of the common general knowledge in the art.

The inventor has discovered, via extensive analysis of image data in movies, advertising material, magazines and the like, that with the increasing globalisation of communications and travel, concepts of beauty are showing signs of convergence across cultural boundaries. Consequently, certain facial attributes in combination are perceived as beautiful largely irrespective of geography.
Three particularly important facial features that figure in this trend are referred to in this specification as "face shape", "face angle", and "S-curve". The inventor has incorporated these three attributes, defined by a set of parameters and mathematical relationships, into a generic facial model which captures, in a generic sense, what is increasingly commonly accepted as facial beauty.
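By way of illustration only, such a parameterised generic facial model might be held in code as sketched below. This structure is an assumption, not taken from the patent (which specifies the three attributes but not a data layout); the numeric values are the ranges and the 1.61 ratio quoted later in this description.

```python
# A minimal, assumed sketch of the generic facial model's parameters.
from dataclasses import dataclass

@dataclass
class GenericFacialModel:
    # Idealised face shape: an ellipse fitted to the face outline (Fig. 6).
    shape_is_ellipse: bool = True
    # Idealised face angle, in degrees from the vertical; the description
    # quotes ranges of 9-11 and 9-12 degrees in different passages.
    face_angle_min_deg: float = 9.0
    face_angle_max_deg: float = 12.0
    # Golden-ratio-like proportion used for the idealised S-curve (Fig. 10).
    s_curve_ratio: float = 1.61
```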
The described EFCBC arrangements provide an assessment and treatment management system for facial aging, for use in supporting cosmetic medical procedures. In one arrangement, the system comprises a high-resolution digital camera integrated with a hand-held tablet computer. The EFCBC system in this arrangement is self-contained on the tablet computer, but can also interface with other systems for remote data entry, synchronization, backups, printing, and so on. In another arrangement, described hereinafter in more detail in regard to Figs. 2A and 2B, the EFCBC arrangement is implemented on a general-purpose computer system.

The EFCBC arrangements capture multiple images of the patient's head and face from multiple angles. The operator of the EFCBC arrangement may be presented with tutorial images as part of the image capture process, to assist in capturing the correct image at the correct scale for each head and face orientation of interest. Image analysis is performed to identify where the face differs from an aesthetically ideal face as defined by algorithms incorporated in the generic facial model. The EFCBC arrangement makes an assessment of the amount of Botox or Dermal filler required to bring the patient's face closer to the aesthetic ideal. Information on the suggested treatment is provided to the practitioner for action.

The system also includes a treatment management system that tracks patient visits, treatment plans, and procedures conducted under the treatment plan, including before, during and after images of the treatments identifying injection points for Botox and Dermal filler. The system also includes the ability to generate patient proposals and contracts based on patient assessments, patient progress reports, and statistical reports of single-patient behaviours, group patient behaviours and performance metrics for the whole practice.

Fig. 1 is a functional block diagram of a system 100 upon which the disclosed EFCBC arrangements can be practiced. The EFCBC system 100 is a computer-assisted system, and includes a computer processor 106, forming part of the general-purpose computer system 200 described hereinafter in more detail in regard to Fig. 2A. The processor 106 performs the EFCBC methods under the control of (a) an EFCBC software application 109 and (b) manual inputs 113, which may be provided via a variety of user interface devices described hereinafter in more detail in regard to Fig. 2A.
A patient 101 wishing to investigate or undergo cosmetic treatment in accordance with the EFCBC arrangements is imaged by an image capture device 104, which captures, as depicted by dashed arrows 102 and 103, one or more images of the person's face. These captured images are stored in a memory 206 (see Fig. 2A) that is configured to communicate with the processor 106. The images are captured in a number of different orientations, including front-facing, right-facing and left-facing orientations and so on.

In the course of an initial consultation a medical practitioner 115 elicits from the patient 101 his or her concerns 111 in relation to age-related factors. The practitioner then inputs this information, as depicted by an arrow 112, into the processor 106. A generic facial model 107, defined by a plurality of parameters and mathematical relationships associated with the patient's face shape, face angle and S-curve, is also input, as depicted by an arrow 108, to the processor 106.

Having received the captured images of the patient 101, his or her concerns 111, and the generic facial model 107, the processor 106, as directed by the EFCBC software 109, extracts facial features of the patient 101 from the captured images. These extracted facial features constitute, in aggregate, what is referred to as the patient's actual facial model. Thereafter, based upon the patient's actual facial model and the generic facial model 107, the processor 106 determines the patient's idealised facial model. The processor 106 then determines variations between the patient's actual facial model and the patient's idealised facial model, and outputs information describing those variations, as depicted by an arrow 114.

These variations are provided, as depicted by the arrow 114, to the medical practitioner 115. The medical practitioner 115 uses the information about these variations in order to determine treatment parameters, depicted by arrows 116 and 119, to be used in application of a neuro-modulator and/or Dermal filler, in order to address the patient's requirements.

Figs. 2A and 2B depict a general-purpose computer system 200, upon which the various EFCBC arrangements described can be practiced. In a preferred arrangement, the system consists of a high-resolution digital camera integrated with a hand-held tablet computer. From a functional perspective, however, the hand-held arrangement and the general-purpose arrangement described below are effectively equivalent.
As seen in Fig. 2A, the computer system 200 includes: a computer module 201; input devices such as a keyboard 202, a mouse pointer device 203, a scanner 226, a camera 104, and a microphone 280; and output devices including a printer 215, a display device 214 and loudspeakers 217. An external Modulator-Demodulator (Modem) transceiver device 216 may be used by the computer module 201 for communicating to and from a communications network 220 via a connection 221. The communications network 220 may be a wide-area network (WAN), such as the Internet, a cellular telecommunications network, or a private WAN. Where the connection 221 is a telephone line, the modem 216 may be a traditional "dial-up" modem. Alternatively, where the connection 221 is a high capacity (e.g., cable) connection, the modem 216 may be a broadband modem. A wireless modem may also be used for wireless connection to the communications network 220.

As an alternative to the mouse 203, a touch screen 268 may be used to perform the function of the mouse, this screen typically being implemented using a touch sensitive transparent screen overlaid on the video display 214. When the screen is touched, a signal is sent to the I/O interface 213 that is equivalent to the signal produced by the mouse 203. In addition to the functions of the mouse 203, the touch screen can register multiple touch events (such as two or more fingers touching simultaneously) and provide track information for both touch points as the fingers are moved over the touch screen surface.

The computer module 201 typically includes at least the single processor unit 106 and the memory unit 206. For example, the memory unit 206 may have semiconductor random access memory (RAM) and semiconductor read only memory (ROM). The computer module 201 also includes a number of input/output (I/O) interfaces including: an audio-video interface 207 that couples to the video display 214, loudspeakers 217 and microphone 280; an I/O interface 213 that couples to the keyboard 202, mouse 203, scanner 226, the camera 104 and optionally a joystick or other human interface device (not illustrated); and an interface 208 for the external modem 216 and printer 215. In some implementations, the modem 216 may be incorporated within the computer module 201, for example within the interface 208. The computer module 201 also has a local network interface 211, which permits coupling of the computer system 200 via a connection 223 to a local-area communications network 222, known as a Local Area Network (LAN). As illustrated in Fig. 2A, the local communications network 222 may also couple to the wide network 220 via a connection 224, which would typically include a so-called "firewall" device or device of similar functionality. The local network interface 211 may comprise an Ethernet™ circuit card, a Bluetooth™ wireless arrangement or an IEEE 802.11 wireless arrangement; however, numerous other types of interfaces may be practiced for the interface 211.

The I/O interfaces 208 and 213 may afford either or both of serial and parallel connectivity, the former typically being implemented according to the Universal Serial Bus (USB) standards and having corresponding USB connectors (not illustrated). Storage devices 209 are provided and typically include a hard disk drive (HDD) 210. Other storage devices such as a floppy disk drive and a magnetic tape drive (not illustrated) may also be used.
An optical disk drive 212 is typically provided to act as a non-volatile source of data. Portable memory devices, such as optical disks (e.g., CD-ROM, DVD, Blu-ray Disc™), USB-RAM, portable external hard drives, and floppy disks, for example, may be used as appropriate sources of data to the system 200.

The components 106 and 206 to 213 of the computer module 201 typically communicate via an interconnected bus 204, and in a manner that results in a conventional mode of operation of the computer system 200 known to those in the relevant art. For example, the processor 106 is coupled to the system bus 204 using a connection 218. Likewise, the memory 206 and optical disk drive 212 are coupled to the system bus 204 by connections 219. Examples of computers on which the described arrangements can be practised include IBM-PCs and compatibles, Sun SPARCstations, Apple Mac™ or like computer systems.

The EFCBC method may be implemented using the computer system 200, wherein the processes of Figs. 3, 4 and 19, to be described, may be implemented as one or more software application programs 109 executable within the computer system 200. In particular, the steps of the EFCBC method are performed by means of instructions 231 (see Fig. 2B) in the software 109 that are carried out within the computer system 200. The software instructions 231 may be formed as one or more code modules, each for performing one or more particular tasks. The software may also be divided into two separate parts, in which a first part and the corresponding code modules perform the EFCBC methods, and a second part and the corresponding code modules manage a user interface between the first part and the user. The user interface may take the form of specialised Graphical User Interface screens as depicted in Figs. 11A, 11B to 18A, 18B. The workflow associated with Figs. 11A, 11B to 18A, 18B may be embodied in either of the aforementioned two separate parts of the software 109.

The software may be stored in a computer readable medium, including the storage devices described below, for example. The software is loaded into the computer system 200 from the computer readable medium, and then executed by the computer system 200. A computer readable medium having such software or computer program recorded on it is a computer program product. The use of the computer program product in the computer system 200 preferably effects an advantageous apparatus for performing the EFCBC methods.

The software 109 is typically stored in the HDD 210 or the memory 206. Thus, for example, the software 109 may be stored on an optically readable disk storage medium (e.g., CD-ROM) 225 that is read by the optical disk drive 212. In some instances, the application programs 109 may be supplied to the user encoded on one or more CD-ROMs 225 and read via the corresponding drive 212, or alternatively may be read by the user from the networks 220 or 222. Still further, the software can also be loaded into the computer system 200 from other computer readable media. Computer readable storage media refers to any non-transitory tangible storage medium that provides recorded instructions and/or data to the computer system 200 for execution and/or processing.
Examples of such storage media include floppy disks, magnetic tape, CD-ROM, DVD, Blu-ray™ Disc, a hard disk drive, a ROM or integrated circuit, USB memory, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether such devices are internal or external to the computer module 201. Examples of transitory or non-tangible computer readable transmission media that may also participate in the provision of software, application programs, instructions and/or data to the computer module 201 include radio or infra-red transmission channels, as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like.

The second part of the application programs 109 and the corresponding code modules mentioned above may be executed to implement the one or more graphical user interfaces (GUIs) to be rendered or otherwise represented upon the display 214, such as the GUIs depicted in Figs. 11A, 11B to 18A, 18B. Through manipulation of typically the keyboard 202 and the mouse 203, or the touch screen 268, a user of the computer system 200 and the application may manipulate the interface in a functionally adaptable manner to provide controlling commands and/or input to the applications associated with the GUI(s). Other forms of functionally adaptable user interfaces may also be implemented, such as an audio interface utilizing speech prompts output via the loudspeakers 217 and user voice commands input via the microphone 280.

Fig. 2B is a detailed schematic block diagram of the processor 106 and a "memory" 234. The memory 234 represents a logical aggregation of all the memory modules (including the HDD 209 and semiconductor memory 206) that can be accessed by the computer module 201 in Fig. 2A.

When the computer module 201 is initially powered up, a power-on self-test (POST) program 250 executes. The POST program 250 is typically stored in a ROM 249 of the semiconductor memory 206 of Fig. 2A. A hardware device such as the ROM 249 storing software is sometimes referred to as firmware. The POST program 250 examines hardware within the computer module 201 to ensure proper functioning, and typically checks the processor 106, the memory 234 (209, 206), and a basic input-output systems software (BIOS) module 251, also typically stored in the ROM 249, for correct operation. Once the POST program 250 has run successfully, the BIOS 251 activates the hard disk drive 210 of Fig. 2A. Activation of the hard disk drive 210 causes a bootstrap loader program 252 that is resident on the hard disk drive 210 to execute via the processor 106. This loads an operating system 253 into the RAM memory 206, upon which the operating system 253 commences operation. The operating system 253 is a system level application, executable by the processor 106, to fulfil various high level functions, including processor management, memory management, device management, storage management, software application interface, and generic user interface.
The operating system 253 manages the memory 234 (209, 206) to ensure that each process or application running on the computer module 201 has sufficient memory in which to execute without colliding with memory allocated to another process. Furthermore, the different types of memory available in the system 200 of Fig. 2A must be used properly so that each process can run effectively. Accordingly, the aggregated memory 234 is not intended to illustrate how particular segments of memory are allocated (unless otherwise stated), but rather to provide a general view of the memory accessible by the computer system 200 and how such memory is used.

As shown in Fig. 2B, the processor 106 includes a number of functional modules, including a control unit 239, an arithmetic logic unit (ALU) 240, and a local or internal memory 248, sometimes called a cache memory. The cache memory 248 typically includes a number of storage registers 244-246 in a register section. One or more internal busses 241 functionally interconnect these functional modules. The processor 106 typically also has one or more interfaces 242 for communicating with external devices via the system bus 204, using a connection 218. The memory 234 is coupled to the bus 204 using a connection 219.

The application program 109 includes a sequence of instructions 231 that may include conditional branch and loop instructions. The program 109 may also include data 232 that is used in execution of the program 109. The instructions 231 and the data 232 are stored in memory locations 228, 229, 230 and 235, 236, 237, respectively. Depending upon the relative size of the instructions 231 and the memory locations 228-230, a particular instruction may be stored in a single memory location, as depicted by the instruction shown in the memory location 230. Alternately, an instruction may be segmented into a number of parts, each of which is stored in a separate memory location, as depicted by the instruction segments shown in the memory locations 228 and 229.

In general, the processor 106 is given a set of instructions that are executed therein. The processor 106 then waits for a subsequent input, to which the processor 106 reacts by executing another set of instructions. Each input may be provided from one or more of a number of sources, including data generated by one or more of the input devices 202, 203, data received from an external source across one of the networks 220, 222, data retrieved from one of the storage devices 206, 209, or data retrieved from a storage medium 225 inserted into the corresponding reader 212, all depicted in Fig. 2A. The execution of a set of the instructions may in some cases result in output of data. Execution may also involve storing data or variables to the memory 234.

The disclosed EFCBC arrangements use input variables 254, which are stored in the memory 234 in corresponding memory locations 255, 256, 257. The EFCBC arrangements produce output variables 261, which are stored in the memory 234 in corresponding memory locations 262, 263, 264. Intermediate variables 258 may be stored in memory locations 259, 260, 266 and 267.

Referring to the processor 106 of Fig. 2B, the registers 244, 245, 246, the arithmetic logic unit (ALU) 240, and the control unit 239 work together to perform sequences of micro-operations needed to perform "fetch, decode, and execute" cycles for every instruction in the instruction set making up the program 109.
Each fetch, decode, and execute cycle comprises:

(a) a fetch operation, which fetches or reads an instruction 231 from a memory location 228, 229, 230;

(b) a decode operation in which the control unit 239 determines which instruction has been fetched; and

(c) an execute operation in which the control unit 239 and/or the ALU 240 execute the instruction.

Thereafter, a further fetch, decode, and execute cycle for the next instruction may be executed. Similarly, a store cycle may be performed by which the control unit 239 stores or writes a value to a memory location 232.

Each step or sub-process in the processes of Figs. 3, 4 and 19 is associated with one or more segments of the program 109, and is performed by the register section 244, 245, 246, the ALU 240, and the control unit 239 in the processor 106 working together to perform the fetch, decode, and execute cycles for every instruction in the instruction set for the noted segments of the program 109.

The EFCBC methods may alternatively be implemented in dedicated hardware, such as one or more integrated circuits performing the EFCBC functions or sub-functions. Such dedicated hardware may include graphic processors, digital signal processors, or one or more microprocessors and associated memories.

Fig. 3 is a flow chart depicting an example of a process 300 that can be used to effect the EFCBC method using the system 100 depicted in Fig. 1. The process 300 commences with a start step 301 and then follows an arrow 302 to a step 303 that inputs information about the patient's concerns 111. This is typically performed during the initial patient interview conducted to assess concerns of the patient 101 in relation to aging-related factors, as set out in Table 1 in Appendix A.

The process 300 then follows an arrow 304 to a step 305 that inputs the generic facial model 107, comprising parameters and mathematical relationships relating to face shape, face angle and S-curve. The process 300 then follows an arrow 306 to a step 307 in which images of the patient 101 are captured by the image capture device 104. Following an arrow 308, a step 309 extracts facial features (also referred to as facial feature points) from the captured images.

The facial features extracted in the step 309 comprise feature points which describe (a) the actual shape of the patient's face, by defining a contour of the outline of the person's face in a front-facing orientation, (b) actual facial angles for the patient's face, by defining left-side and right-side contours of the person's jaw-line in the front-facing orientation, and (c) actual S-curves of the patient's face, by defining contours of the outline of the person's face, in left-facing and right-facing orientations, between a lower edge of the person's outer eye and a mid-line of the person's lips.

The step 309 extracts facial features from the captured images in order to determine face shape, face angle and S-curve information. For each of these attributes, the feature extraction involves detecting the location, and hence the presence, of the face in the image in question using trained Haar Cascades. Using the detected location of the face, active shape modelling is then used to detect locations of feature points on the face, based on statistical models of faces. The feature points relating to the particular attribute, i.e. face shape, face angle, or S-curve, are then displayed overlaid on the particular image of the patient 101 as points and connecting lines.
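As an illustration only, the face-detection stage of the step 309 could be realised with OpenCV's stock Haar cascade, as in the minimal Python sketch below. The patent does not name a particular library, and the active-shape-modelling stage is left as a placeholder here:

```python
# A minimal sketch (not the patent's implementation) of the detection stage of
# step 309: locate the face with a trained Haar cascade. A full EFCBC-style
# system would then fit an active shape model inside the detected rectangle to
# obtain the individual feature points.
import cv2

def detect_face(image_path: str):
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    # Returns one (x, y, w, h) rectangle per detected face.
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
```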
Thereafter, following an arrow 310, a step 311, described hereinafter in more detail in regard to Fig. 4, determines an idealised facial model based upon the extracted facial features (i.e. the patient's actual facial model) and the generic facial model. Following an arrow 324, a step 325, described hereinafter in more detail in regard to Fig. 7, then compares the patient's idealised facial model to the patient's actual facial model. Following an arrow 312, the process 300 then outputs variations between the idealised facial model and the actual facial model in a step 313, as depicted by the arrow 114 in Fig. 1.
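The semi-automatic segment of the process 300 (steps 305 to 313) can be summarised in code form. The sketch below is purely illustrative: the function names are hypothetical stand-ins for the numbered steps of Fig. 3, since the patent defines what each step produces rather than how it is coded.

```python
# Skeleton of the semi-automatic segment of process 300. The bodies are stubs;
# only the step structure is taken from Fig. 3.
from typing import Dict, List

def extract_facial_features(images: List) -> Dict:                    # step 309
    # Placeholder: Haar-cascade detection plus active shape modelling.
    return {"face_shape": [], "face_angle": [], "s_curve": []}

def determine_idealised_model(actual: Dict, generic: Dict) -> Dict:   # step 311
    # Placeholder: ellipse fit, ideal face angle and ideal S-curve (Figs. 4-10).
    return dict(generic)

def determine_variations(actual: Dict, ideal: Dict) -> Dict:          # steps 325, 313
    # Placeholder: per-attribute differences handed to the practitioner.
    return {key: (actual[key], ideal.get(key)) for key in actual}

def process_300(images: List, generic_model: Dict) -> Dict:
    actual = extract_facial_features(images)                          # steps 307-309
    ideal = determine_idealised_model(actual, generic_model)          # step 311
    return determine_variations(actual, ideal)                        # steps 325, 313
```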
Following an arrow 323, the medical practitioner 115 determines, in a step 314, treatment parameters to be used in cosmetically treating the patient 101. These parameters include one or more of a dosage of neuro-modulators, such as Botox, and a corresponding location on the patient's face for application of the neuro-modulators, and an amount of Dermal filler and an associated location on the patient's face for application of the filler. Following an arrow 315, if application of neuro-modulators is required, the practitioner 115 applies, in a step 316, as depicted by an arrow 118 in Fig. 1, neuro-modulators using a neuro-modulator applicator 117 in accordance with the treatment parameters determined in the step 314. If application of Dermal filler is required, the practitioner 115 also applies, in the step 316, as depicted by an arrow 121 in Fig. 1, Dermal filler using a Dermal filler applicator 120, according to the treatment parameters determined in the step 314. The process 300 then follows an arrow 317 to a stop step 318, thus completing the present procedure. This procedure may form one of a series of procedures associated with a treatment plan, described hereinafter in more detail in regard to Fig. 19.

A dashed line 322 partitions the process 300 into two segments. The process segment comprising steps 301-311, 325 and 313, depicted by an arrow 319, is performed in a semi-automatic manner by the computer system 200. The output of this process segment is the information about the variations between the actual facial model of the patient, characterised by the feature points extracted in the step 309, and the idealised facial model for the patient, which is determined in the step 311 depending upon the extracted features and the generic facial model.

These variations are then used to assist and support the medical practitioner 115 in his or her performance of the steps 314 and 316, in which the neuro-modulators and Dermal filler are applied to the patient's face in order to improve the correspondence between the patient's face and the patient's idealised facial model.

Fig. 4 shows a process according to which the step 311 in Fig. 3 may be performed. The process 311 is entered as depicted by the arrow 310 that emanates from the step 309 in Fig. 3.

The process 311 commences with a step 401 that determines the patient's actual face shape using corresponding feature points extracted in the step 309, described hereinafter in more detail in relation to Fig. 5. The actual face shape forms part of the patient's actual facial model. Following an arrow 402, the process 311 then determines the patient's desired face shape in a step 403, described hereinafter in more detail in regard to Fig. 6. The desired face shape forms part of the patient's idealised facial model.

Following an arrow 404, a step 405 then determines the patient's actual face angle. This is performed using the feature points extracted in the step 309, and in particular the feature points describing left-side and right-side contours of the patient's jaw line in the front-facing orientation, described hereinafter in more detail in regard to Figs. 8A and 8B. The actual face angle forms part of the patient's actual facial model. Following an arrow 406, a step 407 then determines the patient's desired face angle.
The inventor has determined that ideal face angles range between 9 and 11 degrees from the vertical, this angle being the angle between a line such as 852, which is tangent to the patient's jaw line at the point of the jaw, and a vertical line such as 854. The desired face angle forms part of the patient's idealised facial model.

Following an arrow 408, a step 409 determines the patient's actual S-curve, described hereinafter in more detail in regard to Fig. 9. The actual S-curve forms part of the patient's actual facial model. Following an arrow 410, the process 311 determines the patient's idealised S-curve in a step 411, described hereinafter in more detail in regard to Fig. 10. The desired S-curve forms part of the patient's idealised facial model. The process 311 is then directed by the arrow 324 to the step 325 in Fig. 3.

Fig. 5 shows a front-facing image 501 of the patient 101. The EFCBC arrangements allow the images captured in the step 307 to be scaled down, as depicted by an arrow 502, in order to produce a processing image 503 at a lower resolution. This reduces the number of image points which need to be processed, and hence reduces the processing time. The processing image 503 is analysed in the step 309, as depicted by an arrow 504, in order to detect a location 511, and hence the presence, of the patient's face in an image 505 using trained Haar Cascades. Using the detected location of the face, active shape modelling is used to detect locations of feature points, such as 508, on the face, based on statistical models of faces.

The feature points such as 508 and 509 that relate to the contour of the outline of the patient's face are selected and displayed on the display 214 as the face shape. The feature points 508 and 509 are members of a set of feature points describing the actual shape of the patient's face. In contrast, the feature point 510 indicates a central point of the patient's nose. The feature points 508 and 509, together with other feature points describing the contour of the outline of the person's face, are displayed overlaid on the image 501 as depicted at 507.
Having determined the lengths of the major and minor axes of the ellipse 609, the position of the ellipse when overlaid over the patient's face is 20 determined by aligning a center point 607 of the ellipse with a centroid 605 of the feature points in the face shape contour 610. Accordingly, the patient's actual face shape is determined by the contour 610 described by the feature points such as 602, and the patient's desired or idealized face shape is represented by the ellipse 609 suitably sized and positioned as described. 25 Fig. 8A shows a first view 800 of the patient and shows feature points 802 and 803 that describe, together with connecting straight-line segment 805, a vertical right-side (expressed from the patient's perspective) contour of the patients jaw line in the front facing orientation. Similarly, feature points 808 and 807 together with their connecting straight-line segment 804 describe the vertical left-side contour of the patients jaw line. 30 These two contours constitute the contours in regard to which the patient's actual face angle is determined as described hereinafter in regard to Fig. 8B. The feature points associated with face angle are extracted using the same method as that described in relation - 17 to feature points for face shape. The relevant feature points for describing the contours of the patient's jaw line are the junction of the face and the earlobe (i.e. the feature point 802), and the point at which the jaw bone changes direction (i.e. the feature point 803). In order to allow for possible imperfections in the feature extraction process, the EFCBC 5 arrangements allow the practitioner to adjust the feature points associated with the jaw line contours, as depicted by the dashed arrow 113. Fig. 8B shows a second view 850 of the patient, showing how a vertical line 855 is constructed through a feature point 853 at the point at which the jaw bone changes direction. The actual face angle associated with the patient's vertical left-side contour is 10 defined to be the angle depicted by an arc 856 between the vertical line 855 and the extension of the line 854. This is the actual face angle on the left-hand side of the patients face, and the same methodology is used to determine the face angle on the right-hand side of the patients face. The desired or idealised face angle lies in a range between 9 degrees and 12 15 degrees from the vertical. The face angle can be altered using the EFCBC arrangements, and this is now described with reference to an inset 866 which depicts an enlarged view of the patient's vertical right-side contour. Feature points 858 and 860 together with their connecting line segment 864 form the patient's vertical right-side contour. A vertical line 862 is constructed through the feature point 860, and an actual face angle 863 is formed 20 between the vertical line 862 and the line segment 864. If according to the EFCBC arrangement it is determined that a smaller angle is desired to form an idealized face angle falling within the desired angular range, then the feature point 860 can non-surgically lowered using the EFCBC arrangement, lo a position depicted by a feature point 861. A dashed line 859 which connects the lowered feature point 861 and the feature point 858 25 now describes an idealized face angle 857. Fig. 9 depicts an S-curve 909. 
The S-curve is defined as the outline of the patient's face in the region from the bottom of the eyes to the center of the lips, when the face is angled such that the outer edge of the eye is at the edge of the image. In order to extract feature points associated with the S-curve, the EFCBC system analyses captured 30 images in which the face of the patient is angled left and right such that the outer edge of the eye is at the edge of the image. This is depicted in Fig. 9, for a face angled to the right.
Fig. 9 depicts an S-curve 909. The S-curve is defined as the outline of the patient's face in the region from the bottom of the eyes to the center of the lips, when the face is angled such that the outer edge of the eye is at the edge of the image. In order to extract feature points associated with the S-curve, the EFCBC system analyses captured images in which the face of the patient is angled left and right such that the outer edge of the eye is at the edge of the image. This is depicted in Fig. 9 for a face angled to the right.

Feature points for the S-curve are extracted using the same method as described above in relation to face shape. In order to allow for possible imperfections in the feature extraction step 309, resulting in possible inaccuracies in the detection of the bottom-of-the-eyes and center-of-the-lips positions, the EFCBC arrangement allows the practitioner 115 to adjust the feature points via the user interface, as depicted by the dashed arrow 113.

The processing time may be further reduced by processing the captured image only in the region of interest between the bottom of the eyes and the center of the lips. The region of interest may then be contrast enhanced in order to increase the dynamic range of the image. The outline of the face in the region of interest may be detected using image segmentation to separate the background and foreground. In order to aid with image segmentation, the image may be captured with a neutral color background, with the patient 101 centered in the image. One or more points near the edge of the image may be used as example background points, and one or more points near the center of the image may be used as example foreground points. The example background and foreground points may be used to aid the image segmentation process. The foreground image extracted in the step 309 by the image segmentation process may be saturated to generate a solid shape. The outline of the solid shape may then be detected using contour detection.

The right edge of the contour of the right-facing patient images, and the left edge of the contour of the left-facing patient images, form the patient's S-curves. In order to allow for possible imperfections in the feature extraction used for detection of the S-curves, the EFCBC system enables the practitioner 115 to adjust the S-curve feature points via a user interface, as depicted by the dashed arrow 113.

Fig. 9 depicts a right-facing patient view, and depicts a bold line 909 that runs along the right edge contour of the patient's face from the bottom of an eye 902, as depicted by a horizontal line 905, to the center of the lips 903, as depicted by a horizontal line 908. The bold line 909 is the patient's actual S-curve. The three feature points used to characterise the patient's actual S-curve are the top feature point 904, the right-most point 906, and the bottom point 907. For a left-facing patient, the top, left-most and bottom points serve to characterise the patient's actual S-curve. The image can optionally be scaled down in order to produce a processing image, such as 503 for face shape, at a lower resolution, in order to reduce the number of image points which need to be processed, thereby reducing the overall processing time.
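A possible concrete rendering of this segmentation step, offered purely as an illustration, is sketched below in Python, with OpenCV's GrabCut standing in for the unnamed segmentation method; the seed geometry is an assumption:

```python
# A minimal sketch of extracting the face outline from the eye-to-lips region
# of interest: seed the image edges as background and the centre as foreground,
# segment, saturate the foreground to a solid shape, and take its contour.
# The outer edge of that contour is the actual S-curve.
import cv2
import numpy as np

def extract_face_outline(roi_bgr: np.ndarray) -> np.ndarray:
    h, w = roi_bgr.shape[:2]
    mask = np.full((h, w), cv2.GC_PR_BGD, np.uint8)
    mask[:, :5] = cv2.GC_BGD                                  # edge columns: background
    mask[:, -5:] = cv2.GC_BGD
    mask[h // 3:2 * h // 3, w // 3:2 * w // 3] = cv2.GC_FGD   # centre block: foreground
    bgd = np.zeros((1, 65), np.float64)
    fgd = np.zeros((1, 65), np.float64)
    cv2.grabCut(roi_bgr, mask, None, bgd, fgd, 5, cv2.GC_INIT_WITH_MASK)
    solid = np.where(np.isin(mask, (cv2.GC_FGD, cv2.GC_PR_FGD)), 255, 0).astype(np.uint8)
    contours, _ = cv2.findContours(solid, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea)  # outline of the solid face shape
```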
Fig. 10 shows an idealised generic S-curve that forms part of the idealised facial model. The relative length of the line segment 1011, when compared to the sum of the line segments 1012 and 1013, is specified by the relative lengths of two arrows 1002 and 1004; the arrow 1004 is 1.61 times as long as the arrow 1002. The relative lengths of the line segments 1012 and 1013 are specified by respective arrows 1005 and 1007, which are seen to have the same length. The horizontal distance between the point 1003 and the point 1006 is depicted by an arrow 1009. The horizontal distance between the point 1003 and the points 1001, 1008 is depicted by the sum of the lengths of the arrows 1009 and 1010. The horizontal distance between the point 1006 and the points 1001, 1008 is depicted by the arrow 1010. The relative length of the arrow 1010 with reference to the arrow 1009 is described by a proportionality relationship of 1:1.61.

Having regard to the step 411 in Fig. 4, the desired S-curve is constructed by scaling the bounding rectangle of the ideal S-curve in Fig. 10, and rotating the bounding rectangle, in order to ensure that the ideal S-curve in Fig. 10 matches the reference points in the actual S-curve depicted in Fig. 9. This is achieved by generating and fitting a cardinal spline through the reference points 1001, 1003 and 1008, using an additional point "X" (i.e. reference numeral 1006), to ensure that the points 1001, 1003 and 1008 of the ideal S-curve match the reference points 904, 906 and 907 respectively of the actual S-curve as closely as possible.
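As an illustrative reading of this construction only (the exact assignment of the 1:1.61 proportions to the segments is inferred from Fig. 10, and SciPy's CubicSpline is used as a stand-in for the cardinal spline named in the text), the ideal curve might be generated as follows:

```python
# A minimal, assumption-laden sketch of step 411: build the idealised S-curve
# through the actual reference points (904, 906, 907 of Fig. 9), placing the
# interior control point according to the 1:1.61 proportion quoted for Fig. 10.
import numpy as np
from scipy.interpolate import CubicSpline

RATIO = 1.61  # proportionality relationship quoted in the description

def ideal_s_curve(top, mid, bottom, samples=50):
    """top, mid, bottom: (x, y) pixel coordinates of points 904, 906, 907;
    y increases downwards, so the three y values are strictly increasing."""
    ys = np.array([top[1], mid[1], bottom[1]], dtype=float)
    x_ends = (top[0] + bottom[0]) / 2.0
    # Assumed interpretation: pull the mid control point toward the outer edge
    # by RATIO / (1 + RATIO) of its horizontal offset from the end points.
    x_mid = x_ends + (mid[0] - x_ends) * RATIO / (1.0 + RATIO)
    spline = CubicSpline(ys, [top[0], x_mid, bottom[0]])
    y_dense = np.linspace(ys[0], ys[-1], samples)
    return np.column_stack([spline(y_dense), y_dense])
```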
Fig. 7 depicts graphical representations of the variation results produced by the step 313 in Fig. 3. Having regard to face shape, at 700 the actual face shape contour 610 (see Fig. 6) is compared with the idealised ellipse 701, and the variation between the two is shown graphically, in diagonally shaded format, at 702.

Turning to face angle, at 710 the two feature points 858 and 860, describing the right-side contour of the patient's jaw line as depicted in Fig. 8B, describe an actual face angle 863 with regard to the vertical line 862 and the line 864. An ideal face angle 857, lying between 9 and 12 degrees, is described by a line 859. The variation between the actual face angle and the ideal face angle is depicted by a shaded area 711.

At 720, Fig. 7 depicts an actual S-curve 721, described by the points 904, 906 and 907 as depicted in Fig. 9, as well as an ideal S-curve 722, which is generated by matching the ideal curve depicted in Fig. 10 to the feature points 904, 906 and 907 as previously described. The variation between the actual S-curve and the closest-fitting ideal S-curve is depicted in shaded format at 723.

Fig. 19 shows a flow chart of a process 1900 in which the EFCBC arrangements are used as part of a patient management system. The process commences with a start step 1901, after which, following an arrow 1902, an initial interview is conducted with the patient in a step 1903. Thereafter, following an arrow 1904, the practitioner 115 rates, in a step 1905, the concerns raised by the patient and recommends corrective procedures. The step 1905 is performed using a table of issues such as that shown in Table 1 in Appendix A.

The process follows an arrow 1906 to a step 1907 in which the practitioner 115 prepares a treatment plan and presents the plan to the patient. Thereafter, following an arrow 1908, the patient accepts the treatment plan in a step 1909. The process follows an arrow 1910 to a step 321 that encompasses all the process steps depicted in Fig. 3 apart from the step 303, in which the patient inputs his or her concerns.

The process then follows an arrow 1911 to a step 1912 in which the practitioner 115 correlates the procedure and its outcomes with the treatment plan constructed in the step 1907. Thereafter, following an arrow 1913, in a decision step 1914 the practitioner 115 decides, together with the patient, whether further procedures are required. If this is the case, then the process follows a YES arrow 1915 back to the step 321. If, on the other hand, no further procedures are required, then the process follows a NO arrow 1916 to a stop step 1917.

Fig. 11A shows a screen shot of a Graphical User Interface screen. The GUI screen relates to the LIFT Home Page:

• Start - goes to Patient Details Page.
• Administration - temporarily used for exiting the application.

Fig. 11B shows a screen shot of another Graphical User Interface screen. The GUI screen relates to the Patient Details Page:

• Returned Patient mode - enter a patient name to search. Currently two patients are in the database: "Jade Porter" and "Laura Citizen".
• New Patient mode - enter details of a new patient. Required fields are: First Name, Surname, DOB and Gender.
• Patient Concerns - goes to Patient Concerns Popup Page.
• Cancel - goes to LIFT Home Page.
• Done - goes to Assessment Camera Page.
• Battery Icon - temporarily assigned to return to LIFT Home Page.

Fig. 12A shows a screen shot of another Graphical User Interface screen. The GUI screen relates to the Patient Concerns Popup Page:

• Content (populated from database) to be implemented.
• Tick Icon - returns to Patient Details Page.

Fig. 12B shows a screen shot of another Graphical User Interface screen. The GUI screen relates to the Patient Details Page, Returned Patient mode:

• When a match is found, the photo and name of the matched person are displayed.
• Currently supports a single match only.
• Touch on the search text field to restart the search or enter New Patient mode.
• Touch on the photo - enters Patient Home Page.

Fig. 13A shows a screen shot of another Graphical User Interface screen. The GUI screen relates to the Patient Home Page:

• Edit - returns to Patient Details Page.
• History - goes to Patient History Page.
• Consultation - goes to Assessment Camera Page.
• Battery Icon - goes to LIFT Home Page.

Fig. 13B shows a screen shot of another Graphical User Interface screen. The GUI screen relates to the Patient History Page:

• Patient Details - goes to Patient Home Page.
• Consultation - goes to Assessment Camera Page.

Fig. 14A shows a screen shot of another Graphical User Interface screen. The GUI screen relates to the Assessment Camera Page:

• Photo 1, Photo 2, Photo 3, Photo 4, Photo 5 - activates the camera.
• Camera Icon - takes a picture.
• Tick Icon - saves all taken pictures to the database and enters the Assessment Page. Only activated when at least the Front On picture has been taken.
• Any Photo Icon - photo viewing mode:
  o Bin Icon - deletes the current photo.
  o EXIT Icon - returns to camera mode.

Fig. 14B shows a screen shot of another Graphical User Interface screen.

Fig. 15A shows a screen shot of another Graphical User Interface screen.

Fig. 15B shows a screen shot of another Graphical User Interface screen. The GUI screen relates to the Assessment Page:

• Add Assessment - goes to Add Assessment Page.
• LIFT Beautification - goes to LIFT Beautification Page.
• Patient Concerns - goes to Patient Concerns Review Page.
• Treatment Plan - temporarily goes to Treatment Checklist Page.
• Battery Icon - goes to LIFT Home Page.

Fig. 16A shows a screen shot of another Graphical User Interface screen. The GUI screen relates to the Add Assessment Page:

• Content (populated from database) un-implemented.
• Tick Icon - returns to Assessment Page.

Fig. 16B shows a screen shot of another Graphical User Interface screen. The GUI screen relates to the LIFT Beautification Page:

• Awaiting license from Luxand to run Facial Recognition on the tablet.
• Tick Icon - returns to Assessment Page.

Fig. 17A shows a screen shot of another Graphical User Interface screen. The GUI screen relates to the Patient Concerns Review Page. This page consists of a concise summary of the concerns identified by the patient during the diagnostic session with the practitioner.

Fig. 17B shows a screen shot of another Graphical User Interface screen. The GUI screen relates to the Treatment Checklist Page:

• All items must be checked before the Tick Icon is activated.
• Tick Icon - goes to Treatment Page.
• Cross Icon - returns to Assessment Page.

Fig. 18A shows a screen shot of another Graphical User Interface screen. The GUI screen relates to the Treatment Page:

• Oral Commissure - temporarily activated to go to the Treatment (Botox) Page.
• Finish - goes to Patient Details Page.
Fig. 18B shows a screen shot of another GUI screen, relating to the Treatment (Botox) Page:
• Add Injections - activates injection mode.
• DONE - exits injection mode.
• Tick Icon - returns to the Treatment Page.
Industrial Applicability
The arrangements described are applicable to the computer and data processing industries, and particularly as they apply to the healthcare industry.
The foregoing describes only some embodiments of the present invention, and modifications and/or changes can be made thereto without departing from the scope and spirit of the invention, the embodiments being illustrative and not restrictive. Thus, for example, the variation results produced by the step 313 may take other forms, such as numerical values of the variations and/or other graphical representations, in addition to or instead of those depicted in Fig. 7 (one such numerical measure is sketched after Appendix A).
Appendix A
Facial Area: Concern or Issue
Facial Skin: Skin Type (Fitzpatrick); Skin Condition; Texture; Sun Damage; Patchy Capillaries; Diffuse Capillaries & Redness; Pigmentation - Sun Related; Pigmentation - Age Related; Pigmentation - Hormonal; Active Acne; Acne Scarring; Large Pores; Blackheads
Upper Face Lines / Wrinkles: Resting Forehead Lines; Dynamic Forehead Lines; Resting Frown Lines; Dynamic Frown Lines; Resting Crow's Feet; Dynamic Crow's Feet
Eyebrow: Position; Hooding; Volume Loss
Upper Eye Lid: Volume Loss; Eyelid Ptosis; Excess Skin
Lower Eye Lid: Excess Skin; Crepey Skin; Eyelid Wrinkles; Excess Fat Pad; Tear Trough
Temple: Hollowing
Cheek Proper: Volume Loss; Mid Cheek Crease
Nose: Dorsum; Tip Projection; Alar Base Retrusion; Alar Width
Lips: Upper Lip Lines; Volume Loss; Asymmetry
Lower Face: Marionette Fold; Oral Commissure; Pre Jowl Sulcus; Jowls; Masseter Hypertrophy / Wide Lower Face
Lower Cheek: Accordion Lines; Thin & Crepey Skin; Loose Skin
Neck: Neck Bands; Submental Fat; Dyschromia & Sun Damage
Décolletage: Dyschromia & Sun Damage; Wrinkling of Skin
Hands: Dyschromia & Sun Damage; Volume Loss
Table 1 - Facial Areas and Aging-related Concerns
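As noted above, the variation produced by the step 313 may be expressed as a numerical value rather than, or as well as, the shaded region 723 of Fig. 7. The Python sketch below is an illustrative assumption only: it stands in for the stored ideal S-curve template of Fig. 10 with a quadratic fitted exactly through the three anchor feature points 904, 906 and 907 (a true S-curve template would also retain its inflection), and all names are invented for illustration.

import numpy as np

def s_curve_variation(contour_xy: np.ndarray,
                      p904: tuple, p906: tuple, p907: tuple) -> float:
    """Mean horizontal deviation (in pixels) of the actual facial contour
    from an idealised curve interpolating the three anchor points."""
    ys = np.array([p904[1], p906[1], p907[1]], dtype=float)
    xs = np.array([p904[0], p906[0], p907[0]], dtype=float)
    # A quadratic in y passes exactly through the three anchor points;
    # this is a simplification of the stored ideal S-curve template.
    coeffs = np.polyfit(ys, xs, deg=2)
    ideal_x = np.polyval(coeffs, contour_xy[:, 1])
    return float(np.mean(np.abs(contour_xy[:, 0] - ideal_x)))

# Example with synthetic contour samples between the outer eye (904)
# and the mid-line of the lips (907).
contour = np.array([[102.0, 200.0], [110.0, 260.0], [104.0, 320.0]])
print(s_curve_variation(contour, (100, 200), (112, 260), (103, 320)))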

Claims (10)

1. A computer implemented apparatus comprising:
a computer processor;
an image capture device; and
a memory storing a computer executable software program comprising a plurality of computer executable code modules configured to direct the computer processor to perform a method of facilitating cosmetic treatment of a person's face, the software program comprising:
software executable code for inputting a plurality of parameters and mathematical relationships defining a generic facial model, and storing the facial model in the memory;
software executable code for capturing, using the image capture device, a plurality of images of the person's face and storing the images in the memory;
software executable code for extracting a plurality of facial features from the stored images;
software executable code for determining, based upon the extracted facial features and the generic facial model, an idealised facial model of the person's face; and
software executable code for determining variations between the person's face and the idealised facial model; wherein:
said variations are adapted to be used by a medical practitioner to (i) determine treatment parameters comprising at least one of (a) a dosage of neuro-modulators and a location on the person's face for application of the neuro-modulators and (b) an amount of Dermal filler and a location on the person's face for application of the filler; and (ii) apply the at least one of the neuro-modulators and the Dermal filler to the person's face in accordance with the determined parameters.
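Purely by way of illustration, and not as part of the claims, the code modules recited in claim 1 map onto a linear pipeline. The following Python skeleton is a non-authoritative sketch of that structure; every function name and signature is an assumption, and the bodies are deliberately left as stubs.

from dataclasses import dataclass

@dataclass
class FacialModel:
    parameters: dict       # e.g. ellipse axis ratio, jaw-line angles
    relationships: dict    # mathematical relationships between features

def load_generic_model() -> FacialModel: ...
def capture_images(count: int = 5) -> list: ...
def extract_features(images: list) -> dict: ...
def idealise(features: dict, generic: FacialModel) -> FacialModel: ...
def variations(features: dict, ideal: FacialModel) -> dict: ...

def assess(images=None) -> dict:
    """Run the claimed module sequence end to end (stubbed)."""
    generic = load_generic_model()
    features = extract_features(images or capture_images())
    ideal = idealise(features, generic)
    # The resulting variations inform the practitioner's choice of
    # neuro-modulator dosage/location and dermal filler amount/location.
    return variations(features, ideal)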
2. A computer implemented apparatus according to claim 1, wherein the software executable code for extracting a plurality of facial features from the stored images comprises:
software executable code for detecting a location of the person's face in the images using Haar Cascades; and
software executable code for determining feature points indicating locations of respective features on the detected face using active shape modelling and statistical models of faces.
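The face-detection half of claim 2 can be illustrated with OpenCV's stock Haar cascade. The sketch below is an example under stated assumptions, not the claimed implementation; in particular, OpenCV provides no built-in active shape model, so the feature-point step is not shown.

import cv2

def detect_face(image_path: str):
    """Return (x, y, w, h) of the largest Haar-detected face, or None."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    return max(faces, key=lambda f: f[2] * f[3])  # largest face by area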
3. A computer implemented apparatus according to claim 2, wherein the software executable code for determining, based upon the extracted facial features and the generic facial model, an idealised facial model of the person's face comprises:
software executable code for (a) determining an actual shape of the person's face by identifying the feature points defining a contour of the outline of the person's face in a front-facing orientation, and (b) fitting an ellipse, being an element of the generic facial model, to the actual shape;
software executable code for (a) determining actual facial angles for the person's face by identifying the feature points defining vertical left-side and vertical right-side contours of the person's jaw-line in the front-facing orientation, and (b) establishing angles, being elements of the generic facial model, for the contours; and
software executable code for (a) determining actual S-curves of the person's face by identifying the feature points defining contours of the outline of the person's face, in left-facing and right-facing orientations, between a lower edge of the person's outer eye and a mid-line of the person's lips, and (b) fitting idealised S-curves, being elements of the generic facial model, to the contours.
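Two of the geometric fits recited in claim 3 can be sketched with standard numerical tools: a least-squares ellipse fit for the front-facing face outline and a straight-line fit for each near-vertical jaw-line contour. The groupings of feature points and all names below are illustrative assumptions, not the claimed implementation.

import numpy as np
import cv2

def fit_face_ellipse(outline_pts: np.ndarray):
    """Fit an ellipse to the front-facing face outline (needs >= 5 points).
    Returns ((cx, cy), (major_axis, minor_axis), angle_in_degrees)."""
    return cv2.fitEllipse(outline_pts.astype(np.float32))

def jaw_line_angle(jaw_pts: np.ndarray) -> float:
    """Angle, in degrees from vertical, of a best-fit line through the
    feature points of one side of the jaw-line."""
    # Fitting x as a function of y keeps near-vertical contours
    # well conditioned.
    slope, _intercept = np.polyfit(jaw_pts[:, 1], jaw_pts[:, 0], deg=1)
    return float(np.degrees(np.arctan(slope)))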
4. A computer implemented apparatus according to claim 3, wherein said software executable code enables manual adjustment, by a manual input provided via a user interface communicating with the processor, of feature points associated with one or more of (a) the contour of the outline of the person's face, (b) the left-side and the right-side contours of the person's jaw-line, and (c) the contours of the outline of the person's face, in left-facing and right-facing orientations, between the lower edge of the person's outer eye and the mid-line of the person's lips.
5. A computer aided system for cosmetically treating a person's face comprising:
a computer processor;
an image capture device; and
a memory storing a software program comprising a plurality of computer executable code modules configured to direct the computer processor to perform a method of cosmetically treating a person's face, the program comprising:
software executable code for inputting a plurality of parameters and mathematical relationships defining a generic facial model, and storing the facial model in the memory;
software executable code for capturing, using the image capture device, a plurality of images of the person's face and storing the images in the memory;
software executable code for extracting a plurality of facial features from the stored images;
software executable code for determining, based upon the extracted features and the generic facial model, an idealised facial model of the person; and
software executable code for determining variations between the person's face and the idealised facial model; wherein the method further comprises the steps of:
determining, based upon the variations, treatment parameters comprising at least one of (a) a dosage of neuro-modulators and a location on the person's face for application of the neuro-modulators and (b) an amount of Dermal filler and a location on the person's face for application of the filler; and
applying the at least one of the neuro-modulators and the Dermal filler to the person's face in accordance with the determined parameters.
6. A computer implemented method of facilitating cosmetic treatment of a person's face, the method comprising the steps of:
inputting a plurality of parameters and mathematical relationships defining a generic facial model, and storing the facial model in a memory device;
capturing, using an image capture device, a plurality of images of the person's face and storing the images in the memory device;
extracting, by a computer processor directed by a software executable program, a plurality of facial features from the stored images;
determining, by the computer processor directed by the software executable program based upon the extracted features and the generic facial model, an idealised facial model of the person; and
determining, by the computer processor directed by the software executable program, variations between the person's face and the idealised facial model; wherein said variations can be used by a medical practitioner to (i) determine treatment parameters comprising at least one of (a) a dosage of neuro-modulators and a location on the person's face for application of the neuro-modulators and (b) an amount of Dermal filler and a location on the person's face for application of the filler; and (ii) apply the at least one of the neuro-modulators and the Dermal filler to the person's face in accordance with the determined parameters.
7. A computer assisted method of determining at least one of (a) a dosage of neuro-modulators and a location for application of the neuro-modulators and (b) an amount of Dermal filler and a location for application of the filler, in order to cosmetically treat a person's face, the method comprising the steps of:
inputting a plurality of parameters and mathematical relationships defining a generic facial model, and storing the facial model in a memory device;
capturing, using an image capture device, a plurality of images of the person's face and storing the images in the memory device;
extracting, by a computer processor directed by a software executable program, a plurality of facial features from the stored images;
determining, by the computer processor directed by the software executable program based upon the extracted features and the generic facial model, an idealised facial model of the person;
determining, by the computer processor directed by the software executable program, variations between the person's face and the idealised facial model; and
determining, based upon said variations, treatment parameters comprising at least one of (a) a dosage of neuro-modulators and a location on the person's face for application of the neuro-modulators and (b) an amount of Dermal filler and a location on the person's face for application of the filler.
8. A computer assisted method of cosmetically treating a person's face, the method comprising the steps of:
inputting a plurality of parameters and mathematical relationships defining a generic facial model, and storing the facial model in a memory device;
capturing, using an image capture device, a plurality of images of the person's face and storing the images in the memory device;
extracting, by a computer processor directed by a software executable program, a plurality of facial features from the stored images;
determining, by the computer processor directed by the software executable program based upon the extracted features and the generic facial model, an idealised facial model of the person;
determining, by the computer processor directed by the software executable program, variations between the person's face and the idealised facial model;
determining, based upon the variations, treatment parameters comprising at least one of (a) a dosage of neuro-modulators and a location on the person's face for application of the neuro-modulators and (b) an amount of Dermal filler and a location on the person's face for application of the filler; and
applying the at least one of the neuro-modulators and the Dermal filler to the person's face in accordance with the determined parameters.
9. A computer implemented patient management system comprising:
a computer processor;
an image capture device; and
a memory storing a software program comprising a plurality of computer executable code modules configured to direct the computer processor to perform a method of managing cosmetic treatment of a person's face, the software program comprising:
software executable code for inputting patient information at an initial review meeting;
software executable code for inputting information rating patient concerns, and recommendations for corrective procedures;
software executable code for inputting information proposing a treatment plan and for presenting the treatment plan to the patient;
software executable code for receiving information accepting the proposed treatment plan;
software executable code for performing a method of facilitating performance of a cosmetic procedure for treatment of a person's face, comprising:
software executable code for inputting a plurality of parameters and mathematical relationships defining a generic facial model, and storing the facial model in the memory;
software executable code for capturing, using the image capture device, a plurality of images of the person's face and storing the images in the memory;
software executable code for extracting a plurality of facial features from the stored images;
software executable code for determining, based upon the extracted facial features and the generic facial model, an idealised facial model of the person's face; and
software executable code for determining variations between the person's face and the idealised facial model; wherein said variations are adapted to be used by a medical practitioner to (i) determine treatment parameters comprising at least one of (a) a dosage of neuro-modulators and a location on the person's face for application of the neuro-modulators and (b) an amount of Dermal filler and a location on the person's face for application of the filler; and (ii) apply the at least one of the neuro-modulators and the Dermal filler to the person's face in accordance with the determined parameters;
software executable code for receiving information correlating the cosmetic procedure with the treatment plan;
software executable code for receiving information specifying if at least one further cosmetic procedure is required; and
software executable code for, if at least one further cosmetic procedure is required, performing the method of facilitating performance of a cosmetic procedure, receiving information correlating the cosmetic procedure with the treatment plan, and receiving information specifying if at least one further cosmetic procedure is required.
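The repeat-until-done control flow recited in claim 9, and shown as the process 1900 of Fig. 19, reduces to a simple loop. The Python sketch below uses invented names and stubbed steps; it is illustrative only, not the claimed code modules.

def record_initial_interview(patient): ...        # step 1903
def rate_concerns_and_recommend(patient): ...     # step 1905
def propose_treatment_plan(patient): ...          # step 1907
def accept_plan(patient, plan): ...               # step 1909
def perform_cosmetic_procedure(patient): ...      # step 321
def correlate_with_plan(outcome, plan): ...       # step 1912
def further_procedures_required(patient): ...     # decision step 1914

def manage_treatment(patient):
    """Run the Fig. 19 workflow: plan once, treat until done."""
    record_initial_interview(patient)
    rate_concerns_and_recommend(patient)
    plan = propose_treatment_plan(patient)
    accept_plan(patient, plan)
    while True:
        outcome = perform_cosmetic_procedure(patient)
        correlate_with_plan(outcome, plan)
        if not further_procedures_required(patient):
            break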
10. A non-transitory computer readable storage medium storing a computer executable software program comprising a plurality of computer executable code modules configured to direct a computer processor to perform a method of facilitating cosmetic treatment of a person's face, the software program comprising:
software executable code for inputting a plurality of parameters and mathematical relationships defining a generic facial model, and storing the facial model in a memory;
software executable code for capturing, using an image capture device, a plurality of images of the person's face and storing the images in the memory;
software executable code for extracting a plurality of facial features from the stored images;
software executable code for determining, based upon the extracted facial features and the generic facial model, an idealised facial model of the person's face; and
software executable code for determining variations between the person's face and the idealised facial model; wherein:
said variations are adapted to be used by a medical practitioner to (i) determine treatment parameters comprising at least one of (a) a dosage of neuro-modulators and a location on the person's face for application of the neuro-modulators and (b) an amount of Dermal filler and a location on the person's face for application of the filler; and (ii) apply the at least one of the neuro-modulators and the Dermal filler to the person's face in accordance with the determined parameters.
AU2013203345A (priority date 2012-01-11, filed 2013-01-08): A method and apparatus for facial aging assessment and treatment management. Published as AU2013203345A1 (en); status: Abandoned.

Priority Applications (1)

Application Number | Publication | Priority Date | Filing Date | Title
AU2013203345A | AU2013203345A1 (en) | 2012-01-11 | 2013-01-08 | A method and apparatus for facial aging assessment and treatment management

Applications Claiming Priority (4)

Application Number | Publication | Priority Date | Filing Date | Title
AU2012900108 | - | 2012-01-11 | - | -
AU2012900108A | AU2012900108A0 (en) | - | - | A method and apparatus for facial aging assessment and treatment management
AU2013203345A | AU2013203345A1 (en) | 2012-01-11 | 2013-01-08 | A method and apparatus for facial aging assessment and treatment management
PCT/AU2013/000009 | WO2013104015A1 (en) | 2012-01-11 | 2013-01-08 | A method and apparatus for facial aging assessment and treatment management

Publications (1)

Publication Number | Publication Date
AU2013203345A1 (en) | -

Family

ID=48780965

Family Applications (1)

Application Number | Publication (Status) | Priority Date | Filing Date | Title
AU2013203345A | AU2013203345A1 (en) (Abandoned) | 2012-01-11 | 2013-01-08 | A method and apparatus for facial aging assessment and treatment management

Country Status (2)

Country Link
AU (1) AU2013203345A1 (en)
WO (1) WO2013104015A1 (en)


Also Published As

Publication Number | Publication Date
WO2013104015A1 (en) | 2013-07-18


Legal Events

Date Code Title Description
MK5 | Application lapsed | Section 142(2)(e): patent request and complete specification not accepted