WO2014015378A1 - Mobile computing device, application server, computer readable storage medium and system for calculating vitality indicia, detecting an environmental hazard, providing vision assistance and detecting disease - Google Patents


Info

Publication number
WO2014015378A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
processor
computing device
accordance
program code
Prior art date
Application number
PCT/AU2013/000823
Other languages
English (en)
Inventor
Jonathon YEOW
Giancarlo VALENZUELA
Daesol LEE
Caleb IOANNIDIS
Christopher John Baxter
Original Assignee
Nexel Pty Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2012903142A
Application filed by Nexel Pty Ltd.
Publication of WO2014015378A1

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/60ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to nutrition control, e.g. diets
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/70ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/20ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS

Definitions

  • the present invention relates to the treatment and detection of unhealthy lifestyle, illness, disease processes and the like, and in particular to a mobile computing device, application server, computer readable storage medium and system for calculating a vitality indicia, detecting an environmental hazard, providing vision assistance and detecting disease.
  • the invention has been developed primarily for use with mobile computing devices such as mobile phones, tablets, computers and the like, and augmented reality aids such as glasses or goggles having a virtual reality overlay, and will be described hereinafter with reference to this application. However, it will be appreciated that the invention is not limited to this particular field of use.
  • anxiety disorders may comprise obsessive-compulsive disorders, which may be characterised by obsessions and compulsions. Sufferers of such anxiety disorders may exhibit certain characteristics, such as intrusive thoughts, uneasiness and fear, and repetitive behaviour, such as excessive washing, counting, checking and the like.
  • Such anxiety disorders are usually treated by means of behavioural therapy, medication and the like wherein such means is usually undertaken by a professional, such as a psychologist, medical doctor and the like.
  • the costs for the services of such professionals are usually prohibitive, and not available to most sufferers.
  • certain anxiety disorders are best treated when a sufferer has a relapse or is experiencing an episode.
  • a mobile computing device for calculating vitality indicia comprising:
  • a memory device for storing digital data including computer program code and being coupled to the processor; an augmented reality display device for displaying digital data in augmented reality and being coupled to the processor,
  • the vitality indicia is displayed using the augmented reality display device.
  • the calculating of the vitality indicia comprises detecting a user anxiety disorder wherein the calculating of the vitality indicia comprises calculating the occurrence of the anxiety disorder.
  • the mobile computing device is adapted for monitoring one or more vitality indicia of a person in an automated manner such that corrective action may be taken should any of the vitality indicia indicate a potential problem.
  • the environment input data comprises image data.
  • the processor is further controlled by the computer program code to calculate the vitality indicia in accordance with an image recognition technique applied to the image data.
  • the mobile computing device is adapted for obtaining information from a user's surroundings or from a patient so as to be able to obtain the vitality indicia.
  • the image recognition technique is adapted for recognising an object.
  • the object is a meal and wherein the processor is controlled by the computer program code to:
  • the mobile computing device is adapted for monitoring a user's daily nutritional intake, so as to be able to measure whether the intake is adequate or inadequate.
  • the processor is further controlled by the computer program code to calculate nutritional deficiency in accordance with the nutritional composition.
  • the mobile computing device can determine, from what a person is eating and drinking, whether there is a nutritional deficiency in their diet.
  • the vitality indicia comprises a meal suggestion or plan and the processor is further controlled by the computer program code to calculate the meal suggestion or plan in accordance with the nutritional deficiency.
  • the mobile computing device is adapted for recommending a meal suggestion or plan which, when taken by the user, would aim to correct the balance of the user's nutritional requirements.
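The meal-recognition chain described above (recognise a meal, derive its nutritional composition, calculate the deficiency, suggest a meal to correct it) can be sketched as follows. This is an illustrative sketch, not part of the patent; the nutrient targets, recognised-meal values and suggestion table are all invented for the example, standing in for whatever the image-recognition step would actually produce.

```python
# Daily nutrient targets (illustrative assumptions, not from the patent).
DAILY_TARGETS = {"protein_g": 50, "fibre_g": 30, "vitamin_c_mg": 90}

# Hypothetical output of the image-recognition step: nutrients per recognised meal.
RECOGNISED_MEALS = [
    {"protein_g": 20, "fibre_g": 5, "vitamin_c_mg": 10},
    {"protein_g": 15, "fibre_g": 8, "vitamin_c_mg": 30},
]

# Hypothetical meal suggestions keyed by the nutrient they replenish.
SUGGESTIONS = {
    "protein_g": "grilled chicken salad",
    "fibre_g": "lentil soup",
    "vitamin_c_mg": "orange and kiwi fruit bowl",
}

def nutritional_deficiency(meals, targets):
    """Return the per-nutrient shortfall (zero when the target is met)."""
    intake = {k: sum(m.get(k, 0) for m in meals) for k in targets}
    return {k: max(0, targets[k] - intake[k]) for k in targets}

def meal_suggestion(deficiency, suggestions):
    """Suggest a meal addressing the largest relative shortfall, if any."""
    worst = max(deficiency, key=lambda k: deficiency[k] / DAILY_TARGETS[k])
    return suggestions[worst] if deficiency[worst] > 0 else None

deficiency = nutritional_deficiency(RECOGNISED_MEALS, DAILY_TARGETS)
suggestion = meal_suggestion(deficiency, SUGGESTIONS)
```

With the figures above the largest relative shortfall is fibre, so the sketch would recommend the fibre-rich meal.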
  • the image recognition technique is adapted for recognising an action of a person.
  • the person is a wearer of the augmented reality display device.
  • the image recognition technique is adapted for receiving first image data at a first time and second image data at a second later time, and comparing the first data and the second data.
  • the mobile computing device is adapted for monitoring certain activities of the user in the calculation of the vitality indicia.
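The two-frame comparison described above can be sketched as a simple pixel-difference check: image data captured at a first time is compared with image data captured at a second, later time, and a sufficient fraction of changed pixels is taken as a detected action. This is an illustrative sketch only; the flat greyscale frames and both thresholds are invented for the example.

```python
CHANGE_THRESHOLD = 30   # per-pixel intensity change that counts as "changed"
ACTION_FRACTION = 0.25  # fraction of changed pixels that signals an action

def action_detected(first_frame, second_frame):
    """Compare two equal-length greyscale frames pixel-wise.

    Returns True when the fraction of markedly changed pixels
    reaches ACTION_FRACTION."""
    changed = sum(
        1 for a, b in zip(first_frame, second_frame)
        if abs(a - b) > CHANGE_THRESHOLD
    )
    return changed / len(first_frame) >= ACTION_FRACTION

still = [100, 100, 100, 100]
moved = [100, 180, 20, 100]  # two of four pixels changed markedly
```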
  • the action represents compliance with a treatment regime.
  • the mobile computing device is adapted for determining whether a person is adhering to a treatment regime, by monitoring the actions of the user.
  • the action represents actions selected from the set of actions comprising: coughing, sneezing and blinking actions.
  • the mobile computing device is adapted for detecting various symptoms, each of which may be indicative of illness.
  • the vitality indicia represents a level of awakeness or tiredness.
  • At least one sensor of the one or more sensors comprises a rearward facing image capture device adapted for capturing image data relating to at least a part of the face of the user.
  • the mobile computing device is adapted for detecting various symptoms on at least a part of the face of the user, which may be indicative of their vitality, for example, of an illness.
  • the at least a part of the face of the user is at least a part of an eye of the user and wherein the processor is further controlled by the computer program code to calculate the vitality indicia in accordance with a characteristic of the eye.
  • the mobile computing device is adapted for detecting various symptoms on or within an eye of the user, which may be indicative of their vitality, for example, of awakeness, stress, eye disease or another illness.
  • the characteristic of the eye is one characteristic from the following set of characteristics:
  • the mobile computing device can readily suggest to the user that they may be tired or stressed.
  • the augmented reality display device comprises a view-through means and a transparency control means which is controlled by the processor, the view-through means comprising at least a portion having a transparency that can be adjusted by the transparency control means, and the processor is controlled by the computer program code to instruct the control means to darken the at least a portion of the view-through means in accordance with the characteristic of the eye.
  • the transparency can be automatically turned down, that is the view-through means can be made darker, for tired or irritated eyes.
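The transparency-control behaviour above amounts to mapping a detected eye characteristic to a darkness level for the view-through portion. A minimal sketch, in which the characteristic names and darkness levels are illustrative assumptions rather than values from the patent:

```python
# Darkness level per detected eye characteristic (0.0 = fully transparent).
DARKNESS_BY_CHARACTERISTIC = {
    "normal": 0.0,
    "tired": 0.4,      # moderately darkened for tired eyes
    "irritated": 0.6,  # strongly darkened for irritated eyes
}

def transparency_setting(eye_characteristic):
    """Return a darkness level in [0, 1] for the view-through means.

    Unknown characteristics leave the display fully transparent."""
    return DARKNESS_BY_CHARACTERISTIC.get(eye_characteristic, 0.0)
```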
  • At least one sensor of the one or more sensors comprises a forward facing image capture device adapted for capturing image data from within a wearer's field of vision.
  • a doctor can simply look at a patient to determine aspects of their vitality.
  • At least one sensor of the one or more sensors is a temperature sensor and the vitality indicia represents a body temperature of a wearer of the augmented reality display device determined in accordance with data received from the temperature sensor.
  • temperature can be a factor used in expressing or determining a person's vitality and in diagnosis of an illness, such as a fever.
  • the environmental input data comprises audio data.
  • the processor is further controlled by the computer program code to calculate the vitality indicia in accordance with an audio recognition technique.
  • the audio recognition technique is adapted for recognising sounds within the audio data.
  • the mobile computing device is adapted for calculating the vitality indicia by using sounds from the environment of the user or sounds made by the user.
  • the sound is selected from the set of sounds comprising: coughing, hiccupping, sneezing and obstructed airways sounds.
  • the mobile computing device is adapted for detecting one or more symptoms which may be indicative of illness in the case of coughing, hiccupping and sneezing, or of a sleep disorder in the case of obstructed airways sounds.
  • At least one sensor of the one or more sensors comprises a stethoscope interface for receiving audio from a stethoscope in use.
  • a person's heartbeat or lung condition can be monitored and can be used in deriving a diagnosis.
  • the environmental input data comprises acceleration data.
  • the processor is further controlled by the computer program code to calculate the vitality indicia in accordance with a movement recognition technique applied to the acceleration data.
  • the movement recognition technique comprises recognising a movement.
  • the mobile computing device is adapted for calculating the vitality indicia in accordance with a movement of the user.
  • the movement represents an exercise movement selected from the set of exercise movements comprising walking, talking and running exercise movements.
  • a person's physical exercise, which has a significant impact on their vitality, can be monitored and this can be expressed in the vitality indicia. For example, if the person is not doing enough physical exercise a bar chart may indicate this.
  • the movement comprises at least a vibrational component and the processor is further controlled by the computer program code to diagnose instances of sleep disordered breathing in accordance with the movement.
  • the environmental input data further comprises audio data and in calculating the vitality indicia, the processor is further controlled by the computer program code to calculate the vitality indicia in accordance with an audio recognition technique that is adapted for recognising sleep disordered breathing sounds within the audio data.
  • sleep disordered breathing can be diagnosed from both accelerometer and audio inputs or can be diagnosed with a higher degree of confidence by correlating the environmental input data from both sources.
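The correlation of the two input sources above can be sketched by treating each recogniser as reporting a confidence and combining them, so that agreement between accelerometer and audio raises the overall confidence past a diagnostic threshold neither clears alone. The combination rule and threshold here are illustrative assumptions, not the patent's method.

```python
THRESHOLD = 0.7  # illustrative diagnostic confidence threshold

def diagnose_sdb(accel_confidence, audio_confidence):
    """Combine accelerometer- and audio-based confidences.

    Uses the noisy-OR rule: either source alone may suffice, and
    agreement between the two sources raises the combined confidence.
    Returns (diagnosis made?, combined confidence)."""
    combined = 1 - (1 - accel_confidence) * (1 - audio_confidence)
    return combined >= THRESHOLD, round(combined, 3)

# Neither source alone reaches the threshold, but together they do.
positive, confidence = diagnose_sdb(0.5, 0.6)
```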
  • the environment input data comprises orientation data.
  • the processor is further controlled by the computer program code to calculate the vitality indicia in accordance with an orientation recognition technique.
  • the orientation recognition technique is adapted for recognising an orientation of a wearer of the augmented reality display device in use.
  • the mobile computing device is adapted for calculating the vitality indicia in accordance with the orientation of the wearer, wherein the orientation may be representative of the sleeping state of the user.
  • the processor is further controlled by the computer program code to calculate the resting state of a wearer of the augmented reality display device in accordance with the orientation.
  • the mobile computing device further comprises a data interface for sending and receiving data across a data network, the data interface being coupled to the processor, wherein the processor is controlled by the computer program code to send, via the data interface, the environment input data.
  • the processor is further controlled by the computer program code to send the environment input data to an application server.
  • the mobile computing device is adapted for offloading certain processing tasks to a remote application server having superior processing capabilities.
  • the processor is further controlled by the computer program code to send the environment input data via a communications interface to another mobile computing device.
  • information from a friend, doctor or patient's mobile computing device can be shared so that additional environmental data can be used in calculation of the vitality data and/or in making a diagnosis.
  • the processor is further controlled by the computer program code to send the environment input data to another mobile computing device when a wearer of the augmented reality display device of the mobile computing device looks at a wearer of the augmented reality display device of the another mobile computing device.
  • sharing can take place on a restricted basis.
  • the processor is further controlled by the computer program code to receive, via the data interface, vitality indicia data representing the vitality indicia.
  • the processor is further controlled by the computer program code to receive the vitality indicia data from an application server.
  • the processor is further controlled by the computer program code to receive other environment input data from another mobile computing device and to calculate the vitality indicia further in accordance with the other environment input data.
  • information from a friend, doctor or patient's mobile computing device can be shared so that additional environmental data can be used in calculation of the vitality data and/or in making a diagnosis.
  • the processor is further controlled by the computer program code to calculate a vitality category in accordance with the environment input data.
  • the mobile computing device is adapted for categorising the vitality indicia of the user into an intelligible or easy to understand format.
  • the processor is further controlled by the computer program code to calculate a diagnosis data in accordance with the environment input data.
  • the mobile computing device is adapted for calculating a diagnosis using the environment input data or the vitality indicia.
  • diagnosis may be determined, for example, from a lookup table correlating symptoms and diagnoses, from an addition of scores correlating symptoms to diagnoses, or using an algorithm, such as an artificial intelligence algorithm, to determine the diagnosis or possible diagnoses that best fit the symptoms expressed by either the environment input data or the vitality indicia, such an algorithm analysing prior patient data such as data that correlates diagnoses with symptoms.
  • the processor is further controlled by the computer program code to calculate one or more appropriate remedies in accordance with the diagnosis data.
  • the mobile computing device is adapted to determine remedies for various diagnoses.
  • These remedies may be determined, for example, from a lookup table, from an addition of scores correlating to diagnoses or using an algorithm, such as an artificial intelligence algorithm, to determine the remedy or remedies that best fit a diagnosis by the algorithm analysing prior patient data such as data that correlates diagnoses with remedies that have been effective.
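The score-addition approach mentioned above (for both diagnosis and remedy selection) can be sketched as follows: each observed symptom adds a score towards candidate diagnoses, the best-scoring diagnosis is chosen, and a remedy is then looked up for it. The symptom, diagnosis and remedy tables are invented for the example and are not from the patent.

```python
# Per-symptom scores towards candidate diagnoses (illustrative assumptions).
SYMPTOM_SCORES = {
    "coughing": {"common cold": 2, "hay fever": 1},
    "sneezing": {"common cold": 1, "hay fever": 2},
    "fever":    {"common cold": 2},
}

# Remedy lookup table keyed by diagnosis (illustrative assumptions).
REMEDIES = {
    "common cold": "rest, fluids and an over-the-counter cough medicine",
    "hay fever": "an antihistamine",
}

def diagnose(symptoms):
    """Add the per-symptom scores and return the best-fitting diagnosis."""
    totals = {}
    for symptom in symptoms:
        for diagnosis, score in SYMPTOM_SCORES.get(symptom, {}).items():
            totals[diagnosis] = totals.get(diagnosis, 0) + score
    return max(totals, key=totals.get) if totals else None

def remedy_for(diagnosis):
    """Look up the remedy for a diagnosis, if one is known."""
    return REMEDIES.get(diagnosis)

diagnosis = diagnose(["coughing", "sneezing", "fever"])
```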
  • the one or more remedies are each a remedy from the following set of remedies:
  • the vitality indicia comprises the one or more remedies.
  • the vitality indicia may express possible remedies to a user of the mobile computing device.
  • the one or more remedies are communicated to a user of the mobile computing device audibly.
  • the remedies may be understood by a person who is blind or has impaired vision.
  • the one or more remedies comprise a branded product suggestion
  • a manufacturer, distributor or service provider can advertise branded remedies to the user. For example, if the diagnosis is a common cold and the remedy is cough medicine, vitality indicia in the form of a suggestion can be provided to the user such as, "SinTM: It appears you have caught a cold, why not try Brand cough medicine?"
  • the one or more remedies is a branded medication remedy.
  • the mobile computing device further comprises a data communications interface and wherein in calculating the diagnosis data, the processor is controlled by the computer program code to send the environment input data to a server via the data communications interface and receive back from the server, the diagnosis data.
  • the mobile computing device is adapted for offloading certain processing tasks to a remote application server having superior processing capabilities.
  • the mobile computing device further comprises a data communications interface and wherein in calculating the one or more remedies, the processor is controlled by the computer program code to send the diagnosis data to a server via the data communications interface and receive back from the server, the one or more remedies.
  • the mobile computing device is adapted for offloading certain processing tasks to a remote application server having superior processing capabilities.
  • the one or more sensors is two or more sensors and the processor is further controlled by the computer program code to calculate the diagnosis in accordance with environment data received from at least two sensors of the two or more sensors.
  • a diagnosis formed on the basis of multiple observations of a patient can be made with a higher degree of confidence.
  • the one or more sensors is two or more sensors and the processor is further controlled by the computer program code to calculate the diagnosis in accordance with environment data received from a threshold one or more sensors of the two or more sensors.
  • this feature allows the mobile computing device to check that the particular symptom is present before making the diagnosis.
  • the one or more sensors is two or more sensors and the processor is further controlled by the computer program code to calculate the diagnosis in accordance with a weighting of the environment data according to which sensor it is received from.
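The sensor weighting above can be sketched by scaling each observation's confidence by a weight reflecting how reliable its source sensor is, and only making the diagnosis when the weighted evidence clears a threshold. The weights, sensor names and threshold are illustrative assumptions.

```python
# Weight per source sensor: a symptom confirmed by a reliable sensor counts
# for more than the same symptom inferred from a noisier one (assumed values).
SENSOR_WEIGHTS = {
    "stethoscope": 1.0,    # direct, reliable observation
    "microphone": 0.6,     # ambient audio, noisier
    "accelerometer": 0.4,  # indirect inference from movement
}

def weighted_evidence(observations):
    """Sum sensor confidences, each scaled by its sensor's weight.

    `observations` is a list of (sensor, confidence) pairs."""
    return sum(SENSOR_WEIGHTS.get(sensor, 0.0) * conf
               for sensor, conf in observations)

def diagnose(observations, threshold=0.8):
    """Make the diagnosis only when the weighted evidence clears a threshold."""
    return weighted_evidence(observations) >= threshold

obs = [("stethoscope", 0.7), ("microphone", 0.5)]
```

A strong reading from a low-weight sensor alone is not enough, which is the point of the weighting.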
  • the augmented reality display device is a pair of glasses and the at least one of the one or more sensors is located at a region selected from the following set of regions of the glasses in relation to a wearer of the glasses:
  • a temperature sensor can be located at a position more preferable for measuring a person's body temperature.
  • an application server for calculating vitality indicia data comprising:
  • a memory device for storing digital data including computer program code and being coupled to the processor:
  • a data interface for sending and receiving data across a data network and being coupled to the processor, wherein the processor is controlled by the computer program code to:
  • the vitality indicia data being adapted for display by an augmented reality display device, and send, via the data interface, the vitality indicia data to an augmented reality display device.
  • the environment input data comprises image data.
  • the processor is further controlled by the computer program code to calculate the vitality indicia data in accordance with an image recognition technique applied to the image data.
  • the image recognition technique is adapted for recognising an object.
  • the object is a meal and wherein the processor is controlled by the computer program code to:
  • the processor is further controlled by the computer program code to calculate nutritional deficiency in accordance with the nutritional composition.
  • the vitality indicia data comprises a meal suggestion or plan and the processor is further controlled by the computer program code to calculate the meal suggestion or plan in accordance with the nutritional deficiency.
  • the image recognition technique is adapted for recognising an action of a person.
  • the image recognition technique is adapted for receiving first image data at a first time and second image data at a second later time, and comparing the first data and the second data.
  • the action represents compliance with a treatment regime.
  • the action represents actions selected from the set of actions comprising: coughing, sneezing, wincing and blinking actions.
  • the vitality indicia represents a level of awakeness or tiredness.
  • the processor is controlled by the computer program code to receive, via the data interface, the environment input data from a wearable rearward facing image capture device adapted for capturing image data relating to at least a part of the face of a wearer.
  • the at least a part of the face of the user is at least a part of an eye of the user and wherein the processor is further controlled by the computer program code to calculate the vitality indicia data in accordance with a characteristic of the eye.
  • the characteristic of the eye is one characteristic from the following set of characteristics:
  • the processor is adapted to send, via the data interface, an instruction to the augmented reality display device to darken a view-through means of the augmented reality display device in accordance with the vitality indicia data.
  • the environment input data comprises a temperature reading of a wearer of the augmented reality display device.
  • the environmental input data comprises audio data.
  • the processor is further controlled by the computer program code to calculate the vitality indicia in accordance with an audio recognition technique.
  • the audio recognition technique is adapted for recognising sounds within the audio data.
  • the sound is selected from the set of sounds comprising: coughing, hiccupping, sneezing and obstructed airways sounds.
  • the processor is further controlled by the computer program code to receive, via the data interface, the environment sensor data in the form of data from a stethoscope.
  • the environmental input data comprises acceleration data.
  • the processor is further controlled by the computer program code to calculate the vitality indicia in accordance with a movement recognition technique applied to the acceleration data.
  • the movement recognition technique comprises recognising a movement.
  • the movement represents an exercise movement selected from the set of exercise movements comprising walking, talking and running exercise movements.
  • the movement comprises at least a vibrational component and the processor is further controlled by the computer program code to diagnose instances of sleep disordered breathing in accordance with the movement.
  • the environmental input data further comprises audio data and in calculating the vitality indicia, the processor is further controlled by the computer program code to calculate the vitality indicia in accordance with an audio recognition technique that is adapted for recognising sleep disordered breathing sounds within the audio data.
  • the environment input data comprises orientation data.
  • the processor is further controlled by the computer program code to calculate the vitality indicia in accordance with an orientation recognition technique.
  • the orientation recognition technique is adapted for recognising an orientation of a wearer of the augmented reality display device in use.
  • the processor is further controlled by the computer program code to calculate the resting state of a wearer of the augmented reality display device in accordance with the orientation.
  • the processor is further controlled by the computer program code to send the environment input data to another mobile computing device.
  • the processor is further controlled by the computer program code to send the environment input data to another mobile computing device when a wearer of the augmented reality display device of the mobile computing device looks at a wearer of the augmented reality display device of the another mobile computing device.
  • the processor is further controlled by the computer program code to calculate a vitality category in accordance with the environment input data.
  • the processor is further controlled by the computer program code to calculate a diagnosis data in accordance with the environment input data.
  • the processor is further controlled by the computer program code to calculate one or more appropriate remedies in accordance with the diagnosis data.
  • the one or more remedies are each a remedy from the following set of remedies:
  • the vitality indicia comprises the one or more remedies.
  • the application server is adapted to send instructions to the augmented reality display device to communicate the one or more remedies audibly to the wearer.
  • the one or more remedies comprise a branded product suggestion.
  • the one or more remedies is a branded medication remedy.
  • the environment data comprises at least two types of environment data and wherein the processor is further controlled by the computer program code to calculate the diagnosis in accordance with the at least two types of environment data.
  • the application server is further adapted to calculate the diagnosis in accordance with a threshold number of types of environment data.
  • the application server is further adapted to calculate the diagnosis in accordance with a weighting of the types of environment data.
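The weighted, thresholded diagnosis described in the preceding clauses can be sketched as follows. This is an illustrative sketch only, not the claimed implementation; the data types, weights, and thresholds are hypothetical examples.

```python
# Illustrative sketch: combining several types of environment data into a
# single diagnosis score, gated by a minimum number of data types.

def diagnose(readings, weights, min_types=2, score_threshold=0.5):
    """Return a diagnosis flag from weighted environment readings.

    readings:  dict mapping data type -> normalised signal in [0, 1]
    weights:   dict mapping data type -> relative weight
    min_types: threshold number of types of environment data required
    """
    present = [t for t in readings if t in weights]
    if len(present) < min_types:
        return None  # not enough independent evidence to diagnose
    total_w = sum(weights[t] for t in present)
    score = sum(weights[t] * readings[t] for t in present) / total_w
    return score >= score_threshold

# Example: audio (cough detection) and temperature both elevated.
result = diagnose(
    {"audio": 0.8, "temperature": 0.6},
    {"audio": 0.7, "temperature": 0.3},
)
```

With a single data type present, the sketch declines to diagnose, matching the threshold-number-of-types clause.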
  • a computer readable storage medium for calculating a vitality indicia having computer program code instructions recorded thereon, the computer program code instructions being executable by a computer and comprising:
  • instructions for receiving environment input data from one or more sensors; and instructions for calculating the vitality indicia in accordance with the environment input data.
  • the environment input data comprises image data.
  • instructions for calculating the vitality indicia in accordance with an image recognition technique applied to the image data.
  • the image recognition technique is adapted for recognising an object.
  • the object is a meal and further comprising:
  • the vitality indicia comprises a meal suggestion or plan and the computer readable storage medium further comprises instructions for calculating the meal suggestion or plan in accordance with the nutritional deficiency.
  • the image recognition technique is adapted for recognising an action of a person.
  • the person is a wearer of the computer readable storage medium.
  • the image recognition technique is adapted for receiving first image data at a first time and second image data at a second later time, and comparing the first data and the second data.
  • the action represents compliance with a treatment regime.
  • the action represents actions selected from the set of actions comprising: coughing, sneezing, wincing and blinking actions.
  • the vitality indicia represents a level of awakeness or tiredness.
  • the at least a part of the face of the user is at least a part of an eye of the user and wherein the computer readable storage medium further comprises instructions for calculating the vitality indicia in accordance with a characteristic of the eye.
  • the characteristic of the eye is one characteristic from the following set of characteristics:
  • the augmented reality display device comprises a view-through means and a transparency control means, the view-through means comprising at least a portion having a transparency that can be adjusted by the transparency control means, and the computer readable storage medium comprises instructions for the control means to darken the at least a portion of the view-through means in accordance with the vitality indicia.
  • the environment input data is temperature data of a wearer of the augmented reality device.
  • the environmental input data comprises audio data.
  • the audio recognition technique is adapted for recognising sounds from within the audio data.
  • the sound is selected from the set of sounds comprising: coughing, hiccupping, sneezing and obstructed airways sounds.
  • the environment data comprises data received from a stethoscope.
  • the environmental input data comprises acceleration data.
  • the movement recognition technique comprises recognising a movement.
  • the movement represents an exercise movement selected from the set of exercise movements comprising walking, talking and running exercise movements.
  • the movement comprises at least a vibrational component and the computer readable storage medium further comprises instructions to diagnose instances of sleep disordered breathing in accordance with the movement.
  • the environmental input data further comprises audio data and, in calculating the vitality indicia, the computer readable storage medium further comprises instructions to calculate the vitality indicia in accordance with an audio recognition technique that is adapted for recognising sleep disordered breathing sounds within the audio data.
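The combination of a vibrational (acceleration) cue with an audio cue for flagging sleep disordered breathing, described in the clauses above, could be sketched as follows. The RMS-energy features and thresholds are hypothetical stand-ins for the recognition techniques the disclosure leaves open.

```python
# Illustrative sketch only: flagging a possible sleep-disordered-breathing
# episode from accelerometer and audio sample windows.

def rms(samples):
    """Root-mean-square energy of a sample window."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def sdb_episode(accel_window, audio_window,
                vib_threshold=0.2, snore_threshold=0.5):
    """Both cues must agree: chest-wall vibration plus a snoring-like sound."""
    vibration = rms(accel_window) > vib_threshold
    snore = rms(audio_window) > snore_threshold
    return vibration and snore
```

Requiring both cues to agree mirrors the claims' use of movement data and audio data together rather than either alone.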
  • the environment input data comprises orientation data.
  • the orientation recognition technique is adapted for recognising an orientation of a wearer of the augmented reality display device in use.
  • the computer readable storage medium comprises further instructions to send the environment input data to another mobile computing device.
  • the computer readable storage medium further comprises instructions to send the environment input data to another mobile computing device when a wearer of the augmented reality display device of the mobile computing device looks at a wearer of the augmented reality display device of the another mobile computing device.
  • the one or more remedies are each a remedy from the following set of remedies:
  • the vitality indicia comprises the one or more remedies.
  • the one or more remedies comprise a branded product suggestion.
  • the one or more remedies is a branded medication remedy.
  • the environment data comprises at least two types of environment data and the computer readable storage medium further comprises instructions to calculate the diagnosis in accordance with the at least two types of environment data.
  • a system for calculating a vitality indicia is provided, the system comprising:
  • the application server is adapted to receive environment input data from the wearable device, the application server is adapted to calculate the vitality indicia in accordance with the environment input data, and
  • the application server is adapted to send vitality indicia data representing the vitality indicia to the wearable device.
  • the wearable device comprises a display device, and wherein the wearable device is adapted to display the vitality indicia data.
  • the wearable display device is an augmented reality display device.
  • the vitality indicia data is adapted for display in augmented reality by the augmented reality display device.
  • the environment input data comprises image data.
  • the application server is adapted to calculate the vitality indicia in accordance with an image recognition technique applied to the image data.
  • the image recognition technique is adapted for recognising an object.
  • the object is a meal and wherein the application server is further adapted to:
  • the application server is adapted to calculate nutritional deficiency in accordance with the nutritional composition.
  • the application server is adapted to calculate a meal suggestion or plan in accordance with the nutritional deficiency.
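The meal-analysis chain above (recognise a meal, derive its nutritional composition, compute a deficiency, suggest a meal) might look like this in outline. The food database, daily targets, and lookup-by-name stage are hypothetical stand-ins for an image recognition technique and a real nutrition service.

```python
# Hypothetical sketch of the meal-analysis pipeline; all values invented.

FOOD_DB = {  # per-serving nutrient content (hypothetical values)
    "salad": {"protein": 3, "vitamin_c": 40, "iron": 1},
    "steak": {"protein": 50, "vitamin_c": 0, "iron": 5},
}
DAILY_TARGET = {"protein": 50, "vitamin_c": 60, "iron": 8}

def nutritional_composition(recognised_meals):
    """Sum nutrient content over meals recognised during the day."""
    totals = {k: 0 for k in DAILY_TARGET}
    for meal in recognised_meals:
        for nutrient, amount in FOOD_DB.get(meal, {}).items():
            totals[nutrient] += amount
    return totals

def nutritional_deficiency(composition):
    """Shortfall against the daily target, per nutrient still lacking."""
    return {n: DAILY_TARGET[n] - composition.get(n, 0)
            for n in DAILY_TARGET if composition.get(n, 0) < DAILY_TARGET[n]}

def meal_suggestion(deficiency):
    """Suggest the food covering the largest outstanding shortfall."""
    if not deficiency:
        return None
    worst = max(deficiency, key=deficiency.get)
    return max(FOOD_DB, key=lambda f: FOOD_DB[f].get(worst, 0))
```

For example, a day containing only a recognised salad leaves protein as the largest shortfall, so the sketch suggests the protein-dense entry.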
  • the image recognition technique is adapted for recognising an action of a person wearing the wearable device.
  • the image recognition technique is adapted for receiving first image data at a first time and second image data at a second later time, and comparing the first data and the second data.
  • the action represents compliance with a treatment regime.
  • the action represents actions selected from the set of actions comprising: coughing, sneezing, wincing and blinking actions.
  • the vitality indicia represents a level of awakeness or tiredness.
  • the wearable device comprises a rearward facing image capture device adapted for capturing image data relating to at least a part of the face of a wearer.
  • the at least a part of the face of the user is at least a part of an eye of the wearer and wherein the server calculates the vitality indicia in accordance with a characteristic of the eye.
  • the characteristic of the eye is one characteristic from the following set of characteristics:
  • the wearable device comprises a view-through means and a transparency control means, the view-through means comprising at least a portion having a transparency that can be adjusted by the transparency control means, and the server controls the control means to darken the at least a portion of the view-through means in accordance with the vitality indicia.
  • the wearable device comprises at least one temperature sensor and the environmental input data comprises temperature data received from the at least one temperature sensor.
  • the environmental input data comprises audio data.
  • the application server is adapted to calculate the vitality indicia in accordance with an audio recognition technique.
  • the audio recognition technique is adapted for recognising sounds within the audio data.
  • the sound is selected from the set of sounds comprising: coughing, hiccupping, sneezing and obstructed airways sounds.
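One minimal way to realise the sound recognition in the clauses above is a nearest-match over crude acoustic features. The energy and zero-crossing-rate signatures below are invented for illustration and are not part of the disclosure.

```python
# Minimal illustrative sketch: labelling an audio window by nearest match on
# two crude features, short-time energy and zero-crossing rate.

def features(window):
    energy = sum(s * s for s in window) / len(window)
    crossings = sum(1 for a, b in zip(window, window[1:]) if a * b < 0)
    return energy, crossings / (len(window) - 1)

SIGNATURES = {  # hypothetical (energy, zero-crossing-rate) prototypes
    "coughing": (0.60, 0.20),
    "sneezing": (0.80, 0.60),
}

def classify_sound(window):
    """Return the signature label closest to the window's features."""
    e, z = features(window)
    return min(SIGNATURES,
               key=lambda s: (SIGNATURES[s][0] - e) ** 2
                             + (SIGNATURES[s][1] - z) ** 2)
```

A production recogniser would use spectral features and a trained model; this sketch only shows the shape of mapping audio data to a sound label.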
  • the wearable device further comprises a stethoscope interface adapted for receiving the audio data from a stethoscope in use.
  • the environmental input data comprises acceleration data.
  • the application server is adapted to calculate the vitality indicia in accordance with a movement recognition technique applied to the acceleration data.
  • the movement recognition technique comprises recognising a movement.
  • the movement represents an exercise movement selected from the set of exercise movements comprising walking, talking and running exercise movements.
  • the movement comprises at least a vibrational component and the server is adapted to diagnose instances of sleep disordered breathing in accordance with the movement.
  • the environmental input data further comprises audio data and in calculating the vitality indicia, the server is adapted to calculate the vitality indicia in accordance with an audio recognition technique that is, in turn, adapted for recognising sleep disordered breathing sounds within the audio data.
  • the environment input data comprises orientation data.
  • the application server is adapted to calculate the vitality indicia in accordance with an orientation recognition technique.
  • the orientation recognition technique is adapted for recognising an orientation of a wearer in use.
  • the application server is adapted to calculate the resting state of a wearer of the wearable device in accordance with the orientation.
  • the server is adapted to send the environment input data to another wearable device.
  • the server is adapted to send the environment input data to another wearable device when a wearer of the wearable device looks at a wearer of the another wearable device.
  • the server is adapted to receive other environment input data from another wearable device and to calculate the vitality indicia further in accordance with the other environment input data.
  • the server is further adapted to calculate a vitality category in accordance with the environment input data.
  • the server is further adapted to calculate a diagnosis data in accordance with the environment input data.
  • the server is further adapted to calculate one or more appropriate remedies in accordance with the diagnosis data.
  • the one or more remedies are each a remedy from the following set of remedies:
  • the vitality indicia comprises the one or more remedies.
  • the one or more remedies are communicated to a wearer of the wearable device audibly.
  • the one or more remedies comprise a branded product suggestion.
  • the one or more remedies is a branded medication remedy.
  • the environment data comprises at least two types of environment data and wherein the server is adapted to calculate the diagnosis in accordance with the at least two types of environment data.
  • the application server is further adapted to calculate the diagnosis in accordance with a threshold number of types of environment data.
  • the application server is further adapted to calculate the diagnosis in accordance with a weighting of the types of environment data.
  • the wearable device is a pair of augmented reality glasses comprising one or more sensors and at least one of the one or more sensors is located at a region selected from the following set of regions of the glasses in relation to a wearer of the glasses:
  • a mobile computing device for calculating vitality indicia comprising:
  • a memory device for storing digital data including computer program code and being coupled to the processor
  • an augmented reality display device for displaying digital data in augmented reality and being coupled to the processor
  • the processor is controlled by the computer program code to: receive, from the at least one sensor, the sensor input data,
  • the mobile computing device further comprises a user interface for sending and receiving user input data via the augmented reality device, the user interface being coupled to the processor, wherein the processor is controlled by the computer program code to calculate the vitality indicia in accordance with at least the user input data.
  • the device can communicate with an external data network in an automated manner to send and receive user input data to assist in the diagnosis of the user.
  • the processor is controlled by the computer program code to calculate first body posture data representing a first body posture in accordance with the sensor input data and wherein the processor is controlled by the computer program code to calculate the vitality indicia in accordance with the first body posture data.
  • the computer program code can interpret sensor input data in an automated manner, which may not be manual inputs that the user consciously makes, and the user's vitality indicia is calculated for that given posture or motion.
  • the processor is controlled by the computer program code to calculate second body posture data representing a second body posture in accordance with the sensor input data and wherein the processor is controlled by the computer program code to calculate the vitality indicia further in accordance with the second body posture data.
  • the computer program code can interpret subsequent sensor input data as changes to the user's initial posture or motion in an automated manner, which again are not manual inputs made consciously by the user, to further calculate the vitality indicia.
  • the at least one sensor comprises an image capture device and wherein the processor is controlled by the computer program code to calculate the first posture data further in accordance with image data from the image capture device.
  • the image capture device is a stereoscopic image capture device and wherein the processor is controlled by the computer program code to calculate the first body posture further in accordance with stereoscopic image data from the stereoscopic image capture device.
  • a stereoscopic image capture device is able to visualise image data in a way that can determine depth between any reference points in an automated manner, without requiring the user to define distances and locations of their body with respect to the environment.
  • the at least one sensor comprises an orientation sensor adapted for generating orientation data and wherein the processor is further controlled by the computer program code to calculate the first posture data further in accordance with the orientation data.
  • an orientation sensor is used to accurately measure further posture data.
  • the processor is further controlled by the computer program code to calculate if the first body posture exceeds a posture range threshold.
  • the computer program code has stored information pertaining to the normal ranges of movement unique to the user, and can determine at any time when the user has made a significant change from their desired movement or posture, so that corrective action can be taken.
  • the posture range threshold represents a height range threshold.
  • the posture range threshold represents an angular range threshold.
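The height and angular posture range thresholds above admit a simple check of this shape; the baseline posture and tolerance values are hypothetical.

```python
# Minimal sketch (hypothetical reference values): does a calculated body
# posture exceed a height or angular range threshold?

def exceeds_posture_range(posture, reference,
                          height_range=0.05, angle_range=10.0):
    """posture/reference: dicts with 'height' in metres, 'angle' in degrees."""
    height_dev = abs(posture["height"] - reference["height"])
    angle_dev = abs(posture["angle"] - reference["angle"])
    return height_dev > height_range or angle_dev > angle_range

reference = {"height": 1.70, "angle": 0.0}  # upright baseline
slouched = {"height": 1.62, "angle": 18.0}  # head dropped and tilted forward
```

Exceeding either threshold independently triggers the flag, matching the separate height-range and angular-range clauses.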
  • the processor is controlled by the computer program code to calculate reference point data representing a reference point using the image data and calculate the first body posture further in accordance with the reference point data.
  • the computer program code can assign reference points pertaining to both the user's body and external environment such that the user can move through any ranges of motion and still be traceable irrespective of sensor orientation.
  • the mobile computing device further comprises a user input interface for receiving user input data.
  • the processor is controlled by the computer program code to receive, via the user input device, reference point data representing a reference point and calculate the first body posture further in accordance with the reference point data.
  • the processor is controlled by the computer program code to calculate distance data representing a distance from the reference point in accordance with the reference point data.
  • the processor is further controlled by the computer program code to display, using the augmented reality display device, the first posture data.
  • the posture data can be communicated to the user so the user can understand how their body movements and positions have changed over time.
  • the processor is further controlled by the computer program code to calculate remedial action data representing a remedial action in accordance with the first posture data.
  • both the posture data and recommendations to correct the problem are shown to the user, and the user can more accurately make corrective actions while the device monitors this corrective action in an automated manner.
  • the at least one sensor comprises a rearward facing image capture device adapted for capturing image data representing at least a part of the face of the wearer.
  • the at least a part of the face of the wearer is at least a part of an eye of the wearer.
  • the processor is controlled by the computer program code to calculate eye characteristic data representing an eye characteristic and calculate the vitality indicia further in accordance with the eye characteristic.
  • the rearward-facing camera can be adapted to capture and analyse eye characteristics in an automated manner.
  • the eye characteristic is selected from the set of eye characteristics comprising:
  • the device can recognise characteristics of the user's eyes that may not have been diagnosed previously, and these may include redness, swelling, dark rings, iris colour, pupil symmetry and pupil size eye characteristics.
  • the processor is further controlled by the computer program code to calculate the eye characteristic data in accordance with a colour recognition technique.
  • the computer program code can further calculate eye characteristic data using a colour recognition technique applied to the image data.
  • the processor is further controlled by the computer program code to calculate the eye characteristic data in accordance with a movement recognition technique.
  • the computer program code can further calculate eye characteristic data, in terms of eye movements that may previously have been undiagnosed, in an automated manner.
  • the processor is further controlled by the computer program code to calculate further eye characteristic data representing a further eye characteristic and calculate the vitality indicia further in accordance with the further eye characteristic.
  • the processor is further controlled by the computer program code to compare the eye characteristic data against normal eye characteristic data representing a normal eye characteristic.
  • the eye characteristic data is compared against a database that contains characteristics relating to normal, healthy and functioning eyes, to determine if the user's eye characteristics vary significantly.
  • the eye characteristic data represents a pupil dilation state and wherein the at least one sensor further comprises an ambient light meter and wherein the processor is further controlled by the computer program code to receive, from the ambient light meter, ambient light data representing an ambient lighting level, and calculate the vitality indicia further in accordance with the ambient light data and the pupil dilation state.
  • the vitality indicia can be calculated in accordance with the pupil dilation state and ambient light levels.
  • the further eye characteristic data represents a second pupil dilation state and wherein the processor is further controlled by the computer program code to calculate the vitality indicia further in accordance with the second pupil dilation state.
  • the processor is further controlled by the computer program code to receive, from the ambient light meter, further ambient light data representing a further ambient light level, and calculate the vitality indicia further in accordance with the further ambient light data.
  • the vitality indicia can be calculated in accordance with changes in pupil size relative to changes in the ambient light levels.
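Calculating an indicia from the change in pupil size relative to the change in ambient light, as above, could be sketched like this. The expected-response model and tolerance are deliberately crude assumptions, not the disclosed method.

```python
# Hypothetical sketch: a pupil that fails to constrict under brighter light
# (or constricts under dimmer light) is flagged as an abnormal response.

def abnormal_pupil_response(pupil_1, pupil_2, lux_1, lux_2, tolerance=0.5):
    """Pupil diameter (mm) and ambient light (lux) at two sample times.

    Brighter light should constrict the pupil; dimmer light should dilate
    it. Returns True when the observed change opposes the expected
    direction by more than the tolerance.
    """
    expected_direction = -1 if lux_2 > lux_1 else 1
    observed_change = pupil_2 - pupil_1
    return observed_change * expected_direction < -tolerance
```

Comparing two dilation states against two ambient light levels is the key point: neither pupil size nor light level alone distinguishes a normal from an abnormal response.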
  • the at least one sensor comprises an audio sensor.
  • the processor is further controlled by the computer program code to receive, from the audio sensor, audio data representing a voice command and calculate, using audio recognition technique, a command in accordance with the audio data.
  • the processor is further controlled by the computer program code to calculate the vitality indicia further in accordance with the command.
  • the mobile computing device further comprises user interface output being coupled to the processor for outputting user data and wherein the processor is further controlled by the computer program code to calculate prompt data representing a prompt and output, using the user interface output, the prompt data.
  • the processor is further controlled by the computer program code to calculate the vitality indicia further in accordance with the prompt data.
  • the processor is further controlled by the computer program code to calculate further prompt data representing a further prompt in accordance with the command and output, using the user interface output, the further prompt data.
  • the processor is further controlled by the computer program code to calculate the vitality indicia further in accordance with the further prompt data.
  • the processor is further controlled by the computer program code to receive, from the audio sensor, further audio data representing a further voice command and calculate, using audio recognition technique, a further command in accordance with the further audio data.
  • the processor is further controlled by the computer program code to calculate the vitality indicia further in accordance with the further command.
  • the processor is further controlled by the computer program code to calculate disease process data representing a disease in accordance with the vitality indicia.
  • the processor is further controlled by the computer program code to receive, from the at least one sensor, further sensor input data and calculate the vitality indicia further in accordance with the further sensor input data.
  • the processor calculates remedy effectiveness data in accordance with further sensor input data in an automated manner, and the user's vitality indicia is updated accordingly so the user can monitor changes to their wellbeing over time.
  • the processor is further controlled by the computer program code to select remedy data representing a remedy in accordance with the sensor input data and the further sensor input data.
  • the remedy is selected from the set of remedies comprising medication, exercise activity, diet suggestion, lifestyle suggestion, therapy suggestion and product suggestion remedies.
  • the processor is further controlled by the computer program code to calculate remedy effectiveness data representing an effectiveness of the remedy in accordance with the further sensor input data.
  • the mobile computing device further comprises a data interface for sending and receiving data across a data network, the data interface being coupled to the processor, wherein the processor is controlled by the computer program code to send, via the data interface, the remedy data.
  • the mobile computing device further comprises a data interface for sending and receiving data across a data network, the data interface being coupled to the processor, wherein the processor is further controlled by the computer program code to send, via the data interface, emergency data representing an emergency in accordance with the vitality indicia.
  • the at least one sensor is adapted to monitor blood oxygen saturation.
  • the at least one sensor is adapted to monitor a breathing rate.
  • the at least one sensor is adapted to monitor a heart rate.
  • the at least one sensor is adapted to monitor a temperature.
  • the mobile computing device further comprises a location sensing means for sensing a location, the location sensing means being coupled to the processor, wherein the processor is further controlled by the computer program code to receive, from the location sensing means, location data representing a location, and send, via the data interface, the location data.
  • the processor is further controlled by the computer program code to select emergency contact data further in accordance with the vitality indicia.
  • the processor is further controlled by the computer program code to select a proximate mobile computing device and send, via the data interface, to the proximate mobile computing device the emergency data.
  • a wearer of a proximate mobile computing device can be sent the location data corresponding to the first wearer of a mobile computing device.
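Selecting a proximate mobile computing device to receive the emergency and location data might be sketched as a nearest-device search; the flat coordinate model below is a hypothetical simplification of real geolocation.

```python
# Illustrative sketch (hypothetical data model): choose the proximate device
# by straight-line distance from the wearer's sensed location.

import math

def select_proximate_device(own_location, devices):
    """devices: dict mapping device id -> (x, y)-style coordinates."""
    def distance(loc):
        return math.hypot(loc[0] - own_location[0], loc[1] - own_location[1])
    return min(devices, key=lambda d: distance(devices[d]))

nearest = select_proximate_device((0.0, 0.0),
                                  {"a": (3.0, 4.0), "b": (1.0, 1.0)})  # "b"
```

Real coordinates would need a great-circle distance rather than `math.hypot`, but the selection logic is the same.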
  • the processor is controlled by the computer program code to display, using the augmented reality display device, medical assistance instructions.
  • the vitality indicia represents the wearer's stress level.
  • the at least one sensor is adapted to capture sensor input data selected from the set of sensor input data comprising:
  • the augmented reality display can be used to display the wearer's stress level in real-time.
  • the display of the wearer's stress level can be used in an application of a closed-loop biofeedback therapy method.
  • the processor is controlled by the computer program code to display the wearer's stress level on the augmented reality display in a real-time graphical format.
  • a mobile computing device for detecting an environmental hazard comprising:
  • a memory device for storing digital data including computer program code and being coupled to the processor
  • an augmented reality display device for displaying digital data in augmented reality and being coupled to the processor
  • the processor is controlled by the computer program code to: receive, from the at least one sensor, the sensor input data,
  • display, using the augmented reality display device, the environmental hazard.
  • the at least one sensor is an image capture device adapted for capturing image data.
  • the processor is controlled by the computer program code to calculate the vitality indicia in accordance with an image recognition technique and the image data.
  • the processor is further controlled by the computer program code to recognise an object.
  • the processor is further controlled by the computer program code to calculate whether the object is hazardous.
  • the image recognition technique comprises text recognition technique.
  • the processor is further controlled by the computer program code to recognise text.
  • the text recognition technique is adapted to recognise text representing a hazardous substance.
  • the processor is further controlled by the computer program code to calculate whether the text represents a hazardous substance.
  • the mobile computing device further comprises a data interface for sending and receiving data across a data network and being coupled to the processor, and wherein the processor is further controlled by the computer program code to send the text to a hazardous substance lookup service.
  • the at least one sensor is a radiation measurement device adapted for generating radiation level data representing a radiation level.
  • the processor is further controlled by the computer program code to detect the environmental hazard further in accordance with the radiation level data.
  • the radiation measurement device is a UV radiation measurement device.
  • the processor is further controlled by the computer program code to calculate radiation exposure data representing radiation exposure in accordance with the radiation level data and detect the environmental hazard further in accordance with the radiation exposure data.
  • the processor is adapted to calculate the hazard of UV exposure in accordance with the UV index level and radiation exposure level.
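The cumulative UV exposure calculation described above can be sketched as a running integral of irradiance over time; the sampling interval and hazard limit below are illustrative values only, not health guidance.

```python
# Illustrative sketch: integrate UV radiation level readings into a
# cumulative exposure and flag a hazard past a (hypothetical) limit.

def uv_hazard(readings, interval_s=60.0, daily_limit=2000.0):
    """readings: sequence of UV irradiance samples (arbitrary units).

    Cumulative exposure is irradiance integrated over the sample interval;
    the environmental hazard is raised once the running total passes the
    limit, so a brief high reading and a long moderate one are treated alike.
    """
    exposure = 0.0
    for irradiance in readings:
        exposure += irradiance * interval_s
    return exposure, exposure > daily_limit

exposure, hazard = uv_hazard([0.5, 1.0, 1.5], interval_s=600)  # (1800.0, False)
```

This captures the distinction the clauses draw between the instantaneous radiation level data and the accumulated radiation exposure data.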
  • a mobile computing device for vision assistance comprising:
  • a memory device for storing digital data including computer program code and being coupled to the processor
  • an augmented reality display device for displaying digital data in augmented reality and being coupled to the processor, at least one sensor for capturing sensor input data and being coupled to the processor, wherein the processor is controlled by the computer program code to: receive, from the at least one sensor, the sensor input data,
  • the at least one sensor comprises a rearward facing image capture device adapted for capturing image data representing at least a part of the face of a wearer.
  • the at least a part of the face of the wearer is at least a part of an eye of the wearer.
  • the processor is further controlled by the computer program code to calculate orientation data representing an orientation of the eye in accordance with the image data and calculate the augmented image data further in accordance with the orientation data.
  • the device is able to calculate the augmented image data using the orientation of the eye.
  • the processor is further controlled by the computer program code to calculate a field of view data representing a field of view of the wearer in accordance with the orientation data and calculate the augmented image data further in accordance with the field of view data.
  • the at least one sensor further comprises a forward facing image capture device.
  • the processor is further controlled by the computer program code to receive, from the forward facing image capture device, view image data representing a view of the wearer and calculate the augmented image data further in accordance with the view image data.
  • the forward facing image capture device is able to capture image data of the wearer's surroundings relating to the wearer's predicted field of view.
  • the mobile computing device further comprises a user interface for receiving user input data and being coupled to the processor, wherein the processor is controlled by the computer program code to receive, from the user interface, vision abnormality data representing a vision abnormality, and calculate the augmented image data further in accordance with the vision abnormality data.
  • the vision abnormality data comprises blind spot data representing a blind spot.
  • the vision abnormality data can be used to generate blind spot data that represents the blind spot of the wearer.
  • the processor is further controlled by the computer program code to calculate blind spot image data representing a portion of an image within the wearer's blind spot in accordance with the view image data and the blind spot data.
  • the processor would adapt video of the view of the wearer to be displayed within a field of view such that it is visible to them, broadening their field of view.
  • the augmented image data comprises a superimposition of the blind spot image data and the view image data.
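The superimposition of blind spot image data onto the view image data might be sketched as a pixel-region copy; the rectangular region model and plain 2-D lists are assumptions for illustration, not the disclosed implementation.

```python
# Hypothetical sketch: pixels falling inside a known blind-spot region of the
# view image are relocated into a visible destination region of the
# augmented image, broadening the wearer's effective field of view.

def superimpose_blind_spot(view, blind_spot, destination):
    """view: 2-D list of pixels; blind_spot/destination: (row, col, h, w)."""
    augmented = [row[:] for row in view]  # start from the unmodified view
    br, bc, h, w = blind_spot
    dr, dc = destination[0], destination[1]
    for r in range(h):
        for c in range(w):
            # relocate each hidden pixel into the wearer's visible field
            augmented[dr + r][dc + c] = view[br + r][bc + c]
    return augmented
```

Reading only from the original `view` keeps the copy correct even if the source and destination regions overlap.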
  • an application server for calculating vitality indicia comprising:
  • a memory device for storing digital data including computer program code and being coupled to the processor
  • a data interface for sending and receiving data across a data network and being coupled to the processor, wherein the processor is controlled by the computer program code to:
  • the processor is controlled by the computer program code to:
  • the processor is controlled by the computer program code to calculate first body posture data representing a first body posture in accordance with the sensor input data and wherein the processor is controlled by the computer program code to calculate the vitality indicia in accordance with the first body posture data,
  • the processor is controlled by the computer program code to calculate second body posture data representing a second body posture in accordance with the sensor input data and wherein the processor is controlled by the computer program code to calculate the vitality indicia further in accordance with the second body posture data.
  • the sensor data represents image data and wherein the processor is controlled by the computer program code to calculate the first posture data further in accordance with the image data.
  • the image data is stereoscopic image data and wherein the processor is controlled by the computer program code to calculate the first body posture further in accordance with the stereoscopic image data.
  • the sensor input data represents orientation data and wherein the processor is further controlled by the computer program code to calculate the first posture data further in accordance with the orientation data
  • the processor is further controlled by the computer program code to calculate if the first body posture exceeds a posture range threshold.
  • the posture range threshold represents a height range threshold
  • the posture range threshold represents an angular range threshold
  • the processor is controlled by the computer program code to calculate reference point data representing a reference point using the image data and calculate the first body posture further in accordance with the reference point data.
  • the processor is controlled by the computer program code to receive, via the data interface, reference point data representing a reference point and calculate the first body posture further in accordance with the reference point data.
  • the processor is controlled by the computer program code to calculate distance data representing a distance from the reference point in accordance with the reference point data.
  • the processor is further controlled by the computer program code to send, via the data interface, the first posture data.
  • the processor is further controlled by the computer program code to calculate remedial action data representing a remedial action in accordance with the first posture data.
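The posture-monitoring bullets above combine a posture range threshold with a remedial action. A minimal sketch of that check, assuming an angular range in degrees and free-text remedial suggestions (both illustrative, not from the specification):

```python
def check_posture(angle_deg, angular_range=(-10.0, 10.0)):
    """Return (exceeds, remedial_action).

    exceeds is True when the measured posture angle falls outside the
    allowed angular range threshold; remedial_action is then a
    suggested correction, otherwise None.
    """
    lo, hi = angular_range
    if angle_deg < lo:
        return True, "straighten up: leaning backward"
    if angle_deg > hi:
        return True, "straighten up: leaning forward"
    return False, None
```

A height range threshold would follow the same pattern with the distance-from-reference-point data substituted for the angle.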
  • the sensor input data represents image data representing at least a part of the face of the wearer.
  • the at least a part of the face of the wearer is at least a part of an eye of the wearer.
  • the processor is controlled by the computer program code to calculate eye characteristic data representing an eye characteristic and calculate the vitality indicia further in accordance with the eye characteristic.
  • the eye characteristic is selected from the set of eye characteristics comprising
  • the processor is further controlled by the computer program code to calculate the eye characteristic data in accordance with a colour recognition technique.
  • the processor is further controlled by the computer program code to calculate the eye characteristic data in accordance with a movement recognition technique.
  • the processor is further controlled by the computer program code to calculate further eye characteristic data representing a further eye characteristic and calculate the vitality indicia further in accordance with the further eye characteristic.
  • the processor is further controlled by the computer program code to compare the eye characteristic data against normal eye characteristic data representing a normal eye characteristic.
  • the eye characteristic data represents a pupil dilation state and wherein the sensor input data comprises ambient light data representing an ambient lighting level, and wherein the processor is further controlled by the computer program code to calculate the vitality indicia further in accordance with the ambient light data and the pupil dilation state.
  • the further eye characteristic data represents a second pupil dilation state and wherein the processor is further controlled by the computer program code to calculate the vitality indicia further in accordance with the second pupil dilation state.
  • the sensor input data further comprises further ambient light data representing a further ambient light level.
  • the processor is further controlled by the computer program code to calculate the vitality indicia further in accordance with the further ambient light data.
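The bullets above pair a measured pupil dilation state with the ambient lighting level, the idea being that a pupil diameter inconsistent with the current light is itself a vitality indicium. A sketch of that comparison, with expected diameter ranges that are rough illustrative figures rather than clinical values:

```python
def pupil_vitality_indicium(pupil_mm, ambient_lux):
    """Flag an abnormal pupil dilation state for the measured ambient
    light level. The expected ranges below are illustrative only."""
    # crude expectation: bright light -> constricted, darkness -> dilated
    if ambient_lux > 1000:        # bright daylight
        expected = (2.0, 4.0)
    elif ambient_lux > 10:        # typical indoor lighting
        expected = (3.0, 5.0)
    else:                         # near darkness
        expected = (5.0, 8.0)
    lo, hi = expected
    return "normal" if lo <= pupil_mm <= hi else "abnormal"
```

A second reading under a further ambient light level, as the claims describe, would let the device check the pupillary response itself rather than a single static diameter.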
  • the sensor input data comprises audio data representing a voice command and wherein the processor is further controlled by the computer program code to calculate, using audio recognition technique, a command in accordance with the audio data.
  • the processor is further controlled by the computer program code to calculate the vitality indicia further in accordance with the command.
  • the processor is further controlled by the computer program code to receive, via the data interface, user input interface data representing user input using a user interface and wherein the processor is further controlled by the computer program code to calculate prompt data representing a prompt and output, using the user interface output, the prompt data.
  • the processor is further controlled by the computer program code to calculate the vitality indicia further in accordance with the prompt data
  • the processor is further controlled by the computer program code to calculate further prompt data representing a further prompt in accordance with the command and send, via the data interface, the further prompt data.
  • the processor is further controlled by the computer program code to calculate the vitality indicia further in accordance with the further prompt data.
  • the sensor input data comprises further audio data representing a further voice command and wherein the processor is further controlled by the computer program code to calculate, using audio recognition technique, a further command in accordance with the further audio data.
  • the processor is further controlled by the computer program code to calculate the vitality indicia further in accordance with the further command.
  • the processor is further controlled by the computer program code to calculate disease process data representing a disease in accordance with the vitality indicia.
  • the processor is further controlled by the computer program code to receive, via the data interface, further sensor input data and calculate the vitality indicia further in accordance with the further sensor input data.
  • the processor is further controlled by the computer program code to select remedy data representing a remedy in accordance with the sensor input data and the further sensor input data.
  • the remedy is selected from the set of remedies comprising medication, exercise activity, diet suggestion, lifestyle suggestion, therapy suggestion and product suggestion remedies.
  • the processor is further controlled by the computer program code to calculate remedy effectiveness data representing an effectiveness of the remedy in accordance with the further sensor input data.
  • the processor is controlled by the computer program code to send, via the data interface, the remedy data.
  • the processor is further controlled by the computer program code to send, via the data interface, emergency data representing an emergency in accordance with the vitality indicia.
  • the sensor input data represents blood oxygen saturation.
  • the sensor input data represents a breathing rate.
  • the sensor input data represents a heart rate.
  • the sensor input data represents a temperature
  • the processor is further controlled by the computer program code to receive, via the data interface, location data representing a location, and send, via the data interface, the location data.
  • the processor is further controlled by the computer program code to select emergency contact data further in accordance with the vitality indicia.
  • the processor is further controlled by the computer program code to select a proximate application server and send, via the data interface, to the proximate application server the emergency data.
  • the processor is controlled by the computer program code to send, using the data interface, medical assistance instructions.
  • the vitality indicia represents the wearer's stress level.
  • the at least one sensor is adapted to capture sensor input data selected from the set of sensor input data comprising:
  • the processor is controlled by the computer program code to send, via the data interface, the wearer's stress level.
  • an application server for detecting an environmental hazard comprising:
  • a memory device for storing digital data including computer program code and being coupled to the processor
  • a data interface for sending and receiving data across a data network and being coupled to the processor, wherein the processor is controlled by the computer program code to:
  • the sensor input data comprises image data and wherein the processor is controlled by the computer program code to calculate the vitality indicia in accordance with an image recognition technique and the image data.
  • the processor is further controlled by the computer program code to recognise an object.
  • the processor is further controlled by the computer program code to calculate whether the object is hazardous.
  • the image recognition technique comprises text recognition technique.
  • the processor is further controlled by the computer program code to recognise text.
  • the processor is further controlled by the computer program code to calculate whether the text represents a hazardous substance.
  • the processor is further controlled by the computer program code to send the text to a hazardous substance lookup service.
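The text-recognition bullets above describe recognising text in the captured image and checking it against a hazardous substance lookup service. A sketch of the matching step, using a local lookup table as a stand-in for the remote service (the table contents and function name are illustrative assumptions):

```python
# illustrative local table standing in for the remote
# hazardous substance lookup service named in the claims
HAZARDOUS_SUBSTANCES = {"sodium hydroxide", "hydrochloric acid", "asbestos"}

def text_indicates_hazard(recognised_text):
    """Check whether any fragment of the recognised text names a
    known hazardous substance (case-insensitive substring match)."""
    text = recognised_text.lower()
    return any(name in text for name in HAZARDOUS_SUBSTANCES)
```

In the claimed system the recognised text would instead be sent across the data interface and the match performed server-side against a maintained substance database.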
  • the sensor input data comprises radiation level data representing a radiation level
  • the processor is further controlled by the computer program code to detect the environmental hazard further in accordance with the radiation level data.
  • the radiation measurement data represents a UV radiation measurement.
  • the processor is further controlled by the computer program code to calculate radiation exposure data representing radiation exposure in accordance with the radiation level data and detect the environmental hazard further in accordance with the radiation exposure data.
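The radiation bullets distinguish an instantaneous radiation level from accumulated radiation exposure, and detect the hazard from the latter. A minimal sketch of that accumulation, with an arbitrary illustrative threshold and units:

```python
def accumulate_uv_exposure(samples, hazard_threshold=2.0):
    """Integrate UV level readings over time and flag a hazard once
    the accumulated dose crosses a threshold (units illustrative).

    samples -- iterable of (uv_index, duration_hours) readings
    Returns (accumulated_dose, hazard_detected).
    """
    dose = sum(uv * hours for uv, hours in samples)
    return dose, dose >= hazard_threshold
```

The same pattern applies to any radiation measurement the device supports; the UV case is simply the one the claims call out.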
  • an application server for vision assistance comprising:
  • a memory device for storing digital data including computer program code and being coupled to the processor:
  • a data interface for sending and receiving data across a data network and being coupled to the processor, wherein the processor is controlled by the computer program code to:
  • the sensor input data comprises image data representing at least a part of the face of a wearer.
  • the at least a part of the face of the wearer is at least a part of an eye of the wearer.
  • the processor is further controlled by the computer program code to calculate orientation data representing an orientation of the eye in accordance with the image data and calculate the augmented image data further in accordance with the orientation data.
  • the processor is further controlled by the computer program code to calculate a field of view data representing a field of view of the wearer in accordance with the orientation data and calculate the augmented image data further in accordance with the field of view data.
  • the sensor input data further comprises image data representing forward facing image data.
  • the processor is further controlled by the computer program code to calculate view image data representing a view of the wearer and calculate the augmented image data further in accordance with the view image data.
  • the processor is further controlled by the computer program code to receive, via the data interface, vision abnormality data representing a vision abnormality, and calculate the augmented image data further in accordance with the vision abnormality data.
  • the vision abnormality data comprises blind spot data representing a blind spot.
  • the processor is further controlled by the computer program code to calculate blind spot image data representing a portion of an image within the wearer's blind spot in accordance with the view image data and the blind spot data.
  • the augmented image data comprises a super-imposition of the blind spot image data and the view image data.
  • a computer readable storage medium for calculating vitality indicia
  • the computer readable storage medium comprising computer code instructions for a computing device and comprising instructions for:
  • the computer readable storage medium further comprises instructions for calculating the vitality indicia in accordance with at least the user input data.
  • the computer readable storage medium further comprises instructions for calculating first body posture data representing a first body posture in accordance with the sensor input data and further comprising instructions for calculating the vitality indicia in accordance with the first body posture data.
  • the computer readable storage medium further comprises instructions for calculating second body posture data representing a second body posture in accordance with the sensor input data and further comprising instructions for calculating the vitality indicia further in accordance with the second body posture data.
  • the computer readable storage medium further comprises instructions for calculating the first posture data further in accordance with image data from an image capture device.
  • the image capture device is a stereoscopic image capture device and further comprising instructions for calculating the first body posture further in accordance with stereoscopic image data from the stereoscopic image capture device.
  • the at least one sensor comprises an orientation sensor adapted for generating orientation data and further comprising instructions for calculating the first posture data further in accordance with the orientation data
  • the computer readable storage medium further comprises instructions for calculating if the first body posture exceeds a posture range threshold.
  • the posture range threshold represents a height range threshold
  • the posture range threshold represents an angular range threshold
  • the computer readable storage medium further comprises instructions for calculating reference point data representing a reference point using the image data and calculating the first body posture further in accordance with the reference point data.
  • the computer readable storage medium further comprises instructions for receiving, via a user input device, reference point data representing a reference point and calculating the first body posture further in accordance with the reference point data.
  • the computer readable storage medium further comprises instructions for calculating distance data representing a distance from the reference point in accordance with the reference point data.
  • the computer readable storage medium further comprises instructions for displaying, using the augmented reality displaying device, the first posture data.
  • the computer readable storage medium further comprises instructions for calculating remedial action data representing a remedial action in accordance with the first posture data.
  • At least one sensor comprises a rearward facing image capture device adapted for capturing image data representing at least a part of the face of the wearer.
  • the at least a part of the face of the wearer is at least a part of an eye of the wearer.
  • the computer readable storage medium further comprises instructions for calculating eye characteristic data representing an eye characteristic and calculating the vitality indicia further in accordance with the eye characteristic.
  • the eye characteristic is selected from the set of eye characteristics comprising
  • the computer readable storage medium further comprises instructions for calculating the eye characteristic data in accordance with a colour recognition technique.
  • the computer readable storage medium further comprises instructions for calculating the eye characteristic data in accordance with a movement recognition technique.
  • the computer readable storage medium further comprises instructions for calculating further eye characteristic data representing a further eye characteristic and calculating the vitality indicia further in accordance with the further eye characteristic.
  • the computer readable storage medium further comprises instructions for comparing the eye characteristic data against normal eye characteristic data representing a normal eye characteristic.
  • the eye characteristic data represents a pupil dilation state and wherein the at least one sensor further comprises an ambient light meter and further comprising instructions for receiving, from the ambient light meter, ambient light data representing an ambient lighting level, and calculating the vitality indicia further in accordance with the ambient light data and the pupil dilation state.
  • the further eye characteristic data represents a second pupil dilation state and further comprising instructions for calculating the vitality indicia further in accordance with the second pupil dilation state.
  • the computer readable storage medium further comprises instructions for receiving, from the ambient light meter, further ambient light data representing a further ambient light level, and calculating the vitality indicia further in accordance with the further ambient light data.
  • At least one sensor comprises an audio sensor, and further comprising instructions for receiving, from the audio sensor, audio data representing a voice command and calculating, using audio recognition technique, a command in accordance with the audio data.
  • the computer readable storage medium further comprises instructions for calculating the vitality indicia further in accordance with the command.
  • the computer readable storage medium further comprises instructions for calculating prompt data representing a prompt and outputting, using a user interface output, the prompt data.
  • the computer readable storage medium further comprises instructions for calculating the vitality indicia further in accordance with the prompt data.
  • the computer readable storage medium further comprises instructions for calculating further prompt data representing a further prompt in accordance with the command and outputting, using the user interface output, the further prompt data.
  • the computer readable storage medium further comprises instructions for calculating the vitality indicia further in accordance with the further prompt data.
  • the computer readable storage medium further comprises instructions for receiving, from the audio sensor, further audio data representing a further voice command and calculating, using audio recognition technique, a further command in accordance with the further audio data.
  • the computer readable storage medium further comprises instructions for calculating the vitality indicia further in accordance with the further command.
  • the computer readable storage medium further comprises instructions for calculating disease process data representing a disease in accordance with the vitality indicia.
  • the computer readable storage medium further comprises instructions for receiving, from the at least one sensor, further sensor input data and calculating the vitality indicia further in accordance with the further sensor input data.
  • the computer readable storage medium further comprises instructions for selecting remedy data representing a remedy in accordance with the sensor input data and the further sensor input data.
  • the remedy is selected from the set of remedies comprising medication, exercise activity, diet suggestion, lifestyle suggestion, therapy suggestion and product suggestion remedies.
  • the computer readable storage medium further comprises instructions for calculating remedy effectiveness data representing an effectiveness of the remedy in accordance with the further sensor input data.
  • the computer readable storage medium further comprises instructions for sending, via a data interface, the remedy data.
  • the computer readable storage medium further comprises instructions for sending, via a data interface, emergency data representing an emergency in accordance with the vitality indicia.
  • the at least one sensor is adapted to monitor blood oxygen saturation.
  • the at least one sensor is adapted to monitor a breathing rate
  • the at least one sensor is adapted to monitor a heart rate
  • the at least one sensor is adapted to monitor a temperature.
  • the computer readable storage medium further comprises instructions for receiving, from location sensing means, location data representing a location, and sending, via the data interface, the location data.
  • the computer readable storage medium further comprises instructions for selecting emergency contact data further in accordance with the vitality indicia.
  • the computer readable storage medium further comprises instructions for selecting a proximate computer readable storage medium and sending, via the data interface, to the proximate computer readable storage medium the emergency data.
  • the computer readable storage medium further comprises instructions for displaying, using the augmented reality displaying device, medical assistance instructions.
  • the vitality indicia represents the wearer's stress level.
  • the at least one sensor is adapted to capture sensor input data selected from the set of sensor input data comprising:
  • the computer readable storage medium further comprises instructions for displaying the wearer's stress level on the augmented reality displaying device in a real-time graphical format.
  • a computer readable storage medium for detecting an environmental hazard
  • the computer readable storage medium comprising computer code instructions for a computing device and comprising instructions for:
  • the at least one sensor is an image capture device adapted for capturing image data, and further comprising instructions for calculating the vitality indicia in accordance with an image recognition technique and the image data.
  • the computer readable storage medium further comprises instructions for recognising an object.
  • the computer readable storage medium further comprises instructions for calculating whether the object is hazardous.
  • the image recognition technique comprises text recognition technique.
  • the computer readable storage medium further comprises instructions for recognising text.
  • the computer readable storage medium further comprises instructions for calculating whether the text represents a hazardous substance.
  • the computer readable storage medium further comprises instructions for sending the text to a hazardous substance lookup service.
  • the at least one sensor is a radiation measurement device adapted for generating radiation level data representing a radiation level, and further comprising instructions for detecting the environmental hazard further in accordance with the radiation level data.
  • the radiation measurement device is a UV radiation measurement device.
  • the computer readable storage medium further comprises instructions for calculating radiation exposure data representing radiation exposure in accordance with the radiation level data and detecting the environmental hazard further in accordance with the radiation exposure data.
  • a computer readable storage medium for vision assistance comprising computer code instructions for a computing device and comprising instructions for: receiving, from at least one sensor, the sensor input data.
  • At least one sensor comprises a rearward facing image capture device adapted for capturing image data representing at least a part of the face of a wearer.
  • the at least a part of the face of the wearer is at least a part of an eye of the wearer.
  • the computer readable storage medium further comprises instructions for calculating orientation data representing an orientation of the eye in accordance with the image data and calculating the augmented image data further in accordance with the orientation data.
  • the computer readable storage medium further comprises instructions for calculating a field of view data representing a field of view of the wearer in accordance with the orientation data and calculating the augmented image data further in accordance with the field of view data.
  • At least one sensor further comprises a forward facing image capture device.
  • the computer readable storage medium further comprises instructions for receiving, from the forward facing image capture device, view image data representing a view of the wearer and calculating the augmented image data further in accordance with the view image data.
  • the computer readable storage medium further comprises instructions for receiving, from a user interface, vision abnormality data representing a vision abnormality, and calculating the augmented image data further in accordance with the vision abnormality data.
  • the vision abnormality data comprises blind spot data representing a blind spot.
  • the computer readable storage medium further comprises instructions for calculating blind spot image data representing a portion of an image within the wearer's blind spot in accordance with the view image data and the blind spot data.
  • the augmented image data comprises a super-imposition of the blind spot image data and the view image data.
  • a wearable computing device for diagnosing a disease
  • the wearable computing device comprising a processor for processing digital data; a memory device for storing digital data including computer program code and being coupled to the processor; and at least one sensor for capturing sensor input data and being coupled to the processor, wherein the processor is controlled by the computer program code to receive, from the at least one sensor, the sensor input data, and calculate the disease in accordance with the sensor input data.
  • the disease is an eye disease.
  • the least one sensor comprises an image capture device
  • the processor is further controlled by the computer program code to receive, via the image capture device, image data representing an image capture of at least a portion of an eye of a wearer; and calculate the eye disease in accordance with the image data.
  • the at least a portion of the eye of the wearer is the sclera of the eye
  • the processor is further controlled by the computer program code to calculate the eye disease in accordance with a colour recognition technique.
  • the eye condition is jaundice.
  • the at least a portion of the eye of the wearer is the iris of the eye.
  • the at least a portion of the eye of the wearer is the lens of the eye.
  • the eye condition is cataracts
  • the disease is a skeletal defect.
  • the skeletal defect is scoliosis.
  • At least one sensor comprises at least one of a gyroscope and an accelerometer and wherein the processor is further controlled by the computer program code to receive, from the at least one sensor, at least one of gyroscope and accelerometer data; and calculate, using the at least one of the gyroscope and accelerometer data, the skeletal defect.
  • the processor is further controlled by the computer program code to calculate a postural remedy.
  • augmented reality display device for displaying digital data in augmented reality and being coupled to the processor, wherein the processor is further controlled by the computer program code to display, using the augmented reality display device, the postural remedy.
  • the disease is diabetes.
  • the at least one sensor comprises a blood glucose level sensor.
  • the blood glucose level sensor is an in-ear sensor.
  • the disease is a neuromuscular disease.
  • the neuromuscular disease is Parkinson's disease.
  • the processor is further controlled by the computer program code to calculate a gait remedy.
  • augmented reality display device for displaying digital data in augmented reality and being coupled to the processor; and wherein the processor is further controlled by the computer program code to display, using the augmented reality display device, the gait remedy.
  • the gait remedy comprises virtual foot guides.
  • the disease is colour blindness.
  • augmented reality display device for displaying digital data in augmented reality and being coupled to the processor; and wherein the processor is further controlled by the computer program code to display, using the augmented reality display device, a colour blindness corrected image.
  • the colour blindness corrected image comprises the substitution of at least one colour with another colour.
  • the at least one colour is red.
  • the another colour is yellow.
  • the processor is further controlled by the computer program code to calculate the proximity of the colour red and the colour green.
  • augmented reality display device for displaying digital data in augmented reality and being coupled to the processor; display, using the augmented reality display device, the colour blindness corrected image.
  • augmented reality display device for displaying digital data in augmented reality and being coupled to the processor; display, using the augmented reality display device, a colour blind test message.
  • comprising a user input interface adapted for receiving user input data, and wherein the processor is further controlled by the computer program code to receive, via the user input interface, acknowledgement data representing an acknowledgement of the colour blind test message.
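The colour-blindness correction above substitutes one colour for another, with red replaced by yellow as the claimed example. A minimal sketch over RGB pixel tuples; the exact-match substitution is a simplification (a real implementation would match a red hue range and likely weight the substitution by the claimed red/green proximity):

```python
def correct_for_red_green(pixels):
    """Substitute red pixels with yellow so a red/green colour-blind
    wearer can distinguish them from green. Exact-match simplification
    of the claimed colour substitution."""
    RED, YELLOW = (255, 0, 0), (255, 255, 0)
    return [YELLOW if p == RED else p for p in pixels]
```

Applied frame by frame to the forward facing camera feed, the corrected image is then shown on the augmented reality display device.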
  • the disease is a balance disorder.
  • the at least one sensor comprises at least one of a gyroscope and accelerometer and wherein the processor is further controlled by the computer program code to receive, from the at least one of the gyroscope and accelerometer at least one of positional and acceleration data; and calculate an impending imbalance in accordance with the at least one of positional and acceleration data.
  • the at least one sensor is adapted for location proximate and peripheral to the waist of a user.
  • the at least one sensor is adapted for location within a belt.
  • the disease is a neuromuscular disease.
  • the neuromuscular disease is indicated by at least one of tremors and twitching.
  • the at least one sensor is adapted for mounting proximate the wrist of a user.
  • the disease is Alzheimer's.
  • the at least one sensor is adapted to measure the intracranial pressure of a user.
  • the at least one sensor is adapted to measure the intracranial pressure of the optic nerve sheath of a user.
  • the at least one sensor is adapted to employ an ultrasonic technique in measuring the intracranial pressure.
  • the at least one sensor is adapted to measure the stapedial reflex of a user.
  • At least one sensor is adapted to employ an endoscopy technique.
  • the at least one sensor is adapted for detecting papilledema.
  • the at least one sensor comprises an electroencephalography sensor.
  • the at least one sensor comprises an infrared spectroscopy sensor.
  • the disease is loss of hearing.
  • [556] Preferably, comprising an audio play out device, and wherein the processor is further controlled by the computer program code to play out, using the audio play out device, a test audio signal.
  • the test audio signal is characterised in frequency.
  • the test audio signal is characterised in volume.
  • [559] Preferably, comprising a user input interface adapted for receiving user input data, and wherein the processor is further controlled by the computer program code to receive, via the user input interface, acknowledgement data representing an acknowledgement of the test audio signal.
  • the disease is intoxication.
  • the at least one sensor is adapted for measuring a blood alcohol content of a user.
  • the at least one sensor is adapted for tissue spectrometry.
  • the at least one sensor is adapted for breath spectrometry.
  • the at least one sensor comprises a transdermal alcohol sensor.
  • the disease is tooth decay.
  • the at least one sensor comprises an image capture device adapted for capturing image data representing at least a portion of the mouth of a user and wherein the processor is further controlled by the computer program code to calculate the disease in accordance with the image data.
  • the processor is further controlled by the computer program code to send, via the data interface, alert data indicative of the disease.
  • the processor is further controlled by the computer program code to calculate a remedial medication in accordance with the disease.
  • the processor is further controlled by the computer program code to determine a side-effect of the remedial medication.
  • a server for diagnosing a disease comprising a processor for processing digital data; a memory device for storing digital data including computer program code and being coupled to the processor; and a data interface for sending and receiving data across a data network, the data interface being coupled to the processor, wherein the processor is controlled by the computer program code to receive, via the data interface, sensor input data; and calculate the disease in accordance with the sensor input data.
  • the disease is an eye disease.
  • the processor is further controlled by the computer program code to receive, via the data interface, image data representing an image capture of at least a portion of an eye of a wearer; and calculate the eye disease in accordance with the image data.
  • the at least a portion of the eye of the wearer is the sclera of the eye.
  • the processor is further controlled by the computer program code to calculate the eye disease in accordance with a colour recognition technique.
  • the eye condition is jaundice.
  • the at least a portion of the eye of the wearer is the iris of the eye.
  • the at least a portion of the eye of the wearer is the lens of the eye.
  • the eye condition is cataracts.
  • the disease is a skeletal defect.
  • a skeletal defect is scoliosis.
  • the processor is further controlled by the computer program code to receive, via the data interface, at least one of gyroscope and accelerometer data; and calculate, using the at least one of the gyroscope and accelerometer data, the skeletal defect.
  • the processor is further controlled by the computer program code to calculate a postural remedy.
  • the processor is further controlled by the computer program code to send, via the data interface, display data for displaying using an augmented reality display device, the display data representing the postural remedy.
  • the disease is diabetes.
  • the sensor input data represents a blood glucose level.
  • the blood glucose level is an in-ear blood glucose level.
  • the disease is a neuromuscular disease.
  • the neuromuscular disease is Parkinson's disease.
  • the processor is further controlled by the computer program code to calculate a gait remedy.
  • the processor is further controlled by the computer program code to send, via the data interface, display data for display using an augmented reality display device, the display data representing the gait remedy.
  • the gait remedy comprises virtual foot guides.
  • the disease is colour blindness.
  • the processor is further controlled by the computer program code to send, via the data interface, display data for display using an augmented reality display device, the display data representing a colour blindness corrected image.
  • the colour blindness corrected image comprises the substitution of at least one colour with another colour.
  • the at least one colour is red.
  • the another colour is yellow.
  • the processor is further controlled by the computer program code to calculate the proximity of the colour red and the colour green.
  • the processor is further controlled by the computer program code to send, via the data interface, display data for display using an augmented reality display device, the display data representing the colour blindness corrected image.
  • the processor is further controlled by the computer program code to send, via the data interface, display data for display using an augmented reality display device, the display data representing a colour blind test message.
  • the processor is further controlled by the computer program code to receive, via the data interface, acknowledgement data representing an acknowledgement of the colour blind test message.
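By way of illustration only, the red-to-yellow colour substitution recited in the claims above might be sketched as follows. The pixel format and the red-detection thresholds are hypothetical placeholders chosen for this sketch; the specification does not fix them.

```python
def correct_for_protanopia(pixels):
    """Return a copy of an RGB image with 'red' pixels replaced by yellow.

    `pixels` is a list of rows of (r, g, b) tuples, 0-255 per channel.
    The red test below is a deliberately crude placeholder for whatever
    colour recognition technique an implementation would actually use.
    """
    YELLOW = (255, 255, 0)

    def is_red(r, g, b):
        # Assumed thresholds -- not taken from the specification.
        return r > 150 and g < 100 and b < 100

    return [[YELLOW if is_red(*p) else p for p in row] for row in pixels]

image = [[(200, 30, 40), (10, 120, 30)],
         [(90, 90, 90), (255, 0, 0)]]
corrected = correct_for_protanopia(image)
print(corrected[0][0])  # (255, 255, 0)
print(corrected[0][1])  # (10, 120, 30)
```

A real augmented-reality pipeline would apply the same substitution per video frame before display.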
  • the disease is a balance disorder.
  • the processor is further controlled by the computer program code to receive, via the data interface, at least one of positional and acceleration data; and calculate an impending imbalance in accordance with the at least one of positional and acceleration data.
  • the positional and acceleration data represents at least one of a position and acceleration proximate a waist of a user.
  • proximate a waist of a user is proximate a belt of the user.
  • the processor is further controlled by the computer program code to send, via the data interface, deployment data representing an instruction to deploy at least one airbag.
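The impending-imbalance calculation recited above is not detailed in the claims. One plausible sketch derives a tilt angle from waist-mounted accelerometer samples and flags a fall risk when the tilt exceeds a limit; the 25-degree limit and the axis convention below are assumptions for illustration only.

```python
import math

# Hypothetical threshold -- the claims do not specify a value.
TILT_LIMIT_DEG = 25.0  # tilt beyond which a fall is considered likely

def tilt_angle_deg(ax: float, ay: float, az: float) -> float:
    """Angle between the measured acceleration vector and vertical (degrees)."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        return 0.0
    # Assume az is the component along the wearer's vertical axis when upright.
    cos_theta = max(-1.0, min(1.0, az / g))
    return math.degrees(math.acos(cos_theta))

def impending_imbalance(samples: list) -> bool:
    """Flag an impending imbalance when any sample's tilt exceeds the limit."""
    return any(tilt_angle_deg(*s) > TILT_LIMIT_DEG for s in samples)

# Upright wearer (gravity mostly on the z axis) vs. a wearer starting to fall.
upright = [(0.1, 0.0, 9.8), (0.0, 0.2, 9.7)]
falling = [(0.1, 0.0, 9.8), (6.0, 0.0, 7.0)]
print(impending_imbalance(upright))  # False
print(impending_imbalance(falling))  # True
```

The airbag deployment instruction would be sent only when this flag is raised.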
  • the disease is a neuromuscular disease.
  • the neuromuscular disease is indicated by at least one of tremors and twitching.
  • the sensor input data comprises sensor input data obtained from proximate the wrist of a user.
  • the disease is Alzheimer's.
  • the sensor input data represents intracranial pressure of a user.
  • the sensor input data represents optic nerve sheath intracranial pressure.
  • the sensor input data is obtained utilising an ultrasonic technique.
  • the sensor input data is a measurement of the stapedial reflex of a user.
  • the sensor input data represents a fundoscopy measurement.
  • the sensor input data is papilledema sensor input data.
  • the sensor input data is electroencephalography sensor input data.
  • the sensor input data is infrared spectroscopy sensor input data.
  • the disease is loss of hearing.
  • the processor is further controlled by the computer program code to send, via the data interface, audio data for play out using an audio play out device, the audio data representing a test audio signal.
  • the test audio signal is characterised in frequency.
  • the test audio signal is characterised in volume.
  • the processor is further controlled by the computer program code to receive, via the data interface, acknowledgement data representing an acknowledgement of the test audio signal.
  • the disease is intoxication.
  • the sensor input data is blood alcohol content sensor input data.
  • the sensor input data is tissue spectrometry sensor input data.
  • the sensor input data is breath spectrometry sensor input data.
  • the sensor input data is transdermal alcohol sensor input data.
  • the disease is tooth decay.
  • the processor is further controlled by the computer program code to calculate the disease in accordance with image data.
  • the processor is further controlled by the computer program code to send, via the data interface, alert data indicative of the disease.
  • the processor is further controlled by the computer program code to calculate a remedial medication in accordance with the disease.
  • the processor is further controlled by the computer program code to determine a side-effect of the remedial medication.
  • a computer readable storage medium for diagnosing a disease.
  • the computer readable storage medium comprising computer code instructions for receiving sensor input data; and calculating the disease in accordance with the sensor input data.
  • the disease is an eye disease.
  • the instructions further comprise instructions for receiving image data representing an image capture of at least a portion of an eye of a wearer; and calculating the eye disease in accordance with the image data.
  • the at least a portion of the eye of the wearer is the sclera of the eye.
  • the instructions further comprise instructions for calculating the eye disease in accordance with a colour recognition technique.
  • the eye condition is jaundice.
  • the at least a portion of the eye of the wearer is the iris of the eye.
  • the at least a portion of the eye of the wearer is the lens of the eye.
  • the eye condition is cataracts.
  • the disease is a skeletal defect.
  • a skeletal defect is scoliosis.
  • the instructions further comprise instructions for receiving at least one of gyroscope and accelerometer data; and calculating, using the at least one of the gyroscope and accelerometer data, the skeletal defect.
  • the instructions further comprise instructions for calculating a postural remedy.
  • the instructions further comprise instructions for sending display data for displaying using an augmented reality display device, the display data representing the postural remedy.
  • the disease is diabetes.
  • the sensor input data represents a blood glucose level.
  • the blood glucose level is an in-ear blood glucose level.
  • the disease is a neuromuscular disease.
  • the neuromuscular disease is Parkinson's disease.
  • the instructions further comprise instructions for calculating a gait remedy.
  • the instructions further comprise instructions for sending display data for display using an augmented reality display device, the display data representing the gait remedy.
  • the gait remedy comprises virtual foot guides.
  • the disease is colour blindness.
  • the instructions further comprise instructions for sending display data for display using an augmented reality display device, the display data representing a colour blindness corrected image.
  • the colour blindness corrected image comprises the substitution of at least one colour with another colour.
  • the at least one colour is red.
  • the another colour is yellow.
  • the instructions further comprise instructions for calculating the proximity of the colour red and the colour green.
  • the instructions further comprise instructions for sending display data for display using an augmented reality display device, the display data representing the colour blindness corrected image.
  • the instructions further comprise instructions for sending display data for display using an augmented reality display device, the display data representing a colour blind test message.
  • the instructions further comprise instructions for receiving acknowledgement data representing an acknowledgement of the colour blind test message.
  • the disease is a balance disorder.
  • the instructions further comprise instructions for receiving at least one of positional and acceleration data; and calculating an impending imbalance in accordance with the at least one of positional and acceleration data.
  • the positional and acceleration data represents at least one of a position and acceleration proximate a waist of a user.
  • proximate a waist of a user is proximate a belt of the user.
  • the instructions further comprise instructions for sending deployment data representing an instruction to deploy at least one airbag.
  • the disease is a neuromuscular disease.
  • the neuromuscular disease is indicated by at least one of tremors and twitching.
  • the sensor input data comprises sensor input data obtained from proximate the wrist of a user.
  • the disease is Alzheimer's.
  • the sensor input data represents intracranial pressure of a user.
  • the sensor input data represents optic nerve sheath intracranial pressure.
  • the sensor input data is obtained utilising an ultrasonic technique.
  • the sensor input data is a measurement of the stapedial reflex of a user.
  • the sensor input data represents a fundoscopy measurement.
  • the sensor input data is papilledema sensor input data.
  • the sensor input data is electroencephalography sensor input data.
  • the sensor input data is infrared spectroscopy sensor input data.
  • the disease is loss of hearing.
  • the instructions further comprise instructions for sending audio data for play out using an audio play out device, the audio data representing a test audio signal.
  • the test audio signal is characterised in frequency.
  • the test audio signal is characterised in volume.
  • the instructions further comprise instructions for receiving acknowledgement data representing an acknowledgement of the test audio signal.
  • the disease is intoxication.
  • the sensor input data is blood alcohol content sensor input data.
  • the sensor input data is tissue spectrometry sensor input data.
  • the sensor input data is breath spectrometry sensor input data.
  • the sensor input data is transdermal alcohol sensor input data.
  • the disease is tooth decay.
  • the instructions further comprise instructions for calculating the disease in accordance with image data.
  • the instructions further comprise instructions for sending alert data indicative of the disease.
  • the instructions further comprise instructions for calculating a remedial medication in accordance with the disease.
  • the instructions further comprise instructions for determining a side-effect of the remedial medication.
  • a mobile computing device for detecting a user anxiety disorder.
  • the mobile computing device comprising a processor for processing digital data; a memory device for storing digital data including computer program code and being coupled to the processor; a sensor for capturing environment input data and being coupled to the processor, wherein the processor is controlled by the computer program code to: receive, from the sensor, the environment input data; and calculate the occurrence of the anxiety disorder in accordance with the environment input data.
  • the mobile computing device is adapted to automate the detection of one or more anxiety disorders, such that the anxiety disorder may be treated.
  • Such automated detection removes the need for specialised healthcare professionals, making treatment available at lower cost to more sufferers.
  • the environment input data comprises image data.
  • the mobile computing device is operable to recognise various scenes, scenarios, objects and the like within the user's surrounds for the purposes of detecting the occurrence of the anxiety disorder.
  • the processor is further controlled by the computer program code to calculate the occurrence of the anxiety disorder in accordance with an image recognition technique.
  • the image recognition technique is adapted for recognising an object.
  • the image recognition technique is adapted for receiving first image data at a first time and second image data at a second later time, and comparing the first image data and the second image data.
  • the mobile computing device is adapted to compare scenes at different points in time for the purposes of recognising repetition anxiety disorders.
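Comparing first and second image data, as recited above, might be approximated by a simple frame-differencing measure: a small difference between an earlier and a later frame suggests the same scene is being revisited, a possible marker of repetition behaviour. The grayscale pixel format and the threshold are placeholders for illustration.

```python
def frame_difference(first, second) -> float:
    """Mean absolute per-pixel difference between two grayscale frames.

    Frames are equal-sized lists of rows of 0-255 intensities.
    """
    total, count = 0, 0
    for row_a, row_b in zip(first, second):
        for a, b in zip(row_a, row_b):
            total += abs(a - b)
            count += 1
    return total / count

SAME_SCENE_THRESHOLD = 5.0  # hypothetical tuning value

earlier = [[10, 10], [200, 200]]
later = [[12, 9], [201, 198]]
print(frame_difference(earlier, later) < SAME_SCENE_THRESHOLD)  # True
```

A deployed system would more likely use a robust scene-matching or object-recognition technique, but the two-time comparison structure is the same.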
  • the image data comprises video data.
  • the environmental input data comprises audio data.
  • the mobile computing device may be adapted for analysing other aspects of the user's environment, such as by recognising events using audio data. In this manner, the mobile computing device may be adapted to use sound in diagnosing certain anxiety disorders, such as obsessive counting, or antisocial behaviour such as swearing, shouting and the like.
  • the processor is further controlled by the computer program code to calculate the occurrence of the anxiety disorder in accordance with an audio recognition technique.
  • the audio recognition technique is adapted for recognising a sound.
  • the environmental input data comprises acceleration data.
  • the mobile computing device is adapted for detecting motion and therefore for detecting certain anxiety disorders characterised by the motion of the individual, such as excessive washing, repetition and the like.
  • the processor is further controlled by the computer program code to calculate the occurrence of the anxiety disorder in accordance with a movement recognition technique.
  • the movement recognition technique comprises recognising a movement.
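A minimal movement recognition sketch, assuming a single-axis acceleration trace from the device's accelerometer: each rising crossing of a threshold is counted as one repetition of a movement (e.g. a scrubbing stroke in excessive washing). The threshold value is illustrative only.

```python
def repetition_count(acceleration, threshold: float = 2.0) -> int:
    """Count rising threshold crossings in a single-axis acceleration trace.

    Each rising crossing of `threshold` is treated as one repetition of a
    movement. The threshold is a placeholder, not a specified value.
    """
    count = 0
    above = False
    for a in acceleration:
        if not above and a > threshold:
            count += 1
            above = True
        elif above and a < threshold:
            above = False
    return count

trace = [0.1, 2.5, 0.3, 2.8, 0.2, 3.1, 0.0]
print(repetition_count(trace))  # 3
```

An unusually high repetition count over a time window could then be flagged as a possible occurrence of the anxiety disorder.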
  • the mobile computing device further comprises an interface for outputting information, and wherein the processor is further controlled by the computer program code to output indication data representing an indication of the occurrence of the anxiety disorder.
  • the mobile computing device is operable to provide feedback to the user such that the user may take corrective action.
  • the interface comprises a haptic interface.
  • the haptic interface is adapted to vibrate.
  • the interface comprises a data interface for sending data across a data network, and wherein the processor is further controlled by the computer program code to send the indication data via the data interface.
  • the interface comprises a display device, and wherein the processor is further controlled by the computer program code to display the indication data using the display device.
  • the display device comprises an augmented reality display device.
  • the processor is further controlled by the computer program code to display the indication of the occurrence of the anxiety disorder in augmented reality.
  • the processor is further controlled by the computer program code to display instructions for addressing the anxiety disorder.
  • the mobile computing device may be adapted to not only detect an anxiety disorder but also provide the means for remedying the disorder.
  • the mobile computing device comprises a wearable portion.
  • the mobile computing device comprises a headset.
  • the sensor is located at the headset.
  • an application server for detecting a user anxiety disorder.
  • the application server comprising a processor for processing digital data; a memory device for storing digital data including computer program code and being coupled to the processor; a network interface for sending and receiving data across a data network and being coupled to the processor, wherein the processor is controlled by the computer program code to receive, via the network interface, the environment input data; and calculate the occurrence of the anxiety disorder in accordance with the environment input data.
  • the environment input data comprises image data.
  • the processor is further controlled by the computer program code to calculate the occurrence of the anxiety disorder in accordance with an image recognition technique.
  • the image recognition technique is adapted for recognising an object.
  • the image recognition technique is adapted for receiving first image data at a first time and second image data at a second later time, and comparing the first image data and the second image data.
  • the image data comprises video data.
  • the environmental input data comprises audio data.
  • the processor is further controlled by the computer program code to calculate the occurrence of the anxiety disorder in accordance with an audio recognition technique.
  • the audio recognition technique is adapted for recognising a sound.
  • the environmental input data comprises acceleration data.
  • the processor is further controlled by the computer program code to calculate the occurrence of the anxiety disorder in accordance with a movement recognition technique.
  • the movement recognition technique comprises recognising a movement.
  • the processor is further controlled by the computer program code to send, via the network interface, an indication of the occurrence of the anxiety disorder.
  • the indication data is adapted for display by a display device.
  • the display device comprises an augmented reality display device.
  • the indication data further comprises instructions for addressing the anxiety disorder.
  • the indication data is adapted for use by a haptic interface.
  • the haptic interface is adapted to vibrate.
  • a computer readable storage medium for detecting a user anxiety disorder.
  • the computer readable storage medium having computer program code instructions recorded thereon, the computer program code instructions being executable by a computer and comprising instructions for receiving, via a network interface, the environment input data; and instructions for calculating the occurrence of the anxiety disorder in accordance with the environment input data.
  • the environment input data comprises image data.
  • the computer readable storage medium further comprises instructions for calculating the occurrence of the anxiety disorder in accordance with an image recognition technique.
  • the image recognition technique is adapted for recognising an object.
  • the image recognition technique is adapted for receiving first image data at a first time and second image data at a second later time, and comparing the first image data and the second image data.
  • the image data comprises video data.
  • the environmental input data comprises audio data.
  • the computer readable storage medium further comprises instructions for calculating the occurrence of the anxiety disorder in accordance with an audio recognition technique.
  • the audio recognition technique is adapted for recognising a sound.
  • the environmental input data comprises acceleration data.
  • the computer readable storage medium further comprises instructions for calculating the occurrence of the anxiety disorder in accordance with a movement recognition technique.
  • the movement recognition technique comprises recognising a movement.
  • the computer readable storage medium further comprises instructions for sending, via the network interface, an indication of the occurrence of the anxiety disorder.
  • the indication data is adapted for display by a display device.
  • the display device comprises an augmented reality display device.
  • the indication data further comprises instructions for addressing the anxiety disorder.
  • the indication data is adapted for use by a haptic interface.
  • the haptic interface is adapted to vibrate.
  • a system for detecting a user anxiety disorder comprising a wearable device; and an application server; wherein the application server is adapted to receive from the wearable device environment input data, and the application server is adapted to calculate the occurrence of the anxiety disorder in accordance with the environment input data.
  • the environment input data comprises image data.
  • the application server is adapted to calculate the occurrence of the anxiety disorder in accordance with an image recognition technique.
  • the image recognition technique is adapted for recognising an object.
  • the image recognition technique is adapted for receiving first image data at a first time and second image data at a second later time, and comparing the first image data and the second image data.
  • the image data comprises video data.
  • the environmental input data comprises audio data.
  • the application server is adapted to calculate the occurrence of the anxiety disorder in accordance with an audio recognition technique.
  • the audio recognition technique is adapted for recognising a sound.
  • the audio recognition technique is adapted for receiving first audio data at a first time and second audio data at a second later time, and comparing the first audio data and the second audio data.
  • the environmental input data comprises acceleration data.
  • the application server is adapted to calculate the occurrence of the anxiety disorder in accordance with a movement recognition technique.
  • the movement recognition technique comprises recognising a movement.
  • the movement recognition technique is adapted for receiving first movement data at a first time and second movement data at a second later time, and comparing the first movement data and the second movement data.
  • the application server further comprises a data interface for outputting information, and upon the occurrence of the anxiety disorder, the application server is adapted to output, via the data interface, indication data representing an indication of the occurrence of the anxiety disorder to the wearable device; and wherein the wearable device comprises a device output interface for outputting information to the user.
  • the device output interface comprises a haptic interface.
  • the haptic interface is adapted to vibrate.
  • the device output interface is a display device, and wherein the wearable device is adapted to display the indication data using the display device.
  • the display device comprises an augmented reality display device.
  • the wearable device is adapted to display the indication of the occurrence of the anxiety disorder in augmented reality.
  • the wearable device is adapted to display instructions for addressing the anxiety disorder.
  • the wearable device comprises a headset.
  • the sensor is located at the headset.
  • the device output interface is an audio interface.
  • the wearable device is adapted to play out the indication data using the audio interface.
  • Fig. 1 shows a mobile computing device on which the various embodiments described herein may be implemented in accordance with an embodiment of the present invention.
  • Fig. 2 shows a network of computing devices on which the various embodiments described herein may be implemented in accordance with an embodiment of the present invention.
  • Fig. 3A shows an exemplary wearable mobile computing device taking the form of a headset in accordance with a preferred embodiment of the present invention.
  • Fig. 3B shows an exemplary wearable mobile computing device taking the form of a headset in accordance with a preferred embodiment of the present invention.
  • Fig. 4 shows an embodiment of the mobile computing device 100 on which the various embodiments described herein may be implemented in accordance with an embodiment of the present invention.
  • Fig. 5 shows a user wearing a mobile computing device adapted to project markers on the user's surroundings to define reference points, and a construction of a computer-generated model of the user, in accordance with an embodiment of the present invention.
  • Fig. 6 shows a graphical means of displaying the actions performed by a user of the mobile computing device in accordance with an embodiment of the present invention.
  • Fig. 7 shows a graphical means of displaying stress vitality of the user of the mobile computing device, as well as possible shape and colour variations of this graph, in accordance with an embodiment of the present invention.
  • Fig. 8 shows an overhead view of the first user of the mobile computing device lying down after a heart attack, as observed by a second user of the mobile computing device, and display means that may be used for the second user, in accordance with an embodiment of the present invention.
  • Fig. 9 shows a feedback system that uses user input data and environmental input data to diagnose a health condition in accordance with an embodiment of the present invention.
  • Fig. 10 shows the field of view as observed by the user of the mobile computing device suffering from a blind spot, and possible display means to include the obstructed field of view, in accordance with an embodiment of the present invention.
  • Fig. 11 shows various types of blind spots of the user of the mobile computing device, and an example of how a complete field of view can be displayed, in accordance with an embodiment of the present invention.
  • Fig. 1 shows a mobile computing device 100 on which the various embodiments described herein may be implemented. In particular, the steps of detecting vitality indicia may be implemented as computer program code instructions executable by the mobile computing device 100.
  • the mobile computing device 100 is adapted for use by users, carers and the like in calculating various vitality indicia.
  • the vitality indicia may comprise any indicia of general well-being, health, lifestyle and the like and may include indicia relating to a user's state of sleep, food consumption, exposure to environmental hazards, disease symptoms and the like.
  • a user may use the mobile computing device 100 for continually monitoring one or more vitality indicia.
  • the mobile computing device 100 may be adapted to continually measure whether the user has had enough sleep and alert the user when the user is in a state of sleep deprivation.
  • the mobile computing device 100 may be adapted to measure one or more vital signs of the user, such as temperature, blood pressure, heart rate, blood sugar levels and the like in the early detection of disease onset.
  • the mobile computing device 100 may also be used by carers, such as doctors, nurses and the like in the care of patients. For example, a doctor or nurse may use the mobile computing device 100 to provide one or more vitality indicia relating to a patient.
  • the mobile computing device 100 is a wearable mobile computing device, such as, for example, the glasses 100b as substantially shown in figure 3.
  • the mobile computing device 100 may comprise constituent parts, where for example, the mobile computing device 100 comprises a headset coupled to a computing device, such as a computing device located in the user's pocket or located in a remote location, wherein the constituent parts cooperate for the purpose of detecting vitality indicia in the manner described herein.
  • the mobile computing device 100 is adapted for providing augmented reality information.
  • augmented reality overlay 310 displayed by the mobile computing device 100 showing various vitality indicia 305.
  • the mobile computing device 100 is adapted for calculating a vitality indicia, such that remedial steps may be taken where the vitality indicia represents an unhealthy state.
  • remedial steps may comprise alerting the user, providing instructions to the user, or alerting a third person so as to be able to assist the user.
  • the computer program code instructions may be divided into one or more computer program code instruction libraries, such as dynamic link libraries (DLL), wherein each of the libraries performs one or more steps of the method. Additionally, a subset of the one or more of the libraries may perform graphical user interface tasks relating to the steps of the method.
  • the device 100 comprises semiconductor memory 110 comprising volatile memory such as random access memory (RAM) or read only memory (ROM).
  • the memory 110 may comprise either RAM or ROM or a combination of RAM and ROM.
  • the device 100 comprises a computer program code storage medium reader 130 for reading the computer program code instructions from computer program code storage media 120.
  • the storage media 120 may be optical media such as CD-ROM disks, magnetic media such as floppy disks and tape cassettes or flash media such as USB memory sticks.
  • the device 100 further comprises I/O interface 140 for communicating with one or more peripheral devices.
  • the I/O interface 140 may offer both serial and parallel interface connectivity.
  • the I/O interface 140 may comprise a Small Computer System interface (SCSI), Universal Serial Bus (USB) or similar I/O interface for interfacing with the storage medium reader 130.
  • the I/O interface 140 may also communicate with one or more human input devices (HID) 160 such as keyboards, pointing devices, joysticks and the like.
  • the I/O interface 140 may also comprise a computer to computer interface, such as a Recommended Standard 232 (RS-232) interface, for interfacing the device 100 with one or more personal computer (PC) devices 190.
  • the I/O interface 140 may also comprise an audio interface for communicating audio signals to one or more audio devices 1050, such as a speaker or a buzzer.
  • the device 100 also comprises a network interface 170 for communicating with one or more computer networks 180.
  • the network 180 may be a wired network, such as a wired Ethernet network, or a wireless network, such as a Bluetooth network or IEEE 802.11 network.
  • the network 180 may be a local area network (LAN), such as a home or office computer network, or a wide area network (WAN), such as the internet or private WAN.
  • the device 100 comprises an arithmetic logic unit or processor 1000 for performing the computer program code instructions.
  • the processor 1000 may be a reduced instruction set computer (RISC) or complex instruction set computer (CISC) processor or the like.
  • the device 100 further comprises a storage device 1030, such as a magnetic disk hard drive or a solid state disk drive.
  • Computer program code instructions may be loaded into the storage device 1030 from the storage media 120 using the storage medium reader 130 or from the network 180 using network interface 170.
  • an operating system and one or more software applications are loaded from the storage device 1030 into the memory 110.
  • the processor 1000 fetches computer program code instructions from memory 110, decodes the instructions into machine code, executes the instructions and stores one or more intermediate results in memory 110.
  • the instructions stored in the memory 110, when retrieved and executed by the processor 1000, may configure the mobile computing device 100 as a special-purpose machine that may perform the functions described herein.
  • the device 100 also comprises a video interface 1010 for sending and receiving video signals to a display device 1020, such as a liquid crystal display.
  • the display device 1020 is an augmented reality display device as substantially shown in figure 3. In this manner, the device 100 is adapted to display information to the user in augmented reality.
  • the device 100 further comprises an environment sensor 105.
  • the environment sensor 105 is adapted for receiving environment input from which the vitality indicia may be calculated.
  • the environment sensor 105 is adapted to detect infrared radiation so as to be able to ascertain the surface temperature of a patient so as to, for example, alert a doctor as to a fever state of the patient.
  • the environment sensor 105 is adapted to detect sound, so as to, for example, be able to detect repetitive coughing of the user for the purposes of detecting the onset of flu, or, in another example, sound levels for the purposes of warning the user of potentially damaging sound levels.
  • the environment sensor 105 may be adapted for capturing image or video data, so as to allow for image or video recognition techniques to be employed for detecting certain events or scenarios.
  • the environment sensor 105 may be adapted for monitoring a user's eyelids, so as to determine the user's level of awareness or tiredness and whether the user is in an awakened state. In this manner, the mobile computing device 100 may be adapted for monitoring the level of awareness or tiredness of the user.
  • the image or video data from the environment sensor 105 may be employed in the detection of other aspects, such as determining the nutritional composition of a plate of food (e.g., protein, carbohydrates and fats) using image data, or calculating patient compliance with a treatment regime, such as by using video data for the purposes of detecting whenever a patient takes the tablets prescribed by a doctor.
  • the environment sensor 105 may be adapted for measuring acceleration, so as to be able to detect the motion and orientation of the user.
  • the motion detected by the environment sensor 105 may be employed in calculating a fitness vitality indicia, wherein the environment sensor 105 is adapted for detecting when the user is jogging, walking, working out and the like.
  • the orientation of the user may be used in the detection of the resting state of the user, wherein the horizontal position usually indicates that the user is in the resting state.
  • the environment sensor 105 may be adapted for measuring other vital signs of the user, such as heart rate, blood pressure, blood sugar levels and the like all of which may be employed in, for example, the detection of a disease process such as diabetes.
  • the environment sensor 105 may be adapted for measuring environment hazards, such as UV light levels, radiation, toxic chemicals and the like. In this manner, the mobile computing device 100 may be adapted for alerting the user to an incident environmental hazard (e.g., whether the user is in an atmosphere comprising high levels of carbon monoxide) or cumulative environmental hazards, such as prolonged UV or other radiation exposure levels.
  • the environment sensor 105 may be adapted for measuring further environmental inputs depending on the application.
  • the environment sensor 105 may be proximate with the computing device 100.
  • the environment sensor 105 may be integrated within the housing of the augmented reality headset 100b, such as by being a miniaturised accelerometer or the like located within a temple of the headset 100b.
  • the remote sensor 105 may be located away from the computing device 100, adapted to communicate with the computing device by suitable wired or wireless transmission means.
  • wireless transmission means such as Bluetooth is preferred from an ergonomic perspective.
  • such a remotely located environment sensor 105 may take the form of a wristband, comprising one or more of the above-mentioned sensors.
  • the wristband, being so located proximate to the wrist of the user, is ideally suited for measuring pulse rate and other vitality indicia of the user.
  • the computing device 100 is able to monitor the hand movements of the user which may be used for receiving instructions from the user, or for diagnostic purposes.
  • the computing device 100, utilising accelerometer data from the wristband, may be adapted to detect tremors which may be an indication of Parkinson's disease or other neuromuscular disease.
  • the environment sensor 105 may be adapted specifically for detecting Alzheimer's disease.
  • Alzheimer's disease can be detected up to 10 years in advance by detecting loss in brain matter.
  • the loss in brain matter may be accompanied by an increase in cerebral spinal fluid (CSF) and a corresponding change in intracranial pressure, not dissimilar to the symptoms of hydrocephalus.
  • the environment sensor 105 may be adapted to measure intracranial pressure of the user utilising a non-contact sensing technique such as ultrasound measurements of the optic nerve sheath of the user.
  • the computing device 100 may be adapted to measure the stapedial reflex (such as in the ear) or alternatively take the form of a rear facing camera to conduct fundoscopy (otherwise known as ophthalmoscopy) to visualise papilledema.
  • the environment sensor 105 may additionally or alternatively comprise sensors for measuring brain activity, such as by conducting electroencephalography, wherein external electrodes are placed on the scalp of the user to measure changes in electrical activity.
  • near infrared spectroscopy (NIRS) may additionally be employed in measuring brain activity.
  • the environment sensor 105 may be adapted for measuring a combination of environment inputs.
  • the environment sensor 105 may be adapted for measuring the infrared radiation from the patient to measure the temperature of the patient (e.g., for the purposes of detecting a fever state of the patient) and also any audio emanating from the patient (e.g., audio data emanating from a stethoscope interface for the purposes of detecting an obstructed airway), wherein the environment indicia may then be employed by the mobile computing device 100 in combined fashion.
  • the mobile computing device 100 may be adapted for detecting flu by the combination of a high temperature and an indication of obstructed airway.
  • the device 100 also comprises a communication bus subsystem 150 for interconnecting the various devices described above.
  • the bus subsystem 150 may offer parallel connectivity such as Industry Standard Architecture (ISA), conventional Peripheral Component Interconnect (PCI) and the like, or serial connectivity such as PCI Express (PCIe), Serial Advanced Technology Attachment (Serial ATA) and the like.
  • FIG. 2 shows an exemplary system 200 of computing devices 100 on which the various embodiments described herein may be implemented. Such a system 200 may be employed where, for example, remote data processing is required for the purposes of detecting the vitality indicia.
  • the system 200 comprises an application server 210 in communication with one or more mobile computing devices 100 across a data network 180.
  • the mobile computing devices 100 are operable to send environment input data to the application server 210 and receive from the application server 210 data representing the vitality indicia.
  • where processor-intensive computing techniques such as image recognition and the like are employed, the intensive computing may be offloaded to the application server 210, allowing mobile computing devices 100 with less processing power to be employed.
  • a "dumb" or "thin" wearable device having little or no processing power may be used instead of the mobile computing device 100. In this manner, the user wears the wearable device, and the wearable device comprises one or more sensors as described herein. In this embodiment, instead of the device performing any processing of the environmental input data, the environmental input data is forwarded to the application server 210 for processing by the application server 210.
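As a sketch of the offloading described above, a thin wearable might package its raw environmental input data for transmission to the application server 210. The JSON field names and the `package_environment_input` helper are illustrative assumptions, not a wire format specified in the text:

```python
import json

def package_environment_input(device_id, readings):
    """Serialise raw sensor readings for offload to the application
    server 210. Field names and the JSON encoding are assumptions;
    the text specifies only that environmental input data is
    forwarded for remote processing."""
    return json.dumps({
        "device_id": device_id,
        "readings": [{"sensor": name, "value": value}
                     for name, value in readings],
    })

# A thin wearable would transmit this payload over the data network 180
# and receive the computed vitality indicia in response.
payload = package_environment_input("headset-01", [("temperature_c", 37.9)])
```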
  • the mobile computing device 100 is adapted for use by a user in measuring and tracking one or more vitality indicia.
  • the mobile computing device 100 is adapted for detecting when the user has a fever. In this manner, the mobile computing device 100 is adapted for measuring the temperature of the user.
  • the mobile computing device may be adapted to measure the temperature of the user by skin contact.
  • the environment sensor 105 may take the form of a thermometer 315 located at the head engagement portions of the wearable device 100b.
  • the thermometer 315 may be located at various locations of the wearable device 100b. For example, the thermometer 315 may be located at the nose engagement portion, forehead engagement portion, temple adjacent portion or ear adjacent or insertion portion of the wearable device 100b. In this manner, the mobile computing device 100 may measure the temperature of the wearer from the wearer's head and alert the user when the wearer's temperature exceeds a threshold.
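The threshold alert described above can be sketched as follows; the 38.0 °C and 39.5 °C cut-offs are common clinical values assumed for illustration, not thresholds specified in the text:

```python
def categorise_temperature(temp_c, fever_c=38.0, high_fever_c=39.5):
    """Categorise a head-temperature reading against assumed
    clinical cut-offs (illustrative values only)."""
    if temp_c >= high_fever_c:
        return "high fever"
    if temp_c >= fever_c:
        return "fever"
    return "normal"

def should_alert(temp_c, threshold_c=38.0):
    """Alert the wearer when the measured temperature exceeds the threshold."""
    return temp_c > threshold_c
```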
  • the mobile computing device 100 may be adapted to send vitality indicia data across a data network 180 to an application server 210.
  • a carer such as a doctor treating the user, may be alerted to the vitality indicia of the user.
  • the doctor may be alerted that the user has a temperature and be able to phone the user to question the user as to the user's condition.
  • the environment sensor 105 may be a remote unit, such as a remote unit attached to the user's skin, adapted to send measurement data to the mobile computing device 100 via interfaces including a wired interface, or wireless interfaces such as a Bluetooth wireless interface.
  • the mobile computing device 100 may be adapted to measure temperature in other manners, such as via infrared and the like.
  • the mobile computing device 100 may be adapted for detecting coughing, and the like.
  • the environment sensor 105 may comprise an audio interface, adapted for receiving audio data. In this manner, the mobile computing device 100 may detect a cough when the audio data indicates a sound above a certain decibel threshold.
  • the mobile computing device 100 may be adapted to employ an audio recognition technique adapted to recognise a cough sound.
  • the environment sensor 105 may comprise an accelerometer, so as to, for example, detect acceleration when the user coughs.
  • the environment sensor 105 may comprise a video interface, such as a rear facing video interface adapted to capture video data of the user's face. In this manner, the mobile computing device 100 may be adapted for employing a video recognition technique adapted to detect a cough from the user's facial expression.
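The decibel-threshold cough detection described above might be sketched as a simple RMS level computed per audio frame; the −20 dB threshold and the frame format are illustrative assumptions, and a real recogniser would also match the spectral shape of a cough:

```python
import math

def rms_decibels(samples, ref=1.0):
    """Root-mean-square level of an audio frame, in dB relative to `ref`."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    # Clamp to avoid log10(0) on a silent frame.
    return 20.0 * math.log10(max(rms, 1e-12) / ref)

def detect_cough(frame, threshold_db=-20.0):
    """Flag a frame as a candidate cough when its level exceeds the
    (assumed) decibel threshold."""
    return rms_decibels(frame) > threshold_db
```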
  • the mobile computing device 100 may be adapted for the purposes of monitoring the sleep requirements of the user.
  • the mobile computing device 100 may be adapted to measure the user's sleep periods, and indicate to the user whether the user's sleep periods comprise sufficient sleep time.
  • the augmented display 310 comprises a vitality indicia 305f representing the user's level of awareness or tiredness.
  • the mobile computing device 100 may be further adapted to calculate a category into which the user's vitality indicia falls. In the example given, the mobile computing device 100 displays to the user that the user's level of awareness or tiredness is critical on account of having received too little sleep within a previous period.
  • the amount of sleep may be shown in the augmented display 310 by a bar chart, wherein the full extent of the bar chart represents sufficient sleep.
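The sleep-sufficiency bar chart described above could be driven by a calculation along these lines; the eight-hour requirement and the category cut-offs are illustrative assumptions, not values from the text:

```python
def sleep_status(hours_slept, required_hours=8.0):
    """Express sleep received in a previous period as a bar-chart
    fraction (1.0 = full bar = sufficient sleep) plus a category.
    The 8-hour requirement and cut-offs are assumed values."""
    fraction = min(hours_slept / required_hours, 1.0)
    if fraction >= 1.0:
        category = "sufficient"
    elif fraction >= 0.75:
        category = "low"
    else:
        category = "critical"
    return fraction, category
```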
  • the environment sensor 105 may be a rear facing video capture device for monitoring the user's eyes to detect when the user's eyelids are shut.
  • the environment sensor 105 may comprise an orientation sensor for detecting when the user is lying down.
  • the mobile computing device 100 is adapted for employing image and video recognition technique for the purposes of ascertaining the vitality indicia.
  • the mobile computing device 100 may be adapted for measuring the nutritional intake of the user.
  • the environment sensor 105 may comprise a forward facing video camera such that when the user is eating, the environment sensor 105 is adapted to capture image or video data of the user's plate. Image or video recognition technique may then be employed for the purposes of ascertaining the nutritional composition of the plate of food.
  • the mobile computing device 100 may be adapted for recognising the food composition of the food in various manners, such as by colour, shape and the like.
  • the mobile computing device 100 may be adapted to ascertain nutritional information from food packaging wherein the user may scan each item of food packaging prior to consumption for recordal by the mobile computing device 100.
  • the nutritional composition of the food packaging may be recorded by the mobile computing device 100 using image recognition technique (e.g. so as to read the nutritional information displayed on the packaging) or using barcode recognition technique (e.g. wherein the mobile computing device 100 is adapted to look up nutritional information from the application server 210 in accordance with a barcode data).
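The barcode look-up described above might be sketched as follows; the `NUTRITION_DB` table and the barcode value are hypothetical placeholders for the data the application server 210 would hold:

```python
# Hypothetical local stand-in for the nutritional database that the
# application server 210 would be consulted for; keys are barcode strings.
NUTRITION_DB = {
    "9300601234567": {"name": "oats", "protein_g": 11,
                      "carbohydrate_g": 60, "fat_g": 8},
}

def lookup_nutrition(barcode):
    """Return nutritional information for a scanned barcode, or None
    when the barcode is unknown (a real device might then fall back
    to image recognition of the printed nutrition panel)."""
    return NUTRITION_DB.get(barcode)
```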
  • the augmented display 310 comprises a vitality indicia representing the nutritional intake of the user.
  • the mobile computing device 100 has categorised the nutritional intake of the user as being okay, wherein, in the example given, an okay categorisation represents that the nutritional intake of the user comprises the daily recommended allowance.
  • the image or video recognition technique may be employed in other manners such as by recognising certain situations or events.
  • the video or image recognition technique may be employed to measure compliance, so as to, for example, detect each time a user takes a tablet as prescribed by a doctor so as to be able to report on treatment regime compliance.
  • the image or video recognition technique may be employed to detect symptoms of the user.
  • the environment sensor 105 is a rearward facing video camera
  • the mobile computing device 100 may be adapted to detect an eye infection on the basis of the colour of the eye, or measles on the basis of a rash pattern across the user's face, for example.
  • the environment sensor 105 may be adapted to measure the environment input parameters.
  • the environment sensor 105 may be adapted to measure the exposure to UV light during a certain period. In this manner, the mobile computing device 100 may inform the user when the user has received too much UV light. Alternatively, the mobile computing device 100 may inform the user when the user has not received enough sunlight.
  • the environment sensor 105 comprises a UV sensor.
  • the augmented display 310 comprises an ultraviolet light vitality indicia 305d representing to the user the amount of UV light to which the user has been exposed. In the example given, the mobile computing device 100 has categorised the UV exposure as being low.
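The UV exposure tracking described above could accumulate a dose over the monitoring period along these lines; the UV-index-hours metric and the low/high cut-offs are illustrative assumptions, not values from the text:

```python
def accumulate_uv(exposures, low=2.0, high=6.0):
    """Sum UV index-hours over a period and categorise the exposure.
    `exposures` is a list of (uv_index, hours) intervals; the low/high
    cut-offs are assumed, illustrative values."""
    dose = sum(uv_index * hours for uv_index, hours in exposures)
    if dose < low:
        category = "low"
    elif dose <= high:
        category = "moderate"
    else:
        category = "high"
    return dose, category
```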
  • the environment sensor 105 may be further adapted to measure other vital signs, such as blood pressure, sugar levels, heart rate and the like. In this manner, where, for example, the user is diabetic, the mobile computing device 100 may be adapted to detect a fall in blood sugar of the user and warn the user to take appropriate remedial action in time.
  • the mobile computing device 100 is adapted to assist carers, such as doctors and the like in treatment of a patient.
  • the mobile computing device 100 may be adapted to be worn by the patient so as to gather vitality indicia data about the patient, such that the vitality indicia data may be reported to the doctor, such as by being stored by the mobile computing device 100 for download at the doctor's practice, or communicated to the doctor via the application server 210.
  • the mobile computing device 100 may be worn by the doctor wherein the mobile computing device 100 provides augmented information as to one or more vitality indicia of the patient. In this manner, the mobile computing device 100 may facilitate the diagnosis of diseases and the like and reduce professional error in the care of patients.
  • the mobile computing device 100 takes the form of the glasses 100b as shown substantially in figure 3 for use by the doctor during a patient consultation so as to provide augmented vitality indicia information to the doctor about the patient during the consultation.
  • the mobile computing device 100 is adapted for assisting the doctor in the diagnosis of flu and therefore is adapted to measure three vitality indicia being: the airway condition of the user as indicated by indicia 305a, the temperature of the user as indicated by indicia 305b, and the user's compliance with a treatment regime as indicated by indicia 305c. Each of these indicia will now be discussed in turn.
  • the indicia represents the breathing capabilities of the patient, whether they be obstructed, or clear.
  • the environment sensor 105 may comprise a stethoscope interface for obtaining audio data from a stethoscope measurement of the patient.
  • the environment sensor 105 comprises an audio interface
  • the mobile computing device 100 may be adapted to employ an audio recognition technique in detecting obstructed airways, which may be characterised by a rasping sound (e.g. an audio spectrum having high frequency band components).
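The high-frequency-band characterisation of a rasping sound might be sketched as a spectral energy ratio; the naive DFT and the 1 kHz cutoff are illustrative assumptions rather than the text's method:

```python
import cmath

def high_band_ratio(samples, sample_rate, cutoff_hz=1000.0):
    """Fraction of spectral energy above `cutoff_hz`, via a naive DFT.
    A rasping, obstructed airway would show a larger high-band share;
    the 1 kHz cutoff is an assumed, illustrative value."""
    n = len(samples)
    total = high = 0.0
    for k in range(1, n // 2):  # skip DC; positive frequencies only
        coeff = sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
        energy = abs(coeff) ** 2
        total += energy
        if k * sample_rate / n > cutoff_hz:
            high += energy
    return high / total if total else 0.0
```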
  • the mobile computing device 100 may be adapted to measure the temperature of the patient, as opposed to the contact measurement described above. In this embodiment, it is preferable that the mobile computing device 100 measure the temperature of the patient at a distance, such as by using infrared, for example. In this manner, the environment sensor 105 is adapted to measure the infrared radiation radiating from the patient and calculate the surface temperature of the patient; in this manner a high temperature of a patient may be determined without contact with the patient. Referring to the embodiment given in figure 3, the environment sensor 105 has detected that the patient has a high temperature and has therefore characterised the patient as having a fever.
  • the mobile computing device 100 may be adapted to measure the patient's compliance with a treatment regime. In this manner, the patient may have worn the mobile computing device 100 for a period of time wherein the mobile computing device 100 was adapted to measure the patient's compliance with a treatment regime, such as, for example, the taking of flu medication by the patient at required intervals.
  • the mobile computing device 100 displays a compliance indicia 305c of lower compliance.
  • the mobile computing device 100 may be adapted to use a combination of vital indicia in the diagnosis process.
  • the augmented display 310 comprises a calculated diagnosis 320 representing that the patient potentially has flu. In reaching this diagnosis, the mobile computing device 100 has ascertained that the patient has obstructed airways, has a high temperature, and has failed to take flu medication.
  • each of the vital indicia described with reference to this example may be used for the purposes of forming the diagnosis.
  • the mobile computing device 100 is adapted to use a combination of vitality indicia, or at least a minimum threshold of vital indicia (e.g. two of the three vital indicia), in forming the diagnosis.
  • the mobile computing device 100 is adapted to use a weighted measure of the vitality indicia in forming the diagnosis. For example, where a patient is suffering from flu, a high temperature may be more indicative of the presence of flu as opposed to sneezing symptoms, for example.
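The weighted combination of vitality indicia described above might be sketched as follows; the weight values and the 0.6 decision threshold are illustrative assumptions (reflecting only the text's point that temperature weighs more heavily than other symptoms):

```python
def weighted_flu_score(indicia, weights=None):
    """Combine vitality indicia (name -> value in [0, 1]) into a flu
    likelihood score. The weights are assumed, illustrative values."""
    weights = weights or {"temperature": 0.5, "airway": 0.3, "compliance": 0.2}
    return sum(weights.get(name, 0.0) * value
               for name, value in indicia.items())

def diagnose_flu(indicia, threshold=0.6):
    """Form a tentative diagnosis when the weighted score crosses the
    (assumed) threshold."""
    return weighted_flu_score(indicia) >= threshold
```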
  • the vitality indicia 305 are represented in graphical format.
  • the vitality indicia 305 may be displayed in any manner, graphical or otherwise.
  • information derived from the vitality indicia 305 may further be calculated by the mobile computing device 100 for display.
  • indicia such as mental health, emotional health, fitness, strength, tiredness and the like may all be derived from the vitality indicia measurements obtained from the environment sensor 105.
  • the mobile computing device 100 or application server 210 may be adapted to employ artificial intelligence computational technique for the purposes of forming the diagnosis.
  • any artificial intelligence type may be used depending on the circumstance, but may comprise adaptive artificial intelligence techniques such as learner or breeder algorithms, fuzzy logic algorithms and the like.
  • the mobile computing device 100 or application server 210 may be adapted to calculate a remedy for any diagnosis. For example, where the mobile computing device 100 or application server 210 diagnoses a person with flu, the mobile computing device or application server 210 may consult a database for potential treatments, such as the appropriate cough medicine, vitamin supplements or the like. In one embodiment, the wearable device 100b may be adapted to display advertisements relating to any diagnosis. For example, where a user is suffering from flu, the wearable device 100b may be adapted to display an advertisement for a particular brand of cough syrup. Note that in making a recommendation of a medication, the wearable device 100b may be adapted to ascertain the potential side-effects of the recommended medication.
  • the computing device 100 may be configured with potential side-effects, or alternatively request such potential side-effects via the network interface 170 so as to be able to alert the user accordingly.
  • the user will be able to make a determination not to take a particular cough syrup which may cause drowsiness if the user is about to drive a vehicle.
  • the computing device 100 is adapted to monitor the dosage regime of a particular medicine, so as to warn the user should the user exceed the recommended dosage regime, fail to take medicine as prescribed or the like.
  • the mobile computing device 100 may be adapted for the effective management of diabetes in monitoring blood glucose levels of the user.
  • the mobile computing device 100 may comprise a blood glucose measurement device (not shown) adapted to periodically measure the blood glucose levels of the user. In one embodiment, the user may be required to submit to periodic pinpricks to provide blood samples for the computing device 100, such as by way of a disposable swab or the like. In other embodiments, the computing device 100 may be operably coupled to a non-invasive blood glucose measuring device, such as an in-ear measurement device or the like. In this manner, the mobile computing device is able to monitor the blood glucose levels of the user and take appropriate action should the glucose levels tend towards dangerously low levels.
  • the mobile computing device 100 may be adapted for displaying to the user, using the display device 1020, a warning indication to take an insulin shot, for example. Alternatively, the computing device 100 may be adapted to send a warning message across the network 180 to an appropriate carer, should the insulin levels drop to dangerously low levels.
  • the computing device 100 may be adapted to predict dangerously low insulin levels so as to prevent the user from reaching a diabetic state.
  • the computing device 100 may be adapted to monitor the trajectory of the blood glucose levels so as to estimate the time until the user reaches a dangerously low, diabetic state.
  • the computing device 100 may be adapted to implement a graded management process, wherein, should the computing device 100 calculate that the user is one hour away from a diabetic state, the computing device 100 may display, using the display device 1020, a warning advising the user to take an insulin shot.
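The graded management process described above could be sketched as a linear extrapolation of the glucose trajectory; the 3.9 mmol/L danger level and the escalation rule are illustrative assumptions (the one-hour warning point follows the text's example):

```python
def hours_until(level, rate_per_hour, danger_level=3.9):
    """Linear extrapolation of the blood glucose trajectory (mmol/L).
    Returns hours until the danger level is reached, or None if the
    level is not falling. The 3.9 mmol/L cut-off is a common
    hypoglycaemia threshold, assumed for illustration."""
    if level <= danger_level:
        return 0.0
    if rate_per_hour >= 0:
        return None
    return (level - danger_level) / -rate_per_hour

def graded_action(hours):
    """Graded management: warn the user first, escalate to a carer
    as the estimated time shrinks (escalation rule is assumed)."""
    if hours is None:
        return "none"
    if hours > 1.0:
        return "display warning"
    return "alert carer"
```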
  • the computing device 100 may be adapted to communicate, via the network interface 170, a warning message to an appropriate carer so as to seek external assistance for the user.
  • the data sent via the network interface 170 may comprise the location of the user so as to assist in the location of the user.
  • the computing device 100 may be adapted to interface with dangerous machinery so as to prevent the operation of such dangerous machinery wherein the user is in a diabetic state.
  • the computing device 100 may be adapted to cut power to, for example a motor vehicle of the user.
  • Figure 4 shows a mobile computing device 401 on which the various embodiments described herein may be implemented. In particular, the steps of detecting vitality indicia may be implemented as computer program code instructions executable by the mobile computing device 401.
  • the device 401 is an embodiment of the mobile computing device 100, which comprises a headset 400, on which is mounted a sensor array 410, further comprising an orientation sensor 410a, stereoscopic image capture sensor 410b, audio sensor 410c, image capture sensor 410d, location sensor 410e, breathing rate sensor 410f, blood oxygen saturation sensor 410g, heart rate sensor 410h, temperature sensor 410i, and perspiration sensor 410j.
  • the headset 400 also comprises a display means 415 for displaying the vitality indicia, further comprising graphical display 415a, number/index display 415b, image/video display 415c, and speaker 415d for audio output.
  • Vitality indicia may be displayed on a lens 1020 of the headset 400.
  • the headset 400 can communicate wirelessly to an external bus subsystem 470, which in turn connects together a processor 420, storage media 440, and an I/O data interface 450.
  • the processor 420, storage media 440, and an I/O data interface 450 are contained within a unit 480.
  • the I/O data interface can communicate wirelessly to an external network database 460.
  • the processor 420 further comprises a computer program code 430 stored in RAM or other memory means that the processor 420 controls.
  • the unit 480 may be connected directly to, or form part of, the headset frame of the wearable device 400.
  • the unit 480 is compact and its components fit on or within the headset 400.
  • the components within the unit 480 may be spread across the headset 400 to ensure uniformity and to evenly distribute the weight so that wearing of the headset 400 is more comfortable for the user.
  • Communication between the sensor array 410, the lens display 1020, and the unit 480 may be through direct electrical wire connection. In this way, the processor 420 may process the environmental input data more efficiently and minimise any lag or delay with communicating the data between the components.
  • the computer program code 430 has the capacity to determine the overall health and wellbeing of the user by comparing the environmental input data from the sensor array 410 with reference data from a population or past user data stored in the storage media 440. Further, the computer program code 430 may comprise an algorithm that can specifically diagnose a health condition if certain values are beyond the normal healthy range, or if predetermined targets are met that are indicative of a health condition. Preferably, it can also determine a remedy for the detected health condition. For example, the remedy may be listed on a look-up table corresponding to a given health condition, and the processor 420 can access the look-up table to find the possible remedies for a given health condition. Further, the remedy may be a medication, exercise activity, diet suggestion, lifestyle suggestion, therapy suggestion or product suggestion. If the remedy requires a product purchase, the network database 460 may be accessed to locate the desired product.
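The look-up table access described above might be sketched as follows; the table contents are hypothetical placeholders for the remedies the processor 420 would retrieve from the storage media 440:

```python
# Hypothetical stand-in for the stored look-up table mapping a
# diagnosed health condition to possible remedies.
REMEDY_TABLE = {
    "flu": ["cough medicine", "vitamin supplements", "rest"],
    "sunburn": ["after-sun lotion", "shade"],
}

def remedies_for(condition):
    """Look up possible remedies for a detected health condition;
    an empty list means no stored remedy (a real device might then
    consult the network database 460)."""
    return REMEDY_TABLE.get(condition, [])
```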
  • the orientation sensor 410a may be at least one gyroscope adapted to monitor environmental input data comprising the head movements and neck flexion of the user.
  • the gyroscope may be able to distinguish between sudden and gradual motions such as swaying and twitching.
  • a medication consumed by the user may have resulted in fatigue, which may be characterised by gradual leaning of the head when the user is seated. The user may be prompted of this motion so corrective action can be taken.
  • the stereoscopic image capture sensor 410b may be at least two image capture sensors 410d in the form of cameras. These cameras may be adapted to monitor environmental input data comprising the external environment surrounding the device 100. The two images taken from the stereoscopic image capture sensor 410b may be offset when combined, but are then processed by the processor 420 to calculate three-dimensional depth to determine accurate distances between objects or features. Sensor input data may be derived from the actions performed by the user, the physiological features of the user, or the objects perceived in the field of view of the user.
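The depth calculation from the offset between the two camera images can be illustrated with the standard pinhole stereo relation Z = f·B/d (depth equals focal length times camera baseline divided by pixel disparity). The specification does not name a particular method; the function below is a minimal sketch under that common assumption, with invented parameter values.

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Standard stereo triangulation: the nearer an object, the larger
    the offset (disparity) between the two camera images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px
```

For example, with a 700-pixel focal length and a 6 cm baseline (plausible for a headset-mounted pair), a 35-pixel disparity corresponds to an object about 1.2 m away.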
  • the image capture sensor 410d is provided as a rearward facing camera for the purposes of monitoring dental health of the user. In this manner, the camera is adapted to capture imagery of the user's teeth within the field of view of the camera so as to make a determination as to potential buildup of plaque, bacteria or the like.
  • the audio sensor 410c may be at least one microphone adapted to monitor environmental input data comprising noise produced by the user, for instance coughing, sneezing, a runny nose and the like. By monitoring such data, a diagnosis or approximate diagnosis can be formed, or the effectiveness of a suggested remedy can be monitored, and so the computer program code 430 can cause the processor to calculate a further remedy suggestion and log data.
  • the vitality indicia may also be communicated using a speaker.
  • the audio sensor 410c may detect coughing, sneezing, runny nose and the like from the user as audio data.
  • the audio data may be compared against reference audio data indicative of a health condition, such as a cold or flu, and the user may be prompted that they have the cold, flu, or other health condition. Further, the processor 420 may determine an appropriate remedy or action to alleviate or cure the health condition.
  • the image capture sensor 410d may be at least one outwards-facing camera adapted to monitor environmental input data comprising captured images or recorded video of the user's physiological features.
  • the processor 420 may prompt the user to look at the area of interest while wearing the headset 400 so that the camera 410d can retrieve data on the skin colour and contours.
  • the image capture sensor 410d may be a rearwards-facing camera adapted to look directly at the user's facial features to detect swelling eyelids, teary eyes and the like. Skin disorders such as measles may be detected using rearwards-facing cameras, analysing rash patterns on the user's face.
  • the location sensor 410e may be at least one GPS tracking device adapted to monitor environmental input data comprising the geographic location of the user.
  • the I/O data interface 450 can communicate location data between the processor 420 and the network database 460 to determine in real-time the user's location. In one embodiment, it may be used to assist the processor 420 to locate stores or pharmacies that have the suggested products the user was recommended to take. Alternatively, it can assist the user to be located if the user prompts for emergency assistance.
  • the location sensor 410e is adapted for use by the computing device 100 as a navigational aid for the user, so as to provide visual or auditory directions to the user. In a yet further embodiment, the computing device 100 may utilise a complement of the image capture sensor 410d and the location sensor 410e to provide navigational assistance for the visually impaired. Specifically, utilising the image capture sensor 410d in conjunction with a suitable object or image recognition technique, the computing device 100 may be adapted to identify obstacles in the path of the user so as to guide the user appropriately. It should be noted that in certain embodiments, the computing device 100 need not necessarily utilise an image capture device for the purposes of identifying obstacles, but may alternatively utilise ultrasonic proximity measurement devices or the like.
  • the computing device 100 is able to provide directions to the user, and alert them to potential obstacles within their path. In a yet further embodiment, the computing device 100 may be adapted for face recognition for the purposes of assisting the visually impaired person (or even a non-visually impaired person) in recognising other people.
  • the computing device 100 may be adapted to recognise a person, and present the person's name, either visually or by sound.
  • the network database 460 may comprise data on the weather conditions, pollen count, humidity and the like as weather data.
  • the processor 420 can use the weather data to determine the weather conditions of the user's location in an automated manner. Further, the processor 420 can determine if the weather data will adversely affect the user. For instance, a high pollen count in the area may cause an allergic reaction if the user at the time is suffering from hay fever, and so the user is prompted that the area may exacerbate their health condition and should seek remedial action.
  • the breathing rate sensor 410f may be at least one acoustic transducer adapted to monitor environmental input data comprising the regular inhalation/exhalation cycles of the user by monitoring the sounds these cycles produce.
  • the breathing rate sensor may be at least one radar sensor adapted to detect minute movements below the surface of the skin. The radar may send very short pulses towards the chest and detect the echo reflected. The regular pulsating rhythm could be converted to breathing rate, and be sensitive enough to determine changes over time. The effectiveness of medications pertaining to nasal congestion could be tracked over time to assess user vitality.
  • breathing rate can be detected by at least one accelerometer.
  • the accelerometer is put into contact with the user's body, preferably on or near their chest. Breathing causes a periodical movement of the chest wall and thus acceleration of the accelerometer. In the rest state, the accelerometer measures the acceleration at a first time as frequency data, and the periodical change in frequency data at a second time may be converted to an equivalent breathing rate.
  • a series of accelerometers may be used to determine breathing rate at different positions/orientations with respect to the user to account for the effects of gravity.
  • the blood oxygen saturation sensor 410g may be at least one pulse oximeter adapted to monitor environmental input data comprising the oxygen count in the user's blood, whereby light of two different wavelengths is passed through the user to a photodetector. The changing absorbance at each of the wavelengths would be measured, allowing the processor 420 to determine the absorbance due to the pulsing arterial blood. In this way, the processor 420 can determine if the user is experiencing inadequate ventilation, and hence suggest a remedy to increase intake of oxygen.
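The two-wavelength measurement described above is conventionally reduced to a "ratio of ratios": the pulsatile (AC) component of the absorbance is divided by the steady (DC) component at each wavelength, and the red/infrared ratio is mapped to a saturation value. The sketch below uses the commonly quoted linear approximation SpO2 ≈ 110 − 25·R, which is an illustrative calibration only, not one stated in the specification and not clinically validated.

```python
def spo2_from_ratio(ac_red, dc_red, ac_ir, dc_ir):
    """Ratio-of-ratios pulse oximetry sketch.

    R = (AC_red / DC_red) / (AC_ir / DC_ir); the linear mapping
    below is a rough textbook approximation, clamped to 0-100 %.
    """
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    return max(0.0, min(100.0, 110.0 - 25.0 * r))
```

Real oximeters replace the linear mapping with an empirically calibrated curve for the specific LED pair used.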
  • the heart rate sensor 410h may be at least one heart rate monitor adapted to monitor environmental input data comprising the user's heartbeat frequency, whereby it detects the electrical signal produced by the beating heart through the skin. This monitor may be used to determine if a medication has caused the user to have an elevated heart rate, and the user can be prompted of this.
  • the heart rate sensor 410h may comprise a radar sensor adapted to measure the arterial movements just below the surface of the skin. This may then be converted to an equivalent blood pressure of the user.
  • the heart rate sensor 410h may be at least one accelerometer.
  • the accelerometer is put into contact with the user's body, preferably close to their heart.
  • the heartbeat would cause a periodical movement of the chest wall and thus change the acceleration of the accelerometer.
  • the accelerometer measures the acceleration at a first time as frequency data, and the periodical change in frequency data at a second time may be converted to an equivalent heart rate.
  • a series of accelerometers may be used to determine heart rate at different positions/orientations with respect to the user to account for the complex three- dimensional movements and deformations of the heart.
  • the temperature sensor 410i may be at least one thermometer 315 adapted to monitor environmental input data comprising the temperature of the user's skin surface. In one embodiment, it may be used to assess the treatment of a person experiencing flu or recovering from exercise.
  • the perspiration sensor 410j may be at least one pH sensor adapted to monitor environmental input data comprising the sodium content in the perspiration of the user. This sensor may change colour depending on the detected sodium content, and the image capture sensor 410d can analyse the colour. The processor 420 can then determine the perspiration output proportional to the sodium content.
  • the sensor array 410 may comprise many other sensors that can be mounted at the headset frame of the wearable device 400. Preferably, the sensor array 410 is compact and the sensors fit on or within the headset 400. The sensors may be spread across the headset 400 at specific locations to optimise their effectiveness.
  • the sensor array 410 can remain active without the user being aware of its function, and can collect environmental input data in an automated manner.
  • the environmental input data can ideally be detected by at least one of the sensors in the sensor array 410. Further, the environmental input data can be derived from the physiological features of the user. These physiological features may include body temperature, sweat output, pulse derived from heartbeat, sounds and the like.
  • the unit 480 can be a smartphone or other mobile device in communication with the headset 400. Communication can be via an internet connection, Bluetooth, radio signal, or wire cables directly connecting the headset 400 to the unit 480.
  • the I/O data interface 450 allows the user to communicate with the device 401, which may be in the form of manual input of information or speaking of commands to the I/O data interface 450.
  • the unit 480 may be capable of analysing the function of the headset 400 and the sensor array 410, and determine if any component in the system 401 is faulty or defective. The user may be alerted via the I/O data interface 450 or the headset 400 of the fault.
  • the storage media 440 is capable of storing information about the physiological features of the user, such as the healthy range of values for the user's heart rate, body temperature and the like. It can also store input data derived from at least one of the sensors in the sensor array 410 in an automated manner, as well as input data from the I/O data interface 450 and from the network database 460. The data stored in the storage media can be retrieved by the user, carer or doctor via the I/O data interface 450 or the network database 460.
  • the network database 460 can comprise user health information from a doctor's database, statistical health information, as well as product databases of pharmaceutical manufacturers of health products. The network database 460 can be updated in an automated manner so that it contains the most up-to-date information about user health and products available. These products may be categorised according to what ailment they can treat, price range, location, availability and the like.
  • the vitality indicia may be in the form of text, readily comprehensible graphs, numbers, audible speech and the like.
  • the vitality indicia can be displayed over the lens 1020 of the headset 100b, and be displayed when the user requests it or when a health condition is detected.
  • the device 401 may be adapted to diagnose a health condition affecting the user of the device 401, and calculate a remedy for the given diagnosis.
  • the user may have an elevated temperature, which may be detected by the temperature sensor 410i.
  • the user may also have an elevated heart rate, which can be detected by the heart rate sensor 410h.
  • the user may also have an abnormal breathing rate, which can be detected by the breathing rate sensor 410f.
  • the input data from these sensors may be transferred to the processor 420, and the computer program code may determine that elevations in these data from the reference healthy state are, for example, indicative of flu. So, the processor 420 may consult the network database 460 for potential treatment. An appropriate cough medicine, vitamin supplement or the like may be suggested.
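The flu example above amounts to a rule-based check of several vital signs against their reference ranges. This is a hypothetical sketch: the threshold values, field names and the requirement that all three signs be elevated together are invented for illustration, not taken from the specification.

```python
# Illustrative adult reference ranges (lower, upper) -- not from the
# specification, and not medical advice.
REFERENCE = {
    "temp_c": (36.1, 37.2),
    "heart_bpm": (60, 100),
    "breaths_pm": (12, 20),
}

def elevated(readings):
    """Return the set of vital signs above their reference upper bound."""
    return {k for k, v in readings.items() if v > REFERENCE[k][1]}

def suggests_flu(readings):
    # One possible rule: flag flu only when temperature, heart rate
    # and breathing rate are all elevated at the same time.
    return elevated(readings) == set(REFERENCE)
```

On a positive result the processor could then consult the remedy look-up described earlier.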
  • the processor 420 may be prompted by the user that the product has been taken.
  • the processor 420 may continue to analyse the vitality of the user via the sensor array 410, and determine the progress of the treatment. Further recommendations may be given depending on the outcomes of the preceding treatments.
  • the wearable device 400 may be adapted to display advertising relating to a diagnosis.
  • the wearable device 400 may be adapted to display an advertisement for a particular brand of cough syrup.
  • the advertisement may be in the form of an image or prerecorded video designed by the manufacturer or company that owns the product.
  • the user may be able to send feedback on the advertised product to the network database 460 as to whether the product was successful in treating the user's condition, or feedback data can be generated automatically by the processor as a function of sensor data received post implementation of a remedy, and this feedback data may be fed back automatically to the storage media 440 or network database 460. So, in future, the processor 420 can determine whether or not to recommend that product again according to the feedback data.
  • Product advertisements can be selectively chosen by the processor 420 based on the user's experience with a product.
  • the detected health condition of the user may be tiredness or fatigue, which may not be treated by medication.
  • the processor 420 may give a diet suggestion, such as recommending consumption of more fruits and vegetables, less fatty food intake and the like.
  • the processor 420 may also recommend the user conduct an exercise activity such as, but not limited to, walking, jogging, riding a bike, push-ups, or visiting a gym, as remedies.
  • the user's vitality would be tracked in an automated manner over time to assess if there was an improvement to their vitality.
  • Relevant parties such as pharmaceutical manufacturers, carers and doctors may have access to the network database 460. In particular, a carer or doctor may be able to track the progress of the user's vitality, and be able to give a diagnosis for an underlying, possibly more severe, health condition which would otherwise have gone undiagnosed.
  • the remedy may be a prescription or instruction given by the doctor or carer that the user should follow.
  • a doctor may prescribe that the user take tablet medications at regular intervals over a few days.
  • the image capture sensor 410b may analyse instances where it recognises the user holding the tablet, and log this data in the storage media 440.
  • At least one of the sensors in the sensor array 410 may be able to detect physiological changes each time the tablet is ingested, and the I/O data interface 450 may report on treatment regime compliance. This report can be accessed by a doctor or carer to monitor user compliance.
  • the network database 460 may further comprise data in relation to a collection of users of the device 401 including data pertaining to their health conditions, when the condition was first encountered, which demographics or populations were affected at the time, the locations of these incidents, and which remedies proved most successful in treating the conditions.
  • the processor 420 may use this data and compare it against the input data of the current user of the device 100 to adapt the remedy suggestion so that it is more suitable. In this way, the remedies suggested are tailored to the user's specific needs.
  • the suggestion may be based on the user's cultural background, age, gender, lifestyle and the like. Further, product suggestions may be selected based on these aforementioned factors.
  • the user may lead a sedentary lifestyle or have a poor diet, which at least one of the sensors in the sensor array 410 detects. So, when the user eventually has an illness, the choice of remedy may be affected by the lifestyle and diet input data. Exercise and dietary supplement consumption may be some suggested options for people in this demographic.
  • At least one of the sensors in the sensor array 410 can be used to determine a user's exposure to ultraviolet (UV) radiation.
  • the image capture sensor 410d may be further adapted to detect light intensity or brightness of the surrounding environment. Further, it may also be able to detect light of shorter wavelengths within the visible light spectrum, and the processor 420 may calculate that there is a possible elevation of UV radiation.
  • the temperature sensor 410i may detect the current ambient temperature of the environment in which the user is situated. If it is higher than the expected heat levels, then the processor 420 may calculate that there is a possible elevation of UV radiation.
  • These indicia may then be displayed on the headset 400 showing the current and predicted radiation levels for the day, and whether the user has exceeded or not yet reached their recommended exposure to sunlight.
  • the processor 420 may give a remedy or suggestion that the user moves outdoors to increase their UV exposure, or to seek shelter if the exposure limit has been reached.
  • the processor 420 may be further adapted to recognise an emergency or hazard to the health of the user if certain criteria are met.
  • the processor 420 is also capable of triggering an emergency data broadcast from the I/O data interface 450 to the network database 460, a hospital or ambulance in close proximity, or another user of the device 401. This broadcast will prompt the doctor, carer, or other user that the person must be attended to.
  • the sensors in the sensor array 410 may be adapted for measuring vital signs of the user such as heart rate, blood pressure, breathing rate, body temperature, blood oxygen saturation and the like.
  • the processor 420 may be adapted to detect an emergency situation based upon significant deviations of vital signs from the normal range.
  • the processor 420 may be adapted to send the emergency data to concerned parties which may include emergency health contact, relevant health professionals, family members and the like, so they may be notified as to the health status of the wearer.
  • the user vitality data can be stored in the storage media 440 or network database 460 to be accessed by the person/s treating the user at the time to better deduce what the condition is and what indications were present prior to the emergency. For instance, the data may indicate that a heart attack was preceded by gradual increases in heart rate over the last five minutes, as well as possible difficulties in breathing over the past hour.
  • the emergency data may also include global positioning data corresponding to the location of the wearer via the location sensor 410e. For example, if the user experienced a heart attack, the emergency data may comprise both the vitality indicia and user location, and be broadcast to the relevant emergency service. In this way, the user can be easily located and remedial action taken sooner.
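The emergency path described above — detect a significant deviation of a vital sign from its normal range, then bundle the vitals with GPS data for broadcast — can be sketched as below. The deviation rule (more than 1.5 times the half-width of the normal range from its midpoint), the field names and the payload shape are all invented for illustration; the specification only requires "significant deviations" without fixing a formula.

```python
def emergency_payload(vitals, normal, location, deviation_factor=1.5):
    """Return an emergency broadcast payload, or None if no vital sign
    deviates significantly from its normal range.

    A sign is alarming when it lies further from the midpoint of its
    (lo, hi) normal range than deviation_factor * the range half-width.
    """
    alarms = {}
    for name, value in vitals.items():
        lo, hi = normal[name]
        mid, half = (lo + hi) / 2, (hi - lo) / 2
        if abs(value - mid) > deviation_factor * half:
            alarms[name] = value
    if not alarms:
        return None
    return {"alarms": alarms, "vitals": vitals, "location": location}
```

The returned dictionary stands in for the broadcast sent via the I/O data interface 450 to the network database 460 or nearby responders.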
  • the emergency data may be sent to a second geographically-proximal user of the mobile computing device 100.
  • the emergency data may be retrieved from the storage media 440 or network database 460, and may comprise medical assistance instructions which the second user of the mobile computing device 100 must perform.
  • the emergency medical procedure may be cardiopulmonary resuscitation (CPR).
  • CPR cardiopulmonary resuscitation
  • the second user of the mobile computing device 100 may be prompted by their device to perform a certain sequence of chest compressions and artificial respiration techniques on a person undergoing cardiac arrest.
  • Figure 8 shows an image recognition technique applied to the first wearer 810 of the mobile computing device 100 that experienced a heart attack.
  • Medical assistance instructions may be sent from the device 100 of the first wearer 810 to the second wearer of the mobile computing device 100. This may comprise an overlay of the necessary positioning procedures on the anatomy of the first wearer 810.
  • the image recognition technique may be adapted to identify the position on the chest 820 of the first wearer 810 on which the second person is to perform chest compressions.
  • the second user is then prompted by their device 100 to adopt the required hand position.
  • a prompt 830 may give written or verbal instructions as to what action to perform, for instance: "Apply 10 counts of compressions.”
  • the processor 420 can be adapted to monitor the user's stress level, resulting from emotional anxiety or exercise.
  • stress may be indicated through a combination of the following inputs: heart rate, blood pressure, body temperature, blood oxygen saturation, breathing rate, perspiration level and the like.
  • the user may be alerted of their stress vitality indicia graphically, numerically, verbally or by any other suitable means.
  • the stress vitality of the user may be that they are angry, which may have been calculated from a detected elevation in heart rate and perspiration.
  • the output may comprise a pre-recorded voice from a speaker, saying "It appears that you are stressed. Perhaps you will benefit from a quick break," as a verbal means of conveying vitality. Further, the output may also comprise a favourite song that is played as part of the remedial action.
  • a numerical display of vitality indicia may be a score, for instance between 1 and 10, to correlate with the detected stress levels of the user.
  • the user may be prompted that their score is 8, and then suggest remedial actions.
  • the sensors in the sensor array 410 will subsequently monitor the vitality of the user and adjust the score accordingly.
  • this method prompts the user to actively seek remedial action to change their vitality indicia, and to possibly allow the user to pre-empt external influences that could elevate the score in the future.
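The numerical 1–10 stress score described above can be sketched as a mapping from sensor elevations above resting baselines onto that scale. The baselines, weighting and saturation rule here are invented for illustration; the specification leaves the scoring method open.

```python
def stress_score(heart_bpm, perspiration, rest_bpm=65, rest_persp=0.2):
    """Map fractional elevations of heart rate and perspiration above
    (illustrative) resting baselines onto a 1-10 stress score."""
    hr = max(0.0, (heart_bpm - rest_bpm) / rest_bpm)
    sw = max(0.0, (perspiration - rest_persp) / rest_persp)
    # Average the two elevations, saturate at 100 %, scale to 1-10.
    return round(1 + 9 * min(1.0, (hr + sw) / 2))
```

A resting user scores 1; a user with doubled heart rate and tripled perspiration saturates at 10, matching the "score is 8, then suggest remedial actions" interaction described above for intermediate readings.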
  • stress level of the wearer over time may be shown in the form of a bar graph (Figure 7) as part of the augmented display 310 that actively assists in the treatment of the user's stress disorders.
  • the vitality indicia 700 may comprise a scale of emotion 720a which, for example, indicates the user is happy via an emoticon 710a, but as a result of detected stress or anxiety the scale will change to correlate with the detected stress.
  • the graph 700 may alter its characteristics such as colour of the bar 720b, type of emoticon symbols 710b or the like. In this way, the user's attention is drawn to the elevated stress level, and so they may be compelled to actively seek remedial action to change their vitality indicia 720b.
  • the image capture sensor 410d may be adapted for detecting environmental hazards, in particular hazards posed by a consumable product that the user may be unaware of.
  • the image capture sensor 410d can analyse the text printed on a given product, for instance the ingredients listed on a food product.
  • the computer program code 430 can be adapted to use a text recognition technique for the purposes of identifying text containing a hazardous substance. For example, an object containing a label with text that says "May contain traces of nuts" may be detected by the computer program code 430.
  • the storage media 440 may have this phrase marked as a hazard to the user if the user has a peanut allergy.
  • a user-defined list of allergies, hazards and undesired products may be inputted into the storage media 440 for future reference for the user.
  • a product may not contain an ingredients list that is readable. So, the image capture sensor 410d may instead take images of the product name and manufacturer label. Then, the processor 420 may communicate with the network database 460 to access a hazardous lookup service of the manufacturer or a knowledgeable third party, and look up the ingredients list to determine if the product poses a health hazard to the user. If so, the user is prompted of the hazard.
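Once the text recognition technique has produced the label text, the hazard check above reduces to scanning that text against the user-defined list of allergies and hazards held in the storage media 440. The phrase list and function name below are invented for the example.

```python
# Hypothetical user-defined hazard list, as stored in the storage
# media 440 for a user with a nut allergy.
USER_HAZARDS = ["traces of nuts", "peanut"]

def hazards_in_label(label_text, hazards=USER_HAZARDS):
    """Case-insensitive scan of recognised label text for phrases the
    user has marked as hazardous; returns the matching phrases."""
    text = label_text.lower()
    return [h for h in hazards if h in text]
```

A non-empty result would trigger the user prompt; an empty result on an unreadable label would fall through to the network-database lookup described above.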
  • the image capture sensor 410d may be a rearwards-facing camera further adapted to take images or video of the eyes of the user. In particular, eye characteristics such as redness, swelling, dark rings, iris colour, pupil symmetry, and pupil size may be analysed using the image capture sensor 410d.
  • the processor 420 may use an image recognition technique to recognise certain eye characteristics and changes to these eye characteristics over time. Further, the image recognition technique may comprise colour recognition, light intensity, and movement techniques to further calculate eye characteristics. It should be noted that in other embodiments, other visual characteristics indicative of eye disorders, over and above those mentioned above, may be detected by the computing device.
  • the computing device 100 may be adapted to discern the sclera colour of the user which may be indicative of jaundice. Furthermore, the computing device 100 may be adapted for recognising iris pattern and lens colour, which may become clouded for those users suffering from cataracts.
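The sclera-colour cue for jaundice can be illustrated with a simple channel comparison: a healthy white sclera has roughly balanced red, green and blue, while a yellowed one is strong in red/green but weak in blue. This is a crude sketch with an invented threshold, not the specification's method and not a diagnostic tool.

```python
def sclera_yellowness(pixels):
    """Mean 'yellowness' of sclera pixels given as (r, g, b) tuples in
    0-255: average of red and green minus blue."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return (r + g) / 2 - b

def possibly_jaundiced(pixels, threshold=40):
    # Illustrative threshold only; a real system would calibrate per
    # camera and lighting, and track changes over time per the text.
    return sclera_yellowness(pixels) > threshold
```

The same per-region colour statistics could feed the iris-pattern and lens-clouding checks mentioned for cataracts.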

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Epidemiology (AREA)
  • Neurosurgery (AREA)
  • Neurology (AREA)
  • Dermatology (AREA)
  • Primary Health Care (AREA)
  • Computer Hardware Design (AREA)
  • General Business, Economics & Management (AREA)
  • Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The invention relates to a mobile computing device (100) for calculating vitality indicia (305), the mobile computing device (100) comprising a processor (1000) for processing digital data; a memory device for storing digital data including computer program code, coupled to the processor (1000); an augmented reality display device for displaying digital data in augmented reality, coupled to the processor (1000); and one or more sensors for capturing environmental input data, coupled to the processor (1000); the processor (1000) being controlled by the computer program code to receive the environmental input data from the one or more sensors, calculate the vitality indicia (305) in accordance with the environmental input data, and display the vitality indicia (305) using the augmented reality display device.
PCT/AU2013/000823 2012-07-24 2013-07-24 Dispositif informatique mobile, serveur d'application, support de stockage lisible par ordinateur et système pour calculer des indices de vitalité, détecter un danger environnemental, fournir une aide à la vision et détecter une maladie WO2014015378A1 (fr)

Applications Claiming Priority (14)

Application Number Priority Date Filing Date Title
AU2012903142A AU2012903142A0 (en) 2012-07-24 A mobile computing device, application server and computer readable storage medium for detecting a user anxiety disorder
AU2012903142 2012-07-24
AU2012903217A AU2012903217A0 (en) 2012-07-27 A mobile computing device, application server, computer readable storage medium and system for calculating a vitality indicia
AU2012903217 2012-07-27
AU2012903411 2012-08-09
AU2012903411A AU2012903411A0 (en) 2012-08-09 A mobile computing device, application server, computer readable storage medium and system for calculating a vitality indicia
AU2012904530 2012-10-17
AU2012904530A AU2012904530A0 (en) 2012-10-17 A mobile computing device, application server, computer readable storage medium and system for calculating a vitality indicia, detecting an environmental hazard and vision assistance
AU2012904975 2012-11-15
AU2012904975A AU2012904975A0 (en) 2012-11-15 A mobile computing device, application server, computer readable storage medium and system for calculating a vitality indicia, detecting an environmental hazard and vision assistance
AU2012905069 2012-11-22
AU2012905069A AU2012905069A0 (en) 2012-11-22 A mobile computing device, application server, computer readable storage medium and system for calculating a vitality indicia, detecting an environmental hazard and vision assistance
AU2013900886 2013-03-14
AU2013900886A AU2013900886A0 (en) 2013-03-14 A mobile computing device, application server, computer readable storage medium and system for calculating a vitality indicia, detecting an environmental hazard, vision assistance and detecting disease

Publications (1)

Publication Number Publication Date
WO2014015378A1 true WO2014015378A1 (fr) 2014-01-30

Family

ID=49996429

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2013/000823 WO2014015378A1 (fr) 2012-07-24 2013-07-24 Dispositif informatique mobile, serveur d'application, support de stockage lisible par ordinateur et système pour calculer des indices de vitalité, détecter un danger environnemental, fournir une aide à la vision et détecter une maladie

Country Status (1)

Country Link
WO (1) WO2014015378A1 (fr)

Cited By (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015160828A1 (fr) * 2014-04-15 2015-10-22 Huntington Ingalls Incorporated Système et procédé d'affichage à réalité augmentée d'informations d'environnement dynamiques
DE102014222355A1 (de) * 2014-11-03 2016-05-04 Bayerische Motoren Werke Aktiengesellschaft Müdigkeitserkennung mit Sensoren einer Datenbrille
WO2016149416A1 (fr) * 2015-03-16 2016-09-22 Magic Leap, Inc. Méthodes et systèmes de diagnostic et de traitement des troubles de santé
WO2016168738A1 (fr) * 2015-04-17 2016-10-20 Declara, Inc. Système et procédés pour plate-forme d'apprentissage haptique
US9508248B2 (en) 2014-12-12 2016-11-29 Motorola Solutions, Inc. Method and system for information management for an incident response
US9530299B2 (en) 2015-03-03 2016-12-27 Toyota Motor Engineering & Manufacturing North America, Inc. Methods and apparatuses for assisting a visually-impaired user
CN106796417A (zh) * 2014-09-29 2017-05-31 微软技术许可有限责任公司 经由可穿戴计算系统的环境控制
US9734403B2 (en) 2014-04-25 2017-08-15 Huntington Ingalls Incorporated Augmented reality display of dynamic target object information
CN107533632A (zh) * 2015-04-02 2018-01-02 埃西勒国际通用光学公司 用于对个人的指数进行更新的方法
US9864909B2 (en) 2014-04-25 2018-01-09 Huntington Ingalls Incorporated System and method for using augmented reality display in surface treatment procedures
US9898867B2 (en) 2014-07-16 2018-02-20 Huntington Ingalls Incorporated System and method for augmented reality display of hoisting and rigging information
US20180081179A1 (en) * 2016-09-22 2018-03-22 Magic Leap, Inc. Augmented reality spectroscopy
WO2018190762A1 (fr) * 2017-04-11 2018-10-18 Cargotec Patenter Ab A presentation system and a method related to the system
US10147234B2 (en) 2014-06-09 2018-12-04 Huntington Ingalls Incorporated System and method for augmented reality display of electrical system information
US10166091B2 (en) 2014-02-21 2019-01-01 Trispera Dental Inc. Augmented reality dental design method and system
EP3498150A1 (fr) * 2017-12-13 2019-06-19 Vestel Elektronik Sanayi ve Ticaret A.S. Head mountable apparatus
US10354350B2 (en) 2016-10-18 2019-07-16 Motorola Solutions, Inc. Method and system for information management for an incident response
JP2019164634A (ja) * 2018-03-20 2019-09-26 Casio Computer Co., Ltd. Wearable device, health management support method, and health management support program
WO2019189971A1 (fr) * 2018-03-30 2019-10-03 주식회사 홍복 Artificial intelligence method for analyzing iris and retinal images to diagnose diabetes and its pre-symptoms
US10459231B2 (en) 2016-04-08 2019-10-29 Magic Leap, Inc. Augmented reality systems and methods with variable focus lens elements
US10504294B2 (en) 2014-06-09 2019-12-10 Huntington Ingalls Incorporated System and method for augmented reality discrepancy determination and reporting
EP3644161A1 (fr) * 2018-08-06 2020-04-29 Motorola Mobility LLC Real-time augmented reality activity feedback
US10667683B2 (en) 2018-09-21 2020-06-02 MacuLogix, Inc. Methods, apparatus, and systems for ophthalmic testing and measurement
US10835809B2 (en) 2017-08-26 2020-11-17 Kristina Contreras Auditorium efficient tracking in auditory augmented reality
US10915754B2 (en) 2014-06-09 2021-02-09 Huntington Ingalls Incorporated System and method for use of augmented reality in outfitting a dynamic structural space
US10962855B2 (en) 2017-02-23 2021-03-30 Magic Leap, Inc. Display system with variable power reflector
US10987028B2 (en) 2018-05-07 2021-04-27 Apple Inc. Displaying user interfaces associated with physical activities
US10987008B2 (en) 2015-12-21 2021-04-27 Koninklijke Philips N.V. Device, method and computer program product for continuous monitoring of vital signs
CN112835736A (zh) * 2021-01-22 2021-05-25 张立旭 General-purpose data error correction method and system
US11039778B2 (en) 2018-03-12 2021-06-22 Apple Inc. User interfaces for health monitoring
US11049603B1 (en) 2020-12-29 2021-06-29 Kpn Innovations, Llc. System and method for generating a procreant nourishment program
US11107580B1 (en) 2020-06-02 2021-08-31 Apple Inc. User interfaces for health applications
US11152100B2 (en) 2019-06-01 2021-10-19 Apple Inc. Health application user interfaces
US11209957B2 (en) 2019-06-01 2021-12-28 Apple Inc. User interfaces for cycle tracking
US11223899B2 (en) 2019-06-01 2022-01-11 Apple Inc. User interfaces for managing audio exposure
US11228835B2 (en) 2019-06-01 2022-01-18 Apple Inc. User interfaces for managing audio exposure
US11266330B2 (en) 2019-09-09 2022-03-08 Apple Inc. Research study user interfaces
US11317833B2 (en) 2018-05-07 2022-05-03 Apple Inc. Displaying user interfaces associated with physical activities
US11355229B1 (en) 2020-12-29 2022-06-07 Kpn Innovations, Llc. System and method for generating an ocular dysfunction nourishment program
WO2022126117A1 (fr) * 2020-12-08 2022-06-16 Irisvision, Inc. Method and system for remote clinician management of head-mounted vision assist devices
US11480467B2 (en) 2018-03-21 2022-10-25 Magic Leap, Inc. Augmented reality system and method for spectroscopic analysis
US11639944B2 (en) 2019-08-26 2023-05-02 Apple Inc. Methods and apparatus for detecting individual health related events
US11698710B2 (en) 2020-08-31 2023-07-11 Apple Inc. User interfaces for logging user activities
US11735310B2 (en) 2020-12-29 2023-08-22 Kpn Innovations, Llc. Systems and methods for generating a parasitic infection nutrition program
US11782508B2 (en) * 2019-09-27 2023-10-10 Apple Inc. Creation of optimal working, learning, and resting environments on electronic devices
WO2023224803A1 (fr) * 2022-05-18 2023-11-23 Apple Inc. Eye characteristic determination
US11854685B2 (en) 2021-03-01 2023-12-26 Kpn Innovations, Llc. System and method for generating a gestational disorder nourishment program
US11935642B2 (en) 2021-03-01 2024-03-19 Kpn Innovations, Llc System and method for generating a neonatal disorder nourishment program
US11947722B2 (en) * 2020-03-24 2024-04-02 Arm Limited Devices and headsets
US12001648B2 (en) 2022-09-23 2024-06-04 Apple Inc. User interfaces for logging user activities

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110227813A1 (en) * 2010-02-28 2011-09-22 Osterhout Group, Inc. Augmented reality eyepiece with secondary attached optic for surroundings environment vision correction
US20120179665A1 (en) * 2011-01-07 2012-07-12 Access Business Group International Llc Health monitoring system
WO2013006642A2 (fr) * 2011-07-05 2013-01-10 Saudi Arabian Oil Company Systems, computer medium and computer-implemented methods for providing health information to employees using an augmented reality display

Cited By (130)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10166091B2 (en) 2014-02-21 2019-01-01 Trispera Dental Inc. Augmented reality dental design method and system
US9947138B2 (en) 2014-04-15 2018-04-17 Huntington Ingalls Incorporated System and method for augmented reality display of dynamic environment information
WO2015160828A1 (fr) * 2014-04-15 2015-10-22 Huntington Ingalls Incorporated System and method for augmented reality display of dynamic environment information
US9864909B2 (en) 2014-04-25 2018-01-09 Huntington Ingalls Incorporated System and method for using augmented reality display in surface treatment procedures
US9734403B2 (en) 2014-04-25 2017-08-15 Huntington Ingalls Incorporated Augmented reality display of dynamic target object information
US10504294B2 (en) 2014-06-09 2019-12-10 Huntington Ingalls Incorporated System and method for augmented reality discrepancy determination and reporting
US10915754B2 (en) 2014-06-09 2021-02-09 Huntington Ingalls Incorporated System and method for use of augmented reality in outfitting a dynamic structural space
US10147234B2 (en) 2014-06-09 2018-12-04 Huntington Ingalls Incorporated System and method for augmented reality display of electrical system information
US9898867B2 (en) 2014-07-16 2018-02-20 Huntington Ingalls Incorporated System and method for augmented reality display of hoisting and rigging information
US10345768B2 (en) 2014-09-29 2019-07-09 Microsoft Technology Licensing, Llc Environmental control via wearable computing system
CN106796417A (zh) * 2014-09-29 2017-05-31 Microsoft Technology Licensing, LLC Environmental control via wearable computing system
DE102014222355A1 (de) * 2014-11-03 2016-05-04 Bayerische Motoren Werke Aktiengesellschaft Drowsiness detection using the sensors of data glasses
US9508248B2 (en) 2014-12-12 2016-11-29 Motorola Solutions, Inc. Method and system for information management for an incident response
US9530299B2 (en) 2015-03-03 2016-12-27 Toyota Motor Engineering & Manufacturing North America, Inc. Methods and apparatuses for assisting a visually-impaired user
US10788675B2 (en) 2015-03-16 2020-09-29 Magic Leap, Inc. Methods and systems for diagnosing and treating eyes using light therapy
US10379351B2 (en) 2015-03-16 2019-08-13 Magic Leap, Inc. Methods and systems for diagnosing and treating eyes using light therapy
US11747627B2 (en) 2015-03-16 2023-09-05 Magic Leap, Inc. Augmented and virtual reality display systems and methods for diagnosing health conditions based on visual fields
US10775628B2 (en) 2015-03-16 2020-09-15 Magic Leap, Inc. Methods and systems for diagnosing and treating presbyopia
WO2016149416A1 (fr) * 2015-03-16 2016-09-22 Magic Leap, Inc. Methods and systems for diagnosing and treating health ailments
EP3271776A4 (fr) 2015-03-16 2018-12-26 Magic Leap, Inc. Methods and systems for diagnosing and treating health ailments
US20170007450A1 (en) 2015-03-16 2017-01-12 Magic Leap, Inc. Augmented and virtual reality display systems and methods for delivery of medication to eyes
US11474359B2 (en) 2015-03-16 2022-10-18 Magic Leap, Inc. Augmented and virtual reality display systems and methods for diagnosing health conditions based on visual fields
US10345592B2 (en) 2015-03-16 2019-07-09 Magic Leap, Inc. Augmented and virtual reality display systems and methods for diagnosing a user using electrical potentials
US20170007843A1 (en) 2015-03-16 2017-01-12 Magic Leap, Inc. Methods and systems for diagnosing and treating eyes using laser therapy
US11256096B2 (en) 2015-03-16 2022-02-22 Magic Leap, Inc. Methods and systems for diagnosing and treating presbyopia
US10345590B2 (en) 2015-03-16 2019-07-09 Magic Leap, Inc. Augmented and virtual reality display systems and methods for determining optical prescriptions
US10345591B2 (en) 2015-03-16 2019-07-09 Magic Leap, Inc. Methods and systems for performing retinoscopy
US10345593B2 (en) 2015-03-16 2019-07-09 Magic Leap, Inc. Methods and systems for providing augmented reality content for treating color blindness
US11156835B2 (en) 2015-03-16 2021-10-26 Magic Leap, Inc. Methods and systems for diagnosing and treating health ailments
US10359631B2 (en) 2015-03-16 2019-07-23 Magic Leap, Inc. Augmented reality display systems and methods for re-rendering the world
US10365488B2 (en) 2015-03-16 2019-07-30 Magic Leap, Inc. Methods and systems for diagnosing eyes using aberrometer
US10371948B2 (en) 2015-03-16 2019-08-06 Magic Leap, Inc. Methods and systems for diagnosing color blindness
US10371947B2 (en) 2015-03-16 2019-08-06 Magic Leap, Inc. Methods and systems for modifying eye convergence for diagnosing and treating conditions including strabismus and/or amblyopia
US10371946B2 (en) 2015-03-16 2019-08-06 Magic Leap, Inc. Methods and systems for diagnosing binocular vision conditions
US10371945B2 (en) 2015-03-16 2019-08-06 Magic Leap, Inc. Methods and systems for diagnosing and treating higher order refractive aberrations of an eye
US10371949B2 (en) 2015-03-16 2019-08-06 Magic Leap, Inc. Methods and systems for performing confocal microscopy
US10379354B2 (en) 2015-03-16 2019-08-13 Magic Leap, Inc. Methods and systems for diagnosing contrast sensitivity
US10379353B2 (en) 2015-03-16 2019-08-13 Magic Leap, Inc. Augmented and virtual reality display systems and methods for diagnosing health conditions based on visual fields
US10379350B2 (en) 2015-03-16 2019-08-13 Magic Leap, Inc. Methods and systems for diagnosing eyes using ultrasound
US10969588B2 (en) 2015-03-16 2021-04-06 Magic Leap, Inc. Methods and systems for diagnosing contrast sensitivity
US10386640B2 (en) 2015-03-16 2019-08-20 Magic Leap, Inc. Methods and systems for determining intraocular pressure
US10386639B2 (en) 2015-03-16 2019-08-20 Magic Leap, Inc. Methods and systems for diagnosing eye conditions such as red reflex using light reflected from the eyes
US10386641B2 (en) 2015-03-16 2019-08-20 Magic Leap, Inc. Methods and systems for providing augmented reality content for treatment of macular degeneration
US10564423B2 (en) 2015-03-16 2020-02-18 Magic Leap, Inc. Augmented and virtual reality display systems and methods for delivery of medication to eyes
US10429649B2 (en) 2015-03-16 2019-10-01 Magic Leap, Inc. Augmented and virtual reality display systems and methods for diagnosing using occluder
US10983351B2 (en) 2015-03-16 2021-04-20 Magic Leap, Inc. Augmented and virtual reality display systems and methods for diagnosing health conditions based on visual fields
US10437062B2 (en) 2015-03-16 2019-10-08 Magic Leap, Inc. Augmented and virtual reality display platforms and methods for delivering health treatments to a user
US10444504B2 (en) 2015-03-16 2019-10-15 Magic Leap, Inc. Methods and systems for performing optical coherence tomography
US10545341B2 (en) 2015-03-16 2020-01-28 Magic Leap, Inc. Methods and systems for diagnosing eye conditions, including macular degeneration
US10451877B2 (en) 2015-03-16 2019-10-22 Magic Leap, Inc. Methods and systems for diagnosing and treating presbyopia
US10539795B2 (en) 2015-03-16 2020-01-21 Magic Leap, Inc. Methods and systems for diagnosing and treating eyes using laser therapy
US10459229B2 (en) 2015-03-16 2019-10-29 Magic Leap, Inc. Methods and systems for performing two-photon microscopy
US10466477B2 (en) 2015-03-16 2019-11-05 Magic Leap, Inc. Methods and systems for providing wavefront corrections for treating conditions including myopia, hyperopia, and/or astigmatism
US10473934B2 (en) 2015-03-16 2019-11-12 Magic Leap, Inc. Methods and systems for performing slit lamp examination
US20170000342A1 (en) 2015-03-16 2017-01-05 Magic Leap, Inc. Methods and systems for detecting health conditions by imaging portions of the eye, including the fundus
US10527850B2 (en) 2015-03-16 2020-01-07 Magic Leap, Inc. Augmented and virtual reality display systems and methods for determining optical prescriptions by imaging retina
US10539794B2 (en) 2015-03-16 2020-01-21 Magic Leap, Inc. Methods and systems for detecting health conditions by imaging portions of the eye, including the fundus
EP3866100A1 (fr) * 2015-04-02 2021-08-18 Essilor International System for updating an index of an individual
CN107533632A (zh) * 2015-04-02 2018-01-02 Essilor International Method for updating an index of an individual
WO2016168738A1 (fr) * 2015-04-17 2016-10-20 Declara, Inc. System and methods for a haptic learning platform
US10987008B2 (en) 2015-12-21 2021-04-27 Koninklijke Philips N.V. Device, method and computer program product for continuous monitoring of vital signs
US10459231B2 (en) 2016-04-08 2019-10-29 Magic Leap, Inc. Augmented reality systems and methods with variable focus lens elements
US11106041B2 (en) 2016-04-08 2021-08-31 Magic Leap, Inc. Augmented reality systems and methods with variable focus lens elements
US11614626B2 (en) 2016-04-08 2023-03-28 Magic Leap, Inc. Augmented reality systems and methods with variable focus lens elements
US11460705B2 (en) 2016-09-22 2022-10-04 Magic Leap, Inc. Augmented reality spectroscopy
US11079598B2 (en) 2016-09-22 2021-08-03 Magic Leap, Inc. Augmented reality spectroscopy
AU2017331284B2 (en) * 2016-09-22 2022-01-13 Magic Leap, Inc. Augmented reality spectroscopy
CN109997174A (zh) * 2016-09-22 2019-07-09 Magic Leap, Inc. Augmented reality spectroscopy
WO2018057962A1 (fr) * 2016-09-22 2018-03-29 Magic Leap, Inc. Augmented reality spectroscopy
US10558047B2 (en) * 2016-09-22 2020-02-11 Magic Leap, Inc. Augmented reality spectroscopy
US20180081179A1 (en) * 2016-09-22 2018-03-22 Magic Leap, Inc. Augmented reality spectroscopy
US11754844B2 (en) 2016-09-22 2023-09-12 Magic Leap, Inc. Augmented reality spectroscopy
JP7148501B2 (ja) 2016-09-22 2022-10-05 Magic Leap, Inc. Augmented reality spectroscopy
JP2019529917A (ja) * 2016-09-22 2019-10-17 Magic Leap, Inc. Augmented reality spectroscopy
CN109997174B (zh) * 2016-09-22 2023-06-02 Magic Leap, Inc. Wearable spectroscopy system
US10354350B2 (en) 2016-10-18 2019-07-16 Motorola Solutions, Inc. Method and system for information management for an incident response
US11300844B2 (en) 2017-02-23 2022-04-12 Magic Leap, Inc. Display system with variable power reflector
US11774823B2 (en) 2017-02-23 2023-10-03 Magic Leap, Inc. Display system with variable power reflector
US10962855B2 (en) 2017-02-23 2021-03-30 Magic Leap, Inc. Display system with variable power reflector
WO2018190762A1 (fr) * 2017-04-11 2018-10-18 Cargotec Patenter Ab A presentation system and a method related to the system
US10835809B2 (en) 2017-08-26 2020-11-17 Kristina Contreras Auditorium efficient tracking in auditory augmented reality
EP3498150A1 (fr) * 2017-12-13 2019-06-19 Vestel Elektronik Sanayi ve Ticaret A.S. Head mountable apparatus
US11039778B2 (en) 2018-03-12 2021-06-22 Apple Inc. User interfaces for health monitoring
US11950916B2 (en) 2018-03-12 2024-04-09 Apple Inc. User interfaces for health monitoring
US11202598B2 (en) 2018-03-12 2021-12-21 Apple Inc. User interfaces for health monitoring
JP7444216B2 (ja) 2018-03-20 2024-03-06 Casio Computer Co., Ltd. Wearable device, health management support method, and health management support program
JP2019164634A (ja) * 2018-03-20 2019-09-26 Casio Computer Co., Ltd. Wearable device, health management support method, and health management support program
US11480467B2 (en) 2018-03-21 2022-10-25 Magic Leap, Inc. Augmented reality system and method for spectroscopic analysis
US11852530B2 (en) 2018-03-21 2023-12-26 Magic Leap, Inc. Augmented reality system and method for spectroscopic analysis
WO2019189971A1 (fr) * 2018-03-30 2019-10-03 주식회사 홍복 Artificial intelligence method for analyzing iris and retinal images to diagnose diabetes and its pre-symptoms
US11712179B2 (en) 2018-05-07 2023-08-01 Apple Inc. Displaying user interfaces associated with physical activities
US10987028B2 (en) 2018-05-07 2021-04-27 Apple Inc. Displaying user interfaces associated with physical activities
US11103161B2 (en) 2018-05-07 2021-08-31 Apple Inc. Displaying user interfaces associated with physical activities
US11317833B2 (en) 2018-05-07 2022-05-03 Apple Inc. Displaying user interfaces associated with physical activities
EP3644161A1 (fr) * 2018-08-06 2020-04-29 Motorola Mobility LLC Real-time augmented reality activity feedback
US11457805B2 (en) 2018-09-21 2022-10-04 MacuLogix, Inc. Methods, apparatus, and systems for ophthalmic testing and measurement
US11478142B2 (en) 2018-09-21 2022-10-25 MacuLogix, Inc. Methods, apparatus, and systems for ophthalmic testing and measurement
US11089954B2 (en) 2018-09-21 2021-08-17 MacuLogix, Inc. Method and apparatus for guiding a test subject through an ophthalmic test
US11344194B2 (en) 2018-09-21 2022-05-31 MacuLogix, Inc. Methods, apparatus, and systems for ophthalmic testing and measurement
US10667683B2 (en) 2018-09-21 2020-06-02 MacuLogix, Inc. Methods, apparatus, and systems for ophthalmic testing and measurement
US11478143B2 (en) 2018-09-21 2022-10-25 MacuLogix, Inc. Methods, apparatus, and systems for ophthalmic testing and measurement
US11471044B2 (en) 2018-09-21 2022-10-18 MacuLogix, Inc. Methods, apparatus, and systems for ophthalmic testing and measurement
US11234077B2 (en) 2019-06-01 2022-01-25 Apple Inc. User interfaces for managing audio exposure
US11228835B2 (en) 2019-06-01 2022-01-18 Apple Inc. User interfaces for managing audio exposure
US11152100B2 (en) 2019-06-01 2021-10-19 Apple Inc. Health application user interfaces
US11527316B2 (en) 2019-06-01 2022-12-13 Apple Inc. Health application user interfaces
US11223899B2 (en) 2019-06-01 2022-01-11 Apple Inc. User interfaces for managing audio exposure
US11209957B2 (en) 2019-06-01 2021-12-28 Apple Inc. User interfaces for cycle tracking
US11842806B2 (en) 2019-06-01 2023-12-12 Apple Inc. Health application user interfaces
US11639944B2 (en) 2019-08-26 2023-05-02 Apple Inc. Methods and apparatus for detecting individual health related events
US11266330B2 (en) 2019-09-09 2022-03-08 Apple Inc. Research study user interfaces
US11782508B2 (en) * 2019-09-27 2023-10-10 Apple Inc. Creation of optimal working, learning, and resting environments on electronic devices
US11947722B2 (en) * 2020-03-24 2024-04-02 Arm Limited Devices and headsets
US12002588B2 (en) 2020-04-17 2024-06-04 Apple Inc. Health event logging and coaching user interfaces
US11710563B2 (en) 2020-06-02 2023-07-25 Apple Inc. User interfaces for health applications
US11594330B2 (en) 2020-06-02 2023-02-28 Apple Inc. User interfaces for health applications
US11482328B2 (en) 2020-06-02 2022-10-25 Apple Inc. User interfaces for health applications
US11107580B1 (en) 2020-06-02 2021-08-31 Apple Inc. User interfaces for health applications
US11194455B1 (en) 2020-06-02 2021-12-07 Apple Inc. User interfaces for health applications
US11698710B2 (en) 2020-08-31 2023-07-11 Apple Inc. User interfaces for logging user activities
WO2022126117A1 (fr) * 2020-12-08 2022-06-16 Irisvision, Inc. Method and system for remote clinician management of head-mounted vision assist devices
US11049603B1 (en) 2020-12-29 2021-06-29 Kpn Innovations, Llc. System and method for generating a procreant nourishment program
US11735310B2 (en) 2020-12-29 2023-08-22 Kpn Innovations, Llc. Systems and methods for generating a parasitic infection nutrition program
US11355229B1 (en) 2020-12-29 2022-06-07 Kpn Innovations, Llc. System and method for generating an ocular dysfunction nourishment program
CN112835736B (zh) * 2021-01-22 2023-08-22 张立旭 General-purpose data error correction method and system
CN112835736A (zh) * 2021-01-22 2021-05-25 张立旭 General-purpose data error correction method and system
US11854685B2 (en) 2021-03-01 2023-12-26 Kpn Innovations, Llc. System and method for generating a gestational disorder nourishment program
US11935642B2 (en) 2021-03-01 2024-03-19 Kpn Innovations, Llc System and method for generating a neonatal disorder nourishment program
WO2023224803A1 (fr) * 2022-05-18 2023-11-23 Apple Inc. Eye characteristic determination
US12001648B2 (en) 2022-09-23 2024-06-04 Apple Inc. User interfaces for logging user activities

Similar Documents

Publication Publication Date Title
WO2014015378A1 (fr) Mobile computing device, application server, computer readable storage medium and system for calculating a vitality indicia, detecting an environmental hazard, providing vision assistance and detecting disease
US20230359910A1 (en) Artificial intelligence and/or virtual reality for activity optimization/personalization
JP7333432B2 (ja) Augmented reality systems and methods for user health analysis
US11195340B2 (en) Systems and methods for rendering immersive environments
US20200337631A1 (en) Systems, environment and methods for identification and analysis of recurring transitory physiological states and events using a portable data collection device
JP6358586B2 (ja) Systems, computer medium and computer-implemented methods for providing health information to employees via augmented reality display
US7502498B2 (en) Patient monitoring apparatus
US8647268B2 (en) Patient monitoring apparatus
JP5561742B2 (ja) System for improving a patient's condition using a correction region within the patient's field of view
US20140024971A1 (en) Assessment and cure of brain concussion and medical conditions by determining mobility
US20050165327A1 (en) Apparatus and method for detecting the severity of brain function impairment
JP2019523027A (ja) Apparatus and method for recording and analyzing memory and functional decline
JP2022548473A (ja) Systems and methods for patient monitoring
EP3940715A1 (fr) Decision support system for neurological disorders and associated method
KR102235716B1 (ko) Apparatus and method for diagnosing and treating learning disabilities using virtual reality
CN112086164A (zh) Physical condition feedback method, system, and storage medium
US20230320640A1 (en) Bidirectional sightline-position determination device, bidirectional sightline-position determination method, and training method
US20220007936A1 (en) Decision support system and method thereof for neurological disorders
KR102624293B1 (ko) Dizziness diagnosis and treatment service system and method
KR20240029921A (ko) Smart cognitive coaching service system using a VR device based on composite biosignal processing
Bao Vibrotactile Sensory Augmentation and Machine Learning Based Approaches for Balance Rehabilitation
Subbhuraam et al. Pervasive healthcare applications in neurology
Moore et al. US Health care technologies
WO2024050650A1 (fr) System for the diagnosis and management of anxiety or pain

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13823117

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13823117

Country of ref document: EP

Kind code of ref document: A1