WO2019043475A2 - Devices and methods for use in diagnosing a medical condition - Google Patents

Devices and methods for use in diagnosing a medical condition

Info

Publication number
WO2019043475A2
Authority
WO
WIPO (PCT)
Prior art keywords
eye
patient
mirror
communication device
mobile communication
Prior art date
Application number
PCT/IB2018/055767
Other languages
English (en)
Other versions
WO2019043475A3 (fr)
Inventor
Joshua David FISCHER
David Jacobus VAN DEN HEEVER
Original Assignee
Stellenbosch University
Priority date
Filing date
Publication date
Application filed by Stellenbosch University filed Critical Stellenbosch University
Publication of WO2019043475A2
Publication of WO2019043475A3
Priority to ZA2020/00774A (ZA202000774B)

Links

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement

Definitions

  • This invention relates to devices and methods for use in diagnosing a medical condition, in particular to a portable device and associated method for use in diagnosing concussion.
  • Concussion is also known as mild traumatic brain injury (mTBI).
  • Concussion presents with a variety of physical, cognitive, and emotional symptoms, which may not be recognized if subtle.
  • Concussions are common in sports such as football and rugby. Due to complications arising out of so-called "secondary impact syndrome", it is important to keep a player who has suffered a concussion from returning to the field until the player has fully recovered from the concussion. A player's return should not be considered while the player continues to exhibit any symptoms.
  • a player suffering from a concussion is evaluated by a medical professional conducting a number of assessments on the player.
  • concussion may be under-diagnosed. To address this, a number of technological solutions have been provided.
  • a universal headset-mounted neuropsychological testing system utilizes eye tracking, with a single screen fixed with respect to the face and a dot on the screen driven to present an object on which both eyes are focused using a set of prisms, the prisms eliminating interocular distance considerations.
  • Ultrathin optics cast a virtual image at 40 centimetres and a universal mask against which the test taker's face is placed fixes the single screen with respect to the face so that head movement is not a factor.
  • miniature cameras are located in the headset housing beneath the eyes, and a quick release tensioning unit provides easy headset mounting and removal.
  • all elements are located in the headset hood to eliminate the effects of head movement as well as environmental distractions.
  • a portable device comprising: a casing configured for mounting a mobile communication device on a head of a patient and providing an optical path between an eye of the patient and a display zone in which a display of the mobile communication device operatively locates, the casing including a lens located in the optical path and configured to facilitate focussing of the eye of the patient at the display zone, wherein the casing includes an illumination module configured to project light onto the eye of the patient and an optical arrangement arranged to direct light, projected onto the eye by the illumination module and reflected by the eye, towards an opening arranged operatively to provide optical communication with a camera of the mobile communication device, thereby to enable tracking of movement of the eye by the mobile communication device.
  • the optical arrangement to include a set of mirrors, for the optical arrangement to include a first mirror located substantially within the optical path and a second mirror located outside of the optical path, and for light projected onto the eye by the illumination module to be reflected by the eye onto the first mirror, for the first mirror to be arranged to reflect light towards the second mirror and for the second mirror to be arranged to reflect light from the first mirror into the opening.
  • a further feature provides for the opening providing optical communication with the camera to be arranged to provide optical communication with a front facing camera located on a front face of the mobile communication device alongside the display.
  • the casing to include an eyepiece providing the lens and for the illumination module to be fixed at or near a first end of the eyepiece.
  • the optical arrangement to include a support element configured to support the set of mirrors; for the support element to define an aperture shaped and dimensioned to receive a second end of the eyepiece and engage a side wall of the eyepiece; and for the support element to include a first mirror support formation for supporting the first mirror and a second mirror support formation for supporting the second mirror.
  • first mirror support formation to be configured to support the first mirror between the second end of the eyepiece and the display zone.
  • a further feature provides for the second mirror support formation to be configured to support the second mirror adjacent the eyepiece and towards the opening providing optical communication with the camera.
  • a still further feature provides for the mirror support formations to be configured to support the mirrors at angles selected such that in operation light reflecting from the eye of the patient is directed by the mirrors into the opening providing optical communication with the camera.
  • the optical arrangement to include an infrared or near-infrared pass filter configured to permit passage of electromagnetic waves falling within the infrared or near infrared band and to block passage of electromagnetic waves of other wavelengths.
  • the casing to define a cavity in which the optical arrangement locates and an opening in which the eyepiece is supported with the first end thereof extending through the eyepiece support opening away from the cavity and the second end thereof located within the cavity; for the casing to include a shield which defines an abutment formation arranged operatively to abut a face of the patient with the first end of the eyepiece adjacent the eye of the patient; for the shield to be configured to shield the eye and eyepiece from external light when in position on the patient's face; and for the abutment formation to be contoured so as to generally fit and engage the face of the patient.
  • the casing to include a docking arrangement configured to receive and hold the mobile communication device captive, the docking arrangement including a receiving formation configured to receive the mobile communication device and a securing arrangement configured to secure the mobile communication device in the receiving formation, and for the receiving formation to define an opening which provides the display zone.
  • the illumination module to be configured to emit non-visible light; for the illumination module to be an infrared or near infrared illumination module; and, for the illumination module to include a number of infrared or near infrared light emitting diodes.
  • a yet further feature provides for the first mirror to be configured to permit the passage of visible light therethrough and to reflect at least selected frequencies of non-visible light.
  • the device to include an attachment mechanism for attaching the casing to the head of the patient; and for the attachment mechanism to include a strap arrangement.
  • the invention extends to a kit for use in diagnosing a medical condition, the kit including the illumination module and optical arrangement as defined above and being configured for attachment to and cooperation with the casing.
  • a further feature provides for the casing to be provided by a virtual reality headset.
  • a computer-implemented method conducted at a mobile communication device comprising: displaying, via a display of the mobile communication device, a virtual environment in which a stimulus point moves in a predefined manner, wherein the virtual environment is displayed while a portable device mounts the mobile communication device with the display locating in a display zone thereof, and wherein the virtual environment is visible to a patient through a lens of the portable device; receiving, from a camera of the mobile communication device, image data relating to movement of an eye of the patient, wherein the image data is received while the portable device mounts the mobile communication device via an optical arrangement of the portable device which is arranged to direct light, projected onto the eye by an illumination module and reflected by the eye, towards an opening of the portable device arranged operatively to provide optical communication with the camera; analysing the image data to track movement of the eye; comparing the movement of the eye to movement of the stimulus point; and, outputting a tracking score based on the extent to which movement of the eye matches movement of the stimulus point for use in diagnosing the medical condition.
  • a further feature provides for receiving image data to receive the image data from a front facing camera of the mobile communication device.
  • a still further feature provides for analysing the image data to include determining the position of a centre point of a pupil of the eye.
  • a yet further feature provides for comparing the movement of the eye to movement of the stimulus point to include: mapping the position of the centre point of the pupil to an estimated gaze point in the virtual environment; and, determining the extent to which the gaze point matches the stimulus point.
  • determining the extent to which the gaze point matches the stimulus point includes determining a root mean square value of the distance of the gaze point to the stimulus point over the duration of the movement of the stimulus point; and for outputting the tracking score to output the root mean square value.
  • a further feature provides for the method to include comparing the tracking score to one or more stored values and outputting the result of the comparison.
  • a still further feature provides for the method to include: receiving, from a movement sensor of the mobile communication device, movement data relating to movement of the mobile communication device; comparing the received movement data with expected movement data; and, outputting a movement score based on the comparison for use in diagnosing the medical condition.
  • the movement sensor to include one or both of an accelerometer and a gyroscope and for the movement data to include one or both of acceleration data and rotation rate data; for the method to include receiving a patient identifier uniquely identifying a current user of the device; for the method to include comparing the tracking score to one or more historic tracking scores stored in association with the patient identifier; for the method to include comparing the movement score to one or more historic movement scores stored in association with the patient identifier; for the method to include storing the tracking score in association with the patient identifier; and, for the method to include storing the movement score in association with the patient identifier.
  • a system including a mobile communication device including a memory for storing computer-readable program code and a processor for executing the computer-readable program code and comprising: a display for displaying a virtual environment in which a stimulus point moves in a predefined manner, wherein the virtual environment is displayed while a portable device mounts the mobile communication device with the display locating in a display zone thereof, and wherein the virtual environment is visible to a patient through a lens of the portable device; an image receiving component for receiving, from a camera of the mobile communication device, image data relating to movement of an eye of the patient, wherein the image data is received while the portable device mounts the mobile communication device via an optical arrangement of the portable device which is arranged to direct light, projected onto the eye by an illumination module and reflected by the eye, towards an opening of the portable device arranged operatively to provide optical communication with the camera; an analysing component for analysing the image data to track movement of the eye; a comparing component for comparing the movement of the eye to movement of the stimulus point; and, a score outputting component for outputting a tracking score based on the extent to which movement of the eye matches movement of the stimulus point for use in diagnosing the medical condition.
  • a computer program product comprising a computer-readable medium having stored computer-readable program code for performing the steps of: displaying, via a display of a mobile communication device, a virtual environment in which a stimulus point moves in a predefined manner, wherein the virtual environment is displayed while a portable device mounts the mobile communication device with the display locating in a display zone thereof, and wherein the virtual environment is visible to a patient through a lens of the portable device; receiving, from a camera of the mobile communication device, image data relating to movement of an eye of the patient, wherein the image data is received while the portable device mounts the mobile communication device via an optical arrangement of the portable device which is arranged to direct light, projected onto the eye by an illumination module and reflected by the eye, towards an opening of the portable device arranged operatively to provide optical communication with the camera; analysing the image data to track movement of the eye; comparing the movement of the eye to movement of the stimulus point; and, outputting a tracking score based on the extent to which movement of the eye matches movement of the stimulus point for use in diagnosing the medical condition.
  • Figure 1 is a three dimensional view of a portable device for use in diagnosing a condition according to an example embodiment described herein;
  • Figure 2 is a front view of the portable device of Figure 1, with a cover removed;
  • Figure 3 is a rear view of the portable device of Figure 1;
  • Figure 4 is a three dimensional view of the portable device of Figure 1 in which a wall of a casing of the device has been removed so as to illustrate a set of mirrors located within a cavity thereof;
  • Figure 5 is a three dimensional view of a support element supporting a set of mirrors according to embodiments described herein;
  • Figure 6 is a front view of an illumination module according to embodiments described herein;
  • Figure 7 is a schematic diagram which illustrates an optical arrangement including a set of mirrors described herein;
  • Figure 8 is a flow diagram which illustrates an example embodiment of a method for use in diagnosing a condition described herein;
  • Figure 9 is a sequence of exemplary images which illustrates exemplary operations for analysing image data according to aspects of the present disclosure;
  • Figure 10 is a schematic diagram which illustrates an example embodiment of a system for use in diagnosing a condition described herein;
  • Figure 11 is a schematic diagram which illustrates an example user interface of a software application described herein;
  • Figure 12 is a schematic diagram which illustrates an example user interface of a software application described herein.
  • Figure 13 illustrates an example of a computing device in which various aspects of the disclosure may be implemented.
  • aspects of this disclosure are directed towards a smartphone-based application and associated portable device for one or more of concussion detection, monitoring and management.
  • the application and device may enable access to high-end concussion management protocols by amateur athletes and the like.
  • patient data may be stored in and retrieved from a server computer and may be accessible to multiple parties involved in the concussion management cycle.
  • the application and device may obtain objective balance and eye tracking performance measurements for use in diagnosing and/or screening of concussion and/or other related medical conditions.
  • aspects of the disclosure may provide quick cognitive and symptom assessment for concussion.
  • Aspects of the disclosure may enable monitoring of progress of patients having suffered a concussion and may aid medical professionals making and monitoring diagnoses.
  • Figures 1 to 4 illustrate an example embodiment of a portable device (1) for use in diagnosing a medical condition, such as concussion.
  • the portable device (1) includes a casing (3) which is configured for mounting a mobile communication device (not shown) on a head of a patient.
  • the casing (3) may be in the form of a suitable headset which holds a mobile communication device display static relative to a head of the patient.
  • the casing (3) may be provided by a purpose-built headset including the illumination module (23) and optical arrangement (26) described below, or by means of a kit for adapting a headset, the kit including the illumination module and optical arrangement.
  • the casing (3) includes walls which are arranged to define a cavity.
  • a shield (5) extends from the casing (3) and defines an abutment formation arranged operatively to abut a face of the patient.
  • the shield (5) is configured to shield eyes and eyepieces from external light when in position on the patient's face.
  • the abutment formation may be defined by an edge of the shield (5) and may be contoured so as to generally fit and engage the face of the patient.
  • the edge of the shield may be contoured so as to engage the patient's face along most, if not all, of the length of the edge, thereby minimising entry of external light into the zone operatively containing the eyepiece and the patient's eye.
  • the abutment formation has padding (7) fixed thereon which engages the patient's face and improves comfort.
  • the casing (3) may include a docking arrangement configured to receive and hold the mobile communication device captive.
  • the docking arrangement may include a receiving formation (9) configured to receive the mobile communication device and a securing arrangement (11) arranged to secure the mobile communication device in the receiving formation (9).
  • the receiving formation (9) defines a depression in a first wall of the casing (3) which is shaped and dimensioned to receive the mobile communication device with its face engaging a bed of the depression.
  • An opening is defined in the first wall of the casing (3) within the bed of the depression.
  • the opening provides a display zone (13) in which a display of the mobile communication device operatively locates.
  • the display zone (13) is generally of binocular shape.
  • the securing arrangement (11) may be provided by any suitable formations arranged to secure the mobile communication device in the receiving formation (9) and, in the illustrated embodiment, is provided by elbowed fasteners which are moveable into a fastening condition in which they lock and engage end regions of a rear surface of the mobile communication device to hold the mobile communication device captive.
  • the casing (3) includes a pair of openings in a second wall being generally opposite the first wall. The pair of openings provide communication with the cavity from an exterior of the casing and each opening is arranged to receive and support an eyepiece (15). It should be appreciated that although two eyepieces (15) are illustrated in the present embodiment, in other implementations there may be only a single eyepiece.
  • Each eyepiece (15) is supported in its respective opening with a first end thereof extending from the eyepiece support opening away from the cavity and a second end thereof located within the cavity.
  • Each eyepiece (15) may include at least one lens (19) and a lens holder which supports the lens (19).
  • the position of the lens (19) may be adjustable to facilitate focussing via a focussing arrangement (21).
  • the eyepieces (15) are arranged so as to generally align with eyes of the patient and are configured such that the first ends thereof do not extend beyond the shield.
  • the lenses (19) are selected so as to facilitate focussing of the eyes of the patient at the display zone such that in use when a patient looks through the eyepiece (15) and lens (19), the patient is able to focus his or her vision on the display located in the display zone (13).
  • the lenses may for example be selected to provide a focal length which is approximately at the display zone when the eyepieces are supported in their respective openings.
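  • As an illustrative aside (the figures below are assumptions for illustration, not values taken from this disclosure), a simple thin-lens model indicates why a lens whose focal plane lies approximately at the display zone lets the eye focus comfortably on the nearby display:

$$\frac{1}{f} = \frac{1}{d_o} + \frac{1}{d_i}, \qquad d_i = \frac{f\,d_o}{d_o - f}$$

  • With the display placed at an object distance $d_o$ slightly inside the focal length $f$, for example the assumed values $f = 45\ \text{mm}$ and $d_o = 40\ \text{mm}$, the image distance is $d_i = -360\ \text{mm}$: a virtual image roughly 36 cm away, comparable to a comfortable viewing distance, even though the physical display sits only a few centimetres from the eye.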
  • the eyepiece support openings and the display zone opening are in direct optical communication with each other. With the eyepieces (15) fitted in their respective openings, the casing (3) provides optical paths which extend between respective eyepieces (and lenses thereof) and the display zone (13).
  • the portable device (1) includes an illumination module (23) configured to project light onto one of the patient's eyes.
  • the illumination module is fitted to the left eyepiece such that light is projected on the patient's left eye.
  • the illumination module may be fitted to or otherwise associated with the right eyepiece for projecting light onto the right eye or indeed on both eyepieces for projecting light onto both of the patient's eyes.
  • the illumination module (23) may be configured to emit non- visible light, for example infrared or near infrared light, and in the illustrated embodiment, as shown more clearly in Figure 6, includes a number of infrared or near infrared light emitting diodes (LEDs) (25).
  • the LEDs (25) are arranged around an annulus which is shaped and dimensioned to be fixed around a holder of the eyepiece towards a first end thereof, such that the LEDs (25) are arranged around the eyepiece (15) and are directed towards the patient's eye in use.
  • the illumination module (23) may include cables for supplying electrical current to the LEDs (25).
  • NIR light may be the most suitable light source wavelength as it typically causes minimal discomfort to the patient (as compared, e.g., to visible light).
  • the illumination module (23) may be configured to clip onto the eyepiece (15) of the device (1).
  • the illumination module (23) may draw electrical power from the mobile communication device via a suitable cable and adapter.
  • the adapter may be configured for interfacing with a data port of the mobile communication device, which may allow the mobile communication device to control the illumination module (e.g. turning LEDs on and off, setting duty cycle, etc.).
  • the casing (3) includes an optical arrangement (26) which includes a set of mirrors (27, 29).
  • the set of mirrors (27, 29) is located in the cavity and supported by a support element (31).
  • the support element (31) defines an aperture shaped and dimensioned to fit around and optionally engage the second end of the same eyepiece (15) to which the illumination module (23) is fixed. Inner walls of the aperture engage or are otherwise fixed to a side wall of the eyepiece at or near the second end thereof.
  • the support element (31), as illustrated more clearly in Figure 5, includes a first mirror support formation (33) arranged to support a first mirror (27) of the set of mirrors and a second mirror support formation (35) arranged to support a second mirror (29) of the set of mirrors.
  • the mirror support formations (33, 35) may be provided by arms which have recesses formed therein which receive and hold captive the respective mirrors by way of a friction fit or other suitable mechanism.
  • the first mirror support formation (33) may be configured to support the first mirror (27) between the second end of the eyepiece (15) and the display zone (13) such that the first mirror (27) locates in the optical path extending from the lens (19) of the eyepiece (15) to the display zone (13).
  • the second mirror support formation (35) may be configured to support the second mirror (29) adjacent the eyepiece (15) and towards an opening (37) formed in the first wall of the casing (3) alongside the display zone opening.
  • This opening is provided in the receiving formation (9) at a location selected so as to align with a front-facing camera of the mobile communication device and thereby operatively to provide optical communication between the cavity and the camera.
  • the mirror support formations (33, 35) are configured to support the mirrors at angles selected such that in operation light reflecting from the eye of the patient is directed by the mirrors into the opening (37).
  • the mirrors are generally arranged at an angle of between 40 degrees and 60 degrees relative to an axis of the lens (19) of the eyepiece. It should of course be appreciated that the exact angles will depend on the implementation and will be affected by factors such as the number of mirrors used, the dimensions of the casing, the dimensions of the mobile communication device (e.g.
  • At least the first mirror (27), and in some implementations the second mirror (29), is configured to permit the passage of visible light therethrough and to reflect at least selected frequencies of non- visible light.
  • the mirror (27) (or mirrors) may be provided by so-called “hot mirrors” or “hot mirror lenses” (e.g. a suitable dielectric mirror, a dichroic filter, etc.) which reflects infrared or near infrared light, while allowing visible light to pass.
  • the first mirror (27) may accordingly appear transparent to the eye.
  • Although the first mirror (27) is located within the optical path, it does not noticeably interfere with visible light communication via the optical path, meaning that the patient is able to look through the lens (19) and first mirror (27) and see the mobile communication device display located in the display zone (13).
  • Using mirrors in the form of hot mirror lenses may allow light in the visible spectrum to pass through the mirror while reflecting light in the near infrared (NIR) and infrared (IR) spectra.
  • the first mirror (27) may be a major mirror and may be shaped and dimensioned such that a border thereof which is projected onto the plane of the lens (19) exceeds a field of view provided by the lens (19) such that the perimeter thereof is not visible to a patient looking through the lens (19).
  • the second mirror (29) may be a minor mirror, being smaller than the first mirror (27).
  • the set of mirrors (27, 29) is arranged such that light (39), having been projected onto the eye by the illumination module (23) and reflected by the eye, travels through the lens (19) of the eyepiece and onto the first mirror (27).
  • the first mirror (27) in turn reflects the light (39) towards the second mirror (29) which in turn reflects the light (39) from the first mirror (27) into the opening (37) and onwards into the camera of the mobile communication device.
  • Light (41) projected by the display of the mobile communication device when located in the display zone (13) is permitted to pass through the first mirror (27) and lens (19) so as to be visible to the patient when looking through the eyepiece (15).
  • the optical arrangement (26) includes an infrared or near-infrared pass filter (43) which is configured to permit passage of electromagnetic waves falling within the infrared or near infrared band therethrough and to block passage of electromagnetic waves of at least selected other wavelengths (in particular visible light).
  • the pass filter (43) may be fixed to the first wall over the opening (37) which provides optical communication with the camera.
  • the pass filter (43) may assist in reducing camera image noise that could be caused by other light wavelengths.
  • the optical arrangement (26) may accordingly enable the front camera of a mobile communication device inserted into the portable device to capture image data depicting movement of the patient's eye.
  • the portable device (1) also includes an attachment mechanism (45) which is configured to attach the casing (3) to the head of the patient.
  • the attachment mechanism (45) includes a strap arrangement.
  • the device (1) may include a cover (47) shaped and dimensioned to fit over and cover the receiving formation (9) and mobile communication device received therein. The cover (47) may be configured to cooperate with the securing arrangement (11) and may limit ambient light exposure when in operation.
  • the portable device (1) may be used in a method for use in diagnosing a medical condition.
  • Figure 8 is a flow diagram which illustrates an exemplary method for use in diagnosing a medical condition, such as concussion. The method is described with reference to an exemplary system as illustrated in Figure 10.
  • the method may be conducted at the mobile communication device (102).
  • the mobile communication device (102) may be mounted to the head of the patient using a portable device, such as the device (1) described above with reference to Figures 1 to 7.
  • the steps, operations or procedures of the method are performed locally at the mobile communication device, i.e. by the processor or processors of the mobile communication device.
  • the mobile communication device may be configured to perform the method while offline, i.e. while not connected to a data network, which may improve portability and versatility of the devices.
  • the method may include displaying (202), via the display of the mobile communication device (102), a virtual environment in which a stimulus point moves in a predefined manner.
  • the stimulus point may be a circle, sphere, or other suitable shape which is displayed on the display and which moves around the display in a predetermined fashion.
  • the stimulus point follows a circular route and travels at a constant speed.
  • the movement of the stimulus point around the display may be confined to that portion of the display which is visible through the eyepiece of the portable device.
  • the virtual environment and movement of the stimulus point therein may be configured to optimise evaluation of ocular motor function.
  • the speed at which the stimulus point moves may be selected so that a healthy user can follow movement of the stimulus point in the virtual environment.
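  • As a hedged sketch of how such a stimulus might be generated (the function name, normalised coordinates and numeric values below are illustrative assumptions, not part of the disclosure), a circular route traversed at constant speed can be parameterised as follows:

```python
import math

def stimulus_position(t, centre=(0.5, 0.5), radius=0.3, period_s=4.0):
    """Position of a stimulus point following a circular route at constant
    speed, expressed in normalised display coordinates (0..1).

    t: elapsed time in seconds.
    centre, radius, period_s: illustrative values; in practice the route,
    size and speed would be tuned so a healthy user can follow the point.
    """
    angle = 2.0 * math.pi * (t / period_s)          # constant angular speed
    x = centre[0] + radius * math.cos(angle)
    y = centre[1] + radius * math.sin(angle)
    return x, y
```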
  • the virtual environment displayed on the display may be visible to a patient wearing the portable device to which the mobile communication device is attached via one or both eyepieces (15) of the portable device.
  • the method may include receiving (204) image data relating to movement of the eye of the patient.
  • the image data may be received from the camera of the mobile communication device (102).
  • the image data is received from a front facing camera of the mobile communication device (e.g. a camera located on a front face of the mobile communication device alongside the display).
  • the image data may be captured by the camera of the mobile communication device (102) and may be a digital representation of an image of the patient's eye, as illuminated by the illumination module (23), which is projected through the opening (37) via the set of mirrors (27, 29).
  • the image data may include a sequence of images, a video or the like and may enable movement of the patient's eye to be tracked using suitable image processing techniques.
  • the image data may be received while the virtual environment is being displayed on the display and while the patient wears the portable device to which the mobile communication device is attached. In other words, displaying (202) the virtual environment and receiving (204) the image data may happen substantially simultaneously.
  • the method may include analysing (206) the image data to track movement of the eye. Analysing the image data may include determining the position of a centre point of a pupil of the eye using image processing.
  • Analysing the image data may include isolating from original image data (250) that part of the image data which illustrates the eye and calculating row indexes of a predetermined number of (e.g. 100) equally distributed rows.
  • Background subtraction may be performed to eliminate noise caused by lens reflections. This may be effected by subtracting a background image (252) from the original image (250) to result in an isolated and background subtracted image (254).
  • the background image may be obtained by capturing an image of a white object. Background subtraction may assist in eliminating noise introduced by unwanted lens refractions.
  • the image data may be transformed (256) into a single channel, for example by subtracting the red channel from the image data.
  • subtraction and channel conversion may only be performed for selected pixels in the image data.
  • Thresholding and contour detection may be performed (258).
  • Contour detection/selection may be performed (260) to select the start and end of the most suitable contour lines and a circle may be fitted (262) around the contour points to detect the pupil. These operations may be performed for each index.
  • These operations may be performed for each frame of the image data so that pupil movement can be tracked across the image data.
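  • The following is a minimal OpenCV-based sketch of a per-frame pipeline of this general shape (background subtraction, single-channel conversion, thresholding, contour detection and circle fitting); the channel choice, threshold value and largest-contour selection are simplifying assumptions rather than the exact operations of the disclosure:

```python
import cv2
import numpy as np

def detect_pupil(frame_bgr, background_bgr, threshold=60):
    """Rough per-frame pupil detection sketch. Assumes frame and background
    images have the same size and dtype. Returns (cx, cy, r) of a fitted
    pupil circle, or None if no candidate contour is found."""
    # Background subtraction to suppress static reflections from the lens.
    diff = cv2.absdiff(frame_bgr, background_bgr)
    # Reduce to a single channel (here: the red channel of the difference).
    single = diff[:, :, 2]
    # Threshold the single-channel image, then find contours (OpenCV 4.x API).
    _, mask = cv2.threshold(single, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # Select the largest contour as the pupil candidate and fit a circle to it.
    candidate = max(contours, key=cv2.contourArea)
    (cx, cy), r = cv2.minEnclosingCircle(candidate)
    return float(cx), float(cy), float(r)
```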
  • the method may include comparing (208) the movement of the eye to movement of the stimulus point. This may include mapping (210) the position of the centre point of the pupil to an estimated gaze point in the virtual environment.
  • the method may use a calibration procedure of between five and nine points to map the centre of the pupil to a gaze position.
  • the method may include using the pupil's position for the known calibration points to fit two second-order conic equations to map any x and y pupil co-ordinates, respectively, to gaze co-ordinates.
  • In the case of nine-point calibration, for example, the method may include using a nine-point grid to calculate the pupil's position at these nine known gaze positions.
  • the method may use a linear interpolation algorithm to calculate the gaze point position for all possible pupil positions.
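  • One plausible reading of the calibration step described above is a least-squares fit of two second-order polynomial mappings, one per gaze axis, from pupil-centre coordinates to gaze coordinates; the sketch below (function names and choice of polynomial terms are assumptions) needs at least six calibration points, so a nine-point grid suffices:

```python
import numpy as np

def fit_gaze_mapping(pupil_xy, gaze_xy):
    """Fit two second-order polynomial mappings (one per gaze axis) from
    pupil-centre coordinates to gaze coordinates using calibration samples.

    pupil_xy, gaze_xy: arrays of shape (n_points, 2) captured while the
    patient fixates known calibration targets (e.g. a nine-point grid).
    Returns a function mapping a pupil (x, y) to an estimated gaze (x, y).
    """
    p = np.asarray(pupil_xy, dtype=float)
    g = np.asarray(gaze_xy, dtype=float)
    x, y = p[:, 0], p[:, 1]
    # Design matrix of second-order terms: 1, x, y, x*y, x^2, y^2.
    A = np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])
    coeff_x, *_ = np.linalg.lstsq(A, g[:, 0], rcond=None)
    coeff_y, *_ = np.linalg.lstsq(A, g[:, 1], rcond=None)

    def to_gaze(px, py):
        terms = np.array([1.0, px, py, px * py, px ** 2, py ** 2])
        return float(terms @ coeff_x), float(terms @ coeff_y)

    return to_gaze
```

  • In use, the returned mapping would be applied to each tracked pupil centre to obtain the estimated gaze point that is then compared against the position of the stimulus point.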
  • Operations (202) to (212) described above may provide an ocular motor function test, such as a smooth pursuit eye tracking test, for diagnosing or screening for concussion and may be adapted to conform to best practices in the medical profession.
  • the ocular motor function test may perform monocular eye tracking. A sampling rate of between 20 Hz and 40 Hz, preferably 30 Hz, may be selected and the test may achieve an accuracy in the region of 0.5°.
  • the method may include displaying (213) or otherwise outputting movement test instructions which instruct the patient to perform certain actions or assume certain positions (e.g. to stand on one leg, walk along a line, etc.) so that the patient's balance can be evaluated.
  • the movement test instructions are displayed as a separate test, which may be performed either before or after the ocular motor function test described above is performed.
  • the method may include receiving (214) movement data relating to movement of the mobile communication device (102).
  • the movement data may be received from a movement sensor of the mobile communication device (102).
  • the movement sensor may include one or both of an accelerometer and a gyroscope and the movement data may include one or both of acceleration data and rotation rate data.
  • Acceleration data may for example relate to acceleration of the mobile communication device along one or more of three axes.
  • Rotation rate data may for example relate to rotation of the mobile communication device around or about one or more of the three axes.
  • Receiving (214) movement data may include receiving movement data relating to torso and/or head tilt and acceleration in the mediolateral (ML) and anteroposterior (AP) axes.
  • Receiving (214) movement data may include receiving data relating to a tandem gait test. This movement data may be used to identify more accurate markers of torso and/or head tilt disturbance after a concussion and may be received while the mobile communication device is mounted to a torso of the patient using an appropriate mounting mechanism.
  • a sampling rate of between 100 Hz and 200 Hz, preferably 150 Hz, may be selected for sampling the movement data.
  • the method may include comparing (216) the received movement data with expected movement data.
  • the root mean square of the ML and AP torso tilt is used to identify differences between baseline balance ability and post-injury balance ability in a dynamic gait test, the same as or similar to the tests defined in the Head Injury Assessment (HIA) protocol and the Balance Error Scoring System (BESS), using the mobile device's sensors.
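  • A minimal sketch of such a score is shown below, assuming the ML and AP tilt angles have already been derived from the accelerometer and/or gyroscope samples (that derivation, and the sampling details, are outside this sketch):

```python
import numpy as np

def balance_score(ml_tilt_deg, ap_tilt_deg):
    """Root mean square of mediolateral (ML) and anteroposterior (AP) torso
    tilt samples, e.g. collected at roughly 150 Hz during a balance or
    tandem gait test. Comparing these values against baseline values
    recorded while the patient was healthy may indicate a balance deficit."""
    ml = np.asarray(ml_tilt_deg, dtype=float)
    ap = np.asarray(ap_tilt_deg, dtype=float)
    rms_ml = float(np.sqrt(np.mean(ml ** 2)))   # RMS of ML tilt
    rms_ap = float(np.sqrt(np.mean(ap ** 2)))   # RMS of AP tilt
    return rms_ml, rms_ap
```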
  • Operations (213) to (216) may provide a suitable quantitative balance assessment using sensors of the mobile communication device (102). In some implementations, these operations may be performed with the mobile communication device (102) docked in the portable device (1) while in other implementations, a harness may be used to secure the mobile communication device (102) to the chest of the patient.
  • Using the mobile communication device (102) may enable more objective balance examinations.
  • Operations (213) to (216) may be configured to align with the testing procedure of the Head Injury Assessment (HIA) protocol or another suitable protocol for evaluating concussion.
  • a more objective test may be provided.
  • the mobile communication device accelerometer and/or gyro data may be used to quantify a balance error scoring system, which may improve what was previously a subjective test.
  • the method may include outputting (218) scores for use in diagnosing the medical condition. This may include outputting one or both of a tracking score based on the extent to which movement of the eye matches movement of the stimulus point and a movement score based on the comparison of the movement data to expected movement data for use in diagnosing the medical condition. Outputting the tracking score may include outputting the root mean square value.
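  • As an illustration of this scoring step, the sketch below computes a root mean square tracking score from paired gaze and stimulus samples; the array shapes and function name are assumptions for illustration, not part of the disclosure:

```python
import numpy as np

def tracking_score(gaze_points, stimulus_points):
    """Root mean square distance between estimated gaze points and stimulus
    points over the duration of the stimulus movement.

    gaze_points, stimulus_points: arrays of shape (n_samples, 2) holding
    (x, y) coordinates in the same display coordinate system.
    A lower score indicates closer pursuit of the stimulus."""
    gaze = np.asarray(gaze_points, dtype=float)
    stim = np.asarray(stimulus_points, dtype=float)
    distances = np.linalg.norm(gaze - stim, axis=1)   # per-sample gaze error
    return float(np.sqrt(np.mean(distances ** 2)))    # RMS over the test
```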
  • the method may include receiving (220) a patient identifier which uniquely identifies the current patient using the device.
  • the patient identifier may for example be received via a touch sensitive display of the mobile communication device (102), a contactless element (e.g. NFC, RFID, etc.), a graphical code captured using the camera of the device or the like.
  • the method may use the patient identifier to access (222) a patient record (110) in which historic movement and/or tracking scores may be stored. At least some of the historic scores may be baseline scores which are determined while it is known that the patient is not suffering from the medical condition. Baseline scores may assist in providing more accurate diagnoses.
  • the patient record (110) may be retrieved from an internal memory of the mobile communication device (102) or from a remote location accessible via a communication network (106) (e.g. a cloud-based storage facility, remotely accessible database, etc.).
  • the method may include storing (224) the output scores in the patient record (110).
  • the method may include comparing (226) the output score to one or more corresponding historic scores.
  • the method may include outputting (228) the comparison including one or both of the tracking comparison and the movement comparison.
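  • A minimal, purely illustrative sketch of storing scores against a patient identifier and comparing them with stored values is shown below; the record structure, the use of the first stored score as the baseline and the tolerance factor are assumptions, not clinically validated logic:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PatientRecord:
    """Illustrative patient record keyed by a patient identifier, holding
    historic tracking and movement scores (RMS-style, lower is better)."""
    patient_id: str
    tracking_scores: List[float] = field(default_factory=list)
    movement_scores: List[float] = field(default_factory=list)

def compare_to_baseline(record: PatientRecord, tracking: float,
                        movement: float, tolerance: float = 1.2) -> Optional[str]:
    """Store newly output scores and compare them against stored baselines.
    Assumes the first stored score of each type is the healthy baseline;
    the tolerance factor is an illustrative value only."""
    if not record.tracking_scores or not record.movement_scores:
        # No baseline yet: store these scores as the baseline for future use.
        record.tracking_scores.append(tracking)
        record.movement_scores.append(movement)
        return None
    baseline_tracking = record.tracking_scores[0]
    baseline_movement = record.movement_scores[0]
    record.tracking_scores.append(tracking)
    record.movement_scores.append(movement)
    degraded = (tracking > tolerance * baseline_tracking or
                movement > tolerance * baseline_movement)
    return "review recommended" if degraded else "within baseline range"
```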
  • the system (100) may include the portable device (1), the mobile communication device (102) and a remote server computer (104).
  • the mobile communication device (102) and server computer (104) may be configured to communicate via a suitable communication network (106), such as a local area network and/or the Internet.
  • Communication via the communication network (106) may be secured (e.g. using SSL, IPSec or the like).
  • the server computer (104) may be any suitable computing device performing a server role.
  • the server computer (104) may have access to a database (108) in which a patient record (110) may be stored.
  • the patient record (110) may be associated with a patient identifier uniquely associated with a patient. While only one patient record (110) is illustrated, it should be appreciated that in a practical implementation there may be a plurality of these, i.e. one for each patient registered with the system.
  • One or more of the following may be stored in the patient record: base line scores (for tracking, movement and/or cognitive tests), field scores (for tracking, movement and/or cognitive tests), medical aid information, patient personal information (e.g.
  • the server computer (104) may provide access to the patient record (110) to the mobile communication device (102) as well as authenticated devices, such as computing devices of doctors, parents, coaches of the patient and the like. Access may be subject to the authenticated devices authenticating themselves to the server computer (104) and different users may have different permissions, as defined in the permissions list.
  • the mobile communication device (102) may be any suitable portable device capable of communicating on the communication network (106), such as a mobile phone (e.g. smartphone), tablet computer, personal digital assistant, wearable computing device or the like.
  • the mobile communication device (102) may include a processor (120) for executing the functions of components described below, which may be provided by hardware or by software units executing on the mobile communication device (102).
  • the software units may be stored in a memory component (122) and instructions may be provided to the processor (120) to carry out the functionality of the described components.
  • software units arranged to manage and/or process data on behalf of the mobile communication device (102) may be provided remotely.
  • Some or all of the components may be provided by a software application (123) downloadable onto and executable on the mobile communication device (102).
  • the mobile communication device (102) may include a display (126) arranged to display a virtual environment in which a stimulus point moves in a predefined manner.
  • the mobile communication device (102) may include a camera (128) arranged to output image data relating to movement of an eye of a patient wearing the portable device (1).
  • the camera (128) may be a front-facing camera and the mobile communication device (102) may be shaped and dimensioned to fit within the receiving formation (9) of the portable device (1) such that an aperture of the camera (128) aligns with the opening (37).
  • An optical arrangement of the portable device may operatively direct light, projected onto the eye by an illumination module and reflected by the eye, towards the opening to provide optical communication with the camera.
  • the software application (123) may include an image receiving component (129) arranged to receive image data relating to movement of the eye of the patient from the camera (128).
  • the image data may be received while the portable device mounts the mobile communication device.
  • the software application (123) may include an analysing component (130) arranged to analyse the image data to track movement of the eye.
  • the software application (123) may include a comparing component (132) arranged to compare the movement of the eye to movement of the stimulus point.
  • the software application (123) may include a score outputting component (134) arranged to output a tracking score based on the extent to which movement of the eye matches movement of the stimulus point for use in diagnosing the medical condition.
  • Each of these components may be configured to provide the functionality described above with reference to Figure 8.
  • Figures 11 and 12 are schematic diagrams which illustrate an exemplary user interface which may be provided by the software application (123).
  • the software application (123) resident on the mobile communication device (102) may be configured to allow a new user to create a patient profile.
  • the patient profile may be a personalised profile which stores a patient's individual assessment data in a patient record (110).
  • the software application (123) may be further configured to allow more than one patient profile to be created and managed from the mobile communication device (102).
  • the user (who may be an ambulance first responder, doctor, parent, coach, the patient or the like) may be required to input login information before further features of the software application (123) may be accessed.
  • When creating a new patient profile, the user may be requested to enter the weight, height, date of birth, an email address of the patient, and the like.
  • a patient status label may indicate a health status associated with the patient.
  • the patient status label may automatically indicate that the patient is healthy if the patient is a first-time user and may change thereafter based on actual results associated with the patient.
  • the software application further prompts the user to enter a test phase.
  • the test phase may include an assessment tool which may implement a concussion assessment similar to the HIA Protocol.
  • the HIA Protocol is a three-stage process that includes HIA 1 assessment, HIA 2 assessment and HIA 3 assessment.
  • the Protocol has been developed to support the recognition and diagnosis of concussion.
  • the user is also able to conduct a baseline test on the patient to determine a set of baseline results for future use in concussion diagnosis.
  • the software application (123) will present a sequence of tests and questions to the user (e.g. as illustrated in Figures 11 and 12), enabling diagnosis of the patient. Each test requires a user input in order to progress to the next.
  • the software application (123) may be configured to analyse and evaluate the results and display whether the patient has failed or passed the assessment. The user is then prompted to select a course of action thereby finishing the assessment and returning a result to be stored in the patient record.
  • the software application and associated system described herein may accordingly enable accessible baseline testing and access to a patient's medical history (e.g. concussion history) from anywhere in the world.
  • the software application may be used to monitor players from the day they start playing sport until they potentially become professional players.
  • the software application may also facilitate a balance test using the mobile communication device accelerometer and an eye tracking test, both of which have been shown to be good indicators of concussion.
  • the described device and method enable the front camera sensor of a mobile communication device, such as a smartphone, to be used in combination with an off-the-shelf virtual reality headset and custom lens attachments to perform a smooth pursuit eye tracking test. Studies have shown that smooth pursuit tests have been very successful in detecting concussions.
  • the illumination module and optical arrangement described herein may be configured to be attached to or otherwise integrated with a commercially available virtual reality headset with minimum alteration of the headset being required.
  • the only modification required to the headset is the drilling of the opening providing optical communication to the front facing camera.
  • the application described herein aims to streamline this comparative process, allowing for quick testing in any setting, without the need for acute medical training in concussions.
  • the application may incorporate tests similar to those described in the HIA protocol, which may facilitate quick and effective testing. Test data may be automatically recorded and can be accessed from anywhere in the world using the application.
  • Test digitization may allow for randomization of tests, such as the numbers the athlete is required to memorize during the immediate memory recall test. This may significantly decrease the learning effect and may improve the sensitivity of repeated use. By incorporating quantitative balance and eye tracking performance assessments, the sensitivity of the system may be increased further. This may make effective concussion management accessible to everybody. The detrimental effects of repeated concussion in community/amateur sport may be decreased.
  • mirror as used herein should be construed so as to include any suitable surface which reflects at least selected wavelengths (e.g. infrared and/or near infrared) of electromagnetic waves and, in some (e.g. in the case of the first mirror or major mirror), but not necessarily all, cases which permits passage of at least selected other wavelengths (e.g. visible light) of electromagnetic waves.
  • Some of the mirrors described herein may accordingly have a transparent appearance to a user, while being configured to reflect selected wavelengths of non-visible light.
  • FIG. 13 illustrates an example of a computing device (400) in which various aspects of the disclosure may be implemented.
  • the computing device (400) may be embodied as any form of data processing device including a personal computing device (e.g. laptop or desktop computer), a server computer (which may be self-contained or physically distributed over a number of locations), a client computer, or a communication device, such as a mobile phone (e.g. cellular telephone), satellite phone, tablet computer, personal digital assistant or the like.
  • the computing device (400) may be suitable for storing and executing computer program code.
  • the various participants and elements in the previously described system diagrams may use any suitable number of subsystems or components of the computing device (400) to facilitate the functions described herein.
  • the computing device (400) may include subsystems or components interconnected via a communication infrastructure (405) (for example, a communications bus, a network, etc.).
  • the computing device (400) may include one or more processors (410) and at least one memory component in the form of computer-readable media.
  • the one or more processors (410) may include one or more of: CPUs, graphical processing units (GPUs), microprocessors, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs) and the like.
  • a number of processors may be provided and may be arranged to carry out calculations simultaneously.
  • various subsystems or components of the computing device (400) may be distributed over a number of physical locations (e.g. in a distributed, cluster or cloud-based computing configuration) and appropriate software units may be arranged to manage and/or process data on behalf of remote devices.
  • the memory components may include system memory (415), which may include read only memory (ROM) and random access memory (RAM).
  • System software may be stored in the system memory (415) including operating system software.
  • the memory components may also include secondary memory (420).
  • the secondary memory (420) may include a fixed disk (421), such as a hard disk drive, and, optionally, one or more storage interfaces (422) for interfacing with storage components (423), such as removable storage components (e.g. magnetic tape, optical disk, flash memory drive, external hard drive, removable memory chip, etc.), network attached storage components (e.g. NAS drives), remote storage components (e.g. cloud-based storage) or the like.
  • the computing device (400) may include an external communications interface (430) for operation of the computing device (400) in a networked environment enabling transfer of data between multiple computing devices (400) and/or the Internet.
  • Data transferred via the external communications interface (430) may be in the form of signals, which may be electronic, electromagnetic, optical, radio, or other types of signal.
  • the external communications interface (430) may enable communication of data between the computing device (400) and other computing devices including servers and external storage facilities. Web services may be accessible by and/or from the computing device (400) via the communications interface (430).
  • the external communications interface (430) may be configured for connection to wireless communication channels (e.g., a cellular telephone network, wireless local area network (e.g.
  • the external communications interface (430) may include a subscriber identity module (SIM) in the form of an integrated circuit that stores an international mobile subscriber identity and the related key used to identify and authenticate a subscriber using the computing device (400).
  • One or more subscriber identity modules may be removable from or embedded in the computing device (400).
  • the external communications interface (430) may further include a contactless element (450), which is typically implemented in the form of a semiconductor chip (or other data storage element) with an associated wireless transfer element, such as an antenna.
  • the contactless element (450) may be associated with (e.g., embedded within) the computing device (400) and data or control instructions transmitted via a cellular network may be applied to the contactless element (450) by means of a contactless element interface (not shown).
  • the contactless element interface may function to permit the exchange of data and/or control instructions between computing device circuitry (and hence the cellular network) and the contactless element (450).
  • the contactless element (450) may be capable of transferring and receiving data using a near field communications capability (or near field communications medium) typically in accordance with a standardized protocol or data transfer mechanism (e.g., ISO 14443/NFC).
  • Near field communications capability may include a short-range communications capability, such as radio- frequency identification (RFID), BluetoothTM, infra-red, or other data transfer capability that can be used to exchange data between the computing device (400) and an interrogation device.
  • the computer-readable media in the form of the various memory components may provide storage of computer-executable instructions, data structures, program modules, software units and other data.
  • a computer program product may be provided by a computer-readable medium having stored computer-readable program code executable by the central processor (410).
  • a computer program product may be provided by a non-transient computer-readable medium, or may be provided via a signal or other transient means via the communications interface (430).
  • Interconnection via the communication infrastructure (405) allows the one or more processors (410) to communicate with each subsystem or component and to control the execution of instructions from the memory components, as well as the exchange of information between subsystems or components.
  • Peripherals (such as printers, scanners, cameras, or the like) and input/output (I/O) devices (such as a mouse, touchpad, keyboard, microphone, touch-sensitive display, input buttons, speakers and the like) may be coupled to the computing device (400).
  • One or more displays (445) (which may be touch-sensitive displays) may be coupled to or integrally formed with the computing device (400) via a display or video adapter (440).
  • the computing device (400) may include a geographical location element (455) which is arranged to determine the geographical location of the computing device (400).
  • the geographical location element (455) may for example be implemented by way of a global positioning system (GPS), or similar, receiver module.
  • the geographical location element (455) may implement an indoor positioning system, using for example communication channels such as cellular telephone or Wi-FiTM networks and/or beacons (e.g. BluetoothTM Low Energy (BLE) beacons, iBeaconsTM, etc.) to determine or approximate the geographical location of the computing device (400).
  • the geographical location element (455) may implement inertial navigation to track and determine the geographical location of the computing device (400) using an initial set point and inertial measurement data, as illustrated in the first sketch following this list.
  • a software unit is implemented with a computer program product comprising a non-transient computer-readable medium containing computer program code, which can be executed by a processor for performing any or all of the steps, operations, or processes described.
  • Software units or functions described in this application may be implemented as computer program code using any suitable computer language (such as, for example, JavaTM, C++, or PerlTM) and, for example, conventional or object-oriented techniques; an illustrative example is given in the second sketch following this list.
  • the computer program code may be stored as a series of instructions or commands on a non-transitory computer-readable medium, such as a random access memory (RAM), a read-only memory (ROM), a magnetic medium such as a hard-drive, or an optical medium such as a CD-ROM. Any such computer-readable medium may also reside on or within a single computational apparatus, and may be present on or within different computational apparatuses within a system or network.
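First sketch. The inertial-navigation option mentioned in the list above can be pictured with a minimal dead-reckoning routine in Java. This is illustrative only and forms no part of the specification: the class name InertialTracker, its methods and the two-dimensional simplification are assumptions, and a practical implementation would add sensor fusion (gyroscope, magnetometer) and drift correction.

    // Minimal dead-reckoning sketch: integrates acceleration twice to update a
    // 2-D position estimate from an initial set point. Hypothetical names;
    // real systems would fuse several sensors and correct for drift.
    public final class InertialTracker {

        private double x, y;    // position estimate in metres
        private double vx, vy;  // velocity estimate in metres per second

        public InertialTracker(double initialX, double initialY) {
            this.x = initialX;  // initial set point, e.g. a last known GPS fix
            this.y = initialY;
        }

        /** Update with one accelerometer sample (m/s^2) taken dt seconds after the previous one. */
        public void update(double ax, double ay, double dt) {
            vx += ax * dt;
            vy += ay * dt;
            x += vx * dt;
            y += vy * dt;
        }

        public double getX() { return x; }
        public double getY() { return y; }
    }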
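Second sketch. As a purely illustrative example of a software unit expressed as computer program code, the following Java fragment computes the peak angular velocity of a tracked eye movement from timestamped gaze angles, a quantity commonly used when characterising saccades. The class name, method signature and choice of metric are assumptions made for illustration and are not drawn from the specification.

    // Illustrative software unit: peak angular velocity (degrees per second)
    // of an eye movement, computed from timestamped gaze angles.
    public final class SaccadeVelocityUnit {

        /**
         * @param timesMs   sample timestamps in milliseconds
         * @param anglesDeg gaze angle at each timestamp, in degrees
         * @return          peak angular velocity in degrees per second
         */
        public static double peakVelocity(long[] timesMs, double[] anglesDeg) {
            double peak = 0.0;
            for (int i = 1; i < anglesDeg.length; i++) {
                double dt = (timesMs[i] - timesMs[i - 1]) / 1000.0;  // seconds
                if (dt <= 0) continue;                               // skip malformed samples
                double v = Math.abs(anglesDeg[i] - anglesDeg[i - 1]) / dt;
                peak = Math.max(peak, v);
            }
            return peak;
        }
    }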

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Biomedical Technology (AREA)
  • Human Computer Interaction (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention relates to systems and a method for use in diagnosing a medical condition. The device includes a housing arranged to mount a communication device on a patient's head and to form an optical path between an eye of the patient and a display zone in which a display of the communication device is operatively located. The housing includes a lens located in the optical path and arranged to facilitate focusing of the patient's eye on the display zone. The housing further includes an illumination module arranged to project light onto the eye, and an optical arrangement. The optical arrangement is arranged to direct light, which has been projected onto the eye by the illumination module and reflected by the eye, towards an aperture operatively arranged to permit optical communication with a camera of the communication device, thereby enabling the communication device to track movement of the eye.
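As a hedged illustration of how eye movement might be tracked in software from camera frames of the kind made available through the aperture described above (not part of the published abstract or claims), the following self-contained Java sketch estimates a pupil centre as the centroid of dark pixels in an 8-bit grayscale frame. The class and method names are invented, and practical eye trackers use considerably more robust techniques.

    // Hypothetical sketch: estimate the pupil centre in an 8-bit grayscale frame
    // by treating all pixels darker than a threshold as pupil pixels and
    // returning the centroid of that region.
    public final class PupilLocator {

        /** Returns {cx, cy} in pixel coordinates, or null if no dark region is found. */
        public static double[] darkCentroid(byte[] gray, int width, int height, int threshold) {
            long sumX = 0, sumY = 0, count = 0;
            for (int row = 0; row < height; row++) {
                for (int col = 0; col < width; col++) {
                    int value = gray[row * width + col] & 0xFF;  // unsigned pixel value
                    if (value < threshold) {
                        sumX += col;
                        sumY += row;
                        count++;
                    }
                }
            }
            return count == 0 ? null : new double[] { (double) sumX / count, (double) sumY / count };
        }
    }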
PCT/IB2018/055767 2017-09-04 2018-08-01 Dispositifs et procédés destinés à être utilisés dans le diagnostic d'une affection médicale WO2019043475A2 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
ZA2020/00774A ZA202000774B (en) 2017-09-04 2020-02-05 Devices and methods for use in diagnosing a medical condition

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ZA201705983 2017-09-04
ZA2017/05983 2017-09-04

Publications (2)

Publication Number Publication Date
WO2019043475A2 true WO2019043475A2 (fr) 2019-03-07
WO2019043475A3 WO2019043475A3 (fr) 2019-05-09

Family

ID=63405283

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2018/055767 WO2019043475A2 (fr) 2017-09-04 2018-08-01 Dispositifs et procédés destinés à être utilisés dans le diagnostic d'une affection médicale

Country Status (2)

Country Link
WO (1) WO2019043475A2 (fr)
ZA (1) ZA202000774B (fr)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9439592B2 (en) 2012-05-18 2016-09-13 Sync-Think, Inc. Eye tracking headset and system for neuropsychological testing including the detection of brain damage

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9101312B2 (en) * 2012-04-18 2015-08-11 TBI Diagnostics LLC System for the physiological evaluation of brain function
US20140104692A1 (en) * 2012-10-11 2014-04-17 Sony Computer Entertainment Europe Limited Head mountable display
WO2015051660A1 (fr) * 2013-10-13 2015-04-16 北京蚁视科技有限公司 Afficheur stéréoscopique facial
US10096162B2 (en) * 2014-12-22 2018-10-09 Dimensions And Shapes, Llc Headset vision system for portable devices that provides an augmented reality display and/or a virtual reality display
US9791924B2 (en) * 2014-12-23 2017-10-17 Mediatek Inc. Eye tracking with mobile device in a head-mounted display
CN106019597A (zh) * 2016-07-27 2016-10-12 北京小米移动软件有限公司 虚拟现实眼镜

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9439592B2 (en) 2012-05-18 2016-09-13 Sync-Think, Inc. Eye tracking headset and system for neuropsychological testing including the detection of brain damage

Also Published As

Publication number Publication date
ZA202000774B (en) 2021-01-27
WO2019043475A3 (fr) 2019-05-09

Similar Documents

Publication Publication Date Title
US11755102B2 (en) User interface interaction paradigms for eyewear device with limited field of view
US9004687B2 (en) Eye tracking headset and system for neuropsychological testing including the detection of brain damage
US20200057495A1 (en) Eye-brain interface (ebi) system and method for controlling same
JP7106569B2 (ja) ユーザーの健康状態を評価するシステム
US20150045012A1 (en) Ophthalmic adapter for personal electronics
CN106575039A (zh) 具有确定用户眼镜特性的眼睛跟踪设备的平视显示器
US20210196174A1 (en) Apparatus, method and program for determining a cognitive state of a user of a mobile device
US20190328305A1 (en) System and method for testing a condition of the nervous system using virtual reality technology
US10709328B2 (en) Main module, system and method for self-examination of a user's eye
US9888845B2 (en) System and method for optical detection of cognitive impairment
WO2016114496A1 (fr) Procédé procurant une interface utilisateur au moyen d'un affichage porté sur la tête, utilisant la reconnaisance oculaire et un biosignal, appareil l'utilisant et support d'enregistrement lisible par ordinateur
CN104146684A (zh) 一种眼罩式眩晕检测仪
KR102304369B1 (ko) Vr을 이용한 안과검사방법 및 시스템
CN113197542B (zh) 一种在线自助视力检测系统、移动终端及存储介质
JP2017191546A (ja) 医療用ヘッドマウントディスプレイ、医療用ヘッドマウントディスプレイのプログラムおよび医療用ヘッドマウントディスプレイの制御方法
CN114846788A (zh) 使用用于移动设备的附加结构的增强型动眼神经测试设备和方法
CN113138664A (zh) 基于光场感知的眼球追踪系统、方法
WO2019043475A2 (fr) Dispositifs et procédés destinés à être utilisés dans le diagnostic d'une affection médicale
JP2015123262A (ja) 角膜表面反射画像を利用した視線計測方法及びその装置
KR102312358B1 (ko) 사용자가 수행한 호흡 미션 결과를 이용한 경도인지 장애 진단 및 훈련 시스템
KR102458553B1 (ko) 눈 건강 측정 디바이스 및 방법
US20230210363A1 (en) Infrared tele-video-oculography for remote evaluation of eye movements
KR102204112B1 (ko) 동공과 홍채를 이용한 이석증 질병예측정보를 제공하는 방법
KR102189783B1 (ko) 이석증의 질병예측정보 표시방법
Sharma et al. Requirement analysis and sensor specifications–First version

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18760050

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18760050

Country of ref document: EP

Kind code of ref document: A2