WO2019132780A1 - System and method for obtaining data associated with an injury - Google Patents

System and method for obtaining data associated with an injury

Info

Publication number
WO2019132780A1
WO2019132780A1 · PCT/SG2018/050636
Authority
WO
WIPO (PCT)
Prior art keywords
information
injury
image
input
operable
Prior art date
Application number
PCT/SG2018/050636
Other languages
English (en)
French (fr)
Inventor
Ee Sian NEO
Original Assignee
Tetsuyu Healthcare Holdings Pte. Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tetsuyu Healthcare Holdings Pte. Ltd.
Priority to JP2020530384A, published as JP2021509298A
Priority to CN201880077241.2A, published as CN111742374A
Publication of WO2019132780A1

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/40: ICT specially adapted for the handling or processing of medical images, for processing medical images, e.g. editing
    • G16H40/63: ICT specially adapted for the management or operation of medical equipment or devices, for local operation
    • G16H40/67: ICT specially adapted for the management or operation of medical equipment or devices, for remote operation
    • G16H50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • the present invention relates to a system and method for obtaining data associated with an injury, in particular, an injury image for assessment of the injury.
  • the present invention seeks to provide a system and method that addresses the aforementioned need at least in part.
  • the technical solution is provided in the form of a system and method for obtaining data associated with an injury.
  • the data includes at least one of an injury image, information such as size, colour, location and temperature, and clinical information associated with the injury.
  • the system is suited for obtaining the data in the form of one or more injury image(s).
  • the system comprises an information system operable to store information relating to the injury; at least one sensor operable to sense the injury; and a device operable to control the sensor, obtain sensing data from the sensor, and display the data associated with the injury based on the sensing data, wherein the information relating to the injury is used for controlling the sensor to obtain the sensing data of a characteristic of the injury.
  • the invention seeks to improve the accuracy and efficiency of the injury assessment by obtaining accurate and proper data associated with the injury using machine intelligence and the real-time assistance of experienced medical staff, who may be located remotely.
  • the machine intelligence may include one or more learning algorithms based on supervised or unsupervised learning models.
  • the invention further seeks to reduce time of exposure of the injury during the injury assessment.
  • a system for obtaining data associated with an injury comprising: an information system operable to store information relating to the injury; at least one sensor operable to sense the injury; and a device operable to control the sensor, obtain sensing data from the sensor, and display the data associated with the injury based on the sensing data, wherein the information relating to the injury is used for controlling the sensor to obtain the sensing data of a characteristic of the injury.
  • the device is operable to set a sensing parameter using the information relating to the injury to operate the sensor.
  • the data includes at least one of an injury image, information such as size, colour, location and temperature, and clinical information associated with the injury.
  • the device is operable to generate interactive information of the injury using the information relating to the injury and the sensing data, and display the interactive information with the injury image.
  • the device is operable to receive a first input on the injury image or the interactive information from a first user, and modify or add the interactive information based on the first input.
  • the system further comprises a remote device operable to receive the modified or added interactive information, receive a second input on the injury image or the received interactive information from a second user, and modify or add the received interactive information based on the second input.
  • the modified or added interactive information based on the second input is stored on the information system for future use.
  • the first input and the second input comprise at least one of a touch input, a key input, a gesture input and a voice input.
  • the device is operable to generate 3D reconstructed information using 3D information acquired by at least one 3D imaging sensor, and display the 3D reconstructed information with at least one of the injury image, an anatomy image and the interactive information.
  • the device is operable to toggle at least one of the interactive information and the injury image, based on the users’ input.
  • the device is operable to display the interactive information overlaying on the injury image in a mixed reality format.
  • the device is operable to display the interactive information beside the injury image.
  • the device is operable to display a historical trend of the injury.
  • a previous injury image is overlapped with a current injury image.
  • the interactive information comprises machine intelligence generated suggestions.
  • the suggestions include description of the injury and information of medical supplies for the injury.
  • the information system comprises at least one of a local database and a remote database.
  • the information stored on the information system comprises at least one of the following information: injury information, patient information, environment information, medication product information and algorithms for machine intelligence.
  • the sensor is installed in the device.
  • the device comprises a first device and a second device, the sensor is installed in the first device, the first device is operable to control the sensor using the information relating to the injury, and the second device is operable to obtain the sensing data from the first device, display the data associated with the injury based on the sensing data, generate the interactive information of the injury using the information relating to the injury and the sensing data, and display the interactive information with the data associated with the injury.
  • a method for obtaining data associated with an injury comprising: storing information relating to the injury in an information system; sensing, by at least one sensor, the injury; controlling, by a device, the sensor; obtaining, at the device, sensing data from the sensor; and displaying, on the device, the data associated with the injury based on the sensing data, wherein the information relating to the injury is used for controlling the sensor to obtain the sensing data of a characteristic of the injury.
  • the method further comprises a step of: setting, by the device, a sensing parameter using the information relating to the injury to operate the sensor.
  • the data includes at least one of an injury image, information such as size, colour, location and temperature, and clinical information associated with the injury.
  • the method further comprises steps of: generating, by the device, interactive information of the injury using the information relating to the injury and the sensing data; and displaying, on the device, the interactive information with the injury image.
  • the method further comprises steps of: receiving, by the device, a first input on the injury image or the interactive information from a first user; and modifying or adding, by the device, the interactive information based on the first input.
  • the method further comprises steps of: receiving, at a remote device, the modified or added interactive information; receiving, at the remote device, a second input on the injury image or the received interactive information from a second user; and modifying or adding, by the remote device, the received interactive information based on the second input.
  • the modified or added interactive information based on the second input is stored on the information system for future use.
  • the first input and the second input comprise at least one of a touch input, a key input, a gesture input and a voice input.
  • the device is operable to generate 3D reconstructed information using 3D information acquired by at least one 3D imaging sensor, and display the 3D reconstructed information with at least one of the injury image, an anatomy image and the interactive information.
  • the device is operable to toggle at least one of the interactive information and the injury image, based on the users’ input.
  • the device is operable to display the interactive information overlaying on the injury image in a mixed reality format.
  • the device is operable to display the interactive information beside the injury image.
  • the device is operable to display a historical trend of the injury.
  • a previous injury image is overlapped with a current injury image.
  • the interactive information comprises machine intelligence generated suggestions.
  • the suggestions include description of the injury and information of medical supplies for the injury.
  • the information system comprises at least one of a local database and a remote database.
  • the information stored on the information system comprises at least one of the following information: injury information, patient information, environment information, medication product information and algorithms for machine intelligence.
  • the sensor is installed in the device.
  • the device comprises a first device and a second device, the sensor is installed in the first device, the first device is operable to control the sensor using the information relating to the injury, and the second device is operable to obtain the sensing data from the first device, display the data associated with the injury based on the sensing data, generate the interactive information of the injury using the information relating to the injury and the sensing data, and display the interactive information with the data associated with the injury.
  • Fig. 1 illustrates a block diagram of an information system, a device and a remote device in accordance with an embodiment of the invention.
  • Fig. 2 illustrates a flow diagram of an information system, a device and a remote device in accordance with an embodiment of the invention.
  • Fig. 3 illustrates a block diagram of an information system, a first device, a second device and a remote device in accordance with another embodiment of the invention.
  • Fig. 4 illustrates a flow diagram of an information system, a first device, a second device and a remote device in accordance with another embodiment of the invention.
  • Figs. 5 to 7 illustrate examples of displaying wound image and/or interactive information in accordance with various embodiments of the invention.
  • Figs. 8 and 9 illustrate examples of modifying the interactive information in accordance with various embodiments of the invention.
  • Figs. 10 to 14 illustrate examples of interfaces showing 3D reconstructed information in accordance with various embodiments of the invention.
  • Figs. 15 to 19 illustrate examples of interfaces in an interactive mode in accordance with various embodiments of the invention.
  • Other arrangements of the invention are possible and, consequently, the accompanying drawings are not to be understood as superseding the generality of the preceding description of the invention.
  • Fig. 1 illustrates a block diagram of an information system, a device and a remote device in accordance with an embodiment of the invention.
  • the injury may include, but not be limited to a wound, bruise, abrasion, fracture and/or combinations of two or more of the aforementioned injuries.
  • the data may be in the form of, for example, an injury image, for example a wound image.
  • the system comprises an information system 100 and at least one device 200.
  • the system may further comprise at least one remote device 300.
  • the information system 100 may include a local database on a device itself, for example a local database on the device 200, the remote device 300 or other devices.
  • the information system 100 may be an on-premises server located at the same location as the device 200 and connected through a local area network.
  • the information system 100 may be a remote server or cloud server which is connected to the device 200 or the remote device 300 through internet communication or any other types of communication.
  • the information system 100 comprises at least one database serving different functions or multiple databases each serving a specific function.
  • the information system 100 may comprise an injury database 110 such as a wound model database, a patient information database 120, a product database 130 and an algorithm database 140.
  • the information system 100 therefore stores specific and holistic information about one or more patients and the environment to provide the context for obtaining a wound image for the specific wound assessment.
  • the wound model database 110 contains physical attributes of the wound, for example size, colour, images, location and temperature, and clinical information, for example disease profile and correlation information.
  • the patient information database 120 includes at least one of the following data related to the patient: historical data of the patient’s health condition, allergy-related information, disease history, medication history, vital signs, physical attributes, temperature, skin conditions, anatomy-related information such as body maps, and 3D models of the patient’s body.
  • the patient information database 120 may further include environmental information of the patient, such as condition of the room where the patient is, a map of the room, and information and/or models relating to objects such as furniture.
  • the condition of the room may include, but not be limited to lighting, temperature and humidity, and the information and/or models regarding the furniture may include, but not be limited to bed, chair and table information.
  • the patient information database 120 may be updated periodically or non-periodically (e.g. on-demand).
  • the patient information database 120 may update a vital sign of the patient in real time.
  • the vital sign of the patient may be sensed through the device 200, the remote device 300 or other devices such as a vital-sign sensing device.
  • the vital sign of the patient may be manually inputted by the users.
  • the vital sign information may be obtained from other databases, for example electronic medical records storing the historical data of the vital signs of the patient.
  • the product database 130 contains at least one of inventory information of products, information related to cost and/or performance of the products, and clinical information related to the products.
  • the products may include, but not be limited to medical supplies.
  • the algorithm database 140 contains at least one of data processing algorithms, control algorithms for software and/or hardware components, wound assessment tools and/or algorithms, and analytics algorithms.
  • the data processing algorithms may include image processing algorithms.
  • the algorithms are utilized to adjust and tune a sensing parameter to be used for sensing. For the purposes of 3D measurements and colour image processing of the injury based on the information, specific image processing algorithms are selected and utilized. The algorithms are also used to determine the assessment methodology of the injury: each algorithm is used to assess a corresponding injury. For example, an “algorithm A” is used to assess a specific type of “injury A” based on the patient information, medical history and/or injury information (a sketch of such a mapping follows below). Based on the information related to the patient, the sensor data, the assessment made by machine intelligence or the user’s input, and the medical product information, the algorithms are utilized to provide suggestions related to interventions and medications used for managing the injury.
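To make the “algorithm A assesses injury A” mapping concrete, the following is a minimal illustrative sketch. It is not taken from the patent: all names are hypothetical, and a real implementation would populate the registry from the algorithm database 140.

```python
# Hypothetical sketch of the injury-to-algorithm mapping held in the
# algorithm database 140; the patent does not prescribe an implementation.

def assess_pressure_ulcer(sensing_data, patient_info):
    # Placeholder assessment logic for one injury type.
    return {"injury": "pressure ulcer", "area_cm2": sensing_data.get("area_cm2")}

def assess_abrasion(sensing_data, patient_info):
    return {"injury": "abrasion", "severity": "minor"}

ASSESSMENT_ALGORITHMS = {
    "pressure_ulcer": assess_pressure_ulcer,  # "algorithm A" for "injury A"
    "abrasion": assess_abrasion,
}

def select_algorithm(injury_type: str):
    """Return the assessment algorithm registered for this injury type."""
    if injury_type not in ASSESSMENT_ALGORITHMS:
        raise KeyError(f"no assessment algorithm for {injury_type!r}")
    return ASSESSMENT_ALGORITHMS[injury_type]
```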
  • the information system 100 comprises at least one controller (not shown).
  • the controller may include an artificial intelligence (AI) module.
  • the controller may be arranged in signal/data communication with the algorithm database 140 to utilize one or more algorithms to generate information associated with the wound.
  • the system may further comprise an application provider (not shown).
  • the application provider provides an application or a service to at least one device related to the described invention.
  • the application is installable on at least one device.
  • the application provider may include at least one of an application server and an application database.
  • the application provider may be located remotely or independent from the information system 100. In some embodiments, the application provider may be the information system 100.
  • the application may be executed on at least one of the device 200 and the remote device 300.
  • the devices may communicate with each other and with the information system 100.
  • the device 200 may be at the patient’s site and the remote device 300 may be at a remote site.
  • the device 200 and the remote device 300 may include, but not be limited to, the following devices: mobile devices such as mobile phones and tablets, head-mounted devices, robotic platforms, laptops and computers.
  • at least one sensor is operable to sense the injury, for example a wound, for the measurement of the characteristics of the wound.
  • the device 200 controls the sensor and obtains sensing data from the sensor.
  • the information relating to the wound stored in the information system 100 is used for controlling the sensor to obtain the sensing data of a characteristic of the wound.
  • the device 200 therefore is able to obtain the data associated with the wound, for example imaging information which focuses on the characteristic such as an area and colour of the wound.
  • the characteristic of the wound may include, but not be limited to a characteristic of interest of the wound. For example, if the recovery of the left side of a wound is slow, the device 200 may control the sensor to obtain the sensing data on the left side of the wound in greater detail. As another example, if a patient’s skin colour is similar to the wound’s colour, the device 200 may control the sensor to obtain sensing data with more colour information.
  • the sensor may include, but not be limited to a 2D imaging sensor and a 3D imaging sensor.
  • the sensor may be attached or mounted to the device 200. It should be appreciated that a camera or an infrared sensor of the device 200 may be utilized to sense the wound. It is to be appreciated that other sensors or sensing modalities, for example biochemical sensor, thermal sensor and ultrasonic sensor, may be utilized individually or in combination to sense the presence of a wound, as well as the characteristic of the wound.
  • the sensor may be external and not attached or mounted to the device 200.
  • the sensor may be attached to a body of the user, for example patient, clinician and non-clinician.
  • the sensor may be attached to or installed in the robotic mechanism or in the environment such as bed or stand.
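Since the description allows several sensing modalities to be used individually or in combination, one plausible way to model them is behind a common interface. The sketch below is purely illustrative; the class and method names are assumptions, not part of the patent.

```python
from abc import ABC, abstractmethod

class Sensor(ABC):
    """Hypothetical common interface over the modalities listed above
    (2D/3D imaging, biochemical, thermal, ultrasonic)."""

    @abstractmethod
    def sense(self, params: dict) -> dict:
        """Return raw sensing data for the given sensing parameters."""

class ThermalSensor(Sensor):
    def sense(self, params):
        # Placeholder reading; a real driver would query hardware here.
        return {"modality": "thermal", "temperature_c": 36.8}

class Imaging2DSensor(Sensor):
    def sense(self, params):
        return {"modality": "2d_image", "roi": params.get("roi")}

def sense_combined(sensors, params):
    """Use several sensors in combination, as the description permits."""
    return [s.sense(params) for s in sensors]
```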
  • Fig. 2 illustrates a flow diagram of an information system, a device and a remote device in accordance with an embodiment of the invention.
  • the system may further comprise the at least one remote device 300.
  • the device 200 comprises at least one sensor for sensing the injury, for example wound.
  • the device 200 may be at the patient’s site and the remote device 300 may be at the remote site.
  • the users may include clinicians and non-clinicians such as care-givers, family members of the patient, the patient him/herself, or any other person who needs to use the device 200 or the remote device 300 to assist in the assessment of the wound condition.
  • there may be non-clinicians including the patient at the patient’s site and there may be clinicians at the remote site.
  • clinicians and non-clinicians may use the device 200 to assist in capturing the condition of the wound at the patient’s site.
  • the clinicians may assess the wound condition based on clinical protocol at the remote site.
  • the information system 100 sends information related to the injury, for example a wound (S110).
  • the information system 100 stores information related to the wound.
  • the information comprises at least one of the following information: the wound information, the patient information, the environment information, wound medication product information and algorithms for implementation of machine intelligence such as image processing algorithms, pattern recognition algorithms, learning algorithms, control algorithms, injury assessment algorithms, etc.
  • the device 200 receives the information related to the wound from the information system 100. It should be appreciated that the device 200 may receive the information from the information system 100 through at least one of internet communication, Bluetooth™ communication, near-field communication (NFC), local area network communication, etc.
  • the device 200 controls the sensor using the information (S120).
  • the device 200 is operable to set a sensing parameter using the information to operate the sensor.
  • the algorithms are utilized to adjust and tune the sensing parameter to be used for sensing.
  • a controller of the device 200 may decide on one or more sensors to be used to capture a specific attribute of the wound, as well as the sensing parameter to be used for the sensing.
  • a machine-controllable mechanism may be attached to the one or more sensors.
  • a control parameter may be set to control the sensor attached to the mechanism for the sensing.
  • specific image processing algorithms are utilized for the sensing. If there is a plurality of sensors, the device 200 may control at least one sensor among the plurality of sensors based on the information. Thereafter, the sensor is operable to sense the wound based on the sensing parameter, in particular a characteristic of the wound, for the measurement of the characteristics of the wound (S130). A sketch of deriving such a sensing parameter follows below.
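As an illustration of steps S120/S130, the sketch below derives sensing parameters from the stored wound information, following the two examples given earlier (a slowly recovering region, and low wound-to-skin colour contrast). The field names are invented for the example; the patent does not define a parameter schema.

```python
def derive_sensing_parameters(wound_info: dict) -> dict:
    """Hypothetical mapping from stored wound information to sensing
    parameters (S120); field names are illustrative only."""
    params = {"resolution": "standard", "colour_depth_bits": 8, "roi": None}

    # Example from the description: a slowly recovering left side is
    # captured in greater detail.
    if wound_info.get("slow_recovery_side") == "left":
        params["roi"] = "left_half"
        params["resolution"] = "high"

    # Example: skin colour similar to wound colour -> capture more colours.
    if wound_info.get("skin_wound_colour_contrast", 1.0) < 0.2:
        params["colour_depth_bits"] = 12

    return params
```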
  • the device 200 may obtain or receive the sensing data (S140). As earlier described, the device 200 may obtain or receive the sensing data of a characteristic of the wound from the sensor.
  • the device 200 may generate interactive information for the wound using the information related to the wound and the sensing data (S150).
  • the interactive information may comprise machine intelligence generated suggestions provided from an interactive interface of the device 200.
  • the suggestions include description of the wound and information of medical supplies (also referred to as “products”) for the wound.
  • the interactive information may also comprise non-suggestive information such as a reference frame of measurements.
  • the interactive information may be provided to the users in the form of 3D reconstructed information showing measuring results of the wound or 3D features of the wound, to assist the users in assessing the wound.
  • the interactive information may be provided to the users in the form of an image overlay in real-time based on sensor-driven assessment and the users’ input, as described below.
  • the device 200 displays the data associated with the wound, for example a wound image (S160).
  • the data includes at least one of the wound image, information such as size, colour, location and temperature, and clinical information associated with the wound.
  • the device 200 further displays the interactive information with the wound image (S170).
  • the device 200 may control the display of the interactive information relative to the wound image.
  • the device 200 is operable to toggle at least one of the interactive information and the wound image, based on the users’ preference, for example the users’ input.
  • the device 200 is operable to display the interactive information overlaying on the wound image.
  • the interactive information is displayed as a colour image.
  • the device 200 is operable to display the interactive information beside the wound image, pertaining to the descriptions of the wound appearance according to the sensing data of the wound.
  • the users are able to control the reference images to be displayed. For example, the users are able to choose a different set of reference images through at least one of a touch input, key input, gesture input and voice input.
  • the device 200 is operable to generate 3D reconstructed information using 3D information acquired from one or more 3D imaging sensors.
  • the device 200 is further operable to display the 3D reconstructed information, for example 3D reconstructed image, in a virtual reality mode.
  • the 3D reconstructed image may be projected in a purely virtual 3D environment.
  • the users are able to interact with the 3D reconstructed image in the 3D environment.
  • the virtual reality mode may be used for interactive operations for the enhancement of user experience.
  • the device 200 receives an input (hereinafter referred to as “first input”) on the interactive information or the wound image from the user (hereinafter referred to as “first user”) (S180).
  • the first user may include non-clinicians such as care-givers, family members of the patient, patient him/herself, or any other person.
  • the first user may include clinicians.
  • the device 200 may modify the interactive information based on the first input (S190). It should be appreciated that the device 200 may delete the interactive information or add new interactive information.
  • the device 200 sends the modified or added interactive information (hereinafter referred to as “first modified interactive information”) to the information system 100 (S200).
  • the information system 100 may store the first modified interactive information for the wound.
  • the information system 100 may send the first modified interactive information to the remote device 300 (S210).
  • the information such as the wound information, the patient information, the environment information, wound medication product information and algorithms for machine intelligence may be sent to the remote device 300.
  • the remote device 300 is operable to display the first modified interactive information and/or the wound image.
  • the remote device 300 may control the display of the first modified interactive information relative to the wound image, for example toggle at least one of the interactive information and the wound image based on the second users’ preference.
  • the remote device 300 is operable to display the first modified interactive information overlaying on the wound image in a mixed reality format.
  • the first modified interactive information is displayed as a colour image.
  • the remote device 300 is operable to display the first modified interactive information beside the wound image, pertaining to the descriptions of the wound appearance according to the sensing data of the wound.
  • the user (hereinafter referred to as “second user”) is able to control the reference images to be displayed.
  • the second user is able to choose a different set of reference images through at least one of a touch input, key input, gesture input and voice input.
  • the remote device 300 is operable to generate 3D reconstructed information using 3D information acquired from one or more 3D imaging sensors.
  • the remote device 300 is further operable to display the 3D reconstructed information, for example 3D reconstructed image, in a virtual reality mode.
  • the 3D reconstructed image may be projected in a purely virtual 3D environment.
  • the users are able to interact with the 3D reconstructed image in the 3D environment.
  • the virtual reality mode may be used for interactive operations for the enhancement of user experience.
  • the remote device 300 receives an input (hereinafter referred to as “second input”) on the first modified interactive information or the wound image from the second user (S220).
  • the second user may include clinicians who are remote from the patient.
  • the second user may include non-clinicians such as care-givers, family members of the patient, or any other person who are remote from the patient.
  • the remote device 300 may modify the first modified interactive information based on the second input (S230). It should be appreciated that the remote device 300 may delete the first modified interactive information or add new interactive information.
  • the remote device 300 sends the modified or added interactive information (hereinafter referred to as “second modified interactive information”) to the information system 100 (S240).
  • the information system 100 may store the second modified interactive information as the information for future analysis.
  • the objective of the remote device 300 is to provide a means for the second user, for example a clinician at the remote site, to conduct and monitor the assessment from afar without the constraint of location and distance.
  • the second user may modify, from the remote site, the assessment done by the first user, for example a non-clinician.
  • the described invention allows the information related to the wound to be stored both locally and remotely, for modification and/or assessment in real-time or in an on-demand format.
  • the described invention may be integrated seamlessly with tele-medicine infrastructure such as tele-presence and other remote sensors for remote communication and assessment.
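The S180 to S240 round trip described above amounts to successive edits of one record, stored for future analysis. The sketch below is a condensed, hypothetical rendering of that flow; the dataclass and its fields are not defined by the patent.

```python
from dataclasses import dataclass, field

@dataclass
class InteractiveInfo:
    """Hypothetical record of the interactive information and its edits."""
    description: str
    supplies: list
    edits: list = field(default_factory=list)  # audit trail for future analysis

def apply_input(info: InteractiveInfo, user: str, new_description: str):
    """Modify the interactive information based on a user's input
    (S190 on the device, S230 on the remote device)."""
    info.edits.append((user, info.description))
    info.description = new_description
    return info

# Condensed flow: machine-generated info, then first and second inputs.
info = InteractiveInfo("Granulating wound, approx. 3 cm", ["foam dressing"])
info = apply_input(info, "first user (device 200)", "Granulating wound, 3.2 cm")
info = apply_input(info, "second user (remote device 300)",
                   "Granulating wound, 3.2 cm, low exudate")
# Both modified versions would be sent to the information system 100
# for storage (S200, S240).
```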
  • Figs. 3 and 4 illustrate a block diagram and a flow diagram of an information system, a first device, a second device and a remote device in accordance with another embodiment of the invention.
  • the system comprises the information system 100, the at least one device 200 and the at least one remote device 300.
  • the device 200 comprises at least one first device 210 and at least one second device 220.
  • the first device 210, the second device 220 and the remote device 300 may communicate with each other and with the information system 100.
  • the sensor (not shown) is installed in or attached to the first device 210.
  • the first device 210 may be at the patient’s site and the remote device 300 may be at a remote site.
  • the second device 220 may be at the patient’s site.
  • the second device 220 may be at a remote site.
  • the first device 210, the second device 220 and the remote device 300 may include, but not be limited to, the following devices: mobile devices such as mobile phones and tablets, head-mounted devices, robotic platforms, laptops and computers.
  • the information system 100 sends information related to the injury, for example a wound, to the second device 220 (S110).
  • the second device 220 controls the first device 210 using the information related to the wound (S120).
  • the second device 220 is operable to set a sensing parameter using the information related to the wound to operate the first device 210 so that the first device 210 controls the sensor.
  • the sensor of the first device 210 is operable to sense the wound, in particular a characteristic of the wound, for the measurement of the characteristics of the wound (S130).
  • the first device 210 may obtain the sensing data of a characteristic of the wound and display data associated with the wound, for example a wound image based on the sensing data.
  • the data includes at least one of the wound image, information such as size, colour, location and temperature, and clinical information associated with the wound.
  • the first device 210 may send the sensing data obtained from the sensor to the second device 220 (S140).
  • the second device 220 may generate interactive information for the wound using the information and the sensing data (S150).
  • the interactive information may comprise machine intelligence generated suggestions provided from an interactive interface of the second device 220. It should be appreciated that the interactive information may also comprise non-suggestive information such as reference frame of measurements.
  • the second device 220 displays the wound image (S160).
  • the second device 220 further displays the interactive information with the wound image (S170).
  • the second device 220 may control the display of the interactive information relative to the wound image.
  • the second device 220 receives an input (hereinafter referred to as “first input”) on the interactive information or the wound image from the user (hereinafter referred to as “first user”) (S180). Thereafter, the second device 220 may modify the interactive information based on the first input (S190). It should be appreciated that the second device 220 may delete the interactive information or add new interactive information.
  • the second device 220 sends the modified or added interactive information (hereinafter referred to as “first modified interactive information”) to the information system 100 (S200).
  • the information system 100 may store the first modified interactive information for the wound.
  • the information system 100 may send the first modified interactive information to the remote device 300 (S210).
  • the information such as the wound information, the patient information, the environment information, wound medication product information and algorithms for machine intelligence may be sent to the remote device 300.
  • the remote device 300 receives an input (hereinafter referred to as “second input”) on the first modified interactive information or the wound image from the user (hereinafter referred to as “second user”) (S220). Thereafter, the remote device 300 may modify the first modified interactive information based on the second input (S230). It should be appreciated that the remote device 300 may delete the first modified interactive information or add new interactive information.
  • the remote device 300 sends the modified or added interactive information (hereinafter referred to as “second modified interactive information”) to the information system 100 (S240).
  • the information system 100 may store the second modified interactive information as the information for future analysis.
  • Figs. 5 to 7 illustrate examples of displaying wound image and/or interactive information on the device 200 in accordance with various embodiments of the invention. It should be appreciated that Figs. 5, 6 and 7 may be also applied to the remote device 300.
  • the device 200 is operable to obtain the sensing data from the sensor and display the wound image 201 as shown in Fig. 5(a).
  • the device 200 is further operable to generate interactive information for the wound using the information obtained from the information system 100 and the sensing data.
  • the interactive information may comprise machine intelligence generated suggestions provided from an interactive interface of the device 200. It should be appreciated that the interactive information may also comprise non-suggestive information such as reference frame of measurements.
  • the suggestions include description of the wound and information of medical supplies for the wound.
  • the suggestions of the medical supplies to be applied may be derived based on at least one of the wound assessment results, the inventory information of the medical supplies and cost-performance of the medical supplies stored in the product database 130 of the information system 100.
  • the suggestions of the medical supplies to be applied may also be derived based on patient’s information such as allergy, medication history and disease profile, and/or environmental information such as temperature and humidity, stored in the patient information database 120 of the information system 100.
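A minimal sketch of how such supply suggestions might be derived from the product database 130 and the patient information database 120 follows. The record layout and the cost-performance ranking are assumptions for illustration only.

```python
def suggest_supplies(wound_type: str, products: list, patient: dict) -> list:
    """Hypothetical derivation of medical-supply suggestions from the
    assessment result, inventory, cost-performance and allergies."""
    candidates = [
        p for p in products
        if wound_type in p["indications"]                        # assessment fit
        and p["stock"] > 0                                       # inventory check
        and not set(p["materials"]) & set(patient["allergies"])  # allergy check
    ]
    # Rank by cost-performance as stored in the product database.
    return sorted(candidates, key=lambda p: p["cost_per_use"])[:3]

products = [
    {"name": "foam dressing", "indications": ["pressure_ulcer"],
     "stock": 12, "materials": ["polyurethane"], "cost_per_use": 1.8},
    {"name": "silver dressing", "indications": ["pressure_ulcer"],
     "stock": 0, "materials": ["silver"], "cost_per_use": 4.2},
]
print(suggest_supplies("pressure_ulcer", products, {"allergies": []}))
```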
  • the interactive information may be provided to the users in the form of image overlay.
  • the device 200 displays the description of the wound 202 and the information of medical supplies for the wound 203 overlaying on the wound image 201 in a mixed reality format.
  • the description of the wound 202 and the information of medical supplies for the wound 203 may be displayed as a colour image.
  • the device 200 displays the description of the wound 202 and the information of medical supplies for the wound 203, beside the wound image 201, so that the wound image 201 is not covered by any interactive information.
  • any description of the wound 202 and the information of medical supplies for the wound 203 can be shown on another interface, such as a list that is neither beside the wound image 201 nor overlaid on the wound image 201.
  • the device 200 may control the display of the interactive information relative to the wound image 201.
  • the device 200 is operable to toggle the description of the wound 202, the information of medical supplies for the wound 203 and the wound image 201, based on the user’s preference, for example the user’s input.
  • the user may toggle to a view that shows reference images of wounds from clinical literature, for the comparison of the current wound with the reference images, so that the user can assess the condition of the wound in a more informed manner.
  • the wound image 201 and the reference image(s) can be displayed together, for example side-by-side, so that the user can compare the current wound and the reference wound easily.
  • the first user may toggle to another view where the historical trend of the wound will be shown.
  • the device 200 is operable to display a historical trend of the wound. For example, as shown in Fig. 7, a previous wound image 210 is overlapped with a current wound image 201.
  • the device 200 is operable to display a plurality of previous wound images with the descriptions relevant to the historical trend of the wound.
  • the device 200 is operable to display charts of the wound images to show the change of condition, for example size and colour, of the wound.
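As one way to realise the Fig. 7 overlay of a previous image on the current one, the sketch below alpha-blends two images with OpenCV. This assumes the two images are already registered (captured from a consistent perspective); the patent does not detail the blending method.

```python
import cv2

def overlay_history(current_bgr, previous_bgr, alpha=0.6):
    """Blend a previous wound image over the current one (cf. Fig. 7).
    Assumes both images show the wound from a consistent perspective."""
    previous_resized = cv2.resize(
        previous_bgr, (current_bgr.shape[1], current_bgr.shape[0]))
    # Weighted sum: alpha * current + (1 - alpha) * previous.
    return cv2.addWeighted(current_bgr, alpha, previous_resized, 1 - alpha, 0)
```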
  • the wound image 201 and the interactive information are displayed in the form of a 3D interface.
  • the device 200 is operable to generate 3D reconstructed information using 3D information acquired from one or more 3D imaging sensors.
  • the device 200 is further operable to display the 3D reconstructed information, for example 3D reconstructed image, in a virtual reality mode.
  • the 3D reconstructed image may be projected in a purely virtual 3D environment.
  • the users are able to interact with the 3D reconstructed image in the 3D environment.
  • the virtual reality mode may be used for interactive operations for the enhancement of user experience.
  • Figs. 8 and 9 illustrate examples of modifying the interactive information in accordance with various embodiments of the invention.
  • the device 200 displays the wound image 201 with the interactive information, for example the description of the wound 202 and the information of medical supplies for the wound 203.
  • the description of the wound 202 and the information of medical supplies for the wound 203 are generated by the device 200.
  • the description of the wound 202 and the information of medical supplies for the wound 203 are generated by any other device, such as the information system 100.
  • the first user may modify the interactive information.
  • the first user may include non-clinicians such as care-givers, family members of the patient, patient him/herself, or any other person.
  • the first user may include clinicians.
  • the device 200 receives an input 204 (hereinafter referred to as “first input”) on the wound image 201 from the first user.
  • the device 200 is also operable to receive the first input 204 on the description of the wound 202 and/or the information of medical supplies for the wound 203.
  • the objective of the first input 204 is to modify or add the interactive information regarding the wound. In this way, the first user is able to modify the assessment done by the device 200. It should be appreciated that the first input 204 comprises at least one of a touch input, key input, gesture input and voice input.
  • the device 200 may modify or add the interactive information based on the first input 204.
  • the device 200 may re-generate the interactive information (hereinafter referred to as “first modified interactive information”) based on the first input 204 and display the first modified description of the wound 205 and the first modified information of medical supplies 206.
  • the first user may manually enter the first modified description of the wound 205 and/or the first modified information of medical supplies 206, and the device 200 may display the same.
  • the device 200 may then send the first modified description of the wound 205 and/or the first modified information of medical supplies 206 to the information system 100 so that the information system 100 provides the same to the remote device 300.
  • the device 200 may send the description of the wound 202 and the information of medical supplies for the wound 203, to the information system 100.
  • the device 200 may send the information system 100 instructions to inform that there is no modification on the interactive information.
  • the remote device 300 receives the first modified description of the wound 205 and/or the first modified information of medical supplies 206, and displays the same.
  • the first input 204 may be displayed with the wound image 201 so that the second user is able to identify the first input 204.
  • the second user may modify the first modified interactive information.
  • the second user may include clinicians at the remote site.
  • the second user may include non-clinicians such as care-givers, family members of the patient, or any other person at the remote site.
  • the remote device 300 receives an input 207 (hereinafter referred to as “second input”) on the wound image 201 from the second user.
  • the remote device 300 is also operable to receive the second input 207 on the first modified description of the wound 205 and/or the first modified information of medical supplies 206.
  • the objective of the second input is to modify or add the first modified interactive information regarding the wound.
  • the second user is able to modify the assessment done by the device 200 or the first user.
  • the second input 207 comprises at least one of a touch input, key input, gesture input and voice input.
  • the remote device 300 may modify or add the interactive information based on the second input 207. As shown in Fig. 9(c), the remote device 300 may re-generate the interactive information (hereinafter referred to as “second modified interactive information”) based on the second input 207 and display the second modified description of the wound 208 and the second modified information of medical supplies 209. Although not shown, it should be appreciated that the second user may manually enter the second modified description of the wound 208 and/or the second modified information of medical supplies 209, and the remote device 300 may display the same.
  • the information system 100 may store the second modified interactive information as the information for future analysis.
  • the second modified interactive information may be sent to the device 200 or any other patient’s device.
  • Figs. 10 to 14 illustrate examples of interfaces showing 3D reconstructed information in accordance with various embodiments of the invention.
  • the sensor may include, but not be limited to a 2D imaging sensor and/or a 3D imaging sensor.
  • the imaging sensor may be attached or mounted to the device 200. It should be appreciated that the imaging sensor may include, but not be limited to a camera and/or an infrared sensor.
  • a live preview image 211 generated by the 2D imaging sensor may be displayed.
  • the 2D imaging sensor, for example a 2D camera, may capture a 2D image of an injury, for example a wound.
  • 3D information of the wound may be acquired in the form of a set of data points such as point clouds, etc. by using one or more 3D imaging sensors such as a 3D scanner, a stereo camera system or a mono-camera system with motion using algorithms installed on the 3D imaging sensors. These algorithms may include visual simultaneous localization and mapping (“Visual SLAM”) algorithms. Thereafter, the 3D information of the wound is visually reconstructed. The 3D reconstructed information is displayed in mixed reality or virtual reality format.
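For illustration only, the sketch below reconstructs a surface from such a point cloud using the open-source Open3D library. The patent names no library, so this is one plausible realisation of "visually reconstructed", not the patented method.

```python
import numpy as np
import open3d as o3d  # assumption: the patent does not name a library

def reconstruct_surface(points_xyz: np.ndarray) -> o3d.geometry.TriangleMesh:
    """Turn a raw point set (e.g. from a 3D scanner or Visual SLAM) into a
    renderable mesh, one reading of 'visually reconstructed'."""
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(points_xyz)
    pcd.estimate_normals()  # Poisson reconstruction needs oriented normals
    mesh, _densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
        pcd, depth=8)
    return mesh
```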
  • an anatomy image 218 may be displayed on the live preview image 211 or the captured 2D image, based on the information of a location of the wound.
  • the anatomy image 218 may include an anatomy image of the whole body and/or a part of the body relating to the location of the wound, for example the upper body.
  • the location of the wound may be a sacrum.
  • the location of the wound may be shown using the anatomy image 218 by highlighting a corresponding part of the anatomy image 218.
  • the wound may be captured in various orientations and/or angles based on the user’s preference or input.
  • An annotation showing the orientation and/or angle may be displayed with the live preview image 211 or the captured 2D image.
  • the orientation of the head of the patient is shown using the anatomy image 218.
  • the anatomy image 218 can be used as a compass relative to the location of the wound.
  • the anatomy image 218 can be used to indicate a relative location of the head of the patient based on the location of the wound.
  • the anatomy image 218 may be dynamically displayed. For example, if the orientation and/or angle is changed for capturing the wound, the anatomy image 218 may be rotated accordingly.
  • the anatomy image 218 may be displayed in a fixed format, for example fixed size, shape, orientation and/or location.
  • an estimated distance between the wound plane and the sensor and/or camera may be displayed.
  • the sensor and/or camera may sense a distance from the closest plane of an object such as a part of the body being captured, and utilize the distance as the estimated distance. For example, as shown in Fig. 10, “40 cm” is displayed as the distance from the plane of the body to the sensor and/or camera.
  • the display of the anatomy image 218 and/or the estimated distance can assist the user to capture the image of the wound in a consistent manner, for example in a specific orientation with respect to the body of the patient, and from a known distance. Since a series of images can be captured from a consistent distance, the series of images can be recorded from a consistent perspective. Therefore, the user can check and compare the progress of the size of the wound in a consistent manner.
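One simple way to obtain the "40 cm" style readout is a robust statistic over the central region of a depth map. The computation below is an assumption for illustration; the patent leaves the estimation method open.

```python
import numpy as np

def estimated_distance_cm(depth_m: np.ndarray) -> float:
    """Estimate the distance from the sensor to the closest captured plane,
    in centimetres, from a metric depth image (0 = no reading)."""
    h, w = depth_m.shape
    centre = depth_m[h // 3:2 * h // 3, w // 3:2 * w // 3]
    valid = centre[centre > 0]          # discard pixels with no depth reading
    return float(np.median(valid) * 100.0)
```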
  • Fig. 11 shows an example of the 3D reconstructed information 210 in a mixed reality format.
  • the 3D reconstructed information 210 that is obtained from the 3D imaging sensor may be superimposed on the live preview image 211 in real-time.
  • the user can then interact with the information system 100 by actuating, for example by pressing or touching, one or more buttons or icons 212 to capture the 3D reconstructed information 210 of the wound within a selected area 213.
  • the area 213 may be selected by the user’s input. In another embodiment, the area 213 may be selected automatically without the user’s input.
  • assistive information such as an anatomy image 218 and/or size of the targeted sensing volume may be displayed.
  • the anatomy image 218 may be displayed in one colour, for example white colour, at the bottom of the live preview image 211. It should be appreciated that the colour of the anatomy image 218 may be changed in accordance with the colour of the live preview image 211.
  • the anatomy image 218 may show an orientation of a part of a body of a person, such as head of the patient.
  • the targeted sensing volume may be set as a 30×30×30 cm cube 219.
  • the size of the targeted sensing volume may be adjusted by the user’s input such as touch input and/or voice input.
  • the touch input may include pinch input.
  • the 3D reconstructed information 210 may be newly acquired by the 3D imaging sensor using the adjusted sensing volume.
  • Figs. 12 to 14 show examples of the 3D reconstructed information 210 in a virtual reality format.
  • the captured 3D reconstructed information 210 may be displayed in the virtual reality format.
  • a plurality of viewing modes may be toggled based on the user’s preference or input. As shown in Figs. 12 and 13, the user can view the 3D reconstructed information 210 from various viewpoints and/or in various ranges of colours to examine the structure of the wound.
  • the user can interact with the 3D reconstructed information 210 to perform various actions, such as obtain measurements or readings of the wound’s size, etc.
  • the user can measure the perimeter of the wound by selecting an area 214 of the wound.
  • the information system 100 may automatically calculate the perimeter and the depth of the wound area, based on the area of the wound selected by the user’s input.
  • the user can further perform actions, such as obtain measurements or readings of the length between two selected points using the 3D reconstructed information 210.
  • the user can set the respective measurement results by selecting a specific button or icon on the interface to update the information system 100 relating to the wound attributes.
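The perimeter and depth readings could, for example, be computed from the selected boundary of the 3D reconstruction as sketched below. The plane-fitting approach is an assumption: the patent only states that the calculation is automatic.

```python
import numpy as np

def perimeter_and_depth(boundary_xyz: np.ndarray, bed_xyz: np.ndarray):
    """Perimeter of the user-selected wound boundary (area 214) and the
    maximum depth of wound-bed points below the surrounding skin plane."""
    # Perimeter: sum of edge lengths around the closed boundary polygon.
    closed = np.vstack([boundary_xyz, boundary_xyz[:1]])
    perimeter = float(np.linalg.norm(np.diff(closed, axis=0), axis=1).sum())

    # Fit a plane to the boundary points (approximating intact skin) and
    # measure how far the wound bed lies from that plane.
    centroid = boundary_xyz.mean(axis=0)
    _, _, vh = np.linalg.svd(boundary_xyz - centroid)
    normal = vh[2]                                    # best-fit plane normal
    depth = float(np.abs((bed_xyz - centroid) @ normal).max())
    return perimeter, depth
```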
  • Figs. 15 to 19 illustrate examples of interfaces in an interactive mode in which the user is making assessment of the appearance of the injury, in accordance with various embodiments of the invention.
  • the interfaces may display information of wound attributes such as location, category, type of appearance, size and/or smell of the wound.
  • the user can use the interactive information obtained or received from the information system 100 to make a more accurate assessment of the injury, for example wound.
  • Fig. 15 shows an interface in which a window 216 for reference information is being prepared.
  • the window 216 may be displayed on the upper right side of the interface.
  • the size and/or location of the window 216 may be changed based on the user’s preference or input. Wounds may be divided into several types according to their appearance.
  • the information system 100 may display the respective reference information, such as a reference image 217, by the side of the captured wound image 215, to show the user what the appearance of the wound would look like for the specific type.
  • the reference image 217 of “Epithelising” is displayed when the user selects “Epithelising” as the type of the appearance of the wound. This function can help the user to make a more informed decision about the type, which is helpful for a less experienced user, for example a non-clinician.
  • the reference image 217 is interactively changed.
  • voice commands may be used for the described invention.
  • examples of the voice commands are as follows (a dispatch sketch follows after the list):
  • 1) sensing commands: e.g. “take picture”, “start scanning”, “bigger area”, “smaller area”, “closer”, “further”, “right”, “left”, “up”, “down”, etc.
  • 2) commands to toggle between different views: e.g. “show wound history”, “show wound reference”, “show 3D view”, “show wound attribute”, etc.
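A dispatch table is one straightforward way to wire such recognised phrases to actions. The sketch below is purely illustrative: the `ui` object and its methods are hypothetical, and the patent does not specify how commands are routed.

```python
VOICE_COMMANDS = {
    # 1) sensing commands
    "take picture":         lambda ui: ui.capture_image(),
    "start scanning":       lambda ui: ui.start_scan(),
    "bigger area":          lambda ui: ui.resize_sensing_volume(+5),
    "smaller area":         lambda ui: ui.resize_sensing_volume(-5),
    # 2) view-toggling commands
    "show wound history":   lambda ui: ui.show_view("history"),
    "show wound reference": lambda ui: ui.show_view("reference"),
    "show 3d view":         lambda ui: ui.show_view("3d"),
    "show wound attribute": lambda ui: ui.show_view("attributes"),
}

def dispatch_voice_command(transcript: str, ui) -> bool:
    """Route a recognised phrase to a UI action; False if unrecognised."""
    action = VOICE_COMMANDS.get(transcript.strip().lower())
    if action is None:
        return False
    action(ui)
    return True
```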

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • General Business, Economics & Management (AREA)
  • Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Medical Treatment And Welfare Office Work (AREA)
PCT/SG2018/050636 2017-12-28 2018-12-28 System and method for obtaining data associated with an injury WO2019132780A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020530384A JP2021509298A (ja) 2017-12-28 2018-12-28 System and method for obtaining data associated with an injury
CN201880077241.2A CN111742374A (zh) 2017-12-28 2018-12-28 System and method for obtaining data associated with a wound

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SG10201710872W 2017-12-28
SG10201710872W 2017-12-28

Publications (1)

Publication Number Publication Date
WO2019132780A1 (en) 2019-07-04

Family

ID=67063096

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2018/050636 WO2019132780A1 (en) 2017-12-28 2018-12-28 System and method for obtaining data associated with an injury

Country Status (3)

Country Link
JP (1) JP2021509298A (ja)
CN (1) CN111742374A (ja)
WO (1) WO2019132780A1 (ja)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112716452A (zh) * 2020-12-27 2021-04-30 孙炳伟 Infrared monitoring method and apparatus for a wound surface, and storage medium


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003281270A (ja) * 2002-03-25 2003-10-03 Menicon Co Ltd Method for providing medical supply information and method for supplying medical supplies using the same
JP2006271840A (ja) * 2005-03-30 2006-10-12 Hitachi Medical Corp Image diagnosis support system
JP5368821B2 (ja) * 2009-02-13 2013-12-18 Canon Inc. Camera control device, camera control method and program
GB0904080D0 (en) * 2009-03-09 2009-04-22 Mologic Ltd Imaging method
JP2014188095A (ja) * 2013-03-26 2014-10-06 Kitasato Institute Remote diagnosis system
JP6530746B2 (ja) * 2013-10-08 2019-06-12 Leaf Healthcare, Inc. System for monitoring a user and user-wearable sensor device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080021287A1 (en) * 2006-06-26 2008-01-24 Woellenstein Matthias D System and method for adaptively adjusting patient data collection in an automated patient management environment
US20130231711A1 (en) * 2012-03-02 2013-09-05 Thomas E. Kaib Systems and methods for configuring a wearable medical monitoring and/or treatment device
US20160228049A1 (en) * 2015-02-06 2016-08-11 Nxp B.V. Wound monitoring

Also Published As

Publication number Publication date
CN111742374A (zh) 2020-10-02
JP2021509298A (ja) 2021-03-25

Similar Documents

Publication Publication Date Title
US8953837B2 (en) System and method for performing an automatic and self-guided medical examination
JP2021118892A (ja) 生理学的モニタのためのシステム、方法、及びコンピュータプログラム製品
US20160100790A1 (en) Automated systems and methods for skin assessment and early detection of a latent pathogenic bio-signal anomaly
CN111758137A (zh) 用于远程医疗的方法和设备
TW202103646A (zh) 用於遠距監督手術程序之擴增實境系統及方法
US20120133640A1 (en) Method and system for combining 2d image and 3d model and computer program product thereof
US11340708B2 (en) Gesture control of medical displays
US10607340B2 (en) Remote image transmission system, display apparatus, and guide displaying method thereof
WO2006065374A1 (en) A graphical medical data acquisition system
US20140193056A1 (en) Systems and Methods for Patient Anatomical Image Volume Data Visualization Using A Portable Processing Device
US11361433B2 (en) Image display control system, image display system, and image analysis device for dynamic medical imaging
US11817205B2 (en) Virtual augmentation of clinical care environments
CN113768619B (zh) 路径定位方法、信息显示装置、存储介质及集成电路芯片
JP2014178458A (ja) 携帯型医療画像表示装置
JPWO2018139468A1 (ja) 医療情報仮想現実システム
CN114520048A (zh) 远程通信装置、手术和治疗系统和远程通信方法
WO2019132780A1 (en) System and method for obtaining data associated with an injury
US10854005B2 (en) Visualization of ultrasound images in physical space
JP6116375B2 (ja) 診断支援システム
KR20160023015A (ko) 의료영상의 제공방법
US20130009860A1 (en) Information display apparatus
US9076310B2 (en) Method and electronic device for remote diagnosis
EP4181789B1 (en) One-dimensional position indicator
EP4283439A1 (en) Reducing spatial conflict between a virtual object and an individual
WO2023145503A1 (ja) 医療用情報処理システム、医療用情報処理方法、及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18894857

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
ENP Entry into the national phase

Ref document number: 2020530384

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18894857

Country of ref document: EP

Kind code of ref document: A1