AU2021105118A4 - An intelligent auxiliary decision-making platform for the treatment of large-scale casualty incidents - Google Patents


Info

Publication number
AU2021105118A4
AU2021105118A4
Authority
AU
Australia
Prior art keywords
classification
heart rate
video image
intelligent
treatment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2021105118A
Inventor
Hua Chen
Li Chen
Xiang Cui
Peng Li
Yahua Liu
Liang LUO
Zijie PAN
Suhong Qiu
Hao Wang
Lili Wang
Yanan Wang
Yuwei Wang
Hong Xu
Bo Yang
Hua Yang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
First Medical Center of PLA General Hospital
Original Assignee
First Medical Center of PLA General Hospital
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by First Medical Center of PLA General Hospital filed Critical First Medical Center of PLA General Hospital
Priority to AU2021105118A priority Critical patent/AU2021105118A4/en
Application granted granted Critical
Publication of AU2021105118A4 publication Critical patent/AU2021105118A4/en
Ceased legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A61B5/0022 Monitoring a patient using a global network, e.g. telephone networks, internet
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/015 Measuring temperature by temperature mapping of body part
    • A61B5/021 Measuring pressure in heart or blood vessels
    • A61B5/02438 Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • A61B5/0816 Measuring devices for examining respiratory frequency
    • A61B5/091 Measuring volume of inspired or expired gases, e.g. to determine lung capacity
    • A61B5/14542 Measuring characteristics of blood in vivo for measuring blood gases
    • A61B5/6803 Sensor mounted on head-worn items, e.g. helmets, masks, headphones or goggles
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/747 Arrangements for interactive communication between patient and care services in case of emergency, i.e. alerting emergency services
    • A61B5/742 Details of notification to user using visual displays
    • G02B27/017 Head-up displays, head mounted
    • G02B2027/0178 Head-mounted displays of eyeglass type
    • G02C11/10 Electronic devices other than hearing aids
    • G06F18/251 Fusion techniques of input or preprocessed data
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/15 Biometric patterns based on physiological signals, e.g. heartbeat, blood flow

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Physiology (AREA)
  • Pulmonology (AREA)
  • Cardiology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Business, Economics & Management (AREA)
  • Psychiatry (AREA)
  • Nursing (AREA)
  • Vascular Medicine (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Emergency Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Emergency Management (AREA)
  • Critical Care (AREA)
  • Bioinformatics & Cheminformatics (AREA)

Abstract

The invention discloses an intelligent auxiliary decision-making platform for the treatment of large-scale casualty incidents, relating to the field of large-scale casualty incident treatment. The platform includes a detection module, a classification module and a central station. The detection module includes an inspection unit which is used to obtain the heart rate, respiratory rate, blood pressure, blood oxygen saturation, body temperature and/or breathing depth of the injured person; the classification module includes a local intelligent classification unit. The local intelligent classification unit is used to classify the injuries based on the information obtained by the inspection unit; the injury classification includes the classification of abnormal values corresponding to the specific data obtained by the inspection unit and the comprehensive risk classification. The invention can improve the level of treating the injured person, assist rescue decision-making and command and enhance the overall treatment effect.

Description

Drawings
Fig. 1 (smart glasses) and Fig. 2 (platform structure: detection modules with smart glasses, data collection device, data processing device and display device; local intelligent classification unit; online intelligent classification unit).
AN INTELLIGENT AUXILIARY DECISION-MAKING PLATFORM FOR THE TREATMENT OF LARGE-SCALE CASUALTY INCIDENTS
Technical Field
The invention belongs to the technical field of treatment of large-scale casualties and
in particular relates to an intelligent auxiliary decision-making platform for the
treatment of large-scale casualty incidents.
Background
After a large-scale casualty event occurs, it is necessary to collect data and determine
the condition of the injured person to treat the injured person in a targeted manner,
which can effectively reduce the medical risk of the injured person. At the same time,
it is necessary to ensure that medical resources for treatment can reach the
corresponding place as quickly as possible to save more human resources and
transportation time.
In the case of sudden, mass rescue needs after a large-scale casualty event, the
existing rescue medical system of "search - rough inspection - transfer - hospital
fine inspection - treatment" is limited by on-site medical resources and testing
capabilities. One prominent contradiction is between the accuracy required for mass
inspections and the limited data available on site (fast but not good); the other is
between the complete medical resources, testing capabilities and data of the rear
hospital and the demand for rapid on-site group inspections (good but not fast).
* Invention Content
In order to solve the above-mentioned technical problems, the present invention
provides an intelligent auxiliary decision-making platform for the treatment of
large-scale casualty incidents, so as to unify high efficiency with the timeliness
and accuracy of the on-site treatment of injuries. Taking online inspection
principles and technologies as the overall research goal, smart wearable devices,
medical sensor technology, artificial intelligence and big data technology are
applied to the rescue of the injured to break through key scientific and technical
issues, including establishing inspection classifications and achieving in-situ
injury detection and rapid classification. Accurate guidance of the flow of wounded,
information and material at the rescue site can significantly improve the survival
rate of the wounded at the rescue site.
The above-mentioned technical purpose of the present invention is achieved through
the following technical solutions:
An intelligent auxiliary decision-making platform for the treatment of large-scale
casualty incidents which includes a detection module, a classification module and a
central station.
The detection module includes an inspection unit which is used to obtain the heart rate,
respiratory rate, blood pressure, blood oxygen saturation, body temperature and/or
breathing depth of the injured person.
The classification module includes a local intelligent classification unit. The local
intelligent classification unit is used to classify the injuries based on the information
obtained by the inspection unit; the injury classification includes the classification of
abnormal values corresponding to the specific data obtained by the inspection unit and
the comprehensive risk classification.
In the above scheme, the relevant information of the injured person is obtained
through the inspection unit. The corresponding data of the injured, such as heart
rate, respiratory rate, blood pressure variability, blood oxygen saturation, body
temperature and respiratory depth, are divided into risk levels and stored in
corresponding databases; a comprehensive risk level table and its corresponding
database can also be generated by setting the proportion of the different data, so
as to classify the injury level of the injured person. According to the risk level
of the injured person, the corresponding information is sent to the central station.
The central station divides the rear of the treatment site into several safety-level
treatment units and arranges the injured to the nearest treatment unit according to
injury risk level and expected treatment time; at the same time, it can assist in
deploying medical resources to the corresponding treatment unit for targeted
treatment of the injured person.
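The two-stage classification described above (per-datum abnormal-value grading plus a weighted comprehensive risk level) can be sketched as follows. All normal ranges, weights and level cut-offs below are illustrative assumptions for the sketch, not values disclosed by the invention.

```python
# Hypothetical sketch of the two-stage injury classification: grade each vital
# sign against a normal range, then combine the grades with assumed weights
# ("the proportion of different data") into a comprehensive risk level.

# Assumed adult normal ranges (illustrative only)
NORMAL_RANGES = {
    "heart_rate": (60, 100),       # beats per minute
    "respiratory_rate": (12, 20),  # breaths per minute
    "spo2": (95, 100),             # blood oxygen saturation, %
    "body_temp": (36.1, 37.2),     # degrees Celsius
}

# Assumed weights for the comprehensive score
WEIGHTS = {"heart_rate": 0.3, "respiratory_rate": 0.3, "spo2": 0.3, "body_temp": 0.1}

def grade_vital(name, value):
    """Return 0 (normal), 1 (mildly abnormal) or 2 (severely abnormal)."""
    lo, hi = NORMAL_RANGES[name]
    if lo <= value <= hi:
        return 0
    span = hi - lo
    # Severe if the value lies more than half the normal span outside the range
    if value < lo - 0.5 * span or value > hi + 0.5 * span:
        return 2
    return 1

def comprehensive_risk(vitals):
    """Weighted sum of per-vital grades mapped to a risk level."""
    score = sum(WEIGHTS[k] * grade_vital(k, v) for k, v in vitals.items())
    if score >= 1.0:
        return "high"
    if score >= 0.4:
        return "medium"
    return "low"

vitals = {"heart_rate": 135, "respiratory_rate": 28, "spo2": 88, "body_temp": 36.8}
print(comprehensive_risk(vitals))  # → high
```

In a deployment, the thresholds and weights would come from the risk level tables and databases that the platform generates, not from hard-coded constants.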
Further, the inspection unit further includes wearable smart glasses and the smart
glasses include a data collection device, a data processing device and a display device;
The data collection device includes a camera and infrared components arranged on the
smart glasses and the camera is used to obtain a target image;
The data processing device is used to obtain the blood oxygen saturation of the
injured person from images of facial blood vessel volume change, and to obtain the
breathing depth and respiration frequency from images of perioral and peri-nasal
change;
The display device includes a virtual display screen displayed on the glasses lens;
The smart glasses have an RGB colour sensor, a moving average filter, a band-pass
filter, a microcontroller, a video image processing chip, a memory and a storage
battery installed inside;
The RGB colour sensor is used to obtain the mixed signal of the plethysmographic
signal reflected by the video image information and other light fluctuation sources
caused by artefacts;
The video image processing chip is used to process the video image information of
the human facial target area collected by the camera. By calculating and converting
the video image information of the human face target area, the human heart rate and
respiration atlas are obtained; by analysing and discriminating the human heart rate
and respiration atlas, the human heart rate is measured;
The measuring method of breathing depth and breathing frequency specifically
includes the following steps:
S1-1: acquire thermal images of the nose and skin;
S1-2: continuous measurement of thermal images of the nose and skin;
S1-3: obtain the highest and lowest temperatures (°C) in the bounding box (BB) from
the continuously measured thermal images:
T_max(t) = max(M_BB(R, C)), T_min(t) = min(M_BB(R, C)) --- (1)
Among them: R represents the rows of matrix M_BB, C represents the columns of
matrix M_BB, T_max represents the temperature (°C) of the skin surface and T_min
represents the temperature (°C) around the nose during breathing;
Description
The breathing cycle function f(t) is:
f(t) = T_max(t) - T_min(t) --- (2)
Among them: the number of peaks in f(t) is the number of breaths and the time
interval between two adjacent peaks is one breathing cycle.
As a preferred solution, the method also includes the following step: the
respiratory rate (RR) is:
RR = 60 / T_i[f(t)] --- (3)
Among them: i is the cycle order of f(t) and T_i[f(t)] is the duration of the i-th
breathing cycle; within one breathing cycle, t(T_max,i) represents the time to
reach the maximum temperature and t(T_min,i) represents the time to reach the
minimum temperature.
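Steps S1-1 to S1-3 and equations (1)-(3) can be sketched in a few lines: for each frame, take the highest and lowest temperature inside the bounding box, form f(t), then count peaks of f(t) to estimate the respiratory rate. The frame rate, the tiny 2x2 bounding-box matrices and the synthetic breathing signal below are assumptions for illustration only.

```python
import math

FPS = 10  # assumed thermal camera frame rate (frames per second)

def frame_extrema(frame):
    """Highest and lowest temperature (°C) in a bounding-box matrix M_BB, eq. (1)."""
    flat = [t for row in frame for t in row]
    return max(flat), min(flat)

def respiratory_rate(frames, fps=FPS):
    """Estimate breaths per minute from a sequence of bounding-box frames."""
    # Breathing cycle function f(t) = T_max(t) - T_min(t), eq. (2)
    f = [frame_extrema(fr)[0] - frame_extrema(fr)[1] for fr in frames]
    # Peak detection: samples greater than both neighbours
    peaks = [i for i in range(1, len(f) - 1) if f[i] > f[i - 1] and f[i] >= f[i + 1]]
    if len(peaks) < 2:
        return 0.0
    # Mean breathing-cycle duration in seconds (interval between peaks), eq. (3)
    cycle_s = (peaks[-1] - peaks[0]) / (len(peaks) - 1) / fps
    return 60.0 / cycle_s

# Synthetic 30 s sequence: the nose region oscillates with a 4 s breathing cycle
# (15 breaths/min) while the skin row stays at a constant 35 °C.
frames = []
for k in range(300):
    nose = 33.0 + 0.8 * math.sin(2 * math.pi * (k / FPS) / 4.0)
    frames.append([[35.0, 35.0], [nose, nose]])

print(round(respiratory_rate(frames), 1))  # → 15.0
```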
Further, the smart glasses are provided with a wireless transmission device which
includes 5G, Bluetooth, infrared and/or Wi-Fi components.
In the above-mentioned preferred solution, the corresponding information sent by the
central station can be received through the wireless transmission device and
displayed directly on the smart glasses. The data obtained by the smart glasses can
also be transmitted back to the central station to assist in making corresponding
decisions. In a casualty environment, a central station can be set up behind the
rescue line and data transmission can then be carried out without the internet,
through short-range wireless methods such as Bluetooth, infrared or Wi-Fi.
Further, the central station includes a remote control room which is used to receive
the signal.
The classification module further includes an online intelligent classification unit. The
online intelligent classification unit includes a clinical data comparison subunit, a
medical staff interaction subunit, and a big data processing subunit.
In the above-mentioned preferred solution, the online intelligent classification
unit can transmit the data obtained by the smart glasses and the data of the local
intelligent classification unit back to the central station or remote control room,
where determination accuracy can be improved with the assistance of medical staff.
The classification method and corresponding algorithm parameters of the local
intelligent classification unit can also be adaptively modified to better classify
the injured and achieve rapid treatment; at the same time, when a task cannot be
undertaken by the local intelligent classification unit, the big data processing
subunit can provide auxiliary processing.
Further, the smart glasses are also provided with a buzzer and a signal light. The
buzzer and the signal light are connected with the microcontroller.
In the above-mentioned preferred solution, when the data of an injured person fall
into different injury levels, the buzzer and the signal light can be used to send
out signals that quickly alert rescue personnel, which suits a large-scale casualty
rescue environment where data must be obtained quickly.
Further, the specific method for the video image processing chip to calculate the
human heart rate according to the video image of the human facial area comprises the
following steps:
S2-1: Use the combination of CEEMDAN algorithm and empty channel detection
technology to remove the noise in the video image information collected by the
network camera;
S2-2: According to the mixed signal of the plethysmographic signal reflected by the
video image information obtained by the RGB colour sensor and other light
fluctuation sources caused by artifacts, the independent component analysis method is
used to separate the mixed signal into three independent signal sources;
S2-3: From the three independent signal sources, select the one whose power
spectrum contains the highest peak for cardiac-cycle analysis and use a cubic
spline function to interpolate the signal. Then use a custom algorithm to detect
the blood volume pulse (BVP) in the interpolated signal and obtain the beat
intervals from the detected BVP peaks;
S2-4: Use the non-dependent variable threshold algorithm to filter the obtained beat
intervals to obtain low-frequency power and high-frequency power and normalize the
low-frequency power and high-frequency power to reduce the influence of the total
power change value;
S2-5: According to S2-3, use the Lomb-Scargle periodogram to obtain the power
spectral density graph and finally convert it into a heart rate graph and a respiration
graph;
S2-6: According to the heart rate graph and respiration graph obtained in S2-5, the
heart rate detection, calculation and discrimination program implanted in the video
image processing chip in advance is used to analyse the heart rate graph and
calculate the heart rate value.
Further, the side wall of the smart glasses is provided with a charging interface
connected with the battery.
In the above-mentioned preferred solution, electric energy storage is realized through
a charging interface; in different embodiments, a replaceable power supply can also
be provided to cooperate with the charging interface to make the use more stable.
Further, the wall surface of the smart glasses is provided with an illuminating light
close to the camera.
In the above-mentioned preferred solution, illuminating lights are provided which can
help the rescue of the injured person and obtain relevant information in a dark
environment.
In summary, the present invention has the following beneficial effects:
The intelligent auxiliary decision-making platform for the treatment of large-scale
casualty incidents provided by the present invention can be used to guide the flow
of medical resources, the evacuation of the critically wounded and the timing of
resource allocation, thereby improving the level of treatment of the wounded and
assisting rescue decision-making and command.
* Description of Drawings
Fig. 1 is a schematic diagram of the structure of smart glasses according to an
embodiment of the present invention;
Fig. 2 is a schematic structural diagram of an intelligent auxiliary decision-making
platform for the treatment of large-scale casualty incidents according to an
embodiment of the present invention;
In the figure: 1. Camera; 2. Infrared components; 3. Virtual display; 4. Buzzer;
5. Signal light; 6. Illumination light.
* Detailed Description
In order to make the objectives, technical solutions and advantages of the present
invention clearer, the present invention will be further described in detail below with
embodiments. It should be understood that the specific embodiments described here
are only used to explain the present invention, not to limit it. It should also be
noted that the figures provided in the present embodiment only
exemplarily explain the basic conception of the present invention, so only show the
components associated with the present invention, and are not drawn in accordance
with component number, shapes and sizes in actual implementation. Forms, number
and proportions of the components in the actual implementation can be freely changed,
and component layout forms may also be more complex. Structures, proportions and
sizes drawn in the figures of the description are only used to match with the disclosure
in the description for those skilled in the art to understand and read, not intended to
limit implementation of the present invention, so have no technical material meaning.
Any structural modification, proportional change and adjustment of sizes shall still be
included in the scope of the technical contents revealed in the present invention
without affecting the effects generated by the present invention and the purposes
achieved by the present invention. Meanwhile, terms such as "upper", "lower", "left",
"right", "middle", "an", etc. cited in the description are only used for clear illustration,
not intended to limit the implementation scope of the present invention. Change or
adjustment of relative relationships shall be included in the implementation scope of
the present invention without substantially changing the technical contents.
The present invention will be further described in detail below in conjunction with the
accompanying drawings:
An intelligent auxiliary decision-making platform for the treatment of large-scale
casualty incidents which includes a detection module, a classification module and a
central station.
The detection module includes an inspection unit which is used to obtain the heart rate,
respiratory rate, blood pressure, blood oxygen saturation, body temperature and/or
breathing depth of the injured person;
The classification module includes a local intelligent classification unit. The local
intelligent classification unit is used to classify the injuries based on the information
obtained by the inspection unit; the injury classification includes the classification of
abnormal values corresponding to the specific data obtained by the inspection unit and
the comprehensive risk classification.
In the above scheme, the relevant information of the injured person is obtained
through the inspection unit. The corresponding data of the injured, such as heart
rate, respiratory rate, blood pressure variability, blood oxygen saturation, body
temperature and respiratory depth, are divided into risk levels and stored in
corresponding databases; a comprehensive risk level table and its corresponding
database can also be generated by setting the proportion of the different data, so
as to classify the injury level of the injured person. According to the risk level
of the injured person, the corresponding information is sent to the central station.
The central station divides the rear of the treatment site into several safety-level
treatment units and arranges the injured to the nearest treatment unit according to
injury risk level and expected treatment time; at the same time, it can assist in
deploying medical resources to the corresponding treatment unit for targeted
treatment of the injured person.
Further, the inspection unit further includes wearable smart glasses and the smart
glasses include a data collection device, a data processing device and a display device;
The data collection device includes a camera (1) and infrared components (2)
arranged on the smart glasses and the camera (1) is used to obtain a target image;
The data processing device is used to obtain the blood oxygen saturation of the
injured person from images of facial blood vessel volume change, and to obtain the
breathing depth and respiration frequency from images of perioral and peri-nasal
change;
The display device includes a virtual display screen (3) displayed on the glasses lens;
The smart glasses have an RGB colour sensor, a moving average filter, a band-pass
filter, a microcontroller, a video image processing chip, a memory and a storage
battery installed inside;
The RGB colour sensor is used to obtain the mixed signal of the plethysmographic
signal reflected by the video image information and other light fluctuation sources
caused by artefacts;
The video image processing chip is used to process the video image information of
the human facial target area collected by the camera (1). By calculating and
converting the video image information of the human face target area, the human
heart rate and respiration atlas are obtained; by analysing and discriminating the
human heart rate and respiration atlas, the human heart rate is measured;
The measuring method of breathing depth and breathing frequency specifically
includes the following steps:
S1-1: acquire thermal images of the nose and skin;
S1-2: continuous measurement of thermal images of the nose and skin;
S1-3: obtain the highest and lowest temperatures (°C) in the bounding box (BB) from
the continuously measured thermal images:
T_max(t) = max(M_BB(R, C)), T_min(t) = min(M_BB(R, C)) --- (1)
Among them: R represents the rows of matrix M_BB, C represents the columns of
matrix M_BB, T_max represents the temperature (°C) of the skin surface and T_min
represents the temperature (°C) around the nose during breathing;
The breathing cycle function f(t) is:
f(t) = T_max(t) - T_min(t) --- (2)
Among them: the number of peaks in f(t) is the number of breaths and the time
interval between two adjacent peaks is one breathing cycle.
As a preferred solution, the method also includes the following step: the
respiratory rate (RR) is:
RR = 60 / T_i[f(t)] --- (3)
Among them: i is the cycle order of f(t) and T_i[f(t)] is the duration of the i-th
breathing cycle; within one breathing cycle, t(T_max,i) represents the time to
reach the maximum temperature and t(T_min,i) represents the time to reach the
minimum temperature.
Further, the smart glasses are provided with a wireless transmission device which
includes 5G, Bluetooth, infrared and/or Wi-Fi components.
In the above-mentioned preferred solution, the corresponding information sent by the
central station can be obtained through the wireless transmission device and displayed
directly on the smart glasses. The data obtained by the smart glasses can also be
transmitted back to the central station to assist in making corresponding decisions. In
a mass-casualty environment, a central station can be set up behind the rescue line,
and data transmission can then be carried out without internet access through
short-range wireless methods such as Bluetooth, infrared or Wi-Fi.
Further, the central station includes a remote control room which is used to receive
the signal.
The classification module further includes an online intelligent classification unit. The
online intelligent classification unit includes a clinical data comparison subunit, a
medical staff interaction subunit, and a big data processing subunit.
In the above-mentioned preferred solution, the online intelligent classification unit
can transmit the data obtained by the smart glasses and the data of the local intelligent
classification unit back to the central station or remote control room. With the
assistance of medical staff in making determinations, the determination accuracy can
be improved. The classification method and corresponding algorithm parameters of
the local intelligent classification unit can also be adaptively modified to better
classify the injured and achieve rapid treatment. At the same time, when the local
intelligent classification unit cannot undertake a task, the big data processing subunit
can carry out auxiliary treatment.
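The handover between the local and online intelligent classification units described above can be illustrated with a minimal sketch; the `LocalClassifier` name, the threshold values and the parameter format are hypothetical, as the patent does not specify them:

```python
class LocalClassifier:
    """Illustrative local intelligent classification unit that defers to
    the online unit when it cannot decide, and accepts adaptive parameter
    updates pushed back from the central station."""

    def __init__(self, thresholds):
        self.thresholds = thresholds  # e.g. {"heart_rate": 120}

    def classify(self, vitals):
        """Return an injury class, or None when the local unit cannot
        decide and the case must be deferred to the online unit."""
        hr = vitals.get("heart_rate")
        if hr is None:
            return None  # defer to the central station / online unit
        return "severe" if hr > self.thresholds["heart_rate"] else "mild"

    def update_parameters(self, new_thresholds):
        """Adaptive modification sent back by the central station."""
        self.thresholds.update(new_thresholds)
```

In use, a case the local unit cannot classify returns `None` and would be transmitted onward, while a parameter update from the central station immediately changes subsequent local decisions.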
Further, the smart glasses are also provided with a buzzer (4) and a signal light (5).
The buzzer (4) and the signal light (5) are connected with the microcontroller.
In the above-mentioned preferred solution, when the data of the injured indicate
different injury levels, the buzzer (4) and the signal light (5) can send out signals to
quickly alert rescue personnel, suiting large-scale casualty rescue environments where
data must be obtained quickly.
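As an illustration of how different injury levels could trigger the buzzer (4) and signal light (5), the following sketch maps vital signs to a coarse alert level; the threshold values are hypothetical examples, since the patent does not specify the classification values:

```python
# Hypothetical mapping from vital signs to an alert level; the thresholds
# below are illustrative assumptions, not values taken from the patent.
def injury_level(heart_rate, respiratory_rate, spo2):
    """Return a coarse injury level from heart rate (bpm), respiratory
    rate (breaths/min) and blood oxygen saturation (%)."""
    if spo2 < 85 or heart_rate > 140 or respiratory_rate > 30:
        return "immediate"  # sound buzzer (4) and light signal light (5)
    if spo2 < 92 or heart_rate > 110 or respiratory_rate > 24:
        return "urgent"     # light signal light (5) only
    return "delayed"        # no alert
```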
Further, the specific method for the video image processing chip to calculate the
human heart rate according to the video image of the human facial area comprises the
following steps:
S2-1: Use the combination of CEEMDAN algorithm and empty channel detection
technology to remove the noise in the video image information collected by the
network camera (1);
S2-2: According to the mixed signal of the plethysmographic signal reflected by the
video image information obtained by the RGB colour sensor and other light
fluctuation sources caused by artifacts, the independent component analysis method is
used to separate the mixed signal into three independent signal sources;
S2-3: From the three independent signal sources, select the one whose power
spectrum contains the highest peak to analyse the cardiac cycle and use a cubic spline
function to interpolate the signal. Then use a custom algorithm to detect the blood
volume pulse (BVP) in the interpolated signal and obtain the beat intervals from the
detected BVP peaks;
S2-4: Use the non-dependent variable threshold algorithm to filter the obtained beat
intervals to obtain low-frequency power and high-frequency power and normalize the
low-frequency power and high-frequency power to reduce the influence of the total
power change value;
S2-5: According to S2-3, use the Lomb-Scargle periodogram to obtain the power
spectral density graph and finally convert it into a heart rate graph and a respiration
graph;
S2-6: According to the heart rate graph and respiration graph obtained in S2-5, the
heart rate detection, calculation and discriminating program implanted in the video
image processing chip in advance is adopted to analyse the heart rate graph to
calculate the heart rate value.
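Steps S2-3 to S2-5 can be sketched as follows, using SciPy's generic peak detection and Lomb-Scargle periodogram. The ICA separation of S2-2 and the custom BVP detector are replaced here by generic stand-ins, so this is an illustrative approximation rather than the patented algorithm:

```python
import numpy as np
from scipy.signal import find_peaks, lombscargle

def heart_rate_from_bvp(bvp, timestamps):
    """Estimate heart rate from a blood volume pulse (BVP) signal:
    detect BVP peaks (S2-3), derive beat-to-beat intervals, and take the
    dominant frequency of a Lomb-Scargle periodogram (S2-5)."""
    # S2-3: detect BVP peaks and beat-to-beat intervals
    peaks, _ = find_peaks(bvp)
    beat_times = timestamps[peaks]
    if len(beat_times) < 2:
        return None, None
    intervals = np.diff(beat_times)        # seconds per beat

    # S2-5: Lomb-Scargle power spectral density over a plausible band
    freqs = np.linspace(0.7, 3.5, 500)     # 42-210 beats per minute
    pgram = lombscargle(timestamps, bvp - bvp.mean(), 2 * np.pi * freqs)
    hr_spectral = 60.0 * freqs[np.argmax(pgram)]

    # Cross-check: heart rate from the mean beat interval
    hr_interval = 60.0 / intervals.mean()
    return hr_spectral, hr_interval
```

For a clean 1.2 Hz test signal, both estimates land near 72 beats per minute; a real implementation would add the denoising (S2-1), ICA (S2-2) and threshold filtering (S2-4) stages before this step.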
Further, the side wall of the smart glasses is provided with a charging interface
connected with the battery.
In the above-mentioned preferred solution, electric energy storage is realized through
a charging interface; in different embodiments, a replaceable power supply can also
be provided to cooperate with the charging interface to make the use more stable.
Further, the wall surface of the smart glasses is provided with an illuminating light (6)
close to the camera (1).
In the above-mentioned preferred solution, the illuminating light (6) can assist in
rescuing the injured and obtaining relevant information in a dark environment.
The above descriptions are only examples of the invention, and are not used to limit
the protection scope of the invention. For those skilled in the art, the application can
have various modifications and changes. Any modification, equivalent replacement,
improvement, etc. made within the spirit and principle of this invention shall be
included in the protection scope of this invention.

Claims (5)

Claims
1. An intelligent auxiliary decision-making platform for the treatment of large-scale
casualty incidents, characterized in that the platform includes a detection module, a
classification module and a central station.
The detection module includes an inspection unit which is used to obtain the heart rate,
respiratory rate, blood pressure, blood oxygen saturation, body temperature and/or
breathing depth of the injured person;
The classification module includes a local intelligent classification unit. The local
intelligent classification unit is used to classify the injuries based on the information
obtained by the inspection unit; the injury classification includes the classification of
abnormal values corresponding to the specific data obtained by the inspection unit and
the comprehensive risk classification;
The central station includes a remote control room which is used to receive the signal.
2. An intelligent auxiliary decision-making platform for the treatment of large-scale
casualty incidents according to claim 1, characterized in that the inspection unit
further includes wearable smart glasses. The smart glasses include a data collection
device, a data processing device and a display device;
The data collection device includes a camera (1) and infrared components (2)
arranged on the smart glasses and the camera (1) is used to obtain a target image;
The data processing device is used to obtain the blood oxygen saturation of the
injured person from images of facial blood vessel volume changes, and to obtain the
breathing depth and respiratory frequency from images of perioral and perinasal
changes;
The display device includes a virtual display screen (3) displayed on the glasses lens;
The smart glasses include an RGB colour sensor, a moving average filter, a band-pass
filter, a microcontroller, a video image processing chip, memory and a storage battery
installed inside;
The RGB colour sensor is used to obtain the mixed signal of the plethysmographic
signal reflected by the video image information and other light fluctuation sources
caused by artefacts;
The video image processing chip is used to process the video image information of
the human facial target area collected by the camera (1). By calculating and
converting the video image information of the human facial target area, the human
heart rate and respiration graphs are obtained, and by analysing and discriminating
these graphs, the human heart rate is measured;
The measuring method of breathing depth and breathing frequency specifically
includes the following steps:
S1-1: acquire thermal images of the nose and skin;
S1-2: continuous measurement of thermal images of the nose and skin;
S1-3: obtain the highest and lowest temperature (°C) in the bounding box (BB) from
continuously measured thermal images:
T_max(t) = max(M_BB(R, C)), T_min(t) = min(M_BB(R, C)) --- (1)
Among them: R represents the rows of matrix M_BB, C represents the columns of
matrix M_BB, T_max(t) represents the temperature value (°C) of the skin surface and
T_min(t) represents the temperature value (°C) around the nose during breathing;
The breathing cycle function f(t) is:
f(t) = T_max(t) - T_min(t) --- (2)
Among them: the number of peaks in f(t) is the number of breaths and the time
interval between two successive peaks is one breathing cycle.
As a preferred solution, it also includes the following step: the respiratory rate (RR)
is:
RR(i) = 60 / [t(T_max,i) - t(T_min,i)] --- (3)
Among them: i is the cycle order of f(t). In one breathing cycle, t(T_max,i)
represents the time to reach the maximum temperature, and t(T_min,i) represents the
time to reach the minimum temperature.
3. An intelligent auxiliary decision-making platform for the treatment of large-scale
casualty incidents according to claim 2, characterized in that the smart glasses are
provided with a wireless transmission device which includes components of 5G,
Bluetooth, infrared and/or WIFI.
The smart glasses are also provided with a buzzer (4) and a signal light (5). The
buzzer (4) and the signal light (5) are connected with the microcontroller.
The side wall of the smart glasses is provided with a charging interface connected
with the battery.
The wall surface of the smart glasses is provided with an illuminating light (6) close
to the camera (1).
4. An intelligent auxiliary decision-making platform for the treatment of large-scale
casualty incidents according to claim 1, characterized in that the classification module
further includes an online intelligent classification unit. The online intelligent
classification unit includes a clinical data comparison subunit, a medical staff
interaction subunit, and a big data processing subunit.
5. An intelligent auxiliary decision-making platform for the treatment of large-scale
casualty incidents according to claim 2, characterized in that the specific method for
the video image processing chip to calculate the human heart rate according to the
video image of the human facial area comprises the following steps :
S2-1: Use the combination of CEEMDAN algorithm and empty channel detection
technology to remove the noise in the video image information collected by the
network camera (1);
S2-2: According to the mixed signal of the plethysmographic signal reflected by the
video image information obtained by the RGB colour sensor and other light
fluctuation sources caused by artifacts, the independent component analysis method is
used to separate the mixed signal into three independent signal sources;
S2-3: From the three independent signal sources, select the one whose power
spectrum contains the highest peak to analyse the cardiac cycle and use a cubic spline
function to interpolate the signal. Then use a custom algorithm to detect the blood
volume pulse (BVP) in the interpolated signal and obtain the beat intervals from the
detected BVP peaks;
S2-4: Use the non-dependent variable threshold algorithm to filter the obtained beat
intervals to obtain low-frequency power and high-frequency power and normalize the
low-frequency power and high-frequency power to reduce the influence of the total
power change value;
S2-5: According to S2-3, use the Lomb-Scargle periodogram to obtain the power
spectral density graph and finally convert it into a heart rate graph and a respiration
graph;
S2-6: According to the heart rate graph and respiration graph obtained in S2-5, the
heart rate detection, calculation and discriminating program implanted in the video
image processing chip in advance is adopted to analyse the heart rate graph to
calculate the heart rate value.
AU2021105118A 2021-08-09 2021-08-09 An intelligent auxiliary decision-making platform for the treatment of large-scale casualty incidents Ceased AU2021105118A4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2021105118A AU2021105118A4 (en) 2021-08-09 2021-08-09 An intelligent auxiliary decision-making platform for the treatment of large-scale casualty incidents


Publications (1)

Publication Number Publication Date
AU2021105118A4 true AU2021105118A4 (en) 2021-10-07

Family

ID=77922911

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2021105118A Ceased AU2021105118A4 (en) 2021-08-09 2021-08-09 An intelligent auxiliary decision-making platform for the treatment of large-scale casualty incidents

Country Status (1)

Country Link
AU (1) AU2021105118A4 (en)


Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)
MK22 Patent ceased section 143a(d), or expired - non payment of renewal fee or expiry