CN111951949A - Intelligent nursing interaction system for intelligent ward - Google Patents
- Publication number
- CN111951949A (application CN202010831367.3A)
- Authority
- CN
- China
- Prior art keywords: pixel, data, video image, unit, patient
- Prior art date
- Legal status: Granted (the legal status is an assumption by Google, not a legal conclusion)
Classifications
- G16H40/67 — ICT specially adapted for the operation of medical equipment or devices, for remote operation
- G06V10/20 — Image preprocessing
- G06V20/46 — Extracting features or characteristics from video content, e.g. video fingerprints, representative shots or key frames
- G06V20/52 — Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G08B21/043 — Personal-safety alarms based on behaviour analysis, detecting an emergency event, e.g. a fall
- G08B21/0446 — Body-worn sensor means detecting changes of posture, e.g. a fall, inclination, acceleration, gait
- G08B21/0453 — Body-worn sensor means for physiological monitoring, e.g. electrocardiogram, temperature, breathing
- G08B21/0476 — Cameras to detect unsafe conditions, e.g. video cameras
- G08B21/0492 — Sensor dual technology, i.e. two or more technologies collaborating to extract an unsafe condition, e.g. video tracking and RFID tracking
- G08B25/08 — Central-station alarm systems using communication transmission lines
- G16H10/60 — ICT for patient-specific data, e.g. electronic patient records
- G16H40/20 — ICT for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
Abstract
An intelligent nursing interaction system for a smart ward comprises a plurality of nurse station interactive large screens, a data center server, an HIS database, a mobile client, an information management module, a video acquisition module and a cloud platform. The nurse station interactive large screen displays information and provides an information query function, and retrieves the clinical care information stored in the HIS database through the data center server. The mobile client collects patient information in real time, and the video acquisition module captures video images of the patient. The information management module processes the patient information and video images collected by the mobile client and the video acquisition module and transmits them to the cloud platform, from which the nurse station interactive large screen retrieves the information for display. The beneficial effects of the invention are that nurses can keep track of a patient's clinical care information through the interactive large screen at the nurse station, receive the patient's call-for-help signal, and promptly receive an alarm signal when the patient falls.
Description
Technical Field
The invention relates to the field of smart wards, and in particular to an intelligent nursing interaction system for smart wards.
Background
As the service concept of hospitals shifts from "hospital-centered" to "patient-centered", medical information systems will inevitably develop toward the goal of dynamic quality control over the entire medical process. To ensure medical safety, improve nursing quality and standardize nursing behavior, the construction of nurse workstation systems also carries a certain weight in the rigid standards of hospital grade reviews. In practice, however, the informatization of clinical nursing remains low: most hospitals still manage patients through periodic checks by nursing staff and bedside alarm calls, and the many problems that arise in daily nursing cannot be solved promptly and efficiently.
Disclosure of Invention
In view of the above problems, the present invention provides an intelligent nursing interaction system for a smart ward.
The purpose of the invention is achieved by the following technical solution:
An intelligent nursing interaction system for a smart ward comprises a plurality of nurse station interactive large screens, a data center server, an HIS database, a mobile client, an information management module, a video acquisition module and a cloud platform. The nurse station interactive large screens display information and provide an information query function; the HIS database stores clinical care data, which the large screens retrieve through the data center server. The mobile client is worn by the patient and comprises a patient positioning unit, a vital sign monitoring unit, a patient call-for-help unit and a fall alarm unit: the positioning unit collects the patient's position data, the vital sign monitoring unit collects the patient's vital sign data, the call-for-help unit lets the patient send a call-for-help signal, and the fall alarm unit detects falls and sends an alarm signal when the patient falls. The mobile client sends the collected data and any alarm or call-for-help signals to the information management module, which comprises a data receiving unit, a data processing unit, a data analysis unit, an image processing unit and a data sending unit. The data receiving unit receives the data and signals from the mobile client; the data processing unit processes the received vital sign data; the data analysis unit analyzes the patient's position information, and when the patient is inside a dangerous area of the hospital and has stayed there longer than a given safe duration, it directs the video acquisition module to capture video images of the patient and send them to the information management module, where the image processing unit processes them. The data sending unit transmits the position information, the alarm signal, the call-for-help signal, the processed vital sign data and the processed video images to the cloud platform, and the nurse station interactive large screen retrieves the information from the cloud platform for display.
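The danger-zone rule described above (video capture starts only once a patient has stayed in a dangerous area longer than the given safe duration) can be sketched as follows. The patent does not name the zones, the threshold value, or any API, so the class name, the zone labels and the 30-second duration below are illustrative assumptions.

```python
import time

SAFE_DURATION_S = 30.0  # assumed value for the "given safe duration"

class DwellMonitor:
    """Track how long each patient has stayed inside a danger zone."""

    def __init__(self, danger_zones, safe_duration=SAFE_DURATION_S):
        self.danger_zones = set(danger_zones)
        self.safe_duration = safe_duration
        self.entered_at = {}  # patient_id -> timestamp of zone entry

    def update(self, patient_id, zone, now=None):
        """Return True when video capture should be triggered for this patient."""
        now = time.monotonic() if now is None else now
        if zone not in self.danger_zones:
            # leaving the danger zone resets the dwell timer
            self.entered_at.pop(patient_id, None)
            return False
        entry = self.entered_at.setdefault(patient_id, now)
        return (now - entry) > self.safe_duration

monitor = DwellMonitor({"stairwell", "balcony"})
monitor.update("p1", "stairwell", now=0.0)           # patient enters the zone
print(monitor.update("p1", "stairwell", now=31.0))   # dwell exceeded -> True
```

Passing `now` explicitly keeps the sketch testable; a deployment would rely on the monotonic clock default.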
The beneficial effects of the invention are as follows: the intelligent nursing interaction system for a smart ward lets nurses keep track of patient information in real time. Nurses can view a patient's clinical care information on the nurse station interactive large screen, receive the patient's call-for-help signal, and promptly receive an alarm signal when the patient falls, thereby improving both the quality and the informatization level of nursing.
Drawings
The invention is further described with the aid of the accompanying drawing; the embodiments, however, do not limit the invention in any way, and a person skilled in the art can derive further drawings from the following figure without inventive effort.
FIG. 1 is a schematic diagram of the present invention.
Reference numerals:
nurse station interactive large screen; data center server; HIS database; mobile client; information management module; video acquisition module; cloud platform.
Detailed Description
The invention is further described with reference to the following examples.
Referring to Fig. 1, the intelligent nursing interaction system for a smart ward of this embodiment includes a plurality of nurse station interactive large screens, a data center server, an HIS database, a mobile client, an information management module, a video acquisition module and a cloud platform. The interactive large screens are installed at each nurse station to display information and provide an information query function; the HIS database stores clinical care data, which the large screens retrieve through the data center server. The mobile client is worn by the patient and comprises a patient positioning unit, a vital sign monitoring unit, a patient call-for-help unit and a fall alarm unit: the positioning unit collects the patient's position data, the vital sign monitoring unit collects the patient's vital sign data, the call-for-help unit lets the patient send a call-for-help signal, and the fall alarm unit detects falls and sends an alarm signal when the patient falls. The mobile client sends the collected data and any alarm or call-for-help signals to the information management module, which comprises a data receiving unit, a data processing unit, a data analysis unit, an image processing unit and a data sending unit. The data receiving unit receives the data and signals from the mobile client; the data processing unit processes the received vital sign data; the data analysis unit analyzes the patient's position information, and when the patient is inside a dangerous area of the hospital and has stayed there longer than a given safe duration, it directs the video acquisition module to capture video images of the patient and send them to the information management module, where the image processing unit processes them. The data sending unit transmits the position information, the alarm signal, the call-for-help signal, the processed vital sign data and the processed video images to the cloud platform, and the nurse station interactive large screen retrieves the information from the cloud platform for display.
This preferred embodiment provides an intelligent nursing interaction system for a smart ward that lets nurses keep track of patient information in real time: nurses can view a patient's clinical care information on the nurse station interactive large screen, receive the patient's call-for-help signal, and promptly receive an alarm signal when the patient falls, thereby improving both the quality and the informatization level of nursing.
Preferably, to display information and provide the query function, the nurse station interactive large screen comprises an electronic bed map module, an on-duty doctor and nurse management module, a medical care information management module, an admission and transfer management module, an operation and examination schedule management module, a nursing scheduling management module, an infusion execution management module, a care level identification module, a message notification management module, a personnel positioning management module, a video display module, a ward contact module, a nursing handover management module and a vital sign monitoring management module:
- the electronic bed map module displays basic information for every bed patient in the department, color-coded by severity of illness and care level;
- the on-duty doctor and nurse management module displays on-duty doctor information;
- the medical care information management module displays the patients managed by each doctor and nurse;
- the admission and transfer management module displays daily statistics on admissions, discharges, transfers in and transfers out of the department;
- the operation and examination schedule management module displays all test, examination and operation information for the department's patients, so that a nurse can check whether any test or examination has been missed;
- the nursing scheduling management module displays who is on duty each day and lets nurses view the shift details for a week;
- the infusion execution management module displays all of a patient's infusion orders for the day, prioritized by infusion status;
- the care level identification module displays the care level of each bed patient;
- the message notification management module displays important notifications issued by the department;
- the personnel positioning management module displays the patient's position information;
- the video display module displays video images of patients who have stayed in a dangerous area beyond the given safe duration;
- the ward contact module displays patients' alarm and call-for-help information, so that nurses can handle emergencies promptly;
- the nursing handover management module displays nurses' shift handovers and is used for handing over work and leaving messages between outgoing and incoming nurses;
- the vital sign monitoring management module displays the patient's vital sign data in real time.
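The electronic bed map's color coding can be illustrated with a small sketch. The patent only states that different colors are used according to severity and care level; the specific levels, color names and tile structure below are assumptions.

```python
# Assumed mapping from care level to display color (not specified in the patent).
SEVERITY_COLORS = {
    "critical": "red",
    "serious": "orange",
    "level-1 care": "yellow",
    "level-2 care": "green",
    "level-3 care": "white",
}

def bed_tile(bed_no, name, care_level):
    """Build one tile of the electronic bed map with its display color."""
    return {
        "bed": bed_no,
        "name": name,
        "care_level": care_level,
        "color": SEVERITY_COLORS.get(care_level, "gray"),  # gray = unknown level
    }

print(bed_tile(12, "Zhang", "critical")["color"])  # red
```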
This preferred embodiment lets nurses keep track of patient care information in real time and provides a communication channel between nurses. A nurse can monitor through the infusion execution management module whether an infusion is overdue, making it easy to stay informed of a patient's infusion status. The nurse station interactive large screen ensures that notifications reach every nurse in the department, improving the authority and reach of notification information and greatly reducing the head nurse's workload, since the head nurse no longer needs to repeat the content of each notification to every nurse.
Preferably, the patient's vital sign data include body temperature, electrocardiogram and pulse.
Preferably, the data processing unit corrects the received vital sign data. Let x_j(t) denote the data of vital sign j acquired at time t, let x'_j(t) denote its corrected value, and let X_j(t) be a time series of length l built around x_j(t), in which times before t contribute already-corrected values (e.g. x'_j(t-1), the corrected value of the data acquired at time (t-1)) and times after t contribute raw values (e.g. x_j(t+1), the data acquired at time (t+1)). The data in X_j(t) are then screened: each x_j(i) ∈ X_j(t) is assigned a validity detection coefficient ε_j(i), which is compared against a screening threshold H_j(t), and the corrected value x'_j(t) is computed from the screened data. (The expressions for ε_j(i), H_j(t) and x'_j(t) appear as formula images in the original document and are not reproduced in this text.)
the preferred embodiment is used for correcting acquired vital sign data and removing noise pollution in the vital sign data, when the data are corrected, a certain amount of adjacent data before and after the acquisition time of the data to be corrected are introduced to form a time sequence as reference data of the data to be corrected, in the time sequence, the data before the acquisition time of the data to be corrected adopt a corrected data value as the reference data, the accuracy of the reference data is improved, the certain amount of adjacent data after the acquisition time of the data to be corrected are introduced into the time sequence as the reference data, so that the time sequence can effectively reflect the change trend of the vital sign data of a patient, the condition that the data with large numerical value change caused by the problem of the physical condition of the patient are judged by mistake as the noise data can be avoided, and the adjacent data after the acquisition time of the data to be corrected are introduced into the time sequence as the reference data are easy to be judged as the noise data In order to avoid the influence of noise data on the correction result, the preferred embodiment defines the validity detection coefficient to screen the data in the time sequence, so as to avoid the influence on the accuracy of the data correction result because the data to be corrected is the noise data or the adjacent data after the acquisition time of the data to be corrected is the noise data; when the data is corrected, the adjacent data with the collection time close to the collection time of the data to be corrected in the time sequence is endowed with larger weight, and the adjacent data with the collection time far from the collection time of the data to be corrected is endowed with smaller weight, so that the accuracy of the correction result of the data to be corrected is improved.
Preferably, the image processing unit filters the received video images. Let I_t denote the t-th video frame, of size M × N, let I_t(x, y) denote the pixel at coordinate (x, y) in I_t with gray value h_t(x, y), and let h'_t(x, y) denote the gray value of that pixel after filtering by the comprehensive denoising unit. The filter comprises a first denoising unit, a second denoising unit and a comprehensive denoising unit, which produce, respectively, a spatially filtered gray value, a temporally filtered gray value, and their combination h'_t(x, y). The first denoising unit averages over Ω_t(x, y), a local neighborhood of size (2W+1) × (2W+1) centered on I_t(x, y), with similarity weights normalized by a parameter G(x, y); the second denoising unit averages over the K preceding frames, using the already-filtered values h'_{t-l}(x, y) of the pixel at (x, y) in frame I_{t-l}.
The values of W and K are determined as follows. Define U_t(x, y) as the neighborhood detection coefficient of pixel I_t(x, y) and T_t(x, y) as its adjacent-frame detection coefficient. U_t(x, y) is computed over R_t(x, y), a local neighborhood of size (2w+1) × (2w+1) centered on I_t(x, y) containing N_t(x, y) pixels, using spatial coefficients ψ_t(i, j) and ψ_t(x, y) derived from the gray values h_t(i+α, j+β) and h_t(x+α, y+β); T_t(x, y) is computed over the k preceding frames from the filtered values h'_{t-l}(i, j) of the pixels in the corresponding neighborhoods R_{t-l}(x, y). Starting from k = w = 1 and iterating with step 1, the largest values of k and w satisfying T_t(x, y) + U_t(x, y) < H_t(x, y) are taken as K and W, where H_t(x, y) is a given threshold computed from h'_{t-1}(x, y), h'_{t-2}(x, y) and h'_{t-3}(x, y), the filtered gray values of the pixel at (x, y) in the three preceding frames. (The exact expressions for the filtered values, U_t(x, y), T_t(x, y) and H_t(x, y) appear as formula images in the original document and are not reproduced in this text.)
In this preferred embodiment a first denoising unit, a second denoising unit and a comprehensive denoising unit filter the acquired video images. The first denoising unit replaces the gray value of the pixel to be filtered with a weighted average of the gray values of pixels in similar local neighborhoods, which copes well with cases where the pixel itself is heavily contaminated. Moreover, instead of the conventional fixed local-neighborhood window, this embodiment defines a neighborhood detection coefficient, updates it iteratively, and selects the largest window side length that keeps the coefficient below a given threshold as the final window for the pixel. This maximizes the local neighborhood while ensuring the spatial similarity of its pixels, so noise is filtered effectively without destroying the structural features of the image. The second denoising unit exploits the temporal correlation that usually exists between adjacent frames and denoises the current frame using its neighboring frames, preserving temporal continuity. Instead of using only the frames immediately before and after the current frame, or an arbitrarily fixed number of neighbors, this embodiment defines an adjacent-frame detection coefficient to measure the correlation between pixels in the current and neighboring frames, updates it iteratively, and selects the largest number of frames that keeps the coefficient below a given threshold. This maximizes the number of frames participating in denoising while guaranteeing that they remain strongly correlated with the pixel being filtered, so that motion-induced gray-value differences at the same position in adjacent frames do not degrade the result. The comprehensive denoising unit combines the outputs of the first and second units, denoising the current frame effectively while avoiding the video flicker that arises from filtering single frames independently.
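The two-branch structure described above can be sketched as follows: a spatial average over a local neighborhood, a temporal average over the previously filtered frames, and a blend of the two. The patent's similarity weights and the iterative selection of W and K are not reproduced; the fixed window, plain averaging and the 50/50 blend below are simplifying assumptions.

```python
def spatial_filter(frame, w=1):
    """First unit (simplified): average each pixel over its (2w+1)x(2w+1) neighbourhood."""
    h, ncols = len(frame), len(frame[0])
    out = [[0.0] * ncols for _ in range(h)]
    for y in range(h):
        for x in range(ncols):
            vals = [frame[j][i]
                    for j in range(max(0, y - w), min(h, y + w + 1))
                    for i in range(max(0, x - w), min(ncols, x + w + 1))]
            out[y][x] = sum(vals) / len(vals)
    return out

def temporal_filter(frame, prev_filtered):
    """Second unit (simplified): average with the already-filtered preceding frames."""
    if not prev_filtered:
        return [row[:] for row in frame]
    h, ncols = len(frame), len(frame[0])
    k = len(prev_filtered)
    return [[(frame[y][x] + sum(p[y][x] for p in prev_filtered)) / (k + 1)
             for x in range(ncols)] for y in range(h)]

def denoise(frame, prev_filtered, alpha=0.5):
    """Comprehensive unit (simplified): blend the spatial and temporal results."""
    s, t = spatial_filter(frame), temporal_filter(frame, prev_filtered)
    return [[alpha * s[y][x] + (1 - alpha) * t[y][x]
             for x in range(len(frame[0]))] for y in range(len(frame))]
```

A real implementation would weight neighbours by patch similarity (as in non-local means) and grow the window and frame count iteratively as the patent describes.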
Preferably, the image processing unit enhances the filtered video images. Let I'_t denote the frame obtained by filtering I_t, and I'_t(x, y) the pixel at coordinate (x, y) in I'_t, with filtered gray value h'_t(x, y). Define γ_1(x, y) as the first feature detection coefficient of pixel I'_t(x, y) and γ_2(x, y) as its second feature detection coefficient. Both are computed over L'_t(x, y), a local neighborhood of size (2n+1) × (2n+1) centered on I'_t(x, y), where 0 < n < min{M, N}; M(x, y) denotes the number of pixels in L'_t(x, y), h'_t(max) the maximum gray value in I'_t, and M'_t(x, y) the median of the gray values in L'_t(x, y). Two judgment functions are used: η(h'_t(i, j), h'_t(x, y)), which takes the value 1 or 0 depending on how h'_t(i, j) compares with the neighborhood statistics, and λ(L'_t(x, y), f), which equals 1 if a pixel with gray value f exists in L'_t(x, y) and 0 otherwise. Depending on the values of γ_1(x, y) and γ_2(x, y), the enhanced gray value q'_t(x, y) of pixel I'_t(x, y) is computed by one of three piecewise formulas, one of which uses E'_t(x, y), an information value computed from local descriptors of pixel I_t(x, y) in I_t and pixel I'_t(x, y) in I'_t, a local correlation descriptor between the two pixels, and the mean gray values of the neighborhoods L_t(x, y) and L'_t(x, y). (The expressions for γ_1, γ_2, the case thresholds, q'_t(x, y) and E'_t(x, y) appear as formula images in the original document and are not reproduced in this text.)
This preferred embodiment enhances the filtered video image. The first and second feature detection coefficients examine each pixel to be enhanced: the first coefficient effectively determines whether the pixel lies in an edge region or an interior region, and when it lies in an interior region, the second coefficient further measures the amount of detail contained in that region. The pixel's gray value is then enhanced according to the values of the two coefficients, so that edge pixels receive less enhancement, detail-rich regions receive more, and interior regions with little detail receive less. This effectively avoids over-enhancing edge pixels while highlighting the detail in the video image, giving the enhanced image a better visual effect.
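The region-adaptive gain described above can be sketched as follows: edge pixels get little amplification, detail-rich interior pixels more, and flat interior pixels little. The patent's coefficients γ_1/γ_2 and piecewise formulas are not reproduced, so the variance-based classifier and the gain values below are illustrative assumptions.

```python
from statistics import pvariance, mean

EDGE_VAR = 400.0   # assumed variance threshold marking edge-like neighbourhoods
DETAIL_VAR = 25.0  # assumed threshold separating detailed vs flat interiors

def enhance_pixel(value, neighbourhood):
    """Enhance one gray value using a gain chosen from its local region type."""
    local_mean = mean(neighbourhood)
    var = pvariance(neighbourhood)
    if var > EDGE_VAR:        # edge region: enhance least to avoid overshoot
        gain = 1.05
    elif var > DETAIL_VAR:    # detail-rich interior: enhance most
        gain = 1.5
    else:                     # flat interior: mild enhancement
        gain = 1.1
    # amplify the deviation from the local mean, clamp to the 8-bit range
    out = local_mean + gain * (value - local_mean)
    return max(0.0, min(255.0, out))

print(enhance_pixel(100.0, [100.0] * 9))  # 100.0 (flat region unchanged)
```

A full implementation would apply this per pixel over a sliding window of the filtered frame, with the gains driven by the patent's two feature detection coefficients rather than plain variance.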
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solutions of the present invention, not to limit its scope of protection. Although the invention has been described in detail with reference to the preferred embodiments, those skilled in the art will understand that modifications or equivalent substitutions may be made to the technical solutions of the invention without departing from their spirit and scope.
Claims (5)
1. An intelligent nursing interaction system for a smart ward, characterized by comprising a plurality of nurse-station interactive large screens, a data center server, an HIS database, a mobile user terminal, an information management module, a video acquisition module and a cloud platform; wherein the nurse-station interactive large screens are used for displaying information and providing an information query function; the HIS database is used for storing clinical nursing information data, and the nurse-station interactive large screens call the clinical nursing information data stored in the HIS database through the data center server; the mobile user terminal is worn by a patient and comprises a patient positioning unit, a sign monitoring unit, a patient help unit and a fall alarm unit, the patient positioning unit being used for acquiring position data of the patient, the sign monitoring unit for acquiring vital sign data of the patient, the patient help unit for sending a help signal at the patient's request, and the fall alarm unit for detecting a fall of the patient and sending an alarm signal when a fall is detected; the data acquired by the mobile user terminal, together with any alarm and help signals, are sent to the information management module; the information management module comprises a data receiving unit, a data processing unit, a data analysis unit, an image processing unit and a data sending unit, the data receiving unit being used for receiving the data and signals sent by the mobile user terminal, the data processing unit for processing the received vital sign data, and the data analysis unit for analyzing the position information of the patient; when the patient is located in a dangerous area of the hospital and the stay in the dangerous area exceeds a given safe duration, the video acquisition module acquires a video image of the patient and sends it to the information management module, where the image processing unit processes the image; and the data sending unit transmits the position information, alarm signal, help signal, processed vital sign data and processed video image received by the data receiving unit to the cloud platform, from which the nurse-station interactive large screens retrieve the information for display.
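The dwell-time condition in claim 1 (trigger video acquisition when a patient stays in a dangerous area beyond a given safe duration) can be sketched as follows; the zone names, the 120-second safe duration, and the `DwellMonitor` class are hypothetical illustrations, not part of the claim:

```python
import time

DANGER_ZONES = {"stairwell", "balcony"}  # hypothetical danger-zone labels
SAFE_DURATION_S = 120                    # hypothetical safe dwell time, in seconds

class DwellMonitor:
    """Tracks how long each patient has stayed in a danger zone and
    reports when the video acquisition module should be triggered,
    mirroring the data analysis unit's check in claim 1."""

    def __init__(self, safe_duration=SAFE_DURATION_S):
        self.safe_duration = safe_duration
        self._entered = {}  # patient id -> entry timestamp into a danger zone

    def update(self, patient_id, zone, now=None):
        """Feed one position sample; returns True when the stay in a
        danger zone exceeds the safe duration."""
        now = time.time() if now is None else now
        if zone not in DANGER_ZONES:
            # Leaving a danger zone resets the dwell timer.
            self._entered.pop(patient_id, None)
            return False
        entered = self._entered.setdefault(patient_id, now)
        return now - entered > self.safe_duration
```

Each position sample from the patient positioning unit would be fed through `update`, and a `True` result would start the video acquisition for that patient.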
2. The intelligent nursing interaction system for a smart ward of claim 1, wherein the nurse-station interactive large screen, used for displaying information and providing an information query function, comprises an electronic bed chart module, an on-duty doctor and nurse management module, a medical care information management module, an admission, discharge and transfer management module, an operation and examination scheduling management module, a nursing scheduling management module, an infusion execution management module, a nursing identification module, a message notification management module, a personnel positioning management module, a video display module, a ward contact module, a nursing shift management module and a vital sign monitoring management module.
3. The intelligent nursing interaction system for a smart ward of claim 2, wherein the image processing unit is used for filtering the received video image. Let I_t denote the t-th frame of the video, with I_t of size M x N, and let I_t(x, y) denote the pixel of I_t at coordinate (x, y). The filtering of pixel I_t(x, y) involves a first denoising unit, a second denoising unit and a comprehensive denoising unit; the gray values of pixel I_t(x, y) after filtering by the first and second denoising units are denoted by symbols rendered as formula images in the source, and h'_t(x, y) denotes the gray value of pixel I_t(x, y) after filtering by the comprehensive denoising unit. The expressions for these three filtered gray values likewise appear as formula images, wherein G(x, y) is a normalization parameter; I_t(i, j) denotes the pixel of I_t at coordinate (i, j); h_t(i, j) denotes the gray value of pixel I_t(i, j); h'_t-1(x, y) denotes the gray value of the pixel at coordinate (x, y) in the (t-1)-th frame I_t-1 after filtering by the comprehensive denoising unit; Ω_t(x, y) denotes the (2W+1) x (2W+1) local neighborhood centered on pixel I_t(x, y); and K is the number of video frames participating in the second denoising unit's filtering of pixel I_t(x, y). The values of W and K are determined as follows:
Define U_t(x, y) as the neighborhood detection coefficient corresponding to pixel I_t(x, y) and T_t(x, y) as the adjacent-frame detection coefficient corresponding to pixel I_t(x, y); their expressions (formula images in the source) involve the following terms: R_t(x, y) denotes the (2w+1) x (2w+1) local neighborhood centered on pixel I_t(x, y); I_t-l(i, j) denotes the pixel at coordinate (i, j) in video image I_t-l, and h'_t-l(i, j) its gray value after filtering by the comprehensive denoising unit; R_t-l(x, y) denotes the (2w+1) x (2w+1) local neighborhood centered on the pixel at coordinate (x, y) in I_t-l; N_t(x, y) denotes the number of pixels in R_t(x, y); k denotes the number of video frames participating in the adjacent-frame detection for pixel I_t(x, y); ψ_t(i, j) and ψ_t(x, y) denote the spatial coefficients corresponding to pixels I_t(i, j) and I_t(x, y); and h_t(i+α, j+β) and h_t(x+α, y+β) denote the gray values of the pixels of I_t at coordinates (i+α, j+β) and (x+α, y+β), respectively;
Starting from initial values k = w = 1 and iterating with step 1, the largest values of k and w satisfying T_t(x, y) + U_t(x, y) < H_t(x, y) are taken as K and W, where H_t(x, y) is a given threshold (its expression is a formula image) and h'_t-1(x, y), h'_t-2(x, y) and h'_t-3(x, y) denote the gray values of the pixel at coordinate (x, y) in the (t-1)-th, (t-2)-th and (t-3)-th frames, respectively, after filtering by the comprehensive denoising unit.
4. The intelligent nursing interaction system for a smart ward of claim 3, wherein the image processing unit enhances the filtered video image. Let I'_t denote the filtered version of video image I_t and I'_t(x, y) the pixel of I'_t at coordinate (x, y). Define γ1(x, y) as the first feature detection coefficient corresponding to pixel I'_t(x, y) and γ2(x, y) as the second; the expressions for γ1(x, y) and γ2(x, y) appear in the source as formula images, in which:
I'_t(i, j) denotes the pixel of I'_t at coordinate (i, j); h'_t(i, j) denotes the gray value of pixel I'_t(i, j); h'_t(max) denotes the maximum pixel gray value in I'_t; L'_t(x, y) denotes the (2n+1) x (2n+1) local neighborhood centered on pixel I'_t(x, y), where 0 < n < min{M, N}; m(x, y) denotes the number of pixels in L'_t(x, y); η(h'_t(i, j), h'_t(x, y)) is a judgment function equal to 1 when a first condition holds and 0 when the opposite condition holds (the conditions are rendered as formula images); M'_t(x, y) is the median of the pixel gray values in L'_t(x, y); and λ(L'_t(x, y), f) is a judgment function equal to 1 when a pixel with gray value equal to f exists in the local neighborhood L'_t(x, y), and 0 otherwise;
When a first condition holds, the enhanced gray value q'_t(x, y) of pixel I'_t(x, y) takes a first form; when a second pair of conditions holds, q'_t(x, y) takes a second form; and when a third pair of conditions holds, q'_t(x, y) takes a third form (the conditions and the three expressions are rendered as formula images). Here E'_t(x, y) is the information value corresponding to pixel I'_t(x, y), and the expression of E'_t(x, y) is likewise given as a formula image,
in which one image-rendered symbol denotes the local descriptor of pixel I_t(x, y) in video image I_t and another the local descriptor of pixel I'_t(x, y) in the filtered image I'_t; L_t(x, y) denotes the (2n+1) x (2n+1) local neighborhood centered on I_t(x, y), where 0 < n < min{M, N}; a further image-rendered symbol denotes the local correlation descriptor of pixels I_t(x, y) and I'_t(x, y); and the remaining symbols denote the mean gray values of the pixels in the local neighborhoods L_t(x, y) and L'_t(x, y), respectively.
5. The intelligent nursing interaction system for a smart ward of claim 4, wherein the data processing unit corrects the received vital sign data. Let x_j(t) denote the data of vital sign j acquired at time t, and let X_j(t) be a time series of length l ending in x_j(t). In the correction expression (rendered as a formula image), x'_j(t-1) denotes the corrected value of the vital sign j data acquired at time (t-1), x_j(t+1) denotes the vital sign j data acquired at time (t+1), and the remaining image-rendered symbols denote the corrected value being computed and the vital sign j data at the adjacent acquisition times;
The data in the time series X_j(t) are screened as follows: for each datum x_j(i) in X_j(t), a corresponding significance detection coefficient is defined (its symbol and expression are rendered as formula images), wherein x_j(i) denotes the data of vital sign j acquired at time i, with x_j(i) ∈ X_j(t), and H_j(t) is the screening threshold, whose expression is likewise given as a formula image;
The corrected value x'_j(t) of datum x_j(t) is then given by the final formula image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010831367.3A CN111951949B (en) | 2020-01-21 | 2020-01-21 | Intelligent nursing interaction system for intelligent ward |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010831367.3A CN111951949B (en) | 2020-01-21 | 2020-01-21 | Intelligent nursing interaction system for intelligent ward |
CN202010072187.1A CN111292845B (en) | 2020-01-21 | 2020-01-21 | Intelligent nursing interaction system for intelligent ward |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010072187.1A Division CN111292845B (en) | 2020-01-21 | 2020-01-21 | Intelligent nursing interaction system for intelligent ward |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111951949A true CN111951949A (en) | 2020-11-17 |
CN111951949B CN111951949B (en) | 2021-11-09 |
Family
ID=71018965
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010072187.1A Active CN111292845B (en) | 2020-01-21 | 2020-01-21 | Intelligent nursing interaction system for intelligent ward |
CN202010831367.3A Active CN111951949B (en) | 2020-01-21 | 2020-01-21 | Intelligent nursing interaction system for intelligent ward |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010072187.1A Active CN111292845B (en) | 2020-01-21 | 2020-01-21 | Intelligent nursing interaction system for intelligent ward |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN111292845B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111775759B (en) * | 2020-07-10 | 2020-12-15 | 李国安 | New energy automobile fills electric pile intelligent management system |
CN111950450A (en) * | 2020-08-11 | 2020-11-17 | 李国安 | Intelligent medical care management system based on block chain and image processing |
CN112858273B (en) * | 2021-01-13 | 2023-04-21 | 上海玄元医疗科技有限公司 | Intelligent hospital is with former detection kit of dog procalcitonin, detection device of adaptation |
Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020042723A1 (en) * | 2000-05-23 | 2002-04-11 | Rice Marion R. | FDA alert monitoring and alerting healthcare network |
US20050192845A1 (en) * | 2000-10-12 | 2005-09-01 | Brinsfield James W. | Mobile clinical information system |
US20050200486A1 (en) * | 2004-03-11 | 2005-09-15 | Greer Richard S. | Patient visual monitoring system |
CN1735208A (en) * | 2004-07-12 | 2006-02-15 | 微软公司 | Adaptive updates in motion-compensated temporal filtering |
CN101138248A (en) * | 2005-12-07 | 2008-03-05 | 索尼株式会社 | Encoding device, encoding method, encoding program, decoding device, decoding method, and decoding program |
CN101714256A (en) * | 2009-11-13 | 2010-05-26 | 河北工业大学 | Omnibearing vision based method for identifying and positioning dynamic target |
CN102014240A (en) * | 2010-12-01 | 2011-04-13 | 深圳市蓝韵实业有限公司 | Real-time medical video image denoising method |
CN104424628A (en) * | 2013-09-02 | 2015-03-18 | 南京理工大学 | CCD-image-based method for reducing noise by using frame-to-frame correlation |
CN104780295A (en) * | 2015-04-16 | 2015-07-15 | 中国科学院自动化研究所 | Video denoising system based on noise correlation |
CN104796581A (en) * | 2015-04-16 | 2015-07-22 | 中国科学院自动化研究所 | Video denoising system based on noise distribution feature detection |
KR20150136255A (en) * | 2014-05-27 | 2015-12-07 | 심재환 | Patient Monitoring System and Method |
CN105354540A (en) * | 2015-10-22 | 2016-02-24 | 上海鼎松物联网科技有限公司 | Video analysis based method for implementing person fall-down behavior detection |
CN109147889A (en) * | 2018-09-05 | 2019-01-04 | 广州小楠科技有限公司 | A kind of managing medical information platform |
CN109146925A (en) * | 2018-08-23 | 2019-01-04 | 郑州航空工业管理学院 | Conspicuousness object detection method under a kind of dynamic scene |
CN109411039A (en) * | 2018-10-16 | 2019-03-01 | 广州益牛科技有限公司 | A kind of managing medical information platform |
CN110351453A (en) * | 2019-08-16 | 2019-10-18 | 焦作大学 | A kind of computer video data processing method |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8378859B2 (en) * | 2010-07-16 | 2013-02-19 | Apple Inc. | Memory compression technique with low latency per pixel |
CN102592063A (en) * | 2012-03-25 | 2012-07-18 | 河北普康医疗设备有限公司 | Digitalized information management system for nursing stations in hospitals and method for realizing same |
US20150110248A1 (en) * | 2013-10-22 | 2015-04-23 | Yaron Rabi | Radiation detection and method for non-destructive modification of signals |
CN105160175A (en) * | 2015-08-31 | 2015-12-16 | 西安交通大学 | Remote medical monitoring system |
CN108289613B (en) * | 2015-10-22 | 2021-07-06 | 泰拓卡尔有限公司 | System, method and computer program product for physiological monitoring |
CN105701355A (en) * | 2016-02-01 | 2016-06-22 | 中国人民解放军第四军医大学 | Interactive sickroom nursing management system |
CN107133612A (en) * | 2017-06-06 | 2017-09-05 | 河海大学常州校区 | Based on image procossing and the intelligent ward of speech recognition technology and its operation method |
CN107742281A (en) * | 2017-11-06 | 2018-02-27 | 钟永松 | One kind plans accurate urban and rural planning system |
CN109636784B (en) * | 2018-12-06 | 2021-07-27 | 西安电子科技大学 | Image saliency target detection method based on maximum neighborhood and super-pixel segmentation |
2020
- 2020-01-21 CN CN202010072187.1A patent/CN111292845B/en active Active
- 2020-01-21 CN CN202010831367.3A patent/CN111951949B/en active Active
Non-Patent Citations (4)
Title |
---|
XIN LIU et al.: "Automatic motion capture data denoising via filtered subspace clustering and low rank matrix approximation", Elsevier *
Yang Cuiying et al.: "Development and Application of an Intelligent Electronic Bulletin Board for Ward Nurse Stations", Hospital Management Forum *
Qin Jian: "Research on Moving Object Detection and Tracking in Video Sequences", China Doctoral Dissertations Full-text Database, Information Science and Technology *
Lu Qian: "Development and Application of a Mobile Nursing System", China Master's Theses Full-text Database, Information Science and Technology *
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112869722A (en) * | 2021-01-11 | 2021-06-01 | 梅里医疗科技(洋浦)有限责任公司 | Intelligent 5G intelligent acquisition terminal for medical care and aged people based on Internet of things and acquisition method |
CN113284588A (en) * | 2021-05-10 | 2021-08-20 | 深圳市瀚翔工业设计有限公司 | Intelligent ward system |
CN114141347A (en) * | 2021-12-03 | 2022-03-04 | 北京汉王影研科技有限公司 | Intelligent clinical information management system |
CN114141347B (en) * | 2021-12-03 | 2022-07-01 | 北京汉王影研科技有限公司 | Intelligent clinical information management system |
CN114882983A (en) * | 2022-05-13 | 2022-08-09 | 浙江远图技术股份有限公司 | Smart ward interaction method, system and storage medium |
CN114882983B (en) * | 2022-05-13 | 2023-02-10 | 浙江远图技术股份有限公司 | Smart ward interaction method, system and storage medium |
CN116564498A (en) * | 2023-05-19 | 2023-08-08 | 徐州市永康电子科技有限公司 | Multi-parameter monitor and intelligent clinical interaction system thereof |
CN116564498B (en) * | 2023-05-19 | 2024-01-30 | 徐州市永康电子科技有限公司 | Multi-parameter monitor and intelligent clinical interaction system thereof |
CN116363021A (en) * | 2023-06-02 | 2023-06-30 | 中国人民解放军总医院第八医学中心 | Intelligent collection system for nursing and evaluating wound patients |
CN116433537A (en) * | 2023-06-13 | 2023-07-14 | 济南科汛智能科技有限公司 | Intelligent ward monitoring system based on Internet of things and cloud computing |
CN116433537B (en) * | 2023-06-13 | 2023-08-11 | 济南科汛智能科技有限公司 | Intelligent ward monitoring system based on Internet of things and cloud computing |
Also Published As
Publication number | Publication date |
---|---|
CN111292845A (en) | 2020-06-16 |
CN111951949B (en) | 2021-11-09 |
CN111292845B (en) | 2020-11-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111951949B (en) | Intelligent nursing interaction system for intelligent ward | |
CN105184239B (en) | Ward auxiliary healthcare system and auxiliary medical care method based on patient's Expression Recognition | |
US9183351B2 (en) | Mobile system with network-distributed data processing for biomedical applications | |
CN110477925A (en) | A kind of fall detection for home for the aged old man and method for early warning and system | |
CN1291749A (en) | Intelligent system for domestic remote medical monitor and consultation | |
CN113066562A (en) | Medical image transmission method and system based on 5g | |
WO2021082433A1 (en) | Digital pathological image quality control method and apparatus | |
CN104274164A (en) | Blood pressure predicting method and mobile phone based on facial image | |
CN111180055B (en) | Hospital supervision system and method | |
CN107516019A (en) | Noninvasive health forecast system and method | |
CN110559007A (en) | Intelligent diagnosis system and method for skull CT flat scan image | |
CN114511898A (en) | Pain recognition method and device, storage medium and electronic equipment | |
CN106606348A (en) | System for monitoring life safety, health and living safety of the elderly based on mobile communication | |
CN112837788A (en) | Medical image transmission method, device and storage medium | |
CN110111891B (en) | Staff health warning method and system based on face image | |
CN101739501A (en) | Identification system and method for risk degree of sickees | |
CN114360709B (en) | Medical data acquisition and verification system and method based on big data | |
CN212570410U (en) | Intelligent infectious disease pre-inspection and triage machine | |
CN109376635B (en) | A kind of nursing quality checking system and security incident report method | |
CN111739654A (en) | Internet of things-based isolated hospital security supervision method and device and storage medium | |
CN111599430A (en) | Big data analysis system for collecting patient information | |
CN113326745A (en) | Application system for judging and identifying stoma situation through image identification technology | |
CN111462869A (en) | Multifunctional head image diagnosis and treatment system and method for neonate clinic | |
CN111816297A (en) | Cloud-based nCoV virus diagnosis and treatment system and method | |
CN111223542A (en) | Intelligent health management auxiliary system, robot and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
TA01 | Transfer of patent application right ||
Effective date of registration: 2021-10-20

Applicant after: WUHAN BOKE GUOTAI INFORMATION TECHNOLOGY Co., Ltd.
Address after: 2/F, 6 JiangWang Road, Jianghan Economic Development Zone, Jianghan District, Wuhan, Hubei Province, 430000

Applicant before: Meili Medical Technology (Yangpu) Co., Ltd.
Address before: Government Affairs Center Building, No. 8 Yantian Road, Xinyingwan District, Yangpu Economic Development Zone, Hainan Province, 578000
GR01 | Patent grant ||