CN112132112A - Behavior prejudging system - Google Patents

Behavior prejudging system

Info

Publication number
CN112132112A
CN112132112A (application CN202011112907.9A)
Authority
CN
China
Prior art keywords
acquisition device
image
infant
module
background processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011112907.9A
Other languages
Chinese (zh)
Inventor
满延慧
袁聪聪
彭磊
文继任
杨峻骁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Jusha Information Technology Co ltd
Original Assignee
Hunan Jusha Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Jusha Information Technology Co ltd filed Critical Hunan Jusha Information Technology Co ltd
Priority to CN202011112907.9A
Publication of CN112132112A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20: Movements or behaviour, e.g. gesture recognition
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205: Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B5/02055: Simultaneously evaluating both cardiovascular condition and temperature
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024: Detecting, measuring or recording pulse rate or heart rate
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/08: Detecting, measuring or recording devices for evaluating the respiratory organs
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214: Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12: Fingerprints or palmprints
    • G06V40/1365: Matching; Classification
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172: Classification, e.g. identification
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00: Evaluating a particular growth phase or type of persons or animals
    • A61B2503/04: Babies, e.g. for SIDS detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Cardiology (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Physiology (AREA)
  • General Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Pulmonology (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Evolutionary Biology (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The invention relates to the technical field of nursing care and discloses a behavior prejudging system comprising a background processor, a data acquisition module, a wearable nursing module and a client terminal, in which the state of an infant is judged by a neural network. The system can intelligently pre-judge the state the infant is about to enter and provide prompts and alarms for caregivers, reducing their labor cost while understanding the infant's needs quickly and accurately and reducing misjudgments; it therefore has high practical value and broad application prospects.

Description

Behavior prejudging system
Technical Field
The invention relates to the technical field of nursing care, and in particular to a behavior prejudging system.
Background
With the continuous development of modern society, the pace of life keeps accelerating and the pressure of life keeps growing. Because of heavy work and life pressure, together with a lack of infant-care experience, young parents sometimes fail to look after their babies properly. Caring for an infant is heavy and delicate work; in particular, before about two years of age an infant cannot yet express its needs in words. If parents cannot keep track of the infant's state in time and understand its appeals, the adults' repetitive labor increases and the infant's development is also affected to a certain extent.
Furthermore, when an infant who cannot correctly express its actual condition shows potentially dangerous or uncomfortable symptoms, parents or caregivers often fail to notice and respond at the first moment, so valuable time is lost.
Disclosure of Invention
To address the shortcomings of the prior art, the invention provides a behavior prejudging system that solves the problems described above.
The technical solution adopted by the invention to solve these problems is as follows:
A behavior prejudging system comprises a background processor, a data acquisition module, a wearable nursing module and a client terminal, wherein the background processor is bidirectionally connected to the data acquisition module, the wearable nursing module and the client terminal, respectively;
the data acquisition module comprises a temperature and humidity acquisition device, an image acquisition device, a sound acquisition device, a vital-parameter acquisition device and an electromyography (EMG) acquisition device arranged on the infant's limbs and rectus abdominis, and each of these devices is connected to the background processor;
the temperature and humidity acquisition device is arranged under the bedding of the infant's bed and collects the temperature and humidity of the bedding in real time and transmits them to the background processor;
the image acquisition device collects images of the infant in real time and transmits the acquired images to the background processor;
the sound acquisition device collects the infant's crying and other voice information in real time and transmits it to the background processor;
the vital-parameter acquisition device collects the infant's heart rate, body temperature, pulse and respiration information in real time and transmits them to the background processor;
the EMG acquisition device collects the infant's EMG signals in real time and transmits them to the background processor;
the wearable nursing module is worn by a caregiver and is used to receive and view prompts, alarm information and the infant's state transmitted by the background processor;
the client terminal is used to view a real-time display screen transmitted by the background processor and to enter a record display screen; the real-time display screen shows live temperature and humidity data, heart rate, body temperature, pulse, respiration and images, and the record display screen shows recorded data and trend curves of body temperature, heart rate, pulse and respiration by day and by week, as well as the duration or number of times the infant spends in various states;
the background processor receives the data transmitted by the data acquisition module, establishes in advance a background model of the monitoring environment in which the infant is located from the acquired images, and obtains the infant's action state, specifically as follows:
obtain training images and preprocess them: perform data augmentation on the training images and convert the augmented images to grayscale, where the data augmentation comprises applying multiple central rotations, horizontal translations and horizontal flips to the RGB channels of each image and its label, stretching the images to a uniform size, recording the correspondence between pixel coordinates before and after scaling, and storing that correspondence as a mapping table (a sketch of this augmentation step is given after this description);
train a human-pose neural network using the processed images as training samples;
acquire a captured image, estimate the human pose in the image with the trained pose-estimation neural network, and comprehensively judge the action state;
the background processor is further used to autonomously learn, based on a neural network, the ranges in which the infant's vital parameters and EMG parameters lie at each moment, so as to obtain a recognition model.
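The augmentation and grayscale step described above can be pictured with a minimal sketch. It is only an illustration under assumptions: the function name augment_pair, the rotation angles, the shift distance and the target size are not taken from the patent, which does not specify concrete values.

```python
import numpy as np
from PIL import Image

def augment_pair(image, label, angles=(90, 180, 270), shift_px=10,
                 target_size=(256, 256)):
    """Apply central rotations, a horizontal shift and a horizontal flip to an
    image and its label, resize to a uniform size, convert to grayscale, and
    record a pixel-coordinate mapping table between original and resized images."""
    variants = [(image, label)]
    for a in angles:  # central rotations of image and label together
        variants.append((image.rotate(a), label.rotate(a)))
    shift = (1, 0, shift_px, 0, 1, 0)  # horizontal translation matrix
    variants.append((image.transform(image.size, Image.AFFINE, shift),
                     label.transform(label.size, Image.AFFINE, shift)))
    variants.append((image.transpose(Image.FLIP_LEFT_RIGHT),  # horizontal flip
                     label.transpose(Image.FLIP_LEFT_RIGHT)))

    w, h = image.size
    sx, sy = target_size[0] / w, target_size[1] / h
    # mapping table: pixel coordinate before scaling -> coordinate after scaling
    mapping_table = {(x, y): (int(x * sx), int(y * sy))
                     for y in range(h) for x in range(w)}

    processed = [(np.asarray(img.resize(target_size).convert("L")),  # grayscale
                  np.asarray(lbl.resize(target_size)))
                 for img, lbl in variants]
    return processed, mapping_table
```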
Preferably, the client terminal comprises a mobile receiving device and an app installed on that device; the mobile receiving device is one or more of a smart bracelet, a smartphone and a tablet computer.
Preferably, the client terminal further comprises a voice interaction module configured to perform keyword retrieval, specifically: keyword recognition is performed on the local network, relatively complex continuous speech recognition is performed on a cloud server, and the voice interaction module receives the recognition result returned by the cloud and converts it into a corresponding control instruction (a sketch of this local/cloud split follows this paragraph).
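A minimal sketch of how such a local/cloud split might be wired up. The keyword-to-command table, the cloud endpoint URL and the response format are assumptions made for illustration; the patent does not name a specific speech-recognition service or protocol.

```python
import requests  # assumed HTTP client for the hypothetical cloud ASR endpoint

LOCAL_KEYWORDS = {"feeding record": "SHOW_FEEDING_RECORD",   # assumed commands
                  "sleep record": "SHOW_SLEEP_RECORD"}
CLOUD_ASR_URL = "http://cloud.example.com/asr"               # hypothetical URL

def handle_utterance(audio_bytes: bytes, local_transcript: str) -> str:
    """Resolve short keyword queries on the local network; send longer,
    more complex continuous speech to the cloud server and convert the
    returned recognition result into a control instruction."""
    for keyword, command in LOCAL_KEYWORDS.items():
        if keyword in local_transcript.lower():
            return command                       # handled locally
    resp = requests.post(CLOUD_ASR_URL, data=audio_bytes, timeout=5.0)
    text = resp.json().get("transcript", "").strip().lower()
    return LOCAL_KEYWORDS.get(text, "UNKNOWN_COMMAND")
```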
Preferably, the wearable nursing module is further provided with one or more of a face recognition module, a fingerprint recognition module and a password recognition module.
Preferably, the image acquisition device is a high-definition camera with an infrared or night vision function.
Compared with the prior art, the invention has the following beneficial effects:
the method is based on various physiological characteristics and physiological parameters of the infants, and combines an image recognition technology and a neural network technology to obtain preprocessed characteristic data; through modeling in advance, bring the state of infant's normal life into the identification process, the state that the infant is about to get into can intelligent prejudgement infant, can provide certain suggestion and alarm effect for nursing personnel, reduced personnel's labour cost, quick accurate realization reduces the misjudgment to the understanding of infant demand simultaneously, has higher practical value and extensive application prospect.
Drawings
Fig. 1 is a schematic structural diagram of a behavior prejudging system according to an embodiment of the present invention.
Detailed Description
The invention is further described below with reference to the drawings and specific preferred embodiments, without thereby limiting the scope of protection of the invention.
Referring to Fig. 1, the present embodiment provides a behavior prejudging system in which the background processor is bidirectionally connected to the data acquisition module, the wearable nursing module and the client terminal, respectively;
the data acquisition module comprises a temperature and humidity acquisition device, an image acquisition device, a sound acquisition device, a vital-parameter acquisition device and an electromyography (EMG) acquisition device arranged on the infant's limbs and rectus abdominis, and each of these devices is connected to the background processor;
the temperature and humidity acquisition device is arranged under the bedding of the infant's bed and collects the temperature and humidity of the bedding in real time and transmits them to the background processor;
the image acquisition device collects images of the infant in real time and transmits the acquired images to the background processor;
the sound acquisition device collects the infant's crying and other voice information in real time and transmits it to the background processor;
the vital-parameter acquisition device collects the infant's heart rate, body temperature, pulse and respiration information in real time and transmits them to the background processor;
the EMG acquisition device collects the infant's EMG signals in real time and transmits them to the background processor;
the wearable nursing module is worn by a caregiver and is used to receive and view prompts, alarm information and the infant's state transmitted by the background processor;
the client terminal is used to view a real-time display screen transmitted by the background processor and to enter a record display screen; the real-time display screen shows live temperature and humidity data, heart rate, body temperature, pulse, respiration and images, and the record display screen shows recorded data and trend curves of body temperature, heart rate, pulse and respiration by day and by week, as well as the duration or number of times the infant spends in various states;
the background processor receives the data transmitted by the data acquisition module, establishes in advance a background model of the monitoring environment in which the infant is located from the acquired images, obtains the infant's action state, and autonomously learns, based on a neural network, the ranges in which the infant's vital parameters and EMG parameters lie at each moment, so as to obtain a recognition model.
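The patent states only that the normal ranges of the vital parameters and EMG parameters are learned "based on the neural network" to obtain a recognition model, without giving details. As a much-simplified stand-in, the sketch below learns a per-parameter normal range from logged samples with a mean-plus/minus-k-standard-deviations rule and flags readings outside it; the class name, the parameter list and the factor k are assumptions, not the patent's method.

```python
import numpy as np

class ParameterRangeModel:
    """Simplified stand-in for the learned recognition model: one normal
    range per vital/EMG parameter, estimated from historical samples."""

    PARAMS = ("heart_rate", "body_temp", "pulse", "respiration", "emg_rms")  # assumed names

    def __init__(self, k: float = 3.0):
        self.k = k          # half-width of the accepted band, in standard deviations
        self.ranges = {}    # parameter name -> (low, high)

    def fit(self, samples: dict) -> None:
        """samples maps each parameter name to an array of logged readings."""
        for name in self.PARAMS:
            values = np.asarray(samples[name], dtype=float)
            mu, sigma = values.mean(), values.std()
            self.ranges[name] = (mu - self.k * sigma, mu + self.k * sigma)

    def flag_abnormal(self, reading: dict) -> list:
        """Return the parameters whose current reading falls outside the
        learned normal range, as candidates for a caregiver alarm."""
        return [name for name, (low, high) in self.ranges.items()
                if not low <= reading[name] <= high]
```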
In this embodiment, the client terminal comprises a mobile receiving device and an app installed on that device; the mobile receiving device is one or more of a smart bracelet, a smartphone and a tablet computer.
In this embodiment, when the client terminal enters the real-time display screen, it shows live temperature and humidity data, heart rate, body temperature, pulse, respiration and images; when it enters the record display screen, it shows the recorded data and trend curves of body temperature, heart rate, pulse and respiration by day and by week, as well as the duration or number of times the infant has been in various states.
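One possible way to prepare the record-display data described above, sketched under assumptions: the record format (timestamped readings and a labeled state log) and the helper names are illustrative, since the patent does not specify how the records are stored.

```python
from collections import defaultdict

def daily_trend(records, parameter):
    """records: iterable of (timestamp, readings-dict) pairs; returns the per-day
    average of one vital parameter, e.g. for a body-temperature trend curve."""
    buckets = defaultdict(list)
    for ts, readings in records:
        buckets[ts.date()].append(readings[parameter])
    return {day: sum(v) / len(v) for day, v in sorted(buckets.items())}

def state_durations(state_log):
    """state_log: list of (start, end, state) tuples of datetimes and a label;
    returns the total seconds the infant spent in each state for the record screen."""
    totals = defaultdict(float)
    for start, end, state in state_log:
        totals[state] += (end - start).total_seconds()
    return dict(totals)
```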
In this embodiment, the client terminal further comprises a voice interaction module configured to perform keyword retrieval, specifically: keyword recognition is performed on the local network, relatively complex continuous speech recognition is performed on a cloud server, and the voice interaction module receives the recognition result returned by the cloud and converts it into a corresponding control instruction.
In this embodiment, a background model of the monitoring environment in which the infant is located is established in advance from the acquired images, and the infant's action state is obtained, specifically as follows:
obtain training images and preprocess them: perform data augmentation on the training images and convert the augmented images to grayscale, where the data augmentation comprises applying multiple central rotations, horizontal translations and horizontal flips to the RGB channels of each image and its label, stretching the images to a uniform size, recording the correspondence between pixel coordinates before and after scaling, and storing that correspondence as a mapping table;
train a human-pose neural network using the processed images as training samples;
acquire a captured image, estimate the human pose in the image with the trained pose-estimation neural network, and comprehensively judge the action state (a sketch of this inference step follows).
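A minimal sketch of that inference step: run the trained pose network on a captured frame, take its keypoints, and map them to a coarse action state. The keypoint indices, the pose_net interface and the threshold are illustrative assumptions; the patent does not disclose the network architecture or the decision rules used to judge the action state.

```python
import numpy as np

HEAD, LEFT_HIP, RIGHT_HIP = 0, 11, 12   # assumed keypoint indices

def judge_action_state(frame: np.ndarray, pose_net) -> str:
    """pose_net is assumed to map a frame to an (N, 2) array of (x, y)
    keypoints for the infant; a crude geometric rule then labels the pose."""
    keypoints = pose_net(frame)
    head_y = keypoints[HEAD, 1]
    hip_y = (keypoints[LEFT_HIP, 1] + keypoints[RIGHT_HIP, 1]) / 2.0
    # if the head is roughly level with the hips, the infant is lying down;
    # if the head is clearly above the hips, it has sat up or pulled itself up
    if abs(head_y - hip_y) < 0.1 * frame.shape[0]:
        return "lying"
    return "sitting_or_upright"
```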
The wearable nursing module in the embodiment is further provided with one or more of a face recognition module, a fingerprint recognition module and a password recognition module.
The image acquisition device in the embodiment is a high-definition camera with an infrared or night vision function.
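The patent does not say how the background model of the monitoring environment, built in advance from the camera images, is realized. A common choice would be frame-based background subtraction; the OpenCV subtractor and its parameters below are an assumed illustration, not the patent's specified method.

```python
import cv2

# assumed parameters; the patent does not specify the modeling algorithm
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                                detectShadows=False)

def update_background_model(frame):
    """Feed one camera frame into the background model and return the foreground
    mask, i.e. the regions where the infant differs from the learned background."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return subtractor.apply(gray)
```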
The present embodiment also provides an electronic device comprising: a memory for storing executable instructions;
and a processor for communicating with the memory and executing the executable instructions to perform the functional operations of the behavior prejudging system.
The present embodiment also provides a computer-readable storage medium storing a behavior prejudging program that can be executed by one or more processors to implement the functional steps of the behavior prejudging system described above.
The foregoing describes preferred embodiments of the invention and is not to be construed as limiting it in any way. Although the invention has been described with reference to preferred embodiments, it is not limited to them; any simple modification, equivalent change or variation made to the above embodiments in accordance with the technical spirit of the invention, without departing from the content of its technical solution, falls within the scope of protection of the technical solution of the invention.

Claims (5)

1. A behavior prejudging system, characterized by comprising a background processor, a data acquisition module, a wearable nursing module and a client terminal, wherein the background processor is bidirectionally connected to the data acquisition module, the wearable nursing module and the client terminal, respectively;
the data acquisition module comprises a temperature and humidity acquisition device, an image acquisition device, a sound acquisition device, a vital-parameter acquisition device and an electromyography (EMG) acquisition device arranged on the infant's limbs and rectus abdominis, and each of these devices is connected to the background processor;
the temperature and humidity acquisition device is arranged under the bedding of the infant's bed and collects the temperature and humidity of the bedding in real time and transmits them to the background processor;
the image acquisition device collects images of the infant in real time and transmits the acquired images to the background processor;
the sound acquisition device collects the infant's crying and other voice information in real time and transmits it to the background processor;
the vital-parameter acquisition device collects the infant's heart rate, body temperature, pulse and respiration information in real time and transmits them to the background processor;
the EMG acquisition device collects the infant's EMG signals in real time and transmits them to the background processor;
the wearable nursing module is worn by a caregiver and is used to receive and view prompts, alarm information and the infant's state transmitted by the background processor;
the client terminal is used to view a real-time display screen transmitted by the background processor and to enter a record display screen; the real-time display screen shows live temperature and humidity data, heart rate, body temperature, pulse, respiration and images, and the record display screen shows recorded data and trend curves of body temperature, heart rate, pulse and respiration by day and by week, as well as the duration or number of times the infant spends in various states;
the background processor receives the data transmitted by the data acquisition module, establishes in advance a background model of the monitoring environment in which the infant is located from the acquired images, and obtains the infant's action state, specifically as follows:
obtain training images and preprocess them: perform data augmentation on the training images and convert the augmented images to grayscale, where the data augmentation comprises applying multiple central rotations, horizontal translations and horizontal flips to the RGB channels of each image and its label, stretching the images to a uniform size, recording the correspondence between pixel coordinates before and after scaling, and storing that correspondence as a mapping table;
train a human-pose neural network using the processed images as training samples;
acquire a captured image, estimate the human pose in the image with the trained pose-estimation neural network, and comprehensively judge the action state;
the background processor is further used to autonomously learn, based on a neural network, the ranges in which the infant's vital parameters and EMG parameters lie at each moment, so as to obtain a recognition model.
2. The behavior prejudging system of claim 1, wherein the client terminal comprises a mobile receiving device; the mobile receiving device is one or more of a smart bracelet, a smartphone and a tablet computer.
3. The behavior prejudging system of claim 1, wherein the client terminal further comprises a voice interaction module configured to perform keyword retrieval, specifically: keyword recognition is performed on the local network, relatively complex continuous speech recognition is performed on a cloud server, and the voice interaction module receives the recognition result returned by the cloud and converts it into a corresponding control instruction.
4. The behavior prejudging system of claim 1, wherein the wearable nursing module is further provided with one or more of a face recognition module, a fingerprint recognition module and a password recognition module.
5. The behavior prejudging system of claim 1, wherein the image acquisition device is a high-definition camera with an infrared or night-vision function.
CN202011112907.9A 2020-10-16 2020-10-16 Behavior prejudging system Pending CN112132112A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011112907.9A CN112132112A (en) 2020-10-16 2020-10-16 Behavior prejudging system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011112907.9A CN112132112A (en) 2020-10-16 2020-10-16 Behavior prejudging system

Publications (1)

Publication Number Publication Date
CN112132112A (en) 2020-12-25

Family

ID=73853170

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011112907.9A Pending CN112132112A (en) 2020-10-16 2020-10-16 Behavior prejudging system

Country Status (1)

Country Link
CN (1) CN112132112A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201225