CN113744881A - Method and system for generating human body sign types - Google Patents
- Publication number
- CN113744881A (application CN202111064286.6A)
- Authority
- CN
- China
- Prior art keywords
- human body
- body sign
- data
- type
- health rule
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/055—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
- A61B6/032—Transmission computed tomography [CT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
- A61B6/037—Emission tomography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2415—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/243—Classification techniques relating to the number of classes
- G06F18/24323—Tree-organised classifiers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/60—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to nutrition control, e.g. diets
Abstract
The embodiments of this specification provide a method for generating human body sign types. The method comprises: acquiring human body sign information of a target object; and determining the human body sign type of the target object based on a preset health rule according to the human body sign information, wherein the preset health rule is obtained by training an artificial intelligence algorithm on relevant sample data, the human body sign type characterizes the degree of difference between the human body sign information of the target object and standard human body sign information, and different types of target objects correspond to different human body sign standards.
Description
Technical Field
The present disclosure relates to the field of artificial intelligence, and more particularly, to a method and system for generating human body sign types.
Background
With the continuous development and rise of AI (artificial intelligence) technology in recent years, AI has come into wide use in many fields (such as license plate recognition, face recognition, and computer-aided medical diagnosis). Although mining various rules (including health rules) from data with AI models saves time and labor, the decision-making process of many high-performing AI models is a black box that humans cannot interpret. Better results are therefore likely to be obtained when AI-assisted mining is grounded in human knowledge; however, how to embed human knowledge into an AI model while still explicitly presenting and modifying its rules remains a difficult problem. In addition, the physical sign type characterizes a person's health condition from one aspect and can help a person better understand their reactions to various environmental factors, yet there is currently no good method for determining physical sign types with AI.
Therefore, a method is needed that determines physical sign types with AI and performs effective, interpretable (visual) rule mining.
Disclosure of Invention
One embodiment of the present specification provides a method for generating human body sign types. The method comprises the following steps: acquiring human body sign information of a target object; and determining the human body sign type of the target object based on a preset health rule according to the human body sign information, wherein the preset health rule is obtained by training an artificial intelligence algorithm on relevant sample data, the human body sign type characterizes the degree of difference between the human body sign information of the target object and standard human body sign information, and different types of target objects correspond to different human body sign standards.
One embodiment of the present specification provides a system for generating human body sign types, comprising: an acquisition module for acquiring human body sign information of a target object; and a determination module for determining the human body sign type of the target object based on a preset health rule according to the human body sign information, wherein the preset health rule is obtained by training an artificial intelligence algorithm on relevant sample data, the human body sign type characterizes the degree of difference between the human body sign information of the target object and standard human body sign information, and different types of target objects correspond to different human body sign standards.
One embodiment of the present specification provides an apparatus for generating human body sign types, comprising a processor configured to execute the method of generating human body sign types.
One embodiment of the present specification provides a computer-readable storage medium storing computer instructions which, when read by a computer, cause the computer to execute the method of generating human body sign types.
Drawings
The present description will be further explained by way of exemplary embodiments, which will be described in detail by way of the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like numerals are used to indicate like structures, wherein:
FIG. 1 is a schematic diagram of an application scenario of a system for generating human body sign types according to some embodiments of the present description;
FIG. 2 is an exemplary flow diagram of a method of generating human vital sign types, shown in accordance with some embodiments of the present description;
FIG. 3 is an exemplary flow diagram of a method of generating human vital sign types, shown in accordance with some embodiments of the present description;
FIG. 4 is an exemplary flow diagram of a method of generating human vital sign types, shown in accordance with some embodiments of the present description;
FIGS. 5A-5C are schematic diagrams of a prediction model according to some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the description of the embodiments will be briefly described below. It is obvious that the drawings in the following description are only examples or embodiments of the present description, and that for a person skilled in the art, the present description can also be applied to other similar scenarios on the basis of these drawings without inventive effort. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "apparatus", "unit" and/or "module" as used herein is a method for distinguishing different components, elements, parts, portions or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; the steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Flow charts are used in this description to illustrate operations performed by a system according to embodiments of the present description. It should be understood that the operations are not necessarily performed in the exact order shown. Rather, the various steps may be processed in reverse order or simultaneously. Likewise, other operations may be added to the processes, or one or more steps may be removed from them.
The embodiments of this application relate to a method and system for generating human body sign types, which can be applied to processes such as intelligent assisted health-state assessment and intelligent nutrition planning. They effectively address the difficulties of complex workflows and opaque decision principles. With the method and system for generating human body sign types, one or more of the following functions can be realized: mining and displaying the health rules embedded in a model, and organically combining the mined health rules with health knowledge and clinical experience. The method and system can achieve one or more beneficial effects, such as efficient rule mining and improved accuracy.
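As a concrete (purely hypothetical) illustration of presenting mined rules in a human-readable, expert-modifiable form, the sketch below walks a small decision tree — the description elsewhere uses tree-organised classifiers as one example model — and emits explicit IF/THEN health rules. The tree structure, feature names (`blood_glucose_rise_after_lactose`, `symptom_score`), and thresholds are invented for illustration and are not taken from this patent.

```python
# Hypothetical sketch: turning a trained decision tree into explicit,
# reviewable health rules. The tree below is hand-written for illustration;
# in the described system it would come from training on relevant sample data.

def extract_rules(node, conditions=None):
    """Walk a decision tree (nested dicts) and emit IF/THEN rule strings."""
    conditions = conditions or []
    if "label" in node:                      # leaf: a sign-type conclusion
        premise = " AND ".join(conditions) or "TRUE"
        return [f"IF {premise} THEN sign_type = {node['label']}"]
    feat, thr = node["feature"], node["threshold"]
    rules = []
    rules += extract_rules(node["left"], conditions + [f"{feat} <= {thr}"])
    rules += extract_rules(node["right"], conditions + [f"{feat} > {thr}"])
    return rules

# Toy tree for lactose tolerance (thresholds are illustrative, not clinical).
tree = {
    "feature": "blood_glucose_rise_after_lactose", "threshold": 1.1,
    "left": {"label": "normal"},
    "right": {
        "feature": "symptom_score", "threshold": 3,
        "left": {"label": "normal"},
        "right": {"label": "low_tolerance"},
    },
}

rules = extract_rules(tree)
for r in rules:
    print(r)
```

Rules in this textual form could then be displayed to a relevant expert, who adds, deletes, or modifies them before they become the preset health rules.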
It should be noted that the above examples are only for illustrative purposes, and are not intended to limit the application scenarios of the technical solutions disclosed in the present specification. The technical solutions disclosed in the present specification are explained in detail by the description of the drawings below.
Fig. 1 is a schematic diagram of an application scenario of a system for generating human body sign types according to some embodiments of the present description.
As shown in fig. 1, a system 100 for generating human body sign types may include a detection device 110, a network 120, a terminal 130, a processing device 140, and a storage device 150.
The detection device 110 may be used to detect a target subject to obtain vital sign data of the target subject. The detection device 110 may be an analysis device, e.g., a blood analysis device or a urine analysis device. The detection device 110 may also be an imaging device, for example, an ultrasound imaging device, a CT (computed tomography) device, a PET (positron emission tomography) device, an MRI (magnetic resonance imaging) device, a PET-CT device, a PET-MRI device, or the like.
The terminal 130 may include a mobile device 131, a tablet computer 132, a notebook computer 133, and the like, or any combination thereof. In some embodiments, terminal 130 may interact with other components in system 100 over a network. For example, the terminal 130 may send one or more control instructions to the detection device 110 to control the detection device 110 to detect the target object according to the instructions. In some embodiments, the terminal 130 may be part of the processing device 140. In some embodiments, the terminal 130 may be integrated with the processing device 140 as an operating console for the detection device 110. For example, a user/operator of the system 100 (e.g., a health industry practitioner) may control the operation of the detection device 110 via the console, such as scanning a target object, etc.
The storage device 150 may store data (e.g., human sign data, health rule data, etc. of the target subject), instructions, and/or any other information. In some embodiments, storage device 150 may store data obtained from detection device 110, terminal 130, and/or processing device 140, e.g., storage device 150 may store first and second images, etc. of a target object obtained from detection device 110. In some embodiments, storage device 150 may store data and/or instructions that processing device 140 may execute or use to perform the example methods described herein. In some embodiments, the storage device 150 may include one or a combination of mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like. In some embodiments, the storage device 150 may be implemented by a cloud platform as described herein. For example, the cloud platform may include one or a combination of private cloud, public cloud, hybrid cloud, community cloud, distributed cloud, cross-cloud, multi-cloud, and the like.
In some embodiments, storage device 150 may be connected to network 120 to enable communication with one or more components in system 100 (e.g., processing device 140, terminal 130, etc.). One or more components in system 100 may read data or instructions from storage device 150 via network 120. In some embodiments, the storage device 150 may be part of the processing device 140 or may be separate and directly or indirectly coupled to the processing device.
In some embodiments, the system 100 for generating human body sign types can include an acquisition module and a determination module. In some embodiments, the system may further include a preprocessing module and a modification module.
The acquisition module can be used for acquiring human body sign information of the target object.
The judgment module can be used for determining the human body sign type of the target object based on a preset health rule according to the human body sign information, wherein: the preset health rule is obtained according to an artificial intelligence algorithm and relevant sample data training, the human body sign type is used for representing the difference degree of human body sign information of a target object and human body sign standard information, and different types of target objects correspond to different human body sign standards.
The preprocessing module is used for acquiring raw data of relevant sample features and preprocessing the raw data to generate training data for those sample features, wherein the sample features include one or more of purely numeric features, purely categorical features, and mixed-type features.
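The patent does not specify the preprocessing steps; a minimal sketch of what they might look like is given below, assuming min-max scaling for numeric features and one-hot encoding for categorical ones. The record fields (`glucose`, `blood_type`) and values are invented for illustration.

```python
# Hypothetical preprocessing sketch: turning raw sample records into numeric
# training vectors for the rule-mining model.

def preprocess(records, numeric_keys, categorical_keys):
    """Min-max scale numeric features and one-hot encode categorical ones."""
    # Collect per-feature ranges and category vocabularies.
    lo = {k: min(r[k] for r in records) for k in numeric_keys}
    hi = {k: max(r[k] for r in records) for k in numeric_keys}
    vocab = {k: sorted({r[k] for r in records}) for k in categorical_keys}

    rows = []
    for r in records:
        row = []
        for k in numeric_keys:                 # scale to [0, 1]
            span = hi[k] - lo[k]
            row.append((r[k] - lo[k]) / span if span else 0.0)
        for k in categorical_keys:             # one-hot encoding
            row.extend(1.0 if r[k] == v else 0.0 for v in vocab[k])
        rows.append(row)
    return rows

raw = [
    {"glucose": 5.0, "blood_type": "A"},
    {"glucose": 9.0, "blood_type": "O"},
    {"glucose": 7.0, "blood_type": "A"},
]
print(preprocess(raw, ["glucose"], ["blood_type"]))
```

A mixed-type feature would simply contribute both a scaled numeric column and one-hot columns under this scheme.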
The modification module is used for adjusting at least one initial health rule to generate at least one target health rule, including: performing at least one of addition, deletion, and modification on the at least one initial health rule (e.g., by a relevant expert) to generate the at least one target health rule.
It should be noted that the above description of the system for generating human body sign types and its modules is only for convenience of description and does not limit the present description to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the principles of the system, modules may be combined arbitrarily or connected to other modules as sub-systems without departing from those principles. In some embodiments, the acquisition module and the determination module disclosed in FIG. 1 may be different modules in one system, or one module may implement the functions of two or more modules. For example, the modules may share one storage module, or each module may have its own storage module. Such variations are within the scope of the present disclosure.
Fig. 2 is an exemplary flow diagram of a method of generating human body sign types, shown in accordance with some embodiments of the present description. As shown in fig. 2, the process 200 includes the following steps. In some embodiments, the process 200 may be performed by the processing device 140.
And step 210, acquiring human body sign information of the target object. In some embodiments, step 210 is performed by an acquisition module.
The target object is the subject to be examined, for example, a patient, a physical-examination client, or a drug-trial volunteer.
The human body sign information is information and/or data reflecting physiological characteristics of the target subject.
In some embodiments, the human body sign information includes one or more of blood routine data, urine routine data, hormone data, trace element data, and antigen-antibody data. In some embodiments, the human body sign information includes fused data of one or more of blood type, bone density, bone age, body fat rate, weight, body water, BMI, skeletal muscle rate, body age, visceral fat, muscle mass, protein, body score, bone mass, weight differential, body mass differential, fat-free weight, subcutaneous fat rate, basal metabolic rate, body type judgment, and fat weight. In some embodiments, the human body sign information includes one or more of electrocardiogram data and lung function data. In some embodiments, the human body sign information includes image data from one or more of MRI, XR, PET, SPECT, CT, and ultrasound.
In some embodiments, the acquisition module may read the human body sign information of the target subject from a database. In some embodiments, the acquisition module may directly or indirectly process the detection data to obtain the human body sign information of the target subject. For example, blood routine data and urine routine data are read. For another example, data in a detection image such as CT or ultrasound is acquired by image recognition.
And step 220, determining the human body sign type of the target object based on a preset health rule according to the human body sign information. In some embodiments, step 220 is performed by a decision module.
The human body sign type may refer to a human body tolerance type, which characterizes differences in the body's adaptability to the environment, nutrients, drugs, and the like.
In some embodiments, human sign types can include air temperature tolerance, air tolerance, ultraviolet light tolerance, nutrient (lactose, protein, etc.) tolerance, cosmetic tolerance, drug tolerance, and the like.
In some embodiments, the human sign types may be classified as: high tolerance type, normal type (corresponding to human body physical sign standard information), low tolerance type and the like.
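The classification step above can be sketched as comparing a target object's measured sign value against a standard range that depends on the object's type (different types of target objects correspond to different standards). The sign name, object types, ranges, and score units below are invented assumptions for illustration, not clinical values.

```python
# Hypothetical sketch of classifying a sign value into high/normal/low
# tolerance against type-specific standard ranges (all numbers made up).

STANDARDS = {
    # (object_type, sign): (low_bound, high_bound) — illustrative only
    ("adult", "lactose_tolerance_score"): (40.0, 70.0),
    ("child", "lactose_tolerance_score"): (30.0, 60.0),
}

def classify_sign_type(object_type, sign, value):
    """Label the value relative to the standard range for this object type."""
    lo, hi = STANDARDS[(object_type, sign)]
    if value < lo:
        return "low_tolerance"
    if value > hi:
        return "high_tolerance"
    return "normal"

# The same value can map to different types for different object types.
print(classify_sign_type("adult", "lactose_tolerance_score", 35.0))
print(classify_sign_type("child", "lactose_tolerance_score", 35.0))
```

In the described system, the thresholds would come from the preset health rules trained on relevant sample data rather than from a hand-written table.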
In some embodiments, the human sign type may be used as an intermediate result, in combination with other factors, to determine the health condition of the person. In some embodiments, the human sign type may be used for non-diagnostic purposes.
A normal population and a low-tolerance (or high-tolerance) population often show the same symptoms but have different diseases. For example, although the two populations present the same symptoms, the diseases behind those symptoms may differ in whole or in part; or the two populations present the same symptoms, yet the normal population suffers from a certain disease while the low-tolerance (or high-tolerance) population does not.
Several specific examples: 1) In a normal population, high blood sugar likely indicates diabetes; in a lactose intolerant population, blood sugar elevated by consuming milk or milk-containing foods likely does not indicate diabetes. 2) In a normal population, symptoms such as fever in summer may indicate epidemic encephalitis B, meningitis, or cerebral malaria; in a low temperature tolerance population, a rise in body temperature in summer may simply be heatstroke. 3) In a normal population, sneezing and coughing probably indicate a respiratory disease; in a low air tolerance population, they may simply reflect a failure to acclimatize to the local environment. 4) In a normal population, diarrhea and bloody stool may indicate a digestive tract disease; in a low protein tolerance population, they may be an allergic reaction.
By taking the human body sign type as an intermediate result and combining it with examination data, clinical symptoms, and the like, the health condition of the target person can be judged; compared with leaving the sign type of the target person undetermined, this is more conducive to improving the accuracy of the judgment.
In some embodiments, the examination protocol may be formulated with assistance from human sign types. For example, a radiological examination plan is prepared based on the tolerance of the target object to a tracer or a contrast agent.
In some embodiments, the dietary regimen may be assisted by human sign type. For example, a kindergarten may develop an appropriate recipe for each individual child based on their tolerance to nutrients.
In some embodiments, a cosmetic regimen or the like may be assisted by a human sign type. For example, cosmetic packages and the like are recommended in accordance with the cosmetic tolerance.
In some embodiments, the travel scheme may be assisted by human body sign types. For example, extreme sports organizations, scientific teams, etc. may prepare equipment, plan trips, etc. based on the tolerance of a person to air, temperature, ultraviolet light, etc.
A health rule is a rule for judging the human body sign type from medical information. For example, a blood glucose concentration below 200 mg/L 1 hour after a lactose tolerance test indicates lactose intolerance. The same human body sign type may correspond to more than one health rule; for example, a low lactose tolerance type may correspond to: 1) a blood glucose concentration below 200 mg/L 1 hour after a lactose tolerance test; or 2) a breath hydrogen concentration 3 hours after a hydrogen breath test that exceeds the fasting level by more than 2×10⁻⁵ mol/L.
The preset health rule is obtained by training with an artificial intelligence algorithm and relevant sample data.
In some embodiments, the artificial intelligence algorithm may include a machine learning model, which may include, but is not limited to, a neural network model, a support vector machine model, a k-nearest neighbor model, a decision tree model, and/or the like. The neural network model may include one or more of LeNet, GoogLeNet, ImageNet, AlexNet, VGG, ResNet, and the like.
In some embodiments, the relevant sample data comprises clinical outcome data, conforming to medical knowledge rules, for populations with similar clinical features. In some embodiments, the relevant sample data comprises big data from the physical examination industry.
For more description of obtaining the preset health rule by training with an artificial intelligence algorithm and relevant sample data, see fig. 3.
In some embodiments, the determination module may determine whether the human body sign information of the target object meets a preset health rule, and thereby determine the human body sign type of the target object. For example, if the blood glucose concentration of the target object is below 200 mg/L 1 hour after a lactose tolerance test, the target object is of the lactose intolerance type; if the blood glucose concentration is not below 200 mg/L 1 hour after the test, the target object is of the lactose normal type.
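As a minimal illustration of this rule check, the sketch below encodes the lactose example; the threshold and type labels follow the text, while the function name is hypothetical:

```python
def classify_lactose_tolerance(blood_glucose_mg_per_l: float) -> str:
    """Apply the example health rule: blood glucose below 200 mg/L
    one hour after a lactose tolerance test indicates the lactose
    intolerance type; otherwise the lactose normal type."""
    if blood_glucose_mg_per_l < 200:
        return "lactose intolerance type"
    return "lactose normal type"


# Usage: a target object measured at 150 mg/L after 1 hour
print(classify_lactose_tolerance(150))  # lactose intolerance type
```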
It should be noted that the above description related to the flow 200 is only for illustration and description, and does not limit the applicable scope of the present specification. Various modifications and alterations to flow 200 will be apparent to those skilled in the art in light of this description. However, such modifications and variations are intended to be within the scope of the present description.
Fig. 3 is an exemplary flow diagram of a method of generating human sign types, shown in accordance with some embodiments of the present description.
At step 310, a predictive model is generated based on the training data and the candidate models.
In some embodiments, the candidate model may be trained on a number of labeled training samples to update the parameters of the candidate model and obtain the prediction model. In some embodiments, the labeled training samples are input to the candidate model, and the parameters of the candidate model are updated through training iterations.
In some embodiments, a training sample may be human body sign information, and its label may be a human body sign type. For example, human body sign information of a low tolerance population is taken as a positive sample and labeled 1; human body sign information of a normal population is taken as a neutral sample and labeled 0; human body sign information of a high tolerance population is taken as a negative sample and labeled -1. In some embodiments, training data may be prepared as in steps 312-314.
At step 312, raw data of the relevant sample features is obtained.
In some embodiments, raw data for the relevant sample features may be collected from a database (e.g., a database of a medical institution or a physical examination center). Raw data includes, but is not limited to, examination data such as blood routine and urine routine results. The raw data may be numerical, such as a white blood cell level of 0.15, or categorical, such as a related disease history of 'dizziness' or a related disease history of 'none'.
Step 314, preprocessing the raw data to generate the training data of the relevant sample features.
In some embodiments, the preprocessing comprises: examining each feature column of the raw data and classifying it as pure numerical, mixed, or pure categorical; recording the type of each column; and building a corresponding encoding dictionary to store this information.
A pure numerical feature contains only numerical values; its raw data is retained as-is.
A pure categorical feature can be transformed. For example, if the related disease history takes three values [dizziness, lymph abnormality, none], three columns of data can be added, recorded as [related disease history_dizziness, related disease history_lymph abnormality, related disease history_none]. For a case whose related disease history is marked 'dizziness' in the raw data, the related disease history_dizziness column is set to 1 and the others to 0; the same operation is performed for the two columns [related disease history_lymph abnormality, related disease history_none]. After the operation the original column, i.e. the related disease history column, is deleted.
A mixed type feature contains both numerical and categorical values. For a mixed type feature, all categorical values may be singled out. For example, the red blood cell count may take a numerical value, such as '0.15', or one of two categorical values ('too high' and 'too low'). The two categorical values are then processed in the same way as a pure categorical feature, adding two new columns (red blood cell count_too high, red blood cell count_too low). The original column, however, is not deleted: the cells labeled 'too high' or 'too low' in it may be filled with an arbitrary value (e.g., -999), and the positions of these cells are recorded. When training or predicting on this column, these samples may either be removed or be kept to train together.
Existing AI data processing methods generally handle only pure numerical and pure categorical features. Compared with such general methods, this mixed-feature processing retains more of the original information, which can significantly improve the accuracy of the finally trained model and thus generate better health rules.
After this processing, the entire dataset becomes fully numerical feature data. Missing values are then filled in; the filling method may be arbitrary, for example mean filling or zero filling.
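The column-splitting steps above (pure numerical pass-through, one-hot expansion of categorical values, sentinel filling for the categorical cells of a mixed column) can be sketched for a single column as follows. This is a simplified illustration: the function name, the -999 sentinel, and the layout of the returned encoding dictionary are choices made here, not mandated by the text.

```python
SENTINEL = -999.0  # arbitrary filler for categorical cells in a mixed column


def preprocess_column(name, values):
    """Split one raw feature column into all-numerical columns.

    Returns (columns, coding_dict, sentinel_rows): a pure numerical column
    passes through; categorical values become 0/1 indicator columns named
    '<feature>_<value>'; a mixed column additionally keeps its numeric cells
    and records which rows were filled with the sentinel."""
    numeric = [isinstance(v, (int, float)) for v in values]
    if all(numeric):  # pure numerical: retain raw data unchanged
        return {name: list(values)}, {name: "numerical"}, []
    categories = sorted({v for v, n in zip(values, numeric) if not n})
    cols = {f"{name}_{c}": [1.0 if v == c else 0.0 for v in values]
            for c in categories}  # one indicator column per category value
    if not any(numeric):  # pure categorical: original column is deleted
        return cols, {name: ("categorical", categories)}, []
    # mixed: keep the original column, fill categorical cells with SENTINEL
    sentinel_rows = [i for i, n in enumerate(numeric) if not n]
    cols[name] = [v if n else SENTINEL for v, n in zip(values, numeric)]
    return cols, {name: ("mixed", categories)}, sentinel_rows


# Usage: the red blood cell count column from the text's example
cols, coding, rows = preprocess_column("rbc_count",
                                       [0.15, "too high", 0.2, "too low"])
print(cols["rbc_count"])  # [0.15, -999.0, 0.2, -999.0]
```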
In some embodiments, the candidate models may be trained in various common training methods using the processed training data described above to generate predictive models. In some embodiments, the candidate model is a decision tree model, each node of the decision tree model corresponding to a health rule.
In some embodiments, the sampling mode (e.g., with replacement, without replacement, weighted) and the sampling ratio (e.g., 80% of the data sampled each time) of the training process may be predetermined. In some embodiments, a model complexity sequence may be preset; for example, with the complexity sequence [1, 2, 3], three decision tree models with tree heights 1, 2, and 3 are generated during training.
In some embodiments, N candidate models may be trained according to the above settings (sampling mode, sampling ratio, complexity sequence, etc.), yielding N prediction models, where N is a multiple of the length of the complexity sequence. It can be understood that, because of the random sampling, the N trained prediction models differ from one another.
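The sampling settings above can be sketched as a loop that trains one candidate per tree height per repeat, so that N is a multiple of the complexity sequence length. Both helper names are hypothetical, and `train_tree` stands in for any concrete decision tree training routine:

```python
import random


def sample_indices(n, ratio=0.8, with_replacement=True, rng=None):
    """Draw a training subsample of size n*ratio; the held-out rest can
    serve as the round's evaluation set."""
    rng = rng or random.Random(0)
    k = int(n * ratio)
    if with_replacement:
        picked = [rng.randrange(n) for _ in range(k)]
    else:
        picked = rng.sample(range(n), k)
    held_out = [i for i in range(n) if i not in set(picked)]
    return picked, held_out


def train_candidates(data, complexity_seq, repeats, train_tree):
    """Train len(complexity_seq) * repeats candidate models: for each
    preset tree height, train on a freshly drawn subsample."""
    models = []
    for _ in range(repeats):
        for depth in complexity_seq:
            idx, held = sample_indices(len(data))
            model = train_tree([data[i] for i in idx], depth)
            models.append((model, held))  # keep held-out rows for evaluation
    return models
```

With complexity sequence [1, 2, 3] and two repeats, this yields N = 6 candidates, matching the "multiple of the sequence length" property stated above.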
In some embodiments, after a round of training is completed, the N resulting AI models may be evaluated based on evaluation indicators (e.g., false negative rate, false positive rate, accuracy). Specifically, the samples left out of the sampling (i.e., the remaining 20% after 80% of the samples are drawn) may be input into the N AI models to obtain each model's evaluation indicators. Each AI model is given a selection weight according to the quality of its indicators, and one AI model is selected as this round's model according to the weights. Specifically, the weights are converted into probabilities with a softmax function, giving the probability that each AI model is selected; AI models with probability less than 1/N are removed, as their relative performance is poor. From the remaining AI models, one is randomly selected according to the softmax probabilities, and the selected AI model is used as the output of this training round.
An AI model selected by this distribution keeps a high evaluation score, yet is less prone to overfitting when the data volume is small.
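The weight-to-probability selection described above might look like the following sketch, assuming the evaluation weights (higher is better) have already been computed from the held-out samples; the function name is hypothetical:

```python
import math
import random


def select_round_model(models, weights, rng=None):
    """Softmax the evaluation weights into selection probabilities, drop
    models whose probability falls below 1/N (poor relative performance),
    then randomly draw one survivor according to its probability."""
    rng = rng or random.Random(0)
    exps = [math.exp(w) for w in weights]
    total = sum(exps)
    probs = [e / total for e in exps]
    n = len(models)
    survivors = [(m, p) for m, p in zip(models, probs) if p >= 1.0 / n]
    # renormalize over the survivors and sample proportionally
    mass = sum(p for _, p in survivors)
    r = rng.random() * mass
    acc = 0.0
    for m, p in survivors:
        acc += p
        if r <= acc:
            return m
    return survivors[-1][0]
```

When one model's weight dominates, the softmax concentrates almost all probability on it and the others fall below the 1/N cutoff, so it is selected deterministically.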
In some embodiments, after a round of training is completed, the distribution of the training data may be adjusted so that the AI model obtained in the next round differs from the one obtained in this round. Because positive samples matter far more than negative samples in the application scenario of the present application, the method of removing all recognized positive samples is adopted: all current samples are predicted with this round's model, the correctly predicted positive samples are deleted from the existing samples, and the remaining samples serve as the training data for the next round of learning.
Compared with the usual ensemble-learning methods of adjusting the data distribution (such as boosting), the model learned this way better fits the actual need: fewer rules achieve higher coverage of the positive cases, and the false positive rate is significantly reduced.
In some embodiments, after a round of training is completed, it can be determined whether any positive samples remain; if positive samples remain, the training process is repeated on the remaining samples, until no positive samples are left. In some embodiments, training may also be terminated in other ways, for example when a maximum number of training rounds is reached.
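The round loop described above (train and select a model, delete the positives it already recognizes, stop when none remain) can be sketched as follows; `train_and_select` is a hypothetical per-round trainer standing in for the sampling, training, and softmax selection steps, and is assumed to return a model together with its prediction function:

```python
def mine_rounds(samples, labels, train_and_select, max_rounds=10):
    """Run training rounds until no positive samples (label 1) remain,
    or max_rounds is reached. After each round, the positive samples the
    round's model predicts correctly are removed, adjusting the data
    distribution for the next round as described in the text."""
    models = []
    for _ in range(max_rounds):
        if not any(y == 1 for y in labels):
            break  # every positive sample is already covered by a rule
        model, predict = train_and_select(samples, labels)
        models.append(model)
        keep = [i for i, (x, y) in enumerate(zip(samples, labels))
                if not (y == 1 and predict(x) == 1)]
        samples = [samples[i] for i in keep]
        labels = [labels[i] for i in keep]
    return models
```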
As described above, each round of training outputs one AI model. Taking decision trees as an example, after three rounds of training the prediction model M consists of three decision trees [T_1, T_2, T_3].
At step 320, at least one health rule is generated using the predictive model.
In some embodiments, for each of the predictive models M, at least one health rule may be obtained against the contents of the model.
Taking the decision tree as an example again, suppose the content of the T_1 decision tree is [A > 10 and B_b < 0.5 and C_c > 0.5]. Using the earlier encoding dictionary, rules can be generated from the model:
1. Since A carries no suffix, the A feature is a pure numerical feature (or the numerical part of a mixed feature) and is retained directly; the generated rule is [A > 10], unchanged.
2. The B feature carries the suffix b, indicating a generated feature (either a pure categorical feature or the categorical extension of a mixed feature from data processing). Because the split uses the less-than sign, the satisfying value is zero (a generated feature contains only 0 and 1, and less than 0.5 means 0), so the generated rule is [B != b], with the not-equal sign.
3. The C feature is similar to B, except that the split yields the equals relation; the generated rule is [C == c].
4. After all split conditions are processed, they are joined with the 'and' logical connector, giving the T_1 rule [A > 10 and B != b and C == c].
5. Rules are generated the same way for the remaining T_2 and T_3, giving [T_2 rule] and [T_3 rule].
6. All rules of the model M are merged with the 'or' logical connector; the final rule takes the form:
[A > 10 and B != b and C == c] or [T_2 rule] or [T_3 rule].
The rules generated in step 6 are directly readable by health industry practitioners and match the format of rules such practitioners write manually. After relevant inspection and evaluation, a rule that meets the requirements can be adopted for use by health industry practitioners.
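The decoding in steps 1-4 above can be sketched as follows. The split conditions are represented here as simple (column, operator, threshold) tuples and the encoding dictionary layout is an assumption for illustration; a real decision tree library would expose its splits differently:

```python
def decode_condition(cond, coding_dict):
    """Translate one internal split back into a readable rule using the
    encoding dictionary built during preprocessing: for a generated
    indicator column, '< 0.5' means 'feature != value' and '> 0.5' means
    'feature == value'; untransformed numerical splits pass through."""
    col, op, thr = cond
    if "_" in col and col.split("_", 1)[0] in coding_dict:
        feat, val = col.split("_", 1)
        return f"{feat} != {val}" if op == "<" else f"{feat} == {val}"
    return f"{col} {op} {thr}"


def tree_to_rule(conditions, coding_dict):
    """Join one tree's decoded conditions with 'and'; the rules of
    several trees would then be merged with 'or'."""
    return " and ".join(decode_condition(c, coding_dict) for c in conditions)


# Usage: the T_1 example from the text
coding = {"B": ("categorical", ["b"]), "C": ("categorical", ["c"])}
splits = [("A", ">", 10), ("B_b", "<", 0.5), ("C_c", ">", 0.5)]
print(tree_to_rule(splits, coding))  # A > 10 and B != b and C == c
```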
It should be noted that the above description of the process 300 is for illustration and description only and is not intended to limit the scope of the present disclosure. Various modifications and changes to flow 300 will be apparent to those skilled in the art in light of this description. However, such modifications and variations are intended to be within the scope of the present description.
Fig. 4 is an exemplary flow diagram of a method of generating human sign types, shown in accordance with some embodiments of the present description.
At step 410, at least one initial health rule is generated using the predictive model based on the training data.
The initial health rules refer to health rules generated by the predictive model that have not been identified or modified.
In some embodiments, the initial health rules may be derived from a trained prediction model. Taking the decision tree as an example again, the conditions at each node of the trained decision tree may be processed by inverse encoding and the like to obtain the initial health rule corresponding to each node (see step 320 for details). It should be noted that, in the decision tree example, the initial health rule is not the output of the decision tree, but a representation of the decision tree's internal implementation.
Step 420, adjusting the at least one initial health rule to generate at least one target health rule.
The target health rule refers to an identified or revised health rule.
In some embodiments, at least one initial health rule may be adjusted (e.g., weight adjusted, scope adjusted) to generate at least one target health rule. In some embodiments, the at least one target health rule may be generated by the relevant expert by performing at least one of an addition, a deletion, and a modification to the at least one initial health rule.
Take the initial health rule [A > 10 and B != b and C == c] or [T_2 rule] or [T_3 rule] as an example. A relevant expert can modify the rule directly, after which it is added back into the decision tree model by reversing the original rule-generation process. For example, modifying the rule [A > 10 and B != b and C == c] to [A > 5 and B != b and C == c] converts it into the corresponding decision tree [A > 5 and B_b < 0.5 and C_c > 0.5], which is put into the prediction model M.
In addition, rules can be added manually to the decision tree model based on existing health knowledge, serving as an initialization or a supplement. That is, when the initial M contains no model, a health industry practitioner can add an existing rule such as [D > 5 and E < 3] to generate a decision tree T_1 and add T_1 directly to M. With M = [T_1], training then continues, and a finer and more effective overall model M can be obtained.
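The reverse direction described above (expert rule text back into encoded split conditions, so the rule can sit in M as a tree) might be sketched as a minimal parser; only the operators from the examples (`>`, `<`, `==`, `!=`) are handled, and the clause format is an assumption for illustration:

```python
def rule_to_conditions(rule_text):
    """Inverse of rule generation: turn an expert-written rule such as
    'A > 5 and B != b and C == c' back into encoded split conditions.
    Categorical clauses map onto the generated indicator columns:
    '== v' becomes '<feature>_v > 0.5' and '!= v' becomes '<feature>_v < 0.5'."""
    conds = []
    for clause in rule_text.split(" and "):
        feat, op, val = clause.split()
        if op in ("==", "!="):  # categorical clause -> indicator split
            conds.append((f"{feat}_{val}", ">" if op == "==" else "<", 0.5))
        else:  # numerical clause passes through
            conds.append((feat, op, float(val)))
    return conds


# Usage: the expert-modified rule from the text's example
print(rule_to_conditions("A > 5 and B != b and C == c"))
```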
It should be noted that the above description related to the flow 400 is only for illustration and description, and does not limit the applicable scope of the present specification. Various modifications and changes to flow 400 will be apparent to those skilled in the art in light of this description. However, such modifications and variations are intended to be within the scope of the present description.
Figs. 5A-5C are schematic diagrams of a prediction model according to some embodiments described herein.
As shown in figs. 5A to 5C, the contrast agent tolerance prediction model M obtained through the foregoing rule mining includes a decision tree 510, a decision tree 520, and a decision tree 530.
The rules in the contrast agent tolerance prediction model M may serve as guidelines for new practitioners (e.g., junior radiologists).
The relevant specialist can add, modify, delete rules in the contrast agent tolerance prediction model M according to medical knowledge and clinical experience, for example, add a hemoglobin number <110g/L, modify "arterial oxygen saturation < 95%" to "arterial oxygen saturation < 97%", delete "venous oxygen saturation < 75%", and the like.
The relevant expert may also add an existing rule, such as "first use of the contrast agent is true", to generate a decision tree T_4, and then add T_4 to the prediction model M. With M = [decision tree 510, decision tree 520, decision tree 530, T_4], training continues on this basis, and a more refined and more effective overall prediction model M can be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be regarded as illustrative only and not as limiting the present specification. Various modifications, improvements and adaptations to the present description may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present specification and thus fall within the spirit and scope of the exemplary embodiments of the present specification.
Also, the description uses specific words to describe embodiments of the description. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the specification is included. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the specification may be combined as appropriate.
Additionally, the order in which the elements and sequences of the process are recited in the specification, the use of alphanumeric characters, or other designations, is not intended to limit the order in which the processes and methods of the specification occur, unless otherwise specified in the claims. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the present specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not intended to imply that more features than are expressly recited in a claim. Indeed, the embodiments may be characterized as having less than all of the features of a single embodiment disclosed above.
Numerals describing the number of components, attributes, etc. are used in some embodiments, it being understood that such numerals used in the description of the embodiments are modified in some instances by the use of the modifier "about", "approximately" or "substantially". Unless otherwise indicated, "about", "approximately" or "substantially" indicates that the number allows a variation of ± 20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiments. In some embodiments, the numerical parameter should take into account the specified significant digits and employ a general digit preserving approach. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of the range are approximations, in the specific examples, such numerical values are set forth as precisely as possible within the scope of the application.
For each patent, patent application publication, and other material, such as articles, books, specifications, publications, documents, etc., cited in this specification, the entire contents of each are hereby incorporated by reference into this specification. Except where the application history document does not conform to or conflict with the contents of the present specification, it is to be understood that the application history document, as used herein in the present specification or appended claims, is intended to define the broadest scope of the present specification (whether presently or later in the specification) rather than the broadest scope of the present specification. It is to be understood that the descriptions, definitions and/or uses of terms in the accompanying materials of this specification shall control if they are inconsistent or contrary to the descriptions and/or uses of terms in this specification.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present disclosure. Other variations are also possible within the scope of the present description. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the specification can be considered consistent with the teachings of the specification. Accordingly, the embodiments of the present description are not limited to only those embodiments explicitly described and depicted herein.
Claims (10)
1. A method of generating human body sign types, comprising:
acquiring human body sign information of a target object;
according to the human body sign information, determining the human body sign type of the target object based on a preset health rule, wherein:
the preset health rule is obtained according to an artificial intelligence algorithm and relevant sample data training, the human body sign type is used for representing the difference degree of human body sign information of the target object and human body sign standard information, and different types of target objects correspond to different human body sign standards.
2. The method according to claim 1, wherein the determining of the human body sign type of the target subject based on the preset health rule comprises:
generating a prediction model based on the training data and the candidate model;
at least one health rule is generated using the predictive model.
3. The method of claim 2, the generating a predictive model based on training data and a candidate model, comprising:
acquiring original data of related sample characteristics;
preprocessing the raw data to generate the training data of the relevant sample features, wherein the sample features comprise one or more of pure numerical type features, pure categorical type features, and mixed type features.
4. The method of claim 3, wherein the sample features comprise hybrid features, and pre-processing the raw data comprises:
classifying the raw data of the hybrid feature,
retaining numerical feature data in the original data of the mixed type feature as a part of the training data;
making a substitution for class-type feature data in the raw data of the hybrid feature as part of the training data.
5. The method of claim 2, the generating at least one health rule using the predictive model, comprising:
generating at least one initial health rule using the predictive model based on the training data;
and adjusting the at least one initial health rule to generate at least one target health rule.
6. The method of claim 5, wherein said adjusting said at least one initial health rule to generate at least one target health rule comprises: and performing at least one of addition, deletion and modification on the at least one initial health rule by a relevant expert to generate the at least one target health rule.
7. The method of claim 1, wherein the sign types include: high tolerance type, normal type and low tolerance type.
8. A system for generating human body sign types, comprising:
the acquisition module is used for acquiring human body sign information of a target object;
the judgment module is used for determining the human body sign type of the target object based on a preset health rule according to the human body sign information, wherein:
the preset health rule is obtained according to an artificial intelligence algorithm and relevant sample data training, the human body sign type is used for representing the difference degree of human body sign information of the target object and human body sign standard information, and different types of target objects correspond to different human body sign standards.
9. An apparatus for generating human body sign types, the apparatus comprising a processor and a memory; the memory for storing instructions, wherein the instructions, when executed by the processor, cause the apparatus to implement a method for generating human body sign types as claimed in any one of claims 1-7.
10. A computer-readable storage medium, wherein the storage medium stores computer instructions, and when the computer instructions in the storage medium are read by a computer, the computer performs the method for generating human body sign types according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111064286.6A CN113744881A (en) | 2021-09-10 | 2021-09-10 | Method and system for generating human body sign types |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111064286.6A CN113744881A (en) | 2021-09-10 | 2021-09-10 | Method and system for generating human body sign types |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113744881A true CN113744881A (en) | 2021-12-03 |
Family
ID=78738115
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111064286.6A Pending CN113744881A (en) | 2021-09-10 | 2021-09-10 | Method and system for generating human body sign types |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113744881A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106529193A (en) * | 2016-11-30 | 2017-03-22 | 上海明波通信技术股份有限公司 | Health examination system |
US20170132383A1 (en) * | 2015-11-10 | 2017-05-11 | Sentrian, Inc. | Systems and methods for automated rule generation and discovery for detection of health state changes |
US20180060508A1 (en) * | 2016-08-26 | 2018-03-01 | International Business Machines Corporation | Personalized tolerance prediction of adverse drug events |
US20180317778A1 (en) * | 2017-05-05 | 2018-11-08 | Jiangsu Huaben Health Life Science and Technology Co., Ltd. | Method And Apparatus For Human Health Evaluation |
CN110021438A (en) * | 2017-07-21 | 2019-07-16 | 上海营康计算机科技有限公司 | Human nutrition state assistant diagnosis system and method |
CN110432878A (en) * | 2019-08-26 | 2019-11-12 | 浙江纳雄医疗器械有限公司 | A kind of hypoxic tolerance analytical equipment and its measurement method |
CN111613335A (en) * | 2020-06-24 | 2020-09-01 | 广州文殊科技有限公司 | Health early warning system and method |
CN112806961A (en) * | 2021-01-12 | 2021-05-18 | 北京普天大健康科技发展有限公司 | Sign data evaluation method and device |
CN113314226A (en) * | 2021-06-11 | 2021-08-27 | 曹庆恒 | System, method and equipment for intelligently analyzing anaphylactic reaction |
- 2021-09-10 CN CN202111064286.6A patent/CN113744881A/en active Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111199550B (en) | Training method, segmentation method, device and storage medium of image segmentation network | |
CN107492099B (en) | Medical image analysis method, medical image analysis system, and storage medium | |
CN113011485B (en) | Multi-mode multi-disease long-tail distribution ophthalmic disease classification model training method and device | |
JP6522161B2 (en) | Medical data analysis method based on deep learning and intelligent analyzer thereof | |
He et al. | Image segmentation algorithm of lung cancer based on neural network model | |
Howard et al. | Cardiac rhythm device identification using neural networks | |
US10734107B2 (en) | Image search device, image search method, and image search program | |
CN110051324B (en) | Method and system for predicting death rate of acute respiratory distress syndrome | |
JP6885517B1 (en) | Diagnostic support device and model generation device | |
CN107766874B (en) | Measuring method and measuring system for ultrasonic volume biological parameters | |
CN110459328A (en) | A clinical decision support system for assessing sudden cardiac arrest | |
Mazzanti et al. | Imaging, health record, and artificial intelligence: hype or hope? | |
KR102366290B1 (en) | Medical machine learning system | |
CN110880366A (en) | Medical image processing system | |
CN110491479A (en) | Method for constructing a neural-network-based bone status assessment model | |
CN111951965B (en) | Panoramic health dynamic monitoring and predicting system based on time sequence knowledge graph | |
CN116110597B (en) | Digital twinning-based intelligent analysis method and device for patient disease categories | |
WO2023143628A1 (en) | Diabetic retinopathy detection method based on genetic fuzzy tree and deep network | |
CN117237351B (en) | Ultrasonic image analysis method and related device | |
CN110575178A (en) | Integrated diagnosis and monitoring medical system for assessing motion state, and assessment method thereof | |
Badano et al. | The stochastic digital human is now enrolling for in silico imaging trials—methods and tools for generating digital cohorts | |
Sengan et al. | Echocardiographic image segmentation for diagnosing fetal cardiac rhabdomyoma during pregnancy using deep learning | |
CN114999638B (en) | Big data visualization processing method and system for medical diagnosis based on artificial intelligence | |
CN113744881A (en) | Method and system for generating human body sign types | |
CN115120238A (en) | Method, device and system for identifying first-onset schizophrenia patient based on federal learning multiple centers |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||