CN215017583U - Diagnosis and treatment device and system - Google Patents

Publication number: CN215017583U
Authority: CN (China)
Legal status: Active
Application number: CN202023335136.8U
Original language: Chinese (zh)
Inventors: 韩旭, 韦新, 韩明, 杨卫轩, 张�浩
Current Assignee: Xi'an Huinao Intelligent Technology Co ltd
Original Assignee: Xi'an Huinao Intelligent Technology Co ltd
Application filed by: Xi'an Huinao Intelligent Technology Co ltd
Priority application: CN202023335136.8U
Application granted; publication of CN215017583U

Abstract

The present disclosure relates to a diagnosis and treatment device and system. The diagnosis and treatment device includes: an audio-visual interaction device, an information acquisition component, a medical intervention component, and a controller, where the information acquisition component is communicatively connected to the medical intervention component through the controller. The information acquisition component collects the patient's sign information while the patient interacts with the audio-visual interaction device; the medical intervention component receives intervention strategy information sent by the controller and performs the corresponding medical intervention operation on the patient. The device can collect the patient's sign information during the audio-visual interaction in which the patient is questioned, and then apply the corresponding medical intervention according to that information, forming an integrated information-acquisition-and-feedback working system that improves the intelligence and efficiency of disease diagnosis and treatment, reduces labor cost, and improves the user experience. The diagnosis and treatment device is used for the diagnosis and treatment of psychological diseases.

Description

Diagnosis and treatment device and system
Technical Field
The present disclosure relates to the field of intelligent medical technology, and in particular, to a diagnosis and treatment device and system.
Background
Psychological diseases arise when internal and external pathogenic factors acting on a person cause brain dysfunction, destroying the integrity of brain function and the unity between the individual and the external environment. As living standards in China continue to improve, public attention to psychological diseases has gradually increased. However, public understanding of psychological problems and psychological diseases is generally insufficient, and professional equipment is extremely scarce. In addition, experienced professional psychological diagnosis and treatment practitioners in China are too few, and some patients feel a certain psychological resistance to psychotherapy, so the need for related equipment and systems that can bring preliminary psychological diagnosis and treatment to the grassroots level is very urgent.
In the related art, devices for diagnosing and treating psychological diseases generally identify and analyze the inquiry information obtained from a face-to-face inquiry between a doctor and a patient to produce a diagnosis result, after which different intervention devices are operated manually to treat the disease according to that result. However, the inquiry in this approach is usually conducted as face-to-face questions and answers between doctor and patient, which cannot overcome some patients' resistance to psychotherapy and psychologists, so the user experience is poor. The patient's sign information or inquiry information in the diagnosis stage is acquired in a single way, usually limited to voice recognition, video images, and the like. The treatment likewise includes only common music physiotherapy, whose specific execution is judged manually by a psychologist, so the treatment mode is monotonous and the degree of intelligence is low. Such diagnosis and treatment methods, based on single-dimension sign or inquiry information and on doctors' manual judgment, are also inefficient at treating psychological diseases and costly.
SUMMARY OF THE UTILITY MODEL
To overcome the problems of the related art, the present disclosure provides a diagnosis and treatment device and system for diagnosis and treatment of psychological diseases.
According to a first aspect of the embodiments of the present disclosure, there is provided a diagnosis and treatment device, the device including: an audio-visual interaction device, an information acquisition component, a medical intervention component, and a controller, where the information acquisition component is communicatively connected to the medical intervention component through the controller;
the information acquisition component is used for acquiring the physical sign information of the patient in the interaction process of the patient using the audio-visual interaction equipment;
the medical intervention component is used for receiving the intervention strategy information sent by the controller so as to execute the medical intervention operation corresponding to the intervention strategy information on the patient.
Optionally, the information acquisition component, the medical intervention component, and the controller are all arranged on the audio-visual interaction device.
Optionally, the information collecting assembly includes:
the electroencephalogram acquisition module is used for acquiring electroencephalogram information of the patient;
the radio equipment is used for collecting voice information in the interaction process;
an eye movement recognition device for acquiring eye movement trajectory information of the patient.
Optionally, the medical intervention assembly comprises:
a microcurrent stimulation module for applying microcurrent stimulation to the patient; and
music physiotherapy equipment for outputting physiotherapy audio.
Optionally, the video-audio interaction device includes: an eye assembly and a head binding assembly.
Optionally, the eye movement recognition device is disposed on the eye assembly, and the radio device and the music physiotherapy device are both disposed on the head binding assembly.
Optionally, the head binding assembly is further provided with a plurality of electroencephalogram devices, the electroencephalogram acquisition module and the micro-current stimulation module are integrated in the electroencephalogram devices, and the electroencephalogram devices are used for:
performing an operation of acquiring electroencephalogram information of the patient in response to a first control signal transmitted by the controller;
the application of microcurrent stimulation to the patient is performed in response to a second control signal sent by the controller.
Optionally, the controller is configured to:
receiving the sign information sent by the information acquisition component, wherein the sign information comprises: the electroencephalogram information, the voice information, and the eye movement identification information;
transmitting the intervention strategy information to the medical intervention component, the intervention strategy information including: music playing information for the music physiotherapy device and run-time information for the micro-current stimulation module.
According to a second aspect of the embodiments of the present disclosure, there is provided a diagnosis and treatment system including:
a second aspect of the disclosed embodiments provides a medical device; and the number of the first and second groups,
a server for storing a pathology database.
Optionally, the server is in communication connection with a controller included in the diagnosis and treatment device;
the controller is configured to send diagnosis information to the server, where the diagnosis information corresponds to the patient's sign information collected during the interaction between the patient and the audio-visual interaction device included in the diagnosis and treatment device;
and to receive intervention strategy information sent by the server, where the intervention strategy information is determined according to the pathology database and the diagnosis information.
The technical solutions provided by the embodiments of the present disclosure can have the following beneficial effects:
the embodiment of this disclosure provides a diagnose device includes: the system comprises video interaction equipment, an information acquisition component, a medical intervention component and a controller, wherein the information acquisition component is in communication connection with the medical intervention component through the controller; the information acquisition component is used for acquiring the physical sign information of the patient in the interaction process of the patient using the video interaction equipment; the medical intervention component is used for receiving the intervention strategy information sent by the controller so as to execute the medical intervention operation corresponding to the intervention strategy information on the patient. Can link up the audio-visual interactive process who asks for the patient and acquire patient's sign information, and then carry out corresponding medical intervention to the patient according to sign information, form the operating system of information acquisition and feedback integration, improve intelligent degree and the service efficiency that the disease was diagnose, reduce the disease and diagnose the cost of labor when improving user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure without limiting the disclosure. In the drawings:
fig. 1 is a schematic structural diagram illustrating a diagnosis and treatment device according to an exemplary embodiment;
fig. 2 is a schematic top view of another diagnosis and treatment device according to the embodiment of fig. 1;
fig. 3 is a schematic structural diagram of a diagnosis and treatment system according to an exemplary embodiment;
fig. 4 is a block diagram illustrating a structure of a terminal according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram of a hardware structure of a terminal according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of a hardware structure of a chip according to an embodiment of the present disclosure.
Detailed Description
The following detailed description of specific embodiments of the present disclosure is provided in connection with the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present disclosure, are given by way of illustration and explanation only, not limitation.
The utility model provides a diagnosis and treatment device and system, described in detail as follows:
fig. 1 is a schematic structural diagram illustrating a diagnosis and treatment device according to an exemplary embodiment. Referring to fig. 1, the device includes: an audio-visual interaction device 110, an information acquisition component 120, a medical intervention component 130, and a controller 140, where the information acquisition component 120 is communicatively connected to the medical intervention component 130 through the controller 140.
The information collecting component 120 is configured to collect sign information of a patient during an interaction process of the patient using the audio-visual interaction device 110; the medical intervention component 130 is configured to receive the intervention policy information sent by the controller 140, so as to perform a medical intervention operation corresponding to the intervention policy information on the patient.
Illustratively, the audio-visual interaction device 110 is configured to output images for interaction with the patient, and the interaction process may include: displaying a virtual character and outputting voice for inquiry communication, outputting treatment items and contents, outputting treatment-related pictures, and the like. After the interaction process is started, the information acquisition component 120 is also turned on to collect the patient's sign information during the interaction, and the controller 140 analyzes the changes in the sign information. The audio-visual interaction device 110 may be a television, a tablet computer, a display, a smart phone, a smart watch, smart glasses, or a VR (Virtual Reality) device. The controller 140 may be a computer, a tablet computer, a smart phone, or another device capable of processing and transmitting information.
Further, as shown in fig. 1, the information acquisition component 120 may include: an electroencephalogram acquisition module 121 for acquiring electroencephalogram information of the patient; a radio device 122 for collecting voice information during the interaction; and an eye movement recognition device 123 for acquiring eye movement trajectory information of the patient. The medical intervention component 130 includes: a microcurrent stimulation module 131 for applying microcurrent stimulation to the patient; and a music physiotherapy device 132 for outputting physiotherapy audio. The microcurrent stimulation module 131 is configured to continuously apply micro electrical stimulation to the patient's ear lobes, limbs, and the like to achieve a therapeutic massage effect.
Illustratively, the audio-visual interaction device 110, the information acquisition component 120, the medical intervention component 130, and the controller 140 may be separate devices, in which case the audio-visual interaction device 110, the information acquisition component 120, and the medical intervention component 130 are all connected to the controller 140.
In one possible implementation, the audio-visual interaction device 110 is a television or a tablet computer. In this case, a preset computer program may be written into the processing unit of the audio-visual interaction device 110 so that the processing unit serves as the controller 140, and the information acquisition component 120 and the medical intervention component 130 are connected to the controller 140 through wired or wireless connections.
In another possible implementation, the audio-visual interaction device 110 is a VR device. Since the VR device, the electroencephalogram acquisition module 121, the eye movement recognition device 123, and the micro-current stimulation module 131 all need close contact with the patient's head during operation, the information acquisition component 120, the medical intervention component 130, and the controller 140 may all be integrated into the VR device. In this case, a microphone of the VR device may serve as the radio device 122, and an audio player of the VR device may serve as the music physiotherapy device 132.
Taking the audio-visual interaction device 110 as an example of a VR device, in an actual use process, a patient may start the VR device first. In response to the VR device being turned on, the controller 140 detects whether the patient has completed wearing the VR device correctly. Thereafter, the controller 140 turns on the information collecting component 120 to collect the patient vital sign information during the interaction. The vital sign information is then sent to the controller 140, and the controller 140 (or a server communicatively coupled to the controller 140) identifies and analyzes the received vital sign information to determine intervention strategy information. The intervention policy information may include: the music physiotherapy device 132 plays the music type, the playing volume and the playing time of the music, and the output intensity, the output position and the output time of the micro-current stimulation module 131 outputting the micro-current. The intervention strategy information is sent to the medical intervention component 130 to cause the micro-current stimulation module 131 and the music physiotherapy device 132 to begin performing therapeutic intervention operations in accordance with the intervention strategy information.
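The session flow just described — power-on, wearing check, sign acquisition, analysis, then intervention — can be sketched as a small control loop. This is only an illustrative sketch: the function names, the callback decomposition, and the strategy format are assumptions, not details taken from the utility model.

```python
def run_session(device_worn, collect_signs, analyze, intervene):
    """Drive one diagnosis-and-treatment session (illustrative sketch).

    device_worn   -- callable returning True once the VR device is worn correctly
    collect_signs -- callable returning the sign information gathered during the
                     interaction (EEG, voice, eye-movement channels)
    analyze       -- callable mapping sign information to intervention strategy
                     information (run locally or on a connected server)
    intervene     -- callable applying the strategy (music physiotherapy playback,
                     micro-current stimulation parameters)
    """
    if not device_worn():
        return None                  # refuse to start until the headset is worn
    signs = collect_signs()          # multi-dimensional sign information
    strategy = analyze(signs)        # e.g. music type/volume/time, current output
    intervene(strategy)              # medical intervention component takes over
    return strategy
```

Passing `analyze` in as a callable mirrors the patent's option of performing the analysis either on the controller 140 itself or on a server communicatively connected to it.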
To sum up, the present disclosure provides a diagnosis and treatment device including: an audio-visual interaction device, an information acquisition component, a medical intervention component, and a controller, where the information acquisition component is communicatively connected to the medical intervention component through the controller. The information acquisition component collects the patient's sign information while the patient uses the audio-visual interaction device; the medical intervention component receives the intervention strategy information sent by the controller and performs the corresponding medical intervention operation on the patient. The device can collect sign information on multiple dimensions during the audio-visual interaction in which the patient is communicated with and questioned, and then perform the corresponding medical intervention according to the multi-dimensional sign information, forming an integrated information-acquisition-and-feedback working mechanism that improves the accuracy, intelligence, and efficiency of psychological disease diagnosis and treatment, reduces labor cost, and improves the user experience.
For example, fig. 2 is a schematic top view of another diagnosis and treatment device according to the embodiment shown in fig. 1. Referring to fig. 2, the audio-visual interaction device 110 is a VR device, and the information acquisition component 120, the medical intervention component 130, and the controller 140 are all disposed on the audio-visual interaction device 110. Note that the devices or components drawn with dotted lines in fig. 2 are disposed inside the VR device and are not visible from the outside.
As shown in fig. 2, the VR device includes: an eye assembly 111 and a head binding assembly 112. The eye assembly includes a VR eyecup 111a, and the head binding assembly 112 is used to secure the VR eyecup 111a to the patient's head. The head binding assembly 112 includes a helmet 112a and connectors 112b. The eye assembly 111 further includes a fastener 111b, which is length-adjustable and fastens the connectors 112b to the VR eyecup 111a. The helmet 112a is connected to the VR eyecup 111a through the connectors 112b. A plurality of electroencephalogram devices 113 are disposed inside the helmet 112a and the connectors 112b, and the helmet 112a is also provided with at least one vent 112c.
Illustratively, the eye movement recognition device 123 may be disposed on the eye assembly 111. It may be a lightweight eye-tracking module detachably connected to the VR eyecup 111a; when eye movement recognition is performed through this module, the patient can still observe the content played in the VR eyecup 111a through it. Alternatively, the eye movement recognition device 123 may be non-detachably disposed inside the VR eyecup 111a. The eye movement identification information collected by the eye movement recognition device 123 may include: smoothed monocular/binocular fixation point coordinates, raw monocular/binocular fixation point coordinates, a user-defined calibration eye pattern, pupil radius, pupil position, eye-to-face distance, fixation depth, eye-to-lens distance, iris identification information, and the like. Among these, the pupil radius and pupil position are the main basis for assessing the patient's mental state or emotion.
Illustratively, the radio device 122 and the music physiotherapy device 132 are both disposed on the head binding assembly 112. The radio device 122, which is in practice a microphone, is disposed below the helmet 112a. The music physiotherapy device 132, which may be a headset, is disposed on both sides of the helmet 112a. The controller 140 may be disposed on the eye assembly 111 or the head binding assembly 112; in the embodiment shown in fig. 2, it is disposed in the VR eyecup 111a of the eye assembly 111.
Illustratively, a plurality of electroencephalogram devices 113 are further disposed on the head binding assembly 112, preferably inside it, so that they closely fit the patient's head and are not visible from the outside. As shown in fig. 2, there are three connectors 112b between the eye assembly 111 and the helmet 112a of the VR device 110. After the patient puts on the VR device 110, the three connectors 112b are distributed on the left side, the right side, and the top of the patient's head, and electroencephalogram devices 113 may also be arranged on these three connectors 112b.
Illustratively, the electroencephalogram device 113 integrates the electroencephalogram acquisition module 121 and the micro-current stimulation module 131 shown in fig. 1. The electroencephalogram device 113 is configured to: perform the operation of acquiring the patient's electroencephalogram information in response to a first control signal sent by the controller 140; and apply micro-current stimulation to the patient in response to a second control signal sent by the controller 140. In other words, the electroencephalogram device 113 can switch between acting as an electroencephalogram acquisition module and as a micro-current stimulation module according to different control signals. Each electroencephalogram device 113 may be an electrode; the micro-current stimulation operation occupies at least two electrode positions, one receiving the anode voltage and the other the cathode voltage, while all remaining electrodes can simultaneously perform electroencephalogram acquisition.
Illustratively, the operation currently performed by each electroencephalogram device 113 may be defined by software settings during actual use. For example, if the head binding assembly 112 includes 8 electroencephalogram devices 113, all of them may be activated to acquire electroencephalogram information during the information acquisition stage. During the medical intervention stage, the two electroencephalogram devices 113 that perform micro-current stimulation can be determined according to the intervention strategy information, and the other six are activated to acquire electroencephalogram information, so as to capture changes in the patient's electroencephalogram information during the intervention.
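The electrode-role arrangement described above — two electrodes reserved for the anode and cathode of micro-current stimulation, all remaining electrodes acquiring EEG — can be sketched as follows. The function name and role labels are illustrative assumptions, not identifiers from the patent.

```python
def assign_electrode_roles(n_electrodes, stim_pair=None):
    """Return a role ("acquire", "stim_anode", "stim_cathode") per electrode.

    stim_pair -- optional (anode_index, cathode_index) taken from the
                 intervention strategy; None means pure acquisition mode.
    """
    roles = ["acquire"] * n_electrodes   # default: every electrode records EEG
    if stim_pair is not None:
        anode, cathode = stim_pair
        if anode == cathode:
            raise ValueError("micro-current stimulation needs two distinct electrodes")
        roles[anode] = "stim_anode"      # anode voltage applied here
        roles[cathode] = "stim_cathode"  # cathode voltage applied here
    return roles
```

With 8 electroencephalogram devices and a stimulation pair chosen from the strategy, this leaves six electrodes recording EEG during the intervention, matching the example above.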
Illustratively, the controller 140 is configured to: receive the sign information sent by the information acquisition component 120, the sign information including the electroencephalogram information, the voice information, and the eye movement identification information; and transmit the intervention strategy information to the medical intervention component 130, the intervention strategy information including music playing information for the music physiotherapy device 132 and run-time information for the micro-current stimulation module 131 shown in fig. 1.
Illustratively, after receiving the sign information, the controller 140 first analyzes the electroencephalogram information, the voice information, and the eye movement identification information to generate the corresponding diagnosis information, and sends the diagnosis information to a cloud server; the intervention strategy information is then generated by the cloud server from a pre-stored pathology database and the diagnosis information. Alternatively, the pathology database may be stored directly in the controller 140, and the diagnosis information and the intervention strategy information may be obtained in turn by the controller 140.
In conclusion, the diagnosis and treatment device provided by the present disclosure can acquire the patient's sign information in multiple dimensions during the audio-visual interaction in which the patient is communicated with and questioned, and then perform the corresponding medical intervention on the patient according to the multi-dimensional sign information, forming an integrated information-acquisition-and-feedback working mechanism that improves the accuracy, intelligence, and efficiency of psychological disease diagnosis and treatment, reduces labor cost, and improves the user experience.
Illustratively, fig. 3 is a schematic structural diagram of a diagnosis and treatment system according to an exemplary embodiment. Referring to fig. 3, the diagnosis and treatment system 200 includes: at least one diagnosis and treatment device 100 shown in fig. 1 or fig. 2; and a server 210 for storing the pathology database. The server 210 may be a cloud server, which can be communicatively connected to a plurality of diagnosis and treatment devices 100 at the same time.
Optionally, the server 210 is communicatively connected to the controller 140 included in the diagnosis and treatment device 100;
the controller 140 is configured to send diagnosis information to the server 210, where the diagnosis information corresponds to the patient's sign information collected while the patient interacts with the audio-visual interaction device 110 included in the diagnosis and treatment device 100;
and to receive the intervention strategy information sent by the server 210, where the intervention strategy information is determined according to the pathology database and the diagnosis information.
Illustratively, a diagnosis and treatment method applied to the diagnosis and treatment system 200 may include:
step 301, after the controller 140 determines that the patient correctly wears the medical device 100, the audio-visual interaction device 110 and the information collecting component are turned on.
Step 302, the controller 140 extracts the electroencephalogram waveform corresponding to the frequency band of the target condition from the acquired electroencephalogram information, and refers to a preset algorithm model to obtain a diagnosis and treatment report about the electroencephalogram information.
Wherein the target disorder can be a psychological disorder such as depression or psychosis.
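The patent does not specify how step 302's frequency-band extraction works; as an assumed illustration only, the energy of an EEG segment inside a band of interest (for example the 8-12 Hz alpha band) could be estimated with a naive discrete Fourier transform:

```python
import math

def band_power(samples, fs, f_lo, f_hi):
    """Rough power of `samples` (sampled at `fs` Hz) inside [f_lo, f_hi] Hz.

    Naive O(n^2) DFT for illustration only; a real implementation would use
    an FFT with proper windowing.
    """
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):           # skip DC, stop below Nyquist
        f = k * fs / n                   # frequency of DFT bin k
        if f_lo <= f <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            power += (re * re + im * im) / n
    return power
```

A pure 10 Hz tone sampled at 100 Hz then shows essentially all of its power in the 8-12 Hz band and almost none in, say, the 20-30 Hz band.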
Step 303, the controller 140 extracts semantic information matched with the target disease keyword from the collected voice information, and obtains a diagnosis and treatment report about the inquiry condition by referring to a preset algorithm model.
Step 304, the controller 140 extracts the patient's eye movement trajectory from the collected eye movement identification information.
The eye movement track is used for assisting in judging the psychological influence degree and attention degree of the current inquiry content on the patient.
Step 305, the controller 140 uploads the diagnosis and treatment report about the electroencephalogram information, the diagnosis and treatment report about the inquiry condition, and the above psychological influence degree and attention degree to the server 210, so that the server 210 determines the data information matching the two reports and the psychological influence degree and attention degree according to the pathology database.
Step 306, the server 210 extracts the pathological features whose information matching degree exceeds 90% to form important-level confirmed diagnosis opinions, extracts those whose matching degree exceeds 80% to form common-level confirmed diagnosis opinions, and extracts those whose matching degree exceeds 60% to form reference-level confirmed diagnosis opinions.
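Step 306's tiered thresholds amount to a mapping from a pathological feature's match degree to an opinion level. A minimal sketch, assuming the bands are mutually exclusive (the patent lists the thresholds but does not state this explicitly):

```python
def opinion_level(match_degree):
    """Map a pathology-feature match degree in [0, 1] to an opinion level."""
    if match_degree > 0.90:
        return "important"   # > 90%: important-level confirmed diagnosis opinion
    if match_degree > 0.80:
        return "common"      # > 80%: common-level confirmed diagnosis opinion
    if match_degree > 0.60:
        return "reference"   # > 60%: reference-level confirmed diagnosis opinion
    return None              # at or below 60%: no opinion formed (assumption)
```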
Step 307, the server 210 generates the intervention strategy information according to the diagnosis opinions of different levels and sends it to the controller 140.
Step 308, the controller 140 controls the micro-current stimulation module to perform a trial micro-current stimulation intervention to determine whether the intervention is effective.
Step 309, the controller 140 controls the micro-current stimulation module and the music physiotherapy device to start the intervention treatment according to the intervention strategy information, under the condition that the intervention is determined to be effective.
In conclusion, the diagnosis and treatment system provided by the present disclosure can acquire the patient's sign information in multiple dimensions during the audio-visual interaction in which the patient is communicated with and questioned, and then perform the corresponding medical intervention on the patient according to the pre-constructed pathology database and the multi-dimensional sign information, forming an integrated information-acquisition-and-feedback working mechanism that improves the accuracy, intelligence, and efficiency of disease diagnosis and treatment, reduces labor cost, and improves the user experience.
Fig. 4 illustrates a block diagram of a terminal provided in an embodiment of the present disclosure, for the case where an integrated unit is employed. As shown in fig. 4, the terminal 40 may be the controller included in the diagnosis and treatment device, or a chip applied to that controller. The terminal 40 includes: a communication unit 41 and a processing unit 42.
As shown in fig. 4, the communication unit 41 is configured to support the terminal 40 in performing the information transmitting and receiving operations of the controller of the diagnosis and treatment device provided by the embodiments of the present disclosure, as performed by the terminal in the above embodiments.
As shown in fig. 4, the processing unit 42 is configured to support the terminal 40 to execute the steps executed by the terminal in the foregoing embodiment and applied to the diagnosis and treatment method of the diagnosis and treatment system provided in the embodiment of the present disclosure.
In some possible implementations, as shown in fig. 4, the terminal 40 may further include a storage unit 43 for storing program code and data of the terminal 40.
As shown in fig. 4, the processing unit 42 may be a processor or a controller, such as a central processing unit (CPU), a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof, which may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure. The processor may also be a combination of devices implementing a computing function, e.g., a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The communication unit 41 may be a transceiver, a transceiving circuit, a communication interface, or the like. The storage unit 43 may be a memory.
As shown in fig. 4, when the processing unit 42 is a processor, the communication unit 41 is a transceiver, and the storage unit 43 is a memory, the terminal 40 according to the embodiment of the present disclosure may be the terminal 50 shown in fig. 5.
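For illustration only, the unit split of the terminal 40 described above (communication unit 41, processing unit 42, optional storage unit 43) might be modeled as follows. All class and method names are hypothetical and not part of the disclosure; the processing step is a placeholder for the method steps of the earlier embodiments.

```python
# Illustrative model of the terminal 40's unit structure from Fig. 4.

class CommunicationUnit:
    """Supports the terminal's transmit/receive operations (unit 41)."""
    def __init__(self):
        self.outbox, self.inbox = [], []

    def send(self, message):
        self.outbox.append(message)

    def receive(self, message):
        self.inbox.append(message)

class ProcessingUnit:
    """Executes the diagnosis-and-treatment method steps (unit 42)."""
    def process(self, sign_information):
        # Placeholder for the method steps executed by the terminal.
        return {"diagnosis_information": sign_information}

class Terminal40:
    def __init__(self):
        self.communication_unit = CommunicationUnit()
        self.processing_unit = ProcessingUnit()
        self.storage_unit = {}  # optional unit 43: program code and data

    def handle(self, sign_information):
        result = self.processing_unit.process(sign_information)
        self.communication_unit.send(result)
        return result
```

The point of the split is that transmit/receive support, method execution, and storage are separable concerns, which is what allows the same design to be realized as either a controller or a chip applied to the controller.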
Fig. 5 shows a schematic diagram of a hardware structure of a terminal according to an embodiment of the present disclosure. As shown in fig. 5, the terminal 50 includes a processor 51 and a communication interface 53.
As shown in fig. 5, the processor 51 may be a central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of programs according to the present disclosure. There may be one or more communication interfaces 53. The communication interface 53 may use any transceiver-like device for communicating with other devices or a communication network.
As shown in fig. 5, the terminal 50 may further include a communication line 54, which may include a path for transmitting information between the aforementioned components.
Optionally, as shown in fig. 5, the terminal 50 may further include a memory 52. The memory 52 is used to store computer-executable instructions for carrying out aspects of the present disclosure, and their execution is controlled by the processor 51. The processor 51 is configured to execute the computer-executable instructions stored in the memory 52, so as to implement the diagnosis and treatment method applied to the diagnosis and treatment system provided by the embodiments of the present disclosure.
As shown in fig. 5, the memory 52 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact disc, laser disc, digital versatile disc, Blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory 52 may be separate and coupled to the processor 51 via the communication line 54, or may be integrated with the processor 51.
Optionally, the computer-executable instructions in the embodiments of the present disclosure may also be referred to as application program codes, which are not specifically limited in the embodiments of the present disclosure.
In one implementation, as shown in FIG. 5, processor 51 may include one or more CPUs, such as CPU0 and CPU1 of FIG. 5, for example.
In one implementation, as shown in fig. 5, the terminal 50 may include a plurality of processors, such as the processor 51 and the processor 55 in fig. 5, for example. Each of these processors may be a single core processor or a multi-core processor.
Fig. 6 is a schematic structural diagram of a chip provided in an embodiment of the present disclosure. As shown in fig. 6, the chip 60 includes one or more (including two) processors 61 and a communication interface 62.
Optionally, as shown in fig. 6, the chip 60 further includes a memory 63, which may include both read-only memory and random access memory and provides operating instructions and data to the processor 61. A portion of the memory 63 may also include non-volatile random access memory (NVRAM).
In some embodiments, as shown in FIG. 6, memory 63 stores elements, execution modules or data structures, or a subset thereof, or an expanded set thereof.
In the embodiment of the present disclosure, as shown in fig. 6, a corresponding operation is performed by calling an operation instruction stored in the memory 63 (the operation instruction may be stored in the operating system).
As shown in fig. 6, a processor 61, which may also be referred to as a Central Processing Unit (CPU), controls the processing operations of any of the terminals.
As shown in fig. 6, the memory 63 may include a read-only memory and a random access memory, and provides instructions and data to the processor 61. A portion of the memory 63 may also include NVRAM. The memory 63, the communication interface 62, and the processor 61 are coupled together by a bus system, which may include a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, the various buses are labeled as the bus system 64 in fig. 6.
As shown in fig. 6, the method disclosed in the embodiments of the present disclosure may be applied to, or implemented by, a processor. The processor may be an integrated circuit chip having signal processing capability. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor, or by instructions in the form of software. The processor may be a general-purpose processor, a digital signal processor (DSP), an ASIC, a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the various methods, steps, and logical blocks disclosed in the embodiments of the present disclosure. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present disclosure may be directly executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or a register. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the method in combination with its hardware.
In one possible implementation, as shown in fig. 6, the communication interface 62 is configured to perform the receiving step of the information transmitting and receiving operations of the controller of the medical apparatus provided by the embodiments of the present disclosure. The processor 61 is configured to execute the steps of the diagnosis and treatment method applied to the diagnosis and treatment system provided by the embodiments of the present disclosure.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer programs or instructions. When the computer program or instructions are loaded and executed on a computer, the procedures or functions described in the embodiments of the present disclosure are performed in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, a terminal, a user device, or another programmable apparatus. The computer program or instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wire or wirelessly. The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium, such as a floppy disk, a hard disk, or magnetic tape; an optical medium, such as a digital video disc (DVD); or a semiconductor medium, such as a solid-state drive (SSD).
While the disclosure has been described in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed disclosure, from a review of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the word "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
While the disclosure has been described in conjunction with specific features and embodiments thereof, it will be evident that various modifications and combinations can be made thereto without departing from the spirit and scope of the disclosure. Accordingly, the specification and figures are merely exemplary of the present disclosure as defined in the appended claims and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the present disclosure. It will be apparent to those skilled in the art that various changes and modifications can be made in the present disclosure without departing from the spirit and scope of the disclosure. Thus, if such modifications and variations of the present disclosure fall within the scope of the claims of the present disclosure and their equivalents, the present disclosure is intended to include such modifications and variations as well.

Claims (10)

1. A medical device, the device comprising: the system comprises video interaction equipment, an information acquisition component, a medical intervention component and a controller, wherein the information acquisition component is in communication connection with the medical intervention component through the controller;
the information acquisition component is used for acquiring the physical sign information of the patient in the interaction process of the patient using the audio-visual interaction equipment;
the medical intervention component is used for receiving the intervention strategy information sent by the controller so as to execute the medical intervention operation corresponding to the intervention strategy information on the patient.
2. The apparatus of claim 1, wherein the information collection component, the medical intervention component, and the controller are all disposed on the audiovisual interaction device.
3. The apparatus of claim 1 or 2, wherein the information gathering component comprises:
the electroencephalogram acquisition module is used for acquiring electroencephalogram information of the patient;
the radio equipment is used for collecting voice information in the interaction process;
an eye movement recognition device for acquiring eye movement trajectory information of the patient.
4. The device of claim 3, wherein the medical intervention assembly comprises:
a microcurrent stimulation module for applying microcurrent stimulation to the patient; and
music physiotherapy equipment for outputting physiotherapy audio.
5. The apparatus of claim 4, wherein the video interactive device comprises: an eye assembly and a head binding assembly.
6. The apparatus of claim 5, wherein the eye movement recognition device is disposed on the eye assembly, and the sound reception device and the music physiotherapy device are both disposed on the head binding assembly.
7. The apparatus of claim 6, wherein a plurality of electroencephalogram devices are further disposed on the head-binding assembly, wherein both the electroencephalogram acquisition module and the micro-current stimulation module are integrated into the electroencephalogram devices, and the electroencephalogram devices are configured to:
performing an operation of acquiring electroencephalogram information of the patient in response to a first control signal transmitted by the controller;
the application of microcurrent stimulation to the patient is performed in response to a second control signal sent by the controller.
8. The apparatus of any of claims 4-7, wherein the controller is to:
receiving the sign information sent by the information acquisition component, wherein the sign information comprises: the electroencephalogram information, the voice information, and the eye movement identification information;
transmitting the intervention policy information to the medical intervention component, the intervention policy information comprising: music playing information of the music physical therapy module and running length information of the micro-current stimulation module.
9. A medical system, comprising:
the medical device according to any one of claims 1 to 8; and
a server for storing a pathology database.
10. The system of claim 9, wherein the server is communicatively coupled to a controller included in the medical device;
the controller is used for sending diagnosis information to the server, wherein the diagnosis information corresponds to the sign information of the patient acquired during the interactive communication between the audio-visual interaction device included in the diagnosis and treatment device and the patient;
and receiving intervention strategy information sent by the server, wherein the intervention strategy information is determined according to the case database and the diagnosis information.
CN202023335136.8U 2020-12-31 2020-12-31 Diagnosis and treatment device and system Active CN215017583U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202023335136.8U CN215017583U (en) 2020-12-31 2020-12-31 Diagnosis and treatment device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202023335136.8U CN215017583U (en) 2020-12-31 2020-12-31 Diagnosis and treatment device and system

Publications (1)

Publication Number Publication Date
CN215017583U true CN215017583U (en) 2021-12-07

Family

ID=79224217

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202023335136.8U Active CN215017583U (en) 2020-12-31 2020-12-31 Diagnosis and treatment device and system

Country Status (1)

Country Link
CN (1) CN215017583U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024006966A1 (en) * 2022-06-30 2024-01-04 Apple Inc. Eye-tracking system


Legal Events

Date Code Title Description
GR01 Patent grant