CN115881313A - Information interaction equipment and method - Google Patents


Info

Publication number
CN115881313A
CN115881313A
Authority
CN
China
Prior art keywords
voice
user
information
unit
control unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211407166.6A
Other languages
Chinese (zh)
Inventor
顾建英
潘晶
冯义兴
田华
张满圆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Timi Robot Co ltd
Original Assignee
Shanghai Timi Robot Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Timi Robot Co ltd filed Critical Shanghai Timi Robot Co ltd
Priority to CN202211407166.6A
Publication of CN115881313A
Legal status: Pending


Abstract

The application relates to an information interaction device and method. The information interaction device comprises: at least three main body frames, adjacent main body frames being connected to one another and the bottoms of the frames being joined by a bottom plate; at least three display units arranged along the outer periphery of the main body frames, a voice acquisition unit being provided at the side edge of each display unit; a voice transmission unit provided on each main body frame; an image acquisition unit provided on each voice transmission unit; and a control unit provided on the bottom plate and connected to the display units, voice acquisition units, voice transmission units, and image acquisition units in a wired and/or wireless manner. The information interaction device can interact face-to-face with multiple users simultaneously, achieving information transfer. By enabling face-to-face interaction between people and the device, the interaction scene becomes more realistic, resembling face-to-face information exchange between people.

Description

Information interaction equipment and method
Technical Field
The application relates to the technical field of artificial intelligence equipment, in particular to information interaction equipment and a method.
Background
A digital human uses information-science techniques to virtually simulate the form and function of a human body, and operates in a virtual world. In medical scenarios, earlier digital humans could only be displayed on a smartphone or computer screen, and the limited screen size prevented them from conveying a sense of realism.
Constrained by the screen, a user interacting with a digital human can only issue text instructions that are translated into code, so the digital human cannot engage with the user in the kind of realistic interpersonal communication found in the real world.
Disclosure of Invention
The information interaction device provided by the application can interact face-to-face with multiple users simultaneously, achieving information transfer and improving the efficiency of information interaction.
The embodiment of the application is realized as follows:
in a first aspect, the present application provides an information interaction device, including:
at least three main body frames, adjacent main body frames being connected to one another, and the bottoms of the main body frames being joined by a bottom plate;
at least three display units, arranged along the outer periphery of the main body frames, with a voice acquisition unit provided at the side edge of each display unit;
a voice transmission unit provided on each main body frame;
an image acquisition unit provided on each voice transmission unit;
a control unit provided on the bottom plate and connected to the display units, the voice acquisition units, the voice transmission units, and the image acquisition units in a wired and/or wireless manner.
In one embodiment, each of the body frames includes:
the support frame body is connected with the bottom plate, a first mounting piece is arranged on the support frame body, and the adjacent support frame bodies are connected through the first mounting piece;
the main frame body is arranged at the top of the support frame body;
the supporting plate is arranged on the supporting frame body, and the display unit is arranged on the supporting plate.
In one embodiment, when the number of the main body frames is three, the angle of the joint between the adjacent support frames is 60 °.
In one embodiment, one of the support frames is hinged to the base plate.
In one embodiment, the voice transmission unit is inclined toward the bottom plate at a preset angle.
In one embodiment, the control unit is further configured to perform wired and/or wireless communication connection with the terminal device and the auxiliary diagnostic device.
In a second aspect, the present application provides an information interaction method, including:
each voice acquisition unit acquires a corresponding voice signal of a user in front of each display unit;
the image acquisition unit acquires an action signal sent by a user;
the control unit receives the voice signal and the action signal, processes the voice signal and the action signal and sends a processing result to the voice transmission unit and the display unit;
the voice transmission unit outputs the processing result in a voice form, and the display unit displays the processing result to a user in a text and/or picture information mode, so that man-machine interaction is realized.
In one embodiment, the image capturing unit captures an action signal sent by a user, including:
the image acquisition unit acquires a head action signal, a body action signal, and an eyeball action signal of the user;
and transmits the head action signal, the body action signal, and the eyeball action signal to the control unit for processing.
In one embodiment, the method further comprises:
the voice acquisition unit receives an auxiliary medical diagnosis voice signal of a user and sends the auxiliary medical diagnosis voice signal to the control unit;
and the control unit controls auxiliary diagnosis equipment to perform auxiliary medical diagnosis on the user according to the auxiliary medical diagnosis voice signal.
In one embodiment, the method further comprises:
the control unit arranges the processing results to generate a report;
the control unit sends the report to a terminal device of a user.
Compared with the prior art, the beneficial effects of the application are as follows: the information interaction device can interact face-to-face with multiple users simultaneously, achieving information transfer. The image acquisition unit acquires head action signals, body action signals, and eyeball action signals of the human body in real time, the voice acquisition unit acquires the user's voice information, and the control unit processes the action and voice information and transmits the results to the display unit and the voice transmission unit, presenting the digital information the user needs. The information interaction device realizes face-to-face interaction between people and the device, making the interaction scene more realistic, similar to face-to-face information exchange between people.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered limiting of the scope; for those skilled in the art, other related drawings can be derived from these drawings without inventive effort.
Fig. 1 is a schematic structural diagram of an information interaction device according to an embodiment of the present application;
FIG. 2 is an exploded view of FIG. 1;
fig. 3 is a schematic flowchart of an information interaction method according to an embodiment of the present application.
Icon:
1-an information interaction device; 11-a body frame; 111-a support frame body; 1111-a first mounting; 112-the main frame body; 113-a support plate; 12-a base plate; 13-a display unit; 131-a voice acquisition unit; 14-a voice transmission unit; 15-an image acquisition unit; 16-a control unit.
Detailed Description
The terms "first," "second," "third," and the like are used for descriptive purposes only, do not indicate or imply relative importance, and do not denote any particular order.
Furthermore, the terms "horizontal," "vertical," "overhanging," and the like do not require that components be absolutely horizontal or overhanging; they may be slightly inclined. For example, "horizontal" merely means that a direction is closer to horizontal than "vertical"; the structure need not be perfectly horizontal and may be slightly inclined.
In the description of the present application, it should be noted that the terms "inside", "outside", "left", "right", "upper", "lower", and the like indicate orientations or positional relationships based on orientations or positional relationships shown in the drawings or orientations or positional relationships that are conventionally arranged when products of the application are used, and are used only for convenience in describing the application and simplifying the description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the application.
In the description of the present application, unless expressly stated or limited otherwise, the terms "disposed," "mounted," "connected," and "coupled" are to be construed broadly: the connection may be fixed, removable, or integral; mechanical or electrical; direct, indirect through an intermediate medium, or internal between two elements.
The technical solution of the present application will be clearly and completely described below with reference to the accompanying drawings.
As the name implies, a digital human is an information interaction device capable of exchanging information with a user. Existing information interaction devices transfer information through touch input and similar means, so the interaction is slow and not intuitive. Moreover, a user interacting with a digital human can only issue text instructions and, constrained by the screen, cannot achieve genuine communicative interaction with the digital human.
Based on the above problem, referring to fig. 1 and fig. 2, the present application provides an information interaction device 1, which includes: at least three main body frames 11, each adjacent main body frame being connected to each other, the bottom of each main body frame 11 being connected by a bottom plate 12; at least three display units 13 respectively arranged along the outer circumference of each main body frame 11, and a voice acquisition unit 131 is arranged on the side edge of each display unit 13; each main body frame 11 is provided with a voice transmission unit 14, each voice transmission unit 14 is provided with an image acquisition unit 15, the control unit 16 is arranged on the bottom plate 12, and the control unit 16 is respectively connected with the display unit 13, the voice acquisition unit 131, the voice transmission unit 14 and the image acquisition unit 15 in a wired and/or wireless mode.
Referring to fig. 2, each main body frame 11 includes a support frame body 111, a main frame body 112, and a support plate 113. The support frame body 111 is connected with the bottom plate 12; a first mounting member 1111 is provided on the support frame body 111, and adjacent support frame bodies 111 are connected through the first mounting members 1111. The first mounting member 1111 may be a screw, with adjacent support frame bodies 111 fixed by the screw connection. The support plate 113 is disposed on the support frame body 111, and the display unit 13 is disposed on the support plate 113. The support plate 113 may be mounted on the support frame body 111 by welding or screw coupling, and the display unit 13 may likewise be mounted on the support plate 113 by welding or screw coupling.
In this embodiment, the number of main body frames 11 can be chosen according to the place of use, but is not less than three. In one embodiment, when there are three main body frames 11, the angle at the joint of adjacent support frame bodies 111 is 60°, forming an equilateral-triangle information interaction device 1. When there are four main body frames 11, the angle at the joint of adjacent support frame bodies 111 is 90°, forming a square information interaction device 1. Larger numbers of main body frames 11 yield other shapes, which are not described in detail here.
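The triangle and square examples follow the interior-angle formula for a regular polygon; a minimal sketch (illustrative, not part of the patent):

```python
def joint_angle(num_frames: int) -> float:
    """Interior angle in degrees at each joint of the regular polygon
    formed by `num_frames` main body frames (the patent requires >= 3)."""
    if num_frames < 3:
        raise ValueError("at least three main body frames are required")
    return (num_frames - 2) * 180.0 / num_frames

# joint_angle(3) -> 60.0 (equilateral triangle), joint_angle(4) -> 90.0 (square)
```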
In a hospital, clinic or pharmacy, the shape of the information interaction device 1 can be designed according to the number of users and the use site, so as to meet the use requirements of more users. In this embodiment, when the information interaction device 1 is a regular triangle, that is, one information interaction device 1 can simultaneously transmit information with three users.
In order to facilitate the maintenance of the control unit 16 by a maintenance worker when the control unit 16 is out of order, in one embodiment, one of the support frame bodies 111 is hinged to the base plate 12. For example, one of the support frames 111 may be fixed to the base plate 12 by a hinge, so that the support frame 111 may be opened, thereby facilitating a serviceman to access the base plate 12 for maintenance of the control unit 16.
In one embodiment, the display unit 13 may be an IPS LCD screen larger than 65 inches; in the present application, a 98-inch IPS LCD screen is used, so information can be displayed comprehensively. A voice acquisition unit 131 is provided at the side edge of each display unit 13; the voice acquisition unit 131 may be a microphone array arranged along the edge of the display unit 13, used to record the user's voice in real time while suppressing noise and removing reverberation, improving recognition accuracy.
In one embodiment, the voice transmission unit 14 is inclined toward the bottom plate 12 at a preset angle. Unlike an ordinary loudspeaker, the voice transmission unit 14 emits sound within a limited range; to deliver the best voice output to the user, the voice transmission unit 14 may be inclined 30° toward the bottom plate 12, which also ensures that adjacent information interaction devices 1 do not interfere with one another.
In one embodiment, the control unit 16 is further configured for wired and/or wireless communication with the terminal device and the auxiliary diagnostic device, and is used to control medical equipment for auxiliary diagnosis. For example, the control unit 16 may connect to the auxiliary diagnostic device through NFC (Near Field Communication), BT (Bluetooth), or RFID (Radio Frequency Identification).
The terminal device may be the user's electronic product, such as a mobile phone, laptop, or tablet computer, or a wearable portable device such as a Bluetooth headset, smart glasses, or a smart watch. The control unit 16 may also transmit the digital information the user specifies to a printer over a wired network connection; the printer can then print out medical examination reports for the user to review. The digital information a user may need includes consulting-room location information, medical staff information, registration and payment bill information, medical report information, X-ray film information, and the like.
In this embodiment, the information interaction device 1 may also be externally connected with diagnosis and treatment equipment, which may include a blood pressure monitor, a blood oxygen monitor, a height and weight measuring instrument, a stethoscope, and the like. The user can state by voice that blood pressure, blood oxygen, or height and weight should be measured; the voice acquisition unit 131 sends the voice signal to the control unit 16, which receives the instruction and switches the oximeter, blood pressure monitor, or height and weight measuring instrument connected over the wired or wireless network into its working state. The user can then use these instruments to measure blood pressure, blood oxygen, and height and weight, assisting medical diagnosis.
In another embodiment, the control unit 16 may also remotely call an intelligent mobile robot in the hospital through a wired or wireless internet connection (e.g. WiFi or 5G): it processes the user's voice or action command into machine language and issues it to the robot in real time, and the robot performs the corresponding operation, for example guiding the user to a designated medical point, or leading the user to the oximeter, blood pressure monitor, or height and weight measuring instrument and explaining how to operate it.
Please refer to fig. 3, a flowchart of an information interaction method according to an embodiment of the present application. The method can be applied to the information interaction device 1 shown in fig. 1 and fig. 2. The device 1 not only realizes real-time information interaction with the user but can also assist a doctor in performing simple diagnosis, and the user can obtain specific digital information as needed. The method comprises steps S210 to S240:
step S210: each voice collecting unit 131 collects the voice signal of the corresponding user in front of each display unit 13.
In this step, the user stands in front of the corresponding display unit 13, faces it, and gives a voice instruction according to the prompt information displayed on the display unit 13; the voice acquisition unit 131 corresponding to each display unit 13 then collects the voice signal sent by its user.
For example, the interface on the display unit 13 displays available services, including inquiry services, outpatient registration services, report printing services, inspection service windows, in-patient transaction windows, and general examination services, where the general examination services may include routine simple medical operations such as height and weight measurement, blood oxygen concentration measurement, and blood pressure measurement.
When the user utters "outpatient registration service", the voice acquisition unit 131 collects the "outpatient registration service" voice signal and transmits it to the control unit 16 for processing. Likewise, when the user utters "inquiry service", the voice acquisition unit 131 collects the "inquiry service" voice signal and transmits it to the control unit 16.
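The routing from a recognized phrase to a service can be sketched as a simple keyword lookup. The phrase list and handler names below are illustrative assumptions, not the patent's actual implementation:

```python
# Maps phrases from the on-screen service menu to handler names (assumed).
SERVICES = {
    "outpatient registration": "registration_service",
    "inquiry": "inquiry_service",
    "report printing": "printing_service",
    "general examination": "examination_service",
}

def dispatch(transcript: str) -> str:
    """Route a recognized voice transcript to a service handler name."""
    text = transcript.lower()
    for phrase, handler in SERVICES.items():
        if phrase in text:
            return handler
    return "unrecognized"
```

For example, `dispatch("outpatient registration service")` selects the registration handler, while an unmatched utterance falls through to `"unrecognized"` so the device can re-prompt.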
In one embodiment, the voice collecting unit 131 receives the auxiliary medical diagnosis voice signal of the user and sends the auxiliary medical diagnosis voice signal to the control unit 16; the control unit 16 controls the auxiliary diagnosis device to perform auxiliary medical diagnosis on the user according to the auxiliary medical diagnosis voice signal.
The auxiliary diagnosis equipment comprises a blood pressure meter, an oximeter, a height and weight measuring instrument, a stethoscope, and the like. When the user issues an auxiliary medical diagnosis voice signal, for example "please measure my blood pressure", the voice acquisition unit 131 collects the signal and sends it to the control unit 16; the control unit 16 receives it and, through software, switches on the blood pressure meter, which then automatically measures the user's blood pressure as required.
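The keyword-to-device step of this flow can be sketched as follows; the keyword list and device names are assumed for illustration, and the actual activation would go over the wired or wireless link described earlier:

```python
# Illustrative mapping from command keywords to auxiliary diagnosis devices.
DEVICE_FOR_KEYWORD = {
    "blood pressure": "blood_pressure_meter",
    "blood oxygen": "oximeter",
    "height": "height_weight_scale",
    "weight": "height_weight_scale",
}

def devices_to_activate(command: str) -> set:
    """Return the set of auxiliary devices a voice command should switch on."""
    text = command.lower()
    return {dev for kw, dev in DEVICE_FOR_KEYWORD.items() if kw in text}
```

A command mentioning both height and weight resolves to the single combined measuring instrument, since both keywords map to the same device.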
Step S220: the image pickup unit 15 picks up an action signal from a user.
In one embodiment, the image capturing unit 15 captures a head movement signal, a body movement information and an eye movement signal of a user; the head motion signal, the body motion information and the eye motion signal are transmitted to the control unit 16 for processing.
In this step, the image acquisition unit 15 may be a depth camera that captures human body actions in real time, used to sense the user's movements and realize action monitoring. The action signals sent by the user include head action signals, body action signals, and eyeball action signals.
The user faces the display unit 13 and looks at the image acquisition unit 15; a user guide is displayed on the interface of the display unit 13, and the user confirms whether to execute the next operation by nodding or shaking the head according to the prompt information.
For example, the user may manually click the outpatient registration service button, or obtain the corresponding service by uttering "outpatient registration service". When multiple outpatient department entries appear, the user can manually click the desired department to decide whether to register or see a doctor. According to the prompt information, correct registration information can be confirmed by nodding, or the department declined by shaking the head; whether the registration information is correct can also be confirmed by waving or shaking a hand.
If, within a predetermined time, the user neither nods, shakes the head, nor waves, but keeps moving the eyes to repeatedly scan the department information, the image acquisition unit 15 can capture the eye movement and the device can give prompt information, such as "inquire about XXX department?" or "confirm the XXX department information?", and wait for the user's confirmation.
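The confirmation logic above (nod or wave to confirm, head-shake to decline, sustained eye movement past a timeout to trigger a re-prompt) can be sketched as a small decision function. The timeout value and the string encoding of the motion signals are assumptions for illustration:

```python
from typing import Optional

def interpret_motion(head: Optional[str], hand: Optional[str],
                     eye_dwell_s: float, timeout_s: float = 10.0) -> str:
    """Map motion signals from the depth camera to an interaction decision."""
    if head == "nod" or hand == "wave":
        return "confirm"   # registration information is correct
    if head == "shake":
        return "reject"    # not the desired department
    if eye_dwell_s >= timeout_s:
        return "prompt"    # e.g. ask "confirm the XXX department information?"
    return "wait"          # keep observing the user
```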
Step S230: the control unit 16 receives and processes the voice signal and the motion signal, and transmits the processing result to the voice transmission unit 14 and the display unit 13.
As described in step S210, the voice acquisition unit 131 collects the voice signal of the user in front of each display unit 13 and transmits it to the control unit 16; the control unit 16 processes the data, translating the voice information into machine language, and sends the processed result to the voice transmission unit 14. The image acquisition unit 15 transmits the captured action signals to the control unit 16, which processes the action information into machine language and sends the processed result to the display unit 13.
Step S240: the voice transmission unit 14 outputs the processing result in the form of voice, and the display unit 13 displays the processing result to the user as text and/or picture information, thereby realizing human-computer interaction.
As in step S230, when the user utters "inquiry service", the voice acquisition unit 131 collects the "inquiry service" voice signal and transmits it to the control unit 16. After data processing, the control unit 16 sends the query result to the voice transmission unit 14, which outputs it in voice form, for example by voice broadcast; the broadcast content may be "the outpatient departments include the internal medicine outpatient department, the surgical outpatient department, the otorhinolaryngology outpatient department", etc.
When the user confirms by nodding that the registration information is correct, declines the department by shaking the head, or confirms the registration information by waving or shaking a hand, the display unit 13 displays on screen, as text or combined text and pictures, the doctor information and consultation-hour information for the department the user inquired about, and the user can then select a registration.
In one embodiment, the control unit 16 sorts the processing results to generate a report; the control unit 16 sends the report to the user's terminal device.
For example, when the user wants to obtain digitized information such as medical report information or X-ray film information, the control unit 16 processes the user's voice and action commands, displays the digitized information as text and/or pictures through the display unit 13, collates the text and picture information into a medical report, and transmits the report to the user's mobile phone, tablet computer, or wearable portable device through a communication protocol between the control unit 16 and the user terminal. Wearable portable devices may include Bluetooth headsets, smart glasses, smart watches, and the like.
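The collate-and-send step can be sketched as below; the JSON report layout and the transfer stand-in are illustrative assumptions, not the patent's actual protocol:

```python
import json

def generate_report(results: dict) -> str:
    """Collate processing results into a report payload (assumed JSON layout)."""
    return json.dumps(
        {"sections": [{"title": k, "content": v} for k, v in results.items()]},
        ensure_ascii=False,
    )

def send_to_terminal(report: str, outbox: list) -> None:
    # Stand-in for the wired/wireless transfer to the user's phone,
    # tablet, or wearable device.
    outbox.append(report)
```

In use, the control unit would assemble the measurement and inquiry results into `results`, serialize them with `generate_report`, and hand the payload to whatever transport reaches the user's terminal.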
In another embodiment, the control unit 16 may also transmit the medical report information required by the user to a printer through a wired network connection, and print out various medical examination reports for the user to review through the printer.
In this way, the information interaction device 1 of the present application can interact face-to-face with multiple users simultaneously, achieving information transfer. The image acquisition unit 15 acquires head, body, and eyeball action signals of the human body in real time, the voice acquisition unit 131 acquires the user's voice information, and the control unit 16 processes the action and voice information and transmits the results to the display unit 13 and the voice transmission unit 14 to display the digital information the user needs. The information interaction device 1 realizes face-to-face interaction between people and the device, making the interaction scene more realistic, similar to face-to-face information exchange between people.
It should be noted that the features of the embodiments in the present application may be combined with each other without conflict.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. An information interaction device, comprising:
at least three main body frames, wherein each adjacent main body frame is connected with each other, and the bottom of each main body frame is connected through a bottom plate;
the at least three display units are arranged along the outer periphery of each main body frame respectively, and a voice acquisition unit is arranged on the side edge of each display unit;
the voice transmission unit is arranged on each main body frame;
the image acquisition unit is arranged on the voice transmission unit;
the control unit is arranged on the bottom plate and connected with the display unit, the voice acquisition unit, the voice transmission unit and the image acquisition unit in a wired and/or wireless mode.
2. The information interaction apparatus according to claim 1, wherein each of the main body frames includes:
the support frame body is connected with the bottom plate, a first mounting piece is arranged on the support frame body, and the adjacent support frame bodies are connected through the first mounting piece;
the main frame body is arranged at the top of the support frame body;
the supporting plate is arranged on the supporting frame body, and the display unit is arranged on the supporting plate.
3. The information interaction apparatus according to claim 2, wherein when the number of main body frames is three, the angle at the junction of adjacent support frame bodies is 60°.
4. The information interaction device of claim 2, wherein one of the support shelves is hinged to the base plate.
5. The information interaction device of claim 1, wherein the voice transmission unit is inclined in a direction of the bottom plate according to a preset angle.
6. The information interaction device of claim 1, wherein the control unit is further configured to be connected in wired and/or wireless communication with a terminal device and an auxiliary diagnostic device.
7. An information interaction method, comprising:
each voice acquisition unit acquires a voice signal of a user correspondingly positioned in front of each display unit;
the image acquisition unit acquires an action signal sent by a user;
the control unit receives the voice signal and the action signal, processes the voice signal and the action signal and sends a processing result to the voice transmission unit and the display unit;
the voice transmission unit outputs the processing result in a voice form, and the display unit displays the processing result to a user in a text and/or picture information mode, so that man-machine interaction is realized.
8. The method of claim 7, wherein the image acquisition unit acquiring the action signal sent by the user comprises:
the image acquisition unit acquires a head action signal, a body action signal and an eyeball action signal of the user; and
the head action signal, the body action signal and the eyeball action signal are transmitted to the control unit for processing.
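A minimal sketch of the signal forwarding in claim 8; the dictionary keys and the `collect_motion` function are illustrative assumptions:

```python
def collect_motion(frame: dict) -> dict:
    """Forward only the three claimed action signals to the control unit.

    `frame` stands in for a per-frame analysis result from the image
    acquisition unit; extra keys are dropped before forwarding.
    """
    return {k: frame.get(k) for k in ("head_action", "body_action", "eyeball_action")}
```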
9. The method of claim 7, further comprising:
the voice acquisition unit receives an auxiliary medical diagnosis voice signal from the user and sends it to the control unit; and
the control unit controls auxiliary diagnosis equipment to perform auxiliary medical diagnosis on the user according to the auxiliary medical diagnosis voice signal.
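The voice-triggered dispatch of claim 9 might look like the following sketch; the keyword matching and the equipment names are assumptions for illustration, not the claimed implementation:

```python
from typing import Optional

# Hypothetical mapping from keywords in the transcribed voice signal to
# auxiliary diagnosis equipment the control unit could drive.
DEVICE_MAP = {
    "temperature": "infrared_thermometer",
    "blood pressure": "bp_monitor",
    "heart": "ecg_module",
}

def dispatch_diagnosis(voice_text: str) -> Optional[str]:
    """Select auxiliary diagnosis equipment from the user's voice signal."""
    text = voice_text.lower()
    for keyword, device in DEVICE_MAP.items():
        if keyword in text:
            return device
    return None  # the voice signal requested no auxiliary diagnosis
```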
10. The method of claim 7, further comprising:
the control unit compiles the processing results to generate a report; and
the control unit sends the report to a terminal device of the user.
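The report flow of claim 10 can be sketched as follows; the report fields and the `send_to_terminal` stand-in are illustrative assumptions:

```python
import json

def generate_report(results: list) -> str:
    """Compile the control unit's processing results into a report."""
    return json.dumps({"item_count": len(results), "items": results},
                      ensure_ascii=False)

def send_to_terminal(report: str, terminal_id: str) -> dict:
    """Stand-in for transmitting the report to the user's terminal device."""
    return {"terminal": terminal_id, "payload": report, "status": "sent"}
```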
CN202211407166.6A 2022-11-10 2022-11-10 Information interaction equipment and method Pending CN115881313A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211407166.6A CN115881313A (en) 2022-11-10 2022-11-10 Information interaction equipment and method

Publications (1)

Publication Number Publication Date
CN115881313A true CN115881313A (en) 2023-03-31

Family

ID=85759674

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination