CN115620889A - Interaction method, processing device, system, device and medium based on augmented reality - Google Patents


Info

Publication number
CN115620889A
CN115620889A
Authority
CN
China
Prior art keywords
augmented reality
diagnosed
image
historical
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211255883.1A
Other languages
Chinese (zh)
Inventor
李亦超
王星汉
严鸣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN202211255883.1A priority Critical patent/CN115620889A/en
Publication of CN115620889A publication Critical patent/CN115620889A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00 ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/60 ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Biomedical Technology (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Business, Economics & Management (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The invention discloses an augmented reality-based interaction method, processing device, system, device and medium. The interaction method comprises the following steps: acquiring current image data of an object to be diagnosed; acquiring historical diagnosis and treatment data of the object to be diagnosed according to the current image data; and generating an augmented reality image according to the historical diagnosis and treatment data and sending the augmented reality image to an augmented reality device for display. According to the invention, when a doctor diagnoses the object to be diagnosed, the doctor can view the historical diagnosis and treatment data as an augmented reality image and interact in real time with the images and text they contain. The doctor can therefore observe the historical diagnosis and treatment data more intuitively and obtain them more conveniently, which improves the comprehensiveness of the information collected about the object to be diagnosed.

Description

Interaction method, processing device, system, device and medium based on augmented reality
Technical Field
The invention relates to the technical field of digital medical treatment, and in particular to an augmented reality-based interaction method, processing device, system, device and medium.
Background
In the medical diagnosis process, a doctor needs to reach an accurate diagnosis based on the chief complaint, clinical examinations, current medical history, past medical history, personal life history and family medical history of the subject to be diagnosed. Generally, the more information about the subject that is known, the greater the probability of a correct judgment. In traditional diagnosis and treatment, however, the doctor often has to gather this information through oral inquiry; differences in the subject's memory and ability to express themselves can lead the doctor to misunderstand what the subject says, resulting in low efficiency and a high misdiagnosis rate.
At present, the ways in which a doctor can learn about a subject to be diagnosed are generally limited to calling up the subject's in-hospital data on a personal computer. Reviewing historical diagnosis and treatment data this way is neither convenient nor intuitive, since the doctor must switch back and forth between the subject's various records; in addition, the information collected about the subject during diagnosis is often not comprehensive enough.
Disclosure of Invention
The invention provides an augmented reality-based interaction method, processing device, system, device and medium, aiming to overcome the defects of the prior art that information collection about the object to be diagnosed is not comprehensive enough and that the diagnosis workflow is neither convenient nor intuitive.
The invention solves the technical problems through the following technical scheme:
the invention provides an interaction method based on augmented reality, which comprises the following steps:
acquiring current image data of a to-be-diagnosed object;
acquiring historical diagnosis and treatment data of the object to be diagnosed according to the current image data;
and generating an augmented reality image according to the historical diagnosis and treatment data and sending the augmented reality image to an augmented reality device for display.
Preferably, the step of generating an augmented reality image according to the historical diagnosis and treatment data includes:
acquiring historical medical images in the historical diagnosis and treatment data;
determining a first human body part corresponding to the historical medical image;
acquiring position information of the first human body part corresponding to the current image of the object to be diagnosed;
acquiring position information of the augmented reality device;
registering the historical medical image to a corresponding body part of the current image of the object to be diagnosed based on the position information of the first body part and the position information of the augmented reality equipment to generate a corresponding first augmented reality image;
or, acquiring the text information in the historical diagnosis and treatment data;
generating a second augmented reality image from the text information;
or, acquiring the mutually matched historical medical image and text information in the historical diagnosis and treatment data;
generating a third augmented reality image based on the historical medical image and the text information;
and/or the interaction method comprises the following steps:
acquiring a thermal imaging image of the object to be diagnosed;
and displaying the thermal imaging image on a corresponding body surface position of the current image of the object to be diagnosed to generate a fourth augmented reality image.
Preferably, the step of acquiring a thermal imaging image of the subject to be diagnosed includes:
acquiring infrared ray information and/or heat information of the object to be diagnosed;
mapping the infrared information or the thermal information to color or grayscale data to generate the thermographic image.
Preferably, when the historical diagnosis and treatment data of the object to be diagnosed include multiple instances of historical diagnosis and treatment data for the same organ, the step of generating the augmented reality image according to the historical diagnosis and treatment data includes:
acquiring the historical medical image of each instance of the historical diagnosis and treatment data;
sequentially generating a plurality of augmented reality images according to a time sequence, and sending the images to the augmented reality equipment for display;
and/or after the step of sending to the augmented reality device for display, the interaction method further comprises:
receiving a control instruction;
and performing corresponding operation on the augmented reality image according to the control instruction.
Preferably, before the step of sequentially generating a plurality of augmented reality images in time order, the interaction method further includes:
acquiring the text information of each instance of the historical diagnosis and treatment data;
and displaying the historical medical image of each instance in a different color according to the text information.
Preferably, the control instruction comprises a click instruction generated by a spatial interaction device;
the step of performing corresponding operation on the augmented reality image according to the control instruction comprises:
receiving the click command;
acquiring the position information of the spatial interaction device and the position information of the region to be clicked in the augmented reality image;
calculating the distance between the spatial interaction device and the region to be clicked;
and if the distance is smaller than a preset threshold, performing the corresponding operation on the augmented reality image;
and/or, the control instruction comprises a voice instruction;
the step of performing corresponding operation on the augmented reality image according to the control instruction comprises the following steps:
receiving the voice instruction;
acquiring voice information according to the voice instruction;
analyzing the voice information to obtain the logic of the voice instruction;
acquiring corresponding operation logic for the augmented reality image according to the logic of the voice instruction;
and performing corresponding operation on the augmented reality image according to the operation logic.
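The click-instruction logic above can be sketched in a few lines. This is only an illustration of the distance-threshold test, not the disclosed implementation: the 3 cm threshold, the coordinate convention, and the `on_click` callback are all hypothetical.

```python
import math

CLICK_THRESHOLD_M = 0.03  # hypothetical preset threshold: 3 cm


def handle_click(device_pos, target_pos, on_click):
    """Perform the corresponding operation on the augmented reality image
    when the spatial interaction device is close enough to the region
    to be clicked; otherwise ignore the click instruction."""
    distance = math.dist(device_pos, target_pos)  # Euclidean distance
    if distance < CLICK_THRESHOLD_M:
        on_click()
        return True
    return False
```

A click within the threshold invokes the callback; a click from farther away is discarded.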
The invention also provides an augmented reality-based interaction processing device, which comprises:
the first acquisition module is used for acquiring current image data of the object to be diagnosed;
the second acquisition module is used for acquiring historical diagnosis and treatment data of the object to be diagnosed according to the current image data;
and the display module is used for generating an augmented reality image according to the historical diagnosis and treatment data and sending the augmented reality image to augmented reality equipment for display.
The invention also provides an augmented reality-based interaction system, which comprises the above augmented reality-based interaction processing device, an augmented reality device and a medical platform database;
the augmented reality device is used for displaying an augmented reality image;
the medical platform database is used for storing historical diagnosis and treatment data.
The invention further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the processor implements the augmented reality-based interaction method as described above.
The invention further provides a computer-readable storage medium, on which a computer program is stored, which computer program, when executed by a processor, implements an augmented reality based interaction method as described above.
The positive progress effects of the invention are as follows:
the invention provides an interaction method, processing equipment, a system, equipment and a medium based on augmented reality, wherein the interaction method acquires historical diagnosis and treatment data of an object to be diagnosed according to current image data, generates an augmented reality image according to the historical diagnosis and treatment data and sends the augmented reality image to augmented reality equipment for display, so that a doctor can acquire the historical diagnosis and treatment data of the object to be diagnosed in an augmented reality image mode when diagnosing the object to be diagnosed, and can interact with images and characters in the historical diagnosis and treatment data in real time, so that the doctor can observe the historical diagnosis and treatment data of the object to be diagnosed more intuitively, the doctor can acquire the historical diagnosis and treatment data of the object to be diagnosed conveniently, and the comprehensiveness of collecting information of the object to be diagnosed is improved.
Drawings
Fig. 1 is a first flowchart of an interaction method based on augmented reality according to embodiment 1 of the present invention;
fig. 2 is a first flowchart of step S103 according to embodiment 1 of the present invention;
fig. 3 is a second flowchart of step S103 according to embodiment 1 of the present invention;
FIG. 4 is a third flowchart of step S103 according to embodiment 1 of the present invention;
FIG. 5 is a schematic view of a user interface according to embodiment 1 of the present invention;
fig. 6 is a second flowchart of an interaction method based on augmented reality according to embodiment 1 of the present invention;
fig. 7 is an interaction processing device based on augmented reality according to embodiment 2 of the present invention;
fig. 8 is a first block diagram of a display module according to embodiment 2 of the present invention;
fig. 9 is a second block diagram of a display module according to embodiment 2 of the present invention;
fig. 10 is a third block diagram of a display module according to embodiment 2 of the present invention;
FIG. 11 is a first block diagram of an operation block according to embodiment 2 of the present invention;
FIG. 12 is a second block diagram of the operation block according to embodiment 2 of the present invention;
fig. 13 is a schematic block diagram of an interaction system based on augmented reality according to embodiment 3 of the present invention;
fig. 14 is a schematic structural diagram of an interaction system based on augmented reality according to embodiment 3 of the present invention;
fig. 15 is a schematic structural diagram of an electronic device according to embodiment 4 of the present invention.
Detailed Description
The invention is further illustrated by the following examples, which are not intended to limit the invention thereto.
It should be noted that references in the specification to "one embodiment," "an alternative embodiment," "another embodiment," etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the relevant art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
In the description of the present disclosure, it is to be understood that the terms "center," "lateral," "upper," "lower," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate the orientation or positional relationship shown in the drawings, are used only for convenience and simplicity of description, and do not indicate or imply that the referenced device or element must have a particular orientation or be constructed and operated in a particular orientation; they should therefore not be construed as limiting the present disclosure. Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present disclosure, "a plurality" means two or more unless otherwise specified. Furthermore, the term "comprises" and any variations thereof are intended to cover non-exclusive inclusion.
In the description of the present disclosure, it is to be noted that, unless otherwise explicitly specified or limited, the terms "mounted" and "connected" are to be construed broadly, e.g., as meaning a fixed connection, a removable connection, or an integral connection; a mechanical or an electrical connection; a direct connection or an indirect connection through intervening media, or an internal communication between two elements. The specific meanings of the above terms in the present disclosure can be understood by those of ordinary skill in the art on a case-by-case basis.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Example 1
As shown in fig. 1, the present embodiment discloses an interaction method based on augmented reality, where the interaction method includes:
s101, acquiring current image data of a to-be-diagnosed object;
specifically, the image data includes spatial coordinates and posture information of the object to be diagnosed in a reference system of the camera unit, and the posture information includes posture and information of five sense organs of the object to be diagnosed.
Step S102, obtaining historical diagnosis and treatment data of the object to be diagnosed according to the current image data;
specifically, the target detection unit is used for identifying whether the image data contains the person information, if so, the identity identification unit extracts facial feature information and posture feature information of five sense organs corresponding to the person information, and the two units are usually constructed by adopting a neural network. The neural network can process the three-channel color image containing the human body characteristic information into a characteristic vector and compare the characteristic vector with a characteristic vector in a medical platform database, so that the best matching identity is found, the file of the object to be diagnosed is called, and the historical diagnosis and treatment data of the object to be diagnosed are obtained. In the scheme, the data storage unit is connected with the cloud medical platform database, the data storage unit can download and upload the information of the current object to be diagnosed in real time, and the information obtained by the object to be diagnosed after detection of medical imaging equipment in a hospital is synchronized with the medical platform database at any time. The medical platform database can store historical diagnosis and treatment data of a plurality of hospitals, so that the problem that the historical diagnosis and treatment data of the objects to be diagnosed are not uniformly stored among the hospitals is solved, and the problem that a doctor can only see the historical diagnosis and treatment data of the objects to be diagnosed in the hospital when the doctor asks for the objects to be diagnosed is also solved.
And S103, generating an augmented reality image according to the historical diagnosis and treatment data and sending the augmented reality image to an augmented reality device for display. Augmented reality here refers broadly to combining the real and the virtual through a computer to create an environment capable of human-computer interaction, and is used as a general term covering technologies such as VR (virtual reality), AR (augmented reality) and MR (mixed reality). By fusing these visual interaction technologies, the user experiences an immersive, seamless transition between the virtual world and the real world.
In particular, the augmented reality device may be a wearable device, for example head-mounted glasses fitted with an image sensing unit; once the glasses are switched on, the image sensing unit captures image data of the object to be diagnosed.
According to this scheme, the interaction method obtains the historical diagnosis and treatment data of the object to be diagnosed according to the current image data, generates an augmented reality image from those data, and sends it to the augmented reality device for display. When diagnosing the subject, the doctor can thus view the historical diagnosis and treatment data as an augmented reality image and interact in real time with the images and text they contain, so that the doctor observes the data more intuitively and collects more comprehensive information about the subject. In addition, if the scanning equipment supports it, a sub-interface for networked scanning equipment can be added to the human-computer interface; the doctor can then use the handle to click the scanning equipment's virtual operation buttons in the head-mounted display to operate the equipment and carry out a remote examination.
In one implementation, step S103 can be carried out in the following three modes:
as shown in fig. 2, in the first mode, when acquiring a historical medical image from historical clinical data, the specific steps of step S103 include:
step S10311, obtaining historical medical images in the historical diagnosis and treatment data;
step S10312, determining a first human body part corresponding to the historical medical image;
step S10313, obtaining position information of the first human body part corresponding to the current image of the object to be diagnosed;
step S10314, obtaining position information of the augmented reality device;
step S10315, registering the historical medical image to a corresponding body part of the current image of the object to be diagnosed based on the position information of the first body part and the position information of the augmented reality device, and generating a corresponding first augmented reality image;
in a specific embodiment, a historical medical image in the historical diagnosis and treatment data is obtained, a human body part corresponding to the historical medical image is determined to be a lung, then, position information of the lung corresponding to the current image of the object to be diagnosed is obtained, and finally, the historical medical image is registered to the lung of the current image of the object to be diagnosed according to the position information of the lung corresponding to the historical medical image and the position information of the augmented reality device, so that a corresponding augmented reality image is generated.
As shown in fig. 3, in the second mode, when the text information in the historical diagnosis and treatment data is acquired, the specific steps of step S103 include:
step S10321, acquiring the text information in the historical diagnosis and treatment data;
step S10322, generating a second augmented reality image from the text information;
as shown in fig. 4, in a third mode, when the historical medical image and the text information that are matched with each other in the historical clinical data are acquired at the same time, the specific step of step S103 includes:
step S10331, obtaining the matched historical medical image and character information in the historical diagnosis and treatment data;
and step S10332, generating a third augmented reality image based on the historical medical image and the character information.
In this scheme, when the third augmented reality image is sent to the augmented reality device for display, the historical medical image and the text information are displayed independently but remain linked: by default only the medical image is shown, and the text information is hidden. When the user clicks the medical image with the spatial interaction device, the corresponding text information is displayed beside it.
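The linked show-on-click behavior above can be sketched as a tiny state holder. This is purely illustrative; the class name is hypothetical, and modeling the click as a show/hide toggle is an assumption beyond what the text states.

```python
class LinkedAnnotation:
    """Text information linked to a medical image: hidden by default,
    shown beside the image when the image is clicked (illustrative only)."""

    def __init__(self, text):
        self.text = text
        self.visible = False  # default state: only the image is shown

    def on_image_click(self):
        """Toggle the text display and return the text now shown, if any."""
        self.visible = not self.visible
        return self.text if self.visible else None
```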
In a specific embodiment, suppose a patient complaining of heart discomfort comes to the cardiology department for consultation. The doctor wears an augmented reality device on the head, captures and identifies the patient's current image data with a camera mounted on the device, and looks up the patient's medical record, i.e. the historical diagnosis and treatment data, in the database according to the image data; the record shows that the patient has a history of myocardial ischemia and has undergone magnetic resonance imaging. The mutually matched historical medical image and text information in the historical diagnosis and treatment data are then acquired, a third augmented reality image is generated from them and sent to the augmented reality device for display: the medical record text is displayed beside the object to be diagnosed, the cardiac magnetic resonance image is displayed semi-transparently at real size over the subject's actual heart position, and the subject's chest is highlighted. The display effect is shown in the user-interface schematic of fig. 5; the specific text in fig. 5 is not part of the disclosure of the present application.
According to this scheme, the historical medical images or the text information in the historical diagnosis and treatment data can be displayed either separately or together in augmented reality. The doctor can interact with the images and text in real time, observe the historical diagnosis and treatment data more intuitively, obtain them more conveniently, and thus collect more comprehensive information about the object to be diagnosed.
As shown in fig. 6, in an implementable manner, the interaction method further includes:
step S201, acquiring a thermal imaging image of the object to be diagnosed;
specifically, the step of acquiring a thermal imaging image of the subject to be diagnosed includes:
acquiring infrared information and/or heat information of the object to be diagnosed;
in the scheme, the infrared information and/or the heat information of the object to be diagnosed can be acquired through an infrared night vision device or a thermal imager.
Mapping the infrared information or the thermal information to color or grayscale data to generate the thermographic image.
Step S202, displaying the thermal imaging image at the body surface position corresponding to the current image of the object to be diagnosed, to generate a fourth augmented reality image.
According to this scheme, the thermal imaging image is displayed at the corresponding body surface position of the current image of the object to be diagnosed to generate a fourth augmented reality image. Because the fourth augmented reality image reflects the heat or blood flow distribution of the object to be diagnosed, the object to be diagnosed can be observed in a multi-dimensional and multi-level manner.
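The mapping step above can be sketched as follows. The temperature range, the linear scaling, and the specific color endpoints are illustrative assumptions, not part of the disclosure:

```python
def to_grayscale(temps, t_min=30.0, t_max=40.0):
    """Map raw body-surface temperature readings (deg C) to 0-255 grayscale."""
    out = []
    for t in temps:
        t = max(t_min, min(t_max, t))  # clamp to the assumed display range
        out.append(round(255 * (t - t_min) / (t_max - t_min)))
    return out

def to_pseudocolor(temps, t_min=30.0, t_max=40.0):
    """Map temperatures to RGB tuples: cold regions blue, hot regions red."""
    return [(g, 0, 255 - g) for g in to_grayscale(temps, t_min, t_max)]

# 30 deg C maps to the cold end, 40 deg C to the hot end of the scale.
gray = to_grayscale([30.0, 40.0])   # [0, 255]
rgb = to_pseudocolor([40.0])        # [(255, 0, 0)]
```

Any monotonic temperature-to-color transfer function would serve here; a linear ramp is merely the simplest choice for a sketch.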
In an implementable manner, when the historical diagnosis and treatment data of the object to be diagnosed include multiple times of historical diagnosis and treatment data of the same organ of the object to be diagnosed, the step of generating the augmented reality image according to the historical diagnosis and treatment data includes:
acquiring historical medical images of the historical diagnosis and treatment data each time;
and sequentially generating a plurality of augmented reality images according to the time sequence, and sending the images to the augmented reality equipment for displaying.
According to this scheme, the medical images in the multiple times of historical diagnosis and treatment data of the same organ of the object to be diagnosed are obtained from the medical platform: the medical images can be downloaded from the medical data platform to a local data storage unit, and after operations such as filtering and coloring, the group of medical images is interpolated along the time sequence, so that a plurality of augmented reality images are generated in chronological order, yielding a group of time-varying image data. Finally, the plurality of augmented reality images are sent to the augmented reality device for display; for example, they can be sent to an image display unit of the head-mounted device for dynamic display.
According to this scheme, a plurality of augmented reality images can be generated in chronological order from the multiple times of historical diagnosis and treatment data of the same organ of the object to be diagnosed and sent to the augmented reality device for display, so that the doctor can visually view all of that data at once and grasp the development trend of the object to be diagnosed more intuitively and accurately.
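The temporal interpolation described above can be sketched as follows. Linear blending between consecutive scans is an assumption for illustration; the disclosure does not fix the interpolation scheme:

```python
def interpolate_series(frames, steps_between=1):
    """Insert linearly blended frames between consecutive historical scans.

    `frames` is a time-ordered list of images (each a flat list of pixel
    intensities); the result is a denser, time-varying sequence suitable
    for dynamic display on the augmented reality device.
    """
    if len(frames) < 2:
        return list(frames)
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for s in range(1, steps_between + 1):
            w = s / (steps_between + 1)  # blend weight for this in-between frame
            out.append([(1 - w) * x + w * y for x, y in zip(a, b)])
    out.append(frames[-1])
    return out

# Three yearly scans of the same organ (one pixel each), densified to five frames.
sequence = interpolate_series([[0.0], [2.0], [4.0]], steps_between=1)
# sequence == [[0.0], [1.0], [2.0], [3.0], [4.0]]
```

In practice a registration step would align the scans before blending; this sketch assumes the frames are already aligned pixel-for-pixel.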
In an implementable manner, before the step of sequentially generating the plurality of augmented reality images in chronological order, the interaction method further includes:
acquiring the text information of each time of the historical diagnosis and treatment data;
and displaying the historical medical image of each time of the historical diagnosis and treatment data in different colors according to the text information.
In a specific embodiment, an object to be diagnosed comes for a medical consultation, and the doctor wears an augmented reality device on the head. After the object to be diagnosed is photographed and identified through a camera arranged on the augmented reality device, the medical record of the object to be diagnosed, namely the historical diagnosis and treatment data, is found in the database. Across multiple historical physical examinations, the object to be diagnosed has suffered from moderate fatty liver with a tendency toward severe development, and the historical diagnosis and treatment data include historical medical images such as color ultrasound scans. The text information of each time of the historical diagnosis and treatment data is acquired, and the historical medical image of each time is displayed in different colors according to the text information: for example, parts with high fat content are displayed in bright red and parts with low fat content in light blue. A plurality of augmented reality images are then generated in chronological order and sent to the augmented reality device for display. In this way, the liver examination data of the object to be diagnosed, from the first examination to the latest, are displayed dynamically, so that the doctor can clearly see the change of the fatty liver disease through the dynamic color change of the reconstructed liver and give the next examination item or treatment suggestion accordingly.
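The text-driven color coding in this example can be sketched as follows. The 0.3 threshold and the exact RGB values are illustrative assumptions; the disclosure only specifies bright red for high fat content and light blue for low:

```python
def color_for_fat_fraction(fraction):
    """Choose a display color from a reported fat fraction (0.0-1.0).

    Threshold and RGB values are assumed for illustration only.
    """
    if fraction >= 0.3:
        return (255, 64, 64)     # bright red: high fat content
    return (173, 216, 230)       # light blue: low fat content

def colorize(region_fractions):
    """Map each liver region's fat fraction to its display color."""
    return {name: color_for_fat_fraction(f)
            for name, f in region_fractions.items()}

colors = colorize({"left_lobe": 0.45, "right_lobe": 0.12})
# left lobe renders bright red, right lobe light blue
```

A real system would parse the fat fractions out of the text information of each record; here they are passed in directly for brevity.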
In an implementable manner, after the step of sending to the augmented reality device for display, the interaction method further includes:
receiving a control instruction;
and performing corresponding operation on the augmented reality image according to the control instruction.
Specifically, when the control instruction includes a click instruction generated by a spatial interaction device, the step of performing a corresponding operation on the augmented reality image according to the control instruction includes:
receiving the click command;
acquiring position information of the spatial interaction device and position information of a region to be clicked in the augmented reality image;
calculating the distance between the spatial interaction device and the region to be clicked;
and if the distance is smaller than a preset threshold value, performing corresponding operation on the augmented reality image.
According to this scheme, the position information of the spatial interaction device and the position information of the region to be clicked in the augmented reality image are acquired, and the distance between them is analyzed so that the corresponding operation is performed on the augmented reality image. A doctor can thus, according to his or her own viewing angle, use a stylus to drag away a graphic element blocking the line of sight, or add a mark to a region of interest on the object to be diagnosed.
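The distance test in the steps above can be sketched as follows. The shared coordinate frame and the 3 cm threshold are assumptions for illustration:

```python
import math

def handle_click(device_pos, region_pos, threshold=0.03):
    """Accept the click only when the spatial interaction device is
    within `threshold` meters of the region to be clicked."""
    return math.dist(device_pos, region_pos) < threshold

# Stylus tip about 1 cm from the virtual element: click accepted.
accepted = handle_click((0.10, 0.20, 0.30), (0.10, 0.21, 0.30))
# Stylus tip 10 cm away: click ignored.
rejected = handle_click((0.0, 0.0, 0.0), (0.1, 0.0, 0.0))
```

Both positions are assumed to be expressed in the same world frame, e.g. the reference frame of the headset's tracking system.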
When the control instruction includes a voice instruction, the step of performing the corresponding operation on the augmented reality image according to the control instruction includes:
receiving the voice instruction;
acquiring voice information according to the voice instruction;
analyzing the voice information to obtain the logic of the voice instruction;
acquiring corresponding operation logic for the augmented reality image according to the logic of the voice instruction;
and performing corresponding operation on the augmented reality image according to the operation logic.
According to this scheme, the voice instruction is received through an audio sensing unit, and the voiceprint features of the received voice instruction are recognized; if the speaker is recognized as the doctor, the specific logic of the voice instruction is further analyzed, and the operation logic corresponding to the augmented reality image is obtained according to the logic of the voice instruction.
The scheme can perform corresponding operation on the augmented reality image through a voice instruction, for example, a doctor can start a thermal imaging function through voice.
According to this scheme, the corresponding operation can be performed on the augmented reality image according to the control instruction, for example through a click instruction or a voice instruction, so that the doctor can interact in real time with the medical images or text information of the historical diagnosis and treatment data and learn about the object to be diagnosed in real time and in an all-around manner.
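The voice-control flow can be sketched as follows. The command vocabulary and operation names are assumptions, and real voiceprint recognition and speech-to-text are outside this sketch, represented only by a boolean flag and a transcript string:

```python
# Assumed command vocabulary; the disclosure does not fix the exact phrases.
COMMANDS = {
    "start thermal imaging": "enable_thermal_overlay",
    "hide text": "hide_text_info",
    "next image": "show_next_image",
}

def execute_voice_instruction(transcript, speaker_is_doctor):
    """Map a transcribed voice instruction to an operation on the
    augmented reality image; instructions from anyone other than the
    doctor (per the voiceprint check) are ignored."""
    if not speaker_is_doctor:
        return None
    return COMMANDS.get(transcript.strip().lower())

op = execute_voice_instruction("Start thermal imaging", speaker_is_doctor=True)
# op == "enable_thermal_overlay"
```

This matches the example in the text where a doctor starts the thermal imaging function by voice.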
Example 2
As shown in fig. 7, the present embodiment discloses an interaction processing device based on augmented reality, where the interaction processing device includes:
the first acquisition module 11 is configured to acquire current image data of a subject to be diagnosed;
specifically, the image data include the spatial coordinates and the posture information of the object to be diagnosed in the reference frame of the camera unit, where the posture information includes the body posture and the facial feature information of the object to be diagnosed.
A second obtaining module 12, configured to obtain historical diagnosis and treatment data of the object to be diagnosed according to the current image data;
Specifically, the second obtaining module identifies, based on a target detection unit, whether the image data contain person information; if so, an identity recognition unit extracts the facial feature information and the posture feature information corresponding to the person. Both units are usually built with neural networks: the neural network processes a three-channel color image containing human body feature information into a feature vector and compares it with the feature vectors in the medical platform database, so that the best-matching identity is found, the file of the object to be diagnosed is retrieved, and the historical diagnosis and treatment data of the object to be diagnosed are obtained. In this scheme, the data storage unit stays connected to the cloud medical platform database; it can download and upload the information of the current object to be diagnosed in real time, and the information obtained after the object to be diagnosed is examined by the medical imaging equipment in the hospital is synchronized with the medical platform database at any time. Because the medical platform database can store historical diagnosis and treatment data from multiple hospitals, it solves both the problem that the historical diagnosis and treatment data of objects to be diagnosed are stored inconsistently across hospitals and the problem that a doctor can only see the data generated in his or her own hospital during a consultation.
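The feature-vector comparison can be sketched as follows. Cosine similarity, the vector length, and the sample database are assumptions; the actual neural network that produces the vectors is not specified by the disclosure:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def best_match(query, database):
    """Return the patient id whose stored feature vector is most
    similar to the query vector extracted from the current image."""
    return max(database, key=lambda pid: cosine_similarity(query, database[pid]))

# Toy database: patient id -> stored facial feature vector.
db = {
    "patient_001": [0.9, 0.1, 0.0],
    "patient_002": [0.0, 0.8, 0.6],
}
identity = best_match([0.95, 0.05, 0.02], db)
# identity == "patient_001"
```

A production system would also apply a minimum-similarity threshold so that an unknown face does not match the nearest stored identity.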
And the first display module 13 is configured to generate an augmented reality image according to the historical diagnosis and treatment data and send the augmented reality image to the augmented reality device for display. Extended reality refers to combining the real and the virtual through a computer to create a virtual environment capable of human-computer interaction; it is an umbrella term for technologies such as VR (virtual reality), AR (augmented reality), and MR (mixed reality). By fusing these three visual interaction technologies, the user experiences an 'immersive' seamless transition between the virtual world and the real world.
Specifically, the extended reality image may be an augmented reality image. The augmented reality device may be a wearable augmented reality device, for example a head-mounted glasses-style augmented reality device with an image sensing unit mounted on the glasses; after the glasses are turned on, the image sensing unit captures the image data of the object to be diagnosed.
According to this scheme, the interactive system obtains the historical diagnosis and treatment data of the object to be diagnosed according to the current image data, generates an augmented reality image from the historical diagnosis and treatment data, and sends it to the augmented reality device for display. When diagnosing the object to be diagnosed, the doctor can therefore view the historical diagnosis and treatment data in the form of an augmented reality image and interact in real time with the images and text in the data, which allows the doctor to observe the data more intuitively and improves the comprehensiveness of the information collected about the object to be diagnosed. In addition, if the scanning equipment supports it, a sub-interface for networked scanning equipment can be added to the human-computer interface, and the doctor can use a handle to click the virtual operation buttons of the scanning equipment in the head-mounted display to operate the equipment and realize remote examination.
As shown in fig. 8, in an implementation manner, the first display module 13 includes the following three cases of unit modules, which are as follows:
in the first case, when acquiring historical medical images in the historical diagnosis and treatment data, the first display module 13 includes the following units:
a first obtaining unit 1311, configured to obtain a historical medical image in the historical clinical data;
a part determining unit 1312 for determining a first human body part corresponding to the historical medical image;
a second obtaining unit 1313, configured to obtain position information of the first human body part corresponding to the current image of the object to be diagnosed;
a third obtaining unit 1314, configured to obtain location information of the augmented reality device;
a first generating unit 1315, configured to register the historical medical image to a corresponding body part of the current image of the object to be diagnosed based on the location information of the first human body part and the location information of the augmented reality device, and generate a corresponding first augmented reality image;
in a specific embodiment, a historical medical image in the historical diagnosis and treatment data is obtained, a human body part corresponding to the historical medical image is determined to be a lung, then, position information of the lung corresponding to the current image of the object to be diagnosed is obtained, and finally, the historical medical image is registered to the lung of the current image of the object to be diagnosed according to the position information of the lung corresponding to the historical medical image and the position information of the augmented reality device, so that a corresponding augmented reality image is generated.
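The placement step in this embodiment can be sketched as follows, using a rigid translation only; real registration would also handle rotation, scale, and deformation, which this sketch omits as an assumption:

```python
def register_overlay(part_position, device_position, image_size):
    """Anchor the historical medical image over the detected body part,
    expressed in device-relative coordinates (translation only)."""
    # Translate the world-frame part position into the device frame.
    rel = tuple(p - d for p, d in zip(part_position, device_position))
    # Center the image on the part by shifting half its width/height.
    half_w, half_h = image_size[0] / 2, image_size[1] / 2
    return (rel[0] - half_w, rel[1] - half_h, rel[2])

# Lung detected 2 m in front of the headset; overlay is 0.2 m x 0.3 m.
anchor = register_overlay((0.0, 1.5, 2.0), (0.0, 1.5, 0.0), (0.2, 0.3))
# anchor is roughly (-0.1, -0.15, 2.0)
```

Both positions are assumed to live in one shared world frame provided by the headset's tracking system, as implied by the position information the units above acquire.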
As shown in fig. 9, in the second case, when acquiring the text information in the historical diagnosis and treatment data, the first display module 13 includes the following units:
a fourth obtaining unit 1321, configured to obtain text information in the historical diagnosis and treatment data;
a second generating unit 1322, configured to generate a second augmented reality image from the text information;
in a third case, as shown in fig. 10, when acquiring historical medical images in the historical diagnosis and treatment data, the first display module 13 includes the following units:
a fifth obtaining unit 1331, configured to simultaneously obtain the historical medical images and the text information that match each other in the historical diagnosis and treatment data;
a third generating unit 1332, configured to generate a third augmented reality image based on the historical medical image and the text information.
In this scheme, when the third augmented reality image is sent to the augmented reality device for display, the historical medical image and the text information are independent yet associated: by default only the medical image is displayed and the text information is hidden. When the user clicks the medical image with the spatial interaction device, the corresponding text information is displayed beside the medical image.
In a specific embodiment, assume a patient complaining of heart discomfort visits the cardiology department for consultation. The doctor wears an augmented reality device on the head, captures and identifies the patient's current image data through a camera arranged on the augmented reality device, and searches the database for the patient's medical record, namely the historical diagnosis and treatment data, according to the image data; the record shows that the patient has a history of myocardial ischemia and has undergone magnetic resonance imaging. The historical medical image and the text information that match each other in the historical diagnosis and treatment data are then acquired, a third augmented reality image is generated based on the historical medical image and the text information, and the third augmented reality image is sent to the augmented reality device for display, so that the medical record information, namely the text information, is displayed beside the object to be diagnosed, the cardiac magnetic resonance image is displayed semi-transparently at real size over the actual heart position of the object to be diagnosed, and the chest of the object to be diagnosed is highlighted. The display effect is shown in the user interface schematic diagram of fig. 5. The specific text in fig. 5 is not intended as a disclosure of the present application.
According to this scheme, the historical medical images or the text information in the historical diagnosis and treatment data can be displayed in an augmented reality manner either separately or simultaneously, and the doctor can interact in real time with the images and text in the historical diagnosis and treatment data. The doctor can thus observe the historical diagnosis and treatment data of the object to be diagnosed more intuitively and obtain the data more conveniently, which improves the comprehensiveness of the information collected about the object to be diagnosed.
In one implementable manner, the interactive processing device includes:
a third obtaining module 14, configured to acquire a thermal imaging image of the object to be diagnosed;
specifically, acquiring the thermal imaging image of the object to be diagnosed includes:
a fourth obtaining module 15, configured to acquire infrared information and/or heat information of the object to be diagnosed;
in the scheme, the infrared information and/or the heat information of the object to be diagnosed can be acquired through an infrared night vision device or a thermal imager.
Mapping the infrared information or the heat information to color or grayscale data to generate the thermal imaging image.
A generating module 16, configured to display the thermal imaging image on a body surface position corresponding to the current image of the object to be diagnosed, and generate a fourth augmented reality image.
According to the scheme, the thermal imaging image is displayed on the corresponding body surface position of the current image of the object to be diagnosed, so that a fourth augmented reality image is generated, and the fourth augmented reality image can reflect the heat or blood flow distribution of the object to be diagnosed, so that the object to be diagnosed can be observed in a multi-dimensional and multi-level mode.
In an implementable manner, when the historical diagnosis and treatment data of the object to be diagnosed include multiple times of historical diagnosis and treatment data of the same organ of the object to be diagnosed, the first display module 13 further includes:
a fifth obtaining module 17, configured to obtain a historical medical image of each time of the historical diagnosis and treatment data;
and the second display module 18 is configured to sequentially generate a plurality of augmented reality images according to a time sequence, and send the images to the augmented reality device for display.
According to this scheme, the medical images in the multiple times of historical diagnosis and treatment data of the same organ of the object to be diagnosed are obtained from the medical platform: the medical images can be downloaded from the medical data platform to a local data storage unit, and after operations such as filtering and coloring, the group of medical images is interpolated along the time sequence, so that a plurality of augmented reality images are generated in chronological order, yielding a group of time-varying image data. Finally, the plurality of augmented reality images are sent to the augmented reality device for display; for example, they can be sent to an image display unit of the head-mounted device for dynamic display.
According to this scheme, a plurality of augmented reality images can be generated in chronological order from the multiple times of historical diagnosis and treatment data of the same organ of the object to be diagnosed and sent to the augmented reality device for display, so that the doctor can visually view all of that data at once and grasp the development trend of the object to be diagnosed more intuitively and accurately.
In an implementable manner, before the plurality of augmented reality images are sequentially generated in chronological order, the interactive processing device includes:
a sixth obtaining module 19, configured to obtain text information of the historical diagnosis and treatment data each time;
and the third display module 20 is configured to display the historical medical images of the historical diagnosis and treatment data in different colors each time according to the text information.
In a specific embodiment, an object to be diagnosed comes for a medical consultation, and the doctor wears an augmented reality device on the head. After the object to be diagnosed is photographed and identified through a camera arranged on the augmented reality device, the medical record of the object to be diagnosed, namely the historical diagnosis and treatment data, is found in the database. Across multiple historical physical examinations, the object to be diagnosed has suffered from moderate fatty liver with a tendency toward severe development, and the historical diagnosis and treatment data include historical medical images such as color ultrasound scans. The text information of each time of the historical diagnosis and treatment data is acquired, and the historical medical image of each time is displayed in different colors according to the text information: for example, parts with high fat content are displayed in bright red and parts with low fat content in light blue. A plurality of augmented reality images are then generated in chronological order and sent to the augmented reality device for display. In this way, the liver examination data of the object to be diagnosed, from the first examination to the latest, are displayed dynamically, so that the doctor can clearly see the change of the fatty liver disease through the dynamic color change of the reconstructed liver and give the next examination item or treatment suggestion accordingly.
In one implementable manner, the interactive processing device includes:
a receiving module 21, configured to receive a control instruction;
and the operation module 22 is configured to perform corresponding operations on the augmented reality image according to the control instruction.
As shown in fig. 11, specifically, when the control instruction includes a click instruction generated by the spatial interaction device, the operation module 22 includes:
a first receiving unit 2211, configured to receive the click instruction;
a sixth obtaining unit 2212, configured to obtain the location information of the spatial interaction device and the location information of the area to be clicked in the augmented reality image;
a calculating unit 2213, configured to calculate a distance between the spatial interaction device and the area to be clicked;
and if the distance is smaller than a preset threshold value, performing corresponding operation on the augmented reality image.
According to this scheme, the position information of the spatial interaction device and the position information of the region to be clicked in the augmented reality image are acquired, and the distance between them is analyzed so that the corresponding operation is performed on the augmented reality image. A doctor can thus, according to his or her own viewing angle, use a stylus to drag away a graphic element blocking the line of sight, or add a mark to a region of interest on the object to be diagnosed.
As shown in fig. 12, when the control instruction includes a voice instruction, the operation module 22 includes:
a second receiving unit 2221, configured to receive the voice instruction;
a seventh obtaining unit 2222, configured to obtain voice information according to the voice instruction;
an eighth obtaining unit 2223, configured to analyze the voice information to obtain logic of a voice instruction;
a ninth obtaining unit 2224, configured to obtain, according to the logic of the voice instruction, an operation logic corresponding to the augmented reality image;
an operation unit 2225, configured to perform a corresponding operation on the augmented reality image according to the operation logic.
According to this scheme, the voice instruction is received through an audio sensing unit, and the voiceprint features of the received voice instruction are recognized; if the speaker is recognized as the doctor, the specific logic of the voice instruction is further analyzed, and the operation logic corresponding to the augmented reality image is obtained according to the logic of the voice instruction.
The scheme can perform corresponding operation on the augmented reality image through a voice instruction, for example, a doctor can start a thermal imaging function through voice.
According to this scheme, the corresponding operation can be performed on the augmented reality image according to the control instruction, for example through a click instruction or a voice instruction, so that the doctor can interact in real time with the medical images or text information of the historical diagnosis and treatment data and learn about the object to be diagnosed in real time and in an all-around manner.
Example 3
As shown in fig. 13 and fig. 14, the embodiment discloses an interactive system based on augmented reality, which includes the interactive processing device 1 based on augmented reality (i.e., the post-processing device shown in fig. 14), the augmented reality device 2 (i.e., the wearable augmented reality device shown in fig. 14) and the medical platform database 3;
the augmented reality device 2 is used for displaying an augmented reality image;
specifically, the augmented reality device 2 may be a wearable augmented reality device, such as a head-mounted glasses-type augmented reality device, and the glasses are mounted with an image sensing unit, and after the glasses are opened, the image sensing unit captures image data of the object to be diagnosed.
The medical platform database 3 is used for storing historical diagnosis and treatment data.
As shown in fig. 14, the wearable augmented reality device includes a color camera unit (i.e., an image sensing unit), an auxiliary vision unit, a spatial interaction unit, an audio sensing unit, and an image display unit. The post-processing equipment comprises an image identification unit, an auxiliary information processing unit, an interaction processing unit, an image registration unit, a graphic processing unit and a data storage unit. The data storage unit keeps connection with the cloud medical platform database, and can download and upload the latest information of the object to be diagnosed in real time.
In this scheme, the medical platform database 3 is synchronized at any time with the information obtained after the object to be diagnosed is examined by the medical imaging equipment.
Example 4
Fig. 15 is a schematic structural diagram of an electronic device according to this embodiment of the present invention. The electronic device includes a memory, a processor, and a computer program stored in the memory and executable on the processor; when executing the program, the processor implements the augmented reality-based interaction method provided in embodiment 1. The electronic device 40 shown in fig. 15 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present invention.
As shown in fig. 15, the electronic device 40 may be embodied in the form of a general purpose computing device, which may be, for example, a server device. The components of the electronic device 40 may include, but are not limited to: the at least one processor 41, the at least one memory 42, and a bus 43 connecting the various system components (including the memory 42 and the processor 41).
The bus 43 includes a data bus, an address bus, and a control bus.
The memory 42 may include volatile memory, such as Random Access Memory (RAM) 421 and/or cache memory 422, and may further include Read Only Memory (ROM) 423.
Memory 42 may also include a program/utility 425 having a set (at least one) of program modules 424, such program modules 424 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The processor 41 executes various functional applications and data processing, such as the augmented reality-based interaction method provided in embodiment 1 of the present invention, by executing the computer program stored in the memory 42.
The electronic device 40 may also communicate with one or more external devices 44 (e.g., a keyboard, a pointing device, etc.). Such communication may take place through an input/output (I/O) interface 45. The electronic device 40 may also communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through a network adapter 46. As shown, the network adapter 46 communicates with the other modules of the electronic device 40 over the bus 43. It should be understood that, although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 40, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID (disk array) systems, tape drives, data backup storage systems, and the like.
It should be noted that although in the above detailed description several units/modules or sub-units/modules of the electronic device are mentioned, such a division is merely exemplary and not mandatory. Indeed, the features and functions of two or more of the units/modules described above may be embodied in one unit/module according to embodiments of the invention. Conversely, the features and functions of one unit/module described above may be further divided into embodiments by a plurality of units/modules.
Example 5
The present embodiment provides a computer-readable storage medium on which a computer program is stored, which when executed by a processor implements the augmented reality-based interaction method provided in embodiment 1.
More specific examples of the readable storage medium may include, but are not limited to: a portable disk, a hard disk, a random access memory, a read-only memory, an erasable programmable read-only memory, an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In a possible implementation, the present invention may also be implemented in the form of a program product that includes program code; when the program product runs on a terminal device, the program code causes the terminal device to execute the augmented reality-based interaction method provided in embodiment 1.
The program code for carrying out the present invention may be written in any combination of one or more programming languages, and may be executed entirely on the user device, partly on the user device as a stand-alone software package, partly on the user device and partly on a remote device, or entirely on the remote device.

Claims (10)

1. An interaction method based on augmented reality, the interaction method comprising:
acquiring current image data of an object to be diagnosed;
acquiring historical diagnosis and treatment data of the object to be diagnosed according to the current image data;
and generating an augmented reality image according to the historical diagnosis and treatment data and sending the augmented reality image to augmented reality equipment for displaying.
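Illustratively, the three steps recited in claim 1 can be sketched as a minimal pipeline (Python; the record fields, the ID-based lookup, and the layered AR payload are assumptions for illustration, not part of the claim):

```python
from dataclasses import dataclass

@dataclass
class DiagnosisRecord:
    subject_id: str   # identifies the object to be diagnosed
    timestamp: str    # when the diagnosis/treatment occurred
    image: list       # placeholder for the historical medical image

def acquire_subject_id(current_image_data):
    # Stand-in for identifying the subject from current image data
    # (e.g., via feature recognition); here the ID is assumed to be
    # carried alongside the image.
    return current_image_data["subject_id"]

def fetch_history(database, subject_id):
    # Retrieve all historical diagnosis and treatment records
    # belonging to the identified subject.
    return [r for r in database if r.subject_id == subject_id]

def generate_ar_image(records):
    # Stand-in for real AR rendering: bundle historical images into
    # a layered payload that an AR device could display.
    return {"layers": [r.image for r in records]}

def interact(database, current_image_data):
    subject_id = acquire_subject_id(current_image_data)
    history = fetch_history(database, subject_id)
    return generate_ar_image(history)
```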
2. The augmented reality-based interaction method of claim 1, wherein the interaction method comprises:
acquiring a thermal imaging image of the object to be diagnosed;
and displaying the thermal imaging image on a corresponding body surface position of the current image of the object to be diagnosed to generate an augmented reality image.
3. The augmented reality-based interaction method of claim 2, wherein the step of acquiring a thermal imaging image of the object to be diagnosed comprises:
acquiring infrared information and/or thermal information of the object to be diagnosed;
and mapping the infrared information and/or the thermal information to color or grayscale data to generate the thermal imaging image.
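A minimal sketch of the mapping step, assuming raw thermal readings in degrees Celsius and a linear mapping onto an 8-bit grayscale range (the clamping bounds are illustrative, not specified by the claim):

```python
def thermal_to_grayscale(readings, t_min=30.0, t_max=42.0):
    """Linearly map raw thermal readings (assumed degrees Celsius)
    to 0-255 grayscale values, clamping readings that fall outside
    the expected [t_min, t_max] body-temperature range."""
    span = t_max - t_min
    pixels = []
    for t in readings:
        clamped = min(max(t, t_min), t_max)
        pixels.append(round(255 * (clamped - t_min) / span))
    return pixels
```

The same scheme extends to a color map by indexing a palette with the scaled value instead of emitting it directly.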
4. The augmented reality-based interaction method according to claim 1, wherein when the historical diagnosis and treatment data of the object to be diagnosed includes multiple instances of historical diagnosis and treatment data for the same organ of the object to be diagnosed, the step of generating the augmented reality image from the historical diagnosis and treatment data comprises:
acquiring a historical medical image of each instance of the historical diagnosis and treatment data;
and sequentially generating a plurality of augmented reality images in chronological order and sending them to the augmented reality device for display.
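The chronological sequencing recited in claim 4 can be sketched as follows (the `timestamp` field and ISO-8601 date strings are assumptions for illustration):

```python
def ar_frames_in_order(records):
    """Sort historical records by timestamp and yield one AR frame
    per record, so frames are generated and sent for display in
    chronological order. ISO-8601 date strings compare correctly
    as plain strings, so no date parsing is needed here."""
    for record in sorted(records, key=lambda r: r["timestamp"]):
        yield {"timestamp": record["timestamp"], "image": record["image"]}
```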
5. The augmented reality-based interaction method of claim 1, wherein after the step of sending to an augmented reality device for display, the interaction method further comprises:
receiving a control instruction;
and performing corresponding operation on the augmented reality image according to the control instruction.
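One way to realize claim 5 is a dispatch table from received control instructions to operations on the augmented reality image (the instruction names, scale factor, and rotation step are illustrative assumptions):

```python
def apply_instruction(ar_image, instruction):
    """Apply a received control instruction to the AR image state;
    unknown instructions are rejected rather than silently ignored."""
    operations = {
        "zoom_in":  lambda img: {**img, "scale": img.get("scale", 1.0) * 1.25},
        "zoom_out": lambda img: {**img, "scale": img.get("scale", 1.0) / 1.25},
        "rotate":   lambda img: {**img, "angle": (img.get("angle", 0) + 90) % 360},
    }
    if instruction not in operations:
        raise ValueError(f"unsupported control instruction: {instruction}")
    return operations[instruction](ar_image)
```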
6. The augmented reality-based interaction method of claim 4, wherein before the step of sequentially generating a plurality of augmented reality images in chronological order, the interaction method further comprises:
acquiring character information of each instance of the historical diagnosis and treatment data;
and displaying the historical medical image of each instance of the historical diagnosis and treatment data in a different color according to the character information.
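The per-record coloring of claim 6 can be sketched by assigning each distinct piece of character (text) information its own display color (the palette and the example annotations are illustrative assumptions):

```python
def assign_colors(text_infos, palette=("red", "green", "blue", "yellow")):
    """Map each distinct text annotation to a color, in order of first
    appearance, so that historical medical images sharing the same
    character information are displayed in the same color and records
    with different information are visually separable."""
    mapping = {}
    for info in text_infos:
        if info not in mapping:
            mapping[info] = palette[len(mapping) % len(palette)]
    return mapping
```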
7. An augmented reality-based interaction processing apparatus, comprising:
the first acquisition module is used for acquiring the current image data of the object to be diagnosed;
the second acquisition module is used for acquiring historical diagnosis and treatment data of the object to be diagnosed according to the current image data;
and the display module is used for generating an augmented reality image according to the historical diagnosis and treatment data and sending the augmented reality image to augmented reality equipment for display.
8. An augmented reality-based interaction system comprising the augmented reality-based interaction processing device of claim 7, an augmented reality device, and a medical platform database;
the augmented reality device is used for displaying an augmented reality image;
the medical platform database is used for storing historical diagnosis and treatment data.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the augmented reality based interaction method of any one of claims 1 to 6 when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out an augmented reality based interaction method according to any one of claims 1 to 6.
CN202211255883.1A 2022-10-13 2022-10-13 Interaction method, processing device, system, device and medium based on augmented reality Pending CN115620889A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211255883.1A CN115620889A (en) 2022-10-13 2022-10-13 Interaction method, processing device, system, device and medium based on augmented reality


Publications (1)

Publication Number Publication Date
CN115620889A true CN115620889A (en) 2023-01-17

Family

ID=84863160

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211255883.1A Pending CN115620889A (en) 2022-10-13 2022-10-13 Interaction method, processing device, system, device and medium based on augmented reality

Country Status (1)

Country Link
CN (1) CN115620889A (en)

Similar Documents

Publication Publication Date Title
US20210038188A1 (en) Measurement navigation in a multi-modality medical imaging system
US9841811B2 (en) Visually directed human-computer interaction for medical applications
US9122773B2 (en) Medical information display apparatus and operation method and program
US7747050B2 (en) System and method for linking current and previous images based on anatomy
US9351641B2 (en) Mobile processing device system for patient monitoring data acquisition
JP5222082B2 (en) Information processing apparatus, control method therefor, and data processing system
JP5309187B2 (en) MEDICAL INFORMATION DISPLAY DEVICE, ITS OPERATION METHOD, AND MEDICAL INFORMATION DISPLAY PROGRAM
CN106569673B (en) Display method and display equipment for multimedia medical record report
US20060173858A1 (en) Graphical medical data acquisition system
CN107296650A (en) Intelligent operation accessory system based on virtual reality and augmented reality
US9779483B2 (en) Measurement and enhancement in a multi-modality medical imaging system
US20140181716A1 (en) Gesture-Based Interface for a Multi-Modality Medical Imaging System
JP7190059B2 (en) Image matching method, apparatus, device and storage medium
US10642953B2 (en) Data labeling and indexing in a multi-modality medical imaging system
JP2009157527A (en) Medical image processor, medical image processing method and program
WO2012029265A1 (en) Medical treatment information display device and method, and program
JP2016101502A (en) Medical image processing apparatus
US20200081523A1 (en) Systems and methods for display
US10854005B2 (en) Visualization of ultrasound images in physical space
US20200005481A1 (en) Method and system using augmentated reality for positioning of ecg electrodes
WO2021103316A1 (en) Method, device, and system for determining target region of image
US20220192627A1 (en) Stress echocardiogram imaging comparison tool
US20080117229A1 (en) Linked Data Series Alignment System and Method
JP2020102128A (en) Information sharing system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination