CN115910289A - Method and related equipment for viewing medical images - Google Patents

Method and related equipment for viewing medical images

Info

Publication number
CN115910289A
Authority
CN
China
Prior art keywords
medical
attribute
image
medical image
medical images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210381770.XA
Other languages
Chinese (zh)
Inventor
谈琳
袁微微
杨海娜
陈玉鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Mindray Bio Medical Electronics Co Ltd
Original Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd
Publication of CN115910289A

Abstract

The present disclosure relates to a method and related apparatus for viewing medical images. The method comprises, by a processor: acquiring medical images and an attribute mark of each medical image, wherein the attribute mark reflects at least one attribute of the operation and the content of the medical image; and presenting, on a display, the attribute embodied by the attribute mark, and presenting, in association with that attribute, at least part of the medical images bearing the attribute mark. When presented on the display, the method can thus show at least a portion of the medical images in association with their attribute marks. The visualized presentation of attributes significantly improves the efficiency with which a user analyzes them, and the associated presentation of attributes together with the medical images bearing the corresponding attribute marks directs attention to the relevant medical images, thereby significantly improving work efficiency.

Description

Method and related equipment for viewing medical images
Technical Field
The present disclosure relates to the field of medical imaging technology, and more particularly, to a method and related apparatus for viewing medical images.
Background
When a patient undergoes an imaging examination, pictures or images of different body parts may be taken. Some patients, especially hospitalized patients, may be imaged frequently. Medical workers save the medical images taken of patients for subsequent review or archiving.
When viewing a patient's medical images, medical workers often wish to quickly locate the intended images and understand the patient's condition from them. However, current archiving practices for medical images are relatively coarse, and different hospital departments adopt different archiving methods. A doctor usually has to spend considerable time manually browsing a large number of medical images in order to find the one desired (e.g., of a particular body part taken on a particular date).
In practice, although the operating system marks the time and date of a medical image, the image may circulate between different terminals and be operated on by medical workers, so its time and date may be modified and may differ greatly from the real capture time. In some cases, multiple image captures (also called mapping events) are required for a single patient on the same day, and medical images of multiple body parts may be taken in a single mapping event. Knowing only the capture date therefore cannot accurately locate the desired medical image. Even if the doctor filters by capture date and patient, the doctor still has to search through the patient's many medical images from that date; the date and patient alone cannot directly locate the image the doctor wants, which is very inconvenient for actual clinical use.
In summary, the existing manner of viewing medical images cannot meet doctors' requirements: on one hand, doctors need to spend a lot of time finding the desired medical images; on the other hand, the interactive interface for viewing medical images is coarse and offers little guidance, resulting in a heavy viewing workload.
Disclosure of Invention
In view of the above problems in the prior art, the present disclosure provides a method and related device for viewing medical images, which significantly improve the efficiency with which a user analyzes attributes through the visualized presentation of those attributes, and which direct attention to the associated medical images through the associated presentation of attributes and the medical images bearing the corresponding attribute marks, thereby significantly improving work efficiency.
According to a first aspect of the present disclosure, there is provided a method for viewing medical images, the method comprising, by a processor: acquiring at least one medical image, the medical image having a first attribute mark and/or a second attribute mark, wherein the first attribute mark characterizes the correspondence between the medical image and a biological tissue part, and the second attribute mark determines the presentation manner of the medical image; associating, according to the first attribute mark, the medical images having the same first attribute mark with the corresponding tissue region on a subject map, the subject map being a two-dimensional or three-dimensional model map used to represent biological tissue parts; displaying the subject map on a display, associating part or all of the medical images with the subject map, and presenting part or all of the medical images according to the second attribute marks; receiving, by the processor, a first interactive operation instruction from a user, the instruction including the determined first attribute mark and/or second attribute mark information; and displaying, on the display and in a preset manner according to the first interactive operation instruction, the medical images related to the determined first attribute mark and/or second attribute mark.
According to a second aspect of the present disclosure, there is provided an apparatus for viewing medical images, the apparatus comprising a processor configured to perform a method for viewing medical images according to embodiments disclosed herein.
According to a third aspect of the present disclosure, there is provided an image workstation, which can be used for viewing medical images, comprising a communication interface and a processor, wherein the communication interface is configured to receive each medical image acquired by a medical imaging device and a first attribute label and/or a second attribute label of each medical image, the first attribute label is used for representing a corresponding relationship between the medical image and a biological tissue part, and the second attribute label is used for determining a presentation mode of the medical image; and
the processor is configured to perform a method for viewing medical images according to embodiments disclosed herein.
According to a fourth aspect of the present disclosure, there is provided a non-transitory computer readable storage medium storing a program for causing a processor to execute the method for viewing medical images according to embodiments of the present disclosure.
With the method for viewing medical images and the related device according to the embodiments of the present disclosure, a doctor is provided with a visualized viewing mode: by acquiring the medical images and their attribute marks, at least part of the medical images bearing an attribute mark can be presented on the display in association with that attribute. The visualized presentation of attributes significantly improves the efficiency with which the user analyzes them and allows the attributes of interest to be located quickly and accurately, and the associated presentation of attributes together with the medical images bearing the corresponding attribute marks guides attention to the relevant images, significantly improving the work efficiency of medical workers.
Drawings
In the drawings, which are not necessarily drawn to scale, like reference numerals may describe similar components in different views. Like reference numerals having letter suffixes or different letter suffixes may represent different instances of similar components. The drawings illustrate various embodiments generally by way of example and not by way of limitation, and together with the description and claims serve to explain the disclosed embodiments. The same reference numbers will be used throughout the drawings to refer to the same or like parts, where appropriate. Such embodiments are illustrative, and are not intended to be exhaustive or exclusive embodiments of the present apparatus or method.
Fig. 1 shows a schematic flow diagram of a method for viewing medical images according to an embodiment of the present disclosure.
Fig. 2 shows a sample diagram of a data structure for attribute marks according to an embodiment of the present disclosure.
Fig. 3 shows an example of the presentation of attributes and a medical image on a display, for a method for viewing medical images according to an embodiment of the present disclosure.
Fig. 4 shows another example of the presentation of attributes and a medical image on a display, for a method for viewing medical images according to an embodiment of the present disclosure.
Fig. 5 shows yet another example of the presentation of attributes and a medical image on a display, for a method for viewing medical images according to an embodiment of the present disclosure.
Fig. 6 shows yet another example of the presentation of attributes and a medical image on a display, for a method for viewing medical images according to an embodiment of the present disclosure.
Fig. 7 shows a schematic block diagram of an apparatus for viewing medical images according to an embodiment of the present disclosure.
Fig. 8 shows an example of an image workstation and its manner of operation according to an embodiment of the present disclosure.
Detailed Description
For a better understanding of the technical aspects of the present disclosure, reference is made to the following detailed description taken in conjunction with the accompanying drawings. Embodiments of the present disclosure are described in further detail below with reference to the figures and the detailed description, but the present disclosure is not limited thereto.
The use of "first," "second," and similar words in this disclosure is not intended to indicate any order, quantity, or importance, but rather is used to distinguish one element from another. The word "comprising" or "comprises", and the like, means that the element preceding the word comprises the element listed after the word, and does not exclude the possibility that other elements may also be included. "upper", "lower", "left", "right", and the like are used merely to indicate relative positional relationships, and when the absolute position of the object being described is changed, the relative positional relationships may also be changed accordingly.
In the present disclosure, when a specific device is described as being located between a first device and a second device, there may or may not be intervening devices between the specific device and the first or second device. When a particular device is described as being coupled to other devices, it may be coupled to them directly, without intervening devices, or indirectly, through intervening devices.
In the flow descriptions according to the embodiments of the present disclosure, the implementation order of the steps is not limited to the examples described, and the steps can be executed in any other order as long as the steps are executed without logical contradiction, which is not limited by the present disclosure.
All terms (including technical or scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs unless specifically defined otherwise. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
Fig. 1 shows a flow diagram of a method for viewing medical images according to an embodiment of the present disclosure. As shown in Fig. 1, in the method proposed by this embodiment, in step S101 the processor may acquire each medical image and an attribute mark of each medical image, the attribute mark embodying at least one attribute of the operation and the content of the medical image. In some embodiments, the attribute mark of a medical image may also be a composite attribute embodying both the operation and the content of the medical image. The medical images in the present disclosure may be acquired using various modalities, which may be selected from the group consisting of ultrasound imaging, computed tomography (CT), magnetic resonance imaging (MRI), functional MRI, cone beam computed tomography (CBCT), helical CT, positron emission tomography (PET), single photon emission computed tomography (SPECT), X-ray imaging, optical tomography, fluorescence imaging, and radiation therapy portal imaging. In various embodiments of the present disclosure, the attribute mark may be added on top of the captured medical image; the marking may be completed during acquisition of the medical image, added manually by a user, added automatically by a computer equipped with the medical imaging apparatus according to preset settings, or added semi-automatically through interaction between the user and the computer, which is not limited herein. The content of a specific attribute mark may embody at least one of the operation and the content of the medical image. The operation information of a medical image (e.g., when it was taken, with which imaging device, subject information) and the content of the medical image (e.g., which specific part it shows, such as the heart or the left lung) are the reference information a doctor relies on when choosing to view a certain medical image.
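To make the notion of an attribute mark concrete, the following Python sketch (an illustration only, with hypothetical class and field names) models a mark carrying operation attributes (capture time, imaging device, subject information) and a content attribute (the examined part), as well as a composite of both, as described above.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class AttributeMark:
    """Hypothetical attribute mark embodying the operation and/or content of a medical image."""
    # Operation attributes: when and how the image was taken, and of whom.
    capture_time: Optional[datetime] = None
    device_id: Optional[str] = None
    subject_info: Optional[str] = None
    # Content attribute: which specific part the image shows (heart, left lung, ...).
    examined_part: Optional[str] = None
    # Free-form extras, e.g. marks added manually, automatically, or semi-automatically.
    extra: dict = field(default_factory=dict)

    def is_composite(self) -> bool:
        """True when the mark embodies both operation and content attributes."""
        has_operation = any(v is not None for v in (self.capture_time, self.device_id, self.subject_info))
        return has_operation and self.examined_part is not None

if __name__ == "__main__":
    mark = AttributeMark(capture_time=datetime(2018, 12, 20, 10, 15, 6),
                         device_id="US-07", subject_info="Zhang San",
                         examined_part="heart")
    print(mark.is_composite())  # True
```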
Once the medical images and their attribute marks have been acquired, the reference information of interest can be obtained without flipping through the images themselves, and the medical images bearing the attributes of concern can be located quickly and accurately. As an example, assume that the attribute marks of a medical image include the following: time (which may be specific down to minutes and seconds), device number, subject information, examination location, special labels, and so on. In a specific embodiment, if the doctor wants to view the images taken on a particular date and time (year, month, day, morning/afternoon, down to hours, minutes, and seconds), the medical images can be screened directly by the above attributes such as time, subject, and examination location, so as to locate the medical images of concern.
By acquiring each medical image and its attribute mark, a doctor reviewing a patient's images can quickly judge the relevant attributes of a certain medical image or group of medical images directly from the attribute marks, without opening specific images to view their content or cross-checking operation information, thereby improving the doctor's reading efficiency.
In step S102, at least one medical image may be acquired, the medical image having a first attribute mark and/or a second attribute mark, where the first attribute mark characterizes the correspondence between the medical image and a biological tissue region, and the second attribute mark determines the presentation manner of the medical image. In this implementation, the first attribute mark is generally used for direct association with the subject map, identifying the relationship between the medical image and the organs and tissues of the subject, such as whether the medical image is a lung image or a liver image. The second attribute mark is generally used to determine the presentation manner of the medical image. It should be noted in particular that second attribute marks come in various types, including a time mark, a relevance mark, and the like, where the time mark is associated with the acquisition time of the medical image and the relevance mark is associated with the content relevance of the medical image.
In the medical image viewing method of this embodiment, the medical images having the same first attribute mark are associated with the corresponding tissue region on the subject map, the subject map being a two-dimensional or three-dimensional model map used to represent biological tissue regions. The subject map is then loaded on a display and part or all of the medical images are associated with it, so that the user can view each medical image according to the first attribute mark and/or the second attribute mark.
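The following Python sketch (illustrative only; the class and function names are assumptions, not part of the disclosure) walks through the two steps just described: grouping images that share a first attribute mark under the corresponding tissue region of the subject map, and ordering each group for presentation according to a second attribute mark such as the capture time.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class MedicalImage:
    image_id: str
    first_mark: str                                    # tissue region, e.g. "heart" or "left lung"
    second_mark: dict = field(default_factory=dict)    # presentation hints, e.g. capture time

def associate_by_first_mark(images):
    """Group images sharing a first attribute mark under the corresponding tissue region."""
    regions = {}
    for img in images:
        regions.setdefault(img.first_mark, []).append(img)
    return regions

def order_by_second_mark(region_images, key="capture_time"):
    """Order one region's images for presentation according to a second attribute mark."""
    return sorted(region_images, key=lambda im: im.second_mark.get(key, datetime.min))

if __name__ == "__main__":
    images = [
        MedicalImage("img2", "heart", {"capture_time": datetime(2018, 12, 20, 10, 20, 1)}),
        MedicalImage("img1", "heart", {"capture_time": datetime(2018, 12, 20, 10, 15, 6)}),
        MedicalImage("img3", "left lung", {"capture_time": datetime(2018, 12, 20, 10, 30, 0)}),
    ]
    regions = associate_by_first_mark(images)
    for img in order_by_second_mark(regions["heart"]):
        print(img.image_id, img.second_mark["capture_time"])
```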
For convenience of presentation, in this embodiment, after the subject map has been divided according to the first attribute marks, label information in digital form is further associated with the subject map and then visually presented on the display.
In some embodiments, not only medical images associated with attributes of the operation or the content of the medical image may be presented, but also medical images associated with composite attributes of both. In various embodiments of the present disclosure, the presentation of attributes on the display is visualized, such as, but not limited to, an image-based presentation (e.g., the presentation of subject regions on a subject map described in detail below), a diagram-based presentation (e.g., the menu-style presentation of mapping events described in detail below), an icon-based presentation, and so on. When a user views medical images, the psychological pressure and workload are high given the large number of medical images and the limited processing time. Compared with abstract representations of the attributes (such as text or speech), the visualized presentation of attributes significantly improves the efficiency with which the user analyzes them and allows the attributes of concern to be located quickly and accurately; furthermore, the associated presentation of the attributes and the medical images bearing the corresponding attribute marks lets the user's attention be quickly directed to the associated images, significantly improving both efficiency and accuracy.
For example, a simulated subject map may be presented on a display, the parts of the subject may be displayed on it, and the attributes and images may then be presented in association with those parts. Note that a human body is used below as an example of a subject, but it should be understood that in the present disclosure the term "subject" may refer to various organisms that can be imaged by a medical imaging apparatus, including human bodies, animal bodies, and plant bodies.
In one embodiment, the attributes and the medical images bearing them can be presented in association in various ways, which can be customized according to the actual needs of the user. The specific association manner on the subject map of the display may be, for example, a mapping table, a marking line, an indication line, an annotation, or presentation at a nearby position, which is not limited here. Further, during the associated display, the most relevant medical images are preferably displayed first, and the other medical images are arranged in order of time, relevance, or another rule and presented interactively to the medical staff; for example, an interactive button may be provided on the display screen, and clicking it unfolds the other, collapsed medical images, which is not limited in this embodiment.
In a specific embodiment, take a medical image of the heart with capture time as its attribute as an example: if a marking line is used, the heart portion of the simulated subject may be marked with the marking line. In some embodiments, if the doctor needs to observe the medical image corresponding to the heart, the position of the heart in the whole-body diagram may be highlighted or outlined, and the attribute represented by the attribute mark, such as the capture time, may be presented near the heart or at another position. Meanwhile, the other end of the marking line may be connected to the medical image corresponding to that capture time. A thumbnail of the medical image may be displayed on the simulated subject map (in which case the doctor may select the thumbnail through an optional interactive operation to view the original image), or the original image may be displayed together with an optional association button through which a group of medical images taken in the same mapping event of the subject can be shown; the specific display manner is not limited here. Through this presentation, a doctor can intuitively and efficiently locate the medical image to be checked without manually verifying its content among many medical images, which noticeably improves the doctor's image-retrieval efficiency and, in turn, diagnostic efficiency.
Clinically, a doctor may acquire multiple sets of ultrasound images of different parts in one sitting, for example of organs and tissues such as the heart, lungs, abdomen, and cranium. In some embodiments, the attributes may include the subject region with which each medical image is associated. For example, regions such as the left lung, right lung, left kidney, and right kidney may be recorded in the attribute marks so as to distinguish the body parts involved in different medical images. The subject map may specifically be a 2D human body map, a 3D human body map, or, when the subject is another animal, a map of that type of animal; this is not limited here. Through the configuration of the processor, the 2D or 3D human body map may support further viewing operations by the user, such as zooming, rotating, and partial enlargement, which are not described in detail in this embodiment.
In some embodiments, attribute marks are added to the respective medical images while the mapping event is being performed, and are stored in association with those images. Some attribute marks (e.g., those concerning time among the second attribute marks) may be added automatically by the processor at acquisition time, for example the capture time. In some embodiments, attribute marks (e.g., at least part of the first attribute marks) may also be added manually by the operator performing the mapping event; for example, a sonographer may set the subject region as an attribute mark for the next set of medical images before starting a scan of a certain part of the patient, where at least one of the subject's system, organ, part, subdivision of a part, section of an organ, and section of a part may be set. In other embodiments, the attribute mark for the subject region may be set for each medical image separately or in batches; alternatively, after a coarser subject-region attribute mark has been set for a group of medical images, a more detailed region may be set as the attribute mark when a specific medical image or several medical images are taken. As another example, when a sonographer finds a problem requiring special attention while taking a medical image and marks the image manually (a second attribute mark), embodiments of the present disclosure may also allow the operator to independently or additionally add an attribute mark such as "special attention", which the processor stores in association with the corresponding medical image. In addition, some attribute marks can be defined freely by the operator, such as specific symptoms or detailed characteristics of symptoms; details of the patient's condition observed by the operator during the mapping event can serve as custom attribute marks, making it easier for later medical staff to review them and improving their work efficiency. The above modes can be selected according to the needs of medical image capture and the convenience of the physician, and the various modes provided by the embodiments of the present disclosure can be used in combination, which are not enumerated here. The data structure formed by the individual medical images and their attribute marks may be predefined, so that they can be archived and shared in a uniform manner among different users (e.g., different doctors, hospitals, or institutions in a medical and health system). As an example, the data structure of the subject-region attribute marks may be a tree, in which a smaller-range subject-region mark is a subtree of a larger-range subject-region mark, and so on, which is not limited here.
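As a minimal, non-normative sketch of the tree-structured subject-region attribute marks just mentioned (a smaller region being a subtree of a larger one), the following Python snippet builds such a tree and resolves the chain of ancestors for a given region; the region names and helper functions are hypothetical.

```python
# Child region -> parent region (a smaller range is a subtree of a larger range).
REGION_PARENT = {
    "left lung": "lung",
    "right lung": "lung",
    "left lung, first region": "left lung",
    "lung": "respiratory system",
    "heart": "circulatory system",
}

def region_path(region):
    """Return the region and all of its ancestors, from most specific to most general."""
    path = [region]
    while region in REGION_PARENT:
        region = REGION_PARENT[region]
        path.append(region)
    return path

def covers(broad_region, specific_region):
    """True if an image marked with `specific_region` belongs under `broad_region`."""
    return broad_region in region_path(specific_region)

if __name__ == "__main__":
    print(region_path("left lung, first region"))
    print(covers("lung", "left lung, first region"))  # True
```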
In a specific embodiment, when multiple medical images are associated with a subject region according to the first attribute mark and then presented, the medical images need to be sorted. There are various ways of presenting the medical images in sorted order; this embodiment implements sorting through the second attribute mark. The second attribute mark may itself include several secondary marks, such as a time mark and a relevance mark, which are currently the two most common forms; other forms, such as an imaging-angle mark, a patient-identity mark, or a mapping-source mark, also fall within the scope of the second attribute mark.
When the scheme of this embodiment is used for viewing medical images, the first attribute mark supports a series of viewing operations based on the distribution of the subject's organs, such as viewing the medical information (including medical images) of a specific part designated by the user, viewing detailed information starting from a thumbnail or schematic, or viewing all the medical information corresponding to a complete medical event. The second attribute mark supports specific viewing operations, such as whether to present a medical image as a thumbnail, the display priority of each image when several images exist, whether to display a single image or an animated carousel, and, when an event attribute mark exists on the display, viewing the medical images in units of events.
It should further be noted that, based on the first and second attribute marks, this embodiment also supports associating medical information other than medical images with the subject map and presenting it according to those marks. Such information includes case information entered manually by medical personnel, life-support information or treatment information from various medical devices, and the like, in forms including but not limited to text, audio, and video.
Fig. 2 shows a sample data-structure diagram of an attribute mark according to an embodiment of the present disclosure. The attribute mark may include a plurality of fields, such as field 1 201, with sub-field 1 202a, sub-field 2 202b, ..., sub-field N 202n (the sub-fields are collectively denoted 202 when they are not distinguished). For example, field 1 201 may be used to record image information, and each sub-field 202 may be used to mark one attribute. In some embodiments, sub-field 1 202a may record the location of the medical image, sub-field 2 202b may record the capture time of the medical image, and so on. Each field has an upper limit on the number of characters; in some cases the data to be recorded may exceed this limit, so sub-field groups are also provided, in which adjacent sub-fields are combined into a set of fields for recording information about the medical image. In some embodiments, the positions of the fields may also be adjusted according to the user's move operations; for example, if the user regards the capture time as the most important attribute, its position may be adjusted to that of sub-field 1 202a, giving the user a customized effect. Attribute marks with a unified data structure greatly facilitate later management of the medical images, and at the same time present the attributes of a medical image to the doctor more intuitively. For example, in the presenting step, the imaged location or the capture time, or both, can be presented directly, so that the doctor can quickly locate the medical image or group of medical images to be viewed, improving reading efficiency.
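The following Python sketch illustrates, under stated assumptions, the field/sub-field layout of Fig. 2: one field for image information plus ordered sub-fields, with a per-field character limit and sub-field groups for values that exceed it. The limit value, function name, and sub-field names are illustrative assumptions.

```python
# A non-normative sketch of the field / sub-field layout of Fig. 2.
# MAX_FIELD_CHARS and the sub-field names are illustrative assumptions.
MAX_FIELD_CHARS = 64

def pack_attribute_mark(image_info, subfields):
    """Pack an attribute mark as one field of image information plus ordered sub-fields.
    Sub-field values longer than the per-field limit are split across a sub-field group."""
    record = {"field_1": image_info[:MAX_FIELD_CHARS], "sub_fields": []}
    for name, value in subfields:  # order matters: sub-field 1, 2, ..., N
        chunks = [value[i:i + MAX_FIELD_CHARS] for i in range(0, len(value), MAX_FIELD_CHARS)] or [""]
        record["sub_fields"].append({"name": name, "group": chunks})
    return record

if __name__ == "__main__":
    mark = pack_attribute_mark(
        "ultrasound image #42",
        [("location", "heart"), ("capture_time", "20181220AM101506")],
    )
    print(mark)
```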
In summary, in the above embodiment, the attribute mark of a medical image may include one or more fields, each of which may further include several sub-fields. When the user requests customization, a customized attribute mark can be generated and the processor marks the medical image according to it; when there is no customization request, the processor marks the medical image according to a default attribute-mark format.
The attribute marks have an association relationship with the medical images: depending on its content, one attribute mark may correspond to one medical image or to a group of medical images. The attribute marks may be stored together with the medical images, or stored separately from them and associated, for example, through a mapping.
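As a minimal sketch of the second option, assuming separately stored marks associated with images through a mapping, the following Python snippet uses two dictionaries; the mark IDs, image IDs, and field names are illustrative.

```python
# A minimal sketch of storing attribute marks separately from the images and
# associating them through a mapping; the mark IDs and image IDs are illustrative.
attribute_marks = {
    "mark_001": {"location": "heart", "capture_time": "2018-12-20 10:15:06"},
    "mark_002": {"location": "left lung", "capture_time": "2018-12-20 10:30:00"},
}

# One attribute mark may correspond to one image or to a group of images.
mark_to_images = {
    "mark_001": ["img_0001.dcm", "img_0002.dcm"],
    "mark_002": ["img_0003.dcm"],
}

def images_with(attribute, value):
    """Look up, via the mapping, all images whose attribute mark has the given value."""
    return [img
            for mark_id, attrs in attribute_marks.items()
            if attrs.get(attribute) == value
            for img in mark_to_images.get(mark_id, [])]

if __name__ == "__main__":
    print(images_with("location", "heart"))  # ['img_0001.dcm', 'img_0002.dcm']
```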
In some embodiments, the attribute marks of the medical images may be written into the file names of the medical images, or may be stored separately and associated with them. For example, if Zhang San has a medical image of the heart taken at 10:15:06 AM on December 20, 2018, the file name of that image may include fields such as zhangsan_heart_20181220AM101506. The file name may also carry only part of the attribute marks, independently or additionally; it is not necessary to record all attribute marks in the file name, which avoids file names that are too long to read. By recording attribute marks in the file name, the management of medical images can be standardized, and the doctor can determine the operation and/or content information of a medical image immediately after obtaining it. Note that the order of the fields in the file name may be adjusted as needed, which is not specifically limited in this embodiment.
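A short Python sketch of this file-naming idea follows; it reproduces the example name above. The separator, field order, and helper names are illustrative assumptions rather than a fixed standard defined by the disclosure.

```python
from datetime import datetime

# Encode a few attribute marks into a file name of the form
# <patient>_<part>_<YYYYMMDD><AM|PM><hhmmss>, as in the example above.
def encode_filename(patient, part, capture_time):
    half = "AM" if capture_time.hour < 12 else "PM"
    return f"{patient}_{part}_{capture_time:%Y%m%d}{half}{capture_time:%H%M%S}"

def decode_filename(name):
    patient, part, stamp = name.split("_", 2)
    date, half, clock = stamp[:8], stamp[8:10], stamp[10:]
    return {
        "patient": patient,
        "part": part,
        "capture_time": datetime.strptime(date + clock, "%Y%m%d%H%M%S"),
        "half_day": half,
    }

if __name__ == "__main__":
    name = encode_filename("zhangsan", "heart", datetime(2018, 12, 20, 10, 15, 6))
    print(name)                # zhangsan_heart_20181220AM101506
    print(decode_filename(name))
```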
Based on the subject region associated with each medical image (included in its attributes), the attributes represented by the attribute marks can be presented on a display, and the medical images of the subject region selected by the user can be presented in a designated form according to the user's interactions. For example, a representative map of each medical image may be presented in association with its subject region on the subject map. As shown in Fig. 3, when the subject is a human body, a human body map 301 is presented on the display, and on it the various body regions (for example, the white dot on the brain in Fig. 3) and a medical image 302 associated with a body region are presented; optionally, other attribute marks 303 of the medical image 302 may be presented at the same time. As shown in Fig. 3, the medical image 302 in this example may be presented as a thumbnail. Preferably, the thumbnail may be expanded further to present the complete attribute mark, or an interactive button may be provided for a general attribute field to select the other associated medical images.
In some cases, when the user operates (a fifth operation instruction) to move an operating unit such as a mouse over the medical image viewing window (which may show a detailed view or a thumbnail, static or animated), the system may choose to display the corresponding capture information in a new display area different from the area outside the viewing window, for example a sidebar at the bottom or a new pop-up window. These data may change dynamically as the object the user intends to view changes, or a fixed presentation may be selected based on the user's confirmation. This is particularly convenient when the user wants to browse quickly, find the target image, and then select it to view its specific information.
Referring to Fig. 3, the medical image 302 may also be a representative medical image of the patient's brain, whose original can be presented by clicking on the image. The representative medical image in this example is a representative image of one of the patient's mapping events; it may be selected according to a rule or according to the doctor's operation, which is not limited here. In this example, the medical image 302 is associated with the brain by a marking line; other association manners are not repeated here.
By presenting the attributes embodied by the attribute marks on the display, the doctor can see at which body parts the patient has medical images. With this method, the operator only needs to mark the body part during mapping and the stored medical image is associated with that body-part information; when a later operator views the images, the medical images obtained from mapping can be displayed near the corresponding body part according to the different part marks (such as cranium, limbs, trunk) and can be further marked by organ, such as heart, liver, stomach, or skin. Preferably, the presented mark may be a thumbnail, a customized graphic, or another figure or symbol. Through this way of presentation, the efficiency of extracting the patient's key information (such as that in Fig. 3: Zhang San, brain, 15:...) is improved, which saves the time the doctor would spend searching for the desired medical image among many images and improves diagnostic efficiency.
In existing medical reports, some medical images are shown to the patient, but the displayed images are only accurate to the date. For example, if Zhang San underwent a mapping on May 10, 2018 and the resulting medical report is dated May 17, 2018, then the report time recorded in Zhang San's medical report is May 17, 2018. In some application scenarios, for example when Zhang San undergoes a comprehensive physical examination or is in critical care, multiple medical images may be taken of Zhang San within the same day, and even within the same imaging event medical images of different parts may be taken; such a blurred report time does not allow a doctor to accurately locate a desired medical image.
In some embodiments, the second attribute mark further comprises the capture time of each medical image, and the precision of the capture time is sufficient to distinguish the capture times of different medical images within a single mapping event (e.g., occurring on the same day); it may specifically be defined to minutes or seconds. Since a patient may have several mapping events during one day, a medical image of a part taken in the morning may differ significantly from an image of the same part taken again in the afternoon. In this example, the attributes in the attribute marks include the capture time of the medical image, defined precisely enough to distinguish different medical images within a single mapping event. A mapping event in this example may comprise a series of mapping operations performed by a physician during one medical image examination; in some specific medical imaging scenarios it may be all the images taken before the current probe is replaced. For example, suppose Zhang San underwent one brain mapping at 15:00 on May 17, 2018, during which a freeze operation was performed every 5 minutes, yielding a total of 6 brain medical images. To distinguish this set of medical images, the precision of the capture time may be defined to the minute, whereby each medical image in the mapping event can be distinguished. Defining the capture time in this way avoids giving the medical image only a rough time range, so that the capture time is reflected in the attribute mark more clearly and provides the doctor with a more effective basis for judgment. In this embodiment, the precision of the time attribute mark is defined as being able to distinguish the capture times of different medical images within a single imaging event. Correspondingly, when the medical images are presented on the display, the time attribute mark can serve as the basis for how multiple medical images are concretely presented.
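A small Python sketch of this idea, reusing the scenario above (one brain mapping starting at 15:00 on 2018-05-17, one freeze every 5 minutes, 6 images), gives each image in the event a minute-precise capture time and orders the set by it; the record layout and identifiers are illustrative assumptions.

```python
from datetime import datetime, timedelta

# Give each image in a single mapping event a capture time precise to the minute/second,
# so the images can be told apart and ordered.
start = datetime(2018, 5, 17, 15, 0, 0)
mapping_event = [
    {"image_id": f"brain_{i + 1}", "part": "brain", "capture_time": start + timedelta(minutes=5 * i)}
    for i in range(6)
]

def images_in_order(event_images):
    """Sort the images of one mapping event by their (minute-precise) capture time."""
    return sorted(event_images, key=lambda im: im["capture_time"])

if __name__ == "__main__":
    for im in images_in_order(mapping_event):
        print(im["image_id"], im["capture_time"].strftime("%Y-%m-%d %H:%M:%S"))
```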
In some embodiments, the capture time of each medical image is the time at which the user performed a freeze operation on that image. Setting the capture time in this way means that, during the doctor's imaging of the patient, the moment of the freeze operation can be regarded as the doctor's manual marking of the medical image; such a mark can also be referred to as a relevance mark. In a dynamic imaging process of a subject, or in an imaging process of long duration (e.g., MRI), adopting the freeze time as the capture time makes it possible to determine the precise capture time of the acquired medical image, rather than only a rough time of the imaging process (e.g., its start or end, which cannot express the order in which images were captured). In other embodiments, relevance marks can take other forms, such as important/ordinary marks added manually by doctors, orderings set manually by medical personnel, or machine-generated marks triggered by preset conditions. Correspondingly, when medical images bearing such attribute marks are presented on the display, the relevance marks can serve as the basis for how multiple medical images are concretely presented.
It should be noted in particular that the time attribute marks and the relevance marks above are only used to explain the concrete presentation of medical images in this embodiment, and they are not mutually exclusive; that is, a medical image may carry both a time attribute mark and a relevance mark. The processor accordingly provides a selectable interactive interface, so that the user can switch between different second attribute marks as needed, for example switching from display ordered by the time attribute mark to display ordered by the relevance mark.
In some embodiments, the capture time comprises the date, hour, and minute (and may even be refined to seconds). That is, the precision of the capture time may be defined to seconds. For example, if Zhang San underwent one brain mapping at 15:00 on May 17, 2018, with one freeze operation performed every 5 minutes, the time of each freeze operation may be used as an attribute mark, for example in the form 20180517 15:00:00. Defining the precision of the capture time to seconds effectively distinguishes the capture times of multiple medical images within one imaging event, instead of describing them all with the same general imaging event, so that doctors can grasp the details of the patient's condition more accurately.
In the diagnosis of some diseases, finer divisions of the various tissues and organs are of interest, for example left/right divisions such as left lung and right lung or left kidney and right kidney; the lung can also be divided into upper, lower, anterior, and posterior regions, or into the left lung and right lung and then into a first region and second region of each. In some embodiments, the subject region includes at least one of a system, an organ, a site, a subdivision of an organ, a partition of a site, a section of an organ, and a section of a site of the subject. By dividing the examined body into different regions, a doctor reviewing a medical image later can grasp more accurately which region it shows, improving the doctor's efficiency of judgment.
When a clinician actually captures images, one part may be imaged once, but more often multiple images of one part are captured from multiple angles. The thumbnail presented in this example may therefore represent one image or a group of images, which is not limited here. For example, when a subject region has several associated medical images, the medical image with the most recent capture time may be given higher priority for use as the representative map: if a doctor takes multiple images of a part from multiple angles, the last medical image in the group can serve as the representative map. Providing a representative map lets a doctor quickly understand the operation or content of a mapping event and makes it easier to find the medical image to be reviewed. The imaging angle in this embodiment may itself be one of the attribute marks. When the patient's heart has been imaged from the front, back, left, and right, if the doctor wants to view the images taken from the front first, the imaging angle can be selected together with the first attribute mark, and the medical images at the corresponding angle can be called up. If the doctor chooses to view the medical images of the patient's heart, the imaging angle can serve as a second attribute mark: as the viewing angle of the subject map changes with the user's operation, the display can present the medical image whose imaging angle is closest to the current viewing angle.
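The following Python sketch illustrates, under stated assumptions, the two selection rules just described: using the most recently captured image of a region as its representative map, and choosing the image whose imaging-angle mark is closest to the current viewing angle. Encoding the angle in degrees and the field names are illustrative assumptions.

```python
from datetime import datetime

# Illustrative image records for one subject region; imaging_angle is assumed
# to be stored in degrees as part of the attribute marks.
heart_images = [
    {"image_id": "heart_front", "capture_time": datetime(2018, 5, 17, 15, 0), "imaging_angle": 0},
    {"image_id": "heart_left",  "capture_time": datetime(2018, 5, 17, 15, 5), "imaging_angle": 90},
    {"image_id": "heart_back",  "capture_time": datetime(2018, 5, 17, 15, 10), "imaging_angle": 180},
]

def representative_by_latest(images):
    """Rule 1: the most recently captured image of the region is the representative map."""
    return max(images, key=lambda im: im["capture_time"])

def closest_to_view_angle(images, view_angle):
    """Rule 2: present the image whose imaging angle is closest to the current view angle."""
    def angular_distance(im):
        diff = abs(im["imaging_angle"] - view_angle) % 360
        return min(diff, 360 - diff)
    return min(images, key=angular_distance)

if __name__ == "__main__":
    print(representative_by_latest(heart_images)["image_id"])   # heart_back
    print(closest_to_view_angle(heart_images, 75)["image_id"])  # heart_left
```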
The attributes of an attribute mark can include the specific capture time. During actual imaging, several medical images may be captured of the same part, covering more angles and facilitating later diagnosis. In some embodiments, the attributes may further include the mapping event in which the medical images were captured; one mapping event may correspond to multiple capture times, so a resulting set of medical images (or several sets) may correspond to one mapping event. For example, if a patient has both liver and lung disease, the medical images taken of the liver and of the lung may both be covered in one mapping event. While the representative maps of the medical images are presented in association with their subject regions, the representative time at which the mapping event was performed on the subject region may also be presented in association with each representative map.
Referring to Fig. 3, according to the attribute mark, the representative time of the mapping event corresponding to this set of medical images is the 15:... time shown in the figure. The medical image 302 and the attribute-mark item 303, presented as thumbnails independently or additionally, need not correspond to each other exactly; for example, the representative map of the medical image in Fig. 3 may itself have been captured at a slightly different 15:... time within the same event.
The method of viewing medical images of the present disclosure may also present the items of the mapping events performed on the subject region in response to a second interactive operation by the user on the representative time. As shown in Fig. 4, different mapping events or capture times are displayed through indication marks. For example, after the user clicks on the representative time, a selection box 404 of mapping events may be presented; the doctor selects the mapping event to view through the selection box 404, thereby jumping to the selected mapping event or capture time. In response to the user selecting a mapping event from the presented items, a medical image taken in that mapping event may be presented on the subject map in association with the related subject region. For example, after jumping to the selected mapping event, the medical image or group of medical images under that event may be presented, e.g., in an associated manner near the related subject region. The representative map may then be replaced with the representative map of the selected mapping event, and the representative time with the representative time of that event. A specific capture time may also serve as an item in the selection box 404; presenting mapping events through the selection box 404 is only one optional presentation manner, and other attributes in the attribute mark may also serve as its contents, which are not described one by one here.
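A minimal, non-normative sketch of this behaviour follows: when the user picks a mapping event from the selection box, the view jumps to that event and the representative map and representative time shown on the subject map are replaced. The event records and field names are illustrative assumptions.

```python
# Hypothetical mapping-event records keyed by a display label.
mapping_events = {
    "heart 2018-05-17 15:00": {"representative_image": "heart_6.png", "representative_time": "2018-05-17 15:25"},
    "heart 2018-05-10 09:30": {"representative_image": "heart_2.png", "representative_time": "2018-05-10 09:40"},
}

view_state = {"representative_image": None, "representative_time": None}

def on_event_selected(event_name):
    """Handle the user's selection in the mapping-event selection box (element 404 in Fig. 4)."""
    event = mapping_events[event_name]
    view_state["representative_image"] = event["representative_image"]
    view_state["representative_time"] = event["representative_time"]
    return view_state

if __name__ == "__main__":
    print(on_event_selected("heart 2018-05-10 09:30"))
```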
Fig. 5 shows yet another example of the presentation of attributes and a medical image on a display for a method of viewing medical images according to an embodiment of the present disclosure. The method may also present the image details of a mapping event according to the user's interactions. For example, in some embodiments, the detailed medical images corresponding to a representative map may be presented in response to a second interactive operation by the user on that representative map, as shown in Fig. 5. A mapping event includes several medical images, which may be displayed in order of priority or of capture time. One part of the graphical user interface may present the original of the currently selected medical image together with thumbnails of the whole mapping event, while another part presents the basic information of the current medical image. A menu key may be provided to enable operations such as returning, so that the thumbnails and their corresponding detailed originals can be displayed in a switched manner. The interface displayed in this way is simple and clear; at the same time, with the details of a medical image presented on a clear interface, the doctor can quickly obtain the patient information expressed by the image. The method therefore greatly improves the management efficiency of medical images and saves the doctor's valuable time.
In addition to the way of setting the representative map in the foregoing embodiments, the representative map may be set in other manners during actual operation. For example, various attribute marks affecting priority may be set for each medical image during capture, such as capture time, image quality, and user preference. After a set of images has been acquired, the medical image with the higher priority may be presented as the representative map. In some embodiments, the attributes may further include the quality level of each medical image, and a medical image is collected if its attribute mark indicates a quality level higher than a preset threshold. The quality level may be determined automatically, for example by feature-value comparison, and the medical image is collected when its quality level exceeds the preset threshold. In some embodiments, when a subject region has several associated medical images, the collected medical images have higher priority for use as the representative map. For example, a mapping event may include several medical images of which only one has been collected (or manually marked); that image can then serve as the representative map. When several medical images with quality levels above the preset threshold have been collected, the most recently collected one can be used as the representative map; the specific rule is set according to actual needs and is not enumerated here. Determining the representative map in this way means that attribute marks added at the image-acquisition stage serve as reference information for later priority decisions, and the medical image with the higher priority can subsequently be chosen as the representative map based on that information. The presented representative map is thus, with high probability, the medical image the doctor wishes to see, further sparing the doctor from searching for it among the many images of a mapping event.
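A non-normative Python sketch of this priority rule follows: images whose quality level exceeds a preset threshold are collected, collected (or manually marked) images take precedence, and among them the most recently collected image becomes the representative map. The threshold value, field names, and scores are illustrative assumptions.

```python
QUALITY_THRESHOLD = 0.8   # assumed preset threshold

def collect(images):
    """Mark images whose quality level exceeds the preset threshold as collected."""
    for order, img in enumerate(img for img in images if img["quality"] > QUALITY_THRESHOLD):
        img["collected"], img["collect_order"] = True, order
    return images

def representative_map(images):
    """Prefer collected/manually marked images; among them, take the most recently collected."""
    candidates = [im for im in images if im.get("collected") or im.get("manually_marked")]
    if candidates:
        return max(candidates, key=lambda im: im.get("collect_order", -1))
    return max(images, key=lambda im: im["capture_order"])  # fall back to the latest image

if __name__ == "__main__":
    event = [
        {"image_id": "a", "quality": 0.75, "capture_order": 0},
        {"image_id": "b", "quality": 0.85, "capture_order": 1},
        {"image_id": "c", "quality": 0.90, "capture_order": 2},
    ]
    print(representative_map(collect(event))["image_id"])  # "c": collected most recently
```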
In some embodiments, a method for viewing medical images is also provided that presents the attribute marks and at least part of the related medical images to the user in a manner not limited to the presentations described above. Differently from the foregoing presentation manner, in step S102 the subject map may be loaded on a display and part or all of the medical images associated with it, so that the user can view each medical image according to the first attribute mark and/or the second attribute mark. As shown in Fig. 6, the items of the mapping events of the medical images may be presented, for example, through a menu item 601. For example, three mapping events are displayed in the menu item 601: a heart mapping event; a kidney and spleen mapping event; and a lung, heart, and spleen mapping event. The user may select any of these events through an interactive operation to view its specific contents. Further, the user may on one hand view existing medical images or patient medical data based on the existing marks, and on the other hand manually modify an existing mark during viewing, or add a new mark as required. The newly added mark may flag a certain image or piece of information as a key point, record preliminary diagnosis information, rate image quality, and so on, which is not specifically limited in this embodiment.
In some embodiments, in response to the user selecting a mapping event to view among the presented items, at least part of the set of medical images taken in that mapping event may also be presented on the subject map in association with its related subject regions. In Fig. 6, the user has currently selected the mapping event of the lung, heart, and spleen. As shown in Fig. 6, a human body map 301 may be presented, and (at least part of, e.g., a representative map of) the set of medical images taken in the selected mapping event is presented in association with the related body regions, namely the left lung, the right lung, the heart, and the spleen.
In medicine, some organs are divided into left and right parts, such as the left and right lung or the left and right kidney; the lung can also be divided into upper, lower, anterior, and posterior regions, or into the left lung and right lung and further into a first region and second region of each. Similarly, in some embodiments, the preset attributes may further include the subject region with which each medical image is associated. Referring to Fig. 6, the currently selected mapping event of the lung, heart, and spleen involves three large organs, and the lung is divided into a left lung and a right lung (the occluded part is not shown in the figure); at this point the more precise subject regions associated with the respective medical images can be displayed, so that the attribute is correlated with the corresponding subject region.
In some embodiments, a display portion may further be provided in the window of Fig. 6 to present the set of medical images taken in the mapping event in association, in a manner different from loading the subject map and presenting by region as described above. For example, a picture frame 602 may be introduced on one side of Fig. 6, and this picture frame 602 may present the set of medical images taken in the mapping event of the lung, heart, and spleen. The presentation of a group of medical images taken of the heart may likewise be ordered by priority, or the images may be presented as thumbnails, which is not limited here. A representative map may also be presented in the human body map 301, independently or additionally. The representative map may be one image or a group of images from the set of medical images taken of the heart, a thumbnail, or a medical image after a preset image-processing step, which is not limited here.
In some embodiments, the physician may select a representative image on each body region of the body map 301, for example the representative image at the heart, and the set of medical images (or representative images) taken of the heart during the mapping event may be automatically scrolled into the picture frame 602. In this way, the doctor, as the user, can not only refer to the visualized mapping-event items to narrow down the range of medical images to be viewed, but can also focus on each body region involved in the mapping event on the body map, select the body region to be viewed by referring to the representative image near each body region, and conveniently view, directly in the picture frame 602, the medical images of the mapping event that relate to that body region. In fig. 6, if the lung, heart and spleen mapping event currently selected in the menu item 601 is changed to the kidney and spleen mapping event, the medical images displayed in the picture frame 602 change accordingly to those of the kidney or spleen. This three-way linked presentation, combining the mapping-event items, the representative images associated with body regions on the body map, and the full-size images of the body region of interest, greatly improves the efficiency of locating and reading images and offers high user friendliness.
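A minimal sketch of this linkage, assuming each image is carried as a plain dictionary with illustrative keys (id, region, event):

```python
from typing import Dict, List

# Each image is represented here as a plain dict, e.g.
# {"id": 1, "region": "heart", "event": "lung, heart and spleen"}.

def group_event_images(event: str, images: List[dict]) -> Dict[str, List[dict]]:
    """Group the images of the selected mapping event by body region, so each region on
    the body map can carry its own representative image and the picture frame can list
    the whole set (a sketch of the three-way linkage)."""
    groups: Dict[str, List[dict]] = {}
    for im in images:
        if im["event"] == event:
            groups.setdefault(im["region"], []).append(im)
    return groups

def on_region_clicked(region: str, groups: Dict[str, List[dict]]) -> List[dict]:
    """Clicking the representative image of a region scrolls the picture frame to the
    images of that region within the current mapping event."""
    return groups.get(region, [])
```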
In some embodiments, after the user selects a medical image from the medical images of the kidney or spleen in the picture frame 602, the selected medical image may be presented on the body map 301. This presentation mode offers the user a very intuitive way of viewing the medical image and, at the same time, allows the marked attributes and part of the medical images to be displayed together, thereby improving the judgment efficiency of doctors.
As an illustrative example, the lung, heart and spleen mapping event is selected in fig. 6, and the representative time of this group of mapping events may also be shown on the item of the corresponding lung, heart and spleen mapping event; the precision of the representative time may reach the level of minutes or seconds. For the specific association manner, reference may be made to the marking lines and similar manners of the foregoing embodiments, which are not limited herein. Referring to fig. 6, as an illustrative example, for the medical image numbered 1 in the currently selected group of heart medical images, that image may be correspondingly displayed, in association, at the heart portion of the body map 301. Similarly, when a medical image with another number is selected, that image may be correspondingly presented on the body map 301.
As one illustration, in some embodiments, presenting at least a portion of the set of medical images taken in the selected mapping event in association with the associated subject region may specifically include: presenting a representative image of the set of medical images taken in the selected mapping event in association with the associated subject region. When the user selects an item from the menu item 601, the corresponding set of medical images is displayed in the picture frame 602. Likewise, each group of medical images may have a corresponding representative image; the specific way of determining the representative image is described above and is not repeated here. As shown in fig. 6, the representative image is displayed at the corresponding portion of the body map 301. The user may also manually replace the representative image, so that the chosen representative image of the medical images is presented in association with its subject region. In some embodiments, in response to a third interactive operation by the user on a representative image, the set of medical images taken in the mapping event for the subject region associated with that representative image may also be presented. As shown in fig. 6, for example, when the user clicks on the representative image of the heart, the image information window interface shown in fig. 5 may be entered, or another interface interaction manner may be used, which is not limited herein. With this method of viewing medical images, unified management and display of mapping events can be achieved, the image-review efficiency of doctors is improved, and their diagnosis efficiency is greatly increased.
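A minimal sketch of representative-image selection and of the third interactive operation, continuing the plain-dictionary representation above (the default of picking the most recent image is an assumption; the disclosure leaves the rule open):

```python
from typing import Dict, List, Optional

def pick_representative(group: List[dict],
                        manual_choice: Optional[int] = None) -> Optional[dict]:
    """Pick the representative image of one region's group: the user's manual choice if
    given, otherwise the most recently captured image (one plausible default rule).
    Capture times are assumed to be ISO-format strings so they sort lexicographically."""
    if not group:
        return None
    if manual_choice is not None:
        for im in group:
            if im.get("id") == manual_choice:
                return im
    return max(group, key=lambda im: im.get("capture_time", ""))

def on_representative_clicked(region: str,
                              groups: Dict[str, List[dict]]) -> List[dict]:
    """Third interactive operation: clicking a region's representative image opens the
    full set of images taken of that region in the current mapping event."""
    return groups.get(region, [])
```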
The foregoing embodiments describe one example of presenting and selecting the menu item 601 and the attribute mark item 303 according to the mapping event, but many presentation and user interaction modes are possible in practice, and the disclosure is not limited to one. In some embodiments, medical images having constrained attributes may also be presented in response to a user's constraining operation on the attributes of the medical images to be viewed (for example, without limitation, by way of label filtering), the constrained attributes including any one or a combination of the subject region to be viewed, the mapping event to be viewed, and the capture time to be viewed. The attribute constraint operation referred to in this example may be, for example, setting a target condition, according to which the medical image or group of medical images satisfying the target condition is presented. For example, a search field may be provided for keyword search, where the keyword may be field content from the attribute tags; the specific manner is not limited herein. By responding to the user's attribute constraint operation and presenting the constrained images in an associated manner, the medical images the user wishes to view can be obtained quickly, improving the doctor's experience.
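A minimal sketch of such attribute filtering, again over the plain-dictionary image records used above (the field names and the substring keyword match are illustrative assumptions):

```python
from typing import List, Optional

def filter_images(images: List[dict],
                  region: Optional[str] = None,
                  event: Optional[str] = None,
                  capture_date: Optional[str] = None,
                  keyword: Optional[str] = None) -> List[dict]:
    """Return the images matching the constrained attributes; any combination of region,
    mapping event, capture date and a free-text keyword may be given."""
    result = []
    for im in images:
        if region and im.get("region") != region:
            continue
        if event and im.get("event") != event:
            continue
        if capture_date and not str(im.get("capture_time", "")).startswith(capture_date):
            continue
        if keyword and keyword not in " ".join(str(v) for v in im.values()):
            continue
        result.append(im)
    return result

# Example: images taken of the kidney during the "kidney and spleen" event.
# filter_images(all_images, region="kidney", event="kidney and spleen")
```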
In some embodiments, the subject map may be a two-dimensional map or a three-dimensional map, which is not limited herein, and the two-dimensional or three-dimensional map may represent the complete subject or an associated part of the subject. In the case that the subject map is a three-dimensional map, the image viewing method of the present disclosure may further include changing the viewing angle of the three-dimensional map based on an operation of the user (for example, dragging and rotating with a mouse, a sliding operation on a touch screen, or gesture interaction in a gesture control system, which is not limited in this embodiment). The simulated display of the three-dimensional map can show richer details of the patient's physiological structure, and the more precise correspondence between the examined body region and the associated medical images in the three-dimensional map allows the doctor to view the intended images quickly and accurately. The presentation angle or zoom of the three-dimensional map can be changed based on the user's operation, thereby helping the doctor locate the subject region to be viewed.

The embodiments of the present disclosure also propose an apparatus 700 for viewing medical images. As shown in fig. 7, the apparatus includes a processor 701, and the processor 701 is configured to execute the method for viewing medical images according to the various embodiments of the present disclosure. In some embodiments, the processor 701 may be a processing device including one or more general-purpose processing devices, such as a microprocessor, a central processing unit (CPU), a graphics processing unit (GPU), or the like. More specifically, the at least one processor 701 may be a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a processor running other instruction sets, or a processor running a combination of instruction sets. The at least one processor 701 may also be one or more special-purpose processing devices such as an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a digital signal processor (DSP), a system on a chip (SoC), or the like. In some embodiments, the processor may be located locally in the medical device or remotely, such as, but not limited to, in the cloud. The apparatus 700 for viewing medical images may also include an input/output device 703 that provides an interface between input/output peripherals and the peripheral interfaces of the apparatus 700. For example, an input/output peripheral may be an external display 704, or a light assembly and other input/control devices. A memory 702 may also be included; the memory 702 may store software programs for performing the detection, save medical images obtained in various mapping events, and the like. The memory 702 may store or load a software program such that, when executed by the processor 701, the software program implements a method for viewing medical images according to the various embodiments of the present disclosure.
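Returning to the three-dimensional subject map interaction described above, the following is a minimal sketch of how a drag or scroll operation might update the presentation angle and zoom (the state keys, sensitivity value and clamping ranges are assumptions for illustration):

```python
def update_view(view: dict, drag_dx: float, drag_dy: float,
                wheel: float = 0.0, sensitivity: float = 0.4) -> dict:
    """Update the presentation angle and zoom of a three-dimensional subject map from a
    drag/scroll interaction. 'view' is e.g. {"yaw": 0.0, "pitch": 0.0, "zoom": 1.0}."""
    view = dict(view)                                   # work on a copy of the view state
    view["yaw"] = (view["yaw"] + drag_dx * sensitivity) % 360.0
    view["pitch"] = max(-90.0, min(90.0, view["pitch"] + drag_dy * sensitivity))
    view["zoom"] = max(0.1, view["zoom"] * (1.1 ** wheel))   # scroll in/out, never to zero
    return view
```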
In some embodiments, the apparatus 700 for viewing medical images may be located locally at the medical imaging apparatus, or even be integrated within the medical imaging apparatus, or it may be located remotely, for example implemented as a remote image workstation.
In some embodiments, as shown in fig. 8, an image workstation 803 can be used to view medical images and includes a communication interface configured to receive the respective medical images acquired by a medical imaging device and the attribute marks of the respective medical images, the attribute marks embodying at least one attribute of the operation and content of the medical images; and a processor configured to perform the method for viewing medical images according to the various embodiments of the present disclosure.
The image workstation 803 may be connected to a communication network 802 through the communication interface, and the medical imaging apparatuses (801a to 801n) may transmit the medical images obtained in their respective imaging events to the image workstation 803 through the communication network 802. For example, the medical imaging apparatuses (801a to 801n) may all be ultrasound imaging apparatuses, or some of them may be ultrasound imaging apparatuses while others are magnetic resonance apparatuses, or other combinations of imaging apparatuses may be used, which is not limited herein. Referring to fig. 8, the image workstation 803 may be equipped with a display, and may also present the attributes of the medical images and the associated medical images via a remote display. The display 804 may be co-located with the image workstation 803 for local viewing, or viewing may be remote; for example, the display 804 may be on a portable terminal, and a doctor at the portable terminal may remotely access the image workstation 803 via the communication network 802 to obtain presentation instructions from the image workstation 803 and carry out the presentation accordingly, for example presenting on the display 804 the attributes embodied by the attribute marks and presenting, in association with the attributes, at least part of the medical images having the attribute marks.
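A minimal sketch of how the workstation side might index incoming images by patient and mapping event (the transport layer, for example DICOM networking, is assumed and not shown; all names are illustrative):

```python
from collections import defaultdict
from typing import Dict, List, Tuple

class ImageWorkstationIndex:
    """Illustrative index an image workstation could keep for images arriving from
    several imaging devices, keyed by patient and mapping event."""

    def __init__(self) -> None:
        self._by_patient_event: Dict[Tuple[str, str], List[dict]] = defaultdict(list)

    def receive(self, patient_id: str, image: dict) -> None:
        """Called by the communication interface for each image and its attribute marks."""
        key = (patient_id, image.get("event", "unknown"))
        self._by_patient_event[key].append(image)

    def images_for(self, patient_id: str, event: str) -> List[dict]:
        """Fetch the set that a local or remote display would present for one mapping event."""
        return list(self._by_patient_event.get((patient_id, event), []))
```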
Embodiments of the present disclosure also provide a non-transitory computer-readable storage medium storing a program that causes a processor to execute the method for viewing a medical image according to various embodiments of the present disclosure. The computer-readable storage medium in this example may be a non-transitory computer-readable medium, such as read-only memory (ROM), random-access memory (RAM), phase-change random-access memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), electrically erasable programmable read-only memory (EEPROM), other types of random-access memory (RAM), flash disks or other forms of flash memory, caches, registers, static memory, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, tape cassettes or other magnetic storage devices, or any other possible non-transitory medium that may be used to store information or instructions that may be accessed by a computer device, and so forth.
Moreover, although exemplary embodiments have been described herein, the scope of the disclosure includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations or alterations based on the disclosure. The elements of the claims are to be interpreted broadly based on the language employed in the claims and are not limited to the examples described in the present specification or during the prosecution of the application, which examples are to be construed as non-exclusive. It is intended, therefore, that the specification and examples be considered as exemplary only, with the true scope and spirit being indicated by the following claims and their full scope of equivalents.
The above description is intended to be illustrative and not restrictive. For example, the above-described examples (or one or more versions thereof) may be used in combination with each other. Other embodiments may be devised by those of ordinary skill in the art upon reading the above description. In addition, in the foregoing detailed description, various features may be grouped together to streamline the disclosure. This should not be interpreted as an intention that an unclaimed disclosed feature is essential to any claim. Rather, the subject matter of the present disclosure may lie in less than all features of a particular disclosed embodiment. Thus, the following claims are hereby incorporated into the detailed description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that these embodiments may be combined with each other in various combinations or permutations. The scope of the disclosure should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
The above embodiments are merely exemplary embodiments of the present disclosure, which is not intended to limit the present disclosure, and the scope of the present disclosure is defined by the claims. Various modifications and equivalents of the disclosure may occur to those skilled in the art within the spirit and scope of the disclosure, and such modifications and equivalents are considered to be within the scope of the disclosure.

Claims (17)

1. A method for viewing medical images, the method comprising, by a processor:
acquiring a plurality of medical images, wherein the medical images are provided with a first attribute mark and a second attribute mark, the first attribute mark is used for representing the corresponding relation between the medical images and the examined biological tissue part, and the second attribute mark is used for determining the presentation content and/or the presentation sequence of the medical images;
according to the first attribute mark, associating the medical images having the same first attribute mark with a tissue area corresponding to the first attribute mark on a subject map, wherein the subject map is a two-dimensional model map or a three-dimensional model map for representing a biological tissue part;
on a display,
displaying the subject map, associating part or all of the medical images with the subject map, and presenting part or all of the plurality of medical images according to the second attribute marks; and
by the processor,
receiving a first interactive operation instruction from a user; the first interactive operation instruction comprises the determined first attribute mark and/or second attribute mark information;
and displaying the medical image related to the determined first attribute mark and/or the second attribute mark on the display in a preset mode according to the first interactive operation instruction.
2. The method of claim 1, wherein the second attribute mark comprises a time attribute mark for characterizing the capture time of each medical image, the method further comprising:
and according to the time attribute mark and the first attribute mark, in each display area on the subject map associated with the tissue area corresponding to the first attribute mark, preferentially displaying the most recent medical image in time order, or cyclically displaying the medical images corresponding to the first attribute mark in time order.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
and according to the time attribute mark in the second attribute mark, correspondingly presenting, in a time window of the display and in time order, the medical images of the biological tissues corresponding to one or more first attribute marks.
4. The method of claim 1, wherein the second attribute mark comprises a priority mark for characterizing the relevance of each medical image, the method further comprising:
and correspondingly presenting, in a display area associated with the corresponding tissue area on the subject map, the medical images of the biological tissues corresponding to one or more first attribute marks in the order of their priority marks.
5. The method of claim 4, wherein the priority marks include manual marks added by a user and machine marks triggered based on preset conditions.
6. The method of claim 4, further comprising, by the processor,
highlighting the subject region associated with the determined first attribute mark.
7. The method of claim 1, 2 or 4, wherein the first attribute mark comprises at least one of a region mark, an organ mark, and a tissue angle mark.
8. The method of claim 7, further comprising:
by the processor,
when a second interactive operation instruction from a user is received, adjusting the presentation angle and the image size of the subject map on the display according to the second interactive operation instruction, and correspondingly adjusting the presentation form of the medical images associated with the subject map according to the change of the subject map on the display.
9. The method of claim 8, further comprising:
by the processor,
when a fourth interactive operation instruction from the user is received, adjusting the local size or the overall size of the medical image displayed on the display according to the fourth interactive operation instruction.
10. The method of claim 1, 2 or 4, wherein at least one of the medical images further comprises an event attribute mark for marking the medical event to which the medical image corresponds, the method further comprising:
by the processor, on the display:
according to the event attribute marks, correspondingly displaying the operation events of the medical images in an event window according to a preset sequence; and/or
Receiving event interaction operation from a user, and based on a target operation event selected by the event interaction operation, displaying one or more medical images corresponding to the target operation event on the display in whole or in part.
11. The method of claim 1, 2 or 4, further comprising, by the processor,
receiving a third interactive operation from the user, and determining a target medical image selected to be viewed by the user based on the third interactive operation;
displaying the target medical image on the display.
12. The method of claim 11, further comprising: receiving an event interaction operation from a user based on the event attribute mark of the target medical image on the display, and displaying one or more medical images corresponding to the target operation event on the display in whole or in part based on the target operation event selected by the event interaction operation.
13. The method of claim 1, 2 or 4, further comprising: displaying the medical image in the form of a partial thumbnail or a whole image in the display area associated with each tissue area corresponding to the first attribute mark on the subject map;
receiving a fifth interactive operation instruction from a user, determining a display area to be viewed on the display based on the fifth interactive operation instruction, and displaying shooting information of the medical image in the display area to be viewed in an associated display area on the display different from that display area.
14. The method of claim 1, 2 or 4, further comprising:
acquiring medical information other than the medical images, wherein the medical information comprises text information and audio/video information that are manually input by medical personnel or derived from one or more medical devices; the medical information is provided with a first attribute mark and a second attribute mark, the first attribute mark is used for representing the corresponding relation between the medical information and the examined biological tissue part, and the second attribute mark is used for determining the presentation content and/or the presentation sequence of the medical information;
according to the first attribute mark, associating the medical information having the same first attribute mark with a tissue area corresponding to the first attribute mark on a subject map, wherein the subject map is a two-dimensional model map or a three-dimensional model map for representing a biological tissue part;
on the display,
displaying the subject map, associating part or all of the medical information with the subject map, and presenting part or all of the plurality of medical images according to the second attribute marks; and
by the processor,
receiving a first interactive operation instruction from a user; the first interactive operation instruction comprises the determined first attribute mark and/or second attribute mark information;
and displaying the medical image and/or medical information related to the determined first attribute mark and/or second attribute mark on the display in a preset mode according to the first interactive operation instruction.
15. An apparatus for viewing medical images, the apparatus comprising a processor, characterized in that the processor is configured to perform the method for viewing medical images according to any of claims 1-14.
16. An image workstation, usable for viewing medical images, comprising a communication interface and a processor,
the communication interface is configured to receive each medical image acquired by a medical imaging device and a first attribute mark and/or a second attribute mark of each medical image, wherein the first attribute mark is used for representing the corresponding relation between the medical image and a biological tissue part, and the second attribute mark is used for determining the presentation mode of the medical image; and
the processor is configured to perform the method for viewing medical images according to any of claims 1-14.
17. A non-transitory computer-readable storage medium storing a program that causes a processor to execute the method for viewing medical images according to any one of claims 1-14.
CN202210381770.XA 2021-08-16 2022-04-12 Method and related equipment for viewing medical images Pending CN115910289A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2021109379225 2021-08-16
CN202110937922 2021-08-16

Publications (1)

Publication Number Publication Date
CN115910289A true CN115910289A (en) 2023-04-04

Family

ID=86494028

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210381770.XA Pending CN115910289A (en) 2021-08-16 2022-04-12 Method and related equipment for viewing medical images

Country Status (1)

Country Link
CN (1) CN115910289A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination