CN115576428A - Imaging device and interaction method thereof

Imaging device and interaction method thereof

Info

Publication number
CN115576428A
Authority
CN
China
Prior art keywords
image
projection
target
target object
determining
Prior art date
Legal status
Pending
Application number
CN202211324045.5A
Other languages
Chinese (zh)
Inventor
郭一鸣
Current Assignee
Wuhan Zhongke Medical Technology Industrial Technology Research Institute Co Ltd
Original Assignee
Wuhan Zhongke Medical Technology Industrial Technology Research Institute Co Ltd
Priority date
Filing date
Publication date
Application filed by Wuhan Zhongke Medical Technology Industrial Technology Research Institute Co Ltd
Priority to CN202211324045.5A
Publication of CN115576428A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B 5/055 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/02 Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B 6/03 Computed tomography [CT]
    • A61B 6/032 Transmission computed tomography [CT]
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/46 Arrangements for interfacing with the operator or the patient
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H 40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/12 Picture reproducers
    • H04N 9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3179 Video signal processing therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Theoretical Computer Science (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • General Physics & Mathematics (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • General Business, Economics & Management (AREA)
  • Business, Economics & Management (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Pulmonology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of this specification provide an imaging device and an interaction method for the imaging device. The method comprises the following steps: acquiring an image containing a target object through an image acquisition device; determining projection content based on the image; and controlling at least two projection devices to project onto a target area of the imaging device based on the projection content.

Description

Imaging device and interaction method thereof
Technical Field
The present disclosure relates to the field of medical devices, and more particularly, to an imaging device and an interaction method thereof.
Background
An aperture-type imaging device, such as a magnetic resonance imaging (MRI) device or a computed tomography (CT) device, needs to move the patient into an examination aperture during examination or treatment. Patients with claustrophobia may experience chest tightness, shortness of breath, palpitations, and other symptoms; some children may find it difficult to lie still and quiet for a long period of time.
It is therefore desirable to provide an imaging device that allows the patient to interact immersively within the aperture, creating a comfortable and pleasant examination environment for the patient.
Disclosure of Invention
One aspect of the present description provides an interaction method for an imaging device comprising at least two projection devices. The method comprises the following steps: acquiring an image containing a target object through an image acquisition device; determining projection content based on the image; and controlling the at least two projection devices to project onto a target area of the imaging device based on the projection content.
In some embodiments, the target area is located on an inner wall of an examination aperture of the imaging device. The determining projection content based on the image includes: determining feature information of the target object based on the image; and determining an interaction mode based on the feature information, wherein the interaction mode comprises a simple mode and a complex mode. The controlling the at least two projection devices to project onto the target area of the imaging device based on the projection content includes: controlling the at least two projection devices to project a corresponding target interaction interface in the target area according to the determined interaction mode.
In some embodiments, said determining projection content based on said image comprises: determining feature information of the target object based on the image; and determining target projection content based on the feature information, wherein the target projection content comprises at least one of a video, a game, or a picture.
In some embodiments, said determining projected content based on said image comprises: determining eyeball information of the target object based on the image; determining an operation instruction of the target object based on the eyeball information; and determining target projection content based on the operation instruction.
In some embodiments, the method further comprises: and switching the projection content of the target area based on the operation instruction.
In some embodiments, the method further comprises: determining position information of the target object based on the image; determining a location of the target area based on the location information.
In some embodiments, the method further comprises: and correcting the projection images of the at least two projection devices in the target area based on the position of the target area.
Another aspect of the present specification provides an imaging device comprising: a scanner configured to acquire scan data of a target object; an image acquisition device configured to acquire an image containing the target object; a control module configured to determine projection content based on the image; and at least two projection devices configured to project in a target area based on the projection content, the target area being located on an inner wall of an aperture of the scanner.
In some embodiments, the image acquisition device is mounted directly above the scanner, and/or the at least two projection devices are movably mounted on the scanner.
Another aspect of the present specification provides a computer-readable storage medium storing computer instructions which, when executed by a computer, cause the computer to perform the interaction method described above.
Drawings
The present description is further explained by way of exemplary embodiments, which are described in detail with reference to the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like numerals refer to like structures, wherein:
FIG. 1 is a schematic diagram of an application scenario of an exemplary imaging device shown in some embodiments herein;
FIG. 2 is a schematic block diagram of an exemplary interaction system, shown in accordance with some embodiments of the present description;
FIG. 3 is a flow diagram illustrating an exemplary interaction method for an imaging device in accordance with some embodiments of the present description;
FIG. 4 is a schematic diagram of an exemplary interaction interface, shown in accordance with some embodiments of the present description;
FIG. 5 is a schematic illustration of an exemplary projection region shown in accordance with some embodiments of the present description;
FIG. 6 is a schematic diagram of an exemplary projected picture shown in accordance with some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the description of the embodiments will be briefly described below. It is obvious that the drawings in the following description are only examples or embodiments of the present description, and that for a person skilled in the art, the present description can also be applied to other similar scenarios on the basis of these drawings without inventive effort. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "apparatus", "unit" and/or "module" as used herein is a way of distinguishing different components, elements, parts, portions or assemblies at different levels. However, these words may be replaced by other expressions if the other expressions accomplish the same purpose.
Generally, the words "module," "unit," or "block" as used herein refer to logic embodied in hardware or firmware, or to a collection of software instructions. The modules, units, or blocks described herein may be implemented as software and/or hardware and may be stored in any type of non-transitory computer-readable medium or another storage device. In some embodiments, software modules/units/blocks may be compiled and linked into an executable program. It should be understood that software modules may be invoked from other modules/units/blocks or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules/units/blocks configured for execution on a computing device may be provided on a computer-readable medium (e.g., a compact disc, a digital video disc, a flash drive, a magnetic disk, or any other tangible medium) or as a digital download (which may be initially stored in a compressed or installable format requiring installation, decompression, or decryption prior to execution). The software code herein may be stored in part or in whole in a memory device of the computing device performing the operations and employed in the operations of that computing device. Software instructions may be embedded in firmware, such as an EPROM. It should also be understood that hardware modules/units/blocks may be included in connected logic components, such as gates and flip-flops, and/or may include programmable units, such as programmable gate arrays or processors. The modules/units/blocks or computing device functions described herein may be implemented as software modules/units/blocks, but may also be represented in hardware or firmware. Generally, the modules/units/blocks described herein refer to logical modules/units/blocks, which may be combined with other modules/units/blocks or divided into sub-modules/sub-units/sub-blocks, regardless of their physical organization or storage. The description may apply to a system, an engine, or a portion thereof.
It will be understood that when an element, engine, module or block is referred to as being "on," "connected to" or "coupled to" another element, engine, module or block, it can be directly on, connected or coupled to or in communication with the other element, engine, module or block, or intervening elements, engines, modules or blocks may be present, unless the context clearly dictates otherwise. In this specification, the term "and/or" may include any one or more of the associated listed items or combinations thereof. In this specification, the term "image" may refer to a 2D image, a 3D image, or a 4D image.
These and other features and characteristics of the present specification, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description of the drawings, all of which form a part of this specification. It is to be understood, however, that the drawings are designed solely for the purposes of illustration and description and are not intended as a definition of the limits of the specification. It should be understood that the figures are not drawn to scale.
As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps or elements are included and do not constitute an exclusive list; the method or apparatus may also include other steps or elements.
Flow charts are used in this description to illustrate operations performed by systems according to embodiments of the present description, and the related description is provided to facilitate a better understanding of magnetic resonance imaging methods and/or systems. It should be understood that the preceding or following operations are not necessarily performed in the exact order shown. Rather, the various steps may be processed in reverse order or simultaneously. Meanwhile, other operations may be added to the processes, or a certain step or several steps may be removed from the processes.
MRI and CT imaging devices are important examination tools in modern medicine for examining, treating, and diagnosing patients' diseases. The examination time of a CT device is generally 3 to 15 minutes, the examination time of an MRI device is generally 15 to 30 minutes, and the patient needs to keep stable vital signs during the examination. However, the examination environment of MRI and CT devices is harsh for the patient, who is required to lie flat within a narrow and cramped cylindrical examination aperture and to endure loud machine operating noise. Patients with claustrophobia may experience chest tightness, shortness of breath, palpitations, and other symptoms; some children may find it difficult to lie still and quiet for a long period of time. For such patients, the current hospital practice is to administer sedatives, which can have unpredictable side effects.
Embodiments of this specification provide an interaction method for an aperture-type imaging device such as an MRI device or a CT device. An interaction interface is projected, or video content is played, inside the examination aperture by projection devices, and information such as the patient's eye movement is captured by an image acquisition device to enable interaction, so that a comfortable and pleasant examination environment can be created for the patient.
FIG. 1 is a schematic diagram of an application scenario of an exemplary imaging device, shown in some embodiments herein.
As shown in fig. 1, in some embodiments, the scene 100 may include an imaging device 110, a processing device 120, a terminal device 130, a storage device 140, and a network 150. In some embodiments, the various components in the scene 100 may be connected to one another via the network 150 or may be directly connected without the network 150. For example, the imaging device 110 and the terminal device 130 may be connected through the network 150. As another example, the imaging device 110 and the processing device 120 may be connected through the network 150 or directly connected. For another example, the processing device 120 and the terminal device 130 may be connected through the network 150 or directly connected.
The imaging device 110 may be used to scan a target object or portion thereof located within its detection region (e.g., the examination aperture 101) and generate an image (e.g., a scanned image) related to the target object or portion thereof. In some embodiments, the target object may be biological or non-biological. For example, the target object may include a patient, a man-made object, and the like. In some embodiments, the target object may include a particular part of the body, such as the head, chest, abdomen, etc., or any combination thereof. In some embodiments, the target object may include a specific organ, such as a heart, esophagus, trachea, bronchi, stomach, gall bladder, small intestine, colon, bladder, ureter, uterus, fallopian tube, etc., or any combination thereof. In some embodiments, the target object may include a region of interest (ROI), such as a tumor, nodule, or the like.
In some embodiments, the imaging device 110 may include a digital subtraction angiography (DSA) device, a digital radiography (DR) device, a computed radiography (CR) device, a digital fluoroscopy (DF) device, a CT device, a magnetic resonance imaging device, a mammography X-ray machine, a C-arm device, and the like.
In some embodiments, the imaging device 110 may include an examination aperture, a couch, at least two projection apparatuses (e.g., projection apparatuses 104 and 105), an image acquisition apparatus, and the like. For more details, reference may be made to fig. 2 and its associated description, which are not repeated herein.
Processing device 120 may process data and/or information obtained from imaging device 110, terminal device 130, and/or storage device 140. For example, the processing device 120 may analyze and process an image containing the target object obtained by the image acquisition device, determine projection content, and control the projection device to project on the target area of the imaging device 110 according to the projection content. In some embodiments, the processing device 120 may be a single server or a group of servers. The server groups may be centralized or distributed. In some embodiments, the processing device 120 may be local or remote. For example, processing device 120 may access information and/or data from imaging device 110, terminal device 130, and/or storage device 140 via network 150. As another example, processing device 120 may be directly connected to imaging device 110, terminal device 130, and/or storage device 140 to access information and/or data. In some embodiments, the processing device 120 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, and the like, or any combination thereof.
The terminal device 130 may include a mobile device 131, a tablet computer 132, a laptop computer 133, and the like, or any combination thereof. In some embodiments, the terminal device 130 may interact with other components in the scene 100 through the network 150. For example, the terminal device 130 may send one or more control instructions to the imaging device 110 through the network 150 to control the imaging device 110 to scan a target object or to switch the projection content according to the instructions. As another example, the terminal device 130 may also receive, through the network 150, a scanned image of the target object scanned by the imaging device 110. In some embodiments, the mobile device 131 may include smart home devices, wearable devices, mobile devices, virtual reality devices, augmented reality devices, and the like, or any combination thereof.
In some embodiments, the terminal device 130 may be part of the processing device 120. In some embodiments, the terminal device 130 may be integrated with the processing device 120 as an operating console for the imaging device 110. For example, a user/operator (e.g., a doctor or nurse) of the scene 100 may control the operation of the imaging device 110 through the console, e.g., scanning a target object, switching projection contents, and the like. In some embodiments, the processing device 120 may be integrated in the imaging device 110. For example, the processing device 120 may be part of the imaging device 110 for determining projected content based on the image and controlling at least two projection apparatuses to project the corresponding content on the target area.
Storage device 140 may store data (e.g., different modes of interactive interfaces, music, videos, games, pictures, etc.), instructions, and/or any other information. In some embodiments, storage device 140 may store data obtained from imaging device 110, processing device 120, and/or terminal device 130. For example, the storage device 140 may store a scan image of a target object acquired from the imaging device 110, or the like. In some embodiments, storage device 140 may store data and/or instructions that processing device 120 may execute or use to perform the exemplary methods described herein.
In some embodiments, storage device 140 may include one or a combination of mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like. In some embodiments, the storage device 140 may be implemented by a cloud platform as described herein.
In some embodiments, the storage device 140 may be connected to a network 150 to enable communication with one or more components (e.g., processing device 120, terminal device 130) in the scenario 100. One or more components in the scenario 100 may read data or instructions in the storage device 140 over the network 150. In some embodiments, the storage device 140 may be part of the imaging device 110 or the processing device 120, or may be separate and directly or indirectly connected to the imaging device 110 or the processing device 120.
The network 150 may include any suitable network capable of facilitating the exchange of information and/or data for the scenario 100. In some embodiments, one or more components of the scene 100 (e.g., imaging device 110, processing device 120, terminal device 130, storage device 140) may exchange information and/or data with one or more components of the scene 100 over the network 150. In some embodiments, the network 150 may include one or a combination of a public network (e.g., the internet), a private network (e.g., a Local Area Network (LAN), a Wide Area Network (WAN)), etc.), a wired network (e.g., ethernet), a wireless network (e.g., an 802.11 network, a wireless Wi-Fi network, etc.), a cellular network (e.g., a Long Term Evolution (LTE) network), a frame relay network, a Virtual Private Network (VPN), a satellite network, a telephone network, a router, a hub, a server computer, etc. In some embodiments, network 150 may include one or more network access points. For example, the network 150 may include wired and/or wireless network access points, such as base stations and/or internet exchange points, through which one or more components of the scenario 100 may connect to the network 150 to exchange data and/or information.
It should be noted that the foregoing description is provided for illustrative purposes only, and is not intended to limit the scope of the present description. Many variations and modifications may be made by one of ordinary skill in the art in light of the teachings of this specification. The features, structures, methods, and other features of the exemplary embodiments described herein may be combined in various ways to obtain additional and/or alternative exemplary embodiments. For example, an image acquisition device may also be included in the scene 100 for acquiring images within the scan room. However, such changes and modifications do not depart from the scope of the present specification.
FIG. 2 is a block diagram of an exemplary interaction system, shown in accordance with some embodiments of the present description.
As shown in fig. 2, in some embodiments, the interactive system 200 may include the imaging device 110 and a control module 210. The imaging apparatus 110 may include, among other things, an examination aperture 101, a couch 102, at least two projection devices (e.g., projection devices 104, 105), and an image acquisition device 106. The control module 210 may be of the same or similar construction as the processing device 120.
The inspection aperture 101 corresponds to the detection area of the imaging device 110. When a target object (e.g., a patient or a portion of a patient) is positioned within the examination aperture 101, the imaging device 110 may scan the target object to acquire a scanned image of the target object.
The couch 102 may be used to position a target object for scanning. For example, the target subject may lie on the back, on the side, or prone on the examination couch 102. In some embodiments, the couch 102 may be movably disposed in front of the imaging device 110 and parallel to the floor. For example, the examination couch 102 may be moved in and out of the examination aperture 101. For another example, the couch 102 may also be moved in a vertical direction to adjust the distance between the target object on the couch and the scan center when entering the examination aperture 101, so as to scan the target object within the detection range. In some embodiments, the couch 102 may be a separate device from the imaging device 110.
By way of example only, when the target object 103 requires a medical examination, the target object 103 may lie flat on the examination couch 102. The couch 102 is then moved to bring the target object 103 into the examination aperture 101, and the position of the couch 102 is adjusted so that the target object 103 is located at the scan center. Further, the radiation scanning source of the imaging device 110 may emit a radiation beam (e.g., X-rays) to the target object 103; the beam is attenuated by the target object 103 and then detected, thereby generating an image signal of the target object 103.
The projection devices 104 and/or 105 may be used to project onto the target area. The target area may be located on an inner wall of the examination aperture 101. For example, the target area 107 is shown in FIG. 2 as being located on the upper part of the inner wall of the examination aperture 101, where "upper" refers to the side facing away from the floor. In some embodiments, the depth and width of the target area 107 may be set by a user or determined automatically from patient information. Here, depth refers to a distance along the axial direction of the examination aperture 101 (e.g., the direction of arrow B in the figure), and width refers to a distance along the circumferential direction of the examination aperture 101 (e.g., the direction of arrow A in the figure). For example, the target area 107 may have a depth of 20 centimeters and a width of two-thirds of the circumferential length of the examination aperture 101. As another example, the imaging device 110 may adjust the size and/or position of the target area 107 in real time based on the shape and/or position of the target object within the examination aperture 101.
In some embodiments, the projection content of the target area may include, but is not limited to, videos, images, games, and the like. In some embodiments, the projection devices 104 and 105 may project an interactive interface (such as the interface shown in fig. 4) in the target area 107 and determine the target projection content and/or switch the projection content in real time according to user instructions. In some embodiments, the projection images of the projection devices 104 and 105 may be stitched together into a complete image. For example, as shown in fig. 5, the region 503 corresponds to the projection range of the projection device 104, the region 501 corresponds to the projection range of the projection device 105, and the regions 501 and 503 together form the target area 107. After the target projection content or the target interactive interface is determined, the projection image of the projection device 104 in the region 503 and the projection image of the projection device 105 in the region 501 may be stitched together into the complete target projection content or target interactive interface.
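By way of illustration only (not part of the claimed method), the following Python sketch shows one way such a split could be arranged: a single content frame covering the target area 107 is divided into two overlapping sub-frames, one per projection device. The function name, the 20% overlap value, and the frame size are assumptions for the example.

    import numpy as np

    def split_for_two_projectors(frame, overlap_ratio=0.2):
        # frame: H x W x 3 image covering the whole target area 107.
        # overlap_ratio: assumed fraction of the width shared by both
        # projectors, reserved for blending the seam later.
        h, w = frame.shape[:2]
        overlap = int(w * overlap_ratio)
        mid = w // 2
        left_part = frame[:, : mid + overlap // 2]    # e.g. region 501 (projection device 105)
        right_part = frame[:, mid - overlap // 2 :]   # e.g. region 503 (projection device 104)
        return left_part, right_part

    # Example: a 720 x 2400 panorama spanning the target area
    panorama = np.zeros((720, 2400, 3), dtype=np.uint8)
    left, right = split_for_two_projectors(panorama)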
In some embodiments, the projection device 104 or 105 may include a CRT (cathode ray tube) projector, an LCD projector, a DLP (digital light processing) projector, an LCOS (liquid crystal on silicon) projector, or the like. In some embodiments, the projection devices 104 and 105 may be the same type of projector.
In some embodiments, the projection devices 104 and 105 may be mounted on the inner wall of the examination aperture 101. For example, as shown in FIG. 2, the projection devices 104 and 105 may each be fixedly mounted on the inner wall of the examination aperture 101, on the side adjacent to the examination couch 102. As another example, the projection devices 104 and 105 may be mounted on the inner wall of the examination aperture 101 via a rail and moved along the axial direction of the examination aperture 101 (e.g., the direction indicated by arrow B). In some embodiments, the projection devices 104 and 105 may be mounted on the couch 102. For example, the projection devices 104 and 105 may be fixed at the two long sides of the examination couch 102, respectively. As another example, the projection devices 104 and 105 may be mounted at the edge of one side of the couch 102 via a guide structure and moved along that side.
In some embodiments, two or more projection devices may be installed, and their installation positions and/or installation manners may be any combination of the positions and manners described above, which is not limited in this specification. In some embodiments, the projection devices may be separate from the imaging device 110. For example, the projection devices 104 and 105 may be installed on the floor of the room in which the imaging device 110 is located.
The image acquisition device 106 may be used to capture image information. For example, the image acquisition device 106 may capture an image containing all and/or part of the target object. In some embodiments, the image acquisition device 106 may be mounted on the examination aperture 101. For example, the image acquisition device 106 may be fixedly, slidably, or rotatably mounted above the examination aperture 101. In some embodiments, the image acquisition device 106 may be controlled to move based on the position of the target object, so that its capture range covers the target object. For example, the image acquisition device 106 may be controlled to rotate by a certain angle and/or move by a certain distance according to the position of the target object on the examination couch 102, so that the target object can be captured. In some embodiments, the image acquisition device 106 may include a camera (e.g., a digital camera, an analog camera, a depth camera, etc.), a red-green-blue (RGB) sensor, an RGB-depth (RGB-D) sensor, or another device that can capture image data of an object.
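As a minimal sketch of how such repositioning might be computed (all coordinates, names, and the choice of a pan/tilt mount are assumptions, not part of the patent), the pan and tilt angles needed to point the image acquisition device 106 at the target object could be derived from their relative positions:

    import math

    def aim_camera(camera_pos, target_pos):
        # camera_pos, target_pos: (x, y, z) in a common, hypothetical room frame.
        # Returns (pan, tilt) in degrees to point the camera at the target.
        dx = target_pos[0] - camera_pos[0]
        dy = target_pos[1] - camera_pos[1]
        dz = target_pos[2] - camera_pos[2]
        pan = math.degrees(math.atan2(dy, dx))                    # rotation about the vertical axis
        tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))   # elevation above the horizontal
        return pan, tilt

    # e.g. camera above the aperture, patient's head 0.6 m along the couch and 0.2 m lower
    print(aim_camera((0.0, 0.0, 0.35), (0.6, 0.0, -0.2)))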
In some embodiments, the image acquisition device 106 may be separate from the imaging device 110. For example, the image acquisition device 106 may be fixedly mounted on the ceiling, in a corner, or at another location within the scan room. For another example, the image acquisition device 106 may be slidably and/or rotatably mounted on the ceiling or floor of the room in which the imaging device 110 is located, which is not limited in this specification. In some embodiments, two or more image acquisition devices 106 may be mounted, and their mounting locations and manners may be any combination of the locations and manners described above. For example, two image acquisition devices may be installed on the imaging device 110 at positions corresponding to the couch top of the examination couch 102.
In some embodiments, the projection devices 104 and 105, and the image acquisition device 106 may be connected to the control module 210 of the operating room by optical fibers. The control module 210 may determine projection content based on the image containing the target object acquired by the image acquisition device 106 and control the projection devices 104 and 105 to project the corresponding content in the target area 107. For more description of the determination of the projection content, reference may be made to fig. 3, which is not described herein again.
In some embodiments, the control module 210 may also be configured to geometrically correct the projection images of the projection devices 104 and 105 so as to present a normal, undistorted picture in the target area. For example, the correction method may include a three-dimensional reconstruction method, a two-dimensional image transformation method, or the like. Illustratively, as shown in fig. 6, before correction the pictures projected by the projection devices 104 and 105 in the target area 107 appear as shown at 601 and are distorted. The control module 210 performs projection matrix calibration based on the curved surface shape of the target area 107 and then geometrically adjusts the projection image of one projection device with the other of the projection devices 104 and 105 as a reference; the distortion is then substantially eliminated, as shown at 603. The colors of the projection devices 104 and 105 are further corrected so that the brightness in the overlapping area of their projection pictures is blended, and the resulting projection image is as shown at 605.
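A minimal sketch of this kind of correction pipeline is given below, assuming OpenCV is available. It uses plain homographies obtained from a prior calibration step as a stand-in for the full curved-surface calibration described above, and a linear ramp to fuse the brightness of the overlap; the function name and the overlap width are assumptions.

    import numpy as np
    import cv2

    def correct_and_blend(img_a, img_b, H_a, H_b, out_size, overlap_px=200):
        # img_a, img_b: source images for the two projectors.
        # H_a, H_b: 3x3 calibration matrices (here simple homographies; a curved
        # inner wall would in practice need a per-pixel warp map or 3-D reconstruction).
        w, h = out_size
        warped_a = cv2.warpPerspective(img_a, H_a, (w, h))
        warped_b = cv2.warpPerspective(img_b, H_b, (w, h))

        # Linear ramp across the overlap so the doubled brightness is fused away.
        alpha = np.ones((h, w), dtype=np.float32)
        x0 = w // 2 - overlap_px // 2
        alpha[:, x0 : x0 + overlap_px] = np.linspace(1.0, 0.0, overlap_px)
        alpha[:, x0 + overlap_px :] = 0.0
        alpha = alpha[..., None]
        return (warped_a * alpha + warped_b * (1.0 - alpha)).astype(np.uint8)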
In some embodiments, the control module 210 may determine the position information of the target object based on the image containing the target object acquired by the image acquisition device 106, determine the position of the target region 107 based on the position information, and further correct the projection images of the projection devices 104 and 105 in the target region. In some embodiments, the at least two projection devices may be controlled to move based on the position of the target area, so that the projection range thereof can cover the target area. For example, the projection devices 104 and 105 may be controlled to rotate by a certain angle and/or move by a certain distance, respectively, according to the position of the target area.
It should be noted that the above description of the interactive system 200 is for illustrative purposes only and is not intended to limit the scope of the present description. Various modifications and adaptations may occur to those skilled in the art in light of this disclosure. However, such changes and modifications do not depart from the scope of the present specification. For example, the imaging device 110 described above may include one or more additional modules, such as a storage module for data storage, and the like.
FIG. 3 is a flow diagram illustrating an exemplary interaction method for an imaging device in accordance with some embodiments of the present description.
In some embodiments, flow 300 may be performed by the imaging device 110 or the processing device 120. For example, the flow 300 may be implemented as instructions (e.g., an application) and stored, for example, in a memory of the storage device 140 accessible by the imaging device 110 or the processing device 120. The imaging device 110 or the processing device 120 may execute the instructions and, when executing the instructions, may be configured to perform flow 300. The description of flow 300 presented below is illustrative. In some embodiments, the process may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order of the operations of flow 300 shown in FIG. 3 and described below is not limiting.
Step 310, acquiring an image containing the target object by the image acquisition device. In some embodiments, step 310 may be performed by processing device 120.
In some embodiments, the acquired image may contain all or part of the target object. For example, the acquired image may include only the eyes or head of the patient. As another example, the acquired image may include the entire body of the patient. In some embodiments, the acquired images may be one or more.
In some embodiments, images of the target object may be acquired by an image acquisition device (e.g., image acquisition device 106) continuously or intermittently (e.g., periodically) before, during, and/or after a scan of the target object is performed. For example, the image acquisition device 106 may acquire a first image of the target object before scanning and/or a second image of the target object during scanning.
At step 320, projected content is determined based on the image. In some embodiments, step 320 may be performed by processing device 120.
Projected content may refer to content contained in an image projected at a target area. For example, projected content may include various content types of video, games, pictures, etc., and/or specific types of video/games/pictures, etc.
In some embodiments, the projected content may be determined based on information of the target object. For example, the processing device 120 may control the projection devices 104 and 105 to project any one of a video, a game, or a picture based on the patient information. As another example, the processing device 120 may control the projection apparatuses 104 and 105 to project any one type of video, such as a social category, an entertainment category, a current news category, a life category, and the like, based on the patient information.
In some embodiments, feature information of the target object may be determined based on the image, and the target projection content may be determined based on the feature information. The feature information may reflect personal characteristics of the target object. For example, the feature information may include the sex, age, etc. of the patient. Different types of content may be projected for users of different ages or genders. For example only, for patients of a particular age group, videos suitable for that age group may be played according to the feature information of the patients, in consideration of their adaptability and receptivity: for children aged 0 to 14 years, age-appropriate short animated films may be played; for middle-aged and elderly people over 45 years old, relaxing landscape videos may be played, and the like. In some embodiments, a trained neural network model (e.g., a trained convolutional neural network) may be used to determine the feature information of the target object based on the image and to determine the target projection content based on the feature information. For example, an image containing the target object may be input into a trained convolutional neural network, which determines the feature information of the target object by analyzing the image and outputs the corresponding target projection content, such as a landscape video, based on the feature information.
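To make the age-based mapping concrete, the following sketch selects a content category from an estimated age; the age thresholds mirror the examples above, while the default branch, the data structure, and the upstream model that estimates age from the image are assumptions.

    from dataclasses import dataclass

    @dataclass
    class Features:
        age: int       # assumed to be estimated from the image by a trained model
        gender: str

    def select_content(features):
        # Age groups from the examples in the text; other branches are assumed.
        if features.age <= 14:
            return "short_animation"
        if features.age >= 45:
            return "landscape_video"
        return "general_short_video"   # assumed default for other age groups

    print(select_content(Features(age=10, gender="F")))   # -> short_animation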
In some embodiments, the target projection content may be determined based on the patient's case information. For example, for a patient whose scanned part is very painful, the target projection content may be short videos that are easy to become absorbed in, such as stories, entertainment news, or funny clips; for a patient whose scanned part is painless or only mildly painful, the target projection content may be social or educational short videos, and the like. As another example, the target projection content may depend on the scanned part: for a patient whose scanned part is below the chest, the target projection content may be a mini-game; for a patient whose scanned part is the chest or above, the target projection content may be a video.
In some embodiments, the projection content may be selected autonomously by the user before scanning. For example, if the user selects funny videos, the projection devices 104 and 105 are directly controlled to project the stored funny short videos in the target area 107 after the patient enters the examination aperture 101.
In some embodiments, the projection content may also include an interactive interface. The interactive interface may provide various options for the user, so that different pages can be accessed by operating different options. For example, the interactive interface may include option 1 (game), option 2 (video), and option 3 (picture); after the user selects an option, the control module 210 controls the projection devices 104 and 105 to play the corresponding stored content in the target area 107. In some embodiments, the interactive interface may include multiple pages. For example, after the user selects "option 2 (video)" above, a further interactive interface related to the video type may be projected, such as option 21 (landscape), option 22 (food), option 23 (science), option 24 (animals), and so on. When the user selects an option on this page, the corresponding type of video can be projected in the target area by the projection devices.
In some embodiments, the interactive interface may have a plurality of different interaction modes. For example, the interaction modes may include a simple mode and a complex mode. The simple mode may include only one or more buttons of the same operation type, while the complex mode may include a plurality of buttons of different types. By way of example only, as shown in FIG. 4, the simple-mode interactive interface may include only one or more options 420 for providing different types of projection content; the complex-mode interactive interface may include, in addition to the options 420, a return button 410, page-up and page-down buttons 430 and 440, an emergency call button 450, and the like. The return button 410 may return to the home interface or the previous page, an option 420 may enter the next-level menu or a video/image/game play page, the button 430 may page up, the button 440 may page down, and the emergency call button 450 may call the doctor. For example, when the patient feels discomfort, a doctor in the operating room may be called by operating the button 450, a video call may be made with the doctor, or the scan may be suspended by operating the button 450.
In some embodiments, feature information of the target object may be determined based on the image, and the interaction mode may be determined based on the feature information. Different modes of the interactive interface may be projected for users of different ages or genders. Further, the at least two projection devices can be controlled to project the corresponding target interactive interface in the target area according to the determined interaction mode. For example, the simple-mode interactive interface may be projected for users over 60 or under 5 years old, and the complex-mode interactive interface for other users. In some embodiments, the feature information of the target object may be determined by an image recognition algorithm. For example, the feature information of the target object may be determined by inputting the image into a trained deep learning model. In some embodiments, the trained deep learning model can directly determine the feature information of the target object and determine the interaction mode based on the feature information. For example, the input of the model is an image and the output is an interaction mode.
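For example, the age rule above could be implemented as a one-line check (a sketch only; the thresholds come from the example in the preceding paragraph, and the function name is an assumption):

    def select_interaction_mode(age):
        # Simple mode for users over 60 or under 5 years old, complex mode otherwise.
        return "simple" if age > 60 or age < 5 else "complex"

    assert select_interaction_mode(70) == "simple"
    assert select_interaction_mode(30) == "complex"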
In some embodiments, the interaction mode of the interactive interface may be set autonomously by the user prior to scanning. For example, the physician may ask the patient whether to use the simple mode or the complex mode before scanning, and automatically project an interactive interface of the corresponding mode on the target area after the patient enters the examination aperture 101 according to the patient's selection.
In some embodiments, the interaction pattern may be determined based on a scanned region of the patient. For example, for a patient who scans a site other than the head, a complex pattern of interactive interfaces may be projected; for a patient scanning the head, a simple mode of the interactive interface is projected.
In some embodiments, after the interactive interface is entered, the next page or the specific target projection content may further be determined based on action information of the target object. In some embodiments, after the interactive interface is entered, an image containing the target object may be acquired again, and the action information of the target object may be determined based on that image, so as to determine the specific target projection content based on the action information. For example, the interaction mode may be determined based on a first image captured earlier than a second image, and the action information may be determined based on the second image.
In some embodiments, eyeball information of the target object may be determined based on the image, an operation instruction of the target object may be determined based on the eyeball information, and the target projection content may be determined based on the operation instruction.
The eyeball information may include at least one of an eyeball gaze direction, a gaze time, an eyeball motion direction, and the like. In some embodiments, eye information for a target object in one or more images may be determined. For example, the processing device 120 may identify an eye gaze direction of the patient in each image based on a plurality of images continuously acquired by the image acquisition apparatus, and then determine an eye movement direction and/or a gaze time. In some embodiments, eye information in the image may be determined by an image recognition algorithm.
In some embodiments, the operation instruction of the target object may be determined based on the eyeball information in a plurality of images. For example, the processing device 120 may compare the gaze directions of the target object's eyes in adjacent frames to check whether they are consistent, determine the capture times of the two images at which the directions become inconsistent, and thereby determine how long the eyes have gazed in the same direction. If this duration exceeds a preset value (e.g., 3 seconds or 5 seconds), the patient is considered to have selected the button in the gaze direction as the target button, for example, "social videos" or "page down". Further, the content of the next-level page corresponding to the target button may be determined as the target projection content.
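A minimal sketch of this dwell-time selection logic is shown below; the data format (timestamped button identifiers per frame) and the upstream step that maps a gaze direction to a button are assumptions.

    def detect_dwell_selection(gaze_samples, dwell_threshold_s=3.0):
        # gaze_samples: (timestamp_seconds, button_id) pairs, one per captured frame.
        # Returns the button gazed at continuously for at least dwell_threshold_s,
        # or None if no selection has occurred yet.
        current_button, dwell_start = None, None
        for t, button in gaze_samples:
            if button != current_button:
                current_button, dwell_start = button, t
            elif current_button is not None and t - dwell_start >= dwell_threshold_s:
                return current_button        # e.g. "social_video" or "page_down"
        return None

    samples = [(0.0, "page_down"), (1.0, "page_down"), (2.5, "page_down"), (3.2, "page_down")]
    print(detect_dwell_selection(samples))   # -> page_down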
Step 330, controlling at least two projection devices to project on the target area of the imaging device based on the projection content. In some embodiments, step 330 may be performed by the processing device 120.
After determining the projection content, the at least two projection devices may be controlled to project on the target area of the imaging apparatus. For example, the processing device 120 may control at least two projection apparatuses to project the corresponding target interactive interface on the target area according to the determined interaction mode. For another example, the processing device 120 may control at least two projection apparatuses to project on the target area according to the determined target projection content.
In some embodiments, after specific target projection content (e.g., a video, picture, or game) has been projected in the target area, the projection content of the target area may further be switched based on an operation instruction of the target object. For example, when the eye movement direction of the target object is up, down, left, or right, the next video or picture may be projected in the target area 107.
In some embodiments, head motion information of the target object may be determined based on the image, an operation instruction may be determined based on the head motion information, and the projection content of the target region may be switched according to the operation instruction.
In some embodiments, position information of the target object may be determined based on the image, and the position of the target area may be determined based on the position information. For example, the processing device 120 may determine a first coordinate position of the eyes of the target object relative to the examination couch 102 based on the image, then determine a second coordinate position of the eyes relative to the examination aperture 101 based on the position of the couch 102 and the first coordinate position, and determine the position of the target area within the examination aperture 101 from the second coordinate position. For example, the target area 107 may be located where the eyes' straight-ahead line of sight meets the upper inner wall of the examination aperture 101 perpendicularly, or where the perpendicular from the center of the target area deviates from that line of sight by less than a predetermined angle (e.g., 1 degree, 2 degrees, 3 degrees, 5 degrees, etc.), and so on.
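To illustrate this coordinate chain, the sketch below models the examination aperture as an ideal cylinder whose axis runs along the couch travel direction and finds the point on the upper inner wall directly above the eyes; the frame definitions, the cylinder simplification, and the bore radius are all assumptions for the example.

    import numpy as np

    def target_area_center(eye_on_couch, couch_in_aperture, bore_radius):
        # eye_on_couch:      (x, y, z) of the eyes relative to the couch (first coordinate position).
        # couch_in_aperture: (x, y, z) offset of the couch in the aperture frame.
        # bore_radius:       radius of the examination aperture, assumed known.
        # x runs along the bore axis, y laterally, z upward from the bore axis.
        eye = np.asarray(eye_on_couch) + np.asarray(couch_in_aperture)  # second coordinate position
        x, y, _ = eye
        z_wall = np.sqrt(bore_radius**2 - y**2)  # upper inner wall above the eyes
        return np.array([x, y, z_wall])

    # e.g. eyes 0.5 m into the bore, centred laterally, bore radius 0.35 m
    print(target_area_center((0.5, 0.0, 0.1), (0.0, 0.0, 0.0), 0.35))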
Further, the projection images of the at least two projection devices in the target area may be corrected based on the position of the target area. For example, the processing device 120 may geometrically correct the projection images of the projection devices 104 and 105 based on the position of the target area, so that the projection images of the two projection devices in the target area can be stitched into the complete projection content. The geometric correction of the projection images may be performed in any feasible manner, for example, three-dimensional reconstruction, and the like, which is not limited in this specification.
It should be noted that the above description of the process 300 is for illustration and description only and is not intended to limit the scope of the present disclosure. Various modifications and changes to flow 300 will be apparent to those skilled in the art in light of this description. However, such modifications and variations are intended to be within the scope of the present description.
Some embodiments of the present description also provide an imaging device including: a scanner configured to acquire scan data of a target object; an image acquisition device configured to acquire an image containing the target object; a control module configured to determine projection content based on the image; and at least two projection devices configured to project in a target area based on the projection content.
Some embodiments of the present specification also provide a computer-readable storage medium storing computer instructions for performing the interaction method (e.g., the process 300) as described above when executed by a computer.
The beneficial effects that may be brought by the embodiments of the present specification include, but are not limited to: (1) images are projected inside the examination aperture by at least two projection devices, so that the patient can have an immersive experience, the fear of a claustrophobic environment is reduced, and scanning efficiency is improved; (2) an image of the target object is acquired by the image acquisition device, and the interaction mode and/or the projection content is determined based on the image, so that better-suited content can be projected for different patients, improving the patient's experience inside the examination aperture and creating a more comfortable and pleasant examination environment; (3) by mapping the eye-gaze coordinates determined from the image onto the projection image, the patient can interact contactlessly within the examination aperture. It should be noted that different embodiments may produce different advantages, and in different embodiments, the advantages that may be produced may be any one or a combination of the above, or any other advantages that may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be regarded as illustrative only and not as limiting the present specification. Various modifications, improvements and adaptations to the present description may occur to those skilled in the art, though not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present specification and thus fall within the spirit and scope of the exemplary embodiments of the present specification.
Also, this specification uses specific words to describe its embodiments. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the specification. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment," "one embodiment," or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the specification may be combined as appropriate.
Additionally, the order in which the elements and sequences of the process are recited in the specification, the use of alphanumeric characters, or other designations, is not intended to limit the order in which the processes and methods of the specification occur, unless otherwise specified in the claims. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that, in the foregoing description of embodiments of the present specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as implying that the claimed subject matter requires more features than are expressly recited in each claim. Indeed, an embodiment may have fewer than all of the features of a single embodiment disclosed above.
Some embodiments use numbers to describe quantities of components, attributes, and the like; it should be understood that such numbers used in the description of the embodiments are, in some instances, qualified by the modifiers "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the stated number may vary by ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may change depending on the desired properties of the individual embodiments. In some embodiments, numerical parameters should be interpreted in light of the specified significant digits and by applying ordinary rounding techniques. Although the numerical ranges and parameters used to set forth the broad scope of some embodiments of the specification are approximations, in specific embodiments such numerical values are set forth as precisely as practicable.
Each patent, patent application, patent application publication, and other material, such as an article, book, specification, publication, or document, cited in this specification is hereby incorporated by reference in its entirety. Application history documents that are inconsistent with or conflict with the contents of this specification are excluded, as are documents (whether now or later appended to this specification) that limit the broadest scope of the claims of this specification. It should be understood that if the descriptions, definitions, and/or uses of terms in the materials accompanying this specification are inconsistent or in conflict with those set forth in this specification, the descriptions, definitions, and/or uses of terms in this specification shall control.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present specification. Other variations are also possible within the scope of the present specification. Thus, by way of example and not limitation, alternative configurations of the embodiments of the present specification may be regarded as consistent with its teachings. Accordingly, the embodiments of the present specification are not limited to those explicitly described and depicted herein.

Claims (10)

1. An interaction method for an imaging device comprising at least two projection devices, the method comprising:
acquiring an image containing a target object through an image acquisition device;
determining projection content based on the image;
and controlling the at least two projection devices to project onto a target area of the imaging device based on the projection content.
2. The method of claim 1, wherein the target area is located on an inner wall of an examination aperture of the imaging device;
wherein the determining of the projection content based on the image includes:
determining feature information of the target object based on the image;
determining an interaction mode based on the feature information, wherein the interaction mode comprises a simple mode and a complex mode; and
wherein the controlling of the at least two projection devices to project onto the target area of the imaging device based on the projection content includes:
controlling the at least two projection devices to project, in the target area, a target interaction interface corresponding to the determined interaction mode.
3. The method of claim 1, wherein the determining of the projection content based on the image comprises:
determining feature information of the target object based on the image;
and determining target projection content based on the feature information, wherein the target projection content comprises at least one of a video, a game, and a picture.
4. The method of claim 1, wherein the determining of the projection content based on the image comprises:
determining eyeball information of the target object based on the image;
determining an operation instruction of the target object based on the eyeball information;
and determining target projection content based on the operation instruction.
5. The method of claim 4, further comprising:
switching the projection content of the target area based on the operation instruction.
6. The method of claim 1, further comprising:
determining position information of the target object based on the image;
and determining a position of the target area based on the position information.
7. The method of claim 6, further comprising:
correcting the projection images of the at least two projection devices in the target area based on the position of the target area.
8. An imaging device, comprising:
a scanner, configured to acquire scan data of a target object;
an image acquisition device, configured to acquire an image containing the target object;
a control module, configured to determine projection content based on the image;
and at least two projection devices, configured to project in a target area based on the projection content, the target area being located on an inner wall of an aperture of the scanner.
9. The imaging device of claim 8, wherein the image acquisition device is mounted directly above the scanner, and/or the at least two projection devices are movably mounted on the scanner.
10. A computer-readable storage medium storing computer instructions that, when executed by a computer, perform the method of any one of claims 1 to 7.
CN202211324045.5A 2022-10-27 2022-10-27 Imaging device and interaction method thereof Pending CN115576428A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211324045.5A CN115576428A (en) 2022-10-27 2022-10-27 Imaging device and interaction method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211324045.5A CN115576428A (en) 2022-10-27 2022-10-27 Imaging device and interaction method thereof

Publications (1)

Publication Number Publication Date
CN115576428A 2023-01-06

Family

ID=84587180

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211324045.5A Pending CN115576428A (en) 2022-10-27 2022-10-27 Imaging device and interaction method thereof

Country Status (1)

Country Link
CN (1) CN115576428A (en)

Similar Documents

Publication Publication Date Title
CN111374675B (en) System and method for detecting patient state in a medical imaging session
CN111938678B (en) Imaging system and method
US20210104055A1 (en) Systems and methods for object positioning and image-guided surgery
CN108968996B (en) Apparatus, method and storage medium providing motion-gated medical imaging
JP4484462B2 (en) Method and apparatus for positioning a patient in a medical diagnostic or therapeutic device
CN109730704B (en) Method and system for controlling exposure of medical diagnosis and treatment equipment
US9361726B2 (en) Medical image diagnostic apparatus, medical image processing apparatus, and methods therefor
US9734574B2 (en) Image processor, treatment system, and image processing method
JP2004000411A (en) Device, method, and system for displaying animation, device, method, and system for processing the same, program, computer-readable storage medium, and method and system for supporting image diagnosis
CN113647967A (en) Control method, device and system of medical scanning equipment
CN111528879A (en) Method and system for acquiring medical image
KR102373967B1 (en) Computed tomography and localization of anatomical structures required to be imaged
WO2022068941A1 (en) Systems and methods for digital radiography
US11200727B2 (en) Method and system for fusing image data
JP2020171483A (en) X-ray fluoroscopic imaging apparatus
CN115576428A (en) Imaging device and interaction method thereof
US11730440B2 (en) Method for controlling a medical imaging examination of a subject, medical imaging system and computer-readable data storage medium
US20210196402A1 (en) Systems and methods for subject positioning and image-guided surgery
CN114225236A (en) Radiotherapy guiding device, radiotherapy guiding method, electronic equipment and storage medium
CN110693513A (en) Control method, system and storage medium for multi-modal medical system
WO2022028439A1 (en) Medical device control method and system
JP2020168173A (en) Dynamic analysis apparatus, dynamic analysis system, and program
JP2021083961A (en) Medical image processor, medical image processing method, and medical image processing program
JP7310239B2 (en) Image processing device, radiation imaging system and program
US20220079538A1 (en) X-ray dynamic image display apparatus, storage medium, x-ray dynamic image display method, and x-ray dynamic image display system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination