CN114569076A - Positioning method, device and storage medium for near-infrared brain function imaging device - Google Patents


Info

Publication number
CN114569076A
Authority
CN
China
Prior art keywords
positioning; probe; mobile sensor; dimensional; probes
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210192280.5A
Other languages
Chinese (zh)
Inventor
邓皓
汪待发
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Danyang Huichuang Medical Equipment Co ltd
Original Assignee
Danyang Huichuang Medical Equipment Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Danyang Huichuang Medical Equipment Co ltd filed Critical Danyang Huichuang Medical Equipment Co ltd
Priority to CN202210192280.5A priority Critical patent/CN114569076A/en
Publication of CN114569076A publication Critical patent/CN114569076A/en
Priority to PCT/CN2023/079065 priority patent/WO2023165527A1/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0075 Measuring for diagnostic purposes; Identification of persons using light, by spectroscopy, i.e. measuring spectra, e.g. Raman spectroscopy, infrared absorption spectroscopy
    • A61B5/0033 Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A61B5/004 Features or image-related aspects of imaging apparatus classified in A61B5/00, adapted for image acquisition of a particular organ or body part
    • A61B5/0042 Features or image-related aspects of imaging apparatus classified in A61B5/00, adapted for image acquisition of the brain


Abstract

The present disclosure relates to a positioning method, apparatus, and storage medium for a near-infrared brain function imaging device. The imaging device has a head cap to be worn on the head of a subject; the cap carries a plurality of probes for transmitting and/or receiving near-infrared signals, or mounting positions to which such probes can be fitted. The positioning method includes: displaying a three-dimensional brain image on a display interface and acquiring position data of the measurement point of a mobile sensor as determined by a positioning system; determining the distance between the mobile sensor and the head of the subject based on the acquired position data; and, when the distance is smaller than a preset distance, processing the position data of the measurement point of the mobile sensor to obtain the mapping position of the mobile sensor. Because the distance between the mobile sensor and the head is checked first, computing resources are saved and data processing is accelerated.

Description

Positioning method, device and storage medium for near-infrared brain function imaging device
Technical Field
The present disclosure relates to the field of medical devices, and more particularly, to a positioning method, device and storage medium for a near-infrared brain function imaging apparatus.
Background
Functional near-infrared spectroscopy (fNIRS) is an emerging brain function imaging technique. Using multi-channel sensing built from near-infrared light and transmitting/receiving probe pairs, and relying on the neurovascular (neural activity to blood oxygen) coupling mechanism, fNIRS can detect through the skull and image changes in brain-activity activation with high temporal resolution, enabling effective visualization and quantitative evaluation of brain function.
When a near-infrared brain function imaging device is used, the channels formed by the probes on the head cap must be acquired and registered to brain-region positions on the user's head, so that it can be determined which brain region's physiological state the acquired near-infrared data represents. Existing positioning software offers few functions, is cumbersome to operate, is inconvenient to use, and gives a poor user experience. In addition, existing positioning equipment streams data to the computer continuously, and the computer processes all of the received data, consuming a large amount of computing resources.
Disclosure of Invention
The present disclosure is provided to solve the above-mentioned problems occurring in the prior art.
There is a need for a positioning method, apparatus, and medium for a near-infrared brain function imaging device that can display a three-dimensional brain image on a display interface, perform mapping of the measurement position only when the positional relationship between the mobile sensor and the head of the subject matches the measurement scenario, and, after mapping, mark the mapped position of the mobile sensor on the three-dimensional brain image in real time. This noticeably reduces the computational load, makes the positioning process more intuitive, significantly improves the positioning efficiency of the near-infrared brain function imaging device, and better guides the user.
According to a first aspect of the present disclosure, there is provided a positioning method for a near-infrared brain function imaging device having a head cap to be worn on the head of a subject and carrying a plurality of probes for transmitting and/or receiving near-infrared signals. The positioning method includes: displaying a three-dimensional brain image on a display interface; acquiring position data of the measurement point of a mobile sensor as determined by a positioning system; determining the distance between the mobile sensor and the head of the subject based on the acquired position data; when the distance is smaller than a preset distance, processing the position data of the measurement point of the mobile sensor to obtain the mapping position of the mobile sensor; and marking that mapping position on the three-dimensional brain image.
According to a second aspect of the present disclosure, there is provided a positioning apparatus for a near-infrared brain function imaging device, the positioning apparatus comprising a first positioning component and a first processor, wherein the first positioning component is configured to: positioning each probe on a head cap of the near-infrared brain function imaging device to determine the measurement position of each probe; the first processor is configured to: the positioning method for the near-infrared brain function imaging device of the various embodiments of the present disclosure is performed.
According to a third aspect of the present disclosure, there is provided a near-infrared brain function imaging system including: a headgear configured to be worn on a head of a subject and having a plurality of probes for transmitting and/or receiving near-infrared signals or mounting locations to which each of the probes can be fitted; a second positioning assembly configured to: positioning each probe on the headgear to determine a measurement position of each probe; a second processor configured to: the positioning method for the near-infrared brain function imaging device of the various embodiments of the present disclosure is performed.
According to a fourth aspect of the present disclosure, there is provided a non-transitory computer readable storage medium storing a program for causing a processor to execute the steps of the positioning method for a near-infrared brain function imaging apparatus according to various embodiments of the present disclosure.
With the positioning method, positioning apparatus, storage medium, and near-infrared brain function imaging system according to the embodiments of the present disclosure, the distance between the mobile sensor and the head of the subject is determined first during positioning, and the position data of the measurement point of the mobile sensor is processed to obtain its mapping position only when that distance is smaller than a preset distance; the mapping position is then marked on the three-dimensional brain image. This saves computing resources and speeds up data processing. The marked mapping position also serves as positioning navigation: the user can visually check the position of the mobile sensor in real time, making the positioning process more intuitive and significantly improving the positioning efficiency of the near-infrared brain function imaging device.
Drawings
In the drawings, which are not necessarily drawn to scale, like reference numerals may describe similar components in different views. The drawings illustrate various embodiments generally by way of example and not by way of limitation, and together with the description and claims serve to explain the disclosed embodiments. The same reference numbers will be used throughout the drawings to refer to the same or like parts, where appropriate. Such embodiments are illustrative, and are not intended to be exhaustive or exclusive embodiments of the present apparatus or method.
Fig. 1 shows a schematic diagram of a positioning apparatus in cooperation with a near-infrared brain function imaging device according to an embodiment of the present disclosure.
Fig. 2 shows a flowchart of a first example of a localization method for a near-infrared brain function imaging device according to an embodiment of the present disclosure.
Fig. 3(a) shows a schematic diagram of a display interface for a positioning method of a near-infrared brain function imaging device according to an embodiment of the present disclosure.
Fig. 3(b) shows a schematic diagram of a display interface for a positioning method of a near-infrared brain function imaging device according to an embodiment of the present disclosure.
Fig. 4 shows a flowchart of a second example of a localization method for a near-infrared brain function imaging device according to an embodiment of the present disclosure.
Fig. 5 shows a flowchart of a third example of a localization method for a near-infrared brain function imaging device according to an embodiment of the present disclosure.
Fig. 6 shows a schematic diagram of a display of measurement positions of a probe on a three-dimensional brain image via a channel layout grid formed by incomplete mapping according to an embodiment of the present disclosure.
Fig. 7 shows a schematic diagram of a two-dimensional relationship between probes for a positioning method of a near-infrared brain function imaging device according to an embodiment of the present disclosure.
Detailed Description
For a better understanding of the technical aspects of the present disclosure, reference is made to the following detailed description taken in conjunction with the accompanying drawings. Embodiments of the present disclosure are described in further detail below with reference to the figures; the present disclosure is not limited to these embodiments. Where the steps described herein do not depend on one another, the order in which they are described is illustrative and should not be construed as limiting; those skilled in the art will recognize that the order may be adjusted without breaking the logical relationships among the steps or rendering the overall process impracticable.
The use of "first," "second," and similar terms in this disclosure does not indicate any order, quantity, or importance; such terms are used only for distinction. The word "comprising" or "comprises" means that the element preceding the word includes the elements listed after it, without excluding the possibility that other elements are also included.
Embodiments of the present disclosure provide a positioning method for a near-infrared brain function imaging device. Note that the positioning method may be implemented via a positioning device that may operate in cooperation with a near-infrared brain function imaging apparatus.
Fig. 1 shows a schematic diagram of a positioning apparatus cooperating with a near-infrared brain function imaging device according to an embodiment of the present disclosure. Fig. 1 does not show the full configuration of the near-infrared brain function imaging apparatus 100, only the components related to positioning. The apparatus 100 has at least a head cap 101 intended to be worn on the head of a subject 107. For example, the head cap 101 may carry a plurality of probes 108 for transmitting and/or receiving near-infrared signals. As another example, the head cap 101 may provide a plurality of mounting locations for removably mounting the probes 108; in use, the probes 108 are fitted to the head cap 101 via these mounting locations. Each of the probes 108 may be configured as a transmitting probe (S) or a receiving probe (D); each paired S-D combination forms a channel, represented by the line segment connecting the two. In some embodiments, one transmitting probe may correspond to multiple receiving probes, or vice versa; the pairing relationship depends on the deployment positions of the probes, the brain functional region to be detected, and other specific requirements.
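As a concrete illustration of S-D pairing, the following minimal Python sketch pairs transmitting and receiving probes into channels by a separation threshold. The threshold value, the probe labels, and the use of the segment midpoint as the channel site are illustrative assumptions, not details taken from this disclosure:

```python
import math

def form_channels(sources, detectors, max_sep=35.0):
    """Pair transmitting (S) and receiving (D) probes into channels.

    sources/detectors: dicts mapping probe label -> (x, y, z) in mm.
    A channel is formed for each S-D pair whose separation is within
    max_sep (a hypothetical threshold; real pairings follow the cap
    layout and the brain regions to be measured).
    Returns (s_label, d_label, midpoint) tuples; the midpoint of the
    connecting segment is a common proxy for the channel site.
    """
    channels = []
    for s, (sx, sy, sz) in sources.items():
        for d, (dx, dy, dz) in detectors.items():
            if math.dist((sx, sy, sz), (dx, dy, dz)) <= max_sep:
                mid = ((sx + dx) / 2, (sy + dy) / 2, (sz + dz) / 2)
                channels.append((s, d, mid))
    return channels

# One source pairing with two of three detectors (the third is too far).
example = form_channels({"S1": (0.0, 0.0, 0.0)},
                        {"D1": (30.0, 0.0, 0.0),
                         "D6": (0.0, 30.0, 0.0),
                         "D9": (80.0, 0.0, 0.0)})
```

Note how a single transmitting probe naturally pairs with several receiving probes under this scheme, matching the one-to-many relationship described above.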
As shown in fig. 1, the positioning device 105 may include a positioning component 104 and a processor 102. The positioning assembly 104 may be configured to position each probe 108 on the head cap 101 of the near-infrared brain function imaging device and determine the measurement position of each probe. Note that "positioning each probe 108 on the head cap 101" may mean directly positioning each probe 108 assembled on the head cap 101 via the positioning assembly 104 (direct positioning), but it need not be performed with the probes assembled: where a mounting site of the head cap 101 has no probe 108 fitted yet, the mounting site itself may be positioned and its measurement position taken as the measurement position of the probe 108 to be fitted there, thereby indirectly positioning the corresponding probe 108 via the mounting site (indirect positioning). For convenience of description, direct positioning is used as the example below.
The processor 102 may be configured to perform a positioning method for a near-infrared brain function imaging device according to various embodiments of the present disclosure. In some embodiments, the positioning device 105 may also include a memory 103 and a display 106. The memory 103 is configured to store a positioning program that causes the processor 102 to execute the flow of the positioning method, together with the data generated and/or required during execution; it may also store the measurement positions of the respective probes determined via the positioning component 104. In some embodiments, the memory 103 may store the measurement positions and/or mapped positions of each probe 108 in association with that probe. Specifically, the memory may store only measurement positions, only mapping positions, or both; the present disclosure does not limit this, as long as historical position information for a probe can be retrieved when the user indicates that an already positioned probe (i.e., one whose measurement and/or mapping positions are already stored) should be repositioned.
The positioning component 104 can be implemented in a variety of ways. For example, as shown in fig. 1, the positioning assembly 104 may include a magnetic source 104b capable of generating orthogonal magnetic fields in three-dimensional space and a mobile sensor (also referred to as a detection pen) 104a; the measurement position of the probe 108 is determined from the magnetic interaction between the two, which is not described in detail here. In use, the magnetic source 104b may be placed on a fixed support, and the detection pen 104a is moved to position each probe 108 on the head cap 101. When the measurement position of a probe 108 is determined, a key on the detection pen 104a is pressed and the positioning assembly 104 sends the measurement position data of that probe 108 to the processor 102 for processing. This is merely an example; the present disclosure does not specifically limit the implementation of the positioning assembly 104. For convenience, the following description of the positioning method assumes the positioning assembly 104 configured as shown in fig. 1.
In other embodiments, the user may also perform various interaction operations of positioning through other interaction means (not shown), such as touch screen buttons, a mouse, a keyboard, a trackball, a gesture sensing means, and the like, and the interaction operations may be designated operations of clicking, stopping, and the like.
In some embodiments, the display 106 may be configured to display a three-dimensional brain image on its display interface under the control of the processor 102. The three-dimensional brain image is constructed from a three-dimensional brain model, which may be obtained from medical image data of the subject's head (for example, a brain MRI of the subject) or may use existing brain atlas data such as the ICBM152 atlas; the present disclosure does not specifically limit this. In some embodiments, the display 106 may use an LED or OLED panel and the like, which are not described here.
In some embodiments, the processor 102 may be a processing device including one or more general-purpose processing devices, such as a microprocessor, central processing unit (CPU), or graphics processing unit (GPU). More specifically, the processor may be a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a processor running other instruction sets, or a processor running a combination of instruction sets. The processor may also be one or more special-purpose processing devices such as an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a digital signal processor (DSP), or a system on a chip (SoC). The processor 102 may be configured to perform the positioning method for a near-infrared brain function imaging device according to various embodiments of the present disclosure.
A positioning method for a near-infrared brain function imaging device according to an embodiment of the present disclosure will be described in detail below with reference to fig. 2. Fig. 2 shows a flowchart of a first example of a localization method for a near-infrared brain function imaging device according to an embodiment of the present disclosure.
As shown in fig. 2, when the user wants to position the near-infrared brain function imaging device, a three-dimensional brain image may be displayed on the display interface (step 201); as mentioned above, the three-dimensional brain image is constructed from a three-dimensional brain model. In some embodiments, before positioning starts, the three-dimensional brain image may face in any direction, or in a default direction predefined by the user or the system; preferably, the forehead of the three-dimensional brain image faces the user, corresponding to the forehead direction of a subject properly wearing the head cap, for easy viewing.
Position data of the measurement point of the mobile sensor, as determined by the positioning system, may be acquired (step 202). Note that the arrow directions in fig. 2 do not limit the steps to the order shown; the steps may be executed in a different order as long as there is no logical conflict. For example, displaying the three-dimensional brain image (step 201) may be performed simultaneously with, after, or before step 202; the present disclosure does not specifically limit this.
Specifically, the user moves the detection pen 104a to position each probe 108 on the head cap 101, and the distance between the mobile sensor (i.e., the detection pen 104a) and the head of the subject is determined based on the acquired position data of the measurement point of the mobile sensor (step 203). The processor 102 performs preliminary processing on the position data of the measurement point to obtain this distance. For example, the position of the subject's head may be set in advance, and the distance obtained by comparing the measurement-point position data determined by the positioning system with that predetermined head position. As another example, position data of at least one reference point on the head may first be acquired with the mobile sensor, and the distance obtained by comparing the measurement point of the mobile sensor with the reference-point position data, or with representative data derived from the position data of the reference points; the present disclosure does not specifically limit this.
Next, when the distance is smaller than the preset distance, the position data of the measurement point of the mobile sensor is processed to obtain the mapping position of the mobile sensor (step 204). When the distance is judged to be smaller than the preset distance, the detection pen is considered to be measuring, or about to measure, a probe on the head cap, and the processor computes and processes the acquired position data of the pen's measurement point in real time. When the distance is equal to or greater than the preset distance, it is assumed that the user does not intend to perform positioning with the detection pen; the position data of the measurement point is not processed, and no mapping-position mark of the mobile sensor is displayed on the three-dimensional brain image. Deciding whether to process the measurement-point position data by judging the distance between the mobile sensor and the head of the subject saves computing resources and speeds up data processing.
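The distance-gating logic of steps 203 and 204 can be sketched as follows. The centroid-based head reference, the 50 mm threshold, and all function names are assumptions chosen for illustration, not values from this disclosure:

```python
import math

PRESET_DISTANCE = 50.0  # mm; hypothetical gating threshold

def head_reference(ref_points):
    """Representative head position: centroid of reference points
    (e.g. points on the head touched with the detection pen)."""
    n = len(ref_points)
    return tuple(sum(p[i] for p in ref_points) / n for i in range(3))

def maybe_process(pen_point, head_center, mapper):
    """Gate the expensive mapping step on pen-to-head distance.

    Only when the pen is closer than PRESET_DISTANCE is the mapping
    function invoked; otherwise the sample is dropped, saving compute.
    Returns the mapped position, or None when the sample was skipped.
    """
    if math.dist(pen_point, head_center) < PRESET_DISTANCE:
        return mapper(pen_point)
    return None
```

With an identity mapper, a pen sample near the head centroid is processed while one far away returns None without any mapping work being done.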
Finally, the mapping position of the mobile sensor is marked on the three-dimensional brain image (step 205). Displaying the mapping-position mark of the detection pen on the three-dimensional brain image provides positioning navigation, making the positioning process more intuitive: the user can visually check the position of the detection pen on the three-dimensional brain image during positioning. In some embodiments, each brain area may be marked on the three-dimensional brain image, so that the user can see in real time which brain area the pen's mapping position falls on, and where on that brain area, in order to assess the positioning.
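One simple way to realize the brain-area marking described above is a nearest-labeled-vertex lookup on the brain model. The vertex coordinates and region labels below are entirely hypothetical; a real implementation would draw them from the atlas data behind the three-dimensional brain model:

```python
import math

# Hypothetical labeled model vertices: (x, y, z) -> brain-region label.
LABELED_VERTICES = {
    (0.0, 80.0, 40.0): "prefrontal",
    (-60.0, 0.0, 50.0): "left motor",
    (60.0, 0.0, 50.0): "right motor",
    (0.0, -80.0, 40.0): "occipital",
}

def brain_region(mapped_point):
    """Label a mapped pen position with the nearest annotated region,
    so the marker on the 3-D brain image can show in real time which
    brain area the pen currently lies over."""
    return min(LABELED_VERTICES.items(),
               key=lambda kv: math.dist(kv[0], mapped_point))[1]
```

The returned label can then be rendered next to the pen's mark (dot, circle, or dashed frame) on the display interface.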
It is understood that the mapping position of the mobile sensor may be marked in any form, such as a dot, a circle, or a dashed frame (as shown in the right area of fig. 3(b)), as long as the mark lets the user view the positioning situation; the present disclosure does not specifically limit this.
In the present disclosure, whether to process the position data of the measurement point of the mobile sensor is decided by judging the distance between the mobile sensor and the head of the subject, and the data is processed only when that decision is affirmative. This saves computing resources, speeds up data processing, and improves working efficiency.
The above steps describe a process of obtaining a mapping position of the mobile sensor from position data of the measurement point of the mobile sensor, and the following steps describe a process of obtaining a positioning position (a first positioning position and a second positioning position) of the probe from position data of the measurement point of the mobile sensor.
As shown in fig. 4, steps 401, 402, 403, and 404 are the same as steps 201, 202, 203, and 204 in the flowchart of fig. 2, respectively, and are not repeated here. In this example, if the distance in step 404 is less than the preset distance, the position data of the measurement point of the mobile sensor is processed to obtain the mapping position of the mobile sensor. Then, in step 405, in response to a first operation of the mobile sensor by the user, the position data of the measurement point of the mobile sensor may be mapped onto the three-dimensional brain model as the measurement position of the corresponding probe, to determine the mapping position of that probe as its first positioning position (e.g., D12 in fig. 3(a) and D14 in fig. 3(b)). Specifically, in step 405, the obtained measurement position of each probe is completely mapped onto the three-dimensional brain model, and the resulting mapping position of the corresponding probe is taken as its first positioning position.
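The disclosure does not spell out the algorithm behind the complete mapping of step 405; one plausible sketch, under that assumption, snaps each measured probe position independently onto the nearest vertex of the model surface:

```python
import math

def complete_map(measure_point, model_vertices):
    """Sketch of 'complete' mapping: snap one measured probe position
    onto the nearest vertex of the 3-D brain model surface.

    Each probe is mapped on its own, so the mapped positions follow
    the model surface (and the mapped grid may deform relative to the
    measured grid). model_vertices: iterable of (x, y, z) vertices;
    a real model would supply a dense mesh, not this toy list.
    """
    return min(model_vertices, key=lambda v: math.dist(v, measure_point))
```

Applied to every probe's measurement position in turn, this yields the first positioning positions that step 406 marks on the three-dimensional brain image.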
It should be noted that a first operation of the mobile sensor by the user (for example, holding, moving, touching, or installing a probe) indicates that the probe touched by the mobile sensor is to be positioned, and the processor takes the current position data of the mobile sensor as the measurement position of the corresponding probe. The position of each probe's measurement point represents the measurement position of that probe on the head. The measurement position of each probe may be determined by the positioning assembly in response to the user's designation, with the detection pen, of the measurement point of each probe on the head cap.
Next, the first positioning position of the corresponding probe may be marked on the three-dimensional brain image (step 406), for example with a mapping point (e.g., D12 in fig. 3(a) and D14 in fig. 3(b)). The processor may also determine the channels formed between the probes from the mapping-point information, and/or determine, from known data relating the channels to the brain areas, the brain area to which each channel belongs, and display this brain-area information in the display interface. Specifically, the anatomical position of the physiological state represented by the channel formed by each S-D probe pair is the brain-area information to which that channel belongs, and it may be identified with text, color, and the like. The user can thus directly observe the brain-area position of the current channel and check this information against the preset position of each probe, which helps in revising probe placement in the head-cap design, in localizing brain areas with significant differences when processing near-infrared signals, and so on.
In some embodiments, in response to a first operation of the mobile sensor by the user, the position data of the measurement points of the mobile sensor may be adaptively transformed with respect to the three-dimensional brain model, without changing the angles between the connecting lines of the measurement positions, to obtain a second positioning position of the corresponding probe; the second positioning positions are used to determine the channel grid formed between the probes (step 407). Specifically, in step 407 the acquired measurement position of each probe is incompletely mapped onto the three-dimensional brain model to obtain the second positioning position of the corresponding probe. Incomplete mapping means that the relative positional relationships between the probes' measurement positions do not change when mapped onto the three-dimensional brain model; that is, the channel layout grid formed between the actual measurement positions of the probes is not deformed by the mapping.
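A transform that moves all measurement positions together without changing the angles between their connecting lines is, by definition, rigid. The following sketch applies one rigid transform (a rotation about the vertical axis plus a translation, an assumed simple case; a full implementation would estimate rotation and translation from reference points, e.g. with the Kabsch algorithm) to the whole measured grid, so all inter-probe distances and angles, and hence the channel-grid shape, are preserved:

```python
import math

def rigid_transform(points, yaw_deg, translation):
    """'Incomplete' mapping sketch: move the whole measured probe grid
    onto the model with a single rigid transform. Distances and angles
    between probe positions are unchanged, so the channel layout grid
    keeps its shape (it is not deformed by the mapping).

    points: list of (x, y, z); yaw_deg: rotation about the z axis;
    translation: (tx, ty, tz). All parameter choices are illustrative.
    """
    c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    tx, ty, tz = translation
    return [(c * x - s * y + tx, s * x + c * y + ty, z + tz)
            for (x, y, z) in points]
```

Because every pairwise distance survives the transform, the right-angled grid of fig. 3(a) would stay right-angled after this kind of mapping; any deformation seen in fig. 6 must therefore come from the measurement itself, not from the mapping.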
It should be noted that the channels determined by the first positioning positions in the grid are used to determine the brain area to which each channel belongs, so that the operator can conveniently check the brain area corresponding to a positioning position, while the channel grid determined by the second positioning positions is used to check whether each probe has been positioned accurately.
In some embodiments, in addition to obtaining the first positioning position, the second positioning position, or another representative position of each probe, the processor may obtain other information, such as layout information of the channels between probes, and form a channel layout grid from the determined mapping positions of the probes based on that layout information; the channel layout grid may then be displayed on the three-dimensional brain image together with the probe positions. Specifically, when the measurement positions are completely mapped into the three-dimensional brain model, the user can determine the brain region actually corresponding to each channel from the channel layout grid formed by the completely mapped probe positions; when the measurement positions are incompletely mapped, the user can determine whether any probe is mispositioned from the channel layout grid formed by the incompletely mapped positions. As shown in fig. 3(a) and fig. 6, the SD layout of fig. 3(a) shows that the channel layout grid formed among the four probes S1, D1, D6 and S7 is rectangular, while the channel layout grid formed via the incomplete mapping of fig. 6 is not rectangular; from the grid shape, the user can clearly see that S7 deviates markedly from its preset position, determine that probe S7 is mispositioned, and reposition it.
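One way such a grid check could work, sketched under the assumption that a deviated probe is detected by the four-probe quadrilateral losing its rectangular shape (the coordinates are invented):

```python
import math

def is_rectangular(p1, p2, p3, p4, tol=1e-6):
    """True if the quadrilateral p1-p2-p3-p4 is a rectangle: its diagonals
    are equal in length and bisect each other."""
    m1 = tuple((a + b) / 2 for a, b in zip(p1, p3))   # midpoint of one diagonal
    m2 = tuple((a + b) / 2 for a, b in zip(p2, p4))   # midpoint of the other
    return (math.dist(m1, m2) < tol and
            abs(math.dist(p1, p3) - math.dist(p2, p4)) < tol)

# Hypothetical layout: probes S1, D1, D6, S7 should form a rectangle;
# a deviated S7 (as in fig. 6) breaks the shape.
preset  = [(0, 0), (3, 0), (3, 3), (0, 3)]
shifted = [(0, 0), (3, 0), (3, 3), (0.8, 3.4)]   # S7 off its preset position
print(is_rectangular(*preset), is_rectangular(*shifted))   # True False
```

In practice one would likely report which vertex deviates and by how much, rather than a single boolean, but the geometric test is the same.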
The above steps describe how the mapping position of the mobile sensor and the positioning positions of the probes are obtained. The display of these positions, that is, marking the first positioning position, the channels formed between the probes, and/or the brain regions to which the channels belong on the three-dimensional brain image, is described in detail below.
Fig. 5 shows a flowchart of a third example of a localization method for a near-infrared brain function imaging device according to an embodiment of the present disclosure. A specific flow of displaying the mapped position of the mobile sensor and the positioning position of the probe will be described below with reference to a third example.
In step 501, a first positioning position of the probe and a mapping position of the mobile sensor may be marked on the first three-dimensional brain image.
Specifically, after the measurement position of a probe is acquired, it may be mapped into the three-dimensional brain model to determine the mapping position of the corresponding probe as its first positioning position; that is, the position is mapped from the actual three-dimensional space into the space of the three-dimensional brain model. After the mapping position of the probe is obtained, the three-dimensional brain image constructed from the three-dimensional brain model adjusts the current display viewing angle toward the portion where the mapping position is located, so that during positioning the user can promptly see the mapping position of the probe on the three-dimensional brain image and judge the positioning condition of the probe from it, for example, whether the probe is mispositioned, whether the brain region corresponding to the probe is wrong, whether the head cap is worn correctly, how far the brain region position corresponding to the probe deviates from the expected position, and the like.
In step 502, the mapping position of the mobile sensor and the first positioning position of the probe are marked with different marks so that the two marks do not obscure each other. For example, as shown in fig. 3(b), the first positioning position is indicated by a dot, and the mapping position of the mobile sensor (the test pen) is indicated by a dashed box.
In step 503, a channel grid formed between the probes for which the second positioning positions have been obtained is marked on the second three-dimensional brain image.
In this embodiment, in steps 501 and 503, two three-dimensional brain images (a first and a second three-dimensional brain image) are used: one marks the first positioning positions and the mapping position of the mobile sensor, and the other marks the channel grid formed between the probes at their second positioning positions. The two three-dimensional brain images are displayed in different display areas on the same interface. Combined with the feature that the display viewing angle of each three-dimensional brain image changes with the first or second positioning position of the probe measurement point currently being measured, this makes the positioning process more visible and easier to understand, lets the user obtain more positioning information on the same interface without switching back and forth, and keeps the observation angle facing the front, so that the probe positions can be checked more conveniently, in more detail, and more accurately. To avoid confusion caused by presenting too much information, in other embodiments the two three-dimensional brain images may be displayed on different interfaces, with the positioning condition checked by switching interfaces, as long as the user can conveniently check and operate; the present disclosure is not specifically limited in this respect.
In step 504, the display viewing angle of the three-dimensional brain image may be changed according to the measurement position of each probe, so that the portion of the three-dimensional brain image corresponding to the measurement position faces the user. By mapping the measurement position of the probe into the three-dimensional brain model and adjusting the current display viewing angle so that the portion of the three-dimensional brain image where the mapping position lies faces the user, the mapping position will fall on (or immediately adjacent to) the three-dimensional brain model rather than deviating significantly from it, which helps the user grasp the relative relationship between the probe and the model more accurately.
Note that the "portion of the three-dimensional brain image corresponding to the measurement position" may correspond to the measurement position directly or indirectly. For example, the portion of the three-dimensional brain image containing the measurement position itself may be oriented toward the user (direct correspondence). In other embodiments, instead of the measurement position of the probe, another representative position calculated or derived from the measurement position may be used, and the display viewing angle is changed according to that representative position so that the portion of the three-dimensional brain image containing it faces the user (indirect correspondence).
In particular, the display perspective of the three-dimensional brain image may be changed in various ways depending on the determined mapping positions of the respective probes.
Specifically, the above process may include the following. First, a reference point on the three-dimensional brain image and an observation point of the human eye are determined; the line connecting them forms a reference line whose azimuth is set to 0°. Then, a first connection line is determined between the first mapping position (which will be used to adjust the orientation of the three-dimensional brain image) and the reference point, and the azimuth between the first connection line and the reference line, the first azimuth, is calculated. In some embodiments, the display viewing angle of the three-dimensional brain image can be turned toward the first mapping position based on the first azimuth, so that the three-dimensional brain image turns automatically in real time as the measurement point changes, and the user can observe the mapping position of the probe's current measurement point on the brain in real time during positioning.
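The azimuth computation just described can be sketched as follows; the planar geometry, coordinate choices, and sign convention are assumptions made for illustration:

```python
import math

def azimuth_deg(ref, eye, mapping):
    """Signed angle (degrees) to turn the view from the current eye direction
    toward the mapping position, about the vertical axis through ref.
    ref -> eye is the reference line (defined as azimuth 0);
    ref -> mapping is the first connection line."""
    a = math.atan2(eye[1] - ref[1], eye[0] - ref[0])
    b = math.atan2(mapping[1] - ref[1], mapping[0] - ref[0])
    # Wrap the difference into (-180, 180] so the view turns the short way.
    return math.degrees((b - a + math.pi) % (2 * math.pi) - math.pi)

ref, eye = (0.0, 0.0), (0.0, 10.0)       # viewer faces the front of the head
print(azimuth_deg(ref, eye, (10.0, 0.0)))   # probe over the right side of the head
```

Applying this angle as a rotation of the camera (or of the brain model) each time a new measurement point arrives gives the real-time automatic turning described above.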
In some embodiments, the measurement location (or a representative location, such as the mapping location) may also be marked on the three-dimensional brain image while the portion of the image corresponding to the measurement location is oriented toward the user, making the positioning process more visible and easier to understand while allowing the user to easily verify the probe position.
In this embodiment, regarding the display interface, the positioning method further includes: displaying the three-dimensional brain image in a first display area of the display interface, and displaying a two-dimensional relationship graph among the probes in a second display area of the same interface (fig. 7 shows the two-dimensional relationship among the three probes D6, S7 and D7). The two-dimensional relationship is the relative positional relationship between probes, such as the distance between probes and the included angle between adjacent channels (as shown in fig. 7); the graph may display the probe numbers and the channels formed between the probes. The two-dimensional relationship data are derived from the three-dimensional data of the probe measurement positions and can serve a purpose similar to the channel grid. On the two-dimensional relationship graph, data that do not conform to preset rules can be highlighted, and the user judges the positioning condition of the probes from the highlighted data and the channel grid. Based on the display and prompting of the three-dimensional brain image, the probe positions, the channel layout grid, the brain regions to which the channels belong, and other information on the display interface, the user can check the measurement condition and the reasonableness of the probe positions.
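The two-dimensional relationship data — probe-to-probe distances and the angle at a shared probe between two adjacent channels, derived from the three-dimensional measurement positions — could be computed along these lines; the probe coordinates and centimeter units are assumptions:

```python
import math

def probe_distance(p, q):
    """Euclidean distance between two probe positions."""
    return math.dist(p, q)

def channel_angle_deg(shared, a, b):
    """Included angle at `shared` between channels shared-a and shared-b."""
    u = [ai - si for ai, si in zip(a, shared)]
    v = [bi - si for bi, si in zip(b, shared)]
    dot = sum(x * y for x, y in zip(u, v))
    return math.degrees(math.acos(
        dot / (math.dist(shared, a) * math.dist(shared, b))))

# Assumed 3-D positions (cm) for the D6-S7-D7 relationship of fig. 7:
S7, D6, D7 = (0, 0, 0), (3, 0, 0), (0, 3, 0)
print(probe_distance(S7, D6), round(channel_angle_deg(S7, D6, D7), 6))
```

These scalar values are what the graph in the second display area would label on each edge and vertex.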
In some embodiments, the identifier of each probe may be displayed on the display interface, the mapping position of each probe may be checked against a preset rule, and the identifiers of probes that do not satisfy the rule may be displayed in a first presentation manner. The preset rule may be, for example, a reasonable range for parameters such as the distance and angle between probes, especially the distance range between paired SDs; it may also include the correspondence between the channels formed between the probes and the brain region information displayed in the interface, or any other rule that can alert the user to a positioning error, without limitation. In some embodiments, the first presentation manner may, for example, display the probe identifier in a color different from its original color, or in a highlighted color, without limitation. In this way, the user can determine not only whether the probe positions, the channel layout, and the channel-to-brain-region correspondence are erroneous, but also the degree of deformation of the head cap.
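A minimal sketch of such a preset-rule check, assuming (hypothetically) that the rule is a source-detector distance range of 2.5-4.5 cm; the positions, pair list, and range are invented for illustration:

```python
import math

def check_sd_pairs(positions, sd_pairs, lo=2.5, hi=4.5):
    """Return identifiers of probes in SD pairs violating the distance rule,
    i.e. the probes whose identifiers would be highlighted on the interface."""
    flagged = set()
    for s, d in sd_pairs:
        if not lo <= math.dist(positions[s], positions[d]) <= hi:
            flagged.update((s, d))
    return sorted(flagged)

positions = {"S1": (0, 0, 0), "D1": (3, 0, 0),
             "S7": (3, 8, 0), "D6": (0, 3, 0)}
print(check_sd_pairs(positions, [("S1", "D1"), ("S7", "D6")]))   # ['D6', 'S7']
```

The returned identifiers are exactly the ones the interface would render in the "first presentation manner" (e.g., a highlight color).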
In some embodiments, when the channel grid and the first positioning positions of the probes are marked on the same three-dimensional brain image: the second positioning positions of the probes and the channel grid may be displayed with transparent marks, so that the first positioning positions are not obscured and the user can observe different information at the same time; in some embodiments, the display angle of the three-dimensional brain image is automatically adjusted to one at which the marks of the channel grid and the first positioning positions are clearly visible; in other embodiments, after a first operation of the mobile sensor by the user is detected, if the amount of information to be displayed on the three-dimensional brain image would cause occlusion, only the channel grid associated with the current probe may be displayed for a predetermined period, that is, only the channel and brain region information associated with the probe currently being positioned is shown. In some embodiments, in response to a first operation of the mobile sensor by the user, the mapping position of the mobile sensor may be hidden for a predetermined time interval, so that only the mapping position of the probe is displayed on the three-dimensional brain image during that interval. Other display rules may likewise be used to highlight, to the greatest extent, the positioning information the user is currently concerned with. These modes can be combined in any way that does not conflict, and are not specifically limited here.
In some embodiments, with the portion of the three-dimensional brain image corresponding to the measurement position (or the mapping position derived from it, or another representative position) facing the user, the measurement position and/or the mapping position may also be marked on the three-dimensional brain image. For example, the measurement position and the mapping position may be marked simultaneously on the same three-dimensional brain image, or marked on two three-dimensional brain images located in different display areas of the same interface. Combined with the feature that the display viewing angle follows the measurement and/or mapping position of the probe measurement point currently being measured, this makes the positioning process more visual and easier to understand, lets the user obtain more positioning information on one interface without switching back and forth, and keeps the viewing angle facing the front, so the probe positions can be checked more conveniently, in more detail, and more accurately. In another embodiment, the measurement position and the mapping position may be marked on three-dimensional brain images on different interfaces, and the positioning condition viewed by switching interfaces, as long as the user can conveniently view and operate; the present disclosure is not specifically limited in this respect.
Throughout the positioning process, the user only needs to perform the specified operation on each probe on the head cap following the guidance of a preset positioning sequence; whenever the positioning of a probe is completed, the display viewing angle of the three-dimensional brain image, the mapping position of the probe, the shape of the channel layout grid, and so on can be checked on the display interface. The positioning condition of each probe is thus obtained intuitively and promptly without additional manual operation, and positioning efficiency is significantly improved. The preset positioning sequence guides the user in positioning each probe on the head cap.
Embodiments of the present disclosure also provide a positioning apparatus for a near-infrared brain function imaging device, which may be provided separately from the near-infrared brain function imaging device with a head cap according to an embodiment of the present disclosure, and includes at least a first positioning component and a first processor, wherein the first positioning component may be configured to position each probe on the head cap of the near-infrared brain function imaging device to determine a measurement position of each probe. The first processor in the positioning device may then be configured to perform the positioning method for a near-infrared brain function imaging apparatus according to the aforementioned various embodiments of the present disclosure. The first processor may be a processing device, such as a microprocessor, Central Processing Unit (CPU), Graphics Processing Unit (GPU), etc., including one or more general purpose processing devices.
A near-infrared brain function imaging system is also provided according to embodiments of the present disclosure. The system may be composed at least of: a head cap worn on the head of a subject, having a plurality of probes for transmitting and/or receiving near-infrared signals or mounting positions to which the respective probes can be fitted; a second positioning assembly for positioning the respective probes on the head cap and determining their measurement positions; and a second processor for performing the steps of the positioning method for the near-infrared brain function imaging apparatus according to embodiments of the present disclosure. It is understood that after the probes on the head cap are positioned, they can be used to collect near-infrared signals from the subject's head, and the second processor can also perform data collection, processing, analysis, and presentation of analysis results for the near-infrared signals.
Embodiments of the present disclosure also provide a computer storage medium having stored thereon computer-executable instructions that, when executed by a processor, perform the steps of the positioning method for a near-infrared brain function imaging apparatus according to the foregoing. The storage medium may include read-only memory (ROM), flash memory, random access memory (RAM), dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM) or Rambus DRAM, static memory (e.g., flash memory, static random access memory), and the like, on which the computer-executable instructions may be stored in any format.
Moreover, although exemplary embodiments have been described herein, the scope thereof includes any and all embodiments based on the disclosure with equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations, or alterations. The elements of the claims are to be interpreted broadly based on the language employed in the claims and are not limited to examples described in the present specification or during the prosecution of the application, which examples are to be construed as non-exclusive. It is intended, therefore, that the specification and examples be considered as exemplary only, with the true scope and spirit being indicated by the following claims and their full scope of equivalents.
The above description is intended to be illustrative and not restrictive. For example, the above-described examples (or one or more versions thereof) may be used in combination with each other. For example, other embodiments may be used by those of ordinary skill in the art upon reading the above description. In addition, in the foregoing detailed description, various features may be grouped together to streamline the disclosure. This should not be interpreted as an intention that a disclosed feature not claimed is essential to any claim. Rather, the subject matter of the present disclosure may lie in less than all features of a particular disclosed embodiment. Thus, the following claims are hereby incorporated into the detailed description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that these embodiments may be combined with each other in various combinations or permutations. The scope of the invention should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
The above embodiments are only exemplary embodiments of the present disclosure, and are not intended to limit the present invention, the scope of which is defined by the claims. Various modifications and equivalents may be made thereto by those skilled in the art within the spirit and scope of the present disclosure, and such modifications and equivalents should be considered to be within the scope of the present invention.

Claims (13)

1. A positioning method for a near-infrared brain function imaging apparatus having a head cap which is to be worn on a head of a subject and has a plurality of probes for transmitting and/or receiving near-infrared signals or mounting positions to which the respective probes can be fitted, the positioning method comprising:
displaying a three-dimensional brain image on a display interface, wherein the three-dimensional brain image is constructed and formed on the basis of a three-dimensional brain model;
acquiring position data of a measurement point of a mobile sensor determined by a positioning system;
determining a distance between the mobile sensor and a head of an object based on the acquired position data of the measurement point of the mobile sensor;
processing the position data of the measuring point of the mobile sensor under the condition that the distance is smaller than a preset distance to obtain a mapping position of the mobile sensor;
the mapping position of the mobile sensor is marked on the three-dimensional brain image.
2. The positioning method according to claim 1, further comprising, after acquiring the position data of the measurement point of the mobile sensor determined by the positioning system:
in response to a first operation of the mobile sensor by a user, mapping position data of a measurement point of the mobile sensor as a measurement position of a corresponding probe to a three-dimensional brain model to determine a mapped position of the corresponding probe as a first positioning position of the corresponding probe;
marking a first location position of the corresponding probe on the three-dimensional brain image.
3. The positioning method according to claim 2, further comprising:
in response to a first operation of the mobile sensor by a user, adaptively transforming the position data of the measurement points of the mobile sensor according to the three-dimensional brain model, without changing the included angles between the connecting lines of the measurement positions, to obtain a second positioning position of the corresponding probe, wherein the second positioning position is used for determining a channel grid formed among the probes.
4. The positioning method according to claim 2, further comprising:
marking the first positioning position, the channel formed among the probes, and/or the brain area to which the channel belongs on the three-dimensional brain image.
5. The positioning method according to claim 3, further comprising:
marking a first positioning position of the probe and a mapping position of the mobile sensor on the first three-dimensional brain image;
and marking a channel grid formed among the probes of which the second positioning positions are obtained on the second three-dimensional brain image.
6. The positioning method according to claim 2, wherein when the first positioning position of the probe and the mapping position of the mobile sensor are marked on the same three-dimensional brain image:
the mapping position of the mobile sensor and the first positioning position of the probe are marked with different marks so that the two marks do not obscure each other.
7. The positioning method according to claim 2, further comprising:
displaying the three-dimensional brain image in a first display area on the same display interface;
and displaying a two-dimensional relation graph between the probes in a second display area on the same display interface, wherein the two-dimensional relation graph is determined by the measuring positions of the probes.
8. The method of claim 3, further comprising, while marking the first location position of the channel grid and probe on the same three-dimensional brain image:
displaying a second positioning position of the probe and a channel grid by using a transparent mark; and/or
Automatically adjusting the display angle of the three-dimensional brain image to a display angle at which the marks of the channel grid and the first positioning position of the probe are clearly visible; and/or
Upon detecting a first operation of the mobile sensor by a user, displaying a grid of channels associated with a current probe for a predetermined period of time.
9. The positioning method according to claim 2 or 3, further comprising: and changing the display visual angle of the three-dimensional brain image according to the measurement position of each probe, so that the part of the three-dimensional brain image corresponding to the measurement position faces the user.
10. The positioning method according to any one of claims 1 to 9, further comprising:
when the distance is equal to or greater than the preset distance, neither processing the position data of the measurement point of the mobile sensor nor displaying the mapping position of the mobile sensor on the three-dimensional brain image.
11. A positioning apparatus for a near-infrared brain function imaging device, comprising a first positioning assembly and a first processor, wherein,
the first positioning component is configured to: positioning each probe on a head cap of the near-infrared brain function imaging device to determine the measurement position of each probe;
the first processor is configured to: performing the localization method for a near-infrared brain function imaging device of any one of claims 1-10.
12. A near-infrared brain function imaging system, comprising:
a headgear configured to be worn on a head of a subject and having a plurality of probes for transmitting and/or receiving near-infrared signals, or mounting locations to which respective ones of the probes can be fitted;
a second positioning assembly configured to: positioning each probe on the headgear to determine a measurement position of each probe;
a second processor configured to: performing the localization method for a near-infrared brain function imaging device of any one of claims 1-10.
13. A non-transitory computer-readable storage medium storing a program that causes a processor to execute the positioning method for a near-infrared brain function imaging apparatus according to any one of claims 1 to 10.
CN202210192280.5A 2022-03-01 2022-03-01 Positioning method, device and storage medium for near-infrared brain function imaging device Pending CN114569076A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210192280.5A CN114569076A (en) 2022-03-01 2022-03-01 Positioning method, device and storage medium for near-infrared brain function imaging device
PCT/CN2023/079065 WO2023165527A1 (en) 2022-03-01 2023-03-01 Positioning method and apparatus for near-infrared brain function imaging device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210192280.5A CN114569076A (en) 2022-03-01 2022-03-01 Positioning method, device and storage medium for near-infrared brain function imaging device

Publications (1)

Publication Number Publication Date
CN114569076A true CN114569076A (en) 2022-06-03

Family

ID=81771814

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210192280.5A Pending CN114569076A (en) 2022-03-01 2022-03-01 Positioning method, device and storage medium for near-infrared brain function imaging device

Country Status (1)

Country Link
CN (1) CN114569076A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115797560A (en) * 2022-11-28 2023-03-14 广州市碳码科技有限责任公司 Head model construction method and system based on near infrared spectrum imaging
CN115844325A (en) * 2022-11-17 2023-03-28 天津大学 Distributed fNIRS brain function imaging system for super-scanning application
WO2023165527A1 (en) * 2022-03-01 2023-09-07 丹阳慧创医疗设备有限公司 Positioning method and apparatus for near-infrared brain function imaging device, and storage medium
CN117156072A (en) * 2023-11-01 2023-12-01 慧创科仪(北京)科技有限公司 Device for processing near infrared data of multiple persons, processing equipment and storage medium
CN117456111A (en) * 2023-12-25 2024-01-26 慧创科仪(北京)科技有限公司 Label display method and device based on near infrared brain function imaging data

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003344269A (en) * 2002-05-22 2003-12-03 Hitachi Ltd Biological light measuring device
JP2007185491A (en) * 2005-12-16 2007-07-26 National Agriculture & Food Research Organization Space analysis method of transcanial brain function measurement/stimulation point
JP2009261588A (en) * 2008-04-24 2009-11-12 Shimadzu Corp Optical bioinstrumentation apparatus and holder arrangement support system
CN104780847A (en) * 2012-11-14 2015-07-15 株式会社岛津制作所 Optical biometric device and position measurement device used in same
CN108833691A (en) * 2018-06-01 2018-11-16 深圳鑫想科技有限责任公司 A kind of body temperature method for automatic measurement, device and mobile terminal
CN108903915A (en) * 2018-07-24 2018-11-30 丹阳慧创医疗设备有限公司 A kind of positioning device and method near infrared spectrum cerebral function imaging system
US20190282111A1 (en) * 2018-03-15 2019-09-19 Ricoh Company, Ltd. Input device, measurement system, and computer-readable medium
CN110787359A (en) * 2019-09-23 2020-02-14 苏州商信宝信息科技有限公司 Intelligent head nursing method and system based on data processing and analysis


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023165527A1 (en) * 2022-03-01 2023-09-07 Danyang Huichuang Medical Equipment Co., Ltd. Positioning method and apparatus for near-infrared brain function imaging device, and storage medium
CN115844325A (en) * 2022-11-17 2023-03-28 Tianjin University Distributed fNIRS brain function imaging system for hyperscanning applications
CN115797560A (en) * 2022-11-28 2023-03-14 Guangzhou Tanma Technology Co., Ltd. Head model construction method and system based on near-infrared spectroscopic imaging
CN115797560B (en) * 2022-11-28 2023-07-25 Guangzhou Tanma Technology Co., Ltd. Head model construction method and system based on near-infrared spectroscopic imaging
CN117156072A (en) * 2023-11-01 2023-12-01 Huichuang Keyi (Beijing) Technology Co., Ltd. Apparatus, processing device and storage medium for processing multi-person near-infrared data
CN117156072B (en) * 2023-11-01 2024-02-13 Huichuang Keyi (Beijing) Technology Co., Ltd. Apparatus, processing device and storage medium for processing multi-person near-infrared data
CN117456111A (en) * 2023-12-25 2024-01-26 Huichuang Keyi (Beijing) Technology Co., Ltd. Label display method and device based on near-infrared brain function imaging data
CN117456111B (en) * 2023-12-25 2024-04-05 Huichuang Keyi (Beijing) Technology Co., Ltd. Label display method and device based on near-infrared brain function imaging data

Similar Documents

Publication Publication Date Title
CN114569076A (en) Positioning method, device and storage medium for near-infrared brain function imaging device
CN114246556B (en) Positioning method, apparatus and storage medium for near-infrared brain function imaging device
CN114246557A (en) Positioning method, device and storage medium for near-infrared brain function imaging device
US10537247B2 (en) Information processing apparatus, method, and programmed storage medium, for calculating ranges of regions of interest of scanned or other images
WO2023165527A1 (en) Positioning method and apparatus for near-infrared brain function imaging device, and storage medium
US9123096B2 (en) Information processing apparatus and control method thereof
EP3501387A1 (en) Marking a computerized model of a cardiac surface
US9974618B2 (en) Method for determining an imaging specification and image-assisted navigation as well as device for image-assisted navigation
CN108903915B (en) Positioning device and method for near-infrared spectrum brain function imaging system
JP6866444B2 (en) Ultrasonography method, ultrasonic inspection system and related equipment
CN111629670A (en) Echo window artifact classification and visual indicator for ultrasound systems
US11612345B2 (en) Input device, measurement system, and computer-readable medium
US12053324B2 (en) System for visualization and control of surgical devices utilizing a graphical user interface
CN112533540A (en) Ultrasonic imaging method, ultrasonic imaging device and puncture navigation system
JP2015173843A (en) Medical image display device, method and program
CN118000863B (en) Kidney puncture guiding method, system and computer equipment based on multimodal fusion
CN110363757B (en) Ultrasonic image measurement result display method, device, equipment and readable storage medium
JP2015100479A (en) Ultrasonic image processor
CN106028943A (en) Ultrasonic virtual endoscopic imaging system and method, and apparatus thereof
KR20130080311A (en) Method and apparatus for displaying medical image
CN107463320B (en) System correction method and device
CN108536151A (en) Closed-loop execution system and method for visual guidance
KR101819548B1 (en) Ultrasonic image diagnostic apparatus, and user interface control device and user interface operation method used therein
JP4171129B2 (en) Perimeter
CN113367725B (en) Method, device, equipment and medium for adjusting an ultrasound body marker diagram

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220603