WO2024090190A1 - Ultrasonic inspection device, inspection method, and program - Google Patents

Ultrasonic inspection device, inspection method, and program

Info

Publication number
WO2024090190A1
Authority
WO
WIPO (PCT)
Prior art keywords
probe
ultrasonic
camera
image
inspection device
Prior art date
Application number
PCT/JP2023/036673
Other languages
English (en)
Japanese (ja)
Inventor
一生 本郷
裕之 鎌田
直子 小林
務 澤田
博之 茂井
Original Assignee
Sony Group Corporation
Priority date
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2024090190A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13 Tomography
    • A61B 8/14 Echo-tomography
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 29/00 Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
    • G01N 29/04 Analysing solids
    • G01N 29/06 Visualisation of the interior, e.g. acoustic microscopy
    • G01N 29/22 Details, e.g. general constructional or apparatus details
    • G01N 29/24 Probes

Definitions

  • This technology relates to an ultrasonic inspection device, an inspection method, and a program, and in particular to an ultrasonic inspection device, an inspection method, and a program that make it easier to use the ultrasonic inspection device.
  • Ultrasound examination devices that use ultrasound to image internal organs and other objects and obtain ultrasound images have been widely used in the medical field.
  • In an ultrasound examination, the posture of the probe that transmits ultrasound and receives the reflected waves, and the relative position of the probe with respect to the person being examined, may be used (see, for example, Patent Document 1).
  • When an ultrasound examination device is used in a hospital, the posture of the probe and its relative position with respect to the person being examined are obtained, for example, using a camera installed near the hospital bed.
  • An ultrasonic inspection device includes an ultrasonic sensor that emits ultrasonic waves to an object to be inspected and receives the reflected waves of the ultrasonic waves that are reflected back from the object to be inspected, a camera that is integrated with a probe on which the ultrasonic sensor is provided and captures at least an image of the area in the direction in which the probe contacts the object to be inspected, and a calculation unit that measures the relative position of the probe with respect to the object to be inspected based on the image captured by the camera.
  • An inspection method according to the present technology is a method in which an ultrasonic inspection device measures the relative position, with respect to an object to be inspected, of a probe equipped with an ultrasonic sensor that emits ultrasonic waves to the object and receives the reflected waves, based on an image captured by a camera that is integrated with the probe and captures at least the area in the direction in which the probe comes into contact with the object.
  • A program according to the present technology causes a computer to execute a process of measuring the relative position, with respect to an object to be inspected, of a probe equipped with an ultrasonic sensor that emits ultrasonic waves to the object and receives the reflected waves, based on an image captured by a camera that is integrated with the probe and captures at least the area in the direction in which the probe comes into contact with the object.
  • In one aspect of the present technology, a camera is integrated with a probe provided with an ultrasonic sensor that emits ultrasonic waves to an object to be inspected and receives the reflected waves, and the relative position of the probe with respect to the object is measured based on images taken by the camera, which captures at least the area in the direction in which the probe comes into contact with the object.
  • FIG. 1 is a diagram illustrating an example of the configuration of an ultrasonic inspection device according to an embodiment of the present technology.
  • FIG. 2 is a perspective view showing an example of the appearance of a probe.
  • FIG. 3 is a perspective view of the probe as viewed from the tip end side.
  • FIG. 4 is a diagram showing an example of how to use the probe.
  • FIG. 5 is a cross-sectional view showing a schematic configuration of the camera.
  • FIG. 6 is a diagram showing a state in which light is emitted into a space surrounded by three mirrors facing each other.
  • FIG. 7 is a diagram illustrating photographing by a camera and an external force acting on the camera.
  • FIG. 8 is a diagram showing an example of an image captured by a camera.
  • FIG. 9 is a diagram showing an example of the configuration of a probe.
  • FIG. 10 is a block diagram showing an example of the configuration of an information processing device.
  • FIG. 11 is a flowchart illustrating processing performed by an ultrasonic inspection device according to the present technology.
  • FIG. 12 is a diagram showing an example of a situation in which an ultrasonic inspection device is used.
  • FIG. 13 is a diagram for explaining a method for measuring the rigidity of an inspection object.
  • FIG. 14 is a diagram showing an example of the appearance of a probe with a lid portion attached.
  • FIG. 15 is a diagram for explaining remote medical treatment using an ultrasound examination device according to the present technology.
  • FIG. 16 is a diagram for explaining a case where the ultrasonic sensor operates normally and a case where the operation of the ultrasonic sensor is stopped.
  • FIG. 17 is a diagram showing an example of the arrangement of two cameras.
  • FIG. 18 is a diagram showing an example of the arrangement of one camera.
  • FIG. 19 is a diagram showing an example of the arrangement of two fisheye cameras.
  • FIG. 20 is a diagram showing an example of the arrangement of one fisheye camera.
  • FIG. 21 is a diagram showing an example of the arrangement of a plurality of types of cameras.
  • FIG. 22 is a diagram showing an example of a screen for presenting positions on the human body where ultrasound examinations have been performed.
  • FIG. 23 is a diagram showing an example of an ultrasound examination screen displayed on a display of an information processing device during an ultrasound examination.
  • FIG. 24 is a diagram illustrating details of an application area.
  • FIG. 25 is a diagram showing an example of a graph illustrating a time series change in an external force acting on a probe.
  • FIG. 26 is a diagram showing an example of the display of a force sense meter.
  • FIG. 27 is a diagram showing an example of a display of a guide for moving a probe to a region to be examined.
  • FIG. 28 is a diagram showing an example of a guide display for presenting a target value of the force for pressing the probe.
  • FIG. 29 is a block diagram showing an example of the hardware configuration of a computer.
  • FIG. 1 is a diagram illustrating an example of the configuration of an ultrasonic inspection device according to an embodiment of the present technology.
  • the ultrasound examination device of the present technology is a portable device used to take and examine ultrasound images that show the internal state of each part of the body, such as the abdomen, of a person being examined.
  • the ultrasound examination device of the present technology is composed of a probe 1 and an information processing device 2.
  • the probe 1 and the information processing device 2 are connected via a wired or wireless communication path.
  • the probe 1 emits ultrasonic waves to the object to be inspected and receives the reflected ultrasonic waves.
  • the probe 1 measures the intensity of the received reflected waves and supplies ultrasonic measurement data, which is data showing the measurement results of the intensity of the reflected waves over time, to the information processing device 2.
  • the information processing device 2 is composed of a tablet terminal, a smartphone, a notebook PC (Personal Computer), a dedicated terminal, etc.
  • the information processing device 2 generates an ultrasound image based on the ultrasound measurement data supplied from the probe 1, and displays the ultrasound image on a display provided in the information processing device 2, for example.
  • Figure 2 is a perspective view showing an example of the appearance of probe 1.
  • the probe 1 has a tip portion 11 and a support portion 12.
  • A small ultrasonic sensor 21, which emits ultrasonic waves and receives the reflected waves, is provided on the surface of the tip 11 that comes into contact with the test subject.
  • the support portion 12 is a member that supports the tip portion 11 and allows a user of the ultrasound examination device to grasp the probe 1.
  • a roughly triangular opening H is formed in the center of the support portion 12 on the tip portion 11 side, and a plate-shaped transparent member 31 is formed to cover the opening H.
  • Figure 3 is a perspective view of the probe 1 as seen from the tip 11 side.
  • a camera 22 is provided inside the opening H.
  • the camera 22 includes a force sensor section to which the tip 11 is directly or indirectly attached, and at least a portion of the body is integrated with the support section 12. The detailed configuration of the camera 22 will be described later.
  • the camera 22 captures images of the external space of the probe 1.
  • the camera 22 is separated from the external space by a transparent member 31.
  • the transparent member 31 makes it possible to make the camera 22 dustproof, drip-proof, and waterproof. To make it easier to clean the probe 1, it is desirable that the boundary between the transparent member 31 and the support part 12 is flat.
  • the shape of the transparent member 31 is not limited to a plate shape. For example, if at least a portion of the camera 22 is exposed outside the support part 12, the transparent member 31 may be formed in a hemispherical shape to cover the camera 22.
  • Figure 4 shows an example of how to use the probe 1.
  • a user performing an ultrasound examination presses the tip 11 of the probe 1 against a predetermined position on the surface of the object Obj to be examined (e.g., a position directly above an internal organ). At this time, an ultrasound image showing, for example, the inside of the object to be examined is displayed in real time on the display of the information processing device 2. Note that the user performing the ultrasound examination may be the person to be examined, or may be a person other than the person to be examined.
  • the camera 22 captures, for example, an image in the direction in which the probe 1 comes into contact with the object of inspection Obj (toward the tip 11).
  • the image captured by the camera 22 captures, for example, a part of the object of inspection Obj.
  • the ultrasonic inspection device measures the relative position of the probe 1 with respect to the object of inspection Obj based on the image captured by the camera 22.
  • the relative position of the probe 1 is measured, for example, using Visual SLAM (Simultaneous Localization And Mapping) technology that uses the captured image as input.
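  • As a rough illustration of this kind of image-based position measurement, the following is a minimal sketch of estimating the relative motion between two consecutive camera frames using feature matching and essential-matrix decomposition with OpenCV. It sketches the general approach only, not the implementation of this publication; the intrinsic matrix K and the input frames are assumptions.

```python
# Minimal sketch: relative camera motion between two consecutive frames via
# ORB feature matching and essential-matrix decomposition, the kind of step a
# Visual SLAM pipeline builds on. Illustrative only.
import cv2
import numpy as np

def relative_pose(prev_gray, curr_gray, K):
    """Estimate rotation R and translation direction t (up to scale) between frames."""
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:300]
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    # Essential matrix with RANSAC, then decompose into rotation and translation
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t  # motion of the probe-mounted camera relative to the observed surface
```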
  • the information processing device 2 can also detect abnormalities such as rough skin of the person being examined based on the image captured by the camera 22.
  • the camera 22 is also used as an optical force sensor.
  • the ultrasonic inspection device measures the external force acting on the tip 11 of the probe 1 based on the image captured by the camera 22.
  • FIG. 5 is a cross-sectional view showing a schematic configuration of the camera 22.
  • the camera 22 includes a base unit 110, a force sense acting unit 120, strain generating bodies 130 and 180, an image capturing unit 140, a light source unit 150, a first mirror 161, a second mirror 162, and a half mirror 170.
  • the base unit 110 is a rigid structural member in the shape of a flat plate with a light intake hole 110H provided approximately in the center.
  • the base unit 110 allows light incident from outside the camera 22 (also called external light) to enter the photographing unit 140 provided on the first surface S101 side of the base unit 110 (the side opposite to the side where external light enters the camera 22) through the light intake hole 110H provided approximately in the center.
  • the force sense acting unit 120 is a rigid structural member provided via a strain generating body 130 on the second surface S102 side of the base unit 110 (the side on which external light enters the camera 22).
  • the force sense acting unit 120 may be provided, for example, so as to face the second surface S102 of the base unit 110 around the light intake hole 110H.
  • the force sense acting unit 120 is a part of the camera 22 to which an external force is applied via the tip portion 11.
  • When an external force acts on the force sense acting unit 120, the strain generating body 130 between the force sense acting unit 120 and the base unit 110 deforms, and the positional relationship between the first mirror 161 and the second mirror 162 changes.
  • This causes a change in the position of each light point of the reflected light that is emitted from the light source unit 150 and multiple-reflected by the first mirror 161 and the second mirror 162. Therefore, the camera 22 can measure the external force acting on the force sense acting unit 120 by measuring the change in the position of each light point of the reflected light that is multiple-reflected by the first mirror 161 and the second mirror 162.
  • A plurality of force sense acting units 120 may be provided.
  • In this case, the camera 22 can measure the external forces acting on a plurality of locations of the camera 22 by receiving those forces at the respective force sense acting units 120.
  • The multiple force sense acting units 120 may be arranged in point symmetry or line symmetry with respect to the light intake hole 110H.
  • For example, two force sense acting units 120 may be arranged facing each other across the light intake hole 110H (i.e., at 180 degrees with the light intake hole 110H between them).
  • Three force sense acting units 120 may be arranged at 120 degrees from each other with the light intake hole 110H at the center, or four force sense acting units 120 may be arranged at 90 degrees from each other with the light intake hole 110H at the center.
  • the half mirror 170 is provided on the side where external light enters so as to cover the light intake hole 110H.
  • the half mirror 170 may be configured in a rectangular or circular flat plate shape and may be provided so as to span multiple force sense acting units 120 via a strain generating body 180.
  • Half mirror 170 is an optical element that has a light transmittance of more than 20% and less than 90%, and a light reflectance of more than 10% and less than 80%, and transmits some of the incident light and reflects some of the incident light.
  • half mirror 170 can be constructed by depositing an extremely thin film having light transmittance and light reflectance using a metal material such as chromium (Cr) on a transparent element made of glass or resin.
  • Alternatively, the half mirror 170 can be constructed by depositing a dielectric multilayer film having light transmittance and light reflectance on a transparent element made of glass or resin.
  • the light transmittance and light reflectance of half mirror 170 can be set to any value depending on the characteristics realized by camera 22.
  • The half mirror 170, which has optical transparency, can transmit, for example, external light incident on the camera 22 into the interior of the camera 22 through the light intake hole 110H. This allows the camera 22 to capture the external space of the camera 22 with the imaging unit 140 by using the external light taken in through the light intake hole 110H.
  • The half mirror 170, which also has optical reflectivity, can reflect, for example, the light emitted from the light source unit 150 in the same manner as the first mirror 161 and the second mirror 162. This allows the imaging unit 140 to measure the positions of each light point of the reflected light that is emitted from the light source unit 150 and is multiple-reflected by the first mirror 161, the second mirror 162, and the half mirror 170. Therefore, the camera 22 can simultaneously capture the external space of the camera 22 and measure the light point group of the multiple-reflected light with the imaging unit 140.
  • The strain generating bodies 130 and 180 are structural members that deform in proportion to the stress acting on them.
  • The strain generating bodies 130 and 180 may be elastic bodies that deform easily, such as rubber, elastomers, or springs.
  • The strain generating bodies 130 and 180 may also be structural members that are made of the same material as the other components but are formed with low rigidity so that they deform more easily than the other components.
  • The strain generating body 130 is provided between the base unit 110 and the force sense acting unit 120, and the strain generating body 180 is provided between the half mirror 170 and the force sense acting unit 120.
  • By deforming in response to an external force acting on the force sense acting unit 120, the strain generating bodies 130 and 180 can displace the positional relationship among the first mirror 161, the second mirror 162, and the half mirror 170.
  • the first mirror 161 is provided on the second surface S102 of the base unit 110, and the second mirror 162 is provided on the surface of the force sense acting unit 120 facing the base unit 110. That is, the first mirror 161 and the second mirror 162 are provided on the surface of the camera 22 facing the internal space surrounded by the base unit 110 and the force sense acting unit 120.
  • the first mirror 161 and the second mirror 162 can be formed, for example, by depositing a metal material such as chromium (Cr) with a film thickness having sufficient light reflectance on a transparent member made of glass or resin.
  • the first mirror 161 and the second mirror 162 facing each other can multiple-reflect the light emitted from the light source unit 150 in the reflection space 121 between the first mirror 161 and the second mirror 162.
  • Figure 6 is a diagram showing how light is emitted into a space surrounded by three opposing mirrors.
  • the light L emitted from the light source unit 1500 is multiple-reflected by the first mirror 1610, the second mirror 1620, and the third mirror 1630, which are provided facing each other at positions corresponding to each side of the triangular prism.
  • the light L emitted from the light source unit 1500 is received by the light receiving unit 1400 with the number of light points of the reflected light being amplified by the multiple reflections of the first mirror 1610, the second mirror 1620, and the third mirror 1630.
  • the positions of the light points of the reflected light of the light L emitted from the light source unit 1500 are displaced by amplifying the displacement of the first mirror 1610, the second mirror 1620, and the third mirror 1630.
  • the first mirror 1610, the second mirror 1620, and the third mirror 1630 may be arranged to correspond to the sides of an equilateral triangle or an isosceles triangle, or to the sides of a triangle that deviates from an equilateral or isosceles triangle.
  • Similarly, in the camera 22, the first mirror 161, the second mirror 162, and a third mirror (not shown) can form a structure corresponding to the sides of a triangular prism as shown in FIG. 6.
  • the camera 22 can multiple-reflect the light emitted from the light source unit 150, with the inside of the triangular prism whose sides are formed by the first mirror 161, the second mirror 162, and the third mirror being the reflection space 121.
  • The first mirror 161 and the second mirror 162 may instead form, together with a third mirror (not shown), a structure corresponding to the sides of a triangular pyramid. Even in this case, the camera 22 can multiple-reflect the light emitted from the light source unit 150 by using the inside of the triangular pyramid whose sides are formed by the first mirror 161, the second mirror 162, and the third mirror as the reflection space 121.
  • The first mirror 161 and the second mirror 162 may also form, together with a third mirror and a fourth mirror (not shown), a structure corresponding to the side surfaces of a rectangular prism. Even in such a case, the camera 22 can multiple-reflect the light emitted from the light source unit 150 by using the inside of the rectangular prism whose sides are formed by the first mirror 161, the second mirror 162, the third mirror, and the fourth mirror as the reflection space 121.
  • Likewise, the first mirror 161 and the second mirror 162 may form, together with a third mirror and a fourth mirror (not shown), a structure corresponding to the sides of a quadrangular pyramid. Even in this case, the camera 22 can multiple-reflect the light emitted from the light source unit 150 by using the inside of the quadrangular pyramid whose sides are formed by the first mirror 161, the second mirror 162, the third mirror, and the fourth mirror as the reflection space 121.
  • the light source unit 150 emits light toward the second surface S102 side of the base unit 110. Specifically, the light source unit 150 emits light into a reflection space 121 surrounded on at least two sides by the first mirror 161 and the second mirror 162.
  • the reflection space 121 is, for example, the space between the first mirror 161 and the second mirror 162 that face each other. By emitting light into the reflection space 121, the light source unit 150 can cause the emitted light to be multiple-reflected in the reflection space 121 between the first mirror 161 and the second mirror 162.
  • the light source unit 150 may emit light into the reflection space 121 from the bottom side of the reflection space 121 (i.e., the base unit 110 side), or from the lateral side of the reflection space 121 (i.e., the strain generating body 130 side).
  • the light source unit 150 may be, for example, an LED (Light Emitting Diode) light source capable of emitting light with high linearity.
  • the light source unit 150 may be provided on the base unit 110 side.
  • wiring to the light source unit 150 can be formed in the same way as wiring to the imaging unit 140, so the cost and amount of work involved in forming the wiring can be reduced. Therefore, the production cost of the camera 22 can be further reduced.
  • the light source unit 150 may be provided inside the base unit 110 so that the main body of the LED light source and wiring are not exposed in the reflection space 121. In this case, it is possible to prevent the image of the main body and wiring of the light source unit 150 from being multiple-reflected on the first mirror 161 and the second mirror 162. This prevents the multiple-reflection image of the main body and wiring of the light source unit 150 from becoming a noise source, and therefore prevents a decrease in the measurement sensitivity of the light spot group of the reflected light that is multiple-reflected on the first mirror 161 and the second mirror 162.
  • the light source unit 150 may emit light into the reflection space 121 through a pinhole.
  • the pinhole is, for example, a hole with a diameter of about several mm.
  • the light source unit 150 can further improve the convergence of the emitted light by emitting light into the reflection space 121 through the pinhole.
  • the light source unit 150 can make the shape of each light point of the reflected light reflected multiple times by the first mirror 161 and the second mirror 162 a smaller perfect circle, thereby improving the measurement sensitivity of each light point.
  • Since the machining accuracy of the pinhole is generally higher than the positioning accuracy with which the light source unit 150 is assembled, emitting light into the reflection space 121 through the pinhole can further improve the accuracy of the position from which the light is emitted. Therefore, it is possible to more easily control the position of each light point of the reflected light reflected multiple times by the first mirror 161 and the second mirror 162.
  • A plurality of light source units 150 may be provided, one corresponding to each of the force sense acting units 120.
  • The light source unit 150 may emit light of a different color for each corresponding force sense acting unit 120.
  • The light source unit 150 may also emit light into another reflection space 121 surrounded on two sides by the first mirror 161 and the second mirror 162 provided on the corresponding force sense acting unit 120, so that the light spot group of the reflected light is separated for each corresponding force sense acting unit 120.
  • In these cases, the camera 22 can separate the light spot groups of the reflected light from each other by color or position and measure the external force acting on each of the force sense acting units 120 from the displacement of each light spot group. Therefore, the camera 22 can measure the external forces acting on the respective force sense acting units 120 separately and with high accuracy.
  • the photographing unit 140 is an image sensor that acquires a photographed image by receiving light incident through the light intake hole 110H.
  • the photographing unit 140 may be, for example, a CMOS (Complementary Metal-Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor.
  • the photographing unit 140 can receive external light that passes through the half mirror 170 and enters the camera 22, and a group of light points of the reflected light that is emitted from the light source unit 150 and is multiple-reflected by the first mirror 161, the second mirror 162, and the half mirror 170. In other words, the photographing unit 140 can acquire an image in which the group of light points of the multiple-reflected reflected light is superimposed on the photographed image of the external space of the camera 22.
  • the camera 22 can measure the external force acting on the force sense acting unit 120 from the displacement of the position of each light point of the reflected light that is multiple-reflected by the first mirror 161, the second mirror 162, and the half mirror 170. Therefore, the camera 22 can simultaneously capture images of the external space and measure the external force acting on the force sense acting unit 120.
  • Figure 7 is a diagram explaining the photographing by the camera 22 and the external forces acting on the camera 22.
  • the camera 22 can capture an image of an object Obj1 present in the external space. Furthermore, the camera 22 receives a force Fz1 in the Z-axis direction, a moment Mx1 about the X-axis, and a moment My1 about the Y-axis at the force sense acting section 120a on the upper side as viewed in FIG. 7, and can detect the force Fz1, the moment Mx1, and the moment My1. Similarly, the camera 22 receives a force Fz2 in the Z-axis direction, a moment Mx2 about the X-axis, and a moment My2 about the Y-axis at the force sense acting section 120b on the lower side as viewed in FIG. 7, and can detect the force Fz2, the moment Mx2, and the moment My2.
  • FIG. 8 shows an example of an image captured by the camera 22.
  • the captured image CI captured by the image capture unit 140 includes an object Obj1 and light spot groups LC1 and LC2.
  • the light spot group LC1 is, for example, a light spot group of reflected light that is emitted from the light source unit 150a on the upper side as viewed in FIG. 7 and is multiple-reflected by the first mirror 161a and the second mirror 162a.
  • When an external force acts on the force sense acting unit 120a, the positions of the force sense acting unit 120a and the second mirror 162a are displaced, and the position of the light spot group LC1 on the captured image CI is displaced in the directions corresponding to the force Fz1, the moment Mx1, and the moment My1 acting on the force sense acting unit 120a. Therefore, the camera 22 can calculate the direction and magnitude of the force Fz1, the moment Mx1, and the moment My1 acting on the force sense acting unit 120a from the displacement of the position of the light spot group LC1.
  • the light spot group LC2 is, for example, a light spot group of reflected light that is emitted from the light source unit 150b on the lower side as viewed in FIG. 7 and is multiple-reflected by the first mirror 161b and the second mirror 162b.
  • Likewise, when an external force acts on the force sense acting unit 120b, the positions of the force sense acting unit 120b and the second mirror 162b are displaced, and the position of the light spot group LC2 on the captured image CI is displaced in the directions corresponding to the force Fz2, the moment Mx2, and the moment My2 acting on the force sense acting unit 120b. Therefore, the camera 22 can calculate the direction and magnitude of the force Fz2, the moment Mx2, and the moment My2 acting on the force sense acting unit 120b from the displacement of the position of the light spot group LC2.
  • the camera 22 can calculate the direction and magnitude of the external force acting on the force sense action unit 120a by previously associating the state of displacement of the position of the light point group LC1 of the reflected light with the actual measured values of the direction and magnitude of the external force acting on the force sense action unit 120a. Also, the camera 22 can calculate the direction and magnitude of the external force acting on the force sense action unit 120b by previously associating the state of displacement of the position of the light point group LC2 of the reflected light with the actual measured values of the direction and magnitude of the external force acting on the force sense action unit 120b.
  • the camera 22 may use machine learning to associate the state of displacement of each of the position of the light point groups LC1 and LC2 of the reflected light with the actual measured values of the direction and magnitude of the external force acting on each of the force sense action units 120a and 120b.
  • the camera 22 may create a calibration curve to associate the state of displacement of each of the light point groups LC1 and LC2 of the reflected light with the actual measured values of the direction and magnitude of the external force acting on each of the force sense acting units 120a and 120b.
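  • The calibration described above can be pictured as fitting a mapping from light-spot displacements to forces and moments. The following is a minimal sketch of a linear least-squares calibration; the data files, array shapes, and the assumption of linearity are illustrative, not taken from this publication.

```python
# Minimal sketch: linear calibration from light-spot displacements to (Fz, Mx, My).
import numpy as np

# X: N calibration samples of concatenated (dx, dy) displacements of the tracked
# light spots; Y: the forces/moments measured with a reference sensor at the same
# time. Both file names are hypothetical placeholders.
X = np.load("spot_displacements.npy")   # shape (N, 2 * n_spots)
Y = np.load("reference_forces.npy")     # shape (N, 3) -> Fz, Mx, My

# Least-squares fit of Y ≈ X @ C, i.e. a simple linear "calibration curve".
C, *_ = np.linalg.lstsq(X, Y, rcond=None)

def estimate_force(displacements):
    """Estimate (Fz, Mx, My) from a new displacement vector of shape (2 * n_spots,)."""
    return displacements @ C
```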
  • the camera 22 can capture an image CI in which the light spot groups LC1 and LC2 are superimposed on the captured image of the subject Obj1, so that the force and moment acting on the force sense acting units 120a and 120b can be measured simultaneously while capturing an image of the subject Obj1.
  • the tip 11 is attached so that the movement of the tip 11 is transmitted to the force sense acting units 120a and 120b, so the camera 22 can measure the external force acting on the tip 11.
  • As described above, in an ultrasound examination, the orientation of the probe and the relative position of the probe with respect to the person being examined may be used (see, for example, Patent Document 1).
  • When an ultrasound examination device is used in a hospital, the orientation of the probe and its relative position with respect to the person being examined are obtained, for example, using a camera installed near the hospital bed. In that case, the camera must be installed and connected separately from the probe.
  • In the ultrasonic inspection device of the present technology, by contrast, the camera 22, which captures at least the area in the direction in which the probe 1 comes into contact with the object to be inspected, is configured as one unit with the probe 1 provided with the ultrasonic sensor 21, and the relative position of the probe 1 with respect to the object to be inspected is measured based on the image captured by the camera 22. Because the camera 22 is configured as one unit with the probe 1, the user can easily use the ultrasonic inspection device without having to install a camera or connect a camera to the ultrasonic inspection device.
  • the camera 22 captures an image of the external space of the probe 1, and also captures an image of a group of light points of light emitted from a light source provided inside the probe 1 and reflected in a reflection space provided inside the probe 1.
  • the external force acting on the tip 11 of the probe 1 is measured based on the displacement of the group of light points captured in the image captured by the camera 22.
  • Note that the external force may instead be measured by a dedicated force sensor formed to support the tip 11, rather than by the camera 22.
  • However, incorporating a force sensor together with an ultrasonic sensor and a camera into one probe may result in the probe becoming large, whereas it is desirable that the probe be as small as possible.
  • Because the camera 22 integrated with the probe 1 is also used as an optical force sensor, a dedicated force sensor and its wiring can be omitted, making it possible to realize a smaller, lighter ultrasonic inspection device.
  • FIG. 9 is a diagram showing an example of the configuration of the probe 1.
  • As shown in FIG. 9, the probe 1 is composed of an ultrasonic sensor 21, a camera 22, an inertial sensor 23, and a calculation unit 24.
  • the ultrasonic sensor 21 emits ultrasonic waves to the object to be inspected and receives the reflected ultrasonic waves.
  • the ultrasonic sensor 21 measures the intensity of the received reflected waves and obtains ultrasonic measurement data that indicates, for example, the measurement results of the intensity of the reflected waves over time.
  • the camera 22 captures the external space of the probe 1 and acquires the captured image.
  • the inertial sensor 23 is composed of a gyro sensor and an acceleration sensor, and is mounted inside the probe 1.
  • the inertial sensor 23 measures at least one of the angular velocity of the probe 1 and the acceleration of the probe 1.
  • the degree to which the probe 1 is tilted relative to the direction of gravity can be measured using an acceleration sensor, so the measurement results from the acceleration sensor can be used to improve the accuracy of the tilt angle of the probe 1 measured using techniques such as Visual SLAM. Improving the accuracy of measuring the tilt angle of the probe 1 makes it possible to more accurately correct errors that occur in the measurement of the external force acting on the probe 1, improving the accuracy of the measurement value of the external force.
  • Since the gyro sensor can mainly measure rotational movement around the axis of gravity, the measurement results of the gyro sensor can be used to improve the accuracy of the movement state of the probe 1 measured by techniques such as Visual SLAM.
  • Visual SLAM measures the relative change in position with respect to the object being inspected, so it is not suitable for measuring the absolute translational amount of probe 1. For example, with Visual SLAM technology, it may not be possible to determine whether the object being inspected or probe 1 has moved. By using the results of measuring the acceleration of probe 1 using an acceleration sensor, it is possible to measure the absolute change in the position of probe 1.
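  • One simple way to picture how the acceleration sensor can support the image-based measurement is a complementary blend of a tilt angle taken from the measured gravity direction with a tilt angle taken from Visual SLAM. The following sketch and its blending weight are assumptions for illustration only, not the publication's method.

```python
# Minimal sketch: tilt from the accelerometer's gravity reading, blended with a
# tilt estimate from Visual SLAM (complementary filter). Illustrative only.
import numpy as np

def tilt_from_accel(ax, ay, az):
    """Tilt of the probe axis relative to gravity from a (quasi-)static accelerometer reading."""
    g = np.array([ax, ay, az], dtype=float)
    g /= np.linalg.norm(g)
    # angle between the probe's z axis (0, 0, 1) and the measured gravity direction
    return float(np.degrees(np.arccos(np.clip(g[2], -1.0, 1.0))))

def fused_tilt(tilt_slam_deg, tilt_accel_deg, alpha=0.98):
    """Trust the image-based estimate for fast changes, the accelerometer for drift correction."""
    return alpha * tilt_slam_deg + (1.0 - alpha) * tilt_accel_deg
```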
  • the calculation unit 24 measures the relative position of the probe 1 with respect to the object to be inspected based on the image captured by the camera 22.
  • the calculation unit 24 also measures the external force acting on the probe 1 based on the image captured by the camera 22.
  • FIG. 10 is a block diagram showing an example configuration of the information processing device 2.
  • the information processing device 2 is composed of a data acquisition unit 51, a calculation unit 52, a communication unit 53, a recording unit 54, a presentation control unit 55, and a presentation unit 56.
  • the data acquisition unit 51 acquires ultrasonic measurement data, captured images, measurement results of the inertial sensor 23, measurement results of the relative position of the probe 1, and measurement results of external forces acting on the probe 1 from the probe 1, and supplies them to the calculation unit 52, the communication unit 53, and the recording unit 54. Since the external forces acting on the probe 1 are measured by the calculation unit 24 of the probe 1, the data acquisition unit 51 may acquire from the probe 1 only the area of the captured image in which the subject appears, rather than the entire captured image including the light spot groups used for measuring the external forces. By acquiring only the area of the captured image in which the subject appears, the amount of data exchanged between the probe 1 and the information processing device 2 can be reduced.
  • the calculation unit 52 generates an ultrasound image based on the ultrasound measurement data supplied from the data acquisition unit 51, and supplies it to the communication unit 53 and the recording unit 54.
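  • As an illustration of what generating an ultrasound image from reflected-wave intensity data can involve, the following is a minimal, textbook-style sketch of B-mode processing (envelope detection and log compression). It is not the actual algorithm of the calculation unit 52, and the dynamic range value is an assumption.

```python
# Minimal sketch: per-scan-line echo data -> B-mode style grayscale image.
import numpy as np
from scipy.signal import hilbert

def to_bmode(rf_lines, dynamic_range_db=60.0):
    """rf_lines: 2D array (scan lines x time samples) of received echo signals."""
    envelope = np.abs(hilbert(rf_lines, axis=1))        # echo envelope per scan line
    envelope /= envelope.max() + 1e-12                  # normalize
    db = 20.0 * np.log10(envelope + 1e-12)              # log compression
    img = np.clip((db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)
    return (img * 255).astype(np.uint8)                 # displayable B-mode image
```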
  • the calculation unit 52 also supplies the captured image, the measurement results of the inertial sensor 23, the measurement results of the relative position of the probe 1, and the measurement results of the external force acting on the probe 1, all of which are supplied from the data acquisition unit 51, as well as the ultrasound image, to the presentation control unit 55.
  • the calculation unit 52 can also measure the relative position of the probe 1 with respect to the object of inspection and the external force acting on the probe 1 based on the captured image.
  • the communication unit 53 transmits at least two of the data supplied from the data acquisition unit 51 and the calculation unit 52 to an external device connected to the information processing device 2.
  • the information processing device 2 and the external device are connected via a wired or wireless network.
  • the ultrasound image, the captured image, the measurement results of the inertial sensor 23, the measurement results of the relative position of the probe 1, and the measurement results of the external force acting on the probe 1 are encrypted and transmitted so that they can only be decrypted by an external device connected to the information processing device 2.
  • the data may be encrypted in the configuration that acquired the data, or may be encrypted collectively in the communication unit 53.
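  • One simple realization of such encryption is symmetric encryption with a key shared in advance with the connected external device, sketched below. This is an assumption for illustration, not the protocol of this publication.

```python
# Minimal sketch: encrypt a measurement payload so only the holder of the shared
# key (the connected external device) can decrypt it. Illustrative only.
import json
from cryptography.fernet import Fernet

def encrypt_payload(shared_key: bytes, payload: dict) -> bytes:
    """Serialize and encrypt a payload with a pre-shared key."""
    return Fernet(shared_key).encrypt(json.dumps(payload).encode("utf-8"))

# The external device decrypts with the same key:
#   data = json.loads(Fernet(shared_key).decrypt(token).decode("utf-8"))
```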
  • the recording unit 54 records the data supplied from the data acquisition unit 51 and the calculation unit 52 in association with each other.
  • the presentation control unit 55 generates a screen and sound to be presented to the user based on the data supplied from the calculation unit 52, and controls the presentation unit 56 to present the screen and sound.
  • the presentation unit 56 is composed of a display, speaker, etc., and presents to the user images and sounds based on ultrasound images and the measurement results of the external forces acting on the probe 1 according to the control of the presentation control unit 55.
  • In step S1, the data acquisition unit 51 of the information processing device 2 acquires various data, such as ultrasonic measurement data and captured images, from the probe 1.
  • In step S2, for example, the calculation unit 52 of the information processing device 2 generates an ultrasound image based on the ultrasound measurement data.
  • In step S3, for example, the calculation unit 24 of the probe 1 measures the relative position of the probe 1 with respect to the test object based on the captured image.
  • In step S4, for example, the calculation unit 24 of the probe 1 measures the external force acting on the probe 1 based on the captured image.
  • In step S5, the recording unit 54 of the information processing device 2 records the measurement results of the various information obtained by the probe 1 and the information processing device 2.
  • the ultrasound image, the captured image, the measurement results of the relative position of the probe 1, the measurement results of the external force acting on the probe 1, and the measurement results of the inertial sensor 23 are recorded in association with each other.
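  • As an illustration of recording these measurement results in association with each other, the following sketch appends one timestamped record per acquisition to a log file. The field names and file format are assumptions for illustration, not the recording unit 54's actual format.

```python
# Minimal sketch: keep the measurements of one acquisition associated in a single
# timestamped record (JSON Lines, append-only). Illustrative only.
import json
import time

def record_measurements(path, ultrasound_image_file, captured_image_file,
                        relative_position, external_force, imu_measurement):
    record = {
        "timestamp": time.time(),
        "ultrasound_image": ultrasound_image_file,   # e.g. path to the saved image
        "captured_image": captured_image_file,
        "relative_position": relative_position,      # e.g. [x, y, z]
        "external_force": external_force,            # e.g. [Fz, Mx, My]
        "imu": imu_measurement,                      # angular velocity / acceleration
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
```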
  • FIG. 12 is a diagram showing an example of a situation in which an ultrasonic inspection device is used.
  • the probe 1 is used by connecting it to a tablet terminal 2A as an information processing device 2 as shown in FIG. 12A, or by connecting it to a notebook PC 2B as an information processing device 2 as shown in FIG. 12B.
  • a guide on how to use probe 1 may be presented from tablet terminal 2A or notebook PC 2B.
  • a voice message such as "Please press a little harder to the right” may be presented, or a screen containing colors, symbols, figures, character strings, etc. according to the guide content may be presented.
  • When the force or position of the probe 1 is appropriate, a sound or a display indicating this is presented.
  • For example, a voice message saying "The current force is optimal" may be output, or a character string or figure indicating that the force or position of the probe 1 is appropriate may be displayed in a color different from that of the other displays. Note that if the probe 1 is pressed too far into the test subject, an alert may be presented.
  • the contents of the guide on how to use the probe 1 are determined based on the difference between the target values and at least one of the measured values of the relative position of the probe 1, the external force acting on the probe 1, the angular velocity of the probe 1, and the acceleration of the probe 1, and are intended to bring these measured values closer to the target values.
  • the target values for at least one of the relative position of the probe 1, the external force acting on the probe 1, the angular velocity of the probe 1, and the acceleration of the probe 1 are input using an input means provided on the probe 1 or a device connected to the probe 1 (the information processing device 2 or an external device).
  • the input means includes a microphone that collects sound, a switch on the probe 1, a touch panel on the information processing device 2, etc.
  • a user can input target values recommended by a medical professional, or a medical professional who is checking various data acquired by the probe 1 and information processing device 2 in a remote location can input target values. Since the user can use the probe 1 by following the guide, it is possible to realize an ultrasound examination device that is easy to operate even for users who are novices in the medical field.
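  • A minimal sketch of deriving such a guide message from the difference between a measured value and its target value is shown below; the thresholds and wording are assumptions for illustration only.

```python
# Minimal sketch: guide message from the difference between the measured pressing
# force and its target value. Illustrative only.
def force_guide(measured_force_n, target_force_n, tolerance_n=0.5):
    diff = target_force_n - measured_force_n
    if abs(diff) <= tolerance_n:
        return "The current force is optimal."
    if diff > 0:
        return f"Please press about {diff:.1f} N harder."
    return f"Please ease off by about {-diff:.1f} N."
```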
  • Example of measuring the elasticity of the test object: The stiffness of the test object, which indicates the elasticity of the test object, may be measured based on the acceleration of the probe 1 measured by an acceleration sensor and the measurement results of the external force acting on the probe 1.
  • the stiffness of the test object is measured, for example, by the calculation unit 52 of the information processing device 2.
  • Figure 13 is a diagram to explain how to measure the stiffness of the test object.
  • FIG. 13 shows a schematic diagram of a situation in which a support part Su on which a force sensor FS is provided is pressed against an object to be inspected Obj.
  • the force sensor FS is provided on both the upper and lower sides of the contact surface of the support part Su with the object to be inspected Obj.
  • The upper part of Figure 13 shows the situation before force is applied to the support part Su, and the lower part of Figure 13 shows the situation when force F is applied from the side of the support part Su opposite the contact surface with the inspection object Obj.
  • the acting force Fsr [N] measured by the upper force sensor FS is equal to the acting force Fsl [N] measured by the lower force sensor FS.
  • the stiffness Ks [N/mm] of the force sensor FS is given by the following equation (1).
  • the movement amount sm [mm] of the force sensor FS due to the application of force F can be calculated by double integration of the translational acceleration az [mm/s²] of the support part Su (force sensor FS) in the pushing direction, as shown in the following equation (3).
  • the sum of the movement hm, the initial thickness h0, and the initial thickness s0 is equal to the sum of the thickness h1, the thickness s1, and the movement sm (hm + h0 + s0 = h1 + s1 + sm), as shown in formula (4) below.
  • the stiffness Kh [N/mm] of the object to be inspected Obj is given by the following formula (6).
  • Here, the stiffness Kh, the movement amount hm, the initial thickness h0, the thickness h1, the movement amount sm, and the thickness s1 are unknown parameters, while the stiffness Ks, the acting force Fsr, the acting force Fsl, the initial thickness s0, and the translational acceleration az are known parameters.
  • By solving these equations using the known parameters, the stiffness Kh can be measured. Note that if the acting force Fsr and the acting force Fsl are greater than 0, the double integral of the translational acceleration az is greater than the movement amount hm.
  • Since the person to be examined hardly moves even when the probe 1 is pressed against them, it is possible to measure the rigidity of the test object using the acceleration of the probe 1 measured by the acceleration sensor. Since the elasticity of the test object indicates, for example, the hardness of the affected area, abnormalities in the affected area can also be discovered by performing an ultrasound examination using the ultrasound examination device of the present technology.
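  • The following sketch illustrates one way this stiffness estimation can be computed, under the assumptions that the person being examined hardly moves (hm ≈ 0) and that the compression of the force sensor equals the acting force divided by its known stiffness Ks. It is a reconstruction for illustration, not the publication's equations (1) to (6).

```python
# Minimal sketch of the stiffness estimation, assuming hm ≈ 0 and
# (s0 - s1) = Fs / Ks. Illustrative reconstruction only.
import numpy as np

def estimate_stiffness(fs_n, ks_n_per_mm, az_mm_per_s2, dt_s):
    """Estimate the object stiffness Kh [N/mm].

    fs_n:         acting force measured by the force sensor [N]
    ks_n_per_mm:  known stiffness of the force sensor [N/mm]
    az_mm_per_s2: time series of pushing-direction acceleration [mm/s^2]
    dt_s:         sampling interval [s]
    """
    velocity = np.cumsum(az_mm_per_s2) * dt_s          # first integration
    sm = float(np.cumsum(velocity)[-1] * dt_s)         # second integration: sensor travel [mm]
    sensor_compression = fs_n / ks_n_per_mm            # s0 - s1 [mm]
    object_compression = sm - sensor_compression       # h0 - h1 [mm], with hm ≈ 0
    return fs_n / object_compression                   # Kh [N/mm]
```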
  • a lid for covering the camera 22 may be attached to the probe 1 so as to be movable.
  • FIG. 14 shows an example of the appearance of probe 1 with the lid attached.
  • the probe 1 is provided with a lid 301 that can slide or be attached or detached.
  • When the lid 301 is open, the opening H is exposed and the camera 22 can capture the external space of the probe 1.
  • When the lid 301 is closed, the opening H is blocked and the camera 22 cannot capture the external space of the probe 1.
  • If the person being examined does not wish to be photographed, the ultrasound examination can be performed with the lid 301 closed, making it possible to protect the privacy of the person being examined according to their wishes.
  • the photographing unit 140 provided in the camera 22 may be configured with an image sensor capable of detecting visible light and infrared light. In this case, based on the image photographed by the camera 22, it is possible to measure the appearance and shape of the subject, measure the external force acting on the probe 1, and estimate the moisture content of the subject.
  • FIG. 15 is a diagram for explaining remote medical treatment using an ultrasound examination device according to the present technology.
  • In the example of FIG. 15, a user U1 is performing an ultrasound examination on a person P1 to be examined, for example at home.
  • a notebook PC 2B connected to a probe 1 is connected, as shown by the dashed line, to a tablet terminal 302 used by a medical worker D1 in a remote location, for example via a wireless network.
  • the notebook PC 2B transmits two or more pieces of data, such as an ultrasound image, a captured image, measurement results of the external force acting on the probe 1, and measurement results of the inertial sensor 23, to the tablet terminal 302.
  • a screen corresponding to the data transmitted from the notebook PC 2B is displayed on the tablet terminal 302. While viewing the screen displayed on the tablet terminal 302, the medical worker D1 inputs advice on how to use the probe 1 by operating the tablet terminal 302 or by voice.
  • the tablet terminal 302 transmits information indicating the advice input by the medical worker D1 to the notebook PC 2B.
  • the notebook PC 2B receives the information indicating the advice sent from the tablet terminal 302 and presents the advice to the user U1. For example, text indicating the content of the advice may be displayed on the display, and the voice of the medical worker D1 may be output from a speaker. Note that, for example, if the medical worker D1 determines that the ultrasound image has been properly captured, he or she may input an instruction to record the ultrasound image by operating the tablet terminal 302. When information indicating an instruction to record an ultrasound image is sent from the tablet terminal 302, the notebook PC 2B records, for example, the ultrasound image, the captured image, the measurement results of the external force acting on the probe 1, the measurement results of the inertial sensor 23, etc. in association with each other.
  • a medical professional D1 in a remote location can check the measurement results of the ultrasound examination device, and the user U1 can receive advice from the medical professional D1 on how to use the probe 1.
  • the medical professional D1 can check not only the ultrasound images, but also the measurement results of the external forces acting on the probe 1, and can easily provide advice.
  • the calculation unit 52 of the information processing device 2 can also recognize a subject appearing in a captured image by performing image recognition based on the captured image of the camera 22.
  • Based on the result of the image recognition and the measurement result of the relative position of the probe 1, the information processing device 2 can grasp the distance of the probe 1 to a predetermined part of the person being examined.
  • For example, when the probe 1 comes close to a predetermined part such as the head, the operation of the ultrasonic sensor 21 may be stopped.
  • FIG. 16 is a diagram for explaining the cases when the ultrasonic sensor 21 operates normally and when the operation of the ultrasonic sensor 21 is stopped.
  • When the probe 1 is not close to such a part, the ultrasonic sensor 21 operates normally.
  • If the operation of the ultrasonic sensor 21 stops during an ultrasound examination of a part close to the head, such as the neck, the operation of the ultrasonic sensor 21 can be resumed, for example, by the user responding to a warning message displayed on the display of the information processing device 2.
  • A plurality of cameras 22 may be integrated into the probe 1.
  • In the following example, two cameras 22 are provided on the probe 1.
  • FIG. 17 is a diagram showing an example of the arrangement of the two cameras 22.
  • the surface of the probe 1 on which the ultrasonic sensor 21 is provided is referred to as the front surface.
  • A of Figure 17 shows an example in which two cameras 22 (openings H) are provided on the front surface of the probe 1 (the contact surface of the tip 11 with the test subject).
  • the upper part of A of Figure 17 shows a schematic diagram of the probe 1 viewed from above, and the lower part of A of Figure 17 shows a schematic diagram of the probe 1 viewed from the side.
  • the two cameras 22 are provided at the left and right ends inside the tip 11, respectively, and an ultrasonic sensor 21 is provided between the two cameras 22.
  • the two cameras 22 (opening H) may also be provided on the side surfaces of the tip 11.
  • FIG. 17B shows an example in which two cameras 22 are provided on the upper and lower surfaces of the support portion 12 of the probe 1 (hereinafter, when the camera 22 is provided on the support portion 12, one of the multiple surfaces on which the camera 22 is provided is referred to as the upper surface).
  • the upper side of FIG. 17B shows a schematic diagram of the probe 1 viewed from above
  • the lower side of FIG. 17B shows a schematic diagram of the probe 1 viewed from the side.
  • the two cameras 22 are configured integrally with the probe 1 but are provided outside the support portion 12.
  • C of FIG. 17 shows another example in which two cameras 22 are provided on the upper and lower surfaces of the support portion 12 of the probe 1.
  • the upper part of Fig. 17C shows a schematic diagram of the probe 1 viewed from above
  • the lower part of Fig. 17C shows a schematic diagram of the probe 1 viewed from the side.
  • the two cameras 22 are provided inside the support portion 12.
  • When the cameras 22 are provided on the front surface of the probe 1, a good field of view of the cameras 22 with respect to the object to be inspected can be ensured.
  • On the other hand, since the ultrasonic sensor 21 is provided between the two cameras 22, the width of the tip 11 may increase.
  • Figure 18 shows an example of the placement of one camera 22.
  • A of Figure 18 shows an example in which one camera 22 (opening H) is provided on the front surface (tip portion 11) of the probe 1.
  • The upper part of A of Figure 18 shows a schematic diagram of the probe 1 viewed from above, and the lower part of A of Figure 18 shows a schematic diagram of the probe 1 viewed from the side.
  • one camera 22 is provided at either the left or right end inside the tip portion 11. It is to be noted that one camera 22 (opening H) may also be provided on the side surface of the tip portion 11.
  • B of FIG. 18 shows an example in which one camera 22 is provided on the upper surface of the support part 12 of the probe 1.
  • the upper part of FIG. 18B shows a schematic diagram of the probe 1 viewed from above
  • the lower part of FIG. 18B shows a schematic diagram of the probe 1 viewed from the side.
  • one camera 22 is configured integrally with the probe 1 but is provided outside the support part 12.
  • C of FIG. 18 shows another example in which one camera 22 is provided on the upper surface of the support part 12 of the probe 1.
  • the upper part of Fig. 18C shows a schematic diagram of the probe 1 seen from above
  • the lower part of Fig. 18C shows a schematic diagram of the probe 1 seen from the side.
  • one camera 22 is provided inside the support part 12.
  • When cameras 22 are provided on both the top and bottom surfaces of the probe 1, it is considered that the field of view of at least one of the cameras 22 will not be covered by the user's hand when the user grasps the support part 12. Therefore, in terms of always being able to ensure the field of view of a camera 22, it is preferable to provide cameras 22 on both the top and bottom surfaces of the probe 1 rather than on only one of them.
  • the camera 22 may be a fisheye camera or an omnidirectional camera integrated into the probe 1.
  • By capturing images with a fisheye camera or an omnidirectional camera, it is possible to capture a wide range of the external space of the probe 1, which improves the stability of the Visual SLAM technology that uses the captured images as input.
  • Figure 19 shows an example of the placement of two fisheye cameras 22A.
  • A of FIG. 19 shows an example in which two fisheye cameras 22A are provided on the front surface (tip portion 11) of the probe 1.
  • the upper part of A of FIG. 19 shows a schematic diagram of the probe 1 viewed from above, and the lower part of A of FIG. 19 shows a schematic diagram of the probe 1 viewed from the side.
  • the two fisheye cameras 22A are provided at the left and right ends inside the tip portion 11, and an ultrasonic sensor 21 is provided between the two fisheye cameras 22A.
  • B of FIG. 19 shows an example in which two fisheye cameras 22A are provided on the upper and lower surfaces of the support part 12 of the probe 1.
  • the upper part of Figure 19B shows a schematic diagram of the probe 1 viewed from above
  • the lower part of Figure 19B shows a schematic diagram of the probe 1 viewed from the side.
  • the two fisheye cameras 22A are configured integrally with the probe 1, but part of the body of the fisheye camera 22A is exposed outside the support part 12.
  • FIG. 19C shows another example in which two fisheye cameras 22A are provided on the upper and lower surfaces of the support portion 12 of the probe 1.
  • the upper part of FIG. 19C shows a schematic diagram of the probe 1 viewed from above, and the lower part of FIG. 19C shows a schematic diagram of the probe 1 viewed from the side.
  • the two fisheye cameras 22A are provided inside the support portion 12.
  • Figure 20 shows an example of the placement of one fisheye camera 22A.
  • A of FIG. 20 shows an example in which one fisheye camera 22A is provided on the front surface (tip portion 11) of the probe 1.
  • the upper part of Fig. 20A shows a schematic diagram of the probe 1 as viewed from above, and the lower part of Fig. 20A shows a schematic diagram of the probe 1 as viewed from the side.
  • one fisheye camera 22A is provided at either the left or right end inside the tip portion 11.
  • B of Figure 20 shows an example in which one fisheye camera 22A is provided on the upper surface of the support part 12 of the probe 1.
  • the upper part of B of Figure 20 shows a schematic diagram of the probe 1 viewed from above
  • the lower part of B of Figure 20 shows a schematic diagram of the probe 1 viewed from the side.
  • one fisheye camera 22A is configured integrally with the probe 1, but part of the body of the fisheye camera 22A is exposed to the outside of the support part 12.
  • C of FIG. 20 shows another example in which one fisheye camera 22A is provided on the upper surface of the support part 12 of the probe 1.
  • the upper part of Fig. 20C shows a schematic diagram of the probe 1 seen from above
  • the lower part of Fig. 20C shows a schematic diagram of the probe 1 seen from the side.
  • one fisheye camera 22A is provided inside the support part 12.
  • In the examples described above, multiple cameras 22 of the same type are configured integrally with the probe 1.
  • Alternatively, multiple types of cameras may be configured integrally with the probe 1 as the cameras 22.
  • For example, a fisheye camera and a camera with a normal lens are provided on the probe 1.
  • Figure 21 shows an example of the placement of multiple types of cameras.
  • FIG. 21A shows an example in which a fisheye camera 22A and a camera 22B with a normal lens are provided on the front surface (tip portion 11) of a probe 1.
  • the upper part of FIG. 21A shows a schematic diagram of the probe 1 viewed from above, and the lower part of FIG. 21A shows a schematic diagram of the probe 1 viewed from the side.
  • the fisheye camera 22A is provided at one of the left or right ends inside the tip portion 11, and the camera 22B is provided at the other left or right end inside the tip portion 11.
  • An ultrasonic sensor 21 is provided between the fisheye camera 22A and the camera 22B.
  • FIG. 21B shows an example in which fisheye camera 22A is provided on the upper surface of support part 12 of probe 1, and camera 22B is provided on the lower surface of support part 12 of probe 1.
  • the upper part of FIG. 21B shows a schematic diagram of probe 1 seen from above, and the lower part of FIG. 21B shows a schematic diagram of probe 1 seen from the side.
  • fisheye camera 22A is configured integrally with probe 1, but a part of the body of fisheye camera 22A is exposed to the outside of support part 12.
  • camera 22B is configured integrally with probe 1, but is provided outside support part 12.
  • fisheye camera 22A may be provided outside support part 12 like camera 22B, or a part of the body of camera 22B may be exposed to the outside of support part 12 like fisheye camera 22A.
  • FIG. 21C shows another example in which fisheye camera 22A is provided on the upper surface of support portion 12 of probe 1, and camera 22B is provided on the lower surface of support portion 12 of probe 1.
  • the upper part of FIG. 21C shows a schematic diagram of probe 1 viewed from above, and the lower part of FIG. 21C shows a schematic diagram of probe 1 viewed from the side.
  • fisheye camera 22A and camera 22B are provided inside support portion 12.
  • Example of displaying locations where previous ultrasound examinations have been performed: Locations on the human body where previous ultrasound examinations have been performed and the location on the human body where the current ultrasound examination is being performed may be displayed on the display of the information processing device 2.
  • Figure 22 shows an example of a screen that displays the positions on the human body where ultrasound examinations were performed.
  • the locations where ultrasound examinations have been performed in the past and the current location where ultrasound examinations are being performed are displayed superimposed on an image that, for example, diagrammatically represents the human body.
  • the dotted rounded rectangles indicate areas where ultrasound examinations have been performed in the past and ultrasound images have been recorded.
  • the dotted rounded rectangles are displayed as rounded rectangles with green lines, for example.
  • the darkness of the line color changes according to the external force acting on the probe 1 when the examination of that area was performed. For example, the darker the line color, the stronger the external force that was applied.
  • The same use of line-color darkness applies to the rounded rectangles drawn with lines of colors other than green, which are explained below.
  • The dashed rounded rectangle indicates an area where an ultrasound examination was previously performed while the patient was exhaling and an ultrasound image was recorded; it is displayed, for example, as a rounded rectangle with yellow lines. A rounded rectangle drawn with a different broken-line style indicates an area where an ultrasound examination was previously performed while the patient was inhaling and an ultrasound image was recorded; it is displayed, for example, as a rounded rectangle with purple lines.
  • the solid rounded rectangle indicates the area where the ultrasound examination was performed and the ultrasound image was recorded.
  • the solid rounded rectangle is displayed, for example, as a rounded rectangle with red lines.
  • the white or grey circle indicates the area where the ultrasound examination was performed and the ultrasound image was displayed on the display.
  • the circle is displayed, for example, as a circle with a blue interior.
  • the intensity of the colour inside the circle changes depending on the external force acting on the probe 1 when the examination of that area was performed. For example, the darker the colour inside the circle, the stronger the external force that was applied.
  • the positions on the human body where the ultrasound examination was performed are estimated based on the results of image recognition based on the image captured by the camera 22 and the measurement results of the relative position of the probe 1.
  • the user can compare the positions where ultrasound examinations were performed in the past with the positions where ultrasound examinations are being performed this time, and determine whether there are any areas that have been forgotten to be examined.
  • Example of recording an ultrasound image when the examination situation becomes substantially the same as a past examination situation: When performing an ultrasound examination of substantially the same part as a previously examined part, an ultrasound image may be recorded in the recording unit 54 of the information processing device 2 when the past examination situation and the current examination situation become substantially the same. For example, an ultrasound image is recorded when the posture of the person being examined is substantially the same in the past examination and the current examination.
  • the examination conditions include, for example, the posture of the person being examined, the breathing state of the person being examined, the position on the human body to which the probe 1 is pressed, and the external force acting on the probe 1.
  • Whether the posture of the person being tested is approximately the same in the past and present can be determined using, for example, Visual SLAM technology and image recognition.
  • the breathing state of the person being examined indicates whether the person is inhaling, exhaling, or in a normal state. Whether the breathing state of the person being examined is substantially the same in the past and present is determined, for example, by using Visual SLAM technology and audio picked up by a microphone (not shown) installed in the information processing device 2.
  • Whether the position on the human body where the probe 1 is pressed is substantially the same in the past and present is determined, for example, by using Visual SLAM technology and the similarity of ultrasound images. Whether the external force acting on the probe 1 is substantially the same in the past and present is determined, for example, based on the light point cloud captured in the image captured by the camera 22.
  • the user can easily obtain ultrasound images taken under examination conditions that are approximately the same as those of previous examinations.
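  • A minimal sketch of the decision to record when the examination situations match is given below; the state fields, tolerances, and the commented recording call are assumptions chosen for illustration, not values or interfaces defined by the device.

```python
# Illustrative sketch: decide whether the current examination situation is
# "substantially the same" as a recorded past one. Field names and tolerances
# are assumptions for illustration, not values defined by the device.
from dataclasses import dataclass
import numpy as np

@dataclass
class ExamState:
    posture_id: str             # e.g. "supine", "left_lateral" (from image recognition)
    breathing: str              # "inhaling", "exhaling" or "normal"
    probe_position: np.ndarray  # relative probe position from Visual SLAM [mm]
    external_force: np.ndarray  # measured external force on the probe [N]

def is_substantially_same(past: ExamState, now: ExamState,
                          pos_tol_mm: float = 10.0,
                          force_tol_n: float = 2.0) -> bool:
    return (past.posture_id == now.posture_id
            and past.breathing == now.breathing
            and np.linalg.norm(past.probe_position - now.probe_position) <= pos_tol_mm
            and np.linalg.norm(past.external_force - now.external_force) <= force_tol_n)

# Example usage: trigger recording when all conditions line up.
# if is_substantially_same(past_state, current_state):
#     recording_unit.save(ultrasound_image)   # hypothetical recording call
```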
  • Example of managing information on the person being examined: The data acquisition unit 51 of the information processing device 2 can acquire patient information on the patient being examined by photographing a document or screen containing a barcode, a two-dimensional code, another special code, a character string, or the like with the camera 22 and analyzing the code or text that appears in the captured image.
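  • One concrete possibility for the code-reading step is OpenCV's built-in QR detector, as sketched below; the capture and chart-lookup calls in the usage comments are hypothetical.

```python
# One possible way to read a two-dimensional code from the probe camera image,
# using OpenCV's built-in QR detector. The patient-ID usage is an assumption.
import cv2

def read_patient_code(frame_bgr):
    """Return the decoded string from a QR code in the frame, or None."""
    detector = cv2.QRCodeDetector()
    text, points, _ = detector.detectAndDecode(frame_bgr)
    return text if text else None

# frame = camera_22.capture()          # hypothetical capture call
# patient_id = read_patient_code(frame)
# if patient_id:
#     chart = ehr.lookup(patient_id)   # hypothetical chart lookup
```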
  • the face of the patient who is a subject of examination may be photographed with the camera 22, and the patient's face may be recognized by image recognition based on the photographed image, thereby acquiring patient information corresponding to the recognized patient's face.
  • the information processing device 2 can register ultrasound images and the measurement results of the external force acting on the probe 1 as examination data in a medical chart linked to the patient information. Since the user does not need to manually register ultrasound images and the like in the medical chart, it is possible to reduce the user's effort and errors.
  • the information processing device 2 can also obtain examination data from past ultrasound examinations along with patient information based on the results of reading barcodes and the like and the results of recognizing the patient's face.
  • the information processing device 2 can obtain individual settings for a user based on the results of reading a barcode or recognizing the user's face, and can register test data linked to the user. For example, if a user has color vision deficiency, colors that are easy for the user to read are displayed based on the user's individual settings. For example, the display of various types of information that can serve as a reference for how to use the probe 1 is turned on or off depending on the user's level of familiarity with the medical field.
  • Information about the test subject or user including patient information, information indicating the registration destination of test data, information indicating individual settings for the user, etc., is acquired, for example, by the data acquisition unit 51 of the information processing device 2.
  • FIG. 23 is a diagram showing an example of an ultrasound examination screen displayed on the display of the information processing device 2 during ultrasound examination.
  • the left side of the ultrasound examination screen displays an app (application) area A101, which allows the user to input operations and check the results of measurements taken by various sensors provided in the probe 1, for example.
  • a graph G101 showing the time series changes in the external force acting on the probe 1 is displayed below the app area A101.
  • a probe field of view image area A102 for displaying, for example, an image captured by the camera 22 provided on the probe 1 is displayed in the upper right-hand corner of the app area A101.
  • An image P101 captured by the camera 22 is displayed in the upper part of the probe field of view image area A102, and buttons B101 to B104 are displayed in the lower part.
  • a force meter indicating the magnitude and direction of the external force acting on the probe 1 is superimposed on the captured image P101.
  • When button B101 is pressed, for example, the captured image P101 is displayed with its left and right sides reversed, and when button B102 is pressed, the captured image P101 is displayed with its top and bottom reversed.
  • When button B103 is pressed, for example, the force sense meter is displayed with its left and right sides reversed, and when button B104 is pressed, the force sense meter is displayed with its top and bottom reversed.
  • the direction in which the user is looking differs by 180 degrees from the shooting direction of the camera 22.
  • the up, down, left, and right directions as seen by the user may differ from the up, down, left, and right directions in the image captured by the camera 22.
  • By inverting the captured image left-right or top-bottom using a GUI (Graphical User Interface) such as buttons B101 to B104, it is possible to prevent the user from becoming confused, and work efficiency from decreasing, when the up/down/left/right directions as seen by the user differ from those in the image captured by the camera 22.
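  • A minimal sketch of the mirroring behind such buttons is shown below, assuming the captured image is handled as an OpenCV array; the function name is illustrative.

```python
# Minimal sketch of the mirroring behind buttons B101-B104: flip the captured
# image left-right or top-bottom so that it matches the user's viewpoint.
import cv2

def flip_view(image, flip_lr: bool, flip_tb: bool):
    if flip_lr:
        image = cv2.flip(image, 1)  # 1 = flip around the vertical axis
    if flip_tb:
        image = cv2.flip(image, 0)  # 0 = flip around the horizontal axis
    return image
```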
  • an overhead image P102 capturing an entire image of a person to be examined is displayed.
  • the overhead image P102 is captured, for example, by a camera provided in the information processing device 2 or an external camera.
  • an ultrasound image area A103 for displaying, for example, an ultrasound image is displayed.
  • various numerical values, graphs G101, captured images P101, overhead images P102, ultrasound images, etc. are all displayed synchronously.
  • FIG. 24 is a diagram for explaining details of the app area A101.
  • a display example of the upper part of the app area A101 excluding the graph G101 is shown.
  • a button B111 is displayed at the top center of the application area A101 for setting target values for the external force acting on the probe 1, the measurement value of the relative position of the probe 1, etc.
  • a button B112 is displayed for saving logs of captured images and the measurement results of the external forces acting on the probe.
  • a button B113 is displayed for switching whether or not to continue saving logs of the measurement results of the external forces acting on the probe and the measurement results from the inertial sensor.
  • A check box C101 for reading a two-dimensional code and a check box for reading a character string are displayed.
  • checkboxes C102 are checkboxes for selecting items to be displayed in the graph G101.
  • FIG. 25 is a diagram showing an example of a graph G101 showing the time series change in the external force acting on the probe 1.
  • the horizontal axis shows the time
  • the vertical axis shows the magnitude of the force.
  • the example in Figure 25 shows the time series changes in the force (moment) Flx in the x-axis direction, the force (moment) Fly in the y-axis direction, and the force Flz in the z-axis direction acting on the left side of the probe 1, as well as the force (moment) Frx in the x-axis direction, the force (moment) Fry in the y-axis direction, and the force Frz in the z-axis direction acting on the right side of the probe 1.
  • The force range displayed in graph G101 may be specified in any of the following ways:
    - The range is dynamically and automatically adjusted according to the magnitude of the force currently acting.
    - The maximum and minimum values are fixed according to the magnitude of the force that can be handled.
    - The user inputs the maximum and minimum values.
    - The range is automatically adjusted to the expected range according to the area being examined during the ultrasound examination.
    - The range is automatically adjusted to the expected range based on past measurement results.
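  • As a rough sketch of how the dynamically adjusted option above could behave, the displayed range can track a recent window of force samples with a small margin; the window size, margin, and 0.5 N floor below are illustrative assumptions, not values defined by the device.

```python
# Illustrative sketch of the "dynamically adjusted" range option for graph G101:
# fit the displayed range to recent force samples with a small margin.
from collections import deque

class DynamicRange:
    def __init__(self, window: int = 500, margin: float = 0.1):
        self.samples = deque(maxlen=window)  # recent force samples
        self.margin = margin

    def update(self, force_value: float):
        self.samples.append(force_value)
        lo, hi = min(self.samples), max(self.samples)
        pad = max((hi - lo) * self.margin, 0.5)  # 0.5 N floor, an assumption
        return lo - pad, hi + pad  # (min, max) of the displayed axis
```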
  • Figure 26 shows an example of the force meter display.
  • the two force meters 351 indicate the external forces acting on the left and right sides of the probe 1.
  • the two force meters 351 are displayed, for example, superimposed on the left and right areas, respectively, of the captured image P101 in which the light spot cloud appears. Since the user does not need to check the light spot cloud, the area in the captured image in which the light spot cloud appears is actually masked. Below, the area in the captured image that is not masked and in which only the external space of the probe 1 appears is referred to as the field of view of the captured image.
  • the force sense meter 351 is composed of a circle and an arrow extending from the center of the circle.
  • the size of the circle indicates the magnitude of the force (Flz, Frz) pressing into either the left or right side of the probe 1.
  • the length of the arrow indicates the magnitude of the force (Flx, Fly, Frx, Fry) acting in the up, down, left and right directions, and the direction of the arrow indicates the direction of the force acting in the up, down, left and right directions.
  • If the measurement results of the external forces acting on the probe 1 were displayed only as numbers, they would be difficult for the user to grasp; a GUI such as the force meter conveys them more intuitively.
  • one force meter 352 indicates the resultant force of the external forces acting on the left and right sides of the probe 1.
  • the force sense meter 352 is composed of a circle and an arrow extending from the center of the circle.
  • the size of the circle indicates the magnitude of the resultant force pushing on the left and right sides of the probe 1.
  • the length of the arrow indicates the magnitude of the resultant force acting in the up, down, left and right directions on the left and right sides of the probe 1, and the direction of the arrow indicates the direction of the resultant force acting in the up, down, left and right directions.
  • the force meter 352 is displayed with its left and right position shifted to the side with the stronger force pushing the probe 1, and its top and bottom position shifted toward the part where the tip 11 is in contact with the test subject.
  • the top and bottom position of the force meter 352 may be fixed to the center of the captured image.
  • When the force meters 351 are displayed on the left and right of the captured image, the user must move their line of sight left and right; with the single force meter 352, the user can check the external force acting on the probe 1 by looking at one location (near the force meter 352).
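  • The circle-and-arrow force meters described above can be overlaid on the captured image with ordinary drawing primitives. The sketch below uses OpenCV; the Newton-to-pixel scaling and the fixed color are illustrative assumptions.

```python
# Sketch of overlaying a circle-plus-arrow force meter on the captured image.
# The scaling of Newtons to pixels is an illustrative assumption.
import cv2

def draw_force_meter(image, center, fz, fx, fy,
                     px_per_newton=10.0, color=(255, 0, 0)):
    """Circle radius shows the pressing force fz; the arrow shows fx/fy."""
    cx, cy = int(center[0]), int(center[1])
    radius = max(int(abs(fz) * px_per_newton), 5)
    cv2.circle(image, (cx, cy), radius, color, thickness=2)
    tip = (int(cx + fx * px_per_newton), int(cy - fy * px_per_newton))
    cv2.arrowedLine(image, (cx, cy), tip, color, thickness=2)
    return image
```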
  • When there is an area to be examined next, a guide for moving the probe 1 to that area may be displayed superimposed on the image captured by the camera 22.
  • The direction and distance in which to move the probe 1 to the area to be examined are obtained based on an overhead image captured by the camera 22 when the user moves the probe 1 away from the person's body, and on the relative position of the probe 1 with respect to the examination target measured using Visual SLAM technology.
  • FIG. 27 shows an example of a guide display for moving the probe 1 to the area to be examined.
  • the area to be examined is a part of a person's arm.
  • an arrow A121 is displayed as a guide, superimposed on the captured image P101 of the camera 22.
  • the arrow A121 is superimposed, for example, in the center of the captured image P101.
  • the arrow A121 may be displayed in an emphasized manner, for example, by being displayed large or blinking on the captured image P101.
  • a rectangular frame R101 that surrounds the part that is expected to appear in the field of view of the captured image P101 when the probe 1 comes into contact with the subject is displayed superimposed on the captured image P101 as a guide.
  • the user can bring the probe 1 into contact with a part of the arm to be examined by moving the probe 1 so that the frame R101 overlaps with the field of view of the captured image P101.
  • the frame R101 may be highlighted by blinking or the like.
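  • A guide arrow such as A121 can be derived from the displacement between the current probe position and the target area. The sketch below assumes both positions are available in a common two-dimensional coordinate frame (for example, from Visual SLAM and the overhead image); the scale factor and drawing style are illustrative assumptions.

```python
# Sketch of deriving the guide arrow A121 from the displacement between the
# current probe position and the target area. Names and scaling are assumptions.
import cv2
import numpy as np

def draw_move_guide(image, probe_pos_xy, target_pos_xy, scale_px_per_mm=2.0):
    """Draw an arrow from the image centre toward the area to be examined."""
    h, w = image.shape[:2]
    center = (w // 2, h // 2)
    dx, dy = (np.asarray(target_pos_xy) - np.asarray(probe_pos_xy)) * scale_px_per_mm
    tip = (int(center[0] + dx), int(center[1] + dy))
    cv2.arrowedLine(image, center, tip, (0, 255, 0), thickness=3, tipLength=0.3)
    return image
```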
  • a guide to indicate the target value of the force with which the probe 1 is pressed against the subject may be superimposed on the image captured by the camera 22.
  • Such a guide is displayed, for example, by pressing button B111 in FIG. 24.
  • the target value is set, for example, by pressing a button to select from among predetermined options, or by selecting from the results of measuring external forces when an ultrasound image was previously captured.
  • FIG. 28 shows an example of a guide display that shows the target value of the force for pressing the probe 1.
  • a guide meter 353 indicating the target value of the force for pushing the probe 1 is displayed together with a force sense meter 352 superimposed on the image P101 captured by the camera 22.
  • the guide meter 353 is also configured by combining a circle with an arrow extending from the center of the circle.
  • the size of the circle, the direction of the arrow, and the size of the arrow on the guide meter 353 correspond to the size of the circle, the direction of the arrow, and the size of the arrow on the force sense meter 352.
  • the guide meter 353 is displayed as a line in an inconspicuous color, such as a light color.
  • When the force pressing the probe 1 is weaker than the target value, the force meter 352 is displayed smaller than the guide meter 353, and the circle of the force meter 352 is displayed in, for example, blue.
  • When the pressing force is approximately equal to the target value, the force meter 352 is displayed at approximately the same size as the guide meter 353, and the circle of the force meter 352 is displayed in, for example, green.
  • When the pressing force is stronger than the target value, the force meter 352 is displayed larger than the guide meter 353, and the circle of the force meter 352 is displayed in, for example, red.
  • the user can adjust the force with which the probe 1 is pressed so that the force meter 352 and the guide meter 353 overlap, allowing the user to press the probe 1 into the test subject with an appropriate force.
  • When the force with which the probe 1 is pressed deviates from the target value, an alert may be issued to the user, such as by emitting a sound.
  • the color of the arrow on the force meter 352 may change according to the force with which the probe 1 is pressed, in the same way as the color of the circle.
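  • The colour feedback of the force meter 352 relative to the guide meter 353 can be reduced to a simple comparison between the measured pressing force and the target value. The thresholds in the sketch below are illustrative assumptions; the actual device may use different criteria.

```python
# Illustrative colour selection for the force meter relative to the guide
# meter: blue when pressing too weakly, green when close to the target,
# red when pressing too strongly. Thresholds are assumptions.
def force_meter_color(measured_force: float, target_force: float,
                      tolerance: float = 0.15):
    if target_force <= 0:
        return (255, 0, 0)   # BGR blue: no meaningful target set
    ratio = measured_force / target_force
    if ratio < 1.0 - tolerance:
        return (255, 0, 0)   # blue: weaker than target
    if ratio > 1.0 + tolerance:
        return (0, 0, 255)   # red: stronger than target
    return (0, 255, 0)       # green: close to the target
```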
  • the information processing device 2 may be configured, for example, by a single-board computer and a PC.
  • the single-board computer is connected to the probe 1 and the PC, and the PC is connected to the probe 1 and the single-board computer.
  • the single-board computer for example, adjusts the light intensity of the LEDs provided inside the camera 22 based on commands from the PC.
  • the single-board computer for example, obtains numerical data indicating the acceleration, angular velocity, and temperature of the probe 1 from sensors provided in the probe 1 and transmits the data to the PC.
  • the PC for example, acquires captured images from the imaging unit 140 of the camera 22, and estimates the external force acting on the probe 1 using a learning model that uses the captured images as input.
  • the PC for example, acquires numerical data transmitted from a single-board computer.
  • the PC displays, for example, the external force acting on the probe 1, the acceleration, angular velocity, temperature, etc. of the probe 1 on the ultrasound examination screen.
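  • As a heavily simplified sketch of that PC-side inference step, a learning model can map the captured light-spot image to the six force components Flx, Fly, Flz, Frx, Fry, and Frz. The placeholder network below only illustrates the data flow; it is not the trained model actually used by the device.

```python
# Hedged sketch of the PC-side inference step: a learning model that maps the
# captured light-spot image to six force components. Placeholder architecture.
import torch
import torch.nn as nn

class ForceNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, 6)  # Flx, Fly, Flz, Frx, Fry, Frz

    def forward(self, x):
        return self.head(self.features(x))

model = ForceNet().eval()
with torch.no_grad():
    frame = torch.rand(1, 1, 128, 128)  # grayscale captured image (dummy data)
    flx, fly, flz, frx, fry, frz = model(frame).squeeze(0).tolist()
```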
  • An IMU (Inertial Measurement Unit) may be used, for example, as the inertial sensor of the probe 1. The absolute attitude can be acquired by the inertial sensor, making it possible to accurately measure and compensate for the effects of gravity. It is also possible to measure the absolute amount of movement of the probe 1 even when the person being examined and the probe 1 move at the same time, and to measure the elasticity of the skin based on the measurement results from the inertial sensor.
  • the above-described series of processes can be executed by hardware or software.
  • the program constituting the software is installed from a program recording medium into a computer incorporated in dedicated hardware, or into a general-purpose personal computer, etc.
  • FIG. 29 is a block diagram showing an example of the hardware configuration of a computer that executes the above-mentioned series of processes using a program.
  • A CPU (Central Processing Unit) 501, a ROM (Read Only Memory) 502, and a RAM (Random Access Memory) 503 are interconnected by a bus 504.
  • An input/output interface 505 is also connected to the bus 504. Connected to the input/output interface 505 are an input unit 506 consisting of a keyboard, a mouse, and the like, and an output unit 507 consisting of a display, speakers, and the like. Also connected to the input/output interface 505 are a storage unit 508 consisting of a hard disk or non-volatile memory, a communication unit 509 consisting of a network interface and the like, and a drive 510 that drives removable media 511.
  • the CPU 501 for example, loads a program stored in the storage unit 508 into the RAM 503 via the input/output interface 505 and the bus 504 and executes the program, thereby performing the above-mentioned series of processes.
  • the programs executed by the CPU 501 are provided, for example, by being recorded on removable media 511, or via a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting, and are installed in the storage unit 508.
  • the program executed by the computer may be a program in which processing is performed chronologically in the order described in this specification, or it may be a program in which processing is performed in parallel or at the required timing, such as when called.
  • each step described in the above flowchart can be executed by a single device, or can be shared and executed by multiple devices.
  • one step includes multiple processes
  • the processes included in that one step can be executed by one device, or can be shared and executed by multiple devices.
  • Example of combinations of configurations: The present technology can also have the following configurations.
  • an ultrasonic sensor that emits ultrasonic waves to an object to be inspected and receives the reflected waves of the ultrasonic waves that are reflected by the object to be inspected and return; a camera that is integrated with the probe provided with the ultrasonic sensor and captures at least an image of the vicinity of the direction in which the probe contacts the inspection object; and a calculation unit that measures a relative position of the probe with respect to the object to be inspected based on an image captured by the camera.
  • the camera photographs an external space of the probe and photographs a light point group of light emitted from a light source provided inside the probe and reflected in a reflection space provided inside the probe
  • the ultrasonic inspection device according to (1) wherein the calculation unit measures an external force acting on the probe based on a displacement of the light spot cloud captured in the captured image.
  • the ultrasonic inspection device according to (2) further comprising an inertial sensor that measures at least one of an angular velocity of the probe and an acceleration of the probe.
  • the calculation unit measures elasticity of the inspection object based on the measurement result of the external force and the measurement result by the inertial sensor.
  • the ultrasonic inspection device described in (3) or (4) further includes a recording unit that records, in association with each other, at least two or more pieces of data among an ultrasonic image generated based on the reflected wave received by the ultrasonic sensor, the measurement result of the external force, the captured image, the measurement result of the relative position, and the measurement result by the inertial sensor.
  • a recording unit that records, in association with each other, at least two or more pieces of data among an ultrasonic image generated based on the reflected wave received by the ultrasonic sensor, the measurement result of the external force, the captured image, the measurement result of the relative position, and the measurement result by the inertial sensor.
  • the ultrasonic inspection device described in (6) further comprising a presentation control unit that presents a guide to a user for bringing the measurement value closer to the target value based on a difference between at least any one of the measurement values of the relative position, the external force, the angular velocity of the probe, and the acceleration and the target value.
  • the ultrasonic inspection device described in any of (3) to (7) further includes a communication unit that transmits at least two or more pieces of data among an ultrasonic image generated based on the reflected wave received by the ultrasonic sensor, the captured image, the measurement result of the relative position, the measurement result of the external force, and the measurement result by the inertial sensor to an external device.
  • the communication unit receives information indicating advice on how to use the probe or information indicating an instruction to record the ultrasound image from the external device; a presentation control unit that presents the advice to a user;
  • the ultrasonic inspection device described in (8) further comprises a recording unit that, when information indicating an instruction to record the ultrasonic image is transmitted from the external device, records the ultrasonic image, the captured image, and the measurement results of the external force in association with each other.
  • the camera captures at least one of a barcode, a two-dimensional code, a special code, a character string, a face of the inspection target, and a face of a user;
  • the ultrasonic inspection device according to any one of (1) to (12), wherein a cover portion for covering the camera is attached movably relative to the probe.
  • the ultrasonic inspection device according to any one of (1) to (13), wherein the camera is a fisheye camera or an omnidirectional camera.
  • the calculation unit performs image recognition based on the captured image to recognize the inspection object appearing in the captured image,
  • the ultrasonic inspection device according to any one of (1) to (15), wherein the camera is provided on at least one of a tip portion, a bottom surface, and a front surface of the probe.
  • the probe is provided with a plurality of the cameras.
  • the recording unit records the ultrasonic image when a past inspection situation and a current inspection situation are substantially the same.
  • An inspection method comprising: emitting, from an ultrasonic sensor provided on a probe, ultrasonic waves to an object to be inspected and receiving the reflected waves of the ultrasonic waves that are reflected by the object to be inspected and return; and measuring a relative position of the probe with respect to the object to be inspected based on an image captured by a camera that is integrated with the probe and captures at least an image of the vicinity of the direction in which the probe contacts the object to be inspected.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Immunology (AREA)
  • General Physics & Mathematics (AREA)
  • Biochemistry (AREA)
  • Analytical Chemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Radiology & Medical Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biophysics (AREA)
  • Acoustics & Sound (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

The present technology relates to an ultrasonic inspection device, an inspection method, and a program that make the ultrasonic inspection device easier to use. This ultrasonic inspection device comprises: an ultrasonic sensor that emits ultrasonic waves toward an object to be inspected and receives reflected waves of the ultrasonic waves reflected by the object to be inspected; a camera that is integrated with a probe equipped with the ultrasonic sensor and that captures at least the vicinity of the direction in which the probe comes into contact with the object to be inspected; and a calculation unit that measures the relative position of the probe with respect to the object to be inspected on the basis of an image captured by the camera. The present technology can be applied, for example, to a portable ultrasonic inspection device that captures ultrasound images.
PCT/JP2023/036673 2022-10-26 2023-10-10 Dispositif d'inspection ultrasonore, procédé d'inspection et programme WO2024090190A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022171639 2022-10-26
JP2022-171639 2022-10-26

Publications (1)

Publication Number Publication Date
WO2024090190A1 true WO2024090190A1 (fr) 2024-05-02

Family

ID=90830706

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/036673 WO2024090190A1 (fr) 2022-10-26 2023-10-10 Dispositif d'inspection ultrasonore, procédé d'inspection et programme

Country Status (1)

Country Link
WO (1) WO2024090190A1 (fr)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004201722A (ja) * 2002-12-20 2004-07-22 Ge Medical Systems Global Technology Co Llc 超音波診断装置
US20070167709A1 (en) * 2000-12-28 2007-07-19 Guided Therapy Systems, Inc. Visual imaging system for ultrasonic probe
CN101569541A (zh) * 2008-04-29 2009-11-04 香港理工大学 三维超声波成像系统
US20130237811A1 (en) * 2012-03-07 2013-09-12 Speir Technologies Inc. Methods and systems for tracking and guiding sensors and instruments
JP2015181660A (ja) * 2014-03-24 2015-10-22 キヤノン株式会社 被検体情報取得装置および乳房検査装置
CN204723091U (zh) * 2015-06-12 2015-10-28 成都迈迪特科技有限公司 带有摄像头的膀胱扫描仪
US20170105701A1 (en) * 2015-10-19 2017-04-20 Clarius Mobile Health Corp. Systems and methods for remote graphical feedback of ultrasound scanning technique
KR101792952B1 (ko) * 2016-06-27 2017-11-01 고려대학교 산학협력단 네비게이션 기반의 초음파 촬영장치
JP2019521745A (ja) * 2016-06-20 2019-08-08 バタフライ ネットワーク,インコーポレイテッド ユーザによる超音波装置の操作を援助するための自動画像取得
JP2020049062A (ja) * 2018-09-28 2020-04-02 ゼネラル・エレクトリック・カンパニイ 超音波画像表示装置
KR20200101086A (ko) * 2019-02-19 2020-08-27 삼성메디슨 주식회사 초음파 진단 장치
CN112472133A (zh) * 2020-12-22 2021-03-12 深圳市德力凯医疗设备股份有限公司 一种超声探头的姿态监控方法及装置
WO2021085186A1 (fr) * 2019-10-31 2021-05-06 ソニー株式会社 Dispositif capteur
WO2022249537A1 (fr) * 2021-05-24 2022-12-01 ソニーグループ株式会社 Dispositif capteur

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070167709A1 (en) * 2000-12-28 2007-07-19 Guided Therapy Systems, Inc. Visual imaging system for ultrasonic probe
JP2004201722A (ja) * 2002-12-20 2004-07-22 Ge Medical Systems Global Technology Co Llc 超音波診断装置
CN101569541A (zh) * 2008-04-29 2009-11-04 香港理工大学 三维超声波成像系统
US20130237811A1 (en) * 2012-03-07 2013-09-12 Speir Technologies Inc. Methods and systems for tracking and guiding sensors and instruments
JP2015181660A (ja) * 2014-03-24 2015-10-22 キヤノン株式会社 被検体情報取得装置および乳房検査装置
CN204723091U (zh) * 2015-06-12 2015-10-28 成都迈迪特科技有限公司 带有摄像头的膀胱扫描仪
US20170105701A1 (en) * 2015-10-19 2017-04-20 Clarius Mobile Health Corp. Systems and methods for remote graphical feedback of ultrasound scanning technique
JP2019521745A (ja) * 2016-06-20 2019-08-08 バタフライ ネットワーク,インコーポレイテッド ユーザによる超音波装置の操作を援助するための自動画像取得
KR101792952B1 (ko) * 2016-06-27 2017-11-01 고려대학교 산학협력단 네비게이션 기반의 초음파 촬영장치
JP2020049062A (ja) * 2018-09-28 2020-04-02 ゼネラル・エレクトリック・カンパニイ 超音波画像表示装置
KR20200101086A (ko) * 2019-02-19 2020-08-27 삼성메디슨 주식회사 초음파 진단 장치
WO2021085186A1 (fr) * 2019-10-31 2021-05-06 ソニー株式会社 Dispositif capteur
CN112472133A (zh) * 2020-12-22 2021-03-12 深圳市德力凯医疗设备股份有限公司 一种超声探头的姿态监控方法及装置
WO2022249537A1 (fr) * 2021-05-24 2022-12-01 ソニーグループ株式会社 Dispositif capteur

Similar Documents

Publication Publication Date Title
US10863898B2 (en) System and method for determining distances from an object
US9345401B2 (en) Handheld vision tester and calibration thereof
EP3402384B1 (fr) Systèmes et procédés de détermination de la distance par rapport à un objet
CN108604116A (zh) 能够进行眼睛追踪的可穿戴设备
WO2016183537A1 (fr) Dispositif de balayage biométrique portable
KR101985438B1 (ko) 신체 능력의 측정 및 증진을 위한 스마트 기기
CN108292448A (zh) 信息处理装置、信息处理方法和程序
JP2023523317A (ja) 内部体器官の超音波画像を取得するためのシステム
JP5857805B2 (ja) カメラ較正装置
WO2024090190A1 (fr) Dispositif d'inspection ultrasonore, procédé d'inspection et programme
US11579449B2 (en) Systems and methods for providing mixed-reality experiences under low light conditions
JP7209954B2 (ja) 眼振解析システム
TW201720364A (zh) 色覺檢測方法與系統
JP5884564B2 (ja) 診断支援装置
US20240122469A1 (en) Virtual reality techniques for characterizing visual capabilities
JP5480028B2 (ja) 見做し注視点検出装置