WO2024090190A1 - Ultrasonic inspection device, inspection method, and program - Google Patents


Publication number
WO2024090190A1
Authority
WO
WIPO (PCT)
Prior art keywords
probe
ultrasonic
camera
image
inspection device
Prior art date
Application number
PCT/JP2023/036673
Other languages
French (fr)
Japanese (ja)
Inventor
一生 本郷
裕之 鎌田
直子 小林
務 澤田
博之 茂井
Original Assignee
ソニーグループ株式会社 (Sony Group Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーグループ株式会社 (Sony Group Corporation)
Publication of WO2024090190A1 publication Critical patent/WO2024090190A1/en

Definitions

  • This technology relates to an ultrasonic inspection device, an inspection method, and a program, and in particular to an ultrasonic inspection device, an inspection method, and a program that make it easier to use the ultrasonic inspection device.
  • Ultrasound examination devices, which use ultrasound to image internal organs and other objects and obtain ultrasound images, have been widely used in the medical field.
  • In such examinations, the posture of the probe, which transmits ultrasound and receives its reflected waves, and the relative position of the probe with respect to the person being examined may be used (see, for example, Patent Document 1).
  • When an ultrasound examination device is used in a hospital, the posture of the probe and its relative position with respect to the person being examined are obtained, for example, using a camera installed near the hospital bed.
  • An ultrasonic inspection device includes an ultrasonic sensor that emits ultrasonic waves to an object to be inspected and receives the reflected waves of the ultrasonic waves that are reflected back from the object to be inspected, a camera that is integrated with a probe on which the ultrasonic sensor is provided and captures at least an image of the area in the direction in which the probe contacts the object to be inspected, and a calculation unit that measures the relative position of the probe with respect to the object to be inspected based on the image captured by the camera.
  • In the inspection method, an ultrasonic inspection device measures the relative position of the probe with respect to the object to be inspected based on an image captured by a camera that is integrated with a probe equipped with an ultrasonic sensor, which emits ultrasonic waves to the object to be inspected and receives their reflected waves, the camera capturing at least the area in the direction in which the probe comes into contact with the object to be inspected.
  • The program causes a computer to execute a process of measuring the relative position of the probe with respect to the object to be inspected based on an image captured by a camera that is integrated with a probe equipped with an ultrasonic sensor, which emits ultrasonic waves to the object to be inspected and receives their reflected waves, the camera capturing at least the area in the direction in which the probe comes into contact with the object to be inspected.
  • In the present technology, a camera is integrated with a probe equipped with an ultrasonic sensor, which emits ultrasonic waves to the object to be inspected and receives their reflected waves, and the relative position of the probe with respect to the object to be inspected is measured based on images taken by the camera, which captures at least the area in the direction in which the probe comes into contact with the object to be inspected.
  • FIG. 1 is a diagram illustrating an example of the configuration of an ultrasonic inspection device according to an embodiment of the present technology.
  • FIG. 2 is a perspective view showing an example of the appearance of a probe.
  • FIG. 3 is a perspective view of the probe as viewed from the tip end side.
  • FIG. 4 is a diagram showing an example of how to use the probe.
  • FIG. 5 is a cross-sectional view showing a schematic configuration of the camera.
  • FIG. 6 is a diagram showing a state in which light is emitted into a space surrounded by three mirrors facing each other.
  • FIG. 7 is a diagram illustrating photographing by the camera and external forces acting on the camera.
  • FIG. 8 is a diagram showing an example of an image captured by the camera.
  • FIG. 1 is a block diagram showing an example of the configuration of an information processing device.
  • FIG. 10 is a flowchart illustrating processing performed by an ultrasonic inspection device according to the present technology.
  • FIG. 1 is a diagram showing an example of a situation in which an ultrasonic inspection device is used.
  • FIGS. 1A and 1B are diagrams for explaining a method for measuring the rigidity of an inspection object.
  • FIG. 13 is a diagram showing an example of the appearance of a probe with a lid portion attached.
  • FIG. 1 is a diagram for explaining remote medical treatment using an ultrasound examination device according to the present technology.
  • FIGS. 10A and 10B are diagrams for explaining a case where the ultrasonic sensor operates normally and a case where the operation of the ultrasonic sensor is stopped.
  • FIG. 2 is a diagram showing an example of the arrangement of two cameras.
  • FIG. 1 is a diagram showing an example of the arrangement of one camera.
  • FIG. 1 is a diagram showing an example of the arrangement of two fisheye cameras.
  • FIG. 2 is a diagram showing an example of the arrangement of one fish-eye camera.
  • FIG. 1 is a diagram showing an example of an arrangement of a plurality of types of cameras.
  • FIG. 13 is a diagram showing an example of a screen for presenting positions on the human body where ultrasound examinations have been performed.
  • FIG. 13 is a diagram showing an example of an ultrasound examination screen displayed on a display of an information processing device during ultrasound examination.
  • FIG. 13 is a diagram illustrating details of an application area.
  • FIG. 13 is a diagram showing an example of a graph illustrating a time series change in an external force acting on a probe.
  • FIG. 13 is a diagram showing an example of the display of a force sense meter.
  • FIGS. 13A and 13B are diagrams showing an example of a display of a guide for moving a probe to a region to be examined.
  • FIGS. 13A and 13B are diagrams showing an example of a guide display for presenting a target value of the force for pressing the probe.
  • FIG. 2 is a block diagram showing an example of the hardware configuration of a computer.
  • FIG. 1 is a diagram illustrating an example of the configuration of an ultrasonic inspection device according to an embodiment of the present technology.
  • the ultrasound examination device of the present technology is a portable device used to take and examine ultrasound images that show the internal state of each part of the body, such as the abdomen, of a person being examined.
  • the ultrasound examination device of the present technology is composed of a probe 1 and an information processing device 2.
  • the probe 1 and the information processing device 2 are connected via a wired or wireless communication path.
  • the probe 1 emits ultrasonic waves to the object to be inspected and receives the reflected ultrasonic waves.
  • the probe 1 measures the intensity of the received reflected waves and supplies ultrasonic measurement data, which is data showing the measurement results of the intensity of the reflected waves over time, to the information processing device 2.
  • the information processing device 2 is composed of a tablet terminal, a smartphone, a notebook PC (Personal Computer), a dedicated terminal, etc.
  • the information processing device 2 generates an ultrasound image based on the ultrasound measurement data supplied from the probe 1, and displays the ultrasound image on a display provided in the information processing device 2, for example.
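The patent does not specify how the information processing device 2 forms an image from the reflected-wave intensity data. As a loose illustration only, the sketch below builds a B-mode-style image from per-scan-line intensity data via crude envelope detection and log compression; the function name and parameters are assumptions, not from the patent.

```python
import numpy as np

def ascan_to_bmode(rf_lines, dynamic_range_db=60.0):
    """Form a B-mode-style image from raw reflected-wave intensity
    lines (one row per scan line): crude envelope detection by
    rectification, normalization, then log compression. A simplified
    sketch of one way measurement data could become an image."""
    env = np.abs(rf_lines)                     # rectify as a crude envelope
    env = env / (env.max() + 1e-12)            # normalize to peak = 1
    img_db = 20.0 * np.log10(env + 1e-6)       # log compression
    img_db = np.clip(img_db, -dynamic_range_db, 0.0)
    return (img_db + dynamic_range_db) / dynamic_range_db  # map to 0..1
```

Each row of the input corresponds to one transmit/receive event of the ultrasonic sensor; the output is a displayable intensity map in the range 0 to 1.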
  • Figure 2 is a perspective view showing an example of the appearance of probe 1.
  • the probe 1 has a tip portion 11 and a support portion 12.
  • a small ultrasonic sensor 21, which emits ultrasonic waves and receives their reflected waves, is provided on the surface of the tip 11 that contacts the object to be inspected.
  • the support portion 12 is a member that supports the tip portion 11 and allows a user of the ultrasound examination device to grasp the probe 1.
  • a roughly triangular opening H is formed in the center of the support portion 12 on the tip portion 11 side, and a plate-shaped transparent member 31 is formed to cover the opening H.
  • Figure 3 is a perspective view of the probe 1 as seen from the tip 11 side.
  • a camera 22 is provided inside the opening H.
  • the camera 22 includes a force sensor section to which the tip 11 is directly or indirectly attached, and at least a portion of the body is integrated with the support section 12. The detailed configuration of the camera 22 will be described later.
  • the camera 22 captures images of the external space of the probe 1.
  • the camera 22 is separated from the external space by a transparent member 31.
  • the transparent member 31 makes it possible to make the camera 22 dustproof, drip-proof, and waterproof. To make it easier to clean the probe 1, it is desirable that the boundary between the transparent member 31 and the support part 12 is flat.
  • the shape of the transparent member 31 is not limited to a plate shape. For example, if at least a portion of the camera 22 is exposed outside the support part 12, the transparent member 31 may be formed in a hemispherical shape to cover the camera 22.
  • Figure 4 shows an example of how to use the probe 1.
  • a user performing an ultrasound examination presses the tip 11 of the probe 1 against a predetermined position on the surface of the object Obj to be examined (e.g., a position directly above an internal organ). At this time, an ultrasound image showing, for example, the inside of the object to be examined is displayed in real time on the display of the information processing device 2. Note that the user performing the ultrasound examination may be the person to be examined, or may be a person other than the person to be examined.
  • the camera 22 captures, for example, an image in the direction in which the probe 1 comes into contact with the object of inspection Obj (toward the tip 11).
  • the image captured by the camera 22 captures, for example, a part of the object of inspection Obj.
  • the ultrasonic inspection device measures the relative position of the probe 1 with respect to the object of inspection Obj based on the image captured by the camera 22.
  • the relative position of the probe 1 is measured, for example, using Visual SLAM (Simultaneous Localization And Mapping) technology that uses the captured image as input.
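The patent leaves the Visual SLAM pipeline unspecified. As a much simpler stand-in that illustrates the underlying idea of recovering probe motion from consecutive camera frames, the sketch below estimates the frame-to-frame translation by phase correlation; all names are illustrative and a real system would estimate a full 6-DoF pose.

```python
import numpy as np

def estimate_shift(prev, curr):
    """Estimate the (dy, dx) pixel translation between two grayscale
    frames by phase correlation. A full Visual SLAM system would track
    features and estimate a 6-DoF pose; this only illustrates motion
    recovery from images."""
    F1 = np.fft.fft2(prev)
    F2 = np.fft.fft2(curr)
    cross = np.conj(F1) * F2
    cross /= np.abs(cross) + 1e-12             # normalized cross-power spectrum
    corr = np.abs(np.fft.ifft2(cross))         # correlation surface
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peak coordinates in [0, N) to signed shifts
    if dy > prev.shape[0] // 2:
        dy -= prev.shape[0]
    if dx > prev.shape[1] // 2:
        dx -= prev.shape[1]
    return int(dy), int(dx)
```

Accumulating such frame-to-frame estimates yields a track of the probe over the surface of the object, which is the role the relative-position measurement plays here.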
  • the information processing device 2 can also detect abnormalities such as rough skin of the person being examined based on the image captured by the camera 22.
  • the camera 22 is also used as an optical force sensor.
  • the ultrasonic inspection device measures the external force acting on the tip 11 of the probe 1 based on the image captured by the camera 22.
  • FIG. 5 is a cross-sectional view showing a schematic configuration of the camera 22.
  • the camera 22 includes a base unit 110, a force sensor unit 120, strain bodies 130 and 180, an image capturing unit 140, a light source unit 150, a first mirror 161, a second mirror 162, and a half mirror 170.
  • the base unit 110 is a rigid structural member in the shape of a flat plate with a light intake hole 110H provided approximately in the center.
  • the base unit 110 allows light incident from outside the camera 22 (also called external light) to enter the photographing unit 140 provided on the first surface S101 side of the base unit 110 (the side opposite to the side where external light enters the camera 22) through the light intake hole 110H provided approximately in the center.
  • the force sense acting unit 120 is a rigid structural member provided via a strain generating body 130 on the second surface S102 side of the base unit 110 (the side on which external light enters the camera 22).
  • the force sense acting unit 120 may be provided, for example, so as to face the second surface S102 of the base unit 110 around the light intake hole 110H.
  • the force sense acting unit 120 is the part of the camera 22 to which an external force is applied via the tip portion 11.
  • when an external force acts on the force sense acting unit 120, the strain generating body 130 between the force sense acting unit 120 and the base unit 110 deforms, and the positional relationship between the first mirror 161 and the second mirror 162 changes.
  • This causes a change in the position of each light point of the reflected light that is emitted from the light source portion 150 and multiple-reflected by the first mirror 161 and the second mirror 162. Therefore, the camera 22 can measure the external force acting on the force sense acting unit 120 by measuring the change in the position of each light point of the reflected light that is multiple-reflected by the first mirror 161 and the second mirror 162.
  • a plurality of force sense acting units 120 may be provided.
  • the camera 22 can measure the external forces acting on the plurality of locations by receiving the external forces acting on the plurality of locations of the camera 22 at each of the force sense acting units 120.
  • the multiple force sense units 120 may be arranged in point symmetry or line symmetry with respect to the light intake hole 110H.
  • two force sense units 120 may be arranged at 180 degrees across the light intake hole 110H (i.e., facing each other across the light intake hole 110H).
  • Three force sense units 120 may be arranged at 120 degrees from each other with the light intake hole 110H at the center, or four force sense units 120 may be arranged at 90 degrees from each other with the light intake hole 110H at the center.
  • the half mirror 170 is provided on the side where external light enters so as to cover the light intake hole 110H.
  • the half mirror 170 may be configured in a rectangular or circular flat plate shape and may be provided so as to span multiple force sense acting units 120 via a strain generating body 180.
  • Half mirror 170 is an optical element that has a light transmittance of more than 20% and less than 90%, and a light reflectance of more than 10% and less than 80%, and transmits some of the incident light and reflects some of the incident light.
  • half mirror 170 can be constructed by depositing an extremely thin film having light transmittance and light reflectance using a metal material such as chromium (Cr) on a transparent element made of glass or resin.
  • half mirror 170 can be constructed by depositing a dielectric multilayer film having light transmittance and light reflectance on a transparent element made of glass or resin.
  • the light transmittance and light reflectance of half mirror 170 can be set to any value depending on the characteristics realized by camera 22.
  • the half mirror 170 which has optical transparency, can transmit, for example, external light incident on the camera 22 into the interior of the camera 22, which is provided with the light intake hole 110H. This allows the camera 22 to capture the external space of the camera 22 with the imaging unit 140 by using the external light captured through the light intake hole 110H.
  • the half mirror 170 which has optical reflectivity, can also reflect, for example, the light emitted from the light source unit 150 in the same manner as the first mirror 161 and the second mirror 162. This allows the camera 22 to measure the positions of each light point of the reflected light that is emitted from the light source unit 150 and is multiple-reflected by the first mirror 161, the second mirror 162, and the half mirror 170 with the imaging unit 140. Therefore, the camera 22 can simultaneously capture the external space of the camera 22 and measure the light point group of the reflected light that is multiple-reflected by the first mirror 161, the second mirror 162, and the half mirror 170 with the imaging unit 140.
  • the flexure bodies 130 and 180 are structural members that deform in proportion to the stress acting thereon.
  • the flexure bodies 130 and 180 may be elastic bodies that deform easily, such as rubber, elastomers, or springs.
  • the flexure bodies 130 and 180 may also be structural members that are made of the same material as the other components, but are formed with low rigidity so that they are more easily deformed than the other components.
  • the flexure body 130 is provided between the base unit 110 and the force sense acting unit 120, and the flexure body 180 is provided between the half mirror 170 and the force sense acting unit 120.
  • the flexure bodies 130 and 180 can displace the positional relationship between the first mirror 161, the second mirror 162, and the half mirror 170 by deforming in response to an external force acting on the force sense acting unit 120.
  • the first mirror 161 is provided on the second surface S102 of the base unit 110, and the second mirror 162 is provided on the surface of the force sense acting unit 120 facing the base unit 110. That is, the first mirror 161 and the second mirror 162 are provided on the surface of the camera 22 facing the internal space surrounded by the base unit 110 and the force sense acting unit 120.
  • the first mirror 161 and the second mirror 162 can be formed, for example, by depositing a metal material such as chrome (Cr) with a film thickness having sufficient light reflectance on a transparent member made of glass or resin.
  • the first mirror 161 and the second mirror 162 facing each other can multiple-reflect the light emitted from the light source unit 150 in the reflection space 121 between the first mirror 161 and the second mirror 162.
  • Figure 6 is a diagram showing how light is emitted into a space surrounded by three opposing mirrors.
  • the light L emitted from the light source unit 1500 is multiple-reflected by the first mirror 1610, the second mirror 1620, and the third mirror 1630, which are provided facing each other at positions corresponding to each side of the triangular prism.
  • the light L emitted from the light source unit 1500 is received by the light receiving unit 1400 with the number of light points of the reflected light being amplified by the multiple reflections of the first mirror 1610, the second mirror 1620, and the third mirror 1630.
  • the positions of the light points of the reflected light of the light L emitted from the light source unit 1500 are displaced by an amount that amplifies the displacement of the first mirror 1610, the second mirror 1620, and the third mirror 1630.
  • the first mirror 1610, the second mirror 1620, and the third mirror 1630 may be arranged to correspond to the sides of an equilateral or isosceles triangle, or to the sides of a slightly distorted equilateral or isosceles triangle.
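A rough geometric model (an illustration, not from the patent) of why multiple reflections amplify the displacement: each bounce off a mirror tilted by an angle theta deflects the beam by an additional 2*theta, so after k such bounces the light point moves roughly k times as far as it would after a single reflection.

```python
import math

def spot_displacement(theta_rad, path_len_m, k_reflections):
    """Lateral displacement of a light point after k reflections off a
    mirror tilted by theta, over an optical path of the given length.
    A simplified small-angle model of the amplification effect."""
    return path_len_m * math.tan(2 * k_reflections * theta_rad)

single = spot_displacement(1e-4, 0.02, 1)   # one reflection
multi = spot_displacement(1e-4, 0.02, 5)    # five reflections
```

In the small-angle regime the displacement grows almost linearly with the number of reflections, which is why the multiple-reflection light spot group is a sensitive indicator of mirror displacement.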
  • the first mirror 161, the second mirror 162, and the third mirror can form a structure corresponding to the sides of a triangular prism as shown in FIG. 6.
  • the camera 22 can multiple-reflect the light emitted from the light source unit 150, with the inside of the triangular prism whose sides are formed by the first mirror 161, the second mirror 162, and the third mirror being the reflection space 121.
  • the first mirror 161 and the second mirror 162 may form a structure corresponding to the sides of a triangular pyramid with a third mirror (not shown). Even in this case, the camera 22 can multiple-reflect the light emitted from the light source unit 150 by using the inside of the triangular pyramid whose sides are formed by the first mirror 161, the second mirror 162, and the third mirror as a reflection space 121.
  • the first mirror 161 and the second mirror 162 may form a structure corresponding to the side surfaces of a rectangular prism between them and a third mirror and a fourth mirror (not shown). Even in such a case, the camera 22 can multiple-reflect the light emitted from the light source unit 150 by using the inside of the rectangular prism whose sides are formed by the first mirror 161, the second mirror 162, the third mirror, and the fourth mirror as the reflection space 121.
  • the first mirror 161 and the second mirror 162 may form a structure corresponding to the sides of a quadrangular pyramid between them and a third mirror and a fourth mirror (not shown). Even in this case, the camera 22 can multiple-reflect the light emitted from the light source unit 150 by using the inside of the quadrangular pyramid whose sides are formed by the first mirror 161, the second mirror 162, the third mirror, and the fourth mirror as the reflection space 121.
  • the light source unit 150 emits light toward the second surface S102 side of the base unit 110. Specifically, the light source unit 150 emits light into a reflection space 121 surrounded on at least two sides by the first mirror 161 and the second mirror 162.
  • the reflection space 121 is, for example, the space between the first mirror 161 and the second mirror 162 that face each other. By emitting light into the reflection space 121, the light source unit 150 can cause the emitted light to be multiple-reflected in the reflection space 121 between the first mirror 161 and the second mirror 162.
  • the light source unit 150 may emit light into the reflection space 121 from the bottom side of the reflection space 121 (i.e., the base unit 110 side), or from the lateral side of the reflection space 121 (i.e., the strain body 130 side).
  • the light source unit 150 may be, for example, an LED (Light Emitting Diode) light source capable of emitting highly directional light.
  • the light source unit 150 may be provided on the base unit 110 side.
  • wiring to the light source unit 150 can be formed in the same way as wiring to the imaging unit 140, so the cost and amount of work involved in forming the wiring can be reduced. Therefore, the production cost of the camera 22 can be further reduced.
  • the light source unit 150 may be provided inside the base unit 110 so that the main body of the LED light source and wiring are not exposed in the reflection space 121. In this case, it is possible to prevent the image of the main body and wiring of the light source unit 150 from being multiple-reflected on the first mirror 161 and the second mirror 162. This prevents the multiple-reflection image of the main body and wiring of the light source unit 150 from becoming a noise source, and therefore prevents a decrease in the measurement sensitivity of the light spot group of the reflected light that is multiple-reflected on the first mirror 161 and the second mirror 162.
  • the light source unit 150 may emit light into the reflection space 121 through a pinhole.
  • the pinhole is, for example, a hole with a diameter of about several mm.
  • the light source unit 150 can further improve the convergence of the emitted light by emitting light into the reflection space 121 through the pinhole.
  • the light source unit 150 can make the shape of each light point of the reflected light reflected multiple times by the first mirror 161 and the second mirror 162 a smaller perfect circle, thereby improving the measurement sensitivity of each light point.
  • since the accuracy of pinhole machining is generally higher than the positioning accuracy with which the light source unit 150 is assembled, the light source unit 150 can further improve the accuracy of the position from which light is emitted by emitting it into the reflection space 121 through the pinhole. Therefore, it is possible to more easily control the position of each light point of the reflected light reflected multiple times by the first mirror 161 and the second mirror 162.
  • a plurality of light source units 150 may be provided, one corresponding to each of the force sense acting units 120.
  • the light source unit 150 may emit light of a different color for each corresponding force sense acting unit 120.
  • the light source unit 150 may emit light into a separate reflection space 121 surrounded on two sides by the first mirror 161 and the second mirror 162 provided on the corresponding force sense acting unit 120, so that the light spot groups of the reflected light are separated for each force sense acting unit 120.
  • the camera 22 can measure the external force acting on each of the force sense acting units 120 from the displacement of light spot groups that can be separated from one another by color or position. Therefore, the camera 22 can measure the external forces acting on the individual force sense acting units 120 separately and with high accuracy.
  • the photographing unit 140 is an image sensor that acquires a photographed image by receiving light incident through the light intake hole 110H.
  • the photographing unit 140 may be, for example, a CMOS (Complementary Metal-Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor.
  • the photographing unit 140 can receive external light that passes through the half mirror 170 and enters the camera 22, and a group of light points of the reflected light that is emitted from the light source unit 150 and is multiple-reflected by the first mirror 161, the second mirror 162, and the half mirror 170. In other words, the photographing unit 140 can acquire an image in which the group of light points of the multiple-reflected reflected light is superimposed on the photographed image of the external space of the camera 22.
  • the camera 22 can measure the external force acting on the force sensing unit 120 from the displacement of the position of each light point of the reflected light that is multiple-reflected by the first mirror 161, the second mirror 162, and the half mirror 170. Therefore, the camera 22 can simultaneously capture images of the external space and measure the external forces acting on the force sense action unit 120.
  • Figure 7 is a diagram explaining the photographing by the camera 22 and the external forces acting on the camera 22.
  • the camera 22 can capture an image of an object Obj1 present in the external space. Furthermore, the camera 22 receives a force Fz1 in the Z-axis direction, a moment Mx1 about the X-axis, and a moment My1 about the Y-axis at the force sense acting section 120a on the upper side as viewed in FIG. 7, and can detect this force Fz1 and these moments Mx1 and My1. Similarly, the camera 22 receives a force Fz2 in the Z-axis direction, a moment Mx2 about the X-axis, and a moment My2 about the Y-axis at the force sense acting section 120b on the lower side as viewed in FIG. 7, and can detect this force Fz2 and these moments Mx2 and My2.
  • FIG. 8 shows an example of an image captured by the camera 22.
  • the captured image CI captured by the image capture unit 140 includes an object Obj1 and light spot groups LC1 and LC2.
  • the light spot group LC1 is, for example, a light spot group of reflected light that is emitted from the light source unit 150a on the upper side as viewed in FIG. 7 and is multiple-reflected by the first mirror 161a and the second mirror 162a.
  • when the positions of the force sense action unit 120a and the second mirror 162a are displaced, the position of the light spot group LC1 on the captured image CI is displaced in the directions corresponding to the force Fz1, moment Mx1, and moment My1 acting on the force sense action unit 120a. Therefore, the camera 22 can calculate the direction and magnitude of the force Fz1, moment Mx1, and moment My1 acting on the force sense action unit 120a from the displacement of the position of the light spot group LC1.
  • the light spot group LC2 is, for example, a light spot group formed by multiple reflections, by the first mirror 161b and the second mirror 162b, of light emitted from the light source unit 150b on the lower side as viewed in FIG. 7.
  • when the positions of the force sense action unit 120b and the second mirror 162b are displaced, the position of the light spot group LC2 on the captured image CI is displaced in the directions corresponding to the force Fz2, moment Mx2, and moment My2 acting on the force sense action unit 120b. Therefore, the camera 22 can calculate the direction and magnitude of the force Fz2, moment Mx2, and moment My2 acting on the force sense action unit 120b from the displacement of the position of the light spot group LC2.
  • the camera 22 can calculate the direction and magnitude of the external force acting on the force sense action unit 120a by previously associating the state of displacement of the position of the light point group LC1 of the reflected light with the actual measured values of the direction and magnitude of the external force acting on the force sense action unit 120a. Also, the camera 22 can calculate the direction and magnitude of the external force acting on the force sense action unit 120b by previously associating the state of displacement of the position of the light point group LC2 of the reflected light with the actual measured values of the direction and magnitude of the external force acting on the force sense action unit 120b.
  • the camera 22 may use machine learning to associate the displacement of the positions of the light point groups LC1 and LC2 of the reflected light with the actual measured values of the direction and magnitude of the external force acting on each of the force sense action units 120a and 120b.
  • the camera 22 may create a calibration curve to associate the state of displacement of each of the light point groups LC1 and LC2 of the reflected light with the actual measured values of the direction and magnitude of the external force acting on each of the force sense acting units 120a and 120b.
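The patent leaves the form of the calibration open. One simple possibility, sketched below with entirely synthetic numbers, is a linear least-squares map from the stacked light-point displacements to the measured load (Fz, Mx, My); all array shapes and names here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_points, n_samples = 6, 200

# Synthetic "true" sensor matrix relating the 2-D displacements of
# 6 light points to a load (Fz, Mx, My) -- stand-in for real data.
A_true = rng.normal(size=(3, 2 * n_points))
X = rng.normal(size=(n_samples, 2 * n_points))  # observed displacements
Y = X @ A_true.T                                # reference load values

# Calibration: least-squares fit of the displacement -> load map
A_fit, *_ = np.linalg.lstsq(X, Y, rcond=None)

def displacement_to_load(disp):
    """Convert a stacked light-point displacement vector to (Fz, Mx, My)."""
    return disp @ A_fit
```

A machine-learned model, as mentioned above, would play the same role as this linear fit when the displacement-to-force relationship is not linear.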
  • the camera 22 can capture an image CI in which the light spot groups LC1 and LC2 are superimposed on the captured image of the subject Obj1, so that the force and moment acting on the force sense acting units 120a and 120b can be measured simultaneously while capturing an image of the subject Obj1.
  • the tip 11 is attached so that the movement of the tip 11 is transmitted to the force sense acting units 120a and 120b, so the camera 22 can measure the external force acting on the tip 11.
  • as described above, the orientation of the probe and the relative position of the probe with respect to the person being examined may be used in an examination (see, for example, Patent Document 1). When an ultrasound examination device is used in a hospital, the orientation of the probe and its relative position with respect to the person being examined are obtained, for example, using a camera installed near the hospital bed.
  • the camera 22 which captures at least the area in the direction in which the probe 1 comes into contact with the object to be inspected, is configured as one unit with the probe 1 provided with the ultrasonic sensor 21, and the relative position of the probe 1 with respect to the object to be inspected is measured based on the image captured by the camera 22. Because the camera 22 is configured as one unit with the probe 1, the user can easily use the ultrasonic inspection device without having to install the camera or connect the camera to the ultrasonic inspection device.
  • the camera 22 captures an image of the external space of the probe 1, and also captures an image of a group of light points of light emitted from a light source provided inside the probe 1 and reflected in a reflection space provided inside the probe 1.
  • the external force acting on the tip 11 of the probe 1 is measured based on the displacement of the group of light points captured in the image captured by the camera 22.
  • the external force may be measured by a force sensor formed to support the tip 11 instead of the camera 22.
  • introducing a force sensor along with an ultrasonic sensor and a camera into one probe may result in the probe becoming large, whereas it is desirable that the probe be as small as possible.
  • because the camera 22 integrated with the probe 1 is also used as an optical force sensor, the force sensor itself and its dedicated wiring can be eliminated, making it possible to realize a smaller, lighter ultrasonic inspection device.
  • FIG. 9 is a diagram showing an example of the configuration of the probe 1.
  • as shown in FIG. 9, the probe 1 is composed of an ultrasonic sensor 21, a camera 22, an inertial sensor 23, and a calculation unit 24.
  • the ultrasonic sensor 21 emits ultrasonic waves to the object to be inspected and receives the reflected ultrasonic waves.
  • the ultrasonic sensor 21 measures the intensity of the received reflected waves and obtains ultrasonic measurement data that indicates, for example, the measurement results of the intensity of the reflected waves over time.
  • the camera 22 captures the external space of the probe 1 and acquires the captured image.
  • the inertial sensor 23 is composed of a gyro sensor and an acceleration sensor, and is mounted inside the probe 1.
  • the inertial sensor 23 measures at least one of the angular velocity of the probe 1 and the acceleration of the probe 1.
  • the degree to which the probe 1 is tilted relative to the direction of gravity can be measured using an acceleration sensor, so the measurement results from the acceleration sensor can be used to improve the accuracy of the tilt angle of the probe 1 measured using techniques such as Visual SLAM. Improving the accuracy of measuring the tilt angle of the probe 1 makes it possible to more accurately correct errors that occur in the measurement of the external force acting on the probe 1, improving the accuracy of the measurement value of the external force.
  • since the gyro sensor can measure rotational movement mainly around the axis of gravity, the measurement results of the gyro sensor can be used to improve the accuracy of the movement state of the probe 1 measured by techniques such as Visual SLAM.
  • Visual SLAM measures the relative change in position with respect to the object being inspected, so it is not suitable for measuring the absolute translational amount of probe 1. For example, with Visual SLAM technology, it may not be possible to determine whether the object being inspected or probe 1 has moved. By using the results of measuring the acceleration of probe 1 using an acceleration sensor, it is possible to measure the absolute change in the position of probe 1.
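The complementary use of the accelerometer and the Visual SLAM estimate described above can be sketched as a simple blending filter. The blending weight `ALPHA` and the function names are assumptions for illustration; the source does not specify a fusion algorithm.

```python
# Sketch of a complementary filter fusing a tilt angle estimated by
# Visual SLAM with a tilt angle derived from the accelerometer's
# gravity measurement.  ALPHA is an assumed tuning parameter.
import math

ALPHA = 0.98  # trust in the SLAM estimate; (1 - ALPHA) goes to the accelerometer

def tilt_from_accel(ax, ay, az):
    """Tilt of the probe axis relative to gravity, from accelerations [m/s^2]."""
    return math.atan2(math.sqrt(ax * ax + ay * ay), az)

def fuse_tilt(slam_tilt, ax, ay, az, alpha=ALPHA):
    """Blend the SLAM tilt estimate with the gravity-derived tilt (radians)."""
    return alpha * slam_tilt + (1.0 - alpha) * tilt_from_accel(ax, ay, az)
```

The same idea extends to translation: integrating the accelerometer gives an absolute position change that can correct the relative estimate from Visual SLAM.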
  • the calculation unit 24 measures the relative position of the probe 1 with respect to the object to be inspected based on the image captured by the camera 22.
  • the calculation unit 24 also measures the external force acting on the probe 1 based on the image captured by the camera 22.
  • FIG. 10 is a block diagram showing an example configuration of the information processing device 2.
  • the information processing device 2 is composed of a data acquisition unit 51, a calculation unit 52, a communication unit 53, a recording unit 54, a presentation control unit 55, and a presentation unit 56.
  • the data acquisition unit 51 acquires ultrasonic measurement data, captured images, measurement results of the inertial sensor 23, measurement results of the relative position of the probe 1, and measurement results of external forces acting on the probe 1 from the probe 1, and supplies them to the calculation unit 52, communication unit 53, and recording unit 54. Since the external forces acting on the probe 1 are measured by the calculation unit 24 of the probe 1, the data acquisition unit 51 can also acquire from the probe 1 only the area portion of the captured image in which the subject appears, rather than acquiring the entire captured image including the light point groups used for measuring the external forces. By acquiring only the area portion of the captured image in which the subject appears, the amount of data exchanged between the probe 1 and the information processing device 2 can be reduced.
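The data-reduction idea above amounts to transferring only a cropped region of the captured image. A minimal sketch, with the image modeled as a 2D list and a hypothetical bounding box standing in for the subject-detection result:

```python
# Illustrative sketch: instead of sending the full captured image,
# send only the rectangular region in which the subject appears.
# The bounding box would come from the probe's calculation unit;
# here it is an assumed constant.

def crop_subject_region(image, top, left, bottom, right):
    """Return the sub-image covering rows top..bottom-1, columns left..right-1."""
    return [row[left:right] for row in image[top:bottom]]

# A 6x8 "image" whose pixels record their own coordinates.
full = [[(r, c) for c in range(8)] for r in range(6)]
region = crop_subject_region(full, 1, 2, 4, 6)  # 3 rows x 4 columns
```

The cropped region carries the subject while omitting the light point groups, so only the data the information processing device 2 actually needs is transmitted.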
  • the calculation unit 52 generates an ultrasound image based on the ultrasound measurement data supplied from the data acquisition unit 51, and supplies it to the communication unit 53 and the recording unit 54.
  • the calculation unit 52 also supplies the captured image, the measurement results of the inertial sensor 23, the measurement results of the relative position of the probe 1, and the measurement results of the external force acting on the probe 1, all of which are supplied from the data acquisition unit 51, as well as the ultrasound image, to the presentation control unit 55.
  • the calculation unit 52 can also measure the relative position of the probe 1 with respect to the object of inspection and the external force acting on the probe 1 based on the captured image.
  • the communication unit 53 transmits at least two of the data supplied from the data acquisition unit 51 and the calculation unit 52 to an external device connected to the information processing device 2.
  • the information processing device 2 and the external device are connected via a wired or wireless network.
  • the ultrasound image, the captured image, the measurement results of the inertial sensor 23, the measurement results of the relative position of the probe 1, and the measurement results of the external force acting on the probe 1 are encrypted and transmitted so that they can only be decrypted by an external device connected to the information processing device 2.
  • the data may be encrypted in the configuration that acquired the data, or may be encrypted collectively in the communication unit 53.
  • the recording unit 54 records the data supplied from the data acquisition unit 51 and the calculation unit 52 in association with each other.
  • the presentation control unit 55 generates a screen and sound to be presented to the user based on the data supplied from the calculation unit 52, and controls the presentation unit 56 to present the screen and sound.
  • the presentation unit 56 is composed of a display, speaker, etc., and presents to the user images and sounds based on ultrasound images and the measurement results of the external forces acting on the probe 1 according to the control of the presentation control unit 55.
  • in step S1, the data acquisition unit 51 of the information processing device 2 acquires various data, such as ultrasonic measurement data and captured images, from the probe 1.
  • in step S2, for example, the calculation unit 52 of the information processing device 2 generates an ultrasound image based on the ultrasound measurement data.
  • in step S3, for example, the calculation unit 24 of the probe 1 measures the relative position of the probe 1 with respect to the test object based on the captured image.
  • in step S4, for example, the calculation unit 24 of the probe 1 measures the external force acting on the probe 1 based on the captured image.
  • in step S5, the recording unit 54 of the information processing device 2 records the measurement results of various information by the probe 1 and the information processing device 2.
  • the ultrasound image, the captured image, the measurement results of the relative position of the probe 1, the measurement results of the external force acting on the probe 1, and the measurement results of the inertial sensor 23 are recorded in association with each other.
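The flow of steps S1 to S5, ending with the results being recorded in association with each other, can be sketched as below. All function bodies and field names are hypothetical placeholders, not part of the source document.

```python
# Minimal sketch of the S1-S5 processing cycle: acquire, compute,
# and record all results as one associated record.

def run_examination_cycle(probe_data):
    record = {}
    # S1: acquire ultrasonic measurement data, captured image, etc.
    record.update(probe_data)
    # S2: generate an ultrasound image from the measurement data
    record["ultrasound_image"] = ["pixel:" + s for s in probe_data["ultrasonic_data"]]
    # S3: measure the relative position of the probe (placeholder result)
    record["relative_position"] = (0.0, 0.0, 0.0)
    # S4: measure the external force acting on the probe (placeholder result)
    record["external_force"] = 0.0
    # S5: return everything as a single associated record for the recording unit
    return record
```

Keeping the results in one record mirrors the requirement that the ultrasound image, captured image, relative position, external force, and inertial measurements be recorded in association with each other.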
  • FIG. 12 is a diagram showing an example of a situation in which an ultrasonic inspection device is used.
  • the probe 1 is used by connecting it to a tablet terminal 2A as an information processing device 2 as shown in FIG. 12A, or by connecting it to a notebook PC 2B as an information processing device 2 as shown in FIG. 12B.
  • a guide on how to use probe 1 may be presented from tablet terminal 2A or notebook PC 2B.
  • a voice message such as "Please press a little harder to the right” may be presented, or a screen containing colors, symbols, figures, character strings, etc. according to the guide content may be presented.
  • a sound or a display indicating that the force or position of the probe 1 is appropriate may also be presented.
  • for example, a voice message saying "The current force is optimal" may be output, or a character string or figure indicating that the force or position of the probe 1 is appropriate may be displayed in a color different from the colors of other displays. Note that if the probe 1 is pressed too far into the test subject, an alert may be presented.
  • the contents of the guide on how to use the probe 1 are determined based on the difference between the target values and at least one of the measured values of the relative position of the probe 1, the external force acting on the probe 1, the angular velocity of the probe 1, and the acceleration of the probe 1, and are intended to bring these measured values closer to the target values.
  • the target values for at least one of the relative position of the probe 1, the external force acting on the probe 1, the angular velocity of the probe 1, and the acceleration of the probe 1 are input using an input means provided on the probe 1 or a device connected to the probe 1 (the information processing device 2 or an external device).
  • the input means includes a microphone that collects sound, a switch on the probe 1, a touch panel on the information processing device 2, etc.
  • a user can input target values recommended by a medical professional, or a medical professional who is checking various data acquired by the probe 1 and information processing device 2 in a remote location can input target values. Since the user can use the probe 1 by following the guide, it is possible to realize an ultrasound examination device that is easy to operate even for users who are novices in the medical field.
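Determining the guide content from the difference between a target value and a measured value, as described above, can be sketched for the external force. The tolerance, threshold direction, and message strings are illustrative assumptions.

```python
# Hedged sketch: derive a guide message from the difference between
# the target force and the measured external force acting on the probe.

def force_guide(measured_n, target_n, tolerance_n=0.5):
    """Return a guide message that nudges the measured force toward the target."""
    diff = target_n - measured_n
    if abs(diff) <= tolerance_n:
        return "The current force is optimal"
    if diff > 0:
        return "Please press a little harder"
    return "Please press a little more gently"
```

The same pattern extends to the relative position, angular velocity, and acceleration targets: compare the measurement to the target and emit guidance that reduces the difference.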
  • Example of measuring the elasticity of the test object: The stiffness of the test object, which indicates the elasticity of the test object, may be measured based on the acceleration of the probe 1 measured by an acceleration sensor and the measurement results of the external force acting on the probe 1.
  • the stiffness of the test object is measured, for example, by the calculation unit 52 of the information processing device 2.
  • Figure 13 is a diagram to explain how to measure the stiffness of the test object.
  • FIG. 13 shows a schematic diagram of a situation in which a support part Su on which a force sensor FS is provided is pressed against an object to be inspected Obj.
  • the force sensor FS is provided on both the upper and lower sides of the contact surface of the support part Su with the object to be inspected Obj.
  • the upper part of Figure 13 shows the situation before force is applied to the support part Su
  • the lower part of Figure 13 shows the situation when force F is applied from the side of the support part Su opposite the contact surface with the inspection object Obj.
  • the acting force Fsr [N] measured by the upper force sensor FS is equal to the acting force Fsl [N] measured by the lower force sensor FS.
  • the stiffness Ks [N/mm] of the force sensor FS is given by the following equation (1).
  • the movement amount sm [mm] of the force sensor FS due to the application of the force F can be calculated by double integration of the translational acceleration az [mm/s^2] of the support part Su (force sensor FS) in the pushing direction, as shown in the following equation (3).
  • the sum of the movement amount hm, the initial thickness h0, and the initial thickness s0 is equal to the sum of the thickness h1, the thickness s1, and the movement amount sm, as shown in the following equation (4).
  • the stiffness Kh [N/mm] of the object to be inspected Obj is given by the following formula (6).
  • here, the stiffness Kh, the movement amount hm, the initial thickness h0, the thickness h1, the movement amount sm, and the thickness s1 are unknown parameters, while the stiffness Ks, the acting force Fsr, the acting force Fsl, the initial thickness s0, and the translational acceleration az are known parameters.
  • by solving these equations, the stiffness Kh can be measured. Note that if the acting force Fsr and the acting force Fsl are greater than 0, the double integral of the translational acceleration az is greater than the movement amount hm.
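The bodies of equations (1), (3), (4), and (6) are not reproduced in this excerpt. One plausible reconstruction from the surrounding definitions, offered only as a sketch, is:

```latex
% Hypothetical reconstruction; the equation bodies are omitted in the excerpt.
\begin{align}
K_s &= \frac{F_{sl}}{s_0 - s_1} \tag{1}\\
s_m &= \iint a_z \,\mathrm{d}t\,\mathrm{d}t \tag{3}\\
h_m + h_0 + s_0 &= h_1 + s_1 + s_m \tag{4}\\
K_h &= \frac{F_{sr}}{h_m} \tag{6}
\end{align}
```

Under the additional assumption that the object Obj compresses by $h_m$ with its far side fixed ($h_1 = h_0 - h_m$), substituting $s_1 = s_0 - F_{sl}/K_s$ from (1) into (4) gives $h_m = (s_m - F_{sl}/K_s)/2$, so every quantity on the right-hand side of (6) is known. This is consistent with the stated note that $s_m$ (the double integral of $a_z$) exceeds $h_m$ whenever $F_{sr}, F_{sl} > 0$.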
  • since the person to be examined hardly moves even when the probe 1 is pressed against them, it is possible to measure the rigidity of the subject to be examined using the acceleration of the probe 1 measured by the acceleration sensor. Since the elasticity of the subject to be examined indicates, for example, the hardness of the affected area, it is also possible to discover abnormalities in the affected area by performing an ultrasound examination using the ultrasound examination device of this technology.
  • a lid for covering the camera 22 may be attached to the probe 1 so as to be movable.
  • FIG. 14 shows an example of the appearance of probe 1 with the lid attached.
  • the probe 1 is provided with a lid 301 that can slide or be attached and detached.
  • when the lid 301 is open, an opening H is exposed, and the camera 22 can capture the external space of the probe 1.
  • when the lid 301 is closed, the opening H is blocked, and the camera 22 cannot capture the external space of the probe 1.
  • the ultrasound examination can be performed with the lid 301 closed, making it possible to protect the privacy of the person being examined according to their wishes.
  • the photographing unit 140 provided in the camera 22 may be configured with an image sensor capable of detecting visible light and infrared light. In this case, based on the image photographed by the camera 22, it is possible to measure the appearance and shape of the subject, measure the external force acting on the probe 1, and estimate the moisture content of the subject.
  • FIG. 15 is a diagram for explaining remote medical treatment using an ultrasound examination device according to the present technology.
  • a user U1 is performing an ultrasound examination on a person P1 who is to be examined, for example at home.
  • a notebook PC 2B connected to a probe 1 is connected, as shown by the dashed line, to a tablet terminal 302 used by a medical worker D1 in a remote location, for example via a wireless network.
  • the notebook PC 2B transmits two or more pieces of data, such as an ultrasound image, a captured image, measurement results of the external force acting on the probe 1, and measurement results of the inertial sensor 23, to the tablet terminal 302.
  • a screen corresponding to the data transmitted from the notebook PC 2B is displayed on the tablet terminal 302. While viewing the screen displayed on the tablet terminal 302, the medical worker D1 inputs advice on how to use the probe 1 by operating the tablet terminal 302 or by voice.
  • the tablet terminal 302 transmits information indicating the advice input by the medical worker D1 to the notebook PC 2B.
  • the notebook PC 2B receives the information indicating the advice sent from the tablet terminal 302 and presents the advice to the user U1. For example, text indicating the content of the advice may be displayed on the display, and the voice of the medical worker D1 may be output from a speaker. Note that, for example, if the medical worker D1 determines that the ultrasound image has been properly captured, he or she may input an instruction to record the ultrasound image by operating the tablet terminal 302. When information indicating an instruction to record an ultrasound image is sent from the tablet terminal 302, the notebook PC 2B records, for example, the ultrasound image, the captured image, the measurement results of the external force acting on the probe 1, the measurement results of the inertial sensor 23, etc. in association with each other.
  • a medical professional D1 in a remote location can check the measurement results of the ultrasound examination device, and the user U1 can receive advice from the medical professional D1 on how to use the probe 1.
  • the medical professional D1 can check not only the ultrasound images, but also the measurement results of the external forces acting on the probe 1, and can easily provide advice.
  • the calculation unit 52 of the information processing device 2 can also recognize a subject appearing in a captured image by performing image recognition based on the captured image of the camera 22.
  • by performing image recognition based on the captured image, the information processing device 2 can grasp the distance of the probe 1 to a predetermined part of the person to be examined based on the result of the image recognition and the measurement result of the relative position of the probe 1.
  • when the probe 1 comes close to a predetermined part such as the head, the operation of the ultrasonic sensor 21 may be stopped.
  • FIG. 16 is a diagram for explaining the cases when the ultrasonic sensor 21 operates normally and when the operation of the ultrasonic sensor 21 is stopped.
  • for example, while the probe 1 is used on a part away from the head, the ultrasonic sensor 21 operates normally.
  • if the operation of the ultrasonic sensor 21 stops during an ultrasound examination of a part close to the head, such as the neck, the operation of the ultrasonic sensor 21 can be resumed, for example, by the user responding to a warning message displayed on the display of the information processing device 2.
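The distance-based interlock described above can be sketched as follows. The safety distance, the part label, and the function name are assumptions for illustration; the source does not specify concrete thresholds.

```python
# Hedged sketch of a safety interlock: disable the ultrasonic sensor
# when image recognition indicates the probe is within an assumed
# threshold distance of a protected part such as the head.

HEAD_SAFETY_DISTANCE_MM = 150.0  # illustrative threshold, not from the source

def sensor_enabled(recognized_part, distance_mm):
    """Return False when the probe is too close to the head, True otherwise."""
    if recognized_part == "head" and distance_mm < HEAD_SAFETY_DISTANCE_MM:
        return False
    return True
```

When the interlock trips, the user's acknowledgement of the warning message would re-enable the sensor, as described above for neck examinations.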
  • a plurality of cameras 22 may be integrated into the probe 1.
  • for example, two cameras 22 are provided on the probe 1.
  • FIG. 17 is a diagram showing an example of the arrangement of the two cameras 22.
  • the surface of the probe 1 on which the ultrasonic sensor 21 is provided is referred to as the front surface.
  • A of Figure 17 shows an example in which two cameras 22 (opening H) are provided on the front surface of the probe 1 (the contact surface of the tip 11 with the test subject).
  • the upper part of A of Figure 17 shows a schematic diagram of the probe 1 viewed from above, and the lower part of A of Figure 17 shows a schematic diagram of the probe 1 viewed from the side.
  • the two cameras 22 are provided at the left and right ends inside the tip 11, respectively, and an ultrasonic sensor 21 is provided between the two cameras 22.
  • the two cameras 22 (opening H) may also be provided on the side surfaces of the tip 11.
  • FIG. 17B shows an example in which two cameras 22 are provided on the upper and lower surfaces of the support portion 12 of the probe 1 (hereinafter, when the camera 22 is provided on the support portion 12, one of the multiple surfaces on which the camera 22 is provided is referred to as the upper surface).
  • the upper side of FIG. 17B shows a schematic diagram of the probe 1 viewed from above
  • the lower side of FIG. 17B shows a schematic diagram of the probe 1 viewed from the side.
  • the two cameras 22 are configured integrally with the probe 1 but are provided outside the support portion 12.
  • C of FIG. 17 shows another example in which two cameras 22 are provided on the upper and lower surfaces of the support portion 12 of the probe 1.
  • the upper part of Fig. 17C shows a schematic diagram of the probe 1 viewed from above
  • the lower part of Fig. 17C shows a schematic diagram of the probe 1 viewed from the side.
  • the two cameras 22 are provided inside the support portion 12.
  • the field of view of the camera 22 with respect to the object to be inspected can be ensured satisfactorily.
  • the ultrasonic sensor 21 is provided between the two cameras 22, which may increase the width of the tip 11.
  • Figure 18 shows an example of the placement of one camera 22.
  • A of Figure 18 shows an example in which one camera 22 (opening H) is provided on the front surface (tip portion 11) of the probe 1.
  • the upper part of A of Figure 18 shows a schematic diagram of the probe 1 viewed from above, and the lower part of A of Figure 18 shows a schematic diagram of the probe 1 viewed from the side.
  • one camera 22 is provided at either the left or right end inside the tip portion 11. It is to be noted that one camera 22 (opening H) may also be provided on the side surface of the tip portion 11.
  • B of FIG. 18 shows an example in which one camera 22 is provided on the upper surface of the support part 12 of the probe 1.
  • the upper part of FIG. 18B shows a schematic diagram of the probe 1 viewed from above
  • the lower part of FIG. 18B shows a schematic diagram of the probe 1 viewed from the side.
  • one camera 22 is configured integrally with the probe 1 but is provided outside the support part 12.
  • C of FIG. 18 shows another example in which one camera 22 is provided on the upper surface of the support part 12 of the probe 1.
  • the upper part of Fig. 18C shows a schematic diagram of the probe 1 seen from above
  • the lower part of Fig. 18C shows a schematic diagram of the probe 1 seen from the side.
  • one camera 22 is provided inside the support part 12.
  • if cameras 22 are provided on both the top and bottom surfaces of the probe 1, it is considered that the field of view of at least one of the cameras 22 will not be covered by the user's hand when the user grasps the support part 12. Therefore, in terms of always being able to ensure the field of view of the cameras 22, it is preferable to provide cameras 22 on both the top and bottom surfaces of the probe 1 rather than on only one of them.
  • the camera 22 may be a fisheye camera or an omnidirectional camera integrated into the probe 1.
  • by capturing images with a fisheye camera or an omnidirectional camera, it is possible to capture images of a wide range of the external space of the probe 1, improving the stability of the Visual SLAM technology that uses captured images as input.
  • Figure 19 shows an example of the placement of two fisheye cameras 22A.
  • A of FIG. 19 shows an example in which two fisheye cameras 22A are provided on the front surface (tip portion 11) of the probe 1.
  • the upper part of A of FIG. 19 shows a schematic diagram of the probe 1 viewed from above, and the lower part of A of FIG. 19 shows a schematic diagram of the probe 1 viewed from the side.
  • the two fisheye cameras 22A are provided at the left and right ends inside the tip portion 11, and an ultrasonic sensor 21 is provided between the two fisheye cameras 22A.
  • B of FIG. 19 shows an example in which two fisheye cameras 22A are provided on the upper and lower surfaces of the support part 12 of the probe 1.
  • the upper part of Figure 19B shows a schematic diagram of the probe 1 viewed from above
  • the lower part of Figure 19B shows a schematic diagram of the probe 1 viewed from the side.
  • the two fisheye cameras 22A are configured integrally with the probe 1, but part of the body of the fisheye camera 22A is exposed outside the support part 12.
  • FIG. 19C shows another example in which two fisheye cameras 22A are provided on the upper and lower surfaces of the support portion 12 of the probe 1.
  • the upper part of FIG. 19C shows a schematic diagram of the probe 1 viewed from above, and the lower part of FIG. 19C shows a schematic diagram of the probe 1 viewed from the side.
  • the two fisheye cameras 22A are provided inside the support portion 12.
  • Figure 20 shows an example of the placement of one fisheye camera 22A.
  • A of FIG. 20 shows an example in which one fisheye camera 22A is provided on the front surface (tip portion 11) of the probe 1.
  • the upper part of Fig. 20A shows a schematic diagram of the probe 1 as viewed from above, and the lower part of Fig. 20A shows a schematic diagram of the probe 1 as viewed from the side.
  • one fisheye camera 22A is provided at either the left or right end inside the tip portion 11.
  • B of Figure 20 shows an example in which one fisheye camera 22A is provided on the upper surface of the support part 12 of the probe 1.
  • the upper part of B of Figure 20 shows a schematic diagram of the probe 1 viewed from above
  • the lower part of B of Figure 20 shows a schematic diagram of the probe 1 viewed from the side.
  • one fisheye camera 22A is configured integrally with the probe 1, but part of the body of the fisheye camera 22A is exposed to the outside of the support part 12.
  • C of FIG. 20 shows another example in which one fisheye camera 22A is provided on the upper surface of the support part 12 of the probe 1.
  • the upper part of Fig. 20C shows a schematic diagram of the probe 1 seen from above
  • the lower part of Fig. 20C shows a schematic diagram of the probe 1 seen from the side.
  • one fisheye camera 22A is provided inside the support part 12.
  • in the above examples, multiple cameras 22 of the same type are configured integrally with the probe 1, but multiple types of cameras may be configured integrally with the probe 1 as the cameras 22.
  • for example, a fisheye camera and a camera with a normal lens are provided on the probe 1.
  • Figure 21 shows an example of the placement of multiple types of cameras.
  • FIG. 21A shows an example in which a fisheye camera 22A and a camera 22B with a normal lens are provided on the front surface (tip portion 11) of a probe 1.
  • the upper part of FIG. 21A shows a schematic diagram of the probe 1 viewed from above, and the lower part of FIG. 21A shows a schematic diagram of the probe 1 viewed from the side.
  • the fisheye camera 22A is provided at one of the left or right ends inside the tip portion 11, and the camera 22B is provided at the other left or right end inside the tip portion 11.
  • An ultrasonic sensor 21 is provided between the fisheye camera 22A and the camera 22B.
  • FIG. 21B shows an example in which fisheye camera 22A is provided on the upper surface of support part 12 of probe 1, and camera 22B is provided on the lower surface of support part 12 of probe 1.
  • the upper part of FIG. 21B shows a schematic diagram of probe 1 seen from above, and the lower part of FIG. 21B shows a schematic diagram of probe 1 seen from the side.
  • fisheye camera 22A is configured integrally with probe 1, but a part of the body of fisheye camera 22A is exposed to the outside of support part 12.
  • camera 22B is configured integrally with probe 1, but is provided outside support part 12.
  • fisheye camera 22A may instead be provided outside support part 12 like camera 22B, or a part of the body of camera 22B may be exposed to the outside of support part 12 like fisheye camera 22A.
  • FIG. 21C shows another example in which fisheye camera 22A is provided on the upper surface of support portion 12 of probe 1, and camera 22B is provided on the lower surface of support portion 12 of probe 1.
  • the upper part of FIG. 21C shows a schematic diagram of probe 1 viewed from above, and the lower part of FIG. 21C shows a schematic diagram of probe 1 viewed from the side.
  • fisheye camera 22A and camera 22B are provided inside support portion 12.
  • Example of displaying locations where previous ultrasound examinations have been performed: Locations on the human body where previous ultrasound examinations have been performed and locations on the human body where the current ultrasound examination is being performed may be displayed on the display of the information processing device 2.
  • Figure 22 shows an example of a screen that displays the positions on the human body where ultrasound examinations were performed.
  • the locations where ultrasound examinations have been performed in the past and the current location where ultrasound examinations are being performed are displayed superimposed on an image that, for example, diagrammatically represents the human body.
  • the dotted rounded rectangles indicate areas where ultrasound examinations have been performed in the past and ultrasound images have been recorded.
  • the dotted rounded rectangles are displayed as rounded rectangles with green lines, for example.
  • the darkness of the line color changes according to the external force acting on the probe 1 when the examination of that area was performed. For example, the darker the line color, the stronger the external force that was applied.
  • the darkness of the line color is the same for rounded rectangles with lines in colors other than green, which will be explained below.
  • one type of dashed rounded rectangle indicates an area where an ultrasound examination was previously performed while the patient was exhaling and an ultrasound image was recorded; in reality, it is displayed, for example, as a rounded rectangle with yellow lines. Another type of dashed rounded rectangle indicates an area where an ultrasound examination was previously performed while the patient was inhaling and an ultrasound image was recorded; in reality, it is displayed, for example, as a rounded rectangle with purple lines.
  • the solid rounded rectangle indicates the area where the ultrasound examination was performed and the ultrasound image was recorded.
  • the solid rounded rectangle is displayed, for example, as a rounded rectangle with red lines.
  • the white or grey circle indicates the area where the ultrasound examination was performed and the ultrasound image was displayed on the display.
  • the circle is displayed, for example, as a circle with a blue interior.
  • the intensity of the colour inside the circle changes depending on the external force acting on the probe 1 when the examination of that area was performed. For example, the darker the colour inside the circle, the stronger the external force that was applied.
  • the positions on the human body where the ultrasound examination was performed are estimated based on the results of image recognition based on the image captured by the camera 22 and the measurement results of the relative position of the probe 1.
  • the user can compare the positions where ultrasound examinations were performed in the past with the positions where ultrasound examinations are being performed this time, and determine whether there are any areas that have been forgotten to be examined.
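The marker rules above (color by examination state, line darkness by external force) can be collected into a small mapping. The state names, the current-equals-red assumption, and the force-to-darkness scaling are illustrative choices for the sketch, not specifications from the source.

```python
# Illustrative mapping from a recorded examination entry to a marker
# style: color reflects when/how the examination was performed, and
# darkness (0..1) grows with the external force acting on the probe.

def marker_style(entry):
    """entry: dict with 'when' ('past' or 'current'), 'breath'
    ('exhaling', 'inhaling', or None) and 'force_n' (external force in N)."""
    if entry["when"] == "current":
        color = "red"          # assumed: current examination markers
    elif entry["breath"] == "exhaling":
        color = "yellow"       # past examination while exhaling
    elif entry["breath"] == "inhaling":
        color = "purple"       # past examination while inhaling
    else:
        color = "green"        # past examination, recorded image
    # darker marker for stronger external force (clamped to 0..1)
    darkness = min(entry["force_n"] / 10.0, 1.0)
    return color, darkness
```

Rendering these styles onto a schematic body image lets the user spot areas that were examined with a different force, or not examined at all.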
  • Example of recording an ultrasound image when the examination situation becomes substantially the same as a past examination situation: When performing an ultrasound examination of substantially the same part as a part previously examined, an ultrasound image may be recorded in the recording unit 54 of the information processing device 2 when the past examination situation and the current examination situation become substantially the same. For example, an ultrasound image is recorded when the posture of the person to be examined becomes substantially the same in the past examination and the current examination.
  • the examination conditions include, for example, the posture of the person being examined, the breathing state of the person being examined, the position on the human body to which the probe 1 is pressed, and the external force acting on the probe 1.
  • Whether the posture of the person being examined is approximately the same in the past and present can be determined using, for example, Visual SLAM technology and image recognition.
  • the breathing state of the person being examined indicates whether the person is inhaling, exhaling, or in a normal state. Whether the breathing state of the person being examined is substantially the same in the past and present is determined, for example, by using Visual SLAM technology and audio picked up by a microphone (not shown) installed in the information processing device 2.
  • Whether the position on the human body where the probe 1 is pressed is substantially the same in the past and present is determined, for example, by using Visual SLAM technology and the similarity of ultrasound images. Whether the external force acting on the probe 1 is substantially the same in the past and present is determined, for example, based on the light point cloud captured in the image captured by the camera 22.
  • the user can easily obtain ultrasound images taken under examination conditions that are approximately the same as those of previous examinations.
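The matching of past and current examination situations described above (posture, breathing state, probe position, and external force) can be sketched as follows. This is an illustrative sketch only: the field names, posture/breathing labels, and tolerance values are assumptions for illustration, not values from this disclosure.

```python
# Hypothetical sketch: deciding whether the current examination situation is
# "substantially the same" as a recorded past one. All names and tolerances
# here are illustrative assumptions.
from dataclasses import dataclass
import math

@dataclass
class ExamSituation:
    posture_id: str      # e.g. obtained via Visual SLAM + image recognition
    breathing: str       # "inhaling", "exhaling", or "normal"
    probe_xyz: tuple     # relative position of probe 1 on the body (metres)
    force_n: float       # external force acting on probe 1 (newtons)

def substantially_same(past: ExamSituation, now: ExamSituation,
                       pos_tol=0.01, force_tol=1.0) -> bool:
    """Return True when every condition matches within its tolerance."""
    if past.posture_id != now.posture_id or past.breathing != now.breathing:
        return False
    dist = math.dist(past.probe_xyz, now.probe_xyz)
    return dist <= pos_tol and abs(past.force_n - now.force_n) <= force_tol

# When substantially_same(...) returns True, the ultrasound image would be
# recorded in the recording unit 54.
```

In this sketch, the posture and breathing labels must match exactly, while the probe position and external force are compared against numeric tolerances.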
  • Example of managing information on a subject of examination: It is also possible for the data acquisition unit 51 of the information processing device 2 to acquire patient information on a patient who is a subject of examination by photographing a document or screen including a barcode, a two-dimensional code, other special code, character strings, etc. with the camera 22 and analyzing the barcode, etc., that appears in the photographed image.
  • the face of the patient who is a subject of examination may be photographed with the camera 22, and the patient's face may be recognized by image recognition based on the photographed image, thereby acquiring patient information corresponding to the recognized patient's face.
  • the information processing device 2 can register ultrasound images and the measurement results of the external force acting on the probe 1 as examination data in a medical chart linked to the patient information. Since the user does not need to manually register ultrasound images and the like in the medical chart, it is possible to reduce the user's effort and errors.
  • the information processing device 2 can also obtain examination data from past ultrasound examinations along with patient information based on the results of reading barcodes and the like and the results of recognizing the patient's face.
  • the information processing device 2 can obtain individual settings for a user based on the results of reading a barcode or recognizing the user's face, and can register test data linked to the user. For example, if a user has color vision deficiency, colors that are easy for the user to read are displayed based on the user's individual settings. For example, the display of various types of information that can serve as a reference for how to use the probe 1 is turned on or off depending on the user's level of familiarity with the medical field.
  • Information about the test subject or user including patient information, information indicating the registration destination of test data, information indicating individual settings for the user, etc., is acquired, for example, by the data acquisition unit 51 of the information processing device 2.
  • FIG. 23 is a diagram showing an example of an ultrasound examination screen displayed on the display of the information processing device 2 during ultrasound examination.
  • the left side of the ultrasound examination screen displays an app (application) area A101, which allows the user to input operations and check the results of measurements taken by various sensors provided in the probe 1, for example.
  • a graph G101 showing the time series changes in the external force acting on the probe 1 is displayed below the app area A101.
  • a probe field of view image area A102 for displaying, for example, an image captured by the camera 22 provided on the probe 1 is displayed in the upper right-hand corner of the app area A101.
  • An image P101 captured by the camera 22 is displayed in the upper part of the probe field of view image area A102, and buttons B101 to B104 are displayed in the lower part.
  • a force meter indicating the magnitude and direction of the external force acting on the probe 1 is superimposed on the captured image P101.
  • When button B101 is pressed, for example, the captured image P101 is displayed with the left and right sides reversed, and when button B102 is pressed, for example, the captured image P101 is displayed with the top and bottom sides reversed.
  • When button B103 is pressed, for example, the force sense meter is displayed with the left and right sides reversed, and when button B104 is pressed, for example, the force sense meter is displayed with the top and bottom sides reversed.
  • the direction in which the user is looking differs by 180 degrees from the shooting direction of the camera 22.
  • the up, down, left, and right directions as seen by the user may differ from the up, down, left, and right directions in the image captured by the camera 22.
  • By inverting the captured image left-right or top-bottom using a GUI (Graphical User Interface) such as buttons B101 to B104, it is possible to prevent the user from becoming confused, and working efficiency from decreasing, when the up-down-left-right directions for the user differ from the up-down-left-right directions in the image captured by the camera 22.
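The left-right and top-bottom inversion performed via buttons B101 to B104 amounts to mirroring the columns and rows of the displayed image. A minimal sketch, with the image held as a nested list of pixel rows for illustration (a real implementation would operate on the camera frame buffer):

```python
# Minimal sketch of the inversions performed by buttons B101-B104.
def flip_horizontal(image):
    """Button B101 / B103: mirror left-right by reversing each row."""
    return [list(reversed(row)) for row in image]

def flip_vertical(image):
    """Button B102 / B104: mirror top-bottom by reversing the row order."""
    return list(reversed(image))
```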
  • an overhead image P102 showing the entire person to be examined is displayed.
  • the overhead image P102 is captured, for example, by a camera provided in the information processing device 2 or an external camera.
  • an ultrasound image area A103 for displaying, for example, an ultrasound image is displayed.
  • various numerical values, graphs G101, captured images P101, overhead images P102, ultrasound images, etc. are all displayed synchronously.
  • FIG. 24 is a diagram for explaining details of the app area A101.
  • a display example of the upper part of the app area A101 excluding the graph G101 is shown.
  • a button B111 is displayed at the top center of the application area A101 for setting target values for the external force acting on the probe 1, the measurement value of the relative position of the probe 1, etc.
  • a button B112 is displayed for saving logs of captured images and the measurement results of the external forces acting on the probe.
  • a button B113 is displayed for switching whether or not to continue saving logs of the measurement results of the external forces acting on the probe and the measurement results from the inertial sensor.
  • a check box C101 is displayed for switching whether to read a two-dimensional code or a character string.
  • checkboxes C102 are checkboxes for selecting items to be displayed in the graph G101.
  • FIG. 25 is a diagram showing an example of a graph G101 showing the time series change in the external force acting on the probe 1.
  • the horizontal axis shows time, and the vertical axis shows the magnitude of the force.
  • the example in FIG. 25 shows the time-series changes in the force (moment) Flx in the x-axis direction, the force (moment) Fly in the y-axis direction, and the force Flz in the z-axis direction acting on the left side of the probe 1, as well as the force (moment) Frx in the x-axis direction, the force (moment) Fry in the y-axis direction, and the force Frz in the z-axis direction acting on the right side of the probe 1.
  • the force range displayed in graph G101 may be specified by any of the following:
    - the range is automatically adjusted dynamically according to the magnitude of the force currently acting;
    - the maximum and minimum values are fixed according to the magnitude of the force that can be handled;
    - the user inputs the maximum and minimum values;
    - the range is automatically adjusted to the range expected for the area being examined during the ultrasound examination;
    - the range is automatically adjusted to the range expected from past measurement results.
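The range-selection options for graph G101 can be sketched as a single selection function. The mode names, sensor limits, and 10% margin below are illustrative assumptions, not values from this disclosure:

```python
# Illustrative sketch of choosing the force range shown in graph G101.
def graph_range(mode, forces, sensor_limits=(-20.0, 20.0),
                user_range=None, expected_range=None):
    if mode == "dynamic":       # follow the forces currently acting
        lo, hi = min(forces), max(forces)
        margin = 0.1 * (hi - lo or 1.0)   # assumed 10% headroom
        return lo - margin, hi + margin
    if mode == "sensor":        # fixed to the range the sensor can handle
        return sensor_limits
    if mode == "user":          # maximum/minimum entered by the user
        return user_range
    if mode == "expected":      # per-body-part or past-measurement range
        return expected_range
    raise ValueError(f"unknown mode: {mode}")
```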
  • FIG. 26 shows an example of the force meter display.
  • the two force meters 351 indicate the external forces acting on the left and right sides of the probe 1.
  • the two force meters 351 are displayed, for example, superimposed on the left and right areas, respectively, of the captured image P101 in which the light point cloud appears. Since the user does not need to check the light point cloud, the area of the captured image in which the light point cloud appears is actually masked. Below, the unmasked area of the captured image, in which only the external space of the probe 1 appears, is referred to as the field of view of the captured image.
  • the force sense meter 351 is composed of a circle and an arrow extending from the center of the circle.
  • the size of the circle indicates the magnitude of the force (Flz, Frz) pressing into either the left or right side of the probe 1.
  • the length of the arrow indicates the magnitude of the force (Flx, Fly, Frx, Fry) acting in the up, down, left and right directions, and the direction of the arrow indicates the direction of the force acting in the up, down, left and right directions.
  • if the measurement results of the external forces acting on the probe 1 are displayed only as numbers, they are difficult for the user to grasp; by using a GUI such as a force meter, the user can understand the forces at a glance.
  • one force meter 352 indicates the resultant force of the external forces acting on the left and right sides of the probe 1.
  • the force sense meter 352 is composed of a circle and an arrow extending from the center of the circle.
  • the size of the circle indicates the magnitude of the resultant force pushing on the left and right sides of the probe 1.
  • the length of the arrow indicates the magnitude of the resultant force acting in the up, down, left and right directions on the left and right sides of the probe 1, and the direction of the arrow indicates the direction of the resultant force acting in the up, down, left and right directions.
  • the force meter 352 is displayed with its left and right position shifted to the side with the stronger force pushing the probe 1, and its top and bottom position shifted toward the part where the tip 11 is in contact with the test subject.
  • the top and bottom position of the force meter 352 may be fixed to the center of the captured image.
  • when the two force meters 351 are displayed on the left and right of the captured image, the user must move their line of sight left and right; with the force meter 352, the user can check the external force acting on the probe 1 by looking at one location (near the force meter 352).
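The geometry described for the force meters, with the circle size following the pressing force and the arrow following the in-plane forces, along with the left-right shift of force meter 352 toward the more strongly pressed side, can be sketched as follows. The scaling constants (`px_per_newton`, `min_radius`, `half_width`) are illustrative assumptions:

```python
# Hedged sketch of the force-meter geometry; constants are assumptions.
import math

def force_meter(fx, fy, fz, px_per_newton=10.0, min_radius=5.0):
    """Circle radius from pressing force Fz; arrow from in-plane Fx, Fy."""
    radius = min_radius + px_per_newton * abs(fz)   # pressing force -> circle size
    length = px_per_newton * math.hypot(fx, fy)     # in-plane magnitude -> arrow length
    angle = math.degrees(math.atan2(fy, fx))        # in-plane direction -> arrow angle
    return radius, length, angle

def meter_352_offset(flz, frz, half_width=100.0):
    """Shift the resultant meter 352 toward the side pressed more strongly."""
    total = abs(flz) + abs(frz)
    if total == 0:
        return 0.0
    return half_width * (abs(frz) - abs(flz)) / total  # positive = shift right
```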
  • when the area to be examined is determined, a guide for moving the probe 1 to that area may be displayed superimposed on the image captured by the camera 22.
  • the direction and distance in which the probe 1 is moved to the area to be inspected is obtained based on an overhead image captured by the camera 22 when the user moves the probe 1 away from the person's body, and the relative position of the probe 1 to the inspection target measured using Visual SLAM technology.
  • FIG. 27 shows an example of a guide display for moving the probe 1 to the area to be examined.
  • the area to be examined is a part of a person's arm.
  • an arrow A121 is displayed as a guide, superimposed on the captured image P101 of the camera 22.
  • the arrow A121 is superimposed, for example, in the center of the captured image P101.
  • the arrow A121 may be displayed in an emphasized manner, for example, by being displayed large or blinking on the captured image P101.
  • a rectangular frame R101 that surrounds the part that is expected to appear in the field of view of the captured image P101 when the probe 1 comes into contact with the subject is displayed superimposed on the captured image P101 as a guide.
  • the user can bring the probe 1 into contact with a part of the arm to be examined by moving the probe 1 so that the frame R101 overlaps with the field of view of the captured image P101.
  • the frame R101 may be highlighted by blinking or the like.
  • a guide to indicate the target value of the force with which the probe 1 is pressed against the subject may be superimposed on the image captured by the camera 22.
  • Such a guide is displayed, for example, by pressing button B111 in FIG. 24.
  • the target value is set, for example, by pressing a button to select from among predetermined options, or by selecting from the results of measuring external forces when an ultrasound image was previously captured.
  • FIG. 28 shows an example of a guide display that shows the target value of the force for pressing the probe 1.
  • a guide meter 353 indicating the target value of the force for pushing the probe 1 is displayed together with a force sense meter 352 superimposed on the image P101 captured by the camera 22.
  • the guide meter 353 is also configured by combining a circle with an arrow extending from the center of the circle.
  • the size of the circle, the direction of the arrow, and the size of the arrow on the guide meter 353 correspond to the size of the circle, the direction of the arrow, and the size of the arrow on the force sense meter 352.
  • the guide meter 353 is displayed as a line in an inconspicuous color, such as a light color.
  • the force meter 352 is displayed smaller than the guide meter 353, and the circle of the force meter 352 is displayed in blue, for example.
  • the force meter 352 is displayed at approximately the same size as the guide meter 353, and the circle of the force meter 352 is displayed in, for example, green.
  • the force meter 352 is displayed larger than the guide meter 353, and the circle of the force meter 352 is displayed in red, for example.
  • the user can adjust the force with which the probe 1 is pressed so that the force meter 352 and the guide meter 353 overlap, allowing the user to press the probe 1 into the test subject with an appropriate force.
  • if the force with which the probe 1 is pressed deviates greatly from the target value, an alert may be issued to the user, such as by emitting a sound.
  • the color of the arrow on the force meter 352 may change according to the force with which the probe 1 is pressed, in the same way as the color of the circle.
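The blue/green/red coloring of force meter 352 relative to guide meter 353 can be sketched as a simple threshold rule; the 10% tolerance band around the target is an assumed value for illustration:

```python
# Illustrative sketch of the color feedback for force meter 352:
# weaker than target -> blue, near target -> green, stronger -> red.
def meter_color(measured_force, target_force, tolerance=0.1):
    if measured_force < target_force * (1 - tolerance):
        return "blue"    # meter drawn smaller than the guide meter 353
    if measured_force > target_force * (1 + tolerance):
        return "red"     # meter drawn larger than the guide meter 353
    return "green"       # approximately the same size as the guide meter 353
```

The user adjusts the pressing force until the color turns green, i.e. until force meter 352 overlaps guide meter 353.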
  • the information processing device 2 may be configured, for example, by a combination of a single-board computer and a PC.
  • the single-board computer and the PC are connected to each other, and each is connected to the probe 1.
  • the single-board computer, for example, adjusts the light intensity of the LEDs provided inside the camera 22 based on commands from the PC.
  • the single-board computer also, for example, obtains numerical data indicating the acceleration, angular velocity, and temperature of the probe 1 from the sensors provided in the probe 1 and transmits the data to the PC.
  • the PC, for example, acquires captured images from the imaging unit 140 of the camera 22 and estimates the external force acting on the probe 1 using a learning model that takes the captured images as input.
  • the PC also, for example, acquires the numerical data transmitted from the single-board computer.
  • the PC displays, for example, the external force acting on the probe 1 and the acceleration, angular velocity, temperature, etc. of the probe 1 on the ultrasound examination screen.
  • when the probe 1 is provided with an IMU (Inertial Measurement Unit) as an inertial sensor, the absolute attitude can be acquired by the inertial sensor, making it possible to accurately measure and compensate for the effects of gravity. It is also possible to measure the absolute amount of movement of the probe 1 even when the person being examined and the probe 1 move at the same time, and to measure the elasticity of the skin based on the measurement results from the inertial sensor.
  • the above-described series of processes can be executed by hardware or software.
  • the program constituting the software is installed from a program recording medium into a computer incorporated in dedicated hardware, or into a general-purpose personal computer, etc.
  • FIG. 29 is a block diagram showing an example of the hardware configuration of a computer that executes the above-mentioned series of processes using a program.
  • in the computer, a CPU (Central Processing Unit) 501, a ROM (Read Only Memory) 502, and a RAM (Random Access Memory) 503 are interconnected by a bus 504, to which an input/output interface 505 is also connected. Connected to the input/output interface 505 are an input unit 506 consisting of a keyboard, mouse, etc., and an output unit 507 consisting of a display, speakers, etc. Also connected to the input/output interface 505 are a storage unit 508 consisting of a hard disk or non-volatile memory, a communication unit 509 consisting of a network interface, etc., and a drive 510 that drives removable media 511.
  • the CPU 501 for example, loads a program stored in the storage unit 508 into the RAM 503 via the input/output interface 505 and the bus 504 and executes the program, thereby performing the above-mentioned series of processes.
  • the programs executed by the CPU 501 are provided, for example, by being recorded on removable media 511, or via a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting, and are installed in the storage unit 508.
  • the program executed by the computer may be a program in which processing is performed chronologically in the order described in this specification, or it may be a program in which processing is performed in parallel or at the required timing, such as when called.
  • each step described in the above flowchart can be executed by a single device, or can be shared and executed by multiple devices.
  • when one step includes multiple processes, the processes included in that one step can be executed by one device, or can be shared and executed by multiple devices.
  • Example of combination of configurations: The present technology can also have the following configurations.
  • An ultrasonic inspection device including: an ultrasonic sensor that emits ultrasonic waves to an object to be inspected and receives the reflected waves of the ultrasonic waves that are reflected by the object to be inspected and return; a camera that is integrated with the probe provided with the ultrasonic sensor and captures at least an image of the vicinity of the direction in which the probe contacts the inspection object; and a calculation unit that measures a relative position of the probe with respect to the object to be inspected based on an image captured by the camera.
  • the camera photographs an external space of the probe and photographs a light point group of light emitted from a light source provided inside the probe and reflected in a reflection space provided inside the probe
  • the ultrasonic inspection device according to (1) wherein the calculation unit measures an external force acting on the probe based on a displacement of the light spot cloud captured in the captured image.
  • the ultrasonic inspection device according to (2) further comprising an inertial sensor that measures at least one of an angular velocity of the probe and an acceleration of the probe.
  • the calculation unit measures elasticity of the inspection object based on the measurement result of the external force and the measurement result by the inertial sensor.
  • the ultrasonic inspection device described in (3) or (4) further includes a recording unit that records, in association with each other, at least two or more pieces of data among an ultrasonic image generated based on the reflected wave received by the ultrasonic sensor, the measurement result of the external force, the captured image, the measurement result of the relative position, and the measurement result by the inertial sensor.
  • a recording unit that records, in association with each other, at least two or more pieces of data among an ultrasonic image generated based on the reflected wave received by the ultrasonic sensor, the measurement result of the external force, the captured image, the measurement result of the relative position, and the measurement result by the inertial sensor.
  • the ultrasonic inspection device described in (6) further comprising a presentation control unit that presents a guide to a user for bringing the measurement value closer to the target value based on a difference between at least any one of the measurement values of the relative position, the external force, the angular velocity of the probe, and the acceleration and the target value.
  • the ultrasonic inspection device described in any of (3) to (7) further includes a communication unit that transmits at least two or more pieces of data among an ultrasonic image generated based on the reflected wave received by the ultrasonic sensor, the captured image, the measurement result of the relative position, the measurement result of the external force, and the measurement result by the inertial sensor to an external device.
  • the communication unit receives information indicating advice on how to use the probe or information indicating an instruction to record the ultrasound image from the external device; a presentation control unit that presents the advice to a user;
  • the ultrasonic inspection device described in (8) further comprises a recording unit that, when information indicating an instruction to record the ultrasonic image is transmitted from the external device, records the ultrasonic image, the captured image, and the measurement results of the external force in association with each other.
  • the camera captures at least one of a barcode, a two-dimensional code, a special code, a character string, a face of the inspection target, and a face of a user;
  • the ultrasonic inspection device according to any one of (1) to (12), wherein a cover portion for covering the camera is attached movably relative to the probe.
  • the ultrasonic inspection device according to any one of (1) to (13), wherein the camera is a fisheye camera or an omnidirectional camera.
  • the calculation unit performs image recognition based on the captured image to recognize the inspection object appearing in the captured image,
  • the ultrasonic inspection device according to any one of (1) to (15), wherein the camera is provided on at least one of a tip portion, a bottom surface, and a front surface of the probe.
  • the probe is provided with a plurality of the cameras.
  • the recording unit records the ultrasonic image when a past inspection situation and a current inspection situation are substantially the same.
  • An inspection method in which an ultrasonic inspection device measures a relative position of a probe with respect to an object to be inspected based on an image captured by a camera that is integrated with the probe and captures at least an image of the vicinity of the direction in which the probe contacts the object to be inspected, the probe being provided with an ultrasonic sensor that emits ultrasonic waves to the object to be inspected and receives the reflected waves of the ultrasonic waves that are reflected by the object to be inspected and return.

Abstract

The present technology relates to an ultrasonic inspection device, an inspection method, and a program that make it easier to use the ultrasonic inspection device. This ultrasonic inspection device comprises: an ultrasonic sensor that emits ultrasonic waves to an object to be inspected and that receives reflected waves of the ultrasonic waves that are reflected back from the object to be inspected; a camera that is integrated with a probe equipped with an ultrasonic sensor, and that captures at least the vicinity of the direction in which the probe contacts the object to be inspected; and a calculation unit that measures the relative position of the probe with respect to the object to be inspected on the basis of an image captured by the camera. The present technology can be applied to, for example, a portable ultrasonic inspection device that captures ultrasonic images.

Description

Ultrasonic inspection device, inspection method, and program

This technology relates to an ultrasonic inspection device, an inspection method, and a program, and in particular to an ultrasonic inspection device, an inspection method, and a program that make it easier to use the ultrasonic inspection device.

Traditionally, ultrasound examination devices that use ultrasound to capture internal organs and other objects to obtain ultrasound images have been widely used in the medical field.

In examinations using ultrasound images, the posture of the probe that transmits ultrasound and receives the reflected waves of the ultrasound, and the relative position of the probe with respect to the person being examined, may be used (see, for example, Patent Document 1). When an ultrasound examination device is used in a hospital, the posture of the probe and its relative position with respect to the person being examined are obtained, for example, using a camera installed near the hospital bed.

JP 2020-127629 A

In addition, in recent years, small, portable ultrasound examination devices have been sold to the general public. For example, when a small ultrasound examination device is used in a location other than a hospital, it takes time and effort to install a camera in a position where it can photograph the probe and the subject to be examined, and to connect the camera to the ultrasound examination device.

This technology was developed in light of these circumstances, and makes it easier to use ultrasound examination devices.

An ultrasonic inspection device according to one aspect of the present technology includes an ultrasonic sensor that emits ultrasonic waves to an object to be inspected and receives the reflected waves of the ultrasonic waves that are reflected back from the object to be inspected, a camera that is integrated with a probe on which the ultrasonic sensor is provided and captures at least an image of the area in the direction in which the probe contacts the object to be inspected, and a calculation unit that measures the relative position of the probe with respect to the object to be inspected based on the image captured by the camera.

In an inspection method according to one aspect of the present technology, an ultrasonic inspection device measures the relative position of a probe with respect to an object to be inspected based on an image captured by a camera that is integrated with the probe, the probe being provided with an ultrasonic sensor that emits ultrasonic waves to the object to be inspected and receives the reflected waves of the ultrasonic waves that are reflected back from the object to be inspected, the camera capturing at least an image of the vicinity of the direction in which the probe contacts the object to be inspected.

A program according to one aspect of the present technology causes a computer to execute a process of measuring the relative position of a probe with respect to an object to be inspected based on an image captured by a camera that is integrated with the probe, the probe being provided with an ultrasonic sensor that emits ultrasonic waves to the object to be inspected and receives the reflected waves of the ultrasonic waves that are reflected back from the object to be inspected, the camera capturing at least an image of the vicinity of the direction in which the probe contacts the object to be inspected.

In one aspect of this technology, the relative position of a probe with respect to an object to be inspected is measured based on an image captured by a camera that is integrated with the probe, the probe being provided with an ultrasonic sensor that emits ultrasonic waves to the object to be inspected and receives the reflected waves of the ultrasonic waves that are reflected back from the object to be inspected, the camera capturing at least an image of the vicinity of the direction in which the probe contacts the object to be inspected.
FIG. 1 is a diagram illustrating an example of the configuration of an ultrasonic inspection device according to an embodiment of the present technology.
FIG. 2 is a perspective view showing an example of the appearance of the probe.
FIG. 3 is a perspective view of the probe as viewed from the tip end side.
FIG. 4 is a diagram showing an example of how to use the probe.
FIG. 5 is a cross-sectional view showing a schematic configuration of the camera.
FIG. 6 is a diagram showing a state in which light is emitted into a space surrounded by three mirrors facing each other.
FIG. 7 is a diagram illustrating photographing by the camera and an external force acting on the camera.
FIG. 8 is a diagram showing an example of an image captured by the camera.
FIG. 9 is a diagram illustrating an example of the configuration of the probe.
FIG. 10 is a block diagram showing an example of the configuration of the information processing device.
FIG. 11 is a flowchart illustrating processing performed by the ultrasonic inspection device of the present technology.
FIG. 12 is a diagram showing an example of a situation in which the ultrasonic inspection device is used.
FIG. 13 is a diagram for explaining a method for measuring the rigidity of the inspection object.
FIG. 14 is a diagram showing an example of the appearance of the probe with a lid portion attached.
FIG. 15 is a diagram for explaining remote medical treatment using the ultrasonic inspection device of the present technology.
FIG. 16 is a diagram for explaining a case where the ultrasonic sensor operates normally and a case where the operation of the ultrasonic sensor is stopped.
FIG. 17 is a diagram showing an example of the arrangement of two cameras.
FIG. 18 is a diagram showing an example of the arrangement of one camera.
FIG. 19 is a diagram showing an example of the arrangement of two fisheye cameras.
FIG. 20 is a diagram showing an example of the arrangement of one fisheye camera.
FIG. 21 is a diagram showing an example of the arrangement of a plurality of types of cameras.
FIG. 22 is a diagram showing an example of a screen for presenting positions on the human body where ultrasound examinations have been performed.
FIG. 23 is a diagram showing an example of an ultrasound examination screen displayed on the display of the information processing device during ultrasound examination.
FIG. 24 is a diagram illustrating details of the app area.
FIG. 25 is a diagram showing an example of a graph illustrating the time-series change in the external force acting on the probe.
FIG. 26 is a diagram showing an example of the display of the force sense meter.
FIG. 27 is a diagram showing an example of a display of a guide for moving the probe to a region to be examined.
FIG. 28 is a diagram showing an example of a guide display for presenting a target value of the force for pressing the probe.
FIG. 29 is a block diagram showing an example of the hardware configuration of a computer.
 以下、本技術を実施するための形態について説明する。説明は以下の順序で行う。
 1.超音波検査装置の概要
 2.各機器の構成と動作
 3.変形例
Hereinafter, an embodiment of the present technology will be described in the following order.
1. Overview of ultrasonic inspection device
2. Configuration and operation of each device
3. Modifications
<1.超音波検査装置の概要>
 図1は、本技術の一実施形態に係る超音波検査装置の構成例を示す図である。
<1. Overview of Ultrasonic Inspection Device>
FIG. 1 is a diagram illustrating an example of the configuration of an ultrasonic inspection device according to an embodiment of the present technology.
 本技術の超音波検査装置は、例えば検査対象となる人物の腹部などの各部位の内部の様子が写る画像である超音波画像を撮影し、検査するのに用いられるポータブルな装置である。本技術の超音波検査装置は、図1に示すように、プローブ1と情報処理装置2により構成される。プローブ1と情報処理装置2は有線または無線の通信経路を介して接続される。 The ultrasound examination device of the present technology is a portable device used to take and examine ultrasound images that show the internal state of each part of the body, such as the abdomen, of a person being examined. As shown in Figure 1, the ultrasound examination device of the present technology is composed of a probe 1 and an information processing device 2. The probe 1 and the information processing device 2 are connected via a wired or wireless communication path.
 プローブ1は、超音波を検査対象に放射し、超音波の反射波を受信する。プローブ1は、受信した反射波の強度を測定し、例えば反射波の強度の時系列の測定結果を示すデータである超音波測定データを情報処理装置2に供給する。 The probe 1 emits ultrasonic waves to the object to be inspected and receives the reflected ultrasonic waves. The probe 1 measures the intensity of the received reflected waves and supplies ultrasonic measurement data, which is data showing the measurement results of the intensity of the reflected waves over time, to the information processing device 2.
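The step from the reflected-wave intensity time series to image depth can be illustrated with a short sketch. This is not code from the present disclosure: the function names and the assumed speed of sound in soft tissue (about 1540 m/s) are illustrative assumptions.

```python
# Sketch (assumption, not the patent's implementation): each sample time in the
# echo intensity series corresponds to a reflector depth along one scan line.

SPEED_OF_SOUND_M_PER_S = 1540.0  # typical soft-tissue value (assumed constant)

def echo_depth_m(round_trip_time_s: float) -> float:
    """Depth of the reflector: the pulse travels down and back, so halve it."""
    return SPEED_OF_SOUND_M_PER_S * round_trip_time_s / 2.0

def scan_line_depths(sample_times_s):
    """Map each sample time of the intensity series to its depth."""
    return [echo_depth_m(t) for t in sample_times_s]

print(echo_depth_m(65e-6))  # a 65 microsecond round trip is roughly 5 cm deep
```

The halving accounts for the round trip: the transmitted pulse must reach the reflector and return before it is received.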
 情報処理装置2は、タブレット端末、スマートフォン、ノートPC(Personal Computer)、専用端末などにより構成される。情報処理装置2は、プローブ1から供給された超音波測定データに基づいて、超音波画像を生成し、例えば情報処理装置2に設けられたディスプレイに超音波画像を表示させる。 The information processing device 2 is composed of a tablet terminal, a smartphone, a notebook PC (Personal Computer), a dedicated terminal, etc. The information processing device 2 generates an ultrasound image based on the ultrasound measurement data supplied from the probe 1, and displays the ultrasound image on a display provided in the information processing device 2, for example.
 図2は、プローブ1の外観の例を示す斜視図である。 Figure 2 is a perspective view showing an example of the appearance of probe 1.
 図2に示すように、プローブ1は、先端部11と支持部12を有する。 As shown in FIG. 2, the probe 1 has a tip portion 11 and a support portion 12.
 先端部11における検査対象との接触面には、超音波を放射し、超音波の反射波を受信する小型の超音波センサ21が設けられる。 A small ultrasonic sensor 21, which emits ultrasonic waves and receives their reflected waves, is provided on the surface of the tip portion 11 that comes into contact with the inspection object.
 支持部12は、先端部11を支持し、超音波検査装置のユーザがプローブ1を把持するための部材である。支持部12における先端部11側の中央部には、略三角形の開口部Hが形成され、開口部Hを覆うように、板状の透明部材31が形成される。 The support portion 12 is a member that supports the tip portion 11 and allows a user of the ultrasound examination device to grasp the probe 1. A roughly triangular opening H is formed in the center of the support portion 12 on the tip portion 11 side, and a plate-shaped transparent member 31 is formed to cover the opening H.
 図3は、プローブ1を先端部11側から見た場合の斜視図である。 Figure 3 is a perspective view of the probe 1 as seen from the tip 11 side.
 図3に示すように、開口部Hの内部には、カメラ22が設けられる。例えば、カメラ22は、先端部11が直接的または間接的に取り付けられた力覚作用部を備え、本体の少なくとも一部が支持部12と一体になって構成される。カメラ22の詳細な構成については後述する。カメラ22は、プローブ1の外部空間を撮影する。 As shown in FIG. 3, a camera 22 is provided inside the opening H. For example, the camera 22 includes a force sense acting unit to which the tip portion 11 is directly or indirectly attached, and at least a part of its body is configured integrally with the support portion 12. The detailed configuration of the camera 22 will be described later. The camera 22 photographs the space outside the probe 1.
 カメラ22と外部空間は、透明部材31により隔てられている。透明部材31によりカメラ22の防塵、防滴、および防水を実現することができる。プローブ1を清掃しやすくするために、透明部材31と支持部12の境界部分が平らであることが望ましい。なお、透明部材31の形状は板状に限定されない。例えば、カメラ22の少なくとも一部が支持部12の外部に露出している場合、カメラ22を覆うように、半球状の透明部材31が形成されてもよい。 The camera 22 is separated from the external space by the transparent member 31, which makes the camera 22 dustproof, drip-proof, and waterproof. To make the probe 1 easier to clean, it is desirable that the boundary between the transparent member 31 and the support portion 12 be flat. Note that the shape of the transparent member 31 is not limited to a plate shape. For example, if at least a part of the camera 22 is exposed outside the support portion 12, the transparent member 31 may be formed in a hemispherical shape covering the camera 22.
 図4は、プローブ1の使用方法の例を示す図である。 Figure 4 shows an example of how to use the probe 1.
 超音波検査を行うユーザは、図4に示すように、プローブ1の先端部11を、検査対象Objの表面の所定の位置(例えば、内臓などの直上の位置)に押し当てる。このとき、情報処理装置2のディスプレイには、例えば、検査対象の内部の様子が写る超音波画像がリアルタイムに表示される。なお、超音波検査を行うユーザは、検査対象となる人物自身であってもよいし、検査対象となる人物以外の人物であってもよい。 As shown in FIG. 4, a user performing an ultrasound examination presses the tip 11 of the probe 1 against a predetermined position on the surface of the object Obj to be examined (e.g., a position directly above an internal organ). At this time, an ultrasound image showing, for example, the inside of the object to be examined is displayed in real time on the display of the information processing device 2. Note that the user performing the ultrasound examination may be the person to be examined, or may be a person other than the person to be examined.
 カメラ22は、例えば、プローブ1が検査対象Objと接触する方向(先端部11側の方向)を撮影する。カメラ22により撮影された撮影画像には、例えば検査対象Objの一部が写る。超音波検査装置は、カメラ22の撮影画像に基づいて、検査対象Objに対するプローブ1の相対位置を計測する。プローブ1の相対位置は、例えば、撮影画像を入力とするVisual SLAM(Simultaneous Localization And Mapping)技術を用いて計測される。 The camera 22 captures, for example, an image in the direction in which the probe 1 comes into contact with the object of inspection Obj (toward the tip 11). The image captured by the camera 22 captures, for example, a part of the object of inspection Obj. The ultrasonic inspection device measures the relative position of the probe 1 with respect to the object of inspection Obj based on the image captured by the camera 22. The relative position of the probe 1 is measured, for example, using Visual SLAM (Simultaneous Localization And Mapping) technology that uses the captured image as input.
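As a rough illustration of the idea (not the actual Visual SLAM pipeline, which also builds a map of the environment), the core step of estimating probe motion between two camera frames can be reduced to fitting a rigid transform to feature points tracked across the frames. The following pure-Python sketch handles the 2D case; all names are hypothetical.

```python
import math

# Sketch (assumption): least-squares 2D rigid transform (rotation + translation)
# mapping feature points in the previous frame onto the current frame, a
# simplified stand-in for the motion-estimation step of Visual SLAM.

def estimate_rigid_2d(prev_pts, curr_pts):
    """Return (theta, tx, ty) best mapping prev_pts onto curr_pts."""
    n = len(prev_pts)
    cpx = sum(p[0] for p in prev_pts) / n
    cpy = sum(p[1] for p in prev_pts) / n
    ccx = sum(c[0] for c in curr_pts) / n
    ccy = sum(c[1] for c in curr_pts) / n
    # Accumulate the cross-covariance of the centered point sets.
    sxx = sxy = syx = syy = 0.0
    for (px, py), (cx, cy) in zip(prev_pts, curr_pts):
        qx, qy = px - cpx, py - cpy
        rx, ry = cx - ccx, cy - ccy
        sxx += qx * rx; sxy += qx * ry
        syx += qy * rx; syy += qy * ry
    theta = math.atan2(sxy - syx, sxx + syy)  # optimal rotation angle
    # Translation carries the rotated previous centroid onto the new centroid.
    tx = ccx - (cpx * math.cos(theta) - cpy * math.sin(theta))
    ty = ccy - (cpx * math.sin(theta) + cpy * math.cos(theta))
    return theta, tx, ty

# Features shifted by (2, 1) with no rotation between frames:
theta, tx, ty = estimate_rigid_2d([(0, 0), (1, 0), (0, 1)],
                                  [(2, 1), (3, 1), (2, 2)])
print(theta, tx, ty)  # theta ~ 0, translation ~ (2, 1)
```

Accumulating such frame-to-frame transforms yields the probe trajectory relative to the inspection object, which is the quantity the calculation unit needs.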
 なお、情報処理装置2は、カメラ22の撮影画像に基づいて、検査対象となる人物の肌荒れなどの異常を検出することも可能である。カメラ22をプローブ1に設けることで、カメラをプローブ1の外部に設置した場合と比較して、検査対象をより近くで撮影することができ、より多くの画素数で患部を撮影することができる。 In addition, the information processing device 2 can also detect abnormalities such as rough skin of the person being examined based on the image captured by the camera 22. By providing the camera 22 on the probe 1, it is possible to capture an image of the subject being examined from a closer distance than in the case where the camera is installed outside the probe 1, and it is possible to capture an image of the affected area with a greater number of pixels.
 ユーザが、プローブ1の先端部11を検査対象Objに押し当てると、検査対象Objには、矢印A1で示す外力が加えられ、プローブ1の先端部11には、矢印A2で示す外力が加えられる。プローブの先端部11に作用する外力は、検査対象Objに作用する外力の方向と逆向きの力となる。 When the user presses the tip 11 of the probe 1 against the test object Obj, an external force indicated by arrow A1 is applied to the test object Obj, and an external force indicated by arrow A2 is applied to the tip 11 of the probe 1. The external force acting on the tip 11 of the probe is in the opposite direction to the external force acting on the test object Obj.
 本技術の超音波検査装置においては、カメラ22が光学式の力覚センサとしても利用される。言い換えると、超音波検査装置は、カメラ22の撮影画像に基づいて、プローブ1の先端部11に作用する外力を計測する。 In the ultrasonic inspection device of this technology, the camera 22 is also used as an optical force sensor. In other words, the ultrasonic inspection device measures the external force acting on the tip 11 of the probe 1 based on the image captured by the camera 22.
 図5を参照して、光学式の力覚センサとして利用されるカメラ22の構成について説明する。図5は、カメラ22の構成を模式的に示す断面図である。 The configuration of the camera 22 used as an optical force sensor will be described with reference to FIG. 5. FIG. 5 is a cross-sectional view showing a schematic configuration of the camera 22.
 図5に示すように、カメラ22は、ベース部110、力覚作用部120、起歪体130,180、撮影部140、光源部150、第1ミラー161、第2ミラー162、およびハーフミラー170を備える。 As shown in FIG. 5, the camera 22 includes a base unit 110, a force sensor unit 120, strain bodies 130 and 180, an image capturing unit 140, a light source unit 150, a first mirror 161, a second mirror 162, and a half mirror 170.
 ベース部110は、略中央に光取込孔110Hが設けられた平板形状の剛直な構造部材である。ベース部110は、略中央に設けられた光取込孔110Hを介して、カメラ22の外部から入射した光(外光とも称する)をベース部110の第1面S101側(カメラ22に入射する外光の入射側と反対側)に設けられた撮影部140に入射させる。 The base unit 110 is a rigid structural member in the shape of a flat plate with a light intake hole 110H provided approximately in the center. The base unit 110 allows light incident from outside the camera 22 (also called external light) to enter the photographing unit 140 provided on the first surface S101 side of the base unit 110 (the side opposite to the side where external light enters the camera 22) through the light intake hole 110H provided approximately in the center.
 力覚作用部120は、ベース部110の第2面S102側(カメラ22に入射する外光の入射側)に起歪体130を介して設けられた剛直な構造部材である。力覚作用部120は、例えば、光取込孔110Hの周囲のベース部110の第2面S102と対向するように設けられてもよい。 The force sense acting unit 120 is a rigid structural member provided via a strain generating body 130 on the second surface S102 side of the base unit 110 (the side on which external light enters the camera 22). The force sense acting unit 120 may be provided, for example, so as to face the second surface S102 of the base unit 110 around the light intake hole 110H.
 力覚作用部120は、カメラ22において、先端部11を介して加えられる外部からの力(外力)が作用する部位である。力覚作用部120に外力が作用した場合、力覚作用部120およびベース部110の間の起歪体130が変形し、第1ミラー161および第2ミラー162の位置関係が変位する。これにより、光源部150から出射され、第1ミラー161および第2ミラー162によって多重反射された反射光の各光点の位置が変位する。したがって、カメラ22は、第1ミラー161および第2ミラー162によって多重反射された反射光の各光点の位置の変位を測定することで、力覚作用部120に作用した外力を計測することができる。 The force sense acting unit 120 is the part of the camera 22 on which force applied from the outside (external force) via the tip portion 11 acts. When an external force acts on the force sense acting unit 120, the strain generating body 130 between the force sense acting unit 120 and the base portion 110 deforms, and the positional relationship between the first mirror 161 and the second mirror 162 changes. This displaces the position of each light point of the reflected light that is emitted from the light source portion 150 and multiple-reflected by the first mirror 161 and the second mirror 162. Therefore, by measuring the displacement of the position of each light point of this multiple-reflected light, the camera 22 can measure the external force that has acted on the force sense acting unit 120.
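If the relation between light-point displacement and external force has been calibrated in advance, the measurement described above reduces, in the simplest linear model, to multiplying the observed displacement vector by a calibration matrix. The matrix and displacement values below are made-up illustrative numbers, not calibration data from this disclosure.

```python
# Sketch (assumption): linear model F = K @ d, where d stacks the observed
# light-point displacements and K is a pre-calibrated matrix whose rows yield
# the force and moment components (here Fz, Mx, My).

def apply_calibration(K, d):
    """Multiply calibration matrix K by displacement vector d, row by row."""
    return [sum(k * x for k, x in zip(row, d)) for row in K]

# Hypothetical 3x4 calibration: two light points, (dx, dy) each.
K = [
    [0.5, 0.0, 0.5, 0.0],   # Fz: common-mode x-shift of both points
    [0.0, 1.0, 0.0, -1.0],  # Mx: differential y-shift of the two points
    [1.0, 0.0, -1.0, 0.0],  # My: differential x-shift of the two points
]
d = [0.2, 0.1, 0.2, -0.1]   # observed point displacements (pixels)
print(apply_calibration(K, d))  # -> [0.2, 0.2, 0.0]
```

A common-mode shift of the points thus reads out as a normal force, while differential shifts read out as moments, mirroring how the force Fz and the moments Mx, My are distinguished later in FIG. 7.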
 力覚作用部120は、複数設けられてもよい。このような場合、カメラ22は、カメラ22の複数の箇所に作用する外力を力覚作用部120の各々で受けることで、複数の箇所にそれぞれ作用する外力を計測することができる。 A plurality of force sense acting units 120 may be provided. In such a case, the camera 22 can measure the external forces acting on the plurality of locations by receiving the external forces acting on the plurality of locations of the camera 22 at each of the force sense acting units 120.
 具体的には、複数の力覚作用部120は、光取込孔110Hに対して点対称または線対称に配置されて設けられてもよい。例えば、力覚作用部120は、光取込孔110Hを挟んで180度の配置で(すなわち、光取込孔110Hを挟んで対向して)2つ設けられてもよい。力覚作用部120は、光取込孔110Hを中心として互いに120度の配置で3つ設けられてもよく、光取込孔110Hを中心として互いに90度の配置で4つ設けられてもよい。複数の力覚作用部120は、光取込孔110Hに対して点対称または線対称に配置されることで、カメラ22に作用する外力を等方的に計測することが可能である。 Specifically, the multiple force sense units 120 may be arranged in point symmetry or line symmetry with respect to the light intake hole 110H. For example, two force sense units 120 may be arranged at 180 degrees across the light intake hole 110H (i.e., facing each other across the light intake hole 110H). Three force sense units 120 may be arranged at 120 degrees from each other with the light intake hole 110H at the center, or four force sense units 120 may be arranged at 90 degrees from each other with the light intake hole 110H at the center. By arranging the multiple force sense units 120 in point symmetry or line symmetry with respect to the light intake hole 110H, it is possible to isotropically measure external forces acting on the camera 22.
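The symmetric arrangements described above (two parts at 180 degrees, three at 120 degrees, four at 90 degrees) are all instances of equal angular spacing around the light intake hole. A minimal sketch, assuming a coordinate system centered on the hole:

```python
import math

# Sketch (assumption): centers of n force sense acting parts placed with equal
# angular spacing (360/n degrees) on a circle around the light intake hole.
# Equal spacing is what makes the external-force measurement isotropic.

def symmetric_positions(n, radius):
    """(x, y) centers of n parts spaced 2*pi/n apart around the origin."""
    return [(radius * math.cos(2 * math.pi * k / n),
             radius * math.sin(2 * math.pi * k / n)) for k in range(n)]

for n in (2, 3, 4):
    pts = [(round(x, 3), round(y, 3)) for x, y in symmetric_positions(n, 1.0)]
    print(n, pts)
```

For n = 2 the two positions are diametrically opposed across the hole, matching the point-symmetric arrangement in the text.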
 ハーフミラー170は、光取込孔110Hを覆うように外光の入射側に設けられる。具体的には、ハーフミラー170は、矩形または円形の平板形状に構成され、複数の力覚作用部120の間に起歪体180を介して架け渡されるように設けられてもよい。 The half mirror 170 is provided on the side where external light enters so as to cover the light intake hole 110H. Specifically, the half mirror 170 may be configured in a rectangular or circular flat plate shape and may be provided so as to span multiple force sense acting units 120 via a strain generating body 180.
 ハーフミラー170は、20%超90%未満の光透過率と、10%超80%未満の光反射率とを有し、入射する光を一部透過させると共に、入射する光を一部反射する光学部材である。例えば、ハーフミラー170は、ガラスや樹脂などで構成された透明部材にクロム(Cr)などの金属材料を用いて光透過率および光反射率を有する程度の極薄膜を成膜することで構成され得る。または、ハーフミラー170は、ガラスや樹脂などで構成された透明部材に誘電体多層膜を光透過率および光反射率を有するように成膜することで構成され得る。ハーフミラー170の光透過率および光反射率は、カメラ22で実現する特性に応じて任意の値に設定することが可能である。 The half mirror 170 is an optical member that has a light transmittance of more than 20% and less than 90% and a light reflectance of more than 10% and less than 80%, transmitting part of the incident light while reflecting another part of it. For example, the half mirror 170 can be constructed by depositing, on a transparent member made of glass or resin, an extremely thin film of a metal material such as chromium (Cr) so as to have both light transmittance and light reflectance. Alternatively, the half mirror 170 can be constructed by depositing a dielectric multilayer film on a transparent member made of glass or resin so as to have both light transmittance and light reflectance. The light transmittance and light reflectance of the half mirror 170 can be set to any values according to the characteristics to be realized by the camera 22.
 光透過性を有するハーフミラー170は、例えば、カメラ22に入射する外光を光取込孔110Hが設けられたカメラ22の内部に透過させることができる。これによれば、カメラ22は、光取込孔110Hを介して取り込んだ外光によって、カメラ22の外部空間を撮影部140で撮影することができる。また、光反射性を有するハーフミラー170は、例えば、光源部150から出射された光を第1ミラー161および第2ミラー162と同様に反射することができる。これによれば、カメラ22は、光源部150から出射され、第1ミラー161、第2ミラー162、およびハーフミラー170で多重反射された反射光の各光点の位置を撮影部140で計測することができる。したがって、カメラ22は、カメラ22の外部空間の撮影と、第1ミラー161、第2ミラー162、およびハーフミラー170で多重反射された反射光の光点群の計測とを撮影部140にて同時に行うことが可能である。 The half mirror 170, which has optical transparency, can transmit, for example, external light incident on the camera 22 into the interior of the camera 22, which is provided with the light intake hole 110H. This allows the camera 22 to capture the external space of the camera 22 with the imaging unit 140 by using the external light captured through the light intake hole 110H. The half mirror 170, which has optical reflectivity, can also reflect, for example, the light emitted from the light source unit 150 in the same manner as the first mirror 161 and the second mirror 162. This allows the camera 22 to measure the positions of each light point of the reflected light that is emitted from the light source unit 150 and is multiple-reflected by the first mirror 161, the second mirror 162, and the half mirror 170 with the imaging unit 140. Therefore, the camera 22 can simultaneously capture the external space of the camera 22 and measure the light point group of the reflected light that is multiple-reflected by the first mirror 161, the second mirror 162, and the half mirror 170 with the imaging unit 140.
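Because the light-point group is superimposed on the scene image, locating each point in the captured frame is a simple image-processing step. The following is an illustrative sketch (an assumption, not the method of this disclosure) that thresholds brightness and takes the centroid of the lit pixels:

```python
# Sketch (assumption): locate a bright light point in a grayscale frame by
# thresholding and computing the centroid of the above-threshold pixels.

def bright_spot_centroid(image, threshold):
    """Centroid (row, col) of pixels above threshold; None if none qualify."""
    count = row_sum = col_sum = 0
    for r, line in enumerate(image):
        for c, value in enumerate(line):
            if value > threshold:
                count += 1
                row_sum += r
                col_sum += c
    if count == 0:
        return None
    return row_sum / count, col_sum / count

# Tiny synthetic frame: a dim scene with one bright 2x2 light point.
frame = [
    [0, 0, 0, 0],
    [0, 250, 250, 0],
    [0, 250, 250, 0],
    [0, 0, 10, 0],
]
print(bright_spot_centroid(frame, 200))  # -> (1.5, 1.5)
```

Tracking such centroids frame by frame gives the light-point displacements from which the external force is computed, while the pixels below threshold remain available as the ordinary scene image.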
 起歪体130,180は、作用する応力に比例して変形する構造部材である。例えば、起歪体130,180は、ゴム、エラストマー、ばねなどの明らかに変形しやすい弾性体であってもよい。また、起歪体130,180は、他の構成と同じ材質であるものの、他の構成よりも変形しやすいように低剛性で形成された構造部材であってもよい。起歪体130は、ベース部110と力覚作用部120との間に設けられ、起歪体180は、ハーフミラー170と力覚作用部120との間に設けられる。起歪体130,180は、力覚作用部120に作用する外力に応じて変形することで、第1ミラー161、第2ミラー162、およびハーフミラー170の位置関係を変位させることができる。 The flexure bodies 130 and 180 are structural members that deform in proportion to the stress acting thereon. For example, the flexure bodies 130 and 180 may be elastic bodies that are obviously easily deformed, such as rubber, elastomers, or springs. The flexure bodies 130 and 180 may also be structural members that are made of the same material as the other components, but are formed with low rigidity so that they are more easily deformed than the other components. The flexure body 130 is provided between the base unit 110 and the force sense acting unit 120, and the flexure body 180 is provided between the half mirror 170 and the force sense acting unit 120. The flexure bodies 130 and 180 can displace the positional relationship between the first mirror 161, the second mirror 162, and the half mirror 170 by deforming in response to an external force acting on the force sense acting unit 120.
 第1ミラー161は、ベース部110の第2面S102に設けられ、第2ミラー162は、力覚作用部120のベース部110と対向する面に設けられる。すなわち、第1ミラー161および第2ミラー162は、ベース部110および力覚作用部120で囲まれたカメラ22の内部空間側の面に設けられる。第1ミラー161および第2ミラー162は、例えば、ガラスまたは樹脂等で構成された透明部材にクロム(Cr)などの金属材料を十分な光反射率を有する膜厚で成膜することで構成され得る。互いに対向し合う第1ミラー161および第2ミラー162は、光源部150から出射された光を第1ミラー161および第2ミラー162の間の反射空間121で多重反射させることができる。 The first mirror 161 is provided on the second surface S102 of the base unit 110, and the second mirror 162 is provided on the surface of the force sense acting unit 120 facing the base unit 110. That is, the first mirror 161 and the second mirror 162 are provided on the surface of the camera 22 facing the internal space surrounded by the base unit 110 and the force sense acting unit 120. The first mirror 161 and the second mirror 162 can be formed, for example, by depositing a metal material such as chrome (Cr) with a film thickness having sufficient light reflectance on a transparent member made of glass or resin. The first mirror 161 and the second mirror 162 facing each other can multiple-reflect the light emitted from the light source unit 150 in the reflection space 121 between the first mirror 161 and the second mirror 162.
 ここで、図6を参照して、対向する第1ミラー161および第2ミラー162による多重反射について説明する。図6は、互いに対向する3枚のミラーで囲まれた空間に光を出射した様態を示す図である。 Now, referring to Figure 6, we will explain the multiple reflections caused by the opposing first mirror 161 and second mirror 162. Figure 6 is a diagram showing how light is emitted into a space surrounded by three opposing mirrors.
 図6に示すように、三角柱の各側面に対応する位置に互いに対向するように設けられた第1ミラー1610、第2ミラー1620、および第3ミラー1630では、光源部1500から出射された光Lが多重反射される。このような場合、光源部1500から出射された光Lは、第1ミラー1610、第2ミラー1620、および第3ミラー1630の多重反射によって反射光の光点の数が増幅されて受光部1400に受光される。また、第1ミラー1610、第2ミラー1620、および第3ミラー1630が変位した場合、光源部1500から出射された光Lの反射光の光点の位置は、第1ミラー1610、第2ミラー1620、および第3ミラー1630の変位を増幅して変位する。 As shown in FIG. 6, the light L emitted from the light source unit 1500 is multiple-reflected by the first mirror 1610, the second mirror 1620, and the third mirror 1630, which are provided facing each other at positions corresponding to each side of the triangular prism. In such a case, the light L emitted from the light source unit 1500 is received by the light receiving unit 1400 with the number of light points of the reflected light being amplified by the multiple reflections of the first mirror 1610, the second mirror 1620, and the third mirror 1630. In addition, when the first mirror 1610, the second mirror 1620, and the third mirror 1630 are displaced, the positions of the light points of the reflected light of the light L emitted from the light source unit 1500 are displaced by amplifying the displacement of the first mirror 1610, the second mirror 1620, and the third mirror 1630.
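The amplification effect described above can be approximated with a simple geometric model: if one mirror shifts by d along its normal, each bounce off that mirror offsets the reflected beam by roughly 2d, so after n bounces the light point has moved by about 2nd. This linear model is an illustrative simplification, not a formula from this disclosure.

```python
# Simplified geometric model (assumption): the light-point displacement grows
# in proportion to the number of bounces off the displaced mirror, which is
# why multiple reflection amplifies small mirror displacements.

def spot_displacement(mirror_shift, bounces):
    """Approximate light-point shift after `bounces` reflections."""
    return 2 * bounces * mirror_shift

print(spot_displacement(1, 5))  # a 1-unit mirror shift becomes a 10-unit spot shift
```

This is the reason the multiply reflected light points make small deformations of the strain bodies measurable by the image sensor.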
 なお、第1ミラー1610、第2ミラー1620、および第3ミラー1630は、正三角形または二等辺三角形の各辺に対応するように配置されてもよく、正三角形または二等辺三角形を崩した三角形の各辺に対応するように配置されてもよい。第1ミラー1610、第2ミラー1620、および第3ミラー1630は、対称性が低い三角形(すなわち、正三角形または二等辺三角形を崩した三角形)の各辺に対応するように配置されることによって、多重反射による反射光の光点の数をより増加させることが可能である。 The first mirror 1610, the second mirror 1620, and the third mirror 1630 may be arranged to correspond to each side of an equilateral triangle or an isosceles triangle, or may be arranged to correspond to each side of a triangle that is a broken equilateral triangle or an isosceles triangle. By arranging the first mirror 1610, the second mirror 1620, and the third mirror 1630 to correspond to each side of a triangle with low symmetry (i.e., an equilateral triangle or an isosceles triangle), it is possible to further increase the number of light spots of reflected light due to multiple reflections.
 図5に戻り、カメラ22では、第1ミラー161および第2ミラー162は、図示しない第3ミラーとの間で図6に示すような三角柱の側面に対応する構造を構成することができる。このような場合、カメラ22は、第1ミラー161、第2ミラー162、および第3ミラーで側面が構成された三角柱の内部を反射空間121として、光源部150から出射された光を多重反射させることができる。 Returning to FIG. 5, in the camera 22, the first mirror 161, the second mirror 162, and the third mirror (not shown) can form a structure corresponding to the sides of a triangular prism as shown in FIG. 6. In such a case, the camera 22 can multiple-reflect the light emitted from the light source unit 150, with the inside of the triangular prism whose sides are formed by the first mirror 161, the second mirror 162, and the third mirror being the reflection space 121.
 なお、第1ミラー161および第2ミラー162は、図示しない第3ミラーとの間で三角錐の側面に対応する構造を構成してもよい。このような場合であっても、カメラ22は、第1ミラー161、第2ミラー162、および第3ミラーで側面が構成された三角錐の内部を反射空間121として、光源部150から出射された光を多重反射させることができる。 The first mirror 161 and the second mirror 162 may form a structure corresponding to the sides of a triangular pyramid with a third mirror (not shown). Even in this case, the camera 22 can multiple-reflect the light emitted from the light source unit 150 by using the inside of the triangular pyramid whose sides are formed by the first mirror 161, the second mirror 162, and the third mirror as a reflection space 121.
 また、第1ミラー161および第2ミラー162は、図示しない第3ミラーおよび第4ミラーとの間で四角柱の側面に対応する構造を構成してもよい。このような場合であっても、カメラ22は、第1ミラー161、第2ミラー162、第3ミラー、および第4ミラーで側面が構成された四角柱の内部を反射空間121として、光源部150から出射された光を多重反射させることができる。 Furthermore, the first mirror 161 and the second mirror 162 may form a structure corresponding to the side surfaces of a rectangular prism between them and a third mirror and a fourth mirror (not shown). Even in such a case, the camera 22 can multiple-reflect the light emitted from the light source unit 150 by using the inside of the rectangular prism whose sides are formed by the first mirror 161, the second mirror 162, the third mirror, and the fourth mirror as the reflection space 121.
 さらに、第1ミラー161および第2ミラー162は、図示しない第3ミラーおよび第4ミラーとの間で四角錐の側面に対応する構造を構成してもよい。このような場合であっても、カメラ22は、第1ミラー161、第2ミラー162、第3ミラー、および第4ミラーで側面が構成された四角錐の内部を反射空間121として、光源部150から出射された光を多重反射させることができる。 Furthermore, the first mirror 161 and the second mirror 162 may form a structure corresponding to the sides of a quadrangular pyramid between them and a third mirror and a fourth mirror (not shown). Even in this case, the camera 22 can multiple-reflect the light emitted from the light source unit 150 by using the inside of the quadrangular pyramid whose sides are formed by the first mirror 161, the second mirror 162, the third mirror, and the fourth mirror as the reflection space 121.
 光源部150は、ベース部110の第2面S102側に光を出射する。具体的には、光源部150は、第1ミラー161および第2ミラー162にて少なくとも2面以上が囲まれた反射空間121に光を出射する。反射空間121とは、例えば、互いに対向する第1ミラー161および第2ミラー162の間の空間である。光源部150は、反射空間121に光を出射することで、出射した光を第1ミラー161および第2ミラー162の間の反射空間121で多重反射させることができる。光源部150は、反射空間121の底面側(すなわち、ベース部110側)から反射空間121に光を出射してもよく、反射空間121の側面側(すなわち、起歪体130側)から反射空間121に光を出射してもよい。光源部150は、例えば、直進性の高い光を出射可能なLED(Light Emitting Diode)光源であってもよい。 The light source unit 150 emits light toward the second surface S102 side of the base unit 110. Specifically, the light source unit 150 emits light into a reflection space 121 surrounded on at least two sides by the first mirror 161 and the second mirror 162. The reflection space 121 is, for example, the space between the first mirror 161 and the second mirror 162 that face each other. By emitting light into the reflection space 121, the light source unit 150 can cause the emitted light to be multiple-reflected in the reflection space 121 between the first mirror 161 and the second mirror 162. The light source unit 150 may emit light into the reflection space 121 from the bottom side of the reflection space 121 (i.e., the base unit 110 side), or may emit light into the reflection space 121 from the side side of the reflection space 121 (i.e., the strain body 130 side). The light source unit 150 may be, for example, an LED (Light Emitting Diode) light source capable of emitting light with high linearity.
 例えば、光源部150は、ベース部110側に設けられてもよい。このような場合、カメラ22においては、撮影部140への配線と同様に光源部150への配線を形成することができるため、配線形成のコストおよび作業量を低減することができる。したがって、カメラ22の生産コストをより低減することができる。 For example, the light source unit 150 may be provided on the base unit 110 side. In such a case, in the camera 22, wiring to the light source unit 150 can be formed in the same way as wiring to the imaging unit 140, so the cost and amount of work involved in forming the wiring can be reduced. Therefore, the production cost of the camera 22 can be further reduced.
 また、光源部150は、LED光源などの本体部分および配線等が反射空間121に露出しないようにベース部110の内部に設けられてもよい。このような場合、光源部150の本体部分および配線の像が第1ミラー161および第2ミラー162にて多重反射することを防止することができる。したがって、光源部150の本体部分および配線の多重反射像がノイズ要因となることを防止することができるため、第1ミラー161および第2ミラー162にて多重反射された反射光の光点群の計測感度が低下することを防止することができる。 The light source unit 150 may be provided inside the base unit 110 so that the main body of the LED light source and wiring are not exposed in the reflection space 121. In this case, it is possible to prevent the image of the main body and wiring of the light source unit 150 from being multiple-reflected on the first mirror 161 and the second mirror 162. This prevents the multiple-reflection image of the main body and wiring of the light source unit 150 from becoming a noise source, and therefore prevents a decrease in the measurement sensitivity of the light spot group of the reflected light that is multiple-reflected on the first mirror 161 and the second mirror 162.
 さらに、光源部150は、ピンホールを介して反射空間121に光を出射してもよい。ピンホールは、例えば、数mm程度の直径の孔である。光源部150は、ピンホールを介して反射空間121に光を出射することで、出射された光の収束性をより高めることができる。これによれば、光源部150は、第1ミラー161および第2ミラー162にて多重反射された反射光の各光点の形状をより小さな真円とすることができるため、各光点の計測感度を向上させることができる。また、ピンホール加工の精度は、一般的に、光源部150の組み付け時の位置決め精度よりも高いため、光源部150は、ピンホールを介して反射空間121に光を出射することで、光を出射する位置の精度をより高めることができる。したがって第1ミラー161および第2ミラー162にて多重反射された反射光の各光点の位置をより容易に制御することが可能である。 Furthermore, the light source unit 150 may emit light into the reflection space 121 through a pinhole. The pinhole is, for example, a hole with a diameter of about several mm. The light source unit 150 can further improve the convergence of the emitted light by emitting light into the reflection space 121 through the pinhole. As a result, the light source unit 150 can make the shape of each light point of the reflected light reflected multiple times by the first mirror 161 and the second mirror 162 a smaller perfect circle, thereby improving the measurement sensitivity of each light point. In addition, since the accuracy of pinhole processing is generally higher than the positioning accuracy when the light source unit 150 is assembled, the light source unit 150 can further improve the accuracy of the position where the light is emitted by emitting light into the reflection space 121 through the pinhole. Therefore, it is possible to more easily control the position of each light point of the reflected light reflected multiple times by the first mirror 161 and the second mirror 162.
 ここで、力覚作用部120が複数設けられる場合、光源部150は、力覚作用部120の各々に対応して複数設けられてもよい。このような場合、光源部150は、対応する力覚作用部120ごとに異なる色の光を出射してもよい。また、光源部150は、対応する力覚作用部120ごとに反射光の光点群が分離されるように、対応する力覚作用部120に設けられた第2ミラー162および第1ミラー161にて2面が囲まれた別の反射空間121にそれぞれ光を出射してもよい。これによれば、カメラ22は、力覚作用部120の各々に作用された外力を互いに色または位置にて分離可能な反射光の光点群の変位で計測することができる。したがって、カメラ22は、力覚作用部120の各々に作用する外力を互いに高い精度で分離して計測することができる。 Here, when a plurality of haptic action sections 120 are provided, a plurality of light source sections 150 may be provided corresponding to each of the haptic action sections 120. In such a case, the light source section 150 may emit light of a different color for each corresponding haptic action section 120. Also, the light source section 150 may emit light to another reflection space 121 surrounded on two sides by the second mirror 162 and the first mirror 161 provided on the corresponding haptic action section 120 so that the light spot group of the reflected light is separated for each corresponding haptic action section 120. In this way, the camera 22 can measure the external force acting on each of the haptic action sections 120 by the displacement of the light spot group of the reflected light that can be separated from each other by color or position. Therefore, the camera 22 can measure the external force acting on each of the haptic action sections 120 by separating them from each other with high accuracy.
 撮影部140は、光取込孔110Hを介して入射する光を受光することで撮影画像を取得するイメージセンサである。撮影部140は、例えば、CMOS(Complementary Metal-Oxide Semiconductor)イメージセンサまたはCCD(Charge Coupled Device)イメージセンサであってもよい。撮影部140は、ハーフミラー170を透過してカメラ22に入射する外光と、光源部150から出射され、第1ミラー161、第2ミラー162、およびハーフミラー170で多重反射された反射光の光点群とを受光することができる。すなわち、撮影部140は、カメラ22の外部空間の撮影画像に、多重反射された反射光の光点群が重畳された画像を取得することができる。これによれば、カメラ22は、第1ミラー161、第2ミラー162、およびハーフミラー170で多重反射された反射光の各光点の位置の変位から、力覚作用部120に作用する外力を計測することができる。したがって、カメラ22は、外部空間の撮影と、力覚作用部120に作用する外力の計測とを同時に行うことが可能である。 The photographing unit 140 is an image sensor that acquires a photographed image by receiving light incident through the light intake hole 110H. The photographing unit 140 may be, for example, a CMOS (Complementary Metal-Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor. The photographing unit 140 can receive external light that passes through the half mirror 170 and enters the camera 22, and a group of light points of the reflected light that is emitted from the light source unit 150 and is multiple-reflected by the first mirror 161, the second mirror 162, and the half mirror 170. In other words, the photographing unit 140 can acquire an image in which the group of light points of the multiple-reflected reflected light is superimposed on the photographed image of the external space of the camera 22. In this way, the camera 22 can measure the external force acting on the force sensing unit 120 from the displacement of the position of each light point of the reflected light that is multiple-reflected by the first mirror 161, the second mirror 162, and the half mirror 170. Therefore, the camera 22 can simultaneously capture images of the external space and measure the external forces acting on the force sense action unit 120.
 図7は、カメラ22による撮影と、カメラ22に作用する外力とを説明する図である。 Figure 7 is a diagram explaining the photographing by the camera 22 and the external forces acting on the camera 22.
 例えば、図7に示すように、カメラ22は、外部空間に存在する被写体Obj1を撮影することができる。また、カメラ22は、図7に正対して上部の力覚作用部120aにてZ軸方向の力Fz1、X軸回りのモーメントMx1、およびY軸回りのモーメントMy1を受けることで、これらの力Fz1、モーメントMx1、およびモーメントMy1を検出することができる。さらに、カメラ22は、図7に正対して下部の力覚作用部120bにてZ軸方向の力Fz2、X軸回りのモーメントMx2、およびY軸回りのモーメントMy2を受けることで、これらの力Fz2、モーメントMx2、およびモーメントMy2を検出することができる。 For example, as shown in FIG. 7, the camera 22 can capture an image of an object Obj1 present in external space. Furthermore, the camera 22 receives a force Fz1 in the Z-axis direction, a moment Mx1 about the X-axis, and a moment My1 about the Y-axis at the upper force sense acting section 120a while facing directly in FIG. 7, and can detect these forces Fz1, moment Mx1, and moment My1. Furthermore, the camera 22 receives a force Fz2 in the Z-axis direction, a moment Mx2 about the X-axis, and moment My2 about the Y-axis at the lower force sense acting section 120b while facing directly in FIG. 7, and can detect these forces Fz2, moment Mx2, and moment My2.
 図8は、カメラ22の撮影画像の一例を示す図である。 FIG. 8 shows an example of an image captured by the camera 22.
 図8に示すように、撮影部140で撮影された撮影画像CIには、被写体Obj1と、光点群LC1,LC2とが含まれる。 As shown in FIG. 8, the captured image CI captured by the image capture unit 140 includes an object Obj1 and light spot groups LC1 and LC2.
 光点群LC1は、例えば、図7に正対して上部の光源部150aから出射された光が第1ミラー161aおよび第2ミラー162aで多重反射された反射光の光点群である。力覚作用部120aに力Fz1、モーメントMx1、およびモーメントMy1が作用した場合、力覚作用部120aおよび第2ミラー162aの位置が変位する。これにより、撮影画像CI上の光点群LC1の位置は、力覚作用部120aに作用する力Fz1、モーメントMx1、およびモーメントMy1の各々に対応した方向に変位する。したがって、カメラ22は、光点群LC1の位置の変位から力覚作用部120aに作用する力Fz1、モーメントMx1、およびモーメントMy1の向きおよび大きさを算出することができる。 The light spot group LC1 is, for example, a light spot group of reflected light that is emitted from the upper light source unit 150a facing directly in FIG. 7 and is multiple-reflected by the first mirror 161a and the second mirror 162a. When the force Fz1, moment Mx1, and moment My1 act on the force sense action unit 120a, the positions of the force sense action unit 120a and the second mirror 162a are displaced. As a result, the position of the light spot group LC1 on the captured image CI is displaced in the directions corresponding to the force Fz1, moment Mx1, and moment My1 acting on the force sense action unit 120a. Therefore, the camera 22 can calculate the direction and magnitude of the force Fz1, moment Mx1, and moment My1 acting on the force sense action unit 120a from the displacement of the position of the light spot group LC1.
 Similarly, the light spot group LC2 is, for example, a group of light spots of light emitted from the lower light source section 150b as viewed in FIG. 7 and multiply reflected by the first mirror 161b and the second mirror 162b. When the force Fz2, the moment Mx2, and the moment My2 act on the force sense acting section 120b, the positions of the force sense acting section 120b and the second mirror 162b are displaced. As a result, the position of the light spot group LC2 in the captured image CI is displaced in directions corresponding to the force Fz2, the moment Mx2, and the moment My2 acting on the force sense acting section 120b. Accordingly, the camera 22 can calculate the directions and magnitudes of the force Fz2, the moment Mx2, and the moment My2 acting on the force sense acting section 120b from the displacement of the position of the light spot group LC2.
 Accordingly, by associating in advance the manner of displacement of the position of the light spot group LC1 of the reflected light with actually measured values of the direction and magnitude of the external force acting on the force sense acting section 120a, the camera 22 can calculate the direction and magnitude of the external force acting on the force sense acting section 120a. Likewise, by associating in advance the manner of displacement of the position of the light spot group LC2 with actually measured values, the camera 22 can calculate the direction and magnitude of the external force acting on the force sense acting section 120b. For example, the camera 22 may use machine learning to associate the manner of displacement of the positions of the light spot groups LC1 and LC2 with the actually measured directions and magnitudes of the external forces acting on the force sense acting sections 120a and 120b, or it may create calibration curves for that purpose.
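The association between light-spot displacement and external force described above can be sketched, for example, as a linear calibration fitted by least squares. The linear model, the data shape, and the use of NumPy are illustrative assumptions; the actual device may use machine learning or a nonlinear calibration curve instead.

```python
import numpy as np

def fit_calibration(displacements, forces):
    # Fit a linear map A such that forces ~= displacements @ A,
    # from paired observations gathered during calibration.
    A, *_ = np.linalg.lstsq(displacements, forces, rcond=None)
    return A

def estimate_force(A, displacement):
    # Estimate (Fz, Mx, My) for a newly observed spot displacement.
    return displacement @ A

# Hypothetical calibration data: 2-D spot displacement -> (Fz, Mx, My)
disp = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
force = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0],
                  [0.0, 3.0, 0.0], [2.0, 3.0, 0.0]])
A = fit_calibration(disp, force)
est = estimate_force(A, np.array([0.5, 0.5]))
```

With this toy data a displacement of 0.5 along each image axis maps back to Fz = 1.0 and Mx = 1.5 under the fitted linear model.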
 With this configuration, the camera 22 can capture the image CI in which the light spot groups LC1 and LC2 are superimposed on the image of the subject Obj1, and can therefore measure the forces and moments acting on the force sense acting sections 120a and 120b while simultaneously photographing the subject Obj1. In the probe 1, the tip portion 11 is attached such that its movement is transmitted to the force sense acting sections 120a and 120b, so the camera 22 can measure the external force acting on the tip portion 11.
 In general, ultrasonic inspection may use the posture of the probe and the relative position of the probe with respect to the person to be inspected (see, for example, Patent Document 1). When an ultrasonic inspection device is used in a hospital, the posture of the probe and its relative position with respect to the person to be inspected are acquired, for example, using a camera installed near a hospital bed.
 Conventionally, when a compact ultrasonic inspection device is used outside a hospital, for example, it takes time and effort to install a camera at a position where the probe and the inspection target can be photographed and to connect the camera to the inspection device. Considering that the ultrasonic inspection device is carried around, it is preferable that the ultrasonic sensor, the camera, and the like be provided in a single probe.
 As described above, in the ultrasonic inspection device of the present technology, the camera 22, which captures at least the area in the direction in which the probe 1 contacts the inspection target, is configured integrally with the probe 1 on which the ultrasonic sensor 21 is provided, and the relative position of the probe 1 with respect to the inspection target is measured based on the image captured by the camera 22. Because the camera 22 is integrated with the probe 1, the user can easily use the ultrasonic inspection device without installing a camera or connecting one to the device.
 In addition, in the ultrasonic inspection device of the present technology, the camera 22 captures the external space of the probe 1 and also captures a group of light spots of light emitted from a light source provided inside the probe 1 and reflected in a reflection space provided inside the probe 1. The external force acting on the tip portion 11 of the probe 1 is measured based on the displacement of the light spot group appearing in the image captured by the camera 22.
 Note that, instead of measuring the external force acting on the probe 1 based on the image captured by the camera 22, the external force may be measured by a force sensor formed to support the tip portion 11 in place of the camera 22. However, if a force sensor is introduced into a single probe together with the ultrasonic sensor and the camera, the probe may become large. To realize a portable ultrasonic inspection device, the probe is preferably as small as possible.
 When the camera 22 provided integrally with the probe 1 is also used as an optical force sensor, a dedicated force sensor and its wiring can be eliminated, making it possible to realize a smaller and lighter ultrasonic inspection device.
<2. Configuration and Operation of Each Device>

 FIG. 9 is a diagram showing an example of the configuration of the probe 1.
 As shown in FIG. 9, the probe 1 includes the ultrasonic sensor 21, the camera 22, an inertial sensor 23, and a calculation unit 24.
 The ultrasonic sensor 21 emits ultrasonic waves toward the inspection target and receives reflected waves of the ultrasonic waves. The ultrasonic sensor 21 measures the intensity of the received reflected waves and acquires ultrasonic measurement data indicating, for example, a time series of the measured intensities.
 The camera 22 captures the external space of the probe 1 and acquires a captured image.
 The inertial sensor 23 is composed of a gyro sensor and an acceleration sensor and is mounted inside the probe 1. The inertial sensor 23 measures at least one of the angular velocity and the acceleration of the probe 1.
 Because the acceleration sensor can measure how much the probe 1 is tilted with respect to the direction of gravity, its measurement results can be used to improve the accuracy of the tilt angle of the probe 1 measured by a technique such as Visual SLAM. As the accuracy of the tilt-angle measurement improves, errors arising in the measurement of the external force acting on the probe 1 are corrected more accurately, improving the accuracy of the measured external force.
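As a simple illustration of how a static accelerometer reading yields the tilt relative to gravity, the angle between the probe's axis and the gravity vector can be computed from the measured acceleration components. The axis naming and the use of a purely static reading are assumptions for this sketch.

```python
import math

def tilt_from_gravity(ax, ay, az):
    # Angle [rad] between the probe's z-axis and the gravity direction,
    # computed from a static accelerometer reading (ax, ay, az).
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    return math.acos(az / norm)

# Probe upright: gravity appears only on the z-axis
upright = tilt_from_gravity(0.0, 0.0, 9.8)    # 0 rad
# Probe lying on its side: gravity appears only on the x-axis
sideways = tilt_from_gravity(9.8, 0.0, 0.0)   # pi/2 rad
```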
 In addition, because the gyro sensor can measure rotational motion mainly about the gravity axis, its measurement results can be used to improve the accuracy of the motion state of the probe 1 measured by a technique such as Visual SLAM.
 Furthermore, the acceleration of the probe 1 measured by the acceleration sensor can be used to improve the accuracy of the translation amount of the probe 1 measured by a technique such as Visual SLAM. Because Visual SLAM measures the relative change in position with respect to the inspection target, it is not suited to measuring the absolute translation amount of the probe 1; for example, it may be unable to determine whether the inspection target moved or the probe 1 moved. By using the acceleration of the probe 1 measured by the acceleration sensor, the absolute change in the position of the probe 1 can be measured.
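The double integration of acceleration into a translation amount can be sketched as follows. The simple Euler scheme and the sample values are illustrative; a real implementation would also need bias and drift compensation.

```python
def integrate_translation(accels, dt):
    # Double-integrate a sampled acceleration series [m/s^2]:
    # first into velocity, then into displacement (Euler scheme).
    v = 0.0
    x = 0.0
    for a in accels:
        v += a * dt
        x += v * dt
    return x

# Constant 1 m/s^2 for 1 s sampled at 100 Hz gives roughly 0.5 m of travel
x = integrate_translation([1.0] * 100, 0.01)
```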
 The calculation unit 24 measures the relative position of the probe 1 with respect to the inspection target based on the image captured by the camera 22. The calculation unit 24 also measures the external force acting on the probe 1 based on the captured image.
 FIG. 10 is a block diagram showing an example of the configuration of the information processing device 2.
 As shown in FIG. 10, the information processing device 2 includes a data acquisition unit 51, a calculation unit 52, a communication unit 53, a recording unit 54, a presentation control unit 55, and a presentation unit 56.
 The data acquisition unit 51 acquires the ultrasonic measurement data, the captured image, the measurement results of the inertial sensor 23, the measurement result of the relative position of the probe 1, and the measurement result of the external force acting on the probe 1 from the probe 1, and supplies them to the calculation unit 52, the communication unit 53, and the recording unit 54. Because the external force acting on the probe 1 is measured by the calculation unit 24 of the probe 1, the data acquisition unit 51 may acquire from the probe 1 only the region of the captured image in which the subject appears, rather than the entire captured image including the light spot groups used for measuring the external force. Acquiring only the subject region reduces the amount of data exchanged between the probe 1 and the information processing device 2.
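Transferring only the subject region of the captured image, as described above, amounts to a simple crop. The image size and the region coordinates below are hypothetical.

```python
import numpy as np

def crop_subject_region(image, top, left, height, width):
    # Keep only the region showing the subject, dropping the areas
    # that contain the light spot groups used for force measurement.
    return image[top:top + height, left:left + width]

# Hypothetical 480x640 RGB captured image
full = np.zeros((480, 640, 3), dtype=np.uint8)
region = crop_subject_region(full, top=100, left=200, height=200, width=300)
```

With these illustrative coordinates, the cropped region carries roughly a fifth of the pixels of the full frame, reducing the amount of data exchanged between the probe 1 and the information processing device 2.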
 The calculation unit 52 generates an ultrasonic image based on the ultrasonic measurement data supplied from the data acquisition unit 51 and supplies it to the communication unit 53 and the recording unit 54. The calculation unit 52 also supplies the captured image, the measurement results of the inertial sensor 23, the measurement result of the relative position of the probe 1, the measurement result of the external force acting on the probe 1, and the ultrasonic image to the presentation control unit 55. Note that the calculation unit 52 can also measure the relative position of the probe 1 with respect to the inspection target and the external force acting on the probe 1 based on the captured image.
 The communication unit 53 transmits at least two of the pieces of data supplied from the data acquisition unit 51 and the calculation unit 52 to an external device connected to the information processing device 2. The information processing device 2 and the external device are connected via a wired or wireless network.
 The ultrasonic image, the captured image, the measurement results of the inertial sensor 23, the measurement result of the relative position of the probe 1, and the measurement result of the external force acting on the probe 1 are encrypted before transmission so that they can be decrypted only by the external device connected to the information processing device 2. Each piece of data may be encrypted by the component that acquired it, or all of the data may be encrypted collectively by the communication unit 53.
 The recording unit 54 records the data supplied from the data acquisition unit 51 and the calculation unit 52 in association with each other.
 The presentation control unit 55 generates a screen and sound to be presented to the user based on the data supplied from the calculation unit 52, and controls the presentation unit 56 to present them.
 The presentation unit 56 is composed of a display, a speaker, and the like, and, under the control of the presentation control unit 55, presents to the user screens and sounds based on the ultrasonic image, the measurement result of the external force acting on the probe 1, and the like.
 Next, processing performed by the ultrasonic inspection device of the present technology will be described with reference to the flowchart in FIG. 11.
 In step S1, the data acquisition unit 51 of the information processing device 2 acquires various data, such as the ultrasonic measurement data and the captured image, from the probe 1.

 In step S2, the calculation unit 52 of the information processing device 2, for example, generates an ultrasonic image based on the ultrasonic measurement data.

 In step S3, the calculation unit 24 of the probe 1, for example, measures the relative position of the probe 1 with respect to the inspection target based on the captured image.

 In step S4, the calculation unit 24 of the probe 1, for example, measures the external force acting on the probe 1 based on the captured image.

 In step S5, the recording unit 54 of the information processing device 2 records the measurement results of the various kinds of information obtained by the probe 1 and the information processing device 2.
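Steps S1 to S5 above can be summarized as a single processing cycle. The sketch below uses placeholder computations; the image generation and the position and force measurements are stand-ins, not the actual algorithms.

```python
def run_inspection_cycle(probe_data):
    record = {}
    # S1: acquire measurement data and the captured image from the probe
    record["ultrasound_data"] = probe_data["ultrasound_data"]
    record["captured_image"] = probe_data["captured_image"]
    # S2: generate an ultrasound image from the measurement data
    # (placeholder transform standing in for real image formation)
    record["ultrasound_image"] = [2 * x for x in probe_data["ultrasound_data"]]
    # S3: measure the relative position of the probe (placeholder value)
    record["relative_position"] = (0.0, 0.0, 0.0)
    # S4: measure the external force acting on the probe (placeholder value)
    record["external_force"] = 0.0
    # S5: return all results associated with each other for recording
    return record

rec = run_inspection_cycle({"ultrasound_data": [1, 2, 3], "captured_image": "frame"})
```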
 As described above, in the ultrasonic inspection device of the present technology, the ultrasonic image, the captured image, the measurement result of the relative position of the probe 1, the measurement result of the external force acting on the probe 1, and the measurement results of the inertial sensor 23 are recorded in association with each other.
 Because other data is recorded together with the ultrasonic image, when the ultrasonic image is reviewed after the examination, checking, for example, the measurement result of the external force acting on the probe 1 makes it possible to confirm that the ultrasonic image is a reliable image acquired while the probe 1 was pressed against the inspection target with an appropriate force. The other data recorded together with the ultrasonic image can also be used to reproduce the conditions under which the image was acquired, or for machine learning related to ultrasonic images.
<3. Modifications>

・Example of presenting a guide on how to use the probe

 FIG. 12 is a diagram showing an example of a situation in which the ultrasonic inspection device is used.
 The probe 1 is used while connected to a tablet terminal 2A serving as the information processing device 2, as shown in A of FIG. 12, or while connected to a notebook PC 2B serving as the information processing device 2, as shown in B of FIG. 12.
 When a user U1 presses the probe 1 against a predetermined part of a person P1 to be inspected to perform an ultrasonic examination, a guide on how to use the probe 1 may be presented from the tablet terminal 2A or the notebook PC 2B. As the guide, for example, a voice message such as "Please press harder, a little more to the right" is presented, or a screen containing colors, symbols, figures, character strings, and the like corresponding to the guide content is presented.
 When the guide content is difficult for the user U1 to understand if presented by voice or text, such as "while pushing in firmly with your finger," a guide showing the motion, the degree of force, and the like is presented as a moving image or an animation.
 When the pressing force or the position of the probe 1 becomes appropriate, a sound or screen indicating this is presented. For example, a voice message such as "The current pressure is optimal" is presented, or a character string or figure indicating that the force or position of the probe 1 is appropriate is displayed in a color different from that of the other displayed items. An alert may be presented when the probe 1 is pressed too far into the inspection target.
 The content of the guide on how to use the probe 1 is determined based on the difference between a target value and a measured value of at least one of the relative position of the probe 1, the external force acting on the probe 1, the angular velocity of the probe 1, and the acceleration of the probe 1, and is intended to bring the measured value closer to the target value. The target value is input using input means provided on the probe 1 or on a device connected to the probe 1 (the information processing device 2 or an external device). The input means includes a microphone that collects voice, a switch on the probe 1, a touch panel of the information processing device 2, and the like.
 For example, the user can input a target value recommended by a medical professional, or a medical professional who checks, at a remote location, the various data acquired by the probe 1 and the information processing device 2 can input the target value. Because the user can use the probe 1 by following the guide, an ultrasonic inspection device with high operability even for users who are beginners in the medical field can be realized.
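Determining the guide content from the difference between a measured value and its target, as described above, can be sketched as follows. The tolerance threshold and the message wording are illustrative assumptions.

```python
def guide_message(measured_force, target_force, tolerance=0.5):
    # Compare the measured pressing force [N] against the target and
    # return a message nudging the user toward the target value.
    diff = target_force - measured_force
    if abs(diff) <= tolerance:
        return "The current pressure is optimal."
    if diff > 0:
        return "Please press a little harder."
    return "Please press a little more gently."

msg = guide_message(measured_force=3.0, target_force=5.0)
```

The same difference-to-guidance pattern would apply to the probe's position, angular velocity, or acceleration.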
・Example of measuring the elasticity of the inspection target

 The stiffness indicating the elasticity of the inspection target may be measured based on the acceleration of the probe 1 measured by the acceleration sensor and the measurement result of the external force acting on the probe 1. The stiffness of the inspection target is measured, for example, by the calculation unit 52 of the information processing device 2.
 FIG. 13 is a diagram for explaining a method of measuring the stiffness of the inspection target.
 FIG. 13 schematically shows a situation in which a support portion Su provided with force sensors FS is pressed against an inspection target Obj. In the example of FIG. 13, the force sensors FS are provided on the upper side and the lower side of the contact surface of the support portion Su with the inspection target Obj.
 The upper part of FIG. 13 shows the state before a force is applied to the support portion Su, and the lower part of FIG. 13 shows the state in which a force F is applied from the surface of the support portion Su opposite to its contact surface with the inspection target Obj.
 For simplicity, assume that, while the force F is applied, the acting force Fsr [N] measured by the upper force sensor FS is equal to the acting force Fsl [N] measured by the lower force sensor FS. In this case, where s0 is the initial thickness [mm] of the force sensor FS before the force F is applied and s1 is the thickness [mm] of the force sensor FS while the force F is applied, the stiffness Ks [N/mm] of the force sensor FS is given by equation (1) below.
 Ks = Fsr / (s0 - s1)   ...(1)
 Rearranging equation (1), the thickness s1 is given by equation (2): s1 = s0 - Fsr / Ks   ...(2)
 The movement amount sm [mm] of the force sensor FS caused by the application of the force F is obtained by twice integrating the translational acceleration az [mm/s2] of the support portion Su (the force sensor FS) in the pressing direction, as shown in equation (3) below.
 sm = ∬ az dt^2   ...(3)
 Where h0 is the initial thickness [mm] of the inspection target Obj before the force F is applied, h1 is the thickness [mm] of the inspection target Obj while the force F is applied, and hm is the movement amount [mm] of the inspection target Obj caused by the application of the force F, the sum of the movement amount hm, the initial thickness h0, and the initial thickness s0 is equal to the sum of the thickness h1, the thickness s1, and the movement amount sm, as shown in equation (4) below.
 hm + h0 + s0 = h1 + s1 + sm   ...(4)
 Rearranging equation (4), the change in the thickness of the inspection target Obj (h0 - h1) is given by equation (5): h0 - h1 = s1 + sm - s0 - hm   ...(5)
 The stiffness Kh [N/mm] of the inspection target Obj is given by equation (6) below.
 Kh = Fsr / (h0 - h1)   ...(6)
 Substituting the right-hand sides of equations (2) and (3) for the thickness s1 and the movement amount sm in equation (5), respectively, and then substituting the right-hand side of equation (5) for (h0 - h1) in equation (6) yields equation (7) below.
 Kh = Fsr / (∬ az dt^2 - Fsr / Ks - hm)   ...(7)
 In the ultrasonic inspection device of the present technology, the stiffness Kh, the movement amount hm, the initial thickness h0, the thickness h1, the movement amount sm, and the thickness s1 are unknown parameters, while the stiffness Ks, the acting force Fsr, the acting force Fsl, the initial thickness s0, and the translational acceleration az are known parameters. On the right-hand side of equation (7), only the movement amount hm is unknown. Accordingly, if the movement amount hm is sufficiently small, the stiffness Kh can be measured. Note that, when the acting forces Fsr and Fsl are greater than 0, the double integral of the translational acceleration az is greater than the movement amount hm.
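The stiffness computation described above can be sketched numerically, assuming the standard force-over-compression form Kh = F / (h0 - h1) with the object compression expressed through known quantities as h0 - h1 = sm - F / Ks - hm. The numerical values are illustrative, and hm is taken as negligibly small.

```python
def object_stiffness(F, Ks, sm, hm=0.0):
    # Kh = F / (h0 - h1), where the object compression is written in
    # terms of known parameters: h0 - h1 = sm - F / Ks - hm.
    compression = sm - F / Ks - hm
    return F / compression

# Illustrative values: pressing force F = 2 N, sensor stiffness
# Ks = 10 N/mm, doubly integrated sensor travel sm = 1.2 mm, hm ~ 0
Kh = object_stiffness(F=2.0, Ks=10.0, sm=1.2)
```

With these numbers the sensor compresses by 0.2 mm, the object by 1.0 mm, and the resulting stiffness Kh is 2.0 N/mm.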
 Because the person to be inspected is considered to hardly move even when the probe 1 is pressed against the person, the stiffness of the inspection target can be measured using the acceleration of the probe 1 measured by the acceleration sensor. Because the elasticity of the inspection target indicates, for example, the hardness of an affected area, an abnormality in the affected area can also be found by an ultrasonic examination using the ultrasonic inspection device of the present technology.
・Example of providing a lid covering the camera

 A lid covering the camera 22 may be attached to the probe 1 so as to be movable.
 FIG. 14 is a diagram showing an example of the appearance of the probe 1 to which the lid is attached.
 As shown in FIG. 14, the probe 1 is provided with a lid 301 that is slidable or detachable. As shown on the left side of FIG. 14, when the lid 301 is opened, an opening H is exposed and the camera 22 can capture the external space of the probe 1. On the other hand, as shown on the right side of FIG. 14, when the lid 301 is closed, the opening H is blocked and the camera 22 cannot capture the external space of the probe 1.
 For example, when the person to be inspected (a patient) does not want to be photographed, the ultrasonic examination can be performed with the lid 301 closed, so that privacy can be protected in accordance with the wishes of the person to be inspected.
・Example of using an image sensor capable of detecting infrared light

 The imaging unit 140 provided in the camera 22 may be composed of an image sensor capable of detecting visible light and infrared light. In this case, based on the image captured by the camera 22, it is possible to measure the appearance and shape of the subject, measure the external force acting on the probe 1, and also estimate the moisture content of the subject.
 By estimating the moisture content of the subject, it is possible to check the moisture content of the inspection target (the human body) itself, or to check whether gel for ultrasonic examination has been effectively applied to the inspection target.
・Example of remote medical care

 FIG. 15 is a diagram for explaining remote medical care using the ultrasonic inspection device of the present technology.
 As shown on the left side of FIG. 15, assume that the user U1 is performing an ultrasonic examination of the person P1 to be inspected at home or the like. The notebook PC 2B connected to the probe 1 is connected, as indicated by the broken line, to a tablet terminal 302 used by a medical professional D1 at a remote location via, for example, a wireless network.
 The notebook PC 2B transmits two or more pieces of data, such as the ultrasonic image, the captured image, the measurement result of the external force acting on the probe 1, and the measurement results of the inertial sensor 23, to the tablet terminal 302.
 The tablet terminal 302 displays a screen corresponding to the data transmitted from the notebook PC 2B. While viewing the screen displayed on the tablet terminal 302, the medical professional D1 inputs advice on how to use the probe 1 by operating the tablet terminal 302 or by voice. The tablet terminal 302 transmits information indicating the advice input by the medical professional D1 to the notebook PC 2B.
 ノートPC2Bは、タブレット端末302から送信されてきたアドバイスを示す情報を受信し、アドバイスをユーザU1に提示する。例えば、アドバイスの内容を示す文字がディスプレイに表示されたり、医療従事者D1の音声がスピーカから出力されたりする。なお、例えば超音波画像が適切に撮影されていると判断した場合、医療従事者D1は、超音波画像を記録する指示を、タブレット端末302を操作するなどして入力することも可能である。超音波画像を記録する指示を示す情報がタブレット端末302から送信されてきた場合、ノートPC2Bは、例えば、超音波画像、撮影画像、プローブ1に作用する外力の計測結果、慣性センサ23の計測結果などを対応付けて記録する。 The notebook PC 2B receives the information indicating the advice sent from the tablet terminal 302 and presents the advice to the user U1. For example, text indicating the content of the advice may be displayed on the display, and the voice of the medical worker D1 may be output from a speaker. Note that, for example, if the medical worker D1 determines that the ultrasound image has been properly captured, he or she may input an instruction to record the ultrasound image by operating the tablet terminal 302. When information indicating an instruction to record an ultrasound image is sent from the tablet terminal 302, the notebook PC 2B records, for example, the ultrasound image, the captured image, the measurement results of the external force acting on the probe 1, the measurement results of the inertial sensor 23, etc. in association with each other.
 以上のように、遠隔地にいる医療従事者D1が超音波検査装置の計測結果を確認したり、プローブ1の使用方法についてのアドバイスを医療従事者D1からユーザU1が受けたりするといった遠隔診療を実現することが可能となる。医療従事者D1は、超音波画像だけではなく、プローブ1に作用する外力の計測結果なども確認することができ、容易にアドバイスをすることが可能となる。 As described above, it is possible to realize remote medical treatment, in which a medical professional D1 in a remote location can check the measurement results of the ultrasound examination device, and the user U1 can receive advice from the medical professional D1 on how to use the probe 1. The medical professional D1 can check not only the ultrasound images, but also the measurement results of the external forces acting on the probe 1, and can easily provide advice.
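The data exchange between the notebook PC 2B and the tablet terminal 302 can be sketched as a message that bundles the synchronized measurements. This is only an illustrative sketch; the field names and the JSON encoding are assumptions, not the device's actual protocol:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ProbeSample:
    """One synchronized measurement frame sent to the remote tablet
    (all field names are illustrative)."""
    timestamp: float
    ultrasound_frame_id: int  # refers to an ultrasound image sent separately
    camera_frame_id: int      # refers to a captured camera image
    external_force_n: list    # (fx, fy, fz) acting on the probe, in newtons
    imu_orientation: list     # quaternion (w, x, y, z) from inertial sensor 23

def encode_sample(sample):
    """Serialize one frame for transmission over the network."""
    return json.dumps(asdict(sample)).encode("utf-8")

def decode_sample(payload):
    """Restore a frame on the receiving (tablet) side."""
    return ProbeSample(**json.loads(payload.decode("utf-8")))
```

Because the frames carry timestamps and frame IDs, the tablet side can display the ultrasound image, captured image, force, and orientation in synchronization, as described above.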
・Example of stopping operation depending on the position of the probe relative to the human body
The calculation unit 52 of the information processing device 2 can also recognize a subject appearing in a captured image by performing image recognition on the image captured by the camera 22. When image recognition based on the captured image is performed, the information processing device 2 can determine the distance from the probe 1 to a predetermined part of the person to be examined, based on the result of the image recognition and the measurement result of the relative position of the probe 1.
When it is determined that the ultrasonic sensor 21 of the probe 1 has come closer than a threshold to a predetermined part such as the head or the eyes, the operation of the ultrasonic sensor 21 may be stopped.
FIG. 16 is a diagram for explaining the case in which the ultrasonic sensor 21 operates normally and the case in which the operation of the ultrasonic sensor 21 is stopped.
As shown in A of FIG. 16, when the probe 1 is pressed against the abdomen of the person P1 to perform an ultrasound examination, the ultrasonic sensor 21 operates normally.
For example, if ultrasound examination is not suitable for examining the eyes, the operation of the ultrasonic sensor 21 is stopped when the ultrasonic sensor 21 of the probe 1 is brought close to an eye of the person P1, as shown in B of FIG. 16. A child may mistakenly bring the probe 1 close to an eye, or an adult may bring the probe 1 close to an eye without reading the instructions. By stopping the operation of the ultrasonic sensor 21 when it comes close to the head or the eyes, the safety of the head and the eyes, which are parts of the human body where important organs are concentrated, is ensured.
If the operation of the ultrasonic sensor 21 is stopped during an ultrasound examination of a part close to the head, such as the neck, the operation of the ultrasonic sensor 21 is resumed, for example, when the user responds to a warning message displayed on the display of the information processing device 2.
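The stop-and-resume behavior described above can be sketched as a small interlock that combines the image-recognition output with the measured relative position. This is an illustrative sketch only; the class, the 10 cm threshold, and the part names are assumptions:

```python
import math

class SafetyInterlock:
    """Stops the ultrasonic sensor near protected parts (head, eyes) and
    resumes only after the user acknowledges the on-screen warning."""

    def __init__(self, threshold_m=0.10, protected=("head", "eye")):
        self.threshold_m = threshold_m
        self.protected = set(protected)
        self.stopped = False

    def update(self, probe_position, detected_parts):
        """probe_position: (x, y, z) of the probe from the relative-position
        measurement; detected_parts: {part_name: (x, y, z)} from image
        recognition. Returns True while the sensor may keep operating."""
        too_close = any(
            math.dist(probe_position, pos) < self.threshold_m
            for part, pos in detected_parts.items()
            if part in self.protected)
        if too_close:
            self.stopped = True  # stop ultrasonic transmission immediately
        return not self.stopped

    def acknowledge_warning(self):
        """The user responded to the warning (e.g., a neck examination)."""
        self.stopped = False
```

Note that once stopped, the sketch stays stopped until the warning is acknowledged, matching the resume behavior described above.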
・Regarding the arrangement position and the type of the camera
A plurality of cameras 22 may be configured integrally with the probe 1. For example, two cameras 22 are provided on the probe 1.
FIG. 17 is a diagram showing an example of the arrangement of two cameras 22. In the following, the surface of the probe 1 on which the ultrasonic sensor 21 is provided is referred to as the front surface.
A of FIG. 17 shows an example in which two cameras 22 (openings H) are provided on the front surface of the probe 1 (the surface of the tip portion 11 that contacts the examination target). The upper part of A of FIG. 17 shows a schematic diagram of the probe 1 viewed from above, and the lower part shows a schematic diagram of the probe 1 viewed from the side. As shown in the upper part of A of FIG. 17, for example, the two cameras 22 are provided at the left and right ends inside the tip portion 11, and the ultrasonic sensor 21 is provided between the two cameras 22. Note that the two cameras 22 (openings H) may instead be provided on the side surfaces of the tip portion 11.
B of FIG. 17 shows an example in which two cameras 22 are provided on the upper and lower surfaces of the support portion 12 of the probe 1 (hereinafter, when cameras 22 are provided on the support portion 12, one of the surfaces on which a camera 22 is provided is referred to as the upper surface). The upper part of B of FIG. 17 shows a schematic diagram of the probe 1 viewed from above, and the lower part shows a schematic diagram of the probe 1 viewed from the side. As shown in the lower part of B of FIG. 17, for example, the two cameras 22 are configured integrally with the probe 1 but are provided outside the support portion 12.
C of FIG. 17 shows another example in which two cameras 22 are provided on the upper and lower surfaces of the support portion 12 of the probe 1. The upper part of C of FIG. 17 shows a schematic diagram of the probe 1 viewed from above, and the lower part shows a schematic diagram of the probe 1 viewed from the side. As shown in the lower part of C of FIG. 17, for example, the two cameras 22 are provided inside the support portion 12.
When the cameras 22 are provided on the front surface of the probe 1, a good field of view of the examination target can be ensured for the cameras 22. In that case, however, the ultrasonic sensor 21 is provided between the two cameras 22, so the tip portion 11 may become wider.
FIG. 18 is a diagram showing an example of the arrangement of one camera 22.
A of FIG. 18 shows an example in which one camera 22 (opening H) is provided on the front surface of the probe 1 (tip portion 11). The upper part of A of FIG. 18 shows a schematic diagram of the probe 1 viewed from above, and the lower part shows a schematic diagram of the probe 1 viewed from the side. As shown in the upper part of A of FIG. 18, for example, the camera 22 is provided at either the left or the right end inside the tip portion 11. Note that the camera 22 (opening H) may instead be provided on a side surface of the tip portion 11.
B of FIG. 18 shows an example in which one camera 22 is provided on the upper surface of the support portion 12 of the probe 1. The upper part of B of FIG. 18 shows a schematic diagram of the probe 1 viewed from above, and the lower part shows a schematic diagram of the probe 1 viewed from the side. As shown in the lower part of B of FIG. 18, for example, the camera 22 is configured integrally with the probe 1 but is provided outside the support portion 12.
C of FIG. 18 shows another example in which one camera 22 is provided on the upper surface of the support portion 12 of the probe 1. The upper part of C of FIG. 18 shows a schematic diagram of the probe 1 viewed from above, and the lower part shows a schematic diagram of the probe 1 viewed from the side. As shown in the lower part of C of FIG. 18, for example, the camera 22 is provided inside the support portion 12.
When cameras 22 are provided on both the upper and lower surfaces of the probe 1, the field of view of at least one of the cameras 22 is considered not to be covered by the user's hand when the user grips the support portion 12. Therefore, in terms of always securing the field of view of a camera 22, providing cameras 22 on both the upper and lower surfaces of the probe 1 is preferable to providing a camera 22 on only one of the two surfaces.
Instead of a camera having a normal lens, a fisheye camera or an omnidirectional camera may be configured integrally with the probe 1 as the camera 22. By shooting with a fisheye camera or an omnidirectional camera, a wide range of the space outside the probe 1 can be captured, which makes it possible to improve the stability of the Visual SLAM technique that takes the captured images as input.
FIG. 19 is a diagram showing an example of the arrangement of two fisheye cameras 22A.
A of FIG. 19 shows an example in which two fisheye cameras 22A are provided on the front surface of the probe 1 (tip portion 11). The upper part of A of FIG. 19 shows a schematic diagram of the probe 1 viewed from above, and the lower part shows a schematic diagram of the probe 1 viewed from the side. As shown in the upper part of A of FIG. 19, for example, the two fisheye cameras 22A are provided at the left and right ends inside the tip portion 11, and the ultrasonic sensor 21 is provided between the two fisheye cameras 22A.
B of FIG. 19 shows an example in which two fisheye cameras 22A are provided on the upper and lower surfaces of the support portion 12 of the probe 1. The upper part of B of FIG. 19 shows a schematic diagram of the probe 1 viewed from above, and the lower part shows a schematic diagram of the probe 1 viewed from the side. As shown in the lower part of B of FIG. 19, for example, the two fisheye cameras 22A are configured integrally with the probe 1, but part of the body of each fisheye camera 22A is exposed outside the support portion 12.
C of FIG. 19 shows another example in which two fisheye cameras 22A are provided on the upper and lower surfaces of the support portion 12 of the probe 1. The upper part of C of FIG. 19 shows a schematic diagram of the probe 1 viewed from above, and the lower part shows a schematic diagram of the probe 1 viewed from the side. As shown in the lower part of C of FIG. 19, for example, the two fisheye cameras 22A are provided inside the support portion 12.
FIG. 20 is a diagram showing an example of the arrangement of one fisheye camera 22A.
A of FIG. 20 shows an example in which one fisheye camera 22A is provided on the front surface of the probe 1 (tip portion 11). The upper part of A of FIG. 20 shows a schematic diagram of the probe 1 viewed from above, and the lower part shows a schematic diagram of the probe 1 viewed from the side. As shown in the upper part of A of FIG. 20, for example, the fisheye camera 22A is provided at either the left or the right end inside the tip portion 11.
B of FIG. 20 shows an example in which one fisheye camera 22A is provided on the upper surface of the support portion 12 of the probe 1. The upper part of B of FIG. 20 shows a schematic diagram of the probe 1 viewed from above, and the lower part shows a schematic diagram of the probe 1 viewed from the side. As shown in the lower part of B of FIG. 20, for example, the fisheye camera 22A is configured integrally with the probe 1, but part of its body is exposed outside the support portion 12.
C of FIG. 20 shows another example in which one fisheye camera 22A is provided on the upper surface of the support portion 12 of the probe 1. The upper part of C of FIG. 20 shows a schematic diagram of the probe 1 viewed from above, and the lower part shows a schematic diagram of the probe 1 viewed from the side. As shown in the lower part of C of FIG. 20, for example, the fisheye camera 22A is provided inside the support portion 12.
The above describes examples in which a plurality of cameras 22 of the same type are configured integrally with the probe 1. A plurality of types of cameras may also be configured integrally with the probe 1 as the cameras 22. For example, a fisheye camera and a camera having a normal lens are provided on the probe 1.
FIG. 21 is a diagram showing an example of the arrangement of a plurality of types of cameras.
A of FIG. 21 shows an example in which a fisheye camera 22A and a camera 22B having a normal lens are provided on the front surface of the probe 1 (tip portion 11). The upper part of A of FIG. 21 shows a schematic diagram of the probe 1 viewed from above, and the lower part shows a schematic diagram of the probe 1 viewed from the side. As shown in the upper part of A of FIG. 21, for example, the fisheye camera 22A is provided at one of the left and right ends inside the tip portion 11, and the camera 22B is provided at the other end. The ultrasonic sensor 21 is provided between the fisheye camera 22A and the camera 22B.
B of FIG. 21 shows an example in which the fisheye camera 22A is provided on the upper surface of the support portion 12 of the probe 1 and the camera 22B is provided on the lower surface of the support portion 12. The upper part of B of FIG. 21 shows a schematic diagram of the probe 1 viewed from above, and the lower part shows a schematic diagram of the probe 1 viewed from the side. As shown in the lower part of B of FIG. 21, for example, the fisheye camera 22A is configured integrally with the probe 1, but part of its body is exposed outside the support portion 12. The camera 22B is also configured integrally with the probe 1, but is provided outside the support portion 12. Note that the fisheye camera 22A may be provided outside the support portion 12 like the camera 22B, and part of the body of the camera 22B may be exposed outside the support portion 12 like the fisheye camera 22A.
C of FIG. 21 shows another example in which the fisheye camera 22A is provided on the upper surface of the support portion 12 of the probe 1 and the camera 22B is provided on the lower surface of the support portion 12. The upper part of C of FIG. 21 shows a schematic diagram of the probe 1 viewed from above, and the lower part shows a schematic diagram of the probe 1 viewed from the side. As shown in the lower part of C of FIG. 21, for example, the fisheye camera 22A and the camera 22B are provided inside the support portion 12.
・Example of displaying positions where ultrasound examinations were previously performed
The positions on the human body where ultrasound examinations were previously performed and the positions on the human body where the current ultrasound examination is performed may be displayed on the display of the information processing device 2.
FIG. 22 is a diagram showing an example of a screen for presenting the positions on the human body where ultrasound examinations were performed.
As shown in FIG. 22, the positions where ultrasound examinations were performed in the past and the positions where the current ultrasound examination is performed are displayed, for example, superimposed on an image that schematically represents the human body.
In FIG. 22, the dotted rounded rectangles indicate parts where an ultrasound examination was performed in the past and an ultrasound image was recorded. In practice, the dotted rounded rectangles are displayed, for example, as rounded rectangles with green lines. The darkness of the line color is changed according to the external force that was acting on the probe 1 when that part was examined. For example, a darker line color indicates that a stronger external force was applied. The same applies to the darkness of the line color of the rounded rectangles with colors other than green described below.
The dashed rounded rectangles indicate parts where an ultrasound examination was performed in the past while the person was exhaling and an ultrasound image was recorded. In practice, the dashed rounded rectangles are displayed, for example, as rounded rectangles with yellow lines. The dash-dotted rounded rectangles indicate parts where an ultrasound examination was performed in the past while the person was inhaling and an ultrasound image was recorded. In practice, the dash-dotted rounded rectangles are displayed, for example, as rounded rectangles with purple lines.
The solid rounded rectangles indicate parts where the current ultrasound examination was performed and an ultrasound image was recorded. In practice, the solid rounded rectangles are displayed, for example, as rounded rectangles with red lines. The white or gray circles indicate parts where the current ultrasound examination was performed and the ultrasound image was displayed on the display. In practice, the circles are displayed, for example, as circles filled with blue. The darkness of the color inside a circle is changed according to the external force that was acting on the probe 1 when that part was examined. For example, a darker color inside a circle indicates that a stronger external force was applied.
The positions on the human body where ultrasound examinations were performed are estimated based on the result of image recognition based on the images captured by the camera 22 and the measurement result of the relative position of the probe 1. By looking at the screen of FIG. 22, the user can compare the positions examined in the past with the positions examined this time and determine whether any part has been left unexamined.
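The legend described above can be sketched as a simple mapping from an examination record to a marker style. The record fields, the 20 N full-scale force, and the alpha encoding of force are illustrative assumptions:

```python
def marker_style(record, max_force_n=20.0):
    """Map an examination record to the marker drawn on the body diagram.
    `record` has 'kind' (when/how the examination was done) and 'force_n'
    (the external force acting on probe 1 at the time)."""
    line_colors = {
        ("past", "recorded"): "green",
        ("past", "exhaled"): "yellow",
        ("past", "inhaled"): "purple",
        ("current", "recorded"): "red",
    }
    if record["kind"] == ("current", "displayed"):
        shape, color = "circle", "blue"  # filled blue circle
    else:
        shape, color = "rounded_rect", line_colors[record["kind"]]
    # stronger external force -> darker (more opaque) marker
    alpha = min(255, int(record["force_n"] / max_force_n * 255))
    return {"shape": shape, "color": color, "alpha": alpha}
```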
・Example of recording an ultrasound image when the examination situation becomes substantially the same as a past examination situation
When an ultrasound examination is performed this time on substantially the same part as a part examined in the past, an ultrasound image may be recorded in the recording unit 54 of the information processing device 2 when the past examination situation and the current examination situation become substantially the same. For example, an ultrasound image is recorded when the posture of the person to be examined becomes substantially the same as it was during the past examination.
The examination situation includes, for example, the posture of the person to be examined, the breathing state of the person to be examined, the position on the human body against which the probe 1 is pressed, and the external force acting on the probe 1.
Whether the posture of the person to be examined is substantially the same in the past and the present is determined, for example, using the Visual SLAM technique and image recognition.
The breathing state of the person to be examined indicates whether the person is inhaling, exhaling, or breathing normally. Whether the breathing state of the person to be examined is substantially the same in the past and the present is determined, for example, using the Visual SLAM technique and sound collected by a microphone (not shown) provided in the information processing device 2.
Whether the position on the human body against which the probe 1 is pressed is substantially the same in the past and the present is determined, for example, using the Visual SLAM technique and the similarity of the ultrasound images. Whether the external force acting on the probe 1 is substantially the same in the past and the present is determined, for example, based on the group of light points appearing in the images captured by the camera 22.
As described above, the user can easily obtain an ultrasound image captured in substantially the same examination situation as a past examination situation.
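The "substantially the same" judgment can be sketched as a comparison of the four elements of the examination situation within tolerances. The field names and tolerance values below are illustrative assumptions, not the device's actual criteria:

```python
import math

def same_situation(past, current, pos_tol_m=0.02, force_tol_n=1.0):
    """True when the current examination situation substantially matches a
    past one: same posture label, same breathing state, probe position on
    the body within pos_tol_m, and external force within force_tol_n."""
    return (past["posture"] == current["posture"]
            and past["breathing"] == current["breathing"]
            and math.dist(past["body_position"],
                          current["body_position"]) <= pos_tol_m
            and abs(past["force_n"] - current["force_n"]) <= force_tol_n)

def maybe_record(past, current, ultrasound_image, recorded):
    """Append the image to the record store only when situations match."""
    if same_situation(past, current):
        recorded.append(ultrasound_image)
        return True
    return False
```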
・Example of managing information on the examination target
The data acquisition unit 51 of the information processing device 2 can also acquire patient information on the patient to be examined by capturing, with the camera 22, a document or screen containing a barcode, a two-dimensional code, another special code, a character string, or the like, and analyzing the barcode or other code appearing in the captured image. Note that the face of the patient to be examined may be captured by the camera 22 and recognized by image recognition based on the captured image, so that patient information corresponding to the recognized face is acquired.
If the patient information can be acquired, the information processing device 2 can register the ultrasound images, the measurement results of the external force acting on the probe 1, and the like as examination data in the medical chart linked to the patient information. Since the user does not need to manually register the ultrasound images and the like in the chart, the user's effort and mistakes can be reduced.
Note that the information processing device 2 can also acquire examination data from past ultrasound examinations together with the patient information, based on the result of reading a barcode or the like or the result of recognizing the patient's face.
The information processing device 2 can also acquire settings individual to the user, or register examination data linked to the user, based on the result of reading a barcode or the like or the result of recognizing the user's face. For example, if the user has color vision deficiency, colors that are easy for the user to distinguish are displayed based on the user's individual settings. For example, the display of various kinds of information serving as a reference for how to use the probe 1 is turned on or off according to the user's level of proficiency in the medical field.
Since the user does not need to change the settings or register ultrasound images and the like every time an ultrasound examination is performed, the user's effort and mistakes can be reduced.
Information on the examination target or the user, including the patient information, information indicating the registration destination of the examination data, and information indicating the user's individual settings, is acquired, for example, by the data acquisition unit 51 of the information processing device 2.
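The flow from a decoded code or recognized face to the chart can be sketched as below. The registry is a minimal in-memory stand-in for illustration only; actual barcode or face decoding would use dedicated libraries, and all names here are assumptions:

```python
class ChartRegistry:
    """Minimal in-memory stand-in for medical charts keyed by patient ID
    (the ID would come from a decoded barcode/2D code or face recognition)."""

    def __init__(self):
        self.charts = {}    # patient_id -> list of examination-data dicts
        self.settings = {}  # user_id -> per-user display settings

    def register_exam(self, patient_id, exam_data):
        """Attach ultrasound image, force measurements, etc. to a chart."""
        self.charts.setdefault(patient_id, []).append(exam_data)

    def history(self, patient_id):
        """Past examination data for the patient, for comparison."""
        return self.charts.get(patient_id, [])

    def user_settings(self, user_id, default=None):
        """Per-user settings (e.g., a color-vision-friendly palette)."""
        return self.settings.get(user_id, default or {})
```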
・Screen displayed on the information processing device
FIG. 23 is a diagram showing an example of an ultrasound examination screen displayed on the display of the information processing device 2 during an ultrasound examination.
As shown in FIG. 23, the left part of the ultrasound examination screen displays, for example, an app (application) area A101 in which the user inputs operations and checks the results of measurements by the various sensors provided on the probe 1. Below the app area A101, a graph G101 showing the time-series change of the external force acting on the probe 1 is displayed.
In the upper right of the app area A101, a probe field-of-view image area A102 for displaying, for example, the image captured by the camera 22 provided on the probe 1 is displayed. The captured image P101 of the camera 22 is displayed in the upper part of the probe field-of-view image area A102, and buttons B101 to B104 are displayed in the lower part. A force meter indicating the magnitude and direction of the external force acting on the probe 1 is superimposed on the captured image P101.
When the button B101 is pressed, for example, the captured image P101 is displayed flipped left-right, and when the button B102 is pressed, for example, the captured image P101 is displayed flipped up-down. When the button B103 is pressed, for example, the force meter is displayed flipped left-right, and when the button B104 is pressed, for example, the force meter is displayed flipped up-down.
For example, when the user points the probe 1 at himself or herself while looking at the display of the information processing device 2 in front of the user, the direction in which the user is looking differs by 180 degrees from the shooting direction of the camera 22. Depending on how the probe 1 is held, the up, down, left, and right directions for the user may also differ from the up, down, left, and right directions in the image captured by the camera 22.
By flipping the captured image left-right or up-down using a GUI (Graphical User Interface) such as the buttons B101 to B104, it is possible to prevent the user from becoming confused, and work efficiency from decreasing, because the up, down, left, and right directions for the user differ from those in the image captured by the camera 22.
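The flip operations bound to these buttons can be sketched as follows. This is a plain Python sketch over a row-major pixel grid; a real implementation would flip the camera frame or GPU texture directly:

```python
def flip_image(pixels, horizontal=False, vertical=False):
    """Flip a row-major image left-right (button B101) and/or up-down
    (button B102); the same operations apply to the force meter overlay
    (buttons B103 and B104)."""
    rows = [list(reversed(row)) if horizontal else list(row)
            for row in pixels]
    return list(reversed(rows)) if vertical else rows
```

Applying both flips is equivalent to a 180-degree rotation, which compensates for the user facing the camera 22 while looking at the display.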
 It is desirable that inversion of the captured image be indicated visually, for example by turning the selected button green. If the GUI for flipping the image is a pull-down menu, it is desirable that only the selected item be displayed.
 To the right of the probe field-of-view image area A102, an overhead image P102 showing, for example, the entire person to be examined is displayed. The overhead image P102 is captured, for example, by a camera provided in the information processing device 2 or by an external camera.
 In the lower right of the app area A101, an ultrasound image area A103 for displaying an ultrasound image is shown. On the ultrasound examination screen, the various numerical values, the graph G101, the captured image P101, the overhead image P102, the ultrasound image, and so on are all displayed in synchronization.
 FIG. 24 is a diagram explaining the details of the app area A101. FIG. 24 shows a display example of the upper part of the app area A101, excluding the graph G101.
 As shown in FIG. 24, a button B111 for setting target values for the external force acting on the probe 1, the measured relative position of the probe 1, and the like is displayed at the top center of the app area A101.
 In the upper right of the app area A101, a button B112 is displayed for saving logs of the captured images, the measurement results of the external force acting on the probe, and so on. Below button B112, a button B113 is displayed for toggling whether logs of the external-force measurements and the inertial-sensor measurements are saved continuously.
 In the area A111 enclosed by a dashed line in FIG. 24, various numerical values measured by the sensors provided on the probe 1, such as the measured external force acting on the probe 1, are displayed.
 Below the area A111, a check box C101 for reading a two-dimensional code and a check box C102 for reading a character string are displayed.
 Below the check box C102, in the area A112 enclosed by a dashed line, check boxes for selecting the items displayed in the graph G101 are shown.
 FIG. 25 is a diagram showing an example of the graph G101, which shows the time-series change in the external force acting on the probe 1. In FIG. 25, the horizontal axis indicates time and the vertical axis indicates the magnitude of the force.
 The example in FIG. 25 shows the time-series changes in the x-axis force (moment) Flx, the y-axis force (moment) Fly, and the z-axis force Flz acting on the left side of the probe 1, as well as the x-axis force (moment) Frx, the y-axis force (moment) Fry, and the z-axis force Frz acting on the right side of the probe 1.
 The force range displayed in the graph G101 may be specified by any of the following:
- The range is adjusted dynamically according to the magnitude of the force currently acting.
- The maximum and minimum values are fixed according to the magnitude of force that can be handled.
- The user inputs the maximum and minimum values.
- The range is adjusted automatically to the range expected for the body part targeted by the ultrasound examination.
- The range is adjusted automatically to the range expected from past measurement results.
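A few of the range policies listed above can be sketched as follows. The function name, mode strings, and default values are illustrative assumptions; the body-part and history-based policies are omitted because they depend on external data.

```python
def graph_range(samples, mode, fixed=(-50.0, 50.0), user=None, margin=0.1):
    """Return (ymin, ymax) for the force graph G101 under the given policy.

    mode "dynamic": follow the current samples with a small margin.
    mode "fixed"  : clamp to the force range the device can handle.
    mode "user"   : use maximum and minimum values entered by the user.
    """
    if mode == "dynamic":
        lo, hi = min(samples), max(samples)
        pad = (hi - lo) * margin or 1.0  # avoid a zero-height range
        return lo - pad, hi + pad
    if mode == "fixed":
        return fixed
    if mode == "user" and user is not None:
        return user
    raise ValueError(f"unsupported mode: {mode}")
```

A dynamic range keeps small forces legible, while a fixed range makes successive examinations directly comparable; the choice is a UI trade-off rather than a measurement one.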
 FIG. 26 is a diagram showing display examples of the force meter.
 In the example of A of FIG. 26, two force meters 351 indicate the external forces acting on the left and right sides of the probe 1, respectively. The two force meters 351 are displayed superimposed on, for example, the left and right regions of the captured image P101 in which the light point group appears. Since the user does not need to see the light point group, the regions of the captured image in which it appears are actually masked. Hereinafter, the region of the captured image that is not masked and shows only the space outside the probe 1 is referred to as the field-of-view portion of the captured image.
 The force meter 351 is composed of a circle and an arrow extending from the center of the circle. In the force meter 351, the size of the circle indicates the magnitude of the force pressing in either the left or right side of the probe 1 (Flz, Frz). The length of the arrow indicates the magnitude of the force acting in the up, down, left, and right directions (Flx, Fly, Frx, Fry), and the direction of the arrow indicates the direction of that force.
 If the measured external force acting on the probe 1 were displayed only as numbers, it would be hard for the user to interpret. By displaying it with a GUI such as the force meter, the user can grasp the external force intuitively.
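The mapping from a measured force to the meter geometry described above might look like the following sketch; the gain constant and function name are illustrative assumptions, not part of the original.

```python
import math

def force_meter(fx: float, fy: float, fz: float, gain: float = 4.0):
    """Map one side's force measurement to force-meter geometry.

    The circle radius encodes the pressing force |Fz|; the arrow encodes
    the in-plane force (Fx, Fy) as a length and a direction in radians.
    """
    radius = gain * abs(fz)                       # circle: pressing force
    length = gain * math.hypot(fx, fy)            # arrow: in-plane magnitude
    angle = math.atan2(fy, fx) if (fx or fy) else 0.0  # arrow: direction
    return radius, length, angle

# Example: left side pressed with Flz = 2 while sliding with (Flx, Fly) = (3, 4).
radius, length, angle = force_meter(3.0, 4.0, 2.0)
```

Encoding |Fz| as area-like size and the tangential force as a vector keeps the two qualitatively different quantities visually separate.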
 In the example of B of FIG. 26, a single force meter 352 indicates the resultant of the external forces acting on the left and right sides of the probe 1.
 The force meter 352 is also composed of a circle and an arrow extending from the center of the circle. In the force meter 352, the size of the circle indicates the magnitude of the resultant of the forces pressing in the left and right sides of the probe 1. The length of the arrow indicates the magnitude of the resultant of the forces acting in the up, down, left, and right directions on the left and right sides of the probe 1, and the direction of the arrow indicates the direction of that resultant.
 The force meter 352 is displayed with its horizontal position shifted toward whichever of the left and right sides presses the probe 1 harder, and with its vertical position shifted toward the part where the tip portion 11 contacts the examination target. The vertical position of the force meter 352 may instead be fixed at the center of the captured image.
 In the example of A of FIG. 26, the force meters 351 are displayed on the left and right of the captured image, so the user has to move their gaze left and right. In the example of B of FIG. 26, the user can check the external force acting on the probe 1 by looking at a single location (near the force meter 352).
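The resultant-force meter 352 described above can be sketched as follows: the left and right force vectors are summed, and the meter's horizontal offset is biased toward the harder-pressed side. The half-width constant and all names are illustrative assumptions.

```python
def resultant_meter(left, right, half_width=160.0):
    """Combine left/right side forces into one meter and bias its x position.

    left and right are (fx, fy, fz) tuples. The returned offset shifts the
    meter toward the side pressing the probe harder, as described for
    force meter 352; 0.0 means centered.
    """
    fx = left[0] + right[0]
    fy = left[1] + right[1]
    fz = left[2] + right[2]
    total = abs(left[2]) + abs(right[2])
    # bias in -1.0 (fully left) .. +1.0 (fully right); 0 when unloaded
    bias = (abs(right[2]) - abs(left[2])) / total if total else 0.0
    return (fx, fy, fz), bias * half_width

force, x_offset = resultant_meter((1.0, 0.0, 2.0), (0.0, 1.0, 6.0))
```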
 When the body part to be examined is known, a guide for moving the probe 1 to that part may be displayed superimposed on the image captured by the camera 22. The direction and distance for moving the probe 1 to the target part are obtained from an overhead image of the body captured by the camera 22 when the user holds the probe 1 away from the person's body, and from the relative position of the probe 1 with respect to the examination target measured using Visual SLAM technology.
 FIG. 27 is a diagram showing display examples of the guide for moving the probe 1 to the part to be examined. In the description of FIG. 27, the examination target is assumed to be, for example, a part of a person's arm.
 When the probe 1 is far from that part of the arm, an arrow A121 serving as the guide is displayed superimposed on the captured image P101 of the camera 22, as shown in A of FIG. 27. The arrow A121 is superimposed, for example, at the center of the captured image P101. The arrow A121 may be emphasized, for example by being displayed large on the captured image P101 or by blinking.
 When the probe 1 approaches that part of the arm, a rectangular frame R101 is displayed superimposed on the captured image P101 as the guide, as shown in B of FIG. 27. The frame R101 surrounds the portion expected to appear in the field-of-view portion of the captured image P101 when the probe 1 contacts the examination target.
 As shown in C of FIG. 27, the user can bring the probe 1 into contact with the target part of the arm by moving the probe 1 so that the frame R101 overlaps the field-of-view portion of the captured image P101. The frame R101 may also be emphasized, for example by blinking.
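The switch between the arrow guide A121 (far) and the frame guide R101 (near) can be sketched as a function of the measured relative position. The distance threshold and function name are illustrative assumptions.

```python
import math

def guide_for_target(rel_x: float, rel_y: float, rel_z: float,
                     near_threshold: float = 0.10):
    """Pick the on-image guide from the probe's relative position (metres).

    Far from the target part: show arrow A121 with a heading toward it.
    Near the target part: show frame R101 around the region expected in
    the field-of-view portion on contact (no heading needed).
    """
    distance = math.sqrt(rel_x**2 + rel_y**2 + rel_z**2)
    if distance > near_threshold:
        return "arrow", math.atan2(rel_y, rel_x)  # heading on the image plane
    return "frame", None
```

In practice the relative position would come from the Visual SLAM measurement mentioned above, with the overhead image supplying the target part's coordinates.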
 A guide presenting a target value for the force with which the probe 1 is pressed against the examination target may also be superimposed on the image captured by the camera 22. Such a guide is displayed, for example, when button B111 in FIG. 24 is pressed. The target value is set, for example, by selecting from predetermined options via a button press, or by selecting from external-force measurements recorded when ultrasound images were captured in the past.
 FIG. 28 is a diagram showing a display example of the guide presenting the target value of the force for pressing in the probe 1.
 In the example of FIG. 28, a guide meter 353 indicating the target value of the pressing force is displayed together with the force meter 352, superimposed on the captured image P101 of the camera 22.
 Like the force meter 352, the guide meter 353 is composed of a circle and an arrow extending from the center of the circle. The size of the circle, the direction of the arrow, and the size of the arrow on the guide meter 353 correspond to those on the force meter 352. In practice, the guide meter 353 is drawn with lines in an inconspicuous color, such as a pale color.
 When the force pressing in the probe 1 is smaller than the target value, the force meter 352 is displayed smaller than the guide meter 353, and the circle of the force meter 352 is displayed, for example, in blue, as shown in A of FIG. 28.
 When the pressing force is about the same as the target value (the difference between the target value and the measured value is small), the force meter 352 is displayed at about the same size as the guide meter 353, and its circle is displayed, for example, in green, as shown in B of FIG. 28.
 When the pressing force is larger than the target value, the force meter 352 is displayed larger than the guide meter 353, and its circle is displayed, for example, in red, as shown in C of FIG. 28.
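The blue/green/red feedback described for A to C of FIG. 28 amounts to comparing the measured pressing force against the target with a tolerance band. The relative tolerance value is an illustrative assumption.

```python
def meter_color(measured: float, target: float, tolerance: float = 0.1) -> str:
    """Color the force-meter circle by comparing the measurement to the target.

    blue : pressing too lightly (below target minus the band)
    green: close to the target (within the band)
    red  : pressing too hard (above target plus the band)
    """
    band = tolerance * target  # relative tolerance around the target
    if measured < target - band:
        return "blue"
    if measured > target + band:
        return "red"
    return "green"
```

The same classification could also trigger the audible alert mentioned below for excessive force, e.g. on a second, higher threshold above "red".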
 By adjusting the pressing force so that the force meter 352 and the guide meter 353 overlap, the user can press the probe 1 against the examination target with an appropriate force.
 Since pressing in the probe 1 with excessive force is dangerous, an alert, such as a sound, may be presented to the user. The color of the arrow of the force meter 352 may change according to the pressing force, in the same way as the color of the circle.
 When the two force meters 351 are displayed, a guide meter corresponding to each of them is displayed, and the color of the circle of each force meter 351 changes according to the force pressing in the probe 1.
・Configuration of the ultrasonic inspection device
 The information processing device 2 may be configured from, for example, a single-board computer and a PC. The single-board computer is connected to the probe 1 and the PC, and the PC is connected to the probe 1 and the single-board computer.
 The single-board computer, for example, adjusts the light intensity of the LEDs provided inside the camera 22 based on commands from the PC. It also obtains numerical data indicating the acceleration, angular velocity, and temperature of the probe 1 from the sensors provided on the probe 1 and transmits the data to the PC.
 The PC, for example, acquires captured images from the imaging unit 140 of the camera 22 and estimates the external force acting on the probe 1 using a learning model that takes the captured images as input. The PC also acquires the numerical data transmitted from the single-board computer, and displays, for example, the external force acting on the probe 1 and the acceleration, angular velocity, and temperature of the probe 1 on the ultrasound examination screen.
・Effects of installing an inertial sensor (IMU: Inertial Measurement Unit)
 Since the inertial sensor provides the absolute attitude, the influence of gravity can be measured accurately and compensated for. Even when the person being examined and the probe 1 move at the same time, the absolute amount of movement of the probe 1 can be measured. The elasticity of the skin can also be measured based on the inertial-sensor measurements.
 The position of the probe 1 can be tracked even while the camera 22 is temporarily blocked, for example by a hand. Even in situations where the surroundings of the probe 1 are hard to grasp from the camera image, such as when the probe 1 is close to the examination target, position changes caused, for example, by the abdomen expanding and contracting with breathing can be obtained. Sensor fusion with the camera makes it possible to realize ultrasound examinations such as 4D echo.
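One simple way to keep a position estimate while the camera is blocked, as described above, is a complementary blend of the camera-based and IMU-based estimates: trust the camera when it has a view, and fall back to the IMU otherwise. This is only an illustrative sketch of such fusion, not the method used by the device; the blend weight is an assumption.

```python
def fuse_position(cam_pos, imu_pos, cam_valid, alpha=0.98):
    """Blend camera-based and IMU-based position estimates per axis.

    While the camera view is blocked (cam_valid is False), fall back
    entirely to the IMU estimate; otherwise weight the camera heavily
    (alpha close to 1), letting the IMU smooth short dropouts.
    """
    if not cam_valid:
        return list(imu_pos)
    return [alpha * c + (1.0 - alpha) * i for c, i in zip(cam_pos, imu_pos)]
```

A production system would instead run a proper filter (e.g. Kalman-style) over both sensors, but the fallback behavior is the same: the IMU bridges the gaps the camera cannot see.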
・Regarding the computer
 The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, the program constituting the software is installed from a program recording medium into a computer built into dedicated hardware, a general-purpose personal computer, or the like.
 FIG. 29 is a block diagram showing an example hardware configuration of a computer that executes the series of processes described above using a program.
 A CPU (Central Processing Unit) 501, a ROM (Read Only Memory) 502, and a RAM (Random Access Memory) 503 are interconnected by a bus 504.
 An input/output interface 505 is further connected to the bus 504. Connected to the input/output interface 505 are an input unit 506 including a keyboard and a mouse, and an output unit 507 including a display and speakers. Also connected to the input/output interface 505 are a storage unit 508 including a hard disk or non-volatile memory, a communication unit 509 including a network interface, and a drive 510 that drives removable media 511.
 In the computer configured as described above, the CPU 501 performs the series of processes described above by, for example, loading a program stored in the storage unit 508 into the RAM 503 via the input/output interface 505 and the bus 504 and executing it.
 The program executed by the CPU 501 is provided, for example, recorded on the removable media 511, or via a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting, and is installed in the storage unit 508.
 The program executed by the computer may be a program in which the processes are performed chronologically in the order described in this specification, or a program in which the processes are performed in parallel or at necessary timings, such as when called.
 Note that the effects described in this specification are merely examples and are not limiting; other effects may also be obtained.
 Embodiments of the present technology are not limited to the embodiments described above, and various modifications are possible without departing from the gist of the present technology.
 In addition, each step described in the above flowcharts can be executed by one device or shared among multiple devices.
 Furthermore, when one step includes multiple processes, the processes included in that step can be executed by one device or shared among multiple devices.
・Examples of combined configurations
 The present technology can also have the following configurations.
(1)
an ultrasonic sensor that emits ultrasonic waves to an object to be inspected and receives the reflected waves of the ultrasonic waves that are reflected by the object to be inspected and return;
a camera that is integrated with the probe provided with the ultrasonic sensor and captures at least an image of the vicinity of the direction in which the probe contacts the inspection object;
and a calculation unit that measures a relative position of the probe with respect to the object to be inspected based on an image captured by the camera.
(2)
The camera photographs an external space of the probe and photographs a light point group of light emitted from a light source provided inside the probe and reflected in a reflection space provided inside the probe,
The ultrasonic inspection device according to (1), wherein the calculation unit measures an external force acting on the probe based on a displacement of the light point group captured in the captured image.
(3)
The ultrasonic inspection device according to (2), further comprising an inertial sensor that measures at least one of an angular velocity of the probe and an acceleration of the probe.
(4)
The ultrasonic inspection device according to (3), wherein the calculation unit measures the elasticity of the object to be inspected based on the measurement result of the external force and the measurement result by the inertial sensor.
(5)
The ultrasonic inspection device described in (3) or (4), further comprising a recording unit that records, in association with each other, at least two or more pieces of data among an ultrasonic image generated based on the reflected waves received by the ultrasonic sensor, the measurement result of the external force, the captured image, the measurement result of the relative position, and the measurement result by the inertial sensor.
(6)
The ultrasonic inspection device according to any one of (3) to (5), wherein an input means for inputting target values for at least one of the relative position, the external force, the angular velocity of the probe, and the acceleration is provided in the probe or in an apparatus connected to the probe.
(7)
The ultrasonic inspection device described in (6), further comprising a presentation control unit that presents a guide to a user for bringing the measurement value closer to the target value based on a difference between at least any one of the measurement values of the relative position, the external force, the angular velocity of the probe, and the acceleration and the target value.
(8)
The ultrasonic inspection device described in any of (3) to (7) further includes a communication unit that transmits at least two or more pieces of data among an ultrasonic image generated based on the reflected wave received by the ultrasonic sensor, the captured image, the measurement result of the relative position, the measurement result of the external force, and the measurement result by the inertial sensor to an external device.
(9)
the communication unit receives information indicating advice on how to use the probe or information indicating an instruction to record the ultrasound image from the external device;
a presentation control unit that presents the advice to a user;
The ultrasonic inspection device described in (8) further comprises a recording unit that, when information indicating an instruction to record the ultrasonic image is transmitted from the external device, records the ultrasonic image, the captured image, and the measurement results of the external force in association with each other.
(10)
The ultrasonic inspection device according to (8) or (9), wherein the communication unit transmits encrypted data such that at least two or more of the data of the ultrasonic image, the captured image, the measurement result of the relative position, the measurement result of the external force, and the measurement result by the inertial sensor can be decrypted only by the external device.
(11)
The camera captures at least one of a barcode, a two-dimensional code, a special code, a character string, a face of the inspection target, and a face of a user;
The ultrasound inspection device described in any of (1) to (10), further comprising an acquisition unit that acquires information about the test subject or the user based on at least one of the barcode, the two-dimensional code, the special code, the character string, the face of the test subject, and the face of the user that appear in the captured image.
(12)
The ultrasonic inspection device according to any one of (1) to (11), wherein a transparent member that separates the camera from an external space is formed on the probe.
(13)
The ultrasonic inspection device according to any one of (1) to (12), wherein a cover portion for covering the camera is attached movably relative to the probe.
(14)
The ultrasonic inspection device according to any one of (1) to (13), wherein the camera is a fisheye camera or an omnidirectional camera.
(15)
The calculation unit performs image recognition based on the captured image to recognize the inspection object appearing in the captured image,
The ultrasonic inspection device described in any one of (1) to (14), wherein the ultrasonic sensor stops operating when, based on the recognition result of the object to be inspected and the measurement result of the relative position, the ultrasonic sensor comes closer than a threshold to a predetermined part of the object to be inspected.
(16)
The ultrasonic inspection device according to any one of (1) to (15), wherein the camera is provided on at least one of a tip portion, a bottom surface, and a front surface of the probe.
(17)
The ultrasonic inspection device according to any one of (1) to (16), wherein the probe is provided with a plurality of the cameras.
(18)
The ultrasonic inspection device according to (5), wherein the recording unit records the ultrasonic image when a past inspection situation and a current inspection situation are substantially the same.
(19)
An inspection method in which an ultrasonic inspection device measures the relative position of a probe with respect to an object to be inspected, based on an image captured by a camera that is configured integrally with the probe, the probe being provided with an ultrasonic sensor that emits ultrasonic waves to the object to be inspected and receives the reflected waves of the ultrasonic waves reflected back from the object, the camera capturing at least an image of the vicinity of the direction in which the probe contacts the object to be inspected.
(20)
A program for causing a computer to execute a process of measuring the relative position of a probe with respect to an object to be inspected, based on an image captured by a camera that is configured integrally with the probe, the probe being provided with an ultrasonic sensor that emits ultrasonic waves to the object to be inspected and receives the reflected waves of the ultrasonic waves reflected back from the object, the camera capturing at least an image of the vicinity of the direction in which the probe contacts the object to be inspected.
 1 Probe, 2 Information processing device, 11 Tip portion, 12 Support portion, 21 Ultrasonic sensor, 22 Camera, 23 Inertial sensor, 24 Calculation unit, 31 Transparent member, 51 Data acquisition unit, 52 Calculation unit, 53 Communication unit, 54 Recording unit, 55 Presentation control unit, 55 Presentation unit

Claims (20)

  1.  An ultrasonic inspection device comprising:
     an ultrasonic sensor that emits ultrasonic waves to an object to be inspected and receives the reflected waves of the ultrasonic waves that are reflected back from the object to be inspected;
     a camera that is integrated with a probe provided with the ultrasonic sensor and captures at least an image of the area in the direction in which the probe contacts the object to be inspected; and
     a calculation unit that measures a relative position of the probe with respect to the object to be inspected based on an image captured by the camera.
  2.  The ultrasonic inspection device according to claim 1, wherein
     the camera photographs the external space of the probe and also photographs a group of light points formed by light that is emitted from a light source provided inside the probe and reflected in a reflection space provided inside the probe, and
     the calculation unit measures an external force acting on the probe based on a displacement of the group of light points appearing in the captured image.
  3.  The ultrasonic inspection device according to claim 2, further comprising an inertial sensor that measures at least one of an angular velocity of the probe and an acceleration of the probe.
  4.  The ultrasonic inspection device according to claim 3, wherein the calculation unit measures the elasticity of the object to be inspected based on the measurement result of the external force and the measurement result of the inertial sensor.
  5.  The ultrasonic inspection device according to claim 3, further comprising a recording unit that records, in association with each other, at least two or more pieces of data among an ultrasonic image generated based on the reflected waves received by the ultrasonic sensor, the measurement result of the external force, the captured image, the measurement result of the relative position, and the measurement result of the inertial sensor.
  6.  The ultrasonic inspection device according to claim 3, wherein an input means for inputting a target value for at least one of the relative position, the external force, the angular velocity of the probe, and the acceleration is provided on the probe or on a device connected to the probe.
  7.  The ultrasonic inspection device according to claim 6, further comprising a presentation control unit that presents to a user a guide for bringing a measured value closer to the target value, based on a difference between the target value and the measured value of at least one of the relative position, the external force, the angular velocity of the probe, and the acceleration.
  8.  The ultrasonic inspection device according to claim 3, further comprising a communication unit that transmits, to an external device, at least two or more pieces of data among an ultrasonic image generated based on the reflected waves received by the ultrasonic sensor, the captured image, the measurement result of the relative position, the measurement result of the external force, and the measurement result of the inertial sensor.
  9.  The ultrasonic inspection device according to claim 8, wherein
     the communication unit receives, from the external device, information indicating advice on how to use the probe or information indicating an instruction to record the ultrasonic image, the device further comprising:
     a presentation control unit that presents the advice to a user; and
     a recording unit that, when information indicating an instruction to record the ultrasonic image is transmitted from the external device, records the ultrasonic image, the captured image, and the measurement result of the external force in association with each other.
  10.  The ultrasonic inspection device according to claim 8, wherein the communication unit transmits at least two or more pieces of data among the ultrasonic image, the captured image, the measurement result of the relative position, the measurement result of the external force, and the measurement result of the inertial sensor as data encrypted such that the data can be decrypted only by the external device.
  11.  The ultrasonic inspection device according to claim 1, wherein
     the camera photographs at least one of a barcode, a two-dimensional code, a special code, a character string, the face of the object to be inspected, and the face of a user, the device further comprising
     an acquisition unit that acquires information about the object to be inspected or the user based on at least one of the barcode, the two-dimensional code, the special code, the character string, the face of the object to be inspected, and the face of the user appearing in the captured image.
  12.  The ultrasonic inspection device according to claim 1, wherein a transparent member that separates the camera from the external space is formed on the probe.
  13.  The ultrasonic inspection device according to claim 1, wherein a cover portion that covers the camera is attached so as to be movable relative to the probe.
  14.  The ultrasonic inspection device according to claim 1, wherein the camera is a fisheye camera or an omnidirectional camera.
  15.  The ultrasonic inspection device according to claim 1, wherein
     the calculation unit performs image recognition based on the captured image to recognize the object to be inspected appearing in the captured image, and
     the ultrasonic sensor stops operating when, based on the recognition result of the object to be inspected and the measurement result of the relative position, the ultrasonic sensor comes closer than a threshold to a predetermined part of the object to be inspected.
  16.  The ultrasonic inspection device according to claim 1, wherein the camera is provided on at least one of a tip portion, a bottom surface, and a front surface of the probe.
  17.  The ultrasonic inspection device according to claim 1, wherein the probe is provided with a plurality of the cameras.
  18.  The ultrasonic inspection device according to claim 5, wherein the recording unit records the ultrasonic image when a past inspection situation and a current inspection situation are substantially the same.
  19.  An inspection method in which an ultrasonic inspection device measures the relative position of a probe with respect to an object to be inspected, based on an image captured by a camera that is integrated with the probe, the probe being provided with an ultrasonic sensor that emits ultrasonic waves to the object to be inspected and receives the reflected waves of the ultrasonic waves that are reflected back from the object to be inspected, and the camera capturing at least an image of the area in the direction in which the probe contacts the object to be inspected.
  20.  A program for causing a computer to execute a process of measuring the relative position of a probe with respect to an object to be inspected, based on an image captured by a camera that is integrated with the probe, the probe being provided with an ultrasonic sensor that emits ultrasonic waves to the object to be inspected and receives the reflected waves of the ultrasonic waves that are reflected back from the object to be inspected, and the camera capturing at least an image of the area in the direction in which the probe contacts the object to be inspected.
PCT/JP2023/036673 2022-10-26 2023-10-10 Ultrasonic inspection device, inspection method, and program WO2024090190A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022171639 2022-10-26
JP2022-171639 2022-10-26

Publications (1)

Publication Number Publication Date
WO2024090190A1 true WO2024090190A1 (en) 2024-05-02

Family

ID=90830706

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/036673 WO2024090190A1 (en) 2022-10-26 2023-10-10 Ultrasonic inspection device, inspection method, and program

Country Status (1)

Country Link
WO (1) WO2024090190A1 (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004201722A (en) * 2002-12-20 2004-07-22 Ge Medical Systems Global Technology Co Llc Ultrasonograph
US20070167709A1 (en) * 2000-12-28 2007-07-19 Guided Therapy Systems, Inc. Visual imaging system for ultrasonic probe
CN101569541A (en) * 2008-04-29 2009-11-04 香港理工大学 Three-dimensional ultrasonic imaging system
US20130237811A1 (en) * 2012-03-07 2013-09-12 Speir Technologies Inc. Methods and systems for tracking and guiding sensors and instruments
JP2015181660A (en) * 2014-03-24 2015-10-22 キヤノン株式会社 Object information acquiring apparatus and breast examination apparatus
CN204723091U (en) * 2015-06-12 2015-10-28 成都迈迪特科技有限公司 With the bladder scanner of photographic head
US20170105701A1 (en) * 2015-10-19 2017-04-20 Clarius Mobile Health Corp. Systems and methods for remote graphical feedback of ultrasound scanning technique
KR101792952B1 (en) * 2016-06-27 2017-11-01 고려대학교 산학협력단 Ultrasonic Imaging Apparatus
JP2019521745A (en) * 2016-06-20 2019-08-08 バタフライ ネットワーク,インコーポレイテッド Automatic image acquisition to assist the user in operating the ultrasound system
JP2020049062A (en) * 2018-09-28 2020-04-02 ゼネラル・エレクトリック・カンパニイ Ultrasound image display apparatus
KR20200101086A (en) * 2019-02-19 2020-08-27 삼성메디슨 주식회사 Ultrasound diagnostic apparatus
CN112472133A (en) * 2020-12-22 2021-03-12 深圳市德力凯医疗设备股份有限公司 Posture monitoring method and device for ultrasonic probe
WO2021085186A1 (en) * 2019-10-31 2021-05-06 ソニー株式会社 Sensor device
WO2022249537A1 (en) * 2021-05-24 2022-12-01 ソニーグループ株式会社 Sensor device


Similar Documents

Publication Publication Date Title
US10863898B2 (en) System and method for determining distances from an object
US9498118B2 (en) Handheld vision tester and calibration thereof
EP3402384B1 (en) Systems and methods for determining distance from an object
CN108604116A (en) It can carry out the wearable device of eye tracks
WO2016183537A1 (en) Handheld biometric scanner device
KR101985438B1 (en) Smart Apparatus for Measuring And Improving Physical Ability
CN108292448A (en) Information processing unit, information processing method and program
JP2023523317A (en) System for acquiring ultrasound images of internal body organs
JP5857805B2 (en) Camera calibration device
Fletcher et al. Development of smart phone-based child health screening tools for community health workers
WO2024090190A1 (en) Ultrasonic inspection device, inspection method, and program
US11579449B2 (en) Systems and methods for providing mixed-reality experiences under low light conditions
JP7209954B2 (en) Nystagmus analysis system
TW201720364A (en) Color vision examination method and system for objectively examining color recognition capability of examinee
JP5884564B2 (en) Diagnosis support device
US20240122469A1 (en) Virtual reality techniques for characterizing visual capabilities
JP5480028B2 (en) Lookout gaze detection device