WO2022163468A1 - Symptom recording device, symptom recording method, and program - Google Patents
- Publication number
- WO2022163468A1 (PCT/JP2022/001849; JP2022001849W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- neck
- control unit
- imaging
- symptom
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B10/00—Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30101—Blood vessel; Artery; Vein; Vascular
Definitions
- the present disclosure relates to a symptom recording device, a symptom recording method, and a program.
- An increase in central venous pressure appears as jugular venous distention.
- A method of estimating circulatory conditions, such as cardiac function, from the central venous pressure is used clinically.
- Patent Document 1 discloses a technique for capturing an image of the right side of the neck in order to determine the central venous pressure.
- Confirmation of jugular venous distention is a method that makes it possible to judge, from appearance, intravascular congestion (an increase in intravascular volume) due to exacerbation of heart failure, but it requires experience and knowledge.
- When attempting to confirm jugular venous distention by visual observation or image analysis in telemedicine or telemonitoring, conditions such as body positioning must be prepared so that images suitable for confirming jugular venous distention are captured correctly. However, it is difficult for users without experience or knowledge to meet such conditions.
- The purpose of the present disclosure is to make it easier for a user with no experience or knowledge to capture an image that allows the state of the jugular vein to be confirmed.
- A symptom recording device as one aspect of the present disclosure comprises a control unit that, before a neck image for confirming the state of the jugular vein is captured, supports the imaging by performing control to display, as a preview image, an image obtained by an imaging element used for the imaging and control to output range designation data designating an imaging range including the neck, and that records the captured neck image.
- the range designation data is data that designates a range from below the ear to the clavicle as the imaging range.
- control unit performs control to output body posture instruction data for instructing a body posture before capturing the neck image.
- control unit performs control to output clothing instruction data for instructing clothing or hairstyle instruction data for instructing hairstyle before the neck image is captured.
- control unit performs control to output condition instruction data that instructs imaging conditions including ambient brightness before imaging the neck image.
- control unit sets imaging conditions including the sensitivity of the imaging element before capturing the neck image.
- control unit further records date and time data indicating the date and time when the neck image was taken.
- control unit further records angle data indicating the angle at which the neck image was taken.
- control unit determines whether or not the imaging range is included in the image obtained by the imaging element, and if it determines that the imaging range is included, superimposes a figure representing the imaging range on the preview image.
- control unit determines whether or not the imaging range is included in the image obtained by the imaging element, and if it determines that the imaging range is included, records the image obtained by the imaging element as the neck image.
- control unit performs control to display the captured neck image, outputs question data asking whether the captured neck image includes the imaging range, and supports recapture of the neck image when a response indicating that the imaging range is not included is entered.
- control unit performs control to display the previously recorded neck image together with the captured neck image.
- control unit analyzes the captured neck image, determines whether the captured neck image includes the imaging range, and supports recapture of the neck image if it determines that the imaging range is not included.
- the apparatus further comprises a communication unit that is controlled by the control unit and transmits the neck image to a server.
- control unit supports recapture of the neck image when indication data, indicating that the imaging range is not included in the transmitted neck image, is transmitted from the server and received by the communication unit.
- In a symptom recording method as one aspect of the present disclosure, a control unit performs control to display, as a preview image, an image obtained by an imaging element used for the imaging before a neck image for confirming the state of the jugular vein is captured, supports the imaging by performing control to output range designation data designating an imaging range including the neck, and records the captured neck image.
- A program as one aspect of the present disclosure causes a computer to execute processing for supporting the imaging of a neck image for confirming the state of the jugular vein, by performing control to display, as a preview image, an image obtained by an imaging element used for the imaging before the neck image is captured and control to output range designation data designating an imaging range including the neck, and processing for recording the captured neck image.
- FIG. 1 is a diagram showing the configuration of a system according to an embodiment of the present disclosure.
- FIG. 2 is a block diagram showing the configuration of a symptom recording device according to an embodiment of the present disclosure.
- FIG. 3 is a diagram showing a screen example of the symptom recording device according to the embodiment of the present disclosure.
- FIG. 4 is a diagram showing a screen example of the symptom recording device according to the embodiment of the present disclosure.
- FIG. 5 is a diagram showing a screen example of the symptom recording device according to the embodiment of the present disclosure.
- FIG. 6 is a diagram showing a screen example of the symptom recording device according to the embodiment of the present disclosure.
- FIG. 7 is a flow chart showing the operation of the symptom recording device according to the embodiment of the present disclosure.
- the system 10 includes multiple symptom recording devices 20 and at least one server 30 .
- Each of the multiple symptom recording devices 20 is used by a user such as a patient, a patient's family member, a caregiver, or a medical worker.
- the patient is, for example, a heart failure patient.
- the number of symptom recording devices 20 is not limited to a plurality, and may be one. For convenience of description, one symptom recording device 20 will be described below.
- the symptom recording device 20 is held by the user. Alternatively, the symptom recording device 20 is installed at the patient's home.
- the symptom recording device 20 is, for example, a general-purpose terminal such as a mobile phone, smart phone, tablet, or PC, or a dedicated terminal such as a gadget. "PC" is an abbreviation for personal computer.
- the symptom recording device 20 can communicate with the server 30 via the network 40.
- the server 30 is installed in a facility such as a data center.
- Server 30 is, for example, a server computer belonging to a cloud computing system or other computing system.
- the network 40 includes the Internet, at least one WAN, at least one MAN, or any combination thereof.
- WAN is an abbreviation for wide area network.
- MAN is an abbreviation for metropolitan area network.
- Network 40 may include at least one wireless network, at least one optical network, or any combination thereof.
- a wireless network is, for example, an ad-hoc network, a cellular network, a wireless LAN, a satellite communication network, or a terrestrial microwave network.
- LAN is an abbreviation for local area network.
- An outline of this embodiment will be described with reference to FIGS. 2, 4, and 6.
- Before a neck image 56 for confirming the state of the jugular vein is captured, the control unit 21 performs control to display the image obtained by the imaging element used for the imaging as the preview image 52, and supports the imaging by performing control to output range designation data D1 that designates the imaging range including the neck.
- The control unit 21 records the captured neck image 56.
- the user can receive instructions regarding the imaging range for obtaining an image that allows confirmation of the state of the jugular vein. Therefore, according to the present embodiment, it becomes easier for a user with no experience or knowledge to capture an image that allows confirmation of the state of the jugular vein.
- The imaging range includes at least the part of the neck where the state of the jugular vein can be confirmed, and in this embodiment includes the range from below the ear to the clavicle. That is, the range designation data D1 is data that designates the range from below the ear to the clavicle as the imaging range. According to this embodiment, including the clavicle in the imaging range makes it possible to improve the accuracy of central venous pressure estimation based on height estimation. As a modified example of this embodiment, the imaging range may include the sternal angle. Also, the higher the central venous pressure, the higher the location of the pulsation; in some patients, the pulsation may appear just below the ear.
- the imaging range may include the earlobe.
- the symptom recording device 20 includes a control section 21 , a storage section 22 , a communication section 23 , an input section 24 and an output section 25 .
- the control unit 21 includes at least one processor, at least one programmable circuit, at least one dedicated circuit, or any combination thereof.
- a processor may be a general-purpose processor such as a CPU or GPU, or a dedicated processor specialized for a particular process.
- CPU is an abbreviation for central processing unit.
- GPU is an abbreviation for graphics processing unit.
- a programmable circuit is, for example, an FPGA.
- FPGA is an abbreviation for field-programmable gate array.
- a dedicated circuit is, for example, an ASIC.
- ASIC is an abbreviation for application specific integrated circuit.
- the control unit 21 executes processing related to the operation of the symptom recording device 20 while controlling each unit of the symptom recording device 20 .
- the storage unit 22 includes at least one semiconductor memory, at least one magnetic memory, at least one optical memory, or any combination thereof.
- a semiconductor memory is, for example, a RAM or a ROM.
- RAM is an abbreviation for random access memory.
- ROM is an abbreviation for read only memory.
- RAM is, for example, SRAM or DRAM.
- SRAM is an abbreviation for static random access memory.
- DRAM is an abbreviation for dynamic random access memory.
- ROM is, for example, EEPROM.
- EEPROM is an abbreviation for electrically erasable programmable read only memory.
- the storage unit 22 functions, for example, as a main memory device, an auxiliary memory device, or a cache memory.
- the storage unit 22 stores data used for operation of the symptom recording device 20 and data obtained by the operation of the symptom recording device 20 .
- the communication unit 23 includes at least one communication interface.
- the communication interface is, for example, an interface compatible with mobile communication standards such as LTE, 4G standard, or 5G standard, an interface compatible with short-range wireless communication standards such as Bluetooth (registered trademark), or a LAN interface.
- LTE is an abbreviation for Long Term Evolution.
- 4G is an abbreviation for 4th generation.
- 5G is an abbreviation for 5th generation.
- the communication unit 23 receives data used for operating the symptom recording device 20 and transmits data obtained by operating the symptom recording device 20 .
- the input unit 24 includes at least two input interfaces.
- One input interface is an imaging device, such as a camera, that includes an imaging element.
- Other input interfaces are, for example, physical keys, capacitive keys, a pointing device, a touch screen integrated with the display, or a microphone.
- Input unit 24 receives an operation to input data used for operation of symptom recording device 20 .
- the input unit 24 may be connected to the symptom recording device 20 as an external input device instead of being provided in the symptom recording device 20 .
- As the connection interface for example, an interface compatible with standards such as USB, HDMI (registered trademark), or Bluetooth (registered trademark) can be used.
- USB is an abbreviation for Universal Serial Bus.
- the output unit 25 includes at least two output interfaces.
- One output interface is a display such as an LCD or organic EL display.
- LCD is an abbreviation for liquid crystal display.
- EL is an abbreviation for electro luminescence.
- Another output interface is, for example, a speaker.
- the output unit 25 outputs data obtained by operating the symptom recording device 20 .
- the output unit 25 may be connected to the symptom recording device 20 as an external output device instead of being provided in the symptom recording device 20 .
- As the connection interface for example, an interface compatible with standards such as USB, HDMI (registered trademark), or Bluetooth (registered trademark) can be used.
- the function of the symptom recording device 20 is realized by executing the program according to the present embodiment with a processor as the control unit 21. That is, the functions of the symptom recording device 20 are realized by software.
- the program causes the computer to function as the symptom recording device 20 by causing the computer to execute the operation of the symptom recording device 20 . That is, the computer functions as the symptom recording device 20 by executing the operation of the symptom recording device 20 according to the program.
- the program can be stored on a non-transitory computer-readable medium.
- a non-transitory computer-readable medium is, for example, a flash memory, a magnetic recording device, an optical disk, a magneto-optical recording medium, or a ROM.
- Program distribution is performed, for example, by selling, assigning, or lending a portable medium such as an SD card, DVD, or CD-ROM storing the program.
- SD is an abbreviation for Secure Digital.
- DVD is an abbreviation for digital versatile disc.
- CD-ROM is an abbreviation for compact disc read only memory.
- the program may be distributed by storing the program in the storage of the server and transferring the program from the server to another computer.
- a program may be provided as a program product.
- a computer for example, temporarily stores a program stored in a portable medium or a program transferred from a server in a main storage device. Then, the computer reads the program stored in the main storage device with the processor, and executes processing according to the read program with the processor.
- the computer may read the program directly from the portable medium and execute processing according to the program.
- the computer may execute processing according to the received program every time the program is transferred from the server to the computer.
- the processing may be executed by a so-called ASP type service that realizes the function only by executing the execution instruction and obtaining the result without transferring the program from the server to the computer.
- "ASP" is an abbreviation for application service provider.
- The term "program" includes information that is used for processing by a computer and is equivalent to a program. For example, data that is not a direct instruction to a computer but that has the property of prescribing the processing of the computer corresponds to "things equivalent to a program."
- a part or all of the functions of the symptom recording device 20 may be realized by a programmable circuit or a dedicated circuit as the control unit 21. That is, part or all of the functions of the symptom recording device 20 may be realized by hardware.
- In step S1, the control unit 21 performs control to output body posture instruction data D2 that indicates the body posture.
- The display as the output unit 25 is controlled by the control unit 21 to display the body posture instruction data D2 on the screen 50.
- Alternatively, the speaker as the output unit 25 is controlled by the control unit 21 to output the body posture instruction data D2 by voice.
- The control unit 21 causes the display to display an illustration 51 that instructs the user on the angle with respect to the supine position as the body posture instruction data D2.
- The angle indicated by the illustration 51 is 30° or more and 45° or less, and is 45° in the example of FIG. 3. That is, in the example of FIG. 3, the user is instructed to adjust the patient's body posture so that the upper body is raised 45° from the supine position.
- the control unit 21 causes the display to further display messages such as "Let's adjust the body position as shown below" and "Let's prepare for video shooting after adjusting the body position".
- As a modification, the control unit 21 may support capturing of a body posture image, such as an image of the entire patient, and may record the captured body posture image. Specifically, the control unit 21 may cause the storage unit 22 to store a body posture image obtained by the imaging element of the imaging device as the input unit 24.
- This modification makes it easier to ensure that the patient is in the correct body posture. As a result, the accuracy of central venous pressure estimation increases.
- The control unit 21 may cause the display to display the previously recorded body posture image together with the captured body posture image. This modification makes it easier to check whether the patient is in the correct posture.
- the sitting position or the semi-sitting position may be selected by mode selection.
- the information as to which mode has been selected can be used as additional information for diagnosis.
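The posture instruction and mode selection described above can be sketched as follows. The mode names, messages, and the angle check are illustrative assumptions; the patent only specifies a 30° to 45° reclined posture, with a sitting or semi-sitting position selectable by mode.

```python
# Sketch of step S1: selecting the body posture instruction data D2 by mode.
# Mode names and messages are hypothetical, not taken from the patent.

POSTURE_MODES = {
    "reclined": "Raise the upper body 45 degrees from the supine position.",
    "semi-sitting": "Take a semi-sitting position.",
    "sitting": "Take a sitting position.",
}

def posture_instruction(mode="reclined", angle_deg=45):
    """Return the instruction message for the selected mode; for the
    reclined mode, the angle must lie in the 30-45 degree range."""
    if mode == "reclined" and not 30 <= angle_deg <= 45:
        raise ValueError("angle must be between 30 and 45 degrees")
    return POSTURE_MODES[mode]

msg = posture_instruction()  # default: 45-degree reclined posture
```

The selected mode could then be recorded alongside the neck image as the additional diagnostic information mentioned above.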
- the control unit 21 may further perform control to output clothing instruction data D3a that instructs clothing.
- The display may be controlled by the control unit 21 to display the clothing instruction data D3a on the screen 50.
- Alternatively, the speaker may be controlled by the control unit 21 to output the clothing instruction data D3a by voice.
- The control unit 21 may cause the display to display a clothing instruction message such as "Please wear clothing that leaves the area from below the ears to the clavicle visible", or may cause the speaker to output the message by voice.
- control unit 21 may further perform control to output hairstyle instruction data D3b that instructs a hairstyle.
- the display may be controlled by the control unit 21 to display the hairstyle instruction data D3b on the screen 50.
- the speaker may be controlled by the control unit 21 to output the hairstyle instruction data D3b by voice.
- The control unit 21 may cause the display to display a hairstyle instruction message such as "If you have long hair, please tie your hair back", or may cause the speaker to output it by voice.
- In step S2, the control unit 21 activates the imaging device as the input unit 24. Specifically, the control unit 21 activates the camera.
- In step S3, the control unit 21 performs control to display the image obtained by the imaging element as the preview image 52, and also performs control to output range designation data D1 that designates the imaging range including the neck.
- the display as the output unit 25 is controlled by the control unit 21 to display the preview image 52 and the range designation data D1 on the screen 50 .
- the speaker as the output unit 25 is controlled by the control unit 21 to output the range designation data D1 by voice while the preview image 52 is being displayed on the screen 50 .
- In the example of FIG. 4, the control unit 21 causes the display to show the preview image 52 and, as the range designation data D1, a range instruction message 53 such as "Adjust the position of the camera so that the region from below the ear to the clavicle is fully captured." That is, the user is instructed to adjust the position of the camera so as to photograph the range from below the ear to the clavicle.
- Range indication message 53 may include other messages, such as a message directing the patient's face to be turned sideways.
- The control unit 21 determines whether or not the image captured by the imaging element includes the imaging range designated by the range designation data D1. As shown in FIG. 4, when the control unit 21 determines that the image captured by the imaging element includes the imaging range indicated by the range designation data D1, it performs control to display a figure 54 representing the imaging range superimposed on the preview image 52. The display is controlled by the control unit 21 to display the figure 54 superimposed on the preview image 52. The figure 54 may have any shape, color, and pattern, but in the example of FIG. 4 it is a rectangular red frame. If the control unit 21 determines that the imaging range indicated by the range designation data D1 is not included in the image obtained by the imaging element, it performs control to hide the figure 54. The display is controlled by the control unit 21 to hide the figure 54.
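The range check and the overlay decision can be sketched in a few lines. The landmark coordinates, the margin, and both helper names are assumptions for illustration: the patent does not say how the imaging range is detected, only that known image-analysis methods, including machine learning, may be used.

```python
# Sketch of the imaging-range check and the figure-54 overlay decision.
# Assumes an external detector has already located two landmarks in the
# preview frame: a point just below the ear and a point on the clavicle.

def range_in_frame(ear_xy, clavicle_xy, frame_w, frame_h, margin=10):
    """True if the span from below the ear to the clavicle fits inside
    the frame with a small margin on every side."""
    for x, y in (ear_xy, clavicle_xy):
        if not (margin <= x <= frame_w - margin and margin <= y <= frame_h - margin):
            return False
    return True

def overlay_rect(ear_xy, clavicle_xy, pad=20):
    """Rectangle to superimpose on the preview image: the bounding box
    of the two landmarks, padded outward. Returns (left, top, right, bottom)."""
    (x1, y1), (x2, y2) = ear_xy, clavicle_xy
    return (min(x1, x2) - pad, min(y1, y2) - pad,
            max(x1, x2) + pad, max(y1, y2) + pad)

# Example: both landmarks visible in a 640x480 preview frame.
ear, clavicle = (320, 120), (300, 400)
if range_in_frame(ear, clavicle, 640, 480):
    rect = overlay_rect(ear, clavicle)  # show the red frame (figure 54)
else:
    rect = None                         # hide figure 54
```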
- In step S4, the control unit 21 performs control to output condition instruction data D4 that instructs imaging conditions including ambient brightness.
- the display as the output unit 25 is controlled by the control unit 21 to display the condition instruction data D4 on the screen 50.
- Alternatively, the speaker as the output unit 25 is controlled by the control unit 21 to output the condition instruction data D4 by voice.
- In the example of FIG. 5, the control unit 21 causes the display to display a condition instruction message 55 such as "Please make the room brighter" as the condition instruction data D4. That is, the user is instructed to adjust the room lighting so that it is sufficiently bright.
- the condition instruction message 55 may include other messages such as a message instructing the user to eliminate camera shake or a message instructing not to include a foreign object in the image.
- In step S5, the control unit 21 sets imaging conditions including the sensitivity of the imaging element. Specifically, the control unit 21 performs sensitivity adjustment in the background. In this embodiment, since a video is captured as the neck image 56 for checking the state of the jugular vein, the control unit 21 may set the shooting time as part of the imaging conditions. The control unit 21 may set other conditions as part of the imaging conditions as well.
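One way the background sensitivity adjustment could work is to map the mean luminance of a preview frame to a sensor sensitivity setting. The thresholds and ISO values below are hypothetical; the patent only states that sensitivity is set, not how.

```python
# Sketch of the background sensitivity adjustment in step S5.
# Thresholds and ISO values are illustrative assumptions.

def mean_luminance(pixels):
    """Mean of grayscale pixel values (0-255)."""
    return sum(pixels) / len(pixels)

def choose_iso(luminance):
    """Pick a sensor sensitivity from ambient brightness."""
    if luminance >= 150:   # bright room
        return 100
    if luminance >= 80:    # normal indoor light
        return 400
    return 800             # dim room: raise sensitivity

frame = [120] * 1000       # stand-in for a grayscale preview frame
iso = choose_iso(mean_luminance(frame))
```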
- The control unit 21 supports the imaging of the neck image 56 by executing the processing from step S3 to step S5.
- In step S6, the control unit 21 records the captured neck image 56. Specifically, the control unit 21 causes the storage unit 22 to store the captured neck image 56.
- The control unit 21 causes the storage unit 22 to store, as the neck image 56, a video containing as a frame group the images obtained by the imaging element over a certain period of time after step S5.
- Video shooting may be started automatically, or may be started in response to a user operation such as tapping a shooting button via the touch screen as the input unit 24 .
- Video capture may be ended automatically or may be ended in response to a user operation such as tapping a stop button via a touch screen.
- As a modification, the control unit 21 may analyze an image obtained by the imaging element and determine whether or not the image includes the imaging range indicated by the range designation data D1.
- a known method can be used as the image analysis method. Machine learning such as deep learning may be used.
- The control unit 21 may record the image as the neck image 56 when it determines that the image obtained by the imaging element includes the imaging range indicated by the range designation data D1.
- The control unit 21 may cause the storage unit 22 to store, as the neck image 56, a video containing as a frame group the images obtained by the imaging element that include the imaging range designated by the range designation data D1.
- The control unit 21 may further record date and time data D5 indicating the date and time when the neck image 56 was taken. Specifically, the control unit 21 may cause the storage unit 22 to store the date and time data D5 together with the captured neck image 56. According to this modification, it is possible to analyze the diurnal variation of the central venous pressure, or to know whether the neck image 56 was taken in the warm season or the cold season.
- The control unit 21 may further record angle data D6 indicating the angle at which the neck image 56 was taken. Specifically, the control unit 21 may cause the storage unit 22 to store the angle data D6 together with the captured neck image 56. According to this modification, the body posture can be estimated from the camera angle and the neck angle.
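Storing the neck image together with the date-and-time data D5 and angle data D6 could look like the record below. The field names and types are illustrative, not taken from the patent.

```python
# Sketch of a stored record: neck image 56 plus date-and-time data D5
# and angle data D6. Field names are hypothetical.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class NeckImageRecord:
    video_path: str          # captured neck image 56 (video file)
    captured_at: datetime    # date and time data D5
    camera_angle_deg: float  # angle data D6

record = NeckImageRecord(
    video_path="neck_0001.mp4",
    captured_at=datetime(2022, 1, 20, 9, 30),
    camera_angle_deg=45.0,
)
```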
- In step S7, the control unit 21 performs control to display the captured neck image 56 and control to output question data D7 asking whether to transmit the captured neck image 56 to the server 30.
- For example, the display serving as the output unit 25 is controlled by the control unit 21 to show the neck image 56 and the question data D7 on the screen 50. Alternatively, the speaker serving as the output unit 25 may be controlled by the control unit 21 to output the question data D7 by voice while the neck image 56 is displayed on the screen 50.
- For example, the control unit 21 causes the display to show a confirmation message 57 such as "Video recording is complete. Do you want to send it?". That is, in the example of FIG. 6, the user is asked whether to send the captured video to the server 30.
- When an answer not to transmit the captured neck image 56 to the server 30 is input via the input unit 24, the processing from step S3 onward is executed again; that is, the control unit 21 supports recapturing the neck image 56.
- When an answer to transmit the captured neck image 56 to the server 30 is input via the input unit 24, the process of step S8 is executed.
- The control unit 21 causes the display to show a recapture button 58 labeled "Recapture" and a send button 59 labeled "Send".
- Alternatively, the question data D7 may ask whether the captured neck image 56 includes the imaging range indicated by the range designation data D1.
- In that case, the confirmation message 57 may be a message such as "Is the area from below the ear to the collarbone visible in the video?"; that is, the user is asked whether the area from below the ear to the collarbone is visible in the captured video.
- The recapture button 58 and the send button 59 may then be labeled "No" and "Yes", respectively.
- When the recapture button 58 is tapped via the touch screen, the processing from step S3 onward is executed again.
- When the send button 59 is tapped via the touch screen, the process of step S8 is executed.
- When an answer that the captured neck image 56 does not include the imaging range indicated by the range designation data D1 is input via the input unit 24, the processing from step S3 onward is executed again.
- When an answer that the captured neck image 56 includes the imaging range indicated by the range designation data D1 is input via the input unit 24, the process of step S8 is executed.
- The control unit 21 may cause the display to show the previously recorded neck image 56 together with the captured neck image 56. According to this modification, it becomes easier to check whether the neck image 56 has been captured correctly.
- Alternatively, the control unit 21 may analyze the captured neck image 56 to determine whether the captured neck image 56 includes the imaging range indicated by the range designation data D1.
- A known method can be used for this image analysis; machine learning such as deep learning may be used.
- If the control unit 21 determines that the captured neck image 56 does not include the imaging range indicated by the range designation data D1, the processing from step S3 onward is executed again. If the control unit 21 determines that the captured neck image 56 includes the imaging range designated by the range designation data D1, the process of step S8 is executed.
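The confirm-or-recapture flow of steps S3 through S8 can be sketched as a retry loop; `capture`, `confirm`, and `send` here are stand-in callables for the real device operations, and the loop shape is an assumption, not the patent's API:

```python
def record_neck_image(capture, confirm, send, max_attempts=3):
    """Repeat capture until the image is confirmed (by the user or by
    analysis) to cover the designated range, then send it to the server.
    `capture`, `confirm`, and `send` are stand-in callables."""
    for _ in range(max_attempts):
        image = capture()          # steps S3-S6: (re)capture the neck image
        if confirm(image):         # step S7: is the imaging range included?
            send(image)            # step S8: transmit to the server
            return image
    return None                    # give up after max_attempts failures

# Example with canned behavior: the first capture fails confirmation,
# the second one passes and is "sent".
frames = iter(["blurry", "good"])
sent = []
result = record_neck_image(lambda: next(frames),
                           lambda img: img == "good",
                           sent.append)
print(result, sent)  # good ['good']
```

The same loop covers both variants in the text: `confirm` can be the user tapping Yes/No, or the automatic image analysis.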
- In step S8, the control unit 21 causes the communication unit 23 to transmit the captured neck image 56.
- Specifically, the communication unit 23 is controlled by the control unit 21 to transmit the neck image 56 to the server 30.
- The server 30 may analyze the captured neck image 56 to determine whether the captured neck image 56 includes the imaging range designated by the range designation data D1.
- A known method can be used for this image analysis; machine learning such as deep learning may be used. If the server 30 determines that the captured neck image 56 does not include the imaging range indicated by the range designation data D1, indication data D8 pointing out that the neck image 56 does not include the imaging range may be transmitted from the server 30.
- When the indication data D8 is transmitted from the server 30 and received by the communication unit 23, the processing from step S3 onward may be executed again; that is, the control unit 21 may assist recapturing the neck image 56.
- The indication data D8 may include comments from the medical institution that reviewed the neck image 56, such as "the image is dark" or "the jugular vein is not visible".
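The indication data D8 could, for example, be carried as a small payload holding a flag plus the reviewer's comments; the JSON shape below is purely an assumption for illustration and is not specified by the disclosure:

```python
import json

# Hypothetical shape for indication data D8: a boolean flag plus free-text
# comments from the medical institution that reviewed the image.
def make_indication(range_included, comments):
    return json.dumps({"range_included": range_included,
                       "comments": list(comments)})

def needs_recapture(payload):
    """Client-side check: recapture is needed when the server flagged
    that the imaging range is not included."""
    data = json.loads(payload)
    return not data["range_included"]

d8 = make_indication(False, ["the image is dark",
                             "the jugular vein is not visible"])
print(needs_recapture(d8))  # True -> re-run the flow from step S3
```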
- As described above, the symptom recording device 20 indicates the subject's body position for jugular vein observation.
- The symptom recording device 20 may indicate the subject's clothing or hairstyle for jugular vein observation.
- The symptom recording device 20 acquires and displays video of the subject's neck area.
- The symptom recording device 20 indicates an imaging range including the vicinity of the subject's neck in conjunction with the video display.
- The symptom recording device 20 indicates or sets imaging conditions, which are the conditions for imaging the vicinity of the subject's neck.
- The symptom recording device 20 records the captured video. Therefore, according to the present embodiment, even a user without specialized knowledge can stably, that is, without failure, capture a video at home for confirming jugular venous distension.
- A mode may be provided in which the patient himself or herself captures the neck image 56.
- In this mode, the front-facing camera (in-camera) is used.
- The instructions may be given by voice. Capturing a posture image is unnecessary. Shooting starts automatically when the imaging range is captured, instead of being started by a button press.
- For example, the symptom recording device 20 is a smartphone. First, messages such as "Take an image", "Hold the smartphone with your right hand so that the screen is facing you", and "Press the button to start shooting" are displayed. Then, a message such as "Move your smartphone a little further", "Move your smartphone a little closer", "A little more to the right", or "A little more to the left" is displayed.
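The positional guidance messages above could be derived from where a detected neck region sits in the frame; the bounding-box representation, thresholds, and direction convention below are illustrative assumptions:

```python
def guidance(box, frame_w, frame_h, target_fill=0.5):
    """Map a detected region (x, y, w, h) to one of the guidance messages
    above, or None when the region is roughly centered and well sized.
    Thresholds and the left/right convention are hypothetical choices."""
    x, y, w, h = box
    cx = x + w / 2
    area, frame_area = w * h, frame_w * frame_h
    if area < target_fill * frame_area * 0.25:
        return "Move your smartphone a little closer"
    if area > target_fill * frame_area:
        return "Move your smartphone a little further"
    if cx < frame_w * 0.4:
        return "A little more to the right"   # direction convention assumed
    if cx > frame_w * 0.6:
        return "A little more to the left"
    return None

# Example on a 640x480 frame: a small region prompts the user to move closer.
print(guidance((300, 200, 60, 60), 640, 480))
# A centered, well-sized region needs no correction.
print(guidance((200, 100, 240, 240), 640, 480))
```

When `guidance` returns `None`, automatic shooting could begin, matching the button-free mode described above.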
- Step S1 may be omitted.
- The process of step S4, the process of step S5, or both may be omitted.
- As the neck image 56, a still image may be captured instead of a video.
- In addition to determining whether the neck image 56 includes the imaging range indicated by the range designation data D1, it may be further determined whether the neck image 56 satisfies the imaging conditions indicated by the condition designation data D4.
- When assisting recapture of the neck image 56, the control unit 21 may notify the user of the reason recapture is necessary, such as the imaging range, illuminance, body position, an obstruction such as the user's finger, or camera shake.
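A sketch of how such a recapture reason might be selected from simple frame statistics; the checks, messages, and threshold values are invented for illustration and are not specified by the disclosure:

```python
def recapture_reason(mean_brightness, motion_score, range_ok, obstruction):
    """Return the first applicable reason string, or None if the frame is
    acceptable. All thresholds are illustrative assumptions."""
    if not range_ok:
        return "The imaging range from below the ear to the collarbone is not visible"
    if mean_brightness < 60:        # on a 0-255 scale: frame too dark
        return "The image is too dark; increase the illuminance"
    if motion_score > 0.8:          # normalized blur/shake estimate
        return "Camera shake detected; hold the device steady"
    if obstruction:
        return "Something (such as a finger) is blocking the camera"
    return None

print(recapture_reason(120, 0.1, True, False))  # None -> no recapture needed
print(recapture_reason(30, 0.1, True, False))   # dark-image message
```

Returning the first failing check keeps the notification to one concrete, actionable reason per attempt.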
Abstract
Description
20 symptom recording device
21 control unit
22 storage unit
23 communication unit
24 input unit
25 output unit
30 server
40 network
50 screen
51 illustration
52 preview image
53 range indication message
54 graphic
55 condition indication message
56 neck image
57 confirmation message
58 recapture button
59 send button
Claims (17)
- A symptom recording device comprising a control unit that supports capture of a neck image for confirming the state of the jugular vein by, before the capture, performing control to display, as a preview image, an image obtained by the imaging element used for the capture, and performing control to output range designation data designating an imaging range including the neck, and that records the captured neck image.
- The symptom recording device according to claim 1, wherein the range designation data designates, as the imaging range, a range from below the ear to the clavicle.
- The symptom recording device according to claim 1 or 2, wherein the control unit performs control to output body position designation data designating a body position before the neck image is captured.
- The symptom recording device according to any one of claims 1 to 3, wherein the control unit performs control to output clothing designation data designating clothing, or hairstyle designation data designating a hairstyle, before the neck image is captured.
- The symptom recording device according to any one of claims 1 to 4, wherein the control unit performs control to output condition designation data designating imaging conditions including ambient brightness before the neck image is captured.
- The symptom recording device according to any one of claims 1 to 5, wherein the control unit sets imaging conditions including the sensitivity of the imaging element before the neck image is captured.
- The symptom recording device according to any one of claims 1 to 6, wherein the control unit further records date and time data indicating the date and time when the neck image was captured.
- The symptom recording device according to any one of claims 1 to 7, wherein the control unit further records angle data indicating the angle at which the neck image was captured.
- The symptom recording device according to any one of claims 1 to 8, wherein the control unit determines whether the image obtained by the imaging element includes the imaging range and, when determining that the imaging range is included, performs control to display a graphic representing the imaging range superimposed on the preview image.
- The symptom recording device according to any one of claims 1 to 8, wherein the control unit determines whether the image obtained by the imaging element includes the imaging range and, when determining that the imaging range is included, records the image obtained by the imaging element as the neck image.
- The symptom recording device according to any one of claims 1 to 8, wherein the control unit performs control to display the captured neck image and control to output question data asking whether the captured neck image includes the imaging range, and supports recapture of the neck image when an answer that the imaging range is not included is input.
- The symptom recording device according to any one of claims 1 to 11, wherein the control unit performs control to display the previously recorded neck image together with the captured neck image.
- The symptom recording device according to any one of claims 1 to 8, wherein the control unit analyzes the captured neck image to determine whether the captured neck image includes the imaging range, and supports recapture of the neck image when determining that the imaging range is not included.
- The symptom recording device according to any one of claims 1 to 8, further comprising a communication unit that is controlled by the control unit to transmit the neck image to a server.
- The symptom recording device according to claim 14, wherein the control unit supports recapture of the neck image when indication data pointing out that the transmitted neck image does not include the imaging range is transmitted from the server and received by the communication unit.
- A symptom recording method, wherein a control unit supports capture of a neck image for confirming the state of the jugular vein by, before the capture, performing control to display, as a preview image, an image obtained by the imaging element used for the capture, and performing control to output range designation data designating an imaging range including the neck,
and wherein the control unit records the captured neck image. - A program causing a computer to execute: a process of supporting capture of a neck image for confirming the state of the jugular vein by, before the capture, performing control to display, as a preview image, an image obtained by the imaging element used for the capture, and performing control to output range designation data designating an imaging range including the neck; and
a process of recording the captured neck image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280011978.0A CN116801815A (zh) | 2021-01-29 | 2022-01-19 | Symptom recording device, symptom recording method, and program
JP2022578287A JPWO2022163468A1 (ja) | 2021-01-29 | 2022-01-19 | |
US18/360,965 US20230368385A1 (en) | 2021-01-29 | 2023-07-28 | Symptom recording device, symptom recording method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021013718 | 2021-01-29 | ||
JP2021-013718 | 2021-01-29 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/360,965 Continuation US20230368385A1 (en) | 2021-01-29 | 2023-07-28 | Symptom recording device, symptom recording method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022163468A1 true WO2022163468A1 (ja) | 2022-08-04 |
Family
ID=82653350
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/001849 WO2022163468A1 (ja) | Symptom recording device, symptom recording method, and program |
Country Status (4)
Country | Link |
---|---|
US (1) | US20230368385A1 (ja) |
JP (1) | JPWO2022163468A1 (ja) |
CN (1) | CN116801815A (ja) |
WO (1) | WO2022163468A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016513517A (ja) * | 2013-03-14 | 2016-05-16 | Koninklijke Philips N.V. | Device and method for obtaining vital sign information of a subject |
JP2018514238A (ja) * | 2015-05-19 | 2018-06-07 | Google LLC | Visual central venous pressure measurement |
JP2020510487A (ja) * | 2017-03-13 | 2020-04-09 | Koninklijke Philips N.V. | Device, system, and method for measuring and processing physiological signals of a subject |
2022
- 2022-01-19 JP JP2022578287A patent/JPWO2022163468A1/ja active Pending
- 2022-01-19 CN CN202280011978.0A patent/CN116801815A/zh active Pending
- 2022-01-19 WO PCT/JP2022/001849 patent/WO2022163468A1/ja active Application Filing
2023
- 2023-07-28 US US18/360,965 patent/US20230368385A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US20230368385A1 (en) | 2023-11-16 |
CN116801815A (zh) | 2023-09-22 |
JPWO2022163468A1 (ja) | 2022-08-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102296396B1 (ko) | Apparatus and method for improving accuracy in non-contact body temperature measurement | |
KR20180120022A (ko) | Electronic device and method for displaying an image on the electronic device | |
US20160037067A1 (en) | Method for generating image and electronic device thereof | |
KR20170097860A (ko) | Electronic device for capturing an image using a display, and image capturing method | |
KR20160118001A (ko) | Imaging apparatus, method of controlling the same, and computer-readable recording medium | |
KR20170019823A (ko) | Image processing method and electronic device supporting the same | |
US20210022603A1 (en) | Techniques for providing computer assisted eye examinations | |
CN108632543B (zh) | Image display method and apparatus, storage medium, and electronic device | |
KR20200133923A (ko) | Method and system for monitoring related diseases through face recognition in a mobile communication terminal | |
KR20150099317A (ko) | Image processing method and apparatus | |
EP3104304A1 (en) | Electronic apparatus and method of extracting still images | |
JP2012254221A (ja) | Image processing apparatus, control method of image processing apparatus, and program | |
US10009545B2 (en) | Image processing apparatus and method of operating the same | |
JP2017049695A (ja) | Support system, support method, and program | |
WO2022163468A1 (ja) | Symptom recording device, symptom recording method, and program | |
JP2018067851A (ja) | Imaging control program, imaging control method, and information processing apparatus | |
WO2021182129A1 (ja) | Telemedicine system, telemedicine method, information processing device, and program | |
JP2014226515A (ja) | Diagnosis support system | |
JP2013255594A (ja) | Image processing apparatus and image processing method | |
JP2021197665A (ja) | Communication system and communication method | |
JP2021068432A (ja) | Test system, server, and test method | |
WO2021210362A1 (ja) | Symptom recording device, symptom recording method, and program | |
KR20240142446A (ko) | Guided self-capture of diagnostic images | |
CN113749614B (zh) | Skin detection method and device | |
US20230214996A1 (en) | Eyes measurement system, method and computer-readable medium thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22745689 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022578287 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280011978.0 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22745689 Country of ref document: EP Kind code of ref document: A1 |