WO2023195091A1 - Processing device, processing program, and processing method

Processing device, processing program, and processing method

Info

Publication number
WO2023195091A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
subject
photographing device
photographing
processing
Prior art date
Application number
PCT/JP2022/017151
Other languages
French (fr)
Japanese (ja)
Inventor
渉 髙橋
敦史 福田
賢 髙橋
Original Assignee
アイリス株式会社
Priority date
Filing date
Publication date
Application filed by アイリス株式会社
Priority to PCT/JP2022/017151
Publication of WO2023195091A1

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/267Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the respiratory tract, e.g. laryngoscopes, bronchoscopes

Definitions

  • The present disclosure relates to a processing device, a processing program, and a processing method that are connected to a photographing device configured to photograph an image of a subject.
  • Patent Document 1 describes an apparatus comprising: a camera head including a first housing that houses an image sensor; a signal transmission unit including a cable section that is connected to the camera head and transmits at least an image signal from the image sensor, and a second housing having a tubular shape through which the cable section is inserted; an annular member sandwiched between the first housing and the second housing and having an outer circumferential surface used to identify the type of the camera head; and an annular sealing member fixed by the first housing, the second housing, and the annular member.
  • An object of the present disclosure is to provide a processing device, a processing program, and a processing method that are connected to a photographing device configured to photograph an image of a subject and that can improve the usability of the photographing device.
  • According to one aspect of the present disclosure, there is provided a processing device that is communicably connected via a network to a photographing device configured to photograph an image in which a natural orifice of a subject is the photographic subject, the processing device including at least one processor, wherein the at least one processor acquires operation information related to an operation of the photographing device performed by an operator of the photographing device in order to photograph the image with the photographing device, and performs processing for outputting related information related to the photographing device based on the acquired operation information.
  • According to another aspect of the present disclosure, there is provided a processing program that causes at least one processor of a processing device, the processing device being communicably connected via a network to a photographing device configured to photograph an image in which a natural orifice of a subject is the photographic subject, to function as a processor that acquires operation information related to an operation of the photographing device performed by an operator in order to photograph the image with the photographing device and performs processing for outputting related information related to the photographing device based on the acquired operation information.
  • According to yet another aspect of the present disclosure, there is provided a processing method executed by at least one processor included in a processing device that is communicably connected via a network to a photographing device configured to photograph an image in which a natural orifice of a subject is the photographic subject, the processing method including a step of acquiring operation information related to an operation of the photographing device performed by an operator of the photographing device in order to photograph the image with the photographing device, and a step of outputting related information related to the photographing device based on the acquired operation information.
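  • As a non-limiting illustration of the processing described in the above aspects, the following Python sketch shows the two claimed steps in miniature: acquiring operation information about how the photographing device was operated, and outputting related information derived from it. The class, function names, and threshold values are hypothetical and are not taken from the disclosure.
```python
# Hypothetical sketch of the claimed processing: acquire operation information
# about how the photographing device was operated, then output related
# information derived from it. Names and thresholds are illustrative only.
from dataclasses import dataclass
from typing import Optional


@dataclass
class OperationInfo:
    device_id: str
    battery_voltage: float        # output of a voltage sensor (operation information)
    shot_count: int               # number of times an image was photographed
    attachment_count: int         # number of times the auxiliary tool was attached


def derive_related_information(op: OperationInfo) -> Optional[str]:
    """Map operation information to related information about the device."""
    if op.battery_voltage < 3.3:  # placeholder threshold, analogous to X1..X4
        return "Please charge the battery"
    if op.shot_count > 10_000:
        return "Camera module inspection is recommended"
    return None                   # normal state: nothing to output


def process(op: OperationInfo) -> None:
    related = derive_related_information(op)
    if related is not None:
        print(f"[{op.device_id}] related information: {related}")


if __name__ == "__main__":
    process(OperationInfo("dev-001", battery_voltage=3.1, shot_count=420, attachment_count=12))
```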
  • According to the present disclosure, it is possible to provide a processing device, a processing program, and a processing method that are connected to a photographing device configured to photograph an image of a subject and that can improve the usability of the photographing device.
  • FIG. 1 is a diagram showing a usage state of an imaging device 200 according to an embodiment of the present disclosure.
  • FIG. 2 is a diagram showing a usage state of the imaging device 200 according to an embodiment of the present disclosure.
  • FIG. 3 is a block diagram showing the configuration of the processing system 1 according to an embodiment of the present disclosure.
  • FIG. 4 is a block diagram showing the configuration of a display device 100 and a photographing device 200 according to an embodiment of the present disclosure.
  • FIG. 5 is a block diagram showing the configuration of a server device 300 according to an embodiment of the present disclosure.
  • FIG. 6A is a diagram conceptually showing an imaging device management table stored in server device 300 according to an embodiment of the present disclosure.
  • FIG. 6B is a diagram conceptually showing an operator management table stored in the server device 300 according to an embodiment of the present disclosure.
  • FIG. 6C is a diagram conceptually showing a target person management table stored in the server device 300 according to an embodiment of the present disclosure.
  • FIG. 6D is a diagram conceptually showing a related information table stored in the server device 300 according to an embodiment of the present disclosure.
  • FIG. 7 is a diagram showing a processing sequence executed between the display device 100, the imaging device 200, and the server device 300 according to an embodiment of the present disclosure.
  • FIG. 8 is a diagram showing a processing flow executed in the imaging device 200 according to an embodiment of the present disclosure.
  • FIG. 9A is a diagram showing a processing flow executed in the server device 300 according to an embodiment of the present disclosure.
  • FIG. 9B is a diagram illustrating a processing flow related to generation of a trained model according to an embodiment of the present disclosure.
  • FIG. 9C is a diagram showing a processing flow executed in the server device 300 according to an embodiment of the present disclosure.
  • FIG. 9D is a diagram showing a processing flow executed in the server device 300 according to an embodiment of the present disclosure.
  • FIG. 10A is a diagram illustrating an example of a screen displayed on the display device 100 according to an embodiment of the present disclosure.
  • FIG. 10B is a diagram illustrating an example of a screen displayed on the display device 100 according to an embodiment of the present disclosure.
  • FIG. 10C is a diagram illustrating an example of a screen displayed on the display device 100 according to an embodiment of the present disclosure.
  • FIG. 10D is a diagram illustrating an example of a screen displayed on the display device 100 according to an embodiment of the present disclosure.
  • FIG. 10E is a diagram illustrating an example of a screen displayed on the display device 100 according to an embodiment of the present disclosure.
  • the processing system 1 according to the present disclosure is mainly used to photograph the inside of a subject's oral cavity and obtain a subject image.
  • the processing system 1 is used to image the back of the throat of the oral cavity, specifically, the pharynx. Therefore, in the following, a case will be mainly described in which the processing system 1 according to the present disclosure is used to photograph the pharynx.
  • The pharynx is merely one example of the region to be imaged; naturally, the processing system 1 can also be suitably used for other regions in the oral cavity, such as the tonsils and larynx, and for other natural orifices, such as the external auditory canal, vagina, rectum, and nasal cavity.
  • The processing system 1 determines the possibility of contracting a predetermined disease from a subject image obtained by photographing a region including at least the pharynx in the oral cavity of the subject, and is used to diagnose that disease or to assist in its diagnosis.
  • An example of a disease determined by the processing system 1 is influenza.
  • the possibility of influenza infection is usually diagnosed by examining the subject's pharynx and tonsils and determining the presence or absence of findings such as follicles in the pharynx region.
  • Because the processing system 1 determines the possibility of contracting influenza and outputs the result, it becomes possible to diagnose influenza or to assist in its diagnosis. Note that determining the possibility of contracting influenza is only one example; the processing system 1 can be suitably used even when it is used solely to photograph a subject. Further, the processing system 1 can be suitably used for any disease that produces a difference in oral cavity findings. Such differences in findings are not limited to those discovered by a doctor or the like and whose existence is medically established; any difference that can be recognized by a person other than a doctor, or that can be detected by artificial intelligence or image recognition technology, can be suitably handled by the processing system 1.
  • Examples of such diseases include, in addition to influenza, infectious diseases such as streptococcal infection, adenovirus infection, EB virus infection, mycoplasma infection, hand-foot-and-mouth disease, herpangina, and candidiasis; diseases that exhibit vascular or mucosal disorders, such as arteriosclerosis, diabetes, and hypertension; and tumors such as tongue cancer, pharyngeal cancer, and oral cavity cancer.
  • subjects to be photographed by the photographing device 200 may include any human being, such as a patient, a subject, a diagnosis subject, and a healthy person.
  • an operator who holds the imaging device 200 and performs an imaging operation is not limited to a medical worker such as a doctor, nurse, or laboratory technician, but may include any person such as the subject himself/herself.
  • the processing system 1 according to the present disclosure is typically used in a medical institution. However, the application is not limited to this case, and may be used at any location such as the target's home, school, or workplace.
  • An institution is, for example, an institution associated with an operator, an imaging device, a display device, or a server device, and means a company, group, organization, or the like.
  • Examples include the medical institutions, departments, divisions, and the like to which medical personnel such as doctors, nurses, and laboratory technicians, who serve as operators, belong.
  • the subject only needs to include at least a part of the subject's natural orifice.
  • the disease to be determined may be any disease as long as it shows a difference in the findings at the natural orifice that is the subject.
  • the subject includes at least a portion of the oral cavity, particularly the pharynx or the vicinity of the pharynx, and the possibility of contracting influenza is determined as the disease.
  • the subject image may be one or more moving images or one or more still images.
  • the camera captures a live view image, and the captured live view image is displayed on the display 203. Thereafter, when the operator presses the photographing button, one or more still images are photographed by the camera, and the photographed images are displayed on the display 203.
  • Alternatively, when the operator presses the shooting button, shooting of a moving image is started, and the image being shot by the camera during that time is displayed on the display 203. Then, when the shooting button is pressed again, shooting of the moving image is finished.
  • In this way, various images such as through images, still images, and moving images are captured by the camera and displayed on the display; the term "subject image" does not refer only to a specific one of these images, and may include all images captured by the camera.
  • FIG. 1 is a diagram showing a usage state of an imaging device 200 according to an embodiment of the present disclosure.
  • the operator attaches the auxiliary tool 400 to the distal end of the imaging device 200 so as to cover it, and inserts the imaging device 200 together with the auxiliary tool 400 into the oral cavity 712 of the subject 700.
  • an operator (which may be the subject 700 himself or a person different from the subject 700) attaches the auxiliary tool 400 to the tip of the imaging device 200 so as to cover it.
  • the operator inserts the imaging device 200 with the auxiliary tool 400 attached into the oral cavity 712.
  • the tip of the auxiliary tool 400 passes through the incisors 711 and is inserted to the vicinity of the soft palate 713.
  • the imaging device 200 is similarly inserted up to the vicinity of the soft palate 713.
  • the tongue 714 is pushed downward by the auxiliary tool 400 (functioning as a tongue depressor) and the movement of the tongue 714 is restricted. This allows the operator to secure a good field of view for the imaging device 200 and to take good pictures of the pharynx 715 located in front of the imaging device 200.
  • the photographed subject image (typically, an image including the pharynx 715) is transmitted from the photographing device 200 to the server device 300 that is communicably connected via a wired or wireless network.
  • The processor of the server device 300 that receives the subject image executes the processing program stored in the memory, thereby determining the possibility of contracting a predetermined disease.
  • the results are then sent to the display device 100 and output to a display or the like via the output interface of the display device 100.
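  • The following is a minimal, hypothetical sketch of this server-side flow: a received subject image is passed to a disease-likelihood estimator and the result is prepared for output on the display device 100. The estimator, its return value, and all names are placeholders; the disclosure does not specify a particular model or interface.
```python
# Hypothetical sketch of the server-side flow described above: receive a
# subject image, estimate the likelihood of a predetermined disease, and
# return the result for output on the display device. The estimator and its
# fixed return value are placeholders for a trained model (cf. FIG. 9B).
from dataclasses import dataclass


@dataclass
class DeterminationResult:
    subject_id: str
    positive_rate: float  # e.g. estimated likelihood of influenza


def estimate_positive_rate(image_bytes: bytes) -> float:
    """Placeholder for inference with a trained model over the pharynx image."""
    # A real system would decode the image and run a trained classifier here;
    # a fixed value is returned purely for illustration.
    return 0.73


def handle_subject_image(subject_id: str, image_bytes: bytes) -> DeterminationResult:
    rate = estimate_positive_rate(image_bytes)
    result = DeterminationResult(subject_id=subject_id, positive_rate=rate)
    # The result would then be transmitted to the display device 100 and
    # shown via its output interface.
    return result


if __name__ == "__main__":
    print(handle_subject_image("subject-042", b"\x00" * 16))
```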
  • FIG. 2 is a diagram showing a usage state of the imaging device 200 according to an embodiment of the present disclosure. Specifically, FIG. 2 is a diagram showing a state in which the operator 600 grips the imaging device 200.
  • the imaging device 200 is composed of a main body 201, a grip 202, and a display 203 from the side inserted into the oral cavity.
  • the main body 201 and the grip 202 are formed into a substantially columnar shape with a predetermined length along the insertion direction H into the oral cavity.
  • the display 203 is arranged on the side of the grip 202 opposite to the main body 201 side.
  • the photographing device 200 is formed into a generally columnar shape as a whole, and is held by the operator 600 in a manner similar to holding a pencil. That is, since the display panel of the display 203 faces toward the operator 600 when in use, it is possible to easily handle the photographing device 200 while checking the subject image photographed by the photographing device 200 in real time.
  • the photographing button 220 is arranged on the top side of the grip. Therefore, when the operator 600 holds it, the operator 600 can easily press the shooting button 220 with his/her index finger or the like.
  • FIG. 3 is a schematic diagram of processing system 1 according to an embodiment of the present disclosure.
  • the processing system 1 includes a display device 100, a photographing device 200, and a server device 300, and each device is communicably connected via a wired or wireless network.
  • The display device 100 is used to input the subject information, interview information, finding information, and the like required for the processing in the server device 300, and receives from the server device 300 and outputs the related information related to the imaging device 200 and the result of determining the possibility of contracting a predetermined disease.
  • The imaging device 200 is inserted at its tip into the oral cavity of the subject, photographs the inside of the oral cavity, particularly the pharynx, and transmits the photographed subject image to the server device 300 via a wired or wireless network. Further, the photographing device 200 transmits to the server device 300 operation information regarding an operation performed by the operator in connection with the photographing, or information used to generate that operation information.
  • the server device 300 receives the operation information transmitted by the imaging device 200 or the information used to generate the operation information, and outputs related information based on the operation information. Further, the server device 300 receives and processes a subject image photographed by the photographing device 200. Furthermore, the server device 300 determines the possibility of contracting a predetermined disease based on the received subject image, medical interview information, and finding information, and transmits the result to the display device 100 .
  • In the present disclosure, the processing device means the display device 100, the server device 300, or a combination thereof. That is, although the case where the server device 300 functions as the processing device is described below, the display device 100 can also function as the processing device in the same way. Furthermore, in the present disclosure, the storage and processing performed by the processing device may be distributed to other terminal devices, other server devices, and the like. In other words, the processing device is not limited to one configured with a single housing, and may include the display device 100, the photographing device 200, the server device 300, another terminal device, another server device, or a combination thereof.
  • the processing system 1 can include one or more display devices 100, photographing devices 200, and server devices 300.
  • FIG. 4 is a block diagram showing the configuration of a display device 100 and a photographing device 200 according to an embodiment of the present disclosure. Specifically, FIG. 4 is a diagram specifically showing the configurations of the display device 100 and the photographing device 200 in the processing system 1.
  • the display device 100 includes a processor 111, a memory 112, an input interface 113, an output interface 114, and a communication interface 115.
  • the photographing device 200 also includes a camera 211, a light source 212, a processor 213, a memory 214, an output interface 215, an input interface 210, a communication interface 216, a sensor 217, and a battery 218. Each of these components is electrically connected to each other via control lines and data lines.
  • Note that the display device 100 and the photographing device 200 do not need to include all of the components shown in FIG. 4; some of them can be omitted, and other components can also be included.
  • the display device 100 or the photographing device 200 can include a battery or the like for driving each component.
  • the display device 100 and the photographing device 200 only need to be communicably connected through a wired or wireless network, as described with reference to FIG. 3, and there is no need for the two to be configured to be able to directly communicate.
  • The processor 111 functions as a control unit that controls the other components of the processing system 1 based on a processing program stored in the memory 112. Based on the processing program stored in the memory 112, the processor 111 inputs subject information, interview information, finding information, and the like, and outputs related information received from the server device 300. Specifically, the processor 111 executes, based on the processing program stored in the memory 112, processing such as: a process of accepting input of subject information related to a subject by the operator or the subject himself/herself via the input interface 113; a process of transmitting the accepted subject information to the server device 300 via the communication interface 115; a process of accepting input of interview information and finding information of the subject by the operator or the subject via the input interface 113; a process of transmitting the accepted interview information and finding information to the server device 300 together with the subject information; a process of accepting selection of a subject via the input interface 113 and transmitting a request to determine the possibility of the selected subject contracting a predetermined disease; and a process of receiving the determination result and the subject image from the server device 300 and outputting them via the output interface 114.
  • the processor 111 is mainly composed of one or more CPUs, but may be appropriately combined with GPUs, FPGAs, etc.
  • the memory 112 is comprised of RAM, ROM, nonvolatile memory, HDD, etc., and functions as a storage unit.
  • the memory 112 stores instructions for various controls of the processing system 1 according to the present embodiment as a processing program.
  • Specifically, the memory 112 stores processing programs for the processor 111 to execute, such as: a process of accepting input of subject information related to a subject by the operator or the subject himself/herself via the input interface 113; a process of transmitting the accepted subject information to the server device 300 via the communication interface 115; a process of accepting input of interview information and finding information of the subject by the operator or the subject via the input interface 113; a process of transmitting the accepted interview information and finding information to the server device 300 together with the subject information; a process of accepting selection of a subject via the input interface 113 and transmitting a request to determine the possibility of the selected subject contracting a predetermined disease; and a process of outputting the determination result and related information received from the server device 300.
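  • The display-device-side processing listed above can be pictured with the following hypothetical sketch, in which subject information, interview information, and finding information are registered with the server and a determination request is issued and its result output. The transport layer is stubbed out, and all class and method names are illustrative assumptions rather than anything defined in the disclosure.
```python
# Hypothetical sketch of the display-device-side processing: register subject
# information, interview information, and finding information with the server,
# then request a determination and output the result. The server is stubbed.
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class SubjectRecord:
    subject_id: str
    interview_info: Dict[str, str] = field(default_factory=dict)
    finding_info: Dict[str, str] = field(default_factory=dict)


class DisplayDeviceClient:
    def __init__(self, server):
        self.server = server  # stand-in for the communication interface 115

    def register_subject(self, record: SubjectRecord) -> None:
        # "process of transmitting subject information to the server device 300"
        self.server.store_subject(record)

    def request_determination(self, subject_id: str) -> None:
        # "transmitting a request to determine the possibility of a predetermined disease"
        result = self.server.determine(subject_id)
        # "process of outputting the determination result via the output interface 114"
        print(f"Subject {subject_id}: positive rate {result:.0%}")


class FakeServer:
    """Minimal stand-in for the server device 300, for illustration only."""
    def __init__(self):
        self.subjects = {}

    def store_subject(self, record: SubjectRecord) -> None:
        self.subjects[record.subject_id] = record

    def determine(self, subject_id: str) -> float:
        return 0.42  # placeholder determination result


if __name__ == "__main__":
    client = DisplayDeviceClient(FakeServer())
    client.register_subject(SubjectRecord("subject-001", {"fever": "38.5C"}, {"follicles": "present"}))
    client.request_determination("subject-001")
```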
  • the memory 112 also stores subject information, subject images, interview information, finding information, related information, and the like.
  • the input interface 113 functions as an input unit that receives an instruction input from an operator to the display device 100.
  • Examples of the input interface 113 include physical key buttons such as a "confirm button" for making various selections, a "back/cancel button" for returning to the previous screen or canceling an entered confirmation operation, a cross-key button for moving a pointer output to the output interface 114, an on/off key for turning the power of the display device 100 on and off, and character input key buttons for inputting various characters. Note that it is also possible to use, as the input interface 113, a touch panel that is provided so as to be superimposed on the display functioning as the output interface 114 and that has an input coordinate system corresponding to the display coordinate system of the display.
  • icons corresponding to the physical keys are displayed on the display, and the operator inputs instructions via the touch panel to select each icon.
  • the detection method of the subject's instruction input using the touch panel may be any method such as a capacitance method or a resistive film method.
  • a mouse, a keyboard, etc. can also be used as the input interface 113.
  • the input interface 113 does not always need to be physically provided in the display device 100, and may be connected via a wired or wireless network as necessary.
  • the output interface 114 functions as an output unit for outputting information such as determination results and related information received from the server device 300.
  • An example of the output interface 114 is a display including a liquid crystal panel, an organic EL display, a plasma display, or the like.
  • the display device 100 itself does not necessarily need to be equipped with a display.
  • an interface for connecting to a display or the like that can be connected to the display device 100 via a wired or wireless network can function as the output interface 114 that outputs display data to the display or the like.
  • the communication interface 115 functions as a communication unit for transmitting and receiving subject information, interview information, subject images, finding information, related information, etc. to and from the server device 300 connected via a wired or wireless network.
  • Examples of the communication interface 115 include wired communication connectors such as USB and SCSI, wireless communication transmitting and receiving devices such as wireless LAN, Bluetooth (registered trademark), and infrared devices, and various connection terminals for printed circuit boards and flexible circuit boards.
  • the camera 211 functions as a photographing unit that detects reflected light reflected on the oral cavity of the subject and generates a subject image.
  • the camera 211 includes, for example, a CMOS image sensor to detect the light, and a lens system and a drive system to realize a desired function.
  • the image sensor is not limited to a CMOS image sensor, but other sensors such as a CCD image sensor can also be used.
  • the camera 211 can have an autofocus function, and is preferably set to focus on a specific portion, for example, in front of a lens. Further, the camera 211 may have a zoom function, and is preferably set to take images at an appropriate magnification depending on the size of the pharynx or influenza follicle.
  • the light source 212 is driven by instructions from the processor 213 of the imaging device 200, and functions as a light source section for irradiating light into the oral cavity.
  • Light source 212 includes one or more light sources.
  • the light source 212 includes one or more LEDs, and each LED emits light having a predetermined frequency band toward the oral cavity.
  • the light source 212 uses light having a desired band among an ultraviolet light band, a visible light band, and an infrared light band, or a combination thereof. Note that when determining the possibility of contracting influenza in the display device 100, it is preferable to use light in the visible light band.
  • The processor 213 functions as a control unit that controls the other components of the imaging device 200 based on the processing program stored in the memory 214. Based on the processing program stored in the memory 214, the processor 213 executes processing such as: a process of receiving, via the communication interface 216, subject information of subjects who have not yet been photographed from the server device 300 that is communicably connected to the photographing device 200 via a network; a process of selecting the subject information of the subject to be photographed from a list of subjects who have not yet been photographed; a process of photographing a subject image and transmitting it to the server device 300; a process of detecting, with the sensor 217, a state that changes due to an operation of the imaging device 200 by the operator; and a process of transmitting to the server device 300 operation information resulting from an operation of the imaging device 200 by the operator, or information used to generate that operation information.
  • the memory 214 is composed of RAM, ROM, nonvolatile memory, HDD, etc., and functions as a storage unit.
  • The memory 214 stores instructions for various controls of the processing system 1 according to the present embodiment as a processing program. Specifically, the memory 214 stores processing programs for the processor 213 to execute, such as: a process of receiving, via the communication interface 216, subject information of subjects who have not yet been photographed from the server device 300 that is communicably connected to the photographing device 200 via a network; a process of selecting the subject information of the subject to be photographed from a list of subjects who have not yet been photographed; a process of photographing a subject image and transmitting it to the server device 300; a process of detecting, with the sensor 217, a state that changes due to an operation of the imaging device operated by the operator; and a process of transmitting to the server device 300 operation information resulting from an operation of the imaging device 200 by the operator, or information used to generate that operation information.
  • the output interface 215 functions as an output unit for outputting the subject image photographed by the photographing device 200, subject information, and the like.
  • An example of the output interface 215 is the display 203, but the output interface 215 is not limited to this and may be configured from another liquid crystal panel, organic EL display, plasma display, or the like. Further, the display 203 need not be provided; for example, an interface for connecting to a display or the like that can be connected to the photographing device 200 via a wired or wireless network can also function as the output interface 215 that outputs display data to that display or the like.
  • The input interface 210 functions as an input unit that accepts instruction inputs from the operator to the photographing device 200.
  • Examples of the input interface 210 include a "shooting button" for instructing the photographing device 200 to start and end shooting, a "power button" for turning the power of the photographing device 200 on and off, and a "confirm button" for making various selections. These buttons and keys may be physically provided, or they may be displayed as icons on the output interface 215 and made selectable via a touch panel or the like that is superimposed on the output interface 215 and arranged as the input interface 210.
  • the detection method of the subject's instruction input using the touch panel may be any method such as a capacitance method or a resistive film method.
  • the communication interface 216 functions as a communication unit for transmitting and receiving information to and from the server device 300 and/or other devices.
  • Examples of the communication interface 216 include wired communication connectors such as USB and SCSI, wireless communication transmitting and receiving devices such as wireless LAN, Bluetooth (registered trademark), and infrared devices, and various connection terminals for printed circuit boards and flexible circuit boards.
  • the sensor 217 functions as a detection unit for detecting the state of each component making up the imaging device 200.
  • Examples of the sensor 217 include a voltage sensor, a current sensor, a temperature sensor, a GPS, an acceleration sensor, or a combination thereof.
  • a detected value (for example, voltage, current, temperature, position information, or a combination thereof) detected by the sensor 217 is transmitted to the server device 300 as operation information or one of the pieces of information used to generate the operation information.
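  • As a rough illustration of how detection values from the sensor 217 might be packaged as operation information (or as information used to generate it) and transmitted to the server device 300, consider the following hypothetical sketch; the field names, readings, and serialization format are assumptions, not part of the disclosure.
```python
# Hypothetical sketch: package sensor readings (voltage, current, temperature,
# position) as an operation-information payload for the server device 300.
import json
import time
from dataclasses import dataclass, asdict


@dataclass
class SensorSnapshot:
    device_id: str
    timestamp: float
    battery_voltage: float       # voltage sensor connected to the battery 218
    light_source_current: float  # current sensor connected to the light source 212
    temperature_c: float         # temperature sensor
    position: tuple              # e.g. GPS latitude/longitude


def read_sensors(device_id: str) -> SensorSnapshot:
    # Placeholder readings; a real device would query the sensor 217 here.
    return SensorSnapshot(device_id, time.time(), 3.9, 0.35, 36.2, (35.68, 139.77))


def to_operation_info_payload(snapshot: SensorSnapshot) -> str:
    """Serialize the snapshot for transmission via the communication interface 216."""
    return json.dumps(asdict(snapshot))


if __name__ == "__main__":
    print(to_operation_info_payload(read_sensors("dev-001")))
```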
  • the battery 218 functions as a power supply unit that supplies power to drive each component of the imaging device 200.
  • The battery 218 is charged by receiving power when connected to an external power source, and starts discharging to drive each component when the photographing device 200 is turned on by pressing the power button or the like.
  • FIG. 5 is a block diagram showing the configuration of server device 300 according to an embodiment of the present disclosure.
  • server device 300 includes a memory 311, a processor 312, and a communication interface 313. Each of these components is electrically connected to each other via control lines and data lines. Note that the server device 300 does not need to include all of the components shown in FIG. 5; it is possible to omit some of them, or to add other components. For example, it is also possible to configure the server device 300 by connecting it with another server device. Moreover, it is also possible to connect with other database devices and configure the server device 300 as an integrated unit.
  • the memory 311 is composed of RAM, ROM, nonvolatile memory, HDD, etc., and functions as a storage unit.
  • The memory 311 stores instructions for various controls of the processing system 1 according to the present embodiment as a processing program. Specifically, the memory 311 stores processing programs for the processor 312 to execute, such as: a process of acquiring operation information related to an operation of the photographing device 200 performed by the operator in order to photograph an image of a subject with the photographing device 200; a process of outputting related information related to the photographing device 200, based on the acquired operation information, to at least one of the display device 100 and the photographing device 200; a process of receiving, from the photographing device 200 via the communication interface 313, a subject image photographed by the photographing device 200; a process of receiving, from the display device 100 via the communication interface 313, the interview information and finding information input on the display device 100 and storing them in the subject management table in association with the subject ID information; and a process of, when a determination request is received from the display device 100, determining the possibility of contracting a predetermined disease and transmitting the determination result to the display device 100.
  • The memory 311 also stores various information in an imaging device management table (FIG. 6A), an operator management table (FIG. 6B), a subject management table (FIG. 6C), a related information table (FIG. 6D), and the like.
  • the processor 312 functions as a control unit that controls other components of the server device 300 based on the processing program stored in the memory 311. Based on the processing program stored in the memory 311, the processor 312 performs processing for determining the possibility of contracting a predetermined disease and processing for outputting related information.
  • Specifically, the processor 312 executes, based on the processing program stored in the memory 311, processing such as: a process of acquiring operation information related to an operation of the photographing device 200 performed by the operator in order to photograph an image of a subject with the photographing device 200; a process of outputting related information related to the photographing device 200, based on the acquired operation information, to at least one of the display device 100 and the photographing device 200; a process of receiving, from the photographing device 200 via the communication interface 313, a subject image photographed by the photographing device 200; and a process of, when a request to determine the possibility of a predetermined disease for a selected subject is received from the display device 100 via the communication interface 313, reading out the subject image associated with that subject from the subject management table, determining the possibility of contracting the disease, and transmitting the determination result to the display device 100.
  • the processor 312 is mainly composed of one or more CPUs, but may be appropriately combined with a GPU, FPGA, or the like.
  • the communication interface 313 functions as a communication unit for transmitting and receiving information to and from the display device 100, the photographing device 200, and/or other devices.
  • Examples of the communication interface 313 include wired communication connectors such as USB and SCSI, wireless communication transmitting and receiving devices such as wireless LAN, Bluetooth (registered trademark), and infrared devices, and various connection terminals for printed circuit boards and flexible circuit boards.
  • FIG. 6A is a diagram conceptually showing an imaging device management table stored in the server device 300 according to an embodiment of the present disclosure.
  • the information stored in the imaging device management table is updated and stored as needed according to the progress of processing by the processor 312 of the server device 300.
  • the photographing device management table stores operation information, related information, software (SW) version information, parts information, etc. in association with photographing device ID information.
  • “Photographing device ID information” is information unique to each photographing device 200 and used to identify each photographing device 200.
  • an arbitrary character string or code may be assigned by the server device 300, or information such as a manufacturing number or product number may be used.
  • “Operation information” is information related to an operation by an operator for photographing an image with each photographing device 200.
  • An example of such information is an output value from the sensor 217 such as a voltage sensor, current sensor, or temperature sensor (it may be the output value itself from the sensor 217, or data obtained by processing the output value).
  • Other examples include the number of times the auxiliary tool 400 is attached, the number of times an image is taken, the number of times the battery 218 is charged, the voltage of the battery 218 when fully charged, the accumulated total charging time of the battery 218, and the number of times the battery 218 is overcharged.
  • Such operation information is obtained by being received from the photographing device 200 or generated by the server device 300 based on information used to generate the operation information received from the photographing device 200.
  • Related information is information generated based on operation information, and is information related to the imaging device 200. Examples include information on how to use the imaging device 200, information that promotes at least one of the replacement, inspection, and calibration of various components (parts, etc.) that make up the imaging device 200, and information that promotes the purchase of the auxiliary tool 400. Examples include information on promotion, information on fees for determining morbidity for a predetermined disease, and the like. Specifically, the information regarding how to use the photographing device 200 includes information such as tips for using the photographing device 200, a manual, methods for improving operation, promotion of charging the battery 218, and estimated time for charging completion.
  • Information that promotes the replacement, inspection, and/or calibration of various components includes notifications regarding the condition of the battery 218, the LED used in the light source 212, the CMOS sensor used in the camera 211, the various lenses installed in the photographing device 200, the light guide that transmits the light emitted from the light source 212 to the outside, the physical keys, the touch panel, the software built into the photographing device 200, and the like, as well as notifications of the timing of replacement, inspection, or calibration and notifications of the methods of replacement, inspection, or calibration.
  • Information regarding the judgment fee includes notifications of judgment fees that have occurred so far, notifications of judgment fees that are predicted to occur in the future, notifications of payment methods for the judgment fees, and the like. Examples of information that promotes the purchase of the auxiliary tool 400 include information indicating the number of remaining auxiliary tools 400 in stock, information indicating the purchasing method, and the like.
  • SW version information is information for specifying the version of the processing program currently stored in each imaging device 200. The information is updated every time a new version of the processing program is installed.
  • Parts information is information for specifying each of the various components (parts, etc.) installed in each imaging device 200. Such information may be any information that can identify each of the main components, but typically includes a serial number. The parts information may be output together with the related information, or, when only parts belonging to a particular manufacturing lot need to be repaired or replaced, it is used to identify the photographing devices 200 in which those parts are installed.
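  • A single record of the imaging device management table of FIG. 6A might be modeled as in the following hypothetical sketch, which groups operation information, related information, SW version information, and parts information under the photographing device ID information; the concrete field names and values are illustrative only.
```python
# Hypothetical sketch of one record in the imaging device management table
# (FIG. 6A). Fields mirror the kinds of values described above; the schema
# itself is an assumption, not defined by the disclosure.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ImagingDeviceRecord:
    device_id: str                        # photographing device ID information
    operation_info: Dict[str, float] = field(default_factory=dict)
    related_info: List[str] = field(default_factory=list)
    sw_version: str = "1.0.0"             # SW version information
    parts_info: Dict[str, str] = field(default_factory=dict)  # component -> serial number


# Example record with placeholder values.
record = ImagingDeviceRecord(
    device_id="dev-001",
    operation_info={"battery_voltage": 3.8, "charge_count": 52, "attachment_count": 310},
    related_info=["Please charge the battery"],
    sw_version="2.3.1",
    parts_info={"camera": "SN-123456", "light_source": "SN-654321"},
)
print(record)
```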
  • FIG. 6B is a diagram conceptually showing an operator management table stored in the server device 300 according to an embodiment of the present disclosure.
  • the information stored in the operator management table is updated and stored as needed according to the progress of processing by the processor 312 of the server device 300.
  • institution ID information is information specific to the institution to which each operator belongs, and is information for identifying each institution. For example, when the operator is a medical worker, a code or name is used that identifies the medical institution, department, department, etc. to which the medical worker belongs.
  • the operator ID information is information unique to each operator and used to identify each operator.
  • Display device ID information is information specific to each display device 100 and is information for identifying each display device 100. Based on the display device ID information, it is possible to specify the display device 100 held by each institution or each operator.
  • Photographing device ID information is information specific to the photographing device 200 and is information for identifying each photographing device 200. Based on the imaging device ID information, it is possible to specify the imaging device 200 managed by each institution or each operator.
  • FIG. 6C is a diagram conceptually showing a target person management table stored in the server device 300 according to an embodiment of the present disclosure.
  • the information stored in the target person management table is updated and stored as needed according to the progress of processing by the processor 312 of the server device 300.
  • the subject management table stores interview information, finding information, subject images, determination result information, etc. in association with subject ID information.
  • “Target ID information” is information unique to each target person and used to identify each target person. The subject ID information is generated every time a new subject is registered by the operator or the subject himself/herself.
  • "Interview information” is information inputted by, for example, an operator or a subject, and is information used as a reference for diagnosis by a doctor or the like, such as the subject's medical history and symptoms.
  • “Finding information” is information input by an operator such as a doctor, and indicates a situation that differs from normal as obtained by various examinations such as visual inspection, interview, palpation, and auscultation, as well as tests to assist in diagnosis.
  • the “subject image” is information indicating the image data itself or the storage location of the subject image used to determine the possibility of contracting a predetermined disease.
  • “Determination result information” is information indicating the determination result of the possibility of contracting a predetermined disease such as influenza based on the determination image.
  • An example of such determination result information is the positive rate for influenza. However, the information is not limited to the positive rate; any information that indicates the possibility, such as information specifying whether the subject is positive or negative, may be used. Further, the determination result need not be a specific numerical value, and may be in any format, such as a classification according to how high the positive rate is or a classification indicating whether the result is positive or negative.
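  • The different result formats mentioned above (a positive rate, a positive/negative indication, or a coarse classification) can be related as in the following hypothetical sketch; the cutoff values are placeholders and are not specified in the disclosure.
```python
# Hypothetical sketch of the determination-result formats mentioned above:
# a raw positive rate, a positive/negative flag, and a coarse classification.
def as_positive_negative(positive_rate: float, cutoff: float = 0.5) -> str:
    return "positive" if positive_rate >= cutoff else "negative"


def as_classification(positive_rate: float) -> str:
    """Bucket the positive rate into a coarse class instead of a numeric value."""
    if positive_rate >= 0.8:
        return "high"
    if positive_rate >= 0.4:
        return "medium"
    return "low"


if __name__ == "__main__":
    rate = 0.73
    print(rate, as_positive_negative(rate), as_classification(rate))
```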
  • FIG. 6D is a diagram conceptually showing a related information table stored in the server device 300 according to an embodiment of the present disclosure.
  • The information stored in the related information table is updated and stored as needed as it is added through processing by the processor 312 of the server device 300.
  • The related information table stores, for each type of related information to be output, the related information in association with the conditions on the operation information under which it is output.
  • the types of related information shown in FIG. 6D are just examples, and of course it is possible to add or subtract as appropriate depending on the related information that is desired to be output.
  • For example, when the output value of the "voltage sensor", which is one piece of operation information, is greater than or equal to X1 and less than X2, it is regarded as a normal value and nothing is output. When the value is greater than or equal to X2 and less than X3, the message "The battery is low" is output. When the value is greater than or equal to X3 and less than X4, the message "Please charge the battery" is output.
  • FIG. 7 is a diagram showing a processing sequence executed between display device 100, photographing device 200, and server device 300 according to an embodiment of the present disclosure.
  • S11 to S21 show processing for outputting related information mainly based on information received from the photographing device 200, and S41 to S46 show processing for outputting related information mainly based on information received from the display device 100.
  • Although a case where a single display device 100 and a single photographing device 200 are connected is described below, a case where a plurality of display devices 100 and a plurality of photographing devices 200 are connected can naturally be processed in the same way.
  • First, the power of the photographing device 200 is turned on by pressing the power button or the like, and the photographing device 200 is activated (S11). Then, based on the operator's instruction input received via the input interface 210, the photographing device 200 selects the subject information of the subject to be photographed from among the subject information of subjects who have not yet been photographed, received from the server device 300 (S12). Next, the photographing device 200 determines whether or not the auxiliary tool 400 is attached, and if it is not yet attached, outputs an attachment prompt via the output interface 215 to encourage the operator to attach it (S13). Note that this display is just an example, and attachment may instead be prompted by sound, flashing light, vibration, or the like.
  • When attachment of the auxiliary tool 400 is detected (S14), the photographing device 200 photographs a subject image based on the operator's instruction input received at the input interface 210 (S15). Note that although the attachment of the auxiliary tool 400 is detected here, this process itself may be skipped. Further, instead of detecting whether the auxiliary tool 400 is attached, a confirmation display for the operator to confirm that the auxiliary tool 400 is attached may be output via the output interface 215, and the operator himself/herself may input that the auxiliary tool 400 is attached by performing a predetermined operation (for example, a tap operation) in response to the confirmation display.
  • The attachment of the auxiliary tool 400 may be detected, for example, by using the camera 211 to detect an image specific to the auxiliary tool 400, or by a pre-installed switch that is pressed when the auxiliary tool 400 is attached. Furthermore, these methods may be used to detect not only attachment but also removal of the auxiliary tool 400.
  • the photographing device 200 transmits the photographed subject image (T11) to the server device 300 along with the subject ID information of the first subject via the communication interface 216.
  • the server device 300 receives the subject image via the communication interface 313, it stores the received subject image in the subject management table in association with the subject ID information received together.
  • the photographing device 200 acquires various operation information or information used to generate the operation information.
  • the voltage of the battery 218 is detected at predetermined intervals by a voltage sensor connected to the battery 218. Further, the number of times each physical key is pressed and the number of times the touch panel is operated are counted when the power button is pressed in S11, an instruction is input for selecting a subject in S12, and an instruction is input for photographing in S15. Further, in S15, the light source 212 is turned on in conjunction with photographing, so that the number of times the light source 212 is turned on is counted, and the lighting time is measured.
  • Furthermore, the value of the current flowing through the light source 212 is detected using a current sensor connected to the light source 212. When the attachment of the auxiliary tool 400 is detected in S14, the number of times the auxiliary tool 400 is attached is counted. These are only examples, and other operation information or information used to generate operation information may of course be acquired. Details of the acquired operation information, or of the information used to generate it, will be described later.
  • After reading the operation information acquired as described above, or the information used to generate the operation information, the photographing device 200 transmits the read information (T12) to the server device 300 together with the photographing device ID information (S17).
  • The server device 300 refers to the operation information in the photographing device management table based on the received photographing device ID information, and updates and stores the operation information or the information used to generate it (S18). Then, the server device 300 generates related information by referring to the related information table based on the updated and stored operation information and identifying the related information whose conditions on the operation information are satisfied (S19).
  • the server device 300 stores the generated related information in the imaging device management table and outputs it to the display device 100 and the imaging device 200 (T13).
  • the display device 100 and the photographing device 200 that have received the related information output the received related information to a display or the like via the output interface 114 and the output interface 215 (S20 and S21).
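  • Steps S17 to S21 and transmissions T12 and T13 described above can be summarized in the following hypothetical sketch, in which received operation information updates the photographing device management table, related information whose conditions are satisfied is generated, and the result is output toward the display device 100 and the photographing device 200; all names, fields, and conditions are illustrative assumptions.
```python
# Hypothetical end-to-end sketch of S17-S21 / T12-T13: the photographing
# device sends operation information with its device ID, the server updates
# the photographing device management table, generates related information,
# and the result is output to the display device and the photographing device.
from typing import Dict, List

device_management_table: Dict[str, Dict[str, float]] = {}


def generate_related_info(operation_info: Dict[str, float]) -> List[str]:
    """Identify related information whose condition on the operation information holds."""
    related: List[str] = []
    if operation_info.get("attachment_count", 0) >= 500:    # placeholder condition
        related.append("Auxiliary tool stock may be low; consider purchasing more")
    if operation_info.get("light_on_count", 0) >= 10_000:   # placeholder condition
        related.append("Inspection of the light source is recommended")
    return related


def on_operation_info_received(device_id: str, operation_info: Dict[str, float]) -> None:
    # S18: update and store the operation information for this device
    device_management_table.setdefault(device_id, {}).update(operation_info)
    # S19: generate related information from the stored operation information
    for message in generate_related_info(device_management_table[device_id]):
        # T13 / S20, S21: output to the display device and the photographing device
        print(f"to display device 100: {message}")
        print(f"to photographing device 200 ({device_id}): {message}")


if __name__ == "__main__":
    on_operation_info_received("dev-001", {"attachment_count": 512, "light_on_count": 480})
```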
  • the display device 100 receives selection of the subject information of the subject to be diagnosed on the screen on which the list of subject information of the subject is displayed (S41).
  • The display device 100 generates interview information and finding information of the subject based on input by the subject or the operator, and stores the generated information in the subject management table of the server device 300 in advance.
  • the display device 100 refers to the subject management table, reads out the subject ID information of the subject, and transmits it to the server device 300 together with a request for determining the possibility of contracting a predetermined disease (T41).
  • When the server device 300 receives the determination request, it refers to the subject management table based on the subject ID information received together with the request, and reads out the subject image, interview information, and finding information associated with that subject ID information. Then, the server device 300 determines the possibility of contracting a predetermined disease based on the read subject image, and stores the determination result in the subject management table (S42). The server device 300 transmits the stored determination result (T42) to the display device 100. The display device 100 that has received the determination result displays it on the display via the output interface 114 (S43).
  • The server device 300 also acquires various operation information in this series of processes. For example, when the possibility of contracting a predetermined disease is determined in S42, the number of determinations in the operation information of the operator management table is updated and stored (S44). Thereafter, similarly to S19 to S21, the server device 300 refers to the related information table based on the updated and stored operation information, identifies the related information whose conditions on the operation information are satisfied, and generates that related information (S45). When the related information is generated, the server device 300 stores the generated related information in the imaging device management table and outputs it to the display device 100 and the imaging device 200 (T23). The display device 100 and the photographing device 200 that have received the related information output it to a display or the like via the output interface 114 and the output interface 215, respectively (S46 and S47).
  • the process of reading out the operation information in the photographing device 200 and transmitting it to the server device 300 is executed after the subject image is transmitted.
  • However, the transmission may be performed at any timing, such as each time the operation information or the information used to generate it is acquired by the imaging device 200, at every predetermined period, at the timing of communicating with the server device 300 to transmit other information, or a combination thereof.
  • In addition, although the generated related information is output to both the display device 100 and the photographing device 200 here, it may be output to only one of them.
  • In particular, the display device 100 is frequently used by the operator for inputting medical interview information and finding information, so it is desirable to output the related information at least to the display device 100.
  • Furthermore, the method of outputting the related information on the display device 100 and the photographing device 200 is not limited to displaying it on a display or the like, and may be any method such as sound, lighting of an LED, vibration, or a combination thereof. This completes the processing sequence.
  • FIG. 8 is a diagram showing a process flow executed by the photographing device 200 according to an embodiment of the present disclosure. Specifically, FIG. 8 is a diagram showing the processing flow executed in the photographing process of S11 to S17 in FIG. 7. The processing flow is mainly performed by the processor 213 of the photographing device 200 reading and executing a processing program stored in the memory 214. In the following processing flow, examples of information acquired as operation information, or as information used to generate operation information, are illustrated, but the information is of course not limited to these examples, and other information may also be acquired.
  • The photographing device 200 is started by detecting a press of the power button or the like (S111).
  • the processor 213 stores, for example, the following information in the memory 214 as the operation information or information used to generate the operation information.
• A detected value from the voltage sensor connected to the battery 218, the communication speed of the communication interface 216, the radio wave intensity when performing wireless communication via the communication interface 216, the operation log, the detected value from the acceleration sensor, and the like are acquired as needed and stored in the memory 214.
• When the processor 213 receives, from the server device 300 via the communication interface 216, the subject information of subjects whose subject images have not yet been captured, the processor 213 outputs the received subject information as a list to the display via the output interface 215. Then, the processor 213 receives, via the input interface 210, selection of the subject information of the subject to be photographed from the list (S112). At this time, the processor 213 stores, for example, the following information in the memory 214 as the operation information or the information used to generate the operation information: reception of the selection operation via the touch panel, the number of touch panel operations, and the operation log.
• Next, the processor 213 activates the camera 211 and determines whether or not the auxiliary tool 400 is normally attached to the photographing device 200. If it is determined that the auxiliary tool 400 is not attached, a screen prompting attachment of the auxiliary tool 400 is output (S113). Then, while the attachment prompting screen is output, the processor 213 determines at a predetermined period whether or not the auxiliary tool 400 is attached, and if it is determined that the auxiliary tool 400 is normally attached, the processor 213 proceeds to the next process.
• At this time, the processor 213 stores, for example, the following information in the memory 214 as the operation information or the information used to generate the operation information: attachment of the auxiliary tool 400, which is used to count the number of times the auxiliary tool 400 is attached; the current time obtained by referring to the timer (through-image display start time); and the operation log.
  • the processor 213 starts photographing the inside of the oral cavity including the pharynx using the camera 211 (S114). Specifically, photography is performed by turning on the light source 212 to irradiate the subject with light, and detecting reflected light from the subject, such as the pharynx, with the camera 211. At this time, the processor 213 stores, for example, the following information in the memory 214 as the operation information or information used to generate the operation information.
  • the processor 213 outputs the photographed subject image to the display via the output interface 215.
• When the processor 213 receives the operator's confirmation operation via the input interface 210, the processor 213 transmits the photographed subject image together with the subject ID information to the server device 300 via the communication interface 216 (S115). If an instruction for re-photographing is accepted instead of the confirmation operation, the process returns to S114.
• At this time, the processor 213 stores, for example, the following information in the memory 214 as the operation information or the information used to generate the operation information: the current time obtained by referring to the timer (lighting end time of the light source 212); and reception of the confirmation operation via the touch panel, the number of touch panel operations, and the operation log.
• The battery 218 of the photographing device 200 is a rechargeable secondary battery. Therefore, for example, by connecting the photographing device 200 to a charging device (not shown), the above-described photographing process can be temporarily suspended and charging of the battery 218 of the photographing device 200 can be started. Accordingly, when the processor 213 detects the connection to the charging device, it controls the battery 218 to start charging. At this time, or while the battery is being charged, the processor 213 stores, for example, the following information in the memory 214 as the operation information or the information used to generate the operation information.
• The processor 213 reads, from the memory 214, each piece of operation information or information used to generate the operation information stored as described above (S116), and transmits the read information together with the photographing device ID information to the server device 300 via the communication interface 216 (S117).
• The transmitted operation information or information used to generate the operation information is processed by the processor 312 of the server device 300 and is stored in the photographing device management table as operation information in association with the photographing device ID information.
• Although the processes of reading and transmitting this information are performed at this timing in the example of FIG. 8, they may of course be performed at other timings. For example, they may be performed at any timing, such as each time the information is acquired, at every predetermined period, at the timing of communicating with the server device 300 for transmitting other information, or a combination thereof. With the above, the processing flow ends.
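• As a purely illustrative, non-limiting sketch of the device-side handling of operation information described above, the following Python snippet shows how a photographing device might buffer operation information (or information used to generate it) and serialize it together with photographing device ID information for transmission in S116 to S117. All class, field, and function names are hypothetical and are not part of the embodiments.

```python
import json
import time
from dataclasses import dataclass, field, asdict

@dataclass
class OperationInfoBuffer:
    """Buffers operation information on the photographing device side (memory 214)."""
    photographing_device_id: str
    touch_count: int = 0
    light_on_times: list = field(default_factory=list)   # epoch seconds (lighting start)
    light_off_times: list = field(default_factory=list)  # epoch seconds (lighting end)
    accel_samples: list = field(default_factory=list)    # raw acceleration-sensor values

    def record_touch(self) -> None:
        self.touch_count += 1

    def record_light_on(self) -> None:
        self.light_on_times.append(time.time())

    def record_light_off(self) -> None:
        self.light_off_times.append(time.time())

    def to_payload(self) -> str:
        """Serialize the buffered information for transmission (S116-S117)."""
        return json.dumps(asdict(self))

buf = OperationInfoBuffer(photographing_device_id="CAM-0001")
buf.record_light_on()
buf.record_touch()
buf.record_light_off()
print(buf.to_payload())  # sent to the server device together with the device ID
```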
• FIG. 9A is a diagram showing a processing flow executed in the server device 300 according to an embodiment of the present disclosure. Specifically, FIG. 9A is a diagram showing the processing flow executed in the determination processing of S41 to S44 in FIG. 7. The processing flow is performed mainly by the processor 312 of the server device 300 reading and executing a processing program stored in the memory 311. In the following processing flow, examples of operation information and related information are illustrated, but the information is of course not limited to the illustrated examples, and other information may be used.
• The processor 312 receives, from the display device 100 via the communication interface 313, a request to determine the possibility of contracting a predetermined disease, together with the subject ID information of the subject selected as a determination target (S211). The interview information and finding information input in advance by the subject or the operator on the display device 100 are stored in the subject management table in association with the subject ID information. Therefore, based on the received subject ID information, the processor 312 refers to the subject management table and reads out the interview information and finding information associated with the subject ID information of the subject (S212 and S213). Further, based on the similarly received subject ID information, the processor 312 refers to the subject images in the subject management table and reads out the subject image associated with the subject ID information of the subject (S214).
  • the processor 312 uses the read interview information, finding information, and subject image to execute a process of determining the possibility of contracting a predetermined disease (S215).
  • FIG. 9B is a diagram showing a processing flow related to generation of a trained model according to an embodiment of the present disclosure. Specifically, FIG. 9B is a diagram showing a processing flow related to generation of a learned determination model used in the determination process of S215 in FIG. 9A.
  • the processing flow may be executed by the processor 312 of the server device 300, or may be executed by a processor of another device.
  • the processor 312 executes the step of acquiring a subject image that includes at least a portion of the pharynx (S311). Further, the processor 312 executes a step of acquiring interview information and finding information that are stored in advance in association with the subject ID information of the subject who is the subject of the subject image (S311). Next, the processor 312 performs a processing step in which a correct label is assigned in advance to the subject of the subject image based on the results of a rapid influenza test using immunochromatography, a PCR test, a virus isolation culture test, etc. (S312). Then, the processor 312 executes a step of storing the assigned correct label information as determination result information in association with the subject image, interview information, and finding information (S313). Note that although the subject image itself is used here, feature amounts obtained from the determination image may also be used.
• Next, the processor 312 executes a step of performing machine learning of a determination pattern for the disease using these pieces of information (S314). This machine learning is performed, for example, by repeatedly providing sets of the information to a neural network in which neurons are combined and adjusting the parameters of each neuron so that the output from the neural network matches the correct label information. Then, a step of acquiring the learned determination model is executed (S315).
  • the acquired learned determination model may be stored in the memory 311 of the server device 300 or in another device connected to the server device 300 via a wired or wireless network.
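• As a purely illustrative, non-limiting sketch of the training loop described above (S311 to S315), the following Python snippet trains a single-neuron (logistic) model by gradient descent so that its output approaches the correct label information; in practice a neural network or convolutional neural network would be used as described above. All data, feature dimensions, and values here are hypothetical, and NumPy is assumed to be available.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: each row is a feature vector combining image-derived
# features with interview/finding information; y is the correct label obtained from
# a rapid test, PCR test, or the like (1 = diseased, 0 = not diseased), as in S311-S313.
X = rng.normal(size=(200, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=200) > 0).astype(float)

# Minimal single-neuron model trained by gradient descent so that its output
# approaches the correct label information (the loop corresponds to S314).
w = np.zeros(X.shape[1])
b = 0.0
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probability of contracting the disease
    grad_w = X.T @ (p - y) / len(y)         # gradient of the cross-entropy loss
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

# The fitted (w, b) plays the role of the learned determination model (S315).
pred = 1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5
print("training accuracy:", np.mean(pred == y))
```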
• The processor 312 stores the result determined by the determination process in the subject management table in association with the subject ID information, and transmits the stored determination result and the subject image used in the determination to the display device 100 (S216).
• Further, the processor 312 updates and stores, as an example, the following information in the memory 311 as the operation information of the photographing device management table (S217): the number of times the possibility of contracting a predetermined disease has been determined; and the operation log.
• FIG. 9C is a diagram showing a processing flow executed in the server device 300 according to an embodiment of the present disclosure. Specifically, FIG. 9C is a diagram showing the processing flow executed in the related information output processing of S19 to S21 and S45 to S47 in FIG. 7. The processing flow is performed mainly by the processor 312 of the server device 300 reading and executing a processing program stored in the memory 311. In the following processing flow, examples of operation information and related information are illustrated, but the information is of course not limited to the illustrated examples, and other information may be used.
  • the processor 312 determines whether operation information or information used to generate the operation information has been received from the photographing device 200 via the communication interface 313 (S231).
• When the information used to generate the operation information has been received, the processor 312 generates the operation information based on that information (for example, when the lighting start time and lighting end time of the light source 212 are received, the lighting time of the light source 212 is calculated as the operation information), thereby acquiring the operation information.
• Next, the processor 312 updates and stores, in the photographing device management table, the operation information received in S231, the operation information generated in S231, and the operation information acquired in S217 of FIG. 9A (S232). Then, the processor 312 refers to the related information table, reads out the related information whose associated conditions are satisfied by each piece of operation information, and generates the related information (S233). The processor 312 outputs the read related information to the display device 100 specified by the display device ID information associated with the photographing device ID information and to the photographing device 200 specified by the photographing device ID information (S234).
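• As a purely illustrative, non-limiting sketch of S231 to S234, the following Python snippet derives operation information from raw values (here, the lighting time of the light source 212 from its lighting start and end times), updates an in-memory stand-in for the photographing device management table, and reads out related information whose condition is satisfied from a stand-in for the related information table. All table structures, keys, thresholds, and messages are hypothetical.

```python
from datetime import datetime

# Hypothetical in-memory stand-ins for the photographing device management table
# and the related information table.
device_table = {"CAM-0001": {"light_on_seconds": 0.0, "touch_count": 0}}

related_info_table = [
    # (operation-info key, threshold, message) -- illustrative values only
    ("light_on_seconds", 3_600_000.0, "The LED of the light source 212 may have deteriorated."),
    ("touch_count", 50_000, "The touch panel sensor may have deteriorated."),
]

def update_and_generate(device_id, lighting_start, lighting_end, touches):
    """Derive and store operation information (S231-S232), then read out the
    related information whose condition is satisfied (S233)."""
    record = device_table[device_id]
    record["light_on_seconds"] += (lighting_end - lighting_start).total_seconds()
    record["touch_count"] += touches
    return [msg for key, threshold, msg in related_info_table if record[key] >= threshold]

messages = update_and_generate(
    "CAM-0001",
    datetime(2022, 4, 1, 9, 0, 0),
    datetime(2022, 4, 1, 9, 3, 0),
    touches=12,
)
# Prints [] here because no cumulative value has reached its threshold yet;
# any generated messages would be output in S234.
print(messages)
```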
• Operation information or information used to generate operation information: Information indicating that the auxiliary tool 400 has been attached is acquired by the photographing device 200 and, after processing by the processor 312 in the server device 300, the number of times the auxiliary tool 400 has been attached is stored as operation information.
• The information indicating that the auxiliary tool 400 has been attached may be obtained, for example, from the determination in S113 of FIG. 8 that the auxiliary tool 400 is normally attached, from an operation input by the operator indicating that attachment of the auxiliary tool 400 has been confirmed, from the press of the photographing button in S114 of FIG. 8, from the acceptance of the confirmation operation in S115 of FIG. 8, or from the determination of the possibility of contracting a predetermined disease in S215 of FIG. 9A.
• Note that when a new auxiliary tool 400 is purchased or the like and the user inputs a reset operation, the count is reset and counting starts again from 1.
• Related information: Based on the number of attachments stored as operation information, nothing is output while the number of attachments is from 0 to less than 100; when it reaches the 100th and the 150th time, a message notifying that the stock of the auxiliary tool 400 is decreasing is output; from the 181st to the 200th time, the message "The stock of auxiliary tools is running low." is output each time the operation information is acquired; and from the 201st time onwards, a message such as "The stock of auxiliary tools is low. Please purchase more." is output (a sketch of this tiered output is given below).
• Further, the related information output from the 201st time onwards includes a link to a web page where the auxiliary tool 400 can be purchased, contact information for the place of purchase, and the like.
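• As a purely illustrative, non-limiting sketch of the tiered output described above, the following Python function maps a cumulative number of attachments of the auxiliary tool 400 to the message (if any) to be output as related information. The thresholds follow the example values above; the message strings and the placeholder purchase link are hypothetical.

```python
from typing import Optional

def auxiliary_tool_related_info(wearing_count: int) -> Optional[str]:
    """Return the related-information message for a cumulative attachment count,
    or None when nothing should be output."""
    if wearing_count >= 201:
        return ("The stock of auxiliary tools is low. Please purchase more. "
                "Purchase page: <hypothetical link> / Contact: <hypothetical contact>")
    if 181 <= wearing_count <= 200:
        return "The stock of auxiliary tools is running low."
    if wearing_count in (100, 150):
        return "The stock of the auxiliary tool 400 is decreasing."
    return None  # from 0 to less than 100 (other than the milestones), nothing is output

for n in (50, 100, 150, 190, 205):
    print(n, auxiliary_tool_related_info(n))
```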
• Operation information or information used to generate operation information: Information indicating that the power button has been pressed and information indicating that the photographing button has been pressed are acquired by the photographing device 200 and, after processing by the processor 312 in the server device 300, the number of presses of each physical key is stored as operation information. Note that the count of the number of presses is reset when, for example, the physical key is replaced and the user inputs a reset operation, and counting starts again from 1.
• Related information: Based on the number of presses of each physical key stored as operation information, each time a predetermined number such as the t1-th (for example, 10,000th) or t2-th (for example, 20,000th) press is reached, the message "The physical key may have deteriorated." is output, and after the t3-th press (for example, the 30,000th), the message "Please replace the physical key. For replacement, please contact XX." is output; in this way, information for notifying deterioration of the physical key and promoting at least one of replacement, inspection, and calibration is output.
• [Information promoting touch panel replacement, etc.] Operation information or information used to generate operation information: Information indicating that a tap, drag, or swipe operation has been performed on the touch panel for a selection operation or a confirmation operation is acquired via the input interface 210 of the photographing device 200. Then, after processing by the processor 312, the number of touch panel operations is stored as operation information. Note that when the touch panel is replaced or the like and the user inputs a reset operation, the count is reset and counting starts again from 1.
• Related information: Based on the number of touch panel operations stored as operation information, each time a predetermined number such as the t4-th (for example, 50,000th) or t5-th (for example, 100,000th) operation is reached, the message "The touch panel sensor may have deteriorated." is output, and after the t6-th operation (for example, the 300,000th), the message "Please replace the touch panel. For replacement, please contact XX." is output; in this way, information for notifying deterioration of the touch panel and promoting at least one of replacement, inspection, and calibration is output.
• Operation information or information used to generate operation information: The lighting start time and lighting end time of the light source 212 are acquired in the photographing device 200 from the current time information obtained by referring to the timer, and, after processing by the processor 312 in the server device 300, the lighting time of the light source 212 is stored as operation information. Further, the output of an on signal to the light source 212 is acquired in the photographing device 200, and the number of times the light source 212 has been turned on is stored as operation information through processing by the processor 312 in the server device 300.
  • a detected value from a current sensor connected to the light source 212 of the photographing device 200 is acquired in the photographing device 200, and the detected value is stored as operation information in the server device 300.
  • a detection value from a temperature sensor installed around the light source 212 is acquired by the photographing device 200 and stored as operation information in the server device 300. Incidentally, when the light source 212 is replaced or the like and the user inputs a reset operation, the count is reset and the count starts again from 1 for the lighting time and the number of times the light source 212 is lit.
• Related information: Based on the lighting time stored as operation information, each time the cumulative lighting time reaches a predetermined time such as s1 hours (for example, 1,000 hours) or s2 hours (for example, 2,000 hours), the message "The LED of the light source 212 may have deteriorated." is output, and after s3 hours (for example, 10,000 hours), the message "Please replace the LED of the light source 212. For replacement, please contact XX." is output; in this way, information for notifying deterioration of the light source 212 and promoting at least one of replacement, inspection, and calibration is output.
• Also, based on the number of lighting times stored as operation information, each time a predetermined number such as the t7-th (for example, 10,000th) or t8-th (for example, 20,000th) lighting is reached, the message "Please check the LED of the light source 212." or "Please replace the LED of the light source 212. For replacement, please contact XX." is output; in this way, information for notifying deterioration of the LED of the light source 212 and promoting at least one of replacement, inspection, and calibration is output.
• Further, the detected value of the temperature sensor stored as operation information is compared with a predetermined threshold value (for example, 60 degrees Celsius), and if the threshold value is exceeded, information for notifying deterioration of the LED of the light source 212 and promoting at least one of replacement, inspection, and calibration is output.
• Further, the information such as the lighting time, the number of lighting times, the detected value from the current sensor, and the temperature stored as the operation information above may be stored in chronological order as historical data, and deterioration of the LED of the light source 212 may be notified, or a predicted timing of deterioration may be notified, based on a comparison with the past historical data.
  • Such a notification can be generated by determining deterioration or predicting the timing of deterioration using a learned prediction model obtained by machine learning using historical data as input data.
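• As a purely illustrative, non-limiting sketch of predicting the timing of deterioration from historical data, the following Python snippet fits a simple linear trend to hypothetical temperature-sensor history around the light source 212 and estimates when a deterioration threshold would be reached; an actual implementation might instead use the learned prediction model mentioned above. All data and thresholds are hypothetical, and NumPy is assumed to be available.

```python
import numpy as np

# Hypothetical history: one temperature reading (degrees Celsius) per day of use.
days = np.arange(30)
temps = 40 + 0.4 * days + np.random.default_rng(1).normal(scale=0.5, size=30)

# Simple stand-in for the learned prediction model: a linear trend fitted to the
# historical data, used to estimate when the 60 degC threshold will be crossed.
slope, intercept = np.polyfit(days, temps, 1)
threshold_degc = 60.0
if slope > 0:
    predicted_day = (threshold_degc - intercept) / slope
    print(f"Deterioration of the LED of the light source 212 is predicted around day {predicted_day:.0f}.")
else:
    print("No upward temperature trend detected; no deterioration notification is generated.")
```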
• Operation information or information used to generate operation information: Information indicating that connection of the battery 218 to the charging device has been detected is acquired by the photographing device 200 and, after processing by the processor 312 in the server device 300, the cumulative number of charging times is stored as operation information.
• Further, the current amount of charge obtained from the integrated values of the charging and discharging current measured by the current sensor connected to the battery 218, and the detected value from the voltage sensor connected to the battery 218, are acquired by the photographing device 200, and the detected value from the voltage sensor at the time of full charge is stored as operation information in the server device 300.
• Further, a detected value of the cell voltage of the battery 218 is acquired in the photographing device 200 by the voltage sensor connected to the battery 218 and, after processing by the processor 312 in the server device 300, the number of times an overcharge state, which is a state in which an excessive voltage is applied, has occurred is stored as operation information. Similarly, the number of times an overdischarge state, which is a state of excessive discharge, has occurred is stored as operation information.
• Further, the charging start time and charging end time of the battery 218 are acquired in the photographing device 200 and, after processing by the processor 312 in the server device 300, the cumulative charging time of the battery 218 is stored as operation information.
• Further, a detected value from a temperature sensor installed around the battery 218 is acquired by the photographing device 200 and stored as operation information in the server device 300. Note that the number of charging times, the numbers of overcharging and overdischarging, and the cumulative charging time are reset when, for example, the battery 218 is replaced and the user inputs a reset operation, and counting starts again from 1.
• Related information: Based on the number of charging times, the number of overcharging times, or the number of overdischarging times stored as operation information, each time a predetermined number such as the t10-th (for example, 500th) or t11-th (for example, 1,000th) time is reached, the message "The battery 218 may have deteriorated." is output, and after the t12-th time (for example, the 2,000th), the message "Please replace the battery 218. For replacement, please contact XX." is output; in this way, information for notifying deterioration of the battery 218 and promoting at least one of replacement and inspection is output.
• Note that the conditions for outputting the related information may be set to a different number of times for each of the number of charging times, the number of overcharging times, and the number of overdischarging times.
• Further, the detected value from the voltage sensor at the time of full charge stored as operation information is compared with a predetermined threshold (for example, 3.7 V), and if the value is lower than the threshold, a message such as "The battery has deteriorated and cannot be fully charged. Please check or replace the battery 218. For replacement, please contact XX." is output; in this way, information for notifying deterioration of the battery 218 and promoting at least one of replacement and inspection is output.
• Further, each time the cumulative charging time stored as operation information reaches a predetermined time such as s4 hours (for example, 500 hours) or s5 hours (for example, 1,000 hours), the message "The battery 218 may have deteriorated." is output, and after s6 hours (for example, 2,000 hours), the message "Please replace the battery 218. For replacement, please contact XX." is output; in this way, information for notifying deterioration of the battery 218 and promoting at least one of replacement and inspection is output.
• Further, the detected value of the temperature sensor stored as operation information is compared with a predetermined threshold value (for example, 50 degrees Celsius), and if the threshold value is exceeded, information for notifying deterioration of the battery 218 and promoting at least one of replacement and inspection is output.
• Further, the information such as the number of charging times, the number of overcharging times, the number of overdischarging times, the detected value from the voltage sensor, the charging time, and the detected value from the temperature sensor stored as the operation information above may be stored in chronological order as historical data, and deterioration of the battery 218 may be notified, or a predicted timing of deterioration may be notified, based on a comparison with the past historical data. Such a notification can be generated by determining deterioration or predicting the timing of deterioration using a learned prediction model obtained by machine learning using the historical data as input data.
• Operation information or information used to generate operation information: The current remaining charge amount is acquired in the photographing device 200 from the integrated values of the charging and discharging current measured by the current sensor connected to the battery 218, or from the output value of the voltage sensor connected to the battery 218, and the remaining charge amount is stored as operation information in the server device 300. Further, the operation log of the photographing device 200, information indicating the location of use, and current time information are acquired by the photographing device 200 and stored as operation information in the server device 300.
• Related information: From the operation log, the information indicating the location of use, and the current time information stored as operation information, predicted values of the usage frequency and power consumption for each day of the week are calculated for each institution.
• Then, a decrease in the charge amount of the battery 218 is predicted based on the remaining charge amount acquired as operation information and the predicted values. Based on the result, if it is predicted that the charge amount of the battery 218 will fall below a specified amount (for example, 500 mAh) during medical treatment hours at a medical institution such as a hospital, related information such as "Please connect the battery 218 to the charging device and start charging." is output to facilitate charging. In a medical institution or the like, it would be a serious problem if the photographing device 200 became unusable during medical treatment hours due to a decrease in the remaining capacity of the battery 218, and such a problem can be prevented by notifying the operator in advance (a sketch of this prediction is given below).
• Note that the predicted values of the usage frequency and power consumption for each day of the week for each institution can be calculated using a learned prediction model obtained by machine learning using, as input data, the past history of usage frequency and power consumption on the same day of the week at the same institution. However, the method is not limited to this; any method may be used, such as a method that uses, as the predicted values, the maximum usage frequency, maximum power consumption, average usage frequency, average power consumption, or the like for each day of the week obtained from the past history.
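• As a purely illustrative, non-limiting sketch of the charge-depletion prediction described above, the following Python function checks whether the remaining charge is predicted to fall below the specified amount (for example, 500 mAh) before medical treatment hours end, in which case the charging prompt would be output as related information. All values are hypothetical.

```python
def should_prompt_charging(remaining_mah: float,
                           predicted_mah_per_hour: float,
                           hours_until_closing: float,
                           minimum_mah: float = 500.0) -> bool:
    """Return True when the charge amount is predicted to fall below the
    specified amount before medical treatment hours end."""
    predicted_remaining = remaining_mah - predicted_mah_per_hour * hours_until_closing
    return predicted_remaining < minimum_mah

# Hypothetical case: 1,200 mAh remaining, predicted consumption of 180 mAh/hour
# for this institution on this day of the week, 5 hours of treatment left.
if should_prompt_charging(1200, 180, 5):
    print("Please connect the battery 218 to the charging device and start charging.")
```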
• Operation information or information used to generate operation information: A detected value from a voltage sensor connected to an AC adapter for supplying power from the charging device to the battery 218 is acquired in the photographing device 200, and the detected value is acquired as operation information in the server device 300.
• Related information: The time required for a full charge is calculated from the detected value of the voltage sensor stored as operation information and the predetermined battery capacity, and a notification such as "It will take XX hours to fully charge." is output as related information.
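• As a purely illustrative, non-limiting sketch of such a calculation, the following Python function approximates the state of charge linearly from a voltage-sensor value and divides the remaining capacity by an assumed charging current to estimate the time required for a full charge. All constants (battery capacity, voltage range, charging current) are hypothetical assumptions, not values from the embodiments.

```python
def hours_to_full_charge(voltage_v: float,
                         capacity_mah: float = 2000.0,
                         empty_v: float = 3.0,
                         full_v: float = 4.2,
                         charge_current_ma: float = 500.0) -> float:
    """Rough estimate of the time required for a full charge: approximate the
    state of charge linearly from the voltage, then divide the remaining
    capacity by an assumed charging current."""
    soc = min(max((voltage_v - empty_v) / (full_v - empty_v), 0.0), 1.0)
    return capacity_mah * (1.0 - soc) / charge_current_ma

print(f"It will take about {hours_to_full_charge(3.7):.1f} hours to fully charge.")
```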
• Operation information or information used to generate operation information: A detected value from a voltage sensor or a current sensor connected to an electric wire for supplying power from the charging device to the battery 218 is acquired in the photographing device 200, and the detected value is acquired as operation information in the server device 300.
• Related information: The detected value of the voltage sensor or current sensor stored as operation information is compared with a predetermined threshold (for example, 0.5 A as the threshold of the current sensor), and if the detected value exceeds or falls below the threshold, information notifying that the charging device has deteriorated or that a charging device that does not meet the recommended standards is connected is output as related information.
• For the connection to the charging device, a cable capable of both data communication and power supply or a cable capable of only power supply is generally used. Therefore, which type of cable has been used for the connection may be detected and output as related information. For example, if a cable capable of only power supply is used, data communication cannot be performed via the cable, so outputting this fact as related information further improves usability.
• Operation information or information used to generate operation information: The display start time and display end time of the through image are acquired in the photographing device 200 from the current time information obtained by referring to the timer, and, after processing by the processor 312 in the server device 300, the cumulative usage time of the CMOS image sensor is stored as operation information. Further, information indicating that the photographing button has been pressed is acquired by the photographing device 200 and, after processing by the processor 312 in the server device 300, the number of times of photographing is stored as operation information.
• Note that when the CMOS image sensor is replaced or the like and the user inputs a reset operation, the cumulative usage time and the number of times of photographing are reset, and counting starts again from 1.
• Related information: Based on the cumulative usage time of the CMOS image sensor stored as operation information, each time the cumulative usage time reaches a predetermined time such as s7 hours or s8 hours, the message "The CMOS image sensor may have deteriorated." is output, and after s9 hours, the message "Please replace the CMOS image sensor. For replacement, please contact XX." is output; in this way, information for notifying deterioration of the CMOS image sensor and promoting at least one of replacement and inspection is output.
• Further, a subject image photographed by the photographing device 200 may be used as the operation information. In this case, changes in color, saturation, and brightness and pixel defects are determined from the subject image, and related information such as "Please replace the CMOS image sensor. For replacement, please contact XX." is output; in this way, information for promoting at least one of replacement and inspection of the CMOS image sensor and the light source 212 is output.
• This determination can be performed by inputting the subject image into a trained determination model obtained by machine learning using, as input data, subject images for learning and label information indicating whether each subject image is appropriate or inappropriate. However, the method is not limited to this, and other image analysis methods may be used, such as analyzing the degree of similarity with an appropriate subject image.
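• As a purely illustrative, non-limiting alternative to the trained model mentioned above, the following Python snippet applies simple brightness and contrast heuristics to a grayscale subject image to flag possible degradation of the CMOS image sensor or the light source 212. The thresholds and the sample image are hypothetical, and NumPy is assumed to be available.

```python
import numpy as np

def image_quality_issues(image: np.ndarray,
                         brightness_range=(40.0, 220.0),
                         min_contrast=20.0):
    """Flag a grayscale subject image (pixel values 0-255) whose brightness or
    contrast suggests LED or image-sensor degradation."""
    issues = []
    mean = float(image.mean())
    if not (brightness_range[0] <= mean <= brightness_range[1]):
        issues.append(f"unusual brightness (mean={mean:.1f})")
    if float(image.std()) < min_contrast:
        issues.append("low contrast / possible pixel degradation")
    return issues

sample = np.full((480, 640), 15, dtype=np.uint8)  # hypothetical, very dark capture
print(image_quality_issues(sample))
```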
• Operation information or information used to generate operation information: The start time and end time of use of the photographing device 200 are acquired in the photographing device 200 from the current time information obtained by referring to the timer, and, after processing by the processor 312 in the server device 300, the cumulative usage time of the photographing device 200 is stored as operation information. Further, the detected value from a temperature sensor installed around the light guide tube is acquired by the photographing device 200 and stored as operation information in the server device 300.
• Further, the lighting start time and lighting end time of the light source 212 are acquired in the photographing device 200 and, after processing by the processor 312 in the server device 300, the lighting time of the light source 212 is stored as operation information. Note that the cumulative usage time is reset when, for example, the various lenses or the light guide tube are replaced and the user inputs a reset operation, and counting starts again from 1.
• Related information: Based on the cumulative usage time stored as operation information, each time the cumulative usage time reaches a predetermined time such as s10 hours or s11 hours, a message notifying that the various lenses and the light guide tube may have deteriorated is output, and after s12 hours, the message "Please replace the various lenses and the light guide tube. For replacement, please contact XX." is output; in this way, information for notifying deterioration of the various lenses and the light guide tube and promoting at least one of replacement and inspection is output. Note that a different time may be set for each type of lens or for the light guide tube.
• Further, the related information may include information that notifies that abnormal heat generation has been observed when the detected value of the temperature sensor stored as operation information exceeds a predetermined threshold value (for example, 120 degrees Celsius).
  • the light guide tube is easily affected by light and heat from the light source 212. Therefore, it is also possible to output related information by further combining the lighting time of the light source 212 acquired as described above with the information on the cumulative usage time described above.
• Operation information or information used to generate operation information: A detected value from the acceleration sensor is constantly acquired by the photographing device 200 and stored as operation information in the server device 300.
• Related information: The detected value of the acceleration sensor stored as operation information is compared with a predetermined threshold value, and if the value exceeds the threshold value, it is determined that the photographing device 200 has been subjected to abnormal use such as a fall or a collision, and information prompting inspection or the like of the photographing device 200 is output as related information.
• Such a notification can also be generated by determining an abnormal detected value using a trained prediction model obtained by machine learning using historical data of detected values from the acceleration sensor as input data. Further, the operation information itself may be output in order to estimate the cause of a failure of the photographing device 200 at the time of inspection.
• Operation information or information used to generate operation information: A detected value from a temperature sensor installed at a heat-generating part of the photographing device 200 (for example, around the light source 212 or on a control board on which the processor 213 and the like are mounted) is constantly acquired by the photographing device 200 and stored as operation information in the server device 300.
• Related information: The detected value of the temperature sensor stored as operation information is compared with a predetermined threshold value, and if the value exceeds the threshold value, information prompting inspection of the photographing device 200 because of a risk of abnormal heat generation or the like is output as related information.
• Such a notification can also be generated by determining an abnormal detected value using a trained prediction model obtained by machine learning using historical data of detected values from the temperature sensor as input data. Further, the operation information itself may be output in order to estimate the cause of a failure of the photographing device 200 at the time of inspection.
• Operation information or information used to generate operation information: The communication speed of communication performed with the display device 100 or the server device 300 via the communication interface 216 of the photographing device 200, and the detected value of the radio field intensity during the communication, are acquired by the photographing device 200, processed by the processor 312 of the server device 300, and stored as operation information.
• Related information: The communication speed or radio field intensity stored as operation information is compared with a predetermined threshold (for example, 1 Mbps for the communication speed), and if the value falls below the threshold, it is determined that the communication state is not good, and information on how to improve the communication state is output as related information.
• Operation information or information used to generate operation information: The communication speed of communication performed with the display device 100 or the server device 300 via the communication interface 216 of the photographing device 200 is acquired in the photographing device 200, processed by the processor 312 of the server device 300, and stored as operation information.
• Related information: Based on the communication speed stored as operation information, the expected waiting time when transmitting the subject image or the like from the display device 100 to the server device 300 is calculated, and the calculated expected waiting time is output as related information.
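• As a purely illustrative, non-limiting sketch of such a calculation, the following Python function estimates the expected waiting time from an assumed image size and the communication speed stored as operation information. The image size and speed used in the example are hypothetical.

```python
def expected_wait_seconds(image_size_bytes: int, speed_bps: float) -> float:
    """Estimate the waiting time for transmitting a subject image at the given
    communication speed (bits per second)."""
    return image_size_bytes * 8 / speed_bps

# Hypothetical 2 MB subject image over a 1 Mbps link.
print(f"Expected waiting time: {expected_wait_seconds(2_000_000, 1_000_000):.0f} seconds")
```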
• Operation information or information used to generate operation information: Information indicating whether or not the auxiliary tool 400 has been removed is acquired by the photographing device 200, processed by the processor 312 in the server device 300, and stored as operation information.
• Related information: Based on the information stored as operation information indicating whether or not the auxiliary tool 400 has been removed, it is determined whether the auxiliary tool 400 remains attached, and if it is determined that the auxiliary tool 400 has not been removed, related information indicating this is generated. By outputting information that promotes the removal of the used auxiliary tool 400 in this way, infection between subjects can be prevented.
• Operation information or information used to generate operation information: The operation log of the photographing device 200, information indicating the location of use, and current time information are acquired by the photographing device 200 and stored as operation information in the server device 300.
• Related information: When an abnormal operation is detected from the operation log stored as operation information, information regarding an operation method for improving it is output as related information. For example, in the photographing process, the subject image is transmitted to the server device 300 by a confirmation operation after the subject image is captured; when the number of times a cancel operation is performed instead of the confirmation operation exceeds a predetermined threshold value (for example, three times), information that assists in photographing, such as "Be sure to firmly fix the photographing device 200 so that it does not move during photographing.", is output as related information.
• Further, for example, when the lighting time of the light source exceeds a predetermined threshold (for example, 3 minutes), information that assists in the use of the photographing device 200, such as "Let's turn off the power of the photographing device 200 after photographing." or "Let's place the photographing device 200 upright and put it into a sleep state.", is output as related information. Note that these counts and times are reset when the operator inputs a predetermined operation or when the related information is output, and counting starts again from 1.
• Note that the above threshold values can be calculated using a learned prediction model obtained by machine learning using, as input data, the history of past operation logs of the same institution on the same day of the week. However, the method is not limited to this; any method may be used, such as a method that uses, as the threshold value, the maximum value, minimum value, or average value for each day of the week obtained from the past history.
• Operation information or information used to generate operation information: A subject image photographed by the photographing device 200 is acquired by the photographing device 200 and stored as operation information in the server device 300.
• Related information: By analyzing the subject images stored as operation information, if it is determined, for example, that there are many images with an inappropriate angle of view or many blurred images, a guide display that guides the position of the subject is superimposed on the through image, or information that assists in capturing a good subject image, such as "Be sure to firmly fix the photographing device 200 so that it does not move during photographing.", is output as related information.
• This analysis can be performed by inputting the subject image acquired as operation information into a trained determination model obtained by machine learning using, as input data, subject images for learning and label information indicating whether each subject image is appropriate or inappropriate. However, the method is not limited to this, and other image analysis methods may be used, such as analyzing the degree of similarity with an appropriate subject image.
  • the related information may be output to the display of the photographing device 200. Alternatively, it may be output as audio from the photographing device 200 or the display device 100.
• Operation information or information used to generate operation information: A detected value from a temperature sensor installed around the light source 212 (for example, on a board on which the LED of the light source 212 is mounted) is acquired in the photographing device 200 and stored as operation information in the server device 300.
• Related information: The detected value of the temperature sensor stored as operation information is compared with a predetermined threshold value, and if the value exceeds the threshold value, a notification such as "The LED of the light source 212 may be at a high temperature. Please refrain from operating the device." is output as related information.
• The processor 312 refers to the operator management table and specifies the operator ID information with which the display device ID information is associated. Next, the processor 312 adds up the determination fees for all pieces of display device ID information associated with the operator ID information, and calculates the determination fee for each piece of operator ID information. For example, if two pieces of display device ID information are associated with the operator ID information, the determination fees calculated for the two pieces of display device ID information are added together.
• Then, information notifying the calculated determination fee and information notifying the payment method of the determination fee are generated as related information and transmitted to the display device 100 associated with the operator ID information. Note that, although the summation here is performed for each piece of operator ID information, the processor 312 can also refer to the operator management table and further perform the summation for each piece of institution ID information. By doing so, it becomes possible to bill the determination fee for each medical institution or each department, for example, which further improves convenience.
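• As a purely illustrative, non-limiting sketch of this summation, the following Python snippet totals determination fees per operator ID information and per institution ID information from hypothetical rows of the operator management table. The table structure, identifiers, and fee values are all hypothetical.

```python
from collections import defaultdict

# Hypothetical rows of the operator management table: each display device is tied
# to an operator, each operator to an institution, with a per-device determination fee.
rows = [
    {"display_device_id": "DSP-001", "operator_id": "OP-01", "institution_id": "INST-A", "fee": 1200},
    {"display_device_id": "DSP-002", "operator_id": "OP-01", "institution_id": "INST-A", "fee": 800},
    {"display_device_id": "DSP-003", "operator_id": "OP-02", "institution_id": "INST-A", "fee": 500},
]

fees_per_operator = defaultdict(int)
fees_per_institution = defaultdict(int)
for row in rows:
    fees_per_operator[row["operator_id"]] += row["fee"]
    fees_per_institution[row["institution_id"]] += row["fee"]

print(dict(fees_per_operator))      # e.g. {'OP-01': 2000, 'OP-02': 500}
print(dict(fees_per_institution))   # e.g. {'INST-A': 2500}
```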
• That is, a processing device such as the server device 300 is a processing device that is communicably connected via a network to a photographing device configured to photograph an image of a subject's natural orifice as a subject and that includes at least one processor, wherein the at least one processor is configured to perform processing for: determining the possibility of contracting a predetermined disease based on the image photographed by the photographing device; outputting the result of the determination, or information generated based on the result of the determination, to assist in diagnosing the disease; calculating a usage fee to be charged to the operator of the photographing device or the institution to which the operator belongs according to at least one of the number of determinations and the number of outputs; and outputting information indicating the calculated usage fee.
• Note that the processor 312 of the server device 300 may store in advance a determination fee for each disease to be determined, and calculate the usage fee for each determined disease in the calculation of the usage fee described above.
  • an operation log of the photographing device 200 or the number of determinations, information indicating the location of use, and current time information are acquired in the photographing device 200 and stored as operation information in the server device 300.
• The processor 312 can obtain a predicted value of the number of determinations by inputting the institution ID information or the operator ID information and a period into the learned determination-number prediction model. The processor 312 then calculates, from the obtained predicted value of the number of determinations, a predicted value of the determination fee for each institution or each operator in the input period.
  • the processor 312 outputs the obtained predicted value as related information. By doing so, it becomes possible to calculate a predicted value for each institution or each operator and secure the fee in advance as a budget, thereby further improving convenience.
• Regarding usage fees, for example, it is also possible to add up the usage fees for each predetermined period such as daily, weekly, monthly, or yearly and output the sum. Further, for example, if the usage fee has been paid in advance, it is also possible to output the current remaining prepaid fee. Further, for example, in a case where a fixed amount is paid every predetermined period regardless of the number of determinations, it is also possible to output a comparison between the calculated determination fee and the fee in the case of paying each time.
• In addition to the above, based on the operation information exemplified above, various other related information may be output, such as a notification of deterioration of the display, a notification promoting at least one of replacement, inspection, and calibration, and a notification of the timing of maintenance of the photographing device 200.
  • FIG. 9D is a diagram showing a processing flow executed in the server device 300 according to an embodiment of the present disclosure. Specifically, FIG. 9D is a diagram showing a processing flow executed in a process for updating the processing program installed as the SW of the photographing device 200. The processing flow is mainly performed by the processor 312 of the server device 300 reading and executing a processing program stored in the memory 311.
• Each device such as the processing system 1 or the photographing device 200 and the processing program may be subject to regulations or approvals specified by predetermined laws, regulations, government ordinances, guidelines, and the like (for example, the Act on Securing Quality, Efficacy and Safety of Pharmaceuticals and Medical Devices (the so-called Pharmaceuticals and Medical Devices Act), the Federal Food, Drug, and Cosmetic Act, the Medical Device Amendments, the European Medical Device Regulation, the In Vitro Diagnostic Regulation, and the Regulations on the Supervision and Administration of Medical Devices (Order No. 739 of the State Council of China)). For such devices and programs, related documents are created, such as package inserts and published precautions information describing contraindications and prohibitions, the principle of the device, how to use it, and precautions for use, as well as histories of revision instructions, review reports, reexamination reports, and emergency safety information. When these related documents are revised or issued, it is necessary to promptly notify the operator and others of the revision or issuance, so the fact that a revision or issuance has been made is notified as related information.
• Specifically, the processor 312 determines whether revision or issuance information of a related document such as a package insert has been received via the communication interface 313. If such information has been received, the processor 312 generates, as related information, information indicating that a related document such as a package insert has been revised or issued. Further, the display device 100 stores in advance a link for obtaining the related document before revision or issuance, or a recording medium such as a two-dimensional code in which the link is recorded. Therefore, the processor 312 can include in the related information a processing program for updating the content stored in the display device 100, such as the link from which the related document after revision or issuance can be obtained, or a recording medium such as a two-dimensional code in which that link is recorded. The processor 312 then outputs the related information to the display device 100, the photographing device 200, or a combination thereof via the communication interface 313.
• In addition, the processing program installed as the SW of the photographing device 200 may need to be updated due to a revision of a related document such as a package insert. Therefore, a case will be described below in which the processor 312 executes processing related to updating the processing program specified in the related document when the related document is revised.
• The processor 312 determines whether revision information of a related document such as a package insert has been received via the communication interface 313 (S241). If such information has been received, the processor 312 refers to the photographing device management table and reads the SW version information of the currently stored processing program (S242).
• Next, the processor 312 compares the SW version information of the processing program specified by the received revision information with the stored SW version information, and determines whether the processing program needs to be updated. If the two differ, the processor 312 determines that an update is necessary and generates related information prompting an update of the processing program (for example, "The package insert has been revised. Please download the latest SW from the following.") (S233).
  • the processor 312 outputs the information to the display device 100, the photographing device 200, or a combination thereof via the communication interface 313 (S234).
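• As a purely illustrative, non-limiting sketch of this version comparison, the following Python function compares the SW version specified by the revision information with the stored SW version information and, when they differ, returns related information prompting an update. The version strings and message wording are hypothetical.

```python
from typing import Optional

def related_info_for_revision(revised_sw_version: str,
                              stored_sw_version: str) -> Optional[str]:
    """Compare the SW version specified in the revision information with the
    version stored in the photographing device management table and, if they
    differ, generate related information prompting an update."""
    if revised_sw_version == stored_sw_version:
        return None  # no update needed
    return ("The package insert has been revised. "
            f"Please download the latest SW (version {revised_sw_version}) from the following.")

message = related_info_for_revision("2.1.0", "2.0.3")
if message:
    print(message)  # would be output to the display device 100 and/or the photographing device 200
```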
• That is, a processing device such as the server device 300 is a processing device that is communicably connected via a network to a photographing device configured to photograph an image of a subject's natural orifice as a subject and that includes at least one processor, wherein the at least one processor is configured to perform processing for: storing, in a memory, version information of the processing program installed in the photographing device; acquiring version information specified in a notification resulting from a related document created for a predetermined regulation or approval of at least one of the processing device, the photographing device, and the processing program; and, if the acquired version information differs from the version information stored in the memory, outputting information prompting an update to the processing program specified by the acquired version information.
  • the notification is not limited to the revision or issuance of related documents, and other received information may be notified as related information.
  • the processor 312 of the server device 300 receives defect information regarding various components from other server devices etc. via the communication interface 313.
  • the processor 312 then refers to the parts information in the imaging device management table in the memory 311 based on the manufacturing lot included in the received defect information.
• The processor 312 then generates, as related information, a notification prompting at least one of replacement, inspection, and calibration of the corresponding component (part) of the photographing device 200, and outputs the generated related information to the display device 100, the photographing device 200, or a combination thereof via the communication interface 313.
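• As a purely illustrative, non-limiting sketch of matching received defect information against the parts information, the following Python snippet lists the photographing devices and components whose manufacturing lot matches the lot contained in the defect information. The table contents, lot numbers, and device IDs are hypothetical.

```python
# Hypothetical parts information in the photographing device management table:
# photographing device ID -> {component name: manufacturing lot}.
parts_info = {
    "CAM-0001": {"light_source_212": "LOT-2022-03", "battery_218": "LOT-2021-11"},
    "CAM-0002": {"light_source_212": "LOT-2022-04", "battery_218": "LOT-2021-11"},
}

def devices_affected_by_defect(defective_lot: str):
    """Return (device ID, component) pairs whose manufacturing lot matches the
    lot contained in the received defect information."""
    return [(dev_id, part)
            for dev_id, parts in parts_info.items()
            for part, lot in parts.items()
            if lot == defective_lot]

for dev_id, part in devices_affected_by_defect("LOT-2021-11"):
    print(f"Please arrange replacement, inspection, or calibration of {part} in {dev_id}.")
```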
  • FIGS. 10A to 10E are diagrams showing examples of screens displayed on the display device 100 according to an embodiment of the present disclosure.
• FIG. 10A shows a subject list screen displayed on the display device 100, and is an example of a screen displayed when selecting a subject for whom the possibility of contracting a predetermined disease is to be determined, a subject for whom interview information is to be input, or a subject for whom finding information is to be input.
  • FIGS. 10B to 10E are examples of screens on which related information is displayed on the display device 100.
• According to FIG. 10A, a subject list screen is displayed on the display via the output interface 114. This screen includes a list screen 11, on which information on each subject, such as the current status, the subject's name, subject ID information, and attribute information, is displayed as a list, one line per subject.
• In addition, an interview input icon 12 for inputting interview information, a diagnosis icon 13 for inputting finding information and transmitting a determination request, and a new registration icon 14 for newly registering a subject are displayed.
  • the subject list screen shown in FIG. 10A is a screen that is frequently referenced by the user when making a determination using the processing system 1.
• In FIG. 10B, a case is described in which information promoting the purchase of the auxiliary tool 400 is received and output as related information. In this case, the related information display 15 is displayed superimposed on the subject list screen.
• In FIG. 10C, a case is described in which information promoting an update of the processing program is received and output as related information. In this case, the related information display 17 is displayed superimposed on the subject list screen.
• In FIG. 10D, a case is described in which information that assists the operation of the photographing device 200 is received and output as related information. In this case, the related information display 18 is displayed superimposed on the subject list screen.
  • the output method mentioned here is just an example, and of course output may be performed using other methods.
  • a notification may be displayed at the top of the screen, and by clicking on the notification, the screen may be switched and detailed information thereof may be output.
  • the reception may be notified by sound, LED, vibration, or a combination thereof, and the detailed information may be output by opening a predetermined application program.
• In FIG. 10E, a case is described in which information indicating the determination fee is received and output as related information. In this case, the usage fee for each display device 100 that has transmitted a determination request and the total amount of the usage fees for the institution with which these display devices 100 are associated are displayed on the usage fee display screen 19. In addition, a payment icon 20 and a cancel icon 21 are displayed at the bottom of the usage fee display screen 19.
• Note that the related information shown in FIGS. 10B to 10E need not be output only to the display device 100, and may also be output to, for example, the photographing device 200 or another terminal device owned by a pre-registered operator.
• As described above, according to the present embodiment, it is possible to provide a processing device, a processing program, and a processing method that are connected to a photographing device configured to photograph an image of a subject and that can improve the usability of the photographing device.
  • the interview information and finding information are input in advance by the operator or the subject, or are received from an electronic medical record device or the like connected to a wired or wireless network.
  • this information may be obtained from a photographed subject image.
  • the medical interview information and observation information may be written in advance on a paper medium or the like by hand or in a mark sheet format, and may be acquired by reading this using an optical reading device such as a camera or a scanner.
Finding information and interview information associated with a learning subject image are given to the learning subject image as correct labels, and a trained information estimation model is obtained by machine-learning these sets using a neural network. The processor 111 then inputs a subject image into the trained information estimation model, thereby obtaining the desired interview information and attribute information. Examples of such interview information and attribute information include gender, age, the degree of redness of the throat, the degree of swelling of the tonsils, and the presence or absence of white coating. This saves the operator the trouble of inputting the interview information and finding information.
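A minimal inference sketch for such a trained information estimation model is given below. The network architecture, the attribute labels, and the function names are assumptions; the embodiment only states that a neural network is trained on subject images labeled with finding and interview information.

```python
import torch
import torch.nn as nn

# Minimal sketch, assuming a small CNN backbone; the actual architecture
# and label set of the information estimation model are not specified.
class InfoEstimationModel(nn.Module):
    def __init__(self, num_attributes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_attributes)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

# Assumed attribute labels for illustration only.
ATTRIBUTES = ["male", "throat_redness", "tonsil_swelling", "white_coating", "age_over_15"]

def estimate_info(model: nn.Module, subject_image: torch.Tensor) -> dict:
    """Run the information estimation model on one subject image (3xHxW, values 0-1).
    In practice the model weights would be loaded from the trained model."""
    model.eval()
    with torch.no_grad():
        scores = torch.sigmoid(model(subject_image.unsqueeze(0)))[0]
    return {name: float(score) for name, score in zip(ATTRIBUTES, scores)}
```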
In the above embodiment, the related information is generated based on the operation information. However, the related information can also be generated by further taking into consideration information other than the operation information. For example, information about the approximate replacement timing of various components, registered in advance at the time of manufacturing or maintenance of the photographing device 200, may additionally be taken into account. Even if, based on the operation information alone, the timing for outputting the related information has not yet been reached, the related information may be output when that estimated replacement time arrives.
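The following is a minimal sketch of such a decision, assuming a hypothetical lifetime threshold for the light source 212 and a pre-registered estimated replacement date; the actual criteria are not specified in the embodiment.

```python
from datetime import date

# Hypothetical thresholds; the actual criteria are not given in the embodiment.
LIGHT_SOURCE_LIFETIME_HOURS = 1000
ESTIMATED_REPLACEMENT_DATE = date(2025, 4, 1)  # registered at manufacturing/maintenance

def should_output_replacement_notice(operation_info: dict, today: date) -> bool:
    """Decide whether to output related information recommending part replacement."""
    usage_based = operation_info.get("light_source_on_hours", 0) >= LIGHT_SOURCE_LIFETIME_HOURS
    calendar_based = today >= ESTIMATED_REPLACEMENT_DATE
    return usage_based or calendar_based

# Output is triggered by the calendar even though usage is still low.
print(should_output_replacement_notice({"light_source_on_hours": 420}, date(2025, 5, 1)))
```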
Each trained model described in the above embodiments was generated using a neural network or a convolutional neural network. However, the models are not limited to these, and they can also be generated using other machine learning methods such as the nearest neighbor method, decision trees, regression trees, and random forests.
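As a sketch of the alternative just mentioned, the following uses a random forest classifier from scikit-learn on randomly generated placeholder features; the actual feature definitions and training data are not specified in the embodiment.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Placeholder features derived from subject images and interview information
# (their exact definition is not given in the embodiment).
rng = np.random.default_rng(0)
X_train = rng.random((200, 8))        # e.g. redness score, swelling score, age, fever, ...
y_train = rng.integers(0, 2, 200)     # 1 = possibly affected, 0 = not

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

x_new = rng.random((1, 8))
print("probability of being affected:", model.predict_proba(x_new)[0, 1])
```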
In the above embodiment, the server device 300 performs the determination processing and the processing for outputting the related information. However, these various processes can be appropriately distributed to and performed by the display device 100, the photographing device 200, other devices (including cloud server devices and the like), and so on.
Also, in the above embodiment, the place of use was acquired via GPS. However, this is merely one example of acquiring the place of use; for example, it may be acquired by obtaining institution ID information and specifying the place associated with that institution ID information.
The processes and procedures described herein can be implemented not only by what is explicitly described in the embodiments, but also by software, hardware, or a combination thereof. Specifically, the processes and procedures described in this specification can be realized by implementing logic corresponding to the processes in a medium such as an integrated circuit, volatile memory, nonvolatile memory, magnetic disk, or optical storage. Further, the processes and procedures described in this specification can be implemented as computer processing programs and executed by various computers including display devices and server devices.
1 processing system, 100 display device, 200 photographing device, 300 server device, 400 auxiliary tool

Abstract

[Problem] To connect to an imaging device configured to capture an image of a subject and improve the usability of the imaging device. [Solution] Provided is a processing device that is communicably connected, via a network, to an imaging device configured to capture an image whose subject is a natural orifice of a person to be imaged, and that comprises at least one processor, wherein the at least one processor is configured to perform processing for: acquiring operation information, which relates to an operation of the imaging device performed by an operator of the imaging device in order to capture an image with the imaging device; and outputting related information, which relates to the imaging device, on the basis of the acquired operation information.

Description

Processing device, processing program, and processing method
The present disclosure relates to a processing device, a processing program, and a processing method connected to a photographing device configured to photograph an image of a subject.
Conventionally, photographing devices for photographing an image of a living body such as a human, or a part thereof, as a subject have been known. For example, Patent Document 1 describes a medical camera device including: a camera head having a first housing that houses an image sensor; a signal transmission unit including a cable section that is connected to the camera head and transmits at least an image signal from the image sensor, and a tubular second housing through which the cable section is inserted; an annular member that is sandwiched between the first housing and the second housing and has an outer circumferential surface for identifying the type of the camera head; and an annular sealing member fixed by the first housing, the second housing, and the annular member.
JP 2016-214661 A
In view of such conventional technology, an object of the present disclosure is to provide, through various embodiments, a processing device, a processing program, and a processing method that are connected to a photographing device configured to photograph an image of a subject and that can improve the usability of the photographing device.
According to one aspect of the present disclosure, there is provided "a processing device that is communicably connected, via a network, to a photographing device configured to photograph an image whose subject is a natural orifice of a subject, and that includes at least one processor, wherein the at least one processor is configured to execute processing for acquiring operation information related to an operation of the photographing device performed by an operator of the photographing device in order to photograph the image with the photographing device, and outputting related information related to the photographing device based on the acquired operation information."
According to one aspect of the present disclosure, there is provided "a processing program that causes a processing device, communicably connected via a network to a photographing device configured to photograph an image whose subject is a natural orifice of a subject, to function as a processor for acquiring operation information related to an operation of the photographing device performed by an operator of the photographing device in order to photograph the image with the photographing device, and outputting related information related to the photographing device based on the acquired operation information."
According to one aspect of the present disclosure, there is provided "a processing method executed by at least one processor in a processing device that is communicably connected, via a network, to a photographing device configured to photograph an image whose subject is a natural orifice of a subject and that includes the at least one processor, the processing method including a step in which the at least one processor acquires operation information related to an operation of the photographing device performed by an operator of the photographing device in order to photograph the image with the photographing device, and a step of outputting related information related to the photographing device based on the acquired operation information."
According to the present disclosure, it is possible to provide a processing device, a processing program, and a processing method that are connected to a photographing device configured to photograph an image of a subject and that can improve the usability of the photographing device.
Note that the above effects are merely illustrative for convenience of explanation and are not limiting. In addition to or in place of the above effects, any effect described in the present disclosure or any effect obvious to those skilled in the art may be achieved.
FIG. 1 is a diagram showing a usage state of an imaging device 200 according to an embodiment of the present disclosure.
FIG. 2 is a diagram showing a usage state of the imaging device 200 according to an embodiment of the present disclosure.
FIG. 3 is a block diagram showing the configuration of a processing system 1 according to an embodiment of the present disclosure.
FIG. 4 is a block diagram showing the configuration of a display device 100 and a photographing device 200 according to an embodiment of the present disclosure.
FIG. 5 is a block diagram showing the configuration of a server device 300 according to an embodiment of the present disclosure.
FIG. 6A is a diagram conceptually showing an imaging device management table stored in the server device 300 according to an embodiment of the present disclosure.
FIG. 6B is a diagram conceptually showing an operator management table stored in the server device 300 according to an embodiment of the present disclosure.
FIG. 6C is a diagram conceptually showing a subject management table stored in the server device 300 according to an embodiment of the present disclosure.
FIG. 6D is a diagram conceptually showing a related information table stored in the server device 300 according to an embodiment of the present disclosure.
FIG. 7 is a diagram showing a processing sequence executed among the display device 100, the imaging device 200, and the server device 300 according to an embodiment of the present disclosure.
FIG. 8 is a diagram showing a processing flow executed in the imaging device 200 according to an embodiment of the present disclosure.
FIG. 9A is a diagram showing a processing flow executed in the server device 300 according to an embodiment of the present disclosure.
FIG. 9B is a diagram showing a processing flow related to the generation of a trained model according to an embodiment of the present disclosure.
FIG. 9C is a diagram showing a processing flow executed in the server device 300 according to an embodiment of the present disclosure.
FIG. 9D is a diagram showing a processing flow executed in the server device 300 according to an embodiment of the present disclosure.
FIG. 10A is a diagram showing an example of a screen displayed on the display device 100 according to an embodiment of the present disclosure.
FIG. 10B is a diagram showing an example of a screen displayed on the display device 100 according to an embodiment of the present disclosure.
FIG. 10C is a diagram showing an example of a screen displayed on the display device 100 according to an embodiment of the present disclosure.
FIG. 10D is a diagram showing an example of a screen displayed on the display device 100 according to an embodiment of the present disclosure.
FIG. 10E is a diagram showing an example of a screen displayed on the display device 100 according to an embodiment of the present disclosure.
Various embodiments of the present disclosure will be described with reference to the accompanying drawings. Note that common components in the drawings are given the same reference numerals.
<First embodiment>
1. Overview of Processing System 1
The processing system 1 according to the present disclosure is mainly used to photograph the inside of a subject's oral cavity and obtain a subject image. In particular, the processing system 1 is used to photograph the area around the back of the throat, specifically the pharynx. Therefore, in the following, a case in which the processing system 1 according to the present disclosure is used to photograph the pharynx will mainly be described. However, the pharynx is merely an example of a region to be photographed, and the processing system 1 according to the present disclosure can of course also be suitably used for other regions in the oral cavity such as the tonsils and larynx, and for other natural orifices such as the external auditory canal, vagina, rectum, and nasal cavity.
As an example, the processing system 1 according to the present disclosure is used to determine the possibility of being affected by a predetermined disease from a subject image obtained by photographing a subject that includes at least the pharyngeal region of the subject's oral cavity, and to diagnose or assist in diagnosing the predetermined disease. An example of a disease determined by the processing system 1 is influenza. The possibility of influenza infection is usually diagnosed by examining the subject's pharynx and tonsil region and determining the presence or absence of findings such as follicles in the pharyngeal region. By using the processing system 1 to determine the possibility of contracting influenza and outputting the result, such a diagnosis can be made or assisted. Note that determining the possibility of contracting influenza is merely one example; the processing system 1 can also be suitably used simply for photographing a subject. Furthermore, the processing system 1 can be suitably used for determining any disease that produces a difference in intraoral findings when contracted. Such differences in findings are not limited to those discovered by doctors and whose existence is medically known; for example, any difference that can be recognized by a person other than a doctor, or that can be detected by artificial intelligence or image recognition technology, can be suitably applied to the processing system 1. Examples of such diseases include, in addition to influenza, infectious diseases such as streptococcal infection, adenovirus infection, EB virus infection, mycoplasma infection, hand-foot-and-mouth disease, herpangina, and candidiasis; diseases exhibiting vascular or mucosal disorders such as arteriosclerosis, diabetes, and hypertension; and tumors such as tongue cancer, pharyngeal cancer, and oral cancer.
Note that although terms such as "determination" and "diagnosis" of a disease are used in the present disclosure, these do not necessarily mean a definitive determination or diagnosis by a doctor. For example, the present disclosure naturally also covers cases in which the processing system 1 is used by the subject himself or herself, or by an operator other than a doctor, and the determination or diagnosis is made by the display device 100 included in the processing system 1.
Furthermore, in the present disclosure, the subjects to be photographed by the photographing device 200 may include any human being, such as a patient, an examinee, a person to be diagnosed, or a healthy person. In addition, the operator who holds the photographing device 200 and performs the photographing operation is not limited to a medical worker such as a doctor, nurse, or laboratory technician, and may be any person, including the subject himself or herself. The processing system 1 according to the present disclosure is typically assumed to be used in a medical institution. However, the place of use is not limited to this, and it may be used anywhere, such as at the subject's home, school, or workplace.
Furthermore, in the present disclosure, an institution associated with an operator is an institution with which, for example, the operator, the photographing device, the display device, or the server device is associated, and means a company, group, organization, or the like. Typical examples include the medical institution to which a medical worker such as a doctor, nurse, or laboratory technician serving as the operator belongs, as well as its departments, clinical departments, and the like.
Furthermore, in the present disclosure, as described above, the subject only needs to include at least a part of the subject's natural orifice. The disease to be determined may also be any disease as long as a difference appears in the findings at the natural orifice that is the subject. However, in the following, a case will be described in which the subject includes at least a portion of the oral cavity, particularly the pharynx or its vicinity, and the possibility of contracting influenza is determined as the disease.
Furthermore, in the present disclosure, the subject image may be one or more moving images or one or more still images. As an example of the operation, when the power button is pressed, the camera starts capturing a through image (live view), and the captured through image is displayed on the display 203. Thereafter, when the operator presses the photographing button, one or more still images are photographed by the camera and the photographed images are displayed on the display 203. Alternatively, when the subject presses the photographing button, shooting of a moving image is started and the image being captured by the camera during that time is displayed on the display 203; when the photographing button is pressed again, shooting of the moving image ends. In this way, in a series of operations, various images such as through images, still images, and moving images are captured by the camera and displayed on the display, but the subject image does not refer only to a specific one of these images and may include all images captured by the camera.
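The button-driven capture flow described above can be sketched as a small state machine. The class and method names below are illustrative only and do not appear in the embodiment.

```python
from enum import Enum, auto

class Mode(Enum):
    OFF = auto()
    THROUGH_IMAGE = auto()   # live view shown on display 203
    RECORDING_VIDEO = auto()

class CaptureController:
    """Minimal sketch of the button-driven capture flow described above."""
    def __init__(self):
        self.mode = Mode.OFF
        self.captured = []

    def press_power(self):
        self.mode = Mode.THROUGH_IMAGE if self.mode is Mode.OFF else Mode.OFF

    def press_capture(self, video: bool = False):
        if self.mode is Mode.OFF:
            return
        if video:
            # first press starts recording, second press stops it
            self.mode = (Mode.RECORDING_VIDEO if self.mode is Mode.THROUGH_IMAGE
                         else Mode.THROUGH_IMAGE)
        else:
            self.captured.append("still_image")  # placeholder for a frame grab

controller = CaptureController()
controller.press_power()
controller.press_capture()            # take a still image
controller.press_capture(video=True)  # start video
controller.press_capture(video=True)  # stop video
```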
FIG. 1 is a diagram showing a usage state of the imaging device 200 according to an embodiment of the present disclosure. According to FIG. 1, the operator attaches the auxiliary tool 400 to the distal end of the imaging device 200 so as to cover it, and inserts the imaging device 200 together with the auxiliary tool 400 into the oral cavity 712 of the subject 700. Specifically, first, the operator (who may be the subject 700 himself or herself, or a person other than the subject 700) attaches the auxiliary tool 400 to the tip of the imaging device 200 so as to cover it. Then, the operator inserts the imaging device 200 with the auxiliary tool 400 attached into the oral cavity 712. At this time, the tip of the auxiliary tool 400 passes the incisors 711 and is inserted to the vicinity of the soft palate 713; that is, the imaging device 200 is likewise inserted up to the vicinity of the soft palate 713. The tongue 714 is pushed downward by the auxiliary tool 400 (which functions as a tongue depressor), and the movement of the tongue 714 is restricted. This allows the operator to secure a good field of view for the imaging device 200 and to obtain good images of the pharynx 715 located in front of the imaging device 200.
The photographed subject image (typically, an image including the pharynx 715) is transmitted from the photographing device 200 to the server device 300, which is communicably connected via a wired or wireless network. The processor of the server device 300 that has received the subject image executes the processing program stored in its memory, thereby determining the possibility of being affected by a predetermined disease. The result is then transmitted to the display device 100 and output to a display or the like via the output interface of the display device 100.
FIG. 2 is a diagram showing a usage state of the imaging device 200 according to an embodiment of the present disclosure. Specifically, FIG. 2 is a diagram showing a state in which the operator 600 grips the imaging device 200. According to FIG. 2, the imaging device 200 is composed of a main body 201, a grip 202, and a display 203, in that order from the side inserted into the oral cavity. The main body 201 and the grip 202 are formed into a substantially columnar shape with a predetermined length along the insertion direction H into the oral cavity. Further, the display 203 is arranged on the side of the grip 202 opposite to the main body 201 side. Therefore, the photographing device 200 is formed into a generally columnar shape as a whole, and is held by the operator 600 in a manner similar to holding a pencil. That is, since the display panel of the display 203 faces toward the operator 600 when in use, it is possible to easily handle the photographing device 200 while checking the subject image photographed by the photographing device 200 in real time.
Further, when the operator 600 holds the grip 202 in the orientation in which the subject image is displayed upright on the display 203, the photographing button 220 is positioned on the upper surface of the grip. Therefore, when the operator 600 holds the device, the operator 600 can easily press the photographing button 220 with an index finger or the like.
2. Configuration of Processing System 1
FIG. 3 is a schematic diagram of the processing system 1 according to an embodiment of the present disclosure. According to FIG. 3, the processing system 1 includes a display device 100, a photographing device 200, and a server device 300, and these devices are communicably connected via a wired or wireless network. The display device 100 is used to input subject information, interview information, diagnostic information, and the like required for processing in the server device 300, and receives from the server device 300 and outputs the related information related to the photographing device 200 and the results of determining the possibility of being affected by a predetermined disease.
The photographing device 200 is inserted at its tip into the oral cavity of the subject, photographs the inside of the oral cavity, particularly the pharynx, and transmits the photographed subject image to the server device 300 via a wired or wireless network. In connection with that photographing, the photographing device 200 also transmits to the server device 300 operation information regarding operations performed by the operator, or information used to generate the operation information.
The server device 300 receives the operation information transmitted from the photographing device 200, or the information used to generate the operation information, and outputs related information based on the operation information. The server device 300 also receives and processes the subject images photographed by the photographing device 200. Furthermore, the server device 300 determines the possibility of being affected by a predetermined disease based on the received subject image, interview information, and finding information, and transmits the result to the display device 100.
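The following is a minimal, purely illustrative sketch of how the server-side determination might combine an image-based score with interview information; the actual combination logic is not disclosed, and the field names and weights are assumptions.

```python
# Minimal sketch of the server-side determination flow; the weights and
# field names below are illustrative assumptions only.
def determine_possibility(subject_image_score: float, interview_info: dict) -> dict:
    """Combine an image-based score with interview/finding information."""
    score = subject_image_score
    if interview_info.get("fever_over_38c"):
        score = min(1.0, score + 0.1)
    if interview_info.get("contact_with_influenza_patient"):
        score = min(1.0, score + 0.1)
    return {"disease": "influenza", "possibility": round(score, 2)}

result = determine_possibility(0.62, {"fever_over_38c": True})
print(result)  # {'disease': 'influenza', 'possibility': 0.72}
```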
Note that in the present disclosure, the processing device means the display device 100, the server device 300, or a combination thereof. That is, although the case where the server device 300 functions as the processing device is described below, the display device 100 can likewise function as the processing device. Furthermore, in the present disclosure, the storage and processing performed by the processing device may be distributed among other terminal devices, other server devices, and the like. In other words, the processing device is not limited to one composed of a single housing, and includes the display device 100, the photographing device 200, the server device 300, other terminal devices, other server devices, or a combination thereof.
Further, in FIG. 3, only one display device 100 and one photographing device 200 are shown. However, in an institution such as a large hospital, a plurality of display devices 100 or a plurality of imaging devices 200 may of course be managed and operated. Furthermore, the operator of the server device 300 may, for example, collectively manage and operate the display devices 100 and photographing devices 200 used at a plurality of institutions. Therefore, in the present disclosure, the processing system 1 can include one or more display devices 100, photographing devices 200, and server devices 300.
FIG. 4 is a block diagram showing the configurations of the display device 100 and the photographing device 200 according to an embodiment of the present disclosure. Specifically, FIG. 4 shows in detail the configurations of the display device 100 and the photographing device 200 within the processing system 1. According to FIG. 4, the display device 100 includes a processor 111, a memory 112, an input interface 113, an output interface 114, and a communication interface 115, and the photographing device 200 includes a camera 211, a light source 212, a processor 213, a memory 214, an output interface 215, an input interface 210, a communication interface 216, a sensor 217, and a battery 218. These components are electrically connected to one another via control lines and data lines. Note that the display device 100 and the photographing device 200 do not need to include all of the components shown in FIG. 4; some of them may be omitted, and other components may be added. For example, the display device 100 or the photographing device 200 can include a battery or the like for driving each component. Note also that, as described with reference to FIG. 3, the display device 100 and the photographing device 200 only need to be communicably connected through a wired or wireless network, and the two do not need to be configured so as to communicate with each other directly.
2-1. Configuration of Display Device 100
First, in the display device 100, the processor 111 functions as a control unit that controls the other components of the processing system 1 based on a processing program stored in the memory 112. Based on the processing program stored in the memory 112, the processor 111 inputs subject information, interview information, finding information, and the like, and outputs related information received from the server device 300. Specifically, based on the processing program stored in the memory 112, the processor 111 executes processes such as: a process of receiving, via the input interface 113, input of subject information related to a subject by the operator or by the subject himself or herself; a process of transmitting the received subject information to the server device 300 via the communication interface 115; a process of receiving, via the input interface 113, input of the subject's interview information and finding information by the operator or the subject; a process of transmitting the received interview information and finding information to the server device 300 together with the subject information via the communication interface 115; a process of selecting a subject via the input interface 113 and transmitting, via the communication interface 115 to the server device 300, a request to determine the possibility of the selected subject being affected by a predetermined disease; a process of receiving from the server device 300, via the communication interface 115, a determination result indicating the possibility of being affected by a predetermined disease determined based on the subject's subject image and the like, together with the subject image, and outputting the determination result and the subject image via the output interface 114; and a process of receiving related information generated in the server device 300 and outputting the received information via the output interface 114. The processor 111 is mainly composed of one or more CPUs, but may be combined with a GPU, an FPGA, or the like as appropriate.
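As a sketch of the determination request sent by the processor 111, the following assumes an HTTP transport via the requests library and a hypothetical endpoint URL; the embodiment only states that the request is sent to the server device 300 over a network.

```python
import requests  # assumed transport; the embodiment only says "via a network"

SERVER_URL = "https://server.example/api"  # hypothetical endpoint

def send_determination_request(subject_id: str) -> dict:
    """Ask the server device to determine the possibility of a predetermined disease
    for the selected subject (sketch of the processor 111 request process)."""
    response = requests.post(f"{SERVER_URL}/determinations",
                             json={"subject_id": subject_id}, timeout=10)
    response.raise_for_status()
    return response.json()  # e.g. {"possibility": 0.72, "subject_image_url": "..."}
```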
The memory 112 is composed of a RAM, a ROM, a nonvolatile memory, an HDD, and the like, and functions as a storage unit. The memory 112 stores, as a processing program, the instructions for the various controls of the processing system 1 according to the present embodiment, that is, the processing program for the processor 111 to execute the processes described above, from receiving the input of the subject information, interview information, and finding information via the input interface 113 and transmitting them to the server device 300 via the communication interface 115, through transmitting a determination request for a selected subject, to receiving and outputting, via the output interface 114, the determination result, the subject image, and the related information generated in the server device 300. In addition to the processing program, the memory 112 also stores the subject's subject information, subject images, interview information, finding information, related information, and the like.
The input interface 113 functions as an input unit that receives instruction inputs from the operator to the display device 100. Examples of the input interface 113 include physical key buttons such as a "confirm button" for making various selections, a "back/cancel button" for returning to the previous screen or canceling an entered confirmation operation, a cross-key button for moving a pointer or the like output to the output interface 114, an on/off key for turning the power of the display device 100 on and off, and character input key buttons for inputting various characters. Note that a touch panel that is superimposed on the display functioning as the output interface 114 and has an input coordinate system corresponding to the display coordinate system of the display can also be used as the input interface 113. In this case, icons corresponding to the physical keys are displayed on the display, and the operator inputs instructions via the touch panel to select each icon. The method of detecting the subject's instruction input via the touch panel may be any method, such as a capacitive method or a resistive film method. In addition to the above, a mouse, a keyboard, or the like can also be used as the input interface 113. The input interface 113 does not always need to be physically provided in the display device 100, and may be connected via a wired or wireless network as necessary.
The output interface 114 functions as an output unit for outputting information such as the determination results and related information received from the server device 300. An example of the output interface 114 is a display composed of a liquid crystal panel, an organic EL display, a plasma display, or the like. However, the display device 100 itself does not necessarily need to be equipped with a display; for example, an interface for connecting to a display or the like that can be connected to the display device 100 via a wired or wireless network can function as the output interface 114 that outputs display data to that display or the like.
The communication interface 115 functions as a communication unit for transmitting and receiving subject information, interview information, subject images, finding information, related information, and the like to and from the server device 300 connected via a wired or wireless network. Examples of the communication interface 115 include various components such as connectors for wired communication such as USB and SCSI, transmitting/receiving devices for wireless communication such as wireless LAN, Bluetooth (registered trademark), and infrared, and various connection terminals for printed circuit boards and flexible circuit boards.
2-2. Configuration of Photographing Device 200
In the photographing device 200, the camera 211 functions as a photographing unit that detects reflected light from the oral cavity, which is the subject, and generates a subject image. To detect that light, the camera 211 includes, as an example, a CMOS image sensor, and a lens system and a drive system for realizing the desired functions. The image sensor is not limited to a CMOS image sensor, and other sensors such as a CCD image sensor can also be used. Although not specifically illustrated, the camera 211 can have an autofocus function, and is preferably set so that, for example, the focal point in front of the lens is adjusted to a specific region. The camera 211 can also have a zoom function, and is preferably set to photograph at an appropriate magnification according to the size of the pharynx or of influenza follicles.
The light source 212 is driven according to instructions from the processor 213 of the photographing device 200 and functions as a light source unit for irradiating the inside of the oral cavity with light. The light source 212 includes one or more light sources. In the present embodiment, the light source 212 is composed of one or more LEDs, and light having a predetermined frequency band is emitted from each LED toward the oral cavity. The light source 212 uses light having a desired band among the ultraviolet band, the visible band, and the infrared band, or a combination thereof. Note that when the possibility of contracting influenza is determined in the display device 100, it is preferable to use light in the visible band.
The processor 213 functions as a control unit that controls the other components of the photographing device 200 based on a processing program stored in the memory 214. Based on the processing program stored in the memory 214, the processor 213 executes processes such as: a process of receiving, via the communication interface 216, subject information of subjects who have not yet been photographed from the server device 300 communicably connected to the photographing device 200 via a network; a process of selecting, from a list of subjects who have not yet been photographed, the subject information of the subject to be photographed; a process of photographing, with the camera 211, a subject image that includes at least a part of the oral cavity of the selected subject and transmitting it to the server device 300; a process of detecting, with the sensor 217, states of the photographing device that change as a result of operations by the operator; and a process of transmitting, via the communication interface 216 to the server device 300, operation information resulting from the operator's operations on the photographing device 200 or information used to generate the operation information (including information detected by the sensor 217 and operation logs including the history of operations performed on the photographing device 200 by the operator and the history of changes to the settings of the photographing device 200). The processor 213 is mainly composed of one or more CPUs, but may be combined with a GPU, an FPGA, or the like as appropriate.
The memory 214 is composed of a RAM, a ROM, a nonvolatile memory, an HDD, and the like, and functions as a storage unit. The memory 214 stores, as a processing program, the instructions for the various controls of the processing system 1 according to the present embodiment, that is, the processing program for the processor 213 to execute the processes described above, from receiving the subject information of unphotographed subjects from the server device 300 and selecting the subject to be photographed, through photographing a subject image and transmitting it to the server device 300, to detecting with the sensor 217 the states that change as a result of the operator's operations and transmitting to the server device 300, via the communication interface 216, the operation information or the information used to generate it (including information detected by the sensor 217, the history of operations performed by the operator, operation logs, and the like). In addition to the processing program, the memory 214 also stores the subject's subject information, subject images, operation information, and the like.
The output interface 215 functions as an output unit for outputting the subject image photographed by the photographing device 200, the subject information, and the like. An example of the output interface 215 is the display 203, but the output interface 215 is not limited to this and may be composed of another liquid crystal panel, organic EL display, plasma display, or the like. The display 203 need not necessarily be provided; for example, an interface for connecting to a display or the like that can be connected to the display device 100 via a wired or wireless network can also function as the output interface 114 that outputs display data to that display or the like.
The input interface 210 functions as an input unit that receives the subject's instruction inputs to the display device 100 and the photographing device 200. Examples of the input interface 210 include physical key buttons such as a "photographing button" for instructing the photographing device 200 to start and end recording, a "power button" for turning the power of the photographing device 200 on and off, a "confirm button" for making various selections, a "back/cancel button" for returning to the previous screen or canceling an entered confirmation operation, and a cross-key button for moving icons and the like displayed on the output interface 215. Note that these various buttons and keys may be physically provided, or may be displayed as icons on the output interface 215 and made selectable using a touch panel or the like that is superimposed on the output interface 215 and arranged as the input interface 210. The method of detecting the subject's instruction input via the touch panel may be any method, such as a capacitive method or a resistive film method.
The communication interface 216 functions as a communication unit for transmitting and receiving information to and from the server device 300 and/or other devices. Examples of the communication interface 216 include various components such as connectors for wired communication such as USB and SCSI, transmitting/receiving devices for wireless communication such as wireless LAN, Bluetooth (registered trademark), and infrared, and various connection terminals for printed circuit boards and flexible circuit boards.
The sensor 217 functions as a detection unit for detecting the state of each component making up the photographing device 200. Examples of the sensor 217 include a voltage sensor, a current sensor, a temperature sensor, a GPS, an acceleration sensor, or a combination thereof. A detected value detected by the sensor 217 (for example, voltage, current, temperature, position information, or a combination thereof) is transmitted to the server device 300 as operation information or as one of the pieces of information used to generate the operation information.
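A minimal sketch of an operation-information payload built from such detected values is shown below; the field names and the use of JSON as the transport format are assumptions, as the embodiment does not specify them.

```python
from dataclasses import dataclass, asdict
import json, time

@dataclass
class OperationInfoPayload:
    """Illustrative payload sent from the photographing device to the server;
    field names are assumptions, not taken from the embodiment."""
    imaging_device_id: str
    battery_voltage_v: float
    led_current_ma: float
    temperature_c: float
    latitude: float
    longitude: float
    capture_count: int
    timestamp: float

payload = OperationInfoPayload(
    imaging_device_id="d001x1",
    battery_voltage_v=3.9,
    led_current_ma=120.0,
    temperature_c=31.5,
    latitude=35.68, longitude=139.76,
    capture_count=42,
    timestamp=time.time(),
)
print(json.dumps(asdict(payload)))  # body of the message sent to the server device
```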
The battery 218 functions as a power supply unit that supplies power for driving each component of the photographing device 200. The battery 218 is charged by power supplied from a power source while connected to that power source, and starts discharging to drive each component when the photographing device 200 is turned on by pressing the power button or the like.
2-3. Configuration of Server Device 300
FIG. 5 is a block diagram showing the configuration of the server device 300 according to an embodiment of the present disclosure. According to FIG. 5, the server device 300 includes a memory 311, a processor 312, and a communication interface 313. Each of these components is electrically connected to the others via control lines and data lines. Note that the server device 300 does not need to include all of the components shown in FIG. 5; it can be configured with some of them omitted, and other components can also be added. For example, the server device 300 can be configured integrally by connecting it to another server device, or by connecting it to another database device.
The memory 311 is composed of a RAM, a ROM, a nonvolatile memory, an HDD, and the like, and functions as a storage unit. The memory 311 stores, as a processing program, instructions for the various controls of the processing system 1 according to the present embodiment. Specifically, the memory 311 stores the processing program for the processor 312 to execute processes such as: a process of acquiring operation information related to operations of the photographing device 200 performed by the operator in order to photograph an image of a subject with the photographing device 200; a process of outputting related information related to the photographing device 200, based on the acquired operation information, to at least one of the display device 100 and the photographing device 200; a process of receiving, via the communication interface 313, a subject image photographed by the photographing device 200 from the photographing device 200; a process of receiving, via the communication interface 313, interview information and finding information input on the display device 100 from the display device 100 and storing them in the subject management table in association with subject ID information; a process of, upon receiving from the display device 100 via the communication interface 313 a request to determine the possibility of a subject selected on the display device 100 being affected by a predetermined disease, reading the subject image associated with that subject from the image management table and the interview information and finding information associated with that subject from the subject management table, and determining that possibility; and a process of transmitting the determined result to the display device 100 via the communication interface 313. In addition to the processing program, the memory 311 also stores the various kinds of information stored in the imaging device management table (FIG. 6A), the operator management table (FIG. 6B), the subject management table (FIG. 6C), and the related information table (FIG. 6D).
The processor 312 functions as a control unit that controls the other components of the server device 300 based on the processing program stored in the memory 311. Based on that processing program, the processor 312 performs the processing for determining the possibility of being affected by a predetermined disease and the processing for outputting the related information; specifically, it executes the processes listed above for the memory 311, from acquiring the operation information related to the operator's operation of the photographing device 200 and outputting the related information to at least one of the display device 100 and the photographing device 200, through receiving the subject image, interview information, and finding information, to determining the possibility in response to a determination request and transmitting the determined result to the display device 100 via the communication interface 313. The processor 312 is mainly composed of one or more CPUs, but may be combined with a GPU, an FPGA, or the like as appropriate.
The communication interface 313 functions as a communication unit for transmitting and receiving information to and from the display device 100, the photographing device 200, and/or other devices. Examples of the communication interface 313 include connectors for wired communication such as USB and SCSI, transmitting and receiving devices for wireless communication such as wireless LAN, Bluetooth (registered trademark), and infrared, and various connection terminals for printed circuit boards and flexible circuit boards.
3. Information stored in the memory 311 of the server device 300
FIG. 6A is a diagram conceptually showing the photographing device management table stored in the server device 300 according to an embodiment of the present disclosure. The information stored in the photographing device management table is updated and stored as needed as the processing of the processor 312 of the server device 300 progresses.
According to FIG. 6A, the photographing device management table stores operation information, related information, software (SW) version information, parts information, and the like in association with photographing device ID information. The "photographing device ID information" is information unique to each photographing device 200 and is used to identify each photographing device 200; an arbitrary character string or code may be assigned by the server device 300, or information such as a serial number or a product number may be used. The "operation information" is information related to operations performed by the operator to photograph an image with each photographing device 200. Examples of such information include output values from the sensors 217 such as a voltage sensor, a current sensor, and a temperature sensor (either the output values themselves or data obtained by processing them), the number of times the auxiliary tool 400 has been attached, the number of times images have been photographed, the number of times the battery 218 has been charged, the voltage of the battery 218 when fully charged, the accumulated total charging time of the battery 218, the number of times the battery 218 has been overcharged, the number of times the battery 218 has been over-discharged, the number of times the light source 212 has been turned on, the lighting time of the light source 212, the value of the current flowing through the light source 212, the display time of the through image, the usage time of the photographing device 200 (for example, the time from power-on to power-off, or the time from the end of sleep to the start of sleep), the number of presses of each physical key, the place where the photographing device 200 is used, the number of times the possibility of suffering from a predetermined disease has been determined, the communication speed, the communication time, the radio field strength of wireless communication, the operation log of the photographing device 200, or a combination thereof. Such operation information is acquired either by being received from the photographing device 200 or by being generated in the server device 300 based on information, received from the photographing device 200, that is used to generate the operation information.
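By way of a non-limiting illustration, one record of the photographing device management table could be sketched as follows; the field names are hypothetical and merely mirror the items listed above, and are not part of the present disclosure.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class OperationInfo:
        # Examples of operation information; the names are illustrative only.
        voltage_sensor: Optional[float] = None          # output of the voltage sensor connected to the battery 218
        attachment_count: int = 0                       # number of times the auxiliary tool 400 has been attached
        shot_count: int = 0                             # number of subject images photographed
        charge_count: int = 0                           # number of times the battery 218 has been charged
        light_on_count: int = 0                         # number of times the light source 212 has been turned on
        light_on_hours: float = 0.0                     # cumulative lighting time of the light source 212
        key_press_counts: dict = field(default_factory=dict)   # presses per physical key
        determination_count: int = 0                    # determinations of the possibility of disease

    @dataclass
    class DeviceRecord:
        device_id: str                                  # photographing device ID information
        operation_info: OperationInfo
        related_info: list = field(default_factory=list)   # related information already output
        sw_version: str = ""                            # SW version information
        parts_info: dict = field(default_factory=dict)  # serial numbers of major components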
The "related information" is information generated based on the operation information and relates to the photographing device 200. Examples include information on how to use the photographing device 200, information that promotes at least one of replacement, inspection, and calibration of the various components (parts and the like) constituting the photographing device 200, information that promotes the purchase of the auxiliary tool 400, and information on the fee charged for determining whether a subject suffers from a predetermined disease. Specifically, the information on how to use the photographing device 200 includes hints on using the photographing device 200, a manual, ways to improve operation, prompts to charge the battery 218, and the estimated time until charging is complete. The information promoting replacement, inspection, and/or calibration of the various components includes notifications of the quality of, and of the timing and method of replacement, inspection, or calibration of, the battery 218, the LED used in the light source 212, the CMOS sensor used in the camera 211, the various lenses installed in the photographing device 200, the light guide tube for transmitting the light emitted from the light source 212 to the outside, the physical keys, the touch panel, and the software built into the photographing device 200. The information on the determination fee includes notifications of determination fees incurred so far, of determination fees expected to be incurred in the future, and of how to pay the determination fee. The information promoting the purchase of the auxiliary tool 400 includes information indicating the remaining stock of the auxiliary tool 400 and information indicating how to purchase it.
The "SW version information" is information for identifying the version of the processing program currently stored in each photographing device 200, and is updated each time a new version of the processing program is installed. The "parts information" is information for identifying each of the various components (parts and the like) installed in each photographing device 200. Any information may be used as long as it can identify each major component; a typical example is a serial number. The parts information is output together with the related information, and is also used, particularly when only parts belonging to a specific manufacturing lot need to be repaired or replaced, to identify the photographing devices 200 in which those parts are installed.
FIG. 6B is a diagram conceptually showing the operator management table stored in the server device 300 according to an embodiment of the present disclosure. The information stored in the operator management table is updated and stored as needed as the processing of the processor 312 of the server device 300 progresses.
According to FIG. 6B, the operator management table stores institution ID information, operator ID information, display device ID information, photographing device ID information, and the like in association with one another. The "institution ID information" is information unique to the institution to which each operator belongs and is used to identify each institution; for example, when the operator is a medical worker, a code or name identifying the medical institution, department, or clinical division to which the medical worker belongs is used. The "operator ID information" is information unique to each operator and is used to identify each operator. The "display device ID information" is information unique to each display device 100 and is used to identify each display device 100; based on the display device ID information, it is possible to identify the display device 100 held by each institution or each operator. The "photographing device ID information" is information unique to each photographing device 200 and is used to identify each photographing device 200; based on the photographing device ID information, it is possible to identify the photographing device 200 managed by each institution or each operator.
FIG. 6C is a diagram conceptually showing the subject management table stored in the server device 300 according to an embodiment of the present disclosure. The information stored in the subject management table is updated and stored as needed as the processing of the processor 312 of the server device 300 progresses.
According to FIG. 6C, the subject management table stores interview information, finding information, subject images, determination result information, and the like in association with subject ID information. The "subject ID information" is information unique to each subject and is used to identify each subject; it is generated each time a new subject is registered by the operator or by the subject himself or herself. The "interview information" is information entered by, for example, the operator or the subject, such as the subject's medical history and symptoms, and is referred to by a doctor or the like when making a diagnosis. The "finding information" is information entered by an operator such as a doctor, and indicates departures from the normal condition obtained through examinations such as visual inspection, interview, palpation, and auscultation, as well as tests that assist diagnosis. The "subject image" is the image data itself, or information indicating the storage location, of the subject image used to determine the possibility of suffering from a predetermined disease. The "determination result information" is information indicating the result of determining, based on the determination image, the possibility of suffering from a predetermined disease such as influenza. One example of such determination result information is a positivity rate for influenza; however, it is not limited to a positivity rate, and any information indicating the possibility, such as information specifying whether the result is positive or negative, may be used. Further, the determination result need not be a specific numerical value, and may take any form, such as a classification according to the level of the positivity rate or a classification indicating positive or negative.
FIG. 6D is a diagram conceptually showing the related information table stored in the server device 300 according to an embodiment of the present disclosure. The information stored in the related information table is updated and stored each time an entry is added by the processing of the processor 312 of the server device 300.
According to FIG. 6D, the related information table stores, for each type of related information to be output, the related information to be output in association with the operation information. The types of related information shown in FIG. 6D are merely examples, and they may of course be added to or reduced as appropriate depending on the related information to be output. First, referring to "information promoting the purchase of the auxiliary tool 400" as a type of related information: while the "number of attachments", which is an item of operation information, is from 0 to less than 100, nothing is output; when it reaches 100 and 150, "The stock of auxiliary tools is running low." is output; from the 181st to the 200th attachment, "The stock of auxiliary tools is running low." is output each time the operation information is acquired; and from the 201st attachment onward, "Please purchase auxiliary tools here." is output. Next, referring to "information on how to use the photographing device 200" as a type of related information: when the output value of the "voltage sensor", which is an item of operation information, is at least X1 and less than X2, it is regarded as a normal value and nothing is output; when it is at least X2 and less than X3, "The battery is running low." is output; and when it is at least X3 and less than X4, "Please charge the battery." is output. As noted above, various other combinations of acquired operation information and output related information exist; details will be described later.
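A minimal sketch of how the two example rows of FIG. 6D might be evaluated is shown below; the messages restate the examples in the text, and X1 through X4 remain placeholders because the document does not give concrete values.

    def related_info_for_attachment_count(count, on_acquisition=True):
        """Row 'promote purchase of the auxiliary tool 400' of FIG. 6D."""
        if count in (100, 150):
            return "The stock of auxiliary tools is running low."
        if 181 <= count <= 200 and on_acquisition:
            return "The stock of auxiliary tools is running low."
        if count >= 201:
            return "Please purchase auxiliary tools here."
        return None  # from 0 to less than 100 attachments: nothing is output

    def related_info_for_voltage(value, x1, x2, x3, x4):
        """Row 'how to use the photographing device 200' of FIG. 6D."""
        if x1 <= value < x2:
            return None  # normal value: nothing is output
        if x2 <= value < x3:
            return "The battery is running low."
        if x3 <= value < x4:
            return "Please charge the battery."
        return None  # outside the ranges given in the example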
4. Processing sequence executed by the processing system 1
FIG. 7 is a diagram showing the processing sequence executed among the display device 100, the photographing device 200, and the server device 300 according to an embodiment of the present disclosure. Of this sequence, S11 to S21 show processing that outputs related information mainly based on information received from the photographing device 200, and S41 to S46 show processing that outputs related information mainly based on information received from the display device 100. Although the following describes the case where a single display device 100 and a single photographing device 200 are connected, the processing can of course be performed in the same way when a plurality of display devices 100 or a plurality of photographing devices 200 are connected.
According to FIG. 7, the photographing device 200 is powered on by pressing the power button or the like, and the photographing device 200 is activated (S11). The photographing device 200 then selects, from the subject information of not-yet-photographed subjects received from the server device 300, the subject information of the subject to be photographed, based on the operator's instruction input received via the input interface 210 (S12). Next, the photographing device 200 determines whether the auxiliary tool 400 is attached, and if it is not yet attached, outputs an attachment display via the output interface 215 to prompt attachment (S13). This display is merely an example; attachment may also be prompted by sound, blinking light, vibration, or the like. When it is detected that the auxiliary tool 400 has been attached to the photographing device 200 (S14), the photographing device 200 photographs a subject image based on the operator's instruction input received via the input interface 210 (S15). Although attachment of the auxiliary tool 400 is detected here, this process itself may be skipped. Alternatively, instead of detecting attachment of the auxiliary tool 400, a confirmation display by which the operator confirms attachment of the auxiliary tool 400 may be output via the output interface 215, and a predetermined operation input on the confirmation display (for example, a tap operation) may be accepted so that the operator can indicate that the auxiliary tool 400 has been attached. Attachment of the auxiliary tool 400 may also be detected, for example, by detecting with the camera 211 an image pattern specific to the auxiliary tool 400, or by detecting that a pre-installed switch is pressed when the auxiliary tool 400 is attached. These methods may also be used to detect not only attachment but also removal of the auxiliary tool 400.
When the subject image has been photographed, the photographing device 200 transmits the photographed subject image (T11), together with the subject ID information of the first subject, to the server device 300 via the communication interface 216. Upon receiving the subject image via the communication interface 313, the server device 300 stores the received subject image in the subject management table in association with the subject ID information received together with it.
In the above series of processes, the photographing device 200 also acquires various operation information or information used to generate operation information. After the photographing device 200 is activated in S11, the voltage of the battery 218 is detected at predetermined intervals by the voltage sensor connected to the battery 218. The number of presses of each physical key and the number of touch panel operations are counted when the power button is pressed in S11, when the instruction input for selecting the subject is made in S12, when the instruction input for photographing is made in S15, and so on. In S15, the light source 212 is turned on for photographing, so the number of times the light source 212 has been turned on is counted and its lighting time is measured. Also, when the light source 212 is turned on in S15, the value of the current flowing through the light source 212 is detected using the current sensor connected to the light source 212. Further, when attachment of the auxiliary tool 400 is detected in S14, the number of times the auxiliary tool 400 has been attached is counted. Although examples are given here, other operation information or information used to generate operation information may of course be acquired; details of the acquired operation information and of the information used to generate it will be described later.
When the photographing device 200 reads out the operation information acquired as described above, or the information used to generate the operation information, it transmits the read information (T12), together with the photographing device ID information, to the server device 300 (S17). The server device 300 refers to the operation information in the photographing device management table based on the received photographing device ID information, and updates and stores the operation information or the information used to generate it (S18). The server device 300 then refers to the related information table based on the updated and stored operation information, and generates related information by identifying the related information whose conditions set for the operation information are satisfied (S19). When the related information is generated, the server device 300 stores the generated related information in the photographing device management table and outputs it to the display device 100 and the photographing device 200 (T13). The display device 100 and the photographing device 200 that have received the related information output it to their displays or the like via the output interface 114 and the output interface 215, respectively (S20 and S21).
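A rough, non-limiting sketch of the server-side handling of S18 and S19 follows, under the assumption that the tables are held in memory as simple Python structures and that each row of the related information table is a (item, condition, message) tuple; none of these names appear in the present disclosure.

    def handle_operation_report(device_table, related_table, device_id, report):
        """S18: update the operation information of the reporting device;
        S19: collect the related information whose conditions are now satisfied."""
        record = device_table.setdefault(device_id, {"operation_info": {}, "related_info": []})
        record["operation_info"].update(report)            # update the photographing device management table

        generated = []
        for item, condition, message in related_table:     # related information table (FIG. 6D)
            value = record["operation_info"].get(item)
            if value is not None and condition(value):     # condition set for the operation information
                generated.append(message)

        record["related_info"].extend(generated)           # store the generated related information
        return generated                                   # to be output to the display device 100 and the photographing device 200 (T13)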
Next, according to FIG. 7, the display device 100 accepts, on a screen displaying a list of subject information, the selection of the subject information of the subject to be diagnosed (S41). At this time, although not specifically illustrated, the display device 100 generates the interview information and finding information of the subject based on input by the subject or the operator, and stores them in advance in the subject management table of the server device 300. The display device 100 then refers to the subject management table, reads out the subject ID information of the subject, and transmits it to the server device 300 together with a request to determine the possibility of suffering from a predetermined disease (T41). Upon receiving the determination request, the server device 300 refers to the subject images in the subject management table based on the subject ID information received together with the request, and reads out the subject image, interview information, and finding information associated with that subject ID information. The server device 300 then determines the possibility of suffering from the predetermined disease based on the read subject image, and stores the determination result in the subject management table (S42). The server device 300 transmits the stored determination result (T42) to the display device 100. The display device 100 that has received the determination result displays it on its display via the output interface 114 (S43).
The server device 300 also acquires various operation information during this series of processes. For example, when the possibility of suffering from a predetermined disease is determined in S42, the number of determinations in the operation information of the photographing device management table is updated and stored (S44). Thereafter, as in S19 to S21, the server device 300 refers to the related information table based on the updated and stored operation information, and generates related information by identifying the related information whose conditions set for the operation information are satisfied (S45). When the related information is generated, the server device 300 stores the generated related information in the photographing device management table and outputs it to the display device 100 and the photographing device 200 (T23). The display device 100 and the photographing device 200 that have received the related information output it to their displays or the like via the output interface 114 and the output interface 215, respectively (S46 and S47).
In the example of FIG. 7, the process in which the photographing device 200 reads out the operation information and transmits it to the server device 300 is executed after the subject image has been transmitted. However, the transmission may be performed at any timing, such as each time the operation information or the information used to generate it is acquired by the photographing device 200, at predetermined intervals, at the timing of communicating with the server device 300 to transmit other information, or a combination thereof. Also, although the generated related information is output to both the display device 100 and the photographing device 200, it may be output to only one of them. Because the display device 100 is used whenever interview information and finding information are entered, it is used frequently by the operator; it is therefore desirable to output the related information at least to the display device 100. The method of outputting the related information on the display device 100 and the photographing device 200 is not limited to display on a screen, and may be any method such as sound, lighting of an LED or the like, vibration, or a combination thereof. The series of processing sequences then ends.
5. Process flow executed by the photographing device 200
FIG. 8 is a diagram showing the process flow executed in the photographing device 200 according to an embodiment of the present disclosure. Specifically, FIG. 8 shows the process flow executed in the photographing processing of S11 to S17 in FIG. 7. This process flow is performed mainly by the processor 213 of the photographing device 200 reading and executing the processing program stored in the memory 214. The following process flow gives examples of the information acquired as operation information or as information used to generate operation information; it is of course not limited to the exemplified information, and other information may also be acquired.
According to FIG. 8, the photographing device 200 is activated when a press of the power button or the like is detected (S111). At this time, the processor 213 stores, for example, the following information in the memory 214 as operation information or information used to generate operation information.
・The number of presses of the power button, counted in response to the press of the power button
・The detected value from the voltage sensor connected to the battery 218
・The current time obtained by referring to the timer (the start time of use of the photographing device 200)
・The place of use of the photographing device 200, obtained by referring to GPS
・The communication speed of the communication interface 216
・The radio field strength when performing wireless communication via the communication interface 216
・The operation log
・The detected value from the acceleration sensor
・The detected value from a temperature sensor installed at a heat-generating location of the photographing device 200 (for example, around the light source 212 or on the control board on which the processor 213 and the like are mounted)
Even where not specifically exemplified in the following processing, items such as the detected value from the voltage sensor connected to the battery 218, the communication speed of the communication interface 216, the radio field strength when performing wireless communication via the communication interface 216, the operation log, and the detected value from the acceleration sensor are acquired at appropriate times and stored in the memory 214.
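A minimal sketch of how the photographing device 200 might accumulate such entries in the memory 214 is given below; the sensor-reading callbacks are hypothetical placeholders, since the document does not specify any sensor API.

    import time

    operation_log = []  # entries held in the memory 214 until they are read out in S116

    def record(item, value):
        """Append one piece of operation information (or information used to generate it)."""
        operation_log.append({"time": time.time(), "item": item, "value": value})

    def sample_periodic_sensors(read_battery_voltage, read_rssi, read_acceleration):
        """Called at predetermined intervals; the three reader callbacks are placeholders."""
        record("battery_voltage", read_battery_voltage())    # voltage sensor connected to the battery 218
        record("radio_field_strength", read_rssi())           # wireless communication via the communication interface 216
        record("acceleration", read_acceleration())           # acceleration sensor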
Next, upon receiving from the server device 300 via the communication interface 216 the subject information of subjects whose subject images have not yet been photographed, the processor 213 outputs the subject information of those not-yet-photographed subjects as a list on the display via the output interface 215. The processor 213 then accepts, via the input interface 210, the selection from the list of the subject information of the subject to be photographed (S112). At this time, the processor 213 stores, for example, the following information in the memory 214 as operation information or information used to generate operation information.
・The number of touch panel operations, counted in response to the selection operation via the touch panel
・The operation log
Next, the processor 213 activates the camera 211, determines whether the auxiliary tool 400 is normally attached to the photographing device 200, and, if it is determined not to be attached, outputs an attachment display prompting attachment of the auxiliary tool 400 (S113). The processor 213 then determines at a predetermined cycle whether the auxiliary tool 400 is attached while the attachment display screen is output, and proceeds to the next process when normal attachment is determined. This determination may be made by any processing method; examples include a processing method that detects that a predetermined switch is turned on by attachment of the auxiliary tool 400, a processing method that detects the presence or absence of a pattern specific to images photographed with the auxiliary tool 400 attached, or a combination thereof. At this time, the processor 213 stores, for example, the following information in the memory 214 as operation information or information used to generate operation information.
・Attachment of the auxiliary tool 400, used to count the number of times the auxiliary tool 400 has been attached
・The current time obtained by referring to the timer (the display start time of the through image)
・The operation log
Next, when it is determined that the auxiliary tool 400 has been normally attached, the processor 213 starts photographing with the camera 211, with the inside of the oral cavity including the pharynx as the subject (S114). Specifically, photographing is performed by turning on the light source 212 to irradiate the subject with light and detecting with the camera 211 the light reflected from the subject, such as the pharynx. At this time, the processor 213 stores, for example, the following information in the memory 214 as operation information or information used to generate operation information.
・The current time obtained by referring to the timer (the lighting start time of the light source 212)
・The current time obtained by referring to the timer (the display end time of the through image)
・Information that the light source 212 has been turned on, used to count the number of times the light source 212 has been turned on
・The detected value from the current sensor connected to the light source 212
・The detected value from the temperature sensor installed around the light source 212
・The detected value from the current sensor installed around the battery 218
・The number of presses of the shooting button, counted in response to the press of the shooting button
・The number of times a subject image has been photographed, counted in response to the press of the shooting button
・The detected value from the temperature sensor installed around the light guide tube
・The operation log
Next, when a subject image has been photographed as described above, the processor 213 outputs the photographed subject image to the display via the output interface 215. When the processor 213 accepts the operator's confirmation operation via the input interface 210, it transmits the photographed subject image, together with the subject ID information, to the server device 300 via the communication interface 216 (S115). If an instruction for re-photographing is accepted instead of the confirmation operation, the process returns to S114. At this time, the processor 213 stores, for example, the following information in the memory 214 as operation information or information used to generate operation information.
・The current time obtained by referring to the timer (the lighting end time of the light source 212)
・The number of touch panel operations, counted in response to the confirmation operation via the touch panel
・The operation log
A rechargeable secondary battery is used as the battery 218 of the photographing device 200. Therefore, as one example, by connecting the photographing device 200 to a charging device (not shown) for charging it, the photographing-related processing described above is suspended and charging of the battery 218 of the photographing device 200 can be started. Accordingly, when the processor 213 detects connection to the charging device, it performs control so that charging of the battery 218 is started. At this time, or while the battery is being charged, the processor 213 stores, for example, the following information in the memory 214 as operation information or information used to generate operation information.
・Detection of connection to the charging device
・The current time obtained by referring to the timer (the end time of use of the photographing device 200)
・The current time obtained by referring to the timer (the charging start time and charging end time of the battery 218)
・The operation log
Next, the processor 213 reads from the memory 214 each item of operation information, or of information used to generate operation information, stored as described above (S116), and transmits the read information, together with the photographing device ID information of the photographing device 200, to the server device 300 via the communication interface 216 (S117). Although not specifically illustrated, the transmitted operation information, or information used to generate it, is processed by the processor 312 of the server device 300 and stored as operation information in the photographing device management table in association with the photographing device ID information. In the example of FIG. 8, the processing for reading and transmitting this information is performed at this timing, but it may of course be performed at other timings, for example each time the information is acquired, at predetermined time intervals, at the timing of communicating with the server device 300 to transmit other information, or a combination thereof. This completes the process flow.
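As a non-limiting sketch of S116 and S117, the accumulated entries could be bundled with the photographing device ID information as follows; the "send" callable merely stands in for the outgoing side of the communication interface 216 and is not defined in the present disclosure.

    import json

    def flush_operation_log(device_id, operation_log, send):
        """S116/S117: read the accumulated entries and send them with the
        photographing device ID information to the server device 300."""
        payload = {
            "photographing_device_id": device_id,    # photographing device ID information
            "operation_info": operation_log,         # operation information / info used to generate it
        }
        send(json.dumps(payload))                    # transmit via the communication interface 216
        operation_log.clear()                        # entries have been handed over to the server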
6. Process flow executed by the server device 300
6-1. Determination process
FIG. 9A is a diagram showing a process flow executed in the server device 300 according to an embodiment of the present disclosure. Specifically, FIG. 9A shows the process flow executed in the determination processing of S41 to S44 in FIG. 7. This process flow is performed mainly by the processor 312 of the server device 300 reading and executing the processing program stored in the memory 311. The following process flow gives examples of operation information and related information; it is of course not limited to the exemplified information, and other information may also be used.
According to FIG. 9A, the processor 312 receives from the display device 100, via the communication interface 313, a request to determine the possibility of suffering from a predetermined disease, together with the subject ID information of the subject selected for diagnosis (S211). The processor 312 has also stored in advance, in the subject management table, the interview information and finding information entered on the display device 100 by the subject or the operator, in association with the subject ID information. Therefore, based on the received subject ID information, the processor 312 refers to the interview information and finding information in the subject management table and reads out the interview information and finding information associated with the subject ID information of that subject (S212 and S213). Likewise, based on the received subject ID information, the processor 312 refers to the subject images in the subject management table and reads out the subject image associated with the subject ID information of that subject (S214).
Next, the processor 312 executes the process of determining the possibility of suffering from the predetermined disease, using the read interview information, finding information, and subject image (S215).
One example of this determination process is to input this information into a trained determination model as described below and obtain the determination. However, the process is of course not limited to this method; any processing method may be adopted, such as a processing method that makes the determination through image analysis processing based on the degree of matching with images showing a diseased state.
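A minimal sketch of S215 under the assumption that a trained determination model is available as a callable returning a positivity rate is shown below; the feature keys and the 0.5 cut-off are illustrative only and are not specified in the present disclosure.

    def determine_disease_possibility(trained_model, subject_image, interview_info, finding_info):
        """S215: feed the subject image together with the interview and finding
        information into the trained determination model and return the result."""
        inputs = {
            "image": subject_image,        # pharynx image read in S214
            "interview": interview_info,   # read in S212
            "findings": finding_info,      # read in S213
        }
        positivity_rate = trained_model(inputs)   # e.g. probability of influenza
        # The result may also be expressed as a classification instead of a number.
        return {"positivity_rate": positivity_rate,
                "label": "positive" if positivity_rate >= 0.5 else "negative"}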
FIG. 9B is a diagram showing a process flow for generating a trained model according to an embodiment of the present disclosure. Specifically, FIG. 9B shows the process flow for generating the trained determination model used in the determination process of S215 in FIG. 9A. This process flow may be executed by the processor 312 of the server device 300, or by a processor of another device.
According to FIG. 9B, the processor 312 executes a step of acquiring a subject image that includes at least a part of the pharynx (S311). The processor 312 also executes a step of acquiring the interview information and finding information stored in advance in association with the subject ID information of the subject appearing in the subject image (S311). Next, the processor 312 executes a processing step of assigning, to the subject appearing in the subject image, a correct label given in advance based on the results of a rapid influenza test by immunochromatography, a PCR test, a virus isolation culture test, or the like (S312). The processor 312 then executes a step of storing the assigned correct label information as determination result information in association with the subject image, the interview information, and the finding information (S313). Although the subject image itself is used here, feature quantities obtained from the determination image may be used instead.
When the subject images, the interview information and finding information, and the correct label information associated with them have been obtained, the processor executes a step of performing machine learning of the determination pattern for suffering from the disease using them (S314). As one example, this machine learning is performed by giving these sets of information to a neural network composed of combined neurons, and repeating learning while adjusting the parameters of each neuron so that the output of the neural network matches the correct label information. A step of acquiring the trained determination model is then executed (S315). The acquired trained determination model may be stored in the memory 311 of the server device 300 or in another device connected to the server device 300 via a wired or wireless network.
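Purely to illustrate the kind of parameter-adjustment loop described in S314 and S315, the following sketch trains a logistic-regression stand-in (not the neural network of the disclosure) on fixed-length feature vectors derived from the subject image, interview information, and finding information; all names, the learning rate, and the feature encoding are hypothetical.

    import math, random

    def train_determination_model(samples, labels, epochs=100, lr=0.1):
        """S314: adjust parameters so that the model output approaches the correct labels.
        'samples' are feature vectors; 'labels' are 1 (positive) / 0 (negative) correct labels."""
        n_features = len(samples[0])
        weights = [random.uniform(-0.01, 0.01) for _ in range(n_features)]
        bias = 0.0
        for _ in range(epochs):
            for x, y in zip(samples, labels):
                z = sum(w * xi for w, xi in zip(weights, x)) + bias
                pred = 1.0 / (1.0 + math.exp(-z))        # model output (positivity rate)
                error = pred - y                          # difference from the correct label
                weights = [w - lr * error * xi for w, xi in zip(weights, x)]
                bias -= lr * error
        # S315: return the trained determination model as a callable
        return lambda x: 1.0 / (1.0 + math.exp(-(sum(w * xi for w, xi in zip(weights, x)) + bias)))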
Next, returning to FIG. 9A, the processor 312 stores the result obtained by the determination process in the subject management table in association with the subject ID information, and transmits the stored determination result and the subject image used for the determination to the display device 100 (S216). At this time, the processor 312 updates and stores, for example, the following information as operation information in the photographing device management table of the memory 311 (S217).
・The number of times the possibility of suffering from the predetermined disease has been determined
・The operation log
6-2. Related information output process
FIG. 9C is a diagram showing a process flow executed in the server device 300 according to an embodiment of the present disclosure. Specifically, FIG. 9C shows the process flow executed in the related information output processing of S19 to S21 and S45 to S47 in FIG. 7. This process flow is performed mainly by the processor 312 of the server device 300 reading and executing the processing program stored in the memory 311. The following process flow gives examples of operation information and related information; it is of course not limited to the exemplified information, and other information may also be used.
According to FIG. 9C, the processor 312 determines whether operation information, or information used to generate operation information, has been received from the photographing device 200 via the communication interface 313 (S231). When information used to generate operation information has been received, the processor 312 generates the operation information based on that information (for example, when the lighting start time and lighting end time of the light source 212 have been received, it calculates the lighting time of the light source 212 as operation information), thereby acquiring the operation information.
Next, the processor 312 updates and stores the operation information received in S231, the operation information generated in S231, and the operation information acquired in S217 of FIG. 9A in the photographing device management table of the memory 311 in association with the photographing device ID information (S232). The processor 312 then refers to the related information table, reads out the related information whose conditions associated with each item of operation information are satisfied, and generates the related information (S233). The processor 312 outputs the read related information to the display device 100 identified by the display device ID information associated with the photographing device ID information, and to the photographing device 200 identified by the photographing device ID information (S234).
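The derivation step of S231 can be sketched as follows for the example given in the text; the key names are hypothetical and only the lighting-time calculation is taken from the description above.

    def derive_operation_info(received):
        """S231: when information used to generate operation information is received,
        compute the operation information itself; here, the lighting time of the
        light source 212 is derived from its lighting start and end times."""
        derived = dict(received)
        if "light_on_start" in received and "light_on_end" in received:
            derived["light_on_seconds"] = received["light_on_end"] - received["light_on_start"]
        return derived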
An example of the correspondence between operation information, or information used to generate it, and related information is given below. The conditions, such as the numerical values described below, and the information to be output are merely examples, and other conditions and information may of course be set.
[Information promoting the purchase of the auxiliary tool 400]
・Operation information or information used to generate operation information: Information indicating that the auxiliary tool 400 has been attached is acquired by the photographing device 200, and the number of attachments of the auxiliary tool 400 is stored as operation information after processing by the processor 312 in the server device 300. The information indicating that the auxiliary tool 400 has been attached is generated based on the determination in S113 of FIG. 8 that the auxiliary tool 400 is normally attached, an operation input by the operator confirming attachment of the auxiliary tool 400, the press of the shooting button in S114 of FIG. 8, the acceptance of the confirmation operation in S115 of FIG. 8, the number of determinations of the possibility of suffering from the predetermined disease made in S215 of FIG. 9A, or a combination thereof. The attachment count is reset when auxiliary tools 400 are purchased and, for example, the user enters a reset operation, after which counting starts again from 1.
・Related information: Based on the attachment count stored as operation information, nothing is output while the "number of attachments" is from 0 to less than 100; when it reaches 100 and 150, "The stock of auxiliary tools is running low." is output; from the 181st to the 200th attachment, "The stock of auxiliary tools is running low." is output each time the operation information is acquired; and from the 201st attachment onward, "Please purchase auxiliary tools here." is output. The related information output from the 201st attachment onward displays a link to the web page from which the auxiliary tool 400 can be purchased, contact information for the supplier, and the like.
[Information promoting replacement or the like of the physical keys]
・Operation information or information used to generate operation information: Information indicating that the power button has been pressed and information indicating that the shooting button has been pressed are acquired by the photographing device 200, and the number of presses of each physical key is stored as operation information after processing by the processor 312 in the server device 300. The press count is reset when the physical key is replaced or the like and, for example, the user enters a reset operation, after which counting starts again from 1.
・Related information: Based on the number of presses of each physical key stored as operation information, "The physical key may have deteriorated." is output each time a predetermined number of presses such as t1 (for example, 10,000) or t2 (for example, 20,000) is reached, and from t3 presses (for example, 30,000) onward, "Please replace the physical key. For replacement, please contact ○○." is output; in this way, deterioration of the physical key is notified, and information promoting at least one of replacement, inspection, and calibration is output.
[Information promoting replacement or the like of the touch panel]
・Operation information or information used to generate operation information: Information indicating that a tap, drag, or swipe operation has been performed on the touch panel for a selection or confirmation operation is acquired via the input interface 210 of the photographing device 200, and the number of touch panel operations is stored as operation information after processing by the processor 312 in the server device 300. The operation count is reset when the touch panel is replaced or the like and, for example, the user enters a reset operation, after which counting starts again from 1.
・Related information: Based on the number of touch panel operations stored as operation information, "The touch panel sensor may have deteriorated." is output each time a predetermined number of operations such as t4 (for example, 50,000) or t5 (for example, 100,000) is reached, and from t6 operations (for example, 300,000) onward, "Please replace the touch panel. For replacement, please contact ○○." is output; in this way, deterioration of the touch panel is notified, and information promoting at least one of replacement, inspection, and calibration is output.
[Information promoting replacement of the light source 212, etc.]
・Operation information, or information used to generate it: from the current time information obtained by referring to the timer, the lighting start time and lighting end time of the light source 212 are acquired in the photographing device 200 and processed by the processor 312 in the server device 300, and the lighting time of the light source 212 is stored as operation information. In addition, the output of an ON signal to the light source 212 is acquired in the photographing device 200 and processed by the processor 312 in the server device 300, and the number of times the light source 212 has been lit is stored as operation information. Further, a detected value from a current sensor connected to the light source 212 of the photographing device 200 is acquired in the photographing device 200 and stored as operation information in the server device 300. Further, a detected value from a temperature sensor installed around the light source 212 is acquired in the photographing device 200 and stored as operation information in the server device 300. Note that the lighting time and the lighting count are reset when the light source 212 is replaced or the like and, for example, the user inputs a reset operation, and counting starts again from 1.
・Related information: based on the lighting time stored as operation information, the message "The LED of the light source 212 may have deteriorated." is output each time the cumulative lighting time reaches a predetermined time such as s1 hours (for example, 1,000 hours) or s2 hours (for example, 2,000 hours), and from s3 hours (for example, 10,000 hours) onward the message "Please replace the LED of the light source 212. For replacement, please contact ○○." is output; in this way, deterioration of the light source 212 is notified, and information promoting at least one of replacement, inspection and calibration is output. Likewise, based on the lighting count stored as operation information, the message "The LED of the light source 212 may have deteriorated." is output each time a predetermined count such as the t7-th lighting (for example, the 10,000th) or the t8-th lighting (for example, the 20,000th) is reached, and from the t9-th lighting (for example, the 100,000th) onward the message "Please replace the LED of the light source 212. For replacement, please contact ○○." is output, so that deterioration of the light source is notified and information promoting at least one of replacement, inspection and calibration is output. Further, the detected value from the current sensor stored as operation information is compared with a predetermined threshold (for example, 200 mA; there may also be a plurality of thresholds such as an upper limit and a lower limit), and when the value exceeds or falls below the threshold it is determined that an abnormal current is flowing, and a message such as "Please inspect the LED of the light source 212." or "Please replace the LED of the light source 212. For replacement, please contact ○○." is output, so that deterioration of the LED of the light source 212 is notified and information promoting at least one of replacement, inspection and calibration is output. Further, the detected value of the temperature sensor stored as operation information is compared with a predetermined threshold (for example, 60 °C), and when the threshold is exceeded, deterioration of the LED of the light source 212 is notified, or information promoting at least one of replacement, inspection and calibration is output.
Furthermore, the lighting time, lighting count, detected values from the current sensor, temperature and other information stored as operation information above may be stored in chronological order as history data, and deterioration of the LED of the light source 212, or a predicted time of deterioration, may be notified based on a comparison with the past history data. Such a notification can be generated by determining deterioration, or predicting the time of deterioration, using a trained prediction model obtained by machine learning with the history data as input data.
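The specification does not detail the trained prediction model; as one possible reading, the sketch below estimates when the LED drive current will cross a degradation threshold by fitting a linear trend to the stored history data. The use of NumPy, the linear fit and the 200 mA threshold are assumptions for illustration.

# Minimal sketch: estimate when the LED of the light source 212 may reach a
# degradation threshold by fitting a linear trend to the stored history data.
# The linear fit stands in for the trained prediction model described above.
import numpy as np

def predict_degradation_time(timestamps_h, current_ma, threshold_ma=200.0):
    """Return the estimated cumulative lighting time (hours) at which the
    drive current is expected to cross the threshold, or None if no upward
    trend is observed."""
    t = np.asarray(timestamps_h, dtype=float)
    i = np.asarray(current_ma, dtype=float)
    slope, intercept = np.polyfit(t, i, deg=1)   # simple linear trend
    if slope <= 0:
        return None                              # current is not increasing
    return (threshold_ma - intercept) / slope

# Example with hypothetical history data (cumulative hours, measured mA).
history_t = [100, 500, 1000, 1500, 2000]
history_i = [150, 155, 162, 170, 178]
eta = predict_degradation_time(history_t, history_i)
if eta is not None:
    print(f"LED expected to reach the current threshold around {eta:.0f} h")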
[Information promoting replacement of the battery 218, etc.]
・Operation information, or information used to generate it: information indicating that connection of the battery 218 to the charging device has been detected is acquired and processed by the processor 312 in the server device 300, and the cumulative number of charging operations to date is stored as operation information. In addition, the current amount of charge is obtained from the integrated value of the charged and discharged current measured by a current sensor connected to the battery 218, a detected value from a voltage sensor connected to the battery 218 is acquired in the photographing device 200, and the value detected by the voltage sensor at full charge is stored as operation information in the server device 300. Further, a detected value of the cell voltage of the battery 218 is acquired in the photographing device 200 by the voltage sensor connected to the battery 218 and processed by the processor 312 in the server device 300, and the number of times an overcharge state, in which an excessive voltage is applied, has been indicated is stored as operation information; conversely, the number of times an overdischarge state, in which the battery has been excessively discharged, has been indicated is also stored as operation information. Further, from the current time information obtained by referring to the timer, the charging start time and charging end time of the battery 218 are acquired in the photographing device 200 and processed by the processor 312 in the server device 300, and the cumulative charging time of the battery 218 is stored as operation information. Further, a detected value from a temperature sensor installed around the battery 218 is acquired in the photographing device 200 and stored as operation information in the server device 300. Note that the charging count, the overcharge and overdischarge counts and the cumulative charging time are reset when the battery 218 is replaced or the like and, for example, the user inputs a reset operation, and counting starts again from 1.
・Related information: based on the charging count, overcharge count or overdischarge count stored as operation information, the message "The battery 218 may have deteriorated." is output each time a predetermined count such as the t10-th time (for example, the 500th) or the t11-th time (for example, the 1,000th) is reached, and from the t12-th time (for example, the 2,000th) onward the message "Please replace the battery 218. For replacement, please contact ○○." is output; in this way, deterioration of the battery 218 is notified, and information promoting at least one of replacement and inspection is output. Note that different counts are set as the conditions for outputting the related information for the charging count, the overcharge count and the overdischarge count. Further, the value detected by the voltage sensor at full charge and stored as operation information is compared with a predetermined threshold (for example, 3.7 V), and when the value falls below the threshold it is determined that the battery has deteriorated and can no longer be charged sufficiently, and a message such as "Please inspect the battery 218." or "Please replace the battery 218. For replacement, please contact ○○." is output, so that deterioration of the battery 218 is notified and information promoting at least one of replacement and inspection is output. Further, based on the cumulative charging time stored as operation information, the message "The battery 218 may have deteriorated." is output each time the cumulative charging time reaches a predetermined time such as s4 hours (for example, 500 hours) or s5 hours (for example, 1,000 hours), and from s6 hours (for example, 2,000 hours) onward the message "Please replace the battery 218. For replacement, please contact ○○." is output, so that deterioration of the battery 218 is notified and information promoting at least one of replacement and inspection is output. Further, the detected value of the temperature sensor stored as operation information is compared with a predetermined threshold (for example, 50 °C), and when the threshold is exceeded, deterioration of the battery 218 is notified, or information promoting at least one of replacement and inspection is output.
Furthermore, the charging count, overcharge count, overdischarge count, detected values from the voltage sensor, charging time, detected values from the temperature sensor and other information stored as operation information above may be stored in chronological order as history data, and deterioration of the battery 218, or a predicted time of deterioration, may be notified based on a comparison with the past history data. Such a notification can be generated by determining deterioration, or predicting the time of deterioration, using a trained prediction model obtained by machine learning with the history data as input data.
[Information assisting charging of the battery 218]
・Operation information, or information used to generate it: the current remaining charge is obtained from the integrated value of the charged and discharged current measured by a current sensor connected to the battery 218, or from the output value of a voltage sensor connected to the battery 218, and the remaining charge is stored as operation information in the server device 300. In addition, the operation log of the photographing device 200, information indicating the place of use and the current time information are acquired in the photographing device 200 and stored as operation information in the server device 300.
・Related information: from the operation log stored as operation information, the information indicating the place of use and the current time information, predicted values of the usage frequency and power consumption for each day of the week are calculated for each institution. A decrease in the amount of charge of the battery 218 is then predicted from the remaining charge acquired as operation information and the predicted values. If, based on this result, the charge of the battery 218 is predicted to fall below a specified amount (for example, 500 mAh) within the consultation hours of a medical institution such as a hospital, the message "Please connect the device to the charging device and start charging." is output as related information, i.e. information promoting charging is output. In a medical institution or the like, it is a serious problem if the photographing device 200 becomes unusable during consultation hours because the remaining charge of the battery 218 has run low, and notifying the operator in advance in this way can prevent such a problem.
Note that the predicted values of the usage frequency and power consumption for each day of the week for each institution can be calculated using a trained prediction model obtained by machine learning with the past history of the usage frequency and power consumption of the same institution on the same day of the week as input data. However, the method is not limited to this; any method may be used, such as taking the maximum or average usage frequency and power consumption for each day of the week obtained from the past history as the predicted values.
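One way to read the prediction described above is sketched below: it estimates whether the battery will fall below the specified amount before the end of consultation hours, using the average consumption for the same weekday as a simple stand-in for the trained prediction model. The consumption figures, the 500 mAh limit and the consultation hours are illustrative assumptions.

# Minimal sketch: predict whether the battery 218 will drop below a specified
# charge during consultation hours, using average hourly consumption for the
# same day of the week in place of the trained prediction model.
from datetime import datetime

SPECIFIED_MAH = 500          # charge level that should not be undercut
CLINIC_CLOSE_HOUR = 18       # consultation hours assumed to end at 18:00

# Hypothetical per-weekday average consumption (mAh per hour), 0 = Monday.
AVG_CONSUMPTION_MAH_PER_H = {0: 120, 1: 100, 2: 90, 3: 110, 4: 130, 5: 60, 6: 0}

def needs_charging_notice(remaining_mah: float, now: datetime) -> bool:
    hours_left = max(0.0, CLINIC_CLOSE_HOUR - (now.hour + now.minute / 60))
    predicted_use = AVG_CONSUMPTION_MAH_PER_H[now.weekday()] * hours_left
    return remaining_mah - predicted_use < SPECIFIED_MAH

if needs_charging_notice(remaining_mah=900, now=datetime(2022, 4, 6, 13, 0)):
    print("Please connect the device to the charging device and start charging.")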
[Information assisting charging of the battery 218]
・Operation information, or information used to generate it: a detected value from a voltage sensor connected to the AC adapter that supplies power from the charging device to the battery 218 is acquired in the photographing device 200, and the detected value is acquired as operation information in the server device 300.
・Related information: the time required to reach full charge is calculated from the detected value of the voltage sensor stored as operation information and a predetermined battery capacity, and a notification such as "It will take ○○ hours to reach full charge." is output as related information.
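The specification does not give the calculation itself; the following is a minimal sketch under the added assumption that the charging current can be approximated from the detected adapter voltage and an assumed adapter power rating, and that the charging time is the missing capacity divided by that current. The rating, capacity and efficiency figures are illustrative only.

# Minimal sketch: estimate the time to full charge from the detected adapter
# voltage. The adapter rating, battery capacity and charging efficiency below
# are assumptions; the specification only says the time is derived from the
# detected voltage and a predetermined battery capacity.
ADAPTER_POWER_W = 10.0       # assumed adapter rating
BATTERY_CAPACITY_MAH = 3000  # predetermined battery capacity (assumed value)
CHARGE_EFFICIENCY = 0.85     # assumed charging efficiency

def hours_to_full_charge(detected_voltage_v: float, remaining_mah: float) -> float:
    charge_current_ma = ADAPTER_POWER_W / detected_voltage_v * 1000
    missing_mah = BATTERY_CAPACITY_MAH - remaining_mah
    return missing_mah / (charge_current_ma * CHARGE_EFFICIENCY)

hours = hours_to_full_charge(detected_voltage_v=5.0, remaining_mah=900)
print(f"It will take about {hours:.1f} hours to reach full charge.")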
[Information promoting replacement of the charging device, etc.]
・Operation information, or information used to generate it: a detected value from a voltage sensor or current sensor connected to the electric wire that supplies power from the charging device to the battery 218 is acquired in the photographing device 200, and the detected value is acquired as operation information in the server device 300.
・Related information: the detected value of the voltage sensor or current sensor stored as operation information is compared with a predetermined threshold (for example, 0.5 A as the threshold of the current sensor; there may also be a plurality of thresholds such as an upper limit and a lower limit), and when the value exceeds or falls below the threshold, information notifying that the charging device has deteriorated or that a charging device outside the recommended specifications has been connected is output as related information. Note that, in general, either a cable capable of both data communication and power supply or a cable capable of power supply only is used to connect the charging device and the photographing device 200. Therefore, which type of cable has been used for the connection may be detected and output as related information. For example, when a cable capable of power supply only is used, data communication via that cable is not possible, so outputting this as related information further improves usability.
[Information promoting replacement of the CMOS image sensor, etc.]
・Operation information, or information used to generate it: from the current time information obtained by referring to the timer, the display start time and display end time of the through image are acquired in the photographing device 200 and processed by the processor 312 in the server device 300, and the cumulative usage time of the CMOS image sensor is stored as operation information. In addition, information indicating that the shooting button has been pressed is acquired by the photographing device 200 and processed by the processor 312 in the server device 300, and the number of shots is stored as operation information. Note that the cumulative usage time and the number of shots are reset when the CMOS image sensor is replaced or the like and, for example, the user inputs a reset operation, and counting starts again from 1.
・Related information: based on the cumulative usage time of the CMOS image sensor stored as operation information, the message "The CMOS image sensor may have deteriorated." is output each time the cumulative usage time reaches a predetermined time such as s7 hours or s8 hours, and from s9 hours onward the message "Please replace the CMOS image sensor. For replacement, please contact ○○." is output; in this way, deterioration of the CMOS image sensor is notified, and information promoting at least one of replacement and inspection is output. Likewise, based on the number of shots stored as operation information, the message "The CMOS image sensor may have deteriorated." is output each time a predetermined count such as the t13-th or t14-th shot is reached, and from the t15-th shot onward the message "Please replace the CMOS image sensor. For replacement, please contact ○○." is output, so that deterioration of the CMOS image sensor is notified and information promoting at least one of replacement and inspection is output.
Note that, when information promoting replacement of the CMOS image sensor or the light source 212 is output as related information, a subject image photographed by the photographing device 200 may be used as the operation information. By analyzing the subject image, changes in color, saturation or brightness and missing pixels are determined, and the message "The CMOS image sensor (or the LED of the light source 212) may have deteriorated. For replacement, please contact ○○." is output as related information, so that information promoting at least one of replacement and inspection of the CMOS image sensor or the light source 212 is output. This determination can be performed by inputting the subject image into a trained determination model obtained in advance by machine learning with subject images for learning and label information indicating whether each subject image is appropriate or inappropriate as input data. However, the method is not limited to this; other image analysis methods, such as analyzing the degree of similarity with an appropriate subject image, may also be used.
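As a simple illustration of the image-based check, the sketch below compares brightness and saturation statistics and a missing-pixel count against reference values. It is a crude stand-in for the trained determination model; the NumPy RGB array format and every threshold below are assumptions.

# Minimal sketch of the image-based degradation check for the CMOS image
# sensor or the LED of the light source 212. Reference values and thresholds
# are illustrative assumptions.
import numpy as np

def sensor_or_led_suspect(img_rgb: np.ndarray,
                          ref_brightness: float,
                          ref_saturation: float) -> bool:
    img = img_rgb.astype(float)
    brightness = img.mean()
    saturation = (img.max(axis=2) - img.min(axis=2)).mean()
    dead_pixels = int((img.max(axis=2) < 5).sum())       # near-black pixels
    return (abs(brightness - ref_brightness) > 0.2 * ref_brightness
            or abs(saturation - ref_saturation) > 0.3 * ref_saturation
            or dead_pixels > 50)

# Example with a synthetic image standing in for a photographed subject image.
test_img = np.full((480, 640, 3), 90, dtype=np.uint8)
if sensor_or_led_suspect(test_img, ref_brightness=120.0, ref_saturation=40.0):
    print("The CMOS image sensor (or the LED of the light source 212) may have "
          "deteriorated. For replacement, please contact ○○.")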
[Information promoting replacement of the various lenses and the light guide tube, etc.]
・Operation information, or information used to generate it: from the current time information obtained by referring to the timer, the usage start time and usage end time of the photographing device 200 are acquired in the photographing device 200 and processed by the processor 312 in the server device 300, and the cumulative usage time of the photographing device 200 is stored as operation information. In addition, a detected value from a temperature sensor installed around the light guide tube is acquired in the photographing device 200 and stored as operation information in the server device 300. Further, from the current time information obtained by referring to the timer, the lighting start time and lighting end time of the light source 212 are acquired in the photographing device 200 and processed by the processor 312 in the server device 300, and the lighting time of the light source 212 is stored as operation information. Note that the cumulative usage time is reset when the various lenses or the light guide tube are replaced or the like and, for example, the user inputs a reset operation, and counting starts again from 1.
・Related information: based on the cumulative usage time of the photographing device 200 stored as operation information, the message "The various lenses or the light guide tube may have deteriorated." is output each time the cumulative usage time reaches a predetermined time such as s10 hours or s11 hours, and from s12 hours onward the message "Please replace the various lenses or the light guide tube. For replacement, please contact ○○." is output; in this way, deterioration of the various lenses and the light guide tube is notified, and information promoting at least one of replacement and inspection is output. Note that a different count can be set for each type of lens and for the light guide tube. Further, the detected value of the temperature sensor stored as operation information is compared with a predetermined threshold (for example, 120 °C), and when the threshold is exceeded, information notifying that abnormal heat generation has been observed is output as related information. In addition, the light guide tube, for example, is easily affected by the light and heat of the light source 212, so it is also possible to output related information by further combining the lighting time of the light source 212 acquired as described above with the cumulative usage time information described above.
[Information promoting inspection of the photographing device 200, etc.]
・Operation information, or information used to generate it: a detected value from the acceleration sensor is constantly acquired in the photographing device 200 and stored as operation information in the server device 300.
・Related information: the detected value of the acceleration sensor stored as operation information is compared with a predetermined threshold, and when the threshold is exceeded it is judged that the device may have been subjected to abnormal use such as being dropped or struck, and information prompting inspection or the like of the photographing device 200 is output as related information. Such a notification can also be generated by judging abnormal detected values using a trained prediction model obtained by machine learning with the history data of the detected values from the acceleration sensor as input data. Furthermore, the operation information itself is output so that the cause of a failure of the photographing device 200 or the like can be inferred at the time of inspection.
[Information promoting inspection of the photographing device 200, etc.]
・Operation information, or information used to generate it: detected values from temperature sensors installed at heat-generating parts of the photographing device 200 (for example, around the light source 212 or on the control board on which the processor 213 and the like are mounted) are constantly acquired in the photographing device 200 and stored as operation information in the server device 300.
・Related information: the detected value of the temperature sensor stored as operation information is compared with a predetermined threshold, and when the threshold is exceeded it is judged that there is a risk of abnormal heat generation or the like, and information prompting inspection or the like of the photographing device 200 is output as related information. Such a notification can also be generated by judging abnormal detected values using a trained prediction model obtained by machine learning with the history data of the detected values from the temperature sensor as input data. Furthermore, the operation information itself is output so that the cause of a failure of the photographing device 200 or the like can be inferred at the time of inspection.
[Information assisting improvement of the communication state of the photographing device 200]
・Operation information, or information used to generate it: the communication speed of communication performed with the display device 100 or the server device 300 via the communication interface 216 of the photographing device 200, and the detected value of the radio field intensity during that communication, are acquired in the photographing device 200, processed by the processor 312 of the server device 300 and stored as operation information.
・Related information: the communication speed or radio field intensity stored as operation information is compared with a predetermined threshold (for example, 1 Mbps for the communication speed), and when the comparison indicates that the communication state is not good, information on how to improve the communication state is output as related information.
[Information assisting operation of the photographing device 200]
・Operation information, or information used to generate it: the communication speed of communication performed with the display device 100 or the server device 300 via the communication interface 216 of the photographing device 200 is acquired in the photographing device 200, processed by the processor 312 of the server device 300 and stored as operation information.
・Related information: based on the communication speed stored as operation information, the expected waiting time when a subject image or the like is transmitted from the display device 100 to the server device 300 is calculated, and the calculated expected waiting time is output as related information.
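A straightforward reading of this calculation is the transfer size divided by the measured communication speed; the sketch below follows that reading. The payload size and the protocol overhead factor are illustrative assumptions.

# Minimal sketch: expected waiting time for transmitting a subject image,
# computed as payload size divided by measured communication speed. The image
# size and overhead factor below are illustrative assumptions.
def expected_wait_seconds(image_size_bytes: int,
                          speed_bps: float,
                          overhead: float = 1.2) -> float:
    return image_size_bytes * 8 * overhead / speed_bps

# Example: a 2 MB subject image over a measured 1 Mbps link.
wait = expected_wait_seconds(image_size_bytes=2_000_000, speed_bps=1_000_000)
print(f"Expected waiting time: about {wait:.0f} seconds")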
[Information assisting use of the auxiliary tool 400]
・Operation information, or information used to generate it: information indicating that the auxiliary tool 400 has been removed, or has not been removed, is acquired by the photographing device 200 and processed by the processor 312 in the server device 300, and that information is stored as operation information.
・Related information: based on the information stored as operation information indicating that the auxiliary tool 400 has been removed or has not been removed, it is determined whether the auxiliary tool 400 is still attached, and when it is determined that it is still attached, related information indicating this is generated. Outputting information promoting removal of the used auxiliary tool 400 in this way makes it possible to prevent infection between subjects and the like.
[Information assisting proper operation of the photographing device 200]
・Operation information, or information used to generate it: the operation log of the photographing device 200, information indicating the place of use and the current time information are acquired in the photographing device 200 and stored as operation information in the server device 300.
・Related information: when an abnormal operation is detected from the operation log stored as operation information, information on an operation method for correcting it is output as related information. As one example, in the photographing process the subject image is transmitted to the server device 300 when a confirmation operation is performed after the subject image is captured; when the number of times a cancel operation is performed instead of this confirmation operation exceeds a predetermined threshold (for example, three times), information assisting photographing, such as "Hold the photographing device 200 firmly so that it does not move while photographing.", is output as related information. Further, when the lighting time of the light source 212 or the display time of the through image is longer than a predetermined threshold (for example, 3 minutes for the lighting time of the light source), information assisting use of the photographing device 200, such as "After photographing, turn off the power of the photographing device 200 or stand it upright and put it into the sleep state.", is output as related information. Note that these counts and times are reset when the operator inputs a predetermined operation or when the related information is output, and are counted again from 1.
Note that the above thresholds can be calculated using a trained prediction model obtained by machine learning with the history of past operation logs of the same institution on the same day of the week as input data. However, the method is not limited to this; any method may be used, such as taking the maximum, minimum or average value for each day of the week obtained from the past history as the threshold.
[Information assisting proper operation of the photographing device 200]
・Operation information, or information used to generate it: a subject image photographed by the photographing device 200 is acquired in the photographing device 200 and is also stored as operation information in the server device 300.
・Related information: when analysis of the subject images stored as operation information determines, for example, that there are many images with an inappropriate angle of view or many blurred images, a guide display for guiding the position of the subject is output superimposed on the through image as related information, or information assisting photographing so that a good subject image can be captured, such as "Hold the photographing device 200 firmly so that it does not move while photographing.", is output as related information. Further, by analyzing the subject image, the state of the subject and the photographing environment, for example that the subject's mouth is not opened sufficiently, are determined, and depending on the result a notification to the subject such as "Please open your mouth a little wider." or "Please hold this position." is output as related information.
These determinations can be performed by inputting the subject image obtained as operation information into a trained determination model obtained in advance by machine learning with subject images for learning and label information indicating whether each subject image is appropriate or inappropriate as input data. However, the method is not limited to this; other image analysis methods, such as analyzing the degree of similarity with an appropriate subject image, may also be used. In addition, particularly when notifying the subject, the related information may be output to the display of the photographing device 200 instead of, or in addition to, being output to the display device 100, or may be output as sound from the photographing device 200 or the display device 100.
[Information assisting proper operation of the photographing device 200]
・Operation information, or information used to generate it: a detected value from a temperature sensor installed around the light source 212 (for example, on the board on which the LED of the light source 212 is mounted) is acquired in the photographing device 200 and stored as operation information in the server device 300.
・Related information: the detected value of the temperature sensor stored as operation information is compared with a predetermined threshold, and when the threshold is exceeded, the LED of the light source 212 may be at a high temperature, so a cautionary notification such as "Please refrain from operating the device for a while." is output as related information.
[Information regarding the fee for determining the possibility of contracting a predetermined disease]
・Operation information, or information used to generate it: when the result of determining the possibility of contracting a predetermined disease is transmitted in S216 of FIG. 9A, the number of determinations is stored as operation information in the server device 300. Note that the determination count is reset when the processing for calculating and settling the determination fee is performed, and counting starts again from 1.
・Related information: when the processor 312 of the server device 300 reads out the determination count as operation information, it calculates the determination fee for each piece of display device ID information. Specifically, the processor 312 calculates the determination fee by multiplying a preset fee per determination by the determination count. When the determination fee for each piece of display device ID information has been calculated, the processor 312 refers to the operator management table and identifies the operator ID information with which that display device ID information is associated. Next, the processor 312 adds up the determination fees of all the display device ID information associated with the operator ID information and calculates the determination fee for each piece of operator ID information. For example, when two pieces of display device ID information are associated with one piece of operator ID information, the determination fees calculated for the two pieces of display device ID information are added together.
When the determination fee has been calculated for each piece of operator ID information, information notifying the calculated determination fee and information notifying the payment method for that fee are generated as related information and transmitted to the display device 100 associated with the operator ID information. Although the fees are totaled for each piece of operator ID information here, the processor 312 can also refer to the operator management table and further total them for each piece of institution ID information. Doing so makes it possible to bill the determination fee per medical institution or per department, for example, which further improves convenience.
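A minimal sketch of this aggregation is shown below, with in-memory dictionaries standing in for the operator management table. The unit fee and all ID values are illustrative assumptions.

# Minimal sketch of the fee aggregation: determination counts per display
# device become fees, which are totaled per operator and, optionally, per
# institution. Tables and values below are illustrative assumptions.
from collections import defaultdict

FEE_PER_DETERMINATION = 500  # assumed fee per determination

determination_counts = {"display-001": 12, "display-002": 7, "display-003": 3}
display_to_operator = {"display-001": "op-A", "display-002": "op-A",
                       "display-003": "op-B"}          # operator management table
operator_to_institution = {"op-A": "clinic-X", "op-B": "clinic-Y"}

fees_per_operator = defaultdict(int)
for display_id, count in determination_counts.items():
    operator_id = display_to_operator[display_id]
    fees_per_operator[operator_id] += count * FEE_PER_DETERMINATION

fees_per_institution = defaultdict(int)
for operator_id, fee in fees_per_operator.items():
    fees_per_institution[operator_to_institution[operator_id]] += fee

print(dict(fees_per_operator))      # {'op-A': 9500, 'op-B': 1500}
print(dict(fees_per_institution))   # {'clinic-X': 9500, 'clinic-Y': 1500}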
In other words, a processing device such as the server device 300 is a processing device that is communicably connected via a network to a photographing device configured to photograph an image of a subject's natural orifice as the object, and that includes at least one processor,
wherein the at least one processor is configured to execute processing for:
determining the possibility of contracting a predetermined disease based on the image photographed by the photographing device;
outputting the result of the determination, or information generated based on the result of the determination to assist diagnosis of the disease;
calculating a usage fee to be charged to the operator of the photographing device or to the institution to which the operator belongs, in accordance with at least one of the number of determinations and the number of outputs; and
outputting information indicating the calculated usage fee.
Note that, in the above, when the possibility of contracting a plurality of diseases can be determined, for example, information identifying the disease to be determined is received together with the determination request from the display device 100. The processor 312 of the server device 300 can then store a determination fee for each disease in advance and calculate the usage fee for each determined disease in the calculation of the usage fee described above.
Further, the operation log or determination count of the photographing device 200, information indicating the place of use and the current time information may be acquired in the photographing device 200 and stored as operation information in the server device 300. In such a case, a trained determination-count prediction model can be generated by machine learning with the history of the usage frequency of each institution or each operator for each predetermined period as input data. By inputting institution ID information or operator ID information and a period into the trained determination-count prediction model, the processor 312 can obtain a predicted value of the determination count. The processor 312 then calculates, from the obtained predicted determination count, a predicted value of the determination fee for each institution or each operator for the input period, and outputs the obtained predicted value as related information. Doing so makes it possible to calculate a predicted value for each institution or each operator and to secure that fee in advance as a budget, which further improves convenience.
In addition, various methods can be used to output the usage fee. For example, the usage fees for each predetermined period such as a day, a week, a month or a year may be totaled and output. Further, when the usage fee has been prepaid, for example, the currently remaining prepaid amount may be output. Further, when a flat fee is paid for each predetermined period regardless of the determination count, for example, a comparison with the determination fee that would be charged on a per-use basis may also be output.
[Other related information]
In addition to the related information exemplified above, various other related information may be output using the operation information exemplified above, such as a notification of deterioration of the display, a notification promoting at least one of replacement, inspection and calibration, and a notification of the maintenance timing of the photographing device 200.
6-3. Processing program update processing
FIG. 9D is a diagram showing a processing flow executed in the server device 300 according to an embodiment of the present disclosure. Specifically, FIG. 9D shows the processing flow executed in the processing for updating the processing program installed as the SW of the photographing device 200. This processing flow is mainly performed by the processor 312 of the server device 300 reading out and executing the processing program stored in the memory 311.
Here, when each device such as the processing system 1 or the photographing device 200, or the processing program, falls under equipment subject to regulation or approval defined by predetermined laws, rules, ordinances, guidelines or the like (for example, medical devices subject to regulation under the Act on Securing Quality, Efficacy and Safety of Products Including Pharmaceuticals and Medical Devices (the so-called Pharmaceuticals and Medical Devices Act), the Federal Food, Drug, and Cosmetic Act, the Medical Device Amendment Act, the European Medical Device Regulation, the In Vitro Diagnostic Regulation, the Regulations on the Supervision and Administration of Medical Devices (Order No. 739 of the State Council of China) and the like), related documents are prepared, such as published precaution information describing contraindications and prohibitions, the principle of the device, how to use it and precautions for use, as well as package inserts, histories reflecting revision instructions, review reports, re-examination reports and emergency safety information. When such a related document is revised or issued, the operator and others must be informed promptly of the revision or issuance, so the fact that a revision or issuance has occurred is notified as related information.
Specifically, the processor 312 determines whether revision or issuance information of a related document such as a package insert has been received via the communication interface 313. If such information has been received, the processor 312 generates, as related information, information indicating that the related document such as the package insert has been revised or issued. In addition, the display device 100 stores, in advance, a link from which the related document before revision or issuance can be obtained, or a recording medium such as a two-dimensional code in which that link is recorded. Therefore, the related information generated by the processor 312 can include a processing program for updating the content stored in the display device 100 so that it becomes a link from which the related document after revision or issuance can be obtained, or a recording medium such as a two-dimensional code in which that link is recorded. The processor 312 then outputs the related information via the communication interface 313 to the display device 100, the photographing device 200 or a combination thereof.
In addition, a revision or the like of a related document such as a package insert may require the processing program installed as the SW of the photographing device 200 to be updated. A case in which, upon revision of a related document, the processor 312 executes processing for updating the processing program specified in that related document is therefore described below.
Specifically, according to FIG. 9D, the processor 312 determines whether revision information of a related document such as a package insert has been received via the communication interface 313 (S241). When revision information of a related document such as a package insert has been received, the processor 312 refers to the photographing device management table and refers to the SW version information of the currently stored processing program (S242). Next, the processor 312 compares the SW version information of the processing program specified by the received revision information with the stored SW version information, and determines whether the processing program needs to be updated. When the two differ, the processor 312 determines that an update is necessary and generates, as related information, information prompting an update of the processing program (for example, "The package insert has been revised. Please download the latest SW from the link below.") (S233). When this information has been generated, the processor 312 outputs it via the communication interface 313 to the display device 100, the photographing device 200 or a combination thereof (S234). The processing flow then ends.
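The version comparison in this flow can be sketched as follows; the version string format, the structure of the management table and the notification text are assumptions for illustration only.

# Minimal sketch of the update check in FIG. 9D: compare the SW version named
# in the received revision information with the version recorded in the
# photographing device management table, and generate an update prompt when
# they differ. Table contents and version format are illustrative assumptions.
photographing_device_table = {"camera-0001": {"sw_version": "1.2.0"}}

def update_notice(device_id: str, revised_sw_version: str):
    stored = photographing_device_table[device_id]["sw_version"]
    if stored != revised_sw_version:   # versions differ -> update is needed
        return ("The package insert has been revised. "
                "Please download the latest SW from the link below.")
    return None                        # already up to date, no related information

print(update_notice("camera-0001", revised_sw_version="1.3.0"))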
In other words, a processing device such as the server device 300 is a processing device that is communicably connected via a network to a photographing device configured to photograph an image of a subject's natural orifice as the object, and that includes at least one processor,
wherein the at least one processor is configured to execute processing for:
storing in a memory version information of the processing program installed in the photographing device;
acquiring version information specified in a notification arising from a related document prepared for a predetermined regulation or approval applicable to at least one of the processing device, the photographing device and the processing program; and
outputting, when the acquired version information differs from the version information stored in the memory, information prompting an update to the processing program specified by the acquired version information.
Note that the case in which the revision or issuance of a related document such as a package insert is notified as related information has been described here. However, the notification is not limited to the revision or issuance of related documents, and other received information may also be notified as related information. For example, it is also possible to output, as related information, a notification of a recall of or a defect in any of the various components (parts and the like) constituting the photographing device 200. Specifically, the processor 312 of the server device 300 receives defect information regarding the various components from another server device or the like via the communication interface 313. The processor 312 then refers to the parts information in the photographing device management table in the memory 311 based on the manufacturing lot included in the received defect information. When a serial number stored as parts information is included in the manufacturing lot, the processor 312 generates, as related information, a notification promoting at least one of replacement, inspection and calibration of the corresponding component (part) of that photographing device 200. The processor 312 then outputs the generated related information via the communication interface 313 to the display device 100, the photographing device 200 or a combination thereof.
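A minimal sketch of this lot check is shown below, assuming the defect information carries the set of affected serial numbers and that the management table maps device IDs to part serial numbers; all identifiers and table structures are illustrative.

# Minimal sketch of the recall/defect check: when a received defect notice
# covers a manufacturing lot, find the photographing devices whose recorded
# part serial numbers fall in that lot and generate a replacement/inspection
# notice. Identifiers and table layout are illustrative assumptions.
photographing_device_parts = {
    "camera-0001": {"light_source_led": "LOT202201-0042"},
    "camera-0002": {"light_source_led": "LOT202203-0007"},
}
defect_notice = {"component": "light_source_led",
                 "affected_serials": {"LOT202201-0042", "LOT202201-0043"}}

for device_id, parts in photographing_device_parts.items():
    serial = parts.get(defect_notice["component"])
    if serial in defect_notice["affected_serials"]:
        print(f"{device_id}: the {defect_notice['component']} is subject to a "
              "recall. Please arrange replacement, inspection or calibration.")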
7. Output of related information
 FIGS. 10A to 10E are diagrams showing examples of screens displayed on the display device 100 according to an embodiment of the present disclosure. Specifically, FIG. 10A is an example of a subject list screen displayed on the display device 100 when selecting a subject to be judged for the possibility of suffering from a predetermined disease, a subject for whom interview information is to be input, or a subject for whom finding information is to be input. FIGS. 10B to 10E are examples of screens on which related information is displayed on the display device 100.
 Referring to FIG. 10A, a subject list screen is displayed on the display via the output interface 114. This screen includes a list screen 11, on which subject information such as the current status, subject name, subject ID information, and attribute information is displayed as a list, one row per subject. At the bottom of the list screen 11, an interview input icon 12 for inputting interview information, a diagnosis icon 13 for inputting finding information and transmitting a judgment request, and a new registration icon 14 for newly registering a subject are displayed. By performing an operation input on each of these icons, the processing corresponding to that icon can be executed.
 In this way, the subject list screen shown in FIG. 10A is a screen that the user refers to frequently when making judgments using the processing system 1.
 Next, FIG. 10B shows a case in which information promoting the purchase of the auxiliary tool 400 has been received and output as related information. As shown in FIG. 10B, a related information display 15 is displayed superimposed on the subject list screen.
 Similarly, FIG. 10C shows a case in which information prompting an update of the processing program has been received and output as related information. As shown in FIG. 10C, a related information display 17 is displayed superimposed on the subject list screen.
 Similarly, FIG. 10D shows a case in which information assisting the operation of the photographing device 200 has been received and output as related information. As shown in FIG. 10D, a related information display 18 is displayed superimposed on the subject list screen.
 In this way, in the examples of FIGS. 10B to 10C, outputting the related information on the subject list screen, which is a frequently referenced screen, makes it possible to draw the user's attention more effectively. The output methods given here are merely examples, and output may of course be performed by other methods. For example, when related information is received, a notification may be displayed at the top of the screen, and clicking the notification may switch the screen and output the detailed information. Alternatively, when related information is received, the reception may be signaled by sound, an LED, vibration, or a combination thereof, and the detailed information may be output by opening a predetermined application program.
 FIG. 10E shows a case in which information indicating the judgment fee has been received and output as related information. In FIG. 10E, the usage fee for each display device 100 that has transmitted a judgment request, and the total usage fee for the institution with which those display devices 100 are associated, are displayed on a usage fee display screen 19. A payment icon 20 and a cancel icon 21 are displayed at the bottom of the usage fee display screen. When an operation input on the payment icon is accepted, the screen transitions to a screen on which the payment method, output as related information, is displayed, and the payment process can proceed.
 In this way, by displaying the judgment fee on the frequently used display device 100, it is possible to prevent the judgment fee from being overlooked and to further improve convenience.
 Note that the output destination of the related information shown in FIGS. 10B to 10C need not be limited to the display device 100; the related information may also be output to, for example, the photographing device 200 or another terminal device carried by a pre-registered operator.
 As described above, the present embodiment can provide a processing device, a processing program, and a processing method that are connected to a photographing device configured to photograph an image of a subject and that can improve the usability of the photographing device.
8. Modifications
 In the above embodiment, the case has been described in which the interview information and the finding information are input in advance by the operator or the subject, or are received from an electronic medical record device or the like connected to a wired or wireless network. However, instead of or in addition to these, such information may be obtained from the photographed subject image. The interview information and finding information may also be written in advance on a paper medium or the like, by hand or in a mark-sheet format, and acquired by reading the medium with an optical reading device such as a camera or a scanner. Alternatively, interview information or finding information entered on a separately provided terminal device or the like, or its storage location, may be recorded on a recording medium such as a two-dimensional code, and the information may be acquired via that recording medium. A trained information estimation model is obtained by giving, as correct labels for each learning subject image, the finding information and interview information associated with that image, and machine-learning these pairs with a neural network. The processor 111 can then obtain the desired interview information and attribute information by feeding a subject image to the trained information estimation model as input. Examples of such interview information and attribute information include sex, age, degree of pharyngeal redness, degree of tonsil swelling, and the presence or absence of white coating. This saves the operator the trouble of inputting the interview information and finding information.
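 As a rough illustration only, the following sketch shows one possible form of such a learned information estimation model, here a small multi-label CNN in PyTorch; the network shape, label encoding, and hyperparameters are assumptions and do not reflect the actual model described in the specification:

```python
# Sketch of a learned information estimation model: a small CNN trained on
# subject images labeled with finding/interview items (e.g. pharyngeal
# redness, tonsil swelling, white coating). Shapes and labels are illustrative.
import torch
import torch.nn as nn


class InfoEstimationModel(nn.Module):
    def __init__(self, num_items: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_items))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))


model = InfoEstimationModel()
criterion = nn.BCEWithLogitsLoss()  # multi-label: each finding item present/absent
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Dummy batch standing in for learning subject images and their finding labels.
images = torch.randn(8, 3, 128, 128)
labels = torch.randint(0, 2, (8, 4)).float()

optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()

# Inference: feed a captured subject image to obtain estimated finding items.
with torch.no_grad():
    estimated = torch.sigmoid(model(images[:1])) > 0.5
```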
 In the above embodiment, the case has been described in which the related information is generated based on the operation information; however, the related information can also be generated with information other than the operation information taken into account. As one example, guideline information on the replacement timing of various components, registered in advance at the time of manufacture or maintenance of the photographing device 200, can additionally be considered. Even if, based on the operation information alone, the timing for outputting the related information has not yet been reached, the related information may be output when that guideline timing is reached.
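 A minimal sketch of this combined condition, assuming a usage-count threshold and a guideline date registered at manufacture or maintenance time (both values are illustrative):

```python
# Sketch of combining operation counts with a pre-registered replacement
# guideline; the threshold and dates are assumptions for illustration.
from datetime import date


def should_prompt_replacement(operation_count: int,
                              count_threshold: int,
                              registered_due_date: date,
                              today: date) -> bool:
    """Prompt when either the usage-based threshold or the guideline date
    registered at manufacture/maintenance time has been reached."""
    return operation_count >= count_threshold or today >= registered_due_date


# True here: the count threshold is not reached, but the guideline date is.
print(should_prompt_replacement(120, 500, date(2024, 4, 1), date(2024, 5, 10)))
```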
 Each of the trained models described in the above embodiments was generated using a neural network or a convolutional neural network. However, the models are not limited to these and may also be generated using machine learning methods such as the nearest neighbor method, decision trees, regression trees, and random forests.
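 For example, one of these classical learners could be substituted as sketched below using scikit-learn's random forest; the pre-extracted feature vectors are an assumption, since the specification does not fix a feature representation:

```python
# Sketch of swapping in a classical learner (random forest) in place of a
# neural network; features and labels are randomly generated stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
features = rng.random((100, 64))       # e.g. flattened or pre-extracted image features
labels = rng.integers(0, 2, size=100)  # presence/absence of a finding

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(features, labels)
print(clf.predict(features[:3]))
```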
 In the above embodiment, the case has been described in which the judgment processing and the output processing of the related information are performed in the server device 300. However, these various processes may be appropriately distributed among and processed by the display device 100, the photographing device 200, and other devices (including cloud server devices and the like).
 In the above embodiment, the place of use was acquired through GPS. However, this is only one example of acquiring the place of use; for example, the place of use may be acquired by acquiring institution ID information and identifying the place associated with that institution ID information.
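 A minimal sketch of the institution-ID lookup alternative, with a purely hypothetical mapping table:

```python
# Sketch of resolving the place of use from institution ID information instead
# of GPS; the mapping table and ID format are hypothetical examples.
from typing import Optional

INSTITUTION_LOCATIONS = {
    "INST-0001": "Clinic A, Tokyo",
    "INST-0002": "Hospital B, Osaka",
}


def resolve_location(institution_id: str) -> Optional[str]:
    return INSTITUTION_LOCATIONS.get(institution_id)


print(resolve_location("INST-0001"))  # "Clinic A, Tokyo"
```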
 Note that, except for the points specifically described above, these modifications are the same as the configuration, processing, and procedures of the embodiment described with reference to FIGS. 1 to 10E. Detailed description of those matters is therefore omitted. It is also possible to configure a system by appropriately combining or replacing the elements described in the modifications and the embodiments.
 The processes and procedures described in this specification can be realized not only by those explicitly described in the embodiments but also by software, hardware, or a combination thereof. Specifically, the processes and procedures described in this specification are realized by implementing logic corresponding to the processes in a medium such as an integrated circuit, a volatile memory, a non-volatile memory, a magnetic disk, or optical storage. The processes and procedures described in this specification can also be implemented as computer processing programs and executed by various computers, including display devices and server devices.
 Even where the processes and procedures described in this specification are described as being executed by a single device, piece of software, component, or module, such processes or procedures may be executed by multiple devices, multiple pieces of software, multiple components, and/or multiple modules. Likewise, even where the various information described in this specification is described as being stored in a single memory or storage unit, such information may be stored in a distributed manner in multiple memories provided in a single device or in multiple memories distributed across multiple devices. Furthermore, the software and hardware elements described in this specification may be realized by integrating them into fewer constituent elements or by decomposing them into more constituent elements.
 1    Processing system
 100  Display device
 200  Photographing device
 300  Server device
 400  Auxiliary tool

Claims (12)

  1.  A processing device communicably connected via a network to a photographing device configured to photograph an image of a subject's natural orifice, the processing device comprising at least one processor,
     wherein the at least one processor is configured to execute processing for:
     acquiring operation information related to an operation of the photographing device performed by an operator of the photographing device in order to photograph the image with the photographing device; and
     outputting related information relating to the photographing device based on the acquired operation information.
  2.  The processing device according to claim 1, wherein the operation information is information indicating the number of operations performed on the photographing device, and
     the related information is information, generated based on the operation information, for prompting at least one of replacement, inspection, and calibration of a component of the photographing device.
  3.  The processing device according to claim 1, wherein the operation information is information detected by a sensor arranged in the photographing device, and
     the related information is information, generated based on the operation information, for prompting at least one of replacement, inspection, and calibration of a component of the photographing device.
  4.  The processing device according to claim 1, wherein the image is photographed by the photographing device using an auxiliary tool that is inserted into the natural orifice together with the photographing device and that is recommended to be replaced each time it is inserted, and
     the related information is information for promoting the purchase of the auxiliary tool.
  5.  The processing device according to claim 1, wherein the operation information is information indicating the state of a battery required for operation of the photographing device, and
     the related information is information for prompting at least one of replacement and inspection of the battery.
  6.  The processing device according to claim 1, wherein the operation information is information indicating the state of a battery required for operation of the photographing device, and
     the related information is information indicating the charging status of the battery.
  7.  The processing device according to claim 4 or 5, wherein the information indicating the state of the battery is information detected by a voltage sensor or a current sensor connected to the battery.
  8.  The processing device according to claim 1, wherein the operation information is information indicating a lighting state of a light source for irradiating light into the natural orifice when photographing the image, and
     the related information is information for prompting at least one of replacement, inspection, and calibration of the light source.
  9.  The processing device according to claim 1, wherein the related information is auxiliary information for assisting the operation by the operator, based on the operation information.
  10.  The processing device according to claim 1, wherein the processing device is communicably connected via the network to a display device for inputting subject information related to the subject, and
      the related information is output to the display device.
  11.  A processing program for causing a processing device, communicably connected via a network to a photographing device configured to photograph an image of a subject's natural orifice, to function as a processor that:
      acquires operation information related to an operation of the photographing device performed by an operator of the photographing device in order to photograph the image with the photographing device; and
      outputs related information relating to the photographing device based on the acquired operation information.
  12.  A processing method executed by at least one processor in a processing device that is communicably connected via a network to a photographing device configured to photograph an image of a subject's natural orifice and that includes the at least one processor, the processing method comprising:
      acquiring, by the at least one processor, operation information related to an operation of the photographing device performed by an operator of the photographing device in order to photograph the image with the photographing device; and
      outputting related information relating to the photographing device based on the acquired operation information.

PCT/JP2022/017151 2022-04-06 2022-04-06 Processing device, processing program, and processing method WO2023195091A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/017151 WO2023195091A1 (en) 2022-04-06 2022-04-06 Processing device, processing program, and processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/017151 WO2023195091A1 (en) 2022-04-06 2022-04-06 Processing device, processing program, and processing method

Publications (1)

Publication Number Publication Date
WO2023195091A1

Family

ID=88242705

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/017151 WO2023195091A1 (en) 2022-04-06 2022-04-06 Processing device, processing program, and processing method

Country Status (1)

Country Link
WO (1) WO2023195091A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH034831A (en) * 1989-06-01 1991-01-10 Toshiba Corp Endoscope device
JPH06254040A (en) * 1993-03-02 1994-09-13 Asahi Optical Co Ltd Front end of endoscope
JP2001166222A (en) * 1999-12-09 2001-06-22 Olympus Optical Co Ltd Endoscope
JP2002272822A (en) * 2001-03-14 2002-09-24 Olympus Optical Co Ltd Management system and washing device for expendables
JP2012245254A (en) * 2011-05-30 2012-12-13 Hoya Corp Endoscope, endoscope system, and endoscope management system
JP2014004156A (en) * 2012-06-25 2014-01-16 Hoya Corp External module for endoscope, and endoscope system
JP2021117612A (en) * 2020-01-23 2021-08-10 Hoya株式会社 Endoscopic diagnosis data management system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22936491

Country of ref document: EP

Kind code of ref document: A1