WO2023054295A1 - Information processing device, information processing method, and program - Google Patents

Publication number
WO2023054295A1
Authority
WO
WIPO (PCT)
Prior art keywords
paralysis
feature amount
information processing
user
control unit
Prior art date
Application number
PCT/JP2022/035808
Other languages
French (fr)
Japanese (ja)
Inventor
雄太 吉田
俊彦 西村
Original Assignee
テルモ株式会社
Priority date
Filing date
Publication date
Application filed by テルモ株式会社
Publication of WO2023054295A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 10/00: Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y: INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y 10/00: Economic sectors
    • G16Y 10/60: Healthcare; Welfare

Definitions

  • the present invention relates to an information processing device, an information processing method, and a program.
  • A determination device has been proposed that determines the possibility of a stroke based on the presence or absence of facial nerve palsy determined from the user's facial image, the results of a speech test, and the results of an interview (Patent Document 1). Early detection and treatment of stroke can greatly improve patient prognosis.
  • There are two types of facial nerve paralysis: central paralysis and peripheral paralysis. When central paralysis has occurred, there is a high possibility that an intracerebral disease such as stroke has developed, and prompt treatment is required.
  • In the device of Patent Document 1, however, only the presence or absence of facial paralysis is determined using the facial image.
  • In addition to photographing a face image, a speech test and an interview are performed to determine whether or not there is a stroke, which is burdensome for the user and takes time for the examination.
  • In one aspect, an object is to provide an information processing device or the like that determines the type of facial nerve paralysis using a face image.
  • The information processing apparatus includes an image acquisition unit that acquires a face image of a user, a feature amount acquisition unit that acquires a feature amount of the face image based on the face image, and a type determination unit that determines, based on the feature amount, whether the user's paralysis is peripheral paralysis or central paralysis.
  • In one aspect, it is possible to provide an information processing device or the like that determines the type of facial nerve paralysis using a face image.
  • FIG. 1 is an explanatory diagram explaining an outline of the determination process.
  • FIG. 2 is an explanatory diagram explaining the configuration of the information processing device.
  • FIG. 3 is an explanatory diagram explaining feature points of a face image.
  • FIG. 4 is an explanatory diagram explaining the record layout of the feature amount DB.
  • FIG. 5 is an explanatory diagram explaining the determination model.
  • FIG. 6 is a flowchart explaining the flow of processing of the program.
  • FIG. 7 is a flowchart explaining the flow of processing of the subroutine for calculating feature amounts relating to the forehead.
  • FIG. 8 is a flowchart explaining the flow of processing of the subroutine for calculating feature amounts relating to the eyebrows.
  • FIG. 9 is a flowchart explaining the flow of processing of the subroutine for calculating feature amounts relating to the area between the eyebrows and the eyes.
  • FIG. 10 is a flowchart explaining the flow of processing of the subroutine for calculating feature amounts relating to the nasolabial folds.
  • FIG. 11 is a flowchart explaining the flow of processing of the subroutine for calculating feature amounts relating to the mouth corners.
  • FIG. 12 is a flowchart explaining the flow of processing of the subroutine for calculating feature amounts relating to the mouth.
  • FIG. 13 is a flowchart explaining the flow of processing of the subroutine for calculating feature amounts relating to the contours.
  • FIG. 14 is an explanatory diagram explaining an example of a screen.
  • FIG. 15 is an explanatory diagram explaining an example of a screen of a modification.
  • FIG. 16 is an explanatory diagram explaining the configuration of the information processing device according to Embodiment 3.
  • FIG. 17 is a functional block diagram of the information processing device according to Embodiment 4.
  • [Embodiment 1] There are two types of facial nerve paralysis: central paralysis and peripheral paralysis. It is known that the majority of facial paralysis cases are peripheral paralysis, and that the incidence of central paralysis is less than 1 percent of all facial paralysis.
  • In central paralysis, the symptoms of paralysis occur mainly around the cheeks and mouth.
  • The main causes of central paralysis are so-called strokes, such as cerebral infarction, cerebral hemorrhage, subarachnoid hemorrhage, and transient ischemic attack. It is known that the earlier these diseases are discovered and treated, the better the patient's prognosis, so the urgency of starting treatment is very high. Treatment for central paralysis is primarily performed in neurology and neurosurgery departments.
  • In peripheral paralysis, paralysis may occur over the entire face, from the upper part of the face such as the forehead and eyebrows to the lower part of the face such as the mouth.
  • In addition, symptoms often differ between the left and right sides of the face.
  • The main causes of peripheral paralysis are swelling of the facial nerve and neuritis caused by the herpes virus.
  • Early treatment is also desirable for peripheral paralysis, but the urgency is lower than for central paralysis.
  • Treatment for peripheral paralysis is mainly performed by an otolaryngologist.
  • In the present embodiment, an information processing device 20 (see FIG. 2) and the like that assists general patients in consulting an appropriate clinical department and receiving appropriate medical care will be described.
  • Fig. 1 is an explanatory diagram explaining the outline of the determination process.
  • the user makes a plurality of specified facial expressions and takes a facial image with each facial expression.
  • In FIG. 1, "No. 1" is "rest", "No. 2" is "wrinkling the forehead", "No. 3" is "slightly closed eyes", and "No. 4" is "strongly closed eyes".
  • These facial expressions are four of the ten facial expressions conventionally used in the Yanagihara method.
  • These facial expressions are examples and are not intended to be limiting. For example, five or more, or three or fewer, facial expressions may be used, and expressions other than those specified by the Yanagihara method may also be used.
  • the presence or absence of facial paralysis is determined based on multiple facial images. For example, if a plurality of facial images change according to a designated facial expression, it is determined that there is no facial paralysis.
  • a method for determining the presence or absence of facial nerve palsy based on a facial image is known, for example, from Patent Document 1, and therefore detailed description thereof will be omitted.
  • the user is notified that there is no facial paralysis. For example, a patient who has had a stroke and is concerned about the risk of another stroke can be reassured by a notification that he or she does not have facial paralysis.
  • When it is determined that there is facial nerve palsy, feature amounts are calculated for each of multiple evaluation parts such as the forehead and eyebrows. Examples of the evaluation parts and feature amounts will be described later. Whether the facial paralysis is peripheral paralysis or central paralysis is determined by combining the feature amounts of the plurality of evaluation parts.
  • FIG. 2 is an explanatory diagram for explaining the configuration of the information processing device 20.
  • the information processing device 20 includes a control unit 21, a main storage device 22, an auxiliary storage device 23, a communication unit 24, a display unit 25, an input unit 26, a speaker 27, an imaging unit 28, and a bus.
  • the control unit 21 is an arithmetic control device that executes the program of the present embodiment.
  • One or a plurality of CPUs (Central Processing Units), GPUs (Graphics Processing Units), multi-core CPUs, or the like is used for the control unit 21 .
  • the control unit 21 is connected to each hardware unit forming the information processing apparatus 20 via a bus.
  • the main storage device 22 is a storage device such as SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), flash memory, or the like.
  • the main storage device 22 temporarily stores information necessary during the process performed by the control unit 21 and the program being executed by the control unit 21 .
  • the auxiliary storage device 23 is a storage device such as SRAM, flash memory, hard disk, or magnetic tape.
  • the auxiliary storage device 23 stores a feature DB (database) 41, a determination model 46, programs to be executed by the control unit 21, and various data necessary for executing the programs.
  • the feature amount DB 41 and the judgment model 46 may be stored in an external large-capacity storage device connected to the auxiliary storage device 23 .
  • the communication unit 24 is an interface that performs communication between the information processing device 20 and the network.
  • the display unit 25 is, for example, a liquid crystal display panel or an organic EL (electro-luminescence) panel.
  • The input unit 26 is, for example, a keyboard, a mouse, and a microphone. The display unit 25 and the input unit 26 may be stacked to form a touch panel.
  • the imaging unit 28 is, for example, a general camera.
  • the imaging unit 28 may be, for example, an infrared camera.
  • the photographing unit 28 may be a stereoscopic photographing camera configured with two cameras.
  • the imaging unit 28 may be a 3D TOF (3 Dimension Time of Flight) sensor. When using a camera for stereoscopic photography or a 3D TOF sensor, the photographing unit 28 can photograph a three-dimensional face image.
  • the display unit 25, the input unit 26, the speaker 27 and the imaging unit 28 may be external devices.
  • the photographing unit 28 is desirably arranged at a position where the user's face can be photographed while the user is looking at the display unit 25 .
  • the information processing device 20 is a general-purpose information device used by a user, such as a smartphone, personal computer, tablet, or smart speaker.
  • the information processing device 20 may be dedicated hardware prepared for the determination method of the present embodiment.
  • Information processing device 20 may be incorporated in, for example, a pet-type robot or a care robot.
  • The information processing device 20 may also be configured by combining so-called IoT devices, such as the display unit 25, the input unit 26, the speaker 27, and the photographing unit 28 placed near the user, with a large-scale computer, a virtual machine, a cloud computing system, or the like connected via a network.
  • the functions of the information processing device 20 may be implemented by cooperating with a plurality of pieces of hardware. For example, the determination of the presence or absence of facial nerve palsy described using FIG. 1, the feature amount calculation for each part, and the determination of the type of paralysis may be performed by different hardware.
  • the control unit 21 displays the image captured through the image capturing unit 28 on the display unit 25 .
  • The control unit 21 may display a guide indicating the appropriate face position and the like in the image, to guide the user to an appropriate position and distance from the photographing unit 28. If the imaging unit 28 has a sufficiently high resolution, the control unit 21 may cut out an appropriate range from the image captured by the imaging unit 28 and display it on the display unit 25.
  • the control unit 21 outputs from the speaker 27 a voice commanding a "rest" facial expression such as "open your eyes and close your mouth".
  • the control unit 21 may display on the display unit 25 an illustration, characters, or the like explaining an appropriate facial expression.
  • the user makes an instructed facial expression while confirming his/her own face displayed on the display unit 25 .
  • the control unit 21 takes a face image.
  • the control unit 21 calculates the feature amount of each part from the photographed face image and records it in the feature amount DB 41 .
  • the control unit 21 outputs a voice commanding the next facial expression from the speaker 27, such as "Look up and wrinkle your forehead.”
  • the user makes the indicated facial expression.
  • the control unit 21 takes a face image.
  • the control unit 21 calculates feature amounts and records them in the feature amount DB 41 .
  • the control unit 21 repeats the above processing, captures a face image with a predetermined expression, and records the feature amount. Note that the control unit 21 may output an instruction to return to the initial "rest” facial expression between each facial expression. The control unit 21 may capture a face image and record the feature amount each time the user returns to the "rest" expression.
  • Table 1 is a table outlining the method of distinguishing between central paralysis and peripheral paralysis.
  • the bilateral innervation area is the area where nerves from the left and right cerebral cortices reach, and is generally the area above the eyes.
  • The unilateral innervation area is an area reached only by nerves from the contralateral cerebral cortex, and is generally the area below the eyes.
  • Table 2 is a table showing the parts belonging to the bilateral dominant region and the unilateral dominant region, respectively, and an overview of the feature values of each part.
  • Table 2 shows examples of feature amounts that can be calculated based on captured images of the four types of facial expressions described using FIG. 1. A description of each feature amount will be given later.
  • The control unit 21 determines that there is no paralysis when the amount of change in a feature amount between different facial expressions is greater than a predetermined threshold.
  • The control unit 21 may score the amount of change in the feature amount for each part of the bilaterally innervated area according to a predetermined procedure, and determine the presence or absence of paralysis in the bilaterally innervated area based on the total score or the like.
  • The control unit 21 determines the presence or absence of paralysis based on the combination of the difference between the left and right feature amounts and the amount of change in the feature amount between different facial expressions. For example, when the difference between the left and right feature amounts does not increase even when the facial expression is changed, the control unit 21 determines that there is no paralysis.
  • The control unit 21 may score the difference between the left and right feature amounts and the amount of change in the feature amount for each part of the unilaterally innervated area according to a predetermined procedure, and determine the presence or absence of paralysis in the unilaterally innervated area based on the total score or the like, as sketched below.
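  • A minimal Python sketch of the region scoring described above follows. The thresholds and the way scores are combined are illustrative assumptions; the patent does not specify concrete values.

```python
# Illustrative scoring of paralysis signs per region; thresholds are assumptions.

def score_bilateral_region(change_by_part: dict[str, float],
                           change_threshold: float = 0.2) -> int:
    """Bilaterally innervated area (above the eyes): a part counts as a sign of
    paralysis when its feature amount barely changes between expressions."""
    return sum(1 for change in change_by_part.values() if change <= change_threshold)


def score_unilateral_region(left_right_diff: dict[str, float],
                            change_by_part: dict[str, float],
                            diff_threshold: float = 0.15,
                            change_threshold: float = 0.2) -> int:
    """Unilaterally innervated area (below the eyes): combine the left-right
    asymmetry of a part with how much it changes between expressions."""
    score = 0
    for part, diff in left_right_diff.items():
        grows = change_by_part.get(part, 0.0) > change_threshold
        # An asymmetry that appears or grows when the expression changes is
        # treated as a sign of paralysis in that part.
        if diff > diff_threshold and grows:
            score += 1
    return score
```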
  • FIG. 3 is an explanatory diagram explaining the feature points 49 of the face image.
  • Each feature point 49 is indicated by a black circle in FIG.
  • Characteristic points 49 that are particularly used in the following description are denoted by alphabetical symbols such as characteristic point 49a and characteristic point 49b. Extraction of the feature points 49 can be realized by a known method such as OpenPose, and therefore detailed description is omitted.
  • FIG. 3 will be described as an example of a so-called mirror image format, in which the user's right-hand side is displayed on the right side of FIG. 3 and the user's left-hand side is displayed on the left side of FIG. 3. Accordingly, the eye on the right side of FIG. 3 corresponds to the user's right eye.
  • the control unit 21 calculates the scale of the face image based on the feature points 49 described using FIG. 3 and the size of the user's face registered in advance.
  • the control unit 21 may calculate the scale of the face image based on the average face size of the user.
  • the average face size may be defined for each user's gender or age, for example.
  • the case where the coordinates of each feature point 49 are determined by the coordinates corresponding to the actual size of the user's face will be described as an example.
  • the control unit 21 performs edge detection on the forehead portion of the face image. Since edge detection is an image processing technique that has been commonly used in the past, detailed description thereof will be omitted. The number of horizontal line-shaped edges detected by edge detection is the number of wrinkles on the forehead.
  • the control unit 21 calculates the depth of wrinkles based on the difference in brightness between the edge portion and its surroundings. Note that the relationship between the difference in brightness and the depth of wrinkles is measured in advance and incorporated into the program of the present embodiment. If the photographing unit 28 has a function of photographing a three-dimensional shape, the control unit 21 can directly read the depth of wrinkles from the face image.
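  • As a concrete illustration of this step, the following Python sketch counts near-horizontal edges in a forehead region and derives a brightness-based depth proxy. It assumes OpenCV and NumPy; the region of interest, the Canny thresholds, and the brightness-to-depth mapping are assumptions, since the patent only states that this relationship is measured in advance.

```python
import cv2
import numpy as np

def forehead_wrinkles(face_bgr: np.ndarray, forehead_roi: tuple[int, int, int, int]):
    """Return (number of wrinkles, depth proxy) for the given forehead ROI (x, y, w, h)."""
    x, y, w, h = forehead_roi
    gray = cv2.cvtColor(face_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 120)

    # Keep only near-horizontal segments: these correspond to forehead wrinkles.
    segments = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=30,
                               minLineLength=w // 4, maxLineGap=5)
    wrinkles = []
    if segments is not None:
        for x1, y1, x2, y2 in segments[:, 0]:
            angle = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1)))
            if angle < 15:  # roughly horizontal
                wrinkles.append((x1, y1, x2, y2))

    # Depth proxy: brightness drop of the edge pixels relative to the ROI mean.
    depth = float(gray.mean() - gray[edges > 0].mean()) if wrinkles else 0.0
    return len(wrinkles), depth
```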
  • the eyebrow angle means the angle between the left and right eyebrows and the midline of the face.
  • the midline of the face is a straight line passing through the bridge of the nose.
  • the control unit 21 calculates a straight line L1 obtained by linearly approximating a plurality of feature points 49a indicating the bridge of the nose.
  • the control unit 21 calculates a straight line L2 connecting the feature points 49b and 49c indicating both ends of the user's right eyebrow.
  • The control unit 21 calculates the angle θa formed by L1 and L2. θa is the angle of the right eyebrow.
  • the control unit 21 calculates an arc that approximates the array of all feature points 49 related to the right eyebrow.
  • the reciprocal of the radius of the calculated arc is the curvature of the right eyebrow.
  • the control unit 21 similarly calculates the angle and curvature of the left eyebrow.
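  • The eyebrow angle and curvature can be computed from the landmark coordinates as in the following NumPy sketch. The circle fit used for the curvature is an ordinary algebraic (Kasa) least-squares fit, which is one of several equally valid choices rather than a method specified by the patent.

```python
import numpy as np

def _direction(points: np.ndarray) -> np.ndarray:
    """Principal direction of a set of 2-D points (unit vector)."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]

def eyebrow_angle(nose_bridge_pts: np.ndarray,
                  brow_inner: np.ndarray, brow_outer: np.ndarray) -> float:
    """Angle in degrees between the midline L1 (nose-bridge fit) and the brow line L2."""
    l1 = _direction(nose_bridge_pts)
    l2 = brow_outer - brow_inner
    cos = abs(np.dot(l1, l2)) / (np.linalg.norm(l1) * np.linalg.norm(l2))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def eyebrow_curvature(brow_pts: np.ndarray) -> float:
    """Curvature (1/radius) of the circle best fitting the brow landmarks (Kasa fit)."""
    x, y = brow_pts[:, 0], brow_pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    b = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    radius = np.sqrt(c + cx ** 2 + cy ** 2)
    return float(1.0 / radius)
```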
  • The control unit 21 calculates the area of the region surrounded by the line connecting the feature points 49 of the user's left eyebrow and the line connecting the feature points 49 of the upper edge of the left eye.
  • the ends of the two curves are interpolated by straight lines, for example.
  • The portion for which the area is calculated is indicated by left-down hatching in FIG. 3.
  • the calculated area is the area between the left eyebrow and the left eye.
  • the control unit 21 similarly calculates the area between the right eyebrow and the right eye.
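  • The area between an eyebrow and the upper edge of the eye on the same side reduces to the area of a polygon formed by the two landmark chains, as in the sketch below (assuming NumPy; the landmark ordering is an assumption).

```python
import numpy as np

def brow_eye_area(brow_pts: np.ndarray, upper_lid_pts: np.ndarray) -> float:
    """Area enclosed by the brow landmarks and the upper-eyelid landmarks.

    The eyelid points are reversed so that the polygon is traversed once around,
    with the end points joined by straight segments as described in the text.
    """
    polygon = np.vstack([brow_pts, upper_lid_pts[::-1]])
    x, y = polygon[:, 0], polygon[:, 1]
    # Shoelace formula for the area of a simple polygon.
    return float(0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1))))
```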
  • The control unit 21 performs edge detection on the portion of the face image from the left side of the nose to the vicinity of the left corner of the mouth.
  • The length of the sharpest edge detected by the edge detection is the length of the left nasolabial fold.
  • The control unit 21 calculates the depth of the left nasolabial fold. Since the depth calculation method is the same as that for the depth of wrinkles on the forehead, the description is omitted.
  • The control unit 21 similarly calculates the length and depth of the right nasolabial fold.
  • the control unit 21 calculates a straight line L3 that connects the feature points 49d and 49e indicating the right and left corners of the user's mouth.
  • the control unit 21 calculates a straight line L4 connecting the representative point of the left eye and the representative point of the right eye.
  • the representative point is, for example, the center of gravity of each eye.
  • the representative point may be the inner corner or the outer corner of each eye.
  • the control unit 21 calculates the angle formed by L3 and L4. The calculated angle is the angle of the corner of the mouth.
  • The control unit 21 calculates the area of the region surrounded by the curve or polygonal line connecting the feature points 49 of the lower edge of the upper lip and the feature points 49 of the upper edge of the lower lip. The portion for which the area is calculated is indicated by hatching sloping downward to the right in FIG. 3. The calculated area is the area of the mouth. If the user's mouth is tightly closed, the mouth area is zero.
  • the control unit 21 calculates an arc approximating the arrangement of all feature points 49 related to the contour of the right side of the face.
  • the reciprocal of the calculated radius of the arc is the curvature of the right contour.
  • the control unit 21 similarly calculates the curvature of the contour on the left side.
  • As described above, the control unit 21 can calculate the feature amounts shown in Table 2.
  • The control unit 21 calculates the feature amounts for each face image obtained by photographing the plurality of facial expressions described using FIG. 1.
  • FIG. 4 is an explanatory diagram for explaining the record layout of the feature DB 41.
  • the feature amount DB 41 is a DB that records the feature amount for each part calculated based on each face image.
  • the feature amount DB 41 has a date field and a feature amount field.
  • the feature amount field has facial expression fields corresponding to the respective facial expressions described using FIG.
  • Each facial expression field, such as the rest field, has part fields indicating parts of the face, namely a forehead field, an eyebrow field, an eyebrow-to-eye field, a nasolabial fold field, a mouth corner field, a mouth field, and a contour field.
  • the forehead field has a wrinkle number field and a wrinkle depth field.
  • the eyebrow field has a left field and a right field.
  • the left and right fields have angle and curvature fields respectively.
  • the eyebrow-to-eye field has a left area field and a right area field.
  • The nasolabial fold field has a right field and a left field.
  • the right field and left field each have a depth field and a length field.
  • the mouth corner field has an angle field.
  • the mouth field has an area field.
  • the contour field has a right field and a left field.
  • the date field records the date when the type of paralysis was determined.
  • the number of wrinkles field records the number of wrinkles on the forehead.
  • a representative value of the depth of wrinkles is recorded in the wrinkle depth field.
  • the depth of each wrinkle may be recorded in the wrinkle depth field.
  • the angles formed by the left and right eyebrows and the bridge of the nose are recorded in the angle field within the eyebrow field.
  • the curvature field records the curvature of the left and right eyebrows.
  • the area between the right eyebrow and the upper edge of the right eye is recorded in the right area field in the eyebrow-to-eye field.
  • the left area field records the area between the left eyebrow and the upper edge of the left eye.
  • In the depth field within the nasolabial fold field, the depth of each of the left and right nasolabial folds is recorded.
  • The length field records the length of each of the left and right nasolabial folds. The angle formed by the line connecting the left and right corners of the mouth and the line connecting the centers of the left and right eyes is recorded in the mouth corner field.
  • the area of the region surrounded by the upper edge of the lower lip and the lower edge of the upper lip is recorded in the mouth field.
  • Left and right fields within the contour field record the curvature of the left and right contour lines.
  • the feature amount DB 41 may have fields for recording face images corresponding to respective facial expressions.
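  • For illustration, one record of the feature amount DB 41 could be represented in Python as follows; the field names and the set of expressions are assumptions that mirror FIG. 4 and Table 2, not the actual schema.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SideValue:
    left: float = 0.0
    right: float = 0.0

@dataclass
class PartFeatures:
    forehead_wrinkle_count: int = 0
    forehead_wrinkle_depth: float = 0.0
    eyebrow_angle: SideValue = field(default_factory=SideValue)
    eyebrow_curvature: SideValue = field(default_factory=SideValue)
    brow_eye_area: SideValue = field(default_factory=SideValue)
    nasolabial_length: SideValue = field(default_factory=SideValue)
    nasolabial_depth: SideValue = field(default_factory=SideValue)
    mouth_corner_angle: float = 0.0
    mouth_area: float = 0.0
    contour_curvature: SideValue = field(default_factory=SideValue)

@dataclass
class FeatureRecord:
    recorded_on: date
    # One PartFeatures per instructed expression ("rest", "wrinkle forehead", ...).
    by_expression: dict[str, PartFeatures] = field(default_factory=dict)
```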
  • FIG. 5 is an explanatory diagram for explaining the judgment model 46.
  • The judgment model 46 is a model that accepts the series of feature amounts recorded in one record of the feature amount DB 41 described using FIG. 4, and outputs the probability that the paralysis is peripheral paralysis and the probability that it is central paralysis.
  • the judgment model 46 is a logic-based program that outputs judgment results according to a known judgment method such as the Yanagihara method.
  • the user's attending physician or the like may be able to appropriately adjust the determination parameters and the like of the determination model 46 in accordance with the user's symptoms such as pre-existing diseases.
  • the judgment model 46 may be a model generated by machine learning using training data recording combinations of multiple sets of feature values and judgment results.
  • the determination model 46 may be configured in hardware using FPGA (Field Programmable Gate Array) or ASIC (Application Specific Integrated Circuit).
  • the determination model 46 may output only one of the probability of "peripheral paralysis” or the probability of "central paralysis”.
  • the judgment model 46 may output a judgment result indicating either "peripheral paralysis” or "central paralysis”.
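  • A minimal rule-based sketch of what determination model 46 might look like when implemented as a logic-based program: it turns the region scores from the earlier sketch into a pair of probabilities. The mapping below is an illustrative assumption and is not the Yanagihara scoring itself.

```python
def judge_paralysis_type(bilateral_score: int, unilateral_score: int,
                         max_bilateral: int, max_unilateral: int) -> dict[str, float]:
    """Map upper-face and lower-face findings to rough probabilities (assumed weighting)."""
    upper = bilateral_score / max_bilateral if max_bilateral else 0.0
    lower = unilateral_score / max_unilateral if max_unilateral else 0.0
    # Central paralysis spares the bilaterally innervated (upper-face) region,
    # so strong lower-face findings with a quiet upper face point to "central".
    central = lower * (1.0 - upper)
    peripheral = max(lower, upper) - central
    total = central + peripheral or 1.0  # no findings at all: both stay at zero
    return {"central": central / total, "peripheral": peripheral / total}
```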
  • FIG. 6 is a flowchart explaining the flow of program processing.
  • Control unit 21 outputs a voice commanding a facial expression such as "Please open your eyes and close your mouth” from speaker 27 to instruct the user on the facial expression to be captured next (step S501).
  • By step S501, the control unit 21 implements the function of the facial expression instruction unit.
  • the control unit 21 controls the photographing unit 28 to photograph a face image and stores it in the auxiliary storage device 23 (step S502). By step S502, the control unit 21 realizes the function of the image acquiring unit that acquires the user's face image.
  • the control unit 21 determines whether or not shooting of a predetermined facial expression has been completed (step S503). If it is determined that the processing has not ended (NO in step S503), the control unit 21 returns to step S501.
  • When it is determined that photographing has ended (YES in step S503), the control unit 21 determines whether or not there is facial nerve paralysis based on the series of photographed face images (step S504).
  • By step S504, the control unit 21 realizes the function of the paralysis determination unit.
  • When it is determined that there is no facial paralysis (NO in step S504), the control unit 21 outputs a message such as "no facial paralysis" from the speaker 27 or the display unit 25 (step S505). The control unit 21 then terminates the processing.
  • When it is determined that there is facial paralysis (YES in step S504), the control unit 21 selects one face image from the face images captured in step S502 (step S511). The control unit 21 extracts the feature points 49 described using FIG. 3 from the face image (step S512).
  • the control unit 21 selects one part described using Table 2 (step S513).
  • the control unit 21 starts a subroutine for feature amount calculation (step S514).
  • the feature amount calculation subroutine is a subroutine for calculating the feature amount related to the part selected in step S513.
  • the processing flow of the feature amount calculation subroutine for each part will be described later.
  • By step S514, the control unit 21 realizes the function of the feature amount acquisition unit.
  • the control unit 21 records the calculated feature amount in the feature amount DB 41 (step S515).
  • the control unit 21 realizes the function of the feature amount recording unit in step S515.
  • the control unit 21 determines whether or not the calculation of the feature amount for the predetermined part has ended (step S516). If it is determined that the processing has not ended (NO in step S516), the control unit 21 returns to step S513.
  • When it is determined that the calculation for the predetermined parts has ended (YES in step S516), the control unit 21 determines whether processing of all face images captured in step S502 has been completed (step S517). If it is determined that the processing has not ended (NO in step S517), the control unit 21 returns to step S511.
  • When it is determined that the processing has ended (YES in step S517), the control unit 21 uses the determination model 46 described with reference to FIG. 5 to obtain the probability that the user's paralysis is central paralysis (step S518).
  • The control unit 21 determines whether the probability of central paralysis is high (step S519). It should be noted that in step S519, it is determined that there is central paralysis when the probability of central paralysis exceeds a predetermined determination threshold. By step S519, the control unit 21 realizes the function of the type determination unit that determines whether the user has peripheral paralysis or central paralysis.
  • For example, the determination threshold used in step S519 is set to 20%.
  • In that case, whenever the probability of central paralysis exceeds 20%, the control unit 21 determines that there is central paralysis in step S519 (YES in step S519).
  • By setting the determination threshold appropriately in this way, the risk of overlooking central paralysis can be reduced.
  • If it is determined that the user has central paralysis (YES in step S519), the control unit 21 outputs, by voice or text, a message that briefly indicates the determination result, for example, "Because paralysis suggesting a stroke is suspected, it is recommended to see a neurosurgeon." (step S520).
  • The control unit 21 may instead output, by voice or text, a message indicating the determination result together with its basis, for example, "There is paralysis around the nasolabial fold, and central facial nerve palsy, which is a symptom of stroke, is suspected with a probability of 70%."
  • The control unit 21 may superimpose, on the face image, an indicator showing the location where it was determined that there is paralysis, as the basis for the determination. By showing the basis for the determination, the user can appreciate the need for a prompt medical examination.
  • the control unit 21 may display a list of nearby neurological departments or neurosurgeons on the display unit 25.
  • the control unit 21 may display a search button for searching neurology or neurosurgery on the display unit 25 .
  • the control unit 21 may automatically notify a pre-registered contact such as a family member, helper, or primary care doctor that the user has developed central palsy.
  • the control unit 21 may receive an instruction from the user as to whether or not to notify the contact. If the information processing device 20 is a device having a call function such as a smart phone, the control unit 21 may display an emergency call button for making an emergency call such as 119, for example.
  • If it is determined that the user does not have central paralysis (NO in step S519), the control unit 21 outputs, by voice or text, a message that briefly indicates the determination result, for example, "Because facial nerve paralysis different from a stroke is suspected, a visit to an otolaryngologist is recommended." (step S521). After completing step S520 or step S521, the control unit 21 ends the process. Through steps S520 and S521, the control unit 21 realizes the function of the type output unit that outputs whether the user has peripheral paralysis or central paralysis.
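  • The overall flow of FIG. 6 can be summarised in pseudocode-style Python as below. Every callable is injected by the caller and stands in for processing described in the text (image capture, speech output, landmark extraction, the determination model); none of them is an actual API of the device, and the 20% threshold is simply the example value given above.

```python
def run_screening(expressions, parts, capture, speak, show,
                  has_palsy, extract_points, calc_feature, judge,
                  central_threshold=0.2):
    """Sketch of steps S501-S521; every dependency is passed in by the caller."""
    images = {}
    for expression in expressions:                       # S501-S503
        speak(f"Please make the following expression: {expression}")
        images[expression] = capture()

    if not has_palsy(images):                            # S504
        show("No facial paralysis was detected.")        # S505
        return

    record = {}
    for expression, image in images.items():             # S511, S517
        landmarks = extract_points(image)                 # S512
        record[expression] = {part: calc_feature(part, image, landmarks)
                              for part in parts}          # S513-S516

    p_central = judge(record)                             # S518
    if p_central > central_threshold:                     # S519
        show("A stroke is suspected; please see neurosurgery or neurology promptly.")        # S520
    else:
        show("Peripheral facial palsy is suspected; an otolaryngology visit is recommended.") # S521
```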
  • FIG. 7 is a flow chart for explaining the processing flow of the subroutine for calculating feature values relating to the forehead.
  • the feature amount calculation subroutine is a subroutine for calculating the feature amount related to the part selected in step S513.
  • the control unit 21 starts a subroutine described using FIG. 7 in step S514.
  • the control unit 21 performs edge detection on the forehead portion of the face image (step S531). As described above, the edges detected as horizontal lines are wrinkles on the forehead. The control unit 21 calculates the number of wrinkles based on the edge detection result (step S532). Control unit 21 calculates the depth of wrinkles, for example, based on the difference in brightness between the edge portion and its surroundings (step S533). The control unit 21 terminates the processing.
  • control unit 21 calculates the depth of all detected forehead wrinkles.
  • the control unit 21 may further calculate a representative value such as an average value or maximum value of depth.
  • Control unit 21 may calculate the depth of a typical wrinkle, such as the longest wrinkle or the central wrinkle, among the detected forehead wrinkles.
  • FIG. 8 is a flow chart for explaining the processing flow of the subroutine for feature amount calculation regarding eyebrows.
  • the control unit 21 starts a subroutine described using FIG. 8 in step S514.
  • the control unit 21 performs linear approximation of the plurality of feature points 49 indicating the bridge of the nose to calculate the median line (step S541).
  • the control unit 21 calculates a straight line connecting the characteristic points 49 indicating both ends of one eyebrow (step S542).
  • the control unit 21 calculates the angle between the straight line calculated in step S541 and the straight line calculated in step S542 (step S543).
  • the control unit 21 calculates an arc that approximates the arrangement of the plurality of feature points 49 indicating one eyebrow (step S544). Through step S544, the coordinates of the center point of the arc approximating one eyebrow and the radius are calculated. The control unit 21 calculates the curvature of the eyebrow, which is the reciprocal of the radius (step S545). The control unit 21 terminates the processing.
  • FIG. 9 is a flow chart for explaining the processing flow of a subroutine for feature amount calculation between eyebrows and eyes.
  • the control unit 21 activates a subroutine described using FIG. 9 in step S514.
  • the control unit 21 acquires coordinates of a plurality of characteristic points 49 indicating one eyebrow (step S551).
  • the control unit 21 acquires coordinates of a plurality of feature points 49 indicating the upper edge of the eye on the same side (step S552).
  • the control unit 21 calculates the area of the area surrounded by straight lines or curved lines connecting the feature points 49 acquired in steps S551 and S552 (step S553).
  • the control unit 21 terminates the processing.
  • FIG. 10 is a flow chart for explaining the processing flow of the subroutine for feature amount calculation related to the law line.
  • the control unit 21 performs edge detection on the part of the face image from one side of the nose to the vicinity of the corner of the mouth on the same side (step S561).
  • the control unit 21 selects an edge corresponding to the law line from the detected edges (step S562). Specifically, the control unit 21 selects the sharpest edge or the longest edge from among the edges that are farther from the midline toward the lower side of the face.
  • the control unit 21 calculates the length of the law line (step S563).
  • Control unit 21 calculates the depth of the law line, for example, based on the difference in brightness between the edge portion and its surroundings (step S564).
  • the control unit 21 terminates the processing.
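  • The following sketch illustrates steps S561 to S564, assuming OpenCV and NumPy and assuming that the cropped grayscale region from the side of the nose to the mouth corner is passed in. Choosing the longest detected edge as the fold and using a brightness drop as the depth are simplifying assumptions, not the exact criteria of the patent.

```python
import cv2
import numpy as np

def nasolabial_fold(gray_roi: np.ndarray) -> tuple[float, float]:
    """Return (length, depth proxy) of the nasolabial fold in the given grayscale ROI."""
    edges = cv2.Canny(gray_roi, 40, 100)                                   # S561
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return 0.0, 0.0

    # S562: pick the longest candidate edge as the fold.
    fold = max(contours, key=lambda c: cv2.arcLength(c, False))
    length = float(cv2.arcLength(fold, False))                             # S563

    # S564: depth proxy from the brightness drop along the fold.
    mask = np.zeros_like(gray_roi)
    cv2.drawContours(mask, [fold], -1, 255, 1)
    depth = float(gray_roi.mean() - gray_roi[mask > 0].mean())
    return length, depth
```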
  • FIG. 11 is a flowchart for explaining the processing flow of a subroutine for feature amount calculation related to mouth corners.
  • the control unit 21 starts a subroutine described using FIG. 11 in step S514.
  • the control unit 21 calculates straight lines connecting the characteristic points 49 indicating the left and right corners of the mouth (step S571).
  • the control unit 21 calculates representative points for each of the left and right eyes. For example, the control unit 21 calculates the center of gravity of a plurality of feature points 49 corresponding to the edge of one eye and uses it as a representative point.
  • the control unit 21 may use, for example, the feature point 49 indicating the inner corner or the outer corner of the eye as the representative point.
  • the control unit 21 calculates a straight line connecting the representative points of the left and right eyes (step S572).
  • the control unit 21 calculates the angle formed by the straight line calculated in step S571 and the straight line calculated in step S572 (step S573). The control unit 21 terminates the process.
  • FIG. 12 is a flow chart for explaining the processing flow of the subroutine for calculating feature values relating to the mouth.
  • the control unit 21 starts a subroutine described using FIG. 12 in step S514.
  • the control unit 21 acquires coordinates of a plurality of feature points 49 corresponding to the lower edge of the upper lip (step S581).
  • the control unit 21 acquires coordinates of a plurality of feature points 49 corresponding to the upper edge of the lower lip (step S582).
  • the control unit 21 calculates the area of the area surrounded by straight lines or curved lines connecting the feature points 49 acquired in steps S581 and S582 (step S583).
  • the control unit 21 terminates the processing.
  • FIG. 13 is a flow chart for explaining the processing flow of a subroutine for contour feature amount calculation.
  • the control unit 21 starts a subroutine described using FIG. 13 in step S514.
  • the control unit 21 calculates an arc approximating the arrangement of the plurality of characteristic points 49 representing one of the left and right contours (step S591). Through step S591, the coordinates of the center point of the arc that approximates one contour and the radius are calculated. The control unit 21 calculates the curvature of the contour, which is the reciprocal of the radius (step S592). The control unit 21 terminates the processing.
  • FIG. 14 is an explanatory diagram for explaining a screen example.
  • FIG. 14 shows an example of a screen on which the control unit 21 notifies the user that there is a symptom of central paralysis in step S520 of the flowchart explained using FIG.
  • On the screen shown in FIG. 14, a face image column 71, a result column 72, a medical institution contact button 73, and a family contact button 74 are displayed.
  • the result column 72 displays sentences indicating the locations where it was determined that there was paralysis and the probability of having central paralysis.
  • the user's face image is displayed in the face image column 71 .
  • an index 76 is displayed to indicate the location where it is determined that there is paralysis.
  • When the medical institution contact button 73 is selected, the control unit 21 displays pre-registered contact information for medical institutions such as a neurosurgery department.
  • the control unit 21 may input the user's address and a keyword such as "neurosurgery" into a known search engine and display the search results. If the information processing device 20 is a device such as a smart phone that has a call function, the control unit 21 may initiate a call to a pre-registered medical institution.
  • When the family contact button 74 is selected, the control unit 21 sends a notification to the pre-registered family members that the user has developed central paralysis.
  • The notification may be sent by e-mail, SMS (Short Message Service), SNS (Social Network Service) message, or the like.
  • If the information processing device 20 is a device such as a smartphone that has a call function, the control unit 21 may initiate a call to a family member's phone number registered in advance.
  • According to the present embodiment, it is possible to provide the information processing apparatus 20 that determines whether the facial paralysis that the user has developed is peripheral paralysis or central paralysis. It is also possible to provide the information processing apparatus 20 that recommends a visit to an appropriate department according to the type of paralysis that the user has developed. By allowing the user to promptly visit an appropriate clinical department, the information processing apparatus 20 can contribute to a significant improvement in the user's prognosis.
  • According to the present embodiment, it is possible to provide the information processing device 20 that determines the type of facial paralysis from the user's face image alone. Since the operation is simple, it is possible to provide the information processing apparatus 20 that even elderly people can easily use.
  • According to the present embodiment, it is possible to provide the information processing device 20 that notifies the user when the user has not developed facial paralysis.
  • A patient who feels uneasy about the possible onset of facial paralysis can use the information processing apparatus of the present embodiment at any time to confirm that facial paralysis has not developed, and feel relieved.
  • the information processing apparatus 20 of the present embodiment may be used, for example, by firefighters in charge of emergency transportation. By determining the type of paralysis that the patient to be transported has developed, the transport destination can be appropriately selected.
  • The control unit 21 may acquire each feature amount described using Table 2 by using a model generated so as to output the feature amount when a face image is input.
  • Such a model is generated by machine learning, for example, using training data recording combinations of face images and feature amounts.
  • The model for outputting the feature amounts and the determination model 46 may be integrated into a single model that outputs the type of paralysis when a face image is input.
  • In a modification, the control unit 21 displays the effect of rehabilitation based on the feature amounts recorded in time series in the feature amount DB 41.
  • FIG. 15 is an explanatory diagram for explaining a screen example of the modified example.
  • A face image column 71 and a result column 72 are displayed on the screen shown in FIG. 15.
  • The control unit 21 compares the previously recorded feature amounts with the latest feature amounts for the nasolabial fold portion where the facial nerve paralysis has developed.
  • the control unit 21 displays the change in the user's state between the past and the present in the result column 72 based on the difference in feature amount.
  • the information processing apparatus 20 that increases the user's motivation for rehabilitation can be provided by objectively judging the effect of rehabilitation and displaying it in the result column 72 .
  • [Embodiment 2] The present embodiment relates to an information processing apparatus 20 that determines the type of paralysis using the facial images in which the features of each part are most likely to appear. Descriptions of parts common to Embodiment 1 are omitted.
  • Table 3 shows, for each part, the facial expressions in which the influence of the type of paralysis is likely to appear.
  • For some parts, the two facial expressions of "rest" and "strongly closed eyes" are easily influenced by whether the paralysis is central or peripheral. Therefore, even if the feature amounts of such parts are not calculated for the facial expressions of "wrinkling the forehead" and "slightly closed eyes", the determination result is not significantly affected.
  • In step S513 of the program described using FIG. 6, the control unit 21 selects only the parts for which the facial expression of the face image selected in step S511 is suitable. For example, if a face image with the "rest" expression is selected in step S511, the control unit 21 sequentially selects the "forehead", "nasolabial folds", "mouth corners", "mouth", and "contour" in step S513.
  • For the other facial expressions, the control unit 21 sequentially selects the "eyebrows" and "between the eyebrows and the eyes" in step S513, as sketched below.
  • According to the present embodiment, it is possible to provide an information processing apparatus 20 that performs an appropriate determination with a small amount of calculation by not calculating feature amounts that have little effect on the determination of central paralysis or peripheral paralysis.
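  • The part selection of Embodiment 2 can be expressed as a simple lookup table, as in the sketch below. The exact mapping is an assumption reconstructed from the text, since Table 3 itself is not reproduced here.

```python
# Hypothetical expression-to-parts mapping for step S513 (Embodiment 2).
PARTS_BY_EXPRESSION = {
    "rest":                 ["forehead", "nasolabial fold", "mouth corner", "mouth", "contour"],
    "strongly closed eyes": ["forehead", "nasolabial fold", "mouth corner", "mouth", "contour"],
    "wrinkle forehead":     ["eyebrow", "between eyebrow and eye"],
    "slightly closed eyes": ["eyebrow", "between eyebrow and eye"],
}

def parts_for(expression: str) -> list[str]:
    """Parts whose feature amounts are worth calculating for the given expression."""
    return PARTS_BY_EXPRESSION.get(expression, [])
```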
  • [Embodiment 3] The present embodiment relates to a mode of realizing the information processing apparatus 20 of the above embodiments by operating a general-purpose computer 90 and a program 97 in combination. Descriptions of parts common to Embodiment 1 are omitted.
  • FIG. 16 is an explanatory diagram for explaining the configuration of the information processing device 20 according to the third embodiment.
  • the computer 90 includes a control section 21, a main storage device 22, an auxiliary storage device 23, a communication section 24, a display section 25, an input section 26, a speaker 27, an imaging section 28, a reading section 29 and a bus.
  • the computer 90 is an information device such as a general-purpose personal computer, tablet, smart phone, or server computer.
  • the program 97 is recorded on a portable recording medium 96.
  • the control unit 21 reads the program 97 via the reading unit 29 and stores it in the auxiliary storage device 23 .
  • Control unit 21 may also read program 97 stored in semiconductor memory 98 such as a flash memory installed in computer 90 .
  • the control unit 21 may download the program 97 from another server computer (not shown) connected via the communication unit 24 and a network (not shown) and store it in the auxiliary storage device 23 .
  • the program 97 is installed as a control program of the computer 90, loaded into the main storage device 22 and executed. As described above, the computer 90 functions as the information processing apparatus 20 described above. Program 97 is an example of a program product.
  • FIG. 17 is a functional block diagram of the information processing device 20 according to the fourth embodiment.
  • the information processing apparatus 20 includes an image acquisition section 81 , a feature amount acquisition section 82 and a type determination section 83 .
  • the image acquisition unit 81 acquires the user's face image.
  • the feature quantity acquisition unit 82 acquires the feature quantity of the face image based on the face image.
  • the type determination unit 83 determines whether the user has peripheral paralysis or central paralysis based on the feature amount.
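  • The three functional blocks of Embodiment 4 can be sketched as Python interfaces; the method names and signatures below are assumptions chosen to mirror the description, not an actual API of the device.

```python
from typing import Any, Protocol

class ImageAcquisitionUnit(Protocol):      # image acquisition unit 81
    def acquire_face_image(self) -> Any: ...

class FeatureAcquisitionUnit(Protocol):    # feature amount acquisition unit 82
    def acquire_features(self, face_image: Any) -> dict[str, float]: ...

class TypeDeterminationUnit(Protocol):     # type determination unit 83
    def determine(self, features: dict[str, float]) -> str:
        """Return 'peripheral' or 'central'."""
        ...

class InformationProcessingDevice:
    """Wires the three units together in the order described in Embodiment 4."""
    def __init__(self, image_unit: ImageAcquisitionUnit,
                 feature_unit: FeatureAcquisitionUnit,
                 type_unit: TypeDeterminationUnit):
        self.image_unit, self.feature_unit, self.type_unit = image_unit, feature_unit, type_unit

    def run(self) -> str:
        image = self.image_unit.acquire_face_image()
        features = self.feature_unit.acquire_features(image)
        return self.type_unit.determine(features)
```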
  • (Appendix 1) An information processing device comprising: an image acquisition unit that acquires a face image of a user; a feature amount acquisition unit that acquires a feature amount of the face image based on the face image; and a type determination unit that determines whether the user has peripheral paralysis or central paralysis based on the feature amount.
  • (Appendix 2) The information processing apparatus according to appendix 1, further comprising a paralysis determination unit that determines the presence or absence of facial nerve paralysis based on the face image, wherein the feature amount acquisition unit acquires the feature amount when the paralysis determination unit determines that there is paralysis.
  • (Appendix 3) The information processing apparatus according to appendix 1 or appendix 2, further comprising a facial expression instruction unit that instructs the user to make a facial expression, wherein the image acquisition unit acquires the face image of the user after the facial expression instruction unit issues an instruction.
  • (Appendix 4) The information processing apparatus according to appendix 3, wherein the facial expression instruction unit instructs a facial expression of wrinkling the forehead, and the feature amount acquisition unit acquires the number and depth of wrinkles on the forehead.
  • (Appendix 5) The information processing apparatus according to appendix 3 or appendix 4, wherein the facial expression instruction unit instructs a facial expression of wrinkling the forehead, a facial expression of slightly closing the eyes, and a facial expression of strongly closing the eyes, and the feature amount acquisition unit acquires the angle and curvature of the left and right eyebrows for each image.
  • (Appendix 6) The information processing apparatus according to any one of appendices 3 to 5, wherein the facial expression instruction unit instructs a facial expression of wrinkling the forehead, a facial expression of slightly closing the eyes, and a facial expression of strongly closing the eyes, and the feature amount acquisition unit acquires the area between the left eyebrow and the left eye and the area between the right eyebrow and the right eye for each image.
  • (Appendix 7) The information processing apparatus according to any one of appendices 1 to 6, further comprising a type output unit that outputs whether the user has peripheral paralysis or central paralysis.
  • (Appendix 8) The information processing apparatus according to any one of appendices 1 to 7, wherein information relating to otolaryngology is output when the type determination unit determines that the paralysis is peripheral paralysis, and information relating to neurology or neurosurgery is output when the type determination unit determines that the paralysis is central paralysis.
  • (Appendix 9) The information processing apparatus according to any one of appendices 1 to 8, wherein a predetermined contact is notified when the type determination unit determines that the user has central paralysis.
  • (Appendix 10) The information processing apparatus according to any one of appendices 1 to 9, further comprising a feature amount recording unit that records the feature amount in time series, wherein the type determination unit determines whether the user has peripheral paralysis or central paralysis based on time-series changes in the feature amount.
  • (Appendix 11) The information processing apparatus according to any one of appendices 1 to 9, further comprising a feature amount recording unit that records the feature amount in time series, wherein the user's state change is output based on time-series changes in the feature amount.
  • (Appendix 12) The information processing apparatus according to any one of appendices 1 to 11, wherein the image acquisition unit acquires the face image of the user as a three-dimensional image.
  • (Appendix 14) A program that causes a computer to execute a process of: acquiring a face image of a user; acquiring a feature amount of the face image based on the face image; and determining whether the user has peripheral paralysis or central paralysis based on the feature amount.
  • 21 control unit, 22 main storage device, 23 auxiliary storage device, 24 communication unit, 25 display unit, 26 input unit, 27 speaker, 28 photographing unit, 29 reading unit, 41 feature amount DB, 46 determination model, 49 feature point, 71 face image column, 72 result column, 73 medical institution contact button, 74 family contact button, 76 index, 81 image acquisition unit, 82 feature amount acquisition unit, 83 type determination unit, 90 computer, 96 portable recording medium, 97 program, 98 semiconductor memory

Abstract

Provided are an information processing device and the like for determining a type of facial nerve paralysis by using a face image. The information processing device comprises: an image acquisition unit for acquiring a face image of a user; a feature amount acquisition unit for acquiring a feature amount of the face image on the basis of the face image; and a type determination unit for determining whether paralysis of the user is peripheral paralysis or central paralysis on the basis of the feature amount. The information processing device comprises a paralysis determination unit for determining presence/absence of facial nerve paralysis on the basis of the face image, and the feature amount acquisition unit acquires the feature amount when the paralysis determination unit determines that there is paralysis.

Description

Information processing device, information processing method, and program
 The present invention relates to an information processing device, an information processing method, and a program.
 A determination device has been proposed that determines the possibility of a stroke based on the presence or absence of facial nerve palsy determined from the user's facial image, the results of a speech test, and the results of an interview (Patent Document 1). Early detection and treatment of stroke can greatly improve patient prognosis.
Japanese Patent Application Laid-Open No. 2020-199072
 There are two types of facial nerve paralysis: central paralysis and peripheral paralysis. Among these, when central paralysis has occurred, there is a high possibility that an intracerebral disease such as stroke has developed, and prompt treatment is required.
 However, in the determination device of Patent Document 1, only the presence or absence of facial nerve paralysis is determined using the face image. In addition to photographing a face image, a speech test and an interview are performed to determine whether or not there is a stroke, which is burdensome for the user and takes time for the examination.
 In one aspect, an object is to provide an information processing device or the like that determines the type of facial nerve paralysis using a face image.
 The information processing device includes an image acquisition unit that acquires a face image of a user, a feature amount acquisition unit that acquires a feature amount of the face image based on the face image, and a type determination unit that determines, based on the feature amount, whether the user's paralysis is peripheral paralysis or central paralysis.
 In one aspect, it is possible to provide an information processing device or the like that determines the type of facial nerve paralysis using a face image.
 FIG. 1 is an explanatory diagram explaining an outline of the determination process. FIG. 2 is an explanatory diagram explaining the configuration of the information processing device. FIG. 3 is an explanatory diagram explaining feature points of a face image. FIG. 4 is an explanatory diagram explaining the record layout of the feature amount DB. FIG. 5 is an explanatory diagram explaining the determination model. FIG. 6 is a flowchart explaining the flow of processing of the program. FIG. 7 is a flowchart explaining the flow of processing of the subroutine for calculating feature amounts relating to the forehead. FIG. 8 is a flowchart explaining the flow of processing of the subroutine for calculating feature amounts relating to the eyebrows. FIG. 9 is a flowchart explaining the flow of processing of the subroutine for calculating feature amounts relating to the area between the eyebrows and the eyes. FIG. 10 is a flowchart explaining the flow of processing of the subroutine for calculating feature amounts relating to the nasolabial folds. FIG. 11 is a flowchart explaining the flow of processing of the subroutine for calculating feature amounts relating to the mouth corners. FIG. 12 is a flowchart explaining the flow of processing of the subroutine for calculating feature amounts relating to the mouth. FIG. 13 is a flowchart explaining the flow of processing of the subroutine for calculating feature amounts relating to the contours. FIG. 14 is an explanatory diagram explaining an example of a screen. FIG. 15 is an explanatory diagram explaining an example of a screen of a modification. FIG. 16 is an explanatory diagram explaining the configuration of the information processing device according to Embodiment 3. FIG. 17 is a functional block diagram of the information processing device according to Embodiment 4.
[Embodiment 1]
 There are two types of facial paralysis: central paralysis and peripheral paralysis. The majority of facial paralysis cases are peripheral; the incidence of central paralysis is known to be less than 1 percent of all facial paralysis.
 In central paralysis, the symptoms appear mainly around the cheeks and mouth. The main causes of central paralysis are so-called strokes, such as cerebral infarction, cerebral hemorrhage, subarachnoid hemorrhage, and transient ischemic attack. It is known that the earlier these diseases are detected and treated, the better the patient's prognosis, so the urgency of starting treatment is very high. Treatment for central paralysis is performed mainly in neurology and neurosurgery departments.
 In peripheral paralysis, symptoms may appear over the entire face, from the upper face such as the forehead and eyebrows to the lower face such as the mouth. The symptoms often differ between the left and right sides of the face. The main causes of peripheral paralysis are swelling of the facial nerve and neuritis caused by the herpes virus. Early treatment is also desirable for peripheral paralysis, but the urgency is lower than for central paralysis. Treatment for peripheral paralysis is performed mainly by otolaryngologists.
 Therefore, it is important for a patient with facial paralysis to accurately identify whether it is central or peripheral paralysis and to visit the appropriate clinical department at the appropriate time. When doctors examine patients with facial paralysis, methods such as the Yanagihara method, the House-Brackmann method, and the Sunnybrook method have conventionally been used in clinical practice.
 However, it is difficult for an ordinary patient who notices symptoms of facial paralysis to distinguish between central and peripheral paralysis and to select the appropriate clinical department. For example, if a patient with central paralysis visits an otolaryngologist, the patient must then be transferred to neurosurgery or the like, delaying the start of treatment. Conversely, if many patients with peripheral paralysis visit neurosurgery, the department becomes crowded and treatment of patients who truly need urgent care may be delayed.
 In the present embodiment, an information processing device 20 (see FIG. 2) and the like that assist ordinary patients in visiting the appropriate clinical department and receiving appropriate medical care will be described.
 FIG. 1 is an explanatory diagram outlining the determination process. The user makes a plurality of specified facial expressions, and a face image is captured for each expression. In FIG. 1, "No. 1" is "rest", "No. 2" is "wrinkling the forehead", "No. 3" is "light eye closure", and "No. 4" is "strong eye closure". These expressions are four of the ten expressions conventionally used in the Yanagihara method.
 In the following description, the case where these four expressions are used is described as an example. These expressions are illustrative and not limiting: five or more, or three or fewer, expressions may be used, and expressions other than those defined in the Yanagihara method may be used.
 The presence or absence of facial paralysis is determined on the basis of the plurality of face images. For example, when the face images change in accordance with the specified expressions, it is determined that there is no facial paralysis. A method for determining the presence or absence of facial paralysis on the basis of face images is known, for example, from Patent Document 1, so a detailed description is omitted.
 When it is determined that there is no facial paralysis, the user is notified accordingly. For example, a patient who has a history of stroke and is anxious about the risk of recurrence can be reassured by confirming the notification that there is no facial paralysis.
 When it is determined that there is facial paralysis, feature amounts are calculated for each of a plurality of evaluation parts such as the forehead and the eyebrows. Examples of the evaluation parts and feature amounts are described later. The feature amounts of the plurality of evaluation parts are combined to determine whether the facial paralysis is peripheral or central.
 When peripheral paralysis is determined, a notification recommending a visit to an otolaryngologist is issued. A list showing the addresses, telephone numbers, consultation hours, URLs (Uniform Resource Locators), and the like of nearby otolaryngologists may also be presented.
 When central paralysis is determined, a notification recommending an urgent visit to a neurology or neurosurgery department is issued. A list showing the addresses, telephone numbers, consultation hours, URLs, and the like of nearby neurology or neurosurgery departments may be presented. Pre-registered contacts such as family members, helpers, or the user's primary care physician may also be notified.
 Central paralysis accounts for only a small fraction of facial paralysis cases but requires highly urgent treatment. By promptly identifying central paralysis and recommending the appropriate clinical department, the patient's prognosis can be improved while waste of medical resources is avoided. Reducing the number of patients left with sequelae caused by a delayed start of treatment for central paralysis is also of great social significance.
 FIG. 2 is an explanatory diagram illustrating the configuration of the information processing device 20. The information processing device 20 includes a control unit 21, a main storage device 22, an auxiliary storage device 23, a communication unit 24, a display unit 25, an input unit 26, a speaker 27, an imaging unit 28, and a bus.
 The control unit 21 is an arithmetic control device that executes the program of the present embodiment. One or more CPUs (Central Processing Units), GPUs (Graphics Processing Units), multi-core CPUs, or the like are used as the control unit 21. The control unit 21 is connected via the bus to the hardware units constituting the information processing device 20.
 The main storage device 22 is a storage device such as an SRAM (Static Random Access Memory), a DRAM (Dynamic Random Access Memory), or a flash memory. The main storage device 22 temporarily stores information needed during processing performed by the control unit 21 and the program being executed by the control unit 21.
 The auxiliary storage device 23 is a storage device such as an SRAM, a flash memory, a hard disk, or a magnetic tape. The auxiliary storage device 23 stores a feature amount DB (database) 41, a determination model 46, programs to be executed by the control unit 21, and various data necessary for executing the programs. The feature amount DB 41 and the determination model 46 may instead be stored in an external large-capacity storage device connected to the auxiliary storage device 23.
 The communication unit 24 is an interface that performs communication between the information processing device 20 and a network. The display unit 25 is, for example, a liquid crystal display panel or an organic EL (electro-luminescence) panel. The input unit 26 is, for example, a keyboard, a mouse, and a microphone. The display unit 25 and the input unit 26 may be laminated to form a touch panel.
 The imaging unit 28 is, for example, a general camera. The imaging unit 28 may be an infrared camera, a stereoscopic camera composed of two cameras, or a 3D TOF (3 Dimension Time of Flight) sensor. When a stereoscopic camera or a 3D TOF sensor is used, the imaging unit 28 can capture a three-dimensional face image.
 The display unit 25, the input unit 26, the speaker 27, and the imaging unit 28 may be external devices. The imaging unit 28 is desirably arranged at a position where the user's face can be captured while the user is looking at the display unit 25.
 The information processing device 20 is, for example, a general-purpose information device used by the user, such as a smartphone, a personal computer, a tablet, or a smart speaker. The information processing device 20 may be dedicated hardware prepared for the determination method of the present embodiment, or may be incorporated in, for example, a pet-type companion robot or a nursing-care robot.
 The information processing device 20 may be configured as a combination of so-called IoT devices arranged near the user, such as the display unit 25, the input unit 26, the speaker 27, and the imaging unit 28, with a large computer connected via a network, a virtual machine running on a large computer, a cloud computing system, or the like.
 A plurality of pieces of hardware may cooperate to realize the functions of the information processing device 20. For example, the determination of the presence or absence of facial paralysis described using FIG. 1, the calculation of feature amounts for each part, and the determination of the type of paralysis may be performed by different hardware.
 An overview of the determination procedure of the present embodiment is described below. The control unit 21 displays the image captured via the imaging unit 28 on the display unit 25. The control unit 21 may display a guide indicating the appropriate face position and the like on the image to guide the user to an appropriate position and distance with respect to the imaging unit 28. If the imaging unit 28 has a sufficiently high resolution, the control unit 21 may cut out an appropriate range from the captured image and display it on the display unit 25.
 The control unit 21 outputs from the speaker 27 a voice instruction for the "rest" expression, for example, "Please open your eyes and close your mouth." The control unit 21 may also display on the display unit 25 an illustration, text, or the like explaining the appropriate expression. The user makes the instructed expression while checking his or her own face displayed on the display unit 25. The control unit 21 captures a face image, calculates the feature amount of each part from the captured image, and records it in the feature amount DB 41.
 The control unit 21 then outputs from the speaker 27 a voice instruction for the next expression, for example, "Please look up and wrinkle your forehead." The user makes the instructed expression. The control unit 21 captures a face image, calculates the feature amounts, and records them in the feature amount DB 41.
 The control unit 21 repeats the above processing to capture face images for the predetermined expressions and record the feature amounts. Between expressions, the control unit 21 may output an instruction to return to the initial "rest" expression, and may capture a face image and record the feature amounts each time the user returns to the "rest" expression.
 Table 1 outlines the method of distinguishing between central paralysis and peripheral paralysis.
 Table 1
   Area: bilaterally innervated area / unilaterally innervated area
   Healthy: no paralysis / no paralysis
   Central paralysis: no paralysis / paralysis
   Peripheral paralysis: paralysis / paralysis
   (Paralysis in the bilaterally innervated area without paralysis in the unilaterally innervated area: N/A)
 The bilaterally innervated area is the area reached by nerves from both the left and right cerebral cortices, roughly the area above the eyes. The unilaterally innervated area is the area reached by nerves from the contralateral cerebral cortex only, roughly the area below the eyes.
 In a healthy person, paralysis occurs in neither the bilaterally nor the unilaterally innervated area. In a patient with central paralysis, paralysis occurs in the unilaterally innervated area but not in the bilaterally innervated area. In a patient with peripheral paralysis, paralysis occurs in both areas. Since paralysis in the bilaterally innervated area without paralysis in the unilaterally innervated area does not occur clinically, this combination is shown as "N/A (Not Applicable)" in Table 1.
 Table 2 shows the parts belonging to the bilaterally and unilaterally innervated areas and an overview of the feature amounts of each part. Table 2 shows examples of feature amounts that can be calculated from images of the four expressions described using FIG. 1. Each feature amount is described later.

 Table 2
   Bilaterally innervated area:
     Forehead - number of wrinkles, depth of wrinkles
     Eyebrows - angle, curvature (left and right)
     Between eyebrows and eyes - area (left and right)
   Unilaterally innervated area:
     Nasolabial fold - length, depth (left and right)
     Corners of the mouth - angle
     Mouth - area
     Contour - curvature (left and right)
 For the bilaterally innervated area, the control unit 21 determines that there is no paralysis when the change in a feature amount between different expressions is larger than a predetermined threshold. The control unit 21 may score the change in the feature amount of each part of the bilaterally innervated area according to a predetermined procedure and determine the presence or absence of paralysis in that area on the basis of, for example, the total score.
 For the unilaterally innervated area, the control unit 21 determines the presence or absence of paralysis on the basis of the combination of the left-right difference in the feature amounts and the change in the feature amounts between different expressions. For example, when the left-right difference does not increase even when the expression changes, the control unit 21 determines that there is no paralysis. The control unit 21 may score the left-right difference and the change in the feature amount for each part of the unilaterally innervated area according to a predetermined procedure and determine the presence or absence of paralysis in that area on the basis of, for example, the total score.
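 The scoring procedure is not fixed in detail above. The following is a minimal sketch in Python of one possible implementation, assuming that the per-part change amounts and left-right differences have already been normalized; the threshold values are hypothetical examples, not values taken from this description.

    def bilateral_paralysis(change_by_part, change_threshold=0.2):
        # change_by_part: normalized change of each bilateral-area feature amount
        # between two expressions (e.g. "rest" vs. "wrinkling the forehead").
        # Paralysis is suspected when most parts barely move.
        moving_parts = sum(1 for change in change_by_part.values() if change > change_threshold)
        return moving_parts < len(change_by_part) / 2

    def unilateral_paralysis(lr_diff_rest, lr_diff_moved, diff_threshold=0.15):
        # Paralysis is suspected when the left-right difference of a feature amount
        # grows as the expression changes, as described above.
        return (lr_diff_moved - lr_diff_rest) > diff_threshold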
 FIG. 3 is an explanatory diagram illustrating the feature points 49 of a face image. Each feature point 49 is indicated by a black circle in FIG. 3. Feature points 49 that are specifically referred to in the following description are given alphabetic suffixes, such as feature point 49a and feature point 49b. Extraction of the feature points 49 can be realized by a known method such as OpenPose, so a detailed description is omitted.
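 The landmark extractor is left open above (OpenPose is given only as an example of a known method). As an illustration, the following sketch obtains pixel coordinates of facial landmarks with MediaPipe FaceMesh, a different publicly available library; the choice of library and the function name are assumptions, not part of this description.

    import cv2
    import mediapipe as mp

    def extract_feature_points(image_bgr):
        # Returns a list of (x, y) pixel coordinates of facial landmarks,
        # playing the role of the feature points 49.
        h, w = image_bgr.shape[:2]
        with mp.solutions.face_mesh.FaceMesh(static_image_mode=True, max_num_faces=1) as mesh:
            result = mesh.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
        if not result.multi_face_landmarks:
            return []
        return [(lm.x * w, lm.y * h) for lm in result.multi_face_landmarks[0].landmark]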
 In the following description, FIG. 3 is assumed to be in a so-called mirror-image format in which the user's right-hand side is shown on the right side of FIG. 3 and the user's left-hand side on the left side. Accordingly, the right side of FIG. 3 corresponds to the user's right eye.
 An overview of the method of calculating the feature amounts is described using FIG. 3. The control unit 21 calculates the scale of the face image on the basis of the feature points 49 described using FIG. 3 and the size of the user's face registered in advance. The control unit 21 may instead calculate the scale on the basis of an average face size, which may be defined, for example, for each gender or age. In the following description, the coordinates of each feature point 49 are assumed to correspond to the actual size of the user's face.
 The control unit 21 performs edge detection on the forehead portion of the face image. Since edge detection is a commonly used image processing technique, a detailed description is omitted. The number of horizontal line-shaped edges detected is the number of wrinkles on the forehead.
 The control unit 21 calculates the depth of each wrinkle on the basis of the difference in brightness between the edge portion and its surroundings. The relationship between the brightness difference and the wrinkle depth is measured in advance and incorporated into the program of the present embodiment. When the imaging unit 28 has a function of capturing a three-dimensional shape, the control unit 21 can read the wrinkle depth directly from the face image.
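 As one concrete way of counting near-horizontal edges in the forehead region, the following sketch uses OpenCV edge detection and a probabilistic Hough transform; the Canny and Hough parameters are hypothetical and would need tuning, and segments belonging to the same wrinkle are not merged here.

    import cv2
    import numpy as np

    def count_forehead_wrinkle_segments(forehead_gray):
        # Detect edges and keep only nearly horizontal line segments as wrinkle candidates.
        edges = cv2.Canny(forehead_gray, 50, 150)
        lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=30,
                                minLineLength=forehead_gray.shape[1] // 4, maxLineGap=10)
        if lines is None:
            return 0
        count = 0
        for x1, y1, x2, y2 in lines[:, 0]:
            angle = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1)))
            if angle < 15 or angle > 165:   # nearly horizontal
                count += 1
        return count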
 The eyebrow angle is the angle formed by each eyebrow and the midline of the face. The midline of the face is the straight line passing along the bridge of the nose. Specifically, the control unit 21 calculates a straight line L1 by linearly approximating the plurality of feature points 49a indicating the bridge of the nose, calculates a straight line L2 connecting the feature points 49b and 49c indicating the two ends of the user's right eyebrow, and calculates the angle θa formed by L1 and L2. θa is the angle of the right eyebrow.
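 The angle θa can be computed directly from the two point sets. The following numpy sketch fits L1 to the nose-bridge points by least squares and measures the angle between L1 and the line through the two eyebrow end points; the function names are illustrative.

    import numpy as np

    def line_direction(points):
        # Unit direction vector of the least-squares line through (x, y) points.
        pts = np.asarray(points, dtype=float)
        centered = pts - pts.mean(axis=0)
        _, _, vt = np.linalg.svd(centered)      # first right singular vector = main direction
        return vt[0] / np.linalg.norm(vt[0])

    def eyebrow_angle(nose_bridge_points, eyebrow_end_points):
        # Angle in degrees between L1 (nose bridge) and L2 (eyebrow ends).
        d1 = line_direction(nose_bridge_points)
        d2 = line_direction(eyebrow_end_points)
        cos = abs(float(np.dot(d1, d2)))        # orientation-independent
        return float(np.degrees(np.arccos(np.clip(cos, 0.0, 1.0))))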
 The control unit 21 calculates an arc approximating the arrangement of all the feature points 49 of the right eyebrow. The reciprocal of the radius of the calculated arc is the curvature of the right eyebrow. The control unit 21 calculates the angle and curvature of the left eyebrow in the same manner.
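 The description does not specify how the approximating arc is obtained. One common choice is an algebraic least-squares circle fit, sketched below; the curvature is then the reciprocal of the fitted radius.

    import numpy as np

    def curvature_from_points(points):
        # Fit a circle x^2 + y^2 + a*x + b*y + c = 0 by linear least squares ("Kasa" fit).
        pts = np.asarray(points, dtype=float)
        x, y = pts[:, 0], pts[:, 1]
        A = np.column_stack([x, y, np.ones_like(x)])
        a, b, c = np.linalg.lstsq(A, -(x ** 2 + y ** 2), rcond=None)[0]
        cx, cy = -a / 2.0, -b / 2.0
        radius = np.sqrt(cx ** 2 + cy ** 2 - c)
        return 1.0 / radius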
 The control unit 21 calculates the area of the region enclosed by the line connecting the feature points 49 of the user's left eyebrow and the line connecting the feature points 49 of the upper edge of the left eye. The ends of the two curves are joined, for example, by straight lines. The region whose area is calculated is indicated in FIG. 3 by hatching sloping down to the left. The calculated area is the area between the left eyebrow and the left eye. The control unit 21 calculates the area between the right eyebrow and the right eye in the same manner.
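 Once the eyebrow points and eyelid points are joined into a closed polygon, the enclosed area can be computed with the shoelace formula, as in the following sketch.

    import numpy as np

    def area_between(eyebrow_points, upper_eyelid_points):
        # Eyebrow points in order, then eyelid points in reverse order, closing the polygon.
        polygon = np.vstack([np.asarray(eyebrow_points, dtype=float),
                             np.asarray(upper_eyelid_points, dtype=float)[::-1]])
        x, y = polygon[:, 0], polygon[:, 1]
        return 0.5 * abs(float(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1))))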
 The control unit 21 performs edge detection on the portion of the face image from the left side of the nose to the vicinity of the left corner of the mouth. The length of the clearest edge detected is the length of the left nasolabial fold. The control unit 21 also calculates the depth of the left nasolabial fold; the depth calculation is the same as for the forehead wrinkles, so its description is omitted. The control unit 21 calculates the length and depth of the right nasolabial fold in the same manner.
 The control unit 21 calculates a straight line L3 connecting the feature points 49d and 49e indicating the left and right corners of the user's mouth, and a straight line L4 connecting a representative point of the left eye and a representative point of the right eye. The representative point is, for example, the centroid of each eye, or may be the inner or outer corner of each eye. The control unit 21 calculates the angle formed by L3 and L4; this angle is the mouth-corner angle.
 The control unit 21 calculates the area of the region enclosed by the curve or polygonal line connecting the feature points 49 of the lower edge of the upper lip and the feature points 49 of the upper edge of the lower lip. The region whose area is calculated is indicated in FIG. 3 by hatching sloping down to the right. The calculated area is the area of the mouth. When the user's mouth is firmly closed, the mouth area is zero.
 The control unit 21 calculates an arc approximating the arrangement of all the feature points 49 of the right contour of the face. The reciprocal of the radius of the calculated arc is the curvature of the right contour. The control unit 21 calculates the curvature of the left contour in the same manner.
 Through the above procedure, the control unit 21 can calculate each of the feature amounts shown in Table 2. The control unit 21 calculates the feature amounts for each of the face images captured with the plurality of expressions described using FIG. 1.
 FIG. 4 is an explanatory diagram illustrating the record layout of the feature amount DB 41. The feature amount DB 41 is a database that records the feature amounts of each part calculated from each face image. The feature amount DB 41 has a date field and a feature amount field. The feature amount field has expression fields corresponding to the respective expressions described using FIG. 1, such as a rest field and a wrinkling-the-forehead field.
 Each expression field, such as the rest field, has part fields indicating parts of the face, such as a forehead field, an eyebrow field, a between-eyebrows-and-eyes field, a nasolabial fold field, a mouth-corner field, a mouth field, and a contour field.
 The forehead field has a number-of-wrinkles field and a wrinkle-depth field. The eyebrow field has a left field and a right field, each of which has an angle field and a curvature field. The between-eyebrows-and-eyes field has a left area field and a right area field.
 The nasolabial fold field has a right field and a left field, each of which has a depth field and a length field. The mouth-corner field has an angle field. The mouth field has an area field. The contour field has a right field and a left field.
 The date field records the date on which the type of paralysis was determined. The number-of-wrinkles field records the number of wrinkles on the forehead. The wrinkle-depth field records a representative value of the wrinkle depth; the depth of each individual wrinkle may be recorded instead.
 The angle field in the eyebrow field records the angle formed by each eyebrow and the bridge of the nose, and the curvature field records the curvature of each eyebrow. In the between-eyebrows-and-eyes field, the right area field records the area between the right eyebrow and the upper edge of the right eye, and the left area field records the area between the left eyebrow and the upper edge of the left eye.
 The depth field in the nasolabial fold field records the depth of the left and right nasolabial folds, and the length field records their lengths. The mouth-corner field records the angle formed by the line connecting the left and right corners of the mouth and the line connecting the centers of the left and right eyes.
 The mouth field records the area of the region enclosed by the upper edge of the lower lip and the lower edge of the upper lip. The left and right fields in the contour field record the curvature of the left and right contour lines. The feature amount DB 41 may also have fields for recording the face images corresponding to the respective expressions.
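 For illustration, one record of the feature amount DB 41 can be pictured as the following nested structure; the key names and numerical values are hypothetical, and only the logical layout follows the description above.

    record = {
        "date": "2022-09-26",
        "rest": {
            "forehead": {"wrinkle_count": 3, "wrinkle_depth": 0.4},
            "eyebrow": {"left": {"angle": 78.0, "curvature": 0.012},
                        "right": {"angle": 80.5, "curvature": 0.011}},
            "between_eyebrow_and_eye": {"left_area": 310.0, "right_area": 305.0},
            "nasolabial_fold": {"left": {"depth": 0.3, "length": 42.0},
                                "right": {"depth": 0.1, "length": 25.0}},
            "mouth_corner": {"angle": 4.5},
            "mouth": {"area": 0.0},
            "contour": {"left": 0.009, "right": 0.010},
        },
        # one such entry per instructed expression ("wrinkling the forehead", etc.)
    }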
 FIG. 5 is an explanatory diagram illustrating the determination model 46. The determination model 46 receives the series of feature amounts recorded in one record of FIG. 4 and, by the process described using Table 1 and Table 2, outputs the probability of peripheral paralysis and the probability of central paralysis.
 The determination model 46 is, for example, a logic-based program that outputs a determination result in accordance with a known determination method such as the Yanagihara method. The user's attending physician or the like may be allowed to adjust the determination parameters of the model as appropriate according to the user's medical history and symptoms. The determination model 46 may instead be a model generated by machine learning using training data recording many combinations of feature amounts and determination results.
 When the information processing device 20 is dedicated hardware, the determination model 46 may be implemented in hardware using an FPGA (Field Programmable Gate Array), an ASIC (Application Specific Integrated Circuit), or the like.
 The determination model 46 may output only one of the probability of peripheral paralysis and the probability of central paralysis, or may output a determination result indicating simply whether the paralysis is peripheral or central.
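 A logic-based determination model following Table 1 can be sketched as below, assuming that the presence or absence of paralysis in each innervated area has already been decided from the feature amounts; the probability values returned are illustrative placeholders, not calibrated outputs.

    def determine_type(bilateral_paralysis, unilateral_paralysis):
        # Table 1: paralysis only in the unilaterally innervated area suggests central
        # paralysis; paralysis in both areas suggests peripheral paralysis.
        if not unilateral_paralysis and not bilateral_paralysis:
            return {"central": 0.0, "peripheral": 0.0}   # no facial paralysis
        if unilateral_paralysis and not bilateral_paralysis:
            return {"central": 0.9, "peripheral": 0.1}
        if unilateral_paralysis and bilateral_paralysis:
            return {"central": 0.1, "peripheral": 0.9}
        return {"central": 0.5, "peripheral": 0.5}       # clinically not expected (N/A)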
 FIG. 6 is a flowchart illustrating the flow of processing of the program. The control unit 21 outputs from the speaker 27 a voice instructing an expression, for example, "Please open your eyes and close your mouth," to instruct the user on the next expression to be captured (step S501). Through step S501, the control unit 21 realizes the function of an expression instruction unit.
 The control unit 21 controls the imaging unit 28 to capture a face image and stores it in the auxiliary storage device 23 (step S502). Through step S502, the control unit 21 realizes the function of an image acquisition unit that acquires the user's face image. The control unit 21 determines whether capture of the predetermined expressions has been completed (step S503). If not (NO in step S503), the control unit 21 returns to step S501.
 If capture has been completed (YES in step S503), the control unit 21 determines whether there is facial paralysis on the basis of the series of captured face images (step S504). As described above, a method for determining the presence or absence of facial paralysis from face images is known, for example, from Patent Document 1, so a detailed description is omitted. Through step S504, the control unit 21 realizes the function of a paralysis determination unit.
 If it is determined that there is no facial paralysis (NO in step S504), the control unit 21 outputs a message such as "There is no facial paralysis" from the speaker 27 or the display unit 25 (step S505) and ends the processing.
 If it is determined that there is facial paralysis (YES in step S504), the control unit 21 selects one of the face images captured in step S502 (step S511) and extracts from it the feature points 49 described using FIG. 3 (step S512).
 The control unit 21 selects one of the parts described using Table 2 (step S513) and starts the feature amount calculation subroutine (step S514). The feature amount calculation subroutine calculates the feature amounts of the part selected in step S513; the processing flow of the subroutine for each part is described later. By executing the feature amount calculation subroutine, the control unit 21 realizes the function of a feature amount acquisition unit.
 The control unit 21 records the calculated feature amounts in the feature amount DB 41 (step S515), thereby realizing the function of a feature amount recording unit. The control unit 21 determines whether the feature amount calculation has been completed for the predetermined parts (step S516). If not (NO in step S516), the control unit 21 returns to step S513.
 If completed (YES in step S516), the control unit 21 determines whether all the face images captured in step S502 have been processed (step S517). If not (NO in step S517), the control unit 21 returns to step S511.
 If all images have been processed (YES in step S517), the control unit 21 uses the determination model 46 described using FIG. 5 to obtain the probability that the user's facial paralysis is central paralysis and the probability that it is peripheral paralysis (step S518).
 The control unit 21 determines whether the probability of central paralysis is high (step S519). In step S519, central paralysis is determined when the probability of central paralysis exceeds a predetermined determination threshold. Through step S519, the control unit 21 realizes the function of a type determination unit that determines whether the user has peripheral paralysis or central paralysis.
 A specific example is described for the case where the determination threshold in step S519 is set to 20 percent. When the determination model 46 outputs a 30 percent probability of central paralysis and a 70 percent probability of peripheral paralysis, the control unit 21 determines central paralysis in step S519 (YES in step S519). By setting the determination threshold appropriately, the risk of overlooking central paralysis can be reduced.
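 The asymmetric decision in step S519 can be written directly as below; the 20 percent value is the example threshold given above.

    def is_central(prob_central, threshold=0.20):
        # Central paralysis is flagged whenever its probability exceeds the threshold,
        # even if peripheral paralysis is more likely overall (e.g. 0.30 vs. 0.70),
        # so that the urgent case is not overlooked.
        return prob_central > threshold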
 If central paralysis is determined (YES in step S519), the control unit 21 outputs, by voice or text, a message that concisely indicates the determination result, for example, "A paralysis symptom suggestive of stroke is suspected; a visit to neurosurgery is recommended" (step S520).
 The control unit 21 may output, by voice or text, a message indicating both the grounds for the determination and the result, for example, "Paralysis of the nasolabial fold was found, and central facial paralysis, a symptom of stroke, is suspected with a probability of 70 percent." The control unit 21 may also display the face image with an indicator superimposed on the location determined to be paralyzed. By showing the reason for the determination, the user can appreciate the need for a prompt medical visit.
 The control unit 21 may display a list of nearby neurology or neurosurgery departments on the display unit 25, or may display a search button for searching for such departments.
 The control unit 21 may automatically notify pre-registered contacts such as family members, helpers, or the user's primary care physician that the user has developed central paralysis, or may ask the user whether such a notification is required. When the information processing device 20 is a device with a call function, such as a smartphone, the control unit 21 may display an emergency call button for making an emergency call, for example to 119.
 If central paralysis is not determined (NO in step S519), the control unit 21 outputs, by voice or text, a message that concisely indicates the determination result, for example, "A facial nerve paralysis symptom different from stroke is suspected; a visit to an otolaryngologist is recommended" (step S521). After step S520 or step S521, the control unit 21 ends the processing. Through steps S520 and S521, the control unit 21 realizes the function of a type output unit that outputs whether the user has peripheral paralysis or central paralysis.
 FIG. 7 is a flowchart illustrating the flow of processing of the subroutine for calculating feature amounts relating to the forehead. The feature amount calculation subroutine calculates the feature amounts of the part selected in step S513. When "forehead" is selected in step S513, the control unit 21 starts the subroutine described using FIG. 7 in step S514.
 The control unit 21 performs edge detection on the forehead portion of the face image (step S531). As described above, the edges detected as horizontal lines are the forehead wrinkles. The control unit 21 calculates the number of wrinkles from the edge detection result (step S532) and calculates the wrinkle depth, for example on the basis of the brightness difference between the edge portion and its surroundings (step S533). The control unit 21 then ends the processing.
 In step S533, the control unit 21 calculates, for example, the depth of every detected forehead wrinkle, and may further calculate a representative value such as the average or maximum depth. Alternatively, the control unit 21 may calculate the depth of a representative wrinkle among the detected wrinkles, for example the longest wrinkle or the centrally located wrinkle.
 FIG. 8 is a flowchart illustrating the flow of processing of the subroutine for calculating feature amounts relating to the eyebrows. When "eyebrows" is selected in step S513, the control unit 21 starts the subroutine described using FIG. 8 in step S514.
 The control unit 21 calculates the midline by linearly approximating the plurality of feature points 49 indicating the bridge of the nose (step S541), calculates the straight line connecting the feature points 49 indicating the two ends of one eyebrow (step S542), and calculates the angle formed by the straight line calculated in step S541 and the straight line calculated in step S542 (step S543).
 The control unit 21 calculates an arc approximating the arrangement of the plurality of feature points 49 indicating one eyebrow (step S544). Step S544 yields the coordinates of the center and the radius of the arc approximating the eyebrow. The control unit 21 calculates the curvature of the eyebrow, which is the reciprocal of the radius (step S545), and ends the processing.
 FIG. 9 is a flowchart illustrating the flow of processing of the subroutine for calculating feature amounts relating to the region between the eyebrows and the eyes. When "between eyebrows and eyes" is selected in step S513, the control unit 21 starts the subroutine described using FIG. 9 in step S514.
 The control unit 21 acquires the coordinates of the plurality of feature points 49 indicating one eyebrow (step S551) and the coordinates of the plurality of feature points 49 indicating the upper edge of the eye on the same side (step S552). The control unit 21 calculates the area of the region enclosed by the straight lines or curves connecting the feature points 49 acquired in steps S551 and S552 (step S553), and ends the processing.
 FIG. 10 is a flowchart illustrating the flow of processing of the subroutine for calculating feature amounts relating to the nasolabial fold. When "nasolabial fold" is selected in step S513, the control unit 21 starts the subroutine described using FIG. 10 in step S514.
 The control unit 21 performs edge detection on the portion of the face image from one side of the nose to the vicinity of the corner of the mouth on the same side (step S561). The control unit 21 selects, from the detected edges, the edge corresponding to the nasolabial fold (step S562). Specifically, among the edges that move farther from the midline toward the lower part of the face, the control unit 21 selects the clearest edge or the longest edge.
 The control unit 21 calculates the length of the nasolabial fold (step S563) and calculates its depth, for example on the basis of the brightness difference between the edge portion and its surroundings (step S564). The control unit 21 then ends the processing.
 FIG. 11 is a flowchart illustrating the flow of processing of the subroutine for calculating feature amounts relating to the corners of the mouth. When "corners of the mouth" is selected in step S513, the control unit 21 starts the subroutine described using FIG. 11 in step S514.
 The control unit 21 calculates the straight line connecting the feature points 49 indicating the left and right corners of the mouth (step S571). The control unit 21 calculates a representative point for each of the left and right eyes, for example the centroid of the plurality of feature points 49 corresponding to the edge of the eye; the feature point 49 indicating the inner or outer corner of the eye may be used instead. The control unit 21 calculates the straight line connecting the representative points of the left and right eyes (step S572).
 The control unit 21 calculates the angle formed by the straight line calculated in step S571 and the straight line calculated in step S572 (step S573), and ends the processing.
 FIG. 12 is a flowchart illustrating the flow of processing of the subroutine for calculating feature amounts relating to the mouth. When "mouth" is selected in step S513, the control unit 21 starts the subroutine described using FIG. 12 in step S514.
 The control unit 21 acquires the coordinates of the plurality of feature points 49 corresponding to the lower edge of the upper lip (step S581) and the coordinates of the plurality of feature points 49 corresponding to the upper edge of the lower lip (step S582). The control unit 21 calculates the area of the region enclosed by the straight lines or curves connecting the feature points 49 acquired in steps S581 and S582 (step S583), and ends the processing.
 FIG. 13 is a flowchart illustrating the flow of processing of the subroutine for calculating feature amounts relating to the contour. When "contour" is selected in step S513, the control unit 21 starts the subroutine described using FIG. 13 in step S514.
 The control unit 21 calculates an arc approximating the arrangement of the plurality of feature points 49 indicating the left or right contour (step S591). Step S591 yields the coordinates of the center and the radius of the arc approximating the contour. The control unit 21 calculates the curvature of the contour, which is the reciprocal of the radius (step S592), and ends the processing.
 FIG. 14 is an explanatory diagram illustrating a screen example. FIG. 14 shows an example of the screen on which the control unit 21 notifies the user of symptoms of central paralysis in step S520 of the flowchart described using FIG. 6.
 The screen shown in FIG. 14 displays a face image field 71, a result field 72, a contact-medical-institution button 73, and a contact-family button 74. The result field 72 displays text indicating the location determined to be paralyzed and the probability of central paralysis.
 The face image field 71 displays the user's face image, with an indicator 76 showing the location determined to be paralyzed.
 When the user selects the contact-medical-institution button 73, the control unit 21 displays pre-registered contact information for medical institutions such as neurosurgery departments. The control unit 21 may instead input the user's address and a keyword such as "neurosurgery" into a known search engine and display the search results. When the information processing device 20 is a device with a call function, such as a smartphone, the control unit 21 may initiate a call to a pre-registered medical institution.
 When the user selects the contact-family button 74, the control unit 21 sends pre-registered family members a notification that the user has developed central paralysis. The notification can use, for example, e-mail, SMS (Short Message Service), or a message to an SNS (Social Network Service). When the information processing device 20 is a device with a call function, such as a smartphone, the control unit 21 may initiate a call to a pre-registered family telephone number.
 According to the present embodiment, it is possible to provide an information processing device 20 that determines whether the user's facial paralysis is peripheral or central, and that recommends a visit to the appropriate clinical department according to the type of paralysis. By enabling the user to promptly visit the appropriate department, the information processing device 20 can contribute to a significant improvement in the user's prognosis.
 According to the present embodiment, it is possible to provide an information processing device 20 that determines the type of facial paralysis from the user's face images alone. Because the operation is simple, even elderly users, for example, can use the information processing device 20 easily.
 According to the present embodiment, when the user has not developed facial paralysis, the information processing device 20 notifies the user to that effect. A patient who is anxious about developing facial paralysis can use the information processing device of the present embodiment as needed to confirm that it has not developed and be reassured.
 The information processing device 20 of the present embodiment may also be used, for example, by emergency personnel in charge of patient transport. By determining the type of paralysis that the patient being transported has developed, the transport destination can be selected appropriately.
 The control unit 21 may acquire each of the feature amounts described using Table 2 by using a model generated so as to output feature amounts when a face image is input. Such a model is generated, for example, by machine learning using training data recording combinations of face images and feature amounts. The feature amount model and the determination model 46 may be configured as a single model that outputs the type of paralysis when a face image is input.
[Modification]
 This modification relates to a screen display for users undergoing rehabilitation after developing facial paralysis. Descriptions of parts common to Embodiment 1 are omitted.
 Whether the paralysis is central or peripheral, long-term rehabilitation may be required in the recovery period after acute-phase treatment is completed. In this modification, the control unit 21 displays the effect of rehabilitation on the basis of the feature amounts recorded in time series in the feature amount DB 41.
 FIG. 15 is an explanatory diagram illustrating a screen example of the modification. The screen shown in FIG. 15 displays a face image field 71 and a result field 72. The control unit 21 compares the feature amounts recorded in the past with the latest feature amounts for the nasolabial fold portion where the facial paralysis developed, and displays the change in the user's condition between the past and the present in the result field 72 on the basis of the difference in the feature amounts.
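 As a minimal sketch of this comparison, assuming records shaped like the hypothetical dictionary shown earlier, the left-right depth difference of the nasolabial fold can be compared between a past record and the latest record; a shrinking difference suggests recovery.

    def rehabilitation_progress(past_record, latest_record, expression="rest"):
        def lr_depth_gap(record):
            fold = record[expression]["nasolabial_fold"]
            return abs(fold["left"]["depth"] - fold["right"]["depth"])
        # Positive value = the asymmetry has decreased since the past record.
        return lr_depth_gap(past_record) - lr_depth_gap(latest_record)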
 According to this modification, by objectively evaluating the effect of rehabilitation and displaying it in the result field 72, it is possible to provide an information processing device 20 that increases the user's motivation for rehabilitation.
[Embodiment 2]
 The present embodiment relates to an information processing device 20 that determines the type of paralysis using, for each facial part, the face images in which the features of that part are most likely to appear. Descriptions of parts common to Embodiment 1 are omitted.
 Table 3 shows, for each facial part, the facial expressions that are readily affected by the type of paralysis.
[Table 3]
 For example, for the nasolabial fold, the corners of the mouth, the mouth, and the facial contour, the two expressions "at rest" and "eyes strongly closed" are readily affected by whether the paralysis is central or peripheral. Therefore, even if the feature amounts for these parts are not calculated for the "wrinkling the forehead" and "eyes lightly closed" expressions, the determination result is not significantly affected.
 An outline of the processing performed by the control unit 21 in the present embodiment will be described. In step S513 of the program described with reference to FIG. 6, the control unit 21 selects only the parts for which the expression of the face image selected in step S511 is suitable. For example, when a face image captured with the "at rest" expression is selected in step S511, the control unit 21 sequentially selects the forehead, the nasolabial fold, the corners of the mouth, the mouth, and the facial contour in step S513.
 Similarly, when a face image captured with the "wrinkling the forehead" expression is selected in step S511, the control unit 21 sequentially selects the eyebrows and the region between the eyebrows and the eyes in step S513.
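 The part selection described above can be pictured with the following minimal sketch; the expression-to-part mapping only reproduces the two examples given in the text, and the function and part names are illustrative.

```python
# Minimal sketch (assumption: the expression-to-parts mapping below only
# reproduces the two examples given in the text; the remaining entries of
# Table 3 would be filled in the same way).
RELEVANT_PARTS = {
    "at rest": ["forehead", "nasolabial_fold", "mouth_corner", "mouth", "contour"],
    "wrinkling the forehead": ["eyebrow", "between_eyebrow_and_eye"],
}

def compute_features_for_image(expression, feature_functions):
    """Compute feature amounts only for the parts relevant to this expression."""
    features = {}
    for part in RELEVANT_PARTS.get(expression, []):
        if part in feature_functions:
            features[part] = feature_functions[part]()
    return features

# Usage: dummy feature functions standing in for the real measurements.
feature_functions = {
    "forehead": lambda: 0.8,
    "nasolabial_fold": lambda: 0.6,
    "mouth_corner": lambda: 0.7,
    "mouth": lambda: 0.9,
    "contour": lambda: 0.5,
    "eyebrow": lambda: 0.4,
    "between_eyebrow_and_eye": lambda: 0.3,
}
print(compute_features_for_image("at rest", feature_functions))
```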
 According to the present embodiment, by not calculating feature amounts that have little effect on the determination of whether the paralysis is central or peripheral, it is possible to provide an information processing device 20 that makes an appropriate determination with a small amount of computation.
[Embodiment 3]
 The present embodiment relates to a form in which the information processing device 20 of the present embodiment is realized by operating a general-purpose computer 90 in combination with a program 97. Descriptions of parts common to Embodiment 1 are omitted.
 FIG. 16 is an explanatory diagram illustrating the configuration of the information processing device 20 of Embodiment 3. The computer 90 includes a control unit 21, a main storage device 22, an auxiliary storage device 23, a communication unit 24, a display unit 25, an input unit 26, a speaker 27, an imaging unit 28, a reading unit 29, and a bus. The computer 90 is an information device such as a general-purpose personal computer, a tablet, a smartphone, or a server computer.
 The program 97 is recorded on a portable recording medium 96. The control unit 21 reads the program 97 via the reading unit 29 and stores it in the auxiliary storage device 23. The control unit 21 may also read a program 97 stored in a semiconductor memory 98, such as a flash memory, mounted in the computer 90. Furthermore, the control unit 21 may download the program 97 from another server computer (not shown) connected via the communication unit 24 and a network (not shown) and store it in the auxiliary storage device 23.
 The program 97 is installed as a control program of the computer 90, loaded into the main storage device 22, and executed. In this way, the computer 90 functions as the information processing device 20 described above. The program 97 is an example of a program product.
[Embodiment 4]
 FIG. 17 is a functional block diagram of the information processing device 20 of Embodiment 4. The information processing device 20 includes an image acquisition unit 81, a feature amount acquisition unit 82, and a type determination unit 83.
 The image acquisition unit 81 acquires a face image of the user. The feature amount acquisition unit 82 acquires a feature amount of the face image based on the face image. The type determination unit 83 determines whether the user has peripheral paralysis or central paralysis based on the feature amount.
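 A minimal sketch of how the three units could be chained is shown below; the callable interfaces, the dictionary of feature amounts, and the simple threshold rule standing in for the determination model are illustrative assumptions, not the publication's own implementation.

```python
# Minimal sketch (assumptions: the three units are modeled as plain callables,
# the feature amount is a dict of named values, and the simple threshold rule
# stands in for the actual determination model, which is not specified here).
from typing import Callable, Dict

FaceImage = bytes  # stand-in type for an acquired face image

def make_pipeline(
    acquire_image: Callable[[], FaceImage],
    extract_features: Callable[[FaceImage], Dict[str, float]],
    determine_type: Callable[[Dict[str, float]], str],
) -> Callable[[], str]:
    """Chain the image acquisition, feature acquisition, and type determination units."""
    def run() -> str:
        image = acquire_image()
        features = extract_features(image)
        return determine_type(features)
    return run

# Usage with placeholder implementations of the three units.
pipeline = make_pipeline(
    acquire_image=lambda: b"raw-face-image",
    extract_features=lambda image: {"forehead_wrinkle_asymmetry": 0.1,
                                    "mouth_corner_droop": 0.7},
    determine_type=lambda f: "central" if f["forehead_wrinkle_asymmetry"] < 0.3
                                          and f["mouth_corner_droop"] > 0.5
                             else "peripheral",
)
print(pipeline())  # "central" with the placeholder values above
```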
 The technical features (constituent elements) described in the respective embodiments can be combined with one another, and new technical features can be formed by combining them.
 The embodiments disclosed herein are illustrative in all respects and should not be considered restrictive. The scope of the present invention is indicated not by the foregoing description but by the claims, and is intended to include all modifications within the meaning and scope equivalent to the claims.
(Appendix 1)
 An information processing device comprising:
 an image acquisition unit that acquires a face image of a user;
 a feature amount acquisition unit that acquires a feature amount of the face image based on the face image; and
 a type determination unit that determines whether the user has peripheral paralysis or central paralysis based on the feature amount.
(Appendix 2)
 The information processing device according to Appendix 1, further comprising
 a paralysis determination unit that determines the presence or absence of facial nerve paralysis based on the face image,
 wherein the feature amount acquisition unit acquires the feature amount when the paralysis determination unit determines that paralysis is present.
(Appendix 3)
 The information processing device according to Appendix 1 or 2, further comprising
 a facial expression instruction unit that instructs the user to make a facial expression,
 wherein the image acquisition unit acquires the face image of the user after the instruction is given by the facial expression instruction unit.
(Appendix 4)
 The information processing device according to Appendix 3,
 wherein the facial expression instruction unit instructs a facial expression of wrinkling the forehead, and
 the feature amount acquisition unit acquires the number and depth of wrinkles on the forehead.
(Appendix 5)
 The information processing device according to Appendix 3 or 4,
 wherein the facial expression instruction unit instructs a facial expression of wrinkling the forehead, a facial expression with the eyes lightly closed, and a facial expression with the eyes strongly closed, and
 the feature amount acquisition unit acquires the angle and curvature of the left and right eyebrows for each image.
(Appendix 6)
 The information processing device according to any one of Appendices 3 to 5,
 wherein the facial expression instruction unit instructs a facial expression of wrinkling the forehead, a facial expression with the eyes lightly closed, and a facial expression with the eyes strongly closed, and
 the feature amount acquisition unit acquires the area between the left eyebrow and the left eye and the area between the right eyebrow and the right eye for each image.
(Appendix 7)
 The information processing device according to any one of Appendices 1 to 6, further comprising
 a type output unit that outputs whether the user has peripheral paralysis or central paralysis.
(Appendix 8)
 The information processing device according to any one of Appendices 1 to 7, which
 outputs information on otolaryngology when the type determination unit determines peripheral paralysis, and
 outputs information on neurology or neurosurgery when the type determination unit determines central paralysis.
(Appendix 9)
 The information processing device according to any one of Appendices 1 to 8, which
 notifies a predetermined contact when the type determination unit determines central paralysis.
(Appendix 10)
 The information processing device according to any one of Appendices 1 to 9, further comprising
 a feature amount recording unit that records the feature amount in time series,
 wherein the type determination unit determines whether the user has peripheral paralysis or central paralysis based on a time-series change in the feature amount.
(Appendix 11)
 The information processing device according to any one of Appendices 1 to 9, further comprising
 a feature amount recording unit that records the feature amount in time series,
 wherein a state change of the user is output based on a time-series change in the feature amount.
(Appendix 12)
 The information processing device according to any one of Appendices 1 to 11,
 wherein the image acquisition unit acquires the face image of the user as a three-dimensional image.
(Appendix 13)
 An information processing method in which a computer executes processing of:
 acquiring a face image of a user;
 acquiring a feature amount of the face image based on the face image; and
 determining whether the user has peripheral paralysis or central paralysis based on the feature amount.
(Appendix 14)
 A program that causes a computer to execute processing of:
 acquiring a face image of a user;
 acquiring a feature amount of the face image based on the face image; and
 determining whether the user has peripheral paralysis or central paralysis based on the feature amount.
 20 information processing device
 21 control unit
 22 main storage device
 23 auxiliary storage device
 24 communication unit
 25 display unit
 26 input unit
 27 speaker
 28 imaging unit
 29 reading unit
 41 feature amount DB
 46 determination model
 49 feature point
 71 face image field
 72 result field
 73 medical institution contact button
 74 family contact button
 76 index
 81 image acquisition unit
 82 feature amount acquisition unit
 83 type determination unit
 90 computer
 96 portable recording medium
 97 program
 98 semiconductor memory

Claims (14)

  1.  An information processing device comprising:
     an image acquisition unit that acquires a face image of a user;
     a feature amount acquisition unit that acquires a feature amount of the face image based on the face image; and
     a type determination unit that determines whether the user has peripheral paralysis or central paralysis based on the feature amount.
  2.  The information processing device according to claim 1, further comprising
     a paralysis determination unit that determines the presence or absence of facial nerve paralysis based on the face image,
     wherein the feature amount acquisition unit acquires the feature amount when the paralysis determination unit determines that paralysis is present.
  3.  The information processing device according to claim 1, further comprising
     a facial expression instruction unit that instructs the user to make a facial expression,
     wherein the image acquisition unit acquires the face image of the user after the instruction is given by the facial expression instruction unit.
  4.  The information processing device according to claim 3,
     wherein the facial expression instruction unit instructs a facial expression of wrinkling the forehead, and
     the feature amount acquisition unit acquires the number and depth of wrinkles on the forehead.
  5.  The information processing device according to claim 3,
     wherein the facial expression instruction unit instructs a facial expression of wrinkling the forehead, a facial expression with the eyes lightly closed, and a facial expression with the eyes strongly closed, and
     the feature amount acquisition unit acquires the angle and curvature of the left and right eyebrows for each image.
  6.  The information processing device according to claim 3,
     wherein the facial expression instruction unit instructs a facial expression of wrinkling the forehead, a facial expression with the eyes lightly closed, and a facial expression with the eyes strongly closed, and
     the feature amount acquisition unit acquires the area between the left eyebrow and the left eye and the area between the right eyebrow and the right eye for each image.
  7.  The information processing device according to claim 1, further comprising
     a type output unit that outputs whether the user has peripheral paralysis or central paralysis.
  8.  The information processing device according to claim 1, which
     outputs information on otolaryngology when the type determination unit determines peripheral paralysis, and
     outputs information on neurology or neurosurgery when the type determination unit determines central paralysis.
  9.  The information processing device according to claim 1, which
     notifies a predetermined contact when the type determination unit determines central paralysis.
  10.  The information processing device according to any one of claims 1 to 9, further comprising
     a feature amount recording unit that records the feature amount in time series,
     wherein the type determination unit determines whether the user has peripheral paralysis or central paralysis based on a time-series change in the feature amount.
  11.  The information processing device according to any one of claims 1 to 9, further comprising
     a feature amount recording unit that records the feature amount in time series,
     wherein a state change of the user is output based on a time-series change in the feature amount.
  12.  The information processing device according to claim 1,
     wherein the image acquisition unit acquires the face image of the user as a three-dimensional image.
  13.  An information processing method in which a computer executes processing of:
     acquiring a face image of a user;
     acquiring a feature amount of the face image based on the face image; and
     determining whether the user has peripheral paralysis or central paralysis based on the feature amount.
  14.  A program that causes a computer to execute processing of:
     acquiring a face image of a user;
     acquiring a feature amount of the face image based on the face image; and
     determining whether the user has peripheral paralysis or central paralysis based on the feature amount.
PCT/JP2022/035808 2021-09-29 2022-09-27 Information processing device, information processing method, and program WO2023054295A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-160019 2021-09-29
JP2021160019 2021-09-29

Publications (1)

Publication Number Publication Date
WO2023054295A1 true WO2023054295A1 (en) 2023-04-06

Family ID=85782685

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/035808 WO2023054295A1 (en) 2021-09-29 2022-09-27 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2023054295A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008204200A (en) * 2007-02-20 2008-09-04 Space Vision:Kk Face analysis system and program
JP2009520546A (en) * 2005-12-21 2009-05-28 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Computer-based evaluation of facial palsy
JP2020199072A (en) * 2019-06-10 2020-12-17 国立大学法人滋賀医科大学 Cerebral apoplexy determination device, method, and program
US20210093231A1 (en) * 2018-01-22 2021-04-01 UNIVERSITY OF VIRGINIA PATENT FOUNDATION d/b/a UNIVERSITY OF VIRGINIA LICENSING & VENTURE System and method for automated detection of neurological deficits



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22876162

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023551492

Country of ref document: JP