WO2021014938A1 - Walking ability evaluation device, walking ability evaluation system, walking ability evaluation method, program, and cognitive function evaluation device - Google Patents
- Publication number
- WO2021014938A1 (application PCT/JP2020/026208 / JP2020026208W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- walking
- pedestrian
- walking speed
- face area
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/112—Gait analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B10/00—Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1126—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
- A61B5/1128—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
- A61B5/1171—Identification of persons based on the shapes or appearances of their bodies or parts thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Definitions
- The present disclosure relates to a walking function evaluation device, a walking function evaluation system, a walking function evaluation method, a program, and a cognitive function evaluation device that evaluate walking function without requiring any additional effort in daily life.
- Conventionally, an early-stage dementia identification system has been known that identifies dementia by detecting operations that differ from a person's daily operation tendencies, based on the switch operation status of a monitoring sensor unit (for example, Patent Document 1).
- Dementia ranges in severity from mild to severe. If a change in cognitive function is found at the stage of mild cognitive impairment, before dementia develops, exercise training and similar interventions can in some cases suppress the onset of dementia.
- The present disclosure has been made in view of such problems, and its object is to detect and present abnormalities in walking function at the stage of mild cognitive impairment without requiring any additional effort in daily life.
- The walking function evaluation device includes: an image acquisition unit that acquires an image including a pedestrian as a subject; a walking speed estimation unit that estimates the walking speed of the pedestrian based on the image acquired by the image acquisition unit; a recording unit that records time-series data of the walking speed; a reference data storage unit that stores reference data of the walking speed at which walking function is considered to have deteriorated; an evaluation unit that evaluates the walking function of the pedestrian based on the time-series data and the reference data; and a presentation unit that presents information on the walking function of the pedestrian as evaluated by the evaluation unit.
- The walking function evaluation device of the present disclosure can detect and present abnormalities in walking function at the stage of mild cognitive impairment without requiring any additional effort in daily life.
- FIG. 1 is a diagram showing an outline of a cognitive function evaluation system according to the first embodiment.
- FIG. 2 is a block diagram showing a functional configuration of the cognitive function evaluation system according to the first embodiment.
- FIG. 3 is an explanatory diagram showing the time-series change in the size of the face region as the pedestrian approaches and then moves away from the camera.
- FIG. 4 is a flowchart of operation example 1 of the cognitive function evaluation system.
- FIG. 5 is an explanatory diagram showing an example of a display screen for displaying a decrease in cognitive function.
- FIG. 6 is an explanatory diagram showing an example of a display screen for displaying time-series data of walking speed.
- FIG. 7 is a block diagram showing a functional configuration of the cognitive function evaluation system according to the third embodiment.
- FIG. 8 is a flowchart of operation example 2 of the cognitive function evaluation system.
- FIG. 9 is an explanatory diagram showing the positions of the first camera and the second camera according to the fourth embodiment.
- Each figure is a schematic view and is not necessarily drawn exactly; for example, the scales do not always match between figures. In each figure, substantially identical components are given the same reference numerals, and duplicate descriptions are omitted or simplified.
- FIG. 1 is a diagram showing an outline of the cognitive function evaluation system 10 according to the first embodiment.
- FIG. 2 is a block diagram showing a functional configuration of the cognitive function evaluation system 10 according to the first embodiment.
- The cognitive function evaluation system 10 includes a camera 20, a cognitive function evaluation device 30, and an output device 40.
- The cognitive function evaluation system 10 acquires images of the pedestrian M taken by the camera 20 and evaluates the cognitive function of the pedestrian M based on the acquired images.
- The method of evaluating cognitive function will be described later.
- The camera 20 is, for example, a camera installed on the ceiling of a corridor in the house of the pedestrian M or on the ceiling of a corridor in a nursing facility, and captures an image (a moving image composed of a plurality of images) including, as a subject, the pedestrian M walking along a passage such as the corridor 60 of the house.
- The camera 20 outputs an image including the walking pedestrian M as a subject, frame by frame, to the image acquisition unit 31 described later.
- The frame rate (fr) of the camera 20 is, for example, 30 fps (frames per second).
- The camera 20 may be a camera using a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a camera using a CCD (Charge Coupled Device) image sensor; the type of sensor is not particularly limited.
- The camera 20 may also be installed inside an office, for example, or in a building other than the home, such as a public institution.
- As the camera 20, for example, a surveillance camera or security camera that constantly captures the pedestrian M living in a house or nursing facility can be repurposed. As a result, as long as the pedestrian M lives as usual, images of the pedestrian M walking are automatically accumulated in the recording unit 36 of the cognitive function evaluation device 30 without the pedestrian M performing any special movement. In other words, the cognitive function evaluation system 10 can evaluate the cognitive function of the pedestrian M without requiring any special movement or operation by the pedestrian M.
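As a rough illustration of how such a continuously running camera could feed the image acquisition unit, the sketch below reads frames from a video source with OpenCV and hands each one to a downstream callback. The `process_frame` callback and the use of OpenCV are assumptions for illustration; the text does not prescribe any particular capture mechanism.

```python
import cv2

def acquire_frames(source=0, process_frame=None):
    """Read frames from a camera or video file and hand each one downstream.

    `source` may be a device index or a file path; `process_frame` is a
    hypothetical callback standing in for the image acquisition unit 31."""
    cap = cv2.VideoCapture(source)
    if not cap.isOpened():
        raise RuntimeError(f"could not open video source {source!r}")
    try:
        while True:
            ok, frame = cap.read()
            if not ok:                    # end of stream or read error
                break
            if process_frame is not None:
                process_frame(frame)      # pass the frame to the next stage
    finally:
        cap.release()
```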
- The cognitive function evaluation device 30 evaluates the cognitive function of the pedestrian M.
- The cognitive function evaluation device 30 is, for example, a home controller installed in a house for controlling devices installed in the house, but it may also be a personal computer or the like.
- The cognitive function evaluation device 30 may be installed in the house where the camera 20 is installed, or outside that house.
- The cognitive function evaluation device 30 includes an image acquisition unit 31, a face area detection unit 32, a face area measurement unit 33, an authentication unit 34, a walking speed estimation unit 35, a recording unit 36, a reference data storage unit 37, an evaluation unit 38, and a communication unit 39.
- The image acquisition unit 31 acquires the images (more specifically, image data) taken by the camera 20, frame by frame.
- Specifically, the image acquisition unit 31 is a communication module (communication circuit) that communicates with the camera 20.
- The communication performed by the image acquisition unit 31 may be wired or wireless.
- The image acquisition unit 31 only needs to be able to communicate with the camera 20, and the communication standard is not particularly limited.
- The face area detection unit 32 reads the image data acquired by the image acquisition unit 31 and detects the face area of the pedestrian M in each frame of the image data.
- FIG. 3 is an explanatory diagram showing the time-series change in the size of the face region as the pedestrian M approaches and then moves away from the camera 20 (from left to right in FIG. 3).
- Specifically, the face area detection unit 32 detects the contour A of the face of the pedestrian M as a circle.
- The contour A is detected as being substantially the same size as the facial contour of the pedestrian M in the image, or slightly larger than the actual facial contour of the pedestrian M.
- The contour A of the face may also be detected as a quadrangle; its shape is not particularly limited.
- The detection results of the face area detection unit 32 are sent to the face area measurement unit 33 and to the authentication unit 34.
- The face area measurement unit 33 measures the size of the face area detected by the face area detection unit 32.
- For example, if the face region is assumed to be a circle, the size of the face region is obtained by measuring its maximum diameter R.
- The maximum diameter is expressed in pixels (number of pixels).
- The size of the face region is measured, for example, using the X- and Y-coordinate values of the face region detected by the face area detection unit 32.
- As shown in FIG. 3, the frames are, from left to right, frame 1 (fr1), frame 2 (fr2), frame 3 (fr3), frame 4 (fr4), and frame 5 (fr5), and the maximum diameters R of the face region in these frames are R1, R2, R3, R4, and R5, respectively. In FIG. 3, the pedestrian M is assumed to be closest to the camera 20 in frame 3 (fr3).
- Because the pedestrian M is closest to the camera 20 in frame 3 (fr3), the maximum diameter R3 of the face region in frame 3 (fr3) is the longest among the maximum diameters R1, R2, R3, R4, and R5 of the face regions in frames 1 to 5.
- The measurement results for the size of the face area are sent to the walking speed estimation unit 35.
- The face area may also be detected as a quadrangle; its shape is not particularly limited.
- When the face area is a quadrangle, the length of its diagonal is measured and used as the maximum length of the face area.
- The face area detection unit 32 and the face area measurement unit 33 are realized by, for example, a microcomputer and various sensor devices.
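The text leaves the face-detection method to known techniques, so the following is only a minimal sketch, assuming an off-the-shelf OpenCV Haar cascade detector. It detects the face region as a rectangle and, as in the quadrangle case described above, takes the diagonal length in pixels as the maximum length R of the face region.

```python
import math
import cv2

# Off-the-shelf frontal-face Haar cascade shipped with OpenCV; any known
# face-detection technique could be substituted here.
_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def measure_face_region(frame):
    """Detect the largest face region in one frame and return its maximum
    length R in pixels (the diagonal of the bounding rectangle), or None if
    no face is detected."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = _CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest detection
    return math.hypot(w, h)  # diagonal length in pixels
```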
- The authentication unit 34 calculates the facial feature amount of the pedestrian M from the face area detected by the face area detection unit 32.
- The facial feature amount can be defined using, for example, the relative positions of the eyes, nose, and mouth on the face, the facial contour A, and the like.
- The authentication unit 34 determines whether the facial feature amount of the pedestrian M extracted from the image data matches the facial feature amount of any of a plurality of registered persons. The authentication unit 34 then recognizes, as the pedestrian M included in the image data, the registered person whose facial feature amount has the highest degree of matching above a predetermined threshold.
- The plurality of registered persons and their facial feature amounts are stored in, for example, a semiconductor memory or an HDD (Hard Disk Drive).
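A minimal sketch of the matching step performed by the authentication unit 34, under the assumption that facial feature amounts are represented as numeric vectors and compared by cosine similarity; the text does not specify how the degree of matching is computed, and the threshold value here is a placeholder.

```python
import numpy as np

MATCH_THRESHOLD = 0.8  # assumed minimum degree of matching; not given in the text

def identify_pedestrian(face_feature, registered):
    """Return the name of the registered person whose facial feature vector
    best matches `face_feature`, or None if no match clears the threshold.

    `registered` maps person name -> feature vector; cosine similarity is
    used here as a stand-in for the unspecified matching measure."""
    query = np.asarray(face_feature, dtype=float)
    best_name, best_score = None, -1.0
    for name, feat in registered.items():
        feat = np.asarray(feat, dtype=float)
        denom = np.linalg.norm(query) * np.linalg.norm(feat)
        if denom == 0.0:
            continue  # skip degenerate feature vectors
        score = float(np.dot(query, feat) / denom)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= MATCH_THRESHOLD else None
```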
- The walking speed estimation unit 35 estimates the walking speed of the pedestrian M from the change in the maximum diameter R of the face region measured for each frame.
- The walking speed estimation unit 35 estimates the walking speed of the pedestrian M using the maximum diameter R of the face region and the amount of change in the maximum diameter R of the face region.
- The walking speed estimation unit 35 calculates the change in the maximum diameter R of the face region per unit time by the formula: change in maximum diameter R of the face region = (Rn+1 - Rn) / fr.
- The unit of the change in the maximum diameter R of the face region is pixel/s. The size of the face region is obtained in pixels, but it can be converted into an amount of movement in space by a known conversion process, which makes it possible to express the result in metres.
- For example, the change in the maximum diameter R of the face region from frame 1 (fr1) to frame 2 (fr2) can be calculated as: change in maximum diameter R of the face region = 30 · (R2 - R1).
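The following sketch turns the per-frame change in the maximum diameter R into a walking-speed estimate. It scales the frame-to-frame difference by the frame rate to obtain pixel/s, consistent with the 30 · (R2 - R1) example, and then applies a pixel-to-metre conversion factor; the text defers that conversion to known techniques, so the factor, the averaging, and the use of absolute values below are assumptions.

```python
FRAME_RATE = 30.0          # fps, as in the example above
METERS_PER_PIXEL = 0.005   # hypothetical factor; the text defers the
                           # pixel-to-metre conversion to known techniques

def diameter_change_per_second(r_prev, r_next, frame_rate=FRAME_RATE):
    """Change of the maximum face-region diameter R in pixel/s between two
    consecutive frames, e.g. 30 * (R2 - R1) at 30 fps."""
    return (r_next - r_prev) * frame_rate

def estimate_walking_speed(diameters, frame_rate=FRAME_RATE,
                           meters_per_pixel=METERS_PER_PIXEL):
    """Rough walking-speed estimate (m/s) from a sequence of per-frame
    maximum diameters R1, R2, ...; averaging the per-frame changes and
    taking absolute values are simplifications, not part of the source."""
    changes = [diameter_change_per_second(a, b, frame_rate)
               for a, b in zip(diameters, diameters[1:])]
    if not changes:
        return 0.0
    mean_change_px_per_s = sum(abs(c) for c in changes) / len(changes)
    return mean_change_px_per_s * meters_per_pixel
```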
- The recording unit 36 is, for example, a semiconductor memory such as a flash memory, or an HDD, and records the time-series data of the walking speed sent from the walking speed estimation unit 35.
- The recording unit 36 sends the time-series data to the evaluation unit 38.
- The time-series data is data representing the walking speed estimates in the order in which they were estimated. For example, when the walking speed of the pedestrian M is estimated on the 24th, 25th, and 26th, the recording unit 36 records the walking speeds in that order.
- The time-series data is organized, for example, in a form suitable for a line graph.
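A minimal sketch of the kind of time-series record described above: one representative walking speed per day, kept in the order in which the speeds were estimated. The in-memory dictionary stands in for the flash memory or HDD mentioned in the text.

```python
from collections import OrderedDict

class WalkingSpeedRecord:
    """Keeps one walking-speed value (m/s) per date, in estimation order."""

    def __init__(self):
        self._speeds = OrderedDict()   # date label -> speed in m/s

    def record(self, date, speed_mps):
        self._speeds[date] = speed_mps

    def time_series(self):
        """Return (dates, speeds) in recording order, ready to be drawn as
        a line graph or passed to the evaluation step."""
        return list(self._speeds.keys()), list(self._speeds.values())

# Example with hypothetical values for the 24th to the 26th
rec = WalkingSpeedRecord()
rec.record("24th", 1.2)
rec.record("25th", 0.9)
rec.record("26th", 1.1)
```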
- The reference data storage unit 37 is, for example, a semiconductor memory such as a flash memory, or an HDD, and stores reference data for the walking speed: when the walking speed is equal to or less than a predetermined threshold, cognitive function is considered to have deteriorated.
- The reference data storage unit 37 sends the reference data to the evaluation unit 38.
- The predetermined threshold is, for example, 1 m/s. In general, when the walking speed falls below 1 m/s, cognitive function is considered to have deteriorated.
- The evaluation unit 38 evaluates the cognitive function of the pedestrian M based on the time-series data and the reference data. Specifically, the evaluation unit 38 counts the number of days within a predetermined period of the time-series data on which the walking speed falls below the predetermined threshold. When this number of days is equal to or greater than the number of days on which the walking speed fell below the predetermined threshold in a past predetermined period, the evaluation unit 38 evaluates that the cognitive function of the pedestrian M has deteriorated.
- The predetermined period is, for example, several days, one week, two weeks, one month, or one year.
- The past predetermined period is, for example, several days, one week, two weeks, or one month before the predetermined period.
- If, for example, the predetermined period is one week, the past predetermined period is preferably also one week; that is, the two periods preferably have the same number of days.
- For example, when the predetermined period is one week, the evaluation unit 38 may determine that cognitive function has deteriorated when the walking speed falls below 1 m/s on half or more of the days of that week.
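A sketch of the day-counting comparison described above, assuming one representative walking speed per day. The 1 m/s threshold and the rule that decline is suspected when the current period has at least as many slow days as the past period of the same length come from the text; the data layout and function names are illustrative.

```python
WALKING_SPEED_THRESHOLD = 1.0  # m/s, the reference value cited above

def days_below_threshold(daily_speeds, threshold=WALKING_SPEED_THRESHOLD):
    """Count the days whose representative walking speed (m/s) falls below
    the threshold."""
    return sum(1 for v in daily_speeds if v < threshold)

def cognitive_decline_suspected(current_period, past_period,
                                threshold=WALKING_SPEED_THRESHOLD):
    """Flag possible decline when the current period has at least as many
    slow days as a past period of the same length, as described above."""
    return (days_below_threshold(current_period, threshold)
            >= days_below_threshold(past_period, threshold))

# Example: one week vs. the previous week (hypothetical values)
this_week = [1.1, 0.9, 1.2, 0.8, 1.0, 0.95, 1.1]
last_week = [1.2, 1.1, 1.0, 1.1, 0.9, 1.2, 1.15]
print(cognitive_decline_suspected(this_week, last_week))  # True: 3 vs. 1 slow days
```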
- The communication unit 39 is a communication module (communication circuit) that allows the cognitive function evaluation device 30 to communicate with the output device 40 via a wide-area communication network 50 such as the Internet.
- The communication performed by the communication unit 39 may be wireless or wired.
- The communication standard used for the communication is also not particularly limited.
- The cognitive function evaluation device 30 can also be realized by a non-volatile memory in which a program is stored, a volatile memory serving as a temporary storage area for executing the program, input/output ports, a processor that executes the program, and the like.
- The image acquisition unit 31, the face area detection unit 32, the face area measurement unit 33, the authentication unit 34, the walking speed estimation unit 35, the recording unit 36, the reference data storage unit 37, the evaluation unit 38, and the communication unit 39 do not necessarily represent physical components; rather, they represent functions realized by a processor or the like.
- The output device 40 is an information terminal composed of a presentation unit 41, such as a liquid crystal display panel or an organic EL (Electroluminescence) display panel, and an operation reception unit 42, such as a touch panel, that receives user operations.
- The output device 40 includes the presentation unit 41, which presents information about the cognitive function of the pedestrian M evaluated by the evaluation unit 38.
- The information about cognitive function is, for example, the time-series data of the walking speed or a change in the cognitive function of the pedestrian M.
- The output device 40 is, for example, a portable information terminal such as a smartphone or tablet terminal, or a monitor or touch panel, but it may also be a stationary information terminal such as a personal computer.
- The output device 40 can communicate with the cognitive function evaluation device 30 via the wide-area communication network 50.
- The output device 40 is, for example, an information terminal installed in a hospital, a nursing care facility, or the like. That is, the time-series data of the walking speed, the change in walking speed, and the evaluation result of cognitive function are calculated from the image data captured in the house of the pedestrian M, and the hospital or nursing facility receives the evaluation result.
- As a result, if there is an abnormality in the cognitive function of the pedestrian M, a hospital or nursing facility can treat the pedestrian M, or encourage treatment, at the stage of mild dementia.
- The output device 40 may also communicate directly with the cognitive function evaluation device 30 without going through the wide-area communication network 50.
- The presentation unit 41 generates and displays GUI (Graphical User Interface) objects, such as selection buttons, for accepting selections and operations from the pedestrian M.
- The output device 40 may include a speaker that outputs audio, in place of or in addition to the presentation unit 41.
- In that case, the output device 40 may output a voice conveying the content to be presented to the pedestrian M.
- FIG. 4 is a flowchart of operation example 1 of the cognitive function evaluation system 10.
- First, the image acquisition unit 31 of the cognitive function evaluation device 30 acquires an image (image data) including the pedestrian M as a subject from the camera 20 (S10).
- Next, the face area detection unit 32 detects the face area of the pedestrian M from the acquired image (S20).
- Next, the authentication unit 34 calculates the facial feature amount of the pedestrian M from the face area detected by the face area detection unit 32 (S30), and determines whether the facial feature amount of the pedestrian M matches the facial feature amount of any of the plurality of registered persons (S40).
- Next, the face area measurement unit 33 measures the size of the face area detected by the face area detection unit 32 (S50).
- Next, the walking speed estimation unit 35 estimates the walking speed from the size of the face area measured by the face area measurement unit 33 (S60).
- Next, the recording unit 36 records the time-series data of the walking speed estimated by the walking speed estimation unit 35 (S70).
- Next, the evaluation unit 38 counts the number of days within the predetermined period of the time-series data on which the walking speed falls below the predetermined threshold, and evaluates the change in cognitive function by comparing this with the number of days on which the walking speed fell below the predetermined threshold in the past predetermined period (S80).
- The evaluation unit 38 may evaluate the cognitive function periodically and have the communication unit 39 transmit the evaluation result.
- Next, the communication unit 39 transmits the time-series data of the walking speed and the change in cognitive function to the presentation unit 41 (S90).
- Finally, the presentation unit 41 of the output device 40 presents the time-series data of the walking speed and the change in cognitive function sent from the communication unit 39 (S100).
- FIG. 5 is an explanatory diagram showing an example of a display screen for displaying a decrease in cognitive function.
- In FIG. 5, changes in cognitive function as of today, June 30 (Sunday), are displayed.
- The evaluation results for about one month, from June 2 to June 29, are displayed.
- The predetermined period is June 23 to June 29, and within that period the walking speed fell below the predetermined threshold on four days.
- The past predetermined periods are June 2 to June 8, June 9 to June 15, and June 16 to June 22.
- The presentation unit 41 may present that cognitive function tends to be deteriorating in the predetermined period, because the number of days on which the walking speed fell below the predetermined threshold increased by one day compared with the past predetermined period.
- FIG. 6 is an explanatory diagram showing an example of a display screen for displaying time-series data of walking speed.
- In FIG. 6, the time-series data of the walking speed from the 19th to the 30th is recorded in the recording unit 36, and the presentation unit 41 may present it as a line graph.
- The hatched area of the line graph indicates where the walking speed is below the predetermined threshold. In this case, it can easily be seen that the walking speed tends to be decreasing.
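One possible way to draw such a graph, assuming matplotlib: the daily speeds are plotted as a line graph and the region below the 1 m/s threshold is shaded, similar to the hatched area described for FIG. 6. The example values are made up.

```python
import matplotlib.pyplot as plt

def plot_walking_speed(days, speeds, threshold=1.0):
    """Line graph of daily walking speed with the region below the
    threshold shaded, similar to the hatched area described for FIG. 6."""
    fig, ax = plt.subplots()
    ax.plot(days, speeds, marker="o")
    ax.axhspan(0, threshold, alpha=0.2, hatch="//")  # speeds below 1 m/s
    ax.set_xlabel("Day")
    ax.set_ylabel("Walking speed (m/s)")
    ax.set_ylim(bottom=0)
    fig.savefig("walking_speed.png")

# Example: the 19th to the 30th, with made-up speed values
plot_walking_speed(list(range(19, 31)),
                   [1.2, 1.1, 0.9, 1.0, 1.3, 0.8, 1.1, 0.9, 1.2, 1.0, 0.95, 1.05])
```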
- Another means may be used to estimate the walking speed of the pedestrian M from an image including the pedestrian M as a subject.
- The evaluation unit 38 may have the communication unit 39 transmit the evaluation result only when cognitive function tends to be declining; this behavior is not particularly limited.
- The evaluation unit 38 compares the current walking speed with the walking speed a predetermined period before the present, and evaluates the change in walking speed in a plurality of stages. For example, the evaluation unit 38 compares the current walking speed with the walking speed one week ago and evaluates the change on a five-point scale from "A" to "E".
- "A" to "E" correspond to the evaluation result of the change in walking speed.
- "A" indicates the greatest decrease in walking speed.
- "C" indicates that there is no change.
- "E" indicates the greatest improvement in walking speed.
- For example, the evaluation unit 38 judges the change in walking speed as "B" when the current walking speed is 10% lower than the walking speed one week ago, and as "A" when the current walking speed is 20% or more lower than the walking speed one week ago.
- Similarly, the change in walking speed is judged as "D" when the current walking speed has improved compared with one week ago, and as "E" when it has improved by 20% or more.
- The change in walking speed may instead be evaluated in 8 stages or 10 stages; the number of stages is not particularly limited.
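A sketch of the five-stage grading, assuming the current and one-week-old speeds are available as numbers. The 10% and 20% cut-offs for "A", "B", and "E" come from the text; the exact boundaries for "C" and "D" are not fully specified, so the bands used below are assumptions.

```python
def grade_speed_change(current, previous):
    """Grade the change in walking speed on the five-point scale "A"-"E".
    The 10%/20% cut-offs for "A", "B" and "E" come from the text; the
    boundaries used for "C" and "D" are assumptions."""
    if previous <= 0:
        raise ValueError("previous walking speed must be positive")
    ratio = (current - previous) / previous  # relative change
    if ratio <= -0.20:
        return "A"   # slowed by 20% or more
    if ratio <= -0.10:
        return "B"   # slowed by about 10%
    if ratio < 0.10:
        return "C"   # roughly unchanged (assumed band)
    if ratio < 0.20:
        return "D"   # improved (assumed band)
    return "E"       # improved by 20% or more

# Example: 0.85 m/s now vs. 1.0 m/s a week ago -> "B"
print(grade_speed_change(0.85, 1.0))
```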
- The cognitive function evaluation system 10 determines, as recommended content, the action content to recommend to the pedestrian M, based on the evaluation results for the change in walking speed and for the cognitive function of the pedestrian M evaluated by the cognitive function evaluation device 30.
- FIG. 7 is a block diagram showing a functional configuration of the cognitive function evaluation system 10 according to the third embodiment.
- In the third embodiment, the cognitive function evaluation device 30 further includes a database unit 200 and an analysis unit 201 in addition to the configuration of the first embodiment.
- The database unit 200 records the cognitive function of the pedestrian M, the evaluation results for the change in walking speed, or the time-series data of the walking speed sent from the evaluation unit 38.
- The database unit 200 is realized by a non-volatile storage device such as a semiconductor memory or an HDD.
- The analysis unit 201 determines, as recommended content, the action content recommended for the pedestrian M, based on the evaluation results for the change in walking speed and the cognitive function of the pedestrian M evaluated by the evaluation unit 38.
- The analysis unit 201 is realized by, for example, a microcomputer. Specifically, the analysis unit 201 is realized by a non-volatile memory in which a program is stored, a volatile memory serving as a temporary recording area for executing the program, input/output ports, a processor that executes the program, and the like.
- The analysis unit 201 transmits the determined recommended content to the presentation unit 41.
- The recommended content includes, for example, content that encourages starting cognitive training to improve cognitive function.
- The recommended content may also be, for example, content that promotes aerobic exercise such as walking or jogging, or content that promotes the intake of vegetables and fruits containing nutrients such as vitamin C, vitamin E, and β-carotene.
- In this way, the pedestrian M can be encouraged to take effective actions to improve cognitive function.
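One possible way the analysis unit 201 could map the evaluation results to recommended content is sketched below. The text only gives examples of recommended content (cognitive training, aerobic exercise, diet) and does not specify a selection rule, so the table and the conditions here are assumptions.

```python
# Hypothetical recommendation table; the text only gives examples of
# recommended content and no concrete selection rule.
RECOMMENDATIONS = {
    "declining": "Consider starting cognitive training and light aerobic "
                 "exercise such as walking or jogging.",
    "slowing":   "Keep up daily walking, and include vegetables and fruits "
                 "rich in vitamin C, vitamin E, and beta-carotene.",
    "stable":    "No change detected; keep up your current routine.",
}

def decide_recommendation(speed_grade, decline_suspected):
    """Pick recommended content from the evaluation results (the A-E speed
    grade and the cognitive-decline flag); the conditions are assumptions."""
    if decline_suspected:
        return RECOMMENDATIONS["declining"]
    if speed_grade in ("A", "B"):
        return RECOMMENDATIONS["slowing"]
    return RECOMMENDATIONS["stable"]
```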
- FIG. 8 is a flowchart of operation example 2 of the cognitive function evaluation system 10.
- In operation example 2, the cognitive function evaluation system 10 determines the action content recommended for the pedestrian M as the recommended content based on the evaluation result, and presents the recommended content.
- Following S80, the analysis unit 201 further determines the recommended action content as the recommended content based on the evaluation result for the change in cognitive function (S81).
- The analysis unit 201 then transmits the determined recommended content to the presentation unit 41 (S82).
- In the fourth embodiment, the cognitive function evaluation system 10 includes two cameras: a first camera 202 and a second camera 203.
- FIG. 9 is an explanatory diagram showing the positions of the first camera 202 and the second camera 203 according to the fourth embodiment.
- The first camera 202 is provided on the ceiling or a wall of a corridor of, for example, a house.
- The second camera 203 is installed at a predetermined distance L from the first camera 202.
- The first camera 202 and the second camera 203 are installed at the same height along the corridor 60 where the pedestrian M walks.
- The walking speed estimation unit 35 estimates the walking speed of the pedestrian M from the image data captured when the pedestrian M is closest to the first camera 202 and the image data captured when the pedestrian M is closest to the second camera 203.
- The image data when the pedestrian M is closest to the first camera 202 is the image data taken when the pedestrian M is located in the vertical direction of (directly below) the first camera 202.
- The image data when the pedestrian M is closest to the second camera 203 is the image data taken when the pedestrian M is located in the vertical direction of (directly below) the second camera 203.
- The face area measurement unit 33 measures the maximum diameter R of the face area of the pedestrian M included in the image data when the pedestrian M is closest to the first camera 202, and likewise measures the maximum diameter R of the face area of the pedestrian M included in the image data when the pedestrian M is closest to the second camera 203.
- The walking speed estimation unit 35 estimates the walking speed of the pedestrian M from the predetermined distance L between the first camera 202 and the second camera 203, the time T1 at which the image data with the pedestrian M closest to the first camera 202 is acquired, and the time T2 at which the image data with the pedestrian M closest to the second camera 203 is acquired.
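The two-camera arrangement implies a simple speed estimate: the known inter-camera distance L divided by the time between the two closest-approach moments. A minimal sketch, assuming the timestamps T1 and T2 are available in seconds:

```python
def estimate_speed_two_cameras(distance_l, t1, t2):
    """Walking speed (m/s) from the two-camera arrangement of FIG. 9: the
    known distance L (metres) between the cameras divided by the time, in
    seconds, between the moments the pedestrian is closest to camera 202
    and to camera 203."""
    dt = t2 - t1
    if dt <= 0:
        raise ValueError("t2 must be later than t1")
    return distance_l / dt

# Example: cameras 5 m apart, passed 4.2 s apart -> about 1.19 m/s
print(estimate_speed_two_cameras(5.0, 10.0, 14.2))
```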
- The cognitive function evaluation system 10 may also evaluate changes in the cognitive function of a plurality of pedestrians M included as subjects in the image. For example, when the image acquired by the camera 20 includes two pedestrians M as subjects, the authentication unit 34 simultaneously determines whether the facial feature amounts of the two pedestrians M extracted from the image data match the facial feature amounts of any of the plurality of registered persons. When both pedestrians M are matched among the registered persons, the walking speed estimation unit 35 may estimate the walking speeds of the two pedestrians M in parallel.
- Alternatively, instead of simultaneous parallel processing, the walking speed estimation unit 35 may estimate the walking speed of one pedestrian M and then immediately estimate the walking speed of the other pedestrian M; the processing order is not particularly limited.
- The number of pedestrians M included as subjects in the image may be three, four, five, or more, and is not particularly limited.
- The processing described in the above embodiments may be realized by centralized processing using a single device (system) or by distributed processing using a plurality of devices. The number of processors that execute the above program may also be one or more; that is, either centralized or distributed processing may be performed.
- The cognitive function evaluation device 30 and the output device 40 may also be formed as a single unit; their configuration is not particularly limited.
- The output device 40 may be a portable information terminal of the pedestrian M or a stationary information terminal such as a personal computer.
- In that case, the pedestrian M can be aware of changes in cognitive function at any time.
- The communication method between the devices described in the above embodiments is not particularly limited.
- The wireless communication method (communication standard) is, for example, short-range wireless communication such as ZigBee (registered trademark) or Bluetooth (registered trademark), or a wireless LAN (Local Area Network).
- Wired communication may also be performed between the devices instead of wireless communication.
- The wired communication is, for example, power line communication (PLC: Power Line Communication) or communication using a wired LAN.
- The order of the plurality of processes in the flowcharts of the operation examples described in the above embodiments is an example.
- The order of the plurality of processes may be changed, and the plurality of processes may be executed in parallel.
- Processing executed by a specific processing unit may instead be executed by another processing unit.
- The cognitive function in the above embodiments may be read as walking function.
- The present invention may also be realized as a walking function evaluation device, a walking function evaluation system, a walking function evaluation method, or the like.
- 10 Cognitive function evaluation system, 20 Camera, 30 Cognitive function evaluation device, 31 Image acquisition unit, 32 Face area detection unit, 33 Face area measurement unit, 34 Authentication unit, 35 Walking speed estimation unit, 36 Recording unit, 37 Reference data storage unit, 38 Evaluation unit, 39 Communication unit, 40 Output device, 41 Presentation unit, 42 Operation reception unit, 200 Database unit, 201 Analysis unit, 202 First camera, 203 Second camera, M Pedestrian, R Maximum diameter
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Animal Behavior & Ethology (AREA)
- Surgery (AREA)
- Physics & Mathematics (AREA)
- Molecular Biology (AREA)
- Medical Informatics (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Biophysics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dentistry (AREA)
- Physiology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
Abstract
A cognitive function evaluation device (30) is provided with: an image acquisition unit (31) for acquiring an image including a walking person (M) as a subject; a walking speed estimation unit (35) for estimating the walking speed of the walking person (M) on the basis of the image acquired by the image acquisition unit (31); a recording unit (36) for recording time-series data of the walking speed; a reference data storage unit (37) for storing walking speed reference data for determining cognitive decline; an evaluation unit (38) for evaluating the cognitive function of the walking person (M) on the basis of the time-series data and the reference data; and a presentation unit (41) for presenting information relating to the cognitive function of the walking person (M) evaluated by the evaluation unit (38).
Description
The present disclosure relates to a walking function evaluation device, a walking function evaluation system, a walking function evaluation method, a program, and a cognitive function evaluation device that evaluate walking function without requiring any additional effort in daily life.

Conventionally, an early-stage dementia identification system has been known that identifies dementia by detecting operations that differ from a person's daily operation tendencies, based on the switch operation status of a monitoring sensor unit (for example, Patent Document 1).

Dementia ranges in severity from mild to severe. If a change in cognitive function is found at the stage of mild cognitive impairment, before dementia develops, exercise training and similar interventions can in some cases suppress the onset of dementia.

The present disclosure has been made in view of such problems, and its object is to detect and present abnormalities in walking function at the stage of mild cognitive impairment without requiring any additional effort in daily life.

A walking function evaluation device according to one aspect of the present disclosure includes: an image acquisition unit that acquires an image including a pedestrian as a subject; a walking speed estimation unit that estimates the walking speed of the pedestrian based on the image acquired by the image acquisition unit; a recording unit that records time-series data of the walking speed; a reference data storage unit that stores reference data of the walking speed at which walking function is considered to have deteriorated; an evaluation unit that evaluates the walking function of the pedestrian based on the time-series data and the reference data; and a presentation unit that presents information on the walking function of the pedestrian as evaluated by the evaluation unit.

The walking function evaluation device of the present disclosure can detect and present abnormalities in walking function at the stage of mild cognitive impairment without requiring any additional effort in daily life.
Hereinafter, a cognitive function evaluation device, a cognitive function evaluation system, a cognitive function evaluation method, and a program according to embodiments of the present invention will be described in detail with reference to the drawings. Each of the embodiments described below shows a specific example of the present invention. Therefore, the numerical values, shapes, materials, components, the arrangement and connection of the components, the steps, the order of the steps, and the like shown in the following embodiments are examples and are not intended to limit the present invention. Among the components in the following embodiments, components that are not described in the independent claims are described as optional components.

Each figure is a schematic view and is not necessarily drawn exactly; for example, the scales do not always match between figures. In each figure, substantially identical components are given the same reference numerals, and duplicate descriptions are omitted or simplified.

(Embodiment 1)

First, an outline of the cognitive function evaluation system 10 according to Embodiment 1 will be described with reference to FIGS. 1 and 2. FIG. 1 is a diagram showing an outline of the cognitive function evaluation system 10 according to Embodiment 1. FIG. 2 is a block diagram showing the functional configuration of the cognitive function evaluation system 10 according to Embodiment 1. As shown in FIGS. 1 and 2, the cognitive function evaluation system 10 includes a camera 20, a cognitive function evaluation device 30, and an output device 40.

The cognitive function evaluation system 10 acquires images of the pedestrian M taken by the camera 20 and evaluates the cognitive function of the pedestrian M based on the acquired images. The method of evaluating cognitive function will be described later.
As shown in FIG. 1, the camera 20 is, for example, a camera installed on the ceiling of a corridor in the house of the pedestrian M or on the ceiling of a corridor in a nursing facility, and captures an image (a moving image composed of a plurality of images) including, as a subject, the pedestrian M walking along a passage such as the corridor 60 of the house.

The camera 20 outputs an image including the walking pedestrian M as a subject, frame by frame, to the image acquisition unit 31 described later. The frame rate (fr) of the camera 20 is, for example, 30 fps (frames per second). The camera 20 may be a camera using a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a camera using a CCD (Charge Coupled Device) image sensor; the type of sensor is not particularly limited.

The camera 20 may also be installed inside an office, for example, or in a building other than the home, such as a public institution.

As the camera 20, for example, a surveillance camera or security camera that constantly captures the pedestrian M living in a house or nursing facility can be repurposed. As a result, as long as the pedestrian M lives as usual, images of the pedestrian M walking are automatically accumulated in the recording unit 36 of the cognitive function evaluation device 30 without the pedestrian M performing any special movement. In other words, the cognitive function evaluation system 10 can evaluate the cognitive function of the pedestrian M without requiring any special movement or operation by the pedestrian M.
The cognitive function evaluation device 30 evaluates the cognitive function of the pedestrian M. The cognitive function evaluation device 30 is, for example, a home controller installed in a house for controlling devices installed in the house, but it may also be a personal computer or the like. The cognitive function evaluation device 30 may be installed in the house where the camera 20 is installed, or outside that house.

As shown in FIG. 2, the cognitive function evaluation device 30 specifically includes an image acquisition unit 31, a face area detection unit 32, a face area measurement unit 33, an authentication unit 34, a walking speed estimation unit 35, a recording unit 36, a reference data storage unit 37, an evaluation unit 38, and a communication unit 39.

The image acquisition unit 31 acquires the images (more specifically, image data) taken by the camera 20, frame by frame. Specifically, the image acquisition unit 31 is a communication module (communication circuit) that communicates with the camera 20. The communication performed by the image acquisition unit 31 may be wired or wireless. The image acquisition unit 31 only needs to be able to communicate with the camera 20, and the communication standard is not particularly limited.
The face area detection unit 32 reads the image data acquired by the image acquisition unit 31 and detects the face area of the pedestrian M in each frame of the image data.

FIG. 3 is an explanatory diagram showing the time-series change in the size of the face region as the pedestrian M approaches and then moves away from the camera 20 (from left to right in FIG. 3). As shown in FIG. 3, the face area detection unit 32 specifically detects the contour A of the face of the pedestrian M as a circle.

The contour A is detected as being substantially the same size as the facial contour of the pedestrian M in the image, or slightly larger than the actual facial contour of the pedestrian M.

The contour A of the face may also be a quadrangle; its shape is not particularly limited.

Since methods for detecting a face region are known techniques, any procedure or method may be used in the present embodiment, and a detailed description is omitted.

The detection results of the face area detection unit 32 are sent to the face area measurement unit 33 and to the authentication unit 34.
The face area measurement unit 33 measures the size of the face area detected by the face area detection unit 32. For example, if the face region is assumed to be a circle, the size of the face region is obtained by measuring its maximum diameter R. The maximum diameter is expressed in pixels (number of pixels). Because the size of the face region is detected as its maximum length, variation in the detection results can be kept small regardless of whether the face of the pedestrian M is turned sideways or facing forward.

The size of the face region is measured, for example, using the X- and Y-coordinate values of the face region detected by the face area detection unit 32.

As shown in FIG. 3, the frames are, from left to right, frame 1 (fr1), frame 2 (fr2), frame 3 (fr3), frame 4 (fr4), and frame 5 (fr5), and the maximum diameters R of the face region in these frames are R1, R2, R3, R4, and R5, respectively. In FIG. 3, the pedestrian M is assumed to be closest to the camera 20 in frame 3 (fr3).

Because the pedestrian M is closest to the camera 20 in frame 3 (fr3), the maximum diameter R3 of the face region in frame 3 (fr3) is the longest among the maximum diameters R1, R2, R3, R4, and R5 of the face regions in frames 1 to 5.

The measurement results for the size of the face area are sent to the walking speed estimation unit 35.

The face area may also be detected as a quadrangle; its shape is not particularly limited. When the face area is a quadrangle, the length of its diagonal is measured and used as the maximum length of the face area.

The face area detection unit 32 and the face area measurement unit 33 are realized by, for example, a microcomputer and various sensor devices.
The authentication unit 34 calculates the facial feature amount of the pedestrian M from the face area detected by the face area detection unit 32. The facial feature amount can be defined using, for example, the relative positions of the eyes, nose, and mouth on the face, the facial contour A, and the like.

The authentication unit 34 determines whether the facial feature amount of the pedestrian M extracted from the image data matches the facial feature amount of any of a plurality of registered persons. The authentication unit 34 then recognizes, as the pedestrian M included in the image data, the registered person whose facial feature amount has a degree of matching that is higher than a predetermined threshold and is the highest among the plurality of registered persons.

The plurality of registered persons and their facial feature amounts are stored in, for example, a semiconductor memory or an HDD (Hard Disk Drive).
The walking speed estimation unit 35 estimates the walking speed of the pedestrian M from the change in the maximum diameter R of the face region measured for each frame. The walking speed estimation unit 35 estimates the walking speed of the pedestrian M using the maximum diameter R of the face region and the amount of change in the maximum diameter R of the face region.

The walking speed estimation unit 35 calculates the change in the maximum diameter R of the face region per unit time by the following formula. The unit of the change in the maximum diameter R of the face region is pixel/s.

Change in maximum diameter R of the face region = (Rn+1 - Rn) / fr

The size of the face region is calculated in pixels, but by converting from the size on the screen it can be converted into an amount of movement in space (hereinafter referred to as conversion processing). This makes it possible to express the result in metres. Since methods for converting pixels into metres are known techniques, any procedure or method may be used.

By performing the conversion processing, the change in the maximum diameter R of the face region from frame 1 (fr1) to frame 2 (fr2) can be calculated, for example, by the following formula:

Change in maximum diameter R of the face region = 30 · (R2 - R1)

The change in the maximum diameter R of the face region from frame 2 (fr2) to frame 3 (fr3) can be calculated in the same way. In this manner, the walking speed estimation unit 35 estimates the walking speed of the pedestrian M for each day.
記録部36は、例えば、フラッシュメモリなどの半導体メモリまたはHDDなどであって、歩行速度推定部35から送られた歩行速度の時系列データを記録する。記録部36は、時系列データを評価部38に送る。時系列データとは、歩行速度を推定した順番に歩行速度の推定結果を表したデータである。記録部36は、例えば、24日、25日、26日にそれぞれ歩行者Mの歩行速度を推定した場合、24日、25日、26日に推定した歩行速度の順番に、歩行速度を記録している。時系列データは、例えば、折れ線グラフのようにデータ化されている。
The recording unit 36 is, for example, a semiconductor memory such as a flash memory or an HDD, and records time-series data of the walking speed sent from the walking speed estimation unit 35. The recording unit 36 sends the time series data to the evaluation unit 38. The time-series data is data showing the estimation results of the walking speed in the order in which the walking speed is estimated. For example, when the walking speed of the pedestrian M is estimated on the 24th, 25th, and 26th, the recording unit 36 records the walking speed in the order of the estimated walking speed on the 24th, 25th, and 26th. ing. The time series data is converted into data such as a line graph.
基準データ保存部37は、例えば、フラッシュメモリなどの半導体メモリまたはHDDなどであって、歩行速度が所定閾値以下である場合、認知機能が低下したとする歩行速度の基準データを保存する。基準データ保存部37は、基準データを評価部38に送る。
The reference data storage unit 37 stores, for example, a semiconductor memory such as a flash memory or an HDD, and when the walking speed is equal to or less than a predetermined threshold, the reference data of the walking speed that the cognitive function is deteriorated is stored. The reference data storage unit 37 sends the reference data to the evaluation unit 38.
所定閾値は、例えば、1m/sである。一般的に、歩行速度は、1m/s以下を下回ると、認知機能が低下していると考えられている。
The predetermined threshold value is, for example, 1 m / s. Generally, when the walking speed is less than 1 m / s, it is considered that the cognitive function is deteriorated.
評価部38は、時系列データと基準データとに基づいて、歩行者Mの認知機能を評価する。具体的には、評価部38は、時系列データの所定期間中から歩行速度が所定閾値を下回る日数を評価する。そして、所定閾値を下回る日数が、所定期間よりも過去の所定期間における歩行速度が所定閾値を下回る日数以上である場合、歩行者Mの認知機能が低下したと評価する。
The evaluation unit 38 evaluates the cognitive function of the pedestrian M based on the time series data and the reference data. Specifically, the evaluation unit 38 evaluates the number of days in which the walking speed falls below the predetermined threshold value during the predetermined period of the time series data. Then, when the number of days below the predetermined threshold is equal to or greater than the number of days when the walking speed in the predetermined period in the past is less than the predetermined threshold, it is evaluated that the cognitive function of the pedestrian M has deteriorated.
所定期間の日数は、例えば、数日、一週間、二週間、一ヶ月、一年などである。所定期間よりも過去の所定期間の日数は、所定期間よりも、数日、一週間、二週間、一ヶ月前などである。ここで、例えば、所定期間が一週間であれば、所定期間よりも過去の所定期間も、一週間であるなど、所定期間と所定期間よりも過去の所定期間は、同じ日数であることが好ましい。
The number of days in the predetermined period is, for example, several days, one week, two weeks, one month, one year, and the like. The number of days in the predetermined period past the predetermined period is several days, one week, two weeks, one month before the predetermined period, and the like. Here, for example, if the predetermined period is one week, the predetermined period past the predetermined period is also one week, and the predetermined period and the predetermined period past the predetermined period are preferably the same number of days. ..
なお、例えば、所定期間の日数と、所定期間よりも過去の所定期間の日数が異なる場合には、一方の日数に合わせるように補正したものと比較できる。
Note that, for example, when the number of days in a predetermined period and the number of days in a predetermined period in the past are different from the predetermined period, it can be compared with the one corrected to match one of the days.
For example, when the predetermined period is one week, the evaluation unit 38 may determine that the cognitive function has declined when the walking speed falls below 1 m/s on at least half of the days of that week.
The communication unit 39 is a communication module (communication circuit) used by the cognitive function evaluation device 30 to communicate with the output device 40 via a wide area communication network 50 such as the Internet. The communication performed by the communication unit 39 may be wireless or wired, and the communication standard used is not particularly limited.
The cognitive function evaluation device 30 can also be realized by a non-volatile memory in which a program is stored, a volatile memory serving as a temporary storage area for executing the program, an input/output port, a processor that executes the program, and the like.
In FIG. 2, the image acquisition unit 31, the face area detection unit 32, the face area measurement unit 33, the authentication unit 34, the walking speed estimation unit 35, the recording unit 36, the reference data storage unit 37, the evaluation unit 38, and the communication unit 39 do not necessarily represent physical components; they represent functions realized by a processor or the like.
The output device 40 is an information terminal composed of a presentation unit 41, such as a liquid crystal display panel or an organic EL (electroluminescence) display panel, and an operation reception unit 42, such as a touch panel, that receives user operations.
The output device 40 includes the presentation unit 41, which presents the information on the evaluated cognitive function of the pedestrian M sent from the evaluation unit 38. The information on the cognitive function is, for example, the time-series data of the walking speed or a change in the cognitive function of the pedestrian M. The output device 40 is, for example, a portable information terminal such as a smartphone or tablet, or a monitor or touch panel, but it may also be a stationary information terminal such as a personal computer.
The output device 40 can communicate with the cognitive function evaluation device 30 via the wide area communication network 50. The output device 40 is, for example, an information terminal installed in a hospital, a nursing care facility, or the like. That is, the time-series data of the walking speed, or the change in walking speed, and the evaluation result of the cognitive function are calculated from image data captured in the home of the pedestrian M, and the hospital or nursing care facility receives the evaluation result.
In this case, abnormalities in cognitive function can be detected and presented at the stage of mild cognitive impairment without requiring additional work in daily life. "Without requiring additional work in daily life" means, for example, that the pedestrian M does not have to wear a special measuring device.
When there is an abnormality in the cognitive function of the pedestrian M, the hospital or nursing care facility can treat the pedestrian M, or encourage treatment, at the stage of mild dementia.
Note that the output device 40 may communicate directly with the cognitive function evaluation device 30 without going through the wide area communication network 50.
The presentation unit 41 generates and displays GUI (Graphical User Interface) objects, such as selection buttons, for accepting selections and operations from the pedestrian M.
Note that the output device 40 may have a speaker that outputs audio instead of, or in addition to, the presentation unit 41. The output device 40 may output audio representing the content to be presented to the pedestrian M.
(Operation Example 1)
Next, Operation Example 1 of the cognitive function evaluation system 10 will be described. FIG. 4 is a flowchart of Operation Example 1 of the cognitive function evaluation system 10.
First, the image acquisition unit 31 of the cognitive function evaluation device 30 acquires an image (image data) including the pedestrian M as a subject from the camera 20 (S10).
Next, the face area detection unit 32 detects the face area of the pedestrian M from the acquired image (S20).
Next, the authentication unit 34 calculates the facial feature amount of the pedestrian M from the face area detected by the face area detection unit 32 (S30). The authentication unit 34 then determines whether the facial feature amount of the pedestrian M matches the facial feature amount of any of the registered persons (S40).
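The document does not specify how facial feature amounts are computed or compared. The following sketch assumes they are fixed-length feature vectors compared by cosine similarity against registered persons; the vector source, the threshold value, and the function names are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (assumptions, not the source implementation) of the S40 matching step.
SIMILARITY_THRESHOLD = 0.8  # illustrative value, not specified in the document

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_registered_person(features: np.ndarray,
                            registered: dict[str, np.ndarray],
                            threshold: float = SIMILARITY_THRESHOLD) -> str | None:
    """Return the ID of the best-matching registered person, or None if no match (NO in S40)."""
    best_id, best_score = None, threshold
    for person_id, ref in registered.items():
        score = cosine_similarity(features, ref)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id
```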
If they do not match, the process returns to S10 (NO in S40). If they match, the following processing is performed (YES in S40).
Next, the face area measurement unit 33 measures the size of the face area detected by the face area detection unit 32 (S50).
Next, the walking speed is estimated from the size of the face area measured by the face area measurement unit 33 (S60).
The recording unit 36 records the time-series data of the walking speed estimated by the walking speed estimation unit 35 (S70).
The evaluation unit 38 counts, within a predetermined period of the time-series data, the number of days on which the walking speed falls below the predetermined threshold. The evaluation unit 38 then evaluates the change in cognitive function by comparing the number of days on which the walking speed fell below the predetermined threshold in an earlier predetermined period with the number of days on which it fell below the threshold in the current predetermined period (S80).
The evaluation unit 38 may, for example, evaluate the cognitive function periodically and have the communication unit 39 transmit the evaluation result.
The communication unit 39 transmits the time-series data of the walking speed and the change in cognitive function to the presentation unit 41 (S90).
The presentation unit 41 of the output device 40 presents the time-series data of the walking speed sent from the communication unit 39 and the change in cognitive function (S100).
FIG. 5 is an explanatory diagram showing an example of a display screen for displaying a decline in cognitive function. In the example shown in FIG. 5, the change in cognitive function as of today, Sunday, June 30, is displayed, together with the evaluation results for roughly one month, from June 2 to June 29. In FIG. 5, the predetermined period is the 23rd to the 29th, and within that period there are four days on which the walking speed falls below the predetermined threshold. The earlier predetermined periods are the 2nd to the 8th, the 9th to the 15th, and the 16th to the 22nd. For example, to evaluate the change in cognitive function, if the earlier predetermined period is taken to be the 9th to the 15th, the number of days in that period on which the walking speed falls below the predetermined threshold is three. Since the number of days below the predetermined threshold in the current period is one more than in the earlier period, the presentation unit 41 may indicate that the cognitive function tends to be declining.
FIG. 6 is an explanatory diagram showing an example of a display screen for displaying the time-series data of the walking speed. As shown in FIG. 6, the time-series data of the walking speed is recorded in the recording unit 36 from, for example, the 19th to the 30th, and the presentation unit 41 may present it as a line graph. In the example shown in FIG. 6, the hatched region of the line graph indicates that the walking speed is below the predetermined threshold. In this case, it can easily be seen that the walking speed tends to be decreasing.
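As a rough illustration of this kind of display, the sketch below plots a walking-speed time series as a line graph and shades the region below the 1 m/s threshold; the dates, speed values, and styling are invented for illustration and are not taken from FIG. 6.

```python
import matplotlib.pyplot as plt

# Illustrative data only (not the values shown in FIG. 6).
days = list(range(19, 31))  # the 19th through the 30th
speeds = [1.2, 1.15, 1.1, 1.05, 0.98, 1.02, 0.95, 0.97, 0.93, 0.9, 0.92, 0.88]
THRESHOLD_M_S = 1.0

fig, ax = plt.subplots()
ax.plot(days, speeds, marker="o", label="estimated walking speed")
ax.axhline(THRESHOLD_M_S, linestyle="--", label="threshold (1 m/s)")
# Shade the region below the threshold, corresponding to the hatched area described above.
ax.fill_between(days, 0, THRESHOLD_M_S, alpha=0.2)
ax.set_xlabel("day of month")
ax.set_ylabel("walking speed (m/s)")
ax.set_ylim(0.8, 1.3)
ax.legend()
plt.show()
```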
Note that, instead of the face area detection unit 32 and the face area measurement unit 33, other means may be used to estimate the walking speed of the pedestrian M from an image including the pedestrian M as a subject.
Note that the evaluation unit 38 may have the communication unit 39 transmit the evaluation result only when the cognitive function shows a declining tendency; this is not particularly limited.
(Embodiment 2)
The basic configuration of Embodiment 2 is substantially the same as that of Embodiment 1. Therefore, the same components are denoted by the same reference numerals, their description is omitted, and only the differences are described.
The evaluation unit 38 compares the current walking speed with the walking speed a predetermined period before the present, and evaluates the change in walking speed in a plurality of levels. For example, the evaluation unit 38 compares the current walking speed with the walking speed one week ago and grades the change on a five-level scale from "A" to "E".
"A" to "E" correspond to evaluation results for the change in walking speed. As an example, in the order "A" to "E", "A" represents the largest decrease in walking speed, "C" represents no change, and "E" represents the largest improvement, although the order may be reversed. Specifically, the evaluation unit 38 judges the change as "B" when the current walking speed is 10% lower than the walking speed one week ago, and as "A" when it is 20% or more lower. Likewise, it judges the change as "D" when the current walking speed is 10% higher than the walking speed one week ago, and as "E" when it is 20% or more higher.
The change in walking speed may instead be graded on eight or ten levels; the number of levels is not particularly limited.
In this case, since the change in walking speed can be easily recognized, abnormalities in cognitive function can be detected and presented early, at the stage of mild cognitive impairment.
(Embodiment 3)
The basic configuration of Embodiment 3 is substantially the same as that of Embodiment 1. Therefore, the same components are denoted by the same reference numerals, their description is omitted, and only the differences are described.
In Embodiment 3, the cognitive function evaluation system 10 determines, as recommended content, the action recommended to the pedestrian M based on the evaluation results of the cognitive function of the pedestrian M and of the change in walking speed evaluated by the cognitive function evaluation device 30. FIG. 7 is a block diagram showing the functional configuration of the cognitive function evaluation system 10 according to Embodiment 3. In addition to the configuration of Embodiment 1, the cognitive function evaluation device 30 further includes a database unit 200 and an analysis unit 201.
The database unit 200 records the cognitive function of the pedestrian M sent from the evaluation unit 38, the evaluation result of the change in walking speed, or the time-series data of the walking speed. The database unit 200 is realized by a non-volatile storage device such as a semiconductor memory or an HDD.
The analysis unit 201 determines, as recommended content, the action recommended to the pedestrian M based on the evaluation results of the cognitive function of the pedestrian M and of the change in walking speed evaluated by the evaluation unit 38.
The analysis unit 201 is realized by, for example, a microcomputer. Specifically, the analysis unit 201 is realized by a non-volatile memory in which a program is stored, a volatile memory serving as a temporary recording area for executing the program, an input/output port, a processor that executes the program, and the like.
The analysis unit 201 transmits the determined recommended content to the presentation unit 41. The recommended content is, for example, content encouraging the pedestrian M to start cognitive training to improve cognitive function. The recommended content may also be, for example, content encouraging aerobic exercise such as walking or jogging, or content encouraging the intake of vegetables and fruits containing nutrients such as vitamin C, vitamin E, and β-carotene.
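As a sketch of how a mapping from an evaluation result to recommended content might look, the rule table below reuses the "A" to "E" walking-speed grades from Embodiment 2 purely for illustration; the document does not specify concrete rules, so all entries and names here are assumptions.

```python
# Minimal sketch (assumed rules, not the source implementation): map an evaluated
# walking-speed change grade ('A' to 'E', A = largest decrease) to recommended content.
RECOMMENDATIONS = {
    "A": "Consider consulting a doctor and starting cognitive training.",
    "B": "Start cognitive training and aerobic exercise such as walking or jogging.",
    "C": "Keep up your current routine of exercise and balanced meals.",
    "D": "Good progress; continue aerobic exercise and eat vegetables and fruits.",
    "E": "Excellent progress; maintain your current lifestyle.",
}

def recommended_content(grade: str) -> str:
    """Return the recommended action content for a walking-speed change grade."""
    return RECOMMENDATIONS.get(grade, "No recommendation available for this grade.")

print(recommended_content("B"))
```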
In this case, the pedestrian M can be encouraged to take effective actions for improving cognitive function.
(Operation Example 2)
Next, Operation Example 2 of the cognitive function evaluation system 10 will be described. FIG. 8 is a flowchart of Operation Example 2 of the cognitive function evaluation system 10. FIG. 8 shows that, following S80 of Operation Example 1 (FIG. 4), the cognitive function evaluation system 10 determines, as recommended content, the action recommended to the pedestrian M based on the evaluation result, and presents the recommended content.
In Embodiment 3, following S80, the analysis unit 201 further determines the recommended action content as the recommended content based on the evaluation result of the change in cognitive function (S81).
Next, the analysis unit 201 transmits the determined recommended content to the presentation unit 41 (S82).
(Embodiment 4)
The basic configuration of Embodiment 4 is substantially the same as that of Embodiment 1. Therefore, the same components are denoted by the same reference numerals, their description is omitted, and only the differences are described. The same applies to the subsequent embodiments.
In Embodiment 4, the cognitive function evaluation system 10 includes two cameras, a first camera 202 and a second camera 203. FIG. 9 is an explanatory diagram showing the positions of the first camera 202 and the second camera 203 according to Embodiment 4. As shown in FIG. 9, the first camera 202 is provided on the ceiling or a wall of a corridor of, for example, a house. The second camera 203 is installed at a predetermined distance L from the first camera 202.
The first camera 202 and the second camera 203 are preferably installed at the same height with respect to the corridor 60 along which the pedestrian M walks.
The walking speed estimation unit 35 estimates the walking speed of the pedestrian M from the image data captured when the pedestrian M is closest to the first camera 202 and the image data captured when the pedestrian M is closest to the second camera 203.
The image data captured when the pedestrian M is closest to the first camera 202 is the image data captured when the pedestrian M is positioned in the vertical direction of the first camera 202. Likewise, the image data captured when the pedestrian M is closest to the second camera 203 is the image data captured when the pedestrian M is positioned in the vertical direction of the second camera 203.
The face area measurement unit 33 measures the maximum diameter R of the face area of the pedestrian M included in the image data captured when the pedestrian M is closest to the first camera 202, and likewise measures the maximum diameter R of the face area of the pedestrian M included in the image data captured when the pedestrian M is closest to the second camera 203.
The walking speed estimation unit 35 estimates the walking speed of the pedestrian M from the predetermined distance L between the first camera 202 and the second camera 203, the time T1 at which the image data with the pedestrian M closest to the first camera 202 was acquired, and the time T2 at which the image data with the pedestrian M closest to the second camera 203 was acquired.
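A minimal sketch of this two-camera estimate is shown below: the speed is simply the known camera separation L divided by the time between the two closest-approach frames. The timestamps, distance, and function name in the example are illustrative assumptions.

```python
from datetime import datetime

def walking_speed_two_cameras(distance_l_m: float,
                              t1: datetime, t2: datetime) -> float:
    """Estimate walking speed (m/s) from the camera separation L and the times at which
    the pedestrian was closest to the first and second cameras."""
    elapsed_s = abs((t2 - t1).total_seconds())
    if elapsed_s == 0:
        raise ValueError("timestamps must differ")
    return distance_l_m / elapsed_s

# Example: cameras 5 m apart, closest approaches 4.2 s apart, giving about 1.19 m/s.
t1 = datetime(2020, 7, 3, 10, 0, 0, 0)
t2 = datetime(2020, 7, 3, 10, 0, 4, 200000)
print(f"{walking_speed_two_cameras(5.0, t1, t2):.2f} m/s")
```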
(Modification)
The cognitive function evaluation system 10 may evaluate changes in the cognitive function of a plurality of pedestrians M included as subjects in an image. For example, when an image acquired by the camera 20 includes two pedestrians M as subjects, the authentication unit 34 simultaneously determines, for each of the two pedestrians M, whether the facial feature amount extracted from the image data matches a facial feature amount of any of the registered persons. When the facial feature amounts of both pedestrians M match facial feature amounts among the registered persons, the walking speed estimation unit 35 may estimate the walking speeds of the two pedestrians M in parallel.
Note that the walking speed estimation unit 35 does not have to process the two pedestrians simultaneously; it may estimate the walking speed of one pedestrian M immediately after estimating that of the other, and the processing order is not particularly limited.
In this case, abnormalities in the cognitive function of a plurality of pedestrians M can be detected and presented efficiently.
The number of pedestrians M included as subjects in the image may be three, four, five, or more, and is not particularly limited.
For example, the processing described in the above embodiments may be realized by centralized processing using a single device (system), or by distributed processing using a plurality of devices. The program may be executed by a single processor or by a plurality of processors; that is, either centralized processing or distributed processing may be performed.
For example, the cognitive function evaluation device 30 and the output device 40 may be formed as a single unit; this is not particularly limited.
The output device 40 may also be a portable information terminal of the pedestrian M or a stationary information terminal such as a personal computer.
This allows the pedestrian M to know of changes in cognitive function at any time.
The communication method between the devices described in the above embodiments is not particularly limited. When wireless communication is performed between the devices, the wireless communication method (communication standard) is, for example, short-range wireless communication such as ZigBee (registered trademark), Bluetooth (registered trademark), or a wireless LAN (Local Area Network). Wired communication may be performed between the devices instead of wireless communication; specifically, the wired communication is, for example, power line communication (PLC: Power Line Communication) or communication using a wired LAN.
The order of the plurality of processes in the flowcharts of the operation examples described in the above embodiments is an example. The order of the processes may be changed, and the processes may be executed in parallel. Processing executed by a particular processing unit may instead be executed by another processing unit.
In addition, forms obtained by applying various modifications conceivable by those skilled in the art to the embodiments, and forms realized by arbitrarily combining the components and functions of the embodiments without departing from the spirit of the present invention, are also included in the present invention.
For example, the cognitive function in the above embodiments may be read as walking function, and the present invention may be realized as a walking function evaluation device, a walking function evaluation system, a walking function evaluation method, or the like.
10 Cognitive function evaluation system
20 Camera
30 Cognitive function evaluation device
31 Image acquisition unit
32 Face area detection unit
33 Face area measurement unit
34 Authentication unit
35 Walking speed estimation unit
36 Recording unit
37 Reference data storage unit
38 Evaluation unit
39 Communication unit
40 Output device
41 Presentation unit
42 Operation reception unit
200 Database unit
201 Analysis unit
202 First camera
203 Second camera
M Pedestrian
R Maximum diameter
Claims (7)
- A walking function evaluation device comprising:
an image acquisition unit that acquires an image including a pedestrian as a subject;
a walking speed estimation unit that estimates a walking speed of the pedestrian based on the image acquired by the image acquisition unit;
a recording unit that records time-series data of the walking speed;
a reference data storage unit that stores reference data of the walking speed according to which the walking function is determined to have deteriorated;
an evaluation unit that evaluates the walking function of the pedestrian based on the time-series data and the reference data; and
a presentation unit that presents information on the walking function of the pedestrian evaluated by the evaluation unit.
- The walking function evaluation device according to claim 1, further comprising:
a face area detection unit that detects a face area of the pedestrian from the image;
a face area measurement unit that measures a size of the face area; and
an authentication unit that calculates a facial feature amount of the pedestrian from the face area and authenticates whether the facial feature amount matches a facial feature amount of any of a plurality of registered persons,
wherein, when the authentication unit determines that the facial feature amount of the pedestrian matches a facial feature amount of any of the plurality of registered persons,
the walking speed estimation unit estimates the walking speed of the pedestrian based on the size of the face area,
the reference data storage unit stores a predetermined threshold at or below which the walking function can be determined to have deteriorated,
the evaluation unit evaluates, for a predetermined period of the time-series data, the number of days on which the walking speed falls below the predetermined threshold, and evaluates a change in the walking function by comparing the number of days on which the walking speed falls below the predetermined threshold in a predetermined period earlier than the predetermined period with said number of days in the predetermined period, and
the presentation unit presents the time-series data of the walking speed and the change in the walking function of the pedestrian.
- The walking function evaluation device according to claim 1 or 2, wherein the presentation unit presents the change in the walking speed in a plurality of levels.
- A walking function evaluation system comprising a camera, a walking function evaluation device, and an output device, wherein
the camera captures an image including a walking pedestrian as a subject,
the walking function evaluation device includes:
an image acquisition unit that acquires the image captured by the camera;
a face area detection unit that detects a face area of the pedestrian from the image;
a face area measurement unit that measures a size of the face area;
an authentication unit that calculates a facial feature amount of the pedestrian from the face area and authenticates whether the feature amount matches a facial feature amount of any of a plurality of registered persons;
a walking speed estimation unit that estimates a walking speed of the pedestrian based on the size of the face area when the authentication unit determines that the facial feature amount of the pedestrian matches a facial feature amount of any of the plurality of registered persons;
a recording unit that records time-series data of the walking speed;
a reference data storage unit that stores a predetermined threshold at or below which the walking function can be determined to have deteriorated; and
an evaluation unit that evaluates, for a predetermined period of the time-series data, the number of days on which the walking speed falls below the predetermined threshold, and evaluates a change in the walking function by comparing the number of days on which the walking speed falls below the predetermined threshold in a predetermined period earlier than the predetermined period with said number of days in the predetermined period, and
the output device displays the time-series data of the walking speed and the change in the walking function.
- A walking function evaluation method comprising:
acquiring an image including a pedestrian as a subject;
detecting a face area of the pedestrian from the image;
measuring a size of the face area;
calculating a facial feature amount of the pedestrian from the face area and authenticating whether the feature amount matches a facial feature amount of any of a plurality of registered persons;
estimating, after it is determined that the facial feature amount of the pedestrian matches a facial feature amount of any of the plurality of registered persons, a walking speed of the pedestrian based on the size of the face area;
recording time-series data of the walking speed;
evaluating, using a predetermined threshold of the walking speed at which the walking function can be determined to have deteriorated, the number of days on which the walking speed falls below the predetermined threshold in a predetermined period of the time-series data, and evaluating a change in the walking function by comparing the number of days on which the walking speed falls below the predetermined threshold in a predetermined period earlier than the predetermined period with said number of days in the predetermined period; and
presenting the time-series data of the walking speed and the change in the walking function.
- A program for causing a computer to execute the walking function evaluation method according to claim 5.
- A cognitive function evaluation device comprising:
an image acquisition unit that acquires an image including a pedestrian as a subject;
a walking speed estimation unit that estimates a walking speed of the pedestrian based on the image acquired by the image acquisition unit;
a recording unit that records time-series data of the walking speed;
a reference data storage unit that stores reference data of the walking speed according to which the walking function is determined to have deteriorated;
an evaluation unit that evaluates the walking function of the pedestrian based on the time-series data and the reference data;
a presentation unit that presents information on the walking function of the pedestrian evaluated by the evaluation unit;
a face area detection unit that detects a face area of the pedestrian from the image;
a face area measurement unit that measures a size of the face area; and
an authentication unit that calculates a facial feature amount of the pedestrian from the face area and authenticates whether the facial feature amount matches a facial feature amount of any of a plurality of registered persons,
wherein, when the authentication unit determines that the facial feature amount of the pedestrian matches a facial feature amount of any of the plurality of registered persons,
the walking speed estimation unit estimates the walking speed of the pedestrian based on the size of the face area,
the reference data storage unit stores a predetermined threshold at or below which the walking function can be determined to have deteriorated,
the evaluation unit evaluates, for a predetermined period of the time-series data, the number of days on which the walking speed falls below the predetermined threshold, evaluates a change in the walking function by comparing the number of days on which the walking speed falls below the predetermined threshold in a predetermined period earlier than the predetermined period with said number of days in the predetermined period, and determines a decline in cognitive function from a count or cycle obtained, based on the evaluation result of the change in the walking function, by comparing the number of days below the predetermined threshold with said number of days in the predetermined period, and
the presentation unit presents the change in the cognitive function.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021533913A JP7129652B2 (en) | 2019-07-22 | 2020-07-03 | Walking function evaluation device, walking function evaluation system, walking function evaluation method, program, and cognitive function evaluation device |
CN202080037615.5A CN113853158B (en) | 2019-07-22 | 2020-07-03 | Walking function evaluation device, walking function evaluation system, walking function evaluation method, recording medium, and cognitive function evaluation device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019-134452 | 2019-07-22 | ||
JP2019134452 | 2019-07-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021014938A1 true WO2021014938A1 (en) | 2021-01-28 |
Family
ID=74192461
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/026208 WO2021014938A1 (en) | 2019-07-22 | 2020-07-03 | Walking ability evaluation device, walking ability evaluation system, walking ability evaluation method, program, and cognitive function evaluation device |
Country Status (3)
Country | Link |
---|---|
JP (1) | JP7129652B2 (en) |
CN (1) | CN113853158B (en) |
WO (1) | WO2021014938A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7007769B1 (en) | 2021-05-26 | 2022-01-25 | 株式会社to you | Network system, server, and walking speed measurement method |
WO2023275975A1 (en) * | 2021-06-29 | 2023-01-05 | 日本電気株式会社 | Cognitive function estimation device, cognitive function estimation method, and recording medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007241500A (en) * | 2006-03-07 | 2007-09-20 | Toshiba Corp | Face authentication device and face authentication method |
JP2009285077A (en) * | 2008-05-28 | 2009-12-10 | Sanyo Electric Co Ltd | Vital function measuring device, vital function measuring program and vital function measuring system |
WO2017104321A1 (en) * | 2015-12-17 | 2017-06-22 | 日本ロジックス株式会社 | Monitoring system and monitoring method |
JP2018099267A (en) * | 2016-12-20 | 2018-06-28 | 株式会社竹中工務店 | Movement quantity estimation device, movement quantity estimation program and movement quantity estimation system |
WO2019116830A1 (en) * | 2017-12-13 | 2019-06-20 | パナソニックIpマネジメント株式会社 | Cognitive function reduction determination system |
WO2019130674A1 (en) * | 2017-12-25 | 2019-07-04 | コニカミノルタ株式会社 | Abnormal behavior detection device, method, and system for nursing facility |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3950085B2 (en) * | 2003-06-10 | 2007-07-25 | 株式会社つくばマルチメディア | Map-guided omnidirectional video system |
JP2006099615A (en) * | 2004-09-30 | 2006-04-13 | Toshiba Corp | Face authentication device and entrance/exit management device |
JP2007127478A (en) * | 2005-11-02 | 2007-05-24 | Konica Minolta Holdings Inc | Device and method for speed detection of tracking subject |
JP2007289656A (en) * | 2006-03-28 | 2007-11-08 | Fujifilm Corp | Image recording apparatus, image recording method and image recording program |
CA2741510A1 (en) * | 2010-05-26 | 2011-11-26 | James H. Lacey | Door mountable camera surveillance device and method |
JP5766564B2 (en) * | 2011-09-15 | 2015-08-19 | 株式会社東芝 | Face authentication apparatus and face authentication method |
JP2013109051A (en) * | 2011-11-18 | 2013-06-06 | Glory Ltd | Electronic information providing system and electronic information providing method |
JP5782387B2 (en) * | 2012-01-05 | 2015-09-24 | 株式会社 日立産業制御ソリューションズ | Entrance / exit management system |
JP6168488B2 (en) * | 2012-08-24 | 2017-07-26 | パナソニックIpマネジメント株式会社 | Body motion detection device and electrical stimulation device including the same |
CN103106393B (en) * | 2012-12-12 | 2016-08-17 | 袁培江 | A kind of embedded human face identification intelligent identity authorization system based on robot platform |
JP2015070581A (en) * | 2013-09-30 | 2015-04-13 | Kddi株式会社 | Moving route estimation device, moving route estimation method, and computer program |
CN104268882A (en) * | 2014-09-29 | 2015-01-07 | 深圳市热活力科技有限公司 | High-speed moving object detecting and speed measuring method and system based on double-linear-array cameras |
JP6492890B2 (en) * | 2015-03-31 | 2019-04-03 | 富士通株式会社 | Information collection program, information collection method, and information collection apparatus |
WO2016199749A1 (en) * | 2015-06-10 | 2016-12-15 | コニカミノルタ株式会社 | Image processing system, image processing device, image processing method, and image processing program |
CN106709932B (en) * | 2015-11-12 | 2020-12-04 | 创新先进技术有限公司 | Face position tracking method and device and electronic equipment |
CN205230242U (en) * | 2015-12-04 | 2016-05-11 | 重庆财信合同能源管理有限公司 | Intelligent residential district security protection system |
JP2018029706A (en) * | 2016-08-23 | 2018-03-01 | 株式会社デジタル・スタンダード | Terminal device, evaluation system, and program |
KR20180035434A (en) * | 2016-09-29 | 2018-04-06 | 삼성전자주식회사 | Display apparatus and controlling method thereof |
JP2018114278A (en) * | 2017-01-19 | 2018-07-26 | パナソニックIpマネジメント株式会社 | Fall-while-walking prevention device, fall-while-walking prevention device controller and control method, and fall-while-walking prevention device control program |
CN107462741B (en) * | 2017-07-26 | 2019-12-31 | 武汉船用机械有限责任公司 | Moving object speed and acceleration measuring device |
JP6609818B2 (en) * | 2017-09-28 | 2019-11-27 | 有限会社起福 | Home management system, presence determination terminal and computer program used therefor |
CN109522846B (en) * | 2018-11-19 | 2020-08-14 | 深圳博为教育科技有限公司 | Standing monitoring method, device, server and standing monitoring system |
- 2020-07-03 JP JP2021533913A patent/JP7129652B2/en active Active
- 2020-07-03 CN CN202080037615.5A patent/CN113853158B/en active Active
- 2020-07-03 WO PCT/JP2020/026208 patent/WO2021014938A1/en active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007241500A (en) * | 2006-03-07 | 2007-09-20 | Toshiba Corp | Face authentication device and face authentication method |
JP2009285077A (en) * | 2008-05-28 | 2009-12-10 | Sanyo Electric Co Ltd | Vital function measuring device, vital function measuring program and vital function measuring system |
WO2017104321A1 (en) * | 2015-12-17 | 2017-06-22 | 日本ロジックス株式会社 | Monitoring system and monitoring method |
JP2018099267A (en) * | 2016-12-20 | 2018-06-28 | 株式会社竹中工務店 | Movement quantity estimation device, movement quantity estimation program and movement quantity estimation system |
WO2019116830A1 (en) * | 2017-12-13 | 2019-06-20 | パナソニックIpマネジメント株式会社 | Cognitive function reduction determination system |
WO2019130674A1 (en) * | 2017-12-25 | 2019-07-04 | コニカミノルタ株式会社 | Abnormal behavior detection device, method, and system for nursing facility |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7007769B1 (en) | 2021-05-26 | 2022-01-25 | 株式会社to you | Network system, server, and walking speed measurement method |
JP2022181371A (en) * | 2021-05-26 | 2022-12-08 | 株式会社to you | Network system, server, and walking speed measuring method |
WO2023275975A1 (en) * | 2021-06-29 | 2023-01-05 | 日本電気株式会社 | Cognitive function estimation device, cognitive function estimation method, and recording medium |
Also Published As
Publication number | Publication date |
---|---|
CN113853158B (en) | 2024-05-14 |
JP7129652B2 (en) | 2022-09-02 |
CN113853158A (en) | 2021-12-28 |
JPWO2021014938A1 (en) | 2021-01-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20190216333A1 (en) | Thermal face image use for health estimation | |
WO2021014938A1 (en) | Walking ability evaluation device, walking ability evaluation system, walking ability evaluation method, program, and cognitive function evaluation device | |
JP6706809B2 (en) | Cognitive function evaluation device, cognitive function evaluation system, cognitive function evaluation method, and program | |
KR20170132644A (en) | Method for obtaining care information, method for sharing care information, and electronic apparatus therefor | |
US20130251197A1 (en) | Method and a device for objects counting | |
WO2016088413A1 (en) | Information processing device, information processing method, and program | |
CN109512384A (en) | Scalp detection device | |
CN103905727A (en) | Object area tracking apparatus, control method, and program of the same | |
US20210407107A1 (en) | Target association using occlusion analysis, clustering, or both | |
US20160239769A1 (en) | Methods for determining manufacturing waste to optimize productivity and devices thereof | |
JP6784044B2 (en) | Behavior analysis device, behavior analysis method and program | |
CN113435353A (en) | Multi-mode-based in-vivo detection method and device, electronic equipment and storage medium | |
TWI671707B (en) | Image analysis method, electronic system and non-transitory computer-readable recording medium | |
US20140297221A1 (en) | Locating system and locating method using pressure sensor | |
US20190370976A1 (en) | Information processing device, imaging device, information processing method, and storage medium | |
TWI631480B (en) | Entry access system having facil recognition | |
CN111625794B (en) | Recording method, operation control module, household appliance, system and storage medium | |
US20230252874A1 (en) | Shadow-based fall detection | |
US20230111865A1 (en) | Spatial motion attention for intelligent video analytics | |
JP7484569B2 (en) | Image processing device, image processing method, and program | |
WO2017154477A1 (en) | Pulse estimating device, pulse estimating system, and pulse estimating method | |
KR20200145259A (en) | Method for processing diagnostic information about oral conditions | |
JP7166106B2 (en) | Installation environment estimation device and program | |
EP3839909A1 (en) | Detecting the presence of an object in a monitored environment | |
JP2010015466A (en) | Visitor management system |
Legal Events
Date | Code | Title | Description |
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20844163; Country of ref document: EP; Kind code of ref document: A1
 | ENP | Entry into the national phase | Ref document number: 2021533913; Country of ref document: JP; Kind code of ref document: A
 | NENP | Non-entry into the national phase | Ref country code: DE
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 20844163; Country of ref document: EP; Kind code of ref document: A1