WO2022208586A1 - Notification device, notification system, notification method, and non-transitory computer-readable medium - Google Patents
Notification device, notification system, notification method, and non-transitory computer-readable medium
- Publication number
- WO2022208586A1 (PCT/JP2021/013217)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- notification
- accident
- information
- notification target
- person
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/0116—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/41—Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/04—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
- G08B21/0438—Sensor means for detecting
- G08B21/0476—Cameras to detect unsafe condition, e.g. video cameras
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/10—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using wireless transmission systems
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/20—Monitoring the location of vehicles belonging to a group, e.g. fleet of vehicles, countable or determined number of vehicles
- G08G1/202—Dispatching vehicles on the basis of a location, e.g. taxi dispatching
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/20—Monitoring the location of vehicles belonging to a group, e.g. fleet of vehicles, countable or determined number of vehicles
- G08G1/205—Indicating the location of the monitored vehicles as destination, e.g. accidents, stolen, rental
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/12—Messaging; Mailboxes; Announcements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/90—Services for handling of emergency or hazardous situations, e.g. earthquake and tsunami warning systems [ETWS]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/08—Detecting or categorising vehicles
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/006—Alarm destination chosen according to type of event, e.g. in case of fire phone the fire service, in case of medical emergency phone the ambulance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/52—Network services specially adapted for the location of the user terminal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/55—Push-based network services
Definitions
- The present invention relates to a notification device, a notification system, a notification method, and a non-transitory computer-readable medium.
- Patent Document 1 discloses a recording and analysis system in which, when input sound data is determined to originate from a traffic accident, the sound data and video data are temporarily recorded in a data recording device, and the traffic accident is classified and analyzed based on the recorded data.
- Patent Document 2 discloses a content generation device that, when an emergency request is received from a caller's terminal, selects content candidates to be sent to the caller based on the condition of the patient who is the subject of the emergency.
- Patent Document 3 discloses a position reporting system in which, when a report is received that a specific person has a medical problem, the current location of that person is identified and registered users in the area around that location are notified of it.
- An object of the present disclosure is to provide a notification device, a notification system, a notification method, and a non-transitory computer-readable medium that can realize a prompt initial response to an analysis target person such as an injured person or an emergency patient.
- In one aspect, a notification device includes accident occurrence detection means for detecting the occurrence of an accident from video captured by one or more cameras; analysis means for analyzing the video and deriving detailed information about a person to be analyzed related to the accident; determination means for determining a coping method for the person to be analyzed based on the detailed information; and notification means for notifying a notification target of the determined coping method.
- In another aspect, a notification system includes one or more cameras, a notification device connected to the one or more cameras, and one or more terminals that receive notifications from the notification device. The notification device includes accident occurrence detection means for detecting the occurrence of an accident from video captured by the one or more cameras; analysis means for analyzing the video and deriving detailed information about a person to be analyzed related to the accident; determination means for determining a coping method for the person to be analyzed based on the detailed information; and notification means for notifying the one or more terminals of the determined coping method.
- In another aspect, a notification method includes an accident occurrence detection step of detecting the occurrence of an accident from video captured by one or more cameras; an analysis step of analyzing the video and deriving detailed information about a person to be analyzed related to the accident; a determination step of determining a coping method for the person to be analyzed based on the detailed information; and a notification step of notifying a notification target of the determined coping method.
- In another aspect, a non-transitory computer-readable medium stores a program that causes a computer to execute an accident occurrence detection step of detecting the occurrence of an accident from video captured by one or more cameras; an analysis step of analyzing the video and deriving detailed information about a person to be analyzed related to the accident; a determination step of determining a coping method for the person to be analyzed based on the detailed information; and a notification step of notifying a notification target of the determined coping method.
- According to the present disclosure, it is possible to provide a notification device capable of realizing a prompt initial response to an analysis target person such as an injured person or an emergency patient.
- FIG. 1 is a block diagram showing an example of a notification device according to the first embodiment.
- FIG. 2 is a flow chart showing an example of processing executed by the notification device according to the first embodiment.
- FIG. 3 is a schematic diagram showing an example of a notification system according to the first embodiment.
- FIG. 4 is a schematic diagram showing an example of a notification system according to the second embodiment.
- FIG. 5 is a block diagram showing an example of a camera according to the second embodiment.
- FIG. 6 is a schematic diagram showing an example of a road provided with cameras according to the second embodiment.
- FIG. 7 is a block diagram showing an example of an MEC server according to the second embodiment.
- FIG. 8 is a block diagram showing an example of a notification server according to the second embodiment.
- FIG. 9 is a block diagram showing an example of a terminal according to the second embodiment.
- FIG. 10 is a sequence diagram showing an example of typical processing of the notification system according to the second embodiment.
- FIG. 11 is a flow chart showing an example of processing executed by a notification server according to the second embodiment.
- FIG. 12 is a schematic diagram showing an example of a road provided with cameras according to the second embodiment.
- FIG. 13 is a schematic diagram showing an example of a notification system according to the third embodiment.
- FIG. 14 is a block diagram showing an example of a notification server according to the third embodiment.
- FIG. 15 is a block diagram showing an example of a terminal according to the third embodiment.
- FIG. 16 is a schematic diagram showing an example of a notification system according to the fourth embodiment.
- FIG. 17 is a block diagram showing an example of a notification server according to the fourth embodiment.
- FIG. 18 is a block diagram showing an example of a terminal according to the fourth embodiment.
- FIG. 19 is a schematic diagram showing a configuration example of a cell of a base station according to the fifth embodiment.
- FIG. 20 is a block diagram showing an example of a hardware configuration of an apparatus according to each embodiment.
- Embodiment 1 of the present disclosure will be described below with reference to the drawings.
- In this embodiment, a notification device that detects the occurrence of an accident and notifies a notification target will be described.
- FIG. 1 is a block diagram showing an example of a notification device.
- The notification device 10 includes an accident occurrence detection unit 101, an analysis unit 102, a determination unit 103, and a notification unit 104. Each unit (each means) of the notification device 10 is controlled by a controller (not shown). Each component will be described below.
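As a non-limiting illustration, the four units described above can be sketched as a simple processing pipeline. All class, method, and field names below are hypothetical and are not taken from the disclosure; each method is a trivial stand-in for the corresponding unit:

```python
from dataclasses import dataclass, field

@dataclass
class NotificationDevice:
    """Illustrative sketch of the notification device 10 (hypothetical names)."""
    notified: list = field(default_factory=list)

    def detect_accident(self, frames):   # accident occurrence detection unit 101
        return any(f.get("collision") for f in frames)

    def analyze(self, frames):           # analysis unit 102: derive detailed info
        return {"bleeding": frames[-1].get("bleeding", False)}

    def determine(self, details):        # determination unit 103: coping method
        return "compression hemostasis" if details["bleeding"] else "observe"

    def notify(self, target, coping):    # notification unit 104
        self.notified.append((target, coping))

    def run(self, frames, target):
        if self.detect_accident(frames):
            self.notify(target, self.determine(self.analyze(frames)))

device = NotificationDevice()
device.run([{"collision": True, "bleeding": True}], "emergency_service")
print(device.notified)  # [('emergency_service', 'compression hemostasis')]
```

The point of the sketch is only the data flow: video in, accident detection, detailed information, coping method, notification out.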
- The accident occurrence detection unit 101 detects the occurrence of an accident from video captured by one or more cameras.
- Here, the term "accident" means an incident, whether accidental or intentional, that results in an injured person or a suddenly ill person. This includes, but is not limited to, vehicle-to-vehicle accidents and the sudden deterioration of a medical condition.
- The accident occurrence detection unit 101 may also determine the type of accident, such as a collision accident, a fire, or the aggravation of a chronic disease.
- The "video" may be either moving image data or still image data.
- The one or more cameras may be installed at any location.
- When a plurality of cameras are used, they may photograph a predetermined place from different directions, or they may photograph different places.
- The accident occurrence detection unit 101 detects that an accident has occurred at one or more locations by analyzing the video captured by the one or more cameras. Any combination of one or more types of cameras, such as visible-light cameras, infrared cameras, and thermal cameras, may be used.
- The accident occurrence detection unit 101 may identify the position of the accident site based on at least one of the position of the camera determined to have captured the accident and the location of the accident within the video data.
- The analysis unit 102 analyzes the video in which the occurrence of the accident was detected, thereby deriving detailed information about an injured person or a suddenly ill person resulting from the accident (hereinafter referred to as the person to be analyzed). Note that, instead of or in addition to the video in which the occurrence of the accident was detected, the analysis unit 102 may analyze video whose shooting range includes the position of the accident site and derive the detailed information of the person to be analyzed from that video.
- The analysis unit 102 derives this detailed information based on video data captured by the camera during the period from around the time the accident occurrence detection unit 101 detects the occurrence of an accident until the analysis processing is performed, or based on video captured by the camera at one or more timings.
- The detailed information indicates, for example, at least one of information relating to the physical condition of the person to be analyzed and personal information of the person to be analyzed, but is not limited to these.
- The information about the physical condition of the person to be analyzed includes, for example, the consciousness state, breathing state, bleeding state, fracture state, burn state, impact site, convulsion state, walking state, heartbeat state, pulse state, and body temperature state of the person to be analyzed.
- The consciousness state indicates at least one of the presence or absence of consciousness and the degree of clouding of consciousness.
- The breathing state indicates at least one of the presence or absence of breathing and the degree of normality.
- The bleeding state indicates at least one of the presence or absence, location, and degree of bleeding.
- The fracture state indicates at least one of the presence or absence, location, and degree of a fracture.
- The burn state indicates at least one of the presence or absence, location, and degree of burns.
- The impact site indicates at least one of the location and degree of a blow.
- The convulsion state indicates at least one of the presence or absence and degree of convulsions.
- The walking state indicates at least one of the degree of unsteadiness and the degree of normality in walking.
- The heartbeat state indicates at least one of the presence or absence of a heartbeat, the heart rate, and the degree of normality.
- The pulse state indicates at least one of the presence or absence of a pulse, the pulse rate, and the degree of normality.
- The body temperature state may indicate at least one of the presence or absence of fever, the temperature, and its location.
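The body-state items listed above can be gathered into a simple record type. The following is an illustrative sketch only; the field names are not taken from the disclosure, and a real implementation would track many more items:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BodyState:
    """Illustrative container for a subset of the body-state items above."""
    conscious: Optional[bool] = None      # presence or absence of consciousness
    breathing: Optional[bool] = None      # presence or absence of breathing
    bleeding_site: Optional[str] = None   # location of bleeding, if observed
    fracture_site: Optional[str] = None   # location of a fracture, if observed
    heart_rate: Optional[int] = None      # heartbeats per minute
    body_temp_c: Optional[float] = None   # body temperature in Celsius

    def observed_items(self):
        """Return only the items the video analysis actually determined."""
        return {k: v for k, v in self.__dict__.items() if v is not None}

state = BodyState(conscious=False, bleeding_site="left arm", heart_rate=110)
print(state.observed_items())
# {'conscious': False, 'bleeding_site': 'left arm', 'heart_rate': 110}
```

Using `None` for undetermined items lets downstream logic distinguish "not observed" from "observed and absent", which matters when the determination unit weighs which coping methods apply.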
- The analysis unit 102 may derive this detailed information directly or indirectly from the video of the person to be analyzed.
- For example, the analysis unit 102 can derive detailed information by directly determining the bleeding state from the appearance of bleeding of the person to be analyzed in the video.
- Alternatively, the analysis unit 102 can derive detailed information by indirectly estimating, for example from the walking state of the person to be analyzed in the video, that the person has a broken leg or a bruise.
- The analysis unit 102 may derive this detailed information using not only the video of the person to be analyzed but also other video showing how the accident occurred. For example, when the camera that captured the person to be analyzed, or another camera, captures objects scattered by an explosion accident, the analysis unit 102 can determine from the video that the person to be analyzed was hit by the blast or by an object thrown by the blast, and can infer that the person has at least one of bleeding, a fracture, and a bruise.
- The analysis unit 102 may further derive this detailed information using other information, such as the type of accident determined by the accident occurrence detection unit 101 and pre-registered attributes of the area in which the camera that captured the accident is installed. For example, when the accident occurrence detection unit 101 determines that the accident is a fire, the analysis unit 102 can take this information into account in the video analysis and determine whether the person to be analyzed has burns.
- The analysis unit 102 may control the camera that captures the video in order to derive accurate detailed information.
- For example, the camera may be controlled to increase the imaging magnification, the resolution, the frame rate, the number of quantization bits, the bit rate, and the like.
- Also, the camera angle may be controlled so that the person to be analyzed, or a specific part of that person, is included in the shooting range.
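The camera control described above amounts to raising capture parameters when an accident is under analysis. The following sketch assumes a hypothetical settings dictionary; the parameter names and the uniform scaling factor are illustrative, not part of the disclosure:

```python
# Default capture settings for a hypothetical camera (illustrative values).
DEFAULTS = {"magnification": 1.0, "resolution": (1280, 720),
            "frame_rate": 30, "bit_rate_kbps": 4000}

def boost_capture(settings, factor=2):
    """Return new settings with magnification, frame rate, and bit rate raised,
    as the analysis unit might request while deriving detailed information."""
    out = dict(settings)  # leave the original settings untouched
    for key in ("magnification", "frame_rate", "bit_rate_kbps"):
        out[key] = settings[key] * factor
    return out

boosted = boost_capture(DEFAULTS)
print(boosted["frame_rate"], boosted["bit_rate_kbps"])  # 60 8000
```

The original settings could be restored once the analysis completes, so the higher bit rate is only paid for the duration of the incident.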
- The analysis unit 102 may identify the person to be analyzed from among pre-registered persons (registrants) based on the face image of the person appearing in the video of the accident and the face information of the registrants, and may derive personal information associated with the identified registrant as detailed information.
- The personal information includes, but is not limited to, at least one of past medical history, current medical history, pregnancy status, age, sex, blood type, allergy information, personal number, family medical institution, and emergency contact information.
- The analysis unit 102 may itself perform face authentication using the face image of the person to be analyzed and the face information of the registrants.
- Alternatively, the analysis unit 102 may identify the person to be analyzed from among the registrants by acquiring the result of face authentication performed by a device other than the notification device 10.
- In that case, the personal information stored in the other device may be acquired by the analysis unit 102 together with the authentication result.
- The analysis unit 102 may also authenticate using an iris, which is a type of physical feature, or gait, which is a type of behavioral feature, or may arbitrarily combine two or more biometric authentication techniques for authentication.
- The determination unit 103 determines a coping method for the person to be analyzed appearing in the video.
- The coping method includes at least one of first aid, medical treatment, and other actions for the injury or illness of the person to be analyzed.
- Here, "first aid" refers to lifesaving measures and emergency care for the person to be analyzed that can be performed by ordinary people, while "medical treatment" refers to a medical procedure for the person to be analyzed that can be performed by specialists such as doctors and emergency personnel.
- "First aid" and "medical treatment" for a specific injury or illness may be completely different responses, or their contents may at least partly overlap.
- "Other actions" include at least one of requesting an ambulance, requesting that staff be dispatched to the place where the accident occurred (the accident site), requesting acceptance of the person to be analyzed, preparing for treatment, and contacting a predetermined contact, but are not limited to these.
- "Preparation for treatment" includes, for example, at least one of preparing for a blood transfusion, preparing the necessary medical equipment, and securing medical staff and a treatment location (for example, a hospital room, operating room, or examination room), but is not limited to these.
- The predetermined contact includes, for example, at least one of relatives such as family members, friends, the workplace, a specific medical institution or nursing institution, and medical or nursing staff, but is not limited to these.
- The determination unit 103 can determine a suitable coping method based on the detailed information of the person to be analyzed derived by the analysis unit 102.
- The determination unit 103 may determine the coping method for the person to be analyzed based on the personal information. For example, the determination unit 103 may determine, as the coping method, at least one of first aid, medical treatment, and other actions based on at least one of the past medical history, current medical history, pregnancy status, age, sex, blood type, and allergy information of the person to be analyzed.
- For example, when the personal information indicates a history of heart disease, the determination unit 103 determines that the person to be analyzed needs a coping method for a heart attack: it decides to request an ambulance, to request acceptance of the person by a medical institution (in particular, one specializing in cardiac care), and to designate an AED (Automated External Defibrillator) as a medical device to be used for first aid or treatment. Furthermore, the determination unit 103 may determine the coping method in consideration of the location information of the accident.
- For example, the determination unit 103 may compare the location information of the accident site with the location information of places where AEDs are installed, identify an AED near the accident site, and include its location in the coping method.
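A nearest-AED lookup of the kind just described can be sketched as below. The coordinates, site names, and helper functions are illustrative only; the disclosure does not specify a distance formula, so a standard haversine great-circle distance is assumed here:

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) pairs in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))  # Earth radius ~6371 km

def nearest_aed(accident_pos, aed_sites):
    """Pick the registered AED installation site closest to the accident site."""
    return min(aed_sites, key=lambda s: haversine_km(accident_pos, s["pos"]))

aeds = [{"name": "station", "pos": (35.690, 139.700)},
        {"name": "city hall", "pos": (35.689, 139.692)}]
site = nearest_aed((35.6895, 139.6917), aeds)
print(site["name"])  # city hall
```

The returned site's location can then be included in the coping method notified to people near the accident site.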
- The determination unit 103 may also determine, as a coping method, to contact a predetermined contact such as the person's family medical institution or emergency contact.
- The coping method includes at least one item of content for each notification destination.
- The coping method for each notification destination may include any one of first aid, medical treatment, and other actions, or may include a plurality of them.
- The coping method determined by the determination unit 103 may be the same regardless of the notification destination, or may differ depending on the attributes of the notification destination. Details will be described in the second embodiment.
- The notification unit 104 notifies one or more notification targets of the coping method determined by the determination unit 103.
- The notification target may be a notification destination stored in the notification device 10 in advance. Pre-stored notification destinations include, but are not limited to, public or private emergency services and health facilities.
- The notification unit 104 may notify the same notification destination for every person to be analyzed, or may select the notification target from a plurality of registered destinations based on at least one of the detailed information derived by the analysis unit 102 and the coping method determined by the determination unit 103.
- For example, when the coping method concerns a heart attack, the notification unit 104 can notify, as pre-stored notification destinations, an emergency service and a medical institution (in particular, a medical institution specializing in cardiac care). Further, when the person to be analyzed is a pre-registered registrant, the notification unit 104 may determine the notification target based on the personal information associated with that registrant. For example, the notification unit 104 may notify the family medical institution and the emergency contact included in the personal information. The notification unit 104 can determine such notification targets instead of, or in addition to, the pre-stored notification destinations.
- the notification unit 104 may notify a coping method to a notification target existing in a peripheral area (first area) including the accident site.
- Notification targets existing in the first area may be, for example, medical personnel, non-medical personnel, and medical institutions. Details of this will be described later in the second embodiment.
- the notification unit 104 may notify the location information of the accident site identified by the accident occurrence detection unit 101 (that is, the location information of the person to be analyzed) along with the coping method. Also, the notification executed by the notification unit 104 can take any form. For example, the notification unit 104 may notify an application on the terminal of the notification target, or may notify by a message such as SMS (Short Message Service) or MMS (Multimedia Messaging Service), by e-mail, or by telephone.
- At least one of the accident occurrence detection unit 101, the analysis unit 102, and the determination unit 103 may use a machine-learned model for the detection of the occurrence of an accident, the derivation of detailed information (for example, information related to the body condition of the person to be analyzed), and the determination of coping methods.
- the accident detection unit 101 uses an accident detection model machine-learned so that a video is input as input data and a determination result as to the presence or absence of an accident is generated as output data.
- the presence or absence of an accident may be detected by inputting the video captured by a camera into this model.
- the analysis unit 102 uses a detailed information analysis model machine-learned so that video is input as input data and detailed information is generated as output data. Detailed information may be derived by inputting the video into this model.
- the determination unit 103 uses a coping method model machine-learned so that detailed information is input as input data and a coping method is generated as output data, and inputs the detailed information derived by the analysis unit 102 into this model to determine the coping method.
- each of the models described above may be provided inside the notification device 10 or may be provided outside it.
- a model may be provided for each process of the accident occurrence detection unit 101, the analysis unit 102, and the determination unit 103, or one model may be used in common for a plurality of processes (for example, the accident occurrence detection unit 101 and the analysis unit 102).
- the accident occurrence detection unit 101, the analysis unit 102, and the determination unit 103 may execute each process by a method other than this.
- a table in which detailed information and coping methods are associated with each other may be prepared; after the detailed information is derived by the analysis unit 102, the determination unit 103 may refer to the table to determine the coping method appropriate for that detailed information.
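- As a minimal sketch, the table-based determination described above could look like the following; the condition keys, instructions, and fallback message are illustrative assumptions, not values prescribed by this disclosure.

```python
# Sketch of a table associating detailed information with coping methods.
# The conditions and instructions below are hypothetical examples.
COPING_TABLE = {
    ("bleeding", "severe"): "Apply direct pressure to the wound and stop the bleeding.",
    ("fracture", "suspected"): "Immobilize the injured limb and avoid moving the person.",
    ("cardiac", "no_pulse"): "Start chest compressions and locate the nearest AED.",
}

def determine_coping_method(detailed_info):
    """Look up the coping method appropriate for the derived detailed information."""
    key = (detailed_info["condition"], detailed_info["degree"])
    # Fall back to a generic instruction when no table entry matches.
    return COPING_TABLE.get(key, "Keep the person safe and wait for emergency services.")
```

For example, detailed information indicating severe bleeding would map to the direct-pressure instruction, while an unknown combination falls through to the generic instruction.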
- FIG. 2 is a flowchart showing an example of typical processing of the notification device 10, and the processing of the notification device 10 will be explained with this flowchart.
- the details of the processing executed by each unit below are as described above.
- the accident detection unit 101 detects the occurrence of an accident from images captured by one or more cameras (step S11; accident detection step).
- the analysis unit 102 analyzes the video and derives detailed information about the person to be analyzed involved in the accident detected by the accident occurrence detection unit 101 (step S12; analysis step).
- the determination unit 103 determines how to deal with the person to be analyzed (step S13; determination step).
- the notification unit 104 notifies the notification target of the coping method determined in step S13 (step S14; notification step).
- the notification device 10 can operate as described above.
- the notification device 10 can analyze the person to be analyzed, determine a coping method, and notify the notification target, so that a prompt initial response to the person to be analyzed can be realized.
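- The flow of steps S11 to S14 above can be sketched as follows; the detector, analyzer, determiner, and notifier callables stand in for the accident occurrence detection unit 101 through the notification unit 104 and are assumptions for illustration, not the actual implementations.

```python
# Sketch of the S11-S14 flow of FIG. 2. Each callable is a stand-in for
# one of the units 101-104; their concrete implementations are assumed.
def run_notification_flow(video, detect_accident, analyze, determine, notify):
    if not detect_accident(video):            # step S11: detect accident occurrence
        return None                           # no accident: nothing to notify
    detailed_info = analyze(video)            # step S12: derive detailed information
    coping_method = determine(detailed_info)  # step S13: determine coping method
    notify(coping_method)                     # step S14: notify the notification target
    return coping_method
```

When no accident is detected in step S11, the later steps are skipped entirely, which matches the flowchart's early exit.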
- the notification device 10 may have a centralized configuration composed of a single computer, or a distributed configuration in which a plurality of computers share and execute the processing of the accident occurrence detection unit 101 through the notification unit 104. In a distributed configuration, the devices may be connected via a communication network such as a LAN (Local Area Network), a WAN (Wide Area Network), or the Internet. An example of this will be described later in the second embodiment.
- FIG. 3 is a block diagram showing an example of a notification system.
- the notification system S1 includes a notification device 10, cameras 11A and 11B, and terminals 12A and 12B. Since the description of the notification device 10 is as shown in (1A), the description is omitted.
- the cameras 11A and 11B (hereinafter collectively referred to as the cameras 11) shoot an arbitrary location and transmit the shot image data to the notification device 10.
- the camera 11 transmits video data through a wired line, but the video data may be transmitted through a wireless line.
- the camera 11 and the notification device 10 may be directly connected, or may be connected via a communication network or the like.
- Accident occurrence detection unit 101 detects the occurrence of an accident from the video captured by camera 11 . The details are as shown in (1A).
- the analysis unit 102, the determination unit 103, and the notification unit 104 also perform the processing shown in (1A). Note that the output destinations of the notification from the notification unit 104 are the terminals 12A and 12B.
- the terminals 12A and 12B receive the coping method from the notification unit 104 of the notification device 10.
- the coping method is received by the wireless lines RA and RB, but the coping method may be transmitted by a wired line.
- the terminal 12 may be any information terminal capable of communication, such as a mobile phone such as a smart phone, a personal computer, a personal digital assistant, a car navigation system, an in-vehicle communication terminal, and the like.
- the camera 11 is provided at at least one of a traffic light, a roadside unit, an intersection, and a railroad crossing, and can photograph that location.
- the traffic light may be one intended for any of automobiles, two-wheeled vehicles, bicycles, pedestrians, or railways.
- an intersection means a place where two or more roads intersect.
- this "road" may be a road for automobiles, or a road for bicycles or pedestrians. Since these locations are places where accidents related to automobiles, railways, and the like are assumed to occur, in the event of an accident the camera 11 can photograph the accident and transmit the video data to the notification device 10, and the notification device 10 can notify how to deal with the person injured in the accident. Therefore, a prompt response to the person injured in the accident can be realized.
- the camera 11 may be installed in any other outdoor location (for example, locations along roads or railroad tracks other than those mentioned above, or airports and nearby locations), or in any indoor location. By installing the camera 11 in this way at a place where an accident is thought likely to occur, or where there is some other danger, a prompt response to the person injured in an accident can be realized.
- Embodiment 2 (2A) Embodiment 2 of the present disclosure will be described below with reference to the drawings.
- (2A) a specific example of the first embodiment (1B) described above will be described.
- FIG. 4 is a schematic diagram showing an example of a notification system.
- the notification system S2 includes cameras 20A-20N, an MEC (Mobile Edge Computing or Multi-access Edge Computing) server 21, a notification server 22, and terminals 23A-23D.
- the cameras 20A to 20N and terminals 23A to 23D are collectively referred to as camera 20 and terminal 23, respectively.
- the camera 20 and the MEC server 21 may be connected via a wired line, may be connected via a wireless line such as LTE (Long Term Evolution), 5G (5th Generation), or wireless LAN, or may be connected via a combination of wired and wireless lines.
- the MEC server 21 and the notification server 22 may likewise be connected by a wired line, by a wireless line such as LTE, 5G, or wireless LAN, or by a combination of wired and wireless lines. The MEC server 21 and the notification server 22 represent a configuration in which a plurality of computers share and execute the processing of the accident occurrence detection unit 101 through the notification unit 104.
- the notification server 22 and the terminal 23 communicate via wireless lines RA to RD such as LTE, 5G, and wireless LAN, but are not limited to this; notifications may be received via wired lines, or via both wired and wireless lines.
- Each unit of each device is controlled by a control unit (controller) in the device (not shown) reading a program. The configuration and processing of each device will be described below.
- FIG. 5 is a block diagram showing an example of the camera 20.
- the camera 20 corresponds to the camera 11 of (1B), and includes a photographing unit 201, a transmitting/receiving unit (transceiver) 202, and a storage unit 203. Each component of the camera 20 will be described below.
- the photographing unit 201 has a lens, an image sensor, and the like as its hardware configuration, and captures video. However, the images captured and transmitted by the camera 20 may instead be still images captured at predetermined intervals.
- the imaging unit 201 may have a function of controlling one or more of imaging magnification, resolution, frame rate, quantization bit number, bit rate, and camera angle. Note that this video data may also include time information at the time of shooting.
- the transmitting/receiving unit 202 transmits the video data to the MEC server 21 together with identification information of the camera 20 (for example, information such as the ID of the own camera and position information).
- the camera's own ID may be an arbitrary string of characters or numbers assigned to each camera without duplication, or may be communication address information such as an IP (Internet Protocol) address.
- the storage unit 203 stores identification information for transmission by the transmission/reception unit 202 . Note that this identification information can be appropriately changed according to the movement of the camera 20 or the like.
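- The transmission of video data together with identification information by the transmitting/receiving unit 202 might look like the following sketch; the header field names and the newline-delimited layout are illustrative assumptions, not a format specified by this disclosure.

```python
import json
import time

# Sketch of packaging a video segment with the camera's identification
# information (ID and position) before sending it to the MEC server 21.
# Field names and the newline-delimited layout are illustrative assumptions.
def build_camera_payload(camera_id, position, video_bytes, captured_at=None):
    header = {
        "camera_id": camera_id,                     # unique per-camera ID
        "position": list(position),                 # (latitude, longitude) of the camera
        "captured_at": captured_at or time.time(),  # shooting time information
        "video_size": len(video_bytes),
    }
    return json.dumps(header).encode("utf-8") + b"\n" + video_bytes

def parse_camera_payload(payload):
    """Split a received payload back into its header and video bytes."""
    header_bytes, video_bytes = payload.split(b"\n", 1)
    return json.loads(header_bytes), video_bytes
```

Because the header carries the camera's position, a receiving server can identify the accident site even when it stores no per-camera position table.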
- FIG. 6 is a schematic diagram showing an example of a road on which cameras 20 are provided.
- cameras 20A and 20B are installed at the traffic light of intersection C1
- cameras 20C and 20D are installed at the traffic light of intersection C2
- cameras 20E and 20F are installed at the traffic light of intersection C3.
- the cameras 20A and 20B are provided at traffic lights at different positions at the same intersection C1. Therefore, both of them transmit to the MEC server 21 video data of the intersection C1 photographed from different positions and angles.
- the camera 20A includes the intersection C1 and the surroundings of the camera 20B in its photographing range
- the camera 20B includes the intersection C1 and the surroundings of the camera 20A in its photographing range.
- cameras 20C and 20D transmit video data of intersection C2 captured from different positions and angles to the MEC server 21, and cameras 20E and 20F transmit video data of intersection C3 captured from different positions and angles to the MEC server 21. Also, since the intersection C2 is adjacent to the intersection C1, the intersection C2 is included in a later-described region R1 that includes the intersection C1.
- an accident TR occurs at intersection C1. Therefore, the cameras 20A and 20B transmit video data of the accident TR to the MEC server 21. Further, the camera 20A photographs the medical personnel NA at the time the accident occurs, and the camera 20B photographs the non-medical personnel NB at that time. Furthermore, the cameras 20C and 20D photograph the vehicle NC owned by the hospital C at the time the accident occurs. In addition, a medical staff member NE, a non-medical staff member NF, and a vehicle NG owned by a non-medical staff member G are present around the intersection C3. At the time the accident occurs, the cameras 20E and 20F photograph the medical personnel NE and the non-medical personnel NF, and the camera 20E photographs the vehicle NG.
- FIG. 7 is a block diagram showing an example of the MEC server 21.
- the MEC server 21 corresponds to part of the notification device 10 of (1B), and in this example, includes an accident detection unit 211 , a transmission/reception unit (transceiver) 212 and a storage unit 213 .
- Each component of the MEC server 21 will be described below.
- the accident occurrence detection unit 211 corresponds to the accident occurrence detection unit 101 of (1A) and detects the occurrence of an accident from the video data transmitted from each camera 20A to 20N.
- Accident occurrence detection unit 211 analyzes the images captured by cameras 20A to 20N to detect that an accident has occurred at time t in the area captured by cameras 20A and 20B, that is, intersection C1.
- the accident occurrence detection unit 211 identifies the position of the accident site based on the positions of the cameras 20A and 20B that are determined to have photographed the occurrence of the accident and the range of the accident occurrence in the video data.
- the accident occurrence detection unit 211 may identify the positions of the cameras 20A and 20B by referring to the position information of those cameras stored in the storage unit 213 (described later), based on the IDs transmitted by the cameras 20A and 20B. When the cameras 20A and 20B transmit their own position information, the accident occurrence detection unit 211 can use that position information as it is.
- the accident occurrence detection unit 211 uses an accident detection model machine-learned so that video is input as input data and a determination result as to whether an accident has occurred is generated as output data, and inputs the video data captured by each camera 20 into this model. Thereby, the accident occurrence detection unit 211 detects whether or not an accident has occurred in each piece of video data. Other processes executed by the accident occurrence detection unit 211 are the same as those of the accident occurrence detection unit 101, and thus description thereof is omitted.
- the transmission/reception unit 212 receives captured video data and identification information from each of the cameras 20A to 20N. Further, when an accident is detected in the video data of a certain camera 20, the accident occurrence detection unit 211 extracts the video data captured by that camera 20. The transmitting/receiving unit 212 transmits the extracted video data together with the location information of the accident site to the notification server 22. In this example, the transmitting/receiving unit 212 transmits to the notification server 22 the video data captured by the cameras 20A and 20B from the time t1, at which the accident was detected, up to the present, together with the location information of the intersection C1. The identification information of the camera 20 that captured the accident may be transmitted as the location information of the accident site.
- the accident occurrence detection unit 211 determines the area R1 (see FIG. 6) including the intersection C1, which is the accident site, as the first area including the accident site. Then, the accident occurrence detection unit 211 identifies the cameras 20 that capture the area R1 based on the information on the shooting locations of the cameras 20 stored in the storage unit 213. As shown in FIG. 6, the cameras 20 that capture the region R1 are the cameras 20A and 20B provided at the intersection C1 and the cameras 20C and 20D provided near the intersection C1. Therefore, the accident occurrence detection unit 211 determines that not only the video data captured by the cameras 20A and 20B but also the video data captured by the cameras 20C and 20D are necessary for the notification server 22 to detect notification targets, which will be described later.
- the accident occurrence detection unit 211 also extracts video data from the time t2 of the cameras 20C and 20D to the present, and the transmission/reception unit 212 transmits the video data to the notification server 22 .
- the video data captured by the cameras 20E and 20F is not transmitted to the notification server 22 because the area around the intersection C3 is not included in the area R1. Therefore, in this case, the medical personnel NE, the non-medical personnel NF, and the vehicle NG are not considered as notification targets.
- the "first area" may be a circular area whose radius is a predetermined distance from the accident site, an area reachable from the accident site within a predetermined time at a predetermined walking speed or vehicle speed, or a predetermined section including the accident site. Predetermined sections include, by way of example but not limitation, emergency service districts and administrative divisions such as counties, cities, towns, villages, districts, and states.
- the accident occurrence detection unit 211 can set the first area as described above using, for example, the position information of the detected accident site, the map information around the accident site stored in the storage unit 213, and the criteria for setting the first area.
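- The circular first-area criterion mentioned above can be sketched as follows; the great-circle distance formula is standard, but the 1 km radius and the coordinates used are illustrative assumptions.

```python
import math

# Sketch of the circular first-area criterion: a point is inside the first
# area when its great-circle distance from the accident site is at most a
# predetermined radius. The default 1 km radius is an illustrative assumption.
def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_first_area(accident_site, point, radius_km=1.0):
    """Return True when the point lies within the first area around the site."""
    return haversine_km(*accident_site, *point) <= radius_km
```

The same predicate can be applied both to camera positions (to select which video to forward) and to candidate notification targets.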
- the times t1 and t2 may be the same timing as the time t, or may be timings earlier than the time t by a predetermined period so that the notification server 22 can clearly grasp the details of the accident.
- the predetermined period is, for example, several seconds to several minutes.
- a predetermined criterion for determining times t1 and t2 may be defined.
- for example, the time t1 may be the timing at which the injured person in the accident first appears in the video data.
- the accident occurrence detection unit 211 determines the times t1 and t2 using the predetermined period or reference of the timing stored in the storage unit 213, and extracts the video data as described above.
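- The extraction of video data from the time t1 (a predetermined period before the detection time t) up to the present can be sketched as follows; the frame representation as timestamped pairs and the 30-second lead period are illustrative assumptions.

```python
# Sketch of extracting the video segment beginning at t1, a predetermined
# period before the detection time t, so that the lead-up to the accident
# is included. Frames are modeled as (timestamp, frame) pairs by assumption.
def extract_segment(frames, detection_time, lead_seconds=30.0):
    """Return the frames from t1 = detection_time - lead_seconds onward."""
    t1 = detection_time - lead_seconds  # t1 precedes the detection time t
    return [(ts, frame) for ts, frame in frames if ts >= t1]
```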
- the storage unit 213 stores information necessary for processing executed by the accident detection unit 211 and the transmission/reception unit 212 . Specifically, the following information can be stored.
- (i) the detection model, map information, and first-area setting criteria used by the accident occurrence detection unit 211
- (ii) the position information of each camera 20, associated with the ID of that camera 20
- (iii) the video data captured by the cameras 20A to 20N during a predetermined period, which is the detection target of the accident occurrence detection unit 211
- (iv) the values of the timings t1 and t2 for extracting video data, or the criteria for setting those timings
- FIG. 8 is a block diagram showing an example of the notification server 22.
- the notification server 22 corresponds to part of the notification device 10 of (1B), and includes a notification target detection unit 221 , an analysis unit 222 , a determination unit 223 , a transmission/reception unit (transceiver) 224 and a storage unit 225 .
- Each component of the notification server 22 will be described below.
- the notification target detection unit 221 detects notification targets existing in a first area (detection area) including the accident site.
- the notification target includes at least one of a predetermined organization (organization), an individual, and a vehicle.
- Predetermined institutions that are candidates for notification targets are, for example, institutions related to medical care or emergency services; these include medical institutions such as hospitals and clinics, whether public or private, and agencies involved in emergency services from which specialized staff such as ambulance crews are dispatched.
- emergency service agencies include, but are not limited to, fire departments and EMS (Emergency Medical Services).
- the predetermined organizations that are candidates for notification may include a police station in charge of handling accidents, and a company that deals with accidents on roads, railroads, etc. (for example, a road service company).
- predetermined individuals who are candidates for notification targets may include at least one of medical personnel, such as doctors, nurses, and paramedics, and non-medical personnel (in particular, ordinary people).
- persons whose duties include going to the scene of an accident, such as firefighters, police officers, or staff members of a company that responds to accidents, may also be included.
- a predetermined vehicle that is a candidate for notification is, for example, a vehicle owned (owned or temporarily occupied) by the above-described predetermined organization or individual. An example would be an ambulance or a car owned by a hospital.
- the information indicating each institution, individual, and vehicle that is a candidate for notification targets is stored in advance in the storage unit 225, together with the terminal information of each institution, individual, and vehicle, which is the actual transmission destination of notifications from the transmitting/receiving unit 224. Each terminal is a terminal associated with an institution, individual, or vehicle (for example, a terminal owned or temporarily occupied by the institution), and this terminal information can be updated from time to time by each institution, each individual, or an administrator.
- the notification target detection unit 221 can select, as notification targets, even predetermined organizations, individuals, and vehicles that do not exist in the first area including the accident site.
- when the analysis unit 222, which will be described later, acquires information such as a family medical institution and an emergency contact as personal information, the notification target detection unit 221 can add those contacts as notification targets regardless of the locations of the family medical institution and the emergency contact.
- emergency medical institutions, which are one type of predetermined institution, that are candidates for notification targets may be stored in the storage unit 225. The notification target detection unit 221 may specify one or more emergency medical institutions according to the circumstances of the accident and select them as notification targets, regardless of whether they exist in the first area including the accident site.
- the detailed information used to identify the emergency medical institution may be at least one of information on the state of the body of the person to be analyzed obtained by the analysis unit 222 analyzing the video data (for example, information on the type or degree of injury or illness) and personal information such as past medical history, current medical history, and whether or not the person is pregnant.
- alternatively, the notification target detection unit 221 may select, as notification targets, one or a plurality of predetermined organizations registered in advance as emergency response organizations, regardless of the circumstances of the accident.
- the notification target detection unit 221 analyzes the video data captured by each of the cameras 20A to 20D received from the MEC server 21, and detects whether or not notification target candidates appear in the video data.
- the storage unit 225 stores specific information for specifying a predetermined individual or vehicle for the notification target detection unit 221 to detect a notification target using video data.
- the specific information is, for each individual, biometric information such as face information, and, for each vehicle, information that can identify the vehicle, such as the license plate number or image data.
- the notification target detection unit 221 compares the specific information with the video data captured by the cameras 20A to 20D to determine whether or not a notification target candidate appears in each piece of video data of the first area.
- in this example, the notification target detection unit 221 determines that the medical personnel NA and the non-medical personnel NB, who are notification target candidates, appear in the video data captured by the cameras 20A and 20B. Furthermore, the notification target detection unit 221 detects that the license plate of the vehicle NC, which is a notification target candidate, appears in the video data captured by the cameras 20C and 20D. Thereby, the notification target detection unit 221 detects the medical personnel NA, the non-medical personnel NB, and the vehicle NC owned by the hospital C as notification targets existing in the first region including the accident site. Coping methods are notified to these notification targets.
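- The comparison of stored specific information against the first-area video described above might be sketched as follows; the identifier strings stand in for the outputs of face and license-plate recognition, and the registry contents are hypothetical examples.

```python
# Sketch of matching recognized identifiers (faces, license plates) from the
# first-area video against the notification target candidates stored in the
# storage unit 225. The registry contents are hypothetical examples.
def detect_notification_targets(recognized_ids, candidate_registry):
    """Return candidate records whose specific information matched the video."""
    return [candidate_registry[i] for i in recognized_ids if i in candidate_registry]

candidate_registry = {
    "face:NA":  {"name": "NA", "kind": "medical_personnel"},
    "face:NB":  {"name": "NB", "kind": "non_medical_personnel"},
    "plate:NC": {"name": "NC", "kind": "hospital_vehicle"},
}
# An unrecognized face simply fails to match and is ignored.
targets = detect_notification_targets(["face:NA", "plate:NC", "face:unknown"],
                                      candidate_registry)
```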
- the analysis unit 222 corresponds to the analysis unit 102 according to Embodiment 1, and derives detailed information about the person to be analyzed involved in the accident, who appears in the video data from the cameras 20A and 20B. Specifically, by performing face authentication using the face image of the person to be analyzed appearing in the video data and the face information of registrants stored in advance in the storage unit 225, the analysis unit 222 identifies the person to be analyzed from among the registrants and acquires that registrant's personal information stored in the storage unit 225 as detailed information.
- this registrant is a person who has registered his or her personal information in advance, a person who has a chronic disease, or a person stored in advance as some kind of important person. Furthermore, the analysis unit 222 also acquires information about the body condition of the person to be analyzed as detailed information by analyzing the video data from the cameras 20A and 20B. The analysis unit 222 can acquire at least one of the personal information and the information about the state of the body as detailed information. The details of the personal information and of the information about the state of the body of the person to be analyzed, and the details of other processes executed by the analysis unit 222, are the same as those described in the first embodiment, so descriptions thereof are omitted.
- in this example, the analysis unit 222 acquires, as personal information, the fact that the current medical history of the person to be analyzed includes heart disease, and the contact information of the hospital D, which is the person's family medical institution. Furthermore, as the physical condition of the person to be analyzed, the analysis unit 222 acquires the fact that there are abnormalities in the breathing and pulse.
- the notification target detection unit 221 adds the hospital D as a notification target. This hospital D does not need to be in the first area containing the accident site.
- the decision unit 223 corresponds to the decision unit 103 according to Embodiment 1, and decides how to deal with the person to be analyzed.
- the determination unit 223 determines different coping methods for the case where the notification targets are the medical personnel NA and the vehicle NC (medical personnel or their vehicles), the case where the notification target is the non-medical personnel NB (ordinary people), and the case where the notification target is the hospital D (a medical institution).
- when the notification targets are the medical personnel NA and the vehicle NC, the determination unit 223 determines the treatment for the person to be analyzed and, furthermore, identifies the location of the medical equipment used in that treatment which exists in a predetermined area (third area) including the accident site.
- in this example, the determination unit 223 determines artificial respiration, chest compressions (cardiac massage), use of an AED, and other actions that can be performed by medical personnel as treatments for the person to be analyzed.
- the determination unit 223 identifies the AED as a medical device used in the treatment, and identifies the location of an AED existing in the predetermined area by referring to the list of AED position information stored in the storage unit 225 and the position information of the accident site.
- the determination unit 223 includes the information on this treatment and the information on the location of the AED existing in the predetermined area in the coping method to be notified to the medical personnel NA and the vehicle NC.
- the predetermined area (third area) including the accident site where the medical equipment exists may be the same area as the first area, or may be a different area.
- the setting criteria for this area are stored in the storage unit 225 .
- when the notification target is the non-medical personnel NB, the determination unit 223 likewise determines the treatment for the person to be analyzed and identifies the location of the medical equipment used in the treatment that exists in the predetermined area including the accident site. In this example, the determination unit 223 determines artificial respiration, chest compressions, and use of an AED as treatments for the person to be analyzed. Further, the determination unit 223 identifies the location of the AED existing in the predetermined area (third area) including the accident site, as described above. The determination unit 223 includes the treatment information and the location information of the AED existing in the predetermined area in the coping method.
- the determination unit 223 may include only one of the two types of information, that is, the information on the treatment (first aid) or the information on the location of the AED existing in the predetermined area, in the coping method.
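- Identifying an AED in the third area from the stored AED position list, as described above, might be sketched as follows; the equirectangular distance approximation, the 0.5 km radius, and the coordinates are illustrative assumptions.

```python
import math

# Sketch of locating the nearest AED within the predetermined area (third
# area) around the accident site, using the stored list of AED positions.
# The 0.5 km radius and distance approximation are illustrative assumptions.
def nearest_aed_in_area(accident_site, aed_positions, radius_km=0.5):
    def dist_km(a, b):
        # Equirectangular approximation, adequate at city scale.
        x = math.radians(b[1] - a[1]) * math.cos(math.radians((a[0] + b[0]) / 2))
        y = math.radians(b[0] - a[0])
        return 6371.0 * math.hypot(x, y)
    in_area = [p for p in aed_positions if dist_km(accident_site, p) <= radius_km]
    return min(in_area, key=lambda p: dist_km(accident_site, p)) if in_area else None
```

When no AED lies within the third area, the function returns None, in which case only the treatment information would be included in the coping method.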
- when the notification target is the hospital D, the determination unit 223 determines the coping method so as to include the treatment for the person to be analyzed and a request to accept the person to be analyzed.
- the treatment has the same content as the treatment when the notification targets are the medical personnel NA and the vehicle NC.
- the coping method is not limited to this example; the determination unit 223 may decide on other coping methods. Details of the other processes executed by the determination unit 223 are the same as those described in the first embodiment, and thus description thereof is omitted.
- the transmission/reception unit 224 corresponds to the notification unit 104 according to Embodiment 1, and notifies the notification target detected by the notification target detection unit 221 of the coping method determined by the determination unit 223 .
- the transmitting/receiving unit 224 notifies the terminals 23A and 23C of the coping method including the treatment information and the information on the location of the AED existing in the predetermined area. Further, the transmitting/receiving unit 224 notifies the terminal 23B of the coping method including the treatment information and the location information of the AED existing in the predetermined area. Further, the transmission/reception unit 224 notifies the terminal 23D of the coping method including the treatment for the person to be analyzed and the request to accept the person to be analyzed.
- the terminals 23A and 23B are smartphones owned by the medical personnel NA and the non-medical personnel NB, respectively, the terminal 23C is a car navigation system mounted on the vehicle NC, and the terminal 23D is a terminal provided by the hospital D for accepting emergency patients.
- the contact information of the terminals 23A, 23B, and 23C is associated with the medical personnel NA, the non-medical personnel NB, and the vehicle NC, respectively, and stored in the storage unit 225 as information on notification target candidates; the transmitting/receiving unit 224 uses that information for transmission.
- the contact information of the terminal 23D is included in the personal information acquired by the analysis unit 222, and the transmission/reception unit 224 uses the information for transmission.
- Contact information is information including at least one of the communication address information (for example, an IP address), telephone number, e-mail address, and any other user identifier of each terminal.
- The transmitting/receiving unit 224 transmits the coping methods to the terminals 23A, 23B, and 23D via the radio lines RA, RB, and RD, respectively. Further, the transmitting/receiving unit 224 transmits the coping method to a roadside device (not shown) near the terminal 23C, and the roadside device transmits the coping method to the terminal 23C via the radio line RC. In this way, the notification server 22 can transmit the coping method to an in-vehicle terminal via the roadside unit. Further, the transmitting/receiving unit 224 may notify each notification target terminal 23 of the location information of the accident site received by the notification server 22 (that is, the location information of the person to be analyzed) together with the coping method.
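As a rough illustration of this delivery-path selection (smartphones over their own radio lines, in-vehicle terminals via a roadside unit), the sketch below uses hypothetical names (`route_notification`, the terminal records); the patent does not specify an implementation.

```python
# Hedged sketch of how the transmission/reception unit might choose a path
# per terminal; "kind" values and the payload shape are assumptions.
def route_notification(terminal, coping_method, accident_location):
    """Return a (path, payload) pair describing how the notification travels."""
    payload = {"coping_method": coping_method,
               "accident_location": accident_location}
    if terminal["kind"] == "car_navigation":
        # In-vehicle terminals are reached via a nearby roadside unit.
        return ("roadside_unit", payload)
    # Smartphones and fixed terminals are reached over their radio line.
    return ("direct_radio", payload)

terminals = [
    {"id": "23A", "kind": "smartphone"},
    {"id": "23C", "kind": "car_navigation"},
]
routes = {t["id"]: route_notification(t, "apply AED", (35.0, 139.0))
          for t in terminals}
```

The accident-site coordinates ride along in the payload, matching the note that location information may be notified together with the coping method.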
- the storage unit 225 stores information necessary for processing executed by the notification target detection unit 221 to the transmission/reception unit 224 . Specifically, the following information can be stored.
- (i) Information on predetermined organizations, individuals, and vehicles that are notification target candidates, and terminal information on the organizations, individuals, and vehicles that are the notification destinations, associated with each piece of information; (ii) specific information for the notification target detection unit 221 to identify a predetermined individual or vehicle in order to detect a notification target using video data; (iii) registrants' face information for the analysis unit 222 to perform face authentication; and (iv) a list of location information of various medical devices (for example, AEDs) and setting criteria for the predetermined area (third area) that is the notification range of the locations where the medical devices exist. In addition, one or a plurality of predetermined organizations registered in advance as notification targets may be further stored.
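As a concrete illustration of how categories (i) to (iv) might be laid out in the storage unit 225, the hypothetical structure below is one possible sketch; all keys and sample values are illustrative and not specified by the patent.

```python
# Hypothetical in-memory layout of the storage unit 225 contents (i)-(iv).
def build_storage_225():
    return {
        # (i) notification target candidates and their destination terminals
        "candidates": [
            {"name": "NA", "kind": "medical", "terminal": "23A"},
            {"name": "NB", "kind": "non_medical", "terminal": "23B"},
        ],
        # (ii) specific information for identifying individuals/vehicles in video
        "specific_info": {"NA": "appearance-feature-placeholder"},
        # (iii) registrants' face information for face authentication
        "registrant_faces": {"registrant-001": "face-template-placeholder"},
        # (iv) medical device locations and third-area setting criteria
        "aed_locations": [(35.001, 139.002)],
        "third_area_criteria": {"radius_m": 300.0},
    }

storage = build_storage_225()
```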
- The storage unit 225 may also store map information of the locations where the cameras 20 managed by the notification server 22 are arranged, and the setting criteria for the first area. The administrator can update this information, and when it is updated, the notification server 22 sends the updated information to the MEC server 21. Thereby, the MEC server 21 updates the information stored in the storage unit 213.
- FIG. 9 is a block diagram showing an example of the terminal 23.
- The terminal 23 corresponds to the terminal 12 of (1B), and includes an output unit 231, a transmitting/receiving unit (transceiver) 232, and a storage unit 233. Each component of the terminal 23 will be described below.
- the output unit 231 is an interface that notifies the user of the terminal 23 of information by text, image, voice, etc., and is composed of a display, a speaker, and the like.
- The output unit 231 outputs, to the user of the terminal 23, the coping method and the location information of the accident site that have been transmitted from the transmission/reception unit 224 of the notification server 22 to each terminal 23.
- For example, when the terminal 23 is a smartphone, the information may be displayed on its display; when the terminal 23 is a car navigation system, the information may be displayed on its display screen.
- the transmitting/receiving unit 232 receives the coping method and the location information of the accident site notified from the notification server 22 .
- The terminal 23 may further include an input unit; when the user who has viewed the coping method inputs content indicating that the coping method has been confirmed, the transmitting/receiving unit 232 may send a notification to the notification server 22 indicating that confirmation has been completed.
- the storage unit 233 stores information such as coping methods transmitted from the notification server 22, and also stores information necessary for operating the terminal 23.
- FIG. 10 is a sequence diagram showing an example of typical processing of the notification system S2, and this sequence diagram explains the processing of the notification system S2. The details of the processing executed by each device are as described above. Note that FIG. 10 shows the camera 20A and the terminal 23A as examples of the camera 20 and the terminal 23, respectively, but other cameras 20 and terminals 23 also perform processing similar to the following.
- the photographing unit 201 of the camera 20A photographs the intersection C1 (step S21), and the transmitting/receiving unit 202 transmits the photographed image data to the MEC server 21 (step S22).
- the transmission/reception unit 212 of the MEC server 21 receives the video data transmitted in step S22.
- Accident occurrence detection unit 211 detects the occurrence of an accident at intersection C1 based on the image data of camera 20A (step S23).
- the transmission/reception unit 212 transmits the video data of the camera 20A involved in the accident to the notification server 22 (step S24).
- the transmission/reception unit 224 of the notification server 22 receives the video data transmitted in step S24.
- the notification server 22 determines a coping method for the person to be analyzed based on the received video data (step S25). Details of this will be described later.
- the transmission/reception unit 224 of the notification server 22 transmits the determined coping method and the like to the terminal 23 (step S26).
- The transmission/reception unit 232 of the terminal 23A receives the coping method and the like transmitted in step S26.
- the output unit 231 of the terminal 23A outputs the coping method and the like to the user (step S27).
- FIG. 11 is a flowchart showing an example of the processing of step S25, and the processing of the notification server 22 is explained by this flowchart.
- the notification target detection unit 221 detects a notification target from the video data transmitted from the MEC server 21 (step S251). Also, the analysis unit 222 analyzes the video data and derives detailed information about the person to be analyzed related to the accident detected by the accident occurrence detection unit 211 of the MEC server 21 (step S252).
- the determination unit 223 determines how to deal with the person to be analyzed (step S253).
- the transmission/reception unit 224 notifies the notification target detected by the notification target detection unit 221 of the coping method determined in step S253 (step S254).
- Either step S251 or S252 may be performed first.
- When a notification target is added based on the personal information, as in the case of the hospital D described above, step S252 may be executed first. Further, the details of the processing of steps S251 to S254 are as described for the notification target detection unit 221 to the transmission/reception unit 224.
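The flow of step S25 (steps S251 to S254) can be sketched as a simple pipeline; the stub functions below stand in for the video-analysis logic, which the patent does not detail, and all names are hypothetical.

```python
# Minimal sketch of step S25: detect (S251), analyze (S252), decide (S253),
# notify (S254). S251 and S252 are independent and may run in either order.
def determine_and_notify(video_data, detect_targets, analyze, decide, notify):
    targets = detect_targets(video_data)   # S251: detect notification targets
    details = analyze(video_data)          # S252: derive detailed information
    coping = decide(details)               # S253: determine the coping method
    notify(targets, coping)                # S254: notify the detected targets
    return targets, coping

sent = []
targets, coping = determine_and_notify(
    video_data="frames",
    detect_targets=lambda v: ["NA", "NB"],
    analyze=lambda v: {"bleeding": True},
    decide=lambda d: ["apply pressure to the wound"] if d.get("bleeding") else [],
    notify=lambda t, c: sent.append((tuple(t), tuple(c))),
)
```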
- As described above, the notification system S2 can analyze the person to be analyzed, determine a coping method, and notify the notification target of the coping method.
- The notification target detection unit 221 of the notification server 22 detects a notification target existing in the first area including the place where the accident occurred, and the transmission/reception unit 224 can notify the detected notification target of the coping method.
- In other words, the notification server 22 detects people and vehicles existing around the accident site, rather than relying on a reporter, and provides information to them. Therefore, it is possible to obtain the cooperation of non-reporters around the accident site, and of people who are unaware of the accident (for example, people in a blind spot from the accident site), in responding to the injured or the suddenly ill. It therefore becomes possible to increase the survival probability of the injured or the suddenly ill and to suppress a poor prognosis.
- the notification server 22 can notify at least one of the person and the vehicle of the coping method. Therefore, it is possible for a person in the vicinity of the site or a person in the vehicle to immediately recognize the coping method and take action.
- As the coping method, the determination unit 223 of the notification server 22 determines the treatment or measure for the person to be analyzed, and the location, within the area including the accident site, of the medical device used in that treatment or measure. By transmitting this coping method to the notification target via the transmitting/receiving unit 224, the notification target can take a more appropriate response to the injured person or the suddenly ill person.
- The notification target detection unit 221 of the notification server 22 detects a notification target candidate by determining that the candidate appears in the video, and the transmission/reception unit 224 can notify the terminal associated with that candidate, as the notification target, of the coping method.
- the notification server 22 can notify people present around the accident site, so that it is possible to increase the probability that an injured person or an emergency patient will be cared for.
- the notification target may include institutions related to medical care or emergency services.
- The determination unit 223 can determine the coping method so as to include at least one of the following: treatment for the person to be analyzed, a request to dispatch staff to the location where the accident occurred, a request to accept the person to be analyzed, and preparation for treatment.
- The notification server 22 can thereby have the injured person or the suddenly ill person treated by a medical or emergency institution, so that the injured person or the suddenly ill person can receive early and appropriate treatment at a specialized institution.
- The analysis unit 222 identifies the person to be analyzed from among the registrants based on the face image of the person to be analyzed appearing in the video and the authentication result of face authentication using the registrants' face information, and may derive the personal information associated with the identified registrant as the detailed information.
- the determination unit 223 can determine a coping method based on the personal information. Therefore, since the notification server 22 can notify a coping method according to individual characteristics, it can be expected that a response according to the situation of the injured person or the suddenly sick person will be made.
- Personal information may include, for example, at least one of the following: medical history, current medical history, pregnancy status, age, gender, blood type, allergy information, personal number, primary medical institution, and emergency contact information. Based on such medical-related personal information, the notification server 22 can notify detailed and accurate coping methods.
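The personal information items listed above, and the derivation of an additional notification target from them (as in the hospital D example), can be sketched as follows; all field and function names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative record of the medical-related personal information items.
@dataclass
class PersonalInfo:
    medical_history: List[str] = field(default_factory=list)
    pregnancy: bool = False
    age: Optional[int] = None
    gender: Optional[str] = None
    blood_type: Optional[str] = None
    allergies: List[str] = field(default_factory=list)
    primary_institution: Optional[str] = None
    emergency_contact: Optional[str] = None

def extra_notification_targets(info: PersonalInfo) -> List[str]:
    """Derive additional notification targets from personal information,
    e.g. the primary medical institution and emergency contact (sketch)."""
    targets = []
    if info.primary_institution:
        targets.append(info.primary_institution)
    if info.emergency_contact:
        targets.append(info.emergency_contact)
    return targets
```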
- the transmitting/receiving unit 224 may determine a notification target based on personal information.
- the notification server 22 can have a person or organization who knows the circumstances of the person to be analyzed take care of the person to be analyzed, so that an injured person or an emergency patient can be properly cared for.
- the detailed information may include at least one of consciousness state, breathing state, bleeding state, bone fracture state, burn state, impact site, convulsive state, walking state, heartbeat state, pulse state, and body temperature state.
- the notification server 22 can notify detailed and accurate coping methods.
- The determination unit 223 can notify different coping methods (treatments or measures) depending on whether the notification target is a non-medical person or their vehicle (a first type candidate) or a medical person or their vehicle (a second type candidate). Therefore, the notification server 22 can notify each notification target of a coping method that takes the expertise of that notification target into account.
- The configuration and processing of the notification system S2 shown in (2A) can be changed as follows. For example, even when the notification target is a vehicle of non-medical personnel, the determination unit 223 can determine the coping method in the same manner as when the notification target is a non-medical person.
- the notification target detection unit 221 added Hospital D as a notification target according to the personal information acquired by the analysis unit 222, regardless of the distance from the accident site. However, the notification target detection unit 221 does not need to add the hospital D as a notification target if the hospital D is not included in the region R1 or if it is not included within a predetermined region from the accident site. Note that the predetermined area is stored in advance in the storage unit 225 as a threshold value for determination.
- the notification target detection unit 221 may specify one or more emergency medical institutions according to the circumstances of the accident and set them as notification targets.
- The coping method notified to the emergency medical institution includes at least one of treatment for the person to be analyzed, a request to dispatch staff to the location where the accident occurred, and a request to accept the person to be analyzed. If, in response to the notification from the transmitting/receiving unit 224, the emergency medical institution replies to the notification server 22 that it cannot accept the person, the notification target detection unit 221 may reset another emergency medical institution as a new notification target. The transmitting/receiving unit 224 then notifies the new notification target of the coping method.
- This notification target resetting method may be based on the priority order stored in advance in the storage unit 225 .
- Alternatively, the notification target detection unit 221 may determine the priority based on at least one of the location information of the accident site and the detailed information acquired by the analysis unit 222, together with priority determination criteria stored in advance in the storage unit 225, and may reset the notification target based on that priority.
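Resetting the notification target by a stored priority order when an institution declines can be sketched as a simple fallback loop; the data shapes and names are assumptions, not the patent's implementation.

```python
# Sketch: pick the highest-priority emergency medical institution that has
# not yet declined; return None when every institution has declined.
def next_notification_target(institutions, declined):
    for inst in sorted(institutions, key=lambda i: i["priority"]):
        if inst["name"] not in declined:
            return inst["name"]
    return None

institutions = [
    {"name": "hospital D", "priority": 1},
    {"name": "hospital E", "priority": 2},
]
```

Each "cannot accept" reply adds the institution to `declined`, and the loop yields the next candidate to notify.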
- the notification system S2 can also be applied to situations other than this.
- For example, when the analysis unit 222 determines, based on the image data captured by the cameras 20A and 20B, that the person to be analyzed hit his or her head at the time of the accident, the determination unit 223 may include "do not move the head" in the coping method as a treatment or measure.
- Also, the determination unit 223 may include "type O blood transfusion required", which is a preparation for blood transfusion (preparation for treatment), in the coping method.
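The two examples above (head impact, and a known blood type with bleeding) suggest a rule-based mapping from detailed and personal information to coping-method items; the rule set below is illustrative only and not the patent's specification.

```python
# Hedged sketch of the determination unit's rules; dictionary keys such as
# "impact_site" and "bleeding" are hypothetical names for the detailed info.
def build_coping_method(details, personal):
    coping = []
    if details.get("impact_site") == "head":
        coping.append("do not move the head")
    if details.get("bleeding") and personal.get("blood_type"):
        coping.append(
            f"type {personal['blood_type']} blood transfusion required")
    return coping
```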
- the MEC server 21 may be provided at any location on the communication path from the camera 20 to the notification server 22.
- the MEC server 21 and camera 20 may be connected by a wired line, and the MEC server 21 may be connected to the notification server 22 via 5G wireless communication.
- the camera 20 may be connected via 5G wireless communication to the MEC server 21 installed near the 5G base station, and the MEC server 21 and notification server 22 may be connected via a wired line.
- Instead of installing the MEC server 21 near the 5G base station, the MEC server 21 may be installed in a predetermined regional unit or in the core network (5GC) of the 5G system.
- By providing the MEC server 21 closer to the camera 20 on the communication path, the distance over which the series of video data output from each camera 20 flows can be shortened. Therefore, it is possible to suppress the flow of a large amount of data in the network, reduce the network load, and suppress communication delay.
- The MEC server 21 may be provided independently, or multiple MEC servers 21 may exist in a geographically dispersed state. Also, the MEC server 21 may include at least one of the notification target detection unit 221, the analysis unit 222, and the determination unit 223 included in the notification server 22.
- Alternatively, the MEC server 21 may not be provided; in that case, the notification server 22 may include the accident occurrence detection unit 211, and the camera 20 may directly transmit video data to the notification server 22.
- In (2A), the MEC server 21 determined that the cameras 20C and 20D exist in the region R1, thereby determined that the video data captured by the cameras 20C and 20D was also necessary for the processing of the notification server 22, and transmitted that video data to the notification server 22.
- the MEC server 21 may not execute the determination, and only the video data captured by the cameras 20A and 20B that detected the accident may be transmitted to the notification server 22 .
- In this case, the notification target detection unit 221 of the notification server 22 sets the region R1 as the first area including the accident site, using the map information around the accident site stored in the storage unit 225 and the first area setting criteria. The notification target detection unit 221 then identifies the cameras 20A to 20D as the cameras 20 that capture the region R1.
- Next, the notification target detection unit 221 determines that the video data captured by the cameras 20C and 20D is also necessary for notification target detection, and requests the MEC server 21 to output the video data captured by the cameras 20C and 20D. The MEC server 21 transmits the video data captured by the cameras 20C and 20D to the notification server 22 based on the request.
- the notification target detection unit 221 performs the above-described processing using this video data.
- (2B) This section describes a variation different from (2A).
- points different from (2A) will be described, and descriptions of the same points as (2A) will be omitted as appropriate.
- FIG. 12 is a schematic diagram showing an example of a road on which cameras 20 are installed.
- FIG. 12 discloses a region R2 that is wider than the region R1 and includes the region R1 and the periphery of the intersection C3. The processing will be described below using this example.
- The accident occurrence detection unit 211 of the MEC server 21 detects the occurrence of an accident from the video data captured by the cameras 20A and 20B, as in (2A). At this time, the accident occurrence detection unit 211 determines the region R1 as the first area including the accident site, and determines the region R2 as the second area including the accident site. Then, the accident occurrence detection unit 211 transmits to the notification server 22, via the transmission/reception unit 212, not only the video data captured by the cameras 20A to 20D related to the region R1, but also the video data captured by the cameras 20E and 20F related to the region R2, from time t3 to the present. Note that the time t3 is set in the same manner as the time t2 described above.
- the "second area” may be a circular area with a radius of a predetermined distance longer than the "first area” from the accident site, or an area including the "first area”, It may be an area located within a predetermined time at a predetermined walking speed or vehicle speed from the site. Also, the "second area” may be a predetermined section containing the "first area” inside.
- The accident occurrence detection unit 211 can set the first and second areas using, for example, the position information of the detected accident site, the map information around the accident site stored in the storage unit 213, and the setting criteria for the first and second areas.
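As one possible concrete form of these setting criteria, the first and second areas can be modeled as concentric circles around the accident site; the radii below (200 m and 500 m) are hypothetical values, not taken from the patent.

```python
import math

def distance_m(a, b):
    """Great-circle (haversine) distance in meters between (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))

def classify_area(site, point, first_radius_m=200.0, second_radius_m=500.0):
    """Classify a point as inside the first area, the second area, or outside."""
    d = distance_m(site, point)
    if d <= first_radius_m:
        return "first"
    if d <= second_radius_m:
        return "second"
    return "outside"
```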
- The notification target detection unit 221 detects, as notification targets, the non-medical personnel and their vehicles (first type candidates) existing in the first area, and the medical personnel and their vehicles (second type candidates) existing in the second area.
- The notification target detection unit 221 analyzes the video data captured by the cameras 20A to 20F and compares the specific information with that video data, thereby determining whether a notification target appears in each piece of video data.
- Based on the video data captured by the cameras 20A to 20D, the notification target detection unit 221 detects the medical personnel NA, the non-medical personnel NB, and the vehicle NC owned by the hospital C. Furthermore, based on the video data captured by the cameras 20E and 20F, the notification target detection unit 221 detects the medical personnel NE as a notification target that exists in the second area and does not exist in the first area. By analyzing the video data captured by the cameras 20E and 20F with the specific information of medical personnel and the like, while excluding the specific information of non-medical personnel and the like from the analysis target, the notification target detection unit 221 can detect the medical personnel NE and not detect the non-medical personnel NF among those who exist in the second area and do not exist in the first area.
- Alternatively, when the notification target detection unit 221 analyzes the video data captured by the cameras 20E and 20F against the specific information of both medical personnel and non-medical personnel, it may exclude from notification the non-medical personnel NF and the vehicle NG detected as a result of the analysis.
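The type-dependent detection rule described above (first type candidates only within the first area; second type candidates within the second area, which contains the first) can be sketched as a filter; the data shapes and IDs below are illustrative.

```python
# Hedged sketch of selecting notification targets by candidate type and area.
def select_targets(candidates, area_of):
    targets = []
    for c in candidates:
        area = area_of(c["id"])
        if c["type"] == "first" and area == "first":
            targets.append(c["id"])
        elif c["type"] == "second" and area in ("first", "second"):
            targets.append(c["id"])
    return targets

areas = {"NA": "first", "NB": "first", "NE": "second", "NF": "second"}
candidates = [
    {"id": "NA", "type": "second"},  # medical personnel in R1
    {"id": "NB", "type": "first"},   # non-medical personnel in R1
    {"id": "NE", "type": "second"},  # medical personnel in R2 only
    {"id": "NF", "type": "first"},   # non-medical personnel in R2 only
]
selected = select_targets(candidates, areas.get)
```

With these inputs, NF is excluded because a first type candidate outside the first area is not notified, mirroring the NF/NG example above.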
- the analysis unit 222 executes the same processing as (2A).
- The determination unit 223 determines the coping method separately for the case where the notification targets are the medical personnel NA, the vehicle NC, and the medical personnel NE (medical personnel and their vehicles), the case where the notification target is the non-medical personnel NB (a general person), and the case where the notification target is the hospital D (a medical institution). This determination method is as described in (2A).
- The transmitting/receiving unit 224 notifies the terminal 23E of the coping method including the treatment information and the information on the location of the AED existing in the predetermined area.
- the terminal 23E is a smart phone owned by the medical personnel NE, and the contact information thereof is stored in the storage unit 225 in association with the medical personnel NE.
- the transmitting/receiving unit 224 notifies the terminals 23A to 23C in the same manner as in (2A).
- the transmitting/receiving unit 224 also notifies each notification target terminal 23 of the location information of the accident site received by the notification server 22 together with the coping method.
- The notification target detection unit 221 detects, as notification target candidates, a first type candidate existing in the region R1 and a second type candidate existing in the region R2.
- The transmitting/receiving unit 224 can then notify, as notification targets, the first type candidates existing in the region R1 and the second type candidates existing in the region R2 of the coping method.
- Medical personnel and the like can be expected to go to the scene of an accident even if they are located somewhat away from it, because rescuing the injured or the suddenly ill can be said to be one of their duties.
- On the other hand, general volunteers and the like may not be familiar with rescue and may have other jobs, so it may not be realistic to ask them for rescue when they are located somewhat away from the accident site.
- Therefore, the notification server 22 changes the detection area (that is, the range in which assistance is sought) according to the attributes of the notification target, thereby increasing the possibility that the injured or the suddenly ill will be helped.
- the transmitting/receiving unit 224 may notify at least one of the first-type candidate existing in the region R1 and the second-type candidate existing in the region R2.
- In this example, the first type candidates are "non-medical personnel" and the second type candidates are "medical personnel", but the candidates are not limited to this example.
- candidates of the second type may include firefighters, police officers, company staff who respond to accidents, and other persons whose job it is to go to the scene of an accident.
- The number of types of notification target candidates is not limited to two; three or more types may be set, or only one type may be set.
- The predetermined area (third area) used for notifying the locations of various medical devices may be the same as any of the plurality of set detection areas, or may be an area different from any of the detection areas.
- Embodiment 3 In the third embodiment, an example will be described in which a notification target is detected by a notification server acquiring position information from a terminal possessed by a notification target candidate. In the following description, differences from the second embodiment will be described, and descriptions of the same points as the second embodiment will be omitted as appropriate.
- FIG. 13 is a schematic diagram showing an example of a notification system.
- the notification system S3 includes cameras 20A-20N, a MEC server 21, a notification server 32, and terminals 33A-33C, 23D. Terminals 33A to 33C are collectively referred to as terminal 33.
- the camera 20 transmits video data to the MEC server 21, as in (2A).
- the accident occurrence detection unit 211 of the MEC server 21 also detects the occurrence of an accident using the video data transmitted from the cameras 20A to 20N, as in (2A).
- The transmitting/receiving unit 212 transmits together, to the notification server 32, the video data of the cameras 20A and 20B that captured the intersection C1 where the accident occurred, and the location information of the accident site.
- Note that the MEC server 21 need not determine the region R1, which is the first area including the accident site, and as a result need not transmit the video data captured by the cameras 20C and 20D to the notification server 32. Therefore, the storage unit 213 need not store the setting criteria for the first area.
- FIG. 14 is a block diagram showing an example of the notification server 32 according to the third embodiment.
- the notification server 32 includes a location information acquisition unit 321 in addition to the configuration of the notification server 22 according to the second embodiment.
- the position information acquisition unit 321 acquires the position information from the terminal 33 possessed by each notification target candidate.
- The position information acquisition unit 321 receives, from the transmission/reception unit 232 of the terminal 33, the position information of the terminal 33 acquired by the terminal 33 itself using a positioning technology such as a satellite positioning system, for example GPS (Global Positioning System), or an indoor positioning technology. This position information is updated at a predetermined timing.
- the notification target detection unit 221 detects notification target candidates existing in the first area including the accident site based on the latest position information of the terminal 33 acquired by the position information acquisition unit 321 .
- The notification target detection unit 221 can set the first area using, for example, the position information of the detected accident site, the map information around the accident site stored in the storage unit 225, and the first area setting criteria. The details of this method are as described in the second embodiment. When the notification target detection unit 221 detects that a terminal associated with a notification target candidate (for example, a terminal owned by the candidate) exists in the first area, it determines that the notification target candidate exists in the first area.
- In this example, the terminals 33A and 33B owned by the medical personnel NA and the non-medical personnel NB, and the terminal 33C of the vehicle NC, which are notification target candidates, are located in the region R1, which is the first area.
- On the other hand, the terminals 33E and 33F owned by the medical personnel NE and the non-medical personnel NF, who are also notification target candidates, are located outside the region R1.
- The notification target detection unit 221 analyzes such a positional relationship between the region R1 and each terminal 33 based on the position information of the terminals 33 acquired by the position information acquisition unit 321. Based on this analysis, the notification target detection unit 221 detects the medical personnel NA, the non-medical personnel NB, and the vehicle NC as notification targets existing in the region R1.
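This position-based detection can be sketched as follows: a candidate becomes a notification target when its terminal's latest reported position lies in the first area. The `in_first_area` predicate, the local toy coordinates, and all names are assumptions for illustration.

```python
# Hedged sketch of detection in the third embodiment, driven by terminal
# positions instead of camera video.
def detect_by_position(latest_positions, owners, in_first_area):
    detected = [owners[tid] for tid, pos in latest_positions.items()
                if in_first_area(pos)]
    return sorted(detected)

owners = {"33A": "NA", "33B": "NB", "33C": "NC", "33E": "NE", "33F": "NF"}
latest_positions = {"33A": (0, 0), "33B": (1, 1), "33C": (2, 0),
                    "33E": (9, 9), "33F": (8, 8)}
# Toy first area: points within 5 units of the accident site at the origin.
detected_targets = detect_by_position(
    latest_positions, owners,
    in_first_area=lambda p: (p[0] ** 2 + p[1] ** 2) ** 0.5 <= 5,
)
```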
- the processing executed by the analysis unit 222 and the determination unit 223 is the same as (2A).
- The transmitting/receiving unit 224 notifies the terminals 33A and 33C of the coping method including the treatment information and the information on the locations of AEDs existing in the predetermined area, and notifies the terminal 33B of the coping method including the treatment information and the location information of the AED existing in the predetermined area.
- the terminals 33A and 33B are smartphones owned by medical personnel NA and non-medical personnel NB, respectively, and the terminal 33C is a car navigation system mounted on the vehicle NC.
- the contact information of the terminals 33A, 33B, and 33C is associated with the medical personnel NA, the non-medical personnel NB, and the vehicle NC, respectively, and stored in the storage unit 225 as information of candidates for notification.
- Other processes executed by the transmitting/receiving unit 224 are the same as (2A), so description thereof will be omitted.
- The storage unit 225 stores information necessary for the processing executed by the notification target detection unit 221 to the transmission/reception unit 224. However, since the notification target detection unit 221 does not detect notification target candidates from the camera video data, the storage unit 225 need not store the specific information for identifying a predetermined individual or vehicle.
- FIG. 15 is a block diagram showing an example of the terminal 33. The terminal 33 further includes a position detection unit 331 in addition to the configuration of the terminal 23 according to the second embodiment.
- The position detection unit 331 updates the position information of the terminal 33 itself at a predetermined timing using a positioning technology such as a satellite positioning system or an indoor positioning technology.
- the location detection unit 331 transmits its own location information to the notification server 32 using the transmission/reception unit 232 .
- the timing at which the position detection unit 331 transmits the position information to the notification server 32 can be at least one of a predetermined cycle and detection of a predetermined event.
- the predetermined event may be a change of a connected base station or a cell when the terminal 33 uses cellular radio communication such as LTE or 5G.
- the predetermined event may be movement of the terminal 33 by a predetermined distance, a predetermined time, a request from the notification server 32, or the like.
- the position detection unit 331 is implemented by, for example, installing a notification application in the terminal 33 and causing the control unit of the terminal 33 to run the application. Since the processing executed by the output unit 231 through the storage unit 233 is the same as in (2A), the description is omitted. As an example, when the notification application is installed in the terminal 33, the output unit 231 may display the coping method and the location information of the accident site received from the notification server 32 as a push notification on the screen.
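The transmission triggers described above (a predetermined cycle, a change of the connected cell, movement by a predetermined distance, or a request from the notification server) can be sketched as follows. This is a minimal illustration only: the function and parameter names and the thresholds are assumptions, not part of the specification.

```python
import math

def haversine_m(a, b):
    """Great-circle distance in meters between (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371000 * math.asin(math.sqrt(h))

def should_send_position(now_s, last_sent_s, period_s,
                         prev_cell, cell,
                         prev_pos, pos, distance_m,
                         server_requested=False):
    """True when the terminal should report its position to the server."""
    if server_requested:                 # explicit request from the notification server
        return True
    if cell != prev_cell:                # connected base station or cell changed
        return True
    if now_s - last_sent_s >= period_s:  # predetermined cycle elapsed
        return True
    if haversine_m(prev_pos, pos) >= distance_m:  # moved a predetermined distance
        return True
    return False
```

Any one of the four conditions suffices, which matches the "at least one of a predetermined cycle and detection of a predetermined event" formulation above.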
- In this way, the location information acquisition unit 321 acquires location information from the terminals 33 of the notification target candidates, and the notification target detection unit 221 can detect notification targets based on the location information acquired by the location information acquisition unit 321.
- As a result, the notification server 32 can also detect notification target candidates that are not captured by the camera 20 and notify them of the coping method, so the number of notification destinations to which the coping method can be notified can be increased. Therefore, it can be expected that more people will cooperate in relief efforts.
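A minimal sketch of how the notification target detection unit 221 might select notification targets from the terminal positions reported to the location information acquisition unit 321. Positions use a simplified local metric frame and the radius is arbitrary; both are assumptions for illustration.

```python
def detect_targets(accident_pos, candidate_positions, radius_m):
    """candidate_positions: {terminal_id: (x_m, y_m)}; returns IDs inside the area."""
    ax, ay = accident_pos
    return sorted(tid for tid, (x, y) in candidate_positions.items()
                  if ((x - ax) ** 2 + (y - ay) ** 2) ** 0.5 <= radius_m)
```

For example, with a 500 m detection area around the accident site, a candidate 100 m away is detected while one 600 m away is not.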
- two or more types of candidates to be notified may be set.
- Two or more types of areas to be detected may be set.
- the variations described in the second embodiment can be applied as appropriate.
- Embodiment 4 In the fourth embodiment, an example will be described in which a terminal notified of a coping method controls whether or not to output the notified coping method. In the following description, differences from the third embodiment will be described, and descriptions of the same points as the third embodiment will be omitted as appropriate.
- FIG. 16 is a schematic diagram showing an example of a notification system.
- the notification system S4 includes cameras 20A-20N, an MEC server 21, a notification server 42, and terminals 43A-43C and 23D. Terminals 43A to 43C are collectively referred to as the terminal 43.
- The camera 20 and the MEC server 21 perform the same processing as in the third embodiment.
- FIG. 17 is a block diagram showing an example of the notification server 42 according to the fourth embodiment.
- the notification server 42 includes a notification area identification unit 421 instead of the notification target detection unit 221 of the notification server 22 according to the second embodiment.
- In the second and third embodiments, the notification server selects, as notification targets, candidates existing in a predetermined detection area from among the notification target candidates, and notifies those targets of the coping method. In contrast, the notification server 42 does not need to include the notification target detection unit 221 for detecting notification targets.
- the notification area specifying unit 421 sets the area R1 to be notified based on the location information of the accident site and the setting criteria for the first area stored in the storage unit 225.
- the details of this setting are as described above.
- the analysis unit 222 of the notification server 42 executes the same processing as in the second embodiment. Similarly to the second embodiment, the determination unit 223 also determines coping methods for the case where the notification target is of the first type (for example, medical personnel or their vehicles), the case where it is of the second type (for example, ordinary people), and the case where it is a medical institution.
- the transmitting/receiving unit 224 notifies all notification targets related to people and vehicles stored in the storage unit 225 of the coping method determined by the determination unit 223. Specifically, the transmitting/receiving unit 224 notifies the first type of notification targets, including the terminals 43A and 43C, of the coping method including the treatment information and information on the locations of AEDs existing in the predetermined area. Likewise, the transmitting/receiving unit 224 notifies the second type of notification targets, including the terminal 43B, of the coping method including the treatment information and information on the locations of AEDs existing in the predetermined area.
- the transmission/reception unit 224 notifies the terminal 23D of the coping method including the treatment for the person to be analyzed and a request to accept the person to be analyzed.
- the notification to the first and second types of notification targets may be performed by unicast communication to each terminal, by multicast communication to all notification targets of the first or second type, or by multicast communication in units of base stations or cells.
- the transmission/reception unit 224 notifies the terminals 43A to 43C and 23D to be notified of the information on the location of the accident site and the information on the area R1 set by the notification area specifying unit 421, together with the coping method.
- the storage unit 225 stores information necessary for the processing executed by the notification area specifying unit 421 through the transmitting/receiving unit 224.
- the storage unit 225 stores the following information.
- Face information of registrants for the analysis unit 222 to perform face authentication
- the method of setting the first area based on its setting criteria and the method of determining the predetermined organization to which the transmitting/receiving unit 224 notifies the coping method are as described in Embodiment 2. Further, the storage unit 225 does not need to store specific information for identifying a predetermined individual or vehicle.
- FIG. 18 is a block diagram showing an example of the terminal 43. The terminal 43 further includes an output control unit 431 in addition to the configuration of the terminal 33 according to the third embodiment.
- the transmitting/receiving unit 232 receives the coping method, the location information of the accident site, and the information regarding the area R1 transmitted from the transmitting/receiving unit 224 of the notification server 42 to each terminal 43.
- the output control unit 431 compares the latest position information of the own terminal detected by the position detection unit 331 with the received information about the area R1, and determines whether the own terminal is within the area R1.
- If the own terminal is within the area R1, the output control unit 431 controls the output unit 231 to output the coping method and the location information of the accident site received from the notification server 42.
- the output is, for example, an application push notification. Thereby, the user of the terminal 43 (that is, the notification target) can recognize the coping method and the location of the accident site.
- On the other hand, if the own terminal is outside the area R1, the output control unit 431 controls the output unit 231 not to output the coping method and the location information of the accident site received from the notification server 42. Therefore, the user of the terminal 43 is not presented with information such as the coping method.
- Other processes executed by the terminal 43 are the same as those in the third embodiment, so description thereof will be omitted.
- In this configuration, the notification server 42 does not need to specify the notification targets, so the notification server 42 can be configured more simply.
- While the coping method and the like are output to notification targets near the accident site, they are not output to notification targets far from the accident site. Therefore, the burden on notification targets far from the accident site can be reduced.
- two or more types of candidates to be notified may be set.
- two or more areas to be notified may be set.
- the notification area specifying unit 421 sets, based on the location information of the accident site and the setting criteria for the first and second areas stored in the storage unit 225, an area R1 in which the coping method is notified to the first type of notification target and an area R2 in which the coping method is notified to the second type of notification target. If the notification target is of the first type, the transmission/reception unit 224 notifies it of information about the area R1 together with the coping method and the like; if the notification target is of the second type, the transmission/reception unit 224 notifies it of information about the area R2 together with the coping method and the like.
- the transmitting/receiving unit 232 receives information on the coping method, the location information of the accident site, and the area R1 or R2 transmitted from the transmitting/receiving unit 224 of the notification server 42 to each terminal 43 .
- the output control unit 431 compares the latest location information of the own terminal detected by the location detection unit 331 with the received information on the area R1 or R2, and determines whether the own terminal is within the area R1 or R2.
- the output control unit 431 performs the above control according to the determination result.
- the storage unit 233 of the terminal 43 may store attribute information indicating whether the terminal itself is a notification target of the first type or the second type.
- Based on this attribute information, the output control unit 431 may determine which of the area R1 and the area R2 to use in the above-described determination regarding the position of the own terminal. As a result, for example, coping methods can be output to the terminals of medical personnel even at some distance from the accident site, while not being output to the terminals of volunteers and the like at the same distance.
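The attribute-dependent output decision above can be sketched as follows, assuming (purely for illustration) that each area is delivered as a circle of center and radius and that the first-type area R1 is wider than the second-type area R2; the specification does not fix the area representation.

```python
def should_output(terminal_type, own_pos, areas):
    """terminal_type: 1 (e.g., medical personnel) or 2 (e.g., ordinary people);
    areas: {type: ((cx_m, cy_m), radius_m)} received from the notification server."""
    (cx, cy), radius = areas[terminal_type]
    ox, oy = own_pos
    return ((ox - cx) ** 2 + (oy - cy) ** 2) ** 0.5 <= radius
```

With R1 = 2 km and R2 = 500 m around the accident site, a medical person's terminal 1 km away outputs the coping method while an ordinary person's terminal at the same distance does not.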
- the notification server 42 does not have to include the notification area specifying unit 421.
- the transmitting/receiving unit 224 notifies each notification target terminal 43 of the location information of the accident site and the criteria for setting the first area together with the coping method.
- the output control unit 431 sets the notification target area R1 based on the received location information of the accident site and the setting criteria for the first area. The output control unit 431 then compares the latest location information of the own terminal detected by the location detection unit 331 with the information on the area R1 to determine whether the own terminal is within the area R1. This also makes it possible to avoid outputting the coping method and the like to notification targets far from the accident site.
- the setting criteria for the first area may be stored in advance in the storage unit 233 of the terminal 43.
- the transmitting/receiving unit 224 does not need to notify the terminal 43 of the setting criteria for the first area together with the coping method and the location information of the accident site.
- For other variations of Embodiment 4, those described in Embodiments 2 and 3 can be applied as appropriate.
- Embodiment 5 (5A) In Embodiment 5, an example will be described in which a notification server uses a network configuration to notify a notification target near the accident site of a coping method. In the following description, differences from the second embodiment will be described, and descriptions of the same points as the second embodiment will be omitted as appropriate.
- the notification system according to Embodiment 5 is as shown in FIG. 4, and the configuration of each device according to FIG. 4 is as shown in FIGS.
- the notification server 22 executes processing different from that of the second embodiment as follows.
- the notification target detection unit 221 identifies one or more base stations that include the location of the accident site within its own cell.
- the notification target detection unit 221 performs this identification by comparing the location of the accident site with the area covered by the cell of each base station, which is associated with the identification information of that base station.
- the information on the area covered by the cell of each base station may be stored in the storage unit 225, or may be stored in another database or the like and acquired by the notification target detection unit 221 as needed. The notification target detection unit 221 then detects notification targets by treating all terminals under the identified base stations as notification targets.
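The two steps above — finding the base stations whose cells cover the accident site, then treating all terminals under them as notification targets — can be sketched as follows. Cells are modeled as circles and the data layout is an assumption for illustration only.

```python
def covering_stations(accident_pos, cells):
    """cells: {bs_id: ((cx_m, cy_m), radius_m)} describing each cell's coverage."""
    ax, ay = accident_pos
    return sorted(bs for bs, ((cx, cy), r) in cells.items()
                  if ((ax - cx) ** 2 + (ay - cy) ** 2) ** 0.5 <= r)

def terminals_under(bs_ids, attachments):
    """attachments: {bs_id: [terminal_id, ...]} as reported by the mobile network."""
    return [t for bs in bs_ids for t in attachments.get(bs, [])]
```

An accident 100 m from base station A's center (1 km cell) and 5 km from base station B yields only base station A, and every terminal attached to A becomes a notification target.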
- the analysis unit 222 executes the same processing as in the second embodiment.
- the determining unit 223 determines coping methods for the first type of candidate (non-medical personnel) and the second type of candidate (medical personnel).
- Note that the determination unit 223 may include, in the coping method, only one of the coping method for the first type of candidate and the coping method for the second type of candidate. The details are as described in the fourth embodiment. Also, when the notification target is the hospital D, the determination unit 223 determines the coping method as described in the second embodiment.
- the transmitting/receiving unit 224 notifies all terminals under the control of the base station specified by the notification target detecting unit 221 of the coping method determined by the determining unit 223 and the location information of the accident site by broadcast communication or multicast communication.
- For the broadcast or multicast communication, for example, eMBMS (evolved Multimedia Broadcast Multicast Service) or SC-PTM (Single Cell Point To Multipoint) may be used in an LTE network, and FeMBMS (Further evolved Multimedia Broadcast Multicast Service) may be used in a 5G network. The hospital D, which is also a notification target, is notified of the coping method and the location information of the accident site as described in the second embodiment.
- the storage unit 225 stores information necessary for the processing executed by the notification target detection unit 221 through the transmission/reception unit 224. Specifically, the following information is stored.
- Information on predetermined institutions that are notification candidates, and the associated terminal information of those institutions as notification destinations
- Face information of registrants for the analysis unit 222 to perform face authentication
- A list of location information where various medical devices (e.g., AEDs) are installed, and setting criteria for a predetermined area (third area) in which the locations of such devices are notified
- the storage unit 225 may store location information covered by the cell of each base station and map information of locations where the cameras 20 managed by the notification server 22 are arranged.
- the terminal 23 that has received the coping method and the location information of the accident site from the notification server 22 outputs the information to a display or the like, as described in the second embodiment.
- In this way, the notification server 22 can notify a large and unspecified number of users having terminals under the control of the base station of the coping method and the like. Therefore, the probability that an injured person or a suddenly ill person will be rescued can be increased.
- the one or more base stations identified by the notification target detection unit 221 may be macrocell base stations having the largest cell size, may be base stations with smaller cell sizes (for example, microcell, picocell, or femtocell base stations), or may include both.
- the notification target detection unit 221 may identify not only the base stations containing the location of the accident site within their own cells but also nearby base stations such as adjacent base stations. For other variations as well, those described in the previous embodiments can be applied as appropriate.
- Instead of treating all terminals under the identified base station as notification targets, the notification target detection unit 221 may acquire information on terminals under the base station that are provided with service by a specific mobile network operator (communication carrier), and detect those terminals as notification targets.
- For example, in an LTE network, by using the HSS (Home Subscriber Server) and the MME (Mobility Management Entity), the notification target detection unit 221 can acquire information on terminals that are under the control of a specific base station and that are provided with service by a specific communication carrier. In a 5G network, the UDM (Unified Data Management), the AMF (Access and Mobility Management Function), and the NEF (Network Exposure Function) can be used in the same manner.
- the analysis unit 222 and the determination unit 223 execute the same processing as (5A).
- the transmitting/receiving unit 224 notifies all the terminals specified by the notification target detecting unit 221 of the coping method determined by the determining unit 223 and the location information of the accident site.
- This notification method may be communication by unicast, or may use an emergency call system or the like in a mobile network.
- In this way, too, the notification server 22 can notify an unspecified number of users having terminals under the control of the base station of the coping method and the like.
- In the following example, the first type of notification target is, for example, non-medical personnel, and the second type of notification target is, for example, medical personnel.
- FIG. 19 shows a configuration example of a cell of a base station for explaining areas to be notified.
- the notification target detection unit 221 identifies the base station A that includes the location of the accident site within the cell A of its own station.
- the notification target detection unit 221 identifies cells B to G adjacent to cell A, and further identifies the base stations B to G (base stations located in the vicinity of base station A) that form those cells.
- the notification target detection unit 221 uses the location information covered by the cell of each base station to perform this base station identification process.
- the storage unit 225 stores identification information of the terminals of the first and second types of candidates, together with the type of each notification target (information indicating whether the candidate is of the first type or the second type). The storage unit 225 also stores setting criteria for the first area, in which the first type of candidate is notified of the coping method, and setting criteria for the second area, in which the second type of candidate is notified of the coping method.
- the notification target detection unit 221 treats, as notification targets, the first type of candidate terminals that are accommodated in cell A and stored in the storage unit 225, and the second type of candidate terminals that are accommodated in cells A to G and stored in the storage unit 225.
- the notification target detection unit 221 uses the setting criteria for each area to identify cell A as the first area and cells A to G as the second area.
- the notification target detection unit 221 acquires identification information (for example, a telephone number) of a terminal connected to each base station from an LTE or 5G network device.
- the notification target detection unit 221 uses the acquired identification information of each terminal to determine whether the information of that terminal is stored in the storage unit 225 and, if so, identifies what type of notification target the terminal is.
- Based on the identified notification target types, the notification target detection unit 221 can detect, for example, a first type candidate terminal accommodated in cell A and a second type candidate terminal accommodated in cell B as notification targets.
- the method of notifying these notification targets of the coping method is as described above.
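The per-type detection walked through above — first-type candidates are targets only within cell A (the first area), second-type candidates within any of cells A to G (the second area) — can be sketched as follows. The registry layout and identifiers are illustrative assumptions.

```python
FIRST_AREA = {"A"}
SECOND_AREA = {"A", "B", "C", "D", "E", "F", "G"}

def detect_by_type(connected_cell, registry):
    """connected_cell: {terminal_id: cell_id}; registry: {terminal_id: 1 or 2}."""
    targets = []
    for tid, cell in connected_cell.items():
        kind = registry.get(tid)  # terminals not stored as candidates are ignored
        if (kind == 1 and cell in FIRST_AREA) or (kind == 2 and cell in SECOND_AREA):
            targets.append(tid)
    return sorted(targets)
```

A first-type candidate in cell A and a second-type candidate in cell B are both detected, while a first-type candidate in cell B is not, matching the example above.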
- the notification target detection unit 221 can perform similar processing not only in (5A) but also in (5B).
- In this case, the notification target detection unit 221 acquires identification information of terminals that are under the control of a specific base station and are provided with service by a specific communication carrier, and identifies what type of notification target each terminal indicated by the identification information is. Accordingly, the notification target detection unit 221 can detect each type of candidate terminal as a notification target.
- For other variations, those described in the second to fourth embodiments can be applied as appropriate.
- Embodiment 6 (6A) In Embodiments 2 to 4, when the coping method is notified by unicast communication, the notification server 22 may change the detection area according to whether the candidate terminal to be notified is owned by an individual or mounted in a vehicle.
- Non-medical personnel and their vehicles were detected as notification targets by the notification target detection unit 221 when located within the area R1, but not when located outside it. However, the notification target detection unit 221 may detect a vehicle of a non-medical person as a notification target even when it is located within a predetermined area wider than the area R1. For example, if the area R2 (see FIG. 12) is set as the detection area for vehicles of non-medical personnel, the notification target detection unit 221 can also detect the vehicle NG as a notification target.
- When the terminal of a notification target candidate is mounted in a vehicle and the vehicle is located within such a predetermined area, the notification target detection unit 221 may detect the vehicle as a notification target. Whether a notification target candidate is an individual or a vehicle is included in the terminal information of the candidate stored in the storage unit 225.
- the storage unit 225 also stores setting criteria for the detection area for the case where the notification target candidate is a vehicle. The notification target detection unit 221 can use these pieces of information to set detection areas and detect notification targets.
- the notification target detection unit 221 may set the detection areas so that they become wider in the order of: the detection area for vehicles of medical personnel (widest), the detection area for vehicles of non-medical personnel, the detection area for individual medical personnel, and the detection area for individual non-medical personnel. In this order, the "detection area for vehicles of non-medical personnel" and the "detection area for individual medical personnel" may be swapped, or both may be areas of the same size.
- In this way, the notification server 22 can set different detection areas depending on whether the notification target candidate is an individual or a vehicle. If the notification target is in a vehicle that can move faster than on foot, the target can be expected to arrive in time to rescue the injured or suddenly ill person even from some distance away from the accident site. Therefore, by setting the detection area for terminals mounted in vehicles wider than that for terminals owned by individuals, the cooperation of more people can be obtained.
- In (6A), the detection area was changed according to whether the notification target was an individual or a vehicle, but the notification target candidates may also be divided into three or more types, with a detection area set for each.
- For example, the notification target detection unit 221 may divide the notification target candidates into individuals, bicycles, and automobiles, and set progressively wider detection areas in that order (widest for automobiles).
- the same detection region may be set regardless of whether the notification target candidate is a non-medical person or a medical person. The same is true when the notification target candidate is a vehicle.
- In this case, the "first type of candidate" may be an individual, and the "second type of candidate" may be a vehicle.
- the location information acquisition unit 321 of the notification server 32 may generate a movement history of the user of each terminal (that is, the notification target candidate) based on the location information acquired at predetermined timings. Based on this movement history, the notification target detection unit 221 may analyze the means of transportation (walking, automobile, etc.) and average speed of each candidate, and set the detection area accordingly.
- For example, suppose the notification target detection unit 221 determines, based on the movement history, that the terminal NH owned by a certain medical person H has moved at an average speed of 30 km/h during the period from the present (when this processing of the notification target detection unit 221 is executed) back to a predetermined past time.
- the storage unit 225 stores "15 km/h" as the threshold value of the average speed.
- In this case, the notification target detection unit 221 determines that the average speed of the terminal NH is equal to or higher than the threshold, and thus estimates that the terminal NH (that is, the medical person H) is moving by car. Therefore, the notification target detection unit 221 can set the detection area of the terminal NH to an area for vehicles that is wider than the area R2 for individual medical personnel.
- On the other hand, if the terminal NH has moved at an average speed of 5 km/h during the period from the present back to the predetermined past time, the notification target detection unit 221 determines that the average speed of the terminal NH is less than the threshold. In this case, the notification target detection unit 221 presumes that the terminal NH is moving on foot, and sets the detection area of the terminal NH to the area R2 for individual medical personnel. Note that the notification target detection unit 221 can change the detection area by the same logic for terminals owned by non-medical personnel and terminals mounted in vehicles.
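The worked example above (15 km/h threshold, 30 km/h by car versus 5 km/h on foot) can be sketched as follows. The straight-line distance between the oldest and newest history points is used as a simplification, and the two radii are placeholder values, not values from the specification.

```python
def average_speed_kmh(history):
    """history: list of (t_seconds, x_m, y_m), oldest first."""
    (t0, x0, y0), (t1, x1, y1) = history[0], history[-1]
    dist_km = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / 1000
    return dist_km / ((t1 - t0) / 3600)

def detection_radius_m(history, threshold_kmh=15, on_foot_m=2000, by_car_m=10000):
    """Wider detection area when the candidate appears to be moving by car."""
    return by_car_m if average_speed_kmh(history) >= threshold_kmh else on_foot_m
```

A terminal that covered 30 km in one hour gets the wider vehicle radius; one that covered 5 km gets the on-foot radius.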
- In this way, the notification server 22 can set a wider detection area when the notification target candidate is in a state in which it can move quickly. Therefore, the cooperation of more people can be obtained for relief.
- the detection area may be changed in three steps or more instead of in two steps.
- the same detection area may be set regardless of whether they are non-medical personnel or medical personnel.
- the notification device 10 according to Embodiment 1 or the notification server according to Embodiments 2 to 6 may notify the notification target terminal of detailed information about the person to be analyzed (information on the person's physical condition, at least part of the personal information, etc.) together with the coping method described above. Further, instead of or in addition to the detailed information, an image of the person to be analyzed captured by the camera may be sent to the notification target terminal together with the coping method, as a still image or a moving image. As a result, the user who views the terminal can know the condition of the person to be analyzed, so more appropriate relief can be provided.
- the notification device 10 according to Embodiment 1 or the notification server according to Embodiments 2 to 6 may further send, to the terminal notified of the coping method, a questionnaire asking for detailed information about the accident (for example, eyewitness information).
- By receiving the answers to the questionnaire from the terminals, the notification device 10 or the notification server can store detailed information on the accident. This information can be shared with various parties involved in relief, such as the hospital to which the person to be analyzed was transported, emergency response departments, fire departments, and the police.
- Further, the analysis unit 222 may determine the severity of the accident based on at least one of the degree of injury or illness of the persons to be analyzed and the number of persons to be analyzed (injured or suddenly ill persons).
- the notification target detection unit 221 can change the detection area based on the severity.
- the storage unit 225 may store "four people" as the threshold for the severity of the accident.
- For example, if the analysis unit 222 determines that one person was injured in the accident, the number of injured persons is less than the threshold, so the analysis unit 222 determines the severity of the accident to be "low".
- In this case, the notification target detection unit 221 leaves the detection area as the area R1.
- On the other hand, if the severity of the accident is determined to be "high", the notification target detection unit 221 sets the detection area to a predetermined area (for example, the area R2) wider than the area R1.
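The severity determination and area choice in the example above (a threshold of four injured persons) can be sketched as follows; the radii standing in for R1 and R2 are placeholder values.

```python
def accident_severity(num_injured, threshold=4):
    """Severity is "high" when the number of injured reaches the threshold."""
    return "high" if num_injured >= threshold else "low"

def detection_area_m(severity, r1_m=1000, r2_m=3000):
    """Use the wider area R2 for a "high"-severity accident, otherwise R1."""
    return r2_m if severity == "high" else r1_m
```

With one injured person the severity is "low" and the detection area stays at R1; with five it becomes "high" and widens to R2.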
- Similar processing can be performed in the third embodiment.
- When a plurality of detection areas are set, the notification target detection unit 221 can perform a similar process on at least one of them based on the severity of the accident.
- In the fifth embodiment, for example, when the severity of the accident is low, the notification target detection unit 221 identifies one or more base stations whose own cells include the location of the accident site.
- When the severity of the accident is high, the notification target detection unit 221 identifies not only the base stations whose cells include the location of the accident site, but also nearby base stations such as a plurality of adjacent base stations (for example, macrocell base stations).
- In this way, the notification target detection unit 221 can change the detection area based on the severity of the accident, so the higher the severity of the accident, the more collaborators can be gathered. Therefore, the probability that an appropriate initial response is taken even for a serious accident can be increased.
- the severity of the accident may be determined in three or more stages, and the detection area may be changed in three or more stages accordingly.
- When the number of notification targets existing in the initially set detection area is less than a predetermined threshold, the notification target detection unit 221 may widen the detection area from the initial one and detect notification target candidates existing in the widened area as new notification targets.
- the predetermined threshold used for this determination is stored in the storage unit 225.
- the notification target detection unit 221 can repeat the process of expanding the detection area until the number of detected notification targets reaches or exceeds the predetermined threshold.
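The repeated expansion above can be sketched as a simple loop. The growth step and upper bound are assumptions for illustration; the specification only requires repeating until the threshold is met.

```python
def expand_until_enough(accident_pos, candidate_positions, needed,
                        start_m=1000, step_m=1000, max_m=10000):
    """Widen the detection radius stepwise until `needed` targets are found."""
    ax, ay = accident_pos
    radius, targets = start_m, []
    while radius <= max_m:
        targets = [tid for tid, (x, y) in sorted(candidate_positions.items())
                   if ((x - ax) ** 2 + (y - ay) ** 2) ** 0.5 <= radius]
        if len(targets) >= needed:
            break
        radius += step_m
    return min(radius, max_m), targets
```

With candidates at 500 m, 2.5 km, and 9 km and a threshold of two targets, the radius grows from 1 km to 3 km before two candidates are covered.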
- the notification target detection unit 221 can execute this processing regardless of which of the methods described in the embodiments is used to specify the notification target, that is, the video data from the camera, the position information from the terminal 23, or the base station information.
- the notification server 22 can reliably secure personnel who can participate in the initial response to the accident.
- the notification device 10 may notify coping methods to terminals other than those of individuals, vehicles, and institutions.
- For example, based on the location information of the accident site and map information indicating the positions of one or more digital signage terminals stored in the storage unit 225, the notification server 22 may notify the digital signage terminals within a predetermined area including the accident site of the coping method for the first type of notification target (non-medical personnel) and the location information of the accident site.
- Upon receiving the notification, the digital signage terminal outputs information on how to deal with the accident and the location of the accident site on its screen or by voice. A passerby who sees or hears the digital signage terminal is expected to take initial response measures for the injured or suddenly ill person.
- the setting criterion for the predetermined area may be stored in the storage unit 225.
- this notification to the digital signage terminals may be made when the number of individual and vehicle notification targets existing in the detection area initially set by the notification target detection unit 221 is less than a predetermined threshold. Moreover, based on the severity of the accident determined by the analysis unit 222, the predetermined area containing the digital signage terminals to be notified can be expanded as the severity increases.
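One way to realize the severity-dependent signage selection above is to map each severity level to a radius and pick the terminals inside it. The severity labels and radii below are illustrative assumptions, not values from the disclosure.

```python
import math

# Illustrative mapping from the severity level determined by the analysis
# unit 222 to the radius of the predetermined area (values are assumptions).
SEVERITY_RADIUS_M = {"low": 300.0, "medium": 600.0, "high": 1200.0}

def haversine_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters between two WGS84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def signage_to_notify(site, signage_terminals, severity):
    """Select the digital signage terminals inside the severity-scaled area.

    site: (lat, lon) of the accident; signage_terminals: list of dicts
    with 'lat'/'lon' taken from the map information in the storage unit.
    """
    radius = SEVERITY_RADIUS_M[severity]
    return [s for s in signage_terminals
            if haversine_m(site[0], site[1], s["lat"], s["lon"]) <= radius]
```

A more severe accident thus reaches signage terminals further from the site, matching the expansion described in the text.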
- the notification server 22 may notify the coping method and the location information of the accident site via disaster-prevention radio speakers, in the same manner as in the digital signage example. The speakers output information about the coping method and the location of the accident site to nearby people by voice.
- the notification target detection unit 221 may exclude notification target candidates that satisfy a certain condition from the notification targets based on the map information stored in the storage unit 225 .
- for example, when a notification target candidate near the accident site is on a highway or on a railway track (i.e., the candidate is aboard a vehicle or a train), that person cannot immediately rush to the scene of the accident. The notification target detection unit 221 detects such a case by checking the location of the notification target candidate against the map information, and excludes that candidate from selection as a notification target.
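A minimal sketch of this exclusion step, assuming the map information exposes a lookup from a position to a road or track class (the class names and the lookup interface are hypothetical, not from the disclosure):

```python
# Hypothetical road classes a map-information lookup might return.
EXCLUDED_CLASSES = {"highway", "railway"}

def filter_candidates(candidates, map_lookup):
    """Drop candidates whose current position lies on a highway or rail track.

    map_lookup(lat, lon) -> road-class string such as 'highway',
    'railway', or 'local_road'. A candidate on an excluded class cannot
    immediately rush to the scene, so it is removed from
    notification-target selection.
    """
    return [c for c in candidates
            if map_lookup(c["lat"], c["lon"]) not in EXCLUDED_CLASSES]
```

In practice the lookup would query the map information in the storage unit 225; here a simple dictionary stands in for it.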
- the transmission/reception unit 224 may extract information on parking spaces near the accident site from the map information stored in the storage unit 225 and notify the terminal 23 of the vehicle of it together with the coping method. As a result, when the user of the vehicle approaches the site of the accident, the user can immediately park the vehicle and head to the scene, enabling quicker aid.
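A nearest-parking lookup of the kind just described could be sketched as follows; the distance function and the data layout of the map information are assumptions for illustration.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters between two WGS84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_parking(site, parking_spaces):
    """Return the parking space from the map information closest to the
    accident site, to be sent to the vehicle together with the coping method."""
    return min(parking_spaces,
               key=lambda p: haversine_m(site[0], site[1], p["lat"], p["lon"]))
```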
- in the above embodiments, this disclosure has been described as a hardware configuration, but this disclosure is not limited to this.
- this disclosure can also implement the processing (steps) of any of the devices described in the above embodiments (the notification device, notification server, MEC server, camera, or terminal) by causing a processor in a computer to execute a computer program.
- FIG. 20 is a block diagram showing a hardware configuration example of an information processing device (signal processing device) in which the processing of each embodiment described above is executed.
- this information processing device 90 includes a signal processing circuit 91 , a processor 92 and a memory 93 .
- the signal processing circuit 91 is a circuit for processing signals under the control of the processor 92 .
- the signal processing circuit 91 may include a communication circuit that receives signals from the transmitting device.
- the processor 92 reads out software (computer program) from the memory 93 and executes it, thereby performing the processing of the device described in the above embodiment.
- as the processor 92, one of a CPU (Central Processing Unit), MPU (Micro Processing Unit), FPGA (Field-Programmable Gate Array), DSP (Digital Signal Processor), and ASIC (Application Specific Integrated Circuit) may be used, or a plurality of them may be used in parallel.
- the memory 93 is composed of a volatile memory, a nonvolatile memory, or a combination thereof.
- the number of memories 93 is not limited to one, and a plurality of memories may be provided.
- the volatile memory may be RAM (Random Access Memory) such as DRAM (Dynamic Random Access Memory) or SRAM (Static Random Access Memory).
- the non-volatile memory may be, for example, ROM (Read Only Memory) such as PROM (Programmable ROM) or EPROM (Erasable Programmable ROM), or an SSD (Solid State Drive).
- the memory 93 is used to store one or more instructions.
- one or more instructions are stored in memory 93 as a group of software modules.
- the processor 92 can perform the processing described in the above embodiments by reading out and executing these software module groups from the memory 93 .
- the memory 93 may include memory built into the processor 92 in addition to memory provided outside the processor 92.
- the memory 93 may include storage located remotely from the processor 92.
- in this case, the processor 92 can access the memory 93 via an I/O (Input/Output) interface.
- the processors included in each device in the above-described embodiments execute one or more programs containing instructions for causing a computer to execute the algorithms described with reference to the drawings. By this processing, the signal processing method described in each embodiment can be realized.
- Non-transitory computer readable media include various types of tangible storage media.
- Examples of non-transitory computer-readable media include magnetic recording media (e.g., flexible disks, magnetic tapes, hard disk drives), magneto-optical recording media (e.g., magneto-optical disks), CD-ROM (Read Only Memory), CD-R, CD-R/W, and semiconductor memory (e.g., mask ROM, PROM (Programmable ROM), EPROM (Erasable PROM), flash ROM, RAM (Random Access Memory)).
- the program may also be delivered to the computer on various types of transitory computer-readable media. Examples of transitory computer-readable media include electrical signals, optical signals, and electromagnetic waves. A transitory computer-readable medium can deliver the program to the computer via a wired channel, such as an electric wire or optical fiber, or via a wireless channel.
- (Appendix 2)
Further comprising notification target detection means for detecting the notification target existing in a first area including the location where the accident occurred,
wherein the notification means notifies the notification target detected by the notification target detection means of the coping method.
The notification device according to appendix 1.
- (Appendix 3)
The notification target detection means detects that a first type of candidate, which is a candidate for the notification target, exists in the first area, and that a second type of candidate, which is a candidate for the notification target, exists in a second area that is wider than the first area and includes the location where the accident occurred, and
the notification means notifies the coping method, taking as the notification target at least one of the first type of candidates existing in the first area detected by the notification target detection means and the second type of candidates existing in the second area detected by the notification target detection means.
The notification device according to appendix 2.
- (Appendix 4)
The notification target includes at least one of a person and a vehicle.
The notification device according to appendix 2 or 3.
- (Appendix 5)
The determination means performs, as the determination of the coping method, at least one of determination of first aid or treatment for the person to be analyzed and identification of the location of a medical device to be used in the first aid or treatment that exists in a third area including the location where the accident occurred, and
the notification means includes at least one of the determined first aid or treatment and the identified location of the medical device in the coping method and notifies the notification target of it.
The notification device according to any one of appendices 2 to 4.
- (Appendix 6)
The notification target detection means detects a candidate for the notification target as the notification target by determining that the candidate appears in the video, and
the notification means notifies a terminal associated with the notification target detected by the notification target detection means of the coping method.
The notification device according to any one of appendices 2 to 5.
- (Appendix 7)
Further comprising position information acquisition means for acquiring position information of a terminal from the terminal possessed by a candidate for the notification target,
wherein the notification target detection means detects the notification target based on the position information acquired by the position information acquisition means, and
the notification means notifies the terminal of the notification target detected by the notification target detection means of the coping method.
The notification device according to any one of appendices 2 to 5.
- (Appendix 8)
The notification target detection means detects the notification target by identifying one or more base stations whose own cells include the location where the accident occurred, and
the notification means notifies the terminal of the notification target detected by the notification target detection means of the coping method.
The notification device according to any one of appendices 2 to 5.
- (Appendix 9)
The notification target includes at least a medical or emergency institution,
the determination means determines the coping method so as to include at least one of treatment for the person to be analyzed, a request to dispatch staff to the location where the accident occurred, a request to accept the person to be analyzed, and preparation for treatment, and
the notification means notifies the institution of the coping method.
The notification device according to appendix 1 or 2.
- (Appendix 10)
The analysis means identifies the person to be analyzed from among registrants based on an authentication result of face authentication using a face image of the person to be analyzed appearing in the video and face information of the registrants, and derives personal information associated with the registrant as the detailed information, and
the determination means determines the coping method based on the personal information.
The notification device according to any one of appendices 1 to 9.
- (Appendix 11)
The personal information includes at least one of past medical history, current illness, pregnancy status, age, gender, blood type, allergy information, personal number, regular medical institution, and emergency contact information.
The notification device according to appendix 10.
- (Appendix 12)
The notification means determines the notification target based on the personal information and notifies the determined notification target.
The notification device according to appendix 10 or 11.
- (Appendix 13)
The detailed information includes at least one of consciousness state, respiratory state, bleeding state, fracture state, burn state, impact site, convulsive state, walking state, heartbeat state, pulse state, and body temperature state.
The notification device according to any one of appendices 1 to 12.
- (Appendix 14)
A notification system comprising:
one or more cameras;
a notification device connected to the one or more cameras; and
one or more terminals that receive notifications from the notification device,
wherein the notification device includes:
accident occurrence detection means for detecting the occurrence of an accident from video captured by the one or more cameras;
analysis means for analyzing the video and deriving detailed information on a person to be analyzed involved in the accident;
determination means for determining, based on the detailed information, a coping method for the person to be analyzed; and
notification means for notifying the one or more terminals of the determined coping method.
- (Appendix 15)
The one or more cameras are provided at at least one of a traffic light, a roadside unit, an intersection, and a railroad crossing.
The notification system according to appendix 14.
Abstract
Description
(1A)
Hereinafter, Embodiment 1 of the present disclosure will be described with reference to the drawings. Section (1A) describes a notification device that detects the occurrence of an accident and notifies a notification target.
Next, section (1B) describes a notification system including the notification device 10.
(2A)
Hereinafter, Embodiment 2 of the present disclosure will be described with reference to the drawings. Section (2A) describes a specific example of (1B) of Embodiment 1 described above.
(i) The detection model used by the accident occurrence detection unit 211, the map information, and the setting criterion for the first area
(ii) Position information of each camera 20, associated with the ID of the camera 20
(iii) Video data captured by the cameras 20A to 20N during a predetermined period, which is subject to detection by the accident occurrence detection unit 211
(iv) The values of the timings t1 and t2 for extracting video data, or the criteria for setting those timings
(i) Information on predetermined institutions, individuals, and vehicles that are candidates for notification targets, and terminal information of the institutions, individuals, and vehicles serving as notification destinations associated with each piece of that information
(ii) Identification information specifying a predetermined individual or vehicle, used by the notification target detection unit 221 to detect a notification target from video data
(iii) Face information of registrants used by the analysis unit 222 to perform face authentication
(iv) A list of the positions where various medical devices (e.g., AEDs) are installed, and the setting criterion for a predetermined area (third area) serving as the notification range for the locations where the medical devices exist
However, the storage unit 225 may further store one or more predetermined institutions registered in advance as notification targets serving as emergency response institutions.
Section (2B) describes variations different from (2A). The following description covers the differences from (2A); description of the points shared with (2A) is omitted as appropriate.
Embodiment 3 describes an example in which the notification server detects a notification target by acquiring position information from a terminal possessed by a candidate for the notification target. The following description covers the differences from Embodiment 2; description of the points shared with Embodiment 2 is omitted as appropriate.
Embodiment 4 describes an example in which a terminal notified of a coping method controls whether or not to output the notified coping method. The following description covers the differences from Embodiment 3; description of the points shared with Embodiment 3 is omitted as appropriate.
(i) Information on predetermined institutions, individuals, and vehicles that are notification targets, and terminal information of the institutions, individuals, and vehicles serving as notification destinations associated with each piece of that information
(ii) Face information of registrants used by the analysis unit 222 to perform face authentication
(iii) A list of the positions where various medical devices (e.g., AEDs) are installed, and the setting criterion for a predetermined area (third area) subject to notification of the locations where the medical devices exist
(iv) The setting criterion for the first area
The method by which the transmission/reception unit 224 determines, based on the information in (i), the predetermined institutions to be notified of the coping method is as described in Embodiment 2. In addition, the storage unit 225 need not store identification information specifying a predetermined individual or vehicle.
(5A)
Embodiment 5 describes an example in which the notification server uses the configuration of the network to notify notification targets near the accident site of the coping method. The following description covers the differences from Embodiment 2; description of the points shared with Embodiment 2 is omitted as appropriate.
(i) Information on predetermined institutions that are candidates for notification targets, and terminal information of the institutions serving as the associated notification destinations
(ii) Face information of registrants used by the analysis unit 222 to perform face authentication
(iii) A list of the positions where various medical devices (e.g., AEDs) are installed, and the setting criterion for a predetermined area (third area) subject to notification of the locations where the medical devices exist
Rather than treating all terminals under the identified base stations as notification targets, the notification target detection unit 221 may acquire information on terminals that are under those base stations and are served by a specific mobile network operator (communication carrier), and detect those terminals as notification targets. For example, in a 4G network, the HSS (Home Subscriber Server) in the core network manages the carrier's subscriber information, and the MME (Mobility Management Entity) in the core network manages terminal location information and handovers between base stations (user mobility management). Therefore, by communicating with the HSS and the MME, the notification target detection unit 221 can acquire information on the terminals that are under a specific base station and are served by a specific carrier.
Also in Embodiment 5, two or more types of candidates for notification targets may be set, and correspondingly, two or more types of areas in which the notification target detection unit 221 treats each type of candidate as a notification target may be set. The first type of notification target is, for example, non-medical personnel, and the second type of notification target is, for example, medical personnel.
(6A)
In Embodiments 2 to 4, when the coping method is notified by unicast communication, the notification server 22 may change the area within which a candidate is treated as a notification target depending on whether the candidate terminal is possessed by an individual or mounted on a vehicle.
In Embodiment 3, the position information acquisition unit 321 of the notification server 32 may generate a movement history of the user of a terminal (i.e., a candidate for the notification target) based on position information acquired at predetermined timings. Based on this movement history, the notification target detection unit 221 may analyze the candidate's means of movement (walking, driving, etc.) and average speed, and set the detection area accordingly.
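The movement-history analysis described here could, under simple assumptions (straight-line distances between samples and fixed speed cutoffs, both illustrative rather than from the disclosure), look like:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters between two WGS84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def classify_movement(history):
    """Estimate average speed (m/s) from a movement history given as a list
    of (timestamp_s, lat, lon) samples and guess the means of movement.
    The speed cutoffs are assumptions."""
    if len(history) < 2:
        return 0.0, "stationary"
    dist = sum(haversine_m(a[1], a[2], b[1], b[2])
               for a, b in zip(history, history[1:]))
    dt = history[-1][0] - history[0][0]
    speed = dist / dt if dt > 0 else 0.0
    if speed < 0.5:
        mode = "stationary"
    elif speed < 3.0:
        mode = "walking"
    else:
        mode = "vehicle"
    return speed, mode

def detection_radius_m(mode):
    """Set a wider detection area for faster candidates (illustrative values)."""
    return {"stationary": 300.0, "walking": 500.0, "vehicle": 2000.0}[mode]
```

A candidate classified as being in a vehicle is then given a wider detection area, since they can reach the accident site from further away.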
(Appendix 1)
A notification device comprising:
accident occurrence detection means for detecting the occurrence of an accident from video captured by one or more cameras;
analysis means for analyzing the video and deriving detailed information on a person to be analyzed involved in the accident;
determination means for determining, based on the detailed information, a coping method for the person to be analyzed; and
notification means for notifying a notification target of the determined coping method.
(Appendix 2)
Further comprising notification target detection means for detecting the notification target existing in a first area including the location where the accident occurred,
wherein the notification means notifies the notification target detected by the notification target detection means of the coping method.
The notification device according to appendix 1.
(Appendix 3)
The notification target detection means detects that a first type of candidate, which is a candidate for the notification target, exists in the first area, and that a second type of candidate, which is a candidate for the notification target, exists in a second area that is wider than the first area and includes the location where the accident occurred, and
the notification means notifies the coping method, taking as the notification target at least one of the first type of candidates existing in the first area detected by the notification target detection means and the second type of candidates existing in the second area detected by the notification target detection means.
The notification device according to appendix 2.
(Appendix 4)
The notification target includes at least one of a person and a vehicle.
The notification device according to appendix 2 or 3.
(Appendix 5)
The determination means performs, as the determination of the coping method, at least one of determination of first aid or treatment for the person to be analyzed and identification of the location of a medical device to be used in the first aid or treatment that exists in a third area including the location where the accident occurred, and
the notification means includes at least one of the determined first aid or treatment and the identified location of the medical device in the coping method and notifies the notification target of it.
The notification device according to any one of appendices 2 to 4.
(Appendix 6)
The notification target detection means detects a candidate for the notification target as the notification target by determining that the candidate appears in the video, and
the notification means notifies a terminal associated with the notification target detected by the notification target detection means of the coping method.
The notification device according to any one of appendices 2 to 5.
(Appendix 7)
Further comprising position information acquisition means for acquiring position information of a terminal from the terminal possessed by a candidate for the notification target,
wherein the notification target detection means detects the notification target based on the position information acquired by the position information acquisition means, and
the notification means notifies the terminal of the notification target detected by the notification target detection means of the coping method.
The notification device according to any one of appendices 2 to 5.
(Appendix 8)
The notification target detection means detects the notification target by identifying one or more base stations whose own cells include the location where the accident occurred, and
the notification means notifies the terminal of the notification target detected by the notification target detection means of the coping method.
The notification device according to any one of appendices 2 to 5.
(Appendix 9)
The notification target includes at least a medical or emergency institution,
the determination means determines the coping method so as to include at least one of treatment for the person to be analyzed, a request to dispatch staff to the location where the accident occurred, a request to accept the person to be analyzed, and preparation for treatment, and
the notification means notifies the institution of the coping method.
The notification device according to appendix 1 or 2.
(Appendix 10)
The analysis means identifies the person to be analyzed from among registrants based on an authentication result of face authentication using a face image of the person to be analyzed appearing in the video and face information of the registrants, and derives personal information associated with the registrant as the detailed information, and
the determination means determines the coping method based on the personal information.
The notification device according to any one of appendices 1 to 9.
(Appendix 11)
The personal information includes at least one of past medical history, current illness, pregnancy status, age, gender, blood type, allergy information, personal number, regular medical institution, and emergency contact information.
The notification device according to appendix 10.
(Appendix 12)
The notification means determines the notification target based on the personal information and notifies the determined notification target.
The notification device according to appendix 10 or 11.
(Appendix 13)
The detailed information includes at least one of consciousness state, respiratory state, bleeding state, fracture state, burn state, impact site, convulsive state, walking state, heartbeat state, pulse state, and body temperature state.
The notification device according to any one of appendices 1 to 12.
(Appendix 14)
A notification system comprising:
one or more cameras;
a notification device connected to the one or more cameras; and
one or more terminals that receive notifications from the notification device,
wherein the notification device includes:
accident occurrence detection means for detecting the occurrence of an accident from video captured by the one or more cameras;
analysis means for analyzing the video and deriving detailed information on a person to be analyzed involved in the accident;
determination means for determining, based on the detailed information, a coping method for the person to be analyzed; and
notification means for notifying the one or more terminals of the determined coping method.
(Appendix 15)
The one or more cameras are provided at at least one of a traffic light, a roadside unit, an intersection, and a railroad crossing.
The notification system according to appendix 14.
(Appendix 16)
A notification method executed by a notification device, comprising:
an accident occurrence detection step of detecting the occurrence of an accident from video captured by one or more cameras;
an analysis step of analyzing the video and deriving detailed information on a person to be analyzed involved in the accident;
a determination step of determining, based on the detailed information, a coping method for the person to be analyzed; and
a notification step of notifying a notification target of the determined coping method.
(Appendix 17)
A non-transitory computer-readable medium storing a program that causes a computer to execute:
an accident occurrence detection step of detecting the occurrence of an accident from video captured by one or more cameras;
an analysis step of analyzing the video and deriving detailed information on a person to be analyzed involved in the accident;
a determination step of determining, based on the detailed information, a coping method for the person to be analyzed; and
a notification step of notifying a notification target of the determined coping method.
10 Notification device
101 Accident occurrence detection unit 102 Analysis unit
103 Determination unit 104 Notification unit
11 Camera
12 Terminal
20 Camera
201 Imaging unit 202 Transmission/reception unit
203 Storage unit
21 MEC server
211 Accident occurrence detection unit 212 Transmission/reception unit
213 Storage unit
22, 32, 42, 52 Notification server
221 Notification target detection unit 222 Analysis unit
223 Determination unit 224 Transmission/reception unit
225 Storage unit 321 Position information acquisition unit
421 Notification area identification unit
23, 33, 43, 53 Terminal
231 Output unit 232 Transmission/reception unit
233 Storage unit 231 Position detection unit
331 Position detection unit 431 Output control unit
90 Information processing device
91 Signal processing circuit 92 Processor
93 Memory
Claims (17)
- A notification device comprising:
accident occurrence detection means for detecting the occurrence of an accident from video captured by one or more cameras;
analysis means for analyzing the video and deriving detailed information on a person to be analyzed involved in the accident;
determination means for determining, based on the detailed information, a coping method for the person to be analyzed; and
notification means for notifying a notification target of the determined coping method.
- The notification device according to claim 1, further comprising notification target detection means for detecting the notification target existing in a first area including the location where the accident occurred,
wherein the notification means notifies the notification target detected by the notification target detection means of the coping method.
- The notification device according to claim 2, wherein the notification target detection means detects that a first type of candidate, which is a candidate for the notification target, exists in the first area, and that a second type of candidate, which is a candidate for the notification target, exists in a second area that is wider than the first area and includes the location where the accident occurred, and
the notification means notifies the coping method, taking as the notification target at least one of the first type of candidates existing in the first area detected by the notification target detection means and the second type of candidates existing in the second area detected by the notification target detection means.
- The notification device according to claim 2 or 3, wherein the notification target includes at least one of a person and a vehicle.
- The notification device according to any one of claims 2 to 4, wherein the determination means performs, as the determination of the coping method, at least one of determination of first aid or treatment for the person to be analyzed and identification of the location of a medical device to be used in the first aid or treatment that exists in a third area including the location where the accident occurred, and
the notification means includes at least one of the determined first aid or treatment and the identified location of the medical device in the coping method and notifies the notification target of it.
- The notification device according to any one of claims 2 to 5, wherein the notification target detection means detects a candidate for the notification target as the notification target by determining that the candidate appears in the video, and
the notification means notifies a terminal associated with the notification target detected by the notification target detection means of the coping method.
- The notification device according to any one of claims 2 to 5, further comprising position information acquisition means for acquiring position information of a terminal from the terminal possessed by a candidate for the notification target,
wherein the notification target detection means detects the notification target based on the position information acquired by the position information acquisition means, and
the notification means notifies the terminal of the notification target detected by the notification target detection means of the coping method.
- The notification device according to any one of claims 2 to 5, wherein the notification target detection means detects the notification target by identifying one or more base stations whose own cells include the location where the accident occurred, and
the notification means notifies the terminal of the notification target detected by the notification target detection means of the coping method.
- The notification device according to claim 1 or 2, wherein the notification target includes at least a medical or emergency institution,
the determination means determines the coping method so as to include at least one of treatment for the person to be analyzed, a request to dispatch staff to the location where the accident occurred, a request to accept the person to be analyzed, and preparation for treatment, and
the notification means notifies the institution of the coping method.
- The notification device according to any one of claims 1 to 9, wherein the analysis means identifies the person to be analyzed from among registrants based on an authentication result of face authentication using a face image of the person to be analyzed appearing in the video and face information of the registrants, and derives personal information associated with the registrant as the detailed information, and
the determination means determines the coping method based on the personal information.
- The notification device according to claim 10, wherein the personal information includes at least one of past medical history, current illness, pregnancy status, age, gender, blood type, allergy information, personal number, regular medical institution, and emergency contact information.
- The notification device according to claim 10 or 11, wherein the notification means determines the notification target based on the personal information and notifies the determined notification target of the determined coping method.
- The notification device according to any one of claims 1 to 12, wherein the detailed information includes at least one of consciousness state, respiratory state, bleeding state, fracture state, burn state, impact site, convulsive state, walking state, heartbeat state, pulse state, and body temperature state.
- A notification system comprising:
one or more cameras;
a notification device connected to the one or more cameras; and
one or more terminals that receive notifications from the notification device,
wherein the notification device includes:
accident occurrence detection means for detecting the occurrence of an accident from video captured by the one or more cameras;
analysis means for analyzing the video and deriving detailed information on a person to be analyzed involved in the accident;
determination means for determining, based on the detailed information, a coping method for the person to be analyzed; and
notification means for notifying the one or more terminals of the determined coping method.
- The notification system according to claim 14, wherein the one or more cameras are provided at at least one of a traffic light, a roadside unit, an intersection, and a railroad crossing.
- A notification method executed by a notification device, comprising:
an accident occurrence detection step of detecting the occurrence of an accident from video captured by one or more cameras;
an analysis step of analyzing the video and deriving detailed information on a person to be analyzed involved in the accident;
a determination step of determining, based on the detailed information, a coping method for the person to be analyzed; and
a notification step of notifying a notification target of the determined coping method.
- A non-transitory computer-readable medium storing a program that causes a computer to execute:
an accident occurrence detection step of detecting the occurrence of an accident from video captured by one or more cameras;
an analysis step of analyzing the video and deriving detailed information on a person to be analyzed involved in the accident;
a determination step of determining, based on the detailed information, a coping method for the person to be analyzed; and
a notification step of notifying a notification target of the determined coping method.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP21934763.0A EP4273829A4 (en) | 2021-03-29 | 2021-03-29 | NOTIFICATION DEVICE, NOTIFICATION SYSTEM, NOTIFICATION METHOD AND NON-TRANSITORY COMPUTER-READABLE MEDIUM |
US18/274,695 US20240127595A1 (en) | 2021-03-29 | 2021-03-29 | Notification device, notification system, notification method, and non-transitory computer-readable medium |
PCT/JP2021/013217 WO2022208586A1 (ja) | 2021-03-29 | 2021-03-29 | 通知装置、通知システム、通知方法及び非一時的なコンピュータ可読媒体 |
JP2023509901A JPWO2022208586A5 (ja) | 2021-03-29 | 通知装置、通知システム、通知方法及びプログラム |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/013217 WO2022208586A1 (ja) | 2021-03-29 | 2021-03-29 | 通知装置、通知システム、通知方法及び非一時的なコンピュータ可読媒体 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022208586A1 true WO2022208586A1 (ja) | 2022-10-06 |
Family
ID=83458405
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/013217 WO2022208586A1 (ja) | 2021-03-29 | 2021-03-29 | 通知装置、通知システム、通知方法及び非一時的なコンピュータ可読媒体 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20240127595A1 (ja) |
EP (1) | EP4273829A4 (ja) |
WO (1) | WO2022208586A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001325689A (ja) * | 2000-05-18 | 2001-11-22 | Nec Corp | 救急用車両の配車管理方法、コールセンタ、救急通報端末および配車管理システムと記憶媒体 |
JP2002008184A (ja) * | 2000-06-20 | 2002-01-11 | Nec Corp | 交通事故検出システム、方法、画像処理装置及びプログラムを記録した記録媒体 |
WO2005101346A1 (ja) | 2004-03-31 | 2005-10-27 | Hitachi Zosen Corporation | 突発事象の記録・解析システム |
JP2016096574A (ja) | 2009-11-13 | 2016-05-26 | ゾール メディカル コーポレイションZOLL Medical Corporation | 地域密着型応答システム |
JP2018110304A (ja) * | 2016-12-28 | 2018-07-12 | パナソニックIpマネジメント株式会社 | 監視システム、監視方法、及びプログラム |
JP2019101983A (ja) | 2017-12-07 | 2019-06-24 | 日本電気株式会社 | コンテンツ選定装置、コンテンツ選定方法及びプログラム |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8576066B2 (en) * | 2011-02-28 | 2013-11-05 | International Business Machines Corporation | Managing emergency response services using mobile communication devices |
US9491277B2 (en) * | 2014-04-03 | 2016-11-08 | Melissa Vincent | Computerized method and system for global health, personal safety and emergency response |
-
2021
- 2021-03-29 EP EP21934763.0A patent/EP4273829A4/en active Pending
- 2021-03-29 US US18/274,695 patent/US20240127595A1/en active Pending
- 2021-03-29 WO PCT/JP2021/013217 patent/WO2022208586A1/ja active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001325689A (ja) * | 2000-05-18 | 2001-11-22 | Nec Corp | 救急用車両の配車管理方法、コールセンタ、救急通報端末および配車管理システムと記憶媒体 |
JP2002008184A (ja) * | 2000-06-20 | 2002-01-11 | Nec Corp | 交通事故検出システム、方法、画像処理装置及びプログラムを記録した記録媒体 |
WO2005101346A1 (ja) | 2004-03-31 | 2005-10-27 | Hitachi Zosen Corporation | 突発事象の記録・解析システム |
JP2016096574A (ja) | 2009-11-13 | 2016-05-26 | ゾール メディカル コーポレイションZOLL Medical Corporation | 地域密着型応答システム |
JP2018110304A (ja) * | 2016-12-28 | 2018-07-12 | パナソニックIpマネジメント株式会社 | 監視システム、監視方法、及びプログラム |
JP2019101983A (ja) | 2017-12-07 | 2019-06-24 | 日本電気株式会社 | コンテンツ選定装置、コンテンツ選定方法及びプログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP4273829A4 |
Also Published As
Publication number | Publication date |
---|---|
EP4273829A1 (en) | 2023-11-08 |
EP4273829A4 (en) | 2024-03-13 |
US20240127595A1 (en) | 2024-04-18 |
JPWO2022208586A1 (ja) | 2022-10-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10796560B2 (en) | Personal emergency response system with predictive emergency dispatch risk assessment | |
US9411997B1 (en) | Systems and methods for tracking subjects | |
US20200346751A1 (en) | Unmanned aerial vehicle emergency dispatch and diagnostics data apparatus, systems and methods | |
WO2020151339A1 (zh) | 基于无人驾驶车辆的异常处理方法、装置及相关设备 | |
WO2012002435A1 (ja) | 感染拡大防止支援システム、感染拡大防止支援サーバ、検査端末、移動端末及びプログラム | |
JP2023517097A (ja) | 船内病原体感染追跡のための接触者追跡システム及び方法 | |
US20150312739A1 (en) | Computer program, method, and system for obtaining and providing emergency services | |
TW201921905A (zh) | 使用語音和感測器資料擷取的緊急回應 | |
KR102232485B1 (ko) | 분산형 동선 추적 장치 및 이를 이용한 방법 | |
CN111862532A (zh) | 一种面对突发状况的报警系统 | |
Chiou et al. | A real-time, automated and privacy-preserving mobile emergency-medical-service network for informing the closest rescuer to rapidly support mobile-emergency-call victims | |
Nikoloudakis et al. | An NF V-powered emergency system for smart enhanced living environments | |
JP2020052856A (ja) | 救援支援サーバ、救援支援システム及びプログラム | |
KV et al. | Design and development of a smartphone-based application to save lives during accidents and emergencies | |
WO2022208586A1 (ja) | 通知装置、通知システム、通知方法及び非一時的なコンピュータ可読媒体 | |
CN111539254A (zh) | 目标检测方法、装置、电子设备及计算机可读存储介质 | |
Alshareef et al. | First responder help facilitated by the mobile cloud | |
Intawong et al. | A-SA SOS: A mobile-and IoT-based pre-hospital emergency service for the elderly and village health volunteers | |
CN111091901A (zh) | 老年人急救决策方法、装置、设备及存储介质 | |
Bashar et al. | Intelligent alarm system for hospitals using smartphone technology | |
WO2020075283A1 (ja) | 異常者予知システム、異常者予知方法、およびプログラム | |
Bajpai et al. | ICT for education: Lessons from China | |
KR20160032462A (ko) | 소셜 네트워크 서비스를 이용한 사회 안전망 시스템 및 방법 | |
WO2023037552A1 (ja) | 監視支援装置、システム及び方法、並びに、コンピュータ可読媒体 | |
Seegolam et al. | A Generic Contact Tracing Framework |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21934763 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18274695 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2021934763 Country of ref document: EP Effective date: 20230803 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023509901 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |