US20210201601A1 - Information processing device and information processing method - Google Patents
- Publication number
- US20210201601A1 (U.S. patent application Ser. No. 17/205,053)
- Authority
- US
- United States
- Prior art keywords
- vehicle
- occupant
- window
- information
- information processing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/08—Mouthpieces; Microphones; Attachments therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/20—Arrangements for obtaining desired frequency or directional characteristics
- H04R1/32—Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only
- H04R1/326—Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only for microphones
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/008—Registering or indicating the working of vehicles communicating information to a remotely located station
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/08—Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
- G07C5/12—Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time in graphical form
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/10—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using wireless transmission systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/20—Arrangements for obtaining desired frequency or directional characteristics
- H04R1/32—Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only
- H04R1/40—Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers
- H04R1/406—Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers microphones
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R3/00—Circuits for transducers, loudspeakers or microphones
- H04R3/005—Circuits for transducers, loudspeakers or microphones for combining the signals of two or more microphones
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/90—Services for handling of emergency or hazardous situations, e.g. earthquake and tsunami warning systems [ETWS]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
- B60R2021/0027—Post collision measures, e.g. notifying emergency services
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/016—Personal emergency signalling and security systems
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B27/00—Alarm systems in which the alarm condition is signalled from a central station to a plurality of substations
- G08B27/001—Signalling to an emergency team, e.g. firemen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2499/00—Aspects covered by H04R or H04S not otherwise provided for in their subgroups
- H04R2499/10—General applications
- H04R2499/13—Acoustic transducers and sound field adaptation in vehicles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/025—Services making use of location information using location based information parameters
- H04W4/027—Services making use of location information using location based information parameters using movement velocity, acceleration information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/38—Services specially adapted for particular environments, situations or purposes for collecting sensor information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/40—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
Definitions
- The present disclosure relates to an information processing device and an information processing method for acquiring information on a vehicle that is in an accident.
- A system has been known in which, when a collision accident occurs in a vehicle, an emergency notification is automatically transmitted from an emergency notification device mounted in the vehicle to a management center, and the management center arranges for an ambulance or the like to be dispatched to the accident site.
- JP 2016-30481 A discloses a vehicular emergency notification device that enables the management center to estimate the degree of injury of an occupant when a vehicle accident occurs.
- In this device, a microphone is provided near each seat of the vehicle; the voice of an occupant is acquired by the microphone, and the voice information is transmitted to the management center.
- An operator of the management center ascertains, for example, the state of injury by talking with the occupant.
- In the device of JP 2016-30481 A, however, clear acquisition of the voice of a specific occupant is likely to be difficult in a situation in which the noise outside the accident vehicle is loud.
- The present disclosure has been made in view of such circumstances and provides an information processing device and an information processing method that facilitate clear acquisition of the voice of an occupant at a designated position in the vehicle cabin when an accident has occurred in a vehicle.
- A first aspect of the present disclosure relates to an information processing device.
- The information processing device includes an acquisition unit configured to acquire information on a position of an occupant from a vehicle that is in an accident; a reception unit configured to receive a manipulation input from an operator; and a determination unit configured to determine directivity of a microphone in a vehicle cabin of the vehicle based on the manipulation input.
- Since the directivity of the microphone in the vehicle cabin of the vehicle that is in the accident is determined based on the manipulation input of the operator, clear acquisition of the voice of the occupant at the designated position is facilitated.
- The acquisition unit may acquire information on the occupant from the vehicle, and the information processing device may include a display unit that displays an image representing a seating position of the occupant and a state of the occupant based on the information on the occupant.
- The reception unit may receive a manipulation input for designating the seating position of the occupant in the image displayed on the display unit, and the determination unit may determine the directivity of the microphone to be a direction of the designated seating position of the occupant.
- The acquisition unit may acquire information on an open and closed state of a window from the vehicle, and the display unit may display an image representing the open and closed state of the window.
- The determination unit may determine the directivity of the microphone to be a direction of a designated window based on a manipulation input for designating the window in the image displayed on the display unit.
- A second aspect of the present disclosure also relates to an information processing device.
- The information processing device includes an acquisition unit configured to acquire information on an occupant from a vehicle that is in an accident; and a display unit configured to display an image representing a seating position of the occupant and a state of the occupant based on the information on the occupant.
- According to the second aspect, the seating position and the state of each occupant of the vehicle that is in the accident can be rapidly ascertained from the image.
- A third aspect of the present disclosure relates to an information processing method.
- The information processing method includes an acquisition step of acquiring information on a position of an occupant from a vehicle that is in an accident; a reception step of receiving a manipulation input from an operator; and a determination step of determining directivity of a microphone in a vehicle cabin of the vehicle based on the manipulation input received in the reception step.
- FIG. 1 is a block diagram illustrating a configuration of an emergency notification system according to an embodiment;
- FIG. 2 is a diagram schematically illustrating a vehicle cabin of a vehicle in FIG. 1 ; and
- FIG. 3 is a diagram illustrating an example of an image displayed on a display unit of an information processing device in FIG. 1 .
- FIG. 1 is a block diagram illustrating the configuration of an emergency notification system 1 according to an embodiment.
- The emergency notification system 1 includes an emergency notification device 10 and an information processing device 40.
- The emergency notification device 10 is mounted in a vehicle 90, which is a car.
- The emergency notification device 10 has a wireless communication function and is connected to a network 30 via a wireless base station or a wireless access point.
- The information processing device 40 is connected to the network 30 and communicates with the emergency notification device 10 via the network 30.
- The information processing device 40 is installed, for example, at an emergency notification center and is used by an operator.
- The standard of the wireless communication is not particularly limited, and includes, for example, 3G (third generation mobile communication system), 4G (fourth generation mobile communication system), or 5G (fifth generation mobile communication system).
- The vehicle 90 includes the emergency notification device 10, an occupant detection sensor 12, a seat belt sensor 14, a window opening and closing sensor 16, a door ECU 18, a microphone 20, and a speaker 22.
- The emergency notification device 10 includes a detection unit 50, an acquisition unit 52, a derivation unit 54, a communication unit 56, a holding unit 58, a directivity controller 60, and a call unit 62.
- The occupant detection sensor 12 is provided at each seat of the vehicle 90; it detects the load on the seat to detect the seating state of an occupant on each seat, and outputs the detection result to the acquisition unit 52.
- The seat belt sensor 14 is provided at each seat of the vehicle 90; it detects the wearing state of the seat belt of the occupant on each seat, and outputs the detection result to the acquisition unit 52.
- The window opening and closing sensor 16 detects the open and closed state of each window of the vehicle 90 and outputs the detection result to the door ECU 18 and the acquisition unit 52.
- When an acceleration sensor detects an acceleration equal to or greater than a predetermined threshold value due to, for example, a collision of the vehicle 90, an airbag electronic control unit (ECU) (not illustrated) outputs a deployment signal for deploying an airbag to the airbag and to the detection unit 50.
- When the detection unit 50 receives the deployment signal, the detection unit 50 detects that an accident has occurred in the vehicle 90.
- The door ECU 18 then diagnoses, for each window for which the window opening and closing sensor 16 has detected a closed state, whether there is a possibility that the window is broken.
- To execute the diagnosis, the door ECU 18 enables the jam protection function and closes the window.
- Jam protection is a function that performs a window opening operation when a load is applied to the window opening and closing motor during a window closing operation, to prevent a hand or the like from being pinched. Normally, the jam protection function is disabled when the window is substantially completely closed, but it is enabled for the diagnosis.
- When jam protection does not work for a window for which the window opening and closing sensor 16 indicates a closed state, the door ECU 18 detects that the window is in an open state and outputs the detection result to the acquisition unit 52.
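The diagnosis logic described above can be sketched as follows. This is an illustrative reconstruction under assumed interfaces (the `Window` class and all of its method names are hypothetical), not the patent's implementation:

```python
class Window:
    """Minimal simulated power window (hypothetical interface)."""

    def __init__(self, sensor_says_closed, glass_intact):
        self.sensor_says_closed = sensor_says_closed
        self.glass_intact = glass_intact
        self.jam_protection_enabled = False
        self._jam_triggered = False

    def reports_closed(self):
        return self.sensor_says_closed

    def enable_jam_protection(self):
        self.jam_protection_enabled = True

    def close_window(self):
        # Driving the motor against intact glass loads the motor,
        # which trips jam protection when it is enabled.
        self._jam_triggered = self.jam_protection_enabled and self.glass_intact

    def jam_protection_triggered(self):
        return self._jam_triggered


def diagnose_window_open(window):
    """Return True when a window behaves as open.

    Jam protection reverses the window when the motor senses a load during
    closing. If a further close command against a window that reports
    'closed' does not trip jam protection, nothing resisted the motor, so
    the window is treated as open (e.g. the glass may be broken).
    """
    if not window.reports_closed():
        return True                      # sensor already says open
    window.enable_jam_protection()       # normally disabled near full close
    window.close_window()
    return not window.jam_protection_triggered()
```

The key design point is that the diagnosis reuses an existing safety mechanism (jam protection) as a glass-presence probe instead of requiring a dedicated breakage sensor.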
- The acquisition unit 52 regularly acquires the detection result of the occupant detection sensor 12 and the detection result of the seat belt sensor 14, regardless of whether an accident has been detected.
- When an accident is detected, the acquisition unit 52 outputs the detection results of the occupant detection sensor 12 and the seat belt sensor 14 acquired immediately before the occurrence of the accident to the derivation unit 54.
- The acquisition unit 52 also acquires the detection result of the open and closed state of the window output from at least one of the door ECU 18 or the window opening and closing sensor 16, and outputs the acquired detection result to the derivation unit 54.
- A window is considered to be open when the detection result of the door ECU 18 indicates an open state.
- The derivation unit 54 derives emergency notification information, which includes information for specifying the vehicle 90, the position of the vehicle 90, the detection result of the occupant detection sensor 12, the detection result of the seat belt sensor 14, and the detection result of the open and closed state of the window, which are output from the acquisition unit 52.
- The information for specifying the vehicle 90 includes the vehicle identification number (VIN), information on the license plate, the vehicle type, the color of the vehicle 90, and the like, and is stored in a storage unit (not illustrated) in advance.
- The position of the vehicle 90 is acquired by a GPS receiver (not illustrated).
- The communication unit 56 transmits the emergency notification information output from the derivation unit 54 to the information processing device 40.
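The contents listed above can be pictured as a simple record. The field names and sample values below are hypothetical illustrations; the patent specifies only what the notification contains, not its format:

```python
from dataclasses import dataclass, field


@dataclass
class EmergencyNotification:
    """Illustrative shape of the emergency notification information:
    vehicle identification, position, and the latest sensor detections."""

    vin: str                 # vehicle identification number
    license_plate: str
    vehicle_type: str
    color: str
    position: tuple          # (latitude, longitude) from the GPS receiver
    seat_occupied: dict = field(default_factory=dict)  # seat -> bool
    belt_worn: dict = field(default_factory=dict)      # seat -> bool
    window_open: dict = field(default_factory=dict)    # window -> bool


# Sample notification matching the situation later shown in FIG. 3
# (all values invented for illustration).
notification = EmergencyNotification(
    vin="JT123456789012345",
    license_plate="ABC-1234",
    vehicle_type="sedan",
    color="white",
    position=(35.6812, 139.7671),
    seat_occupied={"driver": True, "passenger": True, "right_rear": True},
    belt_worn={"driver": False, "passenger": False, "right_rear": True},
    window_open={"passenger": True},
)
```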
- FIG. 2 schematically illustrates the vehicle cabin of the vehicle 90 in FIG. 1 .
- One microphone 20 is provided, for example, near the center of the vehicle cabin.
- The microphone 20 is unidirectional; its sensitivity in a specific direction is higher than its sensitivity in other directions.
- In a normal state, the directivity of the microphone 20 is set to a direction d1 of the driver's seat so that the driver's speech is easy to acquire.
- Information on the voice acquired by the microphone is supplied to a navigation device (not illustrated) or the like, and the navigation device or the like can be manipulated through voice recognition.
- The direction of the directivity of the microphone can be changed under the control of the directivity controller 60.
- The directivity of the microphone 20 can be changed to, for example, a direction d2 of the passenger seat, a direction d3 of the right rear seat, a direction d4 of the left rear seat, a direction d5 of the window on the driver's seat side, and the like.
- The holding unit 58 holds, in advance, control information for the directivity of the microphone 20 for each seat and each window.
- The directivity control information includes, for example, an angle for setting the directivity in the direction of each seat and each window.
- At the time of occurrence of an accident, the directivity controller 60 controls the directivity of the microphone 20 based on directivity information of the microphone 20 transmitted from the information processing device 40, described below, and the control information for the directivity held in the holding unit 58.
- For example, when the directivity information indicates the right rear seat, the directivity controller 60 controls the directivity based on the control information for the directivity for the right rear seat.
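A minimal sketch of this lookup, with hypothetical angle values; the patent says only that the holding unit 58 stores an angle per seat and window:

```python
# Control information held in advance: a directivity angle (degrees,
# measured from the vehicle's forward axis) for each seat and window.
# All angle values are illustrative, not taken from the patent.
DIRECTIVITY_ANGLES = {
    "driver_seat": 225,       # d1
    "passenger_seat": 135,    # d2
    "right_rear_seat": 290,   # d3
    "left_rear_seat": 70,     # d4
    "driver_window": 250,     # d5
}


class DirectivityController:
    """Points the cabin microphone at a target using the held angle table."""

    def __init__(self, angle_table):
        self.angle_table = angle_table
        # In the normal state the microphone faces the driver's seat.
        self.current_angle = angle_table["driver_seat"]

    def apply(self, directivity_info):
        """Apply a target name received from the information processing
        device (e.g. 'right_rear_seat') and return the new angle."""
        self.current_angle = self.angle_table[directivity_info]
        return self.current_angle
```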
- The voice information acquired by the microphone 20 is also supplied to the call unit 62.
- The call unit 62 is connected to a call unit 78 of the information processing device 40 via the communication unit 56, and executes a call between an occupant of the vehicle 90 and the operator of the information processing device 40 using the microphone 20 and the speaker 22.
- The speaker 22 is provided in the vehicle cabin and outputs a voice based on a voice signal supplied from the call unit 62.
- The information processing device 40 includes a communication unit 70, an acquisition unit 72, a display controller 74, a display unit 76, the call unit 78, a reception unit 80, and a determination unit 82.
- The communication unit 70 receives the emergency notification information from the vehicle that is in the accident, and outputs the received emergency notification information to the acquisition unit 72.
- The acquisition unit 72 acquires the emergency notification information output from the communication unit 70, and acquires information on the occupants of the accident vehicle based on the emergency notification information.
- Specifically, the acquisition unit 72 acquires information on the position of each occupant based on the detection result of the occupant detection sensor 12 in the emergency notification information.
- The acquisition unit 72 also acquires information on the state of each occupant based on the detection result of the occupant detection sensor 12 and the detection result of the seat belt sensor 14 in the emergency notification information.
- For example, the acquisition unit 72 acquires information on the state of the occupant indicating that the degree of serious injury of the occupant is low when the occupant is seated on a certain seat and the seat belt is worn.
- The acquisition unit 72 acquires information on the state of the occupant indicating that the degree of serious injury of the occupant is high when the occupant is seated on the certain seat and the seat belt is not worn.
- The acquisition unit 72 acquires information on the open and closed state of the windows based on the detection result of the open and closed state of the windows in the emergency notification information.
- The acquisition unit 72 outputs the acquired information on the occupants and the acquired information on the open and closed state of the windows to the display controller 74.
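The seat-by-seat rule above can be sketched as follows; this is a simplified reading of the description, and the seat names are placeholders:

```python
def occupant_states(seat_occupied, belt_worn):
    """Derive a per-seat occupant state from the two detection results.

    Rule from the description: an occupant who is seated with the seat
    belt worn is assumed to have a low degree of serious injury; seated
    without the belt, a high degree. Empty seats produce no entry.
    """
    states = {}
    for seat, occupied in seat_occupied.items():
        if not occupied:
            continue  # no figure is displayed for an empty seat
        states[seat] = "low" if belt_worn.get(seat, False) else "high"
    return states
```

Applied to the situation of FIG. 3, the driver and passenger (unbelted) come out as "high" and the right rear occupant (belted) as "low".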
- The display controller 74 controls the display of an image on the display unit 76 based on the information on the occupants and the information on the open and closed state of the windows, which are output from the acquisition unit 72.
- The display unit 76 displays an image illustrating the seating position of each occupant in the accident vehicle, the state of each occupant, and the open and closed state of the windows.
- FIG. 3 illustrates an example of an image 100 displayed on the display unit 76 of the information processing device 40.
- The image 100 includes a figure 102 indicating an occupant whose seating position is the driver's seat, a figure 104 indicating an occupant whose seating position is the passenger seat, and a figure 106 indicating an occupant whose seating position is the right rear seat. Since no figure is displayed at the position corresponding to the left rear seat, the image indicates that no occupant is seated at the left rear seat. Instead of the figures, characters indicating the occupants may be displayed.
- The figures 102, 104, and 106 also indicate the state of each occupant using a color or a pattern.
- The figure 102 indicates that the state of the occupant at the driver's seat is "a high degree of serious injury".
- The figure 104 indicates that the state of the occupant at the passenger seat is "a high degree of serious injury".
- The figure 106 indicates that the state of the occupant at the right rear seat is "a low degree of serious injury".
- The image 100 also includes a figure 110 indicating that the window on the passenger seat side is in an open state. Since no figures are displayed at the positions corresponding to the other windows, the image 100 indicates that the other windows are in a closed state.
- The call unit 78 executes a call between an occupant of the vehicle 90 and the operator via the call unit 62 of the emergency notification device 10 that has transmitted the emergency notification information.
- The reception unit 80 receives a manipulation input of the operator for designating the seating position of an occupant or the position of a window in an open state in the image displayed on the display unit 76.
- The reception unit 80 may include a touch sensor that receives a touch manipulation input on the screen of the display unit 76 by the operator.
- The determination unit 82 determines the directivity of the microphone 20 in the vehicle cabin of the vehicle 90 based on the manipulation input.
- Specifically, the determination unit 82 determines the directivity of the microphone 20 to be the direction of the designated seating position of the occupant based on a manipulation input for designating the seating position of the occupant in the image displayed on the display unit 76.
- Likewise, the determination unit 82 determines the directivity of the microphone 20 to be the direction of the designated window based on a manipulation input for designating the window in the image displayed on the display unit 76.
- The determination unit 82 outputs information on the determined directivity of the microphone 20 to the communication unit 70, and the communication unit 70 transmits the directivity information of the microphone 20 to the emergency notification device 10.
- The emergency notification device 10 controls the directivity of the microphone 20 according to the directivity information, as described above.
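One way to picture the reception and determination steps is hit-testing a touch on the displayed image 100 against the drawn seats and windows. The screen regions and coordinates below are hypothetical; the patent does not specify any hit-testing scheme:

```python
# Hypothetical screen regions of the image 100, as (x0, y0, x1, y1)
# rectangles per selectable target (seat or open window).
REGIONS = {
    "driver_seat": (0, 0, 50, 50),
    "passenger_seat": (50, 0, 100, 50),
    "right_rear_seat": (0, 50, 50, 100),
    "left_rear_seat": (50, 50, 100, 100),
    "passenger_window": (100, 0, 110, 50),
}


def determine_directivity(touch_x, touch_y):
    """Map a touch manipulation on the displayed image to a directivity
    target name, which would then be transmitted to the emergency
    notification device. Returns None when no target was touched."""
    for target, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= touch_x < x1 and y0 <= touch_y < y1:
            return target
    return None
```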
- When a call is started, the operator calls out to the occupants of the vehicle 90 with, for example, "Are you OK?". The operator's voice can be heard at each seat. When the driver, toward whom the microphone 20 is directed, responds that there is no problem, the operator does not request an ambulance.
- The operator confirms the image 100 on the display unit 76, and designates the seating position of an occupant with a low degree of serious injury through, for example, a touch manipulation on the screen.
- The operator can rapidly ascertain the degree of serious injury and the seating position of each occupant by confirming the image. Accordingly, it becomes easy to start talking with an occupant with a low degree of serious injury, who is more likely to be able to talk than an occupant with a high degree of serious injury.
- In the example of FIG. 3 , it is assumed that the operator designates the right rear seat.
- When the operator can talk with the designated occupant, the operator asks about the degree of injury of each occupant and requests an ambulance. Since the microphone 20 has high sensitivity in the direction of the designated occupant and low sensitivity in the other directions, it is easy to clearly acquire the voice of the designated occupant even in a situation in which the noise outside the vehicle is loud.
- The operator also requests an ambulance when there is no response from any of the occupants.
- The operator sends a notification to the communication command room of a fire department headquarters, informs it of the position of the vehicle 90 and the information for specifying the vehicle 90 based on the received emergency notification information, and reports the degree of injury when the operator has been able to learn the degree of injury from an occupant.
- The information processing device 40 may transmit the emergency notification information to a terminal device, such as one in the communication command room of the fire department.
- The operator may designate the position of a window in an open state in the image when there is no response from any of the occupants.
- The operator can rapidly ascertain which window is in an open state by confirming the image. Since the directivity of the microphone 20 is controlled toward the designated open window and the operator can listen to the sound outside the vehicle collected by the microphone 20, the range of possible measures is widened, as will be shown below.
- For example, the type of road on which the accident has occurred is likely to be rapidly identified.
- When the travel sound of other vehicles is heard from outside the vehicle and no voice of a passerby is heard, the accident is likely to have occurred on an expressway or a highway.
- When the voice of a passerby is heard from outside the vehicle, the accident is likely to have occurred on a general road.
- The operator can talk to a passerby outside the vehicle for confirmation when the road is a general road.
- In this case, the information processing device 40 may transmit an instruction to increase the volume of the speaker 22 to the emergency notification device 10 according to a manipulation input of the operator.
- The operator can talk with a passerby outside the vehicle to request the passerby to give emergency aid to an occupant, or to hear a more detailed account of the accident situation or the like from the passerby. Since the microphone 20 has high sensitivity in the direction of the designated window in an open state and low sensitivity in the other directions, it is easy to clearly acquire the voice of the passerby.
- The configurations of the detection unit 50, the acquisition unit 52, the derivation unit 54, the directivity controller 60, the call unit 62, the acquisition unit 72, the display controller 74, the call unit 78, the reception unit 80, and the determination unit 82 can be realized by hardware such as a CPU, a memory, or another LSI of any computer, or by software such as a program loaded into a memory; the functional blocks depicted here are realized by cooperation of these. Therefore, it is understood by those skilled in the art that these functional blocks can be realized in various forms by hardware alone, by software alone, or by a combination thereof.
- Since the directivity of the microphone 20 in the vehicle cabin of the vehicle 90 that is in the accident is determined based on a manipulation input of the operator, the voice of the occupant at the designated position can easily be acquired clearly. Thus, the operator can rapidly take an appropriate action with respect to the accident.
- As a modification example, the presence or absence of opening and closing of a rear-seat door may be detected by a door opening and closing sensor (not illustrated); when the door is opened and subsequently closed before the vehicle 90 starts traveling, the acquisition unit 52 of the emergency notification device 10 may determine that an occupant is seated on the seat corresponding to the door.
- Alternatively, the directivity of the microphone 20 may be regularly directed toward the rear seats during travel of the vehicle 90, and the acquisition unit 52 may estimate the seating position of an occupant at a rear seat based on the frequency of the sound collected by the microphone 20 and the direction of the directivity when the sound is collected.
- Even for a vehicle 90 in which the occupant detection sensor 12 and the seat belt sensor 14 are not provided at the rear seats, this makes it possible for the operator to ascertain the presence or absence of an occupant at the rear seats and the seating position.
- As another modification example, a first microphone whose directivity can be controlled with respect to the occupant at the driver's seat, the occupant at the passenger seat, and the windows, and a second microphone whose directivity can be controlled with respect to the occupants at the rear seats and the windows may be provided.
- In this case, the degree of freedom of the configuration of the vehicle 90 can be improved.
- As yet another modification example, the emergency notification device 10 may hold driver information input in advance.
- The driver information includes, for example, information likely to be useful for lifesaving treatment, such as an age group, a blood type, and a sex.
- In this case, the derivation unit 54 also includes the driver information in the emergency notification information.
- The operator can provide the driver information to the emergency personnel before they are dispatched, which makes it easier for the emergency personnel to prepare.
- the derivation unit 54 of the emergency notification device 10 may also include, in the emergency notification information, information on an acceleration at the time of occurrence of an accident detected by an acceleration sensor (not illustrated).
- the acquisition unit 72 of the information processing device 40 may increase the degree of serious injury when an absolute value of the acceleration is greater for the occupant on the seat in which the seat belt is not worn. That is, the degree of serious injury may have steps more than two steps including high and low. In the modification example, it is possible to increase the amount of information to be provided to the operator.
- the derivation unit 54 of the emergency notification device 10 may also include rollover information in the emergency notification information.
Abstract
In an information processing device, an acquisition unit acquires information on a position of an occupant from a vehicle that is in an accident. A reception unit receives a manipulation input from an operator. A determination unit determines directivity of a microphone in a vehicle cabin of the vehicle based on the manipulation input received by the reception unit.
Description
- This application is a continuation of U.S. patent application Ser. No. 16/538,892, filed Aug. 13, 2019, which claims the benefit of Japanese Patent Application No. 2018-200397, filed on Oct. 24, 2018, the disclosure of which, including the specification, drawings, and abstract, is incorporated herein by reference in its entirety.
- The present disclosure relates to an information processing device and an information processing method for acquiring information on a vehicle that is in an accident.
- A system has been known in which, when a collision accident occurs in a vehicle, an emergency notification is automatically transmitted from an emergency notification device mounted in the vehicle to a management center, and an ambulance or the like is dispatched to the accident site by arrangement of the management center.
- Japanese Unexamined Patent Application Publication No. 2016-30481 (JP 2016-30481 A) discloses a vehicular emergency notification device capable of estimating, on the management center side, a degree of injury of an occupant when a vehicle accident occurs. In this device, a microphone is provided near each seat of the vehicle, a voice of the occupant is acquired by the microphone, and voice information is transmitted to the management center. An operator of the management center ascertains, for example, the situation of an injury by talking with the occupant.
- In the technology of JP 2016-30481 A, for example, it is likely to be difficult to clearly acquire the voice of a specific occupant in a situation in which the noise outside the accident vehicle is loud.
- The present disclosure has been made in view of such circumstances, and provides an information processing device and an information processing method capable of facilitating clear acquisition of a voice of an occupant at a designated position in a vehicle cabin when an accident has occurred in a vehicle.
- A first aspect of the present disclosure relates to an information processing device. The information processing device includes an acquisition unit configured to acquire information on a position of an occupant from a vehicle that is in an accident; a reception unit configured to receive a manipulation input from an operator; and a determination unit configured to determine directivity of a microphone in a vehicle cabin of the vehicle based on the manipulation input.
- According to the first aspect, since the directivity of the microphone in the vehicle cabin of the vehicle that is in the accident is determined based on the manipulation input of the operator, it is possible to facilitate clear acquisition of the voice of the occupant at the designated position.
- In the information processing device according to the first aspect, the acquisition unit may acquire information on the occupant from the vehicle, and the information processing device may include a display unit that displays an image representing a seating position of the occupant and a state of the occupant based on the information on the occupant.
- In the information processing device according to the first aspect, the reception unit may receive a manipulation input for designating the seating position of the occupant in the image displayed on the display unit, and the determination unit may determine the directivity of the microphone to be a direction of the designated seating position of the occupant.
- In the information processing device according to the first aspect, the acquisition unit may acquire information on an open and closed state of a window from the vehicle, and the display unit may display an image representing the open and closed state of the window.
- In the information processing device according to the first aspect, the determination unit may determine the directivity of the microphone to be a direction of a designated window based on a manipulation input for designating the window in the image displayed on the display unit.
- A second aspect of the present disclosure also relates to an information processing device. The information processing device includes an acquisition unit configured to acquire information on an occupant from a vehicle that is in an accident; and a display unit configured to display an image representing a seating position of the occupant and a state of the occupant based on the information on the occupant.
- According to the second aspect, it is possible to rapidly ascertain the seating position and the state of each occupant of the vehicle that is in the accident based on the image.
- A third aspect of the present disclosure relates to an information processing method. The information processing method includes an acquisition step of acquiring information on a position of an occupant from a vehicle that is in an accident; a reception step of receiving a manipulation input from an operator; and a determination step of determining directivity of a microphone in a vehicle cabin of the vehicle based on the manipulation input received in the reception step.
- According to the aspects of the present disclosure, it is possible to facilitate clear acquisition of a voice of an occupant at a designated position in a vehicle cabin when an accident has occurred in a vehicle.
- Features, advantages, and technical and industrial significance of exemplary embodiments of the disclosure will be described below with reference to the accompanying drawings, in which like numerals denote like elements, and wherein:
FIG. 1 is a block diagram illustrating a configuration of an emergency notification system according to an embodiment;
FIG. 2 is a diagram schematically illustrating a vehicle cabin of a vehicle in FIG. 1; and
FIG. 3 is a diagram illustrating an example of an image displayed on a display unit of an information processing device in FIG. 1.
FIG. 1 is a block diagram illustrating a configuration of an emergency notification system 1 according to an embodiment. The emergency notification system 1 includes an emergency notification device 10 and an information processing device 40. - The
emergency notification device 10 is mounted in a vehicle 90, which is a car. The emergency notification device 10 has a wireless communication function, and is connected to a network 30 via a wireless base station or a wireless access point. The information processing device 40 is connected to the network 30 and communicates with the emergency notification device 10 via the network 30. The information processing device 40 is installed, for example, at an emergency notification center and is used by an operator. A standard of the wireless communication is not particularly limited, and includes, for example, 3G (a third generation mobile communication system), 4G (a fourth generation mobile communication system), or 5G (a fifth generation mobile communication system). - The
vehicle 90 includes the emergency notification device 10, an occupant detection sensor 12, a seat belt sensor 14, a window opening and closing sensor 16, a door ECU 18, a microphone 20, and a speaker 22. The emergency notification device 10 includes a detection unit 50, an acquisition unit 52, a derivation unit 54, a communication unit 56, a holding unit 58, a directivity controller 60, and a call unit 62. - The
occupant detection sensor 12 is provided at each seat of the vehicle 90, detects a load on the seat to detect a seating state of the occupant on each seat, and outputs a detection result to the acquisition unit 52. - The
seat belt sensor 14 is provided at each seat of the vehicle 90, detects a wearing state of the seat belt of the occupant on each seat, and outputs a detection result to the acquisition unit 52. - The window opening and
closing sensor 16 detects an open and closed state of each window of the vehicle 90, and outputs a detection result to the door ECU 18 and the acquisition unit 52. - When an acceleration sensor (not illustrated) detects an acceleration equal to or greater than a predetermined threshold value due to, for example, a collision of the
vehicle 90, an airbag electronic control unit (ECU) (not illustrated) outputs a deployment signal for deploying an airbag to the airbag and thedetection unit 50. When thedetection unit 50 receives the deployment signal, thedetection unit 50 detects that an accident has occurred in thevehicle 90. - When the occurrence of the accident is detected by the
detection unit 50, the door ECU 18 diagnoses whether there is a possibility that the window is broken for each window for which a closed state has been detected by the window opening and closing sensor 16. The door ECU 18 sets the jam protection function to be effective, closes the window, and executes the diagnosis. Jam protection is a function of executing a window opening operation when a load is applied to the motor for opening and closing a window during a window closing operation, in order to prevent pinching of a hand or the like. Normally, the jam protection function is disabled when the window is substantially completely closed, but it is enabled for the diagnosis. When the window is not broken, it is assumed that a load is applied to the motor through an operation of closing a window that has already been closed, and the jam protection works. When the window is broken, it is assumed that no load is applied to the motor through an operation of closing the window, and the jam protection does not work. The door ECU 18 detects that the window is in an open state when the jam protection does not work for a window for which the window opening and closing sensor 16 indicates a closed state, and outputs a detection result to the acquisition unit 52. - The
acquisition unit 52 regularly acquires the detection result of the occupant detection sensor 12 and the detection result of the seat belt sensor 14, regardless of the presence or absence of accident occurrence detection. When the occurrence of an accident is detected by the detection unit 50, the acquisition unit 52 outputs the detection result of the occupant detection sensor 12 and the detection result of the seat belt sensor 14 acquired immediately before the occurrence of the accident to the derivation unit 54. - When the occurrence of the accident is detected by the
detection unit 50, the acquisition unit 52 acquires the detection result of the open and closed state of the window output from at least one of the door ECU 18 or the window opening and closing sensor 16, and outputs the acquired detection result to the derivation unit 54. As described above, for a window for which the detection result of the window opening and closing sensor 16 indicates a closed state, the window is considered to be open when the detection result of the door ECU 18 indicates an open state. - When the
detection unit 50 detects the occurrence of an accident, the derivation unit 54 derives the emergency notification information and outputs the derived emergency notification information to the communication unit 56. The emergency notification information is also referred to as a minimum set of data (MSD). The emergency notification information includes information for specifying the vehicle 90, the position of the vehicle 90, the detection result of the occupant detection sensor 12, the detection result of the seat belt sensor 14, and the detection result of the open and closed state of the window, which are output from the acquisition unit 52. The information for specifying the vehicle 90 includes a vehicle identification number (VIN), information on a license plate, a vehicle type, a color of the vehicle 90, and the like, and is stored in a storage unit (not illustrated) in advance. The position of the vehicle 90 is acquired by a GPS receiver (not illustrated). - The
communication unit 56 transmits the emergency notification information output from the derivation unit 54 to the information processing device 40.
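As a rough sketch of the jam-protection-based window diagnosis described above, the door ECU's decision logic might look like the following. This is a hypothetical illustration: the function name and the two boolean inputs are assumptions, since the disclosure does not specify a concrete software interface.

```python
def diagnose_window_state(sensor_reports_closed: bool,
                          motor_loaded_when_closing: bool) -> str:
    """Sketch of the post-accident window-breakage diagnosis.

    sensor_reports_closed: reading of the window opening and closing sensor.
    motor_loaded_when_closing: whether a load is applied to the window motor
        when a close operation is commanded with jam protection temporarily
        enabled (normally it is disabled near the fully closed position).
    """
    if not sensor_reports_closed:
        # The sensor already indicates an open window; no diagnosis is needed.
        return "open"
    if motor_loaded_when_closing:
        # Closing an intact, already-closed window loads the motor and
        # triggers jam protection, so the glass is still in place.
        return "closed"
    # No load on the motor: the glass is likely broken, so the window
    # is treated as being in an open state.
    return "open"
```

Under this sketch, a window whose sensor reports "closed" but whose close command produces no motor load is reported to the acquisition unit 52 as open, matching the behavior described above.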
FIG. 2 schematically illustrates the vehicle cabin of the vehicle 90 in FIG. 1. One microphone 20 is provided, for example, near the center of the vehicle cabin. The microphone 20 is unidirectional, and its sensitivity in a specific direction is higher than its sensitivity in other directions. In a normal time other than the time of occurrence of an accident, the directivity of the microphone 20 is set to a direction d1 of the driver's seat so that it is easy to acquire the driver's speech. In the normal time, information on the voice acquired by the microphone is supplied to a navigation device (not illustrated) or the like, and the navigation device or the like can be manipulated through voice recognition. The direction of the directivity of the microphone can be changed under the control of the directivity controller 60. The directivity of the microphone 20 can be changed to, for example, a direction d2 of the passenger seat, a direction d3 of the right rear seat, a direction d4 of the left rear seat, a direction d5 of the window on the driver's seat side, and the like. - Returning to
FIG. 1. The holding unit 58 holds, in advance, control information for the directivity of the microphone 20 regarding each seat and each window. The directivity control information includes, for example, an angle for setting the directivity in the direction of each seat and each window. - The
directivity controller 60 controls the directivity of the microphone 20 based on the directivity information of the microphone 20 transmitted from the information processing device 40 (to be described below) at the time of occurrence of an accident, and on the control information for the directivity held in the holding unit 58. For example, when the designated directivity is the direction of the right rear seat, the directivity controller 60 controls the directivity based on the control information for the directivity of the right rear seat. - Voice information acquired by the
microphone 20 is also supplied to the call unit 62. When the occurrence of the accident is detected, the call unit 62 is connected to the call unit 78 of the information processing device 40 via the communication unit 56, and executes a call between the occupant of the vehicle 90 and the operator of the information processing device 40 using the microphone 20 and the speaker 22. The speaker 22 is provided in the vehicle cabin and outputs a voice based on a voice signal supplied from the call unit 62. - The
information processing device 40 includes a communication unit 70, an acquisition unit 72, a display controller 74, a display unit 76, a call unit 78, a reception unit 80, and a determination unit 82. The communication unit 70 receives the emergency notification information from the vehicle that is in the accident, and outputs the received emergency notification information to the acquisition unit 72. - The
acquisition unit 72 acquires the emergency notification information output from the communication unit 70, and acquires information on the occupant of the accident vehicle based on the emergency notification information. - The
acquisition unit 72 acquires information on the position of the occupant based on the detection result of the occupant detection sensor 12 in the emergency notification information. - The
acquisition unit 72 acquires information on the state of the occupant based on the detection result of the occupant detection sensor 12 and the detection result of the seat belt sensor 14 in the emergency notification information. The acquisition unit 72 acquires information on the state of the occupant indicating that the degree of serious injury of the occupant is low when the occupant is seated on a certain seat and the seat belt is worn. The acquisition unit 72 acquires information on the state of the occupant indicating that the degree of serious injury of the occupant is high when the occupant is seated on the certain seat and the seat belt is not worn. - The
acquisition unit 72 acquires information on the open and closed state of the window based on the detection result of the open and closed state of the window in the emergency notification information. The acquisition unit 72 outputs the acquired information on the occupant and the acquired information on the open and closed state of the window to the display controller 74. - The
display controller 74 controls the display of the image on the display unit 76 based on the information on the occupant and the information on the open and closed state of the window, which are output from the acquisition unit 72. - Under the control of the
display controller 74, the display unit 76 displays an image illustrating the seating position of the occupant in the accident vehicle, the state of the occupant, and the open and closed state of the window.
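The way the acquisition unit 72 derives the displayed occupant states from the seating and seat belt detection results could be sketched as follows. This is a minimal illustration; the function name, seat names, and dictionary representation are assumptions, not the actual implementation.

```python
def occupant_display_data(seat_occupied: dict, belt_worn: dict) -> dict:
    """Derive per-seat display information from the sensor results
    contained in the emergency notification information.

    seat_occupied: per-seat result of the occupant detection sensor 12.
    belt_worn: per-seat result of the seat belt sensor 14.
    Returns a mapping from occupied seats to a degree of serious injury.
    """
    states = {}
    for seat, occupied in seat_occupied.items():
        if not occupied:
            continue  # no figure is displayed for an empty seat
        # Seat belt worn -> low degree of serious injury; not worn -> high.
        states[seat] = "low" if belt_worn.get(seat, False) else "high"
    return states
```

For the situation shown in FIG. 3 (driver and passenger unbelted, right rear occupant belted, left rear seat empty), this sketch would return "high" for the driver's seat and passenger seat, "low" for the right rear seat, and no entry for the left rear seat.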
FIG. 3 illustrates an example of an image 100 displayed on the display unit 76 of the information processing device 40. The image 100 includes a figure 102 indicating an occupant whose seating position is the driver's seat, a figure 104 indicating an occupant whose seating position is the passenger seat, and a figure 106 indicating an occupant whose seating position is the right rear seat. Since no figure is displayed at the position corresponding to the left rear seat, this indicates that no occupant is seated at the left rear seat. Instead of the figures, characters indicating occupants may be displayed. - Each of the
figures 102, 104, and 106 also indicates the state of the occupant using a color or a pattern. The figure 102 indicates that the state of the occupant at the driver's seat is "a high degree of serious injury". The figure 104 indicates that the state of the occupant at the passenger seat is "a high degree of serious injury". The figure 106 indicates that the state of the occupant at the right rear seat is "a low degree of serious injury". - The
image 100 also includes a figure 110 indicating that the window on the passenger seat side is in an open state. Since no figures are displayed at the positions corresponding to the other windows, the image 100 indicates that the other windows are in a closed state. - Returning to
FIG. 1. When the acquisition unit 72 acquires the emergency notification information, the call unit 78 executes a call between the occupant of the vehicle 90 and the operator via the call unit 62 of the emergency notification device 10 that has transmitted the emergency notification information. - The
reception unit 80 receives a manipulation input of the operator for designating the seating position of the occupant or the position of the window in the open state in the image displayed on the display unit 76. The reception unit 80 may include a touch sensor that receives a touch manipulation input on the screen of the display unit 76 by the operator. - The
determination unit 82 determines the directivity of the microphone 20 in the vehicle cabin of the vehicle 90 based on the manipulation input. The determination unit 82 determines the directivity of the microphone 20 to be the direction of the designated seating position of the occupant based on the manipulation input for designating the seating position of the occupant in the image displayed on the display unit 76. The determination unit 82 determines the directivity of the microphone 20 to be the direction of the designated window based on the manipulation input for designating the window in the image displayed on the display unit 76. - The
determination unit 82 outputs information on the determined directivity of the microphone 20 to the communication unit 70, and the communication unit 70 transmits the directivity information of the microphone 20 to the emergency notification device 10. The emergency notification device 10 controls the directivity of the microphone 20 according to the directivity information, as described above. - When a call is started, the operator calls out to the occupant of the
vehicle 90 with, for example, "Are you OK?". The voice of the operator can be heard at each seat. When there is a response indicating that there is no problem from the driver, to whom the microphone 20 is directed, the operator does not request an ambulance. - When there is no response to the call from the driver, the operator confirms the
image 100 on the display unit 76, and designates the seating position of an occupant with a low degree of serious injury through, for example, a touch manipulation on the screen. The operator can rapidly ascertain the degree of serious injury and the seating position of each occupant by confirming the image. Accordingly, it becomes easy to start talking with an occupant with a low degree of serious injury, who is more likely to be able to talk than an occupant with a high degree of serious injury. In the example of FIG. 3, it is assumed that the operator designates the right rear seat. - When the operator can talk with the designated occupant, the operator listens to the degree of injury of each occupant and requests an ambulance. Since the
microphone 20 has high sensitivity in the direction of the designated occupant and low sensitivity in other directions, it is easy to clearly acquire the voice of the designated occupant even in a situation in which the noise outside the vehicle is loud. - The operator also requests an ambulance when there is no response from any of the occupants. When the operator requests an ambulance, the operator sends a notification to the communication command room of the fire department headquarters, informs it of the position of the
vehicle 90 and the information for specifying the vehicle 90 based on the received emergency notification information, and informs it of the degree of injury when the operator has been able to learn the degree of injury from an occupant. The information processing device 40 may transmit the emergency notification information to a terminal device such as the communication command room of the fire department. - The operator may designate the position of a window in an open state in the image when there is no response from any of the occupants. The operator can rapidly ascertain which window is in an open state by confirming the image. Since the directivity of the
microphone 20 is controlled to the direction of the designated open window and the operator can listen to the sound outside the vehicle collected by the microphone 20, the range of possible measures is widened, as will be shown below. - For example, even when it is difficult to specify the road on which the accident has occurred from the position of the
vehicle 90 in the emergency notification information, because an expressway or a highway runs vertically in parallel with general roads, the road on which the accident has occurred is likely to be rapidly specified. When the travel sound of other vehicles is heard from outside the vehicle and no voice of a passerby is heard, the accident is likely to have occurred on an expressway or a highway. When the voice of a passerby outside the vehicle is heard, the accident is likely to have occurred on a general road. The operator can talk to a passerby outside the vehicle to confirm whether the road is a general road. When the operator talks to a passerby outside the vehicle, the information processing device 40 may transmit an instruction to increase the volume of the speaker 22 to the emergency notification device 10 according to the manipulation input of the operator. - Further, the operator can talk with a passerby outside the vehicle to request that the passerby perform an emergency measure for the occupant, or to learn a more detailed accident situation or the like from the passerby. Since the
microphone 20 has high sensitivity in the direction of the designated window in an open state and low sensitivity in other directions, it is easy to clearly acquire the voice of the passerby. - Although the configurations of the
detection unit 50, the acquisition unit 52, the derivation unit 54, the directivity controller 60, the call unit 62, the acquisition unit 72, the display controller 74, the call unit 78, the reception unit 80, and the determination unit 82 can be realized by hardware, such as a CPU, a memory, or another LSI of any computer, and by software, such as a program loaded into the memory, the functional blocks realized by cooperation of these are depicted here. Therefore, it is understood by those skilled in the art that these functional blocks can be realized in various forms by hardware alone, by software alone, or by a combination thereof. - According to the embodiment, since the directivity of the
microphone 20 in the vehicle cabin of the vehicle 90 that is in the accident is determined based on a manipulation input of the operator, it is easy to clearly acquire the voice of the occupant at the designated position. Thus, it is possible for the operator to rapidly take an appropriate action with respect to the accident. - The present disclosure has been described above based on the embodiment. The embodiment is merely an example, and it is understood by those skilled in the art that various modification examples can be made with respect to a combination of the respective components or respective processes, and such modification examples are also within the scope of the present disclosure.
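The designation-to-directivity flow of the embodiment (reception unit 80 → determination unit 82 → directivity controller 60) can be sketched roughly as below. The concrete angle values are invented placeholders, since the disclosure only states that the holding unit 58 holds an angle for each seat and each window.

```python
# Control information as held by the holding unit 58; the angles below
# are assumed purely for illustration and do not come from the disclosure.
DIRECTIVITY_ANGLES = {
    "driver_seat": 0,        # direction d1
    "passenger_seat": 60,    # direction d2
    "rear_right_seat": 150,  # direction d3
    "rear_left_seat": 210,   # direction d4
    "driver_window": 330,    # direction d5
}

def determine_directivity(designated_target: str) -> int:
    """Map the operator's designation (a seat or an open window touched
    in the displayed image) to the microphone directivity angle that the
    directivity controller 60 should apply."""
    if designated_target not in DIRECTIVITY_ANGLES:
        raise ValueError(f"no directivity control information for {designated_target!r}")
    return DIRECTIVITY_ANGLES[designated_target]
```

In the FIG. 3 scenario, the operator's touch on the right rear seat would map to the stored angle for that seat, which is then transmitted to the emergency notification device 10 as directivity information.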
- For example, in the case of the
vehicle 90 in which the occupant detection sensor 12 and the seat belt sensor 14 are not provided at the rear seat, the presence or absence of opening and closing of a door of the rear seat may be detected by a door opening and closing sensor (not illustrated), and when the door is opened and subsequently closed before travel of the vehicle 90 starts, the acquisition unit 52 of the emergency notification device 10 may acquire that an occupant is seated on the seat corresponding to the door. Further, when the occupant detection sensor 12 and the seat belt sensor 14 are not provided at the rear seat, the directivity of the microphone 20 may be regularly directed to the direction of the rear seat during travel of the vehicle 90, and the acquisition unit 52 may estimate the seating position of the occupant at the rear seat based on the frequency of a sound collected by the microphone 20 and the direction of the directivity when the sound is collected. In these modification examples, in the vehicle 90 in which the occupant detection sensor 12 and the seat belt sensor 14 are not provided at the rear seat, it is possible for the operator to ascertain the presence or absence of an occupant at the rear seat and the seating position. - Although one
microphone 20 is provided in the vehicle 90 in the embodiment, a first microphone whose directivity can be controlled with respect to the occupant at the driver's seat, the occupant at the passenger seat, and the window, and a second microphone whose directivity can be controlled with respect to the occupant at the rear seat and the window may be provided. In this modification example, the degree of freedom of the configuration of the vehicle 90 can be improved. - The
emergency notification device 10 may hold driver information input in advance. The driver information includes, for example, information likely to be effective for lifesaving treatment, such as an age group, a blood type, and a sex. When the occurrence of an accident is detected, the derivation unit 54 also includes the driver information in the emergency notification information. In this modification example, for example, even when the driver is not conscious, the operator can inform emergency personnel of the driver information before they are dispatched, and it becomes easy for the emergency personnel to prepare for the dispatch. - The
derivation unit 54 of the emergency notification device 10 may also include, in the emergency notification information, information on the acceleration at the time of occurrence of the accident detected by an acceleration sensor (not illustrated). The acquisition unit 72 of the information processing device 40 may increase the degree of serious injury for the occupant on a seat in which the seat belt is not worn as the absolute value of the acceleration becomes greater. That is, the degree of serious injury may have more than the two levels of high and low. In this modification example, it is possible to increase the amount of information to be provided to the operator. - When rollover of the
vehicle 90 is detected by a gyro sensor (not illustrated) or the like, the derivation unit 54 of the emergency notification device 10 may also include rollover information in the emergency notification information. In this modification example, it is possible to increase the amount of information to be provided to the operator.
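The acceleration-based modification example described above could be sketched as follows. The three-level scale and the threshold value are assumptions made for illustration; the disclosure only states that the severity may increase with the absolute value of the acceleration for an occupant whose seat belt is not worn.

```python
def injury_severity(occupied: bool, belt_worn: bool,
                    accel_abs_g: float, high_g_threshold: float = 30.0):
    """Grade the degree of serious injury on an assumed 3-level scale.

    Returns None for an empty seat, 1 (low) for a belted occupant, and
    2 or 3 for an unbelted occupant depending on whether the absolute
    acceleration at the time of the accident exceeds an assumed
    threshold (the 30 g default is purely illustrative).
    """
    if not occupied:
        return None
    if belt_worn:
        return 1  # low degree of serious injury
    # Unbelted occupant: severity grows with the absolute acceleration.
    return 3 if accel_abs_g >= high_g_threshold else 2
```

This keeps the two-level behavior of the embodiment as a special case (levels 1 and 2-or-3 collapse to low and high) while giving the operator a finer-grained indication when acceleration data is present.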
Claims (4)
1. An information processing device, comprising:
an acquisition unit configured to acquire information on an open and closed state of a window of a vehicle that is in an accident;
a reception unit configured to receive a manipulation input from an operator external to the vehicle;
a display unit configured to display an image representing the open and closed state of the window based on the information on the open and closed state of the window; and
a determination unit configured to determine directivity of a microphone in a vehicle cabin of the vehicle to be a direction of a designated window based on the manipulation input for designating the window in the image displayed on the display unit.
2. The information processing device according to claim 1, further comprising:
a speaker in the vehicle cabin configured to output a voice of the operator to the vehicle cabin and outside of the vehicle.
3. An information processing method, comprising:
an acquisition step of acquiring information on an open and closed state of a window of a vehicle that is in an accident;
a reception step of receiving a manipulation input from an operator external to the vehicle;
displaying on a display unit an image representing the open and closed state of the window based on the information on the open and closed state of the window; and
a determination step of determining directivity of a microphone in a vehicle cabin of the vehicle to be a direction of a designated window based on the manipulation input for designating the window in the image displayed on the display unit.
4. The information processing method according to claim 3, further comprising:
outputting a voice of the operator to the vehicle cabin and outside of the vehicle via a speaker in the vehicle cabin.
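The steps of the claimed method (acquisition, display, reception, determination) can be sketched as follows. The window names, the bearing table, and all function and class names are assumptions made for illustration; the claims do not specify any particular geometry or API.

```python
from dataclasses import dataclass

# Assumed bearing (degrees, cabin frame) of each window as seen from the
# cabin microphone array; purely illustrative geometry.
WINDOW_BEARINGS = {
    "front_left": 315.0, "front_right": 45.0,
    "rear_left": 225.0, "rear_right": 135.0,
}


@dataclass
class WindowImage:
    """Image shown on the operator's display unit (display step)."""
    open_state: dict  # window name -> True if the window is open


def acquire_open_closed_state(vehicle_report: dict) -> dict:
    """Acquisition step: read the open/closed state of each window from
    the information sent by the vehicle that is in an accident."""
    return dict(vehicle_report.get("windows", {}))


def determine_directivity(designated_window: str) -> float:
    """Determination step: steer the directivity of the cabin microphone
    toward the window the operator designated on the displayed image."""
    return WINDOW_BEARINGS[designated_window]


# Acquisition -> display -> reception (operator designates a window) -> determination
state = acquire_open_closed_state({"windows": {"front_left": True, "rear_right": False}})
image = WindowImage(open_state=state)
bearing = determine_directivity("front_left")  # manipulation input from the operator
```

Steering the microphone toward an open window lets the operator hear an occupant who may have been ejected toward, or is calling through, that window.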
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/205,053 US20210201601A1 (en) | 2018-10-24 | 2021-03-18 | Information processing device and information processing method |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018200397A JP7115216B2 (en) | 2018-10-24 | 2018-10-24 | Information processing device and information processing method |
JP2018-200397 | 2018-10-24 | ||
US16/538,892 US10991171B2 (en) | 2018-10-24 | 2019-08-13 | Information processing device and information processing method |
US17/205,053 US20210201601A1 (en) | 2018-10-24 | 2021-03-18 | Information processing device and information processing method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/538,892 Continuation US10991171B2 (en) | 2018-10-24 | 2019-08-13 | Information processing device and information processing method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210201601A1 true US20210201601A1 (en) | 2021-07-01 |
Family
ID=70327405
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/538,892 Active US10991171B2 (en) | 2018-10-24 | 2019-08-13 | Information processing device and information processing method |
US17/205,053 Abandoned US20210201601A1 (en) | 2018-10-24 | 2021-03-18 | Information processing device and information processing method |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/538,892 Active US10991171B2 (en) | 2018-10-24 | 2019-08-13 | Information processing device and information processing method |
Country Status (3)
Country | Link |
---|---|
US (2) | US10991171B2 (en) |
JP (1) | JP7115216B2 (en) |
CN (1) | CN111093130B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7353534B2 (en) * | 2021-05-17 | 2023-09-29 | 三菱電機株式会社 | Emergency request device |
WO2023112668A1 (en) * | 2021-12-16 | 2023-06-22 | 日本電気株式会社 | Sound analysis device, sound analysis method, and recording medium |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6735506B2 (en) * | 1992-05-05 | 2004-05-11 | Automotive Technologies International, Inc. | Telematics system |
US9129505B2 (en) * | 1995-06-07 | 2015-09-08 | American Vehicular Sciences Llc | Driver fatigue monitoring system and method |
JP2002123880A (en) | 2000-10-13 | 2002-04-26 | Denso Corp | Emergency countermeasure system and emergency countermeasure device |
US20130267194A1 (en) * | 2002-06-11 | 2013-10-10 | American Vehicular Sciences Llc | Method and System for Notifying a Remote Facility of an Accident Involving a Vehicle |
JP2007094935A (en) * | 2005-09-30 | 2007-04-12 | Omron Corp | Information processing device, method, system, and program, and recording medium |
EP1980093A2 (en) * | 2005-10-03 | 2008-10-15 | John R. Eubank | First aid information for cellular telephones and electronic devices |
US8054990B2 (en) * | 2006-11-22 | 2011-11-08 | General Motors Llc | Method of recognizing speech from a plurality of speaking locations within a vehicle |
US20080143497A1 (en) | 2006-12-15 | 2008-06-19 | General Motors Corporation | Vehicle Emergency Communication Mode Method and Apparatus |
JP2009124540A (en) | 2007-11-16 | 2009-06-04 | Toyota Motor Corp | Vehicle call device, and calling method |
JP4547721B2 (en) * | 2008-05-21 | 2010-09-22 | 株式会社デンソー | Automotive information provision system |
US9491420B2 (en) * | 2009-09-20 | 2016-11-08 | Tibet MIMAR | Vehicle security with accident notification and embedded driver analytics |
US20110112988A1 (en) * | 2009-11-12 | 2011-05-12 | Baker William W | Portable radio communication apparatus and method of use |
JP5729345B2 (en) * | 2012-04-10 | 2015-06-03 | 株式会社デンソー | Emotion monitoring system |
US20140111357A1 (en) | 2012-10-22 | 2014-04-24 | Ford Global Technologies, Llc | Method and Apparatus for Alarm Control |
JP6089799B2 (en) | 2013-03-06 | 2017-03-08 | 株式会社デンソー | Vehicle sound input control device |
CN104065798B (en) | 2013-03-21 | 2016-08-03 | 华为技术有限公司 | Audio signal processing method and equipment |
JP2015153001A (en) | 2014-02-12 | 2015-08-24 | 株式会社デンソー | Vehicle accident state prediction device and vehicle accident state prediction system |
US9800983B2 (en) * | 2014-07-24 | 2017-10-24 | Magna Electronics Inc. | Vehicle in cabin sound processing system |
JP6376381B2 (en) | 2014-07-28 | 2018-08-22 | 株式会社デンソー | Vehicle accident reporting system |
JPWO2016072164A1 (en) * | 2014-11-05 | 2017-08-10 | 日立オートモティブシステムズ株式会社 | In-vehicle voice processing device |
JP6611474B2 (en) * | 2015-06-01 | 2019-11-27 | クラリオン株式会社 | Sound collector and control method of sound collector |
DE102016200061B4 (en) | 2016-01-06 | 2021-11-04 | Volkswagen Aktiengesellschaft | Method, computer program and devices for remote control of a vehicle by means of a mobile device |
JP6607119B2 (en) | 2016-03-29 | 2019-11-20 | 株式会社デンソー | Driver emergency response system and vehicle side response device |
US10547937B2 (en) * | 2017-08-28 | 2020-01-28 | Bose Corporation | User-controlled beam steering in microphone array |
CN108162903B (en) * | 2017-12-26 | 2020-08-04 | 奇瑞新能源汽车股份有限公司 | Method and device for monitoring a vehicle |
2018
- 2018-10-24 JP JP2018200397A patent/JP7115216B2/en active Active
2019
- 2019-08-13 US US16/538,892 patent/US10991171B2/en active Active
- 2019-08-21 CN CN201910772065.0A patent/CN111093130B/en active Active
2021
- 2021-03-18 US US17/205,053 patent/US20210201601A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US20200134936A1 (en) | 2020-04-30 |
JP2020066339A (en) | 2020-04-30 |
CN111093130B (en) | 2021-12-28 |
JP7115216B2 (en) | 2022-08-09 |
US10991171B2 (en) | 2021-04-27 |
CN111093130A (en) | 2020-05-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210201601A1 (en) | Information processing device and information processing method | |
US10525921B2 (en) | Monitoring windshield vibrations for vehicle collision detection | |
US9235687B2 (en) | Apparatus for estimating bodily injury level of vehicle occupant | |
JP6376381B2 (en) | Vehicle accident reporting system | |
CN111204302A (en) | Automobile collision accident handling method and electronic equipment | |
CN110313022B (en) | Apparatus and method for accident response | |
JP2009290789A (en) | System and method for emergency reporting | |
JP2017117194A (en) | Vehicular emergency call system | |
US11700522B2 (en) | Vehicle that has automatic notification function | |
JP2015161977A (en) | Emergency corresponding device for vehicle and vehicle accident notification system | |
JP2022027035A (en) | Server device used for vehicle automatic emergency notification system | |
KR20180115444A (en) | eCall system and the operating method thereof | |
JP4186356B2 (en) | Vehicle emergency call device and vehicle emergency call method | |
CN108622001B (en) | Method for triggering a security function | |
JP2009003806A (en) | Emergency reporting system and portable terminal | |
JP4883039B2 (en) | Emergency call system terminal | |
US11812356B2 (en) | Vehicle with automatic report function | |
KR20220081453A (en) | Apparatus for rear occupant alert of vehicle and method thereof | |
JP4136174B2 (en) | Reporting system | |
JP4136175B2 (en) | Reporting system | |
JP6089799B2 (en) | Vehicle sound input control device | |
JP7264139B2 (en) | VEHICLE AGENT DEVICE, VEHICLE AGENT SYSTEM, AND VEHICLE AGENT PROGRAM | |
KR20220036767A (en) | Emergency call system for analyzing accident based on image and method of providing emergency call in the same | |
CN113487830A (en) | Vehicle safety early warning method, device, storage medium and device | |
CN117382574A (en) | Seat belt control method, apparatus, vehicle, storage medium, and program product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |