WO2018101429A1 - Information processing device, information collection method, and program - Google Patents

Information processing device, information collection method, and program

Info

Publication number
WO2018101429A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
sound
output
generated
vehicle
Prior art date
Application number
PCT/JP2017/043135
Other languages
English (en)
Japanese (ja)
Inventor
洋人 河内
昭光 藤吉
洋一 奥山
Original Assignee
パイオニア株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パイオニア株式会社
Publication of WO2018101429A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01H MEASUREMENT OF MECHANICAL VIBRATIONS OR ULTRASONIC, SONIC OR INFRASONIC WAVES
    • G01H3/00 Measuring characteristics of vibrations by using a detector in a fluid
    • G01H3/04 Frequency
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram

Definitions

  • the present invention relates to an information processing apparatus, an information collection method, and a program.
  • Examples of techniques for collecting and utilizing various kinds of information during traveling of a vehicle are disclosed in, for example, Patent Document 1 and Patent Document 2 below.
  • Patent Document 1 discloses a technique in which sound generated when a vehicle travels over a bridge joint is collected using a microphone mounted on the vehicle, and an abnormality of the bridge joint is detected by analyzing the sound.
  • Patent Document 2 discloses a technique in which, when a vibration measurement value greater than a predetermined threshold is detected by vibration measurement means during travel of the vehicle, the vehicle position information and the vibration measurement value at that time are output so that a road administrator can detect a place that needs to be repaired.
  • Patent Document 1: JP 2011-242294 A. Patent Document 2: JP 2016-95184 A.
  • The technology for utilizing map information can be improved by creating map information that has a large amount of information and is useful in various aspects. Accordingly, a technique for creating such map information is desired.
  • Examples of problems to be solved by the present invention include providing a technique for creating map information that has a large amount of information and is useful in various aspects, and contributes to driving support for a moving object.
  • The invention described in claim 1 is an information processing apparatus comprising: an acquisition unit that acquires sound information collected by a sound collecting device mounted on a moving body and generated based on an external sound generated outside the moving body, together with position information of the moving body at the time the sound information was generated; and an output unit that generates output information in which the sound information and the position information are associated with each other, and outputs the output information to a predetermined output destination.
  • The invention according to claim 9 is an information collection method in which a computer executes: a process of acquiring sound information collected by a sound collecting device mounted on a moving body and generated based on an external sound generated outside the moving body, together with position information of the moving body at the time the sound information was generated; and a process of generating output information in which the sound information and the position information are associated, and outputting the output information to a predetermined output destination.
  • The invention according to claim 10 is a program for causing a computer to function as: means for acquiring sound information collected by a sound collecting device mounted on a moving body and generated based on an external sound generated outside the moving body, together with position information of the moving body at the time the sound information was generated; and means for generating output information in which the sound information and the position information are associated with each other, and outputting the output information to a predetermined output destination.
  • FIG. 3 is a diagram for explaining the flow of processing in which the acquisition unit specifies the "position information of the vehicle when the sound information was generated" based on the correspondence between the acquisition time of the vehicle position information and the acquisition time of the sound information.
  • FIG. 4 is a block diagram conceptually showing the functional configuration of the server apparatus of the first embodiment. FIG. 5 is a diagram showing an example of a table that associates sound information with map information. FIG. 6 is a diagram illustrating the hardware configurations of the information processing apparatus and the server apparatus.
  • FIG. 7 is a flowchart illustrating the flow of processing performed by the information processing apparatus of the first embodiment. FIG. 8 is a flowchart illustrating another example of processing executed by the information processing apparatus. FIG. 9 is a flowchart illustrating the flow of processing performed by the server apparatus of the first embodiment. FIG. 10 is a block diagram conceptually showing the functional configuration of the information processing apparatus in the second embodiment. FIG. 11 is a diagram showing an example of a table that associates sound analysis information with map information. FIG. 12 is a block diagram conceptually showing the functional configuration of the analysis server apparatus in the third embodiment. FIG. 13 is a diagram showing an example of a table that associates additional information with map information.
  • FIG. 14 is a diagram illustrating the hardware configuration of the analysis server apparatus. FIG. 15 is a flowchart illustrating the flow of processing performed by the analysis server apparatus in the third embodiment. The remaining figures are a block diagram conceptually showing the functional configuration of the analysis server apparatus in the fourth embodiment and a diagram showing an example of the drawing data generated by a display output unit.
  • each block in the block diagram represents a functional unit configuration, not a hardware unit configuration.
  • FIG. 1 is a block diagram conceptually illustrating a configuration example of the information collection system 1.
  • the dotted line in the figure represents a wired or wireless communication path.
  • The information collection system 1 includes information processing devices 10 connected to sound collection devices 30 mounted on individual vehicles, and a server device 20 connected to the information processing device 10 of each vehicle.
  • the configuration of the information collection system 1 is not limited to the example of FIG.
  • the information collection system 1 can be configured to include one or more information processing apparatuses 10 and one or more server apparatuses 20.
  • the case of the vehicle will be described as a specific example, but the present invention may be applied to a moving body other than the vehicle.
  • the information processing apparatus 10 is an apparatus mounted on a vehicle.
  • the information processing apparatus 10 can communicate with the server apparatus 20 by connecting to the network 50 via a wireless line such as 3G or LTE (Long Term Evolution), for example.
  • the information processing device 10 is, for example, an external device that can be attached to the inside or outside of a vehicle.
  • the information processing apparatus 10 may be an apparatus incorporated in a vehicle, such as an ECU (Electronic Control Unit). Further, the information processing apparatus 10 may be a portable terminal (for example, a smartphone or a tablet terminal) in which an application that realizes each function described below is installed.
  • the server device 20 is a device having a function of collecting information acquired by the information processing device 10.
  • the server device 20 can communicate with each of the information processing devices 10 mounted on each vehicle via the network 50.
  • the sound collecting device 30 is a device including a microphone or a microphone array that generates and outputs an electrical signal (sound information) corresponding to the collected sound wave.
  • the sound collecting device 30 is provided, for example, on the outer periphery of the vehicle.
  • the installation position and number of the sound collecting devices 30 are not particularly limited.
  • the sound collection device 30 can transmit sound information to the information processing device 10 by wireless connection or wired connection. Further, the information processing device 10 and the sound collecting device 30 may be integrally configured as one device.
  • FIG. 2 is a block diagram conceptually showing the functional configuration of the information processing apparatus 10 according to the first embodiment.
  • the information processing apparatus 10 according to the present embodiment includes an acquisition unit 12 and an output unit 14.
  • the acquisition unit 12 acquires sound information generated using the sound collection device 30 mounted on the vehicle.
  • the acquisition unit 12 acquires vehicle position information when the sound information is generated.
  • the acquisition unit 12 acquires position information from a GPS (Global Positioning System) module (not shown) or the like mounted on the vehicle in accordance with the timing at which sound information is acquired from the sound collection device 30.
  • Alternatively, the acquisition unit 12 may acquire the position information of surrounding radio base stations in accordance with the timing at which the sound information is acquired from the sound collection device 30, and calculate the position information of the vehicle using the position information of those radio base stations.
  • the position information acquired in this way can be used as “position information of the vehicle when sound information is generated”.
  • The acquisition unit 12 can also specify the "position information of the vehicle when the sound information was generated" based on the correspondence between the time at which the position information of the vehicle was acquired and the time at which the sound information generated using the sound collection device 30 was acquired.
  • FIG. 3 is a diagram for explaining the flow of processing in which the acquisition unit 12 specifies the "position information of the vehicle when the sound information was generated" based on the correspondence between the acquisition time of the vehicle position information and the acquisition time of the sound information.
  • the horizontal axis in FIG. 3 represents the time axis.
  • The graph to which the reference symbol S is given indicates the sound information acquired by the acquisition unit 12 from the sound collection device 30. It is assumed that the acquisition unit 12 acquires the position information of the vehicle at time t_A and at time t_B.
  • Based on the correspondence between the vehicle position information acquired at time t_A and the vehicle position information acquired at time t_B, the acquisition unit 12 can specify the position information acquired at time t_B as the position information of the vehicle at the time the sound information S_B, obtained between time t_A and time t_B, was generated.
  • Alternatively, the acquisition unit 12 may specify position information having a width (position information including the two points of the position at time t_A and the position at time t_B) as the position information of the vehicle at the time the sound information S_B was generated.
  • The acquisition unit 12 may also specify a representative position between the position at time t_A and the position at time t_B (for example, a waypoint such as the midpoint between the two) as that position information.
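  • As an illustrative sketch of this timing-based specification (not the patent's implementation; the helper names `position_log` and `position_for_sound` are hypothetical), the following Python code selects the position fix or interval that brackets a sound segment's acquisition time:

```python
from bisect import bisect_left

def position_for_sound(position_log, sound_time):
    # position_log: time-ordered list of (time, (lat, lon)) fixes,
    # e.g. the fixes acquired at times t_A and t_B.
    # Returns the one or two fixes that bracket sound_time.
    times = [t for t, _ in position_log]
    i = bisect_left(times, sound_time)
    if i == 0:
        return (position_log[0][1],)       # only a later fix exists
    if i == len(times):
        return (position_log[-1][1],)      # only an earlier fix exists
    # position information "with a width": the two points at t_A and t_B
    return (position_log[i - 1][1], position_log[i][1])

def midpoint(p_a, p_b):
    # A representative position between the two fixes may be used instead.
    return ((p_a[0] + p_b[0]) / 2.0, (p_a[1] + p_b[1]) / 2.0)
```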
  • the output unit 14 generates output information in which the sound information acquired by the acquisition unit 12 and the vehicle position information are linked to each other.
  • the output unit 14 outputs output information including sound information and vehicle position information to a predetermined output destination.
  • the predetermined output destination may be, for example, the server device 20 or a storage device (not shown) built in the information processing device 10.
  • FIG. 4 is a block diagram conceptually showing the functional configuration of the server apparatus 20 of the first embodiment. As illustrated in FIG. 4, the server device 20 includes a reception unit 22 and an association unit 24.
  • the receiving unit 22 receives output information output from the output unit 14 of the information processing apparatus 10 as input information.
  • the associating unit 24 stores the sound information included in the input information in a predetermined storage unit in association with the map information using the vehicle position information included in the input information received by the receiving unit 22.
  • the predetermined storage unit is, for example, an internal storage device (not shown) included in the server device 20 or an external storage device connected to the server device 20.
  • the associating unit 24 can associate sound information and map information using a table as shown in FIG.
  • FIG. 5 is a diagram showing an example of a table associating sound information with map information.
  • the table illustrated in FIG. 5 stores sound information for each piece of information indicating the position on the map (hereinafter referred to as “map position information”) stored in the “position on the map” column.
  • the map position information may be, for example, information indicating one position or information indicating an area defined by three or more position information.
  • the map position information may be a simple identifier. In this case, information indicating a specific position or area on the map is separately prepared in a state associated with the identifier.
  • For example, the associating unit 24 can specify the row (record) in which the sound information included in the input information is to be stored by comparing the position information included in the input information received by the receiving unit 22 with the map position information stored in the "position on the map" column. Specifically, the associating unit 24 can specify the row (record) storing map position information that matches the position indicated by the position information included in the input information, or that indicates an area including that position, as the "row (record) in which the sound information is to be stored". Then, the associating unit 24 stores the sound information included in the input information received by the receiving unit 22 in the "sound information" column of the identified row (record). As described above, the associating unit 24 can associate the sound information with the map information by storing the sound information in association with the information indicating the position on the map.
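  • A minimal sketch of this lookup, assuming map position information is either a single point or a rectangular area (the record layout and helper names below are hypothetical, not the patent's data format):

```python
def find_record(table, position):
    # Each record is a dict whose "map_position" entry is either a
    # (lat, lon) point or a (lat_min, lat_max, lon_min, lon_max) area.
    lat, lon = position
    for record in table:
        mp = record["map_position"]
        if len(mp) == 2 and tuple(mp) == tuple(position):
            return record                  # exact match of the position
        if len(mp) == 4 and mp[0] <= lat <= mp[1] and mp[2] <= lon <= mp[3]:
            return record                  # area containing the position
    return None

def store_sound_info(table, input_info):
    # Store the sound information in the "sound information" column of
    # the identified row (record).
    record = find_record(table, input_info["position"])
    if record is not None:
        record.setdefault("sound_info", []).append(input_info["sound"])
```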
  • As described above, the sound information generated by the sound collecting device 30 mounted on each vehicle and the position information of the vehicle at the time the sound information was generated are sent to the server device 20 in a state of being linked to each other.
  • the sound information includes various information that can indicate the situation when the sound information is generated. For example, if a person's voice is included in the sound information, it can be understood that a person exists around the vehicle. Further, for example, the age and sex of a person can be estimated from the feature amount (frequency distribution or the like) of the person's voice. For example, if the siren sound of an emergency vehicle is contained in sound information, it can be determined whether the emergency vehicle passed near the vehicle. For example, if the sound information includes an object collision sound or a sudden brake sound, it is possible to detect a contact accident or a sudden brake step around the vehicle.
  • Further, for example, using the characteristic that the volume in the high-frequency region increases as the vehicle moves faster, an estimated value of the moving speed of the vehicle can be calculated from the sound information using a function derived from this relationship.
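  • A hedged sketch of such a derived function (the affine form and its coefficients are placeholders, not values disclosed in the patent):

```python
def estimate_speed_kmh(high_band_level_db, a=2.5, b=-80.0):
    # Assumes a monotonically increasing relationship, fitted offline,
    # between the volume of the high-frequency region and vehicle speed.
    # a and b are illustrative placeholders.
    return max(0.0, a * high_band_level_db + b)
```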
  • In the present embodiment, the "position on the map" is specified based on the position information of the vehicle linked to the sound information, and the sound information is stored in association with that position on the map. Since sound information is accumulated for each position on the map, characteristics of each position (for example, many / few pedestrians and bicycles, frequent / infrequent passage of emergency vehicles, fast / slow vehicle moving speed, and so on) can be grasped. That is, according to the present embodiment, map information having high information density (added value) and useful in various aspects can be created.
  • Each functional component of the information processing device 10 and the server device 20 may be realized by hardware that implements it (for example, a hard-wired electronic circuit), or by a combination of hardware and software (for example, a combination of an electronic circuit and a program that controls the electronic circuit).
  • FIG. 6 is a diagram illustrating a hardware configuration of the information processing apparatus 10 and the server apparatus 20.
  • the information processing apparatus 10 includes a bus 102, a processor 104, a memory 106, a storage device 108, an input / output interface 110, and a network interface 112.
  • the bus 102 is a data transmission path through which the processor 104, the memory 106, the storage device 108, the input / output interface 110, and the network interface 112 transmit / receive data to / from each other.
  • The method of connecting the processor 104 and the other components to each other is not limited to bus connection.
  • the processor 104 is an arithmetic processing device realized using a microprocessor or the like.
  • the memory 106 is a memory realized using a RAM (Random Access Memory) or the like.
  • the storage device 108 is a storage device realized using a ROM (Read Only Memory), a flash memory, or the like.
  • the input / output interface 110 is an interface for connecting the information processing apparatus 10 to peripheral devices.
  • the input / output interface 110 is connected to a GPS module 1101 for acquiring information indicating the current position of the vehicle.
  • the information processing apparatus 10 can also acquire the position information of the surrounding base station via the network interface 112 and estimate the current position of the vehicle using the position information of the surrounding base station.
  • the GPS module 1101 may not be connected to the input / output interface 110.
  • the input / output interface 110 may be further connected to various input devices that accept input operations from a user, a display device, a touch panel in which they are integrated, and the like.
  • the network interface 112 is an interface for connecting the information processing apparatus 10 to a communication network.
  • the information processing apparatus 10 may have a plurality of network interfaces 112.
  • For example, the information processing apparatus 10 may include a network interface 112 for connecting to a CAN communication network, a network interface 112 for connecting to a WAN (Wide Area Network) communication network, and a network interface 112 supporting a short-range wireless communication standard (for example, Bluetooth (registered trademark)).
  • the information processing apparatus 10 can communicate with the external server apparatus 20 via the WAN communication network and output output information to the server apparatus 20.
  • the information processing apparatus 10 can communicate with the sound collection device 30 by short-range wireless and acquire sound information generated by the sound collection device 30.
  • the information processing apparatus 10 can also acquire information indicating the operation of the vehicle (for example, the moving speed of the vehicle) via the CAN communication network.
  • the storage device 108 stores a program module for realizing each functional component of the information processing apparatus 10.
  • the processor 104 reads out the program module to the memory 106 and executes it, thereby realizing the function of each functional component of the information processing apparatus 10.
  • the server device 20 includes a bus 202, a processor 204, a memory 206, a storage device 208, an input / output interface 210, and a network interface 212.
  • the bus 202 is a data transmission path through which the processor 204, the memory 206, the storage device 208, the input / output interface 210, and the network interface 212 exchange data with each other.
  • The method of connecting the processor 204 and the other components to each other is not limited to bus connection.
  • the processor 204 is an arithmetic processing device realized using a microprocessor or the like.
  • the memory 206 is a memory realized using a RAM (Random Access Memory) or the like.
  • the storage device 208 is a storage device realized using a ROM (Read Only Memory), a flash memory, or the like.
  • the input / output interface 210 is an interface for connecting the server device 20 to peripheral devices.
  • the input / output interface 210 is connected to an input device such as a keyboard and a mouse, a display device such as an LCD (Liquid Crystal Display), a touch panel integrated with them.
  • the input device and the display device may be connected via a network interface 212 over a network.
  • the network interface 212 is an interface for connecting the server device 20 to a communication network.
  • the server device 20 includes a network interface 212 for connecting to a WAN (Wide Area Network) communication network.
  • the server device 20 can communicate with the information processing device 10 mounted on the vehicle via the WAN communication network and acquire output information from the information processing device 10 (input information for the server device 20).
  • the storage device 208 stores a program module for realizing each functional component of the server device 20.
  • the processor 204 reads out the program module to the memory 206 and executes it, thereby realizing the function of each functional component of the server device 20.
  • FIG. 7 is a flowchart illustrating the flow of processing executed by the information processing apparatus 10 according to the first embodiment.
  • the acquisition unit 12 communicates with the sound collection device 30 wirelessly or by wire to obtain sound information generated by the sound collection device 30 (S102).
  • The acquisition unit 12 may actively acquire the sound information from the sound collection device 30, for example by notifying the sound collection device 30 of a transmission request, or may passively acquire the sound information by waiting for the sound collection device 30 to transmit it.
  • Next, the acquisition unit 12 acquires the position information of the vehicle at the time the sound information was generated, based on GPS information from the GPS module 1101 or the position information of surrounding base stations (S104).
  • the acquisition unit 12 can acquire position information when sound information is generated by, for example, the method described with reference to FIG.
  • the output unit 14 associates the sound information acquired in S102 with the vehicle position information acquired in S104, and generates output information (S106). Then, the output unit 14 outputs the output information to the server device 20 (S108).
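  • The steps S102-S108 can be pictured as straight-line code; the following Python sketch assumes hypothetical interfaces `sound_device.read()`, `gps.fix()`, and `server.send()`:

```python
import json
import time

def collect_and_output(sound_device, gps, server):
    sound_info = sound_device.read()       # S102: acquire sound information
    position = gps.fix()                   # S104: acquire position information
    output_info = {                        # S106: associate the two
        "sound": sound_info,
        "position": position,
        "generated_at": time.time(),       # generation time (see below)
    }
    server.send(json.dumps(output_info))   # S108: output to the server device 20
```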
  • FIG. 8 is a flowchart illustrating another example of processing executed by the information processing apparatus 10. The flowchart in FIG. 8 is executed following step S106 in the flowchart in FIG. 7.
  • The output unit 14 temporarily stores the output information in a predetermined storage unit (for example, the storage device 108 of the information processing apparatus 10) (S110). Then, the output unit 14 determines whether a transmission condition for the output information accumulated in the predetermined storage unit is satisfied (S112).
  • The transmission condition is satisfied, for example, when the number of pieces of output information accumulated in the storage device reaches a predetermined number, or when the current time reaches a scheduled transmission time.
  • When the transmission condition is not satisfied (S112: NO), the output unit 14 does not execute the processing described below. When the transmission condition is satisfied later, the output unit 14 performs that processing.
  • When the transmission condition is satisfied (S112: YES), the output unit 14 reads the output information stored in the predetermined storage unit and transmits it to the server device 20 (S114). Then, the output unit 14 deletes the output information transmitted in S114 from the predetermined storage unit (S116). Specifically, when the output unit 14 receives a confirmation signal indicating that the output information has been normally received from the server device 20, the output unit 14 deletes the transmitted output information from the predetermined storage unit. The output unit 14 may delete the output information record itself, or may logically delete the output information by adding a deletion flag to it. Alternatively, the output unit 14 may add a flag to output information that has already been transmitted, and delete the flagged information in a periodically executed batch process. The output unit 14 may also be configured to retransmit the output information when it receives a signal indicating a reception error from the server device 20.
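  • A sketch of this store-and-forward behavior (S110-S116), assuming a count-based transmission condition and a hypothetical `server.send_batch()` that returns a confirmation signal:

```python
def flush_if_ready(buffer, server, max_items=100):
    # S110: output information has already been appended to `buffer`.
    if len(buffer) < max_items:            # S112: transmission condition
        return                             # not satisfied; keep accumulating
    batch = list(buffer)
    ack = server.send_batch(batch)         # S114: transmit accumulated info
    if ack == "OK":                        # confirmation signal received
        buffer.clear()                     # S116: delete transmitted info
    # on a reception error the buffer is kept, so the output information
    # is retransmitted on a later call
```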
  • the acquisition unit 12 may further acquire information indicating the generation time of sound information.
  • the acquisition unit 12 can check the time managed in the information processing apparatus 10 in accordance with the timing at which the sound information is acquired, and can acquire the time at that time as the “sound information generation time”.
  • Alternatively, the sound collection device 30 may acquire a time managed within the sound collection device 30 in accordance with the timing at which the sound information is generated or transmitted, and transmit that time as the "sound information generation time" in association with the sound information.
  • the output unit 14 may further associate the generation time of the sound information with the output information and output it to the server device 20 or the storage device 108. In this way, a collection of sound information can be classified based on the generation time, and a more detailed analysis is possible.
  • the acquisition unit 12 may further acquire information (operation information) indicating the operation of the vehicle when sound information is generated via the CAN communication network.
  • As an example of the vehicle operation information acquired by the acquisition unit 12, the moving speed of the vehicle can be cited.
  • the output unit 14 may further associate the vehicle operation information with the output information and output it to the server device 20 or the storage device 108.
  • In this way, information that can be analyzed in more detail can be generated. For example, when wind noise is included in the sound information, it can be determined from the moving speed of the vehicle whether the wind noise was generated by the movement of the vehicle, and the strength of the wind at that location can be estimated.
  • the acquisition unit 12 may further acquire information (weather information) indicating the weather when the sound information is generated.
  • the acquisition unit 12 can access the Web server that distributes the weather information via the network interface 112 and acquire the weather information corresponding to the position indicated by the position information in S104.
  • the acquisition unit 12 may estimate the weather at the current position of the vehicle using sensing data obtained from various sensors (such as a raindrop sensor, an illuminance sensor, and an image sensor) mounted on the vehicle.
  • The acquisition unit 12 may also estimate the weather at the current position of the vehicle using vehicle control signals that can be acquired via the CAN communication network. For example, when a control signal for operating the wipers is acquired, the acquisition unit 12 can generate information indicating that the weather is rainy.
  • the output unit 14 may further associate the weather information with the output information and output it to the server device 20 or the storage device 108. In this way, the collection of sound information can be classified based on the weather information, and a more detailed analysis is possible.
  • The acquisition unit 12 may further acquire state information indicating the state of the driver when the sound information was generated, that is, how the driver reacted to the external sound or to the sound source of the external sound.
  • For example, the acquisition unit 12 acquires an image of the driver from an in-vehicle camera mounted on the vehicle, and generates the state information by analyzing the driver's state (facial expression, behavior, etc.) based on the acquired image.
  • Alternatively, the state information may be generated by analyzing the driver's state (change in heart rate, etc.) based on the output of a biosensor worn by or in contact with the driver.
  • the acquisition unit 12 may acquire the driver's image and the output signal of the biosensor described above over, for example, several seconds to several tens of seconds immediately before, immediately after, or both of the timing when the sound information is generated. Note that past driver images and biosensor output signals are stored in, for example, the storage device 108.
  • the acquisition unit 12 can read the driver image immediately before the timing at which the sound information is generated or the output signal of the biosensor from the storage device 108 or the like.
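  • One way to keep such a look-back window is a ring buffer of timestamped samples; the class below is an illustrative sketch (sample rates and window lengths are assumptions, not the patent's values):

```python
from collections import deque

class SensorRing:
    # Holds the last few tens of seconds of timestamped sensor samples
    # (driver images, heart-rate readings, ...) in the storage device,
    # so the window around a sound event can be read back afterwards.
    def __init__(self, max_samples=600):   # e.g. 60 s of samples at 10 Hz
        self.samples = deque(maxlen=max_samples)

    def append(self, t, value):
        self.samples.append((t, value))

    def window(self, event_time, before=10.0, after=10.0):
        # Samples from `before` seconds before to `after` seconds after
        # the timing at which the sound information was generated.
        return [v for t, v in self.samples
                if event_time - before <= t <= event_time + after]
```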
  • The output unit 14 may further associate the state information generated in this way with the output information and output the state information to the server device 20 or the storage device 108. For example, suppose that a bicycle jumps out from the driver's blind spot at a crossroad or the like and brakes suddenly.
  • the sound collecting device 30 collects the sound of the sudden braking of the bicycle, and the acquisition unit 12 obtains sound information relating to the sudden braking of the bicycle generated by the sound collecting device 30.
  • the in-vehicle camera captures the driver's face image, and the biometric sensor acquires the driver's heart rate.
  • the facial image at this time has a surprised expression, and the heart rate suddenly increases.
  • the acquisition unit 12 acquires the image and the sensing result of the biological sensor from the in-vehicle camera and the biological sensor, and generates state information indicating that the driver is surprised. Also, it is assumed that a bicycle running alongside the vehicle applied a brake to wait for a signal in front of an intersection or the like.
  • In this case, the sound collecting device 30 collects the sound of the bicycle brake, and the acquisition unit 12 obtains the sound information relating to the bicycle brake generated by the sound collecting device 30.
  • the in-vehicle camera captures the driver's face image, and the biometric sensor acquires the driver's heart rate.
  • the facial image at this time has little change in facial expression before and after the bicycle brakes, and there is little change in heart rate.
  • the acquisition unit 12 acquires such an image and the sensing result of the biological sensor from the in-vehicle camera and the biological sensor, and generates state information indicating that there is little change in the driver's state even when the brake sound is heard.
  • In the above examples, both the in-vehicle camera and the biosensor are used, but only one of them may be used, and the in-vehicle camera may capture the driver's behavior instead of the driver's facial expression.
  • Using sensors such as an electroencephalogram sensor, a vibration sensor, or an in-vehicle microphone instead of the in-vehicle camera and the biosensor, the driver's brain waves, the driver's behavior, and the like may be acquired. The above sensors may also be combined as appropriate.
  • When the driver's state is associated with the sound information and output to the server device 20 or the storage device 108 in this way, more detailed analysis is possible.
  • FIG. 9 is a flowchart illustrating the flow of processing executed by the server device 20 according to the first embodiment.
  • the receiving unit 22 acquires output information (sound information and vehicle position information) output from the information processing apparatus 10 as input information (S202).
  • the receiving unit 22 stores the acquired input information, for example, in a table (such as a table for storing input information as it is) different from the table shown in FIG. 5 (S204). This other table is prepared in advance on the storage device 208, for example.
  • the associating unit 24 groups the sound information stored in the other table based on the vehicle position information, and totals the number of acquired sound information for each group (S206). For example, the associating unit 24 can group the sound information for each position indicated by the position information of the vehicle associated with the sound information or for each area including the position.
  • the associating unit 24 selects one of the groups (S208), and determines whether or not the number of acquired sound information for each group is equal to or greater than a predetermined threshold (reference) (S210).
  • the predetermined threshold is defined on the program module of the associating unit 24 as, for example, a predetermined value corresponding to the reliability to be secured in later analysis. Specifically, the larger the threshold value serving as a reference, the more sound information is required to associate with the map information. As a result, since the number of information that can be used for analysis increases, the reliability of the analysis result can be improved.
  • When the number of acquired sound information is less than the predetermined threshold (S210: NO), the associating unit 24 determines whether there is a group that has not yet been selected (S214). If all groups have been selected (S214: NO), the server device 20 ends the process. On the other hand, when there is a group that has not been selected yet (S214: YES), the associating unit 24 selects that group (S208) and again determines whether the number of acquired sound information of the group is equal to or greater than the predetermined threshold (S210).
  • On the other hand, when the number of acquired sound information is equal to or greater than the predetermined threshold (S210: YES), the associating unit 24 associates the sound information belonging to that group with the map information using the vehicle position information (S212).
  • the associating unit 24 can specify the position on the map to which the sound information should be associated using the vehicle position information associated with each sound information.
  • For example, the associating unit 24 can identify the row (record) with which the sound information of each group should be associated, with reference to the "position on the map" (map position information) column of a table as shown in FIG. 5. Then, the associating unit 24 can associate each piece of sound information with the map information by storing the sound information of the group in the "sound information" column of the identified row (record).
  • After that, the associating unit 24 determines whether there is a group that has not yet been selected (S214). If all groups have been selected (S214: NO), the server device 20 ends the process. On the other hand, when there is a group that has not been selected yet (S214: YES), the associating unit 24 selects that group (S208) and again determines whether the number of acquired sound information of the group is equal to or greater than the predetermined threshold (S210).
  • When the output information further includes information such as the generation time of the sound information, the vehicle operation information, or the weather information described above, the associating unit 24 may store that information together with the sound information. In this way, more detailed information can be obtained from the collection of sound information associated with the map information.
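  • A compact sketch of S206-S214, grouping staged records by a rounded position cell and associating only groups that reach the threshold (the cell size and threshold are illustrative assumptions; `store_sound_info()` refers to the earlier sketch):

```python
from collections import defaultdict

def associate_groups(staging_records, map_table, threshold=10):
    groups = defaultdict(list)
    for rec in staging_records:            # S206: group by position and count
        lat, lon = rec["position"]
        cell = (round(lat, 3), round(lon, 3))   # ~100 m cell, illustrative
        groups[cell].append(rec)
    for cell, recs in groups.items():      # S208: select each group in turn
        if len(recs) < threshold:          # S210: below the reference
            continue                       # leave unassociated for now
        for rec in recs:                   # S212: associate with map info
            store_sound_info(map_table, rec)
```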
  • FIG. 10 is a block diagram conceptually showing the functional configuration of the information processing apparatus 10 in the second embodiment. As illustrated in FIG. 10, the information processing apparatus 10 according to the present embodiment further includes a sound analysis unit 16.
  • the sound analysis unit 16 analyzes the sound information acquired by the acquisition unit 12 and generates sound analysis information.
  • The sound analysis unit 16 can generate, for example, sound analysis information indicating the volume of the sound information based on the sound pressure level of the sound information (an electric signal). Further, the sound analysis unit 16 may decompose the sound information into frequency components using frequency analysis software or the like, and generate sound analysis information indicating the volume for each frequency based on the sound pressure level of each frequency component.
  • the sound analysis unit 16 may generate sound analysis information indicating a sound source using a known sound source separation algorithm and sound source identification algorithm.
  • the sound analysis unit 16 uses these algorithms to generate, for example, a person's voice, a vehicle running sound (eg, engine sound, road noise), an emergency vehicle siren sound, a sudden brake sound, a bicycle chain sound, and the like. , It can be extracted from the sound information.
  • Further, when the sound source is a person, the sound analysis unit 16 can also generate sound analysis information indicating the age and sex of the person, using a dictionary database generated from analysis results (formant distribution, etc.) of voice samples for each age and sex.
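  • As a sketch of the frequency analysis described above (a minimal stand-in for the sound analysis unit 16, not its actual algorithm; the band edges are assumptions):

```python
import numpy as np

def band_levels(signal, sample_rate,
                bands=((0, 500), (500, 2000), (2000, 8000))):
    # Decompose the sound information into frequency components and
    # report a volume (dB) per band.
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    levels = {}
    for lo, hi in bands:
        power = spectrum[(freqs >= lo) & (freqs < hi)].sum()
        levels[(lo, hi)] = 10.0 * np.log10(power + 1e-12)  # avoid log(0)
    return levels
```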
  • the output unit 14 of the present embodiment generates output information in which the sound analysis information generated from the sound information by the analysis of the sound analysis unit 16 and the vehicle position information acquired by the acquisition unit 12 are linked. And output to a predetermined output destination (for example, the storage device 108 of the information processing apparatus 10 and the server apparatus 20).
  • the receiving unit 22 of the present embodiment receives output information including sound analysis information output from the output unit 14 of the information processing apparatus 10 as input information.
  • The sound analysis information includes, for example, volume analysis results (the total volume and the volume for each frequency), detection results of human voices (adult / child), detection results of bicycle chain sounds, detection results of emergency vehicle siren sounds, detection results of sudden brake sounds, detection results of collision sounds, and the like.
  • the association unit 24 of the present embodiment stores the sound analysis information included in the input information in association with the map information using the vehicle position information included in the input information.
  • the associating unit 24 can compare the position information of the vehicle included in the input information with the position information on the map, and associate the sound analysis information with the map information based on the comparison result.
  • the associating unit 24 can associate sound analysis information and map information using a table as shown in FIG.
  • FIG. 11 is a diagram showing an example of a table for associating sound analysis information with map information.
  • The table illustrated in FIG. 11 stores sound analysis information for each piece of map position information indicating a position on the map, stored in the "position on the map" column.
  • the map position information may be, for example, information indicating one position or information indicating an area defined by three or more position information.
  • the map position information may be a simple identifier. In this case, information indicating a specific position or area on the map is separately prepared in a state associated with the identifier.
  • For example, the associating unit 24 can specify the row (record) in which the sound analysis information included in the input information is to be stored by comparing the position information included in the input information received by the receiving unit 22 with the map position information stored in the "position on the map" column. Specifically, the associating unit 24 can specify the row (record) storing map position information that matches the position indicated by the position information included in the input information, or that indicates an area including that position, as the "row (record) in which the sound analysis information is to be stored". Then, the associating unit 24 stores the sound analysis information included in the input information received by the receiving unit 22 in the "sound analysis information" column of the identified row (record).
  • As described above, the associating unit 24 can associate the sound analysis information with the map information by storing the sound analysis information in association with the information indicating the position on the map. Further, when the sound information on which the sound analysis information is based is also acquired from the information processing apparatus 10, the associating unit 24 may store that sound information in association as well. In this case, a "sound information" column is further provided in the table of FIG. 11.
  • In the present embodiment, the information processing apparatus 10 has a hardware configuration as shown in FIG. 6.
  • the storage device 108 of this embodiment further stores a program module for realizing the sound analysis unit 16.
  • the processor 104 further realizes the function of the sound analysis unit 16 by reading the program module into the memory 106 and executing it.
  • In the present embodiment, the analysis of the sound information generated by the sound collection device 30 is executed by the information processing device 10 of each vehicle, and the sound analysis information is included in the output information. As a result, the load of the sound information analysis processing on the server device 20 can be reduced.
  • FIG. 12 is a block diagram conceptually showing the functional structure of the analysis server device 40 in the third embodiment.
  • the analysis server device 40 of this embodiment includes an acquisition unit 42 and an additional information generation unit 44.
  • the analysis server device 40 may be the server device 20 in each of the above-described embodiments, or may be another device provided separately from the server device 20.
  • the acquisition unit 42 acquires sound information generated using a sound collecting device mounted on the vehicle and position information associated with the sound information.
  • the position information associated with the sound information is, for example, “position information of the vehicle when the sound information is generated” or “map position information” described in the above embodiments.
  • the acquisition unit 42 can acquire sound information and vehicle position information from the information processing apparatus 10 described in the above embodiments.
  • When the information processing apparatus 10 generates sound analysis information as in the second embodiment described above, the acquisition unit 42 can acquire the sound analysis information and the position information of the vehicle. In this case, the additional information generation unit 44 does not have to analyze the sound information.
  • When the server device 20 or the like described in each of the above embodiments accumulates the sound information and the vehicle position information acquired from the information processing device 10 in a predetermined table, the acquisition unit 42 can acquire the sound information and the vehicle position information by referring to that table. Likewise, when sound analysis information is accumulated, the acquisition unit 42 can acquire the sound analysis information and the vehicle position information, and the additional information generation unit 44 does not have to analyze the sound information.
  • the additional information generation unit 44 refers to a table as shown in FIG. 5 or FIG. 11, for example, and acquires sound information or sound analysis information and map position information associated with these information. Can do.
  • Similarly, the additional information generation unit 44 does not have to analyze the sound information when referring to the table of FIG. 11.
  • The additional information generation unit 44 generates additional information using the analysis results of the sound information acquired by the acquisition unit 42.
  • For example, the additional information generation unit 44 can generate the additional information as follows, using sound analysis information that includes at least one of a volume analysis result of the sound information, a frequency band analysis result of the sound information, and a sound source analysis result of the sound information, each generated by analyzing the sound information. In the following, an example in which the additional information generation unit 44 itself analyzes the sound information will be described; however, the additional information generation unit 44 may instead be configured to acquire the various analysis results of the sound information from another processing unit.
  • the additional information generation unit 44 analyzes the sound information acquired by the acquisition unit 42 in the same manner as the sound analysis unit 16 of the second embodiment, and generates information (sound analysis information) indicating the analysis result.
  • For example, the additional information generation unit 44 generates sound analysis information including at least one of an analysis result of the volume of the sound information (the total volume or the volume by frequency) and an analysis result of the sound source of the sound information (for example, a person, a bicycle, an emergency vehicle, a sudden brake, and the like).
  • The additional information generation unit 44 takes statistics of the analysis results indicated by the sound analysis information, and generates additional information based on the statistics. For example, the additional information generation unit 44 can generate additional information indicating whether the volume level is high or low, based on the average, median, mode, or the like of the volume levels of the sound information, which can be calculated from the sound analysis information. In this case, the additional information generation unit 44 can determine whether the volume level is high or low by comparing these values with a predetermined threshold. The threshold is defined in, for example, the program module of the additional information generation unit 44.
  • Further, based on the detection frequency of each sound source included in the sound analysis information, the additional information generation unit 44 can generate additional information indicating characteristics such as many / few pedestrians and bicycles, frequent / infrequent passage of emergency vehicles, and fast / slow vehicle moving speed. For example, the additional information generation unit 44 can determine whether the detection frequency of each sound source is high or low by comparing the number of detections of each sound source with a threshold for that determination.
  • the threshold is defined by a program module of the additional information generation unit 44 or the like.
  • the threshold value may be defined as a different value for each sound source, or may be defined as a value common to all sound sources.
  • Further, the additional information generation unit 44 can generate additional information indicating whether the average moving speed of vehicles is fast or slow, based on the average, median, mode, or the like of the speeds that can be estimated from the frequency distribution of road noise and engine sound.
  • Further, the additional information generation unit 44 may (1) total the number of pieces of sound information for each group classified by the position information associated with the sound information, (2) select a group in which the number of pieces of sound information is equal to or greater than a reference threshold, and (3) generate the additional information using the analysis results of the sound information belonging to the selected group.
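  • A hedged sketch of steps (1)-(3), deriving map-level features from one group's analysis results (the field names and thresholds are illustrative assumptions):

```python
from collections import defaultdict
from statistics import median

def build_additional_info(sound_infos, volume_threshold=70.0, count_threshold=5):
    # sound_infos: analysis results for one group, e.g.
    # {"volume_db": 72.3, "sources": ["person", "bicycle"]}
    volumes = [s["volume_db"] for s in sound_infos]
    detections = defaultdict(int)
    for s in sound_infos:
        for source in s.get("sources", []):
            detections[source] += 1
    return {
        # high / low volume level decided against a threshold
        "volume": "high" if median(volumes) >= volume_threshold else "low",
        # sound sources detected frequently enough to report as a feature
        "frequent_sources": sorted(src for src, n in detections.items()
                                   if n >= count_threshold),
    }
```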
  • the additional information generation unit 44 associates the additional information with the position on the map information specified by using the position information acquired by the acquisition unit 42.
  • For example, the additional information generation unit 44 can associate the additional information with the map information using a table as shown in FIG. 13.
  • FIG. 13 is a diagram illustrating an example of a table for associating additional information with map information.
  • the table illustrated in FIG. 13 stores additional information for each map position information.
  • the map position information may be, for example, information indicating one position or information indicating an area defined by three or more position information.
  • the map position information may be a simple identifier. In this case, information indicating a specific position or area on the map is separately prepared in a state associated with the identifier.
  • For example, the additional information generation unit 44 can specify the row (record) in which the generated additional information is to be stored by comparing the position information acquired by the acquisition unit 42 with the map position information stored in the "position on the map" column. Specifically, the additional information generation unit 44 can specify the row (record) storing map position information that matches the position indicated by the position information acquired by the acquisition unit 42, or that indicates an area including that position, as the "row (record) in which the additional information is to be stored". Then, the additional information generation unit 44 stores the additional information in the "additional information" column of the identified row (record). As described above, the additional information generation unit 44 can associate the additional information with the map information by storing the additional information in association with the information indicating the position on the map.
  • sound information generated by the sound collection device 30 mounted on each vehicle is acquired, and additional information to be added to the map information is generated using the analysis result of the sound information.
  • This additional information is, for example, information indicating characteristics such as high / low volume level, many / few pedestrians and bicycles, frequent / infrequent passage of emergency vehicles, and fast / slow vehicle moving speed.
  • a “position on the map” is specified based on the position information associated with the sound information, and additional information is associated with the position on the map.
  • map information useful in various aspects can be created.
  • Each functional component of the analysis server device 40 may be realized by hardware that implements it (for example, a hard-wired electronic circuit), or by a combination of hardware and software (for example, a combination of an electronic circuit and a program for controlling the electronic circuit).
  • FIG. 14 is a diagram illustrating a hardware configuration of the analysis server device 40.
  • the analysis server device 40 includes a bus 402, a processor 404, a memory 406, a storage device 408, an input / output interface 410, and a network interface 412.
  • the bus 402 is a data transmission path through which the processor 404, the memory 406, the storage device 408, the input / output interface 410, and the network interface 412 transmit / receive data to / from each other.
  • The method of connecting the processor 404 and the other components to each other is not limited to bus connection.
  • the processor 404 is an arithmetic processing unit realized using a microprocessor or the like.
  • the memory 406 is a memory realized using a RAM (Random Access Memory) or the like.
  • the storage device 408 is a storage device realized by using a ROM (Read Only Memory), a flash memory, or the like.
  • the input / output interface 410 is an interface for connecting the analysis server device 40 to peripheral devices.
  • the input / output interface 410 is connected to an input device such as a keyboard and a mouse, a display device such as an LCD (Liquid Crystal Display), a touch panel integrated with them.
  • the input device and the display device may be connected via a network interface 412 over a network.
  • the network interface 412 is an interface for connecting the analysis server device 40 to a communication network.
  • the analysis server device 40 has a network interface 412 for connecting to a WAN (Wide Area Network) communication network.
  • the storage device 408 stores a program module for realizing each functional component of the analysis server device 40.
  • the processor 404 reads out the program module to the memory 406 and executes it, thereby realizing the functions of the functional components of the analysis server device 40.
  • FIG. 15 is a flowchart illustrating the flow of processing executed by the analysis server device 40 according to the third embodiment.
  • In the following, it is assumed that the analysis server device 40 generates additional information with reference to a table as illustrated in FIG. 5.
  • First, the acquisition unit 42 acquires sound information and the position information (map position information) associated with the sound information (S302). Specifically, the acquisition unit 42 reads the sound information stored in each record and the map position information associated with the sound information, with reference to a table as shown in FIG. 5.
  • the additional information generation unit 44 groups the sound information based on the position information acquired in S302, and totals the number of pieces of sound information in each group (S304). In the table of FIG. 5, the sound information is grouped by map position information, and the additional information generation unit 44 counts the number of pieces of sound information for each group based on that map position information (see the sketch below).
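  • As a rough illustration of S304, the grouping and counting step could look like the following Python sketch; the record layout and the field names "map_position" and "sound_info" are assumptions for illustration, not part of the disclosure.

        # Sketch of S304: group sound information by map position, then count.
        from collections import defaultdict

        def group_by_position(records):
            """Group sound-information records by their map position information."""
            groups = defaultdict(list)
            for record in records:
                groups[record["map_position"]].append(record["sound_info"])
            return groups

        def count_per_group(groups):
            """Total the number of pieces of sound information in each group."""
            return {position: len(sounds) for position, sounds in groups.items()}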
  • the additional information generation unit 44 selects one group (S306), and determines whether or not the number of pieces of sound information counted for that group is equal to or greater than a predetermined threshold (reference value) (S308).
  • the predetermined threshold is defined, for example, in the program module of the additional information generation unit 44 as a value corresponding to the reliability required of the additional information generated from the analysis results of the sound information. Specifically, the larger the threshold, the more pieces of sound information are used when generating the additional information; using the analysis results of more pieces of sound information improves the reliability of the additional information.
  • when the counted number for the selected group is less than the predetermined threshold (S308: NO), the additional information generation unit 44 skips the analysis of that group and proceeds to determine whether any group has not yet been selected (S316), as described below.
  • when the counted number is equal to or greater than the threshold (S308: YES), the additional information generation unit 44 analyzes each piece of sound information belonging to the group and generates sound analysis information indicating detection results such as a person, a bicycle, an emergency vehicle, or a sudden brake sound (S310). Further, based on statistics of the analysis results indicated by the sound analysis information, the additional information generation unit 44 generates additional information (for example, many/few pedestrians and bicycles, frequent/infrequent passage of emergency vehicles, fast/slow vehicle moving speed) (S312).
  • the additional information generation unit 44 associates the additional information with the map information by using the position information associated with the sound information belonging to the selected group (S314). For example, the additional information generation unit 44 can associate the additional information with a position on the map by linking the map position information read in S302 to the additional information generated in S312 and generating a table as shown in FIG.
  • after S314 (or after skipping a group), the additional information generation unit 44 determines whether there is a group that has not yet been selected (S316). If all the groups have been selected (S316: NO), the analysis server device 40 ends the process. On the other hand, if there is a group that has not yet been selected (S316: YES), the additional information generation unit 44 selects that group (S306) and again determines whether the number of pieces of sound information acquired for the group is equal to or greater than the predetermined threshold (S308). The overall per-group loop is sketched below.
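  • Putting S306 through S316 together, the per-group processing might look like the following sketch. The classify_sound() helper, the threshold value, and the pedestrian-ratio rule are hypothetical stand-ins for whatever sound analysis and statistics an implementation actually uses.

        # Sketch of the per-group loop (S306-S316); helper names are assumed.
        THRESHOLD = 30  # reliability threshold (assumed value)

        def generate_additional_info(groups, classify_sound):
            additional_info = {}
            for position, sounds in groups.items():        # S306 (loop via S316)
                if len(sounds) < THRESHOLD:                # S308: NO
                    continue                               # skip low-sample groups
                detections = [classify_sound(s) for s in sounds]   # S310
                # S312: derive a coarse label from statistics of the detections.
                ratio = detections.count("pedestrian") / len(detections)
                label = "many pedestrians" if ratio > 0.5 else "few pedestrians"
                additional_info[position] = label          # S314: tie to map position
            return additional_info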
  • the additional information generation unit 44 can also generate more detailed additional information by using other pieces of information associated with the sound information.
  • For example, by further using time information and weather information, the additional information generation unit 44 can generate additional information indicating characteristics for each time zone or each weather condition.
  • Also, by further using the operation information, the additional information generation unit 44 can include in the additional information the wind strength that can be estimated from that information.
  • Driver state information can also be reflected in the additional information, as in the following example.
  • Suppose that state information indicating that the driver was surprised is output from the output unit 14 in association with sound information indicating the sudden braking of a bicycle.
  • In this case, the additional information generation unit generates additional information indicating that a driver who hears sudden braking at this position is surprised, and adds the additional information to the map information.
  • Conversely, suppose that state information indicating no change in the driver's state is output from the output unit 14 in association with sound information indicating the brake sound of a bicycle.
  • In this case, the additional information generation unit generates additional information indicating that the driver's state changes little even when the brake sound of a bicycle is heard at this position, and adds the additional information to the map information. Since a map generated in this way is used by other vehicles when driving, it is possible to predict in advance how a driver will react to an external sound arising at the destination, and to perform vehicle control based on that prediction (a rough sketch follows).
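  • The following sketch illustrates how a vehicle-side system might consult such a map; the keying by (position, sound type) and the control actions are assumptions for illustration only.

        # Sketch: look up the recorded driver reaction and pick a control action.
        def expected_driver_reaction(map_additional_info, position, sound_type):
            """Return e.g. "surprised" or "little change" if recorded, else "unknown"."""
            return map_additional_info.get((position, sound_type), "unknown")

        def plan_control(reaction):
            # If past drivers were surprised by this sound here, a driving support
            # system could, for instance, pre-emptively reduce speed.
            return "reduce_speed" if reaction == "surprised" else "maintain"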
  • the analysis server device 40 of this embodiment is the same as that of the third embodiment except for the following points.
  • FIG. 16 is a block diagram conceptually showing the functional structure of the analysis server device 40 in the fourth embodiment. As shown in FIG. 16, the analysis server device 40 of the present embodiment further includes a display output unit 46 in addition to the configuration of the third embodiment.
  • the display output unit 46 acquires range information that specifies the display range of the map information, and reads additional information associated with the range indicated by the range information.
  • The display output unit 46 can acquire the range information that specifies the display range of the map information from, for example, an input device connected to the analysis server device 40, or from another device connected to the analysis server device 40 via the network interface 412 (for example, a navigation device mounted on a vehicle, or a user's PC (Personal Computer)).
  • the display output unit 46 generates and outputs drawing data to be displayed overlaid on the map information based on the read additional information.
  • the output destination of the drawing data is a display device connected to the analysis server device 40 or another device connected via the network interface 412.
  • the drawing data is used to visualize additional information.
  • the drawing data is data for drawing a volume level distribution map, a sound source distribution map, a vehicle average moving speed distribution map, and the like.
  • the display output unit 46 generates drawing data for displaying information as exemplified in FIGS. 17 to 22, for example.
  • FIGS. 17 to 22 are diagrams illustrating an example of the drawing data generated by the display output unit 46.
  • FIG. 17 shows an example in which the display output unit 46 generates and outputs drawing data indicating the volume level distribution in the designated display range.
  • Specifically, the display output unit 46 reads the additional information stored for each position on the map with reference to, for example, a table as shown in FIG. Based on the additional information for each position (e.g., the volume level is high/low), the display output unit 46 specifies the volume level for each position, and connects the per-position volume levels to generate drawing data indicating the distribution of volume levels. A screen such as that shown in FIG. 17 can then be displayed using the drawing data generated by the display output unit 46. With such a screen, a person browsing it can easily grasp the volume level (that is, the noise level) at each place. A minimal sketch of this step follows.
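  • The drawing data for the volume level distribution could be assembled roughly as follows; the (x, y) keying, the color mapping, and the "volume_level" field are assumptions for illustration.

        # Sketch: turn per-position volume labels into colored overlay cells.
        def volume_distribution_layer(additional_info):
            color_for = {"high": "#d62728", "medium": "#ff7f0e", "low": "#2ca02c"}
            layer = []
            for (x, y), info in additional_info.items():
                level = info.get("volume_level", "low")
                layer.append({"x": x, "y": y, "color": color_for.get(level, "#cccccc")})
            return layer  # rendered as an overlay on the displayed map range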
  • the display output unit 46 may further acquire information designating a frequency band via an input field such as that indicated by reference numeral 170 in FIG. 17, and may generate and output drawing data indicating the volume level distribution in the designated frequency band. As a result, it is possible to easily grasp, for each location, sounds of which frequency band are generated and at what volume.
  • The volume in each frequency band is useful information for understanding the characteristics of a place. For example, a place with a high volume in the low range can be inferred to be a place where low-frequency or very-low-frequency sounds, which can cause discomfort or harm to health, are emitted fairly often. Likewise, a place with a high volume in the high range can be estimated to be a place where high-frequency sounds, such as metallic construction sounds or bicycle brake sounds, frequently occur.
  • In addition to the volume level distribution, the display output unit 46 can also output drawing data indicating the distribution of the frequency bands of the sound information in the range indicated by the range information.
  • For example, the additional information generation unit 44 can generate additional information indicating a statistical value of the frequency (for example, a mean value or a median value) from the analysis results of the frequencies of the sound information collected at each point.
  • The display output unit 46 can then generate drawing data indicating the frequency band distribution of the sound information based on the additional information generated in this way. In this case, for example, an item "frequency distribution" is added to the input field for selecting the display information of FIG.
  • The frequency band distribution displayed in this way is also useful information for estimating the characteristics of a place; a sketch of computing such a per-position frequency statistic follows.
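  • The dominant_frequency() helper below, which would extract the spectral peak of one piece of sound information, is assumed; only the aggregation step is shown.

        # Sketch: mean and median dominant frequency for each map position.
        import statistics

        def frequency_statistics(groups, dominant_frequency):
            stats = {}
            for position, sounds in groups.items():
                freqs = [dominant_frequency(s) for s in sounds]
                stats[position] = {
                    "mean_hz": statistics.mean(freqs),
                    "median_hz": statistics.median(freqs),
                }
            return stats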
  • the display output unit 46 generates and outputs drawing data indicating the distribution of the detection frequency of the sound source in the designated display range.
  • Specifically, the display output unit 46 reads the additional information stored for each position on the map with reference to, for example, a table as shown in FIG. Then, based on the additional information for each position (e.g., many/few pedestrians, many/few bicycles, a high/low frequency of emergency vehicle passage, a high/low frequency of sudden braking), the display output unit 46 determines for each position whether a sound source with a high detection frequency exists, and generates drawing data indicating the distribution of sound source detection frequencies.
  • a screen as shown in FIG. 18 is displayed by the drawing data generated by the display output unit 46.
  • This screen makes it easy for a person viewing it to find places that require attention, such as places with many pedestrians and bicycles, places where emergency vehicles frequently pass, and places where sudden braking frequently occurs.
  • The display output unit 46 may further acquire, for example, information designating the type of sound source via an input field such as that indicated by reference numeral 180 in FIG. 18, and may generate and output drawing data indicating the distribution of the detection frequency of the designated type of sound source.
  • The input field is not limited to the example of FIG. 18, and may be configured so that a plurality of types can be specified, for example with check boxes. By narrowing down the information displayed on the screen in this way, it becomes easier for a person browsing the screen to find the desired information (a filtering sketch follows).
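  • Filtering the displayed sound-source markers by the types chosen in such an input field could look like this sketch; the "detections" field and the type names are illustrative assumptions.

        # Sketch: keep only markers whose sound-source type the user selected.
        def source_distribution_layer(additional_info, selected_types):
            layer = []
            for (x, y), info in additional_info.items():
                for source_type, frequency in info.get("detections", {}).items():
                    if source_type in selected_types and frequency == "high":
                        layer.append({"x": x, "y": y, "type": source_type})
            return layer

        # e.g. source_distribution_layer(info, {"pedestrian", "emergency_vehicle"})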
  • In addition, drawing data indicating whether a sound source affects the driver can be generated and output.
  • Specifically, the display output unit 46 reads, for example, the additional information indicating the driver's state stored for each position on the map, and generates drawing data based on it.
  • In FIG. 19, an icon 191 and an icon 192 are shown as examples of drawing data indicating whether the sound source affects the driver.
  • the icon 191 is an icon indicating that the sound source does not affect the driver or that the sound source has little effect on the driver.
  • the icon 192 is an icon indicating that the sound source may affect the driver.
  • the “drawing data indicating whether the sound source affects the driver” displayed by the display output unit 46 is not limited to the example of FIG. 19.
  • the display output unit 46 may display icons 201 and 202 that indicate changes in the facial expression of a person.
  • the display output unit 46 may display icons 211 and 212 indicating changes in heart rate.
  • The degree of influence on the driver may also be divided into three or more levels, with an icon displayed according to the level.
  • In this case, the display output unit 46 can specify the icon to be displayed based on the level indicated by the additional information for each position on the map, for example as in the sketch below.
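  • Selecting an icon from the stored influence level could look like the following sketch; the level values and icon identifiers (including a hypothetical third level) are assumptions.

        # Sketch: map the influence level stored per position to an icon id.
        ICON_BY_LEVEL = {
            0: "icon_191",         # no or little effect on the driver
            1: "icon_192",         # the sound source may affect the driver
            2: "icon_192_strong",  # hypothetical third level for a strong effect
        }

        def icon_for_position(additional_info, position):
            level = additional_info.get(position, {}).get("influence_level", 0)
            return ICON_BY_LEVEL.get(level, "icon_191")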
  • The screens illustrated in FIGS. 19 to 21 allow a viewer to easily understand whether a sound source arising at a position through which the viewer is about to travel affects the driver, and can thereby contribute to driving support.
  • the display output unit 46 generates and outputs drawing data indicating the distribution of the average speed of the vehicle in the designated display range.
  • Specifically, the display output unit 46 reads the additional information for each position on the map with reference to, for example, a table as shown in FIG. Then, based on the additional information stored for each position (e.g., the average vehicle speed is fast/slow), the display output unit 46 specifies the average vehicle speed for each position, and connects the per-position average speeds to generate drawing data indicating the distribution of the average vehicle speed (see the sketch below).
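  • The average-speed layer can be built in the same way as the volume layer above; the "avg_speed" field and the shading are again assumptions for illustration.

        # Sketch: shade each position by its recorded fast/slow average speed.
        def average_speed_layer(additional_info):
            shade_for = {"fast": "#1f77b4", "slow": "#d62728"}
            return [
                {"x": x, "y": y, "color": shade_for.get(info.get("avg_speed"), "#cccccc")}
                for (x, y), info in additional_info.items()
            ]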
  • Like that of the third embodiment, the analysis server device 40 of this embodiment has the hardware configuration shown in FIG. 14.
  • the storage device 408 of this embodiment further stores a program module for realizing the display output unit 46.
  • the processor 404 reads out the program module of the display output unit 46 to the memory 406 and executes it, thereby realizing the function of the display output unit 46.


Abstract

An information collection system (1) comprises an information processing device (10) and a server device (20). The information processing device (10) acquires sound information generated by a sound collection device (30) mounted in a mobile body, as well as information indicating the position of the mobile body at the time the sound information was generated. The server device (20) acquires, from the information processing device (10), the sound information and the position information of the mobile body at the time the sound information was generated, and stores them.
PCT/JP2017/043135 2016-11-30 2017-11-30 Information processing device, information collection method, and program WO2018101429A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-233488 2016-11-30
JP2016233488 2016-11-30

Publications (1)

Publication Number Publication Date
WO2018101429A1 true WO2018101429A1 (fr) 2018-06-07

Family

ID=62241693

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/043135 WO2018101429A1 (fr) Information processing device, information collection method, and program

Country Status (1)

Country Link
WO (1) WO2018101429A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009068890A (ja) * 2007-09-11 2009-04-02 Pioneer Electronic Corp Environmental sound pressure recording device, environmental sound pressure recording method, and environmental sound pressure recording program
JP2012073088A (ja) * 2010-09-28 2012-04-12 Sony Corp Position information providing device, position information providing method, position information providing system, and program
WO2013121464A1 (fr) * 2012-02-16 2013-08-22 三菱電機株式会社 Sound output device
US20150338227A1 (en) * 2012-11-22 2015-11-26 Freescale Semiconductor, Inc. Navigation system
JP2015191256A (ja) * 2014-03-27 2015-11-02 パイオニア株式会社 Danger level determination device, danger level determination method, and danger level determination program

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020046594A (ja) * 2018-09-21 2020-03-26 パイオニア株式会社 Data structure, storage medium, and storage device
JP2020046593A (ja) * 2018-09-21 2020-03-26 パイオニア株式会社 Data structure, storage medium, and storage device
JP2020140379A (ja) * 2019-02-27 2020-09-03 トヨタ自動車株式会社 Driving assistance system
US11409305B2 (en) 2019-02-27 2022-08-09 Toyota Jidosha Kabushiki Kaisha Driving assistance system
JP7120077B2 (ja) 2019-02-27 2022-08-17 トヨタ自動車株式会社 Driving assistance system
JP2020144404A (ja) * 2019-03-04 2020-09-10 トヨタ自動車株式会社 Driving assistance system
JP7133155B2 (ja) 2019-03-04 2022-09-08 トヨタ自動車株式会社 Driving assistance system

Similar Documents

Publication Publication Date Title
WO2018101429A1 (fr) Information processing device, information collection method, and program
US11308785B1 (en) Systems and methods for the mitigation of drowsy or sleepy driving
US20170129497A1 (en) System and method for assessing user attention while driving
KR20190115040A (ko) Driving behavior determination method, device, equipment, and storage medium
WO2015151594A1 (fr) Driving assistance system, method, and program
JPWO2018116862A1 (ja) Information processing device and method, and program
US20130131893A1 (en) Vehicle-use information collection system
WO2017160663A1 (fr) Traffic pollution mapping device
KR101744963B1 (ko) System and method for providing traffic information and vehicle information using black box video information
CN107645703B (zh) Walking safety monitoring method and device
JP5115542B2 (ja) Traffic information calculation device, traffic system, and computer program
US20210229674A1 (en) Driver profiling and identification
US10255803B2 (en) Vehicle image data transmission device
WO2018101430A1 (fr) Server device, analysis method, and program
KR102268134B1 (ko) Vehicle collision warning device and method using mobile data and infrastructure data
JP2024045531A (ja) Information processing device, server device, information processing method, and program
KR20200031286A (ko) Traffic guidance system and method
KR102101975B1 (ko) Real-time collision warning system and method
JP2018129585A (ja) Monitoring system and monitoring method
JP2005305003A (ja) Driving status confirmation device and driving status confirmation system
CN112740298B (zh) Information providing system, server, method, and computer-readable storage medium
JP4315073B2 (ja) Failure analysis system
JP2016191985A (ja) Road information database construction support system and driving support system using a database constructed by the road information database construction support system
CN112700138A (zh) Road traffic risk management method, device, and system
JP6265525B2 (ja) Driving-related information sharing system and driver terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17876827

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17876827

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP