WO2019198449A1 - Information providing system, mobile terminal, information providing device, information providing method, and computer program - Google Patents


Info

Publication number
WO2019198449A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
vehicle
mobile
evaluation value
pedestrian
Prior art date
Application number
PCT/JP2019/011739
Other languages
English (en)
Japanese (ja)
Inventor
良明 林
Original Assignee
住友電気工業株式会社 (Sumitomo Electric Industries, Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 住友電気工業株式会社 filed Critical 住友電気工業株式会社
Priority to JP2020513152A priority Critical patent/JPWO2019198449A1/ja
Publication of WO2019198449A1 publication Critical patent/WO2019198449A1/fr


Classifications

    • G – PHYSICS
    • G01 – MEASURING; TESTING
    • G01C – MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 – Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26 – Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G08 – SIGNALLING
    • G08G – TRAFFIC CONTROL SYSTEMS
    • G08G1/00 – Traffic control systems for road vehicles
    • G08G1/09 – Arrangements for giving variable traffic instructions
    • G08G1/0962 – Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968 – Systems involving transmission of navigation instructions to the vehicle
    • G08G1/123 – Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams
    • G08G1/133 – Traffic control systems for road vehicles indicating the position of vehicles within the vehicle; Indicators inside the vehicles or at stops
    • G08G1/16 – Anti-collision systems

Definitions

  • The present invention relates to an information providing system, a mobile terminal, an information providing apparatus, an information providing method, and a computer program.
  • This application claims priority based on Japanese Patent Application No. 2018-074446 filed on Apr. 10, 2018, and incorporates all the content described in the above Japanese application.
  • Patent Document 1 discloses a transportation system that informs the host vehicle of information on other vehicles.
  • An information providing system includes: mobile terminals mounted on at least some of one or more moving bodies located in a predetermined area; a calculation unit that obtains an evaluation value of a predicted traffic situation, which is a prediction result of the traffic situation of each mobile terminal, based on dynamic map information in which dynamic information about the one or more moving bodies is superimposed on map information of the area; a determination unit that determines, for each mobile terminal, whether to notify that terminal of its predicted traffic situation based on the evaluation value; and a notification unit that notifies the mobile terminal of the predicted traffic situation based on the determination result of the determination unit.
  • A mobile terminal receives the predicted traffic situation from the information providing system and outputs the predicted traffic situation to a user.
  • An information providing method for providing information to a mobile terminal includes: a calculation step of obtaining an evaluation value of a predicted traffic situation for each mobile terminal based on dynamic map information in which dynamic information about one or more moving bodies is superimposed on map information of the area where the moving bodies are located; a determination step of determining, for each mobile terminal, whether to notify that terminal of its predicted traffic situation based on the evaluation value; and a notification step of notifying the mobile terminal of the predicted traffic situation based on the determination result of the determination step.
  • A computer program causes a computer to execute an information providing process for providing information to a mobile terminal, the process including: a calculation step of obtaining an evaluation value of a predicted traffic situation for each mobile terminal mounted on at least some of one or more moving bodies, based on dynamic map information in which dynamic information about the moving bodies is superimposed on map information of the area where they are located; a determination step of determining, for each mobile terminal, whether to notify that terminal based on the evaluation value; and a notification step of notifying the mobile terminal of the predicted traffic situation based on the determination result of the determination step.
  • An information providing apparatus includes: a calculation unit that obtains an evaluation value of a predicted traffic situation, which is a prediction result of the traffic situation of each mobile terminal mounted on at least some of one or more moving bodies, based on dynamic map information in which dynamic information about the moving bodies is superimposed on map information of the area where they are located; and a determination unit that determines, for each mobile terminal, whether to notify that terminal of its predicted traffic situation based on the evaluation value.
  • FIG. 1 is a schematic diagram illustrating an overall configuration of a wireless communication system according to an embodiment.
  • FIG. 2 is a block diagram illustrating an example of an internal configuration of the edge server and the core server.
  • FIG. 3 is a block diagram illustrating an example of an internal configuration of a vehicle-mounted device in which a communication terminal is mounted.
  • FIG. 4 is a block diagram illustrating an example of the internal configuration of the pedestrian terminal.
  • FIG. 5 is a block diagram illustrating an example of an internal configuration of a roadside sensor equipped with a wireless communication device that is a communication terminal.
  • FIG. 6 is an overall configuration diagram of the information providing system according to the present embodiment.
  • FIG. 7 is a sequence diagram illustrating an example of dynamic information update processing and distribution processing.
  • FIG. 8 is a functional block diagram of the edge server showing functions for providing the predicted traffic situation.
  • FIG. 9 is a diagram illustrating an example of a mobile object database.
  • FIG. 10 is a flowchart illustrating an example of a calculation process of the evaluation value of the predicted traffic situation by the calculation unit.
  • FIG. 11 is a flowchart showing an example of the vehicle calculation process in step S55 in FIG.
  • FIG. 12 is a flowchart illustrating an example of a pedestrian situation determination process in FIG. 11.
  • FIG. 13 is a flowchart showing an example of the pedestrian calculation process in FIG.
  • FIG. 14 is a diagram illustrating an example of an evaluation value database.
  • FIG. 15 is a flowchart illustrating an example of determination processing by the determination unit.
  • FIG. 16 is a diagram illustrating a situation around an intersection according to scenario 1.
  • FIG. 17 is a diagram illustrating a situation around an intersection according to scenario 2.
  • FIG. 18 is a diagram illustrating a situation around an intersection according to scenario 3.
  • FIG. 19 is a diagram illustrating a situation around an intersection according to scenario 4.
  • FIG. 20 is a diagram illustrating a situation around an intersection according to scenario 5.
  • FIG. 21 is a diagram illustrating a situation around an intersection according to scenario 6.
  • FIG. 22 is a diagram illustrating an aspect of information provision executed by a system according to another embodiment.
  • In the conventional example above, the central device of the system determines the presence or absence of an abnormal event for each vehicle based on the vehicle information obtained from each vehicle and notifies each vehicle of the determination result.
  • That is, the conventional example is configured to notify the result of determining whether an abnormal event has occurred.
  • Such a system could also be used to predict the traffic situation of each moving body, for example the possibility of a collision between moving bodies.
  • However, the relationships between vehicles are diverse, and if information on the predicted traffic situation were provided to every vehicle without selection, the amount of information given to each vehicle would become enormous, which is undesirable from the viewpoint of the load placed on the system.
  • This disclosure has been made in view of such circumstances, and aims to provide a technology that can appropriately provide necessary information.
  • An information providing system according to the present disclosure includes: mobile terminals mounted on at least some of one or more moving bodies located in a predetermined area; a calculation unit that obtains an evaluation value of a predicted traffic situation, which is a prediction result of the traffic situation of each mobile terminal, based on dynamic map information in which dynamic information about the one or more moving bodies is superimposed on map information of the area; a determination unit that determines, for each mobile terminal, whether to notify that terminal of its predicted traffic situation based on the evaluation value; and a notification unit that notifies the mobile terminal of the predicted traffic situation based on the determination result of the determination unit.
  • The notification unit notifies the predicted traffic situation to those of the mobile terminals that the determination unit has determined should be notified. In this case, information can be provided only to mobile terminals that are determined, based on the evaluation value, to require information on the predicted traffic situation.
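  As a rough illustration of the flow described above, the sketch below selects which terminals to notify from their evaluation values; the score scale, the threshold, and all identifiers are assumptions introduced for illustration and are not specified in the disclosure.

```python
# Hypothetical sketch of the determination/notification steps: notify
# only the mobile terminals whose safety evaluation value (here, a
# 0-100 score where higher means safer) falls below a threshold.

def determine_notifications(evaluation_values, threshold=50):
    """Determination step: select terminal IDs whose evaluation value
    is below the notification threshold."""
    return [tid for tid, value in evaluation_values.items() if value < threshold]

def notify_all(evaluation_values, predicted_situations, threshold=50):
    """Notification step: send each selected terminal its own
    predicted traffic situation."""
    selected = determine_notifications(evaluation_values, threshold)
    return {tid: predicted_situations[tid] for tid in selected}
```

  With evaluation values of 30, 80, and 10 for three terminals, only the first and third would receive their predicted traffic situations.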
  • Preferably, the predicted traffic situation is the result of a collision prediction between a target moving body, that is, the moving body carrying the mobile terminal for which the calculation unit computes the evaluation value, and the other moving bodies among the one or more moving bodies. The calculation unit performs the collision prediction for each mobile terminal based on the dynamic map information and, based on the prediction result, obtains the evaluation value as a value evaluating the degree of safety of each mobile terminal. In this case, an evaluation value related to the degree of safety can be obtained based on the prediction of collisions between the moving body carrying the mobile terminal and other moving bodies.
  • Preferably, the calculation unit identifies, based on the prediction result, those of the other moving bodies that are predicted to collide with the target moving body, and obtains the evaluation value based on the prediction results for the moving bodies predicted to collide. In this case, an evaluation value related to the degree of safety can be obtained based on the collision prediction for the moving bodies predicted to collide.
  • Preferably, when one of the target moving body and the moving body predicted to collide with it is a pedestrian, the calculation unit adds an adjustment value corresponding to the pedestrian's situation to the evaluation value. In this case, situations unique to pedestrians can be reflected in the decision on whether to provide information to the mobile terminal.
  • The calculation unit may determine whether there is a blind spot factor that creates a blind spot between the target moving body and the moving body predicted to collide with it, and add the result of that determination to the evaluation value. In this case, the presence or absence of a blind spot factor can be reflected in the decision on whether to provide information to the mobile terminal.
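  A minimal sketch of how such adjustment values could enter the evaluation value follows; the base value, penalty magnitudes, and situation labels are illustrative assumptions rather than values from the disclosure.

```python
# Hypothetical safety evaluation with adjustment values.  A collision
# prediction lowers the score; pedestrian-specific situations and a
# blind spot factor lower it further.

BASE_VALUE = 100  # assumed scale: higher means safer

# Assumed situation-specific adjustments for pedestrians.
PEDESTRIAN_ADJUSTMENTS = {"crossing": -20, "roadside": -5}

def evaluate_safety(collision_predicted, pedestrian_situation=None,
                    blind_spot_factor=False):
    value = BASE_VALUE
    if collision_predicted:
        value -= 60                  # penalty for a predicted collision
    if pedestrian_situation is not None:
        value += PEDESTRIAN_ADJUSTMENTS.get(pedestrian_situation, 0)
    if blind_spot_factor:
        value -= 15                  # a blind spot lowers the safety level
    return value
```

  Under these assumed numbers, a pedestrian predicted to collide while crossing, with a blind spot between the two bodies, would score 100 - 60 - 20 - 15 = 5.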
  • The information providing system may further include a control unit that controls the mobile terminal so that the predicted traffic situation is output to the user of the mobile terminal, and the control unit may vary the output mode of the predicted traffic situation according to the attribute of the moving body predicted to collide. In this case, the situation can be output to the user in an output mode suited to the characteristics of each attribute of the moving body predicted to collide.
  • The predicted traffic situation may be information indicating whether the future movement of the mobile terminal for which the calculation unit obtains the evaluation value will be a comfortable movement. The calculation unit may evaluate the future movement comfort of each mobile terminal based on the dynamic map information and obtain the evaluation value based on that comfort. In this case, an evaluation value concerning the comfort of each mobile terminal can be obtained.
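  By way of illustration only, a comfort-oriented evaluation value might be derived from predicted stops and congestion along the terminal's future route; the metrics and weights below are assumptions, since the disclosure does not specify how comfort is scored.

```python
# Hypothetical comfort evaluation: fewer predicted stops and less
# congestion along the future route mean a more comfortable movement.

def comfort_value(expected_stops, congestion_ratio):
    """Score future movement comfort on an assumed 0-100 scale.
    congestion_ratio is the predicted congested fraction of the route."""
    value = 100 - 10 * expected_stops - 50 * congestion_ratio
    return max(0, min(100, value))
```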
  • A mobile terminal according to another embodiment receives the predicted traffic situation from any one of the information providing systems of (1) to (8) above and outputs the predicted traffic situation to a user.
  • An information providing method according to another embodiment provides information to a mobile terminal and includes: a calculation step of obtaining an evaluation value of a predicted traffic situation, which is a prediction result of the traffic situation of each mobile terminal mounted on at least some of one or more moving bodies, based on dynamic map information in which dynamic information about the moving bodies is superimposed on map information of the area where they are located; a determination step of determining, for each mobile terminal, whether to notify that terminal of its predicted traffic situation based on the evaluation value; and a notification step of notifying the mobile terminal of the predicted traffic situation based on the determination result of the determination step.
  • A computer program according to another embodiment causes a computer to execute an information providing process for providing information to a mobile terminal, the process including: a calculation step of obtaining an evaluation value of a predicted traffic situation for each mobile terminal mounted on at least some of one or more moving bodies, based on dynamic map information in which dynamic information about the moving bodies is superimposed on map information of the area where they are located; a determination step of determining, for each mobile terminal, whether to notify that terminal of its predicted traffic situation based on the evaluation value; and a notification step of notifying the mobile terminal of the predicted traffic situation based on the determination result of the determination step.
  • An information providing apparatus according to another embodiment includes: a calculation unit that obtains an evaluation value of a predicted traffic situation, which is a prediction result of the traffic situation of each mobile terminal mounted on at least some of one or more moving bodies, based on dynamic map information in which dynamic information about the moving bodies is superimposed on map information of the area where they are located; and a determination unit that determines, for each mobile terminal, whether to notify that terminal of its predicted traffic situation based on the evaluation value.
  • FIG. 1 is a schematic diagram illustrating an overall configuration of a wireless communication system according to an embodiment.
  • The wireless communication system includes a plurality of communication terminals 1A to 1D capable of wireless communication, one or more base stations 2 that communicate wirelessly with the communication terminals 1A to 1D, one or more edge servers 3 that communicate with the base stations 2 by wire or wirelessly, and one or more core servers 4 that communicate with the edge servers 3 by wire or wirelessly.
  • The communication terminals 1A to 1D are also referred to collectively as communication terminals 1.
  • The core server 4 is installed in a core data center (DC) of the core network.
  • The edge server 3 is installed in a distributed data center (DC) of a metro network.
  • A metro network is, for example, a communication network constructed for each city. Each metro network is connected to the core network.
  • The base station 2 is communicably connected to one of the edge servers 3 of the distributed data centers included in the metro network.
  • The core server 4 is communicably connected to the core network.
  • The edge server 3 is communicably connected to the metro network. Therefore, the core server 4 can communicate with the edge servers 3 and base stations 2 belonging to each metro network via the core network and the metro networks.
  • The base stations 2 include at least one of macro cell base stations, micro cell base stations, and pico cell base stations.
  • The edge server 3 and the core server 4 are general-purpose servers capable of SDN (Software-Defined Networking).
  • The base stations 2 and relay devices such as repeaters (not shown) are transport devices capable of SDN. Therefore, by network virtualization technology, a plurality of virtual networks (network slices) S1 to S4 satisfying conflicting service requirement conditions, such as low-latency communication and large-capacity communication, can be defined on the physical devices of the wireless communication system.
  • The network virtualization technology described above is a basic concept of the fifth-generation mobile communication system (hereinafter abbreviated as "5G"), which is currently being standardized. The wireless communication system according to the present embodiment is therefore, for example, compliant with 5G.
  • The wireless communication system according to the present embodiment need only be a mobile communication system capable of defining a plurality of network slices (hereinafter also referred to as "slices") S1 to S4 according to predetermined service requirement conditions such as delay time, and is not limited to 5G.
  • The number of slice levels defined is not limited to four and may be five or more.
  • The network slices S1 to S4 are defined as follows.
  • The slice S1 is a network slice defined so that the communication terminals 1A to 1D communicate directly with one another.
  • The communication terminals 1A to 1D that communicate directly in the slice S1 are also referred to as "nodes N1".
  • The slice S2 is a network slice defined so that the communication terminals 1A to 1D communicate with the base station 2.
  • The highest communication node in the slice S2 (the base station 2 in the illustrated example) is also referred to as "node N2".
  • The slice S3 is a network slice defined so that the communication terminals 1A to 1D communicate with the edge server 3 via the base station 2.
  • The highest communication node in the slice S3 (the edge server 3 in the illustrated example) is also referred to as "node N3".
  • In the slice S3, the node N2 serves as a relay node. That is, data communication is performed through the uplink path node N1 → node N2 → node N3 and the downlink path node N3 → node N2 → node N1.
  • The slice S4 is a network slice defined so that the communication terminals 1A to 1D communicate with the core server 4 via the base station 2 and the edge server 3.
  • The highest communication node in the slice S4 (the core server 4 in the figure) is also referred to as "node N4".
  • In the slice S4, the nodes N2 and N3 serve as relay nodes. That is, data communication is performed through the uplink path node N1 → node N2 → node N3 → node N4 and the downlink path node N4 → node N3 → node N2 → node N1.
  • Routing that does not use the edge server 3 as a relay node may also be defined.
  • In that case, data communication is performed through the uplink path node N1 → node N2 → node N4 and the downlink path node N4 → node N2 → node N1.
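  The slice-to-path mapping above can be summarized as follows; the table merely restates the routes given in the text, and the identifier names are assumed for illustration.

```python
# Uplink relay paths of the network slices S1-S4 as described above;
# the downlink path is simply the reverse of the uplink path.

SLICE_UPLINK = {
    "S1": ["N1"],                    # terminals communicate directly
    "S2": ["N1", "N2"],              # terminal -> base station
    "S3": ["N1", "N2", "N3"],        # -> edge server via base station
    "S4": ["N1", "N2", "N3", "N4"],  # -> core server via edge server
}

def downlink_path(slice_id):
    return list(reversed(SLICE_UPLINK[slice_id]))
```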
  • The communication terminal 1A is a wireless communication device mounted on a vehicle 5.
  • The vehicles 5 include not only ordinary passenger cars but also public vehicles such as route buses and emergency vehicles.
  • The vehicle 5 may be a two-wheeled vehicle (motorcycle) as well as a four-wheeled vehicle.
  • The drive system of the vehicle 5 may be any of engine drive, electric motor drive, and a hybrid system.
  • The driving method of the vehicle 5 may be either normal driving, in which an occupant performs operations such as acceleration, deceleration, and steering, or automatic driving, in which these operations are performed by software.
  • The communication terminal 1A of the vehicle 5 may be a wireless communication device already installed in the vehicle 5, or a portable terminal brought into the vehicle 5 by an occupant.
  • The occupant's portable terminal is temporarily connected to the in-vehicle LAN (Local Area Network) of the vehicle 5 and thereby serves as an in-vehicle wireless communication device.
  • The communication terminal 1B is a portable terminal (pedestrian terminal) carried by a pedestrian 7.
  • The pedestrian 7 is a person who moves on foot, both outdoors, such as on roads and in parking lots, and indoors, such as in buildings and underground malls.
  • The pedestrians 7 include not only persons walking but also persons riding bicycles without a power source.
  • The communication terminal 1C is a wireless communication device mounted on a roadside sensor 8.
  • The roadside sensors 8 include image-type vehicle detectors installed on roads, security cameras installed outdoors or indoors, and the like.
  • The communication terminal 1D is a wireless communication device mounted on a traffic signal controller 9 at an intersection.
  • FIG. 2 is a block diagram illustrating an example of the internal configuration of the edge server 3 and the core server 4.
  • The edge server 3 includes a control unit 31 including a CPU (Central Processing Unit), a ROM (Read Only Memory) 32, a RAM (Random Access Memory) 33, a storage unit 34, and a communication unit 35.
  • The control unit 31 reads one or more programs stored in advance in the ROM 32 into the RAM 33 and executes them, thereby controlling the operation of each hardware component and causing the computer device to function as the edge server 3 capable of communicating with the core server 4 and the base station 2.
  • The RAM 33 is composed of volatile memory elements such as SRAM (Static RAM) or DRAM (Dynamic RAM) and temporarily stores programs executed by the control unit 31 and data necessary for their execution.
  • The storage unit 34 includes a nonvolatile memory element such as a flash memory or an EEPROM (Electrically Erasable Programmable Read Only Memory), or a magnetic storage device such as a hard disk.
  • The communication unit 35 has a function of communicating with the core server 4 and the base station 2 via the metro network.
  • The communication unit 35 transmits information given from the control unit 31 to external devices via the metro network and gives information received via the metro network to the control unit 31.
  • The storage unit 34 stores a dynamic information map (hereinafter also simply referred to as the "map") M1 as dynamic map information.
  • The map M1 is an aggregate (virtual database) of data in which dynamic information that changes from moment to moment is superimposed on a high-definition digital map constituting static information.
  • The information constituting the map M1 includes the following "dynamic information" and "static information".
  • "Dynamic information" refers to dynamic data that requires a delay time of one second or less.
  • For example, position information of moving bodies (vehicles, pedestrians, and the like) and traffic signal information, which are used as ITS (Intelligent Transport Systems) look-ahead information, correspond to dynamic information.
  • The position information of moving bodies included in the dynamic information covers not only the positions of vehicles 5 and pedestrians 7 that can perform wireless communication by having the communication terminals 1A and 1B, but also the positions of vehicles 5 and pedestrians 7 that have no wireless communication function.
  • "Static information" refers to static data for which a delay time of one month or less is acceptable.
  • For example, road surface information, lane information, and three-dimensional structure data correspond to static information.
  • The control unit 31 of the edge server 3 updates the dynamic information of the map M1 stored in the storage unit 34 at every predetermined update cycle (update process). Specifically, at every predetermined update cycle, the control unit 31 collects, from the communication terminals 1A to 1D, various sensor information acquired by the vehicles 5, the roadside sensors 8, and the like within the service area of its own device, and updates the dynamic information of the map M1 based on the collected sensor information.
  • Upon receiving a dynamic information request message from a communication terminal 1A or 1B of a given user, the control unit 31 distributes the latest dynamic information to the communication terminal 1A or 1B that transmitted the request message at every predetermined distribution cycle (distribution process). The control unit 31 may also collect traffic information and weather information for each location in the service area from a traffic control center, a private weather service support center, and the like, and update the dynamic information or static information of the map M1 based on the collected information.
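  The update and distribution processes described above can be sketched roughly as follows; the class shape, cycle handling, and message formats are assumptions for illustration and do not appear in the disclosure.

```python
# Hypothetical sketch of the edge server's dynamic map handling: an
# update process merges collected sensor information into map M1, and a
# distribution process sends the latest dynamic information to every
# terminal that has sent a request message.

class EdgeServerSketch:
    def __init__(self):
        self.dynamic_map = {}     # map M1 dynamic layer: object id -> info
        self.subscribers = set()  # terminals awaiting periodic distribution

    def update(self, sensor_reports):
        """Update process, run every update cycle."""
        self.dynamic_map.update(sensor_reports)

    def handle_request(self, terminal_id):
        """A dynamic information request message registers its sender."""
        self.subscribers.add(terminal_id)

    def distribute(self):
        """Distribution process, run every distribution cycle."""
        return {tid: dict(self.dynamic_map) for tid in self.subscribers}
```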
  • The core server 4 includes a control unit 41 including a CPU, a ROM 42, a RAM 43, a storage unit 44, and a communication unit 45.
  • The control unit 41 reads one or more programs stored in advance in the ROM 42 into the RAM 43 and executes them, thereby controlling the operation of each hardware component and functioning as the core server 4 capable of communicating with the edge server 3.
  • The RAM 43 is composed of volatile memory elements such as SRAM or DRAM and temporarily stores programs executed by the control unit 41 and data necessary for their execution.
  • The storage unit 44 is configured by a nonvolatile memory element such as a flash memory or an EEPROM, or a magnetic storage device such as a hard disk.
  • The communication unit 45 has a function of communicating with the edge server 3 and the base station 2 via the core network.
  • The communication unit 45 transmits information given from the control unit 41 to external devices via the core network and gives information received via the core network to the control unit 41.
  • The storage unit 44 of the core server 4 stores a dynamic information map M2.
  • The data structure of the map M2 (a data structure including dynamic information and static information) is the same as that of the map M1.
  • The map M2 may be a map of the same service area as the map M1 of a specific edge server 3, or may be a wider-area map integrating the maps M1 held by a plurality of edge servers 3.
  • The control unit 41 of the core server 4 may also perform an update process of updating the dynamic information of the map M2 stored in the storage unit 44 and a distribution process of distributing the dynamic information in response to request messages. That is, the control unit 41 can execute dynamic information update processing and distribution processing based on the map M2 of its own device, independently of the edge server 3.
  • However, the core server 4, which belongs to the slice S4, has a longer communication delay time with the communication terminals 1A to 1D than the edge server 3, which belongs to the slice S3. For this reason, even if the core server 4 updates the dynamic information of the map M2 independently, that information is inferior in real-time performance to the dynamic information of the map M1 managed by the edge server 3. Therefore, it is preferable, for example, that the control unit 31 of the edge server 3 and the control unit 41 of the core server 4 share the dynamic information update processing and distribution processing according to a priority defined for each predetermined area.
  • FIG. 3 is a block diagram illustrating an example of an internal configuration of the in-vehicle device 50 of the vehicle 5 on which the communication terminal 1A is mounted.
  • The in-vehicle device 50 mounted on the vehicle 5 includes a control unit (ECU: Electronic Control Unit) 51, a GPS receiver 52, a vehicle speed sensor 53, a gyro sensor 54, a storage unit 55, a display 56, a speaker 57, an input device 58, an in-vehicle camera 59, a radar sensor 60, and a communication unit 61.
  • The communication unit 61 is the communication terminal 1A described above (a wireless communication device capable of communication conforming to 5G). Therefore, the in-vehicle device 50 of the vehicle 5 can communicate with the edge server 3 as a kind of mobile terminal belonging to the slice S3, and can also communicate with the core server 4 as a kind of mobile terminal belonging to the slice S4.
  • The control unit 51 is a computer device that performs route searches for the vehicle 5, control of the other electronic devices 52 to 61, and the like.
  • The control unit 51 obtains the vehicle position of the host vehicle from GPS signals periodically acquired by the GPS receiver 52. Further, the control unit 51 complements the vehicle position and direction based on the input signals of the vehicle speed sensor 53 and the gyro sensor 54, thereby grasping the accurate current position and direction of the vehicle 5.
  • The GPS receiver 52, the vehicle speed sensor 53, and the gyro sensor 54 are sensors for measuring the current position, speed, and direction of the vehicle 5.
  • the storage unit 55 includes a map database.
  • the map database provides road map data to the control unit 51.
  • the road map data includes link data and node data, and is stored in a recording medium such as a DVD, a CD-ROM, a memory card, or an HDD.
  • the storage unit 55 reads out necessary road map data from the recording medium and provides it to the control unit 51.
  • the display 56 and the speaker 57 are output devices for notifying a user who is a passenger of the vehicle 5 of various information generated by the control unit 51. Specifically, the display 56 displays an input screen for route search, a map image around the host vehicle, route information to the destination, and the like. The speaker 57 outputs an announcement or the like for guiding the vehicle 5 to the destination. These output devices can also notify the passenger of the provision information received by the communication unit 61.
  • the input device 58 is a device for a passenger of the vehicle 5 to perform various input operations.
  • The input device 58 includes, for example, an operation switch provided on the steering wheel, a joystick, a touch panel provided on the display 56, or a combination thereof.
  • the input device 58 may be a voice recognition device that accepts input by voice recognition of a passenger.
  • the input signal generated by the input device 58 is transmitted to the control unit 51.
  • the in-vehicle camera 59 is an image sensor that captures an image in front of the vehicle 5.
  • the in-vehicle camera 59 may be either monocular or compound eye.
  • The radar sensor 60 is a sensor that detects an object existing in front of or around the vehicle 5 by a millimeter wave radar, a LiDAR method, or the like. Based on the measurement data from the in-vehicle camera 59 and the radar sensor 60, the control unit 51 can execute driving support control that outputs a warning to the display 56 for the occupant who is driving, or performs forced braking intervention.
  • The control unit 51 is configured by an arithmetic processing device such as a microcomputer that executes various control programs stored in the storage unit 55. As functions realized by executing the control programs, the control unit 51 has various navigation functions, such as a function of displaying a map image on the display 56, a function of calculating a route from the departure point to the destination (including the position of a relay point, if any), and a function of guiding the vehicle 5 to the destination according to the calculated route.
  • Based on the measurement data of at least one of the in-vehicle camera 59 and the radar sensor 60, the control unit 51 has a function of executing object recognition processing for recognizing an object in front of or around the host vehicle, and distance measurement processing for calculating the distance to the recognized object.
  • the control unit 51 can calculate the position information of the object recognized by the object recognition process from the distance calculated by the distance measurement process and the sensor position of the host vehicle.
  • the control unit 51 can execute the following processes in communication with the edge server 3 (which may be the core server 4). 1) Request message transmission processing 2) Dynamic information reception processing 3) Change point information generation processing 4) Change point information transmission processing
  • the request message transmission process is a process of transmitting, to the edge server 3, a control packet for requesting distribution of dynamic information of the map M1 that is sequentially updated by the edge server 3.
  • the control packet includes the vehicle ID of the host vehicle.
  • the dynamic information receiving process is a process of receiving the dynamic information distributed by the edge server 3 to the own apparatus.
  • The change point information generation process by the control unit 51 is a process of calculating the change between the received dynamic information and the sensor information of the host vehicle at the time of reception from a comparison of the two, and generating change point information, which is information on the difference between the two pieces of information.
  • The change point information generated by the control unit 51 is, for example, the following information examples a1 and a2.
  • Information example a1 Change point information regarding a recognized object
  • When the control unit 51 detects, by its own object recognition processing, an object X (a moving object such as a vehicle or a pedestrian, or an obstacle) that is not included in the received dynamic information, the image data and position information of the detected object X are used as change point information.
  • When the position information of an object X included in the received dynamic information deviates from the position information of the object X detected by the control unit 51's own object recognition processing by a predetermined threshold or more, the difference value between the two pieces of position information is used as change point information.
  • Information example a2 Change point information regarding own vehicle
  • When the position information of the host vehicle included in the received dynamic information deviates from the vehicle position of the host vehicle calculated from the GPS signal by a predetermined threshold or more, the control unit 51 uses the difference value between the two as change point information.
  • Likewise, when the azimuth of the host vehicle included in the received dynamic information deviates from the azimuth grasped by the control unit 51 by a predetermined threshold or more, the control unit 51 uses the difference value between the two as change point information.
  • When the change point information is generated as described above, the control unit 51 generates a communication packet addressed to the edge server 3 including the generated change point information and the vehicle ID of the host vehicle.
  • the change point information transmission process is a process of transmitting the communication packet including the change point information to the edge server 3.
  • the change point information transmission process is performed within the dynamic information distribution cycle by the edge server 3.
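The change point information generation of information examples a1 and a2 can be sketched as follows. This is a minimal illustration only: the record layout, the threshold value, and the function name are assumptions for clarity, not the embodiment's actual implementation.

```python
import math

# Hypothetical threshold for the "predetermined threshold" of example a2.
POSITION_THRESHOLD_M = 1.0

def generate_change_point_info(dynamic_info, sensed_objects, own_id, own_gps_pos):
    """Compare received dynamic information with the host vehicle's own
    sensor information and return the differences (change point information)."""
    change_points = []

    # Information example a1: an object detected by the host vehicle's own
    # object recognition that is absent from the received dynamic information.
    known_ids = {obj["id"] for obj in dynamic_info["objects"]}
    for obj in sensed_objects:
        if obj["id"] not in known_ids:
            change_points.append({"type": "new_object",
                                  "image": obj["image"],
                                  "position": obj["position"]})

    # Information example a2: the host vehicle's position in the dynamic
    # information deviates from its GPS-derived position by the threshold or more.
    reported = next((o["position"] for o in dynamic_info["objects"]
                     if o["id"] == own_id), None)
    if reported is not None:
        dx = own_gps_pos[0] - reported[0]
        dy = own_gps_pos[1] - reported[1]
        if math.hypot(dx, dy) >= POSITION_THRESHOLD_M:
            change_points.append({"type": "own_position_diff",
                                  "diff": (dx, dy)})
    return change_points
```

In an actual in-vehicle device, the returned list would be packed with the vehicle ID into a communication packet and sent to the edge server 3 within the distribution cycle.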
  • The control unit 51 can also execute driving support control that causes the display 56 to output a warning to the driver who is driving, or performs forced braking intervention.
  • FIG. 4 is a block diagram illustrating an example of an internal configuration of the pedestrian terminal 70 (communication terminal 1B).
  • the pedestrian terminal 70 in FIG. 4 is a wireless communication device capable of communication processing based on 5G, for example. Therefore, the pedestrian terminal 70 can communicate with the edge server 3 as a kind of mobile terminal belonging to the slice S3.
  • the pedestrian terminal 70 can also communicate with the core server 4 as a kind of mobile terminal belonging to the slice S4.
  • pedestrian terminal 70 includes a control unit 71, a storage unit 72, a display unit 73, an operation unit 74, and a communication unit 75.
  • the communication unit 75 is a communication interface that wirelessly communicates with the base station 2 of the carrier that provides the 5G service.
  • the communication unit 75 converts the RF signal from the base station 2 into a digital signal and outputs it to the control unit 71. Further, the communication unit 75 converts the digital signal input from the control unit 71 into an RF signal and transmits the RF signal to the base station 2.
  • the control unit 71 includes a CPU, a ROM, and a RAM.
  • the control unit 71 reads out and executes the program stored in the storage unit 72 and controls the overall operation of the pedestrian terminal 70.
  • the storage unit 72 is configured by a hard disk, a non-volatile memory, or the like, and stores various computer programs and data.
  • the storage unit 72 stores a mobile ID that is identification information of the pedestrian terminal 70.
  • the mobile ID is, for example, a unique user ID or MAC address of the carrier contractor.
  • the storage unit 72 stores various application software arbitrarily installed by the user.
  • The application software stored in the storage unit 72 includes, for example, application software for receiving an information providing service for receiving dynamic information of the map M1 through communication with the edge server 3 (or the core server 4).
  • the operation unit 74 includes various operation buttons and a touch panel function of the display unit 73.
  • the operation unit 74 outputs an operation signal corresponding to a user operation to the control unit 71.
  • the display unit 73 is, for example, a liquid crystal display.
  • the display unit 73 presents various information to the user. For example, the display unit 73 displays the image data of the information maps M1 and M2 transmitted from the servers 3 and 4 on the screen.
  • The control unit 71 has a time synchronization function for acquiring the current time from GPS signals, a position detection function for measuring the current position (latitude, longitude, and altitude) of the terminal itself from GPS signals, and an azimuth detection function for measuring the direction of the pedestrian 7 with an azimuth sensor.
  • the control unit 71 can execute the following processes in communication with the edge server 3 (which may be the core server 4). 1) Request message transmission processing 2) Dynamic information reception processing 3) Change point information generation processing 4) Change point information transmission processing
  • the request message transmission process is a process of transmitting, to the edge server 3, a control packet for requesting distribution of dynamic information of the map M1 that is sequentially updated by the edge server 3.
  • the control packet includes the mobile ID of the pedestrian terminal 70.
  • the dynamic information receiving process is a process of receiving the dynamic information distributed by the edge server 3 to the own apparatus.
  • The change point information generation process by the control unit 71 is a process of calculating the change between the received dynamic information and the terminal's own sensor information at the time of reception from a comparison of the two, and generating change point information, which is information on the difference between the two pieces of information.
  • the change point information generated by the control unit 71 is, for example, the following information example.
  • When the position information of the pedestrian 7 included in the received dynamic information deviates from the position of the pedestrian 7 calculated from the GPS signal by a predetermined threshold or more, the control unit 71 uses the difference value between the two as change point information. Likewise, when the azimuth of the pedestrian 7 included in the received dynamic information deviates from the azimuth of the pedestrian 7 calculated by the azimuth sensor by a predetermined threshold or more, the control unit 71 uses the difference value between the two as change point information.
  • When generating the change point information as described above, the control unit 71 generates a communication packet addressed to the edge server 3 including the generated change point information and the mobile ID of the terminal 70 itself.
  • the change point information transmission process is a process of transmitting the communication packet including the change point information to the edge server 3.
  • the change point information transmission process is performed within the dynamic information distribution cycle by the edge server 3.
  • control unit 71 transmits the state information including the position and orientation information of the terminal 70 to the edge server 3 by performing the change point information generation process and the change point information transmission process.
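The pedestrian terminal's deviation checks described above can be sketched as follows. The threshold values and function name are illustrative assumptions; the patent only states that "predetermined thresholds" are used.

```python
import math

# Hypothetical thresholds standing in for the "predetermined thresholds".
POS_THRESHOLD_M = 2.0         # position deviation threshold
AZIMUTH_THRESHOLD_DEG = 15.0  # azimuth deviation threshold

def pedestrian_change_points(reported_pos, gps_pos, reported_az, sensed_az):
    """Return difference values to report when the dynamic information's
    record of the pedestrian deviates from the terminal's own measurements."""
    change_points = {}

    # Position check: report the difference only when it reaches the threshold.
    dx = gps_pos[0] - reported_pos[0]
    dy = gps_pos[1] - reported_pos[1]
    if math.hypot(dx, dy) >= POS_THRESHOLD_M:
        change_points["position_diff"] = (dx, dy)

    # Azimuth check: use the smallest signed angular difference,
    # so that 359 degrees vs 1 degree counts as 2 degrees.
    az_diff = (sensed_az - reported_az + 180.0) % 360.0 - 180.0
    if abs(az_diff) >= AZIMUTH_THRESHOLD_DEG:
        change_points["azimuth_diff"] = az_diff
    return change_points
```

An empty result means no change point information needs to be transmitted in that distribution cycle.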
  • FIG. 5 is a block diagram showing an example of the internal configuration of the roadside sensor 8 equipped with a wireless communication device that is the communication terminal 1C.
  • roadside sensor 8 includes a control unit 81, a storage unit 82, a roadside camera 83, a radar sensor 84, and a communication unit 85.
  • the communication unit 85 is the above-described communication terminal 1C, that is, a wireless communication device capable of communication processing based on, for example, 5G. Therefore, the roadside sensor 8 can communicate with the edge server 3 as a kind of fixed terminal belonging to the slice S3. The roadside sensor 8 can also communicate with the core server 4 as a kind of fixed terminal belonging to the slice S4.
  • the control unit 81 includes a CPU, a ROM, and a RAM.
  • the control unit 81 reads and executes the program stored in the storage unit 82 and controls the overall operation of the roadside sensor 8.
  • the storage unit 82 is configured by a hard disk, a non-volatile memory, or the like, and stores various computer programs and data.
  • the storage unit 82 stores a sensor ID that is identification information of the roadside sensor 8.
  • the sensor ID is, for example, a user ID unique to the owner of the roadside sensor 8 or a MAC address.
  • the roadside camera 83 is an image sensor that captures an image of a predetermined shooting area.
  • the roadside camera 83 may be either monocular or compound eye.
  • The radar sensor 84 is a sensor that detects an object existing in the shooting area by a millimeter wave radar, a LiDAR method, or the like.
  • the control unit 81 transmits the captured video data and the like to the security administrator's computer device.
  • the control unit 81 transmits the captured video data and the like to the traffic control center.
  • The control unit 81 has a function of executing object recognition processing for recognizing an object in the shooting area based on the measurement data of at least one of the roadside camera 83 and the radar sensor 84, and distance measurement processing for calculating the distance to the recognized object.
  • The control unit 81 can calculate the position information of the object recognized by the object recognition process from the distance calculated by the distance measurement process and the installation position of the sensor itself.
  • the control unit 81 can execute the following processes in communication with the edge server 3 (which may be the core server 4). 1) Change point information generation process 2) Change point information transmission process
  • the change point information generation processing in the roadside sensor 8 is based on the comparison result between the previous sensor information and the current sensor information for each predetermined measurement cycle (for example, the dynamic information distribution cycle by the edge server 3). This is a process of calculating changes between sensor information and generating change point information that is information relating to differences between the two pieces of information.
  • the change point information generated by the roadside sensor 8 is, for example, the following information example b1.
  • When the control unit 81 detects, by the current object recognition process, an object Y (a moving object such as a vehicle or a pedestrian, or an obstacle) that was not detected in the previous object recognition process, the image data and position information of the detected object Y are used as change point information.
  • When the position of the detected object Y has changed from the position detected in the previous object recognition process by a predetermined threshold or more, the control unit 81 uses the difference value between the two as change point information.
  • When generating the change point information as described above, the control unit 81 generates a communication packet addressed to the edge server 3 including the generated change point information and the sensor ID of its own device.
  • The change point information transmission process is a process of transmitting the communication packet including the change point information to the edge server 3.
  • the change point information transmission process is performed within the dynamic information distribution cycle by the edge server 3.
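Information example b1, in which the roadside sensor compares the previous measurement cycle with the current one, can be sketched as follows. The mapping of object IDs to positions is an assumption made for illustration; a real sensor would associate detections across cycles by tracking.

```python
def roadside_change_points(previous, current):
    """previous/current: dicts mapping an object id to its detected position.
    Returns change point information per information example b1."""
    change_points = []
    for obj_id, pos in current.items():
        if obj_id not in previous:
            # Object Y newly detected in the current cycle but absent
            # from the previous object recognition process.
            change_points.append({"type": "new_object", "id": obj_id,
                                  "position": pos})
        elif pos != previous[obj_id]:
            # Object moved: report the difference value of the two positions.
            dx = pos[0] - previous[obj_id][0]
            dy = pos[1] - previous[obj_id][1]
            change_points.append({"type": "position_diff", "id": obj_id,
                                  "diff": (dx, dy)})
    return change_points
```

The returned list, together with the sensor ID, corresponds to the payload of the communication packet transmitted to the edge server 3 within the distribution cycle.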
  • FIG. 6 is an overall configuration diagram of the information providing system according to the present embodiment.
  • The information providing system according to the present embodiment includes a large number of vehicles 5, pedestrian terminals 70, and roadside sensors 8 scattered in the relatively wide service area (real world) of the edge server 3.
  • These communication nodes can perform low-delay wireless communication with the edge server 3, which functions as an information providing apparatus, through 5G-based communication performed via the base station 2. That is, the information providing system includes a part or all of the above-described wireless communication system.
  • The moving bodies existing in the service area of the edge server 3 include vehicles 5 capable of wireless communication through the mounted communication terminal 1A and in-vehicle device 50, and pedestrians 7 carrying the pedestrian terminal 70, as well as vehicles 5 that do not have a wireless communication function and pedestrians 7 that do not carry the pedestrian terminal 70.
  • the edge server 3 collects the above-described change point information at a predetermined cycle from the in-vehicle device 50 of the vehicle 5 in the service area, the pedestrian terminal 70, the roadside sensor 8, and the like (step S31).
  • the edge server 3 integrates the collected change point information by map matching (integration processing), and updates the dynamic information of the information map M1 being managed (step S32).
  • the edge server 3 transmits the latest dynamic information to the requesting communication node (step S33). Thereby, for example, the vehicle 5 that has received the dynamic information can use the dynamic information for driving assistance of the passenger.
  • the edge server 3 may transmit the map M1 updated in step S32 as dynamic information to the requesting communication node.
  • When the vehicle 5 that has received the dynamic information detects change point information by comparing the dynamic information with its own sensor information, the vehicle 5 transmits the detected change point information to the edge server 3 (step S34).
  • In this way, information processing in each communication node circulates in the order of change point information collection (step S31) → dynamic information update (step S32) → dynamic information distribution (step S33) → change point information detection by a vehicle or the like (step S34) → change point information collection (step S31).
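The circulation of steps S31 to S34 can be sketched from the edge server's point of view. The class and helper names are assumptions, and map matching is simplified here to overwriting object positions; the sketch only illustrates how one distribution cycle feeds the next.

```python
class EdgeServer:
    """Minimal illustration of the edge server's processing cycle."""

    def __init__(self):
        self.dynamic_map = {}  # dynamic layer of the information map M1

    def run_cycle(self, change_point_batches, subscribers):
        # Step S31: collect change point information from vehicles,
        # pedestrian terminals, and roadside sensors.
        collected = [cp for batch in change_point_batches for cp in batch]

        # Step S32: integrate the collected change points and update the
        # dynamic information (map matching simplified to an overwrite).
        for cp in collected:
            self.dynamic_map[cp["id"]] = cp["position"]

        # Step S33: distribute the latest dynamic information to requesters.
        for subscriber in subscribers:
            subscriber.receive(dict(self.dynamic_map))

        # Step S34 (change point detection) happens on the receiving nodes,
        # whose results become the input of the next cycle's step S31.
```

Each call to `run_cycle` corresponds to one distribution cycle (U1, U2, ...) in the sequence of FIG. 7.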
  • Although FIG. 6 illustrates an information providing system including only one edge server 3, the information providing system may include a plurality of edge servers 3. Instead of, or in addition to, the edge server 3, one or a plurality of core servers 4 may be included.
  • The information map M1 managed by the edge server 3 may be a map in which at least dynamic information of objects is superimposed on map information such as a digital map. The same applies to the information map M2 of the core server 4.
  • FIG. 7 is a sequence diagram illustrating an example of dynamic information update processing and distribution processing executed by the cooperation of the pedestrian terminal 70, the in-vehicle device 50 of the vehicle 5, the roadside sensor 8, and the edge server 3.
  • In FIG. 7, the nominal execution subjects are the pedestrian terminal 70, the in-vehicle device 50 of the vehicle 5, the roadside sensor 8, and the edge server 3, but the actual execution subjects are the control units 71, 51, 81, and 31, respectively. In FIG. 7, U1, U2, ... are dynamic information distribution cycles.
  • When the edge server 3 receives a dynamic information request message from the pedestrian terminal 70 and the in-vehicle device 50 of the vehicle 5 (step S1), it transmits the latest dynamic information at the time of reception to the pedestrian terminal 70 and the in-vehicle device 50 of the vehicle 5, which are the transmission sources (step S2).
  • Specifically, the edge server 3 analyzes the received request message, and transmits the dynamic information to the request source when the information indicating the request source included in the message indicates a communication terminal 1 registered in advance.
  • If there is a request message from only one of the pedestrian terminal 70 and the in-vehicle device 50 in step S1, the dynamic information is distributed in step S2 only to the one communication terminal that is the transmission source of the request message.
  • the pedestrian terminal 70 that has received the dynamic information distributed in step S2 generates change point information within the distribution cycle U1 (step S3), and transmits the generated change point information to the edge server 3 (step S6).
  • The in-vehicle device 50 that has received the dynamic information distributed in step S2 generates change point information from the comparison result between the dynamic information and its own sensor information within the distribution cycle U1 (step S4), and transmits the generated change point information to the edge server 3 (step S6).
  • Likewise, when the roadside sensor 8 generates change point information within the distribution cycle U1 (step S5), it transmits the generated change point information to the edge server 3 (step S6).
  • When the edge server 3 receives the change point information from the pedestrian terminal 70, the in-vehicle device 50, and the roadside sensor 8 within the distribution cycle U1, the edge server 3 updates the dynamic information to reflect the change point information (step S7), and distributes the updated dynamic information to the pedestrian terminal 70 and the in-vehicle device 50 (step S8).
  • For example, when only the in-vehicle device 50 generates change point information within the distribution cycle U1, only the change point information generated by the in-vehicle device 50 in step S4 is transmitted to the edge server 3 (step S6), and the dynamic information is updated to reflect only that change point information (step S7). If none of the pedestrian terminal 70, the in-vehicle device 50, and the roadside sensor 8 generates change point information within the distribution cycle U1, the processes of steps S3 to S7 are not executed, and the same dynamic information as the previously transmitted dynamic information (step S2) is distributed to the pedestrian terminal 70 and the in-vehicle device 50 (step S8). Thus, the edge server 3 updates the dynamic information in step S7 based on the change point information transmitted within the distribution cycle U1.
  • The pedestrian terminal 70 that has received the dynamic information distributed in step S8 generates change point information within the distribution cycle U2 (step S9), and transmits the generated change point information to the edge server 3 (step S12).
  • The in-vehicle device 50 that has received the dynamic information distributed in step S8 generates change point information from the comparison result between the dynamic information and its own sensor information within the distribution cycle U2 (step S10), and transmits the generated change point information to the edge server 3 (step S12). Likewise, when the roadside sensor 8 generates change point information within the distribution cycle U2 (step S11), it transmits the generated change point information to the edge server 3 (step S12).
  • When the edge server 3 receives the change point information from the in-vehicle device 50 and the roadside sensor 8 within the distribution cycle U2, the edge server 3 updates the dynamic information to reflect the change point information (step S13), and distributes the updated dynamic information to the pedestrian terminal 70 and the in-vehicle device 50 (step S14). Thus, the edge server 3 updates the dynamic information in step S13 based on the change point information transmitted within the distribution cycle U2.
  • Thereafter, the same sequence is repeated until a dynamic information distribution stop request message is received from both the pedestrian terminal 70 and the vehicle 5, or until communication with the pedestrian terminal 70 and the vehicle 5 is interrupted.
  • The information providing system of the present embodiment has a function of providing information related to the predicted traffic situation to the mobile terminals (the pedestrian terminal 70 and the in-vehicle device 50 of the vehicle 5) carried by or mounted on one or a plurality of moving bodies located in the service area.
  • the predicted traffic situation indicates a result of predicting a future traffic situation.
  • FIG. 8 is a functional block diagram of the edge server 3 showing functions for providing the predicted traffic situation.
  • the control unit 31 of the edge server 3 functionally includes a calculation unit 31a, a determination unit 31b, a notification unit 31c, and a detection unit 31d. Each of these functions is realized by the control unit 31 executing a program stored in the storage unit 34.
  • The calculation unit 31a has a function of predicting the traffic situation of each mobile terminal (the pedestrian terminal 70 and the in-vehicle device 50) carried by or mounted on a moving body (pedestrian 7, vehicle 5) located in the service area represented by the dynamic information map M1, and obtaining an evaluation value of the predicted traffic situation.
  • Specifically, the calculation unit 31a refers to a mobile body database 34a (described later), in which information obtained from the dynamic information map M1 is registered, and obtains an evaluation value of the predicted traffic situation.
  • The determination unit 31b has a function of determining, based on the evaluation value, whether to notify each mobile terminal (the pedestrian terminal 70 and the in-vehicle device 50) of information regarding its predicted traffic situation.
  • the notification unit 31c has a function of notifying the mobile terminal of information related to the predicted traffic situation based on the determination result of the determination unit 31b.
  • The detection unit 31d has a function of detecting a plurality of moving bodies (pedestrians 7 and vehicles 5) whose position information is included in the dynamic information of the dynamic information map M1, and generating moving body information indicating the status of each moving body.
  • FIG. 9 is a diagram illustrating an example of the mobile object database 34a.
  • the moving body information generated by the detection unit 31d is registered in the moving body database 34a.
  • the mobile database 34a is managed and updated by the detection unit 31d.
  • the detection unit 31d refers to the dynamic information map M1, and when the position information of a new moving body (pedestrian 7, vehicle 5) is registered in the dynamic information, the moving body ID is given to the moving body, The mobile body information corresponding to the mobile body ID is generated and registered in the mobile body database 34a.
  • the detection unit 31d detects a moving body whose position information is registered in the dynamic information map M1, and assigns a moving body ID to each moving body to generate moving body information.
  • the moving body information includes information such as information on presence / absence of a communication function, vehicle ID (mobile ID), moving body attribute information, position information, direction information indicating a moving direction, and speed information indicating a moving speed.
  • the attribute information is information indicating the type of the moving object, for example, information indicating whether the moving object is a vehicle or a pedestrian.
  • When the moving body is a pedestrian, the attribute information includes information indicating whether the pedestrian is an adult or a child, and information indicating the direction of the body.
  • The attribute information may also include information indicating whether or not the pedestrian uses crutches or a wheelchair, information indicating the appearance such as the color or type of clothes, information indicating whether or not the pedestrian is using a smartphone while walking, and the like.
  • The mobile body database 34a is provided with columns for registering the mobile body ID, information on the presence or absence of a communication function, the vehicle ID (mobile ID), mobile body attribute information, position information, azimuth information, and speed information.
  • the detecting unit 31d refers to the dynamic information map M1 and generates moving body information for each moving body.
  • Among the mobile body information, the detection unit 31d acquires the information on the presence or absence of a communication function, the vehicle ID (mobile ID), and the position information as they are from the dynamic information map M1.
  • The detection unit 31d refers to the image data of the moving body captured by a camera or the like, included in the dynamic information map M1, determines the attribute of each moving body, and generates attribute information based on the determination.
  • The detection unit 31d calculates the azimuth information and the speed information based on the temporal change of the position information of each moving body included in the dynamic information map M1.
  • The detection unit 31d repeatedly generates the moving body information of each moving body, registers it in the mobile body database 34a, and updates the mobile body database 34a as needed. Thereby, the moving body information registered in the mobile body database 34a is kept up to date.
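One way the azimuth and speed columns of the mobile body database could be derived from the temporal change of position is sketched below. The coordinate convention (azimuth measured clockwise from north, i.e. the +y axis) and the function name are assumptions for illustration.

```python
import math

def azimuth_and_speed(prev_pos, curr_pos, dt_seconds):
    """Derive azimuth (degrees, clockwise from north) and speed (m/s)
    from two consecutive position samples (x, y) in metres."""
    dx = curr_pos[0] - prev_pos[0]
    dy = curr_pos[1] - prev_pos[1]
    speed = math.hypot(dx, dy) / dt_seconds
    # atan2(dx, dy) gives 0 deg for due north (+y) and 90 deg for due east (+x).
    azimuth = math.degrees(math.atan2(dx, dy)) % 360.0
    return azimuth, speed
```

In the context of the embodiment, `prev_pos` and `curr_pos` would be two successive position entries for the same moving body in the dynamic information map M1, one distribution cycle apart.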
  • evaluation value database 34b in FIG. 8 is a database for registering the evaluation value of the predicted traffic situation obtained by the calculation unit 31a.
  • the evaluation value database 34b will be described later.
  • FIG. 10 is a flowchart illustrating an example of a calculation process of the evaluation value of the predicted traffic situation by the calculation unit 31a.
  • the computing unit 31a reads the mobile object database 34a (step S51), and specifies a mobile object having a communication function as an evaluation target (step S52).
  • Information relating to the predicted traffic situation can be provided to a moving body having the pedestrian terminal 70 or the in-vehicle device 50. Therefore, the calculation unit 31a specifies a moving body having the pedestrian terminal 70 or the in-vehicle device 50 as an evaluation target for which an evaluation value is obtained.
  • In the following description, the term evaluation target may refer not only to the pedestrian terminal 70 or the in-vehicle device 50 carried by or mounted on a moving body, but also to the moving body (target moving body) having that pedestrian terminal 70 or in-vehicle device 50.
  • the calculation unit 31a performs evaluation value calculation processing (step S53).
  • The calculation unit 31a sequentially executes the process for each of the specified evaluation targets, and repeats the process until all the evaluation targets have been processed.
  • the calculation unit 31a first determines whether or not the attribute to be evaluated is a vehicle (step S54).
  • When it is determined that the attribute of the evaluation target is a vehicle (step S54), the calculation unit 31a proceeds to step S55 and performs the calculation process for vehicles. In this vehicle calculation process, the calculation unit 31a obtains the evaluation value of the evaluation target (step S55).
  • FIG. 11 is a flowchart showing an example of the vehicle calculation process in step S55 in FIG.
  • First, the calculation unit 31a identifies the moving bodies other than the evaluation target (the other moving bodies) (step S61) and calculates a predicted collision time between the evaluation target (target moving body) and each of the other moving bodies (step S62).
  • Referring to the moving body database 34a, the calculation unit 31a obtains the predicted time at which the evaluation target and each other moving body would collide from the position information, azimuth information, and speed information of both.
  • When no collision between the evaluation target and another moving body is predicted, the calculation unit 31a sets the predicted collision time to an extremely large predetermined value (for example, 5 minutes). The calculation unit 31a also sets the predicted collision time to this predetermined value when the calculated predicted collision time is equal to or longer than the predetermined value.
  • In this way, the calculation unit 31a predicts collisions between the evaluation target and the moving bodies other than the evaluation target based on the dynamic information map M1, and obtains the predicted collision time as the prediction result (predicted traffic situation) of the collision prediction.
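The per-pair prediction of steps S61 and S62 can be sketched as follows. This is a minimal illustration only: the constant-velocity closest-approach model, the function name, and the coordinate conventions (2-D positions in metres, azimuth in degrees clockwise from north, speed in m/s) are assumptions, not the patented implementation.

```python
import math

CAP_SECONDS = 300.0  # "extremely large" predetermined value, e.g. 5 minutes


def predicted_collision_time(pos_a, heading_a, speed_a,
                             pos_b, heading_b, speed_b):
    """Estimate the time of closest approach of two moving bodies,
    treating each as moving at constant velocity along its heading.
    Returns CAP_SECONDS when the bodies are diverging, not closing,
    or the estimate is at or beyond the cap."""
    # Velocity vectors from heading (degrees clockwise from north) and speed (m/s).
    va = (speed_a * math.sin(math.radians(heading_a)),
          speed_a * math.cos(math.radians(heading_a)))
    vb = (speed_b * math.sin(math.radians(heading_b)),
          speed_b * math.cos(math.radians(heading_b)))
    # Relative position and relative velocity of B with respect to A.
    dp = (pos_b[0] - pos_a[0], pos_b[1] - pos_a[1])
    dv = (vb[0] - va[0], vb[1] - va[1])
    dv2 = dv[0] ** 2 + dv[1] ** 2
    if dv2 == 0.0:
        return CAP_SECONDS  # identical velocities: no closing motion
    # Time t minimizing |dp + t * dv| (closest approach).
    t = -(dp[0] * dv[0] + dp[1] * dv[1]) / dv2
    if t < 0.0 or t >= CAP_SECONDS:
        return CAP_SECONDS  # already diverging, or beyond the cap
    return t
```

A result equal to `CAP_SECONDS` then plays the role of "no collision predicted" in the later steps.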
  • Next, among the predicted collision times calculated for the moving bodies other than the evaluation target, the computing unit 31a identifies the moving body with the shortest predicted collision time as the collision prediction target (step S63).
  • The computing unit 31a then obtains an evaluation value of the prediction result (collision prediction result) based on the predicted collision time between the evaluation target and the collision prediction target (step S64).
  • The calculation unit 31a obtains the evaluation value of the prediction result from the predicted collision time according to a predetermined rule. The rule is set as follows in this embodiment, but the present invention is not limited to this.
  • The evaluation value is set so that a larger value indicates a greater factor hindering the safety of the evaluation target; that is, the evaluation value is a value for evaluating the degree of safety of each evaluation target.
  • For moving bodies other than the collision prediction target, the calculation unit 31a sets the evaluation value to "0".
  • In this way, the calculation unit 31a obtains an evaluation value for each moving body based on the predicted collision time, which is the prediction result of the collision prediction between the evaluation target and that moving body. An evaluation value of the collision prediction between the evaluation target and each moving body other than the evaluation target can thereby be obtained.
  • Based on the predicted collision times, the calculation unit 31a selects, from the moving bodies other than the evaluation target, the one predicted to collide with the evaluation target (the collision prediction target) and obtains an evaluation value from the predicted collision time for that target. An evaluation value relating to the collision prediction of an evaluation target predicted to collide can thereby be obtained.
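Steps S63 and S64 can be illustrated with the sketch below. The banded mapping from predicted collision time to a base evaluation value is invented for illustration (the patent does not disclose the concrete rule here); only the structure — the shortest predicted collision time selects the collision prediction target, and all other moving bodies are scored "0" — follows the text.

```python
def base_evaluation_value(ttc_seconds):
    """Illustrative rule: the shorter the predicted collision time, the
    larger the evaluation value (i.e. the greater the safety risk).
    Band boundaries and scores are assumptions, not from the patent."""
    if ttc_seconds < 5.0:
        return 300
    if ttc_seconds < 10.0:
        return 200
    if ttc_seconds < 30.0:
        return 100
    return 0  # effectively no predicted collision


def evaluation_values(ttc_by_other):
    """Per the flow of FIG. 11: score only the moving body with the
    shortest predicted collision time (the collision prediction target);
    every other moving body gets 0.  `ttc_by_other` maps the ID of each
    moving body other than the evaluation target to its predicted
    collision time in seconds."""
    if not ttc_by_other:
        return {}
    predicted = min(ttc_by_other, key=ttc_by_other.get)
    return {other: (base_evaluation_value(t) if other == predicted else 0)
            for other, t in ttc_by_other.items()}
```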
  • Next, the calculation unit 31a determines whether or not there is a blind spot factor that causes a blind spot between the evaluation target and the collision prediction target (step S65).
  • The presence or absence of a blind spot factor between the evaluation target and the collision prediction target is determined by the calculation unit 31a referring to the dynamic information map M1.
  • That is, the computing unit 31a refers to the dynamic information map M1 and determines whether there is a building or another moving body between the evaluation target and the collision prediction target that blocks the view of both. When such a building or moving body exists between them, the calculation unit 31a determines that there is a blind spot factor between the evaluation target and the collision prediction target. On the other hand, if nothing blocks the line of sight between the evaluation target and the collision prediction target, the calculation unit 31a determines that there is no blind spot factor.
  • If it is determined in step S65 that there is a blind spot factor between the evaluation target and the collision prediction target, the computing unit 31a adds to the evaluation value obtained in step S64 (step S66) and proceeds to step S67. On the other hand, if it is determined that there is no blind spot factor (step S65), the calculation unit 31a proceeds to step S67 without adding to the evaluation value.
  • As described above, the evaluation value is set so that a larger value indicates a greater factor hindering the safety of the evaluation target. Since a blind spot factor hinders safety, the calculation unit 31a adds to the evaluation value.
  • The added value added to the evaluation value in step S66 is, for example, "100". This added value is only an example and is not limiting; the same applies to the added values shown below.
  • In this way, the calculation unit 31a determines the presence or absence of a blind spot factor that causes a blind spot between the evaluation target and the collision prediction target, and reflects the determination result in the evaluation value by addition.
  • The presence or absence of a blind spot factor can thereby be reflected in the evaluation value, and thus in the later-described determination of whether to provide information to the evaluation target.
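The blind spot determination of step S65 can be approximated by a line-of-sight test such as the following sketch. The point-to-segment distance model and the `radius` parameter are assumptions made for illustration; the actual determination consults buildings and moving bodies registered on the dynamic information map M1.

```python
def has_blind_spot_factor(pos_target, pos_predicted, obstacles, radius=1.5):
    """Rough line-of-sight test (an assumption, not the patented method):
    an obstacle is treated as blocking the view when its centre lies
    within `radius` metres of the straight segment between the
    evaluation target and the collision prediction target."""
    ax, ay = pos_target
    bx, by = pos_predicted
    dx, dy = bx - ax, by - ay
    seg2 = dx * dx + dy * dy
    for ox, oy in obstacles:
        if seg2 == 0.0:
            t = 0.0  # degenerate segment: both bodies at the same point
        else:
            # Projection of the obstacle onto the segment, clamped to [0, 1].
            t = max(0.0, min(1.0, ((ox - ax) * dx + (oy - ay) * dy) / seg2))
        cx, cy = ax + t * dx, ay + t * dy  # closest point on the segment
        if (ox - cx) ** 2 + (oy - cy) ** 2 <= radius * radius:
            return True
    return False
```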
  • Next, in step S67, the calculation unit 31a determines whether or not the collision prediction target is a pedestrian (step S67).
  • When it is determined that the collision prediction target (that moving body) is the pedestrian 7, the calculation unit 31a proceeds to step S68, performs the situation determination of the pedestrian 7 that is the collision prediction target, and finishes the process.
  • When the collision prediction target (that moving body) is not the pedestrian 7, that is, when the collision prediction target is the vehicle 5, the calculation unit 31a finishes the vehicle calculation process without determining the situation of the pedestrian 7.
  • FIG. 12 is a flowchart showing an example of the situation determination process for the pedestrian 7 in FIG. 11.
  • First, the calculation unit 31a determines whether or not the pedestrian 7 subject to collision prediction is walking on a pedestrian crossing or a sidewalk (step S71). This is determined by the calculation unit 31a referring to the dynamic information map M1.
  • That is, by comparing the position information of pedestrian crossings and sidewalks included in the static information of the dynamic information map M1 with the position information of the pedestrian 7, the computing unit 31a can determine whether the pedestrian 7 is walking on a pedestrian crossing or a sidewalk.
  • If it is determined in step S71 that the pedestrian 7 subject to collision prediction is not walking on a pedestrian crossing or sidewalk, the calculation unit 31a adds to the evaluation value (step S72) and proceeds to step S73.
  • If it is determined in step S71 that the pedestrian 7 subject to collision prediction is walking on a pedestrian crossing or sidewalk, the calculation unit 31a proceeds to step S73 without adding to the evaluation value.
  • A pedestrian walking on neither a pedestrian crossing nor a sidewalk is a factor hindering the safety of the evaluation target, so the calculation unit 31a adds to the evaluation value. Note that the added value added to the evaluation value in step S72 is, for example, "100".
  • Next, the computing unit 31a determines whether or not the walking speed of the pedestrian 7 subject to collision prediction is slower than a predetermined value (step S73). This can be determined by referring to the mobile body database 34a.
  • The predetermined value compared with the walking speed is set to, for example, 3.6 km/h, a typical pedestrian walking speed.
  • If it is determined in step S73 that the walking speed of the pedestrian 7 subject to collision prediction is slower than the predetermined value, the calculation unit 31a adds to the evaluation value (step S74) and proceeds to step S75.
  • If it is determined in step S73 that the walking speed of the pedestrian 7 subject to collision prediction is not slower than the predetermined value, the calculation unit 31a proceeds to step S75 without adding to the evaluation value.
  • When the pedestrian 7 subject to collision prediction is crossing a pedestrian crossing and the walking speed of the pedestrian 7 is slower than the predetermined value, the pedestrian 7 may not finish crossing during the green light, which is a factor hindering the safety of the evaluation target. Therefore, when it is determined that the walking speed of the pedestrian 7 is slower than the predetermined value, the calculation unit 31a adds to the evaluation value. Note that the added value added to the evaluation value in step S74 is, for example, "100".
  • Next, the computing unit 31a determines whether or not the attribute of the pedestrian 7 subject to collision prediction is a child (step S75). This can be determined by referring to the mobile body database 34a.
  • If it is determined in step S75 that the attribute of the pedestrian 7 subject to collision prediction is a child, the calculation unit 31a adds to the evaluation value (step S76) and proceeds to step S77.
  • If it is determined in step S75 that the attribute is not a child, the calculation unit 31a proceeds to step S77 without adding to the evaluation value.
  • A child pedestrian requires greater caution and is a factor hindering the safety of the evaluation target, so the calculation unit 31a adds to the evaluation value. The added value added to the evaluation value in step S76 is, for example, "100".
  • Next, the calculation unit 31a determines whether or not the pedestrian 7 subject to collision prediction is meandering (step S77). This can be determined from the change over time in the position information of the moving body included in the moving body database 34a or the dynamic information map M1.
  • If it is determined in step S77 that the pedestrian 7 subject to collision prediction is meandering, the calculation unit 31a adds to the evaluation value (step S78) and proceeds to step S79.
  • If it is determined in step S77 that the pedestrian 7 is not meandering, the calculation unit 31a proceeds to step S79 without adding to the evaluation value.
  • A meandering pedestrian is a factor hindering the safety of the evaluation target, so the calculation unit 31a adds to the evaluation value. The added value added to the evaluation value in step S78 is, for example, "100".
  • Next, the calculation unit 31a determines whether or not the pedestrian 7 subject to collision prediction is ignoring the signal (step S79). This can be determined from the signal information included in the dynamic information of the dynamic information map M1 and the position information of the moving body.
  • If it is determined in step S79 that the pedestrian 7 subject to collision prediction is ignoring the signal, the calculation unit 31a adds to the evaluation value (step S80) and ends the vehicle calculation process (FIG. 11).
  • If it is determined in step S79 that the pedestrian 7 is not ignoring the signal, the calculation unit 31a ends the vehicle calculation process without adding to the evaluation value.
  • A pedestrian ignoring the signal is a factor hindering the safety of the evaluation target, so the calculation unit 31a adds to the evaluation value. The added value added to the evaluation value in step S80 is, for example, "100".
  • In this way, the calculation unit 31a acquires the situation of the pedestrian 7 and adds added values, as adjustment values corresponding to the acquired situation of the pedestrian 7, to the evaluation value.
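The adjustments of FIG. 12 (steps S71 to S80) can be summarized with the sketch below. The dictionary keys are hypothetical names for the situation flags; the flat added value of "100" per factor follows the examples in the text and is only illustrative.

```python
ADJUSTMENT = 100  # example added value used throughout steps S72-S80


def pedestrian_adjustment(ped):
    """Sum the adjustment values of FIG. 12 for one pedestrian.
    `ped` is a dict describing the pedestrian's situation; the key
    names below are assumptions made for this illustration."""
    value = 0
    if not ped.get("on_crosswalk_or_sidewalk", True):
        value += ADJUSTMENT  # S72: walking on the roadway
    if ped.get("speed_kmh", 3.6) < 3.6:
        value += ADJUSTMENT  # S74: slower than a typical walking speed
    if ped.get("is_child", False):
        value += ADJUSTMENT  # S76: child attribute
    if ped.get("meandering", False):
        value += ADJUSTMENT  # S78: meandering
    if ped.get("ignoring_signal", False):
        value += ADJUSTMENT  # S80: ignoring the signal
    return value
```

The returned sum would be added to the base evaluation value obtained from the predicted collision time.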
  • The calculation unit 31a thus obtains the evaluation value of the evaluation target by the vehicle calculation process (step S55 in FIG. 10).
  • On the other hand, when it is determined in step S54 of FIG. 10 that the attribute of the evaluation target is not a vehicle (step S54), the calculation unit 31a proceeds to step S56 and performs the pedestrian calculation process.
  • In the pedestrian calculation process, the computing unit 31a obtains the evaluation value of the evaluation target (step S56).
  • FIG. 13 is a flowchart showing an example of the pedestrian calculation process in step S56 in FIG. 10.
  • Steps S61 to S66 are the same as those in FIG. 11 except for whether the evaluation target is a vehicle or a pedestrian, so their description is omitted.
  • After step S65 or step S66, the arithmetic unit 31a proceeds to step S82.
  • In step S82, the calculation unit 31a determines whether or not the collision prediction target is a vehicle (step S82). When it is determined that the collision prediction target (that moving body) is a vehicle, the calculation unit 31a proceeds to step S83, performs the situation determination of the pedestrian 7 that is the evaluation target, and finishes the process.
  • The situation determination process for the pedestrian 7 to be evaluated is the same as in FIG. 12.
  • When the collision prediction target is not a vehicle, the calculation unit 31a finishes the pedestrian calculation process without determining the situation of the evaluation target pedestrian 7.
  • In the pedestrian calculation process, the rule relating the evaluation value to the predicted collision time and the added values applied by each determination are set to values different from those of the vehicle calculation process, taking into account that the evaluation target is a pedestrian.
  • As described above, in step S67 of FIG. 11 showing the vehicle calculation process, the calculation unit 31a determines whether the collision prediction target is a pedestrian, and performs the situation determination of the pedestrian 7 subject to collision prediction only when the collision prediction target (that moving body) is the pedestrian 7. Similarly, in step S82 of FIG. 13 showing the pedestrian calculation process, the calculation unit 31a determines whether the collision prediction target is a vehicle, and performs the situation determination of the evaluation target pedestrian 7 only when it determines that the collision prediction target (that moving body) is a vehicle.
  • That is, when a pedestrian is involved in the predicted collision, the calculation unit 31a acquires the situation of that pedestrian and adds adjustment values corresponding to the acquired situation of the pedestrian 7 to the evaluation value.
  • Situations unique to pedestrians can thereby be reflected in the evaluation value, and thus in the later-described determination of whether to provide information to the evaluation target.
  • When the evaluation value of the evaluation target has been obtained by the vehicle calculation process of step S55 or the pedestrian calculation process of step S56, the calculation unit 31a proceeds to step S57 and registers the calculated evaluation value in the evaluation value database 34b (step S57).
  • The calculation unit 31a sequentially performs the processing from step S54 to step S57 in FIG. 10 for each identified evaluation target and repeats it until all the evaluation targets have been processed.
  • When the calculation unit 31a has obtained evaluation values for all the identified evaluation targets and registered them in the evaluation value database 34b, it returns to step S51 and repeats the same processing.
  • By repeating this calculation process, the calculation unit 31a repeatedly calculates and registers the evaluation value of each identified evaluation target, and updates the contents of the evaluation value database 34b as needed.
  • FIG. 14 is a diagram illustrating an example of the evaluation value database 34b. As shown in FIG. 14, in the evaluation value database 34b, the mobile object ID to be evaluated and the evaluation values of mobile objects other than the evaluation object are registered in association with each other.
  • The evaluation value database 34b has a column for the mobile body ID of the evaluation target, a column for the vehicle ID (mobile ID), and columns for the evaluation values of the mobile bodies other than the evaluation target.
  • In the evaluation target mobile body ID column, the mobile body ID of the mobile body identified as the evaluation target is registered.
  • In the vehicle ID (mobile ID) column, the ID of the mobile terminal possessed by the evaluation target mobile body, that is, of the pedestrian terminal 70 or the in-vehicle device 50, is registered.
  • The columns for the evaluation values of the mobile bodies other than the evaluation target include a mobile body ID column for each such mobile body. That is, the evaluation value of each of the plurality of mobile bodies other than the evaluation target is registered in these columns.
  • As described above, the evaluation value for a moving body other than the collision prediction target is set to "0". Therefore, a moving body other than the evaluation target for which a value other than "0" is registered as the evaluation value is the collision prediction target of that evaluation target.
  • Thus, the collision prediction target of each evaluation target can be identified by referring to the evaluation value database 34b.
  • FIG. 15 is a flowchart illustrating an example of a determination process performed by the determination unit 31b.
  • The determination unit 31b refers to the evaluation value database 34b and determines whether there is an evaluation target whose evaluation value is equal to or greater than a preset threshold value (step S85). If it is determined in step S85 that there is no such evaluation target, the determination unit 31b repeats step S85. The determination unit 31b thus repeats step S85 until it determines that there is an evaluation target whose evaluation value is equal to or greater than the threshold value.
  • If it is determined in step S85 that there is an evaluation target whose evaluation value is equal to or greater than the threshold value, the determination unit 31b proceeds to step S86, determines to notify that evaluation target of information on the collision prediction result for its collision prediction target, and returns to step S85.
  • The determination unit 31b refers to the evaluation value of each evaluation target, and if there are multiple evaluation targets whose evaluation values are equal to or greater than the threshold value, it determines to notify all of them of information on the collision prediction results of their collision prediction targets. In this way, the determination unit 31b determines for each evaluation target, based on the evaluation value, whether or not to notify the evaluation target of information on its prediction result.
  • Since whether or not to notify information on the collision prediction result (information on the predicted traffic situation) is determined for each evaluation target (the pedestrian terminal 70, the in-vehicle device 50, or the mobile body having one of these) based on the evaluation value of the prediction result, necessary information can be provided appropriately to each evaluation target.
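The threshold check of FIG. 15 can be sketched as follows. The threshold value and the dictionary representation of the evaluation value database 34b are assumptions made for illustration.

```python
THRESHOLD = 200  # example threshold; the actual value is a design parameter


def notification_targets(evaluation_db, threshold=THRESHOLD):
    """Per FIG. 15: for each evaluation target, decide whether to notify
    it, and of which collision prediction results.  `evaluation_db` maps
    a target moving-body ID to {other moving-body ID: evaluation value}.
    Returns only the targets with at least one value >= threshold."""
    result = {}
    for target_id, per_other in evaluation_db.items():
        hits = [other for other, v in per_other.items() if v >= threshold]
        if hits:
            result[target_id] = hits
    return result
```

In this representation, a target absent from the result simply receives no notification in this cycle.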
  • In addition, since the dynamic information map M1, on which dynamic information on a plurality of moving bodies is superimposed, is used to obtain the evaluation value of the collision prediction result for each evaluation target, the evaluation value of the predicted traffic situation can be obtained even for moving bodies on which no mobile terminal is mounted, and the traffic situation appropriately predicted for each evaluation target can be expressed as an evaluation value.
  • Information on the appropriately predicted collision result can therefore be provided appropriately to each evaluation target.
  • The notification unit 31c notifies each evaluation target for which the determination unit 31b has determined notification of information on the collision prediction result between that evaluation target and its collision prediction target. Information can thereby be provided only to the mobile terminals determined, based on the evaluation value, to need information on the collision prediction result.
  • The information on the collision prediction result notified to the evaluation target by the notification unit 31c includes the attribute of the collision prediction target, the direction from which the collision prediction target approaches, the predicted collision time, and the like.
  • The mobile terminal (pedestrian terminal 70 or in-vehicle device 50) of the evaluation target that has received the notification from the notification unit 31c outputs the notified prediction result information to its user.
  • FIG. 16 is a diagram illustrating a situation around an intersection according to scenario 1.
  • In scenario 1, pedestrians 7A and 7B get off the vehicle 5C, which is parked just past the pedestrian crossing P, and cross the pedestrian crossing P.
  • The pedestrian 7A is an adult, and the pedestrian 7B is a child.
  • At this time, the light color of the signal of the pedestrian crossing P is blinking blue.
  • The light color of the signal on the route on which the vehicle 5A travels is red, but since the signal of the pedestrian crossing P is blinking blue, the vehicle 5A approaching the pedestrian crossing P recognizes that its signal will soon turn blue. For this reason, it is assumed that the vehicle 5A attempts to pass through the intersection without stopping. It is also assumed that the pedestrians 7A and 7B cannot see the vehicle 5A because of the presence of the vehicle 5C. Further, a vehicle 5B travels behind the vehicle 5A.
  • The vehicles 5A, 5B, and 5C and the pedestrians 7A and 7B are registered as moving bodies in the dynamic information map M1 and the moving body database 34a by the system. The vehicles 5A and 5B are equipped with the in-vehicle device 50, and the pedestrians 7A and 7B carry the pedestrian terminal 70.
  • In this case, the edge server 3 (its computing unit 31a) identifies the pedestrians 7A and 7B as the collision prediction targets of the vehicle 5A, and identifies the vehicle 5A as the collision prediction target of the pedestrians 7A and 7B.
  • The situations of the pedestrians 7A and 7B include the child attribute (pedestrian 7B only) and ignoring the signal.
  • Further, the vehicle 5C, which is a blind spot factor, exists between the vehicle 5A and the pedestrians 7A and 7B.
  • When the edge server 3 obtains the evaluation value with the vehicle 5A as the evaluation target, the pedestrian situation determination is taken into account in addition to the predicted collision time. In this scenario, added values for the blind spot factor, the pedestrian attribute, and the ignored signal are added to the evaluation value.
  • The threshold used to determine whether to notify the prediction result is set smaller than the evaluation value obtained when safety-hindering factors overlap as in this scenario. Therefore, in this scenario, the evaluation values of the pedestrians 7A and 7B for the vehicle 5A are larger than the threshold, and the edge server 3 notifies the vehicle 5A of information on the collision prediction result for the pedestrians 7A and 7B.
  • Similarly, when obtaining the evaluation value with the pedestrians 7A and 7B as the evaluation targets, the edge server 3 also takes the pedestrian situation determination into account in addition to the predicted collision time. The evaluation value of the vehicle 5A for the pedestrians 7A and 7B therefore becomes larger than the threshold, and the edge server 3 notifies the pedestrians 7A and 7B of information on the collision prediction result for the vehicle 5A.
  • The edge server 3 also identifies the vehicle 5A as the collision prediction target of the vehicle 5B. As a result of the collision prediction, if the predicted collision time is short, the evaluation value becomes large, and the edge server 3 notifies the vehicle 5B of information on the collision prediction result for the vehicle 5A.
  • In this way, the edge server 3 can appropriately provide the necessary information to each mobile terminal.
  • The in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B output information on the collision prediction result to their users based on the notification from the edge server 3.
  • Thereby, the edge server 3 can make the user of the vehicle 5A recognize in advance that a pedestrian will appear from the right side of the pedestrian crossing P ahead.
  • Note that the pedestrians 7A and 7B are displayed without being individually identified.
  • The display method may differ depending on whether the attribute of the pedestrian 7 is a child or an adult, because a child pedestrian requires greater caution.
  • In this way, the output mode of the information on the collision prediction result may be controlled differently according to the attribute of the collision prediction target.
  • Information can then be output to the user in an output mode suited to the characteristics of the attribute of the collision prediction target.
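Attribute-dependent output control could look like the following sketch; the function name, message wording, and attribute values are hypothetical, not taken from the patent.

```python
def render_warning(attribute, direction):
    """Vary the on-screen warning according to the collision prediction
    target's attribute, emphasising children (illustrative only)."""
    if attribute == "child":
        return f"CAUTION: child may enter from the {direction} of the crosswalk ahead"
    return f"Pedestrian approaching from the {direction} of the crosswalk ahead"
```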
  • On the output screen V2 of the pedestrian terminals 70 of the pedestrians 7A and 7B, a display D3 indicating that the vehicle 5A will appear from the left side of the pedestrian crossing P ahead, an arrow D4 indicating the traveling direction of the vehicle 5A, and the like are displayed. Thereby, the edge server 3 can make the pedestrians 7A and 7B recognize in advance that a vehicle will appear from the left side of the pedestrian crossing P ahead.
  • On the output screen of the in-vehicle device 50 of the vehicle 5B, a display D5 or the like indicating a warning about the vehicle 5A traveling ahead of the host vehicle is displayed.
  • Thereby, the edge server 3 can make the user of the vehicle 5B recognize in advance that the vehicle 5A traveling ahead may stop as it approaches the crossing.
  • As described above, the edge server 3 can avoid collisions between the moving bodies by causing the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B to output the information to their users.
  • The output of the information on the collision prediction result by the pedestrian terminal 70 and the in-vehicle device 50 is controlled by the output control units of the control unit 71 of the pedestrian terminal 70 and the control unit 51 of the in-vehicle device 50.
  • When the control unit 31 of the edge server 3 has an output control unit capable of controlling the output toward the user by the pedestrian terminal 70 and the in-vehicle device 50, the output control unit of the edge server 3 may control the output to the user by the pedestrian terminal 70 and the in-vehicle device 50.
  • FIG. 17 is a diagram illustrating a situation around an intersection according to scenario 2.
  • In scenario 2, the settings of the vehicles 5A and 5B and the pedestrians 7A and 7B are the same as in scenario 1.
  • In scenario 2, the vehicle 5C of scenario 1 is not present.
  • In scenario 2, the pedestrians 7A and 7B cross the pedestrian crossing P at a speed lower than a typical walking speed (3.6 km/h).
  • The signal color of the pedestrian crossing P was blinking blue when the pedestrians 7A and 7B started to cross the pedestrian crossing P. Partway across, the signal color of the pedestrian crossing P turns red, but the pedestrians 7A and 7B continue slowly across at the same walking speed.
  • In this case as well, the edge server 3 identifies the pedestrians 7A and 7B as the collision prediction targets of the vehicle 5A, and identifies the vehicle 5A as the collision prediction target of the pedestrians 7A and 7B.
  • When the edge server 3 obtains the evaluation value with the vehicle 5A as the evaluation target, added values for the pedestrian attribute, the pedestrian's walking speed, and the ignored signal are added to the evaluation value obtained from the predicted collision time.
  • As a result, the evaluation value of the pedestrians 7A and 7B for the vehicle 5A becomes larger than the threshold, and the edge server 3 notifies the vehicle 5A of information on the collision prediction result for the pedestrians 7A and 7B.
  • Similarly, when obtaining the evaluation value with the pedestrians 7A and 7B as the evaluation targets, the edge server 3 also takes the pedestrian situation determination into account in addition to the collision prediction. The evaluation value of the vehicle 5A for the pedestrians 7A and 7B therefore becomes larger than the threshold, and the edge server 3 notifies the pedestrians 7A and 7B of information on the collision prediction result for the vehicle 5A.
  • The edge server 3 also identifies the vehicle 5A as the collision prediction target of the vehicle 5B. As a result of the collision prediction, if the predicted collision time is short, the evaluation value becomes large, and the edge server 3 notifies the vehicle 5B of information on the collision prediction result for the vehicle 5A.
  • The in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B output information on the collision prediction result to their users based on the notification from the edge server 3.
  • Thereby, the edge server 3 can make the user of the vehicle 5A recognize in advance that a pedestrian will appear from the right side of the pedestrian crossing P ahead.
  • The edge server 3 can also make the pedestrians 7A and 7B recognize in advance that a vehicle will appear from the left side of the pedestrian crossing P ahead.
  • In this way, the edge server 3 can avoid collisions between the moving bodies by causing the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B to output the information to their users.
  • FIG. 18 is a diagram illustrating a situation around an intersection according to scenario 3.
  • In scenario 3, pedestrians 7A and 7B are walking along the sidewalk H, on which there is an obstacle G1.
  • The pedestrians 7A and 7B walk on the roadway to avoid the obstacle G1.
  • The pedestrian 7A is an adult, and the pedestrian 7B is a child.
  • the edge server 3 specifies the pedestrians 7A and 7B as the collision prediction targets of the vehicle 5A, and specifies the vehicle 5A as the collision prediction targets of the pedestrians 7A and 7B.
  • the attributes of the pedestrians 7A and 7B are children (only the pedestrian 7B) and do not walk on the sidewalk.
  • the edge server 3 obtains the evaluation value of the pedestrians 7A and 7B when the vehicle 5A is the evaluation target, the attribute of the pedestrian and the pedestrian walk to the evaluation value obtained from the predicted collision time. Add the value depending on where you are (whether you are walking on a pedestrian crossing or sidewalk). Thereby, the evaluation value of the pedestrians 7A and 7B in the vehicle 5A becomes a value larger than the threshold value, and the edge server 3 notifies the vehicle 5A of information regarding the collision prediction result of the pedestrians 7A and 7B in the vehicle 5A.
  • when obtaining the evaluation value with the pedestrians 7A and 7B as the evaluation targets, the edge server 3 also considers the pedestrians' situation in addition to the collision prediction. Therefore, the evaluation value of the vehicle 5A for the pedestrians 7A and 7B becomes larger than the threshold value, and the edge server 3 notifies the pedestrians 7A and 7B of information on the collision prediction result for the vehicle 5A.
  • the edge server 3 specifies the vehicle 5A as a collision prediction target of the vehicle 5B. Further, a vehicle 5D that is a blind spot factor exists between the vehicle 5B and the vehicle 5A.
  • when the edge server 3 obtains the evaluation value with the vehicle 5B as the evaluation target, it adds an addition value based on the presence or absence of a blind spot factor to the evaluation value obtained from the predicted collision time. If the resulting evaluation value of the vehicle 5A for the vehicle 5B exceeds the threshold value, the edge server 3 notifies the vehicle 5B of information on the collision prediction result for the vehicle 5A.
  • the edge server 3 specifies the vehicle 5B as a collision prediction target of the vehicle 5C. As a result of the collision prediction, if the predicted collision time is short, the evaluation value is set to a large value. Consequently, the edge server 3 notifies the vehicle 5C of information on the collision prediction result for the vehicle 5B.
  • the edge server 3 can help the moving bodies avoid collisions by making the in-vehicle devices 50 of the vehicles 5A, 5B, and 5C and the pedestrian terminals 70 of the pedestrians 7A and 7B produce outputs to their users.
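The evaluation scheme used across these scenarios — a base value derived from the predicted collision time, raised by addition values for risk factors such as the “child” attribute, walking off the sidewalk, a blind spot factor, or ignoring a signal, then compared against a threshold — can be sketched as follows. All function names, weights, and the threshold are illustrative assumptions; the specification does not disclose concrete numbers.

```python
# Illustrative sketch of the evaluation-value scheme described above.
# All weights and the threshold are hypothetical.

def base_value(predicted_collision_time_s: float) -> float:
    """Shorter predicted collision time -> larger base evaluation value."""
    return 100.0 / max(predicted_collision_time_s, 1.0)

def evaluation_value(predicted_collision_time_s: float,
                     is_child: bool = False,
                     off_sidewalk: bool = False,
                     blind_spot: bool = False,
                     ignoring_signal: bool = False) -> float:
    value = base_value(predicted_collision_time_s)
    # Addition values for risk factors (hypothetical weights).
    if is_child:
        value += 20.0
    if off_sidewalk:
        value += 15.0
    if blind_spot:
        value += 15.0
    if ignoring_signal:
        value += 25.0
    return value

THRESHOLD = 50.0  # hypothetical notification threshold

def should_notify(value: float) -> bool:
    """Notify the evaluation target only when the value exceeds the threshold."""
    return value > THRESHOLD
```

Under these assumed numbers, a predicted collision time of 4 s alone gives a base value of 25.0 (no notification), but the “child” and off-sidewalk additions of scenario 3 raise it to 60.0, which exceeds the threshold, so the vehicle 5A would be notified.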
  • FIG. 19 is a diagram illustrating a situation around an intersection according to scenario 4.
  • although the pedestrians 7A and 7B walk along the sidewalk H, they cross the roadway between the parked vehicles 5C and 5D at a point that is not a pedestrian crossing.
  • the pedestrian 7A is an adult, and the pedestrian 7B is a child.
  • the edge server 3 specifies the pedestrians 7A and 7B as the collision prediction targets of the vehicle 5A, and specifies the vehicle 5A as the collision prediction targets of the pedestrians 7A and 7B.
  • the attribute of the pedestrian 7B is “child” (the pedestrian 7A is an adult), and the pedestrians 7A and 7B are not walking on a sidewalk or a pedestrian crossing.
  • a vehicle 5D that is a blind spot factor exists between the vehicle 5A and the pedestrians 7A and 7B.
  • when the edge server 3 calculates the evaluation value of the pedestrians 7A and 7B with the vehicle 5A as the evaluation target, it adds to the evaluation value obtained from the predicted collision time addition values that depend on the pedestrian attribute, the place where the pedestrian is walking (whether on a pedestrian crossing or a sidewalk), and the presence or absence of a blind spot factor.
  • the evaluation value of the pedestrians 7A and 7B in the vehicle 5A becomes a value larger than the threshold value, and the edge server 3 notifies the vehicle 5A of information regarding the collision prediction result of the pedestrians 7A and 7B in the vehicle 5A.
  • when obtaining the evaluation value with the pedestrians 7A and 7B as the evaluation targets, the edge server 3 also considers the pedestrians' situation in addition to the collision prediction. Therefore, the evaluation value of the vehicle 5A for the pedestrians 7A and 7B becomes larger than the threshold value, and the edge server 3 notifies the pedestrians 7A and 7B of information on the collision prediction result for the vehicle 5A.
  • the edge server 3 specifies the vehicle 5A as a collision prediction target of the vehicle 5B. As a result of the collision prediction, if the predicted collision time is short, the evaluation value is set to a large value, and the edge server 3 notifies the vehicle 5B of information on the collision prediction result for the vehicle 5A.
  • in addition to detection by the roadside sensor 8, if the vehicle 5 in front of which the pedestrians slip through is equipped with an in-vehicle device 50 that includes an in-vehicle camera 59, the pedestrians can also be detected by the in-vehicle camera 59.
  • FIG. 20 is a diagram illustrating a situation around an intersection according to scenario 5.
  • the pedestrians 7A and 7B walk along the sidewalk H while meandering, at times straying onto the roadway.
  • the pedestrian 7A is an adult, and the pedestrian 7B is a child.
  • the edge server 3 specifies the pedestrians 7A and 7B as the collision prediction targets of the vehicle 5A, and specifies the vehicle 5A as the collision prediction targets of the pedestrians 7A and 7B.
  • the attribute of the pedestrian 7B is “child” (the pedestrian 7A is an adult), and while the pedestrians stray onto the roadway they are not walking on the sidewalk.
  • when the edge server 3 obtains the evaluation value of the pedestrians 7A and 7B with the vehicle 5A as the evaluation target, it adds to the evaluation value obtained from the predicted collision time addition values that depend on the pedestrian attribute, the pedestrian's walking state (whether meandering or not), and the place where the pedestrian is walking (whether on a pedestrian crossing or a sidewalk).
  • the evaluation value of the pedestrians 7A and 7B in the vehicle 5A becomes a value larger than the threshold value, and the edge server 3 notifies the vehicle 5A of information regarding the collision prediction result of the pedestrians 7A and 7B in the vehicle 5A.
  • when obtaining the evaluation value with the pedestrians 7A and 7B as the evaluation targets, the edge server 3 also considers the pedestrians' situation in addition to the collision prediction. Therefore, the evaluation value of the vehicle 5A for the pedestrians 7A and 7B becomes larger than the threshold value, and the edge server 3 notifies the pedestrians 7A and 7B of information on the collision prediction result for the vehicle 5A.
  • the edge server 3 specifies the vehicle 5A as a collision prediction target of the vehicle 5B, and, as in the other scenarios, if the predicted collision time obtained by the collision prediction is short, the evaluation value is set to a large value and the edge server 3 notifies the vehicle 5B accordingly.
  • FIG. 21 is a diagram illustrating a situation around an intersection according to scenario 6.
  • pedestrians 7A and 7B cross a pedestrian crossing P that crosses a route R1.
  • the signal for the pedestrian crossing P, which had been flashing blue, changes to red, and the pedestrians 7A and 7B hurry to finish crossing the pedestrian crossing P as they are.
  • the pedestrian 7A is an adult, and the pedestrian 7B is a child.
  • the vehicle 5A enters the intersection from the route R2, turns left, and travels through the intersection toward the route R1. The vehicle 5B travels following the vehicle 5A. Because the lamp color of the signal for the pedestrian crossing P changes from flashing blue to red, the traffic signal ahead of the vehicles 5A and 5B also switches from blue through yellow to red, so the vehicles 5A and 5B are hurrying to pass through the intersection. Furthermore, at the corner of the intersection between the route R2 and the route R1 there is a building G2 that blocks the view between the route R1 and the route R2.
  • the edge server 3 specifies the pedestrians 7A and 7B as the collision prediction targets of the vehicle 5A, and specifies the vehicle 5A as the collision prediction target of the pedestrians 7A and 7B. In addition, the attribute of the pedestrian 7B is “child”, and the pedestrians are ignoring the signal. Furthermore, the building G2, a blind spot factor, exists between the vehicle 5A and the pedestrians 7A and 7B.
  • when the edge server 3 obtains the evaluation value with the vehicle 5A as the evaluation target, it adds to the evaluation value obtained from the predicted collision time addition values based on the pedestrian attribute, the presence or absence of a blind spot factor, and the ignoring of the signal.
  • the evaluation value of the pedestrians 7A and 7B in the vehicle 5A becomes a value larger than the threshold value, and the edge server 3 notifies the vehicle 5A of information regarding the collision prediction result of the pedestrians 7A and 7B in the vehicle 5A.
  • when obtaining the evaluation value with the pedestrians 7A and 7B as the evaluation targets, the edge server 3 also considers the pedestrians' situation in addition to the collision prediction. Therefore, the evaluation value of the vehicle 5A for the pedestrians 7A and 7B becomes larger than the threshold value, and the edge server 3 notifies the pedestrians 7A and 7B of information on the collision prediction result for the vehicle 5A.
  • the edge server 3 identifies the vehicle 5A as a collision prediction target of the vehicle 5B. As a result of the collision prediction, if the collision prediction time is short, the evaluation value is set to a large value. As a result, the edge server 3 notifies the vehicle 5B of information related to the collision prediction result of the vehicle 5A in the vehicle 5B.
  • based on the notification from the edge server 3, the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B output information on the collision prediction result to their respective users.
  • the edge server 3 can make the user of the vehicle 5A recognize in advance that pedestrians may still be crossing the pedestrian crossing P.
  • the edge server 3 can make the pedestrians 7A and 7B recognize in advance that the vehicle 5A may appear from the right side of the pedestrian crossing P.
  • by making the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B produce these outputs to their users, the edge server 3 can help the moving bodies avoid a collision.
  • FIG. 22 is a diagram illustrating an aspect of information provision executed by a system according to another embodiment.
  • FIG. 22 is a flowchart illustrating an example of arithmetic processing according to another embodiment.
  • the edge server 3 of the present embodiment is configured to evaluate the comfort of movement of each evaluation target based on the dynamic information map M1 and to obtain an evaluation value based on the comfort of the evaluation target's future movement. That is, the predicted traffic situation of each evaluation target predicted by the calculation unit 31a of the present embodiment is a situation indicating whether or not the future movement of the evaluation target will be comfortable.
  • the edge server 3 divides each route in the service area into a plurality of unit areas, determines whether a factor that impairs comfort exists in each unit area, and identifies as non-comfort areas those unit areas in which comfort is impaired beyond a certain level.
  • the edge server 3 registers the determination result in the database and updates it as needed.
  • the factor that impairs comfort is, for example, the occurrence of traffic jams or the presence of a meandering pedestrian. If these factors exist above a certain level, the edge server 3 identifies the unit area as a non-comfort area.
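The identification step described above — counting comfort-impairing factors per unit area and flagging areas in which they exist above a certain level — can be sketched as follows. The factor names, data layout, and the limit are illustrative assumptions, not values from the specification.

```python
# Hypothetical sketch: flag a unit area as a non-comfort area when
# comfort-impairing factors (traffic jams, meandering pedestrians, ...)
# exist above a certain level. The limit is an illustrative assumption.

FACTOR_LIMIT = 2  # hypothetical "certain level"

def is_non_comfort_area(factors: dict) -> bool:
    """factors maps factor names (e.g. 'traffic_jam', 'meandering_pedestrian')
    to how many instances were observed in the unit area."""
    return sum(factors.values()) >= FACTOR_LIMIT

def non_comfort_areas(areas: dict) -> set:
    """areas maps unit-area ids (e.g. route names) to their factor dicts;
    returns the ids of the areas to register in the database."""
    return {area for area, factors in areas.items()
            if is_non_comfort_area(factors)}
```

With the situation of FIG. 22 — a traffic jam plus two meandering pedestrians on the route R11 and no factors elsewhere — only R11 would be flagged and registered as a non-comfort area.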
  • the edge server 3 calculates the evaluation value according to the distance between the evaluation target and the non-comfort area; the evaluation value is set to a larger value as the evaluation target and the non-comfort area get closer.
  • the evaluation value of the present embodiment is set according to the possibility that the evaluation target passes through the non-comfort area.
  • when the evaluation value is equal to or greater than a predetermined threshold, the determination unit 31b notifies the evaluation target of information on the non-comfort area. When approaching a non-comfort area, the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B output information on the non-comfort area to their users based on the notification from the edge server 3.
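The distance-based evaluation and threshold-gated notification described here can be sketched as follows. The scale constant and threshold are illustrative assumptions; the specification states only that the value grows as the target approaches the non-comfort area.

```python
# Hypothetical sketch of the distance-based comfort evaluation: the
# evaluation value grows as the evaluation target approaches a
# non-comfort area, and the target is notified once a threshold is met.

def comfort_evaluation_value(distance_m: float) -> float:
    """Larger value the closer the target is to the non-comfort area."""
    return 1000.0 / max(distance_m, 1.0)

COMFORT_THRESHOLD = 10.0  # hypothetical notification threshold

def should_notify_non_comfort(distance_m: float) -> bool:
    """Decision made by the determination unit for one evaluation target."""
    return comfort_evaluation_value(distance_m) >= COMFORT_THRESHOLD
```

Under these assumed numbers, a vehicle 500 m from the route R11 scores 2.0 and is not notified; by 100 m the value reaches the threshold and a notification with detour information (such as the arrow D11) could be issued.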
  • each route R10, R11, R12, R13, R14 delimited by the intersection constitutes a unit area.
  • a traffic jam occurs in route R11, and meandering pedestrians 7A and 7B exist.
  • the edge server 3 identifies the route R11 as a non-comfort area. It is assumed that other routes are not specified as non-comfort areas.
  • the vehicle 5A traveling along the route R10 toward the intersection where the route R11 and the route R12 are connected is approaching the route R11 which is a non-comfort area.
  • the evaluation value of the vehicle 5A as the evaluation target increases as it approaches the route R11 that is a non-comfort area, and becomes equal to or higher than a predetermined threshold value.
  • the edge server 3 notifies the vehicle 5A, which is the evaluation target, of information related to the non-comfort area.
  • the vehicle 5B traveling along the route R13 toward the intersection where the route R11 and the route R14 are connected is approaching the route R11 which is a non-comfort area.
  • the evaluation value of the vehicle 5B as the evaluation target increases as it approaches the route R11 that is a non-comfort area, and becomes equal to or greater than a predetermined threshold value.
  • the edge server 3 notifies the information regarding the non-comfort area to the vehicle 5B which is the evaluation target.
  • based on the notification from the edge server 3, the in-vehicle device 50 of the vehicle 5A or 5B outputs to the user of its own device, as predicted traffic situation information, information on the non-comfort area that the vehicle is likely to pass through in the future.
  • a display D10 indicating that a non-comfort area exists, an arrow D11 indicating a detour avoiding the non-comfort area, and the like are displayed.
  • a display D12 indicating that a non-comfort area exists, an arrow D13 indicating a detour avoiding the non-comfort area, and the like are displayed.
  • the edge server 3 can make the user of the vehicle 5A recognize in advance that a non-comfort area lies ahead, and can thereby help each evaluation target maintain comfort during movement.

Abstract

The invention relates to an information providing system comprising: mobile terminals carried by, or installed in, at least some of a plurality of moving bodies located within a prescribed area; a calculation unit for obtaining an evaluation value for the result of predicting a traffic situation for each mobile terminal, i.e., a predicted traffic situation, based on dynamic map information in which dynamic information on the moving body or bodies is superimposed on map information of the area; a determination unit for determining, on a per-mobile-terminal basis, whether or not to notify mobile terminals of the predicted traffic situation for each mobile terminal, based on the evaluation value; and a notification unit for notifying mobile terminals of the predicted traffic situation based on the determination results of the determination unit.
PCT/JP2019/011739 2018-04-10 2019-03-20 Système de fourniture d'informations, terminal mobile, dispositif de fourniture d'informations, procédé de fourniture d'informations, et programme informatique WO2019198449A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020513152A JPWO2019198449A1 (ja) 2018-04-10 2019-03-20 情報提供システム、移動端末、情報提供装置、情報提供方法、及びコンピュータプログラム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018075446 2018-04-10
JP2018-075446 2018-04-10

Publications (1)

Publication Number Publication Date
WO2019198449A1 true WO2019198449A1 (fr) 2019-10-17

Family

ID=68164050

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/011739 WO2019198449A1 (fr) 2018-04-10 2019-03-20 Système de fourniture d'informations, terminal mobile, dispositif de fourniture d'informations, procédé de fourniture d'informations, et programme informatique

Country Status (2)

Country Link
JP (1) JPWO2019198449A1 (fr)
WO (1) WO2019198449A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011123551A (ja) * 2009-12-08 2011-06-23 Toyota Central R&D Labs Inc 死角領域推定装置及びプログラム
JP2011253403A (ja) * 2010-06-03 2011-12-15 Advanced Telecommunication Research Institute International 歩車間通信システム
JP2013507691A (ja) * 2009-10-08 2013-03-04 本田技研工業株式会社 動的交差点地図作成の方法
JP2016095695A (ja) * 2014-11-14 2016-05-26 インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation 移動体が特定領域に接近していることを通知する方法、並びに、その為のサーバ・コンピュータ及びサーバ・コンピュータ・プログラム
WO2017002590A1 (fr) * 2015-06-29 2017-01-05 株式会社日立製作所 Dispositif de génération d'instructions de déplacement

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11854271B2 (en) 2020-02-17 2023-12-26 Honda Motor Co., Ltd. System, computer-readable storage medium, and information processing method
CN114596707A (zh) * 2022-03-16 2022-06-07 阿波罗智联(北京)科技有限公司 交通控制方法及装置、设备、系统、介质
CN114596707B (zh) * 2022-03-16 2023-09-01 阿波罗智联(北京)科技有限公司 交通控制方法及装置、设备、系统、介质
CN114997527A (zh) * 2022-07-18 2022-09-02 苏州智能交通信息科技股份有限公司 基于道路运输动态数据的企业考核评价方法、系统及终端
CN114997527B (zh) * 2022-07-18 2022-11-11 苏州智能交通信息科技股份有限公司 基于道路运输动态数据的企业考核评价方法、系统及终端

Also Published As

Publication number Publication date
JPWO2019198449A1 (ja) 2021-04-30

Similar Documents

Publication Publication Date Title
US11238738B2 (en) Information providing system, server, mobile terminal, and computer program
JP6872959B2 (ja) 通信システム、車両搭載器及びプログラム
WO2018220971A1 (fr) Dispositif de commande de communication, procédé de commande de communication et programme informatique
CN111708358A (zh) 紧急情况下运载工具的操作
US9373255B2 (en) Method and system for producing an up-to-date situation depiction
WO2017176550A1 (fr) Procédé et système de sélection d'itinéraires assistée par capteur de véhicule autonome par rapport aux conditions routières dynamiques
WO2019198449A1 (fr) Système de fourniture d'informations, terminal mobile, dispositif de fourniture d'informations, procédé de fourniture d'informations, et programme informatique
JP2020027645A (ja) サーバ、無線通信方法、コンピュータプログラム、及び車載装置
US11945472B2 (en) Trajectory planning of vehicles using route information
CA3099840A1 (fr) Systeme et procede d'utilisation d'une communication v2x et de donnees de capteurs
WO2020071072A1 (fr) Système de fourniture d'informations, terminal mobile, procédé de fourniture d'informations, et programme informatique
JP2020091652A (ja) 情報提供システム、サーバ、及びコンピュータプログラム
JP2020091614A (ja) 情報提供システム、サーバ、移動端末、及びコンピュータプログラム
JP6881001B2 (ja) 自動走行制御装置
US20230221128A1 (en) Graph Exploration for Rulebook Trajectory Generation
WO2021201304A1 (fr) Procédé et dispositif d'aide à la conduite autonome
JP2020091612A (ja) 情報提供システム、サーバ、及びコンピュータプログラム
WO2021117370A1 (fr) Dispositif de mise à jour d'informations dynamiques, procédé de mise à jour, système de fourniture d'informations et programme informatique
US11967230B2 (en) System and method for using V2X and sensor data
JP2020091613A (ja) 情報提供システム、サーバ、及びコンピュータプログラム
US20230398866A1 (en) Systems and methods for heads-up display
US20230159026A1 (en) Predicting Motion of Hypothetical Agents
US20240123975A1 (en) Guided generation of trajectories for remote vehicle assistance
US20240029568A1 (en) Lane Monitoring During Turns In An Intersection
WO2019239757A1 (fr) Appareil de commande de communication, procédé de commande de communication, programme d'ordinateur et dispositif de communication embarqué dans un véhicule

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19784635

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020513152

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19784635

Country of ref document: EP

Kind code of ref document: A1