WO2021191682A1 - Information providing method, vehicle system, and management device - Google Patents
Information providing method, vehicle system, and management device
- Publication number: WO2021191682A1 (application PCT/IB2021/000166)
- Authority: WIPO (PCT)
- Prior art keywords
- vehicle
- information
- stop position
- data
- user
Classifications
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- G08G1/0112—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits, where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096716—Systems involving transmission of highway information where the received information does not generate an automatic action on the vehicle control
- G08G1/096725—Systems involving transmission of highway information where the received information generates an automatic action on the vehicle control
- G08G1/096775—Systems involving transmission of highway information where the origin of the information is a central station
- G08G1/096811—Systems involving transmission of navigation instructions to the vehicle where the route is computed offboard
- G08G1/096844—Systems involving transmission of navigation instructions to the vehicle where the complete route is dynamically recomputed based on new data
- G08G1/123—Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
- G08G1/202—Dispatching vehicles on the basis of a location, e.g. taxi dispatching
Description
- the present invention relates to an information providing method, a vehicle system, and a management device.
- Patent Document 1 discloses a peripheral monitoring device for a vehicle that can support safety when getting off the vehicle by using an imaging unit mounted on the vehicle. Specifically, the vehicle peripheral monitoring device predicts whether or not an object will enter the door opening / closing region based on images of the area behind and beside the vehicle captured by the imaging unit after the vehicle has stopped.
- In Patent Document 1, however, there is a problem that the detection range of the object is limited, because the image is captured by the imaging unit only after the vehicle has stopped.
- The present invention has been made in view of such a problem, and an object of the present invention is to provide an information providing method, a vehicle system, and a management device capable of improving the accuracy of the information provided to a user by detecting objects over a wide range.
- In the information providing method, an object existing around the vehicle is detected, movement information data, which is information on the movement of the object, is acquired, and it is determined whether or not there is a possibility that the object will enter a monitoring range around the vehicle when the vehicle stops at a stop position. If it is determined that the object may enter the monitoring range, information data about the object is transmitted to an information providing device.
- According to the present invention, an object existing around the vehicle can be detected before the vehicle stops at the stop position, so that objects can be detected over a wide range. As a result, the accuracy of the information provided to the user can be improved.
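The claimed flow (detect objects, predict whether each may enter the monitoring range, and transmit information only about those that may) can be summarized as the following illustrative Python sketch. All names are hypothetical; the publication does not disclose source code.

```python
def information_providing_method(detected_objects, predict_entry, send_to_provider):
    """Sketch of the claimed information providing method.

    detected_objects: movement information for each object detected around
        the vehicle while it is still travelling toward the stop position.
    predict_entry: callable returning True if the object may enter the
        monitoring range when the vehicle stops at the stop position.
    send_to_provider: callable that delivers object information data to the
        information providing device (e.g. an in-vehicle display).
    """
    transmitted = []
    for obj in detected_objects:
        if predict_entry(obj):        # may the object enter the monitoring range?
            send_to_provider(obj)     # transmit information data about the object
            transmitted.append(obj)
    return transmitted
```

Only objects predicted to enter the monitoring range reach the information providing device, which is the filtering step that distinguishes this method from simply forwarding all detections.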
- FIG. 1 is a block diagram showing a configuration of a safety support system according to the present embodiment.
- FIG. 2 is a flowchart showing the operation of the vehicle system according to the present embodiment.
- FIG. 3 is an explanatory diagram showing an example of the traveling environment around the vehicle.
- FIG. 4 is an explanatory diagram illustrating the predicted behavior of objects when the vehicle stops at the stop position.
- FIG. 5 is an explanatory diagram showing an example of the traveling environment around the vehicle.
- FIG. 6 is an explanatory diagram illustrating the predicted behavior of objects when the vehicle stops at the stop position.
- FIG. 7 is an explanatory diagram showing an example of the traveling environment around the vehicle.
- FIG. 8 is an explanatory diagram illustrating the predicted behavior of objects when the vehicle stops at the stop position.
- FIG. 9 is a diagram illustrating a monitoring range set around the vehicle.
- FIG. 10 is a diagram illustrating another aspect of the monitoring range.
- FIG. 11 is a diagram illustrating an example of a monitoring range.
- This safety support system 1 is a system that supports safety by providing information to a user who uses a vehicle.
- A user who uses the vehicle is typically a user who rides in the vehicle, and includes a user who plans to get off the vehicle at the stop position where the vehicle stops and a user who plans to get on the vehicle at that stop position.
- the safety support system 1 is mainly composed of a vehicle system 10 and a management device 20.
- the vehicle system 10 and the management device 20 are configured to be able to communicate with each other via the network 5.
- the road traffic information system 30 is connected to the network 5.
- the vehicle system 10 and the management device 20 can communicate with the road traffic information system 30 via the network 5.
- the network 5 includes, for example, the Internet.
- the network 5 may use a mobile communication function such as 4G / LTE or 5G.
- the vehicle system 10 detects an object around the vehicle and performs a process for providing necessary information to the user based on the detection result of the object.
- the vehicle system 10 includes an object detection device 11, a position estimation device 12, a microcomputer 13, a communication unit 16, and an information providing device 17.
- the vehicle system 10 is applied to a vehicle (service vehicle) that provides a transportation service in response to a vehicle allocation request from a user.
- vehicle may be an engine vehicle driven only by an engine, a hybrid vehicle driven by an engine and an electric motor, or an electric vehicle driven only by an electric motor.
- In the present embodiment, the vehicle is an autonomous driving vehicle that travels by an automatic driving function after the driver gets on.
- Alternatively, the vehicle may be a vehicle that travels by manual driving of the driver, or a vehicle that travels by the automatic driving function without a driver on board.
- Autonomous driving refers to a state in which at least one actuator, for example a brake, an accelerator, or a steering wheel, is controlled without operation by an occupant; other actuators may still be operated by the occupant. The automatic driving may be any state in which control such as acceleration / deceleration control or lateral position control is executed. Manual driving in the present embodiment refers to a state in which the occupant operates, for example, the brake, the accelerator, and the steering.
- the object detection device 11 includes a plurality of object detection sensors mounted on the vehicle.
- the object detection device 11 collects information data on objects existing around the vehicle by using a plurality of object detection sensors (information collection device).
- Multiple object detection sensors include a laser range finder.
- the laser range finder senses the surroundings of the vehicle (for example, 360 degrees) within a predetermined range, and outputs the sensing result data.
- the sensing result data is output in, for example, a point cloud format.
- the plurality of object detection sensors include a camera.
- the camera photographs the surroundings of the vehicle (for example, 360 degrees) and outputs the captured image data. Multiple cameras are installed so that the surroundings of the vehicle can be photographed.
- the object detection device 11 outputs the sensing result data and the image data to the microcomputer 13 as information data related to the object (hereinafter referred to as “object information data”).
- the object detection device 11 may include a type of sensor other than the laser range finder and the camera.
- the position estimation device 12 measures the absolute position of the vehicle by using position estimation technology such as GPS (Global Positioning System) and odometry.
- the position estimation device 12 includes a GPS receiver, an inertial navigation system, sensors provided on the brake pedal and the accelerator pedal, sensors for acquiring vehicle behavior such as a wheel speed sensor and a yaw rate sensor, a laser radar, and a camera.
- the position estimation device 12 measures the position, speed, acceleration, steering angle, and posture (moving direction) of the vehicle.
- the microcomputer 13 controls the vehicle system 10.
- the microcomputer 13 is a general-purpose microcomputer including a hardware processor such as a CPU (central processing unit), a memory, and an input / output unit.
- a computer program for functioning as the vehicle system 10 is installed in the microcomputer 13.
- the microcomputer 13 functions as a plurality of information processing circuits included in the vehicle system 10.
- In the present embodiment, an example in which the plurality of information processing circuits included in the vehicle system 10 are realized by software is shown.
- However, the information processing circuits may also be configured by preparing dedicated hardware for executing each item of information processing described below.
- The plurality of information processing circuits may also be configured by individual pieces of hardware.
- the microcomputer 13 includes a processing unit 14 and an object prediction unit 15 as a plurality of information processing circuits.
- the processing unit 14 acquires object information data from the object detection device 11.
- the processing unit 14 detects an object existing around the vehicle based on the object information data, and generates object movement information data which is information on the movement of the object.
- Information regarding the movement of an object includes the position, speed, posture (movement direction), acceleration, steering angle, and the like of the object.
- the processing unit 14 integrates the object information data obtained from the plurality of object detection sensors included in the object detection device 11 to generate one piece of object information data for each object. Specifically, from the object information data obtained from each of the object detection sensors, the object information data with the smallest error is calculated in consideration of the error characteristics of each object detection sensor.
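One common way to realize the integration described above, combining per-sensor measurements while weighting by each sensor's error characteristics, is inverse-variance weighting. The patent does not name a specific algorithm, so the following is a minimal sketch for a single scalar quantity (e.g. an object's longitudinal position), under that assumption.

```python
def fuse_measurements(measurements):
    """Fuse one scalar quantity measured by several object detection sensors.

    measurements: list of (value, variance) pairs, one per sensor, where the
    variance encodes that sensor's error characteristic. Returns the
    minimum-variance fused estimate and its variance.
    """
    weights = [1.0 / var for _, var in measurements]   # more precise sensors weigh more
    total = sum(weights)
    fused = sum(value * w for (value, _), w in zip(measurements, weights)) / total
    fused_variance = 1.0 / total                       # fused estimate is tighter than any input
    return fused, fused_variance
```

The fused variance is always smaller than the smallest input variance, which matches the stated goal of producing the object information data with the smallest error.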
- the processing unit 14 tracks the detected object. Specifically, the processing unit 14 verifies and associates the identity of the object between different times based on the behavior of the objects output at different times, and tracks the object based on the association. As a result, object movement information data including information on the movement of the object is generated.
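The association step above, verifying the identity of objects between different times, can be sketched as greedy nearest-neighbour matching on object positions. This is one simple realization chosen for illustration; the patent does not specify the association method, and the function names are hypothetical.

```python
import math

def associate(previous, current, max_distance=2.0):
    """Greedily match detections at the current time to tracks from the
    previous time by nearest position.

    previous: {track_id: (x, y)} positions output at the earlier time.
    current:  {detection_id: (x, y)} positions output at the later time.
    Returns {detection_id: track_id} for matches within max_distance metres.
    """
    matches, used = {}, set()
    for cid, (cx, cy) in current.items():
        best, best_d = None, max_distance
        for pid, (px, py) in previous.items():
            if pid in used:
                continue
            d = math.hypot(cx - px, cy - py)
            if d < best_d:
                best, best_d = pid, d
        if best is not None:
            matches[cid] = best
            used.add(best)
    return matches
```

Chaining such associations over successive sensing cycles yields the track history from which the object movement information data (position, speed, movement direction over time) is generated.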
- the processing unit 14 detects the type of an object such as a moving object or a stationary object based on the object information data and the object movement information data. Further, the processing unit 14 detects the type of a moving object such as another vehicle, a motorcycle, a bicycle, or a pedestrian. Similarly, the processing unit 14 detects the type of a stationary object such as a parked vehicle or a building. The type of object constitutes part of the information about the movement of the object.
- the processing unit 14 has a function of recording information regarding the movement of an object. As a result, the processing unit 14 records a history of information regarding the movement of the object while the vehicle is traveling.
- Vehicle information data includes information such as the vehicle position, posture (movement direction), speed, door lock and door open / closed state, the seatbelt sensor value for detecting seatbelt attachment / detachment, and whether or not automatic driving is in progress.
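The fields listed above can be gathered into a single record; the sketch below uses a Python dataclass with illustrative field names (the patent lists the contents but not a concrete schema).

```python
from dataclasses import dataclass

@dataclass
class VehicleInformationData:
    """Illustrative container for the vehicle information data fields
    enumerated in the description. Field names are assumptions."""
    position: tuple           # e.g. (latitude, longitude)
    moving_direction: float   # posture / movement direction, degrees
    speed: float              # m/s
    door_locked: bool         # door lock state
    door_open: bool           # door open / closed state
    seatbelt_fastened: bool   # from the seatbelt sensor
    autonomous_driving: bool  # whether automatic driving is in progress
```

A record like this is what the communication unit 16 would transmit to the management device 20 and what the vehicle database 23 would store per vehicle ID.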
- the object prediction unit 15 acquires object movement information data and vehicle information data from the processing unit 14. In addition, the object prediction unit 15 acquires road traffic information data, which will be described later, from the road traffic information system 30. The object prediction unit 15 acquires map information data, which will be described later, from the map database (map DB) 24 of the management device 20.
- the object prediction unit 15 calculates the time to arrive at the stop position where the vehicle stops from the current position of the vehicle as the arrival time based on the road traffic information data and the map information data.
- the object prediction unit 15 can acquire the stop position data indicating the stop position from the data acquired from the management device 20, specifically, the vehicle allocation request data described later.
- the object prediction unit 15 detects the presence / absence of a user scheduled to get off or the presence / absence of a user scheduled to board.
- the user who is scheduled to disembark is a user who is scheduled to disembark from the vehicle at the stop position
- the user who is scheduled to board is a user who is scheduled to board the vehicle at the stop position.
- the object prediction unit 15 detects the presence / absence of a user scheduled to disembark and the presence / absence of a user scheduled to board the vehicle based on the data acquired from the management device 20, specifically, the vehicle allocation request data described later.
- the object prediction unit 15 predicts the behavior of the object based on the object movement information data calculated by the processing unit 14. Then, based on the prediction result of the behavior, the object prediction unit 15 determines whether or not there is a possibility that the object may enter the monitoring range set around the vehicle when the vehicle stops at the stop position. When the object prediction unit 15 determines that the object may enter the monitoring range, it transmits information data about the object to the information providing device 17.
- the communication unit 16 is a communication device that communicates with the management device 20 or the road traffic information system 30 via the network 5.
- the communication unit 16 stores the information acquired from the management device 20 or the road traffic information system 30 in a memory (not shown) or the like. Further, the communication unit 16 transmits vehicle information to the management device 20.
- the communication unit 16 may be a communication device having a mobile communication function such as 4G / LTE, or may be a communication device having a Wi-Fi communication function.
- the communication unit 16 communicates with the information providing device 17.
- the communication unit 16 transmits information to be provided to the occupant to the information providing device 17.
- the communication unit 16 may be a device having a mobile communication function such as 4G / LTE, or a device having a Wi-Fi communication function.
- the communication unit 16 may communicate with the information providing device 17 via a wire such as a wire harness.
- the information providing device 17 provides information to the user who uses the vehicle.
- the information providing device 17 is, for example, a display device mounted on a vehicle.
- the display device is arranged at a position that can be visually recognized by the user. For example, when the user gets in the rear seat of the vehicle, the information providing device 17 is provided on the back side of the front seat.
- the information providing device 17 is not limited to the form provided in the vehicle.
- the information providing device 17 may be a user terminal owned by the user, for example, a mobile information terminal such as a mobile phone, a smartphone, or a communication device.
- the management device 20 acquires vehicle allocation request data including a vehicle allocation request from the user terminal, and performs various processes for providing a vehicle transportation service according to the acquired vehicle allocation request data.
- the management device 20 is composed of one or a plurality of computers.
- the management device 20 has a calculation unit (controller) 21, a user database (user DB) 22, a vehicle database (vehicle DB) 23, a map database (map DB) 24, and a communication unit 25.
- the calculation unit 21 updates the vehicle database 23 by acquiring vehicle information data from the vehicle system 10.
- the calculation unit 21 can grasp the state of the vehicle based on the vehicle database 23.
- the calculation unit 21 performs a predetermined process based on the vehicle allocation request data stored in the user database 22. Specifically, the calculation unit 21 determines a vehicle to be dispatched to the user who issued the vehicle allocation request. In addition, the calculation unit 21 generates vehicle allocation information such as a departure place, a destination, and a traveling route. That is, based on the map information of the map database 24, the calculation unit 21 calculates a travel route along which the vehicle can travel from the departure point set in the vehicle allocation request to the destination, that is, from the boarding place where the user gets on to the getting-off place where the user gets off. In calculating the travel route, the road traffic information data may be referred to, and the route that takes the shortest time from the departure point to the destination may be selected.
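The shortest-time route selection described above can be sketched as Dijkstra's algorithm over a road graph whose edge weights are travel times (possibly already adjusted with road traffic information). The graph format and function names are illustrative assumptions, not the patent's implementation.

```python
import heapq

def shortest_time_route(graph, start, goal):
    """Dijkstra's algorithm with travel time as the edge weight.

    graph: {node: [(neighbour, travel_time), ...]} road network;
    start/goal: boarding place and getting-off place.
    Returns (total_time, route) or (inf, []) if unreachable.
    """
    queue = [(0.0, start, [start])]
    settled = set()
    while queue:
        time, node, route = heapq.heappop(queue)
        if node == goal:
            return time, route
        if node in settled:
            continue
        settled.add(node)
        for nxt, t in graph.get(node, []):
            if nxt not in settled:
                heapq.heappush(queue, (time + t, nxt, route + [nxt]))
    return float("inf"), []
```

Congestion from the road traffic information data would simply increase the travel-time weight of the affected edges before this search runs.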
- the user database 22 is a database that manages vehicle allocation request data acquired from the user terminal, that is, vehicle allocation request information (request information).
- request information is managed for each user, and each request information is associated with a user ID that identifies the user.
- the request information includes at least the location information of the user's departure place.
- the position information of the departure place is the current position information detected by the positioning function provided in the user terminal.
- the positioning function consists of a GPS receiver that receives signals from GPS satellites.
- the location information of the departure place may be the location information input by the user as the departure place.
- the request information may also include information such as the location information of the destination, the designation of a waypoint and its location information, the number of passengers, baggage information, the time when the vehicle allocation request was obtained, the desired riding time, and whether or not shared riding is acceptable.
- When the user database 22 acquires the vehicle allocation request data from the user terminal, this data is added to the user database 22. The vehicle allocation request data is then provided to the calculation unit 21.
- the vehicle database 23 is a database that manages vehicle information data.
- vehicle information data is managed for each vehicle, and each vehicle information data is associated with a vehicle ID that identifies the vehicle.
- Vehicle information data includes at least vehicle position information.
- vehicle information data includes information such as attitude (movement direction), speed, door lock and door open / closed state, seatbelt sensor value for detecting seatbelt attachment / detachment, and whether or not automatic driving is in progress.
- the vehicle information data may also include service-related information such as whether or not the vehicle can be reserved, whether or not the vehicle is on its way to pick up a user, whether or not there are passengers, the number of passengers, the boarding or disembarking status, and whether or not the vehicle has arrived at the destination.
- When the vehicle database 23 acquires vehicle information data from the vehicle system 10, it adds the necessary information to the vehicle database 23 based on this data.
- the map database 24 stores map information data including map information indicating the structure of the road on which the vehicle can travel.
- Map information includes road structure information such as absolute lane positions, lane connections, and relative positions, traffic rules, and road signs.
- the map information includes information on areas or places where users can get on and off. Further, the map information may include information on the waiting place of the vehicle and pedestrian road information for calculating the route on which the user walks.
- the communication unit 25 is a communication device that communicates with the vehicle system 10 or the road traffic information system 30 via the network 5.
- the communication unit 25 stores information (vehicle information, map information, etc.) acquired from the vehicle system 10 or the road traffic information system 30 in a memory (not shown) or the like. Further, the communication unit 25 transmits the vehicle allocation information to the vehicle system 10.
- the communication unit 25 may be a communication device having a mobile communication function such as 4G / LTE, or may be a communication device having a Wi-Fi communication function.
- the communication unit 25 communicates with the user terminal.
- the communication unit 25 stores the information (vehicle allocation request data) acquired from the user terminal in a memory (not shown) or the like.
- the road traffic information system 30 manages road traffic information data including road traffic information.
- Road traffic information is information that affects the running of vehicles, such as traffic congestion information and traffic regulation information.
- the road traffic information is provided by, for example, VICS (Vehicle Information and Communication System).
- the object prediction unit 15 determines whether or not the arrival time T is equal to or less than the determination time.
- the arrival time T is the time required to arrive at the stop position Pa along the traveling route starting from the current position of the vehicle Va, which is the own vehicle.
- the arrival time is calculated based on the current position of the vehicle Va, the distance to the stop position Pa on the traveling route, and the like. Further, the arrival time T is calculated in consideration of the influence of road traffic information, for example, traffic congestion.
- the determination time is a time for determining that it is appropriate for the vehicle Va to approach the stop position Pa and start detecting an object.
- the stop position Pa corresponds to the destination of the vehicle allocation request received from the user.
- the stop position Pa corresponds to the departure point of the vehicle allocation request received from the user.
- the object prediction unit 15 detects that the traveling vehicle will stop at the stop position Pa from now on, that is, that the vehicle will stop at the stop position Pa in the future (by the time the determination time elapses). If the arrival time is equal to or less than the determination time, an affirmative determination is made in step S10, and the process proceeds to step S11. On the other hand, if the arrival time is longer than the determination time, a negative determination is made in step S10, and the process returns to step S10.
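The step S10 decision, start detecting objects once the arrival time T drops to the determination time or below, can be sketched as follows. The multiplicative congestion factor is an assumption for illustration; the patent says only that traffic congestion is taken into account, without prescribing a formula.

```python
def arrival_time(distance_m, speed_mps, congestion_factor=1.0):
    """Estimate the arrival time T (seconds) from the current position of
    the vehicle Va to the stop position Pa along the travel route.
    congestion_factor > 1 lengthens the estimate to reflect road traffic
    information such as congestion (illustrative simple model)."""
    return distance_m / speed_mps * congestion_factor

def should_start_detection(arrival_time_s, determination_time_s):
    """Affirmative determination of step S10: the arrival time T is equal
    to or less than the determination time."""
    return arrival_time_s <= determination_time_s
```

For example, 500 m at 10 m/s gives T = 50 s without congestion, so with a 60 s determination time the vehicle would start object detection; heavy congestion stretching T to 75 s would defer it.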
- In step S11, the object prediction unit 15 detects the presence / absence of a user scheduled to get off or a user scheduled to board.
- In step S12, the processing unit 14 acquires the object information data from the object detection device 11.
- In step S13, the processing unit 14 acquires the own vehicle movement information data, including the movement information of the vehicle Va, based on the calculation result of the position estimation device 12 and the vehicle information data.
- the movement information of the vehicle Va includes information such as the position, speed, and posture (movement direction) of the vehicle Va.
- In step S14, the processing unit 14 recognizes the type of each object based on the object information data and detects moving objects. In addition, the processing unit 14 detects types of objects such as other vehicles, motorcycles, bicycles, and pedestrians.
- In step S15, the processing unit 14 generates object movement information data including the movement information of each object.
- the movement information of the object includes the position, speed, and posture (movement direction) of the object.
- In step S16, the object prediction unit 15 predicts the behavior of each object based on the object movement information data and the own vehicle movement information data.
- the objects are other vehicles Vb1, Vb2, Vb3, pedestrian Pe1, and bicycles Bc1 and Bc2 running around the vehicle Va.
- the object prediction unit 15 calculates the time required for the vehicle Va to arrive at the stop position Pa based on the own vehicle movement information data, the distance to the stop position Pa, and the like. The object prediction unit 15 then assumes that each object continues its current behavior and predicts the position of the object after the required time has elapsed, that is, when the vehicle Va arrives at the stop position Pa. Further, the object prediction unit 15 considers that the other vehicle Vb3 is affected by the vehicle Va and therefore decelerates in accordance with the stop of the vehicle Va. Based on this behavior prediction, as shown in FIG. 4, the bicycle Bc2 and the other vehicle Vb3 are predicted to be in the vicinity of the vehicle Va when the vehicle Va stops at the stop position Pa.
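The constant-behavior assumption described above can be sketched as a simple dead-reckoning step (a minimal illustration with hypothetical names; a real system would also model interactions such as the deceleration of the other vehicle Vb3):

```python
import math

def predict_position(x: float, y: float, speed: float, heading_rad: float,
                     elapsed_s: float) -> tuple[float, float]:
    """Predict an object's position after `elapsed_s` seconds, assuming it
    keeps its current speed and heading (the constant-behavior assumption)."""
    return (x + speed * elapsed_s * math.cos(heading_rad),
            y + speed * elapsed_s * math.sin(heading_rad))
```

The elapsed time passed in would be the required time calculated for the vehicle Va to reach the stop position Pa.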
- the object prediction unit 15 can infer a lane change from the posture specified from the object movement information data. In this case, the object prediction unit 15 considers that the other vehicle Vb3 is not affected by the vehicle Va and therefore continues to travel at the same speed. As a result, as shown in FIG. 6, it is predicted that the other vehicle Vb3 will not be in the vicinity of the vehicle Va when the vehicle Va stops at the stop position Pa.
- the object prediction unit 15 also predicts the behavior of the pedestrian Pe2, who is traveling toward the vehicle Va from in front of it.
- the bicycle Bc2 and the pedestrian Pe2 are predicted to exist in the vicinity of the vehicle Va when the vehicle Va is stopped at the stop position Pa.
- the object prediction unit 15 may predict the behavior of the object by referring not only to the object movement information data generated in the latest process but also to object movement information data generated in past processes.
- In step S17, the object prediction unit 15 determines, based on the behavior prediction of the object, whether or not the object may enter the monitoring range Ra when the vehicle Va stops at the stop position Pa.
- the monitoring range Ra is an area used to decide whether or not to notify the user getting off the vehicle Va of the existence of an object, and is set around the vehicle Va. For example, in the case of left-hand traffic, the occupant usually gets off from the left side of the vehicle Va. Therefore, as shown in FIG. 9, the monitoring range Ra is set on the left side of the vehicle Va.
- If there is a possibility that the object may enter the monitoring range Ra, an affirmative determination is made in step S17, and the process proceeds to step S18. On the other hand, if there is no possibility that the object enters the monitoring range Ra, a negative determination is made in step S17, and the process proceeds to step S20.
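The determination of step S17 can be sketched as a containment test of the predicted position against the monitoring range (a minimal sketch with hypothetical names; the range is modeled here as an axis-aligned rectangle in vehicle-relative coordinates, whereas a real system would use the vehicle's pose and the configured range shape):

```python
def may_enter_monitoring_range(pred_x: float, pred_y: float,
                               x_min: float, x_max: float,
                               y_min: float, y_max: float) -> bool:
    """Return True (affirmative in step S17) when the object's predicted
    position at the time the vehicle stops falls inside the monitoring
    range set around the vehicle."""
    return (x_min <= pred_x <= x_max) and (y_min <= pred_y <= y_max)
```

An affirmative result would trigger generation of the object information data in step S18.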
- In step S18, the object prediction unit 15 generates the object information data.
- the object information data is data indicating information about an object that may enter the monitoring range Ra.
- the object information data is image data obtained by photographing an object that may enter the monitoring range Ra.
- the object information data may include information on the direction from which the object enters the monitoring range Ra.
- When the information providing device 17 includes a speaker that outputs audio, the object information data may be audio data describing the object.
- In step S19, the object prediction unit 15 outputs the object information data to the information providing device 17 via the communication unit 16.
- In step S20, the object prediction unit 15 determines whether or not the user has finished getting off or boarding. When the user has finished getting off or boarding, an affirmative determination is made in step S20, and this process ends. Otherwise, a negative determination is made in step S20, and the process returns to step S12.
- the information providing method acquires the stop position data indicating the stop position, detects that the traveling vehicle Va is about to stop at the stop position, detects objects existing around the vehicle Va, and acquires object movement information data based on the detection results.
- the information providing method determines, based on the object movement information data, whether or not the object may enter the monitoring range Ra when the vehicle Va stops at the stop position Pa, and, when it is determined that the object may enter the monitoring range Ra, transmits the object information data to the information providing device 17.
- a blind spot may be generated by another object or the like, and the detection range of the object may be limited. In this case, it may be difficult to detect an object of interest.
- the vehicle Va can detect the object while heading toward the stop position Pa, the object can be detected over a wide range.
- the object can be detected around the entire circumference of the vehicle Va. Therefore, in addition to objects approaching from the rear of the vehicle Va, objects approaching from various directions can be detected. As a result, objects can be detected over a wide range, and the accuracy of the information provided to the user can be improved.
- the object information data is transmitted to the information providing device 17.
- the vehicle system may perform other controls, such as not unlocking the doors, together with or instead of transmitting the object information data.
- the behavior of the object may be predicted and the object information data may be transmitted.
- the behavior prediction of the object may be performed based only on the movement information obtained at the timing of the prediction processing, or based on both that movement information and recorded past movement information.
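One way recorded past movement information can refine the prediction is by estimating an object's velocity from its last two recorded positions (a minimal sketch under that assumption; names are hypothetical):

```python
def estimate_velocity(prev_pos: tuple[float, float],
                      curr_pos: tuple[float, float],
                      dt_s: float) -> tuple[float, float]:
    """Estimate an object's velocity vector (vx, vy) from two recorded
    positions `dt_s` seconds apart, so that past movement information
    can feed the behavior prediction."""
    return ((curr_pos[0] - prev_pos[0]) / dt_s,
            (curr_pos[1] - prev_pos[1]) / dt_s)
```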
- the user can check the object information data before getting off, and can therefore act with foreknowledge of the object when getting off.
- the behavior prediction of the object may be performed in advance before the vehicle Va arrives at the stop position Pa, and the object information data may be transmitted after the vehicle Va arrives at the stop position Pa.
- alternatively, after the vehicle Va arrives at the stop position Pa, the behavior prediction of the object may be performed based on the movement information recorded up to that point.
- the object movement information data includes at least the speed of the object and the moving direction of the object.
- the object information data includes the direction in which the object enters the monitoring range Ra.
- the user can understand from which direction the object is approaching and can thus recognize the object more concretely.
- the type of the object may be specified based on the detection result of the object.
- the object information data may include the type data of the object.
- the user can understand what kind of object is approaching and can thus recognize the object more concretely.
- the object information data includes one or both of the image data obtained by capturing the object and the audio data explaining the object.
- the user can understand what kind of object is approaching and can thus recognize the object more concretely.
- the information providing method determines whether or not the arrival time until the vehicle Va arrives at the stop position Pa is equal to or less than a preset determination time, and starts detecting the object when the arrival time is equal to or less than the determination time.
- In the example above, one user rides in the vehicle Va; however, a plurality of users may ride in the vehicle, as in a shared vehicle.
- a plurality of information providing devices 17 are associated with the plurality of users.
- In this case, it is preferable that the information providing method according to the present embodiment transmit the object information data to, among the plurality of information providing devices 17, the information providing device 17 associated with the user getting off at the stop position Pa.
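The routing of the object information data to the right terminal in a shared vehicle can be sketched as a filter over the user-to-device association (a minimal illustration; the record fields `device_id` and `alight_at` are hypothetical names, not part of the patent):

```python
def select_target_devices(users: list[dict], stop_position: str) -> list[str]:
    """From users riding a shared vehicle, pick the information providing
    devices of those scheduled to get off at the given stop position."""
    return [u["device_id"] for u in users
            if u["alight_at"] == stop_position]
```

Usage: given riders associated with devices, only the device of the user alighting at the current stop receives the object information data.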
- the monitoring range Ra is set on the left side of the vehicle Va.
- the monitoring range is not limited to this.
- the monitoring range Rb may be set on the right side of the vehicle Va.
- the monitoring range Rc may be set behind the vehicle Va.
- the monitoring range may be a combination of these ranges Ra, Rb, and Rc.
- the monitoring range may be set according to the direction in which the user gets off. For example, if the user is likely to get off from the left side of the vehicle, the monitoring range Ra is set on the left side of the vehicle Va; if the user is likely to get off from the right side, the monitoring range Rb is set on the right side of the vehicle Va. In this case, the user may be the driver.
- the monitoring range may be switched depending on the type of the object. For example, motorcycles are faster than pedestrians, so more care must be taken when getting off. Therefore, as shown in FIG. 11, when the object is a two-wheeled vehicle, the monitoring range Ra is enlarged.
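Switching the range size by object type can be sketched as a lookup of a type-dependent range depth (illustrative values only; the patent does not specify concrete dimensions):

```python
def monitoring_range_depth_m(object_type: str) -> float:
    """Return a monitoring-range depth in metres that grows with the
    object's typical approach speed, mirroring FIG. 11 where the range
    is enlarged for two-wheeled vehicles. All values are hypothetical."""
    depths = {"pedestrian": 5.0, "bicycle": 10.0, "motorcycle": 20.0}
    return depths.get(object_type, 10.0)  # default depth for unknown types
```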
- the vehicle system 10 has technical features corresponding to the above-described information providing method and provides the same operations and effects as that method.
- the processing unit 14 may acquire (calculate) the movement information based on the detection result of the object by the object detection device 11, or may acquire the movement information directly from the object detection device 11.
- the processing unit 14 functions as a part of an information collecting device that acquires movement information.
- the information collecting device of the vehicle system 10 may acquire movement information from an object other than the own vehicle by using V2X (Vehicle to Everything) technology including vehicle-to-vehicle communication.
- the vehicle system 10 performs the prediction process, but an external device (for example, the management device 20) capable of communicating with the vehicle Va may perform the prediction process.
- the management device 20 acquires, from the vehicle, object information data, which is information about objects collected using a sensor while the vehicle is traveling, and calculates object movement information data, which is information about the movement of the objects, based on the object information data. Then, based on the object movement information data, the management device 20 determines whether or not an object may enter the monitoring range when the vehicle stops at the stop position, and, when it determines that the object may enter the monitoring range, transmits information data about the object for provision to the user. The data may be transmitted to the vehicle system 10 and then forwarded by the vehicle system 10 to the information providing device 17, or may be transmitted directly to the information providing device 17.
- 1 Safety support system
- 10 Vehicle system
- 11 Object detection device
- 12 Position estimation device
- 13 Microcomputer
- 14 Processing unit
- 15 Object prediction unit
- 16 Communication unit (communication device)
- 17 Information providing device
- 20 Management device
- 21 Calculation unit (controller)
- 22 User database
- 23 Vehicle database
- 24 Map database
- 25 Communication unit (communication device)
- 30 Vehicle Information and Communication System
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Atmospheric Sciences (AREA)
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Mathematical Physics (AREA)
- Traffic Control Systems (AREA)
- Burglar Alarm Systems (AREA)
Abstract
Description
10 Vehicle system
11 Object detection device
12 Position estimation device
13 Microcomputer
14 Processing unit
15 Object prediction unit
16 Communication unit (communication device)
17 Information providing device
20 Management device
21 Calculation unit (controller)
22 User database
23 Vehicle database
24 Map database
25 Communication unit (communication device)
30 Vehicle Information and Communication System
Claims (13)
- 1. An information providing method of an information processing device that outputs information to an information providing device that provides information to a user of a vehicle, the method comprising: acquiring stop position data indicating a stop position at which the vehicle stops; detecting that the traveling vehicle is about to stop at the stop position; detecting an object present around the vehicle; acquiring movement information data, which is information on the movement of the detected object; determining, based on the movement information data, whether or not the object may enter a monitoring range set around the vehicle when the vehicle stops at the stop position; and transmitting information data on the object to the information providing device when it is determined that the object may enter the monitoring range.
- 2. The information providing method according to claim 1, wherein the user of the vehicle is a user who rides in the vehicle and includes a user scheduled to get off the vehicle at the stop position or a user scheduled to board the vehicle at the stop position.
- 3. The information providing method according to claim 2, wherein the information providing device is a mobile information terminal carried by a user scheduled to get off the vehicle at the stop position or a user scheduled to board the vehicle at the stop position.
- 4. The information providing method according to claim 2 or 3, wherein, in addition to detecting that the traveling vehicle is about to stop at the stop position, the presence or absence of a user scheduled to get off the vehicle at the stop position, or the presence or absence of a user scheduled to board the vehicle at the stop position, is detected.
- 5. The information providing method according to any one of claims 1 to 4, wherein the movement information data includes at least a speed of the object and a moving direction of the object.
- 6. The information providing method according to any one of claims 1 to 5, wherein the information data on the object includes a direction in which the object enters the monitoring range.
- 7. The information providing method according to any one of claims 1 to 6, wherein a type of the object is specified based on a detection result of the object, and the information data on the object includes type data of the object.
- 8. The information providing method according to any one of claims 1 to 7, wherein the information data on the object includes one or both of image data in which the object is captured and audio data describing the object.
- 9. The information providing method according to any one of claims 1 to 8, wherein, when a plurality of users ride in the vehicle and a plurality of information providing devices are associated with the plurality of users, the information data on the object is transmitted to, among the plurality of information providing devices, the information providing device associated with the user getting off at the stop position.
- 10. The information providing method according to any one of claims 1 to 9, wherein it is determined whether or not an arrival time until the vehicle arrives at the stop position is equal to or less than a preset determination time, and detection of the object is started when the arrival time is equal to or less than the determination time.
- 11. The information providing method according to any one of claims 1 to 10, wherein the information on the object is transmitted to the information providing device before the vehicle stops at the stop position.
- 12. A vehicle system comprising: an information providing device that provides information to a user of a vehicle; an information collecting device that collects information data on an object present around the vehicle; a communication device that performs communication between the information collecting device and the information providing device; and a controller that controls the communication device, wherein the controller: acquires stop position data indicating a stop position at which the vehicle stops; detects that the traveling vehicle is about to stop at the stop position; detects the object present around the vehicle using data acquired from the information collecting device; acquires, via the information collecting device, movement information data, which is information on the movement of the detected object; determines, based on the movement information data, whether or not the object may enter a monitoring range set around the vehicle when the vehicle stops at the stop position; and transmits the information data on the object to the information providing device when it is determined that the object may enter the monitoring range.
- 13. A management device comprising: a communication device that performs communication with an information collecting device that collects information data on an object present around a vehicle and with an information providing device that provides information to a user of the vehicle; and a controller that controls the communication device, wherein the controller: acquires stop position data indicating a stop position at which the vehicle stops; detects that the traveling vehicle is about to stop at the stop position; detects the object present around the vehicle using data acquired from the information collecting device; acquires, via the information collecting device, movement information data, which is information on the movement of the detected object; determines, based on the movement information data, whether or not the object may enter a monitoring range set around the vehicle when the vehicle stops at the stop position; and transmits the information data on the object to the information providing device when it is determined that the object may enter the monitoring range.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202180022639.8A CN115315738A (zh) | 2020-03-26 | 2021-03-18 | 信息提供方法、车辆系统以及管理装置 |
US17/907,100 US20240046778A1 (en) | 2020-03-26 | 2021-03-18 | Information providing method, vehicle system and management device |
EP21775925.7A EP4131214A4 (en) | 2020-03-26 | 2021-03-18 | INFORMATION PROVISION METHOD, VEHICLE SYSTEM AND MANAGEMENT DEVICE |
BR112022019248A BR112022019248A2 (pt) | 2020-03-26 | 2021-03-18 | Método de fornecimento de informações, sistema de veículo e dispositivo de gerenciamento |
JP2022509742A JP7311030B2 (ja) | 2020-03-26 | 2021-03-18 | 情報提供方法、車両システム及び管理装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020056307 | 2020-03-26 | ||
JP2020-056307 | 2020-03-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021191682A1 true WO2021191682A1 (ja) | 2021-09-30 |
Family
ID=77890981
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2021/000166 WO2021191682A1 (ja) | 2020-03-26 | 2021-03-18 | 情報提供方法、車両システム及び管理装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20240046778A1 (ja) |
EP (1) | EP4131214A4 (ja) |
JP (1) | JP7311030B2 (ja) |
CN (1) | CN115315738A (ja) |
BR (1) | BR112022019248A2 (ja) |
WO (1) | WO2021191682A1 (ja) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008158578A (ja) * | 2006-12-20 | 2008-07-10 | Tokai Rika Co Ltd | 車両の降車安全装置及び車両の安全降車方法 |
JP2018134887A (ja) | 2017-02-20 | 2018-08-30 | 株式会社アルファ | 車両用周辺監視装置 |
JP2018206186A (ja) * | 2017-06-07 | 2018-12-27 | パナソニックIpマネジメント株式会社 | 歩行者端末装置、車載端末装置、歩車間通信システム、および降車通知方法 |
JP2019178960A (ja) * | 2018-03-30 | 2019-10-17 | 株式会社ナビタイムジャパン | 情報処理システム、情報処理プログラム、情報処理装置、および情報処理方法 |
JP2019192069A (ja) * | 2018-04-27 | 2019-10-31 | いすゞ自動車株式会社 | 報知装置、及び、報知方法 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
RO127817A2 (ro) * | 2008-04-15 | 2012-09-28 | Lilian Cătălin Vişoiu | Sistem de preavertizare în caz de frânare a unui autovehicul |
DE102014005464A1 (de) * | 2014-04-12 | 2014-09-25 | Daimler Ag | Verfahren zumindest zur Verminderung einer Kollisionsschwere eines Fahrzeuges mit einer Person |
KR101759020B1 (ko) * | 2016-02-05 | 2017-07-18 | 인하대학교 산학협력단 | 차량 개문에 따른 위험 상황 자동 알림 장치 및 방법 |
KR101960915B1 (ko) * | 2017-03-22 | 2019-07-17 | 한국오므론전장 주식회사 | 차량 도어 충돌 방지 시스템 및 방법 |
US10089872B1 (en) * | 2017-05-11 | 2018-10-02 | Here Global B.V. | Vehicle communication system for vehicle boarding area |
KR102420218B1 (ko) * | 2017-10-21 | 2022-07-13 | 현대자동차주식회사 | 하차 승객 보호 시스템 및 그 방법 |
US10741081B2 (en) * | 2017-11-14 | 2020-08-11 | GM Global Technology Operations LLC | Remote park assist system |
-
2021
- 2021-03-18 CN CN202180022639.8A patent/CN115315738A/zh active Pending
- 2021-03-18 US US17/907,100 patent/US20240046778A1/en active Pending
- 2021-03-18 JP JP2022509742A patent/JP7311030B2/ja active Active
- 2021-03-18 BR BR112022019248A patent/BR112022019248A2/pt unknown
- 2021-03-18 WO PCT/IB2021/000166 patent/WO2021191682A1/ja active Application Filing
- 2021-03-18 EP EP21775925.7A patent/EP4131214A4/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008158578A (ja) * | 2006-12-20 | 2008-07-10 | Tokai Rika Co Ltd | 車両の降車安全装置及び車両の安全降車方法 |
JP2018134887A (ja) | 2017-02-20 | 2018-08-30 | 株式会社アルファ | 車両用周辺監視装置 |
JP2018206186A (ja) * | 2017-06-07 | 2018-12-27 | パナソニックIpマネジメント株式会社 | 歩行者端末装置、車載端末装置、歩車間通信システム、および降車通知方法 |
JP2019178960A (ja) * | 2018-03-30 | 2019-10-17 | 株式会社ナビタイムジャパン | 情報処理システム、情報処理プログラム、情報処理装置、および情報処理方法 |
JP2019192069A (ja) * | 2018-04-27 | 2019-10-31 | いすゞ自動車株式会社 | 報知装置、及び、報知方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP4131214A4 |
Also Published As
Publication number | Publication date |
---|---|
EP4131214A4 (en) | 2023-10-04 |
JP7311030B2 (ja) | 2023-07-19 |
CN115315738A (zh) | 2022-11-08 |
BR112022019248A2 (pt) | 2022-11-16 |
EP4131214A1 (en) | 2023-02-08 |
US20240046778A1 (en) | 2024-02-08 |
JPWO2021191682A1 (ja) | 2021-09-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108725432B (zh) | 自动驾驶装置以及通知方法 | |
US11345359B2 (en) | Autonomous driving vehicles with dual autonomous driving systems for safety | |
CN112462640B (zh) | 车辆远程指示系统 | |
US11747828B2 (en) | Vehicle remote instruction system and remote instruction device | |
JP7139717B2 (ja) | 車両用通信装置、車両用通信方法、及び制御プログラム | |
US10875545B2 (en) | Autonomous driving system | |
CN112486162A (zh) | 车辆远程指示系统 | |
SE1250747A1 (sv) | Anordning och förfarande för att bedöma olycksrisk vid framförande av ett fordon | |
CN111830859B (zh) | 车辆远程指示系统 | |
CN112927550B (zh) | 自动泊车系统 | |
CN113287074A (zh) | 使用语音交互增加自主运载工具安全性和灵活性的方法和系统 | |
CN111731295B (zh) | 行驶控制装置、行驶控制方法以及存储程序的存储介质 | |
CN111762168A (zh) | 控制装置、控制方法以及存储介质 | |
JP2020138611A (ja) | 車両制御装置、車両制御システム、車両制御方法、およびプログラム | |
JP2007066179A (ja) | 車両用運転支援装置 | |
WO2020241971A1 (ko) | 교통 사고 처리 장치 및 교통 사고 처리 방법 | |
WO2021053763A1 (ja) | 運転支援装置、運転支援方法及びプログラム | |
CN113557174A (zh) | 信息处理装置、信息处理方法、移动控制装置和移动控制方法 | |
JPWO2019039280A1 (ja) | 情報処理装置、情報処理方法、プログラム、及び、車両 | |
US11386724B2 (en) | Control device, a control method, and a non-transitory computer readable medium storing a control program of a door lock of a vehicle | |
US20210284195A1 (en) | Obstacle prediction system for autonomous driving vehicles | |
WO2021191682A1 (ja) | 情報提供方法、車両システム及び管理装置 | |
CN115454036A (zh) | 远程操作委托系统、远程操作委托方法以及存储介质 | |
JP7312662B2 (ja) | 配車システム、配車システムの配車方法及びサーバ | |
CN111301418B (zh) | 驾驶辅助装置及其控制方法、车辆以及存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21775925 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022509742 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 17907100 Country of ref document: US |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112022019248 Country of ref document: BR |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2021775925 Country of ref document: EP Effective date: 20221026 |
|
ENP | Entry into the national phase |
Ref document number: 112022019248 Country of ref document: BR Kind code of ref document: A2 Effective date: 20220923 |