WO2017047176A1 - Information processing apparatus, information processing method, and program - Google Patents
- Publication number
- WO2017047176A1 (PCT/JP2016/067568)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- information
- user
- evaluation
- information processing
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0965—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages responding to signals from another vehicle, e.g. emergency vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/0112—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
- G08G1/0129—Traffic data processing for creating historical data or processing based on historical data
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, and a program.
- Patent Document 1 describes a system that, in order to increase motivation for improving driving manners, acquires behavior (hazard lamps, horn, etc.) by which others evaluate the driving manners directed at a vehicle, transmits it to a server, and calculates points according to the driving evaluation information.
- Patent Document 2 describes an in-vehicle information presentation device that displays an energy consumption state of an in-vehicle battery of another vehicle on a map screen together with the current position of the other vehicle.
- the present disclosure proposes an information processing apparatus, an information processing method, and a program that can effectively use information related to user evaluation for a target vehicle.
- According to the present disclosure, an information processing apparatus is proposed that includes a recognition unit that automatically recognizes information on a target vehicle from user environment information, an acquisition unit that acquires a user evaluation for the recognized target vehicle, and a generation unit that generates information for notifying the acquired user evaluation.
- According to the present disclosure, an information processing method is proposed that includes a processor automatically recognizing information on a target vehicle from user environment information, acquiring a user evaluation for the recognized target vehicle, and generating information for notifying the acquired user evaluation.
- According to the present disclosure, a program is proposed for causing a computer to function as a recognition unit that automatically recognizes information on a target vehicle from the user's environment information, an acquisition unit that acquires a user evaluation for the recognized target vehicle, and a generation unit that generates information for notifying the acquired user evaluation.
- Brief descriptions of the drawings include: a diagram explaining the display of acceleration/deceleration information of surrounding other vehicles according to the third embodiment; a diagram showing a list of the acceleration/deceleration information of surrounding vehicles acquired by the own vehicle according to the third embodiment; and a diagram explaining an overview.
- FIG. 1 is a diagram illustrating an overview of an information processing system according to an embodiment of the present disclosure.
- the vehicles 10A, 10B, and 10C have an inter-vehicle communication function capable of transmitting and receiving signals to and from other vehicles existing in the vicinity.
- when the vehicle 10B gives way as the vehicle 10A joins the main line, it is good manners for the driver of the vehicle 10A to express thanks to an extent that does not burden the vehicle 10B (that is, does not hinder its travel); this is usually done by raising a hand, slightly bowing the head, blinking the hazard lamps once or twice, and so on. Such thanks are ad hoc, but in order to encourage good driving, they may be collected as an evaluation of the driver (or vehicle). For example, the driver of the vehicle 10A, who was given way, explicitly evaluates the vehicle 10B by pressing a "Thanks" button provided on the steering wheel or the like, or by saying "Thank you for giving way!".
- the vehicle 10A acquires the user evaluation based on the pressing of the "Thanks" button or the collected user voice, and transmits it by inter-vehicle communication to the vehicle 10B, which is recognized as the target vehicle.
- the vehicle 10B stores the received evaluation information or presents it on a display device visible to the driver (a display, or an AR (augmented reality) function of the windshield), thereby realizing driver-to-driver communication. For example, messages such as "Thanks" or "Thank you for giving way!" are displayed on the display device.
- the evaluations of the vehicle 10B received from other users by inter-vehicle communication are also transmitted from the vehicle 10B to the third-party vehicle 10C.
- the vehicle 10C receives the evaluation information of the vehicle 10B traveling nearby by inter-vehicle communication and presents it on a display device in the vehicle. The driver of the vehicle 10C can therefore grasp whether the driving manners of the vehicle 10B are good or bad from the evaluation information of the vehicle 10B (for example, a numerical value indicating the number of times the vehicle 10B has received "Thanks", the latest comments, etc.).
- since the evaluation information of each vehicle is also transmitted to the surrounding vehicles, the driver of the vehicle 10C can recognize that the vehicle 10B running ahead is a dangerous vehicle with bad driving manners when, for example, its number of "Thanks" is small and the latest comments are "sudden cut-in" or "dangerous meandering driving!". As a result, the driver of the vehicle 10C can prevent accidents and drive comfortably by avoiding running behind the vehicle 10B or by paying particular attention to its movements.
- the information that can be obtained from other vehicles by inter-vehicle communication is not limited to the above-described evaluation information; various information useful for preventing accidents, driving comfortably, improving driving manners, and so on can be obtained.
- the information of other vehicles is not limited to the example communicated by inter-vehicle communication, and may be acquired via the management server 4 (see FIG. 28).
- Each vehicle is communicatively connected to a roadside wireless device 3 (see FIG. 28) installed on the road by road-to-vehicle communication, and transmits / receives data to / from the management server 4 via the roadside wireless device 3.
- Information of each vehicle is associated with identification information (for example, vehicle number) unique to the vehicle and managed by the management server 4 together with the current position information.
- automobiles are illustrated as the vehicle 10A, the vehicle 10B, and the vehicle 10C.
- the present embodiment is not limited to this, and the vehicle 10 may be a motorcycle or a light vehicle.
- the information processing apparatus 100 can be mounted on a moving body such as the vehicles 10A, 10B, 10C, for example.
- the information processing apparatus 100 can be mounted on a smartphone, a tablet terminal, a mobile phone terminal, a PC (personal computer), or the like brought into the vehicle 10.
- FIG. 2 is a diagram illustrating an example of a basic configuration of the information processing apparatus 100 according to the present embodiment.
- the information processing apparatus 100 includes a control unit 110, a communication unit 120, an input unit 130, an output unit 140, and a storage unit 150.
- the control unit 110 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 100 according to various programs.
- the control unit 110 is realized by an electronic circuit such as a CPU (Central Processing Unit) or a microprocessor, for example.
- the communication unit 120 is a communication module for transmitting and receiving data to and from other devices by wire / wireless.
- the communication unit 120 communicates with another information processing apparatus 100 or the roadside apparatus 3.
- the communication unit 120 is a communication module that performs inter-vehicle communication with another information processing apparatus 100 mounted on another vehicle that travels in the vicinity, and a communication module that performs communication with the roadside apparatus 3 installed in the vicinity.
- the input unit 130 receives information input to the information processing apparatus 100 from the outside.
- the input unit 130 can be realized by a touch panel, a switch, a button, a microphone, and various sensors.
- the output unit 140 outputs information by video, image, sound, vibration or the like.
- the output unit 140 is realized by a display device installed so as to be visible to the driver of the vehicle (a display, or AR display on the windshield), a speaker, or the like.
- the storage unit 150 is realized by a ROM (Read Only Memory) that stores programs and calculation parameters used for the processing of the control unit 110, and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate.
- FIG. 3 is a diagram illustrating an example of a detailed configuration of the information processing apparatus 100 according to the present embodiment.
- the information processing apparatus 100 includes an in-vehicle device 1301, a sensor 1302, an operation input unit 1303, a microphone 1304, a vehicle-to-vehicle communication unit 1201, a road-to-vehicle communication unit 1202, a network I / F (interface) unit 1203, A recognition unit 1101, a user evaluation acquisition unit 1102, a notification information generation unit 1103, a notification control unit 1104, an estimation unit 1105, a display unit 1401, and a speaker 1402 are included.
- the in-vehicle device 1301, the sensor 1302, the operation input unit 1303, and the microphone 1304 are examples of the input unit 130.
- the in-vehicle device 1301 is a device or system provided in the vehicle; for example, a speedometer, a fuel consumption meter, a navigation device, a driving assist system such as a cruise controller or an automatic brake, safety equipment, and an automatic driving system are assumed.
- Examples of information obtained from the in-vehicle device 1301 by the recognition unit 1101 and the estimation unit 1105 include the following information.
- the sensor 1302 detects various information in and around the vehicle.
- the sensor 1302 is assumed to be a camera (image sensor), a depth sensor, a line-of-sight sensor, a touch sensor, a vital sensor, an emotion sensor, and the like.
- Examples of information obtained from the sensor 1302 by the recognition unit 1101 and the estimation unit 1105 include the following information.
- the operation input unit 1303 detects operation input information by the user.
- the operation input unit 1303 is realized by a touch panel, a switch, a button, or the like.
- the operation input unit 1303 includes a "Thanks" button (an example of an evaluation button) provided on or around the steering wheel or on a lever switch.
- Information obtained by the user evaluation acquisition unit 1102 from the operation input unit 1303 includes, for example, information on pressing a “Thanks” button (an example of information related to evaluation of other vehicles by the user).
- the microphone 1304 collects the speech from the user.
- Information obtained by the user evaluation acquisition unit 1102 from the microphone 1304 includes, for example, an evaluation comment on the other vehicle by the user such as “Thank you for handing over”.
- the information obtained by the estimation unit 1105 from the microphone 1304 includes, for example, information requests to other vehicles by the user, such as "That car is swerving; isn't it dangerous?" or "Where has everyone come from?".
- the in-vehicle device 1301, the sensor 1302, the operation input unit 1303, or the microphone 1304 described above may be provided separately from the information processing apparatus 100 in the vehicle.
- the inter-vehicle communication unit 1201, the road-to-vehicle communication unit 1202, and the network I / F (interface) unit 1203 are examples of the communication unit 120.
- the vehicle-to-vehicle communication unit 1201 and the road-to-vehicle communication unit 1202 perform data communication between vehicles, and between a vehicle and a roadside wireless device, by short-range wireless communication such as Wi-Fi (registered trademark) and Bluetooth (registered trademark), for example.
- the network I / F unit 1203 performs data communication between the vehicle and the network through a mobile network such as 4G (LTE; Long Term Evolution) or 3G.
- the inter-vehicle communication unit 1201 may realize data communication with another vehicle by visible light communication using high-speed blinking patterns of lamps such as headlights, side marker lamps, turn signals, and brake lamps.
- the recognition unit 1101, the user evaluation acquisition unit 1102, the notification information generation unit 1103, the notification control unit 1104, and the estimation unit 1105 are functional examples of the control unit 110.
- the recognition unit 1101 recognizes the target vehicle that the user is paying attention to, or the target vehicle of an evaluation performed by the user.
- the recognition unit 1101 may automatically recognize the target vehicle based on information (that is, user environment information) obtained from the in-vehicle device 1301, the sensor 1302, and the communication unit 120.
- the user environment information is, for example, the traveling state of the user vehicle, the user's line-of-sight information, the user's concentration, the user's utterance content, the user's surrounding information, and the positional relationship between the user vehicle and surrounding vehicles.
- the recognition unit 1101 can automatically recognize the target vehicle by, for example, comprehensively interpreting various types of obtained information (user environment information) using machine learning.
- the recognition unit 1101 may recognize the target vehicle in accordance with a user instruction input from the operation input unit 1303, the microphone 1304, or the like.
- FIG. 4 is a diagram for explaining an example of a method for recognizing a target vehicle according to the present embodiment.
- based on the positional relationship between the vehicle 10A and the vehicle 10B, the information processing apparatus 100 mounted on the vehicle 10A recognizes the vehicle 10B as the target vehicle that the user is paying attention to.
- the positional relationship between the vehicle 10A and the vehicle 10B is acquired based on, for example, an image obtained from a camera (an example of the sensor 1302), distance information obtained from a stereo camera, distance information obtained from an infrared sensor, and a signal obtained from the vehicle 10B via the inter-vehicle communication unit 1201. Further, as illustrated on the right side of FIG. 4, the vehicle 10C being watched by the user may be recognized as the target vehicle based on the driver's line-of-sight information.
- the line-of-sight information is obtained from a line-of-sight sensor which is an example of the sensor 1302.
- the vehicle 10C existing in the line-of-sight direction indicated by the line-of-sight information is recognized from an image captured by an outward-facing camera, which is an example of the sensor 1302.
- when the driver's line of sight is directed at a side mirror or the rearview mirror, the recognition unit 1101 recognizes the vehicle reflected in the mirror based on a captured image of that mirror.
- the recognition unit 1101 can further use the driver's degree of concentration, obtained from a vital sensor that is an example of the sensor 1302, to determine more accurately the target vehicle to which the driver is paying attention.
- the recognition unit 1101 can also perform individual recognition of the recognized target vehicle.
- the individual identification information (ID) of the target vehicle may be extracted from a signal received from the target vehicle via the inter-vehicle communication unit 1201, or may be obtained by querying a database based on the vehicle number, vehicle type, and color information acquired by image recognition. A vehicle that is not yet in the database may be automatically newly registered on the database side.
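As an illustrative sketch only (the disclosure specifies no implementation), recognizing the target vehicle from line-of-sight information and positional relationships could look like the following Python; the `Vehicle` type, the coordinate convention, and the 10-degree acceptance threshold are all assumptions introduced for this sketch:

```python
import math
from dataclasses import dataclass

@dataclass
class Vehicle:
    vehicle_id: str  # stands in for the individual identification information (ID)
    x: float         # position relative to the user vehicle in metres (x = right)
    y: float         # metres (y = straight ahead)

def recognize_target_vehicle(vehicles, gaze_angle_deg, max_angle_deg=10.0):
    """Pick the vehicle closest to the driver's line-of-sight direction.

    gaze_angle_deg: 0 = straight ahead, positive = to the right.
    Returns None when no vehicle lies within max_angle_deg of the gaze.
    """
    best, best_diff = None, max_angle_deg
    for v in vehicles:
        # bearing of the vehicle as seen from the driver, in degrees
        bearing = math.degrees(math.atan2(v.x, v.y))
        diff = abs(bearing - gaze_angle_deg)
        if diff <= best_diff:
            best, best_diff = v, diff
    return best

vehicles = [Vehicle("10B", -3.0, 20.0), Vehicle("10C", 5.0, 15.0)]
target = recognize_target_vehicle(vehicles, gaze_angle_deg=18.0)
print(target.vehicle_id)  # the vehicle near the gaze direction: 10C
```

A real implementation would fuse this geometric cue with the captured images, the degree of concentration, and machine learning as described above; the sketch shows only the gaze-to-vehicle matching step.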
- the user evaluation acquisition unit 1102 acquires information related to the evaluation of the target vehicle by the user (here, the driver). Specifically, for example, the user evaluation acquisition unit 1102 acquires from the operation input unit 1303, as user evaluation information, the fact that the "Thanks" button (an example of the operation input unit 1303) has been pressed. In addition, the user evaluation acquisition unit 1102 analyzes the user's uttered voice collected by the microphone 1304 and acquires the user's comment, converted into text, as user evaluation information.
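A minimal sketch of the user evaluation acquisition unit 1102; the keyword matching below is a hypothetical stand-in for the utterance analysis, which the disclosure does not specify:

```python
import re

# Hypothetical keyword lists standing in for speech analysis (an assumption;
# the disclosure does not say how utterances are classified).
POSITIVE_PATTERNS = ["thank"]
NEGATIVE_PATTERNS = ["dangerous", "sudden"]

def acquire_user_evaluation(thanks_button_pressed, utterance=None):
    """Return (polarity, comment) as user evaluation information, or None."""
    if thanks_button_pressed:
        # pressing the "Thanks" button is itself the evaluation
        return ("positive", "Thanks")
    if utterance:
        text = utterance.lower()
        if any(re.search(p, text) for p in POSITIVE_PATTERNS):
            return ("positive", utterance)  # text-converted user comment
        if any(re.search(p, text) for p in NEGATIVE_PATTERNS):
            return ("negative", utterance)
    return None  # no evaluation was given

print(acquire_user_evaluation(True))                          # ('positive', 'Thanks')
print(acquire_user_evaluation(False, "Thank you for giving way!"))
```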
- the notification information generation unit 1103 generates information for notifying the user evaluation for the target vehicle, based on the user evaluation acquired by the user evaluation acquisition unit 1102 and the target vehicle recognized by the recognition unit 1101 (here, the vehicle that the driver is paying attention to, which is the evaluation target).
- the notification information generation unit 1103 generates an evaluation display screen that displays user evaluation for the target vehicle.
- the notification information generation unit 1103 can also generate a reason for user evaluation based on at least one of the recognized information on the target vehicle and the user's environment information.
- the reason for the generated user evaluation can be associated with the user evaluation and stored in the storage unit 150 or transmitted from the communication unit 120 to the target vehicle or a third-party vehicle.
- the notification control unit 1104 controls to notify the notification information generated by the notification information generation unit 1103 to a predetermined notification destination.
- the notification control unit 1104 controls the generated notification information to be transmitted to the target vehicle by the inter-vehicle communication unit 1201.
- the notification control unit 1104 may perform control so as to notify the driver of the evaluation information for the host vehicle received by the inter-vehicle communication unit 1201 from the display unit 1401 or the speaker 1402.
- the notification control unit 1104 may perform control so as to notify the driver of the evaluation information of the target vehicle received by the inter-vehicle communication unit 1201 from the display unit 1401 or the speaker 1402.
- the transmission / reception of the evaluation information is not limited to vehicle-to-vehicle communication, and may be performed via the road-to-vehicle communication unit 1202 or the network I / F unit 1203.
- the evaluation information may include unique identification information of the target vehicle, an evaluation reason, and the like.
- the estimation unit 1105 estimates the type of information that the driver currently desires, that is, what information should be notified to the driver. For example, the estimation unit 1105 comprehensively interprets, using machine learning, at least one of the driver's line-of-sight information obtained from the line-of-sight sensor, the driver's degree of concentration obtained from the vital sensor, the driving situation and driving content obtained from the in-vehicle device 1301, and the content of the driver's utterances obtained from the microphone 1304, and estimates what information is currently appropriate for the driver.
- for example, the estimation unit 1105 estimates that the user wants to know only the speed of other vehicles and dangerous vehicles while cruising on a highway, that the user wants to know the moving direction (straight ahead, right turn, left turn), acceleration/deceleration information, and blind-spot information of other vehicles, and that the user wants to know the evaluation of surrounding vehicles while the vehicle is stopped. Further, the estimation unit 1105 estimates that the driver wants to know the degree of danger of the target vehicle when the driver utters "That car is swerving...", and estimates that the driver wants to know the addresses (prefecture information) of the drivers of surrounding vehicles when the driver utters "Where has everyone come from?".
- the estimation unit 1105 may adjust the content and amount of information to be presented to the driver, the presentation method, and the like according to the driver's degree of concentration obtained by the vital sensor. For example, when the driver's degree of concentration is low, the estimation unit 1105 adjusts the presentation so that the more critical information is given priority, thereby alerting the driver.
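The estimation described above could be sketched as a simple situation-to-information mapping; the situation labels, the concentration threshold, and the "critical information" set below are illustrative assumptions, and the disclosure itself envisions machine-learning interpretation rather than a fixed table:

```python
def estimate_desired_info(driving_situation, concentration):
    """Map the driving situation to the information types presumably desired.

    The mapping follows the examples in the text; the situation labels and
    the 0.5 concentration threshold are invented for this sketch.
    """
    table = {
        "highway_cruising": ["speed", "dangerous_vehicles"],
        "ordinary_road":    ["moving_direction", "acceleration", "blind_spot"],
        "stopped":          ["evaluation"],
    }
    info = table.get(driving_situation, [])
    # When concentration is low, prioritize only the more critical information
    # to alert the driver.
    if concentration < 0.5:
        critical = {"dangerous_vehicles", "blind_spot"}
        info = [i for i in info if i in critical] or info[:1]
    return info

print(estimate_desired_info("highway_cruising", concentration=0.9))
# ['speed', 'dangerous_vehicles']
```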
- when the estimation unit 1105 has estimated the information currently appropriate for the driver, it requests the estimated information from a predetermined request destination via the communication unit 120.
- the estimated information may include the speed of surrounding vehicles, dangerous vehicle information (the degree of risk of surrounding vehicles), the moving direction of other vehicles (straight ahead, right turn, left turn), acceleration/deceleration information, blind-spot information, or evaluation information.
- the corresponding information is acquired from the surrounding vehicles by the inter-vehicle communication unit 1201.
- regarding the evaluation information, a vehicle with a low evaluation and bad driving-manner evaluations is regarded as a dangerous vehicle (high degree of risk).
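As a hedged illustration of how accumulated evaluations might be mapped to a degree of risk, the following sketch uses thresholds invented for the example; they do not appear in the disclosure:

```python
def risk_level(thanks_count, negative_comment_count):
    """Classify a surrounding vehicle from its accumulated evaluations.

    Thresholds are illustrative assumptions, not values from the disclosure.
    """
    if negative_comment_count > thanks_count:
        # low evaluation, bad driving manners -> dangerous vehicle
        return "high"
    if thanks_count >= 10:
        return "low"
    return "medium"

print(risk_level(thanks_count=2, negative_comment_count=5))  # high
```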
- Display unit 1401 and speaker 1402 are examples of output unit 140.
- the display unit 1401 is realized by a display device such as a liquid crystal display (LCD) device or an organic EL (OLED: Organic Light Emitting Diode) display device.
- the display unit 1401 may be a transmissive/semi-transmissive head-up display provided at a position within the driver's field of view that can be read without shifting the line of sight from the road ahead, or AR display may be performed on the windshield.
- the display unit 1401 may be realized by an HMD (head mounted display) worn by the driver, and AR display may be performed on the HMD. In the AR display, information is displayed so as to correspond to an actual vehicle.
- FIG. 5 is a flowchart showing an evaluation input process to another vehicle according to the first embodiment.
- first, the user evaluation acquisition unit 1102 of the information processing apparatus 100 acquires, as user evaluation information, the fact that the user (here, the driver) has pressed the "Thanks" button, detected by the operation input unit 1303 (step S103).
- the user evaluation acquisition unit 1102 recognizes (converts to text) the driver's utterance collected by the microphone 1304 and acquires it as user evaluation information (step S106).
- next, the recognition unit 1101 of the information processing apparatus 100 acquires information used for recognizing the evaluation target vehicle from the sensor 1302 and the in-vehicle device 1301 (step S109). For example, the recognition unit 1101 acquires the driver's line-of-sight information from the line-of-sight sensor, a captured image of the surrounding vehicles from the camera, and positional-relationship information with the surrounding vehicles, the driving situation of the own vehicle, and the like.
- the recognition unit 1101 then recognizes the vehicle that the driver is paying attention to (that is, the vehicle that is the target of the user evaluation) based on the acquired information (step S112). For example, when the "Thanks" button is pressed or the driver says "thank you" as the user vehicle joins the main line, the recognition unit 1101 recognizes, based on the positional-relationship information and the captured image, that the other vehicle running behind after the merge (the other vehicle that gave way at the time of joining) is the vehicle to be evaluated. Further, when the user vehicle brakes suddenly and the driver utters "Dangerous!", the recognition unit 1101 recognizes, based on the line-of-sight information and the captured image, the other vehicle that has appeared (cut in) in front of the user vehicle as the vehicle to be evaluated.
- the recognition unit 1101 may acquire individual identification information of the target vehicle.
- the individual identification information of the target vehicle may be acquired from the management server 4 on the network based on information such as the number, vehicle type, and color of the target vehicle obtained by analyzing the captured image, or may be acquired by inter-vehicle communication.
- the notification information generation unit 1103 provides information for notifying the user evaluation for the target vehicle based on the user evaluation acquired by the user evaluation acquisition unit 1102 and the information on the target vehicle recognized by the recognition unit 1101. Generate (step S115). Such information may include individual identification information of the target vehicle.
- the notification control unit 1104 transmits the generated notification information to the recognized vehicle via the inter-vehicle communication unit 1201 (step S118). Further, the notification control unit 1104 may transmit the generated notification information to the management server 4 on the network via the network I / F unit 1203. Further, the notification control unit 1104 may notify the user (the driver of the user vehicle) that the user evaluation for the target vehicle has been performed from the display unit 1401 or the speaker 1402. Such notification information may be notified in real time when evaluation is performed.
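Steps S103 to S118 above can be sketched end to end as follows; the dictionary-based sensor input and the `send` callback standing in for the inter-vehicle communication unit 1201 are simplifications introduced for the example:

```python
def evaluation_input_flow(button_pressed, utterance, sensor_info, send):
    """Sketch of steps S103-S118: acquire evaluation, recognize target, notify.

    `sensor_info` carries a pre-recognized target id here; the real recognizer
    fuses line-of-sight, captured images, and positional relationships.
    """
    # S103/S106: acquire the user evaluation from the button or speech
    if button_pressed:
        evaluation = "Thanks"
    elif utterance:
        evaluation = utterance
    else:
        return None  # no evaluation was performed
    # S109/S112: recognize the evaluation target vehicle
    target_id = sensor_info.get("attended_vehicle_id")
    if target_id is None:
        return None
    # S115: generate the notification information
    notification = {"target": target_id, "evaluation": evaluation}
    # S118: transmit via inter-vehicle communication (callback stands in here)
    send(notification)
    return notification

sent = []
evaluation_input_flow(True, None, {"attended_vehicle_id": "10B"}, sent.append)
print(sent)  # [{'target': '10B', 'evaluation': 'Thanks'}]
```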
- FIG. 6 is a diagram illustrating a display example when a user evaluation is notified in the target vehicle.
- here, a case will be described in which the driver of the vehicle 10A evaluates the vehicle 10B that has given way, and the notification information for notifying the evaluation is transmitted from the vehicle 10A to the vehicle 10B and AR-displayed on the windshield.
- on the windshield 1401B of the vehicle 10B, a frame image 30 displayed so as to surround the vehicle 10A (to identify the vehicle that performed the evaluation) and an evaluation information image 31 that displays evaluation information, such as the fact that the "Thanks" button was pressed and a comment from the vehicle 10A, are displayed as AR.
- the user evaluation reason (evaluation reason) may be displayed together with the user evaluation.
- the information processing apparatus 100 may perform control so as to prompt the user to make an evaluation. More specifically, for example, when the recognition unit 1101 determines that another vehicle has "given way" to the user, the notification information generation unit 1103 generates notification information suggesting that a "Thanks" be sent to that vehicle. Then, the notification control unit 1104 notifies the user of the generated notification information from the display unit 1401 or the speaker 1402, thereby proposing a positive evaluation.
- FIG. 7 is a flowchart showing a display process of evaluation information of other vehicles.
- first, the estimation unit 1105 of the information processing apparatus 100 acquires, from the sensor 1302 and the in-vehicle device 1301, information for estimating the information required by the user (here, the driver) (step S123).
- for example, the estimation unit 1105 acquires line-of-sight information and a captured image from the sensor 1302, and the current position, the driving situation, and the like from the in-vehicle device 1301.
- the estimation unit 1105 estimates information that the driver currently needs (or information that is currently appropriate for the driver) based on the acquired information (step S126).
- for example, it is estimated that the driver wants to know the evaluation information of the surrounding vehicles, or that the evaluation information of the surrounding vehicles is currently appropriate for the driver.
- the inter-vehicle communication unit 1201 acquires information related to the evaluation from the surrounding vehicles according to the estimation result of the estimation unit 1105 (step S129). Specifically, evaluation information from other users accumulated in each vehicle is acquired.
- the display device 1401d displays a user vehicle (vehicle 10D) and surrounding vehicles 10B and 10C on a map.
- the image displayed on the display device 1401d can be generated by the notification control unit 1104, for example.
- the positional relationship of each vehicle is grasped based on a signal received from each vehicle or an image captured by a camera.
- the evaluation information display images 36 and 37, which display the number of "Thanks" and evaluation comments, are displayed so as to correspond to the vehicles 10B and 10C, for example partly superimposed on them. The user of the vehicle 10D can thereby grasp the evaluations of the surrounding vehicles.
- the estimation unit 1105 estimates information that the driver currently needs (or information that is currently appropriate for the driver) based on the acquired information (step S206).
- for example, the estimation unit 1105 estimates that the driver currently wants to know the speed information of the surrounding vehicles, or that such speed information is currently appropriate for the driver.
- FIG. 14 is a diagram illustrating a case where speed information of other vehicles traveling in the vicinity is displayed on the Bird View screen.
- a CG image showing the user vehicle (vehicle 10D) and the surrounding vehicles 10B and 10C as viewed from a viewpoint above and behind the vehicle is displayed on the screen of the display device 1401d.
- the positional relationship of each vehicle is grasped based on a signal received from each vehicle or an image captured by a camera.
- the speed information display images 49 and 50, which show speed information, are displayed, for example, partly superimposed on the vehicles 10B and 10C so as to correspond to them. Thereby, the user of the vehicle 10D can grasp the speeds of the surrounding vehicles at a glance.
- other vehicles also estimate the speed of the vehicle 10C, and the vehicle 10D can acquire the speed information of the vehicle 10C estimated by those other vehicles via inter-vehicle communication or network communication.
- the information processing apparatus 100 of the vehicle 10D may adopt and display information having the highest reliability.
- each piece of speed information includes a speed error (± km/h) corresponding to the distance from the estimating vehicle to the estimation target.
- in the speed information of the vehicles 10A to 10C estimated by the vehicle 10D based on captured images and the like, the speed error increases with the distance of the target vehicle from the vehicle 10D, as shown in the figure. For speed information that a vehicle detects and reports about itself, the speed error is 0 km/h.
- the notification control unit 1104 of the information processing apparatus 100 mounted on the vehicle 10D adopts, for each other vehicle, the speed information with the smallest speed error, and controls the display unit 1401 to display it. In the example shown in FIG. 16, the speed "95 km/h (error 0 km/h)" estimated by the vehicle 10A itself is adopted as the speed information of the vehicle 10A, the speed "86 km/h (error 0 km/h)" estimated by the vehicle 10B itself is adopted as the speed information of the vehicle 10B, and the speed "94 km/h (error 1 km/h)" estimated by the vehicle 10B, the vehicle closest to the vehicle 10C, is adopted as the speed information of the vehicle 10C.
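The smallest-error selection rule described here can be sketched as follows (the data layout and function name are illustrative assumptions, not from the disclosure; the adopted values mirror the FIG. 16 example):

```python
# Sketch of the selection rule: for each target vehicle, adopt the speed
# estimate with the smallest error among all estimates received. A vehicle's
# self-report has error 0; remote estimates grow with distance.

estimates = {
    # target vehicle: list of (estimating vehicle, speed km/h, error km/h)
    "10A": [("10A", 95, 0), ("10D", 93, 3)],
    "10B": [("10B", 86, 0), ("10D", 88, 2)],
    "10C": [("10B", 94, 1), ("10D", 96, 4)],
}

def adopt_speeds(estimates):
    adopted = {}
    for target, candidates in estimates.items():
        source, speed, error = min(candidates, key=lambda c: c[2])
        adopted[target] = {"speed": speed, "error": error, "source": source}
    return adopted

for target, info in adopt_speeds(estimates).items():
    print(f"{target}: {info['speed']} km/h (error {info['error']} km/h, "
          f"estimated by {info['source']})")
```

With the FIG. 16 values, the vehicle 10C's adopted speed comes from the vehicle 10B, the closest observer, exactly as described above.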
- FIG. 17 is a diagram for explaining the outline of the information processing system according to the third embodiment.
- the driver of the vehicle 10A, waiting to turn right at an intersection, can see that the oncoming vehicle 10B is decelerating in response to the yellow signal, but cannot visually grasp the movement of the vehicle 10E (here, a motorcycle), which is in the blind spot of the vehicle 10B.
- if the vehicle 10A attempts to turn right at the intersection in this situation, there is a high possibility that it will collide with the vehicle 10E going straight and cause an accident. Therefore, in the present embodiment, when the vehicle is about to make a turn, safer driving can be supported by acquiring the acceleration/deceleration of surrounding vehicles and notifying the driver.
- by grasping that the vehicle 10E in the blind spot of the vehicle 10B is accelerating, the driver can safely turn right after waiting for the vehicle 10E to pass straight through.
- FIG. 18 is a flowchart showing acceleration information display processing of another vehicle according to the third embodiment.
- the estimation unit 1105 of the information processing apparatus 100 acquires, from the sensor 1302 and the in-vehicle device 1301, information for estimating what the user (here, the driver) needs (step S303).
- the estimation unit 1105 acquires line-of-sight information, a captured image, the current position, the driving situation, and the like from the sensor 1302 and the in-vehicle device 1301.
- the estimation unit 1105 estimates information that the driver currently needs (or information that is currently appropriate for the driver) based on the acquired information (step S306).
- for example, the estimation unit 1105 estimates that the driver currently wants to know the acceleration/deceleration information of the surrounding vehicles, or that such information is currently appropriate for the driver.
- the inter-vehicle communication unit 1201 acquires acceleration/deceleration information from the surrounding vehicles according to the estimation result of the estimation unit 1105 (step S309). Specifically, acceleration/deceleration information detected by the in-vehicle devices of each vehicle is acquired. As shown in FIG. 17, even when the vehicle 10A and the vehicle 10E are too far apart for direct inter-vehicle communication, the vehicle 10A may acquire information on the vehicle 10E from the vehicle 10B, which holds it as surrounding-vehicle information. As a result, the vehicle 10A can also acquire information on a bicycle or motorcycle several tens of meters ahead, albeit with low reliability.
- FIG. 19 shows a diagram for explaining a case where acceleration / deceleration information of other vehicles in the vicinity is displayed.
- the information processing apparatus 100 mounted on the vehicle 10D acquires acceleration information of vehicles around the intersection and notifies the driver of the vehicle 10D.
- speed display images 51 and 52 that show the acceleration/deceleration of each vehicle are AR-displayed on the windshield 1401D of the vehicle 10D.
- an emphasis display 53 for clearly indicating the hidden vehicle 10E is displayed.
- the driver of the vehicle 10D can easily grasp that the oncoming vehicle 10B is decelerating but that the vehicle 10E exists in its blind spot and is accelerating, so accidents can be prevented in advance.
- acceleration / deceleration information of the vehicle 10B is received from the vehicle 10B.
- the present embodiment is not limited to this; the recognition unit 1101 of the information processing apparatus 100 can also estimate the acceleration/deceleration of the target vehicle based on analysis of images captured by a camera, distance information from a stereo camera, or distance information from an infrared sensor.
- the notification control unit 1104 may control so as to adopt the most reliable of the acquired acceleration/deceleration information of the surrounding vehicles and notify the driver of it.
- this will be specifically described with reference to FIG.
- FIG. 20 is a diagram showing a list of acceleration / deceleration information of surrounding vehicles acquired by the vehicle 10A (own vehicle).
- the acceleration/deceleration information of the vehicles 10B and 10E as estimated by the vehicle 10A (own vehicle) and by the vehicle 10B is shown.
- the vehicle 10A can acquire acceleration / deceleration information of the vehicle 10A and the vehicle 10E estimated by the vehicle 10B via inter-vehicle communication or network communication.
- the positional relationship between the vehicles 10A, 10B, and 10E is as shown in FIG. 17, and the vehicle 10E is hidden in the blind spot of the vehicle 10B.
- the vehicle 10E is a vehicle that does not support communication.
- each piece of acceleration/deceleration information includes a reliability (in percent) corresponding to the positional relationship between the estimating vehicle and the estimation target.
- the recognition unit 1101 of the information processing apparatus 100 mounted on the vehicle 10A can estimate acceleration / deceleration of the vehicle 10B based on a captured image by a camera, distance information by a stereo camera, distance information by an infrared sensor, or the like.
- the reliability of acceleration is 40% and the reliability of deceleration is 60%.
- the reliability is 100%.
- since the vehicle 10E cannot be detected from the vehicle 10A by a camera, an infrared sensor, or the like, estimation by the vehicle 10A is impossible. Further, the estimation of the vehicle 10E by the vehicle 10B has a reliability of 100% because the vehicle 10E is close to the vehicle 10B.
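The reliability-based adoption described around FIG. 20 can be sketched as follows (a minimal illustration under assumed data structures; the states and percentages mirror the example above, where the vehicle 10E, hidden in the blind spot, is observable only by the nearby vehicle 10B):

```python
# Sketch of adopting the most reliable acceleration/deceleration estimate
# per target vehicle. None marks a target the estimator cannot observe
# (e.g. hidden in a blind spot). Values follow the FIG. 20 example.

reports = [
    # (estimator, target, state, reliability %)
    ("10A", "10B", "decelerating", 60),   # camera/IR estimate by own vehicle
    ("10B", "10B", "decelerating", 100),  # self-report, fully reliable
    ("10B", "10E", "accelerating", 100),  # 10E is close to 10B
    ("10A", "10E", None, 0),              # 10E hidden in 10B's blind spot
]

def adopt_motion(reports):
    best = {}
    for estimator, target, state, reliability in reports:
        if state is None:
            continue  # estimator could not observe this target
        if target not in best or reliability > best[target][1]:
            best[target] = (state, reliability, estimator)
    return best

for target, (state, reliability, estimator) in adopt_motion(reports).items():
    print(f"{target}: {state} ({reliability}% reliable, via {estimator})")
```

Here the relayed estimate of the vehicle 10E (via the vehicle 10B) is adopted even though the host vehicle itself cannot observe 10E.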
- FIG. 21 is a diagram for explaining the outline of the information processing system according to the fourth embodiment.
- the driver of the vehicle 10B is concerned about whether the driver of the vehicle 10A has noticed the host vehicle (the vehicle 10B), which is in the blind spot at the right rear of the vehicle 10A. If it turns out that the vehicle 10A is aware of the host vehicle, there is no risk of a sudden lane change, so the driver of the vehicle 10B can decelerate and wait for the other vehicle to change lanes. Whether or not the vehicle 10A is aware of the host vehicle is notified from the vehicle 10A by, for example, inter-vehicle communication.
- the inter-vehicle communication unit 1201 acquires the recognition information from the target vehicle according to the estimation result of the estimation unit 1105 (step S409). Specifically, driver recognition information detected from an in-vehicle device or a line-of-sight sensor of the target vehicle, or vehicle recognition information in the case of automatic driving is acquired.
- FIG. 23 is a diagram illustrating a case where the recognition information of the target vehicle is displayed.
- the notice display image 55 indicating that the target vehicle recognizes the host vehicle is AR-displayed on the windshield 1401B of the vehicle 10B.
- the target vehicle is identified by the recognition unit 1101 based on the line-of-sight information of the driver of the vehicle 10B, the captured image of the camera, and the like.
- a question mark may be displayed when the target vehicle does not recognize the host vehicle or when it is unknown whether or not it is recognized.
- the driver of the vehicle 10B can easily grasp whether the target vehicle in the front left lane, which has turned on its right turn signal and is about to change lanes, is aware of the host vehicle.
- FIG. 24 is a diagram for explaining a case where the recognition information of the target vehicle is displayed on the Top View screen.
- the display device 1401b installed around the steering wheel displays the user vehicle (vehicle 10B) and the target vehicle 10A on a map, indicating that the vehicle 10A is aware of the vehicle 10B.
- the notice display image 56 is displayed so as to correspond to the vehicle 10A. Thereby, the user of the vehicle 10B can easily grasp that the target vehicle is aware of the own vehicle.
- the information processing apparatus 100 executes a risk avoidance action such as automatically turning on the hazard lamp of the host vehicle (step S418).
- the information processing apparatus 100 may notify the target vehicle of the presence of the host vehicle through inter-vehicle communication. As a result, the target vehicle can be made aware of the presence of the host vehicle, and danger can be avoided.
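The reaction to the target vehicle's awareness described in this embodiment can be summarized in a small decision sketch (the function and action names are hypothetical; the disclosure specifies the behaviors, such as turning on the hazard lamps, not an API):

```python
# Hypothetical decision sketch for the fourth embodiment: show the target's
# awareness to the driver, and when the target is not aware of the host
# vehicle, trigger risk-avoidance actions such as turning on the hazard
# lamps or notifying the target via inter-vehicle communication.

def react_to_awareness(awareness):
    """awareness: 'aware', 'unaware', or 'unknown'."""
    if awareness == "aware":
        return ["show_awareness_mark"]        # e.g. notification image 55
    actions = ["show_question_mark"]          # target may not see us
    if awareness == "unaware":
        actions += ["turn_on_hazard_lamps",   # risk avoidance (step S418)
                    "notify_presence_via_v2v"]
    return actions

print(react_to_awareness("unaware"))
```

A question mark is shown whenever awareness cannot be confirmed, matching the display behavior described above.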
- a notification image 57 indicating the probability that the host vehicle (the vehicle 10A) will succeed in overtaking is displayed on the Top View screen.
- traveling navigation for overtaking is displayed with arrows.
- FIG. 26 and FIG. 27 are diagrams for explaining warning notification based on information acquired from other nearby vehicles.
- the warning notification is displayed as AR on the windshield 1401A of the vehicle 10A.
- These warning notifications are presented based on the recognition result received from the other vehicle by the inter-vehicle communication unit 1201 or the like.
- the information processing apparatus 100 of the vehicle 10A that has received the recognition result displays, as shown in the figure, a notification image 59 indicating a warning about the vehicle 10H as AR on the windshield 1401A.
- further, a notification image 60 warning that the vehicle 10I traveling in the right lane is not insured may be displayed on the windshield 1401A of the vehicle 10A. In this way, by acquiring a vehicle profile such as car insurance subscription information, the driver can be alerted. Such information can be acquired from the target vehicle 10I.
- the information processing apparatus 100 of the vehicle 10A displays a notification image 62 indicating a warning about the vehicle 10J in the blind spot as AR on the windshield 1401A.
- FIG. 28 is a diagram for explaining the overall configuration of the information processing system according to the sixth embodiment.
- the information processing system according to the present embodiment includes a roadside apparatus 3 that performs wireless communication with each vehicle 10 (specifically, for example, the vehicles 10A to 10E), a management server 4, and a communication terminal 6.
- the management server 4 appropriately transmits information acquired from the vehicle 10 via the roadside apparatus 3 (hereinafter also referred to as vehicle information) to the communication terminal 6 via the network 5.
- as the communication terminal 6, for example, a smartphone 6A, a smart watch 6B, a tablet terminal 6C, or a PC (personal computer) 6D is assumed.
- a method for utilizing vehicle information will be described with reference to FIGS.
- the recognition unit 1101 identifies the driver based on a fingerprint sensor provided on the steering wheel, a face image captured by a camera, personal authentication by a communication terminal such as a smartphone owned by the driver, or the like, and the identification result is transmitted to the management server 4 together with the vehicle information.
- the evaluation in the first embodiment is not limited to the evaluation from an automobile or a motorcycle to an automobile, but may be an evaluation from a bicycle or a pedestrian to an automobile / motorcycle. Further, the evaluation information of the target vehicle may be presented to a bicycle or a pedestrian.
- a vehicle with an extremely bad evaluation (for example, one recognized as dangerous by a plurality of users)
- the information processing system may store, in association with the evaluation, a captured image of the driving situation of the host vehicle and its surroundings at the time it was evaluated by another vehicle, so that the scene can be replayed later.
- the information processing system may verify the consistency between the evaluation information for the host vehicle and the actual driving state of the host vehicle, and, in order to prevent inappropriate evaluations, may refrain from notifying the driver of evaluations with low reliability.
- the information processing system may learn, from the relationship between past driving situations and evaluations, the probability that an evaluation is made in a given driving situation, and may suppress the display of evaluations from other vehicles made when that probability is low.
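A minimal sketch of this learned filtering, under assumed data (the situation labels, history, and 0.5 threshold are illustrative, not from the disclosure):

```python
# Sketch: from past (situation, evaluated?) pairs, estimate how often an
# evaluation is actually made in each driving situation, and hide incoming
# evaluations whose situation rarely produces one.

from collections import defaultdict

history = [
    ("yielded_at_merge", True), ("yielded_at_merge", True),
    ("yielded_at_merge", False), ("normal_cruising", False),
    ("normal_cruising", False), ("normal_cruising", True),
]

def evaluation_probability(history):
    counts = defaultdict(lambda: [0, 0])  # situation -> [evaluated, total]
    for situation, evaluated in history:
        counts[situation][1] += 1
        if evaluated:
            counts[situation][0] += 1
    return {s: e / t for s, (e, t) in counts.items()}

def should_display(situation, probs, threshold=0.5):
    return probs.get(situation, 0.0) >= threshold

probs = evaluation_probability(history)
print(should_display("yielded_at_merge", probs))  # frequent -> display
print(should_display("normal_cruising", probs))   # rare -> suppress
```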
- the information processing system may notify the driver of multiple types of information by combining the above-described embodiments.
- the evaluation information and speed information of surrounding vehicles may be displayed together.
- information display priorities may be assigned according to the driving conditions and surrounding conditions so that high-priority items are displayed. For example, at an intersection, information useful for preventing the accidents expected at the intersection is preferentially displayed.
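Such priority-ordered display could be sketched as follows (the situations, item names, and weights are hypothetical; the disclosure only states that accident-prevention information takes priority at intersections):

```python
# Hypothetical sketch of priority-ordered display: weight each available
# information item for the current driving context and show only the top
# items. Near an intersection, accident-prevention items rank first.

PRIORITY = {
    "intersection": {"acceleration": 3, "blind_spot_warning": 3,
                     "speed": 2, "evaluation": 1},
    "highway":      {"speed": 3, "evaluation": 2,
                     "acceleration": 1, "blind_spot_warning": 1},
}

def items_to_display(context, available, limit=2):
    weights = PRIORITY.get(context, {})
    ranked = sorted(available, key=lambda i: weights.get(i, 0), reverse=True)
    return ranked[:limit]

print(items_to_display("intersection",
                       ["evaluation", "speed", "acceleration"]))
```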
- the information processing apparatus 100 may be realized as an apparatus mounted on any type of vehicle such as an automobile, an electric vehicle, a hybrid electric vehicle, and a motorcycle. Further, at least a part of the components of the information processing apparatus 100 may be realized in a module for an apparatus mounted on a vehicle (for example, an integrated circuit module configured by one die).
- FIG. 33 is a block diagram illustrating an example of a schematic configuration of a vehicle control system 900 to which the technology according to the present disclosure can be applied.
- the vehicle control system 900 includes an electronic control unit 902, a storage device 904, an input device 906, a vehicle exterior sensor 908, a vehicle state sensor 910, a passenger sensor 912, a communication IF 914, an output device 916, a power generation device 918, a braking device 920, a steering 922, and a lamp actuating device 924.
- the electronic control unit 902 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the vehicle control system 900 according to various programs.
- the electronic control unit 902 can be formed as an ECU (Electronic Control Unit) together with a storage device 904 described later.
- a plurality of ECUs may be included in the vehicle control system 900.
- each of the various sensors or the various drive systems may be provided with an ECU for controlling them, and further provided with an ECU for controlling the plurality of ECUs in a coordinated manner.
- the plurality of ECUs are connected via an in-vehicle communication network conforming to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay.
- the electronic control unit 902 can form, for example, the control unit 110 shown in FIG.
- the input device 906 may include, for example, an input control circuit that generates an input signal based on information input by the user using the above-described input means and outputs the input signal to the electronic control unit 902.
- the passenger can operate the input device 906 to input various data to the vehicle control system 900 and to instruct processing operations.
- the input device 906 can form, for example, the input unit 130 shown in FIG.
- the vehicle outside sensor 908 is realized by a sensor that detects information outside the vehicle.
- the vehicle exterior sensor 908 may include a sonar device, a radar device, a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device, a camera, a stereo camera, a ToF (Time of Flight) camera, an infrared sensor, an environmental sensor, a microphone, and the like.
- the vehicle exterior sensor 908 may form, for example, the sensor 1302 shown in FIG.
- the braking device 920 is a device for applying a braking force to the vehicle, or decelerating or stopping the vehicle.
- the braking device 920 may include, for example, a brake installed on each wheel, a brake pipe or an electric circuit for transmitting the depression pressure of the brake pedal to the brake, and the like.
- the braking device 920 may include a control device for operating a sliding or skid prevention mechanism by brake control such as ABS (Antilock Brake System) or ESC (Electronic Stability Control).
- the power generation device 918, the braking device 920, the steering 922, and the lamp operation device 924 may operate based on manual operation by a driver or based on automatic operation by the electronic control unit 902.
- FIG. 34 is a block diagram illustrating an example of a hardware configuration of the information processing apparatus according to the present embodiment. Note that the information processing apparatus 1000 illustrated in FIG. 34 can implement, for example, the management server 4 illustrated in FIG. 28. Information processing by the management server 4 according to the present embodiment is realized by the cooperation of software and the hardware described below.
- the information processing apparatus 1000 includes a CPU (Central Processing Unit) 1001, a ROM (Read Only Memory) 1002, a RAM (Random Access Memory) 1003, and a host bus 1004a.
- the information processing apparatus 1000 includes a bridge 1004, an external bus 1004b, an interface 1005, an input device 1006, an output device 1007, a storage device 1008, a drive 1009, a connection port 1011, and a communication device 1013.
- the information processing apparatus 1000 may include a processing circuit such as a DSP or an ASIC in place of or in addition to the CPU 1001.
- the CPU 1001, the ROM 1002, and the RAM 1003 are connected to each other by a host bus 1004a including a CPU bus.
- the host bus 1004a is connected to an external bus 1004b such as a PCI (Peripheral Component Interconnect / Interface) bus through a bridge 1004.
- the output device 1007 is formed of a device capable of visually or audibly notifying the user of acquired information. Examples of such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, laser projectors, LED projectors, and lamps; audio output devices such as speakers and headphones; printer devices; and the like.
- the output device 1007 outputs results obtained by various processes performed by the information processing apparatus 1000. Specifically, the display device visually displays the results obtained by various processes performed by the information processing device 1000 in various formats such as text, images, tables, and graphs.
- the audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs it aurally.
- the storage device 1008 is a data storage device formed as an example of a storage unit of the information processing device 1000.
- the storage apparatus 1008 is realized by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
- the storage device 1008 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
- the storage device 1008 stores programs executed by the CPU 1001, various data, various data acquired from the outside, and the like.
- the drive 1009 is a reader / writer for a storage medium, and is built in or externally attached to the information processing apparatus 1000.
- the drive 1009 reads information recorded on a mounted removable storage medium such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 1003.
- the drive 1009 can also write information on a removable storage medium.
- the communication device 1013 is a communication interface formed by a communication device or the like for connecting to the network 1020, for example.
- the communication device 1013 is, for example, a communication card for wired or wireless LAN (Local Area Network), LTE (Long Term Evolution), Bluetooth (registered trademark), or WUSB (Wireless USB).
- the communication device 1013 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various communication, or the like.
- the communication device 1013 can transmit and receive signals and the like according to a predetermined protocol such as TCP / IP, for example, with the Internet or other communication devices.
- the information on other vehicles is not limited to evaluation information; it is also possible to acquire and present to the user speed information, acceleration/deceleration information, recognition information, behavior prediction recognition results, a vehicle profile such as the departure place/destination and car insurance subscription information, a driver profile, and the like.
- it is also possible to create a computer program for causing hardware such as a CPU, a ROM, and a RAM built in the information processing apparatus 100 or the management server 4 to exhibit the functions of the information processing apparatus 100 or the management server 4.
- a computer-readable storage medium storing the computer program is also provided.
- the configuration of the information processing apparatus 100 illustrated in FIGS. 2 and 3 is an example, and the present embodiment is not limited thereto.
- when the information processing apparatus 100 is realized by a smartphone, a tablet terminal, or the like, among the components shown in FIG. 3, the in-vehicle device 1301, the sensor 1302, the operation input unit 1303, the microphone 1304, the display unit 1401, the speaker 1402, and the communication unit 120 are provided on the vehicle side.
- in this case, the information processing apparatus 100 acquires and processes various information from the in-vehicle device 1301, the sensor 1302, the operation input unit 1303, and the microphone 1304 provided on the vehicle side, and controls the notification of information to the user from the display unit 1401 or the speaker 1402 provided on the vehicle side and the transmission/reception of data to/from the outside via the communication unit 120 provided on the vehicle side.
Description
1. Overview of an information processing system according to an embodiment of the present disclosure
2. Configuration
2-1. Basic configuration
2-2. Detailed configuration
3. Embodiments
3-1. First embodiment
3-2. Second embodiment
3-3. Third embodiment
3-4. Fourth embodiment
3-5. Fifth embodiment
3-6. Sixth embodiment
4. Supplement
5. Hardware configuration examples
5-1. Configuration example of the vehicle control system
5-2. Configuration example of the information processing apparatus
5-3. Others
6. Conclusion
An information processing system according to an embodiment of the present disclosure supports, for example, improvement of driving manners, prevention of accidents, and comfortable driving by effectively using information on user evaluations of a target vehicle. FIG. 1 is a diagram illustrating an overview of an information processing system according to an embodiment of the present disclosure. In the illustrated example, the vehicles 10A, 10B, and 10C have an inter-vehicle communication function capable of transmitting and receiving signals to and from other vehicles in the vicinity.
Next, the configuration of the information processing apparatus 100 that realizes the information processing system described above will be described with reference to FIGS. 2 and 3. The information processing apparatus 100 can be mounted on a moving body such as the vehicles 10A, 10B, and 10C. Alternatively, the information processing apparatus 100 can be implemented on a smartphone, a tablet terminal, a mobile phone terminal, a PC (personal computer), or the like brought into the vehicle 10.
FIG. 2 is a diagram illustrating an example of the basic configuration of the information processing apparatus 100 according to the present embodiment. As shown in FIG. 2, the information processing apparatus 100 includes a control unit 110, a communication unit 120, an input unit 130, an output unit 140, and a storage unit 150.
FIG. 3 is a diagram illustrating an example of the detailed configuration of the information processing apparatus 100 according to the present embodiment. As shown in FIG. 3, the information processing apparatus 100 includes an in-vehicle device 1301, a sensor 1302, an operation input unit 1303, a microphone 1304, an inter-vehicle communication unit 1201, a road-to-vehicle communication unit 1202, a network I/F (interface) unit 1203, a recognition unit 1101, a user evaluation acquisition unit 1102, a notification information generation unit 1103, a notification control unit 1104, an estimation unit 1105, a display unit 1401, and a speaker 1402.
<3-1. First Embodiment>
First, an information processing system according to the first embodiment will be described with reference to FIGS. 5 to 10. The first embodiment describes the process of evaluating other vehicles and the use of the evaluation information accumulated in each vehicle.
Next, an information processing system according to the second embodiment will be described with reference to FIGS. 11 to 16. The second embodiment describes the use of speed information of other vehicles.
Next, an information processing system according to the third embodiment will be described with reference to FIGS. 17 to 20. The third embodiment describes the use of information on the acceleration/deceleration of other vehicles.
Next, an information processing system according to the fourth embodiment will be described with reference to FIGS. 21 to 24. The fourth embodiment describes the use of information on whether another vehicle recognizes the host vehicle (the other vehicle's awareness of the host vehicle).
Next, an information processing system according to the fifth embodiment will be described with reference to FIGS. 25 to 27. The fifth embodiment describes the notification of travel prediction results and warnings for the user vehicle based on various information acquired from other vehicles.
Next, an information processing system according to the sixth embodiment will be described with reference to FIGS. 28 to 32. The sixth embodiment describes the case where vehicle information is disclosed to and used by others via a network.
The embodiments described above are supplemented below.
The technology according to the present disclosure can be applied to various products. For example, the information processing apparatus 100 may be realized as an apparatus mounted on any type of vehicle such as an automobile, an electric vehicle, a hybrid electric vehicle, or a motorcycle. Further, at least some of the components of the information processing apparatus 100 may be realized in a module for an apparatus mounted on a vehicle (for example, an integrated circuit module configured with one die).
FIG. 33 is a block diagram illustrating an example of a schematic configuration of a vehicle control system 900 to which the technology according to the present disclosure can be applied. The vehicle control system 900 includes an electronic control unit 902, a storage device 904, an input device 906, a vehicle exterior sensor 908, a vehicle state sensor 910, a passenger sensor 912, a communication IF 914, an output device 916, a power generation device 918, a braking device 920, a steering 922, and a lamp actuating device 924.
FIG. 34 is a block diagram illustrating an example of the hardware configuration of the information processing apparatus according to the present embodiment. Note that the information processing apparatus 1000 shown in FIG. 34 can implement, for example, the management server 4 shown in FIG. 28. Information processing by the management server 4 according to the present embodiment is realized by the cooperation of software and the hardware described below.
An example of a hardware configuration capable of realizing the functions of the information processing apparatus 100 or the management server 4 according to the present embodiment has been described above. Each of the above components may be realized using general-purpose members, or may be realized by hardware specialized for the function of each component. Accordingly, the hardware configuration to be used can be changed as appropriate according to the technical level at the time the present embodiment is implemented.
An embodiment of the present disclosure has been described above in detail with reference to FIGS. 1 to 34. As described above, the information processing system according to the embodiment of the present disclosure makes it possible to effectively use information on user evaluations of a target vehicle.
(1)
ユーザの環境情報から対象車両の情報を自動的に認識する認識部と、
前記認識した対象車両に対するユーザ評価を取得する取得部と、
前記取得したユーザ評価を通知するための情報を生成する生成部と、
を備える、情報処理装置。
(2)
The information processing apparatus according to (1), wherein
the recognition unit recognizes the target vehicle on the basis of information from a sensor provided in a user's vehicle, and
the information processing apparatus further includes:
an other-user evaluation acquisition unit that acquires information concerning evaluations, made by other users, of the target vehicle recognized by the recognition unit; and
a notification control unit that performs control such that the acquired predetermined information is reported to the user in the user's vehicle.
(3)
The information processing apparatus according to (2), wherein the other-user evaluation acquisition unit acquires information received from the target vehicle through inter-vehicle communication.
(4)
The information processing apparatus according to (2) or (3), wherein the information generated by the generation unit is reported to at least one of an evaluator, an evaluated person, and a nearby third party.
(5)
The information processing apparatus according to any one of (2) to (4), wherein the recognition unit automatically recognizes the target vehicle to be evaluated by the user on the basis of line-of-sight information, surrounding information, a positional relationship between the user's vehicle and surrounding vehicles, the user's degree of concentration, a traveling state of the user's vehicle, or the content of the user's utterance, which are included in the environment information of the user.
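Note (5) names several cues but does not say how they are combined; one common way to combine such cues is a weighted score. The weights, thresholds, and field names below are invented assumptions, not taken from the publication:

```python
# Hypothetical weighted scoring over the cues listed in note (5).
# All weights and thresholds are invented for illustration.

def score_candidate(vehicle, env):
    score = 0.0
    if vehicle["id"] == env.get("gaze_target"):
        score += 2.0                          # line-of-sight information
    if vehicle.get("distance_m", float("inf")) < 15.0:
        score += 1.0                          # positional relationship
    if vehicle["id"] in env.get("utterance", ""):
        score += 1.5                          # content of the user's utterance
    score *= env.get("concentration", 1.0)    # user's degree of concentration
    return score

def pick_target(env):
    """Return the highest-scoring candidate, or None if no cue fires."""
    best = max(env["nearby_vehicles"], key=lambda v: score_candidate(v, env))
    return best if score_candidate(best, env) > 0 else None
```

Returning None when no cue fires matters here: the recognition is "automatic", so the apparatus must be able to conclude that no vehicle is currently the evaluation target.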
(6)
The information processing apparatus according to any one of (2) to (5), wherein the acquisition unit acquires a user evaluation based on an operation input, a voice input, or a driving operation by the user.
(7)
The information processing apparatus according to any one of (2) to (6), wherein the notification control unit performs control such that the notification is provided from a display device visible to the user or an audio output device provided in the vehicle.
(8)
The information processing apparatus according to any one of (2) to (7), further including:
an estimation unit that estimates what kind of information concerning the recognized target vehicle should currently be reported to the driver; and
an acquisition control unit that performs control such that the information estimated by the estimation unit is acquired from the target vehicle.
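Notes (8) and (9) pair an estimation step with an acquisition-control step. As a sketch under stated assumptions (the driving contexts, item names, and request-message shape are all hypothetical), the two steps might look like:

```python
# Sketch of notes (8)-(9): estimate which of the items listed in note (9)
# to request from the target vehicle, then form the request.
# Contexts ("merging", "at_intersection") and message shape are invented.

def estimate_needed_info(context):
    """Estimation unit: choose what the driver needs right now."""
    wanted = []
    if context.get("merging"):
        wanted += ["speed", "acceleration_deceleration"]
    if context.get("at_intersection"):
        wanted += ["recognition_of_user_vehicle", "travel_prediction"]
    # Fall back to the other-user evaluations when no special context applies.
    return wanted or ["evaluations_by_other_users"]

def build_request(target_id, items):
    """Acquisition control unit: form the inter-vehicle request message."""
    return {"to": target_id, "request": items}
```

Usage: `build_request("10B", estimate_needed_info({"merging": True}))` would ask vehicle 10B for its speed and acceleration/deceleration information while merging.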
(9)
The information processing apparatus according to (8), wherein the information acquired from the target vehicle is at least one of information concerning evaluations of the target vehicle made by other users, speed information of the target vehicle, acceleration/deceleration information of the target vehicle, information on the target vehicle's recognition of the user's vehicle, a travel prediction/recognition result of the target vehicle, and a profile.
(10)
The information processing apparatus according to (8), wherein the notification control unit performs display control such that the information acquired from the target vehicle is displayed, in association with the target vehicle, on a display device provided in the vehicle and visible to the user.
(11)
The information processing apparatus according to (9) or (10), wherein the notification control unit performs control such that a warning is issued to the user on the basis of the information acquired from the target vehicle.
(12)
The information processing apparatus according to any one of (9) to (11), wherein the notification control unit performs control such that a travel prediction result of the user's vehicle is reported on the basis of the information acquired from the target vehicle.
(13)
The information processing apparatus according to any one of (2) to (12), wherein the notification control unit performs control such that the content of an evaluation made of the target vehicle, the content of an evaluation made of the user's vehicle by another user, or the content of an evaluation made of another vehicle by another user is reported to the user in real time.
(14)
The information processing apparatus according to any one of (2) to (13), further including a transmission control unit that performs control such that a driving state of the user's vehicle is transmitted to a server on a network.
(15)
The information processing apparatus according to (2), wherein the acquisition unit acquires a user evaluation of the target vehicle made by a pedestrian or a bicycle rider.
(16)
The information processing apparatus according to any one of (2) to (15), wherein the generation unit generates information for providing a notification that requests an evaluation from the user, on the basis of at least one of information concerning the target vehicle and the environment information of the user.
(17)
The information processing apparatus according to any one of (2) to (16), wherein the generation unit generates a reason for the user evaluation on the basis of at least one of information concerning the target vehicle and the environment information of the user.
(18)
An information processing method including, by a processor:
automatically recognizing information on a target vehicle from environment information of a user;
acquiring a user evaluation of the recognized target vehicle; and
generating information for providing notification of the acquired user evaluation.
(19)
A program for causing a computer to function as:
a recognition unit that automatically recognizes information on a target vehicle from environment information of a user;
an acquisition unit that acquires a user evaluation of the recognized target vehicle; and
a generation unit that generates information for providing notification of the acquired user evaluation.
110 control unit
1101 recognition unit
1102 user evaluation acquisition unit
1103 notification information generation unit
1104 notification control unit
120 communication unit
1201 inter-vehicle communication unit
1202 road-to-vehicle communication unit
1203 network I/F unit
130 input unit
1301 in-vehicle device
1302 sensor
1303 operation input unit
1304 microphone
140 output unit
1401 display unit
1402 speaker
150 storage unit
10 vehicle
4 management server
Claims (19)
- An information processing apparatus comprising:
a recognition unit that automatically recognizes information on a target vehicle from environment information of a user;
an acquisition unit that acquires a user evaluation of the recognized target vehicle; and
a generation unit that generates information for providing notification of the acquired user evaluation. - The information processing apparatus according to claim 1, wherein the recognition unit recognizes the target vehicle on the basis of information from a sensor provided in a user's vehicle, and
the information processing apparatus further comprises:
an other-user evaluation acquisition unit that acquires information concerning evaluations, made by other users, of the target vehicle recognized by the recognition unit; and
a notification control unit that performs control such that the acquired predetermined information is reported to the user in the user's vehicle. - The information processing apparatus according to claim 2, wherein the other-user evaluation acquisition unit acquires information received from the target vehicle through inter-vehicle communication.
- The information processing apparatus according to claim 2, wherein the information generated by the generation unit is reported to at least one of an evaluator, an evaluated person, and a nearby third party.
- The information processing apparatus according to claim 2, wherein the recognition unit automatically recognizes the target vehicle to be evaluated by the user on the basis of line-of-sight information, surrounding information, a positional relationship between the user's vehicle and surrounding vehicles, the user's degree of concentration, a traveling state of the user's vehicle, or the content of the user's utterance, which are included in the environment information of the user.
- The information processing apparatus according to claim 2, wherein the acquisition unit acquires a user evaluation based on an operation input, a voice input, or a driving operation by the user.
- The information processing apparatus according to claim 2, wherein the notification control unit performs control such that the notification is provided from a display device visible to the user or an audio output device provided in the vehicle.
- The information processing apparatus according to claim 2, further comprising:
an estimation unit that estimates what kind of information concerning the recognized target vehicle should currently be reported to the driver; and
an acquisition control unit that performs control such that the information estimated by the estimation unit is acquired from the target vehicle,
the information processing apparatus according to claim 2. - The information processing apparatus according to claim 8, wherein the information acquired from the target vehicle is at least one of information concerning evaluations of the target vehicle made by other users, speed information of the target vehicle, acceleration/deceleration information of the target vehicle, information on the target vehicle's recognition of the user's vehicle, a travel prediction/recognition result of the target vehicle, and a profile.
- The information processing apparatus according to claim 8, wherein the notification control unit performs display control such that the information acquired from the target vehicle is displayed, in association with the target vehicle, on a display device provided in the vehicle and visible to the user.
- The information processing apparatus according to claim 9, wherein the notification control unit performs control such that a warning is issued to the user on the basis of the information acquired from the target vehicle.
- The information processing apparatus according to claim 9, wherein the notification control unit performs control such that a travel prediction result of the user's vehicle is reported on the basis of the information acquired from the target vehicle.
- The information processing apparatus according to claim 2, wherein the notification control unit performs control such that the content of an evaluation made of the target vehicle, the content of an evaluation made of the user's vehicle by another user, or the content of an evaluation made of another vehicle by another user is reported to the user in real time.
- The information processing apparatus according to claim 2, further comprising a transmission control unit that performs control such that a driving state of the user's vehicle is transmitted to a server on a network.
- The information processing apparatus according to claim 2, wherein the acquisition unit acquires a user evaluation of the target vehicle made by a pedestrian or a bicycle rider.
- The information processing apparatus according to claim 2, wherein the generation unit generates information for providing a notification that requests an evaluation from the user, on the basis of at least one of information concerning the target vehicle and the environment information of the user.
- The information processing apparatus according to claim 2, wherein the generation unit generates a reason for the user evaluation on the basis of at least one of information concerning the target vehicle and the environment information of the user.
- An information processing method comprising, by a processor:
automatically recognizing information on a target vehicle from environment information of a user;
acquiring a user evaluation of the recognized target vehicle; and
generating information for providing notification of the acquired user evaluation. - A program for causing a computer to function as:
a recognition unit that automatically recognizes information on a target vehicle from environment information of a user;
an acquisition unit that acquires a user evaluation of the recognized target vehicle; and
a generation unit that generates information for providing notification of the acquired user evaluation.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP16846049.1A EP3352154A4 (en) | 2015-09-18 | 2016-06-13 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM |
CN201680052582.5A CN108028015B (zh) | 2015-09-18 | 2016-06-13 | 信息处理装置、信息处理方法和存储介质 |
US15/750,384 US10380888B2 (en) | 2015-09-18 | 2016-06-13 | Information processing apparatus, information processing method, and program |
JP2017539715A JP6690649B2 (ja) | 2015-09-18 | 2016-06-13 | 情報処理装置、情報処理方法、およびプログラム |
US16/505,721 US10699569B2 (en) | 2015-09-18 | 2019-07-09 | Information processing apparatus, information processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-185362 | 2015-09-18 | ||
JP2015185362 | 2015-09-18 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/750,384 A-371-Of-International US10380888B2 (en) | 2015-09-18 | 2016-06-13 | Information processing apparatus, information processing method, and program |
US16/505,721 Continuation US10699569B2 (en) | 2015-09-18 | 2019-07-09 | Information processing apparatus, information processing method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017047176A1 true WO2017047176A1 (ja) | 2017-03-23 |
Family
ID=58288657
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/067568 WO2017047176A1 (ja) | 2015-09-18 | 2016-06-13 | 情報処理装置、情報処理方法、およびプログラム |
Country Status (5)
Country | Link |
---|---|
US (2) | US10380888B2 (ja) |
EP (1) | EP3352154A4 (ja) |
JP (1) | JP6690649B2 (ja) |
CN (1) | CN108028015B (ja) |
WO (1) | WO2017047176A1 (ja) |
Families Citing this family (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6635428B2 (ja) * | 2015-05-20 | 2020-01-22 | 修一 田山 | 自動車周辺情報表示システム |
US10380888B2 (en) * | 2015-09-18 | 2019-08-13 | Sony Corporation | Information processing apparatus, information processing method, and program |
US10479373B2 (en) * | 2016-01-06 | 2019-11-19 | GM Global Technology Operations LLC | Determining driver intention at traffic intersections for automotive crash avoidance |
JP6650635B2 (ja) * | 2016-02-29 | 2020-02-19 | パナソニックIpマネジメント株式会社 | 判定装置、判定方法、および判定プログラム |
US11001271B2 (en) * | 2016-07-12 | 2021-05-11 | Honda Motor Co., Ltd. | Drive assistance device |
US9919648B1 (en) | 2016-09-27 | 2018-03-20 | Robert D. Pedersen | Motor vehicle artificial intelligence expert system dangerous driving warning and control system and method |
KR101973627B1 (ko) * | 2017-07-11 | 2019-04-29 | 엘지전자 주식회사 | 차량에 구비된 차량 제어 장치 및 차량의 제어방법 |
US11315415B2 (en) * | 2017-09-03 | 2022-04-26 | Innovart Design Inc. | Information sharing system and information sharing method for vehicle |
KR102029906B1 (ko) * | 2017-11-10 | 2019-11-08 | 전자부품연구원 | 이동수단의 가상현실 콘텐츠 제공 장치 및 방법 |
JP7043845B2 (ja) * | 2018-01-17 | 2022-03-30 | トヨタ自動車株式会社 | 車両用表示連携制御装置 |
USD914734S1 (en) * | 2018-02-05 | 2021-03-30 | St Engineering Land Systems Ltd | Display screen or portion thereof with graphical user interface |
JP7006527B2 (ja) | 2018-07-09 | 2022-01-24 | トヨタ自動車株式会社 | 車載装置および車両捜索システム |
CN108924624B (zh) * | 2018-08-03 | 2021-08-31 | 百度在线网络技术(北京)有限公司 | 信息处理方法和装置 |
JP7192398B2 (ja) * | 2018-10-31 | 2022-12-20 | トヨタ自動車株式会社 | 情報処理装置、情報処理システム、プログラム、および情報処理方法 |
CN111161554A (zh) * | 2018-11-07 | 2020-05-15 | 北京宝沃汽车有限公司 | 信息发送的方法和装置 |
JP7206854B2 (ja) * | 2018-11-29 | 2023-01-18 | トヨタ自動車株式会社 | 情報提供システム、サーバ、携帯端末、プログラム及び情報提供方法 |
JP2020101869A (ja) * | 2018-12-19 | 2020-07-02 | 本田技研工業株式会社 | 制御装置及びプログラム |
JP7158502B2 (ja) * | 2019-01-17 | 2022-10-21 | 三菱電機株式会社 | 情報処理装置、および情報処理システム |
US11423779B2 (en) | 2019-02-15 | 2022-08-23 | Ford Global Technologies, Llc | Vehicle detection systems and methods |
JP7088076B2 (ja) * | 2019-02-25 | 2022-06-21 | トヨタ自動車株式会社 | 情報処理システム、プログラム、及び制御方法 |
CN109801508B (zh) * | 2019-02-26 | 2021-06-04 | 百度在线网络技术(北京)有限公司 | 路口处障碍物的运动轨迹预测方法及装置 |
US10933317B2 (en) * | 2019-03-15 | 2021-03-02 | Sony Interactive Entertainment LLC. | Near real-time augmented reality video gaming system |
CN111753580A (zh) * | 2019-03-27 | 2020-10-09 | 北京外号信息技术有限公司 | 光通信装置的识别方法和相应的电子设备 |
CN111756783B (zh) * | 2019-03-29 | 2021-11-12 | 比亚迪股份有限公司 | 终端的信息推送方法、服务器和终端的信息推送系统 |
US10839682B1 (en) * | 2019-04-26 | 2020-11-17 | Blackberry Limited | Method and system for traffic behavior detection and warnings |
US11403949B2 (en) * | 2019-05-09 | 2022-08-02 | Hitachi Astemo, Ltd. | System for predicting vehicle behavior |
US11001200B2 (en) * | 2019-05-30 | 2021-05-11 | Nissan North America, Inc. | Vehicle occupant warning system |
US11603098B2 (en) * | 2019-08-27 | 2023-03-14 | GM Global Technology Operations LLC | Systems and methods for eye-tracking data collection and sharing |
US20210061276A1 (en) * | 2019-08-27 | 2021-03-04 | GM Global Technology Operations LLC | Systems and methods for vehicle operator intention prediction using eye-movement data |
USD948552S1 (en) * | 2019-09-06 | 2022-04-12 | Kubota Corporation | Display screen for tractor with graphical user interface |
WO2021070768A1 (ja) * | 2019-10-09 | 2021-04-15 | ソニー株式会社 | 情報処理装置、および情報処理システム、並びに情報処理方法 |
US11555711B2 (en) * | 2020-04-11 | 2023-01-17 | Harman Becker Automotive Systems Gmbh | Systems and methods for augmented reality in a vehicle |
DE102020210238A1 (de) | 2020-08-12 | 2022-02-17 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zum Warnen von Verkehrsteilnehmern mit einer Umfeldüberwachung eines in Betrieb befindlichen Fahrzeugs und Vorrichtung zur Ausführung des Verfahrens |
US11830489B2 (en) | 2021-06-30 | 2023-11-28 | Bank Of America Corporation | System and method for speech processing based on response content |
CN118139775A (zh) * | 2021-10-20 | 2024-06-04 | 三星电子株式会社 | 安装在车辆上的电子设备及其操作方法 |
CN114248806A (zh) * | 2022-01-13 | 2022-03-29 | 云控智行科技有限公司 | 一种无人车驾驶控制方法、装置及电子设备 |
CN115230581B (zh) * | 2022-06-22 | 2024-05-24 | 东风柳州汽车有限公司 | 车辆转弯报警方法、装置、设备及存储介质 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008126954A (ja) * | 2006-11-24 | 2008-06-05 | Tokai Rika Co Ltd | 意思伝達装置 |
JP2012252407A (ja) * | 2011-05-31 | 2012-12-20 | Nissan Motor Co Ltd | 車両用通行支援装置及び車両用通行支援方法 |
JP2013152524A (ja) * | 2012-01-24 | 2013-08-08 | Denso Corp | 車車間通信装置 |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001034897A (ja) * | 1999-07-21 | 2001-02-09 | Toyota Central Res & Dev Lab Inc | 運転支援システム |
US7639841B2 (en) * | 2004-12-20 | 2009-12-29 | Siemens Corporation | System and method for on-road detection of a vehicle using knowledge fusion |
JP4655730B2 (ja) * | 2005-04-07 | 2011-03-23 | トヨタ自動車株式会社 | 車両用運転支援装置 |
JP2012150557A (ja) | 2011-01-17 | 2012-08-09 | Toyota Central R&D Labs Inc | 運転マナー啓発装置、システム、及びプログラム |
JP5621682B2 (ja) | 2011-03-29 | 2014-11-12 | 株式会社デンソー | 車載用情報提示装置 |
DE112012004767T5 (de) * | 2011-11-16 | 2014-11-06 | Flextronics Ap, Llc | Vollständiges Fahrzeugökosystem |
WO2013128920A1 (ja) * | 2012-02-27 | 2013-09-06 | ヤマハ発動機株式会社 | 運転状態共有装置および運転状態共有システム |
US20140310379A1 (en) * | 2013-04-15 | 2014-10-16 | Flextronics Ap, Llc | Vehicle initiated communications with third parties via virtual personality |
DE102012014457A1 (de) * | 2012-07-21 | 2014-01-23 | Audi Ag | Verfahren zum Betreiben eines Kraftfahrzeugs und Kraftfahrzeug |
US20140080098A1 (en) * | 2012-09-14 | 2014-03-20 | Hyundai Motor Company | System and method of evaluating and reporting the driving acuity and performance of a test subject |
CN103198685B (zh) * | 2013-03-15 | 2016-04-13 | Tcl康钛汽车信息服务(深圳)有限公司 | 一种实现驾驶安全预警的方法、系统 |
US20140322676A1 (en) * | 2013-04-26 | 2014-10-30 | Verizon Patent And Licensing Inc. | Method and system for providing driving quality feedback and automotive support |
EP3114574A4 (en) * | 2014-03-03 | 2018-03-07 | Inrix, Inc. | Traffic obstruction detection |
US10380888B2 (en) * | 2015-09-18 | 2019-08-13 | Sony Corporation | Information processing apparatus, information processing method, and program |
-
2016
- 2016-06-13 US US15/750,384 patent/US10380888B2/en active Active
- 2016-06-13 WO PCT/JP2016/067568 patent/WO2017047176A1/ja active Application Filing
- 2016-06-13 CN CN201680052582.5A patent/CN108028015B/zh active Active
- 2016-06-13 JP JP2017539715A patent/JP6690649B2/ja active Active
- 2016-06-13 EP EP16846049.1A patent/EP3352154A4/en not_active Ceased
-
2019
- 2019-07-09 US US16/505,721 patent/US10699569B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3352154A4 * |
Cited By (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017072874A (ja) * | 2015-10-05 | 2017-04-13 | 日産自動車株式会社 | 情報提供装置、情報提供システム及び情報提供方法 |
WO2018225178A1 (ja) * | 2017-06-07 | 2018-12-13 | 三菱電機株式会社 | 危険車両予測装置、危険車両警報システムおよび危険車両予測方法 |
JP7188389B2 (ja) | 2017-08-23 | 2022-12-13 | ソニーグループ株式会社 | 情報処理装置、情報処理システム、および情報処理方法、並びにプログラム |
JPWO2019039212A1 (ja) * | 2017-08-23 | 2020-09-17 | ソニー株式会社 | 情報処理装置、情報処理システム、および情報処理方法、並びにプログラム |
CN109835258A (zh) * | 2017-11-29 | 2019-06-04 | 通用汽车环球科技运作有限责任公司 | 基于通信的无照明车辆指示 |
JP2019127196A (ja) * | 2018-01-26 | 2019-08-01 | Kddi株式会社 | 周辺走行中の被観測車両の運転特性を推定するプログラム、装置及び方法 |
CN110097782A (zh) * | 2018-01-29 | 2019-08-06 | 丰田自动车株式会社 | 代理控制器和代理协作方法 |
JP2019159492A (ja) * | 2018-03-08 | 2019-09-19 | 東芝デジタルソリューションズ株式会社 | 隊列走行運用システムおよび隊列走行運用方法 |
WO2019172289A1 (ja) * | 2018-03-08 | 2019-09-12 | 東芝デジタルソリューションズ株式会社 | 隊列走行運用システムおよび隊列走行運用方法 |
US11941991B2 (en) | 2018-03-08 | 2024-03-26 | Toshiba Digital Solutions Corporation | Platooning operation system and platooning operation method |
JP7134649B2 (ja) | 2018-03-08 | 2022-09-12 | 東芝デジタルソリューションズ株式会社 | 隊列走行運用システムおよび隊列走行運用方法 |
JP2021516396A (ja) * | 2018-03-27 | 2021-07-01 | 杭州欧▲雷▼激光技▲術▼有限公司 | 車両外部環境情報を検出するための検出システム及び検出方法 |
JP2021520541A (ja) * | 2018-04-04 | 2021-08-19 | トヨタ モーター エンジニアリング アンド マニュファクチャリング ノース アメリカ,インコーポレイティド | 周囲のビークルの観察を使用して交通フローを判定するためのシステム及び方法 |
JP7179866B2 (ja) | 2018-04-04 | 2022-11-29 | トヨタ モーター エンジニアリング アンド マニュファクチャリング ノース アメリカ,インコーポレイティド | 周囲のビークルの観察を使用して交通フローを判定するためのシステム及び方法 |
JP2020034996A (ja) * | 2018-08-27 | 2020-03-05 | Zホールディングス株式会社 | 情報処理装置、情報処理方法、及び情報処理プログラム |
JP7475808B2 (ja) | 2018-08-27 | 2024-04-30 | Lineヤフー株式会社 | 情報処理装置、情報処理方法、及び情報処理プログラム |
WO2020208696A1 (ja) * | 2019-04-09 | 2020-10-15 | 三菱電機株式会社 | 情報記録制御装置および情報記録制御方法 |
JPWO2020208696A1 (ja) * | 2019-04-09 | 2021-09-13 | 三菱電機株式会社 | 情報記録制御装置および情報記録制御方法 |
JP7140278B2 (ja) | 2019-05-15 | 2022-09-21 | 日産自動車株式会社 | 表示制御方法及び表示制御装置 |
JPWO2020230313A1 (ja) * | 2019-05-15 | 2020-11-19 | ||
JP2020197831A (ja) * | 2019-05-31 | 2020-12-10 | Necプラットフォームズ株式会社 | 運転マナー評価提示装置、運転マナー評価提示方法、運転マナー評価システム、および運転マナー評価提示プログラム |
JPWO2021044486A1 (ja) * | 2019-09-02 | 2021-11-25 | 三菱電機株式会社 | 自動運転制御装置および自動運転制御方法 |
JP7330278B2 (ja) | 2019-09-02 | 2023-08-21 | 三菱電機株式会社 | 自動運転制御装置および自動運転制御方法 |
JP2021077029A (ja) * | 2019-11-07 | 2021-05-20 | トヨタ自動車株式会社 | 車両、音響制御装置、及び音響制御プログラム |
CN112776713A (zh) * | 2019-11-07 | 2021-05-11 | 丰田自动车株式会社 | 车辆、声响控制装置以及计算机可读取非暂存记录介质 |
JP7268581B2 (ja) | 2019-11-07 | 2023-05-08 | トヨタ自動車株式会社 | 車両、音響制御装置、及び音響制御プログラム |
WO2021111753A1 (ja) * | 2019-12-05 | 2021-06-10 | ソニーグループ株式会社 | 情報処理装置、および情報処理方法、並びにプログラム |
JPWO2021186853A1 (ja) * | 2020-03-19 | 2021-09-23 | ||
WO2021186853A1 (ja) * | 2020-03-19 | 2021-09-23 | 日本電気株式会社 | 画像生成装置、画像生成方法、およびプログラム |
JP2022026230A (ja) * | 2020-07-30 | 2022-02-10 | 株式会社デンソー | 運転支援装置、運転支援プログラム、運転訓練プログラム及び情報管理プログラム |
JP7463896B2 (ja) | 2020-07-30 | 2024-04-09 | 株式会社デンソー | 運転支援装置、運転支援プログラム、運転訓練プログラム及び情報管理プログラム |
JP7170076B2 (ja) | 2020-10-28 | 2022-11-11 | 株式会社日本総合研究所 | 情報処理方法及び情報処理システム |
JP2022071799A (ja) * | 2020-10-28 | 2022-05-16 | 株式会社日本総合研究所 | 情報処理方法及び情報処理システム |
WO2022102374A1 (ja) * | 2020-11-16 | 2022-05-19 | 株式会社小糸製作所 | 車両用表示システム |
WO2022249917A1 (ja) * | 2021-05-26 | 2022-12-01 | 株式会社デンソー | 車両用制御装置及び車両用制御方法 |
JP7517248B2 (ja) | 2021-05-26 | 2024-07-17 | 株式会社デンソー | 車両用制御装置及び車両用制御方法 |
CN115624008A (zh) * | 2022-07-08 | 2023-01-20 | 珠海科艺普检测科技有限公司 | 一种基于生物信息技术的大口黑鲈鱼的鱼苗智能检测方法 |
CN115624008B (zh) * | 2022-07-08 | 2023-09-05 | 珠海科艺普检测科技有限公司 | 一种基于生物信息技术的大口黑鲈鱼的鱼苗智能检测方法 |
Also Published As
Publication number | Publication date |
---|---|
EP3352154A4 (en) | 2019-08-07 |
US20190333380A1 (en) | 2019-10-31 |
US10699569B2 (en) | 2020-06-30 |
EP3352154A1 (en) | 2018-07-25 |
US10380888B2 (en) | 2019-08-13 |
CN108028015B (zh) | 2021-07-23 |
JP6690649B2 (ja) | 2020-04-28 |
US20180225963A1 (en) | 2018-08-09 |
CN108028015A (zh) | 2018-05-11 |
JPWO2017047176A1 (ja) | 2018-06-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10699569B2 (en) | Information processing apparatus, information processing method, and program | |
JP7399075B2 (ja) | 情報処理装置、情報処理方法及びプログラム | |
US11295143B2 (en) | Information processing apparatus, information processing method, and program | |
JP6962316B2 (ja) | 情報処理装置、情報処理方法、プログラム、およびシステム | |
CN111361552B (zh) | 自动驾驶系统 | |
JP2021185486A (ja) | 車両に安全に追い付けるように運転を支援するシステムおよび方法 | |
JP6693354B2 (ja) | 車両用情報提示装置 | |
JP6269360B2 (ja) | 運転支援システム及び運転支援方法 | |
WO2020100585A1 (ja) | 情報処理装置、および情報処理方法、並びにプログラム | |
JP7035447B2 (ja) | 車両制御装置 | |
CN112789205B (zh) | 针对自动驾驶车辆检测队列并对队列进行响应 | |
JP2018005797A (ja) | 運転支援方法およびそれを利用した運転支援装置、運転支援システム、自動運転制御装置、車両、プログラム | |
CN113401071A (zh) | 显示控制装置、显示控制方法以及计算机可读取存储介质 | |
JP2012103849A (ja) | 情報提供装置 | |
CN114572219B (zh) | 自动超车方法、装置、车辆、存储介质及芯片 | |
JP2024066121A (ja) | 運転支援装置、運転支援方法、およびプログラム | |
JP2014106037A (ja) | 車載情報提供装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16846049 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017539715 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15750384 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2016846049 Country of ref document: EP |