WO2022137558A1 - Information processing device and information processing method - Google Patents
Information processing device and information processing method
- Publication number
- WO2022137558A1 (PCT application PCT/JP2020/048934)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- information processing
- determined
- image
- processing apparatus
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements using visual output, e.g. blinking lights or matrix displays
- B60K35/23—Head-up displays [HUD]
- B60K35/28—Output arrangements characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
- B60K35/29—Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
- B60K35/80—Arrangements for controlling instruments
- B60K35/81—Arrangements for controlling instruments for controlling displays
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/0956—Predicting travel path or likelihood of collision, the prediction being responsive to traffic or environmental parameters
- B60W40/105—Estimation or calculation of driving parameters related to vehicle motion: speed
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/16—Anti-collision systems for road vehicles
- G08G1/165—Anti-collision systems for passive traffic, e.g. including static obstacles, trees
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/166—Navigation
- B60K2360/177—Augmented reality
- B60K2360/179—Distances to obstacles or vehicles
- B60K2360/1868—Displaying information according to relevancy, according to driving situations
- B60K2360/188—Displaying information using colour changes
- B60W2050/146—Display means
- B60W2520/10—Longitudinal speed
- B60W2556/50—External transmission of positioning data to or from the vehicle, e.g. GPS [Global Positioning System] data
Definitions
- the present invention relates to an information processing apparatus and an information processing method.
- A head-up display device that superimposes an AR (augmented reality) guide on the real-world scene for display is known (Patent Document 1).
- In Patent Document 1, because the AR guide is superimposed and displayed on the real world, the AR guide may lower the user's attention to the real world. As a result, there is a problem that the user of the vehicle may be delayed in noticing a change in the driving environment in the real world.
- The present invention has been made in view of the above problem, and its purpose is to provide an information processing device and an information processing method that suppress the lowering of the user's attention to the real world caused by the AR guide, and that allow the user of the vehicle to easily notice changes in the driving environment in the real world.
- The information processing device and the information processing method control a display unit having a display area through which the surroundings of the vehicle can be seen from the driver's seat of the vehicle.
- The lowering of the user's attention to the real world caused by the AR guide is suppressed, and the user of the vehicle can easily notice changes in the driving environment in the real world.
- FIG. 1 is a block diagram showing a configuration of an information processing apparatus according to an embodiment of the present invention.
- FIG. 2 is a flowchart showing the processing of the information processing apparatus according to the embodiment of the present invention.
- FIG. 3A is a diagram showing a first example of the display of the AR guide.
- FIG. 3B is a diagram showing a second example of the display of the AR guide.
- the information processing apparatus includes an acquisition unit 25, a display unit 21, and a controller 100.
- the acquisition unit 25 is connected to the controller 100 and acquires information on objects located around the vehicle (object information) and information on the route on which the vehicle is scheduled to travel (route information).
- the acquisition unit 25 is connected to a plurality of object detection sensors mounted on the vehicle, such as a laser radar, a millimeter wave radar, and a camera, which detect objects around the vehicle.
- the object detection sensor detects moving objects including other vehicles, motorcycles, bicycles, pedestrians, and stationary objects including parked vehicles as objects around the vehicle.
- the object detection sensor detects the position, attitude (yaw angle), size, speed, acceleration, deceleration, and yaw rate of moving and stationary objects with respect to the vehicle.
- the acquisition unit 25 acquires information about a moving object and a stationary object detected by the object detection sensor as object information.
- the acquisition unit 25 is connected to a navigation device (not shown), and acquires, for example, information on a guidance route to the destination of the vehicle as route information.
- The acquisition unit 25 may also acquire information such as a road map, the current position of the vehicle, and points of interest (POI) of the user located between the vehicle and the destination.
- The acquisition unit 25 may acquire a captured image obtained by imaging the surroundings of the vehicle.
- the acquisition unit 25 may have a storage unit for storing acquired object information and route information.
- The display unit 21 is connected to the controller 100 and has a display area 23 through which the surroundings of the vehicle can be seen from the driver's seat of the vehicle.
- the AR guide (image) generated by the controller 100 is displayed in the display area 23. Therefore, in the display area 23, the AR guide is displayed superimposed on the scenery around the vehicle.
- the display unit 21 may be a head-up display device, a head-mounted display, smart glasses, or a prompter.
- The display area 23 may be the windshield of the vehicle, and the AR guide may be projected onto the display area 23 from a projector (not shown) to present it to the user. As a result, the user sees the AR guide superimposed on the scenery around the vehicle visible through the windshield.
- The AR guide presented by the display unit 21 expresses, for example, a guidance route to the destination of the vehicle, a road map, the current position of the vehicle, points of interest (POI) of the user located between the vehicle and the destination, and the like.
- the display unit 21 may display the AR guide by superimposing it on the captured image acquired by the acquisition unit 25.
- The display area 23 may instead be a display that shows the captured image together with the AR guide, rather than letting the surroundings of the vehicle be seen through it from the driver's seat.
- the controller 100 is a general-purpose computer including a CPU (Central Processing Unit), a memory, a storage device, an input / output unit, and the like.
- The controller 100 is connected to a navigation device (not shown). For example, the navigation device performs route guidance for the vehicle.
- the controller 100 (control unit) is a general-purpose microcomputer including a CPU (central processing unit), a memory, and an input / output unit.
- a computer program for functioning as an information processing device is installed in the controller 100. By executing the computer program, the controller 100 functions as a plurality of information processing circuits (110, 120, 130, 140) included in the information processing apparatus.
- the computer program may be stored in a recording medium that can be read and written by a computer.
- an example of realizing a plurality of information processing circuits (110, 120, 130, 140) by software is shown.
- a plurality of information processing circuits (110, 120, 130, 140) may be configured by individual hardware.
- the information processing circuit (110, 120, 130, 140) may also be used as a navigation device or a control unit used for controlling a vehicle.
- the controller 100 includes an object extraction unit 110, a determination unit 120, an image generation unit 130, and an output unit 140 as a plurality of information processing circuits (110, 120, 130, 140).
- the object extraction unit 110 extracts an object that may interfere with the running of the vehicle based on the object information and the route information. For example, the object extraction unit 110 extracts an object existing within a predetermined distance from the vehicle or within a predetermined distance from the route on which the vehicle is scheduled to travel.
- the predetermined distance is set in advance.
- predetermined distances may be set according to the type of the object. For example, the predetermined distance set for the object having a high moving speed may be set longer than the predetermined distance set for the object having a low moving speed.
- the predetermined distance set when extracting other vehicles, motorcycles, and bicycles may be set longer than the predetermined distance set when extracting pedestrians and parked vehicles.
- The object extraction unit 110 may extract an object that exists within a predetermined distance from the vehicle or from the route on which the vehicle is scheduled to travel, and that is located ahead of the vehicle with respect to its traveling direction.
- the object extraction unit 110 may extract an object located in a range visible from the driver's seat of the vehicle via the display area 23.
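The extraction rule described above can be sketched as follows. This is an illustrative Python sketch only: the class, the field names, and the per-type distance values are assumptions for illustration, not part of the embodiment.

```python
from dataclasses import dataclass

# Hypothetical type-dependent predetermined distances (metres); faster
# object types get a longer extraction distance, as described above.
EXTRACTION_DISTANCE = {
    "vehicle": 100.0,
    "motorcycle": 100.0,
    "bicycle": 50.0,
    "pedestrian": 30.0,
    "parked_vehicle": 30.0,
}

@dataclass
class DetectedObject:
    kind: str
    dist_to_ego: float      # distance from the ego vehicle (m)
    dist_to_route: float    # distance from the planned route (m)
    ahead_of_ego: bool      # located ahead of the traveling direction

def extract_objects(objects):
    """Keep objects that may interfere with travel: within the
    type-dependent predetermined distance of the vehicle or of its
    planned route, and located ahead of the vehicle."""
    kept = []
    for obj in objects:
        limit = EXTRACTION_DISTANCE.get(obj.kind, 30.0)
        near = obj.dist_to_ego <= limit or obj.dist_to_route <= limit
        if near and obj.ahead_of_ego:
            kept.append(obj)
    return kept
```

A nearby pedestrian and a vehicle within 100 m ahead would be kept, while a distant pedestrian or an object behind the vehicle would be discarded.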
- the determination unit 120 determines whether or not the object interferes with the running of the vehicle based on the object information and the route information for the object extracted by the object extraction unit 110.
- The determination unit 120 may determine whether or not the object is approaching the vehicle or the route on which the vehicle is scheduled to travel, and may determine that the object interferes with the travel of the vehicle when it is determined to be approaching.
- The determination unit 120 may predict, based on the object information and the route information, the distance between the object and the vehicle after a unit time has elapsed. It may then determine that the object is approaching the vehicle when that predicted distance is smaller than the current distance between the object and the vehicle.
- Similarly, the determination unit 120 may predict the distance between the object and the route on which the vehicle is scheduled to travel after a unit time has elapsed, and may determine that the object is approaching the route when that predicted distance is smaller than the current distance between the object and the route.
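The approach determination above can be sketched as follows. This is an illustrative Python sketch; the linear-prediction model and the function names are assumptions, not the embodiment's actual calculation.

```python
def predict_distance(obj_pos, obj_vel, ref_pos, dt=1.0):
    """Linearly predict the object's position after dt (one unit time)
    and return its distance to a reference point (the vehicle, or the
    nearest point on the planned route)."""
    px = obj_pos[0] + obj_vel[0] * dt
    py = obj_pos[1] + obj_vel[1] * dt
    return ((px - ref_pos[0]) ** 2 + (py - ref_pos[1]) ** 2) ** 0.5

def is_approaching(current_dist, predicted_dist):
    """The object is judged to be approaching when the distance after
    one unit time is smaller than the current distance."""
    return predicted_dist < current_dist
```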
- The determination unit 120 may determine whether or not the object is shielded by the AR guide displayed in the display area 23 when viewed from the driver's seat, and may determine that the object interferes with the travel of the vehicle when it is determined to be shielded by the AR guide.
- The determination unit 120 may determine whether or not the distance on the display area 23 between the image of the object visually recognized in the display area 23 and the AR guide, as viewed from the driver's seat, is equal to or less than a predetermined distance, and may determine that the object interferes with the travel of the vehicle when that distance is equal to or less than the predetermined distance.
- The predetermined distance may be set based on the size of the image of the object in the display area 23 and the size of the AR guide. For example, the predetermined distance set when the image of the object is small may be larger than the predetermined distance set when the image of the object is large. Likewise, the predetermined distance set when the ratio of the size of the image of the object to the size of the AR guide is small may be larger than the predetermined distance set when that ratio is large.
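The display-space check above might look like the following sketch, working in pixel coordinates. The bounding-box representation, the margin values, and the ratio-based scaling rule are all assumptions for illustration.

```python
def guide_blocks_object(obj_bbox, guide_bbox, base_margin=20.0):
    """Return True when the AR guide is within a predetermined
    distance of the object's image in display coordinates (pixels).
    The margin is widened when the object's image is small relative
    to the guide, as described above. bbox = (x, y, w, h)."""
    ox, oy, ow, oh = obj_bbox
    gx, gy, gw, gh = guide_bbox
    # Ratio of object-image area to guide area; a small ratio widens
    # the margin (assumed scaling rule for illustration).
    ratio = (ow * oh) / float(gw * gh)
    margin = base_margin * (2.0 if ratio < 0.5 else 1.0)
    # Gap between the two axis-aligned boxes (zero when overlapping).
    dx = max(gx - (ox + ow), ox - (gx + gw), 0.0)
    dy = max(gy - (oy + oh), oy - (gy + gh), 0.0)
    return (dx * dx + dy * dy) ** 0.5 <= margin
```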
- The determination unit 120 may calculate the collision time between the object and the vehicle based on the object information and the route information, determine whether or not the collision time is equal to or less than a predetermined time, and determine that the object interferes with the travel of the vehicle when the collision time is equal to or less than the predetermined time.
- the collision time between the object and the vehicle is calculated from the moving speed and the moving direction of the object, and the moving speed and the moving direction of the vehicle.
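The collision-time check can be sketched as below. This is an illustrative Python sketch: a one-dimensional closing-speed model and a 3-second threshold are assumed for illustration; the embodiment's actual calculation uses the full speeds and directions of both the object and the vehicle.

```python
def time_to_collision(rel_distance, ego_speed, obj_closing_speed):
    """Collision time from the relative distance and the closing speed
    (ego speed plus the object's speed component toward the ego).
    Returns None when the gap is not closing."""
    closing = ego_speed + obj_closing_speed
    if closing <= 0:
        return None
    return rel_distance / closing

def interferes(rel_distance, ego_speed, obj_closing_speed, ttc_threshold=3.0):
    """Judge interference when the collision time is at or below a
    predetermined time (threshold assumed here as 3 s)."""
    ttc = time_to_collision(rel_distance, ego_speed, obj_closing_speed)
    return ttc is not None and ttc <= ttc_threshold
```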
- The determination unit 120 may determine whether or not the speed of the vehicle is equal to or higher than a predetermined speed, and may determine that the object interferes with the travel of the vehicle when the speed of the vehicle is equal to or higher than the predetermined speed.
- The determination unit 120 may determine, based on the object information, whether or not the speed of the object is equal to or higher than a predetermined speed, and may determine that the object interferes with the travel of the vehicle when the speed of the object is equal to or higher than the predetermined speed.
- the image generation unit 130 generates an AR guide based on the route information. For example, the image generation unit 130 generates an AR guide that is visually recognized in the display area 23 when viewed from the driver's seat and overlaps with an image of the route on which the vehicle is scheduled to travel.
- FIGS. 3A and 3B show an example of an AR guide generated so as to overlap the image of the route on which the vehicle is scheduled to travel.
- FIG. 3A is a diagram showing a first example of the display of the AR guide.
- FIG. 3B is a diagram showing a second example of the display of the AR guide.
- In FIGS. 3A and 3B, a left turn is planned at an intersection located ahead in the lane in which the vehicle is traveling.
- FIG. 3A shows how the AR guides AR1 to AR7 are displayed in the display area 23 along the image TL of the route on which the vehicle is scheduled to travel. Therefore, the user can recognize the route on which the vehicle plans to travel along the AR guides AR1 to AR7.
- FIG. 3B shows how the AR guides AR1 to AR5 are displayed in the display area 23 along the image TL of the route on which the vehicle is scheduled to travel. Unlike FIG. 3A, the AR guide AR6 and AR guide AR7 are not shown because of the presence of another vehicle VC.
- When the other vehicle VC becomes visible through the display area 23, the image generation unit 130 changes the AR guide from the state of FIG. 3A to the state of FIG. 3B.
- The image generation unit 130 changes a part or the whole of the AR guide to increase the proportion of the display area 23 through which the surroundings of the vehicle are seen. The image generation unit 130 may make this change when the determination unit 120 determines that there is an object that interferes with the travel of the vehicle.
- The image generation unit 130 may increase the proportion of the display area 23 through which the surroundings of the vehicle are seen by increasing the transmittance of a part or the whole of the AR guide, for example by setting the transmittance of a part or the whole of the AR guide to 100%. In FIGS. 3A and 3B, the other vehicle VC shown in FIG. 3B may be made visible by increasing the transmittance of the AR guide AR6 and the AR guide AR7.
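The per-segment transmittance change can be sketched as follows, treating each guide segment (AR1 to AR7) as an opacity value (alpha = 1.0 opaque; transmittance = 1 - alpha). The dictionary representation and segment identifiers are assumptions for illustration.

```python
def update_guide_segments(segments, blocking_ids):
    """segments: {segment_id: alpha}. Segments that shield an
    interfering object (blocking_ids) have their transmittance set
    to 100 % (alpha 0.0), so the scenery behind them shows through;
    other segments are left unchanged."""
    return {seg_id: (0.0 if seg_id in blocking_ids else alpha)
            for seg_id, alpha in segments.items()}
```

In the FIG. 3A to FIG. 3B transition, the segments AR6 and AR7 that shield the other vehicle VC would be the ones passed as `blocking_ids`.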
- The image generation unit 130 may increase the proportion of the display area 23 through which the surroundings of the vehicle are seen by reducing the size of a part or the whole of the AR guide.
- the other vehicle VC shown in FIG. 3B can be easily visually recognized.
- The image generation unit 130 may increase the proportion of the display area 23 through which the surroundings of the vehicle are seen by reducing the width of the AR guide, which indicates the route on which the vehicle is scheduled to travel, along the width direction of the road on which the vehicle travels.
- In FIGS. 3A and 3B, the width W2 of the AR guide along the road width direction in the state after the size change (FIG. 3B) is smaller than the width W1 of the AR guide along the road width direction in the state before the size change (FIG. 3A). As a result, the other vehicle VC shown in FIG. 3B can be easily visually recognized.
- The image generation unit 130 may increase the proportion of the display area 23 through which the surroundings of the vehicle are seen by reducing the length of the AR guide, which indicates the route on which the vehicle is scheduled to travel, along the direction in which the vehicle travels.
- The height H2 of the AR guide along the direction in which the vehicle travels in the state after the size change (FIG. 3B) is smaller than the height H1 of the AR guide along that direction in the state before the size change (FIG. 3A). As a result, the other vehicle VC shown in FIG. 3B can be easily visually recognized.
- The total length of the AR guide along the direction in which the vehicle travels in the state of FIG. 3B (the length from the AR guide AR1 to the AR guide AR5) is reduced compared with the total length in the state of FIG. 3A (the length from the AR guide AR1 to the AR guide AR7). As a result, the other vehicle VC shown in FIG. 3B can be easily visually recognized.
- The image generation unit 130 may increase the proportion of the display area 23 through which the surroundings of the vehicle are seen by bringing the color of a part or the whole of the AR guide closer to the color around the vehicle. For example, the image generation unit 130 may acquire the color around the vehicle from the captured image.
- The colors of the AR guide AR6 and the AR guide AR7 in FIG. 3A may be brought close to the color of the other vehicle VC in FIG. 3B so that the other vehicle VC becomes visible.
- By bringing the color of a part or the whole of the AR guide closer to the color around the vehicle, the AR guide becomes inconspicuous in the display area 23, and objects visible through the display area 23 become relatively conspicuous. As a result, according to FIG. 3B, the other vehicle VC can be easily visually recognized.
- In the RGB color model, which expresses colors with the three primary colors of red, green, and blue, bringing the colors closer means making the RGB values of the AR guide close to the RGB values of the image around the vehicle reflected in the display area 23.
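The color adjustment can be sketched as a simple linear blend in the RGB model. The function name and the blend weight are assumptions for illustration; the embodiment does not specify how far the guide color is moved toward the surrounding color.

```python
def blend_toward_background(guide_rgb, background_rgb, weight=0.7):
    """Move the guide's RGB values toward the surrounding color
    sampled from the captured image. weight in [0, 1]: higher means
    closer to the background, i.e. a less conspicuous guide."""
    return tuple(
        round(g + (b - g) * weight)
        for g, b in zip(guide_rgb, background_rgb)
    )
```

With weight 1.0 the guide takes on the background color exactly; with weight 0.0 it is unchanged.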
- When the determination unit 120 determines that there is no object that interferes with the travel of the vehicle, the image generation unit 130 may change a part or the whole of the AR guide to reduce the proportion of the display area 23 through which the surroundings of the vehicle are seen.
- the AR guide is displayed by changing the AR guide from the state of FIG. 3B to the state of FIG. 3A.
- the region 23 may reduce the rate of transmission around the vehicle. This makes the AR guide easily visible.
- The output unit 140 outputs the AR guide generated or changed by the image generation unit 130.
- the output AR guide is displayed in the display area 23.
- the processing of the information processing apparatus shown in FIG. 2 may be started based on a user's instruction, or may be repeatedly executed at a predetermined cycle.
- In step S101, the acquisition unit 25 acquires object information and route information.
- The acquired object information and route information are transmitted to the controller 100.
- In step S103, the image generation unit 130 generates an AR guide based on the route information.
- In step S105, the object extraction unit 110 extracts objects that may interfere with the traveling of the vehicle based on the object information and the route information. Then, for each object extracted by the object extraction unit 110, the determination unit 120 determines whether or not the object interferes with the traveling of the vehicle based on the object information and the route information.
- If it is determined in step S107 that the object interferes with the traveling of the vehicle (YES in step S107), the image generation unit 130 changes the AR guide in step S109. More specifically, the image generation unit 130 changes part or all of the AR guide to increase the proportion of the display area 23 that transmits the surroundings of the vehicle. Then, the process proceeds to step S111.
- If it is determined in step S107 that the object does not interfere with the traveling of the vehicle (NO in step S107), the process proceeds to step S111.
- In step S111, the output unit 140 outputs the AR guide generated or changed by the image generation unit 130.
- the output AR guide is displayed in the display area 23.
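The flow of steps S101 through S111 can be sketched as a single processing cycle; the dictionary representation of the guide and the helper callables standing in for the units above are assumptions for illustration only:

```python
def process_frame(object_info, route_info, generate_guide, interferes):
    """One cycle of the S101-S111 flow.

    generate_guide(route_info) builds the AR guide (step S103), and
    interferes(obj, route_info) is the per-object determination of
    step S107. Returns the guide to be output in step S111.
    """
    guide = generate_guide(route_info)              # S103: generate guide
    # S105/S107: does any extracted object interfere with travel?
    if any(interferes(obj, route_info) for obj in object_info):
        # S109: change the guide to raise the see-through proportion.
        guide = dict(guide, transparency=1.0)
    return guide                                    # S111: output guide

# Minimal usage with stub units: a nearby object triggers step S109.
guide = process_frame(
    object_info=[{"distance": 5.0}],
    route_info={"route": "ahead"},
    generate_guide=lambda r: {"transparency": 0.2},
    interferes=lambda obj, r: obj["distance"] < 10.0,
)
```

When no object interferes, the branch is skipped and the guide generated in step S103 is output unchanged.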
- The information processing apparatus and the information processing method according to the present embodiment control a display unit having a display area that transmits the surroundings of a vehicle so as to be visible from the driver's seat of the vehicle.
- As a result, the AR guide is prevented from lowering the user's attention to the real world, and the user of the vehicle can easily notice changes in the driving environment in the real world.
- the possibility that the object located around the vehicle is shielded by the AR guide is reduced, and the user can easily notice the object or the change in the behavior of the object.
- the user can perform a faster driving operation corresponding to the change in the behavior of the object.
- the user can obtain driving support from the system by displaying the AR guide, and at the same time, can operate the vehicle without lowering the attention to the objects located around the vehicle.
- the information processing apparatus and the information processing method according to the present embodiment may increase the ratio of the display area to transmit the surroundings by increasing the transmittance of a part or the whole of the image. As a result, the possibility that the object located around the vehicle is shielded by the AR guide is reduced, and the user can easily notice the object or the change in the behavior of the object. By controlling the transmittance of the image, the user can pay attention to the objects located around the vehicle while checking the displayed AR guide.
- the information processing apparatus and the information processing method according to the present embodiment may increase the ratio of the display area transmitting through the surroundings by setting the transmittance to 100%. As a result, it is possible to prevent the object located around the vehicle from being shielded by the AR guide, and the user can easily notice the object or the change in the behavior of the object. The user can also pay attention to objects located around the vehicle.
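Raising the transmittance can be pictured as compositing each guide pixel with the background seen through the display; this per-pixel sketch is an assumption about the rendering and not taken from the patent:

```python
def apply_transmittance(guide_rgb, background_rgb, transmittance):
    """Composite the guide over the background seen through the display.

    transmittance = 0.0 shows only the guide color; transmittance = 1.0
    lets the surroundings pass through completely, so the guide no
    longer shields anything.
    """
    t = max(0.0, min(1.0, transmittance))  # clamp to [0, 1]
    return tuple(
        round(g * (1.0 - t) + b * t)
        for g, b in zip(guide_rgb, background_rgb)
    )

# At 100% transmittance the displayed pixel equals the background exactly.
pixel = apply_transmittance((0, 255, 0), (90, 90, 90), 1.0)
```

Intermediate values let the user check the displayed AR guide while still seeing objects behind it, as described above.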
- The information processing apparatus and the information processing method according to the present embodiment may increase the proportion of the display area that transmits the surroundings by reducing the size of part or all of the image. This reduces the possibility that objects located around the vehicle are shielded by the AR guide.
- The information processing apparatus and the information processing method according to the present embodiment may increase the proportion of the display area that transmits the surroundings by reducing the width, along the road width direction of the road on which the vehicle travels, of the image showing the route. This makes it easier for the user to notice an object located on the side of the road on which the vehicle travels and a change in the behavior of that object. In particular, the user is more likely to notice an object popping out from the side of the road on which the vehicle travels. As a result, the user can perform a faster driving operation in response to an object popping out from the side of the road.
- The information processing apparatus and the information processing method according to the present embodiment may increase the proportion of the display area that transmits the surroundings by reducing the length, along the direction in which the vehicle travels, of the image showing the route. This makes it easier for the user to notice an object located ahead of the vehicle along the traveling direction and a change in the behavior of that object. As a result, the user can perform a faster driving operation in response to an object located ahead of the vehicle.
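Shrinking the guide's width (road-width direction) or length (travel direction) can be modeled as scaling a rectangular footprint; the rectangle model and the example dimensions are assumptions for illustration:

```python
def shrink_guide(guide, width_scale=1.0, length_scale=1.0):
    """Scale the guide's footprint.

    'width' is measured along the road-width direction, 'length' along
    the direction of travel; scales below 1.0 enlarge the see-through
    portion of the display area.
    """
    return {
        "width": guide["width"] * width_scale,
        "length": guide["length"] * length_scale,
    }

def visible_ratio(display_area, guide):
    """Fraction of the display area NOT covered by the guide."""
    covered = guide["width"] * guide["length"]
    return 1.0 - covered / display_area

full = {"width": 2.0, "length": 30.0}                # hypothetical units
narrow = shrink_guide(full, width_scale=0.5)         # reveals the roadside
short = shrink_guide(full, length_scale=5.0 / 30.0)  # reveals the road ahead
```

Narrowing the guide corresponds to the roadside case above (pop-out objects), while shortening it corresponds to the case of objects ahead along the traveling direction.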
- The information processing apparatus and the information processing method according to the present embodiment may increase the proportion of the display area that transmits the surroundings by bringing the color of part or all of the image closer to the surrounding colors. This prevents the AR guide from lowering the user's attention to objects located around the vehicle. Furthermore, the user can easily notice changes in the behavior of such objects. As a result, the user can perform a faster driving operation in response to a change in the behavior of an object.
- The information processing apparatus and the information processing method according to the present embodiment may determine, based on the information on the object located around the vehicle and on the route the vehicle is scheduled to travel, whether or not the object interferes with the traveling of the vehicle, and may increase the proportion when it is determined that the object interferes with the traveling of the vehicle. This makes it possible to extract objects that require a change in the driving operation of the vehicle in order to avoid interference with its travel. The user can therefore easily notice such an object and perform a faster driving operation in response to its behavior.
- The information processing apparatus and the information processing method according to the present embodiment may determine whether or not the object approaches the vehicle or the route and, when it is determined that the object approaches the vehicle or the route, determine that the object interferes with the traveling of the vehicle. This makes it possible to accurately extract objects that require a change in the driving operation of the vehicle in order to avoid interference with its travel.
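One way to decide "approaching" (an assumption; the patent does not fix the criterion) is to check whether the distance between the object and the vehicle, or a point on the planned route, decreases between successive observations:

```python
import math

def is_approaching(prev_pos, curr_pos, target):
    """True if the object moved closer to the target between two
    observations; target may be the vehicle or a point on the route."""
    d_prev = math.dist(prev_pos, target)
    d_curr = math.dist(curr_pos, target)
    return d_curr < d_prev

# A pedestrian stepping toward the vehicle located at the origin.
approaching = is_approaching((10.0, 5.0), (8.0, 4.0), (0.0, 0.0))
```

In practice the check would run against every point of the planned route within the predetermined distance, not just one target.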
- The information processing apparatus and the information processing method according to the present embodiment may determine whether or not the object, as viewed from the driver's seat, is shielded by the image displayed in the display area and, when it is determined that the object is shielded by the image, determine that the object interferes with the traveling of the vehicle. This makes it possible to accurately extract objects that require a change in the driving operation of the vehicle in order to avoid interference with its travel. Further, since whether the object is shielded by the image can be determined from information on the position of the object without using information on its velocity, the calculation cost of extracting the object can be reduced.
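The shielding test can be done purely from positions, for example by projecting the guide and the object into display coordinates and testing rectangle overlap; representing both as axis-aligned rectangles is an assumption made here for illustration:

```python
def rects_overlap(a, b):
    """Axis-aligned overlap test in display coordinates.

    Each rectangle is (left, top, right, bottom) as seen from the
    driver's seat; True means the guide rectangle shields part of
    the object's projection.
    """
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

guide_rect = (100, 200, 400, 300)   # AR guide in the display area
object_rect = (350, 250, 500, 320)  # projected other vehicle
shielded = rects_overlap(guide_rect, object_rect)
```

No velocity information enters the test, which is why this criterion keeps the extraction cost low, as noted above.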
- The information processing apparatus and the information processing method according to the present embodiment may calculate the collision time between the object and the vehicle, determine whether or not the collision time is equal to or less than a predetermined time and, when it is determined that the collision time is equal to or less than the predetermined time, determine that the object interferes with the traveling of the vehicle. This makes it possible to accurately extract objects that require a change in the driving operation of the vehicle in order to avoid interference with its travel.
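A common way to estimate the collision time (an assumption; the patent does not specify the formula) is to divide the current gap by the closing speed, then compare the result against the predetermined time:

```python
def time_to_collision(gap_m, closing_speed_mps):
    """Seconds until the gap closes; infinite if the gap is not closing."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return gap_m / closing_speed_mps

def interferes_by_ttc(gap_m, closing_speed_mps, threshold_s=3.0):
    """Treat the object as interfering when TTC is at or below the
    threshold; the 3-second default is a hypothetical value."""
    return time_to_collision(gap_m, closing_speed_mps) <= threshold_s

# A 20 m gap closing at 10 m/s gives a 2 s TTC, under a 3 s threshold.
flag = interferes_by_ttc(20.0, 10.0)
```

The closing speed would come from the object information acquired by the acquisition unit 25, e.g. as the relative speed along the line between object and vehicle.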
- The information processing apparatus and the information processing method according to the present embodiment may determine whether or not the speed of the vehicle is equal to or higher than a predetermined speed and, when it is determined that the speed of the vehicle is equal to or higher than the predetermined speed, determine that the object interferes with the traveling of the vehicle. This makes it possible to accurately extract objects that require a change in the driving operation of the vehicle in order to avoid interference with its travel. Further, since the determination can be made from speed information without using information on the position of the object, the calculation cost of extracting the object can be reduced.
- the information processing device and the information processing method according to the present embodiment may reduce the ratio when it is determined that the object does not interfere with the traveling of the vehicle.
- As a result, the fewer the objects that interfere with the traveling of the vehicle, the smaller the proportion of the display area that transmits the surroundings, and the more easily the AR guide can be seen.
- Processing circuits include programmed processors and electrical circuits, as well as devices such as application-specific integrated circuits (ASICs) and circuit components arranged to perform the described functions.
- 21 Display unit, 23 Display area, 25 Acquisition unit, 100 Controller, 110 Object extraction unit, 120 Determination unit, 130 Image generation unit, 140 Output unit
Landscapes
- Engineering & Computer Science (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Mathematical Physics (AREA)
- Traffic Control Systems (AREA)
- Navigation (AREA)
Abstract
Description
With reference to FIG. 1, a configuration example of the information processing apparatus according to the present embodiment will be described. As one example, the information processing apparatus is mounted on a vehicle. As shown in FIG. 1, the information processing apparatus includes an acquisition unit 25, a display unit 21, and a controller 100.
Next, the processing procedure of the information processing apparatus according to the present embodiment will be described with reference to the flowchart of FIG. 2. The processing of the information processing apparatus shown in FIG. 2 may be started based on a user's instruction, or may be repeatedly executed at a predetermined cycle.
As described in detail above, the information processing apparatus and the information processing method according to the present embodiment control a display unit having a display area that transmits the surroundings of a vehicle so as to be visible from the driver's seat of the vehicle. Based on information on an object located around the vehicle, it is determined whether or not the object is within a predetermined distance from the vehicle or the route on which the vehicle is scheduled to travel, and, when it is determined that the object is within the range, part or all of the image (AR guide) displayed superimposed on the display area is changed to increase the proportion of the display area that transmits the surroundings.
23 Display area
25 Acquisition unit
100 Controller
110 Object extraction unit
120 Determination unit
130 Image generation unit
140 Output unit
Claims (14)
- An information processing apparatus comprising a display unit, an acquisition unit, and a controller, wherein
the acquisition unit
acquires information on an object located around a vehicle and on a route on which the vehicle is scheduled to travel,
the display unit
has a display area that transmits the surroundings so as to be visible from the driver's seat of the vehicle, and
displays an image superimposed on the display area, and
the controller, when generating the image based on the route,
determines, based on the information, whether or not the object is within a predetermined distance from the vehicle or the route, and,
when it is determined that the object is within the range, changes part or all of the image to increase the proportion of the display area that transmits the surroundings.
- The information processing apparatus according to claim 1, wherein
the controller
increases the transmittance of part or all of the image.
- The information processing apparatus according to claim 2, wherein
the controller
sets the transmittance to 100%.
- The information processing apparatus according to any one of claims 1 to 3, wherein
the controller
reduces the size of part or all of the image.
- The information processing apparatus according to claim 4, wherein
the controller
reduces the width of the image showing the route along the road width direction of the road on which the vehicle travels.
- The information processing apparatus according to claim 4 or 5, wherein
the controller
reduces the length of the image showing the route along the direction in which the vehicle travels.
- The information processing apparatus according to any one of claims 1 to 6, wherein
the controller
brings the color of part or all of the image closer to the colors of the surroundings.
- The information processing apparatus according to any one of claims 1 to 7, wherein
the controller
determines, based on the information, whether or not the object interferes with the traveling of the vehicle, and
increases the proportion when it is determined that the object interferes with the traveling of the vehicle.
- The information processing apparatus according to claim 8, wherein
the controller
determines whether or not the object approaches the vehicle or the route, and
determines that the object interferes with the traveling of the vehicle when it is determined that the object approaches the vehicle or the route.
- The information processing apparatus according to claim 8 or 9, wherein
the controller
determines whether or not the object, as viewed from the driver's seat, is shielded by the image displayed in the display area, and
determines that the object interferes with the traveling of the vehicle when it is determined that the object is shielded by the image.
- The information processing apparatus according to any one of claims 8 to 10, wherein
the controller
calculates a collision time between the object and the vehicle,
determines whether or not the collision time is equal to or less than a predetermined time, and
determines that the object interferes with the traveling of the vehicle when it is determined that the collision time is equal to or less than the predetermined time.
- The information processing apparatus according to any one of claims 8 to 11, wherein
the controller
determines whether or not the speed of the vehicle is equal to or higher than a predetermined speed, and
determines that the object interferes with the traveling of the vehicle when it is determined that the speed of the vehicle is equal to or higher than the predetermined speed.
- The information processing apparatus according to any one of claims 8 to 12, wherein
the controller
reduces the proportion when it is determined that the object does not interfere with the traveling of the vehicle.
- An information processing method for controlling a display unit having a display area that transmits the surroundings of a vehicle so as to be visible from the driver's seat of the vehicle, the method comprising:
acquiring information on an object located around the vehicle and on a route on which the vehicle is scheduled to travel; and,
when generating, based on the route, an image to be displayed superimposed on the display area,
determining, based on the information, whether or not the object is within a predetermined distance from the vehicle or the route, and,
when it is determined that the object is within the range, changing part or all of the image to increase the proportion of the display area that transmits the surroundings.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/258,954 US20240042855A1 (en) | 2020-12-25 | 2020-12-25 | Information processing device and information processing method |
EP20966180.0A EP4269152A4 (en) | 2020-12-25 | 2020-12-25 | INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD |
JP2022570994A JPWO2022137558A1 (ja) | 2020-12-25 | 2020-12-25 | |
PCT/JP2020/048934 WO2022137558A1 (ja) | 2020-12-25 | 2020-12-25 | 情報処理装置及び情報処理方法 |
CN202080108112.2A CN116601034A (zh) | 2020-12-25 | 2020-12-25 | 信息处理装置和信息处理方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/048934 WO2022137558A1 (ja) | 2020-12-25 | 2020-12-25 | 情報処理装置及び情報処理方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022137558A1 true WO2022137558A1 (ja) | 2022-06-30 |
Family
ID=82157722
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/048934 WO2022137558A1 (ja) | 2020-12-25 | 2020-12-25 | 情報処理装置及び情報処理方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240042855A1 (ja) |
EP (1) | EP4269152A4 (ja) |
JP (1) | JPWO2022137558A1 (ja) |
CN (1) | CN116601034A (ja) |
WO (1) | WO2022137558A1 (ja) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006284458A (ja) * | 2005-04-01 | 2006-10-19 | Denso Corp | 運転支援情報表示システム |
JP2012153256A (ja) * | 2011-01-26 | 2012-08-16 | Toyota Motor Corp | 画像処理装置 |
JP2013108852A (ja) * | 2011-11-21 | 2013-06-06 | Alpine Electronics Inc | ナビゲーション装置 |
US20170187963A1 (en) * | 2015-12-24 | 2017-06-29 | Lg Electronics Inc. | Display device for vehicle and control method thereof |
JP2017151637A (ja) * | 2016-02-23 | 2017-08-31 | トヨタ自動車株式会社 | 表示装置 |
WO2018070193A1 (ja) | 2016-10-13 | 2018-04-19 | マクセル株式会社 | ヘッドアップディスプレイ装置 |
JP2019012236A (ja) * | 2017-06-30 | 2019-01-24 | パナソニックIpマネジメント株式会社 | 表示システム、情報提示システム、表示システムの制御方法、プログラム、及び移動体 |
JP2019059248A (ja) * | 2016-03-28 | 2019-04-18 | マクセル株式会社 | ヘッドアップディスプレイ装置 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5983547B2 (ja) * | 2013-07-02 | 2016-08-31 | 株式会社デンソー | ヘッドアップディスプレイ及びプログラム |
KR101899981B1 (ko) * | 2016-12-02 | 2018-09-19 | 엘지전자 주식회사 | 차량용 헤드 업 디스플레이 |
-
2020
- 2020-12-25 EP EP20966180.0A patent/EP4269152A4/en active Pending
- 2020-12-25 CN CN202080108112.2A patent/CN116601034A/zh active Pending
- 2020-12-25 US US18/258,954 patent/US20240042855A1/en active Pending
- 2020-12-25 WO PCT/JP2020/048934 patent/WO2022137558A1/ja active Application Filing
- 2020-12-25 JP JP2022570994A patent/JPWO2022137558A1/ja active Pending
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006284458A (ja) * | 2005-04-01 | 2006-10-19 | Denso Corp | 運転支援情報表示システム |
JP2012153256A (ja) * | 2011-01-26 | 2012-08-16 | Toyota Motor Corp | 画像処理装置 |
JP2013108852A (ja) * | 2011-11-21 | 2013-06-06 | Alpine Electronics Inc | ナビゲーション装置 |
US20170187963A1 (en) * | 2015-12-24 | 2017-06-29 | Lg Electronics Inc. | Display device for vehicle and control method thereof |
JP2017151637A (ja) * | 2016-02-23 | 2017-08-31 | トヨタ自動車株式会社 | 表示装置 |
JP2019059248A (ja) * | 2016-03-28 | 2019-04-18 | マクセル株式会社 | ヘッドアップディスプレイ装置 |
WO2018070193A1 (ja) | 2016-10-13 | 2018-04-19 | マクセル株式会社 | ヘッドアップディスプレイ装置 |
JP2019012236A (ja) * | 2017-06-30 | 2019-01-24 | パナソニックIpマネジメント株式会社 | 表示システム、情報提示システム、表示システムの制御方法、プログラム、及び移動体 |
Non-Patent Citations (1)
Title |
---|
See also references of EP4269152A4 |
Also Published As
Publication number | Publication date |
---|---|
EP4269152A4 (en) | 2024-01-17 |
JPWO2022137558A1 (ja) | 2022-06-30 |
US20240042855A1 (en) | 2024-02-08 |
EP4269152A1 (en) | 2023-11-01 |
CN116601034A (zh) | 2023-08-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102100051B1 (ko) | 색 선별 거울을 이용하는 자율 주행 차량용 3d lidar 시스템 | |
EP2860971B1 (en) | Display control apparatus, method, recording medium, and vehicle | |
KR102306790B1 (ko) | 컨텐츠 시각화 장치 및 방법 | |
US20180272934A1 (en) | Information presentation system | |
US11181737B2 (en) | Head-up display device for displaying display items having movement attribute or fixed attribute, display control method, and control program | |
US9196163B2 (en) | Driving support apparatus and driving support method | |
WO2019038903A1 (ja) | 周囲車両表示方法及び周囲車両表示装置 | |
JP5916541B2 (ja) | 車載システム | |
US20210016793A1 (en) | Control apparatus, display apparatus, movable body, and image display method | |
CN111034186B (zh) | 周围车辆显示方法及周围车辆显示装置 | |
JP7014205B2 (ja) | 表示制御装置および表示制御プログラム | |
US11698265B2 (en) | Vehicle display device | |
JP7409265B2 (ja) | 車載表示装置、方法およびプログラム | |
WO2019189515A1 (en) | Control apparatus, display apparatus, movable body, and image display method | |
WO2020105685A1 (ja) | 表示制御装置、方法、及びコンピュータ・プログラム | |
JP7127565B2 (ja) | 表示制御装置及び表示制御プログラム | |
WO2022137558A1 (ja) | 情報処理装置及び情報処理方法 | |
US11222552B2 (en) | Driving teaching device | |
JP7014206B2 (ja) | 表示制御装置および表示制御プログラム | |
JP2022516849A (ja) | ヘッドアップディスプレイシステム、方法、データキャリア、処理システム及び車両 | |
WO2021049215A1 (ja) | 表示制御装置、および表示制御プログラム | |
JP7321787B2 (ja) | 情報処理装置及び情報処理方法 | |
WO2021015090A1 (ja) | 制御装置 | |
JP2021104794A (ja) | 車両用表示装置 | |
JP2020138610A (ja) | 車両用表示制御装置、車両用表示制御方法、車両用表示制御プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20966180 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022570994 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202080108112.2 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18258954 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2020966180 Country of ref document: EP Effective date: 20230725 |