WO2020045040A1 - Control device, projection control system, moving object control system, moving object, control method, and control program - Google Patents


Info

Publication number: WO2020045040A1
Authority: WO (WIPO (PCT))
Prior art keywords: projection, moving object, vehicle, information, image data
Application number: PCT/JP2019/031473
Other languages: French (fr)
Inventors: Hiroshi Yamaguchi, Masato Kusanagi
Original assignee: Ricoh Company, Ltd.
Application filed by: Ricoh Company, Ltd.
Publication: WO2020045040A1 (en)


Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/16 Anti-collision systems
    • G08G 1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K 35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K 35/28 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K 35/60 Instruments characterised by their location or relative disposition in or on vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q 1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q 1/02 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
    • B60Q 1/04 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights
    • B60Q 1/06 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle
    • B60Q 1/08 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle automatically
    • B60Q 1/085 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle automatically due to special conditions, e.g. adverse weather, type of road, badly illuminated road signs or potential dangers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q 1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q 1/26 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q 1/48 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for parking purposes
    • B60Q 1/488 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for parking purposes for indicating intention to park
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K 2360/16 Type of output information
    • B60K 2360/179 Distances to obstacles or vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K 2360/20 Optical features of instruments
    • B60K 2360/33 Illumination features
    • B60K 2360/334 Projection means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K 2360/77 Instrument locations other than the dashboard
    • B60K 2360/797 Instrument locations other than the dashboard at the vehicle exterior
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q 2300/00 Indexing codes for automatically adjustable headlamps or automatically dimmable headlamps
    • B60Q 2300/10 Indexing codes relating to particular vehicle conditions
    • B60Q 2300/11 Linear movements of the vehicle
    • B60Q 2300/112 Vehicle speed
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q 2300/00 Indexing codes for automatically adjustable headlamps or automatically dimmable headlamps
    • B60Q 2300/10 Indexing codes relating to particular vehicle conditions
    • B60Q 2300/14 Other vehicle conditions
    • B60Q 2300/142 Turn signal actuation
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q 2300/00 Indexing codes for automatically adjustable headlamps or automatically dimmable headlamps
    • B60Q 2300/30 Indexing codes relating to the vehicle environment
    • B60Q 2300/32 Road surface or travel path
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q 2300/00 Indexing codes for automatically adjustable headlamps or automatically dimmable headlamps
    • B60Q 2300/40 Indexing codes relating to other road users or special conditions
    • B60Q 2300/42 Indexing codes relating to other road users or special conditions oncoming vehicle
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q 2300/00 Indexing codes for automatically adjustable headlamps or automatically dimmable headlamps
    • B60Q 2300/40 Indexing codes relating to other road users or special conditions
    • B60Q 2300/45 Special conditions, e.g. pedestrians, road signs or potential dangers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q 2400/00 Special features or arrangements of exterior signal lamps for vehicles
    • B60Q 2400/50 Projected symbol or information, e.g. onto the road or car body

Definitions

  • the present invention relates to a control device, a projection control system, a moving object control system, a moving object, a control method, and a control program.
  • the communication here includes not only verbal transmission of information, but also transmission of, for example, a pedestrian’s intention to cross a road to a vehicle by using a predetermined gesture when the pedestrian is about to cross the road.
  • the communication also includes a case where the vehicle detects the pedestrian's gesture and recognizes the intention to cross the road, and transmits a recognition result to the pedestrian.
  • the present invention has been made in view of the above problems, and has an object to improve the information-transmitting performance of a moving object.
  • a control device includes: an acquisition unit configured to acquire information recognized by a moving object; a generation unit configured to generate an image representing propagation of a wave in a display mode in accordance with the acquired information; and an output unit configured to output the image to a projection unit.
  • FIGs. 1A and 1B are illustrations of an example of arrangement of each device included in a projection control system in a moving object.
  • FIG. 2 is a first diagram illustrating an example of a hardware configuration of a projection device.
  • FIG. 3 is a second diagram illustrating an example of a hardware configuration of the projection device.
  • FIG. 4 is a third diagram illustrating an example of a hardware configuration of the projection device.
  • FIG. 5 is a diagram illustrating an example of a hardware configuration of a control device.
  • FIG. 6 is a first diagram illustrating an example of a functional configuration of the control device.
  • FIG. 7 is a first diagram illustrating a method for generating projection image data by an image generation unit.
  • FIG. 8 is a diagram illustrating an example of projection image data generated at each time by the image generation unit.
  • FIG. 9 is a first diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
  • FIG. 10 is a second diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
  • FIG. 11 is a third diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
  • FIG. 12 is a fourth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
  • FIG. 13 is a fifth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
  • FIG. 14 is a sixth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
  • FIG. 15 is a seventh diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
  • FIG. 16 is an eighth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
  • FIG. 17 is a second diagram illustrating an example of a functional configuration of the control device.
  • FIGs. 18A and 18B (FIG. 18) are illustrations of a method for generating projection image data by the image generation unit.
  • FIG. 19 is a ninth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
  • FIG. 20 is a tenth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
  • FIG. 21 is an eleventh diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
  • FIG. 22 is a twelfth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
  • FIG. 23 is a thirteenth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
  • FIG. 24 is a third diagram illustrating an example of a functional configuration of the control device.
  • FIGs. 25A and 25B are illustrations of a method for generating projection image data by the image generation unit.
  • FIG. 26 is a fourteenth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
  • FIG. 27 is a fourth diagram illustrating an example of a functional configuration of the control device.
  • FIGs. 28A and 28B (FIG. 28) are illustrations of a method for generating projection image data by the image generation unit.
  • FIG. 29 is a fifteenth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
  • FIG. 30 is a fifth diagram illustrating an example of a functional configuration of the control device.
  • FIGs. 31A and 31B are illustrations of a method for generating projection image data by the image generation unit.
  • FIG. 32 is a sixteenth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
  • FIG. 1 is a diagram illustrating an example of arrangement of each device included in a projection control system in a moving object.
  • in the following, a description is given assuming that the projection control system is mounted on a vehicle 100, as illustrated in FIG. 1.
  • the projection control system includes a projection unit (a projection device 110 and a projection device 120) and a control device 130.
  • the projection device 110 is disposed, for example, at the position of a left headlight of the vehicle 100, and projects a predetermined image to the front of the vehicle 100.
  • the projection device 120 is disposed, for example, at the position of a right headlight of the vehicle 100, and projects a predetermined image to the front of the vehicle 100.
  • control device 130 is disposed, for example, in a dashboard of the vehicle 100.
  • the control device 130 generates projection image data (left-side projection image data and right-side projection image data) to be projected by the projection devices 110 and 120 in a predetermined image update cycle, and transmits the generated projection image data to the projection devices 110 and 120.
  • the number of projection devices included in the projection control system is not limited to two, but may be one.
  • one projection device is disposed, for example, at a front center position of the vehicle 100, and projects a predetermined image to the front of the vehicle 100.
  • the number of projection devices included in the projection control system may be three or more.
  • the projection devices are disposed, for example, on both side surfaces and a rear surface (for example, the position of tail lamps) of the vehicle 100 in addition to the positions of the headlights of the vehicle 100, and project predetermined images to the front, the both sides, and the rear of the vehicle 100.
  • FIG. 2 is a first diagram illustrating an example of a hardware configuration of the projection device. Note that, since the projection devices 110 and 120 have the same hardware configuration, only the projection device 110 is illustrated in the example of FIG. 2.
  • the projection device 110 includes a light source 201, a collimator lens 202, microelectromechanical systems (MEMS) 203, a wavelength conversion element 205, and a projection lens 206.
  • the light source 201 emits, for example, light having a blue wavelength band in order to draw predetermined projection image data (here, left-side projection image data) generated by the control device 130.
  • the collimator lens 202 condenses the light flux emitted from the light source 201 on the MEMS 203.
  • the MEMS 203 includes a reflection mirror that is swing-driven, based on a control signal from the control device 130, by a mechanism tiltable in two axial directions, namely, the vertical and horizontal directions.
  • the MEMS 203 controls the reflected light in a range represented by a scanning width 204 to two-dimensionally scan the wavelength conversion element 205.
  • the wavelength conversion element 205 is a reflective fluorescent material on which the predetermined projection image data is drawn; when irradiated from the front side with the blue light flux two-dimensionally scanned by the MEMS 203, the wavelength conversion element 205 emits yellow fluorescence (light including at least the green and red wavelength bands).
  • the projection lens 206 projects, to the front, light that has turned white due to mixing of the light converted by the wavelength conversion element 205 with the light that has not been converted.
  • an image in accordance with the predetermined projection image data generated by the control device 130 can be projected into a space in front of the vehicle 100.
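  • The two-axis MEMS scan described above can be sketched as a mapping from each pixel of the projection image to a pair of mirror tilt angles. The following is an illustrative sketch only; the image resolution and angle ranges are assumptions, not values stated in this publication.

```python
def mems_tilt_angles(x, y, width=640, height=360,
                     max_h_deg=10.0, max_v_deg=6.0):
    """Map pixel (x, y) of the projection image to mirror tilt angles.

    The mirror tilts symmetrically about its rest position in two axes
    (horizontal and vertical), so the image corners map to the extremes
    of the scanning width. All parameter values are illustrative.
    """
    h = (x / (width - 1) - 0.5) * 2.0 * max_h_deg
    v = (y / (height - 1) - 0.5) * 2.0 * max_v_deg
    return h, v

def raster_scan(width=640, height=360):
    """Visit every pixel once per image update cycle (raster order)."""
    for y in range(height):
        for x in range(width):
            yield mems_tilt_angles(x, y, width, height)
```

  For example, pixel (0, 0) maps to the extreme negative tilt on both axes, and the opposite corner to the extreme positive tilt.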
  • FIG. 4 is a third diagram illustrating an example of a hardware configuration of the projection device. Differences from FIGS. 2 and 3 are that the projection device 110 of FIG. 4 includes a light source 401 instead of the light source 201, and includes a microdisplay 405 instead of the wavelength conversion elements 205 and 305.
  • the light source 401 is a white LED that emits white light based on a control signal from the control device 130.
  • the microdisplay 405 is irradiated with the light emitted from the light source 401 via the collimator lens 202.
  • the microdisplay 405 is, for example, a digital micromirror device (DMD (registered trademark)).
  • the microdisplay 405 displays predetermined projection image data (here, left-side projection image data) generated by the control device 130, and controls turning on and off of image light for each pixel in accordance with the displayed predetermined projection image data.
  • the light emitted onto the microdisplay 405 from the light source 401 is reflected toward the projection lens 206 when the image light is on.
  • an image in accordance with the predetermined projection image data generated by the control device 130 can be projected into a space in front of the vehicle 100.
  • when the image light is off, the light emitted onto the microdisplay 405 by the light source 401 is directed in a direction different from that of the projection lens 206 (see a dotted arrow 406). In this case, a black image is projected into the space in front of the vehicle 100.
  • the microdisplay 405 is not limited to the DMD (registered trademark), but may be other reflective liquid crystal panels or transmissive liquid crystal panels.
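  • Because each DMD mirror is binary (image light on or off per pixel), gray levels are commonly reproduced by time-multiplexing binary bit planes. The sketch below illustrates that general principle; it is an assumption about typical DMD driving, not a scheme stated in this publication.

```python
def bit_planes(pixel_value, bits=8):
    """Decompose an 8-bit pixel value into binary mirror states.

    Plane k is 'on' when bit k of the value is set; showing plane k for
    a duration proportional to 2**k reproduces the gray level on average.
    """
    return [(pixel_value >> k) & 1 for k in range(bits)]

def reconstruct(planes):
    """Average brightness implied by the binary-weighted plane durations."""
    return sum(state << k for k, state in enumerate(planes))
```

  For instance, a pixel value of 128 keeps the mirror off in every plane except the most significant one, which alone carries half of full brightness.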
  • FIG. 5 is a diagram illustrating an example of a hardware configuration of a control device.
  • control device 130 includes a central processing unit (CPU) 501, random access memory (RAM) 502, a storage unit 503, and an input/output unit 504.
  • the units of the control device 130 are mutually connected via a bus 505.
  • the CPU 501 is a computer that executes a program (for example, a control program described later) stored in the storage unit 503.
  • the RAM 502 is a main storage device, such as dynamic random access memory (DRAM) or static random access memory (SRAM).
  • the RAM 502 functions as a work area in which the program stored in the storage unit 503 is developed when the program is to be executed by the CPU 501.
  • the storage unit 503 is a non-volatile memory, such as EPROM and EEPROM, and stores a program to be executed by the CPU 501.
  • the input/output unit 504 is an interface device for communicating with the projection devices 110 and 120 or a controller area network (CAN) (not illustrated).
  • FIG. 6 is a first diagram illustrating an example of a functional configuration of the control device.
  • a control program is installed in the control device 130; when the control program is executed, the control device 130 functions as an information acquisition unit 610, a speed/acceleration information extraction unit 620, an image generation unit 630, and a left/right dividing unit 640.
  • the information acquisition unit 610 acquires CAN information from the CAN.
  • the CAN information includes various types of information recognized by the vehicle 100.
  • the speed/acceleration information extraction unit 620 is an example of an acquisition unit, and extracts speed information and acceleration information of the vehicle 100 included in the CAN information acquired by the information acquisition unit 610.
  • the image generation unit 630 is an example of a generation unit, and generates projection image data based on the speed information and the acceleration information extracted by the speed/acceleration information extraction unit 620.
  • the left/right dividing unit 640 is an example of an output unit.
  • the left/right dividing unit 640 divides the projection image data generated by the image generation unit 630 into data for the projection device 110 and data for the projection device 120, and outputs left-side projection image data and right-side projection image data to the projection unit (the projection device 110 and the projection device 120).
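  • The flow through these functional units might be sketched as follows. The CAN field names and the image representation are illustrative assumptions; the actual CAN signal layout is vehicle-specific and is not stated in this publication.

```python
def extract_speed_accel(can_info):
    """Speed/acceleration extraction (hypothetical CAN field names)."""
    return can_info["speed_kmh"], can_info["accel_mps2"]

def split_left_right(image):
    """Divide one projection image into halves for the two projectors."""
    half = len(image[0]) // 2
    left = [row[:half] for row in image]
    right = [row[half:] for row in image]
    return left, right

# One pass of the pipeline for a single image update cycle:
can_info = {"speed_kmh": 30.0, "accel_mps2": 0.0}  # acquired CAN information
speed, accel = extract_speed_accel(can_info)
frame = [[0] * 640 for _ in range(360)]            # placeholder image data
left_image, right_image = split_left_right(frame)  # outputs to devices 110/120
```

  The left half would be output to the projection device 110 and the right half to the projection device 120.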
  • the image generation unit 630 generates “image data indicating wave propagation” as projection image data.
  • as illustrated in FIG. 7, the image generation unit 630 includes a wavelength determining unit 710, a wave moving speed determining unit 720, and an update unit 730.
  • the wavelength determining unit 710 determines a wavelength based on the acceleration information extracted by the speed/acceleration information extraction unit 620.
  • the wave moving speed determining unit 720 determines a moving speed of the wave based on the speed information extracted by the speed/acceleration information extraction unit 620.
  • the update unit 730 generates projection image data based on the wavelength determined by the wavelength determining unit 710 and the moving speed of the wave determined by the wave moving speed determining unit 720.
  • the update unit 730 generates projection image data at a predetermined image update cycle.
  • FIG. 8 is a diagram illustrating an example of projection image data generated at each time by the image generation unit. Note that, in the example of FIG. 8, to express propagation of a wave, peak positions of the height of the waves (transverse waves) are represented by a broken line.
  • projection image data 701 schematically indicates projection image data generated by the image generation unit 630 at time T1. If the acceleration information of the vehicle 100 is "0" at time T1, the wavelength has a default length d.
  • the wavelength depends on the acceleration information of the vehicle 100.
  • projection image data 702 schematically indicates projection image data generated by the image generation unit 630 at time T2. Also at time T2, if the acceleration information of the vehicle 100 remains "0," the wavelength in the projection image data 702 remains d, as illustrated in FIG. 8.
  • a length L which the wave moves during the time range (T2-T1) is a length proportional to the speed information ("v1") of the vehicle 100. That is, in the propagation of the waves indicated by the projection image data, the moving speed of the wave depends on the speed information of the vehicle 100.
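  • The relationships illustrated in FIG. 8 — the wavelength determined from the acceleration information and the movement of the wave proportional to the speed information — might be sketched as follows. The scaling constants are illustrative assumptions, not values from this publication.

```python
def wavelength(accel, default_d=1.0, k=0.2, min_d=0.2):
    """Wavelength of the projected wave pattern.

    With zero acceleration the default length d is used; deceleration
    (negative acceleration) shortens the wavelength. The constants k and
    min_d are illustrative.
    """
    return max(min_d, default_d + k * accel)

def peak_positions(t, speed, accel, n_peaks=5, c=1.0):
    """Positions of the wave peaks (the broken lines of FIG. 8) at time t.

    Every peak advances by c * speed * t, so the distance moved between
    two update times is proportional to the vehicle speed; adjacent
    peaks are separated by the current wavelength.
    """
    d = wavelength(accel)
    offset = (c * speed * t) % d  # phase of the propagating pattern
    return [offset + i * d for i in range(n_peaks)]
```

  Calling `peak_positions` once per image update cycle with the current time yields the pattern whose peaks move forward at a rate proportional to the speed information.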
  • Image Projection Example: An example in which projection image data generated by the projection control system is projected onto a road surface (moving route) is described below. Note that, in FIG. 8, the propagation of a wave is expressed by illustrating the peak positions of the height of the waves with a line; however, the method for expressing wave propagation in the image data is not limited to this. Therefore, the image projection examples below illustrate various methods for expressing the propagation of a wave.
  • FIG. 9 is a first diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system.
  • the example of FIG. 9 illustrates a case in which the peak positions of the height of the waves are represented by lines (or bands) in image data indicating propagation of the waves (transverse waves).
  • the example of FIG. 9 illustrates a case in which the acceleration information is zero.
  • a person who shares the environment with the vehicle 100 can recognize the speed of the vehicle 100 from the moving speed of the lines (or bands). Further, since the distance between the lines (or bands) is constant, a person who shares the environment with the vehicle 100 can recognize that the vehicle 100 is traveling at a constant speed.
  • FIG. 10 is a second diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system.
  • the example of FIG. 10 illustrates a case in which the peak position of the wave height is represented by a line (or a band) in the image data indicating the propagation of the waves (transverse waves). Note that the example in FIG. 10 illustrates a case in which the traffic signal in the traveling direction (moving direction) turns red at an intersection and the vehicle 100 decelerates (the acceleration information becomes negative).
  • since the wavelength becomes shorter, the driver of another vehicle waiting for a right turn in the opposite lane can recognize that the driver of the vehicle 100 has an intention to stop.
  • although FIG. 10 illustrates a scene in which the vehicle 100 decelerates at an intersection, the same effect can be obtained in a case where, for example, a pedestrian is about to cross a pedestrian crossing, the driver of the vehicle 100 recognizes this, and the vehicle 100 decelerates before the pedestrian crossing.
  • since the wavelength becomes shorter, the pedestrian who is about to cross the pedestrian crossing can recognize that the driver of the vehicle 100 has an intention to stop.
  • FIG. 11 is a third diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system.
  • the example of FIG. 11 illustrates a case in which the peak positions of the height of the waves are represented by sets of dots in the image data indicating the propagation of the waves (transverse waves).
  • the example of FIG. 11 illustrates a case in which the acceleration information is zero.
  • FIG. 12 is a fourth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system.
  • the example of FIG. 12 illustrates a case in which the density of the waves is represented by the density of dots in image data indicating the propagation of the waves (longitudinal waves).
  • the example of FIG. 12 illustrates a case in which the acceleration information is zero.
  • a person who shares the environment with the vehicle 100 (for example, a pedestrian or a driver of another vehicle) can recognize the speed of the vehicle 100 from the moving speed at which the dense position of the dots moves. Further, since the distance between the dense positions of the dots is constant, a person who shares the environment with the vehicle 100 can recognize that the vehicle 100 is traveling at a constant speed.
  • FIG. 13 is a fifth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system.
  • the example of FIG. 13 illustrates a case in which the peak positions of the height of the waves are represented by transverse lines (or transverse bands) in the image data indicating the propagation of the waves (transverse waves), and illustrates longitudinal lines (or longitudinal bands) radially extending to the front.
  • the example of FIG. 13 illustrates a case in which the acceleration information is zero.
  • transverse lines or transverse bands
  • a person who shares the environment with the vehicle 100 can recognize the speed of the vehicle 100 from the moving speed of the transverse lines (or transverse bands). Further, since the distance between the transverse lines (or transverse bands) is constant, a person who shares the environment with the vehicle 100 can recognize that the vehicle 100 is traveling at a constant speed.
  • FIG. 14 is a sixth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system.
  • the example of FIG. 14 illustrates a case in which the peak position of the height of the wave is represented by a line (or a band) in image data indicating propagation of the wave (transverse wave).
  • a difference from FIG. 9 is that the wavelength and the moving speed of the wave are greater than those in FIG. 9, and the number of lines (or bands) projected on the road surface at the same time is reduced (in the example of FIG. 14, only one line (or band) is projected at a time).
  • a person who shares the environment with the vehicle 100 can recognize the speed of the vehicle 100 from the moving speed of the line (or the band). Further, since the interval at which the lines (or bands) appear is constant, a person who shares the environment with the vehicle 100 can recognize that the vehicle 100 is traveling at a constant speed.
  • FIG. 15 is a seventh diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system.
  • the example of FIG. 15 illustrates a case in which a peak position of a wave (transverse wave) moving in a direction crossing the moving direction (travel direction) of the vehicle 100, which is a circumferential direction about a front position of the vehicle 100, is indicated by a line (or a band).
  • the wavelength and the moving speed of the wave are increased, and the number of lines (or bands) projected on the road surface at the same time is reduced (the example of FIG. 15 illustrates a case in which a single line or band is projected at a time).
  • a person who shares the environment with the vehicle 100 can recognize the speed of the vehicle 100 from the moving speed of the line (or the band). Further, since the interval at which the lines (or bands) appear is constant, a person who shares the environment with the vehicle 100 can recognize that the vehicle 100 is traveling at a constant speed.
  • FIG. 16 is an eighth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system.
  • the example of FIG. 16 illustrates a case in which the peak positions of the height of the waves are represented by lines (or bands) in image data indicating propagation of the waves (transverse waves).
  • a difference from FIG. 9 is that the lines (or the bands) indicating the peak positions of the height of the waves are projected on the road surface in the entire circumferential direction of the vehicle 100.
  • a person who shares the environment with the vehicle 100 can recognize the speed of the vehicle 100 from any direction based on the moving speed of the lines (or bands). Further, since the distance between the lines (or bands) is constant, a person who shares the environment with the vehicle 100 can recognize from any direction that the vehicle 100 is traveling at a constant speed.
  • with the projection control system of the first embodiment, it is possible to transmit information (speed information, acceleration information) recognized by the vehicle 100 to a person who shares the environment with the vehicle 100. That is, according to the projection control system of the first embodiment, the transmitting performance of the vehicle can be improved.
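The display-mode mapping described for the first embodiment (wave speed tracking the vehicle speed, wavelength shrinking under deceleration, and constant spacing signalling constant speed) might be sketched as follows. The function names, units, and scaling constants are illustrative assumptions, not values from the disclosure:

```python
def wave_parameters(speed_mps, accel_mps2,
                    base_wavelength=1.0, base_wave_speed=2.0):
    """Map the vehicle's speed/acceleration (assumed units: m/s, m/s^2)
    to wave display parameters: a faster vehicle gives faster-moving
    waves; harder braking gives a shorter wavelength."""
    wave_speed = base_wave_speed + 0.5 * speed_mps      # illustrative gain
    # Shorten the wavelength as the vehicle decelerates harder.
    wavelength = base_wavelength / (1.0 + max(0.0, -accel_mps2))
    return wavelength, wave_speed


def peak_positions(t, wavelength, wave_speed, extent=10.0):
    """Distances (ahead of the vehicle) of the wave peaks at time t;
    between frames the peaks advance by wave_speed * dt."""
    offset = (wave_speed * t) % wavelength
    n = int((extent - offset) / wavelength) + 1
    return [offset + k * wavelength for k in range(n)]
```

At constant speed the peaks stay evenly spaced while drifting forward, which is exactly the cue an observer is said to read as "traveling at a constant speed".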
  • in the first embodiment, a case is described in which the speed information and the acceleration information of the vehicle 100 are extracted from the CAN information and reflected in the display mode of the image data indicating the propagation of the waves.
  • however, the CAN information to be reflected in the display mode of the image data indicating the propagation of the waves is not limited thereto.
  • for example, route information indicating the travel route (moving route) of the vehicle 100 may be extracted and reflected in the display mode of the image data indicating the propagation of the waves.
  • the second embodiment is described mainly about differences from the first embodiment.
  • FIG. 17 is a second diagram illustrating an example of a functional configuration of the control device. A difference from the functional configuration illustrated in FIG. 6 is that the functional configuration illustrated in FIG. 17 includes a route information extraction unit 1710 and an image generation unit 1720.
  • the route information extraction unit 1710 is an example of an acquisition unit, and extracts route information indicating the travel route of the vehicle 100, which is included in the CAN information acquired by the information acquisition unit 610.
  • the route information here includes, when the driver of the vehicle 100 is driving, steering angle information, blinker operation information, navigation information of a navigation device, and the like.
  • the route information also includes route information indicating a travel route of the vehicle 100 controlled by the advanced driver assistance system (ADAS).
  • the travel route of the vehicle 100 controlled by the advanced driver assistance system includes, for example, a travel route for avoiding collisions with obstacles.
  • the image generation unit 1720 is an example of a generation unit, and generates projection image data based on the speed information and acceleration information extracted by the speed/acceleration information extraction unit 620 and the route information extracted by the route information extraction unit 1710.
  • FIG. 18 is a second diagram illustrating a method for generating projection image data by an image generation unit.
  • a difference from the method for generating projection image data by the image generation unit 630 illustrated in FIG. 7 is that the image generation unit 1720 in FIG. 18A includes a wave moving direction determining unit 1810 and an update unit 1820.
  • the wave moving direction determining unit 1810 determines the moving direction of the wave based on the route information extracted by the route information extraction unit 1710.
  • the update unit 1820 generates projection image data based on the wavelength determined by the wavelength determining unit 710, the moving speed of the wave determined by the wave moving speed determining unit 720, and the moving direction of the wave determined by the wave moving direction determining unit 1810.
  • FIG. 18B is a diagram illustrating a specific example of projection image data 1830 generated by the image generation unit 1720. As illustrated in FIG. 18B, it is assumed that an arrow 1831 is determined as the moving direction of the wave on the projection image data 1830 based on the route information extracted by the route information extraction unit 1710. In this case, the update unit 1820 generates projection image data so that the wave moves along the arrow 1831.
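A minimal sketch of how the update unit 1820 might advance wavefronts along the moving direction determined from the route information (the arrow 1831). The geometry, parameter names, and units are assumptions for illustration:

```python
def normalize(v):
    """Unit vector in the direction of v = (x, y)."""
    n = (v[0] ** 2 + v[1] ** 2) ** 0.5
    return (v[0] / n, v[1] / n)


def wavefront_centres(t, direction, wavelength, wave_speed, n_fronts=5):
    """Centres of successive wavefronts advancing along `direction`
    (the direction derived from the route information) at time t."""
    dx, dy = normalize(direction)
    offset = (wave_speed * t) % wavelength
    return [((offset + k * wavelength) * dx,
             (offset + k * wavelength) * dy) for k in range(n_fronts)]
```

Replacing the fixed forward direction with a per-frame direction from the route is what lets the projected waves bend toward a lane change, an overtaking path, or a parking space.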
  • FIG. 19 is a ninth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system.
  • the example of FIG. 19 illustrates a scene in which the vehicle 100 changes lanes.
  • FIG. 20 is a tenth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system.
  • the example of FIG. 20 illustrates a scene in which the vehicle 100 is overtaking a motorcycle.
  • the route information extraction unit 1710 extracts route information indicating the travel route along which the vehicle 100 is to overtake. Further, the wave moving direction determining unit 1810 determines the moving direction of the wave based on the extracted route information. Then, as illustrated in FIG. 20, an image in which waves propagate along the travel route of the vehicle 100 when it overtakes the motorcycle is projected on the road surface around the motorcycle.
  • FIG. 21 is an eleventh diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system.
  • the example of FIG. 21 illustrates a scene in which the vehicle 100 is parking.
  • the route information extraction unit 1710 extracts route information indicating the travel route of the vehicle 100 at the time of parking. Further, the wave moving direction determining unit 1810 determines the moving direction of the wave based on the extracted route information. As a result, as illustrated in FIG. 21, an image in which waves propagate along the travel route of the vehicle 100 at the time of parking is projected on the road surface.
  • a person who shares the environment with the vehicle 100 (in the example of FIG. 21, a pedestrian near the parking space or the like) can recognize that the vehicle 100 is about to park.
  • FIG. 22 is a twelfth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system.
  • the example of FIG. 22 illustrates a scene in which the vehicle 100 turns left on a road with one lane on one side.
  • the blinkers of the vehicle 100 cannot be seen by the driver of another vehicle traveling in the lane after the left turn. For this reason, when no image is projected on the road surface by the projection control system, the driver of the other vehicle cannot recognize whether the vehicle 100 will turn left. On the other hand, when the projection control system projects the image on the road surface of the lane after the left turn, drivers of other vehicles traveling in that lane can easily recognize that the vehicle 100 is about to turn left.
  • FIG. 23 is a thirteenth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system.
  • the example of FIG. 23 illustrates a scene in which the vehicle 100 joins at a junction.
  • the route information extraction unit 1710 extracts navigation information as route information indicating the travel route of the vehicle 100. Further, the wave moving direction determining unit 1810 determines the moving direction of the wave based on the extracted navigation information. Then, as illustrated in FIG. 23, an image in which waves propagate toward the lane to which the vehicle 100 is about to join at the junction is projected on the road surface of the lane to which the vehicle 100 is about to join.
  • a person who shares the environment with the vehicle 100 (in the example of FIG. 23, a driver of another vehicle traveling in the lane to which the vehicle 100 is about to join) can recognize that the vehicle 100 is moving to the position toward which the wave propagates.
  • the projection control system according to the second embodiment acquires information (speed information, acceleration information, route information) recognized by the vehicle 100; generates image data (projection image data) indicating propagation of waves in a display mode in accordance with the acquired information (speed information, acceleration information, route information); and projects the generated projection image data via the projection device onto the road surface on which the vehicle 100 travels at all times while the vehicle 100 is traveling.
  • information (speed information, acceleration information, route information) recognized by the vehicle 100 can thereby be transmitted to a person who shares the environment with the vehicle 100. That is, according to the projection control system of the second embodiment, the transmitting performance of the vehicle can be further improved.
  • the person information extraction unit 2410 is an example of an acquisition unit, and extracts person information indicating a detection result of a person included in the CAN information acquired by the information acquisition unit 610.
  • the person information includes a distance from the vehicle 100 to the detected person and a direction (angle) of the detected person as viewed from the vehicle 100.
  • the image generation unit 2420 is an example of a generation unit, and generates projection image data based on the speed information and acceleration information extracted by the speed/acceleration information extraction unit 620 and the route information extracted by the route information extraction unit 1710. Further, the image generation unit 2420 transforms the waveform of the waves represented in the generated projection image data (for example, the form of a line or a band representing the peak position of the height of the wave) based on the person information extracted by the person information extraction unit 2410. Furthermore, the image generation unit 2420 outputs projection image data indicating waves of which waveform has been transformed.
  • FIG. 25 is a third diagram illustrating a method for generating projection image data by an image generation unit.
  • a difference from the method for generating projection image data by the image generation unit 1720 illustrated in FIG. 18 is that the image generation unit 2420 in FIG. 25A includes a vector calculation unit 2510 and an update unit 2520.
  • the update unit 2520 transforms the waveform of the waves represented in the generated projection image data. Furthermore, the update unit 2520 outputs projection image data indicating waves of which waveform has been transformed.
  • FIG. 25B is a diagram illustrating a specific example of projection image data 2530 output by the image generation unit 2420.
  • the vector calculation unit 2510 specifies the position, on the projection image data 2530, that corresponds to the person information extracted by the person information extraction unit 2410 (see a black circle 2531).
  • the vector calculation unit 2510 calculates concentric vectors of different sizes about the position of the black circle 2531 (for example, vectors directed outward from the center).
  • the calculated vectors have the same size on the same circle, and become larger closer to the black circle 2531 and smaller farther away from it.
  • the concentric circles extend outward with time, and FIG. 25B illustrates the positions of the concentric circles at a predetermined time.
  • the update unit 2520 transforms the waveform of the waves represented in the projection image data 2530 based on the calculated vectors. In this manner, when the propagating waves come into contact with the concentric circles, the update unit 2520 can represent, in the projection image data 2530, a state in which the waves are pressed back in the direction of the vectors on the circles and with a force corresponding to those vectors (as if the waves are affected by the person).
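The concentric vector field around the detected person, and the resulting press-back of sampled wave points, might look roughly like this. The decay law and the `strength` parameter are illustrative assumptions:

```python
def person_vector(point, person_pos, strength=1.0):
    """Outward vector of the concentric field centred on the detected
    person: same magnitude on the same circle, larger nearer the person,
    smaller farther away (as described for the black circle 2531)."""
    dx, dy = point[0] - person_pos[0], point[1] - person_pos[1]
    dist = max((dx * dx + dy * dy) ** 0.5, 1e-6)   # avoid division by zero
    mag = strength / (1.0 + dist)
    return (mag * dx / dist, mag * dy / dist)


def press_back(wave_points, person_pos, strength=1.0):
    """Displace each sampled wave point along the field so the waveform
    looks pressed back around the person."""
    out = []
    for p in wave_points:
        vx, vy = person_vector(p, person_pos, strength)
        out.append((p[0] + vx, p[1] + vy))
    return out
```

Because the displacement shrinks with distance, only the part of the wavefront near the person visibly deforms, which is the cue telling the person they have been recognized.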
  • FIG. 26 is a fourteenth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system.
  • the example of FIG. 26 illustrates a scene in which a person is about to cross a road at the right front side of the vehicle 100.
  • the vector calculation unit 2510 calculates a vector based on the extracted person information. Then, as illustrated in FIG. 26, an image in which waves of which waveform near a person has been transformed are propagating is projected on the road surface.
  • the person who shares the environment with the vehicle 100 (in the example of FIG. 26, the person who is about to cross the road) can notice that the vehicle 100 has recognized his/her existence.
  • the projection control system according to the third embodiment acquires information (speed information, acceleration information, route information, person information) recognized by the vehicle 100; generates image data (projection image data) indicating propagation of waves in a display mode in accordance with the acquired information (speed information, acceleration information, route information); projects the generated projection image data via the projection device onto the road surface on which the vehicle 100 travels at all times while the vehicle 100 is traveling; when the vehicle 100 extracts person information, transforms the waveform of the waves represented in the generated projection image data in accordance with the extracted person information and outputs the result; and, when the vehicle 100 extracts person information, projects the projection image data indicating the waves of which waveform has been transformed via the projection device onto the road surface on which the vehicle 100 is traveling.
  • with the projection control system of the third embodiment, it is possible to transmit, to a person who shares the environment with the vehicle 100, the fact that the vehicle 100 has recognized the person. That is, according to the projection control system of the third embodiment, the transmitting performance of the vehicle can be further improved.
  • however, the CAN information to be reflected in the display mode of the image data indicating the propagation of the waves is not limited thereto.
  • for example, information on an oncoming vehicle (oncoming vehicle information) may be extracted and reflected in the display mode of the image data indicating the propagation of the waves.
  • a fourth embodiment is described mainly about differences from the third embodiment.
  • FIG. 27 is a fourth diagram illustrating an example of a functional configuration of the control device. A difference from the functional configuration illustrated in FIG. 24 is that a control device 2700 in FIG. 27 includes an oncoming vehicle information extraction unit 2710 and an image generation unit 2720.
  • the oncoming vehicle information extraction unit 2710 is an example of an acquisition unit, and extracts oncoming vehicle information included in the CAN information acquired by the information acquisition unit 610.
  • the oncoming vehicle information includes a distance from the vehicle 100 to the oncoming vehicle, a direction (angle) to the oncoming vehicle as viewed from the vehicle 100, speed information of the oncoming vehicle, and speed information of the vehicle 100.
  • the image generation unit 2720 is an example of a generation unit, and generates projection image data based on the speed information and acceleration information extracted by the speed/acceleration information extraction unit 620 and the route information extracted by the route information extraction unit 1710. Further, based on the oncoming vehicle information extracted by the oncoming vehicle information extraction unit 2710, the image generation unit 2720 transforms the waveform of the waves represented in the generated projection image data. Furthermore, the image generation unit 2720 outputs projection image data indicating the waves of which waveform has been transformed.
  • FIG. 28 is a fourth diagram illustrating a method for generating projection image data by an image generation unit.
  • a difference from the method for generating projection image data by the image generation unit 2420 illustrated in FIG. 25 is that the image generation unit 2720 illustrated in FIG. 28A includes a position/relative speed calculation unit 2810 and an update unit 2820.
  • the position/relative speed calculation unit 2810 calculates relative positions and relative speed between the vehicle 100 and an oncoming vehicle for transforming the waveform of the waves.
  • the update unit 2820 generates projection image data based on the wavelength determined by the wavelength determining unit 710, the moving speed of the wave determined by the wave moving speed determining unit 720, and the moving direction of the wave determined by the wave moving direction determining unit 1810.
  • the update unit 2820 transforms the waveform of the waves represented in the generated projection image data. Furthermore, the update unit 2820 outputs projection image data indicating waves of which waveform has been transformed.
  • FIG. 28B is a diagram illustrating a specific example of projection image data 2830 output by the image generation unit 2720.
  • an arrow 2831 represents position/speed information of the vehicle 100 in the projection image data 2830, which is specified based on the oncoming vehicle information.
  • an arrow 2832 represents the position/speed information of the oncoming vehicle in the projection image data 2830, which is specified based on the oncoming vehicle information.
  • the update unit 2820 transforms the waveform of the waves represented in the projection image data 2830 based on the calculated relative positions and relative speed. In this manner, the update unit 2820 can express, in the projection image data 2830, a state in which the moving speed of the propagating waves is reduced at a position in accordance with the relative positions and relative speed (as if the waves are weakened by the oncoming vehicle).
  • FIG. 29 is a fifteenth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system.
  • the example of FIG. 29 illustrates a scene in which an oncoming vehicle traveling straight approaches from the right front side of the vehicle 100.
  • the position/relative speed calculation unit 2810 calculates the relative positions and the relative speed between the vehicle 100 and the oncoming vehicle based on the extracted oncoming vehicle information. Then, as illustrated in FIG. 29, an image in which waves of which waveform on the oncoming vehicle side has been transformed are propagating is projected on the road surface.
  • the projection control system according to the fourth embodiment acquires information (speed information, acceleration information, route information, oncoming vehicle information) recognized by the vehicle 100; generates image data (projection image data) indicating propagation of waves in a display mode in accordance with the acquired information (speed information, acceleration information, route information); projects the generated projection image data via the projection device onto the road surface on which the vehicle 100 travels at all times while the vehicle 100 is traveling; when the vehicle 100 extracts oncoming vehicle information, transforms the waveform of the waves represented in the generated projection image data in accordance with the extracted oncoming vehicle information and outputs the result; and, when the vehicle 100 extracts oncoming vehicle information, projects the projection image data indicating the waves of which waveform has been transformed via the projection device onto the road surface on which the vehicle 100 is traveling.
  • with the projection control system of the fourth embodiment, it is possible to transmit, to a driver of an oncoming vehicle who shares the environment with the vehicle 100, the fact that the vehicle 100 has recognized that oncoming vehicle. That is, according to the projection control system of the fourth embodiment, the transmitting performance of the vehicle can be further improved.
  • however, the CAN information to be reflected in the display mode of the image data indicating the propagation of the waves is not limited thereto.
  • for example, information indicating the state of a road surface (road surface information) may be extracted and reflected in the display mode of the image data indicating the propagation of the waves.
  • a fifth embodiment is described mainly about differences from the fourth embodiment.
  • the image generation unit 3020 is an example of a generation unit.
  • the image generation unit 3020 generates projection image data based on the speed information and acceleration information extracted by the speed/acceleration information extraction unit 620 and the route information extracted by the route information extraction unit 1710. Further, based on the road surface information extracted by the road surface information extraction unit 3010, the image generation unit 3020 transforms the waveform of the waves represented in the generated projection image data. Furthermore, the image generation unit 3020 outputs projection image data indicating the waves of which waveform has been transformed.
  • the position/degree of roughness calculation unit 3110 calculates the position for transforming the waveform of the waves and the degree of roughness.
  • FIG. 31B is a diagram illustrating a specific example of the projection image data 3130 output by the image generation unit 3020.
  • a hatched area 3131 indicates the position for transforming the waveform of the waves and the degree of roughness on the projection image data 3130, and is calculated by the position/degree of roughness calculation unit 3110.
  • the waveform of the waves is transformed based on the calculated position and the degree of roughness.
  • the propagating waves can express, in the projection image data 3130, a state in which the waveform is transformed at the position of the hatched area (as if the shape is transformed due to the road roughness).
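The roughness transform applied around the hatched area 3131 might be sketched as a bounded jitter of the wave points inside the rough patch. The jitter model, its parameters, and the deterministic seed are illustrative assumptions:

```python
import random


def transform_waveform(points, rough_centre, rough_radius, roughness,
                       seed=0):
    """Jitter the wave points that fall inside the rough patch; the
    jitter amplitude follows the calculated degree of roughness.
    Points outside the patch are left untouched."""
    rng = random.Random(seed)     # deterministic, so frames are repeatable
    out = []
    for (x, y) in points:
        dx, dy = x - rough_centre[0], y - rough_centre[1]
        if (dx * dx + dy * dy) ** 0.5 < rough_radius:
            y = y + rng.uniform(-roughness, roughness)
        out.append((x, y))
    return out
```

A following driver then sees smooth wavefronts everywhere except over the rough patch, where the deformed waveform flags the road condition.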
  • FIG. 32 is a sixteenth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system.
  • the example of FIG. 32 illustrates a scene in which roughness exists on the road surface on the right front side of the vehicle 100 that is travelling straight.
  • the position/degree of roughness calculation unit 3110 calculates, based on the extracted road surface information, the position in which the waveform is to be transformed on the projection image data and the degree of roughness when the waveform is to be transformed. Then, as illustrated in FIG. 32, an image in which waves of which waveform has been transformed in a position in which the road roughness exists are propagating is projected onto the road surface.
  • the projection control system according to the fifth embodiment acquires information (speed information, acceleration information, route information, road surface information) recognized by the vehicle 100; generates image data (projection image data) indicating propagation of waves in a display mode in accordance with the acquired information (speed information, acceleration information, route information); projects the generated projection image data via the projection device onto the road surface on which the vehicle 100 travels at all times while the vehicle 100 is traveling; when the vehicle 100 extracts road surface information, transforms the waveform of the waves represented in the generated projection image data in accordance with the extracted road surface information and outputs the result; and, when the vehicle 100 extracts road surface information, projects the projection image data indicating the waves of which waveform has been transformed via the projection device onto the road surface on which the vehicle 100 is traveling.
  • with the projection control system of the fifth embodiment, it is possible to transmit road surface information to the driver of a following vehicle who shares the environment with the vehicle 100. That is, according to the projection control system of the fifth embodiment, the transmitting performance of the vehicle can be further improved.
  • projection image data may be generated based on a predetermined wavelength, a predetermined moving speed, and a predetermined moving direction, not on the speed information, the acceleration information, and the route information, and the waveform may be transformed based on person information, oncoming vehicle information, or road surface information.
  • the color of the line (or band) indicating the peak positions of the height of the waves is not particularly mentioned.
  • the color of the lines (or bands) indicating the peak positions of the height of the waves may be determined arbitrarily. Further, the number of colors used for the lines (or bands) may be one or more.
  • the projection image data may be configured to change depending on time of day (for example, daytime and nighttime).
  • a method for changing depending on time of day may be determined arbitrarily; for example, the figure and ground may be inverted between daytime and nighttime. Further, for example, the contrast of the figure and ground may be changed between daytime and nighttime.
  • the image generation unit selects a display mode that is easier to view depending on the time of day and generates the projection image data accordingly. Furthermore, the image generation unit may be configured to adjust the light amount depending on the time of day in addition to selecting the display mode.
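One possible (assumed) policy for selecting the display mode by time of day, combining the figure-ground inversion mentioned above with a light-amount adjustment, is sketched below; the thresholds and values are illustrative, since the embodiment leaves the method arbitrary:

```python
def display_mode(hour, daytime=(7, 18)):
    """Pick a display mode that is easier to view for the given hour.
    The figure/ground assignment and light amounts are assumed example
    values, not values from the disclosure."""
    is_day = daytime[0] <= hour < daytime[1]
    return {
        "figure": "dark" if is_day else "light",    # inverted at night
        "ground": "light" if is_day else "dark",
        "light_amount": 0.4 if is_day else 1.0,     # brighter at night
    }
```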
  • the timings to activate and end the projection control system are not particularly mentioned.
  • the projection control system may be configured to be activated and ended at arbitrary timings under instructions of the driver.
  • the light source of the projection unit may emit light all the time to project projection image data on the road surface.
  • the light source of the projection unit and projection of the projection image data on the road surface do not necessarily have to be synchronized with the activation of the projection control system.
  • the light source of the projection unit may be turned on at an arbitrary timing after activation of the projection control system; at least while the light source is emitting light and the vehicle 100 is moving, the projection image data may be projected on the road surface at all times.
  • projection image data can be projected on a road surface all the time at least while the vehicle 100 is moving.
  • the projection image data is projected onto the road surface.
  • objects on which the projection image data is projected may be arbitrarily determined, and the projection image data may be projected onto a surface other than the road surface.
  • the projection control system is provided as a component separated from other systems and devices (an information output system that generates and outputs information indicating the state of the vehicle 100, such as an advanced driver assistance system, an automatic parking system, and a navigation device).
  • the projection control system may be configured integrally with other systems and devices, and configured to function as a part of a moving object control system.
  • the projection control system is mounted on the vehicle 100
  • the projection control system may be mounted on a moving object other than the vehicle 100.
  • Control device
    610: Information acquisition unit
    620: Speed/acceleration information extraction unit
    630: Image generation unit
    640: Left/right dividing unit
    710: Wavelength determining unit
    720: Wave moving speed determining unit
    730: Update unit
    1700: Control device
    1710: Route information extraction unit
    1720: Image generation unit
    1810: Wave moving direction determining unit
    1820: Update unit
    2400: Control device
    2410: Person information extraction unit
    2420: Image generation unit
    2510: Vector calculation unit
    2520: Update unit
    2700: Control device
    2710: Oncoming vehicle information extraction unit
    2720: Image generation unit
    2810: Position/relative speed calculation unit
    2820: Update unit
    3000: Control device
    3010: Road surface information extraction unit
    3020: Image generation unit
    3110: Position/degree of roughness calculation unit
    3120: Update unit


Abstract

A control device includes an acquisition unit (610) configured to acquire information recognized by a moving object; a generation unit (630) configured to generate an image representing propagation of a wave in a display mode in accordance with the acquired information; and an output unit configured to output the image to a projection unit.

Description

CONTROL DEVICE, PROJECTION CONTROL SYSTEM, MOVING OBJECT CONTROL SYSTEM, MOVING OBJECT, CONTROL METHOD, AND CONTROL PROGRAM
The present invention relates to a control device, a projection control system, a moving object control system, a moving object, a control method, and a control program.
In order to create a safe and secure society in which people and moving objects (for example, vehicles) share the same environment (space), it is important for people and moving objects to understand each other more deeply through communication.
The communication here includes not only verbal transmission of information, but also transmission of, for example, a pedestrian’s intention to cross a road to a vehicle by using a predetermined gesture when the pedestrian is about to cross the road. The communication also includes a case where the vehicle detects the pedestrian's gesture and recognizes the intention to cross the road, and transmits a recognition result to the pedestrian.
By deepening mutual understanding through such communication, it is considered possible to create a safe and secure society in which people and moving objects share the same environment.
Japanese Unexamined Patent Application Publication No. 2016-37260
However, current moving objects do not have sufficient transmitting performance to transmit recognition results (information). For this reason, in various communication scenes, a sufficient amount of information cannot be transmitted from the moving objects.
The present invention has been made in view of the above problems, and has an object to improve transmitting performance of a moving object.
A control device according to each of embodiments of the present invention includes: an acquisition unit configured to acquire information recognized by a moving object; a generation unit configured to generate an image representing propagation of a wave in a display mode in accordance with the acquired information; and an output unit configured to output the image to a projection unit.
According to each of the embodiments of the present invention, transmitting performance of a moving object can be improved.
The accompanying drawings are intended to depict example embodiments of the present invention and should not be interpreted to limit the scope thereof. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted. Also, identical or similar reference numerals designate identical or similar components throughout the several views.
FIGs. 1A and 1B (FIG. 1) are illustrations of an example of arrangement of each device included in a projection control system in a moving object.
FIG. 2 is a first diagram illustrating an example of a hardware configuration of a projection device.
FIG. 3 is a second diagram illustrating an example of a hardware configuration of the projection device.
FIG. 4 is a third diagram illustrating an example of a hardware configuration of the projection device.
FIG. 5 is a diagram illustrating an example of a hardware configuration of a control device.
FIG. 6 is a first diagram illustrating an example of a functional configuration of the control device.
FIG. 7 is a first diagram illustrating a method for generating projection image data by an image generation unit.
FIG. 8 is a diagram illustrating an example of projection image data generated at each time by the image generation unit.
FIG. 9 is a first diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
FIG. 10 is a second diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
FIG. 11 is a third diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
FIG. 12 is a fourth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
FIG. 13 is a fifth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
FIG. 14 is a sixth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
FIG. 15 is a seventh diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
FIG. 16 is an eighth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
FIG. 17 is a second diagram illustrating an example of a functional configuration of the control device.
FIGs. 18A and 18B (FIG. 18) are illustrations of a method for generating projection image data by the image generation unit.
FIG. 19 is a ninth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
FIG. 20 is a tenth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
FIG. 21 is an eleventh diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
FIG. 22 is a twelfth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
FIG. 23 is a thirteenth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
FIG. 24 is a third diagram illustrating an example of a functional configuration of the control device.
FIGs. 25A and 25B (FIG. 25) are illustrations of a method for generating projection image data by the image generation unit.
FIG. 26 is a fourteenth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
FIG. 27 is a fourth diagram illustrating an example of a functional configuration of the control device.
FIGs. 28A and 28B (FIG. 28) are illustrations of a method for generating projection image data by the image generation unit.
FIG. 29 is a fifteenth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
FIG. 30 is a fifth diagram illustrating an example of a functional configuration of the control device.
FIGs. 31A and 31B are illustrations of a method for generating projection image data by the image generation unit.
FIG. 32 is a sixteenth diagram illustrating an example of projection of an image to be projected onto a road surface by the projection control system.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
In describing embodiments illustrated in the drawings, specific terminology is employed for the sake of clarity. However, the disclosure of this specification is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that have a similar function, operate in a similar manner, and achieve a similar result.
Hereinafter, embodiments of the present invention are described with reference to the attached drawings. In the specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals and redundant description is omitted.
First Embodiment
1. Example of Arrangement of Projection Control System
First, an example of arrangement of each device included in a projection control system according to a first embodiment mounted on a moving object is described. FIG. 1 is a diagram illustrating an example of arrangement of each device included in a projection control system in a moving object. In the first embodiment, the description assumes that the projection control system is mounted on a vehicle 100, as illustrated in FIG. 1.
The projection control system includes a projection unit (a projection device 110 and a projection device 120) and a control device 130. As illustrated in FIG. 1A, the projection device 110 is disposed, for example, at the position of a left headlight of the vehicle 100, and projects a predetermined image to the front of the vehicle 100. Further, the projection device 120 is disposed, for example, at the position of a right headlight of the vehicle 100, and projects a predetermined image to the front of the vehicle 100.
Furthermore, as illustrated in FIG. 1B, the control device 130 is disposed, for example, in a dashboard of the vehicle 100. The control device 130 generates projection image data (left-side projection image data and right-side projection image data) to be projected by the projection devices 110 and 120 in a predetermined image update cycle, and transmits the generated projection image data to the projection devices 110 and 120.
The number of projection devices included in the projection control system is not limited to two, but may be one. In this case, one projection device is disposed, for example, at a front center position of the vehicle 100, and projects a predetermined image to the front of the vehicle 100. Also, the number of projection devices included in the projection control system may be three or more. In this case, the projection devices are disposed, for example, on both side surfaces and a rear surface (for example, the position of tail lamps) of the vehicle 100 in addition to the positions of the headlights of the vehicle 100, and project predetermined images to the front, the both sides, and the rear of the vehicle 100.
2. Hardware Configuration of Projection Device
Next, an example of a hardware configuration of the projection device is described (here, three hardware configuration examples are described).
(1) Projection Device Hardware Configuration Example 1
FIG. 2 is a first diagram illustrating an example of a hardware configuration of the projection device. Note that, since the projection devices 110 and 120 have the same hardware configuration, only the projection device 110 is illustrated in the example of FIG. 2.
As illustrated in FIG. 2, the projection device 110 includes a light source 201, a collimator lens 202, microelectromechanical systems (MEMS) 203, a wavelength conversion element 205, and a projection lens 206.
The light source 201 emits, for example, light having a blue wavelength band in order to draw predetermined projection image data (here, left-side projection image data) generated by the control device 130. The collimator lens 202 condenses the light flux emitted from the light source 201 on the MEMS 203.
The MEMS 203 includes a reflection mirror driven by a mechanism that can tilt it about two axes, namely, the vertical and horizontal directions, based on a control signal from the control device 130. When reflecting the light condensed by the collimator lens 202, the MEMS 203 deflects the reflected light within the range represented by a scanning width 204 to two-dimensionally scan the wavelength conversion element 205.
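As a rough illustration of such two-axis scanning, the sketch below computes mirror tilt angles from two sinusoidal drive signals, a fast horizontal axis and a slow vertical axis. The frequencies and amplitudes are illustrative assumptions, not values taken from this disclosure.

```python
import math

def mems_tilt_angles(t, h_freq=20_000.0, v_freq=60.0,
                     h_amp_deg=10.0, v_amp_deg=5.0):
    """Horizontal and vertical mirror tilt angles (degrees) at time t
    (seconds), tracing a two-dimensional scan over the target surface.
    All drive parameters are hypothetical values for illustration."""
    h = h_amp_deg * math.sin(2.0 * math.pi * h_freq * t)
    v = v_amp_deg * math.sin(2.0 * math.pi * v_freq * t)
    return h, v
```

Sweeping t over one vertical period traces the full two-dimensional scanning range corresponding to the scanning width 204.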
The wavelength conversion element 205 is a reflective fluorescent material on which predetermined projection image data is to be drawn. When irradiated, from the front side, with the blue light flux two-dimensionally scanned by the MEMS 203, the wavelength conversion element 205 emits yellow fluorescence (light at least including the green and red wavelength bands).
The projection lens 206 projects, to the front, white light obtained by mixing the light converted by the wavelength conversion element 205 with the light that has not been converted. As a result, in the projection device 110 illustrated in FIG. 2, an image in accordance with the predetermined projection image data generated by the control device 130 can be projected into a space in front of the vehicle 100.
(2) Projection Device Hardware Configuration Example 2
FIG. 3 is a second diagram illustrating an example of a hardware configuration of the projection device. As in FIG. 2, only the projection device 110 is illustrated in the example of FIG. 3. A difference from FIG. 2 is that, since a wavelength conversion element 305 is a transmissive fluorescent material, the wavelength conversion element 305 is irradiated, from the back side, with a blue light flux two-dimensionally scanned by the MEMS 203.
As in FIG. 2, in the configuration of the projection device 110 illustrated in FIG. 3, an image in accordance with the predetermined projection image data generated by the control device 130 can be projected into a space in front of the vehicle 100.
(3) Projection Device Hardware Configuration Example 3
FIG. 4 is a third diagram illustrating an example of a hardware configuration of the projection device. Differences from FIGS. 2 and 3 are that the projection device 110 of FIG. 4 includes a light source 401 instead of the light source 201, and includes a microdisplay 405 instead of the wavelength conversion elements 205 and 305.
The light source 401 is a white LED that emits white light based on a control signal from the control device 130. The microdisplay 405 is irradiated with the light emitted from the light source 401 via the collimator lens 202.
The microdisplay 405 is, for example, a digital micromirror device (DMD (registered trademark)). The microdisplay 405 displays predetermined projection image data (here, left-side projection image data) generated by the control device 130, and controls turning on and off of image light for each pixel in accordance with the displayed predetermined projection image data. The light emitted onto the microdisplay 405 from the light source 401 is reflected toward the projection lens 206 when the image light is on. As a result, in the projection device 110 illustrated in FIG. 4, an image in accordance with the predetermined projection image data generated by the control device 130 can be projected into a space in front of the vehicle 100.
Further, when the image light is off, the light emitted onto the microdisplay 405 by the light source 401 is directed in a direction different from that of the projection lens 206 (see a dotted arrow 406). In this case, a black image is projected into the space in front of the vehicle 100.
The microdisplay 405 is not limited to the DMD (registered trademark), and may instead be a reflective liquid crystal panel or a transmissive liquid crystal panel.
3. Hardware Configuration of Control Device
Next, a hardware configuration of the control device 130 is described. FIG. 5 is a diagram illustrating an example of a hardware configuration of a control device.
As illustrated in FIG. 5, the control device 130 includes a central processing unit (CPU) 501, random access memory (RAM) 502, a storage unit 503, and an input/output unit 504. The units of the control device 130 are mutually connected via a bus 505.
The CPU 501 is a computer that executes a program (for example, a control program described later) stored in the storage unit 503. The RAM 502 is a main storage device, such as dynamic random access memory (DRAM) or static random access memory (SRAM). The RAM 502 functions as a work area into which the program stored in the storage unit 503 is loaded when the program is to be executed by the CPU 501.
The storage unit 503 is a non-volatile memory, such as an EPROM or an EEPROM, and stores a program to be executed by the CPU 501. The input/output unit 504 is an interface device for communicating with the projection devices 110 and 120 or a controller area network (CAN) (not illustrated).
4. Functional Configuration of Control Device
(1) Entire Configuration
Next, a functional configuration of the entire control device 130 is described. FIG. 6 is a first diagram illustrating an example of a functional configuration of the control device.
As described above, the control program is installed in the control device 130, and when the control program is executed, the control device 130 functions as an information acquisition unit 610, a speed/acceleration information extraction unit 620, an image generation unit 630, and a left/right dividing unit 640.
The information acquisition unit 610 acquires CAN information from the CAN. The CAN information includes various types of information recognized by the vehicle 100. The speed/acceleration information extraction unit 620 is an example of an acquisition unit, and extracts speed information and acceleration information of the vehicle 100 included in the CAN information acquired by the information acquisition unit 610.
The image generation unit 630 is an example of a generation unit, and generates projection image data based on the speed information and the acceleration information extracted by the speed/acceleration information extraction unit 620.
The left/right dividing unit 640 is an example of an output unit. The left/right dividing unit 640 divides the projection image data generated by the image generation unit 630 into data for the projection device 110 and data for the projection device 120, and outputs left-side projection image data and right-side projection image data to the projection unit (the projection device 110 and the projection device 120).
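A minimal sketch of such a division is shown below, assuming the projection image is a two-dimensional array split vertically at its center. The patent does not specify the division rule; a real system would likely account for the geometry and overlap of the two projectors.

```python
def split_left_right(image):
    """Divide one projection image (a list of pixel rows) into
    left-side and right-side projection image data, one half per
    projection device. A simple center split is assumed."""
    mid = len(image[0]) // 2
    left = [row[:mid] for row in image]    # for projection device 110
    right = [row[mid:] for row in image]   # for projection device 120
    return left, right
```

For example, a 2x4 image splits into two 2x2 halves, one sent to each projection device every image update cycle.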
(2) Method for Generating Projection Image Data by Image Generation Unit
Next, a method for generating projection image data by the image generation unit 630 is described. FIG. 7 is a first diagram illustrating a method for generating projection image data by an image generation unit. As illustrated in FIG. 7, the image generation unit 630 includes a wavelength determining unit 710, a wave moving speed determining unit 720, and an update unit 730.
The image generation unit 630 generates “image data indicating wave propagation” as projection image data. Specifically, the wavelength determining unit 710 determines a wavelength based on the acceleration information extracted by the speed/acceleration information extraction unit 620. Further, the wave moving speed determining unit 720 determines a moving speed of the wave based on the speed information extracted by the speed/acceleration information extraction unit 620.
The update unit 730 generates projection image data based on the wavelength determined by the wavelength determining unit 710 and the moving speed of the wave determined by the wave moving speed determining unit 720. The update unit 730 generates projection image data at a predetermined image update cycle.
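The mapping performed by the wavelength determining unit 710 and the wave moving speed determining unit 720 can be sketched as follows. The patent specifies only the qualitative relationships (the wavelength shortens under deceleration and lengthens under acceleration relative to a default length d, and the wave's moving speed is proportional to the vehicle speed), so the linear formulas and constants below are illustrative assumptions.

```python
DEFAULT_WAVELENGTH = 1.0   # default length d (arbitrary units; assumed value)
MIN_WAVELENGTH = 0.1       # lower bound keeping the wavelength positive (assumed)
ACCEL_GAIN = 0.2           # wavelength change per unit acceleration (assumed)
SPEED_GAIN = 1.0           # wave speed per unit vehicle speed (assumed)

def determine_wave_params(speed, acceleration):
    """Map vehicle speed/acceleration to the display parameters of the
    wave image: negative acceleration (deceleration) shortens the
    wavelength below the default d, positive acceleration lengthens it,
    and the wave's moving speed is proportional to the vehicle speed."""
    wavelength = max(MIN_WAVELENGTH,
                     DEFAULT_WAVELENGTH + ACCEL_GAIN * acceleration)
    wave_speed = SPEED_GAIN * speed
    return wavelength, wave_speed
```

The update unit would then call such a function once per image update cycle to regenerate the projection image data with the current parameters.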
(3) Specific Example of Projection Image Data
Next, a specific example of projection image data generated at each time by the image generation unit 630 is described. FIG. 8 is a diagram illustrating an example of projection image data generated at each time by the image generation unit. Note that, in the example of FIG. 8, to express the propagation of a wave, the peak positions of the height of the waves (transverse waves) are represented by broken lines.
In FIG. 8, projection image data 701 schematically indicates projection image data generated by the image generation unit 630 at time T1. If the acceleration information of the vehicle 100 is "0" at time T1, the wavelength has a default length d.
It is assumed that, if the acceleration information is negative (deceleration), the wavelength is shorter than the default length d, and if the acceleration information is positive (acceleration), the wavelength is longer than the default length d. That is, in the propagation of the waves indicated by the projection image data, the wavelength depends on the acceleration information of the vehicle 100.
Further, projection image data 702 schematically indicates projection image data generated by the image generation unit 630 at time T2. Also at time T2, if the acceleration information of the vehicle 100 remains "0," the wavelength in the projection image data 702 remains d, as illustrated in FIG. 8.
Further, if time T2 is α (α is an integer) times the predetermined image update cycle of the projection image data, the length L by which the wave moves during the interval (T2 - T1) is proportional to the speed information ("v1") of the vehicle 100. That is, in the propagation of the waves indicated by the projection image data, the moving speed of the wave depends on the speed information of the vehicle 100.
Similarly, if the acceleration information remains "0" and the speed information remains "v1" at times T3 and T4, the projection image data 703 and 704 also express waves of wavelength d traveling by the length L during each interval (T3 - T2, T4 - T3).
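The frame-by-frame behavior described for times T1 through T4 can be modeled by advancing every peak position by the product of the wave's moving speed and the elapsed time. The function below is an illustrative sketch (peak positions measured as distances ahead of the vehicle, in arbitrary units), not the patent's actual implementation.

```python
def peak_positions(t, wavelength, wave_speed, n_peaks=4):
    """Positions of the wave peaks at time t. Between two frames
    separated by dt, each peak moves L = wave_speed * dt, while
    adjacent peaks stay exactly one wavelength apart."""
    offset = (wave_speed * t) % wavelength
    return [offset + k * wavelength for k in range(n_peaks)]
```

With constant speed and zero acceleration, successive frames show the same wavelength d with every peak shifted by the same length L, which is exactly the appearance described for the projection image data 701 to 704.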
5. Image Projection Example
Next, an image projection example in which projection image data generated by the projection control system is projected onto a road surface (moving route) is described. Note that, in FIG. 8, the propagation of a wave is expressed by illustrating the peak positions of the height of the waves with lines. However, the method for expressing wave propagation in the image data is not limited to this. Therefore, the image projection examples below illustrate various methods for expressing the propagation of a wave.
(1) When peak positions of height of waves are represented by lines (or bands) (acceleration information = 0)
FIG. 9 is a first diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system. The example of FIG. 9 illustrates a case in which the peak positions of the height of the waves are represented by lines (or bands) in image data indicating propagation of the waves (transverse waves). The example of FIG. 9 illustrates a case in which the acceleration information is zero.
Thus, by representing the peak positions of the height of the waves by lines (or bands), a person who shares the environment with the vehicle 100 (for example, a pedestrian or a driver of another vehicle) can recognize the speed of the vehicle 100 from the moving speed of the lines (or bands). Further, since the distance between the lines (or bands) is constant, a person who shares the environment with the vehicle 100 can recognize that the vehicle 100 is traveling at a constant speed.
(2) When peak positions of height of waves are represented by lines (or bands) (acceleration information = minus)
FIG. 10 is a second diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system. The example of FIG. 10 illustrates a case in which the peak position of the wave height is represented by a line (or a band) in the image data indicating the propagation of the waves (transverse waves). Note that the example in FIG. 10 illustrates a case in which the traffic signal in the traveling direction (moving direction) turns red at an intersection and the vehicle 100 decelerates (the acceleration information becomes negative).
As illustrated in FIG. 10, as the vehicle 100 decelerates, the wavelength becomes shorter. As a result, the driver of another vehicle waiting for a right turn in the opposite lane can recognize that the driver of the vehicle 100 has an intention to stop.
Although the example in FIG. 10 illustrates a scene in which the vehicle 100 decelerates at an intersection, the same effect can be obtained in a case in which, for example, a pedestrian is about to cross a pedestrian crossing, the driver of the vehicle 100 recognizes this, and the vehicle 100 decelerates before the pedestrian crossing. Specifically, when the wavelength becomes shorter, the pedestrian who is to cross the pedestrian crossing can recognize that the driver of the vehicle 100 has an intention to stop.
(3) When peak positions of height of waves are represented by sets of dots (acceleration information = 0)
FIG. 11 is a third diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system. The example of FIG. 11 illustrates a case in which the peak positions of the height of the waves are represented by sets of dots in the image data indicating the propagation of the waves (transverse waves). The example of FIG. 11 illustrates a case in which the acceleration information is zero.
Thus, by representing the peak positions of the height of the waves by sets of dots, a person who shares the environment with the vehicle 100 (for example, a pedestrian or a driver of another vehicle) can recognize the speed of the vehicle 100 from the moving speed of the sets of dots. Further, since the distance between the sets of dots is constant, a person who shares the environment with the vehicle 100 can recognize that the vehicle 100 is traveling at a constant speed.
  (4) When density of waves is represented by density of dots (acceleration information = 0)
FIG. 12 is a fourth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system. The example of FIG. 12 illustrates a case in which the density of the waves is represented by the density of dots in image data indicating the propagation of the waves (longitudinal waves). The example of FIG. 12 illustrates a case in which the acceleration information is zero.
In this manner, by representing the density of the waves by the density of dots, a person who shares the environment with the vehicle 100 (for example, a pedestrian or a driver of another vehicle) can recognize the speed of the vehicle 100 from the speed at which the dense positions of the dots move. Further, since the distance between the dense positions of the dots is constant, a person who shares the environment with the vehicle 100 can recognize that the vehicle 100 is traveling at a constant speed.
(5) When peak positions of height of waves are represented by line (or band) (with longitudinal lines, acceleration information = 0)
FIG. 13 is a fifth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system. The example of FIG. 13 illustrates a case in which the peak positions of the height of the waves are represented by transverse lines (or transverse bands) in the image data indicating the propagation of the waves (transverse waves), together with longitudinal lines (or longitudinal bands) extending radially to the front. The example of FIG. 13 illustrates a case in which the acceleration information is zero.
Thus, by representing the peak positions of the height of the waves by transverse lines (or transverse bands), a person who shares the environment with the vehicle 100 (for example, a pedestrian or a driver of another vehicle) can recognize the speed of the vehicle 100 from the moving speed of the transverse lines (or transverse bands). Further, since the distance between the transverse lines (or transverse bands) is constant, a person who shares the environment with the vehicle 100 can recognize that the vehicle 100 is traveling at a constant speed.
In addition, according to the example of FIG. 13, since longitudinal lines extending radially to the front are illustrated, it is easier for a person who shares the environment with the vehicle 100 to recognize the moving speed of the transverse lines and the distance between the transverse lines.
(6) When peak position of height of wave is represented by line (or band) (illustrated by single line or band, acceleration information = 0)
FIG. 14 is a sixth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system. The example of FIG. 14 illustrates a case in which the peak position of the height of the wave is represented by a line (or a band) in image data indicating propagation of the wave (transverse wave).
The difference from FIG. 9 is that the wavelength and the moving speed of the wave are greater than those of FIG. 9, and the number of lines (or bands) projected onto the road surface at the same time is reduced (the example of FIG. 14 illustrates a case in which only one line (or band) is projected at a time).
In this manner, by representing the peak position of the height of the wave by a line (or a band), a person who shares the environment with the vehicle 100 (for example, a pedestrian or a driver of another vehicle) can recognize the speed of the vehicle 100 from the moving speed of the line (or the band). Further, since the interval at which the lines (or bands) appear is constant, a person who shares the environment with the vehicle 100 can recognize that the vehicle 100 is traveling at a constant speed.
(7) When peak position of height of wave in direction different from travel direction is represented by line (or band) (illustrated by single line or band, acceleration information = 0)
FIG. 15 is a seventh diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system. The example of FIG. 15 illustrates a case in which the peak position of a wave (transverse wave) moving in a direction crossing the moving direction (travel direction) of the vehicle 100, namely, a circumferential direction about a front position of the vehicle 100, is indicated by a line (or a band). As in FIG. 14, the wavelength and the moving speed of the wave are increased and the number of lines (or bands) projected onto the road surface at the same time is reduced (the example of FIG. 15 illustrates a case in which a single line (or band) is projected at a time).
In this manner, by representing the peak position of the height of the wave by a line (or a band), a person who shares the environment with the vehicle 100 (for example, a pedestrian or a driver of another vehicle) can recognize the speed of the vehicle 100 from the moving speed of the line (or the band). Further, since the interval at which the lines (or bands) appear is constant, a person who shares the environment with the vehicle 100 can recognize that the vehicle 100 is traveling at a constant speed.
(8) When peak positions of height of waves are represented by lines (or bands) (entire circumference, acceleration information = 0)
FIG. 16 is an eighth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system. The example of FIG. 16 illustrates a case in which the peak positions of the height of the waves are represented by lines (or bands) in an image data indicating propagation of the waves (transverse waves). A difference from FIG. 9 is that the lines (or the bands) indicating the peak positions of the height of the waves are projected on the road surface in the entire circumferential direction of the vehicle 100.
In this manner, by representing the peak positions of the height of the waves by lines (or bands) in the entire circumferential direction, a person who shares the environment with the vehicle 100 (for example, a pedestrian or a driver of another vehicle) can recognize the speed of the vehicle 100 from any direction based on the moving speed of the lines (or the bands). Further, since the distance between the lines (or bands) is constant, a person who shares the environment with the vehicle 100 can recognize from any direction that the vehicle 100 is traveling at a constant speed.
Conclusion
As apparent from the description above, the projection control system according to the first embodiment
- acquires information (speed information, acceleration information) recognized by the vehicle 100;
- generates image data (projection image data) indicating propagation of a wave in a display mode in accordance with the acquired information (speed information, acceleration information); and
- projects the generated projection image data via the projection device onto the road surface on which the vehicle 100 travels while the vehicle 100 is traveling.
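The three steps above can be sketched as follows. This is an illustrative model only: the specification states that the wavelength determining unit 710 and the wave moving speed determining unit 720 set these values in accordance with the acquired information, but the linear mappings below are assumptions.

```python
import math

class WaveImageGenerator:
    """Sketch of the acquire -> generate -> project pipeline.

    The mappings from vehicle speed/acceleration to wavelength and
    wave moving speed are hypothetical placeholders.
    """

    def __init__(self, base_wavelength=1.0, base_wave_speed=1.0):
        self.base_wavelength = base_wavelength
        self.base_wave_speed = base_wave_speed

    def determine_wavelength(self, speed_kmh):
        # Assumption: a faster vehicle yields a longer wavelength.
        return self.base_wavelength * (1.0 + speed_kmh / 30.0)

    def determine_wave_speed(self, accel):
        # Assumption: acceleration speeds the wave up, braking slows it.
        return self.base_wave_speed * (1.0 + accel)

    def amplitude(self, r, t, speed_kmh, accel):
        """Height of the transverse wave at radial distance r, time t."""
        lam = self.determine_wavelength(speed_kmh)
        v = self.determine_wave_speed(accel)
        return math.sin(2.0 * math.pi * (r - v * t) / lam)

gen = WaveImageGenerator()
# At t = 0 with speed = 0 and accel = 0, the wave peaks one quarter
# wavelength from the vehicle: sin(pi / 2) = 1.
peak = gen.amplitude(r=0.25, t=0.0, speed_kmh=0.0, accel=0.0)
```

Rendering one frame per time step and handing the frames to the projection device corresponds to the third bullet above.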
In this manner, according to the projection control system of the first embodiment, it is possible to transmit information (speed information, acceleration information) recognized by the vehicle 100 to a person who shares the environment with the vehicle 100. That is, according to the projection control system of the first embodiment, transmitting performance of the vehicle can be improved.
Second Embodiment
In the first embodiment described above, the case in which the speed information and the acceleration information of the vehicle 100 are extracted from the CAN information and reflected on the display mode of the image data indicating the propagation of the waves is described. However, the CAN information to be reflected on the display mode of the image data indicating the propagation of the waves is not limited thereto. For example, route information indicating the travel route (moving route) of the vehicle 100 may be extracted and reflected on the display mode of the image data indicating the propagation of the waves. Hereinafter, the second embodiment is described, focusing mainly on differences from the first embodiment.
1. Functional Configuration of Control Device
(1) Entire Configuration
First, a functional configuration of the entire control device of a projection control system according to the second embodiment is described. FIG. 17 is a second diagram illustrating an example of a functional configuration of the control device. A difference from the functional configuration illustrated in FIG. 6 is that the functional configuration illustrated in FIG. 17 includes a route information extraction unit 1710 and an image generation unit 1720.
The route information extraction unit 1710 is an example of an acquisition unit, and extracts route information, included in the CAN information acquired by the information acquisition unit 610, indicating the travel route of the vehicle 100. Note that, when the driver of the vehicle 100 is driving, the route information includes steering angle information, blinker operation information, navigation information from a navigation device, and the like.
Further, when the vehicle 100 is equipped with, for example, an advanced driver assistance system (ADAS), the route information includes route information indicating a travel route of the vehicle 100 controlled by the ADAS, such as a travel route for avoiding collisions with obstacles.
The image generation unit 1720 is an example of a generation unit, and generates projection image data based on the speed information and acceleration information extracted by the speed/acceleration information extraction unit 620 and the route information extracted by the route information extraction unit 1710.
(2) Method for Generating Projection Image Data by Image Generation Unit
Next, a method for generating projection image data by the image generation unit 1720 is described. FIG. 18 is a second diagram illustrating a method for generating projection image data by an image generation unit. A difference from the method for generating projection image data by the image generation unit 630 illustrated in FIG. 7 is that the image generation unit 1720 in FIG. 18A includes a wave moving direction determining unit 1810 and an update unit 1820.
The wave moving direction determining unit 1810 determines the moving direction of the wave based on the route information extracted by the route information extraction unit 1710.
The update unit 1820 generates projection image data based on the wavelength determined by the wavelength determining unit 710, the moving speed of the wave determined by the wave moving speed determining unit 720, and the moving direction of the wave determined by the wave moving direction determining unit 1810.
FIG. 18B is a diagram illustrating a specific example of projection image data 1830 generated by the image generation unit 1720. As illustrated in FIG. 18B, it is assumed that an arrow 1831 is determined as the moving direction of the wave on the projection image data 1830 based on the route information extracted by the route information extraction unit 1710. In this case, the update unit 1820 generates projection image data so that the wave moves along the arrow 1831.
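The role of the wave moving direction determining unit 1810 can be sketched as a plane wave whose peaks travel along a unit vector chosen from the route information (cf. arrow 1831). The event names and direction vectors below are illustrative assumptions, not values from the specification.

```python
import math

# Hypothetical mapping from a route event to the unit vector along
# which the waves move; the specification derives this direction from
# blinker operation, steering angle, navigation, or ADAS route data.
DIRECTIONS = {
    "right_blinker": (1.0, 0.0),   # waves propagate toward the right lane
    "left_blinker": (-1.0, 0.0),
    "straight": (0.0, 1.0),
}

def wave_height(x, y, t, route_event, wave_speed=1.0, wavelength=1.0):
    """Wave height at road point (x, y): a plane wave whose peaks
    travel along the direction chosen from the route information."""
    dx, dy = DIRECTIONS[route_event]
    s = x * dx + y * dy          # distance along the moving direction
    return math.sin(2.0 * math.pi * (s - wave_speed * t) / wavelength)
```

Animating `t` makes the peaks sweep toward the lane indicated by the route event, which is the behavior projected in the examples below.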
2. Image Projection Example
Next, an image projection example when projection image data generated by the projection control system is projected on a road surface is described. In the following description, it is assumed that the wave propagation is expressed by representing the peak position of the wave height with a line (or a band).
(1) When Changing Lanes
FIG. 19 is a ninth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system. The example of FIG. 19 illustrates a scene in which the vehicle 100 changes lanes.
When the driver of the vehicle 100 operates a blinker lever to make a right blinker of the vehicle 100 blink, the wave moving direction determining unit 1810 acquires route information indicating that the vehicle 100 will change to the right lane, and determines the moving direction of the wave to be rightward. Then, as illustrated in FIG. 19, an image in which waves propagate toward the right lane is projected on the road surface of the right lane of the vehicle 100.
In this manner, by determining the moving direction of the wave based on the route information indicating the travel route, a person who shares the environment with the vehicle 100 (in the example of FIG. 19, a driver of another vehicle traveling on the right lane) can recognize that the vehicle 100 is moving to the position toward which the wave propagates.
(2) When Overtaking
FIG. 20 is a tenth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system. The example of FIG. 20 illustrates a scene in which the vehicle 100 is overtaking another motorcycle.
When the advanced driver assistance system mounted on the vehicle 100 detects another motorcycle and determines to overtake it, the route information extraction unit 1710 extracts route information indicating the travel route of the vehicle 100 on which overtaking is to be performed. Further, the wave moving direction determining unit 1810 determines the moving direction of the wave based on the extracted route information. Then, as illustrated in FIG. 20, an image in which waves propagate along the travel route of the vehicle 100 when the vehicle 100 overtakes another motorcycle is projected on the road surface around the other motorcycle.
In this manner, by determining the moving direction of the wave based on the route information indicating the travel route, a person who shares the environment with the vehicle 100 (in the example of FIG. 20, the rider of the other motorcycle) can recognize that the vehicle 100 is about to overtake its own motorcycle.
(3) When Parking
FIG. 21 is an eleventh diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system. The example of FIG. 21 illustrates a scene in which the vehicle 100 is parking.
If an automatic parking system mounted on the vehicle 100 determines to park at a predetermined position, the route information extraction unit 1710 extracts route information indicating the travel route of the vehicle 100 at the time of parking. Further, the wave moving direction determining unit 1810 determines the moving direction of the wave based on the extracted route information. As a result, as illustrated in FIG. 21, an image in which waves propagate along the travel route of the vehicle 100 at the time of parking is projected on the road surface.
In this manner, by determining the moving direction of the waves based on the route information indicating the travel route, a person who shares the environment with the vehicle 100 (in the example of FIG. 21, a pedestrian near the parking space, and the like) can recognize that the vehicle 100 is about to park.
(4) When Turning Left
FIG. 22 is a twelfth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system. The example of FIG. 22 illustrates a scene in which the vehicle 100 turns left on a road with one lane on one side.
When the driver of the vehicle 100 operates a blinker lever to make a left blinker of the vehicle 100 blink, the wave moving direction determining unit 1810 acquires route information indicating that the vehicle 100 will turn left, and determines the moving direction of the waves to be leftward. Then, as illustrated in FIG. 22, an image in which waves propagate toward the lane after the left turn is projected on the road surface of that lane.
In this manner, by determining the moving direction of the wave based on the route information indicating the travel route, a person who shares the environment with the vehicle 100 (in the example of FIG. 22, a driver of another vehicle traveling on the lane after the left turn) can recognize that the vehicle 100 is moving to the position toward which the wave propagates.
The blinkers of the vehicle 100 cannot be seen by the driver of another vehicle traveling in the lane after the left turn. For this reason, when no image is projected on the road surface by the projection control system, the driver of the other vehicle cannot recognize whether the vehicle 100 will turn left. On the other hand, when the projection control system projects the image on the road surface of the lane after the left turn, drivers of other vehicles traveling in that lane can easily recognize that the vehicle 100 is about to turn left.
(5) When Joining
FIG. 23 is a thirteenth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system. The example of FIG. 23 illustrates a scene in which the vehicle 100 joins at a junction.
It is assumed that, when the vehicle 100 is traveling in accordance with the navigation information by the navigation device mounted on the vehicle 100, the driver of the vehicle 100 operates the blinker lever to make the right blinker of the vehicle 100 blink. In this case, the route information extraction unit 1710 extracts navigation information as route information indicating the travel route of the vehicle 100. Further, the wave moving direction determining unit 1810 determines the moving direction of the wave based on the extracted navigation information. Then, as illustrated in FIG. 23, an image in which waves propagate toward the lane to which the vehicle 100 is about to join at the junction is projected on the road surface of the lane to which the vehicle 100 is about to join.
In this manner, by determining the moving direction of the wave based on the route information indicating the travel route, a person who shares the environment with the vehicle 100 (in the example of FIG. 23, a driver of another vehicle traveling on the lane to which the vehicle 100 is about to join) can recognize that the vehicle 100 is moving to the position toward which the wave propagates.
Conclusion
As apparent from the description above, the projection control system according to the second embodiment
- acquires information (speed information, acceleration information, route information) recognized by the vehicle 100;
- generates image data (projection image data) indicating propagation of waves in a display mode in accordance with the acquired information (speed information, acceleration information, route information); and
- projects the generated projection image data via the projection device onto the road surface on which the vehicle 100 travels while the vehicle 100 is traveling.
In this manner, according to the projection control system of the second embodiment, information (speed information, acceleration information, route information) recognized by the vehicle 100 can be transmitted to a person who shares the environment with the vehicle 100. That is, according to the projection control system of the second embodiment, transmitting performance of the vehicle can further be improved.
Third Embodiment
In the second embodiment described above, the case in which the route information of the vehicle 100 is further extracted from the CAN information and reflected on the display mode of the image data indicating the propagation of the waves is described. However, the CAN information to be reflected on the display mode of the image data indicating the propagation of the waves is not limited thereto. For example, person information indicating a detection result of an obstacle (person) may be extracted and reflected on the display mode of the image data indicating the propagation of the waves. Hereinafter, a third embodiment is described, focusing mainly on differences from the second embodiment.
1. Functional Configuration of Control Device
(1) Entire Configuration
First, a functional configuration of the entire control device of a projection control system according to the third embodiment is described. FIG. 24 is a third diagram illustrating an example of a functional configuration of the control device. A difference from the functional configuration illustrated in FIG. 17 is that a control device 2400 in FIG. 24 includes a person information extraction unit 2410 and an image generation unit 2420.
The person information extraction unit 2410 is an example of an acquisition unit, and extracts person information, included in the CAN information acquired by the information acquisition unit 610, indicating a detection result of a person. The person information includes a distance from the vehicle 100 to the detected person and a direction (angle) of the detected person as viewed from the vehicle 100.
The image generation unit 2420 is an example of a generation unit, and generates projection image data based on the speed information and acceleration information extracted by the speed/acceleration information extraction unit 620 and the route information extracted by the route information extraction unit 1710. Further, the image generation unit 2420 transforms the waveform of the waves represented in the generated projection image data (for example, the form of a line or a band representing the peak position of the height of the wave) based on the person information extracted by the person information extraction unit 2410. Furthermore, the image generation unit 2420 outputs projection image data indicating waves whose waveform has been transformed.
(2) Method for Generating Projection Image Data by Image Generation Unit
Next, a method for generating projection image data by the image generation unit 2420 is described. FIG. 25 is a third diagram illustrating a method for generating projection image data by an image generation unit. A difference from the method for generating projection image data by the image generation unit 1720 illustrated in FIG. 18 is that the image generation unit 2420 in FIG. 25A includes a vector calculation unit 2510 and an update unit 2520.
The vector calculation unit 2510 calculates a vector for transforming the waveform of the waves based on the person information extracted by the person information extraction unit 2410.
The update unit 2520 generates projection image data based on the wavelength determined by the wavelength determining unit 710, the moving speed of the wave determined by the wave moving speed determining unit 720, and the moving direction of the wave determined by the wave moving direction determining unit 1810.
Further, based on the vector calculated by the vector calculation unit 2510, the update unit 2520 transforms the waveform of the waves represented in the generated projection image data. Furthermore, the update unit 2520 outputs projection image data indicating waves whose waveform has been transformed.
FIG. 25B is a diagram illustrating a specific example of projection image data 2530 output by the image generation unit 2420. As illustrated in FIG. 25B, the vector calculation unit 2510 specifies the position, on the projection image data 2530, that corresponds to the person information extracted by the person information extraction unit 2410 (see a black circle 2531).
Further, on the projection image data 2530, the vector calculation unit 2510 calculates concentric vectors of different magnitudes about the position of the black circle 2531 (for example, vectors directed outward from the center). The calculated vectors have the same magnitude on the same circle, and become larger closer to the black circle 2531 and smaller farther from it. The concentric circles extend outward over time, and FIG. 25B illustrates their positions at a predetermined time.
The update unit 2520 transforms the waveform of the waves represented in the projection image data 2530 based on the calculated vectors. In this manner, when the propagating waves come into contact with the concentric circles, the update unit 2520 can represent, in the projection image data 2530, a state in which the waves are pressed back in the direction of the vectors on the circles with a force corresponding to those vectors (as if the waves are affected by the person).
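The press-back effect can be sketched as a radial vector field centered on the detected person. The `1 / (1 + d)` falloff is an assumption; the specification states only that the vectors point outward, have equal magnitude on each circle, and grow larger closer to the black circle 2531.

```python
import math

def person_push_vector(px, py, qx, qy, strength=1.0, falloff=1.0):
    """Outward vector at wave point (qx, qy) for a person detected at
    (px, py): directed away from the person, with a magnitude that is
    equal on each concentric circle and decays with distance."""
    dx, dy = qx - px, qy - py
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return (0.0, 0.0)
    magnitude = strength / (1.0 + falloff * dist)  # hypothetical falloff
    return (magnitude * dx / dist, magnitude * dy / dist)

def press_back(point, person):
    """Displace a sampled point of a peak line along the push vector,
    so the line looks pressed back around the detected person."""
    vx, vy = person_push_vector(person[0], person[1], point[0], point[1])
    return (point[0] + vx, point[1] + vy)
```

Applying `press_back` to every sampled point of each peak line yields the locally deformed waveform shown near the person in FIG. 26.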
2. Image Projection Example
Next, an image projection example when projection image data generated by the projection control system is projected on a road surface is described.
FIG. 26 is a fourteenth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system. The example of FIG. 26 illustrates a scene in which a person is about to cross a road at the right front side of the vehicle 100.
When the person information extraction unit 2410 extracts person information from the CAN information, the vector calculation unit 2510 calculates a vector based on the extracted person information. Then, as illustrated in FIG. 26, an image in which waves whose waveform has been transformed near the person are propagating is projected on the road surface.
In this manner, since the waveform of the waves is transformed, the person who shares the environment with the vehicle 100 (in the example of FIG. 26, the person who is about to cross the road) can notice that the vehicle 100 has recognized his or her presence.
Conclusion
As apparent from the description above, the projection control system according to the third embodiment
- acquires information (speed information, acceleration information, route information, person information) recognized by the vehicle 100;
- generates image data (projection image data) indicating propagation of waves in a display mode in accordance with the acquired information (speed information, acceleration information, route information);
- projects the generated projection image data via the projection device onto the road surface on which the vehicle 100 travels while the vehicle 100 is traveling;
- when the vehicle 100 extracts person information, transforms the waveform of the waves represented in the generated projection image data in accordance with the extracted person information and outputs the result; and
- when the vehicle 100 extracts person information, projects projection image data indicating waves whose waveform has been transformed via a projection device onto the road surface on which the vehicle 100 is traveling.
In this manner, according to the projection control system of the third embodiment, it is possible to transmit, to a person who shares the environment with the vehicle 100, that the vehicle 100 has recognized the person. That is, according to the projection control system of the third embodiment, transmitting performance of the vehicle can be further improved.
Fourth Embodiment
In the third embodiment described above, the case in which the person information is further extracted from the CAN information and reflected on the display mode of the image data indicating the propagation of the waves is described. However, the CAN information to be reflected on the display mode of the image data indicating the propagation of the waves is not limited thereto. For example, information on an oncoming vehicle (oncoming vehicle information) may be extracted and reflected on the display mode of the image data indicating the propagation of the waves. Hereinafter, a fourth embodiment is described, focusing mainly on differences from the third embodiment.
1. Functional Configuration of Control Device
(1) Entire Configuration
First, a functional configuration of the entire control device of a projection control system according to the fourth embodiment is described. FIG. 27 is a fourth diagram illustrating an example of a functional configuration of the control device. A difference from the functional configuration illustrated in FIG. 24 is that a control device 2700 in FIG. 27 includes an oncoming vehicle information extraction unit 2710 and an image generation unit 2720.
The oncoming vehicle information extraction unit 2710 is an example of an acquisition unit, and extracts oncoming vehicle information included in the CAN information acquired by the information acquisition unit 610. Note that the oncoming vehicle information includes a distance from the vehicle 100 to the oncoming vehicle, a direction (angle) to the oncoming vehicle as viewed from the vehicle 100, speed information of the oncoming vehicle, and speed information of the vehicle 100.
The image generation unit 2720 is an example of a generation unit, and generates projection image data based on the speed information and acceleration information extracted by the speed/acceleration information extraction unit 620 and the route information extracted by the route information extraction unit 1710. Further, based on the oncoming vehicle information extracted by the oncoming vehicle information extraction unit 2710, the image generation unit 2720 transforms the waveform of the waves represented in the generated projection image data. Furthermore, the image generation unit 2720 outputs projection image data indicating waves whose waveform has been transformed.
(2) Method for Generating Projection Image Data by Image Generation Unit
Next, a method for generating projection image data by the image generation unit 2720 is described. FIG. 28 is a fourth diagram illustrating a method for generating projection image data by an image generation unit. A difference from the method for generating projection image data by the image generation unit 2420 illustrated in FIG. 25 is that the image generation unit 2720 illustrated in FIG. 28A includes a position/relative speed calculation unit 2810 and an update unit 2820.
Based on the oncoming vehicle information extracted by the oncoming vehicle information extraction unit 2710, the position/relative speed calculation unit 2810 calculates the relative positions and relative speed between the vehicle 100 and an oncoming vehicle, which are used to transform the waveform of the waves.
The update unit 2820 generates projection image data based on the wavelength determined by the wavelength determining unit 710, the moving speed of the wave determined by the wave moving speed determining unit 720, and the moving direction of the wave determined by the wave moving direction determining unit 1810.
Further, based on the relative positions and the relative speed between the vehicle 100 and the oncoming vehicle calculated by the position/relative speed calculation unit 2810, the update unit 2820 transforms the waveform of the waves represented in the generated projection image data. Furthermore, the update unit 2820 outputs projection image data indicating waves whose waveform has been transformed.
FIG. 28B is a diagram illustrating a specific example of projection image data 2830 output by the image generation unit 2720. In FIG. 28B, an arrow 2831 represents position/speed information of the vehicle 100 in the projection image data 2830, which is specified based on the oncoming vehicle information. Further, an arrow 2832 represents the position/speed information of the oncoming vehicle in the projection image data 2830, which is specified based on the oncoming vehicle information.
Based on the arrows 2831 and 2832, the position/relative speed calculation unit 2810 calculates the relative positions and the relative speed between the vehicle 100 and the oncoming vehicle on the projection image data 2830.
The update unit 2820 transforms the waveform of the waves represented in the projection image data 2830 based on the calculated relative positions and relative speed. In this manner, the update unit 2820 can express, in the projection image data 2830, a state in which the moving speed of the propagating waves is reduced at a position in accordance with the relative positions and relative speed (as if the waves are weakened by the oncoming vehicle).
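The calculation performed by units 2810 and 2820 can be sketched as follows. The linear falloff radius and the 30 m/s normalisation are assumptions; the specification states only that the wave's moving speed is reduced at a position determined by the relative positions and relative speed.

```python
import math

def relative_state(own_pos, own_vel, other_pos, other_vel):
    """Relative position and relative (closing) speed of the oncoming
    vehicle, cf. arrows 2831 and 2832 on the projection image data."""
    rel_pos = (other_pos[0] - own_pos[0], other_pos[1] - own_pos[1])
    rel_vel = (other_vel[0] - own_vel[0], other_vel[1] - own_vel[1])
    return rel_pos, math.hypot(rel_vel[0], rel_vel[1])

def local_wave_speed(base_speed, point, rel_pos, closing_speed, radius=5.0):
    """Reduce the moving speed of the wave near the oncoming vehicle in
    proportion to the closing speed (hypothetical damping model)."""
    d = math.hypot(point[0] - rel_pos[0], point[1] - rel_pos[1])
    if d >= radius:
        return base_speed
    damping = (1.0 - d / radius) * min(closing_speed / 30.0, 1.0)
    return base_speed * (1.0 - damping)
```

Evaluating `local_wave_speed` per sample point slows the wavefront only on the oncoming-vehicle side, producing the one-sided deformation shown in FIG. 29.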
2. Image Projection Example
Next, an image projection example when projection image data generated by the projection control system is projected on a road surface is described.
FIG. 29 is a fifteenth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system. The example of FIG. 29 illustrates a scene in which an oncoming vehicle traveling straight approaches from the right front side of the vehicle 100.
When the oncoming vehicle information extraction unit 2710 extracts oncoming vehicle information from the CAN information, the position/relative speed calculation unit 2810 calculates the relative positions and the relative speed between the vehicle 100 and the oncoming vehicle based on the extracted oncoming vehicle information. Then, as illustrated in FIG. 29, an image in which waves whose waveform has been transformed on the oncoming vehicle side are propagating is projected on the road surface.
In this manner, since the waveform of the waves is transformed, the person who shares the environment with the vehicle 100 (in the example of FIG. 29, the driver of the oncoming vehicle) can notice that the vehicle 100 has recognized the existence of the vehicle which he/she is driving.
Conclusion
As apparent from the description above, the projection control system according to the fourth embodiment
- acquires information (speed information, acceleration information, route information, oncoming vehicle information) recognized by the vehicle 100;
- generates image data (projection image data) indicating propagation of waves in a display mode in accordance with the acquired information (speed information, acceleration information, route information);
- projects the generated projection image data via the projection device onto the road surface on which the vehicle 100 travels while the vehicle 100 is traveling;
- when the vehicle 100 extracts oncoming vehicle information, transforms the waveform of the waves represented in the generated projection image data in accordance with the extracted oncoming vehicle information and outputs the result; and
- when the vehicle 100 extracts oncoming vehicle information, projects projection image data indicating waves whose waveform has been transformed via a projection device onto the road surface on which the vehicle 100 is traveling.
In this manner, according to the projection control system of the fourth embodiment, it is possible to transmit, to a driver of an oncoming vehicle who shares the environment with the vehicle 100, that the vehicle 100 has recognized that oncoming vehicle. That is, according to the projection control system of the fourth embodiment, transmitting performance of the vehicle can be further improved.
Fifth Embodiment
In the third and the fourth embodiments described above, the case in which the person information or the oncoming vehicle information is further extracted from the CAN information and reflected on the display mode of the image data indicating the propagation of the waves is described. However, the CAN information to be reflected on the display mode of the image data indicating the propagation of the waves is not limited thereto. For example, information indicating a state of a road surface (road surface information) may be extracted and reflected on the display mode of the image data indicating the propagation of the waves. Hereinafter, a fifth embodiment is described, focusing mainly on differences from the fourth embodiment.
1. Functional Configuration of Control Device
(1) Entire Configuration
First, a functional configuration of the entire control device of a projection control system according to the fifth embodiment is described. FIG. 30 is a fifth diagram illustrating an example of a functional configuration of the control device. A difference from the functional configuration illustrated in FIG. 27 is that a control device 3000 in FIG. 30 includes a road surface information extraction unit 3010 and an image generation unit 3020.
The road surface information extraction unit 3010 is an example of an acquisition unit, and extracts road surface information included in the CAN information acquired by the information acquisition unit 610. Note that the road surface information includes information on places with road roughness and information on a degree of road roughness.
The image generation unit 3020 is an example of a generation unit. The image generation unit 3020 generates projection image data based on the speed information and acceleration information extracted by the speed/acceleration information extraction unit 620 and the route information extracted by the route information extraction unit 1710. Further, based on the road surface information extracted by the road surface information extraction unit 3010, the image generation unit 3020 transforms the waveform of the waves represented in the generated projection image data. Furthermore, the image generation unit 3020 outputs projection image data indicating waves whose waveform has been transformed.
(2) Method for Generating Projection Image Data by Image Generation Unit
Next, a method for generating projection image data by the image generation unit 3020 is described. FIG. 31 is a fifth diagram illustrating a method for generating projection image data by an image generation unit. A difference from the method for generating projection image data by the image generation unit 2720 illustrated in FIG. 28 is that the image generation unit 3020 in FIG. 31A includes a position/degree of roughness calculation unit 3110 and an update unit 3120.
Based on the road surface information extracted by the road surface information extraction unit 3010, the position/degree of roughness calculation unit 3110 calculates the position at which the waveform of the waves is to be transformed and the degree of roughness.
The update unit 3120 generates projection image data based on the wavelength determined by the wavelength determining unit 710, the moving speed of the wave determined by the wave moving speed determining unit 720, and the moving direction of the wave determined by the wave moving direction determining unit 1810.
Further, based on the position and the degree of roughness calculated by the position/degree of roughness calculation unit 3110, the update unit 3120 transforms the waveform of the waves represented in the generated projection image data. Furthermore, the update unit 3120 outputs projection image data indicating waves whose waveform has been transformed.
FIG. 31B is a diagram illustrating a specific example of the projection image data 3130 output by the image generation unit 3020. In FIG. 31B, a hatched area 3131 indicates the position for transforming the waveform of the waves and the degree of roughness on the projection image data 3130, and is calculated by the position/degree of roughness calculation unit 3110.
In the update unit 3120, the waveform of the waves is transformed based on the calculated position and the degree of roughness. In this manner, the update unit 3120 can express, in the projection image data 3130, a state in which the waveform of the propagating waves is transformed at the position of the hatched area (as if the shape were deformed by the road roughness).
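One possible way to realize the transformation performed by the update unit 3120 — distorting the waveform only inside the calculated region, with a strength given by the degree of roughness — is sketched below (the region format, jitter model, and all names are illustrative assumptions):

```python
import random

def transform_waveform(frame, region, roughness, seed=0):
    """Distort wave brightness inside `region` (x0, y0, x1, y1) by adding
    bounded random jitter scaled by `roughness` in [0, 1], so the waves
    appear deformed where the road surface is rough. The input frame is
    left unmodified."""
    rng = random.Random(seed)  # seeded for reproducible frames
    x0, y0, x1, y1 = region
    out = [row[:] for row in frame]  # copy so the input stays untouched
    for y in range(y0, y1):
        for x in range(x0, x1):
            jitter = rng.uniform(-1.0, 1.0) * roughness * 127.5
            out[y][x] = max(0, min(255, int(out[y][x] + jitter)))
    return out

# Demo: a flat frame with one rough patch (corresponding to the hatched area).
flat = [[128] * 8 for _ in range(8)]
bumpy = transform_waveform(flat, region=(2, 2, 6, 6), roughness=0.8)
```

With `roughness=0.0` the frame passes through unchanged; larger values deform the wave pattern more strongly inside the region, analogous to the hatched area 3131 in FIG. 31B.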
2. Image Projection Example
Next, an image projection example when projection image data generated by the projection control system is projected on a road surface is described.
FIG. 32 is a sixteenth diagram illustrating an example of projection of an image to be projected onto a road surface by a projection control system. The example of FIG. 32 illustrates a scene in which roughness exists on the road surface on the right front side of the vehicle 100 that is travelling straight.
When the road surface information extraction unit 3010 extracts road surface information from the CAN information, the position/degree of roughness calculation unit 3110 calculates, based on the extracted road surface information, the position on the projection image data at which the waveform is to be transformed and the degree of roughness to be applied. Then, as illustrated in FIG. 32, an image of propagating waves of which waveform has been transformed at the position where the road roughness exists is projected onto the road surface.
In this manner, since the waveform of the waves is transformed, a person who shares the environment with the vehicle 100 (in the example of FIG. 32, the driver of the following vehicle) can share the road surface information.
Conclusion
As apparent from the description above, the projection control system according to the fifth embodiment
- acquires information (speed information, acceleration information, route information, road surface information) recognized by the vehicle 100;
- generates image data (projection image data) indicating propagation of waves in a display mode in accordance with the acquired information (speed information, acceleration information, route information);
- projects the generated projection image data via the projection device onto the road surface on which the vehicle 100 travels, all the time the vehicle 100 is traveling;
- when the vehicle 100 extracts road surface information, transforms, in accordance with the extracted road surface information, the waveform of the waves represented in the generated projection image data and outputs the transformed projection image data; and
- when the vehicle 100 extracts road surface information, projects projection image data indicating waves of which waveforms have been transformed via a projection device onto a road surface on which the vehicle 100 is traveling.
In this manner, according to the projection control system of the fifth embodiment, it is possible to transmit road surface information to the driver of a following vehicle who shares the environment with the vehicle 100. That is, according to the projection control system of the fifth embodiment, transmitting performance of the vehicle can be further improved.
Other Embodiments
In the above third to fifth embodiments, it has been described that, with respect to the projection image data in which the wavelength, the moving speed of the waves, and the moving direction of the waves are determined based on the speed information, the acceleration information, and the route information, the waveform of the waves is transformed based on the person information, the oncoming vehicle information, or the road surface information.
However, the information based on which the waveform of the waves is to be transformed is not limited to those described above. For example, projection image data may be generated based on a predetermined wavelength, a predetermined moving speed, and a predetermined moving direction, rather than on the speed information, the acceleration information, and the route information, and the waveform may then be transformed based on person information, oncoming vehicle information, or road surface information.
In the above first to fifth embodiments, in the image data indicating the propagation of the waves, the color of the line (or band) indicating the peak positions of the height of the waves is not particularly mentioned. However, the color of the line (or band) indicating the peak position of the height of the waves may be determined arbitrarily. Further, the number of colors of the line (or band) indicating the peak position of the height of the waves may be one or more.
In the above first to fifth embodiments, it has been described that, when projecting projection image data onto a road surface via a projection device, the same projection image data is projected regardless of time of day of the projection.
However, the projection image data may be configured to change depending on time of day (for example, daytime and nighttime). Further, a method for changing depending on time of day may be determined arbitrarily; for example, the figure and ground may be inverted between daytime and nighttime. Further, for example, the contrast of the figure and ground may be changed between daytime and nighttime.
In any case, it is assumed that the image generation unit selects a display mode that is easier to view depending on the time of day and generates projection image data accordingly. Furthermore, the image generation unit may be configured to adjust the light amount depending on the time of day in addition to selecting the display mode.
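As a sketch of the day/night adjustment described above (the threshold hours and the figure-ground inversion rule are assumptions for illustration; the disclosure does not prescribe them):

```python
def adjust_for_time_of_day(frame, hour, night_start=19, night_end=6):
    """Invert figure and ground at night so the projected waves stay
    easy to see; daytime frames are returned unchanged. `frame` is an
    8-bit grayscale image given as a list of rows."""
    is_night = hour >= night_start or hour < night_end
    if not is_night:
        return frame
    # Figure-ground inversion: bright bands become dark and vice versa.
    return [[255 - v for v in row] for row in frame]

day = adjust_for_time_of_day([[0, 255]], hour=12)
night = adjust_for_time_of_day([[0, 255]], hour=22)
```

A contrast or light-amount adjustment could be substituted for the inversion in the same place without changing the surrounding pipeline.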
In the above first to fifth embodiments, the timings to activate and end the projection control system are not particularly mentioned. However, the projection control system may be configured to be activated and ended at arbitrary timings under instructions of the driver.
Alternatively, the projection control system may be configured to, for example, be activated when an ignition switch of the vehicle 100 is turned on and may be ended when the ignition switch is turned off.
In this manner, from when the ignition switch of the vehicle 100 is turned on until it is turned off (regardless of whether the vehicle 100 is moving), the light source of the projection unit may emit light all the time to project the projection image data on the road surface.
Turning on of the light source of the projection unit and projection of the projection image data on the road surface do not necessarily have to be synchronized with the activation of the projection control system. For example, after activation of the projection control system and at least while the vehicle 100 is moving, the light source of the projection unit may emit light all the time to project the projection image data on the road surface.
Further, turning on of the light source of the projection unit and projection of the projection image data on the road surface do not necessarily have to be synchronized. For example, the light source of the projection unit may be turned on at an arbitrary timing after activation of the projection control system, and, at least while the light source of the projection unit is emitting light and the vehicle 100 is moving, the projection image data may be projected all the time on the road surface.
In this manner, projection image data can be projected on a road surface all the time at least while the vehicle 100 is moving.
In the above first to fifth embodiments, the projection image data is projected onto the road surface. However, objects on which the projection image data is projected may be arbitrarily determined, and the projection image data may be projected onto a surface other than the road surface.
In the above first to fifth embodiments, the projection control system is provided as a component separated from other systems and devices (an information output system that generates and outputs information indicating the state of the vehicle 100, such as an advanced driver assistance system, an automatic parking system, and a navigation device). However, the projection control system may be configured integrally with other systems and devices, and configured to function as a part of a moving object control system.
Moreover, in the above first to fifth embodiments, the case in which the projection control system is mounted on the vehicle 100 has been described, but the projection control system may be mounted on a moving object other than the vehicle 100.
Note that the present invention is not limited to the configurations described herein, that is, not limited to configurations described in the embodiments above and combinations with other elements. These points may be changed without departing from the spirit of the embodiments of the present invention, and may be appropriately determined in accordance with application forms thereof.
The above-described embodiments are illustrative and do not limit the present invention. Thus, numerous additional modifications and variations are possible in light of the above teachings. For example, elements and/or features of different illustrative embodiments may be combined with each other and/or substituted for each other within the scope of the present invention.
This patent application is based on and claims priority pursuant to 35 U.S.C. §119(a) to Japanese Patent Application No. 2018-161192, filed on August 30, 2018, in the Japan Patent Office, the entire disclosure of which is hereby incorporated by reference herein.
100: Vehicle
110, 120: Projection devices
130: Control device
610: Information acquisition unit
620: Speed/acceleration information extraction unit
630: Image generation unit
640: Left/right dividing unit
710: Wavelength determining unit
720: Wave moving speed determining unit
730: Update unit
1700: Control device
1710: Route information extraction unit
1720: Image generation unit
1810: Wave moving direction determining unit
1820: Update unit
2400: Control device
2410: Person information extraction unit
2420: Image generation unit
2510: Vector calculation unit
2520: Update unit
2700: Control device
2710: Oncoming vehicle information extraction unit
2720: Image generation unit
2810: Position/relative speed calculation unit
2820: Update unit
3000: Control device
3010: Road surface information extraction unit
3020: Image generation unit
3110: Position/degree of roughness calculation unit
3120: Update unit

Claims (16)

  1. A control device, comprising:
    an acquisition unit configured to acquire information recognized by a moving object;
    a generation unit configured to generate an image representing propagation of a wave in a display mode in accordance with the acquired information; and
    an output unit configured to output the image to a projection unit.
  2. The control device according to claim 1, wherein the generation unit generates the image in a display mode in accordance with a wavelength and a moving speed of a wave that are determined based on the acquired information.
  3. The control device according to claim 1 or 2, wherein the generation unit generates the image in a display mode in accordance with a moving direction of a wave determined based on the acquired information.
  4. The control device according to claim 2, wherein the acquisition unit acquires one of a speed and acceleration of the moving object as information recognized by the moving object.
  5. The control device according to claim 3, wherein the acquisition unit acquires a movement route of the moving object as information recognized by the moving object.
  6. The control device according to claim 4 or 5, wherein, in response to detection of a person by the moving object, when the acquisition unit acquires, as the information recognized by the moving object, a distance from the moving object to the detected person and a direction of the detected person with respect to the moving object, the generation unit transforms a waveform of the wave represented in the image based on the acquired information recognized by the moving object.
  7. The control device according to claim 4 or 5, wherein, in response to detection of another moving object by the moving object, when the acquisition unit acquires, as the information recognized by the moving object, a distance from the moving object to the detected other moving object, a direction of the other moving object with respect to the moving object, and speed information of the moving object and speed information of the other moving object, the generation unit transforms a waveform of the wave represented in the image based on the acquired information recognized by the moving object.
  8. The control device according to claim 4 or 5, wherein, in response to detection, by the moving object, of roughness of a road on which the moving object moves, when the acquisition unit acquires, as the information recognized by the moving object, a position of the detected roughness and a degree of roughness, the generation unit transforms a waveform of the wave represented in the image based on the acquired information recognized by the moving object.
  9. The control device according to any one of claims 1 to 8, wherein the generation unit represents a peak position of height of a transverse wave by a line or a band, or a set of dots, to generate an image representing propagation of the wave.
  10. The control device according to any one of claims 1 to 8, wherein the generation unit represents dense and sparse portions of longitudinal waves by dots, to generate an image representing propagation of the wave.
  11. A projection control system, comprising:
    an acquisition unit configured to acquire information recognized by a moving object;
    a generation unit configured to generate an image representing propagation of a wave in a display mode in accordance with the acquired information; and
    a projection unit configured to project the image.
  12. The projection control system according to claim 11, wherein the projection unit projects the image at least when a light source of the projection unit is on and the moving object is moving.
  13. A moving object control system, comprising:
    an information output system configured to generate and output information recognized by the moving object; and
    the projection control system according to claim 11 or 12.
  14. A moving object equipped with the moving object control system according to claim 13.
  15. A control method comprising:
    an acquisition step of acquiring information recognized by a moving object,
    a generation step of generating an image representing propagation of a wave in a display mode in accordance with the acquired information; and
    an output step of outputting the image to a projection unit.
  16. A control program that causes a computer to execute:
    an acquisition step of acquiring information recognized by a moving object;
    a generation step of generating an image representing propagation of a wave in a display mode in accordance with the acquired information; and
    an output step of outputting the image to a projection unit.

PCT/JP2019/031473 2018-08-30 2019-08-08 Control device, projection control system, moving object control system, moving object, control method, and control program WO2020045040A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018161192A JP7279318B2 (en) 2018-08-30 2018-08-30 Control device, projection control system, moving body control system, moving body, control method and control program
JP2018-161192 2018-08-30

Publications (1)

Publication Number Publication Date
WO2020045040A1 true WO2020045040A1 (en) 2020-03-05

Family

ID=67734781

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/031473 WO2020045040A1 (en) 2018-08-30 2019-08-08 Control device, projection control system, moving object control system, moving object, control method, and control program

Country Status (2)

Country Link
JP (1) JP7279318B2 (en)
WO (1) WO2020045040A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7474133B2 (en) 2020-06-26 2024-04-24 株式会社小糸製作所 Vehicle lighting fixtures

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013203103A (en) * 2012-03-27 2013-10-07 Denso It Laboratory Inc Display device for vehicle, control method therefor, and program
JP2016037260A (en) 2014-08-11 2016-03-22 株式会社小糸製作所 Display system for vehicle
US20180090007A1 (en) * 2015-03-16 2018-03-29 Denso Corporation Image generation apparatus
US20180174463A1 (en) * 2016-12-19 2018-06-21 Toyota Jidosha Kabushiki Kaisha Driving assistance apparatus for vehicle
JP2018161192A (en) 2017-03-24 2018-10-18 東芝ライフスタイル株式会社 Washing machine

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6252316B2 (en) * 2014-03-31 2017-12-27 株式会社デンソー Display control device for vehicle
JP6643659B2 (en) * 2014-04-09 2020-02-12 パナソニックIpマネジメント株式会社 Display control device, display control method, and display control program


Also Published As

Publication number Publication date
JP7279318B2 (en) 2023-05-23
JP2020032876A (en) 2020-03-05

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19758519

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19758519

Country of ref document: EP

Kind code of ref document: A1