WO2019188234A1 - Vehicle control device and vehicle control system - Google Patents

Vehicle control device and vehicle control system

Info

Publication number
WO2019188234A1
WO2019188234A1
Authority
WO
WIPO (PCT)
Prior art keywords
instructor
vehicle
vehicle control
instruction
control device
Prior art date
Application number
PCT/JP2019/009876
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
春樹 的野
永崎 健
遠藤 健
Original Assignee
日立オートモティブシステムズ株式会社 (Hitachi Automotive Systems, Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立オートモティブシステムズ株式会社 (Hitachi Automotive Systems, Ltd.)
Priority to CN201980022095.8A (published as CN111918807A)
Publication of WO2019188234A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/06Automatic manoeuvring for parking
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/10Interpretation of driver requests or demands
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems

Definitions

  • the present invention relates to a vehicle control device and a vehicle control system.
  • Patent Document 1 describes "a vehicle control system comprising: a detection unit that detects the peripheral state of a vehicle; an automatic driving control unit that automatically performs at least one of speed control and steering control of the vehicle based on the peripheral state detected by the detection unit; a recognition unit that recognizes a moving object present in the vicinity of the vehicle based on the peripheral state detected by the detection unit; and a transmission unit that transmits, to a terminal device, information for calling attention in a direction other than the direction of the moving object recognized by the recognition unit."
  • In the technique of Patent Document 1, however, people around the vehicle must carry a terminal device in order to receive the alerting information. Moreover, with such a technique, people around the vehicle cannot control the vehicle.
  • An object of the present invention is to provide a vehicle control device and the like that can control a vehicle according to an instruction from an instructor outside the vehicle who does not have a terminal device.
  • To this end, the present invention provides an instructor detection unit that detects an instructor, an instruction operation detection unit that detects an instruction operation indicating an instruction from the instructor, and a first vehicle control unit that, when the instruction operation is detected, controls the vehicle according to the instruction indicated by the instruction operation.
  • According to the present invention, the vehicle can be controlled in accordance with an instruction from an instructor outside the vehicle who does not have a terminal device.
  • FIG. 1 is a configuration diagram of a vehicle control system including an ECU as a vehicle control device according to a first embodiment of the present invention. FIG. 2 is a block diagram showing the functions of the CPU used in the ECU according to the first embodiment of the present invention. FIG. 3 is a flowchart showing the operation of the CPU used in the ECU according to the first embodiment of the present invention.
  • FIG. 1 is a configuration diagram of a vehicle control system 1 including an ECU 100 (Electronic Control Unit) as a vehicle control device according to a first embodiment of the present invention.
  • The ECU 100 includes a CPU 101 (Central Processing Unit) as a processor, a memory 102 as a storage device, an I/F 103 as an input/output circuit, and the like. The ECU 100 is mounted on the vehicle.
  • the camera 200 captures an image and temporarily stores the captured image in a memory.
  • the camera 200 is connected to the I / F 103 and is controlled by the CPU 101.
  • the camera 200 is a stereo camera, for example.
  • The light source, sound source, or display 300 sends a signal (first signal) to an instructor (human) around the vehicle to confirm whether the instructor intends to control the vehicle.
  • a light source, a sound source, or a display 300 is connected to the I / F 103 and controlled by the CPU 101.
  • the light source is, for example, a vehicle light
  • the sound source is, for example, a horn
  • the display is, for example, an electric bulletin board provided on the front surface of a bus or the like.
  • the vehicle 400 includes a drive mechanism that drives the vehicle body, a braking mechanism that brakes the vehicle body, a steering mechanism that steers the vehicle body, and the like.
  • FIG. 2 is a block diagram showing functions of the CPU 101 used in the ECU 100 according to the first embodiment of the present invention.
  • the CPU 101 executes a predetermined program stored in advance in the memory 102 to thereby perform an instructor detection unit 1011, a signal transmission unit 1012, a response operation detection unit 1013, an instruction operation detection unit 1014, a first vehicle control unit 1015, It functions as the second vehicle control unit 1016.
  • these functions may be realized by a circuit such as an FPGA (Field-Programmable Gate Array).
  • the instructor detection unit 1011 detects an instructor (human) from the image captured by the camera 200.
  • the signal transmission unit 1012 confirms whether the instructor has an intention to control the vehicle by controlling at least one of the light source, the sound source, and the display.
  • Signal first signal
  • For example, the signal transmission unit 1012 lights a vehicle light, sounds the horn, or displays the message "Do you want to control the vehicle?" on an electric bulletin board for a predetermined period.
  • The response operation detection unit 1013 detects, from the image captured by the camera 200, a predetermined response operation indicating the instructor's response, such as raising both arms or raising a flag. The response operation also signals that an instruction operation indicating the instructor's instruction will follow.
  • The instruction operation detection unit 1014 detects an instruction operation from the image captured by the camera 200. For example, to stop the vehicle, the instructor holds both arms horizontally; to park the vehicle, the instructor points at the parking position and then turns one arm downward.
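The example gestures above map to vehicle instructions; a minimal sketch of such a lookup follows. The gesture labels, command names, and data layout are hypothetical illustrations, not defined in the patent:

```python
# Hypothetical mapping from a recognized gesture label to a vehicle
# instruction, following the examples in the text: both arms held
# horizontally -> stop; pointing at a spot, then turning one arm
# down -> park at the pointed position.
GESTURE_TO_INSTRUCTION = {
    "both_arms_horizontal": {"command": "stop"},
    "point_then_arm_down": {"command": "park", "needs_target": True},
}

def interpret_gesture(gesture_label, pointed_position=None):
    """Translate a recognized gesture into a vehicle instruction dict.

    Returns None when the gesture is not in the vocabulary.
    """
    spec = GESTURE_TO_INSTRUCTION.get(gesture_label)
    if spec is None:
        return None
    instruction = {"command": spec["command"]}
    if spec.get("needs_target"):
        instruction["target"] = pointed_position  # e.g. a parking slot
    return instruction
```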
  • the first vehicle control unit 1015 controls the vehicle 400 according to the instruction indicated by the instruction operation when the instruction operation is detected by the instruction operation detection unit 1014.
  • the second vehicle control unit 1016 performs control to bring the vehicle 400 closer to the instructor when the response operation is detected by the response operation detection unit 1013.
  • FIG. 3 is a flowchart showing the operation of the CPU 101 used in the ECU 100 according to the first embodiment of the present invention.
  • The CPU 101 (instructor detection unit 1011) determines whether an instructor (human) is detected in the image captured by the camera 200 (S15). For example, the CPU 101 analyzes the image and recognizes as a human an object whose shape is similar to a human shape stored in advance in the memory 102 and whose movement is similar to a human action.
  • When an instructor (human) is detected (S15: YES), the CPU 101 (signal transmission unit 1012) transmits a signal (first signal) for confirming whether the instructor intends to control the vehicle, for example by lighting the vehicle light for a predetermined period (S20). In response, the instructor shows the intention to control the vehicle by, for example, raising both arms. If no instructor (human) is detected (S15: NO), the CPU 101 repeats the process of S15.
  • CPU 101 determines whether or not a response operation (raising both arms) indicating the response of the instructor is detected from the image captured by camera 200 (S25).
  • the information defining the response action is stored in advance in the memory 102, for example.
  • The CPU 101 compares the information indicating the instructor's action, obtained by behavior analysis of the image captured by the camera 200, with the information defining the response operation stored in the memory 102. If they are similar, it determines that the response operation has been detected (recognized).
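The comparison between the observed action and the stored definition, used here for the response operation and again for the instruction operation in S35, can be sketched as a nearest-template match. The feature representation (a small vector per action) and the similarity threshold are assumptions for illustration only:

```python
import math

# Hypothetical stored definitions: each named action is a feature vector,
# e.g. joint angles extracted by behavior analysis of the camera image.
STORED_ACTIONS = {
    "raise_both_arms": [1.0, 1.0, 0.0],       # response operation
    "both_arms_horizontal": [0.5, 0.5, 0.0],  # stop instruction
}

def match_action(observed, stored=STORED_ACTIONS, threshold=0.2):
    """Return the name of the stored action most similar to `observed`,
    or None if no stored action is within `threshold` (Euclidean)."""
    best_name, best_dist = None, float("inf")
    for name, template in stored.items():
        dist = math.dist(observed, template)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None
```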
  • When the response operation is detected (S25: YES), the CPU 101 (second vehicle control unit 1016) performs control to bring the vehicle 400 closer to the instructor (S30). For example, the CPU 101 advances the vehicle 400 to a position at a predetermined distance from the instructor and stops it. As a result, the instructor occupies a larger proportion of the image captured by the camera 200, making the instruction operation that follows the response operation easier to recognize in the image analysis.
  • If no response operation is detected, the CPU 101 repeats the process of S25 until a predetermined time has elapsed (S26: YES), and then returns to normal automatic driving (S27). In normal automatic driving, the vehicle 400 is advanced while avoiding humans.
  • Next, the instructor holds both arms horizontally, for example, as an instruction operation indicating an instruction to stop the vehicle 400.
  • The CPU 101 determines whether an instruction operation is detected in the image captured by the camera 200 (S35).
  • the information defining the instruction operation is stored in the memory 102 in advance, for example.
  • The CPU 101 compares the information indicating the instructor's action, obtained by behavior analysis of the image captured by the camera 200, with the information defining the instruction operation stored in the memory 102. If they are similar, it determines that the instruction operation has been detected (recognized).
  • When the instruction operation is detected (S35: YES), the CPU 101 (first vehicle control unit 1015) controls the vehicle 400 according to the instruction indicated by the instruction operation (S40).
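An illustrative reading of the overall S15 to S40 flow as code follows. This is a sketch only: the callback names and the timeout handling are assumptions, not the patent's implementation:

```python
import time

def control_loop(detect_instructor, send_first_signal, detect_response,
                 approach_instructor, detect_instruction, execute_instruction,
                 resume_normal_driving, response_timeout_s=10.0):
    """One pass through the S15-S40 flow described in the text."""
    # S15: wait until an instructor (human) appears in the camera image.
    while not detect_instructor():
        pass
    # S20: confirm intent, e.g. by lighting the vehicle light.
    send_first_signal()
    # S25/S26: wait for the response operation, up to a timeout.
    deadline = time.monotonic() + response_timeout_s
    while not detect_response():
        if time.monotonic() >= deadline:
            resume_normal_driving()  # S27: back to normal automatic driving
            return "resumed"
    approach_instructor()            # S30: move closer, then stop
    # S35: wait for the instruction operation (e.g. both arms horizontal).
    while not detect_instruction():
        pass
    execute_instruction()            # S40: control the vehicle accordingly
    return "executed"
```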
  • As described above, according to the present embodiment, the vehicle can be controlled in accordance with an instruction from an instructor outside the vehicle who does not have a terminal device.
  • Modification 1 of the first embodiment of the present invention will be described with reference to FIGS. 4A and 4B.
  • FIG. 4A is a diagram illustrating the focal length when the angle of view of the camera 200 is wide, and FIG. 4B is a diagram illustrating the focal length when the angle of view of the camera 200 is narrow.
  • the CPU 101 controls the vehicle 400 to approach the instructor according to the angle of view of the camera 200.
  • The CPU 101 brings the vehicle 400 closer to the instructor and stops the vehicle 400 when the distance R to the instructor detected by the camera 200 reaches the distance corresponding to the angle of view θ1.
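The relation between stopping distance and angle of view can be illustrated with a simple pinhole-camera model, assuming the vehicle stops when an instructor of assumed height H just fills the vertical field of view. The formula and sample numbers are an illustration, not taken from the patent:

```python
import math

def stop_distance(instructor_height_m, view_angle_rad):
    """Distance at which a target of the given height exactly fills a
    camera's field of view (pinhole model): R = (H/2) / tan(theta/2).

    A wide angle of view gives a small R (the vehicle must come close);
    a narrow (telephoto) angle gives a large R.
    """
    return (instructor_height_m / 2.0) / math.tan(view_angle_rad / 2.0)
```

With an assumed height H = 1.7 m, a wide 90° angle of view gives R = 0.85 m, while a narrow 20° angle gives R ≈ 4.8 m, matching the intuition of FIGS. 4A and 4B: the narrower the angle of view, the farther away the vehicle can stop.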
  • FIG. 5 is a flowchart showing the operation of the second modification of the first embodiment. In FIG. 5, the processing from S20 to S30 in FIG. 3 is omitted.
  • In Modification 2, the vehicle is controlled according to the instruction operation of the instructor without waiting for a response operation from the instructor.
  • The CPU 101 (first vehicle control unit 1015) performs control to stop the vehicle 400 in a safe place. As a result, the vehicle can be controlled quickly.
  • the CPU 101 may predict an instruction operation from an image captured by the camera 200.
  • the CPU 101 (second vehicle control unit 1016) performs control to bring the vehicle 400 closer to the instructor when the instruction operation is predicted. Thereby, the vehicle 400 can be brought close to the instructor without waiting for a response operation from the instructor.
  • FIG. 6 is a flowchart showing the operation of the second exemplary embodiment of the present invention.
  • In this embodiment, the process of S28 is added to the flowchart of the first embodiment shown in FIG. 3. Note that the system configuration of this embodiment is the same as that of the first embodiment.
  • In S28, the CPU 101 transmits a second signal different from the first signal. For example, the CPU 101 notifies the instructor that the response operation has been detected by blinking the vehicle light for a predetermined period (S28). Alternatively, the CPU 101 may sound the horn for a predetermined period or display "OK" on the display.
  • According to the present embodiment, the instructor can be notified that the instructor's response operation has been detected. This gives a sense of security to the instructor around the vehicle.
  • FIG. 7 is a flowchart showing the operation of the third exemplary embodiment of the present invention.
  • In this embodiment, the processes of S45 and S50 are added to the flowchart of the first embodiment shown in FIG. 3. Note that the system configuration of this embodiment is the same as that of the first embodiment.
  • The CPU 101 performs the processes from S15 to S35 in the same manner as in the first embodiment, and detects the instruction operation of a first instructor.
  • For example, the instruction operation of the first instructor indicates an instruction to move forward to the right, and includes an operation of pointing at the next instructor.
  • The CPU 101 (first vehicle control unit 1015) controls the vehicle 400 according to the instruction indicated by the instruction operation of the first instructor (S40).
  • Next, the CPU 101 determines whether the instruction operation of a second instructor located in the direction pointed to by the first instructor is detected in the image captured by the camera 200 (S45).
  • When the instruction operation of the first instructor includes the action of pointing at the second instructor, the CPU 101 subsequently detects the instruction operation of the second instructor.
  • At this time, the CPU 101 may cause the camera 200 to zoom in on the second instructor.
  • When the instruction operation of the second instructor is detected (S45: YES), the CPU 101 (first vehicle control unit 1015) controls the vehicle 400 according to the instruction indicated by the instruction operation of the second instructor (S50).
  • That is, the CPU 101 controls the vehicle 400 according to the instruction indicated by the instruction operation of the first instructor, and then controls the vehicle 400 according to the instruction indicated by the instruction operation of the second instructor.
  • the instruction can be taken over from the first instructor to the second instructor.
  • FIG. 9 is a diagram illustrating a distance R1 from the camera 200 to the first instructor and a distance R2 from the camera 200 to the second instructor.
  • Although control for bringing the vehicle 400 closer to the second instructor is not performed in the above description, the CPU 101 (second vehicle control unit 1016) may perform control to bring the vehicle 400 close to each instructor in ascending order of distance from the camera 200. For example, in the example of FIG. 9, since the distance R1 is smaller than the distance R2, the CPU 101 (second vehicle control unit 1016) brings the vehicle 400 close to the first instructor and then brings the vehicle 400 close to the second instructor.
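Approaching each instructor in ascending order of distance, as in the FIG. 9 example, amounts to a sort on the measured distances. A trivial sketch follows; the data layout is an assumption:

```python
def approach_order(instructors):
    """Given (name, distance_m) pairs measured from the camera, return the
    names in the order the vehicle should approach them: nearest first."""
    return [name for name, _ in sorted(instructors, key=lambda p: p[1])]
```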
  • the present invention is not limited to the above-described embodiment, and includes various modifications.
  • the above-described embodiment has been described in detail for easy understanding of the present invention, and is not necessarily limited to the one having all the configurations described.
  • a part of the configuration of an embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of an embodiment.
  • For example, the process of S28 (transmitting the second signal) of the second embodiment shown in FIG. 6 may be added between the processes of S25 and S30 in the flowchart of the third embodiment shown in FIG. 7.
  • In that case, the CPU 101 may output an announcement such as "approaching" from a speaker, or display a message to that effect on the display. This gives the instructor advance notice that the vehicle 400 is approaching and provides a sense of security.
  • A process in which the signal transmission unit 1012 transmits a third signal may be added between the processes of S35 and S40 of the second embodiment illustrated in FIG. 6. This makes it possible to notify the instructor that the instructor's instruction operation has been detected.
  • In the above description, the first signal (first embodiment), the second signal (second embodiment), and the third signal are different from each other, but they may be the same signal. Since this simplifies the control, design and manufacturing costs can be reduced.
  • In the above description, the CPU 101 performs control to bring the vehicle 400 closer to the instructor. However, if the camera 200 has a telephoto zoom function, the CPU 101 may cause the camera 200 to zoom in on the first instructor instead of performing the process of S30. In this way, the instruction operation can be easily detected even without bringing the vehicle 400 close to the instructor.
  • In the above description, the light of the vehicle 400 is turned on as the first signal; however, as long as the traffic regulations of the country in which the vehicle 400 is used permit it, the hazard lights may be flashed instead.
  • In the above description, the camera 200 is brought into focus on the instructor by controlling the vehicle 400 to approach the instructor according to the angle of view of the camera 200. Alternatively, the CPU 101 (instruction operation detection unit 1014) may detect the instruction operation from the region with the highest resolution among the images captured by the camera 200. This makes the instruction operation easy to detect.
  • the instructor is a human, but the instructor may be a robot.
  • The CPU 101 (first vehicle control unit 1015) may also perform control to bring the vehicle 400 to an emergency stop. This improves safety.
  • A signboard or the like on which a predetermined marker (for example, a rectangular frame) is drawn may be placed behind the instructor, and the CPU 101 (instructor detection unit 1011) may detect the instructor from the image region in which the marker is detected (inside the rectangular frame). This facilitates detection of the instructor.
  • The instructor in the present embodiment is not necessarily limited to a human or a robot, and may include artificial intelligence and guidance signs (of any known configuration, such as lighted, analog, or digital signs).
  • The CPU 101 may determine the authority of the instructor from an object 500 held by the instructor (for example, a reflective material or an object having a predetermined pattern).
  • the correspondence relationship between the object and authority held by the instructor is stored in the memory 102 in advance.
  • The CPU 101 (first vehicle control unit 1015) permits control of the vehicle 400 according to that authority. In this way, different levels of authority can be given, for example, to a general person and to a traffic guide.
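The authority check based on a held object can be sketched as a lookup in a pre-stored correspondence table, as the text describes for the memory 102. The object labels and permission sets here are hypothetical:

```python
# Hypothetical correspondence between a recognized held object and the
# vehicle controls its holder is permitted to issue, as stored in memory.
AUTHORITY_TABLE = {
    "reflective_baton": {"stop", "park", "forward"},  # e.g. traffic guide
    "none": {"stop"},                                  # general person
}

def permitted(instruction, held_object):
    """Return True if the instruction is allowed for the given held object;
    unknown objects fall back to the general-person permissions."""
    return instruction in AUTHORITY_TABLE.get(held_object, AUTHORITY_TABLE["none"])
```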
  • Some or all of the above configurations and functions may be realized in hardware, for example, by designing them as an integrated circuit.
  • Each of the above configurations and functions may also be realized in software by having a processor interpret and execute a program that realizes each function.
  • Information such as programs, tables, and files for realizing each function can be stored in a recording device such as a memory, a hard disk, or an SSD (Solid State Drive), or a recording medium such as an IC card, an SD card, or a DVD.

PCT/JP2019/009876 2018-03-28 2019-03-12 Vehicle control device and vehicle control system WO2019188234A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201980022095.8A CN111918807A (zh) 2018-03-28 2019-03-12 Vehicle control device and vehicle control system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-062256 2018-03-28
JP2018062256A JP6978973B2 (ja) 2018-03-28 2018-03-28 Vehicle control device and vehicle control system

Publications (1)

Publication Number Publication Date
WO2019188234A1 true WO2019188234A1 (ja) 2019-10-03

Family

ID=68061440

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/009876 WO2019188234A1 (ja) 2018-03-28 2019-03-12 Vehicle control device and vehicle control system

Country Status (3)

Country Link
JP (1) JP6978973B2 (zh)
CN (1) CN111918807A (zh)
WO (1) WO2019188234A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7331721B2 (ja) * 2020-02-07 2023-08-23 トヨタ自動車株式会社 自動運転車両の制御装置

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006264465A (ja) * 2005-03-23 2006-10-05 Advics:Kk 車両用走行支援装置
US20150251697A1 (en) * 2014-03-06 2015-09-10 Ford Global Technologies, Llc Vehicle target identification using human gesture recognition
JP2017121865A (ja) * 2016-01-07 2017-07-13 トヨタ自動車株式会社 自動運転車両
WO2017146815A1 (en) * 2016-02-22 2017-08-31 Uber Technologies, Inc. Intention signaling for an autonomous vehicle
JP2017533609A (ja) * 2014-08-26 2017-11-09 トヨタ モーター セールス,ユー.エス.エー.,インコーポレイティド 対話式移動体制御システムのための一体化ウェアラブル用品
JP2018045397A (ja) * 2016-09-14 2018-03-22 本田技研工業株式会社 自動運転車両

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008174192A (ja) * 2007-01-22 2008-07-31 Aisin Aw Co Ltd 駐車支援方法及び駐車支援装置
JP2013156793A (ja) * 2012-01-30 2013-08-15 Hitachi Consumer Electronics Co Ltd 車両用衝突危険回避システム
JP6393537B2 (ja) * 2014-07-09 2018-09-19 株式会社デンソーテン 車両用装置、車両制御システム、車両制御方法
WO2017145364A1 (ja) * 2016-02-26 2017-08-31 三菱電機株式会社 駐車支援装置および駐車支援方法


Also Published As

Publication number Publication date
JP2019172052A (ja) 2019-10-10
CN111918807A (zh) 2020-11-10
JP6978973B2 (ja) 2021-12-08


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19776842

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19776842

Country of ref document: EP

Kind code of ref document: A1