WO2023210128A1 - Electronic control device - Google Patents

Electronic control device

Info

Publication number
WO2023210128A1
WO2023210128A1 (PCT/JP2023/005995)
Authority
WO
WIPO (PCT)
Prior art keywords
data
electronic control
communication
unit
processing
Prior art date
Application number
PCT/JP2023/005995
Other languages
French (fr)
Japanese (ja)
Inventor
将史 溝口
隆 村上
朋仁 蛯名
功治 前田
一 芹沢
幸則 浅田
毅 福田
Original Assignee
Hitachi Astemo, Ltd. (日立Astemo株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Astemo, Ltd.
Publication of WO2023210128A1 publication Critical patent/WO2023210128A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 13/00 Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F 13/10 Program control for peripheral devices
    • G06F 13/12 Program control for peripheral devices using hardware independent of the central processor, e.g. channel or peripheral processor
    • G06F 13/38 Information transfer, e.g. on bus
    • G06F 13/42 Bus transfer protocol, e.g. handshake; Synchronisation

Definitions

  • the present invention relates to an electronic control device.
  • Patent Document 1 states, "The driver has a plurality of interface sections corresponding to a plurality of application programs and a common processing section that executes instructions from the plurality of application programs, and communication between the plurality of application programs is performed under the control of the driver."
  • Patent Document 2 states, "Data sent in a communication frame from an external bus connected to an external device is regarded as received data, and the device includes a message box in which at least the received data is stored, and an internal memory having a plurality of storage areas in which received data transferred via a bus can be written and read."
  • a buffer used by each of the application programs is installed as a component of a driver.
  • This buffer allows each application program to share one peripheral without having to prepare multiple peripherals (for example, CAN (Controller Area Network) controllers and CAN transceivers).
  • the present invention was made in view of this situation, and aims to reduce the cost of developing software programs.
  • An electronic control device includes: an arithmetic processing unit having an application processing unit that performs application processing; a data storage unit that stores data processed by the arithmetic processing unit; and a peripheral that transmits data read from the data storage unit to another electronic control device or receives data from another electronic control device.
  • A first timing at which the arithmetic processing unit inputs and outputs data to and from the data storage unit for application processing, and a second timing at which the data read from the data storage unit is passed to the peripheral and the peripheral transmits the data to another electronic control device, are fixed in advance.
  • FIG. 1 is a diagram showing a schematic configuration example of an electronic control device according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram showing an example of the internal configuration of each part of the electronic control device according to the first embodiment of the present invention.
  • FIG. 3 is a block diagram showing an example of the internal configuration of a vehicle control device equipped with the electronic control device according to the first embodiment of the present invention.
  • FIG. 4 is a diagram illustrating a configuration example of camera object detection data according to the first embodiment of the present invention.
  • FIG. 5A is a diagram showing pattern 1 of time synchronization of a first electronic control unit (camera ECU) and a second electronic control unit (automatic driving ECU).
  • FIG. 5B is a diagram showing pattern 2 of time synchronization of a first electronic control unit (camera ECU) and a second electronic control unit (automatic driving ECU).
  • FIG. 6A is a diagram illustrating an example of a change in data flow between electronic control devices time-synchronized in pattern 1.
  • FIG. 6B is a diagram illustrating an example of a change in data flow between electronic control devices time-synchronized in pattern 3.
  • FIG. 7A is a diagram showing the definitions of LET and LCT proposed in Non-Patent Document 1.
  • FIG. 7B is a diagram showing an example of processing time of each electronic control device and communication time of an in-vehicle network.
  • FIG. 8 is a diagram showing an example in which a conventional software architecture is applied to a first electronic control unit (camera ECU).
  • FIG. 10 is a diagram showing an example of mapping the LET and LCT proposed in Non-Patent Document 1 to the modules and tasks shown in FIGS. 8 and 9.
  • FIG. 2 is a diagram illustrating an overview of processing by a communication preparation processing unit and a peripheral access unit according to the first embodiment of the present invention, using AUTOSAR as an example.
  • It is a diagram showing software architecture according to the present embodiment in a first electronic control unit (camera ECU).
  • FIG. 3 is a diagram showing an example in which the LET and LCT according to the first embodiment of the present invention are applied to each module and task according to the present embodiment.
  • 1 is a timing chart comparing the technology according to the first embodiment of the present invention and the technology disclosed in Non-Patent Document 1.
  • It is a diagram showing software architecture in a first electronic control unit (camera ECU) according to a second embodiment of the present invention.
  • It is a diagram showing software architecture in a second electronic control unit (automatic driving ECU) according to a second embodiment of the present invention.
  • FIG. 7 is a diagram showing an example of mapping LET and LCT according to the second embodiment of the present invention to each module and task according to the present embodiment.
  • It is a diagram illustrating a configuration example of a vehicle control device equipped with a third electronic control device (central ECU) according to a third embodiment of the present invention.
  • It is a diagram showing software architecture in a third electronic control unit (central ECU) according to the third embodiment of the present invention.
  • a first embodiment of the present invention which will be described below, applies the present invention to an electronic control device and a vehicle control device.
  • In the first embodiment, a vehicle control device installed in an automatic driving vehicle that controls the steering, accelerator, and brakes using image data acquired from a camera is taken as an example, and a configuration example and an operation example of the electronic control device and the vehicle control device will be explained.
  • the present invention is applicable not only to automatic driving vehicles but also to vehicles that do not perform automatic driving.
  • the present invention is also applicable to a real-time system connected to a network and configured to communicate with other electronic control units (ECUs) inside and outside the vehicle.
  • the electronic control device and vehicle control device according to the first embodiment can be applied to products such as robots, construction machines, autonomous forklifts, etc., in which the CPU is configured with multi-cores and software programs run in real time.
  • FIG. 1 is a diagram showing a schematic configuration example of an electronic control device according to a first embodiment.
  • the electronic control device 100 includes a CPU 1, a RAM 2, and peripherals 3.
  • The CPU 1 is equipped with multiple cores. For the sake of explanation, in this embodiment, it is assumed that there are two cores, the first core 11 and the second core 12, but the present invention is also applicable when the CPU 1 is equipped with three or more cores.
  • the peripheral 3 is hardware used to communicate with the outside of the electronic control device 100 (for example, another electronic control device 100), and includes a register 31 therein.
  • the peripheral 3 is specifically hardware such as a CAN controller or a physical layer of Ethernet (registered trademark), and is connected to the in-vehicle network 4.
  • The in-vehicle network 4 allows data communication among a plurality of electronic control devices (the first electronic control device (camera ECU) 6 and the second electronic control device (automatic driving ECU) 7 shown in FIG. 3).
  • the peripheral (peripheral 3) transmits data read from the data storage unit (data storage unit 2A) (see FIG. 2) to another electronic control device, or receives data from another electronic control device.
  • Communication data is transmitted to the in-vehicle network 4 by the first core 11 or second core 12 of the CPU 1 writing a value to the hardware-specific transmission register 31.
  • the peripheral 3 receives communication data transmitted by another electronic control device 100 from the in-vehicle network 4, the value is written to the hardware-specific reception register 31.
  • This communication data is converted into a format usable by the first core 11 and the second core 12 and stored in the RAM 2.
  • the first core 11 and the second core 12 can read data from the RAM 2 and use it for application processing by an application module (an example of an application program) provided in each core.
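The register-to-RAM conversion step above can be sketched as follows. This is an illustrative assumption, not the patent's implementation: the register layout, type names, and function name are hypothetical. The example decodes a little-endian 16-bit value from a raw reception-register payload into a host value that the cores can store in the RAM 2.

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical mirror of the reception register 31: CAN classic frames
   carry up to 8 data bytes, plus the count of valid bytes. */
typedef struct {
    uint8_t payload[8];
    uint8_t length;
} rx_register_t;

/* Convert a little-endian 16-bit field from the raw payload into a
   host-order value usable by the application modules. */
uint16_t decode_u16_le(const rx_register_t *rx, size_t offset)
{
    return (uint16_t)(rx->payload[offset] |
                      ((uint16_t)rx->payload[offset + 1] << 8));
}
```

Transmission would run the same conversion in reverse before the core writes the value to the transmission register 31.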
  • FIG. 2 is a block diagram showing an example of the internal configuration of each part of the electronic control device 100 according to the first embodiment.
  • An arithmetic processing section 1A is provided as a functional section corresponding to the first core 11 of the CPU 1 shown in FIG. 1.
  • An arithmetic processing section 1B is provided as a functional section corresponding to the second core 12.
  • A data storage section 2A is provided as a functional section corresponding to the RAM 2 shown in FIG. 1.
  • This arithmetic processing section (arithmetic processing sections 1A, 1B, first core 11, second core 12) has an application processing section (application processing section 22) that performs application processing.
  • the arithmetic processing units (arithmetic processing units 1A, 1B, first core 11, second core 12) prepare for communication of application data generated by application processing with the application processing unit (application processing unit 22). It has a communication preparation processing unit (communication preparation processing unit 201) that performs the communication preparation processing.
  • the arithmetic processing unit 1A includes an application processing unit 22, a communication preparation processing unit 201, a peripheral access unit 202, and a task activation time management unit 26.
  • At least one arithmetic processing unit includes a peripheral access unit (peripheral access unit 202) that accesses the communication data storage unit (communication data storage unit 28) and a peripheral (peripheral 3), and a startup time management section (task startup time management section 26) that manages the startup time for starting up the peripheral access unit (peripheral access unit 202).
  • the application processing unit 22 of the arithmetic processing unit 1A performs application processing to realize the functions handled by the arithmetic processing unit 1A.
  • Application data generated by application processing is stored in the application data storage unit 23.
  • The communication preparation processing unit 201 of the arithmetic processing unit 1A performs communication preparation processing for the electronic control device 100 to communicate with another electronic control device, and stores communication data in the communication data storage unit 28. Further, the communication preparation processing unit 201 reads communication data received from another electronic control device 100 from the communication data storage unit 28, converts it into a format that can be used by the application processing unit 22, and stores it in the application data storage unit 23.
  • The peripheral access unit 202 accesses the peripheral 3 and the communication data storage unit 28 according to the activation instruction input from the task activation time management unit 26.
  • The task activation time management unit 26 manages the activation time of the peripheral access unit 202 when the activation of the peripheral access unit 202 is set as a task. The startup time management section (task startup time management section 26) operates the application processing section (application processing section 22) at a first timing based on a predetermined internal time, and operates the peripheral access section (peripheral access section 202) at a second timing. As described later, the first timing according to the first embodiment is fixed as the timing at which the application processing section (application processing section 22) inputs and outputs data to and from the application data storage section (application data storage section 23).
  • the second timing is fixed as the timing at which the peripheral access unit (peripheral access unit 202) transfers the communication data stored in the communication data storage unit (communication data storage unit 28) to the peripheral (peripheral 3).
  • the internal time is corrected by a time synchronization signal that the electronic control device 100 receives from the outside. Therefore, the internal times of the plurality of electronic control devices 100 are synchronized.
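A minimal sketch of this correction, under the assumption that the external time synchronization signal carries a master timestamp (all names here are hypothetical, not from the patent): each ECU keeps a free-running local counter, records the master-minus-local offset at each sync event, and applies it to every subsequent local timestamp, so all ECUs share one internal time base.

```c
#include <stdint.h>

/* Per-ECU synchronization state: the offset observed at the last sync. */
typedef struct {
    int64_t offset_us;   /* master_time - local_time, in microseconds */
} time_sync_t;

/* Called when a time synchronization signal is received from outside. */
void on_sync_signal(time_sync_t *ts, int64_t local_us, int64_t master_us)
{
    ts->offset_us = master_us - local_us;
}

/* Internal time used to decide the first and second timings. */
int64_t internal_time_us(const time_sync_t *ts, int64_t local_us)
{
    return local_us + ts->offset_us;
}
```

With every ECU applying its own offset against the same master, the internal times used for the fixed first and second timings agree across devices.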
  • the arithmetic processing unit 1B includes an application processing unit 22 and a communication preparation processing unit 201.
  • the application processing unit 22 of the arithmetic processing unit 1B performs application processing to realize the functions that the arithmetic processing unit 1B is responsible for.
  • Application data generated by application processing is stored in the application data storage unit 23.
  • the communication preparation processing section 201 of the arithmetic processing section 1B performs communication preparation processing for the electronic control device 100 to communicate with another electronic control device, and stores communication data in the communication data storage section 28.
  • the processing of the communication preparation processing section 201 of the calculation processing section 1B is similar to the processing of the communication preparation processing section 201 of the calculation processing section 1A.
  • the data storage unit stores data processed by the calculation processing units (calculation processing units 1A, 1B, first core 11, second core 12).
  • This data storage unit has an application data storage section (application data storage section 23) that stores application data, and a communication data storage section (communication data storage section 28) that stores communication data generated by the communication preparation processing section (communication preparation processing section 201).
  • the application data storage unit 23 stores application data processed by the application processing units 22 of the calculation processing units 1A and 1B.
  • The application data storage unit has a local area (local area 23a) that serves as an area where each application processing unit (application processing unit 22) temporarily stores data during application processing, and that is used only by that application processing unit.
  • The local area 23a is an area provided inside the RAM 2 shown in FIG. 1. The data stored in the local area 23a can be accessed individually only by each application processing section 22, and therefore is not changed by other application processing sections 22.
  • The application data storage unit also has a global area (global area 23b) that stores data resulting from completion of processing by an application processing unit (application processing unit 22) and makes the data available for use by other application processing units.
  • The global area 23b is an area provided in the RAM 2.
  • the data stored in the global area 23b can be mutually accessed and used by each application processing unit 22. Therefore, the application processing unit (application processing unit 22) accesses the global area (global area 23b) at the first timing. Further, the application data stored in the global area 23b of the application data storage section 23 can be read by the communication preparation processing section 201 of the arithmetic processing section 1A, 1B.
  • Local variables and global variables are distinguished by the compiler, and within the RAM 2, local variables are placed in the local area 23a and global variables are placed in the global area 23b. Which area of the RAM 2 is the local area 23a and which area is the global area 23b depends on the electronic control device 100.
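The split can be illustrated with ordinary C storage classes. Treating file-scope data as landing in the global area 23b and automatic variables as landing in a task's local area 23a is a simplifying assumption for illustration; the actual placement depends on the compiler and linker configuration of the electronic control device 100.

```c
#include <stdint.h>

/* Global area 23b (illustrative): a result shared between application
   processing units after processing completes. */
uint32_t g_detected_object_count = 0;

/* The working variables of this function occupy the local area 23a and
   are invisible to other tasks; only the final result is published. */
void publish_object_count(const uint8_t *flags, uint32_t n)
{
    uint32_t count = 0;              /* local: other tasks never see it */
    for (uint32_t i = 0; i < n; i++) {
        if (flags[i]) {
            count++;
        }
    }
    g_detected_object_count = count; /* single write to the global area */
}
```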
  • the communication data storage unit 28 stores communication data processed by the communication preparation processing unit 201 of the arithmetic processing units 1A and 1B.
  • the communication data stored in the communication data storage section 28 can be read by the peripheral access section 202.
  • the peripheral 3 transmits data output from the peripheral access unit 202 at the timing activated by the task activation time management unit 26 to other electronic control devices via the in-vehicle network 4 (hereinafter also referred to as a bus). Further, the peripheral 3 outputs data received from another electronic control device via the in-vehicle network 4 to the peripheral access unit 202.
  • the communication data that the peripheral access unit 202 acquires from the peripheral 3 at the timing of activation by the task activation time management unit 26 is stored in the communication data storage unit 28.
  • the communication preparation processing section 201 of the arithmetic processing section 1A, 1B acquires the communication data from the communication data storage section 28, converts it into data that can be processed by the application processing section 22, and stores it in the global area 23b of the application data storage section 23. save.
  • the application processing units 22 of the arithmetic processing units 1A and 1B read data from the application data storage unit 23 and use it for processing each application.
  • The first timing at which the arithmetic processing units (arithmetic processing units 1A, 1B, first core 11, second core 12) input and output data to and from the data storage unit (data storage unit 2A) for application processing, and the second timing at which the data read from the data storage unit is passed to the peripheral (peripheral 3) and the peripheral (peripheral 3) transmits the data to another electronic control device, are fixed in advance.
  • the pre-fixed processing timing according to this embodiment will be described below in comparison with the conventional technology.
  • FIG. 3 is a block diagram showing an example of the internal configuration of a vehicle control device equipped with an electronic control device according to the first embodiment.
  • As electronic control devices installed in the vehicle control device 5, a first electronic control device (camera ECU) 6 and a second electronic control device (automatic driving ECU) 7 are shown as examples.
  • the vehicle control device 5 includes a first electronic control device (camera ECU) 6, a second electronic control device (automatic driving ECU) 7, a camera 8, a steering wheel 16, an accelerator 17, and a brake 18.
  • a first electronic control unit (camera ECU) 6 and a second electronic control unit (automatic driving ECU) 7 are connected to each other via an in-vehicle network 4.
  • the internal hardware configurations of the first electronic control unit (camera ECU) 6 and the second electronic control unit (automatic driving ECU) 7 are the same as the electronic control unit 100 shown in FIG. 1, respectively.
  • the camera 8 captures an image in front of the vehicle at a cycle of, for example, 100 milliseconds, and generates camera image data 9 using a known method such as JPEG (Joint Photographic Experts Group). Camera image data 9 generated by the camera 8 is input to a first electronic control unit (camera ECU) 6.
  • a first electronic control unit (camera ECU) 6 receives camera image data 9 from a camera 8 and generates camera object detection data 10.
  • FIG. 4 is a diagram showing an example of the configuration of camera object detection data 10.
  • the camera object detection data 10 includes items such as an object number (identifier), object type, and coordinates.
  • In the number (identifier) item, a number is assigned to identify each object detected from the camera image data 9 by the image recognition application modules 221 and 222 (see FIG. 8 described later) included in the first electronic control unit (camera ECU) 6.
  • In the object type item, the object type, such as a car or a pedestrian, is stored for each detected object. Coordinate information in the camera image data 9 of objects detected by the image recognition application modules 221 and 222 is stored in the coordinate item.
  • The camera image data 9 input to the first electronic control unit (camera ECU) 6 is image data compressed using a known compression format such as JPEG or PNG (Portable Network Graphics), or the numerical sequence itself detected by the imaging sensor of the camera 8, which is generally called raw data.
  • The camera object detection data 10 output from the first electronic control unit (camera ECU) 6 is composed of items such as the number (identifier), object type, and coordinates of each object detected from the camera image data 9. Therefore, the camera image data 9, which is about several megabytes, is larger in size than the camera object detection data 10.
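A possible in-memory layout of the camera object detection data 10, with field names, widths, and the object-count limit chosen for illustration (the patent only names the items): each record carries the number (identifier), object type, and coordinates of one detected object, so even a full detection list stays far below the several-megabyte camera image data 9.

```c
#include <stdint.h>

/* Illustrative object types; the text names cars and pedestrians. */
typedef enum {
    OBJ_TYPE_CAR        = 0,
    OBJ_TYPE_PEDESTRIAN = 1
} object_type_t;

/* One detected object: number (identifier), object type, coordinates. */
typedef struct {
    uint16_t id;    /* number (identifier) of the detected object */
    uint8_t  type;  /* object type (object_type_t value) */
    uint16_t x;     /* coordinates in the camera image data 9 */
    uint16_t y;
} detected_object_t;

/* Hypothetical upper bound on objects per 100-millisecond cycle. */
enum { MAX_OBJECTS = 64 };

typedef struct {
    uint8_t           count;
    detected_object_t objects[MAX_OBJECTS];
} camera_object_detection_data_t;
```

Because this structure is a few hundred bytes rather than megabytes, transmitting it instead of the camera image data 9 keeps the in-vehicle network 4 from being saturated.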
  • the first electronic control unit (camera ECU) 6 transmits camera object detection data 10 to the second electronic control unit (automatic driving ECU) 7 via the in-vehicle network 4. Since the data size of the camera object detection data 10 is smaller than the camera image data 9, it does not need to occupy the band of the in-vehicle network 4. Note that the internal logic of the first electronic control unit (camera ECU) 6 is not directly related to the present invention, and therefore will not be illustrated.
  • The second electronic control unit (automatic driving ECU) 7 receives the camera object detection data 10 from the first electronic control unit (camera ECU) 6 at a cycle of, for example, 100 milliseconds. Then, the second electronic control unit (automatic driving ECU) 7 plans the traveling direction (trajectory) of the vehicle based on the camera object detection data 10, and generates a steering control command 13, an accelerator control command 14, and a brake control command 15 to realize the plan.
  • the steering control command 13 is command data for controlling the operation of the steering wheel 16.
  • the accelerator control command 14 is command data for controlling the operation of the accelerator 17.
  • the brake control command 15 is command data for controlling the operation of the brake 18.
  • the steering 16 is controlled by the steering control command 13 and changes the direction of travel of the vehicle.
  • the accelerator 17 is controlled by the accelerator control command 14 to accelerate the vehicle.
  • Brake 18 is controlled by brake control command 15 to decelerate the vehicle. Controlling the traveling direction and acceleration/deceleration of the vehicle in this manner is called “vehicle control.” Note that the logic for analyzing the camera object detection data 10 in the second electronic control unit (autonomous driving ECU) 7, the logic for generating each control command, and the data format of each control command are not directly related to the present invention. Therefore, an example will be omitted.
  • FIG. 5A is a diagram showing a time synchronization pattern 1
  • FIG. 5B is a diagram showing a time synchronization pattern 2.
  • In the first electronic control unit (camera ECU) 6 and the second electronic control unit (automatic driving ECU) 7, periodic processing is performed every 100 milliseconds to generate each control command from the camera object detection data 10.
  • These periodic processes are implemented using timeout processing of hardware counters installed inside the electronic control units 6 and 7.
  • If the operating clocks of the hardware counters are different, a time lag arises between the electronic control units 6 and 7, which may break the consistency of the data flow and affect vehicle control.
  • In pattern 2, the data generated by the first electronic control unit (camera ECU) 6 in the first cycle is transmitted to the second electronic control unit (autonomous driving ECU) 7 via the in-vehicle network 4 and arrives after the start of the second cycle. Therefore, the second electronic control unit (autonomous driving ECU) 7 can perform calculation processing using the data generated by the first electronic control unit (camera ECU) 6 in the first cycle not in the second cycle but in the next, third cycle. In this way, when a difference occurs between the internal times of the electronic control units 6 and 7, the behavior of the system fluctuates.
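The cycle slip above is simple arithmetic. A sketch with the 100-millisecond period (1-based cycle indices and helper names are hypothetical): data arriving at the receiver is first usable in the cycle after the one already running at the arrival time, so an arrival just after a cycle boundary costs a full extra cycle.

```c
#include <stdint.h>

#define PERIOD_MS 100u   /* periodic task period from the example */

/* 1-based index of the cycle running at time t_ms. */
uint32_t cycle_at(uint32_t t_ms)
{
    return t_ms / PERIOD_MS + 1u;
}

/* First cycle in which data arriving at arrival_ms can be processed:
   the receiver reads inputs only at the start of a cycle, so data that
   misses that read waits for the next cycle start. */
uint32_t first_usable_cycle(uint32_t arrival_ms)
{
    return cycle_at(arrival_ms) + 1u;
}
```

For example, data produced in the first cycle but arriving at 130 ms lands inside the second cycle (which started at 100 ms) and is therefore first usable in the third cycle, matching the pattern-2 behavior described above.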
  • control is performed to synchronize the first electronic control unit (camera ECU) 6 and the second electronic control unit (automatic driving ECU) 7 so that the times do not deviate.
  • the method of time synchronization between electronic control devices is specified in AUTOSAR (AUTomotive Open System Architecture), which is a standard for in-vehicle software.
  • FIGS. 6A and 6B are diagrams showing examples of changes in data flow between time-synchronized electronic control units.
  • FIG. 6A is a diagram showing the same time synchronization pattern 1 as in FIG. 5A
  • FIG. 6B is a diagram showing a changed data flow pattern 3.
  • In FIGS. 6A and 6B, the first electronic control unit (camera ECU) 6 and the second electronic control unit (automatic driving ECU) 7 are time-synchronized.
  • In pattern 1, the start timings of the periodic tasks every 100 milliseconds are the same between the two electronic control devices. Even if the start timing (phase) of the periodic tasks is shifted, if this shift is constant, the times can be considered synchronized. However, as shown in pattern 3, if the processing time of the first electronic control unit (camera ECU) 6 in the first cycle becomes long, the communication of the camera object detection data 10 over the in-vehicle network 4 extends into the second cycle. In this case, since the camera object detection data 10 does not arrive by the start of the second cycle of the second electronic control unit (automatic driving ECU) 7, a processing omission occurs and the data flow becomes inconsistent.
  • Non-Patent Document 1 is the following document: Kai-Björn Gemlau, Leonie Köhler, Rolf Ernst, and Sophie Quinton. 2021. "System-level Logical Execution Time: Augmenting the Logical Execution Time Paradigm for Distributed Real-time Automotive Software." ACM Trans. Cyber-Phys. Syst. 5, 2, Article 14 (January 2021), 27 pages. DOI: https://doi.org/10.1145/3381847. This Non-Patent Document 1 describes a method of extending a scheduling method called LET (Logical Execution Time) between ECUs.
  • FIG. 7A is a diagram showing the definitions of LET and LCT. As shown in FIG. 7A, the period of each cycle of the first electronic control device, the in-vehicle network, and the second electronic control device is constant at 100 milliseconds.
  • The first process, second process, third process, and fourth process are assigned to the LETs in the first, second, third, and fourth cycles in the first electronic control device, respectively.
  • Data is read at the read timing at the beginning of each process, and processed data is written at the write timing at the end of each process.
  • Different devices and in-vehicle networks can read data written to the RAM 2, peripheral 3, etc. at the end of one cycle at the beginning of the next cycle.
  • The LCT in the second cycle in the in-vehicle network is assigned a first communication process that communicates the data resulting from the first process performed by the first electronic control unit in the first cycle to the second electronic control unit.
  • Similarly, the LCTs in the third and fourth cycles in the in-vehicle network are assigned a second communication process and a third communication process that communicate the data resulting from the processing of the second and third cycles by the first electronic control unit to the second electronic control unit.
  • The LETs in the third and fourth cycles in the second electronic control device are assigned the first process and the second process of the second electronic control device, which use the data received via the in-vehicle network, respectively.
  • FIG. 7B is a diagram showing an example of the processing time of each electronic control device and the communication time of the in-vehicle network. Since the period of each cycle of the first electronic control device, the in-vehicle network, and the second electronic control device is fixed, the processing performed by the first electronic control device and the second electronic control device is executed somewhere within the LET. Likewise, communication processing performed on the in-vehicle network is executed somewhere within the LCT. For example, the first electronic control unit executes processing somewhere within the LET in a certain cycle. In the next cycle, the in-vehicle network communicates the data to the second electronic control unit somewhere within the LCT. In the cycle after that, the second electronic control unit obtains the data from the in-vehicle network and executes processing somewhere within the LET.
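The fixed pipeline can be expressed as cycle arithmetic (helper names are illustrative, cycles 1-based as in the text): a result produced in the first ECU's LET in cycle n is communicated in the network's LCT in cycle n+1 and consumed in the second ECU's LET in cycle n+2, so the end-to-end latency is two full periods regardless of where inside each LET or LCT the work actually runs.

```c
#include <stdint.h>

#define PERIOD_MS 100u   /* fixed cycle period of each LET/LCT */

/* Cycle in which the in-vehicle network's LCT carries the result
   produced by the first electronic control unit in producer_cycle. */
uint32_t lct_cycle(uint32_t producer_cycle)
{
    return producer_cycle + 1u;
}

/* Cycle in which the second electronic control unit's LET consumes it. */
uint32_t consumer_let_cycle(uint32_t producer_cycle)
{
    return producer_cycle + 2u;
}

/* End-to-end latency from the start of the producer's LET to the start
   of the consumer's LET: fixed, independent of actual run times. */
uint32_t pipeline_latency_ms(void)
{
    return 2u * PERIOD_MS;
}
```

This constancy is the point of the scheme: varying execution times change nothing observable, because data is only released at cycle boundaries.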
  • the method proposed in Non-Patent Document 1 fixes, in advance, the start time and end time of each application process installed in the electronic control device and of each communication process of the in-vehicle network, and does not change them.
  • the start time is the time at which data necessary for processing of an application installed in the electronic control device and communication processing of the in-vehicle network is acquired.
  • the end time is the time at which data generated by the processing of the application installed in the electronic control device and the communication processing of the in-vehicle network is released to the electronic control device or the in-vehicle network in the next cycle.
  • the method of Non-Patent Document 1 could not be applied as is to applications that require a large amount of data communication, such as high-resolution in-vehicle cameras, zone ECUs, and central ECUs. Note that if the time from the start time to the end time of the communication processing is lengthened, the amount of data communication can be increased; however, the time required to exchange data between applications also increases, so the latency (communication delay) worsens.
  • the time required for the CPU 1 of each ECU to process each application varies depending on the surrounding environment of the vehicle or the internal state of the vehicle. Therefore, in the technology according to Non-Patent Document 1, the processing of each application executed by each application processing unit 22 shown in FIG. 2 is performed using local variables whose data is not shared with the other application processing units 22. Further, at the start and end of processing in each application processing unit 22, global variables shared among the plurality of application processing units 22 are read and written. Furthermore, the timing at which each application processing unit 22 reads and writes the global variables is fixed.
  • the time required for each application processing unit 22 to process an application is defined as the time from the timing of reading a fixed global variable to the timing of writing a fixed global variable, and this time is called LET.
  • the processing time required by each application processing unit 22 can be regarded as always constant, and it is thought that the above-mentioned breakdown in data flow consistency will not occur.
  • the timing of reading a global variable from the RAM 2 of the electronic control device of the transmission source and the timing of writing the value of the global variable to the RAM 2 of the electronic control device of the receiving destination are fixed. Then, the time from the timing of reading the global variable to the timing of writing the value of the global variable is regarded as LCT.
  • the communication load (also referred to as bus load) on the in-vehicle network 4 is sufficiently small. Therefore, a high priority is set for communication messages (communication data) with short communication cycles.
  • the data always arrives at the register 31 (see FIG. 1) of the peripheral 3 of the destination electronic control unit (second electronic control unit (autonomous driving ECU) 7).
  • since the processing cycle of the first electronic control unit (camera ECU) 6 and the second electronic control unit (autonomous driving ECU) 7 shown in FIG. 1 is 100 milliseconds, the LCT should also be set to 100 milliseconds.
  • FIG. 8 is a diagram showing an example in which a conventional software architecture is applied to the first electronic control unit (camera ECU) 6.
  • the first core 11 in the first electronic control unit (camera ECU) 6 is dedicated to communication, and the second core 12 is dedicated to applications. That is, an OS (Operating System) 19, communication middleware 20, and communication driver 21 are allocated to the first core 11. Furthermore, an OS 19, an image recognition application module (vehicle front) 221, an image recognition application module (vehicle rear) 222, a data acquisition task 24, and a data publication task 25 are assigned to the second core 12.
  • the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 are both modules that execute application processing.
  • the data acquisition task 24 acquires data necessary for each application processing section 22 from the RAM 2 (see FIG. 1) at the start of processing of each application module.
  • the data acquisition task 24 can acquire data from the local area 23a or global area 23b of the application data storage unit 23 (see FIG. 2).
  • the data publishing task 25 stores the results of each application process in the application data storage unit 23 included in the RAM 2 when the process of each application module ends.
  • the data publication task 25 uses an area (global area 23b) of the application data storage unit 23 that is accessible from both the first core 11 and the second core 12, and saves the results of each application process in this global area 23b. Therefore, data that is generated by the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 and used by other application modules is saved in the global area 23b by the data publication task 25.
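The division of the application data storage unit 23 into a local area 23a and a global area 23b can be sketched as follows. This is a minimal illustration in Python with names of our own choosing, not the patent's implementation: the acquisition task snapshots shared data at the start of the LET, and the publication task exposes results at its end.

```python
# Minimal sketch (our own naming) of the application data storage unit 23:
# a per-core local area 23a and a shared global area 23b. The data
# acquisition task copies global data into local variables at the start of
# the LET; the data publication task copies results back at the end.

class AppDataStore:
    def __init__(self):
        self.local = {}    # local area 23a: private to the running application
        self.global_ = {}  # global area 23b: shared between cores/modules

    def acquire(self, keys):
        """Data acquisition task 24: snapshot global variables at LET start."""
        self.local = {k: self.global_[k] for k in keys if k in self.global_}

    def publish(self, results):
        """Data publication task 25: expose results to other modules at LET end."""
        self.global_.update(results)

store = AppDataStore()
store.publish({"camera_front": "objects_front"})  # result of front recognition
store.acquire(["camera_front"])                   # another module snapshots it
print(store.local["camera_front"])
```

Because the application itself only ever touches the local snapshot, mid-LET writes by other modules cannot break data-flow consistency.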
  • the communication middleware 20 has a function of adding data necessary for communication, including header generation, to the data stored in the global area 23b, and generating communication data.
  • This communication data is, for example, a frame used in Ethernet.
  • the communication driver 21 writes communication data to the register 31 of the peripheral 3 shown in FIG.
  • the timing for starting the data acquisition task 24 and the data publication task 25 is fixed. Further, the processing of the data acquisition task 24 and the data publication task 25 is only access to the RAM 2. Therefore, the CPU processing time of the data acquisition task 24 and the data publication task 25 can be considered constant. As a result, even if the CPU processing time (application processing time) of the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 changes, the timing at which the data acquisition task 24 is started and the timing at which the data publication task 25 is completed do not change.
  • the execution time of the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 is constant, as the time from the timing when the data acquisition task 24 is started until the timing when the data publication task 25 is completed. This constant time is implemented as the LET for the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222.
  • the task start time management unit 26 is a functional unit that implements a task scheduling function provided by the OS 19.
  • FIG. 9 is a diagram showing an example in which the conventional software architecture is applied to the second electronic control unit (automatic driving ECU) 7.
  • the first core 11 in the second electronic control unit (automatic driving ECU) 7 is dedicated to communication, and the second core 12 is dedicated to applications. That is, an OS (Operating System) 19, communication middleware 20, and communication driver 21 are allocated to the first core 11. Furthermore, the OS 19, trajectory generation application module 223, control command generation application module 224, data acquisition task 24, and data publication task 25 are assigned to the second core 12.
  • the trajectory generation application module 223 generates a trajectory on which the vehicle travels.
  • the control command generation application module 224 generates control commands (steering control command 13, accelerator control command 14, and brake control command 15) for the steering 16, accelerator 17, and brake 18 shown in FIG. 3 according to the generated trajectory.
  • the execution time of the trajectory generation application module 223 and the control command generation application module 224 is constant, as the time from the timing when the data acquisition task 24 is started until the timing when the data publication task 25 is completed, and this time is implemented as the LET for the trajectory generation application module 223 and the control command generation application module 224.
  • the data acquisition task 24, the data publication task 25, and the task start time management unit 26 are the same as those described with reference to FIG. 8, so detailed explanations will be omitted.
  • the communication middleware 20 shown in FIGS. 8 and 9 is activated periodically and performs a process of transmitting communication data to other electronic control devices. To do so, the communication middleware 20 reads the data that the data publication task 25 has stored in the global area 23b of the application data storage unit 23 shown in FIG. 2. Thereafter, the communication middleware 20 adds data necessary for communication (such as a communication header) to the read data, and writes the resulting communication data to the register 31 of the peripheral 3 shown in FIG. 1 by calling the communication driver 21.
  • the communication middleware 20 is also activated by a data reception interrupt from the peripheral 3, and can also perform reception processing of communication data transmitted from other electronic control devices.
  • the communication middleware 20 reads communication data stored in the register 31 of the peripheral 3 using the communication driver 21 during data reception processing.
  • the communication middleware 20 then executes predetermined processing regarding data reception defined by the communication protocol, such as communication header removal and CRC (Cyclic Redundancy Check) verification. Thereafter, the communication middleware 20 saves the received data in the application data storage unit 23 of the RAM 2, specifically in the local area 23a that is accessible from the first core 11 but not accessible from the second core 12.
  • the communication middleware 20 can also be activated periodically to move the received data stored in the local area 23a of the RAM 2 to the global area 23b of the RAM 2. The global area 23b is accessible from both the first core 11 and the second core 12.
  • the LCT is defined by the following process. Starting from the time when the communication middleware 20 installed in the first electronic control unit (camera ECU) 6 is activated for transmission processing, the communication middleware 20 installed in the second electronic control unit (autonomous driving ECU) 7 is subsequently activated. The communication middleware 20 installed in the second electronic control unit (autonomous driving ECU) 7 retrieves the data stored in the local area 23a of the application data storage unit 23 shown in FIG. 2, which is accessible from the first core 11 but not accessible from the second core 12. The communication middleware 20 then stores the data in the global area 23b of the application data storage unit 23, which is accessible from both the first core 11 and the second core 12. In this way, the time from when the communication middleware 20 is activated for transmission processing until the data is stored in the global area 23b is defined as the LCT.
  • the time required for communication from the first electronic control unit (camera ECU) 6 to the second electronic control unit (automatic driving ECU) 7 may vary depending on the status of the in-vehicle network 4. However, from the viewpoint of data flow, even the fluctuating time can be regarded as constant as LCT.
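The point of the last two bullets, that a fluctuating transfer time can nevertheless be treated as the constant LCT, can be illustrated with a minimal sketch. The 100-millisecond figure follows the example given earlier; the function and its names are our own, purely for illustration: received data is released to the application only at the fixed end of the LCT, so an earlier arrival is invisible to the data flow.

```python
# Sketch (illustrative, our own naming) of why a fluctuating communication
# time can be regarded as the constant LCT: data is released to the
# application only at the fixed end of the LCT, however fast the network was.

LCT_MS = 100  # fixed logical communication time, matching the 100 ms cycle

def release_time(start_ms, actual_transfer_ms):
    """Data becomes visible at start + LCT, regardless of actual transfer time."""
    assert actual_transfer_ms <= LCT_MS, "transfer must fit inside the LCT"
    return start_ms + LCT_MS

# three cycles with different real network latencies all release at the same offset
for actual in (12, 35, 80):
    print(release_time(0, actual))  # always 100
```

From the data-flow perspective, only this release time matters, which is why the varying physical transfer time can be regarded as constant.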
  • FIG. 10 is a diagram showing an example of mapping the LET and LCT proposed in Non-Patent Document 1 to the modules and tasks shown in FIGS. 8 and 9.
  • the processing performed in the timing chart of the first electronic control unit (camera ECU) 6 represents the processing performed in the application-dedicated second core 12 of the first electronic control unit (camera ECU) 6.
  • the first half of the processing shown in the timing chart of the in-vehicle network 4 represents the processing performed in the communication-dedicated first core 11 of the first electronic control unit (camera ECU) 6.
  • the latter half of the processing shown in the timing chart of the in-vehicle network 4 (including the write timing on the right side of the diagram) represents the processing performed in the communication-dedicated first core 11 of the second electronic control unit (autonomous driving ECU) 7.
  • the processing shown in the timing chart of the second electronic control unit (automatic driving ECU) 7 represents the processing performed in the application-dedicated second core 12 of the second electronic control unit (automatic driving ECU) 7.
  • a data acquisition task 24 acquires the data necessary for application processing at the beginning of the first cycle. Then, the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 (abbreviated as "image recognition (vehicle front) 221" and "image recognition (vehicle rear) 222" in the figure) execute their processing. The processed data is written to the peripheral 3 by the data publication task 25 at the end of the first cycle.
  • the area described as "processing" in the figure represents the time during which the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 execute the process.
  • the data acquisition task 24 acquires data from the peripheral 3 at the beginning of the third cycle. Then, the trajectory generation application module 223 and the control command generation application module 224 (abbreviated as "trajectory generation 223 and control command generation 224" in the figure) execute the process. The processed data is written to the peripheral 3 by the data publishing task 25 at the end of the third cycle.
  • the area labeled "Processing" in the figure represents the time during which the trajectory generation application module 223 and the control command generation application module 224 execute the process.
  • FIG. 11 is a timing chart of the communication middleware 20 and the in-vehicle network 4 when the first electronic control unit (camera ECU) 6 transmits a large amount of data. Restrictions on data communication amount will be explained with reference to FIG. 11.
  • the timing chart on the upper side of FIG. 11 is similar to the timing chart shown in FIG. 10.
  • the communication middleware 20 performs a process of reading the data that the data publication task 25 has stored in the area (global area 23b) of the application data storage unit 23 accessible from both the first core 11 and the second core 12, adding data necessary for communication such as a communication header, and writing the resulting data into the register 31 of the peripheral 3. In this process, for example, in the case of Ethernet communication under the in-vehicle software standard AUTOSAR, as shown on the left side of FIG. 12, the data passes through modules called COM, PduR, and Tp that belong to the service layer. Furthermore, in this process, it is necessary to call the Driver after going through modules called Interface and Transceiver that belong to the Communication Hardware Abstraction. Since this processing is all executed by the CPU, the CPU processing time is on the order of 100 microseconds to 1 millisecond. In this embodiment, the CPU processing time is 150 microseconds.
  • assuming Ethernet that allows a communication speed of 1 gigabit per second and a frame (the unit of communication) size of 1 Kbyte (8000 bits), one frame flows through the in-vehicle network 4 in 8 microseconds, whereas the CPU processing time of the communication middleware 20 is 150 microseconds.
  • the timing chart shown in the lower part of FIG. 11 shows the relationship between the processing times of the camera object detection data 10 generated by the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222, respectively.
  • it is shown that the CPU processing time of the communication middleware 20 for the camera object detection data 10 generated by the image recognition application module (vehicle front) 221 is 150 microseconds, and that the 10 frames of camera object detection data 10 processed by the communication middleware 20 then take 80 microseconds to flow through the in-vehicle network 4.
  • the CPU 1 of the first electronic control unit (camera ECU) 6 performs communication processing for transmitting frames (data).
  • it is desirable that, when the transmission of the camera object detection data 10 processed for the image recognition application module (vehicle front) 221 to the in-vehicle network 4 is completed, the process of transmitting the camera object detection data 10 of the image recognition application module (vehicle rear) 222 to the in-vehicle network 4 starts immediately.
  • the image recognition application module (vehicle rear) 222 does not operate until the CPU processing of the image recognition application module (vehicle front) 221 is completed.
  • a CPU processing wait time of 70 microseconds occurs from the end of the frame transmission for the image recognition application module (vehicle front) 221 until the end of the frame generation process for the image recognition application module (vehicle rear) 222.
  • during the period of 70 microseconds from when the transmission of the camera object detection data 10 for the image recognition application module (vehicle front) 221 finishes until the transmission of the camera object detection data 10 for the image recognition application module (vehicle rear) 222 starts, the first electronic control unit (camera ECU) 6 cannot use the band of the in-vehicle network 4. For this reason, the amount of data that can be communicated over the in-vehicle network 4 is restricted during the 70 microseconds that is the CPU processing wait time.
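The figures quoted above can be checked with simple arithmetic. The constants are taken directly from the passage; the integer-nanosecond formulation is our own, used only to keep the calculation exact:

```python
# Worked arithmetic for the numbers in FIG. 11: at 1 Gbit/s, one 1 Kbyte frame
# (8000 bits) occupies the network for 8 us, so 10 frames take 80 us, while
# the middleware needs 150 us of CPU time to prepare the next batch, leaving
# the network idle for 70 us.

LINK_BPS = 1_000_000_000   # 1 gigabit per second Ethernet
FRAME_BITS = 8000          # 1 Kbyte frame = 8000 bits
CPU_PREP_US = 150          # CPU time of the communication middleware 20

frame_ns = FRAME_BITS * 1_000_000_000 // LINK_BPS  # 8000 ns = 8 us per frame
tx_ns = 10 * frame_ns                              # 80_000 ns = 80 us for 10 frames
idle_ns = CPU_PREP_US * 1000 - tx_ns               # 70_000 ns = 70 us of idle bus

print(frame_ns // 1000, tx_ns // 1000, idle_ns // 1000)  # 8 80 70
```

The 70-microsecond idle time is exactly the gap between the 150-microsecond CPU preparation and the 80-microsecond wire time, which is the bandwidth restriction the passage describes.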
  • FIG. 12 is a diagram illustrating an overview of the processing of the communication preparation processing unit 201 and peripheral access unit 202 shown in FIG. 2, using AUTOSAR as an example.
  • on the left side of FIG. 12, a configuration example of the conventional communication middleware 20 and communication driver 21 is shown.
  • the communication middleware 20 reads data generated by each application module and stored in the global area 23b of the application data storage section 23 using RTE_read(). Thereafter, the communication middleware 20 accesses the peripheral 3 by sequentially calling COM, PduR, Tp (Transport Protocol), If (Interface), and Driver (communication driver 21), and stores the transmission data in the register 31.
  • COM, PduR, and Tp belong to the service layer, and If belongs to the HW (hardware) abstraction layer.
  • the communication middleware 20 is responsible for the processing from Rte_read() to If. As described above, it takes approximately 150 microseconds from the time the processing starts at Rte_read() until the Driver completes its processing.
  • the processing of the communication middleware 20 and the communication driver 21 shown in FIGS. 8 and 9 is divided into the communication preparation processing section 201 and the peripheral access section 202 shown in FIG. 2.
  • on the right side of FIG. 12, a configuration example of the communication preparation processing section 201 and the peripheral access section 202 according to this embodiment is shown.
  • the communication preparation processing section 201 and the peripheral access section 202 are provided in the arithmetic processing section 1A.
  • the arithmetic processing unit 1B is provided with a communication preparation processing unit 201, but is not provided with a peripheral access unit 202.
  • a communication data storage unit 28 is installed as a buffer between If and Driver. As shown in FIG. 2, the communication data storage section 28 is provided in the data storage section 2A.
  • the first core 11 and the second core 12 do not call the Driver directly; instead, the communication data storage unit 28 is configured to store the arguments that would be passed to the Driver when calling it.
  • the communication preparation processing section 201 comprises the processing from Rte_read() to the modified If. As shown in FIG. 2, the communication preparation processing section 201 receives as input the data stored in the global area 23b of the application data storage section 23, performs the processing from Rte_read() to the modified If shown on the right side of FIG. 12, and then outputs the data to the communication data storage section 28.
  • the peripheral access unit 202 is a functional unit that can execute a process of extracting data from the communication data storage unit 28 and calling a Driver using the extracted data as an argument.
  • the communication preparation processing unit (communication preparation processing unit 201) corresponds to the service layer and the hardware abstraction layer defined by AUTOSAR (registered trademark), and the peripheral access unit (peripheral access unit 202) corresponds to the driver defined by AUTOSAR (registered trademark).
  • the communication data storage unit (communication data storage unit 28) is provided between the hardware abstraction layer and the driver.
  • the actual communication data storage unit 28 is secured in an area of the RAM 2 that can be accessed from both the first core 11 and the second core 12, and is an area separate from the application data storage unit 23.
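One plausible realization of such a buffer is a shared FIFO that records the argument sets destined for the Driver. This is our own sketch in Python, not the patent's implementation; the function names are hypothetical stand-ins for the modified If and the peripheral access unit 202:

```python
# Sketch (our interpretation) of the communication data storage unit 28: a
# buffer placed between If and Driver that holds the argument sets the cores
# would otherwise pass to the Driver directly; the peripheral access unit 202
# later drains it and performs the actual Driver calls.

from queue import Queue

comm_data_store = Queue()  # shared RAM area, accessible from both cores

def if_layer_store(frame_bytes, channel):
    """Modified If: enqueue the Driver-call arguments instead of calling it."""
    comm_data_store.put((frame_bytes, channel))

def peripheral_access_unit(write_register):
    """Drain the buffer and call the Driver (here: write to register 31)."""
    sent = 0
    while not comm_data_store.empty():
        frame, channel = comm_data_store.get()
        write_register(frame, channel)  # stands in for the communication driver 21
        sent += 1
    return sent

if_layer_store(b"front-objects", 0)  # first core 11 prepares a frame
if_layer_store(b"rear-objects", 0)   # second core 12 prepares a frame
print(peripheral_access_unit(lambda f, c: None))  # both frames written: 2
```

A thread-safe queue is used because, as the passage notes, both cores write into the buffer while a single peripheral access unit drains it.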
  • Most of the CPU processing from Rte_read() until the Driver completes the process is performed by the communication preparation processing unit 201, and the CPU processing time required by the communication preparation processing unit 201 is about 130 microseconds.
  • since the peripheral access unit 202 only retrieves data from the communication data storage unit 28 and calls the Driver (communication driver 21), the CPU processing time required by the peripheral access unit 202 is about 20 microseconds.
  • since the communication preparation processing unit 201 is provided in both the first core 11 and the second core 12 (the arithmetic processing units 1A and 1B shown in FIG. 2), the first core 11 and the second core 12 can each execute the communication preparation processing unit 201 in parallel.
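The benefit of this split can be quantified with a simple cost model. The 130-microsecond and 20-microsecond figures come from the passage; the scheduling model itself is our own simplification (preparations overlap perfectly across cores, and only the short accesses to the shared peripheral 3 serialize):

```python
# Back-of-the-envelope model (illustrative, not from the patent text) of
# running the 130 us communication preparation processing unit 201 on both
# cores: the two preparations overlap, and only the 20 us peripheral accesses
# remain serialized on the shared peripheral 3.

import math

PREP_US = 130    # communication preparation processing unit 201
ACCESS_US = 20   # peripheral access unit 202

def serial_makespan(n_messages):
    """Conventional: one communication core prepares and sends everything."""
    return n_messages * (PREP_US + ACCESS_US)

def parallel_makespan(n_messages, n_cores=2):
    """This embodiment: preparations run in parallel, accesses serialize."""
    prep = math.ceil(n_messages / n_cores) * PREP_US
    return prep + n_messages * ACCESS_US

print(serial_makespan(2), parallel_makespan(2))  # 300 vs 170 microseconds
```

Under these assumptions, two messages that would occupy a single communication core for 300 microseconds complete in about 170 microseconds, which shortens the dead time on the in-vehicle network identified in FIG. 11.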
  • the arithmetic processing units (arithmetic processing units 1A, 1B; first core 11, second core 12) transmit data to another electronic control device 100 by activating the communication preparation processing unit (communication preparation processing unit 201) and the peripheral access unit (peripheral access unit 202) in this order, based on a predetermined internal time. The communication preparation processing unit (communication preparation processing unit 201) stores communication data in the communication data storage unit (communication data storage unit 28), and the peripheral access unit (peripheral access unit 202) stores the communication data read from the communication data storage unit (communication data storage unit 28) in the peripheral (peripheral 3), which transmits it to the other electronic control device.
  • the arithmetic processing units receive data from another electronic control device 100 by activating the peripheral access unit (peripheral access unit 202) and the communication preparation processing unit (communication preparation processing unit 201) in this order, based on a predetermined internal time.
  • the peripheral access unit acquires the data received by the peripheral (peripheral 3) from the other electronic control device and writes it into the communication data storage unit (communication data storage unit 28), and the communication preparation processing unit (communication preparation processing unit 201) writes the communication data read from the communication data storage unit (communication data storage unit 28) into the application data storage unit (application data storage unit 23).
  • the software architecture according to this embodiment will be explained with reference to FIGS. 13 and 14.
  • in the example in which the method of Non-Patent Document 1 described with reference to FIGS. 8 and 9 is applied, the first core 11 is dedicated to communication and the second core 12 is dedicated to applications, whereas in this embodiment the first core 11 and the second core 12 are both used for applications and communication.
  • FIG. 13 is a diagram showing the software architecture of the first electronic control unit (camera ECU) 6 according to this embodiment.
  • one of the plurality of electronic control devices detects an object from the camera image data (camera image data 9) captured by the camera, and transmits the camera object detection data (camera object detection data 10) to the in-vehicle network (in-vehicle network 4).
  • in this first electronic control unit (camera ECU) 6, an OS 19, an image recognition application module (vehicle front) 221, a data acquisition task 24, a data publication task 25, and a communication preparation processing unit 201 are assigned to the first core 11.
  • an OS 19, an image recognition application module (vehicle rear) 222, a data acquisition task 24, a data publication task 25, and a communication preparation processing unit 201 are assigned to the second core 12.
  • the task activation time management unit 26 can be executed from both the first core 11 and the second core 12.
  • the peripheral access unit 202 of the first electronic control unit (camera ECU) 6 can be executed by either the first core 11 or the second core 12.
  • in this embodiment, the peripheral access unit 202 is executed by the arithmetic processing unit 1A corresponding to the first core 11, but it may also be executed by the arithmetic processing unit 1B corresponding to the second core 12.
  • FIG. 14 is a diagram showing the software architecture according to this embodiment in the second electronic control unit (automatic driving ECU) 7.
  • one of the plurality of electronic control devices determines the control target of the vehicle based on the camera object detection data (camera object detection data 10), generates at least one of a control command value for the target angle of the steering (steering 16), a control command value for the target acceleration force of the accelerator (accelerator 17), and a control command value for the target damping force of the brake (brake 18), and transmits the control command value to the controlled object.
  • in this second electronic control unit (automatic driving ECU) 7, an OS 19, a trajectory generation application module 223, a data acquisition task 24, a data publication task 25, and a communication preparation processing unit 201 are assigned to the first core 11. Furthermore, the OS 19, the control command generation application module 224, the data acquisition task 24, the data publication task 25, and the communication preparation processing unit 201 are assigned to the second core 12.
  • the task activation time management unit 26 can be executed from both the first core 11 and the second core 12.
  • the peripheral access unit 202 of the second electronic control unit (automatic driving ECU) 7 can be executed by either the first core 11 or the second core 12.
  • in this embodiment, the peripheral access unit 202 is executed by the arithmetic processing unit 1A corresponding to the first core 11, but it may also be executed by the arithmetic processing unit 1B corresponding to the second core 12.
  • in this way, the first core 11 and the second core 12 both execute application processing and communication processing (particularly the communication preparation processing unit 201), and the first core 11 also executes the peripheral access unit 202 for accessing the peripheral 3 that performs communication.
  • the data acquisition task 24 installed in the first core 11 is activated at a predetermined time. The data acquisition task 24 installed in the first core 11 has the role of acquiring the data necessary for the processing of the image recognition application module (vehicle front) 221, which is an application module also installed in the first core 11.
  • the data acquired by the data acquisition task 24 may be generated by an image recognition application module (vehicle rear) 222 installed in the same electronic control unit. In this case, the data acquisition task 24 acquires data stored in the global area 23b of the application data storage unit 23.
  • the first core 11 activates the peripheral access unit 202.
  • the application modules installed in a different electronic control device are, for example, the trajectory generation application module 223 or the control command generation application module 224 shown in FIG. 14.
  • the peripheral access unit 202 activated by the first core 11 acquires the data stored in the register 31 shown in FIG. 1 using the communication driver 21 shown in FIG. 12, and stores the data in the communication data storage unit 28.
  • similarly, the data acquisition task 24 installed in the second core 12 has the role of acquiring the data necessary for the execution of the image recognition application module (vehicle rear) 222, which is an application module installed in the same second core 12.
  • the image recognition application module (vehicle front) 221 installed in the first core 11 is executed at any time between the completion of the data acquisition task 24 and the start time of the data publication task 25, as in the technology disclosed in Non-Patent Document 1.
  • among the data used by the image recognition application module (vehicle front) 221, there may be data acquired from an application module installed in a different electronic control device, that is, data received from the outside.
  • the image recognition application module (vehicle front) 221 calls the communication preparation processing unit 201 installed in the same first core 11 before executing the process.
  • the called communication preparation processing unit 201 retrieves data from the communication data storage unit 28 shown in FIG. 2. Note that the process described here is a data reception process, so in the flow shown in FIG. 12, Rte_write() is used instead of Rte_read(). As a result, the data retrieved from the communication data storage unit 28 can be accessed from the image recognition application module (vehicle front) 221.
  • the first core 11 immediately calls the communication preparation processing unit 201 with the data that needs to be transmitted to the outside as an argument. Then, the communication preparation processing unit 201 processes Rte_read(), COM, PduR, and Tp in order as shown in FIG. 12, and stores the data in the communication data storage unit 28 at If.
  • processing is performed by the image recognition application module (vehicle rear) 222 installed in the second core 12.
  • the processing by the image recognition application module (vehicle rear) 222 is executed at any time between after the task processing of the data acquisition task 24 is completed and before the start time of the data publication task 25.
  • the image recognition application module (vehicle rear) 222 there may be data acquired from an application module installed in a different electronic control device, that is, data received from the outside.
  • the image recognition application module (vehicle rear) 222 calls the communication preparation processing section 201 before executing the process.
  • the communication preparation processing unit 201 retrieves data from the communication data storage unit 28 and processes Tp, PduR, COM, and Rte_write() in the reverse order of the processing shown in FIG. Through such processing, the data retrieved from the communication data storage unit 28 can be accessed from the image recognition application module (vehicle rear) 222.
  • after the image recognition application module (vehicle rear) 222 completes its processing, the second core 12 immediately calls the communication preparation processing unit 201 with the data that needs to be sent to the outside as an argument. The communication preparation processing unit 201 then processes Rte_read(), COM, PduR, and Tp in order, and stores the data in the communication data storage unit 28 at If.
  • in the technology disclosed in Non-Patent Document 1, since the first core 11 is dedicated to communication, all communication processing is executed by that single core.
  • in the present embodiment, by contrast, the communication preparation processing unit 201 is executed by the first core 11 and the second core 12, respectively.
  • the data publication task 25 installed in the first core 11 is activated at a predetermined time.
  • the data publication task 25 installed in the first core 11 has the role of publishing the execution results (application data) of the image recognition application module (vehicle front) 221, which is an application module installed in the same first core 11. It publishes data to application modules within the same electronic control device by storing the data in the global area 23b of the application data storage unit 23.
  • the data publication task 25 installed in the first core 11 activates the peripheral access unit 202 immediately after the processing is completed.
  • the peripheral access unit 202 acquires the data stored in the communication data storage unit 28, calls the communication driver 21 shown in FIG. 12, and writes the data to the register 31 of the peripheral 3.
  • the data publication task 25 installed in the second core 12 is activated at a predetermined time.
  • the data publication task 25 installed in the second core 12 has the role of publishing the execution results of the image recognition application module (vehicle rear) 222, which is an application module installed in the same second core 12. It publishes data to application modules within the same electronic control device by storing the data in the global area 23b of the application data storage unit 23.
  • for the image recognition application module (vehicle rear) 222, the processing of the communication preparation processing unit 201 for externally transmitted data has already been completed. Furthermore, all accesses to the peripheral 3 are executed by the peripheral access unit 202 mounted on the first core 11. Therefore, the data publication task 25 installed in the second core 12 performs no particular processing for data to be published to application modules in different electronic control devices.
  • FIG. 15 is a diagram showing an example in which the LET and LCT are applied to each module and task according to the present embodiment.
  • each module and task to which LET and LCT are applied are shown in FIGS. 13 and 14.
  • when the communication preparation processing unit 201 and the peripheral access unit 202, which will be described below, perform data reception processing, they are denoted (receiving side); when they perform data transmission processing, they are denoted (transmission side).
  • a completion time at which the electronic control unit on the sending side (first electronic control unit (camera ECU) 6) completes the process of transmitting data to the in-vehicle network (in-vehicle network 4), and a start time at which the electronic control unit on the receiving side (second electronic control unit (automatic driving ECU) 7) starts the process of receiving data from the in-vehicle network (in-vehicle network 4), are defined.
  • the startup time management unit (task startup time management unit 26) of each of the plurality of electronic control units starts the peripheral access unit (peripheral access unit 202) included in each of the plurality of electronic control devices based on the defined completion time and start time.
  • the interval between the completion time of the process in which the first electronic control unit (camera ECU) 6 transmits data and the start time of the process in which the second electronic control unit (automatic driving ECU) 7 receives the data is fixed at 100 milliseconds. Therefore, the period of each process in the first electronic control unit (camera ECU) 6 and the second electronic control unit (automatic driving ECU) 7 can be matched with the period of transmitting data to the in-vehicle network 4.
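The fixed interval between the sender's completion time and the receiver's start time can be sketched as a simple schedule calculation. This is a hypothetical helper for illustration only; the 100-millisecond figure is the one quoted above, and real startup time management involves synchronized clocks across the ECUs.

```python
# Interval fixed between send completion and receive start (from the text).
SEND_COMPLETE_TO_RECEIVE_START_MS = 100

def receive_start_times(send_completion_times_ms):
    """Given the sender's completion times, return the receiver's fixed
    start times, each exactly the fixed interval later."""
    return [t + SEND_COMPLETE_TO_RECEIVE_START_MS
            for t in send_completion_times_ms]
```

Because the offset is constant, the receiving ECU's processing period automatically matches the transmission period of the in-vehicle network, which is the point made above.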
  • furthermore, the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 are assigned to the first core 11 and the second core 12, respectively, so that the two modules can execute their respective application processes simultaneously.
  • the data publication task 25 stores the application data in the global area 23b of the application data storage unit 23 and publishes the application data.
  • the peripheral access unit 202 (receiving side), whose activation is managed by the task activation time management unit 26, accesses the peripheral 3 at the read timing, reads communication data, and stores it in the communication data storage unit 28.
  • the communication preparation processing unit 201 (receiving side) converts the communication data read from the communication data storage unit 28 into application data that can be processed by each application module, and saves the converted application data in the application data storage unit 23.
  • the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 execute application processing using the application data read from the application data storage unit 23.
  • the processed application data is stored in the application data storage unit 23.
  • the communication preparation processing unit 201 performs communication preparation processing on the application data read from the application data storage unit 23 and stores the communication data in the communication data storage unit 28.
  • the peripheral access unit 202 reads the communication data from the communication data storage unit 28 and writes the communication data to the register 31 of the peripheral 3 at the write timing.
  • the time from read timing to write timing in the first electronic control unit (camera ECU) 6 is fixed at 100 milliseconds.
  • the technology according to the present embodiment differs from the technology disclosed in Non-Patent Document 1 in that the timing to be fixed is not the start time and end time of the application module, but the start time and end time of processing that combines the application module and the communication preparation processing unit 201. Further, the communication preparation processing unit 201 is executed by both the first core 11 and the second core 12. Therefore, most of the communication processing, which required 150 microseconds of CPU processing time as shown in FIG. , can be executed in parallel by the cores: the communication preparation processing (130 microseconds) runs on both cores, and only the peripheral access processing (20 microseconds) remains on a single core.
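The benefit of splitting the 150 microseconds of CPU communication processing into preparation (130 microseconds) and peripheral access (20 microseconds) can be estimated with a toy timing model. The function names and the one-module-per-core assumption are illustrative only, not the patent's scheduling algorithm.

```python
# Figures quoted in the text: 130 us preparation + 20 us peripheral access.
PREP_US, ACCESS_US = 130, 20

def sequential_comm_time(n_modules):
    """One dedicated communication core does everything back to back,
    as in the Non-Patent Document 1 arrangement."""
    return n_modules * (PREP_US + ACCESS_US)

def parallel_comm_time(n_modules, n_cores):
    """Preparation overlaps across cores; only peripheral access is
    serialized on a single core (sketch assumes one module per core)."""
    assert n_modules <= n_cores
    return PREP_US + n_modules * ACCESS_US
```

For two modules on two cores the model gives 300 microseconds sequentially versus 170 microseconds with parallel preparation, illustrating why only the small peripheral access portion still needs a single core.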
  • FIG. 16 is a timing chart comparing the technology according to the first embodiment and the technology disclosed in Non-Patent Document 1.
  • FIG. 16 shows an example of the timing of each process according to the first embodiment.
  • the processes of the first core 11 and the second core 12 can be started simultaneously.
  • the first core 11 and the second core 12 separate the communication processing performed by the CPU into communication preparation processing (130 microseconds) and peripheral access processing (20 microseconds).
  • first, the peripheral access unit 202 (receiving side) of the first core 11 performs reception processing in 20 microseconds. After that, the first core 11 and the second core 12 process the communication preparation processing unit 201 (receiving side) and the communication preparation processing unit 201 (transmission side) in parallel, and also execute the application processing (the image recognition application (vehicle front) and the image recognition application (vehicle rear)) in parallel.
  • the first core 11 executes the peripheral access unit 202 (transmission side) using one core. Therefore, the communication processing performed on the in-vehicle network 4 does not have to wait for processing by the CPU 1, and the band limitation of the in-vehicle network 4 is eliminated.
  • FIG. 16 shows an example of the timing of each process related to the technology disclosed in Non-Patent Document 1.
  • in the technology disclosed in Non-Patent Document 1, the cores are separated into a first core 11 dedicated to communication and a second core 12 dedicated to applications. Therefore, after the application-dedicated second core 12 sequentially processes the image recognition application (vehicle front) and the image recognition application (vehicle rear), the first core 11 performs communication processing.
  • the first core 11 dedicated to communication sequentially performs communication processing for sending the processing data of the image recognition application (vehicle front) and the processing data of the image recognition application (vehicle rear) to the second electronic control unit (automatic driving ECU) 7 via the in-vehicle network 4.
  • the processing data of the image recognition application (vehicle front) is communicated over the in-vehicle network 4.
  • even when the in-vehicle network 4 uses high-speed, large-capacity communication such as 1-gigabit Ethernet, the CPU processing time (150 microseconds) required for communication by the first core 11 is longer than the time it takes for a frame to flow on the bus (80 microseconds). Therefore, on the in-vehicle network 4, a processing wait occurs until the first core 11 dedicated to communication finishes the communication processing of the processing data of the image recognition application (vehicle rear).
  • only after this wait can the frame data of the image recognition application (vehicle rear) be sent onto the bus of the in-vehicle network 4. In this way, the communication band of the in-vehicle network 4 is limited.
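The processing wait described above can be modeled with simple arithmetic using the figures quoted in the text (150 microseconds of CPU time versus 80 microseconds of bus time per frame). This is an illustrative sketch, not the patent's method: whenever CPU time per frame exceeds bus time per frame, the bus idles between frames prepared serially by one core.

```python
CPU_US_PER_FRAME = 150  # CPU communication processing per frame (from text)
BUS_US_PER_FRAME = 80   # time a frame occupies the bus (from text)

def bus_idle_us(n_frames):
    """Idle bus time when one dedicated core prepares frames serially:
    each frame after the first waits (CPU - bus) microseconds."""
    return max(0, (n_frames - 1) * (CPU_US_PER_FRAME - BUS_US_PER_FRAME))
```

With the two image recognition frames of this example, the bus sits idle for 70 microseconds, which is the band limitation the present embodiment removes by preparing frames in parallel.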
  • in the present embodiment, the timing fixed as the LET is determined not from the start time and end time of the application processing in each application processing unit alone, but from the start time of the application processing to the end time of the communication preparation processing.
  • in addition, each core is provided with its own communication preparation processing unit 201.
  • by fixing the LET in this way, communication preparation processing can be executed simultaneously in a plurality of cores. Conventionally, one core performed communication preparation processing and then sent a frame onto the bus, so if one application module had not finished its communication preparation processing, another application module could not start its own; this resulted in a wait time before frames could be sent onto the bus. The present embodiment eliminates this wait.
  • in the present embodiment, the units whose start time and end time are fixed as the LET are not application processing and communication processing, but application processing combined with communication preparation processing, and peripheral access processing. Therefore, most of the communication processing can be executed in parallel in each core (arithmetic processing unit) while keeping the start time and end time of the processing fixed. Further, even when a large amount of data communication is required, there is no need to extend the time from the start time to the end time of communication-related processing, so latency (communication delay) can be improved.
  • the communication middleware 20 is not a standard basic software module, but may be a complex device driver.
  • a complex device driver is software that users can implement according to their own specifications.
  • the present invention is also applicable to electronic control devices equipped with general real-time operating systems other than AUTOSAR.
  • the communication preparation processing unit 201 (sending side) adds a timestamp indicating the transmission time to the frame data and performs frame transmission processing.
  • the communication preparation processing unit 201 (receiving side) may perform a process of checking the time stamp added to the received frame when receiving the frame.
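A minimal sketch of the timestamp variation just mentioned, assuming a freshness check on the receiving side. The frame fields and the maximum-age policy are invented for this example; the patent only states that a transmission-time stamp is added by the sender and checked by the receiver.

```python
def stamp_frame(payload, now_ms):
    """Sending side: attach a transmission-time stamp to the frame data."""
    return {"payload": payload, "tx_time_ms": now_ms}

def check_frame(frame, now_ms, max_age_ms):
    """Receiving side: accept the frame only if its timestamp is neither
    in the future nor older than the allowed age (policy assumed here)."""
    age = now_ms - frame["tx_time_ms"]
    return 0 <= age <= max_age_ms
```

With fixed LET/LCT timing, such a check gives the receiver a cheap way to detect frames that missed their expected communication window.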
  • the electronic control device according to the second embodiment differs from the first embodiment in that the communication middleware 20 is a lightweight communication driver. Note that the same configurations as those in the first embodiment are given the same reference numerals and the description thereof will be omitted.
  • FIG. 17 is a diagram showing the software architecture of the first electronic control unit (camera ECU) 6A according to the second embodiment.
  • One or more arithmetic processing units may each have an application processing section (application processing section 22).
  • the first electronic control unit (camera ECU) 6A will be described as having only the first core 11 that is used for both communication processing and application processing.
  • an OS 19, an image recognition application module (vehicle front) 221, an image recognition application module (vehicle rear) 222, a data acquisition task 24, a data publication task 25, and a lightweight communication driver 29 are assigned to the first core 11.
  • a task start time management unit 26 is assigned to the OS 19.
  • FIG. 18 is a diagram showing the software architecture of the second electronic control unit (automatic driving ECU) 7A according to the second embodiment.
  • the second electronic control unit (automatic driving ECU) 7A has only a first core 11 that is used for both communication processing and application processing.
  • the OS 19, trajectory generation application module 223, control command generation application module 224, data acquisition task 24, data publication task 25, and lightweight communication driver 29 are assigned to the first core 11.
  • a task start time management unit 26 is assigned to the OS 19.
  • FIGS. 8 and 9 show an example of communication middleware 20 and communication driver 21 for transmitting and receiving data, which are allocated to the first core 11 dedicated to communication.
  • the total processing time required for execution of the communication preparation processing unit (communication preparation processing unit 201) and the peripheral access unit (peripheral access unit 202) may be shorter than the communication time of data communicated with other electronic control devices. For example, the total CPU processing time of the communication middleware 20 and the communication driver 21 may be shorter than the time during which a data frame flows through the bus of the in-vehicle network 4.
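The condition stated above, that the total CPU time for communication preparation and peripheral access be shorter than the time the frame occupies the bus, can be written out as a small calculation. This is a hypothetical helper; it happens to reproduce the 80-microsecond figure used earlier for a 10-kilobyte frame on a 1-gigabit link.

```python
def frame_bus_time_us(frame_bytes, link_bits_per_s=1_000_000_000):
    """Time the frame payload occupies the wire, in microseconds
    (ignores inter-frame gaps and protocol overhead)."""
    return frame_bytes * 8 / link_bits_per_s * 1e6

def lightweight_driver_ok(prep_us, access_us, frame_bytes):
    """True if total CPU communication time fits within the frame's
    on-the-wire time, the condition for the lightweight driver."""
    return prep_us + access_us < frame_bus_time_us(frame_bytes)
```

When this condition holds, one core can run transmission and reception processing back to back without the CPU wait of FIG. 16, which is why the second embodiment needs no separation into preparation and peripheral access units.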
  • the arithmetic processing unit (arithmetic processing unit 1A, first core 11) includes a lightweight communication driver (lightweight communication driver 29) that performs transmission and reception processing of communication data. In the present embodiment, the communication middleware 20 and the communication driver 21 are collectively referred to as a "lightweight communication driver."
  • the lightweight communication driver 29 shown in FIGS. 17 and 18 does not cause the CPU processing wait shown in FIG. 16, even if one core (the first core 11) continuously executes transmission and reception processing.
  • the communication preparation processing unit 201 and the peripheral access unit 202 do not need to be separated. Furthermore, the data acquisition task 24 acquires data necessary for the processing of the application module from other application modules within the same electronic control device by accessing the global area 23b.
  • FIG. 19 is a diagram showing an example of mapping the LET and LCT to each module and task according to the present embodiment.
  • each module and task for mapping LET and LCT are shown in FIGS. 17 and 18.
  • the modules and tasks labeled (receiving side) are responsible for data reception processing, and the modules and tasks labeled (transmission side) are responsible for data transmission processing.
  • the data acquisition task 24 mapped to the read timing at which the first cycle starts acquires data.
  • the lightweight communication driver 29 (reception side) performs a reception process on the received data.
  • the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 execute processing.
  • the data publication task 25 writes the processed data to the register 31 of the peripheral 3, and the lightweight communication driver 29 (transmission side) performs transmission processing to transmit the data to the in-vehicle network 4.
  • the lightweight communication driver 29 performs data reception processing.
  • the trajectory generation application module 223 and the control command generation application module 224 execute processing.
  • the data publication task 25 writes the processed data to the register 31 of the peripheral 3, and the lightweight communication driver 29 (transmission side) performs data transmission processing.
  • after the processing of the data acquisition task 24 is completed, the task activation time management unit 26 immediately calls the lightweight communication driver 29 to perform data reception processing. The data publication task 25 then publishes data to other application modules within the same electronic control device by writing the processing results of the application module into the global area 23b. Further, after the processing of the data publication task 25 is completed, the task activation time management unit 26 immediately calls the lightweight communication driver 29 to perform data transmission processing.
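The single-core cycle described above, reception, application processing, publication, then transmission, can be sketched as an ordered sequence. This is an illustrative skeleton only; the callables stand in for the lightweight driver, the application modules, and the publication task, and real activation is driven by the fixed task startup times rather than a plain function call.

```python
def run_cycle(driver_rx, apps, publish, driver_tx, log):
    """One fixed-timing cycle on a single core: receive, run each
    application module, publish results to the global area, transmit."""
    log.append("rx")
    driver_rx()                 # lightweight driver, reception side
    for app in apps:
        log.append("app")
        app()                   # e.g. image recognition modules
    log.append("publish")
    publish()                   # data publication task -> global area
    log.append("tx")
    driver_tx()                 # lightweight driver, transmission side
```

Because the order never changes from cycle to cycle, the input and output timing of each module stays fixed even on a single shared core.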
  • according to the electronic control device 100 of the second embodiment described above, even when there is only one core, software development costs can be reduced by fixing the timing. Further, even when the electronic control device 100 is configured with multiple cores, unlike the technology disclosed in Non-Patent Document 1, there is no need to provide a dedicated communication core, so the number of CPU cores used in the electronic control device 100 can be reduced. In this way, in the electronic control device 100 according to the second embodiment, the timing can be fixed without modifying the communication middleware 20 and the communication driver 21 shown in FIG. , which has the effect of reducing software development costs.
  • FIG. 20 is a diagram showing a configuration example of a vehicle control device 5A equipped with a third electronic control device (central ECU) 71 according to the third embodiment.
  • the vehicle control device 5A includes a third electronic control device (central ECU) 71 instead of the second electronic control device (automatic driving ECU) 7.
  • the third electronic control device 71 is a central computer and is responsible for heavy-load calculations such as AI (Artificial Intelligence) and image processing, so an electronic control device with emphasis on calculation performance is used.
  • a zone ECU (not shown) is provided for the third electronic control unit (central ECU) 71.
  • the zone ECU is responsible for light-load calculations such as sensor control and actuator control for each zone into which the vehicle is divided, and uses an electronic control unit with an emphasis on safety.
  • the first electronic control unit (camera ECU) 6B included in the vehicle control device 5A has only the role of transmitting the camera image data 9 acquired from the camera 8 to the in-vehicle network 4 at a predetermined time.
  • the camera image data 9 transmitted to the in-vehicle network 4 by the first electronic control unit (camera ECU) 6B is received by the third electronic control unit (central ECU) 71.
  • a third electronic control unit (central ECU) 71 generates a steering control command 13, an accelerator control command 14, and a brake control command 15 based on the camera image data 9.
  • the third electronic control unit (central ECU) 71 controls the operations of the steering wheel 16, accelerator 17, and brake 18 by outputting each command.
  • the third electronic control unit (central ECU) 71 can update the software that operates on the third electronic control unit (central ECU) 71 using a known OTA (Over the Air) technology.
  • FIG. 21 is a diagram showing the software architecture of the third electronic control unit (central ECU) 71 according to the present embodiment.
  • one of the plurality of electronic control devices includes a plurality of arithmetic processing units (arithmetic processing units 1A and 1B; three cores (a first core 11, a second core 12, and a third core 72)).
  • the first core 11, second core 12, and third core 72 are all used for applications and communication.
  • One or more application processing units are provided in each of the plurality of arithmetic processing units (arithmetic processing units 1A, 1B, first core 11, second core 12, and third core 72).
  • One of the plurality of arithmetic processing units (first core 11) includes an update unit (software update unit 73) that updates the function of the application processing unit (application processing unit 22).
  • the first core 11 is assigned the trajectory generation application module 223, the OS 19, the data acquisition task 24, the data publication task 25, the communication preparation processing unit 201, and the peripheral access unit 202, which were installed in the second electronic control unit (automatic driving ECU) 7 according to the first embodiment shown in FIG.
  • the OS 19 of the first core 11 can execute the task start time management unit 26. Furthermore, a software update unit 73 is newly allocated to the first core 11.
  • the second core 12 is assigned the control command generation application module 224, the OS 19, the data acquisition task 24, the data publication task 25, and the communication preparation processing unit 201, which were installed in the second electronic control unit (automatic driving ECU) 7 according to the first embodiment shown in FIG.
  • the OS 19 of the second core 12 can execute the task start time management unit 26.
  • the third core 72 is assigned the image recognition application module (vehicle front) 221, the image recognition application module (vehicle rear) 222, the OS 19, the data acquisition task 24, the data publication task 25, and the communication preparation processing unit 201, which were installed in the first electronic control unit (camera ECU) 6 according to the first embodiment shown in FIG.
  • the software update unit 73 installed in the first core 11 can update the application modules installed in the third electronic control unit (central ECU) 71 using OTA (Over the Air) technology. That is, the software update unit 73 has the role of updating the image recognition application module (vehicle front) 221, the image recognition application module (vehicle rear) 222, the trajectory generation application module 223, and the control command generation application module 224.
  • the software update performed by the software update unit 73 is normally carried out when each application module starts up or after it terminates.
  • in a multi-core configuration, a master core and slave cores are designated, and the master core updates the software, including that of the slave cores.
  • the core number of the master core is often "0". Therefore, if the first core 11 is the master core, the software update unit 73 is executed by the first core 11.
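The master-core convention mentioned above can be sketched as a simple gate on the core number. This is an assumption-laden illustration: real multi-core update mechanisms involve synchronization and flash programming not shown here, and the "core 0 is master" rule is only the common case noted in the text.

```python
MASTER_CORE_ID = 0  # the master core's number is "often 0", per the text

def maybe_run_update(core_id, update_fn):
    """Run the software update only on the master core; slave cores
    skip it and let the master update their software on their behalf."""
    if core_id == MASTER_CORE_ID:
        update_fn()
        return True
    return False
```

If the first core 11 is core 0, the software update unit 73 runs there, while the second and third cores never invoke the update function themselves.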
  • This software update unit 73 can also update the trajectory generation application module 223 and the control command generation application module 224.
  • the software update unit 73 can update the functions of the trajectory generation application module 223 and the control command generation application module 224 in automatic driving at any time.
  • the software update unit 73 also updates the image processing functions that the image recognition application modules 221 and 222 perform on the camera image data 9, thereby making it possible to improve the image recognition performance of each module.
  • the data transmitted from the first electronic control unit (camera ECU) 6 to the second electronic control unit (automatic driving ECU) 7 is camera object detection data 10.
  • the data transmitted from the first electronic control unit (camera ECU) 6B to the third electronic control unit (central ECU) 71 is camera image data 9.
  • in the third embodiment, there is no need to send the camera object detection data 10 from the first electronic control unit (camera ECU) 6 to the second electronic control unit (automatic driving ECU) 7 via the in-vehicle network 4 as in the first embodiment. Therefore, the process in which the third electronic control unit (central ECU) 71 performs image recognition and generates a trajectory and control commands for automatic driving is shortened, and the vehicle can be controlled quickly.
  • since the data size of the camera image data 9 is approximately several megabytes, it could not be transmitted or received with the technology disclosed in Non-Patent Document 1 because of its limitations on the communication band. On the other hand, with the technology according to the third embodiment, since there is no restriction on the communication band, it is possible to arrange such application modules.
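For a feel of the data sizes involved, the transfer time of a few megabytes of camera image data on a 1-gigabit link can be estimated as follows. The 3 MB size is an assumed example within the "several megabytes" stated above, and the calculation ignores protocol overhead.

```python
def transfer_time_ms(payload_bytes, link_bits_per_s=1_000_000_000):
    """Idealized time to move a payload over the link, in milliseconds."""
    return payload_bytes * 8 / link_bits_per_s * 1e3

# e.g. 3 MB of camera image data on 1-gigabit Ethernet:
image_ms = transfer_time_ms(3 * 1024 * 1024)
```

Even idealized, a few-megabyte image occupies the bus for tens of milliseconds, hundreds of times longer than the 80-microsecond object-detection frames discussed earlier, which is why a band-limited design could not carry it.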
  • in the third embodiment, the first electronic control device (camera ECU) 6 and the second electronic control device (automatic driving ECU) 7 are integrated into one third electronic control unit (central ECU) 71.
  • the software update unit 73 updates the functions of the application module at an arbitrary timing. Therefore, the accuracy of image recognition and vehicle driving control can be improved by using an application module with improved functionality.
  • the present invention aims to reduce software development costs by fixing timing. Because the timing is fixed, there is no possibility that the input and output timing of each application module changes before and after an update of the application modules. Therefore, software updating by OTA is particularly useful for the third electronic control unit (central ECU) 71 according to the third embodiment.
  • according to the third embodiment, the arrangement of application modules within the vehicle can be changed flexibly. Further, a plurality of electronic control devices can be integrated, and a plurality of application modules can be easily updated.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Combined Controls Of Internal Combustion Engines (AREA)
  • Small-Scale Networks (AREA)

Abstract

This electronic control device comprises: a calculation processing unit that has an application processing unit which performs application processing; a data storage unit that stores data which is processed by the calculation processing unit; and a peripheral that transmits, to another electronic control device, data which has been read from the data storage unit, or that receives data from another electronic control device. Fixed in advance are a first timing at which the calculation processing unit inputs/outputs data to/from the data storage unit for the application processing, and a second timing at which the calculation processing unit provides, to the peripheral, data that has been read from the data storage unit, and at which the peripheral transmits the data to the other electronic control device.

Description

Electronic control device
The present invention relates to an electronic control device.
As vehicles come to be equipped with multiple electronic control units (ECUs), there is a need to reduce the development cost of electronic control units. The techniques described in Patent Documents 1 and 2 are known as ways to reduce this development cost. Patent Document 1 states: "The driver has a plurality of interface sections corresponding to a plurality of application programs and a common processing section that executes instructions from the plurality of application programs, and communication between the plurality of application programs is performed under the control of the driver."
Patent Document 2 states: "Received data is data sent in a communication frame from an external bus connected to an external device; the device includes a message box in which at least the received data is stored, and an internal memory having a plurality of storage areas in which received data transferred via an internal bus of the electronic control device can be written and read."
Japanese Patent Application Publication No. 2013-062734; Japanese Patent Application Publication No. 2011-250110
In the technology disclosed in Patent Document 1, for an electronic control device carrying a plurality of application programs, a buffer used by each application program is provided as a component of the driver. This buffer allows all application programs to share a single peripheral without preparing multiple peripherals (for example, CAN (Controller Area Network) controllers and CAN transceivers).
However, when the case where a plurality of application programs share a peripheral is compared with the case where they do not, the timing from when each application program generates data until the data reaches the peripheral differs. Therefore, the behavior of the electronic control device (for example, the timing and order of transmission of CAN messages) changes depending on whether the application programs share a peripheral or not. As a result, costs are required to verify the changed operation of the electronic control device and to redesign the data input/output timing.
In the technology disclosed in Patent Document 2, in a situation where a CPU (Central Processing Unit) performs processing to control a controlled device, a DMA (Direct Memory Access) controller takes over part of the communication processing. However, a CPU and a DMA controller generally differ in processing speed. For this reason, when processing conventionally executed by the CPU is delegated to the DMA controller, the scheduling of the CPU, including application processing and communication processing, changes before and after the delegation. As a result, the technique disclosed in Patent Document 2, like the technique disclosed in Patent Document 1, requires costs for verifying the operation of the electronic control device and redesigning the data timing.
The present invention has been made in view of this situation, and its object is to reduce the development cost of software programs.
An electronic control device according to the present invention comprises: an arithmetic processing unit having an application processing unit that performs application processing; a data storage unit that stores data processed by the arithmetic processing unit; and a peripheral that transmits data read from the data storage unit to another electronic control device, or receives data from another electronic control device. A first timing at which the arithmetic processing unit inputs and outputs data to and from the data storage unit for application processing, and a second timing at which the arithmetic processing unit passes the data read from the data storage unit to the peripheral and the peripheral transmits the data to the other electronic control device, are fixed in advance.
According to the present invention, since the first and second timings are fixed in advance, the development cost of software programs can be reduced.
Problems, configurations, and effects other than those described above will be made clear by the following description of the embodiments.
A diagram showing a schematic configuration example of an electronic control device according to the first embodiment of the present invention.
A block diagram showing an internal configuration example of each part of the electronic control device according to the first embodiment.
A block diagram showing an internal configuration example of a vehicle control device equipped with the electronic control device according to the first embodiment.
A diagram showing a configuration example of camera object detection data according to the first embodiment.
A diagram showing pattern 1 of time synchronization between the first electronic control unit (camera ECU) and the second electronic control unit (autonomous driving ECU).
A diagram showing pattern 2 of time synchronization between the first electronic control unit (camera ECU) and the second electronic control unit (autonomous driving ECU).
A diagram showing an example of a change in data flow between electronic control devices time-synchronized in pattern 1.
A diagram showing an example of a change in data flow between electronic control devices time-synchronized in pattern 3.
A diagram showing the definitions of LET and LCT proposed in Non-Patent Document 1.
A diagram showing an example of the processing time of each electronic control device and the communication time of the in-vehicle network.
A diagram showing an example in which a conventional software architecture is applied to the first electronic control unit (camera ECU).
A diagram showing an example in which a conventional software architecture is applied to the second electronic control unit (autonomous driving ECU).
A diagram showing an example of mapping the LET and LCT proposed in Non-Patent Document 1 onto the modules and tasks shown in Figs. 8 and 9.
A timing chart of the communication middleware and the in-vehicle network when the first electronic control unit (camera ECU) transmits a large amount of data.
A diagram explaining, using AUTOSAR as an example, an overview of the processing of the communication preparation processing unit and the peripheral access unit according to the first embodiment.
A diagram showing the software architecture according to the present embodiment in the first electronic control unit (camera ECU).
A diagram showing the software architecture according to the present embodiment in the second electronic control unit (autonomous driving ECU).
A diagram showing an example in which the LET and LCT according to the first embodiment are applied to each module and task according to the present embodiment.
A timing chart comparing the technique according to the first embodiment with the technique disclosed in Non-Patent Document 1.
A diagram showing the software architecture in the first electronic control unit (camera ECU) according to the second embodiment of the present invention.
A diagram showing the software architecture in the second electronic control unit (autonomous driving ECU) according to the second embodiment.
A diagram showing an example of mapping the LET and LCT according to the second embodiment onto each module and task according to the present embodiment.
A diagram showing a configuration example of a vehicle control device equipped with a third electronic control device (central ECU) according to the third embodiment of the present invention.
A diagram showing the software architecture in the third electronic control unit (central ECU) according to the third embodiment.
Embodiments for carrying out the present invention will be described below with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same function or configuration are denoted by the same reference numerals, and redundant descriptions are omitted.
[First Embodiment]
The first embodiment of the present invention described below applies the present invention to an electronic control device and a vehicle control device, and in particular relates to one that is connected to a network and communicates with other electronic control units (ECUs) inside and outside the vehicle. In the first embodiment, a configuration example and an operation example of the electronic control device and the vehicle control device are described by taking as an example a vehicle control device mounted on an autonomous vehicle that controls the steering, accelerator, and brake using image data acquired from a camera. However, the present invention is applicable not only to autonomous vehicles but also to vehicles that do not drive autonomously. The present invention is also applicable to real-time systems that are connected to a network and configured to communicate with other electronic control units (ECUs) inside and outside the vehicle. For example, the electronic control device and the vehicle control device according to the first embodiment can be applied to products such as robots, construction machines, and autonomous forklifts in which the CPU is configured with multiple cores and software programs run in real time.
<Configuration example of the electronic control device>
Fig. 1 is a diagram showing a schematic configuration example of the electronic control device according to the first embodiment.
The electronic control device 100 comprises a CPU 1, a RAM 2, and a peripheral 3.
The CPU 1 carries multiple cores. For ease of explanation, this embodiment assumes two cores, a first core 11 and a second core 12, but the present invention is also applicable when the CPU 1 carries three or more cores.
The peripheral 3 is hardware used for communicating with the outside of the electronic control device 100 (for example, another electronic control device 100) and contains a register 31. Specifically, the peripheral 3 is hardware exemplified by a CAN controller or an Ethernet (registered trademark) physical layer, and is connected to the in-vehicle network 4. The plurality of electronic control devices shown in Fig. 3 (the first electronic control unit (camera ECU) 6 and the second electronic control unit (autonomous driving ECU) 7) can communicate data with each other over the in-vehicle network (in-vehicle network 4) through their peripherals (peripheral 3). The peripheral (peripheral 3) transmits data read from the data storage unit (data storage unit 2A) (see Fig. 2) to another electronic control device, or receives data from another electronic control device.
Communication data is transmitted to the in-vehicle network 4 when the first core 11 or the second core 12 of the CPU 1 writes a value to the hardware-specific transmission register 31. Conversely, when the peripheral 3 receives, from the in-vehicle network 4, communication data transmitted by another electronic control device 100, the value is written to the hardware-specific reception register 31. This communication data is converted into a format usable by the first core 11 and the second core 12 and stored in the RAM 2. The first core 11 and the second core 12 can then read the data from the RAM 2 and use it for application processing by the application modules (examples of application programs) provided in each core.
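As an illustration of the data path just described, the following sketch traces how a core's write to a transmission register reaches the bus, and how a received value is converted and stored in RAM. This is purely a behavioral model; all class and function names are hypothetical and not part of the patent.

```python
class Peripheral:
    """Stands in for a CAN controller or Ethernet PHY with TX/RX registers."""
    def __init__(self, bus):
        self.bus = bus           # shared in-vehicle network (list of frames)
        self.rx_register = None  # hardware-specific reception register

    def write_tx_register(self, value):
        # Writing the transmission register triggers transmission on the bus.
        self.bus.append(value)

    def receive(self, value):
        # The hardware latches an incoming frame into the reception register.
        self.rx_register = value


def core_send(peripheral, app_data):
    # A core serializes application data and writes it to the TX register.
    frame = app_data.to_bytes(2, "big")
    peripheral.write_tx_register(frame)


def core_receive(peripheral, ram):
    # A core converts the raw register value into a usable format
    # and stores it in RAM for application processing.
    if peripheral.rx_register is not None:
        ram["last_value"] = int.from_bytes(peripheral.rx_register, "big")


bus = []                 # models the in-vehicle network 4
tx = Peripheral(bus)
core_send(tx, 0x1234)    # sending ECU: core writes the TX register

rx = Peripheral(bus)
rx.receive(bus[0])       # receiving ECU: hardware latches the frame
ram = {}                 # models RAM 2
core_receive(rx, ram)    # core converts the register value into RAM
```

The point of the model is the indirection: application code never touches the bus directly, only the register interface of the peripheral.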
<Internal configuration example of each part of the electronic control device>
Fig. 2 is a block diagram showing an internal configuration example of each part of the electronic control device 100 according to the first embodiment.
In the block diagram shown in Fig. 2, an arithmetic processing unit 1A is provided as a functional unit corresponding to the first core 11 of the CPU 1 shown in Fig. 1, and an arithmetic processing unit 1B is provided as a functional unit corresponding to the second core 12. A data storage unit 2A is provided as a functional unit corresponding to the RAM 2 shown in Fig. 1. Each arithmetic processing unit (arithmetic processing units 1A and 1B, first core 11, second core 12) has an application processing unit (application processing unit 22) that performs application processing, and a communication preparation processing unit (communication preparation processing unit 201) that prepares the application data generated by the application processing for communication.
The arithmetic processing unit 1A comprises an application processing unit 22, a communication preparation processing unit 201, a peripheral access unit 202, and a task activation time management unit 26. Here, at least one arithmetic processing unit (arithmetic processing unit 1A, first core 11) has a peripheral access unit (peripheral access unit 202) that accesses the communication data storage unit (communication data storage unit 28) and the peripheral (peripheral 3), and an activation time management unit (task activation time management unit 26) that manages the activation time at which the peripheral access unit (peripheral access unit 202) is started.
The application processing unit 22 of the arithmetic processing unit 1A performs application processing for realizing the functions assigned to the arithmetic processing unit 1A. The application data generated by the application processing is stored in the application data storage unit 23.
The communication preparation processing unit 201 of the arithmetic processing unit 1A performs communication preparation processing for the electronic control device 100 to communicate with other electronic control devices, and stores the communication data in the communication data storage unit 28. The communication preparation processing unit 201 can also read communication data received from another electronic control device 100 out of the communication data storage unit 28, convert it into a format usable by the application processing unit 22, and store the converted data in the application data storage unit 23.
The peripheral access unit 202 communicates with the communication data storage unit 28 and accesses the peripheral 3 and the communication data storage unit 28 in accordance with activation instructions input from the task activation time management unit 26.
The task activation time management unit 26 manages the activation time of the peripheral access unit 202, the activation of the peripheral access unit 202 being treated as a task. Based on a predetermined internal time, the activation time management unit (task activation time management unit 26) operates the application processing unit (application processing unit 22) at the first timing and operates the peripheral access unit (peripheral access unit 202) at the second timing. As described later, the first timing according to the first embodiment is fixed as the timing at which the application processing unit (application processing unit 22) inputs and outputs data to and from the application data storage unit (application data storage unit 23). The second timing is fixed as the timing at which the peripheral access unit (peripheral access unit 202) transfers the communication data stored in the communication data storage unit (communication data storage unit 28) to the peripheral (peripheral 3). Furthermore, the internal time is corrected by a time synchronization signal that the electronic control device 100 receives from outside, so the internal times of the plurality of electronic control devices 100 are synchronized.
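The fixed first and second timings managed in this way can be modeled as a static, offset-based schedule driven by the synchronized internal time. The following is a minimal sketch; the 100 ms period matches the example used later in this embodiment, while the 80 ms offset and the task names are illustrative assumptions only.

```python
PERIOD_MS = 100  # processing period used in this embodiment's example

# Statically fixed activation offsets within each period. Because the
# offsets are fixed in advance, the timings do not depend on how long
# each task happened to run in the previous period.
SCHEDULE = [
    (0, "application_io"),      # first timing: app data input/output
    (80, "peripheral_access"),  # second timing: transfer to the peripheral
]


def activations(now_ms):
    """Return the tasks whose fixed offset matches the internal time."""
    offset = now_ms % PERIOD_MS
    return [name for (t, name) in SCHEDULE if t == offset]
```

Because every ECU's internal time is corrected by the same synchronization signal, evaluating `activations` against that time yields the same activation points on every device, period after period.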
The arithmetic processing unit 1B comprises an application processing unit 22 and a communication preparation processing unit 201.
The application processing unit 22 of the arithmetic processing unit 1B performs application processing for realizing the functions assigned to the arithmetic processing unit 1B. The application data generated by the application processing is stored in the application data storage unit 23.
The communication preparation processing unit 201 of the arithmetic processing unit 1B performs communication preparation processing for the electronic control device 100 to communicate with other electronic control devices, and stores the communication data in the communication data storage unit 28. Its processing is the same as that of the communication preparation processing unit 201 of the arithmetic processing unit 1A.
The data storage unit (data storage unit 2A) stores data processed by the arithmetic processing units (arithmetic processing units 1A and 1B, first core 11, second core 12). The data storage unit (data storage unit 2A) has an application data storage unit (application data storage unit 23) that stores application data, and a communication data storage unit (communication data storage unit 28) that stores the communication data generated by the communication preparation processing unit (communication preparation processing unit 201).
The application data storage unit 23 stores the application data processed by the application processing units 22 of the arithmetic processing units 1A and 1B. The application data storage unit (application data storage unit 23) has a local area (local area 23a), used only by the application processing unit (application processing unit 22) executing the processing, as an area in which each application processing unit (application processing unit 22) temporarily stores data in the middle of application processing. The local area 23a is an area provided inside the RAM 2 shown in Fig. 1. Since the data stored in the local area 23a can be accessed individually only by the respective application processing unit 22, it is not altered by the other application processing units 22.
The application data storage unit (application data storage unit 23) also has a global area (global area 23b) that stores the result data of processing completed by an application processing unit (application processing unit 22) and publishes that data for use by the other application processing units (application processing units 22). The global area 23b is an area provided in the RAM 2. The data stored in the global area 23b is mutually accessible and usable by each application processing unit 22. The application processing unit (application processing unit 22) accesses the global area (global area 23b) at the first timing. The application data stored in the global area 23b of the application data storage unit 23 can also be read by the communication preparation processing units 201 of the arithmetic processing units 1A and 1B.
The data acquisition task 24 shown in Figs. 8, 9, 13, and 14 described later can read data from the local area 23a or the global area 23b, and the data publication task 25 can write data to the global area 23b. Local variables and global variables are distinguished by the compiler; inside the RAM 2, local variables are placed in the local area 23a and global variables are placed in the global area 23b. Which areas of the RAM 2 serve as the local area 23a and the global area 23b depends on the electronic control device 100.
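A minimal model of the local/global split described above might look as follows; the class and method names are hypothetical, since the patent does not prescribe an implementation.

```python
class AppDataStore:
    """Models the application data storage unit 23."""

    def __init__(self):
        self._local = {}       # per-task scratch areas (local area 23a)
        self.global_area = {}  # published results (global area 23b)

    def local(self, task_id):
        # Only the owning task reads or writes its own local dictionary;
        # other tasks never see another task's in-progress data.
        return self._local.setdefault(task_id, {})

    def publish(self, key, value):
        # Data publication: a completed result becomes visible to all tasks.
        self.global_area[key] = value


store = AppDataStore()
store.local("recognizer")["scratch"] = [1, 2, 3]  # invisible to other tasks
store.publish("detections", 3)                    # readable by every task
```

The design choice the model captures is that intermediate data is isolated per task, while only completed results cross task boundaries through the global area.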
The communication data storage unit 28 stores the communication data processed by the communication preparation processing units 201 of the arithmetic processing units 1A and 1B. The communication data stored in the communication data storage unit 28 can be read by the peripheral access unit 202.
The peripheral 3 transmits the data output from the peripheral access unit 202, at the timing at which it is activated by the task activation time management unit 26, to other electronic control devices via the in-vehicle network 4 (hereinafter also called the bus). The peripheral 3 also outputs data received from other electronic control devices via the in-vehicle network 4 to the peripheral access unit 202. The communication data that the peripheral access unit 202 acquires from the peripheral 3 at the timing at which it is activated by the task activation time management unit 26 is stored in the communication data storage unit 28. Thereafter, the communication preparation processing units 201 of the arithmetic processing units 1A and 1B acquire the communication data from the communication data storage unit 28, convert it into data that the application processing units 22 can process, and store it in the global area 23b of the application data storage unit 23. The application processing units 22 of the arithmetic processing units 1A and 1B read the data from the application data storage unit 23 and use it for the processing of each application.
In the electronic control device 100 according to the first embodiment, the first timing at which the arithmetic processing units (arithmetic processing units 1A and 1B, first core 11, second core 12) input and output data to and from the data storage unit (data storage unit 2A) for application processing, and the second timing at which the arithmetic processing units pass the data read from the data storage unit (data storage unit 2A) to the peripheral (peripheral 3) and the peripheral (peripheral 3) transmits the data to another electronic control device, are fixed in advance. The processing timings fixed in advance according to this embodiment are described below in comparison with the conventional techniques.
<Internal configuration example of the vehicle control device>
Fig. 3 is a block diagram showing an internal configuration example of a vehicle control device equipped with the electronic control device according to the first embodiment. Here, a first electronic control unit (camera ECU) 6 and a second electronic control unit (autonomous driving ECU) 7 are shown as examples of electronic control devices mounted on the vehicle control device 5.
The vehicle control device 5 comprises the first electronic control unit (camera ECU) 6, the second electronic control unit (autonomous driving ECU) 7, a camera 8, a steering 16, an accelerator 17, and a brake 18. The first electronic control unit (camera ECU) 6 and the second electronic control unit (autonomous driving ECU) 7 are connected to each other via the in-vehicle network 4. The internal hardware configuration of each of the first electronic control unit (camera ECU) 6 and the second electronic control unit (autonomous driving ECU) 7 is the same as that of the electronic control device 100 shown in Fig. 1.
The camera 8 captures an image of the area ahead of the vehicle at a cycle of, for example, 100 milliseconds, and generates camera image data 9 by a known method such as JPEG (Joint Photographic Experts Group). The camera image data 9 generated by the camera 8 is input to the first electronic control unit (camera ECU) 6.
The first electronic control unit (camera ECU) 6 receives the camera image data 9 from the camera 8 and generates camera object detection data 10. A configuration example of the camera object detection data 10 is described here.
Fig. 4 is a diagram showing a configuration example of the camera object detection data 10.
The camera object detection data 10 consists of the items of object number (identifier), object type, and coordinates.
In the number (identifier) item, a specific number is assigned to each object detected from the camera image data 9 by the image recognition application modules 221 and 222 (see Fig. 8 described later) of the first electronic control unit (camera ECU) 6.
In the object type item, the object type, for example automobile or pedestrian, is stored for each detected object.
In the coordinates item, the coordinate information within the camera image data 9 of the objects detected by the image recognition application modules 221 and 222 is stored.
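One record of the camera object detection data 10 could be sketched as follows; the field names, types, and sample values are assumptions for illustration only, as the patent does not fix a concrete data layout.

```python
from dataclasses import dataclass


@dataclass
class DetectedObject:
    number: int       # identifier assigned to each detected object
    object_type: str  # e.g. "automobile" or "pedestrian"
    x: int            # coordinates within the camera image data
    y: int


# One frame's worth of camera object detection data (sample values).
detections = [
    DetectedObject(number=1, object_type="automobile", x=320, y=180),
    DetectedObject(number=2, object_type="pedestrian", x=500, y=210),
]
```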
The camera image data 9 input to the first electronic control unit (camera ECU) 6 is either image data compressed in a known compression format such as JPEG or PNG (Portable Network Graphics), or the raw numeric sequence sensed by the imaging sensor of the camera 8, generally called raw data. The camera object detection data 10 output from the first electronic control unit (camera ECU) 6, on the other hand, consists of the number (identifier), object type, and coordinates of the objects detected from the camera image data 9. For this reason, the camera image data 9 to be transmitted and received is larger in size than the camera object detection data 10, on the order of several megabytes.
As shown in Fig. 3, the first electronic control unit (camera ECU) 6 transmits the camera object detection data 10 to the second electronic control unit (autonomous driving ECU) 7 via the in-vehicle network 4. Since the data size of the camera object detection data 10 is smaller than that of the camera image data 9, it does not occupy the bandwidth of the in-vehicle network 4. The internal logic of the first electronic control unit (camera ECU) 6 is not directly related to the present invention and is therefore not illustrated.
The second electronic control unit (autonomous driving ECU) 7 receives the camera object detection data 10 from the first electronic control unit (camera ECU) 6 at a cycle of, for example, 100 milliseconds. Based on the camera object detection data 10, the second electronic control unit (autonomous driving ECU) 7 plans the traveling direction (trajectory) of the vehicle and generates a steering control command 13, an accelerator control command 14, and a brake control command 15 for realizing the plan. The steering control command 13 is command data for controlling the operation of the steering 16. The accelerator control command 14 is command data for controlling the operation of the accelerator 17. The brake control command 15 is command data for controlling the operation of the brake 18.
 The steering 16 is controlled by the steering control command 13 and changes the traveling direction of the vehicle. The accelerator 17 is controlled by the accelerator control command 14 and accelerates the vehicle. The brake 18 is controlled by the brake control command 15 and decelerates the vehicle. Controlling the traveling direction and the acceleration/deceleration of the vehicle in this manner is called "vehicle control." Note that the logic by which the second electronic control unit (automatic driving ECU) 7 analyzes the camera object detection data 10 and generates each control command, as well as the data format of each control command, is not directly related to the present invention and is therefore not illustrated.
 <Time synchronization>
 Next, time synchronization between the first electronic control unit (camera ECU) 6 and the second electronic control unit (automatic driving ECU) 7 will be explained.
 FIGS. 5A and 5B are diagrams showing the necessity of time synchronization between the first electronic control unit (camera ECU) 6 and the second electronic control unit (automatic driving ECU) 7. FIG. 5A shows time synchronization pattern 1, and FIG. 5B shows time synchronization pattern 2.
 In the first electronic control unit (camera ECU) 6, periodic processing is performed every 100 milliseconds to generate the camera object detection data 10 from the camera image data 9. In the second electronic control unit (automatic driving ECU) 7, periodic processing is performed every 100 milliseconds to generate each control command from the camera object detection data 10. These periodic processes are usually implemented using the timeout processing of hardware counters built into the electronic control units 6 and 7. However, if the operating clocks of those hardware counters differ, the times of the electronic control units 6 and 7 drift apart, so the consistency of the data flow is no longer maintained, which may affect vehicle control.
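The drift caused by differing counter clocks can be sketched numerically (a minimal illustration with assumed clock rates, not a model of any specific hardware):

```python
# Two ECUs run a nominal 100 ms periodic task, but their hardware
# counters tick at slightly different real rates, so the real start
# time of cycle n gradually drifts apart between the two units.
PERIOD_MS = 100.0

def cycle_start(n, clock_ratio):
    """Real time at which cycle n starts on an ECU whose local clock
    runs at `clock_ratio` times real time (1.0 = perfect clock)."""
    return n * PERIOD_MS / clock_ratio

# ECU A has a perfect clock; ECU B's clock runs 0.1% fast, so ECU B
# starts its cycles slightly early and the skew accumulates.
for n in (0, 10, 100):
    skew = cycle_start(n, 1.0) - cycle_start(n, 1.001)
    print(n, round(skew, 3))   # 0 0.0 / 10 0.999 / 100 9.99
```

After 100 cycles the assumed 0.1% clock difference has accumulated to almost 10 ms, a tenth of the processing period, which is why explicit time synchronization is needed.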
 In pattern 1 shown in FIG. 5A, the data generated by the first electronic control unit (camera ECU) 6 in its first cycle arrives, via the in-vehicle network 4, before the start of the second cycle of the second electronic control unit (automatic driving ECU) 7. The second electronic control unit (automatic driving ECU) 7 can therefore execute its calculation processing in its second cycle using the data that the first electronic control unit (camera ECU) 6 generated in its first cycle.
 In pattern 2 shown in FIG. 5B, on the other hand, the data generated by the first electronic control unit (camera ECU) 6 in its first cycle arrives, via the in-vehicle network 4, after the start of the second cycle of the second electronic control unit (automatic driving ECU) 7. The second electronic control unit (automatic driving ECU) 7 can therefore execute its calculation processing using that data not in its second cycle but only in the following third cycle. In this way, a difference between the internal times of the electronic control units 6 and 7 causes the behavior of the system to vary.
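The two patterns can be captured in a small helper (an illustrative sketch assuming the receiver's cycle n starts at (n-1) x 100 ms and can only use data already present at that start):

```python
import math

PERIOD_MS = 100.0

def first_usable_cycle(arrival_ms):
    """Receiver cycle (1-based) that can first use data arriving at
    arrival_ms: the first cycle whose start time is at or after the
    arrival, since data must be present when the cycle begins."""
    return math.ceil(arrival_ms / PERIOD_MS) + 1

# Pattern 1: data generated in the sender's first cycle arrives at
# 95 ms, before the receiver's second cycle starts at 100 ms.
print(first_usable_cycle(95))    # 2
# Pattern 2: the same data arrives at 105 ms, after the receiver's
# second cycle has started, so it is only usable in the third cycle.
print(first_usable_cycle(105))   # 3
```

A 10 ms shift in arrival time thus changes which cycle consumes the data, which is exactly the behavioral variation the patterns illustrate.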
 Here, the system developer needs to verify that variations in the behavior of the system comprising all of the electronic control units, caused by differences between the internal times of those electronic control units, do not affect vehicle control. This verification work has been one cause of increased software development costs. In the first embodiment, therefore, control is performed so that the first electronic control unit (camera ECU) 6 and the second electronic control unit (automatic driving ECU) 7 are synchronized and their times do not drift apart. Methods of time synchronization between electronic control units are specified in AUTOSAR (AUTomotive Open System Architecture), a standard for in-vehicle software. However, since the method of time synchronization between electronic control units is not directly related to the present invention, the details of AUTOSAR are not illustrated.
 <Changes in data flow between time-synchronized electronic control units>
 FIGS. 6A and 6B are diagrams showing examples of changes in the data flow between time-synchronized electronic control units. FIG. 6A shows the same time synchronization pattern 1 as FIG. 5A, and FIG. 6B shows a changed data flow, pattern 3. Unlike the cases of FIGS. 5A and 5B, in both pattern 1 and pattern 3 of FIGS. 6A and 6B the first electronic control unit (camera ECU) 6 and the second electronic control unit (automatic driving ECU) 7 are time-synchronized.
 In FIG. 6A, the start timings of the 100-millisecond periodic tasks coincide between the two electronic control units. Even if the start timings (phases) of the 100-millisecond periodic tasks are offset, they may be regarded as time-synchronized as long as that offset is constant. In pattern 3, however, the processing time of the first electronic control unit (camera ECU) 6 in the first cycle has become longer, so the communication time of the camera object detection data 10 transmitted over the in-vehicle network 4 extends into the second cycle. In this case, the camera object detection data 10 does not arrive by the start of the second cycle of the second electronic control unit (automatic driving ECU) 7, so a processing omission occurs in that second cycle and the consistency of the data flow is no longer maintained.
 Such fluctuations in the processing time of the data-transmitting electronic control unit arise from differences in the amount of information obtained from the vehicle's surroundings, for example between driving straight on a highway with little traffic and turning right at an intersection in a busy downtown area, and they have a serious impact on vehicle control. The system developer must therefore verify that a breakdown of data flow consistency such as pattern 3 in FIG. 6B does not occur regardless of the amount of traffic. The developer must also redesign the timing and the CPU scheduling as necessary, so changes in the amount of data transmitted over the in-vehicle network 4 have been one factor that raises software development costs.
 <LET and LCT>
 Here, an overview of LET (Logical Execution Time) and LCT (Logical Communication Time) disclosed in Non-Patent Document 1 will be given.
 FIGS. 7A and 7B are diagrams outlining the LET and LCT proposed in Non-Patent Document 1, which is the following document:
 Kai-Bjorn Gemlau, Leonie Kohler, Rolf Ernst, and Sophie Quinton. 2021. "System-level Logical Execution Time: Augmenting the Logical Execution Time Paradigm for Distributed Real-time Automotive Software." ACM Trans. Cyber-Phys. Syst. 5, 2, Article 14 (January 2021), 27 pages. DOI:https://doi.org/10.1145/3381847
 Non-Patent Document 1 describes a method of extending a scheduling technique called LET across ECUs.
 FIG. 7A is a diagram showing the definitions of LET and LCT. As shown in FIG. 7A, the period of each cycle of the first electronic control unit, the in-vehicle network, and the second electronic control unit is constant at 100 milliseconds.
 The LETs of the first, second, third, and fourth cycles in the first electronic control unit are assigned the first, second, third, and fourth processes, respectively. Data is read at the read timing at the beginning of each process, and the processed data is written at the write timing at the end of each process. Another device or the in-vehicle network can read, at the beginning of the next cycle, the data written to the RAM 2, the peripheral 3, and so on at the end of a cycle.
 For the in-vehicle network, the communication processing of the data processed by the first electronic control unit is shown. For example, the LCT of the second cycle in the in-vehicle network is assigned a first process that communicates to the second electronic control unit the data resulting from the first process executed by the first electronic control unit in its first cycle. Similarly, the LCTs of the third and fourth cycles in the in-vehicle network are assigned a second process and a third process that communicate to the second electronic control unit the data resulting from the processes executed by the first electronic control unit in its second and third cycles.
 The LETs of the third and fourth cycles in the second electronic control unit are then assigned the first process and the second process of the second electronic control unit, respectively, each of which uses the data received via the in-vehicle network.
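The fixed three-stage assignment described above can be sketched as a pipeline schedule (the stage and process names are illustrative labels, not terms from the figure):

```python
# Sketch of the LET/LCT pipeline of FIG. 7A: the first ECU's process
# of cycle n is carried by the network's LCT in cycle n+1 and consumed
# by the second ECU's LET in cycle n+2.
def pipeline_schedule(num_cycles):
    """Return, for each cycle, what each stage executes."""
    schedule = []
    for cycle in range(1, num_cycles + 1):
        schedule.append({
            "ecu1_let": f"process {cycle}",
            "network_lct": (f"transfer of process {cycle - 1}"
                            if cycle >= 2 else "idle"),
            "ecu2_let": (f"process on data {cycle - 2}"
                         if cycle >= 3 else "idle"),
        })
    return schedule

for row in pipeline_schedule(4):
    print(row)
```

Each result thus reaches the second electronic control unit with a fixed two-cycle offset, independent of how long any individual process actually ran within its window.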
 FIG. 7B is a diagram showing an example of the processing time of each electronic control unit and the communication time of the in-vehicle network. Since the period of each cycle of the first electronic control unit, the in-vehicle network, and the second electronic control unit is fixed, the processes performed by the first and second electronic control units execute somewhere within their LETs, and the communication processing performed by the in-vehicle network executes somewhere within its LCT. For example, the first electronic control unit executes a process somewhere within the LET of a given cycle. In the next cycle, the in-vehicle network communicates the data to the second electronic control unit somewhere within the LCT. In the cycle after that, the second electronic control unit acquires the data from the in-vehicle network and executes its process somewhere within the LET.
 The method proposed in Non-Patent Document 1 fixes in advance the start time and end time of each application process executed in an electronic control unit and of each communication process on the in-vehicle network, and never changes them. Here, the start time is the time at which the application process in the electronic control unit or the communication process on the in-vehicle network acquires the data it needs, and the end time is the time at which the data generated by that application process or communication process is published to the electronic control unit or the in-vehicle network of the next cycle. With the method of Non-Patent Document 1, the order of the application processes and the communication processes does not change, and the consistency of the data flow is maintained, even when peripherals are shared, processing is offloaded to a DMA controller, applications are updated, core assignments are changed, or the hardware is changed. It was therefore expected to reduce the software development costs of verification and timing redesign.
 However, it has been found that fixing the start and end times of application processing and communication processing according to the method disclosed in Non-Patent Document 1 causes the following problem: when the processing time required for the communication processing of each message (data) is long, the amount of messages (that is, the amount of data) that can be communicated between the start and the end of the communication processing is constrained by the CPU processing time required to process each message.
 The method disclosed in Non-Patent Document 1 therefore could not be applied as-is to applications that require large amounts of data communication, exemplified by high-resolution in-vehicle cameras, zone ECUs, and central ECUs. Lengthening the time from the start to the end of the communication processing would increase the amount of data that can be communicated, but it would also lengthen the time required to exchange data between applications, worsening the latency (communication delay).
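The constraint can be made concrete with assumed figures (all numbers below are illustrative, not values from the disclosure):

```python
# If the CPU needs a fixed processing time per message, the number of
# messages that fit in one fixed communication window is bounded by
# window / per-message CPU time, regardless of raw network bandwidth.
def max_messages(window_ms, cpu_time_per_msg_ms):
    return int(window_ms // cpu_time_per_msg_ms)

# With a 100 ms window and 0.5 ms of CPU time per message, at most
# 200 messages fit per cycle.
print(max_messages(100, 0.5))   # 200
# Doubling the window doubles the message budget, but also doubles
# the data-exchange latency between applications.
print(max_messages(200, 0.5))   # 400
```

This is the trade-off stated above: throughput can only be bought by lengthening the window, at the direct cost of latency.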
 As explained with reference to FIG. 6, the time the CPU 1 of each ECU requires to process each application varies with the vehicle's surrounding environment and internal state. In the technique of Non-Patent Document 1, each application executed by the application processing units 22 shown in FIG. 2 therefore performs all of its internal processing using local variables whose data is not shared with the other application processing units 22. At the start and end of processing, each application processing unit 22 reads and writes global variables that are shared among the application processing units 22, and the timings at which each application processing unit 22 reads and writes these global variables are fixed. The time each application processing unit 22 requires for its application processing is thus regarded as the time from the fixed global-variable read timing to the fixed global-variable write timing, and this time is defined as the LET. By defining the LET in this way, the processing time required by each application processing unit 22 can always be regarded as constant, and the breakdown of data flow consistency described above is considered not to occur.
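The LET discipline just described can be sketched as follows (a minimal illustration; the variable names and the task body are assumptions, not the embodiment's actual code):

```python
import copy

# An application task reads shared global variables only at its fixed
# start, computes on private local copies, and writes results back
# only at its fixed end. Its observable timing therefore does not
# depend on how long the computation itself takes.
global_vars = {"detections": [("car", 10.0)]}

def run_let_task(compute):
    # Fixed read timing: snapshot globals into local variables.
    local_in = copy.deepcopy(global_vars)
    # Internal processing uses only locals; its duration may vary.
    local_out = compute(local_in)
    # Fixed write timing: publish the results back to the globals.
    global_vars.update(local_out)

run_let_task(lambda d: {"count": len(d["detections"])})
print(global_vars["count"])   # 1
```

Because other tasks only ever see the globals at the fixed read and write instants, the task behaves, from the data-flow viewpoint, as if it always took exactly one LET.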
 Similarly, for data communication, the timing at which a global variable is read from the RAM 2 of the transmitting electronic control unit and the timing at which the value of that global variable is written to the RAM 2 of the receiving electronic control unit are fixed, and the time from the read timing to the write timing is regarded as the LCT.
 At some timing within the LCT, the transmission processing by the BSW (Basic Software) on the transmitting electronic control unit (the first electronic control unit (camera ECU) 6), the physical data transfer over the in-vehicle network 4, and the reception processing by the BSW on the receiving electronic control unit (the second electronic control unit (automatic driving ECU) 7) are executed.
 In this embodiment, the communication load (also called the bus load) on the in-vehicle network 4 is assumed to be sufficiently small, and communication messages (communication data) with short communication cycles are assigned high priority. With this setting, whatever its communication cycle, every communication message is guaranteed to reach the register 31 (see FIG. 1) of the peripheral 3 of the destination electronic control unit (the second electronic control unit (automatic driving ECU) 7) by the time the next communication cycle starts. In the case of CAN, a bus load of 25% or less is known to be sufficient. Since the processing cycle of the first electronic control unit (camera ECU) 6 and the second electronic control unit (automatic driving ECU) 7 shown in FIG. 1 is 100 milliseconds, the LCT may likewise be set to 100 milliseconds.
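A rough bus-load estimate illustrates the 25% figure (the frame length, bit rate, and frame count below are assumptions for illustration, not values from the disclosure):

```python
# A classical CAN frame with an 8-byte payload is on the order of
# 111 bits plus stuff bits; 130 bits is a conservative estimate.
FRAME_BITS = 130
BITRATE = 500_000       # assumed 500 kbit/s CAN bus
CYCLE_S = 0.100         # 100 ms processing cycle

def bus_load(frames_per_cycle):
    """Fraction of the bus occupied by the given frame count."""
    return frames_per_cycle * (FRAME_BITS / BITRATE) / CYCLE_S

# About 96 such frames per 100 ms cycle keep the load near 25%.
print(round(bus_load(96) * 100, 1))   # 25.0
```

Under these assumed figures, staying at or below roughly this frame count per cycle keeps the bus load within the range where timely delivery can be expected.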
 <Implementation of LET and LCT>
 Next, a conventional software architecture for implementing the LET and LCT proposed in Non-Patent Document 1 will be described with reference to FIGS. 8 to 11.
 FIG. 8 is a diagram showing an example in which the conventional software architecture is applied to the first electronic control unit (camera ECU) 6.
 In the first electronic control unit (camera ECU) 6, the first core 11 is dedicated to communication and the second core 12 is dedicated to applications. That is, an OS (Operating System) 19, communication middleware 20, and a communication driver 21 are allocated to the first core 11, while the OS 19, an image recognition application module (vehicle front) 221, an image recognition application module (vehicle rear) 222, a data acquisition task 24, and a data publication task 25 are allocated to the second core 12.
 The image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 are both modules that execute application processing.
 The data acquisition task 24 acquires the data each application processing unit 22 needs from the RAM 2 (see FIG. 1) when the processing of each application module starts. The data acquisition task 24 can acquire data from the local area 23a or the global area 23b of the application data storage unit 23 (see FIG. 2).
 The data publication task 25 stores the result of each application process in the application data storage unit 23 included in the RAM 2 when the processing of each application module ends. The data publication task 25 sets up, in the application data storage unit 23, an area accessible from both the first core 11 and the second core 12 (the global area 23b), and stores the result of each application process in this global area 23b. Data that the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 generate for use by other application modules is therefore stored in the global area 23b by the data publication task 25.
 The communication middleware 20 has a function of generating communication data by adding to the data stored in the global area 23b the data needed for communication, including header generation. This communication data is, for example, a frame as used in Ethernet.
 The communication driver 21 writes the communication data to the register 31 of the peripheral 3 shown in FIG. 1.
 The timings at which the data acquisition task 24 and the data publication task 25 are started are fixed, and the processing of these two tasks consists only of accesses to the RAM 2. The CPU processing time of the data acquisition task 24 and the data publication task 25 can therefore be regarded as constant. As a result, even if the CPU processing time (application processing time) of the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 fluctuates, the timing at which the data acquisition task 24 is started and the timing at which the data publication task 25 completes do not change.
 From the viewpoint of data flow, the execution time of the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 is constant: the time from the start of the data acquisition task 24 to the completion of the data publication task 25. This constant time is implemented as the LET of the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222.
 The operation of starting each application module and each task included in the electronic control device 100 at its predetermined timing is managed and executed by the task start time management unit 26, a functional unit that realizes the task scheduling function provided by the OS 19.
 FIG. 9 is a diagram showing an example in which the conventional software architecture is applied to the second electronic control unit (automatic driving ECU) 7.
 In the second electronic control unit (automatic driving ECU) 7, the first core 11 is dedicated to communication and the second core 12 is dedicated to applications. That is, the OS (Operating System) 19, the communication middleware 20, and the communication driver 21 are allocated to the first core 11, while the OS 19, a trajectory generation application module 223, a control command generation application module 224, the data acquisition task 24, and the data publication task 25 are allocated to the second core 12.
 The trajectory generation application module 223 generates the trajectory along which the vehicle travels.
 The control command generation application module 224 generates, in accordance with the generated trajectory, the control commands for the steering 16, the accelerator 17, and the brake 18 shown in FIG. 3 (the steering control command 13, the accelerator control command 14, and the brake control command 15).
 The execution time of the trajectory generation application module 223 and the control command generation application module 224 is constant: the time from the start of the data acquisition task 24 to the completion of the data publication task 25. This time is implemented as the LET of the trajectory generation application module 223 and the control command generation application module 224.
 The data acquisition task 24, the data publication task 25, and the task start time management unit 26 are the same as those described with reference to FIG. 8, so their detailed description is omitted.
 Next, to consider the implementation of the LCT, the processing of the communication middleware 20 will be described.
 The communication middleware 20 shown in FIGS. 8 and 9 is started periodically and transmits communication data to other electronic control units. The communication middleware 20 reads the data that the data publication task 25 stored in the global area 23b of the application data storage unit 23 shown in FIG. 2, adds the data needed for communication (a communication header and the like) to the read data, and writes the resulting communication data to the register 31 of the peripheral 3 shown in FIG. 1 by calling the communication driver 21.
 The communication middleware 20 is also started by a data reception interrupt of the peripheral 3 and can perform reception processing of communication data transmitted from other electronic control units. In the reception processing, the communication middleware 20 uses the communication driver 21 to read the communication data stored in the register 31 of the peripheral 3, executes the predetermined processing required for data reception by the communication protocol, such as removal of the communication header and the CRC (Cyclic Redundancy Check), and then stores the received data in the local area 23a of the application data storage unit 23 in the RAM 2, an area accessible from the first core 11 but not from the second core 12.
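The middleware's framing and checking steps can be sketched as follows (the frame layout is an illustrative assumption, not the protocol actually used by the communication middleware 20):

```python
import struct
import zlib

# Send path: prepend a header (message id and payload length) and
# append a CRC. Receive path: verify the CRC and strip the header
# before handing the payload to the application.
def build_frame(payload: bytes, msg_id: int) -> bytes:
    header = struct.pack(">HH", msg_id, len(payload))
    body = header + payload
    return body + struct.pack(">I", zlib.crc32(body))

def parse_frame(frame: bytes) -> bytes:
    body, (crc,) = frame[:-4], struct.unpack(">I", frame[-4:])
    if zlib.crc32(body) != crc:
        raise ValueError("CRC mismatch: frame corrupted")
    msg_id, length = struct.unpack(">HH", body[:4])
    return body[4:4 + length]

frame = build_frame(b"object#1,pedestrian,12.3,4.5", msg_id=0x10)
print(parse_frame(frame))   # b'object#1,pedestrian,12.3,4.5'
```

A corrupted frame fails the CRC check and is rejected instead of being stored, which is the role the reception processing's CRC step plays.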
 The communication middleware 20 is also started periodically and can copy the received data stored in the local area 23a of the RAM 2 into the global area 23b of the RAM 2, which both the first core 11 and the second core 12 can access.
 The LCT is defined by the following processing. Counting from the time at which the communication middleware 20 of the first electronic control unit (camera ECU) 6 is started for transmission processing, the communication middleware 20 of the second electronic control unit (automatic driving ECU) 7 is also started. The communication middleware 20 of the second electronic control unit (automatic driving ECU) 7 retrieves the data stored in the local area 23a of the application data storage unit 23 shown in FIG. 2, the area accessible from the first core 11 but not from the second core 12, and stores it in the global area 23b of the application data storage unit 23, the area accessible from both the first core 11 and the second core 12. The time from the start of the communication middleware 20 for transmission processing until the data is stored in the global area 23b is defined as the LCT.
 The time required for communication from the first electronic control unit (camera ECU) 6 to the second electronic control unit (autonomous driving ECU) 7 may vary depending on the state of the in-vehicle network 4. From the viewpoint of data flow, however, this variable time can still be treated as the fixed LCT.
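Why a variable physical transfer time can be treated as a constant LCT can be shown with a small numeric sketch: as long as the actual transfer finishes within the reserved window, the receiving middleware publishes the data at the window boundary, so downstream data flow always observes the same latency. The window length and jitter values below are illustrative assumptions, not figures from the patent.

```python
LCT_US = 1000  # reserved logical communication window in microseconds (illustrative)

def publish_time_us(start_us: int, actual_transfer_us: int) -> int:
    """Data becomes visible at the end of the LCT window, regardless of jitter."""
    assert actual_transfer_us <= LCT_US, "transfer must complete within the window"
    return start_us + LCT_US

# Whether the network takes 300 us or 900 us, the observable latency is identical.
assert publish_time_us(0, 300) == publish_time_us(0, 900) == 1000
```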
 Here, an example in which application processing and communication processing are performed using the technique proposed in Non-Patent Document 1 will be described.
 FIG. 10 is a diagram showing an example of mapping the LET and LCT proposed in Non-Patent Document 1 onto the modules and tasks shown in FIGS. 8 and 9.
 The processing shown in the timing chart of the first electronic control unit (camera ECU) 6 represents the processing performed by the application-dedicated second core 12 of the first electronic control unit (camera ECU) 6. The first half of the processing shown in the timing chart of the in-vehicle network 4 (including the read timing on the left side of the figure) represents the processing performed by the communication-dedicated first core 11 of the first electronic control unit (camera ECU) 6. The second half of the processing shown in the timing chart of the in-vehicle network 4 (including the write timing on the right side of the figure) represents the processing performed by the communication-dedicated first core 11 of the second electronic control unit (autonomous driving ECU) 7. Finally, the processing shown in the timing chart of the second electronic control unit (autonomous driving ECU) 7 represents the processing performed by the application-dedicated second core 12 of the second electronic control unit (autonomous driving ECU) 7.
(1st cycle)
 In the first electronic control unit (camera ECU) 6, the data acquisition task 24 acquires the data necessary for application processing at the beginning of the first cycle. The image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 (abbreviated as "image recognition (vehicle front) 221" and "image recognition (vehicle rear) 222" in the figure) then execute their processing. The processed data is written to the peripheral 3 by the data publication task 25 at the end of the first cycle. The areas labeled "processing" in the figure represent the time during which the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 execute their processing.
(2nd cycle)
 Next, in the in-vehicle network 4, at the beginning of the second cycle, the communication middleware 20 in the first electronic control unit (camera ECU) 6 reads the data and executes communication processing toward the second electronic control unit (autonomous driving ECU) 7. The area labeled "communication" in the figure represents the time during which the data flows over the bus (in-vehicle network 4). Then, at the end of the second cycle, the communication middleware 20 in the second electronic control unit (autonomous driving ECU) 7 writes the received data into the RAM 2.
(3rd cycle)
 Next, in the second electronic control unit (autonomous driving ECU) 7, the data acquisition task 24 acquires the data from the peripheral 3 at the beginning of the third cycle. The trajectory generation application module 223 and the control command generation application module 224 (abbreviated as "trajectory generation 223" and "control command generation 224" in the figure) then execute their processing. The processed data is written to the peripheral 3 by the data publication task 25 at the end of the third cycle. The areas labeled "processing" in the figure represent the time during which the trajectory generation application module 223 and the control command generation application module 224 execute their processing.
 <Constraint on the amount of data communication>
 FIG. 11 is a timing chart of the communication middleware 20 and the in-vehicle network 4 when the first electronic control unit (camera ECU) 6 transmits a large amount of data. The constraint on the amount of data communication will be explained with reference to FIG. 11. The upper timing chart in FIG. 11 is the same as the timing chart shown in FIG. 10.
 The data publication task 25 in the communication middleware 20 performs a process of reading the data stored in the area of the application data storage unit 23 that is accessible from both the first core 11 and the second core 12 (the global area 23b), adding the data required for communication such as a communication header, and writing the result to the register 31 of the peripheral 3. In the case of Ethernet communication under the in-vehicle software standard AUTOSAR, for example, this process proceeds from the RTE (RunTime Environment) via the modules called Com and PduR (Protocol Data Unit Router), which belong to the Communication (COM) Service, as shown on the left side of FIG. 12 described later. The process must then pass through the modules called Interface and Transceiver, which belong to the Communication Hardware Abstraction, before calling the Driver. Since all of this processing is executed by the CPU, the CPU processing time is on the order of 100 microseconds to 1 millisecond. In this embodiment, the CPU processing time is assumed to be 150 microseconds.
 On the other hand, for Ethernet supporting a communication speed of 1 gigabit per second, assume that the size of a frame, the unit of communication, is 1 kilobyte. In this case, since 1 kilobyte is 8000 bits, the time during which the frame flows over the in-vehicle network 4 as a physical signal is 8000/10^9 = 80*10^(-6), that is, about 80 microseconds. Because the CPU processing time of the communication middleware 20 (150 microseconds) is longer than the time the frame flows over the in-vehicle network 4 (80 microseconds), a waiting time of 150 microseconds - 80 microseconds = 70 microseconds arises.
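Using the figures stated in the text (150 microseconds of CPU processing per frame versus 80 microseconds of bus time per frame), the idle gap between consecutive frames prepared by the same CPU works out as follows:

```python
CPU_TIME_US = 150   # middleware CPU processing time per frame (figure from the text)
FRAME_TIME_US = 80  # time one 1-kilobyte frame occupies the bus (figure from the text)

# While one frame is on the bus, the CPU is still preparing the next one,
# so the bus sits idle for the difference between the two durations.
wait_us = CPU_TIME_US - FRAME_TIME_US
assert wait_us == 70
```

In other words, the transmit path is CPU-bound rather than bandwidth-bound: the bus is idle 70 out of every 150 microseconds per frame.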
 The timing chart shown in the lower part of FIG. 11 shows the relationship between the processing times of the camera object detection data 10 generated by the image recognition application module (vehicle front) 221 and by the image recognition application module (vehicle rear) 222. As described above, the CPU processing time of the communication middleware 20 for the camera object detection data 10 generated by the image recognition application module (vehicle front) 221 is 150 microseconds, and the time during which the frame of camera object detection data 10 processed by the communication middleware 20 then flows over the in-vehicle network 4 is 80 microseconds. As shown in FIG. 10, the communication processing for sending the frame (data) over the in-vehicle network 4 is also performed by the CPU 1 of the first electronic control unit (camera ECU) 6.
 When the CPU processing of the camera object detection data 10 of the image recognition application module (vehicle front) 221 finishes, the CPU processing of the camera object detection data 10 of the image recognition application module (vehicle rear) 222 is performed next. Thus, conventionally, the CPU processing using the communication middleware 20 was performed on the application-dedicated second core 12 in the order of the image recognition application module (vehicle front) 221 followed by the image recognition application module (vehicle rear) 222.
 Ideally, the process of sending the CPU-processed camera object detection data 10 of the image recognition application module (vehicle rear) 222 over the in-vehicle network 4 would start immediately after the process of sending the CPU-processed camera object detection data 10 of the image recognition application module (vehicle front) 221 finishes. However, the image recognition application module (vehicle rear) 222 cannot operate until the CPU processing of the image recognition application module (vehicle front) 221 has finished. As a result, a CPU processing wait of 70 microseconds arises between the end of the frame transmission of the image recognition application module (vehicle front) 221 and the end of the frame generation processing of the image recognition application module (vehicle rear) 222.
 During the 70 microseconds from the completion of the transmission of the camera object detection data 10 of the image recognition application module (vehicle front) 221 until the start of the transmission of the camera object detection data 10 of the image recognition application module (vehicle rear) 222, the first electronic control unit (camera ECU) 6 cannot use the bandwidth of the in-vehicle network 4. Consequently, during this 70-microsecond CPU processing wait, the amount of data that can be communicated over the in-vehicle network 4 is constrained.
 <Communication preparation processing unit and peripheral access unit>
 FIG. 12 is a diagram outlining the processing of the communication preparation processing unit 201 and the peripheral access unit 202 shown in FIG. 2, using AUTOSAR as an example.
 The left side of FIG. 12 shows a configuration example of the conventional communication middleware 20 and communication driver 21. Normally, in AUTOSAR, the communication middleware 20 reads the data generated by each application module and stored in the global area 23b of the application data storage unit 23 by calling RTE_read(). The communication middleware 20 then accesses the peripheral 3 by calling COM, PduR, Tp (Transport Protocol), If (Interface), and Driver (communication driver 21) in that order, and stores the transmission data in the register 31. In AUTOSAR, COM, PduR, and Tp belong to the service layer, and If belongs to the HW (hardware) abstraction layer.
 The communication middleware 20 is responsible for the processing from RTE_read() through If. As described above, approximately 150 microseconds elapse from when processing starts at RTE_read() until the Driver completes its processing.
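The conventional transmit path can be sketched as a single chain of layer calls executed on one core. The function bodies below are placeholders; the real AUTOSAR modules (Com, PduR, Tp, the Ethernet interface, the driver) each perform protocol work that is omitted here, and the one-statement-per-layer style is an assumption made for the sketch.

```python
# Conventional path: one core runs the whole chain down to the driver.
trace = []

def rte_read(data):  trace.append("RTE_read"); return data
def com(data):       trace.append("COM");      return data
def pdur(data):      trace.append("PduR");     return data
def tp(data):        trace.append("Tp");       return data
def interface(data): trace.append("If");       return data
def driver(data):    trace.append("Driver");   return data  # would write register 31

def transmit(data):
    # Every layer runs back-to-back on the CPU: ~150 us per frame in the text.
    return driver(interface(tp(pdur(com(rte_read(data))))))

transmit(b"camera-object-data")
assert trace == ["RTE_read", "COM", "PduR", "Tp", "If", "Driver"]
```

Because the chain is monolithic, a second frame cannot begin any of these steps until the first frame's Driver call returns, which is exactly the serialization the embodiment removes.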
 Therefore, in this embodiment, the processing of the communication middleware 20 and the communication driver 21 shown in FIGS. 8 and 9 is divided between the communication preparation processing unit 201 and the peripheral access unit 202 shown in FIG. 2.
 The right side of FIG. 12 shows a configuration example of the communication preparation processing unit 201 and the peripheral access unit 202 according to this embodiment. As shown in FIG. 2, the communication preparation processing unit 201 and the peripheral access unit 202 are provided in the arithmetic processing unit 1A. The arithmetic processing unit 1B is provided with a communication preparation processing unit 201 but not with a peripheral access unit 202. Furthermore, in this embodiment, the communication data storage unit 28 is installed as a buffer between If and the Driver. As shown in FIG. 2, the communication data storage unit 28 is provided in the data storage unit 2A.
 By modifying the If in the communication preparation processing unit 201, the first core 11 and the second core 12 (arithmetic processing units 1A, 1B) are configured not to call the Driver directly, but instead to store in the communication data storage unit 28 the arguments that would be passed when calling the Driver. The processing from RTE_read() through the modified If constitutes the communication preparation processing unit 201. As shown in FIG. 2, the communication preparation processing unit 201 takes as input the data stored in the global area 23b of the application data storage unit 23, performs the processing from RTE_read() through the modified If shown on the right side of FIG. 12, and then outputs the data to the communication data storage unit 28.
 Further, in this embodiment, the functional unit capable of executing the process of retrieving data from the communication data storage unit 28 and calling the Driver with the retrieved data as an argument is the peripheral access unit 202. The communication preparation processing unit (communication preparation processing unit 201) corresponds to the service layer and the hardware abstraction layer defined in AUTOSAR (registered trademark), and the peripheral access unit (peripheral access unit 202) corresponds to the driver defined in AUTOSAR. The communication data storage unit (communication data storage unit 28) is provided between the hardware abstraction layer and the driver.
 As shown in FIG. 2, the communication data storage unit 28 is allocated in an area of the RAM 2 accessible from both the first core 11 and the second core 12, in an area separate from the application data storage unit 23. Most of the CPU processing from RTE_read() until the Driver completes its processing is performed by the communication preparation processing unit 201, which requires a CPU processing time of about 130 microseconds. The peripheral access unit 202, on the other hand, only retrieves data from the communication data storage unit 28 and calls the Driver (communication driver 21), so it requires a CPU processing time of only about 20 microseconds. Furthermore, since the communication preparation processing unit 201 is provided in both the first core 11 and the second core 12 (the arithmetic processing units 1A and 1B shown in FIG. 2), the first core 11 and the second core 12 can each execute the communication preparation processing unit 201 in parallel.
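A rough timing model shows the benefit of this split: the 130-microsecond preparation stage can run on both cores in parallel, leaving only the 20-microsecond driver calls serialized. The per-stage durations are the figures given in the text; the model deliberately ignores scheduling and synchronization overhead.

```python
PREP_US = 130    # communication preparation (RTE_read .. modified If), per frame
ACCESS_US = 20   # peripheral access (buffer read + Driver call), per frame

# Conventional design: two frames each run the full 150 us chain serially on one core.
serial_total_us = 2 * (PREP_US + ACCESS_US)

# Split design: core 1 and core 2 each prepare one frame in parallel (130 us),
# then the peripheral access unit handles both prepared frames serially (2 x 20 us).
split_total_us = PREP_US + 2 * ACCESS_US

assert serial_total_us == 300 and split_total_us == 170
```

Under these assumptions, two frames reach the peripheral in 170 microseconds instead of 300, which is why the bus idle gap described earlier shrinks.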
 As shown on the right side of FIG. 12, the arithmetic processing units (arithmetic processing units 1A, 1B, first core 11, second core 12) transmit data to other electronic control devices 100 by activating the communication preparation processing unit (communication preparation processing unit 201) and then the peripheral access unit (peripheral access unit 202), in that order, based on a predetermined internal time. The communication preparation processing unit (communication preparation processing unit 201) performs the process of storing communication data in the communication data storage unit (communication data storage unit 28), and the peripheral access unit (peripheral access unit 202) performs the process of storing in the peripheral (peripheral 3) the communication data read from the communication data storage unit (communication data storage unit 28), that is, the data that the peripheral (peripheral 3) transmits to the other electronic control devices.
 Conversely, the arithmetic processing units (arithmetic processing units 1A, 1B, first core 11, second core 12) receive data from other electronic control devices 100 by activating the peripheral access unit (peripheral access unit 202) and then the communication preparation processing unit (communication preparation processing unit 201), in that order, based on a predetermined internal time. The peripheral access unit (peripheral access unit 202) performs the process of acquiring the data that the peripheral (peripheral 3) has received from other electronic control devices and writing it into the communication data storage unit (communication data storage unit 28), and the communication preparation processing unit (communication preparation processing unit 201) performs the process of writing the communication data read from the communication data storage unit (communication data storage unit 28) into the application data storage unit (application data storage unit 23).
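The two activation orders described in the last two paragraphs are symmetric and can be sketched as a pair of pipelines over the shared communication data storage. This is an illustrative model only: the list-based storage, the 3-byte header, and the function names are assumptions, not the patent's interfaces.

```python
comm_buffer = []   # models communication data storage unit 28 (shared RAM)
peripheral = []    # models register 31 of peripheral 3
app_storage = []   # models application data storage unit 23

def send(app_data: bytes):
    # 1) Communication preparation: protocol processing, result into the buffer.
    comm_buffer.append(b"HDR" + app_data)
    # 2) Peripheral access: move the prepared frame into the peripheral register.
    peripheral.append(comm_buffer.pop(0))

def receive():
    # 1) Peripheral access: fetch the received frame into the buffer.
    comm_buffer.append(peripheral.pop(0))
    # 2) Communication preparation: strip protocol data, publish to the application.
    app_storage.append(comm_buffer.pop(0)[3:])

send(b"detect")
receive()
assert app_storage == [b"detect"]
```

Transmission runs preparation before peripheral access; reception runs the same two stages in the opposite order, with the communication data storage unit 28 as the hand-off point in both directions.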
 <Software architecture according to this embodiment>
 Next, the software architecture according to this embodiment will be described with reference to FIGS. 13 and 14.
 In the software architecture according to this embodiment, the first core 11 is not dedicated to communication and the second core 12 is not dedicated to applications, as in the example applying the method of Non-Patent Document 1 described with reference to FIGS. 8 and 9; instead, the first core 11 and the second core 12 are each used for both applications and communication.
 FIG. 13 is a diagram showing the software architecture according to this embodiment in the first electronic control unit (camera ECU) 6.
 One of the plurality of electronic control devices according to the first embodiment (the first electronic control unit (camera ECU) 6) transmits camera object detection data (camera object detection data 10), which is the result of object detection performed on camera image data (camera image data 9) captured by the camera, to the in-vehicle network (in-vehicle network 4). In this first electronic control unit (camera ECU) 6, the OS 19, the image recognition application module (vehicle front) 221, the data acquisition task 24, the data publication task 25, and the communication preparation processing unit 201 are assigned to the first core 11. Likewise, the OS 19, the image recognition application module (vehicle rear) 222, the data acquisition task 24, the data publication task 25, and the communication preparation processing unit 201 are assigned to the second core 12.
 In the OS 19, the task activation time management unit 26 can be executed from both the first core 11 and the second core 12.
 The peripheral access unit 202 of the first electronic control unit (camera ECU) 6 can be executed by either the first core 11 or the second core 12. In FIG. 2, the peripheral access unit 202 is executed by the arithmetic processing unit 1A corresponding to the first core 11, but it may instead be executed by the arithmetic processing unit 1B corresponding to the second core 12.
 FIG. 14 is a diagram showing the software architecture according to this embodiment in the second electronic control unit (autonomous driving ECU) 7.
 One of the plurality of electronic control devices according to the first embodiment (the second electronic control unit (autonomous driving ECU) 7) generates, based on the camera object detection data (camera object detection data 10), at least one of a control command value for the target angle of the steering (steering 16), a control command value for the target acceleration force of the accelerator (accelerator 17), and a control command value for the target damping force of the brake (brake 18), which are control targets of the vehicle, and transmits the control command value to the control target. In this second electronic control unit (autonomous driving ECU) 7, the OS 19, the trajectory generation application module 223, the data acquisition task 24, the data publication task 25, and the communication preparation processing unit 201 are assigned to the first core 11. Likewise, the OS 19, the control command generation application module 224, the data acquisition task 24, the data publication task 25, and the communication preparation processing unit 201 are assigned to the second core 12.
 In the OS 19, the task activation time management unit 26 can be executed from both the first core 11 and the second core 12.
 The peripheral access unit 202 of the second electronic control unit (autonomous driving ECU) 7 can likewise be executed by either the first core 11 or the second core 12. In FIG. 2, the peripheral access unit 202 is executed by the arithmetic processing unit 1A corresponding to the first core 11, but it may instead be executed by the arithmetic processing unit 1B corresponding to the second core 12.
 As shown in FIGS. 13 and 14, the first core 11 and the second core 12 both execute application processing and communication processing (in particular, the communication preparation processing unit 201), and the first core 11 also executes the peripheral access unit 202 for accessing the peripheral 3 that performs the communication.
 The data acquisition task 24 installed in the first core 11 is activated at a predetermined time. It has the role of acquiring the data necessary for the processing of the image recognition application module (vehicle front) 221, the application module likewise installed in the first core 11. The data acquired by the data acquisition task 24 may have been generated by the image recognition application module (vehicle rear) 222 installed in the same electronic control unit. In that case, the data acquisition task 24 acquires the data stored in the global area 23b of the application data storage unit 23.
 If the data to be acquired by the data acquisition task 24 was generated by an application module installed in a different electronic control unit and received by the peripheral 3, the first core 11 activates the peripheral access unit 202. An application module installed in a different electronic control unit is, for example, the trajectory generation application module 223 or the control command generation application module 224 shown in FIG. 14, installed in the second electronic control unit (autonomous driving ECU) 7 external to the first electronic control unit (camera ECU) 6. The peripheral access unit 202 activated by the first core 11 acquires the data stored in the register 31 shown in FIG. 1 using the communication driver 21 shown in FIG. 12, and stores it in the communication data storage unit 28 shown in FIG. 12.
 Like the data acquisition task 24 installed in the first core 11, the data acquisition task 24 installed in the second core 12 has the role of acquiring the data necessary for the execution of the image recognition application module (vehicle rear) 222, the application module installed in the same second core 12.
 As in the technique disclosed in Non-Patent Document 1, the image recognition application module (vehicle front) 221 installed in the first core 11 is executed at an arbitrary time after the completion of the data acquisition task 24 and before the start time of the data publication task 25. However, some of the data acquired by the image recognition application module (vehicle front) 221 may have been acquired from an application module installed in a different electronic control unit, that is, received from the outside. In that case, the image recognition application module (vehicle front) 221 calls the communication preparation processing unit 201 installed in the same first core 11 before executing its processing.
 The called communication preparation processing unit 201 retrieves the data from the communication data storage unit 28 shown in FIG. 12, processes it in the order Tp, PduR, COM, and RTE_write(), and stores the data in the application data storage unit 23. Since the processing described here is data reception processing, RTE_write() is used instead of RTE_read() in the flow shown in FIG. 12. As a result, the data retrieved from the communication data storage unit 28 becomes accessible from the image recognition application module (vehicle front) 221.
 After the processing of the image recognition application module (vehicle front) 221 is completed, some of the data it generated may be published to application modules installed in different electronic control units. Data published to the outside in this way (for example, to a different electronic control unit) must be transmitted to the outside. The first core 11 therefore immediately calls the communication preparation processing unit 201 with the data to be transmitted to the outside as an argument. The communication preparation processing unit 201 then processes RTE_read(), COM, PduR, and Tp in that order and, at If, stores the data in the communication data storage unit 28.
　Processing by the image recognition application module (vehicle rear) 222 on the second core 12 proceeds in the same way as that of the image recognition application module (vehicle front) 221 on the first core 11: it is executed at an arbitrary time between the completion of the data acquisition task 24 and the start time of the data publication task 25.
　However, some of the data acquired by the image recognition application module (vehicle rear) 222 may have been obtained from an application module installed in a different electronic control device, that is, received from the outside. In that case, the image recognition application module (vehicle rear) 222 calls the communication preparation processing unit 201 before executing its processing. The communication preparation processing unit 201 then retrieves data from the communication data storage unit 28 and processes Tp, PduR, COM, and Rte_write() in the reverse of the order shown in FIG. 12. Through this processing, the data retrieved from the communication data storage unit 28 becomes accessible to the image recognition application module (vehicle rear) 222.
　Likewise, after the image recognition application module (vehicle rear) 222 completes its processing, some of the data it generated may be published to application modules installed in a different electronic control device. For data that must be transmitted to the outside (for example, to a different electronic control device) in this way, the second core 12 immediately calls the communication preparation processing unit 201 with that data as an argument. The communication preparation processing unit 201 then processes Rte_read(), COM, PduR, and Tp in order and stores the data in the communication data storage unit 28 at the If layer.
　Note that in the technology disclosed in Non-Patent Document 1, the first core 11 is dedicated to communication, so all communication processing is executed on the same core. By contrast, in the present embodiment the communication preparation processing unit 201 is executed, for both data transmission and reception, on the first core 11 and on the second core 12 respectively, before and after the processing of the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222.
　The data publication task 25 on the first core 11 is activated at a predetermined time. Its role is to publish the execution results (application data) of the image recognition application module (vehicle front) 221, the application module on the same first core 11. The application data storage unit 23 stores the data to be published to application modules within the same electronic control device in the global area 23b.
　When its processing is complete, the data publication task 25 on the first core 11 immediately activates the peripheral access unit 202. The peripheral access unit 202 retrieves the data stored in the communication data storage unit 28, calls the communication driver 21 shown in FIG. 12, and writes the data to the register 31 of the peripheral 3. Similarly, the data publication task 25 on the second core 12 is activated at a predetermined time.
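The hand-off just described, in which the peripheral access unit drains the communication data storage unit and writes each entry to the peripheral's register via the communication driver, can be sketched as below. All types and names are hypothetical illustrations, not the actual driver interface.

```c
#include <assert.h>

/* Illustrative model: queued communication data is flushed to a
 * simulated peripheral register, standing in for register 31 of
 * peripheral 3 being written through communication driver 21. */

#define STORE_CAPACITY 16

typedef struct {
    unsigned data[STORE_CAPACITY]; /* queued communication data */
    int count;
} CommDataStore;

typedef struct {
    unsigned last_written; /* stand-in for the peripheral register */
    int writes;
} PeripheralRegister;

/* Queue one entry (done by the communication preparation processing
 * unit in this sketch). Returns 0 when the store is full. */
int comm_store_push(CommDataStore *s, unsigned value) {
    if (s->count >= STORE_CAPACITY) return 0;
    s->data[s->count++] = value;
    return 1;
}

/* Peripheral access: write every stored entry to the register via the
 * (simulated) communication driver, then empty the store. Returns the
 * number of entries written. */
int peripheral_access_flush(CommDataStore *s, PeripheralRegister *reg) {
    int written = 0;
    for (int i = 0; i < s->count; i++) {
        reg->last_written = s->data[i]; /* communication driver write */
        reg->writes++;
        written++;
    }
    s->count = 0;
    return written;
}
```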
　The data publication task 25 on the second core 12, in turn, publishes the execution results of the image recognition application module (vehicle rear) 222, the application module on the same second core 12. The application data storage unit 23 stores the data to be published to application modules within the same electronic control device in the global area 23b.
　Note that the image recognition application module (vehicle rear) 222 has already completed the communication preparation processing unit 201 for externally transmitted data, and all accesses to the peripheral 3 are executed by the peripheral access unit 202 on the first core 11. The data publication task 25 on the second core 12 therefore performs no particular processing for data published to application modules in different electronic control devices.
<Differences between the technology disclosed in Non-Patent Document 1 and the technology according to this embodiment>
　Next, the differences between the technology disclosed in Non-Patent Document 1 and the technology according to this embodiment will be described with reference to FIGS. 15 and 16.
　FIG. 15 is a diagram showing an example in which the LET and LCT according to this embodiment are applied to the modules and tasks according to this embodiment, namely those shown in FIGS. 13 and 14. In the description below, the communication preparation processing unit 201 and the peripheral access unit 202 are marked (receiving side) when they perform data reception processing and (transmitting side) when they perform data transmission processing.
　The completion time of the process in which the transmitting electronic control device (the first electronic control device (camera ECU) 6) transmits data to the in-vehicle network (in-vehicle network 4) and the start time of the process in which the receiving electronic control device (the second electronic control device (automatic driving ECU) 7) receives data from the in-vehicle network (in-vehicle network 4) are both prescribed. The activation time management unit (task activation time management unit 26) of each of the plurality of electronic control devices (the first electronic control device (camera ECU) 6 and the second electronic control device (automatic driving ECU) 7) activates that device's peripheral access unit (peripheral access unit 202) based on the prescribed completion time and start time. As shown in FIG. 15, the interval between the completion time of the data transmission process of the first electronic control device (camera ECU) 6 and the start time of the data reception process of the second electronic control device (automatic driving ECU) 7 is fixed at 100 milliseconds. This allows the processing periods of the first electronic control device (camera ECU) 6 and the second electronic control device (automatic driving ECU) 7 to be aligned with the period in which data is transmitted on the in-vehicle network 4.
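The fixed 100 ms timings above can be checked with a small calculation: with a 100 ms period, the sender reads at the start of cycle 1 and writes at its end, the bus carries the frame during cycle 2, and the receiver reads at the start of cycle 3. The helper below is a hypothetical sketch of that arithmetic, not code from the embodiment.

```c
#include <assert.h>

/* Illustrative computation of read/write timings under a fixed
 * 100 ms period, as in FIG. 15. Cycles are numbered from 1. */

#define PERIOD_MS 100L

typedef struct {
    long read_ms;  /* peripheral read (data acquisition) timing */
    long write_ms; /* peripheral write (data publication) timing */
} CycleTiming;

CycleTiming cycle_timing(long cycle_index) {
    CycleTiming t;
    t.read_ms = (cycle_index - 1) * PERIOD_MS;
    t.write_ms = t.read_ms + PERIOD_MS;
    return t;
}
```

With the sender in cycle 1 and the receiver in cycle 3, the gap between the sender's write timing and the receiver's read timing comes out to exactly one 100 ms period, the network cycle.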
(1st cycle)
<First processing by the data acquisition task and data publication task>
　In the first electronic control device (camera ECU) 6, at the read timing at which the first cycle starts, the data acquisition task 24 acquires the data required by the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 (for example, the camera image data 9). Next, the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 perform application processing and generate application data (for example, the camera object detection data 10). Because the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 are assigned to the first core 11 and the second core 12, respectively, the two modules can execute their application processing simultaneously. The data publication task 25 then publishes the application data by storing it in the global area 23b of the application data storage unit 23.
<Second processing by the peripheral access unit and communication preparation processing unit>
　In the first cycle, processing based on communication data received from other electronic control devices is also performed.
　The peripheral access unit 202 (receiving side), whose activation is managed by the task activation time management unit 26, accesses the peripheral 3 at the read timing, reads the communication data, and stores it in the communication data storage unit 28. The communication preparation processing unit 201 (receiving side) converts the communication data read from the communication data storage unit 28 into application data that each application module can process, and stores the converted application data in the application data storage unit 23. The image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 then execute application processing using the application data read from the application data storage unit 23.
　When the first and second processes described above are complete, the processed application data is stored in the application data storage unit 23. The communication preparation processing unit 201 performs communication preparation processing on the application data read from the application data storage unit 23 and stores the resulting communication data in the communication data storage unit 28. The peripheral access unit 202 then reads the communication data from the communication data storage unit 28 and writes it to the register 31 of the peripheral 3 at the write timing. Here, the interval from the read timing to the write timing in the first electronic control device (camera ECU) 6 is fixed at 100 milliseconds.
(2nd cycle)
　On the bus of the in-vehicle network 4, the communication data is transmitted from the peripheral 3 of the first electronic control device (camera ECU) 6 to the peripheral 3 of the second electronic control device (automatic driving ECU) 7.
(3rd cycle)
　As in the first cycle, the third cycle performs the first processing by the data acquisition task 24 and the data publication task 25, or the second processing by the peripheral access unit 202 and the communication preparation processing unit 201. The third cycle differs from the first in that its processing is performed by the second electronic control device (automatic driving ECU) 7, so the trajectory generation application module 223 and the control command generation application module 224 execute the application processing.
　Unlike the technology disclosed in Non-Patent Document 1, the technology according to this embodiment fixes not the start and end times of each application module but the start and end times of the combined processing of the application module and the communication preparation processing unit 201. In addition, the communication preparation processing unit 201 is executed on both the first core 11 and the second core 12. Of the communication processing that required 150 microseconds of CPU processing time as shown in FIG. 11, 130 microseconds' worth is therefore parallelized as processing of the communication preparation processing unit 201.
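A back-of-envelope view of this split, using the figures quoted above (150 µs of communication processing per frame, of which 130 µs of communication preparation is parallelized across cores while the 20 µs of peripheral access stays serialized on one core), can be written as follows. The formulas are illustrative, assuming one frame per core, and are not taken from the embodiment.

```c
#include <assert.h>

/* Illustrative timing arithmetic for the split described in the text. */

#define PREP_US   130L /* communication preparation, one per core, in parallel */
#define ACCESS_US  20L /* peripheral access, serialized on a single core */

/* Prior art (Non-Patent Document 1): one communication core handles
 * the full 150 us per frame, back to back. */
long serial_comm_time_us(long frames) {
    return frames * (PREP_US + ACCESS_US);
}

/* This embodiment: the preparations overlap across cores (one frame
 * each), so only the peripheral accesses accumulate serially. */
long parallel_comm_time_us(long frames) {
    return PREP_US + frames * ACCESS_US;
}
```

For the two frames of this example (front and rear image recognition), the serialized communication time drops from 300 µs to 170 µs under these assumptions.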
　FIG. 16 is a timing chart comparing the technology according to the first embodiment with the technology disclosed in Non-Patent Document 1.
　The upper part of FIG. 16 shows an example of the timing of each process according to the first embodiment. The processing of the first core 11 and the second core 12 can start simultaneously. The first core 11 and the second core 12 split the CPU's communication processing into the communication preparation processing unit (130 microseconds) and the peripheral access unit (20 microseconds).
　The peripheral access unit 202 (receiving side) on the first core 11 performs reception processing in 20 microseconds. The first core 11 and the second core 12 then run the communication preparation processing unit 201 (receiving side) and the communication preparation processing unit 201 (transmitting side) in parallel, and also execute the application processing (the image recognition application (front) and the image recognition application (rear)) in parallel. The first core 11 executes the peripheral access unit 202 (transmitting side) on a single core. As a result, the communication processing performed on the in-vehicle network 4 no longer has to wait for the processing of the CPU 1, and the bandwidth limitation of the in-vehicle network 4 is eliminated.
　The lower part of FIG. 16 shows an example of the timing of each process in the technology disclosed in Non-Patent Document 1. In that technology, as shown in FIGS. 8 and 9, the cores are separated into a first core 11 dedicated to communication and a second core 12 dedicated to applications. The application-dedicated second core 12 therefore processes the image recognition application (vehicle front) and the image recognition application (vehicle rear) in turn, after which the first core 11 performs the communication processing. The communication-dedicated first core 11 performs the communication processing for transmitting, in turn, the processing data of the image recognition application (vehicle front) and the processing data of the image recognition application (vehicle rear) to the second electronic control device (automatic driving ECU) 7 via the in-vehicle network 4.
　When the communication-dedicated first core 11 finishes the communication processing for the processing data of the image recognition application (vehicle front), that data is transmitted on the in-vehicle network 4. However, if the in-vehicle network 4 uses high-speed, high-capacity communication such as 1-gigabit Ethernet, the CPU processing that the first core 11 requires for communication (150 microseconds) is longer than the time the frame data spends on the bus of the in-vehicle network 4 (80 microseconds). The in-vehicle network 4 therefore has to wait until the communication-dedicated first core 11 finishes the communication processing for the processing data of the image recognition application (vehicle rear); only once that processing is finished can the frame data of the image recognition application (vehicle rear) be placed on the bus of the in-vehicle network 4. The communication bandwidth of the in-vehicle network 4 is thus limited.
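The bottleneck just described reduces to a simple comparison: a frame occupies the bus for a fixed time, but the next frame only becomes available after the CPU finishes preparing it. The helpers below make that comparison explicit; the figures and function names are illustrative, not from the embodiment.

```c
#include <assert.h>

/* Illustrative check of whether per-frame CPU time throttles the bus. */

int cpu_limits_bus(long cpu_us_per_frame, long bus_us_per_frame) {
    return cpu_us_per_frame > bus_us_per_frame;
}

long bus_idle_us_per_frame(long cpu_us_per_frame, long bus_us_per_frame) {
    long idle = cpu_us_per_frame - bus_us_per_frame;
    return idle > 0 ? idle : 0;
}
```

With the prior art's 150 µs of CPU time against 80 µs on the bus, the bus idles 70 µs per frame; if only the 20 µs peripheral access remains serialized, as in this embodiment, the bus never waits for the CPU under these assumptions.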
　In the electronic control device 100 according to the first embodiment described above, the timing fixed as the LET is not the start and end times of the application processing in each application processing unit but the span from the start time of the application processing in each application processing unit to the end time of the communication preparation processing, and each core is provided with its own communication preparation processing unit 201. Fixing the LET in this way allows the communication preparation processing to be executed simultaneously on a plurality of cores. Conventionally, a frame was transmitted to the bus only after a single core had performed the communication preparation processing, so the communication preparation processing for one application module could not start until that for another application module had finished; as a result, a waiting time arose before a frame could be transmitted to the bus, and a trade-off occurred in which the shorter the control period, the more this waiting time reduced the amount of data that could be transmitted to the bus. In the electronic control device according to the present embodiment, by contrast, the amount of data that can be transmitted to the bus need not decrease even when the control period is shortened. Furthermore, even when a large amount of data communication is required among a plurality of electronic control devices, the LET (fixed timing) can still be introduced, which has the effect of reducing software development costs.
　As described above, in the electronic control device 100 according to the first embodiment, the unit over which the LET start and end times are fixed is not the application processing plus the communication processing, but the application processing plus the communication preparation processing and the peripheral access processing. Most of the communication processing can therefore be executed in parallel on each core (arithmetic processing unit) while the start and end times of the processing remain fixed. In addition, even when a large amount of data communication is required, there is no need to extend the time from the start to the end of the communication-related processing, so the latency (communication delay) can be improved.
　The configuration and processing of the electronic control device 100 according to the first embodiment have been described taking standard AUTOSAR as an example. However, the communication middleware 20 need not be a standard basic software module; it may instead be a complex device driver, which is software that users can incorporate according to their own specifications.
　The present invention is also applicable to electronic control devices equipped with general real-time operating systems other than AUTOSAR.
　In addition, to verify that the data flow shown in FIG. 15 is maintained, the communication preparation processing unit 201 (transmitting side) may add a timestamp indicating the transmission time to the frame data when performing the frame transmission processing, and the communication preparation processing unit 201 (receiving side) may check the timestamp added to a received frame when the frame is received.
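One way such a timestamp check could work is sketched below: the sender stamps each frame with its transmission time, and the receiver accepts a frame only if the stamp falls exactly one 100 ms period before its own read timing. The frame layout, helper names, and acceptance rule are all hypothetical illustrations of the verification idea, not part of the embodiment.

```c
#include <assert.h>

/* Hypothetical timestamp verification for the fixed-period data flow. */

typedef struct {
    long timestamp_ms; /* transmission time added by the sending side */
    unsigned payload;
} Frame;

Frame stamp_frame(unsigned payload, long tx_time_ms) {
    Frame f;
    f.timestamp_ms = tx_time_ms;
    f.payload = payload;
    return f;
}

/* Receiving side: the frame should have been sent exactly one period
 * before the receiver's read timing. */
int timestamp_ok(const Frame *f, long rx_time_ms, long period_ms) {
    return rx_time_ms - f->timestamp_ms == period_ms;
}
```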
[Second embodiment]
　Next, a configuration example and processing method of an electronic control device according to a second embodiment of the present invention will be described. The electronic control device according to the second embodiment differs from the first embodiment in that the communication middleware 20 is replaced by a lightweight communication driver. Configurations identical to those of the first embodiment are given the same reference numerals, and their description is omitted.
<Lightweight communication driver>
　Configuration examples of a first electronic control device (camera ECU) 6A and a second electronic control device (automatic driving ECU) 7A capable of executing the lightweight communication driver will be described with reference to FIGS. 17 and 18.
　FIG. 17 is a diagram showing the software architecture of the first electronic control device (camera ECU) 6A according to the second embodiment.
　One or more arithmetic processing units (arithmetic processing units 1A and 1B, first core 11, second core 12) may be provided, each having its own application processing unit (application processing unit 22). In FIG. 17, the first electronic control device (camera ECU) 6A is described as having only the first core 11, which is used for both communication processing and application processing.
　The OS 19, the image recognition application module (vehicle front) 221, the image recognition application module (vehicle rear) 222, the data acquisition task 24, the data publication task 25, and the lightweight communication driver 29 are assigned to the first core 11. The task activation time management unit 26 is assigned to the OS 19.
　FIG. 18 is a diagram showing the software architecture of the second electronic control device (automatic driving ECU) 7A according to the second embodiment.
　The second electronic control device (automatic driving ECU) 7A has only the first core 11, which is used for both communication processing and application processing.
　The OS 19, the trajectory generation application module 223, the control command generation application module 224, the data acquisition task 24, the data publication task 25, and the lightweight communication driver 29 are assigned to the first core 11. The task activation time management unit 26 is assigned to the OS 19.
　FIGS. 8 and 9 showed an example of the communication middleware 20 and the communication driver 21, assigned to the communication-dedicated first core 11, for transmitting and receiving data. The total processing time required to execute the communication preparation processing unit (communication preparation processing unit 201) and the peripheral access unit (peripheral access unit 202) may be shorter than the communication time of the data exchanged with other electronic control devices; for example, the total CPU processing time of the communication middleware 20 and the communication driver 21 may be shorter than the time a data frame spends on the bus of the in-vehicle network 4. In that case, the arithmetic processing unit (arithmetic processing unit 1A, first core 11) integrates the communication preparation processing unit (communication preparation processing unit 201) and the peripheral access unit (peripheral access unit 202) into a lightweight communication driver (lightweight communication driver 29), and the lightweight communication driver (lightweight communication driver 29) performs the transmission and reception of communication data. The communication middleware 20 and the communication driver 21 combined are therefore called the "lightweight communication driver." With the lightweight communication driver 29 shown in FIGS. 17 and 18, even when a single core (the first core 11) executes transmission and reception processing consecutively, no CPU processing wait such as that shown in FIG. 16 occurs.
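The condition stated above can be written as a one-line predicate: if the communication preparation and the peripheral access together finish before a frame has left the bus, merging them into a single lightweight driver on one core causes no waiting. The predicate and the example figures below are illustrative assumptions, not values from the embodiment.

```c
#include <assert.h>

/* Illustrative merge criterion for the lightweight communication
 * driver: total CPU time per frame must not exceed the time the
 * frame occupies the bus. */
int lightweight_driver_viable(long prep_us, long access_us,
                              long frame_on_bus_us) {
    return prep_us + access_us <= frame_on_bus_us;
}
```

For instance, a hypothetical 50 µs preparation plus 20 µs access against an 80 µs frame time qualifies, whereas the first embodiment's 130 µs plus 20 µs against the same 80 µs frame time would not.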
　In the second embodiment, however, the communication preparation processing unit 201 and the peripheral access unit 202 need not be separated. In addition, the data acquisition task 24 acquires the data required for the application modules' processing from other application modules within the same electronic control device by accessing the global area 23b.
　FIG. 19 is a diagram showing an example in which the LET and LCT according to this embodiment are mapped onto the modules and tasks according to this embodiment, namely those shown in FIGS. 17 and 18. Of the modules and tasks described below, those marked (receiving side) handle data reception processing and those marked (transmitting side) handle data transmission processing.
(1st cycle)
　In the first electronic control device (camera ECU) 6A, the data acquisition task 24, mapped to the read timing at which the first cycle starts, acquires data. Next, the lightweight communication driver 29 (receiving side) performs reception processing on the received data. The image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 then execute their processing. Finally, at the write timing at which the first cycle ends, the data publication task 25 writes the processed data to the register 31 of the peripheral 3, and the lightweight communication driver 29 (transmitting side) performs transmission processing to send the data to the in-vehicle network 4.
(2nd cycle)
 On the bus of the in-vehicle network 4, the data is transferred from the first electronic control unit (camera ECU) 6A to the second electronic control unit (automatic driving ECU) 7A.
(3rd cycle)
 In the second electronic control unit (automatic driving ECU) 7A, the data acquisition task 24 mapped to the read timing at which the third cycle starts acquires data. Next, the lightweight communication driver 29 (receiving side) performs data reception processing. The trajectory generation application module 223 and the control command generation application module 224 then execute their processing. Finally, at the write timing at which the third cycle ends, the data disclosure task 25 writes the processed data to the register 31 of the peripheral 3, and the lightweight communication driver 29 (transmitting side) performs data transmission processing.
 As shown in FIG. 19, after the processing by the data acquisition task 24 is completed, the task activation time management unit 26 immediately calls the lightweight communication driver 29 to perform data reception processing. The data disclosure task 25 publishes data to the other application modules within the same electronic control device by writing the processing results of the application modules into the global area 23b. After the processing of the data disclosure task 25 is completed, the task activation time management unit 26 immediately calls the lightweight communication driver 29 to perform data transmission processing.
 In the electronic control device 100 according to the second embodiment described above, software development costs can be reduced through fixed timing even when only a single core is present. Moreover, even when the electronic control device 100 has a multi-core configuration, there is no need to provide a dedicated communication core, unlike the technique disclosed in Non-Patent Document 1. The number of CPU cores used in the electronic control device 100 can therefore be reduced. In this way, the electronic control device 100 according to the second embodiment achieves a reduction in software development costs through fixed timing without modifying the communication middleware 20 and the communication driver 21 shown in FIG. 8, even when the number of usable cores is limited.
[Third embodiment]
 Next, a configuration example and a processing method of an electronic control device according to a third embodiment of the present invention will be described with reference to FIGS. 20 and 21.
 The electronic control device according to the third embodiment differs from the electronic control device according to the first embodiment in that the application modules installed in the first electronic control device are instead installed in the second electronic control device. Configurations that are the same as in the first embodiment are given the same reference numerals, and their description is omitted.
 FIG. 20 is a diagram showing a configuration example of a vehicle control device 5A equipped with a third electronic control unit (central ECU) 71 according to the third embodiment.
 <Vehicle control device 5A>
 The vehicle control device 5A includes a third electronic control unit (central ECU) 71 in place of the second electronic control unit (automatic driving ECU) 7.
 The third electronic control unit 71 is a central computer responsible for computationally heavy processing such as AI (Artificial Intelligence) and image processing, so an electronic control device that emphasizes computing performance is used. Zone ECUs (not shown) are provided in addition to the third electronic control unit (central ECU) 71. Each zone ECU is responsible for light computations such as sensor control and actuator control within one of the zones into which the vehicle is divided, and an electronic control device that emphasizes safety is used.
 Unlike in the first embodiment, the first electronic control unit (camera ECU) 6B included in the vehicle control device 5A has only the role of transmitting the camera image data 9 acquired from the camera 8 to the in-vehicle network 4 at a predetermined time. The camera image data 9 transmitted to the in-vehicle network 4 by the first electronic control unit (camera ECU) 6B is received by the third electronic control unit (central ECU) 71.
 The third electronic control unit (central ECU) 71 generates the steering control command 13, the accelerator control command 14, and the brake control command 15 based on the camera image data 9. By outputting these commands, the third electronic control unit (central ECU) 71 controls the operations of the steering 16, the accelerator 17, and the brake 18.
 The third electronic control unit (central ECU) 71 can also update the software running on it by means of the known OTA (Over the Air) technology.
 FIG. 21 is a diagram showing the software architecture of the third electronic control unit (central ECU) 71 according to the present embodiment.
 One of the plurality of electronic control devices according to the third embodiment (the third electronic control unit (central ECU) 71) has a plurality of arithmetic processing units (arithmetic processing units 1A and 1B; three cores, namely the first core 11, the second core 12, and the third core 72). The first core 11, the second core 12, and the third core 72 are all shared between application processing and communication. One or more application processing units (application processing unit 22) are provided in each of the plurality of arithmetic processing units (arithmetic processing units 1A and 1B; the first core 11, the second core 12, and the third core 72). One of the plurality of arithmetic processing units (the first core 11) has an update unit (software update unit 73) that updates the functions of the application processing unit (application processing unit 22).
 To the first core 11 are allocated the trajectory generation application module 223 that was installed in the second electronic control unit (automatic driving ECU) 7 according to the first embodiment shown in FIG. 14, as well as the OS 19, the data acquisition task 24, the data disclosure task 25, the communication preparation processing unit 201, and the peripheral access unit 202. The OS 19 of the first core 11 can execute the task activation time management unit 26. A software update unit 73 is also newly allocated to the first core 11.
 To the second core 12 are allocated the control command generation application module 224 that was installed in the second electronic control unit (automatic driving ECU) 7 according to the first embodiment shown in FIG. 14, as well as the OS 19, the data acquisition task 24, the data disclosure task 25, and the communication preparation processing unit 201. The OS 19 of the second core 12 can execute the task activation time management unit 26.
 To the third core 72 are allocated the image recognition application module (vehicle front) 221 and the image recognition application module (vehicle rear) 222 that were installed in the first electronic control unit (camera ECU) 6 according to the first embodiment shown in FIG. 13, as well as the OS 19, the data acquisition task 24, the data disclosure task 25, and the communication preparation processing unit 201.
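The allocation described above can be restated as a static core-to-module map. The dictionary below only summarizes the allocation of FIG. 21 in code form; the string labels are illustrative names, not identifiers from the publication.

```python
# Static allocation of software components to the three cores of the central
# ECU 71, restating FIG. 21 (labels are illustrative, with reference numerals).
CORE_ALLOCATION = {
    "core1": [
        "trajectory_generation_223", "OS_19", "data_acquisition_task_24",
        "data_disclosure_task_25", "communication_preparation_201",
        "peripheral_access_202", "software_update_73",
    ],
    "core2": [
        "control_command_generation_224", "OS_19", "data_acquisition_task_24",
        "data_disclosure_task_25", "communication_preparation_201",
    ],
    "core3": [
        "image_recognition_front_221", "image_recognition_rear_222", "OS_19",
        "data_acquisition_task_24", "data_disclosure_task_25",
        "communication_preparation_201",
    ],
}

# Only core1 (the master core) carries the peripheral access unit and the
# software update unit; the other cores exchange data via shared storage.
updater_cores = [core for core, mods in CORE_ALLOCATION.items()
                 if "software_update_73" in mods]
```

Laying the allocation out this way makes the asymmetry visible: the peripheral access unit 202 and the software update unit 73 exist only on the first core, which is what the master-core arrangement described below relies on.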
 The software update unit 73 installed on the first core 11 can update the application modules installed in the third electronic control unit (central ECU) 71 by means of OTA (Over the Air) technology. That is, the software update unit 73 has the role of updating the image recognition application module (vehicle front) 221, the image recognition application module (vehicle rear) 222, the trajectory generation application module 223, and the control command generation application module 224.
 The software update performed by the software update unit 73 is normally carried out when each application module starts or after it terminates. In many computers, a master core and slave cores are designated, and the master core carries out software updates, including for the slave cores. The core number of the master core is often "0". Therefore, with the first core 11 as the master core, the software update unit 73 is executed on the first core 11. The software update unit 73 can also update the trajectory generation application module 223 and the control command generation application module 224.
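Because updates are carried out only when a module is starting or has finished, the master core can gate an OTA update on the module's execution state. A minimal sketch of that gating follows; the class, its method names, and the version strings are hypothetical illustrations, not an API from the publication.

```python
class OtaUpdater:
    """Runs on the master core; applies updates only outside module execution."""

    def __init__(self):
        self.versions = {}    # module name -> currently installed version
        self.running = set()  # modules currently executing
        self.pending = {}     # module name -> version waiting for a safe window

    def request_update(self, module, version):
        # Defer the update if the module is mid-execution; otherwise apply it
        # immediately (i.e., at start-up or between runs).
        if module in self.running:
            self.pending[module] = version
        else:
            self.versions[module] = version

    def module_finished(self, module):
        # The end of a run is a safe window: apply any deferred update now.
        self.running.discard(module)
        if module in self.pending:
            self.versions[module] = self.pending.pop(module)


ota = OtaUpdater()
ota.running.add("trajectory_generation_223")
ota.request_update("trajectory_generation_223", "v2")  # deferred: module running
ota.module_finished("trajectory_generation_223")       # applied on completion
```

The point of the sketch is the deferral: an update request arriving mid-run is held in `pending` and only committed once the module reaches a boundary, matching the start/end update windows described above.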
 The software update unit 73 can update the functions of the trajectory generation application module 223 and the control command generation application module 224 used for automatic driving at any time. The software update unit 73 can also update the image processing functions that the image recognition application modules 221 and 222 apply to the camera image data 9, thereby improving the image recognition performance of each module.
 In the first embodiment, the data transmitted from the first electronic control unit (camera ECU) 6 to the second electronic control unit (automatic driving ECU) 7 was the camera object detection data 10. In the third embodiment, by contrast, the data transmitted from the first electronic control unit (camera ECU) 6B to the third electronic control unit (central ECU) 71 is the camera image data 9. The third embodiment therefore does not require the processing, present in the first embodiment, of transmitting the camera object detection data 10 from the first electronic control unit (camera ECU) 6 to the second electronic control unit (automatic driving ECU) 7 via the in-vehicle network 4. As a result, the processing in which the third electronic control unit (central ECU) 71 performs image recognition and then generates a trajectory and control commands for automatic driving is shortened, enabling prompt driving control of the vehicle.
 Furthermore, as explained with reference to FIG. 4, the data size of the camera image data 9 is on the order of several megabytes, so the technique disclosed in Non-Patent Document 1, which imposes a limit on the communication bandwidth, could not transmit and receive the camera image data 9. With the technique according to the third embodiment, by contrast, no such bandwidth limit arises, so this arrangement of application modules becomes possible.
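The bandwidth constraint can be made concrete with a back-of-envelope calculation. The frame size, link rate, and cycle length below are illustrative assumptions (the publication only states that the camera image data is on the order of several megabytes):

```python
frame_bytes = 2 * 1024 * 1024   # assumed 2 MB camera frame
link_bits_per_s = 100 * 10**6   # assumed 100 Mbit/s automotive Ethernet link
cycle_s = 0.010                 # assumed 10 ms communication cycle

# Time to move one frame over the link, ignoring protocol overhead.
transfer_s = frame_bytes * 8 / link_bits_per_s
cycles_needed = transfer_s / cycle_s

print(f"one frame takes {transfer_s * 1000:.1f} ms "
      f"= about {cycles_needed:.1f} communication cycles")
```

Under these assumed numbers a single frame occupies the link for well over a dozen cycles, which illustrates why a scheme with a restricted per-cycle communication budget cannot carry raw camera image data, while a scheme without such a restriction can.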
 In the vehicle control device 5A according to the third embodiment described above, the application modules that were distributed between the first electronic control unit (camera ECU) 6 and the second electronic control unit (automatic driving ECU) 7 in the vehicle control device 5 according to the first embodiment are integrated into a single third electronic control unit (central ECU) 71. The software update unit 73 then updates the functions of the application modules at an arbitrary timing. Application modules with improved functions can therefore be used to raise the accuracy of image recognition and vehicle driving control.
 The present invention aims to reduce software development costs by fixing timing. With the timing fixed, the possibility that the input and output timings of each application module vary between before and after an update is eliminated. Software update by OTA is therefore particularly useful for the third electronic control unit (central ECU) 71 according to the third embodiment.
 The vehicle control device 5A according to the third embodiment therefore allows the arrangement of application modules within the vehicle to be changed flexibly. It also has the effect of integrating a plurality of electronic control devices and making it easy to update a plurality of application modules.
 It should be noted that the present invention is not limited to the embodiments described above, and various other applications and modifications may of course be made without departing from the gist of the present invention as set forth in the claims.
 For example, the embodiments described above explain the configurations of the devices and systems in detail and concretely in order to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to configurations that include everything described. It is possible to replace part of the configuration of one embodiment with the configuration of another embodiment, and it is also possible to add the configuration of another embodiment to the configuration of a given embodiment. For part of the configuration of each embodiment, other configurations may also be added, deleted, or substituted.
 The control lines and information lines shown are those considered necessary for the explanation, and not all control lines and information lines in the product are necessarily shown. In practice, almost all components may be considered to be interconnected.
 1...CPU, 1A, 1B...arithmetic processing unit, 2...RAM, 2A...data storage unit, 3...peripheral, 4...in-vehicle network, 5...vehicle control device, 6...first electronic control unit (camera ECU), 7...second electronic control unit (automatic driving ECU), 11...first core, 12...second core, 19...OS, 22...application processing unit, 23...application data storage unit, 24...data acquisition task, 25...data disclosure task, 26...task activation time management unit, 28...communication data storage unit, 100...electronic control device, 201...communication preparation processing unit, 202...peripheral access unit, 221-224...application module

Claims (11)

  1.  An electronic control device comprising:
     an arithmetic processing unit having an application processing unit that performs application processing;
     a data storage unit that stores data processed by the arithmetic processing unit; and
     a peripheral that transmits the data read from the data storage unit to another electronic control device or receives data from another electronic control device,
     wherein a first timing at which the arithmetic processing unit inputs and outputs the data to and from the data storage unit for the application processing, and a second timing at which the arithmetic processing unit passes the data read from the data storage unit to the peripheral and the peripheral transmits the data to the other electronic control device, are fixed in advance.
  2.  The electronic control device according to claim 1,
     wherein the arithmetic processing unit includes the application processing unit and a communication preparation processing unit that prepares application data generated by the application processing for communication,
     the data storage unit includes an application data storage unit that stores the application data and a communication data storage unit that stores communication data generated by the communication preparation processing unit,
     at least one of the arithmetic processing units includes a peripheral access unit that accesses the communication data storage unit and the peripheral, and an activation time management unit that manages activation times at which the peripheral access unit is activated, and
     the activation time management unit operates the application processing unit at the first timing and operates the peripheral access unit at the second timing, based on a predetermined internal time.
  3.  The electronic control device according to claim 2,
     wherein the first timing is fixed as the timing at which the application processing unit inputs and outputs the data to and from the application data storage unit, and
     the second timing is fixed as the timing at which the peripheral access unit transfers the communication data stored in the communication data storage unit to the peripheral.
  4.  The electronic control device according to claim 3, wherein the internal time is corrected by a time synchronization signal received from outside.
  5.  The electronic control device according to claim 4,
     wherein the application data storage unit includes a local area, used only by the application processing unit executing processing, in which each application processing unit temporarily stores data in the middle of the application processing, and a global area in which the application processing unit stores the data resulting from completed processing and publishes the data for use by the other application processing units, and
     the application processing unit accesses the global area at the first timing.
  6.  The electronic control device according to claim 5,
     wherein, based on a predetermined internal time, the arithmetic processing unit activates the communication preparation processing unit and then the peripheral access unit, in this order, to perform a process in which the communication preparation processing unit stores the communication data in the communication data storage unit and a process in which the peripheral access unit stores, in the peripheral, the communication data read from the communication data storage unit, the communication data being the data that the peripheral transmits to the other electronic control device, and
     based on a predetermined internal time, activates the peripheral access unit and then the communication preparation processing unit, in this order, to perform a process in which the peripheral access unit acquires the data that the peripheral has received from the other electronic control device and writes it to the communication data storage unit and a process in which the communication preparation processing unit writes the communication data read from the communication data storage unit to the application data storage unit.
  7.  The electronic control device according to claim 3,
     wherein the communication preparation processing unit is the service layer and the hardware abstraction layer defined in AUTOSAR (registered trademark),
     the peripheral access unit is a driver defined in AUTOSAR, and
     the communication data storage unit is provided between the hardware abstraction layer and the driver.
  8.  The electronic control device according to claim 3, wherein, when the total processing time required to execute the communication preparation processing unit and the peripheral access unit is shorter than the communication time of the data communicated with the other electronic control device, the arithmetic processing unit integrates the communication preparation processing unit and the peripheral access unit into a lightweight communication driver, and the lightweight communication driver performs transmission and reception processing of the communication data.
  9.  The electronic control device according to claim 3, wherein one of a plurality of the arithmetic processing units includes an update unit that updates the functions of the application processing unit.
  10.  The electronic control device according to claim 3,
     wherein a plurality of the electronic control devices communicate the data with each other over an in-vehicle network through their peripherals,
     a completion time of the process in which the transmitting electronic control device transmits the data to the in-vehicle network and a start time of the process in which the receiving electronic control device receives the data from the in-vehicle network are defined, and
     the activation time management unit of each of the plurality of electronic control devices activates the peripheral access unit of each of the plurality of electronic control devices based on the defined completion time and start time.
  11.  The electronic control device according to claim 10,
     wherein one of the plurality of electronic control devices transmits, to the in-vehicle network, camera object detection data that is the result of performing object detection on camera image data captured by a camera, and
     one of the plurality of electronic control devices generates, based on the camera object detection data, at least one of a control command value for a target angle of the steering, a control command value for a target acceleration force of the accelerator, and a control command value for a target damping force of the brake, which are control targets of the vehicle, and transmits the control command value to the control target.
PCT/JP2023/005995 2022-04-26 2023-02-20 Electronic control device WO2023210128A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-072189 2022-04-26
JP2022072189A JP2023161698A (en) 2022-04-26 2022-04-26 Electronic control device

Publications (1)

Publication Number Publication Date
WO2023210128A1 true WO2023210128A1 (en) 2023-11-02

Family

ID=88518559

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/005995 WO2023210128A1 (en) 2022-04-26 2023-02-20 Electronic control device

Country Status (2)

Country Link
JP (1) JP2023161698A (en)
WO (1) WO2023210128A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012204935A (en) * 2011-03-24 2012-10-22 Fujitsu Ten Ltd Communication system and communication apparatus
JP2019510327A (en) * 2016-02-16 2019-04-11 ローベルト ボッシュ ゲゼルシャフト ミット ベシュレンクテル ハフツング Method and apparatus for driving a control device
WO2020090108A1 (en) * 2018-11-02 2020-05-07 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ Fraudulent control prevention system and fraudulent control prevention method
JP2020188330A (en) * 2019-05-13 2020-11-19 日立オートモティブシステムズ株式会社 Vehicle control device and vehicle control system
JP2022014679A (en) * 2020-07-07 2022-01-20 日立Astemo株式会社 Electronic control unit

Also Published As

Publication number Publication date
JP2023161698A (en) 2023-11-08


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23795876

Country of ref document: EP

Kind code of ref document: A1