WO2022004446A1 - 情報処理装置、および情報処理方法、情報処理システム、並びにプログラム - Google Patents

情報処理装置、および情報処理方法、情報処理システム、並びにプログラム Download PDF

Info

Publication number
WO2022004446A1
Authority
WO
WIPO (PCT)
Prior art keywords
update
unit
vehicle
recognition
information processing
Prior art date
Application number
PCT/JP2021/023308
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
竜太 佐藤
Original Assignee
ソニーグループ株式会社
Priority date
Filing date
Publication date
Application filed by ソニーグループ株式会社 filed Critical ソニーグループ株式会社
Priority to EP21832905.0A priority Critical patent/EP4177733A4/en
Priority to US18/003,211 priority patent/US20230244471A1/en
Priority to CN202180045785.2A priority patent/CN115997193A/zh
Publication of WO2022004446A1 publication Critical patent/WO2022004446A1/ja

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/60Software deployment
    • G06F8/65Updates
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/07Responding to the occurrence of a fault, e.g. fault tolerance
    • G06F11/14Error detection or correction of the data by redundancy in operation
    • G06F11/1402Saving, restoring, recovering or retrying
    • G06F11/1415Saving, restoring, recovering or retrying at system level
    • G06F11/1433Saving, restoring, recovering or retrying at system level during software upgrading
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning

Definitions

  • The present disclosure relates to information processing devices, information processing methods, information processing systems, and programs, and in particular to an information processing device, information processing method, information processing system, and program that enable quick and safe update of SW (software programs).
  • SW (software programs) installed in hardware cause various problems as their use progresses; development proceeds to solve the problems that have occurred, and an update SW for updating to the newly developed SW is repeatedly distributed, with the SW being updated by the update SW to improve convenience.
  • This disclosure was made in view of such a situation, and in particular, it is intended to enable quick and safe update by the update SW.
  • the information processing device, the information processing system, and the program of one aspect of the present disclosure include an update unit for updating the SW (software program) and an operation status recognition unit for recognizing the operation status of the SW updated by the update unit.
  • The information processing method of one aspect of the present disclosure is an information processing method of an information processing apparatus including an update unit and an operation status recognition unit, and includes steps in which the update unit updates the SW (software program) and the operation status recognition unit recognizes the operating status of the updated SW.
  • the SW (software program) is updated and the operating status of the updated SW is recognized.
  • In the following, the case where the SW is a recognition unit that realizes object recognition processing, which recognizes objects existing in the surroundings based on images of the surroundings in order to realize the automatic driving of a vehicle capable of automatic driving, will be described as an example.
  • However, the SW to which this disclosure is applied is not limited to a recognition unit that realizes object recognition processing, and may be any SW that can be updated.
  • the recognition unit that realizes the object recognition process as the SW applied to the present disclosure is configured by machine learning.
  • The SW consisting of the recognition unit generated by machine learning can improve its recognition accuracy by repeating further machine learning (re-learning) with learning data collected for learning.
  • the SW management system of FIG. 1 is composed of vehicles M1 to Mn equipped with SW as a recognition unit and a server Sv for managing SW.
  • the vehicles M1 to Mn each include cameras C1 to Cn and recognition units R1 to Rn.
  • The vehicles M1 to Mn, the cameras C1 to Cn, and the recognition units R1 to Rn are distinguished by the identifier n, but they have basically the same configuration; hereafter, when no particular distinction is needed, they are simply referred to as the vehicle M, the camera C, and the recognition unit R.
  • the camera C captures an image of the surroundings of the vehicle M, which is necessary for realizing the automatic driving of the vehicle M.
  • the recognition unit R executes an object recognition process based on the image captured by the camera C, and recognizes an object existing around the vehicle M.
  • Each vehicle M realizes automatic driving based on the recognition result of the recognition unit R.
  • Each of the vehicles M1 to Mn transmits to the server Sv, via the network N, vehicle storage information U1 to Un, which combines the images captured by the cameras C1 to Cn with the recognition results obtained by the object recognition processing of the recognition units R1 to Rn.
  • The server Sv accumulates, as a parameter P, the vehicle storage information U1 to Un transmitted from the vehicles M1 to Mn via the network N, i.e., the combinations of the images captured by the cameras C1 to Cn and the recognition results of the recognition units R1 to Rn.
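  • As a rough illustration only (the record and field names below are hypothetical, not from the patent), the vehicle storage information and its server-side accumulation as the parameter P might be modeled as follows:

```python
# Hypothetical sketch of the vehicle storage information: each record pairs a
# captured camera frame with the recognition result it produced on the vehicle.
from dataclasses import dataclass, field
from typing import List

@dataclass
class VehicleStorageInfo:            # one record U_i sent from vehicle M_i
    vehicle_id: str
    image: bytes                     # encoded frame from camera C_i
    recognition_result: List[str]    # object labels produced by recognition unit R_i

@dataclass
class ParameterP:                    # server-side accumulation used as learning data
    records: List[VehicleStorageInfo] = field(default_factory=list)

    def accumulate(self, info: VehicleStorageInfo) -> None:
        self.records.append(info)
```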
  • the server Sv includes a re-learning unit L of the recognition unit R.
  • The re-learning unit L relearns the current recognition unit R using the parameter P consisting of the vehicle storage information U1 to Un as learning data, generates an update SW from the relearned recognition unit R, distributes it to the vehicles M1 to Mn as distribution information D1 to Dn via the network N, and updates each of the recognition units R1 to Rn.
  • However, the recognition unit R may not be properly updated by the update SW, and inappropriate object recognition processing may be performed, which may make it impossible to properly realize automatic driving.
  • the update SW for updating the recognition unit R needs to be delivered after the safety is confirmed by sufficient simulation.
  • Therefore, the recognition unit R is updated by the update SW starting from a group of vehicles M for which, even if appropriate operation should by any chance fail, the danger is small and safety is ensured.
  • That is, the update SW is delivered to update the recognition unit R, the operating status after the update is confirmed, and when it is confirmed that the operation is sufficient and there is no problem, the update SW is delivered to the other vehicles M.
  • For example, when the recognition unit R is configured to recognize pedestrians among objects, the recognition unit Rm of the vehicle Mm traveling on the highway H, where pedestrians are scarcely present, among the vehicles M1 to Mn is updated by the update SW prior to the other vehicles M, and the pedestrian recognition accuracy after the update is confirmed.
  • Then, when it is confirmed that there is no problem, the update SW is also distributed to the recognition units R of the vehicles M other than the vehicle Mm traveling on the highway so that they can be updated.
  • This makes it possible to update the recognition unit R with the update SW and confirm the presence or absence of defects while ensuring the safety of the vehicles and suppressing the cost of reducing the occurrence of troubles related to the object recognition processing of the recognition unit R updated by the update SW generated by re-learning.
  • the SW management system 10 of the present disclosure is composed of vehicles 1-1 to 1-n, a server 2, and a network 3.
  • Hereinafter, when no particular distinction is needed, the vehicles 1-1 to 1-n are simply referred to as the vehicle 1, and other configurations are referred to in the same manner.
  • Vehicles 1-1 to 1-n have configurations corresponding to vehicles M1 to Mn in FIG. 1, and are vehicles capable of automatic driving, respectively.
  • The vehicles 1-1 to 1-n include cameras 1a-1 to 1a-n that capture images of their surroundings and recognition units 1b-1 to 1b-n that recognize objects existing in the surroundings based on the captured images, and realize automatic driving based on the recognition results of the recognition units 1b-1 to 1b-n.
  • the recognition units 1b-1 to 1b-n mounted on each of the vehicles 1-1 to 1-n are SWs (software programs), respectively, and the update is repeated by the update SW distributed from the server 2.
  • The cameras 1a-1 to 1a-n do not have to have the same configuration, but the recognition units 1b-1 to 1b-n are the same because they are repeatedly updated by the update SW delivered from the server 2.
  • Each of the vehicles 1-1 to 1-n accumulates vehicle storage information combining the images captured by the cameras 1a-1 to 1a-n with the recognition results of the recognition units 1b-1 to 1b-n, and transmits it to the server 2 via the network 3.
  • the server 2 relearns the recognition unit 1b by using the vehicle storage information transmitted from each of the vehicles 1-1 to 1-n via the network 3 as learning data.
  • The server 2 generates an update SW from the relearned recognition unit 1b, distributes it to the vehicles 1-1 to 1-n via the network 3, and causes each of the recognition units 1b-1 to 1b-n to be updated.
  • When distributing the update SW to update the recognition units 1b-1 to 1b-n, the server 2 delivers the update SW first to the vehicle 1 (group), among the vehicles 1-1 to 1-n, on which a problem in the operating state caused by the update of the recognition unit 1b would have only a small effect, and has it updated.
  • the server 2 confirms the operating state of the recognition unit 1b updated by the update SW, and when there is no problem in the operating state, distributes the update SW to the other vehicle 1 to update it.
  • Since only the recognition units 1b of vehicles 1 whose safety is ensured are updated first, the occurrence of fatal problems can be suppressed.
  • Further, since the presence or absence of defects related to the update is confirmed before the update SW is distributed to the other vehicles 1, distribution to vehicles 1 on which a defect in the updated recognition unit 1b would have a large influence takes place only after sufficient safety has been ensured.
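  • As a minimal sketch of this staged rollout (the helper functions are hypothetical stand-ins, not part of the patent), the server-side logic might look like this:

```python
# Staged rollout sketch: deliver the update SW group by group, starting from
# the vehicles least affected by a defect, and halt if a problem is reported.
from typing import Callable, List, Sequence

def staged_rollout(groups: Sequence[List[str]],
                   deliver: Callable[[str], None],
                   group_is_healthy: Callable[[List[str]], bool]) -> bool:
    """`groups` is ordered from lowest to highest update risk."""
    for group in groups:
        for vehicle_id in group:
            deliver(vehicle_id)          # distribute the update SW to this vehicle
        if not group_is_healthy(group):  # operating-status reports after the update
            return False                 # defect detected: stop before riskier groups
    return True                          # all groups updated without problems
```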
  • FIG. 3 is a block diagram showing a configuration example of a vehicle control system 11 which is an example of a mobile device control system of a vehicle 1 to which the present technology is applied.
  • the vehicle control system 11 is provided in the vehicle 1 and performs processing related to driving support and automatic driving of the vehicle 1.
  • The vehicle control system 11 includes a processor 21, a communication unit 22, a map information storage unit 23, a GNSS (Global Navigation Satellite System) receiving unit 24, an external recognition sensor 25, an in-vehicle sensor 26, a vehicle sensor 27, a recording unit 28, a driving support/automatic driving control unit 29, a DMS (Driver Monitoring System) 30, an HMI (Human Machine Interface) 31, and a vehicle control unit 32.
  • The communication network 41 is an in-vehicle communication network compliant with any standard, such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), FlexRay (registered trademark), or Ethernet (registered trademark), and is composed of one or more buses.
  • each part of the vehicle control system 11 may be directly connected by, for example, short-range wireless communication (NFC (Near Field Communication)), Bluetooth (registered trademark), or the like without going through the communication network 41.
  • Hereinafter, when each part of the vehicle control system 11 communicates via the communication network 41, the description of the communication network 41 is omitted; for example, when the processor 21 and the communication unit 22 communicate with each other via the communication network 41, it is simply described that the processor 21 and the communication unit 22 communicate with each other.
  • the processor 21 is composed of various processors such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), and an ECU (Electronic Control Unit), for example.
  • the processor 21 controls the entire vehicle control system 11.
  • the communication unit 22 communicates with various devices inside and outside the vehicle, other vehicles, servers, base stations, etc., and transmits and receives various data.
  • the communication unit 22 receives from the outside a program for updating the software for controlling the operation of the vehicle control system 11, map information, traffic information, information around the vehicle 1, and the like.
  • the communication unit 22 transmits information about the vehicle 1 (for example, data indicating the state of the vehicle 1, recognition result by the recognition unit 73, etc.), information around the vehicle 1, and the like to the outside.
  • the communication unit 22 performs communication corresponding to a vehicle emergency call system such as eCall.
  • the communication method of the communication unit 22 is not particularly limited. Moreover, a plurality of communication methods may be used.
  • For example, the communication unit 22 wirelessly communicates with equipment in the vehicle using a communication method such as wireless LAN, Bluetooth, NFC, or WUSB (Wireless USB).
  • Further, the communication unit 22 performs wired communication with equipment in the vehicle via a connection terminal (and a cable if necessary) (not shown), using a communication method such as USB (Universal Serial Bus), HDMI (High-Definition Multimedia Interface, registered trademark), or MHL (Mobile High-definition Link).
  • the device in the vehicle is, for example, a device that is not connected to the communication network 41 in the vehicle.
  • mobile devices and wearable devices possessed by passengers such as drivers, information devices brought into a vehicle and temporarily installed, and the like are assumed.
  • Further, the communication unit 22 communicates, via a base station or an access point, with a server or the like existing on an external network (for example, the Internet, a cloud network, or a network peculiar to a business operator) using a wireless communication system such as 4G (4th generation mobile communication system), 5G (5th generation mobile communication system), LTE (Long Term Evolution), or DSRC (Dedicated Short Range Communications).
  • Further, the communication unit 22 uses P2P (Peer To Peer) technology to communicate with a terminal existing in the vicinity of the vehicle (for example, a terminal of a pedestrian or a store, or an MTC (Machine Type Communication) terminal).
  • the communication unit 22 performs V2X communication.
  • V2X communication includes, for example, vehicle-to-vehicle (Vehicle to Vehicle) communication with other vehicles, vehicle-to-infrastructure (Vehicle to Infrastructure) communication with roadside devices, vehicle-to-home (Vehicle to Home) communication, and vehicle-to-pedestrian (Vehicle to Pedestrian) communication with terminals carried by pedestrians.
  • the communication unit 22 receives electromagnetic waves transmitted by a vehicle information and communication system (VICS (Vehicle Information and Communication System), registered trademark) such as a radio wave beacon, an optical beacon, and FM multiplex broadcasting.
  • the map information storage unit 23 stores a map acquired from the outside and a map created by the vehicle 1.
  • the map information storage unit 23 stores a three-dimensional high-precision map, a global map that is less accurate than the high-precision map and covers a wide area, and the like.
  • the high-precision map is, for example, a dynamic map, a point cloud map, a vector map (also referred to as an ADAS (Advanced Driver Assistance System) map), or the like.
  • the dynamic map is, for example, a map composed of four layers of dynamic information, quasi-dynamic information, quasi-static information, and static information, and is provided from an external server or the like.
  • the point cloud map is a map composed of point clouds (point cloud data).
  • a vector map is a map in which information such as lanes and signal positions is associated with a point cloud map.
  • The point cloud map and the vector map may be provided from, for example, an external server or the like, or may be created by the vehicle 1 as maps for matching with a local map (described later) based on the sensing results of the radar 52, LiDAR 53, and the like, and stored in the map information storage unit 23. Further, when a high-precision map is provided from an external server or the like, map data of, for example, several hundred meters square relating to the planned route on which the vehicle 1 is about to travel is acquired from the server in order to reduce the communication volume.
  • the GNSS receiving unit 24 receives the GNSS signal from the GNSS satellite and supplies it to the traveling support / automatic driving control unit 29.
  • the external recognition sensor 25 includes various sensors used for recognizing the external situation of the vehicle 1, and supplies sensor data from each sensor to each part of the vehicle control system 11.
  • the type and number of sensors included in the external recognition sensor 25 are arbitrary.
  • The external recognition sensor 25 includes a camera 51, a radar 52, a LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) 53, and an ultrasonic sensor 54.
  • the number of cameras 51, radar 52, LiDAR 53, and ultrasonic sensors 54 is arbitrary, and examples of sensing areas of each sensor will be described later.
  • As the camera 51, for example, a camera of any shooting method, such as a ToF (Time of Flight) camera, a stereo camera, a monocular camera, or an infrared camera, is used as needed.
  • Further, the external recognition sensor 25 includes an environment sensor for detecting weather, meteorological conditions, brightness, and the like.
  • the environment sensor includes, for example, a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, an illuminance sensor, and the like.
  • the external recognition sensor 25 includes a microphone used for detecting the sound around the vehicle 1 and the position of the sound source.
  • the in-vehicle sensor 26 includes various sensors for detecting information in the vehicle, and supplies sensor data from each sensor to each part of the vehicle control system 11.
  • the type and number of sensors included in the in-vehicle sensor 26 are arbitrary.
  • the in-vehicle sensor 26 includes a camera, a radar, a seating sensor, a steering wheel sensor, a microphone, a biological sensor, and the like.
  • As the camera of the in-vehicle sensor 26, for example, a camera of any shooting method, such as a ToF camera, a stereo camera, a monocular camera, or an infrared camera, can be used.
  • The biosensor is provided on, for example, a seat, the steering wheel, or the like, and detects various kinds of biometric information of an occupant such as the driver.
  • the vehicle sensor 27 includes various sensors for detecting the state of the vehicle 1, and supplies sensor data from each sensor to each part of the vehicle control system 11.
  • the type and number of sensors included in the vehicle sensor 27 are arbitrary.
  • the vehicle sensor 27 includes a speed sensor, an acceleration sensor, an angular velocity sensor (gyro sensor), and an inertial measurement unit (IMU (Inertial Measurement Unit)).
  • the vehicle sensor 27 includes a steering angle sensor that detects the steering angle of the steering wheel, a yaw rate sensor, an accelerator sensor that detects the operation amount of the accelerator pedal, and a brake sensor that detects the operation amount of the brake pedal.
  • Further, the vehicle sensor 27 includes a rotation sensor that detects the rotation speed of the engine or motor, an air pressure sensor that detects tire air pressure, a slip ratio sensor that detects the tire slip ratio, and a wheel speed sensor that detects the rotation speed of the wheels.
  • the vehicle sensor 27 includes a battery sensor that detects the remaining amount and temperature of the battery, and an impact sensor that detects an impact from the outside.
  • The recording unit 28 includes, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, and the like.
  • the recording unit 28 records various programs, data, and the like used by each unit of the vehicle control system 11.
  • the recording unit 28 records a rosbag file including messages sent and received by the ROS (Robot Operating System) in which an application program related to automatic driving operates.
  • the recording unit 28 includes an EDR (Event Data Recorder) and a DSSAD (Data Storage System for Automated Driving), and records information on the vehicle 1 before and after an event such as an accident.
  • the driving support / automatic driving control unit 29 controls the driving support and automatic driving of the vehicle 1.
  • The driving support/automatic driving control unit 29 includes an analysis unit 61, an action planning unit 62, and a motion control unit 63.
  • the analysis unit 61 analyzes the vehicle 1 and the surrounding conditions.
  • the analysis unit 61 includes a self-position estimation unit 71, a sensor fusion unit 72, and a recognition unit 73.
  • the self-position estimation unit 71 estimates the self-position of the vehicle 1 based on the sensor data from the external recognition sensor 25 and the high-precision map stored in the map information storage unit 23. For example, the self-position estimation unit 71 generates a local map based on the sensor data from the external recognition sensor 25, and estimates the self-position of the vehicle 1 by matching the local map with the high-precision map.
  • The position of the vehicle 1 is based on, for example, the center of the rear wheel axle.
  • The local map is, for example, a three-dimensional high-precision map created by using a technology such as SLAM (Simultaneous Localization and Mapping), an occupancy grid map (Occupancy Grid Map), or the like.
  • the three-dimensional high-precision map is, for example, the point cloud map described above.
  • The occupancy grid map is a map that divides the three-dimensional or two-dimensional space around the vehicle 1 into grids of a predetermined size and shows the occupancy state of objects in units of grids.
  • the occupied state of an object is indicated by, for example, the presence or absence of an object and the probability of existence.
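  • As an illustrative sketch only (the resolution and area below are assumed, not from the patent), an occupancy grid can be represented as an array of per-cell occupancy probabilities:

```python
# Occupancy grid sketch: the space around the vehicle is divided into
# fixed-size cells, each holding the probability that an object occupies it.
import numpy as np

CELL_SIZE_M = 0.5                       # assumed grid resolution
grid = np.zeros((100, 100))             # 50 m x 50 m area around the vehicle

def mark_occupied(x_m: float, y_m: float, p: float = 0.9) -> None:
    """Record that an object was sensed at local position (x_m, y_m) >= 0."""
    i, j = int(x_m / CELL_SIZE_M), int(y_m / CELL_SIZE_M)
    grid[i, j] = max(grid[i, j], p)     # keep the highest observed probability
```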
  • the local map is also used, for example, in the detection process and the recognition process of the external situation of the vehicle 1 by the recognition unit 73.
  • the self-position estimation unit 71 may estimate the self-position of the vehicle 1 based on the GNSS signal and the sensor data from the vehicle sensor 27.
  • The sensor fusion unit 72 performs sensor fusion processing to obtain new information by combining a plurality of different types of sensor data (for example, image data supplied from the camera 51 and sensor data supplied from the radar 52). Methods for combining different types of sensor data include integration, fusion, and association.
  • the recognition unit 73 performs detection processing and recognition processing of the external situation of the vehicle 1.
  • The recognition unit 73 performs detection processing and recognition processing of the external situation of the vehicle 1 based on information from the external recognition sensor 25, information from the self-position estimation unit 71, information from the sensor fusion unit 72, and the like.
  • the recognition unit 73 performs detection processing, recognition processing, and the like of objects around the vehicle 1.
  • the object detection process is, for example, a process of detecting the presence / absence, size, shape, position, movement, etc. of an object.
  • the object recognition process is, for example, a process of recognizing an attribute such as an object type or identifying a specific object.
  • the detection process and the recognition process are not always clearly separated and may overlap.
  • For example, the recognition unit 73 detects objects around the vehicle 1 by performing clustering that classifies point clouds based on sensor data from the LiDAR 53, the radar 52, or the like into masses of points. As a result, the presence or absence, size, shape, and position of objects around the vehicle 1 are detected.
  • the recognition unit 73 detects the movement of an object around the vehicle 1 by performing tracking that follows the movement of a mass of point clouds classified by clustering. As a result, the velocity and the traveling direction (movement vector) of the object around the vehicle 1 are detected.
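  • As an illustrative sketch (the patent does not name a specific clustering algorithm; DBSCAN is used here as one common choice, and the point data is randomly generated):

```python
# Cluster LiDAR-like points into masses, then derive per-cluster centroids
# that can be tracked across frames to estimate a movement vector.
import numpy as np
from sklearn.cluster import DBSCAN

points = np.random.rand(500, 3) * 20.0           # stand-in for one scan (x, y, z) in meters
labels = DBSCAN(eps=0.7, min_samples=5).fit_predict(points)

for cluster_id in set(labels) - {-1}:            # label -1 marks noise points
    centroid = points[labels == cluster_id].mean(axis=0)
    # tracking step (not shown): associate this centroid with the nearest
    # centroid from the previous frame and difference the positions to get
    # the object's velocity and traveling direction (movement vector)
```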
  • the recognition unit 73 recognizes the type of an object around the vehicle 1 by performing an object recognition process such as semantic segmentation on the image data supplied from the camera 51.
  • the object to be detected or recognized is assumed to be, for example, a vehicle, a person, a bicycle, an obstacle, a structure, a road, a traffic light, a traffic sign, a road sign, or the like.
  • For example, the recognition unit 73 performs recognition processing of the traffic rules around the vehicle 1 based on the map stored in the map information storage unit 23, the estimation result of the self-position, and the recognition results of objects around the vehicle 1.
  • Through this processing, for example, the position and state of signals, the contents of traffic signs and road markings, the contents of traffic regulations, the lanes in which the vehicle can travel, and the like are recognized.
  • the recognition unit 73 performs recognition processing of the environment around the vehicle 1.
  • As the surrounding environment to be recognized, for example, weather, temperature, humidity, brightness, road surface conditions, and the like are assumed.
  • the action planning unit 62 creates an action plan for the vehicle 1. For example, the action planning unit 62 creates an action plan by performing route planning and route tracking processing.
  • route planning is a process of planning a rough route from the start to the goal.
  • Route planning also includes trajectory generation (local path planning): within the route planned by the route planning, a trajectory is generated on which the vehicle 1 can travel safely and smoothly in its vicinity, taking the motion characteristics of the vehicle 1 into consideration.
  • Route tracking is a process of planning an operation for safely and accurately traveling on a route planned by route planning within a planned time. For example, the target speed and the target angular velocity of the vehicle 1 are calculated.
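  • As a toy sketch of the route-tracking computation (the gains and caps below are illustrative assumptions, not values from the patent):

```python
# Given the vehicle pose and the next waypoint, derive a target speed and a
# target angular velocity for the current control interval.
import math

def route_tracking_targets(pose_xy_theta, waypoint_xy, dt=0.1):
    x, y, theta = pose_xy_theta
    dx, dy = waypoint_xy[0] - x, waypoint_xy[1] - y
    distance = math.hypot(dx, dy)
    heading_error = math.atan2(dy, dx) - theta
    heading_error = (heading_error + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi]
    target_speed = min(distance / dt, 10.0)        # cap at an assumed 10 m/s
    target_angular_velocity = heading_error / dt
    return target_speed, target_angular_velocity
```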
  • the motion control unit 63 controls the motion of the vehicle 1 in order to realize the action plan created by the action plan unit 62.
  • the motion control unit 63 controls the steering control unit 81, the brake control unit 82, and the drive control unit 83 so that the vehicle 1 travels on the track calculated by the track plan. Take control.
  • the motion control unit 63 performs coordinated control for the purpose of realizing ADAS functions such as collision avoidance or impact mitigation, follow-up travel, vehicle speed maintenance travel, collision warning of own vehicle, and lane deviation warning of own vehicle.
  • the motion control unit 63 performs coordinated control for the purpose of automatic driving or the like in which the vehicle autonomously travels without being operated by the driver.
  • the DMS 30 performs driver authentication processing, driver status recognition processing, and the like based on sensor data from the in-vehicle sensor 26 and input data input to the HMI 31.
  • As the state of the driver to be recognized, for example, physical condition, arousal level, concentration level, fatigue level, line-of-sight direction, degree of drunkenness, driving operation, posture, and the like are assumed.
  • the DMS 30 may perform authentication processing for passengers other than the driver and recognition processing for the status of the passenger. Further, for example, the DMS 30 may perform the recognition processing of the situation inside the vehicle based on the sensor data from the sensor 26 in the vehicle. As the situation inside the vehicle to be recognized, for example, temperature, humidity, brightness, odor, etc. are assumed.
  • the HMI 31 is used for inputting various data and instructions, generates an input signal based on the input data and instructions, and supplies the input signal to each part of the vehicle control system 11.
  • The HMI 31 includes operation devices such as a touch panel, buttons, a microphone, switches, and levers, as well as operation devices that accept input by methods other than manual operation, such as voice or gesture.
  • the HMI 31 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile device or a wearable device that supports the operation of the vehicle control system 11.
  • the HMI 31 performs output control for generating and outputting visual information, auditory information, and tactile information for the passenger or the outside of the vehicle, and for controlling output contents, output timing, output method, and the like.
  • the visual information is, for example, information shown by an image such as an operation screen, a state display of the vehicle 1, a warning display, a monitor image showing a situation around the vehicle 1, or light.
  • Auditory information is, for example, information indicated by voice such as guidance, warning sounds, and warning messages.
  • the tactile information is information given to the passenger's tactile sensation by, for example, force, vibration, movement, or the like.
  • As a device that outputs visual information, for example, a display device, a projector, a navigation device, an instrument panel, a CMS (Camera Monitoring System), an electronic mirror, a lamp, and the like are assumed.
  • In addition to a device having an ordinary display, the display device may be a device that displays visual information within the occupant's field of view, such as a head-up display, a transmissive display, or a wearable device having an AR (Augmented Reality) function.
  • As a device that outputs auditory information, for example, audio speakers, headphones, earphones, and the like are assumed.
  • As a device that outputs tactile information, for example, a haptics element using haptics technology or the like is assumed.
  • the haptic element is provided on, for example, a steering wheel, a seat, or the like.
  • the vehicle control unit 32 controls each part of the vehicle 1.
  • the vehicle control unit 32 includes a steering control unit 81, a brake control unit 82, a drive control unit 83, a body system control unit 84, a light control unit 85, and a horn control unit 86.
  • the steering control unit 81 detects and controls the state of the steering system of the vehicle 1.
  • the steering system includes, for example, a steering mechanism including a steering wheel, electric power steering, and the like.
  • the steering control unit 81 includes, for example, a control unit such as an ECU that controls the steering system, an actuator that drives the steering system, and the like.
  • the brake control unit 82 detects and controls the state of the brake system of the vehicle 1.
  • the brake system includes, for example, a brake mechanism including a brake pedal and the like, ABS (Antilock Brake System) and the like.
  • the brake control unit 82 includes, for example, a control unit such as an ECU that controls the brake system, an actuator that drives the brake system, and the like.
  • the drive control unit 83 detects and controls the state of the drive system of the vehicle 1.
  • The drive system includes, for example, an accelerator pedal, a driving force generator for generating a driving force, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, and the like.
  • the drive control unit 83 includes, for example, a control unit such as an ECU that controls the drive system, an actuator that drives the drive system, and the like.
  • the body system control unit 84 detects and controls the state of the body system of the vehicle 1.
  • the body system includes, for example, a keyless entry system, a smart key system, a power window device, a power seat, an air conditioner, an airbag, a seat belt, a shift lever, and the like.
  • the body system control unit 84 includes, for example, a control unit such as an ECU that controls the body system, an actuator that drives the body system, and the like.
  • the light control unit 85 detects and controls various light states of the vehicle 1. As the light to be controlled, for example, a headlight, a backlight, a fog light, a turn signal, a brake light, a projection, a bumper display, or the like is assumed.
  • the light control unit 85 includes a control unit such as an ECU that controls the light, an actuator that drives the light, and the like.
  • the horn control unit 86 detects and controls the state of the car horn of the vehicle 1.
  • the horn control unit 86 includes, for example, a control unit such as an ECU that controls the car horn, an actuator that drives the car horn, and the like.
  • FIG. 4 is a diagram showing an example of a sensing region by a camera 51, a radar 52, a LiDAR 53, and an ultrasonic sensor 54 of the external recognition sensor 25 of FIG.
  • the sensing area 101F and the sensing area 101B show an example of the sensing area of the ultrasonic sensor 54.
  • the sensing region 101F covers the periphery of the front end of the vehicle 1.
  • the sensing region 101B covers the periphery of the rear end of the vehicle 1.
  • the sensing results in the sensing area 101F and the sensing area 101B are used, for example, for parking support of the vehicle 1.
  • the sensing area 102F to the sensing area 102B show an example of the sensing area of the radar 52 for a short distance or a medium distance.
  • the sensing area 102F covers a position farther than the sensing area 101F in front of the vehicle 1.
  • the sensing region 102B covers the rear of the vehicle 1 to a position farther than the sensing region 101B.
  • the sensing area 102L covers the rear periphery of the left side surface of the vehicle 1.
  • the sensing region 102R covers the rear periphery of the right side surface of the vehicle 1.
  • the sensing result in the sensing area 102F is used, for example, for detecting a vehicle, a pedestrian, or the like existing in front of the vehicle 1.
  • the sensing result in the sensing region 102B is used, for example, for a collision prevention function behind the vehicle 1.
  • the sensing results in the sensing area 102L and the sensing area 102R are used, for example, for detecting an object in a blind spot on the side of the vehicle 1.
  • the sensing area 103F to the sensing area 103B show an example of the sensing area by the camera 51.
  • the sensing area 103F covers a position farther than the sensing area 102F in front of the vehicle 1.
  • the sensing region 103B covers the rear of the vehicle 1 to a position farther than the sensing region 102B.
  • the sensing area 103L covers the periphery of the left side surface of the vehicle 1.
  • the sensing region 103R covers the periphery of the right side surface of the vehicle 1.
  • the sensing result in the sensing area 103F is used, for example, for recognition of traffic lights and traffic signs, lane departure prevention support system, and the like.
  • the sensing result in the sensing area 103B is used, for example, for parking assistance, a surround view system, and the like.
  • the sensing results in the sensing area 103L and the sensing area 103R are used, for example, in a surround view system or the like.
  • the sensing area 104 shows an example of the sensing area of LiDAR53.
  • the sensing region 104 covers a position far from the sensing region 103F in front of the vehicle 1.
  • the sensing area 104 has a narrower range in the left-right direction than the sensing area 103F.
  • the sensing result in the sensing area 104 is used for, for example, emergency braking, collision avoidance, pedestrian detection, and the like.
  • the sensing area 105 shows an example of the sensing area of the radar 52 for a long distance.
  • the sensing region 105 covers a position farther than the sensing region 104 in front of the vehicle 1.
  • the sensing area 105 has a narrower range in the left-right direction than the sensing area 104.
  • the sensing result in the sensing region 105 is used, for example, for ACC (Adaptive Cruise Control) or the like.
  • each sensor may have various configurations other than those shown in FIG. Specifically, the ultrasonic sensor 54 may be made to sense the side of the vehicle 1, or the LiDAR 53 may be made to sense the rear of the vehicle 1.
  • the camera 51 and the recognition unit 73 in FIG. 3 have a configuration corresponding to the camera 1a and the recognition unit 1b in FIG.
  • The server 2 is composed of a processor 111, an input unit 112, an output unit 113, a storage unit 114, a communication unit 115, a drive 116, and a removable storage medium 117, which are connected to one another via a bus 118 so that data and programs can be exchanged.
  • The processor 111 controls the entire operation of the server 2 and manages the update of the recognition unit 73 as SW.
  • the input unit 112 is composed of an input device such as a keyboard and a mouse for which a user inputs an operation command, and supplies various input signals to the processor 111.
  • The output unit 113 outputs and displays images of the operation screen and of processing results, controlled and supplied by the processor 111, on a display device such as an LCD (Liquid Crystal Display) or an organic EL (Electro Luminescence) display.
  • the storage unit 114 is composed of an HDD (Hard Disk Drive), SSD (Solid State Drive), semiconductor memory, or the like, and is controlled by the processor 111 to write or read various data and programs including content data.
  • The communication unit 115 is controlled by the processor 111 and sends and receives various data and programs to and from various devices via a communication network typified by a LAN (Local Area Network), by wire or wirelessly (not shown).
  • The drive 116 reads and writes data from and to a removable storage medium 117 such as a magnetic disk (including a flexible disk), an optical disk (including a CD-ROM (Compact Disc-Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disk (including an MD (Mini Disc)), or a semiconductor memory.
  • the processor 21 of the vehicle 1 realizes the functions as the control unit 201, the operation status recognition unit 202, the operation status report unit 203, and the update unit 204.
  • the control unit 201 controls the entire operation of the vehicle control system 11 and outputs various control signals.
  • For example, the control unit 201 outputs a control signal for controlling various operations of the vehicle 1 to the vehicle control unit 32 based on the object recognition result produced by the recognition unit 73 from the image captured by the camera 51.
  • control unit 201 controls the operation of the operation status recognition unit 202 to recognize the operation status of the recognition unit 73.
  • Further, the control unit 201 controls the communication unit 22 to transmit the vehicle information required for grouping the vehicles 1 when it is requested by the server 2.
  • the vehicle information required for grouping the vehicles 1 will be described later when the configuration of the server 2 is described.
  • the operation status recognition unit 202 recognizes the operation status of the recognition unit 73 based on the image captured by the camera 51 and the object recognition result of the recognition unit 73 according to the image captured by the camera 51.
  • For example, in addition to the image captured by the camera 51 and the object recognition result of the recognition unit 73 based on that image, the operation status recognition unit 202 uses the map information in the map information storage unit 23, position information based on signals from the GNSS receiving unit 24, and the external recognition sensor 25 to identify the objects that should be recognizable from the map information corresponding to the current position, compares them with the object recognition result, determines whether or not the recognition unit 73 is operating properly, recognizes the determination result as the operating status, and outputs it to the operation status report unit 203.
  • That is, the operating status recognition unit 202 reads out from the map information the objects existing within the angle of view of the camera 51, based on the position specified from the GNSS signal of the GNSS receiving unit 24 and the signals of the external recognition sensor 25 and on the direction of the vehicle 1 at that position, compares them with the recognition results of the recognition unit 73, and recognizes the comparison result as the operating status.
  • If the recognition result of the recognition unit 73 at that time matches a specific sign or a specific building read from the map information, the operating status is regarded as appropriate; if it does not match, the operating status is regarded as not appropriate.
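  • A minimal sketch of this map-based check, assuming hypothetical data structures (the map lookup keyed by position and heading is an illustration, not the patent's format):

```python
# Compare what the map says should be visible at the current pose with what
# the updated recognition unit actually reported.
from typing import Dict, List, Set, Tuple

MapInfo = Dict[Tuple[Tuple[float, float], int], List[str]]  # (position, heading bucket) -> object names

def expected_objects(map_info: MapInfo, position: Tuple[float, float],
                     heading_deg: float) -> Set[str]:
    """Objects (e.g. a specific sign or building) inside the camera's view."""
    bucket = int(round(heading_deg / 10.0) * 10) % 360
    return set(map_info.get((position, bucket), []))

def operating_status_ok(recognized: List[str], map_info: MapInfo,
                        position: Tuple[float, float], heading_deg: float) -> bool:
    # appropriate if every object expected from the map was actually recognized
    return expected_objects(map_info, position, heading_deg) <= set(recognized)
```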
  • The object to be compared with the recognition result may be a specific sign, a specific building, or the like specified from the map information, or may be a dedicated marker for confirming the operating status of the recognition unit 73.
  • Further, the object to be compared with the recognition result may be an object that can be recognized in association with a phenomenon expected to occur at a given position and time zone, for example, vehicles or pedestrians in a traffic jam that is expected from the position and time zone. In this case, whether or not it can be recognized that the vehicles or pedestrians are moving may also be used to determine the operating status of the recognition unit 73.
  • Further, the object to be compared with the recognition result may be a vehicle-actuated traffic signal that operates via V2X communication or on the approach of the vehicle 1, a parking lot gate bar, or the like.
  • In this case, whether or not a change in operation, such as the signal changing from red to green as the vehicle 1 approaches or the gate bar opening and closing, is obtained as a recognition result may be used to determine the operating status.
  • Further, the operation status recognition unit 202 may back up the recognition unit 73 before the update, compare the recognition rate of the pre-update recognition unit 73 with that of the post-update recognition unit 73, and regard the comparison result as the operating status. In this case, when the recognition rate of the post-update recognition unit 73 has not fallen below that of the pre-update recognition unit 73 and there is no deterioration, it may be regarded that an appropriate update has been made; conversely, when the recognition rate has fallen and there is deterioration, it may be regarded that an inappropriate update has been made.
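  • A minimal sketch of this before/after comparison, assuming a held-back copy of the pre-update recognition unit and a common labeled evaluation set (both assumptions, not specified by the patent):

```python
# Compare pre-update and post-update recognition rates on the same samples.
from typing import Callable, List, Tuple

Recognizer = Callable[[bytes], str]                 # image -> predicted label

def recognition_rate(recognizer: Recognizer,
                     samples: List[Tuple[bytes, str]]) -> float:
    hits = sum(1 for image, truth in samples if recognizer(image) == truth)
    return hits / len(samples)

def update_is_appropriate(old_unit: Recognizer, new_unit: Recognizer,
                          samples: List[Tuple[bytes, str]]) -> bool:
    # appropriate when the post-update rate has not deteriorated
    return recognition_rate(new_unit, samples) >= recognition_rate(old_unit, samples)
```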
  • The operation status report unit 203 determines whether or not a report to the server 2 is necessary based on the operating status information supplied from the operation status recognition unit 202, and when it determines that a report is necessary, controls the communication unit 22 to transmit the image captured by the camera 51 and the recognition result to the server 2.
  • For example, when the operating status is not appropriate, the operating status reporting unit 203 reports the image captured by the camera 51 and the recognition result to the server 2.
  • Even when the operating status is appropriate, the operating status reporting unit 203 may report the image captured by the camera 51 and the recognition result to the server 2.
  • That is, the operation status reporting unit 203 may report the operating status, the image captured by the camera 51, and the recognition result to the server 2 regardless of whether or not the operating status is appropriate.
  • The update unit 204 controls the communication unit 22 to receive the update SW for the recognition unit 73 from the server 2, expands the information as necessary, and updates the recognition unit 73 to the state relearned by the server 2.
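  • A hedged sketch of this update step (the file paths and the download helper below are hypothetical; the backup mirrors the pre-update copy mentioned above for the recognition-rate comparison):

```python
# Receive the update SW, keep a backup of the current recognition unit, and
# swap in the relearned version; the backup also enables rollback.
import shutil

RECOGNIZER_PATH = "/opt/recognizer/model.bin"       # assumed install location
BACKUP_PATH = RECOGNIZER_PATH + ".bak"

def apply_update(download_update_sw) -> None:
    shutil.copyfile(RECOGNIZER_PATH, BACKUP_PATH)   # preserve pre-update unit
    new_model: bytes = download_update_sw()         # bytes of the relearned unit
    with open(RECOGNIZER_PATH, "wb") as f:
        f.write(new_model)

def rollback() -> None:
    shutil.copyfile(BACKUP_PATH, RECOGNIZER_PATH)   # restore the pre-update unit
```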
  • The processor 111 of the server 2 realizes the functions of a vehicle information collection unit 231, a grouping unit 232, a distribution order determination unit 233, a distribution status confirmation unit 234, a distribution planning unit 235, a distribution unit 236, a re-learning unit 237, and an update SW release unit 238.
  • the vehicle information collecting unit 231 controls the communication unit 115 to request vehicle information for grouping from the vehicle 1, collects it, and outputs it to the grouping unit 232.
  • the vehicle information is various information required for grouping the vehicle 1 by the grouping unit 232, which will be described later.
  • Specifically, the vehicle information is various information required by the grouping unit 232 (described later) to group the vehicles 1, such as the vehicle type, position information, travel history including the detection history of the external recognition sensor 25 and routes traveled in the past, total mileage, and weather information corresponding to the current position information of the vehicle 1.
  • the grouping unit 232 groups the vehicles 1-1 to 1-n based on the vehicle information collected from each of the vehicles 1, and outputs the grouping result to the distribution order determination unit 233.
  • The grouping performed here classifies the vehicles 1-1 to 1-n into at least two groups in order to set the order in which the update SW for updating the recognition unit 73 is distributed.
  • The update SW is distributed starting from the group of vehicles 1 that would not be put in a dangerous state, that is, in ascending order of danger, even if the recognition unit 73 could not operate properly after the update.
  • The distribution order determination unit 233 determines the order in which the update SW is distributed, starting from the group most likely to be safe (whose safety is ensured), based on the grouping result supplied from the grouping unit 232, and outputs information on the distribution order for the grouping result to the distribution planning unit 235.
  • the distribution status confirmation unit 234 controls the communication unit 115 to confirm the distribution status indicating which order of the vehicle 1 group the update SW is being distributed to, and the confirmed distribution status information is distributed to the distribution planning unit 235. Output to.
  • Further, the distribution status confirmation unit 234 controls the communication unit 115 to confirm, as the distribution status, the information transmitted from the vehicle 1 indicating whether or not the object recognition processing by the recognition unit 73 updated by the distributed update SW is appropriate, and outputs it to the distribution planning unit 235.
  • The distribution planning unit 235 plans the order and timing for distributing the update SW from the distribution order information supplied from the distribution order determination unit 233 and the distribution status information supplied from the distribution status confirmation unit 234, and outputs the result to the distribution unit 236 as a distribution plan.
  • the re-learning unit 237 accumulates vehicle storage information that constitutes parameters used as re-learning data consisting of an image captured by the camera 51 supplied from the vehicle 1 and an object recognition result by the recognition unit 73. Using the accumulated vehicle storage information as re-learning data, the corresponding recognition unit 73 is re-learned to generate an update SW as a re-learning result, which is output to the update SW release unit 238.
  • The update SW release unit 238 confirms, by simulation, the operation of the recognition unit 73 updated by the update SW generated by the re-learning unit 237, determines whether or not it can be distributed, and outputs the update SW judged distributable to the distribution unit 236.
  • The distribution unit 236 controls the communication unit 115 to distribute the update SW supplied from the update SW release unit 238 to the vehicles 1 according to the distribution plan supplied from the distribution planning unit 235, and causes the recognition unit 73 to be updated.
  • The grouping of the vehicles 1-1 to 1-n by the grouping unit 232 is processing intended to form groups of vehicles for confirming the state when the recognition unit 73 is actually updated by the update SW to be distributed.
  • the vehicles 1-1 to 1-n are grouped as shown in FIG. 8, for example.
  • vehicles 1-1 to 1-n are grouped into groups G1 to Gx in order from the top.
  • For example, the group G1 is composed of the vehicles 1-11 to 1-X among the vehicles 1-1 to 1-n, the group G2 is composed of the vehicles 1-21 to 1-Y, and the group Gx is composed of the vehicles 1-31 to 1-Z.
  • Even if some trouble occurs, the safest group is the group G1, the next safest group is the group G2, and the most dangerous group is the group Gx.
  • Accordingly, the distribution order determination unit 233 determines the order of distributing the update SW as the groups G1, G2, ... Gx.
  • the grouping unit 232 scores the risk when some trouble occurs by updating the recognition unit 73 by the update SW in the vehicles 1-1 to 1-n, and groups according to the score.
  • Hereinafter, the risk that arises when some trouble occurs due to the recognition unit 73 being updated by the update SW is referred to as an update risk.
  • That is, grouping is to classify the vehicles 1-1 to 1-n into a plurality of groups according to the update risk.
  • The update SW is delivered first to the first group, consisting of vehicles 1 that are sufficiently safe and whose number is large enough for the influence of the update risk to be determined stochastically with a predetermined accuracy; then, using the recognition results of the recognition unit 73 updated by the update SW, the presence or absence of defects during traveling is confirmed.
  • Thereafter, the update SW is distributed in order from the higher-ranked groups, and each time it is confirmed that there is no problem, it is sequentially delivered to groups with a higher update risk.
  • As a result, the vehicles 1 of a group to which the update SW is distributed later receive it in a state where the operating status of the recognition unit 73 updated by the update SW has been sufficiently verified on the vehicles 1 of the groups distributed before them, so the later a group is distributed to, the more safely the recognition unit 73 can be updated.
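  • The patent does not give a formula for sizing the first group, but one simple reading of "stochastically determined with a predetermined accuracy" is the following: if a defect manifests on any one vehicle with probability p, then n vehicles surface it at least once with probability 1 - (1 - p)^n, so n >= ln(1 - confidence) / ln(1 - p).

```python
# Smallest first-group size that detects a per-vehicle defect of rate p_defect
# at least once with the given confidence (an assumed model, not the patent's).
import math

def first_group_size(p_defect: float, confidence: float) -> int:
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_defect))

print(first_group_size(0.01, 0.95))   # -> 299 vehicles for p = 1%, 95% confidence
```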
  • For example, the grouping unit 232 may regard the update risk as higher as the number of accidents and the number of vehicles in a region increase, according to the position information of the vehicles 1-1 to 1-n and the number of accidents and the number of vehicles in each region.
  • In this case, the grouping unit 232 sets lower scores for the vehicles 1 in regions with many accidents and many automobiles, sets the ranking according to the scores set in this way, and may set the groups G1 to Gx from the top.
  • Further, the grouping unit 232 may perform grouping according to the position information of each vehicle 1 and the weather at each current location: for example, a low score may be set for a vehicle 1 in a region with stormy weather, where the recognition accuracy of object recognition tends to deteriorate, and conversely a high score may be set for a vehicle 1 in a region with fine weather, where deterioration of the recognition accuracy is considered small.
  • Further, based on the operation history of each vehicle 1, the grouping unit 232 may, for example, regard the update risk as lower the smaller the mileage, considering the probability of encountering an accident lower, and set a high score; conversely, it may regard the update risk as higher the larger the mileage, considering the probability of encountering an accident higher, and set a low score for grouping.
  • Further, the grouping unit 232 may acquire driving conditions, such as acceleration/deceleration tendencies, the speed range, and the roads used, from the operation history of each vehicle 1 (the sensing results of the external recognition sensor 25). For example, when changes in acceleration/deceleration are frequent, the speed range is high, or roads where accidents occur frequently are used often, the update risk may be regarded as high and a low score set; conversely, when changes in acceleration/deceleration are infrequent, the speed range is low, and roads with a high accident frequency are used infrequently, the update risk may be regarded as low and a high score set for grouping.
  • depending on the vehicle type of the vehicle 1, the grouping unit 232 may consider the update risk to be low for, for example, a vehicle type known to be purchased by more than a predetermined number of purchasers who prefer relaxed driving, a vehicle type smaller than a predetermined size, a vehicle with driving performance higher than a prescribed level, a vehicle equipped with more safety equipment than a prescribed number, or a commercial vehicle known not to be driven unreasonably because it is used for business purposes, and may set a high score for such vehicles so that they are grouped accordingly.
  • conversely, the grouping unit 232 may consider the update risk to be high for a vehicle model known to be purchased by fewer than a predetermined number of purchasers who prefer relaxed driving, a vehicle model larger than a predetermined size, a vehicle without driving performance higher than the prescribed level, a vehicle not equipped with more safety equipment than the specified number, or a private vehicle that may be driven unreasonably compared with a commercial vehicle, and may set a low score for grouping.
  • the grouping unit 232 may group the vehicles 1 by a combination of scores that takes the above-mentioned factors into consideration.
  • the grouping unit 232 first sets a score for each vehicle 1, then obtains a ranking, and forms the groups from the top with a predetermined number of divisions (a sketch of this procedure follows below).
  • the number of vehicles 1 belonging to each group does not have to be equal.
  • the higher groups may contain fewer vehicles and the lower groups more.
  • likewise, the number of vehicles 1 belonging to each group does not have to be fixed.
  • a score range may instead be determined for each group, and each vehicle 1 may be assigned to a group according to its score.
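  • as one possible concrete reading of this scoring and grouping, the sketch below combines several of the factors described above into a single score and splits the ranked vehicles into groups; the weights and field names are illustrative assumptions, not values given in this description:

```python
def update_risk_score(v):
    """Higher score means lower update risk; all weights are illustrative."""
    score = 0.0
    score -= 0.5 * v["region_accidents"]      # many accidents nearby: riskier
    score -= 0.3 * v["region_vehicle_count"]  # dense traffic: riskier
    score -= 0.2 * v["total_mileage_km"]      # high mileage: riskier
    score -= 5.0 if v["stormy_weather"] else 0.0
    score += 5.0 if v["commercial_vehicle"] else 0.0
    return score

def group_vehicles(vehicles, num_groups):
    """Rank the vehicles by descending score and split them into groups;
    the first group (G1) holds the lowest-risk vehicles."""
    ranked = sorted(vehicles, key=update_risk_score, reverse=True)
    size = max(1, -(-len(ranked) // num_groups))  # ceiling division
    return [ranked[i:i + size] for i in range(0, len(ranked), size)]
```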
  • the update SW may be delivered first to such low-risk groups, and after safety is confirmed, it may be delivered to general vehicles.
  • since the number of passengers on a bus is larger than that of an ordinary vehicle and the impact in the event of an accident is accordingly greater, the update SW may be delivered to a bus or the like only after safety has been confirmed.
  • in step S11, the vehicle information collecting unit 231 controls the communication unit 115 to request vehicle information from the vehicles 1-1 to 1-n.
  • in step S31, the control unit 201 controls the communication unit 22 to determine whether or not the vehicle information has been requested by the server 2, and repeats the same process until the request is made.
  • in step S32, the control unit 201 controls the communication unit 22 to transmit to the server 2 the vehicle information required for grouping the vehicles 1, for example, the vehicle type, position information based on GNSS signals, the travel history including the detection history of the external recognition sensor 25 and the routes traveled in the past, the total mileage, and weather information corresponding to the current position of the vehicle 1.
  • in step S33, the control unit 201 determines whether or not the end of the process is instructed, and if the end is not instructed, the process returns to step S31.
  • the processes of steps S31 to S33 are repeated until the end of the process is instructed.
  • when the end of the process is instructed in step S33, the process ends.
  • in step S12, the vehicle information collecting unit 231 of the server 2 controls the communication unit 115 to receive and collect the vehicle information transmitted from the vehicles 1.
  • the vehicle information collecting unit 231 may, for example, accumulate the vehicle information of each vehicle 1 in a database in association with information that individually identifies the vehicle 1.
  • in step S13, the grouping unit 232 sets one of the unprocessed vehicles 1 among the vehicles 1-1 to 1-n stored in the database of the vehicle information collecting unit 231 as the processing target vehicle.
  • in step S14, the grouping unit 232 sets a score for realizing the grouping based on the vehicle information of the processing target vehicle.
  • based on the vehicle information, the grouping unit 232 sets a score for each vehicle such that the smaller the update risk (the risk when a problem occurs due to the update of the recognition unit 73 by the update SW), the higher the score.
  • in step S15, it is determined whether or not there is an unprocessed vehicle 1 for which the score required for grouping has not been set, and if there is an unprocessed vehicle 1, the process returns to step S13.
  • the processes of steps S13 to S15 are repeated until a grouping score has been set for all the vehicles 1 whose vehicle information is stored in the vehicle information collecting unit 231.
  • when it is determined in step S15 that a grouping score has been set for all the vehicles 1 whose vehicle information is stored in the vehicle information collecting unit 231 and no unprocessed vehicle 1 remains, the process proceeds to step S16.
  • in step S16, the grouping unit 232 obtains the ranks of the vehicles 1-1 to 1-n by sorting them based on the obtained scores.
  • in step S17, the grouping unit 232 sets the number of groups.
  • the number of groups may be set to a fixed value or may be set dynamically.
  • in step S18, the grouping unit 232 groups the vehicles 1-1 to 1-n into the set number of groups based on the ranking according to the scores, and outputs the result to the distribution order determination unit 233.
  • in step S19, the distribution order determination unit 233 determines the distribution order for each group according to the scores of the groups supplied from the grouping unit 232.
  • in step S20, the vehicle information collecting unit 231 determines whether or not the end of the process is instructed, and if the end is not instructed, the process proceeds to step S21.
  • in step S21, the vehicle information collecting unit 231 determines whether or not a predetermined time has elapsed, and repeats the same process until the predetermined time has elapsed.
  • when it is determined in step S21 that the predetermined time has elapsed, the process returns to step S11, and the subsequent processes are repeated.
  • that is, vehicle information is acquired from each vehicle 1 at regular intervals and the grouping based on it is redone, so that the grouping reflects both fixed information such as the vehicle type of the vehicle 1 and changing information around the vehicle 1 such as position and weather, and is continuously updated in real time.
  • when the end of the process is instructed in step S20, the process ends.
  • by the above processing, the grouping based on the vehicle information, which includes both the fixed information and the changing information of the vehicles 1, and the setting of the order in which the update SW is distributed to each group are repeated.
  • as a result, the vehicles 1 are grouped according to the update risk of the update SW, and the distribution order of the update SW is set for each group of vehicles 1 (a sketch of this server-side loop follows below).
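  • the server-side loop of steps S11 to S21 could then look roughly as follows; collect_vehicle_info(), update_risk_score(), and group_vehicles() stand in for the processing of the vehicle information collecting unit 231, the grouping unit 232, and the distribution order determination unit 233, and are assumptions made for illustration:

```python
import time

def grouping_loop(collect_vehicle_info, update_risk_score, group_vehicles,
                  num_groups=4, period_s=3600.0, stop_requested=lambda: False):
    """Periodically re-collect vehicle information and redo the grouping, so
    the groups track fixed data (vehicle type) and changing data (position,
    weather) in near real time."""
    while not stop_requested():                                # step S20
        vehicles = collect_vehicle_info()                      # steps S11, S12
        for v in vehicles:                                     # steps S13 to S15
            v["score"] = update_risk_score(v)                  # step S14
        vehicles.sort(key=lambda v: v["score"], reverse=True)  # step S16
        groups = group_vehicles(vehicles, num_groups)          # steps S17, S18
        yield groups, list(range(len(groups)))                 # step S19: order
        time.sleep(period_s)                                   # step S21
```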
  • in step S51, the recognition unit 73 of the vehicle 1 recognizes objects based on the image captured by the camera 51, and outputs the object recognition result together with the image to the control unit 201.
  • in step S52, the control unit 201 controls the vehicle control unit 32 based on the object recognition result to control the operation of the vehicle 1. For example, when it is recognized from the object recognition result that a pedestrian is present in front of the vehicle, the control unit 201 supplies a control signal to the vehicle control unit 32 so that an operation avoiding contact with the pedestrian is performed.
  • in step S53, the control unit 201 controls the communication unit 22 to store the object recognition result of the recognition unit 73 and the corresponding image of the camera 51 as vehicle storage information and transmit it to the server 2.
  • at this time, the control unit 201 may transmit the vehicle storage information to the server 2 together with an identifier for identifying the vehicle 1 and other vehicle information.
  • in step S71, the re-learning unit 237 of the server 2 controls the communication unit 115 to determine whether or not vehicle storage information including the object recognition result of the recognition unit 73 and the corresponding image of the camera 51 has been transmitted.
  • if it is determined in step S71 that no vehicle storage information has been transmitted, the process proceeds to step S75.
  • if it is determined in step S71 that the vehicle storage information has been transmitted, the process proceeds to step S72.
  • in step S72, the re-learning unit 237 receives and accumulates the transmitted object recognition results of the recognition unit 73 and the corresponding images of the camera 51 as vehicle storage information for re-learning.
  • when an identifier and other vehicle information are transmitted together, the re-learning unit 237 also receives and stores them.
  • by the processes of steps S71 and S72, the vehicle storage information for re-learning is received and accumulated from all the vehicles 1, although it may instead be collected from only some of them.
  • in step S73, the re-learning unit 237 re-learns the recognition unit 73 using the accumulated vehicle storage information for re-learning, and supplies the re-learning result to the update SW release unit 238.
  • in step S74, the update SW release unit 238 executes a simulation using the relearned recognition unit 73 to verify its recognition accuracy.
  • in step S75, the re-learning unit 237 determines whether or not the end of the process is instructed, and if the end of the process is not instructed, the process returns to step S71 and the subsequent processes are repeated.
  • that is, the process of collecting from the vehicles 1 the vehicle storage information for re-learning, consisting of the recognition results of the recognition unit 73 and the images, re-learning the recognition unit 73, and obtaining the recognition accuracy of the relearned recognition unit 73 by simulation is repeated.
  • when the end of the process is instructed in step S75, the process ends.
  • by the above processing, it is possible to keep collecting the vehicle storage information for re-learning, consisting of the recognition results of the recognition unit 73 and the images, from the vehicles 1, to repeat the re-learning, and to verify the recognition accuracy of the relearned recognition unit 73 by simulation.
  • in step S91, the update SW release unit 238 determines whether or not there is an update, that is, whether or not the recognition unit 73 relearned by the above-mentioned relearning process is in a state in which it should be distributed as an update SW, and repeats the same process until it is determined that there is an update.
  • the update SW release unit 238 may determine that the relearned recognition unit 73 is in a state in which it should be distributed, for example, when its recognition accuracy has improved by a predetermined ratio over the recognition accuracy of the recognition unit 73 before the relearning.
  • if it is determined in step S91 that the relearned recognition unit 73 is in a state in which it should be distributed, the process proceeds to step S92.
  • in step S92, the distribution planning unit 235 initializes to 1 the counter i that identifies the groups.
  • in step S93, the distribution planning unit 235 acquires the vehicle information of the vehicles 1 belonging to the group i, according to the group-by-group distribution order determined by the distribution order determination unit 233.
  • in step S94, the distribution planning unit 235 sets an unprocessed vehicle 1 among the vehicles 1 belonging to the group i as the processing target vehicle.
  • in step S95, the distribution planning unit 235 executes the distribution timing setting process and sets the timing at which the recognition unit 73 of the processing target vehicle is to be updated by the update SW.
  • that is, since the operation of the recognition unit 73 is stopped while it is being updated, a timing at which the vehicle 1 is stationary, or at which the recognition process by the recognition unit 73 subject to the update is unnecessary, is set as the timing at which the update SW is delivered.
  • in step S96, the distribution planning unit 235 determines whether or not there is an unprocessed vehicle 1 for which the distribution timing has not been set among the vehicles 1 belonging to the group i, and when an unprocessed vehicle 1 exists, the process returns to step S94.
  • the processes of steps S94 to S96 are repeated until the delivery timing has been set for all the vehicles 1 belonging to the group i.
  • when it is determined in step S96 that the delivery timing has been set for all the vehicles 1 belonging to the group i, the process proceeds to step S97.
  • that is, the distribution plan of the update SW is completed by planning the distribution timings of all the vehicles 1 belonging to the group i.
  • in step S97, the distribution planning unit 235 determines, based on the distribution plan, whether or not the distribution timing of some vehicle 1 belonging to the group i has been reached.
  • if it is determined in step S97 that the delivery timing of some vehicle 1 belonging to the group i has been reached, the process proceeds to step S98; if no vehicle 1 belonging to the group i has reached its delivery timing, the process proceeds to step S103.
  • in step S98, the distribution planning unit 235 controls the distribution unit 236 to acquire the update SW, supplied from the update SW release unit 238, for updating the recognition unit 73 generated by the relearning, and delivers it via the communication unit 115 to the vehicle 1 whose delivery timing has been reached.
  • at this time, the distribution status confirmation unit 234 controls the communication unit 115 to acquire, as the distribution status, the version of the distributed update SW and the information of the destination vehicle 1.
  • in step S121, the update unit 204 of the vehicle 1 controls the communication unit 22 to determine whether or not the update SW has been transmitted from the server 2.
  • in the case of a vehicle 1 belonging to the group i whose delivery timing has come, it is determined in step S121 that the update SW has been transmitted from the server 2, and the process proceeds to step S122.
  • in step S122, the update unit 204 stops at least one of the operation related to the object recognition process of the recognition unit 73 and the operation control by the control unit 201 based on the recognition result of the recognition unit 73.
  • in step S123, the update unit 204 controls the communication unit 22 to acquire the transmitted update SW and updates the recognition unit 73. At this time, before updating by the update SW, the update unit 204 retains the pre-update recognition unit 73 as a backup.
  • in step S124, the update unit 204 operates the recognition unit 73 on a trial basis, confirms its operating state, and confirms that the update by the update SW has been completed properly.
  • the update unit 204 repeats the update by the update SW until it is completed properly, so that the update finishes in an appropriate state.
  • if the update cannot be completed properly, the update unit 204 ends the update process and reactivates the pre-update recognition unit 73 held as a backup (a sketch of this backup-and-rollback flow follows below).
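  • a minimal sketch of this backup-and-rollback flow on the vehicle side, assuming hypothetical apply_update() and trial_run_ok() functions for the actual installation and the trial operation check of step S124:

```python
import copy

def update_with_backup(recognizer, update_sw, apply_update, trial_run_ok,
                       max_attempts=3):
    """Update the recognizer while keeping a backup of the pre-update state;
    if the update cannot be completed properly, fall back to the backup."""
    backup = copy.deepcopy(recognizer)       # step S123: retain pre-update state
    for _ in range(max_attempts):            # repeat until completed properly
        updated = apply_update(recognizer, update_sw)
        if trial_run_ok(updated):            # step S124: trial operation check
            return updated                   # update completed in a proper state
    return backup                            # reactivate the pre-update backup
```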
  • in step S125, after the update by the update SW is completed, the recognition unit 73 recognizes objects based on the image captured by the camera 51, and outputs the recognition result to the operation status recognition unit 202.
  • in step S126, the operation status recognition unit 202 acquires the image captured by the camera 51 and the corresponding object recognition result of the recognition unit 73.
  • in step S127, the operation status recognition unit 202 determines whether or not the operation status of the recognition unit 73 is appropriate based on the acquired object recognition result, and outputs the determination result, in association with the object recognition result and the image, to the operation status reporting unit 203.
  • for example, in addition to the image captured by the camera 51 and the object recognition result of the recognition unit 73 based on that image, the operation status recognition unit 202 uses the map information in the map information storage unit 23, the position information based on the signals from the GNSS receiver 24, and the external recognition sensor 25 to identify the objects that should be recognizable from the map information corresponding to the current position, and determines whether or not the recognition unit 73 is operating appropriately by comparing them with the object recognition result.
  • in step S128, the operation status reporting unit 203 determines whether or not it is necessary to report the operation status. A report is considered necessary, for example, when the operating status is inappropriate; however, the report may instead be made when the operating status is appropriate, or regardless of whether or not the operating status is appropriate.
  • if it is determined in step S128 that it is necessary to report the operating status, the process proceeds to step S129.
  • in step S129, the operation status reporting unit 203 controls the communication unit 22 to report the operation status to the server 2.
  • if it is determined in step S128 that the operation status report is unnecessary, the process of step S129 is skipped.
  • in step S130, the operation status recognition unit 202 determines whether or not the operation status of the updated recognition unit 73 has been determined sufficiently, and if it is determined that the determination is not yet sufficient, the process returns to step S124.
  • the processes of steps S125 to S130 are repeated until it is determined that the operation status of the updated recognition unit 73 has been determined sufficiently.
  • whether or not the operation status of the updated recognition unit 73 has been determined sufficiently may be judged, for example, by whether or not the operation status has been recognized more than a predetermined number of times.
  • if it is determined in step S130 that the operation status of the updated recognition unit 73 has been determined sufficiently, the process proceeds to step S131.
  • in step S131, the update unit 204 determines whether or not the update of the recognition unit 73 by the update SW is appropriate.
  • for example, the update unit 204 may determine whether or not the update of the recognition unit 73 by the update SW is appropriate based on whether or not the ratio of the number of times the operation status recognition unit 202 recognized the operation status as inappropriate, out of the predetermined number of recognitions, exceeds a predetermined ratio (see the sketch below).
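  • in code, this ratio test might look like the following minimal sketch; the log format and the threshold value are illustrative assumptions:

```python
def update_is_appropriate(status_log, max_inappropriate_ratio=0.05):
    """status_log holds one boolean per recognition, True when the operation
    status recognition unit judged the recognition appropriate; the update is
    judged appropriate while the ratio of inappropriate recognitions stays at
    or below the predetermined ratio."""
    if not status_log:
        return False  # not enough observations yet
    inappropriate = sum(1 for ok in status_log if not ok)
    return inappropriate / len(status_log) <= max_inappropriate_ratio
```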
  • if it is determined in step S131 that the update of the recognition unit 73 by the update SW is appropriate, the process proceeds to step S132.
  • in step S132, the update unit 204 restarts the operation related to the object recognition process of the updated recognition unit 73 and the operation control by the control unit 201 based on the recognition result of the updated recognition unit 73. At this time, the update unit 204 discards the pre-update recognition unit 73 held as a backup.
  • if it is determined in step S131 that the update of the recognition unit 73 by the update SW is not appropriate, the process proceeds to step S133.
  • in step S133, the update unit 204 restores the recognition unit 73 to the pre-update state held as a backup, and restarts the operation related to the object recognition process of the pre-update recognition unit 73 and the operation control by the control unit 201 based on the recognition result of the pre-update recognition unit 73. That is, in this case, the recognition unit 73 is not updated, and operation in the pre-update state is continued.
  • instead of the process of step S133, the control unit 201 may be put into a state in which it does not perform any process based on the recognition result of the recognition unit 73 updated by the update SW.
  • that is, since the update of the recognition unit 73 by the update SW is in an inappropriate state and the reliability of the operation control based on its recognition result is low, operations such as automated driving that use the recognition result of the recognition unit 73 are stopped.
  • in this case, the driver who is the user may be notified of that fact.
  • in step S134, it is determined whether or not the end of the process is instructed, and if the end is not instructed, the process returns to step S121.
  • in step S99, the distribution status confirmation unit 234 of the server 2 controls the communication unit 115 to determine whether or not any vehicle 1 has reported the operation status of the recognition unit 73 updated by the update SW. If there is a report of the operating status, the process proceeds to step S100; if no vehicle 1 has reported the operating status in step S99, the process proceeds to step S103.
  • in step S100, the distribution status confirmation unit 234 acquires the reported operation status information of the recognition unit 73 updated by the update SW.
  • in step S101, the distribution status confirmation unit 234 aggregates the reported operation status information of the recognition unit 73 updated by the update SW; more specifically, it aggregates, for example, the rate at which the recognition process of the updated recognition unit 73 is deemed inappropriate.
  • in step S102, the distribution status confirmation unit 234 determines whether or not the update of the recognition unit 73 by the update SW is inappropriate based on the aggregation result.
  • for example, the distribution status confirmation unit 234 determines whether or not the update of the recognition unit 73 by the update SW is inappropriate depending on whether or not the rate at which the recognition process of the updated recognition unit 73 is deemed inappropriate exceeds a predetermined rate (a sketch of this aggregation follows below).
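  • the aggregation of steps S100 to S102 could be sketched as follows; the report format and the threshold are assumptions made for illustration:

```python
def distribution_should_stop(reports, max_inappropriate_rate=0.05):
    """reports is an iterable of (vehicle_id, inappropriate_count, total_count)
    tuples from the updated vehicles; returns True when the fleet-wide rate of
    inappropriate recognitions exceeds the predetermined rate, in which case
    the remaining distribution plan is discarded."""
    reports = list(reports)
    inappropriate = sum(r[1] for r in reports)
    total = sum(r[2] for r in reports)
    if total == 0:
        return False  # no observations reported yet
    return inappropriate / total > max_inappropriate_rate
```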
  • if it is not determined in step S102 that the update of the recognition unit 73 by the update SW is inappropriate, the process proceeds to step S103.
  • in step S103, the distribution status confirmation unit 234 determines whether or not the update SW has been distributed to, and the recognition unit 73 updated in, all the vehicles 1 in the group i.
  • if, in step S103, the update SW has not yet been delivered to all the vehicles 1 in the group i and the recognition unit 73 has not been updated in all of them, the process returns to step S97.
  • if it is determined in step S103 that the update SW has been delivered to all the vehicles 1 in the group i and the recognition unit 73 has been updated, the process proceeds to step S104.
  • in step S104, the distribution planning unit 235 increments the counter i by 1.
  • in step S105, the distribution planning unit 235 determines whether or not the counter i is larger than the maximum value, namely the number of groups, that is, whether or not the update process has been completed for all the groups.
  • when, in step S105, the counter i is equal to or less than the number of groups and it is determined that the update process has not been completed for all the groups, the process returns to step S93 and the subsequent processes are repeated.
  • that is, the process of distributing the update SW in group units and updating the recognition unit 73 is repeated, starting from the vehicles 1 of the group that remains safest if there is a problem in the updated recognition unit 73; the processes of steps S93 to S105 are repeated until the recognition unit 73 has been updated by the update SW in the vehicles 1 of all the groups.
  • when, in step S105, the counter i is larger than the number of groups and it is determined that the processing has been completed for all the groups, the process proceeds to step S106.
  • in step S106, it is determined whether or not the end of the process is instructed, and if the end is not instructed, the process returns to step S91, and the subsequent processes are repeated.
  • when the end of the process is instructed in step S106, the process ends.
  • if it is determined in step S102 that the update of the recognition unit 73 by the update SW is inappropriate, the process proceeds to step S107.
  • in step S107, since it has been confirmed that the update of the recognition unit 73 by the current update SW causes a problem, the distribution status confirmation unit 234 notifies the distribution planning unit 235 to stop the subsequent distribution of the update SW.
  • in response, the distribution planning unit 235 discards the distribution plan and stops the subsequent distribution of the update SW to the group i and the following groups.
  • by the above processing, the update SW is delivered to the groups of vehicles 1 in order from the group with the lowest update risk.
  • once safety is confirmed, the update SW can be distributed sequentially to the vehicles 1 of the groups with higher update risk, and the recognition unit 73 can be updated there as well.
  • since the update SW for the recognition unit 73 is distributed in order from the group of vehicles 1 with the lowest update risk, even if a defect is recognized from the operating status of the updated recognition unit 73, the recognition unit 73 can be updated safely while the occurrence of a fatal problem due to erroneous recognition is suppressed.
  • furthermore, since the recognition unit 73 is updated by the update SW in order from the vehicles 1 with the lowest update risk, and the distribution is gradually expanded to the vehicles 1 of the higher-risk groups while the operating status is checked, the time cost related to re-learning can be reduced and prompt delivery of the update SW can be realized.
  • << Update timing setting process (1) >> Next, the update timing setting process (No. 1) of FIG. 11 will be described with reference to the flowchart of FIG. 12.
  • in step S151, the distribution planning unit 235 reads out the vehicle information of the vehicles 1 included in the distribution order information determined by the distribution order determination unit 233, and reads out the operation record of the processing target vehicle.
  • the operation record referred to here is, for example, the daily operating time zones of the vehicle 1.
  • in step S152, the distribution planning unit 235 estimates the time zones in which the processing target vehicle is stopped based on the read operation record of the processing target vehicle.
  • in step S153, the distribution planning unit 235 sets, as the distribution timing, the timing at which the processing target vehicle is most likely to be stopped within the estimated time zones.
  • by this processing, a safe timing, at which the vehicle is stopped, can be set as the timing at which the recognition unit 73 is updated by the update SW (a sketch follows below).
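  • a simple reading of steps S151 to S153 is sketched below; representing the operation record as a log of hours during which the vehicle was operating is an illustrative assumption:

```python
from collections import Counter

def pick_delivery_hour(operating_hours_log):
    """operating_hours_log lists the hours (0-23) at which the vehicle was
    observed operating, accumulated over many days; the delivery timing is
    the hour at which the vehicle was least often operating, i.e. the hour
    at which it is most likely to be stopped."""
    counts = Counter(operating_hours_log)           # step S152: usage per hour
    return min(range(24), key=lambda h: counts[h])  # step S153: most idle hour
```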
  • << Update timing setting process (2) >> Next, the update timing setting process (No. 2) of FIG. 11 will be described with reference to the flowchart of FIG. 13.
  • the update timing may also be set to a timing at which the recognition target of the recognition unit 73 is unlikely to be recognized, that is, a timing at which there is little influence even if the recognition unit 73 does not function during the update process.
  • for example, assume that the recognition target of the recognition unit 73 is only a pedestrian, and that when a pedestrian is recognized, the control unit 201 controls the operation so as to avoid contact with the pedestrian. While the vehicle is traveling on a highway, for example, no pedestrian is recognized, so no action for avoiding contact with a pedestrian is required.
  • therefore, a timing at which, based on the operation record and the planned operation route, the recognition target of the recognition unit 73 is not expected to be detected, and at which there is thus no problem even if the recognition function of the recognition unit 73 is stopped, may be set as the update timing.
  • in step S171, the distribution planning unit 235 acquires the information of the operation route planned by the action planning unit 62.
  • in step S172, the distribution planning unit 235 estimates the fluctuation of the data to be recognized on the driving route based on the acquired information on the driving route.
  • that is, the distribution planning unit 235 estimates the positions on the driving route at which there is (most likely) no data that the recognition unit 73 would recognize as a pedestrian in the image captured by the camera 51 when the processing target vehicle moves along the planned driving route.
  • in step S173, the distribution planning unit 235 identifies, based on the estimated fluctuation of the data to be recognized on the operation route, a section of the route in which the data to be recognized does (most likely) not exist.
  • in other words, the distribution planning unit 235 identifies a section of the planned driving route in which pedestrian data does (most likely) not exist.
  • the section in which pedestrian data does not exist is, for example, a highway or a motorway.
  • in step S174, the distribution planning unit 235 sets, as the update timing, the timing of passing through the section in which the data to be recognized on the estimated operation route does (most likely) not exist.
  • that is, the timing of passing through a section of the planned driving route in which pedestrian data does (most likely) not exist is set as the update timing.
  • by setting the update timing in this way, even though the operation based on the recognition result of the recognition unit 73, that is, the pedestrian recognition function, is stopped while the recognition unit 73 is updated by the update SW, the vehicle is traveling on a highway during the update, a timing at which no pedestrian should be recognized, so no malfunction due to misrecognition of a pedestrian occurs and the recognition unit 73 can be updated safely (a sketch follows below).
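  • steps S171 to S174 could be sketched as follows; representing the planned route as a list of segments flagged with whether the recognition target (here, a pedestrian) is expected is an illustrative assumption:

```python
def pick_update_window(route_segments):
    """route_segments holds dicts such as {"name": "highway", "t_start": 0.0,
    "t_end": 600.0, "pedestrians_expected": False}; returns the longest
    segment in which no pedestrian is expected, as the window in which the
    update SW may be applied, or None if the route has no such section."""
    safe = [s for s in route_segments if not s["pedestrians_expected"]]
    if not safe:
        return None
    return max(safe, key=lambda s: s["t_end"] - s["t_start"])
```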
  • in this way, the update timing may even be set during driving.
  • alternatively, the vehicle 1 itself may set the update timing.
  • in that case, the update unit 204 of the vehicle 1 executes, based on its own operating status, the update timing setting process explained with reference to the flowcharts of FIGS. 12 and 13 and sets the update timing; then, in the process of step S32, the control unit 201 includes the update timing information in the vehicle information and controls the communication unit 22 to transmit it to the server 2.
  • << Update for each processing unit >> < Update for each component >
  • the recognition unit 73 may also be updated by the update SW while the operation in which the recognition process is performed continues.
  • in this case, the recognition process by the recognition unit 73 is performed in chronological order for each component.
  • a component here denotes, for example, the recognition process for one recognition target, and is set for each recognition target such as recognizing a railroad crossing, recognizing a signal, or recognizing a pedestrian. The object recognition process by the recognition unit 73 can therefore be regarded as sequentially executing, in chronological order, the object recognition processes of components with different recognition targets.
  • for example, as shown in FIG. 14, the components C1 and C2 are executed in chronological order: the component C1 is processed at times t11 to t15, and the component C2 is processed at times t15 to t19.
  • the processing of each component is performed in units of frames of the image captured by the camera 51.
  • the processing of the frame F1 is performed at times t11 to t12.
  • the processing of the frame F2 is performed at times t12 to t13.
  • the processing of the frame F3 is performed at times t13 to t14.
  • the processing of the frame F4 is performed at times t14 to t15.
  • the processing of the frame F5 is performed at times t15 to t16.
  • the processing of the frame F6 is performed at times t16 to t17.
  • the processing of the frame F7 is performed at times t17 to t18.
  • the processing of the frame F8 is performed at times t18 to t19.
  • although FIG. 14 shows an example in which each of the components C1 and C2 processes four frames, a different number of frames may be processed, and the number of frames does not have to be the same for each component.
  • if the component C1 is an object recognition process for recognizing a railroad crossing, then, for example, when the vehicle is traveling on a highway, no railroad crossing exists and none is detected as a recognition result, so the component C1 is virtually unnecessary.
  • in other words, for the component C1 that recognizes a railroad crossing, if the vehicle is traveling on a highway where no railroad crossing exists, there is no problem even if the recognition unit 73 is updated by the update SW without stopping the operation of the object recognition process of the component C1 (or the operation control of the control unit 201 based on the recognition result of the recognition unit 73).
  • accordingly, the recognition unit 73 may be updated by the update SW component by component, without stopping the operation of the recognition unit 73.
  • for the frame F1, the camera 51 performs the imaging process at times t11 to t31, and the camera 51 performs the imaging data transfer process at times t31 to t32.
  • at times t32 to t33, the recognition unit 73 performs the recognition process.
  • similarly, for the frame F2, the imaging process is performed by the camera 51 at times t12 to t34.
  • the imaging data is transferred by the camera 51 at times t34 to t35.
  • the recognition process is performed by the recognition unit 73 at times t35 to t36.
  • in the period between these two recognition processes, from time t33 to time t35, the recognition unit 73 is substantially stopped, so even if the recognition unit 73 is updated by the update SW at this timing, there is no effect on the operation control based on the recognition result.
  • in this way, the update timing may be set so that the recognition unit 73 is updated by the update SW in the gap between frames, while the recognition unit 73 is not functioning (a sketch follows below).
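  • the gaps usable for such an update can be computed from the per-frame schedule; the sketch below assumes the schedule is given as a list of (recognition_start, recognition_end) times:

```python
def idle_windows(frame_schedule):
    """frame_schedule lists (recognition_start, recognition_end) times, one
    tuple per frame in chronological order; returns the gaps between
    consecutive recognitions, during which the recognition unit is
    substantially stopped and may be updated."""
    gaps = []
    for (_, end_prev), (start_next, _) in zip(frame_schedule, frame_schedule[1:]):
        if start_next > end_prev:
            gaps.append((end_prev, start_next))
    return gaps
```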
  • more specifically, it is assumed that, in the recognition process of the frame F1, the processing block B1 is processed at times t32 to t51, the processing block B2 at times t52 to t53, the processing block B3 at times t54 to t55, and the processing block B4 at times t56 to t33.
  • similarly, it is assumed that, in the recognition process of the frame F2, the processing block B1 is processed at times t35 to t57, the processing block B2 at times t58 to t59, the processing block B3 at times t60 to t61, and the processing block B4 at times t62 to t36.
  • since the processing of the processing block B1 is not performed in the period T31 from the time t51, at which the processing of the processing block B1 of the frame F1 is completed, until the start of the processing of the processing block B1 of the frame F2, the processing block B1 is substantially stopped during this period, which makes it a suitable timing for updating it.
  • in this way, the update SW may update the recognition unit 73 in units of processing blocks, at the timings at which the respective processing block is not functioning.
  • further, when the processing blocks constituting the recognition process of the recognition unit 73 are each configured as a neural network, the processing blocks may be updated in units of specific layers.
  • for example, assume that each of the processing blocks B1 to B4 is composed of a neural network consisting of layers L1 to Lx, and consider the update of the processing block B1.
  • as shown in FIG. 17, the layer Lx of the processing block B1 may be updated, layer by layer, at the first updateable timing between the processing blocks, such as the period T31, and the layers Lx-1, Lx-2, ..., L1 may then be updated sequentially in layer units at the subsequent such timings.
  • although FIG. 17 shows an example in which the update SW updates one layer at a time, the update may instead be performed in units of a plurality of layers.
  • furthermore, the update SW may be used to update individual channels within a layer.
  • in that case, the update may be performed in units of one channel or of a plurality of channels (a sketch of such layer- and channel-wise updating follows below).
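  • one way to picture layer- and channel-wise updating is as an in-place swap of weight arrays performed in the gaps between executions of a processing block; the sketch below treats a processing block as a dict of per-layer NumPy weight arrays and is purely illustrative:

```python
import numpy as np

def swap_layer(block, layer_name, new_weights):
    """Replace one layer's weights between two executions of the block;
    repeated over Lx, Lx-1, ..., L1, this updates the whole block one
    layer per idle gap."""
    assert block[layer_name].shape == new_weights.shape
    block[layer_name] = np.array(new_weights, copy=True)

def swap_channel(block, layer_name, channel_index, new_channel):
    """Finer-grained variant: replace a single output channel of one layer,
    assuming the weights are laid out as (out_channels, ...)."""
    layer = block[layer_name]
    assert layer[channel_index].shape == np.asarray(new_channel).shape
    layer[channel_index] = new_channel
```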
  • since steps S201 to S204, S206, S207, and S209 to S217, and steps S234 to S240 and S242 in the flowchart of FIG. 18 are the same as the processes of steps S91 to S94, S96, S97, and S99 to S107, and of steps S125 to S131 and S134, described above with reference to FIG. 11, their description will be omitted.
  • when an unprocessed vehicle among the vehicles 1 of the group i is set as the processing target vehicle in step S204, the distribution planning unit 235 executes the processing unit setting process in step S205, sets the processing unit in which the recognition unit 73 of the processing target vehicle is to be updated by the update SW, and sets the update timing according to that processing unit.
  • in step S271, the distribution planning unit 235 determines whether or not the update by the update SW is to be performed in units of channels.
  • if, in step S271, the update is to be performed in units of channels, the process proceeds to step S272.
  • in step S272, the distribution planning unit 235 sets the update timing for the case where the processing unit is the channel.
  • if, in step S271, the update is not to be performed in units of channels, the process proceeds to step S273.
  • in step S273, the distribution planning unit 235 determines whether or not the update by the update SW is to be performed in units of layers.
  • if, in step S273, the update is to be performed in units of layers, the process proceeds to step S274.
  • in step S274, the distribution planning unit 235 sets the update timing for the case where the processing unit is the layer, as described with reference to FIG. 17.
  • if, in step S273, the update is not to be performed in units of layers, the process proceeds to step S275.
  • in step S275, the distribution planning unit 235 determines whether or not the update by the update SW is to be performed in units of processing blocks.
  • if, in step S275, the update is to be performed in units of processing blocks, the process proceeds to step S276.
  • in step S276, the distribution planning unit 235 sets the update timing for the case where the processing unit is the processing block, as described above.
  • if, in step S275, the update is not to be performed in units of processing blocks, the process proceeds to step S277.
  • in step S277, the distribution planning unit 235 determines whether or not the update by the update SW is to be performed in the gaps between frames.
  • if, in step S277, the update is to be performed in the gaps between frames, the process proceeds to step S278.
  • in step S278, the distribution planning unit 235 sets the update timing for the case where the processing unit is the gap between frames, as described above.
  • if, in step S277, the update is not to be performed in the gaps between frames, the process proceeds to step S279.
  • in step S279, the distribution planning unit 235 determines whether or not the update by the update SW is to be performed in units of components.
  • if, in step S279, the update is to be performed in units of components, the process proceeds to step S280.
  • in step S280, the distribution planning unit 235 sets the update timing for the case where the processing unit is the component, as described with reference to FIG. 14.
  • if, in step S279, the update is not to be performed in units of components, the process proceeds to step S281.
  • since the case of step S281 is not an update in any of the channel, layer, processing block, between-frames, or component units, the update is performed with the entire recognition unit 73 as the unit, with the recognition unit 73 stopped as usual; therefore, the distribution planning unit 235 executes the same update timing setting process as the distribution timing setting process of step S95 in the flowchart of FIG. 11 to set the update timing (the dispatch over these processing units is sketched below).
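  • the cascade of steps S271 to S281 amounts to a dispatch on the processing unit; a minimal sketch, with hypothetical per-unit timing-setting functions passed in as a dict:

```python
def set_update_timing(processing_unit, setters):
    """processing_unit is one of "channel", "layer", "block", "frame", or
    "component"; anything else falls through to the whole-recognizer update.
    setters maps each unit name (plus "whole") to its timing setter."""
    for unit in ("channel", "layer", "block", "frame", "component"):
        if processing_unit == unit:   # steps S271, S273, S275, S277, S279
            return setters[unit]()    # steps S272, S274, S276, S278, S280
    return setters["whole"]()         # step S281: stop and update as a whole
```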
  • when it is determined in step S207 that the delivery timing has been reached, the update SW for updating the recognition unit 73 in the set processing unit is transmitted to the vehicle 1 in the process of step S208.
  • in step S232, the update unit 204 controls the communication unit 22 to acquire the transmitted update SW and updates the recognition unit 73 according to the set processing unit. At this time, before updating by the update SW, the update unit 204 retains the pre-update recognition unit 73 as a backup.
  • however, unlike the process of step S122 in the update process described above, in the update process of FIG. 18 the operation related to the object recognition process of the recognition unit 73 and the operation control by the control unit 201 based on the recognition result of the recognition unit 73 are not stopped, and the operation is continued.
  • in step S233, the update unit 204 confirms that the recognition unit 73 has been updated reliably.
  • for example, the update unit 204 may determine whether or not the update has been performed appropriately by comparing the recognition result of the pre-update recognition unit 73 with the recognition result of the post-update recognition unit 73 and checking whether or not there is a change of a predetermined level or more.
  • in step S240, the update unit 204 determines whether or not the update of the recognition unit 73 by the update SW is appropriate.
  • if it is determined in step S240 that the update of the recognition unit 73 by the update SW is not appropriate, the process proceeds to step S241.
  • in step S241, the update unit 204 restores the recognition unit 73 to the pre-update state held as a backup, and returns the operation related to the object recognition process and the operation control by the control unit 201 based on the recognition result to the pre-update state. That is, in this case, the recognition unit 73 is not updated, and operation in the pre-update state is continued.
  • instead of the process of step S241, the control unit 201 may be put into a state in which it does not perform any process based on the recognition result of the recognition unit 73 updated by the update SW.
  • if it is determined in step S240 that the update of the recognition unit 73 by the update SW is appropriate, the process of step S241 is skipped.
  • that is, in this case the recognition unit 73 has been updated without stopping its operation, and the update has been performed appropriately, so the operation is continued as it is.
  • by the above processing, the update of the recognition unit 73 by the update SW can be executed with the timing set for each processing unit, so that the recognition unit 73 can be updated appropriately without stopping the operation of its object recognition process.
  • the non-operation update process is safer, but the in-operation update process can be performed at almost any time; the two may therefore be switched between as appropriate.
  • for example, the non-operation update process may be attempted within a predetermined period after the distribution of the update SW starts, and the in-operation update process may be executed when the update cannot be performed within that period.
  • alternatively, a score indicating urgency or priority may be set according to the update content of the recognition unit 73 by the update SW, so that the non-operation update process is performed for an update whose urgency or priority is lower than a predetermined score, and the in-operation update process is executed for an update whose urgency or priority is higher than the predetermined score.
  • the score indicating urgency or priority may also be changed according to the automated driving plan and the like; for example, a low urgency or priority score may be set for the update process of a part of the recognition unit 73 related to a function not scheduled to be used in the automated driving plan, so that the non-operation update process is performed, and a high score may be set for the update process of a part related to a function scheduled to be used, so that the in-operation update process is performed.
  • further, when the automated driving plan is changed, the settings may be changed accordingly: the urgency or priority score may be lowered for the update process of a part of the recognition unit 73 related to a function no longer scheduled to be used, so that the non-operation update process is performed, and raised for the update process of a part related to a function newly scheduled to be used, so that the in-operation update process is performed (a sketch of this mode selection follows below).
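  • the switching between the non-operation and in-operation update processes could be sketched as follows; the scoring rule tied to the automated driving plan is an illustrative assumption:

```python
def choose_update_mode(update_target, driving_plan, threshold=0.5):
    """Return "in_operation" when the urgency/priority score of the update
    exceeds the threshold, and "non_operation" otherwise; the score rises
    when the updated function is scheduled to be used in the current
    automated driving plan, and falls when it is not."""
    score = update_target.get("base_priority", 0.0)
    if update_target["function"] in driving_plan["functions_in_use"]:
        score += 0.5  # needed soon: update while operating
    else:
        score -= 0.5  # not needed: wait for a non-operation window
    return "in_operation" if score > threshold else "non_operation"
```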
  • in the above, the example has been described in which the managed SW (software program) is the recognition unit 73 that executes the object recognition process to recognize objects, and in which the SW is generated by machine learning or the like.
  • however, the managed SW may execute other processing; it may be, for example, a SW formed by machine learning that executes a route search, or a SW that realizes battery management.
  • << Example of execution by software >> By the way, the series of processes described above can be executed by hardware, but can also be executed by software.
  • when the series of processes is executed by software, the programs constituting the software are installed from a recording medium onto a computer built into dedicated hardware, or onto, for example, a general-purpose computer capable of executing various functions when various programs are installed.
  • FIG. 20 shows a configuration example of a general-purpose computer.
  • This personal computer has a built-in CPU (Central Processing Unit) 1001.
  • the input / output interface 1005 is connected to the CPU 1001 via the bus 1004.
  • a ROM (Read Only Memory) 1002 and a RAM (Random Access Memory) 1003 are connected to the bus 1004.
  • connected to the input / output interface 1005 are an input unit 1006 composed of input devices such as a keyboard and a mouse with which the user inputs operation commands, an output unit 1007 that outputs a processing operation screen and images of processing results to a display device, a storage unit 1008 composed of a hard disk drive or the like that stores programs and various data, and a communication unit 1009 composed of a LAN (Local Area Network) adapter or the like that executes communication processing via a network represented by the Internet.
  • also connected is a drive 1010 that reads and writes data from and to a removable storage medium 1011 such as a magnetic disc (including a flexible disc), an optical disc (including a CD-ROM (Compact Disc-Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disc (including an MD (Mini Disc)), or a semiconductor memory.
  • the CPU 1001 executes various processes according to a program stored in the ROM 1002, or a program read from a removable storage medium 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, installed in the storage unit 1008, and loaded from the storage unit 1008 into the RAM 1003.
  • the RAM 1003 also appropriately stores data and the like necessary for the CPU 1001 to execute various processes.
  • in the computer configured as described above, the CPU 1001 loads, for example, the program stored in the storage unit 1008 into the RAM 1003 via the input / output interface 1005 and the bus 1004 and executes it, whereby the above-mentioned series of processes is performed.
  • the program executed by the computer (CPU1001) can be recorded and provided on the removable storage medium 1011 as a package medium or the like, for example.
  • the program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • the program can be installed in the storage unit 1008 via the input / output interface 1005 by mounting the removable storage medium 1011 in the drive 1010. Further, the program can be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the storage unit 1008. In addition, the program can be installed in the ROM 1002 or the storage unit 1008 in advance.
  • the program executed by the computer may be a program in which processing is performed in chronological order according to the order described in the present specification, or a program in which processing is performed in parallel or at a necessary timing such as when a call is made.
  • the CPU 1001 in FIG. 20 realizes the functions of the processor 21 in FIG. 3 and the processor 111 in FIG.
  • in the present specification, the system means a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
  • the present disclosure can have a cloud computing configuration in which one function is shared by a plurality of devices via a network and jointly processed.
  • each step described in the above flowchart can be executed by one device or shared by a plurality of devices.
  • the plurality of processes included in the one step can be executed by one device or shared by a plurality of devices.
  • the present disclosure may also have the following configuration.
  • the SW is formed by machine learning.
  • the update unit backs up the SW before the update, and then updates the SW using the update SW.
  • the operation status recognition unit recognizes the operation status of the updated SW by comparing the processing result of the SW updated by the update unit with the processing result of the backed-up SW before the update. The information processing apparatus described in <2>.
  • <4> The information processing apparatus according to any one of <1> to <3>, wherein the SW is a SW formed by machine learning and functions as an object recognition unit that executes an object recognition process based on an image.
  • The information processing apparatus according to <4>, wherein the operation status recognition unit recognizes the operation status of the updated SW by comparing the object recognition result of the SW that functions as the object recognition unit updated by the update unit with information on surrounding objects based on the current position information.
  • <6> The information processing apparatus further including a control unit that controls the operation based on the object recognition result of the SW, wherein the control unit stops the control of the operation based on the object recognition result of the SW that functions as the updated object recognition unit, based on the operation status.
  • The information processing apparatus, wherein the control unit controls the operation based on the object recognition result of the SW that functions as the object recognition unit before the update, based on the operation status.
  • The information processing apparatus, wherein the update unit acquires the update SW, distributed from the server, for updating the SW, and updates the SW based on the update SW, the information processing apparatus further including an operation status recognition result transmission unit that transmits the operation status recognition result, which is the recognition result of the operation status by the operation status recognition unit, to the server.
  • <9> The information processing apparatus according to <8>, wherein the update SW is distributed sequentially in group units, starting from the information processing apparatuses of the safest group, when the information processing apparatuses to which the update SW is distributed are grouped based on the safety of the operation controlled by the control unit based on the processing result of the SW in the case where the SW is not properly updated by the update SW.
  • <10> The information processing apparatus according to <9>, wherein the server estimates the timing at which the operation control by the control unit becomes unnecessary based on the object recognition result of the SW that functions as the object recognition unit, and generates a distribution plan in which the timing at which the operation control by the control unit is unnecessary is the timing at which the update SW is distributed.
  • <11> The information processing apparatus according to <9>, wherein the distribution plan, in which the timing at which the operation control by the control unit becomes unnecessary based on the object recognition result of the SW that functions as the object recognition unit is the timing at which the update SW is distributed, is generated in units of the groups grouped based on the safety.
  • <12> The information processing apparatus, wherein the timing at which a predetermined object is not recognized as the object recognition result of the SW that functions as the object recognition unit is estimated, based on the object recognition result of the SW that functions as the object recognition unit, as the timing at which the operation control by the control unit is not required, and a distribution plan is generated in which the estimated timing at which the control unit does not need to control the operation is the timing at which the update SW is distributed.
  • <13> The information processing apparatus, wherein the control unit controls the automated driving of a vehicle based on the object recognition result, and a distribution plan is generated in which the timing at which the vehicle travels on a section of its planned route in which the predetermined object is not recognized as the object recognition result is the timing at which the update SW is distributed.
  • <14> The information processing apparatus according to <8>, wherein the server stops the distribution of the update SW to the information processing apparatus based on the operation status recognition result transmitted from the operation status recognition result transmission unit.
  • <15> The information processing apparatus, wherein the server stops the distribution of the update SW to the information processing apparatus when, based on the operation status recognition result transmitted from the operation status recognition result transmission unit, the object recognition accuracy of the updated SW is lower than the object recognition accuracy of the SW before the update.
• <16> The information processing apparatus according to <8>, wherein the operation status recognition result transmission unit transmits, to the server, the operation status recognition result by the operation status recognition unit together with the image and the corresponding object recognition result, and the server relearns the SW on the basis of the image and the corresponding object recognition result and generates the update SW for updating the SW to the relearned state.
• <17> The information processing apparatus according to <16>, wherein the server distributes the update SW for updating the SW to the relearned state when the relearned SW reaches a predetermined recognition accuracy on the basis of the image and the corresponding object recognition result (the fourth sketch after the enumeration shows this accuracy gate).
• An information processing method of an information processing apparatus including an update unit and an operation status recognition unit, the method including steps in which the update unit updates the SW (software program) and the operation status recognition unit recognizes the operation status of the updated SW.
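The sketches below are editorial illustrations of the aspects above, not code from this disclosure; every name in them is a hypothetical stand-in.

First, the staged, group-wise distribution of <9>: a minimal sketch, assuming vehicles have already been grouped by how safely the control unit can keep operating if the SW is not properly updated by the update SW. `SafetyGroup`, `push_update`, and `group_reports_ok` are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SafetyGroup:
    """Vehicles grouped by how safely operation can continue
    if the update SW turns out to be defective."""
    name: str
    safety_score: float                      # higher = safer fallback behavior
    vehicle_ids: List[str] = field(default_factory=list)

def push_update(vehicle_id: str, update_sw: bytes) -> None:
    # Stub for the OTA transport that delivers the update SW to one vehicle.
    print(f"distributing update SW to {vehicle_id}")

def group_reports_ok(group: SafetyGroup) -> bool:
    # Stub for aggregating the operation status recognition results
    # reported back by the vehicles in this group.
    return True

def distribute_in_group_order(groups: List[SafetyGroup], update_sw: bytes) -> None:
    """Distribute group by group, safest group first; stop the rollout
    as soon as any group reports a problem."""
    for group in sorted(groups, key=lambda g: g.safety_score, reverse=True):
        for vehicle_id in group.vehicle_ids:
            push_update(vehicle_id, update_sw)
        if not group_reports_ok(group):
            break   # less safe groups never receive the defective update
```

Serving the safest group first limits the blast radius: a defect surfaces among vehicles whose operation can still be controlled safely, and the rollout halts before less tolerant groups are reached.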
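Second, the distribution plans of <12> and <13>: a sketch that picks an update window from a vehicle's planned route, assuming the server keeps per-segment statistics on whether the predetermined object (pedestrians, for example) tends to be recognized there. `RouteSegment`, `object_seen_on`, and `min_window` are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class RouteSegment:
    segment_id: str
    start_time: float    # seconds after departure
    end_time: float

def plan_update_window(route: List[RouteSegment],
                       object_seen_on: Dict[str, bool],
                       min_window: float = 60.0) -> Optional[RouteSegment]:
    """Return the first segment of the planned route on which the
    predetermined object is not expected to be recognized and which is
    long enough to apply the update SW; None if no such window exists."""
    for seg in route:
        # Unknown segments are conservatively assumed to contain the object.
        object_expected = object_seen_on.get(seg.segment_id, True)
        if not object_expected and (seg.end_time - seg.start_time) >= min_window:
            return seg
    return None   # no safe window on this trip; defer to a later trip
```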
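Third, the stop condition of <14> and <15>: a minimal sketch, assuming each operation status recognition result reported to the server carries the recognition accuracy measured before and after the update on comparable scenes; the report format is an assumption.

```python
from statistics import mean
from typing import Dict, List

def should_stop_distribution(reports: List[Dict[str, float]],
                             margin: float = 0.0) -> bool:
    """Stop distributing the update SW when the fleet-aggregated object
    recognition accuracy after the update falls below the accuracy
    before the update."""
    accuracy_before = mean(r["accuracy_before"] for r in reports)
    accuracy_after = mean(r["accuracy_after"] for r in reports)
    return accuracy_after < accuracy_before - margin
```

A small positive `margin` keeps the guard from halting the rollout on noisy, insignificant dips.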
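Finally, the relearn-and-gate flow of <16> and <17>: retrain on the images and object recognition results uploaded from the fleet, and package the result as an update SW only once it reaches the predetermined recognition accuracy on held-out data. The `train` and `evaluate` callables and `package_as_update_sw` are placeholders, not methods defined by this disclosure.

```python
from typing import Callable, List, Optional, Tuple

Sample = Tuple[bytes, str]   # (camera image, object recognition label)

def package_as_update_sw(model: object) -> bytes:
    # Stub: serialize the relearned model as a distributable update SW.
    return b"update-sw"

def maybe_release_update(train: Callable[[List[Sample]], object],
                         evaluate: Callable[[object, List[Sample]], float],
                         fleet_samples: List[Sample],
                         holdout: List[Sample],
                         required_accuracy: float = 0.95) -> Optional[bytes]:
    """Relearn the SW from fleet data and release the update SW only when
    the relearned SW reaches the predetermined recognition accuracy."""
    model = train(fleet_samples)
    if evaluate(model, holdout) >= required_accuracy:
        return package_as_update_sw(model)
    return None   # below the gate: keep collecting data and retraining
```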

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Traffic Control Systems (AREA)
PCT/JP2021/023308 2020-07-03 2021-06-21 Information processing apparatus, information processing method, information processing system, and program WO2022004446A1 (ja)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP21832905.0A EP4177733A4 (en) 2020-07-03 2021-06-21 INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, INFORMATION PROCESSING SYSTEM AND PROGRAM
US18/003,211 US20230244471A1 (en) 2020-07-03 2021-06-21 Information processing apparatus, information processing method, information processing system, and program
CN202180045785.2A CN115997193A (zh) 2020-07-03 2021-06-21 信息处理装置、信息处理方法、信息处理系统和程序

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020115427 2020-07-03
JP2020-115427 2020-07-03

Publications (1)

Publication Number Publication Date
WO2022004446A1 (ja) 2022-01-06

Family

ID=79316126

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/023308 WO2022004446A1 (ja) 2020-07-03 2021-06-21 Information processing apparatus, information processing method, information processing system, and program

Country Status (4)

Country Link
US (1) US20230244471A1
EP (1) EP4177733A4
CN (1) CN115997193A
WO (1) WO2022004446A1

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE112019001046T5 (de) * 2018-02-28 2020-11-26 Sony Corporation Informationsverarbeitungsvorrichtung, informationsverarbeitungsverfahren, programm und mobiler körper
JP7130984B2 (ja) * 2018-03-01 2022-09-06 日本電気株式会社 画像判定システム、モデル更新方法およびモデル更新プログラム
KR102287460B1 (ko) * 2019-08-16 2021-08-10 엘지전자 주식회사 인공지능 무빙 에이전트

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011081604A (ja) 2009-10-07 2011-04-21 Toyota Motor Corp Program update device for vehicle
JP2013148957A (ja) 2012-01-17 2013-08-01 Toyota Motor Corp Safety control device and safety control method
JP2014092878A * (ja) 2012-11-01 2014-05-19 Nippon Telegr & Teleph Corp <Ntt> Classification model update support device, method, and program
JP2018005894A * (ja) 2016-06-23 2018-01-11 Sumitomo Electric Industries, Ltd. Program distribution system, server, program distribution method, and computer program
JP2019175349A * (ja) 2018-03-29 2019-10-10 KDDI Corporation Distribution device
JP2020060987A * (ja) 2018-10-11 2020-04-16 みこらった株式会社 Mobile device and program for mobile device
WO2020095545A1 * (ja) 2018-11-05 2020-05-14 NEC Corporation Object recognition system, recognition device, object recognition method, and object recognition program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4177733A4

Also Published As

Publication number Publication date
EP4177733A4 (en) 2023-11-22
CN115997193A (zh) 2023-04-21
EP4177733A1 (en) 2023-05-10
US20230244471A1 (en) 2023-08-03

Similar Documents

Publication Publication Date Title
JP7314798B2 (ja) Imaging device, image processing device, and image processing method
WO2021241189A1 (ja) Information processing device, information processing method, and program
WO2022138123A1 (en) Available parking space identification device, available parking space identification method, and program
WO2020241303A1 (ja) Automated driving control device, automated driving control system, and automated driving control method
WO2022158185A1 (ja) Information processing device, information processing method, program, and mobile device
WO2021060018A1 (ja) Signal processing device, signal processing method, program, and mobile device
US20230289980A1 (en) Learning model generation method, information processing device, and information processing system
WO2022004448A1 (ja) Information processing device, information processing method, information processing system, and program
WO2022004446A1 (ja) Information processing device, information processing method, information processing system, and program
WO2022004423A1 (ja) Information processing device, information processing method, and program
WO2022004447A1 (ja) Information processing device, information processing method, information processing system, and program
WO2023171401A1 (ja) Signal processing device, signal processing method, and recording medium
WO2022113772A1 (ja) Information processing device, information processing method, and information processing system
WO2023063199A1 (ja) Information processing device, information processing method, and program
WO2023074419A1 (ja) Information processing device, information processing method, and information processing system
WO2022259621A1 (ja) Information processing device, information processing method, and computer program
WO2022024569A1 (ja) Information processing device, information processing method, and program
WO2023032276A1 (ja) Information processing device, information processing method, and mobile device
WO2022014327A1 (ja) Information processing device, information processing method, and program
WO2023053498A1 (ja) Information processing device, information processing method, recording medium, and in-vehicle system
WO2023054090A1 (ja) Recognition processing device, recognition processing method, and recognition processing system
WO2023149089A1 (ja) Learning device, learning method, and learning program
JP7367014B2 (ja) Signal processing device, signal processing method, program, and imaging device
US20230410486A1 (en) Information processing apparatus, information processing method, and program
WO2022145286A1 (ja) Information processing device, information processing method, program, mobile device, and information processing system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21832905

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2021832905

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP