WO2023079881A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2023079881A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
visibility
feature point
radar
unit
Prior art date
Application number
PCT/JP2022/036903
Other languages
English (en)
Japanese (ja)
Inventor
淳 吉澤
Original Assignee
ソニーグループ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーグループ株式会社
Publication of WO2023079881A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/88 Radar or analogous systems specially adapted for specific applications
    • G01S 13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S 13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/16 Anti-collision systems

Definitions

  • the present invention relates to an information processing device, an information processing method, and a program.
  • IMU: Inertial Measurement Unit
  • GNSS: Global Navigation Satellite System
  • Patent Document 2 proposes a self-position estimation method that combines a camera and a radar. However, it does not improve the accuracy of self-position estimation.
  • the present disclosure proposes an information processing device, an information processing method, and a program capable of appropriately linking a camera and a radar to improve the accuracy of self-position estimation.
  • According to the present disclosure, an information processing device is provided that includes a visibility calculation unit that extracts visibility information from camera image data, and a radar parameter determination unit that determines operation parameters for controlling the effective positioning range of the radar based on the visibility information. Further, according to the present disclosure, there are provided an information processing method in which the information processing of the information processing device is executed by a computer, and a program for causing the computer to implement the information processing of the information processing device.
  • FIG. 1 is a diagram for explaining the outline of the auxiliary sensing mechanism.
  • FIG. 2 is a block diagram showing a configuration example of the vehicle control system.
  • FIG. 3 is a diagram showing an example of sensing areas by the external recognition sensor.
  • FIG. 4 is a schematic diagram of a conventional vehicle control system.
  • FIG. 5 is a schematic diagram of the vehicle control system of the present disclosure, which is an improvement over conventional systems.
  • FIG. 6 is a diagram showing a configuration example of the image signal processing unit.
  • FIG. 7 is a diagram showing a configuration example of the radar signal processing unit.
  • A further figure is a flow chart showing an example of information processing for controlling radar operation based on visibility information.
  • Further figures are schematic diagrams of vehicle control systems according to a first modified example and a second modified example.
  • the vehicle 1 has a vehicle control system 11 that supports driving based on sensing information such as GNSS 91 and IMU 92 .
  • the vehicle control system 11 is an information processing device that processes various types of information.
  • Vehicle 1 has an auxiliary sensing mechanism that combines camera 51 and radar 52 . Auxiliary sensing mechanisms are used to aid position detection by GNSS 91 and IMU 92 .
  • The GNSS 91 has become an indispensable technology for realizing driving support and automated driving.
  • the GNSS 91 generally receives radio waves from a plurality of satellites, calculates the distance between each satellite and the vehicle 1, and calculates the three-dimensional position of the vehicle on the ground.
  • Since the GNSS 91 uses radio waves from satellites in the sky, there is a problem that the radio waves are attenuated by shielding objects and that signal quality is degraded by multiple reflections. As a result, performance is severely compromised in certain situations, and in some cases the vehicle cannot be located. For example, it is known that tunnels and streets surrounded by tall buildings often prevent the GNSS 91 from operating properly.
  • An in-vehicle camera can be cited as a strong candidate for such additional auxiliary sensing means.
  • The image information obtained by the camera 51 has extremely high resolution, making it possible to acquire an extremely large amount of road information. Therefore, the camera is considered a very effective additional sensing means for self-contained navigation by the GNSS 91 and the IMU 92.
  • By using the surrounding image information obtained by the camera 51 to estimate the distance to specific known locations, and incorporating the resulting estimate of the position of the vehicle 1 into the position estimation results of the GNSS 91 and the IMU 92, it is possible to further improve position accuracy.
  • Patent Document 1 discloses a method in which a server specifies a position from image feature values using a camera image of the surroundings when GNSS 91 cannot be used.
  • However, the camera 51 suffers from the problem that the quality of the obtained image is greatly degraded in certain extreme environments such as rain, fog, snow, and strong light. In addition, while the camera 51 is in general an excellent means with extremely high spatial resolution, it is difficult for it to measure distance and velocity with high accuracy, particularly at long range. Thus, for the specific cases in which the camera 51 is weak, assisted sensing by the camera 51 for self-contained navigation by the GNSS 91 and the IMU 92 still leaves room for improvement.
  • the radar 52 is known as a sensing means that is highly resistant to such environmental conditions that the camera 51 is not good at.
  • using the radar 52 as an auxiliary sensing means for the camera 51 presents new problems unique to radar.
  • the radar 52 is prone to artifacts due to multipath.
  • the radar 52 is suitable for measuring the distance and speed to a distant object, but it is difficult to accurately detect the azimuth of the object.
  • Patent Document 2 discloses sensing by cooperation of the camera 51 and the radar 52 .
  • In Patent Document 2, the ambient illuminance is detected; when the illuminance is above a predetermined level, the reliability of the camera is determined to be high and the operation of the radar is stopped, and when the illuminance is below the predetermined level, the reliability of the camera is determined to be low and the operation of the camera is stopped. This reduces power consumption in sensing.
  • However, Patent Document 2 does not describe a method of assisting the GNSS 91 by adaptively varying the operating conditions of the radar based on the image information of the camera.
  • the present invention has been made in view of the above circumstances.
  • the vehicle control system 11 of the present disclosure calculates the effective positioning range of the camera 51 from the image captured by the camera 51 .
  • the effective positioning range means a range in which positioning accuracy can be guaranteed by the camera.
  • the effective positioning range is defined based on visibility, for example.
  • Visibility means the maximum distance at which an object can be clearly seen with the naked eye.
  • As a visibility measurement method, a known method defined by the WMO (World Meteorological Organization) or the like can be used.
  • Visibility information may be presented as distance, or may be presented as level information obtained by scaling the visibility. For example, it is conceivable to set the visibility level to 5 when the field of view of the camera 51 is not obstructed, and to set the visibility level to 1 when there is thick fog.
  • The auxiliary sensing mechanism changes the optimum operating parameters to be given to the radar 52 based on visibility information extracted from the image data of the camera 51.
  • Under normal conditions, the operating parameters of the radar 52 are set, based on the visibility value obtained from the image data of the camera 51, to values that allow sensing at long range with high accuracy.
  • When the environment is "foggy" or "backlit", the reliable distance measurement range obtained from the image data of the camera 51 is greatly reduced, and the visibility value decreases accordingly.
  • The radar 52 is then given operating parameters that correspond to the reduced visibility value and allow it to operate accurately at close range.
  • the operating parameters of the radar 52 are adjusted by changing parameters such as chirp bandwidth, chirp time width, and chirp transmission interval in the transmitted FMCW signal used by the radar 52 .
  • For example, the chirp signal frequency bandwidth is determined as an operating parameter based on the visibility information. If the visibility value is small, the operating parameters of the radar 52 are tuned to parameters optimal for close range; if the visibility value is large, they are tuned to parameters optimal for long range.
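  • As a concrete illustration of this visibility-to-parameter mapping, the following Python sketch selects chirp parameters from a visibility estimate. It is not taken from the disclosure: the level thresholds, bandwidth and timing values, and function names are illustrative assumptions.

```python
# Hypothetical sketch: derive FMCW chirp parameters from a visibility estimate.
# The thresholds and parameter values below are illustrative assumptions,
# not values specified in the disclosure.
from dataclasses import dataclass

@dataclass
class ChirpParams:
    bandwidth_hz: float       # chirp frequency bandwidth BW
    chirp_time_s: float       # chirp time width Tchirp
    chirp_interval_s: float   # chirp transmission interval

def visibility_to_level(visibility_m: float) -> int:
    """Scale a visibility distance (meters) to a coarse level, 1 (thick fog) .. 5 (clear)."""
    thresholds = [50.0, 100.0, 200.0, 400.0]   # assumed level boundaries
    return 1 + sum(visibility_m > t for t in thresholds)

def select_chirp_params(visibility_m: float) -> ChirpParams:
    """Small visibility -> wide bandwidth (fine range resolution at close range);
    large visibility -> narrow bandwidth (longer maximum positioning distance)."""
    level = visibility_to_level(visibility_m)
    table = {
        1: ChirpParams(bandwidth_hz=4.0e9,  chirp_time_s=40e-6, chirp_interval_s=60e-6),
        2: ChirpParams(bandwidth_hz=2.0e9,  chirp_time_s=40e-6, chirp_interval_s=60e-6),
        3: ChirpParams(bandwidth_hz=1.0e9,  chirp_time_s=50e-6, chirp_interval_s=80e-6),
        4: ChirpParams(bandwidth_hz=0.5e9,  chirp_time_s=60e-6, chirp_interval_s=100e-6),
        5: ChirpParams(bandwidth_hz=0.25e9, chirp_time_s=80e-6, chirp_interval_s=120e-6),
    }
    return table[level]

if __name__ == "__main__":
    print(select_chirp_params(80.0))    # foggy: wide bandwidth, short range
    print(select_chirp_params(600.0))   # clear: narrow bandwidth, long range
```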
  • the operating parameters of the radar 52 are variably controlled based on visibility information. Therefore, there is a need for one or more radars 52 whose operating parameters can be adaptively controlled.
  • a radar 52 may be installed in the vehicle as a retrofit, or the radar 52 already installed in the vehicle 1 may be modified in design and used as the radar 52 for the present disclosure.
  • the radar 52 for use in the present disclosure is retrofitted to the front of the vehicle 1 .
  • the vehicle control system 11 of the present disclosure can determine the direction in which positioning should be performed by the radar 52 from the image information obtained from the camera 51 and transmit it to the radar 52 . By doing so, the radar 52 can adjust the transmission/reception beam BM so as to be optimal in the direction indicated by the system, and perform sensing.
  • In addition, the object image detected by the radar 52 is fed back to the camera 51 side and compared with the image information acquired by the camera 51; after eliminating detections that may be false images (artifacts) of the radar 52, the distances to feature points obtained from the camera 51 can be mapped using the distances obtained from the radar 52. This improves the reliability of operation in environments in which the camera 51 performs poorly.
  • FIG. 2 is a block diagram showing a configuration example of the vehicle control system 11.
  • the vehicle control system 11 is provided in the vehicle 1 and performs processing related to driving support and automatic driving of the vehicle 1.
  • The vehicle control system 11 includes a vehicle control ECU (Electronic Control Unit) 21, a communication unit 22, a map information accumulation unit 23, a position information acquisition unit 24, an external recognition sensor 25, an in-vehicle sensor 26, a vehicle sensor 27, a storage unit 28, a driving support/automatic driving control unit 29, a DMS (Driver Monitoring System) 30, an HMI (Human Machine Interface) 31, and a vehicle control unit 32.
  • Vehicle control ECU 21, communication unit 22, map information storage unit 23, position information acquisition unit 24, external recognition sensor 25, in-vehicle sensor 26, vehicle sensor 27, storage unit 28, driving support/automatic driving control unit 29, driver monitoring system ( DMS) 30 , human machine interface (HMI) 31 , and vehicle control unit 32 are connected via a communication network 41 so as to be able to communicate with each other.
  • The communication network 41 is composed of, for example, an in-vehicle communication network, a bus, or the like conforming to a digital two-way communication standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), FlexRay (registered trademark), or Ethernet (registered trademark).
  • the communication network 41 may be used properly depending on the type of data to be transmitted. For example, CAN may be applied to data related to vehicle control, and Ethernet may be applied to large-capacity data.
  • Each part of the vehicle control system 11 may be directly connected, without going through the communication network 41, using wireless communication intended for relatively short-range communication, such as NFC (Near Field Communication) or Bluetooth (registered trademark).
  • the vehicle control ECU 21 is composed of various processors such as a CPU (Central Processing Unit) and an MPU (Micro Processing Unit).
  • the vehicle control ECU 21 controls the functions of the entire vehicle control system 11 or a part thereof.
  • the communication unit 22 communicates with various devices inside and outside the vehicle, other vehicles, servers, base stations, etc., and transmits and receives various data. At this time, the communication unit 22 can perform communication using a plurality of communication methods.
  • the communication with the outside of the vehicle that can be performed by the communication unit 22 will be described schematically.
  • For example, the communication unit 22 communicates with a server on an external network (hereinafter referred to as an external server) via a base station or an access point using a wireless communication method such as 5G (5th generation mobile communication system), LTE (Long Term Evolution), or DSRC (Dedicated Short Range Communications).
  • the external network with which the communication unit 22 communicates is, for example, the Internet, a cloud network, or a provider's own network.
  • the communication method that the communication unit 22 performs with the external network is not particularly limited as long as it is a wireless communication method that enables digital two-way communication at a communication speed of a predetermined value or more and a distance of a predetermined value or more.
  • the communication unit 22 can communicate with a terminal existing in the vicinity of the own vehicle using P2P (Peer To Peer) technology.
  • Terminals in the vicinity of the own vehicle are, for example, terminals worn by moving objects that move at relatively low speeds, such as pedestrians and bicycles, terminals installed at fixed positions in stores and the like, or MTC (Machine Type Communication) terminals.
  • the communication unit 22 can also perform V2X communication.
  • V2X communication refers to communication between the vehicle and others, such as vehicle-to-vehicle communication with other vehicles, vehicle-to-infrastructure communication with roadside equipment, vehicle-to-home communication, and vehicle-to-pedestrian communication with a terminal carried by a pedestrian.
  • the communication unit 22 can receive from the outside a program for updating the software that controls the operation of the vehicle control system 11 (Over The Air).
  • the communication unit 22 can also receive map information, traffic information, information around the vehicle 1, and the like from the outside.
  • the communication unit 22 can transmit information about the vehicle 1, information about the surroundings of the vehicle 1, and the like to the outside.
  • the information about the vehicle 1 that the communication unit 22 transmits to the outside includes, for example, data indicating the state of the vehicle 1, recognition results by the recognition unit 73, and the like.
  • the communication unit 22 performs communication corresponding to a vehicle emergency call system such as e-call.
  • the communication unit 22 receives electromagnetic waves transmitted by a vehicle information and communication system (VICS (registered trademark)) such as radio beacons, optical beacons, and FM multiplex broadcasting.
  • the communication with the inside of the vehicle that can be performed by the communication unit 22 will be described schematically.
  • the communication unit 22 can communicate with each device in the vehicle using, for example, wireless communication.
  • For example, the communication unit 22 can perform wireless communication with devices in the vehicle using a communication method, such as wireless LAN, Bluetooth, NFC, or WUSB (Wireless USB), that enables digital two-way communication at a communication speed higher than a predetermined value.
  • the communication unit 22 can also communicate with each device in the vehicle using wired communication.
  • the communication unit 22 can communicate with each device in the vehicle by wired communication via a cable connected to a connection terminal (not shown).
  • For example, the communication unit 22 can communicate with each device in the vehicle by wired communication, such as USB (Universal Serial Bus), HDMI (High-Definition Multimedia Interface) (registered trademark), or MHL (Mobile High-Definition Link), that enables digital two-way communication at a communication speed above a predetermined level.
  • equipment in the vehicle refers to equipment that is not connected to the communication network 41 in the vehicle, for example.
  • in-vehicle devices include mobile devices and wearable devices possessed by passengers PA such as drivers, information devices brought into the vehicle and temporarily installed, and the like.
  • The map information accumulation unit 23 accumulates one or both of maps obtained from the outside and maps created by the vehicle 1. For example, the map information accumulation unit 23 accumulates a three-dimensional high-precision map and a global map that covers a wide area but is lower in accuracy than the high-precision map.
  • High-precision maps are, for example, dynamic maps, point cloud maps, vector maps, etc.
  • the dynamic map is, for example, a map consisting of four layers of dynamic information, quasi-dynamic information, quasi-static information, and static information, and is provided to the vehicle 1 from an external server or the like.
  • a point cloud map is a map composed of a point cloud (point cloud data).
  • a vector map is a map adapted to ADAS (Advanced Driver Assistance System) and AD (Autonomous Driving) by associating traffic information such as lane and traffic signal positions with a point cloud map.
  • The point cloud map and the vector map may be provided from an external server or the like, or may be created by the vehicle 1 based on the sensing results of the camera 51, the radar 52, the LiDAR 53, and the like as maps for matching with a local map described later, and stored in the map information accumulation unit 23. Further, when a high-precision map is provided from an external server or the like, map data of, for example, several hundred meters square covering the planned route that the vehicle 1 will travel is acquired from the external server or the like in order to reduce the communication volume.
  • The location information acquisition unit 24 receives GNSS signals from GNSS (Global Navigation Satellite System) satellites and acquires the position information of the vehicle 1.
  • the acquired position information is supplied to the driving support/automatic driving control unit 29 .
  • the location information acquisition unit 24 is not limited to the method using GNSS signals, and may acquire location information using beacons, for example.
  • the external recognition sensor 25 includes various sensors used for recognizing situations outside the vehicle 1 and supplies sensor data from each sensor to each part of the vehicle control system 11 .
  • the type and number of sensors included in the external recognition sensor 25 are arbitrary.
  • the external recognition sensor 25 includes a camera 51 , a radar 52 , a LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) 53 , and an ultrasonic sensor 54 .
  • the configuration is not limited to this, and the external recognition sensor 25 may be configured to include one or more types of sensors among the camera 51 , radar 52 , LiDAR 53 , and ultrasonic sensor 54 .
  • the numbers of cameras 51 , radars 52 , LiDARs 53 , and ultrasonic sensors 54 are not particularly limited as long as they are realistically installable in the vehicle 1 .
  • the type of sensor provided in the external recognition sensor 25 is not limited to this example, and the external recognition sensor 25 may be provided with other types of sensors. An example of the sensing area of each sensor included in the external recognition sensor 25 will be described later.
  • the imaging method of the camera 51 is not particularly limited.
  • various types of cameras such as a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, and an infrared camera, which are capable of distance measurement, can be applied to the camera 51 as necessary.
  • the camera 51 is not limited to this, and may simply acquire a photographed image regardless of distance measurement.
  • the external recognition sensor 25 can include an environment sensor for detecting the environment with respect to the vehicle 1.
  • the environment sensor is a sensor for detecting the environment such as weather, climate, brightness, etc., and can include various sensors such as raindrop sensors, fog sensors, sunshine sensors, snow sensors, and illuminance sensors.
  • the external recognition sensor 25 includes a microphone used for detecting the sound around the vehicle 1 and the position of the sound source.
  • the in-vehicle sensor 26 includes various sensors for detecting information inside the vehicle, and supplies sensor data from each sensor to each part of the vehicle control system 11 .
  • the types and number of various sensors included in the in-vehicle sensor 26 are not particularly limited as long as they are the types and number that can be realistically installed in the vehicle 1 .
  • the in-vehicle sensor 26 may comprise one or more sensors among cameras, radar, seating sensors, steering wheel sensors, microphones, and biometric sensors.
  • the camera provided in the in-vehicle sensor 26 for example, cameras of various shooting methods capable of distance measurement, such as a ToF camera, a stereo camera, a monocular camera, and an infrared camera, can be used.
  • the camera included in the in-vehicle sensor 26 is not limited to this, and may simply acquire a photographed image regardless of distance measurement.
  • the biosensors included in the in-vehicle sensor 26 are provided, for example, on a seat, a steering wheel, or the like, and detect various biometric information of a passenger such as a driver.
  • the vehicle sensor 27 includes various sensors for detecting the state of the vehicle 1, and supplies sensor data from each sensor to each section of the vehicle control system 11.
  • the types and number of various sensors included in the vehicle sensor 27 are not particularly limited as long as the types and number are practically installable in the vehicle 1 .
  • the vehicle sensor 27 includes a speed sensor, an acceleration sensor, an angular velocity sensor (gyro sensor), and an inertial measurement unit (IMU (Inertial Measurement Unit)) integrating them.
  • the vehicle sensor 27 includes a steering angle sensor that detects the steering angle of the steering wheel, a yaw rate sensor, an accelerator sensor that detects the amount of operation of the accelerator pedal, and a brake sensor that detects the amount of operation of the brake pedal.
  • Further, the vehicle sensor 27 includes a rotation sensor that detects the rotation speed of the engine or motor, an air pressure sensor that detects tire air pressure, a slip rate sensor that detects the tire slip rate, and a wheel speed sensor that detects the rotational speed of the wheels.
  • the vehicle sensor 27 includes a battery sensor that detects the remaining battery level and temperature, and an impact sensor that detects external impact.
  • Vehicle sensor 27 includes a sensor that detects driving status DS.
  • the storage unit 28 includes at least one of a nonvolatile storage medium and a volatile storage medium, and stores data and programs.
  • As the storage unit 28, for example, an EEPROM (Electrically Erasable Programmable Read Only Memory) and a RAM (Random Access Memory) can be used, and as the storage medium, a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device can be applied.
  • the storage unit 28 stores various programs and data used by each unit of the vehicle control system 11 .
  • the storage unit 28 includes an EDR (Event Data Recorder) and a DSSAD (Data Storage System for Automated Driving), and stores information about the vehicle 1 before and after an event such as an accident and information acquired by the in-vehicle sensor 26 .
  • the driving support/automatic driving control unit 29 controls driving support and automatic driving of the vehicle 1 .
  • the driving support/automatic driving control unit 29 includes an analysis unit 61 , an action planning unit 62 and an operation control unit 63 .
  • the analysis unit 61 analyzes the vehicle 1 and its surroundings.
  • the analysis unit 61 includes a self-position estimation unit 71 , a sensor fusion unit 72 and a recognition unit 73 .
  • the self-position estimation unit 71 estimates the self-position of the vehicle 1 based on the sensor data from the external recognition sensor 25 and the high-precision map accumulated in the map information accumulation unit 23. For example, the self-position estimation unit 71 generates a local map based on sensor data from the external recognition sensor 25, and estimates the self-position of the vehicle 1 by matching the local map and the high-precision map.
  • The position of the vehicle 1 is based on, for example, the center of the rear-wheel axle.
  • a local map is, for example, a three-dimensional high-precision map created using techniques such as SLAM (Simultaneous Localization and Mapping), an occupancy grid map, or the like.
  • the three-dimensional high-precision map is, for example, the point cloud map described above.
  • the occupancy grid map is a map that divides the three-dimensional or two-dimensional space around the vehicle 1 into grids (lattice) of a predetermined size and shows the occupancy state of objects in grid units.
  • the occupancy state of an object is indicated, for example, by the presence or absence of the object and the existence probability.
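  • For reference, the following is a minimal Python sketch of the occupancy-grid idea described above, restricted to two dimensions; the grid size, cell resolution, and log-odds update are illustrative assumptions rather than the representation actually used by the vehicle control system 11.

```python
# Minimal 2D occupancy grid sketch. Grid size, resolution, and the use of a
# simple log-odds hit update are illustrative assumptions.
import numpy as np

class OccupancyGrid:
    def __init__(self, size_m: float = 100.0, cell_m: float = 0.5):
        self.cell_m = cell_m
        self.n = int(size_m / cell_m)
        self.log_odds = np.zeros((self.n, self.n))  # 0 = unknown

    def _to_index(self, x: float, y: float):
        # Vehicle at the grid center; x forward, y left.
        i = int(x / self.cell_m) + self.n // 2
        j = int(y / self.cell_m) + self.n // 2
        return i, j

    def update(self, points_xy: np.ndarray, hit: float = 0.85):
        """Mark cells containing measured points as more likely occupied."""
        delta = np.log(hit / (1.0 - hit))
        for x, y in points_xy:
            i, j = self._to_index(x, y)
            if 0 <= i < self.n and 0 <= j < self.n:
                self.log_odds[i, j] += delta

    def occupancy_probability(self) -> np.ndarray:
        return 1.0 / (1.0 + np.exp(-self.log_odds))

if __name__ == "__main__":
    grid = OccupancyGrid()
    grid.update(np.array([[10.0, 2.0], [10.5, 2.0], [-5.0, -3.0]]))
    print(grid.occupancy_probability().max())
```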
  • the local map is also used, for example, by the recognizing unit 73 for detection processing and recognition processing of the situation outside the vehicle 1 .
  • the self-position estimation unit 71 may estimate the self-position of the vehicle 1 based on the position information acquired by the position information acquisition unit 24 and the sensor data from the vehicle sensor 27.
  • the sensor fusion unit 72 combines a plurality of different types of sensor data (for example, image data supplied from the camera 51 and sensor data supplied from the radar 52) to perform sensor fusion processing to obtain new information.
  • Methods for combining different types of sensor data include integration, fusion, federation, and the like.
  • the recognition unit 73 executes a detection process for detecting the situation outside the vehicle 1 and a recognition process for recognizing the situation outside the vehicle 1 .
  • the recognition unit 73 performs detection processing and recognition processing of the situation outside the vehicle 1 based on information from the external recognition sensor 25, information from the self-position estimation unit 71, information from the sensor fusion unit 72, and the like. .
  • the recognition unit 73 performs detection processing and recognition processing of objects around the vehicle 1 .
  • Object detection processing is, for example, processing for detecting the presence or absence, size, shape, position, movement, and the like of an object.
  • Object recognition processing is, for example, processing for recognizing an attribute such as the type of an object or identifying a specific object.
  • detection processing and recognition processing are not always clearly separated, and may overlap.
  • For example, the recognition unit 73 detects objects around the vehicle 1 by performing clustering that classifies the point cloud based on sensor data from the radar 52, the LiDAR 53, or the like into clusters of points. As a result, the presence/absence, size, shape, and position of objects around the vehicle 1 are detected.
  • the recognizing unit 73 detects the movement of objects around the vehicle 1 by performing tracking that follows the movement of the cluster of points classified by clustering. As a result, the speed and traveling direction (movement vector) of the object around the vehicle 1 are detected.
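  • The following toy Python sketch illustrates the clustering-and-tracking idea described above: points are grouped by a simple distance threshold and cluster centroids are matched between two scans to obtain movement vectors. The threshold and the nearest-centroid association are assumptions, not the method of the recognition unit 73.

```python
# Toy point-cloud clustering and centroid tracking between two scans.
# Distance thresholds and nearest-centroid association are illustrative assumptions.
import numpy as np

def cluster(points: np.ndarray, eps: float = 1.0):
    """Greedy single-linkage clustering: a point joins a cluster if it is
    within `eps` of any point already in that cluster."""
    clusters = []
    for p in points:
        for c in clusters:
            if np.min(np.linalg.norm(np.asarray(c) - p, axis=1)) < eps:
                c.append(p)
                break
        else:
            clusters.append([p])
    return [np.asarray(c) for c in clusters]

def movement_vectors(prev_scan, curr_scan, dt: float, eps: float = 1.0):
    """Match cluster centroids between scans and return per-object velocity vectors."""
    prev_c = [c.mean(axis=0) for c in cluster(prev_scan, eps)]
    curr_c = [c.mean(axis=0) for c in cluster(curr_scan, eps)]
    vectors = []
    for c_now in curr_c:
        nearest = min(prev_c, key=lambda c_old: np.linalg.norm(c_now - c_old))
        vectors.append((c_now - nearest) / dt)
    return vectors

if __name__ == "__main__":
    prev_scan = np.array([[10.0, 0.0], [10.2, 0.1], [30.0, 5.0]])
    curr_scan = np.array([[10.8, 0.0], [11.0, 0.1], [30.0, 5.0]])
    print(movement_vectors(prev_scan, curr_scan, dt=0.1))  # first object ~8 m/s forward
```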
  • the recognition unit 73 detects or recognizes vehicles, people, bicycles, obstacles, structures, roads, traffic lights, traffic signs, road markings, etc. based on image data supplied from the camera 51 . Further, the recognition unit 73 may recognize types of objects around the vehicle 1 by performing recognition processing such as semantic segmentation.
  • The recognition unit 73 can perform recognition processing of traffic rules around the vehicle 1 based on the map accumulated in the map information accumulation unit 23, the self-position estimation result by the self-position estimation unit 71, and the recognition result of objects around the vehicle 1 by the recognition unit 73. Through this processing, the recognition unit 73 can recognize the position and state of traffic lights, the content of traffic signs and road markings, the content of traffic restrictions, the lanes in which the vehicle can travel, and the like.
  • the recognition unit 73 can perform recognition processing of the environment around the vehicle 1 .
  • Surrounding environments to be recognized by the recognizing unit 73 include presence/absence of pedestrians and surrounding vehicles, weather, temperature, humidity, brightness, road surface conditions, and the like.
  • the action plan section 62 creates an action plan for the vehicle 1.
  • the action planning unit 62 creates an action plan by performing route planning and route following processing.
  • Route planning (global path planning) is the process of planning a rough route from the start to the goal. This route planning also includes trajectory generation (local path planning) processing for generating, along the planned route, a trajectory on which the vehicle 1 can travel safely and smoothly in its vicinity, taking the motion characteristics of the vehicle 1 into consideration.
  • Route following is the process of planning actions to safely and accurately travel the route planned by route planning within the planned time.
  • the action planning unit 62 can, for example, calculate the target speed and the target angular speed of the vehicle 1 based on the result of this route following processing.
  • the motion control unit 63 controls the motion of the vehicle 1 in order to implement the action plan created by the action planning unit 62.
  • For example, the operation control unit 63 controls the steering control unit 81, the brake control unit 82, and the drive control unit 83 included in the vehicle control unit 32, which will be described later, and performs acceleration/deceleration control and direction control so that the vehicle 1 travels along the trajectory calculated by the trajectory planning.
  • the operation control unit 63 performs cooperative control aimed at realizing ADAS functions such as collision avoidance or shock mitigation, follow-up driving, vehicle speed maintenance driving, collision warning of own vehicle, and lane deviation warning of own vehicle.
  • the operation control unit 63 performs cooperative control aimed at automatic driving in which the vehicle autonomously travels without depending on the operation of the driver.
  • the DMS 30 performs driver authentication processing, driver state recognition processing, etc., based on sensor data from the in-vehicle sensor 26 and input data input to the HMI 31, which will be described later.
  • As the state of the driver to be recognized for example, physical condition, wakefulness, concentration, fatigue, gaze direction, drunkenness, driving operation, posture, etc. are assumed.
  • The DMS 30 may perform authentication processing for passengers other than the driver and processing for recognizing the state of such passengers. Further, for example, the DMS 30 may perform recognition processing of the situation inside the vehicle based on the sensor data from the in-vehicle sensor 26. Conditions inside the vehicle to be recognized include, for example, temperature, humidity, brightness, and smell.
  • the HMI 31 inputs various data, instructions, etc., and presents various data to passengers including the driver.
  • the HMI 31 comprises an input device for human input of data.
  • the HMI 31 generates an input signal based on data, instructions, etc. input from an input device, and supplies the input signal to each section of the vehicle control system 11 .
  • the HMI 31 includes operators such as a touch panel, buttons, switches, and levers as input devices.
  • the HMI 31 is not limited to this, and may further include an input device capable of inputting information by a method other than manual operation using voice, gestures, or the like.
  • the HMI 31 may use, as an input device, a remote control device using infrared rays or radio waves, or an external connection device such as a mobile device or wearable device corresponding to the operation of the vehicle control system 11 .
  • the presentation of data by HMI31 will be briefly explained.
  • the HMI 31 generates visual information, auditory information, and tactile information for the passenger or outside the vehicle.
  • the HMI 31 performs output control for controlling the output, output content, output timing, output method, and the like of each generated information.
  • the HMI 31 generates and outputs visual information such as an operation screen, a status display of the vehicle 1, a warning display, and information (contents) indicated by an image or light such as a monitor image showing the situation around the vehicle 1.
  • the HMI 31 also generates and outputs information indicated by sounds such as voice guidance, warning sounds, warning messages, etc., as auditory information.
  • the HMI 31 generates and outputs, as tactile information, information given to the passenger's tactile sense by force, vibration, movement, or the like.
  • As an output device from which the HMI 31 outputs visual information, a display device that presents visual information by displaying an image by itself, or a projector device that presents visual information by projecting an image, can be applied.
  • In addition to a device having a normal display, the display device may be a device that displays visual information within the passenger's field of view, such as a head-up display, a transmissive display, or a wearable device with an AR (Augmented Reality) function.
  • the HMI 31 can also use a display device provided in the vehicle 1, such as a navigation device, an instrument panel, a CMS (Camera Monitoring System), an electronic mirror, a lamp, etc., as an output device for outputting visual information.
  • Audio speakers, headphones, and earphones can be applied as output devices for the HMI 31 to output auditory information.
  • a haptic element using haptic technology can be applied as an output device for the HMI 31 to output tactile information.
  • a haptic element is provided at a portion of the vehicle 1 that is in contact with a passenger, such as a steering wheel or a seat.
  • the vehicle control unit 32 controls each unit of the vehicle 1.
  • the vehicle control section 32 includes a steering control section 81 , a brake control section 82 , a drive control section 83 , a body system control section 84 , a light control section 85 and a horn control section 86 .
  • the steering control unit 81 detects and controls the state of the steering system of the vehicle 1 .
  • the steering system includes, for example, a steering mechanism including a steering wheel, an electric power steering, and the like.
  • the steering control unit 81 includes, for example, a steering ECU that controls the steering system, an actuator that drives the steering system, and the like.
  • the brake control unit 82 detects and controls the state of the brake system of the vehicle 1 .
  • the brake system includes, for example, a brake mechanism including a brake pedal, an ABS (Antilock Brake System), a regenerative brake mechanism, and the like.
  • the brake control unit 82 includes, for example, a brake ECU that controls the brake system, an actuator that drives the brake system, and the like.
  • the drive control unit 83 detects and controls the state of the drive system of the vehicle 1 .
  • the drive system includes, for example, an accelerator pedal, a driving force generator for generating driving force such as an internal combustion engine or a driving motor, and a driving force transmission mechanism for transmitting the driving force to the wheels.
  • the drive control unit 83 includes, for example, a drive ECU that controls the drive system, an actuator that drives the drive system, and the like.
  • the body system control unit 84 detects and controls the state of the body system of the vehicle 1 .
  • the body system includes, for example, a keyless entry system, smart key system, power window device, power seat, air conditioner, air bag, seat belt, shift lever, and the like.
  • the body system control unit 84 includes, for example, a body system ECU that controls the body system, an actuator that drives the body system, and the like.
  • the light control unit 85 detects and controls the states of various lights of the vehicle 1 .
  • Lights to be controlled include, for example, headlights, backlights, fog lights, turn signals, brake lights, projections, bumper displays, and the like.
  • the light control unit 85 includes a light ECU that controls the light, an actuator that drives the light, and the like.
  • the horn control unit 86 detects and controls the state of the car horn of the vehicle 1 .
  • the horn control unit 86 includes, for example, a horn ECU for controlling the car horn, an actuator for driving the car horn, and the like.
  • FIG. 3 is a diagram showing an example of sensing areas by the camera 51, the radar 52, the LiDAR 53, the ultrasonic sensor 54, and the like of the external recognition sensor 25. Note that FIG. 3 schematically shows the vehicle 1 viewed from above; the left side of FIG. 3 is the front end of the vehicle 1, and the right side is the rear end of the vehicle 1.
  • a sensing area 101F and a sensing area 101B are examples of sensing areas of the ultrasonic sensor 54.
  • The sensing area 101F covers the periphery of the front end of the vehicle 1 with a plurality of ultrasonic sensors 54.
  • the sensing area 101B covers the periphery of the rear end of the vehicle 1 with a plurality of ultrasonic sensors 54 .
  • the sensing results in the sensing area 101F and the sensing area 101B are used, for example, for parking assistance of the vehicle 1 and the like.
  • Sensing areas 102F to 102B show examples of sensing areas of the radar 52 for short or medium range.
  • the sensing area 102F covers the front of the vehicle 1 to a position farther than the sensing area 101F.
  • the sensing area 102B covers the rear of the vehicle 1 to a position farther than the sensing area 101B.
  • the sensing area 102L covers the rear periphery of the left side surface of the vehicle 1 .
  • the sensing area 102R covers the rear periphery of the right side surface of the vehicle 1 .
  • the sensing result in the sensing area 102F is used, for example, to detect vehicles, pedestrians, etc. existing in front of the vehicle 1.
  • the sensing result in the sensing area 102B is used for the rear collision prevention function of the vehicle 1, for example.
  • the sensing results in the sensing area 102L and the sensing area 102R are used, for example, to detect an object in a blind spot on the side of the vehicle 1, or the like.
  • Sensing areas 103F to 103B show examples of sensing areas by the camera 51 .
  • the sensing area 103F covers the front of the vehicle 1 to a position farther than the sensing area 102F.
  • the sensing area 103B covers the rear of the vehicle 1 to a position farther than the sensing area 102B.
  • the sensing area 103L covers the periphery of the left side surface of the vehicle 1 .
  • the sensing area 103R covers the periphery of the right side surface of the vehicle 1 .
  • the sensing results in the sensing area 103F can be used, for example, for recognition of traffic lights and traffic signs, lane departure prevention support systems, and automatic headlight control systems.
  • a sensing result in the sensing area 103B can be used for parking assistance and a surround view system, for example.
  • Sensing results in the sensing area 103L and the sensing area 103R can be used, for example, in a surround view system.
  • the sensing area 104 shows an example of the sensing area of the LiDAR53.
  • the sensing area 104 covers the front of the vehicle 1 to a position farther than the sensing area 103F.
  • the sensing area 104 has a narrower lateral range than the sensing area 103F.
  • the sensing results in the sensing area 104 are used, for example, to detect objects such as surrounding vehicles.
  • a sensing area 105 shows an example of a sensing area of the long-range radar 52 .
  • the sensing area 105 covers the front of the vehicle 1 to a position farther than the sensing area 104 .
  • the sensing area 105 has a narrower lateral range than the sensing area 104 .
  • the sensing results in the sensing area 105 are used, for example, for ACC (Adaptive Cruise Control), emergency braking, and collision avoidance.
  • the sensing regions of the cameras 51, the radar 52, the LiDAR 53, and the ultrasonic sensors 54 included in the external recognition sensor 25 may have various configurations other than those shown in FIG. Specifically, the ultrasonic sensor 54 may also sense the sides of the vehicle 1 , and the LiDAR 53 may sense the rear of the vehicle 1 . Moreover, the installation position of each sensor is not limited to each example mentioned above. Also, the number of each sensor may be one or plural.
  • FIG. 4 is a schematic diagram of a conventional vehicle control system 19.
  • the conventional vehicle control system 19 processes the data of the camera 51 and the radar 52 in parallel.
  • the data processing unit 200 uses the image signal processing unit 201 and the radar signal processing unit 202 to extract feature points of the object from the data of the camera 51 and the radar 52, respectively.
  • the data processing unit 200 collates the feature points extracted from the data of the camera 51 and the radar 52 with the image feature point database 203 and the radar feature point database 204 .
  • the image feature point database 203 stores feature point data extracted from past image data.
  • In the radar feature point database 204, data of feature points extracted from past measurement data of the radar 52 are accumulated.
  • the data processing unit 200 estimates the position of the camera 51 by matching the feature point group extracted from the image data of the camera 51 with the image feature point database 203 .
  • the data processing unit 200 estimates the position of the radar 52 by collating the feature point group extracted from the measurement data of the radar 52 with the radar feature point database 204 .
  • a known method is adopted as a method of estimating the self-position using the feature point group.
  • Patent Literature 1 describes a method of extracting feature points from image data and estimating their positions.
  • the data processing unit 200 outputs the camera position estimation information and the radar position estimation information to the data integration unit 210 .
  • the data integration unit 210 integrates position information data of the camera 51 and the radar 52 with the self-contained navigation data derived from the GNSS 91 and the IMU 92 .
  • the data integration unit 210 may refer to the map database 95 as appropriate to further confirm the validity of the position information.
  • The data integration unit 210 integrates the signals by, for example, Kalman filter processing or other signal processing, and outputs position information.
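  • As a simplified illustration of this kind of Kalman-style integration, the Python sketch below fuses a dead-reckoning position estimate with an auxiliary position fix in two dimensions. The state model and covariance values are assumptions, not those of the data integration unit 210.

```python
# Simplified 2D Kalman measurement update: fuse a predicted (GNSS/IMU dead-reckoning)
# position with an auxiliary-sensor (camera/radar-based) position fix.
# All covariance values are illustrative assumptions.
import numpy as np

def kalman_position_update(x_pred, P_pred, z_aux, R_aux):
    """x_pred: predicted position (2,), P_pred: its covariance (2,2),
    z_aux: auxiliary-sensor position measurement (2,), R_aux: its covariance (2,2)."""
    H = np.eye(2)                                  # we measure position directly
    S = H @ P_pred @ H.T + R_aux                   # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_new = x_pred + K @ (z_aux - H @ x_pred)      # corrected position
    P_new = (np.eye(2) - K @ H) @ P_pred           # corrected covariance
    return x_new, P_new

if __name__ == "__main__":
    x_pred = np.array([100.0, 50.0])               # dead-reckoning estimate (m)
    P_pred = np.diag([4.0, 4.0])                   # its uncertainty (m^2)
    z_aux = np.array([101.5, 49.0])                # camera/radar-based fix (m)
    R_aux = np.diag([1.0, 1.0])                    # auxiliary-sensor uncertainty (m^2)
    x_new, P_new = kalman_position_update(x_pred, P_pred, z_aux, R_aux)
    print(x_new)   # pulled toward the more certain auxiliary measurement
```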
  • FIG. 5 is a schematic diagram of the vehicle control system 11 of the present disclosure, which is an improvement over the conventional system.
  • the vehicle control system 11 of the present disclosure integrates feature point group information extracted from the camera 51 and feature point group information extracted from radar 52 data.
  • the vehicle control system 11 performs self-position estimation using highly accurate feature point information obtained by integration.
  • the process of generating feature point information is performed by the data processing unit 300 .
  • the data integration unit 310 performs self-location estimation processing based on feature point information.
  • data processing section 300 includes sensor fusion section 72 .
  • Data integrating section 310 includes self-position estimating section 71 .
  • the data processing unit 300 has an image signal processing unit 301 , a radar signal processing unit 302 and an auxiliary sensor position information generating unit 303 .
  • the image signal processing unit 301 extracts feature point information from the image data of the camera 51 .
  • the feature point information includes information on the position of each feature point included in the captured image.
  • the image signal processing unit 301 also extracts visibility information from the image data, and determines operating parameters of the radar 52 based on the visibility information.
  • the image signal processing unit 301 controls the positioning direction (feature point azimuth) and effective positioning range (direction and shape of the beam BM) of the radar 52 by adjusting operation parameters of the radar 52 . Adjustment of the operating parameters of the radar 52 is performed by the drive control section 83 .
  • the effective positioning range of the radar 52 is defined by the maximum positioning distance of the radar 52.
  • the maximum positioning distance means the maximum distance for which positioning accuracy is guaranteed by the radar 52 .
  • The maximum positioning distance of the radar 52 is set to a value slightly larger than the visibility calculated from the image data of the camera 51. If the visibility is short, the maximum positioning distance of the radar 52 is also set short; if the visibility is long, the maximum positioning distance of the radar 52 is also set long. For example, the maximum positioning distance is set to a value that is larger than the visibility obtained from the image data by 5 meters or more and 10 meters or less.
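  • A minimal sketch of this rule, assuming the 5 to 10 meter margin stated above (the specific default of 7.5 m is an arbitrary choice within that band):

```python
# Set the radar's maximum positioning distance slightly beyond the camera visibility.
# The 5-10 m margin follows the text above; the chosen default of 7.5 m is an assumption.
def radar_max_positioning_distance(visibility_m: float, margin_m: float = 7.5) -> float:
    margin_m = min(max(margin_m, 5.0), 10.0)   # keep the margin within 5-10 m
    return visibility_m + margin_m

print(radar_max_positioning_distance(80.0))   # e.g. 87.5 m in fog with 80 m visibility
```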
  • the radar signal processing unit 302 extracts the distance to each feature point (feature point distance) from the sensor data measured by the radar 52 .
  • the image signal processing unit 301 integrates information regarding the position of each feature point extracted from the image data of the camera 51 and information regarding the distance to each feature point obtained from the radar signal processing unit 302 . Thereby, the image signal processing unit 301 increases the accuracy of the feature point information generated based on the image data.
  • the auxiliary sensor position information generation unit 303 performs self-position estimation based on the feature point information whose accuracy has been improved through integration processing.
  • The image feature point database CD is stored in, for example, the storage unit 28 of the vehicle control system 11.
  • the auxiliary sensor position information generating unit 303 estimates the position of the camera 51 by collating the feature point information after the integration processing with the image feature point database CD.
  • the auxiliary sensor position information generation unit 303 generates auxiliary sensor position information indicating the position of the camera 51 and supplies it to the data integration unit 310 .
  • the data integration unit 310 integrates the auxiliary sensor position information with the self-contained navigation data obtained from the GNSS 91 and the IMU 92 to generate final position information of the vehicle 1 .
  • The difference between the method of the present disclosure and the conventional method is that the radar processing is structured as a process that accompanies the camera processing. Therefore, in the vehicle control system 11 of the present disclosure, the information processed by the radar signal processing section 302 is provided indirectly via the image signal processing section 301.
  • FIG. 6 is a diagram showing a configuration example of the image signal processing unit 301.
  • the image signal processing unit 301 has a visibility calculation unit 311 , a feature point extraction unit 312 and a feature point information update unit 313 .
  • The input to the image signal processing unit 301 is the image data supplied from the camera 51.
  • the visibility calculation unit 311 extracts visibility information from the image data of the camera 51 .
  • Visibility information includes information about the maximum distance (visibility) at which an object can be clearly recognized by the naked eye. If the visibility value is uniform regardless of direction, such as fog, rain, and snow, the visibility information includes average visibility information that is independent of direction. If the visibility value varies depending on the azimuth due to the influence of the late afternoon sun, backlight, or the like, the visibility information includes information about the visibility value (visibility distribution) for each azimuth. Whether or not the visibility distribution is biased is determined based on a preset allowable criterion.
  • the visibility calculation unit 311 is configured using a multilayer or deep neural network.
  • the visibility calculation unit 311 extracts visibility information using a multi-layer or deep neural network that has learned the relationship between the shooting scene and visibility. By having the neural network learn image scenes such as rain, fog, and backlight in advance, it is possible to obtain visibility information for the input image with high response.
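  • A hypothetical PyTorch sketch of such a visibility estimator is shown below: a small convolutional network classifies a camera frame into five visibility levels. The architecture, input size, and number of levels are illustrative assumptions, not details of the visibility calculation unit 311.

```python
# Hypothetical PyTorch sketch of a visibility estimator: a small CNN classifies a
# camera frame into 5 visibility levels (1 = thick fog ... 5 = clear).
# The architecture, input size, and level count are illustrative assumptions.
import torch
import torch.nn as nn

class VisibilityNet(nn.Module):
    def __init__(self, num_levels: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_levels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)
        return self.classifier(h)          # logits over visibility levels

if __name__ == "__main__":
    model = VisibilityNet()
    frame = torch.randn(1, 3, 224, 224)    # one RGB camera frame
    level = model(frame).argmax(dim=1).item() + 1
    print(f"estimated visibility level: {level}")
```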
  • the feature point extraction unit 312 extracts a plurality of feature points from the image data.
  • the feature point extraction unit 312 generates feature point information based on the position of each feature point extracted from the image data, and supplies the feature point information update unit 313 with the feature point information. Extraction of feature points is performed using an algorithm such as SfM (Structure from Motion) or SuperPoint, which analyzes the feature quantity of an image, or an algorithm developed from these algorithms.
  • In SfM, self-location is estimated through steps such as image acquisition, feature point extraction/association, camera position/orientation estimation, and 3D map generation.
  • Non-Patent Document A describes an example of SfM.
  • Non-Patent Document A: Photogrammetry and Remote Sensing, Vol. 55, No. 3
  • SuperPoint uses deep learning as a means of estimating self-location using a camera.
  • deep learning is used to learn in advance the relationship between the absolute orientation of the image sensor and the image captured in that orientation.
  • When an image for orientation estimation is input to the trained model, the absolute position and orientation of the image sensor are estimated.
  • Non-Patent Document B below describes an example of SuperPoint.
  • the feature point extraction unit 312 detects the direction to be sensed in detail by the radar 52 as the feature point direction based on the visibility information.
  • The feature point azimuth is forward in the traveling direction of the vehicle 1 by default. If the visibility distribution satisfies the allowable criterion, the feature point extraction unit 312 selects the default orientation as the feature point azimuth. If the visibility distribution is biased beyond the allowable criterion, the feature point extraction unit 312 selects the azimuth with the shortest visibility from the visibility distribution as the feature point azimuth.
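  • A small Python sketch of this azimuth selection is shown below; representing the visibility distribution as a mapping from azimuth to visibility and expressing the allowable criterion as a max/min ratio are assumptions made for illustration.

```python
# Choose the azimuth the radar should sense in detail (the "feature point azimuth").
# Representing the visibility distribution as {azimuth_deg: visibility_m} and the
# bias criterion as a max/min ratio are illustrative assumptions.
DEFAULT_AZIMUTH_DEG = 0.0   # forward in the traveling direction

def select_feature_point_azimuth(visibility_by_azimuth: dict[float, float],
                                 bias_ratio_limit: float = 1.5) -> float:
    v_min = min(visibility_by_azimuth.values())
    v_max = max(visibility_by_azimuth.values())
    if v_max / v_min <= bias_ratio_limit:       # distribution roughly uniform
        return DEFAULT_AZIMUTH_DEG
    # Biased distribution: point the radar toward the azimuth with the shortest visibility.
    return min(visibility_by_azimuth, key=visibility_by_azimuth.get)

print(select_feature_point_azimuth({-30.0: 300.0, 0.0: 320.0, 30.0: 310.0}))  # -> 0.0
print(select_feature_point_azimuth({-30.0: 90.0, 0.0: 300.0, 30.0: 310.0}))   # -> -30.0
```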
  • The feature point information updating unit 313 determines the reliability (accuracy) of the position of each feature point based on the visibility information. For example, when a feature point position obtained from the image data of the camera 51 lies beyond the effective positioning range of the camera 51 calculated from the visibility information, the feature point information updating unit 313 presumes that the accuracy of that position is low. In this case, the feature point information updating unit 313 corrects the position of the feature point estimated to be inaccurate based on the distance to the feature point obtained from the radar signal processing unit 302.
  • the feature point information updating unit 313 decomposes the position of the feature point estimated to be inaccurate into the orientation of the feature point and the distance to the feature point.
  • the feature point information updating unit 313 replaces the distance to the feature point with the distance to the feature point obtained from the radar signal processing unit 302 without changing the direction of the feature point.
  • In this way, the feature point information updating unit 313 updates the feature point information obtained from the image data of the camera 51 with high accuracy.
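For illustration, the correction described above can be sketched as follows, assuming a planar, sensor-centered coordinate system; the function and parameter names are hypothetical.

```python
# Sketch of the update rule: the camera-derived azimuth is kept, and the
# camera-derived distance is replaced by the radar distance when the feature
# point lies beyond the camera's effective positioning range. A planar,
# sensor-centered coordinate system is assumed.
import math

def update_feature_point(xy_camera, radar_distance_m, camera_effective_range_m):
    x, y = xy_camera                          # position estimated from the image data
    azimuth = math.atan2(y, x)                # direction is left unchanged
    distance = math.hypot(x, y)
    if distance > camera_effective_range_m:   # low-reliability camera estimate
        distance = radar_distance_m           # use the radar distance instead
    return (distance * math.cos(azimuth), distance * math.sin(azimuth))
```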
  • the auxiliary sensor position information generation unit 303 collates the updated feature point information with the image feature point database CD to generate auxiliary sensor position information indicating the position of the camera 51 .
  • FIG. 7 is a diagram showing a configuration example of the radar signal processing unit 302.
  • the radar signal processing unit 302 has a radar parameter determination unit 321 and a feature point distance calculation unit 322 .
  • the radar parameter determination unit 321 determines operation parameters for controlling the positioning direction and effective positioning range of the radar 52 based on the visibility information.
  • the radar 52 performs distance measurements for feature points in the positioning direction (feature point azimuth) defined by the operating parameters and the effective positioning range.
  • the operating parameters include parameters for controlling the range resolution of the radar 52 and the like.
  • the feature point distance calculator 322 analyzes the measurement data of the radar 52 based on the operating parameters.
  • If the visibility value is small, the operating parameters of the radar 52 are tuned for short-range measurement; if the visibility value is large, they are tuned for longer range.
  • The maximum positioning distance R_max of an FMCW radar is generally given by the following formula (1).
  • In formula (1), c indicates the speed of light, T_chirp indicates the chirp signal time width, BW indicates the chirp signal frequency bandwidth, and f_S indicates the sampling frequency of the IF signal.
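Formula (1) itself is not reproduced in this text. For reference, a commonly used FMCW expression with these symbols, assuming the IF (beat) signal is real-sampled so that the usable beat frequency is at most f_S/2, is:

$$R_{\max} = \frac{c \, f_S \, T_{\mathrm{chirp}}}{4 \, BW}$$

This follows from the beat-frequency relation $f_b = 2R \cdot BW / (c \, T_{\mathrm{chirp}})$ with $f_b \le f_S/2$, and it is consistent with the discussion below: R_max grows when f_S is raised or BW is reduced.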
  • the range resolution ⁇ R is generally given by the following formula (2).
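Formula (2) is likewise not reproduced here; the standard FMCW range-resolution expression, which matches the trade-off described next, is:

$$\Delta R = \frac{c}{2 \, BW}$$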
  • The sampling frequency f_S corresponds to the operating frequency of the A/D converter that samples the beat signal, so increasing f_S raises the circuit cost. To increase the maximum positioning distance R_max, it is therefore preferable to keep the chirp signal frequency bandwidth BW small. The drawback is that the range resolution ΔR increases (becomes coarser) at the same time.
  • auxiliary positioning is performed by the radar 52 in an area equal to or greater than the effective limit distance for positioning by the camera 51 .
  • The limit distance is determined with reference to the visibility information obtained by the image signal processing unit 301. The radar signal processing unit 302 can therefore obtain the best range resolution ΔR by setting the largest chirp signal frequency bandwidth BW that still covers the region beyond the limit distance.
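A hedged sketch of how the radar parameter determination unit 321 might derive BW from the visibility-derived limit distance, by inverting the R_max expression given above, is shown below; the numeric defaults, margin factor, and hardware limit are assumptions.

```python
# Hedged sketch of the bandwidth selection: pick the largest BW whose maximum
# positioning distance (using the real-sampling expression given above for
# formula (1)) still covers the region beyond the camera's limit distance.
# The chirp parameters, margin factor, and hardware limit are assumptions.
C = 299_792_458.0  # speed of light [m/s]

def choose_chirp_bandwidth(limit_distance_m, f_s_hz=10e6, t_chirp_s=50e-6,
                           margin=1.5, bw_max_hz=4e9):
    required_r_max = margin * limit_distance_m              # cover beyond the camera range
    bw = C * f_s_hz * t_chirp_s / (4.0 * required_r_max)    # invert R_max = c*f_S*T_chirp/(4*BW)
    bw = min(bw, bw_max_hz)                                  # respect the hardware limit
    range_resolution = C / (2.0 * bw)                        # ΔR = c / (2*BW)
    return bw, range_resolution
```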
  • the feature point distance calculator 322 calculates the distance to each feature point based on the measurement result of the radar 52 and supplies it to the feature point information updater 313 .
  • a feature point information updating unit 313 integrates the feature point information extracted from the image data and the distance information to each feature point to update the feature point information.
  • The feature point distance calculator 322 A/D-converts the beat signal input from the radar 52 and then calculates the distance to each feature point. For example, the feature point distance calculator 322 performs an FFT on the input signal and estimates the distance to the feature point from the peak of the resulting spectrum.
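As an illustration of this step, the following sketch estimates range from a digitized beat signal; the sampling and chirp parameters would come from the operation parameters, and the window choice and the omission of CFAR or velocity processing are simplifications.

```python
# Sketch of the range estimation step: FFT of the digitized beat signal, peak
# picking, then conversion of the beat frequency to a distance. Windowing choice
# is arbitrary and CFAR/velocity processing is omitted for brevity.
import numpy as np

def estimate_range(beat_samples: np.ndarray, f_s_hz: float,
                   bw_hz: float, t_chirp_s: float) -> float:
    n = len(beat_samples)
    spectrum = np.abs(np.fft.rfft(beat_samples * np.hanning(n)))
    peak_bin = int(np.argmax(spectrum[1:]) + 1)   # skip the DC bin
    f_beat = peak_bin * f_s_hz / n                # beat frequency [Hz]
    # For a chirp of slope BW / T_chirp: f_beat = 2 * R * slope / c.
    return f_beat * 299_792_458.0 * t_chirp_s / (2.0 * bw_hz)
```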
  • the radar signal processing unit 302 refers to the feature point azimuth given from the image signal processing unit 301 and feeds back distance information to the feature point to the image signal processing unit 301 .
  • Such coordinated operation of the camera 51 and the radar 52 enables more precise measurement than before. For example, when sensing by the camera 51 does not provide sufficient measurement accuracy, a supplementary measurement by the radar 52 is performed, and its result is fused with the feature point information of the camera 51 to obtain highly accurate feature point positioning data.
  • FIG. 8 is a flowchart showing an example of information processing for controlling the operation of the radar 52 based on visibility information.
  • the visibility calculation unit 311 extracts visibility information from the image data of the camera 51 (step S1).
  • The feature point extraction unit 312 determines whether or not the visibility distribution extracted from the visibility information is biased beyond the allowable criterion (step S2). For example, a region in which the visibility value is equal to or less than a predetermined value is defined as a low-visibility region. If the variation of visibility within the low-visibility region falls within a preset range, the visibility is determined to be uniform, that is, the visibility distribution is determined to satisfy the acceptance criterion. If the variation does not fall within the preset range, the visibility is determined to be uneven, that is, the visibility distribution is determined to be biased beyond the allowable criterion.
  • The radar parameter determining unit 321 determines the operating parameters of the radar 52 based on the average visibility calculated from the visibility distribution (step S3).
  • the average visibility means, for example, an average visibility value in a low visibility region.
  • the auxiliary sensor position information generation unit 303 generates auxiliary sensor position information using feature point information of all feature points included in the image data of the camera 51 .
  • the feature point extraction unit 312 extracts the orientation with the shortest visibility from the visibility distribution as the feature point orientation (step S4).
  • the radar parameter determination unit 321 sets the detection range of the radar 52 to a range corresponding to the feature point azimuth (step S5), and determines the operation parameter of the radar 52 based on the visibility of the feature point azimuth (step S6).
  • The auxiliary sensor position information generation unit 303 generates the auxiliary sensor position information by selectively using the feature point information of one or more feature points existing in the feature point azimuth among the plurality of feature points included in the image data of the camera 51.
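The flow of steps S1 to S6 can be condensed into the following sketch; the unit objects, method names, and thresholds are placeholders rather than the actual interfaces of the image signal processing unit 301 and the radar parameter determination unit 321.

```python
# Condensed sketch of steps S1 to S6 of FIG. 8. The unit objects, method names,
# and thresholds are placeholders, not the actual interfaces of the image signal
# processing unit 301 or the radar parameter determination unit 321.
def control_radar_from_visibility(image_data, image_proc, radar_param_unit,
                                  low_visibility_m=300.0, allowed_spread_m=50.0):
    vis = image_proc.extract_visibility(image_data)                          # S1
    low = {az: v for az, v in vis.items() if v <= low_visibility_m}
    spread = (max(low.values()) - min(low.values())) if low else 0.0
    if spread <= allowed_spread_m:                                           # S2: no bias
        avg = sum(low.values()) / len(low) if low else sum(vis.values()) / len(vis)
        return radar_param_unit.params_for_visibility(avg)                   # S3
    azimuth = min(vis, key=vis.get)                                          # S4
    radar_param_unit.set_detection_range(azimuth)                            # S5
    return radar_param_unit.params_for_visibility(vis[azimuth])              # S6
```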
  • FIG. 9 is a schematic diagram of a vehicle control system 11A according to a first modified example.
  • the difference between this modified example and the configuration example shown in FIG. 5 is that the auxiliary sensor position information generation unit 340 and the image feature point database CD are provided in an external server.
  • the data processing section 330 has a communication device 331 that communicates with the auxiliary sensor position information generating section 340 .
  • the image signal processing unit 301 supplies feature point information to the auxiliary sensor position information generating unit 340 via the communication device 331 .
  • the auxiliary sensor position information generation unit 340 generates auxiliary sensor position information by comparing the feature point information with the image feature point database CD, and supplies the auxiliary sensor position information to the data processing unit 330 via the communication device 331 .
  • In this modification, heavy-load processing and large-capacity data are offloaded to a high-performance external server, so the configuration of the vehicle 1 is simplified.
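Purely as an illustration of this data flow (the transport, URL, and message schema are assumptions, not part of the disclosure), the exchange between the communication device 331 and the external auxiliary sensor position information generation unit 340 could look like the following.

```python
# Illustration only: the transport, URL, and message schema are assumptions and
# not part of the disclosure. Feature point information is sent to the external
# server, which collates it with the image feature point database and returns
# auxiliary sensor position information.
import json
import urllib.request

def query_position_server(feature_points, url="http://example.com/aux-position"):
    payload = json.dumps({"feature_points": feature_points}).encode("utf-8")
    req = urllib.request.Request(url, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=1.0) as resp:
        return json.loads(resp.read())   # e.g. {"x": ..., "y": ..., "heading": ...}
```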
  • FIG. 10 is a schematic diagram of a vehicle control system 11B according to a second modification.
  • the difference between this modified example and the configuration example shown in FIG. 5 is that the operating parameters of a plurality of radars 52 with different sensing regions are independently controlled based on visibility information.
  • the drive control section 83 has a plurality of radar control sections 97 corresponding to each radar 52 .
  • the radar signal processing unit 302 has a radar selection unit 323 .
  • the radar selection unit 323 acquires information on operating parameters from the radar parameter determination unit 321 . Also, the radar selection unit 323 acquires information about the feature point orientation from the feature point extraction unit 312 . The radar selection unit 323 selectively gives the acquired operation parameter to the radar 52 having a sensing region corresponding to the feature point orientation. The radar control unit 97 drives the radar 52 using the given operation parameters to detect an object.
  • the auxiliary sensor position information generation unit 303 selectively uses feature point information of one or more feature points existing in the feature point orientation to generate auxiliary sensor position information.
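A sketch of this selection logic follows; the radar descriptors and the controller API are hypothetical.

```python
# Sketch of the radar selection in the second modification: the radar whose
# sensing region contains the feature point azimuth receives the tuned operation
# parameters. The radar descriptors and the controller API are hypothetical.
def select_and_configure_radar(radars, feature_point_azimuth_deg, operation_params):
    for radar in radars:
        lo, hi = radar["sensing_region_deg"]            # e.g. (-45.0, 45.0)
        if lo <= feature_point_azimuth_deg <= hi:
            radar["controller"].apply(operation_params)  # hypothetical control API
            return radar
    return None
```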
  • the vehicle control system 11 has a visibility calculator 311 and a radar parameter determiner 321 .
  • the visibility calculator 311 extracts visibility information from the image data of the camera 51 .
  • the radar parameter determination unit 321 determines operation parameters for controlling the effective positioning range of the radar 52 based on the visibility information.
  • the processing of the vehicle control system 11 is executed by a computer.
  • the program of the present disclosure causes the computer to implement the processing of the vehicle control system 11 .
  • the effective positioning range of the radar 52 is adjusted based on the effective positioning range of the camera 51. Therefore, the camera 51 and the radar 52 can be appropriately linked to improve the accuracy of self-position estimation.
  • the vehicle control system 11 has a feature point extraction unit 312 , a feature point distance calculation unit 322 and a feature point information update unit 313 .
  • a feature point extraction unit 312 extracts a plurality of feature points from image data.
  • the feature point distance calculator 322 calculates the distance to each feature point based on the measurement result of the radar 52 .
  • the feature point information update unit 313 integrates the feature point information regarding the position of each feature point extracted from the image data and the distance information to each feature point to update the feature point information.
  • the vehicle control system 11 has an auxiliary sensor position information generator 303.
  • the auxiliary sensor position information generation unit 303 collates the updated feature point information with the image feature point database CD to generate auxiliary sensor position information indicating the position of the camera 51 .
  • the vehicle control system 11 has a data integration section 310 .
  • the data integration unit 310 integrates the auxiliary sensor position information with the self-contained navigation data to generate position information.
  • the feature point extraction unit 312 extracts the orientation with the shortest visibility from the visibility distribution as the feature point orientation.
  • the radar parameter determination unit 321 determines operation parameters based on the visibility of the feature point azimuth.
  • the auxiliary sensor position information generation unit 303 selectively uses the feature point information of one or more feature points existing in the feature point azimuth among the plurality of feature points included in the image data to generate the auxiliary sensor position information.
  • highly accurate auxiliary sensor position information can be generated using only the measurement results of the specific radar 52 optimized based on the visibility distribution.
  • the vehicle control system 11 has a radar selection section 323 .
  • the radar selection unit 323 selectively provides operation parameters to the radar 52 having a sensing region corresponding to the feature point orientation.
  • the radar parameter determination unit 321 determines the operating parameters based on the average visibility calculated from the visibility distribution.
  • the radar parameter determination unit 321 determines the chirp signal frequency bandwidth BW as an operating parameter based on the visibility information.
  • measurement by the radar 52 is performed with an appropriate range resolution ⁇ R according to the effective positioning range of the radar 52 .
  • the visibility calculation unit 311 extracts visibility information using a multilayer or deep neural network that has learned the relationship between the shooting scene and visibility.
  • (1) An information processing device having: a visibility calculation unit that extracts visibility information from the image data of a camera; and a radar parameter determination unit that determines an operation parameter for controlling the effective positioning range of a radar based on the visibility information.
  • (2) The information processing device according to (1) above, further having: a feature point extraction unit that extracts a plurality of feature points from the image data; a feature point distance calculation unit that calculates the distance to each feature point based on the measurement result of the radar; and a feature point information updating unit that updates the feature point information by integrating feature point information regarding the position of each feature point extracted from the image data and information on the distance to each feature point.
  • (3) The information processing device according to (2) above, further having an auxiliary sensor position information generation unit that generates auxiliary sensor position information indicating the position of the camera by collating the updated feature point information with an image feature point database.
  • (4) The information processing device according to (3) above, further having a data integration unit that integrates the auxiliary sensor position information with self-contained navigation data to generate position information.
  • (5) The information processing device according to (4) above, wherein the feature point extraction unit extracts the orientation with the shortest visibility from the visibility distribution as a feature point orientation, the radar parameter determination unit determines the operation parameter based on the visibility of the feature point orientation, and the auxiliary sensor position information generation unit generates the auxiliary sensor position information by selectively using feature point information of one or more feature points existing in the feature point orientation among the plurality of feature points included in the image data.
  • (6) The information processing device according to (5) above, further having a radar selection unit that selectively gives the operation parameter to a radar having a sensing region corresponding to the feature point orientation.
  • (7) The information processing device according to any one of (1) to (6) above, wherein the radar parameter determination unit determines the operation parameter based on an average visibility calculated from the visibility distribution.
  • (8) The information processing device according to any one of (1) to (7) above, wherein the radar parameter determination unit determines a chirp signal frequency bandwidth as the operation parameter based on the visibility information.
  • (9) The information processing device according to any one of (1) to (8) above, wherein the visibility calculation unit extracts the visibility information using a multilayer or deep neural network that has learned the relationship between the shooting scene and the visibility.
  • (10) A computer-implemented information processing method comprising: extracting visibility information from camera image data; and determining an operation parameter that controls the effective positioning range of a radar based on the visibility information.
  • (11) A program that causes a computer to execute processing of: extracting visibility information from camera image data; and determining an operation parameter that controls the effective positioning range of a radar based on the visibility information.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

An information processing device (11) includes a visibility calculation unit and a radar parameter determination unit. The visibility calculation unit extracts visibility information from image data of a camera (51). The radar parameter determination unit determines an operation parameter for controlling an effective positioning range of a radar (52) based on the visibility information.
PCT/JP2022/036903 2021-11-05 2022-10-03 Dispositif de traitement d'informations, procédé de traitement d'informations et programme WO2023079881A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021181168A JP2023069374A (ja) 2021-11-05 2021-11-05 情報処理装置、情報処理方法およびプログラム
JP2021-181168 2021-11-05

Publications (1)

Publication Number Publication Date
WO2023079881A1 true WO2023079881A1 (fr) 2023-05-11

Family

ID=86241429

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/036903 WO2023079881A1 (fr) 2021-11-05 2022-10-03 Dispositif de traitement d'informations, procédé de traitement d'informations et programme

Country Status (2)

Country Link
JP (1) JP2023069374A (fr)
WO (1) WO2023079881A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018151177A (ja) * 2017-03-10 2018-09-27 ソニー株式会社 情報処理装置及び情報処理方法
JP2019191945A (ja) * 2018-04-25 2019-10-31 日立オートモティブシステムズ株式会社 電子制御装置、演算方法
WO2020230694A1 (fr) * 2019-05-16 2020-11-19 ソニー株式会社 Corps mobile


Also Published As

Publication number Publication date
JP2023069374A (ja) 2023-05-18

Similar Documents

Publication Publication Date Title
US11531354B2 (en) Image processing apparatus and image processing method
US20220383749A1 (en) Signal processing device, signal processing method, program, and mobile device
US20230230368A1 (en) Information processing apparatus, information processing method, and program
US20240054793A1 (en) Information processing device, information processing method, and program
CN112534297A (zh) 信息处理设备和信息处理方法、计算机程序、信息处理系统以及移动设备
WO2022158185A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme et dispositif mobile
US20220277556A1 (en) Information processing device, information processing method, and program
CN115668285A (zh) 信息处理装置、信息处理方法、信息处理系统及程序
WO2023153083A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme de traitement d'informations et dispositif de déplacement
US20230289980A1 (en) Learning model generation method, information processing device, and information processing system
WO2023079881A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
WO2024024471A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et système de traitement d'informations
WO2023054090A1 (fr) Dispositif de traitement de reconnaissance, procédé de traitement de reconnaissance et système de traitement de reconnaissance
WO2023149089A1 (fr) Dispositif d'apprentissage, procédé d'apprentissage, et programme d'apprentissage
WO2023063145A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme de traitement d'informations
WO2024009829A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et système de commande de véhicule
US20230377108A1 (en) Information processing apparatus, information processing method, and program
WO2023032276A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et dispositif mobile
WO2023074419A1 (fr) Dispositif de traitement d'information, procédé de traitement d'information et système de traitement d'information
WO2023021756A1 (fr) Système de traitement d'informations, dispositif de traitement d'informations et procédé de traitement d'informations
WO2024048180A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et système de commande de véhicule
WO2023145460A1 (fr) Système de détection de vibration et procédé de détection de vibration
WO2024062976A1 (fr) Dispositif et procédé de traitement d'informations
WO2023162497A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image et programme de traitement d'image
US20240019539A1 (en) Information processing device, information processing method, and information processing system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22889705

Country of ref document: EP

Kind code of ref document: A1