WO2019172103A1 - Signal processing system and evaluation system for same, and signal processing device used in said signal processing system - Google Patents


Info

Publication number
WO2019172103A1
Authority
WO
WIPO (PCT)
Prior art keywords
signal processing
sensing
sensor
processing device
time
Application number
PCT/JP2019/008008
Other languages
French (fr)
Japanese (ja)
Inventor
中田 啓明
道雄 森岡
Original Assignee
日立オートモティブシステムズ株式会社 (Hitachi Automotive Systems, Ltd.)
Application filed by 日立オートモティブシステムズ株式会社 (Hitachi Automotive Systems, Ltd.)
Publication of WO2019172103A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/88: Radar or analogous systems specially adapted for specific applications
    • G01S 13/93: Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/87: Combinations of systems using electromagnetic waves other than radio waves
    • G01S 17/88: Lidar systems specially adapted for specific applications
    • G01S 17/93: Lidar systems specially adapted for specific applications for anti-collision purposes
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/16: Anti-collision systems

Definitions

  • The present invention relates to a signal processing system including external-sensor signal processing devices having a cooperative operation function, to an evaluation system for that signal processing system, and to a signal processing device used in the signal processing system. The signal processing system is composed of signal processing devices that process signals obtained from in-vehicle external sensors sensing the external environment around the vehicle, forms part of an automatic driving system, and detects or recognizes objects existing around the vehicle.
  • The present invention also relates to an evaluation system for evaluating the signal processing device and the signal processing system.
  • Patent Document 1 describes a method for recognizing the traveling environment by integrating a recognition result obtained by a radar with a recognition result obtained from an image. Patent Document 2, in order to effectively superimpose the sensing function of a radar device and the sensing function of a camera device and improve object detection accuracy, describes a method in which a marker is calculated from the supplemental area of the radar device, an edge is calculated from the image acquired by the camera device, and a component region is calculated from these calculation results.
  • Depending on the external sensor, the sensing time may differ depending on the position of the sensed object within the sensing range.
  • The processing time from processing a signal obtained from an external sensor to obtaining an object detection or recognition result may also vary with the sensor and with the external environment.
  • When the sensing times differ, the same object is sensed at a different relative position from each external sensor whenever the external sensors move together with the vehicle, as with in-vehicle sensors, or the external object itself moves. This matters when the recognition result for the sensing data of one external sensor, or the object position information obtained partway through that recognition processing, is used in the recognition processing of the sensing data of another external sensor.
  • The present invention has been made in view of the above circumstances, and its object is to provide a signal processing system that, when recognition processing is performed using sensing data from a plurality of external sensors and object position information obtained during the recognition processing for one external sensor is used in the recognition processing of the sensing data of another external sensor, can suppress the influence of positional deviation caused by the difference in sensing times, together with an evaluation system for that signal processing system and a signal processing device used in it.
  • To achieve the above object, a signal processing system according to the present invention includes a plurality of signal processing devices for external sensors, each connected to an external sensor, and each signal processing device processes the sensing data of its external sensor.
  • Sensing data from the first external sensor is recognized by the first signal processing device, and sensing data from the second external sensor is recognized by the second signal processing device.
  • The sensing time of the sensing data and the object detection position are managed together, the object detection position is corrected so as to correspond to the sensing time of the sensing data to be processed by the second signal processing device, and the second signal processing device uses the corrected position when performing its recognition processing.
  • According to the present invention, when recognition processing is performed using the sensing data of a plurality of external sensors, the object position information that is the recognition processing result of the first external sensor, or information obtained during that recognition processing, is used in the recognition processing for the second external sensor.
  • In that recognition processing, the positional deviation that occurs when specific processing, such as intensive processing, is applied to a region of the sensing data of the second external sensor can be suppressed, improving the positional accuracy and thereby the recognition accuracy of the second external sensor. The margin of the region in which the specific processing is performed can also be reduced, which reduces the processing load of the recognition processing.
  • A diagram showing an example of the internal structure of the vehicle control system.
  • A diagram illustrating how position information obtained from millimeter wave radar sensing is used in the recognition processing of a camera when the external sensors consist of a millimeter wave radar and a camera.
  • (a) and (b) are diagrams showing rough road surface regions estimated from camera sensing data.
  • A timing chart explaining an example of the signal processing operation timing in the first embodiment, in which (a) shows the operation timing when the recognition processing result for the external sensor A is used in the recognition processing for the external sensor B, and (b) shows the operation timing when the preprocessing result for the external sensor A is used in the recognition processing for the external sensor B.
  • A configuration example of an evaluation system that performs simulation evaluation of the sensor system of the first embodiment.
  • A diagram explaining an example of simplifying the time information when performing simulation evaluation of the sensor system of the first embodiment (the case where the whole of one sensing is treated as sensed simultaneously).
  • A diagram explaining another example of simplifying the time information when performing simulation evaluation of the sensor system of the first embodiment (the case where differences in sensing time within one sensing are considered only for main objects).
  • An example of the internal structure of the signal processing devices in a signal processing system that, in the fourth embodiment, processes the sensing result of the external sensor B using the sensing result of the external sensor A, and a timing chart explaining an example of the signal processing operation timing in the fourth embodiment.
  • the signal processing system 1000 of this embodiment includes an external sensor A101, an external sensor B201, a sensor A signal processing device 100, a sensor B signal processing device 200, and a fusion processing device 650.
  • External sensor A101 is a first external sensor that senses the external environment around the vehicle
  • external sensor B201 is a second external sensor that senses the external environment around the vehicle.
  • Specific examples of sensors that sense the vehicle surroundings include millimeter wave radar, monocular cameras, stereo cameras, and LiDAR (Light Detection and Ranging).
  • The external sensor A101 and the external sensor B201 are generally sensors of different kinds or models, but they may be of the same kind or model. For example, even if both are cameras, one may be an infrared camera and the other an ordinary camera.
  • the sensor A signal processing device 100 and the sensor B signal processing device 200 are connected to the external sensor A101 and the external sensor B201, respectively, and perform signal processing on the sensing data output of the external sensor A101 and the sensing data output of the external sensor B201.
  • Each signal processing device may be configured as an ECU (Electronic Control Unit).
  • Alternatively, the signal processing function may be implemented as a partial function of an ECU.
  • These signal processing apparatuses 100 and 200 perform signal processing on the sensing data, detect objects existing around the vehicle, and identify the objects as necessary, thereby outputting a final recognition result.
  • The detection results include not only position information (the object detection position) and direction but also, when millimeter waves or lasers are used, the signal reflection delay time and the temporal movement of the object, and, when cameras are used, image features and distance information calculated from parallax (in the case of a stereo camera).
  • the sensor A signal processing device 100 and the sensor B signal processing device 200 exchange time synchronization information 10 and operate in time synchronization.
  • As a method of synchronized operation, there is, for example, the method defined in IEEE 802.1AS.
  • More simply, the same clock signal may be supplied to both the sensor A signal processing device 100 and the sensor B signal processing device 200.
  • the sensor A signal processing device 100 outputs the sensor A processing information 108 to the sensor B signal processing device 200.
  • The sensor A processing information 108 includes the position information (object detection position) of objects existing around the host vehicle, detected by signal processing of the sensing data output by the external sensor A101, together with the time at which the sensing was performed.
  • The sensor B signal processing device 200 corrects the position information of objects around the host vehicle included in the sensor A processing information 108 so that it matches the sensing time of the external sensor B201.
  • For this correction, the host vehicle position information 701 input from the outside is referred to.
  • The object position information included in the sensor A processing information 108 may be the final processing result of the sensor A signal processing device 100, but it is better to use information obtained after the presence of the object has been detected and before the object type determination processing.
  • In that case, the delay from the sensing timing of the external sensor A101 until the sensor A processing information 108 required by the sensor B signal processing device 200 for its recognition processing is output is small (this is referred to as low delay), so errors in the correction of the position information can be suppressed.
  • the sensor A signal processing device 100 outputs the recognition result as the sensor A recognition result 105 to the fusion processing device 650
  • the sensor B signal processing device 200 outputs the recognition result as the sensor B recognition result 205 to the fusion processing device 650.
  • the sensor A recognition result 105 and the sensor B recognition result 205 are integrated by the fusion processing device 650 and output as an integrated recognition result 595.
  • the fusion process by the fusion processing apparatus 650 is performed using a known technique such as the technique disclosed in Patent Document 1.
  • the internal configuration of the sensor A signal processing device 100 and the sensor B signal processing device 200 in the signal processing system 1000 will be described with reference to FIG. In the following description, the sensor A signal processing device 100 and the sensor B signal processing device 200 will be described on the assumption that they operate synchronously.
  • The sensing data output of the external sensor A101 is sent via the sensor input unit A110 to the recognition unit A150, which performs recognition processing, and the recognition result of the recognition unit A150 is output as the sensor A recognition result 105 via the result output unit A120.
  • The sensor input unit A110 outputs to the external sensor A101 the parameters and timing signals necessary for controlling its operation, receives the sensing data output of the external sensor A101, and holds the received sensing data until the recognition unit A150 processes it.
  • The sensor input unit A110 also manages the sensing time of the external sensor A101 in association with the sensing data.
  • The object detection information obtained by the recognition processing of the recognition unit A150 is sent to the cooperative data output unit A140.
  • The detection information includes identification information for determining, even when a plurality of objects are detected, which objects in successive sets of sensing data are the same object.
  • The cooperative data output unit A140 acquires from the sensor input unit A110 the sensing time of the sensing data used by the recognition unit A150 for the recognition processing, and outputs the object detection information together with the sensing time information to the sensor B signal processing device 200 as the sensor A processing information 108.
  • the sensor A processing information 108 is received by the cooperative data input unit B230 in the sensor B signal processing device 200 and sent to the position correction unit B280.
  • Similarly, the sensing data output of the external sensor B201 is sent via the sensor input unit B210 to the recognition unit B250, which performs recognition processing, and the recognition result of the recognition unit B250 is output as the sensor B recognition result 205 via the result output unit B220.
  • The sensor input unit B210 outputs to the external sensor B201 the parameters and timing signals necessary for controlling its operation, receives the sensing data output of the external sensor B201, and holds the received sensing data until the recognition unit B250 processes it.
  • The sensor input unit B210 also manages the sensing time of the external sensor B201 in association with the sensing data.
  • The position correction unit B280 calculates the position correction result in time for the recognition unit B250 to use it, taking into account the time required for the position correction processing (described in detail later). Which sensing data is the target of the next recognition processing is received from the recognition unit B250, and the sensing time of that data is received from the sensor input unit B210.
  • The position correction unit B280 calculates the difference between the sensing time included in the sensor A processing information 108 and the sensing time of the sensing data of the external sensor B201, that is, the elapsed time from the sensing time at the external sensor A101 included in the sensor A processing information 108 to the sensing time of that sensing data. It then calculates the movement amount and the change of orientation of the host vehicle over that interval using the host vehicle position information 701 and, using the object position (object detection position) obtained from the sensor A signal processing device 100 as the sensor A processing information 108 and the history of sensing times (or the object position history and the elapsed time between history entries), corrects the position information included in the sensor A processing information 108 so that it corresponds to the sensing time at which the external sensor B201 performed its sensing.
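  • The following is a minimal sketch of this extrapolation-based correction, assuming two-dimensional positions, a constant-velocity object model, and an ego pose (x, y, yaw) derived from the host vehicle position information 701; the function names (for example correct_position) are illustrative and not taken from the patent.

        import numpy as np

        def ego_to_world(pose, p_ego):
            # Convert a point in the ego-vehicle frame (x forward, y left)
            # to the world frame, given pose = (x, y, yaw) of the vehicle.
            x, y, yaw = pose
            c, s = np.cos(yaw), np.sin(yaw)
            return np.array([x + c * p_ego[0] - s * p_ego[1],
                             y + s * p_ego[0] + c * p_ego[1]])

        def world_to_ego(pose, p_world):
            x, y, yaw = pose
            c, s = np.cos(yaw), np.sin(yaw)
            dx, dy = p_world[0] - x, p_world[1] - y
            return np.array([ c * dx + s * dy,
                             -s * dx + c * dy])

        def correct_position(det_prev, det_last, t_b, pose_at):
            # det_prev, det_last: (time, position in ego frame) from two
            #                     consecutive sensings of external sensor A
            # t_b:                sensing time of the sensor B data to process
            # pose_at(t):         ego pose (x, y, yaw) at time t
            # Returns the predicted object position in the ego frame at t_b.
            t1, p1 = det_prev
            t2, p2 = det_last
            # Remove ego motion by expressing both detections in a fixed frame.
            w1 = ego_to_world(pose_at(t1), p1)
            w2 = ego_to_world(pose_at(t2), p2)
            # Constant-velocity extrapolation of the object to time t_b.
            w_b = w2 + (w2 - w1) * (t_b - t2) / (t2 - t1)
            # Re-express the predicted position relative to the ego pose at t_b.
            return world_to_ego(pose_at(t_b), w_b)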
  • In this embodiment, time synchronization is performed between the signal processing devices 100 and 200 that process the respective sensing data in order to obtain the difference in sensing time between the external sensor A101 and the external sensor B201, but the time synchronization may be omitted if some method is used that keeps the time difference from drifting significantly within the range used for the processing. For example, even when the signal processing devices 100 and 200 are driven by different crystal oscillators and no time synchronization is performed, each device can obtain time information by combining common count information, for example a 4-bit count distributed once per second and received immediately beforehand as time-related information, with the time that has elapsed in that device since the count information was received. Since the difference in the number of pulses output per second by different crystal oscillators is usually small, sufficient time accuracy can be obtained for determining the difference in sensing times.
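  • The sketch below illustrates such a loose time base; the counter width, the broadcast rate, and all names are assumptions, and only the idea of combining a shared low-rate count with locally measured elapsed time is taken from the text above.

        import time

        class LooseClock:
            # Approximate shared time base without full clock synchronization:
            # a low-rate common counter (e.g. a 4-bit value broadcast about
            # once per second) combined with locally measured elapsed time.
            def __init__(self):
                self.last_count = None
                self.last_count_local = None   # local monotonic time of reception

            def on_count_received(self, count):
                self.last_count = count
                self.last_count_local = time.monotonic()

            def timestamp(self):
                # (count, elapsed-since-count) used to tag a sensing.
                return self.last_count, time.monotonic() - self.last_count_local

        def time_difference(ts_a, ts_b, count_period=1.0, modulo=16):
            # Approximate difference in seconds between two timestamps taken on
            # different, unsynchronized devices (ts_b assumed not earlier than ts_a).
            (ca, ea), (cb, eb) = ts_a, ts_b
            dc = (cb - ca) % modulo            # 4-bit counter wraps around
            return dc * count_period + eb - ea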
  • The corrected position information is sent to the recognition unit B250 and is used when the next sensing data is recognized. For example, for the region indicated by the position information, a noise removal process that performs well even though it is computationally heavy or an object identification algorithm (recognition processing algorithm) with a high recognition rate is applied, or, if no object is detected there by the recognition processing of the recognition unit B250, the noise filter strength is reduced (that is, the recognition processing parameters are changed) and the recognition processing is performed again.
  • In this way, the recognition unit B250 can perform recognition processing efficiently by applying high-load processing only to selected regions, and can suppress missed detections (in other words, suppress an increase in the undetected rate) by performing re-recognition processing on portions where an object could not be detected.
  • The use of the position information obtained from the sensor A processing information 108 in the recognition processing of the sensor B signal processing device 200 is described with reference to FIG. 3, assuming that the external sensor A101 is a LiDAR and the external sensor B201 is a camera.
  • The vehicle existing at position 922 in the LiDAR sensing La920 has moved to position 932 in the sensing Lb930.
  • The vehicle existing at position 942 in the camera sensing Ca940 has moved to position 953 in the sensing Cb950.
  • The vehicles sensed by the LiDAR and the camera are the same vehicle.
  • Position 952 in the sensing Cb950 is obtained by converting and projecting position 932 in the LiDAR sensing Lb930 onto the camera sensing data.
  • The conversion between the LiDAR and the camera is necessary because an object existing at the same position in space appears at shifted positions owing to the difference in the mounting positions of the two sensors on the vehicle.
  • Position 952 is thus the position obtained by converting position 932 in the LiDAR sensing Lb930 into a position in the camera sensing Cb950, and it deviates from the vehicle position 953 actually sensed in the sensing Cb950 because the sensing times differ. Therefore, the position correction unit B280 (see FIG. 2) estimates the position of the vehicle at the sensing time of the sensing Cb950 using the sensor A processing information 108 for the sensing La920 and the sensor A processing information 108 for the sensing Lb930, and outputs it as the position to be used in the recognition processing of the sensing Cb950 by the recognition unit B250.
  • When the position correction unit B280 estimates the position at the sensing Cb950 from the results of the sensing La920 and the sensing Lb930, it first obtains the position of the detection target vehicle in space, referenced to the position and orientation of the host vehicle at the sensing time of the sensing Lb930, using the direction and distance of the object from the host vehicle in the LiDAR detection results, the mounting position of the LiDAR on the host vehicle, the sensing time information of the sensing La920 and the sensing Lb930, and the position and orientation of the host vehicle corresponding to each sensing time. Next, the position correction unit B280 obtains by extrapolation the position in space corresponding to the sensing time of the sensing Cb950.
  • The position correction unit B280 then obtains the position in space of the detection target vehicle relative to the host vehicle at the sensing time of the sensing Cb950, taking into account the movement and orientation change of the host vehicle from the sensing time of the sensing Lb930 to the sensing time of the sensing Cb950. Finally, the position correction unit B280 determines the position of the target vehicle in the sensing Cb950 (the two-dimensional position in the sensing result (image) of the camera), taking into account the mounting position of the camera on the host vehicle.
  • Here, the position at the sensing time of the sensing Cb950 is estimated using the two object detection results of the sensing La920 and the sensing Lb930, but it may also be estimated using a larger number of LiDAR sensing results.
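  • As an illustration of the final projection step only, the sketch below maps a corrected three-dimensional position in the vehicle frame to camera pixel coordinates with a pinhole model; the camera mounting pose and the intrinsic parameters are assumptions, since the text above only states that the camera mounting position must be taken into account.

        import numpy as np

        def project_to_camera(p_vehicle, cam_R, cam_t, fx, fy, cx, cy):
            # Project a 3D point given in the vehicle frame into camera pixel
            # coordinates, using the camera mounting pose (rotation cam_R and
            # translation cam_t, vehicle frame -> camera frame) and pinhole
            # intrinsics (fx, fy, cx, cy).
            X, Y, Z = cam_R @ np.asarray(p_vehicle) + cam_t
            if Z <= 0:
                return None                    # behind the camera
            return fx * X / Z + cx, fy * Y / Z + cy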
  • In the description so far, the sensing time information of the sensing data used for the detection processing is added to the object detection result and output. In the simplest case it is sufficient to add, per sensing, the sensing start time or the time midway between the start and the end of the sensing. However, depending on the external sensor, one sensing may take a long time, and the time accuracy may then be insufficient for position correction based on time. As countermeasures, methods for adding time information are described with reference to FIGS. 4 to 6.
  • The method shown in FIG. 4 divides the sensing data of the external sensor A101 into a plurality of sensing regions and adds and manages the sensing time per sensing region. For example, when sensing proceeds by horizontal scans advancing in the vertical direction, the sensing data is divided into a plurality of sensing regions, from sensing region A981 to sensing region D984 as shown in FIG. 4, and a representative sensing time is added to each sensing region. As the representative sensing time, the sensing start time of the region or the time midway between the start and the end of sensing of the region is used.
  • When the position correction unit B280 uses the sensing time, the sensing region in which an object lies is determined from the position on the sensing data (the center of the object region) of the position-correction target objects (object p990, object q992), and the sensing time corresponding to that sensing region is used. The number of divisions and the division direction are determined in consideration of the scanning order of the external sensor during sensing, the time required for one sensing, and the accuracy required for the position correction.
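  • A minimal sketch of the per-region time lookup described above (the band layout and the names are illustrative):

        def region_sensing_time(obj_center_v, data_height, region_times):
            # region_times: representative sensing times of horizontal bands,
            # ordered from the top of the sensing data to the bottom
            # (e.g. sensing regions A981..D984 in FIG. 4).
            n = len(region_times)
            idx = min(int(obj_center_v * n / data_height), n - 1)
            return region_times[idx]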
  • The method shown in FIG. 5 defines a time calculation formula 988 that calculates the sensing time from the horizontal and vertical positions on the sensing data of the external sensor A101, and manages the coefficients Tcv and Tch and the offset time Ts necessary for the time calculation as the sensing time information.
  • When the position correction unit B280 uses the sensing time, the sensing time of the position-correction target objects (object p990, object q992) is calculated from their positions on the sensing data using the time calculation formula 988.
  • The time calculation formula 988 is an example; the formula may be modified so that the calculation becomes easier, or a different formula may be used to match the sensing behavior of the external sensor.
  • A table defining output values for input values may also be used in place of the time calculation formula 988.
  • Parameters such as the coefficients Tcv and Tch and the offset time Ts necessary for the time calculation need not be sent for every sensing if they do not change; they may be sent only when they change, or once every certain number of sensings.
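  • The patent does not spell out the time calculation formula 988; a plausible linear form that is consistent with an offset time Ts and coefficients Tcv and Tch for the vertical and horizontal positions is sketched below as an assumption.

        def sensing_time(h, v, Ts, Tcv, Tch):
            # h, v : horizontal and vertical position of the object on the sensing data
            # Ts   : offset time (e.g. the sensing start time of this frame)
            # Tcv  : time advanced per unit of vertical position (per scan line)
            # Tch  : time advanced per unit of horizontal position within a line
            return Ts + Tcv * v + Tch * h

        # Example: with Tcv = 50 us per line and Tch = 0.05 us per pixel, an object
        # centred at (h=320, v=200) is sensed 10.016 ms after the frame start Ts.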
  • The method shown in FIG. 6 adds and manages sensing time information for each object detected using the sensing data of the external sensor A101.
  • When the position correction unit B280 performs position correction based on the time information and the sensing time of the external sensor B201 also differs depending on the position on its sensing data, the sensing time of the sensing data obtained from the sensor input unit B210 is managed as a relationship between the position on the sensing data and the sensing time; the sensing time of the external sensor B201 is then obtained from the position of the detected object received from the sensor A signal processing device 100, and the difference from the sensing time included in the sensor A processing information 108 is calculated.
  • The external sensor A101 is directly connected to the sensor A signal processing device 100, and the external sensor B201 is directly connected to the sensor B signal processing device 200.
  • The sensor A signal processing device 100 and the sensor B signal processing device 200 are connected to a common bus 790 (configured as a CAN bus, in-vehicle Ethernet, or the like) serving as the in-vehicle network.
  • Because the external sensor A101 and the sensor A signal processing device 100, and the external sensor B201 and the sensor B signal processing device 200, are directly connected, signals with large data volumes, such as point clouds and images, are kept from flowing over the common bus 790.
  • Also connected to the common bus 790 are the fusion processing device 650, which outputs the integrated recognition result 595 (see FIG. 1) obtained by combining the sensor A recognition result 105 of the sensor A signal processing device 100 and the sensor B recognition result 205 of the sensor B signal processing device 200, an information display device 610 (consisting of indicators such as meters) that informs the driver of the host vehicle, a user input device 620 that receives operations from the driver (steering, accelerator, brake, and other switches), and a vehicle control system 700 that performs control related to the mechanical parts of the host vehicle.
  • The common bus 790 may be divided into a plurality of physical buses in order to keep the amount of data flowing over each bus within bounds, or to ensure safety or security. In such a case, it is necessary either to use a device that appropriately transfers information between the physical buses or to configure each device connected to the common bus 790 to connect to a plurality of physical buses as needed.
  • The host vehicle position information 701 (see FIG. 1) input to the sensor B signal processing device 200 is supplied, for example, from the vehicle control system 700 via the common bus 790. Since the vehicle control system 700 performs control related to the mechanical parts of the host vehicle, it acquires the speed and acceleration of the host vehicle with sensors, and information based on that sensing data is supplied as the host vehicle position information 701.
  • The internal configuration of the vehicle control system 700 is as shown in FIG. 8. To the common bus 790 are connected, via a vehicle integrated control device 705 that performs integrated control of the mechanical parts of the vehicle, an engine control device 710 that controls the power source, a transmission control device 720 that controls the power transmission, a brake control device 730 that controls the brakes, and a steering control device 740 that controls the steering.
  • Each control device (710, 720, 730, 740) is connected to an actuator (712, 722, 732, 742) that controls the movement of the associated mechanical part and to a sensor (714, 724, 734, 744) that monitors its state, and controls them so that the part operates properly.
  • The configuration of the vehicle control system 700 shown in FIG. 8 is an example and may be changed as appropriate depending on the configuration of the host vehicle: a plurality of control devices may be integrated, the control devices may be interconnected by the common bus 790, or, in the case of an electric vehicle, the engine control device 710 may be replaced by a motor control device.
  • The integrated recognition result 595 output from the fusion processing device 650 is transmitted to the vehicle integrated control device 705, and the vehicle integrated control device 705 issues the instructions necessary for emergency braking, for example, when there is a high risk of collision.
  • Instructions necessary for speed control are also issued based on the set speed and on the distance and relative speed to the vehicle ahead.
  • The vehicle integrated control device 705 also issues steering operation instructions to the steering control device 740 and the like based on the recognition result, so as to keep the host vehicle within its lane.
  • an automatic driving control device may be added to the common bus 790.
  • In that case, the automatic driving control device determines the behavior the vehicle should take from the output of the fusion processing device 650 and the state of the host vehicle supplied from the vehicle control system 700, and issues instructions to the vehicle control system 700.
  • The vehicle system as a whole needs to be designed so that it remains sufficiently safe and secure even in an emergency.
  • Next, the case where the external sensor A101 is a millimeter wave radar and the external sensor B201 is a camera is described, with the position information obtained from the millimeter wave radar sensing used in the recognition processing of the sensor B signal processing device 200.
  • the vehicle present at position 962 in sensing Rc 960 of the millimeter wave radar (external sensor A101) has moved to position 972 at sensing Rd 970.
  • the vehicle existing at the position 948 in the sensing Cc945 of the camera (external sensor B201) has moved to the position 958 in the sensing Cd955.
  • the vehicles sensed by the millimeter wave radar and the camera are the same vehicle.
  • The position correction unit B280 first estimates, by extrapolation, the position 974 at the sensing time of the sensing Cd955 from the positions 962 and 972 on the plane in which the millimeter wave radar detects, the position and orientation of the host vehicle at the sensing times at which the positions 962 and 972 were respectively sensed, and the sensing time of the sensing Cd955 together with the position and orientation of the host vehicle at that time.
  • The estimation may also be performed by extrapolation that additionally uses millimeter wave radar sensing results obtained before the sensing Rc960.
  • The relative speed detected by the millimeter wave radar may also be used for the estimation.
  • Next, the position correction unit B280 performs road surface estimation on the sensing Cd955 and obtains the region 971 obtained by projecting the sensing region of the millimeter wave radar onto the estimated road surface 956 in the sensing Cd955.
  • By associating the extrapolated position 974 with the estimated road surface 956, the corresponding position 975 on the camera sensing data is obtained.
  • The region indicated by position 975 in FIG. 11 is widened with a margin, and its height is set to the maximum height assumed in advance for the object 958 to be detected.
  • The height on the sensing data is calculated from the distance to the object (target object) 958.
  • The vertical position on the sensing data is obtained by treating the position of the estimated road surface 956 at position 975 as the bottom surface of the object 958. As a result, the region 959 on the sensing data of the sensing Cd955 is obtained.
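  • The sketch below illustrates one way to turn an extrapolated radar ground position into an image region with a margin and an assumed maximum object height; the pinhole camera model and all parameter names are assumptions and are not taken from the patent.

        import numpy as np

        def radar_point_to_image_box(p_ground, max_height, margin,
                                     cam_R, cam_t, fx, fy, cx, cy):
            # p_ground:   point (x, y, z) on the estimated road surface, vehicle frame
            # max_height: maximum height assumed in advance for detection targets
            # margin:     lateral margin added around the detection (metres)
            # Returns (u_min, v_min, u_max, v_max) in pixels.
            def project(p):
                X, Y, Z = cam_R @ np.asarray(p) + cam_t
                return fx * X / Z + cx, fy * Y / Z + cy

            x, y, z = p_ground
            corners = [(x, y - margin, z), (x, y + margin, z),            # bottom edge
                       (x, y - margin, z + max_height),                   # top edge
                       (x, y + margin, z + max_height)]
            uv = np.array([project(c) for c in corners])
            return uv[:, 0].min(), uv[:, 1].min(), uv[:, 0].max(), uv[:, 1].max()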
  • The road surface estimation process may be performed by the position correction unit B280 as part of the position correction processing, or it may be performed by the recognition unit B250, considering that the result is also used for recognition processing other than the position correction, with the result then provided to the position correction unit B280.
  • In the road surface estimation, a rough road surface region in the sensing data is estimated. For example, as shown in FIG. 12(a), for the sensing data 901 of one camera eye, a region 902 with a high probability of being road is estimated by approximating it with a trapezoidal region based on image features. Such a region 902 is obtained by calculating the position at which the road surface would be projected onto the image, assuming that a road running straight in the traveling direction has a certain width and a certain gradient.
  • In practice the road may be curved, so the region can also be estimated by detecting white lines in the image or by detecting the road shoulder based on features of the distance information obtained from the stereo parallax; in this way a rough road surface region 903 is estimated as shown in FIG. 12(b).
  • When a stereo camera is used, the road surface can also be estimated using a V-Disparity image, in which the parallax values are voted for each vertical position of the image. As shown in the left diagram of FIG. 13, the flat road surface 904 nearby and the distant road surface 905 on an upward gradient are projected as straight lines inclined in different oblique directions, as shown by the region 914 and the region 915 in the right diagram of FIG. 13.
  • On the other hand, an obstacle such as a preceding vehicle has the property of being projected as a straight line in the vertical direction, as in the region 917.
  • To estimate the road surface, the V-Disparity image is first converted into a binarized image using a certain threshold, and a Hough transform is then performed to detect the most dominant straight line in the V-Disparity image.
  • An example of the detected straight line is shown as the straight line 911 in the right diagram of FIG. 13.
  • The straight line 911 indicates the relationship between the vertical position at which the estimated road surface is projected in the image and the parallax, and it can be converted into the distance in the depth direction and the road surface height in three-dimensional space.
  • In this example, the detected straight line 911 agrees well with the parallax voting result on the near side (the region 914 in the right diagram of FIG. 13), but deviates from it on the far side where the gradient changes (the region 915 in the right diagram of FIG. 13), so the reliability of the road surface estimation result represented by the straight line 911 is considered low there.
  • The portion with low reliability is determined by checking the number of parallax votes at the places through which the straight line 911 passes and finding where places with fewer than a certain number of votes occur continuously.
  • In this example, the estimated road surface is treated as invalid for the sensing data corresponding to the portion whose vertical position is above the point 916 in the right diagram of FIG. 13, and a valid estimated road surface is obtained from the straight line 911 below that point.
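  • A compact sketch of the V-Disparity procedure described above (vote the parallax per image row, binarize with a threshold, then detect the dominant straight line with a Hough transform); the threshold values and the use of OpenCV are illustrative assumptions.

        import numpy as np
        import cv2  # OpenCV, used here only for the Hough transform

        def estimate_road_line(disparity, max_disp=128, vote_threshold=50):
            # disparity: HxW integer disparity map from stereo matching.
            # Returns (rho, theta) of the dominant line in the V-Disparity
            # image, or None if no line is found.
            h, w = disparity.shape
            v_disp = np.zeros((h, max_disp), dtype=np.int32)
            for row in range(h):                       # vote parallax per image row
                d = disparity[row].astype(np.int64)
                d = d[(d > 0) & (d < max_disp)]
                v_disp[row] = np.bincount(d, minlength=max_disp)[:max_disp]

            # Binarize with a fixed threshold, then detect the dominant straight line.
            binary = (v_disp >= vote_threshold).astype(np.uint8) * 255
            lines = cv2.HoughLines(binary, 1, np.pi / 180, threshold=100)
            return None if lines is None else tuple(lines[0][0])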
  • FIGS. 14 and 15 show examples of the signal processing procedures in the signal processing devices 100 and 200 (the recognition unit A150 and the recognition unit B250).
  • FIG. 14 assumes signal processing of sensing by a millimeter wave radar.
  • FIG. 15 assumes signal processing of sensing by a stereo camera.
  • In the millimeter wave radar, a millimeter wave whose frequency changes with time is transmitted, the reflected wave is received by a plurality of receiving antennas, and both the transmitted signal and the received signal at each receiving antenna are output as sensing data.
  • the output sensing data is stored in the buffer by the sensor input R (S502). Simple processing that does not require a buffer, such as addition between signals, is performed before buffering.
  • In the local maximum point detection R (S512), the distance to the object reflecting the millimeter wave is calculated from the elapsed time between transmission by the millimeter wave radar in the sensor output R (S501) and reception of the reflected wave, or from the difference between the transmission frequency and the reception frequency (when the frequency is changed with time, a difference arises between the transmission frequency and the reception frequency at the same instant, according to the delay of the reflected wave).
  • The direction of the object is detected from the phase differences of the received wave between the receiving antennas.
  • In the actual processing, each reflection appears, for example, as a peak in the result of Range-Doppler FFT processing, so local maximum points are detected, and for each local maximum point the direction, distance, and relative speed of the corresponding object with respect to the host vehicle are detected.
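  • A simplified sketch of Range-Doppler FFT peak detection for an FMCW radar is shown below; bin-to-distance and bin-to-velocity scaling, CFAR thresholding, and angle estimation from the antenna phase differences are omitted, and all parameter names are assumptions.

        import numpy as np

        def range_doppler_peaks(beat, threshold_db=15.0):
            # beat: 2D array of beat-signal samples, shape (n_chirps, n_samples).
            # Returns a list of (range_bin, doppler_bin, power_db) local maxima
            # that exceed the median power by threshold_db.
            rd = np.fft.fft(beat, axis=1)              # range FFT (fast time)
            rd = np.fft.fft(rd, axis=0)                # Doppler FFT (slow time)
            power = 20 * np.log10(np.abs(rd) + 1e-12)
            floor = np.median(power)
            peaks = []
            for d in range(1, power.shape[0] - 1):
                for r in range(1, power.shape[1] - 1):
                    p = power[d, r]
                    if p > floor + threshold_db and p == power[d-1:d+2, r-1:r+2].max():
                        peaks.append((r, d, p))
            return peaks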
  • The noise removal R checks against the detection history so far whether a detection is sudden, and a sudden detection is removed as noise.
  • The object tracking R checks, for each detected object, the relationship between the movement amount and the movement speed from the position of the immediately preceding detection result, and associates the object with the object in the immediately preceding detection result that is the same object.
  • When detection positions are corrected in time, motion must be predicted for each object, so it is necessary to have information on the speed and direction of each object, or position information from which the same object can be identified. To identify the same object, an identification ID is added to and managed for each object.
  • The feature parameters of each detected object are extracted by the common feature parameter detection R (S522).
  • The moving speed and movement pattern of the object, the intensity of the reflected wave, and their changes are extracted as feature parameters.
  • In the subsequent recognition processing, these parameters are referred to as necessary.
  • In the vehicle identification R (S524), when the characteristics of a vehicle are satisfied, taking into account the feature parameters and the previous identification information of the object from object tracking, the object is marked as a vehicle. In the bicycle identification R (S526), if the characteristics of a bicycle are satisfied, taking into account the feature parameters and the previous identification information of the object from object tracking, the object is marked as a bicycle. Similarly, in the sign identification R (S528), when the features of a sign are satisfied, the object is marked as a sign. Although the millimeter wave radar cannot identify the contents of a sign, the presence of a sign is recorded because it can be used later, in the fusion processing, to confirm the detection information of a sign detected by a camera or the like.
  • In the recognition processing result output R (S530), the identification ID, direction, distance, relative speed, and object type are output for each individual object as the object recognition result (sensor recognition result).
  • In the stereo camera, the outside world is sensed as an image.
  • The exposure time and exposure sensitivity are appropriately adjusted according to the external environment.
  • The sensing data is buffered by the sensor input S (S532), and the image correction S (S541) performs correction of image distortion caused by the camera mounting positions and optical system errors, correction of brightness and hue caused by sensor characteristic errors, and interpolation of pixel information for defective pixels present in the stereo camera output from the sensor output S (S531).
  • Thereafter, the parallax detection S (S542) is performed to obtain the parallax of each pixel between the left and right camera images, and the estimated road surface is then obtained by the road surface detection S (S543) using the parallax information.
  • After the road surface detection S (S543), portions that differ from the estimated road surface are detected as objects, as clusters of similar parallax, in the object detection S (S544).
  • For each object, the distance is calculated from the parallax, the average distance over the object region is obtained, and the distance to the object is calculated. The direction of the object relative to the direction of the host vehicle is also calculated from its position in the image. At this time, taking the size of the detected object into account, small objects that are not detected stably over time are excluded from the detection targets.
  • For each object detected in the object detection S (S544), the object tracking S (S545) checks the relationship between the movement amount and the movement speed from the position of the immediately preceding detection result and associates the object with the object in the immediately preceding detection result that is the same object. To identify the same object, an identification ID is added to and managed for each object. The relative speed is calculated by obtaining the change in distance of the same object and using the sensing time interval.
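  • As a small worked example of the distance and relative speed calculations above, assuming a standard pinhole stereo model (distance = focal length x baseline / disparity), which the text does not state explicitly:

        def disparity_to_distance(disparity_px, focal_px, baseline_m):
            # Distance from stereo parallax: Z = f * B / d.
            return focal_px * baseline_m / disparity_px

        def relative_speed(dist_prev_m, dist_curr_m, dt_s):
            # Relative speed of a tracked object from the change in distance
            # between two sensings separated by dt_s seconds (negative = closing).
            return (dist_curr_m - dist_prev_m) / dt_s

        # Example: f = 1000 px, baseline = 0.3 m, disparity 12 px -> 25 m.
        # If the same object is at 25.0 m now and was at 25.7 m 0.1 s earlier,
        # the relative speed is -7 m/s (approaching).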
  • The common feature parameter detection S calculates parameters commonly used in the recognition processing, such as the edge distribution, luminance, and actual size. In the subsequent recognition processing S (S550), these parameters are referred to as necessary.
  • In the vehicle identification S (S552), when the characteristics of a vehicle are satisfied, taking into account the feature parameters and the previous identification information of the object from object tracking, the object is marked as a vehicle.
  • In the bicycle identification S (S553), if the characteristics of a bicycle are satisfied, taking into account the feature parameters and the previous identification information of the object from object tracking, the object is marked as a bicycle.
  • If the pedestrian identification S (S554) finds that the characteristics of a pedestrian are satisfied, the object is marked as a pedestrian. Where possible, adults and children are also distinguished.
  • In the sign identification S (S555), when the features of a sign are satisfied, the object is marked as a sign and at the same time the contents of the sign are identified.
  • In the recognition processing result output S (S560), the identification ID, position, distance, relative speed, object type, and additional information (such as the contents of a sign) are output for each object as the object recognition result (sensor recognition result).
  • As described above, the sensor signal processing consists of a plurality of processes, which can be roughly divided into a preprocessing part (the preprocessing R (S510) in FIG. 14 and the preprocessing S (S540) in FIG. 15) and a recognition processing part necessary for object identification and content recognition (the recognition processing R (S520) in FIG. 14 and the recognition processing S (S550) in FIG. 15).
  • For the sensing A1 (S811) by the external sensor A101, the preprocessing A1 (S821) is performed, the recognition processing A1 (S831) is performed on the output of the preprocessing A1 (S821), and the recognition processing result is output.
  • Similarly, the preprocessing A2 (S822) is performed on the sensing A2 (S812) by the external sensor A101, the recognition processing A2 (S832) is performed on the output of the preprocessing A2 (S822), and the recognition processing result is output.
  • Processing from the sensing A3 (S813) onward proceeds in the same way. That is, sensing, preprocessing, and recognition processing are pipelined.
  • Likewise, the preprocessing B1 (S871) is performed on the sensing B1 (S861) by the external sensor B201, the recognition processing B1 (S881) is performed on the output of the preprocessing B1 (S871), and the recognition processing result is output.
  • The preprocessing B2 (S872) is performed on the sensing B2 (S862) by the external sensor B201, the recognition processing B2 (S882) is performed on the output of the preprocessing B2 (S872), and the recognition processing result is output.
  • The same applies from the sensing B3 (S863) onward. That is, as with the external sensor A101, sensing, preprocessing, and recognition processing are pipelined.
  • FIG. 16(a) shows the operation timing when the result of the recognition processing for the external sensor A101 is used in the recognition processing for the external sensor B201. Because of the processing timing, the recognition processing B2 (S882) uses the output of the position correction B1 (S875). That is, the position correction B1 (S875) is performed based on information up to the sensing A1 (S811), and its result is used in the recognition processing for the sensing B2 (S862).
  • FIG. 16(b) shows the operation timing when the result of the preprocessing for the external sensor A101 is used in the recognition processing for the external sensor B201. Because of the processing timing, the recognition processing B2 (S882) uses the output of the position correction B2 (S876). That is, the position correction B2 (S876) is performed based on information up to the sensing A2 (S812) (instead of information only up to the sensing A1 (S811)), and its result is used in the recognition processing for the sensing B2 (S862).
  • In the case of FIG. 16(b), the processing result (object detection result) based on more recent sensing data of the external sensor A101 than in the case of FIG. 16(a) can therefore be used in the recognition processing of the external sensor B201.
  • Since the position correction uses extrapolation, it is better to use the most recent sensing data in order to improve the correction accuracy. That is, it is better to use an intermediate result of the sensor signal processing of the external sensor A101 (the preprocessing result obtained partway through the recognition processing) for the recognition processing of the sensing data of the external sensor B201.
  • The recognition information may also be included in the sensor A processing information 108 (see FIG. 1) and used for the recognition processing of the sensing data of the external sensor B201.
  • This applies, for example, when the external sensor A101 is a millimeter wave radar.
  • The evaluation target portion 340 of the evaluation system 1001 includes the sensor A signal processing device 100, the sensor B signal processing device 200, and the fusion processing device 650 that constitute the signal processing system 1000.
  • The evaluation system 1001 is a system that evaluates how well the results sensed by the external sensor A101 and the external sensor B201 can be recognized as objects by the configuration of the evaluation target portion 340.
  • The evaluation system 1001 of this embodiment mainly includes a vehicle control system model 370, a road environment generation model 390, an information display device model 310, a pseudo user input generation model 320, and a pseudo sensing signal generation device 330.
  • The output of the signal processing system 1000 (the integrated recognition result 595 of the fusion processing device 650) is connected to the vehicle control system model 370, which simulates the movement of the vehicle, and the vehicle control system model 370 is connected to the road environment generation model 390, which simulates the environment around the vehicle.
  • Pseudo sensing signals of the sensing data are generated from the simulated environment and input to the signal processing devices 100 and 200 of the signal processing system 1000.
  • the vehicle control system model 370 is obtained by modeling the vehicle control system 700 shown in FIG. 7 for simulation.
  • The road environment generation model 390 is a simulation model that, based on the position and orientation of the host vehicle calculated by the vehicle control system model 370, obtains the surrounding environment of the host vehicle at a given simulation time (terrain, roads, objects on the road, objects around the road, and so on), including their positions, moving directions, speeds, lighting states, road surface states, and the like.
  • the road environment generation model 390 works closely with the vehicle control system model 370 to simulate the behavior of the host vehicle.
  • the pseudo user input generation model 320 is a simulation model that supplies a signal corresponding to the operation of the driver to the vehicle control system model 370.
  • The information display device model 310 is a simulation model that, based on the signals output by the vehicle control system model 370 in the evaluation environment, displays meters and warning lights on a screen in a simulated manner and records the temporal changes in the output of the vehicle control system model 370.
  • the pseudo sensing signal generation device 330 generates pseudo signals of the external sensor A101 and the external sensor B201 (pseudo sensing signals such as CG and radar reflected signals) based on the information generated by the road environment generation model 390.
  • the pseudo sensing signal generation device 330 gives a pseudo signal to the sensor A signal processing device 100 and the sensor B signal processing device 200 of the signal processing system 1000 at a simulation timing corresponding to an actual vehicle system, so that the evaluation target portion 340 is operated, and the recognition result is transmitted to the vehicle control system model 370, whereby the movement of the host vehicle including the evaluation target portion 340 can be simulated.
  • the evaluation system 1001 collects and analyzes the input / output and internal signals of the evaluation target portion 340 during the simulation execution to determine whether or not the evaluation target portion 340 has sufficiently recognized, and corresponds to the vehicle control system model 370. It is possible to evaluate whether or not a desired motion can be obtained as a vehicle behavior in the vehicle that performs the operation.
  • the sensing time may vary depending on the sensing location.
  • the exposure time may be slightly different in units such as pixels or lines.
  • When a sensing time is required for each individual object in the sensor A processing information 108, the pseudo sensing signal generation device 330 generates a pseudo signal as if the whole of one sensing had been performed simultaneously (at the same time), and the sensor A signal processing device 100 is modified for the simulation environment so that the sensing times of all objects (object p990, object q992, object r994, object s996) are aligned with the time ft at which that one sensing is regarded as having been performed, and are output accordingly.
  • When a sensing time is not required for each object, then if the sensing time is indicated for each sensing region as described with reference to FIG. 4, the sensing times of all sensing regions are aligned with ft, and if the sensing time is indicated using the time calculation formula 988 as described with reference to FIG. 5, the time information is likewise set so that the calculated sensing time becomes ft.
  • In the other example, the vehicle objects p990 and q992 are regarded as main objects, their sensing times are set to pt and qt respectively, and the vehicle pseudo sensing signals are generated at the positions corresponding to the sensing times pt and qt, while for the remaining objects such as the object r994, a representative sensing time ft corresponding to the one sensing is used as the sensing time and the pseudo sensing signal is generated at the position corresponding to the representative sensing time ft.
  • In this case, the pseudo sensing signal generation device 330 may send information on the sensing times of the main objects (for example, the object p990 and the object q992) to the recognition unit A150 and the cooperative data output unit A140 of the sensor A signal processing device 100, and the recognition unit A150 and the cooperative data output unit A140 may be modified so that, for those objects only, the sensing time is overwritten based on that information.
  • When the sensing time is indicated for each sensing region as described with reference to FIG. 4, or when the sensing time is indicated using the time calculation formula 988 as described with reference to FIG. 5, the sensing time of an individual object cannot be added in the cooperative data output unit A140. In that case, instead, information on the sensing times of the main objects (for example, the object p990 and the object q992) is sent to the position correction unit B280, and the position correction unit B280 is modified so that the sensing times are overwritten and managed based on that information.
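  • A minimal sketch of the simplified per-object sensing times used for pseudo sensing signal generation (all names are illustrative; only the idea of keeping individual times for the main objects and a single representative time for the rest comes from the text above):

        def pseudo_object_positions(object_times, frame_time, main_ids, trajectory):
            # object_times: dict object id -> individual sensing time within the frame
            # main_ids:     ids treated as main objects (keep their individual times)
            # trajectory:   function (object id, time) -> simulated object position
            placed = {}
            for oid, t_obj in object_times.items():
                t = t_obj if oid in main_ids else frame_time
                placed[oid] = (t, trajectory(oid, t))
            return placed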
  • In CMOS sensors used in cameras, the sensing time differs depending on the sensing position, for example the exposure timing differs from line to line; if CG (computer graphics) is used to generate pseudo sensing signals that faithfully reproduce such sensing, the computational load becomes high.
  • As described above, in the first embodiment the signal processing devices 100 and 200 that process the sensing data of the respective external sensors are time-synchronized, for example by using the same clock or by using a method such as that defined in IEEE 802.1AS, and each also manages the sensing timing of the external sensor connected to it.
  • The sensor A signal processing device 100 outputs to the sensor B signal processing device 200 the position information of the target object, as the recognition processing result or as information obtained during the recognition processing, together with the sensing time at which the processed sensing data was acquired.
  • The sensor B signal processing device 200, which processes the sensing data of the external sensor B201, estimates, from the object position information received from the sensor A signal processing device 100 and the history of its sensing times, the position of the object corresponding to the sensing time of the sensing data acquired by the external sensor B201, and uses it as a priority processing region in the recognition processing of that sensing data.
  • In this way, when the object position that is the recognition processing result of the external sensor A101 (first external sensor), or information obtained during its recognition processing, is used in the recognition processing of the external sensor B201 (second external sensor), the influence of the positional deviation caused by the difference in sensing times can be suppressed.
  • the second embodiment has a configuration in which the position correction unit is moved from the sensor B signal processing device 200 to the sensor A signal processing device 100 with respect to the configuration described in FIG. 2 of the first embodiment.
  • the flow of outputting the sensor B recognition result 205 from the external sensor B201 through the sensor input unit B210, the recognition unit BX252, and the result output unit B220 is the same as that in the first embodiment.
  • The position correction unit AX182 provided in the sensor A signal processing device 100 acquires the detection information of the objects obtained by the recognition processing of the recognition unit A150, and acquires from the sensor input unit A110 the sensing time of the sensing data used by the recognition unit A150 for that recognition processing. At this time, if the sensing time of the external sensor A101 differs depending on the position on the sensing data, the sensing time is managed for each detected object.
  • The recognition unit BX252 of the sensor B signal processing device 200, at a time earlier than the next recognition processing by at least the time required to acquire the position-corrected object position information from the sensor A signal processing device 100, requests the cooperative data input unit BX232 to acquire from the sensor A signal processing device 100 the position information of the objects obtained by sensing with the external sensor A101.
  • The request for position information includes information identifying at which timing the sensing data of the external sensor B201 to be used for the next recognition processing was acquired.
  • Upon receiving the request, the cooperative data input unit BX232 obtains the sensing time of the corresponding sensing data from the sensor input unit B210, and requests the cooperative data output unit AX142 to transmit the position information of the objects detected from the sensing data of the external sensor A101 corresponding to that sensing time. At this time, if the sensing time of the external sensor B201 differs depending on the position on the sensing data, the sensing time is passed as a relationship between position on the sensing data and sensing time.
  • Upon receiving the request, the cooperative data output unit AX142 requests the position correction unit AX182 to correct the positions of the objects detected by the recognition unit A150 to the positions at the time obtained from the sensing time information included in the request, and then to send the corrected positions back to the cooperative data output unit AX142.
  • When the position correction unit AX182 receives the request from the cooperative data output unit AX142, it uses the latest object detection information obtained from the recognition unit A150 and the preceding object detection information to calculate, by extrapolation, position information that matches the sensing time information in the request, that is, position information corresponding to the sensing time of the external sensor B201.
  • The calculation result of the position correction unit AX182 is transmitted (output) to the recognition unit BX252 via the cooperative data output unit AX142 and the cooperative data input unit BX232, and the recognition unit BX252 uses the calculation result in the next recognition processing.
  • In the third embodiment, the sensor A signal processing device 100 and the sensor B signal processing device 200 of the configuration described in FIG. 2 of the first embodiment are integrated into a sensor AB signal processing device 300.
  • the flow of outputting the sensor B recognition result 205 via the result output unit B220 is the same as that in the first embodiment.
  • With the integration into the sensor AB signal processing device 300, the input/output units that were needed to communicate the sensor A processing information 108 (the cooperative data output unit A140 and the cooperative data input unit B230 in FIG. 2) become unnecessary. Because the inputs and outputs of the position correction unit accordingly differ from those of the first embodiment, it is replaced by the position correction unit BZ384, and since the sensor input unit A110 and the recognition unit A150 of FIG. 2 are now connected to the position correction unit BZ384, they are replaced by the sensor input unit AZ114 and the recognition unit AZ154.
  • The information that the position correction unit BZ384 receives from the sensor input unit AZ114 and the recognition unit AZ154 is substantially the same as the information that, in the first embodiment, the position correction unit B280 receives from the sensor input unit A110 and the recognition unit A150 via the cooperative data output unit A140 and the cooperative data input unit B230.
  • The information exchanged by the position correction unit BZ384 with the sensor input unit B210 and the recognition unit B250 is substantially the same as the information exchanged with the position correction unit B280 in the first embodiment.
  • Similarly to the position correction unit B280 of the first embodiment, the position correction unit BZ384 of the sensor AB signal processing device 300 corrects the position information of each object detected by the recognition unit AZ154 using the sensing time information of the external sensor A101 and the external sensor B201 and transmits it to the recognition unit B250, and the recognition unit B250 uses this information when performing its recognition processing.
  • The device configuration of the sensor AB signal processing device 300 (and thus the system configuration of the signal processing system) can be simplified.
  • the fourth embodiment is configured to control the sensing timing of the external sensor A101 in accordance with the sensing timing of the external sensor B201 based on the configuration described in FIG. 2 of the first embodiment.
  • FIG. 22 shows an internal configuration of the signal processing devices 100 and 200 according to the fourth embodiment.
  • The flow in which the sensor A recognition result 105 is output from the external sensor A101 through the sensor input unit AY113, the recognition unit A150, and the result output unit A120 of the sensor A signal processing device 100 is the same as in the first embodiment. Likewise, the flow in which the position of an object detected by the recognition unit A150 is passed, via the cooperative data output unit AY143 and the cooperative data input unit BY233, to the position correction unit B280 of the sensor B signal processing device 200, corrected there based on the sensing time of the external sensor A101 and the sensing time of the external sensor B201, and then used by the recognition unit BY253, is the same.
  • The sensor input unit AY113 of the sensor A signal processing device 100 has a function of controlling the sensing operation timing of the external sensor A101 based on the sensing timing information that arrives from the sensor B signal processing device 200 via the cooperative data output unit AY143.
  • the recognition unit BY253 of the signal processing apparatus for sensor B 200 has a function of notifying the timing calculation unit BY293 of processing timing for starting processing of sensing data.
  • The cooperative data input unit BY233 and the cooperative data output unit AY143 correspond to the cooperative data input unit B230 and the cooperative data output unit A140 of FIG. 2, with an added function of conveying the sensing start request for the external sensor A101 output from the timing calculation unit BY293.
  • The timing calculation unit BY293 determines the timing of the sensing start request from the periodicity of the processing of the recognition unit BY253, the preset time required for one sensing pass of the external sensor A101, the preset time required for that sensing data to be preprocessed by the recognition unit A150, for the object positions to be calculated and sent to the position correction unit B280, and for the position correction result to reach the recognition unit BY253 (preparation time Tp899 (see FIG. 23)), and the preset delay from when the timing calculation unit BY293 outputs the request signal until it is conveyed through the cooperative data units.
  • FIG. 23 shows operation timings in the configuration shown in FIG.
  • By instructing the sensing timing of the external sensor A101 from the sensor B signal processing device 200 to the sensor A signal processing device 100, waiting time is suppressed in the processing flow of sensing A2 (S812), preprocessing A2 (S822), position correction B2 (S876), and recognition processing B2 (S882), and waiting time is likewise suppressed in the subsequent processing of the sensing data of the external sensor A101 through the cooperative operation of the sensor A signal processing device 100 and the sensor B signal processing device 200. Therefore, when the recognition processing for the external sensor B201 is performed, errors associated with the extrapolation performed in the sensing-time-based correction of the object positions detected by sensing with the external sensor A101 can be suppressed.
  • Although the fourth embodiment adjusts the sensing timing of the external sensor A101 based on the configuration of the first embodiment, the sensing of the external sensors and the processing timing can be adjusted in the same way starting from the configuration shown in the second embodiment or the third embodiment, so such configurations are also conceivable.
  • A configuration in which the sensor B signal processing device 200 is operated with reference to the sensing timing of the external sensor A101 is also conceivable; in this case, the sensing timing of the external sensor B201 needs to be adjusted to synchronize with the operation timing of the sensor A signal processing device 100.
  • the sensing time of the sensing data of one external sensor matches the processing timing of the sensing data of the other external sensor. Therefore, the positional deviation due to the difference in sensing time can be further suppressed, and the recognition accuracy of the external sensor can be further improved.
  • the present invention is not limited to the above-described embodiment, and includes various modifications.
  • the above-described embodiment has been described in detail for easy understanding of the present invention, and is not necessarily limited to one having all the configurations described.
  • a part of the configuration of an embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of an embodiment.
  • each of the above-described configurations, functions, processing units, processing means, and the like may be realized by hardware by designing a part or all of them with, for example, an integrated circuit.
  • Each of the above-described configurations, functions, and the like may be realized by software by interpreting and executing a program that realizes each function by the processor.
  • Information such as programs, tables, and files for realizing each function can be stored in a memory, a hard disk, a storage device such as an SSD (Solid State Drive), or a recording medium such as an IC card, an SD card, or a DVD.
  • control lines and information lines indicate what is considered necessary for the explanation, and not all the control lines and information lines on the product are necessarily shown. Actually, it may be considered that almost all the components are connected to each other.
  • DESCRIPTION OF SYMBOLS 10 ... time synchronization information, 100 ... signal processing device for sensor A, 101 ... external sensor A, 105 ... sensor A recognition result, 108 ... sensor A processing information, 110 ... sensor input unit A, 113 ... sensor input unit AY, 114 ... sensor input unit AZ, 120 ... result output unit A, 140 ... cooperative data output unit A, 142 ... cooperative data output unit AX, 143 ... cooperative data output unit AY, 150 ... recognition unit A, 154 ... recognition unit AZ, 182 ... position correction unit AX, 200 ... signal processing device for sensor B, 201 ... external sensor B, 205 ... sensor B recognition result, 210 ... sensor input unit B, 220 ... result output unit B, 230 ... cooperative data input unit B, 232 ... cooperative data input unit BX, 233 ... cooperative data input unit BY, 250 ... recognition unit B, 252 ... recognition unit BX, 253 ... recognition unit BY, 280 ... position correction unit B, 293 ... timing calculation unit BY, 310 ... information display device model, 320 ... pseudo user input generation model, 330 ... pseudo sensing signal generation device, 340 ... evaluation target part, 370 ... vehicle control system model, 384 ... position correction unit BZ, 390 ... road environment generation model, 595 ... integrated recognition result, 610 ... information display device, 620 ... user input device, 650 ... fusion processing device, 700 ... vehicle control system, 701 ... own vehicle position information, 705 ... vehicle integrated control device, 710 ... engine control device, 712 ... engine control system actuator, 714 ... engine control system sensor, 720 ... transmission control device, 722 ... transmission control system actuator, 724 ... transmission control system sensor, 730 ... brake control device, 732 ... brake control system actuator, 734 ... brake control system sensor, 740 ... steering control device, 742 ... steering control system actuator, 744 ... steering control system sensor, 790 ... common bus, 899 ... preparation time Tp, 901 ... one-eye image of stereo camera, 902 ... estimated area with high possibility of road (assuming a straight road with constant width and constant gradient), 903 ...

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

Provided are: a signal processing system whereby an effect of positional deviation due to a difference in sensing time can be suppressed when a result of recognition processing of sensing data of one external sensor or object position information which is in the midst of recognition processing is utilized in recognition processing of sensing data of another external sensor during recognition processing using sensing data of a plurality of external sensors; an evaluation system for the signal processing system; and a signal processing device used in the signal processing system. A signal processing device 100 for a sensor A manages an object detection position and a sensing time of sensing data to be processed, the object detection position is corrected to be the position at the sensing time of sensing data to be processed by a signal processing device 200 for a sensor B, and the signal processing device 200 for the sensor B uses the corrected position when performing recognition processing.

Description

SIGNAL PROCESSING SYSTEM, ITS EVALUATION SYSTEM, AND SIGNAL PROCESSING DEVICE USED FOR THE SIGNAL PROCESSING SYSTEM
 The present invention relates to a signal processing system composed of signal processing devices for external sensors having a cooperative operation function, to an evaluation system for that signal processing system, and to a signal processing device used in the signal processing system. In particular, it relates to a signal processing device that detects or recognizes objects present around a vehicle by processing signals obtained from in-vehicle external sensors which sense the surroundings of the vehicle and which form part of an advanced safety system or automated driving system, to a signal processing system composed of such signal processing devices, and to an evaluation system for evaluating such signal processing devices and signal processing systems.
 In recent years, vehicles equipped with advanced safety systems that automatically apply the brakes or take other action in an emergency have become widespread in order to avoid collisions or reduce damage when a collision occurs. Automated vehicles capable of moving autonomously have also been realized at the experimental level or under limited conditions. To recognize the situation around the vehicle, advanced safety systems and automated vehicles are usually equipped with external sensors such as cameras, radar, laser range finders, and sonar, together with a signal processing device that processes the sensing data captured by those sensors and detects and recognizes surrounding objects and conditions. However, each external sensor has strengths and weaknesses in detection depending on the environment and the object, and missed detections, false detections, and misrecognitions can occur; therefore, to increase the reliability of recognition, multiple external sensors are sometimes linked and their results processed in an integrated manner.
 As a method for integrated processing of multiple external sensors, Patent Document 1 describes a method of recognizing the driving environment by integrating recognition results obtained by radar with recognition results obtained from images. Patent Document 2 describes a method of calculating markers from the capture area of a radar device, calculating edges from an image acquired by a camera device, and calculating component regions from these calculation results, in order to effectively superimpose the sensing function of the radar device on the sensing function of the camera device and improve object detection accuracy.
JP 2003-168197 A (Patent Document 1); JP 2016-153775 A (Patent Document 2)
 External sensors often differ in sensing timing and sensing cycle depending on their type and format. For some external sensors, the sensing time also differs depending on the position of the sensed object within the sensing range. In addition, the processing time from processing the signal obtained from an external sensor until a detection or recognition result is produced may differ from sensor to sensor or with the external conditions. When sensing times differ, and the external sensor moves with the vehicle, as with an in-vehicle device, or the external object itself moves, the same object is sensed at a different relative position by each external sensor. Therefore, when the recognition result for the sensing data of one external sensor, or object position information obtained during that recognition processing, is used in the recognition processing of the sensing data of another external sensor, for example to apply intensive processing at that position to improve recognition accuracy, a difference in the sensed positions of the same object shifts the place where that processing is applied, and the effect of improving recognition accuracy may be reduced. If, to allow for this positional deviation, the application range of the processing is enlarged with a margin, the processing load required for the recognition processing increases.
 The present invention has been made in view of the above circumstances, and its object is to provide a signal processing system capable of suppressing the influence of positional deviation caused by differences in sensing time when, in recognition processing using the sensing data of a plurality of external sensors, the recognition result for the sensing data of one external sensor, or object position information obtained during that recognition processing, is used in the recognition processing of the sensing data of another external sensor; an evaluation system for that signal processing system; and a signal processing device used in that signal processing system.
 To solve the above problem, a signal processing system according to the present invention is composed of a plurality of signal processing devices for external sensors, each connected to an external sensor and each having a function of recognizing external objects from the information sensed by its external sensor. The sensing data of a first external sensor is subjected to recognition processing in a first signal processing device, the sensing data of a second external sensor is subjected to recognition processing in a second signal processing device, and the second signal processing device refers to the output of the first signal processing device when performing its recognition processing. The first signal processing device manages the sensing time of the sensing data it processes and the object detection positions, the object detection positions are corrected to the positions at the sensing time of the sensing data processed by the second signal processing device, and the second signal processing device uses the corrected positions when performing its recognition processing.
 According to the present invention, when recognition processing is performed using the sensing data of a plurality of external sensors and the recognition result of the first external sensor, or object position information obtained during its recognition processing, is used in the recognition processing of the second external sensor, the displacement of the positions at which specific processing such as intensive processing is applied to the sensing data of the second external sensor can be suppressed. The improved positional accuracy leads to improved recognition accuracy for the second external sensor. In addition, the margin of the region in which the specific processing is performed can be reduced, which reduces the processing load of the recognition processing.
 Problems, configurations, and effects other than those described above will be clarified by the following description of the embodiments.
FIG. 1: Configuration example, in the first embodiment, of a signal processing system that processes the sensing result of the external sensor B using the sensing result of the external sensor A.
FIG. 2: Example of the internal configuration of the signal processing devices in the signal processing system of the first embodiment.
FIG. 3: Illustration of how position information obtained from lidar sensing is used in the camera-side recognition processing when the external sensors are a lidar and a camera.
FIG. 4: Diagram explaining an example in which a sensing time is attached to each sensing region.
FIG. 5: Diagram explaining an example in which a time calculation formula is used for the sensing time.
FIG. 6: Diagram explaining an example in which a time is attached to each detected object.
FIG. 7: Configuration example in which the signal processing system of the first embodiment is incorporated into a vehicle system.
FIG. 8: Example of the internal configuration of the vehicle control system shown in FIG. 7.
FIG. 9: Illustration of how position information obtained from millimeter-wave radar sensing is used in the camera's recognition processing when the external sensors are a millimeter-wave radar and a camera.
FIG. 10: Diagram explaining an example of estimating a corrected position from the position detection result of the millimeter-wave radar.
FIG. 11: Illustration of the millimeter-wave radar sensing area projected onto the camera sensing result.
FIG. 12: (a), (b) Diagrams showing rough road-surface regions in road-surface estimation.
FIG. 13: Example of the relationship between a single-eye image of a stereo camera and a V-Disparity image.
FIG. 14: Flowchart explaining an example of signal processing for millimeter-wave radar sensing.
FIG. 15: Flowchart explaining an example of signal processing for camera sensing.
FIG. 16: Timing charts explaining examples of signal processing operation timing in the first embodiment: (a) when the recognition processing result for the external sensor A is used in the recognition processing for the external sensor B; (b) when the preprocessing result for the external sensor A is used in the recognition processing for the external sensor B.
FIG. 17: Configuration example of an evaluation system that performs simulation evaluation of the sensor system of the first embodiment.
FIG. 18: Diagram explaining an example of simplifying the time information when performing simulation evaluation of the sensor system of the first embodiment (treating one entire sensing pass as sensed simultaneously).
FIG. 19: Diagram explaining another example of simplifying the time information (considering differences in sensing time within one sensing pass only for the main objects).
FIG. 20: Example of the internal configuration of the signal processing devices in the signal processing system of the second embodiment.
FIG. 21: Example of the internal configuration of the signal processing device of the third embodiment.
FIG. 22: Example of the internal configuration of the signal processing devices in the signal processing system of the fourth embodiment.
FIG. 23: Timing chart explaining an example of signal processing operation timing in the fourth embodiment.
 Hereinafter, embodiments of the present invention will be described with reference to the drawings.
[First Embodiment]
 The overall configuration of the first embodiment of the present invention will be described with reference to FIG. 1.
 The signal processing system 1000 of this embodiment includes an external sensor A101, an external sensor B201, a sensor A signal processing device 100, a sensor B signal processing device 200, and a fusion processing device 650.
 The external sensor A101 is a first external sensor that senses the external environment around the vehicle, and the external sensor B201 is a second external sensor that senses the external environment around the vehicle. Specific examples of sensors that sense the vehicle's surroundings include millimeter-wave radar, monocular cameras, stereo cameras, and lidar (LiDAR: Light Detection and Ranging). The external sensor A101 and the external sensor B201 are generally sensors of different types or formats, but they may be of the same type or format. For example, even if both are cameras, an infrared camera and an ordinary camera could be used.
 The sensor A signal processing device 100 and the sensor B signal processing device 200 are connected to the external sensor A101 and the external sensor B201, respectively, and perform signal processing on the sensing data output of the external sensor A101 and the sensing data output of the external sensor B201. When mounted on a vehicle, each is implemented as an ECU (Electronic Control Unit). The signal processing function may also be implemented as a partial function of an ECU.
 These signal processing devices 100 and 200 perform signal processing on the sensing data, detect objects present around the vehicle, identify the objects as necessary, and output the final recognition results. The detection results include not only position information (object detection positions) and bearing, but also distance information calculated from, for example, the signal reflection delay time when millimeter waves or lasers are used, or from the temporal movement of objects, image features, and parallax (in the case of a stereo camera) when cameras are used.
 The sensor A signal processing device 100 and the sensor B signal processing device 200 exchange time synchronization information 10 and operate in time synchronization. One method of synchronized operation is the method specified as IEEE 802.11AS. More simply, the same clock signal may be supplied to both the sensor A signal processing device 100 and the sensor B signal processing device 200.
 The sensor A signal processing device 100 outputs the sensor A processing information 108 to the sensor B signal processing device 200. The sensor A processing information 108 includes the position information (object detection positions) of objects present around the host vehicle, detected by signal processing of the sensing data output of the external sensor A101, together with the time at which the sensing was performed.
 When processing the sensing data of the external sensor B201, the sensor B signal processing device 200 corrects the position information of objects around the host vehicle included in the sensor A processing information 108 so that it matches the sensing time of the external sensor B201, and uses it in the recognition processing of the sensing data of the external sensor B201 (described in detail later). When correcting the position information, the host vehicle position information 701 supplied from outside is referred to in order to take into account temporal changes in the position and orientation of the host vehicle.
 The object position information included in the sensor A processing information 108 may be the final processing result of the sensor A signal processing device 100, but it is better to use information obtained after an object's presence has been detected and before the object type determination processing. Using the information at the point when the sensor B signal processing device 200 has what it needs for its recognition processing keeps the delay from the sensing timing of the external sensor A101 to the output of the sensor A processing information 108 to the sensor B signal processing device 200 small (this is referred to as low delay), which suppresses errors in the position correction.
 The sensor A signal processing device 100 outputs its recognition result to the fusion processing device 650 as the sensor A recognition result 105, and the sensor B signal processing device 200 outputs its recognition result to the fusion processing device 650 as the sensor B recognition result 205. The sensor A recognition result 105 and the sensor B recognition result 205 are integrated by the fusion processing device 650 and output as the integrated recognition result 595. The fusion processing by the fusion processing device 650 is performed using a known technique such as the method described in Patent Document 1.
 The internal configuration of the sensor A signal processing device 100 and the sensor B signal processing device 200 in the signal processing system 1000 will be described with reference to FIG. 2. In the following description, it is assumed that the sensor A signal processing device 100 and the sensor B signal processing device 200 operate synchronously.
 In the sensor A signal processing device 100, the sensing data output of the external sensor A101 is sent via the sensor input unit A110 to the recognition unit A150, which performs recognition processing, and the recognition result of the recognition unit A150 is output as the sensor A recognition result 105 via the result output unit A120.
 The sensor input unit A110 outputs the parameters and timing signals required to control the operation of the external sensor A101 to the external sensor A101, receives the sensing data output of the external sensor A101, and holds the received sensing data until the recognition unit A150 processes it. It also manages the sensing time of the external sensor A101 in association with the sensing data.
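 The following is a minimal sketch of how a sensor input unit might hold each sensing pass together with its sensing time until the recognition unit consumes it. The class and field names are illustrative assumptions for explanation, not the patent's implementation.

```python
from collections import deque
from dataclasses import dataclass
from typing import Any, Optional


@dataclass
class TimedFrame:
    sensing_time: float   # seconds, in the shared or synchronized time base
    data: Any             # raw sensing data (point cloud, image, ...)


class SensorInputUnit:
    """Buffers sensing data together with its sensing time (illustrative only)."""

    def __init__(self, max_frames: int = 8):
        self._frames = deque(maxlen=max_frames)

    def on_sensor_output(self, sensing_time: float, data: Any) -> None:
        # Called each time the external sensor delivers one sensing pass.
        self._frames.append(TimedFrame(sensing_time, data))

    def latest_frame(self) -> Optional[TimedFrame]:
        # The recognition unit fetches the newest frame together with its time.
        return self._frames[-1] if self._frames else None

    def sensing_time_of_latest(self) -> Optional[float]:
        frame = self.latest_frame()
        return frame.sensing_time if frame else None
```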
 The object detection information obtained by the recognition processing of the recognition unit A150 is sent to the cooperative data output unit A140. Even when a plurality of objects are detected, this detection information includes the identification information needed to distinguish which object is the same object across successive sensing passes. The cooperative data output unit A140 acquires from the sensor input unit A110 the sensing time of the sensing data used by the recognition unit A150 for the recognition processing, and outputs the object detection information together with the sensing time information to the sensor B signal processing device 200 as the sensor A processing information 108.
 In the sensor B signal processing device 200, the sensor A processing information 108 is received by the cooperative data input unit B230 and sent to the position correction unit B280.
 In the sensor B signal processing device 200, the sensing data output of the external sensor B201 is sent via the sensor input unit B210 to the recognition unit B250, which performs recognition processing, and the recognition result of the recognition unit B250 is output as the sensor B recognition result 205 via the result output unit B220.
 The sensor input unit B210 outputs the parameters and timing signals required to control the operation of the external sensor B201 to the external sensor B201, receives the sensing data output of the external sensor B201, and holds the received sensing data until the recognition unit B250 processes it. It also manages the sensing time of the external sensor B201 in association with the sensing data.
 Upon receiving the sensor A processing information 108, the position correction unit B280 calculates a position correction result while taking into account the time required for the position correction processing (described in detail later). It receives from the recognition unit B250 which sensing data the recognition unit B250 will process next using that position correction result, and receives the sensing time of the corresponding data from the sensor input unit B210. The position correction unit B280 calculates the difference between the sensing time included in the sensor A processing information 108 and the sensing time of the sensing data of the external sensor B201, that is, the elapsed time from the sensing time of the external sensor A101 included in the sensor A processing information 108 to the sensing time of the sensing data of the external sensor B201; calculates the amount of movement and the change in orientation of the vehicle using the host vehicle position information 701; and, further using the history of the object positions (object detection positions) obtained from the sensor A signal processing device 100 as the sensor A processing information 108 and their sensing times (or the history of the object positions and the elapsed time between history entries), corrects the position information included in the sensor A processing information 108 into position information corresponding to the sensing time at which the external sensor B201 performed its sensing.
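 As a simplified sketch of this kind of computation, the code below extrapolates an object position from two timestamped sensor A detections to the sensor B sensing time and re-expresses it relative to the host vehicle at that time. It assumes constant-velocity extrapolation, 2D positions, and a planar ego pose given as (x, y, yaw); the function names and pose representation are illustrative assumptions, not taken from the patent.

```python
import math


def to_world(ego_pose, p_rel):
    """Convert a position relative to the ego vehicle into a ground-fixed frame.
    ego_pose = (x, y, yaw); p_rel = (x_forward, y_left) in the vehicle frame."""
    ex, ey, yaw = ego_pose
    c, s = math.cos(yaw), math.sin(yaw)
    return (ex + c * p_rel[0] - s * p_rel[1],
            ey + s * p_rel[0] + c * p_rel[1])


def to_ego(ego_pose, p_world):
    """Inverse of to_world: express a ground-fixed position in the vehicle frame."""
    ex, ey, yaw = ego_pose
    dx, dy = p_world[0] - ex, p_world[1] - ey
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * dx + s * dy, -s * dx + c * dy)


def correct_position(det_prev, det_last, t_b, ego_pose_at):
    """det_prev, det_last: (t_a, position relative to ego at t_a) from sensor A,
    taken at two distinct sensing times.
    t_b: sensing time of the sensor B data about to be processed.
    ego_pose_at(t): ego pose (x, y, yaw) at time t, e.g. from vehicle position info."""
    (t0, p0_rel), (t1, p1_rel) = det_prev, det_last
    p0 = to_world(ego_pose_at(t0), p0_rel)
    p1 = to_world(ego_pose_at(t1), p1_rel)
    # Constant-velocity extrapolation of the object to the sensor B sensing time.
    alpha = (t_b - t1) / (t1 - t0)
    p_b = (p1[0] + alpha * (p1[0] - p0[0]),
           p1[1] + alpha * (p1[1] - p0[1]))
    # Re-express the extrapolated position relative to the ego vehicle at t_b,
    # which accounts for the vehicle's own movement and change in orientation.
    return to_ego(ego_pose_at(t_b), p_b)
```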
 In this embodiment, in order to obtain the difference between the sensing times of the external sensor A101 and the external sensor B201, the signal processing devices 100 and 200 that process the respective sensing data are time-synchronized. However, as long as a method is used in which the time difference does not drift significantly over the range used for the position correction processing, time synchronization need not be performed. For example, even if the signal processing devices 100 and 200 operate on different crystal oscillators and no time synchronization is performed, each device can receive a common count value of about four bits once per second and use, as its time information, the combination of the most recently received count value and the time that has elapsed within that device since the count value was received. Since the difference in the number of pulses output by different crystal oscillators in one second is normally small, sufficient time accuracy for obtaining the sensing time difference can be achieved.
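 As a rough illustration of this simplified scheme (a shared count value delivered once per second, wrapping at 4 bits, combined with each device's locally measured elapsed time), the sketch below reconstructs comparable timestamps on two unsynchronized devices. All names are hypothetical.

```python
class CoarseTimeBase:
    """Combines a shared 4-bit once-per-second count with a local elapsed-time clock."""

    WRAP = 16  # a 4-bit counter wraps every 16 seconds

    def __init__(self):
        self.last_count = None       # last shared count value received
        self.local_at_count = None   # local clock reading when it was received

    def on_shared_count(self, count, local_now):
        self.last_count = count
        self.local_at_count = local_now

    def timestamp(self, local_now):
        """Time expressed as (shared count, seconds elapsed since that count)."""
        return (self.last_count, local_now - self.local_at_count)


def time_difference(ts_a, ts_b):
    """Difference ts_b - ts_a in seconds, assuming it is much smaller than the
    16 s wrap period of the shared counter."""
    (count_a, elapsed_a), (count_b, elapsed_b) = ts_a, ts_b
    dc = (count_b - count_a) % CoarseTimeBase.WRAP
    if dc > CoarseTimeBase.WRAP // 2:      # interpret large forward gaps as negative
        dc -= CoarseTimeBase.WRAP
    return dc + (elapsed_b - elapsed_a)
```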
 The corrected position information is sent to the recognition unit B250 and used when the next sensing data is processed for recognition. For example, for the region indicated by the position information, a noise removal process that performs well even though it is computationally heavy may be used, an object identification algorithm (recognition processing algorithm) with a high recognition rate may be applied, or, if no object is detected by the recognition processing of the recognition unit B250, the strength of the noise filter may be weakened (that is, a recognition processing parameter may be changed) and the recognition processing performed again. By selecting regions and applying the high-load processing only there, the recognition unit B250 can perform recognition processing efficiently, and by performing re-recognition processing on portions where no object could be detected, it can also suppress cases of missed detection (in other words, suppress an increase in the missed-detection rate).
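 A schematic sketch of this usage is shown below: a heavier, more accurate detector is applied only inside the indicated regions, and detection is retried with a weaker noise filter when nothing is found there. The detector and filter functions are placeholders for whatever algorithms the recognition unit actually uses.

```python
def recognize_with_priority_regions(frame, priority_regions,
                                    fast_detect, accurate_detect,
                                    relax_noise_filter):
    """frame: sensing data of the external sensor B.
    priority_regions: regions derived from the corrected sensor A positions.
    fast_detect / accurate_detect: detection routines (placeholders).
    relax_noise_filter: returns the region's data with weaker noise filtering."""
    detections = fast_detect(frame)  # baseline pass over the whole frame

    for region in priority_regions:
        # Spend the heavier, higher-recognition-rate processing only where
        # sensor A already reported an object.
        found = accurate_detect(frame, region)
        if not found:
            # Nothing detected: retry with a weaker noise filter to reduce
            # missed detections in the indicated region.
            found = accurate_detect(relax_noise_filter(frame, region), region)
        detections.extend(found)

    return detections
```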
 Assuming that the external sensor A101 is a lidar and the external sensor B201 is a camera, the use of the position information obtained from the sensor A processing information 108 in the recognition processing of the sensor B signal processing device 200 will be described with reference to FIG. 3.
 In FIG. 3, the vehicle that was at position 922 in sensing La920 of the lidar (external sensor A101) has moved to position 932 in sensing Lb930. The vehicle that was at position 942 in sensing Ca940 of the camera (external sensor B201) has moved to position 953 in sensing Cb950. In FIG. 3, the vehicle sensed by the lidar and the vehicle sensed by the camera are the same vehicle. Position 952 in sensing Cb950 is position 932 in the lidar's sensing Lb930 adjusted to, and projected onto, the corresponding position in the camera's sensing data. Because the lidar and the camera are mounted at different positions on the vehicle, an object at the same position in space is sensed at shifted positions, so this positional adjustment is necessary.
 Position 952 is position 932 from the lidar's sensing Lb930 adjusted to the position in the camera's sensing Cb950; however, because sensing Lb930 and sensing Cb950 occur at different times, it deviates from the vehicle's position 953 in sensing Cb950. The position correction unit B280 (see FIG. 2) therefore uses the sensor A processing information 108 for sensing La920 and the sensor A processing information 108 for sensing Lb930 to estimate the position of the vehicle at the sensing time of sensing Cb950, and outputs it as the position to be used in the recognition processing of sensing Cb950 by the recognition unit B250.
 To estimate the position in sensing Cb950 from the results of sensing La920 and sensing Lb930, the position correction unit B280 first obtains the spatial position of the detection target vehicle, referenced to the position and orientation of the host vehicle at the sensing time of sensing Lb930, using the bearing and distance of the object from the host vehicle given by the lidar's object detection results, the mounting position of the lidar on the host vehicle, the sensing time information of sensing La920 and sensing Lb930, and the position and orientation of the host vehicle at each of those sensing times. Next, the position correction unit B280 extrapolates the spatial position corresponding to the sensing time of sensing Cb950. Then, taking into account the movement and change in orientation of the host vehicle from the sensing time of sensing Lb930 to the sensing time of sensing Cb950, it obtains the spatial position of the detection target vehicle relative to the host vehicle at the sensing time of sensing Cb950. Finally, taking into account the mounting position of the camera on the host vehicle, the position correction unit B280 obtains the position of the target vehicle in sensing Cb950 (its two-dimensional position in the camera's sensing result (image)).
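 To make the last step concrete, the following is a sketch, under the usual pinhole-camera assumptions, of projecting the extrapolated object position (already expressed relative to the host vehicle at the camera's sensing time) into image coordinates after shifting it by the camera's mounting position. The camera parameters and the coordinate convention (x forward, y left, z up for the vehicle; no mounting rotation for the camera) are assumptions for illustration.

```python
def project_to_image(p_vehicle, cam_mount, fx, fy, cx, cy):
    """p_vehicle: object position (x_forward, y_left, z_up) relative to the host
    vehicle at the camera sensing time.
    cam_mount: camera mounting position (x_forward, y_left, z_up) on the vehicle,
    assumed here to be aligned with the vehicle axes (no mounting rotation).
    fx, fy, cx, cy: pinhole camera intrinsics."""
    # Position relative to the camera, still in vehicle axes.
    x = p_vehicle[0] - cam_mount[0]
    y = p_vehicle[1] - cam_mount[1]
    z = p_vehicle[2] - cam_mount[2]
    # Vehicle axes -> camera optical axes (z_cam forward, x_cam right, y_cam down).
    z_cam, x_cam, y_cam = x, -y, -z
    if z_cam <= 0:
        return None  # object is behind the camera
    u = fx * x_cam / z_cam + cx
    v = fy * y_cam / z_cam + cy
    return (u, v)
```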
 Here, the position at the sensing time of sensing Cb950 was estimated using the two object detection results of sensing La920 and sensing Lb930, but it may also be estimated using a larger number of lidar sensing results.
 When the sensor A processing information 108 is output from the sensor A signal processing device 100, the sensing time information of the sensing data subjected to detection processing is attached to the object detection results. In the simplest case, the sensing start time, or a time midway between the start and end of the sensing, is attached once per sensing pass. However, some external sensors require a considerable time for one sensing pass, and the time accuracy may then be insufficient for time-based position correction. Methods of attaching time information that address this are described with reference to FIGS. 4 to 6.
 The method shown in FIG. 4 divides the sensing data of the external sensor A101 into a plurality of sensing regions, and attaches and manages a sensing time for each sensing region. For example, when horizontal scanning advances in the vertical direction during sensing, the sensing data is divided into a plurality of sensing regions from sensing region A981 to sensing region D984 as shown in FIG. 4, and a representative sensing time is attached to each sensing region. The representative sensing time is, for example, the sensing start time of that region or a time midway between the start and end of its sensing. When the position correction unit B280 uses a sensing time, it determines, from the position on the sensing data of the object subject to position correction (the object p990, the object q992) (the center of the object's region), the sensing region in which the object lies, and uses the sensing time corresponding to that region. The number of divisions and the division direction are determined in consideration of the scanning order of the external sensor during sensing, the time required for one sensing pass, and the accuracy required for position correction.
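 A small sketch of this region-based lookup: the frame is split into horizontal bands (sensing regions A to D in FIG. 4), each band carries one representative sensing time, and the time used for an object is that of the band containing the object's center. The equally sized, top-to-bottom band layout is an assumption for illustration.

```python
def region_sensing_time(object_center_v, frame_height, region_times):
    """object_center_v: vertical pixel position of the object's center.
    frame_height: number of lines in one sensing pass.
    region_times: representative sensing times of the regions, ordered from the
    first-scanned region to the last (e.g. regions A..D in FIG. 4)."""
    n = len(region_times)
    index = min(int(object_center_v * n / frame_height), n - 1)
    return region_times[index]


# Example: four regions scanned top to bottom, 10 ms apart.
times = [0.000, 0.010, 0.020, 0.030]
t_obj = region_sensing_time(object_center_v=300, frame_height=480, region_times=times)
# 300/480 falls in the third region, so t_obj == 0.020
```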
 The method shown in FIG. 5 defines a time calculation formula 988 that calculates the sensing time from the horizontal and vertical position on the sensing data of the external sensor A101, and manages the sensing time in terms of the coefficients Tcv and Tch and the offset time Ts needed for the time calculation. When the position correction unit B280 uses a sensing time, it applies the time calculation formula 988 to the horizontal and vertical position, on the sensing data, of the object subject to position correction (the object p990, the object q992), and uses the result as the sensing time of that object.
 This time calculation formula 988 is one example; the formula may be transformed to make it easier to compute, or a different formula may be used to match the sensing of the external sensor. A table defining output values for input values may also be used in place of the time calculation formula 988. Parameters needed for the time calculation, such as the coefficients Tcv and Tch and the offset time Ts, need not be sent for every sensing pass if they do not change; they may be sent only when they change, or once every fixed number of sensing passes.
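 The exact form of the time calculation formula 988 is defined in FIG. 5 and is not reproduced here. Purely as an illustration, the sketch below assumes a simple raster-scan model in which the sensing time grows linearly with the vertical and horizontal position, using the coefficients Tcv and Tch and the offset time Ts mentioned in the text.

```python
def sensing_time_from_position(h, v, Ts, Tcv, Tch):
    """Assumed linear raster-scan model (an illustration, not formula 988 itself):
    the sensing time is the frame start time Ts, plus Tcv per scanned line,
    plus Tch per pixel within the line."""
    return Ts + Tcv * v + Tch * h


# Example: 30 us per line, 0.05 us per pixel, frame started at Ts = 1.200 s.
t = sensing_time_from_position(h=640, v=200, Ts=1.200, Tcv=30e-6, Tch=0.05e-6)
```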
 The method shown in FIG. 6 attaches and manages sensing time information for each object detected using the sensing data of the external sensor A101. A representative sensing time corresponding to the region of each object (the object p990, the object q992) is attached, so that when the position correction unit B280 uses a sensing time, it refers to the representative sensing time attached to the object subject to position correction (the object p990, the object q992).
 When the position correction unit B280 performs position correction based on the time information, if the sensing time of the external sensor B201 differs depending on the position on the sensing data, the position correction unit B280 manages the relationship between the position on the sensing data and the sensing time so that it can obtain the sensing time of that sensing data from the sensor input unit B210; it then obtains the sensing time of the external sensor B201 from the position of the detected object obtained from the sensor A signal processing device 100, and calculates the difference from the sensing time included in the sensor A processing information 108.
 An example of a configuration in which the signal processing system 1000 of this embodiment is incorporated into a vehicle system will be described with reference to FIGS. 7 and 8.
 In the vehicle system of the illustrated embodiment, the external sensor A101 is directly connected to the sensor A signal processing device 100, and the external sensor B201 is directly connected to the sensor B signal processing device 200. The sensor A signal processing device 100 and the sensor B signal processing device 200 are connected to a common bus 790 (composed of a CAN bus, in-vehicle Ethernet, or the like) serving as the in-vehicle network. Connecting the external sensor A101 directly to the sensor A signal processing device 100 and the external sensor B201 directly to the sensor B signal processing device 200 prevents signals with a large data volume, such as point clouds and images, from flowing over the common bus 790.
 In addition to the fusion processing device 650, which outputs the integrated recognition result 595 (see FIG. 1) obtained by integrating the sensor A recognition result 105 of the signal processing device 100 for sensor A and the sensor B recognition result 205 of the signal processing device 200 for sensor B, the common bus 790 is connected to an information display device 610 (composed of indicators such as meters) that informs the driver of the state of the host vehicle, a user input device 620 (composed of the steering wheel, accelerator, brake, and other switches) that receives operations from the driver, and a vehicle control system 700 that performs control related to the mechanical parts of the host vehicle. The common bus 790 may be divided into a plurality of physical buses to secure bus bandwidth, safety, and security; in such a case, it is necessary to use devices that exchange information between the physical buses as appropriate, or to configure each device connected to the common bus 790 so that it can connect to a plurality of physical buses as needed.
 The host vehicle position information 701 (see FIG. 1) input to the signal processing device 200 for sensor B is supplied, for example, from the vehicle control system 700 via the common bus 790. Since the vehicle control system 700 performs control related to the mechanical parts of the host vehicle, it acquires the speed, acceleration, and other quantities of the host vehicle with sensors, and supplies information based on the sensing data of those sensors as the host vehicle position information 701.
 The internal configuration of the vehicle control system 700 is shown in FIG. 8. The common bus 790 is connected, via a vehicle integrated control device 705 that performs integrated control of the mechanical parts of the vehicle, to an engine control device 710 that controls the power source, a transmission control device 720 that controls the power transmission, a brake control device 730 that controls the brakes, and a steering control device 740 that controls the steering. Each control device (710, 720, 730, 740) is connected to an actuator (712, 722, 732, 742) that controls the movement of the associated mechanical part and to a sensor (714, 724, 734, 744) that monitors its state, and controls the part so that it operates properly.
 The configuration of the vehicle control system 700 shown in FIG. 8 is one example and may be changed as appropriate according to the configuration of the host vehicle; for example, several control devices may be integrated, the control devices may be interconnected via the common bus 790, or, in the case of an electric vehicle, the engine control device 710 may be replaced by a motor control device.
 The integrated recognition result 595 output by the fusion processing device 650 is conveyed to the vehicle integrated control device 705. The vehicle integrated control device 705 issues, for example, the instructions required for an emergency braking operation when the risk of collision is high, and, when cruise control following a preceding vehicle is active, issues the instructions required for speed control based on the set speed, the distance to the preceding vehicle, the relative speed, and so on.
 When the signal processing devices 100 and 200 and the fusion processing device 650 can also identify lanes and the like, the vehicle integrated control device 705 keeps the host vehicle within its lane by issuing steering operation instructions to the steering control device 740 and the like based on the recognition results.
 To support higher levels of automated driving, an automated driving control device may be added to the common bus 790. In this case, the automated driving control device determines the behavior the vehicle should take from the output of the fusion processing device 650 and the state of the host vehicle supplied from the vehicle control system 700, and issues instructions to the vehicle control system 700.
 When the vehicle is controlled automatically even in part, as with cruise control or steering control, the vehicle system as a whole must be implemented so that safety has been thoroughly examined and can be ensured even in a worst-case situation.
 With reference to FIG. 9, an illustration of how the recognition processing of the signal processing device 200 for sensor B uses the position information obtained from the sensor A processing information 108 will be described, assuming that the external sensor A101 is a millimeter-wave radar that detects positions in a plane substantially along the road surface and the external sensor B201 is a camera.
 In FIG. 9, the vehicle present at position 962 in sensing Rc960 of the millimeter-wave radar (external sensor A101) has moved to position 972 in sensing Rd970. The vehicle present at position 948 in sensing Cc945 of the camera (external sensor B201) has moved to position 958 in sensing Cd955. In FIG. 9, the vehicle sensed by the millimeter-wave radar and the vehicle sensed by the camera are the same vehicle.
 When the millimeter-wave radar is an external sensor that detects positions in a plane, the position of an object included in the sensor A processing information 108 is also a position with respect to that plane (that is, an angle and a distance from the external sensor), and no information in the height direction can be obtained. In such a case, as shown in FIG. 10, the position correction unit B280 first estimates, by extrapolation in the plane in which the millimeter-wave radar performs detection, the position 974 at the sensing time of sensing Cd955, using the positions 962 and 972, the position and orientation of the host vehicle at the sensing times at which the positions 962 and 972 were sensed, the sensing time of sensing Cd955, and the position and orientation of the host vehicle at that time. In this extrapolation, radar sensing results earlier than sensing Rc960 may also be used. When the relative speed of the detected object is known, the relative speed may also be used in the estimation. Thereafter, the position correction unit B280 performs road surface estimation on sensing Cd955, obtains a region 971 by projecting the sensing region of the millimeter-wave radar onto the estimated road surface 956 of sensing Cd955 as shown in FIG. 11, and calculates the position 974 (see FIG. 10) within this region, thereby obtaining the corresponding position 975 on the camera sensing data (associated with the estimated road surface 956).
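 As a minimal sketch of the extrapolation step described above, the following assumes that two radar detections of the same object are available with their sensing times, already expressed in a common, ego-motion-compensated coordinate frame; all function and variable names are hypothetical.

```python
import numpy as np

def extrapolate_position(p1, t1, p2, t2, t_cam):
    """Linearly extrapolate an object position to the camera sensing time t_cam.

    p1, p2 : (x, y) positions of the same object at radar sensing times t1 < t2,
             already expressed in a common, ego-motion-compensated frame.
    Returns the predicted (x, y) at t_cam.
    """
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    velocity = (p2 - p1) / (t2 - t1)       # object velocity in the common frame
    return p2 + velocity * (t_cam - t2)    # extrapolate forward to the camera time

# Example: radar saw the object at (20.0, 1.5) m and, 40 ms later, at (19.2, 1.5) m;
# the camera frame was exposed 25 ms after the second radar sensing.
p_cam = extrapolate_position((20.0, 1.5), 0.000, (19.2, 1.5), 0.040, 0.065)
```

 When the radar also reports a relative speed, the velocity term above can be taken directly from that measurement instead of being differenced from two positions.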
 If the width of the region indicated by position 975 in FIG. 11 is subject to error due to the resolution of the millimeter-wave radar, the region is widened by a corresponding margin. For the height, the maximum height of an object 958 assumed as a detection target is determined in advance, and the height on the sensing data is calculated from the distance to the object (target) 958. The vertical position on the sensing data is obtained by assuming that the position of the estimated road surface 956 at position 975 is the bottom of the object 958. As a result, a region 959 on the sensing data of sensing Cd955 is obtained.
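 The following is a minimal sketch of how the image region 959 might be derived under a simple pinhole-camera assumption; the focal length, nominal object width, margins, and maximum object height used here are illustrative values, not values from the text.

```python
def radar_region_to_image_box(u_center, v_road, distance_m,
                              focal_px=1200.0, width_m=2.0,
                              width_margin_m=0.5, max_height_m=2.5):
    """Convert a radar detection projected onto the estimated road surface into an
    image-space box (pinhole model; all parameter values are assumptions).

    u_center   : horizontal image coordinate of the projected radar position 975
    v_road     : vertical image coordinate of the estimated road surface at 975
    distance_m : distance to the object measured by the radar
    """
    half_width_px = focal_px * (width_m / 2 + width_margin_m) / distance_m
    height_px = focal_px * max_height_m / distance_m
    # The bottom edge sits on the estimated road surface; the box extends upward.
    return (u_center - half_width_px, v_road - height_px,   # left, top
            u_center + half_width_px, v_road)               # right, bottom
```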
 The road surface estimation processing may be performed by the position correction unit B280 as part of the position correction processing, or, considering that it is also useful for recognition processing other than position correction, it may be performed by the recognition unit B250 and the result provided to the position correction unit B280.
 Road surface estimation processing when a stereo camera is used as the camera will be described with reference to FIGS. 12 and 13.
 To estimate the road surface, a rough road surface area in the sensing data is estimated first. For example, as shown in FIG. 12(a), an area that is likely to be road is estimated from image features in the sensing data 901 of one eye of the camera, approximated by a trapezoidal area. Such an area 902 is obtained by assuming that a road extending straight in the traveling direction has a constant width and a constant gradient, and calculating the position at which the road surface is projected onto the image.
 Since the road may actually be curved, the curvature of the road is estimated by detecting white lines in the image or detecting the road shoulder from features of the distance information obtained from the stereo parallax, and then a rough road surface area 903 is estimated as shown in FIG. 12(b).
 Next, a virtual plane of vertical position versus disparity, as shown in the right part of FIG. 13, is considered, and an image is generated by voting, for each vertical position of the sensing data, the disparities found within the road surface area. The number of votes is reflected as the intensity of the image. This image is called a V-Disparity image. In the V-Disparity image, as shown in the left part of FIG. 13, the nearby flat road surface 904 and the distant uphill road surface 905 have the property of being projected as oblique straight lines with different slopes, as in the area 914 and the area 915 in the right part of FIG. 13, respectively. An obstacle such as a preceding vehicle has the property of being projected as a vertical straight line, as in the area 917.
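 A minimal sketch of constructing a V-Disparity image from a dense disparity map and a road-area mask is shown below; the quantization of disparities into integer bins and the array shapes are assumptions for illustration.

```python
import numpy as np

def build_v_disparity(disparity, road_mask, max_disp=128):
    """Vote disparities of road-area pixels into an (image_rows x max_disp) histogram.

    disparity : 2-D float array of per-pixel disparities (invalid pixels <= 0)
    road_mask : 2-D bool array marking the rough road surface area (e.g. area 903)
    """
    rows = disparity.shape[0]
    v_disp = np.zeros((rows, max_disp), dtype=np.int32)
    for v in range(rows):
        d = disparity[v][road_mask[v] & (disparity[v] > 0)]
        bins = np.clip(d.astype(np.int32), 0, max_disp - 1)
        np.add.at(v_disp[v], bins, 1)      # one vote per pixel at (v, disparity)
    return v_disp
```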
 After the V-Disparity image has been created, the most dominant straight line in the V-Disparity image is detected; for example, the image is first binarized with a fixed threshold and then a Hough transform is applied. An example of a detected straight line is shown as the straight line 911 in the right part of FIG. 13. The straight line 911 indicates the relationship between the vertical position at which the estimated road surface is projected onto the image and the disparity, and can be converted into distance in the depth direction and road surface height in three-dimensional space. On the V-Disparity image, the detected straight line 911 agrees well with the disparity votes on the near side (the area 914 in the right part of FIG. 13), but deviates on the far side where the gradient changes (the area 915 in the right part of FIG. 13), so the reliability of the road surface estimation result represented by the straight line 911 is considered low there.
 Specifically, the low-reliability portion is determined by checking the number of disparity votes at the locations through which the straight line 911 passes, and identifying the portion in which locations with votes below a fixed threshold occur consecutively for more than a fixed length. Based on this determination, the estimated road surface of the sensing data corresponding to the portion whose vertical position is above the point 916 in the right part of FIG. 13 is judged invalid. As a result, a valid estimated road surface can be obtained from the straight line 911.
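 The sketch below fits the dominant road line and finds the validity cut-off (point 916) from the vote counts along that line. To keep it short, a weighted least-squares fit is used in place of the binarization-plus-Hough-transform procedure described above, and the thresholds are illustrative.

```python
import numpy as np

def fit_road_line(v_disp):
    """Fit disparity = a * v + b through the vote mass of a V-Disparity image."""
    v_idx, d_idx = np.nonzero(v_disp)
    w = v_disp[v_idx, d_idx].astype(float)
    a, b = np.polyfit(v_idx, d_idx, deg=1, w=w)   # least-squares stand-in for Hough
    return a, b

def find_validity_cutoff(v_disp, a, b, min_votes=5, max_gap=10):
    """Return the image row (roughly point 916) above which the fitted road line is
    unsupported: vote counts stay below min_votes for more than max_gap rows."""
    rows, max_disp = v_disp.shape
    gap = 0
    for v in range(rows - 1, -1, -1):             # scan from near (bottom) to far (top)
        d = int(round(a * v + b))
        votes = v_disp[v, d] if 0 <= d < max_disp else 0
        gap = gap + 1 if votes < min_votes else 0
        if gap > max_gap:
            return v + gap                        # rows above this are judged invalid
    return 0
```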
 Examples of the signal processing procedures in the signal processing devices 100 and 200 (their recognition unit A150 and recognition unit B250) are shown in FIGS. 14 and 15. FIG. 14 assumes signal processing for sensing by a millimeter-wave radar, and FIG. 15 assumes signal processing for sensing by a stereo camera.
 In FIG. 14, in sensor output R (S501), a millimeter wave whose frequency changes with time is transmitted, the reflected wave is received by a plurality of receiving antennas, and both the transmitted signal and the signal received at each receiving antenna are output as sensing data. The output sensing data is stored in a buffer in sensor input R (S502). Simple processing that does not require buffering, such as addition between signals, is performed before buffering.
 From the stored data, in local maximum detection R (S512), the distance to the object that reflected the millimeter wave is calculated based on the elapsed time from transmission of the millimeter wave in sensor output R (S501) until reception of the reflected wave, or based on the difference between the transmission frequency and the reception frequency (when the frequency is changed with time, a difference arises between the transmission and reception frequencies at the same instant according to the delay of the reflected wave). The direction of the object is detected from the phase difference of the received waves between the receiving antennas. In practice, these quantities appear, for example, as peaks in the result of Range-Doppler FFT processing, so local maxima are detected and, for each local maximum, the direction, distance, and relative speed of the corresponding object with respect to the host vehicle are obtained.
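 A minimal sketch of the Range-Doppler FFT and peak extraction mentioned above is given below; the radar parameters (bandwidth, chirp duration, wavelength), the thresholding in place of a proper CFAR detector, and the data layout (de-chirped beat samples arranged as chirps x samples-per-chirp) are all assumptions for illustration.

```python
import numpy as np

C = 3.0e8  # speed of light [m/s]

def range_doppler_peaks(beat, bandwidth=150e6, chirp_time=50e-6,
                        wavelength=3.9e-3, threshold_db=20.0):
    """Return (range_m, velocity_mps, power_db) tuples for peaks in a Range-Doppler map.

    beat : 2-D complex array of de-chirped samples, shape (n_chirps, n_samples).
    """
    n_chirps, n_samples = beat.shape
    rd = np.fft.fftshift(np.fft.fft2(beat), axes=0)      # range-Doppler map
    power_db = 20 * np.log10(np.abs(rd) + 1e-12)

    range_res = C / (2 * bandwidth)                        # metres per range bin
    vel_res = wavelength / (2 * n_chirps * chirp_time)     # m/s per Doppler bin

    peaks = []
    floor = np.median(power_db)
    for d_bin, r_bin in zip(*np.nonzero(power_db > floor + threshold_db)):
        rng = r_bin * range_res
        vel = (d_bin - n_chirps // 2) * vel_res
        peaks.append((rng, vel, power_db[d_bin, r_bin]))
    return peaks
```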
 After local maximum detection R (S512), in noise removal R (S514), the detection history so far is checked to determine whether a detection is temporally sporadic, and if it is, it is removed as noise.
 After noise removal R (S514), in object tracking R (S516), for each detected object the relationship between the amount of movement from the position in the immediately preceding detection result and the movement speed is checked, and the object is associated with the corresponding object in the immediately preceding detection result. When information is provided to another external sensor, movement must be predicted for each object in order to correct the detected position in time, so information on the speed and direction of movement of each object, or position information that allows the same object to be identified, is required. To identify the same object, an identification ID is assigned to and managed for each object.
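 A minimal sketch of ID assignment by nearest-neighbor association with the previous detection result is shown below; the gating distance and the data layout (each detection as an (x, y) position) are assumptions for illustration.

```python
import itertools
import numpy as np

_next_id = itertools.count(1)

def associate(prev_tracks, detections, max_dist=2.0):
    """Assign IDs to new detections by nearest-neighbour matching.

    prev_tracks : dict {track_id: (x, y)} from the previous cycle
    detections  : list of (x, y) positions from the current cycle
    Returns dict {track_id: (x, y)}; unmatched detections get new IDs.
    """
    tracks = {}
    unused = dict(prev_tracks)
    for det in detections:
        if unused:
            tid, pos = min(unused.items(),
                           key=lambda kv: np.hypot(kv[1][0] - det[0], kv[1][1] - det[1]))
            if np.hypot(pos[0] - det[0], pos[1] - det[1]) <= max_dist:
                tracks[tid] = det
                del unused[tid]
                continue
        tracks[next(_next_id)] = det   # no close predecessor: start a new track
    return tracks
```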
 Here, the processing up to and including object tracking R (S516) is handled as preprocessing R (S510).
 For each detected object, feature parameters are extracted in common feature parameter detection R (S522). For example, the movement speed and movement pattern of the object, the intensity of the reflected wave, and its variation are extracted as feature parameters. The subsequent recognition processing R (S520) refers to these parameters as needed.
 In automobile identification R (S524), taking into account the feature parameters and the previous identification information of the object obtained from object tracking, the object is marked as an automobile if it satisfies the characteristics of an automobile. In bicycle identification R (S526), the object is marked as a bicycle if it satisfies the characteristics of a bicycle, again taking into account the feature parameters and the previous identification information from object tracking. Similarly, in sign identification R (S528), the object is marked as a sign if it satisfies the characteristics of a sign. Although a millimeter-wave radar cannot identify the content of a sign, the presence of the sign is still confirmed because it can later be used, during fusion processing, to corroborate the detection information of signs detected by a camera or the like.
 Finally, in recognition processing result output R (S530), an identification ID, direction, distance, relative speed, and object type are output for each individual object as the object recognition result (sensor recognition result).
 On the other hand, in FIG. 15, in sensor output S (S531), the outside world is sensed as images by the stereo camera. During sensing, the exposure time and exposure sensitivity are adjusted appropriately according to the external environment. The sensing data is buffered in sensor input S (S532), and in image correction S (S541), image distortion due to camera mounting position and optical system errors is corrected, brightness and color deviations due to sensor characteristic errors are corrected, and pixel information for defective pixels present in the stereo camera output of sensor output S (S531) is interpolated.
 After image correction S (S541), parallax detection S (S542) is performed to obtain the parallax of each pixel between the left and right camera images. Then, using the parallax information, the estimated road surface is obtained in road surface detection S (S543).
 After road surface detection S (S543), in object detection S (S544), clusters of equal parallax that differ from the estimated road surface are detected as objects. For each detected object, the distance is calculated from the parallax, the distances within the object's region are averaged, and the distance to the object is obtained. The direction of the object with respect to the orientation of the host vehicle is also calculated from its position in the image. At this stage, taking the size of the detected object into account, small objects that are not detected stably over time are excluded from the detection targets.
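 A minimal sketch of deriving an object distance from the stereo parallax is shown below; the focal length and baseline values are illustrative, and the object region is assumed to be given as a boolean mask over the disparity map.

```python
import numpy as np

def object_distance(disparity, object_mask, focal_px=1200.0, baseline_m=0.30):
    """Average the per-pixel stereo distances Z = f * B / d over an object region.

    disparity   : 2-D array of per-pixel disparities in pixels (invalid <= 0)
    object_mask : 2-D bool array marking the pixels belonging to the object
    """
    d = disparity[object_mask & (disparity > 0)]
    depths = focal_px * baseline_m / d       # pinhole stereo: Z = f * B / d
    return float(np.mean(depths)) if depths.size else None
```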
 For the objects detected in object detection S (S544), in object tracking S (S545), for each detected object the relationship between the amount of movement from the position in the immediately preceding detection result and the movement speed is checked, and the object is associated with the corresponding object in the immediately preceding detection result. To identify the same object, an identification ID is assigned to and managed for each object. The relative speed is calculated by obtaining the change in distance of the same object and dividing it by the sensing time interval.
 Here, the processing up to and including object tracking S (S545) is handled as preprocessing S (S540).
 For each detected object, common feature parameter detection S (S551) calculates parameters used in common by the recognition processing, such as the edge distribution, luminance, and actual size. The subsequent recognition processing S (S550) refers to these parameters as needed.
 In automobile identification S (S552), taking into account the feature parameters and the previous identification information of the object obtained from object tracking, the object is marked as an automobile if it satisfies the characteristics of an automobile. In bicycle identification S (S553), the object is marked as a bicycle if it satisfies the characteristics of a bicycle, again taking into account the feature parameters and the previous identification information from object tracking. Similarly, in pedestrian identification S (S554), the object is marked as a pedestrian if it satisfies the characteristics of a pedestrian; where possible, adults and children are also distinguished. In sign identification S (S555), if the object satisfies the characteristics of a sign, it is marked as a sign and at the same time the content of the sign is identified.
 Finally, in recognition processing result output S (S560), an identification ID, position, distance, relative speed, object type, and additional information (such as the content of a sign) are output for each individual object as the object recognition result (sensor recognition result).
 As shown in FIGS. 14 and 15, sensor signal processing requires a plurality of processes, broadly divided into a preprocessing part required for object detection (preprocessing R (S510) in FIG. 14, preprocessing S (S540) in FIG. 15) and a recognition processing part required for object identification and content recognition (recognition processing R (S520) in FIG. 14, recognition processing S (S550) in FIG. 15).
 The operation timing of the processing for the external sensor A101 on the signal processing device 100 for sensor A side and of the signal processing for the external sensor B201 on the signal processing device 200 for sensor B side will be described with reference to FIG. 16.
 As the basic operation, in the processing for the external sensor A101, preprocessing A1 (S821) is performed on sensing A1 (S811) by the external sensor A101, recognition processing A1 (S831) is performed on the output of preprocessing A1 (S821), and the recognition processing result described above is output. Likewise, preprocessing A2 (S822) is performed on sensing A2 (S812) by the external sensor A101, recognition processing A2 (S832) is performed on the output of preprocessing A2 (S822), and the recognition processing result is output. The processing from sensing A3 (S813) onward is performed in the same way. That is, pipeline processing of sensing, preprocessing, and recognition processing is performed.
 In the processing for the external sensor B201, on the other hand, preprocessing B1 (S871) is performed on sensing B1 (S861) by the external sensor B201, recognition processing B1 (S881) is performed on the output of preprocessing B1 (S871), and the recognition processing result is output. Likewise, preprocessing B2 (S872) is performed on sensing B2 (S862) by the external sensor B201, recognition processing B2 (S882) is performed on the output of preprocessing B2 (S872), and the recognition processing result is output. The processing from sensing B3 (S863) onward is the same. That is, as with the external sensor A101, pipeline processing of sensing, preprocessing, and recognition processing is performed.
 Regarding the use of the processing for the external sensor A101 in the processing for the external sensor B201, FIG. 16(a) shows the operation timing when the result of the recognition processing for the external sensor A101 is used in the recognition processing for the external sensor B201. Because of the timing at which the processing is performed, recognition processing B2 (S882) uses the output of position correction B1 (S875). That is, position correction B1 (S875) is performed based on information up to sensing A1 (S811), and that position correction B1 (S875) is used in the recognition processing for sensing B2 (S862).
 FIG. 16(b) shows the operation timing when the result of the preprocessing for the external sensor A101 is used in the recognition processing for the external sensor B201. Because of the timing at which the processing is performed, recognition processing B2 (S882) uses the output of position correction B2 (S876). That is, position correction B2 (S876) is performed based on information up to sensing A2 (S812) (instead of information only up to sensing A1 (S811)), and that position correction B2 (S876) is used in the recognition processing for sensing B2 (S862).
 As described with reference to FIGS. 14 and 15, in the sensor signal processing the position of an object is detected in the preprocessing, so instead of using the result of the recognition processing for the external sensor A101 as shown in FIG. 16(a), it is possible to use the result of the preprocessing as shown in FIG. 16(b). By using the timing shown in FIG. 16(b), the recognition processing for the external sensor B201 can use processing results (object detection results) based on more recent sensing data of the external sensor A101 than in FIG. 16(a). Since the position correction uses extrapolation, using the most recent sensing data improves the correction accuracy; that is, it is better to use an intermediate processing result of the sensor signal processing for the external sensor A101 (the result of the preprocessing performed partway through the recognition processing) in the recognition processing of the sensing data of the external sensor B201.
 It is also possible, at a slight cost in correction accuracy, to perform part of the recognition processing and include that recognition information in the sensor A processing information 108 (see FIG. 1) for use in the recognition processing of the sensing data of the external sensor B201. For example, when the external sensor A101 is a millimeter-wave radar, processing up to automobile identification R (S524) in FIG. 14 is performed, and the information on whether the object is an automobile is used in the recognition processing of the sensing data of the external sensor B201 as follows: if the result of automobile identification R (S524) is true, the object is recognized as an automobile even when the likelihood from the sensor B data alone is somewhat low, and if the result is false, the object is not recognized as an automobile even when that likelihood is somewhat high. In this way, the recognition accuracy for the sensing data of the external sensor B201 can be improved.
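 The following minimal sketch illustrates this kind of radar-assisted decision by shifting the camera classifier's decision threshold according to the radar's automobile flag; the score scale and threshold values are assumptions for illustration.

```python
def classify_vehicle(camera_score, radar_says_vehicle,
                     base_threshold=0.5, shift=0.15):
    """Decide 'vehicle or not' from a camera classifier score in [0, 1], biased by the
    millimeter-wave radar's automobile identification result.

    If the radar marked the object as an automobile, accept somewhat lower camera
    scores; if it did not, require somewhat higher camera scores.
    """
    threshold = base_threshold - shift if radar_says_vehicle else base_threshold + shift
    return camera_score >= threshold

# Example: a marginal camera score of 0.45 is accepted only when the radar agrees.
assert classify_vehicle(0.45, radar_says_vehicle=True)
assert not classify_vehicle(0.45, radar_says_vehicle=False)
```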
 The configuration of an evaluation system 1001 that performs simulation evaluation of the signal processing system 1000 (sensor system) shown in FIG. 1 will be described with reference to FIG. 17.
 The evaluation target portion 340 of this evaluation system 1001 consists of the signal processing device 100 for sensor A, the signal processing device 200 for sensor B, and the fusion processing device 650 that constitute the signal processing system 1000. The evaluation system 1001 evaluates how well the results sensed by the external sensor A101 and the external sensor B201 can be recognized as objects by the configuration of the evaluation target portion 340.
 The evaluation system 1001 of the present embodiment mainly comprises a vehicle control system model 370, a road environment generation model 390, an information display device model 310, a pseudo user input generation model 320, and a pseudo sensing signal generation device 330. The output (integrated recognition result 595) of the signal processing system 1000 (its fusion processing device 650) is connected to the vehicle control system model 370, which simulates the movement of the vehicle; the vehicle control system model 370 is connected to the road environment generation model 390, which simulates the environment around the vehicle; and the pseudo sensing signal generation device 330 generates simulated sensing signals for the sensing data of the external sensor A101 and the external sensor B201 from the information generated by the vehicle control system model 370 and the road environment generation model 390, and inputs those simulated sensing signals to the signal processing devices 100 and 200 of the signal processing system 1000.
 More specifically, the vehicle control system model 370 is a simulation model of the vehicle control system 700 shown in FIG. 7. The road environment generation model 390 is a simulation model that, from the position and orientation of the host vehicle calculated by the vehicle control system model 370, determines the position, movement direction, speed, lighting conditions, road surface conditions, and the like of the external environment around the host vehicle (terrain, roads, objects on the road, objects around the road, and so on) at a given simulation time. The road environment generation model 390 works in close cooperation with the vehicle control system model 370 to simulate the behavior of the host vehicle.
 The pseudo user input generation model 320 is a simulation model that supplies signals corresponding to driver operations to the vehicle control system model 370. The information display device model 310 is a simulation model that, based on the signals output by the vehicle control system model 370, displays simulated meters and warning lights on a screen of the evaluation environment and records temporal changes in the output of the vehicle control system model 370.
 The pseudo sensing signal generation device 330 artificially generates the signals of the external sensor A101 and the external sensor B201 (pseudo sensing signals such as CG images and radar reflection signals) based on the information generated by the road environment generation model 390. By having the pseudo sensing signal generation device 330 supply pseudo signals to the signal processing device 100 for sensor A and the signal processing device 200 for sensor B of the signal processing system 1000 at simulation timings corresponding to an actual vehicle system, the evaluation target portion 340 operates and its recognition results are conveyed to the vehicle control system model 370, so that the movement of the host vehicle can be simulated including the evaluation target portion 340. In the evaluation system 1001, by collecting and analyzing the inputs, outputs, and internal signals of the evaluation target portion 340 during simulation execution, it is possible to evaluate whether the evaluation target portion 340 performs sufficient recognition and whether the desired vehicle behavior can be obtained for a vehicle corresponding to the vehicle control system model 370.
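 A minimal sketch of one step of this closed simulation loop is shown below; all class and method names are hypothetical placeholders standing in for the models and devices described above.

```python
def simulation_step(t, road_env_model, vehicle_model, pseudo_sensing,
                    signal_processing_system, logger):
    """One closed-loop step of the evaluation system (all interfaces hypothetical)."""
    ego_state = vehicle_model.state()                    # position / orientation of host vehicle
    scene = road_env_model.environment(t, ego_state)     # surrounding objects, road, lighting
    sensor_a_sig, sensor_b_sig = pseudo_sensing.generate(t, scene, ego_state)
    result = signal_processing_system.process(sensor_a_sig, sensor_b_sig)  # evaluation target 340
    vehicle_model.apply_control(result, t)               # integrated recognition drives control
    logger.record(t, result, vehicle_model.state())      # collect I/O for later analysis
```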
 A simplification used when the pseudo sensing signal generation device 330 generates the pseudo signals supplied to the signal processing device 100 for sensor A and the signal processing device 200 for sensor B will be described with reference to FIGS. 18 and 19.
 As described above, when an external sensor performs sensing, the sensing time may differ depending on the sensing location. For example, in a CMOS sensor used in a camera, the exposure time may differ slightly in units of pixels or lines. When it is difficult for the pseudo sensing signal generation device 330 to reproduce this time shift and the sensor A processing information 108 requires a sensing time for each individual object, the pseudo signal is generated as if the entire scene were sensed simultaneously (at the same time) in a single sensing, as shown in FIG. 18, and the signal processing device 100 for sensor A is modified for the simulation environment so that the sensing times of all objects (object p990, object q992, object r994, object s996) are aligned to the time ft at which that single sensing was performed. When a sensing time is not required for each object but a sensing time is given per sensing region as described with reference to FIG. 4, the sensing times of all sensing regions are aligned to ft; when the sensing time is given by the time calculation formula 988 as described with reference to FIG. 5, the coefficients of the formula are set to 0 and Ts = ft, so that the sensing time does not depend on the sensing position.
 If aligning the sensing times of all objects to the same time as shown in FIG. 18 is insufficient for verifying the position correction operation, there is a method in which, as shown in FIG. 19, the pseudo sensing signal generation device 330 generates pseudo sensing signals that take differences in sensing time within a single sensing into account, limited to the main objects (such as vehicles on the road) (object p990, object q992). In FIG. 19, the vehicle objects p990 and q992 are regarded as the main objects, their sensing times are pt and qt respectively, and the vehicle sensing signals are generated at the positions corresponding to the sensing times pt and qt; for the objects r994 and s996, the representative sensing time ft corresponding to the single sensing is used as their sensing time, and their sensing signals are generated at the positions corresponding to that representative sensing time ft.
 With this method, in a configuration in which a sensing time is attached to each object and included in the sensor A processing information 108, the sensor input unit A110 is made, for the simulation environment, to manage the sensing time as if the whole of a single sensing were captured at the same time. The pseudo sensing signal generation device 330 then sends information on the sensing times of the main objects (for example, object p990 and object q992) to the recognition unit A150 and the cooperative data output unit A140 of the signal processing device 100 for sensor A, which are modified so that, for those objects only, the sensing time is overwritten based on that information.
 When the sensing time is given per sensing region as described with reference to FIG. 4, or by the time calculation formula 988 as described with reference to FIG. 5, a sensing time cannot be attached to each object in the sensor A processing information 108. In that case, instead of sending the information on the sensing times of the main objects (for example, object p990 and object q992) to the cooperative data output unit A140, the information is sent to the position correction unit B280, which is modified to overwrite and manage the sensing times based on that information.
 In CMOS sensors used in cameras, the sensing time often differs depending on the sensing position, for example because the exposure timing differs line by line. When generating pseudo sensing signals corresponding to such sensing by CG (computer graphics) rendering, reproducing the differences in sensing time requires, for every position projected onto the sensor, recalculating the three-dimensional positions of the objects that might be drawn at that position and rendering based on the results, which increases the computational load. By using the method shown in FIG. 18 or FIG. 19 described above, for example when the camera signal is generated by the pseudo sensing signal generation device 330, the CG rendering can be performed after the three-dimensional position has been determined in advance at least for each object, so the computational load associated with CG rendering can be suppressed.
 As described above, in the present embodiment, the signal processing devices 100 and 200 for the sensing data of the respective external sensors are time-synchronized by using the same clock or by using a method such as that defined in IEEE 802.1AS. They also manage the sensing timing of the connected external sensors. The signal processing device 100 for sensor A outputs to the signal processing device 200 for sensor B, as the recognition processing result or as information partway through the recognition processing, the position information of the target objects together with the sensing time at which the sensing data being processed was acquired. The signal processing device 200 for sensor B, which processes the sensing data of the external sensor B201, estimates the positions of the objects corresponding to the sensing time of the sensing data acquired by the external sensor B201 from the object position information and the history of sensing times received from the signal processing device 100 for sensor A, and uses them as priority processing regions in the recognition processing of the sensing data acquired by the external sensor B201.
 Thus, according to the present embodiment, when recognition processing is performed using the sensing data of a plurality of external sensors and the recognition processing of the external sensor B201 (second external sensor) uses object position information that is the recognition processing result of the external sensor A101 (first external sensor) or information partway through its recognition processing, positional deviation of the locations at which specific processing, such as focused processing, is applied to the sensing data of the external sensor B201 can be suppressed, and the improved positional accuracy leads to improved recognition accuracy for the external sensor B201. In addition, the margins of the regions in which specific processing is performed can be reduced, which reduces the processing load of the recognition processing.
[Second Embodiment]
 A second embodiment of the present invention will be described with reference to FIG. 20.
 The second embodiment has a configuration in which, relative to the configuration described with reference to FIG. 2 in the first embodiment, the position correction unit is moved from the signal processing device 200 for sensor B to the signal processing device 100 for sensor A.
 In FIG. 20, the flow in which the signal processing device 100 for sensor A outputs the sensor A recognition result 105 from the external sensor A101 via the sensor input unit A110, the recognition unit A150, and the result output unit A120, and the flow in which the signal processing device 200 for sensor B outputs the sensor B recognition result 205 from the external sensor B201 via the sensor input unit B210, the recognition unit BX252, and the result output unit B220, are the same as in the first embodiment.
 In the present embodiment, the position correction unit AX182 provided in the signal processing device 100 for sensor A acquires, from the sensor input unit A110, the detection information of the objects obtained by the recognition processing of the recognition unit A150 and the sensing time of the sensing data used by the recognition unit A150 for that recognition processing. If the sensing time of the external sensor A101 differs depending on the position on the sensing data, the sensing time is managed for each detected object.
 The recognition unit BX252 of the signal processing device 200 for sensor B requests the cooperative data input unit BX232 to acquire, from the signal processing device 100 for sensor A, the position information of the objects obtained by sensing with the external sensor A101, doing so at least as far in advance of the time of the next recognition processing as the time required to acquire the position-corrected object position information from the signal processing device 100 for sensor A. The position information to be acquired includes information identifying which sensing of the external sensor B201 provides the sensing data to be used in the next recognition processing.
 Upon receiving the request, the cooperative data input unit BX232 receives the sensing time of the corresponding sensing data from the sensor input unit B210 and requests the cooperative data output unit AX142 to transmit the position information of the objects detected based on the sensing data of the external sensor A101 corresponding to that sensing time. If the sensing time of the external sensor B201 differs depending on the position on the sensing data, the sensing time is received as a relationship between position on the sensing data and sensing time.
 Upon receiving the request, the cooperative data output unit AX142 requests the position correction unit AX182 to correct the positions of the objects detected by the recognition unit A150 to the positions at the time obtained from the sensing time information included in the request, and to send them to the cooperative data output unit AX142 together with that sensing time information.
 Upon receiving the request from the cooperative data output unit AX142, the position correction unit AX182 uses the latest object detection information obtained from the recognition unit A150, together with earlier object detection information, to calculate by extrapolation the position information matching the sensing time information in the request from the cooperative data output unit AX142, that is, the sensing time information of the external sensor B201. The calculation result of the position correction unit AX182 is conveyed (output) to the recognition unit BX252 via the cooperative data output unit AX142 and the cooperative data input unit BX232, and the recognition unit BX252 uses the calculation result in its next recognition processing.
 The second embodiment, of course, provides the same operational effects as the first embodiment.
[Third Embodiment]
 A third embodiment of the present invention will be described with reference to FIG. 21.
 The third embodiment has a configuration in which, relative to the configuration described with reference to FIG. 2 in the first embodiment, the signal processing device 100 for sensor A and the signal processing device 200 for sensor B are integrated into a signal processing device 300 for sensors A and B.
 In FIG. 21, the flow of outputting the sensor A recognition result 105 from the external sensor A101 via the sensor input unit AZ114, the recognition unit AZ154, and the result output unit A120, and the flow of outputting the sensor B recognition result 205 from the external sensor B201 via the sensor input unit B210, the recognition unit B250, and the result output unit B220, are the same as in the first embodiment.
 In the present embodiment, as a result of the integration into the signal processing device 300 for sensors A and B, the input and output units required for communicating the sensor A processing information 108 (the cooperative data output unit A140 and the cooperative data input unit B230 in FIG. 2) are omitted, and since the inputs and outputs of the position correction unit accordingly differ from those of the first embodiment, it is replaced by the position correction unit BZ384. Because the sensor input unit A110 and the recognition unit A150 of FIG. 2 are now also connected to the position correction unit BZ384, they are replaced by the sensor input unit AZ114 and the recognition unit AZ154, but the information exchanged there is substantially the same as the information that the sensor input unit A110 and the recognition unit A150 exchange with the cooperative data output unit A140 in the first embodiment. Likewise, the information that the position correction unit BZ384 receives from the sensor input unit AZ114 and the recognition unit AZ154 is substantially the same as the information that the position correction unit B280 receives in the first embodiment from the sensor input unit A110 and the recognition unit A150 via the cooperative data output unit A140 and the cooperative data input unit B230. The information that the position correction unit BZ384 exchanges with the sensor input unit B210 and the recognition unit B250 is also substantially the same as the information exchanged with the position correction unit B280 in the first embodiment.
 In the present embodiment, the position correction unit BZ384 of the signal processing device 300 for sensors A and B, like the position correction unit B280 of the first embodiment, corrects the position information of each object detected by the recognition unit AZ154 using the sensing time information of the external sensor A101 and the external sensor B201 and conveys it to the recognition unit B250, and the recognition unit B250 uses that information when performing the recognition processing.
 The third embodiment not only provides the same operational effects as the first embodiment, but also simplifies the device configuration of the signal processing device 300 for sensors A and B (and hence the system configuration of the signal processing system).
[Fourth Embodiment]
 A fourth embodiment of the present invention will be described with reference to FIGS. 22 and 23.
 The fourth embodiment is configured, on the basis of the configuration described with FIG. 2 of the first embodiment, to control the sensing timing of the external sensor A101 in accordance with the sensing timing of the external sensor B201.
 FIG. 22 shows the internal configuration of the signal processing devices 100 and 200 of the fourth embodiment.
 In FIG. 22, the flow in which, in the sensor A signal processing device 100, the sensor A recognition result 105 is output from the external sensor A101 through the sensor input unit AY113, the recognition unit A150, and the result output unit A120, and the flow in which, in the sensor B signal processing device 200, the sensor B recognition result 205 is output from the external sensor B201 through the sensor input unit B210, the recognition unit BY253, and the result output unit B220, are the same as in the first embodiment. The flow in which the position correction unit B280, via the cooperative data output unit AY143 and the cooperative data input unit BY233, corrects the position of the object detected by the recognition unit A150 on the basis of the sensing time of the external sensor A101 and the sensing time of the external sensor B201, and the corrected position is used by the recognition unit BY253 of the sensor B signal processing device 200, is also the same.
 However, in the fourth embodiment, the sensor input unit AY113 of the sensor A signal processing device 100 has a function of controlling the sensing operation timing of the external sensor A101 on the basis of sensing timing information that arrives from the sensor B signal processing device 200 via the cooperative data output unit AY143. In addition, the recognition unit BY253 of the sensor B signal processing device 200 has a function of notifying the timing calculation unit BY293 of the processing timing at which processing of the sensing data starts.
 Further, the cooperative data input unit BY233 and the cooperative data output unit AY143 add, to the cooperative data input unit B230 and the cooperative data output unit A140 of FIG. 2 respectively, a function of conveying the sensing start request for the external sensor A101 that is output from the timing calculation unit BY293.
 The timing calculation unit BY293 calculates the timing at which to issue the sensing start request for the external sensor A101 using: the periodicity of the processing of the recognition unit BY253; the preset time covering one sensing operation of the external sensor A101 plus the time until that sensing data is preprocessed by the recognition unit A150 to calculate the object positions, the calculation result is sent to the position correction unit B280, and the position correction result reaches the recognition unit BY253 (preparation time Tp899, see FIG. 23); and the preset delay from when the request signal leaves the timing calculation unit BY293 and travels through the cooperative data input unit BY233, the cooperative data output unit AY143, and the sensor input unit AY113 to the external sensor A101 until sensing by the external sensor A101 actually starts. At the calculated timing, it issues the sensing start request for the external sensor A101.
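 The calculation above is described only in words; a minimal sketch of the arithmetic is given below (Python; the function name and the split of the delay budget into exactly two preset terms are illustrative assumptions, not part of the publication).

```python
def sensing_start_request_time(next_recognition_b_start: float,
                               preparation_time_tp: float,
                               request_propagation_delay: float) -> float:
    """Return the time at which to issue the sensing start request for sensor A101.

    next_recognition_b_start  - next periodic start of recognition processing
                                in recognition unit BY253 [s]
    preparation_time_tp       - preset time from the start of one sensing
                                operation of sensor A101 until the corrected
                                object positions reach recognition unit BY253
                                (preparation time Tp899 in FIG. 23) [s]
    request_propagation_delay - preset delay from issuing the request until
                                sensor A101 actually starts sensing [s]
    """
    sensing_must_start_by = next_recognition_b_start - preparation_time_tp
    return sensing_must_start_by - request_propagation_delay


# Example: recognition B starts at t = 1.000 s, Tp = 60 ms, request delay = 5 ms
# -> the request should be issued at t = 0.935 s.
print(sensing_start_request_time(1.000, 0.060, 0.005))
```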
 FIG. 23 shows the operation timing in the configuration shown in FIG. 22.
 By instructing the sensing timing of the external sensor A101 from the sensor B signal processing device 200 to the sensor A signal processing device 100, the waiting time in the processing flow of sensing A2 (S812), preprocessing A2 (S822), position correction B2 (S876), and recognition processing B2 (S882) is suppressed, and in the subsequent processing of the sensing data of the external sensor A101 as well, the waiting time is suppressed by the cooperative operation of the sensor A signal processing device 100 and the sensor B signal processing device 200. Therefore, when the recognition processing for the external sensor B201 is performed, the error associated with the extrapolation carried out when correcting, by sensing time, the object position detection result obtained from the sensing of the external sensor A101 can be suppressed.
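 As a back-of-the-envelope illustration of why a smaller sensing-time gap reduces the extrapolation error (the numbers below are assumptions, not values from the publication), suppose the velocity estimate used for the extrapolation is off by 2 m/s:

```python
# Assumed velocity estimation error used for the extrapolation [m/s]
dv = 2.0
for gap_ms in (20, 50, 100, 200):
    err = dv * gap_ms / 1000.0  # position error accumulated over the gap [m]
    print(f"sensing-time gap {gap_ms:4d} ms -> extrapolation error {err:.2f} m")
# 20 ms -> 0.04 m, 200 ms -> 0.40 m: shortening the gap by coordinating the
# sensing timing directly shrinks the error introduced by extrapolation.
```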
 Although the configuration of the fourth embodiment adjusts the sensing timing of the external sensor A101 on the basis of the configuration of the first embodiment, it is sufficient that the sensing of the external sensors and the corresponding processing timing can be coordinated, so configurations based on the configuration shown in the second embodiment or the third embodiment are also conceivable. A configuration in which the sensor B signal processing device 200 is operated with the sensing timing of the external sensor A101 as the reference is also conceivable; in that case, the sensing timing of the external sensor B201 needs to be adjusted to the operation timing of the sensor A signal processing device 100.
 In the fourth embodiment as well, the same operational effects as in the first embodiment are obtained; in addition, since the sensing time of the sensing data of one external sensor is adjusted to match the processing timing of the sensing data of the other external sensor, positional deviation caused by the difference in sensing times can be further suppressed, and the recognition accuracy based on the external sensors can be further improved.
 The present invention is not limited to the embodiments described above and includes various modifications. For example, the above embodiments have been described in detail for ease of understanding of the present invention, and the invention is not necessarily limited to configurations having all of the described elements. Part of the configuration of one embodiment may be replaced with the configuration of another embodiment, and the configuration of another embodiment may be added to the configuration of one embodiment. For part of the configuration of each embodiment, other configurations may be added, deleted, or substituted.
 Each of the above configurations, functions, processing units, processing means, and the like may be realized in hardware, in part or in whole, for example by designing them as integrated circuits. Each of the above configurations, functions, and the like may also be realized in software by a processor interpreting and executing a program that implements the corresponding function. Information such as programs, tables, and files for realizing each function can be stored in a memory, in a storage device such as a hard disk or an SSD (Solid State Drive), or on a recording medium such as an IC card, an SD card, or a DVD.
 The control lines and information lines shown are those considered necessary for the explanation; not all control lines and information lines of an actual product are necessarily shown. In practice, almost all components may be regarded as being connected to one another.
DESCRIPTION OF SYMBOLS 10…time synchronization information, 100…signal processing device for sensor A, 101…external sensor A, 105…sensor A recognition result, 108…sensor A processing information, 110…sensor input unit A, 113…sensor input unit AY, 114…sensor input unit AZ, 120…result output unit A, 140…cooperative data output unit A, 142…cooperative data output unit AX, 143…cooperative data output unit AY, 150…recognition unit A, 154…recognition unit AZ, 182…position correction unit AX, 200…signal processing device for sensor B, 201…external sensor B, 205…sensor B recognition result, 210…sensor input unit B, 220…result output unit B, 230…cooperative data input unit B, 232…cooperative data input unit BX, 233…cooperative data input unit BY, 250…recognition unit B, 252…recognition unit BX, 253…recognition unit BY, 280…position correction unit B, 293…timing calculation unit BY, 310…information display device model, 320…pseudo user input generation model, 330…pseudo sensing signal generation device, 340…evaluation target portion, 370…vehicle control system model, 384…position correction unit BZ, 390…road environment generation model, 595…integrated recognition result, 610…information display device, 620…user input device, 650…fusion processing device, 700…vehicle control system, 701…own vehicle position information, 705…vehicle integrated control device, 710…engine control device, 712…engine control system actuator, 714…engine control system sensor, 720…transmission control device, 722…transmission control system actuator, 724…transmission control system sensor, 730…brake control device, 732…brake control system actuator, 734…brake control system sensor, 740…steering control device, 742…steering control system actuator, 744…steering control system sensor, 790…common bus, 899…preparation time Tp, 901…view in which an estimated region highly likely to be road is superimposed on the sensing data of one eye of the stereo camera, 902…estimated region highly likely to be road (assuming a straight road of constant width and constant gradient), 903…estimated region highly likely to be road (estimating the curvature of the road), 904…nearby flat road surface, 905…uphill road surface, 911…most dominant straight line in the V-Disparity image, 914…nearby flat road surface region in the V-Disparity image, 915…distant uphill region in the V-Disparity image, 916…point above which the estimated road surface corresponding to the vertical position is judged invalid, 917…obstacle such as a preceding vehicle in the V-Disparity image, 920…sensing La, 922…vehicle sensed by sensing La, 930…sensing Lb, 932…vehicle sensed by sensing Lb, 940…sensing Ca, 942…vehicle sensed by sensing Ca, 945…sensing Cc, 948…vehicle sensed by sensing Cc, 950…sensing Cb, 952…vehicle sensed by sensing Lb projected onto sensing Cb, 953…vehicle sensed by sensing Cb, 955…sensing Cd, 956…estimated road surface, 957…explanatory view in which the estimated road surface and the like are added to sensing Cd, 958…vehicle sensed by sensing Cd, 959…estimated vehicle region based on the millimeter-wave radar on the sensing data of sensing Cd, 960…sensing Rc, 962…vehicle sensed by sensing Rc, 970…sensing Rd, 971…region in which the millimeter-wave radar sensing is projected onto the estimated road surface of sensing Cd, 972…vehicle sensed by sensing Rd, 974…estimated position at the sensing time of the sensing, 975…vehicle position estimated from the millimeter-wave radar sensing on the camera sensing data, 981…region A, 982…region B, 983…region C, 984…region D, 988…time calculation formula, 990…object p, 992…object q, 994…object r, 996…object s, 1000…signal processing system, 1000…evaluation system

Claims (15)

  1.  A signal processing system comprising a plurality of signal processing devices for external sensors, an external sensor being connected to each signal processing device and each signal processing device having a function of recognizing an object in the external environment from information sensed by the external sensor, wherein sensing data from a first external sensor is subjected to recognition processing by a first signal processing device, sensing data from a second external sensor is subjected to recognition processing by a second signal processing device, and the output of the first signal processing device is referred to when the second signal processing device performs recognition processing,
     wherein the first signal processing device manages the sensing time and the object detection position of the sensing data it processes, the object detection position is corrected so as to become the position at the sensing time of the sensing data processed by the second signal processing device, and the second signal processing device uses the corrected position when performing recognition processing.
  2.  The signal processing system according to claim 1, wherein
     the object detection position is corrected using the elapsed time from the sensing time of the sensing data processed by the first signal processing device to the sensing time of the sensing data processed by the second signal processing device.
  3.  The signal processing system according to claim 2, wherein
     the times of the first signal processing device and the second signal processing device are synchronized, and the elapsed time between the sensing times is obtained using the difference between those times.
  4.  The signal processing system according to claim 1, wherein
     the output of the first signal processing device referred to when the second signal processing device performs recognition processing is an object detection result that is separate from the recognition processing result in which the type of the object has been identified and that is output with lower latency than that recognition processing result.
  5.  The signal processing system according to claim 1, wherein
     at least one of application of a specific recognition processing algorithm and change of a recognition processing parameter is applied to the corrected position.
  6.  The signal processing system according to claim 1, wherein
     the sensing time of the sensing data of the first external sensor is adjusted so as to match the processing timing of the sensing data of the second external sensor.
  7.  The signal processing system according to claim 1, wherein
     the sensing time of the sensing data of the first external sensor is managed either for each sensing region on the first external sensor, by a method that expresses its relationship to the position on the first external sensor, or for each object detected using the sensing data of the first external sensor.
  8.  An evaluation system for the signal processing system according to claim 7, wherein
     the output of the signal processing system is connected to a model that simulates the motion of a vehicle, the model that simulates the motion of the vehicle is connected to a model that simulates the surrounding environment of the vehicle, simulated sensing signals representing the sensing data of the external sensors are generated from the model that simulates the motion of the vehicle and the model that simulates the surrounding environment of the vehicle, and the simulated sensing signals of the sensing data are input to the signal processing devices of the signal processing system, and
     the sensing times within one sensing operation are either treated as the same time, or are managed per object for only some of the objects.
  9.  A signal processing device used as the second signal processing device in a signal processing system comprising a plurality of signal processing devices for external sensors, an external sensor being connected to each signal processing device and each signal processing device having a function of recognizing an object in the external environment from information sensed by the external sensor, wherein sensing data from a first external sensor is subjected to recognition processing by a first signal processing device, sensing data from a second external sensor is subjected to recognition processing by a second signal processing device, and the output of the first signal processing device is referred to when the second signal processing device performs recognition processing,
     wherein the signal processing device performs the recognition processing using a result obtained by correcting the position information of the object detection result of the first signal processing device using the elapsed time from the sensing time of the sensing data processed by the first signal processing device to the sensing time of the sensing data processed by the second signal processing device.
  10.  The signal processing device according to claim 9, wherein
     the times of the first signal processing device and the second signal processing device are synchronized, and the elapsed time between the sensing times is obtained using the difference between those times.
  11.  A signal processing device used as the first signal processing device in a signal processing system comprising a plurality of signal processing devices for external sensors, an external sensor being connected to each signal processing device and each signal processing device having a function of recognizing an object in the external environment from information sensed by the external sensor, wherein sensing data from a first external sensor is subjected to recognition processing by a first signal processing device, sensing data from a second external sensor is subjected to recognition processing by a second signal processing device, and the output of the first signal processing device is referred to when the second signal processing device performs recognition processing,
     wherein the signal processing device has a function of outputting an object detection result that is separate from the recognition processing result in which the type of the object has been identified and that is used when the second signal processing device performs recognition processing, from the sensing data of the external sensor with lower latency than the output of that recognition processing result.
  12.  A signal processing device used as the first signal processing device in a signal processing system comprising a plurality of signal processing devices for external sensors, an external sensor being connected to each signal processing device and each signal processing device having a function of recognizing an object in the external environment from information sensed by the external sensor, wherein sensing data from a first external sensor is subjected to recognition processing by a first signal processing device, sensing data from a second external sensor is subjected to recognition processing by a second signal processing device, and the output of the first signal processing device is referred to when the second signal processing device performs recognition processing,
     wherein the signal processing device outputs, to the second signal processing device, position information of an object detection result corrected to the sensing time of the sensing data processed by the second signal processing device.
  13.  A signal processing device to which a plurality of external sensors are connected and which has a function of recognizing an object in the external environment from information sensed by the external sensors, wherein the output of a first recognition process performed on sensing data from a first external sensor is referred to when a second recognition process is performed on sensing data from a second external sensor,
     wherein the second recognition process is performed using a result obtained by correcting the position information of the object detection result of the first recognition process using the elapsed time from the sensing time of the sensing data processed by the first recognition process to the sensing time of the sensing data processed by the second recognition process.
  14.  The signal processing device according to claim 13, wherein
     the sensing time of the sensing data of the first external sensor is adjusted so as to match the processing timing of the sensing data of the second external sensor.
  15.  The signal processing device according to claim 9, wherein
     when the position information is corrected, the position information is associated with a position on an estimated road surface.
PCT/JP2019/008008 2018-03-08 2019-03-01 Signal processing system and evaluation system for same, and signal processing device used in said signal processing system WO2019172103A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018041761A JP6932664B2 (en) 2018-03-08 2018-03-08 A signal processing system, its evaluation system, and a signal processing device used in the signal processing system.
JP2018-041761 2018-03-08

Publications (1)

Publication Number Publication Date
WO2019172103A1 true WO2019172103A1 (en) 2019-09-12

Family

ID=67847386

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/008008 WO2019172103A1 (en) 2018-03-08 2019-03-01 Signal processing system and evaluation system for same, and signal processing device used in said signal processing system

Country Status (2)

Country Link
JP (1) JP6932664B2 (en)
WO (1) WO2019172103A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021149820A (en) * 2020-03-23 2021-09-27 株式会社デンソー Sensor evaluation device
CN111611709B (en) * 2020-05-20 2023-09-15 阿波罗智能技术(北京)有限公司 Method, apparatus, electronic device, and storage medium for controlling a simulated vehicle
JP7432447B2 (en) * 2020-06-15 2024-02-16 日立Astemo株式会社 Sensor recognition integration device
US20240078908A1 (en) * 2021-01-26 2024-03-07 Kyocera Corporation Observation device and observation method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002303671A (en) * 2001-04-03 2002-10-18 Nissan Motor Co Ltd Object-classification discrimination apparatus
JP2012014553A (en) * 2010-07-02 2012-01-19 Honda Motor Co Ltd Apparatus for monitoring surrounding of vehicle
WO2014010546A1 (en) * 2012-07-10 2014-01-16 本田技研工業株式会社 Failure-assessment apparatus
JP2015059808A (en) * 2013-09-18 2015-03-30 株式会社東芝 Object monitoring device and object monitoring system
JP2017075881A (en) * 2015-10-16 2017-04-20 三菱電機株式会社 Object recognition integration apparatus and object recognition integration method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101644370B1 (en) * 2014-10-23 2016-08-01 현대모비스 주식회사 Object detecting apparatus, and method for operating the same

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3889641A1 (en) * 2020-04-02 2021-10-06 Mitsubishi Electric Corporation Object recognition device and object recognition method
CN113494938A (en) * 2020-04-02 2021-10-12 三菱电机株式会社 Object recognition device and object recognition method
US11921191B2 (en) 2020-04-02 2024-03-05 Mitsubishi Electric Corporation Object recognition device and object recognition method
CN113494938B (en) * 2020-04-02 2024-05-17 三菱电机株式会社 Object recognition device and object recognition method

Also Published As

Publication number Publication date
JP6932664B2 (en) 2021-09-08
JP2019158390A (en) 2019-09-19

Similar Documents

Publication Publication Date Title
WO2019172103A1 (en) Signal processing system and evaluation system for same, and signal processing device used in said signal processing system
JP7160040B2 (en) Signal processing device, signal processing method, program, moving object, and signal processing system
EP3447528B1 (en) Automated driving system that merges heterogenous sensor data
US11009602B2 (en) Method and system for environment detection
US10279809B2 (en) Travelled-route selecting apparatus and method
JP4428277B2 (en) Object detection device
US20150073705A1 (en) Vehicle environment recognition apparatus
US10562532B2 (en) Autonomous driving system
JP5089545B2 (en) Road boundary detection and judgment device
US10800427B2 (en) Systems and methods for a vehicle controller robust to time delays
KR102569904B1 (en) Apparatus and method for tracking target vehicle and vehicle including the same
EP3888276B1 (en) Verifying timing of sensors used in autonomous driving vehicles
JP6780611B2 (en) Autonomous driving device
US20190071094A1 (en) Vehicle control system, vehicle control method, and storage medium
JP6787157B2 (en) Vehicle control device
WO2016047689A1 (en) Device for estimating axial misalignment amount of beam sensor
JP2017211249A (en) Target detection apparatus
WO2019181284A1 (en) Information processing device, movement device, method, and program
JP2021099793A (en) Intelligent traffic control system and control method for the same
JP2018100899A (en) Object detection device, object detection program, and recording medium
JP2016131367A (en) Moving body system
WO2019021591A1 (en) Image processing device, image processing method, program, and image processing system
JP2019067116A (en) Solid object ground discrimination device
JP6645910B2 (en) Position estimation device
JP2018180641A (en) Vehicle identification device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19763394

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19763394

Country of ref document: EP

Kind code of ref document: A1