WO2019172103A1 - Signal processing system and evaluation system therefor, and signal processing device used in said signal processing system - Google Patents

Signal processing system and evaluation system therefor, and signal processing device used in said signal processing system

Info

Publication number
WO2019172103A1
Authority
WO
WIPO (PCT)
Prior art keywords
signal processing
sensing
sensor
processing device
time
Prior art date
Application number
PCT/JP2019/008008
Other languages
English (en)
Japanese (ja)
Inventor
中田 啓明
道雄 森岡
Original Assignee
Hitachi Automotive Systems, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Automotive Systems, Ltd.
Publication of WO2019172103A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87Combinations of systems using electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems

Definitions

  • The present invention relates to a signal processing system including external sensor signal processing devices having a cooperative operation function, to an evaluation system thereof, and to a signal processing device used in the signal processing system. More specifically, the present invention relates to a signal processing device that forms part of an automatic driving system by processing signals obtained from in-vehicle external sensors sensing the environment around the vehicle and detecting or recognizing objects existing around the vehicle, to a signal processing system composed of such signal processing devices, and to an evaluation system for evaluating the signal processing device and the signal processing system.
  • Patent Document 1 describes a method for recognizing the traveling environment by integrating a recognition result from a radar and a recognition result from an image. Further, Patent Document 2 describes a method in which, in order to effectively superimpose the sensing function of a radar device and the sensing function of a camera device and thereby improve object detection accuracy, a marker is calculated from the detection area of the radar device, an edge is calculated from the image acquired by the camera device, and a composite region is calculated from these calculation results.
  • the sensing time may differ depending on the position of the sensing object in the sensing range.
  • the processing time from processing a signal obtained from an external sensor to obtaining an object detection or recognition result may vary depending on the sensor or the external environment.
  • When the sensing times differ, the same object is sensed at a different relative position from each external sensor if the external sensors move together with the vehicle, as in-vehicle sensors do, or if the external object itself moves. This becomes a problem when the recognition processing result for the sensing data of one external sensor, or object position information obtained partway through that recognition processing, is utilized in the recognition processing of sensing data from another external sensor.
  • The present invention has been made in view of the above circumstances, and its object is to provide a signal processing system capable of suppressing the influence of positional deviation due to differences in sensing time when recognition processing is performed using sensing data from a plurality of external sensors and object position information obtained during the recognition processing of one external sensor is used in the recognition processing of the sensing data of another external sensor, as well as an evaluation system thereof and a signal processing device used in the signal processing system.
  • To achieve the above object, a signal processing system or the like according to the present invention includes a plurality of signal processing devices for external sensors, each connected to an external sensor, and each signal processing device performs recognition processing on the sensing data of its external sensor: sensing data from the first external sensor is processed by the first signal processing device, and sensing data from the second external sensor is processed by the second signal processing device. The sensing time of the sensing data and the object detection position are managed, the object detection position is corrected so as to correspond to the sensing time of the sensing data to be processed by the second signal processing device, and the second signal processing device uses the corrected position when performing its recognition processing.
  • According to the present invention, when recognition processing is performed using the sensing data of a plurality of external sensors and the object position information that is the recognition processing result of the first external sensor, or information obtained during that recognition processing, is used in the recognition processing of the second external sensor, it is possible to suppress the positional deviation when applying specific processing, such as intensive processing, to the sensing data of the second external sensor. The improved positional accuracy leads to improved recognition accuracy for the second external sensor. It also makes it possible to reduce the margin of the area where the specific processing is performed, and thus to reduce the processing load of the recognition processing.
  • An example of the internal structure of the vehicle control system shown in FIG. 7. A diagram illustrating the use of position information obtained from millimeter wave radar sensing in the recognition processing of a camera, in the case where the external sensors consist of a millimeter wave radar and a camera.
  • (a) and (b) are diagrams showing rough road surface areas.
  • A timing chart for explaining an example of signal processing operation timing in the first embodiment, where (a) shows the signal processing operation timing when the recognition processing result for the external sensor A is used for the recognition processing for the external sensor B, and (b) shows the signal processing operation timing when the preprocessing result for the external sensor A is used for the recognition processing for the external sensor B.
  • A structural example of the evaluation system that performs simulation evaluation of the sensor system of the first embodiment.
  • A diagram explaining an example of the simplification of time information when performing simulation evaluation of the sensor system of the first embodiment (the case where the whole is treated as sensed simultaneously in one sensing operation).
  • A diagram explaining another example of the simplification of time information when performing simulation evaluation of the sensor system of the first embodiment (the case where the difference in sensing time within one sensing operation is considered only for main objects).
  • An example of the internal structure of the signal processing devices in a signal processing system that processes the sensing result of the external sensor B using the sensing result of the external sensor A in the fourth embodiment, and a timing chart explaining an example of the signal processing operation timing in the fourth embodiment.
  • the signal processing system 1000 of this embodiment includes an external sensor A101, an external sensor B201, a sensor A signal processing device 100, a sensor B signal processing device 200, and a fusion processing device 650.
  • External sensor A101 is a first external sensor that senses the external environment around the vehicle
  • external sensor B201 is a second external sensor that senses the external environment around the vehicle.
  • Specific examples of sensors that sense the vehicle surroundings include millimeter wave radars, monocular cameras, stereo cameras, and LiDAR (Light Detection and Ranging) sensors.
  • The external sensor A101 and the external sensor B201 are generally sensors of different kinds or models, but they may be of the same kind or model. For example, even if both are cameras, an infrared camera and an ordinary camera may be used.
  • the sensor A signal processing device 100 and the sensor B signal processing device 200 are connected to the external sensor A101 and the external sensor B201, respectively, and perform signal processing on the sensing data output of the external sensor A101 and the sensing data output of the external sensor B201.
  • Each signal processing device may be implemented as an ECU (Electronic Control Unit), or the signal processing function may be implemented as a partial function of an ECU.
  • These signal processing apparatuses 100 and 200 perform signal processing on the sensing data, detect objects existing around the vehicle, and identify the objects as necessary, thereby outputting a final recognition result.
  • The detection results include not only position information (object detection position) and direction, but also, for example, the signal reflection delay time when millimeter waves or lasers are used, the temporal movement of the object, image features when a camera is used, and distance information calculated from parallax in the case of a stereo camera.
  • the sensor A signal processing device 100 and the sensor B signal processing device 200 exchange time synchronization information 10 and operate in time synchronization.
  • As a method of synchronous operation, there is, for example, the method defined in IEEE 802.1AS.
  • Alternatively, more simply, the same clock signal may be supplied to both the sensor A signal processing device 100 and the sensor B signal processing device 200.
  • the sensor A signal processing device 100 outputs the sensor A processing information 108 to the sensor B signal processing device 200.
  • the sensor A processing information 108 includes time information at which sensing is performed together with position information (object detection position) of an object existing around the host vehicle detected by signal processing for sensing data output by the external sensor A101.
  • The position information of objects existing around the host vehicle included in the sensor A processing information 108 is corrected so as to match the sensing time of the external sensor B201. In this correction, the host vehicle position information 701 input from the outside is referred to.
  • The position information of the object included in the sensor A processing information 108 may be the final processing result in the sensor A signal processing device 100, but it is preferable to use information obtained after the presence of the object has been detected and before the object type determination processing.
  • If the information required by the sensor B signal processing device 200 for its recognition processing is output as the sensor A processing information 108 with little delay from the sensing timing of the external sensor A101 (this is called low delay), errors in the correction of the position information can be suppressed.
  • the sensor A signal processing device 100 outputs the recognition result as the sensor A recognition result 105 to the fusion processing device 650
  • the sensor B signal processing device 200 outputs the recognition result as the sensor B recognition result 205 to the fusion processing device 650.
  • the sensor A recognition result 105 and the sensor B recognition result 205 are integrated by the fusion processing device 650 and output as an integrated recognition result 595.
  • the fusion process by the fusion processing apparatus 650 is performed using a known technique such as the technique disclosed in Patent Document 1.
  • the internal configuration of the sensor A signal processing device 100 and the sensor B signal processing device 200 in the signal processing system 1000 will be described with reference to FIG. In the following description, the sensor A signal processing device 100 and the sensor B signal processing device 200 will be described on the assumption that they operate synchronously.
  • The sensing data output of the external sensor A101 is sent to the recognition unit A150, which performs recognition processing, via the sensor input unit A110, and the recognition result of the recognition unit A150 is output as the sensor A recognition result 105 via the result output unit A120.
  • the sensor input unit A110 outputs parameters and timing signals necessary for operation control of the external sensor A101 to the external sensor A101, receives the sensing data output of the external sensor A101, and receives the received sensing data until the recognition unit A150 performs processing. Hold.
  • the sensing time of the external sensor A101 and the sensing data are also managed in association with each other.
  • the object detection information obtained by the recognition process of the recognition unit A150 is sent to the cooperative data output unit A140.
  • the detection information includes identification information necessary for distinguishing which object is the same object between continuous sensing data even when a plurality of objects are detected.
  • the cooperative data output unit A140 acquires the sensing time of the sensing data used by the recognition unit A150 for the recognition process from the sensor input unit A110, and uses the sensor detection information as the sensor A processing information 108 together with the sensing time information to perform signal processing for the sensor B. Output to the device 200.
  • the sensor A processing information 108 is received by the cooperative data input unit B230 in the sensor B signal processing device 200 and sent to the position correction unit B280.
  • The sensing data output of the external sensor B201 is sent to the recognition unit B250, which performs recognition processing, via the sensor input unit B210, and the recognition result of the recognition unit B250 is output as the sensor B recognition result 205 via the result output unit B220.
  • the sensor input unit B210 outputs parameters and timing signals necessary for operation control of the external sensor B201 to the external sensor B201, receives the sensing data output of the external sensor B201, and receives the sensing data until the recognition unit B250 performs processing. Hold.
  • the sensing time of the external sensor B201 and the sensing data are also managed in association with each other.
  • The position correction unit B280 calculates the position correction result in time for the recognition unit B250 to use it, taking into account the time required for the position correction processing (described later). It receives from the recognition unit B250 an indication of which sensing data will be the target of the next recognition processing, and receives the sensing time of that data from the sensor input unit B210.
  • The position correction unit B280 calculates the difference between the sensing time included in the sensor A processing information 108 and the sensing time of the sensing data of the external sensor B201, that is, the elapsed time from the sensing time at the external sensor A101 included in the sensor A processing information 108 to the sensing time of the sensing data of the external sensor B201. It then calculates the movement amount and the change of direction of the host vehicle using the host vehicle position information 701 and, using the object position (object detection position) obtained from the sensor A signal processing device 100 as the sensor A processing information 108 together with the history of the sensing times (or the history of object positions and the elapsed time between history entries), corrects the position information included in the sensor A processing information 108 so that it corresponds to the sensing time at which the external sensor B201 performed its sensing.
  • Time synchronization is performed between the signal processing devices 100 and 200 that process the respective sensing data in order to obtain the difference in sensing time between the external sensor A101 and the external sensor B201. If a method is used in which the time difference does not drift significantly within the range used for the processing, explicit time synchronization may be omitted. For example, even when the signal processing devices 100 and 200 are each driven by a different crystal oscillator and no synchronization of their clocks is performed, time information can be obtained if each device receives common count information (for example, a 4-bit count delivered once per second) as time-related information and combines the most recently received count value with the elapsed time measured in each signal processing device 100, 200 since that count value was received. Since the difference in the number of pulses output per second by different crystal oscillators is usually small, sufficient time accuracy can be obtained for computing the difference in sensing times.
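A minimal sketch of this kind of coarse synchronization is shown below. The helper names are hypothetical, and it assumes, as described above, a shared counter delivered roughly once per second plus a free-running local clock in each signal processing device.

```python
# Minimal sketch (assumption: a shared 4-bit counter incremented and delivered
# about once per second; each device keeps its own free-running local clock).
WRAP = 16  # a 4-bit counter wraps after 16 counts (~16 s)

class CoarseTimeTracker:
    def __init__(self):
        self.last_count = None       # most recently received shared count value
        self.local_at_count = None   # local clock reading when that count arrived

    def on_count_received(self, count, local_time_s):
        self.last_count = count
        self.local_at_count = local_time_s

    def timestamp(self, local_time_s):
        # shared-timebase timestamp = last received count (in seconds, modulo WRAP)
        # plus the local elapsed time since that count was received
        return self.last_count + (local_time_s - self.local_at_count)

def sensing_time_difference(t_a, t_b, wrap=WRAP):
    # difference t_b - t_a on the wrapping coarse timebase, assuming the true
    # difference is much smaller than the wrap interval
    d = (t_b - t_a) % wrap
    return d if d <= wrap / 2 else d - wrap

# Usage: each signal processing device stamps its sensing data with
# tracker.timestamp(local_clock()), and the position correction unit applies
# sensing_time_difference() to the two stamps.
```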
  • The corrected position information is sent to the recognition unit B250 and used when the next sensing data is recognized. For example, for the region indicated by the position information, a noise removal process with good performance is applied even if it is computationally heavy, or an object identification algorithm (recognition processing algorithm) with a high recognition rate is applied; alternatively, if the object is not detected in the recognition processing of the recognition unit B250, the noise filter strength is reduced (that is, a recognition processing parameter is changed) and the recognition processing is performed again.
  • In this way, the recognition unit B250 can perform recognition processing efficiently by selecting a region and applying high-load processing there, and can suppress missed detections by re-running the recognition processing on a portion where an object could not be detected (in other words, an increase in the undetected rate can be suppressed).
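The control flow described in the two preceding paragraphs can be sketched as follows. This is only an illustration: `detect` and `heavy_detect` are hypothetical stand-ins for the recognition algorithms of the recognition unit B250, and the parameter names are assumptions.

```python
def recognize_with_priority_region(frame, priority_roi, detect, heavy_detect,
                                   noise_strength=1.0, relaxed_strength=0.5):
    """Illustrative control flow only: 'detect' and 'heavy_detect' stand in for
    the normal and high-load recognition algorithms of the recognition unit B250."""
    # normal-load recognition over the whole frame
    results = detect(frame, noise_strength=noise_strength)

    # high-load, high-accuracy recognition restricted to the corrected region
    roi_results = heavy_detect(frame, roi=priority_roi,
                               noise_strength=noise_strength)

    # if nothing is found where sensor A reported an object, retry the region
    # with a weaker noise filter (i.e., with changed recognition parameters)
    if not roi_results:
        roi_results = heavy_detect(frame, roi=priority_roi,
                                   noise_strength=relaxed_strength)
    return results + roi_results
```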
  • The use of the position information obtained from the sensor A processing information 108 in the recognition processing of the sensor B signal processing device 200 will now be described with reference to FIG. 3, assuming that the external sensor A101 is a LiDAR and the external sensor B201 is a camera.
  • The vehicle existing at the position 922 in the LiDAR's sensing La 920 has moved to the position 932 in the sensing Lb 930.
  • the vehicle existing at the position 942 in the sensing Ca 940 of the camera has moved to the position 953 in the sensing Cb 950.
  • The vehicles sensed by the LiDAR and the camera are the same vehicle.
  • The position 952 in the sensing Cb 950 is obtained by adjusting the position 932 in the LiDAR's sensing Lb 930 and projecting it onto the camera sensing data. This adjustment between the LiDAR and the camera is needed because an object at the same position in space appears at shifted positions owing to the difference in their mounting positions on the vehicle. However, the position 952 is only the position 932 of the sensing Lb 930 converted to a position in the camera's sensing Cb 950; because the sensing times differ, it deviates from the position 953 at which the vehicle actually appears in the sensing Cb 950. Therefore, the position correction unit B280 (see FIG. 2) estimates the position of the vehicle at the sensing time of the sensing Cb 950 using the sensor A processing information 108 from the sensing La 920 and the sensor A processing information 108 from the sensing Lb 930, and outputs it as the position used by the recognition unit B250 in the recognition processing for the sensing Cb 950.
  • To estimate the position at the sensing Cb 950 from the results of the sensing La 920 and the sensing Lb 930, the position correction unit B280 first obtains the position of the detection target vehicle in space, referenced to the position and orientation of the host vehicle at the sensing time of the sensing Lb 930, using the direction and distance of the object from the host vehicle detected by the LiDAR, the mounting position of the LiDAR on the host vehicle, the sensing time information of the sensing La 920 and the sensing Lb 930, and the position and orientation of the host vehicle corresponding to each sensing time. Next, the position correction unit B280 obtains, by extrapolation, the position in space corresponding to the sensing time of the sensing Cb 950. In doing so, it takes into account the movement and the change of orientation of the host vehicle from the sensing time of the sensing Lb 930 to the sensing time of the sensing Cb 950, and obtains the position in space of the detection target vehicle relative to the host vehicle at the sensing time of the sensing Cb 950. Finally, the position correction unit B280 determines the position of the target vehicle in the sensing Cb 950 (the two-dimensional position in the camera sensing result (image)), taking into account the mounting position of the camera on the host vehicle. In this example the position at the sensing time of the sensing Cb 950 is estimated using the two object detection results of the sensing La 920 and the sensing Lb 930, but it may also be estimated using a larger number of LiDAR sensing results.
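A rough sketch of this estimation under simplifying assumptions (planar motion, linear extrapolation between the two LiDAR detections, hypothetical helper names) is shown below; the actual embodiment may differ in the details of the coordinate handling.

```python
import numpy as np

def to_world(ego_xy, ego_yaw, sensor_offset_xy, rng, bearing):
    """Object position in a fixed world frame from a range/bearing detection,
    the sensor mounting offset on the vehicle, and the ego pose at sensing time."""
    local = sensor_offset_xy + rng * np.array([np.cos(bearing), np.sin(bearing)])
    c, s = np.cos(ego_yaw), np.sin(ego_yaw)
    R = np.array([[c, -s], [s, c]])
    return ego_xy + R @ local

def extrapolate(p1, t1, p2, t2, tc):
    """Linear extrapolation of the object position from two timestamped samples
    (sensing La and Lb) to the sensing time tc of the other sensor (sensing Cb)."""
    v = (p2 - p1) / (t2 - t1)
    return p2 + v * (tc - t2)

def to_vehicle_frame(p_world, ego_xy, ego_yaw):
    """Express a world position in the vehicle frame of the ego pose at time tc."""
    c, s = np.cos(ego_yaw), np.sin(ego_yaw)
    R = np.array([[c, -s], [s, c]])
    return R.T @ (p_world - ego_xy)

# Usage sketch:
#   p1 = to_world(ego_xy1, yaw1, lidar_offset, rng1, brg1)   # sensing La
#   p2 = to_world(ego_xy2, yaw2, lidar_offset, rng2, brg2)   # sensing Lb
#   p_c = to_vehicle_frame(extrapolate(p1, t1, p2, t2, tc), ego_xy_c, yaw_c)
# p_c would then be projected into the camera image using the camera's
# mounting position and intrinsic parameters.
```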
  • Sensing time information of the sensing data that was the target of the detection processing is added to the object detection result and output. In the simplest case, it suffices to add, per sensing operation, the sensing start time or the time halfway between the start and the end of sensing. However, depending on the external sensor, one sensing operation may take a long time, and the time accuracy may then be insufficient for time-based position correction. Methods for adding time information as countermeasures are described below with reference to the figures.
  • One method is to divide the sensing data of the external sensor A101 into a plurality of sensing areas and to add and manage the sensing time in units of sensing areas. For example, when sensing proceeds by horizontal scans advancing in the vertical direction, the sensing data is divided into a plurality of sensing areas, from the sensing area A 981 to the sensing area D 984, as shown in FIG. 4, and a representative sensing time is added to each sensing area. As the representative sensing time, the sensing start time of the sensing area or the time halfway between the start and the end of sensing of that area is used.
  • When correcting a position, the sensing area in which the object exists is determined from the position on the sensing data of the object to be corrected (the center of the object region, for object p990 and object q992), and the sensing time corresponding to that sensing area is used. Note that the number of divisions and the division direction are determined in consideration of the scanning order of the external sensor during sensing, the time required for one sensing operation, and the accuracy required for position correction.
  • The method shown in FIG. 5 defines a time calculation formula 988 that calculates the sensing time from the horizontal position and the vertical position on the sensing data of the external sensor A101, and manages the coefficients Tcv and Tch and the offset time Ts needed for the time calculation as the sensing time information.
  • The position correction unit B280 uses the time calculation formula 988 to calculate the sensing time corresponding to the position of each object targeted for position correction (object p990, object q992) and uses it as that object's sensing time.
  • The time calculation formula 988 is an example; the formula may be modified so that the calculation is easier, or a different formula may be used according to the sensing method of the external sensor.
  • a table defining output values for input values may be used as the time calculation formula 988.
  • Parameters needed for the time calculation, such as the coefficients Tcv and Tch and the offset time Ts, need not be sent for every sensing operation if they do not change; they may be sent only when they change, or once every certain number of sensing operations.
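The concrete form of the time calculation formula 988 is not given here, so the sketch below assumes, purely for illustration, a linear raster-scan model (offset time plus per-row and per-column terms) together with the per-region variant of FIG. 4; the function names are hypothetical.

```python
def sensing_time_from_formula(row, col, Ts, Tcv, Tch):
    # assumed linear raster-scan model (an illustrative stand-in for formula 988):
    # offset time Ts plus a per-row term (Tcv) and a per-column term (Tch)
    return Ts + Tcv * row + Tch * col

def sensing_time_from_regions(row, region_last_rows, region_times):
    # FIG. 4 style: one representative sensing time per horizontal band;
    # region_last_rows holds the last row index of each band in ascending order
    for last_row, t in zip(region_last_rows, region_times):
        if row <= last_row:
            return t
    return region_times[-1]

# Usage: the center (row, col) of an object on the sensing data selects the
# sensing time that the position correction unit B280 uses for that object.
```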
  • the method shown in FIG. 6 is a method of adding and managing sensing time information for each object detected using the sensing data of the external sensor A101.
  • When the position correction unit B280 performs position correction based on the time information, if the sensing time of the external sensor B201 differs depending on the position on the sensing data, the position correction unit B280 obtains the sensing time of the sensing data from the sensor input unit B210 and manages it as a relationship between the position on the sensing data and the sensing time. It then obtains the sensing time of the external sensor B201 from the position of the detected object received from the sensor A signal processing device 100, and calculates the difference from the sensing time included in the sensor A processing information 108.
  • the external sensor A101 is directly connected to the sensor A signal processing apparatus 100
  • the external sensor B201 is directly connected to the sensor B signal processing apparatus 200
  • The sensor A signal processing device 100 and the sensor B signal processing device 200 are connected to a common bus 790 (configured as a CAN bus, in-vehicle Ethernet, or the like) serving as an in-vehicle network.
  • The external sensor A101 and the sensor A signal processing device 100, and the external sensor B201 and the sensor B signal processing device 200, are directly connected, so that signals with a large amount of data, such as point clouds or images, are kept from flowing over the common bus 790.
  • To the common bus 790 are connected the fusion processing device 650, which outputs the integrated recognition result 595 (see FIG. 1) obtained by combining the sensor A recognition result 105 of the sensor A signal processing device 100 and the sensor B recognition result 205 of the sensor B signal processing device 200; an information display device 610 (consisting of indicators such as meters) that informs the driver of the host vehicle; a user input device 620 that receives operations from the driver (steering, accelerator, brake, and other switches); and a vehicle control system 700 that performs control related to the mechanical parts of the host vehicle.
  • The common bus 790 may be divided into a plurality of physical buses in order to keep the amount of data flowing on each bus within limits, to ensure safety, or to ensure security. In such a case, it is necessary to use a device that appropriately transfers information between the physical buses, or to configure each device connected to the common bus 790 so that it connects to a plurality of physical buses as necessary.
  • The host vehicle position information 701 (see FIG. 1) input to the sensor B signal processing device 200 is supplied from the vehicle control system 700 via the common bus 790, for example. Since the vehicle control system 700 performs control related to the mechanical parts of the host vehicle, it acquires the speed and acceleration of the host vehicle with sensors, and information based on the sensing data of those sensors is supplied as the host vehicle position information 701.
  • The internal configuration of the vehicle control system 700 is as shown in FIG. 8: to the common bus 790, via a vehicle integrated control device 705 that performs integrated control of the mechanical parts of the vehicle, are connected an engine control device 710 that controls the power source, a transmission control device 720 that controls the power transmission portion, a brake control device 730 that controls the brakes, and a steering control device 740 that controls the steering.
  • Each control device (710, 720, 730, 740) is connected to an actuator (712, 722, 732, 742) that controls the movement of the associated mechanical part and a sensor (714, 724, 734, 744) that monitors its state, and controls them so that they operate properly.
  • The configuration of the vehicle control system 700 shown in FIG. 8 is an example; it may be changed as appropriate depending on the configuration of the host vehicle, for example by integrating a plurality of control devices, connecting the control devices via the common bus 790, or, in the case of an electric vehicle, replacing the engine control device 710 with a motor control device.
  • the integrated recognition result 595 output from the fusion processing device 650 is transmitted to the vehicle integrated control device 705, and the vehicle integrated control device 705 issues an instruction necessary for emergency brake operation, for example, when there is a high risk of collision.
  • an instruction necessary for speed control is issued based on the set speed, the distance to the vehicle ahead and the relative speed.
  • The vehicle integrated control device 705 also issues steering operation instructions to the steering control device 740 and the like based on the recognition result, for example so as to keep the host vehicle within the lane.
  • an automatic driving control device may be added to the common bus 790.
  • the behavior that the automatic driving control device should take as a vehicle is determined from the output of the fusion processing device 650 and the state of the host vehicle supplied from the vehicle control system 700, and an instruction is issued to the vehicle control system 700.
  • The vehicle system as a whole needs to be configured so that sufficient safety and security can be ensured, including in the event of an emergency.
  • Next, the use of position information obtained from millimeter wave radar sensing in the recognition processing of the sensor B signal processing device 200 will be described for the case where the external sensor A101 is a millimeter wave radar and the external sensor B201 is a camera.
  • the vehicle present at position 962 in sensing Rc 960 of the millimeter wave radar (external sensor A101) has moved to position 972 at sensing Rd 970.
  • the vehicle existing at the position 948 in the sensing Cc945 of the camera (external sensor B201) has moved to the position 958 in the sensing Cd955.
  • the vehicles sensed by the millimeter wave radar and the camera are the same vehicle.
  • The position correction unit B280 first estimates, by extrapolation, the position 974 at the sensing time of the sensing Cd955 on the plane on which the millimeter wave radar performs its detection, using the positions 962 and 972, the position and orientation of the host vehicle at the sensing times at which the positions 962 and 972 were respectively sensed, and the sensing time of the sensing Cd955 together with the position and orientation of the host vehicle at that time.
  • estimation may be performed by extrapolation using the sensing result of the millimeter wave radar before the sensing Rc 960.
  • the relative speed may be used for estimation.
  • Next, the position correction unit B280 performs road surface estimation on the sensing Cd955, obtains the region 971 by projecting the sensing region of the millimeter wave radar onto the estimated road surface 956 in the sensing Cd955 as shown in the figure, and obtains the position 975 on the camera sensing data (in association with the estimated road surface 956).
  • The area indicated by the position 975 in FIG. 11 is widened with a margin, and its height is determined from the maximum height assumed in advance for objects 958 to be detected: the height on the sensing data is calculated from the distance to the object (target object) 958, and the vertical position on the sensing data is obtained by treating the estimated road surface 956 at the position 975 as the bottom surface of the object 958. As a result, the area 959 on the sensing data of the sensing Cd955 is obtained.
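A rough sketch of this kind of region construction is given below. It assumes a simple pinhole camera model with known intrinsics and a locally flat road at the camera height; all parameter names and numbers are illustrative and not taken from the embodiment.

```python
def radar_detection_to_image_roi(x_lat, z_depth, fx, fy, cx, cy,
                                 cam_height, max_obj_height, obj_width_margin):
    """Sketch under a simple pinhole-camera assumption: fx, fy, cx, cy are camera
    intrinsics, the camera sits cam_height above a locally flat road, and the radar
    detection lies at lateral offset x_lat and depth z_depth in the camera frame.
    Returns (left, top, right, bottom) in pixels."""
    u = cx + fx * x_lat / z_depth              # horizontal image position
    v_bottom = cy + fy * cam_height / z_depth  # road surface row at that depth
    h_px = fy * max_obj_height / z_depth       # assumed maximum object height
    w_px = fx * obj_width_margin / z_depth     # width widened with a margin
    return (u - w_px / 2.0, v_bottom - h_px, u + w_px / 2.0, v_bottom)

# Usage (illustrative numbers): a detection 20 m ahead and 1 m to the right with
# fx = fy = 1000, cx = 640, cy = 360, cam_height = 1.2 m, max_obj_height = 2.5 m,
# obj_width_margin = 3 m gives a region about 150 px wide and 125 px tall whose
# bottom edge lies on the estimated road surface.
```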
  • The road surface estimation may be performed by the position correction unit B280 as part of the position correction processing, or it may be performed by the recognition unit B250 in view of its use for recognition processing other than position correction, with the result provided to the position correction unit B280.
  • First, a rough road surface area in the sensing data is estimated. For example, as shown in FIG. 12(a), for the sensing data 901 of one camera eye, an area 902 that is highly likely to be road is estimated by approximating it as a trapezoidal area based on image characteristics. Such a region 902 is obtained by calculating the position at which the road surface would be projected onto the image, assuming that a road running straight in the traveling direction has a certain width and a certain gradient. In practice the road may be curved, so the area can also be estimated by detecting white lines in the image or by detecting the road shoulder from the characteristics of the distance information obtained from the stereo parallax; in this way, a rough road surface area 903 is estimated as shown in FIG. 12(b).
  • In the V-Disparity image, in which the parallax values occurring in each image row are accumulated (voted) along the horizontal axis, the flat road surface 904 in the near range and the distant road surface 905 with an upward gradient shown in the left diagram of FIG. 13 are projected as straight lines with different oblique slopes, as indicated by the area 914 and the area 915 in the right diagram of FIG. 13, respectively.
  • On the other hand, an obstacle such as a preceding vehicle has the property of being projected as a vertical straight line, like the region 917.
  • To detect the most dominant straight line in the V-Disparity image, the image is first binarized with a certain threshold value and a Hough transform is then applied. An example of a detected straight line is shown as the straight line 911 in the right diagram of FIG. 13.
  • The straight line 911 indicates the relationship between the vertical position at which the estimated road surface is projected in the image and the parallax, and it can be converted into the distance in the depth direction and the road surface height in three-dimensional space.
  • In this example, the detected straight line 911 agrees well with the parallax voting result on the near side (the region 914 in the right diagram of FIG. 13), but deviates on the far side where the gradient changes (the region 915 in the right diagram of FIG. 13), so the reliability of the road surface estimation result represented by the straight line 911 is considered low there. A portion of low reliability is identified by checking the number of parallax votes at the places the straight line 911 passes through and finding where places with fewer votes than a certain number occur continuously.
  • Accordingly, the estimated road surface corresponding to the portion of the sensing data whose vertical position is above the point 916 in the right diagram of FIG. 13 is treated as invalid, and below that point a valid estimated road surface is obtained from the straight line 911.
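The sketch below illustrates the idea under simplifying assumptions: the V-Disparity image is built from a dense disparity map, and the dominant road line is found with a coarse Hough-style vote rather than a full Hough transform; all names and thresholds are illustrative.

```python
import numpy as np

def v_disparity(disp, d_max):
    """Build a V-Disparity image: for each image row, a histogram (vote count)
    of the integer disparity values occurring in that row."""
    rows = disp.shape[0]
    vd = np.zeros((rows, d_max + 1), dtype=np.int32)
    for r in range(rows):
        d = disp[r]
        d = d[(d >= 0) & (d <= d_max)].astype(int)
        np.add.at(vd[r], d, 1)
    return vd

def dominant_road_line(vd, vote_threshold, n_slopes=64, n_offsets=64):
    """Coarse Hough-style vote for the dominant straight line d = a * row + b
    in the binarized V-Disparity image (a simplified stand-in for the Hough
    transform mentioned above)."""
    rows, cols = vd.shape
    ys, xs = np.nonzero(vd >= vote_threshold)           # binarization
    slopes = np.linspace(0.0, cols / rows, n_slopes)    # road: disparity grows toward near rows
    offsets = np.linspace(-float(cols), float(cols), n_offsets)
    acc = np.zeros((n_slopes, n_offsets), dtype=np.int32)
    for i, a in enumerate(slopes):
        b = xs - a * ys                                  # offset implied by each point
        j = np.clip(np.searchsorted(offsets, b), 0, n_offsets - 1)
        np.add.at(acc[i], j, 1)
    i, j = np.unravel_index(np.argmax(acc), acc.shape)
    return slopes[i], offsets[j]                         # road disparity ≈ a * row + b

# Rows where the actual vote count along the fitted line stays below a threshold
# (e.g. the far side where the gradient changes) would be treated as an invalid
# estimated road surface, as described above.
```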
  • FIGS. 14 and 15 show examples of signal processing procedures in the signal processing devices 100 and 200 (the recognition unit A150 and the recognition unit B250).
  • FIG. 14 assumes sensing signal processing by a millimeter wave radar
  • FIG. 15 assumes sensing signal processing by a stereo camera.
  • A millimeter wave whose frequency changes with time is transmitted, the reflected wave is received by a plurality of receiving antennas, and both the transmission signal and the reception signal at each receiving antenna are output as sensing data.
  • the output sensing data is stored in the buffer by the sensor input R (S502). Simple processing that does not require a buffer, such as addition between signals, is performed before buffering.
  • In the local maximum point detection R (S512), the distance to the object that reflected the millimeter wave is calculated based on the elapsed time from when the millimeter wave radar transmits until the reflected wave is received, or based on the difference between the transmission frequency and the reception frequency (when the frequency is changed with time, a difference arises between the transmission frequency and the reception frequency at the same timing according to the delay of the reflected wave). The direction of the object is detected from the phase difference of the received wave between the receiving antennas. In actual processing, each reflection appears, for example, as a peak in the result of Range-Doppler FFT processing, so local maximum points are detected, and the direction, distance, and relative speed from the host vehicle are detected for the object corresponding to each local maximum point.
  • The noise removal R checks against the detection history so far whether a detection is a sudden, isolated one, and if so removes it as noise.
  • The object tracking R checks, for each detected object, the relationship between the movement amount and the movement speed from the position in the immediately preceding detection result, and associates the object with the object in the immediately preceding detection result that is the same object. Since the detection position must later be corrected in time, the movement of each object must be predictable; it is therefore necessary to have information on the speed and direction of each object, or position information from which the same object can be identified. To identify the same object, an identification ID is added to and managed for each object.
  • The feature parameters of each detected object are extracted by the common feature parameter detection R (S522); the moving speed and movement pattern of the object, the intensity of the reflected wave, and their changes are extracted as feature parameters. In the subsequent recognition processing R (S520), these parameters are referred to as necessary.
  • In the vehicle identification R (S524), when the characteristics of a vehicle are satisfied in view of the feature parameters and the previous identification information of the object from object tracking, the object is marked as a vehicle. In the bicycle identification R (S526), if the characteristics of a bicycle are satisfied in view of the feature parameters and the previous identification information of the object from object tracking, the object is marked as a bicycle. Similarly, in the sign identification R (S528), when the characteristics of a sign are satisfied, the object is marked as a sign. Although the millimeter wave radar cannot identify the contents of a sign, the presence of a sign is still confirmed, because it can be used later in fusion processing to confirm the detection information of a sign detected by a camera or the like.
  • In the recognition processing result output R (S530), the identification ID, direction, distance, relative speed, and object type are output for each individual object as the object recognition result (sensor recognition result).
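As a rough illustration of the Range-Doppler FFT and local maximum point detection mentioned above, the following sketch (with assumed bin resolutions and an assumed power threshold; not the embodiment's actual processing) shows how peaks map to distance and relative speed.

```python
import numpy as np

def range_doppler_peaks(beat, range_res, vel_res, min_power_db=20.0):
    """Sketch of the local-maximum detection: 'beat' is a (chirps x samples) array
    of de-chirped beat signals from one receiving antenna; range_res and vel_res
    are the assumed resolutions of one range bin and one Doppler bin."""
    # 2-D FFT: fast-time -> range, slow-time (across chirps) -> Doppler
    rd = np.fft.fftshift(np.fft.fft2(beat), axes=0)
    power = 20.0 * np.log10(np.abs(rd) + 1e-12)

    peaks = []
    n_dopp, n_rng = power.shape
    for i in range(1, n_dopp - 1):
        for j in range(1, n_rng - 1):
            p = power[i, j]
            if p < min_power_db:
                continue
            if p >= power[i - 1:i + 2, j - 1:j + 2].max():   # local maximum
                rel_speed = (i - n_dopp // 2) * vel_res
                distance = j * range_res
                peaks.append((distance, rel_speed, p))
    return peaks

# The direction of each peak would additionally be obtained from the phase
# difference of the same peak across the receiving antennas.
```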
  • In the sensor output S (S531), the outside world is sensed as an image by the stereo camera; the exposure time and exposure sensitivity are appropriately adjusted according to the external environment. The sensing data is buffered by the sensor input S (S532), and in the image correction S (S541), correction of image distortion due to the camera mounting positions and optical system errors, correction of brightness and hue due to sensor characteristic errors, and interpolation of the pixel information of defective pixels present in the sensor are performed.
  • Next, the parallax detection S (S542) is performed to obtain the parallax of each pixel between the left and right camera images. The estimated road surface is then obtained by the road surface detection S (S543) using the parallax information. After the road surface detection S (S543), portions that differ from the estimated road surface are detected as objects in the object detection S (S544) by grouping contiguous areas of the same parallax.
  • For each detected object, the distance is calculated from the parallax and averaged over the object's area to obtain the distance to the object, and the direction of the object relative to the direction of the host vehicle is calculated from its position in the image. At this time, considering the size of the detected object, small objects that are not detected stably over time are excluded from the detection targets.
  • In the object tracking S (S545), for each object detected in the object detection S (S544), the relationship between the movement amount and the moving speed from the position in the immediately preceding detection result is checked, and the object is associated with the object in the immediately preceding detection result that is the same object. To identify the same object, an identification ID is added to and managed for each object. The relative speed is calculated by obtaining the change in distance for the same object and dividing it by the sensing time interval.
  • The common feature parameter detection S calculates parameters commonly used in the recognition processing, such as the edge distribution, luminance distribution, and actual size. In the subsequent recognition processing S (S550), these parameters are referred to as necessary.
  • In the vehicle identification S (S552), when the characteristics of a vehicle are satisfied in view of the feature parameters and the previous identification information of the object from object tracking, the object is marked as a vehicle. In the bicycle identification S (S553), if the characteristics of a bicycle are satisfied in view of the feature parameters and the previous identification information of the object from object tracking, the object is marked as a bicycle. If the pedestrian identification S (S554) finds that the characteristics of a pedestrian are satisfied, the object is marked as a pedestrian; where possible, adults and children are also distinguished. In the sign identification S (S555), when the characteristics of a sign are satisfied, the object is marked as a sign and at the same time the contents of the sign are identified.
  • In the recognition processing result output S (S560), the identification ID, position, distance, relative speed, object type, and additional information (such as the contents of a sign) are output for each object as the object recognition result (sensor recognition result).
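As a small worked illustration of the stereo relations used above, the sketch below shows the standard disparity-to-distance conversion and the relative speed computed from the distance change of a tracked object; the numbers in the usage comments are made up.

```python
def disparity_to_distance(disparity_px, focal_px, baseline_m):
    """Standard stereo relation: depth Z = f * B / d (f in pixels, B in meters)."""
    return focal_px * baseline_m / disparity_px

def relative_speed(prev_distance_m, curr_distance_m, sensing_interval_s):
    """Relative speed of a tracked object (same identification ID) from the change
    in distance between two sensings (positive = moving away)."""
    return (curr_distance_m - prev_distance_m) / sensing_interval_s

# Usage sketch (illustrative values):
# z1 = disparity_to_distance(32.0, focal_px=1200.0, baseline_m=0.35)  # ~13.1 m
# z2 = disparity_to_distance(30.0, focal_px=1200.0, baseline_m=0.35)  # ~14.0 m
# v  = relative_speed(z1, z2, sensing_interval_s=0.1)                 # ~+8.8 m/s
```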
  • As described above, the sensor signal processing requires a plurality of processes. Roughly, it can be divided into a preprocessing part (the preprocessing R (S510) in FIG. 14 and the preprocessing S (S540) in FIG. 15) and a recognition processing part (the recognition processing R (S520) in FIG. 14 and the recognition processing S (S550) in FIG. 15) needed for object identification and content recognition.
  • The preprocessing A1 (S821) is performed on the sensing A1 (S811) by the external sensor A101, the recognition processing A1 (S831) is performed on the output of the preprocessing A1 (S821), and the recognition processing result described above is output.
  • the preprocessing A2 (S822) is performed on the sensing A2 (S812) by the external sensor A101, the recognition processing A2 (S832) is performed on the output of the preprocessing A2 (S822), and the above recognition processing result is output.
  • Processing after sensing A3 (S813) is performed in the same manner. That is, pipeline processing of sensing, preprocessing, and recognition processing is performed.
  • The preprocessing B1 (S871) is performed on the sensing B1 (S861) by the external sensor B201, the recognition processing B1 (S881) is performed on the output of the preprocessing B1 (S871), and the recognition processing result described above is output.
  • Similarly, the preprocessing B2 (S872) is performed on the sensing B2 (S862) by the external sensor B201, the recognition processing B2 (S882) is performed on the output of the preprocessing B2 (S872), and the recognition processing result described above is output.
  • the same processing is performed after the sensing B3 (S863). That is, similar to the external sensor A101, pipeline processing of sensing, preprocessing, and recognition processing is performed.
  • FIG. 16A shows the operation timing when the result of the recognition processing for the external sensor A101 is used for the recognition processing for the external sensor B201. Owing to the timing of the processing, the recognition processing B2 (S882) uses the output of the position correction B1 (S875). That is, the position correction B1 (S875) is performed based on information up to the sensing A1 (S811), and is used for the recognition processing for the sensing B2 (S862).
  • FIG. 16B shows the operation timing when the result of the preprocessing for the external sensor A101 is used for the recognition processing for the external sensor B201. Owing to the timing of the processing, the recognition processing B2 (S882) uses the output of the position correction B2 (S876). That is, the position correction B2 (S876) is performed based on the information up to the sensing A2 (S812) (instead of the information up to the sensing A1 (S811)), and is used for the recognition processing for the sensing B2 (S862).
  • In this way, compared with FIG. 16A, the processing result (object detection result) based on the most recent sensing data of the external sensor A101 can be used in the recognition processing of the external sensor B201.
  • Since the position correction uses extrapolation, it is better to use the most recent sensing data in order to improve the correction accuracy; that is, it is better to use the intermediate processing result of the sensor signal processing of the external sensor A101 (the preprocessing result rather than the recognition processing result) for the recognition processing of the sensing data of the external sensor B201.
  • This information is also included in the sensor A processing information 108 (see FIG. 1) and used for the sensing data recognition processing of the external sensor B201.
  • the external sensor A101 is a millimeter wave radar
  • the evaluation target portion 340 of the evaluation system 1001 includes a sensor A signal processing device 100, a sensor B signal processing device 200, and a fusion processing device 650 that constitute the signal processing system 1000.
  • The evaluation system 1001 is a system that evaluates how well the results sensed by the external sensor A101 and the external sensor B201 can be recognized as objects by the configuration of the evaluation target portion 340.
  • the evaluation system 1001 of this embodiment mainly includes a vehicle control system model 370, a road environment generation model 390, an information display device model 310, a pseudo user input generation model 320, and a pseudo sensing signal generation device 330.
  • The output of the signal processing system 1000 (the integrated recognition result 595 of the fusion processing device 650) is connected to a vehicle control system model 370 that simulates the movement of the vehicle, the vehicle control system model 370 works together with a road environment generation model 390 that simulates the surrounding environment of the vehicle, and pseudo sensing signals of the sensing data are generated and input to the signal processing devices 100 and 200 of the signal processing system 1000.
  • the vehicle control system model 370 is obtained by modeling the vehicle control system 700 shown in FIG. 7 for simulation.
  • The road environment generation model 390 is a simulation model that obtains, from the position and orientation of the host vehicle calculated by the vehicle control system model 370 at a given simulation time, the surrounding environment of the host vehicle (terrain, roads, objects on the road, objects around the road, and so on) together with their positions, moving directions, speeds, light states, road surface states, and the like.
  • the road environment generation model 390 works closely with the vehicle control system model 370 to simulate the behavior of the host vehicle.
  • the pseudo user input generation model 320 is a simulation model that supplies a signal corresponding to the operation of the driver to the vehicle control system model 370.
  • The information display device model 310 is a simulation model that, based on the signals output by the vehicle control system model 370 in the evaluation environment, displays meters and warning lights on a screen in a pseudo manner and records the temporal change of the outputs of the vehicle control system model 370.
  • the pseudo sensing signal generation device 330 generates pseudo signals of the external sensor A101 and the external sensor B201 (pseudo sensing signals such as CG and radar reflected signals) based on the information generated by the road environment generation model 390.
  • the pseudo sensing signal generation device 330 gives a pseudo signal to the sensor A signal processing device 100 and the sensor B signal processing device 200 of the signal processing system 1000 at a simulation timing corresponding to an actual vehicle system, so that the evaluation target portion 340 is operated, and the recognition result is transmitted to the vehicle control system model 370, whereby the movement of the host vehicle including the evaluation target portion 340 can be simulated.
  • By collecting and analyzing the inputs, outputs, and internal signals of the evaluation target portion 340 during the simulation, the evaluation system 1001 can evaluate whether the evaluation target portion 340 achieves sufficient recognition and whether the desired motion is obtained as the vehicle behavior of a vehicle that operates according to the vehicle control system model 370.
  • the sensing time may vary depending on the sensing location.
  • the exposure time may be slightly different in units such as pixels or lines.
  • When a sensing time is required for each individual object in the sensor A processing information 108, the pseudo sensing signal generation device 330 generates, as shown in the figure, a pseudo signal as if the whole had been sensed at the same time, and the sensor A signal processing device 100 is modified for the simulation environment so that the sensing times of all objects (object p990, object q992, object r994, object s996) are aligned to the time ft at which the one sensing operation was performed and output accordingly.
  • When a sensing time is not required for each object: if the sensing time is indicated for each sensing area as described with reference to FIG. 4, the sensing times of all sensing areas are aligned to ft, and if the sensing time is indicated using the time calculation formula described with reference to FIG. 5, the parameters are likewise set so that the single time ft is obtained.
  • Alternatively, the vehicle objects p990 and q992 are regarded as main objects and their sensing times are set to pt and qt respectively, with vehicle sensing signals generated at the positions corresponding to the sensing times pt and qt; for the other objects such as the object r994, the representative sensing time ft corresponding to the one sensing operation is used as the sensing time, and sensing signals are generated at the positions corresponding to the representative sensing time ft.
  • In this case, the pseudo sensing signal generation device 330 may send information on the main objects (for example, the object p990 and the object q992) to the recognition unit A150 and the cooperative data output unit A140 of the sensor A signal processing device 100, and only for those objects the recognition unit A150 and the cooperative data output unit A140 may be modified so that the sensing time is overwritten based on that information.
  • When the sensing time is indicated for each sensing area as described with reference to FIG. 4, or when the sensing time is indicated using the time calculation formula 988 as described with reference to FIG. 5, a per-object sensing time cannot be added by the cooperative data output unit A140. In that case, instead of sending information related to the sensing time of the main objects to the cooperative data output unit A140, information on the sensing times of the main objects (for example, the object p990 and the object q992) is sent to the position correction unit B280, and the position correction unit B280 is modified so that the sensing times it manages are overwritten based on that information.
  • CMOS sensors used in cameras have sensing times that differ depending on the sensing position (for example, the exposure timing differs from line to line), and generating pseudo sensing signals that faithfully reproduce such sensing with CG (computer graphics) results in a high calculation load.
  • The signal processing devices 100 and 200 for the sensing data of each external sensor are time-synchronized using the same clock or a method such as the one defined in IEEE 802.1AS, and each also manages the sensing timing of the external sensor connected to it.
  • The sensor A signal processing device 100 outputs to the sensor B signal processing device 200 the sensing time at which the sensing data being processed was acquired, together with the position information of the target object, as the recognition processing result or as information obtained during the recognition processing.
  • The sensor B signal processing device 200, which processes the sensing data of the external sensor B201, estimates the position of the object corresponding to the sensing time of the sensing data acquired by the external sensor B201 from the position information of the object received from the sensor A signal processing device 100 and the history of the sensing times, and uses it as a priority processing region in the recognition processing of the sensing data acquired by the external sensor B201.
  • In this way, the object position that is the recognition processing result of the external sensor A101 (first external sensor), or information obtained during its recognition processing, is used in the recognition processing of the external sensor B201 (second external sensor).
  • the second embodiment has a configuration in which the position correction unit is moved from the sensor B signal processing device 200 to the sensor A signal processing device 100 with respect to the configuration described in FIG. 2 of the first embodiment.
  • the flow of outputting the sensor B recognition result 205 from the external sensor B201 through the sensor input unit B210, the recognition unit BX252, and the result output unit B220 is the same as that in the first embodiment.
  • The position correction unit AX182 provided in the sensor A signal processing device 100 acquires the detection information of the objects obtained by the recognition processing of the recognition unit A150, and acquires from the sensor input unit A110 the sensing time of the sensing data that the recognition unit A150 used for that recognition processing. At this time, if the sensing time of the external sensor A101 differs depending on the position on the sensing data, the sensing time is managed for each detected object.
  • The recognition unit BX252 of the sensor B signal processing device 200, ahead of the time at which the next recognition processing is to be performed by at least the time required to acquire the position-corrected object position information from the sensor A signal processing device 100, requests the cooperative data input unit BX232 to acquire from the sensor A signal processing device 100 the position information of the objects obtained by the sensing of the external sensor A101. The requested position information includes information for identifying at which sensing timing the sensing data of the external sensor B201 to be used in the next recognition processing was acquired.
  • Upon receiving the request, the cooperative data input unit BX232 obtains the sensing time of the corresponding sensing data from the sensor input unit B210 and requests the cooperative data output unit AX142 to transmit the position information of the objects detected based on the sensing data of the external sensor A101 corresponding to that sensing time. At this time, if the sensing time of the external sensor B201 differs depending on the position on the sensing data, the sensing time is passed as a relationship between the position on the sensing data and the sensing time.
  • Upon receiving the request, the cooperative data output unit AX142 requests the position correction unit AX182 to correct the positions of the objects detected by the recognition unit A150 to the positions at the time indicated by the sensing time information included in the request, and then to send the corrected positions to the cooperative data output unit AX142.
  • When the position correction unit AX182 receives the request from the cooperative data output unit AX142, it uses the latest object detection information obtained from the recognition unit A150 and the previous object detection information to calculate, by extrapolation, the position information corresponding to the sensing time information of the external sensor B201 included in the request.
  • The calculation result of the position correction unit AX182 is transmitted (output) to the recognition unit BX252 via the cooperative data output unit AX142 and the cooperative data input unit BX232, and the recognition unit BX252 makes use of the calculation result in the next recognition processing (a sketch of the extrapolation step follows below).
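The extrapolation step can be sketched as follows. The linear motion model and the way the per-object sensing time of the external sensor B201 is looked up are assumptions for illustration; the function names do not come from the patent.

    def extrapolate(prev_obs, last_obs, t_target):
        """Linear extrapolation from the previous and latest detections of one object.

        prev_obs, last_obs: (time, (x, y)) tuples obtained from the recognition unit A150
        t_target:           sensing time of the external sensor B data for this object
        """
        (t0, (x0, y0)), (t1, (x1, y1)) = prev_obs, last_obs
        dt = t1 - t0
        vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
        return (x1 + vx * (t_target - t1), y1 + vy * (t_target - t1))

    def correct_positions(detections, sensing_time_of):
        """Correct each object's position to the sensing time of the external sensor B.

        detections:      dict mapping object_id -> (previous observation, latest observation)
        sensing_time_of: callable mapping the latest position to the sensing time of the
                         corresponding part of sensor B's sensing data (per-row time, etc.)
        """
        return {obj_id: extrapolate(prev_obs, last_obs, sensing_time_of(last_obs[1]))
                for obj_id, (prev_obs, last_obs) in detections.items()}

    # Example with a single, uniform sensing time of 80 ms for all positions.
    detections = {"p990": ((0.010, (20.0, 1.0)), (0.060, (19.0, 1.1)))}
    print(correct_positions(detections, lambda pos: 0.080))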
  • The third embodiment is a configuration in which the sensor A signal processing device 100 and the sensor B signal processing device 200 are integrated into the sensor AB signal processing device 300, in contrast to the configuration described in FIG. 2 of the first embodiment.
  • the flow of outputting the sensor B recognition result 205 via the result output unit B220 is the same as that in the first embodiment.
  • Because the two devices are integrated into the sensor AB signal processing device 300, the input/output units that were needed to communicate the sensor A processing information 108 (the cooperative data output unit A140 and the cooperative data input unit B230 in FIG. 2) become unnecessary. Since the input/output of the position correction unit therefore differs from that of the first embodiment, it is replaced by the position correction unit BZ384, and since the sensor input unit A110 and the recognition unit A150 of FIG. 2 are now connected directly to the position correction unit BZ384, they are replaced by the sensor input unit AZ114 and the recognition unit AZ154. The information that the position correction unit BZ384 receives from the sensor input unit AZ114 and the recognition unit AZ154 is substantially the same as the information that, in the first embodiment, the position correction unit B280 receives from the sensor input unit A110 and the recognition unit A150 via the cooperative data output unit A140 and the cooperative data input unit B230.
  • The information exchanged by the position correction unit BZ384 with the sensor input unit B210 and the recognition unit B250 is substantially the same as the information exchanged with the position correction unit B280 in the first embodiment.
  • The position correction unit BZ384 of the sensor AB signal processing device 300, like the position correction unit B280 of the first embodiment, corrects the position information of each object detected by the recognition unit AZ154 using the sensing time information of the external sensor A101 and the external sensor B201 and transmits it to the recognition unit B250, and the recognition unit B250 uses this information when performing its recognition processing.
  • In this way, the device configuration of the sensor AB signal processing device 300 (and thus the system configuration of the signal processing system) can be simplified; an illustrative sketch of the integrated flow follows.
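The simplification can be pictured as replacing the inter-device exchange with a direct call inside one device. The sketch below is purely illustrative: the class and method names are assumptions, and the actual division of processing inside the sensor AB signal processing device 300 is given by the block diagram of this embodiment, not by this code.

    class SensorABSignalProcessing:
        """Illustrative integrated pipeline corresponding to the sensor AB signal
        processing device 300: both recognition paths run in one device, so the
        position correction step receives the sensor A results directly instead of
        going through cooperative data output/input units."""

        def __init__(self, recognize_a, recognize_b, correct_positions):
            self.recognize_a = recognize_a              # recognition on sensor A data
            self.recognize_b = recognize_b              # recognition on sensor B data
            self.correct_positions = correct_positions  # cf. position correction unit BZ384

        def process(self, frame_a, t_a, frame_b, t_b):
            detections_a = self.recognize_a(frame_a)                    # objects from sensor A
            corrected = self.correct_positions(detections_a, t_a, t_b)  # shift to sensor B time
            return self.recognize_b(frame_b, priority_hints=corrected)  # use as priority regions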
  • the fourth embodiment is configured to control the sensing timing of the external sensor A101 in accordance with the sensing timing of the external sensor B201 based on the configuration described in FIG. 2 of the first embodiment.
  • FIG. 22 shows an internal configuration of the signal processing devices 100 and 200 according to the fourth embodiment.
  • In the sensor A signal processing device 100, the flow of outputting the sensor A recognition result 105 from the external sensor A101 through the sensor input unit AY113, the recognition unit A150, and the result output unit A120 is the same as in the first embodiment. Likewise, the flow in which the position of an object detected by the recognition unit A150 is corrected by the position correction unit B280, via the cooperative data output unit AY143 and the cooperative data input unit BY233, based on the sensing time of the external sensor A101 and the sensing time of the external sensor B201, and the corrected position is used by the recognition unit BY253 of the sensor B signal processing device 200, is the same.
  • The sensor input unit AY113 of the sensor A signal processing device 100 has a function of controlling the sensing operation timing of the external sensor A101 based on the sensing timing information that arrives from the sensor B signal processing device 200 via the cooperative data output unit AY143.
  • The recognition unit BY253 of the sensor B signal processing device 200 has a function of notifying the timing calculation unit BY293 of the processing timing at which it starts processing the sensing data.
  • The cooperative data input unit BY233 and the cooperative data output unit AY143 correspond to the cooperative data input unit B230 and the cooperative data output unit A140 of FIG. 2, with an added function of conveying the sensing start request for the external sensor A101 output from the timing calculation unit BY293.
  • The timing calculation unit BY293 determines when to issue the sensing start request based on the periodicity of the processing of the recognition unit BY253, the preset time required for one sensing operation of the external sensor A101, the time required for the recognition unit A150 to preprocess that sensing data and calculate the object positions, the time required for the calculation result to pass through the position correction unit B280 and for the position correction result to reach the recognition unit BY253 (preparation time Tp899, see FIG. 23), and the preset delay associated with the request signal that the timing calculation unit BY293 inputs to the cooperative data input unit BY233 (a simple additive time-budget sketch follows below).
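As a rough illustration of this timing budget, the sketch below computes the latest time at which the sensing start request could be issued. The variable names and the assumption that the individual delays simply add up are mine, not the patent's; the actual relationship is shown in FIG. 23.

    def sensing_request_time(next_recognition_start, sensing_time_a,
                             preprocess_and_detect_time_a,
                             correction_and_transfer_time,
                             request_propagation_delay):
        """Latest time (s) at which the timing calculation unit can issue the sensing
        start request so that the corrected object positions are available when the
        next recognition processing of the sensor B data begins.

        The additive budget below is a simplification; correction_and_transfer_time
        roughly corresponds to the preparation time Tp899."""
        return (next_recognition_start
                - preprocess_and_detect_time_a
                - correction_and_transfer_time
                - sensing_time_a
                - request_propagation_delay)

    # Example: the next recognition of sensor B data starts at t = 100 ms.
    print(sensing_request_time(next_recognition_start=0.100,
                               sensing_time_a=0.033,
                               preprocess_and_detect_time_a=0.020,
                               correction_and_transfer_time=0.005,
                               request_propagation_delay=0.002))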
  • FIG. 23 shows operation timings in the configuration shown in FIG.
  • By instructing the sensing timing of the external sensor A101 from the sensor B signal processing device 200 to the sensor A signal processing device 100, the waiting time is suppressed in the processing flow of sensing A2 (S812), preprocessing A2 (S822), position correction B2 (S876), and recognition processing B2 (S882), and the waiting time is likewise suppressed in the subsequent processing on the sensing data of the external sensor A101 through the cooperative operation of the sensor A signal processing device 100 and the sensor B signal processing device 200. Therefore, when performing the recognition processing of the external sensor B201, errors associated with the extrapolation performed in the correction processing, which is based on the sensing time of the object position detection results obtained by the sensing of the external sensor A101, can be suppressed.
  • Although the fourth embodiment adjusts the sensing timing of the external sensor A101 based on the configuration of the first embodiment, the adjustment of the sensing and processing timing between the external sensors can equally be applied to the configurations shown in the second embodiment or the third embodiment.
  • A configuration is also conceivable in which the sensor B signal processing device 200 operates with reference to the sensing timing of the external sensor A101; in this case, the sensing timing of the external sensor B201 needs to be adjusted in accordance with the operation timing of the sensor A signal processing device 100.
  • In this way, the sensing time of the sensing data of one external sensor is matched to the processing timing of the sensing data of the other external sensor, so the positional deviation due to the difference in sensing time can be further suppressed and the recognition accuracy based on the external sensors can be further improved.
  • the present invention is not limited to the above-described embodiment, and includes various modifications.
  • the above-described embodiment has been described in detail for easy understanding of the present invention, and is not necessarily limited to one having all the configurations described.
  • a part of the configuration of an embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of an embodiment.
  • each of the above-described configurations, functions, processing units, processing means, and the like may be realized by hardware by designing a part or all of them with, for example, an integrated circuit.
  • Each of the above-described configurations, functions, and the like may be realized by software by interpreting and executing a program that realizes each function by the processor.
  • Information such as programs, tables, and files for realizing each function can be stored in a memory, a hard disk, a storage device such as an SSD (Solid State Drive), or a recording medium such as an IC card, an SD card, or a DVD.
  • The control lines and information lines shown are those considered necessary for the explanation, and not all control lines and information lines in the product are necessarily shown; in practice, almost all components may be regarded as being connected to each other.
  • DESCRIPTION OF SYMBOLS: 10 ... time synchronization information, 100 ... signal processing device for sensor A, 101 ... external sensor A, 105 ... sensor A recognition result, 108 ... sensor A processing information, 110 ... sensor input unit A, 113 ... sensor input unit AY, 114 ... sensor input unit AZ, 120 ... result output unit A, 140 ... cooperative data output unit A, 142 ... cooperative data output unit AX, 143 ... cooperative data output unit AY, 150 ... recognition unit A, 154 ... recognition unit AZ, 182 ... position correction unit AX, 200 ... signal processing device for sensor B, 201 ... external sensor B, 205 ... sensor B recognition result, 210 ... sensor input unit B, 220 ... result output unit B, 230 ... cooperative data input unit B, 232 ... cooperative data input unit BX, 233 ... cooperative data input unit BY, 250 ... recognition unit B, 252 ... recognition unit BX, 253 ... recognition unit BY, 280 ... position correction unit B, 293 ... timing calculation unit BY, 310 ... information display device model, 320 ... pseudo user input generation model, 330 ... pseudo sensing signal generation device, 340 ... evaluation target part, 370 ... vehicle control system model, 384 ... position correction unit BZ, 390 ... road environment generation model, 595 ... integrated recognition result, 610 ... information display device, 620 ... user input device, 650 ... fusion processing device, 700 ... vehicle control system, 701 ... own vehicle position information, 705 ... vehicle integrated control device, 710 ... engine control device, 712 ... engine control system actuator, 714 ... engine control system sensor, 720 ... transmission control device, 722 ... transmission control system actuator, 724 ... transmission control system sensor, 730 ... brake control device, 732 ... brake control system actuator, 734 ... brake control system sensor, 740 ... steering control device, 742 ... steering control system actuator, 744 ... steering control system sensor, 790 ... common bus, 899 ... preparation time Tp, 901 ... one eye of a stereo camera, 902 ... estimated area with a high possibility of being road (assuming a straight road with constant width and constant gradient), 903 ...

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention relates to: a signal processing system with which the effect of positional deviation caused by a difference in sensing time can be suppressed when a recognition processing result of the sensing data of one external sensor, or object position information that is undergoing recognition processing, is used in the recognition processing of the sensing data of another external sensor during recognition processing using the sensing data of multiple external sensors; an evaluation system for the signal processing system; and a signal processing device used in the signal processing system. A signal processing device (100) for a sensor A manages a detected object position and a sensing time of the sensing data to be processed, the detected object position is corrected to the position at the sensing time of the sensing data to be processed by a signal processing device (200) for a sensor B, and the signal processing device (200) for the sensor B uses the corrected position when performing recognition processing.
PCT/JP2019/008008 2018-03-08 2019-03-01 Système de traitement de signal et système d'évaluation associé, et dispositif de traitement de signal utilisé dans ledit système de traitement de signal WO2019172103A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018041761A JP6932664B2 (ja) 2018-03-08 2018-03-08 信号処理システム、及びその評価システム、並びにその信号処理システムに用いられる信号処理装置
JP2018-041761 2018-03-08

Publications (1)

Publication Number Publication Date
WO2019172103A1 true WO2019172103A1 (fr) 2019-09-12

Family

ID=67847386

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/008008 WO2019172103A1 (fr) 2018-03-08 2019-03-01 Système de traitement de signal et système d'évaluation associé, et dispositif de traitement de signal utilisé dans ledit système de traitement de signal

Country Status (2)

Country Link
JP (1) JP6932664B2 (fr)
WO (1) WO2019172103A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3889641A1 (fr) * 2020-04-02 2021-10-06 Mitsubishi Electric Corporation Dispositif et procédé de reconnaissance d'objet

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021149820A (ja) * 2020-03-23 2021-09-27 株式会社デンソー センサ評価装置
CN111611709B (zh) * 2020-05-20 2023-09-15 阿波罗智能技术(北京)有限公司 用于控制仿真车辆的方法、装置、电子设备和存储介质
JP7432447B2 (ja) * 2020-06-15 2024-02-16 日立Astemo株式会社 センサ認識統合装置
US20240078908A1 (en) * 2021-01-26 2024-03-07 Kyocera Corporation Observation device and observation method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002303671A (ja) * 2001-04-03 2002-10-18 Nissan Motor Co Ltd 物体種別判別装置
JP2012014553A (ja) * 2010-07-02 2012-01-19 Honda Motor Co Ltd 車両の周辺監視装置
WO2014010546A1 (fr) * 2012-07-10 2014-01-16 本田技研工業株式会社 Appareil d'évaluation de défaut
JP2015059808A (ja) * 2013-09-18 2015-03-30 株式会社東芝 物体監視装置、および物体監視システム
JP2017075881A (ja) * 2015-10-16 2017-04-20 三菱電機株式会社 物体認識統合装置および物体認識統合方法

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101644370B1 (ko) * 2014-10-23 2016-08-01 현대모비스 주식회사 물체 검출 장치 및 그 동작 방법

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002303671A (ja) * 2001-04-03 2002-10-18 Nissan Motor Co Ltd 物体種別判別装置
JP2012014553A (ja) * 2010-07-02 2012-01-19 Honda Motor Co Ltd 車両の周辺監視装置
WO2014010546A1 (fr) * 2012-07-10 2014-01-16 本田技研工業株式会社 Appareil d'évaluation de défaut
JP2015059808A (ja) * 2013-09-18 2015-03-30 株式会社東芝 物体監視装置、および物体監視システム
JP2017075881A (ja) * 2015-10-16 2017-04-20 三菱電機株式会社 物体認識統合装置および物体認識統合方法

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3889641A1 (fr) * 2020-04-02 2021-10-06 Mitsubishi Electric Corporation Dispositif et procédé de reconnaissance d'objet
CN113494938A (zh) * 2020-04-02 2021-10-12 三菱电机株式会社 物体识别装置及物体识别方法
US11921191B2 (en) 2020-04-02 2024-03-05 Mitsubishi Electric Corporation Object recognition device and object recognition method
CN113494938B (zh) * 2020-04-02 2024-05-17 三菱电机株式会社 物体识别装置及物体识别方法

Also Published As

Publication number Publication date
JP6932664B2 (ja) 2021-09-08
JP2019158390A (ja) 2019-09-19

Similar Documents

Publication Publication Date Title
WO2019172103A1 (fr) Système de traitement de signal et système d'évaluation associé, et dispositif de traitement de signal utilisé dans ledit système de traitement de signal
JP7160040B2 (ja) 信号処理装置、および信号処理方法、プログラム、移動体、並びに、信号処理システム
EP3447528B1 (fr) Système de conduite automatisé qui fusionne les données de capteurs hétérogènes
US11009602B2 (en) Method and system for environment detection
US10279809B2 (en) Travelled-route selecting apparatus and method
JP4428277B2 (ja) 物体検出装置
US20150073705A1 (en) Vehicle environment recognition apparatus
US10562532B2 (en) Autonomous driving system
JP5089545B2 (ja) 道路境界検出判断装置
US10800427B2 (en) Systems and methods for a vehicle controller robust to time delays
KR102569904B1 (ko) 표적 차량 추적 장치 및 그의 표적 차량 추적 방법과 그를 포함하는 차량
EP3888276B1 (fr) Vérification de la temporisation de capteurs utilisés dans des véhicules à conduite autonome
JP6780611B2 (ja) 自動運転装置
US20190071094A1 (en) Vehicle control system, vehicle control method, and storage medium
JP6787157B2 (ja) 車両制御装置
WO2016047689A1 (fr) Dispositif d'estimation d'amplitude de désalignement axial de capteur de faisceau
JP2017211249A (ja) 物標検出装置
WO2019181284A1 (fr) Dispositif de traitement d'informations, dispositif de mouvement, procédé et programme
JP2021099793A (ja) インテリジェント交通管制システム及びその制御方法
JP2018100899A (ja) 物体検出装置、物体検出プログラム、および記録媒体
JP2016131367A (ja) 移動体システム
WO2019021591A1 (fr) Dispositif de traitement d'images, procédé de traitement d'images, programme, et système de traitement d'images
JP2019067116A (ja) 立体物接地判定装置
JP6645910B2 (ja) 位置推定装置
JP2018180641A (ja) 車両識別装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19763394

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19763394

Country of ref document: EP

Kind code of ref document: A1