WO2023046346A1 - Processing sensor data in a control unit by means of lossy compression
Verarbeiten von sensordaten in einem steuergerät mittels verlustbehafteter kompression Download PDFInfo
- Publication number
- WO2023046346A1 (PCT/EP2022/071922)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor data
- data
- compressed
- control unit
- decompressed
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/85—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
Definitions
- the invention relates to a method for processing sensor data in a control unit using lossy compression and a control unit for executing this method. Furthermore, the invention relates to a method for evaluating a lossy compression method for use in such a control unit, a computer program for executing one or both of the aforementioned methods and a computer-readable medium on which such a computer program is stored.
- the image data can be compressed using a lossless compression method, for example by a measuring PC or data logger connected to the respective sensor system.
- the decompressed image data which is identical to the uncompressed output data due to the lossless compression, can then be used to train, validate or test the image processing algorithm.
- this can only reduce the amount of data by around 40% to 60%.
Disclosure of Invention
- Embodiments of the present invention enable a significant reduction in the amount of data in the development, optimization and release of image processing algorithms, such as those used for environment recognition in the context of partially or fully automated driving.
- the associated costs are therefore significantly lower compared to conventional approaches.
- a first aspect of the invention relates to a computer-implemented method for processing sensor data in a control unit using lossy compression.
- the method comprises at least the following steps: receiving the sensor data in the control unit, the sensor data being provided by a sensor system for detecting an environment of a vehicle; compressing the sensor data using a lossy compression method by the control unit or a circuit integrated in the control unit in order to obtain compressed sensor data; decompressing the compressed sensor data by the control unit or a circuit integrated in the control unit in order to obtain decompressed sensor data; and evaluating the decompressed sensor data by the control unit in order to recognize objects in the area surrounding the vehicle.
- the method can be performed automatically by a processor of the controller.
- the sensor data received in the control unit can be the output data to be compressed, in particular uncompressed output data, generated and output by the sensor system.
- the lossy compression method can be carried out, for example, by the processor of the control unit.
- the control device can include a separate hardware module for the compression and decompression, in particular a separate programmable hardware module, on which a corresponding computer program can be stored.
- irrelevant information can be irretrievably removed from the sensor data. This process can also be referred to as irrelevance reduction.
- the sensor data can no longer be completely reconstructed from the compressed sensor data.
- the decompressed sensor data may include different and/or less information than the original sensor data.
- the amount of data in the compressed sensor data can be reduced by at least 40% compared to the original sensor data, for example by at least 60%, in particular by at least 80%.
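- As a rough illustration of these reduction levels (the frame size, bit depth and frame rate below are assumptions, not values from the application), the remaining data rate of an 8-bit grayscale video stream can be estimated as follows:

```python
# Illustrative arithmetic only: an uncompressed 8-bit 1920x1080 grayscale stream
# at 30 fps (assumed figures) and the data rate remaining after each reduction level.
raw_bytes_per_second = 1920 * 1080 * 30          # about 62 MB/s uncompressed
for reduction in (0.40, 0.60, 0.80):
    remaining = raw_bytes_per_second * (1.0 - reduction)
    print(f"{int(reduction * 100)}% reduction -> {remaining / 1e6:.1f} MB/s remain")
```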
- the vehicle may be a motor vehicle, such as a car, truck, bus, or motorcycle.
- a vehicle can also be understood as an autonomous, mobile robot.
- the method could also be applied to sensors in production or safety technology.
- the sensor system can be a camera, a radar, lidar or ultrasonic sensor, for example, or a combination of at least two of these examples.
- An object can be, for example, another vehicle, a pedestrian, a cyclist, a lane marking, or a traffic sign. However, an object can also be a line or an edge in an image.
- the controller may be configured to automatically steer, accelerate, and/or decelerate the vehicle based on the sensor data.
- the vehicle can include a corresponding actuator system, for example in the form of a steering actuator, a brake actuator, a motor control unit, an electric drive motor or a combination of at least two of these examples.
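- A minimal sketch of this processing loop is given below; the coarse quantizer merely stands in for the actual lossy codec and the detector is a stub, so all function names are illustrative rather than taken from the application:

```python
import numpy as np

# Illustrative sketch: receive a frame, lossy-compress it, decompress it in the
# control unit, and evaluate the decompressed data with an object detector.

def lossy_compress(frame: np.ndarray, step: int = 16) -> np.ndarray:
    """Irrelevance reduction: quantize pixel values, irreversibly discarding detail."""
    return (frame // step).astype(np.uint8)

def decompress(compressed: np.ndarray, step: int = 16) -> np.ndarray:
    """Reconstruct an approximation; the original data cannot be fully recovered."""
    return (compressed.astype(np.uint16) * step + step // 2).clip(0, 255).astype(np.uint8)

def detect_objects(frame: np.ndarray) -> list:
    """Placeholder for the object recognition algorithm (e.g. a neural network)."""
    return []  # would return objects with their positions and/or orientations

def process_sensor_frame(frame: np.ndarray) -> list:
    compressed = lossy_compress(frame)       # compress the received sensor data
    decompressed = decompress(compressed)    # decompress in the control unit
    return detect_objects(decompressed)      # evaluate to recognize objects

if __name__ == "__main__":
    camera_frame = np.random.randint(0, 256, (720, 1280), dtype=np.uint8)  # received sensor data
    print(process_sensor_frame(camera_frame))
```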
- a second aspect of the invention relates to a control device.
- the controller includes a processor configured to perform the method of processing sensor data described above and below.
- the control unit can include hardware and/or software modules.
- the control unit can include a memory and data communication interfaces for data communication with peripheral devices.
- Features of the aforementioned method can also be features of the control unit and vice versa.
- a third aspect of the invention relates to a computer-implemented method for evaluating a lossy compression method for use in a control device, as described above and below.
- the method comprises at least the following steps: receiving sensor data in a data processing device, the sensor data being provided by a sensor system for detecting an environment of a vehicle; compressing the sensor data using the lossy compression method to obtain compressed sensor data, and/or receiving, in the data processing device, compressed sensor data that was provided by the control unit by compressing the sensor data using the lossy compression method; decompressing the compressed sensor data to obtain decompressed sensor data; providing the sensor data and the decompressed sensor data as input data to an object detection algorithm trained on recorded sensor data, in order to convert the input data into output data indicative of objects in the vicinity of the vehicle; determining a deviation between the output data that the object detection algorithm provides when converting the sensor data and the output data that it provides when converting the decompressed sensor data; and generating an evaluation of the lossy compression method depending on the deviation.
- an object recognition algorithm can be understood to mean, for example, an artificial neural network, a support vector machine, a k-means algorithm, a k-nearest-neighbor algorithm, a decision tree, a random forest, or a combination of at least two of these examples.
- the recorded sensor data can be data that has not been lossy compressed (in contrast to the decompressed sensor data).
- the recorded sensor data can have been generated by recording the uncompressed sensor data provided by the sensor system.
- the recorded sensor data can have been generated by compressing the sensor data provided by the sensor system in a lossless compression method and then decompressing the lossless compressed sensor data.
- the recorded sensor data can include, for example, training data for training the object recognition algorithm in a training step and/or validation data for validating the object recognition algorithm in a validation step after a training step.
- the training data and the validation data can differ from each other.
- the object recognition algorithm can be trained in the data processing device with lossless data or with data that was compressed using a different lossy compression method than that implemented in the control unit.
- the validation data that were used to validate the object recognition algorithm in the data processing device match the validation data that were used to validate the object recognition algorithm in the (real) control unit.
- the validation data should match insofar as they were compressed using the same lossy compression method.
- the sensor data and the decompressed sensor data can be processed by the object recognition algorithm in parallel processes, for example.
- a data processing device can be understood to mean, for example, a server, a PC, a laptop, a tablet, a smartphone, an expansion card, such as an FPGA circuit board, or a combination of at least two of these examples.
- the control unit and the data processing device can be connected to one another for data communication via a wired and/or wireless data communication connection, for example via WLAN, Bluetooth and/or mobile radio.
- influences of the lossy compression method on the recognition performance of the object recognition algorithm can be determined directly without the need for additional comparison data, such as ground truth data.
- Such a comparison can be carried out, for example, in parallel with the normal operation of the control unit while the vehicle is driving. This simplifies the adaptation of the lossy compression method with regard to optimal recognition performance.
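- The evaluation described above can be sketched as follows; the detector, the codec functions and the deviation metric (a simple unmatched-detection rate based on bounding-box overlap) are illustrative assumptions, not elements prescribed by the application:

```python
import numpy as np

# Run the same detector on the original and on the lossy-decompressed sensor
# data, and score the codec by the deviation between the two sets of outputs.
# Detections are modeled as (x, y, w, h) bounding boxes.

def detect(frame: np.ndarray) -> list:
    """Placeholder object detection algorithm trained on recorded sensor data."""
    return []

def iou(a, b) -> float:
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    x1, y1 = max(ax, bx), max(ay, by)
    x2, y2 = min(ax + aw, bx + bw), min(ay + ah, by + bh)
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def deviation(dets_original: list, dets_decompressed: list, thr: float = 0.5) -> float:
    """Fraction of detections on the original data with no match on the lossy path."""
    if not dets_original:
        return 0.0
    matched = sum(any(iou(d, e) >= thr for e in dets_decompressed) for d in dets_original)
    return 1.0 - matched / len(dets_original)

def evaluate_codec(frames, compress, decompress) -> float:
    """Evaluation of the lossy compression method: mean deviation over all frames."""
    devs = [deviation(detect(f), detect(decompress(compress(f)))) for f in frames]
    return float(np.mean(devs))  # lower means less influence on recognition performance
```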
- a fourth aspect of the invention relates to a data processing device.
- the data processing device comprises a processor configured to execute the method for evaluating a lossy compression method described above and below.
- the data processing device can include a memory and data communication interfaces for data communication with peripheral devices.
- Features of the aforementioned method can also be features of the data processing device and vice versa.
- the computer program comprises instructions which, when the computer program is executed by a processor, cause the processor to carry out one or both of the methods described above and below.
- the computer-readable medium can be a volatile or non-volatile data storage device.
- the computer-readable medium can be a hard drive, USB storage device, RAM, ROM, EPROM, or flash memory.
- the computer-readable medium can also be a data communication network such as the Internet or a data cloud (cloud) enabling a download of a program code.
- the method for processing the sensor data can also include: sending the compressed sensor data and/or the output data from the control unit to a data processing device for recording and/or evaluating the compressed sensor data and/or the output data.
- the method, when the control unit is operated in a test mode, can further comprise: receiving, in the control unit, recorded compressed sensor data that was compressed using the lossy compression method and/or using a lossy compression method to be tested; decompressing the recorded compressed sensor data by the control unit to obtain recorded decompressed sensor data; and entering the recorded decompressed sensor data as the input data into the object detection algorithm in order to test the control unit.
- the compressed sensor data and the recorded compressed sensor data can have been compressed using the same lossy compression method or using different lossy compression methods.
- the lossy compression method to be tested can deviate from the lossy compression method (currently used in the control unit).
- the compression method to be tested can be an updated version of the compression method currently in use.
- the recorded compressed sensor data can be based on sensor data that come from a single real vehicle or from a number of real vehicles, more precisely from their respective sensors.
- the recorded compressed sensor data can have been provided by a single control unit or by multiple control units of different vehicles.
- alternatively, the recorded compressed sensor data can have been generated by a mathematical model that simulates physical properties of the sensors, the vehicle and/or objects to be detected in the area surrounding the vehicle.
- the test mode can be used, for example, to check updated control unit software for errors or to train, validate and/or test an object recognition algorithm running on the control unit using the recorded, decompressed sensor data. In this way, the release of new software versions for the control unit can be simplified.
- the decompressed sensor data may be input to an object detection algorithm trained on recorded decompressed sensor data to convert the input data into the output data.
- the recorded decompressed sensor data can have been generated by decompressing recorded compressed sensor data that was compressed in the lossy compression method.
- Object recognition can be significantly improved with the aid of such an object recognition algorithm.
- the sensor system can be a camera and the sensor data can be image data provided by the camera. Experiments have shown that video algorithms, such as those used to recognize traffic signs and other objects, perform very well when combined with lossy video compression.
- the decompressed sensor data can be obtained by first decompressing and then preprocessing the compressed sensor data by the control unit.
- the pre-processing can include, for example, noise reduction, rectification or exposure control.
- the compressed sensor data can be obtained by first preprocessing the sensor data provided by the sensor system by the control unit and then compressing it. In this way, the object recognition can be further improved.
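- The two orderings of preprocessing and compression described above can be summarized as follows; all stage functions are placeholders to be supplied by the control unit software:

```python
# Both pipeline orderings, with compression/decompression, preprocessing and
# detection passed in as stage functions (placeholders in this sketch).

def decompress_then_preprocess(raw, compress, decompress, preprocess, detect):
    # raw sensor data is compressed; preprocessing runs on the decompressed data
    return detect(preprocess(decompress(compress(raw))))

def preprocess_then_compress(raw, compress, decompress, preprocess, detect):
    # preprocessing runs first; the preprocessed data is what gets compressed
    return detect(decompress(compress(preprocess(raw))))
```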
- the controller may include a programmable module configured to compress the sensor data in the lossy compression method to obtain the compressed sensor data and/or to decompress the compressed sensor data to obtain the decompressed sensor data.
- the programmable module can, for example, be an FPGA or a system on a chip, or SoC for short.
- the compression and/or decompression can be carried out by a graphics processor (GPU) or a digital signal processor (DSP).
- a computer program for evaluating the decompressed sensor data, for example a computer program that implements the object recognition algorithm, can be stored in another area of the control device outside of the programmable module. In this way, the compression and/or decompression can be flexibly adapted to different system configurations without the hardware of the control unit having to be modified.
- the programmable module can, for example, be inserted directly after an image sensor and before an image processing chain in the control unit. The module can thus be configured to compress video data and then decompress it again, with this lossy decompressed video data then being evaluated by the control unit.
- This is particularly relevant in connection with artificial neural networks, since here even small differences in the characteristics of the images, which can arise as a result of lossy compression, can lead to different evaluation results. Since the control unit then evaluates lossy decompressed data both in real operation and during re-injection, it can be ruled out that differences in the evaluation results are due to the lossy compression.
- the module can additionally include an interface for measuring and/or feeding in the lossy compressed sensor data.
- the sensor data no longer need to be compressed separately on a measurement system for storage and decompressed on a storage server for re-feeding. This also reduces the network load when measuring or feeding in the data and relieves the load on the entire system.
- the compression and decompression of the recorded sensor data can also take place in the measurement and/or refeeding computer, for example using a special expansion card in a PC.
- the measuring and/or refeeding computer is equipped with the same module for (de)compression as the control unit.
- the programmable device can be configured to first decompress the compressed sensor data and then to preprocess it in order to obtain the decompressed sensor data.
- a pre-processing function for example an image pre-processing function for noise reduction, rectification and/or exposure control, can be integrated into the programmable module in addition to the compression and/or decompression function.
- the pre-processing can be flexibly adapted to different system configurations without the hardware of the control unit having to be modified or special test hardware being required.
- the method for evaluating a lossy compression method can further comprise: generating control commands for controlling a recording of the sensor data as a function of the deviation.
- the sensor data can be recorded in the data processing device, the control device and/or a data storage device which is connected to the data processing device and/or the control device for data communication, for example a central storage server.
- the recording of the (uncompressed) sensor data or other additional data provided by the vehicle can be started automatically if the deviation is classified as relevant by the data processing device.
- an ongoing recording can be automatically interrupted if the data processing device again classifies the deviation as irrelevant.
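- A minimal sketch of such deviation-triggered recording control is given below; the thresholds and command strings are invented for illustration, and a hysteresis band is used so the recording does not toggle rapidly around a single threshold:

```python
from typing import Optional

# Deviation-triggered recording control: start recording raw sensor data when
# the deviation becomes relevant, stop when it becomes irrelevant again.
class RecordingController:
    def __init__(self, start_threshold: float = 0.10, stop_threshold: float = 0.02):
        # hysteresis: start above one threshold, stop only below a lower one
        self.start_threshold = start_threshold
        self.stop_threshold = stop_threshold
        self.recording = False

    def update(self, deviation: float) -> Optional[str]:
        """Return a control command when the recording state should change."""
        if not self.recording and deviation >= self.start_threshold:
            self.recording = True
            return "START_RECORDING"   # e.g. sent to the control unit or storage server
        if self.recording and deviation <= self.stop_threshold:
            self.recording = False
            return "STOP_RECORDING"
        return None
```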
- the raw sensor data can reach the (object recognition) algorithms in the control unit via two paths.
- the algorithms can receive the raw sensor data directly from the sensors, for example from a camera. This is the case when the control unit is actually operating in the vehicle.
- the recorded sensor data can be fed back in, which can also be referred to as replay.
- HIL: hardware in the loop.
- the same circuit board can also be used to measure the sensor data.
- in the replay case, a connection to the sensors is not relevant, since the raw sensor data comes directly from the re-injection framework.
- lossy data compression achieves higher compression rates than lossless data compression, depending on the accepted data loss.
- special lossy compression methods that are optimized for video algorithms do not result in any significant impairment of the recognition performance of corresponding video algorithms, despite higher compression rates. Therefore, the use of such compression algorithms in validation makes sense to further reduce the amount of data.
- the challenge is to ensure that the compression or decompression does not affect or distort the validation data compared to the real, uncompressed data, because in this case it could not be argued that the tests are representative of the real world.
- in the case of self-learning algorithms, it would hardly be possible to assess whether the algorithms behave exactly the same in all cases on both data sets and whether the effects of compression never make a difference.
- the compression can optionally be optimized during development by using a special HIL debug board (see above) as a data processing device.
- video streams can be recorded both lossily and losslessly at the same time.
- FPGA: field-programmable gate array.
- the HIL debug board may be configured to compare the lossy and original video streams, for example using a machine-learned algorithm. In this way, the quality of the lossy compression can be evaluated in parallel with the normal operation of the control unit.
- the influence of the lossy compression on the recognition quality of the respective algorithm can be determined by comparing the recognition quality that the algorithm achieves when evaluating the original data with that which the algorithm achieves when evaluating the corresponding compressed data.
- the aforementioned HIL debug board can be connected between the sensors and the control unit, for example.
- the HIL debug board can include an FPGA, for example, on which the aforementioned functions can be tested and optimized accordingly. It is conceivable, for example, for the sensor data to be compressed both lossily and losslessly in parallel. Thus, the two compression methods can be directly compared with each other.
- the HIL debug board can be configured to ensure data integrity of all data paths with CRC32. This can ensure that the examined differences in the sensor data are not due to data errors.
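- The CRC32 check mentioned above can be illustrated with Python's standard zlib module; the framing (a 4-byte big-endian checksum appended to each block) is an assumption for illustration, not a detail from the application:

```python
import zlib

# The sender appends a checksum to each sensor-data block, the receiver verifies
# it, so that observed differences cannot be blamed on transport errors.

def append_crc32(payload: bytes) -> bytes:
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def verify_crc32(frame: bytes) -> bytes:
    payload, checksum = frame[:-4], int.from_bytes(frame[-4:], "big")
    if zlib.crc32(payload) != checksum:
        raise ValueError("CRC32 mismatch: data path corrupted")
    return payload

block = bytes(range(256))
assert verify_crc32(append_crc32(block)) == block
```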
- the HIL debug board can be configured to cause the original sensor data to be saved when certain trigger conditions are present, such as emergency braking of the vehicle.
- the compression can be optimized on the HIL debug board during development.
- the recognition performance is then evaluated, for example, by feeding both sensor datasets back in separately.
- the measurement data obtained in this way are compared with one another.
- the training of the algorithms can be based on both sensor data sets, i.e. the original and the lossy decompressed sensor data.
- a framework can be used for the evaluation, also called Performance Evaluation Framework or PEF for short, which compares ground truth data with the respective output data and quantifies deviations.
- lossy compression can be applied to previously recorded, unmodified sensor data.
- the resulting lossy data can also be fed into an open-loop test, for example.
- the resulting output data can then be compared with the original sensor data and/or the ground truth data with regard to the recognition performance. This procedure should be repeated several times to compensate for system jitter.
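- The repeated open-loop evaluation can be sketched as follows; `run_open_loop_test`, the dataset handling and the recognition score are placeholders standing in for whatever re-injection framework and performance evaluation framework (PEF) are actually used:

```python
import statistics

# Re-inject the recorded lossy data several times and average a recognition
# score against ground truth to smooth out system jitter.

def recognition_score(detections: list, ground_truth: list) -> float:
    """Placeholder metric: recall of ground-truth objects."""
    if not ground_truth:
        return 1.0
    found = sum(1 for g in ground_truth if g in detections)
    return found / len(ground_truth)

def averaged_performance(run_open_loop_test, dataset, ground_truth, repeats: int = 5) -> float:
    scores = []
    for _ in range(repeats):                      # repeat to compensate for system jitter
        detections = run_open_loop_test(dataset)  # feed the recorded lossy data back in
        scores.append(recognition_score(detections, ground_truth))
    return statistics.mean(scores)
```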
- the HIL debug board may include an FPGA that may be configured to compare the original and lossy data streams. To do this, the two paths in the HIL debug board can be processed by an appropriate algorithm and then compared. In the event of relevant deviations, additional data can be specifically recorded for further analysis. It is also advantageous if various image parameters such as noise or brightness are adjusted in the course of the decompression. In this way, different uncertainties can be simulated and tested on the same image both in the case of refeeding and in the real hardware. This further increases the robustness of the video algorithms.
- FIG. 1 shows a control device according to an embodiment of the invention and a data processing device according to an embodiment of the invention.
- FIG. 2 shows the data processing device from FIG. 1 in detail.
- the vehicle is equipped with a sensor system 2 that detects the surroundings of the vehicle and outputs corresponding sensor data 3 that are received by control unit 1 and processed using lossy compression.
- the sensor system 2 can be a camera, for example.
- the sensor data 3 can accordingly be image data, which can include individual images or image sequences from the surroundings of the vehicle.
- the sensor system 2 can also be a radar, lidar or ultrasonic sensor, for example.
- the sensor data 3 are first converted into compressed sensor data 5 by compression using a lossy compression method in a compression module 4 .
- the compressed sensor data 5 are then converted into decompressed raw sensor data 7 by decompression in a decompression module 6 .
- the decompressed raw sensor data 7 are pre-processed in a pre-processing module 8 , being converted into (correspondingly pre-processed) decompressed sensor data 9 .
- the pre-processing module 8 can be configured, for example, to reduce noise in the images or to adjust an exposure. Depending on the type of sensor data 3, the pre-processing module 8 can include other and/or additional pre-processing functions.
- the decompressed sensor data 9 is entered as input data 10 into an object recognition algorithm 11, which is configured to convert the input data 10 into output data 12 indicating recognized objects in the vicinity of the vehicle, for example pedestrians, lane markings, traffic signs, etc., together with their respective positions and/or orientations.
- the decompression module 6 can transfer the decompressed raw sensor data 7 to the object recognition algorithm 11 as the input data 10 .
- the compression module 4 and the decompression module 6 can be stored in a separate programmable component 13 of the control device 1, for example an FPGA or an SoC.
- the object recognition algorithm 11 and the preprocessing module 8 can be stored outside of this module 13 in the control device 1 .
- control unit 1 can be connected to an external data processing device 14 via a suitable data communication connection, which can be wireless or wired.
- the data processing device 14 can be a PC, laptop, tablet or a special debug board, for example.
- the compressed sensor data 5 and/or the output data 12 assigned to the compressed sensor data 5 can be sent to the data processing device 14, stored there and evaluated in a suitable manner.
- the compressed sensor data 5 can be stored in the control unit 1 itself and read out there if required.
- the data processing device 14 can send recorded compressed sensor data 15, which was obtained by recording the compressed sensor data 5, to the control unit 1 if the latter is operated in a corresponding test mode.
- the recorded compressed sensor data 15 can be decompressed analogously to the compressed sensor data 5 in the decompression module 6 , being converted into recorded decompressed sensor data 16 .
- the recorded, decompressed sensor data 16 can, for example, have been preprocessed in the module 13 analogously to the decompressed raw sensor data 7 .
- pre-processing in pre-processing module 8 is also possible.
- the sensor data 3 can be received by the data processing device 14 alone or in parallel by the control unit 1 and by the data processing device 14 .
- the data processing device 14 can be connected between the sensor system 2 and the control unit 1 , with the control unit 1 receiving the sensor data 3 from the data processing device 14 .
- the sensor data 3 can be compressed in the data processing device 14 in order to obtain the recorded compressed sensor data 15 .
- the data processing device 14 converts the recorded compressed sensor data 15 into the recorded decompressed sensor data 16 by decompression and provides it to the control unit 1 as the input data 10 for test purposes if required.
- the recorded decompressed sensor data 16 can be used, for example, as training, validation and/or test data for training, validating or testing the object recognition algorithm 11 in a machine learning method.
- FIG. 2 shows how a lossy compression method, as used or intended to be used in control unit 1, can be automatically evaluated by data processing device 14.
- the sensor data 3 provided by the sensor system 2 is received in the data processing device 14 and input into a compression module 4, which converts the sensor data 3 into compressed sensor data 5 by compression using the lossy compression method to be tested.
- the compressed sensor data 5 is then input into a decompression module 6, which converts the compressed sensor data 5 into decompressed sensor data 9 by decompression.
- the data processing device 14 can receive the compressed sensor data 5 from the control device 1 .
- the compression module 4 and the decompression module 6 are modules of a further programmable component 17 which has been programmed in the same or similar manner as the component 13 of the control device 1 or is identical to it.
- the sensor data 3 and the decompressed sensor data 9 are used as input data 10 for a first object recognition module 18 and a second object recognition module 19 of the data processing device 14, the sensor data 3 being entered into the first object recognition module 18 and the decompressed sensor data 9 being entered into the second object recognition module 19 .
- the two object recognition modules 18, 19 each execute an object recognition algorithm 11, which has been trained with recorded sensor data, in order to convert the input data 10 into output data 12 that indicate objects in the area surrounding the vehicle.
- the recorded sensor data can have been provided by the sensor system 2 or another sensor system, which can be real or simulated.
- The object recognition algorithm 11 that is executed in control unit 1 can differ from the object recognition algorithm 11 that is executed in data processing device 14, or can match it.
- the output data 12 of the two object recognition modules 18, 19 are then compared with one another in a comparison module 20 in order to determine a deviation between the respective output data 12.
- a comparison result 21 quantifying the deviation is finally evaluated in an evaluation module 22 which outputs an evaluation 23 corresponding to the respective deviation of the lossy compression method used for the compression of the sensor data 3 .
- the evaluation module 22 can be configured to generate, depending on the comparison result 21 and/or the evaluation 23, control commands 24 for starting and/or interrupting a recording of the sensor data 3 provided by the sensor system 2, or of other relevant vehicle data, in an internal or external data memory.
- control unit 1 and the data processing device 14 are each equipped with a processor 25 that executes a computer program, the execution of which causes the method steps described above for the control unit 1 or the data processing device 14, respectively, to be carried out.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
- Stored Programmes (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280063830.1A CN117999786A (zh) | 2021-09-21 | 2022-08-04 | 在控制设备中借助有损压缩处理传感器数据 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102021210494.0 | 2021-09-21 | ||
DE102021210494.0A DE102021210494A1 (de) | 2021-09-21 | 2021-09-21 | Verarbeiten von Sensordaten in einem Steuergerät mittels verlustbehafteter Kompression |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023046346A1 true WO2023046346A1 (de) | 2023-03-30 |
Family
ID=83151918
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2022/071922 WO2023046346A1 (de) | 2021-09-21 | 2022-08-04 | Verarbeiten von sensordaten in einem steuergerät mittels verlustbehafteter kompression |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN117999786A (de) |
DE (1) | DE102021210494A1 (de) |
WO (1) | WO2023046346A1 (de) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190287024A1 (en) * | 2018-03-13 | 2019-09-19 | Lyft, Inc. | Low latency image processing using byproduct decompressed images |
US20200304804A1 (en) * | 2019-03-21 | 2020-09-24 | Qualcomm Incorporated | Video compression using deep generative models |
DE102019214587A1 (de) * | 2019-09-24 | 2021-03-25 | Conti Temic Microelectronic Gmbh | Verarbeitung verlustbehaftet komprimierten ADAS-Sensordaten für Fahrerassistenzsysteme |
2021
- 2021-09-21: DE application DE102021210494.0A, published as DE102021210494A1, status: active, Pending
2022
- 2022-08-04: WO application PCT/EP2022/071922, published as WO2023046346A1, status: active, Application Filing
- 2022-08-04: CN application CN202280063830.1A, published as CN117999786A, status: active, Pending
Also Published As
Publication number | Publication date |
---|---|
CN117999786A (zh) | 2024-05-07 |
DE102021210494A1 (de) | 2023-03-23 |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22761987; Country of ref document: EP; Kind code of ref document: A1
 | WWE | Wipo information: entry into national phase | Ref document number: 18691552; Country of ref document: US
 | WWE | Wipo information: entry into national phase | Ref document number: 202280063830.1; Country of ref document: CN
 | NENP | Non-entry into the national phase | Ref country code: DE
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 22761987; Country of ref document: EP; Kind code of ref document: A1