WO2022258289A1 - Verfahren zum Parametrisieren einer Szene (Method for parameterizing a scene)
- Publication number
- WO2022258289A1 (PCT/EP2022/062780)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- objects
- scene
- parameter
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20076—Probabilistic image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- The invention relates to a method for parameterizing a scene with a surface, and to a control device that is set up/programmed to carry out this method.
- The invention also relates to a motor vehicle with such a control device.
- The basic idea of the invention is to calculate object size and object distance from an image generated by a camera as a function of a parameter characterizing the geometry of the surface on which the objects are arranged. It is essential to the invention that said objects are recognized by means of image recognition techniques and are assigned to specific predefined object classes. For each predefined object class, a probability distribution of the size of objects assigned to that class is predefined. Thus, for each detected object, the probability that it has the estimated object size can be calculated using the known probability distribution.
- From the individual probabilities that can be calculated in this way for each object, a so-called scene probability can be calculated, for example by forming their product.
- The resulting value for the scene probability is a measure of how well the initially selected parameter characterizing the surface geometry reflects the actually existing surface geometry. If the calculation of the scene probability explained above is carried out for different values of this parameter, that value at which the calculated scene probability assumes a maximum is preferably classified as best characterizing the actual surface geometry.
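The product formation mentioned above can be sketched as follows. The probability values are hypothetical and serve only to illustrate how two candidate parameter values are compared:

```python
def scene_probability(individual_probs):
    """Scene probability as the product of the per-object probabilities."""
    p = 1.0
    for pi in individual_probs:
        p *= pi
    return p

# Hypothetical individual probabilities under two candidate surface parameters:
p_flat  = scene_probability([0.80, 0.75, 0.60])   # candidate parameter value 1: 0.36
p_slope = scene_probability([0.30, 0.25, 0.20])   # candidate parameter value 2

# The candidate with the larger scene probability better explains the scene.
best = "flat" if p_flat > p_slope else "slope"
```

Under this toy comparison, the first candidate parameter value would be classified as best characterizing the actual surface geometry.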
- The method according to the invention presented here, for parameterizing a scene with a surface on which at least two objects are arranged, using a camera arranged at a distance from the objects, comprises the five measures a) to e) explained below.
- According to a first measure a), the camera generates an image of the scene which contains image data relating to the at least two objects.
- In a second measure b), at least two objects are recognized in the generated image by evaluating the image data, and each recognized object is assigned to a specific object class.
- In a third measure c), a respective object size of the at least two detected objects is estimated as a function of at least one surface parameter characterizing the surface.
- In a fourth measure d), an individual probability is calculated for each of the at least two objects that the object has the object size estimated in measure c).
- In a fifth measure e), a scene probability is calculated from the at least two calculated individual probabilities.
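Assuming already detected and classified objects, measures c) to e) can be sketched roughly as follows. All function names, the placeholder size model, and the box-shaped size prior are illustrative and not taken from the document:

```python
def parameterize_scene(detections, surface_param, estimate_size, size_priors):
    """Measures c) to e) for already-detected, classified objects (measure b).

    detections    : list of (object_class, image_measurement) pairs
    estimate_size : maps (image_measurement, surface_param) -> object size  (measure c)
    size_priors   : object_class -> probability of that size               (measure d)
    Returns the scene probability as the product of the individual
    probabilities (measure e).
    """
    p_scene = 1.0
    for obj_class, measurement in detections:
        s = estimate_size(measurement, surface_param)   # measure c)
        p_scene *= size_priors[obj_class](s)            # measures d) and e)
    return p_scene

# Illustrative use: a linear placeholder size model and a box prior that
# assigns high probability to person heights between 1.5 m and 2.0 m.
_estimate = lambda measurement, a: measurement * a
_priors = {"person": lambda s: 1.0 if 1.5 <= s <= 2.0 else 0.1}
_dets = [("person", 17.0), ("person", 18.0)]
p_good = parameterize_scene(_dets, 0.10, _estimate, _priors)  # plausible sizes
p_bad  = parameterize_scene(_dets, 0.05, _estimate, _priors)  # implausible sizes
```

The parameter value yielding the larger scene probability is the one retained in measure e).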
- The at least one surface parameter characterizing the surface is an angle which the main ray running from the camera to the object forms with a flat reference surface on which the object is arranged.
- The scene probability is maximized by varying the at least one surface parameter characterizing the surface, and that value of the at least one parameter parameterizing the surface at which the scene probability assumes a maximum value is output as the result of the method.
- In this way, the value of the surface parameter that best matches the evaluated scene can be determined with particularly high accuracy.
- The scene probability can be calculated particularly expediently by multiplying the at least two calculated individual probabilities. The scene probability can thus be determined with only a very small amount of computing effort.
- At least the measures c) to e) are carried out iteratively while varying the at least one parameter characterizing the surface in order to determine the maximum value sought.
- This measure is also associated with a considerable simplification of the method according to the invention.
- Expediently, in measure c), a distance of the object from the camera can be estimated first, and the object size of the object can then be calculated as a function of that distance.
- The distance can particularly preferably be estimated as a function of the at least one parameter that parameterizes the surface.
- In this way, the parameter that parameterizes the surface remains the only free parameter of the entire method.
- The invention also relates to a control device with a data processing unit and with a memory unit.
- The control device according to the invention is set up and/or programmed to carry out the method explained above, so that the advantages of the method according to the invention explained above also apply to the control device according to the invention.
- The invention further relates to a motor vehicle with a camera whose field of view is aligned with the surroundings, in particular the area in front of the motor vehicle, so that the camera can generate an image of the surroundings of the motor vehicle during operation.
- The motor vehicle also includes a control device according to the invention, which is connected to the camera in a data-transmitting manner. The advantages of the method according to the invention explained above thus also carry over to the motor vehicle according to the invention.
- FIG. 2 shows a representation supplementing FIG. 1, on the basis of which the calculation of the distance of the object from the camera and the calculation of the object size resulting from that distance are illustrated,
- FIG. 3 shows a greatly simplified flow diagram of the method according to the invention.
- FIG. 1 shows a schematic representation of a typical scene 1 in which the method according to the invention can be used.
- The method serves to parameterize a surface 2 present in the scene 1 which, as shown in FIG. 1, does not have to be flat.
- The surface 2 comprises a first surface section 2a, which forms an elevation 3a, and a second surface section 2b, which adjoins the first surface section 2a and forms a depression 3b.
- An object 10, in the example a person 11, is arranged on the surface 2 at a transition 4 between the elevation 3a and the depression 3b.
- The camera 5 can be a video camera, for example, which can be installed in a motor vehicle (not shown).
- Two or more objects 10 are typically arranged in the scene 1, but for the sake of simplicity only a single object 10 is shown in FIG.
- The course of the surface 2, in particular the depression 3b and the elevation 3a, can be parameterized, as shown in FIG. 2, by an angle α which the main beam S running from the camera 5 to the object 10 forms with a flat reference surface RE on which the object 10 is arranged.
- The height of the camera 5, i.e. its vertical distance from the reference plane RE, is denoted by h.
- The angle between the main beam S of the camera 5 and the line of sight V extending from the camera 5 to the base point F is defined as the angle β.
- The base point F in turn is the point of intersection of the object 10 with the reference plane RE.
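The geometric quantities just introduced allow the angle β to be computed from image measurements. The relation tan β = X/f used below is the standard pinhole-camera relation and is an assumption of this sketch, since the document's own equation for β is not reproduced in this excerpt:

```python
import math

def beta_from_pixel_offset(X, f):
    """Angle beta between the principal ray S and the line of sight V to the
    base point F, assuming the pinhole relation tan(beta) = X / f, where X is
    the offset (in pixels) of the point PF from the principal point PP on the
    image plane B, and f is the focal length in pixels."""
    return math.atan2(X, f)

# Example: base point imaged 200 px from the principal point, f = 1000 px
beta = beta_from_pixel_offset(200.0, 1000.0)   # ~0.197 rad, roughly 11.3 degrees
```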
- FIG. 3 shows a flow chart of the method according to the invention.
- In a first measure a) of the method, an image of the scene 1 is generated using the camera 5.
- This image contains image data relating to the surface 2 and also to the objects 10 arranged on the surface 2.
- In a second measure b), following the first measure a), at least two of the objects 10 present in the scene 1 are recognized by evaluating the image data generated by the camera 5.
- Each detected object 10 is assigned to a specific object class OK from a predefined set M of such object classes OK.
- Possible object classes are, for example, "motor vehicles".
- In a third measure c), the object size of the detected object 10 is estimated.
- The calculation of the object size as a function of the angle α, i.e. the angle between the reference plane RE and the main beam S from the camera 5 to the head point K of the object 10, is explained below with reference to FIG. 2. To this end, relationships from ray optics known to those skilled in the art are used.
- β is the intermediate angle between the main beam S and the line of sight V, which extends from the camera 5 to the base point F at which the object 10 touches the reference plane RE.
- The principal point PP is the point located at a distance f from the camera 5 on the principal ray S, where f is the focal length of the camera 5 expressed in pixels of the image.
- An image plane B runs perpendicular to the principal ray S through the principal point PP.
- The point at which the line of sight V of the camera 5 intersects the image plane B is designated PF.
- X is the distance from the principal point PP to the point PF, both of which lie on the image plane B.
- The estimated distance d of the object 10 from the camera 5 is obtained as a function of the angle α: d(α) = h · tan(α + β) (2), where h is the distance of the camera 5 from the reference plane RE.
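Equation (2) can be implemented directly; the numeric values in the example are illustrative only:

```python
import math

def distance_to_object(alpha, beta, h):
    """Equation (2): estimated distance d of the object from the camera,
    d(alpha) = h * tan(alpha + beta), where h is the height of the camera
    above the reference plane RE (angles in radians)."""
    return h * math.tan(alpha + beta)

# Camera 1.5 m above RE; alpha = 33.7 deg and beta = 11.3 deg sum to 45 deg,
# so d equals h, since tan(45 deg) = 1.
d = distance_to_object(math.radians(33.7), math.radians(11.3), 1.5)
```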
- The estimated object size s, i.e. the distance between the head point K and the base point F of the object 10, likewise depends on the angle α.
- The object size s of each object 10 recognized in measure b) is estimated. For example, if four different objects 10 are recognized in measure b), then four object sizes s1(α), s2(α), s3(α), s4(α) are estimated.
- For the various object classes OK, a statistic relating to the object size distribution can be formed, so that the estimated object size s(α) explained above can be assigned a probability of occurrence POK(s(α)) calculated from a predetermined probability distribution.
- This applies to all detected objects and thus to all estimated object sizes. If the different objects are assigned to different object classes, then different object size distributions are also used for them.
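As one possible concrete choice of a predetermined probability distribution, a normal (Gaussian) size distribution per object class can be used. The class means and standard deviations below are hypothetical and not taken from the document:

```python
import math

def gaussian_pdf(s, mean, std):
    """Probability density of size s under a normal size distribution."""
    z = (s - mean) / std
    return math.exp(-0.5 * z * z) / (std * math.sqrt(2.0 * math.pi))

# Hypothetical size priors per object class OK: (mean, std) in metres.
SIZE_PRIORS = {
    "person":        (1.75, 0.10),
    "motor vehicle": (1.50, 0.25),
}

def individual_probability(obj_class, s_estimated):
    """Probability of occurrence POK(s(alpha)) for one detected object."""
    mean, std = SIZE_PRIORS[obj_class]
    return gaussian_pdf(s_estimated, mean, std)
```

A person-class object whose estimated size lands near 1.75 m thus receives a higher probability of occurrence than one whose estimate is far from the class mean.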
- In a fourth measure d), an individual probability P(s(α)) that the object has the object size estimated in measure c) is calculated for each object recognized in measure b), taking into account the classification carried out in measure b).
- For this purpose, the previously known probability distribution of each occurring object class must be used and the individual probability POK(s(α)) calculated from it. If, for example, four objects are detected in measure b), then four individual probabilities P1(s(α)), P2(s(α)), P3(s(α)), P4(s(α)) are calculated.
- In a fifth measure e), a scene probability Ptot(α) is calculated from the individual probabilities P1(s(α)), P2(s(α)), P3(s(α)), P4(s(α)).
- The scene probability Ptot(α) can, for example, be calculated by multiplying the individual probabilities calculated in measure d).
- In the method according to the invention, the scene probability Ptot(α) is now maximized, i.e. its maximum value max(Ptot(α)) is determined.
- For this purpose, at least the measures c) to e) can be carried out iteratively while varying the at least one parameter characterizing the surface, that is, the angle α.
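This iterative variation can be sketched as a simple grid search over candidate values of α. The scene-probability function used in the example is a toy stand-in with a known maximum, not the document's actual Ptot:

```python
import math

def maximize_scene_probability(alphas, scene_probability):
    """Iterate measures c) to e) over candidate values of the surface
    parameter alpha; return the alpha with maximal scene probability."""
    best_alpha, best_p = None, -1.0
    for a in alphas:
        p = scene_probability(a)            # measures c) to e) for this alpha
        if p > best_p:
            best_alpha, best_p = a, p
    return best_alpha, best_p

# Toy scene probability with its maximum at alpha = 0.3 (illustrative only):
p_of_alpha = lambda a: math.exp(-50.0 * (a - 0.3) ** 2)
alphas = [i * 0.01 for i in range(101)]     # candidate grid 0.00 .. 1.00
alpha_star, p_star = maximize_scene_probability(alphas, p_of_alpha)
```

The returned alpha_star corresponds to the result E, the value of α that best represents the actual scene.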
- The value of the angle α determined in this way is output as the result E and can be regarded as the value which best represents the actual scene.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/547,966 US20240233160A9 (en) | 2021-06-10 | 2022-05-11 | Method for parameterizing a scene |
CN202280041619.XA CN117480524A (zh) | 2021-06-10 | 2022-05-11 | 用于将场景参数化的方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102021205836.1 | 2021-06-10 | ||
DE102021205836.1A DE102021205836A1 (de) | 2021-06-10 | 2021-06-10 | Verfahren zum Parametrisieren einer Szene |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022258289A1 true WO2022258289A1 (de) | 2022-12-15 |
Family
ID=81984875
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2022/062780 WO2022258289A1 (de) | 2021-06-10 | 2022-05-11 | Verfahren zum parametrisieren einer szene |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN117480524A (de) |
DE (1) | DE102021205836A1 (de) |
WO (1) | WO2022258289A1 (de) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160180531A1 (en) * | 2014-12-22 | 2016-06-23 | Delphi Technologies, Inc. | Method To Determine Distance Of An Object From An Automated Vehicle With A Monocular Device |
US10580164B2 (en) * | 2018-04-05 | 2020-03-03 | Microsoft Technology Licensing, Llc | Automatic camera calibration |
Family events:
- 2021-06-10: DE application DE102021205836.1A (DE102021205836A1), pending
- 2022-05-11: CN application CN202280041619.XA (CN117480524A), pending
- 2022-05-11: PCT application PCT/EP2022/062780 (WO2022258289A1), application filing
Also Published As
Publication number | Publication date |
---|---|
CN117480524A (zh) | 2024-01-30 |
DE102021205836A1 (de) | 2022-12-15 |
US20240135569A1 (en) | 2024-04-25 |
Legal Events
- 121: EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 22728816; Country: EP; Kind code: A1)
- WWE: WIPO information: entry into national phase (Ref document number: 18547966; Country: US)
- WWE: WIPO information: entry into national phase (Ref document number: 202280041619.X; Country: CN)
- NENP: Non-entry into the national phase (Ref country code: DE)
- 122: EP: PCT application non-entry in European phase (Ref document number: 22728816; Country: EP; Kind code: A1)