EP3953900A1 - Method and device for controlling at least one actuator system - Google Patents
Method and device for controlling at least one actuator system
- Publication number
- EP3953900A1 (application EP20723058.2A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- tracking function
- object tracking
- data
- pose
- sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/251—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/75—Determining position or orientation of objects or cameras using feature-based methods involving models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30164—Workpiece; Machine component
Definitions
- The invention relates to a method and a device for controlling at least one actuator system.
- Actuators are used in manufacturing technology, for example.
- Usually, the actuators are programmed for fixed, predetermined movement sequences.
- The invention is based on the object of improving a method and a device for controlling at least one actuator system.
- A method for controlling at least one actuator system is made available, in which the at least one actuator system is controlled on the basis of an object pose of an object by means of provided control data, the object pose being provided by an object tracking function, and the object tracking function being fed, as input data, digital geometry data of the object to be tracked and captured sensor data of at least one sensor detecting the object.
- Furthermore, a device for controlling at least one actuator system is created, comprising a control device and an object tracking device.
- The control device is designed to control the at least one actuator system on the basis of an object pose of an object by means of control data.
- The object tracking device is designed to provide the object pose by means of an object tracking function, the object tracking function being fed, as input data, digital geometry data of an object to be tracked and captured sensor data of at least one sensor detecting the object.
- The method and the device make it possible to control the at least one actuator system in an improved manner.
- The method and the device generate, for example, control data for controlling the at least one actuator system and feed these to the actuator system. This is done by providing an object pose in a first step, that is, a position and an orientation of the object to be tracked or manipulated.
- The control data are then generated by defining positions at which the at least one actuator system is to act on the object to be tracked or manipulated. If the object pose is known, these positions can be determined relative to the at least one actuator system on the basis of a three-dimensional object model that can be created from the digital geometry data (see the sketch below).
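- As a minimal illustration of this step, the following Python sketch maps grasp positions defined on the CAD model into actuator coordinates using a known object pose. It assumes numpy, a pose given as position plus roll/pitch/yaw angles relative to the actuator system, and illustrative names; none of this is prescribed by the patent itself.

```python
import numpy as np

def euler_to_matrix(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Rotation matrix from roll/pitch/yaw angles (x-y-z convention, radians)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def grasp_points_in_actuator_frame(object_pose, model_points: np.ndarray) -> np.ndarray:
    """Transform grasp points defined on the object model into actuator coordinates.

    object_pose: (x, y, z, roll, pitch, yaw) of the object relative to the actuator.
    model_points: (N, 3) array of grasp positions in the model frame.
    """
    x, y, z, roll, pitch, yaw = object_pose
    R = euler_to_matrix(roll, pitch, yaw)
    t = np.array([x, y, z])
    return model_points @ R.T + t  # rotate into the actuator frame, then translate
```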
- The at least one actuator system can, for example, grip the object at the defined positions and/or manipulate it there in a targeted manner by guiding gripping tools and/or processing tools to these positions.
- The object pose describes both a position and an orientation of the object to be tracked.
- The position can be provided in three-dimensional Cartesian coordinates, for example.
- The orientation can be provided as pitch, yaw and roll angles, for example.
- The actuators work in particular in the same coordinate system and can therefore manipulate the object directly on the basis of the object pose provided.
- In particular, the sensor pose of the sensor is also defined in the same coordinate system.
- The object pose can be defined as an object pose (position and orientation) relative to the actuator system (see the sketch below).
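- A minimal sketch of this coordinate handling, assuming homogeneous 4x4 transforms and numpy; the names T_actuator_sensor and T_sensor_object are illustrative, not taken from the patent:

```python
import numpy as np

def pose_to_transform(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a rotation matrix and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def object_pose_in_actuator_frame(T_actuator_sensor: np.ndarray,
                                  T_sensor_object: np.ndarray) -> np.ndarray:
    """Chain the known (registered) sensor pose with the object pose estimated
    in sensor coordinates to obtain the object pose relative to the actuator."""
    return T_actuator_sensor @ T_sensor_object
```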
- The object pose is provided by an object tracking function, which can also be referred to as a tracking function ("object tracking").
- An object tracking function is, in particular, an object tracking function known per se from computer vision applications, which provides an object pose on the basis of digital geometry data, or an object model, of the object to be tracked and sensor data captured from the object to be tracked. At least one initial sensor pose, that is to say a sensor position and a sensor orientation, is also known to the object tracking function.
- The object tracking function is parameterized with the aid of parameters.
- Object tracking functions available on the market are provided, for example, by Fraunhofer IGD as a software library in the form of the VisionLib.
- Object tracking functions in the form of software packages are also offered by Diota and PTC Vuforia.
- Digital geometry data of the object to be tracked or manipulated can be available, for example, in the form of Computer Aided Design (CAD) data.
- A sensor can, for example, be a camera, in particular a camera operating in the visible wavelength range, which captures camera images of the object to be tracked or manipulated and provides them as sensor data.
- The camera can also be a depth camera (RGB-D).
- The sensor can also comprise other optical imaging measurement technology, e.g. neuromorphic cameras.
- Sensor data can be one-dimensional or multi-dimensional.
- In particular, sensor data are two-dimensional camera images that are captured and provided by a camera.
- Sensor data can also be provided by several sensors, e.g. by cameras that are sensitive in different wavelength ranges (e.g. RGB, infrared) or by other sensors (e.g. an inertial measurement unit, IMU).
- The advantage of the invention is that the at least one actuator system can be controlled in an improved manner, in particular independently of the specific object pose at hand. Since the object pose is provided, or estimated, on the basis of the captured sensor data, the at least one actuator system can act on the object in a targeted manner regardless of the object pose in which the object happens to be. This enables a wide range of flexible application scenarios, for example in manufacturing technology or when manipulating or transporting objects in warehouse logistics.
- The control device and the object tracking device can be designed, individually or collectively, as a combination of hardware and software, for example as program code that is executed on a microcontroller or microprocessor. However, it can also be provided that parts are designed, individually or combined, as an application-specific integrated circuit (ASIC).
- In one embodiment, parameters of the object tracking function are defined with the aid of sensor data simulated on the basis of the digital geometry data of the object to be tracked. This makes it possible to dispense with a time-consuming and complex, in particular manual, acquisition and compilation of sensor data for defining the parameters of the object tracking function. In this context, "simulated" does not necessarily mean that a simulation must have taken place; the simulated sensor data merely designate sensor data provided to define a ground truth.
- For example, simulated sensor data can also be provided by an already optimized system in order to optimize another system.
- The definition takes place in particular by simulating a large number of sensor data, for example a large number of camera images in the case of a camera used as a sensor.
- The sensor data are simulated on the basis of the digital geometry data of the object to be tracked.
- For this purpose, a three-dimensional object model can be created on the basis of the digital geometry data.
- The sensor data are then simulated from the three-dimensional object model, for example as a camera image from a camera used as a sensor.
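- As an illustration of this simulation step, the following Python sketch projects points of the three-dimensional object model into a simulated camera image with a pinhole model (numpy assumed; the patent does not prescribe a specific camera model, so the intrinsic matrix K and the naming are illustrative):

```python
import numpy as np

def project_points(points_world: np.ndarray, T_camera_world: np.ndarray,
                   K: np.ndarray) -> np.ndarray:
    """Project 3-D object-model points into a simulated camera image.

    points_world: (N, 3) model points in world coordinates.
    T_camera_world: 4x4 transform from world to camera coordinates.
    K: 3x3 pinhole intrinsic matrix.
    Returns (N, 2) pixel coordinates.
    """
    pts_h = np.hstack([points_world, np.ones((len(points_world), 1))])
    pts_cam = (T_camera_world @ pts_h.T).T[:, :3]   # points in camera frame
    pix = (K @ pts_cam.T).T                          # apply intrinsics
    return pix[:, :2] / pix[:, 2:3]                  # perspective division
```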
- The simulated sensor data are transferred to the object tracking function in the same way as the captured, i.e. real, sensor data. Since the object poses belonging to the simulated sensor data are known from the digital geometry data, or from the three-dimensional object model of the object to be tracked, as true object poses or ground truth, and can be compared with the object poses provided, in particular estimated, by the object tracking function, those parameters of the object tracking function can be determined with which the quality of the object tracking, that is, of providing the object pose, can be improved.
- The definition and optimization can be carried out completely automatically, so that effort and costs can be saved (see the sketch below).
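- A minimal sketch of such an automatic definition loop, assuming a hypothetical tracker object with an estimate_pose(image, params) method and poses given as 6-vectors (position plus angles); these names are placeholders, not the patent's or any vendor's API:

```python
import numpy as np

def evaluate_parameters(params, tracker, samples) -> float:
    """Score one parameter set of the tracking function on simulated data.

    samples: iterable of (simulated_image, true_pose) pairs, where true_pose
    is the ground truth known from the simulation.
    """
    errors = []
    for image, true_pose in samples:
        est_pose = tracker.estimate_pose(image, params)   # hypothetical API
        errors.append(np.linalg.norm(est_pose[:3] - true_pose[:3]))  # position error only, for brevity
    return float(np.mean(errors))

def define_parameters(candidates, tracker, samples):
    """Fully automatic definition: keep the candidate with the lowest error."""
    return min(candidates, key=lambda p: evaluate_parameters(p, tracker, samples))
```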
- In one embodiment, the device comprises an optimization device, the optimization device being designed in such a way that parameters of the object tracking function are defined with the aid of sensor data simulated on the basis of the digital geometry data of the object to be tracked.
- The optimization device can be designed as a combination of hardware and software, for example as program code that is executed on a microcontroller or microprocessor. However, it can also be provided that the optimization device, or parts thereof, is designed as an application-specific integrated circuit (ASIC). Provision can also be made for actuator parameters of the at least one actuator system to be defined, in particular optimized, as well. For this purpose, a target function that takes into account the effectiveness of the actuator system is considered correspondingly.
- In one embodiment, at least one sensor property of the at least one sensor is taken into account when generating the simulated sensor data.
- These can be properties in the form of sensor artifacts (e.g. noise or sensitivity). If the sensor is a camera, these properties can be, for example, noise, blurring, white balance, glare spots, chromatic aberration and distortion of optical lenses of the camera, and/or a wavelength-dependent sensitivity of the camera (see the sketch below).
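- For example, noise and blur can be applied to a rendered image as in the following sketch (Python with numpy and scipy assumed; the artifact magnitudes are illustrative and would in practice be matched to the real camera):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def apply_camera_artifacts(image: np.ndarray, noise_sigma: float = 2.0,
                           blur_sigma: float = 1.0, rng=None) -> np.ndarray:
    """Add simple camera artifacts (optical blur, additive sensor noise).

    image: (H, W) or (H, W, 3) array with values in [0, 255].
    """
    rng = rng or np.random.default_rng()
    # Blur spatial dimensions only; do not mix color channels.
    sigma = (blur_sigma, blur_sigma, 0) if image.ndim == 3 else blur_sigma
    out = gaussian_filter(image.astype(float), sigma=sigma)
    out = out + rng.normal(0.0, noise_sigma, size=out.shape)  # sensor noise
    return np.clip(out, 0.0, 255.0)
```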
- In one embodiment, at least one environmental property is taken into account when generating the simulated sensor data.
- An environmental property can be, for example, the lighting conditions or a background scenery in front of which the object to be tracked is located.
- In particular, a large number of different background sceneries are simulated in the simulated sensor data.
- For example, a background image can be selected at random without the object to be tracked, which is depicted in an object pose, itself being changed.
- However, the background scenery can also influence the depiction of the object, for example in that a surface reflection of a part of the background scenery onto the object is simulated (see the sketch below).
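- A sketch of such background randomization by compositing (Python/numpy assumed; the alpha-mask representation is an assumption for illustration, not specified in the patent):

```python
import numpy as np

def composite_over_background(render_rgb: np.ndarray, alpha_mask: np.ndarray,
                              backgrounds: list, rng=None) -> np.ndarray:
    """Place the rendered object over a randomly chosen background scene.

    render_rgb: (H, W, 3) rendered object image.
    alpha_mask: (H, W) values in [0, 1], 1 where the object is visible.
    backgrounds: list of (H, W, 3) background images.
    The object pose in the image is unchanged; only the scenery varies.
    """
    rng = rng or np.random.default_rng()
    bg = backgrounds[rng.integers(len(backgrounds))]
    a = alpha_mask[..., None]
    return a * render_rgb + (1.0 - a) * bg
```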
- In one embodiment, changing object properties are also taken into account when generating the simulated sensor data.
- Changing object properties can, for example, take the form of an object geometry that changes over time (e.g. the object has several geometric states) and/or of other object properties that change over time.
- In one embodiment, the parameters of the object tracking function are defined by iterative optimization, a quality assessment being carried out for each object pose provided by means of the object tracking function during an iteration.
- In the quality assessment, various criteria can be assessed in the context of a comparison between the true object pose used in the simulation and assumed as ground truth and the object pose provided, or estimated, by the object tracking function on the basis of the simulated sensor data, for example a trueness, a precision, a combination of trueness and precision (accuracy), a robustness, a jitter and/or a latency when recognizing the object pose (see the sketch below).
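- A few of these criteria can be computed as sketched below (Python/numpy assumed; the exact definitions, e.g. of jitter, are illustrative choices rather than the patent's):

```python
import numpy as np

def translation_error(t_est, t_true) -> float:
    """Euclidean position error between estimated and ground-truth pose."""
    return float(np.linalg.norm(np.asarray(t_est) - np.asarray(t_true)))

def rotation_error_deg(R_est: np.ndarray, R_true: np.ndarray) -> float:
    """Geodesic angle (degrees) between estimated and true orientation."""
    cos_angle = (np.trace(R_est.T @ R_true) - 1.0) / 2.0
    return float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))

def jitter(pose_positions) -> float:
    """Jitter as the spread of frame-to-frame position changes over a sequence."""
    diffs = np.diff(np.asarray(pose_positions), axis=0)
    return float(np.std(np.linalg.norm(diffs, axis=1)))
```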
- In one embodiment, the optimization device is also designed such that the parameters of the object tracking function are defined by iterative optimization, a quality assessment being carried out for each object pose provided during an iteration.
- The object tracking function is viewed in particular as a "black box" with an underlying but unknown function over a multi-dimensional parameter space. It can be provided here that the parameter space is reduced to those parameters that have the greatest influence on the quality of the object tracking.
- To achieve this, a statistical analysis can be carried out in the form of a targeted design of experiments and/or a sensitivity analysis.
- In the design of experiments, for example, a targeted scanning of the parameter space (random or quasi-random) can take place (see the sketch below).
- In this way, the parameter space can be reduced in size.
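- For instance, a quasi-random scan of the parameter space can be generated with a Sobol sequence (Python with scipy assumed; the dimension and bounds are illustrative assumptions):

```python
import numpy as np
from scipy.stats import qmc

# Quasi-random scan of a 5-dimensional parameter space. Sobol sequences cover
# the space more evenly than plain uniform sampling, which helps the
# subsequent sensitivity analysis.
sampler = qmc.Sobol(d=5, scramble=True)
unit_samples = sampler.random_base2(m=7)            # 2**7 = 128 points in [0, 1)^5
lower, upper = np.zeros(5), np.full(5, 10.0)        # assumed parameter bounds
candidates = qmc.scale(unit_samples, lower, upper)  # map to the real bounds
```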
- Furthermore, the underlying unknown function can be approached by creating a metamodel of the "black box" of the object tracking function.
- In the metamodel, the underlying unknown function is approximated. This allows the optimization of the parameters of the object tracking function to be accelerated. In particular, it can be estimated where the underlying unknown function could have an optimum, so that parameters of the object tracking function can be searched for in a more targeted manner. Furthermore, a metamodel can be evaluated with less computational effort and therefore faster. For the optimization itself, optimization methods known per se can be used, for example evolutionary methods, gradient descent methods and/or simulated annealing (see the sketch below).
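- One common way to build such a metamodel is a Gaussian-process surrogate, sketched here with scikit-learn; this is an illustrative choice, as the patent does not name a specific metamodel type:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def fit_metamodel(sampled_params: np.ndarray, sampled_quality: np.ndarray):
    """Approximate the black-box quality function with a cheap surrogate.

    sampled_params: (N, D) parameter sets already evaluated on simulated data.
    sampled_quality: (N,) quality criterion measured for each parameter set.
    """
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(np.asarray(sampled_params), np.asarray(sampled_quality))
    return gp

# The metamodel then predicts the quality of unseen parameter sets, with an
# uncertainty estimate, far faster than running the full tracking pipeline:
# mean, std = fit_metamodel(X, y).predict(X_query, return_std=True)
```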
- In one embodiment, the iterative optimization of the parameters of the object tracking function uses, for example, Bayesian optimization.
- In one embodiment, the digital geometry data of the object to be tracked are changed before being fed to the object tracking function.
- As a result, the object tracking function can track the object in an improved manner, since, for example, special features of the object that facilitate object tracking can be highlighted in the changed digital geometry data. These can, for example, be particularly prominent corners and/or edges of the object that facilitate object tracking.
- In one embodiment, features are added to the digital geometry data when changing them and/or features are removed from the digital geometry data when changing them. In this way, features of the object to be tracked that are particularly easy to track can be highlighted and other features that may hinder object tracking can be removed.
- In one embodiment, the digital geometry data are generated and/or changed on the basis of sensor data recorded by the at least one sensor.
- In particular, sensor data recorded on the real object can be used to generate or change the digital geometry data.
- FIG. 1 shows a schematic representation of an embodiment of the device for controlling at least one actuator system;
- FIG. 2 shows a schematic representation of method sections of an embodiment of the method, illustrating the definition and the optimization of the parameters of the object tracking function.
- FIG. 1 shows a schematic representation of an embodiment of the device 1 for controlling an actuator system 50.
- The device 1 comprises a control device 2 and an object tracking device 3.
- The actuator system 50 can be, for example, a robot arm or a manipulator comprising a plurality of hydraulic or electromechanical actuators, which is used for manipulating, for example for gripping and processing, an object, for example a workpiece or a component.
- The control device 2 is designed to control the actuator system 50 on the basis of an object pose 4 of an object.
- For this purpose, the control device 2 generates control data 10 and feeds the control data 10 to the actuator system 50.
- The object pose 4 is fed to the control device 2 by the object tracking device 3.
- The object pose 4 comprises both a position of the object (e.g. in the form of three-dimensional Cartesian coordinates) and an orientation of the object (e.g. as pitch, yaw and roll angles).
- The object pose 4 is defined in particular as an object pose relative to the actuator system 50.
- The object tracking device 3 is designed to provide the object pose 4 by means of an object tracking function, the object tracking function being fed, as input data, digital geometry data 5 of the object to be tracked and sensor data captured by a sensor 51 detecting the object.
- The sensor 51 can be, for example, a camera with which camera images of the object are captured and provided as sensor data.
- A sensor pose (position and orientation) of the sensor 51 relative to the actuator system 50 is known; this is determined, for example, as part of a registration of the poses.
- The device 1 further comprises an optimization device 7.
- The optimization device 7 is designed to define parameters 20 of the object tracking function of the object tracking device 3 with the aid of sensor data simulated on the basis of the digital geometry data 5 of the object to be tracked (cf. FIG. 2). This can be done in particular by defining the parameters 20 of the object tracking function by iterative optimization, a quality assessment being carried out for each object pose 4 provided during an iteration by means of the object tracking function.
- In particular, at least one environmental property is taken into account when generating the simulated sensor data.
- It can be provided that features are added to the digital geometry data 5 when they are changed and/or that features are removed from the digital geometry data 5 when they are changed.
- It can further be provided that the digital geometry data 5 are generated and/or changed on the basis of sensor data recorded by the sensor 51.
- In FIG. 2, a schematic illustration of method sections 100, 200 of an embodiment of the method is shown, illustrating the definition and optimization of the parameters 20 of the object tracking function 30.
- The method sections 100, 200 are executed in particular by means of the object tracking device 3 and the optimization device 7 of the device 1 (FIG. 1).
- In method section 100, simulated sensor data 8 are generated and provided.
- In method section 200, the parameters 20 of the object tracking function 30 are defined and, in particular, optimized.
- Method section 100 comprises the method steps 101 and 102; method section 200 comprises the method steps 201 to 203.
- In the example described, the sensor under consideration is a camera and the sensor data are camera images.
- In a method step 101, starting from a real scene 11, in which the object to be tracked or manipulated is found under certain conditions, such as a lighting, a background scenery and an object pose, a simulated scene 12 is simulated. This is done on the basis of the digital geometry data 5 of the object 40 to be tracked or manipulated, which are available, for example, in the form of CAD data or have been derived from sensor data recorded on a real object.
- In particular, a realistic three-dimensional object model is generated from the digital geometry data 5, arranged according to a selected scenery and an object pose, which can in particular be chosen at random, and then rendered, e.g. photo-realistically.
- In doing so, at least one environmental property 13 is taken into account. It is also provided that sensor properties 14, such as a sensitivity and/or a sensor noise, etc., are taken into account.
- In particular, all influences 15 and properties of the signal processing of a sensor used in the subsequent detection of the object 40 to be tracked or manipulated, such as signal noise, signal filtering, frequency response and/or attenuation, etc., are taken into account.
- The environmental properties 13 are already taken into account when the simulated scene 12 is created.
- Sensor data are then generated from the simulated scene 12; in the example, camera images of the simulated scene 12 are generated from a specific camera pose by projecting the simulated scene 12 onto a two-dimensional surface at the position of the camera.
- The influences 15 of the signal processing, in particular the sensor properties 14, are added to the respectively generated camera images in a method step 102.
- For example, noise, as occurs with a real camera, is added to the generated camera images, or the generated camera images are filtered by means of a color sensitivity curve, as is the case with a real camera.
- The described method steps 101, 102 are carried out for a large number of simulated scenes 12 and object poses.
- The simulated sensor data 8, in particular the simulated camera images, are each linked to the true object pose 16 used during the simulation, which serves as ground truth during the subsequent optimization.
- In method section 200, the parameters 20 of the object tracking function 30 are optimized. This takes place in a method step 201, in which the simulated sensor data 8 are supplied to the object tracking function 30.
- The object tracking function 30 estimates an object pose 4 on the basis of the provided simulated sensor data 8 and the likewise provided digital geometry data 5 of the object 40 to be tracked or manipulated.
- The estimated object pose 4 is then compared with the true object pose 16 associated with the provided simulated sensor data 8, and at least one quality criterion 17 for the object tracking is determined.
- In the context of this comparison between the true object pose 16 of the simulation, assumed as ground truth, and the object pose 4 provided, or estimated, by the object tracking function 30, different quality criteria 17 can be assessed, for example a trueness, a precision, a combination of trueness and precision (accuracy), a robustness, a jitter and/or a latency when recognizing the object pose.
- Based on the at least one quality criterion 17, an adaptation or optimization of the parameters 20 of the object tracking function 30 takes place in a method step 203.
- The object tracking function 30 is viewed in particular as a "black box" with an underlying but unknown function over a multi-dimensional parameter space. It can be provided here that the parameter space is reduced to those parameters that have the greatest influence on the quality of the object tracking. To achieve this, a statistical analysis in the form of a targeted design of experiments and/or a sensitivity analysis can be carried out.
- Furthermore, the underlying unknown function can be approached in method step 203 by creating a metamodel of the "black box" of the object tracking function 30. This makes it possible to accelerate the optimization of the parameters 20 of the object tracking function 30.
- For the optimization itself, optimization methods known per se can be used, for example evolutionary methods.
- In one embodiment, the optimization of the parameters 20 of the object tracking function 30 uses, for example, Bayesian optimization (see the sketch below).
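- A minimal sketch of one Bayesian-optimization step on top of a surrogate such as the Gaussian process shown earlier, using a lower-confidence-bound acquisition; this formulation and the candidate-pool approach are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def next_candidate(surrogate, candidate_pool: np.ndarray, kappa: float = 2.0):
    """Pick the next parameter set to evaluate on the simulated data: prefer
    low predicted error or high surrogate uncertainty (criterion is minimized).

    surrogate: fitted regressor with predict(X, return_std=True), e.g. a
    scikit-learn GaussianProcessRegressor.
    """
    mean, std = surrogate.predict(candidate_pool, return_std=True)
    lower_confidence_bound = mean - kappa * std
    return candidate_pool[int(np.argmin(lower_confidence_bound))]
```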
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Studio Devices (AREA)
- Image Analysis (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102019207089.2A DE102019207089A1 (de) | 2019-05-16 | 2019-05-16 | Verfahren und Vorrichtung zum Steuern mindestens einer Aktorik |
PCT/EP2020/061639 WO2020229149A1 (de) | 2019-05-16 | 2020-04-27 | Verfahren und vorrichtung zum steuern mindestens einer aktorik |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3953900A1 (de) | 2022-02-16 |
Family
ID=70480250
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20723058.2A Pending EP3953900A1 (de) | 2019-05-16 | 2020-04-27 | Verfahren und vorrichtung zum steuern mindestens einer aktorik |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3953900A1 (de) |
DE (1) | DE102019207089A1 (de) |
WO (1) | WO2020229149A1 (de) |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10345743A1 (de) * | 2003-10-01 | 2005-05-04 | Kuka Roboter Gmbh | Verfahren und Vorrichtung zum Bestimmen von Position und Orientierung einer Bildempfangseinrichtung |
US10430641B2 (en) * | 2017-03-08 | 2019-10-01 | GM Global Technology Operations LLC | Methods and systems for object tracking using bounding boxes |
US10755428B2 (en) * | 2017-04-17 | 2020-08-25 | The United States Of America, As Represented By The Secretary Of The Navy | Apparatuses and methods for machine vision system including creation of a point cloud model and/or three dimensional model |
2019
- 2019-05-16 DE DE102019207089.2A patent/DE102019207089A1/de active Pending
2020
- 2020-04-27 WO PCT/EP2020/061639 patent/WO2020229149A1/de unknown
- 2020-04-27 EP EP20723058.2A patent/EP3953900A1/de active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2020229149A1 (de) | 2020-11-19 |
DE102019207089A1 (de) | 2020-11-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE102018109774B4 (de) | Bildverarbeitungssystem, Bildverarbeitungsvorrichtung und Bildverarbeitungsprogramm | |
EP2043045B1 (de) | Verfahren zur Objektverfolgung | |
DE102015015194A1 (de) | Bildverarbeitungsvorrichtung und -verfahren und Programm | |
DE102018200154A1 (de) | Kalibrationsvorrichtung, Kalibrationsverfahren und Programm für einen visuellen Sensor | |
DE102018206208A1 (de) | Verfahren, Vorrichtung, Erzeugnis und Computerprogramm zum Betreiben eines technischen Systems | |
DE102005037841B4 (de) | Verfahren und Anordnung zur Bestimmung der relativen Lage eines ersten Objektes bezüglich eines zweiten Objektes, sowie ein entsprechendes Computerprogramm und ein entsprechendes computerlesbares Speichermedium | |
WO2000063681A2 (de) | Bildbearbeitung zur vorbereitung einer texturnalyse | |
DE102018207414A1 (de) | Bildverarbeitungssystem | |
DE102014207095A1 (de) | Kantenmessungs-Videowerkzeug mit robustem Kantenunterscheidungs-Spielraum | |
EP3882856A1 (de) | Verfahren und vorrichtung zum bestimmen einer pose | |
DE112017008101T5 (de) | Autonome roboter und verfahren zum betreiben derselben | |
DE102019215903A1 (de) | Verfahren und Vorrichtung zum Erzeugen von Trainingsdaten für ein Erkennungsmodell zum Erkennen von Objekten in Sensordaten eines Sensors insbesondere eines Fahrzeugs, Verfahren zum Trainieren und Verfahren zum Ansteuern | |
DE102016102579A1 (de) | Verfahren und Vorrichtung zum Bestimmen einer Vielzahl von Raumkoordinaten an einem Gegenstand | |
DE102018204451A1 (de) | Verfahren und Vorrichtung zur Autokalibrierung eines Fahrzeugkamerasystems | |
DE102018103474A1 (de) | Ein system und verfahren zur objektabstandserkennung und positionierung | |
WO2005031647A1 (de) | Verfahren und vorrichtung zur berührungslosen optischen 3d-lagebestimmung eines objekts | |
DE112021004779T5 (de) | Vorrichtung zum Einstellen eines Parameters, Robotersystem, Verfahren und Computerprogramm | |
EP2887010B1 (de) | Verfahren und Vorrichtung zum dreidimensionalen optischen Vermessen von Objekten mit einem topometrischen Messverfahren sowie Computerprogramm hierzu | |
EP3953900A1 (de) | Verfahren und vorrichtung zum steuern mindestens einer aktorik | |
EP3663800B1 (de) | Verfahren zur objekterfassung mit einer 3d-kamera | |
DE102020204677B4 (de) | Trackingsystem und Computerprogramm zur Kompensation von Sichtschatten bei der Nachverfolgung von Messobjekten | |
EP3582140B1 (de) | System zur automatischen erkennung von laborarbeitsgegenständen sowie verfahren zum betrieb eines systems zur automatischen erkennung von laborgegenständen | |
WO2020229352A1 (de) | Verfahren zum bereitstellen einer objektverfolgungsfunktion | |
DE102019220364A1 (de) | Kalibriereinrichtung und Verfahren zum Kalibrieren einer Vorrichtung | |
DE102018208604A1 (de) | Ermitteln eines Aufnahmeverhaltens einer Aufnahmeeinheit |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 17P | Request for examination filed | Effective date: 20211110 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| 17Q | First examination report despatched | Effective date: 20240905 |