EP4405917A1 - Method for identifying at least one object

Method for identifying at least one object

Info

Publication number
EP4405917A1
Authority
EP
European Patent Office
Prior art keywords
camera system
multispectral sensor
spectral
image
arrangement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22797251.0A
Other languages
German (de)
English (en)
Inventor
Ulrich Hausmann
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Optronia GmbH
Original Assignee
Optronia GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Optronia GmbH filed Critical Optronia GmbH
Publication of EP4405917A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/194 Terrestrial scenes using hyperspectral data, i.e. more or other wavelengths than RGB
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/188 Vegetation
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01D HARVESTING; MOWING
    • A01D75/00 Accessories for harvesters or mowers

Definitions

  • the invention relates to a method for identifying at least one object according to the preamble of claim 1.
  • the invention also relates to an arrangement for carrying out such a method.
  • a method for detecting objects is already known from AT 413 899 B, in which a multi-sensor unit comprising an IR radiation sensor, a microwave sensor and/or a video camera is used to simultaneously process different spectral ranges for animal identification.
  • the disadvantage here is that the different recording techniques must always be active together in order to be able to ensure the functionality of the multi-sensor unit.
  • the different means of recording the respective spectral ranges are not coordinated with one another, so that a high power requirement and a high computing capacity are required when the multi-sensor unit is in operation.
  • no sufficiently meaningful diagnoses on details of the detected objects are possible via passively acting sensors.
  • a method for detecting objects in the form of animals is also already known from document DE 10 2009 039 601 B4, in which a front-end sensor and a camera with flashlight are provided and, after a detection signal has been triggered in an area associated with the front-end sensor, the camera is maneuvered into this area in order to produce an image with flashlight for subsequent evaluation via a pattern recognition algorithm.
  • the front-end sensor is in the form of a microwave radar sensor or an infrared sensor.
  • the disadvantage here is, on the one hand, a large time delay between the recordings of the object by the two types of imaging, whereby a real-time analysis in the sense of a continuous imaging process of objects is not possible.
  • on the other hand, with the passively operating sensor unit in the infrared range, which is kept simple, no adequate spectral information on the object - in particular geometric data of details of the object - beyond an identification of the presence of the object can be obtained.
  • a method for detecting objects is known from US 10,761,211 B2, in which cameras are moved relative to the direction of sunlight irradiation such that a constant illumination of the objects to be identified by sunlight, which is necessary for precise imaging, is ensured.
  • Sunlight is used to expose the objects in order to prevent interference between active lighting and the sunlight, whereby the wavelength range of the cameras is limited and noise problems and measurement errors have to be taken into account.
  • in order to coordinate the large number of cameras, the images must be correlated with one another.
  • different cameras can record images that identify the object with a time offset relative to one another; the different cameras are, however, only coordinated with one another with regard to their location or the time offset to the object.
  • the cameras are always active to determine the required data, so that there is a high consumption of resources during operation.
  • the objective technical task of the present invention is therefore to provide a method for identifying objects that is improved over the prior art, and an arrangement for carrying out such a method, in which the disadvantages of the prior art are at least partially eliminated and which is in particular characterized by resource-saving use combined with a high generated information content of the object to be identified.
  • the at least one multispectral sensor simultaneously records at least one spectral image of at least two mutually different wavelength ranges of the at least one object, the at least one multispectral sensor having at least one illumination source for active exposure during the recording of the at least one spectral image.
  • the at least one object is pre-analyzed by the at least one evaluation device via the at least one spectral image of the at least one object recorded by the at least one multispectral sensor.
  • depending on the spectral pre-analysis of the at least one object, at least one optical image of the at least one object is recorded by the at least one camera system via the at least one control and/or regulation device.
  • the at least one evaluation device analyzes the at least one object via the at least one spectral image and/or the at least one optical image.
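The interplay of the method steps above (continuous spectral pre-analysis, camera triggered only when required, combined evaluation) can be sketched roughly as follows; all names and the vegetation-index-style criterion are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class SpectralImage:
    # reflectance in two mutually different wavelength ranges,
    # e.g. a red band and a near-infrared band
    band_a: float
    band_b: float

def pre_analyze(spectral: SpectralImage, threshold: float = 0.3) -> bool:
    """Spectral pre-analysis: decide whether the object warrants an
    optical image (hypothetical normalized-difference criterion)."""
    denom = spectral.band_a + spectral.band_b
    if denom == 0:
        return False
    index = (spectral.band_b - spectral.band_a) / denom
    return index > threshold

def identify(spectral: SpectralImage, capture_optical) -> str:
    """One pass of the control loop: the camera system is only
    activated when the pre-analysis requires it, saving power
    and computing capacity."""
    if pre_analyze(spectral):
        optical = capture_optical()  # camera system triggered selectively
        return f"analyzed spectral+optical: {optical}"
    return "analyzed spectral only"
```

A low index (bare soil, stones) leaves the camera system inactive, while a plant-like signature triggers the optical recording.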
  • the objects to be identified can still be analyzed sharply and true to detail through the targeted and coordinated use of the combination of the two types of imaging, despite the savings in power and computing effort.
  • the at least one multispectral sensor particularly preferably works continuously and thus scans 100% of a background.
  • the control device triggers the at least one camera system selectively, i.e. not continuously over time.
  • the at least one camera system can be activated for each individual object, for example a plant, with special, singular spectral properties being searched for and/or documented, for example rot.
  • a random imaging of objects by the at least one camera system can be used via a statistical selection process in order to determine an overall distribution—for example, the occurrence of a specific plant species—via a statistical evaluation. This can be done in the course of the spectral pre-analysis and/or in a separate step from the pre-analysis of the at least one object, preferably as a supplement.
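The statistical selection process mentioned above can be sketched as a simple sampling estimator; the 10 % sampling rate and the record labels are illustrative assumptions.

```python
import random

def estimate_species_share(objects, is_target, sample_rate=0.1, seed=42):
    """Image only a random subset of the spectrally detected objects
    with the camera system and extrapolate the overall distribution
    (e.g. the occurrence of a specific plant species) statistically."""
    rng = random.Random(seed)
    # objects selected here would be handed to the camera system for imaging
    sampled = [obj for obj in objects if rng.random() < sample_rate]
    if not sampled:
        return 0.0
    hits = sum(1 for obj in sampled if is_target(obj))
    return hits / len(sampled)
```

Only roughly every tenth object is imaged optically, yet the estimated share converges on the true distribution, which keeps the camera activation time low.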
  • the analysis of the at least one object by the at least one evaluation device can also be carried out as a function of the spectral pre-analysis, with the evaluation device only becoming active beyond the spectral pre-analysis if the spectral image of the at least one object - recorded by the at least one multispectral sensor - requires this.
  • the recording of optical images by the at least one camera system and/or the analysis by the at least one evaluation device can take place as a function of the spectral pre-analysis, for example if the information content of the spectral pre-analysis is not sufficient for a meaningful diagnosis with regard to the at least one object, or if the spectral pre-analysis identifies a contour and/or coloring that is noticeable compared to a background and from which an object to be examined (e.g. a plant protruding from the ground) can be inferred.
  • the at least one optical image may be stored only for documentation purposes, with the analysis of the at least one object being interpreted so broadly that documentation of the at least one optical image or, if necessary, a review of the documentation also represents an analysis of the at least one object.
  • the analysis is to be distinguished from the pre-analysis, with the analysis being able to be initiated as a function of the pre-analysis, and possibly only being carried out if the pre-analysis identifies an object which is to be checked more precisely via the at least one optical image.
  • an interim analysis between spectral pre-analysis and analysis is also possible, in which, for example, radioactivity, foreign bodies or other information content can be identified.
  • the analysis is particularly preferably used to determine a plant type and its condition, such as growth, degree of maturity, fertilizer content, water content, drought stress, diseases, fungal infestation, rot, death, hail damage, insect infestation, contamination of the plant, contamination of the subsoil, et cetera.
  • the at least one camera system can thus be used depending on the situation, which enables a particularly resource-saving method in which only the information about the object that is actually required is generated.
  • An implementation of when optical images of the at least one object are to be recorded by the at least one camera system as a function of the spectral pre-analysis can be flexibly adjusted and can include, for example, threshold values or individually configurable decision parameters.
  • the decision to record optical images can be made when a spectrally recorded object is identified with a contour of a plant in contrast to stones and soil.
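The individually configurable decision parameters and threshold values described above could be modelled as follows; all parameter names and values are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class TriggerConfig:
    """Flexibly adjustable decision parameters for when the camera
    system should record an optical image."""
    min_contour_area: float = 4.0   # mm^2, ignores small specks such as grit
    index_threshold: float = 0.3    # spectral index above which a plant is assumed

def should_record_optical(contour_area: float, spectral_index: float,
                          cfg: TriggerConfig = TriggerConfig()) -> bool:
    """Record an optical image only if the pre-analysis finds a
    plant-like contour standing out against stones and soil."""
    return (contour_area >= cfg.min_contour_area
            and spectral_index > cfg.index_threshold)
```

Swapping in a different `TriggerConfig` adapts the arrangement to a new area of application without changing the method itself.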
  • the requirements for the analysis and/or specific circumstances of the arrangement can be integrated, with the arrangement for carrying out the method being flexibly adaptable in new areas of application.
  • the at least two different wavelength ranges in connection with the at least one multispectral sensor mean that the maxima of the intensity distribution of the respective wavelength ranges are separate from one another, with the wavelength ranges being able to overlap in the spectral distribution; e.g. one wavelength range has a maximum in the red wavelength range and another wavelength range has a maximum in the green wavelength range.
  • the wavelength ranges are preferably located in the infrared range, in the near-infrared range up to 1400 nm and/or in the spectral range between 180 nm and 1400 nm, particularly preferably between 580 nm and 960 nm, with the at least one multispectral sensor generally also being able to use RGB light sources as an alternative or in addition.
  • the wavelength range of the at least one multispectral sensor is preferably located below 500 nm and/or above 850 nm.
  • a wavelength range of the at least one multispectral sensor is particularly preferably at least partially or completely separate from a wavelength range of the at least one camera system.
  • the at least one camera system is preferably embodied as an optical camera, although monochrome cameras and/or RGB cameras are also conceivable, with the advantages relating to the passive exposure of the at least one object to sunlight being particularly pronounced when an optical camera is used.
  • the at least one camera system can generally record individual images and/or a sequence of images as a video stream, the at least one camera system being particularly preferably designed as an embedded vision system in which the at least one, in particular board-level, camera system comprises an integrated image evaluation module, wherein the at least one image is not stored and/or recorded.
  • the at least one control and/or regulation device is preferably designed in the form of a system-on-module (SOM), the at least one control and/or regulation device comprising a CPU, a GPU, a memory, an energy management module and/or high-speed interfaces.
  • libraries, in particular artificial intelligence libraries, that are comprised by the GPU and/or are processed by the GPU are particularly preferred.
  • the at least one camera system is particularly preferably inactive, with the at least one camera system being activated by the at least one control and/or regulation device for recording the at least one optical image of the at least one object, depending on the spectral pre-analysis, the at least one camera system and the at least one multispectral sensor preferably being arranged together on a base body. This results in a specific activation of the at least one camera system depending on individual objects of interest.
  • the at least one multispectral sensor and/or the at least one camera system can operate in a spectral range between 1 µm and 100 µm.
  • passive sensor systems are usually used for imaging objects, which use sunlight and/or scattered light as exposure.
  • changing lighting conditions are detrimental to sufficiently precise images of objects, since an analysis of the images is sensitive with regard to varying lighting conditions.
  • in principle, cameras could be used which permanently use as much light as possible for imaging in order to outshine the sunlight or scattered light.
  • complex and powerful exposure systems are required, which also require a very high power consumption.
  • according to the present invention, at least one multispectral sensor can be used which includes its own lighting in order to essentially completely block out the sunlight, since its own lighting conditions are created.
  • Flashlight can be used selectively in the at least one camera system in order to use the advantages of an active illumination for blocking out sunlight for analysis purposes for a limited time by an exposure device of low complexity and/or required power. This makes it possible—in particular through a coordinated combination of the two types of imaging—to reduce power consumption particularly effectively and at the same time to generate images with a particularly high degree of detail.
  • the at least one object can be designed essentially arbitrarily in terms of type, shape and/or quality.
  • the at least one object particularly preferably represents a plant, a multiplicity of plants or part of a plant, with fruit, flowers, grasses and the like, for example, being included in the term plant.
  • the analysis of the at least one object can be used, for example, in order to be able to reduce the amount of plant treatment agent and use it more purposefully, with this bringing both ecological and cost-related advantages.
  • the analysis can make a harvest more effective and/or more efficient, while in particular a higher quality and/or a greater yield of the products can be ensured at the same time.
  • other objects apart from agriculture are also conceivable, with areas of application in waste separation, in recycling or for quality assurance purposes being possible.
  • the at least one object can be identified quickly (processing of digital data from the at least one optical image is only initiated when required) and inexpensively (less expensive components are required), since the at least one camera system is only switched on in addition to the at least one comparatively low-energy multispectral sensor and an analysis based on the optical information is only carried out if necessary, so that increased computing effort and power consumption occur only if this is profitable for diagnosis or identification.
  • both the at least one spectral image and the at least one optical image are used for analysis by the at least one evaluation device, since in this way additional information can be used to identify the at least one object in a combination of the types of imaging compared to the use of one type of imaging.
  • via the at least one spectral image in the infrared range, it can be diagnosed that a structure on the at least one object is fungal infestation and not drought stress or rot, although this aspect would not be clearly recognizable from the at least one optical image.
  • image processing can be used to identify which subgenus or species a plant is (e.g. in contrast to weeds), although this fact could not be determined on the basis of spectral information alone.
  • an analysis can be carried out solely on the basis of the at least one spectral image or the at least one optical image.
  • the subsoil or the at least one object can be analyzed accurately in real time.
  • real time is to be interpreted in such a way that a time delay which occurs when processing the method steps is negligible in comparison to typical reaction times and/or typical movement times of actuators; however, slow data acquisition that is inherent to the components of the arrangement involved is not to be taken into account.
  • the at least one multispectral sensor can be set up to detect a specific object, in particular a plant genus, wherein, when the specific object is identified by the at least one multispectral sensor, the at least one camera system is activated in order to capture the specific object with at least one optical image with an increased level of detail.
  • the activation time of the at least one camera system can be reduced by the at least one multispectral sensor.
  • via the at least one multispectral sensor, the position of the object to be specifically identified, preferably with a temporal and/or spatial offset relative to the at least one multispectral sensor, can be communicated to the at least one camera system, preferably via the at least one control and/or regulation device, for recording the at least one optical image, the wavelength ranges of the at least one multispectral sensor preferably being different from the wavelength range of the at least one camera system.
  • it is not necessary for the at least one camera system to have a high frame rate in order to generate sufficient images for analysis purposes.
  • owing to the coordination of the at least one multispectral sensor (which is generally constantly active with low resource consumption) with the at least one camera system (which is selectively activated on the basis of the pre-analysis for a specific object), it is generally sufficient for the at least one camera system to record a few optical images per second, preferably with high resolution and/or selectively, as a result of which costs can be saved in particular with the at least one camera system and/or less complex structural components are required.
  • the at least one camera system can be set up to be activated selectively via an algorithm in order to be able to make, via a statistical evaluation - preferably by the at least one evaluation device - statistical statements on the number of objects on the ground, the number of objects with specific properties such as rot, pest infestation, size et cetera, and/or the like, as a result of which an activation time of the at least one camera system can be further reduced.
  • protection is also sought for an arrangement for carrying out such a method, comprising at least one multispectral sensor, in particular a photodiode, at least one, in particular optical, camera system, at least one control and/or regulation device and at least one evaluation device, wherein the at least one multispectral sensor comprises at least one illumination source for active illumination during the recording of at least one spectral image and at least one spectral image of at least two different wavelength ranges of the at least one object, preferably a plant, can be recorded simultaneously by the at least one multispectral sensor; the at least one evaluation device is set up to pre-analyze the at least one object via the at least one spectral image, and the at least one open-loop and/or closed-loop control device and the at least one camera system are designed to record at least one optical image of the at least one object using the at least one camera system as a function of the spectral pre-analysis, with the at least one evaluation device evaluating the at least one object via the at least one spectral image.
  • the choice of wavelength ranges is generally arbitrary.
  • an optical camera system is understood to capture images in the visible spectral range, preferably using optical lenses and/or optical imaging laws, whereby IR camera systems can also be used according to the invention.
  • the arrangement is maneuvered relative to the at least one object, in particular by an automatic feed device, or the at least one object moves relative to the stationary arrangement.
  • the at least one evaluation device and/or the at least one open-loop and/or closed-loop control device can be implemented using suitable source codes and/or circuits.
  • a flash device, preferably at least one LED flash device spatially separate from the at least one camera system, is used for active exposure.
  • at least one line laser is provided for the preferably active illumination of the at least one object, it preferably being provided that a height of the at least one object can be determined via the at least one line laser and/or via triangulation.
  • an IR radiator and/or an LED lighting source is particularly preferably used.
  • the at least one object is usually passively illuminated by sunlight and/or artificial light, which can vary in the angle of incidence, shadow cast, intensity and spectrum. This situation, which is detrimental to imaging purposes, is counteracted by the active exposure, with the light sources being able to be adjusted to the circumstances and/or requirements.
  • the at least one multispectral sensor is particularly preferably designed to be passive or without lighting, so that the at least one multispectral sensor does not include active lighting, with the at least one camera system preferably including separate active lighting and/or active lighting as an integral part of the at least one camera system.
  • Such short flash durations can on the one hand ensure a sharp image despite movement of the arrangement relative to the at least one object and on the other hand synchronization with the at least one multispectral sensor and/or the at least one evaluation device is possible in a particularly favorable manner, since time delays due to the recording of the at least one optical image are low.
  • the number of spectral and/or optical images of the same object and/or of other objects is generally arbitrary. In general, however, exactly one optical image is sufficient for analysis purposes to identify an object.
  • the at least one spectral image and the at least one optical image can be recorded sequentially and/or simultaneously.
  • a time interval between spectral images of the at least one multispectral sensor and/or a flash duration of the at least one camera system can be adapted to the speed of movement of the arrangement relative to the at least one object, with an optical image preferably being recorded essentially instantaneously or without a time delay after a spectral identification of the at least one object.
  • a spectral image following the previous spectral image can be recorded during the recording of the optical image and/or following the optical image.
  • the at least one multispectral sensor is essentially always active during operation of the arrangement, it preferably being provided that the at least one multispectral sensor continuously records spectral images of a substrate and/or of the at least one object, and/or that the at least one multispectral sensor comprises at least one total reflection lens, wherein the at least one total reflection lens comprises two cover surfaces connected via a, preferably convexly curved, lateral surface, wherein at least one recess adjoining the first cover surface and pointing in the direction of the second cover surface is arranged between the first cover surface and the second cover surface, wherein the at least one recess has at least two boundary surface sections, preferably facets, pointing in the direction of the second cover surface and separate from one another, for the refraction of light in the direction of the lateral surface.
  • at least one lighting device is particularly preferably provided, which comprises the at least one multispectral sensor and/or the total reflection lens for focusing light from at least two light sources, preferably light-emitting diodes, of the at least one lighting device onto an imaging area common to the light sources, with the at least one multispectral sensor electromagnetically detecting light that is emitted by at least two light sources of the at least one lighting device and reflected by the at least one object.
  • the constant recording of spectral images prevents an object of interest from going undiscovered, while the resources of the at least one camera system and/or the at least one evaluation device are spared.
  • the at least one multispectral sensor and/or the at least one illumination device comprises a total reflection lens according to Austrian patent application number A 50491/2021.
  • spectral images can be recorded with such a total reflection lens, since light from a light source can enter the total reflection lens and upon entry the light from the light source is split into partial beams which are available for analyzing a spectrum of an object illuminated by the light source.
  • a multiplicity of sub-beams are effected through the interface sections, the number of sub-beams corresponding to the number of interface sections.
  • the interface sections act as facets that are preferably planar orthogonal to an optical axis/axis of symmetry of the total reflection lens, as a result of which the spectrum of the light reflected by the object via the total reflection lens can be used particularly advantageously for a spectral analysis.
  • the interface sections are curved along the optical axis for simplified production. If light sources are arranged with a lateral offset to the optical axis, collimation still takes place via the interface sections in the imaging area with concentric and/or overlapping intensity and/or wavelength distribution of the light sources, which cannot be guaranteed with a recess of constant curvature without interface sections.
  • the at least one evaluation device preferably compares the at least one spectral image, via a search algorithm, with a large number of data records stored in a database, it preferably being provided that geometric data, particularly preferably contours, of the at least one spectral image are used for the data comparison.
  • the search algorithm can, for example, be in the form of a program for comparing digital data with a table and/or matrix.
  • the search algorithm preferably includes artificial intelligence including machine learning, deep learning and/or neural networks.
  • the data records can include, for example, spectral information, geometry, color, contour, etc., which can generally also be used via the at least one optical image.
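In its simplest conceivable form, the search-algorithm comparison against stored data records could be a nearest-neighbour match of the recorded spectral signature; the database entries below are invented placeholders, not data from the patent.

```python
def match_record(signature, database):
    """Compare a spectral signature (one reflectance value per wavelength
    range) with stored data records and return the closest label."""
    def distance(rec):
        # squared Euclidean distance between signatures
        return sum((a - b) ** 2 for a, b in zip(signature, rec["signature"]))
    return min(database, key=distance)["label"]

# hypothetical data records with spectral information per wavelength range
db = [
    {"label": "soil",  "signature": (0.30, 0.35)},
    {"label": "plant", "signature": (0.10, 0.60)},
]
```

In practice the patent foresees far richer records (geometry, color, contour) and search algorithms including machine learning; this sketch only shows the basic table-comparison idea.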
  • a plant genus can already be inferred from the at least one spectral image via pattern recognition.
  • Pattern recognition is particularly preferred, however, based on the at least one optical image, preferably via a geometry of the at least one object, in order to be able to determine details of objects such as species.
  • the at least one multispectral sensor can provide additional information; in the case of plants as objects, for example diseases, drought stress, nitrogen content and the like.
  • a common database can be used for the at least one spectral image and the at least one optical image.
  • a database with regard to spectral data and a separate database from this with regard to optical data such as plant shape and/or plant color, pest infestation, deficiency symptoms et cetera are preferably provided. It has proven advantageous for the at least one evaluation device to assign the at least one optical image, preferably together with the at least one spectral image, to an object in an image database using a pattern recognition algorithm.
  • the at least one object can be identified accurately with a low susceptibility to errors, with the pattern recognition algorithm particularly preferably comprising artificial intelligence.
  • the artificial intelligence is preferably based on TensorFlow and/or deep learning and can include machine learning and/or neural networks. A large number of optical images of objects such as plants can serve as learning data in the learning process of the artificial intelligence, which are compared with objects that have already been categorized in terms of contour, geometry, shape, color or the like.
  • the pattern recognition algorithm particularly preferably uses the spectral data of the at least one spectral image as an additional information source.
  • an advantageous variant consists in that the at least one multispectral sensor, the at least one camera system and/or at least one flash device that may be present are arranged on a base body, with the at least one multispectral sensor recording the at least one spectral image and/or the at least one camera system recording the at least one optical image in the direction of a substrate at a viewing angle, preferably between 5° and 15°, relative to a perpendicular to the substrate, it preferably being provided that the viewing angle, particularly preferably automatically, increases and/or decreases, and/or that the viewing angle of the at least one camera system, optionally with at least one flash device, in particular spatially separate from the at least one camera system, is essentially identical to the viewing angle of the at least one multispectral sensor or differs from the viewing angle of the at least one multispectral sensor.
  • the at least one object can be recognized early on by a viewing angle, so that, for example, actuators for treatment (such as harvesting, sorting and/or spraying) can be controlled in good time depending on the analysis of the object. If a viewing angle between 5° and 15° is selected, a particularly favorable compromise can be achieved between a resolution of the images, the exposure conditions, a lead time for actuators and/or a degree of detail of the images.
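The lead time gained by the forward-tilted viewing angle follows from simple geometry: at mounting height h and viewing angle theta to the perpendicular, the look-ahead distance is h * tan(theta), and the lead time is that distance divided by the relative speed. The height and speed values used below are illustrative assumptions.

```python
import math

def lead_time_s(height_m: float, viewing_angle_deg: float,
                speed_m_per_s: float) -> float:
    """Time available to actuators (harvesting, sorting, spraying)
    between detection of an object and reaching it."""
    look_ahead_m = height_m * math.tan(math.radians(viewing_angle_deg))
    return look_ahead_m / speed_m_per_s
```

At an assumed height of 1 m, a 10° viewing angle and 2 m/s relative speed this yields roughly 0.09 s of lead time, whereas a 0° viewing angle would leave none.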
  • the viewing angle can be adjusted manually, in particular via the base body, and/or automatically, preferably by means of a sensor, via a relative speed between the at least one object and the base body, and/or individually for the at least one camera system and/or the at least one multispectral sensor via the at least one open-loop and/or closed-loop control device.
  • a viewing angle, in particular an opening angle or an angle of inclination, of the at least one multispectral sensor, preferably together with the infrared light and/or the LED illumination source, and the at least one camera system, preferably together with the at least one flashlight device, can be different, with the viewing angle of the at least one multispectral sensor and the at least one camera system being particularly preferably identical.
  • at larger viewing angles the at least one flash device would no longer be able to provide an exposure time sufficient for sharp images, whereas for a viewing angle of 0° very short exposure times would be required in order to generate sharp images.
  • the level of detail is reduced and timely activation cannot be guaranteed, at least not in every area of application.
  • An optimum for the identification of the at least one object can be set by the viewing angle.
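The lead-time advantage of a tilted viewing angle described above can be illustrated with a short numeric sketch. The sensor height, speed and geometry below are assumed illustrative values, not parameters from the application; the relation (imaged spot lies ahead of the sensor by height × tan(viewing angle)) is standard geometry, not a claimed feature.

```python
import math

def lead_time(view_angle_deg: float, sensor_height_m: float, speed_kmh: float) -> float:
    """Time between imaging a ground point and the carrier reaching it.

    With the imaging axis tilted by view_angle_deg from the vertical, the
    imaged spot lies ahead of the sensor by height * tan(angle); dividing
    this lead distance by the travel speed gives the time available to
    trigger an actuator (e.g. a spray valve) before the spot is reached.
    """
    lead_distance_m = sensor_height_m * math.tan(math.radians(view_angle_deg))
    speed_ms = speed_kmh / 3.6
    return lead_distance_m / speed_ms

# Assumed values: 1 m sensor height, 10 km/h travel speed, 15° viewing angle.
t = lead_time(15.0, 1.0, 10.0)
```

At these assumed values the 15° angle buys roughly a tenth of a second of lead time, whereas a 0° angle would leave none, which matches the compromise argued for in the text.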
  • the at least one multispectral sensor and the at least one camera system are arranged, preferably next to one another, along an arrangement direction, with the arrangement direction being oriented orthogonally to a direction of movement of the arrangement, in particular with a flash device that may be present, relative to the at least one object.
  • the at least one multispectral sensor and the at least one camera system are synchronized via the at least one open-loop and/or closed-loop control device such that the at least one spectral image and the at least one optical image of the at least one object are recorded with a temporal offset of a maximum of 50 µs, preferably a maximum of 30 µs, and/or a spatial offset z of a maximum of 10 mm, preferably a maximum of 6 mm.
  • the spatial offset relates to the relative displacement of the imaging areas of the at least one multispectral sensor and the at least one camera system between two consecutive images; the imaging areas are generally spaced apart as a consequence of the movement of the arrangement relative to the at least one object and of the viewing angle of the arrangement.
  • the spatial offset z can be understood as arising from a parallel movement of the base body relative to the at least one object between a spectral image and an optical image; for analysis purposes, the spatial offset z can be neglected when identifying the at least one object at speeds below 50 km/h, preferably below 30 km/h, and in particular an identical viewing angle can be provided.
  • a spatial offset of the spectral and optical images recorded of the object is then negligible and can be kept below 5 mm.
  • at least one actuator, preferably of an agricultural machine, is controlled, particularly preferably by the at least one open-loop and/or closed-loop control device, depending on the analysis performed by the at least one evaluation device via the at least one spectral image and/or the at least one optical image of the at least one object.
  • the at least one actuator can be part of a spray device, a harvesting device and/or a valve arrangement, for example.
  • the at least one camera system has an aspect ratio other than 1:1, preferably 5:3, and the at least one open-loop and/or closed-loop control device particularly preferably divides an imaging area of the at least one camera system, preferably of at least 300 mm by at least 500 mm, into sub-areas, with the at least one evaluation device sequentially forwarding images of the sub-areas of the imaging area - as optical images - for evaluation, it preferably being provided that sub-areas of the imaging area are illuminated differently, preferably by a flash device that may be present, preferably in white, RGB and/or UV.
  • the sub-areas can be analyzed in further processing during the analysis of the at least one optical image under varying exposure conditions.
  • a particularly efficient analysis of the at least one optical image is possible, since images of the sub-areas can be forwarded sequentially to the at least one evaluation device.
  • Splitting and sequential forwarding is also possible with an aspect ratio of 1:1.
  • the sub-areas can be combined, in a manner analogous to forming a panorama image from sub-images, with a first sub-area being captured by the at least one camera system and forwarded in the form of an image while a subsequent sub-area is being captured.
  • splitting up the imaging area into sub-areas, with the imaging area being composed of the sub-areas is also conceivable.
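The sub-area splitting and sequential forwarding described above can be sketched as follows. This is a minimal stand-in (frame representation, generator-based forwarding and all names are illustrative assumptions, not the application's implementation):

```python
def split_imaging_area(frame, n_subareas: int):
    """Divide a frame (list of pixel rows) into n_subareas vertical strips
    along its longer axis and yield them one at a time, mimicking the
    sequential forwarding of sub-area images to the evaluation device:
    the first strip can already be processed while later ones are read."""
    width = len(frame[0])
    strip = width // n_subareas
    for i in range(n_subareas):
        lo = i * strip
        hi = (i + 1) * strip if i < n_subareas - 1 else width  # last strip takes remainder
        yield [row[lo:hi] for row in frame]

# A dummy 10-column x 4-row frame (5:3-like landscape) split into two halves.
frame = [[c for c in range(10)] for _ in range(4)]
parts = list(split_imaging_area(frame, 2))
```

Splitting a 5:3 imaging area into two strips like this is what allows each strip to be exposed and evaluated under its own lighting conditions, as the text notes.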
  • the imaging area of the at least one camera system is preferably two-dimensional and has a minimum longitudinal extent of 15 mm, in particular 300 mm, with at least 2 megapixels or 4 megapixels being used particularly preferably for capturing the at least one optical image.
  • the imaging range of a multispectral sensor preferably includes a minimum longitudinal extension of 15 mm, with the imaging range of a large number of multispectral sensors particularly preferably extending along a straight line with a minimum longitudinal extension of 60 mm.
  • an arrangement of a large number of multispectral sensors in one dimension can cover an image of a linear area of 500 mm, with two groups of multispectral sensors, for example, spectrally imaging along this linear area over the respective assigned imaging area.
  • spectral images of a substrate and/or the at least one object can be continuously recorded via the at least one control and/or regulation device by the at least one multispectral sensor.
  • the at least one camera system and/or the at least one multispectral sensor is arranged relative to a substrate in such a way that an imaging area of the at least one camera system and/or the at least one multispectral sensor extends in a dimension of at least 300 mm, preferably at least 500 mm.
  • the at least one camera system has an aspect ratio not equal to 1:1, preferably 5:3, whereby it is preferably provided that the at least one open-loop and/or closed-loop control device is set up to subdivide an imaging area of the at least one camera system into sub-areas and to sequentially forward images of the sub-areas to the at least one evaluation device for evaluation.
  • partial areas of the imaging area can be illuminated differently, preferably by at least one flash device that may be present and/or in white, RGB and/or UV.
  • the at least one multispectral sensor comprises at least one total reflection lens. If at least one total reflection lens is provided, it is preferably provided that the at least one total reflection lens comprises two cover surfaces connected via a preferably convexly curved lateral surface, with at least one recess adjoining the first cover surface and pointing in the direction of the second cover surface being arranged between the first cover surface and the second cover surface, the at least one recess having at least two boundary surface sections, preferably facets, which point in the direction of the second cover surface and are separate from one another, for the refraction of light in the direction of the lateral surface.
  • at least one base body is provided, with at least two groups of at least one multispectral sensor, preferably four multispectral sensors each, being arranged on the base body, and with the at least one camera system and/or at least one flash device that may be present, preferably an LED flash device, being arranged between the at least two groups, particularly preferably next to each other in a row.
  • the at least one base body, the at least one multispectral sensor, the at least one camera system and/or the at least one flash device is aligned towards a substrate at a viewing angle, preferably between 5° and 15°, relative to a perpendicular to the substrate, it preferably being provided that the viewing angle can be enlarged and/or reduced, particularly preferably automatically, and/or that the viewing angle of the at least one camera system is essentially identical to, or differs from, the viewing angle of the at least one multispectral sensor.
  • the viewing angle can range from 0° to 45°.
  • at least one lighting device is provided which comprises the at least one multispectral sensor, the at least one lighting device having at least two light sources, preferably light-emitting diodes, and the at least one total reflection lens for focusing light of the at least two light sources onto an imaging area common to the light sources, the light emitted by the at least two light sources of the at least one lighting device and reflected by the at least one object being electromagnetically detectable by the at least one multispectral sensor.
  • the at least one lighting device is preferably designed as a lighting optic and/or lighting module, active and/or passive optical components being particularly preferably included in the at least one lighting device.
  • the at least one camera system preferably also detects electromagnetically the light emitted by the at least one flash device and/or the at least one lighting device after reflection on the at least one object, with the at least one camera system being able to include, for example, at least one CCD chip and/or at least one CMOS chip.
  • an advantageous variant consists in the at least one camera system having at least one flash device, preferably at least one LED flash device separate from the at least one camera system, wherein the at least one object can be illuminated by the at least one flash device for optical imaging, via the at least one open-loop and/or closed-loop control device, with a flash duration in the range between 10 µs and 1000 µs, preferably between 20 µs and 400 µs, it preferably being provided that the at least one open-loop and/or closed-loop control device is set up to record the at least one optical image relative to the at least one spectral image with a time offset of a maximum of 50 µs, preferably a maximum of 30 µs, and/or a spatial offset of a maximum of 10 mm, preferably a maximum of 6 mm.
  • the at least one evaluation device is set up to compare the at least one spectral image, preferably using a search algorithm, with a large number of data sets stored in a database, it being preferably provided that geometric data, particularly preferably contours, of the at least one spectral image can be used for data comparison.
  • the at least one evaluation device is set up to assign the at least one optical image, preferably together with the at least one spectral image, to an object in an image database using a pattern recognition algorithm.
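The database comparison via a search algorithm described above can be sketched with a minimal nearest-record match. The distance metric, the band values and the labels are illustrative assumptions; the application does not specify the search algorithm, so this is just one plausible instance:

```python
def match_spectral_record(spectrum, database):
    """Assign a measured spectrum (one reflectance value per wavelength
    band) to the closest reference record by summed squared band
    differences - a minimal stand-in for comparing a spectral image
    against the stored data sets of the database."""
    def distance(record):
        return sum((a - b) ** 2 for a, b in zip(spectrum, record["bands"]))
    return min(database, key=distance)["label"]

# Hypothetical three-band records (e.g. green, red, near-infrared):
# vegetation is characterized by a high near-infrared reflectance.
db = [
    {"label": "plant", "bands": (0.15, 0.08, 0.60)},
    {"label": "soil",  "bands": (0.20, 0.25, 0.30)},
]
label = match_spectral_record((0.14, 0.10, 0.55), db)
```

In the described method such a spectral match is only the pre-analysis; the pattern recognition on the optical image (contour, geometry, color) then refines the assignment.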
  • the arrangement comprises at least one actuator, preferably of an agricultural machine, the at least one actuator being controlled, preferably by the at least one open-loop and/or closed-loop control device, depending on the analysis of the at least one evaluation device via the at least one spectral image and/or the at least one optical image of the at least one object.
  • At least one line laser is provided with which a height of the at least one object is determined, preferably as a function of the spectral pre-analysis and/or during the recording of the at least one optical image and/or via the at least one evaluation device, it being preferably provided that the height is determined via triangulation and/or an offset of light reflected at the at least one object and/or a subsurface.
  • the arrangement and/or the method can thus be used to determine a further useful parameter, which can provide information about the growth of the at least one object, for example, and can be used in particular to monitor the chronological progression of the height of the at least one object.
  • Fig. 1 an agricultural machine with two arrangements for carrying out a method for identifying an object in a schematic view from above
  • Fig. 2 shows an arrangement for carrying out the method according to a particularly preferred exemplary embodiment during the recording of a spectral image and an optical image in a perspective view
  • FIG. 3 shows an illumination device according to a preferred embodiment with two multispectral sensors for an arrangement for carrying out the method in a perspective sectional view
  • Fig. 4 a total reflection lens for a lighting device according to the embodiment of Fig. 3 in a view from above and a sectional view
  • Figs. 5a, 5b an object to be identified by the arrangement according to the exemplary embodiment of Fig. 2, in the form of a plant, before and during a process of pattern recognition/pattern assignment
  • FIG. 6 shows a line laser for determining a height of the object to be identified.
  • the arrangement 18 can be seen in Fig. 2, the arrangement 18 comprising a large number of multispectral sensors 2 in the form of photodiodes with active lighting, an optical camera system 3 with a flash device 7 as active lighting, an open-loop and/or closed-loop control device 4 and an evaluation device 5. Chronologically, the multispectral sensor 2 simultaneously records a spectral image of three different wavelength ranges of the object 1 or the subsurface 8, the object 1 is pre-analyzed by the evaluation device 5 via the spectral image of the object 1 or the subsurface 8 recorded by the multispectral sensor 2, the camera system 3 records an optical image of the object 1 via the open-loop and/or closed-loop control device 4 depending on the spectral pre-analysis of the object 1, and the object 1 is analyzed via the evaluation device 5 on the basis of the spectral image and the optical image, depending on the spectral pre-analysis.
  • the actuators 14 are controlled by the open-loop and/or closed-loop control device 4 depending on the analysis by the evaluation device 5 of the spectral image and the optical image of the object 1.
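The chronological sequence just described (spectral capture, pre-analysis, conditional optical capture, combined analysis, actuator control) can be sketched as a control-flow skeleton. All callables and names below are placeholders of my own choosing, not the application's software:

```python
def identify_and_act(spectral_frame, pre_analyse, capture_optical, analyse, actuators):
    """Control-flow sketch of the described method: the spectral image is
    pre-analysed, and only if a candidate object is found does the control
    device trigger the flash-assisted optical capture, the combined
    analysis, and finally the actuators."""
    candidate = pre_analyse(spectral_frame)
    if candidate is None:
        return None                      # background only: skip optical imaging
    optical_frame = capture_optical()    # flash-assisted exposure
    result = analyse(spectral_frame, optical_frame, candidate)
    for actuator in actuators:
        actuator(result)                 # e.g. spray valve, harvesting unit
    return result

# Demonstration with stubbed-out sensor and actuator callables.
actuator_log = []
result = identify_and_act(
    "spectral-data",
    lambda s: "plant-candidate",                      # pre-analysis finds an object
    lambda: "optical-data",
    lambda s, o, c: {"object": c, "spectral": s, "optical": o},
    [actuator_log.append],
)
skipped = identify_and_act(
    "spectral-data", lambda s: None, lambda: "optical-data",
    lambda s, o, c: None, [actuator_log.append],
)
```

The gating step mirrors the application's point that the optical image is recorded only "depending on the spectral pre-analysis", which avoids needless flash exposures over bare substrate.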
  • the open-loop and/or closed-loop control device 4 or the evaluation device 5 can comprise a large number of modules for each of their own areas of responsibility.
  • the actuators 14 are spatially spaced from the assemblies 18, although this is not absolutely necessary.
  • Fig. 2 shows the arrangement 18 for carrying out the method with eight multispectral sensors 2 in two groups of four each, a camera system 3 with the flash device 7 arranged next to the camera system 3, and an open-loop and/or closed-loop control device 4 and evaluation device 5 connected by cable to the multispectral sensors 2, the camera system 3 and the flash device 7, it also being possible for the connection of the open-loop and/or closed-loop control device 4 or the evaluation device 5 to be wireless and/or via a radio signal.
  • a camera is an integral part of the camera system 3
  • the flash device 7 is assigned externally to the camera system 3 .
  • an integral flash device 7 of the camera system 3 is also possible.
  • a base body 10 is provided in the arrangement 18, with two groups 26 of multispectral sensors 2 being arranged on the base body 10 and between the two groups 26 the camera system 3 and a flash device 7 in the form of an LED flash device being arranged.
  • the evaluation device 5 compares the spectral images using a search algorithm with a large number of data sets stored in a database 9 .
  • the connection to the database 9 is designed to be wireless, although this is not absolutely necessary since, for example, a database 9 can also be arranged on the arrangement 18 .
  • the database 9 can be recorded, for example, in an on-board computer in a tractor cab or be spatially separate from the tractor. Geometric data in the form of contours and spectral information of the spectral images are used for the data comparison, with the evaluation device 5 being programmed accordingly for the comparison with the aid of a search algorithm.
  • the base body 10 has a longitudinal extent, the longitudinal extent of the base body 10 being arranged orthogonally to a direction of movement (indicated by an arrow) relative to the background 8 and the objects 1 to be identified.
  • the multispectral sensors 2 and the camera system 3 are arranged next to one another along the longitudinal extent.
  • the flash device 7 is used for the optical imaging of the object 1 (see Figs. 5a and 5b) for a time interval limited relative to the imaging period, a flash duration of 20 µs being used for exposure in this embodiment.
  • the flash device 7 is used as an LED flash device that is spatially separate from the camera system 3 for active exposure, it also generally being possible to use a flash device 7 that is integrally connected to the camera system 3 .
  • the multispectral sensor 2, the camera system 3 and the flash device 7 are arranged on the base body 10, with the multispectral sensors 2 recording the spectral image and the camera system 3 recording the optical image in the direction of a substrate 8 at a viewing angle 11 of 15° relative to a vertical 12 onto the subsurface 8.
  • the viewing angle 11 can be enlarged and reduced automatically individually or jointly for the multispectral sensors 2 and the camera system 3 , the viewing angle 11 of the camera system 3 being identical to the viewing angle 11 of the multispectral sensor 2 .
  • the spectral image and the optical image can be recorded simultaneously, with the open-loop and/or closed-loop control device 4 being set up to record spectral images with a time offset of 20 µs relative to the recording of optical images - due, for example, to the exposure time of the flash device 7 of 20 µs - with a spatial offset 13 of a maximum of 5 mm.
  • this spatial offset z 13 is irrelevant for the purpose of the analysis, so that even with sequential recording of the images an indicated correction of the viewing angle 11 of the camera system 3 relative to the viewing angle 11 of the multispectral sensors 2 does not need to be compensated.
  • the multispectral sensors 2 and the camera system 3 are synchronized via the open-loop and/or closed-loop control device 4 in such a way that the optical image is recorded with a time offset of a maximum of 30 µs - caused, for example, by the pre-analysis of the spectral image - and a spatial offset z 13 of a maximum of 5 mm.
  • the camera system 3 has an aspect ratio of 5:3, which is correspondingly visible in the imaging area.
  • the two-dimensional imaging area 16 of the camera system 3, with an extent of 500 mm by 300 mm, is divided into two equidistant sub-areas 17 via the open-loop and/or closed-loop control device 4, with the evaluation device 5 sequentially forwarding images of the sub-areas 17 of the imaging area 16 for evaluation.
  • Partial areas 17 of the imaging area 16 can be illuminated in white, RGB or UV, for example via the flash device 7 or separate exposure devices. However, the subdivision and varying exposure can also be dispensed with, with white flash light being used particularly preferably to block out sunlight or scattered light.
  • the components are controlled, and digital information is forwarded, via the open-loop and/or closed-loop control device 4 by suitable programming of the open-loop and/or closed-loop control device 4.
  • Fig. 3 shows an illumination device 27 for the multispectral sensors 2, with the illumination device 27 comprising two multispectral sensors 2 in this exemplary embodiment.
  • the lighting device 27 includes an LED light source in the form of a light-emitting diode, which comprises three light sources 28.
  • the light sources 28 include a primary lens, with light from the light sources 28 entering a total reflection lens 20 separate from the primary lens for focusing light from the light sources 28 onto an imaging region 16 common to the light sources 28 .
  • a receiver lens 29 is provided before the entry of light reflected from the object 1 , a filter 30 being arranged between the receiver lens 29 and the multispectral sensor 2 .
  • the light emitted by the three light sources 28 of the illumination device 27 and reflected by the object 1 is detected electromagnetically by the multispectral sensor 2, with an exact spectral analysis being ensured despite the lateral spacing of the light sources 28 from an optical axis of the total reflection lens 20.
  • the illumination device 27 comprises the total reflection lens 20, the total reflection lens 20 having two cover surfaces 22, 23 connected via a convexly curved lateral surface 21. Between the first cover surface 22 and the second cover surface 23 there is a recess 24 adjoining the first cover surface 22 and pointing in the direction of the second cover surface 23 .
  • the recess 24 has a multiplicity of boundary surface sections 25 , pointing in the direction of the second cover surface 23 and separate from one another, in the form of boundary surface sections 25 that widen in the direction of the first cover surface for the refraction of light in the direction of the lateral surface 21 .
  • the interface sections 25 are planar.
  • an LED illumination source 6 comprising a plurality of light sources is used for active exposure, light in the infrared range and near-infrared range being able to be emitted simultaneously in different wavelength ranges.
  • Fig. 4 shows a total reflection lens 20 which can be used together with the multispectral sensor 2 to record spectral images, the multispectral sensor 2 being continuously active during operation of the arrangement 18, including its own exposure, and continuously recording spectral images of the substrate 8 or the objects 1.
  • the total reflection lens 20 comprises two top surfaces 22, 23 connected via a convex lateral surface 21, with a recess 24 adjoining the first top surface 22 and pointing in the direction of the second top surface 23 being arranged between the first top surface 22 and the second top surface 23, the recess 24 having a multiplicity of planar boundary surface sections 25 in the form of facets, which point in the direction of the second top surface 23 and are separate from one another, for the refraction of light in the direction of the lateral surface 21 for spectral imaging.
  • the boundary surface sections 25 widen in the direction of the first cover surface 22.
  • An aspherical lens is materially bonded to the planar second cover surface 23 .
  • Figs. 5a and 5b illustrate the analysis based on an optical image of the object 1 in the form of a plant, which was recorded by the camera system 3 after at least one multispectral sensor 2 identified the object 1 as the object 1 to be distinguished from the background 8 and to be analyzed.
  • the arrangement 18 and the method for identifying the object are therefore shown using the example of plant recognition and background detection. This identification takes place in the course of the preliminary analysis by the evaluation device 5, with the open-loop and/or closed-loop control device 4 then initiating the camera system 3 for optical imaging.
  • the evaluation device 5 is set up to assign the optical image to an object 1 in an image database together with spectral images or in isolation via a pattern recognition algorithm. For this purpose, starting from FIG. 5a, the contour or geometry of the object 1 is identified, as can be seen in FIG. 5b by dotted lines around partial areas of the object 1.
  • the object 1 can be associated with a categorized object 1 of the image database by the evaluation device with the aid of a coloring of the object 1 and, if necessary, spectral information. Subsequently, actuators 14 can be activated automatically on the basis of the identification of the object 1.
  • the object 1 can alternatively or additionally be displayed on a visualization device.
  • FIG. 6 shows a line laser 31, which can be used in addition to the arrangement 18 and/or during the method.
  • a height of the object 1 can be determined via the line laser 31 as a function of the spectral pre-analysis or during the recording of the optical image and via the at least one evaluation device 5 .
  • the height 33 can be determined via triangulation and an offset 32 of light reflected on the at least one object 1 or the substrate 8 .
  • the triangulation via the offset 32 of the reflected light can take place as follows: a laser line is directed onto the surface of the object 1 at an oblique angle. There it strikes the surface and creates a visible line. In adjacent areas where the object 1 is lower, the beam strikes with an offset. This offset of the bright laser lines in the optical image can be determined in particular with the camera system 3. If the angle between the camera system 3 and the line laser 31 is known, the height 33 of the object 1, or the height 33 of a part of the object 1, can be calculated using trigonometry.
  • the height 33 can be determined discretely or continuously, with a three-dimensional image of the object 1 being able to be created in the case of a sequence of heights 33.
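The trigonometric step described above can be sketched for the simplest geometry: laser inclined to the vertical, camera looking straight down. The exact geometry of the arrangement 18 may differ; this is the textbook line-laser relation under that assumed setup, with illustrative names and values:

```python
import math

def object_height_mm(line_offset_mm: float, laser_angle_deg: float) -> float:
    """Height of the object from the lateral displacement of the laser line
    in the optical image. With the laser inclined by laser_angle_deg to the
    vertical and the camera looking straight down, a point raised by h
    shifts the line by h * tan(angle), hence h = offset / tan(angle)."""
    return line_offset_mm / math.tan(math.radians(laser_angle_deg))

# Assumed setup: a 10 mm line offset observed at a 45° laser inclination
# corresponds to a 10 mm object height (tan 45° = 1).
h = object_height_mm(10.0, 45.0)
```

Applying this per image column along the laser line yields the sequence of heights 33 from which the three-dimensional image of the object 1 mentioned above can be assembled.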

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Remote Sensing (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

The invention relates to a method for identifying at least one object (1), in particular a plant, by at least one arrangement (18) comprising at least one multispectral sensor (2), in particular a photodiode, at least one, in particular optical, camera system (3), at least one open-loop and/or closed-loop control device (4) and at least one evaluation device (5), the following method steps being carried out: the multispectral sensor(s) (2) simultaneously record(s) at least one spectral image of at least two mutually different wavelength ranges of the object(s) (1), the multispectral sensor(s) (2) comprising at least one illumination source for active illumination during the recording of the spectral image(s); the object(s) (1) is/are pre-analyzed by the evaluation device(s) (5) by means of said at least one spectral image of the object(s) (1) recorded by the multispectral sensor(s) (2); depending on the spectral pre-analysis of the object(s) (1), at least one optical image of the object(s) (1) is recorded by the camera system(s) (3) by means of the open-loop and/or closed-loop control device(s) (4); and the evaluation device(s) (5) analyze(s) the object(s) (1) by means of the spectral image(s) and/or the optical image(s).
EP22797251.0A 2021-09-23 2022-09-22 Procédé d'identification d'au moins un objet Pending EP4405917A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ATA50756/2021A AT525511A1 (de) 2021-09-23 2021-09-23 Verfahren zur Identifikation wenigstens eines Objektes
PCT/AT2022/000010 WO2023044514A1 (fr) 2021-09-23 2022-09-22 Procédé d'identification d'au moins un objet

Publications (1)

Publication Number Publication Date
EP4405917A1 true EP4405917A1 (fr) 2024-07-31

Family

ID=84043898

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22797251.0A Pending EP4405917A1 (fr) 2021-09-23 2022-09-22 Procédé d'identification d'au moins un objet

Country Status (3)

Country Link
EP (1) EP4405917A1 (fr)
AT (1) AT525511A1 (fr)
WO (1) WO2023044514A1 (fr)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5253302A (en) * 1989-02-28 1993-10-12 Robert Massen Method and arrangement for automatic optical classification of plants
DE10016688C2 (de) * 2000-04-04 2003-12-24 Deutsch Zentr Luft & Raumfahrt Verfahren zur Detektion von Tieren und/oder Gelegen von Bodenbrütern in deren natürlichem Lebensraum sowie Einrichtungen zur Durchführung des Verfahrens
DE102009039601B4 (de) * 2009-09-01 2020-06-18 Deutsches Zentrum für Luft- und Raumfahrt e.V. Verfahren und Vorrichtung zur Suche und Erkennung von in landwirtschaftlichen Flächen versteckten Tieren
DE102009039602B3 (de) * 2009-09-01 2011-04-07 Deutsches Zentrum für Luft- und Raumfahrt e.V. Verfahren und Vorrichtung zur Suche und Erkennung von in landwirtschaftlichen Feldern und Wiesen versteckten Tieren
DE102011082908A1 (de) * 2011-09-19 2013-03-21 Deere & Company Verfahren und Anordnung zur optischen Beurteilung von Erntegut in einer Erntemaschine
EP2823346B1 (fr) * 2012-03-06 2017-06-14 Fraen Corporation Interface oscillante pour lentilles de mélange de lumière
US9658201B2 (en) * 2013-03-07 2017-05-23 Blue River Technology Inc. Method for automatic phenotype measurement and selection

Also Published As

Publication number Publication date
AT525511A1 (de) 2023-04-15
WO2023044514A1 (fr) 2023-03-30

Similar Documents

Publication Publication Date Title
DE69720798T2 (de) Vorrrichtung und verfahren zur erkennung und bestimmung der position eines tintenteils
DE69606059T2 (de) Zitzenlokalisation zum melken
DE69811667T2 (de) Vorrichtung und verfahren zur erkennung und bestimmung der position eines tierteils
DE10310768B4 (de) Vorrichtung zur Überwachung in drei Dimensionen
EP3782467B1 (fr) Procédé d'identification des mauvaises herbes dans un rang défini de plantes d'une surface agricole
DE102017111718A1 (de) Verfahren zur Erzeugung und Analyse eines Übersichtskontrastbildes
DE69905370T2 (de) Verfahren und vorrichtung zum trennen oder beschädigen von unerwünschtem pflanzenwuchs
US20200375172A1 (en) Device to detect and exercise control over weeds applied on agricultural machinery
DE102017221649A1 (de) Prüfverfahren zur Detektion von Oberflächenfehlern auf matten und glänzenden Flächen und zugehörige Vorrichtung sowie Prüfanordnung zwischen Vorrichtung und Bauteil
EP1565885B1 (fr) Procede pour saisir une propriete d'au moins un objet
DE19950396A1 (de) Vorrichtung und Verfahren zum Bestimmen des Pflanzenzustandes
DE102011118611A1 (de) Vorrichtung und Verfahren für eine halbautomatische Prüfstation
WO2021105017A1 (fr) Procédé de traitement de plantes dans un champ
DE102023103252B3 (de) Unkrautbekämpfungsmodul sowie Vorrichtung und Verfahren zur laserbasierten Unkrautbekämpfung
EP4405917A1 (fr) Procédé d'identification d'au moins un objet
DE202011107932U1 (de) Vorrichtung für eine halbautomatische Prüfstation
DE102014226291A1 (de) Vorrichtung und Verfahren zum Warnen vor Oberflächenschäden an Fahrzeugen
EP1004240B1 (fr) Dispositif de reconnaissance de plantes et/ou de formes apliquées en agriculture
EP2327043B1 (fr) Dispositif pour enregistrer des données biométriques
DE60028731T2 (de) Verfahren und vorrichtung zur erfassung von rissen in gegenständen aus durchsichtigem oder lichtdurchlässigem material
EP1483951B1 (fr) Procédé et appareil pour déterminer la demande d'engrais dans les jardins
EP3688660A1 (fr) Procédé de détection d'un bord de feuille, procédé pour le traitement ciblé de plantes au moyen d'un agent foliaire et utilisation d'un capteur d'images à base d'évènements pour la détection d'un bord de feuille
EP4064818B1 (fr) Procédé de traitement de plantes dans un champ
AT524838B1 (de) Totalreflexionslinse
DE102022131973A1 (de) Arbeitsmaschine zum Spritzen von Nutzpflanzen

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20240229

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR