EP3152498B1 - Cooking device with light pattern projector and camera - Google Patents

Cooking device with light pattern projector and camera

Info

Publication number
EP3152498B1
Authority
EP
European Patent Office
Prior art keywords
light pattern
cooking
cooking appliance
camera
food
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP15727630.4A
Other languages
German (de)
French (fr)
Other versions
EP3152498A1 (en)
Inventor
Sebastian Erbe
Robert KÜHN
Dan Neumayer
Daniel Vollmar
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BSH Hausgeraete GmbH
Original Assignee
BSH Hausgeraete GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BSH Hausgeraete GmbH filed Critical BSH Hausgeraete GmbH
Priority to PL15727630T priority Critical patent/PL3152498T3/en
Publication of EP3152498A1 publication Critical patent/EP3152498A1/en
Application granted granted Critical
Publication of EP3152498B1 publication Critical patent/EP3152498B1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24HEATING; RANGES; VENTILATING
    • F24CDOMESTIC STOVES OR RANGES ; DETAILS OF DOMESTIC STOVES OR RANGES, OF GENERAL APPLICATION
    • F24C7/00Stoves or ranges heated by electric energy
    • F24C7/08Arrangement or mounting of control or safety devices
    • F24C7/081Arrangement or mounting of control or safety devices on stoves
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24HEATING; RANGES; VENTILATING
    • F24CDOMESTIC STOVES OR RANGES ; DETAILS OF DOMESTIC STOVES OR RANGES, OF GENERAL APPLICATION
    • F24C7/00Stoves or ranges heated by electric energy
    • F24C7/08Arrangement or mounting of control or safety devices
    • F24C7/082Arrangement or mounting of control or safety devices on ranges, e.g. control panels, illumination
    • F24C7/085Arrangement or mounting of control or safety devices on ranges, e.g. control panels, illumination on baking ovens

Definitions

  • the invention relates to a cooking appliance, comprising a cooking chamber with a loading opening that can be closed by means of a door, a light pattern projector fixed in relation to the cooking chamber for generating a light pattern, a camera for taking pictures from an area that can be irradiated by the light pattern, and an evaluation device coupled to the camera for calculating, by means of a light pattern evaluation, a three-dimensional shape of at least one object which is located in the area that can be irradiated by the light pattern.
  • the invention is particularly advantageously applicable to ovens.
  • the invention is particularly advantageously applicable to household appliances.
  • EP 2 530 387 A1 discloses an oven with a device for detecting a three-dimensional shape of food on a baking tray of the oven.
  • the device contains at least one laser which is arranged or can be arranged over a cooking space of the oven. A laser beam from the laser is directed downwards.
  • the device further comprises at least one camera which is arranged or can be arranged above a baking sheet of the oven. The camera is arranged or can be arranged in a front section of the oven.
  • the baking tray and the camera are mechanically coupled so that the camera and the baking tray can be moved synchronously.
  • An upper side of the baking sheet is in a field of view of the camera.
  • An angle between a central axis of a field of view of the camera and the laser beam is predetermined.
  • the device for detecting the three-dimensional shape of the food on the baking sheet is also disclosed.
  • EP 2 149 755 A1 discloses an oven for heating food products, comprising a cooking chamber for receiving the product via a loading opening and a product feature extraction system configured to extract at least one product feature representative of a configuration of the product, the system comprising: at least one camera, which is designed and arranged to record top views of the product, at least one contour plane unit for extracting or highlighting contour planes of at least a portion of the product and, as the case may be, of an object that is intended to be introduced into the cooking chamber together with the product, and a product feature extraction unit for extracting the at least one product feature on the basis of the top views of the product and the contour planes of the product.
  • a method for operating an oven for heating a food product comprises the steps: a) extracting a product feature of a product that is to be heated in a chamber of the oven by recording at least one top view of the product using at least one camera and by extracting and/or highlighting contour planes of at least one section of the product and, as the case may be, of an object that is intended to be introduced into the cooking chamber together with the product, using at least one contour plane unit, and b) extracting the at least one product feature on the basis of the top views and contour planes, based on at least one product feature and optionally on secondary data representing a physical configuration of the product, preferably at least one of a product temperature, a product weight and a product density, for automatic control or for heating of the product.
  • EP 1 921 384 A1 discloses a device for determining the temperature inside a product to be cooked.
  • the device has at least one temperature sensor for detecting at least one surface temperature of the item to be cooked and / or an ambient temperature of the item to be cooked, in particular at a measuring location within a cooking space surrounding the item to be cooked, preferably with an ambient temperature sensor arranged at the measuring location.
  • the device comprises at least one distance sensor for detecting one or a plurality of distances between the distance sensor on the one hand and one or a plurality of distance measuring points on the surface of the food to be cooked on the other hand.
  • the device comprises at least one time measuring device for recording the time during preparation of the food and at least one calculation device for calculating the temperature inside the food from the surface temperature of the food and/or the ambient temperature, the distance or the plurality of distances, the time and a starting temperature of the food.
  • a method for determining the temperature inside a product to be cooked is also disclosed.
  • DE 197 48 062 A1 discloses a method and a device for the three-dimensional, optical measurement of objects. According to this document, the measuring system must be calibrated in the case of optical, areally working, three-dimensional measuring methods, since the geometric parameters of the system must be known in order to carry out the triangulation calculation. After calibration, the lenses must not be adjusted any further, since this would change the imaging errors of the optics in an uncontrollable manner. The method enables the measuring system to be set to a different measuring field size even after calibration.
  • by determining the inner bundles of rays of the projector and the camera by means of a device which simultaneously serves for focusing on a wide variety of measuring distances, the measuring system is adapted to different measuring field sizes in such a way that the geometric changes made to the system can be determined precisely and the parameters that are decisive for the triangulation can be calculated without recalibration.
  • the calibration now takes place with a measurement field size which is selected solely from the point of view of the ease of manufacture of the calibration device and easily manageable dimensions. Once the system has been calibrated, it can then be set to the most varied, in particular very large, measurement distances and measurement volumes. An application to household appliances or cooking appliances is not disclosed.
  • WO 00/70303 discloses a method and an apparatus for the imaging of three-dimensional objects, comprising a structural light source which projects a focused image onto an object by passing light either continuously or in a stroboscopic manner through an optical grating and a downstream projection lens. An application to household appliances or cooking appliances is not disclosed.
  • DE 10 2006 005 874 A1 discloses a device and a method for contactless measurement of, in particular, cylindrical objects on surfaces. To this end, it is proposed to use a laser to generate a line on the surface, the reflection of which is measured by a camera. After the line has been recorded, it is shifted parallel to itself several times and the recording is repeated. In this way, a shadow image of the object arranged on the surface is generated by successively displacing the line. It is also possible to separate the multiline triangulation and the shadow formation from one another.
  • a stationary laser or another radiation source can be used for multiline triangulation.
  • the shadow formation can be carried out by two likewise stationary radiation sources, for example a row of LEDs, simultaneously or one after the other. The use of a stationary arrangement of radiation sources and camera makes the mechanical structure simpler and cheaper. An application to household appliances or cooking appliances is not disclosed.
  • WO 2010/102261 A1 discloses a food treatment device in which food located in the device is treated by radiation.
  • the food treatment device has a camera for image recognition.
  • a type of food (e.g. pizza) is preset, and the image recognition can identify, for example, its location, occupancy, etc.
  • a light pattern can also be used to identify the height or other expansion of the pizza.
  • a cooking appliance having a cooking chamber with a loading opening that can be closed by means of a door, (at least) one projector fixed in relation to the cooking chamber (referred to below as a "light pattern projector" without restricting generality) for generating a light pattern, (at least) one camera for recording images from an area that can be irradiated by the light pattern, and an evaluation device coupled to the camera for calculating, by means of a light pattern evaluation, a three-dimensional shape of at least one object that is located in the area that can be irradiated by the light pattern, wherein the light pattern projector is arranged to radiate a light pattern into the cooking chamber, the camera is in particular arranged fixedly with respect to the cooking chamber, the camera is arranged for taking pictures from an area of the cooking chamber that can be irradiated by the light pattern even when the cooking chamber is closed, and the evaluation device is set up for repeated calculation of the three-dimensional shape of the at least one object located in the region of the cooking chamber that can be irradiated by the light pattern during operation of the cooking appliance.
  • the cooking appliance has the advantage that the depth information can be used as a parameter for automatic programs.
  • a possible change in volume of the item to be cooked during the cooking process when the cooking appliance is in operation can thus be recorded and a type of item to be cooked can be determined.
  • the control of the cooking parameters can be influenced in this way, for example the cooking space temperature. For example, a rising behavior of a bread and a shrinking behavior of a piece of meat may be recorded and possibly used to control the cooking appliance.
  • the basically known method of patterned or structured light ("structured light") is used in particular.
  • a defined light pattern is projected onto the object to be recorded or measured by means of the light pattern projector and recorded by the camera.
  • a three-dimensional model of this object can be calculated by means of the evaluation device.
  • a depth resolution of a specific image point depends on the angle between a light beam for generating this image point and a normal vector to a plane or to an optical axis of the camera. A theoretical optimum resolution would be at the largest possible angle.
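  • a minimal illustrative sketch of this relationship is given below, assuming a simplified geometry in which the camera looks perpendicularly onto a reference plane (such as an empty food support) and the projection direction is inclined by an angle alpha to the camera axis; the function name and all numerical values are assumptions chosen only for illustration:

```python
import math

def height_from_stripe_shift(shift_px: float, px_per_mm: float, alpha_deg: float) -> float:
    """Height of a surface point above a reference plane (e.g. an empty food
    support), estimated from how far the projected stripe appears shifted in
    the camera image compared with its position on the empty plane.

    shift_px   -- lateral displacement of the stripe in pixels
    px_per_mm  -- image scale at the reference plane (pixels per millimetre)
    alpha_deg  -- angle between the projection direction and the camera axis
    """
    shift_mm = shift_px / px_per_mm
    return shift_mm / math.tan(math.radians(alpha_deg))

# Example: a shift of 14 px at 2 px/mm and a 25 degree projection angle
# corresponds to roughly 15 mm of object height.
print(round(height_from_stripe_shift(14.0, 2.0, 25.0), 1))
```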
  • the cooking appliance may be or have an oven, in particular a baking oven.
  • the cooking space may then also be referred to as the oven space.
  • the oven may be a stand-alone oven or it may be part of an oven / hob combination or range.
  • the oven may have a microwave and / or steam treatment functionality.
  • the cooking appliance is a household appliance, in particular in the sense of “white goods”.
  • the light pattern projector, together with the camera and the evaluation device, may also be referred to as a so-called "3D scanner".
  • the light pattern projector emits at least one light pattern, for example a stripe and/or dot pattern, but is not limited thereto. Any other light patterns can also be generated, for example ring-shaped patterns, wave patterns, etc.
  • a pattern is selected in particular in such a way that it matches the desired resolution of the three-dimensional image.
  • the camera may be a digital camera. It may take individual pictures and / or picture sequences, especially videos.
  • the evaluation device may be an independent device of the cooking appliance, e.g. in the form of electronics, in particular on a separate circuit board. Alternatively, it may be integrated into a further device of the cooking appliance, e.g. into a central control device. This further device may then also be able to carry out the evaluation in particular.
  • an optical axis of the light pattern projector and an optical axis of the camera are at an angle between 20 ° and 30 ° to one another. This achieves good visibility of the projected light pattern with good depth resolution and consequently a particularly reliable determination of the three-dimensional shape of the at least one object.
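  • under the simplified stripe geometry sketched above, the pattern displacement per millimetre of object height grows with tan(alpha), so the short illustrative table below indicates why an angle between 20° and 30° is a reasonable compromise between depth resolution and pattern visibility (the listed angles are examples only):

```python
import math

# Stripe displacement on the reference plane per millimetre of object height
# equals tan(alpha). Larger angles give finer depth resolution, but the
# pattern becomes harder to see (more occlusion and shadowing), which is why
# an angle between 20 and 30 degrees is a practical compromise.
for alpha_deg in (10, 20, 25, 30, 45):
    print(alpha_deg, round(math.tan(math.radians(alpha_deg)), 3))
```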
  • the light pattern projector and the camera are arranged behind a wall or muffle of the cooking chamber, in particular at a predefined distance. In this way, these two components can be sufficiently thermally insulated from the cooking space.
  • the cooking chamber wall may have a respective window for the light pattern projector and for the camera.
  • the windows may be covered with transparent glass.
  • the light pattern projector and the camera are arranged behind a ceiling of the cooking space.
  • a food support (e.g. a baking sheet or a wire rack) can thus be irradiated and imaged essentially from above.
  • This in turn enables particularly precise images and measurements to be generated.
  • This position has the further advantage that cooling air conducted over the ceiling (eg for cooling electronics arranged above the cooking chamber) can also be used to cool the light pattern projector and the camera.
  • the distance between the light pattern projector and the camera behind the cooking chamber wall or muffle can be muffle-specific.
  • different light patterns can be radiated into the cooking space by means of the light pattern projector.
  • the three-dimensional shape of the at least one object can be determined with a particularly small error.
  • alternating point-like and strip-like light patterns can be irradiated and evaluated.
  • Different dot patterns and / or different stripe patterns can also be radiated into the cooking space. This can take place in a predetermined sequence or if a measured depth resolution does not produce sufficient results.
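  • a purely hypothetical control sequence for such alternating patterns could look as follows; the pattern names and the error threshold are assumptions used only to illustrate repeating the measurement with a different pattern when the depth resolution is insufficient:

```python
# Hypothetical scan controller: alternate dot and stripe patterns and stop
# as soon as the estimated depth error of the reconstruction is small enough.
PATTERNS = ["dots_coarse", "stripes_coarse", "dots_fine", "stripes_fine"]

def run_scan_sequence(project_and_measure, max_error_mm: float = 2.0):
    """project_and_measure(pattern_name) -> (height_map, estimated_error_mm);
    this callable stands in for projector, camera and light pattern evaluation."""
    best = None
    for pattern in PATTERNS:
        height_map, error_mm = project_and_measure(pattern)
        if best is None or error_mm < best[1]:
            best = (height_map, error_mm)
        if error_mm <= max_error_mm:          # resolution sufficient, stop early
            break
    return best

# Dummy measurement function used only to show the control flow.
demo_results = iter([({}, 5.0), ({}, 1.5)])
print(run_scan_sequence(lambda pattern: next(demo_results))[1])   # -> 1.5
```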
  • the light pattern projector has at least one pixel-like screen or display for shaping the light pattern.
  • the pixel-like screen may, for example, be a liquid crystal display (LCD) screen.
  • the pixel-like screen may itself generate light as a structural unit in order to adequately illuminate the cooking space with the light pattern.
  • the pixel-like screen may, for example, also be backlit by at least one separate light source, so that it can be used as a 'variable diaphragm'. The latter case enables particularly high luminous fluxes.
  • the light emitted by the light pattern projector and received by the camera may be visible light and / or infrared light.
  • the advantage of infrared light is that an observer looking into the cooking space cannot see the light pattern.
  • the 3D scanner can be calibrated.
  • At least one calibration marking is located on a food support, for example on a baking sheet or a grate etc.
  • it may be located on a surface of the food support that is used for placing or holding food.
  • this calibration mark may in particular have a known size, so that a distance to the camera can be determined from the size recorded by the camera.
  • a calibration marking may be, for example, a colored marking and/or a marking of predetermined shape.
  • the calibration marks can also be defined geometrical features, e.g. functional areas of the cooking chamber wall or muffle, such as insertion protrusions.
  • the calibration mark (s) can also serve to determine the insertion level on which the object to be measured is located.
  • the calibration preferably takes place in the closed muffle or with the cooking space closed, in particular at the start of a cooking process. This minimizes any influence on the measurement from the environment.
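  • a minimal sketch of how a calibration mark of known size might be converted into a camera distance and an insertion level is given below, assuming a simple pinhole-camera model; the focal length, mark size and level distances are illustrative assumptions:

```python
def distance_from_mark(mark_size_mm: float, mark_size_px: float, focal_px: float) -> float:
    """Pinhole-camera estimate of the distance between the camera and a
    calibration mark of known physical size: Z = f * S / s."""
    return focal_px * mark_size_mm / mark_size_px

def insertion_level(distance_mm: float, levels_mm: dict) -> str:
    """Pick the insertion level whose nominal distance from the camera is closest."""
    return min(levels_mm, key=lambda name: abs(levels_mm[name] - distance_mm))

# Illustrative values: a 40 mm mark imaged at 80 px with f = 800 px lies about
# 400 mm below the camera, i.e. closest to the lowest level in this example.
z = distance_from_mark(40.0, 80.0, 800.0)
print(z, insertion_level(z, {"level_1": 420.0, "level_2": 300.0, "level_3": 180.0}))
```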
  • there are preferred insertion levels in the cooking space for the 3D measurement ("3D scan"). They are preferably located in a lower third of the muffle. This has the advantage that large-area and/or large-volume objects can also be reliably identified and measured.
  • the 3D measurement of the object advantageously takes place after the calibration. In principle, however, calibration can also be dispensed with.
  • the cooking appliance is equipped with an insertion level detection.
  • the cooking appliance may then, when it detects that a food carrier is located on an insertion level that is unfavorable for a 3D measurement, output a warning signal and/or an indication to a user.
  • the cooking appliance may then also prevent a 3D scan.
  • the evaluation device is set up to recognize a type of item to be cooked on the basis of its shape and shape change calculated by the light pattern evaluation.
  • This enables, among other things, an automatic adaptation of cooking parameters to the food (e.g. within the framework of a cooking program) and / or an adaptation of user guidance to the food (e.g. by displaying cooking parameters and / or cooking programs suitable for the recognized food).
  • the cooking appliance is set up to recognize the food to be cooked on the basis of an image evaluation of images recorded by the camera (without 3D measurement, also referred to below as "image recognition") and of the 3D scan. Recognition of the food to be cooked on the basis of a combined image recognition and 3D measurement enables a higher recognition probability through the additional height and depth information. This information may, for example, be fed into an image recognition algorithm.
  • the evaluation device is set up to recognize a type of food carrier, e.g. whether the food is on a wire rack or on a baking sheet.
  • This information can be used by the cooking appliance, for example, to set or adapt cooking parameters, for example a heating output of upper and / or lower heating or an activation and / or setting of a heating output of a circulating air heater.
  • the evaluation device is set up to recognize a type of accessory, in particular cooking utensils, for example a roasting pan or the like placed on a food support, in which the food is located. It may thus be measurable whether the food to be cooked is in an open roaster or whether the roaster is closed. This information may also be used by the cooking appliance to set or adapt cooking parameters, possibly including the selection of the cooking method used. If the roaster is closed, the cooking appliance may depend on input from the user about the type of its contents.
  • the evaluation device is set up to detect a core temperature of an object.
  • the core temperature can be calculated from a correlation with a change in volume determined by the 3D scan during a cooking process, with knowledge of the type of food being cooked. A separate core temperature probe or roast skewer is then not needed. For a particularly reliable determination of the core temperature, it is preferred that the food to be cooked has an almost homogeneous structure.
  • the core temperature is thus determined by means of a 3D scan and not, as described in EP 1 921 384 A1, by means of one or more distance sensors.
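  • a rough sketch of such a correlation-based estimate is given below; the correlation table linking relative volume change to core temperature is invented purely for illustration and would in practice have to be determined per type of food:

```python
from bisect import bisect_left

# Purely illustrative correlation for one hypothetical, nearly homogeneous
# food type: relative volume loss dV/V0 -> core temperature in deg C.
SHRINKAGE_TO_CORE_TEMP = [(0.00, 20.0), (0.03, 45.0), (0.06, 60.0),
                          (0.10, 72.0), (0.15, 82.0)]

def core_temp_from_volume(v_start: float, v_now: float) -> float:
    """Interpolate the core temperature from the relative volume change
    measured by repeated 3D scans during the cooking process."""
    shrink = max(0.0, (v_start - v_now) / v_start)
    xs = [s for s, _ in SHRINKAGE_TO_CORE_TEMP]
    ts = [t for _, t in SHRINKAGE_TO_CORE_TEMP]
    if shrink >= xs[-1]:
        return ts[-1]
    i = bisect_left(xs, shrink)
    if xs[i] == shrink:
        return ts[i]
    f = (shrink - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ts[i - 1] + f * (ts[i] - ts[i - 1])

print(round(core_temp_from_volume(1000.0, 920.0), 1))   # 8 % volume loss -> 66.0
```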
  • the evaluation device is coupled to a control device of the cooking device and the control device is set up to adapt an operation of the cooking device on the basis of at least one object parameter determined by the evaluation device.
  • the associated object can be food, an accessory and / or a food carrier.
  • Object parameters can e.g. be a position, shape, volume or type, etc., of the object.
  • the 3D information determined by one or more 3D scans can be used in particular for the automation of cooking, cooking and baking processes. As already mentioned, it is possible to carry out the 3D scan during a cooking process. The 3D information or 3D data that is then determined is not only used to identify the food being cooked, but also to adapt the cooking parameters if necessary.
  • a cooking process or cooking process can thus be individually tailored to the food being cooked. If an exact detection of the food to be cooked is not possible, inputs from a user can in particular be taken into account. For this purpose, the cooking appliance may request the user to enter further information about the food to be cooked into the cooking appliance.
  • the cooking appliance has a screen on which at least one three-dimensional image of at least one object recorded by the camera can be displayed.
  • a three-dimensional representation of the contents of the cooking space is offered on a screen. This enables a particularly informative presentation of information for a user.
  • the screen may e.g. be present on a front side or a top side of the cooking appliance.
  • the screen may be a touch-sensitive sensor screen or touch screen.
  • One way of operating the cooking device is to carry out a calibration before starting a food treatment (e.g. a cooking process). Furthermore, shortly before the start of a food treatment, a 3D measurement or a 3D scan of the food to be cooked may be carried out in order to record its initial geometry. An object detection with regard to a type of item to be cooked, a type of an accessory and / or a type of a food carrier may also be carried out before the start of a food treatment.
  • At least one 3D scan may also take place during the food treatment, in particular several 3D scans at, for example, periodic intervals.
  • a product to be cooked is identified and, for example, an end of treatment and/or a core temperature can be determined.
  • the light pattern projector is also provided for illuminating the cooking space. For example, it may illuminate the cooking space for viewing by a user and only shine the light pattern into the cooking space for comparatively short periods in between. This means that there is no need for a separate light source for lighting the cooking space.
  • Fig. 1 shows an arrangement ("3D scanner") for determining a three-dimensional shape of at least one object O, having a light pattern projector 1 directed at the object O, a camera 2 directed at the object O, and a control device C for operating the light pattern projector 1 and for calculating a three-dimensional shape of the object O, by means of a light pattern evaluation, on the basis of at least one image received by the camera 2.
  • a screen 3 for viewing the object O′ calculated by the control device C is optionally present.
  • the light pattern projector 1 generates a predetermined light pattern L, e.g. a stripe or dot pattern.
  • the light pattern projector 1 emits its light in a light beam with a first optical axis A1.
  • the camera 2, typically a digital camera, has a field of view F with a second optical axis A2, which is oriented at an angle to the first optical axis A1 of the light pattern projector 1. In other words, the camera 2 is oriented at an angle to the light pattern projector 1. It observes a region of the object O that is or can be irradiated by the light pattern L.
  • Fig. 2 shows a sketch of a reconstruction of a shape of the object O measured by means of the 3D scanner 1, 2, C.
  • the light pattern projector 1 here has a light source Q, e.g. an array of light-emitting diodes, which is followed by a pattern-generating element in the form of a freely programmable LCD surface D through which the light can be radiated. Depending on the pattern M generated on the LCD surface D, a corresponding, in particular complementary, light pattern L is emitted from the LCD surface D.
  • an LED screen may already serve as the light source (not shown), in which case a separate light source can then be dispensed with because of the background lighting (“backlighting”) integrated therein.
  • in Fig. 2 it is shown by way of example how light in the form of a vertical column or line G is radiated from the light pattern projector 1 onto the object O.
  • a projection P(G) of this line G, which is distorted by the surface contour of the object O, appears on the object O.
  • the camera 2 takes an image of this projection P (G), which reproduces the distortion, due to its inclination with respect to the light pattern projector 1.
  • the camera 2 stores the projection P (G) as appropriately positioned image points B or "pixels" of a matrix that results from a matrix-like structure of individual sensors of a sensor array S of the camera, for example a CCD sensor array.
  • the height or depth information is given by the deviation of the image points B from a vertical line.
  • the depth resolution depends on an angle W between the light beam r leading to the image point B and the direction of the column or line G.
  • a theoretical optimum in terms of resolution would be at the largest possible angle W.
  • the visibility of the projection P (G) on the surface of the object O and thus its detectability in the camera image deteriorate with increasing approach to this optimum. Since a reconstruction is only possible for those points on the surface of the object O which on the one hand are visible from the camera 2 and on the other hand can be illuminated by the light pattern projector 1, a compromise is made here.
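  • this reconstruction step can be illustrated with a minimal sketch that searches each image row for the brightest pixel, takes its deviation from the undistorted line G and converts it into a height using the simplified stripe geometry assumed above; the image data and constants are illustrative only:

```python
import math

def stripe_profile(image, ref_col: int, px_per_mm: float, alpha_deg: float):
    """image: 2D list of brightness values (rows x columns). Returns one height
    value per row, taken from the deviation of the brightest pixel B in that
    row from the undistorted line at column ref_col."""
    heights = []
    for row in image:
        col = max(range(len(row)), key=lambda c: row[c])   # stripe position in this row
        deviation_px = col - ref_col
        heights.append((deviation_px / px_per_mm) / math.tan(math.radians(alpha_deg)))
    return heights

# Tiny synthetic image: the stripe sits at column 2 on the empty support and
# at column 4 where the object lifts it.
img = [[0, 0, 9, 0, 0],
       [0, 0, 1, 0, 9],
       [0, 0, 9, 0, 0]]
print([round(h, 1) for h in stripe_profile(img, ref_col=2, px_per_mm=1.0, alpha_deg=25.0)])
```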
  • Such a 3D measurement or 3D scan is known in principle and is therefore not discussed further below.
  • Fig. 3 shows a sectional side view of a cooking appliance in the form of an oven 4 equipped with a 3D scanner.
  • the oven 4 has a cooking space 6 delimited by an oven muffle 5.
  • the oven muffle 5 has at the front a loading opening 8 which can be closed by means of an oven door 7 and through which objects, in particular in the form of food O1, can be brought into the cooking space 6.
  • a cooking space temperature T can be set by means of one or more, in particular electrically operated, heating elements (not shown).
  • in a ceiling 9 of the oven muffle 5 there are two viewing windows 10 and 11, which can be covered, for example, with transparent glass panes.
  • a light pattern projector 1 (e.g. with an LCD display for pattern generation) is arranged behind the viewing window 10 and a camera 2 behind the viewing window 11; their distance from the oven muffle 5 provides thermal protection.
  • a flow of cooling air may sweep over the ceiling 9, for example for cooling components arranged there, such as a control device.
  • the light pattern projector 1 and the camera 2 can also be further cooled by this cooling air flow.
  • the light pattern projector 1 and the camera 2 are arranged laterally offset from one another.
  • their optical axes A1 and A2 enclose an angle of between 20° and 30°, which enables a high depth resolution with good visibility.
  • the light pattern projector 1 and the camera 2 are fixedly arranged in relation to the cooking space 6 and therefore do not move with it, for example when the oven door 7 is actuated.
  • the light pattern projector 1 emits a light pattern L through the viewing window 10 into the cooking space 6, in such a way that, from a predetermined distance from the ceiling 9, practically the entire horizontal surface of the cooking space 6 can be illuminated with the light pattern L. This may e.g. be the case in a lower half or in a lower third of the cooking space 6.
  • the camera 2 records images from an area of the cooking chamber 6 that can at least partially be irradiated by the light pattern L.
  • the oven 4 also has an evaluation device 12 coupled to the camera 2 for calculating a three-dimensional shape, for example of the food O1 and a food carrier O2, which are located in the area that can be irradiated by the light pattern L, by means of a light pattern evaluation. This is based on a 3D measurement based on at least one image recorded by the camera 2.
  • the light pattern projector 1, the camera 2 and the evaluation device 12 together form the 3D scanner.
  • the evaluation device 12 may be functionally integrated into a central control device of the baking oven 4 or it may be coupled to a control device as an independent unit.
  • a 3D scan comprising e.g. a recording of an image of the projection P (G) of the light pattern L by means of the camera 2 and a calculation of the three-dimensional shape of the food O1 and the food support O2 can be carried out with the oven door 7 or loading opening 8 closed.
  • the food carrier O2 has one or more, for example colored, calibration markings K on its upper side, which have a known size and can be easily identified.
  • the size of the calibration mark(s) K recorded by the camera 2 allows a distance from the ceiling 9 and thus, for example, the insertion level used to be identified.
  • the oven 4 may output a message to a user, e.g. as an indication on a front screen 3 of a control panel 13. At least one calibration mark may also be located on the oven muffle 5.
  • an initial 3D measurement of the item to be cooked O1 may be carried out by means of the 3D scanner 1, 2, 12 in order to calculate its original shape.
  • the calculated shape may be displayed on the screen 3.
  • the calculated shape is used by the cooking appliance 4 to determine the item to be cooked O1, in particular together with an image recognition of the item to be cooked O1, which can also be carried out by means of the camera 2.
  • a type of food carrier O2 may also be recognized by means of the 3D scanner 1, 2, 12.
  • One or more cooking parameters of the cooking process may be adapted on the basis of a recognition of the food O1 and possibly the carrier O2.
  • a cooking time and/or the cooking space temperature T may thus be adapted on the basis of the recognized type and/or the recognized volume of the item to be cooked and/or the recognized food carrier O2.
  • 3D measurements can be carried out repeatedly in order to determine a change in shape and / or volume of the item to be cooked O1.
  • the oven 4 may adapt the cooking sequence, e.g. change the cooking time and / or the cooking space temperature, including switching off the heaters.
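  • a simplified sketch of such a control step is shown below; the volume integration over a height map follows directly from the 3D data, while the adaptation thresholds and the temperature offset are purely illustrative assumptions:

```python
def volume_from_height_map(heights_mm, cell_area_mm2: float) -> float:
    """Approximate object volume in cubic millimetres by summing height samples
    times the ground area that each sample covers."""
    return sum(h * cell_area_mm2 for row in heights_mm for h in row)

def adapt_cooking(v_start: float, v_now: float, set_temp: float) -> tuple:
    """Very rough control rule with illustrative thresholds: a strong rise of a
    dough suggests lowering the temperature slightly, strong shrinkage of a
    roast suggests ending the cooking process (second value: heaters off)."""
    change = (v_now - v_start) / v_start
    if change > 0.30:                 # e.g. bread has risen a lot
        return set_temp - 10.0, False
    if change < -0.20:                # e.g. meat has shrunk a lot
        return set_temp, True
    return set_temp, False

v0 = volume_from_height_map([[10.0, 12.0], [11.0, 9.0]], cell_area_mm2=25.0)
v1 = volume_from_height_map([[14.0, 16.0], [15.0, 13.0]], cell_area_mm2=25.0)
print(v0, v1, adapt_cooking(v0, v1, 180.0))   # -> 1050.0 1450.0 (170.0, False)
```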
  • the light pattern projector 1 may project different light patterns L into the cooking space 6.
  • Fig. 4 shows, by way of example, a diagram of a profile of a temperature T and of a volume V of the item to be cooked O1 during a determination of the item to be cooked by means of the cooking appliance 4.
  • the food O1 to be cooked is, purely by way of example, a dough product, for example a round tarte flambée or a sponge cake in a springform pan.
  • these two types of food O1 cannot be distinguished by means of image recognition by the camera 2 alone, because in a two-dimensional view from above (top view) both the tarte flambée and the sponge cake look circular. In addition, both are similar in color. However, both types of food require a different specific baking environment. If the tarte flambée is treated like the sponge cake, the result is unsatisfactory, which also applies to the reverse case.
  • through the 3D measurement by means of the 3D scanner 1, 2, 12, the cooking appliance 4 also receives spatial information about the food to be cooked O1.
  • this spatial information on the initial state of the item to be cooked O1 before the cooking process may already be sufficient to distinguish the flat tarte flambée from the higher sponge cake.
  • the type of item to be cooked O1 is determined by means of the 3D scanner 1, 2, 12 from the change in its shape, in particular a change ΔV in its volume V.
  • the cooking space temperature T has an initial value Ts at an initial point in time ts of the cooking process, e.g. room temperature.
  • the cooking space temperature T increases due to at least one activated heater, for tarte flambée and sponge cake alike, as shown by the curve T1 + T2. If the cooking space temperature T reaches, at a point in time td, a target temperature Td1 for the sponge cake, which is below a target temperature Td2 for the tarte flambée, a further 3D measurement is carried out in order to determine the type of item O1 to be cooked.
  • if the item to be cooked O1 is recognized as a tarte flambée, the cooking appliance 4 can then, for example, increase its cooking space temperature T to the associated setpoint Td2, as indicated by the temperature curve T2.
  • the cooking process ends at an associated end time te2.
  • if the item to be cooked O1 is instead recognized as a sponge cake, the cooking appliance 4 may keep its cooking space temperature T at the associated setpoint Td1, as indicated by the temperature curve T1. The cooking process then ends at an associated end time te1.
  • a clear distinguishing feature for identifying the food to be cooked can therefore be provided.
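  • the distinction discussed for Fig. 4 can be illustrated with a small decision sketch; the height and volume-change thresholds as well as the target temperatures standing in for Td1 and Td2 are invented for illustration only:

```python
def classify_round_dough(initial_height_mm: float, rel_volume_change: float) -> str:
    """Illustrative decision rule: a tarte flambee stays flat and barely rises,
    a sponge cake is higher and/or rises noticeably during baking."""
    if initial_height_mm < 15.0 and rel_volume_change < 0.10:
        return "tarte_flambee"
    return "sponge_cake"

# Illustrative target cooking-space temperatures corresponding to Td2 / Td1.
TARGET_TEMP = {"tarte_flambee": 250.0, "sponge_cake": 170.0}

kind = classify_round_dough(initial_height_mm=8.0, rel_volume_change=0.02)
print(kind, TARGET_TEMP[kind])   # -> tarte_flambee 250.0
```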

Description

The invention relates to a cooking appliance, comprising a cooking chamber with a loading opening that can be closed by means of a door, a light pattern projector fixed in relation to the cooking chamber for generating a light pattern, a camera for taking pictures from an area that can be irradiated by the light pattern, and an evaluation device coupled to the camera for calculating, by means of a light pattern evaluation, a three-dimensional shape of at least one object which is located in the area that can be irradiated by the light pattern. The invention is particularly advantageously applicable to ovens. The invention is particularly advantageously applicable to household appliances.

EP 2 530 387 A1 discloses an oven with a device for detecting a three-dimensional shape of food on a baking tray of the oven. The device contains at least one laser which is arranged or can be arranged over a cooking space of the oven. A laser beam from the laser is directed downwards. The device further comprises at least one camera which is arranged or can be arranged above a baking sheet of the oven. The camera is arranged or can be arranged in a front section of the oven. The baking tray and the camera are mechanically coupled so that the camera and the baking tray can be moved synchronously. An upper side of the baking sheet is in a field of view of the camera. An angle between a central axis of a field of view of the camera and the laser beam is predetermined. The device for detecting the three-dimensional shape of the food on the baking sheet is also disclosed.

EP 2 149 755 A1 discloses an oven for heating food products, comprising a cooking chamber for receiving the product via a loading opening and a product feature extraction system configured to extract at least one product feature representative of a configuration of the product, the system comprising: at least one camera, which is designed and arranged to record top views of the product, at least one contour plane unit for extracting or highlighting contour planes of at least a portion of the product and, as the case may be, of an object that is intended to be introduced into the cooking chamber together with the product, and a product feature extraction unit for extracting the at least one product feature on the basis of the top views of the product and the contour planes of the product.

A method is used for operating an oven for heating a food product, comprising the steps: a) extracting a product feature of a product that is to be heated in a chamber of the oven by recording at least one top view of the product using at least one camera and by extracting and/or highlighting contour planes of at least one section of the product and, as the case may be, of an object that is intended to be introduced into the cooking chamber together with the product, using at least one contour plane unit, and b) extracting the at least one product feature on the basis of the top views and contour planes, based on at least one product feature and optionally on secondary data representing a physical configuration of the product, preferably at least one of a product temperature, a product weight and a product density, for automatic control or for heating of the product.

EP 1 921 384 A1 discloses a device for determining the temperature inside a product to be cooked. The device has at least one temperature sensor for detecting at least one surface temperature of the item to be cooked and/or an ambient temperature of the item to be cooked, in particular at a measuring location within a cooking space surrounding the item to be cooked, preferably with an ambient temperature sensor arranged at the measuring location. Furthermore, the device comprises at least one distance sensor for detecting one or a plurality of distances between the distance sensor on the one hand and one or a plurality of distance measuring points on the surface of the food to be cooked on the other hand. In addition, the device comprises at least one time measuring device for recording the time during preparation of the food and at least one calculation device for calculating the temperature inside the food from the surface temperature of the food and/or the ambient temperature, the distance or the plurality of distances, the time and a starting temperature of the food. A method for determining the temperature inside a product to be cooked is also disclosed.

DE 197 48 062 A1 discloses a method and a device for the three-dimensional, optical measurement of objects. According to this document, the measuring system must be calibrated in the case of optical, areally working, three-dimensional measuring methods, since the geometric parameters of the system must be known in order to carry out the triangulation calculation. After calibration, the lenses must not be adjusted any further, since this would change the imaging errors of the optics in an uncontrollable manner. The method enables the measuring system to be set to a different measuring field size even after calibration. By determining the inner bundles of rays of the projector and the camera by means of a device which simultaneously serves for focusing on a wide variety of measuring distances, the measuring system is adapted to different measuring field sizes in such a way that the geometric changes made to the system can be determined precisely and the parameters that are decisive for the triangulation can be calculated without recalibration. The calibration now takes place with a measurement field size which is selected solely from the point of view of the ease of manufacture of the calibration device and easily manageable dimensions. Once the system has been calibrated, it can then be set to the most varied, in particular very large, measurement distances and measurement volumes. An application to household appliances or cooking appliances is not disclosed.

WO 00/70303 discloses a method and an apparatus for the imaging of three-dimensional objects, comprising a structural light source which projects a focused image onto an object by passing light either continuously or in a stroboscopic manner through an optical grating and a downstream projection lens. An application to household appliances or cooking appliances is not disclosed.

DE 10 2006 005 874 A1 discloses a device and a method for contactless measurement of, in particular, cylindrical objects on surfaces. To this end, it is proposed to use a laser to generate a line on the surface, the reflection of which is measured by a camera. After the line has been recorded, it is shifted parallel to itself several times and the recording is repeated. In this way, a shadow image of the object arranged on the surface is generated by successively displacing the line. It is also possible to separate the multiline triangulation and the shadow formation from one another. A stationary laser or another radiation source can be used for multiline triangulation. The shadow formation can be carried out by two likewise stationary radiation sources, for example a row of LEDs, simultaneously or one after the other. The use of a stationary arrangement of radiation sources and camera makes the mechanical structure simpler and cheaper. An application to household appliances or cooking appliances is not disclosed.

WO 2010/102261 A1 discloses a food treatment device in which food located in the device is treated by radiation. The food treatment device has a camera for image recognition. A type of food (e.g. pizza) is preset, and the image recognition can identify, for example, its location, occupancy, etc. A light pattern can also be used to identify the height or other expansion of the pizza.

It is the object of the present invention to at least partially overcome the disadvantages of the prior art and specifically to provide a particularly versatile possibility for measuring food to be cooked.

This object is achieved according to the features of the independent claim. Preferred embodiments can be inferred in particular from the dependent claims.

The object is achieved by a cooking appliance having a cooking chamber with a loading opening that can be closed by means of a door, (at least) one projector fixed in relation to the cooking chamber (referred to below as a "light pattern projector" without restricting generality) for generating a light pattern, (at least) one camera for recording images from an area that can be irradiated by the light pattern, and an evaluation device coupled to the camera for calculating, by means of a light pattern evaluation, a three-dimensional shape of at least one object that is located in the area that can be irradiated by the light pattern, wherein the light pattern projector is arranged to radiate a light pattern into the cooking chamber, the camera is in particular arranged fixedly with respect to the cooking chamber, the camera is arranged for taking pictures from an area of the cooking chamber that can be irradiated by the light pattern even when the cooking chamber is closed, and the evaluation device is set up for repeated calculation of the three-dimensional shape of the at least one object located in the region of the cooking chamber that can be irradiated by the light pattern during operation of the cooking appliance. The cooking appliance has the advantage that the depth information can be used as a parameter for automatic programs. A possible change in volume of the item to be cooked during the cooking process when the cooking appliance is in operation can thus be recorded and a type of item to be cooked can be determined. In this way, the control of the cooking parameters can be influenced, for example the cooking space temperature. For example, a rising behavior of a bread and a shrinking behavior of a piece of meat may be recorded and possibly used to control the cooking appliance.

For the generation of the three-dimensional shape or the three-dimensional image of the region of the cooking chamber that can be irradiated by the light pattern, the basically known method of patterned or structured light ("structured light") is used in particular. A defined light pattern is projected onto the object to be recorded or measured by means of the light pattern projector and recorded by the camera. From the degree of deformation of the light pattern on the object, a three-dimensional model of this object can be calculated by means of the evaluation device. A depth resolution of a specific image point depends on the angle between a light beam for generating this image point and a normal vector to a plane or to an optical axis of the camera. A theoretical optimum of the resolution would lie at the largest possible angle. However, with increasing approach to this optimum, the visibility of the projected light pattern on the object surface, and thus its detectability by the camera, deteriorates. A position determination is only possible for those points which on the one hand are visible from the camera and on the other hand can be illuminated by the light pattern projector (i.e. do not lie in a shadow).

The cooking appliance may be or comprise an oven, in particular a baking oven. The cooking chamber may then also be referred to as an oven cavity. The oven may be a stand-alone oven or part of an oven/hob combination or range. In addition or as an alternative to being configured as an oven, the appliance may have a microwave and/or steam treatment functionality.

It is a further development that the cooking appliance is a household appliance, in particular in the sense of 'white goods'.

The light pattern projector, together with the camera and the evaluation device, may also be referred to as a so-called "3D scanner". The light pattern projector emits at least one light pattern, e.g. a stripe and/or dot pattern, but is not limited to these. Any other light patterns can also be generated, e.g. ring-shaped patterns, wave patterns, etc. A pattern is in particular selected such that it matches the desired resolution of the three-dimensional image.

The camera may in particular be a digital camera. It may record individual images and/or image sequences, in particular videos.

The evaluation device may be an independent device of the cooking appliance, e.g. in the form of electronics, in particular on a separate circuit board. Alternatively, it may be integrated into a further device of the cooking appliance, e.g. into a central control device. This further device may then in particular also carry out the evaluation.

It is one embodiment that an optical axis of the light pattern projector and an optical axis of the camera are at an angle of between 20° and 30° to one another. This achieves good visibility of the projected light pattern together with good depth resolution, and consequently a particularly reliable determination of the three-dimensional shape of the at least one object.

It is a further embodiment that the light pattern projector and the camera are arranged behind a wall or muffle of the cooking chamber, in particular at a predefined distance. In this way, these two components can be sufficiently thermally insulated from the cooking chamber. The cooking chamber wall may have a respective window for the light pattern projector and for the camera. The windows may be covered with transparent glass.

It is a further embodiment that the light pattern projector and the camera are arranged behind a ceiling of the cooking chamber. A food support (e.g. a baking tray or a wire rack) can then be fully illuminated and captured particularly easily, which in turn allows particularly precise images and measurements to be generated. This position has the further advantage that cooling air conducted over the ceiling (e.g. for cooling electronics arranged above the cooking chamber) can also be used to cool the light pattern projector and the camera. The distance of the light pattern projector and the camera behind the cooking chamber wall or muffle may be muffle-specific.

It is yet a further embodiment that different light patterns can be radiated into the cooking chamber by means of the light pattern projector. The three-dimensional shape of the at least one object can thereby be determined with a particularly small error. For example, dot-shaped and stripe-shaped light patterns can be projected and evaluated alternately. Different dot patterns and/or different stripe patterns can also be radiated into the cooking chamber. This can take place in a predetermined sequence or when a measured depth resolution does not yield sufficient results.
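
A minimal sketch of such a pattern schedule is shown below: it cycles through a predefined list of patterns and continues with the next pattern whenever the previous scan was too noisy. The names (`PATTERNS`, `scan_with`, `MAX_ERROR_MM`) and the simulated error values are illustrative placeholders, not part of the patent.

```python
import random

PATTERNS = ["stripes_coarse", "stripes_fine", "dots_coarse", "dots_fine"]
MAX_ERROR_MM = 0.5  # acceptable RMS reconstruction error (assumed value)

def scan_with(pattern_name):
    """Stand-in for one projection and evaluation cycle.
    Here it only simulates a point cloud and an RMS error in mm."""
    simulated_error = random.uniform(0.2, 1.5)
    return {"pattern": pattern_name, "points": []}, simulated_error

def robust_scan():
    """Cycle through the patterns in a predetermined sequence and stop
    as soon as one scan is accurate enough; otherwise keep the best one."""
    best_cloud, best_error = None, float("inf")
    for pattern in PATTERNS:
        cloud, error = scan_with(pattern)
        if error < best_error:
            best_cloud, best_error = cloud, error
        if error <= MAX_ERROR_MM:
            break
    return best_cloud

print(robust_scan()["pattern"])
```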

It is also an embodiment that the light pattern projector has at least one pixel-type screen or display for shaping the light pattern. This allows a particularly simple design and variation of the light pattern with high resolution. The pixel-type screen may, for example, be a liquid crystal display or LCD screen. The pixel-type screen may, as one structural unit, itself generate light in order to irradiate the cooking chamber sufficiently with the light pattern. Alternatively, the pixel-type screen may, for example, be backlit by at least one separate light source, so that it can be used as a 'variable aperture'. The latter case enables particularly high luminous fluxes.
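
For illustration only, such a pixel-type shaping element can be driven with a simple two-dimensional mask. The sketch below, assuming a 64×48 pixel LCD used as a variable aperture, builds a vertical stripe mask and a dot-grid mask with NumPy; the resolution and pattern periods are made-up example values.

```python
import numpy as np

def stripe_mask(width=64, height=48, period=8):
    """Binary mask with vertical stripes: 1 = transmissive, 0 = opaque."""
    columns = np.arange(width)
    return np.tile((columns % period) < period // 2, (height, 1)).astype(np.uint8)

def dot_mask(width=64, height=48, spacing=8):
    """Binary mask with a regular dot grid."""
    mask = np.zeros((height, width), dtype=np.uint8)
    mask[::spacing, ::spacing] = 1
    return mask

print(stripe_mask().shape, dot_mask().sum())  # (48, 64) and the number of dots
```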

The light emitted by the light pattern projector and received by the camera may be visible light and/or infrared light. The advantage of infrared light is that an observer looking into the cooking chamber does not see the light pattern.

It is a further development that the 3D scanner can be calibrated. In one embodiment, at least one calibration marking is located in the muffle. Based on the known position of the at least one calibration marking relative to the at least one object to be measured, a distance of the object, and thus also its size or shape, can be determined more precisely.

It is an alternative or additional embodiment that at least one calibration marking is located on a food support, e.g. on a baking tray or a wire rack, etc. It may in particular be located on a usable surface or support surface of the food support on which the food rests. This calibration marking may in particular have a known size, so that a distance to the camera can be determined from the size recorded by the camera.
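
A minimal sketch of this size-based distance estimate under the usual pinhole-camera assumption is given below: if the physical width of the marking and the focal length (in pixels) are known, the imaged width directly yields the distance. The focal length value is an assumed example, not a parameter from the patent.

```python
def distance_from_marker(real_width_mm, imaged_width_px, focal_length_px=800.0):
    """Pinhole model: distance = focal_length * real_size / imaged_size.
    focal_length_px is an assumed camera constant obtained by calibration."""
    return focal_length_px * real_width_mm / imaged_width_px

# A 40 mm marking imaged 100 px wide lies about 320 mm below the camera.
print(distance_from_marker(40.0, 100.0))  # 320.0
```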

A calibration marking may, for example, be a colored marking and/or a marking of a predetermined shape. The calibration markings may also be defined geometric features, e.g. functional areas of the cooking chamber wall or muffle such as rack-level protrusions. The calibration marking(s) may furthermore serve to determine on which rack level the object to be measured is located.

The calibration preferably takes place with the muffle closed or with the cooking chamber closed, in particular at the start of a cooking process. Any influence of the surroundings on the measurement is thus minimized. In order to simplify a reliable detection of the food and the preceding calibration, it is advantageous to predefine preferred rack levels in the cooking chamber for the 3D measurement ("3D scan"). These are preferably located in a lower third of the muffle. This has the advantage that large-area and/or large-volume objects can also be reliably detected and measured.

The 3D measurement of the object advantageously takes place after the calibration. In principle, however, the calibration may also be dispensed with.

It is a further development that the cooking appliance is equipped with a rack-level detection. If the cooking appliance detects that a food support is located on a rack level that is unfavorable for a 3D measurement, it may output an indication signal and/or a display to a user. The cooking appliance may then also prevent a 3D scan.

According to the invention, the evaluation device is set up to recognize a type of food on the basis of its shape and shape change calculated by the light pattern evaluation. This makes possible, among other things, an automatic adaptation of cooking parameters to the food (e.g. within the framework of a cooking program) and/or an adaptation of the user guidance to the food (e.g. by displaying cooking parameters and/or cooking programs suitable for the recognized food).

The cooking appliance is set up to recognize the food on the basis of an image evaluation of images recorded by the camera (without 3D measurement, also referred to below as "image recognition") and on the basis of the 3D scan. Food recognition based on a combination of image recognition and 3D measurement allows a higher recognition probability thanks to the additional height or depth information. This information may, for example, be fed into an image recognition algorithm.
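
Purely as an illustration of such a combination, the sketch below merges a few 2D image features with the height and volume obtained from the 3D scan into one feature set and applies a crude rule-based decision; the feature names, thresholds and food classes are invented examples and not taken from the patent.

```python
def classify_food(image_features, scan_features):
    """Toy fusion of 2D image recognition features and 3D scan data.

    image_features -- e.g. {"roundness": 0.9, "mean_hue": 0.12}
    scan_features  -- e.g. {"height_mm": 12.0, "volume_cm3": 450.0}
    """
    features = {**image_features, **scan_features}  # simple feature fusion
    if features["roundness"] > 0.8 and features["height_mm"] < 20:
        return "flat dough product (e.g. tarte flambee)"
    if features["roundness"] > 0.8 and features["height_mm"] >= 20:
        return "tall dough product (e.g. sponge cake)"
    return "unknown - ask the user for more information"

print(classify_food({"roundness": 0.92, "mean_hue": 0.1},
                    {"height_mm": 55.0, "volume_cm3": 900.0}))
```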

It is also an embodiment that the evaluation device is set up to recognize a type of food support, e.g. whether the food rests on a wire rack or on a baking tray. This information can be used by the cooking appliance, for example, to set or adapt cooking parameters, for example a heating power of a top and/or bottom heating element or an activation and/or a setting of a heating power of a circulating-air heater.

It is a further development that the evaluation device is set up to recognize a type of accessory, in particular cookware, for example a roasting dish or the like placed on a food support, in which the food is located. It may thus be measurable whether the food lies in an open roasting dish or whether the roasting dish is closed. This information may likewise be used by the cooking appliance to set or adapt cooking parameters, possibly including the option of selecting the cooking method used. If the roasting dish is closed, the cooking appliance may depend on an input from the user as to the type of its contents.

It is also an embodiment that the evaluation device is set up to detect a core temperature of an object. The core temperature can be calculated from a correlation with a change in volume, determined by the 3D scan during a cooking process, given knowledge of the type of food. A separate core temperature probe or roasting spit may thus be dispensed with. For a particularly high reliability of the determination of the core temperature, it is a preferred development that the food has a nearly homogeneous structure. Here, the core temperature is thus determined by means of a 3D scan, and not by means of one or more distance sensors as described in EP 1 921 384 A1.
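
A minimal sketch of such a correlation is shown below: for a known food type, a previously recorded table maps the relative volume change to a core temperature, and the current reading is interpolated linearly between the table points. The table values are made up for illustration and are not measurement data from the patent.

```python
# Hypothetical correlation table for one food type (e.g. a roast):
# relative volume change (shrinkage) -> core temperature in degrees Celsius.
SHRINKAGE_TO_CORE_TEMP = [(0.00, 20.0), (0.03, 45.0), (0.06, 60.0), (0.10, 75.0)]

def core_temperature(volume_now, volume_start, table=SHRINKAGE_TO_CORE_TEMP):
    """Estimate the core temperature from the relative volume change."""
    shrinkage = (volume_start - volume_now) / volume_start
    for (x0, t0), (x1, t1) in zip(table, table[1:]):
        if x0 <= shrinkage <= x1:  # linear interpolation between table points
            return t0 + (t1 - t0) * (shrinkage - x0) / (x1 - x0)
    return table[-1][1] if shrinkage > table[-1][0] else table[0][1]

print(round(core_temperature(volume_now=940.0, volume_start=1000.0), 1))  # ~60.0
```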

According to the invention, the evaluation device is coupled to a control device of the cooking appliance, and the control device is set up to adapt an operation of the cooking appliance on the basis of at least one object parameter determined by the evaluation device. As already partly mentioned above, the associated object may be food, an accessory and/or a food support. Object parameters may be, for example, a position, shape, volume or type, etc., of the object. In principle, the 3D information determined by one or more 3D scans can be used in particular for automating cooking, roasting and baking processes. As already mentioned, it is possible to carry out the 3D scan during a cooking process as well. The 3D information or 3D data determined in this way serves not only for food recognition but also for an adaptation of the cooking parameters, if required. A cooking process or cooking sequence can thus be individually matched to the food. If an exact detection of the food is not possible, inputs from a user can in particular be taken into account. For this purpose, the cooking appliance may request the user to enter further information about the food into the cooking appliance.
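
As a sketch only, the following function shows how such object parameters could be turned into adjusted cooking parameters; the parameter names, thresholds and adjustments are illustrative assumptions and do not reflect any values given in the patent.

```python
def adapt_cooking(object_params, current):
    """Derive adjusted cooking setpoints from object parameters.

    object_params -- e.g. {"type": "bread", "volume_cm3": 1200.0,
                           "support": "baking tray"}
    current       -- e.g. {"temperature_c": 180, "remaining_min": 45}
    Returns a new dict of setpoints (assumed example logic).
    """
    adjusted = dict(current)
    if object_params.get("type") is None:
        adjusted["ask_user"] = True              # detection failed: ask for input
        return adjusted
    if object_params.get("support") == "wire rack":
        adjusted["circulating_air"] = True       # favour convection on a rack
    if object_params.get("volume_cm3", 0) > 1500:
        adjusted["remaining_min"] += 10          # larger items need more time
    return adjusted

print(adapt_cooking({"type": "bread", "volume_cm3": 1800.0, "support": "wire rack"},
                    {"temperature_c": 180, "remaining_min": 45}))
```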

It is yet a further embodiment that the cooking appliance has a screen on which at least one three-dimensional image of at least one object recorded by the camera can be displayed. In other words, a three-dimensional representation of the contents of the cooking chamber is offered on a screen. This allows a particularly informative presentation of information for a user. The screen may, for example, be provided on a front side or a top side of the cooking appliance. The screen may be a touch-sensitive sensor screen or touchscreen.

One possibility for operating the cooking appliance is to carry out a calibration before the start of a food treatment (e.g. a cooking sequence). Furthermore, shortly before the start of a food treatment, a 3D measurement or 3D scan of the food may be carried out in order to capture its initial geometry. An object recognition with regard to a type of food, a type of accessory and/or a type of food support may also be carried out before the start of a food treatment.

In order to be able to detect a change in the shape of the food or dish, at least one 3D scan may also take place during the food treatment, in particular several 3D scans at, for example, periodic intervals. According to the invention, a food recognition is carried out by means of the detected change in shape of the food, and, for example, an end of treatment may be detected and/or a core temperature determined.
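
As an illustration of such a periodic evaluation, the loop below rescans the food at a fixed interval and declares the treatment finished once the measured volume has stopped changing over a few consecutive scans. The interval, tolerance and the `measure_volume` stub are invented for the sketch and are not specified in the patent.

```python
import time

def measure_volume():
    """Stand-in for one 3D scan returning the food volume in cm^3."""
    return 1000.0  # constant here only so that the example terminates

def monitor_until_stable(interval_s=60, tolerance_cm3=5.0, stable_scans=3):
    """Repeat the 3D scan periodically and stop once the volume has stayed
    within `tolerance_cm3` for `stable_scans` consecutive scans."""
    last, stable = None, 0
    while stable < stable_scans:
        volume = measure_volume()
        if last is not None and abs(volume - last) <= tolerance_cm3:
            stable += 1
        else:
            stable = 0
        last = volume
        if stable < stable_scans:
            time.sleep(interval_s)
    return last

# monitor_until_stable() would block for several minutes in a real appliance.
```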

It is yet another embodiment that the light pattern projector is also provided for illuminating the cooking chamber. It may, for example, illuminate the cooking chamber for viewing by a user and radiate the light pattern into the cooking chamber only for comparatively short periods in between. A separate light source for illuminating the cooking chamber can thus be dispensed with.

The properties, features and advantages of this invention described above and the manner in which they are achieved will become clearer and more clearly understood in connection with the following schematic description of an exemplary embodiment which is explained in more detail in connection with the drawings.
Fig. 1
shows a sketch of an arrangement of a 3D scanner;
Fig. 2
shows a sketch of a reconstruction of a shape of an object measured by means of a 3D scanner;
Fig. 3
shows, as a sectional illustration in side view, a cooking appliance according to the invention equipped with a 3D scanner; and
Fig. 4
shows a diagram of a course of a temperature and a volume of a heated object with a cooking product determination by means of a cooking device according to the invention.

Fig. 1 shows an arrangement ("3D scanner") for determining a three-dimensional shape of at least one object O, comprising a light pattern projector 1 directed at the object O, a camera 2 directed at the object O, and a control device C for operating the light pattern projector 1 and for calculating a three-dimensional shape of the object O on the basis of at least one image received from the camera 2 by means of a light pattern evaluation. Optionally, a screen 3 for viewing the object O' calculated by the control device C is present.

The light pattern projector 1 generates a predetermined light pattern L, e.g. a stripe or dot pattern. The light pattern projector 1 emits its light in a light beam with a first optical axis A1.

The camera 2, typically a digital camera, has a field of view F with a second optical axis A2, which is oriented obliquely to the first optical axis A1 of the light pattern projector 1. In other words, the camera 2 is oriented obliquely to the light pattern projector 1. It observes a region of the object O that is or can be irradiated by the light pattern L.

Fig. 2 shows a sketch of a reconstruction of a shape of the object O measured by means of the 3D scanner 1, 2, C.

The light pattern projector 1 here has a light source Q, e.g. an array of light-emitting diodes, which is followed by a pattern-generating element in the form of a transmissive, freely programmable LCD surface D. Depending on the pattern M generated on the LCD surface D, a corresponding, in particular complementary, light pattern L is emitted from the LCD surface D. Alternatively, an LED screen may itself serve as the light source (not shown), in which case a separate light source can be dispensed with because of the backlighting integrated therein.

Fig. 2 shows by way of example how light in the form of a vertical column or line G is radiated from the light pattern projector 1 onto the object O. A projection P(G) of this line G, distorted by the surface contour of the object O, thus appears on the object O. Owing to its oblique orientation relative to the light pattern projector 1, the camera 2 records an image of this projection P(G) which reproduces the distortion. The camera 2 stores the projection P(G) as correspondingly positioned image points B or "pixels" of a matrix, which results from the matrix-like arrangement of the individual sensors of a sensor array S of the camera, e.g. a CCD sensor array. The height or depth information is given by the deviation of the image points B from a vertical line.

If the planes of all vertical columns or lines G are known, if for each image point B in the camera image a light ray r in space can be specified from which the light incident on the respective individual sensor comes, and if, furthermore, an assignment exists between the image points and the projection P(G) visible from these image points, and hence also the corresponding lines G, then points on the object surface can be reconstructed by a simple ray-plane intersection.
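
A minimal sketch of this ray-plane intersection is given below: each projected line G defines a light plane in space, each image point B defines a camera ray r, and the surface point is their intersection. The vectors in the example are arbitrary illustrative values, not calibration data from the patent.

```python
import numpy as np

def ray_plane_intersection(ray_origin, ray_dir, plane_point, plane_normal):
    """Intersect the camera ray r (origin + t * direction) with the light
    plane of a projected line G, given a point on the plane and its normal.
    Returns the 3D surface point, or None if ray and plane are parallel."""
    denom = np.dot(plane_normal, ray_dir)
    if abs(denom) < 1e-9:
        return None
    t = np.dot(plane_normal, plane_point - ray_origin) / denom
    return ray_origin + t * ray_dir

# Camera at the origin looking along -z; light plane x = 0.1 (illustrative).
point = ray_plane_intersection(
    ray_origin=np.array([0.0, 0.0, 0.0]),
    ray_dir=np.array([0.05, 0.0, -1.0]),
    plane_point=np.array([0.1, 0.0, 0.0]),
    plane_normal=np.array([1.0, 0.0, 0.0]),
)
print(point)  # surface point approximately at [0.1, 0.0, -2.0]
```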

The depth resolution depends on an angle W between the light ray r leading to the image point B and the direction of the column or line G. A theoretical optimum with regard to the resolution would lie at the largest possible angle W. However, as this optimum is approached, the visibility of the projection P(G) on the surface of the object O, and thus its detectability in the camera image, deteriorates. Since a reconstruction is possible only for those points on the surface of the object O which on the one hand are visible from the camera 2 and on the other hand can be illuminated by the light pattern projector 1, a compromise is made here. Such a 3D measurement or 3D scan is known in principle and is therefore not described further below.

Fig. 3 shows, as a sectional illustration in side view, a cooking appliance equipped with a 3D scanner in the form of an oven 4. The oven 4 has a cooking chamber 6 delimited by an oven muffle 5. At the front, the oven muffle 5 has a loading opening 8 which can be closed by means of an oven door 7 and through which objects, in particular food O1, can be placed into the cooking chamber 6. A cooking chamber temperature T can be set by means of one or more, in particular electrically operated, heating elements (not shown).

On a ceiling 9 of the oven muffle 5 there are two viewing windows 10 and 11, which may be covered, for example, with transparent glass panes. On the side of the oven muffle 5 facing away from the cooking chamber 6 and at a predetermined distance from the oven muffle 5, a light pattern projector 1 (e.g. with an LCD display for pattern generation) is located behind the viewing window 10, and a camera 2 behind the viewing window 11. They are thermally protected by their distance from the oven muffle 5. In addition, a flow of cooling air may sweep over the ceiling 9, e.g. for cooling components arranged there such as a control device. The light pattern projector 1 and the camera 2 can also be further cooled by this cooling air flow.

The light pattern projector 1 and the camera 2 are arranged laterally offset from one another. In addition, their optical axes A1 and A2 enclose an angle α of between 20° and 30°, which enables a high depth resolution with good visibility. The light pattern projector 1 and the camera 2 are arranged in a fixed manner relative to the cooking chamber 6 and therefore do not move along, for example, when the oven door 7 is actuated.

The light pattern projector 1 radiates a light pattern L through the viewing window 10 into the cooking chamber 6, in such a way that, from a predetermined distance below the ceiling 9 onward, practically the entire horizontal cross-section of the cooking chamber 6 can be illuminated with the light pattern L. This may be the case, for example, in a lower half or in a lower third of the cooking chamber 6. The camera 2 records images from a region of the cooking chamber 6 that can at least partially be irradiated by the light pattern.

The oven 4 further has an evaluation device 12 coupled to the camera 2 for calculating, by means of a light pattern evaluation, a three-dimensional shape of, for example, the food O1 and a food support O2, which are located in the area that can be irradiated by the light pattern L. This is based on a 3D measurement using at least one image recorded by the camera 2. The light pattern projector 1, the camera 2 and the evaluation device 12 together form the 3D scanner. The evaluation device 12 may, as shown here, be functionally integrated into a central control device of the oven 4, or it may be coupled as an independent unit to a control device.

A 3D scan, comprising, for example, the recording of an image of the projection P(G) of the light pattern L by means of the camera 2 and, from this, a calculation of the three-dimensional shape of the food O1 and of the food support O2, can be carried out with the oven door 7 or loading opening 8 closed.

In particular, a calibration can first be carried out with the oven door 7 closed but the cooking sequence not yet started. For this purpose, the food support O2 has on its upper side one or more, e.g. colored, calibration markings K, which have a known size and can be easily identified. For example, the size of the calibration marking(s) K recorded by the camera 2 makes it possible to determine a distance from the ceiling 9 and thus, for example, to identify the rack level used. In the case of a rack level that is unsuitable for the 3D measurement, for example because it is too high, the oven 4 may output an indication to a user, e.g. a display on a front screen 3 of a control panel 13. At least one calibration marking may also be located on the oven muffle 5.

After the calibration, but still before a cooking sequence or before a treatment of the food O1, an initial 3D measurement of the food O1 may be carried out by means of the 3D scanner 1, 2, 12 in order to calculate its original shape. The calculated shape may be displayed on the screen 3. The calculated shape is used by the cooking appliance 4 to determine the food O1, in addition to an image recognition of the food O1 that can also be carried out by means of the camera 2. A type of the food support O2 may also be recognized by means of the 3D scanner 1, 2, 12. On the basis of a recognition of the food O1 and, if applicable, of the food support O2, one or more cooking parameters of the cooking sequence may be adapted. Thus, a cooking time and/or the cooking chamber temperature T may be adapted on the basis of the recognized type and/or the recognized volume of the food and/or of the recognized food support O2.

During the cooking sequence, 3D measurements can be carried out repeatedly in order to detect a change in shape and/or volume of the food O1. In the event of a change in shape and/or volume, the oven 4 may adapt the cooking sequence, e.g. change the cooking time and/or the cooking chamber temperature, including switching off the heating elements.

In order to improve the accuracy of the depth information and thus of the volume of the food O1, the light pattern projector 1 may radiate different light patterns L into the cooking chamber 6.

Fig. 4 shows a diagram of a profile of a temperature T and of a volume V of the food O1 with a determination of the food by means of, for example, the cooking appliance 4.

Purely by way of example, the food O1 is a dough product, for example a round tarte flambée or a sponge cake in a springform pan. A distinction between these two types of food O1 cannot be made by means of image recognition by the camera 2 alone, since in a two-dimensional view from above (top view) both the tarte flambée and the sponge cake look circular. In addition, both are similar in color. However, the two types of food require different specific baking environments. If the tarte flambée is treated like the sponge cake, the result is unsatisfactory, and the same applies in the reverse case. Through the 3D measurement by means of the 3D scanner 1, 2, 12, the cooking appliance 4 additionally receives spatial information about the food O1. This spatial information on the initial state of the food O1 before the cooking sequence (e.g. an initial volume V0) may already be sufficient to distinguish the flat tarte flambée from the taller sponge cake. In addition, the type of the food O1 is determined by means of the 3D scanner 1, 2, 12 from the change in its shape, in particular a change ΔV in its volume V.

Thus, at a starting time ts of the cooking sequence, the cooking chamber temperature T has an initial value Ts, e.g. room temperature. As time t progresses, the cooking chamber temperature T increases due to at least one activated heating element, identically for tarte flambée and sponge cake, as shown by the curve T1+T2. When the cooking chamber temperature T reaches, at a time td, a target temperature Td1 for sponge cake, which lies below a target temperature Td2 for tarte flambée, a further 3D measurement is carried out in order to determine the type of the food O1.

If the height or the volume V0 of the food O1 has not changed significantly, it can be assumed that the food is a tarte flambée, which typically does not rise. Its volume profile is shown as curve V2. If the tarte flambée is recognized, the cooking appliance 4 can subsequently, for example, increase its cooking chamber temperature T to the associated target value Td2, as indicated by the temperature curve T2. The cooking sequence ends at an associated end time te2.

If, however, the height or the volume V0 of the food O1 has increased noticeably by ΔV at the time td, it can be assumed that the food is a sponge cake, which typically rises. Its volume profile is shown as curve V1. If a sponge cake is recognized, the cooking appliance 4 may subsequently keep its cooking chamber temperature T at the associated target value Td1, as indicated by the temperature curve T1. The cooking sequence ends at an associated end time te1.
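
The decision described for Fig. 4 can be summarized in a few lines: the sketch below compares the volume measured at time td with the initial volume V0 and selects the target temperature accordingly. The rise threshold and the two temperature values are assumed example values, not values from the patent.

```python
RISE_THRESHOLD = 0.15  # assumed: a >15 % volume increase counts as rising dough

def select_setpoint(v0, v_at_td, td1_sponge_cake=160, td2_tarte_flambee=230):
    """Distinguish a rising sponge cake from a flat tarte flambee at time td
    and return (recognized food, target temperature in degrees Celsius)."""
    if (v_at_td - v0) / v0 > RISE_THRESHOLD:
        return "sponge cake", td1_sponge_cake     # keep the lower setpoint Td1
    return "tarte flambee", td2_tarte_flambee     # raise to the higher setpoint Td2

print(select_setpoint(v0=1000.0, v_at_td=1300.0))  # ('sponge cake', 160)
```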

The height and/or volume information from the 3D measurement thus provides a clear distinguishing feature for recognizing the food.

List of reference symbols

1  Light pattern projector
2  Camera
3  Screen
4  Oven
5  Oven muffle
6  Cooking chamber
7  Oven door
8  Loading opening
9  Ceiling
10  Viewing window
11  Viewing window
12  Evaluation device
13  Control panel
A1  First optical axis
A2  Second optical axis
B  Image point
C  Control device
D  LCD surface
F  Field of view
G  Line
K  Calibration markings
L  Light pattern
M  Pattern
O  Object
O1  Food
O2  Food support
O'  Calculated object
P(G)  Projection
Q  Light source
r  Light ray
S  Sensor array
T  Cooking chamber temperature
T1  Temperature curve
T2  Temperature curve
Td1  Target temperature
t  Time
td  Time at which the target temperature Td1 is reached
te1  End time
te2  End time
ts  Starting time of the cooking sequence
Ts  Cooking chamber temperature at the starting time ts of the cooking sequence
V  Volume
V0  Initial volume
V1  Volume profile
V2  Volume profile
ΔV  Volume change
W  Angle
α  Angle

Claims (12)

  1. Cooking appliance (4) having
    - a cooking chamber (6) with a loading opening (8) which can be closed by means of a door (7),
    - a light pattern projector (1) which is arranged in a fixed manner relative to the cooking chamber (6) for generating a light pattern (L),
    - a camera (2) for capturing images from a region which can be irradiated by the light pattern and
    - an analysis facility (12) which is coupled to the camera (2) for calculating a three-dimensional shape of an object (O; O1, O2), which is located in the region that can be irradiated by the light pattern (L), by means of a light pattern analysis,
    wherein
    - the light pattern projector (1) is arranged to radiate a light pattern (L) into the cooking chamber (6),
    - the camera (2) is arranged in a fixed manner relative to the cooking chamber (6),
    - the camera (2) is arranged to capture images from a region of the cooking chamber (6) which can be irradiated by the light pattern (L) even when the cooking chamber (6) is closed and
    - the analysis facility (12) is designed to repeatedly calculate the three-dimensional shape of the at least one object (O; O1, O2), which is located in the region of the cooking chamber (6) which can be irradiated by the light pattern (L), during operation of the cooking appliance (4),
    characterised in that
    - the analysis facility (12) is designed to recognise a type of food (O1) on the basis of its shape calculated by the light pattern analysis and change in shape during the operation of the cooking appliance (4), wherein
    - the analysis facility (12) is coupled to a control facility of the cooking appliance (4) and the control facility is designed to adjust operation of the cooking appliance (4) based on at least one object parameter determined by the analysis facility (12).
  2. Cooking appliance (4) according to claim 1, characterised in that an optical axis (A1) of the light pattern projector (1) and an optical axis (A2) of the camera (2) are at an angle of between 20° and 30° to one another.
  3. Cooking appliance (4) according to one of the preceding claims, characterised in that the light pattern projector (1) and camera (2) are arranged behind a ceiling (9) of the cooking chamber (6).
  4. Cooking appliance (4) according to one of the preceding claims, characterised in that the light pattern projector (1) can radiate different light patterns (L) into the cooking chamber (6).
  5. Cooking appliance (4) according to one of the preceding claims, characterised in that the light pattern projector (1) has at least one image point-type screen for shaping the light pattern (L).
  6. Cooking appliance (4) according to one of the preceding claims, characterised in that predefined calibration markings (K) are present on a muffle (5) which delimits the cooking chamber.
  7. Cooking appliance (4) according to one of the preceding claims, characterised in that the analysis facility (12) is designed to additionally recognise the type of food (O1) on the basis of an image recognition.
  8. Cooking appliance (4) according to one of the preceding claims, characterised in that the analysis facility (12) is designed to recognise a type of food support (O2) on the basis of its shape calculated by the light pattern analysis.
  9. Cooking appliance (4) according to one of the preceding claims, characterised in that the analysis facility (12) is designed to recognise a type of accessory on the basis of its shape calculated by the light pattern analysis.
  10. Cooking appliance (4) according to one of the preceding claims, characterised in that the analysis facility (12) is designed to recognise a core temperature of an object (O1) on the basis of a change in its shape recognised by the light pattern analysis.
  11. Cooking appliance (4) according to one of the preceding claims, characterised in that the cooking appliance (4) has a screen (3), on which a calculated shape of the food (O1) can be displayed.
  12. Cooking appliance (4) according to one of the preceding claims, characterised in that the light pattern projector (1) is also provided to illuminate the cooking chamber (6).
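
The claims above describe a classic structured-light arrangement: a fixed projector radiates a known pattern into the cooking chamber, a fixed camera images it even with the chamber closed, and the analysis facility derives a three-dimensional shape from how the pattern is displaced on the food, with the two optical axes set roughly 20° to 30° apart (claim 2). As a purely illustrative sketch of such a triangulation step, and not the patent's own algorithm, the Python fragment below converts an observed pattern shift into a height map; the pixel scale, the 25° baseline angle and the function name estimate_height_map are assumptions chosen for the example.

```python
import numpy as np

def estimate_height_map(shift_px: np.ndarray,
                        mm_per_px: float = 0.5,
                        baseline_angle_deg: float = 25.0) -> np.ndarray:
    """Turn the observed lateral shift of a projected light pattern into a
    height map (mm above a reference plane such as the baking tray).

    Simplified geometry (an assumption for this sketch): with the camera
    looking straight down and the projector axis tilted by an angle theta
    against it, a surface point raised by h shifts the pattern seen by the
    camera by roughly d = h * tan(theta), hence h = d / tan(theta).

    shift_px           -- per-pixel pattern displacement in camera pixels
    mm_per_px          -- assumed scale of one camera pixel on the tray plane
    baseline_angle_deg -- assumed angle between the optical axes
                          (claim 2 puts it between 20 and 30 degrees)
    """
    theta = np.radians(baseline_angle_deg)
    return (shift_px * mm_per_px) / np.tan(theta)

# Example: a 12 px shift at 0.5 mm/px and a 25 degree baseline corresponds
# to roughly 12 * 0.5 / tan(25 deg) = about 12.9 mm of food height.
print(estimate_height_map(np.array([12.0])))
```

In practice such a conversion would also rely on the predefined calibration markings on the muffle (claim 6) to fix the reference plane and the pixel scale.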
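The characterising part of claim 1, together with claims 7 to 11, further has the analysis facility recognise the type of food from the calculated shape and its change over time and has the coupled control facility adapt operation accordingly; claim 10 additionally reads a core temperature out of the shape change. The fragment below is a hypothetical sketch of how such logic might be organised; the ShapeTracker class, the classifier thresholds and the adjust_operation hook are invented for illustration and are not taken from the patent.

```python
import numpy as np
from dataclasses import dataclass, field

@dataclass
class ShapeTracker:
    """Keeps the volumes derived from the repeatedly calculated height maps
    (claim 1: the shape is recalculated during operation)."""
    volumes_ml: list = field(default_factory=list)

    def update(self, height_map_mm: np.ndarray, mm_per_px: float = 0.5) -> float:
        # Integrate the height map to an approximate volume in millilitres.
        volume_ml = float(height_map_mm.sum()) * (mm_per_px ** 2) / 1000.0
        self.volumes_ml.append(volume_ml)
        return volume_ml

    def relative_change(self) -> float:
        # Fractional change versus the first measurement: positive for dough
        # that rises, negative for a roast that shrinks.
        if len(self.volumes_ml) < 2 or self.volumes_ml[0] == 0.0:
            return 0.0
        return self.volumes_ml[-1] / self.volumes_ml[0] - 1.0

def classify_food(relative_change: float) -> str:
    """Toy classifier from the shape trend alone; a real analysis facility
    would also use the static 3D shape, image recognition (claim 7) and the
    detected food support (claim 8)."""
    if relative_change > 0.15:
        return "cake"
    if relative_change < -0.05:
        return "roast"
    return "unknown"

def adjust_operation(food_type: str, relative_change: float) -> dict:
    """Hypothetical control hook: map object parameters to a setting change,
    as the control facility coupled to the analysis facility would."""
    if food_type == "cake" and relative_change > 0.35:
        return {"action": "reduce_temperature", "delta_c": -10}
    if food_type == "roast" and relative_change < -0.20:
        # Pronounced shrinkage serves here as a crude proxy for a high core
        # temperature (claim 10), so the programme is ended early.
        return {"action": "end_cooking"}
    return {"action": "keep_settings"}

# Example: a rising dough measured three times with synthetic height maps.
tracker = ShapeTracker()
for scale in (1.00, 1.20, 1.45):
    tracker.update(scale * 20.0 * np.ones((100, 100)))
food = classify_food(tracker.relative_change())
print(food, adjust_operation(food, tracker.relative_change()))
```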
EP15727630.4A 2014-06-05 2015-06-03 Cooking device with light pattern projector and camera Active EP3152498B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PL15727630T PL3152498T3 (en) 2014-06-05 2015-06-03 Cooking device with light pattern projector and camera

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102014210672.9A DE102014210672A1 (en) 2014-06-05 2014-06-05 Cooking device with light pattern projector and camera
PCT/EP2015/062349 WO2015185608A1 (en) 2014-06-05 2015-06-03 Cooking device with light pattern projector and camera

Publications (2)

Publication Number Publication Date
EP3152498A1 (en) 2017-04-12
EP3152498B1 (en) 2020-11-11

Family

ID=53366015

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15727630.4A Active EP3152498B1 (en) 2014-06-05 2015-06-03 Cooking device with light pattern projector and camera

Country Status (7)

Country Link
US (1) US10228145B2 (en)
EP (1) EP3152498B1 (en)
CN (1) CN106461230B (en)
DE (1) DE102014210672A1 (en)
ES (1) ES2835724T3 (en)
PL (1) PL3152498T3 (en)
WO (1) WO2015185608A1 (en)

Families Citing this family (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014113389A1 (en) * 2014-09-17 2016-03-17 Pilz Gmbh & Co. Kg Method and device for identifying structural elements of a projected structural pattern in camera images
USD787041S1 (en) 2015-09-17 2017-05-16 Whirlpool Corporation Gas burner
US10837651B2 (en) 2015-09-24 2020-11-17 Whirlpool Corporation Oven cavity connector for operating power accessory trays for cooking appliance
US11777190B2 (en) 2015-12-29 2023-10-03 Whirlpool Corporation Appliance including an antenna using a portion of appliance as a ground plane
JP6909954B2 (en) * 2016-03-29 2021-07-28 パナソニックIpマネジメント株式会社 Cooker
JPWO2017170318A1 (en) * 2016-03-29 2019-02-14 パナソニックIpマネジメント株式会社 Cooker
JPWO2017170319A1 (en) * 2016-03-29 2019-02-14 パナソニックIpマネジメント株式会社 Cooker
CN107296507A (en) * 2016-04-15 2017-10-27 松下知识产权经营株式会社 Cook householder method and cook accessory system
DE102016206483A1 (en) * 2016-04-18 2017-10-19 Convotherm Elektrogeräte GmbH A method of determining a cleaning need and quality management monitoring system of a commercial cooking appliance, and a commercial cooking appliance
CN108700306B (en) * 2016-04-20 2021-09-14 德国福维克控股公司 System for making at least one food item and method for operating a related system
US10145568B2 (en) 2016-06-27 2018-12-04 Whirlpool Corporation High efficiency high power inner flame burner
DE102016012036A1 (en) 2016-10-11 2018-04-12 Diehl Ako Stiftung & Co. Kg Food support for receiving food in a cooking appliance and cooking appliance with such a food support
DE102016221446A1 (en) * 2016-11-02 2018-05-03 BSH Hausgeräte GmbH Calibrating an oxygen sensor of a household appliance
EP3346190B1 (en) 2017-01-10 2020-05-06 Electrolux Appliances Aktiebolag Food preparation entity
US10551056B2 (en) 2017-02-23 2020-02-04 Whirlpool Corporation Burner base
US10451290B2 (en) 2017-03-07 2019-10-22 Whirlpool Corporation Forced convection steam assembly
US10660162B2 (en) 2017-03-16 2020-05-19 Whirlpool Corporation Power delivery system for an induction cooktop with multi-output inverters
DE102017206056A1 (en) 2017-04-10 2018-10-11 BSH Hausgeräte GmbH Operating a cooking appliance
KR102366006B1 (en) 2017-06-20 2022-02-23 삼성전자주식회사 Oven
EP3665419A4 (en) * 2017-08-11 2021-05-05 Brava Home, Inc. Configurable cooking systems and methods
CN109166054A (en) * 2017-08-16 2019-01-08 嘉易烤株式会社 The time point management service that is carbonized when cooking fish meat provides system
CN107692840A (en) * 2017-09-06 2018-02-16 珠海格力电器股份有限公司 The display methods and device of electrical device, electrical device
DE102017121084A1 (en) * 2017-09-12 2019-03-14 Rational Aktiengesellschaft Cooking appliance and method for detecting the occupancy of a drawer in a cooking chamber
KR102400018B1 (en) 2017-09-29 2022-05-19 삼성전자주식회사 Method and apparatus for auto cooking
WO2019075610A1 (en) * 2017-10-16 2019-04-25 Midea Group Co., Ltd. Machine learning control of cooking appliances
US10591218B2 (en) * 2017-10-27 2020-03-17 Whirlpool Corporation Oven having an imaging device
US10605463B2 (en) 2017-10-27 2020-03-31 Whirlpool Corporation Cooking appliance with a user interface
US10523851B2 (en) 2018-02-19 2019-12-31 Haier Us Appliance Solutions, Inc. Camera assembly for an oven appliance
WO2019198621A1 (en) * 2018-04-09 2019-10-17 パナソニックIpマネジメント株式会社 Heating cooker
CN108742170B (en) * 2018-05-08 2020-08-18 华南理工大学 Intelligent recognition cooking system of oven
WO2019246490A1 (en) 2018-06-21 2019-12-26 Prince Castle LLC Infrared toaster
US10627116B2 (en) 2018-06-26 2020-04-21 Whirlpool Corporation Ventilation system for cooking appliance
US10619862B2 (en) 2018-06-28 2020-04-14 Whirlpool Corporation Frontal cooling towers for a ventilation system of a cooking appliance
WO2020014159A1 (en) * 2018-07-09 2020-01-16 Brava Home, Inc. In-oven camera and computer vision systems and methods
US10837652B2 (en) 2018-07-18 2020-11-17 Whirlpool Corporation Appliance secondary door
DE102018124378B4 (en) * 2018-10-02 2023-03-23 BIBA - Bremer Institut für Produktion und Logistik GmbH Device and method for process monitoring of several pieces of dough in a process chamber and a process chamber with such a device
CN109330356A (en) * 2018-11-09 2019-02-15 珠海格力电器股份有限公司 Cooking apparatus
JP7290415B2 (en) * 2018-12-06 2023-06-13 三星電子株式会社 Three-dimensional measuring device and heating cooker
WO2020116814A1 (en) * 2018-12-06 2020-06-11 Samsung Electronics Co., Ltd. Heating cooker including three dimensional measuring device
DE102018221749A1 (en) * 2018-12-14 2020-06-18 BSH Hausgeräte GmbH Oven and control procedures
US11287140B2 (en) 2019-01-04 2022-03-29 Whirlpool Corporation Cooking appliance with an imaging device
DE102019203259A1 (en) * 2019-03-11 2020-09-17 BSH Hausgeräte GmbH Optical recognition of food
US11677901B2 (en) * 2019-03-12 2023-06-13 Marmon Foodservice Technologies, Inc. Infrared toaster
DE102019107834A1 (en) * 2019-03-27 2020-07-16 Miele & Cie. Kg Method for operating a cooking device and cooking device
DE102019107859A1 (en) * 2019-03-27 2020-07-09 Miele & Cie. Kg Method for operating a cooking device and cooking device
DE102019107812A1 (en) * 2019-03-27 2020-10-01 Miele & Cie. Kg Method for operating a cooking appliance and cooking appliance
DE102019107846A1 (en) * 2019-03-27 2020-07-16 Miele & Cie. Kg Method for operating a cooking device and cooking device
DE102019204533A1 (en) * 2019-04-01 2020-10-01 BSH Hausgeräte GmbH Method for preparing a product to be cooked with optically indicated cooking product zones, cooking device and computer program product
DE102019204531A1 (en) * 2019-04-01 2020-10-01 BSH Hausgeräte GmbH Household appliance and method for determining contour information of goods
EP3758439B1 (en) * 2019-06-25 2022-10-26 Electrolux Appliances Aktiebolag Method and system for controlling an oven, and oven for heating food items
DE102019209198A1 (en) * 2019-06-26 2020-12-31 Robert Bosch Gmbh Home appliance
DE102019210426B3 (en) * 2019-07-15 2020-12-10 BSH Hausgeräte GmbH Control unit and method for evaluating image data in a household appliance
US10819905B1 (en) * 2019-09-13 2020-10-27 Guangdong Media Kitchen Appliance Manufacturing Co., Ltd. System and method for temperature sensing in cooking appliance with data fusion
JP7236644B2 (en) * 2019-10-01 2023-03-10 パナソニックIpマネジメント株式会社 heating cooker
DE102019216682A1 (en) * 2019-10-29 2021-04-29 BSH Hausgeräte GmbH Determining a target processing state of a product to be cooked
US20210186032A1 (en) * 2019-12-19 2021-06-24 Whirlpool Corporation Monitoring system
DE102020107568B4 (en) * 2020-03-19 2022-02-03 Miele & Cie. Kg Method for controlling a cooking device and cooking device
CN216535005U (en) 2020-04-06 2022-05-17 沙克忍者运营有限责任公司 Cooking system
KR102341238B1 (en) * 2020-08-21 2021-12-20 주식회사 비욘드허니컴 Automatic Cooker System For Personal Cooperation
DE102020126249A1 (en) * 2020-10-07 2022-04-07 Welbilt Deutschland GmbH Cooking device, in particular commercial cooking device
US11940153B2 (en) 2020-12-01 2024-03-26 GMG Products, LLC Fuel conditioner for grill
US11747087B2 (en) * 2021-01-06 2023-09-05 Bsh Home Appliances Corporation Household appliance including reflective door
CN113208449B (en) * 2021-05-31 2022-10-11 广东美的厨房电器制造有限公司 Control method and control device for cooking equipment and cooking equipment
USD1005769S1 (en) 2021-09-08 2023-11-28 Newage Products Inc. Oven

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4224743A (en) * 1978-06-19 1980-09-30 Alternative Pioneering Systems, Inc. Food dehydrating machine
US4588297A (en) 1982-06-14 1986-05-13 Nippon Steel Corporation Optical profile measuring method
DE19748062C2 (en) 1997-10-31 2000-10-05 Bernward Maehner Method and device for three-dimensional optical measurement of objects
AU3994799A (en) 1999-05-14 2000-12-05 3Dmetrics, Incorporated Color structured light 3d-imaging system
JP2001099615A (en) * 1999-09-30 2001-04-13 Nippon Crucible Co Ltd Object distance measuring instrument and three- dimensional object shape measuring instrument
DE50113144D1 (en) * 2000-06-28 2007-11-29 Bosch Gmbh Robert Device for the pictorial detection of piece goods
JP3826111B2 (en) * 2003-06-06 2006-09-27 株式会社東芝 Cooker
US7131529B2 (en) * 2003-07-01 2006-11-07 Casa Herrera, Inc. Oven conveyor alignment system apparatus and method
US10687391B2 (en) * 2004-12-03 2020-06-16 Pressco Ip Llc Method and system for digital narrowband, wavelength specific cooking, curing, food preparation, and processing
DE102006005874C5 (en) 2005-05-11 2017-05-18 Carl Zeiss Automated Inspection GmbH Method for non-contact measurement
JP2007192518A (en) * 2006-01-23 2007-08-02 Matsushita Electric Ind Co Ltd High-frequency heating device
EP1921384B1 (en) 2006-11-02 2009-05-27 Electrolux Home Products Corporation N.V. Device and method for determining the inner temperature of food
DE102008024731B4 (en) * 2008-05-19 2020-08-20 BAM Bundesanstalt für Materialforschung und -prüfung Method and device for sintering an object by determining the geometric surface profile of the object
EP2149755B1 (en) 2008-07-30 2012-12-05 Electrolux Home Products Corporation N.V. Oven and method of operating the same
BRPI1010247A2 (en) * 2009-03-05 2015-08-25 Pressco Tech Inc Method and system for digital narrowband, specific wavelength cooking, curing, food preparation and processing
DE102010003115A1 (en) * 2010-03-22 2011-09-22 BSH Bosch und Siemens Hausgeräte GmbH Cooking appliance
GB2486165A (en) * 2010-11-30 2012-06-13 St Microelectronics Res & Dev Oven using a Single Photon Avalanche Diode (SPAD) array
EP2530387B1 (en) 2011-06-03 2017-04-26 Electrolux Home Products Corporation N.V. A cooking oven including an apparatus for detecting the three-dimensional shape of food stuff on a food stuff carrier
WO2013098004A1 (en) * 2011-12-26 2013-07-04 Arcelik Anonim Sirketi Oven with optical detection means
CN104427954B (en) * 2012-05-03 2016-10-12 3形状股份有限公司 The automated production of dental restoration

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
US20170115008A1 (en) 2017-04-27
WO2015185608A1 (en) 2015-12-10
EP3152498A1 (en) 2017-04-12
ES2835724T3 (en) 2021-06-23
US10228145B2 (en) 2019-03-12
CN106461230B (en) 2019-04-12
PL3152498T3 (en) 2021-05-31
CN106461230A (en) 2017-02-22
DE102014210672A1 (en) 2015-12-17

Similar Documents

Publication Publication Date Title
EP3152498B1 (en) Cooking device with light pattern projector and camera
EP3500798B1 (en) Establishing a degree of browning of food to be cooked
WO2018188913A1 (en) Operating a cooking appliance
DE102019120008B4 (en) Method for operating a cooking appliance and cooking appliance
DE102016107617A1 (en) Method for operating a cooking appliance and cooking appliance
WO2016207442A1 (en) Method and device for measuring an object surface in a contactless manner
DE102013110644B3 (en) Process for cooking food in a cooking appliance
DE102014109432B4 (en) Laser scanner and procedure
DE1583443B1 (en) PROCEDURE FOR DETERMINING THE TEMPERATURE DISTRIBUTION ON THE BATCH SURFACE OF A CHAMBER
DE102014114901A1 (en) Cooking appliance and method for detecting a process parameter of a cooking process
DE102019201332A1 (en) Household cooking appliance and method for operating a household cooking appliance
WO2010130567A2 (en) Cooking hob and method for heating cooking vessels placed on the cooking hob
DE102013114227A1 (en) Cooking appliance with charge detection
WO2022078839A1 (en) Method for determining the time for cleaning a cooking chamber of a cooking appliance
DE102014210673A1 (en) Determining a shelf level of a food support
DE102011075187B3 (en) hob
DE102019201330A1 (en) Microwave device and method for operating a microwave device
DE10125247C1 (en) Household appliance with a cooking space
DE102015101174A1 (en) Hob and method for projecting an image
DE102019107828B4 (en) Method for operating a cooking appliance and cooking appliance
CN208286987U (en) A kind of optical stop system and Ophthalmologic apparatus for Ophthalmologic apparatus
DE102015105128A1 (en) Method and device for measuring the degree of gloss and / or the mattness of objects
DE102015206437B3 (en) Device for determining the thermal expansion and / or structural transformations of samples
WO2019170447A1 (en) Interaction module
BE1030929B1 (en) Method for calibrating a dirty infrared sensor, method for measuring a temperature with a dirty infrared sensor and kitchen appliance

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20170105

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20190506

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20200619

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1333844

Country of ref document: AT

Kind code of ref document: T

Effective date: 20201115

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502015013793

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20201111

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210311

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210211

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210212

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210311

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210211

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: ES

Ref legal event code: FG2A

Ref document number: 2835724

Country of ref document: ES

Kind code of ref document: T3

Effective date: 20210623

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 502015013793

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20210812

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20210630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210603

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210630

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210603

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210311

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210630

REG Reference to a national code

Ref country code: AT

Ref legal event code: MM01

Ref document number: 1333844

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210603

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210603

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20150603

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230504

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20230622

Year of fee payment: 9

Ref country code: DE

Payment date: 20230630

Year of fee payment: 9

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: TR

Payment date: 20230530

Year of fee payment: 9

Ref country code: PL

Payment date: 20230518

Year of fee payment: 9

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IT

Payment date: 20230630

Year of fee payment: 9

Ref country code: GB

Payment date: 20230622

Year of fee payment: 9

Ref country code: ES

Payment date: 20230719

Year of fee payment: 9

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201111