EP4229336A1 - Method for determining the time for cleaning a cooking chamber of a cooking appliance - Google Patents

Method for determining the time for cleaning a cooking chamber of a cooking appliance

Info

Publication number
EP4229336A1
Authority
EP
European Patent Office
Prior art keywords
image
camera
cooking chamber
cooking
pixels
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP21790808.6A
Other languages
German (de)
English (en)
Inventor
Helge Nelson
Jürgen Scharmann
Niklas Birwe
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Miele und Cie KG
Original Assignee
Miele und Cie KG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Miele und Cie KG filed Critical Miele und Cie KG
Publication of EP4229336A1

Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24C DOMESTIC STOVES OR RANGES; DETAILS OF DOMESTIC STOVES OR RANGES, OF GENERAL APPLICATION
    • F24C7/00 Stoves or ranges heated by electric energy
    • F24C7/08 Arrangement or mounting of control or safety devices
    • F24C7/082 Arrangement or mounting of control or safety devices on ranges, e.g. control panels, illumination
    • F24C7/085 Arrangement or mounting of control or safety devices on ranges, e.g. control panels, illumination, on baking ovens
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24C DOMESTIC STOVES OR RANGES; DETAILS OF DOMESTIC STOVES OR RANGES, OF GENERAL APPLICATION
    • F24C14/00 Stoves or ranges having self-cleaning provisions, e.g. continuous catalytic cleaning or electrostatic cleaning

Definitions

  • the present invention relates to a method for determining the time for cleaning a cooking chamber of a cooking appliance, in which camera images from a camera directed into the cooking chamber are fed to evaluation electronics, which uses the camera images to determine the degree of soiling of the cooking chamber using software-supported image analysis.
  • Shading effects from trays or racks in the cooking space are not recognized and are not taken into account.
  • the respective measurement is limited to the size of the window in front of the optical sensor and is therefore inaccurate. It is also not possible to discern the nature of the soiling.
  • the reference value must also have been saved beforehand, although it remains unclear what should actually serve as a reference value.
  • DE 10 2014 116 709 A1 describes a method for determining the degree of soiling of a cooking appliance in which a sensor surface is illuminated at an angle by a light source and the light reflected by the sensor surface is received by a camera, in order to infer a degree of contamination from a comparison of the light emitted by the light source with the light received by the camera. Since both the light source and the camera are outside the cooking chamber, while the dirt accumulates inside the cooking chamber, this method is also imprecise and unreliable.
  • the general problem is that a cooking appliance is cleaned too infrequently and cleaning is postponed even though it could help the function of the cooking appliance and the result of the cooking process. Fat spatters adhering to the interior surfaces of a cooking appliance make up a significant part of the soiling of a cooking appliance.
  • a light source that is intended to illuminate the cooking chamber can be darkened by soiling, so that it no longer illuminates the cooking chamber sufficiently to be able to observe the food well during a cooking process.
  • the image quality of the camera images can deteriorate over the long term if the cooking chamber, and with it the camera's viewing window, is not cleaned. This is particularly troublesome when the camera images are transferred to a display of the cooking appliance, or to a smart device that is independent of the cooking appliance, and the food to be cooked can no longer be seen optimally in the camera images. As a result, the cooking progress is recognized less well and may be incorrectly evaluated.
  • the aesthetic pleasure of watching the cooking progress can also be affected. Delayed cleaning of the cooking chamber can result in an increased cleaning workload, or unpleasant odors can arise from the combustion of the dirt during pyrolytic cleaning. The reaction products from the combustion of the soiling can also exceed permissible limit values for the room air without the user noticing.
  • the object is achieved for a generic method in that the decision as to whether cleaning is required is made from several camera images, based on the brightness values of several or all pixels of the camera images, either by averaging using a metric with subsequent limit value formation or by limit value formation with subsequent averaging using a metric.
  • a variance in the image values can arise in particular if the food to be cooked is moved in the cooking chamber.
  • Camera images can also be used that were taken from various successive cooking processes with different items to be cooked.
  • different brightness values of the individual pixels can also result from a comparison of camera images from a single cooking process if the food to be cooked changes sufficiently during the cooking process, for example rises and/or browns.
  • the brightness values of pixels change when averaging to form an average image, regardless of whether the camera images used for the averaging show the food to be cooked or not, while dirt spots stay in the same place and, with regard to their shape and brightness, remain at least approximately the same over further cooking processes. This is also independent of whether a color or a monochromatic camera is used.
  • Averaging to generate an average image by means of a metric is a mathematical process that combines the image values of corresponding pixels with one another in order to obtain information about regions that do not change, or no longer change, and which can therefore be regarded as dirty regions.
  • An average image can be formed from the multiple camera images by calculating, for example, an arithmetic mean value or a median value as the metric from the gray or color values of pixels or pixel fields that correspond to one another in the camera images.
  • the advantage of a median value can be seen in the fact that individual outlier values have less of an impact on the mean value. For example, if there are fat deposits on wall surfaces in the cooking chamber, these can be clearly seen on the average image as dark spots that contrast in their area and at the edges with the brightness or color values of the surrounding image components of the average image.
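As an illustration of this averaging step, the following minimal sketch (Python with NumPy; it assumes the camera images are already available as equally sized grayscale arrays, and the function and parameter names are our own, not taken from the patent) forms an average image with either the arithmetic mean or the median as the metric:

```python
import numpy as np

def average_image(camera_images, metric="median"):
    """Combine several grayscale camera images (H x W arrays, values 0..255)
    into one average image by applying a per-pixel metric."""
    stack = np.stack(camera_images).astype(np.float64)  # shape (N, H, W)
    if metric == "median":
        return np.median(stack, axis=0)   # robust against individual outliers
    return stack.mean(axis=0)             # arithmetic mean
```

Dirt spots keep nearly constant dark values in every input image, so they survive this averaging largely unchanged, while the varying food regions are smoothed out.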
  • where this description refers to the "average image" on the basis of which evaluations are made, this does not always mean the exact average image in the narrow sense that results from averaging the gray or color values of the individual pixels of the camera images, but in a broader sense also those images, and the gray or color values and other information contained therein, that result from further processing of the average image in the narrower sense, such as a binarized image in which the pixels are only labeled digitally as "dirty" or "not dirty". Where derivations formed from the average image are also meant, this is indicated in each case.
  • the dirty pixels are generally not only darker than non-dirty pixels, but also have a lower standard deviation or a smaller variance. Therefore, instead of the arithmetic mean, the standard deviation or the variance can also be determined as the metric.
  • Other methods can also be used as a metric for forming an average for combining multiple images, such as forming a geometric or harmonic mean, forming averages in different color spaces, determining weighted averages, determining spatial or temporal gradients of brightness and/or color, the use of high-pass, low-pass or other filters, without this list of examples being limited to the methods mentioned.
  • the evaluation electronics can calculate image values for each individual pixel or also for pixel fields in which the image values of several pixels are viewed together. By considering pixel fields, the complexity of the calculations can be reduced without a loss of quality in the assessment of the degree of soiling necessarily having to occur.
  • Information about the degree of soiling is extracted from the average image or a derivation formed from it by subjecting the average image to a software-supported image evaluation.
  • the image evaluation can be carried out using known statistical methods and/or filter techniques with suitable filters applied to individual image values such as the brightness or color values of individual pixels and pixel fields, area distributions of certain brightness or color values, and the like.
  • the image evaluation is no longer dependent on information that is outside the camera images or the average image or a derivation formed therefrom, such as reference images or other reference values.
  • the evaluation electronics can output a signal that a cleaning process is recommended when the degree of soiling determined using the average image or a derivation formed therefrom has reached a threshold value for soiling that is no longer acceptable.
  • the threshold value can be defined, for example, as the percentage of the total area of the average image, or a derivation formed therefrom, that is identified as dirty, and/or as the area percentage of at least one single contiguous area identified as dirty in the total area of the average image or a derivation formed therefrom. Larger pixel fields recognized as dirty can also be given a disproportionate weighting in the evaluation, because larger areas of dirt make cleaning the cooking chamber seem advisable (a sketch of such an area-based check follows below).
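To make the area-based threshold concrete, here is a small sketch in Python (NumPy plus SciPy). The 5 % total share follows the figure mentioned further below, while the limit for a single contiguous area is an illustrative assumption:

```python
import numpy as np
from scipy import ndimage

def cleaning_recommended(dirt_mask, total_limit=0.05, single_area_limit=0.02):
    """Decide whether cleaning should be recommended from a binary mask in
    which True marks pixels classified as dirty."""
    total_fraction = dirt_mask.mean()                    # share of all dirty pixels
    labels, n_regions = ndimage.label(dirt_mask)         # contiguous dirty regions
    largest_fraction = 0.0
    if n_regions > 0:
        sizes = ndimage.sum(dirt_mask, labels, index=range(1, n_regions + 1))
        largest_fraction = float(np.max(sizes)) / dirt_mask.size
    # A single large dirty region can trigger the recommendation on its own,
    # which effectively gives larger contiguous areas a higher weight.
    return total_fraction >= total_limit or largest_fraction >= single_area_limit
```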
  • the evaluation electronics can evaluate the need for cleaning after each cooking process, but other timings are also possible, for example after every third or fifth cooking process, or the evaluation can be triggered based on operating parameters such as the preselected temperature, a grill function that is switched on, the duration of a cooking process, or a combination of these options.
  • the software-supported image processing can be used to classify pixels and/or pixel fields from the camera images, or the average image or a derivation formed therefrom, as dirty or not dirty, and the evaluation electronics determine the degree of soiling of the cooking chamber from the proportions of the pixels and/or pixel fields classified as dirty and not dirty.
  • the evaluation electronics can assess the cooking chamber as in need of cleaning if the proportion of pixels and pixel fields classified as dirty in the total area of the average image or a derivation formed therefrom reaches or exceeds 5%.
  • the proportion can also be higher or lower depending on the programming of the evaluation electronics.
  • the image evaluation of the camera images or the average image or a derivation formed therefrom can take place independently of how users of the cooking appliance individually equip the cooking chamber, for example with grates, baking trays, baking paper or other vessels.
  • the image evaluation of the average image or a derivation formed therefrom is therefore much more accurate in detecting the soiling, and the point in time at which a cleaning process is to be carried out can be determined much more precisely.
  • the cooking appliance can also remind the user that the cooking chamber needs to be cleaned at a point in time before the soiling has a negative effect on the food to be cooked, before the camera image of the cooking chamber no longer shows the food in satisfactory quality or before, in the case of pyrolytic cleaning, unpleasant odors arise that are no longer classified as tolerable by the user or that are no longer tolerable with regard to the pollution of the room air.
  • cleaning can then be carried out manually by the user or thermally using a pyrolysis program.
  • the invention can be implemented easily and inexpensively in cooking appliances, since a suitable camera is already installed as standard in some cooking appliances for professional users, but also in appliances in the consumer sector.
  • the manufacturing costs incurred when implementing the invention in a cooking appliance are then very low.
  • the cameras are often camera systems that have software-supported image analysis. The image evaluation makes it possible to see whether food has been placed in the cooking compartment. Camera images can then be produced which show the cooking chamber with or without the food to be cooked placed in it.
  • other sensor technologies are also possible, such as ultrasound, odor sensors or other technologies, or combinations of such technologies, which can be used to determine whether camera images showing an item to be cooked are being taken with the camera.
  • the sensor signal output by the existing cooking product detection sensor system then only has to be evaluated by suitable software, which forms part of an evaluation electronics system. Therefore there is little programming effort.
  • a physical computing capacity on which the software can run is also often already provided in the operating electronics in a cooking appliance, so that existing equipment technology can also be used with regard to the computing capacity to implement the invention and no additional costs are incurred here either.
  • the evaluation electronics can then consist solely of a software package that is loaded onto the control electronics of the cooking appliance.
  • in the variant with averaging using a metric and subsequent limit value formation, an average image is first calculated; an average brightness value is then calculated for a majority or all of the pixels of the average image; the average brightness values of the pixels are then compared with a limit value, which constitutes a binarizing evaluation threshold dividing the average brightness values into pixels rated as dirty and pixels rated as not dirty; and finally, based on the number of pixels rated as dirty after the limit value comparison, a decision is made as to whether cleaning is necessary.
  • this method of calculation results in an average image in which pixels whose brightness varies across the individual camera images are weakened in their gray tone value by the averaging, while pixels that show dark gray tone values without greater variance in all camera images keep gray tone values that remain in the dark range.
  • the average image can then be simply binarized by marking, for example, all pixels with a gray tone value ≤ 20 as dirty and all pixels with a value > 20 as not dirty in a derivation of the average image.
  • the binarized derivation can now be evaluated by determining how many pixels are considered dirty and whether this number exceeds a threshold value above which cleaning is considered necessary (a sketch of this variant follows below). Where a gray tone value is mentioned here, this initially refers to gray tone images recorded by a monochromatic camera.
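A compact sketch of this first variant (averaging first, limit value second) could look as follows in Python/NumPy; the gray tone limit of 20 follows the example above, the 5 % pixel share reuses the threshold mentioned earlier, and the arithmetic mean stands in for whichever metric is actually chosen:

```python
import numpy as np

def needs_cleaning_avg_then_threshold(camera_images, dark_limit=20,
                                      dirty_share_limit=0.05):
    """Variant 1: average the camera images, binarize the average image,
    then count the pixels rated as dirty."""
    stack = np.stack(camera_images).astype(np.float64)
    avg_image = stack.mean(axis=0)              # average image (metric: mean)
    dirt_mask = avg_image <= dark_limit         # dark pixels are rated as dirty
    return dirt_mask.mean() >= dirty_share_limit, dirt_mask
```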
  • in a gray tone image, each pixel is assigned a value that represents its brightness or light intensity, for example on a scale from 0 to 255, where 0 stands for a black pixel and 255 for a white pixel.
  • if a color camera is used, a gray tone image can first be generated from the color image.
  • Commercially available color cameras generate color information by placing a wavelength-selective, light-permeable filter grid, a so-called Bayer filter, in front of the sensor chip, which, depending on the position in the grid, allows red, green or blue light to pass through.
  • a so-called RGB image can then be reconstructed from the red, green and blue color components.
  • each pixel then has three values, which represent the light intensity in the red, green and blue areas of the optical spectrum.
  • a gray image can now be generated from the RGB image by reducing the color information to pure brightness information.
  • in the simplest case, the gray value is determined for each pixel as the average of the red, green and blue values. It is thus possible to form gray tone values even when using color cameras. It is crucial for the further procedure that an image is available in which the light intensity, and thus the brightness, is represented by the values of the individual pixels.
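The color-to-gray reduction described here is, in its simplest form, a per-pixel channel average; a minimal sketch, assuming an H x W x 3 RGB array with values 0..255:

```python
import numpy as np

def rgb_to_gray(rgb_image):
    """Reduce an RGB image to a gray image by averaging the red, green and
    blue values of each pixel (the simple average described in the text,
    not a luminance-weighted conversion)."""
    return rgb_image.astype(np.float64).mean(axis=2)
```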
  • in the variant with limit value formation first and subsequent averaging using a metric, an individual brightness value is first calculated using a metric for a majority or all of the pixels of each camera image; the individual brightness values of the pixels are then compared with a limit value, which constitutes a binarizing evaluation threshold dividing the pixels of the camera image into pixels assessed as dirty and pixels assessed as not dirty; a binary contamination image is created for each camera image from the pixels assessed as dirty or not dirty; an average image is then calculated from a majority or all of the pixels of the binary contamination images, in which the gray level value of each pixel corresponds to a contamination probability; and the gray level values of the pixels are combined into an average contamination probability value of the average image, which is compared with a limit value; if the limit value is exceeded, cleaning is assessed as necessary (a sketch of this variant follows below).
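A sketch of this second variant (limit value first, averaging second), with the same illustrative limits as before; the per-pixel mean of the binary contamination images is interpreted as a contamination probability:

```python
import numpy as np

def needs_cleaning_threshold_then_avg(camera_images, dark_limit=20,
                                      probability_limit=0.05):
    """Variant 2: binarize each camera image on its own, average the binary
    contamination images into a probability image, then compare the mean
    contamination probability with a limit value."""
    stack = np.stack(camera_images).astype(np.float64)
    binary_dirt = (stack <= dark_limit).astype(np.float64)  # one mask per image
    probability_image = binary_dirt.mean(axis=0)            # per-pixel probability
    return probability_image.mean() >= probability_limit, probability_image
```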
  • some or all of the several camera images can be color images, in which case the color channels of the individual pixels are evaluated separately or the images are processed further without the color values.
  • the camera images are produced by the camera through a transparent layer that separates the camera from the cooking chamber, with the transparent layer forming part of the inner wall of the cooking chamber.
  • the transparent layer can consist of a pane of glass, for example. Soiling such as fat splashes that occur in the cooking chamber during the cooking process are deposited on the interior surfaces of the cooking chamber and thus also on the transparent layer.
  • the soiling of the transparent layer is thus similar to the soiling of the inner surfaces in the rest of the cooking space and is therefore representative of the degree of soiling of the cooking space overall.
  • since the transparent layer is located between the cooking chamber and the camera, the dirt that collects on the transparent layer blocks the camera's view of the food to be cooked that, from the camera's point of view, lies behind the dirt in the cooking chamber.
  • the image values of the contamination on the transparent layer contrast well, in the average image or a derivation formed therefrom, with the surrounding image areas, because those areas show values averaged from individual image values that differ from one another due to the different items to be cooked recorded during different cooking processes.
  • the multiple camera images show multiple different cooking processes and/or loading situations.
  • a certain number of camera images is required so that a statistical smoothing of the image values of the individual pixels or pixel fields, with their characteristic distribution, is established.
  • this yields brightness and/or color values for the individual pixels or pixel fields from which soiled areas become visible.
  • the more camera images from different cooking processes are included in the averaging, the better the recognition.
  • the changes that can be detected between different camera images of a single cooking process are often too small to derive reliable conclusions about the degree of soiling of the cooking chamber.
  • the evaluation electronics evaluate the quality of the average image or a derivation formed from it using an evaluation algorithm, and a point in time for cleaning the cooking chamber is only determined if the quality of the average image or a derivation formed therefrom has been assessed as sufficient. If the averaged image is averaged from camera images that have too little variance with regard to the image values for the individual pixels or pixel fields, soiling in the cooking chamber cannot be identified with sufficient certainty and clearly. A small variance can easily be detected by an evaluation algorithm by comparing individual pixel values from different images. There are various stochastic methods for this.
  • the camera images can, for example, vary only slightly if the same homogeneous-looking food is always being cooked and/or only a single small food item is always being cooked in the same place in the cooking chamber, the food is not moved in the cooking chamber, or the camera images all show the same empty cooking chamber. In such a case, further camera images of the cooking chamber are required in order to obtain the variance in the image values for individual pixels or pixel fields that is necessary for detecting soiling. In such a case, the evaluation electronics should not give any recommendation as to when cleaning is required. The evaluation electronics only make a recommendation for the time of cleaning when the evaluation algorithm considers the quality of the average image, or a derivation formed from it, sufficient to serve as the basis for a recommendation (a sketch of such a variance check follows below).
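One simple way to implement such a quality check is to look at the per-pixel variation across the stored camera images and refuse a recommendation while it is too small; the minimum value used here is purely illustrative, not a figure from the patent:

```python
import numpy as np

def average_image_quality_sufficient(camera_images, min_median_std=10.0):
    """Quality gate before any cleaning recommendation: require enough
    variance in the image values across the collected camera images."""
    stack = np.stack(camera_images).astype(np.float64)
    per_pixel_std = stack.std(axis=0)            # variation of each pixel over time
    return float(np.median(per_pixel_std)) >= min_median_std
```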
  • the degree of contamination of pixels or pixel fields of the average image or a derivation formed therefrom is weighted differently in different sectors of the average image or a derivation formed therefrom.
  • This measure is advantageous for compensating for imbalances in the assessment of the degree of contamination of the cooking chamber, which can arise, for example, from the focusing of the camera image and/or the perspective with which the camera is directed into the cooking chamber. For example, if the camera is aimed into the cooking chamber at an angle from above, soiling that appears in the average image, or a derivation formed from it, in the area close to the camera would, without a correction, be weighted more heavily due to the perspective distortion than soiling located in an area further away from the camera.
  • This perspective effect can be corrected by weighting the detected contamination differently, depending on the sector, when determining the degree of contamination.
  • Such perspective distortions would not play a role if the transparent layer is arranged flat in front of the camera lens.
  • different weightings of pixels and pixel fields recognized as dirty can also result from optical distortions caused by the lens optics in the average image or a derivation formed therefrom, because the lens optics reproduce different sectors of the camera image at different sizes.
  • a sector-dependent correction of the weighting of the detected contamination is also possible here.
  • Another option is to weight the outer areas of a camera image, in which the side walls can be seen, for example, less heavily than the center of the cooking space, where the food is usually located. This measure can improve the averaging of the camera images. If camera images are transmitted and displayed, or if additional evaluation functions are carried out based on camera images, these are significantly less disturbed by dirt in the outer areas than by dirt in the center of the picture. In such cases, the visual assessment of the degree of soiling by a user looking into the cooking chamber from the outside can deviate from the assessment of the degree of soiling by the method according to the invention. For the display of the camera image or the additional evaluation functions, however, the evaluation using the method according to the invention is advantageous because the evaluation of the degree of soiling is based on the camera images that are displayed and/or additionally evaluated.
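Sector-dependent weighting can be expressed as a per-pixel weight map that is applied to the dirt mask before the dirty share is computed; the center-weighted map built here is only a placeholder for whatever perspective or relevance correction a concrete appliance would need:

```python
import numpy as np

def weighted_dirt_fraction(dirt_mask, weight_map=None):
    """Weight the dirt found in a binary mask differently per sector."""
    h, w = dirt_mask.shape
    if weight_map is None:
        y, x = np.ogrid[:h, :w]
        # normalized distance from the image center
        dist = np.sqrt(((y - h / 2) / (h / 2)) ** 2 + ((x - w / 2) / (w / 2)) ** 2)
        weight_map = 1.0 - 0.5 * np.clip(dist, 0.0, 1.0)  # center counts more
    return float((dirt_mask * weight_map).sum() / weight_map.sum())
```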
  • the classification of the pixels and/or pixel fields of the average image or a derivation formed therefrom can take place taking into account the probabilities of the respective classification. For example, a first pixel or a first pixel area can be rated as "very likely", "likely" or "unlikely" dirty, and/or a second pixel or a second pixel area can be rated as "very likely", "likely" or "unlikely" not dirty. More or fewer gradations in the probability classification are possible.
  • the differentiated classification of the individual pixels or pixel areas results in a more precise situational image of the degree of soiling of the cooking space in the cooking appliance. Such a soft classification with, for example, a total of six different classification levels could be superior to a digital yes/no classification because the certainty with which a pixel or a pixel field was assigned a classification is included in the assessment of the degree of contamination.
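A soft, multi-level classification could be realized, for example, by mapping a per-pixel contamination probability onto six classes and weighting each class differently; both the class boundaries and the weights below are invented for illustration and are not taken from the patent:

```python
import numpy as np

def soft_dirt_score(probability_image):
    """Map per-pixel contamination probabilities in [0, 1] onto six
    classification levels and combine them into one weighted score."""
    boundaries = [0.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.01]      # edges of 6 classes
    weights = np.array([0.0, 0.1, 0.3, 0.5, 0.8, 1.0])     # contribution per class
    levels = np.clip(np.digitize(probability_image, boundaries) - 1, 0, 5)
    return float(weights[levels].mean())
```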
  • the average image is converted into a gray image.
  • An average image can show color fluctuations that are not due to the photographed food or soiling, but to other effects, such as uneven lighting of the cooking chamber.
  • the color fluctuations can be removed from the average image by converting the image values of the pixels or pixel fields into a gray image.
  • the gray image or the average image is converted into a differential image by filtering.
  • in the image values of the pixels and pixel fields, dirty spots may have a higher gray value than other spots in the average image, or the dirty spots may have a particular color value that other spots in the average image do not have.
  • by means of filters, these spots in the gray image or the average image can be made particularly recognizable in a difference image, for example by using thresholding to hide other image values.
  • structures with a small size are removed from the average image, the gray image and/or the difference image in a correction run. Due to the random nature of the original camera images, small local variations that do not represent contamination cannot be avoided. However, structures with a small size can be ignored or removed via morphological operations. This takes place in a correction run to which the average image, the gray image and/or the difference image are subjected. In the correction run, the structures recognized by the image processing software are smoothed over a large area, particularly in the area of possible contamination. The image remaining after the correction run shows the spots where dirt is located in the average image with a high level of hit accuracy.
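The removal of small structures mentioned here is a standard morphological operation; a sketch using SciPy's binary opening, where the 3 x 3 neighborhood is an assumed size:

```python
import numpy as np
from scipy import ndimage

def remove_small_structures(dirt_mask, structure_size=3):
    """Correction run: suppress small, randomly caused structures in a
    binary dirt mask by a morphological opening."""
    structure = np.ones((structure_size, structure_size), dtype=bool)
    return ndimage.binary_opening(dirt_mask, structure=structure)
```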
  • the areas of such false positives can be image areas in which soiling is indicated although there is actually no soiling. Shadows from the lighting can be found in such areas, for example, or the areas are covered by components of the cooking appliance. If such image areas were included in the assessment of the degree of soiling, the result would be falsified. This is avoided by ignoring these areas. Depending on the appliance, these areas can be located at different points in the images. Accordingly, the areas of the false positives can be programmed device-specifically into the operating software, or the operating software contains a self-learning program with which areas of false positives are recognized and excluded from the evaluation.
  • the evaluation electronics delete previously recorded camera images.
  • pictures taken before the cleaning may no longer be used, since they show dirt that is no longer present. Therefore, the memory must be reset as soon as a cleaning has been performed.
  • a sufficient number of camera images must also be created in order to be able to carry out a new assessment.
  • the evaluation electronics are aware of this and can automatically empty their memory. If the user cleans manually, a memory for the current degree of soiling can be provided for this case, in which the last evaluated degree of soiling is stored. The status of this memory can be queried with each new evaluation. If it changes towards clean, previously recorded camera images must also be deleted, because the current status apparently no longer corresponds to the previously determined degree of soiling.
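The memory handling described in the last few points can be sketched as a small bookkeeping class; the class name, attributes and the comparison against the previously stored soiling level are assumptions for illustration, not part of the patent:

```python
class SoilingImageStore:
    """Keeps the collected camera images and drops them once the cooking
    chamber has evidently been cleaned."""

    def __init__(self):
        self.images = []                 # camera images collected so far
        self.last_soiling_level = None   # last evaluated degree of soiling

    def add_image(self, image):
        self.images.append(image)

    def update_soiling_level(self, new_level):
        # If the newly evaluated level is clearly below the stored one, the
        # cooking chamber has apparently been cleaned in the meantime, so the
        # old images no longer reflect the current state and are deleted.
        if self.last_soiling_level is not None and new_level < self.last_soiling_level:
            self.images.clear()
        self.last_soiling_level = new_level
```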
  • Fig. 1 Cooking device with a camera
  • Fig. 2 Procedure with averaging and subsequent limit value formation
  • FIG. 3 Process sequence from Fig. 2 with reversed sequence.
  • An exemplary embodiment of the invention is shown purely schematically in FIG. 1 and is described in more detail below.
  • Fig. 1 shows a cooking appliance according to the invention with a camera.
  • FIG. 1 shows a cooking appliance 30 according to the invention as an example.
  • the cooking appliance 30 is embodied here as a baking oven and has a housing 2 in which a cooking chamber 6 delimited by a housing wall 4 is arranged.
  • the baking oven also has a 2D camera 8 designed as a fixed-focus camera as an optical sensor, with an opening 4.1 for the camera 8 being formed in the housing wall 4 and covered by a lens 10 designed as a converging lens made of borosilicate glass.
  • a cooking chamber lighting 11 designed as an oven lamp is attached to a cooking chamber ceiling 4.2 of the housing wall 4.
  • the baking oven light 11 is designed here at the same time as a camera light 11 .
  • the fixed focus camera 8 in the present embodiment can be constructed more simply.
  • the cooking chamber lighting 11 can also be arranged on a side wall or consist of a combination of ceiling and side lighting.
  • the camera 8 is designed to take pictures of the interior of the cooking chamber 6 and is arranged accordingly on the housing 2 . Since the cooking chamber 6 is relatively wide and flat, a lens 8.1 of the camera 8 has a correspondingly large opening angle on the object side.
  • the camera can also be designed as a plenoptic camera, TOF camera or stereo camera, ie using 3D technology. Cameras designed as thermal imaging cameras are also conceivable. In addition to single images, the recording of image sequences and moving images is also possible.
  • the opening 4.1 is closed essentially gas-tight by the converging lens 10 and a seal 12 arranged between the converging lens 10 and the cooking chamber ceiling 4.2 of the housing wall 4 and designed as a glass fiber seal.
  • a metal gasket with or without a graphite coating can be used.
  • the vapors produced during a cooking process of a cooking item 16, which is placed on a cooking item support 14 in the cooking chamber 6, cannot escape through the opening 4.1 in an undesired manner.
  • the camera 8 can also be arranged in a lateral position or in any other position relative to the cooking chamber 6, so that it can also take useful photos of the cooking chamber 6 and the dirt present therein from these positions for evaluation purposes.
  • the converging lens 10 made of borosilicate glass and the glass fiber seal 12 protect the camera 8 from substances contained in the vapor and from the high temperatures arising during the cooking process. This is particularly important since the cooking appliance 30 is equipped as a self-cleaning cooking appliance 30, i.e. with a pyrolysis function, and far higher temperatures occur in pyrolysis operation.
  • the cooking chamber 6 is thermally insulated from the environment in a manner known to those skilled in the art by a substantially circumferential insulating layer 18 .
  • the insulating layer 18 is only partially shown in the figure.
  • the opening 4.1 in the cooking chamber wall 4 is arranged here in the center of the cooking chamber ceiling 4.2 of the housing wall 4.
  • the opening 4.1 is kept as small as possible in order to have to break through the insulation layer 18 as little as possible at this point.
  • the opening 4.1 is shown significantly enlarged in the figure for the sake of clarity.
  • the camera 8 is arranged further away from the cooking chamber ceiling 4.2 and thus from the cooking chamber 6 in a cooler area of the housing 2 due to the insulating layer 18, but also for the purpose of reducing the temperature load on the camera 8 by the thermal radiation.
  • the opening 4.1 can be made so small because a converging lens 10 is used here.
  • the converging lens 10 widens the beam path of the light, symbolized by dashed lines 19 in the figure, through the opening 4.1.
  • this enables the cooking chamber 6 to be shown completely in the essential areas on the images recorded with the camera 8, despite a reduced opening angle of the lens 8.1 of the camera 8 on the object side.
  • the angle of view in the plane of the page shown in the figure can be increased from approximately 50° to approximately 110°.
  • the aforementioned angle information cannot be inferred from the schematic representation.
  • the opening 4.1 can be arranged in the region of a top-heating element 20 in the cooking chamber ceiling 4.2, without the opening 4.1 and the top-heating element 20 having a negative effect on one another.
  • the top heater 20 does not appear on the images captured by the camera 8.
  • the cooking chamber lighting 11 is also arranged on the cooking chamber ceiling 4.2 of the cooking chamber wall 4 such that there is no undesired interaction between the cooking chamber lighting 11 and the top-heating element 20.
  • the camera 8 and the converging lens 10 are also arranged in a cooling channel 2.1 of the housing 2.
  • the camera 8 and the converging lens 10 are in heat transfer connection with the cooling air guided in the cooling channel 2.1.
  • the cooling channel 2.1 is designed here as an air duct 2.1 of the cooking appliance.
  • the air duct 2.1 is connected to the housing wall 4 in a flow-conducting manner by means of a flange 22, which is also resistant to chemicals and temperature, and the air duct 2.1 is sealed off from the environment in a substantially gas-tight manner.
  • the air duct 2.1 of the cooking appliance 30 is used to suck the vapors produced during cooking processes in the cooking chamber 6 out of the cooking chamber 6 and to discharge them into the open air.
  • the vapor-air mixture is guided in the air duct 2.1 formed in the housing 2.
  • the cooking appliance is also cooled in this way.
  • the cooking device 30 can be implemented with fewer components and thus more cost-effectively.
  • the individual components of the cooking appliance 30 according to the invention are partially shown at a distance from one another in the figure; see, for example, the distance between the insulation layer 18 and the cooking chamber ceiling 4.2 of the housing wall 4. This is only for a better overview.
  • the components of the cooking appliance 30 according to the invention are connected to one another in a manner known to those skilled in the art, unless explicitly stated otherwise using the exemplary embodiment.
  • the camera 8 can also be designed as a plenoptic camera, TOF camera or stereo camera, ie using 3D technology. Cameras designed as thermal imaging cameras are also conceivable. In addition to single images, the recording of image sequences and moving images is also possible.
  • a camera image 32 recorded by the camera 8 is transmitted to the evaluation electronics 40 in each case.
  • the cooking appliance 30 is connected to the evaluation electronics 40 to which the camera images 32 taken by the camera 8 are transmitted.
  • the evaluation electronics 40 have image processing software 46 with which the software-supported image processing is carried out.
  • the individual camera images 32 produced by the camera 8, which are used for the image evaluation, can be stored in the evaluation electronics 40, but it is also possible to store them in the operating electronics 34 of the cooking appliance 30 or, if the cooking appliance 30 has an Internet connection and is appropriately equipped, in the cloud.
  • the image processing software 46 is programmed in such a way that it calculates an average image 42 from a plurality of available camera images 32 as an image evaluation.
  • the evaluation electronics 40 extracts information about the degree of soiling of the cooking chamber 6 from the average image 42 .
  • the operating electronics 34 can then communicate this point in time to an operator via the available displays, be it an indicator light on the cooking appliance 30, an indication on a display on the cooking appliance 30 or a display of a smart device that communicates with the cooking appliance 30 via WLAN or the Internet.
  • the lens 10 represents a transparent layer through which the camera 8 produces camera images 32 of the cooking chamber 6 .
  • Fat spatters that splash up from the food 16 to be cooked during a cooking process also adhere to the lens 10, which forms part of the inner wall of the cooking chamber 6 on the side of the lens 10 facing the cooking chamber 6. From the point of view of the camera, this dirt adhering to the lens 10 obscures the view of the food 16 to be cooked in the cooking chamber 6 at the points where it is located on the lens 10, so that there are no more changes, or at most only minor changes, in the image values of the relevant pixels or pixel fields.
  • the items to be cooked 16 lying in the cooking chamber 6 during the respective cooking process can be clearly seen in the camera image 32 at the locations of the lens 10 where there is no soiling, and the different textures and colors of the items to be cooked 16 in the cooking chamber 6 result in different image values for the relevant pixels or pixel fields in the camera images 32, which are then combined by the averaging into an averaged value in the average image 42.
  • the double arrow 48 indicates that the evaluation electronics 40 and the operating electronics 34 can communicate with one another beyond the provision of the information on the point in time 44.
  • the communication can relate in particular to cooking item identification, which can be a component of the operating electronics 34 or of the camera 8 .
  • the cooking product recognition can be used to verify for the evaluation electronics 40 that the available camera images 32 show the cooking chamber 6 with the food to be cooked in it 16 and that the camera images 32 were produced on the occasion of various cooking processes.
  • the evaluation electronics 40 can further process the calculated average image 42 into a gray image 50 that can offer better evaluation options.
  • the gray image 50 alone or in combination with the average image 42 can be further processed to form a difference image 52, in which certain values can have been filtered out of the image values for individual pixels or pixel fields in order to obtain an image that can be evaluated even better.
  • the average image 42, the gray image 50 and/or the differential image 52 can be used by the evaluation electronics 40 to determine the degree of soiling of the cooking chamber 6.
  • for each pixel, a mean value of its brightness values is first formed from a plurality of camera images 32 using a metric. Once this new brightness value, determined from a number of camera images 32, is available, it can be entered for this pixel in the average image 42.
  • the average image is composed of a large number of brightness values calculated in this way for the individual pixels.
  • In the limit value formation 80, the average image 42 is binarized in that a decision is made for each pixel examined in the context of the limit value formation 80 as to whether its brightness value 60 is above or below an assumed limit value. If the brightness value is above the limit value, the corresponding pixel is considered not dirty; if it is below, the pixel is considered dirty. A decision can then be made on the basis of the contamination found as to whether a cleaning cycle is necessary.
  • Fig. 3 shows the procedure from Fig. 2, but in reversed order, with the limit value formation 80 taking place first and the mean value formation 70 afterwards.
  • the camera 8 can be a monochromatic camera 8 that records a gray tone image.
  • In a grayscale image, each pixel is assigned a value that represents its brightness or light intensity, for example on a scale from 0 to 255, where 0 stands for a black pixel and 255 for a white pixel. If a color camera 8 is used, a gray tone image can first be generated from the color image.
  • a so-called RGB image can then be reconstructed from the red, green and blue color components.
  • each pixel has three values, which represent the light intensity in the red, green and blue areas of the optical spectrum.
  • a gray image can now be generated from the RGB image by reducing the color information to pure brightness information. In the simplest case, the gray value is determined for each pixel as the average of the red, green and blue values.
  • the value 20 corresponds to a very dark gray on the scale used. If a pixel is even darker, the light intensity here is extremely low, which can indicate that the incidence of light is being disturbed by dirt at this point.
  • the dirty pixels are marked in white in the figure above, the non-dirty ones are black. It is clearly visible that recognition from a single camera image often does not work and the food to be cooked has a major impact on recognition. For example, the spots between the French fries are incorrectly identified as soiling.
  • An average image 42 is calculated from the multiple camera images, in which the average brightness is calculated for each pixel using a metric. The equation below shows this for 3 sample images, each 2 x 2 pixels.
  • In the resulting image D, the arithmetic mean of the corresponding pixels from all three images A, B and C is calculated for each pixel in this exemplary embodiment.
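The equation referred to above is not reproduced in this text; under the stated assumption of three 2 x 2 images and an arithmetic mean, it can be reconstructed as follows, with invented pixel values chosen so that the constant dark pixels (the "dirt") survive the averaging:

```latex
D_{ij} = \frac{A_{ij} + B_{ij} + C_{ij}}{3}, \qquad \text{e.g.}\quad
A = \begin{pmatrix} 30 & 120 \\ 10 & 200 \end{pmatrix},\;
B = \begin{pmatrix} 30 & 60 \\ 10 & 140 \end{pmatrix},\;
C = \begin{pmatrix} 30 & 90 \\ 10 & 80 \end{pmatrix}
\;\Longrightarrow\;
D = \begin{pmatrix} 30 & 90 \\ 10 & 140 \end{pmatrix}
```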
  • the average image displayed below can be calculated from several camera images. This average image was determined from only six camera images, so the different loads of the cooking chamber 6 can still be clearly identified, although they are already weakened by the averaging. In contrast, the contamination is always the same and is not weakened by the averaging (see Figure 5).
  • the detection corresponds to a very good approximation of the dirt that is actually present.
  • the large lower soiling is not fully recognized.
  • for such cases, a brightness correction can be useful.
  • the invention is not limited to the present embodiment.
  • the teaching according to the invention can also be advantageously used in cooking appliances other than ovens.
  • the cameras or other optical sensors may also be positioned elsewhere than suggested in the foregoing description, inside or outside the cooking chamber, for example in the cooking chamber access door or in front of it, such as on the side of a handle facing the cooking chamber, and multiple cameras and/or optical sensors may be used.

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

The present invention relates to a method for determining the time for cleaning a cooking chamber (6) of a cooking appliance (30), in which camera images (32) from a camera (8) directed into the cooking chamber (6) are fed to evaluation electronics (40) which, on the basis of the camera images (32), determine the degree of soiling of the cooking chamber (6) by means of software-supported image analysis. According to the invention, in order to determine the correct time for cleaning the cooking chamber (6) with improved accuracy, the need for cleaning is determined from a plurality of camera images (32), more particularly from the brightness values (60) of a plurality or all of the pixels of the camera images (32), by means of averaging (70) using a metric and subsequent limit value formation (80), or by means of limit value formation (80) and subsequent averaging (70) using a metric.
EP21790808.6A 2020-10-14 2021-10-06 Procédé de détermination du temps de nettoyage d'une chambre de cuisson d'un appareil de cuisson Pending EP4229336A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102020126930.7A DE102020126930A1 (de) 2020-10-14 2020-10-14 Verfahren zur Bestimmung des Zeitpunktes für die Reinigung eines Garraums eines Gargeräts
PCT/EP2021/077565 WO2022078839A1 (fr) 2020-10-14 2021-10-06 Procédé de détermination du temps de nettoyage d'une chambre de cuisson d'un appareil de cuisson

Publications (1)

Publication Number Publication Date
EP4229336A1 true EP4229336A1 (fr) 2023-08-23

Family

ID=78134930

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21790808.6A Pending EP4229336A1 (fr) 2020-10-14 2021-10-06 Procédé de détermination du temps de nettoyage d'une chambre de cuisson d'un appareil de cuisson

Country Status (3)

Country Link
EP (1) EP4229336A1 (fr)
DE (1) DE102020126930A1 (fr)
WO (1) WO2022078839A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12010409B2 (en) * 2021-12-22 2024-06-11 Whirlpool Corporation Camera view port dedicated self cleaning cycles
US11986862B2 (en) * 2022-04-25 2024-05-21 John Bean Technologies Corporation System and method for optimizing a cleaning session of a food processing system
DE102022204280A1 (de) * 2022-05-02 2023-11-02 BSH Hausgeräte GmbH Betreiben eines Gargeräts mit mindestens einer Garraumkamera
WO2024053908A1 (fr) * 2022-09-07 2024-03-14 삼성전자주식회사 Appareil de cuisson et son procédé de commande

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10128024B4 (de) 2001-06-08 2006-07-06 BSH Bosch und Siemens Hausgeräte GmbH Gargerät
WO2014102074A1 (fr) 2012-12-26 2014-07-03 Arcelik Anonim Sirketi Four à pyrolyse
DE102014116709A1 (de) 2014-11-14 2016-05-19 Rational Aktiengesellschaft Gargerät sowie Verfahren zur Bestimmung des Verschmutzungsgrades eines Gargeräts
KR101644711B1 (ko) 2014-12-08 2016-08-01 엘지전자 주식회사 전자레인지 및 전자레인지의 제어 방법
DE102016206483A1 (de) 2016-04-18 2017-10-19 Convotherm Elektrogeräte GmbH Verfahren zum Feststellen einer Reinigungsnotwendigkeit und Qualitätsmanagement-Überwachungssystem eines gewerblichen Gargeräts, und gewerbliches Gargerät
JP6842955B2 (ja) * 2017-03-13 2021-03-17 三菱電機株式会社 加熱調理器
DE102017206058A1 (de) 2017-04-10 2018-10-11 BSH Hausgeräte GmbH Bestimmen eines Verschmutzungsgrads in einem Garraum
DE102017206056A1 (de) 2017-04-10 2018-10-11 BSH Hausgeräte GmbH Betreiben eines Gargeräts
JP7190632B2 (ja) * 2017-06-07 2022-12-16 パナソニックIpマネジメント株式会社 加熱調理器および加熱調理器の制御方法
EP3477205B1 (fr) * 2017-10-25 2020-04-22 Diehl AKO Stiftung & Co. KG Appareil électroménager
WO2019208342A1 (fr) * 2018-04-27 2019-10-31 パナソニックIpマネジメント株式会社 Dispositif de cuisson chauffant
JP7141328B2 (ja) * 2018-12-25 2022-09-22 東京瓦斯株式会社 洗浄システム、調理機、そのプログラムおよび洗浄方法
EP3767183A1 (fr) 2019-07-16 2021-01-20 Electrolux Appliances Aktiebolag Procédé de reconnaissance de la pollution d'un appareil domestique et système de reconnaissance de pollution pour appareil domestique

Also Published As

Publication number Publication date
DE102020126930A1 (de) 2022-04-14
WO2022078839A1 (fr) 2022-04-21

Similar Documents

Publication Publication Date Title
WO2022078839A1 (fr) Procédé de détermination du temps de nettoyage d'une chambre de cuisson d'un appareil de cuisson
EP3446040B1 (fr) Procédé pour déterminer une nécessité de nettoyage d'un appareil de cuisson industriel et appareil de cuisson industriel
WO2018033383A1 (fr) Détermination d'un degré de dorage d'un produit à cuire
EP3152498B1 (fr) Appareil de cuisson avec projecteur de motif lumineux et caméra
DE202011002570U1 (de) Vorrichtung zur automatischen Wärmebehandlung von Lebensmitteln
EP3872403A1 (fr) Procédé de détermination d'une durée du cycle de nettoyage
DE102016107617A1 (de) Verfahren zum Betreiben eines Gargeräts und Gargerät
DE102019107846A1 (de) Verfahren zum Betreiben eines Gargeräts und Gargerät
CN112188190A (zh) 污渍检测方法、烹饪器具、服务器和存储介质
EP3904769A1 (fr) Appareil de cuisson avec détection électronique des erreurs de l'opérateur
EP3742051B1 (fr) Régulation de l'exposition des produits à cuire
WO2021032477A1 (fr) Fonctionnement d'un appareil de cuisson ménager doté d'au moins une caméra
DE10015760C2 (de) Verfahren zum Betrieb eines Garofens sowie Garofen
DE102019107828B4 (de) Verfahren zum Betreiben eines Gargeräts und Gargerät
DE102023209367A1 (de) Betreiben eines Gargeräts mit einer digitalen Garraum-Farbkamera
EP4274997A1 (fr) Procédé pour déterminer une fin de temps de cuisson d'aliments, et appareil de cuisson électroménager
DE102020126249A1 (de) Gargerät, insbesondere gewerbliches Gargerät
EP3767580A1 (fr) Unité de commande et procédé d'évaluation des données d'image dans un appareil électroménager
BE1030431B1 (de) Verfahren zum Garen eines Garguts in einem Gargerät mit einer Kameraeinrichtung
BE1030917B1 (de) Gargerät und Verfahren zum Betreiben eines Gargeräts
BE1030426B1 (de) Verfahren zum Garen eines Garguts in einem Gargerät mit einer Kameraeinrichtung
WO2023213514A1 (fr) Fonctionnement d'un appareil de cuisson avec au moins une caméra de chambre de cuisson
BE1030428B1 (de) Verfahren zum Garen eines Garguts in einem Gargerät mit einer Kameraeinrichtung
WO2022089977A1 (fr) Procédé pour faire fonctionner un appareil de cuisson et appareil de cuisson
DE102023106641A1 (de) Verfahren zum Garen eines Garguts in einem Gargerät mit einer Kameraeinrichtung

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230515

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)