WO2019059304A1 - Cooking appliance - Google Patents

Cooking appliance

Info

Publication number
WO2019059304A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
heating
unit
camera
color
Prior art date
Application number
PCT/JP2018/034893
Other languages
English (en)
Japanese (ja)
Inventor
貞平 匡史
藤田 篤志
ザリナ ラフィー
幸 裕弘
久保谷 寛行
藤濤 知也
Original Assignee
Panasonic IP Management Co., Ltd. (パナソニックIpマネジメント株式会社)
Priority date
Filing date
Publication date
Priority claimed from JP2017183991A external-priority patent/JP6906192B2/ja
Priority claimed from JP2018096532A external-priority patent/JP7178622B2/ja
Application filed by Panasonic IP Management Co., Ltd.
Publication of WO2019059304A1

Classifications

    • H — ELECTRICITY
    • H05 — ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B — ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B 6/00 — Heating by electric, magnetic or electromagnetic fields
    • H05B 6/02 — Induction heating
    • H05B 6/10 — Induction heating apparatus, other than furnaces, for specific applications
    • H05B 6/12 — Cooking devices

Definitions

  • the present invention relates to a heating cooker for heating an object to be cooked.
  • There is known a heating cooker that photographs a cooking object with a camera and controls heating of the cooking object based on the image of the cooking object appearing in the photographed image (see, for example, Patent Document 1).
  • When the light environment around the heating cooker changes, the color tone of the image of the object to be cooked captured in the photographed image of the camera also changes.
  • For example, depending on the illumination, the entire photographed image, that is, the image of the object to be cooked, may become reddish (the hue changes).
  • The entire photographed image, that is, the image of the object to be cooked, may also be washed out by the illumination (the saturation may decrease).
  • Likewise, if the light environment differs between installation locations, the color tone of the image of the cooking object in the photographed image will also differ.
  • An object of the present invention is to perform heating of the cooking target appropriately, without being affected by changes or differences in the light environment.
  • a main body including a top surface including a container placement unit on which a container containing a cooking target is placed;
  • a heating unit configured to heat the container placed on the container placement unit from below;
  • a marker provided on the main body and photographed by the camera;
  • the color tone of the image of the marker in the photographed image of the camera is measured, and based on the measurement result, the photographed image is color corrected so that the image of the marker is photographed in the actual color of the marker itself in the photographed image.
  • an image processing unit that performs the above color correction; a cooking target state estimation unit that estimates the state of the cooking target based on the image of the cooking target in the photographed image color-corrected by the image processing unit; and a heating control unit that controls the heating unit based on the state of the cooking target estimated by the cooking target state estimation unit.
  • A heating cooker according to another aspect includes: a container containing the object to be cooked; a main body including a top surface provided with a container placement unit on which the container is placed; a heating unit that heats, from below, the container placed on the container placement unit; a camera that photographs the cooking object; and a marker provided on the container and photographed by the camera. The color tone of the image of the marker in the photographed image of the camera is measured, and based on the measurement result, the photographed image is color-corrected so that the image of the marker appears in the actual color of the marker itself.
  • It further includes an image processing unit that performs this color correction; a cooking target state estimation unit that estimates the state of the cooking target based on the image of the cooking target in the photographed image color-corrected by the image processing unit; and a heating control unit that controls the heating unit based on the state of the cooking target estimated by the cooking target state estimation unit.
  • In the heating cooker that controls heating of the cooking object based on the image of the cooking object in the photographed image of the camera, heating of the cooking object can thus be performed appropriately without being influenced by changes or differences in the light environment.
  • FIG. 1 Schematic perspective view of the heating cooker according to Embodiment 1 of the present invention
  • FIG. 2 Top view of the top plate of the heating cooker
  • FIG. 3 Diagram schematically showing the configuration of the heating cooker
  • FIG. 4 Block diagram showing the control system of the heating cooker
  • FIG. 5 Diagram showing an example of a photographed image of the camera
  • FIG. 6 Flowchart showing the flow of geometric correction of a photographed image
  • FIG. 7 Diagram showing a keystone-corrected photographed image
  • FIG. 8 Diagram showing a size-corrected photographed image
  • FIG. 9 Diagram showing a position-corrected and rotation-corrected photographed image
  • FIG. 10 Flowchart showing the flow of color correction of a photographed image
  • Top view of the top plate of the heating cooker
  • Diagram schematically showing the configuration of the heating cooker
  • Block diagram showing the control system of the heating cooker
  • Flowchart showing the flow of color correction of a photographed image
  • Diagram showing an example of a photographed image of the camera
  • Diagram showing an example of a reference chart for adjustment
  • Diagram showing standard RGB values and detected RGB values
  • In order to achieve the above object, the heating cooker according to the present invention includes: a main body including a top surface provided with a container placement unit on which a container containing a cooking object is placed; a heating unit that heats, from below, the container placed on the container placement unit; a camera that photographs the cooking object; a marker provided on the main body and photographed by the camera; an image processing unit that performs color correction on the captured image so that the image of the marker appears in the actual color of the marker itself; a cooking target state estimation unit that estimates the state of the cooking target based on the image of the cooking target in the captured image color-corrected by the image processing unit; and a heating control unit that controls the heating unit based on the cooking target state estimated by the cooking target state estimation unit.
  • In this heating cooker, which controls heating of a cooking target based on the image of the cooking target in a captured image of a camera, heating of the cooking target can be performed appropriately without being affected by changes or differences in the light environment.
  • The color tone may be at least one of hue, saturation, and lightness.
  • By using hue, saturation, and lightness as parameters of the color tone, color correction can be performed appropriately.
  • the marker is provided on the top surface of the main body.
  • The image of the object to be cooked, that is, the image of the liquid surface, can be appropriately color-corrected under the color correction condition calculated based on the color of the image of the marker provided on the horizontal top surface.
  • the marker may be provided on a portion of the top surface other than the container placement unit. This prevents the container from being placed on the marker.
  • Each of the plurality of markers is associated with one of the plurality of heating units and is provided in the vicinity of the container placement unit facing the corresponding heating unit.
  • The image processing unit calculates a color correction condition for each of the images of the plurality of markers, and color-corrects the areas of the photographed image corresponding to the container placement units according to the calculated color correction conditions. Thereby, the image of the cooking object heated by each of the plurality of heating units can be color-corrected separately.
  • each of the plurality of markers is provided at the center of each heating unit as viewed from above the top surface of the main body.
  • the marker can indicate the position of the heating unit, and can also function as a mark indicating the container mounting position.
  • The heating cooker includes a reference chart display unit that displays a reference chart for color correction as the marker; the camera is installed at a position away from the top surface and photographs the container and the cooking target.
  • The heating cooker further includes a color correction unit that performs color correction of the image captured by the camera using the pixel values of the reference chart included in that image.
  • When the cooking target state estimation unit recognizes that the cooking target in the image captured by the camera has reached a predetermined color, the heating control unit performs control to increase or decrease the heating output.
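As an illustration of how such a color-reached decision might be made, the sketch below averages the hue of the pixels in the cooking-target region and compares it with a target hue. The pixel format, function names, and tolerance are assumptions for illustration, not details taken from the patent.

```python
import colorsys

# Illustrative sketch (names, pixel format, and tolerance are assumptions):
# decide whether the cooking target has reached a predetermined color by
# averaging the hue of the pixels in its region.

def region_mean_hue(pixels):
    """Mean hue (0.0-1.0) over a list of (r, g, b) pixels with 0-255 values."""
    hues = [colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0]
            for r, g, b in pixels]
    return sum(hues) / len(hues)

def color_reached(pixels, target_hue, tolerance=0.03):
    """True when the region's mean hue is within tolerance of the target hue."""
    return abs(region_mean_hue(pixels) - target_hue) <= tolerance

# Browning food drifts toward an orange-brown hue (around 0.07 on the 0-1 scale)
browned_region = [(180, 110, 60)] * 9
```

When `color_reached` becomes true, the heating control unit would then increase or decrease the heating output as described above.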
  • An illumination state detection unit detects the illumination state around the top surface of the top plate. When a change in the illumination state is detected by the illumination state detection unit during heating output from the heating unit, the color correction unit performs color correction again using the pixel values of the reference chart included in the image captured by the camera.
  • When the illumination state detection unit does not detect an illumination state brighter than a predetermined brightness, a warning is issued to the user.
  • the display unit displays a heating condition of the heating unit, and when the image processing unit does not recognize the reference chart, a warning is notified from the display unit to the user.
  • the heating unit has a heating coil that generates an induction magnetic field to heat the container, and the heating control unit supplies a high frequency current to the heating coil to heat the container.
  • the distance from the camera to the top surface of the top plate is 600 mm or more and 2000 mm or less. With this distance, the entire top surface within the field of view of the camera can be photographed properly.
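As a rough plausibility check on this distance range, the horizontal view angle the camera needs in order to see the whole top surface can be computed from the plate width and the mounting distance. The 600 mm plate width below is an assumed value, not a figure from the patent.

```python
import math

# Rough check (plate width is an assumed value, not from the patent):
# the full horizontal view angle the camera needs in order to capture
# the whole top plate from a given mounting distance.

def required_fov_deg(plate_width_mm, distance_mm):
    """Full horizontal field of view, in degrees, to cover plate_width_mm."""
    return math.degrees(2 * math.atan((plate_width_mm / 2) / distance_mm))

# At the 600 mm minimum distance a wide lens (~53 deg) is needed for a
# 600 mm wide plate; at the 2000 mm maximum, about 17 deg suffices.
```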
  • A heating cooker according to another aspect includes: a container containing a cooking object; a main body including a top surface provided with a container placement unit on which the container is placed; a heating unit that heats, from below, the container placed on the container placement unit; a camera that photographs the cooking object; a marker provided on the container and photographed by the camera; an image processing unit that measures the color tone of the image of the marker in the photographed image of the camera and, based on the measurement result, color-corrects the photographed image so that the image of the marker appears in the actual color of the marker itself; a cooking target state estimation unit that estimates the state of the cooking target based on the image of the cooking target in the color-corrected photographed image; and a heating control unit that controls the heating unit based on the cooking target state estimated by the cooking target state estimation unit.
  • In this heating cooker, which controls heating of a cooking target based on the image of the cooking target in a captured image of a camera, heating of the cooking target can be performed appropriately without being affected by changes or differences in the light environment.
  • (Embodiment 1) Hereinafter, Embodiment 1 of the present invention will be described with reference to the drawings.
  • FIG. 1 is a schematic perspective view of a heating cooker according to a first embodiment of the present invention.
  • FIG. 2 is a top view of the top plate of a heating cooker.
  • FIG. 3 is a diagram schematically showing the configuration of the heating cooker.
  • FIG. 4 is a block diagram showing a control system of the heating cooker.
  • the X-axis direction indicates the width direction (longitudinal direction) of the heating cooker
  • the Y-axis direction indicates the depth direction
  • the Z-axis direction indicates the height direction.
  • The heating cooker 10 has a main body 12 and, as an upper portion of the main body 12, a top plate 14 on which a container C containing a cooking target T such as a stew is placed.
  • the top surface 14 a of the top plate 14 is provided with a container placement unit on which the container C is placed.
  • the main body 12 includes the top surface 14 a including the container placement unit on which the container C containing the cooking target T is placed.
  • the heating cooker 10 is an induction heating cooker, and as shown in FIG. 1 and FIG. 3, it is placed on the container mounting portion (container mounting area described later) in the top plate 14. Below the placed container C, that is, in the main body 12, heating coils 16A to 16C are disposed as heating parts of the heating cooker 10. The heating coils 16A to 16C inductively heat the container C placed on the container placement part on the top plate 14 from below. The heating coils 16A to 16C generate an induction magnetic field to heat the container C.
  • a plurality of touch keys 18A to 18C are provided on a portion of the top plate 14 on the front surface 10a side of the heating cooker 10, as operation units for the user to operate the heating coils 16A to 16C, respectively. For example, heating of the corresponding heating coil 16A is started or stopped by the touch key 18A. In addition, the heating level (for example, four stages) is adjusted.
  • a plurality of output display units 20A to 20C for displaying the output state (heating level) of the heating coils 16A to 16C are also provided on the top plate 14 on the front surface 10a side of the heating cooker 10.
  • a setting unit 22 is provided on the front surface 10a side of the heating cooker 10.
  • The setting unit 22 can be inserted into and withdrawn from the main body 12, and includes a setting key 22a for setting heating by the heating coils 16A to 16C in detail, and a setting display unit 22b for displaying the setting contents, the detailed state of the heating coils 16A to 16C, and the like.
  • the setting portion 22 sets the heating temperature, the heating time, the timer, etc. of the heating coils 16A to 16C.
  • The heating cooker 10 is provided with a notification unit 24 on the front surface 10a side of the heating cooker 10.
  • The heating cooker 10 has a plurality of temperature sensors 26A to 26C for detecting the temperature of the container C placed on the top plate 14, and a weight sensor 28 for detecting the weight of the container C placed on the top plate 14.
  • the plurality of temperature sensors 26A to 26C are disposed below the top plate 14 and at the centers of the heating coils 16A to 16C. That is, for example, the temperature sensor 26A is disposed above the heating coil 16A and detects the temperature on the bottom side of the container C inductively heated by the heating coil 16A through the top plate 14.
  • The weight sensor 28 detects the weight of the objects on the top plate 14; that is, when there are a plurality of containers C, it detects the sum of the weights of the plurality of containers C and of the cooking targets contained in each container C.
  • The heating cooker 10 has a camera 30 that photographs the cooking target T.
  • the camera 30 is, for example, a camera capable of capturing still images, moving images, infrared images, etc., and captures the entire top surface 14 a of the top plate 14 (i.e., the top surface of the main body 12) which is the upper portion of the main body 12.
  • the camera 30 is attached to a ceiling or a range hood (not shown) located above the heating cooker 10 or the like.
  • the camera 30 is attached to the range hood via a magnet. Therefore, the camera 30 can be installed freely with respect to the main body 12 of the heating cooker 10 (that is, the installation freedom is high). As a result, the camera can be mounted at a position preferred by the user without impairing the design of the kitchen.
  • the camera 30 that can be disposed at a position preferred by the user is configured to wirelessly communicate with the control unit 50 of the heating cooker 10.
  • the camera 30 is provided with the antenna 32, and the antenna 34 is provided in the main body 12 of the heating cooker 10.
  • the camera 30 and the control unit 50 are connected via the router device 36. That is, the camera 30 and the control unit 50 are connected in a wirelessly communicable manner to a local area network 38 centering on the router device 36.
  • the router device 36 is connected to the Internet 40.
  • the router device 36 is connected to the portable terminal 42 of the user in a wirelessly communicable manner.
  • The control unit 50 of the heating cooker 10 is, for example, a processing device such as a CPU, and performs various functions by executing programs stored in a storage unit 60 such as a ROM, a RAM, or a hard disk; for example, it is configured to function as the image processing unit described later. As shown in FIG. 4, the control unit 50 is configured to receive signals corresponding to the user's operation of the touch keys 18A to 18C and the setting key 22a of the setting unit 22. The control unit 50 is also configured to receive signals corresponding to the temperatures detected by the temperature sensors 26A to 26C, and a signal corresponding to the weight detected by the weight sensor 28. The control unit 50 is further configured to receive from the camera 30 the captured image (data) captured by the camera 30.
  • control unit 50 of the heating cooker 10 is configured to control the heating coils 16A to 16C, the output display units 20A to 20C, the setting display unit 22b, and the notification unit 24.
  • control unit 50 controls the outputs of the heating coils 16A to 16C so that the temperature sensors 26A to 26C continue to detect the heating temperature set by the user via the setting key 22a of the setting unit 22. Thereby, for example, when frying, the oil accommodated in the container C can be maintained at a constant temperature.
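One simple way to realize such constant-temperature control is on/off control with a hysteresis band around the set temperature. The following sketch is illustrative only; the function name and the 2-degree band are assumptions, not values from the patent.

```python
# Illustrative sketch (function name and the 2-degree band are assumptions):
# on/off control with hysteresis that keeps the temperature detected by a
# temperature sensor near the set point, e.g. to hold frying oil constant.

def heater_command(current_temp, set_temp, heating_on, band=2.0):
    """Return True to energize the heating coil, False to switch it off."""
    if current_temp <= set_temp - band:
        return True          # well below the set point: start heating
    if current_temp >= set_temp + band:
        return False         # above the set point: stop heating
    return heating_on        # inside the band: keep the previous state
```

The hysteresis band prevents the coil from chattering on and off when the detected temperature hovers near the set point.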
  • When the temperature sensors 26A to 26C detect the boiling temperature, the control unit 50 stops the heating coils 16A to 16C and notifies the user via the notification unit 24 that heating of the water is complete.
  • control unit 50 is configured to perform various controls based on a captured image of the camera 30, specifically, an image of the cooking object T shown in the captured image.
  • The control unit 50 of the heating cooker 10 functions as an image acquisition unit 52 that acquires a photographed image from the camera 30, an image processing unit 54 that processes (corrects) the acquired image, a cooking object state estimation unit 56 that estimates the state of the cooking object T based on the image of the cooking object T in the container C in the corrected photographed image, and a heating coil control unit 58, as a heating control unit, that controls the heating coils 16A to 16C based on the estimation result.
  • the heating coil control unit 58 supplies a high frequency current to the heating coils 16A to 16C to heat the container C.
  • the heating coil control unit 58 controls the heating of the container C by controlling the amount of high frequency current supplied.
  • The image acquisition unit 52 of the control unit 50 of the heating cooker 10 acquires a photographed image (data) photographed by the camera 30, for example, an image of the top plate 14 on which the container C accommodating the cooking target T being cooked is placed.
  • the camera 30 performs shooting at predetermined time intervals.
  • FIG. 5 illustrates an example of a photographed image (original image) of the camera 30 acquired by the image acquiring unit 52.
  • An image 14a' of the top surface 14a of the top plate 14 is captured in the captured image P0 of the camera 30.
  • The image T' of the cooking object T is indicated by cross-hatching.
  • The image processing unit 54 performs image processing on the captured image P0 of the camera 30 acquired by the image acquisition unit 52. Specifically, as pre-processing, the image processing unit 54 corrects the photographed image P0 so that the cooking target state estimation unit 56 can estimate the state of the cooking target T in a short time and with high accuracy based on the image T' of the cooking target T in the photographed image P0. In the first embodiment, geometric correction and color correction are performed on the captured image P0. First, geometric correction will be described with reference to the flowchart shown in FIG. 6.
  • For the geometric correction, a plurality of markers 44A to 44C provided on the main body 12 of the heating cooker 10 are used.
  • the markers 44 A to 44 C are provided on the main body 12 and captured by the camera 30.
  • The markers 44A to 44C are alignment markers photographed by the camera 30 in order to facilitate identification of the image T' of the cooking target T in the photographed image, that is, to allow it to be specified in a short time and with high accuracy.
  • the plurality of markers 44A to 44C are provided on the top surface 14a of the top plate 14 (that is, the top surface of the main body 12).
  • the markers 44A to 44C are marks drawn on the top surface 14a of the top plate 14 or stickers attached to the top surface 14a.
  • The markers 44A to 44C have shapes and colors that can be recognized with high accuracy (that is, that are easy to identify in a captured image); in the first embodiment, they are equilateral triangles.
  • the plurality of markers 44A to 44C are associated with the heating coils 16A to 16C.
  • Each marker is associated with its heating coil by directing one vertex of the equilateral triangle toward that coil.
  • The plurality of markers 44A to 44C are provided in portions of the top surface 14a of the top plate 14 other than the container placement areas 46A to 46C (the container placement portions on which the containers C heated by the heating coils are placed) facing the heating coils 16A to 16C. This prevents the container C from being placed on the markers 44A to 44C.
  • the marker 44A is associated with the heating coil 16A, and is provided in the vicinity (rear side) of the container placement area 46A facing the heating coil 16A.
  • The marker 44B is associated with the heating coil 16B and is provided in the vicinity (rear side) of the container placement area 46B facing the heating coil 16B.
  • The marker 44C is associated with the heating coil 16C and is provided in the vicinity (front side) of the container placement area 46C facing the heating coil 16C.
  • the container placement areas 46A to 46C are defined by substantially annular marks 48A to 48C provided on the top surface 14a of the top plate 14, and are presented to the user.
  • If the images of the markers 44A to 44C can be specified in the photographed image of the camera 30, then, based on the positions of the specified markers, the areas of the photographed image corresponding to the container placement areas 46A to 46C, that is, the areas in which the image of the cooking target T may be present (cooking target presence areas RA to RC, see FIG. 9), can be identified in a short time and with high accuracy.
  • The camera 30 is not at a fixed position with respect to the main body 12 of the heating cooker 10, but can be installed freely. Therefore, the image of the cooking object T is not necessarily captured at a fixed position in the captured image. Without the markers 44A to 44C, the entire photographed image would have to be examined to locate the image of the cooking object T, which takes time and may cause identification errors. In addition, since the cooking target T has various shapes and no fixed shape or color, the identification time and the probability of an identification error vary depending on the cooking target T.
  • In contrast, with the markers, the images of the markers 44A to 44C, which have fixed features (shape and color), are first specified in the photographed image; then, based on the positions of the specified marker images, the areas of the photographed image corresponding to the container placement areas 46A to 46C, that is, the cooking target presence areas RA to RC, are identified; and finally the image of the cooking target T is specified from within the cooking target presence areas RA to RC.
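The marker-to-area step above can be sketched as follows. The pixel offset and radius are assumptions standing in for values fixed by the actual top-plate layout; they are not figures from the patent.

```python
# Hypothetical sketch: derive the circular region where the cooking target
# may appear from a detected marker position in the geometry-corrected
# image. The offset and radius are assumed layout constants.

MARKER_TO_AREA_OFFSET = (0, 120)   # area center lies 120 px below its marker
AREA_RADIUS = 100                  # radius of the container placement area

def cooking_area(marker_xy):
    """Center and radius of the cooking-target presence area for a marker."""
    mx, my = marker_xy
    dx, dy = MARKER_TO_AREA_OFFSET
    return (mx + dx, my + dy), AREA_RADIUS

def in_cooking_area(pixel_xy, marker_xy):
    """True when the pixel lies inside the marker's presence area."""
    (cx, cy), r = cooking_area(marker_xy)
    px, py = pixel_xy
    return (px - cx) ** 2 + (py - cy) ** 2 <= r ** 2
```

Restricting the search for the cooking-target image to these areas is what allows identification in a short time regardless of the target's variable shape and color.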
  • the image processing unit 54 performs geometrical correction on the photographed image P0 of the camera 30 by using such markers 44A to 44C.
  • In step S100, the image processing unit 54 identifies the images 44A' to 44C' of the plurality of markers 44A to 44C in the photographed image P0 of the camera 30 shown in FIG. 5.
  • In step S110, the image processing unit 54 measures the shapes of the images 44A' to 44C' of the markers 44A to 44C.
  • Ideally, the shapes of the images 44A' to 44C' of the markers 44A to 44C in the photographed image P0 should match the actual shapes of the markers 44A to 44C themselves. That is, in the first embodiment, since the actual shape is an equilateral triangle, the shapes of the images 44A' to 44C' should also be equilateral triangles.
  • However, since the camera 30 can be freely installed with respect to the main body 12 of the heating cooker 10, it may be installed with its optical axis 30a not orthogonal to the top surface 14a of the top plate 14. In that case, as shown in FIG. 5, the shape of the image 14' of the top plate 14 is distorted, and the shapes of the images 44A' to 44C' of the markers 44A to 44C are also distorted so that they are no longer equilateral triangles.
  • In step S120, the image processing unit 54 geometrically corrects the photographed image P0 of the camera 30 based on the shapes of the images 44A' to 44C' measured in step S110. Specifically, the image processing unit 54 compares the measured shapes of the images 44A' to 44C' with the actual shapes of the markers 44A to 44C themselves, and calculates a keystone (trapezoidal) correction condition based on the comparison result.
  • the correction condition is a condition for trapezoidal correction of the shape of the photographed image P0 so that the images 44A 'to 44C' are captured in a shape similar to the actual shapes of the markers 44A to 44C themselves.
  • By this keystone correction, the photographed image P0 is corrected to a photographed image P1 that is substantially the same as a photographed image obtained with the optical axis 30a of the camera 30 orthogonal to the top surface 14a of the top plate 14.
  • In step S130, the image processing unit 54 measures the size of the images 44A' to 44C' of the markers 44A to 44C in the captured image P1. Then, in step S140, the image processing unit 54 corrects the size of the photographed image P1 of the camera 30 based on the size measured in step S130. Specifically, the photographed image is enlarged or reduced so that the images 44A' to 44C' appear at a predetermined size. By this size correction, as shown in FIG. 8, the photographed image P1 is corrected to a photographed image P2 that is substantially the same as an image photographed with the distance between the camera 30 and the top surface 14a of the top plate 14 equal to a predetermined distance.
  • In step S150, the image processing unit 54 measures the positions and orientations of the images 44A' to 44C' of the markers 44A to 44C in the captured image P2. Then, in step S160, the image processing unit 54 performs position correction and rotation correction on the captured image P2 based on the position and orientation measured in step S150. Specifically, the photographed image is position-corrected and rotation-corrected so that the images 44A' to 44C' appear at predetermined positions and orientations. As a result of these corrections, as shown in FIG. 9, the photographed image P2 is corrected to a photographed image P3 that is substantially the same as an image obtained when the camera 30 photographs from a predetermined position, in a predetermined posture, relative to the main body 12 of the heating cooker 10.
  • the predetermined position is, for example, the position of the camera 30 such that the optical axis 30 a of the camera 30 passes the center of the top plate 14.
  • the predetermined posture is, for example, a rotation posture of the camera 30 centered on the optical axis 30 a such that the longitudinal direction (X-axis direction) of the top plate 14 coincides with the longitudinal direction of the photographed image.
  • When the position correction and the rotation correction in step S160 are completed, the geometric correction of the captured image P0 of the camera 30 by the image processing unit 54 is complete.
  • Such geometric correction makes it possible to obtain a photographed image taken under substantially the same geometric conditions no matter where the camera 30 is installed relative to the main body 12 of the heating cooker 10.
  • As a result, the cooking target state estimation unit 56 can estimate the state of the cooking target T with substantially constant accuracy based on the image T' of the cooking target T in the photographed image.
  • images 44A 'to 44C' of the markers 44A to 44C are captured at predetermined positions and predetermined postures in the photographed image P3 corrected by the image processing unit 54. Therefore, although the details will be described later, the cooking target state estimation unit 56 can specify the images 44A 'to 44C' of the markers 44A to 44C in a short time with high accuracy in the photographed image.
  • Note that steps S130 and S140 shown in FIG. 6 can be omitted. Specifically, based on the size of the images 44A' to 44C' of the markers 44A to 44C in the image first captured by the installed camera 30, the camera 30 is zoomed so that the images 44A' to 44C' appear at the predetermined size. By maintaining that zoom state, it is unnecessary to correct the size of subsequently photographed images.
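Once keystone distortion has been removed, the remaining size, rotation, and position corrections (steps S130 to S160) together amount to a single similarity transform that maps the detected marker positions onto their predetermined reference positions. The sketch below estimates such a transform from two marker positions; it is an illustrative reconstruction, not the patent's algorithm, and the keystone step itself would require a full homography rather than a similarity transform.

```python
# Illustrative reconstruction (not the patent's algorithm as such): after
# keystone correction, the size, rotation, and position corrections of
# steps S130-S160 amount to one similarity transform mapping detected
# marker positions onto predetermined reference positions. Complex
# numbers keep the scale-plus-rotation math compact.

def similarity_from_markers(detected, reference):
    """detected, reference: two (x, y) marker positions each.
    Returns (a, b) such that corrected = a * point + b (complex numbers):
    a encodes scale and rotation, b the translation."""
    d0, d1 = (complex(*p) for p in detected)
    r0, r1 = (complex(*p) for p in reference)
    a = (r1 - r0) / (d1 - d0)   # scale and rotation together
    b = r0 - a * d0             # translation
    return a, b

def apply_transform(a, b, point):
    """Map one (x, y) point through the similarity transform."""
    z = a * complex(*point) + b
    return (z.real, z.imag)
```

Applying the transform to every pixel coordinate yields an image in which the markers sit at their predetermined positions, sizes, and orientations, as the corrected image P3 does.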
  • the image processing unit 54 performs color correction on the geometrically corrected captured image P3.
  • the reason for performing the color correction is to remove the influence of the light environment around the heating cooker 10 from the photographed image.
  • the color tone of the image T 'of the cooking target T captured in the captured image of the camera 30 changes.
  • the light may cause the entire photographed image, that is, the image T 'of the cooking object T to become reddish (change in hue).
  • the image T 'of the object to be cooked T may be whitened (the saturation may be reduced) by the light of the light.
• Accordingly, the color tone of the image T' of the cooking object T appearing in the photographed image will also differ.
• When a change or difference in the light environment around the heating cooker 10 causes a change or difference in the color of the cooking target in the captured image, the cooking target state estimation unit 56 may be unable to estimate the state of the cooking target T with high accuracy based on the image T' of the cooking target T in the captured image. As a result, heating control of the cooking target T based on the estimation result of its state may not be performed appropriately.
  • the image processing unit 54 performs color correction on the geometrically corrected captured image P3.
  • the markers 44A to 44C used in the geometric correction are again used for the color correction.
• The image processing unit 54 measures the color tone of the images of the markers 44A to 44C in the photographed image of the camera 30, and based on the measurement result, corrects the color of the photographed image P3 so that the images of the markers 44A to 44C in the photographed image P3 take on the actual color tones of the markers 44A to 44C themselves.
  • the flow of color correction using the markers 44A to 44C will be described with reference to the flowchart shown in FIG.
• In step S200, the image processing unit 54 identifies the images 44A' to 44C' of the plurality of markers 44A to 44C in the photographed image P3 that has been geometrically corrected as shown in FIG.
• In step S210, the image processing unit 54 measures the color tone of each of the images 44A' to 44C' of the plurality of markers 44A to 44C appearing in the captured image of the camera 30.
• Hue, saturation, and lightness are measured as parameters indicating the color tone. Note that at least one of hue, saturation, and lightness may be measured as the color tone.
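As an illustrative sketch (not part of the specification), the color tone of a marker image can be measured by averaging its pixels and converting the averaged color to hue, saturation, and lightness; the function name and sample values below are assumptions:

```python
import colorsys

def measure_tone(rgb):
    """Return (hue in degrees, saturation, lightness) for an averaged
    8-bit marker color. Function name and inputs are illustrative."""
    r, g, b = (c / 255.0 for c in rgb)
    h, l, s = colorsys.rgb_to_hls(r, g, b)  # colorsys works in the 0..1 range
    return h * 360.0, s, l

# A marker reddened by (for example) sunset light shows a hue near 0 degrees.
hue, sat, light = measure_tone((200, 100, 100))
```

Measuring all three parameters, as the text describes, lets the correction address hue shifts (reddening) and saturation loss (whitening) separately.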
• In step S220, the image processing unit 54 of the first embodiment calculates a color correction condition based on hue, saturation, and lightness from the measurement results obtained by measuring the color tone of the images 44A' to 44C' of the respective markers 44A to 44C in step S210.
• Color correction conditions for correcting the photographed image P3 are calculated for each of the images 44A' to 44C' so that the images 44A' to 44C' take on the actual color tones of the markers 44A to 44C themselves. The conditions are calculated per image because the images 44A' to 44C' of the markers 44A to 44C are not necessarily all equally affected by the light environment.
• For example, only the marker 44A may be exposed to the light of the setting sun, making only the image 44A' of the marker 44A reddish. Assuming such cases, color correction conditions are calculated for each of the images 44A' to 44C'. Therefore, a plurality of different color correction conditions may be calculated.
• In step S230 following step S220, the image processing unit 54 performs color correction on the cooking target existing areas RA to RC in the photographed image P3 shown in FIG. 9, according to the color correction conditions calculated in step S220.
• These cooking target existing areas RA to RC are areas corresponding to the container placement areas 46A to 46C, as described above and as shown in FIG. 9, that is, areas in which the image T' of the cooking target T exists.
• The cooking target existing area RA in the captured image P3 is color-corrected under the color correction condition calculated for the image 44A' of the corresponding marker 44A. Further, the cooking target existing area RB is color-corrected under the color correction condition calculated for the image 44B' of the corresponding marker 44B. Then, the cooking target existing area RC is color-corrected under the color correction condition calculated for the image 44C' of the corresponding marker 44C.
• Thus, the image T' of the cooking object T in the container C placed in different container placement areas 46A to 46C is color-corrected under different color correction conditions.
• The image T' of the cooking object T in each of the container placement areas 46A to 46C is thereby appropriately color-corrected.
• For example, a color correction condition is calculated that gives the image 44A' of the marker 44A the color tone it would have if the light of the setting sun were not shining on it.
• The image T' of the cooking object T in the container C placed in the container placement area 46A is color-corrected using that color correction condition. That is, the image T' of the cooking object T is corrected to the color it would have if not exposed to the light of the setting sun.
  • this color correction is not applied to the image T 'of the cooking target T in the container placement areas 46B to 46C which are not exposed to the light of the setting sun.
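The per-region correction described above can be sketched with a simple per-channel gain model. This is a hypothetical illustration, not the method claimed in the specification: the marker's known reference color, the gain model, and all numbers are assumptions.

```python
def correction_gains(measured, reference):
    """Per-channel gains mapping the measured marker color back to its known
    reference color (the gain model and all values here are illustrative)."""
    return tuple(ref / meas for ref, meas in zip(reference, measured))

def correct_region(pixels, gains):
    """Apply one region's gains to each (R, G, B) pixel, clipping to 8 bits."""
    return [tuple(min(255, round(c * g)) for c, g in zip(px, gains))
            for px in pixels]

# Marker 44A measured as a reddish (230, 180, 170) though known to be a
# neutral gray (200, 200, 200): region RA is corrected with its own gains.
gains = correction_gains((230, 180, 170), (200, 200, 200))
region = correct_region([(115, 90, 85)], gains)
```

Because each region uses the gains of its own marker, a sunlit area RA and a shaded area RB receive different corrections, as the text describes.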
• Therefore, the cooking target state estimation unit 56 can estimate the state of the cooking target T with substantially constant accuracy based on the image T' of the cooking target T in the photographed image, regardless of changes or differences in the light environment.
• The cooking target state estimation unit 56 specifies the image T' of the cooking target T in the photographed image based on the positions of the images 44A' to 44C' of the markers 44A to 44C shown in the photographed image.
• The image T' of the cooking target T is specified in the photographed image that has been geometrically corrected and color-corrected by the image processing unit 54.
• Specifically, the cooking target state estimation unit 56 first specifies the images 44A' to 44C' of the markers 44A to 44C that appear in the captured image of the camera 30. Next, based on the positions of the identified images 44A' to 44C', it specifies the cooking object presence areas RA to RC in the photographed image corresponding to the container placement areas 46A to 46C of the top surface 14a of the top plate 14. Then, the image T' of the cooking object T is specified from within the cooking object presence areas RA to RC. Thus, the image T' of the cooking object T can be identified quickly and with high accuracy, compared to searching the entire photographed image for the image T' of the cooking object T.
  • the positional relationship between the markers 44A to 44C and the container placement areas 46A to 46C (the cooking object presence areas RA to RC) needs to be known.
• For this purpose, layout data of the heating cooker indicating the positional relationship between the markers and the container placement areas is used.
  • the layout data is stored in the storage unit 60.
• The cooking object state estimation unit 56 extracts the image T' of the specified cooking object T from the photographed image. For example, the extracted image T' is stored in the storage unit 60 as image data.
• The cooking target state estimation unit 56 then estimates the state of the cooking target T. For example, based on a plurality of images T' of the cooking target T extracted from a plurality of consecutive captured images, changes in the image T' of the cooking target T are detected, and the state of the cooking target T is estimated from those changes. For example, when the cooking target T is a liquid and the change in the image of the liquid surface is large, it is estimated that the cooking target T is boiling. Further, for example, when the cooking target T is food to be grilled, such as meat, it is estimated that browning is complete when the change in color of the image of the cooking target has almost ceased.
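The liquid-surface example above can be sketched as frame differencing over the surface region. This is a simplified assumption of one way such an estimate could work; the function names, data, and threshold are all hypothetical, not values from the specification:

```python
def surface_activity(frames):
    """Mean absolute per-pixel change between consecutive grayscale frames
    of the liquid-surface region (a simplified activity measure)."""
    diffs = [sum(abs(a - b) for a, b in zip(prev, cur)) / len(prev)
             for prev, cur in zip(frames, frames[1:])]
    return sum(diffs) / len(diffs)

def looks_boiling(frames, threshold=10.0):
    """Estimate a boiling state when the surface changes strongly from frame
    to frame; the threshold is illustrative, not from the specification."""
    return surface_activity(frames) > threshold

calm = [[100] * 4, [101] * 4, [100] * 4]       # nearly static liquid surface
turbulent = [[100] * 4, [140] * 4, [95] * 4]   # strongly fluctuating surface
```

The browning case would use the inverse logic: estimate completion when the activity of the color change falls below a threshold rather than above it.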
• The cooking target state estimation unit 56 may also estimate the state of the cooking target T taking into consideration the temperatures detected by the temperature sensors 26A to 26C and the weight detected by the weight sensor 28. For example, when the cooking target T is a liquid, the change in the image of the liquid surface is large, and the weight detected by the weight sensor 28 starts to decrease, the cooking target state estimation unit 56 estimates that the cooking target T is evaporating.
  • the heating coil control unit 58 controls the outputs of the heating coils 16A to 16C based on the state of the cooking target T estimated by the cooking target state estimation unit 56. For example, when the cooking target state estimation unit 56 estimates that the cooking target T is being evaporated, the heating coil control unit 58 reduces the output level of the heating coil or stops the heating coil.
• Thus, the heating cooker 10, which controls the heating of the cooking target T based on the image T' of the cooking target T shown in the captured image of the camera 30, can heat the cooking target T appropriately without being affected by changes or differences in the light environment.
• Next, the heating cooker 10A according to Embodiment 2 of the present disclosure will be described.
  • the markers 44A to 44C used for color correction are marks drawn on the top surface 14a of the top plate 14 or stickers attached thereto.
  • the marker used in the heating cooker 10A according to the second embodiment is a reference chart displayed on the output display portions 20A to 20C provided on the top surface 14a.
  • the heating cooker 10A according to the second embodiment is the same as the heating cooker 10 according to the first embodiment except for the matters described below.
• The heating cooker 10A according to Embodiment 2 aims to make it possible to check the color of the cooking object with the influence of the lighting environment reduced.
  • FIG. 11 is a schematic perspective view of the heating cooker 10 according to Embodiment 2 of the present disclosure.
  • FIG. 12 is a top view of the top plate 14 of the heating cooker 10.
  • FIG. 13 schematically shows the structure of the heating cooker 10.
• FIG. 14 is a block diagram showing a control system of the heating cooker 10.
  • the output display unit 20 has output display units 20A to 20C.
• The output display units 20A to 20C are each arranged in a predetermined area of the output display unit 20. Note that in FIG. 11 the output display units 20A to 20C are extinguished, and in FIG. 12 the output display units 20A and 20B are extinguished, so these output display units are shown in black.
  • the output display unit 20C is a color liquid crystal panel, and can display the reference chart 80.
  • the output display unit 20C doubles as a reference chart display unit that displays the reference chart 80 in addition to the display of the heating condition of the heating coil 16C.
  • the output display units 20A and 20B are monochrome or color liquid crystal panels.
• The container placement areas 46A, 46B, 46C facing the heating coils 16A, 16B, 16C, respectively, are presented to the user by annular marks 48A, 48B, 48C printed on the top surface 14a of the top plate 14.
• Light emitting portions 49A, 49B, 49C, which emit light in a ring shape, are disposed on the top plate 14 outside the heating coils 16A, 16B, 16C, respectively.
  • the light emitting units 49A, 49B, 49C emit light, for example, when current flows through the corresponding heating coils 16A, 16B, 16C.
  • the light emitting units 49A to 49C have, for example, LED light emitting substrates.
• The heating cooker 10 is provided with the cooking condition recognition unit 31, which has the camera 30.
• The distance from the camera 30 to the top surface 14a of the top plate 14 is 600 mm or more and 2000 mm or less. At this distance, the entire top surface of the top plate 14 can be captured appropriately within the field of view of the camera 30.
  • the camera 30 according to the second embodiment is, for example, a visible light camera capable of capturing still images, moving images, and the like, and is, for example, a CCD camera or a CMOS camera.
• The camera 30 is attached above the heating cooker 10, for example, to a ceiling or to the range hood 70.
• The camera 30 is installed at a position away from the top surface 14a of the top plate 14, which is the upper portion of the main body 12, and captures the entire top surface 14a, that is, the container C and the cooking target T.
  • the cooking condition recognition unit 31 is attached to the range hood 70, for example, via a magnet.
  • the camera 30 can be freely installed on the main body 12 of the heating cooker 10.
  • the camera can be mounted at a position preferred by the user without impairing the design of the kitchen.
  • the cooking condition recognition unit 31 that can be disposed at a position preferred by the user is configured to wirelessly communicate with the control unit 50A of the heating cooker 10.
  • the cooking condition recognition unit 31 includes an antenna 32.
  • an antenna 34 is provided in the main body 12 of the heating cooker 10.
  • the cooking condition recognition unit 31 and the control unit 50A are connected via the router device 36. That is, the cooking condition recognition unit 31 and the control unit 50A are connected to the local area network 38 centering on the router device 36 in a wirelessly communicable manner.
  • the router device 36 is connected to the Internet 40. Furthermore, the router device 36 is connected to the portable terminal 42 of the user in a wirelessly communicable manner.
  • the control unit 50A of the heating cooker 10 includes one or more microprocessors, a memory, and a circuit board.
• The control unit 50A is, for example, a processing device such as a CPU, and performs various functions by executing programs stored in the storage unit 60, such as a ROM, a RAM, a hard disk, or an SSD. For example, it is configured to function as an image processing unit described later.
• The control unit 50A of the heating cooker 10A has an image acquisition unit 52, which outputs an imaging instruction to the camera 30 of the cooking condition recognition unit 31 to acquire a captured image, an image processing unit 54A, a cooking target state estimation unit 56, a heating coil control unit 58, and a guidance information output unit 59 for presenting various information to the user via the output display units 20A to 20C and the notification unit 24.
• When heating by the heating coils 16A to 16C is started, the control unit 50A outputs a photographing instruction to the camera 30. The camera 30 thereby acquires images of the top plate 14 at a predetermined cycle, and these images are sent to the control unit 50A.
  • the image processing unit 54A performs color correction on the captured image.
  • the reason for performing the color correction is to remove the influence of the light environment around the heating cooker 10 from the photographed image.
  • the image processing unit 54A in the second embodiment includes an area extraction unit 72, an illumination state detection unit 73, a correction parameter calculation unit 74, a color correction unit 76, and a lightness calculation unit 78.
  • the region extraction unit 72 extracts the reference chart 80 from the image captured by the camera 30, for example, by template matching. In addition, the region extraction unit 72 extracts the regions of the container C and the cooking target T placed in the container placement areas 46A to 46C.
  • the illumination state detection unit 73 detects the illumination state around the top surface 14a of the top plate 14 during heating output from the heating coils 16A to 16C.
• The correction parameter calculation unit 74 calculates correction parameters for correcting the RGB pixel values in the region of the cooking target T, based on the pixel values of the reference chart 80.
  • the color correction unit 76 corrects the RGB values of the pixels of the region of the cooking target T using the correction parameters.
• The lightness calculation unit 78 calculates the lightness of each pixel from the corrected standard RGB pixel values of the image.
• The heating coil control unit 58 controls the heating output of the corresponding heating coils 16A to 16C.
  • the heating coil control unit 58 supplies a high frequency current to the heating coils 16A to 16C.
  • the heating coils 16A to 16C generate an induction magnetic field when a high frequency current is supplied.
• The generated induction magnetic field causes an induced current to flow in the container C, which is thereby heated.
  • the heating coil control unit 58 controls the amount of current supplied to the heating coils 16A to 16C to control the amount of heating from the heating coils 16A to 16C.
• When the control unit 50A outputs a heating instruction from the heating coil control unit 58 to any of the heating coils 16A to 16C, the control unit 50A instructs the image acquisition unit 52 to capture an image with the camera 30 of the cooking condition recognition unit 31. At the same time as this instruction, the guidance information output unit 59 displays the reference chart 80 for color correction shown in FIG. 7 on the output display unit 20C, which serves as the reference chart display unit. The reference chart 80 plays the role of the markers 44A to 44C used for color correction in the first embodiment. In response to the photographing instruction from the control unit 50A, the camera 30 photographs an image 79 of the top plate 14 as shown in FIG. 6, for example, and transmits the photographed image 79 to the control unit 50A.
• In step S310, the image processing unit 54A acquires the image 79 transmitted from the camera 30.
  • the image 79 also includes an image of a reference chart 80 for color correction.
  • Image 79 has RGB values for each pixel.
• In step S320, the region extraction unit 72 extracts the image of the reference chart 80 from the image 79 by image processing such as template matching.
• In step S330, when the region extraction unit 72 fails to extract the reference chart 80 from the image 79, that is, when the image processing unit 54A does not recognize the reference chart 80, a warning is issued to the user from the output display unit 20A or 20B serving as a display unit.
• Cases in which the reference chart 80 is not recognized include those in which the illumination around the top plate 14 is inadequate and those in which dirt or an obstacle covers the output display unit 20C that displays the reference chart 80. A warning appropriate to each case is then issued.
  • the illumination state detection unit 73 of the image processing unit 54A detects the illumination state that illuminates the top plate 14, and determines whether the detected illumination state is dark.
  • the illumination state detection unit 73 detects the illumination state using, for example, the histogram of the image 79.
• The illumination state detection unit 73 can determine whether black crushing (underexposure) occurs in the histogram, or whether the illumination is dark, based on the peak position of the histogram.
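A minimal sketch of such a histogram-peak check follows; the bin count and thresholds are illustrative assumptions, not values from the specification:

```python
def is_too_dark(gray_pixels, bins=8, peak_threshold=64):
    """Judge the illumination dark when the histogram peak of an 8-bit
    grayscale image lies in the low-value bins (thresholds illustrative)."""
    width = 256 // bins
    counts = [0] * bins
    for p in gray_pixels:
        counts[min(p // width, bins - 1)] += 1
    peak_bin = counts.index(max(counts))          # position of the peak
    return (peak_bin + 0.5) * width < peak_threshold

dark_image = [20, 25, 30, 22, 200]    # most pixel values crushed near black
bright_image = [180, 190, 200, 40]    # peak in the high-value bins
```

A peak concentrated in the lowest bins corresponds to the "blackened" histogram the text mentions, triggering the brighten-the-illumination warning.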
• In that case, a warning is issued to the user.
• Specifically, the guidance information output unit 59 issues a warning from the output display unit 20A or 20B prompting the user to brighten the illumination of the top plate 14.
• The user then brightens the illumination of the top plate 14, so that the image processing unit 54A can recognize the reference chart 80.
  • the process is performed again from step S310 after a predetermined time has elapsed.
• In step S340, when illumination of at least a predetermined brightness is detected, it is possible that dirt or an obstacle is covering the output display unit 20C.
• In that case, the guidance information output unit 59 issues a notification from the output display unit 20A or 20B prompting the user to uncover the output display unit 20C, or to remove the dirt or obstacle. The notification may also be given by voice guidance from the notification unit 24. The user then uncovers the output display unit 20C, so that the image processing unit 54A can recognize the reference chart 80. After the notification, the process is performed again from step S310 once a predetermined time has elapsed.
• The reference chart 80 is constituted by charts 80a to 80f whose lightness values, indicating degrees from white to black, are spaced at substantially equal intervals, so as to serve as an index of the white-to-black dynamic range.
  • the chart 80a has both R and G values of 243, a B value of 242, and a lightness of about 96.5.
  • the chart 80b has R, G and B values of 200 and a lightness of about 81.3.
• the chart 80c has R, G and B values of 160, and a lightness of about 66.8.
  • the chart 80d has R and G values of 122, a B value of 121, and a lightness of about 50.9.
  • the chart 80e has R, G and B values of 85 and a lightness of about 35.7.
• the chart 80f has R, G and B values of 52, and a lightness of about 20.5.
  • the RGB values of these charts 80a to 80f are taken as standard RGB values.
  • the RGB values of the respective charts 80a to 80f of the reference chart 80 displayed on the output display unit 20C are as described above. However, the RGB values of the respective charts 80a to 80f of the reference chart 80 captured in the image 79 captured by the camera 30 differ due to the influence of the illumination environment.
  • FIG. 18 is a graph showing the standard RGB values D1 of the reference chart 80 displayed on the output display unit 20C and the RGB values of the reference chart 80 in the image captured by the camera 30.
• The standard RGB values D1 are indicated by a single curve because the R, G, and B values are the same or almost the same in each of the charts 80a to 80f.
• Although the RGB values of the reference chart 80 displayed on the output display unit 20C are the standard RGB values, they are affected by the illumination environment at the time of shooting, so the R value Rc, G value Gc, and B value Bc of each chart 80a to 80f obtained from the shot image differ from the standard RGB values D1.
  • the correction parameter calculation unit 74 calculates correction parameters for correcting the RGB values of each of the charts 80a to 80f to standard RGB values.
  • the chart 80a is a chart showing white, and white correction can be performed by correcting this to standard RGB values.
• For example, if the R value in the image of the chart 80a is 180, then, since the standard R value is 243, the parameter for white correction is 243 / 180 = 1.35, the value obtained by dividing the standard R value by the R value in the image.
  • Parameters for white correction are calculated for G value and B value, respectively.
  • the parameters of the white correction of the calculated RGB values are stored in the storage unit 60.
  • the parameters of the white correction are respectively multiplied by the R values of the images of all the charts 80a to 80f.
• The value obtained by white-correcting Rc, the R value acquired from the image 79, is denoted Rcwb.
  • the curve indicated by the broken line in FIG. 19 is a graph of the R value Rcwb obtained by the white correction.
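The white correction described above can be sketched as follows. The function names are ours; the parameter is the standard channel value of chart 80a divided by the measured value, as in the 243 / 180 = 1.35 example in the text:

```python
def white_correction_params(measured_white, standard_white=(243, 243, 242)):
    """White correction parameters: each standard channel value of chart 80a
    divided by the channel value measured in the photographed image."""
    return tuple(s / m for s, m in zip(standard_white, measured_white))

def apply_white_correction(rgb, params):
    """Multiply a pixel's channels by the parameters, clipping to 8 bits."""
    return tuple(min(255.0, c * p) for c, p in zip(rgb, params))

# The example from the text: a measured R value of 180 gives 243 / 180 = 1.35.
params = white_correction_params((180, 180, 180))
white = apply_white_correction((180, 180, 180), params)  # back to the standard
```

Applying the same parameters to every pixel removes the overall cast of the illumination before the gray scale correction below.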
  • FIG. 20 is a view showing an example of the color correction, and is an enlarged view of the rectangular broken line area Ar of FIG.
• Rd = Rs80e + (Rcwbn − Rcwb80e) × (Rs80d − Rs80e) / (Rcwb80d − Rcwb80e)   (1)
• By equation (1), the correction value Rd for gray scale correction can be obtained, and the range of pixel values 52 to 243, which is useful for checking the cooking state of the food, can be corrected to the standard R values. The same linear interpolation can also be performed for the ranges of pixel values 243 to 255 and 0 to 52.
• In this manner, the correction parameter calculation unit 74 calculates a conversion table from all possible RGB values of the captured image to standard RGB values, and stores it in the storage unit 60.
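The conversion table can be sketched as piecewise-linear interpolation between the white-corrected chart values and their standard values, in the spirit of equation (1). The measured values below are illustrative assumptions, and clipping out-of-range results is a simplification of ours:

```python
import bisect

# White-corrected chart values (illustrative measurements) and the standard
# R values of charts 80f..80a, both in ascending order.
measured = [60.0, 95.0, 130.0, 170.0, 210.0, 243.0]   # Rcwb of charts 80f..80a
standard = [52.0, 85.0, 122.0, 160.0, 200.0, 243.0]   # Rs   of charts 80f..80a

def to_standard(v):
    """Piecewise-linear gray scale correction per equation (1); values outside
    the chart range extend the nearest segment and are clipped to 8 bits."""
    i = bisect.bisect_right(measured, v) - 1
    i = max(0, min(i, len(measured) - 2))         # clamp to a valid segment
    m0, m1 = measured[i], measured[i + 1]
    s0, s1 = standard[i], standard[i + 1]
    r = s0 + (v - m0) * (s1 - s0) / (m1 - m0)     # the interpolation of Eq. (1)
    return max(0.0, min(255.0, r))

# Conversion table from every possible 8-bit value to its standard value.
table = [to_standard(v) for v in range(256)]
```

Once the table is built, correcting a pixel is a single lookup per channel, which suits the per-frame processing the text describes.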
• In step S380, the area extraction unit 72 extracts the cooking area Ac including the cooking target T.
• The area inside the light emitting portions 49A to 49C, which emit light in a ring shape, may be extracted as the cooking area Ac by image processing, or the area may be extracted using another marker.
  • the user may input the cooking area Ac in the image 79 to the control unit 50A using the portable terminal 42.
• In step S390, the color correction unit 76 corrects the RGB values of all the pixels of the extracted cooking area Ac to standard RGB values using the calculated conversion table.
  • the calculated conversion table is calculated using the pixel values of the reference chart 80 included in the image captured by the camera 30. Accordingly, the color correction unit 76 performs color correction of the image captured by the camera 30 using the pixel values of the reference chart 80 included in the image captured by the camera 30.
• In step S400, the lightness calculation unit 78 calculates the lightness of the cooking area Ac.
• As the lightness, for example, the lightness L* of the CIE-L*a*b* color system is calculated. To do so, the RGB values are converted to the CIE-XYZ color system and further converted to the CIE-L*a*b* color system. A more specific conversion procedure is described below.
• The R″G″B″ values are converted to the CIE-XYZ color system. Note that since only the Y value is used, the lightness calculation unit 78 may calculate only the Y value.
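For a gray value, the conversion to L* can be sketched as below, assuming the common sRGB linearization and CIE constants; the exact constants the cooker uses are not stated in the text, so the results only approximate the quoted chart lightness values (for example, about 50.9 for chart 80d):

```python
def lightness_from_gray(v):
    """CIE L* of an 8-bit gray value: linearize with the (assumed) sRGB
    curve to get relative luminance Y, then apply the CIE L* formula."""
    c = v / 255.0
    y = c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    return 116.0 * y ** (1.0 / 3.0) - 16.0 if y > 0.008856 else 903.3 * y

l_80d = lightness_from_gray(122)   # chart 80d, quoted in the text as about 50.9
l_80a = lightness_from_gray(243)   # chart 80a, quoted in the text as about 96.5
```

Since only the luminance Y enters the L* formula, a gray-only shortcut such as this matches the text's note that the lightness calculation unit 78 may compute only the Y value.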
• In step S410, the guidance information output unit 59 reports, from the output display unit 20C, the tint of the IEC standard corresponding to the calculated lightness L*.
• This notification may report, as the color of the cooking object, the color corresponding to the most frequently detected lightness L* in the cooking object T, or may report a tint corresponding to the histogram of the lightness L* detected in the cooking object T.
• In step S420, the cooking target state estimation unit 56 determines whether the detected color of the cooking target T has reached a predetermined color. This determination is made, for example, by determining whether the calculated lightness L* has reached a predetermined lightness.
• When the detected color is the predetermined color, for example when the lightness L* of the cooking object T in the photographed image has reached the predetermined lightness, the heating coil control unit 58 reduces or stops the heating output of the heating coils 16A to 16C. Further, the guidance information output unit 59 reports from the output display units 20A to 20C that the cooking of the cooking target T is complete.
• If the detected color is not the predetermined color in step S420, for example if the calculated lightness L* has not reached the predetermined lightness, the color of the cooking target has not yet reached the predetermined color, and the process is repeated from step S310.
• As described above, the heating cooker 10 includes: the top plate 14 provided with the container placement areas 46A to 46C on which the container C accommodating the cooking target T is placed; the heating coils 16A to 16C that heat from below the container C placed on the container placement areas 46A to 46C; the heating coil control unit 58 that controls the heating output of the heating coils 16A to 16C; the camera 30, installed at a position away from the top plate 14, for photographing the container C and the cooking target T; the output display unit 20C for displaying the reference chart 80 for color correction; the color correction unit 76 that performs color correction of the image 79 photographed by the camera 30 using the pixel values of the reference chart 80 included in the image 79; and the cooking target state estimation unit 56 that estimates the cooking state based on the cooking target in the color-corrected image.
• With this configuration, the image of the reference chart 80 is acquired at the same time as the image of the cooking target, so that image correction based on a color reference can be performed. Since the standard RGB values of the reference chart 80 are known in advance, the image taken by the camera 30 can be color-corrected using the pixel values of the reference chart 80 as affected by the lighting environment around the heating cooker 10. Although the lighting environment around the heating cooker 10 varies, for example from cold to warm colors depending on the user's preference, its influence is thereby reduced, and the state of the cooking object T can be recognized correctly using, for example, the color information of the browning of the cooking object T.
• In the above, white correction and gray scale correction are performed using the six charts 80a to 80f as the reference chart 80, but only white correction may be performed using only the charts 80a and 80f. Even if only white correction is performed as the color correction, the color is significantly improved, as shown by Rcwb in FIG. In addition, if the three charts 80a, 80d, and 80f are used, gray scale correction can be performed in addition to white correction, so that the color accuracy can be enhanced.
• In the above, the heating cooker 10 has one camera 30, but the number of cameras of the heating cooker according to the embodiment of the present invention is not limited to one.
  • a camera may be disposed above each of the plurality of heating coils.
• Although the cooking condition recognition unit 31 has one camera 30 as shown in FIG. 13, the embodiment of the present disclosure is not limited to this.
  • the number of cameras may be two or more, and for example, the cameras may be disposed above each of the plurality of heating coils.
• Although the camera 30 and the control unit 50 of the heating cooker 10 are wirelessly connected via the local area network 38, the connection is not limited to this. At least one of the camera 30 and the control unit 50 may be connected to the local area network 38 by wire. In the first and second embodiments, the camera 30 may also be directly connected, wirelessly or by wire, to the control units 50 and 50A of the heating cooker 10. Furthermore, the photographed image of the camera 30 may be transmitted to the portable terminal 42, or transmitted to the outside via the Internet 40. The user can thus check the image actually photographed by the camera 30.
  • the cooking condition recognition unit 31 and the control unit 50A of the heating cooker 10A are wirelessly connected via the local area network 38.
  • the embodiment is not limited to this.
  • At least one of the cooking condition recognition unit 31 and the control unit 50A may be connected to the local area network 38 by wire.
• The cooking condition recognition unit 31 may also be directly connected, wirelessly or by wire, to the control unit 50A of the heating cooker 10.
• Although the heating cooker 10 is provided with three heating coils 16A to 16C, the number of heating coils of the heating cooker 10 according to the embodiment of the present invention is not limited to this.
  • the number of heating coils may be one, or three or more.
• In the above, the markers 44A to 44C are marks drawn on, or stickers attached to, the top surface 14a of the top plate 14.
  • the embodiment of the present invention is not limited to this.
  • the marker may be a light emitter that emits light, such as an LED.
  • it is preferable to emit light such as red or blue instead of white light.
  • the markers 44A to 44C are provided on the top surface 14a of the top plate 14, but the embodiment of the present invention is not limited to this.
• The markers may be anywhere on the main body of the heating cooker, that is, anywhere within the shooting range of the camera. However, in order to perform color correction of the photographed image, that is, of the image of the object to be cooked, more appropriately, the markers are preferably provided on the top surface of the top plate. In most cases, the cooker is installed with the top surface of the top plate horizontal. Moreover, when the cooking target is a liquid, the liquid surface is also horizontal, so the marker and the liquid surface are equally affected by the light environment. Therefore, in order to color-correct the image of the horizontal liquid surface, it is preferable to color-correct the image of the object to be cooked under the color correction condition calculated based on the color of the image of a marker provided on the top surface of the horizontal top plate.
  • The markers 44A to 44C are provided on the top surface 14a of the top plate 14 outside the container mounting areas 46A to 46C that face the heating coils 16A to 16C.
  • the marker may be provided to be located at the center of each of the heating coils as viewed from above the top plate. Thereby, the marker can indicate the position of the heating coil, that is, it can also function as a mark indicating the mounting position of the container heated by the heating coil.
  • The markers 44A to 44C are provided to facilitate identification of the image T' of the cooking target T in the photographed image of the camera 30, but they are also used for geometric correction and color correction.
  • the embodiment of the present invention is not limited to this.
  • a marker dedicated to color correction that is used only for color correction may be provided on the main body of the heating cooker, for example, the top surface of the top plate. This allows the use of markers that vary greatly in shade due to slight differences in the light environment, such as markers that reflect light.
  • The markers 44A to 44C are marks or stickers drawn on the top surface 14a of the top plate 14, but the embodiment of the present invention is not limited thereto.
  • The marker may be anywhere on the body of the cooker and within the imaging range of the camera. Thus, the marker may be all or part of a component of the body of the cooker.
  • at least one of the touch keys 18A to 18C and the output display units 20A to 20C shown in FIG. 2 may be used as a marker.
  • the entire top surface 14a of the top plate 14 may be used as a marker.
  • It suffices that a marker exist on the main body of the heating cooker at a fixed position with respect to the container mounting area (heating unit), and that its image can be identified in the photographed image. In this case, there is no need to draw a marker or attach a marker sticker, so a decline in the design quality of the heating cooker can be avoided.
  • the markers 44A to 44C are provided on the main body 12 of the heating cooker 10, specifically, on the top surface 14a of the top plate 14.
  • However, the embodiment of the present invention is not limited to this. A marker used only for color correction of the photographed image may be provided on the container in which the cooking target is accommodated. This places the color correction marker near the cooking target, so the color of the image of the cooking target can be corrected more appropriately.
  • The reference chart 80 in the image is extracted to calculate the correction parameter each time the camera 30 captures an image, but the embodiment of the present disclosure is not limited thereto.
  • the cooking area extraction in step S380 may be performed by omitting steps S320 to S370.
  • The process shown in FIG. 22 may be performed. That is, after step S310, in step S500, the illumination state detection unit 73 of the image processing unit 54 determines whether the illumination state around the top plate 14 has changed. If there is no change in the illumination state, steps S320 to S370 may be omitted.
  • the case where there is no change in the illumination state means that the change in the illumination state is included in a predetermined range.
  • When the illumination state detection unit 73 detects a change of 30 or more in any of the RGB values of the image captured by the camera 30, compared with the sampling image used when the previous correction parameter was calculated, it may determine that the illumination state has changed and perform the process from step S320.
  • Conversely, when every RGB value of the image captured by the camera 30 differs by less than 30 from the sampling image used when the correction parameter was calculated, the illumination state detection unit 73 may determine that the illumination state has not changed, omit steps S320 to S370, and perform the process from step S380.
  • the color correction unit 76 performs color correction again using the reference chart 80 included in the image captured by the camera 30.
  • steps S320 to S370 are omitted, so that the color correction can be speeded up.
  • Color correction is performed using the pixel values of the reference chart 80. Therefore, the state of the cooking target T can be correctly recognized using, for example, the color information of the browning color.
  • The illumination state detection unit 73 of the image processing unit 54 detects the illumination state around the top plate 14 by image processing, but the embodiment of the present disclosure is not limited thereto. Instead of the illumination state detection unit 73 detecting the illumination state from the photographed image, the cooking state recognition unit 31 may be provided with a light intensity meter and detect the illumination state around the top plate 14 directly.
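The threshold test described above (a change of 30 or more in any RGB channel triggers recalculation of the correction parameter) can be sketched as follows. This is an illustrative sketch only; the function name and the sample channel averages are assumptions, and the patent does not specify whether the comparison uses per-pixel values or channel averages.

```python
def illumination_changed(prev_rgb, curr_rgb, threshold=30):
    """True when any RGB channel of the sampled image differs from the
    previous sampling image by the threshold (30 in the text) or more."""
    return any(abs(c - p) >= threshold for p, c in zip(prev_rgb, curr_rgb))

# Illustrative channel averages of the previous and current sampling images.
changed = illumination_changed((120, 110, 100), (150, 115, 100))    # R differs by 30 -> True
unchanged = illumination_changed((120, 110, 100), (140, 115, 100))  # all diffs < 30  -> False
```

When `illumination_changed` is True, the full reference-chart extraction (steps S320 to S370) would run; otherwise the process can jump straight to the cooking area extraction of step S380.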
  • In Embodiment 2, when the detected color of the cooking target reaches the predetermined color, the heating coil control unit 58 reduces or stops the heating output of the heating coils 16A to 16C. However, the embodiment of the present invention is not limited to this.
  • The heating coil control unit 58 may instead increase the heating output of the heating coils 16A to 16C after, for example, the lightness L* reaches a predetermined lightness once the detected coloration has reached the predetermined color.
  • For example, a brown color may be set as the color indicating the finish of cooking.
  • a plurality of predetermined colors may be stored, and the heating output may be adjusted each time each color is reached.
  • When the cooking target state estimation unit 56 recognizes that the cooking target T in the image captured by the camera 30 has reached the predetermined color, the heating coil control unit 58 performs control to increase or decrease the heating output.
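The staged control described above (several predetermined colors stored, output adjusted each time one is reached) might be sketched as below. This is a hypothetical illustration: the stage table, the use of lightness L* as the trigger, and the wattage values are assumptions for the example, not values given in the patent.

```python
# Each stage pairs a lightness L* threshold (browning darkens the target,
# so L* falls over time) with the heating output to apply once it is reached.
STAGES = [(80.0, 1000), (60.0, 600), (45.0, 0)]  # (L* threshold, output in W)

def next_output(lightness, stage_index):
    """Return (new_output, new_stage_index).

    new_output is None when the current stage's threshold has not yet been
    reached, meaning the heating output should be kept as it is."""
    if stage_index < len(STAGES) and lightness <= STAGES[stage_index][0]:
        return STAGES[stage_index][1], stage_index + 1
    return None, stage_index
```

Called once per captured frame, this advances through the stages as the estimated lightness of the cooking target falls, reducing the output and finally stopping heating at the finish color.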
  • In Embodiment 2, the output display unit 20C shares the output display of the heating coil 16C and the display of the reference chart 80, but the embodiment of the present invention is not limited to this.
  • The reference chart 80 may be displayed on the output display unit corresponding to a heating coil that is not heating.
  • a dedicated display unit for displaying the reference chart 80 may be disposed on the top plate 14.
  • the present invention is not limited to the induction heating cooker, and is applicable to any heating cooker that heats a container containing a cooking target.

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Induction Heating Cooking Devices (AREA)

Abstract

According to the present invention, heat cooking is performed appropriately on the basis of an image of an object to be cooked in a captured image, without being influenced by the light environment. The cooking appliance comprises: a main body having a top surface provided with a container mounting portion on which a container containing an object to be cooked is mounted; a heating portion that heats, from below, the container mounted on the container mounting portion; a camera that captures an image of the object to be cooked; a marker that is disposed on the main body and captured by the camera; an image processing unit that measures the tone of an image of the marker in the captured image and, on the basis of the measurement result, corrects the colors of the captured image such that the captured image of the marker has the actual tone of the marker itself; a cooking-object state estimation unit that estimates the state of the object to be cooked on the basis of the image of the object in the color-corrected captured image; and a heating control unit that controls the heating portion on the basis of the estimated state of the object to be cooked.
PCT/JP2018/034893 2017-09-25 2018-09-20 Appareil de cuisson WO2019059304A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2017183991A JP6906192B2 (ja) 2017-09-25 2017-09-25 加熱調理器
JP2017-183991 2017-09-25
JP2018096532A JP7178622B2 (ja) 2018-05-18 2018-05-18 加熱調理器
JP2018-096532 2018-05-18

Publications (1)

Publication Number Publication Date
WO2019059304A1 true WO2019059304A1 (fr) 2019-03-28

Family

ID=65810822

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/034893 WO2019059304A1 (fr) 2017-09-25 2018-09-20 Appareil de cuisson

Country Status (1)

Country Link
WO (1) WO2019059304A1 (fr)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004363814A (ja) * 2003-06-03 2004-12-24 Toshiba Plant Systems & Services Corp 基準器、情報処理装置、及び画像処理方法
JP2017040430A (ja) * 2015-08-19 2017-02-23 東京瓦斯株式会社 レンジフード
WO2017033406A1 (fr) * 2015-08-27 2017-03-02 パナソニックIpマネジメント株式会社 Dispositif de cuisson


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021081149A (ja) * 2019-11-21 2021-05-27 株式会社パロマ ガスコンロシステム
JP7350308B2 (ja) 2019-11-21 2023-09-26 株式会社パロマ ガスコンロシステム

Similar Documents

Publication Publication Date Title
CN104272717B (zh) 用于执行投影图像到检测到的红外线(ir)辐射信息的对准的方法和系统
US7924312B2 (en) Infrared and visible-light image registration
JP7149466B2 (ja) 加熱調理器
WO2018111915A1 (fr) Application de mesure et de dimensionnement du pied
KR101998639B1 (ko) 열화상 카메라와 색상 카메라의 복합 영상을 이용한 지능형 발화지점 감시 시스템
CN109141640A (zh) 基于机器视觉的色差检测方法、系统、设备及存储介质
JP5845834B2 (ja) 複数の画像フレームの位置合わせ方法、スキャニング装置およびコンピューター読み取り可能な媒体
JP2007303804A (ja) ふきこぼれ検知装置、ふきこぼれ検知方法及びふきこぼれ検知プログラム
EP2820617A1 (fr) Procédé et système de projection d'une représentation visible de rayonnement infrarouge
US9100595B2 (en) Image processing method and thermal imaging camera
JP6945620B2 (ja) システムの作動方法及びプログラム
CN106068092A (zh) 图像处理装置
JP6861345B2 (ja) 文字図形認識装置、文字図形認識方法、及び文字図形認識プログラム
US20160134854A1 (en) Automated white balancing in digital photography
JP7065305B2 (ja) 加熱調理器
JP2018066712A (ja) 計測装置
JP5817453B2 (ja) スキャニング装置、画像処理プログラムおよび融合画像を生成する方法
WO2016199253A1 (fr) Système de traitement d'image, dispositif de lecture d'image, procédé de détection de région et programme informatique
JP2012155700A (ja) 画像処理方法、画像処理装置、およびコンピューター読み取り可能な媒体
WO2019059304A1 (fr) Appareil de cuisson
JP7178622B2 (ja) 加熱調理器
JP2019061793A (ja) 加熱調理器
KR101678557B1 (ko) 고기 로스팅 상태 표시기
WO2013104313A1 (fr) Procédé et système destinés à être utilisés pour détecter des informations de position tridimensionnelles d'un dispositif d'entrée
JP7527622B2 (ja) 画像処理システム及び画像処理方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18859818

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18859818

Country of ref document: EP

Kind code of ref document: A1