WO2011043039A1 - Air conditioner - Google Patents


Info

Publication number
WO2011043039A1
WO2011043039A1 (PCT/JP2010/005883)
Authority
WO
WIPO (PCT)
Prior art keywords
obstacle
detection means
person
area
position determination
Prior art date
Application number
PCT/JP2010/005883
Other languages
English (en)
Japanese (ja)
Inventor
恵子 岩本
智 佐藤
寧 神野
孝 杉尾
智貴 森川
博基 長谷川
裕介 河野
Original Assignee
パナソニック株式会社 (Panasonic Corporation)
Application filed by パナソニック株式会社 (Panasonic Corporation)
Publication of WO2011043039A1

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24: HEATING; RANGES; VENTILATING
    • F24F: AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F11/00: Control or safety arrangements
    • F24F11/30: Control or safety arrangements for purposes related to the operation of the system, e.g. for safety or monitoring
    • F24F2110/00: Control inputs relating to air properties

Definitions

  • The present invention relates to air conditioners in which the indoor unit is provided with an obstacle detection device that detects the presence or absence of obstacles, and in which the wind direction changing blades and the like are controlled based on the detection result of the obstacle detection device so that conditioned air is delivered efficiently.
  • A conventional air conditioner detects the shape of the room in which the indoor unit is installed and the installation position of the indoor unit, and operates efficiently by controlling the air direction, air volume, and so on based on the detected room shape and installation position.
  • Specifically, left and right distance detection sensors and front and lower distance detection sensors are provided in the indoor unit: the right distance detection sensor measures the distance between the indoor unit and the right wall, the left distance detection sensor measures the distance to the left wall, and the lower distance detection sensor measures the installation height, whereby the installation position of the indoor unit is recognized.
  • the distance from the wall is measured to recognize the indoor form.
  • Air conditioning control is performed efficiently by controlling the wind direction changing blade and the indoor fan according to the installation position of the indoor unit and the room shape recognized in this way (see, for example, Patent Document 1).
  • The present invention has been made in view of the above-described problems of the prior art.
  • Its object is to provide an air conditioner that accurately recognizes the positions of obstacles in the area to be air-conditioned, thereby widening the range of airflow control and realizing a comfortable air-conditioned space.
  • the present invention is an air conditioner provided with an imaging device that detects the presence or absence of an obstacle, and divides a region to be air-conditioned into a plurality of obstacle position determination regions.
  • The presence or absence of an obstacle in each of the divided obstacle position determination areas is determined separately from the walls existing around the area to be air-conditioned.
  • The present invention is also an air conditioner provided with an imaging device that detects the presence or absence of a person and the presence or absence of an obstacle; the imaging device detects the presence or absence of an obstacle only when no person is detected, and does not perform obstacle detection while a person is detected.
  • the present invention can accurately recognize the positions of various obstacles existing in the room by the above-described configuration, and can widen the range of airflow control and realize a comfortable air-conditioned space.
  • FIG. 1 is a front view of an indoor unit of an air conditioner according to the present invention.
  • FIG. 2 is a longitudinal sectional view of the indoor unit of FIG. 1.
  • FIG. 3 is a longitudinal sectional view of the indoor unit of FIG. 1 with the movable front panel opening the front opening and the upper and lower blades opening the outlet.
  • FIG. 4 is a longitudinal sectional view of the indoor unit of FIG. 1 in a state where the lower blade constituting the upper and lower blades is set downward.
  • FIG. 5 is a flowchart showing the flow of the human position estimation process in this embodiment.
  • FIG. 6 is a schematic diagram for explaining background difference processing in human position estimation in the present embodiment.
  • FIG. 7 is a schematic diagram for explaining processing for creating a background image in background difference processing.
  • FIG. 8 is a schematic diagram for explaining processing for creating a background image in background difference processing.
  • FIG. 9 is a schematic diagram for explaining a process of creating a background image in the background difference process.
  • FIG. 10 is a schematic diagram for explaining region division processing in human position estimation in the present embodiment.
  • FIG. 11 is a schematic diagram for explaining two coordinate systems used in this embodiment.
  • FIG. 12 is a schematic diagram showing the distance from the image sensor unit to the position of the center of gravity of the person.
  • FIG. 13 is a schematic diagram showing a human position determination area detected by the image sensor unit constituting the human body detection means.
  • FIG. 14 is a schematic diagram in the case where a person is present in the human position determination area detected by the imaging sensor unit constituting the human body detection means.
  • FIG. 15 is a flowchart for setting region characteristics in each of the regions A to G.
  • FIG. 16 is a flowchart for finally determining the presence or absence of a person in each of the regions A to G.
  • FIG. 17 is a schematic plan view of a residence where the indoor unit of FIG. 1 is installed.
  • FIG. 18 is a graph showing the long-term cumulative result of each image sensor unit in the residence of FIG. 17.
  • FIG. 19 is a schematic plan view of another residence in which the indoor unit of FIG. 1 is installed.
  • FIG. 20 is a graph showing the long-term cumulative result of each image sensor unit in the residence of FIG. 19.
  • FIG. 21 is a flowchart showing the flow of a human position estimation process using a process of extracting a person-like area from a frame image.
  • FIG. 22 is a flowchart showing the flow of human position estimation processing using processing for extracting a face-like region from a frame image.
  • FIG. 23 is a schematic diagram showing an obstacle position determination area detected by the obstacle detection means.
  • FIG. 24 is a schematic diagram for explaining obstacle detection by the stereo method.
  • FIG. 25 is a flowchart showing the flow of processing for measuring the distance to the obstacle.
  • FIG. 26 is a schematic diagram showing the distance from the image sensor unit to the position P.
  • FIG. 27 is an elevation view of a living space, and is a schematic diagram showing the measurement results of the obstacle detection means.
  • FIG. 28 is a schematic diagram showing the definition of the wind direction at each position of the left and right blades constituting the left and right blades.
  • FIG. 29 is a schematic plan view of a room for explaining a wall detection algorithm for determining a distance number by measuring a distance from an indoor unit to a surrounding wall surface.
  • FIG. 30 is a front view of another indoor unit of an air conditioner according to the present invention.
  • FIG. 31 is a schematic diagram showing the relationship between the image sensor unit and the light projecting unit.
  • FIG. 32 is a flowchart showing the flow of processing for measuring the distance to an obstacle using the light projecting unit and the image sensor unit.
  • FIG. 33 is a front view of another indoor unit of an air conditioner according to the present invention.
  • FIG. 34 is a flowchart showing the flow of processing of the human body distance detecting means using the human body detecting means.
  • FIG. 35 is a schematic diagram for explaining processing for estimating the distance from the image sensor unit to a person using v1 which is the v coordinate at the top of the image.
  • FIG. 36 is a flowchart showing the flow of processing of obstacle detection means using human body detection means.
  • FIG. 37 is a schematic diagram for explaining the process of estimating the height v2 of the person on the image using the distance information from the imaging sensor unit to the person estimated by the human body distance detection means.
  • FIG. 38 is a schematic diagram for explaining processing for estimating whether an obstacle exists between the image sensor unit and a person.
  • FIG. 39 is a schematic diagram for explaining processing for estimating whether an obstacle exists between the image sensor unit and a person.
  • The present invention is an air conditioner comprising an indoor unit provided with an imaging device that includes obstacle detection means for detecting the presence or absence of an obstacle, wherein wind direction changing means provided in the indoor unit is controlled based on the detection signal of the obstacle detection means, and the imaging device determines the presence or absence of obstacles according to the condition of the room.
  • The area to be air-conditioned is divided into a plurality of obstacle position determination areas, and the imaging device determines the presence or absence of an obstacle in each of the divided obstacle position determination areas, separately from the walls existing around the area to be air-conditioned.
  • The imaging device further includes human body detection means that detects the presence or absence of a person, and the wind direction changing means provided in the indoor unit is controlled based on the detection signals of both the human body detection means and the obstacle detection means. When the human body detection means detects no person, the obstacle detection means detects the presence or absence of obstacles; when a person is detected, obstacle detection is not performed. Thereby, the position of an obstacle can be recognized quickly and accurately without mistaking a person for an obstacle. Therefore, the range of airflow control can be expanded and a comfortable air-conditioned space can be realized.
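The gating logic described above can be sketched as follows; the function and argument names are illustrative, not from the patent.

```python
# Hypothetical sketch: obstacle detection runs only when the human body
# detection means reports that no person is present, so a person is
# never registered as an obstacle.

def detect_obstacles_if_unoccupied(person_detected, detect_obstacles, cached_map):
    """Return an obstacle map, refreshing it only when no person is seen.

    person_detected  -- result of the human body detection means (bool)
    detect_obstacles -- callable running the obstacle detection means
    cached_map       -- last known obstacle map, reused while occupied
    """
    if person_detected:
        # A person is present: skip obstacle detection entirely and
        # keep using the previously determined obstacle positions.
        return cached_map
    return detect_obstacles()
```

While a person is in the room, the controller keeps steering airflow using the last obstacle map determined during an unoccupied interval.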
  • FIGS. 1 to 4 show the indoor unit of an air conditioner according to the present invention.
  • The indoor unit has a main body 2 and a movable front panel (hereinafter simply referred to as the front panel) 4 that can open and close the front opening 2a of the main body 2. When the air conditioner is stopped, the front panel 4 closes the front opening 2a in close contact with the main body 2; during operation, the front panel 4 moves away from the main body 2 to open the front opening 2a.
  • FIGS. 1 and 2 show a state where the front panel 4 closes the front opening 2a,
  • FIGS. 3 and 4 show a state where the front panel 4 opens the front opening 2a.
  • Inside the main body 2, a heat exchanger 6 is provided; indoor air taken in from the front opening 2a and the upper surface opening 2b is heat-exchanged by the heat exchanger 6 and blown back into the room.
  • a filter 16 is provided between the front opening 2a and the upper surface opening 2b and the heat exchanger 6 for removing dust contained in the indoor air taken in from the front opening 2a and the upper surface opening 2b.
  • The upper part of the front panel 4 is connected to the upper part of the main body 2 via two arms 18 and 20 provided at both ends thereof, and the front panel 4 is opened and closed by drive-controlling a drive motor (not shown) connected to the arm 18.
  • the upper and lower blades 12 are composed of an upper blade 12a and a lower blade 12b, and are respectively swingably attached to the lower portion of the main body 2.
  • the upper blade 12a and the lower blade 12b are connected to separate driving sources (for example, stepping motors), and are independently angle-controlled by a control device (for example, a microcomputer) built in the indoor unit.
  • The upper and lower blades 12 may alternatively be composed of three or more blades; in this case, it is preferable that at least two of them (particularly the uppermost blade and the lowermost blade) be independently angle-controlled.
  • The left and right blades 14 are composed of a total of ten blades, five on each of the left and right sides of the center of the indoor unit, each swingably attached to the lower part of the main body 2.
  • The left and right sets of five blades are each connected to a separate drive source (for example, a stepping motor) as a unit, and the two sets are independently angle-controlled by the control device built in the indoor unit.
  • a method for driving the left and right blades 14 will also be described later.
  • an imaging sensor unit 24 is attached as an imaging device to the upper part of the front panel 4, and the imaging sensor unit 24 is held by a sensor holder.
  • The imaging sensor unit 24 includes a circuit board, a lens attached to the circuit board, and an imaging sensor mounted behind the lens. The human body detection means determines the presence or absence of a person on the circuit board based on, for example, the difference processing described later; that is, the circuit board acts as presence/absence determination means for determining the presence or absence of a person.
  • <Estimation of human position by image sensor unit> To estimate the position of a person with the image sensor unit 24, the difference method, a known technique, is used: a difference is computed between a background image (an image in which no person is present) and an image captured by the image sensor unit 24, and a person is estimated to exist in a region where a difference occurs.
  • FIG. 5 is a flowchart showing the flow of human position estimation processing in the present embodiment.
  • a background difference process is used to detect pixels that have a difference in the frame image.
  • Background difference processing compares a background image captured under specific conditions with an image captured under the same imaging conditions (the field of view, viewpoint, and focal length of the imaging sensor unit 24), and detects objects that do not exist in the background image but exist in the captured image. To detect a person, an image without a person is created as the background image.
  • FIG. 6 is a schematic diagram for explaining the background difference processing.
  • FIG. 6A shows a background image.
  • the visual field is set to be substantially equal to the air-conditioned space of the air conditioner.
  • 101 indicates a window existing in the air-conditioned space
  • 102 indicates a door.
  • FIG. 6B shows a frame image captured by the image sensor unit 24.
  • The field of view, viewpoint, focal length, and the like of the image sensor unit 24 are the same as those of the background image of FIG. 6A.
  • Reference numeral 103 denotes a person existing in the air-conditioned space.
  • FIG. 6C shows the difference image between FIGS. 6A and 6B.
  • White pixels indicate pixels where no difference exists, and black pixels indicate pixels where a difference occurs. It can be seen that the area of the person 103 that is not present in the background image but is present in the captured frame image is detected as the area 104 where the difference occurs. That is, it is possible to detect a person area by extracting an area where a difference is generated from the difference image.
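The background difference processing of FIG. 6 can be sketched as follows; the threshold value is an illustrative assumption, not taken from the patent.

```python
import numpy as np

# Minimal sketch of background difference processing: pixels whose
# absolute difference from the background image exceeds a threshold are
# marked as the region where a person may be present (region 104).

def background_difference(background, frame, threshold=30):
    """Return a boolean mask of pixels that differ from the background."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold

background = np.zeros((4, 4), dtype=np.uint8)
frame = background.copy()
frame[1:3, 1:3] = 200          # a bright "person" region appears
mask = background_difference(background, frame)
```

Extracting the connected `True` pixels of `mask` then yields the person region, as described in the next steps.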
  • FIGS. 7A to 7C are schematic diagrams showing three consecutive frames captured by the imaging sensor unit 24 in a scene in which the person 103 is moving from right to left in front of the window 101.
  • FIG. 7B shows an image of the next frame of FIG. 7A
  • FIG. 7C shows an image of the next frame of FIG. 7B.
  • FIGS. 8A to 8C show inter-frame difference images obtained by performing inter-frame difference processing on consecutive pairs of the images of FIGS. 7A to 7C.
  • White pixels indicate pixels where no difference exists
  • black pixels 105 indicate pixels where a difference occurs.
  • FIGS. 9A to 9C are diagrams schematically showing the update of the background image in each frame of FIGS. 7A to 7C.
  • a hatched area 106 indicates an area where the background image has been updated
  • a black area 107 indicates an area where a background image has not yet been created
  • a white area 108 indicates an area where the background image has not been updated. That is, the total area of the black area 107 and the white area 108 in FIG. 9 is equal to the black area in FIG.
  • the black area 107 is gradually reduced and a background image is automatically created.
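The automatic background creation of FIGS. 7 to 9 can be sketched as below: the background is updated only at pixels where the inter-frame difference shows no motion, so the moving person 103 is never baked into the background and the not-yet-created region 107 shrinks frame by frame. The threshold and update policy are illustrative assumptions.

```python
import numpy as np

# Sketch: copy static pixels of the current frame into the background.
# `valid` tracks where a background value exists (regions 106/108);
# ~valid corresponds to region 107, where no background exists yet.

def update_background(background, valid, prev_frame, frame, threshold=10):
    """Copy static pixels of `frame` into `background`; mark them valid."""
    moving = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16)) > threshold
    static = ~moving
    background[static] = frame[static]
    valid |= static
    return background, valid
```

Calling this on every new frame gradually fills in the background behind the moving person.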
  • In step S102, the obtained difference region is divided into regions; if there are a plurality of persons, the difference pixels are divided into a plurality of difference regions.
  • The difference image can be divided into regions according to the rule that "a pixel in which a difference occurs and a neighboring pixel in which a difference also occurs belong to the same region".
  • FIG. 10 is a schematic diagram in which this area division processing is executed.
  • FIG. 10A shows a difference image calculated by the difference process, and black pixels 111 and 112 are pixels in which a difference occurs.
  • FIG. 10B shows the result of dividing FIG. 10A into regions.
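The region division rule of step S102 amounts to connected-component labeling; a minimal sketch using a flood fill over 4-neighbours is shown below (the neighbourhood choice is an assumption).

```python
# Sketch of step S102: difference pixels that are 4-neighbours of each
# other are grouped into the same region via an iterative flood fill.

def label_regions(mask):
    """Return a list of regions, each a list of (row, col) pixels."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for r in range(h):
        for c in range(w):
            if mask[r][c] and not seen[r][c]:
                stack, region = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    region.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                regions.append(region)
    return regions
```

Each returned region then corresponds to one candidate person, whose center of gravity is computed in step S103.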
  • In step S103, the position of the detected person is determined by calculating the center of gravity of each obtained region.
  • For this calculation, perspective projection conversion may be used.
  • FIG. 11 is a schematic diagram for explaining two coordinate systems.
  • The first is the image coordinate system: a two-dimensional coordinate system in the captured image, with the upper left pixel of the image as the origin, u pointing rightward, and v pointing downward.
  • The second is the camera coordinate system: a three-dimensional coordinate system based on the camera, in which the focal position of the image sensor unit 24 is the origin, the optical axis direction of the image sensor unit 24 is Zc, the camera upward direction is Yc, and the camera left direction is Xc.
  • f is the focal length [mm]
  • (u0, v0) is the image center [Pixel] on the image coordinates
  • (dpx, dpy) is the size [mm / Pixel] of one pixel of the image sensor.
  • In FIGS. 12A and 12B, the center of gravity position of the person on the image is (ug, vg), and the corresponding three-dimensional position in the camera coordinate system is (Xgc, Ygc, Zgc).
  • FIG. 12A is a schematic view of the air-conditioned space viewed from the side
  • FIG. 12B is a schematic view of the air-conditioned space viewed from above.
  • The image sensor unit 24 is installed at a height H, with the Xc direction horizontal and the optical axis Zc tilted at an angle δ from the vertical direction.
  • The direction in which the image sensor unit 24 is facing is expressed by a vertical angle ε (an elevation angle measured upward from the vertical line) and a horizontal angle θ (measured rightward from the front reference line as viewed from the indoor unit). Furthermore, when the height of the center of gravity of the person is h, the distance L from the imaging sensor unit 24 to the center of gravity position and the direction W, which give the three-dimensional position in the air-conditioned space, can be calculated by the following equations.
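The equations referenced above are not reproduced in this excerpt; the following is one plausible geometric reconstruction under the stated setup (sensor at height H, optical axis tilted δ from vertical, centroid at height h) and should be treated as an assumption, not the patent's exact formulas.

```python
import math

# Hedged sketch: map the image centroid (ug, vg) to a slant distance L
# and horizontal direction W, given focal length f [mm], image centre
# (u0, v0) [pixel], and pixel pitch (dpx, dpy) [mm/pixel].

def centroid_to_position(ug, vg, u0, v0, f, dpx, dpy, H, h, delta):
    """Estimate line-of-sight distance L and horizontal direction W."""
    # Angle of the viewing ray from the vertical: pixels lower in the
    # image (larger v) look more steeply downward, hence the minus sign.
    eps = delta - math.atan2((vg - v0) * dpy, f)
    # Horizontal angle, positive toward the camera-left (Xc) side.
    W = math.atan2((u0 - ug) * dpx, f)
    # The ray descends a height of (H - h) to reach the centroid.
    L = (H - h) / math.cos(eps)
    return L, W
```

For the image centre with δ = 45°, H = 2.2 m and h = 0.7 m, this gives L = 1.5·√2 m directly along the optical axis.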
  • FIGS. 13A and 13B show in which area in the air-conditioned space a person exists when the center of gravity position on the image exists in each of the areas A to G.
  • FIGS. 14A and 14B are schematic diagrams of cases where a person is present. In FIG. 14A, the center of gravity of the person lies in areas A and F, so it is determined that a person exists in areas A and F of FIG. 13B. In FIG. 14B, the center of gravity lies in area D, so it is determined that a person exists in area D of FIG. 13B.
  • FIG. 15 is a flowchart for setting region characteristics to be described later in each of the regions A to G using the image sensor unit 24.
  • FIG. 16 is a flowchart for determining, using the image sensor unit 24, in which of the regions A to G a person exists.
  • In step S1, the presence or absence of a person in each area is first determined by the above-described method at a predetermined cycle T1 (for example, 200 milliseconds if the frame rate of the image sensor unit 24 is 5 fps).
  • Each of the areas A to G is classified into a first area where people are often present, a second area where people stay only briefly (an area that people merely pass through), and a third area where people hardly go (a non-living area such as a wall or a window).
  • The first, second, and third areas are referred to as life category I, life category II, and life category III, respectively; equivalently, they can be called regions of region characteristic I, II, and III.
  • Life category I (region characteristic I) and life category II (region characteristic II) together form the living region (the region where people are present), while life category III (region characteristic III) forms the non-living region; the areas may thus be broadly classified according to the frequency of a person's presence.
  • FIG. 17 shows a case where the indoor unit of the air conditioner according to the present invention is installed in the LD of a 1LDK dwelling consisting of one Japanese-style room, an LD (living/dining room), and a kitchen; the elliptical areas in FIG. 17 indicate the places where the subjects reported spending time.
  • The presence or absence of a person in each of the regions A to G is determined every period T1, and 1 (reaction) or 0 (no reaction) is output as the determination result for that period; this is repeated a plurality of times, after which, in step S2, all sensor outputs are cleared.
  • In step S3, it is determined whether a predetermined cumulative operation time of the air conditioner has elapsed. If not, the process returns to step S1; if so, the reaction results accumulated over the predetermined time in each of the regions A to G are compared with two threshold values, and each region A to G is classified into one of the life categories I to III.
  • Specifically, a first threshold value and a second, smaller threshold value are set. In step S4, it is determined whether the long-term accumulation result of each region A to G is greater than the first threshold value; a region determined to be greater is assigned life category I in step S5. If the result is smaller than the first threshold value, it is compared with the second threshold value in step S6.
  • A region determined to be greater than the second threshold value is assigned life category II in step S7, while a region determined to be smaller is assigned life category III in step S8.
  • the areas C, D, and G are determined as the life category I
  • the areas B and F are determined as the life category II
  • the areas A and E are determined as the life category III.
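The two-threshold discrimination of steps S4 to S8 can be sketched as below; the threshold values are illustrative assumptions, not the patent's figures.

```python
# Sketch of steps S4-S8: a region's long-term cumulative reaction count
# is compared with a first threshold and a smaller second threshold to
# assign life category I, II, or III.

FIRST_THRESHOLD = 100   # assumed value
SECOND_THRESHOLD = 30   # assumed value

def classify_region(cumulative_count):
    """Return the life category (region characteristic) for one region."""
    if cumulative_count > FIRST_THRESHOLD:
        return "I"      # places where people are often present
    if cumulative_count > SECOND_THRESHOLD:
        return "II"     # places people merely pass through
    return "III"        # non-living areas such as walls or windows
```

Running this over the accumulated counts of areas A to G reproduces classifications like those of FIGS. 18 and 20.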
  • FIG. 19 shows a case where the indoor unit of the air conditioner according to the present invention is installed in the LD of another 1LDK dwelling, and FIG. 20 shows the results of discriminating each region A to G based on the long-term accumulation results in this case.
  • the areas B, C, and E are determined as the life category I
  • the areas A and F are determined as the life category II
  • the areas D and G are determined as the life category III.
  • In step S23, it is determined whether a predetermined number M (for example, 45) of reaction results in the cycle T1 has been obtained. If the cycle T1 has not yet been repeated M times, the process returns to step S21. If it has, then in step S24 the total number of reactions over the period T1 × M is taken as the cumulative reaction period count for one period.
  • In step S27, 1 is subtracted from the number N of cumulative-reaction-period calculations and the process returns to step S21, so that the calculation of the cumulative reaction period count is repeated a predetermined number of times.
  • Table 1 shows the history of reaction results for the most recent period (time T1 × M).
  • ΣA0 denotes the cumulative reaction period count for the most recent period in region A.
  • The cumulative reaction period count for the period immediately before ΣA0 is ΣA1, the one before that is ΣA2, and so on.
  • Here N = 4: using the past four history values (ΣA4, ΣA3, ΣA2, ΣA1), for life category I it is determined that a person is present if at least one of them is 1 or more.
  • For life category II, a person is determined to be present if at least two of the past four history values are 1 or more.
  • For life category III, a person is determined to be present if at least three of the past four history values are 2 or more.
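The per-category decision rules above can be sketched as below; the rule for life category I is a paraphrase of the text (at least one non-zero count among the past four), stated here as an assumption.

```python
# Sketch of the final presence decision: the past four cumulative
# reaction period counts (e.g. ΣA4..ΣA1) of a region are checked
# against a per-category rule; category I reacts fastest, III slowest.

RULES = {
    # category: (minimum count per period, periods that must reach it)
    "I":   (1, 1),
    "II":  (1, 2),
    "III": (2, 3),
}

def person_present(category, history):
    """Decide presence from the past four Σ values of one region."""
    min_count, min_periods = RULES[category]
    return sum(1 for s in history[-4:] if s >= min_count) >= min_periods
```

A category-III region (e.g. near a wall) thus needs much stronger, repeated evidence before a person is declared present, which suppresses false reactions.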
  • the presence / absence of the person is similarly estimated from the past four histories, life categories, and cumulative reaction period times.
  • In this way, the region determination results for each predetermined period are accumulated over a long time to obtain the region characteristics, and the per-cycle determination results are accumulated N times to obtain the cumulative reaction period count of each region.
  • The region characteristic of each region A to G (life category I to III) is thereby determined, and the time required for presence estimation and the time required for absence estimation are changed according to the region characteristic of each region A to G.
  • Taking the estimation time of an area determined as life category II as the standard, in an area determined as life category I the presence of a person is estimated at a shorter time interval, while the absence of a person is estimated at a longer time interval.
  • Conversely, in an area determined as life category III, the time required for presence estimation is set longer.
  • That is, the presence of a person is estimated at a longer time interval than in an area determined as life category II.
  • In the above description, the difference method is used for human position estimation by the image sensor unit 24.
  • Alternatively, a person-like region may be extracted from the frame image using image data of the whole body of a person.
  • As such a technique, methods using the HOG (Histograms of Oriented Gradients) feature amount are widely known (N. Dalal and B. Triggs, "Histograms of Oriented Gradients for Human Detection", Proc. IEEE Conf. on Computer Vision and Pattern Recognition, Vol. 1, pp. 886-893, 2005).
  • The HOG feature focuses on the edge strength in each edge direction within a local region; a person region may be detected from the frame image by learning and classifying this feature with an SVM (Support Vector Machine).
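The core of the HOG feature can be sketched as below: gradient orientations inside one local cell are accumulated into a histogram weighted by edge strength. Real detectors add block normalisation and an SVM stage; this fragment only illustrates the per-cell histogram, with an assumed 9-bin unsigned-orientation layout.

```python
import numpy as np

# Minimal sketch of a HOG-style cell histogram (after Dalal & Triggs).

def cell_hog(cell, n_bins=9):
    """Return the orientation histogram of one grayscale cell."""
    gy, gx = np.gradient(cell.astype(np.float64))
    magnitude = np.hypot(gx, gy)
    # Unsigned orientation in [0, 180) degrees, as in the original HOG.
    orientation = np.degrees(np.arctan2(gy, gx)) % 180.0
    bins = np.minimum((orientation / (180.0 / n_bins)).astype(int), n_bins - 1)
    hist = np.zeros(n_bins)
    for b, m in zip(bins.ravel(), magnitude.ravel()):
        hist[b] += m   # edge strength votes for its orientation bin
    return hist
```

A vertical edge concentrates its votes in the 0° bin and a horizontal edge near the 90° bin, which is exactly the directional edge-strength summary the text describes.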
  • FIG. 21 is a flowchart showing the flow of the human position estimation process using the process of extracting a human-like area from the frame image.
  • the same steps as those in FIG. 5 are denoted by the same reference numerals, and detailed description thereof is omitted here.
  • In step S104, a person-like region is extracted as a person region from the frame image by using the HOG feature amount described above.
  • In step S103, the position of the detected person is determined by calculating the center of gravity of the obtained person region.
  • Equations 3 and 5 may be used as described above.
  • a face-like area may be extracted from the frame image.
  • As such a technique, methods using Haar-Like features are widely known (P. Viola and M. Jones, "Robust real-time face detection", International Journal of Computer Vision, vol. 57, no. 2, pp. 137-154, 2004).
  • The Haar-Like feature amount focuses on the luminance difference between local regions; a face region may be detected from the frame image by learning and classifying this feature amount with an SVM (Support Vector Machine) or the like.
  • FIG. 22 is a flowchart showing a flow of a human position estimation process using a process of extracting a face-like area from a frame image.
  • the same steps as those in FIG. 5 are denoted by the same reference numerals, and detailed description thereof is omitted here.
  • In step S105, a face-like region is extracted as a face region from the frame image by using the Haar-Like feature amount described above.
  • In step S103, the position of the detected person is determined by calculating the center of gravity of the obtained face region.
  • perspective projection conversion may be used.
  • Next, the obstacle detection means that detects obstacles using the above-described imaging sensor unit 24 will be described.
  • Here, the term "obstacle" refers to any object that impedes the flow of air blown out from the air outlet 10 of the indoor unit toward residents; it is a collective term for non-resident objects such as furniture (for example, sofas), televisions, and audio equipment.
  • The floor surface of the living space is subdivided as shown in FIG. 23 based on the vertical angle ε and the horizontal angle θ.
  • Each of these areas is defined as an obstacle position determination area or “position” to determine in which position an obstacle exists.
  • all the positions shown in FIG. 23 substantially coincide with the whole area of the human position determination area shown in FIG. 13B, and the area boundary in FIG. 13B substantially coincides with the position boundary in FIG.
  • the number of obstacle position determination areas (positions) is set larger than the number of human position determination areas, so that at least two positions belong to each human position determination area.
  • alternatively, air conditioning control can be performed by dividing the areas so that at least one position belongs to each human position determination area.
  • each of the plurality of human position determination areas is divided according to the distance to the indoor unit, and the number of positions belonging to a near human position determination area is set larger than the number belonging to a far one.
  • however, the number of positions belonging to each human position determination area may be the same regardless of the distance from the indoor unit.
  • the air conditioner according to the present invention detects the presence or absence of a person in the regions A to G by the human body detection means, and detects the presence or absence of an obstacle in the positions A1 to G2 by the obstacle detection means. Based on the detection signal (detection result) of the human body detection means and the detection signal (detection result) of the obstacle detection means, the comfortable space is provided by driving and controlling the upper and lower blades 12 and the left and right blades 14 as the wind direction changing means. I am doing so.
  • the human body detection means detects the presence or absence of a person by detecting a moving object in the air-conditioned space, whereas the image sensor unit 24 merely measures the distance to objects, so the obstacle detection means cannot by itself distinguish a person from an obstacle.
  • as a result, the area where the person is located may be left unconditioned, or the conditioned airflow may be blown directly onto the person, so there is a risk of inefficient or uncomfortable air conditioning control.
  • therefore, the obstacle detection means distinguishes obstacles only by performing the data processing described below.
  • FIG. 24 is a schematic diagram for explaining obstacle detection by the stereo method.
  • the distance to a point P that is an obstacle is measured using the image sensor units 24 and 26.
  • where f is the focal length, B is the distance (baseline) between the focal points of the two image sensor units 24 and 26, u1 is the u coordinate of the obstacle on the image of the image sensor unit 24, u2 is the u coordinate of the corresponding point on the image of the image sensor unit 26, and X is the distance from the image sensor unit to the point P. It is further assumed that the image center positions of the two image sensor units 24 and 26 are equal. At this time, the distance X from the imaging sensor unit to the point P is obtained from X = f·B / (u1 − u2).
  • that is, the distance X from the imaging sensor unit to the obstacle point P depends on the parallax (u1 − u2).
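The relationship above can be sketched as follows; the numeric focal length, baseline, and image coordinates are illustrative assumptions, not values from the specification:

```python
def stereo_distance(f_px, baseline_m, u1, u2):
    """Stereo method: distance X to point P is X = f * B / (u1 - u2),
    where (u1 - u2) is the parallax between the two imaging sensor units."""
    disparity = u1 - u2
    if disparity == 0:
        return float("inf")  # zero parallax: the point is infinitely far
    return f_px * baseline_m / disparity

# Illustrative values: f = 500 px, B = 0.05 m, parallax = 10 px -> X = 2.5 m
X = stereo_distance(500.0, 0.05, 120.0, 110.0)
```

The inverse relation between parallax and distance is why far points (small parallax) are the hardest to measure reliably.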
  • for the search for corresponding points, a block matching method based on template matching may be used.
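A one-dimensional sketch of the corresponding-point search by block matching, minimizing the sum of absolute differences (SAD) between a small template and candidate blocks along the epipolar line; the block size, search range, and image rows are illustrative assumptions:

```python
def block_match(left_row, right_row, u, block=3, max_disp=5):
    """Find the parallax of left_row[u] by sliding a 1-D block over
    right_row and minimizing the sum of absolute differences (SAD)."""
    half = block // 2
    tmpl = left_row[u - half:u + half + 1]
    best_d, best_cost = 0, float("inf")
    for d in range(0, max_disp + 1):
        start = u - d - half
        if start < 0:
            continue  # candidate block would fall outside the image
        cand = right_row[start:start + block]
        if len(cand) < block:
            continue
        cost = sum(abs(a - b) for a, b in zip(tmpl, cand))
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d  # parallax in pixels

# A scene shifted 2 px between the two rows should yield disparity 2
left = [0, 0, 10, 50, 90, 50, 10, 0, 0, 0]
right = left[2:] + [0, 0]
d = block_match(left, right, u=4)
```

As the text notes later, this search is unstable on regions with no luminance variation, since many candidate blocks then have near-identical SAD costs.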
  • distance measurement (detection of the position of an obstacle) is performed using the imaging sensor unit.
  • from Equations 3, 5, and 6, it can be seen that the position of the obstacle is estimated from the pixel position and the parallax.
  • i and j in Table 3 indicate the pixel positions to be measured.
  • the vertical angle and the horizontal angle are, respectively, the elevation angle θ described above and the angle φ measured rightward from the front reference line as viewed from the indoor unit. That is, as viewed from the indoor unit, the pixels are set in the range of 5 to 80 degrees in the vertical direction and −80 to 80 degrees in the horizontal direction, and the image sensor unit measures the parallax of each pixel.
  • the air conditioner performs distance measurement (detection of the position of an obstacle) by measuring parallax at each pixel from pixel [14,15] to pixel [142,105].
  • the detection range of the obstacle detection means at the start of operation of the air conditioner may be limited to elevation angles of 10 degrees or more. This is because the measurement data can be used effectively by measuring distances only in the region where a person is unlikely to be present at the start of operation, that is, the region where the wall is located (since a person is not an obstacle, data for a region where a person is present is not used, as described later).
  • in step S41, when it is determined that there is no person in the area corresponding to the current pixel (one of areas A to G shown in FIG. 13), the process proceeds to step S42; when it is determined that there is a person, the process proceeds to step S43. That is, since a person is not an obstacle, for pixels corresponding to an area determined to contain a person, the previous distance data is used without performing distance measurement (the distance data is not updated); distance measurement is performed only for pixels corresponding to areas determined to contain no person, and the newly measured distance data is used (the distance data is updated).
  • that is, the presence or absence of an obstacle in each obstacle position determination area is determined according to the result of the presence/absence determination of a person in the corresponding human position determination area.
  • in this way, the presence/absence determination of obstacles is performed efficiently. More specifically, in an obstacle position determination area belonging to a human position determination area determined by the human body detection means to contain no person, the previous determination result of the obstacle detection means is updated with a new determination result; in an obstacle position determination area belonging to a human position determination area determined to contain a person, the previous determination result of the obstacle detection means is not updated with a new determination result.
  • step S42 the above-described block matching method is used to calculate the parallax of each pixel, and the process proceeds to step S44.
  • in step S44, it is determined whether data has been acquired eight times for the same pixel, that is, whether distance measurement based on the acquired data is complete. If it is determined that distance measurement is not complete, the process returns to step S41; conversely, if it is determined in step S44 that distance measurement is complete, the process proceeds to step S45.
  • in step S45, the reliability of the distance estimate is evaluated. If it is judged reliable, a distance number determination process is performed in step S46; if it is judged unreliable, a neighboring distance number is adopted as the distance data of the pixel in step S47.
  • the image sensor units 24 and 26 function as obstacle position detection means.
  • next, the distance number determination process in step S46 will be described. First, the term “distance number” is explained.
  • the “distance number” expresses the approximate distance from the image sensor unit to a position P in the air-conditioned space. As shown in FIG. 26, the image sensor unit is installed 2 m above the floor, and when the distance from the image sensor unit to the position P (the “distance corresponding to the distance number”) is X [m], the position P is expressed by the following equation.
  • the distance X corresponding to the distance number depends on the parallax between the image sensor units 24 and 26.
  • the distance number is an integer value from 2 to 12, and the distance corresponding to each distance number is set as shown in Table 4.
  • Table 4 shows, for each distance number, the position of point P corresponding to the elevation angle (θ) determined by the v-coordinate value of each pixel; in the blacked-out cells, h takes a negative value (h < 0), indicating a position below the floor surface.
  • likewise, positions corresponding to distance numbers of 10 or more lie beyond the wall of the room (diagonal distance > 4.50 m, that is, outside the room) and are meaningless distance numbers, so they are also shown in black.
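The geometry behind the blacked-out cells can be sketched as follows, assuming the sensor is 2 m above the floor and θ is the downward viewing angle of a pixel; the equation h = 2 − X·sin(θ) and the numeric values are reconstructions for illustration, not taken verbatim from the specification:

```python
import math

SENSOR_HEIGHT_M = 2.0

def point_height(distance_m, theta_deg):
    """Height h of point P above the floor when the sensor, mounted 2 m
    up, looks down at angle theta and P lies at the given line-of-sight
    distance X: h = 2 - X * sin(theta). A result h < 0 means the nominal
    point would lie below the floor (the blacked-out cells of Table 4)."""
    return SENSOR_HEIGHT_M - distance_m * math.sin(math.radians(theta_deg))

below_floor = point_height(4.5, 30.0) < 0   # 2 - 4.5*0.5 = -0.25 m
on_or_above = point_height(3.0, 30.0) >= 0  # 2 - 1.5 = 0.5 m
```

Cells whose (distance number, θ) pair gives h < 0, or whose diagonal distance exceeds the room size, carry no physical meaning and are excluded.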
  • Table 6 shows the limit value of the distance number set according to the capability rank of the air conditioner and the elevation angle of each pixel.
  • step S45 the reliability evaluation process in step S45 and the distance number determination process in step S46 will be described.
  • the distance number is determined eight times for each pixel; the two largest values and the two smallest values are removed, and the distance number is determined by taking the average of the remaining four.
  • with the stereo method based on block matching, when an obstacle with no luminance variation is measured, the parallax calculation is unstable and a greatly different parallax result (distance number) may be obtained at each measurement. Therefore, in step S45 the values of the remaining four distance numbers are compared, and if the variation is equal to or greater than a threshold value, the distance number is judged unreliable; in that case, in step S47,
  • the distance number estimated in the neighboring pixels is used.
  • the average value is quantized to an integer by rounding up the fractional part, and the position corresponding to the distance number thus determined is as shown in Table 4 or Table 5.
  • as described above, the distance number is determined by averaging the remaining four distance numbers after excluding the two largest and the two smallest.
  • however, the number of measurements per pixel is not limited to eight, and the number of distance numbers averaged is not limited to four.
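The determination of steps S45–S47 can be sketched as follows; the spread threshold of 2 and the fallback interface are assumed values for illustration:

```python
import math

def distance_number(samples, spread_threshold=2, neighbor=None):
    """Determine a pixel's distance number from 8 measurements: drop the
    two largest and two smallest, average the remaining four, and round
    up to an integer (steps S45-S46). If the remaining values vary by
    spread_threshold or more, the estimate is judged unreliable and a
    neighboring pixel's number is used instead (step S47)."""
    kept = sorted(samples)[2:-2]          # middle four of the eight
    if max(kept) - min(kept) >= spread_threshold:
        return neighbor                    # fall back to neighbor's number
    return math.ceil(sum(kept) / len(kept))

stable   = distance_number([5, 5, 6, 6, 5, 6, 12, 2])            # middle four: 5,5,6,6
unstable = distance_number([2, 12, 3, 9, 5, 11, 4, 10], neighbor=7)
```

Trimming both extremes before averaging suppresses the occasional wildly wrong parallax result, while the spread check catches pixels whose matching never stabilizes.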
  • in step S43 in the flowchart of FIG. 25, the previous distance data is used. However, since no previous data exists immediately after installation of the air conditioner, a default value is used the first time the obstacle detection means makes a determination for each obstacle position determination area; the limit value described above (maximum value D) is used as the default value.
  • FIG. 27 is an elevation view (longitudinal sectional view passing through the image sensor unit) of a certain living space.
  • the floor surface is 2 m below the image sensor unit, and a table or the like is 0.7 to 1.1 m from the floor surface.
  • the measurement results when there is an obstacle are shown in the drawing.
  • the cross-hatched portion, the upward-sloping hatched portion, and the downward-sloping hatched portion indicate positions determined to contain an obstacle at short distance, medium distance, and long distance, respectively (these distances are described later).
  • the areas A to G shown in FIG. 13 belong to the following blocks, respectively.
  • Block N: Region A; Block R: Regions B and E; Block C: Regions C and F; Block L: Regions D and G. Regions A to G also belong to the following fields.
  • Field 1: Region A; Field 2: Regions B and D; Field 3: Region C; Field 4: Regions E and G; Field 5: Region F. Furthermore, the distance from the indoor unit is defined as follows.
  • Table 7 shows the target setting angles at the positions of the five left blades and the five right blades constituting the left and right blades 14, and the symbols attached to the numbers (angles) are as shown in FIG.
  • the case where the left or right blade is directed inward is defined as the plus (+, no symbol in Table 7) direction, and the case where it is directed outward is defined as the minus (−) direction.
  • the “heating B area” in Table 7 is a heating area where obstacle avoidance control is performed, and “normal automatic wind direction control” is wind direction control where obstacle avoidance control is not performed.
  • the determination as to whether or not to perform the obstacle avoidance control is based on the temperature of the indoor heat exchanger 6.
  • when the temperature is low, wind direction control that does not direct wind at the occupant is performed; when it is too high, wind direction control at the maximum airflow position is performed; and when the temperature is moderate, wind direction control toward the heating B area is performed.
  • here, “low temperature”, “too high”, “wind direction control that does not direct wind at the occupant”, and “wind direction control at the maximum airflow position” have the following meanings.
  • Low temperature: with the optimum temperature of the indoor heat exchanger 6 set to skin temperature (33 to 34°C), a temperature that may be lower than this (for example, 32°C).
  • Too high a temperature: for example, 56°C or higher.
  • Wind direction control that does not direct wind at the occupant: wind direction control that makes the air flow along the ceiling by controlling the angle of the upper and lower blades 12 so that no wind is sent into the living space.
  • Wind direction control at the maximum airflow position: since resistance (loss) always arises when the air conditioner bends the airflow with the upper and lower blades 12 and the left and right blades 14, this is wind direction control at the position where the loss is close to zero (for the left and right blades 14, the position facing directly forward; for the upper and lower blades 12, the position facing 35 degrees downward from horizontal).
  • Table 8 shows target setting angles in each field of the upper and lower blades 12 when performing obstacle avoidance control.
  • the upper blade angle ( ⁇ 1) and the lower blade angle ( ⁇ 2) are angles (elevation angles) measured upward from the vertical line.
  • the swing operation is a swinging motion of the left and right blades 14: basically, the blades swing with a predetermined left-right angular width around one target position, with no fixing time at either end of the swing.
  • the position stop operation means that the target setting angle of a certain position (the angles in Table 7) is corrected as shown in Table 9 to give a left end and a right end.
  • the left end and the right end each have a wind direction fixing time (a time during which the left and right blades 14 are held fixed). For example, when the fixing time elapses at the left end, the blades move to the right end and the wind direction is maintained there until the fixing time elapses at the right end; the blades then move back to the left end, and this is repeated.
  • the wind direction fixing time is set to 60 seconds, for example.
  • the set angles of the left and right blades 14 corresponding to the left end and the right end of each block are determined based on, for example, Table 10.
  • this operation has a wind direction fixing time at the left and right ends of each block. For example, when the fixing time elapses at the left end, the blades move to the right end and the wind direction is maintained there until the fixing time elapses at the right end; the blades then move back to the left end, and this is repeated.
  • the wind direction fixing time is set to 60 seconds, for example, similarly to the position stop operation. Since the left end and the right end of each block coincide with the left end and the right end of the person position determination area belonging to the block, the block stop operation can be said to be a stop operation of the person position determination area.
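The alternation between the two ends with a wind direction fixing time, common to the position stop and block stop operations, can be sketched as a simple schedule; the end angles and cycle count here are illustrative assumptions:

```python
def stop_operation_schedule(left_end_deg, right_end_deg, fixing_time_s, cycles=2):
    """Position/block stop operation: hold the left and right blades 14 at
    the left end for the wind direction fixing time (e.g. 60 s), move to
    the right end, hold again, and repeat."""
    schedule = []
    for _ in range(cycles):
        schedule.append(("left", left_end_deg, fixing_time_s))
        schedule.append(("right", right_end_deg, fixing_time_s))
    return schedule

# Illustrative corrected end angles for one position
sched = stop_operation_schedule(-20, 20, 60)
```

The swing operation differs only in that the dwell (fixing) time at the two ends is zero, so the blades reverse immediately.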
  • position stop operation and block stop operation are properly used according to the size of the obstacle.
  • when the obstacle in front is small, the position stop operation is performed around the position where the obstacle is, so as to avoid the obstacle and blow air past it; when the obstacle in front is large (for example, when an obstacle occupies the entire front of the area where the person is), air is blown over a wide range by performing the block stop operation.
  • the swing operation, the position stop operation, and the block stop operation are collectively referred to as the swing operation of the left and right blades 14.
  • a case where the human body detection means determines that a person is present in only a single area will be described.
  • when there is an obstacle in front of the person, airflow control is performed that controls the upper and lower blades 12 so that the airflow avoids the obstacle from above.
  • when the obstacle detection means determines that there is an obstacle in an obstacle position determination area belonging to the human position determination area in which the human body detection means has detected a person, one of the following two airflow controls is selected:
  • a first airflow control in which the left and right blades 14 are swung within at least one obstacle position determination area belonging to that human position determination area, with no fixing time for the left and right blades 14 at either end of the swing range; and
  • a second airflow control in which the left and right blades 14 are swung within at least one obstacle position determination area belonging to that human position determination area or to an adjacent human position determination area, with fixing times for the left and right blades 14 provided at both ends of the swing range.
  • both the left blades and the right blades continue to swing through an angular range of ±10 degrees without stopping at the center.
  • the timing with which the left and right blades swing left and right is set to be the same, so the swinging motions of the left and right blades are linked.
  • the first airflow control is performed by swinging the target setting angles of two positions without obstacles at both ends to basically air-condition a position without obstacles.
  • block N is operated in the block stop mode and the second airflow control is performed. This is because the block stop operation is more directional than covering the entire area and can reach farther, with a high possibility of avoiding the obstacles. That is, even when obstacles are scattered in area A, there are usually gaps between them, and air can be blown through the gaps between the obstacles.
  • the first airflow control is performed by swinging left and right. For example, when there is a person in the region D and there is an obstacle only at the position D2, the swing operation is performed to the left and right around the target setting angle of the position D1.
  • the block including the area where the person is present is operated to stop the block and the second air flow control is performed.
  • the block L is operated while being stopped.
  • the first airflow control is performed by swinging around the target setting angle of an obstacle-free position in the medium-distance region. For example, if there is a person in area E and there is an obstacle at position B2 with none on either side of it but obstacles behind it, it is advantageous to send the airflow from the obstacle-free position B1.
  • the first airflow control is performed by swinging around the target setting angle of the obstacle-free position. For example, if there is a person in area F, there is an obstacle at position C2, there is an obstacle at position D1 on one side of position C2, and there is none at C1, airflow can be sent to area F from the obstacle-free position C1 while avoiding the obstacle at position C2.
  • the block including the area where the person is present is operated in block stop to perform the second air flow control.
  • the block C is operated in a block stop state. In this case, since there is an obstacle ahead of the person and there is no way to avoid the obstacle, the block stop operation is performed regardless of whether there is an obstacle in the block adjacent to the block C.
  • the first airflow control is performed by swinging around the target setting angle of the other position where there is no obstacle. For example, if there is a person in the area F, there are no obstacles in the positions C1, C2, and F1, and there is an obstacle in the position F2, the front of the area F in which the person is present is open. Considering this, air conditioning is performed around the far-off position F1 without an obstacle.
  • <Human wall proximity control> When a person and a wall are present in the same area, the person is necessarily located in front of the wall and close to it. During heating, warm air tends to stay near the wall, and the room temperature there tends to be higher than elsewhere, so human wall proximity control is performed.
  • the parallax is calculated in a pixel different from the pixel [i, j] shown in Table 4, the distance is detected, and the positions of the front wall and the left and right walls are first recognized.
  • the parallax of the pixel corresponding to the front in the substantially horizontal direction is calculated, and the distance to the front wall is measured to obtain the distance number. Further, the parallax of the pixel corresponding to the left side in the substantially horizontal direction is calculated, the distance to the left wall is measured, the distance number is obtained, and the distance number of the right wall is obtained similarly.
  • FIG. 29 is a top view of a room in which the indoor unit is installed, showing a case where a front wall WC, a left wall WL, and a right wall WR exist on the front, left, and right sides as viewed from the indoor unit.
  • the numbers on the left side of FIG. 29 indicate the distance numbers of the corresponding cells, and Table 12 indicates the distances from the indoor unit to the near and far points corresponding to the distance numbers.
  • the “obstacles” assumed in this specification — furniture such as tables and sofas, televisions, audio equipment, and the like — are not detected in the angle range of 75 degrees or more, so whatever is detected there can be presumed to be a wall. In the present embodiment, therefore, the distances to the front, left end, and right end of the indoor unit are detected at elevation angles of 75 degrees or more, and a wall is assumed to exist on the extension including each detected position.
  • the left wall WL is at the positions of −80 degrees and −75 degrees
  • the front wall WC is at the positions of −15 degrees to 15 degrees
  • the right wall WR is at the angles of 75 degrees and 80 degrees. Accordingly, among the pixels shown in Table 3, the pixels corresponding to these horizontal viewing angles at an elevation angle of 75 degrees or more are as follows.
  • the upper limit value and the lower limit value of each wall's surface data are deleted to eliminate outlying wall surface data.
  • from the remaining data, the distance numbers to the front wall WC, the left wall WL, and the right wall WR are determined.
  • the maximum values in Table 14 (WC: 5, WL: 6, WR: 3) can be adopted.
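The selection of a wall's distance number can be sketched as follows: the upper and lower limit values of each wall's data are deleted, and the maximum of the remainder is adopted (as with the Table 14 values above); trimming exactly one value at each end is an assumption for illustration:

```python
def wall_distance_number(samples):
    """Estimate a wall's distance number from the per-pixel numbers
    measured at elevation angles of 75 degrees or more: drop the single
    largest and smallest values as outliers, then take the maximum of
    the rest (the farthest consistent reading is taken as the wall)."""
    trimmed = sorted(samples)[1:-1]
    return max(trimmed)

# Illustrative front-wall readings across several pixels
wc = wall_distance_number([4, 5, 5, 5, 6, 3])
```

Taking the maximum after trimming favors the farthest plausible surface, which suits walls (the farthest thing in any direction) while discarding single spurious readings.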
  • in a large room, a wider space should be set as the target of air-conditioning control.
  • accordingly, the temperature setting is shifted lower than the set temperature specified by the remote control.
  • specifically, the set temperature is lowered by a first predetermined temperature (for example, 2°C).
  • B. When a person is in the long-distance area: since the long-distance area is far from the indoor unit and has a large area, the degree of increase in room temperature is lower than in the short-distance or medium-distance areas.
  • therefore, the set temperature is lowered by a second predetermined temperature (for example, 1°C) smaller than the first predetermined temperature.
  • moreover, because the long-distance area is large, even if it is detected that a person and a wall are in the same human position determination area, the person and the wall may actually be far apart; human wall proximity control is therefore performed, and the temperature shift applied, according to the positional relationship between the person and the wall.
  • by making the detection order at the imaging sensor unit different between wall detection and obstacle detection, walls and obstacles can be detected efficiently and accurately.
  • the distance is measured in the following order when a wall is detected.
  • the stereo method is used as the distance detection means, but a method using the light projecting unit 28 and the image sensor unit 24 may be used instead of the stereo method. This method will be described.
  • the main body 2 of the present embodiment includes an image sensor unit 24 and a light projecting unit 28.
  • the light projecting unit 28 includes a light source and a scanning unit (not shown), and the light source may use an LED or a laser. Further, the scanning unit can change the light projecting direction arbitrarily using a galvanometer mirror or the like.
  • FIG. 31 is a schematic diagram showing the relationship between the image sensor unit 24 and the light projecting unit 28. Strictly, the light projecting direction has two degrees of freedom and the imaging surface is a two-dimensional (vertical and horizontal) plane, but the relationship is illustrated here in simplified form.
  • the light projecting unit 28 projects light in the light projecting direction ⁇ with respect to the optical axis direction of the imaging sensor unit 24.
  • the image sensor unit 24 detects the reflection point P of the light projected by the light projecting unit 28 by taking the difference between the frame image immediately before the light projecting unit 28 projects light and the frame image during projection,
  • and acquires the u coordinate u1 of the reflection point on the image.
  • distance information in the air-conditioned space can be obtained by detecting the reflection point P of the light while changing the light projecting direction ⁇ of the light projecting unit 28.
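A sketch of the distance computation by light projection, under an assumed geometry (projector offset by a baseline B from the camera along the u axis, pinhole camera with focal length f in pixels); the specification gives only the outline, so the symbols and values here are illustrative assumptions:

```python
import math

def triangulate_depth(baseline_m, f_px, phi_deg, u1_px):
    """Active triangulation: the projector emits a ray at angle phi to
    the camera's optical axis, and its reflection P appears at image
    coordinate u1. With alpha the camera viewing angle of P, so that
    tan(alpha) = u1 / f, the depth along the optical axis is
    Z = B / (tan(alpha) - tan(phi))."""
    tan_alpha = u1_px / f_px
    tan_phi = math.tan(math.radians(phi_deg))
    return baseline_m / (tan_alpha - tan_phi)

# Illustrative: B = 0.1 m, f = 500 px, phi = 0, u1 = 25 px
Z = triangulate_depth(0.1, 500.0, 0.0, 25.0)
```

Sweeping φ with the scanning unit and repeating this computation at each reflection point yields the distance map of the air-conditioned space.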
  • i and j indicate the addresses to be scanned by the light projecting unit 28; the vertical angle and the horizontal angle are, respectively, the elevation angle θ and the angle φ measured rightward from the front reference line as viewed from the indoor unit.
  • that is, as viewed from the indoor unit, the addresses are set in the range of 5 to 80 degrees in the vertical direction and −80 to 80 degrees in the horizontal direction, and the light projecting unit 28 scans the living space by measuring each address.
  • in step S48, when it is determined that there is no person in the area (one of areas A to G shown in FIG. 13) corresponding to the address [i, j] at which the light projecting unit 28 projects light, the process proceeds to step S49; when it is determined that there is a person, the process proceeds to step S43. That is, since a person is not an obstacle, for addresses corresponding to an area determined to contain a person, the previous distance data is used without performing distance measurement (the distance data is not updated); distance measurement is performed only for addresses corresponding to areas determined to contain no person, and the newly measured distance data is used (the distance data is updated).
  • in step S49, the distance to the obstacle is estimated by performing the light projection process described above and acquiring the reflection point from the image sensor unit 24.
  • the processing may also be carried out using distance numbers, as in the distance number determination process described above.
  • the human body detection means may also be used as distance detection means. This comprises human body distance detection means using the human body detection means and obstacle detection means using the human body detection means. These processes will be described below.
  • FIG. 34 is a flowchart showing the flow of processing of the human body distance detecting means using the human body detecting means.
  • the same steps as those in FIG. 5 are denoted by the same reference numerals, and detailed description thereof is omitted here.
  • in step S201, the human body distance detection means detects, among the pixels where a difference has occurred in each area into which the human body detection means divided the image, the pixel at the top of the image, and acquires its v coordinate as v1.
  • the human body distance detection means then estimates the distance from the image sensor unit to the person using v1, the v coordinate at the top of the image.
  • FIG. 35 is a schematic diagram for explaining this process.
  • FIG. 35A is a schematic diagram of a scene in which two persons 121 and 122 stand near to and far from the camera,
  • and FIG. 35B shows the difference image of the images captured by the image sensor unit in the scene of FIG. 35A.
  • the areas 123 and 124 where the difference occurs correspond to the persons 121 and 122, respectively.
  • it is assumed that the height h1 of a person is known and that the heights of all persons in the air-conditioned space are substantially equal.
  • since the image sensor unit 24 is installed at a height of 2 m, it captures images looking down on the person from above, as shown in FIG. 35A. The closer the person is to the image sensor unit, the lower in the image the person appears, as shown in FIG. 35B. That is, the v coordinate v1 at the top of the person's image corresponds one-to-one with the distance from the image sensor unit to the person. Human body distance detection using the human body detection means can therefore be performed by obtaining in advance the correspondence between the topmost v coordinate v1 and the distance from the imaging sensor unit to the person.
  • Table 17 shows an example in which the average height of a person is used as h1, and the correspondence between the v-coordinate v1 at the top of the image and the distance from the imaging sensor unit to the person is obtained in advance.
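The correspondence of Table 17 can be sketched under a pinhole model, with v1 measured downward from the image center; the focal length and the average height h1 = 1.6 m are assumptions for illustration, not values from the specification:

```python
def distance_from_top_row(v1_px, f_px=500.0, sensor_h=2.0, person_h=1.6):
    """Estimate the horizontal distance from the sensor to a person from
    the v coordinate of the topmost differing pixel (step S202). The
    sensor looks down from 2 m, so the top of a person of known height
    h1 appears lower in the image (larger v1) the closer the person is:
    d = f * (sensor_h - person_h) / v1."""
    if v1_px <= 0:
        return float("inf")   # head top at or above the horizon line
    return f_px * (sensor_h - person_h) / v1_px

near = distance_from_top_row(100.0)   # larger v1 -> closer person
far  = distance_from_top_row(50.0)
```

In practice the specification precomputes this mapping as a lookup table (Table 17) rather than evaluating the formula at run time.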
  • FIG. 36 is a flowchart showing the flow of processing of obstacle detection means using human body detection means.
  • In step S203, the obstacle detection means estimates the person's height v2 on the image using the distance from the image sensor unit 24 to the person estimated by the human body distance detection means.
  • FIG. 37 is a schematic diagram for explaining this processing, showing a scene similar to FIG. 35.
  • As described above, the height h1 of a person is known, and the heights of all persons in the air-conditioned space are assumed to be substantially equal.
  • Since the image sensor unit 24 is installed at a height of 2 m, as shown in FIG. 34A, it captures images while looking down on the person from above. At this time, the closer the person is to the image sensor unit 24, the larger the person appears on the image.
  • Accordingly, the difference v2 between the v coordinate of the uppermost pixel and that of the lowermost pixel of the person has a one-to-one correspondence with the distance from the image sensor unit 24 to the person. Therefore, when the distance from the image sensor unit to the person is known, the person's size on the image can be estimated by obtaining in advance the correspondence between v2 and that distance.
  • In step S204, for each region of the difference image, the obstacle detection means detects the uppermost and lowermost pixels at which a difference occurs.
  • The difference v3 between their v coordinates is calculated.
  • In step S205, the person's height v2 on the image, estimated from the distance information from the image sensor unit 24 to the person, is compared with the height v3 obtained from the actual difference image, to estimate whether an obstacle exists between the image sensor unit 24 and the person.
  • FIGS. 38 and 39 are schematic diagrams for explaining this process.
  • FIG. 38 shows a scene similar to FIG. 35, and is a schematic diagram showing a scene where no obstacle exists between the image sensor unit 24 and a person.
  • FIG. 39 is a schematic diagram showing a scene where an obstacle exists.
  • When it is determined in step S205 that v3 is sufficiently smaller than v2, the process proceeds to step S206, and it is determined that an obstacle exists between the image sensor unit and the person. In this case, the distance from the image sensor unit to the obstacle is assumed to be equal to the distance from the image sensor unit to the person obtained from the uppermost v coordinate v1.
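The decision in steps S205–S206 can be sketched as a single comparison. The patent says only that v3 must be "sufficiently smaller" than v2 without giving a numeric criterion, so the 0.7 ratio below is an assumed threshold, not the patent's value.

```python
def obstacle_present(v2_expected: float, v3_observed: float,
                     ratio: float = 0.7) -> bool:
    """Sketch of steps S205-S206: declare an obstacle between the sensor and
    the person when the observed on-image height v3 is sufficiently smaller
    than the expected height v2. The ratio threshold is an assumption."""
    # A much shorter silhouette than expected means the person's lower body
    # is hidden, i.e. something stands between the sensor and the person.
    return v3_observed < ratio * v2_expected
```

When the function returns True, the obstacle's distance would then be taken as the person's distance derived from v1, as the text above states.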
  • In this manner, the distance detection means is realized by using the detection result of the human body detection means.
  • In the above description, subdivided human position determination areas and obstacle position determination areas are provided, and the suction temperature of the indoor unit is controlled according to the detected positional relationship between the wall and the person.
  • However, techniques for detecting the positions of persons and walls in the area to be air-conditioned without using subdivided human position determination areas and obstacle position determination areas are also well known. From the detected positions of the person and the wall, it may be determined whether the distance between them is less than (or not more than) a predetermined value, and if an affirmative determination is obtained, the suction temperature of the indoor unit may be controlled accordingly.
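The simplified variant described above reduces to a distance threshold test between the detected person and wall positions. The sketch below assumes 2-D floor coordinates and a 1 m threshold purely for illustration; the patent does not specify a coordinate system or a numeric threshold.

```python
def within_wall_threshold(person_xy: tuple, wall_xy: tuple,
                          threshold_m: float = 1.0) -> bool:
    """Return True when the detected person is within threshold_m of the
    detected wall point, the condition under which the indoor unit's
    suction temperature would be adjusted. All values are assumptions."""
    dx = person_xy[0] - wall_xy[0]
    dy = person_xy[1] - wall_xy[1]
    return (dx * dx + dy * dy) ** 0.5 <= threshold_m
```

This shows why no subdivided determination areas are needed: a single Euclidean-distance comparison replaces the per-area lookup.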
  • As described above, the air conditioner according to the present invention has the effect of enabling energy-saving operation while realizing a comfortable air-conditioned space, and is useful for various air conditioners including general household air conditioners.
  • 2 indoor unit body, 2a front opening, 2b top opening, 4 movable front panel, 6 heat exchanger, 8 indoor fan, 10 air outlet, 12 upper and lower blades, 14 left and right blades, 16 filter, 18, 20 front panel arms, 24, 26 image sensor unit, 28 projection unit.

Abstract

The present invention relates to an air conditioner provided with an image capture device for detecting whether or not an obstacle is present. The image capture device determines, independently of the walls present around the area to be air-conditioned, whether or not an obstacle is present in each of the separate obstacle position determination areas.
PCT/JP2010/005883 2009-10-06 2010-09-30 Climatiseur WO2011043039A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009-232360 2009-10-06
JP2009232360A JP5487869B2 (ja) 2009-10-06 2009-10-06 空気調和機

Publications (1)

Publication Number Publication Date
WO2011043039A1 true WO2011043039A1 (fr) 2011-04-14

Family

ID=43856523

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/005883 WO2011043039A1 (fr) 2009-10-06 2010-09-30 Climatiseur

Country Status (2)

Country Link
JP (1) JP5487869B2 (fr)
WO (1) WO2011043039A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5697583B2 (ja) * 2011-11-21 2015-04-08 三菱電機株式会社 部屋形状認識方法および装置、ならびにこれを用いた空気調和機
CN112665160B (zh) * 2020-12-21 2022-01-28 珠海格力电器股份有限公司 空调器的控制方法和空调器

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0372249U (fr) * 1989-11-16 1991-07-22
JP2008224099A (ja) * 2007-03-12 2008-09-25 Mitsubishi Electric Corp 空気調和装置
JP2008304083A (ja) * 2007-06-05 2008-12-18 Nikon Corp 空調装置
JP2009092281A (ja) * 2007-10-05 2009-04-30 Mitsubishi Electric Building Techno Service Co Ltd 空調制御システム
JP2009139010A (ja) * 2007-12-06 2009-06-25 Sharp Corp 空気調和機
JP2009186136A (ja) * 2008-02-08 2009-08-20 Panasonic Corp 空気調和機


Also Published As

Publication number Publication date
JP2011080663A (ja) 2011-04-21
JP5487869B2 (ja) 2014-05-14

Similar Documents

Publication Publication Date Title
JP5402488B2 (ja) 空気調和機
JP5454065B2 (ja) 空気調和機
WO2011043054A1 (fr) Climatiseur
JP5402487B2 (ja) 空気調和機
JP2011080621A (ja) 空気調和機
JP6335425B2 (ja) 空気調和機
JP5815490B2 (ja) 空気調和機
JP5697583B2 (ja) 部屋形状認識方法および装置、ならびにこれを用いた空気調和機
JP2013024534A (ja) 状況認識装置
KR20140031081A (ko) 공기 조화기
JP2012042074A (ja) 空気調和機
JP5488297B2 (ja) 空気調和機
JP2010169373A (ja) 空気調和機
JP5487867B2 (ja) 空気調和機
JP2015052431A (ja) 空気調和機の室内機および空気調和機
JP2017053603A (ja) 空気調和機
JP2015190666A (ja) 空気調和機の室内機及びこれを用いた空気調和機
JP6097183B2 (ja) 空気調和機
JP2012037102A (ja) 人物識別装置、人物識別方法及び人物識別装置を備えた空気調和機
JP5487869B2 (ja) 空気調和機
JP2011080685A (ja) 空気調和機
JP2016044863A (ja) 空気調和機
JP6692134B2 (ja) 空気調和機
JP6552164B2 (ja) 空気調和機
JP2014081144A (ja) 空気調和機

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10821716

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10821716

Country of ref document: EP

Kind code of ref document: A1