WO2011043056A1 - Air conditioner - Google Patents
Air conditioner
- Publication number
- WO2011043056A1 (PCT/JP2010/005948, JP2010005948W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- obstacle
- person
- area
- distance
- air conditioner
- Prior art date
Classifications
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F24—HEATING; RANGES; VENTILATING
- F24F—AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
- F24F11/00—Control or safety arrangements
- F24F11/30—Control or safety arrangements for purposes related to the operation of the system, e.g. for safety or monitoring
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F24—HEATING; RANGES; VENTILATING
- F24F—AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
- F24F1/00—Room units for air-conditioning, e.g. separate or self-contained units or units receiving primary air from a central station
- F24F1/0007—Indoor units, e.g. fan coil units
- F24F1/0011—Indoor units, e.g. fan coil units characterised by air outlets
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F24—HEATING; RANGES; VENTILATING
- F24F—AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
- F24F1/00—Room units for air-conditioning, e.g. separate or self-contained units or units receiving primary air from a central station
- F24F1/06—Separate outdoor units, e.g. outdoor unit to be linked to a separate room comprising a compressor and a heat exchanger
- F24F1/56—Casing or covers of separate outdoor units, e.g. fan guards
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F24—HEATING; RANGES; VENTILATING
- F24F—AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
- F24F11/00—Control or safety arrangements
- F24F11/70—Control systems characterised by their outputs; Constructional details thereof
- F24F11/72—Control systems characterised by their outputs; Constructional details thereof for controlling the supply of treated air, e.g. its pressure
- F24F11/79—Control systems characterised by their outputs; Constructional details thereof for controlling the supply of treated air, e.g. its pressure for controlling the direction of the supplied air
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F24—HEATING; RANGES; VENTILATING
- F24F—AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
- F24F2120/00—Control inputs relating to users or occupants
- F24F2120/10—Occupancy
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F24—HEATING; RANGES; VENTILATING
- F24F—AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
- F24F2120/00—Control inputs relating to users or occupants
- F24F2120/10—Occupancy
- F24F2120/12—Position of occupants
Definitions
- The present invention relates to an air conditioner whose indoor unit is provided with a human body detection device that detects the presence or absence of a person and an obstacle detection device that detects the presence or absence of an obstacle, and which controls a wind direction changing blade based on the detection results of the human body detection device and the obstacle detection device.
- A conventional air conditioner provides human position detecting means and obstacle position detecting means in the indoor unit, and improves comfort by controlling the wind direction changing means based on the detection signals of both.
- In this air conditioner, when heating operation starts, the person position detecting means first determines whether there is a person in the room; if there is no person, the obstacle position detecting means determines whether there is an obstacle, and if there is no obstacle, the wind direction changing means is controlled so that the conditioned air spreads throughout the room.
- If there is an obstacle, the wind direction changing means is controlled toward a direction in which there is no obstacle.
- If a person is present, the wind direction changing means is controlled so that the conditioned air does not blow directly on the obstacle and still spreads throughout the room.
- Alternatively, it is determined whether there is an obstacle in an area where no person is present; if there is an obstacle, the wind direction control means is directed toward the obstacle so that the conditioned air does not strike it strongly, and if there is no obstacle, the wind direction control means is directed toward a direction where there is no obstacle (see, for example, Patent Document 1).
- In this conventional configuration, an ultrasonic sensor is used as a distance measuring device to detect the distance to a person or an obstacle, and the sensor is driven so that obstacle detection scanning covers the entire indoor area. However, distance measuring devices such as ultrasonic sensors have a narrow measurement range, and without complicated, time-consuming scanning they cannot detect and grasp people or obstacles over the entire area of the room.
- Moreover, a distance measuring device such as an ultrasonic sensor is affected by dust, cigarette smoke, and the like, leading to reduced recognition performance.
- With a movable distance measuring device such as an ultrasonic sensor, if the detection operation is stopped after scanning the entire area of the room, the device may come to rest facing a corner of the room, which makes the resident uncomfortable.
- Likewise, if the direction the distance measuring device faces differs each time its scanning is completed, the resident is similarly uncomfortable.
- The present invention has been made in view of these problems of the prior art. Its object is to provide an air conditioner that detects the presence or absence of a person (human body detection means) and of an obstacle (obstacle detection means) using a fixed or driven imaging device provided in the indoor unit, keeps the imaging device unexposed while the air conditioner is stopped, and, in the case of a driven imaging device, sets the device to always face the same direction when operation starts, thereby suppressing deterioration of the imaging device's recognition performance and giving the resident a sense of security.
- To this end, the present invention detects the presence or absence of a person (human body detection means) and of an obstacle (obstacle detection means) using a fixed or driven imaging device provided in the indoor unit, and controls the wind direction changing blade provided in the indoor unit based on the detection results of the human body detection means and the obstacle detection means.
- While the operation of the air conditioner is stopped, the imaging device is covered with a part of the indoor unit, and in the case of a driven imaging device, the imaging device is set to face the same direction when operation starts.
- With this configuration, the imaging device is not exposed while the operation of the air conditioner is stopped, which suppresses a reduction in its recognition performance, and setting the imaging device to face the same direction at the start of operation gives the resident a sense of security. Furthermore, since the imaging device is not exposed while operation is stopped, the privacy concern that the room might be constantly photographed is eliminated.
- FIG. 1 is a front view of an indoor unit of an air conditioner according to the present invention.
- FIG. 2 is a longitudinal sectional view of the indoor unit of FIG. 1.
- FIG. 3A is a longitudinal sectional view of the indoor unit of FIG. 1 with the movable front panel opening the front opening and the upper and lower blades opening the outlet.
- FIG. 3B is a longitudinal sectional view of the indoor unit in FIG. 1 in a state where the lower blades constituting the upper and lower blades are set downward.
- FIG. 4 is a cross-sectional view of the imaging device provided in the indoor unit of FIG.
- FIG. 5 is a flowchart showing the flow of the human position estimation process in this embodiment.
- FIG. 6 is a schematic diagram for explaining background difference processing in human position estimation in the present embodiment.
- FIG. 7 is a schematic diagram for explaining processing for creating a background image in background difference processing.
- FIG. 8 is a schematic diagram for explaining processing for creating a background image in background difference processing.
- FIG. 9 is a schematic diagram for explaining a process of creating a background image in the background difference process
- FIG. 10 is a schematic diagram for explaining region division processing in human position estimation in the present embodiment.
- FIG. 11 is a schematic diagram for explaining two coordinate systems used in this embodiment.
- FIG. 12 is a schematic diagram showing the distance from the image sensor unit to the position of the center of gravity of the person.
- FIG. 13 is a schematic diagram showing a human position determination area detected by the image sensor unit constituting the human body detection means.
- FIG. 14 is a schematic diagram in the case where a person is present in the human position determination area detected by the imaging sensor unit constituting the human body detection means.
- FIG. 15 is a flowchart for setting region characteristics in each of the regions A to G.
- FIG. 16 is a flowchart for finally determining the presence or absence of a person in each of the regions A to G.
- FIG. 17 is a schematic plan view of a residence where the indoor unit of FIG. 1 is installed.
- FIG. 18 is a graph showing the long-term cumulative results of the image sensor unit for each region in the residence of FIG. 17.
- FIG. 19 is a schematic plan view of another residence in which the indoor unit of FIG. 1 is installed.
- FIG. 20 is a graph showing the long-term cumulative results of the image sensor unit for each region in the residence of FIG. 19.
- FIG. 21 is a flowchart showing the flow of a human position estimation process using a process of extracting a person-like area from a frame image.
- FIG. 22 is a flowchart showing the flow of human position estimation processing using processing for extracting a face-like region from a frame image.
- FIG. 23 is a schematic diagram showing an obstacle position determination area detected by the obstacle detection means.
- FIG. 24 is a schematic diagram for explaining obstacle detection by the stereo method.
- FIG. 25 is a flowchart showing the flow of processing for measuring the distance to the obstacle.
- FIG. 26 is a schematic diagram showing the distance from the image sensor unit to the position P.
- FIG. 27 is an elevation view of a living space, and is a schematic diagram showing the measurement results of the obstacle detection means.
- FIG. 28 is a schematic diagram showing the definition of the wind direction at each position of the left and right blades constituting the left and right blades.
- FIG. 29 is a schematic plan view of a room for explaining a wall detection algorithm for determining a distance number by measuring a distance from an indoor unit to a surrounding wall surface.
- FIG. 30 is a front view of another indoor unit of an air conditioner according to the present invention.
- FIG. 31 is a schematic diagram showing the relationship between the image sensor unit and the light projecting unit.
- FIG. 32 is a flowchart showing the flow of processing for measuring the distance to an obstacle using the light projecting unit and the image sensor unit.
- FIG. 33 is a front view of another indoor unit of an air conditioner according to the present invention.
- FIG. 34 is a flowchart showing the flow of processing of the human body distance detecting means using the human body detecting means.
- FIG. 35 is a schematic diagram for explaining processing for estimating the distance from the image sensor unit to a person using v1 which is the v coordinate at the top of the image.
- FIG. 36 is a flowchart showing the flow of processing of obstacle detection means using human body detection means.
- FIG. 37 is a schematic diagram for explaining the process of estimating the height v2 of the person on the image using the distance information from the imaging sensor unit to the person estimated by the human body distance detection means.
- FIG. 38 is a schematic diagram for explaining processing for estimating whether an obstacle exists between the image sensor unit and a person.
- FIG. 39 is a schematic diagram for explaining processing for estimating whether an obstacle exists between the image sensor unit and a person.
- The first invention detects the presence or absence of a person (human body detection means) using an imaging device that is fixed or driven in the indoor unit, and further detects the presence or absence of an obstacle (obstacle detection means).
- It is an air conditioner that controls the wind direction changing blade provided in the indoor unit based on the detection result of the human body detection means and the detection result of the obstacle detection means, and when the operation of the air conditioner is stopped, the imaging device is covered with a part of the indoor unit.
- This configuration can suppress a decrease in the recognition performance of the imaging device and give residents a sense of security. Furthermore, since the imaging device is not exposed while operation is stopped, the privacy concern that the room might be constantly photographed is eliminated.
- In the case of a driven imaging device, the imaging device is set to face the same direction when the operation of the air conditioner is started.
- In the case of a driven imaging device, the imaging device may instead be set to face the front of the indoor unit when the operation of the air conditioner is started.
- the optical axis of the imaging device is set to be substantially perpendicular to the installation surface when viewed from above the indoor unit.
- In the case of a driven imaging apparatus, the apparatus is configured so that its orientation can be freely changed within a predetermined angle range in the vertical and horizontal directions, and the orientation of the imaging apparatus is set when the operation of the air conditioner starts.
- The structure for moving the imaging device may allow movement only in the horizontal direction.
- Needless to say, the moving structure may instead allow movement only in the vertical direction, and if the viewing angle of the imaging device is sufficient in both the vertical and horizontal directions, the imaging device may be fixedly installed.
- When the air conditioner is stopped, the imaging device is covered with the movable front panel or the up/down airflow direction changing blade, so that effects of dust, cigarette smoke, and the like on the imaging device can be suppressed. Furthermore, since the imaging device is not exposed while operation is stopped, the privacy concern that the room might be constantly photographed is eliminated.
- FIGS. 1 to 3B show the indoor unit of the air conditioner according to the present invention.
- the indoor unit has a main body 2 and a movable front panel (hereinafter simply referred to as “front panel”) 4 that can freely open and close the front opening 2a of the main body 2.
- FIGS. 3A and 3B show a state where the front panel 4 opens the front opening 2a.
- Inside the main body 2 are provided a heat exchanger 6 that exchanges heat with the indoor air taken in from the front opening 2a and the top opening 2b, an indoor fan 8 that blows the heat-exchanged air into the room, upper and lower blades 12 that open and close the air outlet 10 and change the blowing direction up and down, left and right blades 14 that change the blowing direction left and right, and a filter 16.
- The upper part of the front panel 4 is connected to the upper part of the main body 2 via two arms 18 and 20 provided at both ends, and a drive motor (not shown) connected to the arm 18 is driven and controlled to open and close the front panel 4.
- the upper and lower blades 12 are composed of an upper blade 12a and a lower blade 12b, and are respectively swingably attached to the lower portion of the main body 2.
- The upper blade 12a and the lower blade 12b are connected to separate drive sources (for example, stepping motors) and are independently controlled by a control device (for example, a microcomputer on the first board 48, described later) built in the indoor unit.
- The upper and lower blades 12 can be composed of three or more blades; in this case, it is preferred that at least two of them (particularly the uppermost and lowermost blades) can be independently angle-controlled.
- The left and right blades 14 are composed of a total of ten blades, five arranged on each side of the center of the indoor unit, and are respectively swingably attached to the lower part of the main body 2.
- Each group of five blades is connected to a separate drive source (for example, a stepping motor) as a unit, and the left and right groups are independently angle-controlled by the control device built in the indoor unit.
- a method for driving the left and right blades 14 will also be described later.
- Imaging sensor units 24 are incorporated in an imaging device 25 at both left and right ends, or at a lower portion on one side, as viewed from the front of the main body. The imaging device 25 will be explained with reference to FIG. 4.
- the imaging sensor unit 24 includes a circuit board 51, a lens 52 attached to the circuit board, and an imaging sensor 53 mounted inside the lens.
- In the human body detection means, the presence or absence of a person is determined by the circuit board 51 based on the difference processing described later; that is, the circuit board 51 acts as presence/absence determination means.
- The imaging sensor 53 is rotatably supported by a spherical support (sensor holder) 54 and driven by an imaging direction changing means (driving means) that changes its orientation so that the entire required field of view can be scanned.
- the support 54 has a horizontal (horizontal) rotating shaft 55 and a vertical (vertical) rotating shaft 56 extending in a direction orthogonal to the horizontal rotating shaft 55.
- The horizontal rotating shaft 55 is connected to and driven by the horizontal rotation motor 57, and the vertical rotating shaft 56 is connected to and driven by the vertical rotation motor 58.
- The imaging direction changing means includes the horizontal rotation motor 57, the vertical rotation motor 58, and the like; it can change the direction angle of the imaging sensor 53 in two dimensions and can recognize the direction angle that the imaging sensor 53 faces.
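As an illustration only, driving the sensor through a two-dimensional grid of direction angles so that the whole field of view is covered might look like the following sketch. The angle ranges and step size are hypothetical values, not taken from the patent:

```python
import itertools

def scan_angles(pan_range=(-60, 60), tilt_range=(-30, 10), step=20):
    """Enumerate two-dimensional direction angles (degrees) through which
    the imaging sensor could be driven so the required field of view is
    scanned; the controller also knows which angle is currently faced.
    All numeric ranges here are assumptions for illustration."""
    pans = range(pan_range[0], pan_range[1] + 1, step)
    tilts = range(tilt_range[0], tilt_range[1] + 1, step)
    return list(itertools.product(pans, tilts))

angles = scan_angles()  # grid of (pan, tilt) pairs covering the field of view
```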
- As the human body detection means, a difference method, which is a known technique, is used: a difference process is performed between a background image, i.e. an image in which no person is present, and an image captured by the image sensor unit 24, and a person is estimated to be present in a region where a difference occurs.
- FIG. 5 is a flowchart showing the flow of human position estimation processing in the present embodiment.
- a background difference process is used to detect pixels that have a difference in the frame image.
- Background difference processing compares a background image captured under specific conditions with an image captured under the same imaging conditions (the field of view, viewpoint, focal length, and so on, of the image sensor unit), and detects objects that do not exist in the background image but exist in the captured image. To detect a person, an image without a person is created as the background image.
- FIG. 6 is a schematic diagram for explaining the background difference processing.
- FIG. 6A shows a background image.
- the visual field is set to be substantially equal to the air-conditioned space of the air conditioner.
- 101 indicates a window existing in the air-conditioned space
- 102 indicates a door.
- FIG. 6B shows a frame image captured by the image sensor unit.
- the field of view, viewpoint, focal length, and the like of the image sensor unit are the same as the background image of FIG.
- Reference numeral 103 denotes a person existing in the air-conditioned space.
- FIG. 6C shows the difference image between FIGS. 6A and 6B.
- White pixels indicate pixels where no difference exists, and black pixels indicate pixels where a difference occurs. It can be seen that the area of the person 103 that is not present in the background image but is present in the captured frame image is detected as the area 104 where the difference occurs. That is, it is possible to detect a person area by extracting an area where a difference is generated from the difference image.
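As an illustration of the difference process just described, a minimal NumPy sketch follows. This is not the patent's implementation; the difference threshold is an assumed parameter:

```python
import numpy as np

def background_difference(background, frame, threshold=30):
    """Mark pixels where the current frame differs from the background
    image (captured with the same field of view, viewpoint and focal
    length). Pixels whose absolute difference exceeds the threshold are
    set to 1, corresponding to the black pixels of the difference image."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

# A person-shaped bright patch absent from the background is detected:
bg = np.zeros((8, 8), dtype=np.uint8)        # background image (no person)
frame = bg.copy()
frame[2:6, 3:5] = 200                        # person region (cf. person 103)
mask = background_difference(bg, frame)      # 1 where a difference occurs
```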
- FIGS. 7A to 7C are schematic diagrams illustrating three consecutive frames captured by the image sensor unit in a scene in which the person 103 is moving from right to left in front of the window 101.
- FIG. 7B shows an image of the next frame of FIG. 7A
- FIG. 7C shows an image of the next frame of FIG. 7B.
- FIGS. 8A to 8C show inter-frame difference images obtained by performing inter-frame difference processing on the images of FIGS. 7A to 7C.
- White pixels indicate pixels where no difference exists, and black pixels 105 indicate pixels where a difference occurs.
- FIGS. 9A to 9C are diagrams schematically showing the update of the background image in each frame of FIGS. 7A to 7C.
- a hatched area 106 indicates an area where the background image has been updated
- a black area 107 indicates an area where a background image has not yet been created
- A white area 108 indicates an area where the background image has not been updated. That is, the total of the black area 107 and the white area 108 in FIG. 9 equals the black area in the corresponding inter-frame difference image of FIG. 8.
- the black area 107 is gradually reduced and a background image is automatically created.
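The automatic background creation described above, in which motionless pixels are copied into the background so that the uncreated area 107 gradually shrinks, can be sketched as follows. The motion threshold is an assumed parameter:

```python
import numpy as np

def update_background(background, valid, prev_frame, frame, motion_thresh=10):
    """Copy into the background those pixels that show no inter-frame
    difference. 'valid' marks where a background has been created so far;
    where it is still False corresponds to the uncreated area 107."""
    motion = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16)) > motion_thresh
    still = ~motion
    background[still] = frame[still]   # motionless pixels become background
    valid |= still                     # uncreated area shrinks
    return background, valid

# As the person (bright column) moves, the uncovered pixels become
# motionless and join the background:
h, w = 4, 6
bg = np.zeros((h, w), dtype=np.uint8)
valid = np.zeros((h, w), dtype=bool)
f1 = np.zeros((h, w), dtype=np.uint8); f1[:, 4] = 200   # person at column 4
f2 = np.zeros((h, w), dtype=np.uint8); f2[:, 2] = 200   # person at column 2
bg, valid = update_background(bg, valid, f1, f2)
```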
- In step S102, the obtained difference area is divided into regions; if a plurality of persons are present, the difference pixels are divided into a plurality of difference regions.
- The difference image can be divided into regions according to the rule that a pixel in which a difference occurs and a neighbouring pixel in which a difference also occurs belong to the same region.
- FIG. 10 is a schematic diagram in which this area division processing is executed.
- FIG. 10A shows a difference image calculated by the difference process, and black pixels 111 and 112 are pixels in which a difference occurs.
- FIG. 10B shows the result of applying this region division to FIG. 10A.
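The neighbourhood rule above corresponds to connected-component labeling. A minimal sketch (8-neighbourhood, pure Python with NumPy, not the patent's implementation):

```python
import numpy as np
from collections import deque

def label_regions(diff):
    """Group difference pixels into regions under the rule that a
    difference pixel and any neighbouring difference pixel belong to the
    same region. Returns a label image and the number of regions."""
    labels = np.zeros_like(diff, dtype=int)
    current = 0
    h, w = diff.shape
    for sy in range(h):
        for sx in range(w):
            if diff[sy, sx] and labels[sy, sx] == 0:
                current += 1                       # start a new region
                queue = deque([(sy, sx)])
                labels[sy, sx] = current
                while queue:                       # flood-fill neighbours
                    y, x = queue.popleft()
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and diff[ny, nx] and labels[ny, nx] == 0):
                                labels[ny, nx] = current
                                queue.append((ny, nx))
    return labels, current

# Two separate difference blobs (cf. pixels 111 and 112) give two regions:
d = np.zeros((6, 6), dtype=np.uint8)
d[1:3, 1:3] = 1
d[4:6, 4:6] = 1
labels, n = label_regions(d)
```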
- In step S103, the position of the detected person is obtained by calculating the position of the center of gravity of each region.
- perspective projection conversion may be used.
- FIG. 11 is a schematic diagram for explaining two coordinate systems.
- The image coordinate system is a two-dimensional coordinate system in the captured image, in which the upper-left pixel of the image is the origin, u points rightward, and v points downward.
- The camera coordinate system is a three-dimensional coordinate system based on the camera: the focal position of the image sensor unit is the origin, the optical axis direction of the image sensor unit 24 is Zc, the camera upward direction is Yc, and the camera left direction is Xc.
- f is the focal length [mm]
- (u0, v0) is the image center [Pixel] on the image coordinates
- (dpx, dpy) is the size [mm / Pixel] of one pixel of the image sensor.
- In FIGS. 12A and 12B, the center of gravity position of the person on the image is (ug, vg), and its three-dimensional position in the camera coordinate system is (Xgc, Ygc, Zgc).
- FIG. 12A is a schematic view of the air-conditioned space viewed from the side
- FIG. 12B is a schematic view of the air-conditioned space viewed from above.
- H is the height at which the image sensor unit is installed, the Xc direction is horizontal, and the optical axis Zc is tilted at a fixed angle from the vertical direction.
- The direction in which the image sensor unit 24 faces is expressed by an elevation angle (measured upward from the vertical line) and a horizontal angle (measured rightward from the front reference line as viewed from the indoor unit). Further, if the height of the center of gravity of the person is h, the distance L from the image sensor unit to the center of gravity position and the direction W, which give the three-dimensional position in the air-conditioned space, can be calculated by the following equations.
- Equations 3 and 5 show that, once the installation height H and the height h of the person's center of gravity are given, the center of gravity position (L, W) of the person in the air-conditioned space is uniquely determined from the center of gravity position (ug, vg) on the screen.
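Because Equations 3 and 5 are not reproduced in this text, the following sketch is only one plausible geometric reading of the description above: the ray through the centre-of-gravity pixel is tilted by the camera's mounting angle and intersected with the horizontal plane at the assumed centre-of-gravity height h. All numeric parameter values and the name delta for the tilt are hypothetical:

```python
import math

def person_position(ug, vg, f=4.0, u0=160, v0=120, dpx=0.006, dpy=0.006,
                    H=2.0, delta=math.radians(60), h=0.8):
    """Map the on-image centre of gravity (ug, vg) to a floor-plane
    position: eps is the elevation of the ray measured upward from the
    vertical, theta the horizontal angle from the front reference line,
    and L the horizontal distance obtained from the installation height
    H and the assumed centre-of-gravity height h."""
    eps = delta + math.atan2((v0 - vg) * dpy, f)   # elevation of the ray
    theta = math.atan2((ug - u0) * dpx, f)         # horizontal angle
    L = (H - h) * math.tan(eps)                    # horizontal distance
    return L, theta

# A person imaged exactly at the image centre lies on the optical axis:
L, theta = person_position(160, 120)
```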
- FIGS. 13A and 13B show in which area in the air-conditioned space a person exists when the center of gravity position on the image exists in each of the areas A to G.
- FIGS. 14A and 14B are schematic diagrams of cases in which a person is present.
- FIG. 15 is a flowchart for setting region characteristics to be described later in each of the regions A to G using the image sensor unit.
- FIG. 16 is a flowchart for finally determining in which of the regions A to G a person is present using the image sensor unit, and the person position determination method will be described below with reference to these flowcharts.
- In step S1, the presence or absence of a person in each area is first determined by the above-described method at a predetermined cycle T1 (for example, 200 milliseconds if the frame rate of the image sensor unit 24 is 5 fps).
- Each of the areas A to G is classified into a first area in which people often stay, a second area through which people merely pass, so that the stay time is short, or a third area, i.e. a non-living area such as a wall or a window where people hardly go.
- The first, second, and third areas are referred to as life category I, life category II, and life category III, respectively; they can equally be called the regions of region characteristic I, region characteristic II, and region characteristic III.
- Life category I (region characteristic I) and life category II (region characteristic II) can also be combined into a living region (a region where people live), while life category III (region characteristic III) is treated as a non-living region.
- That is, the areas may be broadly classified according to the frequency of the presence or absence of a person.
- FIG. 17 shows a case where the indoor unit of the air conditioner according to the present invention is installed in the LD of a 1LDK residence composed of one Japanese-style room, an LD (living/dining room), and a kitchen; the elliptical areas in FIG. 17 indicate the frequently occupied places reported by the subject.
- The presence or absence of a person in each of the regions A to G is determined every period T1, and 1 (reaction) or 0 (no reaction) is output as the reaction result for that period; this is repeated a plurality of times, and in step S2 all sensor outputs are cleared.
- In step S3, it is determined whether the predetermined cumulative operation time of the air conditioner has elapsed. If it has not, the process returns to step S1; if it has, the reaction results accumulated over the predetermined time for each of the regions A to G are compared with two threshold values, and each region A to G is identified as one of the life categories I to III.
- Specifically, a first threshold value and a second threshold value smaller than the first are set. In step S4, it is determined whether the long-term accumulation result of each region A to G is greater than the first threshold value, and a region determined to be greater is identified as life category I in step S5. If the long-term accumulation result is less than the first threshold value, it is determined in step S6 whether it is greater than the second threshold value.
- A region determined to be greater is identified as life category II in step S7, while a region determined to be smaller is identified as life category III in step S8.
- the areas C, D, and G are determined as the life category I
- the areas B and F are determined as the life category II
- the areas A and E are determined as the life category III.
- FIG. 19 shows a case where the indoor unit of the air conditioner according to the present invention is installed in the LD of another 1LDK residence, and FIG. 20 shows the result of discriminating each region A to G based on the long-term accumulation results in this case.
- the areas B, C, and E are determined as the life category I
- the areas A and F are determined as the life category II
- the areas D and G are determined as the life category III.
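The two-threshold discrimination into life categories can be sketched as follows. The threshold values and counts below are illustrative assumptions; the patent does not give concrete numbers:

```python
def classify_regions(long_term_counts, first_threshold=1000, second_threshold=200):
    """Assign each region a life category from its long-term cumulative
    reaction count: above the first threshold -> I (frequently occupied),
    above the smaller second threshold -> II (passed through),
    otherwise -> III (non-living area)."""
    categories = {}
    for region, count in long_term_counts.items():
        if count > first_threshold:
            categories[region] = "I"
        elif count > second_threshold:
            categories[region] = "II"
        else:
            categories[region] = "III"
    return categories

# A FIG. 18-style outcome (made-up counts): C, D, G often occupied;
# B, F transit areas; A, E walls or windows.
counts = {"A": 50, "B": 400, "C": 2500, "D": 1800, "E": 20, "F": 350, "G": 1200}
cats = classify_regions(counts)
```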
- step S23 it is determined whether or not a predetermined number M (for example, 45 times) of reaction results in the cycle T1 has been obtained. If it is determined that the cycle T1 has not reached the predetermined number M, the process returns to step S21. If it is determined that the period T1 has reached the predetermined number M, in step S24, the total number of reaction results in the period T1 ⁇ M is used as the cumulative reaction period number, and the cumulative reaction period number for one time is calculated.
- In step S27, 1 is subtracted from the number (N) of cumulative reaction period calculations and the process returns to step S21, so that the calculation of the cumulative reaction period number is repeated a predetermined number of times.
- Table 1 shows a history of the reaction results for the most recent cycle (time T1 × M).
- ΣA0 denotes the cumulative reaction period number for one cycle in the region A.
- The cumulative reaction period number of the cycle immediately before ΣA0 is ΣA1, the one before that is ΣA2, and so on.
- For example, when N = 4, the past four histories (ΣA4, ΣA3, ΣA2, ΣA1) are referred to; for life category I, it is determined that a person is present if at least one cumulative reaction period number is 1 or more.
- For life category II, it is determined that a person is present if a cumulative reaction period number of 1 or more occurs at least twice in the past four histories.
- For life category III, it is determined that a person is present if a cumulative reaction period number of 2 or more occurs at least three times in the past four histories.
- For the other regions B to G, the presence or absence of a person is similarly estimated from the past four histories, the life category, and the cumulative reaction period numbers.
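Assuming N = 4, the category-dependent decision rules above can be sketched as follows. This is a minimal illustration of the stated rules, not the patent's exact implementation.

```python
# Presence estimation from the past four cumulative reaction period numbers,
# following the category-dependent rules stated in the text (N = 4 assumed).
def person_present(history, category):
    """history: past four cumulative reaction period numbers, oldest first."""
    if category == "I":
        # at least one period with one or more reactions
        return sum(1 for s in history if s >= 1) >= 1
    if category == "II":
        # at least two periods with one or more reactions
        return sum(1 for s in history if s >= 1) >= 2
    # category III: at least three periods with two or more reactions
    return sum(1 for s in history if s >= 2) >= 3
```

The thresholds loosen with the life category, so a busy region (category I) reacts to a single hint of activity, while a quiet region (category III) demands repeated, strong evidence.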
- In this way, the presence or absence of a person is estimated by combining the region characteristics, obtained by accumulating the region determination results over a long period, with the cumulative reaction period numbers of each region, obtained by accumulating the region determination results for each predetermined cycle N times.
- As described above, the area to be air-conditioned by the indoor unit of the air conditioner according to the present invention is divided into a plurality of regions A to G by the imaging sensor unit, and the region characteristics (life categories I to III) of each region A to G are determined.
- the time required for presence estimation and the time required for absence estimation are changed according to the region characteristics of the regions A to G.
- Taking as a standard the time required for presence/absence estimation in a region determined to be life category II, in a region determined to be life category I the presence of a person is estimated at shorter time intervals than in a region of life category II, while once no one remains in the region, the absence of a person is estimated at longer time intervals, that is, the time required for absence estimation is set long. Conversely, in a region determined to be life category III, the presence of a person is estimated at longer time intervals than in a region determined to be life category II.
- In the above description, the difference method is used for human position estimation by the image sensor unit, but other methods may of course be used.
- For example, a person-like region may be extracted from the frame image using image data of the whole body of a person.
- As such an extraction method, a technique using HOG (Histograms of Oriented Gradients) feature amounts or the like is widely known (N. Dalal and B. Triggs, “Histograms of Oriented Gradients for Human Detection”, Proc. IEEE Conf. on Computer Vision and Pattern Recognition, Vol. 1, pp. 886-893, 2005).
- The HOG feature amount may be learned and identified by an SVM (Support Vector Machine) or the like to detect a human region from the frame image.
- FIG. 21 is a flowchart showing the flow of the human position estimation process using the process of extracting a human-like area from the frame image.
- the same steps as those in FIG. 5 are denoted by the same reference numerals, and detailed description thereof is omitted here.
- In step S104, a person-like area is extracted as a human area from the frame image using the HOG feature amount described above.
- In step S103, the position of the detected person is determined by calculating the position of the center of gravity of the obtained human area.
- Equations 3 and 5 may be used as described above.
- a face-like area may be extracted from the frame image.
- As such an extraction method, a method using Haar-Like features is widely known (P. Viola and M. Jones, “Robust real-time face detection”, International Journal of Computer Vision, vol. 57, no. 2, pp. 137-154, 2004).
- The Haar-Like feature amount is a feature amount focusing on the luminance difference between local regions; this feature amount may be learned and identified by an SVM (Support Vector Machine) or the like to detect a person region from the frame image.
- FIG. 22 is a flowchart showing a flow of a human position estimation process using a process of extracting a face-like area from a frame image.
- the same steps as those in FIG. 5 are denoted by the same reference numerals, and detailed description thereof is omitted here.
- In step S105, a face-like area is extracted as a face area from the frame image using the Haar-Like feature amount described above.
- In step S103, the position of the detected person is determined by calculating the position of the center of gravity of the obtained face area.
- perspective projection conversion may be used.
- the obstacle detection means for detecting an obstacle using the above-described imaging sensor unit 24 will be described.
- Here, the term “obstacle” refers to any object that impedes the flow of the air blown out from the air outlet 10 of the indoor unit to provide a comfortable space for the residents; it is a collective term for objects other than residents, such as furniture (for example, sofas), televisions, and audio equipment.
- The floor surface of the living space is subdivided as shown in FIG. 23 based on the vertical angle and the horizontal angle described above.
- Each of these areas is defined as an obstacle position determination area or “position” to determine in which position an obstacle exists.
- all the positions shown in FIG. 23 substantially coincide with the whole area of the human position determination area shown in FIG. 13B, and the area boundary in FIG. 13B substantially coincides with the position boundary in FIG.
- In the present embodiment, the number of obstacle position determination areas (positions) is set larger than the number of human position determination areas, so that at least two positions belong to each human position determination area and air conditioning control can be performed using these at least two obstacle position determination areas; however, the areas may also be divided so that at least one position belongs to each human position determination area.
- Further, each of the plurality of human position determination areas is divided according to the distance from the indoor unit, and the number of positions belonging to a human position determination area near the indoor unit is set larger than the number of positions belonging to a human position determination area far from it; however, the number of positions belonging to each human position determination area may be the same regardless of the distance from the indoor unit.
- As described above, the air conditioner according to the present invention detects the presence or absence of a person in the regions A to G by the human body detection means and detects the presence or absence of an obstacle in the positions A1 to G2 by the obstacle detection means. Based on the detection signals (detection results) of the human body detection means and the obstacle detection means, the upper and lower blades 12 and the left and right blades 14 serving as the wind direction changing means are driven and controlled so as to provide a comfortable space.
- While the human body detection means can detect the presence or absence of a person by detecting an object that moves in the air-conditioned space, the obstacle detection means detects the distance to objects with the image sensor unit 24 and therefore cannot by itself distinguish a person from an obstacle.
- If a person were treated as an obstacle, the area where the person is located might not be air-conditioned, or the person might be hit directly by the conditioned airflow, resulting in inefficient air conditioning control or air conditioning control that is uncomfortable for the person.
- Therefore, the obstacle detection means detects only obstacles by performing the data processing described below.
- FIG. 24 is a schematic diagram for explaining obstacle detection by the stereo method.
- the distance to a point P that is an obstacle is measured using the image sensor units 24 and 26.
- f is the focal length
- B is the distance between the focal points of the two image sensor units 24 and 26
- u1 is the u coordinate of the obstacle on the image of the image sensor unit 24, and u2 is the u coordinate of the corresponding point on the image of the image sensor unit 26
- X is the distance from the image sensor unit to the point P. Further, it is assumed that the image center positions of the two image sensor units 24 and 26 are equal. Under these assumptions, the distance X from the imaging sensor unit to the point P is obtained from the following equation.
- As this equation shows, the distance X from the imaging sensor unit to the obstacle point P depends on the parallax.
- For the search for corresponding points, a block matching method based on template matching may be used.
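Although the equation itself is elided above, the standard stereo relation for this geometry is X = B·f / (u1 − u2). A minimal sketch follows, with a simplified one-dimensional SAD block matching standing in for the template matching mentioned in the text; the rows and parameters are hypothetical.

```python
def stereo_distance(f, B, u1, u2):
    """Standard stereo triangulation: X = B * f / parallax."""
    parallax = u1 - u2
    if parallax <= 0:
        raise ValueError("corresponding point must give a positive parallax")
    return B * f / parallax

def match_block(left_row, right_row, u1, half=2, search=10):
    """Find u2 in the right image minimising the sum of absolute differences."""
    block = left_row[u1 - half:u1 + half + 1]
    best_u2, best_sad = u1, float("inf")
    # assuming the match lies at u2 <= u1 for this camera arrangement
    for u2 in range(max(half, u1 - search), u1 + 1):
        cand = right_row[u2 - half:u2 + half + 1]
        sad = sum(abs(a - b) for a, b in zip(block, cand))
        if sad < best_sad:
            best_u2, best_sad = u2, sad
    return best_u2

# hypothetical rows: the same edge pattern appears 3 pixels apart
left_row = [0, 0, 0, 0, 0, 9, 5, 7, 0, 0, 0, 0, 0, 0, 0]
right_row = left_row[3:] + [0, 0, 0]
u2 = match_block(left_row, right_row, u1=6)
distance = stereo_distance(f=6.0, B=1.5, u1=6, u2=u2)  # parallax of 3 pixels
```

A textureless block gives many candidates with nearly equal SAD, which is exactly the instability the reliability evaluation described later guards against.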
- Next, distance measurement (detection of the position of an obstacle) using the imaging sensor unit will be described.
- From Equations 3, 5, and 6, it can be seen that the position of the obstacle is estimated from the pixel position and the parallax.
- i and j in Table 3 indicate the pixel positions to be measured.
- The vertical angle and the horizontal angle are, respectively, the above-described elevation angle and the angle measured rightward from the front reference line as viewed from the indoor unit. That is, as viewed from the indoor unit, the pixels are set in the range of 5 to 80 degrees in the vertical direction and −80 to 80 degrees in the horizontal direction, and the image sensor unit measures the parallax of each pixel.
- the air conditioner performs distance measurement (detection of the position of an obstacle) by measuring parallax at each pixel from pixel [14,15] to pixel [142,105].
- Note that the detection range of the obstacle detection means at the start of the operation of the air conditioner may be limited to an elevation angle of 10 degrees or more. This is because the measurement data can be used effectively by measuring the distance only in areas where a person is unlikely to be present at the start of operation, that is, areas where walls are located (since a person is not an obstacle, the data of an area where a person is present is not used, as will be described later).
- If it is determined in step S41 that there is no person in the area corresponding to the current pixel (one of the areas A to G shown in FIG. 13), the process proceeds to step S42; if it is determined that there is a person, the process proceeds to step S43. That is, since a person is not an obstacle, for a pixel corresponding to an area determined to contain a person, the previous distance data is used without performing distance measurement (the distance data is not updated), while distance measurement is performed, and the newly measured distance data is used (the distance data is updated), only for pixels corresponding to areas determined to contain no person.
- In this way, the presence or absence of an obstacle in each obstacle position determination area is determined according to the result of the presence/absence determination of a person in the corresponding human position determination area, so the determination is performed efficiently. More specifically, in an obstacle position determination area belonging to a human position determination area determined by the human body detection means to contain no person, the previous determination result of the obstacle detection means is updated with a new determination result, whereas in an obstacle position determination area belonging to a human position determination area determined to contain a person, the previous determination result of the obstacle detection means is not updated.
- In step S42, the parallax of each pixel is calculated using the above-described block matching method, and the process proceeds to step S44.
- In step S44, it is determined whether data has been acquired eight times for the same pixel, that is, whether the distance measurement based on the acquired data is complete. If it is determined that the distance measurement is not complete, the process returns to step S41; if it is determined that the distance measurement has been completed, the process proceeds to step S45.
- In step S45, the reliability of the distance estimation is evaluated. If the estimation is determined to be reliable, a distance number determination process is performed in step S46; if it is determined not to be reliable, a neighboring distance number is used as the distance data of the pixel in step S47.
- the image sensor units 24 and 26 function as obstacle position detection means.
- Next, the distance number determination process in step S46 will be described. First, the term “distance number” will be explained.
- The “distance number” expresses an approximate distance from the image sensor unit to a position P in the air-conditioned space. As shown in FIG. 26, assuming that the image sensor unit is installed 2 m above the floor surface and that the distance from the image sensor unit to the position P (the “distance corresponding to the distance number”) is X [m], the position P is expressed by the following equation.
- the distance X corresponding to the distance number depends on the parallax between the image sensor units 24 and 26.
- the distance number is an integer value from 2 to 12, and the distance corresponding to each distance number is set as shown in Table 4.
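The geometry behind Table 4 can be sketched as follows. The exact equation is not reproduced in the text, so the relation below is an assumed illustration: with the sensor 2 m above the floor and the vertical angle measured downward from the horizontal, a point at distance X lies at height h = 2 − X·sin φ, and the black cells of Table 4 correspond to h < 0.

```python
import math

SENSOR_HEIGHT_M = 2.0  # the image sensor unit is installed 2 m above the floor

def position_from_distance(x, phi_deg):
    """Horizontal distance and height above the floor of the point P.

    phi_deg is assumed to be the vertical angle below the horizontal; this
    convention is an illustrative guess, not stated in the text."""
    phi = math.radians(phi_deg)
    return x * math.cos(phi), SENSOR_HEIGHT_M - x * math.sin(phi)

def bites_into_floor(x, phi_deg):
    """True for the black cells of Table 4 (h < 0)."""
    return position_from_distance(x, phi_deg)[1] < 0
```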
- Table 4 shows the position of the point P corresponding to the elevation angle determined by the v-coordinate value of each pixel, according to each distance number and Equation 2; in the black cells, h takes a negative value (h < 0), indicating a position that digs into the floor.
- Similarly, a position corresponding to a distance number of 10 or more is a position beyond the wall of the room (a diagonal distance > 4.50 m, that is, outside the room) and is therefore a meaningless distance number; such cells are also shown in black.
- Table 6 shows the limit value of the distance number set according to the capability rank of the air conditioner and the elevation angle of each pixel.
- Next, the reliability evaluation process in step S45 and the distance number determination process in step S46 will be described.
- The distance number is determined eight times for each pixel; the two largest and the two smallest of these distance numbers are removed, and the distance number is determined by taking the average of the remaining four.
- In the stereo method based on block matching, when an obstacle with no luminance change is observed, the parallax calculation is not stable, and a parallax result (distance number) that differs greatly at every measurement may be obtained. Therefore, in step S45, the values of the remaining four distance numbers are compared, and if the variation is equal to or greater than a threshold value, the distance number is regarded as unreliable and, in step S47, the distance number estimated for a neighboring pixel is used instead.
- The average value is quantized to an integer by rounding up the fractional part, and the position corresponding to the distance number thus determined is as shown in Table 4 or Table 5.
- As described above, in the present embodiment the distance number is determined by taking the average of the remaining four distance numbers after removing the two largest and the two smallest.
- However, the number of distance number determinations per pixel is not limited to eight, and the number of distance numbers averaged is not limited to four.
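The trimmed-mean decision of steps S45 to S47 can be sketched as below; the variation threshold is an assumed value, since the text does not specify one.

```python
import math

# Assumed: maximum spread allowed among the middle four distance numbers
VARIATION_THRESHOLD = 3

def distance_number(samples, neighbour):
    """Decide a pixel's distance number from eight measurements.

    samples: eight per-pixel distance numbers; neighbour: fallback value
    taken from a neighbouring pixel (step S47)."""
    middle = sorted(samples)[2:-2]            # drop two largest, two smallest
    if max(middle) - min(middle) >= VARIATION_THRESHOLD:
        return neighbour                      # unreliable: use the neighbour
    return math.ceil(sum(middle) / len(middle))  # step S46: round up
```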
- In step S43 of the flowchart of FIG. 25, the previous distance data is used. However, since no previous data exists immediately after the installation of the air conditioner, a default value is used the first time each obstacle position determination area is determined by the obstacle detection unit; the limit value (maximum value D) described above is used as the default value.
- FIG. 27 is an elevation view (longitudinal sectional view passing through the image sensor unit) of a certain living space.
- the floor surface is 2 m below the image sensor unit, and a table or the like is 0.7 to 1.1 m from the floor surface.
- the measurement results when there is an obstacle are shown in the drawing.
- In the drawing, the shaded part, the upward-sloping hatched part, and the downward-sloping hatched part indicate areas determined to have obstacles at short distance, medium distance, and long distance, respectively (these distances will be described later).
- If there is no luminance difference (texture) on the top surface of the table, the stereo method fails to calculate the parallax, so it is difficult to determine the position of the table. In practice, however, objects placed on the table produce a luminance difference (texture) on the top surface, so that the table position can easily be determined by the stereo method.
- In this way, obstacle detection is performed using not only the obstacle itself but also its interaction with the incidental objects in its vicinity.
- For the furniture actually placed in a room, it is often the daily necessities placed on the furniture, rather than the furniture itself, that are detected. Since the appearance of an obstacle and the interaction of the incidental objects in its vicinity change, detection errors can be reduced as much as possible by repeatedly performing the obstacle detection.
- Specifically, the obstacle position is learned based on the result of each scan, the location of the obstacle is judged from the learning control result, and the airflow control described later is performed.
- FIG. 28 is a flowchart showing the obstacle presence / absence determination. This obstacle presence / absence determination is sequentially performed for all positions (obstacle position determination areas) shown in FIG. Here, the position A1 will be described as an example.
- In step S71, the detection operation (stereo method) is performed by the imaging sensor units 24 and 26 for the first pixel of the position A1, and in step S72 the presence or absence of the above-described obstacle is determined. If it is determined in step S72 that there is an obstacle, “1” is added to the first memory in step S73; if it is determined that there is no obstacle, “0” is added to the first memory in step S74.
- In step S75, it is determined whether the detection for all the pixels at the position A1 is complete. If it is not complete, the detection operation is performed by the stereo method for the next pixel in step S76, and the process returns to step S72.
- step S77 the numerical value recorded in the first memory (the total number of pixels determined to have an obstacle) is divided by the number of pixels at position A1.
- In step S78, the quotient obtained by this division is compared with a predetermined threshold value. If the quotient is larger than the threshold value, it is provisionally determined in step S79 that there is an obstacle at position A1, and “5” is added to the second memory in step S80. On the other hand, if the quotient is less than the threshold value, it is provisionally determined in step S81 that there is no obstacle at position A1, and “−1” is added to the second memory in step S82 (that is, “1” is subtracted).
- The threshold value used here depends on the distance from the indoor unit and is set, for example, as follows.
- In step S83, it is determined whether the numerical value (the total after the addition) recorded in the second memory is greater than or equal to a determination reference value (for example, 5). If it is, it is finally determined that there is an obstacle at position A1; if it is less than the determination reference value, it is finally determined in step S85 that there is no obstacle at position A1.
- Since the first memory is cleared when the obstacle detection operation at one position is completed, it can be reused for the obstacle detection operation at the next position.
- The second memory accumulates the added value of each position every time the air conditioner is operated (however, minimum value ≤ total ≤ maximum value), so the same number of memories as the number of positions is prepared.
- For example, “5” is set as the determination reference value, and when it is finally determined in the first obstacle detection at a certain position that there is an obstacle, “5” is recorded in the second memory. In this state, if it is provisionally determined in the next obstacle detection that there is no obstacle, the value obtained by adding “−1” to “5” is less than the determination reference value, so it is finally determined that there is no obstacle at that position.
- This obstacle detection learning control determines the final presence or absence of an obstacle based on a cumulative sum of additions and subtractions, and is characterized in that the value added when an obstacle is provisionally determined to be present is set sufficiently larger than the value subtracted when an obstacle is provisionally determined to be absent.
- Since a maximum value and a minimum value are set for the numerical value recorded in the second memory, even if the position of an obstacle changes greatly due to moving house or rearranging the room, the change can be followed as quickly as possible. If no maximum value were set, the total would gradually increase each time an obstacle is determined to be present; then, if the position of the obstacle changed due to moving and no obstacle remained in an area that had always been determined to contain one, it would take a long time for the total to fall below the determination reference value. Conversely, if no minimum value were set, the reverse phenomenon would occur.
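A minimal sketch of this add/subtract accumulation follows; the clamp values are assumptions, since the text names a maximum and a minimum without giving numbers.

```python
# Learning control of FIG. 28: +5 on a provisional "obstacle" result,
# -1 otherwise; the running total is clamped, and the final decision
# compares the total with the reference value 5.
ADD_PRESENT, ADD_ABSENT = 5, -1
REFERENCE = 5
TOTAL_MIN, TOTAL_MAX = 0, 10   # assumed clamp values, not from the patent

def update_second_memory(total, provisional_obstacle):
    """One scan's update of the second memory for a single position."""
    total += ADD_PRESENT if provisional_obstacle else ADD_ABSENT
    return max(TOTAL_MIN, min(TOTAL_MAX, total))

def obstacle_present(total):
    """Final determination (step S83)."""
    return total >= REFERENCE

total = 0
for scan in [True, False, True]:   # provisional results over three runs
    total = update_second_memory(total, scan)
# total goes 0 -> 5 -> 4 -> 9: one positive scan restores the decision
```

The asymmetry (+5 versus −1) is what makes a single positive detection decisive while a single miss merely dents the total.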
- FIG. 29 shows a modification of the obstacle detection learning control shown in the flowchart of FIG. 28; only steps S100, S102, and S103 differ from the flowchart of FIG. 28, so only these differences will be described.
- When it is provisionally determined in step S99 that there is an obstacle at position A1, “1” is added to the second memory in step S100. On the other hand, if it is provisionally determined in step S101 that there is no obstacle at position A1, “0” is added to the second memory in step S102.
- In step S103, the total value recorded in the second memory over the past ten obstacle detections, including the current one, is compared with a determination reference value (for example, 2). If the total is equal to or greater than the determination reference value, it is finally determined in step S104 that there is an obstacle at position A1; if it is less, it is finally determined in step S105 that there is no obstacle at position A1.
- With the obstacle detection learning control described above, an obstacle is finally determined to be present if it is detected even twice in the past ten obstacle detections at a certain position, even though it was missed the other eight times. That is, this learning control is characterized in that the number of obstacle detections (here, two) required for a final determination that an obstacle is present is set sufficiently smaller than the number of past obstacle detections referred to; with this setting, a result that an obstacle is present is easily obtained.
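The FIG. 29 variant, which keeps the last ten provisional results rather than a running sum, can be sketched as follows (a minimal illustration of the stated rule).

```python
from collections import deque

# Modification of FIG. 29: remember the provisional results (1/0) of the
# last ten obstacle detections and report "obstacle" when their sum
# reaches the reference value 2 (step S103).
REFERENCE = 2
HISTORY = 10

class ObstacleHistory:
    def __init__(self):
        # second memory for one position: only the last 10 scans are kept
        self.results = deque(maxlen=HISTORY)

    def add(self, provisional_obstacle):
        self.results.append(1 if provisional_obstacle else 0)

    def obstacle_present(self):
        return sum(self.results) >= REFERENCE

h = ObstacleHistory()
for r in [False] * 8 + [True, True]:   # detected only twice in ten scans
    h.add(r)
# even 2 hits out of 10 yield a final "obstacle" determination
```

The bounded `deque` also gives the forgetting behaviour that the clamped sum of FIG. 28 approximates: a result older than ten scans simply drops out.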
- a button for resetting data recorded in the second memory may be provided on the indoor unit main body or the remote control, and the data may be reset by pressing this button.
- The position of obstacles and wall surfaces, which greatly affect airflow control, is unlikely to change; however, when the indoor unit installation position changes due to moving house, or the furniture position changes due to rearranging the room, it is not preferable to perform airflow control based on the data obtained so far. This is because, although the learning control eventually becomes suitable for the room, it takes time to reach the optimal control (particularly when an obstacle disappears from a region). Therefore, by providing a reset button, when the relative positional relationship between the indoor unit and the obstacles or wall surfaces has changed, resetting the previous data prevents improper air conditioning based on past, now-incorrect data, and restarting the learning control from the beginning reaches control suitable for the new situation sooner.
- the areas A to G shown in FIG. 13 belong to the following blocks, respectively.
- Block N: Region A
- Block R: Regions B and E
- Block C: Regions C and F
- Block L: Regions D and G
- The regions A to G belong to the following fields, respectively.
- Field 1: Region A
- Field 2: Regions B and D
- Field 3: Region C
- Field 4: Regions E and G
- Field 5: Region F
- Furthermore, the distance from the indoor unit is defined as follows.
- Table 7 shows the target setting angles at the positions of the five left blades and the five right blades constituting the left and right blades 14, and the symbols attached to the numbers (angles) are as shown in FIG.
- Here, the case where the left or right blades are directed inward is defined as the plus (+; no symbol in Table 7) direction, and the case where they are directed outward as the minus (−) direction.
- the “heating B area” in Table 7 is a heating area where obstacle avoidance control is performed, and “normal automatic wind direction control” is wind direction control where obstacle avoidance control is not performed.
- The determination as to whether or not to perform the obstacle avoidance control is based on the temperature of the indoor heat exchanger 6: when the temperature is low, wind direction control that does not blow air onto the occupants is performed; when it is too high, wind direction control at the maximum air volume position is performed; and when the temperature is moderate, wind direction control toward the heating B area is performed.
- “temperature is low”, “too high”, “wind direction control that does not apply wind to the occupant”, and “wind direction control at the maximum airflow position” have the following meanings.
- Low temperature: with the skin temperature (33 to 34 °C) taken as the optimum temperature of the indoor heat exchanger 6, a temperature that can feel cooler than this (for example, 32 °C or lower).
- Too high temperature: for example, 56 °C or higher.
- Wind direction control that does not blow air onto the occupants: wind direction control that makes the air flow along the ceiling by controlling the angle of the upper and lower blades 12 so as not to send air into the living space.
- Wind direction control at the maximum airflow position: since resistance (loss) always occurs when the air conditioner bends the airflow with the upper and lower blades 12 and the left and right blades 14, the maximum airflow position is the wind direction in which this loss is close to zero.
- Table 8 shows target setting angles in the fields of the upper and lower blades 12 when performing obstacle avoidance control.
- the upper blade angle ( ⁇ 1) and the lower blade angle ( ⁇ 2) are angles (elevation angles) measured upward from the vertical line.
- The swing operation is a swinging motion of the left and right blades 14; basically, the blades swing with a predetermined left-right angular width around one target position, with no fixed time at either end of the swing.
- The position stop operation means that the target setting angle of a certain position (the angle in Table 7) is corrected as shown in Table 9 to obtain the left end and the right end of the swing.
- At the left end and the right end there is a wind direction fixing time (a time during which the left and right blades 14 are fixed). For example, when the wind direction fixing time elapses at the left end, the blades move to the right end and maintain the right-end wind direction until the wind direction fixing time elapses there; after it has elapsed, they move back to the left end, and this is repeated.
- the wind direction fixing time is set to 60 seconds, for example.
- the set angles of the left and right blades 14 corresponding to the left end and the right end of each block are determined based on, for example, Table 10.
- This operation has a wind direction fixing time at the left and right ends of each block. For example, when the wind direction fixing time has elapsed at the left end, the blades move to the right end and maintain the right-end wind direction until the fixing time has elapsed there; then they move back to the left end, and this is repeated.
- The wind direction fixing time is set to 60 seconds, for example, as in the position stop operation. Since the left end and the right end of each block coincide with the left end and the right end of the human position determination area belonging to the block, the block stop operation can be regarded as a stop operation over the human position determination area.
- position stop operation and block stop operation are properly used according to the size of the obstacle.
- When the obstacle in front is small, the position stop operation is performed around the position where the obstacle is present, so that the air is blown while avoiding the obstacle; when the obstacle in front is large, for example when there is a large obstacle in front of the area where the person is, the air is blown over a wide range by performing the block stop operation.
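The selection described above can be sketched as a simple rule; the size criterion (how many positions count as a "large" obstacle) is an assumption, since the text gives no numeric boundary.

```python
# Assumed criterion: an obstacle spanning 2 or more positions is "large"
LARGE_OBSTACLE_POSITIONS = 2

def choose_stop_operation(obstructed_positions_in_front):
    """Pick the swing mode of the left/right blades 14 for one target area.

    obstructed_positions_in_front: list of obstacle position determination
    areas (e.g. ["D1", "D2"]) found in front of the person's area."""
    if len(obstructed_positions_in_front) >= LARGE_OBSTACLE_POSITIONS:
        return "block stop"      # blow over a wide range of the whole block
    return "position stop"       # avoid the single obstructed position
```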
- the swing operation, the position stop operation, and the block stop operation are collectively referred to as the swing operation of the left and right blades 14.
- In the following description, it is assumed that the human body detection means determines that a person is present in only a single region.
- In addition, airflow control is performed to control the upper and lower blades 12 so as to avoid the obstacle from above.
- when the obstacle detection means determines that there is an obstacle in an obstacle position determination area belonging to the person position determination area in which the human body detection means has determined that a person is present,
- a first airflow control, in which the left and right blades 14 are swung within at least one obstacle position determination region belonging to that person position determination area, with no fixing time of the left and right blades 14 at the ends of the swing range, and
- a second airflow control, in which the left and right blades 14 are swung within at least one obstacle position determination region belonging to that person position determination area or an adjacent person position determination area, with fixing times of the left and right blades 14 provided at both ends of the swing range, are defined, and
- one of these two airflow controls is selected.
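The practical difference between the two airflow controls is whether the blades hold at the ends of the sweep. A hypothetical sweep generator can illustrate this; the 60-second fixing time follows the text, while the three-step sweep shape is an assumption for illustration.

```python
def swing_angles(center: float, half_range: float, end_hold: bool):
    """Yield (angle_deg, hold_seconds) steps for one sweep of the left/right blades.

    end_hold=False -> first airflow control (no fixing time at the swing ends);
    end_hold=True  -> second airflow control (blades pause at both ends).
    The 60 s hold comes from the wind direction fixing time in the text.
    """
    hold = 60.0 if end_hold else 0.0
    yield (center - half_range, hold)  # left end of the swing range
    yield (center, 0.0)                # pass through the center without stopping
    yield (center + half_range, hold)  # right end of the swing range
```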
- both the left blades and the right blades continue to swing through an angle range of ±10 degrees without stopping at the center.
- the timing of swinging the left and right blades to the left and right is set to be the same, and the swinging motions of the left and right blades are linked.
- the first airflow control is performed by swinging between the target setting angles of two obstacle-free positions as both ends, so that the obstacle-free positions are basically air-conditioned.
- the block N is operated in block stop mode and the second airflow control is performed. This is because the block stop operation produces a more directional airflow that reaches farther than aiming at the entire area, and is more likely to avoid obstacles. That is, even when obstacles are scattered in area A, there are usually gaps between them, and air can be blown through those gaps.
- the first airflow control is performed by swinging left and right around the target setting angle of the obstacle-free position. For example, when there is a person in area D and an obstacle only at position D2, the blades swing left and right around the target setting angle of position D1.
- the block including the area where the person is present is operated in block stop mode and the second airflow control is performed. For example, when there is a person in area D and obstacles at both positions D1 and D2, block L is operated in block stop mode.
- the first airflow control is performed by swinging around the target setting angle of an obstacle-free position in the middle-distance region. For example, if there is a person in area E and there is an obstacle at position B2, with no obstacles on either side of it but an obstacle behind it, it is advantageous to send the airflow from the obstacle-free position B1.
- the first airflow control is performed by swinging around the target setting angle of the obstacle-free position. For example, if there is a person in area F, an obstacle at position C2, an obstacle at position D1 (one of C2's two neighbors) and none at C1, airflow can be sent to area F from the obstacle-free position C1 while avoiding the obstacle at position C2.
- the block including the area where the person is present is operated in block stop to perform the second air flow control.
- the block C is operated in block stop mode. In this case, since there is an obstacle in front of the person and no way to avoid it, the block stop operation is performed regardless of whether there are obstacles in the blocks adjacent to block C.
- the first airflow control is performed by swinging around the target setting angle of the other, obstacle-free position. For example, if there is a person in area F, no obstacles at positions C1, C2, and F1, and an obstacle at position F2, the space in front of area F is open; taking the far-distance obstacle into account, air conditioning is centered on the obstacle-free far position F1.
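The case analysis above for a person in area D (block L) can be condensed into one small decision function. This is a sketch only; the returned strings are informal labels, not terms from the patent.

```python
def control_for_area_d(obstacles: set) -> str:
    """Worked example of the rules above for a person in area D (block L).

    Position names D1/D2 and block L follow the text; everything else is
    an illustrative simplification of the enumerated cases.
    """
    if {"D1", "D2"} <= obstacles:
        # Both positions blocked: block stop over the whole block.
        return "second control: block stop over block L"
    if "D2" in obstacles:
        # Only D2 blocked: swing around the obstacle-free position D1.
        return "first control: swing around target angle of D1"
    if "D1" in obstacles:
        return "first control: swing around target angle of D2"
    return "normal automatic wind direction control"
```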
- the upper and lower blades 12 and the left and right blades 14 are controlled based on the presence / absence determination of the person by the human body detection means and the presence / absence determination of the obstacle by the obstacle detection means.
- the upper and lower blades 12 and the left and right blades 14 can also be controlled based only on the presence/absence determination of the obstacle by the obstacle detection means.
- This obstacle avoidance control is basically for avoiding the area determined as having an obstacle by the obstacle detection means and blowing air toward the area determined as having no obstacle.
- An example will be described.
- A. Upper and lower blade control (1) When there is an obstacle in area A (short distance): when warm air, which becomes lighter and rises during heating, is suppressed by sending it out with the upper and lower blades 12 directed fully downward, an obstacle in area A may cause the warm air to accumulate behind the obstacle (on the indoor unit side) or to hit the obstacle and fail to reach the floor.
- the set angle of the upper and lower blades 12 is corrected as shown in Table 11 with respect to the normal field control (Table 8), and the air flow control with the upper and lower blades 12 set upward is performed.
- Air conditioning over obstacles: if the entire airflow is raised too much to clear the obstacles, the warm air hits the resident's face directly and causes discomfort, so the upper blade 12a is used to suppress the lift.
- the blocks including areas C and D, or areas B and C, are operated in block stop mode.
- areas C and D, or areas B and C, are block-stopped at a rate of once every several passes (for example, once every five)
- the left and right blades 14 are swung toward the area B or D.
- it is effective in terms of air-conditioning of the entire room.
- the position divisions for determining the presence or absence of an obstacle (obstacle position determination areas) may be subdivided as shown in FIG. 23 regardless of the capability rank of the air conditioner, but since the assumed room size differs with the capability rank, the number of divided areas may also be changed accordingly. For example, a unit rated 4.0 kW or higher may be divided as shown in FIG. 23, while for 3.6 kW or lower the long-distance region may be omitted, the short distance divided into three, and the middle-distance region divided into six.
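The rank-dependent division might be expressed as follows. The 3-near/6-middle split for 3.6 kW and below comes from the text; the counts attributed to the FIG. 23 layout are assumptions, since the figure is not reproduced here.

```python
def area_division(capability_kw: float) -> dict:
    """Sketch of the capability-rank-dependent area division described above.

    >= 4.0 kW: divide as in FIG. 23 (the far-region count of 6 is assumed).
    <= 3.6 kW: omit the far region; near split into 3, middle into 6.
    Ranks between 3.6 and 4.0 kW are not specified in the text; this sketch
    arbitrarily treats them like the smaller layout.
    """
    if capability_kw >= 4.0:
        return {"near": 3, "middle": 6, "far": 6}
    return {"near": 3, "middle": 6}
```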
- ⁇ Human wall proximity control> When a person and a wall exist in the same area, the person is always located in front of the wall and close to the wall, and during heating, hot air tends to stay near the wall. Since the room temperature tends to be higher than the room temperature of other parts, human wall proximity control is performed.
- the parallax is calculated in a pixel different from the pixel [i, j] shown in Table 4, the distance is detected, and the positions of the front wall and the left and right walls are first recognized.
- the parallax of the pixel corresponding to the front in the substantially horizontal direction is calculated, and the distance to the front wall is measured to obtain the distance number. Further, the parallax of the pixel corresponding to the left side in the substantially horizontal direction is calculated, the distance to the left wall is measured, the distance number is obtained, and the distance number of the right wall is obtained similarly.
- FIG. 29 is a top view of a room in which the indoor unit is installed, showing a case where a front wall WC, a left wall WL, and a right wall WR exist on the front, left, and right sides as viewed from the indoor unit.
- the numbers on the left side of FIG. 29 indicate the distance numbers of the corresponding cells, and Table 12 indicates the distances from the indoor unit to the near and far points corresponding to the distance numbers.
- the “obstacle” used in the present specification is assumed to be furniture such as a table and a sofa, a television, audio equipment, and the like. Since such objects are not detected in the angle range above an elevation angle of 75 degrees, anything detected there can be presumed to be a wall; in the present embodiment, therefore, the distances to the front, left end, and right end as seen from the indoor unit are detected at elevation angles of 75 degrees or more, and a wall is assumed to lie on the extension including each detected position.
- the left wall WL is at the positions of -80 degrees and -75 degrees
- the front wall WC is at the positions of -15 degrees to 15 degrees
- the right wall WR is at the positions of 75 degrees and 80 degrees. Therefore, among the pixels shown in Table 3, the pixels corresponding to these horizontal viewing angles at elevation angles of 75 degrees or more are as follows.
- the upper limit value and the lower limit value of each wall's surface data are deleted to eliminate unnecessary wall surface data.
- on this basis, the distance numbers from the indoor unit to the front wall WC, the left wall WL, and the right wall WR are determined.
- the maximum values in Table 14 (WC: 5, WL: 6, WR: 3) can be adopted.
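One plausible reading of the trimming-then-maximum procedure is sketched below. The patent gives no code, so the exact trimming rule (drop one lowest and one highest sample before taking the maximum) is an assumption.

```python
def wall_distance_number(samples: list) -> int:
    """Sketch: delete each wall's upper and lower limit values, then adopt the max.

    `samples` plays the role of one wall's column of distance numbers in
    Table 13/14. With two or fewer samples there is nothing to trim.
    """
    if len(samples) <= 2:
        return max(samples)
    trimmed = sorted(samples)[1:-1]  # remove the upper and lower limit values
    return max(trimmed)
```

For example, samples [1, 5, 6, 6, 9] trim to [5, 6, 6] and give a distance number of 6, in the spirit of the WL: 6 value quoted from Table 14.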
- in a large room, a wider space should be set as the target of air-conditioning control.
- A. When a person is in the short-distance or middle-distance area: the temperature setting is made lower than the set temperature selected on the remote control.
- specifically, the set temperature is lowered by a first predetermined temperature (for example, 2 °C).
- B. When a person is in the long-distance area: since the long-distance area is far from the indoor unit and large in size, the room temperature rises less than in the short-distance or middle-distance areas.
- the set temperature is therefore lowered by a second predetermined temperature (for example, 1 °C) smaller than the first predetermined temperature.
- since the long-distance area is large, a person and a wall detected in the same person position determination area may actually be far apart. Human wall proximity control is therefore performed only in this case when warranted, and the temperature shift is applied according to the positional relationship between the person and the wall.
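The temperature shifts above can be summarized in a short helper. The 2 °C and 1 °C values are from the text; the region labels and the near-wall flag are illustrative names of ours.

```python
def shifted_setpoint(remote_setpoint_c: float, person_region: str,
                     person_near_wall: bool) -> float:
    """Sketch of the human wall proximity temperature shift described above.

    Near/middle regions: lower the setpoint by the first predetermined
    temperature (2 degC). Far region: lower by the second predetermined
    temperature (1 degC), and only when person and wall are judged close.
    """
    if not person_near_wall:
        return remote_setpoint_c
    if person_region in ("near", "middle"):
        return remote_setpoint_c - 2.0
    if person_region == "far":
        return remote_setpoint_c - 1.0
    return remote_setpoint_c
```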
- the stereo method is used as the distance detection means, but a method using the light projecting unit 28 and the image sensor unit 24 may be used instead of the stereo method. This method will be described.
- the main body 2 of the present embodiment includes an image sensor unit 24 and a light projecting unit 28.
- the light projecting unit 28 includes a light source and a scanning unit (not shown), and the light source may use an LED or a laser. Further, the scanning unit can change the light projecting direction arbitrarily using a galvanometer mirror or the like.
- FIG. 31 is a schematic diagram showing the relationship between the image sensor unit 24 and the light projecting unit 28. Strictly, the projection direction has two degrees of freedom and the imaging surface is a two-dimensional (vertical and horizontal) plane, but the figure shows a simplified one-dimensional arrangement.
- the light projecting unit 28 projects light in the light projecting direction ⁇ with respect to the optical axis direction of the imaging sensor unit 24.
- the image sensor unit 24 performs a difference process between the frame image captured immediately before the light projecting unit 28 projects light and the frame image captured during projection, thereby detecting the reflection point P of the projected light, and acquires its u coordinate u1 on the image.
- distance information in the air-conditioned space can be obtained by detecting the reflection point P of the light while changing the light projecting direction ⁇ of the light projecting unit 28.
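Recovering depth from the projection angle θ and the image coordinate u1 is a standard active-triangulation computation. The sketch below assumes a pinhole camera model and a known baseline between the projector and the sensor along the image u-axis; none of these parameter names appear in the patent.

```python
import math

def reflection_depth(baseline_m: float, theta_deg: float,
                     u1_px: float, focal_px: float) -> float:
    """Triangulate the depth of reflection point P (assumed geometry).

    Assumptions: the projector fires at angle theta from the sensor's optical
    axis, u1 is the pixel offset of P from the principal point, and focal_px
    is the focal length in pixels. This is a generic sketch, not the
    patent's implementation.
    """
    phi = math.atan2(u1_px, focal_px)  # viewing angle of P seen by the sensor
    denom = math.tan(math.radians(theta_deg)) - math.tan(phi)
    if abs(denom) < 1e-9:
        raise ValueError("rays are parallel; depth cannot be resolved")
    return baseline_m / denom
```

Sweeping θ while recording u1 for each address then yields the distance map over the air-conditioned space.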
- i and j indicate the addresses scanned by the light projecting unit 28; as viewed from the indoor unit, the vertical angle is the elevation angle φ and the horizontal angle is the angle θ measured to the right from the front reference line.
- that is, as viewed from the indoor unit, each address is set in the range of 5 to 80 degrees vertically and -80 to 80 degrees horizontally, and the light projecting unit 28 measures each address to scan the living space.
- in step S48, when it is determined that there is no person in the area (any of areas A to G shown in FIG. 13) corresponding to the address [i, j] at which the light projecting unit 28 projects light, the process proceeds to step S49; when it is determined that there is a person, the process proceeds to step S43. That is, since a person is not an obstacle, no distance measurement is performed for pixels corresponding to areas judged to contain a person, and the previous distance data is reused (the distance data is not updated); distance measurement is performed only for pixels corresponding to areas judged to contain no person, and the newly measured distance data is used (the distance data is updated).
- in step S49, the distance to the obstacle is estimated from the light projection process described above and the reflection point acquired from the image sensor unit 24.
- the distance number determination process may then be used to carry out processing in terms of distance numbers.
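Steps S48/S49 amount to a selective update of the distance map. A minimal sketch, with dictionary-based maps as our own representation:

```python
def update_distance_map(prev: dict, measured: dict, person_in_area: dict) -> dict:
    """Keep old distance data where a person is detected; update elsewhere.

    prev/measured map address -> distance number; person_in_area maps
    address -> bool. Representation is illustrative, not from the patent.
    """
    out = {}
    for addr, old in prev.items():
        if person_in_area.get(addr, False):
            out[addr] = old  # a person is not an obstacle: do not update
        else:
            out[addr] = measured.get(addr, old)  # re-measure obstacle distance
    return out
```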
- the human body detection means may also be used as distance detection means. This consists of human body distance detection means using the human body detection means and obstacle detection means using the human body detection means; this processing is described below.
- FIG. 34 is a flowchart showing the flow of processing of the human body distance detecting means using the human body detecting means.
- the same steps as those in FIG. 5 are denoted by the same reference numerals, and detailed description thereof is omitted here.
- in step S201, the human body distance detection means detects, among the pixels where a difference occurred in each area into which the human body detection means divided the image, the pixel located highest in the image, and acquires its v coordinate as v1.
- the human body distance detection means estimates the distance from the image sensor unit to the person using v1, which is the v coordinate at the top of the image.
- FIG. 35 is a schematic diagram for explaining this process.
- FIG. 35A is a schematic diagram of a scene in which two persons 121 and 122 stand near and far from the camera, and FIG. 35B shows the difference image of the images captured by the image sensor unit in that scene.
- the areas 123 and 124 where the difference is generated correspond to the persons 121 and 122, respectively.
- it is assumed that the height h1 of a person is known and that the heights of all persons in the air-conditioned space are substantially equal.
- since the image sensor unit 24 is installed at a height of 2 m, it images the scene looking down from above the person, as shown in FIG. 35A. The closer the person is to the image sensor unit, the lower in the image the person appears, as shown in FIG. 35B. That is, the v coordinate v1 at the top of the person in the image corresponds one-to-one with the distance from the image sensor unit to the person. Human body distance detection using the human body detection means can therefore be performed by obtaining in advance the correspondence between the topmost v coordinate v1 and the distance from the image sensor unit to the person.
- Table 17 shows an example in which the average height of a person is used as h1, and the correspondence between the v-coordinate v1 at the top of the image and the distance from the imaging sensor unit to the person is obtained in advance.
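A Table 17-style lookup might look like the following. The numbers in the example table are invented for illustration; the real correspondence depends on the 2 m camera height and the assumed average height h1.

```python
def distance_from_top_v(v1: int, lookup: list) -> float:
    """Monotone lookup of person distance from the topmost v coordinate (sketch).

    `lookup` stands in for Table 17: (max_v1, distance_m) pairs sorted by v1.
    Larger v1 (person lower in the image) means the person is closer, so the
    mapped distance decreases down the table.
    """
    for v_max, dist in lookup:
        if v1 <= v_max:
            return dist
    return lookup[-1][1]  # below the last band: closest distance
```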
- FIG. 36 is a flowchart showing the flow of processing of obstacle detection means using human body detection means.
- step S203 the obstacle detection means estimates the height v2 of the person on the image using the distance information from the image sensor unit 24 to the person estimated by the human body distance detection means.
- FIG. 37 is a schematic diagram for explaining this processing, and is a schematic diagram showing a scene similar to FIG.
- the height h1 of the person is known as described above, and the heights of all persons in the air-conditioned space are substantially equal.
- since the image sensor unit 24 is installed at a height of 2 m, it images the scene looking down from above the person, as shown in FIG. 35A. The closer the person is to the image sensor unit 24, the larger the person appears on the image.
- the difference v2 between the v coordinate at the top of the person in the image and the v coordinate at the bottom corresponds one-to-one with the distance from the image sensor unit 24 to the person. Thus, when the distance from the image sensor unit to the person is known, the person's size on the image can be estimated by obtaining in advance the correspondence between v2 and that distance.
- in step S204, the obstacle detection means detects, in each region of the difference image, the pixel with a difference located highest in the image and the pixel with a difference located lowest, and calculates their difference v3.
- in step S205, the person's height v2 on the image, estimated from the distance information from the image sensor unit 24 to the person, is compared with the height v3 obtained from the actual difference image, to estimate whether there is an obstacle between the image sensor unit 24 and the person.
- 38 and 39 are schematic diagrams for explaining this processing.
- FIG. 38 shows a scene similar to FIG. 35, and is a schematic diagram showing a scene where no obstacle exists between the image sensor unit 24 and a person.
- FIG. 39 is a schematic diagram showing a scene where an obstacle exists.
- step S205 when it is determined in step S205 that v3 is sufficiently smaller than v2, the process proceeds to step S206, and it is determined that there is an obstacle between the imaging sensor unit and the person.
- the distance between the imaging sensor unit and the obstacle is assumed to be equal to the distance from the imaging sensor unit to the person obtained from the uppermost v coordinate v1.
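The comparison in step S205 can be written as a one-line test. The patent only says v3 must be "sufficiently smaller" than v2, so the 0.6 ratio below is our stand-in threshold.

```python
def obstacle_between(v2_expected: float, v3_observed: float,
                     ratio_threshold: float = 0.6) -> bool:
    """Sketch of step S205: occlusion test by comparing expected vs observed height.

    Returns True (obstacle between sensor and person) when the observed
    height v3 falls below ratio_threshold * expected height v2. The 0.6
    value is an assumption; the patent does not quantify the margin.
    """
    return v3_observed < ratio_threshold * v2_expected
```

When this returns True, the obstacle distance is then taken to equal the person distance obtained from v1, as the text states.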
- the distance detection means is realized by using the detection result of the human body detection means.
- so far, the imaging sensor unit 24 has been described as fixed, with a sufficient viewing angle.
- when the horizontal field of view of the image sensor unit 24 is narrow, an operation of widening the field of view by reciprocating it horizontally may be performed.
- likewise, when the vertical field of view of the image sensor unit 24 is narrow, an operation of widening the field of view by reciprocating it vertically may be performed.
- when both the horizontal and vertical fields of view are narrow, they can be widened by driving the image sensor unit 24 to scan in both directions.
- each image process may use the entire image obtained by driving the imaging device, and the concept is the same as that of the fixed imaging device except that the number of pixels is different.
- the imaging device 25 disposed at the lower part of the indoor unit is configured to be covered by a part of the indoor unit. Furthermore, even when the imaging device 25 is not covered by a part of the indoor unit and the imaging sensor unit remains exposed while operation is stopped, or is protected by a transparent cover, it goes without saying that a driven imaging sensor unit can perform the operations described in the present invention without making the resident feel uncomfortable.
- one imaging device 25 may be arranged near the center of the indoor unit.
- the air conditioner according to the present invention is particularly useful as an air conditioner for general households because it can suppress a decrease in the recognition performance of the image sensor and give the resident a sense of security.
- 2 indoor unit body, 2a front opening, 2b top opening, 4 movable front panel, 6 heat exchanger, 8 indoor fan, 10 air outlet, 12 upper and lower blades, 14 left and right blades, 16 filter, 18, 20 front panel arms, 24, 26 imaging sensor unit, 28 light projecting unit, 25 imaging device, 51 circuit board, 52 lens, 53 imaging sensor, 54 support (sensor holder), 55 horizontal rotation shaft, 56 vertical rotation shaft, 57 motor for horizontal rotation, 58 motor for vertical rotation.
Abstract
Description
An air conditioner used in ordinary homes usually consists of an outdoor unit and an indoor unit connected to each other by refrigerant piping; FIGS. 1 to 3B show the indoor unit of an air conditioner according to the present invention.

As shown in FIG. 1, an imaging sensor unit 24 incorporated in an imaging device 25 is provided at the lower part of both left and right ends of the main body as viewed from the front, or of one of them; this imaging device 25 will be described with reference to FIG. 4.

The method of estimating a person's position with the imaging sensor unit is described below. For simplicity, the case where the imaging sensor does not need to be driven, that is, where it is fixed (assuming its field of view is sufficient both horizontally and vertically), is described.

Obstacle detection is performed using the imaging sensor unit 24 described above; this obstacle detection means will now be described. The term "obstacle" used in this specification refers generally to anything that blocks the flow of air blown out from the outlet 10 of the indoor unit to provide a comfortable space for the residents; it is a collective term for things other than residents, such as furniture (tables, sofas, etc.), televisions, and audio equipment.
Area B: positions B1 + B2
Area C: positions C1 + C2
Area D: positions D1 + D2
Area E: positions E1 + E2
Area F: positions F1 + F2
Area G: positions G1 + G2
In the area division of FIG. 23, the number of position areas is set larger than the number of person position determination areas: at least two positions belong to each person position determination area, and these at least two obstacle position determination areas are arranged left and right as viewed from the indoor unit. However, the areas may also be divided so that at least one position belongs to each person position determination area, and air-conditioning control performed accordingly.

As described above, the air conditioner according to the present invention detects the presence or absence of a person in areas A to G by the human body detection means, detects the presence or absence of obstacles at positions A1 to G2 by the obstacle detection means, and drives and controls the upper and lower blades 12 and the left and right blades 14, which serve as the wind direction changing means, based on the detection signals (detection results) of both means, thereby providing a comfortable space.
As described above, depending on the subject, the stereo method is more likely to fail in obstacle detection, for example when detecting an obstacle with no luminance variation.

Middle distance: 0.3
Long distance: 0.2

Since this obstacle detection operation is performed every time the air conditioner is run, "5" or "-1" is repeatedly added to the second memory. The value recorded in the second memory is therefore limited to a maximum of "10" and a minimum of "0".
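The second-memory update described here is a saturating counter, which can be sketched as follows (function and parameter names are ours):

```python
def update_confidence(value: int, detected: bool) -> int:
    """Sketch of the second-memory update: add 5 when an obstacle is detected
    on a run, subtract 1 when it is not, clamped to the range [0, 10]."""
    value += 5 if detected else -1
    return max(0, min(10, value))
```

Clamping keeps repeated runs from driving the stored value without bound, so a few consistent detections saturate the memory at 10 while many obstacle-free runs drain it back to 0.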
Based on the above determination of the presence or absence of obstacles, the upper and lower blades 12 and the left and right blades 14 serving as the wind direction changing means are controlled during heating as follows.

Block R: areas B, E
Block C: areas C, F
Block L: areas D, G

Areas A to G also belong to the following fields.

Field 2: areas B, D
Field 3: area C
Field 4: areas E, G
Field 5: area F

Further, the distances from the indoor unit are defined as follows.

Middle distance: areas B, C, D
Long distance: areas E, F, G
Table 7 shows the target setting angles at each position for the five left blades and five right blades constituting the left and right blades 14; as shown in FIG. 28, the sign attached to each number (angle) is defined as plus (+, unmarked in Table 7) when the left or right blade faces inward and minus (-) when it faces outward.
- Excessively high temperature: for example, 56 °C or more
- Wind direction control that keeps air off the residents: angle control of the upper and lower blades 12 so that the air flows along the ceiling rather than into the living space
- Wind direction control at the maximum-airflow position: since bending the airflow with the upper and lower blades 12 and the left and right blades 14 always produces resistance (loss), the maximum-airflow position is the wind direction at which the loss approaches zero (for the left and right blades 14, facing straight ahead; for the upper and lower blades 12, facing 35 degrees downward from horizontal)
Table 8 shows the target setting angles of the upper and lower blades 12 in each field when obstacle avoidance control is performed. The upper blade angle (γ1) and lower blade angle (γ2) in Table 8 are angles measured upward from the vertical (elevation angles).
A. Upper and lower blade control
(1) When there is a person in any of areas B to G and there is an obstacle at positions A1 to A3 in front of the area where the person is
The set angles of the upper and lower blades 12 are corrected as shown in Table 11 relative to the normal field wind direction control (Table 8), and airflow control is performed with the upper and lower blades 12 set upward.
Otherwise, normal automatic wind direction control is performed.
B. Left and right blade control
B1. When there is a person in area A (short distance)
(1) When there is one obstacle-free position in area A
The blades are swung left and right around the target setting angle of the obstacle-free position to perform the first airflow control. For example, when there are obstacles at positions A1 and A3 and none at position A2, the blades swing left and right around the target setting angle of position A2; basically the obstacle-free position A2 is air-conditioned, but since people may also be at positions A1 and A3, the swing motion distributes some airflow to those positions as well.
The first airflow control is performed by swinging between the target setting angles of two obstacle-free positions as both ends, basically air-conditioning the obstacle-free positions.
Block stop operation is performed between the target setting angles of two obstacle-free positions as both ends, performing the second airflow control.
Since it is unclear where to aim, block N is operated in block stop mode and the second airflow control is performed. This is because block stop operation gives a more directional airflow that reaches farther than aiming at the whole area, and is more likely to avoid obstacles. That is, even when obstacles are scattered in area A, there are usually gaps between them, through which air can be blown.
Normal automatic wind direction control of area A is performed.
(1) When only one of the two positions belonging to the area where the person is has an obstacle
The first airflow control is performed by swinging left and right around the target setting angle of the obstacle-free position. For example, when there is a person in area D and an obstacle only at position D2, the blades swing left and right around the target setting angle of position D1.
The block including the area where the person is present is operated in block stop mode and the second airflow control is performed. For example, when there is a person in area D and obstacles at both positions D1 and D2, block L is operated in block stop mode.
Normal automatic wind direction control of the area where the person is present is performed.
(1) When only one of the two positions belonging to the middle-distance area in front of the area where the person is has an obstacle (example: a person in area E, an obstacle at position B2, no obstacle at position B1)
(1.1) When there are no obstacles on either side of the position with the obstacle (example: no obstacles at positions B1 and C1)
(1.1.1) When there is no obstacle behind the position with the obstacle (example: no obstacle at position E2)
Position stop operation is performed around the position with the obstacle and the second airflow control is performed. For example, when there is a person in area E, an obstacle at position B2, and no obstacles on either side of it or behind it, the airflow can be sent into area E while avoiding the obstacle at position B2 from the sides.
The first airflow control is performed by swinging around the target setting angle of an obstacle-free position in the middle-distance region. For example, when there is a person in area E and an obstacle at position B2, with no obstacles on either side of it but an obstacle behind it, it is advantageous to send the airflow from the obstacle-free position B1.
The first airflow control is performed by swinging around the target setting angle of the obstacle-free position. For example, when there is a person in area F, an obstacle at position C2, an obstacle at position D1 of C2's two neighbors and none at C1, the airflow can be sent to area F from the obstacle-free position C1 while avoiding the obstacle at position C2.
The block including the area where the person is present is operated in block stop mode and the second airflow control is performed. For example, when there is a person in area F and obstacles at both positions C1 and C2, block C is operated in block stop mode. In this case, there is an obstacle in front of the person and no way to avoid it, so the block stop operation is performed regardless of whether there are obstacles in the blocks adjacent to block C.
(3.1) When only one of the two positions belonging to the area where the person is has an obstacle
The first airflow control is performed by swinging around the target setting angle of the other, obstacle-free position. For example, when there is a person in area F, no obstacles at positions C1, C2 and F1, and an obstacle at position F2, the space in front of area F is open, so air conditioning is centered on the obstacle-free far position F1 in consideration of the far-distance obstacle.
The block including the area where the person is present is operated in block stop mode and the second airflow control is performed. For example, when there is a person in area G, no obstacles at positions D1 and D2, and obstacles at both positions G1 and G2, the space in front of area G is open but the whole area contains obstacles; since it is unclear where to aim, block L is operated in block stop mode.
Normal automatic wind direction control of the area where the person is present is performed.
This obstacle avoidance control basically avoids the areas judged by the obstacle detection means to contain obstacles and blows air toward areas judged obstacle-free; specific examples are described below.
A. Upper and lower blade control
(1) When there is an obstacle in area A (short distance)
When warm air, which becomes lighter and rises during heating, is suppressed by directing the upper and lower blades 12 fully downward, an obstacle in area A may cause the warm air to accumulate behind the obstacle (on the indoor unit side) or to hit the obstacle and fail to reach the floor.
B. Left and right blade control
(1) When there is an obstacle in any of areas B, C, D (middle distance)
Air conditioning concentrates on the obstacle-free directions. For example, when an obstacle is detected in area C (the center of the room), the blocks containing the obstacle-free areas B and D on both sides are alternately operated in block stop mode, so that the obstacle-free areas (where people are more likely to be) are air-conditioned preferentially.
When a person and a wall exist in the same area, the person is necessarily located in front of and close to the wall; during heating, warm air tends to stay near the wall and the room temperature there tends to be higher than elsewhere, so human wall proximity control is performed.
Front: [66,15] to [90,15], [66,21] to [90,21], [66,27] to [90,27]
Right end: [138,15], [142,15], [138,21], [142,21], [138,27], [142,27]
In determining the distance numbers from the indoor unit to the front wall WC, the left wall WL, and the right wall WR, wall surface data is first extracted at each of the above pixels, as shown in Table 13.
A. When a person is in the short-distance or middle-distance area
The short-distance and middle-distance areas are close to the indoor unit and small in size, so the room temperature there rises to a high degree; the set temperature selected on the remote control is therefore lowered by a first predetermined temperature (for example, 2 °C).
B. When a person is in the long-distance area
The long-distance area is far from the indoor unit and large in size, so the room temperature rises less than in the short-distance or middle-distance areas; the set temperature selected on the remote control is therefore lowered by a second predetermined temperature (for example, 1 °C) smaller than the first predetermined temperature.
Claims (6)
- An air conditioner in which an indoor unit is provided with human body detection means for detecting the presence or absence of a person and obstacle detection means for detecting the presence or absence of an obstacle, and wind direction changing blades provided on the indoor unit are controlled based on the detection results of the human body detection means and the obstacle detection means, wherein the human body detection means and the obstacle detection means are realized by an imaging device that is either fixed or driven, and the imaging device is covered by a part of the indoor unit while operation of the air conditioner is stopped.
- The air conditioner according to claim 1, wherein, when the imaging device is of the driven type, the imaging device is set to face the same direction at the start of operation of the air conditioner.
- The air conditioner according to claim 1 or 2, wherein the same direction is the front of the indoor unit.
- The air conditioner according to claim 1 or 2, wherein the same direction is such that the optical axis of the imaging device is substantially perpendicular to the installation surface, facing forward, as the indoor unit is viewed from above.
- The air conditioner according to any one of claims 1 to 4, wherein the orientation of the imaging device is changeable within predetermined vertical and horizontal angle ranges, and the same direction is the upper limit position or the lower limit position of the vertical angle range.
- The air conditioner according to any one of claims 1 to 5, wherein the part of the indoor unit is a movable front panel that opens and closes a front opening of the indoor unit, or an upper and lower wind direction changing blade that opens and closes an outlet blowing air into the room and changes the blowing direction up and down.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201080045430.5A CN102575865B (zh) | 2009-10-07 | 2010-10-05 | 空气调节机 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009-233319 | 2009-10-07 | ||
JP2009233319A JP5454065B2 (ja) | 2009-10-07 | 2009-10-07 | 空気調和機 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011043056A1 true WO2011043056A1 (ja) | 2011-04-14 |
Family
ID=43856540
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/005948 WO2011043056A1 (ja) | 2009-10-07 | 2010-10-05 | 空気調和機 |
Country Status (4)
Country | Link |
---|---|
JP (1) | JP5454065B2 (ja) |
KR (1) | KR20120093203A (ja) |
CN (1) | CN102575865B (ja) |
WO (1) | WO2011043056A1 (ja) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5709701B2 (ja) * | 2011-09-07 | 2015-04-30 | 三菱電機株式会社 | 空気調和装置の室内機及びそれを備えた空気調和装置 |
JP5886156B2 (ja) * | 2012-07-20 | 2016-03-16 | 日立アプライアンス株式会社 | 空気調和機 |
JP6249647B2 (ja) * | 2013-06-18 | 2017-12-20 | 三菱電機株式会社 | 空気調和機の室内機 |
JP6428144B2 (ja) * | 2014-10-17 | 2018-11-28 | オムロン株式会社 | エリア情報推定装置、エリア情報推定方法、および空気調和装置 |
CN105180344B (zh) * | 2015-07-03 | 2018-12-21 | 珠海格力电器股份有限公司 | 空调器及空调器的控制方法 |
JP7049167B2 (ja) * | 2018-04-20 | 2022-04-06 | 三菱重工サーマルシステムズ株式会社 | 空気調和機 |
CN112665160B (zh) * | 2020-12-21 | 2022-01-28 | 珠海格力电器股份有限公司 | 空调器的控制方法和空调器 |
US20220397338A1 (en) * | 2021-06-14 | 2022-12-15 | Haier Us Appliance Solutions, Inc. | Inventory management system in a refrigerator appliance |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008224099A (ja) * | 2007-03-12 | 2008-09-25 | Mitsubishi Electric Corp | 空気調和装置 |
JP2009174830A (ja) * | 2008-01-28 | 2009-08-06 | Sharp Corp | 人物位置検出装置および空気調和機 |
JP4503093B1 (ja) * | 2009-06-24 | 2010-07-14 | パナソニック株式会社 | 空気調和機 |
JP2010185590A (ja) * | 2009-02-10 | 2010-08-26 | Panasonic Corp | 空気調和機 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0372249U (ja) * | 1989-11-16 | 1991-07-22 | ||
JP2849467B2 (ja) * | 1990-03-12 | 1999-01-20 | 三菱電機株式会社 | 空気調和機 |
ID19087A (id) * | 1996-09-12 | 1998-06-11 | Samsung Electronics Co Ltd | Alat kontrol arus angin dari dari mesin penyejuk udara dan metoda kerjanya |
WO2009098849A1 (ja) * | 2008-02-08 | 2009-08-13 | Panasonic Corporation | 空気調和機 |
2009
- 2009-10-07 JP JP2009233319A patent/JP5454065B2/ja not_active Expired - Fee Related
2010
- 2010-10-05 WO PCT/JP2010/005948 patent/WO2011043056A1/ja active Application Filing
- 2010-10-05 KR KR1020127008960A patent/KR20120093203A/ko not_active Application Discontinuation
- 2010-10-05 CN CN201080045430.5A patent/CN102575865B/zh not_active Expired - Fee Related
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008224099A (ja) * | 2007-03-12 | 2008-09-25 | Mitsubishi Electric Corp | 空気調和装置 |
JP2009174830A (ja) * | 2008-01-28 | 2009-08-06 | Sharp Corp | 人物位置検出装置および空気調和機 |
JP2010185590A (ja) * | 2009-02-10 | 2010-08-26 | Panasonic Corp | 空気調和機 |
JP4503093B1 (ja) * | 2009-06-24 | 2010-07-14 | パナソニック株式会社 | 空気調和機 |
Also Published As
Publication number | Publication date |
---|---|
CN102575865A (zh) | 2012-07-11 |
KR20120093203A (ko) | 2012-08-22 |
JP5454065B2 (ja) | 2014-03-26 |
JP2011080687A (ja) | 2011-04-21 |
CN102575865B (zh) | 2014-08-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5454065B2 (ja) | 空気調和機 | |
JP5402488B2 (ja) | 空気調和機 | |
WO2011043054A1 (ja) | 空気調和機 | |
JP2011080621A (ja) | 空気調和機 | |
JP5402487B2 (ja) | 空気調和機 | |
JP5405901B2 (ja) | 空気調和機 | |
JP2012042074A (ja) | 空気調和機 | |
JP5267408B2 (ja) | 空気調和機 | |
JP2013024534A (ja) | 状況認識装置 | |
JP5487867B2 (ja) | 空気調和機 | |
JP5488297B2 (ja) | 空気調和機 | |
JP2011080685A (ja) | 空気調和機 | |
JP5487869B2 (ja) | 空気調和機 | |
JP5254881B2 (ja) | 空気調和機 | |
JP5126189B2 (ja) | 空気調和機 | |
JP5388726B2 (ja) | 空気調和機 | |
JP2011064337A (ja) | 空気調和機 | |
JP5420302B2 (ja) | 空気調和機 | |
JP5405925B2 (ja) | 空気調和機 | |
JP5426917B2 (ja) | 空気調和機 | |
JP2011017459A (ja) | 空気調和機 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080045430.5 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10821733 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 3092/CHENP/2012 Country of ref document: IN |
|
ENP | Entry into the national phase |
Ref document number: 20127008960 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1201001604 Country of ref document: TH |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 10821733 Country of ref document: EP Kind code of ref document: A1 |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112012010809 Country of ref document: BR |
|
ENPW | Started to enter national phase and was withdrawn or failed for other reasons |
Ref document number: 112012010809 Country of ref document: BR |