WO2012011400A1 - Output control device - Google Patents

Output control device

Info

Publication number
WO2012011400A1
Authority
WO
WIPO (PCT)
Prior art keywords
output
image
camera
control device
scene image
Prior art date
Application number
PCT/JP2011/065760
Other languages
French (fr)
Japanese (ja)
Inventor
本郷 仁志
石井 洋平
圭介 淺利
長輝 楊
Original Assignee
三洋電機株式会社 (Sanyo Electric Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 三洋電機株式会社 (Sanyo Electric Co., Ltd.)
Publication of WO2012011400A1

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24: HEATING; RANGES; VENTILATING
    • F24F: AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F11/00: Control or safety arrangements
    • F24F11/70: Control systems characterised by their outputs; Constructional details thereof
    • F24F11/72: Control systems characterised by their outputs; Constructional details thereof for controlling the supply of treated air, e.g. its pressure
    • F24F11/79: Control systems characterised by their outputs; Constructional details thereof for controlling the supply of treated air, e.g. its pressure for controlling the direction of the supplied air
    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24: HEATING; RANGES; VENTILATING
    • F24F: AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F2120/00: Control inputs relating to users or occupants
    • F24F2120/10: Occupancy
    • F24F2120/12: Position of occupants

Definitions

  • the present invention relates to an output control device, and more particularly to an output control device that detects a specific object existing in space and controls the output of the device.
  • the sensor unit measures at least the temperature of a person or the like in the detection area.
  • the image processing unit extracts a feature amount by processing a thermal image having a detected temperature, which is an output from the sensor unit, as a pixel value.
  • based on the signal from the image processing unit, the image information detection unit uses a neural network or a pattern recognition mechanism to output a human personal information signal, an environmental information signal, or the like that is equivalent to or more detailed than the output signal from the sensor unit.
  • the operation of the air conditioner is controlled based on such a signal.
  • a main object of the present invention is to provide an output control device capable of reducing the load applied to the initial setting work and / or the maintenance work.
  • the output control device comprises the following: search means for searching for one or more specific objects present in a space based on an object scene image output from a camera that captures the space; control means for controlling the output operations of a plurality of devices that generate outputs toward the space, based on the positional relationship between the one or more specific objects found by the search means and the plurality of devices; reproduction means for reproducing the object scene image output from the camera on a monitor screen in parallel with the control processing of the control means; and multiplexing means for multiplexing state information indicating the operating states of the plurality of devices onto the object scene image reproduced by the reproduction means.
  • selection means for alternatively selecting one of a plurality of modes including a trial operation mode, and activation means for activating the multiplexing means in response to the trial operation mode, are further provided.
  • the camera has an imaging surface defined along the UV coordinate system
  • the space has a plane defined along the XY coordinate system
  • the control means includes calculating means for calculating the XY coordinates of a specific object with reference to a calibration parameter indicating the correspondence between the UV coordinate system and the XY coordinate system and to the image found by the search means, and detecting means for detecting the positional relationship with reference to the XY coordinates calculated by the calculating means.
  • definition means for defining the multiplexing position of the state information with reference to the calibration parameter is further provided.
  • each of the plurality of devices corresponds to an air conditioning control device, and the state information has at least one of temperature, humidity, air volume, wind direction, and power consumption as parameters.
  • the state information has, as parameters, the position and/or extent reached by the output of each of the plurality of devices.
  • transmission means for transmitting the object scene image and the state information to the mobile communication terminal is further provided, and the monitor screen corresponds to the monitor screen of the mobile communication terminal.
  • the output control device comprises the following: search means for searching for one or more specific objects present in a space based on an object scene image output from a camera that captures the space; reproduction means for reproducing the object scene image output from the camera on a monitor screen in parallel with the search processing of the search means; detecting means for detecting the operating states of a plurality of devices that generate different outputs toward the space according to the positional relationship with the one or more specific objects found by the search means; and multiplexing means for multiplexing state information indicating the operating states detected by the detecting means onto the object scene image reproduced by the reproduction means.
  • the plurality of devices generate outputs toward the space, a specific object present in the space is searched for based on the object scene image output from the camera capturing the space, and the output operations of the plurality of devices are controlled based on the positional relationship between the specific object found by the search processing and the plurality of devices.
  • adaptive output control in consideration of the position of the specific object is realized.
  • the object scene image output from the camera is reproduced on the monitor screen in parallel with the output control of the plurality of devices, and the state information indicating the operating states of the plurality of devices is multiplexed onto the reproduced object scene image. Whether or not the outputs of the plurality of devices are adaptively controlled can be determined based on the multiplexed state information. As a result, the load on the initial setting work and/or the maintenance work can be reduced.
  • the object scene image output from the camera is reproduced on the monitor screen together with the state information indicating the operation states of the plurality of devices. Whether or not the outputs of a plurality of devices are adaptively controlled in consideration of the positional relationship with a specific object existing in space can be determined based on the multiplexed state information. As a result, the load on the initial setting work and / or the maintenance work can be reduced.
  • FIG. 7 is an illustrative view showing an example of the configuration of the representative point register applied to the FIG. 2 embodiment.
  • FIG. 8 is an illustrative view showing an example of the configuration of the weighting amount register applied to the FIG. 2 embodiment.
  • FIG. 9(A) is an illustrative view showing an example of the allocation of divided areas on the map image, and FIG. 9(B) is an illustrative view showing an example of the allocation of measurement areas on the camera image.
  • FIG. 10(A) is an illustrative view showing an example of a graphic image multiplexed on the map image, and FIG. 10(B) is an illustrative view showing an example of a graphic image multiplexed on the camera image.
  • FIG. 11 is an illustrative view showing an example of a camera image.
  • FIGS. 12(A), 12(B), and 12(C) are illustrative views showing examples of the graphic image drawn for the air volumes “small”, “medium”, and “large”, respectively.
  • FIGS. 13(A) and 13(B) are illustrative views showing examples of graphic images multiplexed on the camera image.
  • FIGS. 14 to 20 are flowcharts showing portions of the operation of the CPU applied to the FIG. 2 embodiment.
  • FIG. 21(A) is an illustrative view showing another example of a graphic image multiplexed on the map image, and FIG. 21(B) is an illustrative view showing another example of a graphic image multiplexed on the camera image.
  • FIG. 22 is an illustrative view showing an example of the object scene image and graphic image displayed on the monitor screen of another embodiment.
  • the output control device of one embodiment of the present invention is basically configured as follows.
  • the search means 1a searches for one or more specific objects existing in the space based on the object scene image output from the camera 5a capturing the space.
  • the control means 2a controls the output operations of the plurality of devices 6a, 6a, ..., which generate outputs toward the space, based on the positional relationship between the one or more specific objects found by the search means 1a and the plurality of devices 6a, 6a, ...
  • the reproduction unit 3a reproduces the object scene image output from the camera 5a on the monitor screen 7a in parallel with the control process of the control unit 2a.
  • the multiplexing unit 4a multiplexes the state information indicating the operation states of the plurality of devices 6a, 6a,... On the object scene image reproduced by the reproduction unit 3a.
  • the plurality of devices 6a, 6a, ... generate outputs toward the space, a specific object present in the space is searched for based on the object scene image output from the camera 5a capturing the space, and the output operations of the plurality of devices 6a, 6a, ... are controlled based on the positional relationship between the specific object found by the search processing and the plurality of devices 6a, 6a, ... Thereby, adaptive output control in consideration of the position of the specific object is realized.
  • the object scene image output from the camera 5a is reproduced on the monitor screen 7a in parallel with the output control of the plurality of devices 6a, 6a, ..., and state information indicating the operating states of the plurality of devices 6a, 6a, ... is multiplexed onto the reproduced object scene image. Whether or not the outputs of the plurality of devices 6a, 6a, ... are adaptively controlled can be determined based on the multiplexed state information. As a result, the load on the initial setting work and/or the maintenance work can be reduced.
  • the output control device of another embodiment is basically configured as follows.
  • the search means 1b searches for one or more specific objects existing in the space based on the object scene image output from the camera 5b that captures the space.
  • the reproduction unit 2b reproduces the object scene image output from the camera 5b on the monitor screen 6b in parallel with the search process of the search unit 1b.
  • the detection means 3b detects the operating states of the plurality of devices 7b, 7b, ..., which generate different outputs toward the space according to the positional relationship with the one or more specific objects found by the search means 1b.
  • the multiplexing unit 4b multiplexes the state information indicating the operation state detected by the detection unit 3b on the object scene image reproduced by the reproduction unit 2b.
  • the object scene image output from the camera 5b is reproduced on the monitor screen together with state information indicating the operating states of the plurality of devices 7b, 7b, ... Whether or not the outputs of the plurality of devices 7b, 7b, ... are adaptively controlled in consideration of the positional relationship with a specific object present in the space can be determined based on the multiplexed state information. As a result, the load on the initial setting work and/or the maintenance work can be reduced.
  • the air conditioning control device 10 of this embodiment includes a camera 12 that repeatedly outputs image data representing an object scene (three-dimensional space) captured on the imaging surface.
  • the image data output from the camera 12 is subjected to display processing by the image processing circuit 14.
  • the processed image data is given to the monitor 16 and transmitted toward the mobile communication terminal 30 via the communication I / F 20.
  • An image representing the object scene, that is, a camera image is displayed on each screen of the monitor 16 and the portable communication terminal 30.
  • room RM1 is partitioned by floor surface FL1 and ceiling HV1 and four wall surfaces WL1 to WL4.
  • the camera 12 is installed at the top center of the wall surface WL1 in its width direction, and captures the internal space of the room RM1 obliquely from above. Therefore, the camera image is displayed on the monitor screen as shown in FIG. 4.
  • the internal space of the room RM1 is defined by the X, Y, and Z axes that are orthogonal to each other, and the imaging surface of the camera 12 is defined by the U and V axes that are orthogonal to each other.
  • the air conditioners D_1 to D_6 are installed on the ceiling HV1 at predetermined intervals. Each of the air conditioners D_1 to D_6 outputs air having a specified temperature and humidity at a specified air volume and in a specified direction, and the temperature and humidity of the room RM1 are adjusted by the air thus output.
  • when the area setting mode is selected by operating the input device 18, the following processing is executed by the CPU 14p.
  • in the area setting mode, the camera 12 is in a stopped state.
  • the map image shown in FIG. 5 is displayed on the monitor 16 and corresponds to an image that schematically represents a bird's-eye view of the floor surface FL1.
  • marks M_1 to M_6 respectively representing air conditioners D_1 to D_6 are displayed corresponding to the positions of the air conditioners D_1 to D_6.
  • variable K is set to “1”.
  • when the mark M_K is clicked with the mouse pointer of the input device 18, the coordinates of the clicked position, that is, the click coordinates, are calculated.
  • the calculated click coordinates are described in the area register 14r1 shown in FIG. 6 corresponding to the variable K, and the variable K is incremented thereafter.
  • the click operation is accepted a total of six times corresponding to the marks M_1 to M_6, and thereby six click coordinates respectively corresponding to the marks M_1 to M_6 are set in the area register 14r1.
  • the map image is divided in the manner shown in FIG. 9A with the six click coordinates thus set as a reference.
  • the boundary lines BL_1 to BL_3 are drawn on the map image so as to surround the marks M_1 to M_6.
  • the divided areas MP_1 to MP_6 are allocated around the marks M_1 to M_6, and the internal space of the room RM1 is divided into a plurality of small spaces respectively corresponding to the divided areas MP_1 to MP_6.
  • in the area register 14r1, a plurality of XY coordinates defining the divided area MP_K (K: 1 to 6) are described.
  • each of a plurality of XY coordinates that define the divided area MP_K is converted into UV coordinates according to Equation 1.
  • the calibration parameters P11 to P33 shown in Equation 1 correspond to a matrix for performing planar projective transformation between the XY coordinate system that defines the plane FL1 and the imaging plane of the camera 12, that is, the UV coordinate system that defines the camera image. Therefore, by applying the desired XY coordinates to Equation 1, the corresponding UV coordinates on the camera image are calculated.
  • the plurality of UV coordinates thus converted are described in the area register 14r1 corresponding to the plurality of XY coordinates of the conversion source.
  • the measurement area DT_K corresponding to the divided area MP_K is defined on the camera image in the manner shown in FIG. 9B.
  • the drawing position of the graphic image G_K indicating the operating status of the air conditioner D_K is defined on the map image.
  • the graphic image G_K is represented by a plurality of circles drawn around the XY coordinates of the air conditioner D_K. Accordingly, the drawing position of the graphic image G_K is defined such that the graphic image G_K is drawn on the map image in the manner shown in FIG. 10(A). A plurality of XY coordinates indicating the defined drawing position are described in the area register 14r1 corresponding to the variable K.
  • the described XY coordinates are then converted into UV coordinates according to Equation 1.
  • the drawing position of the graphic image G_K is thereby defined on the camera image, and the graphic image G_K is drawn on the camera image in the manner shown in FIG. 10(B).
  • a plurality of UV coordinates indicating the drawing position defined on the camera image are also described in the area register 14r1 corresponding to the variable K.
  • when the congestion degree measurement mode is selected by operating the input device 18, the camera 12 is activated, and the following processing is executed by the CPU 14p every time the measurement cycle arrives.
  • a person image is searched from a camera image by pattern matching or motion detection.
  • the variable L is set to each of “1” to “Lmax” (Lmax: the total number of person images found), and the representative point of the L-th person image among the found person images is determined as “RP_L”.
  • the XY coordinates of the determined representative point RP_L are described in the representative point register 14r2 shown in FIG. 7.
  • the representative point RP_1 is determined on the image representing the person H1, the representative point RP_2 is determined on the image representing the person H2, and the representative point RP_3 is determined on the image representing the person H3.
  • the variable L is set to each of “1” to “Lmax”, the variable K is set to each of “1” to “6”, and the distance from the representative point RP_L to the air conditioner D_K is measured as “DS_K”.
  • the positional relationship between each of the Lmax persons and the air conditioners D_1 to D_6 is clarified by the distances DS_1 to DS_6 thus measured.
  • “a_K” and “b_K” are variables that depend on the current output of the air conditioner D_K and the indoor environment.
  • the function value f(D_K) is defined by the variables a_K and b_K and the distance DS_K.
  • the weighting amount W_K is derived by subtracting from “1” the ratio of the function value f(D_K) to the sum of the function values f(D_1) to f(D_6) respectively corresponding to the air conditioners D_1 to D_6.
  • the variables a_K and b_K are adjusted so that the sum of the weighting amounts W_1 to W_6 is “1.0”.
  • the variable K is set to each of “1” to “6”, and the congestion degree CR_K is calculated.
  • the congestion degree CR_K corresponds to the sum of the weighting amounts assigned to the divided area MP_K (a minimal sketch of this computation is given after this list).
  • two modes, a test operation mode and a normal operation mode, are prepared as modes for operating the air conditioners D_1 to D_6.
  • the test operation mode is a mode for checking whether or not the output operations of the air conditioners D_1 to D_6 are adaptively controlled under the congestion degree measurement task, after installation of the air conditioning control device 10 and setting of the divided areas MP_1 to MP_6 are completed.
  • the normal operation mode is a mode selected in a normal use stage after the adaptive control of the air conditioners D_1 to D_6 is confirmed by the test operation mode.
  • the trial operation mode and the normal operation mode are alternatively selected by operating the input device 18.
  • the variable K is set to each of “1” to “6”, and the operating status (temperature, humidity, air volume, and wind direction) of the air conditioner D_K is detected.
  • the graphic image G_K is multiplexed with the camera image in a drawing manner according to the detected operating situation.
  • the drawing position is determined with reference to the description of the area register 14r1.
  • the drawing mode of the graphic image G_K changes in the manner shown in FIGS. 12(A) to 12(C) according to the air volume of the air conditioner D_K.
  • the graphic image G_K is drawn in the manner shown in FIG. 12(A) for the air volume “small”, in the manner shown in FIG. 12(B) for the air volume “medium”, and in the manner shown in FIG. 12(C) for the air volume “large”. Further, the hue of the graphic image G_K changes from “blue” to “red” as the set temperature of the air conditioner D_K increases, and the saturation of the graphic image G_K increases as the set humidity of the air conditioner D_K increases (an illustrative mapping is sketched near the end of the description below).
  • the shape of the graphic image G_K is distorted along the wind direction.
  • the air volume of the air conditioner D_4 is set to “large”, and the air volumes of the air conditioners D_1, D_2, and D_5 are set to “medium”.
  • the air volumes of the air conditioners D_3 and D_6 are set to “small”.
  • the graphic images G_1 to G_6 are drawn in a manner showing such air volume setting.
  • the camera images and graphic images G_1 to G_6 are also displayed on the screen of the mobile communication terminal 30. Therefore, by carrying the mobile communication terminal 30 and moving in the room RM1, the person H1 can easily confirm whether or not the output operations of the air conditioners D_1 to D_6 are adaptively controlled. Thereby, the efficiency of the initial setting work and the maintenance work can be improved.
  • if the orientation of the camera 12 shifts, the drawing positions of the graphic images G_1 to G_6 also shift. Therefore, the person H1 can recognize that the orientation of the camera 12 has shifted by moving through the space while viewing the graphic images G_1 to G_6.
  • the CPU 14p executes a plurality of tasks including a main task shown in FIG. 14, an area setting task shown in FIGS. 15 to 16, a congestion degree measurement task shown in FIGS. 17 to 19, and a display control task shown in FIG. 20. Control programs corresponding to these tasks are stored in the recording medium 22.
  • in step S1, it is determined whether or not the current operation mode is the area setting mode, and in step S5, it is determined whether or not the current operation mode is the congestion degree measurement mode.
  • if “YES” in step S1, the area setting task is started in step S3, and thereafter the process proceeds to step S15. If “YES” in step S5, it is determined in step S7 whether or not the divided areas MP_1 to MP_6 have been set. If the determination result is YES, the congestion degree measurement task and the display control task are started in steps S9 and S11, and then the process proceeds to step S15; if the determination result is NO, the process proceeds directly to step S15. If both steps S1 and S5 give NO, another process is executed in step S13, and then the process proceeds to step S15.
  • in step S15, it is repeatedly determined whether or not a mode change operation has been performed.
  • when the determination result is updated from NO to YES, the activated task is terminated in step S17, and thereafter the process returns to step S1.
  • in step S21, a map image is displayed on the monitor 16, and in step S23, the variable K is set to “1”.
  • in step S25, it is determined whether or not a click operation for designating an area has been performed. If the determination result is updated from NO to YES, the click coordinates are calculated in step S27. The calculated coordinates are described in the area register 14r1 corresponding to the variable K.
  • in step S29, it is determined whether or not the variable K has reached “6”. If the determination result is NO, the variable K is incremented in step S31 and the process returns to step S25; if the determination result is YES, the process proceeds to step S33.
  • in step S33, the map image is divided with reference to the click coordinates. As a result, six divided areas MP_1 to MP_6 are allocated on the map image. In step S35, boundary lines that demarcate the divided areas MP_1 to MP_6 are drawn on the map image.
  • in step S37, the variable K is set to “1”, and in step S39, a plurality of XY coordinates defining the divided area MP_K are calculated.
  • the calculated XY coordinates are described in the area register 14r1 corresponding to the variable K.
  • in step S41, each of the plurality of XY coordinates defining the divided area MP_K is converted into UV coordinates according to Equation 1.
  • the converted UV coordinates are described in the area register 14r1 corresponding to the variable K, whereby the measurement area DT_K corresponding to the divided area MP_K is assigned to the camera image.
  • in step S43, the drawing position of the graphic image G_K is defined on the map image.
  • a plurality of XY coordinates indicating the defined drawing position are described in the area register 14r1 corresponding to the variable K.
  • in step S45, the XY coordinates indicating the drawing position defined in step S43 are converted into UV coordinates according to Equation 1.
  • the drawing position of the graphic image G_K is defined on the camera image.
  • a plurality of UV coordinates indicating the drawing position defined in this way are also described in the area register 14r1 corresponding to the variable K.
  • in step S47, it is determined whether or not the variable K has reached “6”. If the determination result is NO, the variable K is incremented in step S49 and the process returns to step S39; if the determination result is YES, the process ends.
  • in step S51, it is determined whether or not the measurement cycle has arrived.
  • when the measurement cycle arrives, the process proceeds to step S53, and a person image is searched for in the camera image by pattern matching or motion detection.
  • in step S55, it is determined whether or not one or more person images have been found. If the determination result is NO, the process returns to step S51; if the determination result is YES, the process proceeds to step S57.
  • in step S57, the variable L is set to “1”.
  • in step S59, the representative point of the L-th person image among the one or more found person images is determined as “RP_L”, and in step S61, the XY coordinates of the representative point RP_L are calculated with reference to Equation 1 above.
  • the calculated XY coordinates are described in the representative point register 14r2 corresponding to the variable L.
  • in step S67, the variable L is set to “1” again, in step S69, the variable K is set to “1”, and in step S71, the distance from the representative point RP_L to the air conditioner D_K is measured as “DS_K”.
  • in step S73, it is determined whether or not the variable K has reached “6”. If the determination result is NO, the variable K is incremented in step S75 and the process returns to step S71; if the determination result is YES, the process proceeds to step S77. The positional relationship between the L-th person and the air conditioners D_1 to D_6 is clarified by the distances DS_1 to DS_6 thus measured.
  • in step S77, the variable K is set to “1” again.
  • in step S79, the weighting amount W_K corresponding to the divided area MP_K is calculated according to Equation 2. The calculated weighting amount W_K is described in the weighting amount register 14r3 corresponding to the variable K.
  • in step S81, it is determined whether or not the variable K has reached “6”. If the determination result is NO, the variable K is incremented in step S83 and the process returns to step S79; if the determination result is YES, the process proceeds to step S85.
  • in step S85, it is determined whether or not the variable L has reached the maximum value Lmax. If the determination result is NO, the variable L is incremented in step S87 and the process returns to step S69; if the determination result is YES, the process proceeds to step S89. In step S89, the variable K is set to “1”, and in step S91, the congestion degree CR_K is calculated. The congestion degree CR_K corresponds to the sum of the weighting amounts assigned to the divided area MP_K.
  • in step S93, it is determined whether or not the variable K has reached “6”. If the determination result is NO, the variable K is incremented in step S95 and the process returns to step S91; if the determination result is YES, the process proceeds to step S97.
  • in step S97, the outputs of the air conditioners D_1 to D_6 are controlled with reference to the congestion degrees CR_1 to CR_6 calculated in step S91. Specifically, the output of the air conditioner corresponding to a divided area with a high congestion degree is strengthened, and the output of the air conditioner corresponding to a divided area with a low congestion degree is weakened.
  • the camera 12 is activated in step S101, and the display of the camera image is started in step S103.
  • Image data representing the scene captured by the camera 12 is given to the monitor 16 and transmitted to the mobile communication terminal 30 via the communication I / F 20.
  • the camera image is displayed on the screen of the monitor 16 and the mobile communication terminal.
  • in step S105, it is determined whether the current mode is the normal operation mode or the test operation mode. If the current mode is the normal operation mode, the display control task is terminated; if the current mode is the test operation mode, the process proceeds to step S107.
  • in step S107, the variable K is set to “1”, and in step S109, the operating status (temperature, humidity, air volume, and wind direction) of the air conditioner D_K is detected.
  • in step S111, the graphic image G_K is multiplexed with the camera image corresponding to the measurement area DT_K. The drawing mode of the graphic image G_K corresponds to the detected operating status.
  • in step S113, it is determined whether or not the variable K has reached “6”. If the determination result is NO, the variable K is incremented in step S117 and the process returns to step S109; if the determination result is YES, the variable K is set to “1” in step S115 and the process then returns to step S109.
  • as described above, the CPU 14p searches for one or more persons present in the internal space of the room RM1 based on the object scene image output from the camera 12 that captures the internal space of the room RM1 (S53), and controls the output operations of the air conditioners D_1 to D_6, which generate outputs toward the internal space of the room RM1, based on the positional relationship between the one or more persons found by the search processing and the air conditioners D_1 to D_6 (S57 to S97).
  • the CPU 14p also reproduces the object scene image output from the camera 12 on the monitor screen of the monitor 16 or the mobile communication terminal 30 in parallel with the air conditioning control (S101 to S103), and multiplexes the graphic images G_1 to G_6, which indicate the operating states of the air conditioners D_1 to D_6, onto the object scene image reproduced on the monitor screen (S107 to S117).
  • the air conditioners D_1 to D_6 generate outputs toward the internal space of the room RM1, a person present in the internal space is searched for based on the object scene image output from the camera 12 that captures the internal space, and the output operations of the air conditioners D_1 to D_6 are controlled based on the positional relationship between the person found by the search processing and the air conditioners D_1 to D_6. Thereby, adaptive output control in consideration of the position of the person is realized.
  • the object scene image output from the camera 12 is reproduced on the monitor screen in parallel with the output control of the air conditioners D_1 to D_6, and the graphic images G_1 to G_6 indicating the operating states of the air conditioners D_1 to D_6 are multiplexed onto the reproduced object scene image. Whether or not the outputs of the air conditioners D_1 to D_6 are adaptively controlled can be determined based on the multiplexed graphic images G_1 to G_6. As a result, the load on the initial setting work and/or maintenance work can be reduced.
  • in this embodiment, the coordinates of the marks M_1 to M_6 are designated by clicking with the mouse pointer.
  • alternatively, the coordinate values of the marks M_1 to M_6 may be specified directly.
  • in this embodiment, the output of the air conditioners is adaptively controlled.
  • alternatively, the output (that is, the brightness) of lighting devices may be adaptively controlled. In this case, the drawing mode of the graphic image changes according to the brightness of the lighting device.
  • planar projective transformation with reference to Equation 1 is assumed, but perspective projective transformation may be performed instead.
  • in this embodiment, an image schematically representing a bird's-eye view of the floor surface FL1 is adopted as the map image.
  • alternatively, the map image may be generated by applying a bird's-eye conversion based on Equation 1 described above to the camera image.
  • in this embodiment, “temperature”, “humidity”, “air volume”, and “wind direction” are assumed as the operating status of the air conditioner D_K detected in step S109.
  • additionally, the power consumption of the air conditioner D_K may be included in the operating status.
  • in this embodiment, the graphic image G_K is multiplexed on the monitor screen in the manner shown in FIGS. 13(A) and 13(B).
  • alternatively, numerical values (for example, “1 m” and “3 m”) indicating the spread of the wind output from the air conditioner D_K may additionally be displayed, and the graphic image G_K may be displayed in a correspondingly modified manner.
  • in this embodiment, the object scene image captured by the camera 12 is displayed on the monitor screen from the viewpoint of the camera 12, and the shape and/or size of the graphic image G_K is adjusted to match the displayed object scene image (see FIG. 10(B)).
  • alternatively, the object scene image captured by the camera 12 may be converted into a bird's-eye view image and displayed on the monitor screen, and the shape and/or size of the graphic image G_K may be adjusted to match the bird's-eye view image.
  • in this case, the bird's-eye view image and the graphic image G_K are displayed on the monitor screen as shown in FIG. 22.
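The exact functional form of f(D_K) and the normalization behind Equation 2 are not reproduced in this text. The following is a minimal sketch under stated assumptions, namely an affine f(D_K) = a_K * DS_K + b_K and an explicit renormalization step so that the per-person weights sum to 1.0, illustrating how the weighting amounts W_1 to W_6 and the congestion degrees CR_1 to CR_6 described above could be computed; all numeric values are illustrative only.

```python
# Minimal sketch of the per-person weighting (Equation 2) and the congestion degree CR_K.
# Assumptions not fixed by the text: the form of f(D_K) and the renormalization step.

def weights_for_person(distances, a, b):
    """distances[k] = DS_(k+1): distance from the person's representative point RP_L
    to air conditioner D_(k+1); a[k] and b[k] play the role of a_K and b_K."""
    f = [a_k * d + b_k for a_k, b_k, d in zip(a, b, distances)]   # assumed form of f(D_K)
    total_f = sum(f)
    w = [1.0 - f_k / total_f for f_k in f]                        # Equation 2 as described
    total_w = sum(w)
    return [w_k / total_w for w_k in w]                           # enforce W_1 + ... + W_6 = 1.0

def congestion_degrees(per_person_weights, num_areas=6):
    """CR_K = the sum over all persons of the weighting amount assigned to divided area MP_K."""
    return [sum(w[k] for w in per_person_weights) for k in range(num_areas)]

# Two persons with illustrative distances (in meters) to D_1..D_6 and unit coefficients.
a = [1.0] * 6
b = [0.5] * 6
persons = [[2.0, 3.5, 5.0, 4.0, 4.5, 6.0],
           [5.5, 4.0, 2.5, 6.0, 4.5, 3.0]]
all_w = [weights_for_person(d, a, b) for d in persons]
print(congestion_degrees(all_w))   # a larger CR_K means the output of D_K is strengthened
```

With a distance-increasing f(D_K), the weight W_K is largest for the air conditioner nearest to the person, which is consistent with the control described for step S97: the unit serving the most crowded divided area has its output strengthened.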

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Studio Devices (AREA)
  • Air Conditioning Control Device (AREA)

Abstract

A CPU searches for one or more persons present in the interior space of a room (RM1) on the basis of an object scene image output from a camera (12) that captures the interior space of the room (RM1), and controls the output operations of air conditioners (D_1 to D_6), which generate outputs toward the interior space of the room (RM1), on the basis of the positional relationship between the one or more persons found by the search processing and the air conditioners (D_1 to D_6). The CPU also reproduces the object scene image output from the camera (12) on a monitor screen while controlling the air conditioners, and multiplexes graphic images showing the operating states of the air conditioners (D_1 to D_6) onto the object scene image reproduced on the monitor screen.

Description

Output control device
The present invention relates to an output control device, and more particularly to an output control device that detects a specific object present in a space and controls the output of a device.
An example of this type of device is disclosed in Patent Document 1. According to this background art, a sensor unit measures at least the temperature of a person or the like in a detection area. An image processing unit extracts a feature amount by processing a thermal image whose pixel values are the temperatures detected by the sensor unit. Based on the signal from the image processing unit, an image information detection unit uses a neural network or a pattern recognition mechanism to output a human personal information signal, an environmental information signal, or the like that is equivalent to or more detailed than the output signal from the sensor unit. The operation of the air conditioner is controlled based on such signals.
Japanese Patent Application Laid-Open No. H06-117836 (Patent Document 1)
However, in the background art it is not easy to determine whether or not the operation of the air conditioner is being controlled accurately, and the load on the initial setting work and the maintenance work may therefore increase. That is, the fact that a person has been detected, and that the air conditioner operates in response to the detection result, can be understood from the change in the operation of the air conditioner in response to the detection. However, experience and skill are required to correctly match the detected position of the person with the operating range of the air conditioner.
Therefore, a main object of the present invention is to provide an output control device capable of reducing the load on the initial setting work and/or the maintenance work.
An output control device according to the present invention comprises: search means for searching for one or more specific objects present in a space based on an object scene image output from a camera that captures the space; control means for controlling the output operations of a plurality of devices that generate outputs toward the space, based on the positional relationship between the one or more specific objects found by the search means and the plurality of devices; reproduction means for reproducing the object scene image output from the camera on a monitor screen in parallel with the control processing of the control means; and multiplexing means for multiplexing state information indicating the operating states of the plurality of devices onto the object scene image reproduced by the reproduction means.
Preferably, selection means for alternatively selecting one of a plurality of modes including a trial operation mode, and activation means for activating the multiplexing means in response to the trial operation mode, are further provided.
Preferably, the camera has an imaging surface defined along a UV coordinate system, the space has a plane defined along an XY coordinate system, and the control means includes calculating means for calculating the XY coordinates of a specific object with reference to a calibration parameter indicating the correspondence between the UV coordinate system and the XY coordinate system and to the image found by the search means, and detecting means for detecting the positional relationship with reference to the XY coordinates calculated by the calculating means.
More preferably, definition means for defining the multiplexing position of the state information with reference to the calibration parameter is further provided.
Preferably, each of the plurality of devices corresponds to an air conditioning control device, and the state information has at least one of temperature, humidity, air volume, wind direction, and power consumption as a parameter.
Preferably, the state information has, as parameters, the position and/or extent reached by the output of each of the plurality of devices.
Preferably, transmission means for transmitting the object scene image and the state information to a mobile communication terminal is further provided, and the monitor screen corresponds to the monitor screen of the mobile communication terminal.
An output control device according to another aspect of the present invention comprises: search means for searching for one or more specific objects present in a space based on an object scene image output from a camera that captures the space; reproduction means for reproducing the object scene image output from the camera on a monitor screen in parallel with the search processing of the search means; detecting means for detecting the operating states of a plurality of devices that generate different outputs toward the space according to the positional relationship with the one or more specific objects found by the search means; and multiplexing means for multiplexing state information indicating the operating states detected by the detecting means onto the object scene image reproduced by the reproduction means.
According to the present invention, the plurality of devices generate outputs toward the space, a specific object present in the space is searched for based on the object scene image output from the camera capturing the space, and the output operations of the plurality of devices are controlled based on the positional relationship between the specific object found by the search processing and the plurality of devices. Thereby, adaptive output control in consideration of the position of the specific object is realized.
In addition, the object scene image output from the camera is reproduced on the monitor screen in parallel with the output control of the plurality of devices, and the state information indicating the operating states of the plurality of devices is multiplexed onto the reproduced object scene image. Whether or not the outputs of the plurality of devices are adaptively controlled can be determined based on the multiplexed state information. As a result, the load on the initial setting work and/or the maintenance work can be reduced.
According to the present invention, the object scene image output from the camera is reproduced on the monitor screen together with the state information indicating the operating states of the plurality of devices. Whether or not the outputs of the plurality of devices are adaptively controlled in consideration of the positional relationship with a specific object present in the space can be determined based on the multiplexed state information. As a result, the load on the initial setting work and/or the maintenance work can be reduced.
The above object, other objects, features, and advantages of the present invention will become more apparent from the following detailed description of embodiments with reference to the drawings.
FIG. 1(A) is a block diagram showing the basic configuration of one embodiment of the present invention, and FIG. 1(B) is a block diagram showing the basic configuration of another embodiment of the present invention.
FIG. 2 is a block diagram showing the configuration of one embodiment of the present invention.
FIG. 3 is an illustrative view showing an example of the installation state of the camera applied to the FIG. 2 embodiment.
FIG. 4 is an illustrative view showing an example of a camera image displayed on the monitor of the FIG. 2 embodiment.
FIG. 5 is an illustrative view showing an example of a map image displayed on the monitor of the FIG. 2 embodiment.
FIG. 6 is an illustrative view showing an example of the configuration of the area register applied to the FIG. 2 embodiment.
FIG. 7 is an illustrative view showing an example of the configuration of the representative point register applied to the FIG. 2 embodiment.
FIG. 8 is an illustrative view showing an example of the configuration of the weighting amount register applied to the FIG. 2 embodiment.
FIG. 9(A) is an illustrative view showing an example of the allocation of divided areas on the map image, and FIG. 9(B) is an illustrative view showing an example of the allocation of measurement areas on the camera image.
FIG. 10(A) is an illustrative view showing an example of a graphic image multiplexed on the map image, and FIG. 10(B) is an illustrative view showing an example of a graphic image multiplexed on the camera image.
FIG. 11 is an illustrative view showing an example of a camera image.
FIGS. 12(A), 12(B), and 12(C) are illustrative views showing examples of the graphic image drawn for the air volumes “small”, “medium”, and “large”, respectively.
FIGS. 13(A) and 13(B) are illustrative views showing examples of graphic images multiplexed on the camera image.
FIGS. 14 to 20 are flowcharts showing portions of the operation of the CPU applied to the FIG. 2 embodiment.
FIG. 21(A) is an illustrative view showing another example of a graphic image multiplexed on the map image, and FIG. 21(B) is an illustrative view showing another example of a graphic image multiplexed on the camera image.
FIG. 22 is an illustrative view showing an example of the object scene image and graphic image displayed on the monitor screen of another embodiment.
Referring to FIG. 1(A), the output control device of one embodiment of the present invention is basically configured as follows. The search means 1a searches for one or more specific objects present in a space based on the object scene image output from the camera 5a that captures the space. The control means 2a controls the output operations of the plurality of devices 6a, 6a, ..., which generate outputs toward the space, based on the positional relationship between the one or more specific objects found by the search means 1a and the plurality of devices 6a, 6a, ... The reproduction means 3a reproduces the object scene image output from the camera 5a on the monitor screen 7a in parallel with the control processing of the control means 2a. The multiplexing means 4a multiplexes state information indicating the operating states of the plurality of devices 6a, 6a, ... onto the object scene image reproduced by the reproduction means 3a.
The plurality of devices 6a, 6a, ... generate outputs toward the space, a specific object present in the space is searched for based on the object scene image output from the camera 5a capturing the space, and the output operations of the plurality of devices 6a, 6a, ... are controlled based on the positional relationship between the specific object found by the search processing and the plurality of devices 6a, 6a, ... Thereby, adaptive output control in consideration of the position of the specific object is realized.
In addition, the object scene image output from the camera 5a is reproduced on the monitor screen 7a in parallel with the output control of the plurality of devices 6a, 6a, ..., and state information indicating the operating states of the plurality of devices 6a, 6a, ... is multiplexed onto the reproduced object scene image. Whether or not the outputs of the plurality of devices 6a, 6a, ... are adaptively controlled can be determined based on the multiplexed state information. As a result, the load on the initial setting work and/or the maintenance work can be reduced.
Referring to FIG. 1(B), the output control device of another embodiment is basically configured as follows. The search means 1b searches for one or more specific objects present in a space based on the object scene image output from the camera 5b that captures the space. The reproduction means 2b reproduces the object scene image output from the camera 5b on the monitor screen 6b in parallel with the search processing of the search means 1b. The detection means 3b detects the operating states of the plurality of devices 7b, 7b, ..., which generate different outputs toward the space according to the positional relationship with the one or more specific objects found by the search means 1b. The multiplexing means 4b multiplexes state information indicating the operating states detected by the detection means 3b onto the object scene image reproduced by the reproduction means 2b.
The object scene image output from the camera 5b is reproduced on the monitor screen together with state information indicating the operating states of the plurality of devices 7b, 7b, ... Whether or not the outputs of the plurality of devices 7b, 7b, ... are adaptively controlled in consideration of the positional relationship with a specific object present in the space can be determined based on the multiplexed state information. As a result, the load on the initial setting work and/or the maintenance work can be reduced.
Referring to FIG. 2, the air conditioning control device 10 of this embodiment includes a camera 12 that repeatedly outputs image data representing the object scene (a three-dimensional space) captured on its imaging surface. The image data output from the camera 12 is subjected to display processing by an image processing circuit 14. The processed image data is supplied to a monitor 16 and is also transmitted to a mobile communication terminal 30 via a communication I/F 20. An image representing the object scene, that is, a camera image, is displayed on each of the screens of the monitor 16 and the mobile communication terminal 30.
Referring to FIG. 3, a room RM1 is partitioned by a floor surface FL1, a ceiling HV1, and four wall surfaces WL1 to WL4. The camera 12 is installed at the top center of the wall surface WL1 in its width direction and captures the internal space of the room RM1 obliquely from above. The camera image is therefore displayed on the monitor screen as shown in FIG. 4. As shown in FIGS. 3 and 4, the internal space of the room RM1 is defined by mutually orthogonal X, Y, and Z axes, and the imaging surface of the camera 12 is defined by mutually orthogonal U and V axes.
Air conditioners D_1 to D_6 are installed on the ceiling HV1 at predetermined intervals. Each of the air conditioners D_1 to D_6 outputs air having a specified temperature and humidity at a specified air volume and in a specified direction, and the temperature and humidity of the room RM1 are adjusted by the air thus output.
When the area setting mode is selected by operating an input device 18, the following processing is executed by a CPU 14p. In the area setting mode, the camera 12 is in a stopped state.
 まず、図5に示すマップ画像がモニタ16に表示される。マップ画像は、平面FL1を鳥瞰した状態を模式的に表す画像に相当する。マップ画像にはまた、空調装置D_1~D_6をそれぞれ表すマークM_1~M_6が、空調装置D_1~D_6の位置に対応して表示される。 First, the map image shown in FIG. The map image corresponds to an image that schematically represents a bird's-eye view of the plane FL1. In the map image, marks M_1 to M_6 respectively representing air conditioners D_1 to D_6 are displayed corresponding to the positions of the air conditioners D_1 to D_6.
When display of the map image is completed, a variable K is set to "1". When the mark M_K is clicked with the mouse pointer of the input device 18, coordinates indicating the clicked position, that is, click coordinates, are calculated. The calculated click coordinates are written in an area register 14r1 shown in FIG. 6 in association with the variable K, and the variable K is then incremented. A total of six click operations are accepted, one for each of the marks M_1 to M_6, whereby six click coordinates corresponding to the marks M_1 to M_6 are set in the area register 14r1.
The map image is divided in the manner shown in FIG. 9(A) with reference to the six click coordinates thus set. Boundary lines BL_1 to BL_3 are drawn on the map image so as to surround the marks M_1 to M_6. As a result, divided areas MP_1 to MP_6 are allocated around the marks M_1 to M_6, and the internal space of the room RM1 is divided into a plurality of small spaces corresponding to the divided areas MP_1 to MP_6. A plurality of XY coordinates defining each divided area MP_K (K: 1 to 6) are written in the area register 14r1.
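The publication does not spell out how the boundary lines BL_1 to BL_3 are derived from the six click coordinates. As one hedged reading, a nearest-mark (Voronoi-style) split of the floor plane would produce divided areas of this kind; the grid sampling and the nearest-mark rule in the sketch below are assumptions, not the patent's stated method.

# Hypothetical sketch: assign each sampled floor-plane point to the nearest
# mark M_K, yielding divided areas MP_1..MP_6. The nearest-mark rule and the
# sampling step are assumptions; the patent only states that boundary lines
# are drawn around the marks.
def divide_map(marks, width, height, step=1.0):
    """marks: list of (x, y) click coordinates for M_1..M_6 (assumed input)."""
    areas = {k: [] for k in range(1, len(marks) + 1)}
    y = 0.0
    while y < height:
        x = 0.0
        while x < width:
            k = min(range(len(marks)),
                    key=lambda i: (marks[i][0] - x) ** 2 + (marks[i][1] - y) ** 2)
            areas[k + 1].append((x, y))   # XY coordinates defining MP_(k+1)
            x += step
        y += step
    return areas                           # as written into the area register 14r1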
Subsequently, each of the plurality of XY coordinates defining the divided area MP_K is converted into UV coordinates according to Equation 1.

[Equation 1] (a planar projective transformation matrix with calibration parameters P11 to P33; the equation appears only as an image in the original publication)
The calibration parameters P11 to P33 in Equation 1 correspond to a matrix for performing a planar projective transformation between the XY coordinate system defining the floor surface FL1 and the UV coordinate system defining the imaging surface of the camera 12, that is, the camera image. Applying a desired XY coordinate to Equation 1 therefore yields the corresponding UV coordinate on the camera image. The plurality of UV coordinates thus converted are written in the area register 14r1 in association with the plurality of source XY coordinates. A measurement area DT_K corresponding to the divided area MP_K is thereby defined on the camera image in the manner shown in FIG. 9(B).
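Equation 1 itself appears only as an image in the publication. Based on the description of P11 to P33 as a planar projective transformation matrix, a standard homography form such as the following sketch is a reasonable assumption; the normalization by the third row is part of that assumption.

# Hedged sketch of Equation 1: a 3x3 planar projective transformation mapping
# a floor-plane point (X, Y) to camera coordinates (U, V). The exact layout of
# Equation 1 is not reproduced in the text; this standard homography form is
# an assumption.
def xy_to_uv(x, y, p):
    """p: 3x3 calibration parameters [[P11, P12, P13], [P21, P22, P23], [P31, P32, P33]]."""
    w = p[2][0] * x + p[2][1] * y + p[2][2]        # homogeneous scale factor
    u = (p[0][0] * x + p[0][1] * y + p[0][2]) / w  # U coordinate on the camera image
    v = (p[1][0] * x + p[1][1] * y + p[1][2]) / w  # V coordinate on the camera image
    return u, v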
Subsequently, a drawing position of a graphic image G_K indicating the operating status of the air conditioner D_K is defined on the map image. The graphic image G_K is represented by a plurality of circles drawn around the XY coordinates of the air conditioner D_K. The drawing position of the graphic image G_K is therefore defined so that the graphic image G_K is drawn on the map image in the manner shown in FIG. 10(A). A plurality of XY coordinates indicating the defined drawing position are written in the area register 14r1 in association with the variable K.
The written XY coordinates are then converted into UV coordinates according to Equation 1. The drawing position of the graphic image G_K is thereby defined on the camera image, and the graphic image G_K is drawn on the camera image in the manner shown in FIG. 10(B). The plurality of UV coordinates indicating the drawing position defined on the camera image are also written in the area register 14r1 in association with the variable K.
When the congestion degree measurement mode is selected by operating the input device 18, the camera 12 is activated, and the following processing is executed by the CPU 14p every time the measurement cycle arrives.
First, person images are searched for in the camera image by pattern matching or motion detection. When one or more person images are found, a variable L is set to each of "1" to "Lmax" (Lmax: total number of person images found), and a representative point of the L-th person image is determined as "RP_L". The XY coordinates of the determined representative point RP_L are written in a representative point register 14r2 shown in FIG. 7 in association with the variable L.
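The search method is named only as pattern matching or motion detection, without further detail. As an illustrative sketch of the motion-detection option, a simple frame-differencing pass over two camera frames could flag candidate person regions; the threshold and the later grouping into person images are assumptions.

# Hypothetical frame-differencing sketch for the "motion detection" option.
# Two grayscale frames are compared pixel by pixel; pixels whose change
# exceeds a threshold are collected as candidates to be grouped into person
# images. The threshold value and the grouping step are assumptions.
def detect_motion(prev_frame, curr_frame, threshold=30):
    """Frames are 2D lists of grayscale values indexed [v][u]."""
    changed = []
    for v, (row_prev, row_curr) in enumerate(zip(prev_frame, curr_frame)):
        for u, (a, b) in enumerate(zip(row_prev, row_curr)):
            if abs(a - b) > threshold:
                changed.append((u, v))   # UV coordinates of moving pixels
    return changed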
When persons H1 to H3 are present in the room RM1 as shown in FIG. 11, a representative point RP_1 is determined on the image representing the person H1, a representative point RP_2 is determined on the image representing the person H2, and a representative point RP_3 is determined on the image representing the person H3. The XY coordinates of the representative points RP_1 to RP_3 are written in the representative point register 14r2 in association with L = 1 to 3.
Subsequently, the variable L is set to each of "1" to "Lmax" and the variable K is set to each of "1" to "6", and the distance from the representative point RP_L to the air conditioner D_K is measured as "DS_K". The distances DS_1 to DS_6 measured in this way clarify the positional relationship between each of the Lmax persons and the air conditioners D_1 to D_6.
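Since the representative points are held as XY coordinates on the floor plane, the distance DS_K can be read as a planar distance to each unit. The sketch below assumes that each air conditioner is represented by the XY coordinates beneath it on the floor; the patent states only that the distance is measured.

# Hedged sketch: planar Euclidean distance from a person's representative
# point RP_L to each air conditioner D_K, giving DS_1..DS_6. Using the XY
# position of each unit on the floor plane is an assumption.
import math

def measure_distances(rp_xy, devices_xy):
    """rp_xy: (X, Y) of RP_L; devices_xy: list of (X, Y) for D_1..D_6."""
    return [math.hypot(rp_xy[0] - dx, rp_xy[1] - dy) for dx, dy in devices_xy]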
The variable K is again set to each of "1" to "6", and a weighting amount W_K corresponding to the divided area MP_K is calculated according to Equation 2. The calculated weighting amount W_K is written in a weighting amount register 14r3 shown in FIG. 8 in association with the variable K.

[Equation 2]
W_K = 1 - f(D_K) / Σf(D_K)
where f(D_K) = a_K * DS_K + b_K
In Equation 2, "a_K" and "b_K" are variables that depend on the current output of the air conditioner D_K and on the indoor environment. The function value f(D_K) is defined by the variables a_K and b_K and the distance DS_K. The weighting amount W_K is derived by subtracting from "1" the ratio of the function value f(D_K) to the sum of the function values f(D_1) to f(D_6) corresponding to the air conditioners D_1 to D_6. The variables a_K and b_K are adjusted so that the sum of the weighting amounts W_1 to W_6 becomes "1.0".
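Read directly, Equation 2 can be coded as below for one detected person. The coefficients a_K and b_K are placeholders here; as noted above, the patent leaves their values to an adjustment that depends on each unit's current output and the indoor environment.

# Hedged sketch of Equation 2: f(D_K) = a_K * DS_K + b_K and
# W_K = 1 - f(D_K) / sum over K of f(D_K), for one person's distances.
# The coefficient values are placeholders, not values given in the patent.
def weighting_amounts(distances, a, b):
    """distances: DS_1..DS_6 for one person; a, b: per-unit coefficient lists."""
    f = [a_k * ds + b_k for a_k, b_k, ds in zip(a, b, distances)]
    total = sum(f)
    return [1.0 - f_k / total for f_k in f]   # W_1..W_6 for this person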
Once the weighting amounts have been determined, the variable K is set to each of "1" to "6", and a congestion degree CR_K is calculated. The congestion degree CR_K corresponds to the sum of the weighting amounts assigned to the divided area MP_K. The output (= air volume) of each of the air conditioners D_1 to D_6 is controlled based on the congestion degrees CR_1 to CR_6 thus calculated. Specifically, the air volume of the air conditioner corresponding to a divided area with a high congestion degree is increased, and the air volume of the air conditioner corresponding to a divided area with a low congestion degree is decreased.
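Accumulating the per-person weights into the congestion degrees and mapping them to an air-volume setting might look like the sketch below. The three-level thresholds are illustrative assumptions chosen to mirror the "small", "medium", and "large" settings used later in the description; the patent does not give numeric thresholds.

# Hedged sketch: CR_K as the sum of the weights assigned to area MP_K over all
# detected persons, then a coarse mapping to an air-volume level. The
# thresholds 0.5 and 1.5 are illustrative assumptions.
def congestion_degrees(per_person_weights):
    """per_person_weights: list of [W_1..W_6] lists, one per detected person."""
    return [sum(w[k] for w in per_person_weights) for k in range(6)]

def air_volume_level(cr_k):
    if cr_k >= 1.5:
        return "large"
    if cr_k >= 0.5:
        return "medium"
    return "small"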
Two modes are prepared for operating the air conditioners D_1 to D_6: a test operation mode and a normal operation mode. The test operation mode is used, after installation of the air conditioning control device 10 and setting of the divided areas MP_1 to MP_6 have been completed, to confirm whether the output operations of the air conditioners D_1 to D_6 are adaptively controlled under the congestion degree measurement task. The normal operation mode, on the other hand, is the mode selected during ordinary use after the adaptive control of the air conditioners D_1 to D_6 has been confirmed in the test operation mode. The test operation mode and the normal operation mode are alternatively selected by operating the input device 18.
In the test operation mode, the variable K is set to each of "1" to "6", and the operating status of the air conditioner D_K (temperature, humidity, air volume, and wind direction) is detected. The graphic image G_K is multiplexed onto the camera image in a drawing mode corresponding to the detected operating status. The drawing position is determined with reference to the description in the area register 14r1.
The drawing mode of the graphic image G_K changes according to the air volume of the air conditioner D_K in the manner shown in FIGS. 12(A) to 12(C). The graphic image G_K is drawn in the manner shown in FIG. 12(A) for the air volume "small", in the manner shown in FIG. 12(B) for the air volume "medium", and in the manner shown in FIG. 12(C) for the air volume "large". In addition, the hue of the graphic image G_K changes from "blue" to "red" as the set temperature of the air conditioner D_K rises, the saturation of the graphic image G_K increases as the set humidity of the air conditioner D_K rises, and the shape of the graphic image G_K is distorted along the wind direction.
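One hedged way to realize this mapping from operating status to drawing attributes is an HSV-style parameterization, as sketched below. The numeric spans (the 18 to 30 degree temperature range, the hue endpoints, the circle counts) are assumptions; the patent describes only the qualitative direction of each change.

# Hedged sketch: deriving drawing attributes of G_K from an air conditioner's
# operating status. Hue moves from blue (240) toward red (0) as the set
# temperature rises, saturation follows the set humidity, and the number of
# circles grows with the air volume. All numeric ranges are assumptions.
def drawing_attributes(temp_c, humidity_pct, volume_level):
    t = max(0.0, min(1.0, (temp_c - 18.0) / 12.0))   # assumed 18..30 deg C span
    hue = 240.0 * (1.0 - t)                          # blue -> red
    saturation = max(0.0, min(1.0, humidity_pct / 100.0))
    circles = {"small": 1, "medium": 2, "large": 3}[volume_level]
    return {"hue": hue, "saturation": saturation, "circles": circles}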
As shown in FIG. 13(A), when the person H1 is present in the measurement area DT_4, the air volume of the air conditioner D_4 is set to "large", the air volumes of the air conditioners D_1, D_2, and D_5 are set to "medium", and the air volumes of the air conditioners D_3 and D_6 are set to "small". The graphic images G_1 to G_6 are drawn in a manner that reflects these air volume settings.
As shown in FIG. 13(B), when the person H1 moves to the measurement area DT_1, the air volume of the air conditioner D_1 is changed from "medium" to "large", and the air volume of the air conditioner D_4 is changed from "large" to "medium". As a result, the drawing modes of the graphic images G_1 and G_4 also change.
The camera image and the graphic images G_1 to G_6 are also displayed on the screen of the mobile communication terminal 30. By carrying the mobile communication terminal 30 and moving around the room RM1, the person H1 can therefore easily confirm whether the output operations of the air conditioners D_1 to D_6 are adaptively controlled. This improves the efficiency of initial setting work and maintenance work.
Furthermore, when the orientation of the camera 12 shifts, the drawing positions of the graphic images G_1 to G_6 also shift. By moving through the space while viewing the graphic images G_1 to G_6, the person H1 can therefore recognize that the orientation of the camera 12 has shifted.
The CPU 14p executes a plurality of tasks including a main task shown in FIG. 14, an area setting task shown in FIGS. 15 and 16, a congestion degree measurement task shown in FIGS. 17 to 19, and a display control task shown in FIG. 20. The control programs corresponding to these tasks are stored in a recording medium 22.
Referring to FIG. 14, in step S1 it is determined whether the current operation mode is the area setting mode, and in step S5 it is determined whether the current operation mode is the congestion degree measurement mode.
If YES in step S1, the area setting task is started in step S3, and the process then proceeds to step S15. If YES in step S5, it is determined in step S7 whether the divided areas MP_1 to MP_6 have been set. If the determination result is YES, the congestion degree measurement task and the display control task are started in steps S9 and S11 before proceeding to step S15; if the determination result is NO, the process proceeds directly to step S15. If both steps S1 and S5 give NO, other processing is executed in step S13, and the process then proceeds to step S15.
In step S15, it is repeatedly determined whether a mode change operation has been performed. When the determination result is updated from NO to YES, the running tasks are terminated in step S17, and the process then returns to step S1.
Referring to FIG. 15, the map image is displayed on the monitor 16 in step S21, and the variable K is set to "1" in step S23. In step S25, it is determined whether a click operation for designating an area has been performed; when the determination result is updated from NO to YES, the click coordinates are calculated in step S27. The calculated coordinates are written in the area register 14r1 in association with the variable K. In step S29, it is determined whether the variable K has reached "6"; if the determination result is NO, the variable K is incremented in step S31 and the process returns to step S25, whereas if the determination result is YES, the process proceeds to step S33.
In step S33, the map image is divided with reference to the click coordinates. As a result, six divided areas MP_1 to MP_6 are allocated on the map image. In step S35, boundary lines partitioning the divided areas MP_1 to MP_6 are drawn on the map image.
In step S37, the variable K is set to "1", and in step S39, a plurality of XY coordinates defining the divided area MP_K are calculated. The calculated XY coordinates are written in the area register 14r1 in association with the variable K. In step S41, each of the plurality of XY coordinates defining the divided area MP_K is converted into UV coordinates according to Equation 1. The converted UV coordinates are written in the area register 14r1 in association with the variable K, whereby the measurement area DT_K corresponding to the divided area MP_K is allocated on the camera image.
In step S43, the drawing position of the graphic image G_K is defined on the map image. A plurality of XY coordinates indicating the defined drawing position are written in the area register 14r1 in association with the variable K. In step S45, the XY coordinates indicating the drawing position defined in step S43 are converted into UV coordinates according to Equation 1. As a result, the drawing position of the graphic image G_K is defined on the camera image. The plurality of UV coordinates indicating the drawing position defined in this way are also written in the area register 14r1 in association with the variable K.
In step S47, it is determined whether the variable K has reached "6". If the determination result is NO, the variable K is incremented in step S49 and the process returns to step S39; if the determination result is YES, the task ends.
Referring to FIG. 17, in step S51 it is determined whether the measurement cycle has arrived. When the determination result is updated from NO to YES, the process proceeds to step S53, and person images are searched for in the camera image by pattern matching or motion detection. In step S55, it is determined whether one or more person images have been found; if the determination result is NO, the process returns to step S51, whereas if the determination result is YES, the process proceeds to step S57.
In step S57, the variable L is set to "1". In step S59, the representative point of the L-th person image among the one or more found person images is determined as "RP_L", and in step S61 the XY coordinates of the determined representative point RP_L are calculated with reference to Equation 1 above. The calculated XY coordinates are written in the representative point register 14r2 in association with the variable L. In step S63, it is determined whether the variable L has reached the maximum value Lmax (= total number of found person images); if the determination result is NO, the variable L is incremented in step S65 and the process returns to step S59, whereas if the determination result is YES, the process proceeds to step S67.
In step S67, the variable L is set to "1" again, in step S69 the variable K is set to "1", and in step S71 the distance from the representative point RP_L to the air conditioner D_K is measured as "DS_K". In step S73, it is determined whether the variable K has reached "6"; if the determination result is NO, the variable K is incremented in step S75 and the process returns to step S71, whereas if the determination result is YES, the process proceeds to step S77. The distances DS_1 to DS_6 measured in this way clarify the positional relationship between the L-th person and the air conditioners D_1 to D_6.
In step S77, the variable K is set to "1" again, and in step S79 the weighting amount W_K corresponding to the divided area MP_K is calculated according to Equation 2. The calculated weighting amount W_K is written in the weighting amount register 14r3 in association with the variable K.
In step S81, it is determined whether the variable K has reached "6"; if the determination result is NO, the variable K is incremented in step S83 and the process returns to step S79, whereas if the determination result is YES, the process proceeds to step S85.
In step S85, it is determined whether the variable L has reached the maximum value Lmax; if the determination result is NO, the variable L is incremented in step S87 and the process returns to step S69, whereas if the determination result is YES, the process proceeds to step S89. In step S89, the variable K is set to "1", and in step S91 the congestion degree CR_K is calculated. The congestion degree CR_K corresponds to the sum of the weighting amounts assigned to the divided area MP_K.
In step S93, it is determined whether the variable K has reached "6"; if the determination result is NO, the variable K is incremented in step S95 and the process returns to step S91, whereas if the determination result is YES, the process proceeds to step S97. In step S97, the outputs of the air conditioners D_1 to D_6 are controlled with reference to the congestion degrees CR_1 to CR_6 calculated in step S91. Specifically, the output of the air conditioner corresponding to a divided area with a high congestion degree is strengthened, and the output of the air conditioner corresponding to a divided area with a low congestion degree is weakened.
Referring to FIG. 20, the camera 12 is activated in step S101, and display of the camera image is started in step S103. The image data representing the object scene captured by the camera 12 is supplied to the monitor 16 and is also transmitted to the mobile communication terminal 30 via the communication I/F 20. As a result, the camera image is displayed on the screens of the monitor 16 and the mobile communication terminal 30.
In step S105, it is determined whether the current mode is the normal operation mode or the test operation mode. If the current mode is the normal operation mode, the display control task ends; if the current mode is the test operation mode, the process proceeds to step S107. In step S107, the variable K is set to "1", and in step S109 the operating status of the air conditioner D_K (temperature, humidity, air volume, and wind direction) is detected. In step S111, the graphic image G_K is multiplexed onto the camera image at the position corresponding to the measurement area DT_K. The drawing mode of the graphic image G_K corresponds to the detected operating status.
In step S113, it is determined whether the variable K has reached "6". If the determination result is NO, the variable K is incremented in step S117 and the process returns to step S109; if the determination result is YES, the variable K is set to "1" in step S115 and the process returns to step S109.
As can be seen from the above description, the CPU 14p searches for one or more persons existing in the internal space of the room RM1 based on the object scene image output from the camera 12 that captures the internal space of the room RM1 (S53), and controls the output operations of the air conditioners D_1 to D_6, which generate output toward the internal space of the room RM1, based on the positional relationship between the one or more persons discovered by the search processing and the air conditioners D_1 to D_6 (S57 to S97). The CPU 14p also reproduces the object scene image output from the camera 12 on the monitor screen of the monitor 16 or the mobile communication terminal 30 in parallel with the air conditioning control (S101 to S103), and multiplexes the graphic images G_1 to G_6 indicating the operating states of the air conditioners D_1 to D_6 onto the object scene image reproduced on the monitor screen (S107 to S117).
The air conditioners D_1 to D_6 generate outputs toward the internal space of the room RM1, a person existing in the internal space of the room RM1 is searched for based on the object scene image output from the camera 12 that captures the internal space of the room RM1, and the output operations of the air conditioners D_1 to D_6 are controlled based on the positional relationship between the person discovered by the search processing and the air conditioners D_1 to D_6. Adaptive output control that takes the person's position into account is thereby realized.
In addition, the object scene image output from the camera 12 is reproduced on the monitor screen in parallel with the output control of the air conditioners D_1 to D_6, and the graphic images G_1 to G_6 indicating the operating states of the air conditioners D_1 to D_6 are multiplexed onto the reproduced object scene image. Whether the outputs of the air conditioners D_1 to D_6 are adaptively controlled can be determined from the multiplexed graphic images G_1 to G_6. This reduces the load of initial setting work and/or maintenance work.
In this embodiment, the coordinates of each of the marks M_1 to M_6 are designated by clicking with the mouse pointer. Alternatively, the coordinate values of the marks M_1 to M_6 may be designated directly.
This embodiment assumes that the output of air conditioners is adaptively controlled; however, the output (that is, the brightness) of lighting devices may be adaptively controlled instead of, or together with, the output of the air conditioners. In that case, the drawing mode of the graphic image changes according to the brightness of the lighting device.
Furthermore, although this embodiment assumes a planar projective transformation based on Equation 1, a perspective projective transformation may be performed instead.
In this embodiment, an image schematically representing a bird's-eye view of the floor surface FL1 is adopted as the map image. However, the map image may instead be generated by applying a bird's-eye view transformation based on Equation 1 to the camera image.
Furthermore, in this embodiment, "temperature", "humidity", "air volume", and "wind direction" are assumed as the operating status of the air conditioner D_K detected in step S109 of FIG. 20. However, "power consumption" may additionally be included, treating the power consumption of the air conditioner D_K as part of its operating status.
In this embodiment, the graphic image G_K is multiplexed onto the monitor screen in the manner shown in FIGS. 13(A) and 13(B). However, characters indicating numerical values (for example, "1 m" or "3 m") that represent the reach of the air output from the air conditioner D_K may additionally be displayed. In that case, the graphic image G_K is displayed as shown in FIGS. 21(A) and 21(B).
Furthermore, in this embodiment, the object scene image captured by the camera 12 is displayed on the monitor screen from the viewpoint of the camera 12, and the shape and/or size of the graphic image G_K is adjusted to match the object scene image displayed in this way (see FIG. 10(B)). However, the object scene image captured by the camera 12 may be converted into an overhead image and displayed on the monitor screen, with the shape and/or size of the graphic image G_K adjusted to match the overhead image. In that case, the overhead image and the graphic image G_K are displayed on the monitor screen in the manner shown in FIG. 22.
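Because Equation 1 maps floor-plane XY points to camera UV coordinates, one hedged way to build such an overhead image is to sample the floor plane and look up the corresponding camera pixel through the same parameters, reusing the xy_to_uv sketch given earlier. The nearest-pixel sampling and the bounds handling below are assumptions.

# Hedged sketch: an overhead (bird's-eye) view built by sampling the floor
# plane in XY, mapping each sample to camera coordinates with the assumed
# homography of Equation 1 (xy_to_uv above), and copying that camera pixel.
# Nearest-pixel sampling and zero fill outside the field of view are
# assumptions.
def overhead_view(camera_image, p, width, height, step=1.0):
    """camera_image: 2D list of pixels indexed [v][u]; p: parameters P11..P33."""
    rows = []
    y = 0.0
    while y < height:
        row = []
        x = 0.0
        while x < width:
            u, v = xy_to_uv(x, y, p)
            ui, vi = int(round(u)), int(round(v))
            if 0 <= vi < len(camera_image) and 0 <= ui < len(camera_image[0]):
                row.append(camera_image[vi][ui])
            else:
                row.append(0)            # outside the camera's field of view
            x += step
        rows.append(row)
        y += step
    return rows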
Although the present invention has been described and illustrated in detail, it is to be clearly understood that this is by way of illustration and example only and is not to be taken as limiting; the spirit and scope of the present invention are limited only by the wording of the appended claims.
DESCRIPTION OF REFERENCE SIGNS
10 ... Air conditioning control device
12 ... Camera
14 ... Image processing circuit
14p ... CPU
16 ... Monitor
18 ... Input device

Claims (8)

1. An output control device comprising:
a search means for searching for one or more specific objects existing in a space based on an object scene image output from a camera that captures the space;
a control means for controlling output operations of a plurality of devices that generate output toward the space, based on a positional relationship between the one or more specific objects discovered by the search means and the plurality of devices;
a reproduction means for reproducing the object scene image output from the camera on a monitor screen in parallel with the control processing of the control means; and
a multiplexing means for multiplexing state information indicating operating states of the plurality of devices onto the object scene image reproduced by the reproduction means.
2. The output control device according to claim 1, further comprising:
a selection means for alternatively selecting one of a plurality of modes including a test operation mode; and
an activation means for activating the multiplexing means in response to the test operation mode.
3. The output control device according to claim 1, wherein the camera has an imaging surface defined along a UV coordinate system,
the space has a plane defined along an XY coordinate system, and
the control means includes a calculation means for calculating XY coordinates of the specific object with reference to a calibration parameter indicating a correspondence between the UV coordinate system and the XY coordinate system and to the image found by the search means, and a detection means for detecting the positional relationship with reference to the XY coordinates calculated by the calculation means.
4. The output control device according to claim 3, further comprising a definition means for defining a multiplexing position of the state information with reference to the calibration parameter.
5. The output control device according to claim 1, wherein each of the plurality of devices corresponds to an air conditioning control device, and the state information has at least one of temperature, humidity, air volume, wind direction, and power consumption as a parameter.
6. The output control device according to claim 1, wherein the state information has, as a parameter, a position and/or an extent reached by the output of each of the plurality of devices.
7. The output control device according to claim 1, further comprising a transmission means for transmitting the object scene image and the state information to a mobile communication terminal, wherein the monitor screen corresponds to a monitor screen of the mobile communication terminal.
8. An output control device comprising:
a search means for searching for one or more specific objects existing in a space based on an object scene image output from a camera that captures the space;
a reproduction means for reproducing the object scene image output from the camera on a monitor screen in parallel with the search processing of the search means;
a detection means for detecting operating states of a plurality of devices that generate, toward the space, outputs that differ according to a positional relationship with the one or more specific objects discovered by the search means; and
a multiplexing means for multiplexing state information indicating the operating states detected by the detection means onto the object scene image reproduced by the reproduction means.
PCT/JP2011/065760 2010-07-20 2011-07-11 Output control device WO2012011400A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010-162647 2010-07-20
JP2010162647 2010-07-20

Publications (1)

Publication Number Publication Date
WO2012011400A1 true WO2012011400A1 (en) 2012-01-26

Family

ID=45496827

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/065760 WO2012011400A1 (en) 2010-07-20 2011-07-11 Output control device

Country Status (2)

Country Link
JP (1) JP2012042197A (en)
WO (1) WO2012011400A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021085559A (en) * 2019-11-25 2021-06-03 ダイキン工業株式会社 Device management system
AU2020434523B2 (en) 2020-03-11 2023-08-24 Mitsubishi Electric Corporation Air-conditioning system
JP7390946B2 (en) 2020-03-23 2023-12-04 三菱電機株式会社 Air conditioning control device, photographing device, air conditioning control system, air conditioning system, and air conditioning control method

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0674526A (en) * 1992-08-26 1994-03-15 Shimizu Corp Air-conditioning control system
JPH11206906A (en) * 1998-01-26 1999-08-03 Nittan Co Ltd Device for displaying projected discharge range of fire extinguishing equipment
JP2006125727A (en) * 2004-10-28 2006-05-18 Shin Nippon Air Technol Co Ltd Control method for air conditioning/lighting, and system ceiling module therefor
JP2006146803A (en) * 2004-11-24 2006-06-08 Olympus Corp Operation device, and remote operation system
JP2006220405A (en) * 2005-01-12 2006-08-24 Mitsubishi Electric Corp Air conditioner
JP2007078283A (en) * 2005-09-15 2007-03-29 Seiko Epson Corp Air conditioner, and air conditioning method
JP2008290629A (en) * 2007-05-25 2008-12-04 Toyota Motor Corp Parking supporting device for vehicle
JP2009103347A (en) * 2007-10-22 2009-05-14 Yamaha Corp Environment control system
JP2009104543A (en) * 2007-10-25 2009-05-14 Sumitomo Electric Ind Ltd Information providing device, computer program and information providing method
JP2009299933A (en) * 2008-06-10 2009-12-24 Shimizu Corp Air-conditioning control device
JP2010032210A (en) * 2007-10-03 2010-02-12 Mitsubishi Electric Corp Indoor unit of air conditioner


Also Published As

Publication number Publication date
JP2012042197A (en) 2012-03-01

Similar Documents

Publication Publication Date Title
KR102121785B1 (en) Air-conditioner controlling direction of the wind using artificial intelligence by instructed position and method of controlling thereof
JP5238679B2 (en) Air conditioning control device, air conditioning control method, and radiation temperature measuring device
US10365004B2 (en) Control method and communication device
WO2020052167A1 (en) Method and device for determining air blowing angle range of air conditioner, and air conditioner
CN109073252B (en) Air conditioner visualization system
WO2012011401A1 (en) Output control device
CN103780835A (en) Identification device and method
JP4325593B2 (en) Air conditioner correspondence support system
KR20160046728A (en) Temperature distribution display device and method
KR101460925B1 (en) Engineering device and point information creation method
WO2012011400A1 (en) Output control device
CN110762815A (en) Air conditioner control method, device and system
CN111461487B (en) Indoor decoration engineering wisdom management system based on BIM
JP2011153737A (en) Calibrating apparatus
CN109323407A (en) Air conditioner, server, air-conditioning system and control method
JP2017033319A (en) Decorative material simulation system, method and program
JP2014235102A (en) Position estimation system and position estimation device
WO2012008288A1 (en) Output controller
KR102085799B1 (en) Indoor air conditioning control system using environment measurement sensor connected to user terminal
CN111465906B (en) Automatic allocation between flow control device, sensor device and control device in HVAC application
JP2011154449A (en) Congestion level measuring device
JP4613742B2 (en) Air conditioner installation work support device and air conditioner installation work support program
JP2017072351A (en) Control system, control method, control device, information terminal and control program
JP6554980B2 (en) Image processing system, method, and program
WO2011155439A1 (en) Motion detection device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11809574

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11809574

Country of ref document: EP

Kind code of ref document: A1