WO2012008288A1 - Output controller - Google Patents


Info

Publication number
WO2012008288A1
WO2012008288A1 (PCT/JP2011/064622, JP2011064622W)
Authority
WO
WIPO (PCT)
Prior art keywords
space
output
detecting
defining
coordinate system
Prior art date
Application number
PCT/JP2011/064622
Other languages
French (fr)
Japanese (ja)
Inventor
圭介 淺利
石井 洋平
本郷 仁志
長輝 楊
Original Assignee
三洋電機株式会社
Priority date
Filing date
Publication date
Application filed by 三洋電機株式会社
Publication of WO2012008288A1

Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24F AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F11/00 Control or safety arrangements
    • F24F11/30 Control or safety arrangements for purposes related to the operation of the system, e.g. for safety or monitoring
    • F24F11/50 Control or safety arrangements characterised by user interfaces or communication
    • F24F11/52 Indication arrangements, e.g. displays
    • F24F11/54 Control or safety arrangements characterised by user interfaces or communication using one central controller connected to several sub-controllers
    • F24F11/62 Control or safety arrangements characterised by the type of control or by internal processing, e.g. using fuzzy logic, adaptive control or estimation of values
    • F24F11/63 Electronic processing
    • F24F2120/00 Control inputs relating to users or occupants
    • F24F2120/10 Occupancy
    • F24F2120/12 Position of occupants

Definitions

  • the present invention relates to an output control device, and more particularly to an output control device that controls the output of the device by measuring the degree of congestion of a specific object.
  • an example of this type of device is disclosed in Patent Document 1.
  • the camera is installed in an elevator hall or a car.
  • the image output from the camera is divided into a plurality of blocks, and the occupancy rate of waiting customers or passengers is calculated for each block.
  • the number of waiting customers or passengers is determined based on the calculated occupancy rate, and the determination result is transmitted to the elevator control device.
  • the coordinate system that defines the imaging surface of the camera does not always match the coordinate system that defines the plane on which the waiting customer or passenger exists.
  • when the two coordinate systems differ, the calculation accuracy of the occupancy rate, and hence the accuracy of the elevator control, is lowered.
  • a main object of the present invention is to provide an output control device capable of improving the accuracy of adaptive output control.
  • an output control device according to the present invention comprises the following: defining means for defining a plurality of small spaces respectively corresponding to a plurality of devices that generate output toward a space; detecting means for detecting one or more representative points respectively corresponding to one or more specific objects present in the space; measuring means for measuring the distance from each of the one or more representative points detected by the detecting means to the boundaries of the plurality of small spaces defined by the defining means; and control means for controlling the output operations of the plurality of devices based on the measurement results of the measuring means.
  • a camera for capturing the space is further provided, and the detecting means includes search means for searching for one or more specific object images in the object scene image output from the camera, and determining means for determining a representative point for each specific object image found by the search means.
  • the space is defined according to an XYZ coordinate system, and the camera has an imaging surface defined according to a UV coordinate system.
  • the control means includes assigning means for assigning a weighting factor to each of the plurality of small spaces according to the magnitude of the distance measured by the measuring means, and measures the congestion of the specific object based on the weighting factors assigned by the assigning means.
  • the total sum of the weighting coefficients assigned by the assigning means corresponding to each of the one or more representative points is common among the one or more representative points.
  • the defining means includes reproducing means for reproducing a reference image representing a bird's-eye view of the plane defining the space, accepting means for accepting a designation operation that designates a plurality of mutually adjacent areas on the reference image reproduced by the reproducing means, and definition processing means for defining a plurality of small spaces corresponding to the plurality of areas designated by the designation operation.
  • An output control device according to another aspect of the present invention comprises the following: a camera that has an imaging surface defined along a UV coordinate system and outputs an object scene image representing a space having a plane defined along an XY coordinate system; search means for searching for an image representing a specific object existing in the space in the object scene image output from the camera; calculating means for calculating the XY coordinates of the specific object with reference to a calibration parameter indicating the correspondence between the UV coordinate system and the XY coordinate system and to the image found by the search means; detecting means for detecting the positional relationship between the specific object and a plurality of devices that generate output toward the space with reference to the XY coordinates calculated by the calculating means; and control means for controlling the output operations of the plurality of devices with reference to the positional relationship detected by the detecting means.
  • a congestion degree measuring apparatus according to the present invention comprises the following: defining means for defining a plurality of small spaces forming a space; detecting means for detecting one or more representative points respectively corresponding to one or more specific objects existing in the space; measuring means for measuring the distance from each of the one or more representative points detected by the detecting means to the boundaries of the plurality of small spaces defined by the defining means; calculating means for calculating the degree of congestion of the specific objects in each of the plurality of small spaces based on the measurement results of the measuring means; and output means for outputting congestion degree information indicating the degree of congestion calculated by the calculating means.
  • An output control method according to the present invention is executed by a processor of an output control device and includes the following: a defining step of defining a plurality of small spaces respectively corresponding to a plurality of devices that generate output toward a space; a detecting step of detecting one or more representative points respectively corresponding to one or more specific objects existing in the space; a measuring step of measuring the distance from each of the one or more representative points detected in the detecting step to the boundaries of the plurality of small spaces defined in the defining step; and a control step of controlling the output operations of the plurality of devices based on the measurement results of the measuring step.
  • an output control method according to another aspect is executed by a processor of an output control device that includes a camera having an imaging surface defined along a UV coordinate system and outputting an object scene image representing a space having a plane defined along an XY coordinate system, and includes the following: a search step of searching for an image representing a specific object existing in the space in the object scene image output from the camera; a calculating step of calculating the XY coordinates of the specific object with reference to a calibration parameter indicating the correspondence between the UV coordinate system and the XY coordinate system and to the image found in the search step; a detecting step of detecting the positional relationship between the specific object and a plurality of devices that generate output toward the space with reference to the XY coordinates calculated in the calculating step; and a control step of controlling the output operations of the plurality of devices with reference to the positional relationship detected in the detecting step.
  • a congestion degree measuring method according to the present invention is executed by a processor of a congestion degree measuring apparatus and includes the following steps: a defining step of defining a plurality of small spaces forming a space; a detecting step of detecting one or more representative points respectively corresponding to one or more specific objects existing in the space; a measuring step of measuring the distance from each of the one or more representative points detected in the detecting step to the boundaries of the plurality of small spaces defined in the defining step; a calculating step of calculating the degree of congestion of the specific objects in each of the plurality of small spaces based on the measurement results of the measuring step; and an output step of outputting congestion degree information indicating the calculated degree of congestion.
  • An output control system according to the present invention includes a plurality of devices that generate output toward a space and a control device that controls the plurality of devices, the control device comprising the following: defining means for defining a plurality of small spaces respectively corresponding to the plurality of devices; detecting means for detecting one or more representative points respectively corresponding to one or more specific objects existing in the space; measuring means for measuring the distance from each of the one or more representative points detected by the detecting means to the boundaries of the plurality of small spaces defined by the defining means; and output control means for controlling the output operations of the plurality of devices based on the measurement results of the measuring means.
  • An output control system according to another aspect includes a camera that has an imaging surface defined along a UV coordinate system and outputs an object scene image representing a space having a plane defined along an XY coordinate system, a plurality of devices that generate output toward the space, and a control device that controls the plurality of devices based on the object scene image output from the camera, the control device comprising the following: search means for searching for an image representing a specific object existing in the space in the object scene image output from the camera; calculating means for calculating the XY coordinates of the specific object with reference to a calibration parameter indicating the correspondence between the UV coordinate system and the XY coordinate system and to the image found by the search means; detecting means for detecting the positional relationship between the plurality of devices and the specific object with reference to the XY coordinates calculated by the calculating means; and output control means for controlling the output operations of the plurality of devices with reference to the positional relationship detected by the detecting means.
  • the plurality of small spaces respectively correspond to the plurality of devices that generate output toward the space.
  • the outputs of the plurality of devices are controlled with reference to the distances from the boundaries of the plurality of small spaces to the representative points of the specific objects existing in the space.
  • the position of the image representing the specific object existing in the space is also indicated by the UV coordinates.
  • the UV coordinates are converted into XY coordinates with reference to the calibration parameter (the XY coordinate system being the coordinate system that defines the plane partitioning the space), and the positional relationship between the plurality of devices and the specific object is detected with reference to the converted XY coordinates.
  • one or more specific objects exist in a space formed by a plurality of small spaces.
  • the degree of congestion of the specific object in each small space is calculated based on the distance from the representative point of the specific object to the boundary of the small space. Thereby, the evaluation performance of the congestion degree is improved.
  • FIG. 3 is an illustrative view showing one example of a configuration of a representative point register applied to the embodiment in FIG.
  • It is an illustrative view showing an example of a configuration of the weighting amount register applied to the embodiment in FIG. 2.
  • (A) is an illustrative view showing an example of the allocation state of the divided areas on the map image
  • (B) is an illustrative view showing an example of the allocation state of the measurement areas on the camera image.
  • (A) is an illustrative view showing an example of a camera image
  • (B) is an illustrative view showing an example of the corresponding bird's-eye view image.
  • (A) is an illustrative view showing one part of the weighting amount calculating operation
  • (B) is an illustrative view showing another part of the weighting amount calculating operation
  • (C) is an illustrative view showing still another part of the weighting amount calculating operation.
  • FIG. 11 is a flowchart showing still another portion of behavior of the CPU applied to the embodiment in FIG. 2;
  • FIG. 10 is a flowchart showing yet another portion of behavior of the CPU applied to the embodiment in FIG. 2;
  • FIG. 11 is a flowchart showing still another portion of behavior of the CPU applied to the embodiment in FIG.
  • the output control device of one embodiment of the present invention is basically configured as follows.
  • the defining means 1a defines a plurality of small spaces respectively corresponding to the plurality of devices 5a, 5a, … that generate output toward the space.
  • the detection unit 2a detects one or more representative points corresponding to one or more specific objects existing in the space.
  • the measuring means 3a measures the distance from each of one or more representative points detected by the detecting means 2a to the boundaries of a plurality of small spaces defined by the defining means.
  • the control unit 4a controls the output operations of the plurality of devices based on the measurement result of the measurement unit 3a.
  • a plurality of small spaces respectively correspond to a plurality of devices 5a, 5a,... That generate outputs toward the space. Further, the outputs of the plurality of devices 5a, 5a,... are controlled with reference to the distances from the boundaries of the plurality of small spaces to the representative points of specific objects existing in the space. As a result, the accuracy of adaptive control for the space can be improved.
  • the output control device of another embodiment of the present invention is basically configured as follows.
  • the camera 1b has an imaging surface defined along the UV coordinate system, and outputs an object scene image representing a space having a plane defined along the XY coordinate system.
  • the search means 2b searches for an image representing a specific object existing in the space from the object scene image output from the camera 1b.
  • the calculation unit 3b calculates the XY coordinates of the specific object with reference to the calibration parameter indicating the correspondence between the UV coordinate system and the XY coordinate system and the image found by the search unit 2b.
  • the detecting means 4b detects the positional relationship between the specific objects and the plurality of devices 6b, 6b,... That generate outputs toward the space with reference to the XY coordinates calculated by the calculating means 3b.
  • the control means 5b refers to the positional relationship detected by the detecting means 4b and controls the output operations of the plurality of devices 6b, 6b, ….
  • the position of the image representing the specific object existing in the space is also indicated by the UV coordinates.
  • the UV coordinates are converted into XY coordinates with reference to the calibration parameters (the XY coordinate system being the coordinate system that defines the plane partitioning the space), and the positional relationship between the plurality of devices 6b, 6b, … and the specific object is detected with reference to the converted XY coordinates.
  • the defining means 1c defines a plurality of small spaces forming a space.
  • the detection unit 2c detects one or more representative points corresponding to one or more specific objects existing in the space.
  • the measuring means 3c measures the distance from each of one or more representative points detected by the detecting means 2c to the boundaries of a plurality of small spaces defined by the defining means 1c.
  • the calculating unit 4c calculates the congestion degree of the specific object in each of the plurality of small spaces defined by the defining unit 1c based on the measurement result of the measuring unit 3c.
  • the output unit 5c outputs congestion level information indicating the congestion level calculated by the calculation unit 4c.
  • one or more specific objects exist in a space formed by a plurality of small spaces.
  • the degree of congestion of the specific object in each small space is calculated based on the distance from the representative point of the specific object to the boundary of the small space. Thereby, the evaluation performance of the congestion degree is improved.
  • the air conditioning control device 10 of this embodiment includes a camera 12 that repeatedly outputs image data representing an object scene (three-dimensional space) captured on the imaging surface.
  • the image data output from the camera 12 is captured by the image processing circuit 14 and subjected to camera image display processing by the CPU 14p.
  • an image representing the object scene, that is, a camera image, is displayed on the monitor 16.
  • the room RM1 is partitioned by the floor surface FL1, the ceiling HV1, and the four wall surfaces WL1 to WL4.
  • the camera 12 is provided above the wall surface WL1, and captures the internal space of the room RM1 from obliquely above. Therefore, the camera image is displayed on the monitor screen as shown in FIG.
  • the internal space of the room RM1 is defined by the X, Y, and Z axes that are orthogonal to each other, and the imaging surface of the camera 12 is defined by the U and V axes that are orthogonal to each other.
  • the air conditioners D_1 to D_6 are installed at predetermined intervals on the ceiling HV1. Each of the air conditioners D_1 to D_6 outputs air having a specified temperature with a specified air volume, and the temperature of the room RM1 is adjusted by the air thus output.
  • a system constituted by the air conditioning control device 10 shown in FIG. 2 and the air conditioners D_1 to D_6 shown in FIG. 3 is defined as an “air conditioning control system”.
  • the map image shown in FIG. corresponds to an image that schematically represents a bird's-eye view of the plane FL1.
  • marks M_1 to M_6 respectively representing air conditioners D_1 to D_6 are displayed corresponding to the positions of the air conditioners D_1 to D_6.
  • variable K is set to “1”.
  • when the mark M_K is clicked with the mouse pointer of the input device 18, coordinates indicating the clicked position, that is, click coordinates, are calculated.
  • the calculated click coordinates are described in the area register 14r1 shown in FIG. 6 corresponding to the variable K, and the variable K is incremented thereafter.
  • the click operation is accepted a total of six times corresponding to the marks M_1 to M_6, and thereby six click coordinates respectively corresponding to the marks M_1 to M_6 are set in the area register 14r1.
  • the map image is divided in the manner shown in FIG. 9A with the six click coordinates thus set as a reference.
  • the boundary lines BL_1 to BL_3 are drawn on the map image so as to surround the marks M_1 to M_6.
  • the divided areas MP_1 to MP_6 are allocated around the marks M_1 to M_6, and the internal space of the room RM1 is divided into a plurality of small spaces respectively corresponding to the divided areas MP_1 to MP_6.
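The disclosure does not name the algorithm used to divide the map image around the marks; one plausible reading is a nearest-mark (Voronoi-style) assignment, sketched below. The mark positions and the helper name `assign_to_nearest_mark` are illustrative, not taken from the patent figures.

```python
from math import hypot

def assign_to_nearest_mark(point, marks):
    """Return the index K of the mark closest to `point`.

    `marks` maps an area index K to the (x, y) click coordinates of mark M_K.
    Assigning every map-image position this way partitions the map into
    divided areas MP_1 to MP_6 around the marks (a Voronoi-style division)."""
    return min(marks, key=lambda k: hypot(point[0] - marks[k][0],
                                          point[1] - marks[k][1]))

# Illustrative mark positions only (six marks, as in the description).
marks = {1: (10, 10), 2: (30, 10), 3: (50, 10),
         4: (10, 30), 5: (30, 30), 6: (50, 30)}
```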
  • in the area register 14r1, a plurality of XY coordinates defining the divided area MP_K (K: 1 to 6) are described.
  • each of a plurality of XY coordinates that define the divided area MP_K is converted into UV coordinates according to Equation 1.
  • the calibration parameters P11 to P33 shown in Equation 1 correspond to a matrix for performing a planar projective transformation between the XY coordinate system that defines the plane FL1 and the UV coordinate system that defines the imaging surface of the camera 12, that is, the camera image. Therefore, by applying desired XY coordinates to Equation 1, the corresponding UV coordinates on the camera image are calculated.
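Equation 1 itself is not reproduced in this text; assuming it has the standard planar projective (homography) form with parameters P11 to P33, the XY-to-UV conversion can be sketched as follows. The matrix values here are purely illustrative; real values come from calibrating the camera 12 against the floor surface FL1.

```python
import numpy as np

# Illustrative calibration parameters P11..P33 (Equation 1), not real values.
P = np.array([[1.0, 0.0, 5.0],
              [0.0, 0.8, 3.0],
              [0.0, 0.0, 1.0]])

def xy_to_uv(x, y, P=P):
    """Planar projective transform from floor-plane XY coordinates to
    camera-image UV coordinates: [u', v', w']^T = P [x, y, 1]^T, then
    u = u'/w', v = v'/w'."""
    u, v, w = P @ np.array([x, y, 1.0])
    return u / w, v / w
```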
  • the plurality of UV coordinates thus converted are described in the area register 14r1 corresponding to the plurality of XY coordinates of the conversion source.
  • the measurement area DT_K corresponding to the divided area MP_K is defined on the camera image in the manner shown in FIG. 9B.
  • when the congestion degree measurement mode is selected by operating the input device 18, the following processing is executed by the CPU 14p every time the measurement period arrives.
  • a person image is searched from a camera image by pattern matching.
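The disclosure says only that person images are found by pattern matching, without naming the algorithm; a minimal sum-of-squared-differences template search is one possible sketch. The function name and threshold are assumptions.

```python
import numpy as np

def search_person_images(camera_image, template, threshold=1e-6):
    """Naive pattern matching: slide `template` over `camera_image` and
    report the top-left (u, v) positions where the sum of squared
    differences falls below `threshold`."""
    H, W = camera_image.shape
    h, w = template.shape
    hits = []
    for v in range(H - h + 1):
        for u in range(W - w + 1):
            ssd = np.sum((camera_image[v:v + h, u:u + w] - template) ** 2)
            if ssd <= threshold:
                hits.append((u, v))
    return hits
```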
  • the variable L is set to each of “1” to “Lmax” (Lmax: the total number of person images found), and the following processing is performed for each of the one or more found person images.
  • the representative point of the L-th person image is determined as “RP_L”.
  • the determined XY coordinates of the representative point RP_L are described in the representative point register 14r2 shown in FIG.
  • the representative point RP_1 is determined on the image representing the person H1
  • the representative point RP_2 is determined on the image representing the person H2
  • a representative point RP_3 is determined on the image representing the person H3.
  • variable L is again set to each of “1” to “Lmax”, and the divided area to which the Lth representative point belongs among the divided areas MP_1 to MP_6 is detected as the representative area.
  • the divided area MP_2 is a representative area corresponding to the representative point RP_1
  • the divided area MP_5 is a representative area corresponding to the representative point RP_2
  • the divided area MP_6 is a representative area corresponding to the representative point RP_3.
  • variable M is set to each of “1” to “3”, and the distance from the representative point RP_L to the boundary line BL_M is measured. If the calculated distance is less than the threshold value TH, a divided area in contact with the representative area across the boundary line BL_M is detected as a peripheral area.
  • the distance from representative point RP_1 to boundary line BL_1 is less than threshold TH, and the distance from representative point RP_1 to each of boundary lines BL_2 and BL_3 is greater than or equal to threshold TH. Therefore, for the representative point RP_1, the divided area MP_5 is detected as a peripheral area.
  • the distance from representative point RP_2 to each of boundary lines BL_1 and BL_3 is less than threshold TH, and the distance from representative point RP_2 to boundary line BL_2 is greater than or equal to threshold TH. Therefore, for the representative point RP_2, the divided areas MP_2 and MP_6 are detected as peripheral areas.
  • the distance from representative point RP_3 to each of boundary lines BL_1 to BL_3 is equal to or greater than threshold value TH. Therefore, the peripheral area is not detected for the representative point RP_3.
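The distance measurement from a representative point RP_L to a boundary line BL_M, and the detection of peripheral areas against the threshold TH, might be sketched as follows. Treating each boundary line as a line segment, and the shape of the `boundaries` mapping, are assumptions of this sketch.

```python
from math import hypot

def distance_to_boundary(p, a, b):
    """Distance from representative point `p` to a boundary line given as
    the segment from `a` to `b` (shortest distance to the segment)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return hypot(px - (ax + t * dx), py - (ay + t * dy))

def peripheral_areas(rep_point, boundaries, th):
    """Return the areas lying across each boundary line whose distance to
    `rep_point` is below the threshold TH.  `boundaries` maps a boundary
    name to (segment_start, segment_end, area_across_the_boundary)."""
    return [area for (a, b, area) in boundaries.values()
            if distance_to_boundary(rep_point, a, b) < th]
```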
  • the weighting amount “1.0/N” (N: the total number of the representative area and the detected peripheral areas) is distributed to the representative area and the peripheral areas. Specifically, the weighting amounts are described in the weighting amount register 14r3 shown in FIG. 8 corresponding to the representative area and the peripheral areas. Therefore, if no peripheral area is detected, a weighting amount of “1.0” is assigned to the representative area. On the other hand, if even one peripheral area is detected, a total weighting amount of “1.0” is evenly distributed over the representative area and the peripheral areas, so that each receives less than “1.0”.
  • a weighting amount of “0.5” is assigned to each of the divided areas MP_2 and MP_5 with respect to the person H1, and a weighting amount of “0.33” is assigned to each of the divided areas MP_5, MP_2, and MP_6 with respect to the person H2.
  • a weighting amount of “1.0” is assigned to the divided area MP_6 with respect to the person H3.
  • the variable K is set to each of “1” to “6”, and the congestion degree CR_K is calculated.
  • the degree of congestion CR_K corresponds to the sum of the weighting amounts assigned to the divided area MP_K. Therefore, in the example of FIGS. 11A to 11C, the congestion degrees CR_2 and CR_5 indicate “0.83”, the congestion degree CR_6 indicates “1.33”, and the congestion degrees CR_1, CR_3, and CR_4 indicate “0.0”.
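The weight distribution and the congestion sums for the three-person example can be reproduced in a short sketch (the “0.33” and “0.83” in the text are the rounded values of 1/3 and 5/6; function names are illustrative):

```python
def distribute_weight(representative, peripherals):
    """Distribute a total weighting amount of 1.0 evenly over the
    representative area and its peripheral areas (N areas in total)."""
    areas = [representative, *peripherals]
    share = 1.0 / len(areas)
    return {k: share for k in areas}

def congestion_degrees(persons, num_areas=6):
    """Sum, per divided area MP_K, the weighting amounts assigned for all
    detected persons; the sum is the congestion degree CR_K."""
    cr = {k: 0.0 for k in range(1, num_areas + 1)}
    for representative, peripherals in persons:
        for k, w in distribute_weight(representative, peripherals).items():
            cr[k] += w
    return cr

# Example from the description: H1 -> MP_2 (peripheral MP_5),
# H2 -> MP_5 (peripherals MP_2 and MP_6), H3 -> MP_6 (no peripherals).
cr = congestion_degrees([(2, [5]), (5, [2, 6]), (6, [])])
```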
  • the outputs of the air conditioners D_1 to D_6 are controlled based on the congestion degrees CR_1 to CR_6 thus calculated. Specifically, the output of the air conditioner corresponding to a divided area with a high degree of congestion is increased, and the output of the air conditioner corresponding to a divided area with a low degree of congestion is reduced.
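The disclosure states only that output is increased for congested areas and reduced for uncongested ones; the concrete control law below (a base output plus a term proportional to CR_K, clamped to a device maximum) is purely illustrative, as are the parameter names and values.

```python
def air_conditioner_outputs(cr, base=0.2, gain=1.0, max_output=2.0):
    """Illustrative control law: air conditioner D_K runs at a base output
    plus a term proportional to the congestion degree CR_K of its divided
    area MP_K, clamped to the device maximum."""
    return {k: min(max_output, base + gain * v) for k, v in cr.items()}
```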
  • the CPU 14p executes a plurality of tasks including a main task shown in FIG. 12, an area setting task shown in FIGS. 13 to 14, and a congestion degree measuring task shown in FIGS. Note that control programs corresponding to these tasks are stored in the recording medium 20.
  • in step S1, camera image display processing is executed. As a result, a camera image is displayed on the monitor 16.
  • in step S3, it is determined whether or not the current operation mode is the area setting mode, and in step S7, it is determined whether or not the current operation mode is the congestion degree measurement mode.
  • if “YES” in step S3, an area setting task is activated in step S5, and thereafter the process proceeds to step S15. If “YES” in step S7, it is determined in step S9 whether or not the divided areas MP_1 to MP_6 have been set. If the determination result is YES, the congestion degree measurement task is activated in step S11 and the process then proceeds to step S15; if the determination result is NO, the process proceeds directly to step S15. If both steps S3 and S7 are NO, other processing is executed in step S13, and the process then proceeds to step S15.
  • in step S15, it is repeatedly determined whether or not a mode change operation has been performed.
  • when the determination result is updated from NO to YES, the activated task is terminated in step S17, and thereafter the process returns to step S3.
  • in step S21, the map image is displayed on the monitor 16, and in step S23, the variable K is set to “1”.
  • in step S25, it is determined whether or not a click operation for designating an area has been performed. When the determination result is updated from NO to YES, click coordinates are calculated in step S27. The calculated coordinates are described in the area register 14r1 corresponding to the variable K.
  • in step S29, it is determined whether or not the variable K has reached “6”. If the determination result is NO, the variable K is incremented in step S31 and the process returns to step S25; if the determination result is YES, the process proceeds to step S33.
  • in step S33, the map image is divided with reference to the click coordinates. As a result, the six divided areas MP_1 to MP_6 are allocated on the map image.
  • in step S35, the boundary lines BL_1 to BL_3 partitioning the divided areas MP_1 to MP_6 are drawn on the map image.
  • in step S37, the variable K is set to “1”, and in step S39, a plurality of XY coordinates defining the divided area MP_K are calculated.
  • the calculated XY coordinates are described in the area register 14r1 corresponding to the variable K.
  • in step S41, each of the plurality of XY coordinates that define the divided area MP_K is converted into UV coordinates according to Equation 1.
  • the converted UV coordinates are described in the area register 14r1 corresponding to the variable K, whereby the measurement area DT_K corresponding to the divided area MP_K is assigned to the camera image.
  • in step S43, it is determined whether or not the variable K has reached “6”. If the determination result is NO, the variable K is incremented in step S45 and the process returns to step S39. If the determination result is YES, the process ends.
  • in step S51, it is determined whether or not the measurement cycle has arrived.
  • if so, the process proceeds to step S53, and a person image is searched for in the camera image by pattern matching.
  • in step S55, it is determined whether or not one or more person images have been found. If the determination result is NO, the process returns to step S51, whereas if the determination result is YES, the process proceeds to step S57.
  • in step S57, the variable L is set to “1”.
  • in step S59, the representative point of the L-th person image among the one or more found person images is determined as “RP_L”, and in step S61, the XY coordinates of the representative point RP_L are calculated.
  • the calculated XY coordinates are described in the representative point register 14r2 corresponding to the variable L.
  • In step S67 the variable L is set to “1” again.
  • In step S69 the divided area to which the representative point RP_L belongs, among the divided areas MP_1 to MP_6, is detected as the representative area.
  • In step S71 the variables M and N are set to “1”. In step S73 the distance from the representative point RP_L to the boundary line BL_M is measured, and in step S75 it is determined whether or not the measured distance is less than the threshold value TH.
  • In step S81 a divided area in contact with the representative area across the boundary line BL_M is detected as a peripheral area.
  • In step S79 the variable N is incremented.
  • In step S87 it is determined whether or not the variable L has reached the maximum value Lmax. If the determination result is NO, the variable L is incremented in step S89 and then the process returns to step S69; if the determination result is YES, the process proceeds to step S91. In step S91 the variable K is set to “1”, and in step S93 the degree of congestion CR_K is calculated. The degree of congestion CR_K corresponds to the sum of the weighting amounts assigned to the divided area MP_K.
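The per-step weighting assignments (steps S77, S83, S85) are not shown in this excerpt. The sketch below captures the overall idea under one assumed realization: each representative point contributes a fixed unit weight (consistent with the later statement that the total per representative point is common among points), split evenly between the representative area and any peripheral area lying across a boundary closer than TH, and CR_K is the per-area sum. The even split and the data layout are assumptions.

```python
def point_to_line_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b
    (the boundary lines BL_M are straight lines on the map image)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    return abs(dy * (px - ax) - dx * (py - ay)) / (dx * dx + dy * dy) ** 0.5

def congestion(points, rep_area_of, boundaries, th, n_areas):
    """boundaries: list of (a, b, (area1, area2)) giving each boundary
    line's endpoints and the two divided areas it separates."""
    cr = [0.0] * n_areas
    for p in points:
        rep = rep_area_of(p)
        near = {rep}
        for a, b, (a1, a2) in boundaries:
            if rep in (a1, a2) and point_to_line_distance(p, a, b) < th:
                near.add(a2 if rep == a1 else a1)
        for k in near:              # unit weight split evenly (assumption)
            cr[k] += 1.0 / len(near)
    return cr

# Two areas 0 (x < 5) and 1 (x >= 5) separated by the vertical line x = 5.
bl = [((5.0, 0.0), (5.0, 10.0), (0, 1))]
area = lambda p: 0 if p[0] < 5.0 else 1
pts = [(1.0, 5.0),   # deep inside area 0 -> full weight to area 0
       (4.5, 5.0)]   # near the boundary  -> weight shared with area 1
print(congestion(pts, area, bl, th=1.0, n_areas=2))  # [1.5, 0.5]
```

Sharing weight across a nearby boundary smooths the congestion estimate for a person standing between two air conditioners, which is the point of measuring the boundary distance at all.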
  • In step S95 it is determined whether or not the variable K has reached “6”. If the determination result is NO, the variable K is incremented in step S97 and the process returns to step S93; if the determination result is YES, the process proceeds to step S99.
  • In step S99 the outputs of the air conditioners D1 to D6 are controlled with reference to the congestion levels CR_1 to CR_6 calculated in step S93. Specifically, the output of the air conditioner corresponding to a divided area with a high degree of congestion is strengthened, and the output of the air conditioner corresponding to a divided area with a low degree of congestion is weakened.
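The excerpt does not specify how step S99 maps congestion levels to output strength; the proportional mapping below is only one plausible realization, with the base level and gain as illustrative assumptions.

```python
def air_conditioner_outputs(cr, base=0.5, gain=0.5):
    """Scale each unit's output with its area's share of the total
    congestion: crowded areas are strengthened, empty ones weakened."""
    total = sum(cr) or 1.0          # avoid dividing by zero in an empty room
    return [base + gain * (c / total) * len(cr) for c in cr]

# Areas MP_1..MP_6 with congestion concentrated in the first three.
levels = air_conditioner_outputs([2.0, 1.0, 1.0, 0.0, 0.0, 0.0])
print(levels)   # strongest where most crowded, base level elsewhere
```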
  • The CPU 14p defines a plurality of divided areas MP_1 to MP_6 respectively corresponding to the plurality of air conditioners D_1 to D_6 that generate outputs toward the room RM1 (S21 to S45), and detects one or more representative points respectively corresponding to one or more persons existing in the room RM1 (S53, S57 to S65).
  • The CPU 14p also measures the distance from each of the detected one or more representative points to the boundaries of the divided areas MP_1 to MP_6 (S67 to S73, S81 to S83, S87 to S89), and controls the output operations of the air conditioners D_1 to D_6 based on the measurement results (S75 to S79, S85, S91 to S99).
  • the divided areas MP_1 to MP_6 respectively correspond to the air conditioners D_1 to D_6 that generate outputs toward the room RM1.
  • the outputs of the air conditioners D_1 to D_6 are controlled with reference to the distance from the boundary between the divided areas MP_1 to MP_6 to the representative point of the person existing in the room RM1. Thereby, the accuracy of adaptive control for the room RM1 can be improved.
  • Since the degree of congestion is measured by searching for persons in the manner described above, the person presence/absence discrimination accuracy, and hence the comfort of the persons present in the room RM1, can be maintained.
  • In the above-described embodiment, the distance from the representative point of the person image to each of the boundary lines BL_1 to BL_3 is measured. However, the congestion degrees CR_1 to CR_6 may instead be calculated by measuring the distance from the representative point of the person image to each of the air conditioners D_1 to D_6.
  • steps S101 to S117 shown in FIG. 18 are executed instead of steps S69 to S85 shown in FIG.
  • step S101 as in step S69, the divided area to which the representative point RP_L belongs is detected as a representative area.
  • step S103 the variable K is set to “1”, and in step S105, the distance from the representative point RP_L to the air conditioner D_K is measured as “DS_K”.
  • In step S107 it is determined whether or not the variable K has reached “6”. If the determination result is NO, the variable K is incremented in step S109 and then the process returns to step S105; if the determination result is YES, the process proceeds to step S111. The positional relationship between the Lth person and the air conditioners D_1 to D_6 is clarified by the distances DS_1 to DS_6 thus measured.
  • step S111 the variable K is set to “1” again.
  • “a_K” and “b_K” are variables depending on the current output of the air conditioner D_K and the indoor environment.
  • the function value f (D_K) is defined by the variables a_K and b_K and the distance DS_K.
  • the weighting amount W_K is derived by subtracting from “1” the ratio of the function value f (D_K) to the sum of the function values f (D_1) to f (D_6) respectively corresponding to the air conditioners D_1 to D_6.
  • the variables a_K and b_K are adjusted so that the sum of the weighting amounts W_1 to W_6 is “1.0”.
  • In step S115 it is determined whether or not the variable K has reached “6”. If the determination result is NO, the variable K is incremented in step S117 and then the process returns to step S113; if the determination result is YES, the process proceeds to step S87.
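The functional form of f(D_K) is not given in this excerpt. Assuming, for illustration, that f is linear in the distance DS_K, the weighting amounts W_K can be sketched as below; the closing renormalization stands in for the described adjustment of a_K and b_K so that W_1 + ... + W_6 equals 1.0.

```python
def weights(distances, a, b):
    """Weighting amounts W_K from person-to-device distances DS_K.
    f(D_K) = a_K + b_K * DS_K is an assumed form; W_K is one minus
    f(D_K)'s share of the sum, renormalized so that sum(W) == 1."""
    f = [a[k] + b[k] * d for k, d in enumerate(distances)]
    s = sum(f)
    raw = [1.0 - fk / s for fk in f]   # far devices get small weights
    total = sum(raw)                   # equals len(f) - 1 before scaling
    return [r / total for r in raw]

ds = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]       # distances DS_1..DS_6
w = weights(ds, a=[0.0] * 6, b=[1.0] * 6)
print(round(sum(w), 10))                   # 1.0
```

Subtracting the share from 1 inverts the relation, so the nearest air conditioner receives the largest weighting amount.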
  • In the above-described embodiment, the coordinates of the marks M_1 to M_6 are designated by clicking with the mouse pointer. Alternatively, the coordinate values of the marks M_1 to M_6 may be specified directly.
  • In the above-described embodiment, the output of the air conditioners is adaptively controlled. Instead, the output (that is, the brightness) of lighting devices may be adaptively controlled.
  • planar projective transformation with reference to Equation 1 is assumed, but perspective projective transformation may be performed instead.
  • In the above-described embodiment, an image schematically representing a bird's-eye view of the floor surface FL1 is adopted as the map image.
  • the map image may be generated by performing bird's-eye conversion with reference to Equation 1 described above on the camera image.
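Such a bird's-eye conversion can be sketched as an inverse warp: each map-image pixel's position is projected into the camera image using Equation 1 (again assumed here to be a 3x3 homography, with illustrative values) and sampled nearest-neighbor.

```python
def warp_to_birds_eye(cam, h_inv, width, height):
    """Build a width x height map image by sampling the camera image:
    h_inv maps map-image (x, y) to camera (u, v)."""
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            u = h_inv[0][0] * x + h_inv[0][1] * y + h_inv[0][2]
            v = h_inv[1][0] * x + h_inv[1][1] * y + h_inv[1][2]
            w = h_inv[2][0] * x + h_inv[2][1] * y + h_inv[2][2]
            ui, vi = int(round(u / w)), int(round(v / w))
            if 0 <= vi < len(cam) and 0 <= ui < len(cam[0]):
                row.append(cam[vi][ui])
            else:
                row.append(0)          # outside the camera's field of view
        out.append(row)
    return out

# With an identity homography the map image is simply a crop.
cam = [[1, 2], [3, 4]]
I = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print(warp_to_birds_eye(cam, I, 2, 2))  # [[1, 2], [3, 4]]
```

Inverse mapping (output pixel to input pixel) avoids the holes that a forward warp would leave in the map image.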
  • the outputs of the air conditioners D1 to D6 are controlled with reference to the congestion levels CR_1 to CR_6 calculated in step S93.
  • a process of displaying the congestion degree information on the monitor 16 may be executed. In this case, the process shown in FIG. 17 is partially corrected as shown in FIG.
  • In step S123 the congestion degree information, including the number of person images discovered by the search process of step S53, the congestion levels CR_1 to CR_6 calculated in step S93, and the current time detected in step S121, is displayed on the monitor 16.
  • When the process of step S123 is completed, the process proceeds to step S99 or returns to step S51.
  • When the process of step S99 shown in FIG. 17 is omitted, the name of the device changes from “air conditioning control device” to “congestion degree measuring device”. In the congestion degree information shown in FIG. 20, the congestion level may be expressed as a percentage.
  • In the above-described embodiment, one or more person images are searched for in the camera image by pattern matching, and a representative point is determined on each found person image (see steps S53 to S59 in FIG. 15).
  • Alternatively, one or more motion images may be detected in the camera image, and a representative point may be determined on each detected motion image.
  • the process shown in FIG. 15 is partially corrected as shown in FIG.
  • When the determination result in step S51 is updated from NO to YES, the process proceeds to step S131, and one or more motion images are searched for on the camera image. Specifically, pixels indicating motion are detected by focusing on the inter-frame difference, and the detected pixels are divided into contiguous pixel clusters. One such cluster corresponds to one motion image. An identification number starting from “1” is assigned to each motion image detected in this way.
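A minimal sketch of step S131, under the assumptions that "motion" means a per-pixel absolute inter-frame difference above a threshold and that clusters are 4-connected components (labelled from 1, as the text describes); the frames and threshold are illustrative.

```python
def motion_images(prev, cur, diff_th):
    """Label 4-connected clusters of pixels whose inter-frame
    difference exceeds diff_th; returns a dict id -> pixel list."""
    h, w = len(cur), len(cur[0])
    moving = {(r, c) for r in range(h) for c in range(w)
              if abs(cur[r][c] - prev[r][c]) > diff_th}
    clusters, seen, next_id = {}, set(), 1
    for seed in sorted(moving):
        if seed in seen:
            continue
        stack, pixels = [seed], []
        seen.add(seed)
        while stack:                       # depth-first flood fill
            r, c = stack.pop()
            pixels.append((r, c))
            for nb in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if nb in moving and nb not in seen:
                    seen.add(nb)
                    stack.append(nb)
        clusters[next_id] = pixels          # identification numbers from "1"
        next_id += 1
    return clusters

prev = [[0, 0, 0, 0],
        [0, 0, 0, 0]]
cur  = [[9, 9, 0, 0],
        [0, 0, 0, 9]]
print(sorted(motion_images(prev, cur, 5)))  # [1, 2]: two motion images
```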
  • In step S55 it is determined whether or not at least one motion image has been found. If the determination result is NO, the process returns to step S51; if the determination result is YES, the variable L is set to “1” in step S57. In the subsequent step S133, the representative point of the Lth motion image is determined as “RP_L”, and then the process proceeds to step S61.
  • In the above-described embodiment, an area setting mode and a congestion degree measurement mode are prepared. However, if areas defined by another device are registered in the register 14r1, the area setting mode is unnecessary, and so is the input device 18.

Landscapes

  • Engineering & Computer Science (AREA)
  • Combustion & Propulsion (AREA)
  • General Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Chemical & Material Sciences (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Mathematical Physics (AREA)
  • Fuzzy Systems (AREA)
  • Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Air Conditioning Control Device (AREA)
  • Image Analysis (AREA)

Abstract

Air conditioners (D_1 to D_6) are installed in the ceiling (HV1) of a room (RM1) and generate output toward the room (RM1). A plurality of sub-areas corresponding to each of the air conditioners (D_1 to D_6) are allocated in the room (RM1). A CPU detects one or two or more representative points corresponding to one or two or more people in the room (RM1), measures the distance from each of the one or two or more detected representative points to the boundary of a sub-area, and controls the output operations of the air conditioners (D_1 to D_6) in accordance with the measurement results.

Description

Output control device
 The present invention relates to an output control device, and more particularly to an output control device that measures the degree of congestion of specific objects and controls the output of a device accordingly.
 An example of this type of device is disclosed in Patent Document 1. According to this background art, a camera is installed in an elevator hall or in a car. The image output from the camera is divided into a plurality of blocks, and the occupancy rate of waiting customers or passengers is calculated for each block. The number of waiting customers or passengers is determined based on the calculated occupancy rate, and the determination result is transmitted to an elevator control device.
Patent Document 1: JP-A-6-92563
 However, in the background art, the coordinate system defining the imaging surface of the camera does not always coincide with the coordinate system defining the plane on which the waiting customers or passengers exist. If the two coordinate systems differ, the calculation accuracy of the occupancy rate, and hence the accuracy of the elevator control, deteriorates.
 Therefore, a main object of the present invention is to provide an output control device capable of improving the accuracy of adaptive output control.
 A motion detection device according to the present invention comprises: defining means for defining a plurality of small spaces respectively corresponding to a plurality of devices that generate outputs toward a space; detecting means for detecting one or more representative points respectively corresponding to one or more specific objects existing in the space; measuring means for measuring the distance from each of the one or more representative points detected by the detecting means to the boundaries of the plurality of small spaces defined by the defining means; and control means for controlling the output operations of the plurality of devices based on the measurement results of the measuring means.
 Preferably, a camera that captures the space is further provided, and the detecting means includes searching means for searching for one or more specific object images in the object scene image output from the camera, and determining means for determining a representative point on each specific object image found by the searching means.
 More preferably, the space is defined according to an XYZ coordinate system, and the camera has an imaging surface defined according to a UV coordinate system.
 Preferably, the control means includes assigning means for assigning different weighting coefficients to each of the plurality of small spaces according to the magnitude of the distance measured by the measuring means, calculating means for calculating the degree of congestion of the specific objects for each small space based on the weighting coefficients assigned by the assigning means, and adjusting means for adjusting the outputs of the plurality of devices based on the degrees of congestion calculated by the calculating means.
 More preferably, the sum of the weighting coefficients assigned by the assigning means in correspondence with each of the one or more representative points is common among the one or more representative points.
 Preferably, the defining means includes reproducing means for reproducing a reference image representing a bird's-eye view of the plane defining the space, accepting means for accepting, on the reference image reproduced by the reproducing means, a designation operation designating a plurality of areas in contact with each other, and definition processing means for defining the plurality of small spaces corresponding to the plurality of areas designated by the designation operation.
 An output control device according to the present invention comprises: a camera that has an imaging surface defined along a UV coordinate system and outputs an object scene image representing a space having a plane defined along an XY coordinate system; searching means for searching the object scene image output from the camera for an image representing a specific object existing in the space; calculating means for calculating the XY coordinates of the specific object with reference to the image found by the searching means and a calibration parameter indicating the correspondence between the UV coordinate system and the XY coordinate system; detecting means for detecting, with reference to the XY coordinates calculated by the calculating means, the positional relationship between the specific object and a plurality of devices that generate outputs toward the space; and control means for controlling the output operations of the plurality of devices with reference to the positional relationship detected by the detecting means.
 A congestion degree measuring device according to the present invention comprises: defining means for defining a plurality of small spaces forming a space; detecting means for detecting one or more representative points respectively corresponding to one or more specific objects existing in the space; measuring means for measuring the distance from each of the one or more representative points detected by the detecting means to the boundaries of the plurality of small spaces defined by the defining means; calculating means for calculating the degree of congestion of the specific objects in each of the plurality of small spaces defined by the defining means based on the measurement results of the measuring means; and output means for outputting congestion degree information indicating the degrees of congestion calculated by the calculating means.
 An output control method according to the present invention is an output control method executed by a processor of an output control device, and comprises: a defining step of defining a plurality of small spaces respectively corresponding to a plurality of devices that generate outputs toward a space; a detecting step of detecting one or more representative points respectively corresponding to one or more specific objects existing in the space; a measuring step of measuring the distance from each of the one or more representative points detected in the detecting step to the boundaries of the plurality of small spaces defined in the defining step; and a control step of controlling the output operations of the plurality of devices based on the measurement results of the measuring step.
 An output control method according to the present invention is an output control method executed by a processor of an output control device provided with a camera that has an imaging surface defined along a UV coordinate system and outputs an object scene image representing a space having a plane defined along an XY coordinate system, and comprises: a searching step of searching the object scene image output from the camera for an image representing a specific object existing in the space; a calculating step of calculating the XY coordinates of the specific object with reference to the image found in the searching step and a calibration parameter indicating the correspondence between the UV coordinate system and the XY coordinate system; a detecting step of detecting, with reference to the XY coordinates calculated in the calculating step, the positional relationship between the specific object and a plurality of devices that generate outputs toward the space; and a control step of controlling the output operations of the plurality of devices with reference to the positional relationship detected in the detecting step.
 A congestion degree measuring method according to the present invention is a congestion degree measuring method executed by a processor of a congestion degree measuring device, and comprises: a defining step of defining a plurality of small spaces forming a space; a detecting step of detecting one or more representative points respectively corresponding to one or more specific objects existing in the space; a measuring step of measuring the distance from each of the one or more representative points detected in the detecting step to the boundaries of the plurality of small spaces defined in the defining step; a calculating step of calculating the degree of congestion of the specific objects in each of the plurality of small spaces defined in the defining step based on the measurement results of the measuring step; and an output step of outputting congestion degree information indicating the degrees of congestion calculated in the calculating step.
 An output control system according to the present invention is an output control system comprising a plurality of devices that generate outputs toward a space and a control device that controls the plurality of devices, wherein the control device comprises: defining means for defining a plurality of small spaces respectively corresponding to the plurality of devices; detecting means for detecting one or more representative points respectively corresponding to one or more specific objects existing in the space; measuring means for measuring the distance from each of the one or more representative points detected by the detecting means to the boundaries of the plurality of small spaces defined by the defining means; and output control means for controlling the output operations of the plurality of devices based on the measurement results of the measuring means.
 An output control system according to the present invention is an output control system comprising a camera that has an imaging surface defined along a UV coordinate system and outputs an object scene image representing a space having a plane defined along an XY coordinate system, a plurality of devices that generate outputs toward the space, and a control device that controls the plurality of devices based on the object scene image output from the camera, wherein the control device comprises: searching means for searching the object scene image output from the camera for an image representing a specific object existing in the space; calculating means for calculating the XY coordinates of the specific object with reference to the image found by the searching means and a calibration parameter indicating the correspondence between the UV coordinate system and the XY coordinate system; detecting means for detecting the positional relationship between the plurality of devices and the specific object with reference to the XY coordinates calculated by the calculating means; and output control means for controlling the output operations of the plurality of devices with reference to the positional relationship detected by the detecting means.
 According to the present invention, the plurality of small spaces respectively correspond to the plurality of devices that generate outputs toward the space. The outputs of the plurality of devices are controlled with reference to the distances from the boundaries of the plurality of small spaces to the representative points of the specific objects existing in the space. As a result, the accuracy of adaptive control directed at the space can be improved.
 According to the present invention, since the imaging surface of the camera is defined along the UV coordinate system, the position of an image representing a specific object existing in the space is also indicated by UV coordinates. These UV coordinates are converted into XY coordinates (the XY coordinate system being the coordinate system defining the plane that bounds the space) with reference to the calibration parameter, and the positional relationship between the plurality of devices and the specific object is detected with reference to the converted XY coordinates. By controlling the output operations of the plurality of devices with reference to the positional relationship thus detected, output control adaptive to the movement of the specific object is realized with high accuracy.
 According to the present invention, one or more specific objects exist in a space formed by a plurality of small spaces. The degree of congestion of the specific objects in each small space is calculated based on the distances from the representative points of the specific objects to the boundaries of the small space. This improves the evaluation performance for the degree of congestion.
 The above object, other objects, features, and advantages of the present invention will become more apparent from the following detailed description of embodiments with reference to the drawings.
FIG. 1(A) is a block diagram showing the basic configuration of one embodiment of the present invention, and FIG. 1(B) is a block diagram showing the basic configuration of another embodiment of the present invention.
FIG. 2 is a block diagram showing the configuration of one embodiment of the present invention.
FIG. 3 is an illustrative view showing one example of the installation state of the camera applied to the embodiment in FIG. 2.
FIG. 4 is an illustrative view showing one example of a camera image displayed on the monitor of the embodiment in FIG. 2.
FIG. 5 is an illustrative view showing one example of a map image displayed on the monitor of the embodiment in FIG. 2.
FIG. 6 is an illustrative view showing one example of the configuration of the area register applied to the embodiment in FIG. 2.
FIG. 7 is an illustrative view showing one example of the configuration of the representative point register applied to the embodiment in FIG. 2.
FIG. 8 is an illustrative view showing one example of the configuration of the weighting amount register applied to the embodiment in FIG. 2.
FIG. 9(A) is an illustrative view showing one example of the allocation of divided areas on the map image, and FIG. 9(B) is an illustrative view showing one example of the allocation of measurement areas on the camera image.
FIG. 10(A) is an illustrative view showing one example of a camera image, and FIG. 10(B) is an illustrative view showing one example of the corresponding bird's-eye image.
FIG. 11(A), FIG. 11(B), and FIG. 11(C) are illustrative views showing parts of the weighting amount calculation operation.
FIG. 12 to FIG. 17 are flowcharts each showing a part of the behavior of the CPU applied to the embodiment in FIG. 2.
FIG. 18 is a flowchart showing a part of the behavior of the CPU applied to another embodiment.
FIG. 19 is a flowchart showing a part of the behavior of the CPU applied to still another embodiment.
FIG. 20 is an illustrative view showing one example of the congestion degree information displayed on the monitor screen by the process shown in FIG. 19.
FIG. 21 is a flowchart showing a part of the behavior of the CPU applied to yet another embodiment.
 図1(A)を参照して、この発明の一実施例の出力制御装置は、基本的に次のように構成される。定義手段1aは、空間に向けて出力を発生する複数の装置5a,5a,…にそれぞれ対応する複数の小空間を定義する。検出手段2aは、空間に存在する1または2以上の特定物体にそれぞれ対応する1または2以上の代表点を検出する。測定手段3aは、検出手段2aによって検出された1または2以上の代表点の各々から定義手段によって定義された複数の小空間の境界までの距離を測定する。制御手段4aは、測定手段3aの測定結果に基づいて複数の装置の出力動作を制御する。 Referring to FIG. 1 (A), the output control device of one embodiment of the present invention is basically configured as follows. The defining means 1a defines a plurality of small spaces respectively corresponding to the plurality of devices 5a, 5a,. The detection unit 2a detects one or more representative points corresponding to one or more specific objects existing in the space. The measuring means 3a measures the distance from each of one or more representative points detected by the detecting means 2a to the boundaries of a plurality of small spaces defined by the defining means. The control unit 4a controls the output operations of the plurality of devices based on the measurement result of the measurement unit 3a.
 複数の小空間は、空間に向けて出力を発生する複数の装置5a,5a,…にそれぞれ対応する。また、複数の装置5a,5a,…の出力は、複数の小空間の境界から空間に存在する特定物体の代表点までの距離を参照して制御される。これによって、空間に向けた適応的な制御の精度を向上させることができる。 A plurality of small spaces respectively correspond to a plurality of devices 5a, 5a,... That generate outputs toward the space. Further, the outputs of the plurality of devices 5a, 5a,... Are controlled with reference to the distances from the boundaries of the plurality of small spaces to the representative points of specific objects existing in the space. As a result, the accuracy of adaptive control for the space can be improved.
 図1(B)を参照して、この発明の一実施例の出力制御装置は、基本的に次のように構成される。カメラ1bは、UV座標系に沿って定義された撮像面を有し、XY座標系に沿って定義された平面を有する空間を表す被写界像を出力する。探索手段2bは、カメラ1bから出力された被写界像から空間に存在する特定物体を表す画像を探索する。算出手段3bは、UV座標系とXY座標系との対応関係を示す校正パラメータと探索手段2bによって発見された画像とを参照して特定物体のXY座標を算出する。検出手段4bは、空間に向けて出力を発生する複数の装置6b,6b,…と特定物体との位置関係を算出手段3bによって算出されたXY座標を参照して検出する。制御手段5bは、検出手段4bによって検出された位置関係を参照して複数の装置6b,6b,…の出力動作を制御する。 Referring to FIG. 1 (B), the output control device of one embodiment of the present invention is basically configured as follows. The camera 1b has an imaging surface defined along the UV coordinate system, and outputs an object scene image representing a space having a plane defined along the XY coordinate system. The search means 2b searches for an image representing a specific object existing in the space from the object scene image output from the camera 1b. The calculation unit 3b calculates the XY coordinates of the specific object with reference to the calibration parameter indicating the correspondence between the UV coordinate system and the XY coordinate system and the image found by the search unit 2b. The detecting means 4b detects the positional relationship between the specific objects and the plurality of devices 6b, 6b,... That generate outputs toward the space with reference to the XY coordinates calculated by the calculating means 3b. The control means 5b refers to the positional relationship detected by the detection means 4b and controls the output operations of the plurality of devices 6b, 6b,.
 Since the imaging surface of the camera 1b is defined along the UV coordinate system, the position of the image representing the specific object existing in the space is likewise expressed in UV coordinates. These UV coordinates are converted, with reference to the calibration parameters, into XY coordinates (the XY coordinate system being the coordinate system that defines the plane partitioning the space), and the positional relationship between the devices 6b, 6b, ... and the specific object is detected with reference to the converted XY coordinates. By controlling the output operations of the devices 6b, 6b, ... with reference to the positional relationship thus detected, output control that adapts to the movement of the specific object is realized with high accuracy.
 Referring to FIG. 1(C), a defining means 1c defines a plurality of small spaces forming a space. A detection means 2c detects one or more representative points respectively corresponding to one or more specific objects existing in the space. A measuring means 3c measures the distance from each of the representative points detected by the detection means 2c to the boundaries of the small spaces defined by the defining means 1c. A calculation means 4c calculates the degree of congestion of the specific objects in each of the small spaces defined by the defining means 1c, based on the measurement results of the measuring means 3c. An output means 5c outputs congestion-degree information indicating the degree of congestion calculated by the calculation means 4c.
 Thus, one or more specific objects exist in a space formed by a plurality of small spaces. The degree of congestion of the specific objects in each small space is calculated based on the distances from the representative points of the specific objects to the boundaries of the small spaces. This improves the performance of the congestion evaluation.
 Referring to FIG. 2, an air-conditioning control device 10 of this embodiment includes a camera 12 that repeatedly outputs image data representing an object scene (a three-dimensional space) captured on its imaging surface. The image data output from the camera 12 is taken in by an image processing circuit 14 and subjected to camera-image display processing by a CPU 14p. As a result, an image representing the object scene, i.e., a camera image, is displayed on a monitor 16.
 Referring to FIG. 3, a room RM1 is partitioned by a floor surface FL1, a ceiling HV1, and four wall surfaces WL1 to WL4. The camera 12 is installed at the upper part of the wall surface WL1 and captures the interior space of the room RM1 obliquely from above. The camera image is therefore displayed on the monitor screen in the manner shown in FIG. 4. As shown in FIGS. 3 and 4, the interior space of the room RM1 is defined by mutually orthogonal X, Y, and Z axes, and the imaging surface of the camera 12 is defined by mutually orthogonal U and V axes.
 Air conditioners D_1 to D_6 are installed on the ceiling HV1 at predetermined intervals. Each of the air conditioners D_1 to D_6 outputs air of a designated temperature at a designated flow rate, and the temperature of the room RM1 is adjusted by the air thus output.
 Note that the system constituted by the air-conditioning control device 10 shown in FIG. 2 and the air conditioners D_1 to D_6 shown in FIG. 3 is defined as the "air-conditioning control system".
 When an area setting mode is selected by operating an input device 18, the CPU 14p executes the following processing.
 First, a map image shown in FIG. 5 is displayed on the monitor 16. The map image corresponds to an image schematically representing a bird's-eye view of the plane FL1. Marks M_1 to M_6 respectively representing the air conditioners D_1 to D_6 are also displayed on the map image at positions corresponding to those of the air conditioners D_1 to D_6.
 When the display of the map image is completed, a variable K is set to "1". When the mark M_K is clicked with the mouse pointer of the input device 18, coordinates indicating the clicked position, i.e., click coordinates, are calculated. The calculated click coordinates are written, in correspondence with the variable K, into an area register 14r1 shown in FIG. 6, after which the variable K is incremented. Click operations are accepted a total of six times, one for each of the marks M_1 to M_6, whereby six click coordinates respectively corresponding to the marks M_1 to M_6 are set in the area register 14r1.
 The map image is then divided, with the six click coordinates thus set as references, in the manner shown in FIG. 9(A). Boundary lines BL_1 to BL_3 are drawn on the map image so as to surround the marks M_1 to M_6. As a result, divided areas MP_1 to MP_6 are allocated around the marks M_1 to M_6, and the interior space of the room RM1 is divided into a plurality of small spaces respectively corresponding to the divided areas MP_1 to MP_6. A plurality of XY coordinates defining each divided area MP_K (K: 1 to 6) are written into the area register 14r1.
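 The text does not fix the exact rule by which the map image is split around the six click coordinates; one plausible reading, consistent with boundary lines drawn to surround the marks, is a nearest-mark (Voronoi-style) assignment. A minimal Python sketch under that assumption (the function name and the mark layout are illustrative only):

```python
def nearest_mark_area(point, marks):
    """Return the 1-based index K of the mark M_K closest to `point`.
    Classifying every map position this way partitions the map into one
    divided area MP_K per mark, with boundaries midway between
    neighboring marks (a Voronoi-style split; assumed, not specified
    by the source)."""
    px, py = point
    best_k, best_d2 = 0, float("inf")
    for k, (mx, my) in enumerate(marks, start=1):
        d2 = (px - mx) ** 2 + (py - my) ** 2   # squared Euclidean distance
        if d2 < best_d2:
            best_k, best_d2 = k, d2
    return best_k

# Six marks in a 3x2 layout, mimicking M_1 to M_6 on the map image.
MARKS = [(1, 1), (3, 1), (5, 1), (1, 3), (3, 3), (5, 3)]
```

With this rule, a point just beside a mark falls into that mark's divided area, and the boundary lines BL_1 to BL_3 emerge as the loci of equal distance between adjacent marks.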
 Subsequently, each of the XY coordinates defining the divided area MP_K is converted into UV coordinates according to Equation 1.

[Equation 1]
U = (P11*X + P12*Y + P13) / (P31*X + P32*Y + P33)
V = (P21*X + P22*Y + P23) / (P31*X + P32*Y + P33)
 The calibration parameters P11 to P33 in Equation 1 constitute a matrix for performing a planar projective transformation between the XY coordinate system defining the plane FL1 and the UV coordinate system defining the imaging surface of the camera 12, i.e., the camera image. Applying desired XY coordinates to Equation 1 therefore yields the corresponding UV coordinates on the camera image. The converted UV coordinates are written into the area register 14r1 in correspondence with the XY coordinates from which they were converted. A measurement area DT_K corresponding to the divided area MP_K is thus defined on the camera image in the manner shown in FIG. 9(B).
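 As a concrete illustration of this conversion, the sketch below applies a 3x3 planar projective (homography) matrix with entries P11 to P33 to a floor coordinate; the standard homogeneous-division form is assumed, and the sample matrix values are invented for illustration:

```python
def xy_to_uv(x, y, p):
    """Convert a floor coordinate (X, Y) into a camera-image coordinate
    (U, V) via a 3x3 planar projective matrix
    p = [[P11, P12, P13], [P21, P22, P23], [P31, P32, P33]]
    (standard homography form, assumed here)."""
    w = p[2][0] * x + p[2][1] * y + p[2][2]       # homogeneous scale factor
    u = (p[0][0] * x + p[0][1] * y + p[0][2]) / w
    v = (p[1][0] * x + p[1][1] * y + p[1][2]) / w
    return u, v

# Illustrative calibration only: scale the floor plane by 2, shift by (1, 1).
P_DEMO = [[2.0, 0.0, 1.0],
          [0.0, 2.0, 1.0],
          [0.0, 0.0, 1.0]]
```

Running the vertices of a divided area MP_K through such a function produces the UV vertices of the corresponding measurement area DT_K on the camera image.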
 When a congestion-degree measurement mode is selected by operating the input device 18, the CPU 14p executes the following processing every time a measurement period arrives.
 First, the camera image is searched for person images by pattern matching. When one or more person images are found, a variable L is set to each of "1" to "Lmax" (Lmax: the total number of person images found), and the representative point of the L-th person image is determined as "RP_L". The XY coordinates of the determined representative point RP_L are written, in correspondence with the variable L, into a representative-point register 14r2 shown in FIG. 7.
 When persons H1 to H3 are present in the room RM1 as shown in FIG. 10(A), a representative point RP_1 is determined on the image representing the person H1, a representative point RP_2 on the image representing the person H2, and a representative point RP_3 on the image representing the person H3. The XY coordinates of the representative points RP_1 to RP_3 are distributed on the bird's-eye image in the manner shown in FIG. 10(B) and are written into the representative-point register 14r2 in correspondence with L = 1 to 3.
 The variable L is again set to each of "1" to "Lmax", and the divided area, among the divided areas MP_1 to MP_6, to which the L-th representative point belongs is detected as a representative area. In the example of FIGS. 10(A) and 10(B), the divided area MP_2 is the representative area corresponding to the representative point RP_1, the divided area MP_5 is the representative area corresponding to the representative point RP_2, and the divided area MP_6 is the representative area corresponding to the representative point RP_3.
 Subsequently, a variable M is set to each of "1" to "3", and the distance from the representative point RP_L to the boundary line BL_M is measured. If the measured distance is less than a threshold TH, the divided area that adjoins the representative area across the boundary line BL_M is detected as a peripheral area.
 Referring to FIG. 11(A), the distance from the representative point RP_1 to the boundary line BL_1 is less than the threshold TH, and the distances from the representative point RP_1 to the boundary lines BL_2 and BL_3 are each equal to or greater than the threshold TH. For the representative point RP_1, therefore, the divided area MP_5 is detected as a peripheral area.
 Referring to FIG. 11(B), the distances from the representative point RP_2 to the boundary lines BL_1 and BL_3 are each less than the threshold TH, and the distance from the representative point RP_2 to the boundary line BL_2 is equal to or greater than the threshold TH. For the representative point RP_2, therefore, the divided areas MP_2 and MP_6 are detected as peripheral areas.
 Referring to FIG. 11(C), the distances from the representative point RP_3 to the boundary lines BL_1 to BL_3 are all equal to or greater than the threshold TH. For the representative point RP_3, therefore, no peripheral area is detected.
 With the total number of the representative area and peripheral areas denoted "N", a weighting amount "1.0/N" is distributed to the representative area and the peripheral areas. Specifically, the weighting amounts are written into a weighting-amount register 14r3 shown in FIG. 8 in correspondence with the representative area and the peripheral areas. Accordingly, if no peripheral area is detected, a weighting amount of "1.0" is assigned to the representative area; if one or more peripheral areas are detected, a weighting amount of less than "1.0" is distributed equally to the representative area and each peripheral area.
 In the example of FIGS. 11(A) to 11(C), a weighting amount of "0.5" is assigned for the person H1 to each of the divided areas MP_2 and MP_5, a weighting amount of "0.33" is assigned for the person H2 to each of the divided areas MP_2, MP_5, and MP_6, and a weighting amount of "1.0" is assigned for the person H3 to the divided area MP_6.
 Once the weighting amounts have been determined in this way, a variable K is set to each of "1" to "6", and a congestion degree CR_K is calculated. The congestion degree CR_K corresponds to the sum of the weighting amounts assigned to the divided area MP_K. In the example of FIGS. 11(A) to 11(C), therefore, the congestion degrees CR_2 and CR_5 each indicate "0.83", the congestion degree CR_6 indicates "1.33", and the congestion degrees CR_1, CR_3, and CR_4 each indicate "0.0".
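 The weighting and congestion-degree calculation described above can be sketched as follows. Each person is represented here by a precomputed representative area and the set of peripheral areas found within the threshold TH, which stands in for the geometric distance test; the data of FIGS. 11(A) to 11(C) serve as the example:

```python
def congestion_degrees(persons, num_areas):
    """For each person, distribute the weighting amount 1.0/N over the
    representative area and its peripheral areas (N = total number of
    those areas); the congestion degree CR_K is the sum of the weights
    assigned to divided area K."""
    cr = {k: 0.0 for k in range(1, num_areas + 1)}
    for rep_area, peripheral_areas in persons:
        areas = [rep_area] + list(peripheral_areas)
        share = 1.0 / len(areas)          # the "1.0/N" weighting amount
        for k in areas:
            cr[k] += share
    return cr

# H1 in MP_2 near the MP_5 boundary, H2 in MP_5 near MP_2 and MP_6,
# H3 in MP_6 with no boundary line closer than TH.
PERSONS = [(2, [5]), (5, [2, 6]), (6, [])]
```

Evaluating `congestion_degrees(PERSONS, 6)` reproduces the figures above: about 0.83 for areas 2 and 5, about 1.33 for area 6, and 0.0 elsewhere.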
 The outputs of the air conditioners D_1 to D_6 are controlled based on the congestion degrees CR_1 to CR_6 thus calculated. Specifically, the output of an air conditioner corresponding to a divided area with a high congestion degree is strengthened, and the output of an air conditioner corresponding to a divided area with a low congestion degree is weakened.
 The CPU 14p executes a plurality of tasks, including a main task shown in FIG. 12, an area setting task shown in FIGS. 13 and 14, and a congestion-degree measurement task shown in FIGS. 15 to 17. Control programs corresponding to these tasks are stored in a recording medium 20.
 Referring to FIG. 12, camera-image display processing is executed in step S1. As a result, the camera image is displayed on the monitor 16. In step S3 it is determined whether or not the current operation mode is the area setting mode, and in step S7 it is determined whether or not the current operation mode is the congestion-degree measurement mode.
 If YES in step S3, the area setting task is activated in step S5, and the process then proceeds to step S15. If YES in step S7, it is determined in step S9 whether or not the divided areas MP_1 to MP_6 have been set. If the determination result is YES, the congestion-degree measurement task is activated in step S11 before proceeding to step S15; if NO, the process proceeds directly to step S15. If the determinations in both steps S3 and S7 are NO, other processing is executed in step S13, and the process then proceeds to step S15.
 In step S15 it is repeatedly determined whether or not a mode change operation has been performed. When the determination result is updated from NO to YES, the active task is terminated in step S17, and the process returns to step S3.
 Referring to FIG. 13, the map image is displayed on the monitor 16 in step S21, and the variable K is set to "1" in step S23. In step S25 it is determined whether or not a click operation for area designation has been performed; when the determination result is updated from NO to YES, the click coordinates are calculated in step S27. The calculated coordinates are written into the area register 14r1 in correspondence with the variable K. In step S29 it is determined whether or not the variable K has reached "6". If NO, the variable K is incremented in step S31 and the process returns to step S25; if YES, the process proceeds to step S33.
 In step S33, the map image is divided with reference to the click coordinates. As a result, the six divided areas MP_1 to MP_6 are allocated on the map image. In step S35, the boundary lines BL_1 to BL_3 partitioning the divided areas MP_1 to MP_6 are drawn on the map image.
 In step S37 the variable K is set to "1", and in step S39 the XY coordinates defining the divided area MP_K are calculated. The calculated XY coordinates are written into the area register 14r1 in correspondence with the variable K. In step S41, each of the XY coordinates defining the divided area MP_K is converted into UV coordinates according to Equation 1. The converted UV coordinates are written into the area register 14r1 in correspondence with the variable K, whereby the measurement area DT_K corresponding to the divided area MP_K is allocated on the camera image.
 In step S43 it is determined whether or not the variable K has reached "6". If NO, the variable K is incremented in step S45 and the process returns to step S39; if YES, the processing ends.
 Referring to FIG. 15, in step S51 it is determined whether or not the measurement period has arrived. When the determination result is updated from NO to YES, the process proceeds to step S53, and the camera image is searched for person images by pattern matching. In step S55 it is determined whether or not one or more person images have been found. If NO, the process returns to step S51; if YES, it proceeds to step S57.
 In step S57 the variable L is set to "1"; in step S59 the representative point of the L-th of the person images found is determined as "RP_L"; and in step S61 the XY coordinates of the determined representative point RP_L are calculated. The calculated XY coordinates are written into the representative-point register 14r2 in correspondence with the variable L. In step S63 it is determined whether or not the variable L has reached the maximum value Lmax (= the total number of person images found). If NO, the variable L is incremented in step S65 and the process returns to step S59; if YES, it proceeds to step S67.
 In step S67 the variable L is set to "1" again. In step S69, the divided area, among the divided areas MP_1 to MP_6, to which the representative point RP_L belongs is detected as the representative area. In step S71 the variables M and N are set to "1"; in step S73 the distance from the representative point RP_L to the boundary line BL_M is measured; and in step S75 it is determined whether or not the measured distance is less than the threshold TH.
 If YES, the process proceeds to step S81 via steps S77 and S79; if NO, it proceeds directly to step S81. In step S77, the divided area adjoining the representative area across the boundary line BL_M is detected as a peripheral area. In step S79, the variable N is incremented.
 In step S81 it is determined whether or not the variable M has reached the maximum value Mmax (= the total number of boundary lines). If NO, the variable M is incremented in step S83 and the process returns to step S73; if YES, it proceeds to step S85.
 In step S85, the weighting amount (= 1.0) is divided by the value of the variable N, and the quotient thus obtained is distributed to the representative area and the peripheral areas. Accordingly, if no peripheral area is detected, a weighting amount of "1.0" is written into the weighting-amount register 14r3 in correspondence with the representative area; if one or more peripheral areas are detected, a weighting amount of "1.0/N" is written into the weighting-amount register 14r3 in correspondence with the representative area and each peripheral area.
 In step S87 it is determined whether or not the variable L has reached the maximum value Lmax. If NO, the variable L is incremented in step S89 and the process returns to step S69; if YES, it proceeds to step S91. In step S91 the variable K is set to "1", and in step S93 the congestion degree CR_K is calculated. The congestion degree CR_K corresponds to the sum of the weighting amounts assigned to the divided area MP_K.
 In step S95 it is determined whether or not the variable K has reached "6". If NO, the variable K is incremented in step S97 and the process returns to step S93; if YES, it proceeds to step S99. In step S99, the outputs of the air conditioners D_1 to D_6 are controlled with reference to the congestion degrees CR_1 to CR_6 calculated in step S93. Specifically, the output of an air conditioner corresponding to a divided area with a high congestion degree is strengthened, and the output of an air conditioner corresponding to a divided area with a low congestion degree is weakened.
 As is apparent from the above description, the CPU 14p defines the divided areas MP_1 to MP_6 respectively corresponding to the air conditioners D_1 to D_6 that generate outputs toward the room RM1 (S21 to S45), and detects one or more representative points respectively corresponding to one or more persons present in the room RM1 (S53, S57 to S65). The CPU 14p also measures the distance from each detected representative point to the boundaries of the divided areas MP_1 to MP_6 (S67 to S73, S81 to S83, S87 to S89) and controls the output operations of the air conditioners D_1 to D_6 based on the measurement results (S75 to S79, S85, S91 to S99).
 The divided areas MP_1 to MP_6 respectively correspond to the air conditioners D_1 to D_6 that generate outputs toward the room RM1. The outputs of the air conditioners D_1 to D_6 are controlled with reference to the distances from the boundaries of the divided areas MP_1 to MP_6 to the representative points of the persons present in the room RM1. This improves the accuracy of adaptive control directed at the room RM1.
 Suppose it were intended to reduce power consumption by stopping the operation of the air conditioners D_1 to D_6 when no person is present. An erroneous presence/absence determination would then stop the air conditioners D_1 to D_6 at unintended times, impairing the comfort of the persons present in the room RM1. In this embodiment, since persons are searched for and the congestion degree is measured in the manner described above, the accuracy of the presence/absence determination, and hence the comfort of the persons present in the room RM1, can be maintained.
 In this embodiment, the congestion degrees CR_1 to CR_6 are calculated by measuring the distances from the representative points of the person images to the boundary lines BL_1 to BL_3. However, the congestion degrees CR_1 to CR_6 may instead be calculated by measuring the distances from the representative points of the person images to the air conditioners D_1 to D_6. In this case, the processing of steps S101 to S117 shown in FIG. 18 is preferably executed in place of the processing of steps S69 to S85 shown in FIG. 16.
 In step S101, the divided area to which the representative point RP_L belongs is detected as the representative area, as in step S69. In step S103 the variable K is set to "1", and in step S105 the distance from the representative point RP_L to the air conditioner D_K is measured as "DS_K". In step S107 it is determined whether or not the variable K has reached "6". If NO, the variable K is incremented in step S109 and the process returns to step S105; if YES, it proceeds to step S111. The distances DS_1 to DS_6 thus measured reveal the positional relationship between the L-th person and the air conditioners D_1 to D_6.
 In step S111 the variable K is set to "1" again, and in step S113 the weighting amount W_K corresponding to the divided area MP_K is calculated according to Equation 2.

[Equation 2]
W_K = 1 - f(D_K) / Σf(D_K)
where f(D_K) = a_K * DS_K + b_K
 In Equation 2, "a_K" and "b_K" are variables that depend on the current output of the air conditioner D_K and on the indoor environment. The function value f(D_K) is defined by the variables a_K and b_K and the distance DS_K. The weighting amount W_K is derived by subtracting from "1" the ratio of the function value f(D_K) to the sum of the function values f(D_1) to f(D_6) respectively corresponding to the air conditioners D_1 to D_6. The variables a_K and b_K are adjusted so that the sum of the weighting amounts W_1 to W_6 becomes "1.0".
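 A minimal sketch of the Equation 2 weighting, assuming the per-device linear coefficients a_K and b_K are supplied by the caller (the tuning of those coefficients described above is left out here):

```python
def weights_eq2(ds, a, b):
    """Equation 2: W_K = 1 - f(D_K) / sum(f(D_K)), with the linear model
    f(D_K) = a_K * DS_K + b_K. `ds`, `a`, and `b` are parallel lists over
    the devices D_1, D_2, ...; a nearer device (smaller DS_K) receives a
    larger weight."""
    f = [a_k * d + b_k for a_k, d, b_k in zip(a, ds, b)]
    total = sum(f)
    return [1.0 - f_k / total for f_k in f]
```

For example, with two devices at distances 1.0 and 3.0 and uniform coefficients a_K = b_K = 1, the nearer device receives roughly twice the weight of the farther one.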
 In step S115 it is determined whether or not the variable K has reached "6". If NO, the variable K is incremented in step S117 and the process returns to step S113; if YES, it proceeds to step S87.
 In this embodiment, the coordinates of each of the marks M_1 to M_6 are designated by a click operation with the mouse pointer. Instead, however, the coordinate values of each of the marks M_1 to M_6 may be designated directly.
 Also, although this embodiment assumes that the outputs of the air conditioners are adaptively controlled, the output (i.e., the brightness) of a lighting device may be adaptively controlled instead of, or together with, the outputs of the air conditioners.
 Furthermore, although this embodiment assumes a planar projective transformation based on Equation 1, a perspective projective transformation may be performed instead.
 In this embodiment, an image schematically representing a bird's-eye view of the floor surface FL1 is adopted as the map image. However, the map image may instead be generated by applying a bird's-eye-view transformation based on Equation 1 to the camera image.
 Furthermore, in this embodiment, the outputs of the air conditioners D_1 to D_6 are controlled with reference to the congestion degrees CR_1 to CR_6 calculated in step S93. However, processing that displays congestion-degree information on the monitor 16 may be executed instead of, or together with, the control of the air conditioners D_1 to D_6. In this case, the processing shown in FIG. 17 is partially modified as shown in FIG. 19.
 図19を参照して、ステップS95の判別結果がNOからYESに更新されると、ステップS121で現在時刻を検出する。ステップS123では、ステップS53の探索処理によって発見された人物像の数,ステップS93で算出された混雑度CR_1~CR_6,およびステップS121で検出された現在時刻を含む混雑度情報を、図20に示す要領でモニタ16に表示する。ステップS123の処理が完了すると、ステップS99に進むか或いはステップS51に戻る。 Referring to FIG. 19, when the determination result in step S95 is updated from NO to YES, the current time is detected in step S121. In step S123, congestion degree information including the number of person images found by the search process in step S53, the congestion degrees CR_1 to CR_6 calculated in step S93, and the current time detected in step S121 is displayed on the monitor 16 in the manner shown in FIG. 20. When the process of step S123 is completed, the process proceeds to step S99 or returns to step S51.
 なお、図17に示すステップS99の処理を省略する場合、装置の名称は“空調制御装置”から“混雑度測定装置”に改められる。また、図20に示す混雑度情報において、混雑度は百分率で表現するようにしてもよい。 When the process of step S99 shown in FIG. 17 is omitted, the name of the device is changed from “air conditioning control device” to “congestion degree measuring device”. In the congestion level information shown in FIG. 20, the congestion level may be expressed as a percentage.
 また、この実施例では、パターンマッチングによってカメラ画像から1または2以上の人物像を探索し、発見された各々の人物像上で代表点を決定するようにしている(図15のステップS53~59参照)。しかし、カメラ画像上で1または2以上の動き画像を検出し、検出された各々の動き画像上で代表点を決定するようにしてもよい。この場合、図15に示す処理は、図21に示すように部分的に修正される。 In this embodiment, one or more person images are searched for in the camera image by pattern matching, and a representative point is determined on each person image thus found (see steps S53 to S59 in FIG. 15). However, one or more motion images may instead be detected in the camera image, and a representative point determined on each detected motion image. In this case, the process shown in FIG. 15 is partially modified as shown in FIG. 21.
 図21を参照して、ステップS51の判別結果がNOからYESに更新されるとステップS131に進み、1または2以上の動き画像をカメラ画像上で探索する。具体的には、動きを示す複数の画素をフレーム間差分に注目して検出し、検出された複数の画素を連続する画素の塊毎に区分する。区分された1つの塊が、1つの動き画像に相当する。こうして探知された各々の動き画像には、“1”から始まる識別番号が付与される。ステップS55では少なくとも1つの動き画像が発見されたか否かを判別し、判別結果がNOであればステップS51に戻る一方、判別結果がYESであればステップS57で変数Lを“1”に設定する。続くステップS133ではL番目の動き画像の代表点を“RP_L”として決定し、その後にステップS61に進む。 Referring to FIG. 21, when the determination result in step S51 is updated from NO to YES, the process proceeds to step S131, in which one or more motion images are searched for in the camera image. Specifically, pixels indicating motion are detected by examining the inter-frame difference, and the detected pixels are grouped into clusters of contiguous pixels. Each cluster corresponds to one motion image, and an identification number starting from “1” is assigned to each motion image detected in this way. In step S55, it is determined whether or not at least one motion image has been found. If the determination result is NO, the process returns to step S51; if the determination result is YES, the variable L is set to “1” in step S57. In the subsequent step S133, the representative point of the L-th motion image is determined as “RP_L”, and then the process proceeds to step S61.
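The motion-image search of step S131 can be sketched as follows. This is a minimal illustration only: the frames, threshold, and 4-connectivity grouping are assumptions not fixed by the embodiment.

```python
# Inter-frame difference detection followed by grouping of contiguous moving
# pixels into clusters; each cluster stands for one motion image, numbered
# from 1 in scan order.

def find_motion_images(prev, curr, thresh=20):
    h, w = len(curr), len(curr[0])
    # Pixels whose inter-frame difference exceeds the threshold count as moving
    moving = [[abs(curr[y][x] - prev[y][x]) > thresh for x in range(w)]
              for y in range(h)]
    labels = [[0] * w for _ in range(h)]
    next_id = 0
    for y in range(h):
        for x in range(w):
            if moving[y][x] and labels[y][x] == 0:
                next_id += 1                 # identification numbers start at "1"
                stack = [(y, x)]
                labels[y][x] = next_id
                while stack:                 # flood-fill one cluster (4-connected)
                    cy, cx = stack.pop()
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and \
                           moving[ny][nx] and labels[ny][nx] == 0:
                            labels[ny][nx] = next_id
                            stack.append((ny, nx))
    return next_id, labels

# Two tiny made-up grayscale frames with two separate moving blobs.
prev = [[0] * 6 for _ in range(4)]
curr = [[0] * 6 for _ in range(4)]
curr[0][0] = curr[0][1] = 255    # blob 1
curr[3][4] = curr[3][5] = 255    # blob 2
count, labels = find_motion_images(prev, curr)
```

A representative point for each motion image could then be taken, for example, as the centroid of the pixels carrying the same label.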
 さらに、この実施例では、エリア設定モードおよび混雑度測定モードを準備しているが、別の装置で定義されたエリアをレジスタ14r1に登録するのであれば、エリア設定モードは不要となり、さらには入力装置18も不要になる。 Further, in this embodiment, an area setting mode and a congestion degree measurement mode are provided. However, if areas defined by another device are registered in the register 14r1, the area setting mode becomes unnecessary, and the input device 18 also becomes unnecessary.
 この発明が詳細に説明され図示されたが、それは単なる図解および一例として用いたものであり、限定であると解されるべきではないことは明らかであり、この発明の精神および範囲は添付されたクレームの文言によってのみ限定される。 Although the present invention has been described and illustrated in detail, it is clearly understood that the description is by way of illustration and example only and is not to be taken by way of limitation; the spirit and scope of the present invention are limited only by the terms of the appended claims.
 10 …空調制御装置
 12 …カメラ
 14 …画像処理回路
 14p …CPU
 16 …モニタ
 18 …入力装置
DESCRIPTION OF REFERENCE NUMERALS: 10 ... air conditioning control device; 12 ... camera; 14 ... image processing circuit; 14p ... CPU; 16 ... monitor; 18 ... input device

Claims (13)

  1.  出力制御装置であって、次のものを備える:
     空間に向けて出力を発生する複数の装置にそれぞれ対応する複数の小空間を定義する定義手段;
     前記空間に存在する1または2以上の特定物体にそれぞれ対応する1または2以上の代表点を検出する検出手段;
     前記検出手段によって検出された1または2以上の代表点の各々から前記定義手段によって定義された複数の小空間の境界までの距離を測定する測定手段;および
     前記測定手段の測定結果に基づいて前記複数の装置の出力動作を制御する制御手段。
    An output control device, comprising:
    defining means for defining a plurality of small spaces respectively corresponding to a plurality of devices that generate output toward a space;
    detecting means for detecting one or more representative points respectively corresponding to one or more specific objects existing in the space;
    measuring means for measuring a distance from each of the one or more representative points detected by the detecting means to the boundaries of the plurality of small spaces defined by the defining means; and
    control means for controlling output operations of the plurality of devices based on a measurement result of the measuring means.
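As a non-limiting sketch of the "measuring means" of claim 1: the claim does not fix the shape of a small space, so axis-aligned rectangles on the floor's XY plane are assumed here purely for illustration.

```python
# Distance from a representative point to the boundary of each rectangular
# small space (the shapes and coordinates below are illustrative assumptions).

def distance_to_boundary(px, py, rect):
    """Distance from point (px, py) to the boundary of rect = (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = rect
    if x0 <= px <= x1 and y0 <= py <= y1:
        # Inside the small space: distance to the nearest edge
        return min(px - x0, x1 - px, py - y0, y1 - py)
    # Outside: Euclidean distance to the nearest point of the rectangle
    dx = max(x0 - px, 0, px - x1)
    dy = max(y0 - py, 0, py - y1)
    return (dx * dx + dy * dy) ** 0.5

# Two adjacent small spaces side by side on the floor plane
rects = [(0, 0, 4, 4), (4, 0, 8, 4)]
d = [distance_to_boundary(3, 2, r) for r in rects]
```

Here the point (3, 2) lies inside the first space and just outside the second; both distances happen to equal 1, which is the kind of boundary proximity the weighting of claim 4 exploits.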
  2.  クレーム1に従属する出力制御装置であって、前記空間を捉えるカメラをさらに備え、
     前記検出手段は、前記カメラから出力された被写界像から1または2以上の特定物体像を探索する探索手段、および前記探索手段によって発見された特定物体像上で前記代表点を決定する決定手段を含む。
    An output control device according to claim 1, further comprising a camera that captures the space, wherein
    the detecting means includes searching means for searching for one or more specific object images in the object scene image output from the camera, and determining means for determining the representative point on each specific object image found by the searching means.
  3.  クレーム2に従属する出力制御装置であって、前記空間はXYZ座標系に従って定義され、
     前記カメラはUV座標系に従って定義された撮像面を有する。
    An output control device according to claim 2, wherein the space is defined according to an XYZ coordinate system, and the camera has an imaging surface defined according to a UV coordinate system.
  4.  クレーム1に従属する出力制御装置であって、前記制御手段は、前記測定手段によって測定された距離の大きさに応じて異なる重み付け係数を前記複数の小空間の各々に割り当てる割り当て手段、前記割り当て手段によって割り当てられた重み付け係数に基づいて前記特定物体の混雑度を小空間毎に算出する算出手段、および前記算出手段によって算出された混雑度に基づいて前記複数の装置の出力を調整する調整手段を含む。 An output control device according to claim 1, wherein the control means includes assigning means for assigning to each of the plurality of small spaces a weighting coefficient that differs according to the magnitude of the distance measured by the measuring means, calculating means for calculating a congestion degree of the specific object for each small space based on the weighting coefficients assigned by the assigning means, and adjusting means for adjusting the outputs of the plurality of devices based on the congestion degrees calculated by the calculating means.
  5.  クレーム4に従属する出力制御装置であって、前記1または2以上の代表点の各々に対応して前記割り当て手段によって割り当てられる重み付け係数の総和は前記1または2以上の代表点の間で共通する。 An output control device according to claim 4, wherein the sum of the weighting coefficients assigned by the assigning means for each of the one or more representative points is common among the one or more representative points.
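Claims 4 and 5 can be illustrated together: each representative point distributes weighting coefficients over the small spaces according to distance, and the coefficients for any one point sum to the same total (1.0 here), so every detected person contributes equally to the overall congestion tally. The inverse-distance weighting below is one plausible choice assumed for illustration, not the scheme fixed by the claims.

```python
# Distance-dependent weighting coefficients, normalized per representative
# point (claim 5), accumulated into per-space congestion degrees (claim 4).

def weights_for_point(dists):
    """Per-space weights from one point's distances to each space's boundary."""
    inv = [1.0 / (d + 1.0) for d in dists]   # nearer spaces get larger weights
    total = sum(inv)
    return [w / total for w in inv]          # normalized: sums to 1 per point

def congestion(per_point_dists, n_spaces):
    cr = [0.0] * n_spaces
    for dists in per_point_dists:
        for k, w in enumerate(weights_for_point(dists)):
            cr[k] += w                       # accumulate each point's weights
    return cr

# Two representative points, three small spaces (distances are made up).
dists = [[0.0, 2.0, 5.0],
         [4.0, 1.0, 1.0]]
cr = congestion(dists, 3)
```

Because each point's weights sum to 1, the congestion degrees across all spaces sum to the number of detected points, which keeps the per-space outputs of the devices comparable.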
  6.  クレーム1に従属する出力制御装置であって、前記定義手段は、前記空間を定義する平面を鳥瞰した状態を表す参照画像を再現する再現手段、互いに接する複数のエリアを指定する指定操作を前記再現手段によって再現された参照画像上で受け付ける受け付け手段、および前記指定操作によって指定された複数のエリアに対応して前記複数の小空間を定義する定義処理手段を含む。 An output control device according to claim 1, wherein the defining means includes reproducing means for reproducing a reference image representing a bird's-eye view of a plane defining the space, accepting means for accepting, on the reference image reproduced by the reproducing means, a designation operation that designates a plurality of areas in contact with one another, and definition processing means for defining the plurality of small spaces corresponding to the plurality of areas designated by the designation operation.
  7.  出力制御装置であって、次のものを備える:
     UV座標系に沿って定義された撮像面を有し、XY座標系に沿って定義された平面を有する空間を表す被写界像を出力するカメラ;
     前記カメラから出力された被写界像から前記空間に存在する特定物体を表す画像を探索する探索手段;
     前記UV座標系と前記XY座標系との対応関係を示す校正パラメータと前記探索手段によって発見された画像とを参照して前記特定物体のXY座標を算出する算出手段;
     前記空間に向けて出力を発生する複数の装置と前記特定物体との位置関係を前記算出手段によって算出されたXY座標を参照して検出する検出手段;および
     前記検出手段によって検出された位置関係を参照して前記複数の装置の出力動作を制御する制御手段。
    An output control device, comprising:
    a camera that has an imaging surface defined along a UV coordinate system and outputs an object scene image representing a space having a plane defined along an XY coordinate system;
    searching means for searching the object scene image output from the camera for an image representing a specific object existing in the space;
    calculating means for calculating XY coordinates of the specific object with reference to a calibration parameter indicating a correspondence between the UV coordinate system and the XY coordinate system and to the image found by the searching means;
    detecting means for detecting, with reference to the XY coordinates calculated by the calculating means, a positional relationship between the specific object and a plurality of devices that generate output toward the space; and
    control means for controlling output operations of the plurality of devices with reference to the positional relationship detected by the detecting means.
  8.  混雑度測定装置であって、次のものを備える:
     空間を形成する複数の小空間を定義する定義手段;
     前記空間に存在する1または2以上の特定物体にそれぞれ対応する1または2以上の代表点を検出する検出手段;
     前記検出手段によって検出された1または2以上の代表点の各々から前記定義手段によって定義された複数の小空間の境界までの距離を測定する測定手段;
     前記定義手段によって定義された複数の小空間の各々における前記特定物体の混雑度を前記測定手段の測定結果に基づいて算出する算出手段;および
     前記算出手段によって算出された混雑度を示す混雑度情報を出力する出力手段。
    A congestion degree measuring device, comprising:
    defining means for defining a plurality of small spaces forming a space;
    detecting means for detecting one or more representative points respectively corresponding to one or more specific objects existing in the space;
    measuring means for measuring a distance from each of the one or more representative points detected by the detecting means to the boundaries of the plurality of small spaces defined by the defining means;
    calculating means for calculating, based on a measurement result of the measuring means, a congestion degree of the specific object in each of the plurality of small spaces defined by the defining means; and
    output means for outputting congestion degree information indicating the congestion degrees calculated by the calculating means.
  9.  出力制御装置のプロセッサによって実行される出力制御方法であって、次のものを備える:
     空間に向けて出力を発生する複数の装置にそれぞれ対応する複数の小空間を定義する定義ステップ;
     前記空間に存在する1または2以上の特定物体にそれぞれ対応する1または2以上の代表点を検出する検出ステップ;
     前記検出ステップによって検出された1または2以上の代表点の各々から前記定義ステップによって定義された複数の小空間の境界までの距離を測定する測定ステップ;および
     前記測定ステップの測定結果に基づいて前記複数の装置の出力動作を制御する制御ステップ。
    An output control method executed by a processor of an output control device, comprising:
    a defining step of defining a plurality of small spaces respectively corresponding to a plurality of devices that generate output toward a space;
    a detecting step of detecting one or more representative points respectively corresponding to one or more specific objects existing in the space;
    a measuring step of measuring a distance from each of the one or more representative points detected by the detecting step to the boundaries of the plurality of small spaces defined by the defining step; and
    a control step of controlling output operations of the plurality of devices based on a measurement result of the measuring step.
  10.  UV座標系に沿って定義された撮像面を有し、XY座標系に沿って定義された平面を有する空間を表す被写界像を出力するカメラを備える出力制御装置のプロセッサによって実行される出力制御方法であって、次のものを備える:
     前記カメラから出力された被写界像から前記空間に存在する特定物体を表す画像を探索する探索ステップ;
     前記UV座標系と前記XY座標系との対応関係を示す校正パラメータと前記探索ステップによって発見された画像とを参照して前記特定物体のXY座標を算出する算出ステップ;
     前記空間に向けて出力を発生する複数の装置と前記特定物体との位置関係を前記算出ステップによって算出されたXY座標を参照して検出する検出ステップ;および
     前記検出ステップによって検出された位置関係を参照して前記複数の装置の出力動作を制御する制御ステップ。
    An output control method executed by a processor of an output control device comprising a camera that has an imaging surface defined along a UV coordinate system and outputs an object scene image representing a space having a plane defined along an XY coordinate system, the method comprising:
    a searching step of searching the object scene image output from the camera for an image representing a specific object existing in the space;
    a calculating step of calculating XY coordinates of the specific object with reference to a calibration parameter indicating a correspondence between the UV coordinate system and the XY coordinate system and to the image found by the searching step;
    a detecting step of detecting, with reference to the XY coordinates calculated by the calculating step, a positional relationship between the specific object and a plurality of devices that generate output toward the space; and
    a control step of controlling output operations of the plurality of devices with reference to the positional relationship detected by the detecting step.
  11.  混雑度測定装置のプロセッサによって実行される混雑度測定方法であって、次のものを備える:
     空間を形成する複数の小空間を定義する定義ステップ;
     前記空間に存在する1または2以上の特定物体にそれぞれ対応する1または2以上の代表点を検出する検出ステップ;
     前記検出ステップによって検出された1または2以上の代表点の各々から前記定義ステップによって定義された複数の小空間の境界までの距離を測定する測定ステップ;
     前記定義ステップによって定義された複数の小空間の各々における前記特定物体の混雑度を前記測定ステップの測定結果に基づいて算出する算出ステップ;および
     前記算出ステップによって算出された混雑度を示す混雑度情報を出力する出力ステップ。
    A congestion degree measuring method executed by a processor of a congestion degree measuring device, comprising:
    a defining step of defining a plurality of small spaces forming a space;
    a detecting step of detecting one or more representative points respectively corresponding to one or more specific objects existing in the space;
    a measuring step of measuring a distance from each of the one or more representative points detected by the detecting step to the boundaries of the plurality of small spaces defined by the defining step;
    a calculating step of calculating, based on a measurement result of the measuring step, a congestion degree of the specific object in each of the plurality of small spaces defined by the defining step; and
    an output step of outputting congestion degree information indicating the congestion degrees calculated by the calculating step.
  12.  空間に向けて出力を発生する複数の装置、および前記複数の装置を制御する制御装置を備える出力制御システムであって、前記制御装置は次のものを備える:
     前記複数の装置にそれぞれ対応する複数の小空間を定義する定義手段;
     前記空間に存在する1または2以上の特定物体にそれぞれ対応する1または2以上の代表点を検出する検出手段;
     前記検出手段によって検出された1または2以上の代表点の各々から前記定義手段によって定義された複数の小空間の境界までの距離を測定する測定手段;および
     前記測定手段の測定結果に基づいて前記複数の装置の出力動作を制御する出力制御手段。
    An output control system comprising a plurality of devices that generate output toward a space and a control device that controls the plurality of devices, wherein the control device comprises:
    defining means for defining a plurality of small spaces respectively corresponding to the plurality of devices;
    detecting means for detecting one or more representative points respectively corresponding to one or more specific objects existing in the space;
    measuring means for measuring a distance from each of the one or more representative points detected by the detecting means to the boundaries of the plurality of small spaces defined by the defining means; and
    output control means for controlling output operations of the plurality of devices based on a measurement result of the measuring means.
  13.  UV座標系に沿って定義された撮像面を有し、XY座標系に沿って定義された平面を有する空間を表す被写界像を出力するカメラ、前記空間に向けて出力を発生する複数の装置、および前記カメラから出力された被写界像に基づいて前記複数の装置を制御する制御装置を備える出力制御システムであって、前記制御装置は次のものを備える:
     前記カメラから出力された被写界像から前記空間に存在する特定物体を表す画像を探索する探索手段;
     前記UV座標系と前記XY座標系との対応関係を示す校正パラメータと前記探索手段によって発見された画像とを参照して前記特定物体のXY座標を算出する算出手段;
     前記複数の装置と前記特定物体との位置関係を前記算出手段によって算出されたXY座標を参照して検出する検出手段;および
     前記検出手段によって検出された位置関係を参照して前記複数の装置の出力動作を制御する出力制御手段。
    An output control system comprising a camera that has an imaging surface defined along a UV coordinate system and outputs an object scene image representing a space having a plane defined along an XY coordinate system, a plurality of devices that generate output toward the space, and a control device that controls the plurality of devices based on the object scene image output from the camera, wherein the control device comprises:
    searching means for searching the object scene image output from the camera for an image representing a specific object existing in the space;
    calculating means for calculating XY coordinates of the specific object with reference to a calibration parameter indicating a correspondence between the UV coordinate system and the XY coordinate system and to the image found by the searching means;
    detecting means for detecting a positional relationship between the plurality of devices and the specific object with reference to the XY coordinates calculated by the calculating means; and
    output control means for controlling output operations of the plurality of devices with reference to the positional relationship detected by the detecting means.
PCT/JP2011/064622 2010-07-13 2011-06-27 Output controller WO2012008288A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010-158359 2010-07-13
JP2010158359A JP2012021669A (en) 2010-07-13 2010-07-13 Output control device

Publications (1)

Publication Number Publication Date
WO2012008288A1 true WO2012008288A1 (en) 2012-01-19

Family

ID=45469296

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/064622 WO2012008288A1 (en) 2010-07-13 2011-06-27 Output controller

Country Status (2)

Country Link
JP (1) JP2012021669A (en)
WO (1) WO2012008288A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015092894A1 (en) * 2013-12-18 2015-06-25 三菱電機株式会社 Control device for air conditioner

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6173784B2 (en) * 2013-06-12 2017-08-02 株式会社東芝 Air conditioning energy management system, method, and program

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11311437A (en) * 1998-04-28 1999-11-09 Tokyo Electric Power Co Inc:The Air conditioner
JP2008304104A (en) * 2007-06-06 2008-12-18 Chugoku Electric Power Co Inc:The Electric device control system
JP2010117800A (en) * 2008-11-11 2010-05-27 Toshiba It & Control Systems Corp Parking lot monitoring device and method



Also Published As

Publication number Publication date
JP2012021669A (en) 2012-02-02

Similar Documents

Publication Publication Date Title
US9464819B2 (en) Air conditioning control system and air conditioning control method
KR101292499B1 (en) Device and method for controlling air conditioner
JP5985716B2 (en) Electrical device control apparatus, electrical device control system, and program
US9456183B2 (en) Image processing occupancy sensor
JP5238679B2 (en) Air conditioning control device, air conditioning control method, and radiation temperature measuring device
KR102121785B1 (en) Air-conditioner controlling direction of the wind using artificial intelligence by instructed position and method of controlling thereof
US20030096572A1 (en) Space-conditioning control employing image-based detection of occupancy and use
US20150028114A1 (en) Apparatus and method for controlling a heating ventilation and / or air conditioning system utilizing an infrared sensing or imaging device for determining radiated temperature of one or more objects or occupants in the conditioned space
US20170123386A1 (en) Method and apparatus for determining information for building information modeling
WO2012011401A1 (en) Output control device
EP2309454A2 (en) Apparatus and method for detecting motion
JP5595165B2 (en) Control system
CN109073252B (en) Air conditioner visualization system
WO2020052167A1 (en) Method and device for determining air blowing angle range of air conditioner, and air conditioner
KR20190062307A (en) Apparatus for evaluating indoor thermal environment using thermal camera and method thereof
KR20160046728A (en) Temperature distribution display device and method
JP6668010B2 (en) Air conditioning control device, air conditioning control method, and air conditioning control program
CN111461487B (en) Indoor decoration engineering wisdom management system based on BIM
WO2012008288A1 (en) Output controller
WO2013001407A1 (en) Environment control apparatus
JP2011153737A (en) Calibrating apparatus
WO2012011400A1 (en) Output control device
CN113623815A (en) Control method and system of air conditioner, air conditioner and storage medium
JP2009294887A (en) Construction facility control system and program
WO2020149044A1 (en) Parameter selection device, parameter selection method, and parameter selection program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11806618

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11806618

Country of ref document: EP

Kind code of ref document: A1