WO2013024655A1 - Illumination controller - Google Patents

Illumination controller

Info

Publication number
WO2013024655A1
WO2013024655A1
Authority
WO
WIPO (PCT)
Prior art keywords
lighting
light output
illumination
image
luminance value
Application number
PCT/JP2012/068110
Other languages
French (fr)
Japanese (ja)
Inventor
Nobuhiro Miichi
Kazuma Haraguchi
Tomoharu Nakahara
Original Assignee
Panasonic Corporation
Application filed by Panasonic Corporation
Publication of WO2013024655A1

Classifications

    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/105 Controlling the light source in response to determined parameters
    • H05B47/115 Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B47/125 Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • The present invention relates to a lighting control device.
  • Conventionally, the presence or absence of a person is often detected using a human sensor equipped with a pyroelectric infrared sensor.
  • A technique has also been proposed that acquires an image of the illumination space with an imaging unit such as a camera, determines from the image whether a person is present, and detects the person's position.
  • the illumination range means a range in which necessary illuminance can be obtained according to the purpose of use of the illumination space.
  • For example, the illumination range is a range where the illuminance on the floor surface is equal to or greater than a specified illuminance.
  • If the position where a person is present in the illumination space is known, the light output of each lighting fixture can be adjusted according to that position. That is, when a person is localized in the lighting space, the power consumption of the entire lighting space can be reduced by relatively increasing the light output of the lighting fixtures near the person and relatively reducing the light output of the remaining lighting fixtures.
  • Increasing the light output includes the case of full lighting (rated lighting), and decreasing the light output includes the case of turning the light off.
  • One conceivable configuration provides each luminaire with a human sensor using a pyroelectric infrared sensor, and adjusts the light output of the lighting fixtures around the person in coordination, according to whether each sensor detects a person. In this case, communication between the lighting fixtures is necessary in order to link them. With this configuration, as many human sensors as lighting fixtures are required, and each fixture needs a circuit configuration for the linkage, which leads to an increase in cost.
  • Document 1 (Japanese Published Patent Publication No. 2009-283183) describes a technique that uses two image sensors capturing images from two mutually orthogonal directions, calculates the coordinates of a person's position from the image data obtained from the two sensors, and performs dimming control of the luminaires so that the closer an area is to the person's position, the brighter it is lit.
  • In this technique, the lighting fixtures are turned on sequentially one by one, and the coordinates of each luminaire are found from the position of the brightened area in the imaging data. In other words, the lighting fixtures are lit one at a time, and the physical address of the lit lighting fixture is associated with the logical address of the bright area in the image.
  • As shown in FIG. 11, consider a case where a desk 26 is arranged near the lighting fixture 25 in a lighting space containing lighting fixtures 24 and 25, with the camera 16 serving as the imaging unit disposed between the fixtures 24 and 25. In this arrangement, when a person 31 is present near the desk 26, it is desirable to make the light output of the lighting fixture 25 relatively large and the light output of the lighting fixture 24, which is arranged far from the desk 26, relatively small.
  • Since one side surface of the desk 26 is included in the field of view of the camera 16, direct light from the lighting fixture 24 is incident on that side surface, but direct light from the lighting fixture 25 is not. Therefore, as shown in FIG. 12, in the image captured by the camera 16, the side surface of the desk 26 is associated with the lighting fixture 24. In the image of FIG. 12, the shaded area is associated with the luminaire 24, and the remaining area with the luminaire 25.
  • As a result, the person 31 standing near the desk 26 may overlap the hatched region that is associated with the lighting fixture 24.
  • In that case, the light output of the lighting fixture 24 becomes relatively larger than that of the lighting fixture 25, so the intended operation cannot be obtained.
  • FIG. 13A is an example of an illumination space, and shows a case where illumination is performed by three illumination fixtures.
  • the illumination range for each lighting fixture is indicated by three regions E1, E2, E3 on the floor surface.
  • a plurality of three-dimensional objects 27 are arranged on the floor of the lighting space, and a camera 16 is arranged on the ceiling.
  • the camera 16 is disposed on the ceiling near the boundary between the areas E1 and E2.
  • Images taken by the camera 16 when the lighting fixtures are individually turned on are as shown in FIGS. 13B, 13C, and 13D. When the physical address of the lit lighting fixture is associated with the logical address of the bright area in each image, the physical addresses of the lighting fixtures are mapped onto the image captured by the camera 16 as shown in FIG. 13E, where they are represented using three gray levels.
  • Ideally, the physical addresses of the lighting fixtures should be grouped into a single area per lighting fixture, like the regions E1, E2, and E3 shown in FIG. 13A.
  • In practice, however, the physical address of one luminaire may be divided into a plurality of areas, producing an enclave-like area (the region surrounded by a broken line in FIG. 13E). If such an enclave-like area is generated, an inappropriate lighting fixture may be controlled with respect to the position of the person, as described above.
  • The present invention uses three-dimensional information of the lighting space to associate the illumination range of each lighting fixture with positions on the floor surface, and thereby aims to provide an illumination control device that can control the light output of the lighting fixtures appropriately for the position of a person on the floor surface.
  • the illumination control device includes a measurement processing unit, a map generation unit, a human detection unit, and a control instruction unit.
  • The measurement processing unit is configured to generate three-dimensional information of the illumination space based on an image acquired from imaging means that captures the illumination space, including the floor surface illuminated by the plurality of lighting fixtures.
  • the map generation unit is configured to generate an illumination range map that represents an illumination range of the luminaire on the floor surface of the illumination space based on the three-dimensional information.
  • the person detecting unit is configured to obtain a position where a person is present on the floor surface of the illumination space based on the three-dimensional information.
  • the control instruction unit is configured to determine an instruction value of each light output of the plurality of lighting fixtures with reference to the presence position and the illumination range map.
  • the imaging means includes two cameras.
  • the measurement processing unit is configured to generate the three-dimensional information from two images respectively obtained from the two cameras using the principle of triangulation.
  • The map generation unit is configured to obtain, for each of the plurality of lighting fixtures, a change in the luminance value of the pixel corresponding to a predetermined position on the floor surface in the image, and to assign the pixel to the illumination range of the luminaire for which the change in the luminance value is maximized.
  • the map generation unit is configured to obtain a difference between a first luminance value of a pixel corresponding to the predetermined position and a second luminance value of a pixel corresponding to the predetermined position as a change in the luminance value.
  • the first luminance value is a luminance value of a pixel corresponding to the predetermined position in the image obtained from the imaging unit when the light output of the lighting fixture is the first light output.
  • The second luminance value is the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging unit when the light output of the lighting fixture is a second light output different from the first light output.
  • Alternatively, the map generation unit is configured to obtain, for each of the plurality of lighting fixtures, a change in the luminance value of the pixel corresponding to a predetermined position on the floor surface in the image, and to assign the pixel to the illumination range of each lighting fixture for which the change in the luminance value is equal to or greater than a predetermined threshold.
  • the map generation unit is configured to obtain a difference between a first luminance value of a pixel corresponding to the predetermined position and a second luminance value of a pixel corresponding to the predetermined position as a change in the luminance value.
  • the first luminance value is a luminance value of a pixel corresponding to the predetermined position in the image obtained from the imaging unit when the light output of the lighting fixture is the first light output.
  • The second luminance value is the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging unit when the light output of the lighting fixture is a second light output different from the first light output.
  • Alternatively, the map generation unit is configured to obtain, for each of the plurality of lighting fixtures, a contribution to the pixel corresponding to a predetermined position on the floor surface in the image, and to assign the pixel to the illumination range of the luminaire for which the contribution is maximized.
  • The contribution is obtained from the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means.
  • the map generation unit is configured to obtain, as the contribution, a ratio of the luminance value corresponding to the one lighting fixture with respect to a total of the luminance values corresponding to the plurality of lighting fixtures.
  • Alternatively, the map generation unit is configured to obtain, for each of the plurality of lighting fixtures, a contribution to the pixel corresponding to a predetermined position on the floor surface in the image, and to assign the pixel to the illumination range of each luminaire for which the contribution is equal to or greater than a specified threshold.
  • The contribution is obtained from the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means.
  • the map generation unit is configured to obtain, as the contribution, a ratio of the luminance value corresponding to the one lighting fixture with respect to a total of the luminance values corresponding to the plurality of lighting fixtures.
  • The map generation unit is configured not to assign a pixel corresponding to a surface intersecting the floor surface in the image to any illumination range.
  • When an undetermined position not included in any of the illumination ranges is present in the illumination range map, the map generation unit assigns the undetermined position to one of the illumination ranges based on the positional relationship between the undetermined position and the illumination range of each of the plurality of lighting fixtures.
  • The map generation unit is configured to obtain, based on the illumination range, the position on the floor surface of the image immediately below each lighting fixture.
  • In that case, the positional relationship is the positional relationship between the undetermined position and the position immediately below each of the plurality of lighting fixtures.
  • the illumination control apparatus according to an eleventh aspect of the present invention according to any one of the first to tenth aspects further includes the imaging means and an illumination control unit.
  • the lighting control unit is configured to control the lighting fixture so that the light output of the lighting fixture becomes a value corresponding to the indication value when receiving the indication value.
  • FIG. 1 is a schematic block diagram showing the illumination control device of Embodiment 1; a further figure shows an arrangement example.
  • Two figures are principle explanatory views of the illumination control device of Embodiment 1, and another shows an operation example of the illumination control device of Embodiment 1.
  • A figure explains the concept of parallax in the illumination control device of Embodiment 1.
  • A figure explains the correction processing of the illumination range map in the illumination control device of Embodiment 1.
  • A figure shows an operation example of the illumination control device of Embodiment 1.
  • A figure is an operation explanatory diagram of the illumination control device of Embodiment 1.
  • A figure is a schematic block diagram showing an arrangement example in the illumination control device of Embodiment 1.
  • The present invention relates to a lighting control device, and more particularly to a lighting control device that automatically adjusts the light output of lighting fixtures in accordance with the position of a person.
  • The illumination control apparatus described below assumes three-dimensional measurement by stereo vision using two cameras in the three-dimensional measurement unit that acquires three-dimensional information of the illumination space.
  • However, the configuration of the three-dimensional measurement unit is not limited, as long as the three-dimensional information is acquired from images captured by cameras serving as the imaging means.
  • For example, an active configuration that acquires three-dimensional information by projecting a light pattern, such as the light-section method or the phase-shift method, may be used. These techniques use the principle of triangulation.
  • Alternatively, the three-dimensional measurement unit may emit intensity-modulated light whose intensity varies at a constant period and receive the reflected light with a camera, obtaining the distance for each pixel from the phase difference between the emitted and received light. This technique is called the TOF (Time of Flight) method because it measures the flight time of light.
  • the illumination control device of the present embodiment includes a three-dimensional measurement unit 10, a storage unit 14, and an illumination control unit 15.
  • the three-dimensional measuring unit 10 includes two cameras 11 and 12 as imaging means arranged so that the visual fields with respect to the three-dimensional real space almost overlap.
  • the three-dimensional measuring unit 10 includes an arithmetic processing unit 13.
  • The cameras 11 and 12 are each configured by combining an optical system with an area image sensor in which pixels are arranged on the lattice points of a two-dimensional lattice; a CCD image sensor, a CMOS image sensor, or the like is used as the area image sensor.
  • The two cameras 11 and 12 have the same specifications, such as image resolution and focal length.
  • The two cameras 11 and 12 are arranged so that their optical axes Ax1 and Ax2 are parallel to each other and orthogonal to the direction along the line connecting the centers C1 and C2 of the light receiving surfaces PL1 and PL2 (the "baseline direction").
  • the optical axes Ax1, Ax2 are straight lines connecting the optical centers O1, O2 and the centers C1, C2 of the light receiving surfaces PL1, PL2.
  • Each of the cameras 11 and 12 includes a fisheye lens having an angle of view close to 180 degrees, or an optical system (not shown) having an equivalent function.
  • The projection system of the optical system is not particularly limited, but the equidistant projection method is used in the description below.
  • The two cameras 11 and 12 are arranged so that the baseline direction coincides with the horizontal direction of the light receiving surfaces PL1 and PL2; that is, parallel stereo is assumed.
  • The projection system is not limited to equidistant projection: the optical system may have other distortion characteristics, and since calibration is performed, an optical system with unknown distortion characteristics can also be used.
  • The configuration of the three-dimensional measurement unit 10 described above is not mandatory, but adopting it provides a wide viewing angle that captures the entire illumination space and reduces the amount of calculation needed to compute three-dimensional information from the images captured by the cameras 11 and 12.
  • The cameras 11 and 12 are arranged at a high place such as the ceiling so that the optical axes Ax1 and Ax2 both point vertically downward, and are arranged side by side along the baseline direction.
  • the direction along the horizontal direction (that is, the baseline direction) in the light receiving surfaces PL1 and PL2 is the x direction
  • the direction along the vertical direction in the light receiving surface is the y direction.
  • The coordinate system is defined so that the rightward horizontal direction is the positive x direction and the downward vertical direction is the positive y direction.
  • the direction away from the light receiving surface of the camera is the positive direction. That is, the positive direction in the z direction is the front direction of the camera.
  • The pixel position of the image (first image) captured by the camera 11 is represented in the format (u1, v1), and the pixel position of the image (second image) captured by the camera 12 in the format (u2, v2).
  • the u1 axis and the u2 axis are coordinate axes in the x direction
  • the v1 axis and the v2 axis are coordinate axes in the y direction.
  • the u1 axis and the u2 axis are aligned on a straight line.
  • A plurality of (three in the illustrated example) lighting fixtures 20 are arranged at a high place such as the ceiling, and the cameras 11 and 12 are arranged between the lighting fixture 21 and the lighting fixture 22.
  • The visual fields of the cameras 11 and 12 are set so as to include the entire illumination space illuminated by the three lighting fixtures 21, 22, and 23.
  • The three lighting fixtures 21, 22, and 23 are arranged in a straight line for convenience of illustration, but the number of lighting fixtures 20 may be three or more, and the lighting fixtures 20 may be arranged in an arbitrary layout.
  • Image data (images) output from the cameras 11 and 12 are input to the arithmetic processing unit 13.
  • the case where the image data is a grayscale image will be described as an example.
  • the technical idea described below is applicable even if the image data is a color image.
  • the arithmetic processing unit 13 includes a computer as a hardware resource, and executes a program for causing the computer to function as a device that performs processing described below. With this program, the arithmetic processing unit 13 functions as a measurement processing unit 130, a map generation unit 131, a person detection unit 132, a collation unit 133, and a control instruction unit 134.
  • the arithmetic processing unit 13 may be configured to include dedicated hardware.
  • a device having a function of executing a program such as a DSP (Digital Signal Processor) or an FPGA (Field-Programmable Gate Array) may be used.
  • When the cameras 11 and 12 output analog signals, an AD converter that converts the analog signals into digital signals is provided between the cameras 11 and 12 and the arithmetic processing unit 13. Further, a filter circuit or the like that removes unnecessary components such as noise from the image data may be provided between the cameras 11 and 12 and the arithmetic processing unit 13.
  • The system program and application programs for operating the arithmetic processing unit 13 are stored in the storage unit 14. The captured image data and the calculation process data that are the processing targets of the arithmetic processing unit 13 are also stored in the storage unit 14, which functions as a data memory and a working memory. Accordingly, the storage unit 14 includes a storage medium whose contents are retained without power supply, such as a flash memory or a hard disk drive, and a volatile memory serving as main storage for holding the system program and application programs during processing.
  • The arithmetic processing unit 13 includes a measurement processing unit 130 that generates three-dimensional information of the illumination space using the image data acquired from the cameras 11 and 12. That is, the measurement processing unit 130 is configured to generate three-dimensional information of the illumination space 50 based on the images acquired from the imaging means (cameras 11 and 12) that capture the illumination space 50, including the floor surface 40 illuminated by the plurality of lighting fixtures 20. In the present embodiment, a part of the storage unit 14 and the measurement processing unit 130 constitute the three-dimensional measurement unit 10 together with the cameras 11 and 12. The technique for generating three-dimensional information from image data is described later.
  • The arithmetic processing unit 13 also includes a map generation unit 131 that generates an illumination range map representing the illumination range of each of the lighting fixtures 21, 22, and 23, and a person detection unit 132 that detects the presence position of the person 30 using the three-dimensional information.
  • the map generation unit 131 is configured to generate an illumination range map representing each illumination range of the plurality of lighting fixtures 20 in the illumination space 50 based on the three-dimensional information generated by the measurement processing unit 130.
  • the illumination range map represents the illumination range of the luminaire 20 on the floor surface 40 of the illumination space 50.
  • the illumination range map represents each illumination range of the plurality of lighting fixtures 20 on the floor surface 40 of the illumination space 50.
  • the illumination range is, for example, an area of the floor surface 40 that is illuminated by the luminaire 20.
  • the person detection unit 132 is configured to obtain an existence position where a person exists in the illumination space 50 based on the three-dimensional information generated by the measurement processing unit 130.
  • the presence position is a position where the person 30 exists on the floor surface 40 of the illumination space 50.
  • the illumination range map generated by the map generation unit 131 is stored in the map storage unit 141 provided in the storage unit 14.
  • the map storage unit 141 is preferably composed of a rewritable nonvolatile memory.
  • The arithmetic processing unit 13 also includes a collation unit 133 that collates the detected presence position with the illumination range map to extract the lighting fixture 20 (21, 22, 23) whose illumination range includes the presence position.
  • That is, the collation unit 133 refers to the illumination range map generated by the map generation unit 131 and selects, from the plurality of lighting fixtures 21, 22, and 23, the lighting fixture 20 whose illumination range includes the presence position obtained by the person detection unit 132.
  • The control instruction unit 134 is configured to determine the instruction value of each light output of the plurality of lighting fixtures 20 with reference to the presence position obtained by the person detection unit 132 and the illumination range map generated by the map generation unit 131.
  • Suppose, for example, that a rule is set so that only the lighting fixture 20 (21, 22, 23) extracted by the collation unit 133 is lit at its rating.
  • If the luminaire 21 is extracted, the control instruction unit 134 selects an instruction value corresponding to rated lighting as the instruction value of the light output of the luminaire 21, and selects an instruction value corresponding to turning off as the instruction value of the light output of the remaining lighting fixtures 22 and 23.
  • Alternatively, a rule may be set so that the lighting fixture 20 (21, 22, 23) extracted by the collation unit 133 is lit at its rating and the adjacent surrounding lighting fixtures 20 (21, 22, 23) are dimmed.
  • In this case, the control instruction unit 134 selects an instruction value corresponding to rated lighting as the instruction value of the light output of the selected luminaire 21, an instruction value corresponding to 50% lighting for the adjacent lighting fixture 22, and an instruction value corresponding to turning off for the non-adjacent lighting fixture 23.
  • These rules are examples; a rule may be set in any way that makes the area around the position of the person 30 in the illumination space 50 relatively brighter than the remaining areas.
  • the output (instruction value) of the control instruction unit 134 provided in the arithmetic processing unit 13 is given to the illumination control unit 15 that adjusts the light output of the lighting fixture 20 (21, 22, 23).
  • the illumination control unit 15 adjusts the light output of the lighting fixture 20 (21, 22, 23) according to the instruction content (instruction value) of the control instruction unit 134. Therefore, when the presence position of the person 30 is detected, the lighting state of the luminaire 20 (21, 22, 23) is controlled according to the rule set in the control instruction unit 134.
  • To obtain three-dimensional information of the real space from the images captured by the two cameras 11 and 12, the three-dimensional measurement unit 10 acquires a pair of images (captured images) captured at the same time by the two cameras 11 and 12 and stores them in the storage unit 14.
  • a three-dimensional coordinate system (xyz) with the optical center O as the origin is defined for the real space
  • a two-dimensional coordinate system (o-uv) is defined for the light receiving surface PL.
  • the coordinate system of the light receiving surface PL has a one-to-one correspondence with the coordinate system of the captured image.
  • the x direction in the three-dimensional coordinate system in the real space is parallel to the baseline direction
  • the y direction is parallel to the vertical direction of the light receiving surface.
  • the u direction in the two-dimensional coordinate system on the light receiving surface PL is parallel to the x direction
  • the v direction is parallel to the y direction.
  • The position of a pixel on the light receiving surface PL is represented by the number of pixels in the horizontal and vertical directions, with the upper left corner as the origin. If the coordinates of the point at which the optical axis is projected onto the image are (uc, vc), the distance r between (uc, vc) and the pixel located at coordinates (u, v) is expressed by equation (1).
  • Since the equidistant projection method is employed, a model in which a point P in the three-dimensional real space is projected onto the surface of a sphere SP of radius 1 centered at the optical center O can be used. That is, as shown in FIG. 3, the angle θ [rad] between the optical axis Ax (z-axis direction) and the straight line connecting the optical center O to the point Q, at which the point P is projected onto the surface of the sphere SP, is expressed by equation (3) using the distance r.
  • a point R indicates the position of a pixel obtained by projecting the point Q onto the light receiving surface PL.
  • When the point P in the real space is associated with the pixel at coordinates (u, v) of the light receiving surface PL, the point P is projected onto the surface of the sphere SP used as the model.
  • The position (Xr, Yr, Zr) of the point Q is then expressed by equations (4), (5), and (6).
  • The position (Xr, Yr, Zr) of the point Q can also be expressed using angles around the x-, y-, and z-axes instead of the pixel coordinates (u, v). Let the angle around the x-axis be α and the angle around the y-axis be β; the angle φ given by equation (2) can be regarded as the angle around the z-axis (the angle in the xy plane).
  • The angle β increases toward the positive z-axis with the positive x-axis taken as 0 degrees, and the angle α increases toward the positive z-axis with the positive y-axis taken as 0 degrees.
  • Accordingly, the pixel position can be expressed by (β, α) instead of the coordinates (u, v).
  • The angles β and α are calculated by equations (7) and (8), using the results of equations (4), (5), and (6).
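In code, the conversion described above might look like the following minimal sketch. The published forms of equations (1) through (8) are not reproduced in this text, so the formulas below are reconstructions from the surrounding description; the image center (uc, vc) and the equidistant-projection constant f (pixels per radian) are assumed calibration values.

```python
import numpy as np

def uv_to_beta_alpha(u, v, uc, vc, f):
    """Convert a fisheye pixel (u, v) into the angle pair (beta, alpha)."""
    r = np.hypot(u - uc, v - vc)        # eq. (1): distance from (uc, vc)
    phi = np.arctan2(v - vc, u - uc)    # eq. (2): angle in the xy plane
    theta = r / f                       # eq. (3): equidistant projection
    # Eqs. (4)-(6): point Q on the unit sphere SP centered at the optical center O.
    xr = np.sin(theta) * np.cos(phi)
    yr = np.sin(theta) * np.sin(phi)
    zr = np.cos(theta)
    # Eqs. (7)-(8): beta around the y axis (+x axis = 0, increasing toward +z),
    # alpha around the x axis (+y axis = 0, increasing toward +z).
    beta = np.arctan2(zr, xr)
    alpha = np.arctan2(zr, yr)
    return beta, alpha
```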
  • FIG. 5 shows the result of converting an image whose pixel positions are expressed in coordinates (u, v) into an image expressed in (β, α) by the procedure described above.
  • FIG. 5A shows the captured image (first image) obtained by the camera 11, and FIG. 5B the captured image (second image) obtained by the camera 12.
  • FIGS. 5C and 5D show the corresponding images converted from coordinates (u, v) to (β, α).
  • The measurement processing unit 130 in the arithmetic processing unit 13 thus converts the position (u, v) of each pixel in the captured images from the two cameras 11 and 12 into the set of angles (β, α) in the three-dimensional real space, and generates the converted images.
  • Hereinafter, an image in which the position corresponding to the point P in the real space is represented by (β, α) is referred to as a converted image.
  • Since the present embodiment is premised on parallel stereo under the above conditions, the values of α corresponding to the same point P in the real space are equal in the two converted images obtained from the captured images of the two cameras 11 and 12.
  • Therefore, when searching the two converted images for the pixels corresponding to one point P in real space (hereinafter, "corresponding points"), only the range at the same angle α around the x-axis needs to be searched, which narrows the search range for corresponding points and reduces the processing load.
  • The measurement processing unit 130 employs a block matching technique to evaluate whether pixels are corresponding points. That is, a block including a plurality of pixels is formed around the pixel of one converted image to be evaluated, a scanning block of the corresponding size is formed on the other converted image, and the scanning block is scanned along the β axis. It is desirable to set the block and the scanning block as rectangular areas centered on the pixel being evaluated.
  • The pixel at coordinates (β1, α1) in one converted image and the pixel at coordinates (β2, α2) in the other converted image are regarded as corresponding points when the evaluation value Vs given by equation (9) is minimized. Here, the block and the scanning block have (2N+1) pixels in the β direction and (2M+1) pixels in the α direction, I1(β1, α1) is the luminance value at coordinates (β1, α1) of one converted image, and I2(β2, α2) is the luminance value at coordinates (β2, α2) of the other converted image.
  • The evaluation value Vs used in equation (9) is known as the SAD (Sum of Absolute Differences), but other evaluation values, such as the SSD (Sum of Squared Differences) or a normalized cross-correlation function, can also be used to evaluate corresponding points.
  • Instead of block matching, other stereo matching techniques may be employed.
  • The measurement processing unit 130 extracts the corresponding points from the two converted images and then generates a parallax image having the parallax d as the pixel value at each coordinate (β, α).
  • An example of the parallax image is shown in FIG.
  • The measurement processing unit 130 obtains the coordinate value in the z direction in the real space from the parallax d of the point P; that is, three-dimensional information of the real space is obtained.
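As an illustration of the matching and parallax-image steps, the following is a minimal SAD-based sketch, not the patent's implementation. It assumes the converted images are stored as arrays indexed [alpha, beta] so that rows share the same angle α, and the block half-sizes N and M and the search bound max_d are assumed parameters.

```python
import numpy as np

def disparity_map(img1, img2, N=4, M=4, max_d=64):
    """Block matching along the beta axis of two converted images (eq. (9))."""
    H, W = img1.shape
    disp = np.zeros((H, W), dtype=np.float32)
    for a in range(M, H - M):            # alpha (row) index, same in both images
        for b1 in range(N, W - N):       # beta index in the first converted image
            block = img1[a - M:a + M + 1, b1 - N:b1 + N + 1].astype(np.float32)
            best_vs, best_d = np.inf, 0
            for d in range(max_d):       # scan the same alpha row of image 2
                b2 = b1 - d
                if b2 - N < 0:
                    break
                scan = img2[a - M:a + M + 1, b2 - N:b2 + N + 1].astype(np.float32)
                vs = np.abs(block - scan).sum()   # SAD evaluation value Vs
                if vs < best_vs:
                    best_vs, best_d = vs, d
            disp[a, b1] = best_d         # parallax d stored per (beta, alpha)
    return disp
```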
  • the map generation unit 131 generates an illumination range map in which the three-dimensional information obtained by the measurement processing unit 130 is associated with the illumination range of the lighting fixture 20 (21, 22, 23).
  • To generate the illumination range map, the map generation unit 131 lights the lighting fixtures 21, 22, and 23 individually, one at a time, captures an image with at least one of the cameras (11, 12) while each fixture is lit, and obtains the gray value (luminance value, pixel value) of each pixel.
  • The three images thus obtained are images in which only one lighting fixture 20 (21, 22, 23) is lit (see FIGS. 13B, 13C, and 13D).
  • A correspondence map image is generated with the same resolution as the captured image of the camera 11.
  • The map generation unit 131 compares the gray values at the same coordinates (u, v) across the three images captured by lighting the lighting fixtures 21, 22, and 23 one by one.
  • The map generation unit 131 then selects the image having the largest gray value (the highest received light intensity) and associates the lighting fixture 20 (21, 22, 23) that was lit when that image was captured with the coordinates (u, v).
  • In other words, the map generation unit 131 obtains, for each lighting fixture, the change in the luminance value (gray value) of the pixel corresponding to a predetermined position on the floor surface 40 in the image obtained by the imaging means (cameras 11 and 12).
  • Specifically, the map generation unit 131 obtains the difference between the first luminance value and the second luminance value as the change in luminance value.
  • The first luminance value is the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means (cameras 11 and 12) when the light output of the lighting fixture 20 is the first light output (for example, the light output corresponding to rated lighting), and the second luminance value is the luminance value of that pixel in the image obtained when the light output of the lighting fixture 20 is a second light output different from the first (for example, the light output corresponding to turning off).
  • the map generation unit 131 assigns the pixel to the illumination range of the lighting fixture 20 in which the change in the luminance value is maximized.
  • Identifiers distinguishable from one another are given to the lighting fixtures 21, 22, and 23, and the identifier of a lighting fixture 20 (21, 22, 23) is associated with each coordinate (u, v) of the captured image.
  • The identifier may be any value that can be associated with a pixel; for example, the identifier "1" is assigned to the lighting fixture 21, "2" to the lighting fixture 22, and "3" to the lighting fixture 23.
  • In this way, an image whose pixel values are the identifiers of the lighting fixtures 20 (21, 22, 23) is generated; this image is the correspondence map image (see FIG. 13E for an example).
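A minimal sketch of this correspondence map generation is given below; it is an illustration, not the patent's implementation, and it assumes the luminance change for each fixture is measured against an image taken with all fixtures off.

```python
import numpy as np

def correspondence_map(images_on, image_off):
    """Build the correspondence map image from per-fixture captures."""
    # images_on: one grayscale image per fixture, each taken with only that
    # fixture lit (cf. FIGS. 13B-13D); image_off: all fixtures off.
    diffs = np.stack([img.astype(np.int32) - image_off.astype(np.int32)
                      for img in images_on])
    # Each pixel gets the identifier (1, 2, 3, ...) of the fixture whose
    # lighting changed its luminance the most.
    return np.argmax(diffs, axis=0) + 1
```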
  • The map generation unit 131 converts the correspondence map image into an illumination range map by the procedure shown in FIG. 6.
  • In this procedure, the pixel values of the correspondence map image (the identifiers of the lighting fixtures 20 (21, 22, 23)) are associated with the coordinate values of the three-dimensional information.
  • First, the identifiers associated with the coordinate values of the three-dimensional information are initialized to 0 (S11).
  • The map generation unit 131 then obtains the parallax d by collating the converted coordinates (β, α) with the parallax image, and calculates the distance Zd corresponding to the coordinates (β, α) by equation (10) (S13).
  • Here, B is the distance between the optical center O1 of the camera 11 and the optical center O2 of the camera 12.
  • FIG. 7 shows the relationship between the point P in the real space, the optical centers O1 and O2 of the cameras 11 and 12, the angles β1 and β2, and the distance B.
  • The coordinate z is the distance from the xy plane to the point P in the coordinate system whose origin is the optical center O1 of the camera 11.
  • Since the cameras are installed with their optical axes vertical, the floor surface 40 of the illumination space 50 is a plane parallel to the xy plane (see FIG. 3) set with the camera 11 as a reference.
  • World coordinates with the origin on the floor surface 40 are defined, and the coordinates of the optical center O1 of the camera 11 in world coordinates are (X0, Y0, Z0). The Z axis of the world coordinates is positive from the floor surface 40 toward the ceiling, so in world coordinates a position on the floor surface 40 is represented by (X, Y, 0).
  • The world coordinates (X, Y, Z) corresponding to the coordinates (u, v) of the correspondence map image are given by equation (11); that is, the coordinates (u, v) are converted into coordinates (X, Y, Z) (S14).
  • The identifier set at the coordinates (u, v) of the correspondence map image is then associated with the coordinates (X, Y) of the illumination range map obtained by converting those coordinates (u, v) (S17).
  • This process is performed for all coordinates (u, v) of the correspondence map image (S19, S20).
  • If the Z value obtained this time is smaller than the stored Z value (S16: Yes), that is, if the part registered so far for the coordinates (X, Y) lies above the current one, the identifier of the luminaire 20 (21, 22, 23) is associated with the coordinates (X, Y) corresponding to (u, v) (S17), and the Z value obtained this time is stored for the coordinates (X, Y).
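The geometry behind equations (10) and (11) can be sketched as follows. The published forms of the equations are not reproduced in this text, so the formulas are reconstructions under the parallel-stereo geometry described above (baseline B along the x axis, optical axes pointing vertically down the z axis); the axis sign conventions are assumptions.

```python
import numpy as np

def depth_from_betas(beta1, beta2, B):
    """Sketch of eq. (10): depth z of point P from the angles in FIG. 7."""
    # beta1, beta2: angles around the y axis seen from cameras 11 and 12,
    # measured from the +x axis toward the +z (downward-looking) axis.
    return B / (1.0 / np.tan(beta1) - 1.0 / np.tan(beta2))

def camera_to_world(x, y, z, X0, Y0, Z0):
    """Sketch of eq. (11): camera frame of O1 to floor-referenced world frame."""
    # (X0, Y0, Z0) is the optical center O1 in world coordinates; the world
    # Z axis points from the floor toward the ceiling, so a point at depth z
    # below the camera sits at height Z0 - z.
    return X0 + x, Y0 + y, Z0 - z
```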
  • After generating the illumination range map by the process shown in FIG. 6, the map generation unit 131 performs correction processing so that each coordinate (X, Y) to which no identifier of a lighting fixture 20 (21, 22, 23) has been associated is associated with the identifier of one of the lighting fixtures 20 (21, 22, 23).
  • In this correction processing, the coordinate position immediately below each lighting fixture 20 (21, 22, 23) is estimated on the XY plane (floor surface), and the identifiers of the lighting fixtures 20 (21, 22, 23) are associated with coordinates (X, Y) according to the distance from those coordinate positions.
  • As the coordinate position immediately below a lighting fixture 20 (21, 22, 23), the average coordinate position (center of gravity) of each identifier in the pre-correction illumination range map is used.
  • That is, the center of gravity of the illumination range of each lighting fixture 20 (21, 22, 23) is taken as an estimate of the coordinate position immediately below that lighting fixture.
  • After obtaining the coordinate position immediately below each lighting fixture 20 (21, 22, 23), the map generation unit 131 uses equation (12) to obtain the distance D between each coordinate (X, Y) not associated with an identifier in the illumination range map and the coordinate position immediately below each lighting fixture 20 (21, 22, 23).
  • Here, (ug, vg) is the coordinate position immediately below a lighting fixture 20 (21, 22, 23), and has a different value for each lighting fixture 20 (21, 22, 23).
  • The distance D (D1, D2, D3) is therefore obtained for each lighting fixture 20 (21, 22, 23); since three lighting fixtures 20 (21, 22, 23) are used, three distances D are obtained for each coordinate (X, Y).
  • Since a luminaire 20 (21, 22, 23) is considered to have a larger influence the smaller its distance D to the coordinates (X, Y), the map generation unit 131 assigns the luminaire 20 (21, 22, 23) for which the obtained distance D is minimized to the coordinates (X, Y).
  • For example, the map generation unit 131 first generates an illumination range map as illustrated in FIG. 8, in which the identifiers of the lighting fixtures 21, 22, and 23 are associated with the elliptical regions F1, F2, and F3, respectively.
  • Next, the center-of-gravity positions (ug1, vg1), (ug2, vg2), and (ug3, vg3) are obtained for the regions F1, F2, and F3.
  • The luminaire 20 (21, 22, 23) corresponding to the center-of-gravity position (ug, vg) ((ug1, vg1), (ug2, vg2), (ug3, vg3)) at which the obtained distance D (D1, D2, D3) is minimized is then assigned to the coordinates (X, Y).
  • In this way, the lighting fixture 20 (21, 22, 23) with the minimum distance D is assigned to each coordinate (X, Y) that had no identifier; through this correction processing, the identifier of a lighting fixture 20 (21, 22, 23) is associated with every coordinate (X, Y) of the floor surface 40 of the illumination space 50.
  • Instead of the distance from the position immediately below each lighting fixture, the distance from nearby coordinates (X, Y) to which an identifier has already been assigned may be used; that is, coordinates not yet associated with a lighting fixture 20 (21, 22, 23) may be given the identifier assigned to a close position.
  • In either case, a lighting fixture 20 (21, 22, 23) is assigned to the coordinates (X, Y) using the positional relationship between the illumination range of each of the lighting fixtures 21, 22, and 23 and the position on the floor surface 40 represented by the coordinates (X, Y).
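A minimal sketch of the centroid-based correction described above follows; it assumes the illumination range map is held as a 2D array of fixture identifiers, with 0 marking unassigned coordinates.

```python
import numpy as np

def correct_range_map(range_map):
    """Assign unassigned floor coordinates to the nearest fixture (eq. (12))."""
    ids = [i for i in np.unique(range_map) if i != 0]
    # Centroid of each identifier's region (e.g. F1, F2, F3 in FIG. 8)
    # approximates the position (ug, vg) immediately below that fixture.
    centroids = {i: np.argwhere(range_map == i).mean(axis=0) for i in ids}
    corrected = range_map.copy()
    for xy in np.argwhere(range_map == 0):
        # Eq. (12): Euclidean distance D to each position immediately below;
        # the fixture with the minimum D claims the coordinate.
        corrected[tuple(xy)] = min(ids, key=lambda i: np.linalg.norm(xy - centroids[i]))
    return corrected
```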
  • The processing described above is performed in a setting mode that precedes the normal mode in which the lighting fixtures 20 (21, 22, 23) are controlled. The operation in the normal mode, in which the light output of the lighting fixtures 20 (21, 22, 23) is controlled according to the position of the person 30, is described next.
  • The person detection unit 132 detects the presence position of the person 30 using the images captured by the cameras 11 and 12. That is, the person detection unit 132 not only detects the presence or absence of the person 30 in the illumination space 50 but also detects (identifies) the position where the person 30 exists when the person 30 is in the illumination space 50.
  • the three-dimensional measurement unit 10 acquires three-dimensional information of the illumination space 50.
  • the three-dimensional information is generated according to the frame rates of the cameras 11 and 12, and for example, the three-dimensional information is acquired for each image output at 30 frames per second.
  • The operation by which the three-dimensional measurement unit 10 acquires the three-dimensional information is as described above: the coordinates (u, v) of the images captured by the cameras 11 and 12 are converted into (β, α), a parallax image is generated, and the three-dimensional information is calculated based on the parallax image.
  • A difference image has, as each pixel value, the difference between the pixel values at the same coordinates in the evaluation image to be compared and in a background image. Therefore, when an object not included in the background image appears in the evaluation image, the pixel values at the coordinates corresponding to the object become relatively large. Using this property, the presence position of the person 30 can be detected from a difference image.
  • Here, the difference between the parallax image obtained in the setting mode and the parallax image obtained in the normal mode is used to detect the presence position of the person 30, but the presence position may be detected by other methods.
  • For example, since a distance image can be generated by calculating the value of the distance z shown in FIG. 7, a distance image in the setting mode and a distance image in the normal mode can be generated and the presence position of the person 30 detected from their difference.
  • Alternatively, an area corresponding to the person 30 may be extracted from the image by a method such as the inter-frame difference method or template matching, and the presence position of the person 30 detected by calculating the parallax and distance for the extracted area.
  • The human detection unit 132 generates a difference image between the evaluation image, generated based on the three-dimensional information acquired in the normal mode, and the background image, and extracts the pixels whose pixel values exceed a prescribed threshold in the difference image as candidates for pixels corresponding to the person.
  • Specifically, the difference between the parallax dn(β, α) at coordinates (β, α) in the evaluation image and the parallax db(β, α) at the same coordinates in the background image is compared with an appropriately set threshold dth; when dn(β, α) − db(β, α) > dth holds, the pixel at coordinates (β, α) is taken as a candidate pixel corresponding to the person 30.
  • Alternatively, the human detection unit 132 may extract the pixels whose pixel values in the difference image are equal to or greater than a specified threshold as candidates for pixels corresponding to the person.
  • In this case, the difference image is as shown in FIG. 9C.
  • Next, connectivity is evaluated between the pixel of interest and its adjacent pixels (4- or 8-neighborhood); when both pixels exceed the threshold dth, the two pixels are determined to be connected.
  • For each set of pixels determined to be connected, the human detection unit 132 compares the area (number of pixels) of the set with a specified range, and if the area is within that range, the set is assumed to correspond to the person 30.
  • The position of a representative point with respect to the floor surface 40 of the illumination space 50 is then obtained. That is, the person detection unit 132 obtains the world coordinate values (X, Y, Z) of the set of pixels that is a candidate for the person 30 using the equations above, obtains the coordinates of the center of gravity of the set on the XY plane, and takes these coordinates as the position of the representative point of the person 30, that is, the presence position of the person 30.
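The detection procedure described above can be sketched as follows. This is a minimal illustration using SciPy's connected-component labeling, with the threshold dth and the area range as assumed tuning parameters; it returns the centroid of the first qualifying pixel set in image coordinates (the conversion to world coordinates would follow the equations above).

```python
import numpy as np
from scipy import ndimage

def detect_person(disp_eval, disp_bg, dth, min_area, max_area):
    """Find the person's representative point from parallax images."""
    # Candidate person pixels: dn(beta, alpha) - db(beta, alpha) > dth.
    candidates = (disp_eval.astype(np.float32) - disp_bg) > dth
    # Connectivity over the 8-neighborhood (a 4-neighborhood is also possible).
    labels, n = ndimage.label(candidates, structure=np.ones((3, 3)))
    for k in range(1, n + 1):
        region = labels == k
        area = int(region.sum())           # area = number of connected pixels
        if min_area <= area <= max_area:   # set assumed to correspond to the person
            return ndimage.center_of_mass(region)  # centroid = representative point
    return None                            # no person detected
```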
  • The collation unit 133 collates the presence position of the person 30 with the illumination range map and extracts the lighting fixture 20 (21, 22, 23) corresponding to the presence position of the person 30.
  • That is, the identifier of the luminaire 20 (21, 22, 23) associated with the position of the person 30 in the illumination range map is extracted.
  • the identifier extracted by the collation unit 133 is given to the control instruction unit 134.
  • The control instruction unit 134 determines the light outputs (light output instruction values) of the luminaires 21, 22, and 23 according to appropriately set rules, and notifies the illumination control unit 15 of the determined light output instruction values.
  • The simplest rule is to light only the lighting fixture 20 (21, 22, 23) corresponding to the identifier extracted by the collation unit 133 at its rating, and to turn off the other lighting fixtures 20 (21, 22, 23).
  • A rule may also be set so that the lighting fixture 20 (21, 22, 23) corresponding to the identifier extracted by the collation unit 133 is lit at its rating while the surrounding lighting fixtures 20 (21, 22, 23) are dimmed (for example, to 50% light output).
  • In this case, the lighting fixtures 20 (21, 22, 23) other than those instructed to perform rated lighting or dimmed lighting are turned off.
  • With these rules, the lighting fixture 20 (21, 22, 23) close to the person 30 is lit at its rating, so the illuminance necessary for performing work and ensuring safety is obtained.
  • Meanwhile, since the lighting fixtures 20 (21, 22, 23) far from the person, which have little effect on work or safety, are turned off, energy saving is realized.
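A minimal sketch of such a rule is given below; the neighbor relation between fixtures is an assumed configuration input, not something specified in the text.

```python
def light_outputs(selected_id, neighbors, all_ids):
    """Instruction values for the rated/dimmed/off rule described above."""
    outputs = {}
    for fixture in all_ids:
        if fixture == selected_id:
            outputs[fixture] = 100                       # rated lighting
        elif fixture in neighbors.get(selected_id, ()):  # adjacent fixtures
            outputs[fixture] = 50                        # dimmed to 50%
        else:
            outputs[fixture] = 0                         # turned off
    return outputs

# Example: the person is in the illumination range of fixture 21,
# and fixture 22 is configured as its neighbor.
print(light_outputs(21, {21: (22,)}, (21, 22, 23)))
# -> {21: 100, 22: 50, 23: 0}
```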
  • Moreover, since the lighting fixtures 20 are not associated with surfaces that intersect the floor surface 40 in the lighting space 50 (such as side surfaces of furniture or wall surfaces), the light output of the luminaires 21, 22, and 23 can be controlled so as to correspond to the position of the person 30 on the floor surface 40.
  • As described above, the illumination control apparatus includes the imaging means (cameras 11 and 12) that capture the illumination space 50 illuminated by the plurality of lighting fixtures 20; the three-dimensional measurement unit 10 that acquires three-dimensional information of the illumination space 50 from the captured images; the map generation unit 131 that uses the three-dimensional information to generate an illumination range map in which positions on the floor surface 40 of the illumination space 50 are assigned to the lighting fixtures 20; the person detection unit 132 that detects the position of the person 30 on the floor surface 40 in the illumination space 50 using the three-dimensional information; and the control instruction unit 134 that determines the light output of each luminaire 20 using the illumination ranges extracted by collating the position of the person 30 with the illumination range map.
  • the illumination control apparatus of the present embodiment includes a measurement processing unit 130, a map generation unit 131, a person detection unit 132, and a control instruction unit 134.
  • the measurement processing unit 130 generates three-dimensional information of the illumination space 50 based on an image (captured image) acquired from an imaging unit that images the illumination space 50 including the floor surface 40 illuminated by the plurality of lighting fixtures 20. Configured.
  • the map generation unit 131 is configured to generate an illumination range map representing the illumination range of the luminaire 20 on the floor surface 40 of the illumination space 50 based on the three-dimensional information.
  • the person detection unit 132 is configured to obtain an existing position where the person 30 exists on the floor surface 40 of the illumination space 50 based on the three-dimensional information.
  • the control instruction unit 134 is configured to determine an instruction value for each light output of the plurality of lighting fixtures 20 with reference to the presence position and the illumination range map.
  • The illumination control device of the present embodiment further includes the imaging means (cameras 11 and 12) and the illumination control unit 15.
  • When the illumination control unit 15 receives the instruction value from the control instruction unit 134, it controls the lighting fixture 20 so that the light output of the lighting fixture 20 becomes a value corresponding to the instruction value.
  • In the present embodiment, the three-dimensional measurement unit 10 uses the two cameras 11 and 12 as the imaging means and acquires the three-dimensional information of the illumination space 50 from the images captured by the cameras 11 and 12 using the principle of triangulation.
  • the imaging means includes two cameras 11 and 12.
  • the measurement processing unit 130 is configured to generate three-dimensional information from two images (captured images) obtained from the two cameras 11 and 12 using the principle of triangulation.
  • In the present embodiment, to generate the illumination range map, the map generation unit 131 changes the light output of each lighting fixture 20 in two stages, obtains, among the pixels constituting the image captured by the imaging means, the change in gray value corresponding to the two-stage light output of each lighting fixture 20 for the pixels corresponding to the floor surface 40 in the lighting space 50, and associates each pixel with the lighting fixture 20 that maximizes that change.
  • the map generation unit 131 obtains a change in the luminance value of a pixel corresponding to a predetermined position on the floor surface 40 in the image for each of the plurality of lighting fixtures 20, and illuminates the pixel with the largest change in the luminance value. It is configured to be assigned to the illumination range of the instrument 20.
  • the map generation unit 131 is configured to obtain a difference between the first luminance value and the second luminance value as a change in luminance value.
  • the first luminance value is the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means (cameras 11 and 12) when the light output of the lighting fixture 20 is a first light output (for example, a light output corresponding to rated lighting).
  • the second luminance value is the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means (cameras 11 and 12) when the light output of the lighting fixture 20 is a second light output different from the first light output (for example, a light output corresponding to turning off).
  • the map generation unit 131 is configured not to assign pixels corresponding to surfaces intersecting the floor surface 40 in the image to any illumination range.
  • when the illumination range map contains a position on the floor surface 40 of the illumination space 50 that is not associated with any lighting fixture 20, the map generation unit 131 assigns a lighting fixture 20 to that position using the positional relationship between the illumination ranges and the position.
  • that is, the map generation unit 131 is configured to assign each undetermined position to one of the illumination ranges based on the positional relationship between the illumination range of each of the plurality of lighting fixtures 20 and the undetermined position.
  • the map generation unit 131 estimates the position on the floor surface 40 immediately below each lighting fixture 20, based on the illumination range of each lighting fixture 20 in the illumination range map.
  • that is, the map generation unit 131 is configured to obtain, based on the illumination ranges, the positions on the floor surface 40 of the image immediately below the lighting fixtures.
  • the positional relationship between the illumination range and the undetermined position is the positional relationship between the position immediately below each of the plurality of lighting fixtures 20 and the undetermined position.
  • the illumination range of each lighting fixture 20 is associated with the floor surface 40 by using the three-dimensional information of the illumination space 50.
  • the received light intensities at the cameras 11 and 12 when the lighting fixtures 21, 22, and 23 are turned on one at a time are compared.
  • a difference image is generated between an image obtained when the lighting fixtures 21, 22, and 23 are turned on one by one and an image obtained when all the lighting fixtures 21, 22, and 23 are turned off.
  • the correspondence map image may be generated using the pixel value in the difference image.
  • the light output of the luminaire 20 may be changed in two stages that are both lighting states (two different dimming levels), rather than on and off. In short, it suffices that the change in the light output of the luminaire 20 (21, 22, 23) is reflected in a change in the gray values captured by the cameras 11 and 12.
  • the map generation unit 131 may calculate, for each of the plurality of lighting fixtures 20, the contribution to the pixel corresponding to a predetermined position on the floor surface 40 in the image, and may assign the pixel to the illumination range of the lighting fixture 20 whose contribution is largest.
  • the map generation unit 131 obtains the luminance value of the pixel when the light output of one of the plurality of lighting fixtures 20 is a first light output (for example, a light output corresponding to rated lighting) and the light output of the other lighting fixtures 20 is a second light output lower than the first light output (for example, a light output corresponding to turning off).
  • the map generation unit 131 is configured to obtain a ratio of the luminance value I corresponding to one lighting fixture 20 to the total S of the luminance values I corresponding to the plurality of lighting fixtures 20 as a contribution.
  • in the first embodiment, one illumination range map covering the plurality of lighting fixtures 20 (21, 22, 23) is generated; in this embodiment, an illumination range map is generated for each lighting fixture 20 (21, 22, 23). That is, for the three lighting fixtures 21, 22, and 23, three illumination range maps are generated.
  • the map generation unit 131 turns on the lighting fixtures 21, 22, and 23 one by one to generate three captured images.
  • an illumination range map in which the identifier of the lighting fixture 20 (21, 22, 23) is associated with the coordinate values of the three-dimensional information is generated using the gray values of the three captured images.
  • three individually lit images are generated using three captured images.
  • each individually lit image is obtained by dividing the gray value I at coordinates (u, v) of one captured image by the sum S of the gray values of all the captured images at the same coordinates (u, v).
  • the gray value obtained when all the lighting fixtures 21, 22, and 23 are lit at the rated value may be regarded as the sum S of the gray values of the captured images obtained when the lighting fixtures 21, 22, and 23 are individually lit.
  • the gray value I of the captured image (individual lighting image corresponding to the lighting fixture 21) obtained when only the lighting fixture 21 is turned on is set to I1.
  • the grayscale value I of the captured image (individual lighting image corresponding to the lighting fixture 22) obtained when only the lighting fixture 22 is turned on is defined as I2.
  • the gray value I of the captured image (individual lighting image corresponding to the lighting fixture 23) obtained when only the lighting fixture 23 is turned on is set to I3.
  • then S = I1 + I2 + I3.
  • the pixel value I / S of the individually lit image is an index expressing the degree (contribution) to which each lighting fixture 21, 22, 23 affects the position represented by coordinates (u, v). Therefore, three individually lit images are obtained whose pixel values at coordinates (u, v) are I1 / S, I2 / S, and I3 / S (see the code sketch after this list).
  • the circular area indicates the field of view of the camera (11, 12).
  • the individual lighting images are the same as those shown in FIGS. 13B, 13C, and 13D.
  • the correspondence map image described in the first embodiment is generated.
  • the individual lighting image corresponding to the lighting fixture 21 (FIG. 10A) is bright at the top, the individual lighting image corresponding to the lighting fixture 22 (FIG. 10B) is bright at the center, and the individual lighting image corresponding to the lighting fixture 23 (FIG. 10C) is bright at the bottom.
  • the map generation unit 131 uses the three individual lighting images to associate the coordinates of the floor surface 40 in the illumination space 50 with the identifiers of the lighting fixtures 20 (21, 22, 23), and generates three illumination range maps.
  • in the first embodiment, one illumination range map includes the identifiers of the plurality of lighting fixtures 21, 22, and 23; in this embodiment, one illumination range map includes the identifier of only one lighting fixture 20 (21, 22, 23). In other words, three illumination range maps are generated from the three individual lighting images, one for each identifier of the lighting fixtures 20 (21, 22, 23).
  • in addition, the contribution at the coordinates (X, Y) of each lighting fixture 20 (21, 22, 23) is associated with its illumination range map.
  • the pixel values (I1 / S, I2 / S, I3 / S) at the coordinates (u, v) corresponding to the coordinates (X, Y) in the individual lighting images are used as the contributions at the coordinates (X, Y).
  • the procedure for generating the illumination range map is basically the same as that of the first embodiment shown in FIG. 6. That is, after the illumination range map is initialized, the pixel value at coordinates (u, v) of the individually lit image is associated with the coordinates (X, Y) on the floor surface. Furthermore, association with surfaces intersecting the floor surface 40 is prevented.
  • an illumination range map for each lighting fixture 20 is formed by generating an illumination range map for each individual lighting image.
  • for example, from the individually lit images shown in FIGS. 10A, 10B, and 10C, the three illumination range maps shown in FIGS. 10D, 10E, and 10F are generated.
  • the brightness in the illustrated illumination range map represents the contribution, and the higher the brightness, the greater the contribution.
  • the collation unit 133 collates the presence position of the person 30 obtained by the person detection unit 132 with all the illumination range maps, and extracts, for each illumination range map, the contribution at the coordinates (X, Y) corresponding to the presence position of the person 30.
  • the control instruction unit 134 compares the contribution (I / S) extracted for each illumination range map with a prescribed threshold value, and turns on the lighting fixtures 20 (21, 22, 23) whose contribution is equal to or greater than the threshold.
  • for example, suppose the contribution at the position of the person 30 is equal to or greater than the threshold in FIGS. 10D and 10E, and less than the threshold in FIG. 10F.
  • the control instruction unit 134 turns on the lighting fixtures 21 and 22 and turns off the lighting fixture 23.
  • in this way, the map generation unit 131 obtains, for each of the plurality of lighting fixtures 20, the contribution to the pixel corresponding to a predetermined position on the floor surface 40 in the image, and is configured to assign the pixel to the illumination range of each lighting fixture 20 whose contribution is equal to or greater than a prescribed threshold.
  • the light output of one of the plurality of lighting fixtures 20 is a first light output (for example, a light output corresponding to rated lighting), and the light output of the other lighting fixtures 20 is a second light output lower than the first light output; the map generation unit 131 obtains the luminance value of the pixel in the image captured at that time.
  • the map generation unit 131 is configured to obtain a ratio of the luminance value I corresponding to one lighting fixture 20 to the total S of the luminance values I corresponding to the plurality of lighting fixtures 20 as a contribution.
  • to generate the illumination range map, the map generation unit 131 may change the light output of each lighting fixture 20 in two stages; for each pixel associated with the floor surface 40 of the illumination space 50 among the pixels constituting the image captured by the imaging means (cameras 11 and 12), it obtains the change in gray value corresponding to the two-stage light output of each lighting fixture 20, and associates with the pixel every lighting fixture 20 whose change is equal to or greater than a prescribed threshold.
  • that is, the map generation unit 131 may be configured to obtain, for each of the plurality of lighting fixtures 20, the change in the luminance value of the pixel corresponding to a predetermined position on the floor surface 40 in the image, and to assign the pixel to the illumination range of each lighting fixture 20 for which the change in luminance value is equal to or greater than a prescribed threshold.
  • the map generation unit 131 is configured to obtain the difference between the first luminance value and the second luminance value as a change in luminance value.
  • the first luminance value is the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means (cameras 11 and 12) when the light output of the lighting fixture 20 is a first light output (for example, a light output corresponding to rated lighting).
  • the second luminance value is the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means (cameras 11 and 12) when the light output of the lighting fixture 20 is a second light output different from the first light output (for example, a light output corresponding to turning off).
  • in this way, the number of lighting fixtures 20 (21, 22, 23) to be lit can be changed dynamically according to the location of the person 30, without setting a rule in the control instruction unit 134 specifying which of the plurality of lighting fixtures 20 (21, 22, 23) are allowed to light. Other configurations and operations are the same as those of the first embodiment.
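As referenced in the list above, the following is a minimal sketch of the contribution computation and the threshold-based lighting decision of this embodiment, assuming three grayscale captures taken with one fixture lit at a time. The function names, the threshold value, and working directly in image coordinates (u, v) rather than floor coordinates (X, Y) are simplifications for illustration, not details from the patent.

```python
import numpy as np

def contribution_maps(individual_images):
    """Per-fixture contribution images I_k / S from grayscale captures
    taken with one fixture lit at a time (individually lit images)."""
    stack = np.stack([img.astype(np.float64) for img in individual_images])
    total = stack.sum(axis=0)                 # S = I1 + I2 + I3 at each pixel
    total[total == 0] = np.finfo(float).eps   # guard against division by zero
    return stack / total                      # pixel values I_k / S

def fixtures_to_light(contributions, person_uv, threshold=0.2):
    """Indices of fixtures whose contribution at the pixel corresponding
    to the person's position is at least the threshold."""
    u, v = person_uv
    return [k for k, c in enumerate(contributions) if c[v, u] >= threshold]
```

For example, with the three individually lit images of FIGS. 10A to 10C, `fixtures_to_light` would return the indices of fixtures 21 and 22 but not 23 when the person stands where the contributions of FIGS. 10D and 10E meet the threshold.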

Abstract

This lighting control device is provided with: a measurement processor for generating three-dimensional information of an illumination space including a floor surface illuminated by a plurality of illumination devices, on the basis of an image acquired from an image-capturing means for capturing an image of the illumination space; a map-generating unit for generating, on the basis of the three-dimensional information, an illumination range map representing the illumination range of each of the plurality of illumination devices on the floor surface of the illumination space; a person-detecting unit for determining, on the basis of the three-dimensional information, the position where a person is present on the floor surface of the illumination space; and a control designator for determining a designation value for the light output of each of the plurality of illumination devices with reference to the presence position and the illumination range map.

Description

Lighting control device
The present invention relates to a lighting control device.
Conventionally, techniques have been proposed for automatically adjusting the light output of lighting fixtures according to the presence or absence of a person in the illumination space illuminated by the fixtures. This type of technique is adopted to prevent lighting fixtures from being left on by lighting them only while a person is present in the illumination space, and to suppress wasteful increases in power consumption by reducing the light output of the fixtures when no one is present.
The presence or absence of a person is often detected using a human sensor equipped with a pyroelectric infrared sensor. However, techniques have also been proposed that acquire an image of the illumination space using an imaging means such as a camera, determine from the image whether a person is present, and detect the person's position.
When the illumination space is wide, as in an office or a store, a plurality of lighting fixtures are generally arranged. Each lighting fixture does not illuminate the entire illumination space but takes a part of the illumination space as its illumination range. Here, the illumination range means a range in which the illuminance required for the intended use of the illumination space can be obtained. For example, when the lighting fixtures are arranged on the ceiling, the illumination range is the range in which the illuminance on the floor surface is equal to or greater than a specified illuminance.
When one illumination space is illuminated by a plurality of lighting fixtures, if the position of a person in the illumination space is known, the light output of the lighting fixtures can be adjusted according to that position. That is, when a person is localized in the illumination space, the power consumption of the entire illumination space is reduced by relatively increasing the light output of the lighting fixtures near the person and relatively decreasing the light output of the remaining lighting fixtures. Here, increasing the light output includes full lighting (rated lighting), and decreasing the light output includes turning off.
In the operation described above, the light output of the lighting fixtures whose illumination ranges contain no person is made relatively small, so the power consumed by the lighting fixtures is reduced compared with the case where the light output is kept constant throughout the illumination space. Moreover, since the light output of the lighting fixtures near the person is made relatively large, the illuminance required for the use of the illumination space is secured.
To adjust the light output of the lighting fixtures according to the position of a person, it is conceivable to provide each lighting fixture with a human sensor using a pyroelectric infrared sensor and to adjust the light output of the lighting fixtures around the person in a coordinated manner according to whether the human sensors detect a person. In this case, communication between the lighting fixtures is necessary to coordinate the plurality of lighting fixtures. If this configuration is adopted, as many human sensors as lighting fixtures are required, and each lighting fixture also needs a circuit configuration for coordinating with the other lighting fixtures, which results in increased cost.
On the other hand, when a technique for detecting the position of a person from images acquired by an imaging means is adopted, it is not necessary to provide an imaging means for each lighting fixture, and the position of a person within the illumination ranges of a plurality of lighting fixtures can be detected by a single imaging means. When this type of technique is adopted, the entire illumination space is often imaged by a single imaging means, although this depends on the size of the illumination space.
However, when the position of a person is detected using an imaging means, the positions in the image captured by the imaging means must be associated in advance with the positions in the illumination space.
Document 1 (Japanese Published Patent Application No. 2009-283183) describes a technique that uses two image sensors capturing images from two mutually orthogonal directions, calculates the position coordinates of a person based on the imaging data obtained from the two image sensors, and performs dimming control of the lighting fixtures so that the closer a fixture is to the position of the person, the brighter it is.
In Document 1, in order to match the logical addresses of the lighting fixtures in the imaging data with their actual physical addresses, the lighting fixtures are turned on one by one in sequence, and the coordinates of each lighting fixture are obtained using the position of the region that becomes bright in the imaging data. In other words, the lighting fixtures are turned on one at a time, and the physical address of the lit lighting fixture is associated with the logical address of the region that becomes bright in the image.
With the technique described in Document 1, however, when three-dimensional objects such as desks, shelves, lockers, office machines, or partitions exist in the illumination space, the logical addresses and physical addresses may not be associated correctly. If the logical addresses and physical addresses are not associated correctly, an inappropriate lighting fixture may be controlled when the light output of the lighting fixtures is adjusted based on the position of a person in the illumination space.
Now, as shown in FIG. 11, consider a case where, in an illumination space including lighting fixtures 24 and 25, a desk 26 is placed near the position directly below the lighting fixture 25. The camera 16 serving as the imaging means is assumed to be placed between the lighting fixtures 24 and 25. In the illustrated arrangement, when a person 31 is near the desk 26, it is desirable to make the light output of the lighting fixture 25 relatively large and the light output of the lighting fixture 24, which is placed far from the desk 26, relatively small.
Here, if one side surface of the desk 26 is included in the field of view of the camera 16, direct light from the lighting fixture 24 is incident on that side surface of the desk 26, but direct light from the lighting fixture 25 is not. Therefore, as shown in FIG. 12, in the image captured by the camera 16, that side surface of the desk 26 is associated with the lighting fixture 24. In the image shown in FIG. 12, the shaded area is the area associated with the lighting fixture 24, and the remaining area is associated with the lighting fixture 25.
On the other hand, for the person 31 near the desk 26, the position of the person 31 may overlap the area near the desk 26 associated with the lighting fixture 24 (the shaded area), as shown in FIG. 12. In this case, the light output of the lighting fixture 24 becomes relatively larger than the light output of the lighting fixture 25, causing the problem that the intended operation cannot be obtained.
FIG. 13 shows an example of actual operation. FIG. 13(a) is an example of an illumination space, showing a case where illumination is performed by three lighting fixtures. The illumination range of each lighting fixture is indicated by three regions E1, E2, and E3 on the floor surface. A plurality of three-dimensional objects 27 (desks, shelves, office machines, etc.) are placed on the floor of the illumination space, and a camera 16 is placed on the ceiling. In the illustrated example, the camera 16 is placed on the ceiling near the boundary between the regions E1 and E2.
The images captured by the camera 16 when the lighting fixtures are individually turned on one at a time are as shown in FIGS. 13(b), (c), and (d). When the physical address of the lit lighting fixture is associated with the logical address of the region that becomes bright in each image, the physical addresses of the lighting fixtures are associated within the image captured by the camera 16, as shown in FIG. 13(e). In FIG. 13(e), the physical addresses of the lighting fixtures are represented using three levels of gray value.
Ideally, the physical addresses of the lighting fixtures should each form a single region per lighting fixture, like the regions E1, E2, and E3 shown in FIG. 13(a). However, since the three-dimensional objects 27 exist in the actual illumination space, the physical address of one lighting fixture is divided into a plurality of regions, as shown in FIG. 13(e), and enclave-like regions (the regions surrounded by broken lines in FIG. 13(e)) may arise for one physical address. When such enclave-like regions arise, there is a possibility, as described above, of controlling a lighting fixture inappropriate for the position of the person.
An object of the present invention is to provide a lighting control device that, by using three-dimensional information of the illumination space, makes it possible to associate the illumination ranges of the lighting fixtures with the floor surface and, as a result, to control the light output of the lighting fixtures at appropriate positions using the position of the person on the floor surface.
A lighting control device according to a first aspect of the present invention includes a measurement processing unit, a map generation unit, a person detection unit, and a control instruction unit. The measurement processing unit is configured to generate three-dimensional information of an illumination space, including a floor surface illuminated by a plurality of lighting fixtures, based on an image acquired from an imaging means that images the illumination space. The map generation unit is configured to generate, based on the three-dimensional information, an illumination range map representing the illumination ranges of the lighting fixtures on the floor surface of the illumination space. The person detection unit is configured to obtain, based on the three-dimensional information, the position where a person is present on the floor surface of the illumination space. The control instruction unit is configured to determine an instruction value for the light output of each of the plurality of lighting fixtures with reference to the presence position and the illumination range map.
In a lighting control device according to a second aspect of the present invention, in the first aspect, the imaging means includes two cameras. The measurement processing unit is configured to generate the three-dimensional information from the two images respectively obtained from the two cameras, using the principle of triangulation.
In a lighting control device according to a third aspect of the present invention, in the first or second aspect, the map generation unit is configured to obtain, for each of the plurality of lighting fixtures, the change in the luminance value of the pixel corresponding to a predetermined position on the floor surface in the image, and to assign the pixel to the illumination range of the lighting fixture for which the change in luminance value is largest. The map generation unit is configured to obtain, as the change in luminance value, the difference between a first luminance value and a second luminance value of the pixel corresponding to the predetermined position. The first luminance value is the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means when the light output of the lighting fixture is a first light output. The second luminance value is the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means when the light output of the lighting fixture is a second light output different from the first light output.
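As a concrete illustration of this third aspect, the following is a minimal sketch assuming one capture per fixture at the first light output (with the others at the second) plus one capture with every fixture at the second light output; the function names and capture procedure are assumptions for illustration, not details specified in this passage.

```python
import numpy as np

def assign_by_luminance_change(lit_images, all_off_image):
    """Assign each pixel to the fixture whose two-stage light output
    change produces the largest luminance change.
    lit_images: 2-D arrays, one per fixture, with fixture k at the first
                light output (e.g. rated lighting) and the rest at the
                second light output (e.g. turned off).
    all_off_image: 2-D array with every fixture at the second output."""
    diffs = np.stack([img.astype(np.float64) - all_off_image
                      for img in lit_images])  # first minus second luminance
    return diffs.argmax(axis=0)                # winning fixture index per pixel
```

The fourth aspect below differs only in that each pixel is assigned to every fixture whose luminance change meets a prescribed threshold, rather than to the single largest.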
In a lighting control device according to a fourth aspect of the present invention, in the first or second aspect, the map generation unit is configured to obtain, for each of the plurality of lighting fixtures, the change in the luminance value of the pixel corresponding to a predetermined position on the floor surface in the image, and to assign the pixel to the illumination range of each lighting fixture for which the change in luminance value is equal to or greater than a prescribed threshold. The map generation unit is configured to obtain, as the change in luminance value, the difference between the first luminance value and the second luminance value of the pixel corresponding to the predetermined position, with the first and second luminance values defined as in the third aspect.
In a lighting control device according to a fifth aspect of the present invention, in the first or second aspect, the map generation unit is configured to obtain, for each of the plurality of lighting fixtures, the contribution to the pixel corresponding to a predetermined position on the floor surface in the image, and to assign the pixel to the illumination range of the lighting fixture whose contribution is largest. The map generation unit is configured to obtain the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means when the light output of one of the plurality of lighting fixtures is a first light output and the light output of the other lighting fixtures is a second light output lower than the first light output. The map generation unit is configured to obtain, as the contribution, the ratio of the luminance value corresponding to the one lighting fixture to the sum of the luminance values corresponding to the plurality of lighting fixtures.
In a lighting control device according to a sixth aspect of the present invention, in the first or second aspect, the map generation unit is configured in the same manner, except that the pixel is assigned to the illumination range of each lighting fixture whose contribution is equal to or greater than a prescribed threshold.
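In symbols (notation introduced here only for illustration; the patent states the ratio in prose), the contribution of fixture k at pixel (u, v) is

  C_k(u, v) = I_k(u, v) / S(u, v),  where S(u, v) = Σ_{j=1..N} I_j(u, v),

and the pixel is assigned to the fixture maximizing C_k(u, v) in the fifth aspect, or to every fixture with C_k(u, v) ≥ T for a prescribed threshold T in the sixth aspect.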
In a lighting control device according to a seventh aspect of the present invention, in any one of the first to sixth aspects, the map generation unit is configured not to assign pixels corresponding to surfaces intersecting the floor surface in the image to any illumination range.
In a lighting control device according to an eighth aspect of the present invention, in any one of the first to seventh aspects, the map generation unit is configured so that, when the illumination range map contains an undetermined position not included in any illumination range, the undetermined position is assigned to one of the illumination ranges based on the positional relationship between the illumination range of each of the plurality of lighting fixtures and the undetermined position.
In a lighting control device according to a ninth aspect of the present invention, in the eighth aspect, the map generation unit is configured to obtain, based on the illumination range, the position on the floor surface of the image immediately below the lighting fixture.
In a lighting control device according to a tenth aspect of the present invention, in the ninth aspect, the positional relationship is the positional relationship between the position immediately below each of the plurality of lighting fixtures and the undetermined position.
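The eighth to tenth aspects leave open how the position immediately below a fixture is computed from its illumination range; one plausible reading, used here purely for illustration, is the luminance-weighted centroid of the range, with each undetermined floor position then assigned to the fixture whose directly-below point is nearest. All names and the centroid choice are assumptions.

```python
import numpy as np

def directly_below(range_mask, weights):
    """Estimate the floor position immediately below a fixture as the
    weighted centroid of its illumination range (an assumed heuristic)."""
    ys, xs = np.nonzero(range_mask)
    w = weights[ys, xs]
    return float(np.average(xs, weights=w)), float(np.average(ys, weights=w))

def assign_undetermined(position_xy, below_points):
    """Assign an undetermined floor position to the fixture whose
    directly-below point is closest."""
    dists = [np.hypot(position_xy[0] - bx, position_xy[1] - by)
             for bx, by in below_points]
    return int(np.argmin(dists))
```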
A lighting control device according to an eleventh aspect of the present invention, in any one of the first to tenth aspects, further includes the imaging means and an illumination control unit. The illumination control unit is configured to, upon receiving the instruction value, control the lighting fixture so that the light output of the lighting fixture becomes a value corresponding to the instruction value.
FIG. 1 is a schematic block diagram showing the lighting control device of Embodiment 1.
FIG. 2 shows an example arrangement of the cameras in the lighting control device of Embodiment 1.
FIG. 3 is a diagram explaining the principle of the lighting control device of Embodiment 1.
FIG. 4 is a diagram explaining the principle of the lighting control device of Embodiment 1.
FIG. 5 shows an operation example of the lighting control device of Embodiment 1.
FIG. 6 is an operation explanatory diagram showing the procedure for generating the illumination range map in the lighting control device of Embodiment 1.
FIG. 7 is a diagram explaining the concept of parallax in the lighting control device of Embodiment 1.
FIG. 8 is a diagram explaining the correction processing of the illumination range map in the lighting control device of Embodiment 1.
FIG. 9 shows an operation example of the lighting control device of Embodiment 1.
FIG. 10 is an operation explanatory diagram of the lighting control device of Embodiment 2.
FIG. 11 is a schematic block diagram showing an example arrangement of lighting fixtures.
FIG. 12 is a diagram explaining the problem associated with the arrangement example of FIG. 11.
FIG. 13 is a diagram explaining conventional operation.
The present invention relates to a lighting control device, and more particularly to a lighting control device that automatically adjusts the light output of lighting fixtures according to the position of a person.
The lighting control device of the embodiment of the present invention described below assumes three-dimensional measurement by stereo vision using two cameras in the three-dimensional measurement unit that acquires three-dimensional information of the illumination space. However, the configuration of the three-dimensional measurement unit is not limited, as long as three-dimensional information is acquired from images captured by a camera serving as the imaging means. For example, in addition to a passive configuration such as stereo vision, an active configuration that acquires three-dimensional information by projecting a light pattern, such as the light-section method or the phase shift method, may be used. These techniques use the principle of triangulation.
Alternatively, the three-dimensional measurement unit may employ a technique that projects intensity-modulated light whose intensity varies with a constant period, receives the reflected light with a camera, and generates a distance image by obtaining the distance for each pixel from the phase difference between the projected and received intensity-modulated light. This technique is called the TOF (Time of Flight) method because it measures the flight time of light.
(Embodiment 1)
As shown in FIG. 1, the lighting control device of the present embodiment includes a three-dimensional measurement unit 10, a storage unit 14, and an illumination control unit 15.
The three-dimensional measurement unit 10 includes two cameras 11 and 12 as imaging means, arranged so that their fields of view of the three-dimensional real space substantially overlap. The three-dimensional measurement unit 10 also includes an arithmetic processing unit 13.
The cameras 11 and 12 are each configured by combining an optical system with an area image sensor in which pixels are arranged at the lattice points of a two-dimensional lattice; a CCD image sensor, a CMOS image sensor, or the like is used as the area image sensor. The two cameras 11 and 12 have the same specifications, such as image resolution and focal length.
As shown in FIG. 2, the two cameras 11 and 12 are arranged with their optical axes Ax1 and Ax2 parallel to each other and orthogonal to the direction along the line connecting the centers C1 and C2 of the light receiving surfaces PL1 and PL2 (called the "baseline direction"). Here, the optical axes Ax1 and Ax2 are the straight lines connecting the optical centers O1 and O2 with the centers C1 and C2 of the light receiving surfaces PL1 and PL2.
Each of the cameras 11 and 12 includes an optical system (not shown) that is a fisheye lens with an angle of view close to 180 degrees, or that has a function equivalent to such a fisheye lens. The projection method of the optical system is not particularly limited, but the following description uses the equidistant projection method.
The two cameras 11 and 12 are also arranged so that the baseline direction coincides with the horizontal direction of the light receiving surfaces PL1 and PL2. That is, parallel stereo is assumed.
Note that the projection method of the optical system is not limited to the equidistant projection method; the optical system may have other distortion characteristics, and since calibration is performed, an optical system with unknown distortion characteristics can also be used.
The configuration of the three-dimensional measurement unit 10 described above is not intended to be limiting; however, by adopting it, a viewing angle wide enough to image the entire illumination space is obtained, and the amount of computation required to calculate three-dimensional information from the images captured by the cameras 11 and 12 is reduced.
Here, the cameras 11 and 12 are arranged at a high place such as the ceiling so that their optical axes Ax1 and Ax2 both point vertically downward. The cameras 11 and 12 are also arranged so that the entire illumination space, which is the illumination target of the lighting fixtures described later, is included in their fields of view.
In the following, as shown in FIG. 3, the direction along the horizontal direction of the light receiving surfaces PL1 and PL2 (that is, the baseline direction) is the x direction, the direction along the vertical direction of the light receiving surfaces is the y direction, and the direction orthogonal to the light receiving surfaces is the z direction.
The coordinate system is defined so that, when the images captured by the cameras 11 and 12 are displayed on a monitor, rightward in the horizontal direction is the positive x direction and downward in the vertical direction is the positive y direction. For the z direction, the direction away from the light receiving surface of the camera is positive; that is, the positive z direction is the front direction of the camera.
For example, the pixel position in the image captured by the camera 11 (the first image) is expressed in the form (u1, v1), and the pixel position in the image captured by the camera 12 (the second image) is expressed in the form (u2, v2). The u1 and u2 axes are coordinate axes in the x direction, and the v1 and v2 axes are coordinate axes in the y direction. Since the arrangement is parallel stereo, the u1 axis and the u2 axis lie on a single straight line.
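For orientation, the following is a minimal sketch of the parallel-stereo triangulation this arrangement enables, under a simplifying pinhole assumption (depth z = f·B/d for focal length f in pixels, baseline B, and disparity d = u1 − u2). The cameras here use fisheye optics, so the actual relation in this device differs, and the names and numbers below are illustrative only.

```python
def depth_from_disparity(u1, u2, f_px, baseline_m):
    """Pinhole parallel-stereo depth: matching points share the same v
    coordinate, and depth follows from the horizontal disparity."""
    d = u1 - u2
    if d <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return f_px * baseline_m / d

# Hypothetical example: u1 = 420, u2 = 400 with f = 800 px and a 0.1 m
# baseline gives z = 800 * 0.1 / 20 = 4.0 m.
```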
In the configuration example shown in FIG. 1, a plurality of (three in the illustrated example) lighting fixtures 20 (21, 22, 23) are arranged at a high place such as the ceiling, and the cameras 11 and 12 are arranged between the lighting fixture 21 and the lighting fixture 22.
As described above, the fields of view of the cameras 11 and 12 are set to include the entire illumination space illuminated by the three lighting fixtures 21, 22, and 23. In FIG. 1, the three lighting fixtures 21, 22, and 23 are arranged in a straight line for convenience of illustration, but the number of lighting fixtures 20 may be three or more, and the lighting fixtures 20 may be arranged in any positional relationship.
The image data (images) output from the cameras 11 and 12 are input to the arithmetic processing unit 13. In the present embodiment, the case where the image data are grayscale images is described as an example, but the technical idea described below is applicable even if the image data are color images.
The arithmetic processing unit 13 has a computer as a hardware resource, and executes a program for causing the computer to function as a device that performs the processing described below. Through this program, the arithmetic processing unit 13 functions as a measurement processing unit 130, a map generation unit 131, a person detection unit 132, a collation unit 133, and a control instruction unit 134.
However, the arithmetic processing unit 13 may be configured with dedicated hardware. Besides a computer having a microcomputer, it may be configured using a device having the function of executing a program, such as a DSP (Digital Signal Processor) or an FPGA (Field-Programmable Gate Array).
When the cameras 11 and 12 output analog signals, an AD converter that converts the analog signals into digital signals is provided between the cameras 11 and 12 and the arithmetic processing unit 13. A filter circuit or the like that removes unnecessary components such as noise from the image data may also be provided between the cameras 11 and 12 and the arithmetic processing unit 13.
The system programs and application programs for operating the arithmetic processing unit 13 are stored in the storage unit 14. The captured image data to be processed by the arithmetic processing unit 13 and data produced during computation are also stored in the storage unit 14, which functions as a data memory and a working memory. Accordingly, the storage unit 14 includes a storage medium that retains its contents without power, such as a flash memory or a hard disk drive, and a volatile memory serving as the main storage in which the system programs and application programs are placed during execution.
The arithmetic processing unit 13 includes a measurement processing unit 130 that generates three-dimensional information of the illumination space using the image data acquired from the cameras 11 and 12. That is, the measurement processing unit 130 is configured to generate three-dimensional information of the illumination space 50 based on the images acquired from the imaging means (cameras 11 and 12) that image the illumination space 50 including the floor surface 40 illuminated by the plurality of lighting fixtures 20. In the present embodiment, a part of the storage unit 14 and the measurement processing unit 130 constitute the three-dimensional measurement unit 10 together with the cameras 11 and 12. The technique for generating three-dimensional information from the image data will be described later.
The arithmetic processing unit 13 also includes a map generation unit 131 that generates an illumination range map representing the illumination range of each lighting fixture 21, 22, 23, and a person detection unit 132 that detects the presence position of the person 30 using the three-dimensional information.
That is, the map generation unit 131 is configured to generate, based on the three-dimensional information generated by the measurement processing unit 130, an illumination range map representing the respective illumination ranges of the plurality of lighting fixtures 20 in the illumination space 50. The illumination range map represents the illumination ranges of the lighting fixtures 20 on the floor surface 40 of the illumination space 50. In particular, in the present embodiment, the illumination range map represents the illumination range of each of the plurality of lighting fixtures 20 on the floor surface 40 of the illumination space 50. The illumination range is, for example, the area of the floor surface 40 illuminated by the lighting fixture 20.
The person detection unit 132 is configured to obtain, based on the three-dimensional information generated by the measurement processing unit 130, the position where a person is present in the illumination space 50. The presence position is the position where the person 30 exists on the floor surface 40 of the illumination space 50.
The illumination range map generated by the map generation unit 131 is stored in a map storage unit 141 provided in the storage unit 14. The map storage unit 141 is preferably composed of a rewritable nonvolatile memory.
The arithmetic processing unit 13 is provided with a collation unit 133 that, when the person detection unit 132 detects the presence position of the person 30, collates the detected position with the illumination range map to extract the lighting fixtures 20 (21, 22, 23) whose illumination range includes that position.
That is, the collation unit 133 is configured to refer to the illumination range map generated by the map generation unit 131 and select, from the plurality of lighting fixtures 21, 22, and 23, the lighting fixture 20 having an illumination range that includes the presence position obtained by the person detection unit 132.
The information identifying the lighting fixtures 21, 22, and 23 extracted by the collation unit 133 is given to the control instruction unit 134 provided in the arithmetic processing unit 13, and the control instruction unit 134 determines the light output of each lighting fixture 21, 22, 23 according to a predetermined rule. That is, the control instruction unit 134 is configured to determine the instruction value for the light output of each of the plurality of lighting fixtures 20 with reference to the presence position obtained by the person detection unit 132 and the illumination range map generated by the map generation unit 131.
For example, the rule may be set so that only the lighting fixtures 20 (21, 22, 23) extracted by the collation unit 133 are lit at the rated output.
In that case, when the collation unit 133 selects the lighting fixture 21, the control instruction unit 134 selects the instruction value corresponding to rated lighting as the instruction value for the light output of the lighting fixture 21, and selects the instruction value corresponding to turning off as the instruction value for the light output of the unselected lighting fixtures 22 and 23.
Alternatively, the rule may be set so that the lighting fixture 20 (21, 22, 23) extracted by the collation unit 133 is lit at the rated output while the neighboring lighting fixtures 20 (21, 22, 23) around it are lit at a dimmed level.
In that case, when the collation unit 133 selects the lighting fixture 21, the control instruction unit 134 selects the instruction value corresponding to rated lighting for the light output of the lighting fixture 21, the instruction value corresponding to 50% lighting for the light output of the lighting fixture 22 adjacent to the selected lighting fixture 21, and the instruction value corresponding to turning off for the light output of the lighting fixture 23, which is not adjacent to the selected lighting fixture 21.
These rules are only examples; it suffices that the rule is set so that the area around the position of the person 30 in the illumination space 50 becomes relatively brighter than the remaining area. A sketch of such a rule follows.
 演算処理部13に設けた制御指示部134の出力(指示値)は、照明器具20(21,22,23)の光出力を調節する照明制御部15に与えられる。 The output (instruction value) of the control instruction unit 134 provided in the arithmetic processing unit 13 is given to the illumination control unit 15 that adjusts the light output of the lighting fixture 20 (21, 22, 23).
 照明制御部15は、制御指示部134の指示内容(指示値)に従って照明器具20(21,22,23)の光出力を調節する。したがって、人30の存在位置が検出されると、制御指示部134に設定されたルールに応じて照明器具20(21,22,23)の点灯状態が制御される。 The illumination control unit 15 adjusts the light output of the lighting fixture 20 (21, 22, 23) according to the instruction content (instruction value) of the control instruction unit 134. Therefore, when the presence position of the person 30 is detected, the lighting state of the luminaire 20 (21, 22, 23) is controlled according to the rule set in the control instruction unit 134.
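For illustration, a minimal Python sketch of such a rule is given below; the fixture identifiers, the `adjacency` table, and the percentage command values are assumptions for the example, not part of the embodiment.

```python
# Sketch of the rule applied by the control instruction unit 134.
# Assumptions: the illumination range map is a dict mapping floor-grid
# coordinates (X, Y) to a fixture identifier, and adjacency between
# fixtures is given as a hypothetical lookup table.

RATED, DIMMED_50, OFF = 100, 50, 0  # light output command values in percent

adjacency = {1: {2}, 2: {1, 3}, 3: {2}}  # fixtures 21, 22, 23 as IDs 1, 2, 3

def decide_outputs(position, range_map, fixtures):
    """Return {fixture_id: command value} for a detected person position."""
    selected = range_map.get(position)          # collation unit 133
    outputs = {}
    for fid in fixtures:                        # control instruction unit 134
        if fid == selected:
            outputs[fid] = RATED
        elif selected is not None and fid in adjacency.get(selected, set()):
            outputs[fid] = DIMMED_50
        else:
            outputs[fid] = OFF
    return outputs

# Example: person detected at grid cell (3, 4), covered by fixture 21 (ID 1).
range_map = {(3, 4): 1}
print(decide_outputs((3, 4), range_map, fixtures=[1, 2, 3]))
# -> {1: 100, 2: 50, 3: 0}
```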
In the following, the individual techniques (in other words, the components of the lighting control device of this embodiment) are described in more detail. To obtain three-dimensional information about the real space from the images captured by the two cameras 11 and 12, the three-dimensional measurement unit 10 acquires the pair of images captured by the two cameras at the same time and stores them in the storage unit 14.
Consider one of the two cameras 11 and 12. As shown in FIG. 3, a three-dimensional coordinate system (xyz) with the optical center O as its origin is defined for the real space, and a two-dimensional coordinate system (o-uv) is defined on the light receiving surface PL. The coordinate system of the light receiving surface PL corresponds one-to-one with the coordinate system of the captured image. The x direction of the three-dimensional coordinate system in the real space is parallel to the baseline direction, and the y direction is parallel to the vertical direction of the light receiving surface. The u direction of the two-dimensional coordinate system on the light receiving surface PL is parallel to the x direction, and the v direction is parallel to the y direction.
The position of a pixel on the light receiving surface PL is expressed by the numbers of pixels in the horizontal and vertical directions, with the upper left corner as the origin. If the point on the optical axis projected onto the image has coordinates (uc, vc), the distance r between (uc, vc) and the pixel at coordinates (u, v) is given by equation (1) below, and the angle φ of that pixel about the projected optical axis by equation (2).
\[ r = \sqrt{(u - u_c)^2 + (v - v_c)^2} \tag{1} \]
\[ \phi = \tan^{-1}\frac{v - v_c}{u - u_c} \tag{2} \]
Since the equidistant projection scheme is employed, a model can be used in which a point P in the three-dimensional real space is projected onto the surface of a sphere SP of radius 1 centered at the optical center O. That is, as shown in FIG. 3, the angle θ [rad] between the optical axis Ax (the z-axis direction) and the straight line connecting the optical center O with the point Q at which the real-space point P is projected onto the surface of the sphere SP is expressed, using the distance r, by equation (3) below.
\[ \theta = \frac{\pi}{2}\cdot\frac{r}{L_0} \tag{3} \]
In equation (3), the distance L0 is the radius of the circle on the light receiving surface PL corresponding to z = 0 on the sphere SP onto which the real space is projected. In FIG. 3, the point R indicates the position of the pixel obtained by projecting the point Q onto the light receiving surface PL.
Assuming that calibration has been performed so that a point P in the real space is associated with the pixel at coordinates (u, v) on the light receiving surface PL, the position (Xr, Yr, Zr) of the point Q obtained by projecting the real-space point P onto the surface of the sphere SP used as the model is expressed by equations (4), (5), and (6) below.
\[ X_r = \sin\theta\cos\phi \tag{4} \]
\[ Y_r = \sin\theta\sin\phi \tag{5} \]
\[ Z_r = \cos\theta \tag{6} \]
The distance r in equation (1) can also be written as r = f·θ, where f corresponds to the proportionality constant of the optical system in the equidistant projection scheme.
As shown in FIGS. 4(a) and 4(b), the position (Xr, Yr, Zr) of the point Q can be expressed, instead of by the pixel coordinates (u, v), as a combination of angles around two of the three axes x, y, and z. For the point Q, let β be the angle around the x axis (the angle in the yz plane) and α be the angle around the y axis (the angle in the zx plane). The angle φ expressed by equation (2) above can then be regarded as the angle around the z axis (the angle in the xy plane).
The angle α is 0 degrees in the positive x direction and increases toward the positive z direction, and the angle β is 0 degrees in the positive y direction and increases toward the positive z direction. Using the angles α and β, the position of a pixel can be expressed by (α, β) instead of the coordinates (u, v). The conversion from the coordinates (u, v) to (α, β) is performed by equations (7) and (8) below, using the results of equations (4), (5), and (6).
\[ \alpha = \tan^{-1}\frac{Z_r}{X_r} \tag{7} \]
\[ \beta = \tan^{-1}\frac{Z_r}{Y_r} \tag{8} \]
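Combining equations (1)-(8) as reconstructed above, the pixel-to-angle conversion can be sketched as follows; since the original equations are published as images, the exact forms used here are inferred from the surrounding definitions.

```python
import math

def pixel_to_angles(u, v, uc, vc, L0):
    """Convert pixel coordinates (u, v) to the angle pair (alpha, beta).

    Sketch following equations (1)-(8) as reconstructed above; (uc, vc) is
    the projected optical axis, L0 the image radius corresponding to
    theta = pi/2 in the equidistant projection model.
    """
    r = math.hypot(u - uc, v - vc)            # eq. (1)
    phi = math.atan2(v - vc, u - uc)          # eq. (2)
    theta = (math.pi / 2.0) * (r / L0)        # eq. (3)
    # Projection of P onto the unit sphere SP, eqs. (4)-(6)
    Xr = math.sin(theta) * math.cos(phi)
    Yr = math.sin(theta) * math.sin(phi)
    Zr = math.cos(theta)
    alpha = math.atan2(Zr, Xr)                # eq. (7)
    beta = math.atan2(Zr, Yr)                 # eq. (8)
    return alpha, beta

# Example: the center pixel of a 640x480 fisheye image lies on the optical axis.
print(pixel_to_angles(320, 240, uc=320, vc=240, L0=240))
# -> (pi/2, pi/2): a point on the optical axis is at 90 deg from the x and y axes.
```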
FIG. 5 shows the result of converting an image in which pixel positions are expressed by the coordinates (u, v) into an image expressed by (α, β) using the procedure described above. FIG. 5(a) shows the captured image obtained by the camera 11 (the first image), and FIG. 5(b) shows the captured image obtained by the camera 12 (the second image). FIGS. 5(c) and 5(d) show the respective images converted from the coordinates (u, v) to (α, β).
In this way, the measurement processing unit 130 of the arithmetic processing unit 13 generates, from the images captured by the two cameras 11 and 12, images in which the pixel position (u, v) is converted into the angle pair (α, β) of the three-dimensional real space described above. In the following, an image in which the position corresponding to a real-space point P is expressed by (α, β) is called a converted image.
Since this embodiment presupposes parallel stereo under the conditions described above, the values of β corresponding to the same real-space point P are equal in the two converted images obtained from the images captured by the two cameras 11 and 12.
Therefore, when the two converted images are searched for the pixels corresponding to the same real-space point P, only the angle α needs to be varied while the angle β is left unchanged, so the processing load of associating pixels between the two images is lighter than when the captured images are used directly.
In short, when searching the two converted images for the pixel corresponding to a single real-space point P (hereinafter called a "corresponding point"), only the range with the same angle β around the x axis needs to be searched; the search range for corresponding points is thus narrowed and the processing load reduced.
To evaluate whether pixels are corresponding points, the measurement processing unit 130 employs a block matching technique. That is, a block containing several pixels is formed around the location in one of the two converted images at which a corresponding point is to be evaluated, a scanning block of corresponding size is set in the other converted image, and the scanning block is scanned along the α axis. The block and the scanning block are desirably set as rectangular regions around the pixel at which the corresponding point is to be evaluated.
The pixel at coordinates (α1, β1) in one converted image and the pixel at coordinates (α2, β2) in the other converted image are regarded as corresponding points when the evaluation value Vs given by equation (9) below is minimized, the search being performed under the condition β2 = β1. The block and the scanning block have (2N+1) pixels in the α direction and (2M+1) pixels in the β direction. In equation (9), I1(α1, β1) is the luminance value at coordinates (α1, β1) in one converted image, and I2(α2, β2) is the luminance value at coordinates (α2, β2) in the other converted image.
\[ V_s = \sum_{j=-M}^{M}\sum_{i=-N}^{N}\bigl|I_1(\alpha_1+i,\ \beta_1+j) - I_2(\alpha_2+i,\ \beta_2+j)\bigr| \tag{9} \]
The evaluation value Vs used in equation (9) is the value known as the SAD (Sum of Absolute Differences), but other evaluation values can be used to evaluate corresponding points; for example, the SSD (Sum of Squared Differences) or a normalized cross-correlation can be used. Other stereo matching techniques may also be employed in place of block matching.
After extracting the corresponding points from the two converted images as described above, the measurement processing unit 130 generates a parallax image whose pixel value at each coordinate (α, β) is the parallax d, obtained for each corresponding point as d = α2 − α1. An example of a parallax image is shown in FIG. 5(e). The measurement processing unit 130 obtains the z-direction coordinate value of the point P in the real space based on the parallax d; that is, three-dimensional information about the real space is obtained.
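The SAD search of equation (9) can be sketched as follows (a NumPy illustration assuming converted images indexed as [β, α]; it is not the embodiment's implementation):

```python
import numpy as np

def find_correspondence(img1, img2, a1, b1, N, M, search_range):
    """Find alpha2 in img2 matching pixel (a1, b1) of img1 by minimizing
    the SAD of eq. (9), scanning along the alpha axis with beta fixed."""
    block = img1[b1 - M:b1 + M + 1, a1 - N:a1 + N + 1].astype(np.int32)
    best_a2, best_vs = None, None
    for a2 in search_range:                       # beta2 = beta1 throughout
        cand = img2[b1 - M:b1 + M + 1, a2 - N:a2 + N + 1].astype(np.int32)
        vs = np.abs(block - cand).sum()           # SAD evaluation value Vs
        if best_vs is None or vs < best_vs:
            best_a2, best_vs = a2, vs
    return best_a2, best_a2 - a1                  # parallax d = alpha2 - alpha1

# Example with random converted images (indexed [beta, alpha]):
rng = np.random.default_rng(0)
img1 = rng.integers(0, 256, (100, 100))
img2 = np.roll(img1, 5, axis=1)                   # shift along alpha by 5
print(find_correspondence(img1, img2, a1=40, b1=50, N=3, M=3,
                          search_range=range(40, 60)))  # -> (45, 5)
```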
Next, the map generation unit 131 is described. The map generation unit 131 generates an illumination range map that associates the three-dimensional information obtained by the measurement processing unit 130 with the illumination ranges of the lighting fixtures 20 (21, 22, 23).
To generate the illumination range map, the map generation unit 131 lights the fixtures 21, 22, 23 individually, one at a time, captures an image with at least one of the cameras (11, 12) while each fixture is lit, and obtains the gray value (luminance value, pixel value) of each pixel.
For example, starting from a state in which all of the fixtures 21, 22, 23 are off, the camera 11 captures an image with only the fixture 21 lit, then an image with only the fixture 22 lit, and then an image with only the fixture 23 lit.
The three images obtained in this way each show the state in which only one lighting fixture 20 (21, 22, 23) is lit (see FIGS. 13(b), (c), and (d)).
Therefore, by comparing the gray values of the three images, information is obtained about which lighting fixture 20 (21, 22, 23) dominates the illumination in each region within the field of view of the camera 11 (that is, within the illumination space 50).
In this embodiment, the illumination range map is generated in two stages. In the first stage, a correspondence map image is generated at the same resolution as the captured image of the camera 11. To generate the correspondence map image, the map generation unit 131 compares the gray values at the same coordinates (u, v) across the three images captured with the fixtures 21, 22, 23 lit one at a time.
The map generation unit 131 then selects the image with the smallest gray value (that is, the largest received light intensity) and associates the lighting fixture 20 (21, 22, 23) that was lit when that image was captured with the coordinates (u, v).
That is, for each of the plural lighting fixtures 20, the map generation unit 131 obtains the change in the luminance value (gray value) of the pixel corresponding to a given position on the floor surface 40 in the image obtained by the imaging means (cameras 11, 12).
For example, the map generation unit 131 obtains the difference between a first luminance value and a second luminance value as the change in luminance value. The first luminance value is the luminance value of the pixel corresponding to the given position in the image obtained from the imaging means (cameras 11, 12) when the light output of the fixture 20 is a first light output (for example, the light output corresponding to rated lighting). The second luminance value is the luminance value of that pixel in the image obtained when the light output of the fixture 20 is a second light output different from the first light output (for example, the light output corresponding to being off).
The map generation unit 131 assigns each pixel to the illumination range of the lighting fixture 20 for which the change in its luminance value is largest.
Specifically, mutually distinguishable identifiers are assigned to the fixtures 21, 22, 23, and the identifier of one lighting fixture 20 (21, 22, 23) is associated with each coordinate (u, v) of the captured image. Any value may be used as an identifier as long as it can be associated with a pixel; for example, the identifiers "1", "2", and "3" may be assigned to the fixtures 21, 22, and 23, respectively.
In other words, an image whose pixel values are the identifiers of the lighting fixtures 20 (21, 22, 23) is generated, and this image becomes the correspondence map image (see FIG. 13(e) for an example).
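As a minimal sketch of this first stage, the per-pixel assignment can be written as an argmax over per-fixture luminance changes; note that, unlike the gray-value convention above (where smaller values mean brighter pixels), plain intensity images are assumed here:

```python
import numpy as np

def correspondence_map(single_lit_images, all_off_image):
    """Build the correspondence map image: each pixel gets the identifier
    of the fixture whose lighting changes its luminance the most.

    single_lit_images: dict {fixture_id: HxW array captured with only that
    fixture lit}; all_off_image: HxW array with every fixture off."""
    ids = sorted(single_lit_images)
    # Luminance change per fixture: lit image minus all-off image.
    changes = np.stack([single_lit_images[i].astype(np.int32)
                        - all_off_image.astype(np.int32) for i in ids])
    return np.array(ids)[np.argmax(changes, axis=0)]  # HxW identifier image

# Example with three 2x2 toy images (IDs 1, 2, 3 for fixtures 21, 22, 23):
off = np.zeros((2, 2), dtype=np.uint8)
imgs = {1: np.array([[90, 10], [10, 10]]),
        2: np.array([[10, 80], [10, 10]]),
        3: np.array([[10, 10], [70, 60]])}
print(correspondence_map(imgs, off))
# [[1 2]
#  [3 3]]
```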
The map generation unit 131 converts the correspondence map image into the illumination range map by the procedure shown in FIG. 6. In the illumination range map, the pixel values of the correspondence map image (the identifiers of the lighting fixtures 20 (21, 22, 23)) are associated with the coordinate values of the three-dimensional information.
To generate the illumination range map, the identifiers associated with the coordinate values of the three-dimensional information are first initialized to 0 (S11).
Next, using equations (4)-(8) above, the coordinates (u, v) of the correspondence map image are converted into coordinates (α, β) (S12).
The map generation unit 131 then obtains the parallax d by looking up the converted coordinates (α, β) in the parallax image, and calculates the distance Zd corresponding to the coordinates (α, β) by equation (10) below (S13).
In equation (10), B is the distance between the optical center O1 of the camera 11 and the optical center O2 of the camera 12.
\[ Z_d = \frac{B}{\cot\alpha_1 - \cot\alpha_2}, \qquad \alpha_2 = \alpha_1 + d \tag{10} \]
FIG. 7 shows the relationship between a point P in the real space, the optical centers O1 and O2 of the cameras 11 and 12, the angles α1 and α2, and the distance B. In FIG. 7, the coordinate z is the distance from the xy plane to the point P in the coordinate system whose origin is the optical center O1 of the camera 11.
Because the cameras 11 and 12 are arranged as described above, the floor surface 40 of the illumination space 50 is a plane parallel to the xy plane set with reference to the camera 11 (see FIG. 3).
Now, world coordinates with their origin on the floor surface 40 are defined, and the coordinates of the optical center O1 of the camera 11 in the world coordinates are (X0, Y0, Z0). The Z axis of the world coordinates is positive in the direction from the floor surface 40 toward the ceiling. In world coordinates, a given position on the floor surface 40 is therefore expressed as (X, Y, 0).
Using the coordinates (Xr, Yr, Zr) on the sphere SP obtained by equations (4)-(6), the world coordinates (X0, Y0, Z0) of the optical center O1 of the camera 11, and the distance Zd obtained by equation (10), the world coordinates (X, Y, Z) corresponding to the coordinates (u, v) of the correspondence map image are expressed by equation (11) below. The coordinates (u, v) are thus converted into coordinates (X, Y, Z) (S14).
\[ X = X_0 + \frac{Z_d}{Z_r}X_r, \qquad Y = Y_0 + \frac{Z_d}{Z_r}Y_r, \qquad Z = Z_0 - Z_d \tag{11} \]
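A sketch of steps S13-S14 under the reconstructed equations (10) and (11); the sign conventions relating the camera axes to the world axes are assumptions of this example:

```python
import math

def to_world(alpha1, d, Xr, Yr, Zr, B, origin):
    """Convert an angle-space pixel with parallax d into world coordinates.

    Sketch of S13-S14: eq. (10) gives the distance Zd along the camera
    z axis; eq. (11) then offsets from the camera optical center, whose
    world coordinates are `origin` = (X0, Y0, Z0). Axis signs assumed."""
    alpha2 = alpha1 + d
    Zd = B / (1.0 / math.tan(alpha1) - 1.0 / math.tan(alpha2))   # eq. (10)
    X0, Y0, Z0 = origin
    s = Zd / Zr                      # scale from unit sphere to real depth
    return X0 + s * Xr, Y0 + s * Yr, Z0 - Zd                     # eq. (11)

# Example: a point almost straight below the camera (alpha1 near pi/2) comes
# out near floor level when the camera hangs at Z0 = 2.5 m with B = 0.1 m.
print(to_world(alpha1=math.radians(88), d=math.radians(2.3),
               Xr=0.03, Yr=0.0, Zr=0.999, B=0.1, origin=(1.0, 1.0, 2.5)))
```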
Next, the identifier set at the coordinates (u, v) of the correspondence map image is associated with the coordinates (X, Y) of the illumination range map obtained by converting (u, v) (S17).
This processing is performed for all coordinates (u, v) of the correspondence map image (S19, S20).
Different coordinates (u, v) in the correspondence map image may correspond to the same coordinates (X, Y) in the illumination range map.
To resolve such collisions at the coordinates (X, Y), if a Z value is already stored for the coordinates (X, Y) (S15: Yes), the stored Z value is compared with the Z value obtained this time (S16).
If the current Z value is smaller than the stored Z value (S16: Yes), that is, if the previously stored part corresponding to the coordinates (X, Y) lies above the floor surface, the identifier of the lighting fixture 20 (21, 22, 23) at the coordinates (u, v) is associated with the corresponding coordinates (X, Y) (S17), and the Z value obtained this time is stored in association with the coordinates (X, Y) (S18).
In this way, when a plurality of coordinates (u, v) of the correspondence map image correspond to the same world coordinates (X, Y), the one with the smaller Z value is adopted; this prevents the identifier of a lighting fixture 20 (21, 22, 23) that was set, in the correspondence map image, on a surface intersecting the floor surface 40 (a nearly perpendicular surface) from being associated with the illumination range map.
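The collision handling of S15-S18 amounts to a Z-buffer keyed on floor coordinates; a minimal sketch:

```python
def build_range_map(pixels):
    """Project correspondence-map pixels to floor cells, keeping, for each
    (X, Y) cell, the identifier of the pixel with the smallest Z (S15-S18).

    pixels: iterable of (X, Y, Z, identifier) already converted by eq. (11)."""
    range_map, zbuf = {}, {}
    for X, Y, Z, ident in pixels:
        key = (X, Y)
        if key not in zbuf or Z < zbuf[key]:   # S15/S16: keep the lower point
            zbuf[key] = Z                      # S18
            range_map[key] = ident             # S17
    return range_map

# Example: a furniture side face (Z = 0.7) and the floor (Z = 0.0) project
# to the same cell; the floor pixel's identifier wins.
print(build_range_map([(3, 4, 0.7, 2), (3, 4, 0.0, 1)]))  # -> {(3, 4): 1}
```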
After generating the illumination range map by the processing shown in FIG. 6, the map generation unit 131 performs a correction process so that each coordinate (X, Y) with which no identifier of a lighting fixture 20 (21, 22, 23) has been associated is associated with one of the fixtures.
In this correction process, the coordinate position directly below each lighting fixture 20 (21, 22, 23) is estimated on the XY plane (the floor surface), and fixture identifiers are associated with all coordinate positions of the region expressed by the coordinates (X, Y) according to the distance from those positions.
The coordinate position directly below each fixture is obtained from the illumination range map before correction as the average position (centroid) of the coordinate positions assigned to each identifier. In short, the position of the centroid of the illumination range of each lighting fixture 20 (21, 22, 23) is taken as the estimated coordinate position directly below that fixture.
After obtaining the coordinate position directly below each fixture, the map generation unit 131 uses equation (12) below to obtain the distance D between each coordinate (X, Y) of the illumination range map with which no fixture identifier has been associated and the coordinate position directly below each lighting fixture 20 (21, 22, 23).
In equation (12), (ug, vg) is the coordinate position directly below a lighting fixture 20 (21, 22, 23) and takes a different value for each fixture.
\[ D = \sqrt{(X - u_g)^2 + (Y - v_g)^2} \tag{12} \]
The distance D (D1, D2, D3) is obtained for each lighting fixture 20 (21, 22, 23); since three fixtures are used in this embodiment, three distances D are obtained for each coordinate (X, Y).
A lighting fixture 20 (21, 22, 23) is considered to have a greater influence the smaller its distance D to the coordinates (X, Y), so the map generation unit 131 assigns to the coordinates (X, Y) the fixture for which the obtained distance D is smallest.
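This correction is in effect a nearest-centroid assignment; a sketch with hypothetical floor cells and centroid values:

```python
import math

def correct_range_map(range_map, cells, centroids):
    """Assign every unassigned floor cell to the fixture whose below-fixture
    position (region centroid, eq. (12)) is nearest.

    range_map: {(X, Y): identifier} from the first pass;
    cells: all floor cells; centroids: {identifier: (ug, vg)}."""
    for cell in cells:
        if cell not in range_map:
            X, Y = cell
            range_map[cell] = min(
                centroids,
                key=lambda i: math.hypot(X - centroids[i][0],
                                         Y - centroids[i][1]))  # smallest D
    return range_map

centroids = {1: (1.0, 1.0), 2: (3.0, 1.0), 3: (5.0, 1.0)}
cells = [(x, y) for x in range(6) for y in range(3)]
print(correct_range_map({(1, 1): 1}, cells, centroids)[(5, 2)])  # -> 3
```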
For example, when the correspondence map image is the image shown as an example in FIG. 13(e), the map generation unit 131 first generates an illumination range map like the one shown in FIG. 8. In FIG. 8, the identifiers of the fixtures 21, 22, and 23 are associated with the elliptical regions F1, F2, and F3, respectively.
Centroid positions (ug1, vg1), (ug2, vg2), and (ug3, vg3) are obtained for the regions F1, F2, and F3, respectively, and are regarded as the coordinate positions directly below the fixtures 21, 22, and 23.
Next, the distances D1, D2, and D3 from each coordinate (X, Y) with no associated fixture identifier to the centroid positions (ug1, vg1), (ug2, vg2), and (ug3, vg3) are obtained.
The lighting fixture 20 (21, 22, 23) corresponding to the centroid position (ug, vg) ((ug1, vg1), (ug2, vg2), (ug3, vg3)) for which the obtained distance D (D1, D2, D3) is smallest is assigned to the coordinates (X, Y).
In this way, on the floor surface 40 of the illumination space 50, each coordinate (X, Y) that had no fixture identifier is assigned the identifier of the fixture with the smallest distance D. By this correction of the illumination range map, every coordinate (X, Y) of the floor surface 40 of the illumination space 50 is associated with the identifier of some lighting fixture 20 (21, 22, 23).
It is not essential to use the distance D to the coordinates directly below the fixtures (the centroid positions (ug, vg)) for associating the floor coordinates (X, Y) with the lighting fixtures 20 (21, 22, 23).
For example, the distance to coordinates (X, Y) to which a fixture identifier has already been assigned may be used; that is, a coordinate with no fixture association may be given the identifier assigned to a nearby position.
In either case, a lighting fixture 20 (21, 22, 23) is assigned to the coordinates (X, Y) using the positional relationship between the illumination range of each of the fixtures 21, 22, 23 and the position on the floor surface 40 expressed by the coordinates (X, Y).
The processing described above is performed in a setting mode that precedes the normal mode in which the lighting fixtures 20 (21, 22, 23) are controlled. Next, the operation in the normal mode, in which the light output of the fixtures 20 (21, 22, 23) is controlled according to the position of the person 30, is described.
In the normal mode, the human detection unit 132 detects the position of the person 30 using the images captured by the cameras 11 and 12. That is, the human detection unit 132 not only detects whether the person 30 is present in the illumination space 50, but also detects (identifies) the position at which the person 30 is present when the person 30 is in the illumination space 50.
To detect the position of the person 30 using the captured images output by the cameras 11 and 12, the three-dimensional measurement unit 10 first acquires three-dimensional information about the illumination space 50. The three-dimensional information is generated at the frame rate of the cameras 11 and 12; for example, three-dimensional information is acquired for each image output at 30 frames per second. The operation by which the three-dimensional measurement unit 10 acquires the three-dimensional information is as described above: the coordinates (u, v) of each camera's captured image are converted into (α, β), a parallax image is generated, and the three-dimensional information is calculated from the parallax image.
To detect the position of the person 30, a difference image is used between the parallax image generated in the setting mode with no person present (the background image) and the parallax image generated in the normal mode (the evaluation image).
The pixel values of this difference image are the differences between the pixel values at the same coordinates in the evaluation image and the background image. Therefore, if the evaluation image contains an object not contained in the background image, the pixel values at the coordinates corresponding to that object become relatively large. Exploiting this, the position of the person 30 can be detected from the difference image.
In this embodiment, the difference between the parallax image of the setting mode and that of the normal mode is used to detect the position of the person 30, but the position may be detected by other methods.
For example, since a distance image can be generated by calculating, for each pixel of the parallax image, the value of the distance z shown in FIG. 7, a distance image may be generated in each of the setting mode and the normal mode, and the position of the person 30 detected from the difference between the two distance images.
Alternatively, based on an image obtained from either of the cameras 11 and 12 in the normal mode, a region corresponding to the person 30 may be extracted by a method such as inter-frame differencing or template matching, and the position of the person 30 detected by calculating the parallax and distance for the extracted region.
The human detection unit 132 generates a difference image between the background image and the evaluation image generated based on the three-dimensional information acquired in the normal mode, and extracts the pixels whose value in the difference image exceeds a prescribed threshold as candidates for pixels corresponding to a person.
Specifically, the difference between the parallax dn(α, β) at coordinates (α, β) in the evaluation image and the parallax db(α, β) at coordinates (α, β) in the background image is compared with an appropriately set threshold dth, and when dn(α, β) − db(α, β) > dth holds, the pixel at the coordinates (α, β) is taken as a candidate pixel corresponding to the person 30.
Alternatively, the human detection unit 132 may extract, as candidates for pixels corresponding to a person, the pixels whose value in the difference image is greater than or equal to the prescribed threshold.
For example, when an evaluation image like FIG. 9(a) and a background image like FIG. 9(b) have been obtained, the difference image is like FIG. 9(c). The difference image of FIG. 9(c) is rendered in grayscale such that pixels whose difference value (= dn(α, β) − db(α, β)) is at most the threshold dth are given a prescribed brightness (for example, the maximum brightness), and the larger the difference value, the lower the brightness of the pixel value. The dark regions are therefore the candidates for regions corresponding to a person.
However, even a region whose difference values exceed the threshold dth may contain noise that is not a person, so pixels whose difference value is at most dth are removed as noise. Furthermore, for regions whose difference values exceed the threshold dth, connectivity is evaluated, and for connected regions it is determined whether their area is within a prescribed range.
Connectivity is evaluated between a pixel of interest and its neighboring pixels (the 4- or 8-neighborhood): when both pixels exceed the threshold dth, the two pixels are judged to be connected.
For each set of pixels judged to be connected, the human detection unit 132 compares the area (number of pixels) of the set with the prescribed range, and if the area is within the range, the set is presumed to correspond to the person 30.
By the above processing, a region corresponding to the person 30 is extracted as shown in FIG. 9(d). In the example shown in FIG. 9(d), the coordinates have been converted from the αβ plane to the XY plane.
For the set of pixels presumed to correspond to the person 30, the position of a representative point with respect to the floor surface 40 of the illumination space 50 is obtained. That is, the human detection unit 132 obtains the world coordinate values (X, Y, Z) of the candidate pixel set using equations (10) and (11) above, then obtains the coordinates of the centroid of the set on the XY plane, and takes these coordinates as the position of the representative point of the person 30, that is, as the coordinates of the position at which the person 30 is present.
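The detection chain (difference image, thresholding, connectivity, area check, centroid) can be condensed into the following sketch; a simple 4-neighborhood flood fill stands in for the connectivity evaluation, and the threshold and area limits are illustrative. Note that the centroid here is computed in image coordinates, whereas the embodiment first converts the candidate set to world coordinates:

```python
import numpy as np

def detect_person(eval_disp, bg_disp, dth, min_area, max_area):
    """Return the centroid of a person-sized connected region where the
    parallax difference dn - db exceeds the threshold dth."""
    mask = (eval_disp.astype(np.int32) - bg_disp.astype(np.int32)) > dth
    seen = np.zeros_like(mask, dtype=bool)
    H, W = mask.shape
    for y in range(H):
        for x in range(W):
            if mask[y, x] and not seen[y, x]:
                stack, region = [(y, x)], []
                seen[y, x] = True
                while stack:                      # 4-neighborhood flood fill
                    cy, cx = stack.pop()
                    region.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < H and 0 <= nx < W and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                if min_area <= len(region) <= max_area:   # person-sized set
                    ys, xs = zip(*region)
                    return sum(xs) / len(xs), sum(ys) / len(ys)  # centroid
    return None

bg = np.zeros((8, 8), dtype=np.uint8)
ev = bg.copy(); ev[2:5, 3:5] = 20                 # person-shaped parallax bump
print(detect_person(ev, bg, dth=5, min_area=4, max_area=20))  # -> (3.5, 3.0)
```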
When the position of the person 30 on the floor surface 40 is obtained by the human detection unit 132 as described above, the collation unit 133 checks that position against the illumination range map and extracts the lighting fixture 20 (21, 22, 23) corresponding to the position of the person 30.
That is, the identifier of the lighting fixture 20 (21, 22, 23) associated with the position of the person 30 in the illumination range map is extracted, and the identifier extracted by the collation unit 133 is given to the control instruction unit 134.
The control instruction unit 134 determines the light outputs (light output command values) of the fixtures 21, 22, 23 according to appropriately set rules, and instructs the illumination control unit 15 with the determined values.
The simplest rule lights only the fixture 20 (21, 22, 23) corresponding to the identifier extracted by the collation unit 133 at the rated output and turns the other fixtures 20 (21, 22, 23) off.
A rule may also be set that lights the fixture corresponding to the extracted identifier at the rated output and additionally dims the fixtures around it (for example, dimming their light output to 50%). In this case, the fixtures 20 (21, 22, 23) other than those instructed to light at the rated output or to dim are turned off.
With rules set as described above, lighting the fixtures 20 (21, 22, 23) close to the person 30 at the rated output provides the illuminance needed to carry out work and ensure safety. Moreover, because the fixtures far from the person are turned off, the fixtures that have little effect on carrying out work or ensuring safety are extinguished, realizing energy savings.
Furthermore, because the lighting fixtures 20 (21, 22, 23) are not associated with surfaces intersecting the floor surface 40 (such as the sides of furniture or walls) in the illumination space 50, the light outputs of the fixtures 21, 22, 23 can be controlled in correspondence with the position of the person 30 on the floor surface 40.
In other words, an erroneous association in which the surroundings of the person are illuminated by a fixture 20 (21, 22, 23) far from the person 30 is prevented. Put differently, because the necessary illuminance is secured by the fixtures close to the person 30, the energy required to secure it is reduced, which results in energy savings.
As described above, the lighting control device of this embodiment comprises: a three-dimensional measurement unit 10, which includes imaging means (cameras 11, 12) that images the illumination space 50 illuminated by the plural lighting fixtures 20 and acquires three-dimensional information about the illumination space 50 from the captured images; a map generation unit 131, which uses the three-dimensional information to generate an illumination range map assigning positions on the floor surface 40 of the illumination space 50 to the lighting fixtures 20; a human detection unit 132, which uses the three-dimensional information to detect the position of the person 30 on the floor surface 40 of the illumination space 50; and a control instruction unit 134, which determines the light outputs of the lighting fixtures 20 using the illumination ranges of the fixtures extracted by checking the position of the person 30 against the illumination range map.
In other words, the lighting control device of this embodiment comprises a measurement processing unit 130, a map generation unit 131, a human detection unit 132, and a control instruction unit 134. The measurement processing unit 130 is configured to generate three-dimensional information about the illumination space 50 based on images (captured images) acquired from imaging means that images the illumination space 50, including the floor surface 40 illuminated by the plural lighting fixtures 20. The map generation unit 131 is configured to generate, based on the three-dimensional information, an illumination range map representing the illumination ranges of the lighting fixtures 20 on the floor surface 40 of the illumination space 50. The human detection unit 132 is configured to obtain, based on the three-dimensional information, the position at which the person 30 is present on the floor surface 40 of the illumination space 50. The control instruction unit 134 is configured to determine a light output command value for each of the plural lighting fixtures 20 by referring to that position and the illumination range map.
In particular, the lighting control device of this embodiment further comprises the imaging means (cameras 11, 12) and the illumination control unit 15. On receiving a command value from the control instruction unit 134, the illumination control unit 15 is configured to control each lighting fixture 20 so that its light output becomes the value corresponding to the command value.
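The data flow among these components can be summarized in a skeletal sketch; all class and method names are illustrative, not from the publication:

```python
# Skeletal wiring of the embodiment's components; all names are illustrative.
class LightingController:
    def __init__(self, measurement, map_gen, detector, instruction, control):
        self.measurement = measurement    # measurement processing unit 130
        self.map_gen = map_gen            # map generation unit 131
        self.detector = detector          # human detection unit 132
        self.instruction = instruction    # control instruction unit 134
        self.control = control            # illumination control unit 15
        self.range_map = None

    def setting_mode(self, camera_pair):
        info = self.measurement.three_d_info(camera_pair)
        self.range_map = self.map_gen.build(info)   # illumination range map

    def normal_mode_step(self, camera_pair):
        info = self.measurement.three_d_info(camera_pair)
        position = self.detector.locate_person(info)
        commands = self.instruction.decide(position, self.range_map)
        self.control.apply(commands)      # drive fixtures 21, 22, 23
```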
In the lighting control device of this embodiment, the three-dimensional measurement unit 10 uses the two cameras 11 and 12 as the imaging means, and acquires the three-dimensional information about the illumination space 50 from the images captured by the cameras 11 and 12 using the principle of triangulation.
In other words, the imaging means comprises the two cameras 11 and 12, and the measurement processing unit 130 is configured to generate the three-dimensional information from the two images (captured images) obtained from the two cameras 11 and 12 using the principle of triangulation.
Further, in the lighting control device of this embodiment, to generate the illumination range map, the map generation unit 131 varies the light output of each lighting fixture 20 between two levels, obtains, for each pixel corresponding to the floor surface 40 of the illumination space 50 among the pixels of the image captured by the imaging means, the change in gray value corresponding to the two light output levels of each fixture, and associates each pixel with the fixture that maximizes that change.
In other words, the map generation unit 131 is configured to obtain, for each of the plural lighting fixtures 20, the change in the luminance value of the pixel corresponding to a given position on the floor surface 40 in the image, and to assign the pixel to the illumination range of the fixture for which that change is largest. The map generation unit 131 is configured to obtain the difference between a first luminance value and a second luminance value as the change in luminance value, where the first luminance value is the luminance value of the pixel corresponding to the given position in the image obtained from the imaging means (cameras 11, 12) when the light output of the fixture 20 is a first light output (for example, the light output corresponding to rated lighting), and the second luminance value is the luminance value of that pixel when the light output of the fixture 20 is a second light output different from the first (for example, the light output corresponding to being off).
Further, in the lighting control device of this embodiment, the map generation unit 131 excludes from the illumination range map those pixels of the captured image that correspond to surfaces intersecting the floor surface 40.
In other words, the map generation unit 131 is configured not to assign pixels corresponding to surfaces intersecting the floor surface 40 in the image to any illumination range.
Further, in the lighting control device of this embodiment, for positions on the floor surface 40 of the illumination space 50 that are not associated with any lighting fixture 20 in the illumination range map, the map generation unit 131 assigns a fixture to the position using the positional relationship between the illumination range of each fixture and the position.
In other words, when the illumination range map contains an undetermined position not included in any illumination range, the map generation unit 131 is configured to assign the undetermined position to one of the illumination ranges based on the positional relationship between the illumination ranges of the plural lighting fixtures 20 and the undetermined position.
Further, in the lighting control device of this embodiment, the map generation unit 131 estimates the position on the floor surface 40 directly below each lighting fixture 20 based on the illumination range of each fixture in the illumination range map.
In other words, the map generation unit 131 is configured to obtain, based on the illumination ranges, the positions on the floor surface 40 of the image directly below the lighting fixtures.
In particular, in the lighting control device of this embodiment, the positional relationship between the illumination ranges and the undetermined position is the positional relationship between the positions directly below the plural lighting fixtures 20 and the undetermined position.
According to the lighting control device of this embodiment described above, the illumination range of each lighting fixture 20 is associated with positions on the floor surface 40 by using the three-dimensional information about the illumination space 50, which has the advantage that the light output of the fixture 20 at the appropriate position can be controlled using the position of the person 30 on the floor surface 40.
In this embodiment, when the correspondence map image is generated, the received light intensities at the cameras 11 and 12 are compared when the fixtures 21, 22, 23 are lit one at a time.
Alternatively, a difference image may be generated between the image captured with each of the fixtures 21, 22, 23 lit one at a time and the image captured with all of the fixtures 21, 22, 23 off, and the correspondence map image generated using the pixel values of the difference images.
When generating the illumination range map, it suffices to vary the light output of each lighting fixture 20 (21, 22, 23) between two levels; instead of switching between lit and off, the light output may be varied between two levels in the lit state. In short, it suffices that the change in the light output of the fixture 20 (21, 22, 23) is reflected in a change in gray value at the cameras 11 and 12.
In the lighting control device of this embodiment, the map generation unit 131 may also obtain, for each of the plural lighting fixtures 20, the degree of contribution to the pixel corresponding to a given position on the floor surface 40 in the image, and assign the pixel to the illumination range of the fixture whose contribution is largest.
In this case, the map generation unit 131 is configured to obtain the luminance value I of the pixel corresponding to the given position in the image obtained from the imaging means (cameras 11, 12) when the light output of one of the plural fixtures 20 is a first light output (for example, the light output corresponding to rated lighting) and the light outputs of the other fixtures 20 are a second light output lower than the first (for example, the light output corresponding to being off). The map generation unit 131 is configured to obtain, as the degree of contribution, the ratio of the luminance value I corresponding to the one fixture 20 to the sum S of the luminance values I corresponding to the plural fixtures 20.
(Embodiment 2)
As described in Embodiment 1, when the collation unit 133 checks the position of the person 30 detected by the human detection unit 132 against the illumination range map, only one lighting fixture 20 (21, 22, 23) is extracted from the illumination range map.
 この処理では、制御指示部134に適切なルールが設定してあれば、人30の周囲における照度が適切に設定されるが、人30の存在位置によっては適切な照度を得るようにルールを設定することが難しい場合も考えられる。 In this process, if an appropriate rule is set in the control instruction unit 134, the illuminance around the person 30 is appropriately set. However, depending on the position of the person 30, the rule is set so as to obtain an appropriate illuminance. It may be difficult to do this.
 実施形態1は、複数台の照明器具20(21,22,23)を併せて1枚の照明範囲マップを生成しているが、本実施形態は、照明器具20(21,22,23)ごとに照明範囲マップを生成する。すなわち、3台の照明器具21,22,23に対しては、3枚の照明範囲マップが生成される。 In the first embodiment, one lighting range map is generated by combining a plurality of lighting fixtures 20 (21, 22, 23). In the present embodiment, each lighting fixture 20 (21, 22, 23) is generated. Generate an illumination range map. That is, for the three lighting fixtures 21, 22, and 23, three illumination range maps are generated.
 マップ生成部131は、実施形態1と同様に、照明器具21,22,23を1台ずつ点灯させて3枚の撮像画像を生成する。実施形態1では、得られた3枚の撮像画像における濃淡値を用いて、照明器具20(21,22,23)の識別子を三次元情報の座標値に対応付けた照明範囲マップを生成している。これに対して、本実施形態は、3枚の撮像画像を用いて3枚の個別点灯画像が生成される。 As in the first embodiment, the map generation unit 131 turns on the lighting fixtures 21, 22, and 23 one by one to generate three captured images. In the first embodiment, an illumination range map in which the identifier of the lighting fixture 20 (21, 22, 23) is associated with the coordinate value of the three-dimensional information is generated using the gray values in the obtained three captured images. Yes. On the other hand, in the present embodiment, three individually lit images are generated using three captured images.
 An individual lighting image has, as its pixel value at coordinates (u, v), the gray value I at coordinates (u, v) of one captured image divided by the sum S of the gray values of all the captured images at the same coordinates (u, v).
 Here, the gray value obtained when all the lighting fixtures 21, 22, and 23 are lit at their rated output is regarded as equal to the sum S of the gray values of the captured images obtained when the lighting fixtures 21, 22, and 23 are lit individually. For example, let I1 denote the gray value I of the captured image obtained when only the lighting fixture 21 is lit (the individual lighting image corresponding to the lighting fixture 21), I2 the gray value I of the captured image obtained when only the lighting fixture 22 is lit, and I3 the gray value I of the captured image obtained when only the lighting fixture 23 is lit. Then S = I1 + I2 + I3.
 The pixel value of an individual lighting image is thus I/S, so the degree (contribution) to which each lighting fixture 21, 22, 23 affects the position represented by coordinates (u, v) is expressed by the index I/S. Three individual lighting images are therefore obtained whose pixel values at coordinates (u, v) are I1/S, I2/S, and I3/S, respectively.
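 In code, the individual lighting images follow directly from the three captures; the sketch below, with assumed array names, computes the per-pixel values I1/S, I2/S, and I3/S described above.

```python
import numpy as np

def individual_lighting_images(captures):
    """captures: three HxW grayscale images, each taken while only one of
    the lighting fixtures 21, 22, 23 is lit.

    Returns three images whose pixel value at (u, v) is I_k(u, v) / S(u, v),
    where S = I1 + I2 + I3 stands in for the gray value obtained with all
    fixtures lit at their rated output.
    """
    I = np.stack([c.astype(np.float64) for c in captures])  # (3, H, W)
    S = I.sum(axis=0)                                       # S = I1 + I2 + I3
    S[S == 0] = 1.0  # avoid division by zero where no fixture contributes
    return [Ik / S for Ik in I]  # pixel values I1/S, I2/S, I3/S
```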
 The individual lighting images are shown schematically in FIGS. 10(a), 10(b), and 10(c). In each figure, the circular region indicates the field of view of the cameras 11 and 12.
 The individual lighting images are images similar to FIGS. 13(b), 13(c), and 13(d); combining the individual lighting images yields the correspondence map image described in Embodiment 1.
 In the illustrated example, the individual lighting image corresponding to the lighting fixture 21 (FIG. 10(a)) is bright in its upper part, the one corresponding to the lighting fixture 22 (FIG. 10(b)) is bright in its central part, and the one corresponding to the lighting fixture 23 (FIG. 10(c)) is bright in its lower part.
 The map generation unit 131 then uses the individual lighting images to generate three illumination range maps, each associating the coordinates of the floor surface 40 in the illumination space 50 with the identifier of a lighting fixture 20 (21, 22, 23).
 Whereas in Embodiment 1 a single illumination range map contains the identifiers of the plurality of lighting fixtures 21, 22, and 23, in the present embodiment each illumination range map contains the identifier of only one lighting fixture 20 (21, 22, 23). In other words, three illumination range maps are generated from the three individual lighting images generated for the respective identifiers of the lighting fixtures 20 (21, 22, 23).
 Each illumination range map also associates with the coordinates (X, Y) the contribution of its lighting fixture 20 (21, 22, 23) to those coordinates. As the contribution at coordinates (X, Y), the pixel value (I1/S, I2/S, or I3/S) at the coordinates (u, v) of the individual lighting image corresponding to the coordinates (X, Y) is used.
 The procedure for generating an illumination range map is basically the same as that of Embodiment 1 shown in FIG. 6. That is, after the illumination range map is initialized, the pixel value at the coordinates (u, v) of the individual lighting image is associated with the coordinates (X, Y) on the floor surface. Association with surfaces intersecting the floor surface 40 is likewise prevented.
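 The projection from image coordinates (u, v) to floor coordinates (X, Y) can be sketched as follows; `pixel_to_floor_xy` and `is_on_floor` are hypothetical helpers standing in for the three-dimensional information produced by the measurement processing unit, so this is only an outline of the procedure, not the implementation of FIG. 6.

```python
import numpy as np

def build_range_map(individual_image, pixel_to_floor_xy, is_on_floor, grid_shape):
    """Project one individual lighting image onto the floor-coordinate grid.

    individual_image : HxW array of contributions I/S
    pixel_to_floor_xy: maps (u, v) to grid indices (X, Y), derived from the
                       three-dimensional information (assumed helper)
    is_on_floor      : True if pixel (u, v) lies on the floor surface 40,
                       False on surfaces intersecting it (assumed helper)
    """
    range_map = np.zeros(grid_shape)  # initialize the illumination range map
    H, W = individual_image.shape
    for v in range(H):
        for u in range(W):
            if not is_on_floor(u, v):      # skip walls and other crossing surfaces
                continue
            X, Y = pixel_to_floor_xy(u, v)
            range_map[Y, X] = individual_image[v, u]  # contribution at (X, Y)
    return range_map
```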
 As described above, in the setting mode, generating an illumination range map from each individual lighting image yields an illumination range map for each lighting fixture 20 (21, 22, 23).
 For example, the individual lighting images shown in FIGS. 10(a), 10(b), and 10(c) yield the three illumination range maps shown in FIGS. 10(d), 10(e), and 10(f), respectively. The brightness in the illustrated illumination range maps represents the contribution: the higher the brightness, the greater the contribution.
 In the normal mode, the collation unit 133 collates the position of the person 30 obtained by the human detection unit 132 against all the illumination range maps, and extracts from each illumination range map the contribution at the coordinates (X, Y) corresponding to the position of the person 30.
 Once the contribution of each lighting fixture 20 (21, 22, 23) at the position of the person 30 has been extracted from the illumination range maps, the control instruction unit 134 compares the contribution (I/S) extracted from each illumination range map with a prescribed threshold and lights those lighting fixtures 20 (21, 22, 23) whose contribution is equal to or greater than the threshold.
 For example, suppose that, among the illumination range maps shown in FIGS. 10(d), 10(e), and 10(f), the contribution at the position of the person 30 is equal to or greater than the threshold in FIGS. 10(d) and 10(e) but less than the threshold in FIG. 10(f). In this case, the control instruction unit 134 lights the lighting fixtures 21 and 22 and turns off the lighting fixture 23.
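 A minimal sketch of this normal-mode decision, assuming the per-fixture maps are indexed by fixture identifier and the person's position is already expressed in grid coordinates (the 0.25 default threshold is purely illustrative):

```python
def decide_lighting(range_maps, person_xy, threshold=0.25):
    """range_maps: {fixture_id: 2-D array of contributions I/S}.

    Returns, for a person at grid position (X, Y), True for every fixture
    whose contribution at that position meets the threshold (to be lit)
    and False otherwise (to be turned off).
    """
    X, Y = person_xy
    return {fid: bool(m[Y, X] >= threshold) for fid, m in range_maps.items()}
```

 With the FIG. 10 example, the entries for the lighting fixtures 21 and 22 would come back True and the entry for the lighting fixture 23 False.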
 As described above, in the lighting control device of the present embodiment, the map generation unit 131 is configured to obtain, for each of the plurality of lighting fixtures 20, the contribution to a pixel corresponding to a predetermined position on the floor surface 40 in the image, and to assign the pixel to the illumination range of every lighting fixture 20 whose contribution is equal to or greater than a prescribed threshold. The map generation unit 131 is configured to obtain the luminance value I of the pixel corresponding to the predetermined position in the image obtained from the imaging means (cameras 11, 12) when the light output of one lighting fixture 20 among the plurality of lighting fixtures 20 is a first light output (for example, the light output corresponding to rated lighting) and the light output of the other lighting fixtures 20 is a second light output lower than the first light output (for example, the light output corresponding to being turned off). The map generation unit 131 is configured to obtain, as the contribution, the ratio of the luminance value I corresponding to the one lighting fixture 20 to the sum S of the luminance values I corresponding to the plurality of lighting fixtures 20.
 In the lighting control device of the present embodiment, in order to generate the illumination range maps, the map generation unit 131 may change the light output of each lighting fixture 20 between two levels, obtain, for each pixel associated with the floor surface 40 in the illumination space 50 among the pixels constituting the image captured by the imaging means (cameras 11, 12), the change in gray value corresponding to the two levels of light output of each lighting fixture 20, and associate with each pixel those lighting fixtures 20 for which the change is equal to or greater than a prescribed threshold.
 In other words, the map generation unit 131 may be configured to obtain, for each of the plurality of lighting fixtures 20, the change in the luminance value of the pixel corresponding to a predetermined position on the floor surface 40 in the image, and to assign the pixel to the illumination range of every lighting fixture 20 for which the change in luminance value is equal to or greater than a prescribed threshold.
 In this case, the map generation unit 131 is configured to obtain the difference between a first luminance value and a second luminance value as the change in luminance value. The first luminance value is the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means (cameras 11, 12) when the light output of the lighting fixture 20 is a first light output (for example, the light output corresponding to rated lighting). The second luminance value is the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means (cameras 11, 12) when the light output of the lighting fixture 20 is a second light output different from the first light output (for example, the light output corresponding to being turned off).
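 The difference-based variant can be sketched in the same style; the paired captures and the threshold value are assumptions for illustration.

```python
import numpy as np

def assign_by_luminance_change(first_captures, second_captures, threshold):
    """first_captures[k] : image with fixture k at the first (e.g. rated) output
    second_captures[k]: image with fixture k at the second (e.g. off) output

    Returns an (N, H, W) boolean stack: entry (k, v, u) is True when the
    luminance change of pixel (u, v) for fixture k meets the prescribed
    threshold, i.e. the pixel belongs to fixture k's illumination range.
    A pixel may thus belong to several illumination ranges at once.
    """
    diffs = np.stack([a.astype(np.float64) - b.astype(np.float64)
                      for a, b in zip(first_captures, second_captures)])
    return diffs >= threshold
```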
 With the operation described above, when a person is present near the boundary between the illumination ranges of the plurality of lighting fixtures 21, 22, and 23, for example, it becomes possible to light the plurality of lighting fixtures 20 (21, 22, 23) around the person 30 rather than only a single lighting fixture 20 (21, 22, 23).
 Therefore, the number of lighting fixtures 20 (21, 22, 23) to be lit can be changed dynamically according to the position of the person 30 without setting in the control instruction unit 134, as in Embodiment 1, a rule for permitting the lighting of a plurality of lighting fixtures 20 (21, 22, 23). The other configurations and operations are the same as those of Embodiment 1.

Claims (11)

  1.  A lighting control device comprising:
      a measurement processing unit that generates three-dimensional information of an illumination space including a floor surface illuminated by a plurality of lighting fixtures, based on an image acquired from an imaging means that captures the illumination space;
      a map generation unit that generates, based on the three-dimensional information, an illumination range map representing the illumination ranges of the lighting fixtures on the floor surface of the illumination space;
      a human detection unit that determines, based on the three-dimensional information, a presence position where a person exists on the floor surface of the illumination space; and
      a control instruction unit that determines, with reference to the presence position and the illumination range map, an instruction value for the light output of each of the plurality of lighting fixtures.
  2.  The lighting control device according to claim 1, wherein
      the imaging means comprises two cameras, and
      the measurement processing unit is configured to generate the three-dimensional information from two images respectively obtained from the two cameras, using the principle of triangulation.
  3.  The lighting control device according to claim 1, wherein
      the map generation unit is configured to obtain, for each of the plurality of lighting fixtures, a change in the luminance value of a pixel corresponding to a predetermined position on the floor surface in the image, and to assign the pixel to the illumination range of the lighting fixture for which the change in luminance value is the largest,
      the map generation unit is configured to obtain a difference between a first luminance value and a second luminance value as the change in luminance value,
      the first luminance value is the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means when the light output of the lighting fixture is a first light output, and
      the second luminance value is the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means when the light output of the lighting fixture is a second light output different from the first light output.
  4.  The lighting control device according to claim 1, wherein
      the map generation unit is configured to obtain, for each of the plurality of lighting fixtures, a change in the luminance value of a pixel corresponding to a predetermined position on the floor surface in the image, and to assign the pixel to the illumination range of every lighting fixture for which the change in luminance value is equal to or greater than a prescribed threshold,
      the map generation unit is configured to obtain a difference between a first luminance value and a second luminance value as the change in luminance value,
      the first luminance value is the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means when the light output of the lighting fixture is a first light output, and
      the second luminance value is the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means when the light output of the lighting fixture is a second light output different from the first light output.
  5.  The lighting control device according to claim 1, wherein
      the map generation unit is configured to obtain, for each of the plurality of lighting fixtures, a contribution to a pixel corresponding to a predetermined position on the floor surface in the image, and to assign the pixel to the illumination range of the lighting fixture whose contribution is the largest, and
      the map generation unit is configured to:
       obtain the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means when the light output of one lighting fixture among the plurality of lighting fixtures is a first light output and the light output of the other lighting fixtures is a second light output lower than the first light output; and
       obtain, as the contribution, a ratio of the luminance value corresponding to the one lighting fixture to a sum of the luminance values corresponding to the plurality of lighting fixtures.
  6.  The lighting control device according to claim 1, wherein
      the map generation unit is configured to obtain, for each of the plurality of lighting fixtures, a contribution to a pixel corresponding to a predetermined position on the floor surface in the image, and to assign the pixel to the illumination range of every lighting fixture whose contribution is equal to or greater than a prescribed threshold, and
      the map generation unit is configured to:
       obtain the luminance value of the pixel corresponding to the predetermined position in the image obtained from the imaging means when the light output of one lighting fixture among the plurality of lighting fixtures is a first light output and the light output of the other lighting fixtures is a second light output lower than the first light output; and
       obtain, as the contribution, a ratio of the luminance value corresponding to the one lighting fixture to a sum of the luminance values corresponding to the plurality of lighting fixtures.
  7.  The lighting control device according to claim 1, wherein the map generation unit is configured not to assign a pixel corresponding to a surface intersecting the floor surface in the image to any of the illumination ranges.
  8.  The lighting control device according to claim 1, wherein, when an undetermined position not included in any of the illumination ranges exists in the illumination range map, the map generation unit is configured to assign the undetermined position to one of the illumination ranges based on a positional relationship between the undetermined position and the illumination ranges of the respective lighting fixtures.
  9.  The lighting control device according to claim 6, wherein the map generation unit is configured to obtain, based on the illumination range, a position on the floor surface in the image directly below the lighting fixture.
  10.  The lighting control device according to claim 7, wherein the positional relationship is a positional relationship between the undetermined position and the positions directly below the respective lighting fixtures.
  11.  The lighting control device according to claim 1, further comprising the imaging means and a lighting control unit, wherein the lighting control unit is configured, upon receiving the instruction value, to control the lighting fixture such that the light output of the lighting fixture becomes a value corresponding to the instruction value.
PCT/JP2012/068110 2011-08-12 2012-07-17 Illumination controller WO2013024655A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011176847A JP5799232B2 (en) 2011-08-12 2011-08-12 Lighting control device
JP2011-176847 2011-08-12

Publications (1)

Publication Number Publication Date
WO2013024655A1 (en) 2013-02-21

Family

ID=47714980

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/068110 WO2013024655A1 (en) 2011-08-12 2012-07-17 Illumination controller

Country Status (2)

Country Link
JP (1) JP5799232B2 (en)
WO (1) WO2013024655A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016122876A (en) * 2014-12-24 2016-07-07 三菱電機株式会社 Display device, display system, air conditioner, illumination device, and display device control program
JP2018147718A (en) * 2017-03-06 2018-09-20 東日本旅客鉄道株式会社 Illumination control method and illumination control system

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6124215B2 (en) * 2013-10-28 2017-05-10 パナソニックIpマネジメント株式会社 Lighting system and lighting control device
CA3036061A1 (en) * 2016-09-06 2018-03-15 Noon Home, Inc. Intelligent lighting control system automated adjustment apparatuses, systems, and methods
JP6918744B2 (en) * 2018-05-16 2021-08-18 浜井電球工業株式会社 LED lamp with camera and lighting range monitoring / warning system using it
JP6868290B2 (en) * 2019-03-07 2021-05-12 浜井電球工業株式会社 Multi-function LED lamp and lighting range monitoring / warning system using it

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008032272A (en) * 2006-07-26 2008-02-14 Matsushita Electric Works Ltd Control system
WO2008084677A1 (en) * 2006-12-28 2008-07-17 Sharp Kabushiki Kaisha Transmission device, view environment control device, and view environment control system
JP2009283183A (en) * 2008-05-20 2009-12-03 Panasonic Electric Works Co Ltd Illumination control system
JP2010170907A (en) * 2009-01-23 2010-08-05 Panasonic Electric Works Co Ltd Lighting system
JP2011155393A (en) * 2010-01-26 2011-08-11 Denso It Laboratory Inc Device and method for displaying image of vehicle surroundings

Also Published As

Publication number Publication date
JP2013041714A (en) 2013-02-28
JP5799232B2 (en) 2015-10-21


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12823916

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12823916

Country of ref document: EP

Kind code of ref document: A1