WO2023276605A1 - Lighting control system, lighting control method, and program - Google Patents

Lighting control system, lighting control method, and program

Info

Publication number
WO2023276605A1
Authority
WO
WIPO (PCT)
Prior art keywords
person
lighting
space
unit
control system
Prior art date
Application number
PCT/JP2022/023366
Other languages
French (fr)
Japanese (ja)
Inventor
千人 浦
仁 吉澤
達雄 古賀
Original Assignee
Panasonic Intellectual Property Management Co., Ltd. (パナソニックIpマネジメント株式会社)
Priority date
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co., Ltd.
Publication of WO2023276605A1

Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • H05B47/115Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B47/125Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • H05B47/115Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B47/13Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using passive infrared detectors
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/16Controlling the light source by timing means
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/165Controlling the light source following a pre-assigned programmed sequence; Logic control [LC]
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection


Abstract

A lighting control system (10) is provided with: an acquisition unit (34) for acquiring image data of a space where a person is present; a specifying unit (35) for specifying skeletal coordinates of the person on the basis of the acquired image data; an estimation unit (36) for estimating movement of the person on the basis of time-series data of the specified skeletal coordinates; and a control unit (38) for controlling lighting in the space on the basis of the estimated movement of the person.

Description

LIGHTING CONTROL SYSTEM, LIGHTING CONTROL METHOD, AND PROGRAM
The present invention relates to a lighting control system, a lighting control method, and a program.
Various techniques have been proposed for controlling lighting in a space. Patent Literature 1 discloses a technique for controlling lighting based on human movement.
JP 2008-016289 A
The present invention provides a lighting control system and the like capable of controlling lighting in accordance with a person's various movements.
A lighting control system according to one aspect of the present invention includes: an acquisition unit that acquires image data of a space in which a person is present; a specifying unit that specifies skeletal coordinates of the person based on the acquired image data; an estimation unit that estimates a movement of the person based on time-series data of the specified skeletal coordinates; and a control unit that controls lighting in the space based on the estimated movement of the person.
A lighting control method according to one aspect of the present invention includes: an acquisition step of acquiring image data of a space in which a person is present; a specifying step of specifying skeletal coordinates of the person based on the acquired image data; an estimation step of estimating a movement of the person based on time-series data of the specified skeletal coordinates; and a control step of controlling lighting in the space based on the estimated movement of the person.
A program according to one aspect of the present invention causes a computer to execute the lighting control method.
The lighting control system and the like of the present invention can control lighting in accordance with a person's various movements.
FIG. 1 is a block diagram showing an example of the functional configuration of the lighting control system according to the embodiment.
FIG. 2 is a flowchart showing an example of the operation of the lighting control system according to the embodiment.
FIG. 3 conceptually shows the identification of a two-dimensional human skeletal model.
FIG. 4 conceptually shows the estimation of skeletal coordinates.
FIG. 5 shows an example of a specific action.
FIG. 6 shows an example of a specific action.
Hereinafter, an embodiment will be described with reference to the drawings. The embodiment described below presents comprehensive or specific examples: the numerical values, shapes, materials, components, the arrangement and connection of the components, the steps, and the order of the steps shown below are examples and are not intended to limit the present invention. Among the components in the following embodiment, those not recited in the independent claims are described as optional components.
Each figure is a schematic diagram and is not necessarily drawn to scale. In the figures, substantially identical configurations are given the same reference signs, and duplicate descriptions of them may be omitted or simplified.
(Embodiment)
[Configuration]
First, the configuration of the lighting control system according to the embodiment will be described.
FIG. 1 is a block diagram showing an example of the functional configuration of the lighting control system 10 according to the embodiment.
The lighting control system 10 acquires image data of a space output by the camera 20 and controls the lighting in that space based on the acquired image data. The space is, for example, an office space, but it may also be a space in a commercial facility or in another type of facility, such as a space in a house. As shown in FIG. 1, the lighting control system 10 includes the camera 20, a control device 30, and a lighting device 40.
The camera 20 is installed, for example, on the ceiling or a wall of the space and captures images (a moving image composed of a plurality of images) that include a person present in the space as a subject. The camera 20 transmits the image data of the captured images to the control device 30. The camera 20 may use a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor.
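As a rough illustration of the path from the camera 20 to the control device 30, the following Python sketch reads frames from an OpenCV-compatible camera and posts them to the control device. The HTTP transport, the endpoint URL, and the JPEG encoding are assumptions made for illustration; the patent does not specify how the image data is transmitted.

```python
# Hypothetical camera-side loop: capture frames and send them to the
# control device 30. Transport and endpoint are assumptions, not part
# of the patent.
import cv2
import requests

CONTROLLER_URL = "http://controller.local/frames"  # hypothetical endpoint

def stream_frames(source: int = 0) -> None:
    cap = cv2.VideoCapture(source)  # CMOS/CCD camera exposed as a video source
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            encoded, buf = cv2.imencode(".jpg", frame)  # compress each frame
            if encoded:
                requests.post(CONTROLLER_URL, data=buf.tobytes(),
                              headers={"Content-Type": "image/jpeg"})
    finally:
        cap.release()
```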
The camera 20 may also use an image sensor capable of detecting infrared rays (infrared light); that is, the camera 20 may be an infrared camera. This allows the camera 20 to capture images (infrared images) even when the space is dark.
The lighting control system 10 may include two or more cameras 20. For example, when an image captured by a single camera 20 shows only part of a person's body, it is difficult to estimate the skeletal coordinates (described later) of the whole body. In such a case, the estimation unit 36 can generate a three-dimensional skeletal model of the whole body by estimating skeletal coordinates from each of the two images captured by two cameras 20 and synthesizing the results.
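The patent leaves open how the two per-camera estimates are synthesized. One concrete possibility, assuming the two cameras 20 are calibrated so that their 3x4 projection matrices are known, is direct linear triangulation of matching joints, sketched here with OpenCV; this is an illustrative choice, not the method fixed by the text.

```python
# Triangulate one whole-body skeleton from two camera views.
# P1, P2: 3x4 projection matrices of the calibrated cameras (assumed known).
import numpy as np
import cv2

def triangulate_skeleton(P1: np.ndarray, P2: np.ndarray,
                         joints1: np.ndarray, joints2: np.ndarray) -> np.ndarray:
    """joints1, joints2: (N, 2) pixel coordinates of the same N joints."""
    pts1 = np.asarray(joints1, dtype=np.float32).T  # (2, N), as OpenCV expects
    pts2 = np.asarray(joints2, dtype=np.float32).T
    pts4d = cv2.triangulatePoints(P1, P2, pts1, pts2)  # (4, N) homogeneous
    return (pts4d[:3] / pts4d[3]).T  # (N, 3) joint coordinates in 3-D space
```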
The control device 30 receives image data from the camera 20 and controls the lighting device 40 based on the received image data; controlling the lighting device 40 controls the lighting in the space. The control device 30 is, for example, a local controller (an edge computer) installed in the same facility as the space, but it may instead be a server device (a cloud computer) installed outside the facility. The control device 30 includes a communication unit 31, an information processing unit 32, and a storage unit 33.
The communication unit 31 is a communication module (communication circuit) through which the control device 30 communicates with the camera 20 and the lighting device 40. For example, the communication unit 31 receives image data from the camera 20 and transmits control signals to the lighting device 40. The communication performed by the communication unit 31 may be wireless or wired, and the communication standard used is not particularly limited.
The information processing unit 32 acquires the image data received by the communication unit 31 and performs information processing for controlling the lighting device 40 based on the acquired image data. The information processing unit 32 is realized, specifically, by a processor or a microcomputer, and includes an acquisition unit 34, a specifying unit 35, an estimation unit 36, an identification unit 37, and a control unit 38. The functions of these units are realized by the processor or microcomputer constituting the information processing unit 32 executing a computer program stored in the storage unit 33; the details of their functions are described later.
The storage unit 33 is a storage device that stores the image data received by the communication unit 31, the computer programs executed by the information processing unit 32, and the like. The storage unit 33 also stores the machine learning model, the estimation model, and the like described later. Specifically, the storage unit 33 is realized by a semiconductor memory, an HDD (Hard Disk Drive), or the like.
The lighting device 40 is installed on the ceiling of the space or the like and illuminates the space. For example, the brightness or the color temperature of the light output by the lighting device 40 is adjusted based on control signals transmitted from the control device 30.
[Operation Example]
Next, an operation example of the lighting control system 10 will be described.
FIG. 2 is a flowchart showing an example of the operation of the lighting control system 10.
The communication unit 31 of the control device 30 receives image data of the space from the camera 20 (S11). The information processing unit 32 stores the received image data in the storage unit 33 (S12).
Next, the acquisition unit 34 acquires the image data received by the communication unit 31 and stored in the storage unit 33 (S13), and the specifying unit 35 specifies, based on the acquired image data, a two-dimensional skeletal model of the person appearing in the image (S14).
FIG. 3 conceptually shows the identification of a two-dimensional human skeletal model.
As shown in FIG. 3, the two-dimensional skeletal model is a model in which the positions of the person's joints in the image (spheres) are connected by links (lines). An existing posture- and skeleton-identification algorithm is used to specify the two-dimensional skeletal model.
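As one hypothetical instance of the "existing algorithm" mentioned above, the sketch below specifies a two-dimensional skeleton with the MediaPipe Pose library. The patent does not name a specific algorithm, so the library and its landmark convention are assumptions.

```python
# Detect the 2-D skeleton (joint pixel positions) of a person in one frame
# using MediaPipe Pose, as one example of an off-the-shelf algorithm.
import cv2
import mediapipe as mp

def detect_2d_skeleton(frame_bgr):
    """Returns a list of (x, y) pixel coordinates per joint, or None."""
    with mp.solutions.pose.Pose(static_image_mode=True) as pose:
        results = pose.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if results.pose_landmarks is None:
        return None  # no person detected in this frame
    h, w = frame_bgr.shape[:2]
    return [(lm.x * w, lm.y * h) for lm in results.pose_landmarks.landmark]
```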
Next, the specifying unit 35 specifies skeletal coordinates (three-dimensional coordinate data of each joint) from the specified two-dimensional skeletal model (S15). The specifying unit 35 specifies the skeletal coordinates using, for example, a machine learning model.
FIG. 4 conceptually shows the estimation of skeletal coordinates.
The machine learning model is a learning model constructed in advance by machine learning that uses two-dimensional skeletal models whose per-joint skeletal coordinates are known as learning data and those skeletal coordinates as teacher data. Such a model takes a two-dimensional skeletal model as input and outputs its skeletal coordinates (in other words, a three-dimensional skeletal model). By specifying the skeletal coordinates for each of the images (frames) constituting the moving image, the specifying unit 35 can specify time-series data of the skeletal coordinates.
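A minimal sketch of such a lifting model, assuming a 17-joint skeleton and a plain fully connected network (the patent fixes neither the joint count nor the architecture, only that the model is trained with known skeletal coordinates as teacher data):

```python
# 2-D-to-3-D lifting model: per-frame 2-D joint coordinates in, 3-D out.
import torch
import torch.nn as nn

class LiftingModel(nn.Module):
    def __init__(self, num_joints: int = 17):  # joint count is an assumption
        super().__init__()
        self.num_joints = num_joints
        self.net = nn.Sequential(
            nn.Linear(num_joints * 2, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, num_joints * 3),  # (x, y, z) per joint
        )

    def forward(self, joints_2d: torch.Tensor) -> torch.Tensor:
        # joints_2d: (batch, num_joints * 2) -> (batch, num_joints, 3)
        return self.net(joints_2d).view(-1, self.num_joints, 3)

# Training pairs 2-D skeletons with their known 3-D coordinates:
#   loss = nn.MSELoss()(model(x2d), y3d); loss.backward(); optimizer.step()
```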
Next, the estimation unit 36 estimates the movement of the person appearing in the images from the specified time-series data of the skeletal coordinates (S16) and detects a specific action of the person based on the estimation result (S17). For example, if the storage unit 33 stores a discriminative model that has learned the joint movements occurring when a specific action is performed, the estimation unit 36 can detect the specific action by inputting the time-series data of the skeletal coordinates into that model.
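The text fixes only that a trained discriminative model maps the skeletal time series to a detected action. One hedged realization is a recurrent classifier over a sliding window of 3-D joint coordinates, sketched below with an illustrative label set:

```python
# Classify a window of skeletal coordinates into one action label.
import torch
import torch.nn as nn

ACTIONS = ["brighten", "dim", "desk_work", "stretch", "conversation", "other"]

class ActionClassifier(nn.Module):
    def __init__(self, num_joints: int = 17, hidden: int = 128):
        super().__init__()
        self.lstm = nn.LSTM(num_joints * 3, hidden, batch_first=True)
        self.head = nn.Linear(hidden, len(ACTIONS))

    def forward(self, seq: torch.Tensor) -> torch.Tensor:
        # seq: (batch, frames, num_joints * 3), a window of 3-D coordinates
        _, (h_n, _) = self.lstm(seq)
        return self.head(h_n[-1])  # logits over the action labels
```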
FIGS. 5 and 6 show examples of specific actions.
For example, the specific action includes an action instructing that the lighting be controlled; it may be a gesture that instructs lighting control. FIG. 5 shows, as an example of such an action, a person raising the left hand after raising the right hand. If people are made aware in advance that raising the right hand and then the left hand brightens the lighting, a person who wants brighter lighting will perform this gesture. Raising the left hand after the right hand is only one example of an action instructing that the lighting be brightened; the action is not limited to this. Actions instructing lighting control also include actions instructing that the lighting be dimmed, that its color temperature be raised, or that its color temperature be lowered. Time-series data of skeletal coordinates are specified from a moving image that includes a person performing such an action as a subject, and the estimation unit 36 detects, from that time-series data, the action by which the person in the image instructs lighting control.
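The right-hand-then-left-hand gesture of FIG. 5 could equally be detected by a simple temporal rule over the skeletal time series, as in the sketch below. The joint indices and the image-coordinate convention (y grows downward, so "above" means a smaller y) are assumptions for illustration.

```python
# Detect "right hand raised, then left hand raised" from joint positions.
R_WRIST, L_WRIST, R_SHOULDER, L_SHOULDER = 4, 7, 2, 5  # hypothetical indices

def detect_right_then_left(frames) -> bool:
    """frames: per-frame sequences of (x, y, z) joint coordinates."""
    right_up_at = None
    for t, joints in enumerate(frames):
        right_up = joints[R_WRIST][1] < joints[R_SHOULDER][1]  # wrist above shoulder
        left_up = joints[L_WRIST][1] < joints[L_SHOULDER][1]
        if right_up and right_up_at is None:
            right_up_at = t  # remember when the right hand first went up
        if left_up and right_up_at is not None and t > right_up_at:
            return True  # the left hand rose after the right hand
    return False
```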
The specific action also includes actions that a person performs on a daily basis. Such everyday actions are, for example, not gestures instructing lighting control but actions such as doing desk work, stretching during desk work, or conversing with another person. FIG. 6 shows a person stretching during desk work as an example of an everyday action. Time-series data of skeletal coordinates are specified from a moving image that includes a person performing such an action as a subject, and the estimation unit 36 detects the everyday action of the person in the image from that time-series data.
Next, the identification unit 37 identifies the person based on the acquired image data (S18). For example, if the storage unit 33 stores a discriminative model that has learned individual features (for example, face, physique, or skeleton), the identification unit 37 can identify the person by inputting image data showing the person into that model.
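One way such a discriminative model could be realized, sketched under the assumption that it outputs a feature embedding per person image and that embeddings of enrolled users are stored in advance, is nearest-neighbor matching; the similarity measure and threshold are illustrative.

```python
# Match a person's embedding against enrolled users by cosine similarity.
import numpy as np

def identify_person(embedding, enrolled, threshold=0.7):
    """enrolled: dict mapping user id -> reference embedding (np.ndarray)."""
    best_id, best_sim = None, threshold
    for user_id, ref in enrolled.items():
        sim = float(np.dot(embedding, ref) /
                    (np.linalg.norm(embedding) * np.linalg.norm(ref)))
        if sim > best_sim:  # keep the most similar enrolled user
            best_id, best_sim = user_id, sim
    return best_id  # None if nobody exceeds the similarity threshold
```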
Next, the control unit 38 controls the lighting in the space based on the estimated movement of the person (S19). Specifically, the control unit 38 controls the lighting device 40, which is done by transmitting a control signal from the communication unit 31 to the lighting device 40.
For example, when a specific action is detected, the control unit 38 controls the lighting in the space with the control content corresponding to that specific action.
For example, when an action instructing lighting control is detected as the specific action, the control unit 38 controls the lighting in the space with the control content corresponding to that action: it brightens the lighting when an action instructing brightening is detected, dims it when an action instructing dimming is detected, raises the color temperature when an action instructing a higher color temperature is detected, and lowers the color temperature when an action instructing a lower color temperature is detected.
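In code, this action-to-control-content mapping can be a simple lookup table whose result is handed to the communication unit 31 for transmission; the command names and the send function below are assumptions for illustration.

```python
# Map each instruction gesture to one control content and dispatch it.
ACTION_TO_COMMAND = {
    "brighten":         {"brightness": "+20%"},
    "dim":              {"brightness": "-20%"},
    "raise_color_temp": {"color_temperature": "+500K"},
    "lower_color_temp": {"color_temperature": "-500K"},
}

def control_lighting(action, send_control_signal):
    command = ACTION_TO_COMMAND.get(action)
    if command is not None:
        send_control_signal(command)  # e.g., sent via communication unit 31
```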
Also, for example, when an everyday action is detected as the specific action, the control unit 38 controls the lighting in the space with the control content corresponding to that action. For example, the control unit 38 brightens the space when a person doing desk work is detected, and darkens the space when the person is detected stretching during desk work.
For example, when a plurality of lighting devices 40 are provided in the space, the control unit 38 may control the lighting around a person by controlling the lighting devices 40 near the person who performed the specific action. For example, when an action of conversing with another person is detected, the control unit 38 brightens the area around that person.
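A sketch of this localized control, assuming each lighting device 40 has a known ceiling position and that the person's floor position can be estimated from the skeletal coordinates (the radius is illustrative):

```python
# Select the lighting devices within a given radius of the person.
import math

def devices_near(person_xy, devices, radius_m=2.0):
    """devices: dict mapping device id -> (x, y) position in metres."""
    px, py = person_xy
    return [dev_id for dev_id, (dx, dy) in devices.items()
            if math.hypot(dx - px, dy - py) <= radius_m]
```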
The control unit 38 may also control the lighting in the space based on both the estimated movement of the person and the identified person. For example, since preferences for lighting control differ from person to person, the control content corresponding to a specific action may differ for each identified person.
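Such person-dependent control content can be expressed as a per-user preference table consulted after identification; the users and values below are purely illustrative.

```python
# Resolve the control content for an action, preferring the identified
# user's registered preference over the default.
USER_PREFERENCES = {
    "user_a": {"stretch": {"brightness": "-30%"}},  # prefers a darker break
    "user_b": {"stretch": {"brightness": "-10%"}},  # prefers a mild dim
}

def resolve_command(action, user_id, default_command):
    if user_id is not None and action in USER_PREFERENCES.get(user_id, {}):
        return USER_PREFERENCES[user_id][action]
    return default_command
```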
The control unit 38 may further control the lighting in the space based on the current time. For example, because the brightness of a space can change with the time of day as the sun moves, the lighting may be dimmed during hours when sunlight readily enters the space and brightened during hours when it does not.
The control unit 38 may also control the lighting in the space based on the estimated movement of the person and the current time. Even when the same movement is detected, the control content may differ by time of day. Specifically, when a person is detected stretching during desk work in the morning, the space is not dimmed, whereas when the same stretch is detected in the afternoon, the space may be dimmed. Dimming the space during desk work lets the person take a break, but in the morning, shortly after work has started, such a break may not yet be needed. In this way, depending on the estimated movement of the person, the control content may vary with the current time.
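The morning/afternoon rule described above reduces to a small time-gated decision; the noon boundary below mirrors the example in the text and would be configurable in practice.

```python
# Decide how to respond to a stretch during desk work, depending on the time.
from datetime import datetime

def stretch_command(now=None):
    now = now or datetime.now()
    if now.hour < 12:
        return None                   # morning: no dimming, no break needed yet
    return {"brightness": "-30%"}     # afternoon: dim the space to prompt a break
```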
The control unit 38 may also control the lighting in the space based on the estimated movement of the person, the current time, and the identified person. For example, since preferences for time-dependent lighting control differ from person to person, the time-dependent control content may differ for each identified person.
[Effects, etc.]
As described above, the lighting control system 10 includes the acquisition unit 34 that acquires image data of a space in which a person is present, the specifying unit 35 that specifies the skeletal coordinates of the person based on the acquired image data, the estimation unit 36 that estimates the movement of the person based on the time-series data of the specified skeletal coordinates, and the control unit 38 that controls the lighting in the space based on the estimated movement of the person.
With this configuration, the time-series data of the specified skeletal coordinates makes it possible to estimate a person's fine movements, and thus various movements of the person. The lighting can therefore be controlled in accordance with those various movements.
Also, for example, the estimation unit 36 detects a specific action of the person by estimating the person's movement, and when the specific action is detected, the control unit 38 controls the lighting in the space with the control content corresponding to the specific action.
This makes it possible to control the lighting in the space with different control content for each specific action.
Also, for example, the specific action includes actions that the person performs on a daily basis.
This allows the lighting to be controlled by a person's everyday actions; for example, the lighting can be controlled automatically, with control content corresponding to those everyday actions, without the person explicitly instructing lighting control.
Also, for example, the specific action includes an action instructing that the lighting be controlled.
This allows the lighting to be controlled by gestures and the like.
Also, for example, the lighting control system 10 further includes the identification unit 37 that identifies the person based on the acquired image data, and the control unit 38 controls the lighting in the space based on the estimated movement of the person and the identified person.
This makes it possible to vary the lighting control content from person to person, even for the same movement.
Also, for example, the control unit 38 controls the lighting in the space based on the estimated movement of the person and the current time.
This makes it possible to change, for example, the brightness of the lighting according to the current time; depending on the estimated movement of the person, the control content can vary with the current time.
Also, for example, the acquisition unit 34 acquires image data of images captured by an infrared camera.
This makes it possible to specify a person's skeletal coordinates even when the space is dark, based on the image data captured by the infrared camera.
The lighting control method executed by a computer such as the lighting control system 10 includes an acquisition step of acquiring image data of a space in which a person is present (S13 in FIG. 2), a specifying step of specifying the skeletal coordinates of the person based on the acquired image data (S15 in FIG. 2), an estimation step of estimating the movement of the person based on the time-series data of the specified skeletal coordinates (S16 in FIG. 2), and a control step of controlling the lighting in the space based on the estimated movement of the person (S19 in FIG. 2).
This provides a lighting control method capable of controlling lighting in accordance with a person's various movements.
 (その他の実施の形態)
 以上、実施の形態に係る照明制御システム10、及び、照明制御方法について説明したが、本発明は、上記実施の形態に限定されるものではない。
(Other embodiments)
Although the lighting control system 10 and the lighting control method according to the embodiment have been described above, the present invention is not limited to the above embodiment.
 上記実施の形態では、照明制御システムは、複数の装置によって実現されたが、単一の装置として実現されてもよい。例えば、照明制御システムは、制御装置に相当する単一の装置として実現されてもよい。照明制御システムが複数の装置によって実現される場合、照明制御システムが備える各構成要素は、複数の装置にどのように振り分けられてもよい。 Although the lighting control system is implemented by a plurality of devices in the above embodiment, it may be implemented as a single device. For example, the lighting control system may be implemented as a single device that corresponds to the controller. When the lighting control system is implemented by a plurality of devices, each component included in the lighting control system may be distributed to the plurality of devices in any way.
 例えば、上記実施の形態では、照明制御システム10が識別部37を備える例について説明したが、照明制御システム10は、識別部37を備えていなくてもよい。この場合、制御部38は、推定された人の動き及び識別された人に基づいて、空間における照明を制御しなくてもよい。 For example, in the above embodiment, an example in which the lighting control system 10 includes the identification unit 37 has been described, but the lighting control system 10 does not have to include the identification unit 37 . In this case, controller 38 may not control the lighting in the space based on the estimated human movement and the identified person.
 例えば、上記実施の形態では、制御部38は、現在時刻に基づいて、空間における照明を制御する例について説明したが、制御部38は、現在時刻に基づいて、空間における照明を制御しなくてもよい。 For example, in the above embodiment, the control unit 38 controls the lighting in the space based on the current time, but the control unit 38 does not need to control the lighting in the space based on the current time. good too.
 また、上記実施の形態において、特定の処理部が実行する処理を別の処理部が実行してもよい。また、複数の処理の順序が変更されてもよいし、複数の処理が並行して実行されてもよい。 Further, in the above embodiment, the processing executed by a specific processing unit may be executed by another processing unit. In addition, the order of multiple processes may be changed, and multiple processes may be executed in parallel.
 また、上記実施の形態において、各構成要素は、各構成要素に適したソフトウェアプログラムを実行することによって実現されてもよい。各構成要素は、CPU又はプロセッサなどのプログラム実行部が、ハードディスク又は半導体メモリなどの記録媒体に記録されたソフトウェアプログラムを読み出して実行することによって実現されてもよい。 Also, in the above embodiments, each component may be realized by executing a software program suitable for each component. Each component may be realized by reading and executing a software program recorded in a recording medium such as a hard disk or a semiconductor memory by a program execution unit such as a CPU or processor.
 また、各構成要素は、ハードウェアによって実現されてもよい。例えば、各構成要素は、回路(又は集積回路)でもよい。これらの回路は、全体として1つの回路を構成してもよいし、それぞれ別々の回路でもよい。また、これらの回路は、それぞれ、汎用的な回路でもよいし、専用の回路でもよい。 Also, each component may be realized by hardware. For example, each component may be a circuit (or integrated circuit). These circuits may form one circuit as a whole, or may be separate circuits. These circuits may be general-purpose circuits or dedicated circuits.
 また、本発明の全般的又は具体的な態様は、システム、装置、方法、集積回路、コンピュータプログラム又はコンピュータ読み取り可能なCD-ROMなどの記録媒体で実現されてもよい。また、システム、装置、方法、集積回路、コンピュータプログラム及び記録媒体の任意な組み合わせで実現されてもよい。例えば、本発明は、上記実施の形態の照明制御方法をコンピュータに実行させるためのプログラムとして実現されてもよいし、このようなプログラムが記憶された、コンピュータ読み取り可能な非一時的な記録媒体として実現されてもよい。 In addition, general or specific aspects of the present invention may be implemented in a system, apparatus, method, integrated circuit, computer program, or recording medium such as a computer-readable CD-ROM. Also, any combination of systems, devices, methods, integrated circuits, computer programs and recording media may be implemented. For example, the present invention may be realized as a program for causing a computer to execute the lighting control method of the above embodiments, or as a computer-readable non-temporary recording medium storing such a program. may be implemented.
 Forms obtained by applying various modifications conceivable to a person skilled in the art to each embodiment, and forms realized by arbitrarily combining the components and functions of each embodiment without departing from the spirit of the present invention, are also included in the present invention.
 At least the following matters are described in this specification.
 (1) A lighting control system comprising: an acquisition unit that acquires image data of a space in which a person is present; an identifying unit that identifies skeletal coordinates of the person based on the acquired image data; an estimation unit that estimates movement of the person based on time-series data of the identified skeletal coordinates; and a control unit that controls lighting in the space based on the estimated movement of the person (a minimal code sketch of this pipeline follows this list).
 (2) The lighting control system according to (1), wherein the estimation unit detects a specific action of the person by estimating the movement of the person, and the control unit, when the specific action is detected, controls the lighting in the space with control content corresponding to the specific action.
 (3) The lighting control system according to (2), wherein the specific action includes an action that the person performs routinely.
 (4) The lighting control system according to (2) or (3), wherein the specific action includes an action instructing that the lighting be controlled.
 (5) The lighting control system according to any one of (1) to (4), further comprising an identification unit that identifies the person based on the acquired image data, wherein the control unit controls the lighting in the space based on the estimated movement of the person and the identified person.
 (6) The lighting control system according to any one of (1) to (5), wherein the control unit controls the lighting in the space based on the estimated movement of the person and the current time.
 (7) The lighting control system according to any one of (1) to (6), wherein the acquisition unit acquires the image data of an image captured by an infrared camera.
 (8) A lighting control method comprising: an acquisition step of acquiring image data of a space in which a person is present; an identification step of identifying skeletal coordinates of the person based on the acquired image data; an estimation step of estimating movement of the person based on time-series data of the identified skeletal coordinates; and a control step of controlling lighting in the space based on the estimated movement of the person.
 (9) A program for causing a computer to execute the lighting control method according to (8).
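
A minimal Python sketch of the pipeline in items (1) and (8) is given below, keyed to the disclosure's reference signs (acquisition unit 34, identifying unit 35, estimation unit 36, control unit 38). The class names, the 30-frame window, the wrist-joint index, the threshold, and the "raise_hand" gesture are all hypothetical illustrations, not details taken from the specification.

from collections import deque
from dataclasses import dataclass
from typing import Deque, Dict, List, Optional, Tuple

Skeleton = List[Tuple[float, float]]  # one (x, y) coordinate per joint


@dataclass
class Frame:
    timestamp: float
    image: bytes  # e.g. one infrared frame, per item (7)


class PoseEstimator:
    """Stand-in for any skeletal-coordinate model (identifying unit 35)."""

    def extract(self, frame: Frame) -> Optional[Skeleton]:
        raise NotImplementedError


class LightingInterface:
    """Stand-in for the luminaire-facing side of the control unit 38."""

    def apply(self, scene: Dict[str, float]) -> None:
        print(f"lighting -> {scene}")


class LightingController:
    WINDOW = 30  # frames of skeletal time-series kept for motion estimation

    def __init__(self, pose: PoseEstimator, lights: LightingInterface) -> None:
        self.pose = pose
        self.lights = lights
        self.history: Deque[Skeleton] = deque(maxlen=self.WINDOW)

    def on_frame(self, frame: Frame) -> None:
        # Acquisition (unit 34): one image of the space arrives here.
        skeleton = self.pose.extract(frame)  # identifying unit 35
        if skeleton is None:
            return  # no person visible in this frame
        self.history.append(skeleton)
        action = self.estimate_action(list(self.history))  # estimation unit 36
        if action is not None:
            self.control(action)  # control unit 38

    def estimate_action(self, series: List[Skeleton]) -> Optional[str]:
        # Placeholder motion estimation over the skeletal time-series; a real
        # system might run a temporal classifier here instead of a fixed rule.
        if len(series) < self.WINDOW:
            return None
        wrist_y = [s[0][1] for s in series]  # joint index 0 assumed = wrist
        if wrist_y[-1] < min(wrist_y[:-1]) - 0.2:  # image y decreases upward
            return "raise_hand"
        return None

    def control(self, action: str) -> None:
        # Control content per specific action, per item (2).
        scenes: Dict[str, Dict[str, float]] = {"raise_hand": {"brightness": 1.0}}
        if action in scenes:
            self.lights.apply(scenes[action])

In practice, the PoseEstimator stand-in would wrap whatever skeletal-coordinate model the system actually uses, and the rule in estimate_action could be replaced by a temporal classifier trained on the same time-series of skeletal coordinates.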
10 lighting control system
34 acquisition unit
35 identifying unit
36 estimation unit
37 identification unit
38 control unit

Claims (9)

  1.  A lighting control system comprising:
      an acquisition unit that acquires image data of a space in which a person is present;
      an identifying unit that identifies skeletal coordinates of the person based on the acquired image data;
      an estimation unit that estimates movement of the person based on time-series data of the identified skeletal coordinates; and
      a control unit that controls lighting in the space based on the estimated movement of the person.
  2.  The lighting control system according to claim 1, wherein
      the estimation unit detects a specific action of the person by estimating the movement of the person, and
      the control unit, when the specific action is detected, controls the lighting in the space with control content corresponding to the specific action.
  3.  The lighting control system according to claim 2, wherein the specific action includes an action that the person performs routinely.
  4.  The lighting control system according to claim 2 or 3, wherein the specific action includes an action instructing that the lighting be controlled.
  5.  The lighting control system according to claim 1, further comprising
      an identification unit that identifies the person based on the acquired image data,
      wherein the control unit controls the lighting in the space based on the estimated movement of the person and the identified person.
  6.  The lighting control system according to claim 1, wherein the control unit controls the lighting in the space based on the estimated movement of the person and the current time.
  7.  The lighting control system according to claim 1, wherein the acquisition unit acquires the image data of an image captured by an infrared camera.
  8.  A lighting control method comprising:
      an acquisition step of acquiring image data of a space in which a person is present;
      an identification step of identifying skeletal coordinates of the person based on the acquired image data;
      an estimation step of estimating movement of the person based on time-series data of the identified skeletal coordinates; and
      a control step of controlling lighting in the space based on the estimated movement of the person.
  9.  A program for causing a computer to execute the lighting control method according to claim 8.
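
Claims 5 and 6 add the identified person and the current time as further inputs to the control decision. The sketch below shows one way a control unit might combine those inputs when selecting control content; the person IDs, time bands, and scene values are hypothetical examples, not values from the disclosure.

from datetime import datetime
from typing import Dict

Scene = Dict[str, float]

# Hypothetical per-person, per-time-band preferences; none of these values
# come from the disclosure.
PREFERENCES: Dict[str, Dict[str, Scene]] = {
    "resident_a": {
        "day":   {"brightness": 0.9, "color_temp_k": 5000.0},
        "night": {"brightness": 0.3, "color_temp_k": 2700.0},
    },
    "default": {
        "day":   {"brightness": 0.8, "color_temp_k": 4000.0},
        "night": {"brightness": 0.4, "color_temp_k": 3000.0},
    },
}


def select_scene(person_id: str, now: datetime) -> Scene:
    """Pick control content from the identified person and the current time."""
    band = "day" if 6 <= now.hour < 22 else "night"
    prefs = PREFERENCES.get(person_id, PREFERENCES["default"])
    return prefs[band]


if __name__ == "__main__":
    # e.g. the identification unit 37 reported "resident_a" late at night
    print(select_scene("resident_a", datetime(2022, 6, 9, 23, 15)))

The lookup degrades gracefully: an unidentified person falls back to the default preference set, so control based on the estimated movement alone, as in claim 1, still works when identification fails.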
PCT/JP2022/023366 2021-06-29 2022-06-09 Lighting control system, lighting control method, and program WO2023276605A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021108188 2021-06-29
JP2021-108188 2021-06-29

Publications (1)

Publication Number Publication Date
WO2023276605A1 (en)

Family

ID=84692738

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/023366 WO2023276605A1 (en) 2021-06-29 2022-06-09 Lighting control system, lighting control method, and program

Country Status (1)

Country Link
WO (1) WO2023276605A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012032066A (en) * 2010-07-30 2012-02-16 Toyota Home Kk Environmental adjustment system
JP2020170247A (en) * 2019-04-01 2020-10-15 オムロン株式会社 Person detection device and person detection method
WO2021044787A1 (en) * 2019-09-04 2021-03-11 ソニー株式会社 Information processing device, information processing method, and program
JP2021068088A (en) * 2019-10-21 2021-04-30 株式会社東海理化電機製作所 Image processing device, computer program, and image processing system

Similar Documents

Publication Number Title
US9544504B2 (en) Rapid synchronized lighting and shuttering
AU2012253292B2 (en) Presence sensing
US9349039B2 (en) Gesture recognition device and control method for the same
JP4711885B2 (en) Remote control device and method
JPH11265249A (en) Information input device, information input method and storage medium
US20020135581A1 (en) Method and system for controlling an avatar using computer vision
US20140118257A1 (en) Gesture detection systems
WO2016085212A1 (en) Electronic device and method for controlling display
US9223415B1 (en) Managing resource usage for task performance
EP3422152A1 (en) Remote operation device, remote operation method, remote operation system, and program
KR101810956B1 (en) Camera device having image sensor of rolling shutter type and lighting control method
JP2023509291A (en) Joint infrared and visible light visual inertial object tracking
WO2022031478A1 (en) Systems and methods for object tracking using fused data
CN106204743A (en) Control method, device and the mobile terminal of a kind of augmented reality function
WO2017195450A1 (en) Information processing apparatus and information processing method
WO2023276605A1 (en) Lighting control system, lighting control method, and program
CN109323159A (en) Illuminating bracket formula multimedia equipment
CN108181989B (en) Gesture control method and device based on video data and computing equipment
KR20180074124A (en) Method of controlling electronic device with face recognition and electronic device using the same
CN109041375B (en) Sense light brightness adjusting method, system, computer equipment and can storage medium
CN109243402A (en) A kind of method and device adjusting brightness of display screen
TW201709022A (en) Non-contact control system and method
EP4066775A1 (en) Indication system for a surgical lighting apparatus
CN113597072A (en) Lamp control method and device and electronic equipment
CN113177524A (en) Control method and system of access control equipment and electronic equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22832762

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE