WO2022211083A1 - Registration System, Air Conditioning System, and Registration Program - Google Patents
Registration System, Air Conditioning System, and Registration Program
- Publication number
- WO2022211083A1 (PCT/JP2022/016825)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user terminal
- specific range
- unit
- real space
- information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F24—HEATING; RANGES; VENTILATING
- F24F—AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
- F24F11/00—Control or safety arrangements
- F24F11/50—Control or safety arrangements characterised by user interfaces or communication
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F24—HEATING; RANGES; VENTILATING
- F24F—AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
- F24F11/00—Control or safety arrangements
- F24F11/50—Control or safety arrangements characterised by user interfaces or communication
- F24F11/52—Indication arrangements, e.g. displays
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F24—HEATING; RANGES; VENTILATING
- F24F—AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
- F24F11/00—Control or safety arrangements
- F24F11/50—Control or safety arrangements characterised by user interfaces or communication
- F24F11/52—Indication arrangements, e.g. displays
- F24F11/526—Indication arrangements, e.g. displays giving audible indications
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F24—HEATING; RANGES; VENTILATING
- F24F—AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
- F24F11/00—Control or safety arrangements
- F24F11/50—Control or safety arrangements characterised by user interfaces or communication
- F24F11/54—Control or safety arrangements characterised by user interfaces or communication using one central controller connected to several sub-controllers
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F24—HEATING; RANGES; VENTILATING
- F24F—AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
- F24F11/00—Control or safety arrangements
- F24F11/50—Control or safety arrangements characterised by user interfaces or communication
- F24F11/56—Remote control
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F24—HEATING; RANGES; VENTILATING
- F24F—AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
- F24F11/00—Control or safety arrangements
- F24F11/62—Control or safety arrangements characterised by the type of control or by internal processing, e.g. using fuzzy logic, adaptive control or estimation of values
- F24F11/63—Electronic processing
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F24—HEATING; RANGES; VENTILATING
- F24F—AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
- F24F11/00—Control or safety arrangements
- F24F11/70—Control systems characterised by their outputs; Constructional details thereof
- F24F11/72—Control systems characterised by their outputs; Constructional details thereof for controlling the supply of treated air, e.g. its pressure
- F24F11/79—Control systems characterised by their outputs; Constructional details thereof for controlling the supply of treated air, e.g. its pressure for controlling the direction of the supplied air
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1686—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72409—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
- H04M1/72415—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories for remote control of appliances
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F24—HEATING; RANGES; VENTILATING
- F24F—AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
- F24F2120/00—Control inputs relating to users or occupants
- F24F2120/10—Occupancy
Definitions
- This disclosure relates to a registration system, an air conditioning system, and a registration program.
- Patent Literature 1 discloses an air conditioning system that controls the airflow volume and direction so that the blown air does not strike a specific area in real space where a user holding a portable transmitter is present.
- The present disclosure provides a registration system, an air conditioning system, and a registration program for registering a specific range to be controlled in real space.
- A first aspect of the present disclosure is a registration system having a control unit and a storage unit, wherein the control unit registers, in the storage unit, a specific range in real space determined based on the position information of a user terminal in the real space and the content of an operation on the user terminal.
- A second aspect of the present disclosure is the registration system according to the first aspect,
- wherein the control unit determines a specific range in the real space based on the trajectory of the position information of the user terminal while a movement operation is performed on the user terminal, and registers it in the storage unit.
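As an illustrative sketch (not part of the disclosure), the trajectory-based determination of this aspect could be realized by taking the axis-aligned bounding box of the terminal positions sampled during the movement operation; all names and values below are hypothetical.

```python
# Sketch: derive a specific range as the axis-aligned bounding box of the
# user terminal's position trajectory recorded during a movement operation.

def bounding_box(trajectory):
    """trajectory: list of (x, y, z) terminal positions in real-space coordinates."""
    xs, ys, zs = zip(*trajectory)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

# Example: positions sampled while the user carried the terminal around a desk.
trajectory = [(0.0, 0.0, 1.0), (1.2, 0.1, 1.0), (1.2, 0.8, 1.1), (0.0, 0.9, 1.0)]
lo, hi = bounding_box(trajectory)
print(lo, hi)  # -> (0.0, 0.0, 1.0) (1.2, 0.9, 1.1)
```

A real implementation might instead fit a convex hull or a swept volume; the bounding box is only the simplest closed range the trajectory defines.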
- A third aspect of the present disclosure is the registration system according to the first aspect,
- wherein the user terminal is capable of accepting an instruction operation,
- and the control unit determines, as the specific range, a peripheral range based on the position information of the user terminal at the time the user terminal receives the instruction operation, and registers it in the storage unit.
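A minimal sketch of such a peripheral range (an assumption for illustration, not the patent's implementation): a sphere of fixed radius centred on the terminal position at the moment of the instruction operation, plus a membership test.

```python
import math

# Sketch: peripheral range = sphere around the terminal's position when the
# instruction operation was received. Radius and data shape are assumptions.

def make_peripheral_range(center, radius=1.0):
    return {"center": center, "radius": radius}

def contains(range_, point):
    """True if a real-space point lies inside the peripheral range."""
    return math.dist(range_["center"], point) <= range_["radius"]

r = make_peripheral_range((2.0, 3.0, 1.0), radius=1.5)
print(contains(r, (2.5, 3.0, 1.0)))  # True: 0.5 m from the centre
print(contains(r, (5.0, 3.0, 1.0)))  # False: 3.0 m from the centre
```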
- A fourth aspect of the present disclosure is the registration system according to the first aspect,
- wherein the user terminal is capable of accepting an instruction operation,
- and the control unit determines, as the specific range in the real space, a peripheral range based on the position information in the real space of the object identified through the instruction operation received by the user terminal, and registers it in the storage unit.
- A fifth aspect of the present disclosure is the registration system according to the fourth aspect,
- wherein the user terminal is capable of accepting a selection instruction operation,
- and the control unit determines, as the specific range in the real space, a peripheral range based on the position information in the real space of the part of the object specified through the selection instruction operation received by the user terminal, and registers the peripheral range in the storage unit.
- A sixth aspect of the present disclosure is the registration system according to the first aspect,
- wherein the user terminal is capable of accepting a setting operation,
- and the control unit determines the size or shape of the specific range based on the setting operation received by the user terminal.
- A seventh aspect of the present disclosure is the registration system according to the sixth aspect,
- wherein the user terminal is capable of accepting start point and end point setting operations,
- and the control unit determines the size of the specific range based on the position information of the user terminal (or the position information used as a reference when determining the specific range) at the time the user terminal receives the start point setting operation, and the position information of the user terminal at the time the user terminal receives the end point setting operation.
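One way to read this aspect (an illustrative assumption): the distance between the terminal positions at the start point and end point setting operations gives the radius of the specific range.

```python
import math

# Sketch: the size of the specific range is taken as the straight-line distance
# between the terminal positions at the start and end point setting operations.
# Function name and the use of the distance as a radius are assumptions.

def range_size_from_endpoints(start_pos, end_pos):
    return math.dist(start_pos, end_pos)

radius = range_size_from_endpoints((0.0, 0.0, 1.0), (3.0, 4.0, 1.0))
print(radius)  # -> 5.0
```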
- An eighth aspect of the present disclosure is the registration system according to the fourth aspect,
- wherein the user terminal has an imaging device and a display device for displaying an image captured by the imaging device,
- and the control unit of the user terminal identifies an object included in the image as the target object.
- A ninth aspect of the present disclosure is the registration system according to the fourth aspect,
- wherein the user terminal has an imaging device and a display device that displays an image captured by the imaging device on a screen and allows a position on the screen where the image is displayed to be pointed to,
- and the control unit of the user terminal identifies, among a plurality of objects included in the image, the object at the pointed position as the target object.
- A tenth aspect of the present disclosure is the registration system according to the first aspect,
- wherein the user terminal has a display device for displaying an image,
- and the control unit of the user terminal superimposes an image showing the specific range on the displayed image and displays the result on the display device.
- An eleventh aspect of the present disclosure is the registration system according to the tenth aspect,
- wherein the user terminal has an imaging device,
- and the control unit of the user terminal superimposes, on the image captured by the imaging device, a three-dimensional image representing the specific range generated based on the position information and posture information of the user terminal in the real space, and displays the result.
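The geometric core of this overlay step can be sketched as follows (an illustration only: the pose is simplified to position plus yaw about the vertical axis, whereas a real AR pipeline uses a full 6-DoF pose; all intrinsics are hypothetical).

```python
import math

# Sketch: transform a specific-range point from world coordinates into the
# terminal's camera frame and project it with a pinhole camera model, so the
# range can be drawn over the captured image at the right pixel.

def project(point_w, cam_pos, yaw, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    # World -> camera: translate by the camera position, then rotate by -yaw
    # around the vertical (y) axis.
    tx, ty, tz = (p - c for p, c in zip(point_w, cam_pos))
    cos_y, sin_y = math.cos(-yaw), math.sin(-yaw)
    xc = cos_y * tx + sin_y * tz
    yc = ty
    zc = -sin_y * tx + cos_y * tz
    if zc <= 0:
        return None  # point is behind the camera; nothing to draw
    return (fx * xc / zc + cx, fy * yc / zc + cy)

# A range point 2 m straight ahead of the terminal lands at the image centre:
print(project((0.0, 0.0, 2.0), (0.0, 0.0, 0.0), 0.0))  # -> (320.0, 240.0)
```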
- A twelfth aspect of the present disclosure is the registration system according to the first aspect,
- wherein the user terminal has an output device,
- and the output device outputs a stimulus as a notification when the user terminal is positioned in the specific range in the real space.
- A thirteenth aspect of the present disclosure is the registration system according to the first aspect,
- wherein the user terminal has an output device,
- and the output device outputs different stimuli according to the positional relationship between the specific range in the real space and the user terminal.
- A fourteenth aspect of the present disclosure is the registration system according to the fourth aspect,
- wherein the user terminal has an output device, and when the user terminal is located in the specific range in the real space, the output device outputs a different stimulus depending on whether the specific range was determined from the position information of an object in the real space, or, when the specific range was determined from the position information of an object in the real space, a different stimulus depending on whether the object is a person.
- A fifteenth aspect of the present disclosure is the registration system according to the first aspect,
- wherein the user terminal has a sensor,
- and the control unit of the user terminal calculates the position information of the user terminal in the real space by collating shape data of the real space with data measured by the sensor.
- A sixteenth aspect of the present disclosure is the registration system according to the fifteenth aspect, wherein the sensor is an imaging device, and the control unit of the user terminal calculates the position information of the user terminal in the real space by collating three-dimensional data of the real space with a map generated from captured images taken by the imaging device.
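A deliberately simplified sketch of such map matching (not the disclosed method): if the terminal observes known landmarks in its own frame and the map stores their real-space coordinates, the terminal position can be estimated as the mean offset between the two point sets. Rotation is ignored here for brevity; real systems solve a full rigid alignment (e.g. ICP or visual SLAM relocalization).

```python
# Sketch: estimate the terminal's real-space position from matched landmark
# pairs (map frame vs. terminal frame) as the average translation offset.

def estimate_position(map_points, observed_points):
    n = len(map_points)
    offsets = [
        tuple(m - o for m, o in zip(mp, op))
        for mp, op in zip(map_points, observed_points)
    ]
    return tuple(sum(axis) / n for axis in zip(*offsets))

map_pts = [(4.0, 0.0, 2.0), (6.0, 0.0, 5.0)]   # landmark positions in the map
obs_pts = [(1.0, 0.0, 1.0), (3.0, 0.0, 4.0)]   # same landmarks, terminal frame
print(estimate_position(map_pts, obs_pts))  # -> (3.0, 0.0, 1.0)
```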
- A seventeenth aspect of the present disclosure is the registration system according to the first aspect,
- wherein the control unit calculates the position information of the user terminal in the real space based on data measured by a three-dimensional position measurement sensor installed in the real space.
- An eighteenth aspect of the present disclosure is the registration system according to the seventeenth aspect,
- wherein the three-dimensional position measurement sensor is composed of a plurality of imaging devices with different mounting positions,
- and the control unit calculates the position information of the user terminal in the real space by matching the captured images taken by each of the plurality of imaging devices.
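The multi-camera case reduces, in its simplest form, to stereo triangulation: with two rectified cameras separated by a known baseline and sharing a focal length, the terminal's depth follows from the disparity of its pixel position in the two images. This sketch and all its numbers are hypothetical, not taken from the disclosure.

```python
# Sketch: depth from stereo disparity for two rectified cameras.
# depth = focal_length * baseline / disparity.

def triangulate_depth(u_left, u_right, f=800.0, baseline=0.5):
    disparity = u_left - u_right  # pixel offset between the two views
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return f * baseline / disparity

# Terminal seen at pixel column 420 in the left image, 380 in the right:
print(triangulate_depth(420.0, 380.0))  # -> 10.0 (metres)
```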
- A nineteenth aspect of the present disclosure is the registration system according to the first aspect,
- wherein the control unit calculates the position information of the user terminal in the real space by matching data measured by a sensor of the user terminal with data measured by a sensor installed in the real space.
- A twentieth aspect of the present disclosure is the registration system according to the first aspect,
- wherein the user terminal is capable of accepting an air conditioning instruction operation,
- and the control unit, when the user terminal receives the air conditioning instruction operation, performs air conditioning control for the specific range in the real space registered in the storage unit.
- A twenty-first aspect of the present disclosure is the registration system according to the twentieth aspect,
- wherein the control unit uses a plurality of air conditioners installed in the real space to perform different air conditioning control for the specific range in the real space registered in the storage unit and for the non-specific range other than the specific range.
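To illustrate how control could differ between the specific and non-specific ranges (the data model below is an assumption): units whose coverage centre falls inside the registered range receive one setting, all others another.

```python
# Sketch: assign each air conditioner a setting depending on whether its zone
# centre lies inside the registered specific range (an axis-aligned box here).

def assign_settings(units, specific_lo, specific_hi,
                    specific_setting="airflow_off", other_setting="normal"):
    def inside(p):
        return all(lo <= v <= hi
                   for v, lo, hi in zip(p, specific_lo, specific_hi))
    return {name: (specific_setting if inside(center) else other_setting)
            for name, center in units.items()}

units = {"unit_A": (1.0, 0.5, 1.0), "unit_B": (6.0, 0.5, 4.0)}
plan = assign_settings(units, (0.0, 0.0, 0.0), (2.0, 2.0, 2.0))
print(plan)  # -> {'unit_A': 'airflow_off', 'unit_B': 'normal'}
```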
- An air conditioning system according to a twenty-second aspect of the present disclosure includes the registration system according to any one of the first to twenty-first aspects.
- A registration program according to the present disclosure causes a control unit to execute a registration step of registering, in a storage unit, a specific range in real space determined based on the position information of a user terminal in the real space and the content of an operation on the user terminal.
- FIG. 1 is a diagram showing an example of the system configuration of an air conditioning system.
- FIG. 2 is a diagram showing an example of a real space to which the air conditioning system is applied.
- FIG. 3 is a diagram illustrating an example of a hardware configuration of a control device and a hardware configuration of a user terminal;
- FIG. 4 is a diagram showing an example of operation modes of the air conditioning system and an operation screen.
- FIG. 5 is a diagram showing an example of the functional configuration of the control device in the specific operation mode.
- FIG. 6 is a first diagram showing a specific example of the functional configuration of the specific range input unit and specific range determination processing.
- FIG. 7 is a second diagram showing a specific example of the functional configuration of the specific range input unit and specific range determination processing.
- FIG. 8 is a third diagram showing a specific example of the functional configuration of the specific range input unit and specific range determination processing.
- FIG. 9 is a fourth diagram showing a functional configuration of the specific range input unit and a specific example of specific range determination processing.
- FIG. 10 is a fifth diagram showing a specific example of the functional configuration of the specific range input unit and specific range determination processing.
- FIG. 11 is a sixth diagram showing a specific example of the functional configuration of the specific range input unit and specific range determination processing.
- FIG. 12 is a diagram illustrating a functional configuration of a specific range input unit and a specific example of specific range deletion processing.
- FIG. 13 is a diagram illustrating a functional configuration of a specific range input unit and a specific example of specific range movement processing.
- FIG. 14 is a diagram showing a specific example of the functional configuration of the specific range input unit and the driving information input process.
- FIG. 15 is a diagram showing a specific example of the functional configuration of the specific range input unit and the driving information reversing process.
- FIG. 16 is a first diagram showing a functional configuration of a specific range output unit and a specific example of specific range output processing.
- FIG. 17 is a first flowchart showing the flow of specific range registration processing in the specific operation mode.
- FIG. 18 is a second flowchart showing the flow of specific range registration processing in the specific operation mode.
- FIG. 19 is a diagram showing an example of air conditioning control by an air conditioning system.
- FIG. 20 is a diagram illustrating an example of the functional configurations of the control device and the user terminal in the specific operation mode.
- FIG. 21 is a diagram illustrating a specific example of the functional configuration of the transmission unit and the specific range output unit and specific range display processing.
- FIG. 22 is a second diagram illustrating a functional configuration of the specific range output unit and a specific example of specific range output processing.
- FIG. 23 is a diagram illustrating a calculation example of coordinate information indicating the position of a user terminal in real space.
- FIG. 24 is a diagram illustrating an example of a functional configuration of a user terminal.
- FIG. 25 is a diagram illustrating an example of a functional configuration of a specific range input unit.
- FIG. 26 is a first diagram showing a specific example of center position determination processing.
- FIG. 27 is a second diagram showing a specific example of center position determination processing.
- FIG. 28 is a third diagram showing a specific example of center position determination processing.
- FIG. 29 is a fourth diagram showing a specific example of center position determination processing.
- FIG. 1 is a diagram showing an example of the system configuration of an air conditioning system.
- The air conditioning system 100 has an air conditioner 110, an imaging device 120, a display device 130, and a user terminal 140.
- The air conditioner 110 and the user terminal 140 are connected via wireless communication.
- The air conditioner 110 includes, for example, an indoor unit 111 of a domestic air conditioner, an indoor unit 112a of a commercial air conditioner, an edge device 112b (control device), an operation panel 112c, and the like.
- The indoor unit 111 performs air conditioning control of the real space. Specifically, the control device built into the indoor unit 111 operates based on:
- various instructions and various information transmitted from the user terminal 140 via wireless communication,
- a captured image (for example, an RGB image) transmitted from the imaging device 120, and
- data measured by the sensor of the indoor unit 111,
- to control the air conditioning of the real space. Note that the indoor unit 111 may incorporate the imaging device 120. In addition, the indoor unit 111 transmits various information generated by the operation of the built-in control device to the user terminal 140 or the display device 130 via wireless communication.
- The indoor unit 112a controls the air conditioning of the real space. Specifically, the edge device 112b (control device) operates based on:
- various instructions and various information transmitted from the operation panel 112c or from the user terminal 140 via wireless communication,
- a captured image transmitted from the imaging device 120, and
- data measured by the sensor of the indoor unit 112a,
- to control the air conditioning of the real space.
- The indoor unit 112a transmits various information generated by the operation of the edge device 112b to the user terminal 140 or the display device 130 via wireless communication.
- The operation panel 112c may also function as the display device 130.
- The control device built into the indoor unit 111 and the edge device 112b connected to the indoor unit 112a can perform air conditioning control for a specific range in the real space that differs from the control for the non-specific range (the range other than the specific range).
- The imaging device 120 includes, for example, an indoor security camera 121, a surveillance camera 122, and the like.
- The security camera 121, the surveillance camera 122, and the like photograph the interior of the real space in which the indoor unit 111 of the home air conditioner is installed and the interior of the real space in which the indoor unit 112a of the commercial air conditioner is installed, and transmit the captured images to the air conditioner 110.
- The display device 130 includes, for example, a monitor 131 and the like.
- The monitor 131 and the like are installed in the real space where the indoor unit 111 of the home air conditioner is installed or in the real space where the indoor unit 112a of the commercial air conditioner is installed, and display a display image generated by the control device built into the indoor unit 111 or by the edge device 112b connected to the indoor unit 112a.
- The user terminal 140 includes, for example, a smartphone 141, a remote controller 142 for a home air conditioner, a wearable device (not shown), and the like.
- The smartphone 141, the remote controller 142, the wearable device (not shown), or the like is held (or worn) by a user in the real space and transmits various instructions and various information input by the user to the air conditioner 110. This allows the user to register a specific range in the real space via the user terminal 140.
- The smartphone 141, the remote controller 142, the wearable device (not shown), and the like also notify the user of the registered specific range based on various information transmitted from the air conditioner 110. This allows the user to grasp, through the user terminal 140, which range in the real space is registered as the specific range.
- FIG. 2 is a diagram showing an example of a real space to which the air conditioning system is applied.
- The real space 210 is the room of the user 211.
- The air conditioning system 100 applied to the real space 210 has an indoor unit 111 with a built-in imaging device 120 and a smartphone 141.
- A control device (not shown) of the indoor unit 111 identifies each position in the real space 210 by coordinate information based on the coordinate axes 212.
- The real space 220 is the office of the users 221 and 222.
- The air conditioning system 100 applied to the real space 220 has an indoor unit 112a, an edge device 112b (not shown), an operation panel 112c, a security camera 121, a monitor 131, and a smartphone 141.
- The edge device 112b (not shown) identifies each position in the real space 220 by coordinate information based on the coordinate axes 223.
- FIG. 3 is a diagram showing an example of the hardware configuration of the control device and the hardware configuration of the user terminal.
- FIG. 3 shows an example of the hardware configuration of the control device of the air conditioner 110 .
- The control device of the air conditioner 110 has a processor 301, a memory 302, and an auxiliary storage device 303.
- The control device of the air conditioner 110 also has an I/F (interface) device 304 and a communication device 305. Each piece of hardware of the control device of the air conditioner 110 is interconnected via a bus 306.
- The processor 301 has various computing devices such as a CPU (Central Processing Unit).
- The processor 301 reads various programs (for example, a registration program to be described later) onto the memory 302 and executes them.
- The memory 302 has main storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory).
- The processor 301 and the memory 302 form a so-called computer (a "control unit" as hardware); the processor 301 executes the various programs read onto the memory 302, whereby the control device of the air conditioner 110 realizes various functions.
- The auxiliary storage device 303 is an example of a storage unit, and stores various programs and various data used when the various programs are executed by the processor 301.
- The I/F device 304 is a connection device that connects the control device of the air conditioner 110 with other devices.
- Other devices in this embodiment include the imaging device 120 and the display device 130.
- The other devices also include the radio wave transmitting/receiving device 310, which is an example of a sensor included in the indoor unit 111 or the indoor unit 112a of the air conditioner 110.
- The radio wave transmitting/receiving device 310 is, for example, a TOF depth sensor (an example of a three-dimensional position measurement sensor) that acquires radio wave information (a reflected signal of laser light) by scanning the real space with laser light.
- Coordinate information (a so-called depth map) indicating the positions of multiple objects in the real space 210 or 220 can be calculated by using the radio wave information acquired by the radio wave transmitting/receiving device 310 in any direction in the real space 210 or 220.
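The conversion from such a scan to the depth map can be sketched as follows (an illustration under the assumption that each beam reports a horizontal angle, a vertical angle, and a measured distance; none of these names appear in the disclosure).

```python
import math

# Sketch: convert one TOF scan sample (azimuth, elevation, distance) into
# Cartesian real-space coordinates; doing this for every beam yields the
# depth map of object positions described above.

def sample_to_xyz(azimuth, elevation, distance):
    x = distance * math.cos(elevation) * math.cos(azimuth)
    y = distance * math.cos(elevation) * math.sin(azimuth)
    z = distance * math.sin(elevation)
    return (x, y, z)

# A beam fired straight ahead (both angles zero) at a wall 4 m away:
print(sample_to_xyz(0.0, 0.0, 4.0))  # -> (4.0, 0.0, 0.0)
```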
- The communication device 305 is a communication device that transmits and receives information between the control device of the air conditioner 110 and the user terminal 140.
- FIG. 3 shows an example of the hardware configuration of the user terminal 140.
- The user terminal 140 has a processor 321, a memory 322, an auxiliary storage device 323, a communication device 324, an operation device 325, and a display device 326.
- The user terminal 140 also has an imaging device 327, an acceleration sensor 328, an audio input device 329, an audio output device 330, and a vibration device 331.
- Each piece of hardware of the user terminal 140 is interconnected via a bus 332.
- The processor 321 has various computing devices such as a CPU (Central Processing Unit).
- The processor 321 reads various programs (for example, an air conditioning operation program to be described later) onto the memory 322 and executes them.
- the memory 322 has main storage devices such as ROM (Read Only Memory) and RAM (Random Access Memory).
- the processor 321 and the memory 322 form a so-called computer (a "controller" as hardware), and the processor 321 executes various programs read into the memory 322, whereby the user terminal 140 realizes various functions.
- the auxiliary storage device 323 stores various programs and various data used when the various programs are executed by the processor 321 .
- the communication device 324 is a communication device that transmits and receives information between the user terminal 140 and the controller of the air conditioner 110 .
- the operation device 325 receives various instructions from the users 211, 221, 222, etc. to the user terminal 140.
- Display device 326 displays a display image based on various information transmitted from the control device of air conditioner 110 .
- the image capturing device 327 captures images of the real spaces 210 and 220, for example.
- the acceleration sensor 328 measures the acceleration of the user terminal 140 in three axial directions.
- the voice input device 329 inputs voices of the users 211, 221, 222, and the like.
- the audio output device 330 outputs sounds based on various information transmitted from the control device of the air conditioner 110, for example.
- the vibration device 331 outputs vibration based on various information transmitted from the control device of the air conditioner 110, for example.
- FIG. 4 is a diagram showing an example of operation modes of the air conditioning system and an operation screen.
- 4a in FIG. 4 shows the operation modes of the air conditioning system 100. As shown in 4a of FIG. 4, the air conditioning system 100 transitions to either the "normal operation mode" or the "specific operation mode" during operation.
- Normal operation mode is a mode in which the air conditioning system 100 is operated based on conventional operation information and air conditioning control of the real spaces 210 and 220 is performed.
- the “specific operation mode” is a mode in which air conditioning control is performed based on different operation information between the specific range and the non-specific range when operating the air conditioning system 100 .
- FIG. 4 shows an example of an operation screen displayed on the user terminal 140 when the air conditioning system 100 is operated in the specific operation mode.
- When the specific operation mode is selected, the screen transitions to the operation screen 400 for the specific operation mode.
- the operation screen 400 includes an operation information display field 411 , a specific range input field 413 , a “specific range display” button 414 , an “operation information input” button 415 , a “start” button 416 , and a “stop” button 417 .
- the operation information display field 411 displays the operation information set for a specific range. If there are multiple specific ranges, the operation information is displayed for each specific range.
- the specific range input field 413 includes instruction buttons used by the users 211, 221, 222, etc. to input the specific range.
- the "input range” button is a button that is pressed when instructing to start inputting a specific range.
- the "increase/decrease" button is a button that is pressed when entering the size of a specific range. Instead of pressing the "increase/decrease" button, a value indicating the size of the specific range may be entered directly into the "radius information" input field ("Rmm" in the figure) by a setting operation.
- the "target shooting” button is a button that is pressed when shooting an object to make the area of the object a specific range.
- When the "target shooting" button is pressed, the imaging device 327 is activated and shooting is started.
- the “delete range” button is a button that is pressed when deleting a specific range that has already been registered.
- the “range move” button is a button that is pressed to move a specific range that has already been registered.
- the "reverse” button is the operating information used when performing air conditioning control for a specific range that has already been registered, and the operating information used when performing air conditioning control for a non-specific range different from the specific range. This is a button that is pressed when replacing the .
- When the "specific range display" button 414 is pressed, the imaging device 327 is activated, the captured image captured by the imaging device 327 is displayed on the display screen, and display images based on various information are superimposed on it.
- the "input driving information” button 415 is a button that is pressed when the users 211, 221, 222, etc. input driving information. By pressing the "input driving information” button 415, items that can be input in the specific driving mode are displayed, and the users 211, 221, 222, etc. input driving information corresponding to each item.
- the operation information may be configured so that the same content is registered for each specific range, or so that different content can be registered for each position within the same specific range. Also, the history of operation information registered in the past may be read out and registered in another specific range. Furthermore, the operation information associated with each of the currently registered specific ranges may be configured so that it can be re-registered by swapping it between the specific ranges.
- a “start” button 416 is a button for an air conditioning instruction operation used when instructing the air conditioner 110 to start air conditioning control in a specific operation mode.
- a “stop” button 417 is a button for an air conditioning instruction operation used when instructing the air conditioner 110 to stop air conditioning control in a specific operation mode.
- FIG. 5 is a diagram showing an example of the functional configuration of the control device in the specific operation mode.
- A registration program is installed in the control device of the air conditioner 110, and by executing the program, the control device of the air conditioner 110 functions as a specific range input unit 501, an air conditioning control unit 502, and a specific range output unit 503.
- the specific range input unit 501 determines the specific range and the operation information based on various instructions and various information transmitted from the user terminal 140, the captured image transmitted from the imaging device 120, the data measured by the sensor of the indoor unit 111, and the like. Further, the specific range input unit 501 registers the determined specific range and operation information in the air conditioning control unit 502. Specifically, the specific range input unit 501 notifies the air conditioning control unit 502 of the determined specific range and operation information, and the air conditioning control unit 502 stores them in the specific range and operation information storage unit, a storage area of the auxiliary storage device 303 of the control device of the air conditioner 110 that is available to the air conditioning control unit 502.
- the air conditioning control unit 502 performs air conditioning control based on the specific range and operation information registered by the specific range input unit 501 .
- the specific range output unit 503 transmits to the display device 130 an image (specific range image) for notifying the users 211, 221, 222, etc. of the specific range registered in the air conditioning control unit 502.
- FIG. 6 is a first diagram showing a specific example of the functional configuration of the specific range input unit and specific range determination processing.
- the specific range input unit 501 has a captured image acquisition unit 601, a radio wave information acquisition unit 602, a position information acquisition unit 603, an operation content acquisition unit 604, a radius information specification unit 605, and a specific range determination unit 607.
- a captured image acquisition unit 601 acquires a captured image captured by the imaging device 120 and notifies the position information acquisition unit 603 of it.
- the radio wave information acquisition unit 602 acquires the radio wave information transmitted and received by the radio wave transmission/reception device 310 and notifies the position information acquisition unit 603 of it.
- the position information acquisition unit 603 calculates coordinate information (x, y, z) indicating the position of the user terminal 140 in the real space 210 or 220 based on the notified captured image and radio wave information.
- Specifically, the position information acquisition unit 603 calculates coordinate information indicating the positions of a plurality of objects in the real space based on the radio wave information, and from the calculated coordinate information indicating the positions of the plurality of objects, extracts the coordinate information indicating the position of the user terminal 140 recognized in the captured image.
- the operation content acquisition unit 604 acquires the operation content transmitted from the user terminal 140 .
- the example of FIG. 6 shows that the "range input” button is pressed and the “increase/decrease” button is pressed as the operation contents.
- When the operation content acquisition unit 604 acquires a range input instruction by pressing of the "input range" button, it notifies the specific range determination unit 607 of the range input instruction. Upon being notified of the range input instruction, the specific range determination unit 607 retains the coordinate information notified by the position information acquisition unit 603 as center position information (x0, y0, z0) indicating the position of the center point of the specific range.
- When the operation content acquisition unit 604 acquires tap information generated by tapping the "increase/decrease" button n times (by repeating the tap operation), it notifies the radius information specifying unit 605 of the tap information. Likewise, when the operation content acquisition unit 604 acquires press information generated by pressing the "increase/decrease" button for t seconds (by continuing the pressing operation for a predetermined time), it notifies the radius information specifying unit 605 of that information.
- the radius information specifying unit 605 calculates radius information (R) based on the tap information notified from the operation content acquiring unit 604 and notifies the specified range determining unit 607 of it.
- a radius information specifying unit 605 calculates radius information (R) proportional to the number of taps (the number of repetitions) or the long press time (continuous time).
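- As a sketch of this proportional relationship between the tap operation (or long press) and the radius information (R), the following could be used; the step and rate constants are illustrative assumptions, since the patent does not specify them.

```python
def radius_from_taps(tap_count, step=0.1):
    """Radius information R grows in proportion to the number of taps
    of the "increase/decrease" button (0.1 m per tap is assumed)."""
    return tap_count * step

def radius_from_hold(hold_seconds, rate=0.05):
    """R grows in proportion to how long the button is held
    (0.05 m per second is assumed)."""
    return hold_seconds * rate
```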
- the specific range determination unit 607 refers to the shape storage unit 610 and acquires shape information of the predetermined shape of the specific range.
- the example of FIG. 6 shows how spherical shape information is acquired by referring to the shape storage unit 610 .
- the shape information acquired by referring to the shape storage unit 610 is not limited to spherical shape information, and may be any other three-dimensional shape information such as conical shape information.
- the specific range determination unit 607 determines the specific range 621 based on the retained center position information (x0, y0, z0), the radius information (R) notified by the radius information specifying unit 605, and the spherical shape information acquired from the shape storage unit 610. That is, the specific range determination unit 607 determines the peripheral range based on the center position information as the specific range 621.
- the example in the middle of FIG. 6 shows how the specific range 620 is generated based on the tap information.
- the user 211 can adjust the size of the specific range by adjusting the number of taps or the long press time.
- By the user 211 tapping the "increase/decrease" button or pressing and holding it, the user terminal 140 may notify the user 211 with a predetermined stimulus (sound, light, or vibration) each time the radius information (R) changes by a certain amount. Thereby, the user 211 can grasp that the radius information (R) has changed by a certain amount.
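- Determining the peripheral range from the center position information, the radius information, and spherical shape information amounts to a point-in-sphere test, which might be sketched as follows; this covers only the spherical case, and other shapes held in the shape storage unit would need their own tests.

```python
def in_specific_range(point, center, radius):
    """True if `point` lies inside the spherical specific range defined by
    center position information (x0, y0, z0) and radius information R."""
    dx, dy, dz = (p - c for p, c in zip(point, center))
    return dx * dx + dy * dy + dz * dz <= radius * radius
```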
- FIG. 7 is a second diagram showing a specific example of the functional configuration of the specific range input unit and specific range determination processing.
- the specific range input unit 501 has a captured image acquisition unit 601 , radio wave information acquisition unit 602 , position information acquisition unit 603 , operation content acquisition unit 701 , and specific range determination unit 607 .
- the operation content acquisition unit 701 acquires the operation content transmitted from the user terminal 140 .
- the example of FIG. 7 shows a state in which the "range input” button is pressed and the radius information (R) is directly input in the "radius information” input field.
- When the operation content acquisition unit 701 acquires a range input instruction by pressing of the "input range" button, it notifies the specific range determination unit 607 of the range input instruction. Upon being notified of the range input instruction, the specific range determination unit 607 retains the coordinate information notified by the position information acquisition unit 603 as center position information (x0, y0, z0) of the specific range.
- When the operation content acquisition unit 701 acquires the radius information (R) directly entered in the "radius information" input field, it notifies the specific range determination unit 607 of the radius information (R).
- FIG. 8 is a third diagram showing a specific example of the functional configuration of the specific range input unit and specific range determination processing.
- the specific range input unit 501 has a captured image acquisition unit 601, a radio wave information acquisition unit 602, a position information acquisition unit 603, an operation content acquisition unit 801, a center position and radius information calculation unit 802, and a specific range determination unit 803.
- the captured image acquisition unit 601 to the position information acquisition unit 603 have already been explained using FIG. 6, so the explanation is omitted here.
- the position information acquisition unit 603 notifies the center position and radius information calculation unit 802 of the calculated coordinate information.
- the operation content acquisition unit 801 acquires the operation content transmitted from the user terminal 140 and notifies the center position and radius information calculation unit 802 of it.
- the example of FIG. 8 shows, as the operation content, that the range input instruction is notified by pressing the "range input” button.
- After pressing the "range input" button, the user 211 performs a movement operation with the user terminal 140, drawing an arc locus in the real space 210 to specify the position and size of the specific range.
- the center position and radius information calculation unit 802 obtains the coordinate information ((x1, y1, z1) to (xm, ym, zm)) of the user terminal 140 along the arc locus, as calculated by the position information acquisition unit 603.
- the center position and radius information calculation unit 802 calculates the center point of the obtained coordinate information and notifies the specific range determination unit 803 of it as center position information (x0, y0, z0). Furthermore, the center position and radius information calculation unit 802 calculates the maximum distance from the center point to the obtained coordinate information and notifies the specific range determination unit 803 of the calculated maximum value as radius information (R).
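- One plausible way to derive the center point and the maximum value from the arc trajectory is to take the centroid of the sampled coordinates and the largest distance from that centroid; the patent does not fix the exact calculation, so this is only a sketch under that assumption.

```python
import math

def center_and_radius(trajectory):
    """From trajectory coordinates (x1, y1, z1) .. (xm, ym, zm), return
    center position information (x0, y0, z0) as the centroid and radius
    information R as the maximum distance from the centroid."""
    m = len(trajectory)
    cx = sum(p[0] for p in trajectory) / m
    cy = sum(p[1] for p in trajectory) / m
    cz = sum(p[2] for p in trajectory) / m
    r = max(math.dist((cx, cy, cz), p) for p in trajectory)
    return (cx, cy, cz), r
```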
- the specific range determination unit 803 refers to the shape storage unit 610 and acquires shape information of the predetermined shape of the specific range.
- the example of FIG. 8 shows how spherical shape information is acquired by referring to the shape storage unit 610 .
- the shape information acquired by referring to the shape storage unit 610 is not limited to spherical shape information, and may be any other three-dimensional shape information such as conical shape information.
- the specific range determination unit 803 determines the specific range 621 based on the center position information (x0, y0, z0) and the radius information (R) notified from the center position and radius information calculation unit 802, and the spherical shape information acquired from the shape storage unit 610. That is, the specific range determination unit 803 determines the peripheral range based on the center position information as the specific range 621.
- FIG. 9 is a fourth diagram showing a functional configuration of the specific range input unit and a specific example of specific range determination processing.
- the specific range input unit 501 has a captured image acquisition unit 601, a radio wave information acquisition unit 602, a position information acquisition unit 603, an operation content acquisition unit 801, a center position and radius information calculation unit 802, a projection direction determination unit 901, and a specific range determination unit 902.
- the projection direction determination unit 901 notifies the specific range determination unit 902 of the imaging direction when the imaging device 120 images the user terminal 140 as the projection direction.
- the specific range determination unit 902 refers to the shape storage unit 610 and acquires shape information of the predetermined shape of the specific range.
- the example of FIG. 9 shows how spherical shape information is acquired by referring to the shape storage unit 610 .
- the specific range determination unit 902 calculates the specific range based on the center position information (x0, y0, z0) and the radius information (R) notified from the center position and radius information calculation unit 802, and the spherical shape information acquired from the shape storage unit 610. Furthermore, based on the projection direction notified from the projection direction determination unit 901, the specific range determination unit 902 calculates post-projection center position information (x'0, y'0, z'0) by shifting the position of the center point by a predetermined amount in the projection direction while maintaining the size and shape of the calculated specific range. The specific range determination unit 902 thereby determines the projection area as the specific range 621; that is, it determines the peripheral range based on the center position information as the specific range 621.
- Alternatively, the coordinate information ((x1, y1, z1) to (xm, ym, zm)) may first be projected based on the imaging direction in which the imaging device 120 captured the image. Then, the projection area specified by the coordinate information after projection (an area specified based on the center position information, the radius information (R), and the spherical shape information) is determined as the specific range.
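- The projection step, shifting the center point by a predetermined amount along the projection direction while keeping the size and shape, might be sketched as follows; treating the direction as a unit vector and the shift amount as a free parameter are assumptions.

```python
def project_center(center, direction, amount):
    """Shift center position information (x0, y0, z0) by `amount` along
    the projection direction (assumed to be a unit vector), yielding the
    post-projection center (x'0, y'0, z'0)."""
    return tuple(c + amount * d for c, d in zip(center, direction))
```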
- FIG. 10 is a fifth diagram showing a specific example of the functional configuration of the specific range input unit and specific range determination processing.
- the specific range input unit 501 has a captured image acquisition unit 601, a radio wave information acquisition unit 602, a position information acquisition unit 603, an operation content acquisition unit 801, a center position and radius information calculation unit 802, a projection direction determination unit 1001, and a specific range determination unit 1002.
- the projection direction determination unit 1001 determines a predetermined direction according to the positional relationship between the user terminal 140 and the user 211 as the projection direction, and notifies the specified range determination unit 1002 of the determined projection direction.
- the specific range determination unit 1002 refers to the shape storage unit 610 and acquires shape information of the predetermined shape of the specific range.
- the example of FIG. 10 shows how spherical shape information is acquired by referring to the shape storage unit 610 .
- the specific range determination unit 1002 calculates the specific range based on the center position information (x0, y0, z0) and the radius information (R) notified from the center position and radius information calculation unit 802, and the spherical shape information acquired from the shape storage unit 610. Furthermore, based on the projection direction notified from the projection direction determination unit 1001, the specific range determination unit 1002 calculates post-projection center position information (x'0, y'0, z'0) by shifting the position of the center point by a predetermined amount in the projection direction while maintaining the size and shape of the calculated specific range. The specific range determination unit 1002 thereby determines the projection area as the specific range 621; that is, it determines the peripheral range based on the center position information as the specific range 621.
- In the example above, the specific range is calculated and then projected, but the specific range may instead be calculated after the trajectory based on the movement operation on the user terminal 140 is projected.
- In that case, the coordinate information ((x1, y1, z1) to (xm, ym, zm)) specified from the captured image captured by the imaging device 120 is projected based on the positional relationship between the user 211 and the user terminal 140. Then, the projection area specified by the coordinate information after projection (an area specified based on the center position information, the radius information (R), and the spherical shape information) is determined as the specific range.
- FIG. 11 is a sixth diagram showing a specific example of the functional configuration of the specific range input unit and specific range determination processing.
- the specific range input unit 501 has a captured image acquisition unit 601, a radio wave information acquisition unit 602, a position information acquisition unit 603, an operation content acquisition unit 1101, a target area calculation unit 1102, and a specific range determination unit 1103.
- the operation content acquisition unit 1101 acquires the operation content transmitted from the user terminal 140 .
- As the operation contents, first the "range input" button is pressed, and then the "target shooting" button is pressed.
- the operation content acquisition unit 1101 acquires a range input instruction when the "range input” button is pressed.
- When shooting is instructed by pressing the shutter button (not shown), the operation content acquisition unit 1101 acquires the shooting instruction and notifies the target area calculation unit 1102 of it.
- the target area calculation unit 1102 retains the coordinate information notified from the position information acquisition unit 603 when the operation content acquisition unit 1101 notifies the shooting instruction.
- the position coordinates held at this time become the reference position information (x b , y b , z b ) which is the coordinate information of the user terminal 140 at the time of shooting, which is used when calculating the coordinate information of the target area.
- the operation content acquisition unit 1101 acquires a captured image from the user terminal 140 by pressing the shutter button (not shown), it notifies the target area calculation unit 1102 of the captured image.
- Based on the reference position information (xb, yb, zb), the target area calculation unit 1102 identifies the target object included in the captured image and calculates the coordinate information of the area of the target object (target area) in the real space 210.
- the target area calculation unit 1102 refers to the layout information storage unit 1104 when specifying the target object included in the captured image based on the reference position information (x b , y b , z b ).
- It is assumed that the layout information storage unit 1104 stores in advance layout information in which the object name of each object placed in the real space 210 (for example, a foliage plant, a chair, etc.) is associated with the coordinate information (shape data such as three-dimensional data) of the object's area.
- the target area calculation unit 1102 identifies, from the layout information, the object name that corresponds to the object included in the captured image and that is associated with coordinate information near the reference position information (xb, yb, zb). Then, the target area calculation unit 1102 notifies the specific range determination unit 1103 of the coordinate information of the object area associated with the identified object name as the coordinate information of the area (target area) of the photographed object.
- the specific range determination unit 1103 determines the coordinate information of the target region notified by the target region calculation unit 1102 as the specific range 621 .
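- The lookup from the layout information, picking the object whose stored coordinates lie nearest the reference position (xb, yb, zb), could be sketched like this; representing each object area by a single representative coordinate is a simplifying assumption, since the patent stores full shape data.

```python
import math

def find_target_object(layout, reference_position):
    """Return the object name whose representative coordinates in the
    layout information are nearest the reference position information."""
    return min(layout, key=lambda name: math.dist(layout[name], reference_position))
```

For example, with a layout mapping "chair" to (1.0, 0.0, 0.0) and "foliage plant" to (5.0, 5.0, 0.0), a terminal shooting from (0.8, 0.0, 0.0) would resolve to the chair.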
- FIG. 12 is a diagram showing a specific example of the functional configuration of the specific range input unit and specific range deletion processing.
- the specific range input unit 501 has a captured image acquisition unit 601, a radio wave information acquisition unit 602, a position information acquisition unit 603, an operation content acquisition unit 1201, a deletion target determination unit 1202, and a specific range deletion unit 1203.
- the captured image acquisition unit 601 to the position information acquisition unit 603 have already been explained using FIG. 6, so the explanation is omitted here.
- the position information acquisition unit 603 notifies the deletion target determination unit 1202 of the calculated coordinate information.
- the operation content acquisition unit 1201 acquires the operation content transmitted from the user terminal 140 and notifies the deletion target determination unit 1202 of it.
- the example of FIG. 12 shows, as the operation content, that the "delete range” button is pressed, and the range deletion instruction is notified.
- When the range deletion instruction is notified, the deletion target determination unit 1202 retains the coordinate information notified by the position information acquisition unit 603 as deletion position information (xd, yd, zd).
- the deletion target determination unit 1202 reads specific range information already registered in the air conditioning control unit 502 and determines a specific range including the deletion position information (x d , y d , z d ). Furthermore, the deletion target determination unit 1202 notifies the specific range deletion unit 1203 of the determined specific range as a deletion target specific range.
- the specific range deletion unit 1203 deletes the specific range notified by the deletion target determination unit 1202 from the specific range information already registered in the air conditioning control unit 502 .
- In this way, the specific range deletion unit 1203 deletes a specific range when the "delete range" button is pressed while the user terminal 140 is located within that specific range.
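- Determining the deletion target, i.e. finding the registered specific range that contains the deletion position information (xd, yd, zd), might be sketched as follows; spherical ranges represented as (center, radius) pairs are an illustrative assumption.

```python
import math

def find_deletion_target(registered_ranges, deletion_position):
    """Return the first (center, radius) specific range containing the
    deletion position, or None if the terminal is outside every range."""
    for center, radius in registered_ranges:
        if math.dist(center, deletion_position) <= radius:
            return (center, radius)
    return None
```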
- FIG. 13 is a diagram showing a specific example of the functional configuration of the specific range input unit and specific range movement processing.
- the specific range input unit 501 has a captured image acquisition unit 601, a radio wave information acquisition unit 602, a position information acquisition unit 603, an operation content acquisition unit 1301, a movement target determination unit 1302, and a specific range update unit 1303.
- the captured image acquisition unit 601 to the position information acquisition unit 603 have already been explained using FIG. 6, so the explanation is omitted here.
- the position information acquisition unit 603 notifies the movement target determination unit 1302 of the calculated coordinate information.
- the operation content acquisition unit 1301 acquires the operation content transmitted from the user terminal 140 and notifies the movement target determination unit 1302 of it.
- the example of FIG. 13 shows, as the operation contents, that a range movement start instruction and a range movement end instruction are notified by pressing the "range move" button.
- When the range movement start instruction is notified, the movement target determination unit 1302 retains the coordinate information notified from the position information acquisition unit 603 as pre-movement position information (xt, yt, zt).
- the movement target determination unit 1302 reads the specific range information already registered in the air conditioning control unit 502 and determines the specific range that includes the pre-movement position information (xt, yt, zt). Further, the movement target determination unit 1302 acquires pre-movement center position information (x0, y0, z0), which is the center position information of the determined specific range.
- When the movement target determination unit 1302 receives a range movement end instruction from the operation content acquisition unit 1301 after the user terminal 140 has been moved, it retains the coordinate information notified from the position information acquisition unit 603 as post-movement position information (x't, y't, z't).
- the movement target determination unit 1302 adds the difference between the post-movement position information (x't, y't, z't) and the pre-movement position information (xt, yt, zt) to the pre-movement center position information (x0, y0, z0). As a result, the movement target determination unit 1302 calculates post-movement center position information (x'0, y'0, z'0) and notifies the specific range update unit 1303 of it.
- Using the notified post-movement center position information (x'0, y'0, z'0), the specific range update unit 1303 updates the specific range information already registered in the air conditioning control unit 502. In this way, the specific range update unit 1303 moves the specific range targeted for movement.
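- The movement calculation, adding the difference between the post-movement and pre-movement terminal positions to the pre-movement center position information, can be sketched as:

```python
def move_center(center_before, pos_before, pos_after):
    """Post-movement center position information (x'0, y'0, z'0) is the
    pre-movement center (x0, y0, z0) plus the displacement of the user
    terminal from (xt, yt, zt) to (x't, y't, z't)."""
    return tuple(c + (a - b) for c, b, a in zip(center_before, pos_before, pos_after))
```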
- FIG. 14 is a diagram showing a specific example of the functional configuration of the specific range input unit and the operation information input process.
- the specific range input unit 501 has a captured image acquisition unit 601, a radio wave information acquisition unit 602, a position information acquisition unit 603, an operation content acquisition unit 1401, a driving information acquisition unit 1402, and a driving information setting unit 1403.
- the captured image acquisition unit 601 to the position information acquisition unit 603 have already been explained using FIG. 6, so the explanation is omitted here.
- the position information acquisition unit 603 notifies the driving information acquisition unit 1402 of the calculated coordinate information.
- the operation content acquisition unit 1401 acquires the operation content transmitted from the user terminal 140 and notifies the driving information acquisition unit 1402 of it.
- the example of FIG. 14 shows a state in which a driving information input instruction is notified as the operation content by pressing the "input driving information" button 415. Furthermore, the example of FIG. 14 shows how the input driving information is notified as the operation content.
- when the driving information input instruction is notified from the operation content acquisition unit 1401, the driving information acquisition unit 1402 holds the coordinate information notified from the position information acquisition unit 603 as the input position information (xi, yi, zi).
- the driving information acquisition unit 1402 reads the specific range information already registered in the air conditioning control unit 502 and determines a specific range that includes the input position information (xi, yi, zi). Furthermore, the driving information acquisition unit 1402 notifies the driving information setting unit 1403 of the determined specific range together with the notified driving information.
- the driving information setting unit 1403 registers the driving information notified from the driving information acquiring unit 1402 as the driving information of the notified specific range in association with the specific range.
- in this way, when the user terminal 140 is located within the specific range in which the driving information is to be registered and the "input driving information" button is pressed to input the driving information, the driving information setting unit 1403 registers the driving information in association with that specific range.
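the lookup-then-register behaviour described above can be sketched as follows, assuming spherical specific ranges stored as dictionaries; the field names are illustrative, not the actual data layout:

```python
import math

def find_containing_range(ranges, pos):
    """Return the first registered specific range (assumed spherical) whose
    sphere contains pos, or None if pos lies in the non-specific range."""
    for rng in ranges:
        if math.dist(pos, rng["center"]) <= rng["radius"]:
            return rng
    return None

def register_driving_info(ranges, input_pos, driving_info):
    """Associate driving information with the specific range that contains
    the input position information (xi, yi, zi)."""
    rng = find_containing_range(ranges, input_pos)
    if rng is not None:
        rng["driving_info"] = driving_info
    return rng
```

when the terminal position falls outside every registered range, nothing is registered and `None` is returned.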
- FIG. 15 is a diagram showing a specific example of the functional configuration of the specific range input unit and the driving information reversing process.
- the specific range input unit 501 has a captured image acquisition unit 601, a radio wave information acquisition unit 602, a position information acquisition unit 603, an operation content acquisition unit 1501, a reversing instruction acquisition unit 1502, and a driving information update unit 1503.
- the captured image acquisition unit 601 to the position information acquisition unit 603 have already been described with reference to FIG. 6, so their description is omitted here.
- the position information acquisition unit 603 notifies the reversal instruction acquisition unit 1502 of the calculated coordinate information.
- the operation content acquisition unit 1501 acquires the operation content transmitted from the user terminal 140 and notifies the reversing instruction acquisition unit 1502 of it.
- the example of FIG. 15 shows, as the operation content, that a reverse instruction is notified by pressing the "reverse" button.
- when the reversing instruction is notified from the operation content acquisition unit 1501, the reversing instruction acquisition unit 1502 holds the coordinate information notified from the position information acquisition unit 603 as the reversing position information (xr, yr, zr).
- the reversing instruction acquisition unit 1502 reads the specific range information already registered in the air conditioning control unit 502 and determines a specific range that includes the reversing position information (xr, yr, zr). Further, the reversing instruction acquisition unit 1502 reads the driving information associated with the determined specific range and the driving information associated with the non-specific range, and notifies the driving information update unit 1503 of them.
- the driving information update unit 1503 swaps the driving information associated with the specific range notified by the reversing instruction acquisition unit 1502 with the driving information associated with the non-specific range, and registers the swapped driving information in the air conditioning control unit 502.
- in this way, the driving information update unit 1503 replaces the driving information of the specific range with the driving information of the non-specific range.
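the reversal itself is a plain swap of the two associated driving-information records; a minimal sketch with hypothetical dictionaries:

```python
def reverse_driving_info(specific_rng, non_specific):
    """Swap the driving information associated with a specific range with the
    driving information associated with the non-specific range, in place."""
    specific_rng["driving_info"], non_specific["driving_info"] = (
        non_specific["driving_info"], specific_rng["driving_info"])
```

after the swap, both records would be re-registered in the air conditioning control unit so the control target flips between inside and outside the range.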
- FIG. 16 is a first diagram showing a functional configuration of a specific range output unit and a specific example of specific range output processing.
- the specific range output unit 503 has a specific range information acquisition unit 1601 , a captured image acquisition unit 1602 and a specific range image generation unit 1603 .
- the specific range information acquisition unit 1601 reads the specific range information registered in the air conditioning control unit 502 and notifies the specific range image generation unit 1603.
- a captured image acquisition unit 1602 acquires a captured image captured by the imaging device 120 and notifies the specific range image generation unit 1603 of it.
- the specific range image generation unit 1603 generates an image showing the specific range based on the specific range information notified from the specific range information acquisition unit 1601. Further, the specific range image generation unit 1603 superimposes the generated image indicating the specific range on the captured image notified from the captured image acquisition unit 1602 at a position corresponding to the specific range information, thereby generating a specific range image. Generate.
- the specific range image generating unit 1603 displays the captured image or the specific range image on the monitor 131 by transmitting the captured image or the generated specific range image to the monitor 131 .
- FIG. 16 shows how a captured image 1611 is displayed on the monitor 131 and how a specific range image 1612 is displayed on the monitor 131 .
- FIGS. 17 and 18 are first and second flowcharts showing the flow of the specific range registration process in the specific operation mode.
- in step S1701, the specific range input unit 501 calculates the coordinate information of the user terminal 140.
- in step S1702, the specific range input unit 501 determines whether or not a range input instruction has been acquired. If it is determined in step S1702 that a range input instruction has not been acquired (NO in step S1702), the process proceeds to step S1706.
- if it is determined in step S1702 that a range input instruction has been acquired (YES in step S1702), the process proceeds to step S1703.
- in step S1703, the specific range input unit 501 acquires the operation content for the user terminal 140.
- in step S1704, the specific range input unit 501 determines the specific range based on the coordinate information of the user terminal 140 and the details of the operation on the user terminal 140, and generates specific range information.
- in step S1705, the specific range input unit 501 registers the generated specific range information in the air conditioning control unit 502.
- in step S1706, the specific range input unit 501 determines whether or not a range deletion instruction has been acquired. If it is determined in step S1706 that a range deletion instruction has not been acquired (NO in step S1706), the process proceeds to step S1709.
- if it is determined in step S1706 that a range deletion instruction has been acquired (YES in step S1706), the process proceeds to step S1707.
- in step S1707, the specific range input unit 501 determines the specific range to be deleted based on the coordinate information of the user terminal 140.
- in step S1708, the specific range input unit 501 deletes the specific range information of the determined specific range.
- in step S1709, the specific range input unit 501 determines whether or not a range movement start instruction and a range movement end instruction have been acquired. If it is determined in step S1709 that they have not been acquired (NO in step S1709), the process proceeds to step S1801 in FIG. 18.
- if it is determined in step S1709 that the range movement start instruction and range movement end instruction have been acquired (YES in step S1709), the process proceeds to step S1710.
- in step S1710, the specific range input unit 501 determines the specific range to be moved based on the coordinate information of the user terminal 140.
- in step S1711, the specific range input unit 501 moves the determined specific range and registers the specific range information of the specific range after movement.
- in step S1801, the specific range input unit 501 determines whether or not a driving information input instruction has been acquired. If it is determined in step S1801 that a driving information input instruction has not been acquired (NO in step S1801), the process proceeds to step S1804.
- if it is determined in step S1801 that a driving information input instruction has been acquired (YES in step S1801), the process proceeds to step S1802.
- in step S1802, the specific range input unit 501 determines the specific range for which driving information is to be registered based on the coordinate information of the user terminal 140.
- in step S1803, the specific range input unit 501 registers the driving information in association with the determined specific range.
- in step S1804, the specific range input unit 501 determines whether or not a reversing instruction has been acquired. If it is determined in step S1804 that a reversing instruction has not been acquired (NO in step S1804), the process proceeds to step S1807.
- if it is determined in step S1804 that a reversing instruction has been acquired (YES in step S1804), the process proceeds to step S1805.
- in step S1805, the specific range input unit 501 determines the specific range in which the driving information is to be reversed based on the coordinate information of the user terminal 140.
- in step S1806, the specific range input unit 501 swaps the driving information associated with the determined specific range and the driving information associated with the non-specific range, and registers the swapped driving information.
- in step S1807, the specific range input unit 501 determines whether or not to end the specific operation mode. If it is determined in step S1807 to continue the specific operation mode (NO in step S1807), the process returns to step S1701 in FIG. 17.
- if it is determined in step S1807 to end the specific operation mode (YES in step S1807), the specific range registration process in the specific operation mode ends.
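the S1701-S1807 flow above can be condensed into a small dispatch loop; the event tuples and dictionary fields below are hypothetical stand-ins for the acquired instructions and the specific range information, not the actual protocol:

```python
def run_specific_operation_mode(events, ranges):
    """Minimal dispatch over the S1701-S1807 flow. `events` stands in for the
    instructions acquired from the user terminal."""
    for instruction, payload in events:
        if instruction == "range_input":          # S1702-S1705
            ranges.append(payload)                # register specific range info
        elif instruction == "range_delete":       # S1706-S1708
            ranges[:] = [r for r in ranges if r["id"] != payload]
        elif instruction == "range_move":         # S1709-S1711
            range_id, delta = payload
            for r in ranges:
                if r["id"] == range_id:
                    r["center"] = tuple(c + d for c, d in zip(r["center"], delta))
        elif instruction == "driving_input":      # S1801-S1803
            range_id, info = payload
            for r in ranges:
                if r["id"] == range_id:
                    r["driving_info"] = info
        elif instruction == "end":                # S1807: end specific operation mode
            break
    return ranges
```

each branch mirrors one decision diamond of the flowchart; unrecognized instructions simply fall through, which matches the NO paths.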
- FIG. 19 is a diagram showing an example of air conditioning control by an air conditioning system.
- the example of FIG. 19 shows how the driving information is registered so that the blown air hits the specific range 621, and how the driving information is registered so that the blown air does not hit the specific ranges 1901 and 1902.
- in this way, the user 211 can freely change the size and shape of the specific range to be controlled, and a plurality of specific ranges can be registered for one user 211.
- as described above, the air conditioning system 100 according to the first embodiment has an air conditioner capable of air conditioning control for a specific range in real space and a user terminal, and registers the specific range determined based on the position information of the user terminal in real space and the details of the operation on the user terminal.
- in the first embodiment described above, the specific range output unit was described as being implemented in the control device of the air conditioner 110.
- however, the specific range output unit may be implemented in the user terminal 140, for example.
- the second embodiment will be described below, focusing on differences from the first embodiment.
- FIG. 20 is a diagram showing an example of functional configurations of the control device and the user terminal in the specific operation mode.
- the control device of the air conditioner 110 functions as a specific range input unit 501, an air conditioning control unit 502, and a transmission unit 2001 by executing a registration program.
- the transmission unit 2001 transmits to the user terminal 140 various types of information for notifying the users 211, 221, 222, etc. of the specific range registered by the air conditioning control unit 502.
- the user terminal 140 functions as the specific range output unit 2002 by executing the air conditioning operation program.
- the specific range output unit 2002 performs processing for notifying the users 211, 221, 222, etc. of the specific range based on the various information transmitted from the transmission unit 2001.
- FIG. 21 is a diagram illustrating functional configurations of a transmission unit and a specific range output unit, and a specific example of specific range output processing.
- the transmission unit 2001 has a captured image acquisition unit 2101, a radio wave information acquisition unit 2102, a position information acquisition unit 2103, a position information transmission unit 2104, a specific range information acquisition unit 2105, and a specific range transmission unit 2106.
- a captured image acquisition unit 2101 acquires a captured image captured by the imaging device 120 and notifies the position information acquisition unit 2103 of it.
- the radio wave information acquisition unit 2102 acquires the radio wave information received by the radio wave transmission/reception device 310 and notifies the position information acquisition unit 2103 of it.
- the position information acquisition unit 2103 calculates coordinate information (x, y, z) indicating the position of the user terminal 140 in the real space 210 or 220 based on the notified captured image and radio wave information.
- the position information transmission section 2104 transmits the coordinate information (x, y, z) calculated by the position information acquisition section 2103 to the specific range output section 2002 of the user terminal 140 .
- the specific range information acquisition unit 2105 reads the specific range information registered in the air conditioning control unit 502 and notifies the specific range transmission unit 2106 of it.
- the specific range transmission section 2106 transmits the specific range information notified by the specific range information acquisition section 2105 to the specific range output section 2002 .
- the specific range output unit 2002 has a position information acquisition unit 2111, a specific range information acquisition unit 2112, a specific range image generation unit 2113, a captured image acquisition unit 2114, and a display control unit 2115.
- the position information acquisition unit 2111 acquires the coordinate information (x, y, z) transmitted from the position information transmission unit 2104 and notifies the specific range image generation unit 2113 of it.
- the specific range information acquisition unit 2112 acquires the specific range information transmitted from the specific range transmission unit 2106 and notifies the specific range image generation unit 2113 of it.
- based on the coordinate information (x, y, z) notified from the position information acquisition unit 2111, the specific range image generation unit 2113 identifies where in the real space the user terminal 140 is currently located. Further, the specific range image generation unit 2113 identifies which direction the user terminal 140 is currently facing (the attitude of the user terminal 140) based on the output of the acceleration sensor 328. Further, the specific range image generation unit 2113 calculates the coordinate information of each position within the shooting range when the imaging device 327 of the user terminal 140 shoots from the identified position and direction (coordinate information and attitude information). Furthermore, the specific range image generation unit 2113 applies the specific range information notified from the specific range information acquisition unit 2112 to the coordinate information of each position within the shooting range.
- thereby, the specific range image generation unit 2113 can generate an image of the specific range, oriented according to the attitude of the user terminal 140, at the corresponding position within the captured image when the imaging device 327 of the user terminal 140 starts capturing.
- the specific range image generation unit 2113 notifies the display control unit 2115 of the generated specific range image (three-dimensional image).
- when the "display specific range" button 414 on the operation screen 400 of the user terminal 140 is pressed, a specific range display instruction is given, and image capture by the imaging device 327 is started, the captured image acquisition unit 2114 acquires the captured image captured by the imaging device 327. Also, the captured image acquisition unit 2114 notifies the display control unit 2115 of the acquired captured image.
- the display control unit 2115 generates a display image by superimposing the specific range image notified from the specific range image generation unit 2113 on the captured image notified from the captured image acquisition unit 2114 . Also, the display control unit 2115 displays the generated display image on the display screen of the display device 326 of the user terminal 140 .
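the geometry behind placing the specific range image at the right spot in the captured image can be sketched as a pinhole projection from the terminal's position and attitude. The camera intrinsics (fx, fy, cx, cy) and the rotation matrix are assumed inputs, not values given in the description:

```python
def project_point(point_w, cam_pos, rot, fx, fy, cx, cy):
    """Project a world-space point (e.g. a specific range's center) into the
    terminal's captured image with a pinhole model. `cam_pos` is the terminal
    position from the coordinate information, `rot` a 3x3 world-to-camera
    rotation derived from the attitude. Returns pixel coordinates, or None
    if the point is behind the camera."""
    d = [pw - cp for pw, cp in zip(point_w, cam_pos)]
    p = [sum(rot[i][j] * d[j] for j in range(3)) for i in range(3)]
    if p[2] <= 0:
        return None   # outside the shooting range (behind the imaging device)
    return (fx * p[0] / p[2] + cx, fy * p[1] / p[2] + cy)
```

applying this to each point of the specific range yields the superimposition positions used to build the display image.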
- the example of FIG. 21 shows a state in which the user 211, at the position specified by the coordinate information (x, y, z), presses the "display specific range" button 414 on the operation screen 400 of the user terminal 140, whereby a specific range display instruction is given and shooting is started.
- FIG. 21 shows a case where specific range information has been registered for a foliage plant 2120 arranged in the real space 210 .
- a display image 2122 in which the specific range image is superimposed on the photographed image 2121 photographed from the side of the foliage plant 2120 is displayed.
- a display image 2132 in which a specific range image is superimposed on a photographed image 2131 photographed from directly above the foliage plant 2120 is displayed.
- in this way, the user 211 can grasp, via the user terminal 140, which position in the real space 210 is registered as the specific range.
- in this way, the air conditioning system 100 according to the second embodiment has, in addition to the functions described in the first embodiment, a function of displaying a three-dimensional image showing the specific range, generated based on the position information and attitude information of the user terminal in real space, superimposed on the photographed image captured by performing the photographing operation on the user terminal.
- according to the second embodiment, it is possible to obtain the same effects as those of the first embodiment, and in addition, the user becomes able to grasp the registered specific range via the user terminal 140.
- the notification method for notifying the user of the specific range information is not limited to this.
- for example, the user terminal 140 may notify the user whether the user terminal 140 is positioned within a specific range or outside a specific range (within a non-specific range) in real space by outputting a predetermined stimulus (sound, light, or vibration).
- the third embodiment will be described below, focusing on the differences from the first and second embodiments.
- FIG. 22 is a second diagram illustrating a functional configuration of the specific range output unit and a specific example of specific range output processing.
- the specific range output unit 2002 has a position information acquisition unit 2111, a specific range information acquisition unit 2112, a positional relationship determination unit 2201, and an output device control unit 2202.
- the position information acquisition unit 2111 has the same function as the position information acquisition unit 2111 described with reference to FIG. 21, and notifies the positional relationship determination unit 2201 of the acquired coordinate information (x, y, z).
- the specific range information acquisition unit 2112 has the same function as the specific range information acquisition unit 2112 described with reference to FIG. 21, and notifies the positional relationship determination unit 2201 of the acquired specific range information.
- the positional relationship determination unit 2201 identifies where the user terminal 140 is currently located in real space based on the coordinate information (x, y, z) notified from the position information acquisition unit 2111 . Also, the positional relationship determination unit 2201 determines the positional relationship between the specific range and the user terminal 140 based on the specific range information notified from the specific range information acquisition unit 2112 . Furthermore, the positional relationship determination unit 2201 notifies the output device control unit 2202 of the determined positional relationship.
- the output device control unit 2202 controls the output device (the display device 326, the audio output device 330, or the vibration device 331) based on the positional relationship notified by the positional relationship determination unit 2201, and notifies the user by outputting light, sound, or vibration. Thereby, the user can grasp the positional relationship between the specific range and the user terminal 140.
- the output device control unit 2202 may control the display device 326 to output light, or may control an LED (not shown) to output light.
- the example of FIG. 22 shows the user 211 holding the user terminal 140 at the position specified by the coordinate information (x, y, z). Also, the example of FIG. 22 shows how a specific range 621 is registered.
- in the case of FIG. 22, the output device control unit 2202 controls the output device so as to output any one of light, sound, and vibration when the user terminal 140 is positioned within the specific range 621, and so as not to output any light, sound, or vibration when the user terminal 140 is positioned outside the specific range 621.
- note that the output device control unit 2202 may control the output such that the magnitude of vibration, the sound volume, or the image content differs, or such that the color or amount of light differs, depending on whether the user terminal 140 is positioned within the specific range 621 or outside the specific range 621.
- alternatively, the output device control unit 2202 may control the output so as to switch stimuli, for example outputting light when the user terminal 140 is positioned within the specific range 621 and outputting sound or vibration when the user terminal 140 is positioned outside the specific range 621.
- alternatively, the output device control unit 2202 may control the output when the position corresponding to the coordinate information (x, y, z) changes from within the specific range 621 to outside the specific range 621 (within the non-specific range). Similarly, the output device control unit 2202 may control the output when the position corresponding to the coordinate information (x, y, z) changes from outside the specific range 621 (within the non-specific range) to within the specific range 621.
- for example, the output device control unit 2202 may perform control to increase the output of any one of sound, light, and vibration when the user terminal 140 approaches the boundary surface of the specific range 621. Further, for example, the output device control unit 2202 may perform control to reduce the output of any one of sound, light, and vibration when the user terminal 140 approaches the boundary surface of the specific range 621 from within the specific range 621.
- further, when the user terminal 140 is positioned within the specific range 621 and any one of light, sound, and vibration is output, the output device control unit 2202 may control the output according to the driving information associated with that specific range.
- for example, when different driving information is registered for each of a plurality of specific ranges, the magnitude of vibration, the sound volume, or the image content may change, or the color or amount of light may change, depending on the driving information of the specific range in which the user terminal 140 is positioned.
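the in-range/out-of-range stimulus decision, including the boundary-proximity variant above, can be sketched as follows. A spherical range is assumed and the field names are illustrative:

```python
import math

def stimulus_output(pos, rng):
    """Sketch of the output control: inside the (assumed spherical) specific
    range a stimulus is emitted, with intensity growing toward the boundary
    surface; outside the range nothing is output."""
    d = math.dist(pos, rng["center"])
    if d > rng["radius"]:
        return None                              # non-specific range: no output
    return {"stimulus": rng.get("stimulus", "vibration"),
            "intensity": d / rng["radius"]}      # 0.0 at center, 1.0 at boundary
```

mapping the returned record onto the display device 326, the audio output device 330, or the vibration device 331 is left to the output device control.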
- in this way, the user 211 can grasp, via the user terminal 140, which range in the real space is registered as the specific range and what kind of driving information is registered.
- in this way, the air conditioning system 100 according to the third embodiment has, in addition to the functions described in the first embodiment, a function of controlling the output according to the movement operation on the user terminal.
- according to the third embodiment, it is possible to obtain the same effects as those of the first embodiment, and in addition, the user becomes able to grasp the registered specific range via the user terminal 140.
- for example, a three-dimensional position measurement sensor may be configured by a plurality of imaging devices 120 with different mounting positions in the real space 210 or 220, and a photographed image photographed by the first imaging device 120, a photographed image photographed by the second imaging device 120, ..., and a photographed image photographed by the n-th imaging device 120 may be matched to calculate the coordinate information indicating the position of the user terminal 140 in real space.
- in calculating the coordinate information, SLAM (Simultaneous Localization and Mapping) technology, for example, may be used.
- alternatively, the three-dimensional position measurement sensor may be configured by an imaging device 120 attached via an arm in the real space 210 or 220, with a sensor that emits high-intensity light receivable by the imaging device 120 installed in the user terminal 140. In that case, three-dimensional space data acquired from a captured image captured while swinging the imaging device 120, and relative position data between the user terminal 140 and the imaging device 120, obtained by the imaging device 120 receiving the high-intensity light emitted by the user terminal 140, may be used to calculate the coordinate information indicating the position of the user terminal 140 in real space.
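for the multi-camera variant, the depth of a point (e.g. the user terminal) matched between two photographed images follows the standard stereo disparity relation; rectified cameras and a pixel-unit focal length are added assumptions, not part of the description:

```python
def stereo_depth(u1, u2, baseline, focal):
    """Depth of a matched point from two images via the standard disparity
    relation z = f * B / (u1 - u2), with u1, u2 the horizontal pixel
    coordinates of the same point in the two captured images."""
    disparity = u1 - u2
    if disparity == 0:
        raise ValueError("zero disparity: point at infinity")
    return focal * baseline / disparity
```

repeating this for matches across the n cameras yields the three-dimensional coordinate information of the terminal.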
- in the above embodiments, the description has been given assuming that the coordinate information indicating the position of the user terminal 140 in real space is calculated using data acquired by a device other than the user terminal 140 (the control device of the air conditioner).
- the coordinate information indicating the position of the user terminal 140 in real space may be calculated by the user terminal 140 using data acquired by the user terminal 140 .
- for example, an API (Application Programming Interface) provided in the user terminal 140 and layout information (shape data such as three-dimensional data) of objects in the real space 210 or 220 may be used to calculate the coordinate information indicating the position of the user terminal 140 in real space.
- FIG. 23 is a diagram showing a calculation example of coordinate information indicating the position of the user terminal in real space.
- the calculation example in FIG. 23 shows how the user terminal 140 constantly captures images of the real space 210 or 220 while the user terminal 140 is being operated to move.
- a captured image 2301 and a captured image 2302 indicate captured images captured at different times.
- by matching the feature point X of the object 2303 between the captured image 2301 and the captured image 2302, the user terminal 140 can calculate the depth distances l1 and l2 from the user terminal 140 to the feature point X.
- the coordinate information (a, b, c) of the object 2303 is obtained in advance as the layout information of the object.
- thereby, using the coordinate information (a, b, c) of the object 2303 and the calculated depth distances, the user terminal 140 can calculate its current coordinate information (x, y, z).
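one hedged sketch of the final step: if the unit direction from the terminal toward the feature point is also known (an assumption beyond what the description states), the terminal's coordinates follow by stepping back along the ray from the known layout coordinates:

```python
def terminal_position(feature_coord, ray_dir, depth):
    """Given a feature point's known layout coordinates (a, b, c), the unit
    direction from the terminal toward the point, and a measured depth
    distance l, recover the terminal's own coordinates (x, y, z)."""
    return tuple(a - depth * r for a, r in zip(feature_coord, ray_dir))
```

in practice the direction comes from the pixel location of the feature point and the camera intrinsics.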
- the configuration, number, and arrangement of sensors for calculating coordinate information indicating the position of the user terminal 140 in real space are arbitrary, and a combination suitable for the real space environment is selected.
- in the fifth embodiment, a case will be described in which a region in which an object or person in the real space, or a part thereof, exists is registered as a specific range.
- also, in the fifth embodiment, a case will be described in which the coordinate information of the region of the object is not stored in advance but is calculated each time.
- FIG. 24 is a diagram illustrating an example of a functional configuration of a user terminal
- the user terminal 140 functions as a captured image acquisition unit 2401, a depth map generation unit 2402, an object detection unit 2403, and a center position determination unit 2404 by executing an air conditioning operation program.
- a captured image acquisition unit 2401 acquires a captured image captured by the imaging device 327 and notifies the depth map generation unit 2402 and the object detection unit 2403 of it.
- the object detection unit 2403 detects objects included in the captured image acquired by the captured image acquisition unit 2401, extracts an object of the type specified by the user from among the detected objects, and notifies the depth map generation unit 2402 of the extracted object.
- the depth map generation unit 2402 is, for example, an API that generates a depth map based on a captured image.
- a depth map generation unit 2402 generates a depth map in the real space 210 or 220 based on the captured image notified from the captured image acquisition unit 2401 .
- further, the depth map generation unit 2402 identifies a region corresponding to the object notified by the object detection unit 2403 in the generated depth map, and notifies the center position determination unit 2404 of it.
- the center position determination unit 2404 calculates the center position information of the object notified by the object detection unit 2403 based on the depth map of the area specified by the depth map generation unit 2402 .
- the center position determination unit 2404 transmits the calculated center position information to the controller of the air conditioner 110 .
- the center position information acquisition unit 2501 acquires center position information from the user terminal 140 and notifies it to the specific range determination unit 2502 .
- the specific range determination unit 2502 refers to the shape storage unit 610 and acquires the shape information (for example, spherical shape information) of the predetermined shape of the specific range. Further, the specific range determination unit 2502 determines the specific range information based on the center position information notified from the center position information acquisition unit 2501, the radius information (R) notified from the radius information specification unit 605, and the spherical shape information acquired from the shape storage unit 610.
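the determination step — combining the center position information, the radius information (R), and the stored shape information into specific range information — can be sketched as follows; the dict layout is illustrative, not the actual record format:

```python
import math

def determine_specific_range(center, radius, shape="sphere"):
    """Combine center position information, radius information (R), and the
    stored shape information into one specific-range record."""
    return {"shape": shape, "center": center, "radius": radius}

def contains(range_info, point):
    """True when the point falls inside a spherical specific range."""
    return math.dist(point, range_info["center"]) <= range_info["radius"]
```

the resulting record is what would be registered in the air conditioning control unit 502.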
- FIG. 26 is a first diagram showing a specific example of center position determination processing.
- the captured image acquisition unit 2401 displays the captured image 2610 on the display screen, and the object detection unit 2403 detects a plurality of objects contained in the captured image 2610.
- the depth map generator 2402 generates a depth map based on the captured image 2610 .
- when the captured image 2610 is displayed, the user 211 performs an instruction operation regarding the type of object to be detected.
- the example of FIG. 26 shows a state in which the user 211 designates "object" as the type of object to be detected.
- when "object" is designated as the type of object to be detected, the object detection unit 2403 specifies, as the target object, the object closest to the user terminal 140 among the plurality of detected objects, and notifies the depth map generation unit 2402 of the specified object. Note that the example of FIG. 26 shows how the "desk" closest to the user terminal 140 is specified as the target object.
- the depth map generation unit 2402 identifies the area of the identified object from the generated depth map and notifies the center position determination unit 2404.
- the center position determination unit 2404 calculates the center position information of the object based on the notified depth map of the region of the object.
- reference numeral 2620 indicates center position information calculated by the center position determining section 2404 .
- the center position information indicated by reference numeral 2620 is input to the specific range input unit 501, and the peripheral range based on the center position information is determined as the specific range 2630.
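one plausible way to compute the center position information from the depth-map region — assuming the region's pixels have been back-projected to 3D points, which the description does not spell out — is a per-axis mean:

```python
from statistics import fmean

def center_position(region_points):
    """Estimate an object's center position information from the back-projected
    3D points of its depth-map region, taking the per-axis mean as one
    plausible definition of 'center'."""
    xs, ys, zs = zip(*region_points)
    return (fmean(xs), fmean(ys), fmean(zs))
```

the returned triple corresponds to the value indicated by reference numeral 2620, which the specific range input unit then expands into a peripheral range.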
- FIG. 27 is a second diagram showing a specific example of center position determination processing.
- the captured image acquisition unit 2401 displays the captured image 2710 on the display screen, and the object detection unit 2403 detects a plurality of objects contained in the captured image 2710.
- the depth map generation unit 2402 generates a depth map based on the captured image 2710.
- while the captured image 2710 is displayed, the user 211 performs an instruction operation specifying the type of object to be detected.
- the example of FIG. 27 shows a state in which the user 211 designates "person" as the type of object to be detected.
- the object detection unit 2403 identifies the person closest to the user terminal 140 among the plurality of detected objects as the target object and notifies the depth map generation unit 2402 of the identified object. Note that the example of FIG. 27 shows how the user 222 closest to the user terminal 140 is identified as the target object.
- the depth map generation unit 2402 identifies the area of the identified object from the generated depth map and notifies the center position determination unit 2404 of the area.
- the center position determination unit 2404 calculates the center position information of the object based on the notified depth map of the region of the object.
- reference numeral 2720 indicates the center position information calculated by the center position determination unit 2404.
- the center position information indicated by reference numeral 2720 is input to the specific range input unit 501, and the peripheral range based on the center position information is determined as the specific range 2730.
- FIG. 28 is a third diagram showing a specific example of center position determination processing.
- the captured image acquisition unit 2401 displays the captured image 2810 on the display screen, and the object detection unit 2403 detects a plurality of objects contained in the captured image 2810.
- the depth map generation unit 2402 generates a depth map based on the captured image 2810.
- the user 211 performs a selection instruction operation for the part of the object to be detected.
- the example of FIG. 28 shows how the user 211 selects the "head" of a "person" as the part of the object to be detected.
- when the user 211 selects the "head" of a "person" as the part of the object to be detected, the object detection unit 2403 identifies the head of the person closest to the user terminal 140 among the plurality of detected objects as the target object and notifies the depth map generation unit 2402 of the identified object.
- the example of FIG. 28 shows how the head of user 222 closest to user terminal 140 is identified as the object.
- the depth map generation unit 2402 identifies the area of the identified object from the generated depth map and notifies the center position determination unit 2404 of the area.
- the center position determination unit 2404 calculates the center position information of the object based on the notified depth map of the region of the object.
- reference numeral 2820 indicates the center position information calculated by the center position determination unit 2404.
- the center position information indicated by reference numeral 2820 is input to the specific range input unit 501, and the peripheral range based on the center position information is determined as the specific range 2830.
- FIG. 29 is a fourth diagram showing a specific example of center position determination processing.
- the captured image acquisition unit 2401 displays the captured image 2910 on the display screen, and the object detection unit 2403 detects a plurality of objects contained in the captured image 2910.
- the depth map generation unit 2402 generates a depth map based on the captured image 2910.
- while the captured image 2910 is displayed, the user 211 performs an instruction operation on the object to be detected. It is assumed that the display screen on which the captured image 2910 is displayed can be pointed.
- the example of FIG. 29 shows how the user 211 points to the position of the "notebook PC" on the display screen, whereby the instruction operation for the object to be detected is accepted.
- when the "notebook PC" is designated as the object to be detected by the pointing operation, the object detection unit 2403 identifies the designated "notebook PC" among the plurality of detected objects as the target object and notifies the depth map generation unit 2402 of the identified object.
- the depth map generation unit 2402 identifies the area of the identified object from the generated depth map and notifies the center position determination unit 2404 of the area.
- the center position determination unit 2404 calculates the center position information of the object based on the notified depth map of the region of the object.
- reference numeral 2920 indicates the center position information calculated by the center position determination unit 2404.
- the center position information indicated by reference numeral 2920 is input to the specific range input unit 501, and the peripheral range based on the center position information is determined as the specific range 2930.
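Identifying the object at a pointed position, as in the FIG. 29 example, amounts to a hit test against the detected objects. A minimal sketch assuming axis-aligned bounding boxes as the detection output (an assumption, since the patent does not specify the detection format):

```python
def object_at_point(detections, point):
    """Return the label of the detected object whose bounding box contains
    the pointed screen position, or None if nothing was hit.
    `detections` maps a label to an (x0, y0, x1, y1) box (illustrative)."""
    px, py = point
    for label, (x0, y0, x1, y1) in detections.items():
        if x0 <= px <= x1 and y0 <= py <= y1:
            return label
    return None
```

When the user points at the screen position inside the "notebook PC" box, that label would be identified as the target object and handed on to the depth-map stage.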
- as is clear from the above description, the air conditioning system 100 according to the fifth embodiment provides the following:
- the user terminal is capable of accepting an instruction operation for an object, and the peripheral range based on the center position information in the real space of the object accepted by the user terminal is determined as the specific range.
- the user terminal can accept a selection instruction operation for a part of the object, and the peripheral range based on the center position information in the real space of the part of the object accepted by the user terminal is determined as the specific range.
- the display screen on which the captured image captured by the imaging device of the user terminal is displayed is configured to allow pointing, and the user terminal identifies, among the plurality of objects included in the captured image, the object at the pointed position as the target object.
- the temperature, wind direction, and air volume are set as the operating information associated with the specific range, but the registration items that can be registered as the operating information associated with the specific range are not limited to these.
- the specific range input unit 501 has been described as being implemented in the control device within the air conditioner 110; however, it may be implemented in a device other than the control device within the air conditioner 110, or in a device outside the air conditioner 110 (any device within the air conditioning system 100).
- the specific range output unit 503 has been described as being implemented in the control device within the air conditioner 110 or in the user terminal 140; however, it may be implemented in a device other than the control device within the air conditioner 110, or in a device other than the air conditioner 110 and the user terminal 140 (any device within the air conditioning system 100).
- the air conditioning system 100 has been described as having the air conditioning device 110, the imaging device 120, the display device 130, and the user terminal 140.
- the configuration of the air conditioning system 100 is not limited to this, and may include a server device that controls a plurality of air conditioning devices 110, for example.
- the specific range input unit 501 and the specific range output unit 503 may be implemented in the server device.
- the size of the specific range is determined by a predetermined setting operation; however, the shape of the specific range may also be determined by a predetermined setting operation.
- as the setting operation for determining the size of the specific range, a tap operation performed n times and a pressing operation continued for a predetermined time were exemplified; however, the setting operation for determining the size of the specific range is not limited to these.
- the case has been described in which the specific range image (three-dimensional image) is superimposed on the captured image captured by the imaging device 327 of the user terminal 140.
- however, the specific range image may instead be superimposed on a still image of the real space 210 or 220 to which the layout information of the real space 210 or 220 is linked. This is because, if the coordinate information of the real space 210 or 220 is associated with each position of the still image, the specific range image can be superimposed appropriately.
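A minimal sketch of that idea: given a still image whose pixels are pre-associated with real-space coordinates, the pixels covered by a spherical specific range can be selected directly. The pixel-to-coordinate mapping shown here is an assumed data structure, not the patent's:

```python
import math

def overlay_mask(pixel_coords, center, radius):
    """Given a mapping pixel -> real-space (x, y, z), return the set of
    pixels whose associated coordinates fall inside the spherical specific
    range, i.e. where the specific range image would be superimposed."""
    inside = set()
    for pixel, coord in pixel_coords.items():
        if math.dist(coord, center) <= radius:
            inside.add(pixel)
    return inside
```

The returned pixel set could then be tinted or outlined on the still image to visualize the registered specific range.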
- variations of the method by which the output device control unit 2202 notifies the user of a predetermined stimulus (sound, light, or vibration) have been described; however, the variations of the notification method are not limited to these.
- for example, a different stimulus may be notified depending on whether or not the specific range was determined from the center position information of an object in the real space. Specifically, the stimulus issued when the user terminal 140 is positioned within the specific range may differ depending on whether the specific range was determined as a region in which no object exists or as a region in which an object exists.
- further, when the specific range was determined from an object, a different stimulus may be notified depending on whether or not the object is a person. That is, the user terminal 140 may be configured to issue a different stimulus when positioned within the specific range depending on whether the object is a person or something other than a person.
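The stimulus-selection logic described above might be sketched as follows; the stimulus names and the `range_source` encoding are placeholders, not from the patent:

```python
def choose_stimulus(in_range, range_source, target_is_person=False):
    """Pick a notification stimulus for the user terminal.
    `range_source` is 'area' when the specific range was registered without
    an object, or 'object' when it was derived from an object's center
    position. The returned stimulus names are illustrative placeholders."""
    if not in_range:
        return None  # no notification outside the specific range
    if range_source == "area":
        return "sound"
    # range derived from an object: distinguish person vs. non-person
    return "vibration" if target_is_person else "light"
```

This keeps the decision in one place, so the output device control unit only has to map the returned name to the audio output device 330, a light, or the vibration device 331.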
- the smartphone 141, the remote controller 142, a wearable device, and the like are exemplified as the user terminal 140; however, the user terminal 140 does not have to be an electronic device.
- its housing may have any shape and may be, for example, a rod-shaped member.
- each of the above embodiments has been described as being realized in an air conditioning system; however, the target system is not limited to an air conditioning system and may be any other registration system that determines a specific range in real space and registers the determined specific range. That is, the air conditioning system described in each of the above embodiments is one application example of such a registration system.
- 100: Air conditioning system
110: Air conditioning device
120: Imaging device
130: Display device
140: User terminal
210, 220: Real space
326: Display device
327: Imaging device
330: Audio output device
331: Vibration device
400: Operation screen
501: Specific range input unit
503: Specific range output unit
601: Captured image acquisition unit
602: Radio wave information acquisition unit
603: Position information acquisition unit
604: Operation content acquisition unit
605: Radius information specification unit
607: Specific range determination unit
701: Operation content acquisition unit
801: Operation content acquisition unit
802: Center position and radius information calculation unit
803: Specific range determination unit
902: Specific range determination unit
1001: Projection direction determination unit
1002: Specific range determination unit
1101: Operation content acquisition unit
1102: Target area calculation unit
1103: Specific range determination unit
1111: Operation content acquisition unit
1112: Reference position calculation unit
1113: Target area calculation unit
1114: Specific range determination unit
1201: Operation content acquisition unit
1202: Deletion target determination unit
1203: Specific range deletion unit
1301: Operation content acquisition unit
1302: Movement target determination unit
1303: Specific range update unit
1401: Operation content acquisition unit
1402: Operation information acquisition unit
1403: Operation information setting unit
1501: Operation content acquisition unit
1502: Inversion instruction acquisition unit
1503: Operation information update unit
1601: Specific range information acquisition unit
1602: Captured image acquisition unit
1603: Specific range image generation unit
1611: Captured image
1612: Specific range image
1901, 1902: Specific range
2001: Transmission unit
2002: Specific range output unit
2101: Captured image acquisition unit
2102: Radio wave information acquisition unit
2103: Position information acquisition unit
2104: Position information transmission unit
2105: Specific range information acquisition unit
2106: Specific range transmission unit
2111: Position information acquisition unit
2112: Specific range information acquisition unit
2113: Specific range image generation unit
2114: Captured image acquisition unit
2115: Display control unit
2121, 2131: Captured image
2122, 2132: Display image
2201: Positional relationship determination unit
2202: Output device control unit
2401: Captured image acquisition unit
2402: Depth map generation unit
2403: Object detection unit
2404: Center position determination unit
2501: Center position information acquisition unit
2502: Specific range determination unit
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Chemical & Material Sciences (AREA)
- Mechanical Engineering (AREA)
- Combustion & Propulsion (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Computer Hardware Design (AREA)
- Fuzzy Systems (AREA)
- Mathematical Physics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Air Conditioning Control Device (AREA)
- Position Input By Displaying (AREA)
- Selective Calling Equipment (AREA)
Abstract
Description
A registration system having a control unit and a storage unit, wherein
the control unit
registers, in the storage unit, a specific range in real space determined based on position information of a user terminal in the real space and the content of an operation performed on the user terminal.
The control unit
determines the specific range in the real space based on the trajectory of the position information of the user terminal when a movement operation is performed on the user terminal, and registers it in the storage unit.
The user terminal is capable of accepting an instruction operation, and
the control unit
determines, as the specific range, a peripheral range based on the position information of the user terminal at the time the user terminal accepts the instruction operation, and registers it in the storage unit.
The user terminal is capable of accepting an instruction operation, and
the control unit
determines, as the specific range in the real space, a peripheral range based on the position information in the real space of an object identified as a result of the user terminal accepting the instruction operation, and registers it in the storage unit.
The user terminal is capable of accepting a selection instruction operation, and
the control unit
determines, as the specific range in the real space, a peripheral range based on the position information in the real space of a part of the object identified as a result of the user terminal accepting the selection instruction operation, and registers it in the storage unit.
The user terminal is capable of accepting a setting operation, and
the control unit
determines the size or shape of the specific range based on the setting operation accepted by the user terminal.
The user terminal is capable of accepting a setting operation, and
the control unit
determines the size of the specific range based on the position information of the user terminal at the time the user terminal accepts a start-point setting operation, or the position information used as the reference when determining the specific range, and the position information of the user terminal at the time the user terminal accepts an end-point setting operation.
The user terminal has
an imaging device and
a display device that displays an image captured by the imaging device, and
the control unit of the user terminal
identifies an object included in the image as the target object.
The user terminal has
an imaging device and
a display device that displays the image captured by the imaging device on a screen and allows pointing to a position on the screen on which the image is displayed, and
the control unit of the user terminal
identifies, among a plurality of objects included in the image, the object at the pointed position as the target object.
The user terminal has
a display device that displays an image, and
the control unit of the user terminal
superimposes an image indicating the specific range on the image and displays the result on the display device.
The user terminal has
an imaging device, and
the control unit of the user terminal
superimposes, on the image captured by the imaging device, a three-dimensional image indicating the specific range generated based on the position information and orientation information of the user terminal in the real space, and displays the result.
The user terminal has an output device, and
the output device issues a stimulus when the user terminal is positioned within the specific range in the real space.
The user terminal has an output device, and
the output device issues different stimuli depending on the positional relationship between the specific range in the real space and the user terminal.
The user terminal has an output device, and
when the user terminal is positioned within the specific range in the real space, the output device
issues a different stimulus depending on whether or not the specific range was determined from the position information of the object in the real space, or,
when the specific range was determined from the position information of the object in the real space, issues a different stimulus depending on whether or not the object is a person.
The user terminal has a sensor, and
the control unit of the user terminal
calculates the position information of the user terminal in the real space by matching shape data of the real space against data measured by the sensor.
The sensor is an imaging device, and
the control unit of the user terminal
calculates the position information of the user terminal in the real space by matching three-dimensional data of the real space against a map generated from images captured by the imaging device.
The control unit
calculates the position information of the user terminal in the real space based on data measured by a three-dimensional position measurement sensor installed in the real space.
The three-dimensional position measurement sensor is composed of a plurality of imaging devices with different mounting positions, and
the control unit
calculates the position information of the user terminal in the real space by matching the captured images taken by the respective imaging devices.
The control unit
calculates the position information of the user terminal in the real space by matching data measured by a sensor of the user terminal against data measured by a sensor installed in the real space.
The user terminal is capable of accepting an air conditioning instruction operation, and
the control unit
performs air conditioning control on the specific range in the real space registered in the storage unit when the user terminal accepts the air conditioning instruction operation.
The control unit
performs different air conditioning control for the specific range in the real space registered in the storage unit and for the non-specific range other than the specific range by using a plurality of air conditioning devices installed in the real space.
An air conditioning system including the registration system according to any one of the first to twenty-first aspects.
A registration step of registering, in a storage unit, a specific range in real space determined based on position information of a user terminal in the real space and the content of an operation performed on the user terminal,
is executed by a control unit.
<System configuration of the air conditioning system>
First, the system configuration of the air conditioning system according to the first embodiment will be described. FIG. 1 is a diagram showing an example of the system configuration of the air conditioning system.
- various instructions and various information transmitted from the user terminal 140 via wireless communication,
- captured images (for example, RGB images) transmitted from the imaging device 120, and
- data measured by a sensor of the indoor unit 111.
Based on these, the built-in control device operates and performs air conditioning control of the real space. Note that the indoor unit 111 may incorporate the imaging device 120. The indoor unit 111 also transmits various information generated by the operation of the built-in control device to the user terminal 140 via wireless communication, or to the display device 130.
- various instructions and various information transmitted from the operation panel 112c or transmitted from the user terminal 140 via wireless communication,
- captured images transmitted from the imaging device 120, and
- data measured by a sensor of the indoor unit 112a.
Based on these, the edge device 112b (control device) operates and performs air conditioning control of the real space.
- the control device built into the indoor unit 111, and
- the edge device 112b connected to the indoor unit 112a
can perform, for a specific range in the real space, air conditioning control different from that for the non-specific range (the range other than the specific range).
Next, a specific example of the real space to which the air conditioning system 100 is applied will be described. FIG. 2 is a diagram showing an example of a real space to which the air conditioning system is applied.
Next, the hardware configuration of the control device built into the indoor unit 111 of the air conditioning device 110, or of the edge device 112b (control device) connected to the indoor unit 112a of the air conditioning device 110, and the hardware configuration of the user terminal 140 will be described with reference to FIG. 3.
Of these, 3a of FIG. 3 shows an example of the hardware configuration of the control device of the air conditioning device 110. As shown in 3a of FIG. 3, the control device of the air conditioning device 110 has a processor 301, a memory 302, and an auxiliary storage device 303. The control device of the air conditioning device 110 also has an I/F (interface) device 304 and a communication device 305. The hardware components of the control device of the air conditioning device 110 are connected to one another via a bus 306.
Meanwhile, 3b of FIG. 3 shows an example of the hardware configuration of the user terminal 140. As shown in 3b of FIG. 3, the user terminal 140 has a processor 321, a memory 322, an auxiliary storage device 323, a communication device 324, an operation device 325, and a display device 326. The user terminal 140 also has an imaging device 327, an acceleration sensor 328, an audio input device 329, an audio output device 330, and a vibration device 331. The hardware components of the user terminal 140 are connected to one another via a bus 332.
Next, the operating modes realized by executing the above air conditioning operation program on the user terminal while the air conditioning system 100 is running, and the operation screen displayed on the user terminal 140, will be described. FIG. 4 is a diagram showing an example of the operating modes of the air conditioning system and the operation screen.
Next, the functional configuration of the control device of the air conditioning device 110 in the specific operation mode will be described. FIG. 5 is a diagram showing an example of the functional configuration of the control device in the specific operation mode. As described above, a registration program is installed in the control device of the air conditioning device 110, and by executing this program, the control device of the air conditioning device 110 functions as a specific range input unit 501, an air conditioning control unit 502, and a specific range output unit 503.
Next, the details of the functional configuration of the specific range input unit 501 (here, the functional configuration related to determining the specific range) and specific examples of the specific range determination processing will be described for patterns (1) to (6).
FIG. 6 is a first diagram showing the functional configuration of the specific range input unit and a specific example of the specific range determination processing. As shown in FIG. 6, the specific range input unit 501 has a captured image acquisition unit 601, a radio wave information acquisition unit 602, a position information acquisition unit 603, an operation content acquisition unit 604, a radius information specification unit 605, and a specific range determination unit 607.
- when the user 211 taps the "increase/decrease" button, or
- performs a long press,
a predetermined stimulus (sound, light, or vibration) may be notified to the user 211 each time the radius information (R) changes by a fixed amount. This allows the user 211 to recognize that the radius information (R) has changed by that fixed amount.
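The quantized notification described above, one stimulus per fixed amount of radius change, might be counted as follows. This is a sketch; the step size is an assumed parameter, not specified by the patent:

```python
def radius_notifications(old_radius, new_radius, step=0.5):
    """Count how many stimuli should be issued while the radius
    information (R) changes from old_radius to new_radius, issuing one
    notification per `step` of change (step size is an assumption)."""
    return int(abs(new_radius - old_radius) // step)
```

For example, growing the radius from 1.0 m to 2.0 m with a 0.5 m step would trigger two notifications, letting the user feel the range expand without looking at the screen.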
FIG. 7 is a second diagram showing the functional configuration of the specific range input unit and a specific example of the specific range determination processing. As shown in FIG. 7, the specific range input unit 501 has a captured image acquisition unit 601, a radio wave information acquisition unit 602, a position information acquisition unit 603, an operation content acquisition unit 701, and a specific range determination unit 607.
FIG. 8 is a third diagram showing the functional configuration of the specific range input unit and a specific example of the specific range determination processing. As shown in FIG. 8, the specific range input unit 501 has a captured image acquisition unit 601, a radio wave information acquisition unit 602, a position information acquisition unit 603, an operation content acquisition unit 801, a center position and radius information calculation unit 802, and a specific range determination unit 803.
FIG. 9 is a fourth diagram showing the functional configuration of the specific range input unit and a specific example of the specific range determination processing. As shown in FIG. 9, the specific range input unit 501 has a captured image acquisition unit 601, a radio wave information acquisition unit 602, a position information acquisition unit 603, an operation content acquisition unit 801, a center position and radius information calculation unit 802, a projection direction determination unit 901, and a specific range determination unit 902.
FIG. 10 is a fifth diagram showing the functional configuration of the specific range input unit and a specific example of the specific range determination processing. As shown in FIG. 10, the specific range input unit 501 has a captured image acquisition unit 601, a radio wave information acquisition unit 602, a position information acquisition unit 603, an operation content acquisition unit 801, a center position and radius information calculation unit 802, a projection direction determination unit 1001, and a specific range determination unit 1002.
FIG. 11 is a sixth diagram showing the functional configuration of the specific range input unit and a specific example of the specific range determination processing. As shown in FIG. 11, the specific range input unit 501 has a captured image acquisition unit 601, a radio wave information acquisition unit 602, a position information acquisition unit 603, an operation content acquisition unit 1101, a target area calculation unit 1102, and a specific range determination unit 1103.
Next, the details of the functional configuration of the specific range input unit 501 (here, the functional configuration related to deleting the specific range) and a specific example of the specific range deletion processing will be described.
Next, the details of the functional configuration of the specific range input unit 501 (here, the functional configuration related to moving the specific range) and a specific example of the specific range movement processing will be described.
Next, the details of the functional configuration of the specific range input unit 501 (here, the functional configuration related to inputting operation information) and a specific example of the operation information input processing will be described.
Next, the details of the functional configuration of the specific range input unit 501 (here, the functional configuration related to inverting operation information) and a specific example of the operation information inversion processing will be described. FIG. 15 is a diagram showing the functional configuration of the specific range input unit and a specific example of the operation information inversion processing. As shown in FIG. 15, the specific range input unit 501 has a captured image acquisition unit 601, a radio wave information acquisition unit 602, a position information acquisition unit 603, an operation content acquisition unit 1501, an inversion instruction acquisition unit 1502, and an operation information update unit 1503.
Next, the details of the functional configuration of the specific range output unit 503 (here, the functional configuration related to displaying the specific range) and a specific example of the specific range display processing will be described. FIG. 16 is a first diagram showing the functional configuration of the specific range output unit and a specific example of the specific range output processing. As shown in FIG. 16, the specific range output unit 503 has a specific range information acquisition unit 1601, a captured image acquisition unit 1602, and a specific range image generation unit 1603.
Next, the flow of the specific range registration processing in the specific operation mode of the air conditioning system 100 will be described. FIGS. 17 and 18 are first and second flowcharts showing the flow of the specific range registration processing in the specific operation mode.
Next, an operation example of the air conditioning system 100 will be described. FIG. 19 is a diagram showing an example of air conditioning control by the air conditioning system. The example of FIG. 19 shows a state in which operation information is registered so that the blown air reaches the specific range 621, and operation information is registered so that the blown air does not reach the specific ranges 1901 and 1902.
As is clear from the above description, the air conditioning system 100 according to the first embodiment:
- has an air conditioning device capable of air conditioning control for a specific range in real space, and a user terminal.
- registers the specific range determined based on the position information of the user terminal in the real space and the content of an operation performed on the user terminal.
In the first embodiment, the specific range output unit was described as being realized in the control device of the air conditioning device 110. However, the specific range output unit may also be realized, for example, in the user terminal 140. The second embodiment will be described below, focusing on the differences from the first embodiment.
First, the functional configuration of the control device of the air conditioning device 110 and the functional configuration of the user terminal 140 in the air conditioning system according to the second embodiment will be described.
Next, the details of the functional configurations of the transmission unit 2001 and the specific range output unit 2002 (here, the functional configurations related to the specific range output processing) and a specific example of the specific range output processing will be described. FIG. 21 is a diagram showing the functional configurations of the transmission unit and the specific range output unit and a specific example of the specific range output processing.
As is clear from the above description, the air conditioning system 100 according to the second embodiment, in addition to the functions described in the first embodiment,
- has a function of superimposing, on a captured image taken in response to an imaging operation performed on the user terminal, a three-dimensional image indicating the specific range generated based on the position information and orientation information of the user terminal in the real space, and displaying the result.
In the second embodiment, the case was described in which the registered specific range information is notified to the user by displaying, on the display screen of the user terminal 140, a display image in which the specific range image is superimposed on the captured image taken by the imaging device 327 of the user terminal 140.
First, the details of the functional configuration of the specific range output unit 2002 in the user terminal 140 (here, the functional configuration related to the specific range output processing) and a specific example of the specific range output processing will be described. FIG. 22 is a second diagram showing the functional configuration of the specific range output unit and a specific example of the specific range output processing.
- which range in the real space is registered as the specific range, and
- what operation information is registered,
can be grasped by the user 211 via the user terminal 140.
As is clear from the above description, the air conditioning system 100 according to the third embodiment, in addition to the functions described in the first embodiment,
- has a function of controlling output in response to a movement operation on the user terminal.
In the first embodiment, in calculating the coordinate information indicating the position of the user terminal 140 in the real space, the case of using
- radio wave information (reflected laser-light signals) acquired by the radio wave transmission/reception device 310 (an example of a three-dimensional position measurement sensor), and
- captured images (RGB images) taken by the imaging device 120,
was described.
- a captured image taken by the first imaging device 120,
- a captured image taken by the second imaging device 120,
- ...
- a captured image taken by the α-th imaging device 120,
the coordinate information indicating the position of the user terminal 140 in the real space may be calculated by matching these.
- three-dimensional spatial data acquired from captured images taken while swinging the imaging device 120, and
- relative position data between the user terminal 140 and the imaging device 120, acquired by the imaging device 120 receiving high-intensity light emitted by the user terminal 140,
the coordinate information indicating the position of the user terminal 140 in the real space may be calculated by matching these against each other.
- a depth map generated, using an API, from a captured image taken by the imaging device 327, and
- layout information (shape data such as three-dimensional data) of the real space 210 or 220,
the coordinate information indicating the position of the user terminal 140 in the real space may be calculated by matching these against each other.
In the first embodiment, the case of registering a predetermined area in the real space (an area in which no object exists) as the specific range was mainly described. Further, in the first embodiment, when an area in which an object exists is registered as the specific range, the coordinate information of the object area in which the object exists was described as being stored in advance.
First, the details of the functional configuration of the user terminal 140 according to the fifth embodiment (here, the functional configuration related to the reference position calculation processing) will be described. FIG. 24 is a diagram showing an example of the functional configuration of the user terminal.
Next, the functional configuration of the specific range input unit 501 realized by executing the registration program in the control device of the air conditioning device 110 according to the fifth embodiment (here, the functional configuration related to determining the specific range) will be described. FIG. 25 is a diagram showing an example of the functional configuration of the specific range input unit. The differences from the specific range input unit shown in FIG. 6 are that the configuration of FIG. 25 has a center position information acquisition unit 2501, and that the function of the specific range determination unit 2502 differs from that of the specific range determination unit 607 in FIG. 6.
Next, specific examples of the center position determination processing by the user terminal 140 will be described for patterns (1) to (4).
FIG. 26 is a first diagram showing a specific example of the center position determination processing. As shown in FIG. 26, when the imaging device 327 of the user terminal 140 captures the real space 220, the captured image acquisition unit 2401 displays the captured image 2610 on the display screen, and the object detection unit 2403 detects a plurality of objects included in the captured image 2610. Further, the depth map generation unit 2402 generates a depth map based on the captured image 2610.
FIG. 27 is a second diagram showing a specific example of the center position determination processing. As shown in FIG. 27, when the imaging device 327 of the user terminal 140 captures the real space 220, the captured image acquisition unit 2401 displays the captured image 2710 on the display screen, and the object detection unit 2403 detects a plurality of objects included in the captured image 2710. Further, the depth map generation unit 2402 generates a depth map based on the captured image 2710.
The example shows a state in which "person" is designated as the type of object to be detected.
FIG. 28 is a third diagram showing a specific example of the center position determination processing. As shown in FIG. 28, when the imaging device 327 of the user terminal 140 captures the real space 220, the captured image acquisition unit 2401 displays the captured image 2810 on the display screen, and the object detection unit 2403 detects a plurality of objects included in the captured image 2810. Further, the depth map generation unit 2402 generates a depth map based on the captured image 2810.
FIG. 29 is a fourth diagram showing a specific example of the center position determination processing. As shown in FIG. 29, when the imaging device 327 of the user terminal 140 captures the real space 220, the captured image acquisition unit 2401 displays the captured image 2910 on the display screen, and the object detection unit 2403 detects a plurality of objects included in the captured image 2910. Further, the depth map generation unit 2402 generates a depth map based on the captured image 2910.
As is clear from the above description, the air conditioning system 100 according to the fifth embodiment:
- the user terminal can accept an instruction operation for an object, and a peripheral range based on the center position information in the real space of the object accepted by the user terminal is determined as the specific range.
- the user terminal can accept a selection instruction operation for a part of the object, and a peripheral range based on the center position information in the real space of the part of the object accepted by the user terminal is determined as the specific range.
- the display screen on which the captured image taken by the imaging device of the user terminal is displayed is configured to allow pointing, and the user terminal identifies, among the plurality of objects included in the captured image, the object at the pointed position as the target object.
In each of the above embodiments, temperature, wind direction, and air volume were set as the operation information associated with the specific range; however, the registration items that can be registered as the operation information associated with the specific range are not limited to these.
110: Air conditioning device
120: Imaging device
130: Display device
140: User terminal
210, 220: Real space
326: Display device
327: Imaging device
330: Audio output device
331: Vibration device
400: Operation screen
501: Specific range input unit
503: Specific range output unit
601: Captured image acquisition unit
602: Radio wave information acquisition unit
603: Position information acquisition unit
604: Operation content acquisition unit
605: Radius information specification unit
607: Specific range determination unit
701: Operation content acquisition unit
801: Operation content acquisition unit
802: Center position and radius information calculation unit
803: Specific range determination unit
902: Specific range determination unit
1001: Projection direction determination unit
1002: Specific range determination unit
1101: Operation content acquisition unit
1102: Target area calculation unit
1103: Specific range determination unit
1111: Operation content acquisition unit
1112: Reference position calculation unit
1113: Target area calculation unit
1114: Specific range determination unit
1201: Operation content acquisition unit
1202: Deletion target determination unit
1203: Specific range deletion unit
1301: Operation content acquisition unit
1302: Movement target determination unit
1303: Specific range update unit
1401: Operation content acquisition unit
1402: Operation information acquisition unit
1403: Operation information setting unit
1501: Operation content acquisition unit
1502: Inversion instruction acquisition unit
1503: Operation information update unit
1601: Specific range information acquisition unit
1602: Captured image acquisition unit
1603: Specific range image generation unit
1611: Captured image
1612: Specific range image
1901, 1902: Specific range
2001: Transmission unit
2002: Specific range output unit
2101: Captured image acquisition unit
2102: Radio wave information acquisition unit
2103: Position information acquisition unit
2104: Position information transmission unit
2105: Specific range information acquisition unit
2106: Specific range transmission unit
2111: Position information acquisition unit
2112: Specific range information acquisition unit
2113: Specific range image generation unit
2114: Captured image acquisition unit
2115: Display control unit
2121, 2131: Captured image
2122, 2132: Display image
2201: Positional relationship determination unit
2202: Output device control unit
2401: Captured image acquisition unit
2402: Depth map generation unit
2403: Object detection unit
2404: Center position determination unit
2501: Center position information acquisition unit
2502: Specific range determination unit
Claims (23)
- A registration system comprising a control unit and a storage unit, wherein
the control unit
registers, in the storage unit, a specific range in real space determined based on position information of a user terminal in the real space and the content of an operation performed on the user terminal.
- The registration system according to claim 1, wherein the control unit
determines the specific range in the real space based on the trajectory of the position information of the user terminal when a movement operation is performed on the user terminal, and registers it in the storage unit.
- The registration system according to claim 1, wherein the user terminal is capable of accepting an instruction operation, and
the control unit
determines, as the specific range, a peripheral range based on the position information of the user terminal at the time the user terminal accepts the instruction operation, and registers it in the storage unit.
- The registration system according to claim 1, wherein the user terminal is capable of accepting an instruction operation, and
the control unit
determines, as the specific range in the real space, a peripheral range based on the position information in the real space of an object identified as a result of the user terminal accepting the instruction operation, and registers it in the storage unit.
- The registration system according to claim 4, wherein the user terminal is capable of accepting a selection instruction operation, and
the control unit
determines, as the specific range in the real space, a peripheral range based on the position information in the real space of a part of the object identified as a result of the user terminal accepting the selection instruction operation, and registers it in the storage unit.
- The registration system according to claim 1, wherein the user terminal is capable of accepting a setting operation, and
the control unit
determines the size or shape of the specific range based on the setting operation accepted by the user terminal.
- The registration system according to claim 6, wherein the user terminal is capable of accepting a setting operation, and
the control unit
determines the size of the specific range based on the position information of the user terminal at the time the user terminal accepts a start-point setting operation, or the position information used as the reference when determining the specific range, and the position information of the user terminal at the time the user terminal accepts an end-point setting operation.
- The registration system according to claim 4, wherein the user terminal has
an imaging device and
a display device that displays an image captured by the imaging device, and
the control unit of the user terminal
identifies an object included in the image as the target object.
- The registration system according to claim 4, wherein the user terminal has
an imaging device and
a display device that displays the image captured by the imaging device on a screen and allows pointing to a position on the screen on which the image is displayed, and
the control unit of the user terminal
identifies, among a plurality of objects included in the image, the object at the pointed position as the target object.
- The registration system according to claim 1, wherein the user terminal has
a display device that displays an image, and
the control unit of the user terminal
superimposes an image indicating the specific range on the image and displays the result on the display device.
- The registration system according to claim 10, wherein the user terminal has
an imaging device, and
the control unit of the user terminal
superimposes, on the image captured by the imaging device, a three-dimensional image indicating the specific range generated based on the position information and orientation information of the user terminal in the real space, and displays the result.
- The registration system according to claim 1, wherein the user terminal has an output device, and
the output device issues a stimulus when the user terminal is positioned within the specific range in the real space.
- The registration system according to claim 1, wherein the user terminal has an output device, and
the output device issues different stimuli depending on the positional relationship between the specific range in the real space and the user terminal.
- The registration system according to claim 4, wherein the user terminal has an output device, and
when the user terminal is positioned within the specific range in the real space, the output device
issues a different stimulus depending on whether or not the specific range was determined from the position information of the object in the real space, or,
when the specific range was determined from the position information of the object in the real space, issues a different stimulus depending on whether or not the object is a person.
- The registration system according to claim 1, wherein the user terminal has a sensor, and
the control unit of the user terminal
calculates the position information of the user terminal in the real space by matching shape data of the real space against data measured by the sensor.
- The registration system according to claim 15, wherein the sensor is an imaging device, and
the control unit of the user terminal
calculates the position information of the user terminal in the real space by matching three-dimensional data of the real space against a map generated from images captured by the imaging device.
- The registration system according to claim 1, wherein the control unit
calculates the position information of the user terminal in the real space based on data measured by a three-dimensional position measurement sensor installed in the real space.
- The registration system according to claim 17, wherein the three-dimensional position measurement sensor is composed of a plurality of imaging devices with different mounting positions, and
the control unit
calculates the position information of the user terminal in the real space by matching the captured images taken by the respective imaging devices.
- The registration system according to claim 1, wherein the control unit
calculates the position information of the user terminal in the real space by matching data measured by a sensor of the user terminal against data measured by a sensor installed in the real space.
- The registration system according to claim 1, wherein the user terminal is capable of accepting an air conditioning instruction operation, and
the control unit
performs air conditioning control on the specific range in the real space registered in the storage unit when the user terminal accepts the air conditioning instruction operation.
- The registration system according to claim 20, wherein the control unit
performs different air conditioning control for the specific range in the real space registered in the storage unit and for the non-specific range other than the specific range by using a plurality of air conditioning devices installed in the real space.
- An air conditioning system including the registration system according to any one of claims 1 to 21.
- A registration program for causing a control unit to execute
a registration step of registering, in a storage unit, a specific range in real space determined based on position information of a user terminal in the real space and the content of an operation performed on the user terminal.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/550,014 US20240175597A1 (en) | 2021-03-31 | 2022-03-31 | Registration system, air conditioning system and registration program |
EP22781292.2A EP4319183A1 (en) | 2021-03-31 | 2022-03-31 | Registration system, air-conditioning system, and registration program |
CN202280023032.6A CN117044225A (zh) | 2021-03-31 | 2022-03-31 | 登记系统、空调系统和登记程序 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021059231 | 2021-03-31 | ||
JP2021-059231 | 2021-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022211083A1 true WO2022211083A1 (ja) | 2022-10-06 |
Family
ID=83459646
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/016825 WO2022211083A1 (ja) | 2021-03-31 | 2022-03-31 | 登録システム、空気調和システム及び登録プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240175597A1 (ja) |
EP (1) | EP4319183A1 (ja) |
JP (2) | JP7189484B2 (ja) |
CN (1) | CN117044225A (ja) |
WO (1) | WO2022211083A1 (ja) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002099971A (ja) * | 2000-09-22 | 2002-04-05 | Sanyo Electric Co Ltd | 緊急通報システム及び緊急通報装置 |
JP2009020007A (ja) * | 2007-07-12 | 2009-01-29 | Denso Corp | 車載ナビゲーション装置及び車載ナビゲーションシステム |
JP2010154124A (ja) * | 2008-12-24 | 2010-07-08 | Kyocera Corp | 携帯端末、携帯端末の制御方法およびプログラム |
JP2012257119A (ja) * | 2011-06-09 | 2012-12-27 | Jvc Kenwood Corp | 携帯型受信機および放送受信方法 |
JP2016080602A (ja) * | 2014-10-20 | 2016-05-16 | パイオニア株式会社 | 端末装置、エネルギー供給装置、サーバ装置、航行可能範囲表示方法、航行可能範囲表示プログラムおよび記録媒体 |
JP2017201745A (ja) * | 2016-05-02 | 2017-11-09 | キヤノン株式会社 | 画像処理装置、画像処理方法およびプログラム |
JP2017211149A (ja) | 2016-05-26 | 2017-11-30 | 三菱電機ビルテクノサービス株式会社 | 空気調和機 |
JP2018063537A (ja) * | 2016-10-12 | 2018-04-19 | 株式会社東芝 | ホームネットワーク、電子機器、処理装置および表示方法 |
JP2021059231A (ja) | 2019-10-07 | 2021-04-15 | 住友ゴム工業株式会社 | 空気入りタイヤ |
-
2022
- 2022-03-31 EP EP22781292.2A patent/EP4319183A1/en active Pending
- 2022-03-31 US US18/550,014 patent/US20240175597A1/en active Pending
- 2022-03-31 CN CN202280023032.6A patent/CN117044225A/zh active Pending
- 2022-03-31 WO PCT/JP2022/016825 patent/WO2022211083A1/ja active Application Filing
- 2022-03-31 JP JP2022059133A patent/JP7189484B2/ja active Active
- 2022-11-30 JP JP2022192276A patent/JP2023025145A/ja active Pending
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002099971A (ja) * | 2000-09-22 | 2002-04-05 | Sanyo Electric Co Ltd | 緊急通報システム及び緊急通報装置 |
JP2009020007A (ja) * | 2007-07-12 | 2009-01-29 | Denso Corp | 車載ナビゲーション装置及び車載ナビゲーションシステム |
JP2010154124A (ja) * | 2008-12-24 | 2010-07-08 | Kyocera Corp | 携帯端末、携帯端末の制御方法およびプログラム |
JP2012257119A (ja) * | 2011-06-09 | 2012-12-27 | Jvc Kenwood Corp | 携帯型受信機および放送受信方法 |
JP2016080602A (ja) * | 2014-10-20 | 2016-05-16 | パイオニア株式会社 | 端末装置、エネルギー供給装置、サーバ装置、航行可能範囲表示方法、航行可能範囲表示プログラムおよび記録媒体 |
JP2017201745A (ja) * | 2016-05-02 | 2017-11-09 | キヤノン株式会社 | 画像処理装置、画像処理方法およびプログラム |
JP2017211149A (ja) | 2016-05-26 | 2017-11-30 | 三菱電機ビルテクノサービス株式会社 | 空気調和機 |
JP2018063537A (ja) * | 2016-10-12 | 2018-04-19 | 株式会社東芝 | ホームネットワーク、電子機器、処理装置および表示方法 |
JP2021059231A (ja) | 2019-10-07 | 2021-04-15 | 住友ゴム工業株式会社 | 空気入りタイヤ |
Non-Patent Citations (1)
Title |
---|
JEONG YONGJIN, RYO KURAZUME, YUMI IWASHITA, TSUTOMU HASEGAWA: "Global Localization for Mobile Robot using Large-scale 3D Environmental Map and RGB-D Camera", JOURNAL OF THE ROBOTICS SOCIETY OF JAPAN, vol. 31, no. 9, 1 January 2013 (2013-01-01), pages (68) 896 - (78) 906, XP055974433, ISSN: 0289-1824, DOI: 10.7210/jrsj.31.896 * |
Also Published As
Publication number | Publication date |
---|---|
EP4319183A1 (en) | 2024-02-07 |
US20240175597A1 (en) | 2024-05-30 |
JP7189484B2 (ja) | 2022-12-14 |
JP2023025145A (ja) | 2023-02-21 |
JP2022159213A (ja) | 2022-10-17 |
CN117044225A (zh) | 2023-11-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9858643B2 (en) | Image generating device, image generating method, and program | |
US9179182B2 (en) | Interactive multi-display control systems | |
JP5136060B2 (ja) | 画像処理装置、画像処理方法、そのプログラム及びそのプログラムを記録した記録媒体と撮像装置 | |
JP6843164B2 (ja) | プログラム、方法、および情報処理装置 | |
US10917560B2 (en) | Control apparatus, movable apparatus, and remote-control system | |
US20210263168A1 (en) | System and method to determine positioning in a virtual coordinate system | |
US20080225137A1 (en) | Image information processing apparatus | |
JP2001008232A (ja) | 全方位映像出力方法と装置 | |
JP6788845B2 (ja) | 遠隔通信方法、遠隔通信システム及び自律移動装置 | |
JP2019169154A (ja) | 端末装置およびその制御方法、並びにプログラム | |
JP2009010728A (ja) | カメラ設置支援装置 | |
CN112581571B (zh) | 虚拟形象模型的控制方法、装置、电子设备及存储介质 | |
JP2008511877A (ja) | 装置制御方法 | |
JP2005341060A (ja) | カメラ制御装置 | |
JP2005063225A (ja) | 自己画像表示を用いたインタフェース方法、装置、ならびにプログラム | |
TW202041882A (zh) | 擴增實境系統及決定移動感測器型號與安裝位置的方法 | |
WO2022211083A1 (ja) | 登録システム、空気調和システム及び登録プログラム | |
GB2581248A (en) | Augmented reality tools for lighting design | |
CN115904188A (zh) | 户型图的编辑方法、装置、电子设备及存储介质 | |
CN115731349A (zh) | 户型图的展示方法、装置、电子设备及存储介质 | |
WO2022269887A1 (ja) | ウェアラブル端末装置、プログラムおよび画像処理方法 | |
JP2022014758A (ja) | 情報処理装置及びプログラム | |
JP4631634B2 (ja) | 情報出力システム及び情報出力方法 | |
US20230300314A1 (en) | Information processing apparatus, information processing system, and information processing method | |
WO2021215246A1 (ja) | 画像処理装置、画像処理方法、および、プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22781292 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18550014 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280023032.6 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202317064329 Country of ref document: IN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022781292 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2022781292 Country of ref document: EP Effective date: 20231031 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |