CN111794993B - Control of fan assembly - Google Patents

Control of fan assembly

Info

Publication number
CN111794993B
CN111794993B
Authority
CN
China
Prior art keywords
fan assembly
scene
orientation
steerable section
target
Prior art date
Legal status
Active
Application number
CN202010258019.1A
Other languages
Chinese (zh)
Other versions
CN111794993A
Inventor
N. Singh
M. J. Adkin
Current Assignee
Dyson Technology Ltd
Original Assignee
Dyson Technology Ltd
Priority date
Filing date
Publication date
Application filed by Dyson Technology Ltd filed Critical Dyson Technology Ltd
Publication of CN111794993A
Application granted
Publication of CN111794993B

Classifications

    • F04D25/10 Units comprising pumps and their driving means, the working fluid being air, the unit having provisions for automatically changing direction of output air
    • F04D27/00 Control, e.g. regulation, of pumps, pumping installations or pumping systems specially adapted for elastic fluids
    • F04F5/16 Jet pumps, the inducing fluid being elastic fluid displacing elastic fluids
    • F04F5/48 Jet pumps; Control
    • F24F11/56 Control or safety arrangements; Remote control
    • F24F11/63 Control or safety arrangements; Electronic processing
    • F24F11/79 Control systems for controlling the direction of the supplied air
    • G06T7/579 Depth or shape recovery from multiple images from motion
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06V20/20 Scene-specific elements in augmented reality scenes
    • G06V20/36 Indoor scenes
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06T2207/10016 Video; Image sequence
    • G06T2207/20092 Interactive image processing based on input by user
    • G06T2207/30204 Marker

Abstract

A method of controlling a fan assembly capable of changing the direction of air flow emitted therefrom by adjusting the relative orientation of a steerable section relative to a non-steerable section is provided. The method includes, at a remote computer device: capturing a series of images of a scene including the fan assembly; and using the captured images to determine a current position and a current orientation of the fan assembly. The method also includes receiving, from the fan assembly, a current relative orientation of the steerable section relative to the non-steerable section of the fan assembly; receiving a user input indicating a target direction for the air flow emitted from the fan assembly; determining a target relative orientation that aligns the steerable section with the target direction; and sending instructions to the fan assembly to move to the target relative orientation.

Description

Control of fan assembly
Technical Field
The present invention relates to a method for controlling a fan assembly, a fan assembly and an electrical device configured to control a fan assembly.
Background
Conventional domestic fans typically include a set of blades or vanes mounted for rotation about an axis, and a drive arrangement for rotating the set of blades to generate an air flow. The movement and circulation of the air flow creates a "wind chill" or breeze and, as a result, the user experiences a cooling effect as heat is dissipated through convection and evaporation. The blades are typically located in a cage that allows the air flow to pass through the housing while preventing a user from contacting the rotating blades during use of the fan.
US 2,488,467 describes a fan that does not use vanes enclosed in a cage for emitting air from the fan assembly. Instead, the fan assembly includes a base housing a motor-driven impeller to draw an air flow into the base, and a series of concentric annular nozzles connected to the base, the annular nozzles each including an annular outlet positioned at the front of the fan for emitting the air flow from the fan. Each nozzle extends about a bore axis to define a bore about which the nozzle extends.
Each nozzle has an airfoil shape, and may thus be considered to have a leading edge at the rear of the nozzle, a trailing edge at the front of the nozzle, and a chord line extending between the leading and trailing edges. In US 2,488,467, the chord line of each nozzle is parallel to the bore axis of the nozzle. The air outlet is located on the chord line and is arranged to emit the air flow in a direction extending along the chord line, away from the nozzle.
Another fan assembly which does not use blades enclosed in a cage to emit air is described in WO 2010/100451. The fan assembly comprises a cylindrical base which houses a motor-driven impeller for drawing a primary air flow into the base, and a single annular nozzle connected to the base and comprising an annular mouth through which the primary air flow is emitted from the fan. The nozzle defines an opening through which air in the environment surrounding the fan assembly is drawn by the primary air flow emitted from the mouth, thereby amplifying the primary air flow. The nozzle includes a Coanda surface over which the mouth is arranged to direct the primary air flow. The Coanda surface extends symmetrically about the central axis of the opening so that the air flow generated by the fan assembly is in the form of an annular jet having a cylindrical or frusto-conical profile.
WO 2010/046691 also describes a fan assembly. The fan assembly includes a cylindrical base housing a motor-driven impeller to draw a primary air flow into the base, and an annular nozzle connected to the base and including an annular air outlet through which the primary air flow is emitted from the fan. The fan assembly includes a filter for removing particulates from the air flow. The filter may be located upstream of the motor-driven impeller, in which case particulates are removed from the air flow before it passes through the impeller. This protects the impeller from debris and dust which may be drawn into the fan assembly and which might otherwise damage it. Alternatively, the filter may be disposed downstream of the motor-driven impeller. In this configuration, air drawn through the motor-driven impeller, including any exhaust air from the motor, may be filtered and cleaned before passing through the remaining elements of the fan assembly and being supplied to the user.
WO 2016/128732 describes a fan assembly similar to those of WO 2010/100451 and WO 2010/046691. The fan assembly is provided with an air inlet which extends around the entire periphery of the body of the fan so as to maximise the area available for air to be drawn into the fan assembly. The fan assembly is accordingly provided with a tubular, barrel-type filter mounted concentrically over and around the entire circumference of the body of the fan, upstream of the air inlet, and a nozzle which is removably mounted on the body. The filter is not connected to the body or the nozzle, but is held securely in place by the nozzle when the nozzle is mounted on the body, and can only be removed from the fan assembly after the nozzle has been removed. This arrangement means that the filter can simply be lowered onto the body before being secured in place by engagement of the nozzle with the body, and that the filter can easily be removed from the body after removal of the nozzle to allow cleaning or replacement of the filter.
The fan assemblies described in WO 2010/100451, WO 2010/046691 and WO 2016/128732 each include a plurality of user-operable buttons to enable a user to operate the fan. WO 2012/017219 also discloses a fan assembly, in the form of a portable fan heater, provided with a plurality of user-operable buttons to enable a user to control various functions of the fan assembly, and a display for providing the user with a visual indication of the temperature setting of the fan. Similarly, GB 2509111 discloses a fan assembly provided with user interface circuitry including a user-actuable switch for controlling the fan assembly and a display for displaying the current operating settings of the fan assembly.
Disclosure of Invention
According to a first aspect, a method of controlling the direction of the airflow emitted from a fan assembly is provided, wherein the fan assembly is capable of changing the direction of the airflow emitted therefrom by adjusting the relative orientation of a steerable section of the fan assembly with respect to a non-steerable section of the fan assembly. The method includes, at a remote computer device: capturing a series of images of a scene including the fan assembly; and using the captured images to determine a current position of the fan assembly and a current orientation of the steerable section of the fan assembly. The method also includes receiving, from the fan assembly, a current relative orientation of the steerable section relative to the non-steerable section of the fan assembly; receiving a user input indicating a target direction for the airflow emitted from the fan assembly; determining an orientation difference between the current orientation of the steerable section and the target direction; combining the orientation difference and the current relative orientation to determine a target relative orientation of the steerable section that aligns the steerable section with the target direction; and sending instructions to the fan assembly to move to the target relative orientation.
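The combining step above amounts to simple angle arithmetic. A minimal sketch, assuming yaw-only adjustment in degrees (the function names are illustrative, not from the patent):

```python
def wrap_deg(angle):
    """Normalise an angle in degrees to the range [-180, 180)."""
    return (angle + 180.0) % 360.0 - 180.0

def target_relative_orientation(current_yaw, target_yaw, current_relative_yaw):
    """Combine the orientation difference (target minus current, both measured
    in the scene's frame of reference) with the relative orientation reported
    by the fan to get the relative orientation that aligns the steerable
    section with the target direction."""
    orientation_difference = wrap_deg(target_yaw - current_yaw)
    return wrap_deg(current_relative_yaw + orientation_difference)
```

For example, if the steerable section currently faces 90 degrees in the scene at a relative orientation of 10 degrees, a target direction of 120 degrees yields a target relative orientation of 40 degrees.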
The step of capturing a series of images of a scene including the fan assembly may include capturing a series of images of the scene using one or more image sensors of the computer device. The step of determining the current position and current orientation of the steerable section of the fan assembly using the captured images may comprise processing the captured images to generate a scene map, and detecting the fan assembly in the scene map. The scene map is preferably a three-dimensional map of the scene. The step of processing the captured images to generate a scene map may comprise using a feature-based approach to perform simultaneous localization and mapping (SLAM). Such feature-based methods include any of image registration or alignment, visual odometry, and visual-inertial odometry.
The step of processing the captured images to detect the fan assembly in the map of the scene may include implementing an object recognition process to facilitate identifying the fan assembly and determining the position and orientation of the fan assembly in the map. The step of processing the captured images to generate a scene map and detect the fan assembly in the scene map may include generating a point cloud for the scene and matching at least a portion of the generated point cloud with a representative point cloud associated with the fan assembly. The representative point cloud may be stored in association with orientation data defining the orientation of the fan assembly relative to the points of the representative point cloud. The method may also include, when at least a portion of the generated point cloud corresponds to the stored representative point cloud, acquiring the orientation data, and determining the orientation of the fan assembly within the scene using the generated point cloud and the acquired orientation data.
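To make the matching step concrete: once points in the generated cloud have been put into correspondence with points of the stored representative cloud, the fan assembly's orientation can be recovered from the best-fit rotation between the two sets. The sketch below is an illustration, not the patent's algorithm; it solves the simplified yaw-only (two-dimensional, top-down) case with known correspondences in closed form:

```python
import math

def estimate_yaw(reference_points, observed_points):
    """Least-squares yaw (rotation about the vertical axis) that maps the
    stored representative point cloud onto the matched observed points.
    Points are (x, y) pairs in corresponding order; both clouds are centred
    first, so any translation between them drops out."""
    n = len(reference_points)
    rcx = sum(p[0] for p in reference_points) / n
    rcy = sum(p[1] for p in reference_points) / n
    ocx = sum(p[0] for p in observed_points) / n
    ocy = sum(p[1] for p in observed_points) / n
    num = den = 0.0
    for (rx, ry), (ox, oy) in zip(reference_points, observed_points):
        rx, ry = rx - rcx, ry - rcy
        ox, oy = ox - ocx, oy - ocy
        num += rx * oy - ry * ox   # cross term (sine component)
        den += rx * ox + ry * oy   # dot term (cosine component)
    return math.degrees(math.atan2(num, den))
```

A full three-dimensional implementation would use the same idea via the Kabsch algorithm on the matched point sets.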
The step of receiving a user input indicating a target direction may comprise: displaying a captured image of the scene on a display of the device; receiving a user input selecting a location within the displayed image; identifying a location within the scene corresponding to the selected location within the displayed image; and identifying the target direction by determining the orientation of the identified location within the scene relative to the fan assembly. The method may also include displaying a virtual object positioned at the identified location within an image of the scene subsequently displayed on the electronic display.
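Determining the orientation of an identified scene location relative to the fan assembly reduces to a bearing calculation. A minimal sketch, assuming metre-based (x, y, z) scene coordinates with z up (a convention assumed here, not specified by the method):

```python
import math

def direction_to_location(fan_position, location):
    """Yaw (azimuth) and pitch (elevation), in degrees, of a scene location
    as seen from the fan assembly's position. Both arguments are (x, y, z)
    tuples in the scene's frame of reference."""
    dx = location[0] - fan_position[0]
    dy = location[1] - fan_position[1]
    dz = location[2] - fan_position[2]
    yaw = math.degrees(math.atan2(dy, dx))
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return yaw, pitch
```

The returned pair is a scene-frame target direction; it still has to be converted to a relative orientation using the fan's reported current state, as described in the first aspect.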
The step of receiving a user input indicating a target direction may alternatively comprise: displaying a captured image of the scene on a display of the device; receiving a user input selecting a direction within the displayed image; identifying a direction within the scene corresponding to the selected direction within the displayed image; and using the identified direction within the scene as the target direction. The method may also include displaying a virtual object within an image of the scene subsequently displayed on the electronic display, the virtual object having at least a portion aligned with the identified direction.
The method may also include, after receiving a user input indicating a target direction: storing data indicating the target direction; displaying a graphical object associated with the stored data on a display of the device; and, in response to receipt of a user input selecting the displayed graphical object, determining a target relative orientation of the steerable section using the stored data.
The received user input may mark a location in the scene, and the stored data may then include the identified location. The step of displaying the graphical object on the electronic display may then comprise displaying a virtual object positioned at the identified location within the image of the scene displayed on the display. Receiving the user input selecting the displayed graphical object may include receiving a user input selecting the virtual object within the displayed image. The step of using the stored data to determine a target relative orientation of the steerable section may comprise determining the orientation of the marked location relative to the fan assembly in the scene and using the determined orientation as the target direction.
The received user input may instead mark a direction in the scene, and the stored data may then include the identified direction. The step of displaying the graphical object on the display may then comprise displaying a virtual object, at least a portion of which is aligned with the identified direction, within the image of the scene displayed on the display. Receiving the user input selecting the displayed graphical object may include receiving a user input selecting the virtual object within the displayed image. The step of using the stored data to determine a target relative orientation of the steerable section may comprise using the marked direction as the target direction.
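A saved target can therefore be either a marked location, re-resolved against the fan's current position each time it is selected, or a marked direction, used as-is. A minimal sketch of such a preset store, with illustrative names not taken from the patent:

```python
import math

class TargetPresets:
    """Stores saved targets: a marked direction is replayed as-is, while a
    marked location is re-resolved against the fan's position on selection."""

    def __init__(self):
        self._presets = {}

    def save_direction(self, name, yaw_pitch):
        self._presets[name] = ("direction", yaw_pitch)

    def save_location(self, name, location):
        self._presets[name] = ("location", location)

    def resolve(self, name, fan_position):
        """Return the (yaw, pitch) target direction in degrees."""
        kind, value = self._presets[name]
        if kind == "direction":
            return value
        # Marked location: the target direction is the orientation of the
        # stored location relative to the fan assembly's scene position.
        dx = value[0] - fan_position[0]
        dy = value[1] - fan_position[1]
        dz = value[2] - fan_position[2]
        yaw = math.degrees(math.atan2(dy, dx))
        pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
        return (yaw, pitch)
```

Re-resolving marked locations matters because the fan may have been moved since the preset was saved, whereas a marked direction is independent of the fan's position.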
The method may also include initiating formation of a wireless data connection between the remote computer device and the fan assembly.
A method of controlling the direction of the airflow emitted from a fan assembly is also provided, wherein the fan assembly is capable of changing the direction of the airflow emitted therefrom by adjusting the relative orientation of a steerable section of the fan assembly with respect to a non-steerable section of the fan assembly. The method includes, at the fan assembly: receiving, from a remote computer device, a request for the current relative orientation of the steerable section relative to the non-steerable section; and, in response to receipt of the request, determining the current relative orientation of the steerable section and transmitting the current relative orientation to the remote computer device. The method also includes, at the fan assembly: receiving, from the remote computer device, an instruction to move the steerable section to a target relative orientation; and, in response to receipt of the instruction, adjusting the relative orientation of the steerable section from the current relative orientation to the target relative orientation.
The method may also include initiating formation of a wireless data connection between the remote computer device and the fan assembly.
The method may further include, after receiving the request for the current relative orientation of the steerable section, displaying a fiducial marker on a display of the fan assembly. The fan assembly may display the fiducial marker on the display in response to receiving a request from a remote computer device to display the fiducial marker. The fan assembly may display a fiducial marker on the display in response to receiving the request for the current relative orientation. The fan assembly may display the fiducial marker on the display for a predetermined period of time or until an indication is received from the remote computer device that the display of the fiducial marker may cease.
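The fan-side handling described above is a simple request/instruction cycle. A sketch under stated assumptions: the message names and the dictionary-based transport are illustrative, since the patent only requires that the fan can report its relative orientation and accept a target relative orientation.

```python
class FanController:
    """Minimal fan-side message handler: reports the current relative
    orientation on request, and drives the steerable section on command."""

    def __init__(self, actuator):
        self._actuator = actuator   # callable that moves the steerable section
        self._relative_yaw = 0.0    # degrees, steerable vs. non-steerable

    def handle(self, message):
        if message["type"] == "get_relative_orientation":
            # Report the current relative orientation to the remote device.
            return {"type": "relative_orientation", "yaw": self._relative_yaw}
        if message["type"] == "set_relative_orientation":
            # Move the steerable section from the current to the target
            # relative orientation via the actuator.
            target = message["yaw"]
            self._actuator(target - self._relative_yaw)  # signed adjustment
            self._relative_yaw = target
            return {"type": "ack"}
        return {"type": "error", "reason": "unknown message"}
```

In a real device the handler would sit behind the wireless receiver and transmitter, and the relative orientation would be read back from an encoder rather than tracked in software.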
According to a second aspect, there is provided a computer device configured to control a fan assembly, wherein the fan assembly is capable of changing the direction of the airflow emitted therefrom by adjusting the relative orientation of a steerable section of the fan assembly with respect to a non-steerable section of the fan assembly. The computer device comprises a user input device, one or more image sensors, a wireless receiver, a wireless transmitter, and a controller. The controller is configured to: receive, using the wireless receiver, a current relative orientation of the steerable section relative to the non-steerable section from the fan assembly; instruct the one or more image sensors to capture a series of images of a scene; use the captured images to determine when the fan assembly is present within the scene; and, when the fan assembly is present, determine a current position of the fan assembly and a current orientation of the steerable section of the fan assembly within the scene. The controller is further configured to, in response to an input received at the user input device indicating a target direction for the airflow emitted from the fan assembly: determine an orientation difference between the current orientation of the steerable section and the target direction; combine the orientation difference and the current relative orientation to determine a target relative orientation between the steerable section and the non-steerable section that aligns the steerable section with the target direction; and send instructions to the fan assembly, using the wireless transmitter, to move the steerable section to the target relative orientation.
The controller may also be configured to process the captured images to generate a scene map, and to determine when a representation of the fan assembly is present within the scene map. The controller may be configured to implement an object recognition process to facilitate determining when the representation of the fan assembly is present within the map of the scene and, when the representation of the fan assembly is present within the map, determining the position and orientation of the fan assembly within the map. The controller may be configured to generate a point cloud for the scene and determine when at least a portion of the generated point cloud corresponds to a representative point cloud associated with the fan assembly.
The computer device may also include a memory storing a representative point cloud associated with the fan assembly and orientation data defining the orientation of the fan assembly relative to the points of the representative point cloud. The computer device may also include an electronic display, and the controller may be configured to cause the captured image of the scene to be displayed on the electronic display. The controller may be configured to, in response to an input received at the user input device selecting a location within the displayed image, identify a location within the scene corresponding to the selected location within the displayed image, and identify the target direction by determining the orientation of the identified location within the scene relative to the fan assembly. The controller may be configured to cause the electronic display to display a virtual object positioned at the identified location within an image of the scene subsequently displayed on the electronic display.
The controller may be configured to, in response to an input received at the user input device selecting a direction within the displayed image, identify a direction within the scene corresponding to the selected direction within the displayed image, and use the identified direction within the scene as the target direction. The controller may be configured to cause the electronic display to display a virtual object within an image of the scene subsequently displayed on the electronic display, the virtual object being at least partially aligned with the identified direction.
The remote computer device may be a portable computer device, such as a smartphone or a laptop. The one or more image sensors may include any of a charge-coupled device (CCD) sensor and a complementary metal-oxide-semiconductor (CMOS) or active-pixel sensor. The remote computer device may also include one or more motion sensors, such as an inertial measurement unit (IMU). The remote computer device may then be configured to enhance processing of the captured images using input from the one or more motion sensors.
There is also provided a fan assembly comprising an air flow generator for generating an air flow, a non-steerable section, a steerable section, one or more actuators for adjusting the relative orientation of the steerable section with respect to the non-steerable section, and an air outlet arranged to emit the air flow from the fan assembly, the air outlet being provided on the steerable section. The fan assembly also includes a wireless receiver, a wireless transmitter, and a controller. The controller is configured to: receive, using the wireless receiver, a request from a remote computer device for the current relative orientation of the steerable section relative to the non-steerable section; in response to receipt of the request, determine the current relative orientation of the steerable section and transmit, using the wireless transmitter, the current relative orientation to the remote computer device; receive, using the wireless receiver, an instruction from the remote computer device to move the steerable section to a target relative orientation; and, in response to receipt of the instruction, cause the one or more actuators to move the steerable section from the current relative orientation to the target relative orientation.
The fan assembly may be capable of changing the direction of the air flow emitted therefrom to any direction within the adjustment range of the fan assembly.
The fan assembly may also include an electronic display. The controller may then be configured to cause the electronic display to display a fiducial marker after receiving the request for the current relative orientation of the steerable section. The controller may be configured to cause the electronic display to display the fiducial marker in response to receiving a request from the remote computer device to display the fiducial marker. The controller may be configured to cause the electronic display to display the fiducial marker in response to receiving the request for the current relative orientation. The controller may be configured to cause the electronic display to display the fiducial marker for a predetermined period of time or until an indication is received from the remote computer device that display of the fiducial marker may cease. The electronic display may be disposed on the steerable section.
The steerable section may be mounted on or supported by the non-steerable section. The steerable section may be arranged to rotate about an axis of rotation relative to the non-steerable section. The non-steerable section may include a base of the fan assembly. The steerable section may comprise a nozzle of the fan assembly from which the air stream is emitted. The air outlet may be provided on the nozzle. The steerable section may comprise the body of the fan assembly which is mounted on or supported by the base of the fan assembly and the nozzle may then be mounted on or supported by the body. The non-steerable section may comprise a body of the fan assembly that is mounted on or supported by a base of the fan assembly, and the nozzle may be mounted on or supported by the body. The nozzle may then be arranged to rotate relative to both the base and the body. The body of the fan assembly may comprise an air flow generator arranged to generate an air flow emitted by the nozzle. The airflow generator may comprise a motor driven impeller.
Drawings
Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
FIG. 1 is a flow chart showing an example of steps that may be performed by a remote computer device in order to implement a method for controlling a fan assembly;
FIG. 2 is a flow chart showing an example of steps that may be performed by the fan assembly to implement a method such as that shown in FIG. 1;
FIG. 3 schematically illustrates an example of a system suitable for implementing the methods described herein;
FIG. 4 is a sequence diagram illustrating a detailed example of the method described herein;
FIGS. 5a and 5b are sequence diagrams illustrating another detailed example of the methods described herein;
FIG. 6a is an example of a fan assembly to which the methods described herein are applicable, while FIGS. 6b and 6c illustrate alternative configurations of the fan assembly of FIG. 6 a;
FIGS. 7a, 7b, 7c and 7d schematically illustrate a first example of a usage scenario for the methods described herein;
FIGS. 8a, 8b and 8c schematically illustrate a second example of a usage scenario for the methods described herein;
FIGS. 9a, 9b, 9c and 9d schematically illustrate a third example of a usage scenario for the methods described herein; and
FIGS. 10a, 10b, 10c and 10d schematically illustrate a fourth example of a usage scenario for the methods described herein.
Detailed Description
A method of controlling the direction of airflow emitted from a fan assembly will now be described. The method is for a fan assembly that is capable of changing a direction of airflow emitted from the fan assembly by adjusting an orientation offset between a steerable section of the fan assembly and a non-steerable section of the fan assembly. The offset in orientation between the steerable section and the non-steerable section is the difference between the orientation of the steerable section and the orientation of the non-steerable section, and may thus also be referred to as the relative orientation of the steerable section with respect to the non-steerable section. The method involves using a remote computer device to capture a series of images of a scene including a fan assembly (i.e., images of the fan assembly and its surroundings), determining a position of the fan assembly and an orientation of a steerable section of the fan assembly using the captured images, and in response to a user input indicating a target direction of airflow emitted from the fan assembly in the scene, calculating an orientation offset that aligns the steerable section of the fan assembly with the target direction, and sending instructions to the fan assembly to move the steerable section to the calculated orientation offset.
Geometrically, the position and orientation of an object is given relative to a frame of reference, where the frame of reference includes a coordinate system and a set of physical reference points that locate and orient the coordinate system. In general, reference to the position and orientation of an object thus means the position and orientation relative to the scene in which the object is located (and relative to its surroundings), which is defined relative to a frame of reference that is fixed relative to the scene and the real world. The position of an object in a scene may thus also be referred to as the "scene position" or "world position" of the object, and the orientation of the object in the scene may also be referred to as the "scene orientation" or "world orientation". Conversely, the term "relative orientation" as used herein refers to the orientation of a first object relative to a second object. The relative orientation is thus defined relative to a frame of reference that is fixed relative to one or the other of the first and second objects, but not necessarily relative to the scene or real world.
The term "fan assembly" refers herein to a fan assembly configured to generate and deliver an air flow for the purposes of thermal comfort and/or environmental or climate control. Such a fan assembly may be capable of generating one or more of a dehumidified air stream, a humidified air stream, a purified air stream, a filtered air stream, a cooled air stream, and a heated air stream. Such fan assemblies are generally capable of changing the direction of the air flow emitted therefrom to any direction within the adjustment range of the fan assembly. The term "adjustment range" as used herein refers to the extent to which the direction of the emitted air flow may be varied, and is thus synonymous with the terms range of motion and range of travel (when used with respect to a mechanical system). As an example, some fan assemblies are configured to be capable of sweeping (i.e., rotating in a horizontal plane, about a vertical axis) such that the adjustment range will then be defined by the angle that the fan assembly is capable of sweeping, while some other fans are configured to be capable of sweeping and tilting (i.e., rotating in a vertical plane, about a horizontal axis), such that the adjustment range will then be defined by a combination of the angle that the fan assembly is capable of sweeping and the angle that the fan assembly is capable of tilting.
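The notion of an adjustment range can be sketched as a simple clamping operation. In the sketch below, the ±170° sweep limit and the −5° to +45° tilt limits are invented for illustration only and are not taken from any particular fan assembly.

```python
# Hypothetical sketch: constraining a requested airflow direction to a fan
# assembly's adjustment range. Limits are invented for this example.

def clamp(value, low, high):
    """Constrain value to the inclusive range [low, high]."""
    return max(low, min(high, value))

def clamp_to_adjustment_range(sweep_deg, tilt_deg,
                              sweep_limit=170.0,
                              tilt_limits=(-5.0, 45.0)):
    """Return the nearest achievable (sweep, tilt) pair for a request."""
    return (clamp(sweep_deg, -sweep_limit, sweep_limit),
            clamp(tilt_deg, *tilt_limits))

print(clamp_to_adjustment_range(200.0, 60.0))  # -> (170.0, 45.0)
```

A fan that can only sweep would simply omit the tilt component of this calculation.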
The methods described herein thus relate to controlling a fan assembly using a remote computer device. The term "remote computer device" as used herein refers to a computer device separate from the fan assembly that is capable of interacting with the fan assembly at a distance (e.g., using wireless communication). For example, the remote computer device may be provided by a laptop or smartphone configured with a computer program/software application implementing the necessary methods, and having a wireless transmitter and a wireless receiver allowing wireless communication.
FIG. 1 thus illustrates a flow chart of an example of steps that may be performed by a remote computer device in order to implement the methods described herein. At step 101, the remote computer device acquires a series of images of a scene including the fan assembly (i.e., images of the fan assembly and its surroundings) using one or more image sensors. In order to implement the methods described herein, the fan assembly does not have to be present in every captured image. For example, a user of the remote computer device may scan the device across the scene such that the fan assembly is only present in a subset of the captured images, with the remaining images capturing other portions of the scene. In particular, scanning the scene while capturing the series of images is advantageous when determining depth information from the series of images, as the different perspectives of features within the scene may then be used to implement a disparity-based technique to estimate depth.
At step 102, the remote computer device processes the captured images to determine the position of the fan assembly in the scene and the orientation of the steerable section of the fan assembly in the scene. For example, the position of the fan assembly in the scene may include coordinates (x, y, z) within a fixed frame of reference relative to the imaged scene. The orientation of the steerable section within the scene may then comprise a set of vectors defining the rotation of the steerable section within the same frame of reference. For example, a frame of reference used to define the position and orientation in the scene may include a set of coordinates whose origin is fixed relative to the position of the remote computer device in the real world (when the scene imaging begins).
At step 103, the remote computer device receives a user input indicating a target direction of air flow emitted from a fan assembly within a scene. The user input may be provided in any manner for identifying a direction of a target within a scene. For example, the user input may include a selection of a location within an image of the scene displayed on an electronic display of the remote computer device, wherein the selection is translated from a two-dimensional image location to a location in the three-dimensional scene.
At step 104, the remote computer device receives the current relative orientation (i.e., orientation offset) of the steerable section from the fan assembly, where the current relative orientation comprises the current orientation of the steerable section relative to the non-steerable section of the fan assembly. At step 105, the remote computer device determines a target relative orientation for the steerable section. The target relative orientation comprises an orientation offset that aligns the steerable section of the fan assembly with the target direction. At step 106, the remote computer device sends instructions to the fan assembly to move the steerable section from the current relative orientation to the target relative orientation.
In the example of FIG. 1, steps 102, 103 and 104 are independent of each other and thus may be performed in any order. In particular, any two or more of steps 102, 103, and 104 may occur simultaneously or sequentially and in any order. For example, step 102 may occur before step 103, with step 103 occurring before step 104. As another example, step 102 may occur before step 103, with step 104 occurring simultaneously with either step 102 or step 103. As yet another example, step 104 may occur before both steps 102 and 103, with steps 102 and 103 occurring simultaneously.
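The geometric core of steps 102 to 105 can be sketched as follows for a fan assembly that sweeps only in the horizontal plane. The function names and coordinate conventions are assumptions made for this illustration, not the method of any particular embodiment.

```python
import math

# Sketch of step 105: given the fan's scene position, the steerable
# section's scene (world) bearing, and the current orientation offset,
# compute the offset that aligns the steerable section with a target point.
# All angles in degrees; positions are (x, z) coordinates in the scene.

def bearing_deg(from_xz, to_xz):
    """World-frame bearing of to_xz as seen from from_xz."""
    dx = to_xz[0] - from_xz[0]
    dz = to_xz[1] - from_xz[1]
    return math.degrees(math.atan2(dz, dx))

def target_relative_orientation(fan_pos_xz, steerable_bearing_deg,
                                current_offset_deg, target_xz):
    """Orientation offset that points the air outlet at target_xz."""
    desired = bearing_deg(fan_pos_xz, target_xz)
    # The non-steerable section's fixed world bearing is the steerable
    # section's world bearing minus the current offset; the new offset is
    # measured from that fixed reference.
    non_steerable = steerable_bearing_deg - current_offset_deg
    return (desired - non_steerable + 180.0) % 360.0 - 180.0

# Fan at the origin, steerable section facing 30 deg with a 10 deg current
# offset; the user selects a point at world bearing 90 deg:
print(target_relative_orientation((0.0, 0.0), 30.0, 10.0, (0.0, 5.0)))  # -> 70.0
```

A fan that also tilts would repeat the same calculation for the elevation angle.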
FIG. 2 shows a flow chart of an example of steps that may be performed by the fan assembly to implement a method such as that shown in FIG. 1. At step 201, the fan assembly receives a request from the remote computer device for the current relative orientation of the steerable section relative to the non-steerable section. At step 202, the fan assembly determines the current relative orientation of the steerable section and sends it to the remote computer device. At step 203, the fan assembly receives instructions from the remote computer device to move to a target relative orientation. At step 204, the fan assembly adjusts the relative orientation of the steerable section from the current relative orientation to the target relative orientation.
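The fan-side handling of FIG. 2 might be sketched as follows. The Fan class and the message format are invented for illustration and do not describe any particular wireless protocol.

```python
# Hypothetical sketch of the fan-side steps of FIG. 2. A real unit would
# receive these messages over its wireless receiver and reply via its
# transmitter; here they are plain dictionaries.

class Fan:
    def __init__(self):
        self.relative_orientation = 0.0  # degrees, steerable vs. non-steerable

    def handle(self, message):
        kind = message["type"]
        if kind == "get_orientation":      # steps 201-202
            return {"type": "orientation", "value": self.relative_orientation}
        if kind == "set_orientation":      # steps 203-204
            self.relative_orientation = message["target"]
            return {"type": "ack"}
        return {"type": "error", "reason": "unknown message"}

fan = Fan()
print(fan.handle({"type": "get_orientation"}))   # reports the current offset
fan.handle({"type": "set_orientation", "target": 45.0})
print(fan.relative_orientation)                   # -> 45.0
```

In practice step 204 would drive the one or more actuators rather than simply overwrite a variable, but the request/reply structure is the same.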
FIG. 3 schematically illustrates a preferred embodiment of a system suitable for implementing the methods described herein, wherein the system includes both a fan assembly 300 and a remote computer device 310 capable of wirelessly communicating with the fan assembly 300. In the illustrated embodiment, the fan assembly 300 is implemented as a combination of mechanical components, computer hardware and software, and includes an airflow generator 301 for generating an airflow; a non-steerable section 302a on which the fan assembly 300 is supported; a steerable section 302b connected to the non-steerable section 302a and whose orientation relative to the non-steerable section 302a is adjustable; one or more actuators 303 for adjusting the orientation (i.e., the orientation offset) of the steerable section 302b relative to the non-steerable section 302a; and an air outlet 304 disposed at the steerable section 302b and arranged to emit an air flow from the fan assembly 300. Adjusting the orientation offset between the steerable section 302b and the non-steerable section 302a thereby adjusts the orientation of the air outlet 304 disposed on the steerable section 302b, such that the direction of the air flow emitted from the fan assembly 300 changes. Depending on the type of fan assembly, the non-steerable section 302a may include a base or pedestal on which the fan assembly 300 rests, as with an upright fan assembly. In this example, the steerable section 302b would then be supported on the base or pedestal, either directly or indirectly. As an alternative example, the non-steerable section 302a may include a mount or bracket by which the fan assembly 300 is attached/secured to a surface, as with a ceiling- or wall-mounted fan assembly. In this example, the steerable section 302b would then be supported on the mount or bracket, either directly or indirectly.
The non-steerable section 302a of the fan assembly 300 may thus also be referred to as a stationary section or a support section of the fan assembly 300.
The fan assembly 300 also includes a controller 305 configured to control the various functions of the fan assembly 300, a wireless receiver 306 by which the fan assembly 300 receives signals/messages from the remote computer device 310, and a wireless transmitter 307 by which the fan assembly 300 transmits signals/messages to the remote computer device 310. For example, the wireless receiver 306 and the wireless transmitter 307 may each be adapted to communicate wirelessly using any of Wireless Local Area Network (WLAN) technology (e.g., Wi-Fi) and Wireless Personal Area Network (WPAN) technology (e.g., Bluetooth).
The controller 305 may include electronic components mounted on a circuit board having an electronic interface with the one or more actuators 303, and preferably with each of the wireless receiver 306, the wireless transmitter 307, and the air flow generator 301. In particular, the controller 305 may include a processor, such as a central processing unit or microprocessor. The controller 305 may then also include memory (e.g., a main memory, such as Random Access Memory (RAM), that is directly accessible by the processor, as well as secondary memory for storing any data and any program/software application executed by the processor). As examples, the controller 305 may include a microcontroller or a system on a chip (SoC).
As described above, the controller 305 is configured to control various functions of the fan assembly. In particular, the controller 305 is configured to control the one or more actuators 303, and thereby control the orientation offset between the steerable section 302b and the non-steerable section 302a. The controller 305 is also configured to determine or monitor the relative orientation of the steerable section 302b with respect to the non-steerable section 302a. For example, the controller 305 may be configured to monitor the status of the one or more actuators 303 in order to monitor the relative orientation of the steerable section 302b with respect to the non-steerable section 302a. Alternatively, the fan assembly 300 may also include an orientation offset sensor 308 configured to monitor the relative orientation of the steerable section 302b with respect to the non-steerable section 302a. The controller 305 would then be configured to receive the output from the orientation offset sensor 308, which provides an indication of the relative orientation of the steerable section 302b with respect to the non-steerable section 302a.
The controller 305 may also be configured to process any messages received from the remote computer device 310 and generate any messages to be transmitted to the remote computer device 310. The controller 305 may thus be configured to control both the wireless receiver 306 and the wireless transmitter 307. In particular, the controller 305 may be configured to provide the current relative orientation (i.e., orientation offset) of the steerable section 302b to the remote computer device 310. For example, the controller 305 may be configured to process a request for the current relative orientation (which is received by the wireless receiver 306 from the remote computer device 310), and in response to receipt of the request, generate a reply that includes the current relative orientation and send the reply to the remote computer device 310 using the wireless transmitter 307. The request for the current relative orientation of the steerable section 302b may be a message specifically for this purpose. Alternatively, the request may be implicit, i.e., a message that is treated as a request for the current relative orientation. For example, the controller 305 may be configured to send the current relative orientation of the steerable section 302b to the remote computer device 310 as part of an orientation control process initiated by the fan assembly 300 in response to receiving a message from the remote computer device 310.
Optionally, the fan assembly 300 may also include an electronic display 309. The controller 305 may then also be configured to cause the fan assembly 300 to display fiducial markers on the electronic display 309. Fiducial markers displayed by fan assembly 300 may aid in the identification of fan assembly 300 in images captured by remote computer device 310, as described in more detail below. The fiducial markers displayed by the fan assembly 300 are particularly advantageous when the electronic display 309 is disposed on the steerable section 302b, as this may then aid in determining the orientation of the steerable section 302b of the fan assembly 300 during object identification. To this end, fiducial markers typically comprise images of known patterns and sizes that can be used to improve the estimation of position and orientation in the object recognition process. In particular, the appearance of fiducial markers is designed to aid in the detection of objects by providing images that are recognizable and easily distinguishable from the environment in a typical scene. For example, it may take the form of a QR code or the like. The fiducial marks will thus have a predetermined appearance and be stored in the memory of the fan assembly 300.
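One reason a fiducial marker of known physical size aids pose estimation can be illustrated with the pinhole camera model: the marker's apparent size in the image directly constrains its distance. The focal length and marker size below are invented values for this sketch.

```python
# Sketch of the pinhole relation used when a fiducial marker of known size
# is detected: a marker of physical width W appearing w pixels wide at
# focal length f (in pixels) lies at distance Z = f * W / w.

def marker_distance(focal_px, marker_width_m, apparent_width_px):
    """Distance (metres) to a fiducial marker of known physical width."""
    if apparent_width_px <= 0:
        raise ValueError("marker must be visible in the image")
    return focal_px * marker_width_m / apparent_width_px

# f = 800 px, a 5 cm marker appearing 40 px wide:
print(marker_distance(800.0, 0.05, 40.0))  # -> 1.0 metre
```

A full pose estimate would additionally recover the marker's orientation from the perspective distortion of its known pattern, which is why a marker on the steerable section helps determine that section's orientation.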
The controller 305 may be configured to cause the electronic display 309 to display the fiducial marker after receiving the request for the current relative orientation of the steerable section 302b. The controller 305 may then begin displaying the fiducial marker at any time after receiving the request (e.g., as part of an orientation control process that begins in response to receipt of the request), and may continue to display the marker for a predetermined period of time or until the fan assembly 300 receives a further message from the remote computer device 310 indicating, either explicitly or implicitly, that the fan assembly 300 need no longer display the fiducial marker. As an alternative example, the controller 305 may be configured to cause the electronic display 309 to display the fiducial marker in response to receiving an explicit request to display the fiducial marker from the remote computer device 310. Likewise, the controller 305 may then begin displaying the fiducial marker, and may continue to display the marker for a predetermined period of time or until the fan assembly 300 receives a further message from the remote computer device 310 indicating that the fan assembly 300 need no longer display the fiducial marker.
The controller 305 may also be configured to receive instructions from the remote computer device 310 to move to a target relative orientation, and to adjust the relative orientation of the steerable section 302b from the current relative orientation to the target relative orientation in response to the instructions. To this end, the controller 305 may be configured to process a request received from the remote computer device 310 via the wireless receiver 306, the request including the target relative orientation, and to control the one or more actuators 303 to make the required adjustment to the orientation of the steerable section 302b relative to the non-steerable section 302a.
The controller 305 may also be configured to control the air flow generator 301. For example, the airflow generator 301 may comprise a motor-driven impeller, and the controller 305 may then be configured to control the speed of the motor/impeller in order to control the flow rate of the generated airflow. The control of the air flow generator 301 may be based on a user input selecting one of a plurality of desired operating states of the air flow generator 301 (e.g., a user-selected fan speed), and/or based on a control algorithm implemented by the controller 305. For example, the control algorithm may provide an automatic mode of the air flow generator 301 in which the operational status of the air flow generator 301 is determined based on input from one or more environmental sensors (not shown) associated with the fan assembly 300.
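Such an automatic mode might, as a hedged sketch, map an environmental sensor reading to an operating state. The particulate thresholds and the 1–10 speed scale below are invented for illustration and are not taken from any embodiment.

```python
# Hypothetical automatic-mode rule: choose a fan speed setting from a
# particulate (PM2.5) sensor reading. Thresholds and scale are invented.

def auto_speed(pm25_ug_m3):
    """Pick a fan speed (1-10) from a PM2.5 reading in micrograms/m^3."""
    if pm25_ug_m3 < 12:
        return 1    # clean air: minimum circulation
    if pm25_ug_m3 < 35:
        return 4
    if pm25_ug_m3 < 55:
        return 7
    return 10       # heavily polluted: maximum purification

print(auto_speed(20))  # -> 4
```

A real control algorithm would likely also smooth the sensor signal to avoid rapid speed oscillation near a threshold.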
Although the air outlet 304 of the fan assembly 300 is disposed on the steerable section 302b, such that adjustment of the orientation offset between the steerable section 302b and the non-steerable section 302a results in a change in the direction of the airflow emitted from the fan assembly 300, other components of the fan assembly 300 may be disposed on or within either of the steerable section 302b and the non-steerable section 302a. For example, the air flow generator 301, the one or more actuators 303, the controller 305, the wireless receiver 306, and the wireless transmitter 307 may all be housed within the steerable section 302b such that they move with the steerable section 302b when the orientation offset is adjusted. Alternatively, the air flow generator 301, the one or more actuators 303, the controller 305, the wireless receiver 306, and the wireless transmitter 307 may all be housed within the non-steerable section 302a such that they do not move with the steerable section 302b. As another alternative, a sub-combination of these components may be housed within the steerable section 302b such that they move with the steerable section 302b, while the remainder of these components are housed within the non-steerable section 302a. When the fan assembly 300 further includes an electronic display 309, the electronic display 309 may be disposed on either of the steerable section 302b and the non-steerable section 302a. However, as described above, it may be beneficial to place the electronic display 309 on the steerable section 302b when the electronic display 309 is used to display the fiducial marks.
The remote computer device 310 is implemented as a combination of computer hardware and software and includes one or more user input devices 311 configured to receive input provided by a user of the remote computer device; one or more image sensors 312 configured to capture images (i.e., images of the scene surrounding/within the field of view of the computer device); a controller 313 configured to control various functions of the remote computer device 310; a wireless transmitter 314 by which the remote computer device 310 sends signals/messages to the fan assembly 300; and a wireless receiver 315 by which the remote computer device 310 receives signals/messages from the fan assembly 300. For example, the wireless transmitter 314 and the wireless receiver 315 may each be adapted to communicate wirelessly using any of Wireless Local Area Network (WLAN) technology (e.g., Wi-Fi) and Wireless Personal Area Network (WPAN) technology (e.g., Bluetooth). The remote computer device 310 further comprises an electronic display 316 arranged to present images and/or data to a user of the remote computer device 310.
The one or more user input devices 311 may include any device that enables a user to provide input to the remote computer device 310. For example, the one or more user input devices may include one or more of a keyboard, a pointing device (e.g., a mouse, touchpad, or gamepad), buttons and/or knobs, an acoustic input device for voice control (e.g., a microphone), a gesture recognition control device (e.g., camera(s) and/or inertial measurement unit), or a contact interface (e.g., a touchscreen). In particular, the remote computer device 310 may include a touch screen that combines an electronic display 316 with a user input device 311. Alternatively, or in addition, the remote computer device 310 may include one or more user input devices 311 that are separate from the electronic display 316.
The controller 313 may include electronic components mounted on a circuit board having an electronic interface with the one or more user input devices 311 and the image sensor 312, and preferably with each of the wireless transmitter 314, the wireless receiver 315, and the electronic display 316. In particular, the controller 313 may include a processor, such as a central processing unit or microprocessor. The controller 313 may then also include memory (e.g., a main memory, such as Random Access Memory (RAM), that is directly accessible by the processor, as well as secondary memory for storing any data and any program/software application executed by the processor).
As described above, the controller 313 is configured to control various functions of the remote computer device 310. In particular, the controller 313 is configured to control the image sensor 312 to capture a series of images of a scene, process the captured images to determine a position of the fan assembly 300 within the scene, and determine an orientation of the steerable section 302b of the fan assembly 300 within the scene, calculate an orientation offset that aligns the steerable section 302b of the fan assembly 300 with a user target direction, and send instructions to the fan assembly 300 to move the steerable section 302b to the calculated orientation offset.
To determine both the position of the fan assembly 300 in the scene and the orientation of the steerable section 302b of the fan assembly 300 in the scene, the controller 313 may be configured to process images captured by the onboard image sensor 312 to determine the position and orientation (i.e., the pose) of the remote computer device 310. This position and orientation of the remote computer device 310 is defined relative to a frame of reference that is fixed relative to the imaged scene and thus also relative to the real world. For example, the frame of reference typically includes a set of coordinates whose origin is fixed at the location of the remote computer device 310 in the real world when imaging of the scene begins (i.e., the location of the remote computer device 310 when the first image of the series of images is captured).
For example, the controller 313 may be configured to implement visual odometry to determine the position and orientation of the remote computer device 310 relative to the imaged scene. Visual odometry typically involves extracting features from each image and correlating/tracking these features across the series of images to estimate the motion of the image sensor. Alternatively, the controller 313 may be configured to implement visual-inertial odometry to determine the position and orientation of the remote computer device 310 relative to the imaged scene. Visual-inertial odometry combines traditional visual odometry with data from motion sensors, such as Inertial Measurement Units (IMUs), to improve pose estimation. The remote computer device 310 may thus further include one or more motion sensors 317, such as acceleration sensors and gyroscopes, to provide inertial information for visual-inertial odometry.
In addition to determining the position and orientation of the remote computer device 310 relative to the imaged scene, the controller 313 may also be configured to process a series of images captured by the onboard image sensor 312 to generate a three-dimensional map of the imaged scene. For example, the controller 313 may be configured to use parallax effects on features identified in a series of images to reconstruct depth information of the features in order to build a three-dimensional map of the imaged scene. These three-dimensional maps typically take the form of a point cloud of the scene. The controller 313 may be configured to simultaneously determine the pose of the remote computer device 310 and generate a three-dimensional map of the imaged scene using simultaneous localization and mapping (SLAM) techniques.
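The parallax principle underlying such depth reconstruction can be illustrated with the standard stereo relation: for a camera of focal length f (in pixels) translated by a baseline b between two views, a feature's depth follows from its pixel disparity d as Z = f·b/d. The numeric values below are invented for this sketch.

```python
# Sketch of depth-from-parallax: Z = f * b / d, where f is the focal length
# in pixels, b the distance the camera moved between views, and d the
# pixel shift (disparity) of the tracked feature between the two images.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth (metres) of a feature from its parallax disparity."""
    if disparity_px <= 0:
        raise ValueError("feature must shift between views")
    return focal_px * baseline_m / disparity_px

# f = 800 px, camera moved 0.1 m, feature shifted 20 px between frames:
print(depth_from_disparity(800.0, 0.1, 20.0))  # -> 4.0 metres
```

Repeating this for many tracked features yields the point cloud of the scene referred to above.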
To determine the position of the fan assembly 300 within the scene and the orientation of the steerable section 302b of the fan assembly 300 within the scene, the controller 313 may be configured to implement an object recognition process to determine when the fan assembly 300 is present within the scene, and to determine the position of the fan assembly 300 and the orientation of the steerable section 302b of the fan assembly 300 within the scene when the fan assembly 300 is present within the scene. To this end, the controller 313 may be configured to determine whether a portion of the three-dimensional map of the imaged scene matches/correlates with a reference representation of the fan assembly 300 and, if so, the location and orientation of the matching/correlated portion of the three-dimensional map within the scene. To this end, the reference representation of the fan assembly 300 should be in a format similar to the three-dimensional map of the imaged scene and must be stored within the memory of the remote computer device 310. The controller 313 may then scan the three-dimensional map of the imaged scene for a portion that matches the reference representation. For example, when the three-dimensional map takes the form of a three-dimensional point cloud generated for the scene, the reference representation of the fan assembly 300 should also take the form of a three-dimensional point cloud. Determining whether a portion of the three-dimensional point cloud of the imaged scene matches/correlates with the reference representation of the fan assembly 300 then includes comparing the point cloud model of the fan assembly 300 to portions of the point cloud generated for the scene.
In this example, the point matching or point set alignment of the point cloud model of the fan assembly 300 then also provides a transformation that aligns the point cloud model of the fan assembly 300 with the matching portion of the point cloud generated for the scene and thereby provides an estimate of the position of the fan assembly 300 within the scene and the orientation of the steerable section 302b of the fan assembly 300 within the scene.
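The alignment step can be illustrated, in two dimensions for brevity, by a least-squares (Kabsch-style) estimate of the rotation and centroid shift between a point-cloud model and its matched scene points. This is an illustrative sketch under simplified assumptions, not the matching algorithm of any particular embodiment.

```python
import math

# Illustrative 2-D rigid alignment: given corresponding model and scene
# points, estimate the rotation (about the centroids) and the centroid
# offset that map the model onto the scene.

def align_2d(model, scene):
    """Return (rotation_deg, centroid_offset) mapping model onto scene."""
    n = len(model)
    mcx = sum(p[0] for p in model) / n
    mcy = sum(p[1] for p in model) / n
    scx = sum(p[0] for p in scene) / n
    scy = sum(p[1] for p in scene) / n
    # Least-squares rotation from sums of dot and cross products of the
    # centroid-centred point pairs (2-D Kabsch).
    c = s = 0.0
    for (mx, my), (sx, sy) in zip(model, scene):
        ax, ay = mx - mcx, my - mcy
        bx, by = sx - scx, sy - scy
        c += ax * bx + ay * by   # dot products
        s += ax * by - ay * bx   # cross products
    angle = math.degrees(math.atan2(s, c))
    return angle, (scx - mcx, scy - mcy)

model = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
scene = [(2.0, 3.0), (2.0, 4.0), (1.0, 3.0)]   # model rotated 90 deg, shifted
angle, shift = align_2d(model, scene)
print(round(angle, 1))  # -> 90.0
```

The recovered rotation plays the role described above: applied to the model of the steerable section, it gives that section's orientation within the scene.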
The controller 313 is also configured to process inputs received at the user input device to identify a target direction for the airflow emitted from the fan assembly 300 within the scene. The required processing of the user input depends on the type of user input. For example, the remote computer device 310 may be configured to provide an augmented reality interface by which a user may provide input indicating a target direction within the scene. Augmented reality refers to systems and devices that capture images, augment those images with additional information, and then present the augmented images on a display. This enables, for example, a user to capture a video stream of a scene using a computer device such as a smartphone and display the scene along with additional information in real-time or near real-time. This additional information may include virtual objects placed in the scene such that they appear as if they were part of the real world. In particular, a virtual object may appear to remain at the same position relative to the scene even if the user moves the computer device such that the perspective of the image sensor capturing the images of the scene changes.
When providing an augmented reality interface, the controller 313 may be configured to display a series of real-time or near real-time images of a scene captured by the image sensor 312 on the electronic display 316 of the remote computer device 310, wherein the at least one user input device 311 of the remote computer device 310 is then configured to enable the user to select a location on the electronic display 316 that marks the location/point in the image displayed on the display 316. The controller 313 may then be configured to identify a location within the scene that corresponds to the selected image location and determine the target direction by determining an orientation of the identified location in the scene relative to the fan assembly 300. The controller 313 may then also be configured to display a virtual object on the electronic display 316 that is anchored/fixed to the identified location in a subsequent scene image displayed on the display 316. Displaying the virtual objects at the identified locations in the scene provides feedback to the user that allows them to check whether the desired target direction for the air flow emitted from the fan assembly 300 has been accurately selected.
Alternatively, the at least one user input device 311 of the remote computer device 310 may be configured to enable a user to select a position on the electronic display 316 that indicates an orientation within an image displayed on the electronic display 316. The controller 313 may then be configured to identify a direction within the scene corresponding to the selected direction in the displayed image and use the identified direction within the scene as the target direction. The controller 313 may then also be configured to display a virtual object on the electronic display 316, at least a portion of which is aligned with the identified direction in a subsequent scene image displayed on the display 316.
When the remote computer device 310 is configured to provide an augmented reality interface, it is preferred that the electronic display 316 and the user input device 311 of the remote computer device 310 are combined into a touch screen that allows a user to mark a position or direction within the scene directly by touching a portion of the touch screen that is displaying an image of the scene. For example, the user may touch a point on the touch screen to mark the corresponding location within the displayed image of the scene. The controller 313 is then configured to identify a location within the scene corresponding to the selected image location (by converting between the image plane coordinates (x′, y′) of the selected image location and the coordinates (x, y, z) of the corresponding location in the scene), and to determine the target direction by determining the orientation of the identified location in the scene relative to the fan assembly 300. As an alternative example, in a particular mode of operation, the controller 313 may be configured to display a virtual object in a displayed image of the scene that presents the current orientation of the steerable section 302b within the scene (e.g., an arrow or other directional indicator extending from the fan assembly) or a currently set range of oscillation of the steerable section 302b (e.g., a circle or sector centered on the fan assembly 300). For a virtual object that presents the current orientation of the steerable section 302b, the user may drag or slide the depiction of the virtual object on the touchscreen to change the direction of the virtual object. The controller 313 may then be configured to identify a direction within the scene that corresponds to the direction of the virtual object in the scene image, and use the identified direction as the target direction.
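The conversion from a selected image location to a direction in the scene can be sketched with a pinhole camera model. The intrinsic parameters below are invented for this illustration, and a real implementation would further rotate the resulting camera-frame ray into the scene frame using the device pose and intersect it with the scene geometry.

```python
import math

# Sketch: map a touch at image plane coordinates (x', y') to a unit ray in
# the camera frame, using assumed pinhole intrinsics (focal lengths fx, fy
# in pixels and principal point (cx, cy)).

def touch_to_camera_ray(xp, yp, fx=800.0, fy=800.0, cx=320.0, cy=240.0):
    """Unit ray, in camera coordinates, through image point (x', y')."""
    rx = (xp - cx) / fx
    ry = (yp - cy) / fy
    rz = 1.0
    norm = math.sqrt(rx * rx + ry * ry + rz * rz)
    return (rx / norm, ry / norm, rz / norm)

# A touch at the principal point looks straight along the camera axis:
print(touch_to_camera_ray(320.0, 240.0))  # -> (0.0, 0.0, 1.0)
```

Intersecting this ray with the three-dimensional map of the scene yields the selected (x, y, z) location, from which the target direction relative to the fan assembly follows.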
For a virtual object presenting the currently set swing range, the user may perform a drag, slide, pinch, or zoom gesture on a depiction of the virtual object on the touch screen to change the size of the virtual object or individually adjust the endpoints of the virtual object. The controller 313 may then be configured to identify a first direction within the scene corresponding to a first end of the virtual object in the scene image and use the identified first direction as a target direction, and to identify a second direction within the scene corresponding to a second end of the virtual object in the scene image and use the identified second direction as another target direction. The controller 313 may then be configured to instruct the fan assembly to move the steerable section 302b to a target relative orientation corresponding to the target direction and to cause the steerable section 302b to oscillate between the target relative orientation and another target relative orientation corresponding to the other target direction.
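Under the simplification that all directions are expressed as horizontal bearings, the two endpoint directions might be turned into a pair of target relative orientations roughly as follows; `bearing_deg`, `oscillation_targets`, and the coordinate convention are assumptions for illustration, not the patent's API:

```python
import math

def bearing_deg(direction):
    """Horizontal bearing of a scene direction vector (x, y, z), in degrees."""
    x, _, z = direction
    return math.degrees(math.atan2(x, z))

def oscillation_targets(first_dir, second_dir, current_heading_deg, current_offset_deg):
    """Map the two ends of the swing-range object to relative orientations.

    Each endpoint's target offset is the current offset plus the (wrapped)
    difference between the endpoint's bearing and the section's current heading.
    """
    def to_offset(d):
        diff = bearing_deg(d) - current_heading_deg
        diff = (diff + 180.0) % 360.0 - 180.0   # wrap into [-180, 180)
        return current_offset_deg + diff
    return to_offset(first_dir), to_offset(second_dir)
```

The fan would then be instructed to move to the first returned offset and oscillate between the two.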
The controller 313 may be configured to store data indicative of the target direction in the memory of the remote computer device 310 after receiving the user input indicating the target direction but prior to determining the target relative orientation of the steerable section 302b. The controller 313 may then be configured to retrieve the stored data from the memory such that a graphical element or object associated with the stored data is displayed on the electronic display 316, and, in response to a user input selecting the displayed graphical element or object, identify the target direction using the stored data and determine a target relative orientation that aligns the steerable section 302b of the fan assembly 300 with the target direction. For example, the graphical elements or objects displayed on the electronic display 316 may be control elements, such as icons, buttons, or virtual objects. The at least one user input device 311 of the remote computer device 310 may thus be configured to enable a user to select a displayed graphical element or object. This allows the user to initially define a target direction and then reuse that target direction by simply selecting the relevant graphical element or object displayed on the electronic display 316.
For example, the received user input may mark a location in the scene, and the stored data may then include the identified location. The controller 313 may then be further configured to cause the electronic display 316 to display the virtual object at the identified location in the subsequent image of the scene displayed on the electronic display 316. The controller 313 may then also be configured to receive a user input selecting a virtual object in the displayed image. The controller 313 may then be configured to use the stored data (i.e., the previously identified location) to determine a target relative orientation of the steerable section 302b by determining the orientation of the identified location in the scene relative to the fan assembly 300 and using the determined orientation as the target direction.
As an alternative example, the received user input may mark a direction in the scene, and the stored data may then include the identified direction. The controller 313 may then be further configured to cause the electronic display 316 to display a virtual object, at least a portion of which is aligned with the identified direction in the subsequent image of the scene displayed on the electronic display 316. The controller 313 may then also be configured to receive a user input selecting the virtual object in the displayed image. The controller 313 may then be configured to use the stored data (i.e. the previously identified direction) to determine a target relative orientation of the steerable section 302b by using the identified direction as the target direction.
The controller 313 may also be configured to process any messages received from the fan assembly 300 and generate any messages to be transmitted to the fan assembly 300. The controller 313 may thus be configured to control both the wireless transmitter 314 and the wireless receiver 315. In particular, the controller 313 may be configured to obtain the current relative orientation (i.e., the orientation offset) of the steerable section 302b from the fan assembly 300. To this end, the controller 313 may be configured to generate a request for the current relative orientation and send the request to the fan assembly 300 using the wireless transmitter 314, and then process a reply including the current relative orientation received by the wireless receiver 315 from the fan assembly 300. The request for the current relative orientation may be a message explicitly generated for this purpose. Alternatively, another message sent to the fan assembly 300 may serve as an implicit request for the current relative orientation of the steerable section 302b. For example, the controller 313 may be configured to generate a message to the fan assembly 300 arranged to initiate an orientation control process at the fan assembly 300, wherein the transmission of the current relative orientation of the steerable section 302b to the remote computer device 310 is one of the steps performed in the orientation control process.
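The explicit/implicit request handling might look like the following fan-side sketch, where both a dedicated orientation request and an orientation-control initiation message cause the current offset to be reported; the message names and fields are invented for illustration:

```python
def handle_message(msg, state):
    """Fan-side dispatch: reply with the current offset for an explicit request
    or, implicitly, when an orientation control process is started; apply moves."""
    kind = msg.get("type")
    if kind in ("get_orientation", "start_orientation_control"):
        # Implicit case: initiating orientation control also reports the offset.
        return {"type": "orientation_reply", "offset_deg": state["offset_deg"]}
    if kind == "move_to_offset":
        state["offset_deg"] = msg["target_deg"]
        return {"type": "ack"}
    return {"type": "error", "reason": "unknown message type"}
```

Treating the initiation message as an implicit request saves one wireless round trip per control cycle.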
The controller 313 may also be configured to calculate an orientation offset that aligns the steerable section 302b of the fan assembly 300 with a target direction indicated by the user input (i.e., determine a target relative orientation for the steerable section 302b of the fan assembly 300). To determine the target relative orientation for the steerable section 302b of the fan assembly 300, the controller 313 may be configured to determine an orientation difference between the current orientation of the steerable section 302b and the target direction, and then combine the orientation difference with the current relative orientation to determine the target relative orientation of the steerable section 302b. For example, when the user input marks a position in the scene, the controller 313 may be configured to determine the corresponding target direction by defining a vector that extends between the position of the fan assembly 300 within the scene and the user-marked position within the scene (such that the vector represents the orientation of the marked position relative to the fan assembly 300). The controller 313 may then be configured to determine the difference between this vector and the current orientation of the steerable section 302b, which is also defined by a vector.
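In the horizontal plane, the vector arithmetic described above reduces to a signed-angle computation; the following is a sketch under assumed 2D coordinates, not the patent's implementation:

```python
import math

def signed_angle_deg(v_from, v_to):
    """Signed angle (degrees) rotating v_from onto v_to in the plane,
    computed from the 2D cross and dot products."""
    cross = v_from[0] * v_to[1] - v_from[1] * v_to[0]
    dot = v_from[0] * v_to[0] + v_from[1] * v_to[1]
    return math.degrees(math.atan2(cross, dot))

def target_relative_orientation(fan_pos, marked_pos, current_heading, current_offset_deg):
    """Target offset = current offset + (angle from the current heading vector
    to the vector extending from the fan's position to the marked position)."""
    to_target = (marked_pos[0] - fan_pos[0], marked_pos[1] - fan_pos[1])
    return current_offset_deg + signed_angle_deg(current_heading, to_target)
```

Using `atan2(cross, dot)` rather than `acos` of the normalized dot product keeps the sign of the rotation, which the fan needs in order to turn the correct way.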
The controller 313 may also be configured to instruct the fan assembly 300 to move to the target relative orientation. To this end, the controller 313 may be configured to generate a request to move to the target relative orientation and send the request to the fan assembly 300 using the wireless transmitter 314.
FIG. 4 is a timing diagram illustrating a detailed example of a method for controlling the direction of air flow emitted from a fan assembly when implemented by a system such as that shown in FIG. 3. At step 401, a request for a current relative orientation of the steerable section relative to the non-steerable section is sent from the remote computer device to the fan assembly. This step may be initiated by the remote computer device receiving a user input, which initiates the directional control process for the fan assembly. For example, the user input may include a user activating a directional control portion of a software application that is associated with the fan assembly and that is executed by the remote computer device. The remote computer device may automatically send the request as part of a directional control process implemented by the remote computer device.
At step 402, the request for the current relative orientation is received by the fan assembly. At step 403, in response to receipt of the request, the fan assembly determines the current relative orientation of the steerable section relative to the non-steerable section. As described above, the fan assembly may be configured to monitor the status of one or more actuators for the steerable section in order to monitor the relative orientation of the steerable section with respect to the non-steerable section. For example, the one or more actuators may each comprise a stepper motor. The controller of the fan assembly may then be configured to monitor the position of each stepper motor in order to determine the relative orientation of the steerable section relative to the non-steerable section. Alternatively, the fan assembly may include an orientation offset sensor configured to monitor the relative orientation of the steerable section relative to the non-steerable section. At step 404, a reply including the current relative orientation of the steerable section is sent from the fan assembly to the remote computer device. As described above, the process implemented by the fan assembly may also include, after receiving the request for the current relative orientation of the steerable section, displaying a fiducial marker on a display of the fan assembly.
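Deriving the relative orientation from a stepper motor's commanded position (steps 402-403) might be sketched as below; the step count, microstepping factor, and gear ratio are invented example values:

```python
class StepperTracker:
    """Track a stepper motor's commanded position and convert it to the
    steerable section's relative orientation in degrees."""
    STEPS_PER_REV = 200 * 16   # 1.8-degree motor with 16x microstepping (assumed)
    GEAR_RATIO = 5.0           # motor revolutions per section revolution (assumed)

    def __init__(self):
        self.position_steps = 0

    def step(self, n):
        """Record n commanded steps (negative for the opposite direction)."""
        self.position_steps += n

    def relative_orientation_deg(self):
        """Orientation offset of the steerable section vs the non-steerable one."""
        degrees_per_step = 360.0 / (self.STEPS_PER_REV * self.GEAR_RATIO)
        return self.position_steps * degrees_per_step
```

An open-loop count like this is why the patent also allows a dedicated orientation offset sensor: the count drifts if steps are ever skipped.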
At step 405, the reply including the current relative orientation of the steerable section is received by the remote computer device. At step 406, the remote computer device uses one or more image sensors to capture a series of images of a scene that includes the fan assembly (i.e., images of the fan assembly and its surroundings). As described above, this step of capturing a series of images of the scene may occur simultaneously with any of the preceding steps. At step 407, the captured series of images is processed to determine the position of the fan assembly and the orientation of the steerable section of the fan assembly in the scene.
At step 408, a user input is received by the remote computer device indicating a target direction for the air flow emitted from the fan assembly within the scene. As described above, the remote computer device may be configured to provide an augmented reality interface to the user. The step of receiving user input indicating a target direction may thus comprise displaying the captured images in real time or near real time, and receiving a user input selecting a position within a displayed image. The process performed by the remote computer device may then further include identifying a location in the scene corresponding to the selected image location, and displaying a virtual object positioned at that location within the images of the scene subsequently displayed by the electronic display.
At step 409, the remote computer device determines an orientation difference between the current orientation of the steerable segment within the scene and the target direction. At step 410, the remote computer device determines a target relative orientation of the steerable section of the fan assembly by combining the orientation difference determined at step 409 and the current relative orientation received at step 405. At step 411, an instruction to move the steerable section to the target relative orientation is sent from the remote computer device to the fan assembly. At step 412, the instruction to move to the target relative orientation is received by the fan assembly. At step 413, in response to receipt of the instruction, the fan assembly adjusts the relative orientation of the steerable section from the current relative orientation to the target relative orientation.
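Steps 401-413 can be condensed into the following sketch, with the wireless exchange abstracted into direct calls; the class and function names are illustrative, not the patent's:

```python
class FanAssembly:
    """Minimal model of the fan-side steps (402-404, 412-413)."""
    def __init__(self, offset_deg=0.0):
        self.offset_deg = offset_deg  # steerable vs non-steerable section

    def current_relative_orientation(self):
        """Steps 402-404: report the current offset on request."""
        return self.offset_deg

    def move_to(self, target_deg):
        """Steps 412-413: adjust the steerable section to the target offset."""
        self.offset_deg = target_deg

def direction_control(fan, current_heading_deg, target_direction_deg):
    """Remote-device side of Fig. 4: steps 401, 405, 409-411."""
    current_offset = fan.current_relative_orientation()   # steps 401/405
    diff = target_direction_deg - current_heading_deg     # step 409
    target_offset = current_offset + diff                 # step 410
    fan.move_to(target_offset)                            # step 411
    return target_offset
```

The separation matters: the heading is measured in scene coordinates from the images, while the offset is the fan's own internal angle, so the two must be combined rather than used interchangeably.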
Figs. 5a and 5b are timing diagrams illustrating another detailed example of a method for controlling the direction of air flow emitted from a fan assembly when implemented by a system such as that shown in Fig. 3. At step 501, the remote computer device uses one or more image sensors to capture a series of images of a scene that includes the fan assembly (i.e., images of the fan assembly and its surroundings). At step 502, a user input is received by the remote computer device indicating a target direction for the air flow emitted from the fan assembly within the scene. As described above, the remote computer device may be configured to provide an augmented reality interface to the user. The step of receiving user input indicating a target direction may thus comprise displaying the captured images in real time or near real time, and receiving a user input selecting a position within a displayed image. The process performed by the remote computer device may then further include identifying a location in the scene corresponding to the selected image location, and displaying a virtual object positioned at that location within the images of the scene subsequently displayed by the electronic display. At step 503, the remote computer device stores data indicating the target direction within a memory of the remote computer device.
At step 504, the remote computer device retrieves the stored data from memory. This step typically occurs some time after the user input indicating the target direction was received. For example, steps 501 to 503 may be implemented in a previous orientation control process that is terminated before step 504, and step 504 may then occur during another orientation control process initiated by the user. The method thus allows user input indicating a target direction to be stored and reused.
At step 505, the remote computer device again uses one or more image sensors to acquire an additional series of images of the scene that includes the fan assembly (i.e., images of the fan assembly and its surroundings). At step 506, the remote computer device causes the electronic display to display the captured image and the virtual object in real-time or near real-time, the location of which in the scene is defined by the stored data. At step 507, user input selecting a virtual object in the displayed image is received by the remote computer device. At step 508, the remote computer device determines a target direction using stored data associated with the selected virtual object.
At step 509, the captured series of images is processed to determine the position of the fan assembly and the orientation of the steerable section of the fan assembly in the scene. At step 510, a request for the current relative orientation of the steerable section relative to the non-steerable section is sent from the remote computer device to the fan assembly. At step 511, the request for the current relative orientation is received by the fan assembly. At step 512, in response to receipt of the request, the fan assembly determines the current relative orientation of the steerable section relative to the non-steerable section. At step 513, a reply including the current relative orientation of the steerable section is sent from the fan assembly to the remote computer device. At step 514, the reply including the current relative orientation of the steerable section is received by the remote computer device.
At step 515, the remote computer device determines an orientation difference between the current orientation of the steerable section within the scene (as determined in step 509) and the target direction (as determined in step 508). At step 516, the remote computer device determines a target relative orientation of the steerable section of the fan assembly by combining the orientation difference determined at step 515 and the current relative orientation received at step 514. At step 517, instructions to move the steerable section to the target relative orientation are sent from the remote computer device to the fan assembly. At step 518, an instruction to move to the target relative orientation is received by the fan assembly. At step 519, in response to receipt of the instruction, the fan assembly adjusts the relative orientation of the steerable section from the current relative orientation to a target relative orientation.
FIG. 6a illustrates an example of a fan assembly to which the methods described herein are applicable. Figs. 6b and 6c show alternative configurations of the fan assembly of Fig. 6a. In the example of Fig. 6a, the fan assembly 600 is a free-standing fan assembly, wherein the non-steerable section 601 comprises a base of the fan assembly 600 on which the fan assembly 600 rests. The steerable section 602 then comprises a body 602a of the fan assembly 600 which is rotatably mounted to the base 601 of the fan assembly 600, with the air outlet 603 being provided by a nozzle 602b mounted to the body 602a of the fan assembly 600. In this example, the body 602a of the fan assembly 600 may be rotated (i.e., swept) relative to the base 601 of the fan assembly 600 in order to adjust the relative orientation of the body 602a and nozzle 602b (i.e., the steerable section) relative to the base 601 (i.e., the non-steerable section), and thereby adjust the direction of the air flow emitted from the air outlet 603. In particular, the fan assembly 600 is provided with an actuator (not shown) comprising a motor and a drive member, the drive member being arranged to be driven by the motor and thereby cause the body 602a to rotate relative to the base 601. In this example, the relative orientation of the steerable section (i.e. body and nozzle) of the fan assembly with respect to the non-steerable section (i.e. base) is thus given by a single angle defining the angular distance between a reference direction of the steerable section and a reference direction of the non-steerable section. In this example, the fan assembly 600 also includes an electronic display 604 disposed on the body 602a (i.e., on the steerable section) of the fan assembly 600.
Fig. 6a shows the example fan assembly 600 in a configuration in which the body 602a and base 601 are aligned such that the orientation offset between the steerable section and the non-steerable section is zero degrees. Fig. 6b then shows the example fan assembly 600 in a first alternative configuration in which the body 602a has been rotated to the right relative to the base 601 such that the orientation offset therebetween is about 45 degrees, while Fig. 6c shows the example fan assembly 600 in a second alternative configuration in which the body 602a has been rotated to the left relative to the base 601 such that the orientation offset therebetween is about -45 degrees.
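Because the offset of Figs. 6a to 6c is a single signed angle, an implementation would typically wrap it to one turn; a minimal sketch (the interval choice [-180, 180) is an assumption):

```python
def normalize_offset_deg(angle):
    """Wrap a single-angle orientation offset into the interval [-180, 180)."""
    return (angle + 180.0) % 360.0 - 180.0
```

For example, a commanded rotation of 315 degrees to the right is the same configuration as 45 degrees to the left.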
Figs. 7a, 7b, 7c and 7d schematically show a first example of a usage scenario for the method described herein. In particular, Figs. 7a to 7d show a remote computer device 700 including a touch screen 710 that combines the electronic display and the user input device of the remote computer device 700, wherein the remote computer device 700 is configured to provide an augmented reality interface to a user. In Fig. 7a, the remote computer device 700 utilizes one or more image sensors to capture a series of images of a scene 720 that includes the fan assembly 721 (i.e., images of the fan assembly and its surroundings), and simultaneously displays these images on the touch screen 710 in real time or near real time. In this example, the fan assembly 721 is a free-standing fan assembly, similar to that of Figs. 6a to 6c, wherein the non-steerable section includes a base of the fan assembly on which the fan assembly rests. The remote computer device 700 uses the captured images to determine the position of the fan assembly 721 and the orientation of the steerable section of the fan assembly 721 in the scene.
In Fig. 7b, the user of the remote computer device 700 provides user input by touching a location on the touch screen 710 and thereby selecting a location in the image of the scene 720 displayed on the touch screen 710. The remote computer device 700 thus identifies a location 730 in the scene that corresponds to the selected image location. This may involve, for example, converting a two-dimensional image location to a location in the three-dimensional scene. The remote computer device 700 then identifies a target direction 740 for the air flow emitted from the fan assembly 721 by determining the orientation of the identified location 730 in the scene relative to the fan assembly 721, calculates a target relative orientation for the steerable section of the fan assembly 721 that aligns the steerable section with the target direction, and sends an instruction to the fan assembly 721 to move the steerable section to the target relative orientation.
In Fig. 7c, in response to the user input, the remote computer device 700 also displays a virtual object 722 on the touch screen 710 anchored/fixed to the identified location 730 in subsequent scene images displayed on the touch screen 710. In this example, the virtual object 722 is provided by a drop-shaped marker icon. Displaying the virtual object at the identified location in the scene provides feedback that allows the user to check whether the desired target direction for the air flow emitted from the fan assembly 721 has been accurately selected. In Fig. 7d, in response to receiving the instruction from the remote computer device 700, the fan assembly 721 has adjusted the relative orientation of the steerable section relative to the non-steerable section such that the air flow emitted from the fan assembly 721 is now directed in the target direction, towards the location 730 indicated by the user input.
Figs. 8a, 8b and 8c schematically show a second example of a usage scenario for the method described herein. Figs. 8a to 8c show a remote computer device 800 comprising a touch screen 810 that combines the electronic display and the user input device of the remote computer device 800, wherein the remote computer device 800 is configured to provide an augmented reality interface to a user. In Fig. 8a, the remote computer device 800 utilizes one or more image sensors to capture a series of images of the scene 820 including the fan assembly 821 (i.e., images of the fan assembly and its surroundings) and simultaneously displays these images in real time or near real time on the touch screen 810. The remote computer device 800 also displays a virtual object 822 on the touch screen 810 that is anchored/fixed within the scene to a location 830 previously marked by the user of the remote computer device 800. In this example, the virtual object 822 is provided by a drop-shaped marker icon. The previously marked location 830 will have been stored within the memory of the remote computer device 800 when marked by the user and is then retrieved from the memory when the augmented reality interface is activated, in order to allow the remote computer device 800 to display the virtual object 822 at the previously marked location 830.
In Fig. 8b, the user of the remote computer device 800 provides user input by touching a location on the touch screen 810 that selects the virtual object 822 displayed on the touch screen 810. The remote computer device 800 then identifies a target direction 840 for the air flow emitted from the fan assembly 821 by determining the orientation of the previously marked location 830 in the scene relative to the fan assembly 821, calculates a target relative orientation for the steerable section of the fan assembly 821 that aligns the steerable section with the target direction 840, and sends an instruction to the fan assembly 821 to move the steerable section to the target relative orientation. In Fig. 8c, in response to receiving the instruction from the remote computer device 800, the fan assembly 821 has adjusted the relative orientation of the steerable section relative to the non-steerable section such that the air flow emitted from the fan assembly 821 is now directed in the target direction, towards the location 830 indicated by the user input.
Fig. 9a, 9b, 9c and 9d schematically show a third example of a usage scenario for the method described herein. In particular, fig. 9 a-9 d show a remote computer device 900 including a touch screen 910 that incorporates an electronic display and user input device of the remote computer device 900, wherein the remote computer device 900 is configured to provide an augmented reality interface to a user. In FIG. 9a, the remote computer device 900 utilizes one or more image sensors to capture a series of images of the scene 920 including the fan assembly 921 (i.e., images of the fan assembly and its surroundings) and simultaneously displays these images on the touch screen 910 in real-time or near real-time. In this example, the fan assembly 921 is a free-standing fan assembly, similar to that of fig. 6 a-6 c, wherein the non-steerable section includes a base of the fan assembly on which the fan assembly rests. The remote computer device 900 uses the captured images to determine the position of the fan assembly 921 and the orientation of the steerable section of the fan assembly 921 in the scene.
In Fig. 9b, a user of the remote computer device 900 provides a first user input by touching a first location on the touch screen 910 and a second user input by touching a second location on the touch screen 910. The first user input thus selects a first image location in the image of the scene 920 displayed on the touch screen 910, while the second user input selects a second image location in the image of the scene 920 displayed on the touch screen 910. The remote computer device 900 thus identifies a first location 930 within the scene corresponding to the selected first image location and a second location 931 within the scene corresponding to the selected second image location. The remote computer device 900 then identifies a target direction 940 for the air flow emitted from the fan assembly 921 by determining the orientation of the first location 930 identified in the scene relative to the fan assembly 921, and calculates a target relative orientation for the steerable section of the fan assembly 921 that aligns the steerable section with the target direction 940. The remote computer device 900 also identifies another target direction 941 for the air flow emitted from the fan assembly 921 by determining the orientation of the second location 931 identified in the scene relative to the fan assembly 921, and calculates another target relative orientation for the steerable section of the fan assembly 921 that aligns the steerable section with the other target direction 941. The remote computer device 900 then sends instructions to the fan assembly 921 to move the steerable section to the target relative orientation and cause the steerable section to oscillate between the target relative orientation and the other target relative orientation.
In Fig. 9c, in response to the user inputs, the remote computer device 900 displays a virtual object 922 on the touch screen 910 that is anchored/fixed to the identified first location 930 in subsequent scene images displayed on the touch screen 910. The remote computer device 900 also displays another virtual object 923 on the touch screen 910 that is anchored/fixed to the identified second location 931 in subsequent scene images displayed on the touch screen 910. In this example, the virtual object 922 and the other virtual object 923 are each provided by a pin-shaped marker icon. In Fig. 9d, in response to receiving the instructions from the remote computer device 900, the fan assembly 921 has adjusted the relative orientation of the steerable section relative to the non-steerable section such that the air flow emitted from the fan assembly 921 is now directed in the target direction, towards the first location 930 indicated by the user input. The steerable section of the fan assembly 921 then oscillates between the target relative orientation and the other target relative orientation.
Figs. 10a, 10b, 10c and 10d schematically show a fourth example of a usage scenario for the method described herein. In particular, Figs. 10a to 10d show a remote computer device 1000 including a touch screen 1010 that combines the electronic display and the user input device of the remote computer device 1000, wherein the remote computer device 1000 is configured to provide an augmented reality interface to a user. In Fig. 10a, the remote computer device 1000 utilizes one or more image sensors to capture a series of images of a scene 1020 including a fan assembly 1021 (i.e., images of the fan assembly and its surroundings) and simultaneously displays these images on the touch screen 1010 in real time or near real time. In this example, the fan assembly 1021 is a free-standing fan assembly, similar to that of Figs. 6a to 6c, wherein the non-steerable section comprises a base of the fan assembly on which the fan assembly rests. The remote computer device 1000 uses the captured images to determine the position of the fan assembly 1021 and the orientation of the steerable section of the fan assembly 1021 in the scene.
The remote computer device 1000 also displays a virtual object 1022 on the touch screen 1010 that is anchored/fixed to the location of the fan assembly 1021 within the scene and presents the currently set swing range of the fan assembly 1021. The currently set swing range of the fan assembly 1021 may have been predefined by a user of the fan assembly 1021, or may be a default swing range of the fan assembly 1021. In this example, the virtual object 1022 is provided by an arc icon. Upon activation of the augmented reality interface, the currently set swing range is retrieved from the memory of the remote computer device 1000 or from the fan assembly 1021 in order to allow the remote computer device 1000 to display the virtual object 1022 that presents the currently set swing range of the fan assembly 1021.
In Fig. 10b, the user of the remote computer device 1000 provides user input by touching a location on the touch screen 1010 that selects the virtual object 1022 displayed on the touch screen 1010, and performs a touch gesture that changes an endpoint of the virtual object 1022. For example, the user input may include touching the virtual object 1022 displayed on the touch screen 1010 and then performing a drag, slide, pinch, or zoom gesture on the depiction of the virtual object to change the size of the virtual object 1022 or individually adjust its endpoints. The remote computer device 1000 then identifies a target direction 1040 for the air flow emitted from the fan assembly 1021 by determining a direction within the scene corresponding to a first end of the virtual object within the image of the scene, and calculates a target relative orientation for the steerable section of the fan assembly 1021 that aligns the steerable section with the target direction 1040. The remote computer device 1000 also identifies another target direction 1041 for the air flow emitted from the fan assembly 1021 by determining a direction within the scene corresponding to a second end of the virtual object within the image of the scene, and calculates another target relative orientation for the steerable section of the fan assembly 1021 that aligns the steerable section with the other target direction 1041. The remote computer device 1000 then sends instructions to the fan assembly 1021 to move the steerable section to the target relative orientation and cause the steerable section to oscillate between the target relative orientation and the other target relative orientation.
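The endpoint edits of Fig. 10b could be modelled on the arc object itself; `SwingArc`, its gesture methods, and the degree-based endpoints are assumptions for illustration:

```python
class SwingArc:
    """Virtual arc object: two endpoint bearings (degrees) about the fan."""
    def __init__(self, start_deg, end_deg):
        self.start_deg, self.end_deg = start_deg, end_deg

    def drag_endpoint(self, which, new_deg):
        """Adjust one endpoint individually, as a drag/slide gesture might."""
        if which == "start":
            self.start_deg = new_deg
        else:
            self.end_deg = new_deg

    def pinch(self, scale):
        """Scale the arc about its centre, as a pinch/zoom gesture might."""
        centre = (self.start_deg + self.end_deg) / 2.0
        half = (self.end_deg - self.start_deg) / 2.0 * scale
        self.start_deg, self.end_deg = centre - half, centre + half

    def target_directions(self):
        """The two target directions derived from the arc's ends."""
        return self.start_deg, self.end_deg
```

The two returned bearings would then be converted to target relative orientations and sent as the oscillation endpoints.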
In Fig. 10c, in response to the user input, the remote computer device 1000 adjusts the endpoints of the virtual object 1022 in subsequent scene images displayed on the touch screen 1010 to reflect the effect of the user input on the swing range. In Fig. 10d, in response to receiving the instructions from the remote computer device 1000, the fan assembly 1021 has adjusted the relative orientation of the steerable section relative to the non-steerable section such that the air flow emitted from the fan assembly 1021 is now directed in the target direction, aligned with the first end of the virtual object 1022 that presents the swing range indicated by the user input. The steerable section of the fan assembly 1021 then oscillates between the target relative orientation and the other target relative orientation.
In a preferred embodiment, the fan assembly includes one or both of a sweep motor and a tilt motor. The sweep motor is configured to move at least a section of the fan assembly such that the emission direction rotates in a horizontal plane (i.e., when the fan assembly is positioned on a generally horizontal support surface). The adjustment range of a fan assembly that includes a sweep motor is then defined at least in part by the angle through which the sweep motor can rotate the emission direction. In contrast, the tilt motor is configured to move at least a section of the fan assembly such that the emission direction rotates in a vertical plane (i.e., when the fan assembly is positioned on a generally horizontal support surface). The adjustment range of a fan assembly that includes a tilt motor is then defined at least in part by the angle through which the tilt motor can rotate the emission direction.
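As a hedged illustration of how the sweep and tilt adjustment ranges described above might constrain a requested orientation, the sketch below clamps each axis to its motor's reachable range. The specific limit values are invented for the example and are not figures from the patent.

```python
# Illustrative only: clamp a requested (sweep, tilt) relative orientation to
# the ranges the sweep motor and tilt motor can actually reach. The +/-175
# and -10..90 degree limits are made-up example values.

SWEEP_RANGE = (-175.0, 175.0)   # horizontal-plane rotation limits (assumed)
TILT_RANGE = (-10.0, 90.0)      # vertical-plane rotation limits (assumed)

def clamp(value, low, high):
    """Restrict value to the closed interval [low, high]."""
    return max(low, min(high, value))

def constrain_orientation(sweep_deg, tilt_deg):
    """Return the nearest reachable (sweep, tilt) pair."""
    return (clamp(sweep_deg, *SWEEP_RANGE), clamp(tilt_deg, *TILT_RANGE))

# A request outside both ranges is pulled back to the range boundaries.
print(constrain_orientation(200.0, -30.0))  # -> (175.0, -10.0)
```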
In an alternative embodiment, the fan assembly is a bladeless fan. The term "bladeless" as used herein refers to a fan assembly in which the air flow is emitted without visible or external moving blades. In other words, a bladeless fan assembly may be considered to have an output region or emission region free of moving blades. Thus, in such an embodiment, the fan assembly preferably comprises a nozzle mounted on a fan body, wherein a motor-driven impeller is housed within the fan body and an air outlet is provided by the nozzle. The nozzle is thereby arranged to receive the air flow from the fan body and to emit the air flow from the air outlet. In a preferred embodiment, the nozzle defines an aperture through which air from outside the fan assembly is drawn by the air flow emitted from the air outlet, and this drawn air combines with the emitted air flow to produce an amplified air flow.
It will be understood that each of the items shown may be used alone or in combination with other items shown in the figures or described in the specification, and that items mentioned in the same paragraph or in the same figure need not be used in combination with each other. Furthermore, the word "means" may be replaced by a suitable actuator, system, or apparatus. Furthermore, references to "comprising" or "consisting of" are not intended to be limiting in any way, and the reader should interpret the corresponding description and claims accordingly.
Furthermore, while the present invention has been described in terms of the preferred embodiments above, it should be understood that these embodiments are merely illustrative. Those skilled in the art will be able to make modifications and variations, in view of this disclosure, that fall within the scope of the appended claims. For example, those skilled in the art will appreciate that the invention described above is equally applicable to free-standing fan assemblies and other types of environmental control fan assemblies. By way of example, the fan assembly may be any of a free-standing fan assembly, a ceiling- or wall-mounted fan assembly, and an in-vehicle fan assembly.
Although embodiments of the invention have been described with reference to the accompanying drawings, which comprise a computer processor and methods performed by the computer processor, the invention also extends to a computer program, in particular a computer program on or in a carrier, adapted to put the invention into practice. The program may be in the form of source code, object code, or any other form suitable for use in implementing the method according to the invention. The carrier may be any entity or device capable of carrying the program. For example, the carrier may comprise a storage medium, such as a ROM (e.g. a CD-ROM or a semiconductor ROM) or a magnetic recording medium (e.g. a floppy disk or hard disk). The carrier may also be a transmissible carrier, such as an electrical or optical signal, which may be conveyed via electrical or optical cable or by radio or other means. When the program is embodied in a signal which may be conveyed directly by a cable or other device or means, the carrier may be constituted by such cable or other device or means. Alternatively, the carrier may be an integrated circuit in which the program is embedded, the integrated circuit being adapted to perform, or to be used in the performance of, the relevant method.

Claims (14)

1. A method of controlling the direction of airflow emitted from a fan assembly, wherein the fan assembly is capable of changing the direction of airflow emitted therefrom by adjusting the relative orientation of a steerable section of the fan assembly with respect to a non-steerable section of the fan assembly, the method comprising:
at a remote computer device:
capturing a series of images of a scene including the fan assembly;
determining a current position of the fan assembly and a current orientation of the steerable section of the fan assembly using the captured images;
receiving, from the fan assembly, a current relative orientation of the steerable section relative to the non-steerable section of the fan assembly;
receiving a user input indicating a target direction for air flow emitted from the fan assembly;
determining an orientation difference between a current orientation of the steerable section and a target direction;
combining the orientation difference and the current relative orientation to determine a target relative orientation for the steerable section that aligns the steerable section with the target direction; and
sending instructions to the fan assembly to move the steerable section to the target relative orientation,
further comprising, after receiving user input indicating a target direction:
storing data indicative of a target direction;
displaying a graphical object associated with the stored data on a display of the device; and
in response to receipt of a user input selecting the displayed graphical object, using the stored data to determine the target relative orientation of the steerable section.
2. The method of claim 1, wherein the step of determining a current position of the fan assembly and a current orientation of the steerable section of the fan assembly using the captured images comprises:
processing the captured images to generate a map of the scene and detecting the fan assembly in the map of the scene.
3. The method of claim 2, wherein the step of processing the captured images to generate a scene map and detecting the fan assembly in the scene map includes performing an object recognition process to facilitate identifying the fan assembly and determining the position and orientation of the fan assembly in the map.
4. The method of claim 1, wherein receiving user input indicating a target direction comprises:
displaying the captured image of the scene on a display of the device;
receiving a user input selecting a location in the displayed image;
identifying a location within the scene corresponding to the selected location in the displayed image; and
identifying the target direction by determining an orientation of the identified location within the scene relative to the fan assembly.
5. The method of claim 1, wherein receiving user input indicating a target direction comprises:
displaying the captured image of the scene on a display of the device;
receiving a user input selecting a direction in the displayed image;
identifying a direction within the scene corresponding to the selected direction in the displayed image; and
using the identified direction within the scene as the target direction.
6. The method of claim 1, wherein the received user input marks a location within the scene and the stored data includes the marked location within the scene.
7. The method of claim 6, wherein the step of using the stored data to determine the target relative orientation of the steerable section comprises determining an orientation of the marked location within the scene relative to the fan assembly and using the determined orientation as the target direction.
8. The method of claim 1, wherein the received user input marks a direction within the scene and the stored data includes the marked direction within the scene.
9. The method of claim 8, wherein the step of using the stored data to determine the target relative orientation of the steerable section includes using the marked direction as the target direction.
10. A computer apparatus configured to control a fan assembly, wherein the fan assembly is capable of changing a direction of airflow emitted therefrom by adjusting a relative orientation of a steerable section of the fan assembly with respect to a non-steerable section of the fan assembly, the apparatus comprising:
a user input device; one or more image sensors; a wireless receiver; a wireless transmitter; and a controller;
wherein the controller is configured to:
receiving, using a wireless receiver, a current relative orientation of the steerable section relative to the non-steerable section from the fan assembly;
instructing an image capture device to capture a series of images of a scene;
determining when the fan assembly is present within the scene using the captured images, and when the fan assembly is present, determining a current position of the fan assembly and a current orientation of the steerable section of the fan assembly within the scene;
in response to an input received at the user input device indicating a target direction of airflow emitted from the fan assembly, determining an orientation difference between a current orientation of the steerable section and the target direction, and combining the orientation difference and the current relative orientation to determine a target relative orientation between the steerable section and the non-steerable section that aligns the steerable section with the target direction; and
sending instructions to the fan assembly using the wireless transmitter to move the steerable section to the target relative orientation;
wherein the controller is further configured to, after receiving user input indicating a target direction:
storing data indicative of a target direction;
displaying a graphical object associated with the stored data on a display of the device; and
in response to receipt of a user input selecting the displayed graphical object, use the stored data to determine the target relative orientation of the steerable section.
11. The device of claim 10, wherein the controller is configured to:
process the captured images to generate a map of the scene, and determine when a representation of the fan assembly is present within the map of the scene.
12. The device of claim 10, wherein the controller is configured to:
implement an object recognition process to facilitate determining when a representation of the fan assembly is present within the map of the scene and, when the representation of the fan assembly is present within the map, determining the position and orientation of the fan assembly within the map.
13. The device of claim 10, wherein the controller is configured to, in response to an input received at the user input device selecting a location within the displayed image, identify a location within the scene corresponding to the selected location within the displayed image, and identify the target direction by determining an orientation of the identified location within the scene relative to the fan assembly.
14. The device of claim 10, wherein the controller is configured to, in response to an input received at the user input device selecting a direction within the displayed image, identify a direction within the scene corresponding to the selected direction within the displayed image, and use the identified direction within the scene as the target direction.
CN202010258019.1A 2019-04-03 2020-04-03 Control of fan assembly Active CN111794993B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1904700.0 2019-04-03
GB1904700.0A GB2582796B (en) 2019-04-03 2019-04-03 Control of a fan assembly

Publications (2)

Publication Number Publication Date
CN111794993A CN111794993A (en) 2020-10-20
CN111794993B true CN111794993B (en) 2022-08-16

Family

ID=66442936

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010258019.1A Active CN111794993B (en) 2019-04-03 2020-04-03 Control of fan assembly

Country Status (3)

Country Link
CN (1) CN111794993B (en)
GB (1) GB2582796B (en)
WO (1) WO2020201686A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102192171A (en) * 2010-03-08 2011-09-21 艾美特电器(深圳)有限公司 Intelligent tracking oscillating fan
JP2012117418A (en) * 2010-11-30 2012-06-21 Japan Science & Technology Agency Blast system
TW201241321A (en) * 2011-04-12 2012-10-16 Hon Hai Prec Ind Co Ltd Controlling system, method for fans, and fan including the same
JP2014190686A (en) * 2013-03-28 2014-10-06 Daikin Ind Ltd Terminal device and air conditioning unit including the same
CN109073252A (en) * 2016-05-11 2018-12-21 三菱电机株式会社 Air-conditioning visualization system

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2488467A (en) 1947-09-12 1949-11-15 Lisio Salvatore De Motor-driven fan
GB2464736A (en) 2008-10-25 2010-04-28 Dyson Technology Ltd Fan with a filter
GB2476172B (en) 2009-03-04 2011-11-16 Dyson Technology Ltd Tilting fan stand
GB2482547A (en) 2010-08-06 2012-02-08 Dyson Technology Ltd A fan assembly with a heater
GB2509111B (en) 2012-12-20 2017-08-09 Dyson Technology Ltd A fan
KR102105463B1 (en) * 2013-09-02 2020-04-28 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal
GB2537584B (en) 2015-02-13 2019-05-15 Dyson Technology Ltd Fan assembly comprising a nozzle releasably retained on a body
KR102622756B1 (en) * 2016-05-04 2024-01-10 주식회사 엘지생활건강 Flying apparatus with blowing function and method for drying target in flying apparatus
JP6576568B2 (en) * 2016-09-02 2019-09-18 三菱電機株式会社 Air conditioning system
US20190003480A1 (en) * 2017-06-29 2019-01-03 David R. Hall Programmable Fan


Also Published As

Publication number Publication date
GB2582796A (en) 2020-10-07
CN111794993A (en) 2020-10-20
GB201904700D0 (en) 2019-05-15
GB2582796B (en) 2021-11-03
WO2020201686A1 (en) 2020-10-08

Similar Documents

Publication Publication Date Title
WO2019128070A1 (en) Target tracking method and apparatus, mobile device and storage medium
US9667870B2 (en) Method for controlling camera operation based on haptic function and terminal supporting the same
JP5049228B2 (en) Dialogue image system, dialogue apparatus and operation control method thereof
US10169880B2 (en) Information processing apparatus, information processing method, and program
CN108021145A (en) The autonomous camera system of unmanned plane mobile image kept with target following and shooting angle
KR20160003553A (en) Electroninc device for providing map information
JP2009011362A (en) Information processing system, robot apparatus, and its control method
CN110825333B (en) Display method, display device, terminal equipment and storage medium
US20220155880A1 (en) Interacting with a smart device using a pointing controller
KR20170086482A (en) Control device and control method of flying bot
US20190172271A1 (en) Information processing device, information processing method, and program
CN106792537A (en) A kind of alignment system
CN111794993B (en) Control of fan assembly
JPWO2020166178A1 (en) Information processing equipment, information processing methods and programs
JP6974247B2 (en) Information processing equipment, information presentation instruction method, program, and recording medium
TW201241321A (en) Controlling system, method for fans, and fan including the same
JP2016017707A (en) Air conditioning system
JP6855616B2 (en) Operating devices, mobile devices, and their control systems
JP6685742B2 (en) Operating device, moving device, and control system thereof
CN112788443A (en) Interaction method and system based on optical communication device
US20220030206A1 (en) Information processing apparatus, information processing method, program, and projection system
US20130155211A1 (en) Interactive system and interactive device thereof
US20200320729A1 (en) Information processing apparatus, method of information processing, and information processing system
KR20170019777A (en) Apparatus and method for controling capturing operation of flying bot
WO2023058083A1 (en) Air blowing control device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant