WO2010146446A1 - Robotized lighting apparatus and control method - Google Patents

Robotized lighting apparatus and control method

Info

Publication number
WO2010146446A1
Authority
WO
WIPO (PCT)
Prior art keywords
head
sensors
light beam
hand
gestures
Prior art date
Application number
PCT/IB2010/001454
Other languages
French (fr)
Inventor
Davide Girlando
Andrea Mangone
Luca Carlone
Matteo Bianchi
Andrea Bonarini
Basilio Bona
Matteo Matteucci
Italo Belmonte
Original Assignee
Davide Girlando
Andrea Mangone
Luca Carlone
Matteo Bianchi
Andrea Bonarini
Basilio Bona
Matteo Matteucci
Italo Belmonte
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Davide Girlando, Andrea Mangone, Luca Carlone, Matteo Bianchi, Andrea Bonarini, Basilio Bona, Matteo Matteucci, Italo Belmonte filed Critical Davide Girlando
Priority to EP10740260A priority Critical patent/EP2443388A1/en
Publication of WO2010146446A1 publication Critical patent/WO2010146446A1/en

Classifications

    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21LIGHTING
    • F21VFUNCTIONAL FEATURES OR DETAILS OF LIGHTING DEVICES OR SYSTEMS THEREOF; STRUCTURAL COMBINATIONS OF LIGHTING DEVICES WITH OTHER ARTICLES, NOT OTHERWISE PROVIDED FOR
    • F21V23/00Arrangement of electric circuit elements in or on lighting devices
    • F21V23/04Arrangement of electric circuit elements in or on lighting devices the elements being switches
    • F21V23/0442Arrangement of electric circuit elements in or on lighting devices the elements being switches activated by means of a sensor, e.g. motion or photodetectors
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21LIGHTING
    • F21SNON-PORTABLE LIGHTING DEVICES; SYSTEMS THEREOF; VEHICLE LIGHTING DEVICES SPECIALLY ADAPTED FOR VEHICLE EXTERIORS
    • F21S6/00Lighting devices intended to be free-standing
    • F21S6/002Table lamps, e.g. for ambient lighting
    • F21S6/003Table lamps, e.g. for ambient lighting for task lighting, e.g. for reading or desk work, e.g. angle poise lamps
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21LIGHTING
    • F21VFUNCTIONAL FEATURES OR DETAILS OF LIGHTING DEVICES OR SYSTEMS THEREOF; STRUCTURAL COMBINATIONS OF LIGHTING DEVICES WITH OTHER ARTICLES, NOT OTHERWISE PROVIDED FOR
    • F21V19/00Fastening of light sources or lamp holders
    • F21V19/02Fastening of light sources or lamp holders with provision for adjustment, e.g. for focusing
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21LIGHTING
    • F21VFUNCTIONAL FEATURES OR DETAILS OF LIGHTING DEVICES OR SYSTEMS THEREOF; STRUCTURAL COMBINATIONS OF LIGHTING DEVICES WITH OTHER ARTICLES, NOT OTHERWISE PROVIDED FOR
    • F21V21/00Supporting, suspending, or attaching arrangements for lighting devices; Hand grips
    • F21V21/14Adjustable mountings
    • F21V21/15Adjustable mountings specially adapted for power operation, e.g. by remote control
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B45/00Circuit arrangements for operating light-emitting diodes [LED]
    • H05B45/10Controlling the intensity of the light
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • H05B47/115Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B47/12Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by detecting audible sound
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • H05B47/115Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B47/125Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21LIGHTING
    • F21SNON-PORTABLE LIGHTING DEVICES; SYSTEMS THEREOF; VEHICLE LIGHTING DEVICES SPECIALLY ADAPTED FOR VEHICLE EXTERIORS
    • F21S6/00Lighting devices intended to be free-standing
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21LIGHTING
    • F21VFUNCTIONAL FEATURES OR DETAILS OF LIGHTING DEVICES OR SYSTEMS THEREOF; STRUCTURAL COMBINATIONS OF LIGHTING DEVICES WITH OTHER ARTICLES, NOT OTHERWISE PROVIDED FOR
    • F21V21/00Supporting, suspending, or attaching arrangements for lighting devices; Hand grips
    • F21V21/14Adjustable mountings
    • F21V21/26Pivoted arms
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21LIGHTING
    • F21VFUNCTIONAL FEATURES OR DETAILS OF LIGHTING DEVICES OR SYSTEMS THEREOF; STRUCTURAL COMBINATIONS OF LIGHTING DEVICES WITH OTHER ARTICLES, NOT OTHERWISE PROVIDED FOR
    • F21V33/00Structural combinations of lighting devices with other articles, not otherwise provided for
    • F21V33/0004Personal or domestic articles
    • F21V33/0052Audio or video equipment, e.g. televisions, telephones, cameras or computers; Remote control devices therefor
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21LIGHTING
    • F21YINDEXING SCHEME ASSOCIATED WITH SUBCLASSES F21K, F21L, F21S and F21V, RELATING TO THE FORM OR THE KIND OF THE LIGHT SOURCES OR OF THE COLOUR OF THE LIGHT EMITTED
    • F21Y2115/00Light-generating elements of semiconductor light sources
    • F21Y2115/10Light-emitting diodes [LED]
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • the present invention refers to an innovative robotized lighting apparatus, which allows an innovative interactive control by the user.
  • Lighting apparatuses provided with sensors to react to external stimuli are known from the prior art.
  • lamps provided with ON and OFF sensors sensitive to movement, sound or voice are currently available in the market.
  • such lamps, however, offer only a limited degree of interaction with the user, in that they merely eliminate the need to operate the ON/OFF switch manually.
  • Lamps motorised and controlled electronically to perform the movement of special emitter devices or markers were also proposed.
  • the emitter devices for example may be worn on the wrist, so that the lamp apparently follows the movement of the hand with the light beam.
  • this system is uncomfortable (it requires wearing a special control device) and scarcely flexible, and the interaction with the user remains poor: the lamp actually interacts with the device it is programmed to follow, not with the user.
  • the general object of the present invention is that of overcoming the abovementioned drawbacks by providing a control method and a robotized lighting apparatus with an innovative gesture control which allows an actual interaction between the user and the light.
  • a lighting apparatus comprising a head with a light source directed in a light beam, a motorised kinematic structure for spatially directing the head, sensors for detecting control stimuli and an electronic control system which receives signals from said sensors and controls the movement of the head according to the detected stimuli, characterised in that it comprises an image sensor arranged in the head and directed in the direction of the light beam, and electronic processing means suitable for processing the images taken by the image sensor so as to distinguish at least one hand of a user inserted into the beam, to recognise a gesture thereof from among a predetermined series of gestures preset in the control system, and to control a corresponding interactive behaviour of the light source, selecting it from among a series of behaviours which are stored in the control system in association with the gestures of the predetermined series.
  • still according to the principles of the invention, also proposed was a method for controlling a lighting apparatus comprising a head with a light source directed in a light beam and a motorised kinematic structure for spatially directing the head, in which an image, detected by an image sensor arranged in the head and directed in the direction of the light beam, is electronically processed to: recognise at least one hand of a user inserted into the beam; distinguish a gesture from among a predetermined series of gestures of the hand; and control a corresponding interactive behaviour of the light source, selecting it from among a series of behaviours associated with the gestures of the predetermined series.
  • -figure 1 represents a partly sectional schematic view of a lighting apparatus according to the invention
  • -figure 2 represents an enlarged view of the lighting head of the apparatus of figure 1
  • -figure 3 schematically represents means for the variation of the light beam in the apparatus of figure 1.
  • figure 1 shows, schematically and in perspective view, a lighting apparatus, generally indicated with 10, provided according to the principles of the invention, having a beam lighting head 14 and a motorised kinematic structure 35 for spatially directing the head.
  • the lighting apparatus was provided in form of a desk lamp.
  • Other forms may however be conceived according to specific needs.
  • the principles of the invention may easily be applied to other forms (ceiling lamps, floor-standing lamps, suspended lights, lighting structures such as those of a dental unit, etc.).
  • the robot lamp was conceived as similar as possible to the general form of a conventional desk lamp, so as not to distract the user and make the use of the lamp as natural as possible.
  • the apparatus 10 comprises a support base 11 and an articulated kinematic chain, in turn comprising two articulated arms 12, 13 which end with a lighting head 14 which emits a directive light beam.
  • Respective joints which are motorised to rotate on command according to horizontal parallel axes 15, 16, 17 are present between the base and the first arm, between the two arms and between the second arm and the head.
  • a further motorised axis 18, with vertical rotation, is provided between the base and the first arm.
  • the head 14 may also rotate around a second motorised axis 19 which is transverse to the axis 17.
  • the lamp has six degrees of freedom which guarantee good flexibility.
  • the lamp is externally similar to most conventional manual desk lamps available in the market, and the considerable complexity that distinguishes it from any other lamp is well concealed inside.
  • the structure of an innovative lamp according to the invention can be similar to a robotic arm. From a geometric point of view, the arm may be seen as a kinematic chain, constituted by links and joints. Each joint adds a degree of movement to the robot. The concept of a degree of movement is different from that of degrees of freedom, which is the number of coordinates that uniquely identifies the pose of a solid element in space. The number of degrees of movement is related to the capability of the robot. It was found advantageous to keep the degrees of movement of the lamp 10 identical to those of common desk lamps.
  • the structure of the lamp 10 is basically made of two layers: the internal mechatronic structure and the outer aesthetic cover.
  • the internal structure constitutes the actual robot and it is made up of a structure (for example, metallic) which supports the motors, the encoders, the gears and the sensors.
  • the aesthetic cover may instead be formed by light shells, as easily imaginable by a man skilled in the art.
  • the cover is indicated with a dashed line, being of any desired aesthetic aspect.
  • Various materials such as, for example, plastic, aluminium, carbon fibre etc may be used for the aesthetic cover.
  • the structure of the lamp may however be of the totally or partly self-bearing type.
  • the base 11 of the lamp (connected to the power supply network through a cable, not shown) also accommodates a control system or main electronic unit 25 for the intelligent control of the lamp, which is connected to the sensors and to the actuators present in the lamp.
  • Such unit 25 may also be a known microprocessor controller, suitably programmed, as clear to a man skilled in the art from the explanations that follow.
  • each motorised axis 15, 16, 17, 18, 19 has its own electric gear motor, respectively indicated with 20, 21, 22, 23, 24.
  • the gear motor 20 which is kinematically connected to the first horizontal axis 15 and which must bear the higher stress, is provided having two motors connected in parallel in the base for doubling the torque.
  • This allows using electric motors (advantageously, DC motors) identical to those of the axes 16 and 18.
  • the gear motors 22 and 24, which control the two movement axes of the head 14, are instead much smaller and lighter, given that they are required to move a minimal load, so as to weigh less on the motors of the arm joints.
  • each gear motor has an associated position encoder, as easily imaginable by a man skilled in the art, which provides the position information to the control board.
  • the exact technology for controlling the position of the motorised axes may obviously vary depending on the specific practical implementation requirements and desired costs.
  • the gear motors may be, all or partly of the known pulse width controlled type (in the figures, the gear motors 22 and 24 are, for example of this type) .
  • the coordinated robotized control of the joints of a kinematic chain (for example, by using inverse kinematics) so that an end head reaches the desired points in space is known.
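  • purely as an illustration of the inverse-kinematics idea mentioned above, the following is a minimal sketch for a simplified two-link planar arm; the function name, the restriction to two joints in a plane and the "elbow-down" choice are assumptions for the example, not the actual five-axis solver of the lamp:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Analytic inverse kinematics for a planar two-link arm.

    Returns joint angles (shoulder, elbow) in radians that place the
    end effector at (x, y), using the 'elbow-down' solution; raises
    ValueError when the target is out of reach.
    """
    r2 = x * x + y * y
    # Law of cosines gives the cosine of the elbow angle.
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(c2)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

The real lamp would solve the same kind of problem for its full kinematic chain, with joint limits and the head orientation as additional constraints.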
  • the lamp also comprises a further actuator to allow the controlled variation of the width of the light beam.
  • such actuator, indicated with 26, is advantageously arranged over a parabolic reflector 27 of the lamp and, as schematically shown in figure 3, controls the movement of the light source 28 with respect to the focus of the reflector.
  • This solution was found advantageous in that it allows an easy, progressive and accurate control of the width of the beam, simultaneously maintaining the light intensity quite constant. Furthermore, such solution allows energy saving with respect to systems based on the occlusion of the beam, such as for example the diaphragm systems.
  • figure 3a shows the most receded position of the source, corresponding to the narrowest beam
  • figure 3b shows the most advanced position of the source, corresponding to the widest beam.
  • the opening angle of the light beam may vary from about 23° to about 100°.
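  • as a hedged sketch of how the beam-width actuator could be commanded, the relation between the requested opening angle and the axial position of the source may, as a first approximation, be treated as linear between the two extreme positions; the function name, the 10 mm maximum travel and the linearity itself are assumptions for the example and would be replaced by a calibration of the real reflector:

```python
def source_offset_for_angle(angle_deg, min_angle=23.0, max_angle=100.0,
                            max_offset_mm=10.0):
    """Map a requested beam opening angle to an axial source offset.

    Assumes a linear relation between how far the source is advanced
    from its most receded position (figure 3a) and the resulting beam
    angle; the real mapping depends on the reflector profile.
    """
    angle_deg = max(min_angle, min(max_angle, angle_deg))  # clamp to range
    t = (angle_deg - min_angle) / (max_angle - min_angle)
    return t * max_offset_mm
```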
  • the light source may be constituted by one or more power LEDs, which have high efficiency and also allow easy electronic control of the luminosity through well-known methods. Totally electronic systems for varying the width of the beam may however be used.
  • for example, several concentric LED rings may be used, and the width of the beam may be varied by changing the number of lit rings.
  • this solution allows providing an extremely thin head exploiting LED technology, but it reveals some drawbacks. As a matter of fact, it complicates the control, producing an interaction between the width of the beam and the lighting intensity, and hence forcing the system to modify the luminosity of the LEDs in coordination with the number of lit rings.
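  • the coordinated correction just described could be sketched as follows; this is a simplified model assuming each LED contributes equally to the total flux at a given drive level, and the names and values are hypothetical:

```python
def ring_duty_cycles(lit_rings, leds_per_ring, target_flux):
    """Compute a common PWM duty cycle for the lit concentric LED rings
    so that total output stays near a target flux level.

    `leds_per_ring` lists the LED count of each ring (innermost first);
    each LED is assumed to emit one flux unit at 100% duty.  Returns a
    list with one duty value per ring (0 for unlit rings).
    """
    total_leds = sum(leds_per_ring[:lit_rings])
    if total_leds == 0:
        return [0.0] * len(leds_per_ring)
    duty = min(1.0, target_flux / total_leds)
    return [duty if i < lit_rings else 0.0
            for i in range(len(leds_per_ring))]
```

Widening the beam (more lit rings) then automatically lowers the duty cycle of every ring so the overall intensity does not jump.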
  • the lighting apparatus also comprises a given number of sensors to allow the control interaction with the user.
  • sensors do not require a particular structuring of the environment.
  • the robot lamp may be seen as an interactive agent which acquires information from the normal environment, interprets it (extracting the characteristics of interest) and produces a desired output.
  • the lamp is advantageously provided with three types of sensor systems, with each sensor system providing a different and particular sensorial channel and which regards a different type of response in the behaviour of the lamp.
  • an image sensor herein represented by a camera 30, which has a direction of vision substantially coincident with the direction of the light beam emitted by the lamp is above all present in the head 14.
  • This image sensor alongside associated means for electronically processing the image, provides the main gesture control system of the lamp.
  • the vision angle of the camera is such that the captured image contains (with, for example, just a slight margin) the area 31 lighted by the light beam at its maximum width.
  • this allows an innovative interaction with the user based on "a gesture interacting with the light” and which makes the use of the lamp according to the invention surprisingly simple and natural.
  • the fact that the field of vision of the camera is identified by the light cone projected by the lamp not only allows easy interaction between the user and the lamp, but also allows simpler and more reliable operation of the vision system which, as explained hereinafter, must recognise in the acquired images the hands placed by the user within the light cone and on its edges.
  • Vision systems are often used in robotics for the identification of objects and for the calculation of the movement and posture thereof and a detailed description of the operation thereof is not required herein.
  • advantageously, a known so-called "smart camera" may be used, i.e. a camera also integrating an artificial vision system and comprising, alongside the image-capturing systems, its own specific processing unit (which may extract information from the images without requiring an external processing unit) and interface devices used for sending the results of the processing to other devices (in this case, the main processing unit 25).
  • the smart camera allows easy processing of the images: the camera may identify the position and the dimensions of the user's hands and communicate them to the central control unit 25 which, as described, consequently controls the movement of the lamp, thus providing an inexpensive and efficient interface for "modelling" light.
  • the unit for processing the images may obviously be contained in the main unit.
  • the head 14 also contains some known sensors for the distance detection of objects around the head, indicated with 32.
  • the distance sensors are of the infrared type. Such sensors allow measuring the distances of the objects arranged along their line or cone of vision.
  • the operation thereof is based on the emission of an infrared light pulse (by means of an infrared emitter Led present in the sensor) which when reflected by an object is detected by a suitable infrared detector of the sensor.
  • a triangle is formed between the reflection point, the emitter and the detector.
  • by means of a suitable triangulation technique, the sensor provides in output an electrical quantity (for example, a voltage) which is a function of the distance of the object from the sensor itself.
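  • as a hedged sketch of how such a reading might be converted into a distance, many IR triangulation sensors have an output voltage roughly inversely proportional to distance; the constants below are hypothetical calibration values, not those of the actual sensors:

```python
def ir_distance_cm(voltage, k=27.0, offset=0.1):
    """Convert an IR triangulation sensor reading to a distance estimate.

    Assumes distance ≈ k / (V - offset), a common first-order model for
    triangulation sensors; k and offset are hypothetical and would come
    from calibrating the real sensor against known distances.
    """
    if voltage <= offset:
        return float("inf")  # no reflection detected / out of range
    return k / (voltage - offset)
```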
  • the infrared sensors are arranged in the head 14 to create a detection area 33 around the head 14.
  • three sensors are used for radially "looking" towards the external of the lighting head and a fourth sensor is used for looking upwards.
  • the detection angle of the sensors allows detecting a relatively small object approaching the head of the lamp from the various directions .
  • any other number of sensors may be provided, also depending on the type of sensor used in practice, so as to obtain suitable coverage of the space around the head of the lamp. For example, should one decide to use an ultrasonic sensor as a distance sensor, this may require a different mounting and/or positioning.
  • the sensors 32 are used both for a "touchless” interaction with the user and for avoiding impacts against obstacles during the robotized movement of the lamp.
  • the sensors located in the head of the lamp are capable of detecting a hand of the user (or any other object) allowing not only the gesture interaction but also preventing an unwanted contact with objects of the environment .
  • the lamp 14 also comprises acoustic sensors, advantageously provided with several microphones 34, for spatial identification of sounds.
  • acoustic sensors are advantageously arranged in the base 11. Suitable openings shall thus be provided in the covering of the base (not shown in figure 4) .
  • the acoustic sensors are important both for a localisation of the user and for acoustic interaction.
  • the lamp may be actuated by clapping the hands.
  • the presence of microphones makes the platform expandable (for example, for the speech recognition, allowing further increasing the level of interaction) .
  • the field of sound source localisation has been studied over the decades and thus further details shall not be provided herein. As a matter of fact, a man skilled in the art may easily imagine how it is possible to identify the spatial position of a sound source starting from the signal captured by some microphones suitably arranged.
  • the advantage of positioning the microphones at the base 11 lies in the fact that this allows considering the position of the microphones fixed in the origin of the spatial reference system of the lamp and the calculations are thus simplified.
  • the short distance at which the microphones must be positioned to remain in the base may create some problems related to an accurate identification of the sound source.
  • an error of even tens of centimetres in the spatial identification of the sound source may still be acceptable, as will be seen hereinafter.
  • the difference between the times of arrival of the sound at the four microphones thus provides information on the position of the source and, if the microphones are not coplanar, the point is determined uniquely.
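  • the time-difference-of-arrival idea can be sketched as follows; this is a deliberately simple stand-in (a coarse grid search minimising the squared TDOA residual) for the least-squares solvers used in practice, and all names and values are illustrative assumptions:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s

def localize_source(mics, tdoas, search_box, step=0.05):
    """Estimate a sound-source position from time differences of arrival.

    `mics` is a list of microphone positions (x, y, z); `tdoas[i]` is
    the measured arrival-time difference between microphone i and
    microphone 0.  A grid search over `search_box` ((xmin, xmax),
    (ymin, ymax), (zmin, zmax)) picks the point whose predicted TDOAs
    best match the measured ones.
    """
    def residual(p):
        d0 = math.dist(p, mics[0])
        err = 0.0
        for mic, t in zip(mics[1:], tdoas[1:]):
            predicted = (math.dist(p, mic) - d0) / SPEED_OF_SOUND
            err += (predicted - t) ** 2
        return err

    best, best_err = None, float("inf")
    (x0, x1), (y0, y1), (z0, z1) = search_box
    x = x0
    while x <= x1:
        y = y0
        while y <= y1:
            z = z0
            while z <= z1:
                e = residual((x, y, z))
                if e < best_err:
                    best, best_err = (x, y, z), e
                z += step
            y += step
        x += step
    return best
```

With the microphones fixed in the base, their positions are constants of the lamp's reference frame, which is exactly the simplification mentioned above.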
  • the microphone system may also be extended to allow greater robustness with respect to the surrounding sound.
  • the first method of interaction is gestural, and it allows an easy and complete control of various aspects of the light beam, thus avoiding forcing the user, for example, to regulate a set of knobs and buttons whose meaning and function are often complex and counterintuitive.
  • the recognition of the gestures is obtained through the camera 30 which is always directed towards the light beam, so that the user always knows how he is seen by the lamp. Furthermore, the objects subject of recognition (the hands) are thus definitely always well lighted and contrasted and, thus, easier to recognise.
  • the artificial vision system (connected to the camera or directly present therein) is herein provided or programmed to substantially distinguish three types of gestures according to the shape of the hand and the movement thereof under the camera (and, thus, in the light cone projected by the lamp) . The control system then consequently reacts to the three types of gestures.
  • the head of the lamp may be controlled by the control system so that it only rotates, without translating, as long as it can still reach the desired point with its light.
  • the arms of the lamp may be actuated by the system to displace the head to a more favourable position.
  • two hands inserted entirely (or even partly, if far enough to be recognised) into the light cone and moving away from each other command the light spot to become larger, while the two hands moving closer to each other make it smaller. This is shown in figure 6.
  • Such behaviour could be useful, for example, if the user is reading or working while the roommate is sleeping, or for any other reason.
  • a recognition operation by means of a computerised vision system may be a complex operation.
  • the background conditions (for example, the desk surface) may vary considerably.
  • the objects to be recognised (the hands) are not always of the same colour and may also vary considerably from one person to another.
  • there are however characteristics that make a hand easily recognisable and which may be sufficient in this case: the shape, the dimension, and the relative uniformity of colour (the skin of the people in question may be darker or lighter, but the two types are usually not found on the same hand).
  • the fact that the images are taken under the directed light cone of the lamp makes the shooting conditions much more suitable for quick and reliable recognition of the contour of the hands.
  • the selected gestures are not only intuitive for the user intending to interact with the lamp but they are also easily distinguishable according to the shape of the hand and the relative movement.
  • an open hand has five fingers which are recognizable by means of a simple shape analysis. When the fingers are clenched, the hand acquires an approximately ellipsoidal shape with the "peaks" of the fingers still easily identifiable in the contour. It is even easy to identify the hand which becomes larger or smaller in the image when it moves up and down (towards and away from the camera) .
  • two hands moving away from each other may be easily detected by means of an inter-frame analysis.
  • a spot-detection process may be applied to find all the sets of adjacent pixels that meet the colour requirements of a human hand.
  • the zones that do not present considerable colour differences and which have a minimum dimension compatible with the image of a hand at the maximum foreseeable shooting distance may be selected.
  • Luminosity normalisation may also be required to compensate the general luminosity variation.
  • the contours thereof may then be analysed to see whether they meet the characteristics set up to represent a hand performing one of the gestures defined as significant for the system controlling the lamp. For example, if the spot is an open hand, a low-pass filtered version of the distance between the centroid and the contour points reveals five large peaks, corresponding to the fingers. Other similarly simple filtering processes allow distinguishing the other gestures.
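  • the low-pass filtering and peak counting just described could be sketched as follows; this is a minimal illustration in which the circular moving average, the prominence threshold and all numeric values are assumptions, not the actual implementation:

```python
def count_finger_peaks(radii, window=5, min_prominence=0.15):
    """Count peaks in a centroid-to-contour distance profile.

    `radii` is the distance from the blob centroid to each contour
    point, in order around the contour.  The profile is smoothed with a
    circular moving average (a simple low-pass filter) and local maxima
    that exceed the mean by `min_prominence` (relative) are counted; an
    open hand is expected to yield roughly five such peaks.
    """
    n = len(radii)
    half = window // 2
    smooth = [sum(radii[(i + j) % n] for j in range(-half, half + 1)) / window
              for i in range(n)]
    mean = sum(smooth) / n
    threshold = mean * (1.0 + min_prominence)
    peaks = 0
    for i in range(n):
        prev, nxt = smooth[i - 1], smooth[(i + 1) % n]
        if smooth[i] > threshold and smooth[i] >= prev and smooth[i] > nxt:
            peaks += 1
    return peaks
```

A clenched fist, with its roughly elliptical contour, would produce no prominent peaks, while an open hand produces about five.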
  • the lamp follows the hand constantly so as to keep its centroid at the centre of the image (which more or less coincides with the centre of the light spot projected by the lamp).
  • the movement of the two hands is the main element: if a difference analysis between the frames reveals a movement directed mainly outwards (or, on the contrary, towards the centre), the distance between the two hands is measured and the lamp behaves accordingly.
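  • the inter-frame decision just described could be sketched as follows; the function name, the pixel dead band and the three labels are assumptions for the example:

```python
import math

def classify_two_hand_motion(prev_centroids, curr_centroids, dead_band=5.0):
    """Classify a two-hand gesture from blob centroids in two frames.

    `prev_centroids` and `curr_centroids` each hold the (x, y) centroids
    of the two hand blobs in consecutive frames.  If the inter-hand
    distance grows by more than `dead_band` pixels the gesture is read
    as "widen" (enlarge the light spot), if it shrinks as "narrow",
    otherwise "hold".
    """
    delta = math.dist(*curr_centroids) - math.dist(*prev_centroids)
    if delta > dead_band:
        return "widen"
    if delta < -dead_band:
        return "narrow"
    return "hold"
```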
  • a further "key" gesture may be provided, which could advantageously be a closed fist, to be performed to start and conclude each process for identifying a command gesture. This prevents the user from inadvertently commanding the lamp by performing under the camera some simple gesture (for example, extending the hand to pick up a pen) that the system could erroneously detect as a command gesture.
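  • the gating role of the key gesture can be sketched as a small stateful filter; the labels and the toggle behaviour are assumptions for the example, not the patented logic:

```python
def make_gesture_gate():
    """Stateful filter for the 'key gesture' idea: recognised command
    gestures pass through only between two sightings of the key gesture
    (here a closed fist), so casual hand movements under the camera are
    ignored.  Returns a function fed one gesture label per frame; it
    yields the gesture when the gate is armed, else None.
    """
    state = {"armed": False}

    def feed(gesture):
        if gesture == "fist":
            state["armed"] = not state["armed"]  # the fist toggles arming
            return None
        return gesture if state["armed"] else None

    return feed
```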
  • a second method of interaction with the lamp referred to as "touchless movement” , exploits the infrared sensors to detect an object, outside the light cone, which nears the head of the lamp.
  • this method of operation is aimed not at controlling the light but rather at moving the physical object represented by the lamp, contrary to the gestural method obtained by means of the camera, in which the hands "interact" with the light cone and with its spot projected on a surface.
  • the "touchless movement” allows moving the head of the lamp without touching it and “pushing” or “pulling” it virtually in any direction by simply moving the hand towards or away from the head of the lamp.
  • the lamp moves and rotates to compensate the "touchless” movement and still continue lighting the same area.
  • This is schematically shown in figure 8 for a repulsion movement of the lamp.
  • the user may move the body of the lamp to a more comfortable configuration using the hands to command the movement.
  • the "communication channel” provided by means of the infrared sensors is different from the previous one for gesture visual movement in that, while the gesture under the camera allows changing the light conditions, the "touchless” interaction allows modifying the position of the lamp without moving the lighted area, with the control system of the lamp set to compensate the relocation.
  • the two systems are complementary and allow a complete interaction with the lamp. Normally, when the position of a conventional lamp creates discomfort, due to shadow or because it hinders some movement one would like to perform, one must grasp it, move it and then aim the light again where required. The touchless interaction reduces this whole tedious process to something close to the instinctive gesture of pushing the lamp away, performed almost as easily as it is thought.
  • the proximity sensors ignore any information when the hands, or any other body, are too far away. Furthermore, a tolerance time interval should be set, so that sudden movements of the user do not cause abrupt changes and so that, once the position has been adjusted satisfactorily, the lamp does not try to follow the hand as it moves away.
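  • the tolerance interval just mentioned amounts to a debounce on the proximity channel, which could be sketched as follows; the frame count and names are assumptions that would be tuned on the robot:

```python
def make_presence_filter(hold_frames=10):
    """Debounce for the proximity channel: the raw 'hand near the head'
    flag must persist for `hold_frames` consecutive samples before the
    lamp reacts, so that quick passing movements are ignored.  Returns
    a function fed one raw boolean per sample; it returns the filtered
    flag.
    """
    state = {"count": 0}

    def feed(raw_near):
        state["count"] = state["count"] + 1 if raw_near else 0
        return state["count"] >= hold_frames

    return feed
```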
  • the indication of the distance between an obstacle on the line of vision of the sensor and the sensor itself can be obtained from the electrical quantity at the output of the infrared sensors (for example, the rising and falling output voltage). Such measurement of the distance is the input for the touchless movement.
  • a possible innovative system for providing the touchless interaction is based on the fusion of the information from the IR sensors by means of fuzzy logic.
  • Fuzzy logic is a per se known multiple value logic, derived from the theory of fuzzy sets and which deals with approximate logic instead of precise logic.
  • the logic associates a value with a variable of interest. This value, often referred to as the degree of truth, may vary anywhere between 0 and 1, and it indicates the correspondence between a proposition and the observed phenomena.
  • degree of truth of a proposition may vary at several levels between 0 and 1 and it is not restricted to the two values ⁇ true, false ⁇ of the classic binary logic.
  • an area of interest in which the user may interact with the lamp and is yet to be detected by the sensors, and an external area, where no interaction may occur, may be conceptually distinguished. This distinction is carried out to prevent an unwanted interaction and leave the hands of the user free when no interaction is required.
  • a two- level conventional logic in and out of the area of interest
  • the lamp has a more satisfactory behaviour if there is a further partitioning into areas which are defined by progressively increasing distances from the sensor, like onion layers.
  • the actual distances (for example, in centimetres from the head) which identify the borders of each layer or area may be defined in practice according to the exact desired behaviour and as a compromise between the desired maximum control distance and the probability of false detections.
  • the areas may be defined (from that closest to the head outwards) as:
  • Near area: the distance measured by the infrared sensors is lower than a minimum threshold value, and the lamp reacts by moving away from objects that enter this area. This also inherently guarantees obstacle avoidance.
  • the near area is thus a repulsion area. In a conventional lamp this movement is obtained by physically pushing the head of the lamp using a hand;
  • Figure 9 shows a possible example of the arrangement of the areas according to the "fuzzy" logic. As observable in the figure, the various areas do not have sharp borders.
  • Figure 10 schematically shows the third method of interaction, which uses the microphone system for spatial identification of sounds.
  • a sound with predetermined characteristics thus identifies a "sound signal" (for example, snapping fingers or clapping hands).
  • the source point is localised and the lamp directs the light beam towards such point, advantageously without moving the head of the lamp, but solely rotating it, unless the movement is required.
  • Such operating method is useful, for example, when the light, for whatever reason, is required on the other side of the desk and dragging it over the entire distance may be uncomfortable or complicated.
  • the sound signal thus provides a more immediate way of calling the light to the desired point. If required, a finer adjustment of the position may subsequently be performed through the gesture command method.
  • a second camera may be provided to frame other details, such as the face of the user, present in the surrounding area.
  • Luminosity sensors may also be provided to adjust the light intensity of the lamp to that of the surroundings.
  • the same camera may be used to serve such purpose.
  • the possible "key" activation gesture of the gestural recognition may also be a command coming from another sensorial channel, for example the acoustic one, through a voice command or preset sound. Other distance sensors may be provided for.


Abstract

A lighting apparatus comprises a head (14) with a light source directed in a light beam and a motorised kinematic structure (25) for spatially directing the head. An image sensor (30) is arranged in the head and it is directed in the direction of the light beam. Electronic processing means (25, 30) process the images taken by the image sensor (30) to distinguish at least one hand of a user inserted into the beam, to distinguish a gesture therein from among a predetermined series of preset gestures in the control system and control a corresponding interactive behaviour of the light source. Further distance sensors and sensors for identifying the position of acoustic sources are provided for further additional interactive behaviours of the apparatus.

Description

ROBOTIZED LIGHTING APPARATUS AND CONTROL METHOD
The present invention refers to an innovative robotized lighting apparatus, which allows an innovative interactive control by the user.
Lighting apparatuses provided with sensors to react to external stimuli are known from the prior art. For example, lamps provided with ON and OFF sensors sensitive to movement, sound or voice are currently available in the market.
These lamps have a limited degree of interaction with the user, in that they merely eliminate the need for the user to manually operate the ON/OFF switch. Motorised lamps, controlled electronically to follow the movement of special emitter devices or markers, were also proposed. The emitter device may for example be worn on the wrist, so that the lamp apparently follows the movement of the hand with the light beam. However, this system is uncomfortable (it requires wearing a special control device) and scarcely flexible. Interaction with the user remains poor: the lamp actually interacts with the device it is programmed to follow, not with the user.
The general object of the present invention is that of overcoming the abovementioned drawbacks by providing a control method and a robotized lighting apparatus with an innovative gesture control which allows an actual interaction between the user and the light. With the aim of attaining such object it was proposed, according to the invention, to provide a lighting apparatus comprising a head with a light source directed in a light beam, a motorised kinematic structure for spatially directing the head, sensors for detecting control stimuli and an electronic control system which receives signals from said sensors and controls the movement of the head according to the detected stimuli, characterised in that it comprises an image sensor arranged in the head and directed in the direction of the light beam, and electronic processing means suitable for processing the images taken by the image sensor to distinguish at least one hand of a user inserted into the beam, to distinguish a gesture therein from among a predetermined series of preset gestures in the control system and control a corresponding interactive behaviour of the light source selecting it from among a series of behaviours which are stored in the control system associated with the gestures of the predetermined series of gestures. 
Still according to the principles of the invention, also proposed was a method for controlling a lighting apparatus comprising a head with a light source directed in a light beam and a motorised kinematic structure for spatially directing the head, in which an image, detected by an image sensor arranged in the head and directed in the direction of the light beam, is electronically processed to: recognise at least one hand of a user inserted into the beam; distinguish a gesture from among a predetermined series of gestures of the hand; control a corresponding interactive behaviour of the light source, selecting it from among a series of behaviours associated with the gestures of the predetermined series of gestures. To clarify the explanation of the innovative principles of the present invention and the advantages thereof with respect to the prior art, a possible exemplifying embodiment applying such principles shall be described hereinafter, with the help of the attached drawings. In the drawings: -figure 1 represents a partly sectional schematic view of a lighting apparatus according to the invention; -figure 2 represents an enlarged view of the lighting head of the apparatus of figure 1; -figure 3 schematically represents means for the variation of the light beam in the apparatus of figure 1;
-figure 4 represents a schematic view of the apparatus operating;
-figures 5 to 8 schematically represent steps of interaction of the user with the apparatus;
-figure 9 represents a diagram describing detection areas for a possible remote control of the apparatus according to a "fuzzy" logic; -figure 10 schematically represents the interaction of the user with acoustic sensors of the apparatus. Referring to the figures, figure 1 shows, schematically and in perspective view, a lighting apparatus, generally indicated with 10, provided according to the principles of the invention, having a beam lighting head 14 and a motorised kinematic structure 35 for spatially directing the head. Advantageously, the lighting apparatus was provided in the form of a desk lamp. Other forms may however be conceived according to specific needs. For example, the principles of the invention may easily be applied to ceiling lamps or lamps with a ground base, suspended lights, lighting structures such as those of a dental unit, etc.
In the example shown herein, the robot lamp was conceived as similar as possible to the general form of a conventional desk lamp, so as not to distract the user and make the use of the lamp as natural as possible.
Thus, the apparatus 10 comprises a support base 11 and an articulated kinematic chain, in turn comprising two articulated arms 12, 13 which end with a lighting head 14 which emits a directive light beam. Respective joints which are motorised to rotate on command according to horizontal parallel axes 15, 16, 17 are present between the base and the first arm, between the two arms and between the second arm and the head. A further motorised axis 18, with vertical rotation, is provided between the base and the first arm. The head 14 may also rotate around a second motorised axis 19 which is transverse to the axis 17. Thus, the lamp has six degrees of freedom which guarantee good flexibility.
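Directing the head of such a motorised kinematic chain towards a target point is a classic inverse kinematics problem. As an illustrative sketch only (a planar two-link arm, a deliberate simplification of the actual six-axis mechanism, with made-up link lengths), the closed-form solution may be written as:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar two-link arm:
    returns the joint angles (theta1, theta2) that place the tip
    of link 2 at the target point (x, y).  A simplified stand-in
    for the lamp's articulated chain, not its real kinematics."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)  # elbow-down solution
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2
```

Applying the forward kinematics to the returned angles reproduces the requested target, which is the usual sanity check for such a solver.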
With the described structure, the lamp is externally similar to most conventional manual desk lamps available in the market, and the high complexity that makes it different from any other lamp is well concealed therein.
The structure of an innovative lamp according to the invention can be similar to a robotic arm. From a geometric point of view, the arm may be seen as a kinematic chain, constituted by links and joints. Each joint adds a degree of movement to the robot. The concept of degree of movement is different from that of degrees of freedom, which represent the number of coordinates that uniquely identify the pose of a solid element in space. The number of degrees of movement is related to the ability of the robot. It was found advantageous to keep the degrees of movement of the lamp 10 identical to those of common desktop lamps. Advantageously, the structure of the lamp 10 is basically made of two layers: the internal mechatronic structure and the outer aesthetic cover. The internal structure constitutes the actual robot and is made up of a frame (for example, metallic) which supports the motors, the encoders, the gears and the sensors.
The aesthetic cover may instead be formed by light shells, as easily imaginable by a man skilled in the art. In figure 1 the cover is indicated with a dashed line, being of any desired aesthetic aspect. Various materials such as, for example, plastic, aluminium, carbon fibre etc may be used for the aesthetic cover. Alternatively, the structure of the lamp may however be of the totally or partly self -bearing type. A control system or main electronic unit 25 for the intelligent control of the lamp and which shall be connected to the sensors and to the actuators present in the lamp is also accommodated in the base 11 of the lamp (connected to the power supply network through a cable not shown) . Such unit 25 may also be a known microprocessor controller, suitably programmed, as clear to a man skilled in the art from the explanations that follow. Given that the unit 25 is per se known and easily imaginable by the man skilled in the art, it shall not be described or shown further hereinafter. Still, as clearly observable in figure 1, each motorised axis 15, 16, 17, 18, 19 has its own electric gear motor, respectively indicated with 20, 21, 22, 23, 24. Advantageously, the gear motor 20, which is kinematically connected to the first horizontal axis 15 and which must bear the higher stress, is provided having two motors connected in parallel in the base for doubling the torque. This allows using electric motors (advantageously, DC motors) identical to those of the axes 16 and 18. Furthermore, the use of only one type of motor allows reducing costs. Also as clearly observable in figure 2, the gear motors 22 and 24 which control the two movement axes of the head 14 are instead much smaller in size, given that they are required to move a minimum load, and lighter, so as to weigh less on the motors of the joints of the arms.
For an accurate control of the position, each gear motor has an associated position encoder, as easily imaginable by a man skilled in the art, which provides the position information to the control board. The exact technology for controlling the position of the motorised axes may obviously vary depending on the specific practical implementation requirements and desired costs. For example, the gear motors may be, all or partly of the known pulse width controlled type (in the figures, the gear motors 22 and 24 are, for example of this type) . The coordinated robotized control of the joints of a kinematic chain (for example, by using the inverse kinematics) to reach with an end head the desired points in space is known. The lamp also comprises a further actuator to allow the controlled variation of the width of the light beam. As better observable in figure 2, such actuator, indicated with 26, is advantageously arranged over a parabolic reflector 27 of the lamp and, as schematically shown in figure 3, controls the movement of the light source 28 with respect to the focus of the reflector. This solution was found advantageous in that it allows an easy, progressive and accurate control of the width of the beam, simultaneously maintaining the light intensity quite constant. Furthermore, such solution allows energy saving with respect to systems based on the occlusion of the beam, such as for example the diaphragm systems.
For example, figure 3a shows the most receded position of the source, corresponding to the narrowest beam, and figure 3b shows the most advanced position of the source, corresponding to the widest beam. For example, with the selected system, even a 20 mm excursion of the actuator lets the opening angle of the light beam vary from about 23° to about 100°. Considering an average distance from the top of the desk of 35 cm, the ensuing diameter of the lighted area changes from about 140 millimetres to about 900 millimetres on the lighted surface. These values were deemed sufficient for the particular application. Advantageously, the light source may be constituted by one or more power LEDs, which have high efficiency and also allow easy electronic control of the luminosity through well known methods. Totally electronic systems for the variation of the width of the beam may however be used. For example, several concentric LED rings may be used and the width of the beam may be varied by changing the number of lit rings. This solution allows providing an extremely thin head exploiting the LED technology, but it reveals some drawbacks. As a matter of fact, it complicates the control, producing an interaction between the width of the beam and the lighting intensity, hence forcing one to modify the luminosity of the LEDs according to the number of lit rings in a coordinated manner.
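The spot diameters quoted above follow from simple cone geometry. The sketch below assumes an ideal point source of light, so the computed wide-beam figure comes out somewhat below the "about 900 millimetres" of the description, which also reflects the real reflector optics:

```python
import math

def spot_diameter_mm(beam_angle_deg, height_cm):
    """Diameter of the lighted circle for an ideal cone of light with
    the given full opening angle, at the given height above the
    surface (point-source approximation): D = 2 * h * tan(angle / 2)."""
    half_angle = math.radians(beam_angle_deg / 2.0)
    return 2.0 * height_cm * 10.0 * math.tan(half_angle)
```

At 35 cm above the desk, a 23° beam gives a spot of roughly 142 mm, and a 100° beam roughly 830 mm, in line with the orders of magnitude cited.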
Alongside the actuators, the lighting apparatus according to the invention also comprises a given number of sensors to allow the control interaction with the user. Such sensors do not require a particular structuring of the environment.
As a matter of fact, the robot lamp may be seen as an interactive agent which acquires information from the normal environment, interprets it (extracting the characteristics of interest) and produces a desired output.
Substantially, the lamp is advantageously provided with three types of sensor systems, each sensor system providing a different and particular sensorial channel and relating to a different type of response in the behaviour of the lamp.
As observable in figure 4, the head 14 first of all contains an image sensor, herein represented by a camera 30, which has a direction of vision substantially coincident with the direction of the light beam emitted by the lamp.
This image sensor, alongside associated means for electronically processing the image, provides the main gesture control system of the lamp.
The vision angle of the camera is such that the taken image contains (with, for example, just a slight margin) the area 31 lighted by the light beam to the maximum width thereof. As observable in detail hereinafter, this allows an innovative interaction with the user based on "a gesture interacting with the light" and which makes the use of the lamp according to the invention surprisingly simple and natural. The fact that the field of vision of the camera is identified by the light cone projected by the lamp, not only allows easy interaction between the user and the lamp but also allows a simpler and more reliable operation of the view system which, as explained hereinafter, must recognize the hands placed by the user within it and on the edges of the light cone in the acquired images .
Vision systems are often used in robotics for the identification of objects and for the calculation of the movement and posture thereof and a detailed description of the operation thereof is not required herein. Regarding the present application, it was found advantageous to use a known so-called "smart camera", i.e. a camera also integrating an artificial vision system, comprising, alongside the image capturing systems, its own specific processing unit (which may extract information from the images without requiring an external processing unit) and interface devices used for sending the results of the processing to other devices (the main processing unit 25 in this case) . Through suitable programming, the smart camera allows easy processing of images and the camera may identify the position and the dimensions of the user's hands and communicate it to the central control unit 25 which, as observable, consequently controls the movement of the lamp, thus providing an interface with the user that is inexpensive and efficient for "modelling" light. Alternatively, the unit for processing the images may obviously be contained in the main unit.
The head 14 also contains some known sensors for detecting the distance of objects around the head, indicated with 32. Advantageously, the distance sensors are of the infrared type. Such sensors allow measuring the distances of objects arranged along their line of vision or detection cone. Usually, their operation is based on the emission of an infrared light pulse (by means of an infrared emitter LED present in the sensor) which, when reflected by an object, is detected by a suitable infrared detector of the sensor. Thus, a triangle is formed between the reflection point, the emitter and the detector. By means of a suitable triangulation technique, the sensor provides in output an electrical quantity (for example, a voltage) which is a function of the distance of the object from the sensor itself.
This technology is well known and particularly suitable for use in household robots, given that it offers good performance in terms of measurement accuracy at a low cost. In the lamp according to the invention, the infrared sensors are arranged in the head 14 to create a detection area 33 around the head 14. In particular, three sensors are used for radially "looking" towards the outside of the lighting head and a fourth sensor is used for looking upwards. The detection angle of the sensors allows detecting a relatively small object approaching the head of the lamp from the various directions. Obviously, any other number of sensors may be provided for, also depending on the type of sensor used in practice, to have a suitable covering of the space around the head of the lamp. For example, should one decide to use an ultrasonic sensor as a distance sensor, this may require different mounting and/or positioning.
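As an illustration of how a distance reading may be derived from such a sensor's output, many triangulation IR rangers have a roughly inverse voltage-to-distance characteristic. The constants in the sketch below are purely illustrative calibration values, not figures from the description or from any specific sensor:

```python
def ir_voltage_to_distance_cm(voltage, k=27.0, offset=0.0):
    """Approximate distance from the output voltage of a triangulation
    IR ranger, assuming the common roughly inverse characteristic
    distance ~ k / voltage.  k and offset are illustrative calibration
    constants only; a real sensor needs its own calibration curve."""
    if voltage <= 0.0:
        return float("inf")  # no reflection detected: object too far
    return k / voltage - offset
```

This converted distance, rather than the raw voltage, is the natural input for the touchless-movement logic described further on.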
As clearly observable hereinafter, the sensors 32 are used both for a "touchless" interaction with the user and for avoiding impacts against obstacles during the robotized movement of the lamp. Basically, the sensors located in the head of the lamp are capable of detecting a hand of the user (or any other object) allowing not only the gesture interaction but also preventing an unwanted contact with objects of the environment .
Still as observable in figure 4, the lamp 10 also comprises acoustic sensors, advantageously provided with several microphones 34, for spatial identification of sounds. Such sensors are advantageously arranged in the base 11. Suitable openings shall thus be provided in the covering of the base (not shown in figure 4). The acoustic sensors are important both for localisation of the user and for acoustic interaction. For example, the lamp may be actuated by clapping the hands. Furthermore, the presence of microphones makes the platform expandable (for example, towards speech recognition, allowing the level of interaction to be further increased). The field of sound source localisation has been studied over the decades and thus further details shall not be provided herein. As a matter of fact, a man skilled in the art may easily imagine how it is possible to identify the spatial position of a sound source starting from the signals captured by some suitably arranged microphones.
The advantage of positioning the microphones at the base 11 lies in the fact that the position of the microphones may then be considered fixed at the origin of the spatial reference system of the lamp, so that the calculations are simplified.
As a matter of fact, the short distance at which the microphones must be positioned to remain in the base may create some problems related to an accurate identification of the sound source. However, for the present application an error of even tens of centimetres in the spatial identification of the sound source may still be acceptable, as observable hereinafter.
The careful choice made herein of the sounds used to retrace the spatial position of the source further simplifies the task of identifying the source and allows keeping the required processing power low.
Actually, only impulsive sounds are intended to be localised in this case, and thus phase difference calculations are not required, but only a localisation of the peak and a given interpolation to estimate a more accurate position of the peak in terms of fractions of a sample of the signal. The difference of the arrival times is simply the difference in the position of the peak in the response of the microphones. As far as the number of microphones to be used is concerned, the use of four microphones equally distributed around the base, but not coplanar, was found advantageous. Actually, with a pair of microphones, the locus of the points whose distances from two foci have a constant difference is a two-sheet hyperboloid; given that the nearest focus is known, the possibilities of positioning the source are limited to the points of one sheet. With three microphones, two points are identified in the worst case, when the source is outside the plane containing the microphones. If a fourth microphone is arranged outside that plane, not only is the point uniquely detected as an intersection of six sheets of hyperboloids (one for each possible pair), but also some additional information is provided for the compensation of the error. As a matter of fact, this generates a system of six equations in the same three unknown quantities.
The difference between the times of arrival of the sound to the four microphones thus provides information on the position of the source and, if the microphones are not coplanar, the point is determined uniquely. The microphone system may also be extended to allow greater robustness with respect to the surrounding sound.
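The peak-based computation of the arrival-time differences described above may be sketched as follows; the sampling rate and signal layout are illustrative assumptions:

```python
def arrival_delays(mic_signals, sample_rate_hz):
    """For impulsive sounds, the difference of the arrival times is
    simply the difference in the position of the peak in each
    microphone's response.  Returns the delay (in seconds) of every
    microphone relative to the first one."""
    peaks = [max(range(len(s)), key=lambda i: abs(s[i])) for s in mic_signals]
    return [(p - peaks[0]) / sample_rate_hz for p in peaks]
```

The resulting delay vector is what feeds the hyperboloid-intersection computation; a finer sub-sample interpolation around each peak can be added on top of this.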
Now considering the desired natural interaction with light, some new innovative methods have been identified which, alone or combined together, make the control of a robotized light source more immediate, simple and intuitive. Thanks to the principles of the invention it is possible to provide a range of simple methods of interaction which may contribute to manipulating and shaping the light to the needs and tastes of the user, in a much more complete manner with respect to what can be obtained solely by moving and directing a lamp.
The first method of interaction is gestural and it allows an easy and complete control of various aspects of the light ray, thus avoiding forcing the user, for example, to regulate a set of knobs and buttons whose meaning and function are often complex and counterintuitive.
The recognition of the gestures is obtained through the camera 30 which is always directed towards the light beam, so that the user always knows how he is seen by the lamp. Furthermore, the objects subject of recognition (the hands) are thus definitely always well lighted and contrasted and, thus, easier to recognise. The artificial vision system (connected to the camera or directly present therein) is herein provided or programmed to substantially distinguish three types of gestures according to the shape of the hand and the movement thereof under the camera (and, thus, in the light cone projected by the lamp) . The control system then consequently reacts to the three types of gestures.
An open hand arranged in the light cone clamps and moves the point or the light spot towards a desired point: this is merely an intuitive "drag and drop" of the lighted area to move it to the required point. This is clearly visible in figure 5.
Advantageously, the head of the lamp may be controlled by the control system so that it just rotates and it shall not move as long as it can still reach the desired point with the relative light. When this is no longer possible, the arms of the lamp may be actuated by the system to displace the head to a more favourable position. Two hands inserted entirely (or even partly, if far enough to be recognised) into the light cone and moving away from each other control the light spot to become larger, while the two hands moving close to each other make it smaller. This is observable in figure 6. Such behaviour could be useful, for example, if the user is reading or working while the roommate is sleeping, or for any other reason.
A hand having the fingers joined together and stretched instead regulates the intensity of the light: the displacement of the hand upwards increases luminosity, while the displacement downwards reduces it, as clearly represented in figure 7.
Obviously, the system is required to interpret the gestures perfectly to give the correct response. It is generally known that a recognition operation by means of a computerised vision system may be a complex operation. In this specific case, it is true that the background conditions (for example, the desk surface) are unknown and the objects to be recognised, the hands, are not always of the same colour and may also vary considerably from one person to another. However, there are given characteristics that make a hand easily recognizable and which may be sufficient in this case: the shape, the dimension, the relative uniformity of colour (the skin of the people in question may be darker or lighter, but the two types are usually not found on the same hand) . Furthermore, the fact that the images are taken under the directed light cone of the lamp makes the shooting conditions much more suitable for quick and reliable recognition of the contour of the hands . Furthermore, according to the principles of the invention, the selected gestures are not only intuitive for the user intending to interact with the lamp but they are also easily distinguishable according to the shape of the hand and the relative movement. For example, an open hand has five fingers which are recognizable by means of a simple shape analysis. When the fingers are clenched, the hand acquires an approximately ellipsoidal shape with the "peaks" of the fingers still easily identifiable in the contour. It is even easy to identify the hand which becomes larger or smaller in the image when it moves up and down (towards and away from the camera) . Lastly, two hands moving away from each other may be easily detected by means of an inter-frame analysis.
All this requires a relatively limited processing power and the corresponding hardware is insertable into a lamp. Still for reasons relating to processing efficiency with relatively few hardware resources, and to meet, however, the desire for a quick response, it was found advantageous that the images be taken at a resolution of about 320x240 pixels. This allows safe detection of the contours of the hands, without having surplus non-required details in the image.
A process for detecting a light spot may be applied to find all the sets of adjacent pixels that meet the colour requirements necessary for a human hand. In addition, the zones that do not present considerable colour differences and which have a minimum dimension compatible with the image of a hand at a predictable maximum shooting distance may be selected. Luminosity normalisation may also be required to compensate for general luminosity variations.
Once the spots in question have been found and other information thereon has been collected (centroid, circularity, area, etc.), their contours may be analysed to see whether they meet the characteristics set up to represent a hand performing one of the gestures defined as significant for the system controlling the lamp. For example, if the spot is an open hand, a low-pass filtered version of the distance between the centroid and the contour points reveals five large peaks, corresponding to the fingers. Other simple similar filtering processes allow distinguishing the other gestures.
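The low-pass filtered centroid-to-contour distance analysis may be sketched as follows; the smoothing window and relative threshold are illustrative tuning values, not parameters from the description:

```python
import math

def count_finger_peaks(contour, centroid, smooth=5, rel_threshold=0.8):
    """Low-pass filter the centroid-to-contour distance profile and
    count its large peaks; an open hand shows five of them.
    `smooth` and `rel_threshold` are illustrative tuning values."""
    dists = [math.hypot(x - centroid[0], y - centroid[1]) for x, y in contour]
    n = len(dists)
    # circular moving-average acting as the low-pass filter
    filt = [sum(dists[(i + j) % n] for j in range(-smooth, smooth + 1))
            / (2 * smooth + 1) for i in range(n)]
    level = rel_threshold * max(filt)
    peaks = 0
    for i in range(n):
        prev, nxt = filt[i - 1], filt[(i + 1) % n]
        # count only large, strict local maxima of the filtered profile
        if filt[i] >= level and filt[i] > prev and filt[i] >= nxt:
            peaks += 1
    return peaks
```

Run on a synthetic five-lobed contour (a stand-in for an open hand), the profile shows exactly five large peaks, matching the behaviour described above.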
After the identification, the movement or "the evolution" of the hand may be followed.
For example, when the hand with the fingers closed is enlarged, the light intensity increases, while it is reduced in the opposite case.
On the other hand, when the open hand is detected, the lamp follows it constantly to keep the centroid at the centre of the image (which more or less coincides with the centre of the light spot projected by the lamp). When the user wants to enlarge or reduce the light spot, the movement of the two hands is the main element: if a difference analysis between the frames reveals a movement mainly directed outwards (or, on the contrary, towards the centre), the distance between the two hands is measured and the lamp behaves accordingly.
With the aim of providing further "robustness" to the recognition system, one may also use a further "key" gesture, which could advantageously be a closed fist, to be performed to start and conclude each process for identifying a command gesture. This allows preventing the user from inadvertently commanding the lamp by performing - under the camera - some simple gesture (for example, extending the hand to pick a pen) that the system could erroneously detect as a command gesture.
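Such a "key" gesture gating scheme may be sketched as a minimal state machine; the gesture labels used here are illustrative, not identifiers from the description:

```python
class GestureGate:
    """Command gestures are only acted upon between two occurrences of
    the 'key' gesture (here a closed fist); everything else is ignored,
    so an inadvertent movement under the camera has no effect."""

    KEY = "fist"  # illustrative label for the key gesture

    def __init__(self):
        self.armed = False

    def feed(self, gesture):
        """Return the gesture to execute, or None if it must be ignored."""
        if gesture == self.KEY:
            self.armed = not self.armed  # key gesture starts/ends a command
            return None
        return gesture if self.armed else None
```

Reaching for a pen under the lamp therefore produces no command, because no key gesture has armed the system.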
A second method of interaction with the lamp, referred to as "touchless movement" , exploits the infrared sensors to detect an object, outside the light cone, which nears the head of the lamp.
Contrary to the gestural method obtained by means of the camera, in which the hands "interact" with the light cone and with the spot thereof projected on a surface, this method of operation is aimed at controlling not the light but the physical object represented by the lamp.
The "touchless movement" allows moving the head of the lamp without touching it and "pushing" or "pulling" it virtually in any direction by simply moving the hand towards or away from the head of the lamp.
Advantageously, the lamp moves and rotates to compensate for the "touchless" movement and continue lighting the same area. This is schematically shown in figure 8 for a repulsion movement of the lamp.
This is useful to give maximum freedom to the user.
For example, after positioning the lighted area by means of the gestural interaction, the user may move the body of the lamp to a more comfortable configuration using the hands to command the "touchless" movement.
The "communication channel" provided by the infrared sensors differs from the previous, visual gesture channel in that, while gestures under the camera allow changing the light conditions, the "touchless" interaction allows modifying the position of the lamp without moving the lighted area, with the control system of the lamp set to compensate for the relocation. The two systems are thus complementary and together allow a complete interaction with the lamp. Normally, when the position of a conventional lamp creates discomfort, due to shadow or because it hinders some movement one would like to perform, one has to grasp it, move it and then rotate the light back to where it is required. The touchless interaction brings this whole tedious process closer to the instinctive gesture of pushing the lamp away, almost as easily done as thought.

Naturally, it is advantageous that the proximity sensors ignore any information when the hands or any other body are too far away. Furthermore, a tolerance time interval should be set, so that sudden movements of the user do not cause sudden changes and the lamp does not try to follow the hand as it moves away once the position has been adjusted satisfactorily.

As already described, the distance between the sensor and an obstacle on its line of sight can be obtained from the electrical quantity at the output of the infrared sensors (for example, the rising and falling output voltage). This distance measurement is the input for the touchless movement. A possible innovative system for providing the touchless interaction is based on the fusion of the information from the IR sensors by means of "fuzzy" logic.
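As a sketch of the voltage-to-distance step, a typical analog IR ranger follows an approximately inverse voltage-distance law; the model and the calibration constants below are assumptions, not values from the patent:

```python
def ir_voltage_to_distance(volts, k=27.0, offset=0.1):
    """Convert the analog output of a Sharp-style IR ranger to an
    approximate distance in centimetres, using the common inverse
    model d = k / (V - offset); k and offset are illustrative
    calibration constants."""
    if volts <= offset:
        return float("inf")  # reading too weak: object out of range
    return k / (volts - offset)
```

Higher voltage means a closer object; readings at or below the offset are treated as "nothing in range", which is the input condition for ignoring far-away hands.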
Fuzzy logic is a per se known many-valued logic, derived from the theory of fuzzy sets, which deals with approximate rather than precise reasoning. The logic associates a value with a variable of interest. This value, often referred to as the degree of truth, may vary between 0 and 1 and indicates the correspondence between a proposition and the observed phenomena. In fuzzy logic, the degree of truth of a proposition may thus take several levels between 0 and 1 and is not restricted to the two values {true, false} of classic binary logic. The reason that led to the advantageous use of fuzzy logic in the lamp according to the invention lies in the particular type of behaviour identified herein for the touchless interaction.
As a first intuitive approach, one may conceptually distinguish an area of interest, in which the user may interact with the lamp and be detected by the sensors, and an external area, where no interaction may occur. This distinction is made to prevent unwanted interaction and to leave the hands of the user free when no interaction is required. Thus, a two-level conventional logic (in and out of the area of interest) would be enough to implement a simple control. However, it was discovered that the lamp has a more satisfactory behaviour if there is a further partitioning into areas defined by progressively increasing distances from the sensor, like onion layers. The actual distances (for example in centimetres from the head) which identify the borders of each layer or area may be defined in practice according to the exact desired behaviour and according to a compromise between the desired maximum control distance and the probability of false detections.
The areas may be defined (from that closest to the head outwards) as:
• "Near" area: the distance measured by the infrared sensors is lower than a minimum threshold value, within which the lamp reacts by moving away from the objects that enter such area. This also inherently guarantees obstacle avoidance. The near area is thus a repulsion area. In a conventional lamp this movement is obtained by physically pushing the head of the lamp with a hand;
• "Target" area: defines an intermediate area within which an object (usually the hand of the user) is at a "correct" distance, so that no movement is required. The target area is thus an indifference area. The corresponding case for a conventional lamp is that of the user's hand resting on the lamp without applying any force;
• "Far" area: when the object is within this area the lamp is commanded to approach it (stopping when the object enters the target area). In this manner, when the user moves the hand away from the lamp, the lamp follows it as long as the hand remains within the "far" area. The far area is thus an attraction area. Again, no actual thrust force has to be applied, given that the lamp reacts spontaneously to the user's input;
• "External" area: this area is not important in terms of user-light interaction. It may reflect the capturing distance limit of the infrared sensor, and its distinction from the area of interest may serve additional functionalities. For example, when the lamp is OFF and a sensor detects an object coming from the external area, the lamp may switch itself ON; thus nearing a hand to the switched-OFF lamp is enough to switch it ON.
Figure 9 shows a possible example of how the areas correlate according to the "fuzzy" logic. As can be seen in the figure, the various areas do not have sharp borders.
It should be observed that this smart approach allows providing two actions using the information coming from the same sensor: the user may push or pull the head of the lamp with only one sensor, by simply moving the hand in one or the other area. Furthermore, the fuzzy logic not only provides the degree of truth of each area, but it also influences the velocity of the reaction.
This function is provided by directly linking the degree of truth to the velocity of reaction. Still referring to figure 9, if the hand is in the far area but near distance "d3", the movements of the lamp will be slow, allowing a progressive approach to the target area. On the other hand, if the distance is very low and the object ends up in the "near" area, the lamp reacts swiftly to avoid impacts.

The touchless interaction is completed by applying the fuzzy logic to the measurement of every sensor. The resulting response of the lamp is the composition of the commands produced by each sensor. Sensors facing opposite directions obviously produce counteracting effects: if one nears both hands, to the right and to the left of the lamp, no movement results, because the two touchless actions produce equal and opposite commands. It is also clear how the function of avoiding unwanted impacts is obtained: if the head of the lamp moves too close to the user or to other elements of the surroundings, it inevitably enters the "near" area, and the control logic prevents accidents by interrupting or inverting the movement.

Figure 10 schematically shows the third method of interaction, which uses the microphone system for spatial identification of sounds. According to this third method, a sound with predetermined characteristics, which thus constitutes a "sound signal" (for example, snapping fingers or clapping hands), is detected by the apparatus, its source point is localised and the lamp directs the light beam towards that point, advantageously without moving the head of the lamp but solely rotating it, unless movement is required.
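Returning to the touchless fusion, a minimal sketch of how per-sensor fuzzy degrees might be combined into a single motion command; the membership ramps, the gain and the sign convention are all illustrative assumptions:

```python
def touchless_velocity(readings, gain=5.0):
    """Fuse IR readings into one signed head velocity along an axis
    (cm/s); positive pushes the head away from an object on the +1
    side, negative pulls it towards that side."""
    def push(d):  # repulsion degree: 1 close to the head, 0 beyond 20 cm
        return max(0.0, min(1.0, (20.0 - d) / 10.0))
    def pull(d):  # attraction degree: rises past 30 cm, cut off at 70 cm
        if d >= 70.0:
            return 0.0  # external area: no interaction
        return max(0.0, min(1.0, (d - 30.0) / 10.0))
    v = 0.0
    for direction, d in readings.items():  # direction is +1 or -1 on the axis
        v += direction * gain * (push(d) - pull(d))
    return v
```

Two hands at equal distance on opposite sides cancel out, a close hand pushes, and a hand in the far band pulls, reproducing the composed behaviour described above.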
This operating method is useful, for example, when the light is required, for whatever reason, on the other side of the desk and dragging it over the entire distance would be uncomfortable or complicated. The sound signal provides a more immediate way of calling the light to the desired point. If required, a fine adjustment of the position may subsequently be performed through the gesture command method.
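For illustration only, a classic two-microphone bearing estimate of the kind such a sound-localisation system might use; the patent does not specify the algorithm, and the microphone spacing and speed of sound are assumptions:

```python
import math

def sound_bearing(tdoa_s, mic_spacing_m=0.1, speed_of_sound=343.0):
    """Far-field bearing estimate (degrees) of a sound source from
    the time difference of arrival (TDOA) between two microphones."""
    # sin(theta) = c * dt / d, clamped to the valid [-1, 1] range
    s = max(-1.0, min(1.0, speed_of_sound * tdoa_s / mic_spacing_m))
    return math.degrees(math.asin(s))
```

A zero time difference means the source is straight ahead; the sign of the difference tells the controller which way to rotate the head.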
At this point, it is clear how the preset objects are attained: a lighting apparatus is provided whose light beam may be commanded in an easy and intuitive manner and which interacts with the user naturally.
Naturally, the embodiment described above, applying the innovative principles of the present invention, is provided by way of example of such innovative principles and thus shall not be deemed to restrict the scope of protection claimed herein. For example, a second camera may be provided to frame other details present in the surrounding area, such as the face of the user. Luminosity sensors may also be provided to adjust the light intensity of the lamp to that of the surroundings; the same camera may serve this purpose. The "key" activation gesture of the gestural recognition may also be a command coming from another sensorial channel, for example the acoustic one, through a voice command or a preset sound. Further distance sensors may also be provided.

Claims

1. Lighting apparatus comprising a head (14) with a light source directed in a light beam, a motorised kinematic structure (25) for spatially orienting the head, sensors for detecting control stimuli and an electronic control system that receives signals from said sensors and controls the movement of the head based upon the stimuli detected, characterised in that it comprises an image sensor (30) arranged in the head and directed in the direction of the light beam, and electronic processing means (25, 30) suitable for processing the images taken by the image sensor (30) to recognise at least one hand of a user inserted in the beam, to distinguish a gesture in it from a predetermined series of gestures preset in the control system and to control a corresponding interactive behaviour of the light source selecting it from a series of behaviours that are stored in the control system associated with the gestures of the predetermined series of gestures.
2. Apparatus according to claim 1, characterised in that the gestures preset in the control system comprise the hand stretched out with the fingers open and the hand stretched out with the fingers closed, and the stored behaviours comprise moving the head to keep the moving hand in the light beam and adjusting the luminosity of the light beam.
3. Apparatus according to claim 1, characterised in that the head comprises an actuator (26) for adjusting the width of the light beam, the actuator being connected to the control system to allow the width of the beam to be adjusted in association with one of said stored behaviours.
4. Apparatus according to claim 3, characterised in that the gestures preset in the control system comprise two hands moving apart or coming together and the associated behaviour is a corresponding increase or decrease in width of the beam.
5. Apparatus according to claim 1, characterised in that amongst the sensors it comprises distance sensors (32) that are arranged in the head to detect the distance of objects around the head and that are connected to the control system (25) to control movements of the head in space according to the detected distance.
6. Apparatus according to claim 5, characterised in that in the control system there are areas defined by gradually increasing distances detected by the distance sensors, a first area being associated with a repulsion movement of the head from the detected object, a second area, farther out, being associated with an indifference condition, a third area, even farther out, being associated with an attraction movement of the head towards the detected object.
7. Apparatus according to claim 5, characterised in that the distances detected by the distance sensors are processed with "fuzzy" logic.
8. Apparatus according to claim 1, characterised in that amongst the sensors it comprises sound sensors (34) connected to electronic processing means (30) that are suitable for processing sounds captured by the sound sensors, to identify the spatial position of the source of the sounds and control movements of the head in space to direct the light beam towards it.
9. Apparatus according to claim 1, characterised in that it is in the form of a desk lamp.
10. Method for controlling a lighting apparatus comprising a head (14) with a light source directed in a light beam and a motorised kinematic structure (25) for spatially orienting the head, in which an image, detected by an image sensor (30) arranged in the head and directed in the direction of the light beam, is electronically processed to: recognise at least one hand of a user inserted in the beam; distinguish a gesture from a predetermined series of hand gestures; control a corresponding interactive behaviour of the light source, selecting it from a series of behaviours associated with the gestures of the predetermined series of gestures.
11. Method according to claim 10, wherein the gestures comprise the hand stretched out with the fingers open and the hand stretched out with the fingers closed, and the interactive behaviours comprise moving the head to keep the moving hand in the light beam and adjusting the luminosity of the light beam according to a movement of the hand.
12. Method according to claim 10, wherein the preset gestures comprise two hands moving apart or coming together and the associated behaviour is a corresponding increase or decrease in width of the beam.
13. Method according to claim 10, wherein the distance of objects around the head is detected by means of distance sensors (32) located in the head, and movements of the head in space are controlled according to the detected distance.
14. Method according to claim 13, wherein there are areas defined by gradually increasing distances from the head, with a first area associated with a repulsion movement of the head from a detected object, a second area, farther out, associated with an indifference condition and a third area, even farther out, associated with an attraction movement of the head towards a detected object.
15. Method according to claim 13, wherein the distances detected by the distance sensors are processed according to "fuzzy" logic.
16. Method according to claim 10, wherein the spatial position of a sound of predetermined characteristics is identified through sound sensors (34) and electronic processing means (30), and movements of the head in space are controlled to direct the light beam towards it.
PCT/IB2010/001454 2009-06-16 2010-06-16 Robotized lighting apparatus and control method WO2010146446A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP10740260A EP2443388A1 (en) 2009-06-16 2010-06-16 Robotized lighting apparatus and control method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ITMI2009A001066A IT1394314B1 (en) 2009-06-16 2009-06-16 ROBOTIC LIGHTING SYSTEM AND CONTROL METHOD
ITMI2009A001066 2009-06-16

Publications (1)

Publication Number Publication Date
WO2010146446A1 true WO2010146446A1 (en) 2010-12-23

Family

ID=41682566

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2010/001454 WO2010146446A1 (en) 2009-06-16 2010-06-16 Robotized lighting apparatus and control method

Country Status (3)

Country Link
EP (1) EP2443388A1 (en)
IT (1) IT1394314B1 (en)
WO (1) WO2010146446A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5023709A (en) * 1989-11-06 1991-06-11 Aoi Studio Kabushiki Kaisha Automatic follow-up lighting system
GB2381979A (en) * 2001-10-18 2003-05-14 Robert William Chandler Intruder tracking and illuminating system
JP2003197006A (en) * 2001-12-28 2003-07-11 Hitachi Koki Co Ltd Projector
GB2423378A (en) * 2005-01-28 2006-08-23 Stephen Terry A lamp
WO2008152382A1 (en) * 2007-06-13 2008-12-18 Royal College Of Art Directable light

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8159156B2 (en) 2009-08-10 2012-04-17 Redwood Systems, Inc. Lighting systems and methods of auto-commissioning
US8710772B2 (en) 2009-08-10 2014-04-29 Redwood Systems, Inc. Orbing and lighting systems
US8729835B2 (en) 2009-08-10 2014-05-20 Redwood Systems, Inc. Group creation in auto-commissioning of lighting systems
EP2434202A1 (en) * 2010-09-28 2012-03-28 TRUMPF Medizin Systeme GmbH + Co. KG Operating light with sterile operating device
CN102537796A (en) * 2010-09-28 2012-07-04 通快医疗系统两合公司 Surgical lamp with sterile operating device
US8833953B2 (en) 2010-09-28 2014-09-16 Trumpf Medizin Systeme Gmbh + Co. Kg Surgical lamps and related systems and methods
GB2489394A (en) * 2011-02-07 2012-10-03 Alistair Allan Macfarlane Smart Lighting
US8759734B2 (en) 2012-02-23 2014-06-24 Redwood Systems, Inc. Directional sensors for auto-commissioning lighting systems
WO2013169635A1 (en) 2012-05-07 2013-11-14 Chia Ming Chen Light control systems and methods
US9587804B2 (en) 2012-05-07 2017-03-07 Chia Ming Chen Light control systems and methods
EP2848094A4 (en) * 2012-05-07 2016-12-21 Chia Ming Chen Light control systems and methods
CN105658171A (en) * 2013-08-05 2016-06-08 特里吕克瑟医疗两合公司 Surgical lamp having control
DE102013215337A1 (en) * 2013-08-05 2015-02-05 Trilux Medical Gmbh & Co. Kg Operating light with control
US10709519B2 (en) 2013-08-05 2020-07-14 Trilux Medical Gmbh & Co. Kg Surgical lamp having control
WO2015018830A3 (en) * 2013-08-05 2015-04-09 Trilux Medical Gmbh & Co. Kg Surgical lamp having control
EP2902698A1 (en) * 2014-02-03 2015-08-05 Regent Beleuchtungskörper AG Luminaire
CH709254A1 (en) * 2014-02-03 2015-08-14 Regent Beleuchtungskörper Ag Lamp.
WO2015144405A1 (en) * 2014-03-26 2015-10-01 Steinel Gmbh Controlled lamp device
CN106465516A (en) * 2014-03-26 2017-02-22 斯坦内尔有限公司 Controlled lamp device
US10129953B2 (en) 2014-03-26 2018-11-13 Steinel Gmbh Controlled lamp device
US10953785B2 (en) 2014-04-29 2021-03-23 Chia Ming Chen Light control systems and methods
US10406967B2 (en) 2014-04-29 2019-09-10 Chia Ming Chen Light control systems and methods
WO2016075023A1 (en) 2014-11-12 2016-05-19 Electrolux Appliances Aktiebolag Kitchen unit provided with a lighting system
EP3021641A1 (en) 2014-11-12 2016-05-18 Electrolux Appliances Aktiebolag Kitchen unit provided with a lighting system
US10539330B2 (en) 2014-11-12 2020-01-21 Electrolux Appliances Aktiebolag Kitchen unit provided with a lighting system
US10561001B2 (en) 2015-01-20 2020-02-11 Balmuda Inc. Illumination device
US10349490B2 (en) 2015-01-20 2019-07-09 Balmuda Inc. Illumination device
WO2016117593A1 (en) * 2015-01-20 2016-07-28 バルミューダ株式会社 Illumination device
WO2017003931A1 (en) * 2015-06-27 2017-01-05 Brown Gregory A M Light fixtures, systems, and methods for operating and/or controlling light fixtures
WO2018095861A1 (en) * 2016-11-28 2018-05-31 Philips Lighting Holding B.V. Method for light guidance using sound monitoring.
FR3100869A1 (en) * 2019-09-16 2021-03-19 Grégoire Popineau Lighting device for bottles, synchronized with surrounding sounds
AT17107U3 (en) * 2020-06-11 2022-04-15 Nanjing Ruixiang Information Tech Co Ltd Intelligent household light and control method thereof

Also Published As

Publication number Publication date
ITMI20091066A1 (en) 2010-12-17
EP2443388A1 (en) 2012-04-25
IT1394314B1 (en) 2012-06-06

Similar Documents

Publication Publication Date Title
EP2443388A1 (en) Robotized lighting apparatus and control method
US8455830B2 (en) Directable light
US10913151B1 (en) Object hand-over between robot and actor
JP5963372B2 (en) How to make a mobile robot follow people
JP6647228B2 (en) Control devices for medical devices
JP2017191792A (en) Light control method and lighting device using the same
EP2939580B1 (en) Cleaner
US20080256494A1 (en) Touchless hand gesture device controller
CN102902271A (en) Binocular vision-based robot target identifying and gripping system and method
CN106958799B (en) Illumination control device, illumination system, and illumination control method
CN111726921B (en) Somatosensory interactive light control system
Nguyen et al. A clickable world: Behavior selection through pointing and context for mobile manipulation
CN105612815B (en) Luminaire system with touch input unit for controlling light output angle
US10375799B2 (en) Lighting commanding method and an assymetrical gesture decoding device to command a lighting apparatus
CN108261787B (en) Control method for platform-expanding robot
CN112287717A (en) Intelligent system, gesture control method, electronic device and storage medium
WO2017127598A1 (en) Gesturing proximity sensor for spa operation
WO2021073733A1 (en) Method for controlling a device by a human
CN114245542A (en) Radar induction lamp and control method thereof
CN110916576A (en) Cleaning method based on voice and image recognition instruction and cleaning robot
CN220964937U (en) Interactive bracket
CN109500815B (en) Robot for front gesture judgment learning
US20240175565A1 (en) Illumination assembly including a microlens array
WO2021256463A1 (en) Imaging system and robot system
Amat et al. Human robot interaction from visual perception

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10740260

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2010740260

Country of ref document: EP