EP3204935B1 - Control device, operation method of such a device and audiovisual system - Google Patents

Control device, operation method of such a device and audiovisual system

Info

Publication number
EP3204935B1
EP3204935B1 (application EP15804896.7A)
Authority
EP
European Patent Office
Prior art keywords
object
light beams
means
optical sensor
function
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP15804896.7A
Other languages
German (de)
French (fr)
Other versions
EP3204935A1 (en)
Inventor
Claude Francis JUHEN
Original Assignee
Juhen, Claude Francis
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to FR1461092A (FR3028655B1)
Application filed by Juhen, Claude Francis
Priority to PCT/FR2015/053108 (WO2016079420A1)
Publication of EP3204935A1
Application granted
Publication of EP3204935B1

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS
    • G10H 1/00 Details of electrophonic musical instruments
    • G10H 1/02 Means for controlling the tone frequencies, e.g. attack, decay; Means for producing special musical effects, e.g. vibrato, glissando
    • G10H 1/04 Means for controlling the tone frequencies, e.g. attack, decay; Means for producing special musical effects, e.g. vibrato, glissando by additional modulation
    • G10H 1/053 Means for controlling the tone frequencies, e.g. attack, decay; Means for producing special musical effects, e.g. vibrato, glissando by additional modulation during execution only
    • G10H 1/055 Means for controlling the tone frequencies, e.g. attack, decay; Means for producing special musical effects, e.g. vibrato, glissando by additional modulation during execution only by switches with variable impedance elements
    • G10H 1/0553 Means for controlling the tone frequencies, e.g. attack, decay; Means for producing special musical effects, e.g. vibrato, glissando by additional modulation during execution only by switches with variable impedance elements using optical or light-responsive means
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS
    • G10H 2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H 2220/155 User input interfaces for electrophonic musical instruments
    • G10H 2220/405 Beam sensing or control, i.e. input interfaces involving substantially immaterial beams, radiation, or fields of any nature, used, e.g. as a switch as in a light barrier, or as a control device, e.g. using the theremin electric field sensing principle
    • G10H 2220/411 Light beams

Description

    Field of the invention
  • The present invention relates to a control device, a method of operating such a device and an audiovisual system. The present invention applies to the field of non-contact control devices.
  • More particularly, the present invention is applicable to electronic musical instruments.
  • State of the art
  • Non-contact control devices are mainly devices triggered by the presentation of an object or by the interruption of a light beam, a laser for example. These control devices make it possible to switch from one state to another, such as opening a door or starting up an appliance.
  • Devices for modulating an intensity, of sound or light for example, use a potentiometer and require contact from the user. Such devices allow an intensity to be modulated, but they do not allow switching from one intensity to another within the available range without passing through all the intermediate intensities.
  • As regards contactless musical instruments, patent US 8,835,739 discloses a device for playing previously recorded sounds by interrupting lasers. The device disclosed in patent FR 2,777,107 produces sounds by interrupting a laser with a baton: the baton interrupts the laser a first time to start the sound and a second time to stop it. Since the velocity of the baton is measured by the device of FR 2,777,107, the sound produced is louder or softer depending on that velocity. The sequence of sounds is prerecorded, so the sound played, a note for example, is independent of the user's intent.
  • In addition, publication US6489550 discloses an apparatus for controlling musical parameters and effects using two light beams reflected by objects (e.g. the user's hand) towards a central sensor. The musical effects are controlled as a function of the intensity and the variation of the light detected by the central sensor.
  • The document US5017770 describes a luminous scene within which the user controls musical effects by interrupting light beams with the hands. Each beam is projected from top to bottom or bottom to top and captured on the other side by one or more sensors. The processor detects the movement speed of the hand, the longitudinal position of the interruption and the direction of movement. At least four sensors are provided to detect the interruption height of the light beams.
  • Object of the invention
  • The present invention aims to remedy all or part of the disadvantages of the prior art.
  • For this purpose, according to a first aspect, the present invention is directed to a device for controlling a parameterizable audiovisual effect, which comprises:
    • means for generating at least two first optical paths traversed by non-parallel light beams, these means comprising at least one optical sensor and at least one emitter of at least one light beam,
    • means for measuring the speed of an object successively crossing at least two rectilinear light beams, as a function of a signal output by at least one optical sensor representing the successive absence of light on the optical paths due to each optical path being cut by the object,
    • means for estimating the longitudinal position of the crossing by the object, as a function of a duration, called the "transit time", between two characteristic instants of a signal output by at least one optical sensor representing the absence of light of at least two light beams due to the successive cutting of the optical paths by the object, and
    • control means, called "position control means", for controlling a parameter value of an audiovisual effect as a function of the estimated longitudinal position; in which:
      • the means for measuring the speed of the object are configured to:
        • determine, for at least one beam, the so-called "cut-off duration" during which the beam is not picked up by the optical sensor, and divide a predetermined dimension d of the object by said cut-off duration, or,
        • the generating means generating, in addition, an additional optical path traversed by a light beam, parallel to one of the first optical paths, between an optical sensor and an emitter of at least one light beam, divide the distance between two parallel light beams by the transit time between these parallel light beams,
      • the means for estimating the longitudinal position of the crossing by the object of at least two non-parallel light beams are configured to multiply the transit time of the object between two non-parallel beams by the measured speed.
  • Thanks to these arrangements, several different commands can be made according to each estimated longitudinal position. The user can, for example, create a melody or control different devices, such as devices producing visual effects, depending on the estimated longitudinal position.
  • In addition, such a device can be used to control a large number of devices. Since such a device is parameterizable and configurable, the device that is the subject of the present invention can have different uses.
  • In embodiments, the device that is the subject of the present invention comprises control means called "speed control means" of a parameter value of an audiovisual effect as a function of the measured speed.
  • These embodiments make it possible, for example, to modify the intensity or the rate of a parameter value of an audiovisual effect controlled by the position control means.
  • In embodiments, the device that is the subject of the present invention comprises:
    • means for detecting the direction of interruption of at least two optical paths by the object, as a function of at least one signal coming from at least one optical sensor representing the cutting, by the object, of the optical paths of at least two light beams, and
    • control means, called "directional control means", for controlling a parameter value of an audiovisual effect as a function of the detected direction.
  • The advantage of these embodiments is that two different parameter values of an audiovisual effect can be controlled for the same position at which the object cuts the optical paths, depending on the direction of the cut.
  • In embodiments, the direction in which the object cuts the optical paths is detected as a function of at least one signal output from at least one optical sensor representing the cutting, by the object, of the optical paths of at least three light beams, and the directional control means control at least one parameter value of at least one audiovisual effect as a function of two components of a vector representative of the detected direction.
  • The use of two components of a vector representative of the direction detected makes it possible to increase the number of attainable parameter values. In addition, each component of the vector representative of the detected direction can control a parameter value of a distinct audiovisual effect.
  • In embodiments, the direction in which the object cuts the optical paths is detected as a function of at least one signal output from at least one optical sensor representing the cutting, by the object, of the optical paths of at least three light beams defining a volume, and the directional control means control at least one parameter value of at least one audiovisual effect as a function of three components of a vector representative of the detected direction.
  • These embodiments have the advantage of having more possibilities of parameter values of a controlled audiovisual effect. In addition, each component of the vector representative of the detected direction can control a parameter value of a distinct audiovisual effect.
  • In embodiments, the speed measuring means are configured to measure the speed of the object as a function of at least one so-called "cut-off time" of a signal output from at least one optical sensor, the cut-off time representing the cutting, by the object, of the optical path of at least one light beam, and of a predetermined dimension of the object.
  • The advantage of these embodiments is to require only two optical paths traversed by light beams. The energy consumption of the device is reduced.
  • In embodiments, the means for measuring the speed are configured to measure the speed of the object as a function of a signal output from at least one optical sensor representing the cutting, by the object, of the optical paths of at least two parallel light beams.
  • The use of two parallel light beams to measure the cutoff speed of the optical paths allows the user to choose any object to use the control device, such as a stick or fingers of the hand for example.
  • In embodiments, the means for estimating the longitudinal position of the crossing and the means for measuring the speed of the object are configured to estimate the longitudinal position and the speed as a function of a signal output from at least one optical sensor representing the cutting, by the object, of the optical paths of at least three light beams defining a volume.
  • These embodiments have the advantage of having a greater precision of the calculation of the longitudinal position of the crossing and the speed of the object.
  • In embodiments, the device that is the subject of the present invention comprises means of converting each parameter value into a value represented according to the MIDI protocol (acronym for "Musical Instrument Digital Interface").
  • The advantage of these embodiments is to be able to use the device object of the present invention as a musical instrument.
  • According to a second aspect, the present invention relates to a method of operating a device that is the subject of the present invention, according to claim 10.
  • Since the advantages, aims and particular characteristics of the method that is the subject of the present invention are similar to those of the device that is the subject of the present invention, they are not repeated here.
  • According to a third aspect, the present invention relates to an audiovisual system which comprises at least one device that is the subject of the present invention, means for transforming each parameter value of an audiovisual effect into a control signal of a sound and/or visual effect, and a transducer converting the control signal into a sound and/or visual effect. The advantage of these embodiments is to provide a system for producing a sound and/or visual effect according to the movements made by the user in front of the light beams.
  • In embodiments, the transducer includes an electroacoustic transducer such that the sound signal emitted by the transducer depends on the movements of a user facing the light beams.
  • The connection of a device object of the present invention with an electroacoustic transducer has the advantage of using the system as a musical instrument.
  • Brief description of the figures
  • Other advantages, aims and particular characteristics of the invention will emerge from the following non-limiting description of at least one particular embodiment of a control device and of a method for operating such a device, with reference to the accompanying drawings, in which:
    • figure 1 shows, schematically, a first particular embodiment of a device that is the subject of the present invention,
    • figure 2 shows, schematically, a second particular embodiment of a device that is the subject of the present invention,
    • figure 3 shows, schematically, a third particular embodiment of a device that is the subject of the present invention,
    • figure 4 shows, in the form of a logic diagram, a particular embodiment of a method that is the subject of the present invention, and
    • figure 5 shows, schematically, a particular embodiment of an audiovisual system that is the subject of the present invention.
    Description of embodiments of the invention
  • It is noted at the outset that the figures are not to scale.
  • This description is given in a nonlimiting manner, each feature of an embodiment being able to be combined with any other feature of any other embodiment in an advantageous manner.
  • It is noted that the term "one" is used in the sense of "at least one".
  • Figure 1 shows a particular embodiment of a device for controlling a parameterizable audiovisual effect that is the subject of the present invention.
  • The device 10 comprises a structural element 100 on which are fixed the means for generating two optical paths, in particular the emitters 105a and 105b of at least one light beam 110a or 110b, and the optical sensor 115. The structural element may comprise a metal structure carrying the emitters 105a and 105b and the optical sensor 115. The structural element may comprise two independent supports fixed by clamping onto a surface, such as a table, for example. One of the supports may comprise at least one emitter, 105a or 105b, the other support comprising at least one optical sensor 115.
  • An emitter 105a or 105b of at least one light beam 110a or 110b may comprise:
    • a laser,
    • a transmitter of at least two laser beams,
    • an emitter of at least two beams of distinct wavelengths,
    • an emitter of at least two beams whose activation is alternated and / or
    • any other focused light transmitter.
  • An emitter 105a or 105b may emit several light beams 110a or 110b. Each emitter, 105a or 105b, may be of a different type. Preferably, the device 10 that is the subject of the present invention comprises two emitters, 105a and 105b: the emitter 105a emits a light beam 110a and the emitter 105b emits a light beam 110b.
  • An optical sensor 115 may comprise:
    • a photoconductive cell,
    • a photodiode,
    • a phototransistor,
    • a CCD ("Charge-Coupled Device") sensor,
    • a CMOS ("Complementary Metal-Oxide-Semiconductor") sensor and/or
    • any other optical sensor.
  • An optical sensor 115 may include a wavelength discrimination filter. An optical sensor 115 may comprise diffraction means of at least one captured light beam. An optical sensor 115 can pick up a plurality of light beams, 110a and 110b. Each optical sensor 115 may be of different type.
  • Preferably, the device 10, object of the present invention, comprises an optical sensor sensing the two light beams 110a and 110b respectively from the two transmitters 105a and 105b. In embodiments, discrimination of the light beams 110a and 110b is effected by diffraction.
  • Preferably, each light beam 110a, 110b is a beam with a single wavelength. The light beams 110a and 110b have the same wavelength and are activated alternately. The discrimination of the light beams 110a and 110b is effected by means of the alternative activation.
  • At least two light beams 110a, 110b are nonparallel.
  • In embodiments, the means for generating two optical paths traversed by non-parallel light beams 110a and 110b comprising at least one optical sensor 115 and at least one transmitter, 105a or 105b, comprise at least one mirror. The mirror can be semi-reflective. For example, a light beam, 110a or 110b, from an emitter, 105a or 105b, is partially diffracted and partially reflected by the mirror. The diffracted portion of the light beam, 110a or 110b, travels through an optical path. The reflected part of the light beam, 110a or 110b, goes through another optical path.
  • The optical sensor 115 generates at least one electrical signal 120 representative of the cut-off of at least one captured light beam, 110a or 110b. Each electrical signal 120 is transmitted to:
    • measuring means 145 for the speed 150 of an object passing through at least two light beams, 110a and 110b, as a function of a signal 120 output by at least one optical sensor 115 representing the cutting of the optical paths by an object,
    • estimation means 125 for the longitudinal position 130 of the crossing of the object, as a function of a signal 120 output by at least one optical sensor 115 representing the cutting, by the object, of the optical paths of at least two non-parallel light beams, 110a and 110b, and
    • detection means 165 for the cut-off direction 170 of the optical paths by the object, as a function of at least one signal 120 output by at least one optical sensor 115 representing the cutting, by the object, of the optical paths of at least two light beams, 110a and 110b.
  • At least one dimension d of the object is predetermined and previously recorded by the measuring means 145 for the speed of the object. Preferably, the object is a truncated cylinder whose circular directing curve lies in a plane perpendicular to the generator, such as a rod for example. The predetermined dimension is the diameter of this truncated cylinder. It is recalled that a cylinder is a surface in space defined by a line, called the generator, passing through a variable point describing a closed plane curve, called the directing curve, while keeping a fixed direction.
  • The means 145 for measuring the speed of the object determine, for at least one beam, 110a or 110b, the so-called "cut-off duration" during which the beam, 110a or 110b, has not been picked up by the optical sensor 115. The cut-off duration is determined from the electrical signal 120 and corresponds to the duration during which the beam is masked from the optical sensor 115 by the object. The cut-off duration may correspond to the duration between two characteristic instants of an electrical signal 120, each characteristic instant being defined with respect to a predetermined limit value of intensity or voltage of the electrical signal 120.
  • The speed of the object is obtained by dividing the predetermined dimension d by the cut-off duration.
  • In embodiments, the cut-off duration is determined for each beam, 110a and 110b, and the speed is determined from the average of the cut-off durations.
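  • By way of illustration, the following minimal Python sketch implements this speed measurement, assuming the sensor signal is available as time-stamped samples; the helper names (`cutoff_duration`, `object_speed`) and the threshold convention are illustrative and not taken from the patent.

```python
def cutoff_duration(times, samples, threshold):
    # Duration during which the signal stays below `threshold`,
    # i.e. during which the beam is not received by the optical sensor 115.
    cut = [t for t, s in zip(times, samples) if s < threshold]
    return (max(cut) - min(cut)) if cut else None

def object_speed(diameter_d, cutoff_durations):
    # Speed 150 = predetermined dimension d divided by the (average) cut-off duration.
    valid = [t for t in cutoff_durations if t]
    return diameter_d / (sum(valid) / len(valid)) if valid else None

# Example: a 10 mm rod masking beam 110a for 5 ms gives a speed of 2 m/s.
speed = object_speed(0.010, [0.005])   # -> 2.0 (m/s)
```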
  • The measured speed 150 is then transmitted:
    • to control means 155 called "speed control means" of a parameter value 160 of an audiovisual effect as a function of the measured speed 150 and
    • to the estimation means 125.
  • The longitudinal position 130 of the crossing, by the object, of at least two light beams, 110a and 110b, is determined by calculating the so-called "transit time". The transit time is the duration between a characteristic instant of the electrical signal 120 when a first beam, 110a or 110b, is not picked up by an optical sensor 115 and the corresponding characteristic instant of the electrical signal 120 when a second beam, 110a or 110b, is not picked up by an optical sensor 115. For example, the characteristic instant may be the start time of the cut-off of the light beam, 110a or 110b, or the end time of the cut-off of the light beam, 110a or 110b. The cut-off start time and the cut-off end time each correspond to a change of state of the electrical signal 120.
  • The longitudinal position 130 is determined by multiplying the transit time by the predetermined dimension d divided by the cut-off duration. Equivalently, the longitudinal position 130 can be determined by multiplying the measured speed 150 by the transit time.
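  • Under the same assumptions, a sketch of this position estimate, taking the cut-off start time of each beam as the characteristic instant (names illustrative):

```python
def transit_time(t_cut_first_beam, t_cut_second_beam):
    # Duration between corresponding characteristic instants (here the
    # cut-off start times) of the two non-parallel beams 110a and 110b.
    return abs(t_cut_second_beam - t_cut_first_beam)

def longitudinal_position(transit, speed):
    # Longitudinal position 130 = transit time multiplied by the measured speed 150.
    return transit * speed

# Example: an object moving at 2 m/s that cuts the second beam 20 ms after
# the first is estimated to cross at a longitudinal position of 0.04 m.
position = longitudinal_position(transit_time(0.000, 0.020), 2.0)   # -> 0.04
```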
  • The longitudinal position 130 is transmitted to control means 135 called "position control means" of a parameter value 140 of an audiovisual effect as a function of the estimated longitudinal position 130.
  • The detection means 165 detect the cut-off direction 170 as a function of a characteristic instant of the electrical signal 120 when a first light beam, 110a or 110b, is not picked up by an optical sensor 115 and of the corresponding characteristic instant of the electrical signal 120 when a second light beam, 110a or 110b, is not picked up by an optical sensor 115. The cut-off direction 170 runs from the first beam, 110a or 110b, that ceases to be captured by an optical sensor 115 towards the second beam, 110a or 110b, that ceases to be captured by an optical sensor 115.
  • The cutoff direction 170 is transmitted to control means 175 called "directional control means" of a parameter value 180 of an audiovisual effect as a function of the direction 170 detected.
  • Conversion means 185 convert each parameter value 140, 160 and 180 into a value 190 represented according to the MIDI protocol (acronym for "Musical Instrument Digital Interface"). In embodiments, the conversion means 185 are optional.
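  • A sketch of this conversion step, assuming each parameter value is linearly scaled into the 0-127 range of a MIDI Control Change message; the controller number and channel used here are illustrative choices:

```python
def to_midi_cc(value, value_min, value_max, controller=1, channel=0):
    # Scale the parameter value into 0..127 and build a 3-byte Control Change
    # message (status byte 0xB0 | channel, controller number, value).
    span = value_max - value_min
    cc = 0 if span == 0 else round(127 * (value - value_min) / span)
    cc = max(0, min(127, cc))
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, cc])

# Example: a longitudinal position of 0.04 m over a 0..0.5 m playing range
# becomes Control Change value 10 on controller 1, channel 0.
message = to_midi_cc(0.04, 0.0, 0.5)
```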
  • The measuring means 145, the estimation means 125, the determination means 165, the position control means 135, the speed control means 155, the directional control means 175 and the conversion means 185 may each be a microprocessor associated with a program memory comprising instructions for carrying out the steps of the method that is the subject of the present invention. Preferably, a microprocessor associated with a program memory comprising instructions for carrying out the steps of the method that is the subject of the present invention performs the functions of the means 125, 135, 145, 155, 165, 175 and 185.
  • Figure 2 shows a particular embodiment of a device that is the subject of the present invention.
  • The device 20 comprises a structural element 200 on which are fixed the means for generating three optical paths and in particular the emitters 205a and 205b, at least one light beam 210a, 210b or 210c, and the optical sensors 215a and 215b. The structural element may be a metallic structure comprising the emitters 205a and 205b and the optical sensors 215a and 215b. The structural element may comprise two independent supports fixed by pinching on a surface, such as a table, for example. One of the supports may comprise at least one transmitter, 205a or 205b, the other support comprising at least one optical sensor, 215a or 215b.
  • An emitter 205a or 205b of at least one light beam 210a, 210b or 210c may comprise:
    • a laser,
    • a transmitter of at least two laser beams,
    • an emitter of at least two beams of distinct wavelengths,
    • an emitter of at least two beams whose activation is alternated and / or
    • any other focused light transmitter.
  • An emitter 205a or 205b may emit several light beams 210a, 210b or 210c. Each transmitter, 205a or 205b, may be of different type. Preferably, the transmitter 205a emits a light beam 210a. The transmitter 205b emits two light beams, 210b and 210c. The light beam 210b is non-parallel to the beam 210a, and the light beam 210c is parallel to the beam 210a.
  • An optical sensor, 215a or 215b, may comprise:
    • a photoconductive cell,
    • a photodiode,
    • a phototransistor,
    • a CCD ("Charge-Coupled Device") sensor,
    • a CMOS ("Complementary Metal-Oxide-Semiconductor") sensor and/or
    • any other optical sensor.
  • In embodiments, an optical sensor, 215a or 215b, may include a wavelength discrimination filter. An optical sensor, 215a or 215b can comprise diffraction means of at least one light beam, 210a, 210b or 210c, captured. A discrimination of at least two light beams, 210a, 210b or 210c, can be effected by diffraction. An optical sensor 215a or 215b can pick up a plurality of light beams 210a, 210b or 210c. Each optical sensor, 215a or 215b, may be of different type.
  • Preferably, the device 20 object of the present invention comprises two optical sensors 215a and 215b. The optical sensor 215a captures the light beams 210a and 210b. The optical sensor 215a has means for discriminating the light beams 210a and 210b. The optical sensor 215b captures the light beam 210c.
  • Preferably, each light beam 210a, 210b, 210c is a single wavelength beam.
  • In embodiments, the means for generating three optical paths comprise at least one mirror. The mirror can be semi-reflective. For example, a light beam, 210a, 210b or 210c, from an emitter 205a or 205b is partially diffracted and partially reflected by the mirror. The diffracted portion of the light beam, 210a, 210b or 210c, travels through an optical path. The reflected part of the light beam, 210a, 210b or 210c, goes through another optical path.
  • The optical sensor 215a generates an electrical signal 220a representative of the cutoff of at least one light beam, 210a or 210b, captured. The electrical signal 220a is transmitted to:
    • means 245 for measuring the speed 250 of an object passing through at least two light beams 210a, 210b or 210c as a function of at least one signal 220a or 220b coming out of at least one optical sensor 215a or 215b, representing the cutting of optical paths by an object,
    • estimation means 225 of the longitudinal position 230 of the crossing of the object as a function of at least one signal, 220a or 220b, coming out of at least one optical sensor, 215a or 215b, representing the cutting of the optical paths by the object of at least two light beams, 210a, 210b or 210c, and
    • detection means 265 of the cut-off direction 270 of the optical paths by the object, as a function of at least one signal 220a, 220b output by at least one optical sensor representing the cutting, by the object, of the optical paths of at least three light beams 210a, 210b and 210c.
  • The optical sensor 215b generates an electrical signal 220b representative of the cutoff of at least one light beam 210c captured. The electrical signal 220b is transmitted to the measuring means 245, the estimation means 225 and the detection means 265.
  • Since the light beams 210a and 210c are parallel, the speed 250 of the object passing through the light beams 210a and 210c is measured by means of the electrical signals 220a and 220b.
  • The speed 250 is measured by calculating the transit time. The transit time is the duration between:
    • a characteristic instant of the electrical signal 220a when the light beam 210a is not picked up by an optical sensor 215a and a corresponding characteristic instant of the electric signal 220b when the second light beam 210c is not picked up by the optical sensor 215b or,
    • a characteristic instant of the electric signal 220b when the light beam 210c is not picked up by an optical sensor 215b and a corresponding characteristic instant of the electric signal 220a when the second light beam 210a is not picked up by the optical sensor 215a.
  • The distance between the beams 210c and 210a is predetermined and previously recorded by the measuring means 245. The speed 250 is measured by dividing the distance between the beams 210c and 210a by the transit time over that distance.
  • In embodiments, the measuring means 245 measure the speed as described with reference to figure 1, from the beams 210a and 210b and/or from the beams 210b and 210c. The measured speed 250 may be an average of the different speeds calculated from several pairs of light beams 210a, 210b, 210c.
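  • A sketch of this parallel-beam variant, assuming the spacing between beams 210a and 210c is recorded at set-up (names illustrative):

```python
def speed_from_parallel_beams(beam_spacing, t_cut_210a, t_cut_210c):
    # Speed 250 = distance between the parallel beams 210a and 210c divided
    # by the time the object takes to travel from one parallel beam to the other.
    dt = abs(t_cut_210c - t_cut_210a)
    return beam_spacing / dt if dt else None

# Example: beams 5 cm apart, cut 25 ms apart -> 2 m/s, independent of the
# object's own dimension (stick, finger, hand, ...).
speed = speed_from_parallel_beams(0.05, 0.100, 0.125)   # -> 2.0 (m/s)
```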
  • The measured speed 250 is transmitted to control means 255 called "speed control means" of a parameter value 260 of an audiovisual effect as a function of the speed 250 measured and to the estimation means 225.
  • The longitudinal position 230 is estimated according to one of the embodiments detailed with reference to figure 1, as a function of the signals 220a and 220b representing the cutting of the light beams 210a, 210b and 210c by an object.
  • The longitudinal position 230 is transmitted to control means 235 called "position control means" of a parameter value 240 of an audiovisual effect as a function of the estimated longitudinal position 230.
  • The detection means 265 detect the cut-off direction 270. Preferably, two components of the cut-off direction 270 are detected: an axial direction component and a longitudinal direction component. The axial direction component is perpendicular to the parallel beams 210a and 210c. The longitudinal direction component is parallel to the parallel beams 210a and 210c.
  • The axial direction component is detected from the signals 220a and 220b. It runs from the first light beam, 210a, 210b or 210c, to be cut towards the second light beam, 210a, 210b or 210c, to be cut; it may therefore run from beam 210a to beam 210c or from beam 210c to beam 210a. Each cut-off of each light beam 210a, 210b or 210c is determined by analysis of the electrical signals 220a and 220b, and the order in which the light beams are cut is determined by a time comparison of the electrical signals 220a and 220b.
  • The longitudinal direction component is detected as a function of the electrical signals 220a and 220b and determined by analysis of these signals. It may point in the direction of the optical path followed by one of the parallel light beams, 210a or 210c, or in the opposite direction. Preferably, the longitudinal direction component is determined as a function of the detected axial direction component and by comparing the longitudinal position estimated between the light beams 210a and 210b with the longitudinal position estimated between the light beams 210b and 210c, both calculated by the estimation means 225.
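  • The two components can be sketched as follows, assuming the cut-off start times of beams 210a and 210c and the two longitudinal position estimates are available; the sign conventions are illustrative:

```python
def axial_component(t_cut_210a, t_cut_210c):
    # +1 when the object moves from beam 210a towards beam 210c, -1 otherwise.
    return 1 if t_cut_210a < t_cut_210c else -1

def longitudinal_component(position_ab, position_bc):
    # Sign of the change of estimated longitudinal position between the
    # beam pairs (210a, 210b) and (210b, 210c).
    delta = position_bc - position_ab
    return 0 if delta == 0 else (1 if delta > 0 else -1)
```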
  • The components of the cutoff direction 270 are transmitted to control means 275 called "directional control means" of a parameter value 280 of an audiovisual effect as a function of the detected direction 270.
  • Preferably, each component of the cutoff direction 270 controls a parameter value of an audiovisual effect.
  • Conversion means 285 converts each parameter value 240, 260 and 280 to a value represented according to the MIDI protocol (acronym for "Musical Instrument Digital Interface") 290. In some embodiments, the conversion means 285 are optional.
  • The measuring means 245, the estimation means 225, the determination means 265, the position control means 235, the speed control means 255, the directional control means 275 and the conversion means 285 may each be a microprocessor associated with a program memory comprising instructions for carrying out the steps of the method that is the subject of the present invention. Preferably, a microprocessor associated with a program memory comprising instructions for carrying out the steps of the method which is the subject of the present invention performs the functions of the means 225, 235, 245, 255, 265, 275 and 285.
  • Figure 3 shows a particular embodiment of a device that is the subject of the present invention.
  • The device 30 comprises a structural element 300 on which are fixed the means for generating three optical paths and in particular the emitters 305a, 305b and 305c, at least one light beam 310a, 310b or 310c, and the optical sensors, 315a, 315b and 315c. The structural element may be a metal structure comprising the emitters 305a, 305b and 305c, and the optical sensors 315a, 315b and 315c. The structural element may comprise two independent supports fixed by pinching on a surface, such as a table, for example. One of the supports may comprise at least one transmitter, 305a, 305b or 305c, the other support comprising at least one optical sensor, 315a, 315b or 315c.
  • An emitter, 305a, 305b or 305c, of at least one light beam, 310a, 310b or 310c may comprise:
    • a laser,
    • a transmitter of at least two laser beams,
    • an emitter of at least two beams of distinct wavelengths,
    • an emitter of at least two beams whose activation is alternated and / or
    • any other focused light transmitter.
  • An emitter 305a, 305b or 305c may emit several light beams 310a, 310b or 310c. Each emitter, 305a, 305b or 305c, may be of a different type. Preferably, the emitter 305a emits a light beam 310a, the emitter 305b emits a light beam 310b and the emitter 305c emits a light beam 310c. The light beams 310a, 310b and 310c form a volume.
  • An optical sensor, 315a, 315b or 315c, may comprise:
    • a photoconductive cell,
    • a photodiode,
    • a phototransistor,
    • a CCD ("Charge-Coupled Device") sensor,
    • a CMOS ("Complementary Metal-Oxide-Semiconductor") sensor and/or
    • any other optical sensor.
  • In embodiments, an optical sensor 315a, 315b or 315c may include a wavelength discrimination filter. An optical sensor, 315a, 315b or 315c may comprise diffraction means of at least one light beam, 310a, 310b or 310c, captured. Discrimination of the light beams, 310a, 310b or 310c, can be effected by diffraction. An optical sensor, 315a, 315b or 315c, can pick up several light beams, 310a, 310b or 310c. Each optical sensor, 315a, 315b or 315c, may be of different type.
  • Preferably, the device 30 which is the subject of the present invention comprises three optical sensors 315a, 315b and 315c. The optical sensor 315a captures the light beam 310a. The optical sensor 315b captures the light beam 310b. The optical sensor 315c captures the light beam 310c.
  • Preferably, each light beam 310a, 310b, 310c is a single wavelength beam.
  • The light beams 310a, 310b and 310c form a volume. At least two light beams 310a, 310b, 310c are nonparallel.
  • In embodiments, the means for generating three optical paths comprise at least one mirror. The mirror can be semi-reflective. For example, a light beam, 310a, 310b or 310c, from an emitter, 305a, 305b or 305c, is partially diffracted and partially reflected by the mirror. The diffracted portion of the light beam, 310a, 310b or 310c, traverses an optical path. The reflected part of the light beam, 310a, 310b or 310c, goes through another optical path.
  • The optical sensor 315a generates an electrical signal 320a representative of the cutoff of at least one light beam 310a picked up. The electrical signal 320a is transmitted to:
    • measuring means 345 of the speed 350 of an object passing through at least two light beams 310a, 310b, 310c as a function of at least one signal 320a, 320b, 320c issuing from at least one optical sensor 315a, 315b, 315c representing the cutting of the optical paths by an object,
    • means 325 for estimating the longitudinal position 330 of the crossing of the object as a function of at least one signal 320a, 320b, 320c coming out of at least one optical sensor 315a, 315b, 315c representing the cutting of the optical paths by the object of at least two light beams 310a, 310b, 310c and
    • detection means 365 of the cut-off direction 370 of the optical paths by the object, as a function of at least one signal 320a, 320b, 320c output by at least one optical sensor representing the cutting, by the object, of the optical paths of at least three light beams 310a, 310b and 310c.
  • The optical sensor 315b generates an electrical signal 320b representative of the cutoff of at least one light beam 310b captured. The electrical signal 320b is transmitted to the measuring means 345, the estimation means 325 and the detection means 365.
  • The optical sensor 315c generates an electrical signal 320c representative of the cutoff of at least one light beam 310c captured. The electrical signal 320c is transmitted to the measuring means 345, the estimation means 325 and the detection means 365.
  • The speed 350 of the object passing through the light beams 310a, 310b and 310c is measured by means of at least two electrical signals, 320a, 320b or 320c. The speed 350 is measured according to one of the embodiments described with reference to figures 1 and 2.
  • The measured speed 350 may be an average of the different speeds calculated from several pairs of light beams, 310a, 310b or 310c.
  • The measured speed 350 is then transmitted to control means 355, called "speed control means", for controlling a parameter value 360 of an audiovisual effect as a function of the measured speed 350, and to the estimation means 325.
  • The longitudinal position 330 is estimated according to one of the embodiments detailed with reference to figure 1. The longitudinal position 330 may be an average of longitudinal positions calculated for at least two light beams, 310a, 310b or 310c.
  • The longitudinal position 330 is transmitted to control means 335 called "position control means" of a parameter value 340 of an audiovisual effect as a function of the estimated longitudinal position 330.
  • The detection means 365 detect the cutoff direction 370. Preferably, three components of the cutoff direction 370 are detected, a direction component ab, a direction component bc and a direction component ac. The direction component ab is a component determined with respect to the plane formed by the light beams 310a and 310b. The direction component bc is a component determined with respect to the plane formed by the light beams 310b and 310c. The direction component ac is a component determined with respect to the plane formed by the light beams 310a and 310c.
  • Preferably, each direction component is determined as a function of a longitudinal position estimated between the light beams 310a and 310b, of a longitudinal position estimated between the light beams 310b and 310c, and of a longitudinal position estimated between the light beams 310a and 310c, respectively.
  • The components of the cutoff direction 370 are transmitted to control means 375 called "directional control means" of a parameter value 380 of an audiovisual effect as a function of the detected direction 370.
  • Preferably, each component of the cutoff direction 370 controls a parameter value of at least one audiovisual effect.
  • Converting means 385 converts each parameter value 340, 360 and 380 to a value represented according to the MIDI protocol (acronym for "Musical Instrument Digital Interface") 390.
  • The measuring means 345, the estimation means 325, the determination means 365, the position control means 335, the speed control means 355, the directional control means 375 and the conversion means 385 may each be a microprocessor associated with a program memory comprising instructions for carrying out the steps of the method that is the subject of the present invention. Preferably, a microprocessor associated with a program memory comprising instructions for carrying out the steps of the method that is the subject of the present invention provides functions of means 325, 335, 345, 355, 365, 375 and 385.
  • Figure 4 shows a particular embodiment 40 of a method that is the subject of the present invention.
  • The method 40 comprises the following steps:
    • generation 41 of at least two optical paths traversed by light beams, 110a, 110b, 210a, 210b, 210c, 310a, 310b or 310c, non-parallel between at least one optical sensor, 115, 215a, 215b, 315a, 315b or 315c, and at least one transmitter 105a, 105b, 205a, 205b, 305a, 305b or 305c, of at least one light beam, 110a, 110b, 210a, 210b, 210c, 310a, 310b or 310c,
    • measuring 42 the speed, 150, 250 or 350, of an object passing through at least two light beams, 110a, 110b, 210a, 210b, 210c, 310a, 310b or 310c, as a function of a signal, 120, 220a, 220b, 320a, 320b or 320c, emerging from at least one optical sensor, 115, 215a, 215b, 315a, 315b or 315c, representing the cutting of the optical paths by the object,
    • estimate 43 of the longitudinal position, 130, 230 or 330, of the crossing of the object as a function of an outgoing signal, 120, 220a, 220b, 320a, 320b or 320c, of at least one optical sensor, 115, 215a, 215b, 315a, 315b or 315c, representing the cutting of optical paths by the object of at least two light beams, 110a, 110b, 210a, 210b, 210c, 310a, 310b or 310c, non-parallel and
    • control 44 of a parameter value, 140, 240 or 340, of an audiovisual effect according to the estimated longitudinal position, 130, 230 or 330.
  • The generation 41, measurement 42, estimation 43 and control 44 steps are preferably performed by means of an embodiment of a device, 10, 20 or 30, or one of the embodiments described above.
  • In embodiments, the method 40 includes at least one of the following steps (a minimal end-to-end sketch combining these steps is given after this list):
    • control of a parameter value, 160, 260 or 360, of an audiovisual effect as a function of the measured speed, 150, 250 or 350,
    • detection of the cut-off direction, 170, 270 or 370, of the optical paths by the object as a function of at least one signal, 120, 220a, 220b, 320a, 320b or 320c, output by at least one optical sensor, 115, 215a, 215b, 315a, 315b or 315c, representing the cutting, by the object, of the optical paths of at least two light beams, 110a, 110b, 210a, 210b, 210c, 310a, 310b or 310c,
    • controlling a parameter value, 180, 280 or 380, of an audiovisual effect as a function of the direction, 170, 270 or 370, detected,
    • converting at least one parameter value, 140, 240, 340, 160, 260, 360, 180, 280 or 380, into a value represented according to the MIDI protocol, 190, 290 or 390.
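  • The sketch below strings the steps of method 40 together for two non-parallel beams and a rod of known diameter d, from three characteristic instants; all names and the 0..0.5 m playing range are illustrative assumptions:

```python
def method_40(d, t_a_start, t_a_end, t_b_start, pos_min=0.0, pos_max=0.5):
    cutoff = t_a_end - t_a_start                      # step 42: cut-off duration
    speed = d / cutoff                                # step 42: speed 150
    transit = abs(t_b_start - t_a_start)              # step 43: transit time
    position = transit * speed                        # step 43: longitudinal position 130
    direction = 1 if t_a_start < t_b_start else -1    # optional: cut-off direction 170
    # step 44 and MIDI conversion: scale the position into a 0..127 value
    cc = max(0, min(127, round(127 * (position - pos_min) / (pos_max - pos_min))))
    return speed, position, direction, cc

# Example: 10 mm rod, first beam cut for 5 ms, second beam cut 20 ms later.
print(method_40(0.010, 0.000, 0.005, 0.020))          # -> (2.0, 0.04, 1, 10)
```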
  • Figure 5 shows a particular embodiment of an audiovisual system 50 that is the subject of the present invention.
  • An embodiment, 10, 20 or 30, of a device that is the subject of the present invention transmits each parameter value, 140, 160 and 180, or 240, 260 and 280, or 340, 360 and 380, as a value represented according to the MIDI protocol, 190, 290 or 390, to transformation means 500 which transform each parameter value of an audiovisual effect, represented according to the MIDI protocol, 190, 290 or 390, into a control signal 505 of a sound and/or visual effect.
  • In embodiments, the parameter values of an audiovisual effect, 140, 160 and 180, or 240, 260 and 280, or 340, 360 and 380, are transmitted directly to the transformation means 500, which transform them into a control signal 505 of a sound and/or visual effect.
  • The control signal 505 is transmitted to the input of a transducer 510 converting the control signal 505 into a sound and/or visual effect. Preferably, the transducer 510 comprises an electroacoustic transducer so that the sound signal emitted by the transducer depends on the movements of a user in front of the light beams of an embodiment of a device that is the subject of the present invention.
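  • As an illustration of the transducer side, the sketch below forwards such a value to a software synthesizer; it assumes the third-party `mido` library (with a backend such as `python-rtmidi`) and a default MIDI output port, none of which are specified by the patent:

```python
import mido  # assumed dependency: pip install mido python-rtmidi

def send_position_cc(port, cc_value, controller=1, channel=0):
    # Forward one parameter value, already scaled to 0..127, as a MIDI Control
    # Change; the receiving synthesizer plays the role of the transducer 510.
    port.send(mido.Message('control_change', channel=channel,
                           control=controller, value=cc_value))

if __name__ == '__main__':
    with mido.open_output() as port:   # default MIDI output port
        send_position_cc(port, 64)
```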
  • In embodiments, the device of the audiovisual system 50 is one of the embodiments described above.

Claims (12)

  1. Control device (10, 20, 30) for controlling a customizable audio-visual effect, which comprises:
    - means for generating at least two first optical paths followed by non-parallel light beams (110a, 110b, 210a, 210b, 210c, 310a, 310b, 310c), which paths comprise at least one optical sensor (115, 215a, 215b, 315a, 315b, 315c) and at least one emitter (105a, 105b, 205a, 205b, 305a, 305b, 305c) of at least one light beam,
    - means (145, 245, 345) for measuring the speed (150, 250, 350) of an object successively crossing at least two rectilinear light beams as a function of a signal (120, 220a, 220b, 320a, 320b, 320c) exiting from at least one optical sensor representing the absence of light successively on the optical paths due to each optical path being crossed by the object,
    - means (125, 225, 325) for estimating the longitudinal position (130, 230, 330) of the crossing of the light beams by the object, as a function of a length of time, called the "transit time", between two instants characteristic of a signal coming from at least one optical sensor representative of the absence of light from at least two light beams due to the optical paths being cut successively by the object, and
    - control means (135, 235, 335), called "positional control means", for controlling a parameter value (140, 240, 340) of an audio-visual effect as a function of the estimated longitudinal position;
    characterized in that:
    - the means for measuring the speed of the object are configured:
    to determine, for at least one beam, the length of time, called the "interruption time", during which the beam was not captured by the optical sensor, and to divide a predefined dimension d of the object by said interruption time,
    or,
    the generating means also generating an additional optical path followed by a light beam, parallel to one of the first optical paths, between an optical sensor (115, 215a, 215b, 315a, 315b, 315c) and an emitter (105a, 105b, 205a, 205b, 305a, 305b, 305c) of at least one light beam, to divide the distance between two parallel light beams (210a, 210c) by the transit time between these two parallel light beams,
    - the means for estimating the longitudinal position of the object's crossing of at least two non-parallel light beams are configured to multiply the object's transit time between two non-parallel beams by the speed measured.
  2. Control device (10, 20, 30) according to claim 1, which comprises control means (155, 255, 355), called "speed control means", for controlling a parameter value (160, 260, 360) of an audio-visual effect as a function of the measured speed (150, 250, 350).
  3. Control device (10, 20, 30) according to one of claims 1 or 2, which comprises:
    - means (165, 265, 365) for detecting the direction of crossing (170, 270, 370) of at least two optical paths by the object as a function of at least one signal (120, 220a, 220b, 320a, 320b, 320c) coming from at least one optical sensor (115, 215a, 215b, 315a, 315b, 315c) representative of the object crossing the optical paths of at least two light beams (110a, 110b, 210a, 210b, 210c, 310a, 310b, 310c), and
    - control means (175, 275, 375), called "directional control means", for controlling a parameter value (180, 280, 380) of an audio-visual effect as a function of the direction detected.
  4. Control device (20, 30) according to claim 3, wherein:
    - the direction of crossing (270, 370) of optical paths by the object is detected as a function of at least one signal (220a, 220b, 320a, 320b, 320c) coming from at least one optical sensor (215a, 215b, 315a, 315b, 315c) representative of the object crossing the optical paths of at least three light beams (210a, 210b, 210c, 310a, 310b, 310c), and
    - the directional control means (275, 375) control at least one parameter value (280, 380) of at least one audio-visual effect as a function of two components of a vector representative of the direction detected.
  5. Control device (30) according to one of claims 3 or 4, wherein:
    - the direction of crossing (370) of optical paths by the object is detected as a function of at least one signal (320a, 320b, 320c) coming from at least one optical sensor (315a, 315b, 315c) representative of the object crossing the optical paths of at least three light beams (310a, 310b, 310c) defining a volume, and
    - the directional control means (375) control at least one parameter value (380) of at least one audio-visual effect as a function of three components of a vector representative of the direction detected.
  6. Control device (10, 20, 30) according to one of claims 1 to 5, wherein the means (145, 245, 345) for measuring the speed (150, 250, 350) are configured to measure the speed of the object as a function of at least one length of time, called the "interruption time", of a signal (120, 220a, 220b, 320a, 320b, 320c) coming from at least one optical sensor (115, 215a, 215b, 315a, 315b, 315c), the interruption time being representative of the object crossing the optical paths of at least one light beam (110a, 110b, 210a, 210b, 210c, 310a, 310b, 310c) and a predefined dimension.
  7. Control device (20) according to one of claims 1 to 6, wherein the means (245) for measuring the speed (250) are configured to measure the speed of the object as a function of a signal (220a, 220b) coming from at least one optical sensor (215a, 215b) representative of the object crossing the optical paths of at least two parallel light beams (210a, 210c).
  8. Control device (30) according to one of claims 1 to 7, wherein the means (325) for estimating the longitudinal position (330) of the crossing and the means for measuring the object speed (350) are configured to estimate the longitudinal position and the speed as a function of a signal (320a, 320b, 320c) coming from at least one optical sensor (315a, 315b, 315c) representative of the object crossing at least three light beams (310a, 310b, 310c) defining a volume.
  9. Control device (10, 20, 30) according to one of claims 1 to 8, which comprises means (185, 285, 385) for converting each parameter value (140, 160, 180, 240, 260, 280, 340, 360, 380) into a value represented in the MIDI (acronym for "Musical Instrument Digital Interface") protocol (190, 290, 390).
  10. Operating method (40) of a device (10, 20, 30) according to one of claims 1 to 9, which comprises the following steps:
    - generating at least two first optical paths followed by non-parallel light beams (110a, 110b, 210a, 210b, 210c, 310a, 310b, 310c) comprising at least one optical sensor (115, 215a, 215b, 315a, 315b, 315c) and at least one emitter (105a, 105b, 205a, 205b, 305a, 305b, 305c) of at least one light beam,
    - measuring (145, 245, 345) the speed (150, 250, 350) of an object successively crossing at least two rectilinear light beams as a function of a signal (120, 220a, 220b, 320a, 320b, 320c) exiting from at least one optical sensor representing the absence of light successively on the optical paths due to each optical path being cut by the object,
    - estimating (125, 225, 325) the longitudinal position (130, 230, 330) of the crossing of the object as a function of a length of time, called the "transit time", between two instants characteristic of a signal coming from at least one optical sensor representative of the absence of light from at least two light beams due to the optical paths being cut successively by the object, and
    - controlling (135, 235, 335) a parameter value (140, 240, 340) of an audio-visual effect as a function of the estimated longitudinal position;
    characterized in that:
    - the speed of the object is measured by:
    the determination, for at least one beam, of the length of time, called the "interruption time", during which the beam was not captured by the optical sensor, and dividing a predefined dimension d of the object by said interruption time,
    or,
    the additional generation of an additional optical path followed by a light beam, parallel to one of the first optical paths, between an optical sensor (115, 215a, 215b, 315a, 315b, 315c) and an emitter (105a, 105b, 205a, 205b, 305a, 305b, 305c) of at least one light beam, the division of the distance between two parallel light beams (210a, 210c) by the transit time between these two parallel light beams,
    - the estimation of the longitudinal position of the object's crossing of at least two non-parallel light beams is performed by multiplying the object's transit time between two non-parallel beams by the speed measured.
  11. Audio-visual system (50), which comprises:
    - at least one device (10, 20, 30) according to one of claims 1 to 9,
    - means (500) for transforming each parameter value (140, 160, 180, 190, 240, 260, 280, 290, 340, 360, 380, 390) of an audio-visual effect into a signal (505) controlling a sound and/or visual effect and
    - a transducer (510) converting the control signal into a sound and/or visual effect.
  12. Audio-visual system (50) according to claim 11, wherein the transducer (510) comprises an electroacoustic transducer such that the sound signal emitted by the transducer is dependent on the movements of a user in front of the light beams.
EP15804896.7A 2014-11-17 2015-11-17 Control device, operation method of such a device and audiovisual system Active EP3204935B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
FR1461092A FR3028655B1 (en) 2014-11-17 2014-11-17 Control device, method for operating such a device and audiovisual system
PCT/FR2015/053108 WO2016079420A1 (en) 2014-11-17 2015-11-17 Control device, operation method of such a device and audiovisual system

Publications (2)

Publication Number Publication Date
EP3204935A1 EP3204935A1 (en) 2017-08-16
EP3204935B1 true EP3204935B1 (en) 2019-10-23

Family

ID=52824326

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15804896.7A Active EP3204935B1 (en) 2014-11-17 2015-11-17 Control device, operation method of such a device and audiovisual system

Country Status (3)

Country Link
EP (1) EP3204935B1 (en)
FR (1) FR3028655B1 (en)
WO (1) WO2016079420A1 (en)

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5017770A (en) * 1985-10-07 1991-05-21 Hagai Sigalov Transmissive and reflective optical control of sound, light and motion
FR2590033A1 (en) * 1985-11-13 1987-05-15 Guerre Philippe Device for the three-dimensional detection of an object by laser light, particularly for a show
DE4226661A1 (en) * 1992-06-23 1994-01-05 Friedrich Foerster Optical information input esp. for electronic music prodn. - involves control of instrument by photoreceiver of signal from reflector movable within sweep of deflected laser beam
US6489550B1 (en) * 1997-12-11 2002-12-03 Roland Corporation Musical apparatus detecting maximum values and/or peak values of reflected light beams to control musical functions
FR2777107B1 (en) 1998-04-02 2001-03-09 Jean Joseph Paul Schmutz Optoelectronic instrument for music interpretation
US8431811B2 (en) * 2001-08-16 2013-04-30 Beamz Interactive, Inc. Multi-media device enabling a user to play audio content in association with displayed video
US8339379B2 (en) * 2004-04-29 2012-12-25 Neonode Inc. Light-based touch screen
JP4822782B2 (en) * 2005-09-15 2011-11-24 株式会社河合楽器製作所 Keyboard instrument touch detection device
CA2778774A1 (en) * 2009-10-16 2011-04-21 Rpo Pty Limited Methods for detecting and tracking touch objects
US8835739B2 (en) 2012-02-01 2014-09-16 Beamz Interactive, Inc. Keystroke and MIDI command system for DJ player and video game systems

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
FR3028655B1 (en) 2019-10-18
WO2016079420A1 (en) 2016-05-26
FR3028655A1 (en) 2016-05-20
EP3204935A1 (en) 2017-08-16

Similar Documents

Publication Publication Date Title
TWI313835B (en) Method of measuring the movement of an object relative to a user's input device and related input device,mobile phone apparatus, cordless phone apparatus, laptor computer, mouse and remote control
EP2399182B1 (en) System, method and apparatus for causing a device to enter an active mode
US6969795B2 (en) Electronic tone generation system and batons therefor
US8913039B2 (en) Method and device for locating at least one touch on a touch-sensitive surface of an object
US20160364003A1 (en) Holographic interface for manipulation
CA2544600C (en) Proximity detector
EP1435509B1 (en) Optoelectronic measuring method and device
JP3355743B2 (en) Electronic keyboard instrument
US7977566B2 (en) Optical instrument pickup
US8969699B2 (en) Musical instrument, method of controlling musical instrument, and program recording medium
JP4760272B2 (en) Vehicle periphery monitoring device and sensor unit
EP1884869A2 (en) Information processing device and storage medium storing information processing program
US8411289B2 (en) Optical position detection device
US20070033012A1 (en) Method and apparatus for a verbo-manual gesture interface
JP2009516213A (en) Method and system for reproducing sound and generating synthesizer control data from data collected by a sensor coupled to a stringed instrument
US8860547B2 (en) Control terminal
CN105572681A (en) Absolute distance measurement for time-of-flight sensors
KR101224358B1 (en) Relative movement sensor and method for measuring movement of an object and said sensor relative to each other, sheet sensor, apparatus for processing sheet material and input device
CN103384837B (en) Hand-held laser rangefinder
US8664508B2 (en) Musical performance device, method for controlling musical performance device and program storage medium
ES2527073T3 (en) System and procedure for the evaluation of multidimensional gestures
US20050134556A1 (en) Optical navigation based on laser feedback or laser interferometry
CN105437267A (en) Adjustable spacing comb, adjustment drive and hair cutting appliance
US9761210B2 (en) Control methods for musical performance
US20020015521A1 (en) Apparatus and method for recognizing self-position in robot system

Legal Events

Date Code Title Description
- AX Request for extension of the european patent to: Extension state: BA ME
- AV Request for validation of the european patent: Extension state: MA MD
- AK Designated contracting states: Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
- 17P Request for examination filed: Effective date: 20170509
- INTG Intention to grant announced: Effective date: 20180608
- INTC Intention to grant announced (deleted)
- DAX Request for extension of the european patent (to any country) (deleted)
- DAV Request for validation of the european patent (in any country) (deleted)
- 17Q First examination report despatched: Effective date: 20190201
- INTG Intention to grant announced: Effective date: 20190508
- REG Reference to a national code: Ref country code: GB; Ref legal event code: FG4D; Free format text: NOT ENGLISH
- AK Designated contracting states: Kind code of ref document: B1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
- REG Reference to a national code: Ref country code: CH; Ref legal event code: EP
- REG Reference to a national code: Ref country code: IE; Ref legal event code: FG4D; Free format text: LANGUAGE OF EP DOCUMENT: FRENCH
- REG Reference to a national code: Ref country code: DE; Ref legal event code: R096; Ref document number: 602015040412; Country of ref document: DE
- REG Reference to a national code: Ref country code: AT; Ref legal event code: REF; Ref document number: 1194553; Country of ref document: AT; Kind code of ref document: T; Effective date: 20191115
- REG Reference to a national code: Ref country code: NL; Ref legal event code: MP; Effective date: 20191023
- PGFP Annual fee paid to national office [announced from national office to epo]: Ref country code: FR; Payment date: 20191223; Year of fee payment: 5
- REG Reference to a national code: Ref country code: LT; Ref legal event code: MG4D
- PGFP Annual fee paid to national office [announced from national office to epo]: Ref country code: DE; Payment date: 20200130; Year of fee payment: 5. Ref country code: GB; Payment date: 20191223; Year of fee payment: 5
- PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]: Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT. Ref country codes and effective dates: PT (20200224), FI (20191023), BG (20200123), GR (20200124), LV (20191023), NO (20200123), NL (20191023), SE (20191023), PL (20191023), LT (20191023)