EP1969587A2 - Method for composing a piece of music by a non-musician - Google Patents

Method for composing a piece of music by a non-musician

Info

Publication number
EP1969587A2
EP1969587A2 (application EP06817623A)
Authority
EP
European Patent Office
Prior art keywords
collection
accompaniment
melody
musical
sound
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP06817623A
Other languages
English (en)
French (fr)
Inventor
Jacques Ladyjensky
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Structures SPRL
Original Assignee
Continental Structures SPRL
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Structures SPRL filed Critical Continental Structures SPRL
Publication of EP1969587A2 publication Critical patent/EP1969587A2/de
Withdrawn legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00Details of electrophonic musical instruments
    • G10H1/0008Associated control or indicating means
    • G10H1/0025Automatic or semi-automatic music composition, e.g. producing random music, applying rules from music theory or modifying a musical piece
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00Details of electrophonic musical instruments
    • G10H1/36Accompaniment arrangements
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2210/00Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
    • G10H2210/101Music Composition or musical creation; Tools or processes therefor
    • G10H2210/105Composing aid, e.g. for supporting creation, edition or modification of a piece of music
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2210/00Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
    • G10H2210/101Music Composition or musical creation; Tools or processes therefor
    • G10H2210/151Music Composition or musical creation; Tools or processes therefor using templates, i.e. incomplete musical sections, as a basis for composing
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/091Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
    • G10H2220/101Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters
    • G10H2220/106Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters using icons, e.g. selecting, moving or linking icons, on-screen symbols, screen regions or segments representing musical elements or parameters
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2240/00Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
    • G10H2240/121Musical libraries, i.e. musical databases indexed by musical parameters, wavetables, indexing schemes using musical parameters, musical rule bases or knowledge bases, e.g. for automatic composing methods
    • G10H2240/131Library retrieval, i.e. searching a database or selecting a specific musical piece, segment, pattern, rule or parameter set
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2240/00Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
    • G10H2240/325Synchronizing two or more audio tracks or files according to musical features or musical timings

Definitions

  • The present method is primarily intended for persons who, while strongly attracted to music and musical composition, are not skilled at playing a musical keyboard and have received no instruction in disciplines such as solfège or harmony.
  • The musical works to which the method according to the invention applies consist, as is frequently the case, of a melody and an accompaniment, the latter often being called an orchestration or orchestral accompaniment when several instruments are involved.
  • Lacking the ability to compose a melody himself, the simplest course for the interested person is to take one from the repertoire of the musical heritage of the past (a lawful thing if its author has been dead for more than 70 years), copy it, and perhaps adapt it somewhat, for example by removing one or another outmoded ornament or reworking a passage. These things done, he could sing it mentally, though without being able to write it in musical notation.
  • The method according to the present invention makes it possible to obtain, in a short time, an original musical work the length of an entire piece, for example a piece of several minutes obtained in the course of a single day. It consists of using banks of melodies and banks of accompaniments, the latter being collections of orchestral accompaniments structured as canvases the length of an entire piece.
  • This canvas structure of orchestral accompaniments involves several accompanying instruments, a rhythm preferably akin to those of our time, an introduction, and a finale.
  • These collections of orchestral accompaniment canvases are presented in several groups, for example group a, group b, group c, and so on, each of them sounding in a distinct key. They are recorded in advance in the software made available to the user, in a form he can easily audition.
  • The user is invited to listen to these melodies and orchestral accompaniments, to examine in depth those he prefers, and to make a choice.
  • The next step is to record the sound of the melody and the orchestral accompaniment simultaneously, but not in just any way.
  • The system allows the user to truly "modulate" his melody according to the accompaniment he hears.
  • The software makes the following maneuver available to him. He starts playback of the orchestral accompaniment and acts on the mouse so that each click records, audibly, one note of the melody, in sequence starting with the first, with the characteristic that each of these successive melody notes is registered by the system with a duration equal to the time the user holds the click down.
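The click-to-record maneuver described above can be sketched as follows. This is a minimal illustration of the idea only; the data layout and event representation are assumptions, not the patent's actual implementation.

```python
# Sketch of click-to-record: each mouse press sounds the next melody note,
# and the note's recorded duration is how long the click was held.
# The (press_time, release_time) pairs stand in for real mouse events.

def record_melody(melody_pitches, click_events):
    """Pair each pitch, in sequence, with a click; duration = hold time."""
    recorded = []
    for pitch, (press_t, release_t) in zip(melody_pitches, click_events):
        recorded.append({
            "pitch": pitch,
            "onset": press_t,                  # moment the user pressed
            "duration": release_t - press_t,   # held time becomes note length
        })
    return recorded

# Three clicks of different lengths yield three notes of different durations.
notes = record_melody(["C4", "D4", "E4"], [(0.0, 0.5), (0.6, 0.8), (1.0, 2.0)])
```

Because the accompaniment plays while the user clicks, the onsets and hold times together capture the rhythm he intends, without any notation.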
  • Once recorded, the piece can be displayed on the screen as a linear diagram in the shape of a ribbon divided into equal, sequentially numbered time units (the "measures"), with a cursor that travels along it as the piece plays.
  • This allows the user to improve his piece in the following way. While listening, he notes the number of the measure in which he heard two notes whose compatibility he finds unsatisfactory. He will then wish to suppress or attenuate one of the two, usually the one belonging to the orchestral accompaniment.
  • The software enables him to display on the screen the structure of the orchestral accompaniment at the measure bearing the number concerned. He sees the instruments playing there, each instrument appearing with a virtual potentiometer that governs its volume. It is then enough for him to lower the relevant one, and this for the duration of the measure in question to the exclusion of the others.
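The per-measure potentiometer can be modeled as a per-instrument volume stored for each measure, so that lowering one instrument in one measure leaves every other measure untouched. The dictionary layout and instrument names below are invented for illustration.

```python
# Sketch: each measure of the accompaniment holds its own instrument
# volumes (the on-screen "potentiometers"). Adjusting measure 7's piano
# does not affect measure 8, mirroring the per-measure correction above.

def set_measure_volume(accompaniment, measure_no, instrument, volume):
    accompaniment[measure_no][instrument] = volume

# A 32-measure accompaniment, all instruments initially at full volume.
accomp = {m: {"piano": 1.0, "bass": 1.0, "drums": 1.0} for m in range(1, 33)}

# Attenuate the piano in measure 7 only, to soften a perceived dissonance.
set_measure_volume(accomp, 7, "piano", 0.2)
```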
  • Fig. 1 represents an example of what appears on the computer screen when the user calls up a melodic theme from one of the banks; the melodic theme here consists of 2 phrases of 7 notes each, represented on the screen by 14 marks 2 placed in sequence on a staff 1.
  • Fig. 2 shows the same sequence modified by the user to better adapt the notes of the called-up melody to his taste.
  • Fig. 3 shows an example of the on-screen representation of the orchestral accompaniment.
  • The mobile cursor 4, a vertical line, is also shown: it is set in motion at the start and moves at uniform speed along the audible course of the accompaniment, across the 32 measures.
  • Fig. 4 shows the same with the cursor in motion.
  • Fig. 5 again represents the cursor in the course of scrolling, here having reached measure 6.
  • The preceding measures have been successively filled with markers showing that the notes of the melody were, one after the other, entered and recorded over those of the orchestral accompaniment.
  • Fig. 6 shows an example of an on-screen representation of the structure of an orchestral accompaniment for a given measure.
  • Fig. 7 schematically represents a screenshot intended to show how the user maneuvers with his mouse during the main operations that help him compose.
  • The example described below illustrates the process according to the invention in a concrete case.
  • The user begins by exploring, by listening, the banks of melodies supplied with the software available to him. They are classified in categories such as gay, serious, nostalgic, sad, or other. At this point the tempo, that is, the playing speed of the melody, is not yet decided by the user; he will set it later.
  • On choosing a melody he notes the group to which it belongs, for example group a. At the desired moment, he will have to choose an accompaniment from the homonymous group.
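The group constraint above (melody and accompaniment must come from the same, "homonymous" group, since each group sounds in one key) can be sketched as a simple lookup. The bank contents below are invented placeholders.

```python
# Sketch: melodies and accompaniments are filed under groups a, b, c, ...
# Having chosen a melody from group "a", the user is offered only the
# accompaniments filed under the same group, guaranteeing a shared key.

melody_bank = {"a": ["melody_a1", "melody_a2"], "b": ["melody_b1"]}
accompaniment_bank = {"a": ["rock_a", "waltz_a"], "b": ["rock_b"]}

def accompaniments_for(melody_group):
    """Return the homonymous accompaniment collection for a melody's group."""
    return accompaniment_bank[melody_group]

choices = accompaniments_for("a")  # only group-a accompaniments are offered
```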
  • The user can, and must, display on the screen a diagram representing the sequence of sounds of the selected melodic theme.
  • A button is available to erase any unwanted "note"; to add notes, he makes use of a small virtual keyboard located on the screen.
  • His melody being considered adopted as regards the choice of sounds (but not yet their durations, nor the rhythm of play), he leaves it on the screen and goes on to explore by ear the orchestral accompaniments in the banks that contain them, here, more precisely, in the homonymous bank, "group a". Given the melody he has in mind, he chooses one according to his taste.
  • The bank gives him the choice among various styles; suppose for example that he chooses an accompaniment in the "song-rock" style.
  • By a simple maneuver he loads it, with its scrolling visual diagram materialized on the screen, having the appearance of fig. 3.
  • When the cursor is launched, it traverses the entire ribbon up to the last measure, the one bearing No. 32, and the sound accompanies it audibly, so that the user can locate himself in the piece.
  • The orchestral accompaniments in the chosen "song-rock" case are built with a length of 25 to 35 measures.
  • The software provides simple tools to delete certain measures, to duplicate them, to change the places of duplicated or non-duplicated measures, and other such maneuvers.
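The measure-editing tools can be sketched as operations on the accompaniment viewed as an ordered list of measures. This is purely illustrative; the patent does not specify the underlying representation.

```python
# Sketch of the simple measure-editing tools: deleting a measure and
# duplicating one, with the accompaniment as an ordered list of measures.

def delete_measure(measures, index):
    """Return a copy of the measure list with the measure at index removed."""
    return measures[:index] + measures[index + 1:]

def duplicate_measure(measures, index):
    """Return a copy with the measure at index repeated immediately after."""
    return measures[:index + 1] + [measures[index]] + measures[index + 1:]

bars = ["m1", "m2", "m3", "m4"]
bars = duplicate_measure(bars, 1)   # m2 now appears twice in a row
bars = delete_measure(bars, 0)      # m1 removed
```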
  • The user must also assign a tempo, a playing speed, to the scrolling of his orchestral accompaniment. He will do this taking into account the character he wants to give his piece: for a nostalgic piece, for example, he will take a slow tempo. He must also take into account the duration of his melody, so that it adapts to that of the accompaniment. It is common for him to decide to have the melody enter (this is the object of the maneuver described below) only after one or more measures of accompaniment alone, and likewise at the end of the piece. It is among other things in anticipation of this that the accompaniments were elaborated with an introduction and a finale. In our example, as it is a song, it has a duration of about two minutes.
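The relation between the chosen tempo, the number of measures, and the resulting duration of the piece is simple arithmetic, which the user implicitly relies on when matching melody and accompaniment lengths. The 4-beats-per-measure time signature and the 64 bpm figure below are assumptions for illustration.

```python
# Sketch: total duration follows from tempo and measure count.
# duration (s) = measures * beats_per_measure * 60 / tempo (bpm)

def piece_duration_seconds(n_measures, tempo_bpm, beats_per_measure=4):
    return n_measures * beats_per_measure * 60.0 / tempo_bpm

# A 32-measure canvas in 4/4 at 64 bpm lasts exactly two minutes,
# consistent with the roughly two-minute song of the example.
dur = piece_duration_seconds(32, 64)
```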
  • The longer the click is held, the longer the note. Aided by the fact that the accompaniment plays at the same time as his maneuvering of the melody, it is the user, and he alone, who gives his piece the rhythm he deems appropriate, since he is master of both the moment he enters a note and the duration he gives it.
  • Fig. 5 shows the cursor en route, having arrived at the sixth measure. Looking closely at this figure, one can see that the user has slightly modified the rhythm suggested by the previous sequences (figures 1 and 2). It has, if you will, more "swing" compared with the regular rhythm with which folk melodies are generally provided. Measure 5 has only one note, because he held the click at length. And in measure 6, he clicked four times, and quickly.
  • The correction maneuver that the present method allows consists in noting the number of the measure concerned and, by a simple maneuver, displaying on the screen the structure of the orchestral accompaniment at that measure.
  • Figure 6 shows schematically how such a table can be presented for the measure in question, here the one bearing number 7, for example.
  • Each instrument of the orchestral accompaniment appears on the screen with a small potentiometer whose activation attenuates or suppresses its sound. It is enough for the user, by attentive listening, to identify the instrument responsible for the dissonance. In general it will be one of the solo instruments of the accompaniment and in no case, for obvious reasons, the percussion, which he will leave as it is.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Electrophonic Musical Instruments (AREA)
  • Auxiliary Devices For Music (AREA)
EP06817623A 2005-11-14 2006-11-14 Verfahren zum komponieren eines musikstücks durch einen nichtmusiker Withdrawn EP1969587A2 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
BE200500550 2005-11-14
PCT/BE2006/000123 WO2007053917A2 (fr) 2005-11-14 2006-11-14 Procede de composition d’une œuvre musicale par un non-musicien

Publications (1)

Publication Number Publication Date
EP1969587A2 true EP1969587A2 (de) 2008-09-17

Family

ID=37882545

Family Applications (1)

Application Number Title Priority Date Filing Date
EP06817623A Withdrawn EP1969587A2 (de) 2005-11-14 2006-11-14 Verfahren zum komponieren eines musikstücks durch einen nichtmusiker

Country Status (3)

Country Link
US (1) US20090272252A1 (de)
EP (1) EP1969587A2 (de)
WO (1) WO2007053917A2 (de)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101611511B1 (ko) 2009-05-12 2016-04-12 삼성전자주식회사 터치스크린을 구비한 휴대 단말기를 이용한 음악 생성 방법
WO2014008209A1 (en) * 2012-07-02 2014-01-09 eScoreMusic, Inc. Systems and methods for music display, collaboration and annotation
US8927846B2 (en) * 2013-03-15 2015-01-06 Exomens System and method for analysis and creation of music
US9715870B2 (en) 2015-10-12 2017-07-25 International Business Machines Corporation Cognitive music engine using unsupervised learning
CN106898341B (zh) * 2017-01-04 2021-03-09 清华大学 一种基于共同语义空间的个性化音乐生成方法及装置
CN108806655B (zh) * 2017-04-26 2022-01-07 微软技术许可有限责任公司 歌曲的自动生成

Family Cites Families (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4958551A (en) * 1987-04-30 1990-09-25 Lui Philip Y F Computerized music notation system
US4930390A (en) * 1989-01-19 1990-06-05 Yamaha Corporation Automatic musical performance apparatus having separate level data storage
JP2923356B2 (ja) * 1990-01-18 1999-07-26 イーミュー システムズ インコーポレーテッド 音響データのデータ圧縮
JP2541074B2 (ja) * 1992-04-20 1996-10-09 ヤマハ株式会社 電子楽器
US5495073A (en) * 1992-05-18 1996-02-27 Yamaha Corporation Automatic performance device having a function of changing performance data during performance
US5478967A (en) * 1993-03-30 1995-12-26 Kabushiki Kaisha Kawai Gakki Seisakusho Automatic performing system for repeating and performing an accompaniment pattern
US5665927A (en) * 1993-06-30 1997-09-09 Casio Computer Co., Ltd. Method and apparatus for inputting musical data without requiring selection of a displayed icon
US5602357A (en) * 1994-12-02 1997-02-11 Yamaha Corporation Arrangement support apparatus for production of performance data based on applied arrangement condition
US5663517A (en) * 1995-09-01 1997-09-02 International Business Machines Corporation Interactive system for compositional morphing of music in real-time
US5801694A (en) * 1995-12-04 1998-09-01 Gershen; Joseph S. Method and apparatus for interactively creating new arrangements for musical compositions
IT1282613B1 (it) * 1996-02-13 1998-03-31 Roland Europ Spa Apparecchiatura elettronica per la composizione e riproduzione automatica di dati musicali
US7297856B2 (en) * 1996-07-10 2007-11-20 Sitrick David H System and methodology for coordinating musical communication and display
US6121532A (en) * 1998-01-28 2000-09-19 Kay; Stephen R. Method and apparatus for creating a melodic repeated effect
US6121533A (en) * 1998-01-28 2000-09-19 Kay; Stephen Method and apparatus for generating random weighted musical choices
JP3541706B2 (ja) * 1998-09-09 2004-07-14 ヤマハ株式会社 自動作曲装置および記憶媒体
FR2785438A1 (fr) * 1998-09-24 2000-05-05 Baron Rene Louis Procede et dispositif de generation musicale
JP3533974B2 (ja) * 1998-11-25 2004-06-07 ヤマハ株式会社 曲データ作成装置および曲データ作成プログラムを記録したコンピュータで読み取り可能な記録媒体
US6169242B1 (en) * 1999-02-02 2001-01-02 Microsoft Corporation Track-based music performance architecture
HU225078B1 (en) * 1999-07-30 2006-06-28 Sandor Ifj Mester Method and apparatus for improvisative performance of range of tones as a piece of music being composed of sections
US6740802B1 (en) * 2000-09-06 2004-05-25 Bernard H. Browne, Jr. Instant musician, recording artist and composer
US6888999B2 (en) * 2001-03-16 2005-05-03 Magix Ag Method of remixing digital information
US7232949B2 (en) * 2001-03-26 2007-06-19 Sonic Network, Inc. System and method for music creation and rearrangement
JP4267925B2 (ja) * 2001-04-09 2009-05-27 ミュージックプレイグラウンド・インコーポレーテッド 対話型再生によるマルチパートオーディオ演奏を記憶する媒体
US6822153B2 (en) * 2001-05-15 2004-11-23 Nintendo Co., Ltd. Method and apparatus for interactive real time music composition
US6483019B1 (en) * 2001-07-30 2002-11-19 Freehand Systems, Inc. Music annotation system for performance and composition of musical scores
US7735011B2 (en) * 2001-10-19 2010-06-08 Sony Ericsson Mobile Communications Ab Midi composer
WO2003036613A1 (en) * 2001-10-19 2003-05-01 Sony Ericsson Mobile Communications Ab Midi composer
US7223911B2 (en) * 2001-10-29 2007-05-29 Yamaha Corporation Portable telephone set with reproducing and composing capability of music
US6653545B2 (en) * 2002-03-01 2003-11-25 Ejamming, Inc. Method and apparatus for remote real time collaborative music performance
JP4211672B2 (ja) * 2004-04-28 2009-01-21 ヤマハ株式会社 演奏データ作成装置及びプログラム
US7525036B2 (en) * 2004-10-13 2009-04-28 Sony Corporation Groove mapping
US7297858B2 (en) * 2004-11-30 2007-11-20 Andreas Paepcke MIDIWan: a system to enable geographically remote musicians to collaborate
CA2489256A1 (en) * 2004-12-06 2006-06-06 Christoph Both System and method for video assisted music instrument collaboration over distance
US20060180007A1 (en) * 2005-01-05 2006-08-17 Mcclinsey Jason Music and audio composition system
US7608775B1 (en) * 2005-01-07 2009-10-27 Apple Inc. Methods and systems for providing musical interfaces
US20070028750A1 (en) * 2005-08-05 2007-02-08 Darcie Thomas E Apparatus, system, and method for real-time collaboration over a data network
US7518051B2 (en) * 2005-08-19 2009-04-14 William Gibbens Redmann Method and apparatus for remote real time collaborative music performance and recording thereof
US7853342B2 (en) * 2005-10-11 2010-12-14 Ejamming, Inc. Method and apparatus for remote real time collaborative acoustic performance and recording thereof
US20070137463A1 (en) * 2005-12-19 2007-06-21 Lumsden David J Digital Music Composition Device, Composition Software and Method of Use
US7542273B2 (en) * 2006-08-25 2009-06-02 Laycock Larry R Music display and collaboration system
EP2122509A1 (de) * 2007-02-14 2009-11-25 Museami, Inc. Portal zur bearbeitung verteilter audiodateien
WO2008113120A1 (en) * 2007-03-18 2008-09-25 Igruuv Pty Ltd File creation process, file format and file playback apparatus enabling advanced audio interaction and collaboration capabilities
JP4306754B2 (ja) * 2007-03-27 2009-08-05 ヤマハ株式会社 楽曲データ自動生成装置及び音楽再生制御装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2007053917A3 *

Also Published As

Publication number Publication date
US20090272252A1 (en) 2009-11-05
WO2007053917A2 (fr) 2007-05-18
WO2007053917A3 (fr) 2007-06-28

Similar Documents

Publication Publication Date Title
Covach et al. Understanding rock: Essays in musical analysis
WO2007053917A2 (fr) Procede de composition d’une œuvre musicale par un non-musicien
Greenwald Hip-hop drumming: the rhyme may define, but the groove makes you move
Zak III Bob Dylan and Jimi Hendrix: Juxtaposition and transformation" all along the watchtower"
FR2974226A1 (fr) Procede de generation d'effet sonore dans un logiciel de jeu, programme d'ordinateur associe et systeme informatique pour executer des instructions du programme d'ordinateur.
Gray Improvisation and composition in Balinese gendér wayang: music of the moving shadows
Rothenbuhler For-the-record aesthetics and Robert Johnson's blues style as a product of recorded culture
CN111223470A (zh) 音频处理方法、装置及电子设备
FR3038440A1 (fr) Procede d’extraction et d’assemblage de morceaux d’enregistrements musicaux
Minton Houston Creoles and Zydeco: The emergence of an African American urban popular style
Dowdall Technology and the stylistic evolution of the jazz bass
EP1395976B1 (de) Hilfsvorgang und -gerät zum komponieren oder spielen eines musikstückes
Perone The words and music of Carole King
Tsougras The application of GTTM on 20th century modal music: Research based on the analysis of Yannis Constantinidis's “44 Greek miniatures for piano”
Chernoff The artistic challenge of African music: Thoughts on the absence of drum orchestras in Black American music
Tucker Mainstreaming Monk: The Ellington Album
Hoek ARSC CONFERENCE PAPER: Beyond Bebop: Dial Records and the Library of Contemporary Classics
Davidson Performing" Hurt": Aging, Disability, and Popular Music as Mediated Product and Lived-Experience in Johnny Cash's Final Recordings
Cannon Laughter, Liquor, and Licentiousness: Preservation Through Play in Southern Vietnamese Traditional Music
Washington Freedom in the jazz imaginary: Twentieth-century aesthetic revolt
Aittoniemi Cultural and musical dimensions of Goa trance and early psychedelic trance in Finland
Heetderks Slanted beats, enchanted communities: Pavement's early phrase rhythm as indie narrative
RITZAREV Rothschild's Violin and a Russian Tune.
Cannon LAUGHTER, LIQUOR, AND LICENTIOUSNESS
Dylan et al. ALBIN J. ZAK III

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20080528

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

RIN1 Information on inventor provided before grant (corrected)

Inventor name: LADYJENSKY, JACQUES

17Q First examination report despatched

Effective date: 20080609

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20100601