WO2015168899A1 - Video playing system and method - Google Patents

Video playing system and method

Info

Publication number
WO2015168899A1
Authority
WO
WIPO (PCT)
Prior art keywords
index information
content
terminal
video
audio
Prior art date
Application number
PCT/CN2014/077024
Other languages
English (en)
Chinese (zh)
Inventor
漆·亚历克斯
Original Assignee
漆·亚历克斯
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 漆·亚历克斯 filed Critical 漆·亚历克斯
Priority to PCT/CN2014/077024 priority Critical patent/WO2015168899A1/fr
Publication of WO2015168899A1 publication Critical patent/WO2015168899A1/fr

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware

Definitions

  • the present invention relates to the field of Internet technologies, and in particular to a video playing system and method.
  • Background Art
  • the present invention aims to solve at least one of the technical problems in the related art to some extent.
  • the first object of the present invention is to provide a video playing system, which can make the viewer more convenient to understand relevant information without affecting the viewer's normal viewing of the network video, and greatly expand the content of the network video. Give users a better viewing experience.
  • a second object of the present invention is to provide a video playing method.
  • the first aspect of the present invention provides a video playing system, including: a first terminal, a second terminal, and a cloud server, where the cloud server is configured to provide play content, the play content has index information, and the index information is related to the play content; the first terminal is configured to acquire the play content from the cloud server, display the play content, and send the index information to the second terminal; and the second terminal is configured to acquire extended content corresponding to the index information and synchronously display the extended content while the play content is played.
  • with the video playing system of the embodiment of the present invention, the extended content related to the network video can be played and displayed on the second terminal in real time, so that viewers can more conveniently learn relevant information without affecting their normal viewing of the network video, greatly expanding the content of the network video and giving users a better viewing experience.
  • the second aspect of the present invention provides a video playing method, including: the first terminal acquiring the play content from the cloud server, displaying the play content, and sending the index information to a second terminal, where the play content has index information and the index information is related to the play content; and the second terminal acquiring extended content corresponding to the index information and synchronously displaying the extended content while the play content is played.
  • with the video playing method of the embodiment of the present invention, the extended content related to the network video can be played and displayed on the second terminal in real time, so that viewers can more conveniently learn relevant information without affecting their normal viewing of the network video, greatly expanding the content of the network video and giving users a better viewing experience.
  • FIG. 1 is a schematic structural diagram of a video playing system according to an embodiment of the present invention.
  • FIG. 2(a) and (b) are schematic views of a video playing system according to an embodiment of the present invention.
  • FIG. 3 is a schematic structural diagram of a video playing system according to another embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a video playing system according to another embodiment of the present invention.
  • FIG. 5 is a flowchart of a video playing method according to an embodiment of the present invention.
  • FIG. 6 is a flow chart of a video playing method according to another embodiment of the present invention.
  • Detailed Description
  • the terms “first” and “second” are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
  • features defining “first” and “second” may include one or more of the features, either explicitly or implicitly.
  • the meaning of “plurality” is two or more, unless specifically defined otherwise.
  • FIG. 1 is a schematic structural diagram of a video playing system according to an embodiment of the present invention.
  • the video playing system includes: a first terminal 10, a second terminal 20, and a cloud server 30.
  • the cloud server 30 is configured to provide playing content, where the playing content has index information, wherein the index information is related to the playing content.
  • the index information includes one or more of a name, a keyword, a title, a subtitle, and a program content profile of the currently played video.
  • for example, each scene of a movie may be segmented, and the name, keywords, and so on of each scene used as index information; through the index information, the play content can be accurately located to a specific scene or picture, and information closely related to the scene or picture currently being played can be obtained based on the index information.
  • the index information may be embedded in the code stream of the video when the video is created, so that the first terminal 10 can identify the index information from the play content after acquiring the play content from the cloud server 30.
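  • As an illustration only, the following Python sketch shows one possible way to represent such per-scene index information and store it as a side-car file next to the video; the field names and the JSON side-car format are assumptions for this sketch and are not prescribed by the invention.

```python
# Hypothetical sketch: per-scene index information stored as a side-car
# JSON file next to the video. Field names and file format are assumptions.
import json
from dataclasses import dataclass, asdict
from typing import List


@dataclass
class IndexInfo:
    name: str             # scene name
    keywords: List[str]   # keywords describing the scene
    title: str            # video title
    subtitle: str         # subtitle / episode title
    synopsis: str         # program content profile
    start_s: float        # scene start time within the video, in seconds
    end_s: float          # scene end time, in seconds


def write_index_sidecar(scenes: List[IndexInfo], path: str) -> None:
    """Write the per-scene index information for a video to a JSON file."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump([asdict(s) for s in scenes], f, ensure_ascii=False, indent=2)


if __name__ == "__main__":
    scenes = [
        IndexInfo(name="Opening scene", keywords=["Team A", "Team B", "stadium"],
                  title="Football Night", subtitle="Matchday 1",
                  synopsis="League opener", start_s=0.0, end_s=95.0),
    ]
    write_index_sidecar(scenes, "video_index.json")
```

  • In practice, as the description above suggests, the same metadata would more likely be embedded directly in the video code stream during production rather than shipped as a separate file.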
  • the first terminal 10 is configured to acquire the play content from the cloud server, display the play content, and send the index information to the second terminal 20.
  • the first terminal 10 may be a terminal having a communication function such as Wifi or Bluetooth, such as a personal computer PC, a mobile phone, or a tablet computer.
  • specifically, the first terminal 10 may display the play content on the screen of the first terminal 10 and, at the same time, identify the embedded index information from the play content and transmit the index information to the second terminal 20 via Wifi or Bluetooth.
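  • A minimal sketch, assuming a plain TCP connection over the local WiFi network, of how the first terminal might push the currently identified index information to the second terminal; the port number and the newline-delimited JSON framing are assumptions, and a Bluetooth transport would simply replace the socket layer.

```python
# Hypothetical sketch: the first terminal pushes the current index
# information to the second terminal over the local WiFi network as one
# newline-terminated JSON message per TCP connection. Port and framing
# are assumptions.
import json
import socket


def send_index_info(host: str, port: int, index_info: dict) -> None:
    """Connect to the second terminal and send one index-information message."""
    payload = (json.dumps(index_info, ensure_ascii=False) + "\n").encode("utf-8")
    with socket.create_connection((host, port), timeout=5) as conn:
        conn.sendall(payload)


if __name__ == "__main__":
    # Address of the second terminal on the local network (assumed).
    send_index_info("192.168.1.42", 9000, {
        "name": "Opening scene",
        "keywords": ["Team A", "Team B", "stadium"],
    })
```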
  • the second terminal 20 is configured to acquire extended content corresponding to the index information, and synchronously display the extended content in the process of playing the content.
  • the second terminal 20 may be a wearable device, which should also have a communication function such as Wifi or Bluetooth for communicating with the first terminal 10, and which can have an audio receiving/playing module such as a microphone or a speaker.
  • the second terminal 20 may obtain the extended content corresponding to the index information from the cloud server 30; the extended content may be created by the video content producer when the video content is created and stored in the cloud server 30, and it may be associated with the video content or be an advertisement screen or the like that the video content producer wishes to play.
  • for example, when the content of the video is a football game and the index information consists of keywords such as the team names, player information, and the stadium, the second terminal 20 can obtain extended content such as player introductions and stadium introductions from the cloud server 30, or obtain specific information about a player after that player scores.
  • the extended content may include one or more of multimedia information, language information, and text information; more specifically, the extended content may include one or more of a website related to the current video content, a social network, a chat group, songs, character introductions, background materials, costumes, scene introductions, and prior videos of the current video.
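  • A minimal sketch of how the second terminal might query the cloud server for extended content matching the received index information; the endpoint path, query parameters, and JSON response shape are assumptions made only for illustration.

```python
# Hypothetical sketch: the second terminal asks the cloud server for
# extended content matching the received index information. Endpoint,
# query parameters, and response shape are assumptions.
import json
import urllib.parse
import urllib.request


def fetch_extended_content(server: str, index_info: dict) -> list:
    """Return a list of extended-content items for the given index information."""
    query = urllib.parse.urlencode({
        "name": index_info.get("name", ""),
        "keywords": ",".join(index_info.get("keywords", [])),
    })
    url = f"{server}/extended-content?{query}"
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    items = fetch_extended_content("http://cloud.example.com",
                                   {"name": "Opening scene",
                                    "keywords": ["Team A", "player 10"]})
    for item in items:
        print(item.get("type"), item.get("title"))
```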
  • in addition to embedding the index information in the code stream of the video as described above, the index information may be converted into audio index information and the audio index information embedded in the audio code stream of the network video; when the content of the network video is played, the audio of the network video is played along with it, so that the audio index information can be received and recognized and the index information thereby acquired.
  • the first terminal 10 is further configured to play the play content and the audio index information, wherein the audio index information is played in a manner that has no effect on the user.
  • the second terminal 20 is further configured to receive and identify audio index information to obtain index information.
  • the audio index information can be broadcasted by the audio playback device of the first terminal 10 and then received and recognized by the audio receiving device of the second terminal 20.
  • for example, when the first terminal 10 is a tablet computer and the second terminal 20 is a wearable device, the index information is embedded in the original audio signal of the network video to form a mixed audio signal; the mixed audio signal is played out through the speaker of the tablet, the microphone of the wearable device receives the mixed audio signal, and the wearable device then recognizes and extracts the index information from the mixed audio signal.
  • the first terminal 10 may play the audio index information at a frequency outside the range audible to the human ear; or set the frequency of the audio index information to be the same as the audio frequency of the currently played content; or shorten the playing time of the audio index information to a length too short for the human ear to perceive. Specifically, since the audio index information is played on the first terminal 10 together with the sound of the network video, playing the audio index information on the first terminal 10 should not affect the viewer's normal hearing of the network video sound.
  • specifically, the first terminal 10 may embed the audio index information at a frequency outside the sound band that can be heard by the human ear; or it may embed it within the audible band while masking the audio index information within the sound of the currently played network video, so that the user cannot perceive the audio index information; or it may shorten the playback duration of the audio index information to a time short enough that, because the human ear is insensitive to such transient sounds, the user cannot perceive the audio index information.
  • the audio index information can also be pre-processed by other methods known in the art, as long as it remains imperceptible to the human ear, and these are not repeated here.
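  • As one concrete illustration of audio index information that the human ear cannot perceive, the sketch below (assuming NumPy is available) encodes a small payload as near-ultrasonic frequency-shift-keyed tones around 18 kHz at a 44.1 kHz sample rate and mixes them at low level into the program audio; the carrier frequencies, bit rate, and mixing level are assumptions, and no particular audio watermarking scheme is implied.

```python
# Hypothetical sketch: encode a small index payload as near-ultrasonic
# frequency-shift keying (FSK) tones and mix them quietly into the program
# audio. Carrier frequencies, bit rate, and amplitude are assumptions.
import numpy as np

SAMPLE_RATE = 44_100
F0, F1 = 18_000.0, 18_500.0   # tone frequencies for bit 0 / bit 1
BIT_SECONDS = 0.05            # 20 bits per second
MIX_LEVEL = 0.02              # keep the tones far below the program audio


def encode_bits(payload: bytes) -> np.ndarray:
    """Return an FSK waveform carrying the payload bits, MSB first."""
    samples_per_bit = int(SAMPLE_RATE * BIT_SECONDS)
    t = np.arange(samples_per_bit) / SAMPLE_RATE
    chunks = []
    for byte in payload:
        for i in range(8):
            bit = (byte >> (7 - i)) & 1
            freq = F1 if bit else F0
            chunks.append(np.sin(2 * np.pi * freq * t))
    return np.concatenate(chunks)


def mix_into_audio(program_audio: np.ndarray, payload: bytes) -> np.ndarray:
    """Overlay the encoded payload at low level onto the program audio."""
    tones = MIX_LEVEL * encode_bits(payload)
    mixed = program_audio.copy()
    n = min(len(mixed), len(tones))
    mixed[:n] += tones[:n]
    return np.clip(mixed, -1.0, 1.0)
```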
  • the index information may be added to the play content during network video production, or may be inserted during network video playback. That is, in order to associate the play content played by the first terminal 10 with the extended content displayed by the second terminal 20, the index information can be embedded in the play content of the network video, and the amount of index information should be as small as possible; the first terminal 10 transmits the index information to the second terminal 20 via Wifi or Bluetooth, so that after identifying the index information the second terminal 20 obtains the extended content from the cloud server 30 according to the index information and displays the extended content on the second terminal.
  • with the video playing system of the embodiment of the present invention, the extended content related to the network video can be played and displayed on the second terminal in real time, so that viewers can more conveniently learn relevant information while watching the online video normally, which greatly expands the content of the online video and provides users with a better viewing experience.
  • FIG. 3 is a schematic structural diagram of a video playing system according to another embodiment of the present invention.
  • the video playing system includes: a first terminal 10, a second terminal 20, and a cloud server 30.
  • the second terminal 20 is a television set, and the television set should also have a communication function such as Wifi or Bluetooth.
  • the first terminal 10 is further configured to send the play content to the second terminal 20, acquire the extended content corresponding to the index information, and synchronously display the extended content while the play content is being played.
  • the second terminal 20 is further configured to display the played content.
  • specifically, the first terminal 10 may identify the embedded index information from the play content, acquire the extended content from the cloud server 30 using the index information, and display the extended content on the screen of the first terminal 10, while the play content is simultaneously transmitted to the second terminal 20 via Wifi or Bluetooth.
  • for example, when the first terminal is an Apple iPad, the TV has a WiFi function, and the TV and the iPad are on the same WiFi network, the iPad can switch the network video content to the TV for playback through AirPlay.
  • the second terminal 20 is further configured to play the play content and the audio index information, where the audio index information is played in a manner that has no effect on the user.
  • the first terminal 10 is further configured to receive and identify audio index information to obtain index information.
  • for example, when the first terminal 10 is a tablet computer and the second terminal 20 is a television set, the index information is embedded in the original audio signal of the network video to form a mixed audio signal; the mixed audio signal is played through the speaker of the television, the microphone of the tablet receives the mixed audio signal, and the tablet then recognizes and extracts the index information from the mixed audio signal.
  • with the video playing system of the embodiment of the present invention, the extended content related to the network video can be played and displayed on the first terminal in real time, so that viewers can more conveniently learn relevant information without affecting their normal viewing of the network video, greatly expanding the content of the network video and providing users with a better viewing experience.
  • the present invention also proposes a video playing method.
  • FIG. 5 is a flowchart of a video playing method according to an embodiment of the present invention.
  • the video playing method includes:
  • the first terminal acquires the playing content from the cloud server, displays the playing content, and sends the index information to the second terminal, where the playing content has index information, wherein the index information is related to the playing content.
  • the index information includes one or more of a name, a keyword, a title, a subtitle, and a program content profile of the currently played video. For example, each scene of a movie may be segmented, and the name, keywords, and so on of each scene used as index information; through the index information, the play content can be accurately located to a specific scene or picture, and information closely related to the playing scene or picture can be obtained according to the index information.
  • the index information may be embedded in the code stream of the video when the video is created, so that the first terminal can identify the index information from the play content after acquiring the play content from the cloud server.
  • the first terminal may be a terminal having a communication function such as Wifi or Bluetooth, such as a personal computer PC, a mobile phone, or a tablet computer.
  • specifically, the first terminal may display the play content on the screen of the first terminal and, at the same time, identify the embedded index information from the play content and send the index information to the second terminal via Wifi or Bluetooth.
  • the second terminal acquires the extended content corresponding to the index information, and synchronously displays the extended content during the playing of the playing content.
  • the second terminal may be a wearable device, which should also have a communication function such as Wifi or Bluetooth that can communicate with the first terminal, and which can have an audio receiving/playing module such as a microphone or speaker.
  • the second terminal may obtain the extended content corresponding to the index information from the cloud server; the extended content may be created by the video content producer when the video content is created and stored in the cloud server, and it may be associated with the video content or be an advertisement screen or the like that the video content producer wishes to play.
  • for example, when the content of the video is a football match and the index information consists of keywords such as the team names, player information, and the stadium, the second terminal can obtain extended content such as player introductions and stadium introductions from the cloud server, or obtain specific information about a player after that player scores.
  • the extended content may include one or more of multimedia information, language information, and text information; more specifically, the extended content may include one or more of a website related to the current video content, a social network, a chat group, songs, character introductions, background materials, costumes, scene introductions, and prior videos of the current video.
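  • A self-contained sketch of the receive loop on the second terminal under the same assumptions as the earlier sketches: it accepts newline-delimited JSON index messages from the first terminal over the local network, queries an assumed cloud-server endpoint for matching extended content, and prints the returned items as a stand-in for rendering them on the device's screen.

```python
# Hypothetical sketch: receive loop on the second terminal. Port, framing,
# cloud-server address, and endpoint are assumptions for illustration.
import json
import socketserver
import urllib.parse
import urllib.request

CLOUD_SERVER = "http://cloud.example.com"  # assumed cloud-server address


def fetch_extended_content(index_info: dict) -> list:
    """Ask the cloud server for extended-content items matching the index."""
    query = urllib.parse.urlencode(
        {"keywords": ",".join(index_info.get("keywords", []))})
    url = f"{CLOUD_SERVER}/extended-content?{query}"
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))


class IndexInfoHandler(socketserver.StreamRequestHandler):
    """Handles newline-delimited JSON index messages from the first terminal."""

    def handle(self) -> None:
        for line in self.rfile:
            index_info = json.loads(line.decode("utf-8"))
            for item in fetch_extended_content(index_info):
                # Stand-in for rendering on the second terminal's screen.
                print(f"[extended] {item.get('type')}: {item.get('title')}")


if __name__ == "__main__":
    with socketserver.TCPServer(("0.0.0.0", 9000), IndexInfoHandler) as srv:
        srv.serve_forever()
```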
  • in addition to embedding the index information in the code stream of the video as described above, in the embodiment of the present invention the index information can also be converted into audio index information and the audio index information embedded in the audio stream of the network video; when the content of the network video is played, the audio of the network video is played along with it, so that the audio index information can be received and recognized and the index information thereby obtained.
  • the first terminal plays the play content and the audio index information, wherein the audio index information is played in a manner that has no effect on the user.
  • the second terminal receives and identifies the audio index information to obtain index information.
  • the audio index information may be broadcasted by the audio playing device of the first terminal, and then received and recognized by the audio receiving device of the second terminal.
  • for example, when the first terminal is a tablet computer and the second terminal is a wearable device, the index information is embedded in the original audio signal of the network video to form a mixed audio signal; the mixed audio signal is played through the speaker of the tablet computer, the microphone of the wearable device receives the mixed audio signal, and the wearable device then identifies and extracts the index information from the mixed audio signal.
  • the first terminal may play the audio index information at a frequency outside the range audible to the human ear; or set the frequency of the audio index information to be the same as the audio frequency of the currently played content; or shorten the playback time of the audio index information to a length too short for the human ear to perceive.
  • the audio index information played by the first terminal should not affect the sound of the network video that the viewer normally hears through the ear.
  • specifically, the first terminal may embed the audio index information at a frequency outside the sound band that can be heard by the human ear, or may embed it within the audible band while masking the audio index information within the sound of the currently played network video so that the user cannot perceive it. The audio index information can also be pre-processed by other methods known in the art, as long as it remains imperceptible to the human ear, and these are not repeated here.
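  • Mirroring the near-ultrasonic encoding sketch given earlier (and again assuming NumPy), a receiving terminal could recover the embedded bits by comparing the microphone signal's energy at the two assumed carrier frequencies in each bit interval; the simple one-bin DFT (Goertzel-style) detector below is an illustration only and ignores synchronization and error correction.

```python
# Hypothetical sketch: the receiving terminal recovers payload bits by
# comparing the signal energy at the two assumed FSK carrier frequencies
# in each bit interval (a simple Goertzel-style tone detector).
import numpy as np

SAMPLE_RATE = 44_100
F0, F1 = 18_000.0, 18_500.0   # assumed carriers for bit 0 / bit 1
BIT_SECONDS = 0.05            # assumed bit duration


def tone_energy(window: np.ndarray, freq: float) -> float:
    """Magnitude of `window` at a single frequency via a one-bin DFT."""
    n = np.arange(len(window))
    basis = np.exp(-2j * np.pi * freq * n / SAMPLE_RATE)
    return float(np.abs(np.dot(window, basis)))


def decode_bits(mic_signal: np.ndarray, num_bytes: int) -> bytes:
    """Recover `num_bytes` of payload (MSB first) from the microphone signal."""
    samples_per_bit = int(SAMPLE_RATE * BIT_SECONDS)
    out = bytearray()
    for b in range(num_bytes):
        byte = 0
        for i in range(8):
            start = (b * 8 + i) * samples_per_bit
            window = mic_signal[start:start + samples_per_bit]
            bit = 1 if tone_energy(window, F1) > tone_energy(window, F0) else 0
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)
```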
  • the index information may be added to the play content during network video production, or may be inserted during network video playback. That is, in order to associate the play content played by the first terminal with the extended content displayed by the second terminal, the index information may be embedded in the play content of the network video, and the amount of index information should be as small as possible; the first terminal transmits the index information to the second terminal via Wifi or Bluetooth, so that after identifying the index information the second terminal acquires the extended content from the cloud server according to the index information and displays the extended content on the second terminal.
  • with the video playing method of the embodiment of the present invention, the extended content related to the network video can be played and displayed on the second terminal in real time, so that viewers can more conveniently learn relevant information without affecting their normal viewing of the network video, greatly expanding the content of the network video and providing users with a better viewing experience.
  • FIG. 6 is a flow chart of a video playing method according to another embodiment of the present invention.
  • the video playing method includes:
  • the first terminal acquires the playing content from the cloud server.
  • the first terminal sends the play content to the second terminal, acquires extended content corresponding to the index information, and synchronously displays the extended content during the playing of the play content.
  • the second terminal is a television, and the television should also have a communication function such as Wifi or Bluetooth.
  • specifically, the first terminal can identify the embedded index information from the play content, obtain the extended content from the cloud server using the index information, display the extended content on the screen of the first terminal, and send the play content to the second terminal via Wifi or Bluetooth.
  • for example, when the first terminal is an Apple iPad, the TV has a WiFi function, and the TV and the iPad are on the same WiFi network, the iPad can switch the play content of the network video to the TV for playback through AirPlay.
  • the second terminal is further configured to play the play content and the audio index information, where the audio index information is played in a manner that has no effect on the user.
  • the first terminal is further configured to receive and identify audio index information to obtain index information.
  • for example, when the first terminal is a tablet computer and the second terminal is a television set, the index information is embedded in the original audio signal of the network video to form a mixed audio signal; the mixed audio signal is played through the speaker of the television, the microphone of the tablet receives the mixed audio signal, and the tablet then recognizes and extracts the index information from the mixed audio signal.
  • the second terminal displays the played content.
  • with the video playing method of the embodiment of the present invention, the extended content related to the network video can be played and displayed on the first terminal in real time, so that viewers can more conveniently learn relevant information without affecting their normal viewing of the network video, greatly expanding the content of the network video and providing users with a better viewing experience.
  • portions of the invention may be implemented in hardware, software, firmware or a combination thereof.
  • for example, multiple steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. Alternatively, if implemented in hardware, as in another embodiment, they can be implemented with any one of, or a combination of, the following techniques well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits with suitable combinational logic gates, programmable gate arrays (PGAs), field programmable gate arrays (FPGAs), and so on.
  • unless explicitly defined otherwise, the terms “installation”, “connected”, and the like are to be understood broadly; for example, a connection may be fixed, detachable, or integral; it may be mechanical or electrical; and it may be direct, indirect through an intermediate medium, internal communication between two elements, or an interaction relationship between two elements.
  • the specific meaning of the above terms in the present invention can be understood on a case-by-case basis.
  • the description of the terms “one embodiment”, “some embodiments”, “example”, “specific example”, “some examples”, and the like means that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention.
  • the schematic representation of the above terms is not necessarily directed to the same embodiment or example.
  • the specific features, structures, materials, or characteristics described may be combined in a suitable manner in one or more embodiments or examples.
  • in addition, the various embodiments or examples described in the specification, as well as the features of the various embodiments or examples, may be combined with one another.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The invention relates to a video playing system and method. The video playing system comprises: a first terminal, a second terminal, and a cloud server. The cloud server is used to provide play content, the play content carrying index information, the index information being associated with the play content. The first terminal is used to acquire the play content from the cloud server, display the play content, and send the index information to the second terminal. The second terminal is used to acquire extended content corresponding to the index information and to synchronously display the extended content while the play content is played. The video playing system of the present invention allows a viewer to find related information more conveniently while not affecting the viewer's normal viewing of a network video, thereby significantly expanding the content of the network video and providing the user with a better viewing experience.
PCT/CN2014/077024 2014-05-08 2014-05-08 Système et procédé de lecture de vidéo WO2015168899A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2014/077024 WO2015168899A1 (fr) 2014-05-08 2014-05-08 Système et procédé de lecture de vidéo

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2014/077024 WO2015168899A1 (fr) 2014-05-08 2014-05-08 Système et procédé de lecture de vidéo

Publications (1)

Publication Number Publication Date
WO2015168899A1 true WO2015168899A1 (fr) 2015-11-12

Family

ID=54391987

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/077024 WO2015168899A1 (fr) 2014-05-08 2014-05-08 Système et procédé de lecture de vidéo

Country Status (1)

Country Link
WO (1) WO2015168899A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110069937A1 (en) * 2009-09-18 2011-03-24 Laura Toerner Apparatus, system and method for identifying advertisements from a broadcast source and providing functionality relating to the same
CN102668582A (zh) * 2009-11-05 2012-09-12 科斯莫研究有限公司 在移动装置上识别、提供以及呈现补充内容的系统和方法
CN103535028A (zh) * 2010-12-30 2014-01-22 汤姆逊许可公司 用于提供与显示的内容相关的附加内容的方法和系统
CN103748897A (zh) * 2011-06-02 2014-04-23 谷歌公司 用于在第二设备上显示与在第一设备上播放的内容有关的内容的方法

Similar Documents

Publication Publication Date Title
US11165988B1 (en) System and methods providing supplemental content to internet-enabled devices synchronized with rendering of original content
ES2911179T3 (es) Controlador de fuente de audio/vídeo autónomo, inteligente y activado por contenido
CN110741651B (zh) 用于呈现指示推荐内容的通知的方法、系统和介质
CN106464953B (zh) 双声道音频系统和方法
US9668031B2 (en) Apparatus, systems and methods for accessing and synchronizing presentation of media content and supplemental media rich content
US8667529B2 (en) Presentation of audiovisual exercise segments between segments of primary audiovisual content
US20160366464A1 (en) Method, device, and system for interactive television
US20130301392A1 (en) Methods and apparatuses for communication of audio tokens
WO2015090095A1 (fr) Procédé, dispositif et système de poussée d'informations
US9191553B2 (en) System, methods, and computer program products for multi-stream audio/visual synchronization
CN104918061B (zh) 一种电视频道的识别方法及系统
US20150177958A1 (en) Providing context information relating to media content that is being presented
JP7290260B1 (ja) サーバ、端末及びコンピュータプログラム
US20180176628A1 (en) Information device and display processing method
CN103945074A (zh) 一种彩铃定制方法和系统
CN105100858A (zh) 视频播放系统和方法
WO2015168899A1 (fr) Système et procédé de lecture de vidéo
WO2015168898A1 (fr) Procédé et dispositif de lecture de vidéo et lecteur
JP6254852B2 (ja) 放送通信連携触覚提示システム、サービスサーバおよびプログラム、並びに、携帯端末
CN105338397A (zh) 信息推送方法、装置和系统
WO2016000154A1 (fr) Procédé, dispositif et système de distribution sélective d'informations
CN105100864A (zh) 视频播放方法、装置和播放器
JP7302801B1 (ja) ストリーミングデータを取り扱う方法、システム及びコンピュータプログラム
JP2018019404A (ja) データ再生システム、データ再生方法、データ配信装置、広告配信装置、データ再生端末及びデータ再生用プログラム
KR102668559B1 (ko) 콘텐츠에 포함된 음파신호로부터 콘텐츠를 추정하는 방법 및 장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14891511

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 31/03/2017)

122 Ep: pct application non-entry in european phase

Ref document number: 14891511

Country of ref document: EP

Kind code of ref document: A1