WO2020000970A1 - Method and apparatus for identifying user interest, and terminal device and storage medium - Google Patents

Method and apparatus for identifying user interest, and terminal device and storage medium Download PDF

Info

Publication number
WO2020000970A1
Authority
WO
WIPO (PCT)
Prior art keywords
video
user
currently displayed
display
response
Prior art date
Application number
PCT/CN2018/124743
Other languages
English (en)
Chinese (zh)
Inventor
王亚
Original Assignee
北京微播视界科技有限公司
Priority date
Filing date
Publication date
Application filed by 北京微播视界科技有限公司
Publication of WO2020000970A1

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Definitions

  • Embodiments of the present disclosure relate to Internet technologies, and for example, to a method, an apparatus, a terminal device, and a storage medium for identifying user interests.
  • A related way of identifying the user's interest is to place an operable control on the video display interface for indicating whether the user is interested, and the user is required to click the operable control to mark whether he or she is interested in the currently viewed video.
  • This related way of identifying the user's interest has the disadvantage of poor interactivity, and the operable controls on the display interface interfere with the user's viewing of the video to some extent.
  • Embodiments of the present disclosure provide a method, an apparatus, a terminal device, and a storage medium for identifying user interests, to improve interactivity and reduce the number of operable controls on the display interface.
  • An embodiment of the present disclosure provides a method for identifying user interests.
  • The method includes: detecting a user's gesture on a display interface of a currently displayed video in a video stream; displaying a "not interested" selection area in response to the detected gesture; and, when the user's click operation on the selection area is detected, determining that the user is not interested in the currently displayed video and displaying the next video after the currently displayed video in the video stream.
  • An embodiment of the present disclosure further provides a device for identifying user interests, and the device includes:
  • a gesture detection module, configured to detect a user's gesture on a display interface of a currently displayed video in a video stream;
  • a selection area display module, configured to display a "not interested" selection area in response to the detected gesture; and
  • a next video display module, configured to, when the user's click operation on the selection area is detected, determine that the user is not interested in the currently displayed video and display the next video after the currently displayed video in the video stream.
  • An embodiment of the present disclosure further provides a terminal device, where the terminal device includes:
  • one or more processors; and
  • a storage device, configured to store one or more programs, where, when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the method for identifying user interest according to any embodiment of the present disclosure.
  • An embodiment of the present disclosure also provides a computer-readable storage medium on which a computer program is stored.
  • When the program is executed by a processor, the method for identifying user interest according to any embodiment of the present disclosure is implemented.
  • FIG. 1 is a flowchart of a method for identifying user interests provided in Embodiment 1 of the present disclosure
  • FIG. 2 is a flowchart of a method for identifying user interests provided in Embodiment 2 of the present disclosure
  • FIG. 3a is a schematic diagram of a display interface on which a user's gesture acts on a currently displayed video in an embodiment of the present disclosure;
  • FIG. 3b is a schematic diagram showing an uninteresting selected area on a mask layer in an embodiment of the present disclosure
  • FIG. 3c is a schematic diagram of a user clicking a click area in an embodiment of the present disclosure.
  • FIG. 3d is a schematic diagram of displaying the next video after the currently displayed video in the video stream in an embodiment of the present disclosure;
  • FIG. 3e is a schematic diagram of a user's touch operation outside the selection area on the mask layer in an embodiment of the present disclosure;
  • FIG. 3f is a schematic diagram of restoring the display of the currently displayed video in an embodiment of the present disclosure;
  • FIG. 4 is a flowchart of a method for identifying user interests provided in Embodiment 3 of the present disclosure;
  • FIG. 5a is a schematic diagram of a display interface in which the next video moves with a look-back operation instruction in an embodiment of the present disclosure;
  • FIG. 5b is a schematic diagram of displaying the previous video of the currently displayed video when the user's finger leaves the screen in an embodiment of the present disclosure;
  • FIG. 6 is a schematic structural diagram of a device for identifying user interests provided by Embodiment 4 of the present disclosure.
  • FIG. 7 is a schematic diagram of a hardware structure of a terminal device according to Embodiment 5 of the present disclosure.
  • FIG. 1 is a flowchart of a method for identifying user interests provided in Embodiment 1 of the present disclosure. This embodiment is applicable to identifying videos in a video stream that a user is not interested in.
  • the method may be performed by a device that identifies user interests.
  • the device may be implemented by at least one of software and hardware, and may generally be integrated in a terminal device such as a mobile phone or a tablet computer.
  • The method includes steps 110, 120, and 130.
  • In step 110, a user's gesture is detected on a display interface of a currently displayed video in the video stream.
  • the gesture may be a long-press gesture, which may include a gesture in which the pressing time is greater than a set time threshold and the pressing pressure is greater than a set pressure threshold, and the pressing pressure may be obtained through a pressure sensor.
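As an illustration of the long-press check described in the paragraph above, the sketch below classifies a touch as a long press only when both the press duration and the press pressure exceed set thresholds. The type names, function names, and threshold values are assumptions introduced for this sketch and are not taken from the disclosure.

```kotlin
// Minimal sketch of the long-press test described above: a gesture qualifies
// when both the press duration and the press pressure exceed set thresholds.
// All names and threshold values are illustrative assumptions.
data class TouchSample(
    val downTimeMillis: Long,   // time the finger went down
    val nowMillis: Long,        // current time
    val pressure: Float         // reading from a pressure sensor, normalized to 0..1
)

class LongPressDetector(
    private val timeThresholdMillis: Long = 500L,  // assumed "set time threshold"
    private val pressureThreshold: Float = 0.4f    // assumed "set pressure threshold"
) {
    fun isLongPress(sample: TouchSample): Boolean {
        val pressedLongEnough = sample.nowMillis - sample.downTimeMillis > timeThresholdMillis
        val pressedHardEnough = sample.pressure > pressureThreshold
        return pressedLongEnough && pressedHardEnough
    }
}
```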
  • the video stream includes multiple videos. The terminal device requests the server once, and the server returns a video stream including a set number of videos.
  • the embodiments of the present disclosure determine that the user is not interested in the currently displayed video based on the user's gesture.
  • the position of the gesture is not limited, and it can be applied to any position outside the existing controls on the display interface.
  • In step 120, in response to the detected gesture, a "not interested" selection area is displayed.
  • The selection area may be an area located at a preset position on the display interface (such as the middle of the display interface) or an area located at the position where the user's gesture is detected.
  • The area of the selection area may be much smaller than the area of the display interface.
  • The shape of the selection area is not limited; for example, it can be a circular area, a rectangular area, or another shape.
  • A "not interested" selection area is displayed on the video display interface, and the selection area interacts with the user to determine whether the user is interested in the currently displayed video.
  • When the selection area is displayed, if the currently displayed video is being played, the display of the selection area does not affect the playback of the currently displayed video.
  • In step 130, when the user's click operation on the selection area is detected, it is determined that the user is not interested in the currently displayed video, and the next video after the currently displayed video in the video stream is displayed.
  • If a user's click operation on the selection area is detected, it can be determined that the user is not interested in the currently displayed video, and the display then switches to the next video after the currently displayed video in the video stream; if it is detected that the user clicks an area outside the selection area, the selection area is cancelled and the currently displayed video continues to be displayed.
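The branch just described (a tap inside the selection area marks disinterest and advances to the next video, while a tap outside cancels the selection area) could be expressed as a simple hit test over the circular or rectangular areas mentioned earlier. The sketch below is illustrative only; the types and callback names are assumptions.

```kotlin
// Illustrative sketch of the branch described above: a tap inside the
// "not interested" selection area marks disinterest and advances to the next
// video; a tap outside cancels the selection area. Types and callbacks are
// assumptions for illustration, not an API defined by the disclosure.
sealed interface SelectionArea {
    fun contains(x: Float, y: Float): Boolean
}

data class CircleArea(val cx: Float, val cy: Float, val radius: Float) : SelectionArea {
    override fun contains(x: Float, y: Float): Boolean {
        val dx = x - cx
        val dy = y - cy
        return dx * dx + dy * dy <= radius * radius
    }
}

data class RectArea(val left: Float, val top: Float, val right: Float, val bottom: Float) : SelectionArea {
    override fun contains(x: Float, y: Float): Boolean =
        x in left..right && y in top..bottom
}

fun handleTap(
    area: SelectionArea,
    tapX: Float,
    tapY: Float,
    onNotInterested: () -> Unit,   // e.g. record disinterest and show the next video
    onDismiss: () -> Unit          // e.g. cancel the selection area, keep playing
) {
    if (area.contains(tapX, tapY)) onNotInterested() else onDismiss()
}
```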
  • In the method for identifying user interests provided in this embodiment, a "not interested" selection area is displayed when a user's gesture is detected on the display interface of a currently displayed video in a video stream; when the user's click operation on the selection area is detected, it is determined that the user is not interested in the currently displayed video, and the next video in the video stream is displayed. This improves interactivity without directly displaying a "not interested" operable control on the display interface, thereby reducing the number of operable controls on the display interface and improving the user experience.
  • In an embodiment, after it is determined that the user is not interested in the currently displayed video, the method further includes: storing the video identifier of the currently displayed video and the user identifier of the user.
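A minimal sketch of that storing step follows, assuming a plain in-memory store; the record fields and names are illustrative, and a real implementation would persist to whatever storage the terminal device actually uses.

```kotlin
// Sketch of the record the paragraph above says is stored once disinterest is
// determined: the video identifier plus the user identifier. The in-memory
// list stands in for real persistence; all names are assumptions.
data class DisinterestRecord(val videoId: String, val userId: String, val timestampMillis: Long)

class DisinterestStore {
    private val records = mutableListOf<DisinterestRecord>()

    fun save(videoId: String, userId: String) {
        records += DisinterestRecord(videoId, userId, System.currentTimeMillis())
    }

    fun recordsForUser(userId: String): List<DisinterestRecord> =
        records.filter { it.userId == userId }
}
```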
  • FIG. 2 is a flowchart of a method for identifying user interest provided in Embodiment 2 of the present disclosure. In this embodiment, displaying the "not interested" selection area further includes: displaying a mask layer, and displaying the "not interested" selection area on the mask layer.
  • the method includes steps 210, 220, and 230.
  • In step 210, a user's gesture is detected on the display interface of the currently displayed video in the video stream.
  • the user's gesture can be applied to any position outside the existing controls of the display interface where the video is currently displayed.
  • In step 220, in response to the gesture, a mask layer is displayed, and a "not interested" selection area is displayed on the mask layer.
  • the mask layer is generally translucent, and users can see the currently displayed video through the translucent mask layer.
  • The mask layer covers the entire display area, and the "not interested" selection area is displayed in the middle of the mask layer.
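One possible way to lay out such a mask is sketched below: a full-screen translucent layer with a selection rectangle centered on the display. The fractional size of the area and the alpha value are assumptions chosen for illustration.

```kotlin
// Sketch of placing the "not interested" selection area in the middle of a
// full-screen translucent mask, as described above. The fractional size and
// the alpha value are illustrative assumptions.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

data class MaskLayer(val bounds: Rect, val alpha: Float, val selectionArea: Rect)

fun buildMask(displayWidth: Float, displayHeight: Float): MaskLayer {
    val areaWidth = displayWidth * 0.4f      // assumed: much smaller than the display
    val areaHeight = displayHeight * 0.12f
    val left = (displayWidth - areaWidth) / 2f
    val top = (displayHeight - areaHeight) / 2f
    return MaskLayer(
        bounds = Rect(0f, 0f, displayWidth, displayHeight),  // mask covers the whole display
        alpha = 0.5f,                                        // translucent: video visible underneath
        selectionArea = Rect(left, top, left + areaWidth, top + areaHeight)
    )
}
```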
  • In step 230, when the user's click operation on the selection area is detected, it is determined that the user is not interested in the currently displayed video, and the next video after the currently displayed video in the video stream is displayed.
  • The method for identifying user interest provided in this embodiment displays a mask layer when a user's gesture is detected and displays a "not interested" selection area on the mask layer, so that the user can intuitively understand that the subsequent operation acts on the currently displayed video, further improving interactivity.
  • In an embodiment, the method further includes: when a touch operation of the user is detected outside the selection area on the mask layer, canceling the mask layer and restoring the display of the currently displayed video.
  • If a user's touch operation is detected outside the selection area on the mask layer, as shown in FIG. 3e, the mask layer is cancelled and the display of the currently displayed video is restored, as shown in FIG. 3f.
  • In this way, it can be determined whether the user is not interested in the currently displayed video.
  • FIG. 4 is a flowchart of a method for identifying user interests provided in Embodiment 3 of the present disclosure. In this embodiment, the method further includes: when displaying the next video after the currently displayed video in the video stream, deleting the local cache of the currently displayed video in the video stream; and when a user's look-back operation instruction is detected on the display interface of the next video, displaying the previous video of the currently displayed video.
  • the method includes the following steps 310 to 340.
  • In step 310, a user's gesture is detected on the display interface of the currently displayed video in the video stream.
  • In step 320, in response to the gesture, a "not interested" selection area is displayed.
  • In step 330, when the user's click operation on the selection area is detected, it is determined that the user is not interested in the currently displayed video, the next video after the currently displayed video in the video stream is displayed, and the local cache of the currently displayed video in the video stream is deleted.
  • The terminal device requests the server once, and the server returns a video stream including a set number of videos; the set number of videos constitutes one page of the video stream. If the user refreshes, the next page of the video stream is loaded and received by the terminal device.
  • The playback addresses of the videos in the stream are cached locally in the form of a playlist. After the user exits the application to which the video stream belongs, the cache is deleted to free up storage space.
  • The playlist is an array, and the array has no length limit.
  • the playback address of the currently displayed video is deleted from the locally cached playlist. After deleting the playback address of the currently displayed video in the playlist, the subsequent data is shifted forward.
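A minimal sketch of such a playlist cache follows, assuming the playback addresses are held in an ordinary growable list so that removing one entry shifts the later entries forward; the class and method names are illustrative.

```kotlin
// Sketch of the local playlist cache described above: playback addresses are
// held in an unbounded list, and removing the disliked video's address shifts
// the later entries forward. Names are illustrative assumptions.
class PlaylistCache {
    private val playbackAddresses = mutableListOf<String>()  // no fixed length limit

    fun appendPage(addresses: List<String>) {
        playbackAddresses += addresses                       // one server request adds one page
    }

    fun removeAddress(address: String): Boolean =
        playbackAddresses.remove(address)                    // later entries shift forward

    fun clear() = playbackAddresses.clear()                  // e.g. when the user exits the app

    fun addressAt(index: Int): String? = playbackAddresses.getOrNull(index)

    fun indexOf(address: String): Int = playbackAddresses.indexOf(address)
}
```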
  • In step 340, when the user's look-back operation instruction is detected on the display interface of the next video, the previous video of the currently displayed video is displayed.
  • the look-back operation instruction may be a pull-down gesture, or other gesture instructions that can implement look-back, which is not limited here.
  • As shown in FIG. 5a, a user's look-back operation instruction is detected on the display interface of the next video, and the interface moves with the look-back operation instruction. Since the playback address of the currently displayed video has been deleted from the local cache, when the user's finger leaves the screen, the previous video of the currently displayed video is displayed, as shown in FIG. 5b.
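The look-back step can then be read as an index lookup in the cached playlist: because the disliked video's playback address has already been removed, the entry immediately before the video now on screen is the previous video of the currently displayed video. The sketch below, with hypothetical names, illustrates this on a plain list.

```kotlin
// Sketch of the look-back step described above. After the disliked video's
// playback address has been removed from the cached playlist, looking back
// from the next video lands on the video that preceded the disliked one.
// The playlist here is a plain list of playback addresses; names are assumptions.
fun lookBackTarget(playlist: List<String>, currentAddress: String): String? {
    val index = playlist.indexOf(currentAddress)   // position of the video now on screen
    return if (index > 0) playlist[index - 1] else null
}

fun main() {
    // Cached page: A, B, C. The user dislikes B while watching it, so B's
    // address is removed and C is displayed.
    val playlist = mutableListOf("A", "B", "C")
    playlist.remove("B")
    // Looking back from C now resolves to A, skipping the disliked video.
    println(lookBackTarget(playlist, "C"))  // prints A
}
```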
  • The method for identifying user interests provided in this embodiment deletes the local cache of the currently displayed video when displaying the next video after the currently displayed video in the video stream; if a user's look-back operation instruction is then detected on the display interface of the next video, the previous video of the currently displayed video is displayed directly, thereby avoiding playing videos that the user is not interested in and improving the user experience.
  • FIG. 6 is a schematic structural diagram of a device for identifying user interests provided in Embodiment 4 of the present disclosure. This embodiment is applicable to identifying videos in a video stream that a user is not interested in.
  • the device may be implemented by at least one of software and hardware, and may generally be integrated in a terminal device such as a mobile phone or a tablet computer.
  • The device for identifying user interests according to this embodiment includes a gesture detection module 410, a selection area display module 420, and a next video display module 430.
  • The gesture detection module 410 is configured to detect a user's gesture on a display interface of a currently displayed video in a video stream.
  • The selection area display module 420 is configured to display a "not interested" selection area in response to the detected gesture.
  • The next video display module 430 is configured to, when the user's click operation on the selection area is detected, determine that the user is not interested in the currently displayed video and display the next video after the currently displayed video in the video stream.
  • In an embodiment, the selection area display module is configured to display a mask layer and to display the "not interested" selection area on the mask layer.
  • In an embodiment, the device further includes a mask canceling module.
  • The mask canceling module is configured to cancel the mask layer and restore the display of the currently displayed video when a user's touch operation is detected outside the selection area on the mask layer.
  • In an embodiment, the device further includes an identifier saving module.
  • The identifier saving module is configured to save the video identifier of the currently displayed video and the user identifier of the user after the next video display module determines that the user is not interested in the currently displayed video.
  • In an embodiment, the device further includes a cache deletion module and a previous video display module.
  • The cache deletion module is configured to delete the local cache of the currently displayed video in the video stream when the next video display module displays the next video after the currently displayed video in the video stream; the previous video display module is configured to display the previous video of the currently displayed video when a user's look-back operation instruction is detected on the display interface of the next video.
  • the gesture is a long press gesture
  • the long press gesture includes a gesture in which the pressing time is greater than a set time threshold and the pressing pressure is greater than a set pressure threshold.
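For illustration, the modules enumerated above could be expressed as a set of interfaces composed into one device object, as sketched below. The interface and method names are assumptions chosen to mirror the text rather than an API defined by the disclosure.

```kotlin
// Illustrative decomposition of the device described above into the modules it
// enumerates. All interface and method names are assumptions.
interface GestureDetectionModule { fun onGesture(x: Float, y: Float, isLongPress: Boolean) }
interface SelectionAreaDisplayModule { fun showNotInterestedArea() }
interface NextVideoDisplayModule { fun showNextAfter(videoId: String) }
interface MaskCancelingModule { fun cancelMaskAndRestore() }
interface IdentifierSavingModule { fun save(videoId: String, userId: String) }
interface CacheDeletionModule { fun deleteCachedAddress(videoId: String) }
interface PreviousVideoDisplayModule { fun showPrevious() }

class UserInterestDevice(
    private val selectionArea: SelectionAreaDisplayModule,
    private val nextVideo: NextVideoDisplayModule,
    private val maskCanceling: MaskCancelingModule,
    private val identifierSaving: IdentifierSavingModule,
    private val cacheDeletion: CacheDeletionModule,
    private val previousVideo: PreviousVideoDisplayModule
) : GestureDetectionModule {
    // A detected long press shows the "not interested" selection area.
    override fun onGesture(x: Float, y: Float, isLongPress: Boolean) {
        if (isLongPress) selectionArea.showNotInterestedArea()
    }

    // A tap inside the selection area: save identifiers, drop the cached
    // playback address, and show the next video in the stream.
    fun onNotInterestedTap(videoId: String, userId: String) {
        identifierSaving.save(videoId, userId)
        cacheDeletion.deleteCachedAddress(videoId)
        nextVideo.showNextAfter(videoId)
    }

    // A tap outside the selection area restores the current video.
    fun onTapOutsideSelection() = maskCanceling.cancelMaskAndRestore()

    // A look-back instruction on the next video's display interface.
    fun onLookBack() = previousVideo.showPrevious()
}
```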
  • the device for identifying user interests may execute the method for identifying user interests provided by any embodiment of the present disclosure, and has corresponding function modules and beneficial effects of executing the method.
  • FIG. 7 is a schematic diagram of a hardware structure of a terminal device according to Embodiment 5 of the present disclosure.
  • Terminal devices can be implemented in a variety of forms.
  • Terminal devices in this disclosure may include, but are not limited to, mobile terminal devices such as mobile phones, smartphones, laptops, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), navigation devices, in-vehicle terminal devices, in-vehicle display terminals, and in-vehicle electronic rear-view mirrors, as well as fixed terminal devices such as digital TVs and desktop computers.
  • The terminal device 500 may include a wireless communication unit 510, an A/V (audio/video) input unit 520, a user input unit 530, a sensing unit 540, an output unit 550, a memory 560, an interface unit 570, a processor 580, a power supply unit 590, and so on.
  • FIG. 7 illustrates a terminal device having various components, but it should be understood that it is not required to implement all the illustrated components, and more or fewer components may be implemented instead.
  • the wireless communication unit 510 allows radio communication between the terminal device 500 and a wireless communication system or network.
  • the A / V input unit 520 is configured to receive an audio or video signal.
  • the user input unit 530 may generate key input data according to a command input by the user to control various operations of the terminal device.
  • the sensing unit 540 detects the current status of the terminal device 500, the location of the terminal device 500, the presence or absence of a user's touch input to the terminal device 500, the orientation of the terminal device 500, the acceleration or deceleration movement and direction of the terminal device 500, and so on, and A command or signal for controlling the operation of the terminal device 500 is generated.
  • the interface unit 570 functions as an interface that at least one external device can communicate with when connected to the terminal device 500.
  • the output unit 550 is configured to provide an output signal in at least one of visual, audio, and haptic ways.
  • the memory 560 may store software programs and the like for processing and control operations performed by the processor 580, or may temporarily store data that has been output or is to be output.
  • the memory 560 may include at least one type of storage medium.
  • the terminal device 500 may cooperate with a network storage device that performs a storage function of the memory 560 through a network connection.
  • the processor 580 generally controls the overall operation of the terminal device.
  • the processor 580 may include a multimedia module for reproducing or playing back multimedia data.
  • the processor 580 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as characters or images.
  • the power supply unit 590 receives external power or internal power under the control of the processor 580 and provides appropriate power required to operate a plurality of elements and components.
  • the processor 580 executes various functional applications and data processing of the terminal device 500 by running a program stored in the memory 560, for example, implementing a method for identifying user interests provided by the embodiments of the present disclosure, including:
  • detecting a user's gesture on the display interface of a currently displayed video in the video stream; displaying a "not interested" selection area in response to the detected gesture; and, when the user's click operation on the selection area is detected, determining that the user is not interested in the currently displayed video and displaying the next video after the currently displayed video in the video stream.
  • Embodiment 6 of the present disclosure also provides a storage medium containing computer-executable instructions.
  • When the computer-executable instructions are executed by a computer processor, they are used to perform a method for identifying user interests.
  • the method includes:
  • detecting a user's gesture on the display interface of a currently displayed video in the video stream; displaying a "not interested" selection area in response to the detected gesture; and, when the user's click operation on the selection area is detected, determining that the user is not interested in the currently displayed video and displaying the next video after the currently displayed video in the video stream.
  • The storage medium provided by the embodiments of the present disclosure includes computer-executable instructions, and the computer-executable instructions are not limited to the method operations described above; they may also perform related operations in the method for identifying user interests provided by any embodiment of the present disclosure.
  • The present disclosure can be implemented by software plus necessary general-purpose hardware, and of course can also be implemented by hardware, but in many cases the former is the better implementation.
  • The part of the technical solutions of the present disclosure that is essential, or that contributes to the related art, may be embodied in the form of a software product.
  • The computer software product may be stored in a computer-readable storage medium, such as a computer floppy disk, a read-only memory (ROM), a random access memory (RAM), a flash memory (FLASH), a hard disk, or an optical disk, and includes several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to perform the methods described in various embodiments of the present disclosure.
  • the multiple units and modules included are only divided according to functional logic, but are not limited to the above division, as long as the corresponding functions can be realized;
  • the specific names of the multiple functional units are only for the convenience of distinguishing each other, and are not used to limit the protection scope of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention relates to a method and apparatus for identifying user interest, and a terminal device and storage medium. The method includes: detecting a gesture of a user on a display interface of a currently displayed video in a video stream (110); displaying, in response to the gesture, a "not interested" selection area (120); and, when a click operation performed by the user on the selection area is detected, determining that the user is not interested in the currently displayed video, and displaying the next video after the currently displayed video in the video stream (130).
PCT/CN2018/124743 2018-06-29 2018-12-28 Method and apparatus for identifying user interest, and terminal device and storage medium WO2020000970A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810697502.2 2018-06-29
CN201810697502.2A CN108932103A (zh) 2018-06-29 2018-06-29 Method, apparatus, terminal device and storage medium for identifying user interest

Publications (1)

Publication Number Publication Date
WO2020000970A1 (fr) 2020-01-02

Family

ID=64447008

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/124743 WO2020000970A1 (fr) 2018-12-28 Method and apparatus for identifying user interest, and terminal device and storage medium

Country Status (2)

Country Link
CN (1) CN108932103A (fr)
WO (1) WO2020000970A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108932103A (zh) * 2018-06-29 2018-12-04 北京微播视界科技有限公司 Method, apparatus, terminal device and storage medium for identifying user interest
CN110446077A (zh) * 2019-07-31 2019-11-12 安徽抖范视频科技有限公司 Method and system for identifying points of interest based on video type
CN113094135B (zh) * 2021-04-06 2023-05-30 北京字跳网络技术有限公司 Page display control method, apparatus, device and storage medium
CN114531614B (zh) * 2022-02-17 2023-09-15 北京字跳网络技术有限公司 Video playback method, apparatus, electronic device and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103096045A (zh) * 2011-10-28 2013-05-08 宏碁股份有限公司 Method for adjusting video image compression by using gestures
US20130332834A1 (en) * 2011-09-12 2013-12-12 Wenlong Li Annotation and/or recommendation of video content method and apparatus
CN107562349A (zh) * 2017-09-11 2018-01-09 广州酷狗计算机科技有限公司 Method and apparatus for performing processing
CN108932103A (zh) * 2018-06-29 2018-12-04 北京微播视界科技有限公司 Method, apparatus, terminal device and storage medium for identifying user interest

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2416557A4 (fr) * 2009-03-31 2013-01-23 Sharp Kk Image enhancement device, image enhancement method, image enhancement program, and signal processing device
CN102402369B (zh) * 2010-09-13 2017-11-24 联想(北京)有限公司 Electronic device and method for moving an operation prompt identifier thereof
CN103577083A (zh) * 2012-07-30 2014-02-12 腾讯科技(深圳)有限公司 Picture operation method and mobile terminal

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130332834A1 (en) * 2011-09-12 2013-12-12 Wenlong Li Annotation and/or recommendation of video content method and apparatus
CN103096045A (zh) * 2011-10-28 2013-05-08 宏碁股份有限公司 Method for adjusting video image compression by using gestures
CN107562349A (zh) * 2017-09-11 2018-01-09 广州酷狗计算机科技有限公司 Method and apparatus for performing processing
CN108932103A (zh) * 2018-06-29 2018-12-04 北京微播视界科技有限公司 Method, apparatus, terminal device and storage medium for identifying user interest

Also Published As

Publication number Publication date
CN108932103A (zh) 2018-12-04

Similar Documents

Publication Publication Date Title
US10990278B2 (en) Method and device for controlling information flow display panel, terminal apparatus, and storage medium
WO2020000970A1 (fr) Procédé et appareil d'identification d'intérêt d'utilisateur, et dispositif terminal et support d'informations
EP3570163B1 (fr) Procédé de lancement d'application, support d'enregistrement et terminal
US20170336938A1 (en) Method and apparatus for controlling content using graphical object
  • KR102010955B1 Preview control method and portable terminal implementing the same
  • CN110727369B Electronic device
US20130154978A1 (en) Method and apparatus for providing a multi-touch interaction in a portable terminal
  • KR102044826B1 Method for providing mouse function and terminal implementing the same
US20150128073A1 (en) Method for sharing contents and electronic device thereof
  • JP7181375B2 Target object motion recognition method, apparatus and electronic device
  • CN110069188B Identifier display method and terminal device
  • CN107153546B Video playing method and mobile device
  • KR102125212B1 Electronic handwriting operation method and electronic device supporting the same
  • CN105210023A Apparatus and associated methods
US20150220205A1 (en) User input method and apparatus in electronic device
  • KR20130124866A Mobile terminal and control method thereof
  • WO2022227784A1 Image display method and apparatus, device, and storage medium
  • CN115730092A Method, apparatus, device and storage medium for content presentation
  • TWI485616B Method for recording trajectory and electronic device
US20140201648A1 (en) Displaying hotspots in response to movement of icons
  • WO2021068382A1 Multi-window operation control method and apparatus, device, and storage medium
  • WO2023134599A1 Voice information sending method and apparatus, and electronic device
  • CN110502169B Display control method and terminal
US20130113741A1 (en) System and method for searching keywords
  • WO2020007010A1 Volume display method and apparatus, terminal device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18925044

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 08.04.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 18925044

Country of ref document: EP

Kind code of ref document: A1