EP3398111A1 - Depth sensing based system for detecting, tracking, estimating and identifying occupancy in real-time - Google Patents

Depth sensing based system for detecting, tracking, estimating and identifying occupancy in real-time

Info

Publication number
EP3398111A1
Authority
EP
European Patent Office
Prior art keywords
depth
based system
sensing based
sensor
depth sensing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP16826746.6A
Other languages
German (de)
English (en)
Other versions
EP3398111B1 (fr)
Inventor
Jonathan M. FRANCIS
Sirajum MUNIR
Charles P. Shelton
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of EP3398111A1 publication Critical patent/EP3398111A1/fr
Application granted granted Critical
Publication of EP3398111B1 publication Critical patent/EP3398111B1/fr
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/64Three-dimensional objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/53Recognition of crowd images, e.g. recognition of crowd congestion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit

Definitions

  • FIG. 1B illustrates a depth sensing system installed above an entryway
  • FIG. 1C illustrates a block diagram of the depth sensing system of FIG. 1B
  • FIG. 6A illustrates multilevel scanning at level 5'
  • FIG. 6C illustrates multilevel scanning at level 4'
  • FIG. 9C illustrates a door detection step after Canny edge detection in accordance with an exemplary embodiment
  • FIG. 10 is a graph illustrating an occupancy estimation performance at different frame rates in accordance with an exemplary embodiment.
  • FIG. 1C illustrates a block diagram of the depth sensing based occupancy tracking and estimation system 100 of FIG. 1B.
  • the system 100 includes a sensor 112, a processor 114, a computer readable medium 116, a communication interface 118, an input/output interface 120, and a graphical user interface (GUI) 122.
  • GUI graphical user interface
  • other computer implemented devices for performing other features not defined herein may be incorporated into the system 100.
  • One or more system buses B are coupled to the computer implemented devices 112, 114, 116, 118, 120, 122 for facilitating communication between the various computer implemented devices 112, 114, 116, 118, 120, 122, one or more output devices, one or more peripheral interfaces, and one or more communication devices.
  • the system buses 220 may be any type of bus structure, including a memory bus or memory controller, a peripheral bus, or a local bus, using any type of bus architecture.
  • the sensor 112 is a depth sensor, sometimes referred to as a Time of Flight (TOF) sensor, and is configured to detect the number of occupants at the site in real time. Although one sensor 102 is illustrated, more than one depth sensor may be disposed within the system 100. Other types of sensors, such as optical sensors, imaging sensors, acoustic sensors, motion sensors, global positioning system sensors, thermal sensors, environmental sensors, and so forth, may be coupled to the depth sensor and mounted within the system 100. In some embodiments, other non-depth sensors may be electrically coupled to the system 100 as separate devices.
  • TOF Time of Flight
  • Communication media typically includes computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and includes any information delivery media.
  • Communication media may also include wired media such as a wired network or direct-wired communication, and wireless media such as acoustic, RF, infrared (IR) and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
  • An occupancy estimation module 124 is incorporated in the system 100.
  • the module 124 may be communicatively coupled to one or more computer implemented devices 112, 114, 116, 118, 120, 122, in some embodiments.
  • the module 124 may be embedded into the processor 114 and is configured to detect the number of occupants at the site in real time, as described in further detail below.
  • FIG. 2 illustrates a layout example of a particular floor of a building 200 that will be used throughout this description to aid in describing the depth sensing based occupancy tracking and estimation system 100. The floor is divided into various conference rooms 224a-224c, offices 226a, 226b, and main hall offices 228a-228c.
  • the floor may further include other rooms, such as a kitchen, a laboratory, a general mailroom, and any other type of site.
  • depth sensing systems 100a-100g are installed to detect the number of occupants in each room in real time.
  • the depth sensing systems 100a-100g are installed above the entryways of the rooms 224a-224c, 226a-226b, 228a-228c.
  • the depth sensing systems 100a-100g, communicatively coupled to one or more networks and servers, are either powered by the same energy source or powered separately.
  • an optional depth sensing system may be installed to detect the number of occupants in those rooms.
  • FIG. 3 illustrates a chart of an occupancy pattern 300 taken over a time frame, i.e., two weeks, for a particular deployment.
  • the X-axis represents the date and time of an occupancy change 304.
  • the Y-axis represents the corresponding occupancy count at that date and time 306. For example, it shows that on September 25th at 1:00 PM, there were 44 people in this particular room 310b.
  • FIG. 4 illustrates a high level block diagram of an occupancy estimation module where depth data inputs are received for processing.
  • a depth sensor 112 of the depth sensing based occupancy tracking and estimation system 100 located above the entryway detects at least one object upon either entry into or exit from a particular room.
  • the detected depth data is then provided for preprocessing to remove noise embedded in the depth data.
  • At least a depth image 500 is displayed either on a client device 106 or on a display of the depth sensing system 100 at a low frame rate.
  • the depth image 500 has a 512x424 resolution and is produced at 30 FPS. Other resolutions, depending on the application, may be displayed.
  • the generated depth data stream in the image may be very noisy.
  • Each pixel of a depth frame or depth image provides the distance in millimeters from the depth sensing system 100 to the nearest object. However, in the presence of noise, the corresponding pixel has a value of 0 (see the preprocessing sketch following this list).
  • a multilevel scanning at step 406 is performed, where the system 100 scans a few potential depth levels to detect humans. For each level, the system extracts contours of potential heads in step 408 and ignores the depth data below that level. Using the contours, the depth estimation module detects minimum enclosing circles of the contours, which provide approximate centers and radii of the heads. For each identified circle, the module uses the 3D depth data to verify whether it corresponds to an individual, i.e., a human or non-human. Step 408 continues by verifying the presence of a head and a shoulder (see the scanning sketch following this list).
  • an orientation invariant 3D human model is used by leveraging anthropometric properties of human heads and shoulders as these are the most prominent body parts seen from the ceiling.
  • the orientation invariant 3D human model may be either integrated into the depth estimation module or integrated into the system 100 as a separate computer implemented device.
  • the system 100 tracks individuals to determine whether they are going inside or outside through the nearby door to count people in steps 414 and 416.
  • the system 100 determines the location of the door in step 418 and then tracks individuals in step 414 to determine whether they are entering or leaving a particular room in order to estimate the number of occupants in that room.
  • FIGs. 6A-6D illustrate images 600 taken by the depth sensing system 100 during a multilevel scanning process. Centers and radii of minimum enclosing circles of all the potential heads at different height levels are determined.
  • the system finds the minimum enclosing circle using an iterative algorithm, which gives the center and radius of a head. Circles with a radius smaller than the threshold for a head are discarded. For example, Person B may be discarded at the 6' level scan, but is detected at the 5'6" level scan. For each detected center and radius, the system verifies whether it is a person by verifying the presence of a head using head verification as defined in Equation 1 and a shoulder using shoulder verification described further below. Note that a person can be detected at multiple levels. The scanning process begins at the top, and when the system verifies a person at a higher depth level, all the nearby centers at lower levels are then discarded.
  • the depth sensing network system 100 may perform biometric tracking to identify and track individuals. For example, every time someone enters or exits, the depth sensing network system 100 extracts 38 simple features regarding the height, head radius, shoulder size, direction (going in/coming out), and walking speed of the subject. Although 38 simple features are disclosed, more or fewer than 38 simple features may be extracted, depending on the application. In one example, the system extracts 12 features regarding height, including the minimum, maximum, average, and exact height from the depth data when a person is crossing a threshold, such as the frame CDEF and line AB, during the entrance/exit event. Similar features are extracted regarding the head radius and shoulder size (see the feature extraction sketch following this list).
  • a train module, coupled to or integrated into the system and including at least one of several machine learning algorithms, is configured to identify individuals.
  • the machine learning algorithms include at least one of Naive Bayes, Multilayer Perceptron, Random Forest, or K*.
  • the machine learning algorithms may be run in the cloud (network 104).
  • alternatively, the machine learning algorithms may be run in an occupancy estimation module or an occupant identification module other than the train module.
  • the occupancy estimation module, the occupant identification module, or the like may be either integrated into the system 100 or communicatively coupled to the system 100. The following section shows an example of how its performance varies for different amounts of training data.
  • the system determines Di, which is 1 if the person is at the left side of Doori and 0 otherwise, where i ∈ {1, 2}. To be at the left side, someone's head center has to be at the left of the line segment AB for Door1, and at the left of all three line segments CD, DE, and EF for Door2.
  • the system increases the occupancy count if someone's Di is changed from 1 (at the previous frame) to 0 (at the current frame). If the direction Dj of the person (j is not equal to i) is changed from 1 to 0 later, the system does not increase the count again. However, if either Di or Dj is changed from 0 to 1 later, the system decreases the occupancy count and ignores a similar change (0 to 1) subsequently (see the counting sketch following this list).
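
The sketches below illustrate some of the steps described in this section. They are minimal illustrations written for this summary, not the patented implementation; all function names, thresholds, and parameters are hypothetical, and NumPy, OpenCV, and scikit-learn are assumed to be available. First, a sketch of the preprocessing step: zero-valued pixels mark noise in a depth frame, and one simple way to suppress them is to replace each invalid pixel with a locally representative depth taken from a median-filtered copy of the frame.

```python
import numpy as np
import cv2  # OpenCV, assumed available for median filtering


def preprocess_depth_frame(depth_mm: np.ndarray) -> np.ndarray:
    """Suppress zero-valued (noisy) pixels in a raw 512x424 depth frame.

    depth_mm: HxW array of millimeter distances; a value of 0 marks an invalid pixel.
    """
    frame = depth_mm.astype(np.float32)
    invalid = frame == 0
    # Median filtering gives each pixel a locally representative depth; large
    # blobs of zeros may still survive, which a full implementation would
    # handle with additional filtering.
    smoothed = cv2.medianBlur(frame, 5)
    frame[invalid] = smoothed[invalid]  # replace only the invalid pixels
    return frame
```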
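
Next, a sketch of the multilevel scanning and minimum-enclosing-circle step (steps 406/408): the depth frame is thresholded at a few candidate head heights, contours are extracted at each level, and circles too small to be heads are discarded. The mounting height, scan levels, and radius threshold below are assumed values, and the head/shoulder verification of Equation 1 is only indicated by a comment.

```python
import numpy as np
import cv2

SENSOR_HEIGHT_MM = 2750                     # assumed ceiling mounting height
SCAN_LEVELS_MM = [1830, 1680, 1520, 1370]   # assumed candidate head heights (~6' down to ~4'6")
MIN_HEAD_RADIUS_PX = 8                      # hypothetical minimum radius for a head


def scan_for_heads(depth_mm: np.ndarray):
    """Return candidate (center, radius, level) tuples from a top-down depth frame."""
    candidates = []
    for level in SCAN_LEVELS_MM:
        # Keep only pixels closer to the sensor than this level, i.e. body parts
        # at or above the candidate head height; everything below is ignored.
        cutoff = SENSOR_HEIGHT_MM - level
        mask = ((depth_mm > 0) & (depth_mm < cutoff)).astype(np.uint8) * 255
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for contour in contours:
            (cx, cy), radius = cv2.minEnclosingCircle(contour)
            if radius < MIN_HEAD_RADIUS_PX:
                continue  # too small to be a head at this level
            # A full implementation would verify head and shoulder geometry here
            # (Equation 1) and discard nearby centers already verified at a higher level.
            candidates.append(((int(cx), int(cy)), int(radius), level))
    return candidates
```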
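
A sketch of per-event feature extraction and training for occupant identification follows. The exact definitions of the 38 features are not reproduced here; the helpers below only assemble a representative subset (height, head-radius, and shoulder-size summaries plus direction and walking speed) and fit one of the classifier types named above (Random Forest) using scikit-learn. Names and array shapes are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier


def summary_features(samples) -> list:
    """Minimum, maximum, average, and final value of one measurement over an event."""
    s = np.asarray(samples, dtype=float)
    return [s.min(), s.max(), s.mean(), s[-1]]


def event_features(heights, head_radii, shoulder_sizes, entering: bool, speed_mps: float):
    """One feature vector per entrance/exit event (a subset of the 38 features)."""
    return np.array(
        summary_features(heights)
        + summary_features(head_radii)
        + summary_features(shoulder_sizes)
        + [float(entering), speed_mps]
    )


def train_identifier(X: np.ndarray, y: np.ndarray) -> RandomForestClassifier:
    """Fit an occupant-identification classifier on labeled event feature vectors."""
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, y)
    return clf
```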
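
Finally, a sketch of the door-side counting logic: Di can be computed from which side of a door's line segments the head center lies on (a 2D cross-product test), the count is incremented on a 1-to-0 transition of either Di, and later transitions by the other indicator are ignored until the opposite transition occurs. Per-frame tracking of individual people is assumed and omitted.

```python
def left_of_segment(p, a, b) -> bool:
    """True if point p lies to the left of the directed segment a -> b (2D cross product)."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) > 0


class OccupancyCounter:
    """Room occupancy count updated from per-person door-side indicators D1 and D2."""

    def __init__(self):
        self.count = 0

    @staticmethod
    def new_track() -> dict:
        # Per-person state: last D_i for each door, plus whether the current
        # entry (or exit) has already been counted.
        return {"D": {1: None, 2: None}, "entered": False, "exited": False}

    def update(self, track: dict, d1: int, d2: int) -> None:
        for i, di in ((1, d1), (2, d2)):
            prev = track["D"][i]
            if prev == 1 and di == 0 and not track["entered"]:
                self.count += 1                      # crossed into the room
                track["entered"], track["exited"] = True, False
            elif prev == 0 and di == 1 and not track["exited"]:
                self.count -= 1                      # crossed out of the room
                track["exited"], track["entered"] = True, False
            track["D"][i] = di
```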

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Alarm Systems (AREA)
  • Image Analysis (AREA)

Abstract

A depth sensing based system for detecting and/or tracking and/or estimating and/or identifying occupancy in real time comprises a processor, a computer readable medium, and a communication interface communicatively coupled to one another via a system bus. An occupancy estimation module is incorporated in the system. A greedy bipartite matching algorithm, which represents a fine-grained occupancy estimation module (FORK) leveraging the position, height, and head radius of people, is also incorporated in the system.
EP16826746.6A 2015-12-28 2016-12-28 Depth sensing based system for detecting, estimating and identifying occupancy in real time Active EP3398111B1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562271529P 2015-12-28 2015-12-28
PCT/EP2016/082737 WO2017114846A1 (fr) Depth sensing based system for detecting, tracking, estimating and identifying occupancy in real time

Publications (2)

Publication Number Publication Date
EP3398111A1 true EP3398111A1 (fr) 2018-11-07
EP3398111B1 EP3398111B1 (fr) 2023-12-13

Family

ID=57821930

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16826746.6A Active EP3398111B1 (fr) Depth sensing based system for detecting, estimating and identifying occupancy in real time

Country Status (3)

Country Link
EP (1) EP3398111B1 (fr)
CN (1) CN108701211B (fr)
WO (1) WO2017114846A1 (fr)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10599947B2 (en) 2018-03-09 2020-03-24 Ricoh Co., Ltd. On-demand visual analysis focalized on salient events
US11763111B2 (en) * 2018-05-04 2023-09-19 Rowan Companies, Inc. System and method for locating personnel at muster station on offshore unit
CN110634209B (zh) * 2018-06-25 2023-03-14 Robert Bosch GmbH Occupancy sensing system for supervised service management
FR3086782B1 (fr) * 2018-09-27 2021-12-03 Aereco Device and method for counting people
US10657746B1 (en) * 2019-01-18 2020-05-19 Robert Bosch Gmbh Access control system including occupancy estimation
GB2588106A (en) * 2019-10-07 2021-04-21 Seechange Tech Limited System and method for determining occupancy zones
CN117241133B (zh) * 2023-11-13 2024-02-06 武汉益模科技股份有限公司 Visual work-reporting method and system for simultaneous multi-process operations at non-fixed positions

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8229172B2 (en) * 2009-12-16 2012-07-24 Sony Corporation Algorithms for estimating precise and relative object distances in a scene
US20110169917A1 (en) * 2010-01-11 2011-07-14 Shoppertrak Rct Corporation System And Process For Detecting, Tracking And Counting Human Objects of Interest
US20110176000A1 (en) * 2010-01-21 2011-07-21 Utah State University System and Method for Counting People
US9690266B2 (en) * 2011-09-19 2017-06-27 Siemens Industry, Inc. Building automation system control with motion sensing
CN102609680B (zh) * 2011-12-22 2013-12-04 Institute of Automation, Chinese Academy of Sciences Parallel statistical learning method for human body part detection based on 3D depth image information
US8929592B2 (en) * 2012-03-13 2015-01-06 Mitsubishi Electric Research Laboratories, Inc. Camera-based 3D climate control
CN102930524A (zh) * 2012-09-11 2013-02-13 无锡数字奥森科技有限公司 Human head detection method based on a vertically mounted depth camera
US9165190B2 (en) * 2012-09-12 2015-10-20 Avigilon Fortress Corporation 3D human pose and shape modeling
US10009579B2 (en) * 2012-11-21 2018-06-26 Pelco, Inc. Method and system for counting people using depth sensor
CN103839038A (zh) * 2012-11-23 2014-06-04 Zhejiang Dahua Technology Co., Ltd. Method and device for counting people
TWI503756B (zh) * 2013-08-15 2015-10-11 Univ Nat Taiwan Human figure image tracking system and human figure image detection and tracking methods thereof

Also Published As

Publication number Publication date
CN108701211A (zh) 2018-10-23
EP3398111B1 (fr) 2023-12-13
CN108701211B (zh) 2023-09-12
WO2017114846A1 (fr) 2017-07-06

Similar Documents

Publication Publication Date Title
EP3398111B1 (fr) Depth sensing based system for detecting, estimating and identifying occupancy in real time
WO2020215961A1 (fr) Method and system for detecting personnel information for indoor thermal regulation
Munir et al. Real-time fine grained occupancy estimation using depth sensors on ARM embedded platforms
Benezeth et al. Towards a sensor for detecting human presence and characterizing activity
Chattopadhyay et al. Pose Depth Volume extraction from RGB-D streams for frontal gait recognition
US20180300887A1 (en) System and process for detecting, tracking and counting human objects of interest
Charfi et al. Optimized spatio-temporal descriptors for real-time fall detection: comparison of support vector machine and Adaboost-based classification
US7623674B2 (en) Method and system for enhanced portal security through stereoscopy
CN107256377B (zh) 用于检测视频中的对象的方法、设备和系统
Tian et al. Robust and efficient foreground analysis in complex surveillance videos
Salas et al. People detection using color and depth images
US10936859B2 (en) Techniques for automatically identifying secondary objects in a stereo-optical counting system
Hua et al. Pedestrian detection by using a spatio-temporal histogram of oriented gradients
Koh et al. An integrated automatic face detection and recognition system
US8873804B2 (en) Traffic monitoring device
US20200074228A1 (en) Rgbd sensing based object detection system and method thereof
JP2018067755A (ja) Detection data management device
US10657746B1 (en) Access control system including occupancy estimation
Garcia-Bunster et al. Crowded pedestrian counting at bus stops from perspective transformations of foreground areas
JP6786837B2 (ja) Moving object counting device and program
Maaspuro A low-resolution IR-array as a doorway occupancy counter in a smart building
JP2011198244A (ja) Object recognition system, and monitoring system and watching system using the same
Hernández et al. People counting with re-identification using depth cameras
WO2017135310A1 (fr) Passage count device, passage count method, program, and storage medium
Yuan et al. Pedestrian detection for counting applications using a top-view camera

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20180730

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: ROBERT BOSCH GMBH

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20210527

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602016084731

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: G06K0009000000

Ipc: G06V0020500000

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: G06V 20/52 20220101ALI20230421BHEP

Ipc: G07C 9/00 20060101ALI20230421BHEP

Ipc: G06V 40/10 20220101ALI20230421BHEP

Ipc: G06V 20/64 20220101ALI20230421BHEP

Ipc: G06V 20/50 20220101AFI20230421BHEP

INTG Intention to grant announced

Effective date: 20230523

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602016084731

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20240314

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231213

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20231213

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231213

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231213

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20240314

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231213

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20240313

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240227

Year of fee payment: 8

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1641110

Country of ref document: AT

Kind code of ref document: T

Effective date: 20231213

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231213

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231213

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231213

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20240313

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231213

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231213

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231213