WO2020135916A1 - Smart cabinet for disposable container system - Google Patents

Smart cabinet for disposable container system

Info

Publication number
WO2020135916A1
Authority
WO
WIPO (PCT)
Prior art keywords
disposal
injection device
disposal cabinet
cabinet according
injection
Prior art date
Application number
PCT/EP2018/097053
Other languages
English (en)
Inventor
Quentin Le Masne
Rodrigue CHATTON
Matthias Pfister
Nicolas THEVENAZ
Yves Pellaton
Original Assignee
Ares Trading S.A.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ares Trading S.A. filed Critical Ares Trading S.A.
Priority to PCT/EP2018/097053 priority Critical patent/WO2020135916A1/fr
Publication of WO2020135916A1 publication Critical patent/WO2020135916A1/fr

Links

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B50/00 - Containers, covers, furniture or holders specially adapted for surgical or diagnostic appliances or instruments, e.g. sterile covers
    • A61B50/30 - Containers specially adapted for packaging, protecting, dispensing, collecting or disposing of surgical or diagnostic appliances or instruments
    • A61B50/36 - Containers specially adapted for packaging, protecting, dispensing, collecting or disposing of surgical or diagnostic appliances or instruments for collecting or disposing of used articles
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/90 - Identification means for patients or instruments, e.g. tags
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/90 - Identification means for patients or instruments, e.g. tags
    • A61B90/92 - Identification means for patients or instruments, e.g. tags coded with colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/243 - Classification techniques relating to the number of classes
    • G06F18/24323 - Tree-organised classifiers
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/10 - Image acquisition
    • G06V10/12 - Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 - Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/147 - Details of sensors, e.g. sensor lenses
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/56 - Extraction of image or video features relating to colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/755 - Deformable models or variational models, e.g. snakes or active contours
    • G06V10/7553 - Deformable models or variational models, e.g. snakes or active contours based on shape, e.g. active shape models [ASM]
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 - Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017 - Electrical control of surgical instruments
    • A61B2017/00115 - Electrical control of surgical instruments with audible or visual output
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 - Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017 - Electrical control of surgical instruments
    • A61B2017/00199 - Electrical control of surgical instruments with a console, e.g. a control panel with a display
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B50/00 - Containers, covers, furniture or holders specially adapted for surgical or diagnostic appliances or instruments, e.g. sterile covers
    • A61B50/10 - Furniture specially adapted for surgical or diagnostic appliances or instruments
    • A61B2050/105 - Cabinets
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/08 - Accessories or related features not otherwise provided for
    • A61B2090/0804 - Counting number of instruments used; Instrument detectors
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B50/00 - Containers, covers, furniture or holders specially adapted for surgical or diagnostic appliances or instruments, e.g. sterile covers
    • A61B50/30 - Containers specially adapted for packaging, protecting, dispensing, collecting or disposing of surgical or diagnostic appliances or instruments
    • A61B50/36 - Containers specially adapted for packaging, protecting, dispensing, collecting or disposing of surgical or diagnostic appliances or instruments for collecting or disposing of used articles
    • A61B50/362 - Containers specially adapted for packaging, protecting, dispensing, collecting or disposing of surgical or diagnostic appliances or instruments for collecting or disposing of used articles for sharps
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M - DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M5/00 - Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests
    • A61M5/178 - Syringes
    • A61M5/31 - Details
    • A61M5/32 - Needles; Details of needles pertaining to their connection with syringe or hub; Accessories for bringing the needle into, or holding the needle on, the body; Devices for protection of needles
    • A61M5/3205 - Apparatus for removing or disposing of used needles or syringes, e.g. containers; Means for protection against accidental injuries from used needles

Definitions

  • the present invention relates to a disposal cabinet for safely and efficiently discarding a variety of injection devices as well as identifying the injection devices as they’re being discarded.
  • Many people diagnosed with one or more chronic diseases are prescribed regular injections with a variety of injection devices (e.g., syringes, needles, cartridges, and auto-injectors).
  • dexterity impairments (muscle weakness and numbness, pain, swelling in joints, spasms)
  • visual impairments
  • cognitive impairments (memory loss, confusion and forgetfulness, fatigue, depression and anxiety)
  • for people with such impairments, adhering to a prescribed treatment of regular injections may prove quite difficult.
  • people with the aforementioned impairments may find it difficult to properly and safely dispose of the different injection devices. Further, such people may also find it difficult to keep track of all the different injections prescribed in the treatment.
  • Figure 1 illustrates an example embodiment of a casing of the injection device disposal cabinet.
  • Figure 2A illustrates an example embodiment of an optical module of the injection device disposal cabinet.
  • Figure 2B illustrates an exploded view of an example embodiment of an optical module of the injection device disposal cabinet.
  • Figure 2C illustrates an example embodiment of a variety of reflected image paths in the optical module depicted in Figure 2B.
  • Figure 3 illustrates an example embodiment of a transfer module of the injection device disposal cabinet.
  • Figure 4 illustrates an example embodiment of a control module of the injection device disposal cabinet.
  • Figure 5 illustrates an example embodiment of the electronics architecture of the injection device disposal cabinet.
  • Figure 6A illustrates an example embodiment of the process utilized to identify an injection device.
  • Figure 6B illustrates an example of the decision trees concept utilized in Figure 6A.
  • Figure 7 provides a non-exhaustive list of injection devices recognizable by the injection device disposal cabinet.
  • Figure 8 illustrates a flow diagram of the injection device identification process.
  • Figure 9 illustrates an example embodiment of a user interaction with the disposal cabinet.
  • One aspect of the present disclosure is to provide systems for efficiently disposing and identifying a variety of different injection devices.
  • the systems herein address at least one of the problems discussed above.
  • a disposal cabinet for discarding and identifying a variety of injection devices includes an optical module, wherein the optical module includes an image acquisition device; and a processor, wherein, upon detecting an object approaching the disposal cabinet, the processor is configured to: receive, in the optical module, an injection device, capture, with the image acquisition device, an image of the injection device, discard the received injection device into a disposal container coupled to the optical module, and identify the injection device based on the captured image.
  • Figure 1 illustrates an example embodiment of a casing of the injection device disposal cabinet.
  • the casing of the injection device disposal cabinet 100 includes an activity area 101, a graphical user interface 102, an opening 103, a handle 104, a pairing button 105, a power switch 106, a power plug 107, and a door 108.
  • the activity area 101 includes a proximity sensor to detect approaching objects (e.g., person’s hand and/or the injection device).
  • the system on the disposal cabinet is activated and the opening 103 (e.g., a trapdoor) is opened in order to receive the injection device.
  • the injection device can be disposed of with only a single hand.
  • the shape of the opening 103 is compatible to receive various injection device sizes. Therefore, users suffering from many of the impairments discussed above will be able to easily interact with the opening 103.
  • the opening 103 also shields the optical module from environmental perturbations like direct sun illumination or dust.
  • the opening 103 is located near the graphical user interface 102. In an embodiment, the graphical user interface 102 informs the user about the various disposal cabinet-related states.
  • the handle 104, the pairing button 105, the power switch 106, the power plug 107, and the door 108 can be located on a variety of different sides of the casing of the disposal cabinet 100.
  • the handle 104 facilitates the user’s manipulation of the disposal cabinet 100. Therefore, the disposal cabinet 100 can be easily moved and transported to a variety of locations.
  • the pairing button 105 allows the disposal cabinet 100 to pair (e.g., via a Bluetooth interface) to a user’s smart device (e.g., smartphone, tablet, smart watch). Therefore, the device is able to transmit important information from the disposal cabinet 100 to the user’s smart device.
  • the power switch 106 is used to activate and de-activate the power being supplied to the electrical components of the disposal cabinet 100 via an internal battery. Further, the power plug 107 is utilized to charge the internal battery as needed. Further, in an embodiment, the door 108 allows the user to place a disposal container (not shown) within the cabinet when necessary. In another embodiment, the door 108 may be implemented with a child-safety lock.
  • Figure 2A illustrates an example embodiment of an optical module of the injection device disposal cabinet.
  • the optical module 110 is integrated within the disposal cabinet 100.
  • the optical module 110 includes a plurality of reflecting surfaces 111, a plurality of transparent holding elements 112, a reflecting surface 113, and an image acquisition device 114.
  • the reflecting surfaces 111 are configured in a V-shape and are utilized to reflect an image of a side of an injection device 50 facing the reflecting surfaces 111 (e.g., the back side of the injection device 50).
  • the reflecting surfaces 111 are a plurality of mirrors.
  • the transparent holding elements 112 are also configured in a V-shape and are each located at a predefined distance from a respective reflecting surface 111.
  • the transparent holding elements are configured to hold the injection device 50 after it is received by the opening 103 in Figure 1.
  • the transparent holding elements 112 are a plurality of transparent glass slabs.
  • the reflecting surface 113 is located at the top of the optical module 110 and is configured to reflect images of the injection device 50 to the image acquisition device 114.
  • the reflecting surface 113 is configured to reflect images of the injection device 50 initially reflected by the reflecting surfaces 111.
  • the image acquisition device 114 is configured to acquire an image of the injection device 50 based on the reflections from the reflecting surfaces 111 and 113.
  • the image acquisition device 114 is a CMOS image sensor configured to receive and capture color images.
  • the image acquisition device 114 is a CCD image sensor configured to receive and capture color images.
  • the captured image may then be stored for further processing (e.g., identification of the injection device 50).
  • Figure 2B illustrates an exploded view of an example embodiment of an optical module of the injection device disposal cabinet. As depicted in the figure, the transparent holding elements 112 are maintained in the V-shape by holding elements 115 coupled to each of the transparent holding elements 112.
  • Figure 2C illustrates an example embodiment of a variety of reflected image paths in the optical module depicted in Figure 2B.
  • the image acquisition device 114 receives image paths 111a and 113a.
  • the reflected image path 111a corresponds to the optical path of the reflected images of the injection device 50 initially reflected by the reflecting surfaces 111 and then reflected by the reflecting surface 113.
  • the reflected image paths 113a correspond to the optical paths of reflected images of the injection device 50 that were not initially reflected by the reflecting surfaces 111.
  • the reflected image paths 113a correspond to images of the injection device 50 that were reflected only by the reflecting surface 113.
  • an image of the injection device 50 may be captured from multiple points of view. Accordingly, a sufficient amount of information may be collected in order to determine the discriminators, such as color and shape, of the injection device 50.
  • additional sensing systems can also be included in order to determine other discriminators, such as weight, metallic or ferromagnetic content, Optical Character Recognition (OCR) and barcode (e.g., one-dimensional and two-dimensional) reading.
  • Figure 3 illustrates an example embodiment of a transfer module of the injection device disposal cabinet.
  • the transfer module 120 includes an exit 121, an exit actuator 122, a disposal container interface 123, and a disposal container 124.
  • the exit 121 is implemented on a bottom end of the optical module 110.
  • the exit 121 is configured to hold the injection device 50 while the image acquisition device 114 captures an image of the injection device 50.
  • the exit 121 is a trapdoor.
  • the exit 121 may be electromechanically actuated (e.g., opened) by the exit actuator 122.
  • the exit 121 is opened without any additional user interaction.
  • the injection device 50 is released into the disposal container 124.
  • the injection device 50 is released via the disposal container interface 123.
  • the disposal container interface 123 includes a conical shape.
  • the disposal container 124 is, for example, a sharps disposal container.
  • the opening of the disposal container is configured to be spread by the conically-shaped disposal container interface 123. As such, large injection devices will not be blocked by the safety features at the opening of the disposal container 124.
  • the transfer module 120 may include additional sensors to be implemented around the disposal container 124. In an embodiment, these additional sensors may be used to monitor the filling level of the disposal container 124 as well as the presence of the disposal container 124 in the disposal cabinet 100.
  • Figure 4 illustrates an example embodiment of a control module of the injection device disposal cabinet.
  • the control module 130 may include a first printed circuit board (PCB) 131, a second PCB 132, a battery 133, and a third PCB 134.
  • the first PCB 131 may be dedicated to data processing and radio frequency (RF) data transfer.
  • the first PCB may be placed on top of the disposal container 124 close to the image acquisition device 114.
  • the second PCB 132 may be dedicated to power management and illumination control of the disposal cabinet 100.
  • the second PCB 132 may be responsible for the monitoring and managing of the battery 133.
  • the first PCB 131 may also be dedicated to the power management and illumination control of the disposal cabinet 100.
  • the third PCB 134 may be dedicated to the filling level and container presence sensors discussed above.
  • the third PCB 134 may also host the driver for controlling the actuators for opening and closing the opening 103 and the exit 121.
  • the third PCB 134 may be located below the optical module 110 on the front side of the disposal container 124.
  • a single PCB could be utilized to perform all of the functions (and more) described above for PCBs 131, 132, and 134.
  • the identification of the injection device 50 can be performed in the following steps. First, one of the user or the injection device 50 has to be detected by the proximity sensor associated with the activity area. Upon detection of one of the user or the injection device 50, the processor associated with the PCB 131 performs at least one of the following: (i) activates the disposal cabinet 100, (ii) opens the opening 103, and (iii) initiates the image acquisition device 114. Then, while the opening 103 is open, the injection device 50 is discarded into the disposal cabinet 100. Once the injection device 50 is secured in the optical module 110, the processor closes the opening and triggers the image acquisition device 114.
  • the image acquisition device 114 then captures an image of the discarded injection device 50 while it is secured in the optical module 110 by the transparent holding elements 112 and the exit 121. Then, after the image of the injection device 50 is captured and stored in memory, the processor opens the exit 121 in order to release the injection device 50 into the disposal container 124. The processor then applies a series of detection filters (e.g., decision trees) on the stored image of the discarded injection device 50. The processor then performs a vote for each decision tree in order to select the most probable match among a database of injection device references (e.g., classes) stored locally in the disposal cabinet 100.
  • the processor 131 finds the "best match" and, as such, identifies the discarded injection device 50.
  • the identification result is then locally stored in memory.
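The disposal sequence above (detect, open, secure, capture, release, identify, store) can be sketched as a simple control flow. The class and method names below, and the stubbed camera and classifier, are illustrative assumptions for this sketch, not the patented implementation.

```python
# Illustrative sketch of the disposal sequence described above. The hardware
# interfaces (camera, classifier, door actuation) are stand-in assumptions.

class DisposalCabinet:
    def __init__(self, camera, classifier):
        self.camera = camera          # stand-in for image acquisition device 114
        self.classifier = classifier  # stand-in for the decision-forest matching
        self.log = []                 # locally stored events/identification results

    def on_proximity_detected(self):
        """Runs when the proximity sensor in the activity area fires."""
        self.log.append("opening 103 opened")     # device is dropped in
        self.log.append("opening 103 closed")     # device secured in optical module
        image = self.camera()                     # capture while device is held
        self.log.append("exit 121 opened")        # release into disposal container
        result = self.classifier(image)           # identify from the stored image
        self.log.append(f"identified: {result}")
        return result

# Usage with stubbed hardware:
cabinet = DisposalCabinet(camera=lambda: "raw-image",
                          classifier=lambda img: "autoinjector-A")
assert cabinet.on_proximity_detected() == "autoinjector-A"
```

The ordering matters: the image is captured only after the opening closes (shielding the optics from ambient light) and before the exit releases the device.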
  • the processor may then automatically connect, via Bluetooth, to the user’s smart device.
  • the graphical user interface 102 may inform the user on the status of (i) the system itself (e.g., on, processing, error, off), (ii) the filling level of the disposal container, (iii) the battery level, and (iv) the presence of a disposal container.
  • the graphical user interface 102 may also provide wireless connection to an application located on the user’s smart device and/or associated with another device or web application.
  • the disposal cabinet 100 may provide the results of the identification as well as the filling level of the disposal container to the user’s smart device.
  • the smart device can then further transfer this information to a cloud application.
  • the cloud application can also be accessed by others (e.g., physicians and/or nurses) tending to the user.
  • the disposal cabinet may then return to sleep mode.
  • Figure 5 illustrates an example embodiment of the electronics architecture of the injection device disposal cabinet.
  • the electronics architecture includes the image acquisition device 114, optical module illumination LEDs 140, Bluetooth module 150, power management 160, sensors and actuators 170, and user interface 102.
  • the image acquisition device 1 14 is a CMOS image sensor.
  • the CMOS image sensor may include an integrated image processor.
  • for the optical module illumination LEDs 140, the disposal cabinet 100 may require a plurality of LEDs.
  • the Bluetooth module 150 permits the transfer of the data to the user’s smart device or other devices.
  • the Bluetooth module may include a built-in antenna and a surface mount module that integrates the complete Bluetooth stack onboard.
  • the disposal cabinet 100 may limit the communication to a set of unidirectional commands from the disposal cabinet 100 to the user's smartphone only.
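A unidirectional command from the cabinet to the smartphone could carry the identification result and housekeeping status in one payload. The message shape and field names below are assumptions for illustration; the disclosure does not specify a wire format.

```python
import json

# Hypothetical sketch of a unidirectional status payload pushed from the
# cabinet to the paired smartphone. Field names are invented for this sketch.

def status_message(identified_device, fill_percent, battery_percent):
    """Serialize one cabinet-to-phone status update."""
    return json.dumps({
        "type": "status",
        "identified_device": identified_device,  # result of the decision forest
        "fill_level": fill_percent,              # disposal container filling level
        "battery": battery_percent,              # fuel-gauge reading
    })

msg = status_message("autoinjector-A", 40, 85)
assert json.loads(msg)["fill_level"] == 40
```

Because the channel is one-way, the phone application only parses these updates and never issues commands back to the cabinet.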
  • the disposal cabinet 100 may include a battery cell, a battery charger circuit, a fuel gauge (which communicates with the processor 131) to monitor the battery charge status and capacity degradation allowing for an accurate prediction of the available capacity, a charge connector (e.g., power plug 107), and a main switch (e.g., power switch 106).
  • the disposal cabinet 100 may include a proximity sensor, multiple door actuators (e.g., for the opening 103 and the exit 121, respectively), a disposal container presence detector, and a filling level detector.
  • the proximity sensor may be used to automatically activate the disposal cabinet 100 as an object (e.g., user and/or injection device 50) approaches the disposal cabinet 100.
  • the proximity sensor may include at least one of an infrared (IR) sensor, ultrasonic sensor, and a capacitive sensor.
  • the disposal container detector may include at least one of a mechanical switch and an IR sensor (e.g., reflective or transmissive).
  • the disposal container detector may also be implemented with an available image sensor (e.g., image acquisition device 114).
  • the filling level detector may include at least one of an IR sensor, an ultrasonic sensor, and a mechanical finger (e.g., to sense the filling level).
  • the filling level detector may also be implemented with an available image sensor (e.g., image acquisition device 114).
  • the multiple door actuators can be utilized to open and close the opening 103 and the exit 121, respectively.
  • the door actuators can be implemented with a low-cost servo or linear actuator in conjunction with an H-bridge driver.
  • the user interface 102 can be utilized to implement the following elements: status symbols (e.g., battery, filling level, pairing, etc.), LED backlight or OLED display, and a speaker.
  • Figure 6A illustrates an example embodiment of the process utilized to identify an injection device.
  • the identification algorithm is based on the concept of decision forests architecture (also called Random Forest). It consists of combining multiple uncorrelated decision trees with each of them giving votes for a subset of classes. As such, by combining the multiple decision trees, a more robust prediction of the class to which a sample belongs can be expected.
  • a single decision tree (e.g., a filter for a certain discriminator) is not sufficient to identify a unique injection device 50; additional relevant decision trees are required.
  • Figure 6B illustrates an example of the decision trees concept utilized in Figure 6A. This example is based on identifying a given picture among 3 possibilities (e.g., classes): a green square (class A), a green triangle (class B) and a red triangle (class C). Looking at only a single piece of information, like the amount of green color (the color decision tree), will not be sufficient to reliably identify the picture as belonging to only one of these classes. In fact, identifying the amount of green color alone will not be sufficient to distinguish between classes A and B. Further, the shape (or the surface) alone will also not be sufficient to discriminate between all references. Therefore, both decision trees have to be combined to get a reliable identification, which leads to the building of a decision forest. The identification relies on two steps, according to Figure 6B:
  • the first step consists of defining the attributes of classes that will be used (e.g., as a look-up table) during identification.
  • Each decision tree is applied to each class (i.e., each separate reference image/product to identify) to get a "score."
  • a table is then built containing the relevant class information for each tree.
  • the look-up table includes the characteristics of each class for each attribute (e.g., the class for injection device A may have the following information; blue, conical, etc.).
  • the look-up table is then stored in the relevant product (e.g., disposal cabinet 100).
  • the second step includes the voting of the classes, a step that takes place by selecting an image at random (e.g., with regard to the disposal cabinet 100, this happens each time a user throws a new syringe in).
  • the two decision trees are applied and votes are attributed to the corresponding and predefined classes.
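The two-step scheme of Figure 6B (build the look-up table, then let each tree vote) can be worked through in code. The classes A, B, C and the colour/shape trees come from the figure; the dictionary layout and function names are assumptions of this sketch.

```python
from collections import Counter

# Worked sketch of the Figure 6B example: three classes (A: green square,
# B: green triangle, C: red triangle) and two decision trees (colour, shape).
# Step 1: the look-up table holds each class's attribute for each tree.
LOOKUP = {
    "A": {"color": "green", "shape": "square"},
    "B": {"color": "green", "shape": "triangle"},
    "C": {"color": "red",   "shape": "triangle"},
}

def classify(sample):
    """Step 2: each tree votes for every class matching its attribute."""
    votes = Counter()
    for tree in ("color", "shape"):
        for cls, attrs in LOOKUP.items():
            if attrs[tree] == sample[tree]:
                votes[cls] += 1
    return votes.most_common(1)[0][0]

# A green triangle collects a colour vote (shared with A) and a shape vote
# (shared with C), so only class B gets two votes:
assert classify({"color": "green", "shape": "triangle"}) == "B"
```

This mirrors the text's point: neither tree alone separates all three classes, but the combined votes do.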
  • Figure 7 provides a non-exhaustive list of injection devices recognizable by the injection device disposal cabinet. As depicted in the figure, it can be seen that the injection devices share many similar aspects, and only minor discriminators can be found between the injection devices, which stresses the importance of finding relevant discriminators.
  • Figure 8 illustrates a flow diagram of the injection device identification process utilized in the disposal cabinet.
  • a training database is populated with a set of images of a variety of different injection devices.
  • the training database can be populated with a set of characteristics for each class for each of the injection devices. For example, the training database can be populated with 92 images corresponding to 21 injection devices. Then, from this database, a plurality of features are extracted.
  • the features may be red, green, blue, yellow, cyan, magenta, total color pixels, length, width, surface, brightness (e.g., average value of all the pixel values within the acquired picture), contrast (e.g., sum of the pixels of the acquired picture after detecting discontinuities in brightness using the Sobel method), barcode (e.g., measurement of the profile of the light intensity measured along the injection device axis, helping to differentiate the various injection devices between them), and yellow-green content in a specified region of interest.
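A few of the simpler features in that list (colour content and brightness) can be sketched directly on raw pixel data. The tiny nested-list "image" and the dominance thresholds below are illustrative assumptions; a real embodiment would use a proper image-processing library (e.g., for the Sobel-based contrast feature, which is omitted here).

```python
# Minimal sketch of a few of the listed features, computed on a tiny RGB
# "image" represented as nested lists of (r, g, b) tuples. The channel-
# dominance rule used for the colour features is an assumption.

def extract_features(image):
    pixels = [px for row in image for px in row]
    n = len(pixels)
    # brightness: average value over all pixels of the acquired picture
    brightness = sum(sum(px) / 3 for px in pixels) / n
    # colour features: fraction of pixels dominated by each channel
    red = sum(1 for r, g, b in pixels if r > g and r > b) / n
    green = sum(1 for r, g, b in pixels if g > r and g > b) / n
    blue = sum(1 for r, g, b in pixels if b > r and b > g) / n
    return {"brightness": brightness, "red": red, "green": green, "blue": blue}

img = [[(200, 10, 10), (10, 200, 10)],
       [(10, 10, 200), (10, 200, 10)]]
feats = extract_features(img)
assert feats["green"] == 0.5 and feats["red"] == 0.25
```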
  • once the decision trees have been defined, the extracted features are combined to generate bagged decision trees. Then, based on the generated decision trees, the look-up table (i.e., classifier) is generated.
  • the look-up table is then stored for later use. Specifically, the look-up table is later utilized to classify (i.e., identify) any newly-discarded injection device. For example, after the disposal cabinet captures an image of the injection device (e.g., object), the processor (i) extracts the relevant features from the image and (ii) compares the features with the look-up table to predict the class associated with the injection device. Then, in order to determine the confidence level of the prediction, a distance between the extracted feature and the value for this class in the look-up table is determined. This estimation is very useful to identify potential areas of improvement for the algorithm. For example, if an injection device always has a medium score, a typical discriminator associated with it can be added to the features recognized by the decision forest. Accordingly, using the decision forest, combined with multiple discriminators, multiple injection devices can be recognized and differentiated from each other.
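The classification-plus-confidence step just described can be sketched as a nearest-class search in feature space, with the residual distance serving as the confidence signal. The class names, feature values, and use of Euclidean distance below are assumptions of this sketch, not specifics from the disclosure.

```python
import math

# Sketch of the classification step: pick the class whose stored feature
# values are closest to the extracted features; the distance doubles as an
# (inverse) confidence estimate. All values here are invented.

LOOKUP = {
    "syringe-A":      {"length": 120.0, "green": 0.05},
    "autoinjector-B": {"length": 160.0, "green": 0.60},
}

def predict(features):
    def distance(ref):
        # compare only the features the look-up table knows about
        return math.dist([features[k] for k in ref], [ref[k] for k in ref])
    cls = min(LOOKUP, key=lambda c: distance(LOOKUP[c]))
    return cls, distance(LOOKUP[cls])   # small distance => high confidence

cls, d = predict({"length": 158.0, "green": 0.55})
assert cls == "autoinjector-B"
```

A device that repeatedly lands at a middling distance from its best class is exactly the "medium score" case the text mentions, flagging that an extra discriminator should be added.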
  • Figure 9 illustrates an example embodiment of a user interaction with the disposal cabinet.
  • the user performs an injection with a certain injection device 50.
  • the user may then discard the injection device 50 into the disposal cabinet 100 as depicted in step 220.
  • the disposal cabinet 100 processes the injection device 50 and displays a confirmation to the user.
  • the processing may include: (i) capturing an image of the injection device 50, (ii) discarding the injection device 50 into the disposal container 124, and (iii) identifying the injection device 50.
  • the processing may also include the monitoring of the filling level (e.g., of the injection devices 50) in the disposal container 124.
  • the disposal cabinet 100 sends the injection device identification information to the user's smart device application.
  • the disposal cabinet 100 may also send the filling level information to the user's smart device application.
  • such information may also be sent to a cloud application to be accessed by other relevant parties, e.g., medical professionals. Accordingly, the medical professionals can also remotely track the user’s treatment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Software Systems (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Vascular Medicine (AREA)
  • Data Mining & Analysis (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Of Solid Wastes (AREA)

Abstract

Systems according to the present invention provide a disposal cabinet for disposing of and identifying a variety of injection devices. Embodiments relate to identifying the injection device based on at least a colour and a shape of the injection device. Identification of the injection device is performed with a plurality of decision trees. The injection devices are discarded safely and efficiently into a disposal container located inside the disposal cabinet.
PCT/EP2018/097053 2018-12-27 2018-12-27 Armoire intelligente pour système de conteneur jetable WO2020135916A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2018/097053 WO2020135916A1 (fr) 2018-12-27 2018-12-27 Armoire intelligente pour système de conteneur jetable

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2018/097053 WO2020135916A1 (fr) 2018-12-27 2018-12-27 Armoire intelligente pour système de conteneur jetable

Publications (1)

Publication Number Publication Date
WO2020135916A1 true WO2020135916A1 (fr) 2020-07-02

Family

ID=65241200

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/097053 WO2020135916A1 (fr) 2018-12-27 2018-12-27 Armoire intelligente pour système de conteneur jetable

Country Status (1)

Country Link
WO (1) WO2020135916A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0773175A1 (fr) * 1995-11-08 1997-05-14 Zenon Collecteur de déchets spécifiques, notamment médicaux
US20050065820A1 (en) * 2003-09-19 2005-03-24 Mallett Scott R. System and method for sorting medical waste for disposal
US20090317002A1 (en) * 2008-06-23 2009-12-24 John Richard Dein Intra-operative system for identifying and tracking surgical sharp objects, instruments, and sponges
US7971715B1 (en) * 2004-11-15 2011-07-05 Deroyal Industries, Inc. Medical disposables containers
US20150125072A1 (en) * 2013-11-05 2015-05-07 Canon Kabushiki Kaisha Data processing method for learning discriminator, and data processing apparatus therefor
EP3461520A1 (fr) * 2017-10-02 2019-04-03 Ares Trading S.A. Armoire intelligente pour un système de contenant jetable


Similar Documents

Publication Publication Date Title
US11335172B1 (en) Sharing video footage from audio/video recording and communication devices for parcel theft deterrence
US20200320837A1 (en) Parcel theft deterrence for audio/video recording and communication devices
JP2023521616A (ja) 光センシングベースのインベントリ管理システムおよび方法
CN105740921B (zh) 一种智能档案架管理控制系统及控制方法
US20100183199A1 (en) Systems and methods for biometric identification
CN106061386A (zh) 用于介入式设备的插入期间增强可视化的可佩戴电子设备
CN107209857A (zh) 用户终端及其提供方法
CN106412491A (zh) 视频监控方法、装置及系统
US20220042912A1 (en) Systems and methods for detection of contaminants on surfaces
KR102534525B1 (ko) 자가 실시형의 변조 방지용 약물 검출
CN111710107A (zh) 智能物品存放柜及存放管理系统
CN106846750A (zh) 一种自动布防撤防系统及方法
WO2021092883A1 (fr) Procédé de gestion d'article, appareil terminal, dispositif de gestion d'article, et support de stockage
CN107507320A (zh) 一种基于人脸识别的幼儿园门禁系统
US10475310B1 (en) Operation method for security monitoring system
EP3461520A1 (fr) Armoire intelligente pour un système de contenant jetable
KR20170013596A (ko) 안전 강화 방법 및 장치
CN106462732A (zh) 室外及室内的虹膜图像获得装置及方法
WO2020135916A1 (fr) Armoire intelligente pour système de conteneur jetable
US11853827B2 (en) Handheld device
CN102525425B (zh) 生理信息识别装置及生理信息识别方法
CN209625240U (zh) 一种静脉识别身份认证装置及系统
US11049242B2 (en) Portable and rapid screening in vitro detection system
CN112200128A (zh) 指纹采集方法、指纹采集装置、电子装置和存储介质
CN109284730B (zh) 应用于筛选数据的方法、装置以及监控系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18842423

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18842423

Country of ref document: EP

Kind code of ref document: A1