WO2024020408A1 - Method of developing a dataset of objects from images and uses thereof - Google Patents

Method of developing a dataset of objects from images and uses thereof

Info

Publication number
WO2024020408A1
Authority
WO
WIPO (PCT)
Prior art keywords
learning model
image
machine
electronic device
user
Prior art date
Application number
PCT/US2023/070439
Other languages
English (en)
Inventor
Ian TROISI
Original Assignee
Imagine Technologies, Inc.
Priority date
Filing date
Publication date
Application filed by Imagine Technologies, Inc. filed Critical Imagine Technologies, Inc.
Publication of WO2024020408A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/0464Convolutional networks [CNN, ConvNet]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/09Supervised learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

Definitions

  • Electroencephalography is an electrophysiological monitoring method to non-invasively record electrical activity on a human’s scalp that has been shown to represent the macroscopic activity of the surface layer of the brain underneath.
  • a brain-computer interface (BCI) is a communication system that can help users interact with the outside environment by translating brain signals into machine commands.
  • a method includes, by a processor, receiving an image of an object in an environment and receiving coordinates of a bounding box containing the object in the captured image. The method further includes, by the processor, in response to receiving the coordinates of the bounding box, training a machine learning model.
  • the machine learning model is trained by generating a configuration file containing the coordinates of the bounding box and generating multiple copies of (i) the configuration file and (ii) the portion of the image in the bounding box.
  • the machine-learning model is further trained by applying the copies of the configuration file and the portion of the image in the bounding box to the machine-learning model, yielding a trained machine-learning model that is trained to recognize the object in a video stream.
  • the method further includes transmitting the trained machine-learning model to an electronic device for storage in a data store in the electronic device.
  • receiving the coordinates of the bounding box containing the object includes displaying, to a user, the image of the object and receiving a bounding box constructed by the user around the object in the image.
  • the method may further include receiving a command associated with the object; and associating the command with the trained machine-learning model.
  • the command may include a computer-executable function.
  • the command is configured to, when executed by the electronic device, cause the object to change from a first state to a different state.
  • the electronic device may be a wearable electronic device configured to apply the trained machine-learning model to images captured by a camera of the wearable electronic device.
  • the configuration file may further contain a user-applied label for the object.
  • the machine-learning model is configured to determine a probability that the object is in the image, and the machine-learning model further configured to recognize the object when the determined probability satisfies a probability threshold.
  • receiving the image of the object in the environment includes receiving the image of the object by a camera of a mobile electronic device in wireless communication with the electronic device.
  • In an embodiment, a system includes a mobile device having a camera to capture images of objects in an environment.
  • the mobile device is configured to receive an image of an object in the environment and receive coordinates of a bounding box containing the object in the captured image.
  • In response to receiving the coordinates of the bounding box, the mobile device is configured to train a machine-learning model by generating a configuration file containing the coordinates of the bounding box; generating multiple copies of (i) the configuration file and (ii) at least the portion of the image in the bounding box; and applying the copies of the configuration file and the portion of the image in the bounding box to the machine-learning model to train the machine-learning model, yielding a machine-learning model that is trained to recognize the object in a video stream.
  • the mobile device is further configured to transmit the trained machine-learning model to an electronic device for storage in a data store in the electronic device.
  • Implementations of the disclosure may include one or more of the following optional features.
  • the mobile device is configured to receive the coordinates of the bounding box containing the object by displaying, to a user, the image of the object and receiving a bounding box constructed by the user around the object in the image.
  • the mobile device may be further configured to receive a command associated with the object and associate the command with the trained machine-learning model.
  • the mobile device may be configured to receive the command by receiving a computer-executable function.
  • the command may be configured to, when executed by the electronic device, cause the object to change from a first state to a different state.
  • the electronic device may be a wearable electronic device configured to apply the trained machine-learning model to images captured by a camera of the wearable electronic device.
  • transmitting the trained machine-learning model to the wearable electronic device causes the wearable electronic device to execute the command in response to the trained machine-learning model detecting the object in images captured by the camera.
  • the configuration file may further contain a user-applied label for the object.
  • the machine-learning model may be configured to determine a probability that the object is in the image, and the machine-learning model may be further configured to recognize the object when the determined probability satisfies a probability threshold.
  • the mobile device may be in wireless communication with the electronic device.
  • FIG. 1 shows an example environment for controlling a device based on neuronal activity
  • FIG. 2 shows an example wearable electro-biometric apparatus
  • FIG. 3 is a flowchart of a method of detecting neuronal activity
  • FIG. 4 is a flowchart of a method of detecting devices to control
  • FIG. 5 is a flowchart of a method of training a machine-learning model
  • FIGs. 6A and 6B show an example environment for controlling a cursor position based on neuronal activity
  • FIG. 7 shows an example environment for managing a data store of the example wearable electro-biometric apparatus
  • FIG. 8 shows details of the data store
  • FIGs. 9A and 9B show an example environment for controlling a device based on neuronal activity
  • FIG. 10 illustrates a block diagram of internal hardware included in any of the electronic components of this disclosure.
  • FIG. 1 shows an example environment 100 for controlling a device 130 based on neuronal activity.
  • the environment 100 includes a user 102 associated with a wearable electro-biometric sensor apparatus 200.
  • the wearable electro-biometric apparatus 200 is affixed to an ear of the user 102 and is configured to monitor neuronal activity of the user 102 via one or more electro-biometric sensors 220 (FIG. 2).
  • Example electro-biometric sensors include electroencephalographic (EEG), electro-oculographic (EOG), and electro-myographic (EMG) sensors.
  • the wearable electro-biometric apparatus 200 is in wireless communication with a device controller 120 capable of controlling one or more functions or characteristics of a device 130.
  • the wireless communication may follow communication protocols such as Near Field Communication (NFC), Bluetooth, WiFi, Infrared (IrDA), or other technology allowing for transmitting and/or receiving information wirelessly.
  • the device controller 120 may be capable of receiving instructions and, in response to receiving the instructions, controlling, operating, or adjusting the device 130 according to the instructions.
  • the device controller 120 is capable of turning a light bulb on and off.
  • the device controller 120 may adjust a setpoint, such as the temperature setting of a climate control system, or the station and/or volume of a television or audio system.
  • the device controller 120 may adjust more complex parameters, such as a time profile of a setting.
  • the device controller 120 may also adjust a virtual setting, such as the position of a cursor or other object on a computer screen, or the remaining time on a countdown timer.
  • the device controller 120 includes a switch, a transistor, or another device (such as a microprocessor or integrated circuit) that is capable of causing the device 130 to change a condition.
  • the device controller 120 may be capable of closing a circuit that includes the device 130 and a power source, and also opening the circuit between the device 130 and power source, thus turning the device on and off.
  • the device controller 120 also may include a circuit that can adjust one or more functions of the device 130, such as a light brightness control, a volume control, a fan speed control, or other control device.
  • devices 130 that may be controlled by the device controller 120 include remote-controlled doors, windows, skylights, blinds or shades, etc.; kitchen appliances such as toasters, blenders, ovens, range tops, garbage disposals, and trash compactors; plumbing devices such as sinks, showers, or toilets; Internet of Things (IoT) devices such as smart plugs; and any other device that can be controlled using a computer.
  • the wearable electro-biometric apparatus 200 is in wireless communication with a computing device 104 associated with the user 102, such as a mobile phone or other mobile electronic device.
  • the computing device 104 may be capable of directly controlling one or more functions or characteristics of the device 130.
  • the computing device 104 may transmit instructions to the device using a wired or wireless communication channel.
  • the computing device 104 may transmit an infrared or RF signal to control audio, video, or other electronic equipment, e.g., to cause the device to change its state.
  • the command may be to actuate the television’s power switch (and thus change its state from on to off or vice versa), or to change to a different station.
  • the computing device 104 may also send an audio signal to a voice-controlled device, such as a virtual assistant.
  • the computing device 104 may be in communication with the device controller 120, allowing the computing device 104 to control one or more functions or characteristics of the device 130 via the device controller 120.
  • the computing device 104 may transmit a signal (such as an optical signal or a communication with instructions) to turn a smart outlet on or off or control a smart appliance, such as an oven, crockpot, clothes washer/dryer, refrigerator, garage door opener, dishwasher, vacuum cleaner, alarm system, climate control system, or the like.
  • Examples of computing devices include smartphones, tablets, laptop computers, or other devices capable of wirelessly communicating with the wearable electro-biometric apparatus 200 and directly or indirectly controlling one or more functions or characteristics of the device 130, e.g. by interfacing with the device controller 120.
  • the wearable electro-biometric apparatus 200 also includes an imaging system 240 (FIG. 2), such as a camera, capable of receiving images of objects in the field of view 114 of the imaging system 240.
  • the imaging system 240 may be configured to capture images of objects in the field of view of the user.
  • the light bulb is within the field of view 114 of the imaging system 240.
  • the wearable electro-biometric apparatus 200 may detect neuronal activity of the user 102 indicating that the user 102 is focusing on an object in the field of view of the user (e.g., the light bulb).
  • the wearable electro-biometric apparatus 200 may detect brain waves indicating concentration.
  • the wearable electro-biometric apparatus 200 may detect neuronal activity indicating that the user is blinking, e.g., from an EOG sensor. In response to detecting the neuronal activity, the wearable electro-biometric apparatus 200 may transmit a signal to the device controller 120, causing the device controller 120 to turn the light on.
  • the signal may be a digital signal that transfers programming instructions and/or an encoded data packet that, when received by a processor of the device controller 120, will cause the processor to trigger the device controller 120 to take an action, such as turn the device 130 on or off.
  • FIG. 2 shows an example wearable electro-biometric apparatus 200.
  • the apparatus 200 includes a housing 210 supporting and/or containing other components of the apparatus 200.
  • the housing 210 may be configured to be worn over the ear of a user 102.
  • the housing 210 may be formed from a rigid or semi-rigid material, such as a plastic, and may be curved or otherwise shaped to conform with the side of the user’s head.
  • the housing 210 may be an elongated structure with a forward section 211, a central section 212 and a rear section 213.
  • forward section 211 will extend from the user’s ear toward the user’s face
  • rear section 213 will extend from the user’s ear toward the back of the user’s head and/or neck.
  • Central section 212 serves as a bridge over the user’s ear by joining the forward section 211 and the rear section 213 and including a semicircular indentation 230 that is sized and positioned to be placed over the ear, behind (and optionally extending upward from) the helix of the ear.
  • the semicircular indentation 230 may include a liner 235 formed of a gel, an elastomer such as silicone or other elastic or rubber-like material, or other resilient material to provide a more comfortable fit, and also to help keep the device from falling off of the user’s ear during use.
  • the housing 210 may include mounting positions for two or more electro-biometric sensors and an imaging system 240.
  • three electro-biometric sensors 220a, 220b, 220c that are dry EEG sensors are mounted on a user-facing surface of the housing 210.
  • (This document may refer to the electro-biometric sensors collectively using reference number 220.)
  • Dry EEG sensors are capable of operating without the use of conductive gel or paste.
  • wet EEG sensors are also within the scope of the disclosure.
  • wearable electro-biometric apparatus 200 is configured to locate the three electro-biometric sensors 220a, 220b, 220c near locations T4, F8, and A2, respectively, of the International 10-20 system for describing the location of scalp electrodes.
  • wearable electro-biometric apparatus 200 is configured to locate one or more EOG sensors near the user’s eye to measure the corneo-retinal standing potential that exists between the front and the back of the user’s eye.
  • a first electro-biometric sensor 220a is mounted within the central section 212 of the housing 210 so that, when the wearable electro-biometric apparatus 200 is worn by the user 102, the first electro-biometric sensor 220a will contact the user 102 at a position behind the user’s ear (i.e., a location under the ear’s helix or between the helix and the back of the user’s head), over a temporal lobe of the user’s brain.
  • a second electro-biometric sensor 220b is mounted within the forward section 211 of the housing 210 so that the second electro-biometric sensor 220b will contact the user 102 at a position at or near the user’s temple, over a frontal lobe of the user’s brain.
  • a third electro-biometric sensor 220c is mounted on the housing 210 so that the third electro-biometric sensor 220c will contact the user 102 at a position that is relatively lower on the user’s head than the positions of the first electro-biometric sensor 220a and the second electro-biometric sensor 220b.
  • the position of the third electro-biometric sensor 220c may correspond to a location that is under the user’s earlobe when worn.
  • the third electro-biometric sensor 220c is mounted on a flexible earpiece 250 extending away (e.g., downward) from either the central section 212 or the rear section 213 of the housing 210.
  • the flexible earpiece 250 may be formed from a gel, an elastomer such as silicone or other elastic or rubber-like material, or other materials, and it may be integral with or separate from the liner 235.
  • the flexible earpiece 250 may include wiring leading from the electrical components within the housing 210 to the third electro-biometric sensor 220c.
  • the wiring includes Benecreat aluminum wiring.
  • the flexible earpiece 250 may be covered with a cloth and/or padding to provide additional comfort.
  • the flexible earpiece 250 may bend to facilitate placing the wearable electro-biometric apparatus 200 over an ear of the user 102 and securely attaching the wearable electro-biometric apparatus 200 to the user 102.
  • the flexible earpiece 250 is capable of securing the wearable electro-biometric apparatus 200 to the user 102 even during vigorous activity.
  • the flexible earpiece 250 may apply a gripping force to the ear of the user 102 when the user 102 is wearing the wearable electro-biometric apparatus 200.
  • the housing 210 is configured to translate the gripping force of the flexible earpiece 250 to the electro-biometric sensors 220, causing the electro-biometric sensors 220 to press firmly against the user 102 to facilitate receiving a strong electro-biometric signal (e.g., EEG, EOG, EMG).
  • a strong electro-biometric signal e.g., EEG, EOG, EMG
  • an imaging system 240 is mounted at the front of the front section 211 of the housing 210, with a lens that is positioned toward the user’s face and configured to capture images of objects in the field of view 114 of the imaging system 240.
  • the imaging system 240 may be mounted on the housing 210 so that the field of view 114 includes devices 130 that are within the field of view of the eyes of the user 102 when the user 102 is wearing the wearable electro-biometric apparatus 200.
  • the housing 210 may enclose other components of the wearable electro-biometric apparatus 200, e.g., to protect the components from damage or exposure to hazards.
  • the components may include items such as those illustrated in FIG. 10, including a communication device 1010 that includes a transmitter, a processor 1005 in communication with the transmitter, the electro-biometric sensors (220 in FIG. 2, 1020 in FIG. 10), and the imaging system (240 in FIG. 2, 1040 in FIG. 10).
  • the components may also include wiring and interfacing components, such as signal conditioning components (e.g., band-pass filters), some of which may be mounted on printed circuit boards (PCBs) enclosed within the housing 210.
  • the processor 1005 may have associated memory containing programming instructions that, when executed, cause the processor 1005 to implement methods, such as detecting neuronal activity and controlling one or more devices 130 based on the detected neuronal activity.
  • the components may also include a power source 1080 (e.g., a rechargeable battery) providing power to electrical components of the wearable electro-biometric apparatus 200.
  • the battery may be sized to allow for extended use of the wearable electro-biometric apparatus 200 between charges.
  • the processor 1005 may be a low-power component, e.g., designed for use in a mobile phone. The battery may be charged through an electrical connection or by inductive wireless charging, and the wearable electro-biometric apparatus 200 may harvest energy from the environment to extend the time between recharges.
  • the wearable electro-biometric apparatus 200 includes a photovoltaic device disposed on a portion of the housing 210.
  • three electro-biometric sensors 220 are disposed on the housing 210 so that the electro-biometric sensors 220 contact the user 102 at specific locations on the user’s head.
  • the third electro-biometric sensor 220c may contact the user 102 below the user’s ear, at a location of low electro-biometric signal strength (e.g., a weak signal of neuronal activity).
  • the third electro-biometric sensor 220c may act as a ground or baseline reference relative to the other two electro-biometric sensors 220a, 220b by being placed on or near the midline sagittal plane of the skull, e.g., where lateral hemispheric cortical activity is largely not detected.
  • the second electro-biometric sensor 220b may contact the user near the user’s temple, at a location of relatively high electro-biometric signal strength.
  • a filter, such as a band-pass filter, attenuates less relevant signals from the second sensor 220b.
  • the band-pass filter may pass beta waves (e.g., brain-wave signals in the range from 20-30 Hz), while attenuating delta, theta, gamma, and alpha waves.
  • the processor 1005 may adjust parameters of the filter, e.g., to pass alpha wave signals in lieu of beta wave signals or to switch between passing alpha wave signals and passing beta wave signals.
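
As an illustration of the band-pass stage described above, the following sketch isolates beta-wave energy in software. The 250 Hz sampling rate, filter order, and use of SciPy are assumptions for the example and are not details taken from the disclosure.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS_HZ = 250            # assumed EEG sampling rate (not specified in the disclosure)
BETA_BAND = (20, 30)   # beta-wave pass band described above, in Hz

def bandpass(signal, low_hz, high_hz, fs_hz=FS_HZ, order=4):
    """Attenuate components of a raw sensor trace outside [low_hz, high_hz]."""
    b, a = butter(order, [low_hz, high_hz], btype="bandpass", fs=fs_hz)
    return filtfilt(b, a, signal)

# Keep beta waves from one second of a raw trace from sensor 220b (stand-in data).
raw = np.random.randn(FS_HZ)
beta_only = bandpass(raw, *BETA_BAND)

# Switching the pass band (e.g., to alpha waves, 8-12 Hz) only needs new corner
# frequencies, mirroring the parameter adjustment described above.
alpha_only = bandpass(raw, 8, 12)
```
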
  • the first electro-biometric sensor 220a may contact the user behind the user’s ear at a location of relatively lower electro-biometric signal strength than the location of the second electro-biometric sensor 220b.
  • the devices may have additional electro-biometric sensors located in different locations. Other devices may have fewer than three sensors.
  • the system may operate with only the first electro-biometric sensor 220a positioned over the user’s temple and frontal lobe and the third electro-biometric sensor 220c providing a ground reference.
  • the system may operate with only the second electro-biometric sensor 220b positioned under the helix of the user’s ear and over the user’s temporal lobe and the third electro-biometric sensor 220c providing a ground reference.
  • FIG. 3 shows a method 300 of detecting neuronal activity to determine when the neuronal activity of a user of the device indicates, e.g., an increase in the user’s focus, or whether the user has blinked, or other indication by the user that the wearable electro-biometric apparatus 200 should perform functions to control a device.
  • the method 300 includes receiving, by the processor of the wearable electro-biometric apparatus 200 or of a separate electronic device that is in communication with and proximate to the wearable electro-biometric apparatus 200, the filtered electro-biometric signals from one or more electro-biometric sensors 220, such as EEG sensors.
  • the processor may sample the filtered electro-biometric signals, e.g., using an analog-to-digital converter (ADC) to determine a representative amplitude (at step 304) of the electro-biometric signal in the pass band.
  • the representative amplitude may be a root mean square (rms) value or other value representing the level of neuronal activity within the frequency range of the pass band.
  • the processor determines a difference (at step 306) between the representative amplitude of the electro-biometric signal from the first electro-biometric sensor 220a and the second electro-biometric sensor 220b to determine the level of neuronal activity.
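
A minimal sketch of the representative-amplitude and difference computation described above, assuming the band-passed samples for each interval are already available as arrays (the disclosure does not prescribe a particular implementation):

```python
import numpy as np

def representative_amplitude(samples):
    """Root-mean-square (RMS) amplitude of one interval of band-passed samples."""
    samples = np.asarray(samples, dtype=float)
    return float(np.sqrt(np.mean(samples ** 2)))

def neuronal_activity_level(sensor_a_samples, sensor_b_samples):
    """Difference between the RMS amplitudes of the two sensing channels,
    used here as the representative level of neuronal activity."""
    return (representative_amplitude(sensor_a_samples)
            - representative_amplitude(sensor_b_samples))
```
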
  • the filter and ADC may be disposed on a daughter PCB interfaced with the processor PCB.
  • the electro-biometric sensors filter and sample the electro-biometric signals and provide digital representations of the signals to the processor, e.g., through a serial or parallel data bus.
  • the processor may receive (at step 302) electro-biometric signals at irregular or regular intervals of time, such as twice per second.
  • the processor may further measure a change in the representative level of neuronal activity between intervals by comparing the level of a first interval with the level of a previous (or subsequent) interval.
  • the processor compares the measured change in representative level of neuronal activity to a threshold (at step 308). When the change in representative level of neuronal activity exceeds the threshold, the processor may determine that the user’s focus has increased.
  • the threshold may be based on a baseline level of neuronal activity of the user.
  • the user triggers the wearable electro-biometric apparatus 200 to determine the baseline level of neuronal activity of the user, e.g., through an application 710 (FIG. 7) executing on the computing device or through an actuator on the wearable electro-biometric apparatus 200, e.g., by pressing a button located on the housing 210.
  • wearable electro-biometric apparatus 200 automatically determines the baseline level of neuronal activity of the user 102, e.g., as an average or mean level over a period of time, accounting for fluctuation in the representative level of neuronal activity of the user 102 over that period.
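
The baseline-and-threshold comparison could be sketched as follows; the window length and threshold factor are illustrative assumptions rather than values from the disclosure.

```python
from collections import deque

class FocusDetector:
    """Flags an increase in focus when the change in activity level between
    intervals exceeds a threshold derived from a running baseline."""

    def __init__(self, window=60, threshold_factor=1.5):
        self.levels = deque(maxlen=window)        # e.g., 30 s of history at 2 samples/s
        self.threshold_factor = threshold_factor  # illustrative scaling of the baseline

    def update(self, level):
        focus_increased = False
        if self.levels:
            baseline = sum(self.levels) / len(self.levels)
            change = level - self.levels[-1]      # change since the previous interval
            focus_increased = change > self.threshold_factor * abs(baseline)
        self.levels.append(level)
        return focus_increased
```
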
  • wearable electro-biometric apparatus 200 includes a data store 800.
  • the imaging system 240 is configured to capture images of objects in the field of view 114 of the imaging system 240.
  • the wearable electro-biometric apparatus 200 may recognize the device 130, based on images captured by the imaging system 240, and select the device 130 from the known devices contained in the data store 800. Example image recognition methods will be described below in the discussion of FIG. 4.
  • the processor 1005 may perform functions to control the selected device 130.
  • the data store 800 also contains patterns or features 808 (FIG. 8) associated with each known device.
  • the pattern may be a previously captured image of the device 130, or the pattern may include features of the device 130 extracted from the previously captured image of the device 130, e.g., by image processing.
  • the extracted features may include colors of surfaces of the device 130, geometric features, such as detected edges of the device 130, or patterns, textures, or shapes of a surface of the device 130.
  • the pattern includes an AprilTag, barcode, QR code, or other form of computer-readable data associated with the device 130, e.g., affixed to a surface of the device.
  • the extracted feature includes a symbol or pattern of symbols imprinted on the surface of the device 130, such as a logo.
  • the extracted features may include knobs, switches or other controls associated with the device 130 or text or other display information associated with the device 130.
  • FIG. 4 shows a method 400 of detecting devices 130 to control.
  • the processor 1005 may receive images from the imaging system 240 and may extract, at step 404, features from the received images.
  • Features may include characteristics of the image, such as overall shape as well as corners, edges, relationships between corners and edges, colors, patterns, etc.
  • the processor 1005 executes a Scale Invariant Feature Transform (SIFT) or similar algorithm to locate one or more “keypoints” of the image.
  • a “keypoint” is a feature which is unaffected by the size, aspect ratio or other asymmetry of a shape in the image.
  • the processor may construct a scale space to ensure that the keypoints are scale-independent, ensure the keypoints are rotation invariant (unaffected by the orientation of the image), and assign a unique identifier to the keypoint.
  • the unique identifier may be a 128-bit or 256-bit Binary Robust Independent Elementary Features (BRIEF) descriptor.
  • the processor executes a Features from Accelerated Segment Test (FAST) corner-detection algorithm to extract features.
  • the processor 1005 may execute functions of a software library, such as the python module ORB (Oriented FAST and Rotated BRIEF), to perform feature extraction.
  • the processor 1005 may compare, at step 406, the received image and/or the features extracted from the received image with features associated with known devices contained in the data store 800. For example, the processor 1005 may compare unique identifiers assigned to keypoints of the image with unique identifiers assigned to keypoints of previously captured images of known devices. The processor 1005 may select, at step 408, a device 130 from the known devices when a threshold number of extracted features match features of previously captured images. For example, each known image contained in the data store 800 may include, e.g., 30 or more features 808 extracted from a previously captured image of the device 130. In this example, the processor 1005 may select the device 130 from the known devices when more than ten features match.
  • the processor 1005 may also select the device 130 having the greatest number of matching features when more than one of the known devices has more than the threshold number of matching features extracted from the captured image of the device 130.
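
A hedged sketch of the feature-extraction and matching steps using the OpenCV ORB implementation mentioned above; the descriptor count, matcher settings, and the `known_devices` structure are assumptions made for illustration.

```python
import cv2

MATCH_THRESHOLD = 10  # minimum matching features, per the example above

orb = cv2.ORB_create(nfeatures=500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def extract_features(image_bgr):
    """ORB keypoints and binary descriptors for a captured frame."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    return orb.detectAndCompute(gray, None)

def best_known_device(frame, known_devices):
    """known_devices: dict mapping device ID 802 -> stored descriptors (features 808)."""
    _, frame_desc = extract_features(frame)
    if frame_desc is None:
        return None
    best_id, best_count = None, 0
    for device_id, stored_desc in known_devices.items():
        matches = matcher.match(frame_desc, stored_desc)
        if len(matches) >= MATCH_THRESHOLD and len(matches) > best_count:
            best_id, best_count = device_id, len(matches)
    return best_id  # None when no known device clears the threshold
```
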
  • the wearable electro-biometric apparatus 200 may use a machine-learning algorithm to recognize the device 130.
  • the processor 1005 may provide the received image to a deep-learning neural network, such as a convolutional neural network (CNN) trained to recognize known objects.
  • FIG. 5 shows a method 500 of training a machine-learning model (such as a deep-learning neural network) to recognize a known device 130 to control.
  • the method 500 includes receiving an object selection, e.g., from the user or other supervised trainer.
  • the user may cause the system to select an object in the scene as the known device 130 by acquiring an image of the object, e.g., through an application 710 (FIG. 7).
  • the user may label or annotate the selected object via the application 710.
  • the user may then apply a bounding box around the object in the image, the bounding box defining a portion of the image containing the object.
  • the coordinates of the points that define the bounding box within the image may be saved to a configuration file (e.g., an XML data file).
  • the configuration file may also include one or more labels or annotations applied by the user.
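
One plausible layout for such a configuration file is sketched below; the element names are illustrative (loosely following the Pascal VOC convention) and are not specified by the disclosure.

```python
import xml.etree.ElementTree as ET

def write_config(path, label, xmin, ymin, xmax, ymax):
    """Save the user's label and the bounding-box corner coordinates to an XML file."""
    root = ET.Element("annotation")
    obj = ET.SubElement(root, "object")
    ET.SubElement(obj, "label").text = label
    box = ET.SubElement(obj, "bndbox")
    for tag, value in (("xmin", xmin), ("ymin", ymin), ("xmax", xmax), ("ymax", ymax)):
        ET.SubElement(box, tag).text = str(value)
    ET.ElementTree(root).write(path)

# Example: a bounding box the user drew around a lamp in the captured image.
write_config("lamp.xml", "desk lamp", 120, 80, 340, 410)
```
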
  • the method 500 includes preparing training data.
  • the method may include receiving the image and the configuration file.
  • the method may include cropping the image to the portion within the defined bounding box.
  • the method may include making multiple (e.g., 1,000) duplicates of the image (or the portion of the image within the bounding box) and multiple copies (e.g., 1,000) of the configuration file. These additional copies of the image and configuration file serve as inputs for training the deep-learning network to recognize the object as the known device 130.
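
A minimal sketch of this data-preparation step, assuming the cropped object image and configuration file are simply duplicated on disk; the file naming and use of Pillow are illustrative rather than taken from the disclosure.

```python
import shutil
from PIL import Image

def prepare_training_data(image_path, config_path, box, out_dir, n_copies=1000):
    """Crop the image to the bounding box, then duplicate the crop and the
    configuration file n_copies times as inputs for training."""
    crop = Image.open(image_path).crop(box)      # box = (xmin, ymin, xmax, ymax)
    crop_path = f"{out_dir}/object.png"
    crop.save(crop_path)
    for i in range(n_copies):
        shutil.copy(crop_path, f"{out_dir}/object_{i:04d}.png")
        shutil.copy(config_path, f"{out_dir}/object_{i:04d}.xml")
```
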
  • the method 500 includes training a machine-learning model.
  • the copies of the image (or the portion of the image within the bounding box) and configuration files may be input into the deep-learning neural network, thus providing multiple examples of images of the object to train the deep-learning neural network to recognize the known device.
  • the deep-learning neural network may predict the probability that the known device is in a newly acquired image.
  • the neural network may more accurately predict the probability that the known device is in a newly acquired image than a neural network trained with a single image.
  • the method 500 includes transmitting the trained deep-learning neural network to the wearable electro-biometric apparatus 200.
  • One or more trained deep-learning neural networks may be stored in the data store 800 of the wearable electro-biometric apparatus 200 and applied to images acquired by the wearable electro-biometric apparatus 200.
  • one or more deep-learning neural networks may determine a probability that the known device 130 is in the newly acquired image, e.g., an image of a video stream acquired by the camera 240 of the wearable electro-biometric apparatus 200. The one or more deep-learning neural networks may detect that the known device 130 is in the newly acquired image when the predicted probability satisfies or exceeds a threshold.
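
Applying the trained model to a video stream and thresholding its output probability might look like the sketch below; the `predict` callable, the input size, and the 0.8 threshold are placeholders, since the disclosure only requires that the probability satisfy a threshold.

```python
import cv2
import numpy as np

PROBABILITY_THRESHOLD = 0.8  # illustrative; the disclosure only requires "a threshold"

def watch_for_device(predict, camera_index=0, input_size=(224, 224)):
    """predict: a callable wrapping the trained model; it takes a preprocessed
    frame and returns the probability that the known device 130 is present."""
    capture = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            resized = cv2.resize(frame, input_size).astype(np.float32) / 255.0
            if predict(resized) >= PROBABILITY_THRESHOLD:
                return True   # device detected in the video stream
    finally:
        capture.release()
    return False
```
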
  • the method of training the machine-learning model to recognize an object in an image of a video stream may be applicable in other circumstances as well.
  • the method described above strikes a balance between the effort to prepare training data and the effectiveness of the trained machine-learning model to recognize an object in a video stream.
  • the disclosed method does not require users to take multiple images of the object. Nor does the disclosed method require the user to construct multiple bounding boxes, e.g., around the object in each of multiple images containing the object. Furthermore, the disclosed method is not subject to errors caused by the system improperly constructing bounding boxes around the object in the image.
  • the user constructs a single bounding box around the object in a single image. In this case, the system prepares the training data based on the single image provided by the user.
  • the method 500 includes receiving a command associated with the known device 130 and transmitting the associated command to the wearable electro-biometric apparatus 200.
  • the user may provide commands or functions associated with one or more deep-learning neural networks, as described in FIG. 8, so that when the deep-learning neural network recognizes an object, the wearable electro-biometric apparatus 200 executes the associated command or function.
  • the system may select the device as an object that the electro-biometric apparatus 200 will control.
  • the system may detect multiple devices in a field of view of the electro-biometric apparatus’s camera. To determine which of the devices will be controlled, the system may require that the device to be controlled is centrally positioned within the field of view, such as over the center point or within a threshold distance of the center point of the field of view, for at least a threshold period of time. The system also may require that the user’s focus have increased while the device is within the central location of the field of view in order to select the device to control.
  • the wearable electro-biometric apparatus 200 may control (at step 412) one or more functions or characteristics of the device 130.
  • the wearable electro-biometric apparatus 200 may further include an accelerometer, Inertial Measurement Unit (IMU), or other position/orientation sensor 260 (FIG. 2) in communication with the processor 1005 and providing position/orientation information to the processor 1005.
  • the IMU is disposed within the housing 210 at a position between the user’s ear and the user’s temple, or other suitable location which tracks well with movement of the user’s head.
  • the processor 1005 may determine that the device 130 includes an electronic interface capable of receiving position/orientation information. For example, the processor 1005 may determine that device 130 is capable of discovering a Bluetooth mouse, trackball, or other wireless position-controlling computer peripheral. The processor 1005 may cause the wearable electro-biometric apparatus 200 to interface with (e.g., pair with) the device 130 and use information from the position/orientation sensor 260 to control a cursor location or other point of focus associated with the device 130. For example, the wearable electro-biometric apparatus 200 may transmit relative X-axis and Y-axis data to the device 130, e.g., according to the Bluetooth Human Interface Device (HID) profile.
  • the X-axis and Y-axis data may indicate a number of pixels to move the device’s cursor from its current position.
  • the user 102 of the electro-biometric apparatus 200 may adjust the cursor location on the display screen by changing the orientation of the wearable electro-biometric apparatus 200, thus changing the orientation of the position/orientation sensor 260. For example, tilting the user’s head up and down will cause a corresponding change in orientation of the IMU.
  • the system will measure changes in one or more values of the position/orientation as the user’s head tilts up and down.
  • the system may scale the measured changes of position/orientation by a proportionality constant to determine a number of pixels to move the cursor up or down.
  • the system then sends the scaled measurement data as, e.g., relative Y-axis data to the device 130, causing the cursor to move up and down in response to the user’s head tilting up and down.
  • the system may send relative X-axis data to the device 130 as the user’s head moves side to side, causing the device’s cursor to move back and forth.
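
The scaling of head-orientation changes into relative X/Y cursor data could be sketched as follows; the proportionality constant and the axis conventions are assumptions made for illustration.

```python
def orientation_to_cursor_delta(pitch_deg, yaw_deg, ref_pitch_deg, ref_yaw_deg,
                                pixels_per_degree=20.0):
    """Scale the change in head orientation (relative to the saved reference)
    into relative X/Y pixel counts, as sent in a Bluetooth HID mouse report."""
    dx = int((yaw_deg - ref_yaw_deg) * pixels_per_degree)       # side to side -> X axis
    dy = int(-(pitch_deg - ref_pitch_deg) * pixels_per_degree)  # tilt up -> cursor up
    return dx, dy

# Tilting the head up 2 degrees from the reference moves the cursor up 40 pixels.
print(orientation_to_cursor_delta(7.0, 0.0, 5.0, 0.0))  # -> (0, -40)
```
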
  • FIG. 6A shows an example environment 600 for controlling the location of a cursor 604 on a controllable display screen 602 of a computer system.
  • the environment 600 includes a controllable display screen 602 of a computer system displaying a selectable window 606 and a cursor 604.
  • the computer system may include a device controller application and communicate directly with the electro-biometric apparatus via near-field or other wireless communication protocols.
  • the environment 600 also may include a portable electronic device 614 on which a device controller application is installed.
  • once the electro-biometric apparatus 200 recognizes the display screen 602 using processes such as those described above in the discussion of FIG. 4, it may then use movement of the electro-biometric apparatus 200 to control movement and activation of the cursor 604.
  • the cursor 604 is shown as located near the lower right corner of the display screen 602 in FIG. 6A, but in practice the cursor 604 may start in any position of the display screen 602.
  • the user 102 of the electro-biometric apparatus 200 may adjust the cursor location on the display screen by changing the orientation of the wearable electro-biometric apparatus 200, thus changing the orientation of the position/orientation sensor 260.
  • the device controller application of the portable electronic device 614 or of the computer system will then generate a command to move the cursor to a location that corresponds to movement of the position/orientation sensor. For example, referring to FIG. 6B, the cursor 604 has been moved to a higher location on the computer screen 602 in response to the wearable electro-biometric apparatus 200 tilting upward.
  • the system will determine a reference point, such as the position of the cursor 604 on the display screen 602 when the system detects a certain trigger, such as a determination that the user’s focus has remained at or above a threshold for at least a minimum amount of time, or a determination that the cursor 604 and/or the display screen 602 is centrally located (i.e. within a threshold range of a center point) in a field of view of the imaging system of electro-biometric apparatus 200 for at least a minimum period of time.
  • the system will determine initial position and/or orientation values of the electro-biometric apparatus’ position/orientation sensor and save that information to onboard memory. The system will measure changes in one or more values of the position/orientation sensor as compared to the reference point.
  • the processor may further determine that device 130 is capable of receiving a select event (e.g., mouse left click), e.g., according to the Bluetooth Human Interface Device (HID) profile.
  • the processor may cause the wearable electro-biometric apparatus 200 to transmit a select event to the device 130 in response to determining that the user’s focus has increased.
  • the user may position the cursor 604 within the window 606 (e.g., over the selectable close button, shown as an X in the title bar of the window) and select the window or a function of the window (e.g., selecting the close function to close the window).
  • wearable electro-biometric apparatus 200 may control large presentation screens, video games, mobile computing platforms such as smart phones, smart televisions, automobile infotainment systems, and other electronic devices with movable cursors.
  • the user 102 may position the cursor at a desired location, then, by focusing on the location, cause the position or object on the screen to be selected.
  • the wearable electro-biometric apparatus 200 will pair with multiple devices 130 simultaneously.
  • the wearable electro-biometric apparatus 200 will only pair with the device 130 if the device 130 is contained in the data store 800 of the wearable electro-biometric apparatus 200 and is recognized by the wearable electro-biometric apparatus 200.
  • FIG. 7 shows an example environment 700 for managing the data store 800 of the wearable electro-biometric apparatus 200.
  • a computing device 104 associated with the user 102 may execute an application 710 which manages the known devices in the data store 800 of the wearable electro-biometric apparatus 200.
  • the application 710 executes on a remote server, such as a web server that the user 102 accesses via the computing device 104.
  • the application 710 may capture images of objects, such as the device 130, in the field of view 714 of an imaging system of the computing device 104 and extract features from the captured image of the device 130, e.g., by image processing.
  • images of devices 130 are uploaded to the computing device 104 through a computer network or bus.
  • the application 710 may present images of devices 130 to the user 102.
  • the user 102 may select the device 130 as a known device to include in the data store 800 of the wearable electro-biometric apparatus 200.
  • the application 710 may cause the computing device 104 to transmit the selected device 130 (and patterns associated with the selected device 130) to the wearable electro-biometric apparatus 200.
  • the wearable electro-biometric apparatus 200 may add the selected device and patterns associated with the selected device, such as the captured image and/or the extracted features 808, to the data store 800.
  • the computing device 104 receives a list of known devices contained in the data store 800 of the wearable electro-biometric apparatus 200.
  • the application 710 may present the list to the user 102, so that the user 102 may select devices to be removed from the list of known devices contained in the data store 800.
  • the application 710 may cause the computing device 104 to transmit the selected devices to the wearable electro-biometric apparatus 200, causing the wearable electro-biometric apparatus 200 to remove the selected devices from the data store 800.
  • Using the application 710 in this way allows the user to add, remove, and update objects included in the list of known devices contained in the data store 800 of the wearable electro-biometric apparatus 200.
  • FIG. 8 shows details of the data store 800.
  • Each known device 130 may have an associated device ID 802.
  • the list of known devices includes functions 804 associated with each device 130.
  • Example functions include activate/deactivate, open/close, step to a next element or position in a sequence (e.g., tune a radio or television receiver to a next station, toggle through settings of COOL, OFF, and HEAT of a climate control system, or step through speeds of a cooling fan).
  • Some devices 130 may have more than one associated function 804.
  • a cooling fan having multiple fan speeds may also include a light which can be turned on and off. As shown in FIG. 7, the light bulb can be toggled between the ON state and the OFF state.
  • a motion sensor associated with the light bulb (which may activate the light when it detects motion near the light) can also be toggled between an active and inactive state.
  • the application 710 allows the user 102 to select the function performed by the wearable electro-biometric apparatus 200 in response to the user’s increase in focus. Here, the function that toggles the state of the light bulb is selected.
  • the application 710 may provide a list of functions associated with each known device for the user to select from.
  • the application 710 may allow the user to define additional functions 804 associated with each known device (or group of known devices).
  • the application 710 may allow the user to enter additional functions according to the syntax of a command language or script language, such as Tcl, Perl, or Python, or according to the syntax of a general-purpose computing language, such as Java.
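
As an illustration of a user-defined function 804, the sketch below toggles a hypothetical network-connected smart plug; the device, URL, and endpoint are invented for the example and do not come from the disclosure.

```python
import urllib.request

def toggle_desk_lamp():
    """Hypothetical user-defined function 804: ask a network-connected smart plug
    to toggle its relay. The URL below is invented for this example."""
    url = "http://192.168.1.40/relay/0?turn=toggle"
    with urllib.request.urlopen(url, timeout=2) as response:
        return response.status == 200
```
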
  • the application 710 may cause the computing device 104 to transmit the defined and/or selected functions to the wearable electro-biometric apparatus 200, causing the wearable electro-biometric apparatus 200 to execute the function in response to the user’s increase in focus.
  • the application 710 may download functions from a repository of functions associated with a device or class of devices.
  • the application 710 may also upload functions to the repository, e.g., after a suitable degree of testing.
  • the list of known devices includes settings 806 associated with each device 130.
  • the processor 1005 may determine a change in position or orientation of the wearable electro-biometric apparatus 200, e.g., based on information received from a position/orientation sensor 260. In response to determining a change in position or orientation, the processor 1005 may perform functions to control a setting of the selected device 130. For example, in response to the user’s head moving from a downward facing position to a more upward facing position, the processor 1005 may cause a volume setting of the selected device 130 to increase. Similarly, in response to the user’s head moving from an upward facing position to a more downward facing position, the processor 1005 may cause a volume setting of the selected device 130 to decrease.
  • the application 710 may allow the user 102 to define additional functions 804 associated with each known device (or group of known devices) and to select the setting to control by the wearable electro-biometric apparatus 200 in response to the change in position or orientation.
  • a user 102 may activate or deactivate a device 130 based on a level of focus or, e.g., by intentionally blinking and may adjust a setting of the device 130 based on movements of the user’s head.
  • FIG. 9A shows an example environment 900 for controlling a setting 806 associated with a device 130.
  • the device 130 is a light fixture containing a light bulb controlled by a device controller 120.
  • the device controller 120 includes an activation controller 904 capable of switching the light fixture on and off and a dimmer controller 906 capable of adjusting the brightness of the light bulb, e.g., by adjusting the current flowing through the bulb.
  • the setting 806 of the dimmer controller 906 is a relatively low value, corresponding to a low level of brightness.
  • In FIG. 9B, the setting 806 of the dimmer controller 906 has been moved to a higher value in response to the wearable electro-biometric apparatus 200 tilting upward.
  • Processes of detecting the device and detecting movement of the wearable electro-biometric apparatus may be the same as those described above.
  • the system may calculate a brightness increase or decrease to match (or be a function of) the amount by which the device’s yaw has increased or decreased.
  • the wearable electro-biometric apparatus 200 may also transmit a signal to the device controller 120, causing the device controller 120 to turn the light on or off via the activation controller 904.
  • a user 102 may turn the light device 130 on or off based on a level of focus or, e.g., by intentionally blinking, and the user may adjust the brightness of the light based on movements of the user’s head.
  • the wearable electro-biometric apparatus 200 may be used to activate/deactivate and adjust settings of a variety of devices 130, including turning on and off and controlling the speed of a fan, activating/deactivating audio equipment and adjusting the volume, activating/deactivating a climate control system and adjusting the temperature, etc.
  • Although FIGs. 9A and 9B show the wearable electro-biometric apparatus 200 communicating directly with the device 130 and its device controller 120, the communications may route through one or more intermediate devices such as those shown in FIGs. 1, 6A, 6B and 7.
  • FIG. 10 illustrates example hardware that may be included in any of the electronic components of the system, such as internal processing systems of the wearable electro-biometric apparatus 200.
  • An electrical bus 1000 serves as an information highway interconnecting the other illustrated components of the hardware.
  • Processor 1005 is a central processing device of the system, configured to perform calculations and logic operations required to execute programming instructions.
  • the terms “processor” and “processing device” may refer to a single processor 1005 or any number of processors in a set of processors that collectively perform a set of operations, such as a central processing unit (CPU), a graphics processing unit (GPU), a remote server, or a combination of these.
  • Read-only memory (ROM), random access memory (RAM), flash memory, hard drives, and other devices capable of storing electronic data constitute examples of memory devices 1025.
  • a memory device may include a single device or a collection of devices across which data and/or instructions are stored.
  • Various embodiments of the invention may include a computer-readable medium containing programming instructions that are configured to cause one or more processors to perform the functions described in the context of the previous figures.
  • An optional display interface 1030 may permit information from the bus 1000 to be displayed on a display device 1035 in visual, graphic or alphanumeric format, such as on an in-dashboard display system of a vehicle.
  • An audio interface and audio output (such as a speaker) also may be provided.
  • Communication with external devices may occur using various communication devices 1010 such as a wireless antenna, a radio frequency identification (RFID) tag and/or short-range or near-field communication transceiver, each of which may optionally communicatively connect with other components of the device via one or more communication system.
  • the communication device(s) 1010 may include a transmitter, transceiver, or other device that is configured to be communicatively connected to a communications network, such as the Internet, a Wi-Fi or local area network or a cellular telephone data network, or to make a direct communication connection with one or more nearby devices, such as a Bluetooth transmitter or infrared light emitter.
  • the hardware may also include a user interface sensor 1045 that allows for receipt of data from input devices 1050 such as a keyboard or keypad, a joystick, a touchscreen, a touch pad, a remote control, a pointing device and/or microphone. Digital image frames also may be received from a camera 1040 that can capture video and/or still images.
  • the system also may receive data from one or more sensors 1020 such as EEG sensors 220 (or other electro-biometric sensors) and/or motion/position sensors 1070, such as inertial measurement sensors.
  • an “electronic device” or a “computing device” refers to a device that includes a processor and memory. Each device may have its own processor and/or memory, or the processor and/or memory may be shared with other devices as in a virtual machine or container arrangement.
  • the memory will contain or receive programming instructions that, when executed by the processor, cause the electronic device to perform one or more operations according to the programming instructions.
  • The terms “memory,” “memory device,” and the like each refer to a non-transitory device on which computer-readable data, programming instructions or both are stored. Except where specifically stated otherwise, the terms “memory,” “memory device,” “computer-readable medium,” “data store,” “data storage facility” and the like are intended to include single device embodiments, embodiments in which multiple memory devices together or collectively store a set of data or instructions, as well as individual sectors within such devices.
  • a computer program product is a memory device with programming instructions stored on it.
  • The terms “processor” and “processing device” refer to a hardware component of an electronic device that is configured to execute programming instructions, such as a microprocessor or other logical circuit.
  • a processor and memory may be elements of a microcontroller, custom configurable integrated circuit, programmable system-on-a-chip, or other electronic device that can be programmed to perform various functions.
  • A reference to a “processor” or “processing device” is intended to include both single-processing-device embodiments and embodiments in which multiple processing devices together or collectively perform a process.
  • An “imaging system” is any device or system that is capable of optically viewing an object and converting an interpretation of that object into electronic signals.
  • An example of an imaging device is a digital camera.
  • a “machine learning model” or a “model” refers to a set of algorithmic routines and parameters that can predict an output(s) of a real-world process (e.g., identification or classification of an object) based on a set of input features, without being explicitly programmed.
  • a structure of the software routines (e.g., number of subroutines and relation between them) and/or the values of the parameters can be determined in a training process, which can use actual results of the real-world process that is being modeled.
  • Such systems or models are understood to be necessarily rooted in computer technology, and in fact, cannot be implemented or even exist in the absence of computing technology.
  • A “bounding box” refers to a rectangular box that represents the location of an object.
  • a bounding box may be represented in data by x- and y-axis coordinates [x_max, y_max] that correspond to a first corner of the box (such as the upper right corner), along with x- and y-axis coordinates [x_min, y_min] that correspond to the corner of the rectangle that is opposite the first corner (such as the lower left corner). It may be calculated as the smallest rectangle that contains all of the points of an object, optionally plus additional space to allow for a margin of error (a minimal sketch of this calculation appears after this list).
  • the points of the object may be those detected by one or more sensors, such as pixels of an image captured by a camera.
  • “Wireless communication” refers to a communication protocol in which at least a portion of the communication path between a source and a destination involves transmission of a signal through the air and not via a physical conductor, as in a Wi-Fi network, a Bluetooth connection, or communication via another short-range or near-field communication protocol.
  • Wireless communication does not necessarily require that the entire communication path be wireless; part of the communication path also may include physical conductors positioned before a transmitter or after a receiver that facilitate communication across a wireless portion of the path.
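The bounding-box calculation described above can be illustrated with a minimal sketch, assuming the object's points are available as (x, y) pixel coordinates; the function name and the margin parameter are illustrative and not taken from the application:

```python
def bounding_box(points, margin=0):
    """Return the smallest axis-aligned rectangle that contains all points,
    optionally grown by a margin to allow for a margin of error.

    points: iterable of (x, y) pixel coordinates detected for the object
    returns: ((x_min, y_min), (x_max, y_max))
    """
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return ((min(xs) - margin, min(ys) - margin),
            (max(xs) + margin, max(ys) + margin))


# Example: pixels detected for an object in a captured frame.
print(bounding_box([(12, 40), (55, 18), (33, 72)], margin=2))
# ((10, 16), (57, 74))
```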

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • Mathematical Physics (AREA)
  • Dermatology (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This document discloses systems and methods for training a machine-learning model. For example, a method includes, by a processor, receiving an image of an object in an environment and receiving coordinates of a bounding box containing the object in the captured image. The method further includes, in response to receiving the bounding-box coordinates, training a machine-learning model by generating a configuration file containing the bounding-box coordinates; generating multiple copies of (i) the configuration file and (ii) at least the portion of the image within the bounding box; and applying the copies of the configuration file and of the image portion to the machine-learning model, yielding a machine-learning model that is trained to recognize the object in a video feed. The method further includes transmitting the trained machine-learning model to an electronic device for storage in a data store of the electronic device.
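A minimal sketch of the training flow summarized above, assuming a single captured image stored as a nested list of pixel rows; the configuration-file layout, the copy count, and the train_detector routine are illustrative assumptions rather than details taken from the application:

```python
import json

def build_training_set(image, box, copies=50):
    """Generate multiple copies of (i) a configuration file holding the
    bounding-box coordinates and (ii) the portion of the image inside the box.

    image:  nested list of pixel rows for the captured frame
    box:    ((x_min, y_min), (x_max, y_max)) bounding-box coordinates
    copies: number of duplicated (config, crop) training examples to produce
    """
    (x_min, y_min), (x_max, y_max) = box
    config = json.dumps({"x_min": x_min, "y_min": y_min,
                         "x_max": x_max, "y_max": y_max})
    crop = [row[x_min:x_max] for row in image[y_min:y_max]]
    return [(config, crop) for _ in range(copies)]

# The duplicated examples would then be fed to a detector's training routine,
# and the resulting trained model transmitted to the user's electronic device,
# e.g. model = train_detector(build_training_set(frame, corners))  # hypothetical
```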
PCT/US2023/070439 2022-07-19 2023-07-18 Procédé de développement d'un ensemble de données d'objets à partir d'images et leurs utilisations WO2024020408A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263368781P 2022-07-19 2022-07-19
US63/368,781 2022-07-19

Publications (1)

Publication Number Publication Date
WO2024020408A1 true WO2024020408A1 (fr) 2024-01-25

Family

ID=89618604

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/070439 WO2024020408A1 (fr) 2022-07-19 2023-07-18 Procédé de développement d'un ensemble de données d'objets à partir d'images et leurs utilisations

Country Status (1)

Country Link
WO (1) WO2024020408A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210035313A1 (en) * 2019-07-01 2021-02-04 Sas Institute Inc. Real-time concealed object tracking
US20220206576A1 (en) * 2020-12-30 2022-06-30 Imagine Technologies, Inc. Wearable electroencephalography sensor and device control methods using same

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ROY BIPARNAK, NANDY SUBHADIP, GHOSH DEBOJIT, DUTTA DEBARGHYA, BISWAS PRITAM, DAS TAMODIP: "MOXA: A Deep Learning Based Unmanned Approach For Real-Time Monitoring of People Wearing Medical Masks", TRANSACTIONS OF THE INDIAN NATIONAL ACADEMY OF ENGINEERING, vol. 5, no. 3, 1 September 2020 (2020-09-01), pages 509 - 518, XP093135900, ISSN: 2662-5415, DOI: 10.1007/s41403-020-00157-z *
SAI B N KRISHNA; SASIKALA T.: "Object Detection and Count of Objects in Image using Tensor Flow Object Detection API", 2019 INTERNATIONAL CONFERENCE ON SMART SYSTEMS AND INVENTIVE TECHNOLOGY (ICSSIT), IEEE, 27 November 2019 (2019-11-27), pages 542 - 546, XP033707921, DOI: 10.1109/ICSSIT46314.2019.8987942 *

Similar Documents

Publication Publication Date Title
US11461991B2 (en) Method of developing a database of controllable objects in an environment
CN106774818B Gesture recognition method, gesture recognition device and wearable device
CN109074819A Preferred control method for operation-sound based multi-modal commands and electronic device applying the same
CN107576022A Air conditioner control method, air conditioner, and storage medium
US20150323998A1 (en) Enhanced user interface for a wearable electronic device
US10806364B2 (en) Methods and apparatuses for electrooculogram detection, and corresponding portable devices
US20160334868A1 (en) Method and system for adapting a display based on input from an iris camera
CN104679246A Wearable device and control method for human-hand roaming control in an interactive interface
Pomboza-Junez et al. Hand gesture recognition based on sEMG signals using Support Vector Machines
CN105511750A Switching method and electronic device
CN106055954B Terminal unlocking method and terminal
CN110427108A Photographing method based on eye tracking and related products
CN109238306A Step-count data verification method and apparatus based on a wearable device, storage medium and terminal
CN110460774A Video recording control method and terminal
WO2024020408A1 (fr) Procédé de développement d'un ensemble de données d'objets à partir d'images et leurs utilisations
WO2021115097A1 Pupil detection method and related product
CN108375912A Smart home control method and related products
KR20160039589A Wireless spatial control device using finger sensing
US11340703B1 (en) Smart glasses based configuration of programming code
CN108769347A Information prompting method based on backlight brightness and related products
CN107749942A Hover photographing method, mobile terminal and computer-readable storage medium
CN104375631A Contactless interaction method based on a mobile terminal
CN111639628A Eye feature and movement recognition method and system
US20200089935A1 (en) Body information analysis apparatus capable of indicating shading-areas
CN104866112A Contactless interaction method based on a mobile terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23843829

Country of ref document: EP

Kind code of ref document: A1