WO2022211793A1 - Control systems for automatic barriers - Google Patents

Control systems for automatic barriers

Info

Publication number
WO2022211793A1
Authority
WO
WIPO (PCT)
Prior art keywords
barrier
control system
machine learning
learning model
sensor
Prior art date
Application number
PCT/US2021/025021
Other languages
English (en)
Inventor
Alexander Mark HAYES
Original Assignee
Hayes Alexander Mark
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hayes Alexander Mark filed Critical Hayes Alexander Mark
Priority to PCT/US2021/025021 priority Critical patent/WO2022211793A1/fr
Publication of WO2022211793A1 publication Critical patent/WO2022211793A1/fr

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • EFIXED CONSTRUCTIONS
    • E05LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05FDEVICES FOR MOVING WINGS INTO OPEN OR CLOSED POSITION; CHECKS FOR WINGS; WING FITTINGS NOT OTHERWISE PROVIDED FOR, CONCERNED WITH THE FUNCTIONING OF THE WING
    • E05F15/00Power-operated mechanisms for wings
    • E05F15/70Power-operated mechanisms for wings with automatic actuation
    • E05F15/73Power-operated mechanisms for wings with automatic actuation responsive to movement or presence of persons or objects
    • EFIXED CONSTRUCTIONS
    • E05LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05FDEVICES FOR MOVING WINGS INTO OPEN OR CLOSED POSITION; CHECKS FOR WINGS; WING FITTINGS NOT OTHERWISE PROVIDED FOR, CONCERNED WITH THE FUNCTIONING OF THE WING
    • E05F15/00Power-operated mechanisms for wings
    • E05F15/70Power-operated mechanisms for wings with automatic actuation
    • E05F15/73Power-operated mechanisms for wings with automatic actuation responsive to movement or presence of persons or objects
    • E05F15/74Power-operated mechanisms for wings with automatic actuation responsive to movement or presence of persons or objects using photoelectric cells
    • EFIXED CONSTRUCTIONS
    • E05LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05FDEVICES FOR MOVING WINGS INTO OPEN OR CLOSED POSITION; CHECKS FOR WINGS; WING FITTINGS NOT OTHERWISE PROVIDED FOR, CONCERNED WITH THE FUNCTIONING OF THE WING
    • E05F15/00Power-operated mechanisms for wings
    • E05F15/70Power-operated mechanisms for wings with automatic actuation
    • E05F15/79Power-operated mechanisms for wings with automatic actuation using time control
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • EFIXED CONSTRUCTIONS
    • E05LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05YINDEXING SCHEME ASSOCIATED WITH SUBCLASSES E05D AND E05F, RELATING TO CONSTRUCTION ELEMENTS, ELECTRIC CONTROL, POWER SUPPLY, POWER SIGNAL OR TRANSMISSION, USER INTERFACES, MOUNTING OR COUPLING, DETAILS, ACCESSORIES, AUXILIARY OPERATIONS NOT OTHERWISE PROVIDED FOR, APPLICATION THEREOF
    • E05Y2400/00Electronic control; Electrical power; Power supply; Power or signal transmission; User interfaces
    • EFIXED CONSTRUCTIONS
    • E05LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05YINDEXING SCHEME ASSOCIATED WITH SUBCLASSES E05D AND E05F, RELATING TO CONSTRUCTION ELEMENTS, ELECTRIC CONTROL, POWER SUPPLY, POWER SIGNAL OR TRANSMISSION, USER INTERFACES, MOUNTING OR COUPLING, DETAILS, ACCESSORIES, AUXILIARY OPERATIONS NOT OTHERWISE PROVIDED FOR, APPLICATION THEREOF
    • E05Y2400/00Electronic control; Electrical power; Power supply; Power or signal transmission; User interfaces
    • E05Y2400/10Electronic control
    • E05Y2400/45Control modes
    • E05Y2400/456Control modes for programming, e.g. learning or AI [artificial intelligence]
    • EFIXED CONSTRUCTIONS
    • E05LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05YINDEXING SCHEME ASSOCIATED WITH SUBCLASSES E05D AND E05F, RELATING TO CONSTRUCTION ELEMENTS, ELECTRIC CONTROL, POWER SUPPLY, POWER SIGNAL OR TRANSMISSION, USER INTERFACES, MOUNTING OR COUPLING, DETAILS, ACCESSORIES, AUXILIARY OPERATIONS NOT OTHERWISE PROVIDED FOR, APPLICATION THEREOF
    • E05Y2900/00Application of doors, windows, wings or fittings thereof
    • E05Y2900/10Application of doors, windows, wings or fittings thereof for buildings or parts thereof
    • E05Y2900/13Type of wing
    • E05Y2900/132Doors
    • EFIXED CONSTRUCTIONS
    • E05LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05YINDEXING SCHEME ASSOCIATED WITH SUBCLASSES E05D AND E05F, RELATING TO CONSTRUCTION ELEMENTS, ELECTRIC CONTROL, POWER SUPPLY, POWER SIGNAL OR TRANSMISSION, USER INTERFACES, MOUNTING OR COUPLING, DETAILS, ACCESSORIES, AUXILIARY OPERATIONS NOT OTHERWISE PROVIDED FOR, APPLICATION THEREOF
    • E05Y2900/00Application of doors, windows, wings or fittings thereof
    • E05Y2900/40Application of doors, windows, wings or fittings thereof for gates
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds

Definitions

  • This disclosure relates to control systems for barriers that open and close, such as roll-up doors, sliding doors, gates, turnstiles, and the like. Additionally, this disclosure relates to artificial intelligence, machine learning, and model predictive control.
  • FIG. 1 illustrates a diagram of the state of a barrier relative to time, according to one embodiment.
  • FIG. 2A illustrates actuation of a barrier according to different control systems for detected objects and predicted future objects, according to one embodiment.
  • FIG. 2B illustrates actuation of the barrier according to different control systems, according to one embodiment.
  • FIG. 2C illustrates a diagram of the actuation of lateral-moving door panels by different control systems, according to one embodiment.
  • FIG. 3A illustrates a block diagram of an example of a barrier control system, according to one embodiment.
  • FIG. 3B illustrates a block diagram of another example of a barrier control system, according to one embodiment.
  • FIG. 3C illustrates a block diagram of another example of a barrier control system, according to one embodiment.
  • FIG. 4A illustrates a flow chart of a method for calculating a minimum opening value for a barrier based on a trajectory calculated using a static, preprogrammed trajectory algorithm, according to one embodiment.
  • FIG. 4B illustrates a flow chart of a method for calculating a minimum opening value for a barrier based on a trajectory predicted by a path prediction machine learning model, according to one embodiment.
  • FIG. 4C illustrates a flow chart of a method for calculating a minimum opening value for a barrier before and after training of a machine learning model, according to one embodiment.
  • FIG. 4D illustrates a flow chart of a method for calculating a minimum opening value for a barrier based on future time increments during which an object is predicted to occupy a barrier actuation region using an occupancy prediction machine learning model, according to one embodiment.
  • FIG. 5A illustrates a block diagram of a barrier control system utilizing a machine learning model, according to one embodiment.
  • FIG. 5B illustrates a block diagram of another example of a barrier control system utilizing an end-to-end machine learning model, according to one embodiment.
  • FIG. 5C illustrates a block diagram of another example of a barrier control system utilizing an end-to-end machine learning model with some pre-processed data, according to one embodiment.
  • FIG. 5D illustrates a block diagram of another example of a barrier control system utilizing an end-to-end machine learning model with pre-processed data, according to one embodiment.
  • FIG. 6A illustrates a block diagram of a barrier control system that includes integrated sensors, according to one embodiment.
  • FIG. 6B illustrates an example of a barrier control system with integrated sensors mounted above a roll-up door, according to one embodiment.
  • FIG. 6C illustrates an example of a barrier control system with an integrated sensor mounted to the side of a roll-up door, according to one embodiment.
  • FIG. 6D illustrates the roll-up door partially rolled up to show a barrier actuation region that includes the leading edge of the roll-up door, according to one embodiment.
  • FIG. 7 illustrates an example field of view of sensors integrated into a barrier control system, according to one embodiment.
  • FIG. 8 illustrates a barrier control system analyzing trajectory information of nearby equipment and a person, according to one embodiment.
  • FIG. 9A illustrates a modeled trajectory with no acceleration, according to one embodiment.
  • FIG. 9B illustrates modeled trajectory possibilities of an object with relatively small acceleration possibilities, according to one embodiment.
  • FIG. 9C illustrates modeled trajectory possibilities of an object with relatively large acceleration, according to one embodiment.
  • FIG. 9D illustrates modeled trajectory possibilities of an object with relatively large acceleration with a known obstacle, according to one embodiment.
  • FIG. 10 illustrates an example diagram of predicted trajectories of an object relative to a barrier, according to one embodiment.
  • FIG. 11A illustrates a graphical representation of a predicted minimum required opening height and an actual minimum required opening height with respect to time, according to one embodiment.
  • FIG. 11B illustrates a graphical representation of the differences between the predicted minimum opening heights and the actual minimum opening heights used to train a machine learning process, according to one embodiment.
  • FIG. 12A illustrates a graph of the opening aperture of a door (e.g., door height) relative to time for two different control systems, according to one embodiment.
  • FIG. 12B illustrates a graph of the difference in the aperture opening of the doors controlled by the two different control systems of FIG. 12A, according to one embodiment.
  • a roll-up door may be used to control access to a refrigerated room.
  • When the door is rolled down (e.g., closed), the door provides a barrier that helps keep the refrigerated room cold.
  • the roll-up door can be manually or automatically rolled up, at least partially, to allow a person or vehicle to enter the refrigerated room. The door may remain open until the person or vehicle leaves the refrigerated room, or the door may be closed until the person or vehicle is ready to leave the refrigerated room.
  • a barrier may exist to prevent entry by unauthorized persons.
  • a barrier may only be present to prevent or reduce the flow of a fluid (e.g., a gas or a liquid) between two rooms (or between a room and outside of the room).
  • the barrier control systems and methods described herein are applicable to a wide variety of motorized and electronically opened barriers including, without limitation, automatic industrial doors, panel doors, roll-up doors, rolling doors, pivot turnstiles, folding doors, sliding doors, lifting gates, swinging doors, revolving doors, passenger access barriers, train doors, bus doors, subway doors, sectional panel doors, pivot arms, rotatable turnstiles, and the like.
  • many of the examples described herein are provided in the context of an electronically controlled (motorized) roll-up door that controls access to a refrigerated room.
  • many of the same principles, functionalities, systems, methods, subsystems, and other components of the barrier control systems and methods described herein are equally applicable or can be adapted for use with other types of barriers, including the various types of barriers listed above.
  • a barrier control system includes a sensor subsystem with one or more sensors to detect an object, such as a person, an animal, or a vehicle (e.g., a forklift, truck, car, or the like).
  • the sensor subsystem may determine various information about the object, including, without limitation, the location of the object relative to a barrier, the width of the object, the height of the object, velocity information, trajectory information, acceleration information, and/or characteristics of the object (e.g., the direction a person is looking, the direction a person's head is turned, the angle at which a person is leaning, etc.).
  • the sensor subsystem may also include sensors to monitor moisture, air pressure, temperature, and other environmental characteristics of one or both sides of the barrier and/or of the detected objects.
  • raw low-level sensor data is provided directly to a machine learning model (such as an occupancy prediction machine learning model and/or a path prediction machine learning model).
  • the raw low-level sensor data is pre-processed by specialized algorithms or machine learning models to generate high-level abstractions that are provided as inputs to the machine learning model.
  • raw low-level sensor data may be pre-processed by algorithms or specialized machine learning models to determine any of a wide variety of object characteristics, which can be provided as inputs to the machine learning model for predicting an object's trajectory.
  • object characteristics that can be determined via the pre-processing algorithms or specialized machine learning models (or by the machine learning model that receives the raw data) include, but are not limited to, object position, object velocity, object acceleration, object heights, object widths, object depths, object types (e.g., identification), object orientation, a direction in which a pedestrian or a vehicle operator is looking, a size of a load, identification of tools being carried, hand movements of a pedestrian or operator of a vehicle (e.g., movement of a steering wheel), angle of a body (e.g., the direction a pedestrian or operator of a vehicle is leaning), and/or the like.
  • the object characteristics may include other derivatives of position in addition to velocity and acceleration including, for example, the object's snap or jounce, the object's crackle, and/or the object's pop.
  • any combination of the above-identified object characteristics may be provided as pre-calculated high-level abstraction inputs into the machine learning model.
  • the machine learning model may be trained using raw data and the above-identified object characteristics may be explicitly determined by the machine learning model, impliedly determined by the machine learning model, or not determined at all if they are not used by the trained neural network.
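As a minimal illustration of providing pre-calculated high-level abstractions as model inputs, the sketch below flattens a set of the object characteristics named above into a fixed-length input row. The `TrackedObject` container, its field layout, and the example values are hypothetical, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    position: tuple      # (x, y) in meters, relative to the barrier
    velocity: tuple      # (vx, vy) in m/s
    acceleration: tuple  # (ax, ay) in m/s^2
    height_m: float
    width_m: float
    object_type: int     # e.g., 0 = pedestrian, 1 = forklift

def feature_vector(obj):
    # Flatten the pre-computed high-level abstractions into a single
    # fixed-length input row for a trajectory prediction model.
    return [*obj.position, *obj.velocity, *obj.acceleration,
            obj.height_m, obj.width_m, float(obj.object_type)]

# A pedestrian 4 m away, walking toward the barrier at 1.2 m/s.
obj = TrackedObject((4.0, 1.5), (-1.2, 0.0), (0.0, 0.0), 2.0, 0.6, 0)
print(feature_vector(obj))  # a 9-element input row
```

A model trained end-to-end on raw sensor data would skip this step entirely, as the surrounding text notes.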
  • sensors include, but are not limited to, LiDAR sensors, stereo cameras (e.g., visible light or infrared), time of flight cameras, ultrasound sensors, induction coils, pressure sensors, radar, imaging millimeter-wave sensors (e.g., mmWave sensors), temperature sensors, humidity sensors, and the like.
  • the sensor subsystem may utilize the collected data to generate two-dimensional point clouds, three-dimensional point clouds, visible light images, false-color images, tables or graphs representing object positions and velocities as a time series, frequency responses of measured objects, and the like.
  • Processing of the sensor data can be performed locally (i.e., within the barrier control system) and/or remotely (e.g., in remote servers or computing devices).
  • the sensor subsystem includes at least one sensor to monitor the location of a leading edge of the barrier being opened. For example, at least one sensor may monitor the relative location of the bottom edge of a roll-up door or the opposing edges of a two-panel pivoting door that opens inward or outward.
  • a sensor subsystem captures images or other sensory data to detect an object and detect the relative location of at least one edge of a barrier.
  • An object analysis subsystem may utilize the data from the sensor subsystem to calculate trajectory information of the object.
  • the sensor subsystem may capture images via a red, green, blue (RGB) image sensor and/or generate a point cloud via a LiDAR sensor.
  • the object analysis subsystem may process the data from the sensor subsystem to calculate trajectory information of an object proximate to the barrier and/or size information of the object.
  • any of a wide variety of machine learning algorithms may be utilized for the occupancy prediction and/or path prediction machine learning models described herein.
  • a feedforward artificial neural network, a long short-term memory artificial neural network, or another neural network can be trained to predict the minimally required opening degree of an aperture (e.g., the height of a door, the width of a door, the pivot angle of a gate, etc.) based on object size and predicted trajectory.
  • the output of the machine learning model may provide minimally required opening state information for each of a plurality of future increments of time. As the predicted trajectory of a given object changes, the minimally required opening state information for a given increment of time may also change.
  • a motor control unit may determine opening and/or closing actions to take at a given time based on the minimally required opening state information provided by the machine learning model.
  • the motor control unit may utilize principles of model predictive control to control the opening and closing of the barrier, as discussed in greater detail below.
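One way a motor control unit could act on such a per-increment schedule, in the spirit of model predictive control, is sketched below: the commanded opening now is the largest value needed so that every future constraint remains reachable. The function name, the fixed opening speed, and the uniform time increments are illustrative assumptions, not details from the disclosure:

```python
def required_now(min_heights, opening_speed, dt):
    # min_heights[k] is the minimally required opening (in meters) at
    # future increment k. Opening at opening_speed (m/s), the door must
    # already be at least (h - opening_speed * k * dt) open now to
    # satisfy increment k, so command the largest such value.
    return max((h - opening_speed * k * dt
                for k, h in enumerate(min_heights)), default=0.0)

# 3 m required two 1-second increments from now; door opens at 1 m/s.
cmd = required_now([0.0, 0.0, 3.0, 3.0, 0.0], opening_speed=1.0, dt=1.0)
print(cmd)  # 1.0: begin opening now to reach 3 m in time
```

Recomputing this command as the predicted schedule changes mirrors the receding-horizon character of model predictive control.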
  • a basic automatic door may include a sensor that opens the door when it detects a moving object within specific regions on each side of the door.
  • a basic automatic door for a refrigerated room may open to a maximum height at a fixed opening speed in response to detecting a person or a vehicle within an external triggering region. The door may remain fully open until the person or vehicle is no longer within the external triggering region. In some instances, the person or vehicle may not enter the refrigerated room, in which case the door opened unnecessarily. If the person or vehicle does enter the refrigerated room, the door may remain fully open until the person or vehicle is outside of a corresponding internal triggering region.
  • a basic automatic door may open to a larger aperture than necessary, open too soon (e.g., for a slow-moving person), open too late (e.g., for a fast-moving vehicle), remain open for longer than necessary, and/or otherwise provide suboptimal operation.
  • the presently described barrier control systems and methods reduce the passage of air or other fluids during opening cycles, reduce the chance of collisions between vehicles, pedestrians, and the barrier, reduce temperature fluctuations, reduce moisture fluctuations, save energy, reduce the exchange of contaminants, reduce the likelihood of tailgating by unauthorized individuals and animals, provide controlled access, provide notifications of unexpected behaviors, provide notifications of accidents, and/or facilitate automatic maintenance scheduling.
  • a computing device, system, subsystem, module, or controller may include a processor, such as a microprocessor, a microcontroller, logic circuitry, or the like.
  • a processor may include one or more special-purpose processing devices, such as application-specific integrated circuits (ASICs), a programmable array logic (PAL), a programmable logic array (PLA), a programmable logic device (PLD), a field-programmable gate array (FPGA), or another customizable and/or programmable device.
  • the computing device may also include a machine-readable storage device, such as non-volatile memory, static RAM, dynamic RAM, ROM, CD-ROM, disk, tape, magnetic, optical, flash memory, or another machine-readable storage medium.
  • Various aspects of certain embodiments may be implemented or enhanced using hardware, software, firmware, or a combination thereof.
  • FIG. 1 illustrates a diagram 100 of the state of a barrier relative to time, according to one embodiment.
  • the vertical axis represents the actuation of the barrier between a closed state and an open state.
  • the horizontal axis represents time and includes a current time marker 105.
  • a detected object 107 requires the barrier to be opened to a first actuation state.
  • a safety margin 110 can be applied to ensure clearance to avoid collisions between the object traversing the barrier and the barrier. Additional objects 150 are predicted to traverse the barrier in the near future, as described in greater detail below.
  • a traditional barrier control system such as a motion-based control system 115 shows an actuation that is slightly too late, intersecting the safety margin zone, and ultimately opening much more than necessary.
  • a barrier control system based on a trained machine-learning model (as described in greater detail herein), opens to the height specified according to the safety margin, but no farther.
  • FIG. 2A illustrates actuation of a barrier according to different control systems 220 and 230 for detected objects 210 and predicted future objects 250, according to one embodiment.
  • the detected objects 210 and predicted future objects 250 are shown with safety margins (if any) included.
  • the traditional, motion-based controller 220 opens much more than necessary (e.g., beyond 6 meters).
  • the machine-learning-based control system 230 opens only as much as needed to avoid collisions.
  • the integration of the region beneath the lines of the motion-based controller 220 and the machine-learning-based control system 230 corresponds to possible airflow that traverses the barrier. In the case of refrigerated rooms, minimizing airflow that traverses the barrier can reduce cooling costs and/or temperature fluctuations.
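That area-under-the-curve comparison can be approximated numerically, for example with the trapezoidal rule. The example opening profiles below are illustrative assumptions, not values taken from the figures:

```python
def aperture_area(times, heights):
    # Trapezoidal integral of opening height over time (meter-seconds);
    # a rough proxy for the air volume exchanged while the barrier is open.
    area = 0.0
    for (t0, h0), (t1, h1) in zip(zip(times, heights),
                                  zip(times[1:], heights[1:])):
        area += 0.5 * (h0 + h1) * (t1 - t0)
    return area

t = [0, 2, 4, 6, 8]  # seconds
motion_based = aperture_area(t, [0, 6.0, 6.0, 6.0, 0])      # opens fully
learned      = aperture_area(t, [0, 2.2, 2.2, 2.2, 0])      # object + margin
print(round(motion_based, 1), round(learned, 1))  # 36.0 13.2
```

Under these assumed profiles, the learned controller exchanges roughly a third of the air of the motion-based one.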
  • FIG. 2B illustrates actuation of the barrier according to different control systems, according to one embodiment.
  • the predicted objects 250 from FIG. 2A have now traversed the barrier as actual objects 251.
  • the motion-based controller 220 opens much more than necessary (e.g., beyond 6 meters), while the machine-learning-based control system 230 opens only as much as needed to avoid collisions.
  • FIG. 2C illustrates a diagram 203 of the actuation of lateral-moving door panels 280 and 285 by different control systems, according to one embodiment.
  • a traditional, motion-based controller (with a timer) 264 causes the lateral-moving door panels 280 and 285 to open much more than necessary.
  • the machine-learning-based control system 262 opens only as much as needed 260 to avoid collisions, or more specifically to a width that includes a fixed or learned safety margin 261.
  • the machine-learning-based control system 262 may learn from forklift operators that slow down due to the doors being opened only just enough (e.g., to the width of the object 260).
  • FIG. 3A illustrates a block diagram of an example of a barrier control system 301 that includes a sensor subsystem 310, an aperture analysis subsystem 320, a motor control interface 330, a path prediction machine learning model 340, an occupancy prediction machine learning model 345, and a machine learning training subsystem 350.
  • the sensor subsystem 310 includes one or more sensors or sensor arrays to capture data from a region or regions proximate to a barrier.
  • the sensor subsystem 310 may detect an object in motion, such as a vehicle or pedestrian, that is within a threshold distance from the barrier (e.g., within the range and field of view of the sensors in the sensor subsystem 310).
  • the sensor subsystem 310 may capture images of a region proximate to a barrier (e.g., via an RGB image sensor or a LiDAR scanning sensor).
  • At least one sensor of the sensor subsystem 310 is positioned and configured with a field of view that allows for detection of an edge of the barrier that moves as the barrier is opened and closed.
  • the sensor subsystem 310 may include a camera that is positioned with a field of view sufficient to capture images of objects as they approach a region in front of a roll-up door and images of the leading edge of the roll-up door as it transitions from a closed state to an open state.
  • the sensor subsystem 310 may comprise any of a wide variety and/or combination of sensors.
  • suitable sensors include, but are not limited to, LiDAR sensors, stereo cameras (e.g., visible light or infrared), time of flight cameras, ultrasound sensors, induction coils, pressure sensors, radar, imaging millimeter- wave sensors (e.g., mmWave sensors), temperature sensors, humidity sensors, and the like.
  • the sensor subsystem may utilize the collected data to generate three-dimensional images, two-dimensional point clouds, three-dimensional point clouds, visible light images, false-color images, tables or graphs representing object positions and velocities as a time series, frequency responses of measured objects, and/or the like.
  • the sensor subsystem 310 includes two cameras positioned above or beside a door (e.g., a stereo camera system) to capture images outside of a room (i.e., on one side of the door) and two more cameras positioned above or beside the door to capture images inside of the room (i.e., on the other side of the door).
  • a framerate for image capture may be, for example, 15, 30, 60, 100, or 240 frames per second.
  • Each stereo camera system may, for example, provide a 70° vertical field of view and a 120° lateral field of view.
  • Each stereo camera system may be positioned such that one edge of the vertical field of view can capture images of the leading edge of the door when it is fully open or approximately fully open. In such a position, the stereo camera system can be used to monitor the position of door panels or other portions of a barrier and detect objects at a relatively far distance from the barrier without any blind spots.
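As a rough illustration of that mounting geometry: if one edge of the 70° vertical field of view points straight down to keep the door's leading edge in frame, the floor distance covered by the opposite edge follows from simple trigonometry. The 4 m mounting height below is an assumed value, not one stated in the disclosure:

```python
import math

def ground_coverage(mount_height_m, vfov_deg):
    # If one edge of the camera's vertical field of view points straight
    # down (to watch the door's leading edge), the opposite edge meets
    # the floor at mount_height * tan(vfov) from the mounting point.
    return mount_height_m * math.tan(math.radians(vfov_deg))

# Assumed 4 m mounting height with the 70-degree vertical field of view.
print(round(ground_coverage(4.0, 70.0), 1))  # 11.0 m of floor in view
```

This is why such a placement can watch both the barrier itself and objects at a relatively far distance without blind spots.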
  • the sensor subsystem 310 can track the position of multiple objects at the same time and calculate velocity vectors for each object.
  • the sensor subsystem 310 may generate a three-dimensional bounding box for each object that specifies a height, width, and/or depth.
  • the sensor subsystem 310 can detect and distinguish between different objects (e.g., detect an object as either a pedestrian or a forklift).
  • the path prediction machine learning model 340 may be trained to detect and distinguish between different objects.
  • the sensor subsystem 310 includes temperature compensation circuits to compensate for temperature fluctuations and/or an accelerometer to align the coordinate system and facilitate detection of the floor of a facility.
  • the aperture analysis subsystem 320 may calculate size information of an object (e.g., height information relevant to the height a roll-up door must open, or width information relevant to the width pivoting lateral doors must open).
  • the size information may be provided by the sensor subsystem 310, while in other embodiments, the aperture analysis subsystem 320 may calculate the size information using images or point cloud information provided by the sensor subsystem.
  • the aperture analysis subsystem 320 may predict a trajectory' of each detected object using the trained path prediction machine learning model 340.
  • the path prediction machine learning model 340 may receive images or a point cloud as an input and be trained to detect objects, determine object locations, and predict trajectories.
  • the data input into the path prediction machine learning model 340 may include high-level detection results that can serve to expedite training on trajectory predictions.
  • the machine learning model 340 may be trained to implicitly develop its own object detection and position tracking functionalities.
  • the machine learning model 340 can be more quickly trained and/or refined to predict object trajectories.
  • the aperture analysis subsystem 320 may determine a minimum aperture value for the barrier to attain for each of a plurality of future time increments.
  • the motor control interface 330 may communicate the minimum aperture values for each future time increment to a motor control unit.
  • the motor control unit may initiate the opening and closing of the barrier to ensure the minimum aperture values are attained at each respective time increment.
  • the actual route of the object may be tracked by the sensor subsystem 310 and provided to a machine learning training subsystem 350.
  • the machine learning training subsystem 350 can use the actual route of the object to further improve or refine the path prediction machine learning model 340.
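A minimal sketch of the error signal such a training subsystem might compute, comparing the predicted minimum openings against the openings actually required once the object's real route is known. The per-increment residual form, function name, and example values are assumptions for illustration:

```python
def training_errors(predicted_min_heights, actual_min_heights):
    # Per-increment residual between the opening height the model
    # predicted would be required and the height actually required once
    # the object's route was observed; these residuals can serve as a
    # training signal for refining the path prediction model.
    return [round(p - a, 3)
            for p, a in zip(predicted_min_heights, actual_min_heights)]

# Predictions slightly overshot at the first two increments.
print(training_errors([2.2, 3.1, 3.0], [2.0, 3.0, 3.0]))  # [0.2, 0.1, 0.0]
```

Positive residuals indicate the door was commanded open more than the traversal actually required.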
  • the sensor subsystem 310 may detect a forklift and two pedestrians proximate the barrier.
  • the forklift may be determined to have a height of 3 meters and each pedestrian may be determined to have a height of 2 meters.
  • the aperture analysis subsystem 320 may utilize the path prediction machine learning model 340 to predict trajectories of all three objects.
  • the predicted trajectory of the forklift may indicate that the forklift will traverse a roll-up door from time increments 3 to 5.
  • the barrier control system may inform the motor control unit, via the motor control interface 330, that the roll-up door should be opened to a minimum height of 3 meters during time increments 3 to 5.
  • the predicted trajectory of the first pedestrian may indicate that the first pedestrian does not intend to enter the room via the roll-up door. Accordingly, no minimum height information may be transmitted to the motor control unit.
  • a trajectory of the second pedestrian, calculated in isolation using a static, preprogrammed trajectory algorithm, might normally indicate that the second pedestrian will traverse the roll-up door from time increments 5 to 9.
  • the training of the path prediction machine learning model may have included numerous training data in which a pedestrian and a forklift are entering a room at approximately the same time. In this training data, the pedestrians may have frequently yielded to the forklift.
  • the path prediction machine learning model 340 may predict a trajectory for the second pedestrian indicating that the second pedestrian will traverse the roll-up door from time increments 6 to 10. Accordingly, the barrier control system may inform the motor control unit, via the motor control interface 330, that the roll-up door should be opened to a minimum height of 2 meters during time increments 6 to 10.
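The forklift-and-pedestrians example can be sketched as combining per-object requirements into a per-increment minimum: at each increment, the door must clear the tallest object predicted to be traversing. The tuple layout and horizon length below are illustrative assumptions:

```python
def combine_requirements(requirements, horizon):
    # requirements: (min_height_m, first_increment, last_increment) per
    # object predicted to traverse the barrier. Per increment, the door
    # must open at least as high as the tallest object present.
    mins = [0.0] * horizon
    for height, first, last in requirements:
        for k in range(first, last + 1):
            mins[k] = max(mins[k], height)
    return mins

# Forklift: 3 m during increments 3-5; second pedestrian: 2 m during 6-10.
print(combine_requirements([(3.0, 3, 5), (2.0, 6, 10)], horizon=12))
# [0.0, 0.0, 0.0, 3.0, 3.0, 3.0, 2.0, 2.0, 2.0, 2.0, 2.0, 0.0]
```

The first pedestrian, predicted not to enter, simply contributes no entry to the list.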
  • the barrier control system 301 may utilize the occupancy prediction machine learning model 345 instead of or in addition to the path prediction machine learning model 340.
  • the occupancy prediction machine learning model 345 may utilize low-level data, raw data, pre-processed data (high-level data) and/or other sensor data collected by a sensor subsystem.
  • the occupancy prediction machine learning model 345 may identify future time increments (e.g., a set of continuous or discontinuous time increments) during which a detected object (or objects) is predicted (by the machine learning model 345) to occupy a barrier actuation region of a barrier.
  • the barrier control system 301 may predict barrier opening states suitable to avoid collisions between the barrier and the object for the future time increments during which the object detected by the sensor subsystem is predicted to occupy the barrier actuation region.
  • FIG. 3B illustrates a block diagram of another example of a barrier control system 302 that further includes the motor control unit 335, a preprogrammed trajectory algorithm subsystem 360, and a model predictive control subsystem 370.
  • the barrier control system 302 includes an integrated motor control unit 335.
  • the barrier control system 301 of FIG. 3A communicates with an external and independently functioning motor control unit.
  • the barrier control system 302 also includes a preprogrammed trajectory algorithm subsystem 360 that can calculate a worst-case trajectory based on fixed variables.
  • a barrier control system may be deployed for operation in a facility with minimal or no training of the machine learning model 341 (e.g., the path prediction machine learning model 340 (FIG. 3A) and/or the occupancy prediction machine learning model 345 (FIG. 3B)).
  • the untrained or undertrained machine learning model 341 may not yet be capable of accurately predicting object trajectories.
  • the barrier control system may initially utilize a static, preprogrammed trajectory algorithm subsystem 360 to calculate the trajectories of detected objects (e.g., a worst-case trajectory).
  • the aperture analysis subsystem 320 may determine aperture values (e.g., minimum aperture values) for the barrier to attain during future time increments based on the trajectory calculated for each detected object.
  • the preprogrammed trajectory algorithm subsystem 360 may calculate a worst-case trajectory based on a function of the current position and velocity of a detected object and default or preprogrammed maximum possible acceleration values. For example, while a trained neural network may correctly predict that a forklift driver driving away from a barrier is unlikely to turn around and go through the barrier, a worst-case trajectory algorithm may indicate that such a trajectory is possible and, therefore, instruct the motor control unit to hold the barrier open to a specific height during specific time increments "just in case" the worst-case trajectory is realized.
  • the actual route of the forklift in the example above may be used as training data to improve the machine learning model(s) 341.
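A minimal sketch of the worst-case idea: given an object's current distance and velocity toward the barrier, assume it accelerates straight at the barrier at a per-type maximum. The acceleration values and function names here are illustrative assumptions, not values from the source:

```python
import math

# Illustrative per-type kinematic caps (assumed values).
MAX_ACCEL_M_S2 = {"forklift": 1.5, "pedestrian": 2.0}

def earliest_arrival_s(distance_m, velocity_m_s, obj_type):
    """Worst-case (earliest) time at which the object could reach the
    barrier, solving distance = v*t + 0.5*a*t^2 for t >= 0 with the
    object's maximum acceleration a. A negative velocity (moving away)
    is clamped to zero: the object 'could' still turn around."""
    a = MAX_ACCEL_M_S2[obj_type]
    v = max(velocity_m_s, 0.0)
    return (-v + math.sqrt(v * v + 2.0 * a * distance_m)) / a
```

The motor control unit would then be told to have the barrier open by this earliest possible arrival time, even when the object is currently moving away.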
  • the machine learning model 341 has demonstrated an ability to predict trajectories of objects with sufficient accuracy (in the case of a path prediction machine learning model) or predict time increments during which the object is calculated/estimated/predicted to occupy the barrier actuation region of the barrier (in the case of the occupancy prediction machine learning model).
  • the barrier control system 302 may begin utilizing the trajectories predicted by a path prediction machine learning model and/or the occupancy predictions generated by the occupancy prediction machine learning model instead of the trajectories calculated using the preprogrammed trajectory algorithm subsystem 360.
  • the barrier control system 302 also includes a model predictive control subsystem 370 to enhance the operation of the motor control unit 335.
  • the motor control unit 335 may attempt to open the barrier to the minimum aperture value specified for each time increment by the aperture analysis subsystem 320.
  • the barrier control system may utilize principles of model predictive control to determine an optimal control input to be executed by the motor control unit 335, as described in greater detail below.
  • FIG. 3C illustrates a block diagram of another example of a barrier control system 303, according to one embodiment.
  • the barrier control system 303 includes a bus 305 that connects a processor 307, a memory 309, and a network interface 311 to a computer-readable storage medium 390.
  • the computer-readable storage medium 390 includes a sensor subsystem module 391, a machine learning model 392 (e.g., for path prediction and/or occupancy prediction), a minimum barrier opening state calculation module 393, a preprogrammed trajectory algorithm module 394, a trajectory training data feedback module 395, and a prediction accuracy module 396.
  • the sensor subsystem module 391 may receive and process data from one or more sensors to detect objects proximate to the barrier, categorize or identify the object, detect a relative location of an edge of the barrier or otherwise determine a state of openness of the barrier, calculate size or other dimension information of the object, and the like.
  • the machine learning model 392 may be trained using various training data from local sources and/or training data gathered from other, remote locations by other barrier control systems.
  • the machine learning model 392 may be used to predict a trajectory and/or occupancy of one or more detected objects.
  • an occupancy prediction machine learning algorithm may predict time increments during which the object is predicted to occupy a barrier actuation region of the barrier (e.g., a region of space where a roll-up or sliding door moves).
  • the machine learning model 392 predicts the trajectory and/or occupancy of a given object based on the current object position, current object velocity, type of object, size of object, and/or the predicted trajectories of other objects. Accordingly, in some instances, the machine learning model may be trained such that a predicted trajectory or occupancy state (e.g., future time increments during which the object is predicted to occupy the barrier actuation region of the barrier) of an object is based, at least in part, on the relative locations of other objects and/or the predicted trajectories of the other objects.
  • a minimum barrier opening state calculation module 393 may determine a minimum aperture value for the barrier to attain for each of a plurality of future time increments based on the calculated size information and predicted trajectory of the object. In the event that more than one object is expected to traverse the barrier at the same time or approximately the same time, the minimum aperture value for the barrier during the relevant time increments will be the greater of the individual minimum aperture values to ensure that neither of the two objects collides with the barrier.
  • the preprogrammed trajectory algorithm module 394 may be used to ensure safe actuation and opening of the barrier until the machine learning model is fully or sufficiently trained.
  • the preprogrammed trajectory algorithm module 394 may calculate a trajectory of detected objects based on a worst-case scenario in which detected objects accelerate toward the barrier at a selected maximum acceleration value and/or maximum velocity value.
  • the selected maximum acceleration and/or velocity values may be a theoretical maximum value for the given type of object, or it may be a lower, fixed value manually selected as reasonable for the type of object in question.
  • the maximum theoretical velocity value of a pedestrian may be around 11 m/s; however, the preprogrammed trajectory algorithm module 394 may utilize a manually assigned value of 6 m/s based on the assumption that pedestrians are sufficiently unlikely to sprint through the barrier. Similarly, the preprogrammed trajectory algorithm module 394 may utilize other, reasonable acceleration and velocity values to calculate worst-case trajectories for each of a variety of different types of objects.
  • the training data feedback module 395 may utilize actual route information of an object, as detected by the sensor subsystem, as training data to improve the machine learning model 392.
  • the prediction accuracy module 396 may evaluate the trajectory predictions and/or occupancy predictions made by the machine learning model 392 and the actual route of objects to determine when the barrier control system 303 can switch from using the preprogrammed trajectory algorithm module 394 to the machine learning model 392. For example, the prediction accuracy module 396 may determine that the machine learning model 392 is sufficiently trained based on a sequential count of accurate trajectory and/or occupancy predictions exceeding (e.g., equal to or greater than) a threshold value.
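One plausible reading of this switchover criterion, sketched below. The class name and the positional-error tolerance used to decide whether a prediction counts as "accurate" are assumptions for illustration:

```python
import math

class PredictionAccuracyTracker:
    """Counts consecutive sufficiently accurate predictions; once the
    streak reaches the threshold, the model is marked as sufficiently
    trained. Any inaccurate prediction resets the streak."""
    def __init__(self, threshold, tolerance_m=0.5):
        self.threshold = threshold
        self.tolerance_m = tolerance_m
        self.streak = 0
        self.sufficiently_trained = False

    def record(self, predicted_pos, actual_pos):
        # Compare the predicted position against the route the sensor
        # subsystem actually observed.
        if math.dist(predicted_pos, actual_pos) <= self.tolerance_m:
            self.streak += 1
        else:
            self.streak = 0
        if self.streak >= self.threshold:
            self.sufficiently_trained = True
        return self.sufficiently_trained
```

Once `sufficiently_trained` latches to true, the system can stop running the preprogrammed trajectory algorithm for normal operation.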
  • FIG. 4A illustrates a flow chart of a method 401 for calculating a minimum opening value for a barrier based on a trajectory calculated using a static, preprogrammed trajectory algorithm, according to one embodiment.
  • a sensor subsystem is used to detect, at 410, an object proximate a barrier.
  • the system calculates, at 425, an object trajectory using a static, preprogrammed trajectory algorithm.
  • the calculated trajectory is used to calculate, at 435, a minimum opening value for each of a plurality of future time increments for the barrier.
  • the minimum opening values for each of the future time increments are transmitted, at 470, to a motor control unit.
  • FIG. 4B illustrates a flow chart of a method 402 for calculating a minimum opening value for a barrier based on a trajectory predicted by a path prediction machine learning model, according to one embodiment.
  • the sensor subsystem is still used to detect, at 410, the object proximate the barrier.
  • the system predicts, at 420, an object trajectory using a path prediction machine learning model.
  • the system calculates, at 430, a minimum opening value for each of a plurality of future time increments for the barrier.
  • the minimum opening values for each of the future time increments are transmitted, at 460, to a motor control unit.
  • the system then provides, at 480, an actual route of the object as detected by the sensor subsystem as training data for the path prediction machine learning model.
  • FIG. 4C illustrates a flow chart of a method for calculating a minimum opening value for a barrier before and after training of a machine learning model, according to one embodiment.
  • FIG. 4A above outlines a method 401 that uses an algorithm to calculate a trajectory based on, for example, a worst-case scenario.
  • FIG. 4B illustrates a method 402 that uses a trained machine learning model.
  • FIG. 4C relates to a method in which a preprogrammed trajectory algorithm is used to calculate a trajectory while the path prediction machine learning model is being trained.
  • the system may be immediately deployed for operation. During the initial training period, the barrier opens to prevent collisions using the preprogrammed trajectory algorithm.
  • the initial operation may cause the barrier to open earlier than necessary, open unnecessarily, open to a wider aperture than needed, close later than possible, and/or exhibit other inefficiencies.
  • the system may discontinue using the preprogrammed trajectory algorithm in favor of the path prediction machine learning model.
  • the sensor subsystem detects, at 410, an object proximate to the barrier.
  • a path prediction machine learning model is used to predict, at 420, an object trajectory, and an object trajectory is calculated, at 425, using the static, preprogrammed trajectory algorithm.
  • Minimum opening values are calculated, at 430 and 435, for each of a plurality of future time increments based on the predicted and calculated trajectories, respectively. If the machine learning model is sufficiently trained, at 450, then the minimum opening values calculated based on the trajectory predicted by the path prediction machine learning model are transmitted, at 460, to a motor control unit.
  • the minimum opening values calculated based on the trajectory calculated using the static, preprogrammed trajectory algorithm are transmitted, at 470, to the motor control unit.
  • elements 425, 435, and 470 may be omitted from the flow to avoid unnecessary computations.
  • the sensor subsystem may be used to continually track the route of the object and the actual route of the object may be provided, at 480, as training data to improve the accuracy of the path prediction machine learning model.
  • the path prediction machine learning model may be marked, at 490, as sufficiently trained once the sequential count of accurate predictions exceeds a threshold training value, as described herein.
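Under the numbered steps of FIG. 4C, one pass of the flow might be sketched as follows; all function names and the data shapes are hypothetical:

```python
def control_step(detection, ml_opening_fn, static_opening_fn,
                 is_trained, send_to_motor, training_routes):
    """One pass of a FIG. 4C-style flow: compute minimum opening values
    from the machine learning model (420/430) once it is sufficiently
    trained (450), otherwise from the static worst-case algorithm
    (425/435); transmit the applicable values (460/470); and record the
    detection so its actual route can later serve as training data (480)."""
    values = (ml_opening_fn(detection) if is_trained
              else static_opening_fn(detection))
    send_to_motor(values)
    training_routes.append(detection)
    return values
```

The conditional also reflects the note that, once trained, the static branch can be skipped entirely to avoid unnecessary computation.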
  • FIG. 4D illustrates a flow chart of a method for calculating a minimum opening value for a barrier based on future time increments during which an object is predicted to occupy a barrier actuation region using an occupancy prediction machine learning model, according to one embodiment.
  • the sensor subsystem is still used to detect, at 411, the object proximate the barrier.
  • the system identifies, at 421, future time increments during which an object is predicted to occupy a barrier actuation region using an occupancy prediction machine learning model.
  • the system calculates, at 431, an opening value (e.g., a minimum opening value) for the future time increments.
  • the opening values for each of the future time increments are transmitted, at 461, to a motor control unit.
  • the system then provides, at 481, an actual route or other position information of the object as detected by the sensor subsystem as training data for the occupancy prediction machine learning model.
  • FIG. 5 A illustrates a block diagram of a machine learning model for training a barrier control system, according to one embodiment.
  • the illustrated embodiment includes a sensor array 510 that may, for example, be a part of a sensor subsystem.
  • the sensor array captures raw or low-level data that can be directly transmitted 517 to the aperture analysis subsystem 550 that includes a machine learning model 530.
  • the raw or low-level data 515 may be transmitted to specialized pre-trained algorithms 520 that perform specific tasks, such as position tracking, determining the leaning angles of people, estimating a forklift load type and/or weight, and/or generating a three-dimensional point cloud.
  • the high-level abstractions 525 generated by the specialized pre-trained algorithms 520 may be provided as inputs to the machine learning model 530.
  • while the machine learning model 530 could potentially be trained using all the raw or low-level data 515, the training would likely require more training samples to accurately predict object trajectories.
  • Specific sensor data may be pre-processed by specialized pre-trained algorithms that operate independently of the facility in which the barrier control system is installed.
  • a pre-trained algorithm 520 may be used to detect an object's position and provide a high-level abstraction 525 of the object's position as an input into the machine learning model 530.
  • the data captured by the sensor array 510 can be recorded and transmitted to the machine learning model 530 in a raw or low-level format.
  • the data may be transmitted using wireless networking infrastructure, wired network infrastructure, various wireless protocols, via the Internet, etc.
  • the machine learning model 530 may receive the raw or low-level data and be trained to predict object trajectories. The relative significance of the received raw or low-level data does not need to be determined ahead of time.
  • the raw data may be low-level processed on-site or in the cloud prior to being provided to the machine learning model 530.
  • the machine learning model 530 may be trained to predict object trajectories using pre-processed high-level abstractions 525, such as object positions, the leaning angles of persons, forklift load type, forklift, weight estimations, a three-dimensional point cloud, etc.
  • the machine learning algorithm 530 does not need to learn the underlying processes of object detection, the leaning angles of persons, etc.
  • the machine learning model 530 can leverage existing trained models and algorithms to make better use of the training data, which is especially important during initialization and early usage when the machine learning model is "data-starved."
  • the proposed aperture analysis subsystem may utilize a modular approach that includes a machine learning model 530 that receives, as inputs, high-level abstractions 525 from pre-built or pre-trained algorithms 520 for specialized analysis and raw low-level sensor data 515.
  • the machine learning model may be a path prediction machine learning model or an occupancy prediction machine learning model, as described herein.
  • the machine learning model 530 may utilize the high-level abstractions 525 from the pre-built or pre-trained algorithms 520 during initial training and then use raw or low-level sensor data 517 to refine the training over time.
  • duplicate data (pre-processed high-level abstractions 525 and raw data 517) may be available to the machine learning model 530.
  • the barrier control system may limit or restrict data transmission and/or storage to high-level abstraction data 525 to intentionally decrease the training data size, reduce data bandwidth requirements, and/or reduce data storage requirements.
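The modular input assembly of FIG. 5A might be sketched as below; the extractor interface and the flat feature-vector representation are assumptions for illustration:

```python
def build_model_input(raw_data, extractors, include_raw=True):
    """Concatenate high-level abstractions (e.g., tracked positions,
    leaning angles, load estimates) produced by pre-trained extractors
    with, optionally, the raw sensor values themselves, forming one
    flat feature vector for the trajectory model. include_raw=False
    models the restricted, bandwidth-saving configuration in which
    only high-level abstraction data is transmitted or stored."""
    features = []
    for extract in extractors:
        features.extend(extract(raw_data))
    if include_raw:
        features.extend(raw_data)
    return features
```

During early, "data-starved" operation, the model can rely on the extractor outputs; as raw data accumulates, the same interface supplies both streams.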
  • FIG. 5B illustrates a block diagram of another example of a barrier control system utilizing an end-to-end machine learning model, according to one embodiment.
  • the sensor array 510 provides raw or low-level sensor information 515 to the aperture analysis subsystem 550.
  • the aperture analysis subsystem 550 includes an end-to-end machine learning model that, using one or more cost functions (e.g., a weighted average of cost functions), determines aperture openings for a barrier based on the provided raw or low-level sensor data 515.
  • Barrier control instructions 540 can be sent to a barrier control unit to control the actuation of the barrier.
  • the barrier control system includes a plurality of sensors that generate any of a wide variety of sensor information that detects various possible information about objects proximate a barrier.
  • the aperture analysis subsystem 550 uses the end-to-end trained machine-learning model to determine a minimum aperture value for the barrier to attain at each of a plurality of future time increments based on the sensor information.
  • the end-to-end machine-learning model may be trained to minimize a cost function to avoid collisions, minimize a cost function associated with airflow traversing the barrier, minimize a cost function associated with energy costs to cool one or both sides of the barrier, minimize a cost function of wear and tear on mechanical components, or weighted combinations of such cost functions.
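A weighted combination of the listed cost terms could look like the following sketch; the individual term implementations and the weight values would be chosen per deployment and are placeholders here:

```python
def combined_cost(schedule, cost_fns, weights):
    """Weighted sum of competing cost terms (e.g., collision risk,
    airflow loss, cooling energy, mechanical wear) evaluated on a
    candidate aperture schedule. An end-to-end model would be trained
    to produce schedules that minimize this combination."""
    return sum(w * fn(schedule) for w, fn in zip(weights, cost_fns))
```

In practice, the collision-avoidance term would carry a much larger weight than the efficiency terms, since it encodes a safety constraint.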
  • a motor control interface as described herein, may transmit the barrier control instructions with the minimum aperture values for the future time increments to a motor control unit.
  • FIG. 5C illustrates a block diagram of another example of a barrier control system utilizing an end-to-end machine learning model with some pre-processed data, according to one embodiment.
  • the sensor array 510 provides raw or low-level sensor information 515 to the aperture analysis subsystem 550. Additionally, some of the data is preprocessed 517 to generate a high-level data stream. The high-level data stream is provided to the aperture analysis subsystem 550 together with the raw or low-level sensor information 515.
  • the aperture analysis subsystem 550 may include an end-to-end machine learning model that, using one or more cost functions (e.g., a weighted average of cost functions), determines aperture openings for a barrier based on the provided raw or low-level sensor data 515.
  • Barrier control instructions 540 can be sent to a barrier control unit to control the actuation of the barrier.
  • FIG. 5D illustrates a block diagram of another example of a barrier control system utilizing an end-to-end machine learning model with pre-processed data, according to one embodiment.
  • the sensor array 510 provides raw or low-level sensor information 515 to various data preprocessing subsystems to generate various high-level data streams 517.
  • the high-level data streams 517 are provided to the aperture analysis subsystem 550.
  • the aperture analysis subsystem 550 may include an end-to-end machine learning model that, using one or more cost functions (e.g., a weighted average of cost functions), determines aperture openings for a barrier based on the preprocessed or high-level data streams 517.
  • Barrier control instructions 540 are sent to a barrier control unit to control the actuation of the barrier.
  • the sensor information from the sensor array 510 may comprise actual position data.
  • the sensor array 510 may comprise radar sensors or other sensor types that provide velocity information (e.g., Doppler).
  • the aperture analysis subsystem 550 may explicitly extrapolate position data (and explicitly determine a trajectory), as described in the embodiment in FIG. 5A.
  • the end-to-end machine-learning model of the aperture analysis subsystem 550 of FIGS. 5B-D may determine the minimum aperture values for future time increments without explicit intermediary calculations.
  • the processed sensor data from the sensor array 510 may be used to calculate at least one vectorial component of a derivative of position.
  • system or subsystems thereof may calculate a velocity vector of an object (the first derivative of position), an acceleration vector of an object (the second derivative of position), a jerk vector of an object (the third derivative of position), a snap vector of an object (the fourth derivative of position), a crackle vector of an object (the fifth derivative of position), and/or a pop vector of an object (the sixth derivative of position).
  • Vector components of derivatives of position may be used by the aperture analysis subsystem 550 to determine minimum aperture values for future time increments during which an object is predicted to be within a barrier actuation region of a barrier.
  • Vectorial components are generally defined orthogonal to each other. Accordingly, the system may determine a vector component of a velocity, acceleration, jerk, snap, crackle, and/or pop of an object in a direction orthogonal to a plane of the barrier actuation region, in a direction defined by the shortest distance from the object to the barrier actuation region, or in a direction defined by an arbitrary coordinate system.
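For a single vectorial component sampled at a fixed interval, the successive derivatives can be estimated by repeated finite differencing. This is an illustrative sketch, not the claimed method:

```python
def position_derivatives(positions, dt, order=3):
    """Estimate successive time derivatives of one sampled position
    component: order 1 is velocity, 2 acceleration, 3 jerk, and so on
    through snap, crackle, and pop. Each differencing step shortens
    the series by one sample."""
    series = list(positions)
    derivatives = []
    for _ in range(order):
        series = [(b - a) / dt for a, b in zip(series, series[1:])]
        derivatives.append(series)
    return derivatives

# A component following x(t) = t^2, sampled at dt = 1 s:
vel, acc, jerk = position_derivatives([0.0, 1.0, 4.0, 9.0, 16.0], dt=1.0)
```

In a real system these estimates would be smoothed, since finite differencing amplifies sensor noise at each higher order.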
  • FIG. 6A illustrates a block diagram of a barrier control system 600 that includes integrated sensors 605, 610, 615, and 625, according to one embodiment.
  • the barrier control system 600 may include dual or stereo cameras 605 for capturing images of objects proximate to a barrier.
  • Block diagram elements 610, 615, and 625 represent any of the wide variety of sensors listed herein.
  • the barrier control system comprises disparate components that are not integrated into a single unit.
  • the sensor subsystem of a barrier control system may be mounted near a door and communicate with other portions of the barrier control system that are located nearby or in remote locations (e.g., cloud-based servers).
  • FIG. 6B illustrates an example of a barrier control system 600 with integrated sensors mounted above a roll-up door 650, according to one embodiment. While the illustrated example only shows one side of the roll-up door 650, in many embodiments a second set of sensors, or even a second barrier control system 600, may be positioned on the other side of the roll-up door 650 to provide actuation of the door for pedestrians and vehicles approaching the roll-up door 650 from either side.
  • FIG. 6C illustrates an example of a barrier control system 600 with an integrated sensor mounted to the side of a roll-up door 650, according to one embodiment.
  • one or more sensors of the barrier control system 600 may be physically mounted (e.g., positioned) and configured with a field of view sufficient to capture images of the panels of the roll-up door 650 as it is rolled up and down.
  • the cameras 605 may be configured and positioned to capture images of the leading edge 655 of the roll-up door 650 as the aperture of the roll-up door 650 changes (i.e., as it is rolled up and down). While the illustrated embodiment includes a leading edge 655 of a roll-up door 650 that is straight, it is appreciated that the leading edge of other types of barriers may not be straight and may not move in straight lines.
  • FIG. 6D illustrates the roll-up door 650 partially rolled up with the leading edge 655 of the roll-up door 650 slightly open.
  • a barrier actuation region is shown that includes a volume extending up from the shaded region 699 on the floor to encompass a region within which the barrier moves when the barrier transitions between open and closed states.
  • the barrier actuation region 699 includes the region within which at least a portion of the barrier moves during actuation.
  • the barrier actuation region may be different for an arm that pivots up and down or for pivoting doors.
  • the barrier actuation region for a pivoting door that swings outward and/or inward would be relatively large.
  • the barrier control system utilizes captured images of the leading edge(s) of a barrier to develop mathematical models of the barrier actuation dynamics.
  • the barrier control system may test the barrier actuation dynamics to determine, for example, barrier opening speed, barrier closing speed, barrier acceleration, barrier inertia upwards, barrier inertia downwards, barrier direction changing delays, etc.
  • the barrier control system may utilize captured images of the leading edge(s) of a barrier to maintain an accurate model and calibrated control parameters and/or to provide alerts or notifications if the barrier operation changes (e.g., to detect fault conditions due to damage, collisions, or wear).
  • a separate sensor is used to monitor the leading edge(s) of a barrier.
  • the barrier control system may model some barriers using linear models. However, the barrier control system may model other barriers using nonlinear mathematical models to account for nonlinearities in the aperture actuation. For example, the barrier control system may model rapid roller doors using nonlinear models to account for the nonlinear change in coil radius as the door is rolled up. According to various embodiments, the barrier control system may utilize a separate barrier actuation artificial neural network to model barriers with nonlinear aperture actuations. In some embodiments, barrier actuation dynamics may be known and provided as inputs to the barrier control system as built-in grey box models to speed up calibration processes.
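The roll-up nonlinearity mentioned above can be illustrated with an Archimedean-spiral approximation of the coil: as the door pays out, the effective drum radius shrinks, so equal drum rotations release unequal door lengths. All names and parameter values below are assumptions for this sketch:

```python
import math

def opening_height_m(drum_angle_rad, outer_radius_m, slat_thickness_m):
    """Length of door released after rotating the drum by drum_angle_rad,
    approximating the coil as an Archimedean spiral whose radius shrinks
    by slat_thickness_m per full turn: the integral of (r0 - k*phi) dphi."""
    k = slat_thickness_m / (2.0 * math.pi)  # radius change per radian
    return drum_angle_rad * outer_radius_m - 0.5 * k * drum_angle_rad ** 2

# Equal rotations release progressively less door as the coil thins.
first_turn = opening_height_m(2.0 * math.pi, 0.20, 0.01)
two_turns = opening_height_m(4.0 * math.pi, 0.20, 0.01)
```

A grey-box calibration could fit the radius and thickness parameters from observed leading-edge positions rather than assuming them.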
  • the machine learning model may be pretrained with data from one or more other barriers in one or more different environments.
  • the pretrained machine learning model may be fine-tuned or augmented with training data during actual use after installation on a specific barrier.
  • FIG. 7 illustrates an example field of view 750 of sensors integrated into a barrier control system 700, according to one embodiment.
  • the field of view may be defined relative to a plane 775 in which the barrier aperture is opened and closed.
  • the sensors of the barrier control system 700 may be positioned above a barrier that exists on the plane 775.
  • the field of view 750 may be adapted for a particular application, the size of the room(s) on either side of the barrier, the expected velocities of the objects traversing the barrier, and/or other factors.
  • the field of view 750 may include a lateral dimension between 45 degrees (e.g., for a barrier at.
  • the lateral field of view is approximately 120 degrees and the vertical field of view is approximately 70 degrees.
  • FIG. 8 illustrates a barrier control system 800 positioned above a roll-up door 825 analyzing trajectory information of a forklift 875 and a pedestrian 850. Based on predicted trajectories of the forklift 875 and/or the pedestrian 850, the barrier control system 800 has caused the roll-up door 825 to open to the illustrated height.
  • FIG. 9A illustrates a modeled trajectory 901 with no acceleration, according to one embodiment. Specifically, an object at position 10 along the vertical axis is shown with minimal deviation with respect to time along the horizontal axis.
  • FIG. 9B illustrates modeled trajectory 902 possibilities of an object with relatively small acceleration possibilities, according to one embodiment. As illustrated, the possible positions of the object vary slightly from the position 10 over time.
  • FIG. 9C illustrates modeled trajectory 903 possibilities of an object with relatively large acceleration, according to one embodiment. As illustrated, the possible positions of the object vary widely depending on the specific trajectory taken by the object.
  • FIG. 9D illustrates modeled trajectory 904 possibilities of an object with relatively large acceleration with a known obstacle, according to one embodiment. Again, the possible positions of the object vary widely depending on the specific trajectory taken by the object. However, the obstacle makes some trajectories unlikely or impossible.
  • FIGS. 9A-9D illustrate the additional complexity in predicting future locations of objects, especially as the possible acceleration is assumed to be high.
  • FIG. 10 illustrates an example diagram of predicted trajectories 1000 of an object relative to a barrier 1050, according to one embodiment.
  • the barrier control system predicts a trajectory. While many predicted trajectories 1000 do not correspond to the actual future location of the vehicle, some of the predicted trajectories 1000 indicate the vehicle will traverse the barrier 1050. In such instances, the barrier control system causes the barrier to open to allow the vehicle to traverse without collision.
  • the barrier control system may receive raw data and determine (or receive high-level abstractions that indicate) positions, velocity, accelerations, and/or other object characteristics (as described herein) at each location of the vehicle.
  • the machine learning model of the barrier control system may explicitly output the full future trajectory of the detected object(s) in the form of a time series, in which case a cost function can compare the predicted and recorded future trajectories and penalize deviations.
  • the machine learning model may additionally or alternatively identify future time increments during which an object or objects are predicted to occupy a barrier actuation region.
  • the barrier control system operates a barrier or causes a barrier to operate to allow an object to traverse the barrier regardless of how fast the object approaches or accelerates, and without relying on human attentiveness.
  • the barrier control system may also operate to minimize the time the barrier is open and/or minimize the degree to which the barrier is open.
  • the barrier control system determines the degree to which the barrier must be open (e.g., determine the height to which the barrier must be open) prior to the barrier beginning to open.
  • the barrier control system may generate time-series data that indicates the minimum opening degree (e.g., minimum aperture value, minimum height value, or minimum barrier opening state) required at each time interval in the time series.
  • a possible cost function can be the mean of the squared Euclidian distances between the predicted and reference positions at every given time step, computed for one and the same object and accumulated over all present objects and all recorded time steps. A weighting function of various shapes can be applied to the squared distances that, for example, penalizes deviations more heavily the closer they occur to the door, an example of which can be seen in Equation 1 below:

    $C_{\mathrm{position}} = \sum_{i} \left(w_i^{\mathrm{position}}\right)^{\top} \left(d_i \circ d_i\right)$ (Equation 1)

  • in Equation 1, $d_i$ denotes a time series vector belonging to object i whose elements are the Euclidian distances between the object's predicted and actual positions, and $w_i^{\mathrm{position}}$ is a vector containing weights that may, for instance, depend on factors such as the object type and velocity (indicating the possible severity of a collision).
  • the weights $w_i^{\mathrm{position}}$ may inversely scale with the duration between the time at which the prediction is made and the time at which the object enters the aperture of the barrier (i.e., traverses the barrier).
  • if the duration is longer than the prediction horizon of the controller, or longer than the time the door would require to open to said object's height (e.g., if the controller does not employ a prediction horizon), incorrect actuations pose no risk and therefore should not be penalized.
  • in some embodiments, the proximity of the object to the door at the time the prediction is made is used instead. The lengths of $d_i$ and $w_i^{\mathrm{position}}$ vary depending on the object.
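The weighted position cost described above can be sketched in a few lines of NumPy. The function names, array shapes, and the use of a plain weighted sum are illustrative assumptions for this sketch, not the patent's implementation:

```python
import numpy as np

def position_cost(predicted, actual, weights):
    """Weighted squared-distance position cost for one object (cf. Equation 1).

    predicted, actual: (n, 2) arrays of 2-D positions, one row per time step.
    weights: (n,) array of per-time-step weights, e.g. growing near the door.
    """
    # d: Euclidian distance between predicted and actual position per time step
    d = np.linalg.norm(predicted - actual, axis=1)
    return float(np.sum(weights * d ** 2))

def total_position_cost(objects):
    """Accumulate the per-object costs over all present objects.

    objects: iterable of (predicted, actual, weights) triples, one per object.
    """
    return sum(position_cost(p, a, w) for p, a, w in objects)
```

`position_cost` handles a single object's time series; summing it over all detected objects gives the cost for a recorded scene.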
  • Some objects can alter their height as they drive. Pedestrians and vehicles approaching an aperture at an angle or around a bend of a curved path may vary their effective widths as they move. Some forklifts can also change their effective width while they are traveling a straight path.
  • the barrier control system may predict the width or height of the object that is expected at the time the object traverses the barrier. The barrier control system may cause the barrier to open in accordance with the predicted width, height, or another size characteristic.
  • the barrier control system may include an algorithm or trained neural network to predict a height (or other dimension or size characteristic relevant to the aperture of the barrier) of an object for the time increments during which the object is predicted to traverse the barrier. Size predictions can still be useful when objects do not change as they approach the barrier. In such instances, the algorithm effectively filters out noise from successive height measurements. In some embodiments, the largest measured height during any time the object was visible is used as the basis for determining the minimum height opening state. Accordingly, the barrier control system may utilize a cost function of an approaching object's height as set forth in Equation 2 below:
  • $C_{\mathrm{height}} = \sum_{i} \left(w_i^{\mathrm{height}}\right)^{\top} \left(\left(\hat{h}_i - h_i\right) \circ \left(\hat{h}_i - h_i\right)\right)$ (Equation 2)

    In Equation 2, $\hat{h}_i$ denotes the predicted heights of the object as a time series vector, $h_i$ represents the corresponding measured reference heights, and $w_i^{\mathrm{height}}$ represents the weight vector.
  • the prediction algorithm of the barrier control system may be trained using the combined cost functions of position and height, $C_{\mathrm{position}}$ and $C_{\mathrm{height}}$, such that the training cost function C is defined in Equation 3 below:

    $C = C_{\mathrm{position}} + C_{\mathrm{height}}$ (Equation 3)
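A corresponding sketch of the height cost and the combined training cost, under the same illustrative assumptions (per-object NumPy vectors; an unweighted sum of the two terms is assumed for the combination):

```python
import numpy as np

def height_cost(h_pred, h_ref, w_height):
    """Weighted squared height-error cost for one object (cf. Equation 2).

    h_pred: predicted heights as a time series vector, shape (n,).
    h_ref: corresponding measured reference heights, shape (n,).
    w_height: weight vector, shape (n,).
    """
    err = h_pred - h_ref
    return float(np.sum(w_height * err ** 2))

def training_cost(c_position, c_height):
    """Combined training cost C (cf. Equation 3): position term plus height term."""
    return c_position + c_height
```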
  • a prediction of the future 2-D positions, $S_i$, of all objects by the sensor subsystem can be converted to a prediction of the minimally required aperture opening degree using matrices.
  • $S_i$ may be a 2 × n matrix corresponding to the object i, with n denoting the number of time steps for which the prediction is made; the top and bottom of each column contain the two position coordinates, respectively, for each time step.
  • the minimally required opening degree is equal to the height of the tallest object about to traverse or currently traversing the plane of a door panel.
  • the barrier control system may detect three objects proximate the door and predict future positions $S_1$, $S_2$, and $S_3$ for n time increments.
  • the barrier control system may calculate $h_{\mathrm{predicted}}$ starting from a 1 × n vector filled with zeros.
  • Each entry in the vector that corresponds to a time increment at which at least one of the objects is about to enter or currently within a set minimum distance of the plane of door panel movement may be replaced by the tallest predicted height of all objects fulfilling this condition.
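The construction described in the preceding passages can be sketched as follows. The door-plane representation (`door_y`) and the simple distance test are illustrative assumptions; as noted below, a non-circular, speed-dependent region may be used instead:

```python
import numpy as np

def min_opening_heights(S_list, heights, door_y=0.0, min_dist=0.5):
    """Build the 1 x n minimally-required-opening vector (sketch).

    S_list: list of 2 x n arrays of predicted (x, y) positions, one per object.
    heights: list of predicted heights, one per object.
    door_y: y-coordinate of the plane of door panel movement (assumption).
    min_dist: set minimum distance to that plane.
    """
    n = S_list[0].shape[1]
    h_predicted = np.zeros(n)  # start from a 1 x n vector filled with zeros
    for S, h in zip(S_list, heights):
        # time increments at which this object is within min_dist of the door plane
        near = np.abs(S[1] - door_y) <= min_dist
        # keep the tallest predicted height among all objects fulfilling the condition
        h_predicted[near] = np.maximum(h_predicted[near], h)
    return h_predicted
```

Entries stay at zero for time increments during which no object is near the plane of door panel movement, so the door may remain closed then.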
  • a non-circular shape is used instead of a minimum distance, and any object within the shape is considered within the plane of door movement.
  • the shape can be defined larger for fast-moving objects.
  • the proposed systems and methods allow the barrier control system to target individual opening degrees for each panel independently of each other. Not all panels must open to the same height.
  • FIG. 11A illustrates a graphical representation 1100 of a predicted minimum required opening height and an actual minimum required opening height with respect to time, according to one embodiment.
  • the barrier control system may utilize a cost function that compares the shapes of the predicted minimum required opening degree and the actual minimum required opening degree in terms of the vector space defined with time on one axis (the horizontal axis as illustrated) and the "minimal required opening height" on the other axis (the vertical axis as illustrated).
  • FIG. 11B illustrates a graphical representation 1150 of measurements between the predicted minimum opening heights and the actual minimum opening heights used to train a machine learning process, according to one embodiment. Specifically, for each point of the predicted minimum required opening height curve, the system may calculate the smallest Euclidian distance to any point of the actual minimum required opening height curve. The cost function used for training the machine learning model may be the sum of these distances for all points. In some embodiments, the axes are scaled to ensure the cost function does not cause the training to produce an always-closed prediction.
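This curve-to-curve cost (each predicted point scored against its nearest actual point, with axis scaling) can be sketched as below; the function name and scaling parameters are assumptions for illustration:

```python
import numpy as np

def curve_cost(pred_curve, actual_curve, t_scale=1.0, h_scale=1.0):
    """Sum of smallest Euclidian distances from each predicted point to the
    actual curve, in (time, height) space (sketch of the FIG. 11B cost).

    pred_curve, actual_curve: (n, 2) arrays of (time, min opening height) points.
    t_scale, h_scale: axis scaling factors, used to keep the cost from
    favoring an always-closed prediction (assumption).
    """
    scale = np.array([t_scale, h_scale])
    p = pred_curve * scale
    a = actual_curve * scale
    # pairwise distances: for each predicted point, distance to every actual point
    diff = p[:, None, :] - a[None, :, :]
    dists = np.linalg.norm(diff, axis=2)
    # nearest actual point per predicted point, summed over all predicted points
    return float(np.sum(dists.min(axis=1)))
```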
  • FIG. 12A illustrates a graph 1200 of the opening aperture of a door (e.g., door height) relative to time for two different control systems, according to one embodiment.
  • the "predictive door" graph shown as a solid line represents the aperture of the door with respect to time, based on control signals from a barrier control system according to the various embodiments described herein.
  • the short-dashed line represents the actual height of objects passing through the aperture of the door.
  • the long-dashed line shows the aperture actuations under the control of a traditional automatic door controller.
  • FIG. 12B illustrates a graph 1250 of the difference in the aperture opening of the doors controlled by the two different control systems of FIG. 12A, according to one embodiment.
  • the solid line represents the aperture of the door with respect to time, based on control signals from a barrier control system according to the various embodiments described herein.
  • the long-dashed line shows the aperture actuations under the control of a traditional automatic door controller.
  • the short-dashed line represents the difference in aperture opening provided by the two different control systems.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Feedback Control In General (AREA)

Abstract

Disclosed herein are barrier control systems and methods that employ a sensor subsystem comprising at least one sensor to capture data of a region proximate a barrier. The sensor subsystem may detect objects and/or the open or closed state of a barrier. The system may determine minimum opening values, such as barrier widths and/or heights, for each future time increment of a plurality of future time increments. For example, the system may calculate the size of an object and/or predict the trajectory of the object using a trained occupancy-prediction machine learning model. The system may use the size and/or a predicted trajectory of the objects to determine opening values for future time increments so as to avoid collisions and/or minimize airflow. A motor control interface may transmit the opening values to a motor control unit that directly controls the opening and closing of the barrier.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2021/025021 WO2022211793A1 (fr) 2021-03-31 2021-03-31 Systèmes de commande pour barrières automatiques

Publications (1)

Publication Number Publication Date
WO2022211793A1 true WO2022211793A1 (fr) 2022-10-06

Family

ID=83459812

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/025021 WO2022211793A1 (fr) 2021-03-31 2021-03-31 Systèmes de commande pour barrières automatiques

Country Status (1)

Country Link
WO (1) WO2022211793A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220308166A1 (en) * 2021-03-18 2022-09-29 Wisense Technologies Ltd. System and method for electromagnetic signal estimation
US20220324297A1 (en) * 2021-04-08 2022-10-13 Carrier Corporation Refrigeration systems weight analysis

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9879466B1 (en) * 2017-04-18 2018-01-30 Chengfu Yu Garage door controller and monitoring system and method
DE102016119339A1 (de) * 2016-10-11 2018-04-12 Bircher Reglomat Ag Situationsabhängige Verschließkörpersteuerung
WO2018191260A1 (fr) * 2017-04-12 2018-10-18 Sears Brands, Llc Système d'ouverture de porte de garage à fermeture automatique
US20180321758A1 (en) * 2017-05-08 2018-11-08 GM Global Technology Operations LLC Foreign object detection systems and control logic for vehicle compartment closure assemblies
US20190271185A1 (en) * 2018-03-05 2019-09-05 The Chamberlain Group, Inc. Movable Barrier Operator and Method
CN110454027A (zh) * 2018-05-08 2019-11-15 比业电子(北京)有限公司 一种用于自动门控制的虚拟按键和多区域保护方法及装置
WO2019238718A1 (fr) * 2018-06-15 2019-12-19 Assa Abloy Entrance Systems Ab Configuration de systèmes d'entrée comportant un ou plusieurs éléments de porte mobiles
WO2020229175A1 (fr) * 2019-05-13 2020-11-19 Assa Abloy Entrance Systems Ab Opérateur de porte oscillante pour déplacer un battant de porte oscillante dans un trajet d'oscillation entre une position ouverte et une position fermée, porte oscillante et pièce dotée d'une porte oscillante

Similar Documents

Publication Publication Date Title
US8020672B2 (en) Video aided system for elevator control
WO2022211793A1 (fr) Systèmes de commande pour barrières automatiques
CN108052097B (zh) 用于训练异构感测系统的方法和异构感测系统
CN106428000B (zh) 一种车辆速度控制装置和方法
CN106144862B (zh) 用于乘客运输门控制的基于深度传感器的乘客感测
CN101180444B (zh) 用于控制驱动运动件的设备
KR101522970B1 (ko) 자율 이동 장치 및 그 제어 방법
WO2013090910A2 (fr) Détection d'anomalie en temps réel de comportement de foule à l'aide d'informations de multicapteur
CN112850406A (zh) 用于乘客运输的通行列表产生
JPH108825A (ja) ドア装置の作動方法及びこの方法により作動するドア装置
EP3608280A1 (fr) Système d'ascenseur à réponse de porte optimisée
US20200065980A1 (en) Eccentricity maps
US11521494B2 (en) Vehicle eccentricity mapping
US20230085922A1 (en) Apparatus and method for door control
JP6823671B2 (ja) 駐車場内の通行を制御するためのコンセプト
CN109353337A (zh) 一种智能车换道阶段碰撞概率安全预测方法
JP2007140606A (ja) 駐車場監視システム
US11443517B1 (en) Video-based parking garage monitoring
CN116038684A (zh) 一种基于视觉的机器人碰撞预警方法
CN117693459A (zh) 用于自主交通工具操作的轨迹一致性测量
WO2012052750A1 (fr) Système sensible au mouvement et procédé de production d'un signal de commande
GB2479495A (en) Video aided system for elevator control.
Manikandan et al. Collision avoidance approaches for autonomous mobile robots to tackle the problem of pedestrians roaming on campus road
KR102486834B1 (ko) 자율주행 서비스 로봇 운행관리시스템
CN118019894A (zh) 用于操作自动门系统的方法以及具有自动门系统的系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21935380

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21935380

Country of ref document: EP

Kind code of ref document: A1