WO2023046599A1 - Method for operating an automatic door system as well as system having an automatic door system - Google Patents

Method for operating an automatic door system as well as system having an automatic door system

Info

Publication number
WO2023046599A1
Authority
WO
WIPO (PCT)
Prior art keywords
door
analysis unit
recording
camera
property
Prior art date
Application number
PCT/EP2022/075832
Other languages
French (fr)
Inventor
Marco HAURI
Original Assignee
Agtatec Ag
Priority date
Filing date
Publication date
Application filed by Agtatec Ag filed Critical Agtatec Ag
Publication of WO2023046599A1 publication Critical patent/WO2023046599A1/en


Classifications

    • E FIXED CONSTRUCTIONS
    • E05 LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05F DEVICES FOR MOVING WINGS INTO OPEN OR CLOSED POSITION; CHECKS FOR WINGS; WING FITTINGS NOT OTHERWISE PROVIDED FOR, CONCERNED WITH THE FUNCTIONING OF THE WING
    • E05F15/00 Power-operated mechanisms for wings
    • E05F15/70 Power-operated mechanisms for wings with automatic actuation
    • E05F15/73 Power-operated mechanisms for wings with automatic actuation responsive to movement or presence of persons or objects
    • E FIXED CONSTRUCTIONS
    • E05 LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05F DEVICES FOR MOVING WINGS INTO OPEN OR CLOSED POSITION; CHECKS FOR WINGS; WING FITTINGS NOT OTHERWISE PROVIDED FOR, CONCERNED WITH THE FUNCTIONING OF THE WING
    • E05F15/00 Power-operated mechanisms for wings
    • E05F15/70 Power-operated mechanisms for wings with automatic actuation
    • E05F15/73 Power-operated mechanisms for wings with automatic actuation responsive to movement or presence of persons or objects
    • E05F2015/767 Power-operated mechanisms for wings with automatic actuation responsive to movement or presence of persons or objects using cameras
    • E FIXED CONSTRUCTIONS
    • E05 LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05Y INDEXING SCHEME RELATING TO HINGES OR OTHER SUSPENSION DEVICES FOR DOORS, WINDOWS OR WINGS AND DEVICES FOR MOVING WINGS INTO OPEN OR CLOSED POSITION, CHECKS FOR WINGS AND WING FITTINGS NOT OTHERWISE PROVIDED FOR, CONCERNED WITH THE FUNCTIONING OF THE WING
    • E05Y2400/00 Electronic control; Power supply; Power or signal transmission; User interfaces
    • E05Y2400/10 Electronic control
    • E05Y2400/30 Electronic control of motors
    • E05Y2400/40 Control units therefor
    • E FIXED CONSTRUCTIONS
    • E05 LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05Y INDEXING SCHEME RELATING TO HINGES OR OTHER SUSPENSION DEVICES FOR DOORS, WINDOWS OR WINGS AND DEVICES FOR MOVING WINGS INTO OPEN OR CLOSED POSITION, CHECKS FOR WINGS AND WING FITTINGS NOT OTHERWISE PROVIDED FOR, CONCERNED WITH THE FUNCTIONING OF THE WING
    • E05Y2900/00 Application of doors, windows, wings or fittings thereof
    • E05Y2900/10 Application of doors, windows, wings or fittings thereof for buildings or parts thereof
    • E05Y2900/13 Application of doors, windows, wings or fittings thereof for buildings or parts thereof characterised by the type of wing
    • E05Y2900/132 Doors

Definitions

  • Fig. 1 shows a system according to the invention schematically
  • Fig. 2 shows the different evaluation blocks involved in the method according to the invention
  • Fig. 3 shows a flowchart of the method according to the invention
  • Fig. 4 shows a first situation during the operation of the system according to Figure 1 carrying out the method according to Figure 3
  • Fig. 5 shows a second situation during the operation of the door system according to Figure 1 carrying out the method according to Figure 3.
  • Figure 1 shows schematically a system 10 according to the invention having an analysis unit 12 and an automatic door system 14.
  • the automatic door system 14 has a door 16, being a sliding door in the shown embodiment with two door leaves 18, a control unit 20, two drive units 22, and a camera 24.
  • the automatic door system 14 may further comprise one or more additional sensors 26 and a signaling device 28.
  • the door 16 may as well be a swing door, a revolving door, a folding door or the like. The method of operation remains the same.
  • the camera 24 and the drive units 22 are connected to the control unit 20, wherein the control unit 20 is configured to control the drive units 22.
  • Each of the drive units 22 is associated with one of the door leaves 18 and is designed to move the respective door leaf 18 along a track.
  • the door leaves 18 may be moved independently of one another.
  • the door leaves 18 are moveable such that between them a passage can be opened, wherein the width of the passage is adjustable by the control unit 20.
  • the camera 24 has a controller 30 and is located above the door 16, i.e. above the track of the door leaves 18.
  • the camera 24 is an integral part of the safety functionality of the door system 14. Namely, the camera 24 monitors the track of the door 16, i.e. the movement path of the door leaves 18, and forwards this information to the control unit 20 or its integrated controller 30. Based on this information, the integrated controller 30 and/or the control unit 20 control the drive units 22 to ensure that the door 16 is operated safely. In particular, they ensure that persons, for example vulnerable persons such as children or elderly people, present in the track of the door 16 are not touched or even harmed by the closing door leaves 18.
  • the camera 24 may be a single camera, a stereo camera, a time of flight 3D camera, an event camera or a plurality of cameras.
  • the field of view F of the camera 24 includes the track of the door 16, in particular the track of the door leaves 18.
  • the field of view F of the camera 24 may cover an area of up to 5 m, preferably up to 7 m, more preferably still up to 10 m in front of the door 16, measured on the ground.
  • the field of view F includes, for example, the sidewalk of the street in front of the building the system 10 is installed in.
  • the analysis unit 12 may also be a part of the automatic door system 14.
  • the analysis unit 12 may be integrated into the control unit 20 or integrated into the controller 30 of the camera 24.
  • In the shown embodiment, the analysis unit 12 is separate from the door system 14.
  • the analysis unit 12 may be provided as a server (shown in dashed lines in Figure 1), for example a cloud server or a server on premise.
  • the analysis unit 12 is connected to the control unit 20 and/or the drive units 22.
  • the additional sensor 26 may be a distance sensor, like an infrared light source as known in the art, or a source of electromagnetic radiation, for example a radar.
  • the field of view of the additional sensor 26 overlaps with the field of view F of the camera 24.
  • the signaling device 28 may include an optical signaling device, like a light, and an acoustical signaling device like a speaker.
  • the signaling device 28 may be mounted above the door 16.
  • the signaling device 28 may be a signaling device of the building in which the door system 14 is installed.
  • Figure 2 shows an illustration of evaluation blocks to determine whether and how the door 16 shall be opened.
  • Figure 2 is for illustration purposes only and the blocks shown therein are not necessarily separate hardware and/or software modules.
  • In block B1, the recordings captured by the camera 24 are evaluated and classified. Objects in the recordings are recognized, as will be explained in more detail below.
  • In block B2, the information generated in block B1 is evaluated based on the rules R according to an evaluation logic.
  • the rules R may be predefined and stored in a memory of the analysis unit 12.
  • In block B3, the information generated in blocks B1 and/or B2 is used to select an actuation profile P.
  • the actuation profile P defines the movement that the door 16, in particular the door leaves 18, shall perform.
  • the actuation profile P includes the acceleration and the velocity of the door leaves 18 at various positions during the motion.
  • the actuation profile P may define the traveling distance for each door leaf 18 and the duration between the start of the motion and reaching various predetermined positions during the motion. Further, the time of the start of the motion can be defined, for example as "immediately".
  • the actuation profile P defines a minimal distance between an object recognized in the recording and the door 16, in particular the door leaf 18 and/or the track of the door leaves 18, at which the motion of the door leaves 18 shall be initiated.
  • the actuation profile P takes into account the physical capabilities of the specific door 16 or door leaf 18, e.g. a maximum possible acceleration.
  • the actuation profiles P may be predefined and stored in the memory of the analysis unit 12. The actuation profile P is then selected based on the information generated in blocks B1 and/or B2. It is also possible that the selected actuation profile P is adapted based on this information.
  • It is also conceivable that no actuation profiles P are predetermined and the actuation profile P is created based on the information of blocks B1 and B2, as sketched below.
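  • purely as an illustration, such an actuation profile P and its selection in block B3 could be represented as follows. This is a minimal sketch in Python; all names and numeric values are assumptions rather than values taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class ActuationProfile:
    """One actuation profile P (illustrative field names)."""
    max_velocity: float      # door-leaf velocity at full speed, m/s
    acceleration: float      # acceleration used to reach max_velocity, m/s^2
    opening_width: float     # desired traveling distance of the leaves, m
    start_delay: float       # seconds between selection and start of the motion
    trigger_distance: float  # minimal object-door distance at which motion starts, m

# A small predefined catalogue stored in the memory of the analysis unit.
PROFILES = {
    "default":        ActuationProfile(0.5, 1.0, 0.9, 0.0, 2.0),
    "fast_runner":    ActuationProfile(1.2, 3.0, 1.2, 0.0, 6.0),
    "person_luggage": ActuationProfile(0.5, 1.0, 1.8, 0.0, 3.0),
}

def select_profile(obj_type: str, speed: float, has_luggage: bool) -> ActuationProfile:
    """Select a profile based on the information from blocks B1/B2.

    A real implementation would additionally clamp the values to the
    physical capabilities of the specific door (e.g. maximum acceleration).
    """
    if obj_type == "person" and speed > 2.0:
        return PROFILES["fast_runner"]
    if obj_type == "person" and has_luggage:
        return PROFILES["person_luggage"]
    return PROFILES["default"]
```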
  • In block B4, the door 16, more precisely the door leaves 18, are actuated by the drive units 22 based on the selected actuation profile P of block B3 and optionally the information generated in blocks B1 and/or B2.
  • the drive units 22 receive steering input from the control unit 20 and/or the analysis unit 12 based on the actuation profile P.
  • Block B1 and optionally also block B2 and/or block B3 are carried out by the analysis unit 12.
  • the analysis unit 12 comprises a deterministic algorithm, a machine learning algorithm, a support vector machine and/or a trained artificial neural network.
  • Figure 3 shows a more detailed flowchart of the method according to the invention than the overview of Figure 2.
  • In a first step, the camera 24 captures recordings.
  • the recordings are, for example, recordings of a single image, recordings of a series of consecutive images and/or a video recording.
  • the recordings are captured at regular intervals, in particular continuously, and the following steps are carried out for multiple recordings simultaneously (see the capture-loop sketch below).
  • the steps are carried out for each recording, improving the reaction time of the system 10.
  • the recordings include the field of view F, and thus the track of the door 16 as well as the area in front of the door 16 and objects or persons present therein.
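  • as a minimal illustration of this capture-and-transmit loop, the following sketch assumes OpenCV for frame grabbing and a hypothetical analysis_unit object with a process method; neither is prescribed by the patent:

```python
import cv2  # OpenCV, assumed to be available on the camera controller

def capture_loop(analysis_unit, camera_index: int = 0) -> None:
    """Capture recordings at regular intervals and hand each one to the
    analysis unit; `analysis_unit.process` is a hypothetical interface."""
    cap = cv2.VideoCapture(camera_index)
    try:
        while cap.isOpened():
            ok, frame = cap.read()  # one recording (a single image)
            if not ok:
                break
            analysis_unit.process(frame)  # transmit the recording (step S3)
    finally:
        cap.release()
```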
  • the recordings are used on the one hand for ensuring a safe operation of the door 16.
  • the control unit 20 or the integrated controller 30, which receives the recordings, ensures that - based on at least the part of the recording showing the track of the door - the door 16 is operated safely.
  • In particular, it is avoided that persons, for example vulnerable persons such as children or elderly people, that are present in the track of the door 16 are touched or even harmed by the door leaves 18.
  • the control unit 20 and/or the integrated controller 30 controls the drive unit 22 to actuate the door leaves 18 accordingly (step S2). This way, the door 16 can be operated safely.
  • the camera 24 is an integral part of the safety functionality of the door 16.
  • In step S3, the recordings are transmitted to the analysis unit 12.
  • In step S4, the analysis unit 12 performs image recognition. First, in step S4.1, the analysis unit 12 recognizes the objects and the types of the objects in the recording.
  • the recognized objects may be persons, animals, movable objects or stationary objects.
  • Movable objects are, for example, inanimate objects that may be carried by a person, like a purse, a bag or a backpack. They may also be rolling and pushed or pulled by a person, like a stroller, a bicycle or a trolley. Thus, these objects can also be associated with a person. Further, movable objects may also be self-propelled, like a car.
  • Stationary objects may be plants, permanent signs or the like.
  • the analysis unit 12 may also recognize kinematic properties of the objects in the recordings (step S4.2).
  • the analysis unit 12 may determine, for each object in the recording, the position of the object with respect to the door 16, the change of position, the velocity, the change of velocity, the acceleration, the change of acceleration, the distance to the door and/or the direction of movement of the object.
  • the analysis unit 12 determines the kinematic property of a kinematic subpart of an object, which is a part of the object that may move in addition to the general movement of the object.
  • Examples are the skeleton function of a human or an animal, a body part of a human or animal, like an arm, a leg or the head, or parts of an object extending from the main part of the object, like the handlebar of a bicycle.
  • kinematic properties of a kinematic subpart are also meant when referring to the kinematic properties of the object (see the sketch below).
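  • a minimal sketch of how such kinematic properties could be estimated from tracked ground-plane positions, using finite differences between consecutive recordings. This is illustrative only, assumes at least four tracked positions, and all names are assumptions:

```python
import numpy as np

def kinematic_properties(positions: np.ndarray, dt: float):
    """positions: array of shape [n, 2] with x/y ground-plane coordinates in
    metres for the last n recordings, captured every dt seconds.
    Returns the latest 2D velocity, acceleration and change of acceleration,
    i.e. one value per spatial direction parallel to the ground."""
    velocity = np.diff(positions, axis=0) / dt       # m/s per axis
    acceleration = np.diff(velocity, axis=0) / dt    # m/s^2 per axis
    accel_change = np.diff(acceleration, axis=0) / dt
    return velocity[-1], acceleration[-1], accel_change[-1]

def estimated_time_of_arrival(position, velocity, door_position) -> float:
    """Rough ETA at the door: remaining distance divided by the speed
    component towards the door; infinite if the object moves away."""
    to_door = np.asarray(door_position) - np.asarray(position)
    distance = float(np.linalg.norm(to_door))
    speed_towards = float(np.dot(velocity, to_door) / distance)
    return distance / speed_towards if speed_towards > 0 else float("inf")
```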
  • the type of the object is also recognized, for example the objects are classified as "person", "dog", "stroller", "trolley", "bicycle", "plant", "e-scooter" and/or "car" (step S4.3).
  • the analysis unit 12 recognizes various individual properties of each of the objects in the recording (step S4.4).
  • the recognized individual properties may differ depending on the type of the object.
  • the individual properties may include the age, the size, the ethnicity, the clothes worn by the person, the items and/or objects carried by the person, the type of locomotion (walking aid, inline skates, skateboard, etc.), the orientation with respect to the door, the pose of the person, the viewing direction of the person, the level of attention of the person, the state of health of the person, the mood of the person, the current activity of the person, a gesture, if any, performed by the person, the behavior of the person, the gait of the person and/or the posture of the person.
  • if the object is an animal, the analysis unit 12 determines the species of the animal, the breed of the animal, whether the animal is a pet, whether the animal carries prey, whether the animal is on a leash and/or whether the animal is wearing a muzzle.
  • if the object is a movable object, the analysis unit 12 may determine whether the object is associated with and/or carried by a person.
  • the analysis unit 12 may determine for each object in the recording image processing properties in step S4.5.
  • Image processing properties are properties of the object that are used for the purposes of image processing and simplification.
  • image processing properties may be the bounding box B of each object, the closure of objects, in particular of fused objects, the timeline of properties of an object and/or the fusion of an object with other objects (see the sketch after this passage).
  • the analysis unit 12 recognizes each object in the recording, their types as well as their individual, kinematic and/or image processing properties.
  • step S4 corresponds to block B1 of Figure 2 and the objects, their types as well as their individual, kinematic and/or image processing properties correspond to the information generated in block B1.
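  • for illustration, two of these image processing properties, the bounding box and the fusion of objects, could be handled as follows; this is a sketch and the overlap threshold is an assumption:

```python
from typing import Tuple

Box = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max) in pixels

def iou(a: Box, b: Box) -> float:
    """Intersection over union of two axis-aligned bounding boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

def fused(a: Box, b: Box, threshold: float = 0.3) -> bool:
    """Treat two detections as fused (e.g. a person and the trolley they
    pull) when their bounding boxes overlap strongly."""
    return iou(a, b) >= threshold
```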
  • In step S5, the analysis unit 12 and/or the control unit 20 determine whether the door 16 shall be opened based on a rule set.
  • the rule set comprises the plurality of rules R and also an evaluation logic according to which the rules R are to be applied, in particular in case of a conflict.
  • the rule set is stored in the analysis unit 12 or the control unit 20, respectively.
  • the rule set may be predefined and/or adapted during the operation based on machine learning.
  • the rules R define a plurality of conditions when the door 16 shall be opened or shall stay closed or shall be closed.
  • Each rule R includes conditions concerning the presence or the absence of a specific object, a specific type of object, a specific individual property, a specific kinematic property, a specific image processing property or a combination thereof. Further, each rule R comprises the consequence of the conditions, i.e. whether the door shall be opened or not.
  • A simple example of a rule may be that only the presence of a person, i.e. not the presence of objects or animals, leads to opening of the door.
  • Another example is that only persons without luggage shall open the door, meaning that the presence of a person not associated with objects of the type trolleys, luggage or the like leads to opening of the door.
  • a further example may be that only animals on a leash shall open the door, meaning that the presence of a person and the presence of an animal with the additional individual property that it is held on a leash lead to the consequence that the door shall be opened.
  • It is also possible that one rule R takes into account more than one object, its type and/or properties to differentiate between different situations.
  • a condition or rule may be that the door shall open for persons without shopping carts, i.e. the rule R demands the presence of an object recognized as a person and the absence of objects recognized as shopping carts in the recording before the door shall be opened.
  • the evaluation logic comprises instructions that define whether and/or how the door shall be opened if more than one condition of the rule set, i.e. more than one rule R, is met; a minimal sketch of such a rule set follows below.
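  • the following sketch shows one possible encoding of such rules R and a conflict-resolving evaluation logic; the rule structure, the priority mechanism and all names are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class Rule:
    """One rule R: conditions on the recognized objects plus a consequence."""
    required: set                                 # object types that must be present
    forbidden: set = field(default_factory=set)   # object types that must be absent
    open_door: bool = True                        # consequence if the rule matches
    priority: int = 0                             # used to resolve conflicting rules

RULES = [
    # "the door shall open for persons without shopping carts"
    Rule(required={"person"}, forbidden={"shopping cart"}, priority=1),
    # "animals shall not open the door"
    Rule(required={"animal"}, open_door=False, priority=0),
]

def evaluate(rules, recognized_types: set) -> bool:
    """Apply all matching rules; on conflict, this evaluation logic simply
    lets the highest-priority rule win."""
    matches = [r for r in rules
               if r.required <= recognized_types
               and not (r.forbidden & recognized_types)]
    if not matches:
        return False  # no matching rule: keep the door closed
    return max(matches, key=lambda r: r.priority).open_door
```

  • applied, for instance, to a recording containing a person and a sheep, evaluate(RULES, {"person", "animal"}) returns True because the higher-priority person rule wins, mirroring the conflict resolution described for Figure 5 below.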
  • control of the door 16 may be based on more than one object recognized in the recording.
  • If it has been determined in step S5, which corresponds to block B2, that the door shall be opened, it has to be decided how the door shall be opened. This is done in step S6, also by the analysis unit 12 or the control unit 20, respectively.
  • an actuation profile P is selected, created or adapted.
  • the kinematic properties of the objects relevant for this specific rule R are taken into consideration for the selection, creation or adaption of the actuation profile P.
  • if, for example, a person approaches together with a trolley, the door 16 has to be opened wider than for the person alone.
  • the desired movement of the door leaves 18 depends on the objects and the overall situation.
  • Step S6 thus corresponds to block B3 of Figure 2.
  • It is also conceivable that in steps S4, S5 and/or S6 measurement values from the additional sensors 26 are considered.
  • In step S7, the distance to one or more of the objects is determined and transmitted to the analysis unit 12 or the control unit 20, respectively.
  • the analysis unit 12 or the control unit 20, respectively, takes the measurement values into consideration for the recognition of objects, types and properties (step S4), for determining whether the door shall be opened (step S5) and/or for the determination of the actuation profile (step S6).
  • In step S8, it is conceivable that the analysis unit 12 or the control unit 20, respectively, takes additional data into consideration for the recognition of objects, types and properties (step S4), for determining whether the door shall be opened (step S5) and/or for the determination of the actuation profile (step S6).
  • the additional data may be generated by the camera 24, for example as an additional distance measurement, or may be provided by the control unit 20, such as the state of the door, the current time, the current weather conditions and/or the current number of persons within the building.
  • The results of steps S4, S5 and/or S6 are transmitted to the control unit 20 or the integrated controller 30 to be taken into consideration for ensuring the safe operation of the door 16.
  • steps S2 to S8 do not have to be carried out in the order explained above but may be carried out in any other order. It is also possible that one or more of the steps S3 to S8 is omitted.
  • In step S9, the drive unit 22 is controlled by the analysis unit 12 or the control unit 20, respectively, based on the selected, created or adapted actuation profile P.
  • In step S10, the drive units 22 then actuate the respective door leaves 18 associated with them.
  • the desired movement of the door 16 as defined in the actuation profile P is actually carried out by the door 16.
  • Steps S9 and S10 correspond to block B4 of Figure 2.
  • In step S11, the signaling device 28 is actuated by the analysis unit 12 or the control unit 20, respectively, based on the recognized object, the type of the object, at least one property of the object and/or its individual, kinematic and/or image processing properties.
  • the actuation profile P or the rule R may already indicate, whether the signaling device 28 shall be activated and how.
  • the signaling device 28 may be activated if an object is denied entrance, e.g. if a person has attempted to enter the door 16 but no rule R for opening the door matches this particular person and properties.
  • the signaling device 28 could also be used to indicate the position of the door to a person that is allowed to pass the door 16.
  • the artificial neural network is trained using training data.
  • the training data comprises sets of different input data for various situations, for example situations as explained above, and also information about the desired door movement in each of the situations.
  • the input data is of the same type and data structure as the data supplied to the artificial neural network during the operation of the artificial neural network as explained above.
  • the input data includes recordings generated by the camera 24 and optionally the at least one measurement value extracted from the data of the camera 24 and/or at least one measurement value of at least one additional sensor 26.
  • Further, the training data includes the expected correct output of the artificial neural network in response to each data set of the input data, in particular information about the actual objects in the recording including their types as well as their individual, kinematic and/or image processing properties.
  • In a first training step T1 (Fig. 3), the input data is fed forward through the artificial neural network. Then, the answer output of the artificial neural network is determined (step T2).
  • the answer output may include the actual objects in the recording, their types, their properties and/or their individual, kinematic and/or image processing properties that has/have been determined by the artificial neural network based on the input data for one of the various training situations.
  • In step T3, an error between the answer output of the artificial neural network and the expected correct output (known from the training data) is determined.
  • In step T4, the weights of the artificial neural network are changed by back-propagating the error through the artificial neural network.
  • the training may be used for all of the blocks B1, B2 and B3 and/or as the basis for steps S2 to S8; a compact sketch of this training loop follows below.
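  • a compact sketch of training steps T1 to T4, written here with PyTorch purely for illustration; the patent does not prescribe a framework, and the loss function, optimizer and names are assumptions:

```python
import torch
from torch import nn

def train(model: nn.Module, loader, epochs: int = 10) -> None:
    """Training loop following steps T1-T4: feed forward, determine the
    answer output, compute the error against the expected correct output,
    and back-propagate the error to change the weights."""
    criterion = nn.CrossEntropyLoss()  # error between answer and expected output
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
    for _ in range(epochs):
        for recordings, expected in loader:      # input data + expected output
            answer = model(recordings)           # T1/T2: feed forward, answer output
            error = criterion(answer, expected)  # T3: determine the error
            optimizer.zero_grad()
            error.backward()                     # T4: back-propagate the error ...
            optimizer.step()                     # ... and change the weights
```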
  • Figures 4 and 5 show two situations of the system 10 in use.
  • the first person on the left-hand side is a woman slowly walking towards the door 16.
  • the analysis unit 12 determines that this object is a person of age 24 and gender female, that she does not have luggage or carry objects, that her mood is "happy" and that she is oriented towards the door at an angle of 20° (with respect to a line perpendicular to the track of the door 16). The woman has eye contact with the door, indicating that she will likely pass the door. Further, the kinematic properties are determined, like the slow speed of 0.5 m/s as well as the acceleration (no acceleration in this case).
  • the analysis unit 12 determines the bounding box B - shown in a dashed line in Figure 4 - as an image processing property.
  • the second object in the middle, as recognized by the analysis unit 12, is a male person of age 42 that is merely passing by the door 16.
  • the analysis unit 12 determines the type and further properties of the person, as indicated in Figure 4.
  • the person carries no luggage but a mobile phone. His mood is focused and his orientation is transverse to the door, having an angle of 89° to the door. The person has no eye contact and moves at a slow speed of 0.5 m/s with no acceleration.
  • the third person on the right hand side runs towards the door. As indicated in Figure 4, it is recognized by the analysis unit 12 as a person of age 31 with male gender. The person does not carry luggage or objects and appears to be in a stressed mood. The person has eye contact with the door and its movement is directed to the door with an angle of 42°. The person moves fast with a speed of 2.5 m/s and also accelerates.
  • the analysis unit 12 evaluates the rules R.
  • Two rules R, one for the person on the left-hand side walking to the door and one for the person on the right-hand side running to the door, indicate that the door shall be opened.
  • an actuation profile P suitable for the kinematic properties of the person on the right-hand side is chosen.
  • This actuation profile P describes a door movement that is very quick and happens immediately so that the person on the right-hand side does not have to stop or slow down.
  • the drive units 22 are then controlled according to the selected actuation profile P so that the door leaves 18 move apart in a quick fashion to allow the person on the right-hand side to run through the door without the need to change his movement, even though he is accelerating.
  • the door 16 will then stay open until the second person, who will of course remain in the field of view F of the camera 24 for longer, has passed the door.
  • Figure 5 shows a second situation in front of the door 16.
  • the analysis unit 12 recognizes a female person of age 24 in neutral mood who is walking at a slow speed of 0.3 m/s and slightly decelerating at the door 16. The person has eye contact with the door.
  • the analysis unit 12 recognizes two movable objects as trolleys that are pulled by the person. Thus, the two objects are associated with the person.
  • a second person (fourth object) in the middle is walking parallel to the door.
  • the analysis unit 12 recognizes that this object is a person of female gender and age 18. The person carries a handbag and appears happy.
  • the analysis unit 12 also determines that the orientation of the person is transverse to the door and that she has no eye contact with the door. Also, the kinematic properties of 0.7 m/s walking speed and no acceleration are recognized.
  • the fifth object is an animal right in front of the door.
  • the animal is a sheep and thus not a pet. It is further not on a leash and not related to any person. It is oriented towards the door, walks at a slow speed of 0.2 m/s and decelerates slightly.
  • the sheep in front of the door fulfills the conditions of a rule R that does not allow animals to enter the building. Thus, this rule would result in the door not being opened. However, the person on the left-hand side walking towards the door with two trolleys fulfills the condition of another rule R that indicates that the door 16 shall be opened.
  • in this conflict, the instructions of the evaluation logic define that persons are allowed to enter the building even though animals are present in front of the door.
  • the rule R associated with the person on the left-hand side is regarded as more important. Therefore, an actuation profile P is selected according to this rule R for the person on the left-hand side and its properties.
  • an actuation profile P for persons with luggage is selected that leads to a wide opening width of the door 16. Further, it is possible that the analysis unit 12 adapts this actuation profile P based on the estimated width of the person and the two trolleys to further increase the resulting opening width of the door 16 so that the person with the two trolleys can safely pass the door without the risk of a collision with one of the door leaves 18.
  • the method and the system can provide the possibility of controlling the access to a building or other confined space in a very detailed and precise manner without the need for further personnel or other structures.
  • the door 16 may be opened in a fashion that is always adapted to the needs of the person that would like to pass the door, wherein, at the same time, the duration the door is opened can be reduced to avoid that heated air leaves the building.
  • the analysis unit 12 and/or the control unit 20 determines the number of objects, in particular persons, that pass the door 16 and their direction of movement, meaning that the analysis unit 12 or the control unit 20 can determine whether a person leaves or enters the building.
  • the analysis unit 12 or the control unit 20, respectively, can determine the number of objects, in particular persons, inside the building, and the length of stay of each object within the building, as sketched below.
  • This information can also be used to control the drive unit 22, e.g. to close the door after the last person has left the building, or it can be transmitted via an interface to, for example, a facility management system.
  • the door system 14 may also comprise a plurality of doors 16 wherein the number of objects, in particular persons, passing the various doors 16 are added up and subtracted from one another to obtain the number of persons within the building.
  • the number of persons, the length of stay or the like can also be determined for various areas within the building or other areas that are confined by the door system 14.
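  • such a people counter, adding up entries over all doors 16 and subtracting exits, could be sketched as follows; the first-in-first-out estimate of the length of stay and the control_unit interface are assumptions for illustration:

```python
class OccupancyCounter:
    """Track the number of persons inside the area confined by the door
    system by adding up entries and subtracting exits over all doors."""

    def __init__(self) -> None:
        self.inside = 0
        self.entry_times = []  # entry timestamps, to estimate lengths of stay

    def person_passed(self, door_id: int, entering: bool, timestamp: float) -> None:
        """Called whenever a person is detected passing any one of the doors."""
        if entering:
            self.inside += 1
            self.entry_times.append(timestamp)
        else:
            self.inside = max(0, self.inside - 1)
            if self.entry_times:
                self.entry_times.pop(0)  # FIFO approximation of who left

    def close_when_empty(self, control_unit) -> None:
        """E.g. close the door after the last person has left the building."""
        if self.inside == 0:
            control_unit.close_door()
```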

Abstract

A method is provided for operating an automatic door system (14) using an analysis unit (12), wherein the door system (14) has a drive unit (22) for actuating a door (16), a camera (24) and a control unit (20), wherein the camera (24) is an integral part of the safety functionality of the door (16), wherein the method comprises the following steps: capturing a recording by the camera (24), wherein the recording includes at least the area in front of the door (16); recognizing at least one object and the type of the object in the recording by the analysis unit (12); and controlling the drive unit (22) based on the at least one recognized object and/or its type. Further, a system (10) is provided.

Description

Method for operating an automatic door system as well as system having an automatic door system
The invention concerns a method for operating an automatic door system as well as a system having an automatic door system.
Automatic door systems, for example at buildings, are well known in the art. Today, automatic door systems are based on a sensor for proximity detection to detect an object close to the door. In response to an object close to the door, the door is opened irrespective of the actual desire of the person (or the object) in front of the door to pass the door.
Thus, known door systems open the door for any person or even objects close to the door and the door is always opened in the same manner. It can be said that the door system is agnostic.
Secondly, the sensor is used to ensure the safety of the door, meaning that the door is controlled in such a way that no persons present in the track of the door are touched or even harmed, for example while closing the door.
It is known to use different technologies, i.e. radar, camera, or the like, to detect persons that would like to pass the door. These sensors are, however, not suitable to ensure the safety of the door operation.
It is the object of the invention to provide a method for operating a door system that ensures the safety of the door operation as well as provides a door movement specific to the actual situation in front of the door.
For this purpose, a method for operating an automatic door system using an analysis unit is provided. The door system comprises at least one door, at least one drive unit for actuating the at least one door, a camera and a control unit for controlling the drive unit, wherein the camera is an integral part of the safety functionality of the door and monitors the track of the door. The method comprises the following steps: capturing at least one recording by the camera, wherein the recording includes at least the area in front of the door, transmitting the recording to the analysis unit, recognizing at least one object and the type of the object in the recording by the analysis unit, and controlling the drive unit based on the at least one recognized object and/or its type.
By using the same camera that is also part of the safety functionality of the door for recognizing the objects in front of the door, the number of sensors can be reduced. Further, by recognizing the object and the type of objects, the control of the drive unit and thus the motion of the door can be adapted to the recognized object. This leads to a more refined interaction between a person and the door; in the best case, the person does not even recognize that he has just passed the door because the door has been opened according to his needs.
The door may be a swing door, a revolving door, a sliding door, a folding door or the like. The door may comprise a door leaf, which is actuated by the drive unit. In particular, the control of the door may be based on one or more than one object recognized in the recording simultaneously.
The camera may be mounted above the door. In particular, the camera monitors the track of the door leaf.
For example, the recording includes parts of the track of the door leaves and an area of up to 5 m, preferably up to 7 m, more preferably still up to 10 m (measured on the ground) in front of the door.
The analysis unit may be part of the door system, in particular a part of the control unit, or the analysis unit may be separate from the door system, for example provided as a cloud server.
It is also conceivable that the drive unit receives steering input from the analysis unit or the control unit. In the latter case, the analysis unit transmits information about the recognized object and/or the type of the object to the control unit.
In an aspect, the at least one object is a person, an animal, a movable object or a stationary object allowing the analysis unit to process the situation in its entirety.
A moveable object is, for example, any inanimate object that may be carried by a person, like a bag or backpack, may be rolling and pushed or pulled by a person, like a stroller, a bicycle or a trolley, and/or is self-propelled, like an e- scooter or a car.
Accordingly, the type of the object may be "person" for a person, "dog" for an animal being a dog, "trolley" for a moveable object being a trolley, or "tree" for a stationary object being a tree.
For improving the detection accuracy of the situation in front of the door further, the analysis unit may recognize at least one individual property of the object in the recording, in particular wherein the individual property is one or more of the following properties: if the object is a person: age, size, ethnicity, clothes worn, items carried, type of locomotion, orientation, pose, viewing direction, level of attention, state of health, mood, current activity, performed gesture, behavior, gait and/or posture; and/or if the object is an animal: species, breed, whether the animal is wet, whether the animal carries prey, whether the animal is on a leash and/or whether the animal is wearing a muzzle; and/or if the object is a movable object: whether the object is associated with and/or carried by a person.
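Purely as an illustration, the recognized type together with such individual, kinematic and image processing properties could be grouped into one record per object. The following Python sketch uses field names that are assumptions, not terminology from the patent:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DetectedObject:
    """One recognized object as output by the analysis unit: its type plus
    individual, kinematic and image processing properties."""
    object_type: str                       # e.g. "person", "dog", "trolley"
    individual: dict = field(default_factory=dict)  # e.g. {"mood": "happy", "on_leash": True}
    velocity: tuple = (0.0, 0.0)           # 2D, parallel to the ground, m/s
    distance_to_door: float = 0.0          # m, measured on the ground
    bounding_box: Optional[tuple] = None   # (x_min, y_min, x_max, y_max)
    associated_with: Optional[int] = None  # index of a person this object belongs to
```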
In an aspect, the analysis unit recognizes at least one kinematic property of the at least one object and/or a kinematic subpart of the at least one object in the recording, in particular the change of position, the velocity, the change of velocity, the acceleration, the change of acceleration, the position, the distance to the door and/or the direction of movement of the object and/or the kinematic subpart. This way, the change of the situation can be predicted, in particular an estimated time of arrival of a person at the door can be predicted.
The velocity, the change of velocity, the acceleration, the change of acceleration, the position, the distance to the door and/or the direction of movement may be determined for each spatial direction separately, in particular for each of the two spatial directions parallel to the ground.
For example, a two dimensional vector is determined for each of the velocity, the change of velocity, the acceleration, the change of acceleration, the position, the distance to the door and/or the direction of movement. In particular, a kinematic subpart of an object is a part of an object that may move in addition to the general movement of the object. Examples are the skeleton function of a human or an animal, a body part of a human or animal, like an arm, a leg or the head, or parts of an object extending from the main part of the object, like the handlebar of a bicycle.
For improved image processing, the analysis unit may recognize at least one image processing property of the at least one object in the recording, in particular the bounding box, the closure, the timeline of properties and/or fusion of the object with other objects.
In an embodiment, a rule set, in particular comprising an evaluation logic, is stored in the analysis unit and/or the control unit, the rule set, in particular the rule set and its evaluation logic, defining a plurality of conditions when the door shall open or stay closed, in particular wherein the conditions include the presence or absence of a specific object, a specific type of object, a specific individual property, a specific kinematic property, a specific image processing property or any combination thereof. The analysis unit and/or the control unit determines based on the rule set, in particular the rule set and its evaluation logic, and the type of the at least one object recognized in the recording and/or its at least one individual property, its at least one kinematic property and/or its at least one image processing property, whether the door shall be opened and particularly operates the drive unit accordingly. By using the rule set, it is possible to define specific characteristics that persons, animals and/or objects must have to pass the door very granularly.
The conditions and/or rules may also take into account more than one object, its type and/or properties to differentiate between different situations. A condition may be that the door shall open for persons without shopping carts, i.e. the condition demands the presence of an object recognized as a person and the absence of objects recognized as shopping carts in the recording. For example, the rule set comprises instructions, in particular definitions in the evaluation logic, that define whether and/or how the door shall be opened if more than one condition of the rule set is met, in particular if the conditions that are met are in conflict with one another. This way, it is possible to even handle complex situations in front of the door.
In an embodiment, the drive unit is controlled according to or based on an actuation profile for achieving a desired door movement, in particular a desired motion of at least one door leaf of the door, wherein the actuation profile is predetermined and selected based on the at least one object recognized in the recording and/or its type, its at least one individual property, its at least one kinematic property and/or its at least one image processing property; and/or wherein the actuation profile is created or adapted based on the type of the at least one object recognized in the recording and/or its at least one individual property, its at least one kinematic property and/or its at least one image processing property. By using actuation profiles, the desired movement of the door can be chosen and adapted easily.
For example, the actuation profile includes the acceleration and the velocity of the door, in particular the door leaf, at various positions during motion; the desired traveling distance; the duration between the start of the motion and reaching various predetermined positions, optionally including the time of the start of the motion; and/or a minimal distance between one of the at least one objects to the door, to the door leaf, to the track of the door and/or to the track of the door leaf, so that the desired door movement is represented, in particular represented entirely by the actuation profile.
In order to choose, generate and/or adapt the actuation profile with improved precision, the analysis unit and/or the control unit may take additional data into consideration for determining whether the door shall be opened and/or for the determination of the actuation profile, in particular the additional data including the state of the door, the current time, the current weather conditions and/or the current number of persons within the building.
For example, the recording is a captured single image, a captured series of consecutive images and/or a video recording.
The images of the series of images are preferably consecutive.
In an aspect, the analysis unit and/or control unit receives at least one measurement value extracted from the data of the camera and/or of at least one additional sensor of the door system, in particular a distance sensor and/or a source of electromagnetic radiation, and uses the at least one measurement value for the recognition of the at least one object, the type of the at least one object and/or its individual, kinematic and/or image processing properties. This improves the recognition of the objects further.
To allow direct feedback to persons around the door, the door system and/or the building in which the door system is installed may comprise a signaling device, in particular an optical and/or acoustical signaling device, wherein the signaling device is activated by the analysis unit and/or the control unit based on the type of the at least one object, its at least one property and/or its individual, kinematic and/or image processing properties.
In an embodiment, the analysis unit and/or the control unit determines the number of objects, in particular persons, passing the door and their direction of movement, obtaining the number of objects and/or the length of stay of the objects within the area confined by the door system based on the determined number of objects passing the door, wherein the obtained number and/or the determined length of stay is used to control the drive unit and/or the signaling device or is transmitted via an interface. This way, more sophisticated opening rules can be implemented. The door system may comprise a plurality of doors wherein the number of objects passing the door is to be understood in this case to be the number of objects passing any one of the doors.
In an embodiment the recognition of the at least one object in the recording is performed by a deterministic algorithm, a machine learning algorithm, a support vector machine and/or a trained artificial neural network of the analysis unit, allowing efficient object recognition.
For a very well adapted recognition, the artificial neural network may be trained using training data, wherein the training data comprises, for various training situations, input data of the same type and structure as the data which is fed to the artificial neural network during regular operation of the door system, and information about the expected correct output of the artificial neural network for the training situations; the training comprises the following training steps: feed forward of the input data through the artificial neural network; determining an answer output by the artificial neural network based on the input data, determining an error between the answer output of the artificial neural network and the expected correct output of the artificial neural network; and changing the weights of the artificial neural network by back-propagating the error through the artificial neural network, in particular wherein the input data includes recordings captured by the camera, at least one measurement value extracted from the data of the camera and/or at least one measurement value of at least one additional sensor; wherein the information about the expected correct output includes information about the actual objects in the recording including their types as well as their individual, kinematic and/or image processing properties; and wherein the answer output includes the actual objects in the recording, their types, their properties and/or their individual, kinematic and/or image processing properties determined based on the input data.
For the above-mentioned purpose, a system is provided comprising an analysis unit and an automatic door system having at least one door, at least one drive unit for actuating the at least one door, in particular at least one door leaf of the door, a camera and a control unit for controlling the drive unit, wherein the system is configured to carry out a method as explained above, in particular wherein the analysis unit is part of the door system, for example part of the control unit and/or an image processing unit of the camera.
The features and advantages discussed in context of the method also apply to the system and vice versa.
For efficiently providing the recording and/or the additional measurement value, the camera may be a single camera, a stereo camera, a time of flight 3D camera, an event camera or a plurality of cameras; and/or the door system may comprise an additional sensor, in particular a distance sensor and/or a source of electromagnetic radiation, whose field of view overlaps with the field of view of the camera.
Further features and advantages will be apparent from the following description as well as the accompanying drawings, to which reference is made. In the drawings:
Fig. 1: shows a system according to the invention schematically,
Fig. 2: shows the different evaluation blocks involved in the method according to the invention,
Fig. 3: shows a flowchart of the method according to the invention,
Fig. 4: shows a first situation during the operation of the system according to Figure 1 carrying out the method according to Figure 3, and
Fig. 5: shows a second situation during the operation of the door system according to Figure 1 carrying out the method according to Figure 3.
Figure 1 shows schematically a system 10 according to the invention having an analysis unit 12 and an automatic door system 14.
The automatic door system 14 has a door 16, being a sliding door in the shown embodiment with two door leaves 18, a control unit 20, two drive units 22, and a camera 24. The automatic door system 14 may further comprise one or more additional sensors 26 and a signaling device 28.
The door 16 may as well be a swing door, a revolving door, a folding door or the like. The method of operation remains the same.
The camera 24 and the drive units 22 are connected to the control unit 20, wherein the control unit 20 is configured to control the drive units 22.
Each of the drive units 22 is associated with one of the door leaves 18 and is designed to move the respective door leaf 18 along a track. The door leaves 18 may be moved independently of one another.
In particular, the door leaves 18 are moveable such that between them a passage can be opened, wherein the width of the passage is adjustable by the control unit 20.
The camera 24 has a controller 30 and is located above the door 16, i.e. above the track of the door leaves 18.
The camera 24 is an integral part of the safety functionality of the door system 14. Namely, the camera 24 monitors the track of the door 16, i.e. the movement path of the door leaves 18, and forwards this information to the control unit 20 or its integrated controller 30. Based on this information, the integrated controller 30 and/or the control unit 20 control the drive units 22 to ensure that the door 16 is operated safely, in particular to avoid that persons, for example vulnerable persons such as children or elderly people, present in the track of the door 16 are touched or even harmed by the closing door leaves 18.
The camera 24 may be a single camera, a stereo camera, a time of flight 3D camera, an event camera or a plurality of cameras.
The field of view F of the camera 24 includes the track of the door 16, in particular the track of the door leaves 18.
The field of view F of the camera 24 may cover an area of up to 5 m, preferably up to 7 m, more preferably still up to 10 m in front of the door 16, measured on the ground. Thus, the field of view F includes, for example, the sidewalk of the street in front of the building the system 10 is installed in.
The analysis unit 12 may also be a part of the automatic door system 14. For example, the analysis unit 12 may be integrated into the control unit 20 or integrated into the controller 30 of the camera 24.
It is also conceivable that the analysis unit 12 is separate from the door system 14. In this case, the analysis unit 12 may be provided as a server (shown in dashed lines in Figure 1), for example a cloud server or a server on premise.
In any case, the analysis unit 12 is connected to the control unit 20 and/or the drive units 22.
The additional sensor 26 may be a distance sensor, like an infrared light source as known in the art, or a source of electromagnetic radiation, for example a radar. The field of view of the additional sensor 26 overlaps with the field of view F of the camera 24.
The signaling device 28 may include an optical signaling device, like a light, and an acoustical signaling device like a speaker. The signaling device 28 may be mounted above the door 16.
The signaling device 28 may be a signaling device of the building in which the door system 14 is installed.
Figure 2 shows an illustration of evaluation blocks to determine whether and how the door 16 shall be opened. Figure 2 is for illustration purposes only and the blocks shown therein are not necessarily separate hardware and/or software modules.
For simplification, the inputs of the additional sensor 26 are not shown in Figure 2.
In the first block B1, the recordings captured by the camera 24 are evaluated and classified. Objects in the recordings are recognized, as will be explained in more detail below.
In block B2, the information generated in block B1 is evaluated based on rules R according to an evaluation logic. The rules R may be predefined and stored in a memory of the analysis unit 12.
In block B3, the information generated in blocks B1 and/or B2 is used to select an actuation profile P.
The actuation profile P defines the movement that the door 16, in particular the door leaves 18 shall perform. For example, the actuation profile P includes the acceleration and the velocity of the door leaves 18 at various positions during the motion. Further, the actuation profile P may define the traveling distance for each door leaf 18, the duration between the start of the motion and reaching various predetermined positions during the motion. Further, the time of start of the motion can be defined, for example as "immediately".
It is also possible that the actuation profile defines a minimal distance between an object recognized in the recording and the door 16, in particular the door leaf 18 and/or the track of the door leaves 18, at which the motion of the door leaves 18 shall be initiated.
Further, the actuation profile P takes into account the physical capabilities of the specific door 16 or door leaf 18, e.g. a maximum possible acceleration.
The actuation profiles P may be predefined and stored in the memory of the analysis unit 12. The actuation profile P is then selected based on the information generated in blocks B1 and/or B2. It is also possible that the selected actuation profile P is adapted based on this information.
It is also conceivable that the actuation profile P is created based on the information of blocks B1 and B2, i.e. that no actuation profiles P are predetermined.
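By way of illustration only, the following Python sketch shows one way such an actuation profile and its selection could be represented in software; all names, values and selection criteria are editorial assumptions and not part of the disclosure:

```python
# Illustrative sketch only: names, values and selection criteria are
# editorial assumptions, not part of the disclosure.
from dataclasses import dataclass, field

@dataclass
class ActuationProfile:
    """Represents a desired door movement (an actuation profile P)."""
    velocity_at_position: dict[float, float] = field(default_factory=dict)      # m -> m/s
    acceleration_at_position: dict[float, float] = field(default_factory=dict)  # m -> m/s^2
    travel_distance_m: float = 0.9       # desired opening width per door leaf
    start_delay_s: float = 0.0           # 0.0 means the motion starts "immediately"
    trigger_distance_m: float | None = None  # minimal object-to-track distance to start
    max_acceleration_m_s2: float = 1.5   # physical capability of the specific door

# Hypothetical predefined profiles stored in the memory of the analysis unit.
PROFILES = {
    "default": ActuationProfile(),
    "fast": ActuationProfile(velocity_at_position={0.0: 0.8, 0.45: 1.2}),
    "wide": ActuationProfile(travel_distance_m=1.4),
}

def select_profile(object_type: str, speed_m_s: float, has_luggage: bool) -> ActuationProfile:
    """Select a profile based on the recognized object, its type and its kinematics."""
    if object_type == "person" and has_luggage:
        return PROFILES["wide"]   # e.g. a person pulling trolleys needs a wider opening
    if object_type == "person" and speed_m_s > 1.5:
        return PROFILES["fast"]   # a running person needs a quick, immediate opening
    return PROFILES["default"]
```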
Then, in block B4, based on the selected actuation profile P of block B3 and optionally the information generated in blocks B1 and/or B2, the door 16, more precisely the door leaves 18, are actuated by the drive units 22. To this end, the drive units 22 receive steering input from the control unit 20 and/or the analysis unit 12 based on the actuation profile P.
In particular, block B1 and optionally also block B2 and/or block B3 are carried out by the analysis unit 12.
To this end, in particular for block B1, the analysis unit 12 comprises a deterministic algorithm, a machine learning algorithm, a support vector machine and/or a trained artificial neural network.

Figure 3 shows a more detailed flowchart of the method according to the invention than the overview of Figure 2.
During operation, in a first step S1, the camera 24 captures recordings. A recording is, for example, a single captured image, a series of consecutive images and/or a video recording.
For example, recordings are captured at regular intervals, in particular continuously, and the following steps are carried out for multiple recordings simultaneously. In particular, the steps are carried out for each recording, improving the reaction time of the system 10.
The recordings include the field of view F, and thus the track of the door 16 as well as the area in front of the door 16 and objects or persons present therein.
The recordings are used on the one hand for ensuring a safe operation of the door 16. The control unit 20 or the integrated controller 30, which receives the recordings, ensures that - based on at least the part of the recording showing the track of the door - the door 16 is operated safely. In particular, it can be avoided that persons, for example vulnerable persons such as children or elderly people, that are present in the track of the door 16 are touched or even harmed by the door leaves 18. To this end, the control unit 20 and/or the integrated controller 30 controls the drive unit 22 to actuate the door leaves 18 accordingly (step S2). This way, the door 16 can be operated safely. Thus, the camera 24 is an integral part of the safety functionality of the door 16.
On the other hand, in step S3, the recordings are transmitted to the analysis unit 12.
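As a minimal sketch of these two parallel processing paths, assuming hypothetical method names on the camera, control unit and analysis unit:

```python
# Minimal sketch of the two parallel processing paths (steps S1 to S3); the
# method names on camera, control unit and analysis unit are hypothetical.
def process_frame(camera, control_unit, analysis_unit) -> None:
    recording = camera.capture()           # step S1: single image, image series or video
    control_unit.ensure_safety(recording)  # step S2: monitor the track, actuate the door safely
    analysis_unit.submit(recording)        # step S3: forward the recording for recognition (S4 ff.)
```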
In step S4, the analysis unit 12 performs image recognition. Firstly, in step S4.1 the analysis unit 12 recognizes the objects and the types of the objects in the recording. The recognized objects may be persons, animals, movable objects or stationary objects.
Movable objects are, for example, inanimate objects that may be carried by a person, like a purse, a bag or a backpack. They may also be rolled, pushed or pulled by a person, like a stroller, a bicycle or a trolley. Thus, these objects can also be associated with a person. Further, movable objects may also be self-propelled, like a car.
Stationary objects may be plants, permanent signs or the like.
The analysis unit 12 may also recognize kinematic properties of the objects in the recordings (step S4.2).
Thus, the analysis unit 12 may for each object in the recording determine the position of the object with respect to the door 16, the change of position, the velocity, the change of velocity, the acceleration, the change of acceleration, the distance to the door and/or the direction of movement of the object.
Of course, all of these values may be determined for each spatial direction separately. In particular, two spatial directions are relevant for each object, namely the two spatial directions parallel to the ground. Thus, for example, a two-dimensional vector is determined for each of the above-mentioned kinematic properties of each object.
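For instance, the per-direction kinematic properties could be estimated from the object positions in consecutive recordings by finite differences, as in the following sketch; the frame rate and all names are illustrative assumptions:

```python
import numpy as np

def kinematic_properties(positions: np.ndarray, fps: float = 30.0):
    """Estimate per-direction kinematics of one tracked object.

    positions: array of shape (N, 2), ground-plane coordinates of the object
    in N consecutive recordings (N >= 3); fps is an assumed frame rate.
    """
    dt = 1.0 / fps
    velocity = np.diff(positions, axis=0) / dt      # (N-1, 2): m/s per spatial direction
    acceleration = np.diff(velocity, axis=0) / dt   # (N-2, 2): m/s^2 per spatial direction
    speed = float(np.linalg.norm(velocity[-1]))     # current scalar speed
    direction = velocity[-1] / (speed + 1e-9)       # unit vector: direction of movement
    return velocity, acceleration, speed, direction
```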
In particular, the analysis unit 12 determines the kinematic property of a kinematic subpart of an object, i.e. a part of an object that may move in addition to the general movement of the object. Examples are the skeleton function of a human or an animal, a body part of a human or animal, like an arm, a leg or the head, or parts of an object extending from the main part of the object, like the handlebar of a bicycle. For ease of understanding, kinematic properties of a kinematic subpart are also meant when referring to the kinematic properties of the object.

The type of the object is also recognized, for example the objects are classified as "person", "dog", "stroller", "trolley", "bicycle", "plant", "e-scooter" and/or "car" (step S4.3).
Further, the analysis unit 12 recognizes various individual properties of each of the objects in the recording (step S4.4). The recognized individual properties may differ depending on the type of the object.
For example, if the object is a person, the individual properties may include the age, the size, the ethnicity, the clothes worn by the person, the items and/or objects carried by the person, the type of locomotion (walking aid, inline skates, skateboard, etc.), the orientation with respect to the door, the pose of the person, the viewing direction of the person, the level of attention of the person, the state of health of the person, the mood of the person, the current activity of the person, a gesture, if any, performed by the person, the behavior of the person, the gait of the person and/or the posture of the person.
If the object is an animal, the analysis unit determines the species of the animal, the breed of the animal, whether the animal is a pet, whether the animal carries prey, whether the animal is on a leash and/or whether the animal is wearing a muzzle.
If the object is a movable object, the analysis unit 12 may determine whether the object is associated with and/or carried by a person.
Further, the analysis unit 12 may determine for each object in the recording image processing properties in step S4.5.
Image processing properties are properties of the object that are used for the purposes of image processing and simplification. For example, image processing properties may be the bounding box B of each object, the closure of objects, in particular of fused objects, the timeline of properties of an object and/or the fusion of an object with other objects.

Thus, the analysis unit 12 recognizes each object in the recording, its type as well as its individual, kinematic and/or image processing properties.
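Purely for illustration, a bounding box and a possible fusion criterion could look as follows; the intersection-over-union test is an editorial assumption, not a technique stated in the disclosure:

```python
from dataclasses import dataclass

@dataclass
class BoundingBox:
    """Axis-aligned bounding box B of a recognized object (image coordinates)."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def area(self) -> float:
        return max(0.0, self.x_max - self.x_min) * max(0.0, self.y_max - self.y_min)

def iou(a: BoundingBox, b: BoundingBox) -> float:
    """Intersection over union of two boxes."""
    inter = BoundingBox(max(a.x_min, b.x_min), max(a.y_min, b.y_min),
                        min(a.x_max, b.x_max), min(a.y_max, b.y_max))
    union = a.area() + b.area() - inter.area()
    return inter.area() / union if union > 0.0 else 0.0

def should_fuse(a: BoundingBox, b: BoundingBox, threshold: float = 0.3) -> bool:
    """Decide whether two detections (e.g. a person and a carried bag) are fused."""
    return iou(a, b) >= threshold
```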
In total, step S4 corresponds to block B1 of Figure 2, and the objects, their types as well as their individual, kinematic and/or image processing properties correspond to the information generated in block B1.
In the next step S5, the analysis unit 12 and/or the control unit 20 determine whether the door 16 shall be opened based on a rule set.
The rule set comprises the plurality of rules R and also an evaluation logic according to which the rules R are to be applied, in particular in case of a conflict.
The rule set is stored in the analysis unit 12 or the control unit 20, respectively. The rule set may be predefined and/or adapted during the operation based on machine learning.
The rules R define a plurality of conditions when the door 16 shall be opened or shall stay closed or shall be closed.
Each rule R includes conditions concerning the presence or the absence of a specific object, a specific type of object, a specific individual property, a specific kinematic property, a specific image processing property or a combination thereof. Further, each rule R comprises the consequence of the conditions, i.e. whether the door shall be opened or not.
A simple example of a rule may be that only the presence of a person, i.e. not the presence of objects or animals, leads to an opening of the door.
Another example is that only persons without luggage shall open the door, meaning that only the presence of a person not associated with objects of the type trolley, luggage or the like leads to an opening of the door.

A further example may be that only animals on a leash shall open the door, meaning that the presence of a person and the presence of an animal having the additional individual property of being held on a leash lead to the consequence that the door shall be opened.
It is also possible that one rule R takes into account more than one object, their types and/or properties to differentiate between different situations.
A condition or rule may be that the door shall open for persons without shopping carts, i.e. the rule R demands the presence of an object recognized as a person and the absence of objects recognized as shopping carts in the recording before the door shall be opened.
It is of course possible that more than one object is present in the recording, that these objects fulfill conditions of different rules R, and that the consequences of these rules R contradict one another, meaning that one rule indicates that the door shall be opened based on one of the objects in the recording while another rule indicates that the door shall stay closed based on other objects in the recording.
To resolve these issues, the evaluation logic comprises instructions that define whether and/or how the door shall be opened if more than one condition of the rule set, i.e. more than one rule R, is met.
This way, the control of the door 16 may be based on more than one object recognized in the recording.
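The following Python sketch illustrates one conceivable encoding of rules R with an evaluation logic that resolves conflicts by priority; the priority mechanism, the dictionary-based object representation and all names are editorial assumptions:

```python
# Illustrative encoding of rules R and an evaluation logic; objects are
# represented as dictionaries with a "type" key, and the priority mechanism
# for conflict resolution is an editorial assumption.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    condition: Callable[[list[dict]], bool]  # predicate over all recognized objects
    open_door: bool                          # consequence if the condition is met
    priority: int                            # used by the evaluation logic on conflicts

RULES = [
    # The door shall open for persons without shopping carts.
    Rule(lambda objs: any(o["type"] == "person" for o in objs)
         and not any(o["type"] == "shopping cart" for o in objs),
         open_door=True, priority=1),
    # Animals alone shall not open the door.
    Rule(lambda objs: bool(objs) and all(o["type"] == "animal" for o in objs),
         open_door=False, priority=0),
]

def shall_open(objs: list[dict]) -> bool:
    """Return True if the door shall be opened; the highest-priority matching rule wins."""
    matching = [r for r in RULES if r.condition(objs)]
    if not matching:
        return False  # no condition met: the door stays closed
    return max(matching, key=lambda r: r.priority).open_door
```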
If it has been determined in step S5, which corresponds to block B2, that the door shall be opened, it has to be decided how the door shall be opened. This is done in step S6, also by the analysis unit 12 or the control unit 20, respectively.
Based on the objects recognized in the recording that have been considered by the rules R in the decision to open the door, their types, their individual properties, their kinematic properties and/or their image processing properties, an actuation profile P is selected, created or adapted. In particular, the kinematic properties of the objects relevant for this specific rule R are taken into consideration for the selection, creation or adaptation of the actuation profile P.
For example, if the speed of a person approaching the door is high, then the door 16 has to open more quickly than in cases where the speed is rather low.
Further, if a person carries bulky luggage or pulls trolleys, the door 16 has to be opened wider than for the person alone.
Thus, the desired movement of the door leaves 18 depends on the objects and the overall situation.
Step S6 thus corresponds to block B3 of Figure 2.
It is conceivable that already the rules R indicate a specific actuation profile P that shall be selected or adapted.
It is possible that in step S4, S5 and/or S6 measurement values from the additional sensors 26 are considered.
As an illustration, in step S7 the distance to one or more of the objects is determined and transmitted to the analysis unit 12 or the control unit 20, respectively.
Then the analysis unit 12 or the control unit 20, respectively, takes the measurement values into consideration for the recognition of objects, type and properties (step S4), determining whether the door shall be opened (step S5) and/or for the determination of the actuation profile (step S6).
Likewise, in step S8, it is conceivable that the analysis unit 12 or the control unit 20, respectively, takes the additional data into consideration for the recognition of objects, type and properties (step S4), determining whether the door shall be opened (step S5) and/or for the determination of the actuation profile (step S6).
The additional data may be generated by the camera 24, for example as an additional distance measurement, or from the control unit 20 being the state of the door, the current time, the current weather conditions and/or the current number of persons within the building.
In particular, if the state of the door indicates that the door is already open, another actuation profile P has to be chosen than in cases where the door is closed.
It is also conceivable that the results of steps S4, S5 and/or S6 are transmitted to the control unit 20 or the integrated controller 30 to be taken into consideration for ensuring the safe operation of the door 16.
Further, steps S2 to S8 do not have to be carried out in the order explained above but may be carried out in any other order. It is also possible that one or more of the steps S3 to S8 are omitted.
In the following step S9, the drive unit 22 is controlled by the analysis unit 12 or the control unit 20, respectively, based on the selected, created or adapted actuation profile P.
In step S10, the drive units 22 then actuate the respective door leaves 18 associated with them. Thus, the desired movement of the door 16 as defined in the actuation profile P is actually carried out by the door 16.
Steps S9 and S10 correspond to block B4 of Figure 2.
Further, in step S11, the signaling device 28 is actuated by the analysis unit 12 or the control unit 20, respectively, based on the recognized object, the type of the object, at least one property of the object and/or its individual, kinematic and/or image processing properties. For example, the actuation profile P or the rule R may already indicate whether the signaling device 28 shall be activated and how.
For example, the signaling device 28 may be activated if an object is denied entrance, e.g. if a person has attempted to enter through the door 16 but no rule R for opening the door matches this particular person and his or her properties.
The signaling device 28 could also be used to indicate the position of the door to a person that is allowed to enter through the door 16.
In case that an artificial neural network of the analysis unit 12 is used, the artificial neural network is trained using training data.
The training data comprises sets of different input data for various situations, for example situations as explained above, and also information about the desired door movement in each of the situations.
The input data is of the same type and data structure as the data supplied to the artificial neural network during the operation of the artificial neural network as explained above.
In particular, the input data includes recordings generated by the camera 24 and optionally the at least one measurement value extracted from the data of the camera 24 and/or at least one measurement value of at least one additional sensor 26.
Further, the training data includes the expected correct output of the artificial neural network in response to each set of the input data, in particular information about the actual objects in the recording including their types as well as their individual, kinematic and/or image processing properties.
For the training of the artificial neural network, in a first training step T1 (Fig. 3) the input data is fed forward through the artificial neural network. Then, the answer output of the artificial neural network is determined (step T2). The answer output may include the actual objects in the recording, their types, their properties and/or their individual, kinematic and/or image processing properties that have been determined by the artificial neural network based on the input data for one of the various training situations.
In step T3, an error between the answer output of the artificial neural network and the expected correct output (known from the training data) is determined.
In step T4, the weights of the artificial neural network are changed by back-propagating the error through the artificial neural network.
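Expressed as a sketch in PyTorch, one iteration of the training steps T1 to T4 could look as follows; the loss function and all names are placeholders chosen by the editor, not the actual model of the door system:

```python
import torch
from torch import nn

def train_step(model: nn.Module, optimizer: torch.optim.Optimizer,
               input_data: torch.Tensor, expected_output: torch.Tensor) -> float:
    """One iteration of the training steps T1 to T4."""
    answer = model(input_data)                              # T1/T2: feed forward, answer output
    loss = nn.functional.mse_loss(answer, expected_output)  # T3: error vs. expected correct output
    optimizer.zero_grad()
    loss.backward()                                         # T4: back-propagate the error ...
    optimizer.step()                                        # ... and change the weights
    return loss.item()
```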
Thus, the trained artificial neural network may be used for all of the blocks B1, B2 and B3 and/or as the basis for steps S2 to S8.
Figures 4 and 5 show two situations of the system 10 in use.
In the situation of Figure 4, three persons are in front of the door 16 and within the field of view F of the camera 24. Thus, all of the persons are included in the recording.
The first person on the left-hand side is a woman slowly walking towards the door 16.
The analysis unit 12 determines that this object is a person of age 24 and gender female, that she does not have luggage or carry objects, that her mood is "happy" and that she is oriented towards the door at an angle of 20° (with respect to a line perpendicular to the track of the door 16). The woman has eye contact with the door, indicating that she will likely pass through the door. Further, the kinematic properties are determined, like the slow speed of 0.5 m/s as well as the acceleration (no acceleration in this case).
Further, the analysis unit 12 determines the bounding box B - shown in a dashed line in Figure 4 - as an image processing property.

The second object in the middle, as recognized by the analysis unit 12, is a male person of age 42 who is merely passing by the door 16. The analysis unit 12 determines the type and further properties of the person, as indicated in Figure 4. The person carries no luggage but a mobile phone. His mood is focused and his orientation is transverse to the door, at an angle of 89° to the door. The person has no eye contact and moves at a slow speed of 0.5 m/s with no acceleration.
The third person on the right-hand side runs towards the door. As indicated in Figure 4, this object is recognized by the analysis unit 12 as a male person of age 31. The person does not carry luggage or objects and appears to be in a stressed mood. The person has eye contact with the door and his movement is directed to the door at an angle of 42°. The person moves fast, with a speed of 2.5 m/s, and also accelerates.
Having recognized the situation, the analysis unit 12 then evaluates the rules R. In this case, two rules R, one for the person on the left-hand side walking to the door and one for the person on the right-hand side running to the door, indicate that the door shall be opened.
According to the evaluation logic, due to the speed of the person on the right-hand side, an actuation profile P suitable for the kinematic properties of this person is chosen. This actuation profile P describes a door movement that is very quick and starts immediately, so that the person on the right-hand side does not have to stop or slow down.
The drive units 22 are then controlled according to the selected actuation profile P so that the door leaves 18 move apart quickly, allowing the person on the right-hand side to run through the door without the need to change his movement, even though he is accelerating. The door 16 will then stay open until the second person, who will of course remain longer in the field of view F of the camera 24, has passed the door.
Figure 5 shows a second situation in front of the door 16.
In this situation, five objects are within the field of view F of the camera 24, namely two female persons, two trolleys and one sheep.
On the left-hand side, as indicated in Figure 5, the analysis unit 12 recognizes a female person of age 24 in neutral mood who is walking at a slow speed of 0.3 m/s and slightly decelerating at the door 16. The person has eye contact with the door.
Further, the analysis unit 12 recognizes two movable objects as trolleys that are pulled by the person. Thus, the two objects are associated with the person.
Further, a second person (the fourth object) in the middle is walking parallel to the door. The analysis unit 12 recognizes this object as a person of female gender and age 18. The person carries a handbag and appears happy. The analysis unit 12 also determines that the orientation of the person is transverse to the door and that she has no eye contact with the door. The kinematic properties of 0.7 m/s walking speed and no acceleration are also recognized.
Further, the fifth object is an animal right in front of the door. The animal is a sheep and thus not a pet. It is further not on a leash nor related to any person. It is oriented towards the door, walks at a slow speed of 0.2 m/s and decelerates slightly.
In this situation, the person in the middle passing by the door does not have any influence on the door behavior as she is not interested in entering the building.
The sheep in front of the door fulfills the conditions of a rule R that does not allow animals to enter the building. Thus, this rule would result in the door not being opened. However, the person on the left-hand side walking towards the door with two trolleys fulfills the condition of another rule R that indicates that the door 16 shall be opened.
The results of both rules R contradict one another, so that the instructions in the evaluation logic to resolve the conflict have to be regarded by the analysis unit 12.
In this case, the instructions define that persons are allowed to enter the building even though animals are present in front of the door. Thus, the rule R associated with the person on the left-hand side is regarded as more important. Therefore, according to this rule R, an actuation profile P is selected for the person on the left-hand side and her properties.
Due to the fact that the person is pulling two trolleys, an actuation profile P for persons with luggage is selected that leads to a wide opening width of the door 16. Further, it is possible that the analysis unit 12 adapts this actuation profile P based on the estimated width of the person and the two trolleys to further increase the resulting opening width of the door 16 so that the person with the two trolleys can safely pass the door without the chance of collisions with one of the door leaves 18.
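As a hypothetical illustration of such an adaptation, the opening width could be computed from the estimated width of the person and the trolleys plus a safety margin; the margin value and all names are editorial assumptions:

```python
def adapted_opening_width(profile_width_m: float,
                          estimated_object_width_m: float,
                          safety_margin_m: float = 0.3) -> float:
    """Widen the opening so the person and trolleys pass without collision risk."""
    return max(profile_width_m, estimated_object_width_m + 2.0 * safety_margin_m)
```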
Therefore, the method and the system can provide the possibility of controlling the access to a building or other confined space in a very detailed and precise manner without the need for further personnel or other structures.
In addition, the door 16 may be opened in a fashion that is always adapted to the needs of the person that would like to pass the door, wherein, at the same time, the duration the door is opened can be reduced to avoid that heated air leaves the building.
It is also possible that the analysis unit 12 and/or the control unit 20 determines the number of objects, in particular persons, that pass the door 16 and their direction of movement, meaning that the analysis unit 12 or the control unit 20 can determine whether a person leaves or enters the building.
Thus, the analysis unit 12 or the control unit 20, respectively, can determine the number of objects, in particular persons, inside the building, and the length of stay of each object within the building.
This information can also be used to control the drive unit 22, e.g. to close the door after the last person has left the building, or it can be transmitted via an interface to, for example, a facility management system.
The door system 14 may also comprise a plurality of doors 16, wherein the numbers of objects, in particular persons, passing the various doors 16 are added up and subtracted from one another to obtain the number of persons within the building.
It is of course possible that the number of persons, the length of stay or the like can also be determined for various areas within the building or other areas that are confined by the door system 14.
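A simple occupancy bookkeeping across a plurality of doors 16 might be sketched as follows; the event format and all names are illustrative assumptions:

```python
import time

class OccupancyCounter:
    """Bookkeeping of objects inside the area confined by the door system."""

    def __init__(self) -> None:
        self.inside: dict[str, float] = {}  # object id -> entry timestamp

    def on_pass(self, object_id: str, entering: bool, now: float | None = None) -> None:
        """Called whenever any door of the system detects a passing object."""
        now = time.time() if now is None else now
        if entering:
            self.inside[object_id] = now
        else:
            self.inside.pop(object_id, None)

    def count(self) -> int:
        return len(self.inside)  # current number of objects within the area

    def length_of_stay(self, object_id: str, now: float | None = None) -> float:
        now = time.time() if now is None else now
        return now - self.inside[object_id]
```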

Claims
1. Method for operating an automatic door system (14) using an analysis unit (12), wherein the door system (14) comprises at least one door (16), at least one drive unit (22) for actuating the at least one door (16), a camera (24) and a control unit (20) for controlling the drive unit (22), wherein the camera (24) is an integral part of the safety functionality of the door (16) and monitors the track of the door (16), wherein the method comprises the following steps:
- capturing at least one recording by the camera (24), wherein the recording includes at least the area in front of the door (16),
- transmitting the recording to the analysis unit (12),
- recognizing at least one object and the type of the object in the recording by the analysis unit (12), and
- controlling the drive unit (22) based on the at least one recognized object and/or its type.
2. Method according to claim 1, characterized in that the at least one object is a person, an animal, a movable object or a stationary object.
3. Method according to claim 2, characterized in that the analysis unit (12) recognizes at least one individual property of the object in the recording, in particular wherein the individual property is one or more of the following properties:
- if the object is a person: age, size, ethnicity, clothes worn, items carried, type of locomotion, orientation, pose, viewing direction, level of attention, state of health, mood, current activity, performed gesture, behavior, gait and/or posture; and/or
- if the object is an animal: species, breed, whether the animal is a pet, whether the animal carries prey, whether the animal is on a leash and/or whether the animal is wearing a muzzle; and/or
- if the object is a movable object: whether the object is associated with and/or carried by a person.
4. Method according to any one of the preceding claims, characterized in that the analysis unit (12) recognizes at least one kinematic property of the at least one object in the recording, particularly a kinematic property of a kinematic subpart of the at least one object, in particular a kinematic property being the change of position, the velocity, the change of velocity, the acceleration, the change of acceleration, the position, the distance to the door and/or the direction of movement of the object and/or of its kinematic subpart.
5. Method according to any one of the preceding claims, characterized in that the analysis unit (12) recognizes at least one image processing property of the at least one object in the recording, in particular the bounding box (B), the closure, the timeline of properties and/or fusion of the object with other objects.
6. Method according to any one of the preceding claims, characterized in that a rule set, in particular comprising an evaluation logic, is stored in the analysis unit (12) and/or the control unit (20), the rule set, in particular the rule set and its evaluation logic, defining a plurality of conditions when the door (16) shall open or stay closed, in particular wherein the conditions include the presence or absence of a specific object, a specific type of object, a specific individual property, a specific kinematic property, a specific image processing property or any combination thereof, wherein the analysis unit (12) and/or the control unit (20) determines based on the rule set, in particular the rule set and its evaluation logic, and the type of the at least one object recognized in the recording and/or its at least one individual property, its at least one kinematic property and/or its at least one image processing property, whether the door (16) shall be opened and particularly operates the drive unit (22) accordingly.
7. Method according to claim 6, characterized in that the rule set comprises instructions, in particular definitions in the evaluation logic, that define whether and/or how the door (16) shall be opened if more than one condition of the rule set is met, in particular if the conditions that are met are in conflict with one another.
8. Method according to any one of the preceding claims, characterized in that the drive unit (22) is controlled according to or based on an actuation profile (P) for achieving a desired door movement, in particular a desired motion of at least one door leaf (18) of the door (16), wherein the actuation profile (P) is predetermined and selected based on the at least one object recognized in the recording and/or its type, its at least one individual property, its at least one kinematic property and/or its at least one image processing property; and/or wherein the actuation profile (P) is created or adapted based on the type of the at least one object recognized in the recording and/or its at least one individual property, its at least one kinematic property and/or its at least one image processing property.
9. Method according to claim 8, characterized in that the actuation profile (P) includes the acceleration and the velocity of the door (16), in particular the door leaf (18), at various positions during the movement; the desired traveling distance; the duration between the start of the movement and reaching various predetermined positions, optionally including the time of the start of the motion; and/or a minimal distance between one of the at least one objects to the door (16), to the door leaf (18), to the track of the door (16) and/or to the track of the door leaf (18).
10. Method according to claim 8 or 9, characterized in that the analysis unit (12) and/or the control unit (20) takes additional data into consideration for determining whether the door (16) shall be opened and/or for the determination of the actuation profile (P), in particular the additional data including the state of the door (16), the current time, the current weather conditions and/or the current number of persons within the building.
11. Method according to any one of the preceding claims, characterized in that the recording is a captured single image, a captured series of consecutive images and/or a video recording.
12. Method according to any one of the preceding claims, characterized in that the analysis unit (12) and/or control unit (20) receives at least one measurement value extracted from the data of the camera (24) and/or of at least one additional sensor (26) of the door system, in particular a distance sensor and/or a source of electromagnetic radiation, and uses the at least one measurement value for the recognition of the at least one object, the type of the at least one object and/or its individual, kinematic and/or image processing properties.
13. Method according to any one of the preceding claims, characterized in that the door system (14) and/or the building in which the door system (14) is installed comprises a signaling device (28), in particular an optical and/or acoustical signaling device, wherein the signaling device (28) is activated by the analysis unit (12) and/or the control unit (20) based on the type of the at least one object, of its at least one property and/or its individual, kinematic and/or image processing properties.
14. Method according to any one of the preceding claims, characterized in that the analysis unit (12) and/or the control unit (20) determines the number of objects, in particular persons, passing the door (16) and their direction of movement, obtaining the number of objects and/or the length of stay of the objects within the area confined by the door system (14) based on the determined number of objects passing the door (16), wherein the obtained number and/or the determined length of stay is used to control the drive unit (22) and/or the signaling device (28) or is transmitted via an interface.
15. Method according to any one of the preceding claims, characterized in that the recognition of the at least one object in the recording is performed by a deterministic algorithm, a machine learning algorithm, a support vector machine and/or a trained artificial neural network of the analysis unit.
16. Method according to claim 15, characterized in that the artificial neural network is trained using training data, wherein the training data comprises, for various training situations, input data of the same type and structure as the data which is fed to the artificial neural network during regular operation of the door system (14), and information about the expected correct output of the artificial neural network for the training situations; the training comprises the following training steps:
- feed forward of the input data through the artificial neural network;
- determining an answer output by the artificial neural network based on the input data,
- determining an error between the answer output of the artificial neural network and the expected correct output of the artificial neural network; and
- changing the weights of the artificial neural network by back-propagating the error through the artificial neural network, in particular wherein the input data includes recordings captured by the camera (24), at least one measurement value extracted from the data of the camera (24) and/or at least one measurement value of at least one additional sensor (26); wherein the information about the expected correct output includes information about the actual objects in the recording including their types as well as their individual, kinematic and/or image processing properties; and wherein the answer output includes the actual objects in the recording, their types, their properties and/or their individual, kinematic and/or image processing properties determined based on the input data.
17. System comprising an analysis unit (12) and an automatic door system (14) having at least one door (16), at least one drive unit (22) for actuating the at least one door (16), in particular at least one door leaf (18) of the door (16), a camera (24) and a control unit (20) for controlling the drive unit (22), wherein the system (10) is configured to carry out a method according to any one of the claims 1 to 16, in particular wherein the analysis unit (12) is part of the door system (14), for example part of the control unit (20) and/or an image processing unit of the camera (24).
18. System according to claim 17, characterized in that the camera (24) is a single camera, a stereo camera, a time of flight 3D camera, an event camera or a plurality of cameras; and/or wherein the door system (14) comprises an additional sensor (26), in particular a distance sensor and/or a source of electromagnetic radiation, whose field of view overlaps with the field of view (F) of the camera (24).
PCT/EP2022/075832 2021-09-23 2022-09-16 Method for operating an automatic door system as well as system having an automatic door system WO2023046599A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE2130251-8 2021-09-23
SE2130251 2021-09-23

Publications (1)

Publication Number Publication Date
WO2023046599A1 true WO2023046599A1 (en) 2023-03-30

Family

ID=83400752

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/075832 WO2023046599A1 (en) 2021-09-23 2022-09-16 Method for operating an automatic door system as well as system having an automatic door system

Country Status (1)

Country Link
WO (1) WO2023046599A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200024885A1 (en) * 2017-03-30 2020-01-23 Assa Abloy Entrance Systems Ab Door operator
US20190226265A1 (en) * 2018-01-19 2019-07-25 Cehan Ahmad Child-Safe Automatic Doors
WO2021049636A1 (en) * 2019-09-13 2021-03-18 Necソリューションイノベータ株式会社 Device for assisting in passing through automatic door, pass-through assistance method, program, and recording medium
US20220341247A1 (en) * 2019-09-13 2022-10-27 Nec Solution Innovators, Ltd. Automatic door passage assistance apparatus, passage assistance method, program, and recording medium

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230175307A1 (en) * 2018-12-21 2023-06-08 Inventio Ag Access control system with sliding door with a gesture control function

Similar Documents

Publication Publication Date Title
US11919531B2 (en) Method for customizing motion characteristics of an autonomous vehicle for a user
KR102302239B1 (en) Method of controlling cart-robot in restricted area and cart-robot of implementing thereof
KR20190106867A (en) An artificial intelligence apparatus for guiding arrangement location of furniture and operating method thereof
DE112017001573T5 (en) Autonomous robot performing a welcome action
US20210405646A1 (en) Marker, method of moving in marker following mode, and cart-robot implementing method
WO2023046599A1 (en) Method for operating an automatic door system as well as system having an automatic door system
WO2013011543A1 (en) Autonomous locomotion equipment and control method therefor
Tammvee et al. Human activity recognition-based path planning for autonomous vehicles
KR102615685B1 (en) Method for estimating location by synchronizing multi-sensors and robot implementing it
KR102300574B1 (en) Method of moving in power assist mode reflecting physical characteristic of user and cart-robot of implementing thereof
AU2022349768A1 (en) Method for operating an automatic door system as well as system having an automatic door system
US20220341247A1 (en) Automatic door passage assistance apparatus, passage assistance method, program, and recording medium
Simmons et al. Training a remote-control car to autonomously lane-follow using end-to-end neural networks
KR20210083812A (en) Autonomous mobile robots and operating method thereof
KR20210026595A (en) Method of moving in administrator mode and robot of implementing thereof
WO2018069334A1 (en) Situation-dependent control of a closing body
WO2020129312A1 (en) Guidance robot control device, guidance system in which same is used, and guidance robot control method
JP7308442B2 (en) Information processing method, information processing device, and information processing system
CN113467462A (en) Pedestrian accompanying control method and device for robot, mobile robot and medium
KR20190095196A (en) Marker for space recogition, method of moving and lining up cart-robot based on space recognition and cart-robot of implementing thereof
KR102619956B1 (en) SAFETY SYSTEM FOR AUTOMATICALLY OPENING AND CLOSING SAFETY DEVICE USING ToF SENSOR AND METHOD FOR CONTROLLING THEREOF
WO2018069354A2 (en) Stereometric object flow interaction
Schmuck et al. Training networks separately on static and dynamic obstacles improves collision avoidance during indoor robot navigation.
US20140333763A1 (en) Method and system for controlling access using a smart optical sensor
KR102446670B1 (en) Ai based non-contact elevator control system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22773745

Country of ref document: EP

Kind code of ref document: A1