WO2024049462A1 - Synchronous user authentication and personalization of human-machine interfaces - Google Patents


Info

Publication number
WO2024049462A1
Authority
WO
WIPO (PCT)
Prior art keywords
face
hmi
detected face
hmi system
input video
Prior art date
Application number
PCT/US2022/075570
Other languages
French (fr)
Inventor
Mayuri DESHPANDE
Anant Kumar Mishra
Original Assignee
Siemens Corporation
Priority date
Filing date
Publication date
Application filed by Siemens Corporation filed Critical Siemens Corporation
Priority to PCT/US2022/075570 priority Critical patent/WO2024049462A1/en
Publication of WO2024049462A1 publication Critical patent/WO2024049462A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40Spoof detection, e.g. liveness detection
    • G06V40/45Detection of the body part being alive
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0861Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2141Access rights, e.g. capability lists, access control lists, access tables, access matrices

Abstract

A human-machine interface (HMI) system defines a camera and a plurality of modules. The HMI system can capture a video feed of a production area in the vicinity of the HMI system, so as to define an input video. The HMI system can detect a face in the input video, so as to define a detected face. Furthermore, the HMI system can determine whether the detected face in the input video represents a spoof attack. When it is determined that the detected face does not represent the spoof attack, the HMI system can determine whether the face belongs to an authorized user of the HMI system. In an example, after determining that the face belongs to an authorized user of the HMI system, the HMI system can allow access to the production area corresponding to particular access rights of the authorized user.

Description

SYNCHRONOUS USER AUTHENTICATION AND PERSONALIZATION OF HUMAN-MACHINE INTERFACES
BACKGROUND
[0001] Many industrial processes and machinery are monitored and controlled by operators or engineers using human-machine interface (HMI) screens. HMI screens can display information related to operational statuses of components. In some cases, HMI screens can render operational controls so that a user can control one or more monitored processes or components via the HMI screen. In many manufacturing environments, operators frequently access and monitor HMI displays, for example, to understand different machine parameters and information associated with various tasks. It is recognized herein that such frequent accessing of HMI screens can cost time associated with logging into the system and authenticating the user, among other inefficiencies.
BRIEF SUMMARY
[0002] Embodiments of the invention address and overcome one or more of the shortcomings described herein by providing methods, systems, and apparatuses that automatically authenticate and authorize users of human-machine interface (HMI) screens. Furthermore, in various examples, spoof attacks on a given HMI screen can be identified.
[0003] In an example aspect, a human-machine interface (HMI) system defines a camera and a plurality of modules. The HMI system can capture a video feed of a production area in the vicinity of the HMI system, so as to define an input video. The HMI system can detect a face in the input video, so as to define a detected face. Furthermore, the HMI system can determine whether the detected face in the input video represents a spoof attack. When it is determined that the detected face does not represent the spoof attack, the HMI system can determine whether the face belongs to an authorized user of the HMI system. In an example, after determining that the face belongs to an authorized user of the HMI system, the HMI system can allow access to the production area corresponding to particular access rights of the authorized user. In another example aspect, the HMI system can determine whether a mask is on the detected face. When it is determined that the mask is on the detected face, the HMI system can search a database of masked faces, and match the detected face to a masked face in the database of masked faces. To detect or determine whether the detected face represents the spoof attack, the HMI system can determine whether an eye of the detected face blinks, and whether a mouth of the detected face moves.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
[0004] The foregoing and other aspects of the present invention are best understood from the following detailed description when read in connection with the accompanying drawings. For the purpose of illustrating the invention, there is shown in the drawings embodiments that are presently preferred, it being understood, however, that the invention is not limited to the specific instrumentalities disclosed. Included in the drawings are the following Figures:
[0005] FIG. 1 is a block diagram of an example control system that includes a plurality of human-machine interface (HMI) screens, in accordance with an example embodiment.
[0006] FIG. 2 depicts an example computing system configured to identify authentic users of human-machine interface (HMI) screens, in accordance with an example embodiment.
[0007] FIG. 3 depicts example computations performed by a liveness detection module of the computing system of FIG. 2, for detecting whether an eye blinks, in accordance with an example embodiment.
[0008] FIG. 4 depicts further example computations performed by the liveness detection module, for detecting whether a mouth moves, in accordance with an example embodiment.
[0009] FIG. 5 is a flow diagram that depicts operations that can be performed by the computing system depicted in FIG. 2, in accordance with another example embodiment.
[0010] FIG. 6 illustrates a computing environment within which embodiments of the disclosure may be implemented.
DETAILED DESCRIPTION
[0011] As an initial matter, a human-machine interface (HMI) screen may refer to a panel having one or more physical displays. Thus, unless otherwise specified, display, screen, and panel may be used interchangeably herein, without limitation. A given HMI screen may define a small text-based display or a large panel having key or touch support. Example HMI screens further include, without limitation, PC-based panels and high-resolution displays (e.g., televisions) associated with Supervisory Control and Data Acquisition (SCADA) servers.
[0012] By way of background, HMI panels can define user interfaces for users to interact with various systems via the HMI panels. User administration generally refers to functionality that provides protection against unauthorized users accessing a given HMI panel. For example, users can have various roles and access rights based on their respective role. It is recognized herein, however, that such user administration can limit the efficiency and flexibility of operators or users. For example, in some cases, some areas of a given plant are restricted to certain operators, such that only certain operators have access to those areas. By way of example, a given system can define export control operational setpoints that restrict areas to specific operators. Enforcing such setpoints and maintaining security in an efficient manner can define a technical challenge. To further challenge user authentication and user administration, it is recognized herein that operators often have contaminated hands (e.g., oil, grease, etc. from CNC machines) when they need to interact with or operate HMI display panels. Further still, it is recognized herein that operators often control robotic operations in crowded environments that require their hands to operate on joysticks or the like, so as to further complicate efficient operation of HMI displays. It is further recognized herein that shop floor operators often wear gloves, thereby further complicating access to HMI displays. In various example embodiments, HMI displays can recognize operators, for instance various facial features of operators, in real-time so as to provide real-time information and accesses that are associated with particular users.
[0013] Referring initially to FIG. 1, an example industrial control system (ICS) 100 includes various HMI screens that can be implemented in accordance with embodiments described herein. The example system 100 includes an office or corporate IT network 102 and an operational plant or production network 104 communicatively coupled to the IT network 102. It will be understood that the ICS 100 is illustrated and simplified as an example, and HMI screens can be implemented in other domains having other configurations, and all such systems are contemplated as being within the scope of this disclosure. For example, embodiments of the control system can define an operational technology system, an energy generation system (e.g., wind parks, solar parks, etc.), or an energy distribution network.
[0014] The production network 104 can include an abstraction engine 106 that is connected to the IT network 102. The production network 104 can include various production machines configured to work together to perform one or more manufacturing operations. Example production machines of the production network 104 can include, without limitation, robots 108 and other field devices, such as sensors 110, actuators 112, or other machines, which can be controlled by a respective PLC 114. The PLC 114 can send instructions to respective field devices. In some cases, a given PLC 114 can be coupled to one or more human machine interfaces (HMIs) 116.
[0015] The ICS 100, in particular the production network 104, can define a fieldbus portion 118 and an Ethernet portion 120. For example, the fieldbus portion 118 can include the robots 108, PLC 114, sensors 110, actuators 112, and HMIs 116. The fieldbus portion 118 can define one or more production cells or control zones. The fieldbus portion 118 can further include a data extraction node 115 that can be configured to communicate with a given PLC 114 and sensors 110.
[0016] The PLC 114, data extraction node 115, sensors 110, actuators 112, and HMI 116 within a given production cell can communicate with each other via a respective field bus 122. Each control zone can be defined by a respective PLC 114, such that the PLC 114, and thus the corresponding control zone, can connect to the Ethernet portion 120 via an Ethernet connection 124. The robots 108 can be configured to communicate with other devices within the fieldbus portion 118 via a WiFi connection 126. Similarly, the robots 108 can communicate with the Ethernet portion 120, in particular a Supervisory Control and Data Acquisition (SCADA) server 128, via the WiFi connection 126. The Ethernet portion 120 of the production network 104 can include various computing devices communicatively coupled together via the Ethernet connection 124. Example computing devices in the Ethernet portion 120 include, without limitation, a mobile data collector 130, HMIs 132, the SCADA server 128, the abstraction engine 106, a wireless router 134, a manufacturing execution system (MES) 136, an engineering system (ES) 138, and a log server 140. The ES 138 can include one or more engineering workstations. In an example, the MES 136, HMIs 132, ES 138, and log server 140 are connected to the production network 104 directly. The wireless router 134 can also connect to the production network 104 directly. Thus, in some cases, mobile users, for instance the mobile data collector 130 and robots 108, can connect to the production network 104 via the wireless router 134. In some cases, by way of example, the ES 138 and the mobile data collector 130 define guest devices that are allowed to connect to the abstraction engine 106. The abstraction engine 106 can be configured to collect or obtain historical project information.
[0017] Example users of the ICS 100 include, for example and without limitation, operators of an industrial plant or engineers that can update the control logic of a plant. By way of an example, an operator can interact with the HMIs 132, which may be located in a control room of a given plant. Alternatively, or additionally, an operator can interact with HMIs of the ICS 100 that are located remotely from the production network 104. Similarly, for example, engineers can use the HMIs 116 that can be located in an engineering room of the ICS 100. Alternatively, or additionally, an engineer can interact with HMIs of the ICS 100 that are located remotely from the production network 104.
[0018] Referring also to FIG. 2, an example computing or HMI system 200 can be defined by one or more of the HMIs 116 and 132. The HMI system 200 can be configured to identify users or operators in real-time. The HMI system 200 can include one or more processors and memory having stored thereon applications, agents, and computer program modules including, for example, a camera or sensor 202, a face detection module 204 communicatively coupled to the camera or sensor 202, a liveness detection module 206 communicatively coupled to the camera or sensor 202, a texture detection module 208 communicatively coupled to the camera or sensor 202, and an operator recognition module 210 communicatively coupled to the camera or sensor 202.
[0019] It will be appreciated that the program modules, applications, computer-executable instructions, code, or the like depicted in FIG. 2 are merely illustrative and not exhaustive, and that processing described as being supported by any particular module may alternatively be distributed across multiple modules or performed by a different module. In addition, various program module(s), script(s), plug-in(s), Application Programming Interface(s) (API(s)), or any other suitable computer-executable code may be provided to support functionality provided by the program modules, applications, or computer-executable code depicted in FIG. 2 and/or additional or alternate functionality. Further, functionality may be modularized differently such that processing described as being supported collectively by the collection of program modules depicted in FIG. 2 may be performed by a fewer or greater number of modules, or functionality described as being supported by any particular module may be supported, at least in part, by another module. In addition, program modules that support the functionality described herein may form part of one or more applications executable across any number of systems or devices in accordance with any suitable computing model such as, for example, a client-server model, a peer-to-peer model, and so forth. In addition, any of the functionality described as being supported by any of the program modules depicted in FIG. 2 may be implemented, at least partially, in hardware and/or firmware across any number of devices.
[0020] Still referring to FIG. 2, in accordance with various embodiments, existing user authentication at a factory floor, for instance within the production network 104, can be enhanced by real-time image processing that can be performed by the HMI system 200, as further described herein. In some cases, the camera 202 can define an RGB camera or depth sensor integrated into an existing HMI panel, thereby defining the HMI system 200 that is flexible with respect to privacy and integrity (e.g., authentication and authorization) mechanisms. In an example, the camera 202 operates so as to continuously capture images or a video feed in the vicinity of the camera 202. In particular, for example, the camera 202 can capture movement continuously in a given production area, so as to capture a video feed of human movement in the production area. The video feed can be processed by the face detection module 204. In particular, the face detection module 204 can process the video feed so as to detect faces within the video feed that includes various video frames. In some cases, the face detection module 204 can detect faces that are within a given range of the HMI system 200. For example, the face detection module 204 can define a deep learning model that is trained to detect and extract features from images or video frames associated with faces. By extracting such features, the face detection module 204 can locate one or more faces in each video frame. In some examples, the face detection module 204 can identify a given face that is detected, so as to authenticate a user. For example, the face detection module 204 can compare features of a detected face with features stored in a database. When the match percentage is over a predetermined threshold, a user or operator might be authenticated, thereby providing the user or operator with access to the HMI system 200 or production area corresponding to their respective rights.
The features or faces stored in the database may be associated with operators that have different access rights to the given HMI system 200. Alternatively, or additionally, when the comparison of detected features or faces does not match features or faces in the database over the required threshold, the face detection module 204 can restrict or deny access to the associated HMI system 200 or production area. Thus, the face detection module 204 can define a face authentication or face recognition system.
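By way of a non-limiting illustration, the threshold comparison described above can be sketched as follows. This is a minimal sketch, assuming the deep learning model has already extracted face features into fixed-length embedding vectors; the function names, the cosine similarity metric, and the example threshold of 0.6 are illustrative assumptions rather than details disclosed herein.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity between two face-feature vectors, in [-1, 1].
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def authenticate(query: np.ndarray,
                 enrolled: dict[str, np.ndarray],
                 threshold: float = 0.6) -> str | None:
    # Compare the detected face's features against each enrolled
    # operator; authenticate only when the best match clears the
    # predetermined threshold, otherwise deny access (return None).
    best_id, best_score = None, -1.0
    for operator_id, reference in enrolled.items():
        score = cosine_similarity(query, reference)
        if score > best_score:
            best_id, best_score = operator_id, score
    return best_id if best_score >= threshold else None
```

Returning None here corresponds to the restrict-or-deny branch described above, in which no enrolled operator matches over the required threshold.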
[0021] With continuing reference to FIG. 2, the liveness detection module 206 can process images or video feeds so as to protect the HMI system 200, and thus the ICS 100, from nefarious users. In particular, the liveness detection module 206 can protect the system 100 against a nefarious user who attempts to circumvent the face authentication system by, for example, placing a photo or a video replay of another user in front of the camera 202. By way of example, without the liveness detection module 206, in some cases, the face represented in the image or video replay might be detected and authenticated by the face detection module 204, which can allow an unauthorized user (e.g., the user placing the image or video in front of the camera 202) to access the HMI system 200 or production area via circumventing or spoofing the face recognition system. Thus, as further described herein, the liveness detection module 206 can define an extra layer of protection that can differentiate between legitimate and illegitimate faces so as to detect fakes.
[0022] Referring to FIGs. 3 and 4, based on a video feed from the camera 202, the liveness detection module 206 can determine whether the video feed is a spoof or is authentic. In particular, the liveness detection module 206 can identify an eye in the video feed, and can determine whether the identified eye blinks. By way of example, referring in particular to FIG. 3, the liveness detection module 206 can identify a plurality of reference points 302 from one or more video frames, for instance first and second video frames 301a and 301b. The reference points 302 can define locations of the eye in the video feed. Thus, various pairs of reference points 302 can define distances between the reference points, for instance first and second distances 304 and 306, respectively. In some cases, the liveness detection module 206 defines a neural network or machine learning component configured to learn eye images, so as to detect whether a given eye blinks. For example, the location of the reference points 302 and their associated distances, for instance the first and second distances 304 and 306, respectively, can be input into the neural network or machine learning component. In particular, for example, the liveness detection module can encode values 308 over time associated with the reference points 302, so as to detect the occurrence of a blink.
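One common way to realize the blink computation described above is an eye aspect ratio, which relates the vertical distances between eye reference points (e.g., the first and second distances 304 and 306) to the horizontal width of the eye. The sketch below is an illustrative assumption, not the disclosed model: it presumes six landmarks per eye in the conventional 68-point ordering, and the 0.2 closed-eye threshold and minimum run length are example values.

```python
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    # eye: (6, 2) array of reference points around one eye.
    # The ratio falls toward zero while the eye is closed.
    v1 = np.linalg.norm(eye[1] - eye[5])  # first vertical distance (cf. 304)
    v2 = np.linalg.norm(eye[2] - eye[4])  # second vertical distance (cf. 306)
    h = np.linalg.norm(eye[0] - eye[3])   # horizontal eye width
    return float((v1 + v2) / (2.0 * h))

def blink_detected(ear_per_frame: list[float],
                   closed_threshold: float = 0.2,
                   min_closed_frames: int = 2) -> bool:
    # Encode the ratio over time across video frames; a short run of
    # below-threshold values indicates the eye closed, i.e., a blink.
    run = 0
    for ear in ear_per_frame:
        run = run + 1 if ear < closed_threshold else 0
        if run >= min_closed_frames:
            return True
    return False
```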
[0023] When an eye blink is detected by the liveness detection module 206, the liveness detection module 206 can determine that the video feed is not defined by a photograph being placed in front of the camera 202. Similarly, the liveness detection module 206 can identify a mouth in the video feed, and can determine whether there is movement associated with the identified mouth. If the identified mouth moves, the liveness detection module 206 can determine that the identified mouth is not a photograph of a mouth displayed in front of the camera 202. By way of example, referring in particular to FIG. 4, the liveness detection module 206 can identify a plurality of reference points (e.g., reference points 1-68) that define locations of a face in a given video feed. In particular, the liveness detection module 206 can identify reference points (e.g., reference points 49-67) that define locations of a mouth of the face. The liveness detection module 206 can encode values over time associated with the reference points 49-67, so as to detect whether the mouth moves.
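Mouth movement can be encoded analogously. The following sketch, under the same 68-point landmark convention referenced above (mouth points 49-67, 1-based), tracks a normalized mouth-opening value over time and flags movement when that value varies; the specific landmark indices and the variance threshold are assumptions for illustration.

```python
import numpy as np

def mouth_opening(landmarks: np.ndarray) -> float:
    # landmarks: (68, 2) face reference points, 0-based indexing.
    # Inner-lip points 63 and 67 (1-based) give a top/bottom pair;
    # normalizing by mouth width makes the value scale-invariant.
    top, bottom = landmarks[62], landmarks[66]
    left, right = landmarks[48], landmarks[54]   # mouth corners
    return float(np.linalg.norm(top - bottom) / np.linalg.norm(left - right))

def mouth_moves(openings_over_time: list[float],
                variance_threshold: float = 1e-3) -> bool:
    # A static photograph yields a near-constant opening value over
    # the frames; genuine speech or movement produces variance above
    # the threshold.
    return float(np.var(openings_over_time)) > variance_threshold
```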
[0024] Referring again to FIG. 2, in various examples, after the liveness detection module 206 and texture detection module 208 determine that a given image or video feed is not a spoof, the operator recognition module 210 can process the given image or video feed so as to identify a user or operator depicted in the given image or video feed. In particular, in some cases, the operator recognition module 210 can recognize or identify users who are wearing masks that cover their nose and mouth, for instance surgical masks commonly worn during the Covid-19 pandemic. The operator recognition module 210 can detect users with masks and authenticate such users based on the eyes, forehead features, and aspect ratios. In particular, the operator recognition module 210 can authenticate users based on a region of the face that includes the eyes and forehead.
[0025] In various examples, the operator recognition module 210 defines a deep learning model that is trained to classify users with and without masks. For example, if a given user is not wearing a mask, the operator recognition module 210 can perform cosine similarity techniques, based on user images stored in a database, so as to authenticate the user. If the user is detected with a mask on, in some examples, the operator recognition module 210 can further analyze the associated image or video. In particular, the deep learning model of the operator recognition module 210 can process the features associated with the forehead or eyes depicted in the image or video, so as to recognize the user. Additionally, or alternatively, the HMI system 200 can recognize or identify a task that the operator in front of the camera 202 is performing, or intends to perform. Responsive to identifying the operator or task, the HMI system 200 can render or display relevant insights associated with continuing or completing the task.
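A minimal sketch of this masked/unmasked routing follows, reusing authenticate() from the earlier sketch. The eye-and-forehead crop, the landmark indices, and the embed() feature extractor are illustrative assumptions; the embodiments are not limited to this particular region construction.

```python
import numpy as np

def eye_forehead_crop(face_img: np.ndarray, landmarks: np.ndarray) -> np.ndarray:
    # Keep the region spanning the forehead and eyes (the part of the
    # face a surgical mask leaves visible), using 68-point landmarks.
    brow_top = int(landmarks[17:27, 1].min())     # top of the eyebrows
    nose_bridge = int(landmarks[27:31, 1].max())  # bottom of the nose bridge
    top = max(0, brow_top - (nose_bridge - brow_top))  # extend up for forehead
    return face_img[top:nose_bridge, :]

def recognize_operator(face_img, landmarks, mask_on, embed, db_masked, db_full):
    # Route to the masked or unmasked gallery; embed() is an assumed
    # feature extractor mapping an image crop to an embedding vector.
    if mask_on:
        query = embed(eye_forehead_crop(face_img, landmarks))
        return authenticate(query, db_masked)  # cosine match, as above
    return authenticate(embed(face_img), db_full)
```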
[0026] By way of example, the HMI system 200 might proactively change from one screen to another based on the detection of an intention of the user. For instance, an alarm can be issued for a given section of the plant. The user might react in a way that indicates that they are concerned about the alarm, and the HMI system 200 may then jump to the HMI screen that details the section of the plant that originated the alarm. By way of further example, the camera 202 can define perception sensors that capture facial features in real-time, so that operators or tasks can be recognized or identified. Such recognition can help in seamlessly continuing work from a previous session, for instance when a new operator begins a shift. Such task identifications can also result in the HMI system 200 recommending new tasks associated with the identified task, displaying reminders for an upcoming task, identifying and displaying changes to data, providing information related to machine repairs or other data records, and the like.
[0027] Referring now to FIG. 5, example operations 600 can be performed by the HMI system 200. At 602, video can be captured by the camera 202, so as to define input video. At 604, the face detection module 204 can determine whether a face is depicted in the input video. At 606, when the face detection module 204 does not identify a face in the input video, the face detection module 204 can determine that no operator is in view of the camera 202, and can return to processing other input video or images. At 608, when the face detection module 204 detects a face so as to define a detected face, the input video can be sent to the liveness detection module 206 and the texture detection module 208 for further processing. At 610, the liveness detection module 206 can identify at least one eye of the detected face, and can determine whether the eye blinks. At 612, the liveness detection module 206 can identify a mouth of the face, and can determine whether the mouth moves. At 614, based on the input video, the texture detection module 208 can determine a color space histogram concatenation. At 616, based on the determination associated with eye movement (at 610), the determination associated with mouth movement (at 612), and the color space histogram concatenation (at 614), the HMI system 200 can determine whether the input video represents a spoof attack. For example, the results of the individual checks can be combined into an overall spoof-attack determination via Boolean logic operations.
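By way of illustration, the color space histogram concatenation (at 614) and the Boolean combination (at 616) might be sketched as follows, assuming OpenCV is available. The choice of HSV and YCrCb color spaces, the bin count and ranges, and the particular Boolean expression are assumptions; in practice the concatenated histogram would feed a trained texture classifier whose output is the texture_real flag.

```python
import cv2
import numpy as np

def color_space_histograms(bgr_frame: np.ndarray, bins: int = 32) -> np.ndarray:
    # Concatenate normalized per-channel histograms from two color
    # spaces; printed photos and screen replays tend to exhibit
    # distinctive texture statistics in such representations.
    # (Bin count and value ranges are simplified for illustration.)
    features = []
    for code in (cv2.COLOR_BGR2HSV, cv2.COLOR_BGR2YCrCb):
        converted = cv2.cvtColor(bgr_frame, code)
        for channel in range(3):
            hist = cv2.calcHist([converted], [channel], None, [bins], [0, 256])
            features.append(cv2.normalize(hist, hist).flatten())
    return np.concatenate(features)

def is_spoof(blink_seen: bool, mouth_moved: bool, texture_real: bool) -> bool:
    # Boolean combination of the checks at 610, 612, and 614: treat the
    # input as a spoof unless a liveness cue is present and the texture
    # classifier judged the frame to come from a genuine face.
    return not ((blink_seen or mouth_moved) and texture_real)
```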
[0028] With continuing reference to FIG. 5, based on the results of the processing performed by the liveness detection module 206 and the texture detection module 208, a spoof attack can be detected in the input video, at 618. Responsive to the spoof being detected, the HMI system can send a report or an alert to the production system or appropriate security personnel within the production system, for instance within the ICS 100. When no spoof is detected in the input video, the input video can be sent to the operator recognition module 210, at 622. At 624, the operator recognition module 210 can determine whether a mask is on the detected face. When a mask is detected on the detected face, the operator recognition module 210 can search a database 628 of images of masked operators, so as to match (at 626) an operator in the input video with an operator in the database 628, thereby identifying or authenticating the operator in the input video. When a mask is not detected on the detected face, the operator recognition module 210 can search a database 632 of images of faces of operators, so as to match (at 630) an operator in the input video with an operator in the database 632, thereby identifying or authenticating the operator in the input video.
[0029] Thus, as described herein, a human-machine interface (HMI) system defines a camera and a plurality of modules. The HMI system can capture a video feed of a production area in the vicinity of the HMI system, so as to define an input video. The HMI system can detect a face in the input video, so as to define a detected face. Furthermore, the HMI system can determine whether the detected face in the input video represents a spoof attack. When it is determined that the detected face does not represent the spoof attack, the HMI system can determine whether the face belongs to an authorized user of the HMI system. In an example, after determining that the face belongs to an authorized user of the HMI system, the HMI system can allow access to the production area corresponding to particular access rights of the authorized user. In another example aspect, the HMI system can determine whether a mask is on the detected face. When it is determined that the mask is on the detected face, the HMI system can search a database of masked faces, and match the detected face to a masked face in the database of masked faces. To detect or determine whether the detected face represents the spoof attack, the HMI system can determine whether an eye of the detected face blinks, and whether a mouth of the detected face moves.
[0030] FIG. 6 illustrates an example of a computing environment within which embodiments of the present disclosure may be implemented. A computing environment 700 includes a computer system 710 that may include a communication mechanism such as a system bus 721 or other communication mechanism for communicating information within the computer system 710. The computer system 710 further includes one or more processors 720 coupled with the system bus 721 for processing the information. HMI screens, panels, or displays may include, or be coupled to, the one or more processors 720.
[0031] The processors 720 may include one or more central processing units (CPUs), graphical processing units (GPUs), or any other processor known in the art. More generally, a processor as described herein is a device for executing machine-readable instructions stored on a computer readable medium, for performing tasks, and may comprise any one or combination of hardware and firmware. A processor may also comprise memory storing machine-readable instructions executable for performing tasks. A processor acts upon information by manipulating, analyzing, modifying, converting or transmitting information for use by an executable procedure or an information device, and/or by routing the information to an output device. A processor may use or comprise the capabilities of a computer, controller or microprocessor, for example, and be conditioned using executable instructions to perform special purpose functions not performed by a general purpose computer. A processor may include any type of suitable processing unit including, but not limited to, a central processing unit, a microprocessor, a Reduced Instruction Set Computer (RISC) microprocessor, a Complex Instruction Set Computer (CISC) microprocessor, a microcontroller, an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), a System-on-a-Chip (SoC), a digital signal processor (DSP), and so forth. Further, the processor(s) 720 may have any suitable microarchitecture design that includes any number of constituent components such as, for example, registers, multiplexers, arithmetic logic units, cache controllers for controlling read/write operations to cache memory, branch predictors, or the like. The microarchitecture design of the processor may be capable of supporting any of a variety of instruction sets. A processor may be coupled (electrically and/or as comprising executable components) with any other processor enabling interaction and/or communication therebetween. A user interface processor or generator is a known element comprising electronic circuitry or software or a combination of both for generating display images or portions thereof. A user interface comprises one or more display images enabling user interaction with a processor or other device.
[0032] The system bus 721 may include at least one of a system bus, a memory bus, an address bus, or a message bus, and may permit exchange of information (e.g., data (including computer-executable code), signaling, etc.) between various components of the computer system 710. The system bus 721 may include, without limitation, a memory bus or a memory controller, a peripheral bus, an accelerated graphics port, and so forth. The system bus 721 may be associated with any suitable bus architecture including, without limitation, an Industry Standard Architecture (ISA), a Micro Channel Architecture (MCA), an Enhanced ISA (EISA), a Video Electronics Standards Association (VESA) architecture, an Accelerated Graphics Port (AGP) architecture, a Peripheral Component Interconnects (PCI) architecture, a PCI-Express architecture, a Personal Computer Memory Card International Association (PCMCIA) architecture, a Universal Serial Bus (USB) architecture, and so forth.
[0033] Continuing with reference to FIG. 6, the computer system 710 may also include a system memory 730 coupled to the system bus 721 for storing information and instructions to be executed by processors 720. The system memory 730 may include computer readable storage media in the form of volatile and/or nonvolatile memory, such as read only memory (ROM) 731 and/or random access memory (RAM) 732. The RAM 732 may include other dynamic storage device(s) (e.g., dynamic RAM, static RAM, and synchronous DRAM). The ROM 731 may include other static storage device(s) (e.g., programmable ROM, erasable PROM, and electrically erasable PROM). In addition, the system memory 730 may be used for storing temporary variables or other intermediate information during the execution of instructions by the processors 720. A basic input/output system 733 (BIOS) containing the basic routines that help to transfer information between elements within computer system 710, such as during start-up, may be stored in the ROM 731. RAM 732 may contain data and/or program modules that are immediately accessible to and/or presently being operated on by the processors 720. System memory 730 may additionally include, for example, operating system 734, application programs 735, and other program modules 736. Application programs 735 may also include a user portal for development of the application program, allowing input parameters to be entered and modified as necessary.
[0034] The operating system 734 may be loaded into the memory 730 and may provide an interface between other application software executing on the computer system 710 and hardware resources of the computer system 710. More specifically, the operating system 734 may include a set of computer-executable instructions for managing hardware resources of the computer system 710 and for providing common services to other application programs (e.g., managing memory allocation among various application programs). In certain example embodiments, the operating system 734 may control execution of one or more of the program modules depicted as being stored in the data storage 740. The operating system 734 may include any operating system now known or which may be developed in the future including, but not limited to, any server operating system, any mainframe operating system, or any other proprietary or non-proprietary operating system.
[0035] The computer system 710 may also include a disk/media controller 743 coupled to the system bus 721 to control one or more storage devices for storing information and instructions, such as a magnetic hard disk 741 and/or a removable media drive 742 (e.g., floppy disk drive, compact disc drive, tape drive, flash drive, and/or solid state drive). Storage devices 740 may be added to the computer system 710 using an appropriate device interface (e.g., a small computer system interface (SCSI), integrated device electronics (IDE), Universal Serial Bus (USB), or FireWire). Storage devices 741, 742 may be external to the computer system 710.
[0036] The computer system 710 may also include a field device interface 765 coupled to the system bus 721 to control a field device 766, such as a device used in a production line. The computer system 710 may include a user input interface or GUI 761, which may comprise one or more input devices, such as a keyboard, touchscreen, tablet and/or a pointing device, for interacting with a computer user and providing information to the processors 720.
[0037] The computer system 710 may perform a portion or all of the processing steps of embodiments of the invention in response to the processors 720 executing one or more sequences of one or more instructions contained in a memory, such as the system memory 730. Such instructions may be read into the system memory 730 from another computer readable medium of storage 740, such as the magnetic hard disk 741 or the removable media drive 742. The magnetic hard disk 741 (or solid state drive) and/or removable media drive 742 may contain one or more data stores and data files used by embodiments of the present disclosure. The data store 740 may include, but is not limited to, databases (e.g., relational, object-oriented, etc.), file systems, flat files, distributed data stores in which data is stored on more than one node of a computer network, peer-to-peer network data stores, or the like. The data stores may store various types of data such as, for example, skill data, sensor data, or any other data generated in accordance with the embodiments of the disclosure. Data store contents and data files may be encrypted to improve security. The processors 720 may also be employed in a multi-processing arrangement to execute the one or more sequences of instructions contained in system memory 730. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions. Thus, embodiments are not limited to any specific combination of hardware circuitry and software.
[0038] As stated above, the computer system 710 may include at least one computer readable medium or memory for holding instructions programmed according to embodiments of the invention and for containing data structures, tables, records, or other data described herein. The term “computer readable medium” as used herein refers to any medium that participates in providing instructions to the processors 720 for execution. A computer readable medium may take many forms including, but not limited to, non-transitory, non-volatile media, volatile media, and transmission media. Non-limiting examples of non-volatile media include optical disks, solid state drives, magnetic disks, and magneto-optical disks, such as magnetic hard disk 741 or removable media drive 742. Non-limiting examples of volatile media include dynamic memory, such as system memory 730. Non-limiting examples of transmission media include coaxial cables, copper wire, and fiber optics, including the wires that make up the system bus 721. Transmission media may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
[0039] Computer readable medium instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
[0040] Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, may be implemented by computer readable medium instructions.
[0041] The computing environment 700 may further include the computer system 710 operating in a networked environment using logical connections to one or more remote computers, such as remote computing device 780. The network interface 770 may enable communication, for example, with other remote devices 780 or systems and/or the storage devices 741, 742 via the network 771. Remote computing device 780 may be a personal computer (laptop or desktop), a mobile device, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to computer system 710. When used in a networking environment, computer system 710 may include modem 772 for establishing communications over a network 771, such as the Internet. Modem 772 may be connected to system bus 721 via user network interface 770, or via another appropriate mechanism.
[0042] Network 771 may be any network or system generally known in the art, including the Internet, an intranet, a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a direct connection or series of connections, a cellular telephone network, or any other network or medium capable of facilitating communication between computer system 710 and other computers (e.g., remote computing device 780). The network 771 may be wired, wireless or a combination thereof. Wired connections may be implemented using Ethernet, Universal Serial Bus (USB), RJ-6, or any other wired connection generally known in the art. Wireless connections may be implemented using Wi-Fi, WiMAX, Bluetooth, infrared, cellular networks, satellite, or any other wireless connection methodology generally known in the art. Additionally, several networks may work alone or in communication with each other to facilitate communication in the network 771.
[0043] It should be appreciated that the program modules, applications, computer-executable instructions, code, or the like depicted in FIG. 6 as being stored in the system memory 730 are merely illustrative and not exhaustive and that processing described as being supported by any particular module may alternatively be distributed across multiple modules or performed by a different module. In addition, various program module(s), script(s), plug-in(s), Application Programming Interface(s) (API(s)), or any other suitable computer-executable code hosted locally on the computer system 710, the remote device 780, and/or hosted on other computing device(s) accessible via one or more of the network(s) 771, may be provided to support functionality provided by the program modules, applications, or computer-executable code depicted in FIG. 6 and/or additional or alternate functionality. Further, functionality may be modularized differently such that processing described as being supported collectively by the collection of program modules depicted in FIG. 6 may be performed by a fewer or greater number of modules, or functionality described as being supported by any particular module may be supported, at least in part, by another module. In addition, program modules that support the functionality described herein may form part of one or more applications executable across any number of systems or devices in accordance with any suitable computing model such as, for example, a client-server model, a peer-to-peer model, and so forth. In addition, any of the functionality described as being supported by any of the program modules depicted in FIG. 6 may be implemented, at least partially, in hardware and/or firmware across any number of devices.
[0044] It should further be appreciated that the computer system 710 may include alternate and/or additional hardware, software, or firmware components beyond those described or depicted without departing from the scope of the disclosure. More particularly, it should be appreciated that software, firmware, or hardware components depicted as forming part of the computer system 710 are merely illustrative and that some components may not be present or additional components may be provided in various embodiments. While various illustrative program modules have been depicted and described as software modules stored in system memory 730, it should be appreciated that functionality described as being supported by the program modules may be enabled by any combination of hardware, software, and/or firmware. It should further be appreciated that each of the above-mentioned modules may, in various embodiments, represent a logical partitioning of supported functionality. This logical partitioning is depicted for ease of explanation of the functionality and may not be representative of the structure of software, hardware, and/or firmware for implementing the functionality. Accordingly, it should be appreciated that functionality described as being provided by a particular module may, in various embodiments, be provided at least in part by one or more other modules. Further, one or more depicted modules may not be present in certain embodiments, while in other embodiments, additional modules not depicted may be present and may support at least a portion of the described functionality and/or additional functionality. Moreover, while certain modules may be depicted and described as sub-modules of another module, in certain embodiments, such modules may be provided as independent modules or as sub-modules of other modules.
[0045] Although specific embodiments of the disclosure have been described, one of ordinary skill in the art will recognize that numerous other modifications and alternative embodiments are within the scope of the disclosure. For example, any of the functionality and/or processing capabilities described with respect to a particular device or component may be performed by any other device or component. Further, while various illustrative implementations and architectures have been described in accordance with embodiments of the disclosure, one of ordinary skill in the art will appreciate that numerous other modifications to the illustrative implementations and architectures described herein are also within the scope of this disclosure. In addition, it should be appreciated that any operation, element, component, data, or the like described herein as being based on another operation, element, component, data, or the like can be additionally based on one or more other operations, elements, components, data, or the like. Accordingly, the phrase “based on,” or variants thereof, should be interpreted as “based at least in part on.”
[0046] Although embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the disclosure is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as illustrative forms of implementing the embodiments. Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments could include, while other embodiments do not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements, and/or steps are included or are to be performed in any particular embodiment.
[0047] The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the Figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.

Claims

CLAIMS
What is claimed is:
1. A method performed by a human-machine interface (HMI) system comprising a camera and a plurality of modules, the method comprising:
capturing a video feed of a production area in the vicinity of the HMI system, so as to define an input video;
detecting a face in the input video, so as to define a detected face;
determining whether the detected face in the input video represents a spoof attack; and
when it is determined that the detected face does not represent the spoof attack, determining whether the face belongs to an authorized user of the HMI system.

2. The method as recited in claim 1, the method further comprising:
determining whether a mask is on the detected face.

3. The method as recited in claim 2, the method further comprising:
when it is determined that the mask is on the detected face, searching a database of masked faces; and
matching the detected face to a masked face in the database of masked faces.

4. The method as recited in claim 1, wherein detecting whether the detected face represents the spoof attack further comprises:
determining whether an eye of the detected face blinks; and
determining whether a mouth of the detected face moves.

5. The method as recited in claim 1, the method further comprising:
after determining that the face belongs to an authorized user of the HMI system, allowing access to the production area corresponding to particular access rights of the authorized user.
6. A human-machine interface (HMI) system, the HMI system comprising:
a camera configured to capture a video feed of a production area in the vicinity of the HMI system, so as to define an input video;
a processor; and
a memory storing instructions that, when executed by the processor, cause the HMI system to:
detect a face in the input video, so as to define a detected face;
determine whether the detected face in the input video represents a spoof attack; and
when it is determined that the detected face does not represent the spoof attack, determine whether the face belongs to an authorized user of the HMI system.

7. The HMI system as recited in claim 6, the memory further storing instructions that, when executed by the processor, further cause the HMI system to:
determine whether a mask is on the detected face.

8. The HMI system as recited in claim 7, the memory further storing instructions that, when executed by the processor, further cause the HMI system to:
when it is determined that the mask is on the detected face, search a database of masked faces; and
match the detected face to a masked face in the database of masked faces.

9. The HMI system as recited in claim 6, the memory further storing instructions that, when executed by the processor, further cause the HMI system to:
determine whether an eye of the detected face blinks; and
determine whether a mouth of the detected face moves.

10. The HMI system as recited in claim 6, the memory further storing instructions that, when executed by the processor, further cause the HMI system to:
after determining that the face belongs to an authorized user of the HMI system, allow access to the production area corresponding to particular access rights of the authorized user.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2022/075570 WO2024049462A1 (en) 2022-08-29 2022-08-29 Synchronous user authentication and personalization of human-machine interfaces

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2022/075570 WO2024049462A1 (en) 2022-08-29 2022-08-29 Synchronous user authentication and personalization of human-machine interfaces

Publications (1)

Publication Number Publication Date
WO2024049462A1 true WO2024049462A1 (en) 2024-03-07

Family

ID=83902888

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/075570 WO2024049462A1 (en) 2022-08-29 2022-08-29 Synchronous user authentication and personalization of human-machine interfaces

Country Status (1)

Country Link
WO (1) WO2024049462A1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220147602A1 (en) * 2018-03-07 2022-05-12 Private Identity Llc System and methods for implementing private identity
US10850709B1 (en) * 2019-08-27 2020-12-01 Toyota Motor Engineering & Manufacturing North America, Inc. Facial recognition and object detection for vehicle unlocking scenarios
US20210273940A1 (en) * 2020-03-02 2021-09-02 Charter Communications Operating, Llc Method and apparatus for multifactor authentication and authorization
US20220237274A1 (en) * 2021-01-25 2022-07-28 Apple Inc. Implementation of biometric authentication

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ANONYMOUS: "About Face ID advanced technology - Apple Support", 27 August 2022 (2022-08-27), pages 1 - 3, XP093031983, Retrieved from the Internet <URL:https://web.archive.org/web/20220827203054/https://support.apple.com/en-us/HT208108> [retrieved on 20230315] *


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22778184

Country of ref document: EP

Kind code of ref document: A1