US20120123563A1 - Method and Apparatus for Monitoring Zones - Google Patents
- Publication number
- US20120123563A1 (U.S. application Ser. No. 13/298,416)
- Authority
- US
- United States
- Prior art keywords
- sensor
- control unit
- sensors
- boundary
- monitoring
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F16—ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
- F16P—SAFETY DEVICES IN GENERAL; SAFETY DEVICES FOR PRESSES
- F16P3/00—Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body
- F16P3/12—Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine
- F16P3/14—Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine the means being photocells or other devices sensitive without mechanical contact
- F16P3/142—Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine the means being photocells or other devices sensitive without mechanical contact using image capturing devices
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F16—ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
- F16P—SAFETY DEVICES IN GENERAL; SAFETY DEVICES FOR PRESSES
- F16P3/00—Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body
- F16P3/12—Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine
- F16P3/14—Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine the means being photocells or other devices sensitive without mechanical contact
- F16P3/144—Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine the means being photocells or other devices sensitive without mechanical contact using light grids
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19639—Details of the system layout
- G08B13/19641—Multiple cameras having overlapping views on a single scene
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19639—Details of the system layout
- G08B13/19652—Systems using zones in a single scene defined for different treatment, e.g. outer zone gives pre-alarm, inner zone gives alarm
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19678—User interface
- G08B13/1968—Interfaces for setting up or customising the system
Definitions
- the present invention generally relates to monitoring zones, such as the area or volume around a hazardous machine, secure location, or Autonomous Guided Vehicle (AGV), and particularly relates to the use of multiple sensors for zone monitoring.
- Monitoring systems, such as laser scanners and stereoscopic camera systems, are often used for monitoring a zone established by configured boundaries. During run time, such systems detect the presence and measure the positions of objects bigger than a minimum object detection size, and compare these positions with the configured monitoring boundaries. Such a monitoring system then “decides” whether or not an intrusion has occurred for each considered boundary. Illumination of the area, whether it is intrinsically part of the monitored environment or supplied actively by the system itself, is necessary for proper operation of such systems.
- Such systems enable remote area or volume monitoring.
- This feature has the advantage that a zone (e.g., a large area or volume) may be monitored from a distance that places the zone monitoring system out of harm's way.
- the sensors for such systems, e.g., cameras, laser scanners, etc., avoid damage from collisions with machinery in operation or from pollutants that may be present close to or within the monitored area.
- the sensor of such a monitoring system interfaces with a corresponding control unit that often is mounted in a more conveniently accessed location.
- the solution itself introduces new challenges. For example, the amount of processing required for establishing the proper monitoring function scales with the number of sensors. Consequently, vision systems that use multiple sensors have very high overall processing burdens. That burden ultimately causes the control unit cost and performance to scale with the number of sensors used in the application. In turn, the need for high-performance vision processing makes it difficult for the customer to scale a solution to properly match the monitoring task.
- each sensor in a multi-sensor system is commonly provided with its own I/O.
- This configuration creates an abundance of wiring and external control logic that may not be needed when, for instance, there are only a small number of machines to control for a large number of sensors. Note that this is also the case for applications such as autonomous guided vehicle safety control, where it is common to have four or more laser scanners on a guided vehicle.
- the present invention comprises an apparatus and method for monitoring a zone, which is a contiguous or non-contiguous, two-dimensional (2D) area or three-dimensional (3D) volume.
- the apparatus includes a plurality of sensors that monitor portions of the zone and report intrusion status to a control unit that provides monitoring boundary information to each of the sensors based on user input and further “maps” or otherwise associates each sensor to control unit outputs in accordance with user-defined behaviors.
- at least a subset of sensors use a common coordinate frame of reference, based on a common origin located within overlapping sensor fields of view.
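The common-frame arrangement described above can be sketched as follows: if each sensor's pose is known in a shared coordinate frame whose origin lies within the overlapping fields of view, detections reported in sensor-local coordinates can be transformed into that common frame and compared directly. The 2D simplification, function names, and example poses below are illustrative assumptions, not taken from the patent.

```python
import math

def to_common_frame(point, pose):
    """Transform a 2D point from a sensor's local frame into the common
    frame of reference, given the sensor's pose (x, y, theta) expressed
    in that common frame. A rigid-body rotation plus translation."""
    px, py = point
    sx, sy, theta = pose
    cx = sx + px * math.cos(theta) - py * math.sin(theta)
    cy = sy + px * math.sin(theta) + py * math.cos(theta)
    return (cx, cy)

# Two sensors observe the same object from different poses; after
# transformation, both report (approximately) the same common-frame position.
pose_1 = (0.0, 0.0, 0.0)        # hypothetical sensor 12-1 at the common origin
pose_2 = (4.0, 0.0, math.pi)    # hypothetical sensor 12-2 facing back toward it
obj_in_1 = (2.0, 1.0)           # object as seen by sensor 12-1
obj_in_2 = (2.0, -1.0)          # same object as seen by sensor 12-2
```

With these poses, both calls to `to_common_frame` yield the common-frame position (2.0, 1.0), which is how overlapping detections from different sensors can be reconciled.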
- the present invention comprises a monitoring apparatus that includes a plurality of sensors, each configured to monitor for intrusions according to a configured boundary and each having a communication interface for sending monitoring information indicating intrusion detections and receiving configuration data defining the configured boundary to be monitored by each sensor.
- the sensors, which are also referred to as “sensor heads,” each comprise sensing circuitry for obtaining raw sensor data representing the sensor's field of view, along with processing circuitry for processing the raw sensor data for object detection and intrusion monitoring.
- the sensors comprise stereoscopic camera systems or laser scanners, or a mix thereof. That is, the sensors may be homogeneous or heterogeneous in type.
- the monitoring apparatus further includes a control unit having a communication interface for receiving the monitoring information from each sensor, including intrusion detection information, and for sending sensor configuration data to each sensor.
- the control unit includes a number of outputs for providing signals to external devices. At least some of these outputs are, for example, Output Signal Switching Device (OSSD) safety outputs used to energize/de-energize hazardous machinery within the monitored zone, or to perform some other safety-related switching function. Other ones of the outputs may relate to various status monitoring, diagnostics, or control functions.
- the control unit further includes one or more processing circuits configured to control the outputs according to a defined control unit configuration, wherein the control unit configuration defines control responses for the sensors.
- the control unit configuration data defines the control response for each sensor, such that the control unit may be understood as mapping or otherwise associating each sensor with one or more particular ones of the outputs, according to the defined control unit configuration.
- This feature allows the control unit to behave differently with respect to each sensor or with respect to different groups of its sensors.
- the control response configured for each sensor defines how the control unit controls its outputs and/or which outputs it controls, on a per-sensor or per sensor group basis. Consequently, a user can configure different behavior (control responses) for intrusion detection and other events reported from the sensors, on a per sensor or sensor-group basis.
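The per-sensor mapping described above can be pictured as a configuration table held by the control unit, associating each sensor (or sensor group) with the outputs it drives and the behavior to apply. A minimal sketch, with all identifiers assumed for illustration:

```python
# Hypothetical control-unit configuration: each sensor (or sensor group)
# is associated with the output channels it drives and the control response
# the control unit applies on an intrusion report. Names are illustrative.
CONTROL_RESPONSES = {
    "sensor-12-1": {"outputs": ["OSSD1"], "mode": "STOP"},
    "sensor-12-2": {"outputs": ["OSSD1", "OSSD2"], "mode": "STOP"},
    "sensor-12-3": {"outputs": ["WARN1"], "mode": "WARN_ONLY"},
}

def outputs_for(sensor_id):
    """Look up which outputs an intrusion report from sensor_id affects;
    an unknown sensor maps to no outputs."""
    entry = CONTROL_RESPONSES.get(sensor_id)
    return entry["outputs"] if entry else []
```

The point of the table is that an intrusion report from "sensor-12-3" can drive a warning output only, while the same report from "sensor-12-1" stops machinery, which is the per-sensor behavior the text describes.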
- the control unit also includes a configuration interface operatively associated with the one or more processing circuits, for receiving the control unit configuration data, thereby defining the control unit configuration.
- the configuration interface comprises, for example, a computer interface, such as USB, Ethernet or another PC-compatible interface.
- the configuration interface allows the control unit to receive the control unit and/or sensor configuration data from an external configuration unit, which may be a laptop or other PC.
- FIG. 1 is a block diagram of one embodiment of a monitoring apparatus.
- FIG. 2 is a logic flow diagram of one embodiment of a method of zone monitoring.
- FIG. 3 is a block diagram illustrating the use of multiple sensors positioned at different viewing angles.
- FIGS. 4-6 are block diagrams of various embodiments of a monitoring apparatus, illustrating different communication interface configurations for the plurality of sensors and control unit comprising the apparatus.
- FIG. 7 is a block diagram illustrating the use of a common coordinate frame of reference between two sensors.
- FIG. 1 illustrates a monitoring apparatus 10 (hereafter “apparatus 10 ”) as contemplated herein, according to one example embodiment.
- the apparatus 10 may be understood as a type of monitoring system in which a plurality of sensors 12 are configured to monitor all or part of a monitoring zone 14 .
- the “monitoring zone” 14 comprises a contiguous or non-contiguous two-dimensional area or three-dimensional volume to be monitored collectively by the sensors 12 .
- two sensors 12 - 1 and 12 - 2 are shown by way of example, but more sensors 12 could be used. Further note that “sensors 12 ” is used in the generic plural sense and “sensor 12 ” is used in the generic singular sense.
- the monitoring zone 14 is a more or less continuous three-dimensional space, but it includes obstructions or features that prevent a single sensor 12 from “seeing” the entire space. Therefore, by using two or more sensors 12 , each having a different field of view 16 into the monitoring zone 14 , the apparatus 10 provides for full monitoring of the monitoring zone 14 .
- the “field of view” 16 for a given sensor 12 may be along a two-dimensional plane or in a three-dimensional space, and note that the shape or extents of the field of view 16 of a given sensor 12 is defined by a configured boundary 18 .
- sensor 12 - 1 monitors within a zone defined by its configured boundary 18 - 1 and sensor 12 - 2 monitors within a zone defined by its configured boundary 18 - 2 .
- the boundaries 18 may be configured to overlap or extend between sensors 12 , and the portion of the monitored zone 14 that is monitored by each sensor 12 may at least partially overlap with that of one or more other sensors 12 .
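Comparing a measured object position against a configured boundary 18 can be illustrated, for the 2D case, with a standard ray-casting point-in-polygon test. This is a generic technique chosen purely for illustration; the patent does not prescribe any particular intrusion test:

```python
def inside_boundary(point, boundary):
    """Ray-casting test: True if a 2D point lies inside the polygon given
    as a list of (x, y) vertices. Used here only to illustrate comparing a
    measured position with a configured monitoring boundary."""
    x, y = point
    inside = False
    n = len(boundary)
    for i in range(n):
        x1, y1 = boundary[i]
        x2, y2 = boundary[(i + 1) % n]
        # Count edge crossings of a horizontal ray cast from the point.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A rectangular configured boundary, vertices in order (units arbitrary).
zone = [(0, 0), (4, 0), (4, 3), (0, 3)]
```

An object measured at (2, 1) would count as an intrusion into this boundary, while one at (5, 1) would not; in a 3D embodiment the same comparison would be made against a bounded volume instead.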
- each sensor 12 comprises an assembly, e.g., a housing, connectors, etc., and includes certain functional circuitry.
- the illustrated sensors 12 each include sensing circuitry 20 , processing circuitry 22 , program/data memory 24 , and a communication interface 26 .
- the sensing circuitry 20 comprises, for example, a laser scanner or one or more cameras or other imaging sensors. See U.S. Pat. No. 7,965,384 for example details regarding laser scanning optics, electromechanical elements, and associated circuitry.
- the sensing circuitry 20 of at least one sensor 12 comprises a camera.
- the sensing circuitry 20 of at least one sensor 12 comprises a stereoscopic camera system.
- the sensors 12 may not be homogeneous; that is, one or more of the sensors 12 may use a first detection technology (e.g., laser scanning), while one or more other ones of the sensors 12 use a second detection technology (e.g., camera or other imaging sensor based machine vision).
- the processing circuitry 22 in each sensor 12 is configured as appropriate for the sensing technology implemented by the sensing circuitry 20 .
- where the sensing circuitry 20 comprises a 2D or a 3D camera-based vision system, the processing circuitry 22 is configured to carry out image-processing algorithms, e.g., for object detection processing.
- the configuration and specific processing may be at least partially implemented according to computer program instructions stored in the program/data memory 24 .
- the specific configuration of the sensor's monitoring e.g., the shape, contour, or dimensional extents information defining the configured boundary 18 used by the sensor to define its monitored zone also may be loaded into and stored by the program/data memory.
- the program/data memory 24 may comprise more than one memory device and/or more than one type of memory, such as SRAM for working data and EEPROM, FLASH or other non-volatile, writeable memory for program and/or configuration data storage.
- the processing circuitry 22 includes communication capabilities, e.g., the processing circuitry 22 sends and receives control and data messages according to a defined communication protocol.
- each sensor 12 includes a communication interface 26 that couples the sensor 12 to a communication link 30 , which in turn provides communicative coupling between the plurality of sensors 12 and an associated control unit 32 .
- the communication interface 26 comprises, for example, the physical-layer circuitry required to communicate on a given medium, e.g., a wired and/or wireless network.
- the communication interface 26 comprises an Ethernet interface, and the sensor 12 may be configured to receive power via connection to powered Ethernet via the communication interface 26 .
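The patent does not define a wire format for the sensor-to-control-unit link; purely as an illustration of the kind of status message a sensor might send over such an Ethernet connection, consider a hypothetical packed frame carrying a sensor id, a sequence counter, and an intrusion flag:

```python
import struct

# Hypothetical wire format (an assumption, not from the patent): sensor id,
# monotonically increasing sequence number for communication-integrity
# checking, and an intrusion flag, packed big-endian.
STATUS_FMT = ">BIB"          # sensor_id: u8, seq: u32, intrusion: u8

def encode_status(sensor_id, seq, intrusion):
    """Pack one sensor status report into a 6-byte frame."""
    return struct.pack(STATUS_FMT, sensor_id, seq, 1 if intrusion else 0)

def decode_status(frame):
    """Unpack a status frame on the control-unit side."""
    sensor_id, seq, intrusion = struct.unpack(STATUS_FMT, frame)
    return {"sensor_id": sensor_id, "seq": seq, "intrusion": bool(intrusion)}
```

A sequence counter of this kind is one conventional way the control unit could detect lost or stale sensor reports on the link.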
- the control unit 32 has a corresponding compatible communication interface 34 , for communicating with each sensor 12 over the communication link(s) 30 .
- the intelligence for managing such communications resides in the processing circuitry 36 of the control unit 32 .
- the processing circuitry 36 comprises one or more processing circuits that are configured via hardware, software, or both, to implement the operational behavior of the control unit 32 .
- the control unit 32 comprises a microcontroller or other type of microprocessor, and program, working, and configuration data to support such processing may be stored in the program/data memory 38 .
- the control unit 32 further includes a number of outputs 40 (e.g., electrical output circuits, each providing an output signal that is selectively asserted or otherwise controlled by the control unit 32 ).
- outputs 40 are safety outputs (OSSD outputs, for example), for controlling external machinery 42 , responsive to intrusion detection information from the sensors 12 .
- Such machinery comprises, for example, a manufacturing machine or robot, or an AGV.
- the control unit 32 may shut down or alter operation of one or more external machines 42 , via its outputs 40 .
- the control unit 32 in at least one embodiment further includes diagnostic input/output (I/O) 44 , which allows, for example, non-safety signaling from the control unit 32 .
- Such signaling allows for monitoring the control unit state, and the interface circuits constituting the I/O 44 may be connected with various external monitoring systems, such as factory-floor networks, etc.
- the control unit also may include a configuration interface 48 , which may comprise a computer interface, such as USB, Ethernet or another PC-compatible interface.
- the configuration interface 48 is configured (in conjunction with the processing circuitry 36 ) to allow the control unit 32 to receive configuration data from an external configuration unit 50 , which may be a laptop or other PC.
- the apparatus 10 includes a control unit 32 and a number of sensors 12 , with each sensor communicatively linked to the control unit 32 .
- An authorized user (not shown in the figure) attaches a computer to the control unit 32 via the configuration interface 48 and executes a configuration program that allows the user to define the configured boundary 18 of each sensor 12 , and to map or otherwise associate each sensor 12 with specific ones of the outputs 40 .
- the “behavior” of the control unit 32 is configurable with respect to each sensor 12 .
- the particular safety outputs among the collection of outputs 40 that are energized or de-energized based on intrusion detection events reported by a particular one of the sensors is configurable, via the configuration interface 48 .
- Different sensors 12 can be mapped to different safety outputs, and this allows the control unit 32 to energize (or de-energize) certain ones of the outputs 40 in response to object detection by certain ones of the sensors 12 .
- different zones configured for a given sensor, and therefore monitored simultaneously by that sensor, can be mapped to different outputs on the control unit.
- the flexibility to attach an array of sensors 12 to the control unit 32 , and to logically configure how the control unit 32 responds to intrusion detection in dependence on which one or ones of the sensors 12 detects the intrusion provides great flexibility for machine guarding and AGV control.
- FIG. 2 illustrates one embodiment of a method 100 of operation for the apparatus 10 .
- the apparatus 10 is configured to perform the illustrated operations by the processing circuitry 36 executing computer program instructions stored in the program/data memory 38 .
- the illustrated processing can be performed in dedicated hardware and/or discrete circuitry in the apparatus 10 .
- the method 100 includes monitoring for configuration mode activation, or otherwise determining by the control unit 32 whether it should enter configuration mode (YES or NO from Block 102 ). If not, and assuming that the control unit 32 has a valid configuration in its memory, it performs or otherwise continues with run-time operations (Block 104 ), which in one or more embodiments entail a series of complex monitoring and self-verification operations, related not only to verifying communication integrity with the sensors 12 , but also to verifying proper integrity and control of its outputs 40 and/or 44 .
- Block 106 includes receiving control unit configuration data (Block 106 - 1 ), receiving or generating sensor configuration data (Block 106 - 2 ), defining control unit behavior (the control responses) of the control unit 32 with respect to intrusion detection by each sensor 12 , in accordance with the received control unit configuration data (Block 106 - 3 ), and communicating the sensor configuration data to the sensors 12 (Block 106 - 4 ).
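The configuration steps of Block 106 can be sketched as follows, with hypothetical classes standing in for the control unit 32 and sensors 12; the class, method, and field names are assumptions for illustration only:

```python
class Sensor:
    """Stand-in for a sensor 12: holds the boundary it is configured with."""
    def __init__(self):
        self.boundary = None

    def configure(self, boundary):
        self.boundary = boundary


class ControlUnit:
    """Minimal sketch of Blocks 106-1 through 106-4: accept control-unit and
    sensor configuration data, record the per-sensor control responses, and
    distribute each sensor's boundary to the targeted sensor."""
    def __init__(self, sensors):
        self.sensors = sensors      # sensor_id -> Sensor
        self.responses = {}         # sensor_id -> control response

    def apply_configuration(self, control_cfg, sensor_cfgs):
        # Blocks 106-1/106-3: receive control unit configuration data and
        # define the control responses with respect to each sensor.
        self.responses = dict(control_cfg["responses"])
        # Blocks 106-2/106-4: communicate sensor configuration data
        # (the configured boundary) to each targeted sensor.
        for sensor_id, cfg in sensor_cfgs.items():
            self.sensors[sensor_id].configure(cfg["boundary"])
```

In a real system the `apply_configuration` step would also validate the data before use, but the split between control-unit configuration and per-sensor configuration follows the Block 106 description.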
- a primary aspect of the control unit configuration is concerned with the association of monitored zones with control unit I/O (e.g., outputs 40 and/or 44 and/or misc. inputs 52 ), and whether or not the user desires a particular operating mode (e.g., AUTOMATIC RESTART, START/RESTART INTERLOCK, etc.) to be triggered responsive to intrusion detection information from each particular sensor 12 .
- the boundary 18 for each sensor 12 is constructed using, e.g., a software tool on an attached laptop or other PC, which is supported by the control unit's configuration interface 48 .
- the data defining the configured boundaries 18 is sent from the attached computer to the control unit 32 , which then sends it to the correct sensor 12 .
- data from the attached computer is sent to the control unit 32 , which then generates boundary information from it, or otherwise translates the received data into corresponding boundary information.
- the configured boundary 18 for each sensor 12 is associated with OSSDs and/or other outputs on the control unit 32 .
- the control unit 32 is configured to provide a “zone select” function that acts on the boundary 18 in one or more of the sensors 12 , wherein each such boundary 18 has its own associations with the control unit outputs 40 and/or 44 .
- with zone select functionality, monitoring of multiple zones is realized through input selection. For example zone-selection details, see the commonly owned U.S. Pat. No. 8,018,353, issued on 13 Sep. 2011.
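A zone-select function of this kind can be illustrated as a lookup from the state of selection inputs to the boundary that should currently be active, e.g., for an AGV whose monitored zone grows with travel speed. The table contents and names below are illustrative assumptions:

```python
# Hypothetical zone-select sketch: a sensor is configured with several
# boundaries, and the pattern on the control unit's selection inputs picks
# which boundary (monitoring zone) is currently active.
ZONE_TABLE = {
    (0, 0): "boundary-A",     # e.g., AGV at standstill: short-range zone
    (0, 1): "boundary-B",     # slow travel: mid-range zone
    (1, 0): "boundary-C",     # full speed: long-range zone
}

def active_zone(select_inputs):
    """Map the selection-input pattern to the configured boundary to use.
    An unrecognized pattern yields None, i.e., no valid zone selection,
    which a safety system would treat as a fail-safe condition."""
    return ZONE_TABLE.get(tuple(select_inputs))
```

Because each such boundary has its own associations with the outputs 40 and/or 44, switching the selection inputs effectively swaps both the monitored zone and the control response tied to it.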
- different sensor configuration data may be provided to each sensor, based on its location with respect to the monitored zone and its intended monitoring functions, and based on its type (e.g., laser, camera, etc.).
- This feature allows the configured boundary 18 and other monitoring parameters of each sensor 12 to be configured on an individual basis.
- the control unit configuration data allows an individualized control response to be defined for the control unit 32 , with respect to each sensor 12 .
- This feature allows a user to configure the control unit behavior differently for different sensors 12 , meaning that the response of the control unit 32 to intrusion detection by one sensor 12 can be different than its response to object intrusion detection by another sensor 12 .
- the control response can be the same across defined groups of sensors 12 , and yet still be different between different groups of sensors 12 .
- the configuration data received by the control unit 32 may come, e.g., from an attached laptop computer or other configuration device 50 , such as was shown in FIG. 1 .
- the configuration device 50 and/or the control unit 32 provide a user interface facilitating configuration choices and data input by the user.
- the control unit 32 provides sensor data from one or more of the sensors 12 (e.g., a camera view or other representation of a field of view into the monitored zone) for display to the user.
- the user draws or otherwise sees graphical overlays on the field of view, representing configuration boundaries, etc.
- the control unit 32 receives monitoring information from each sensor 12 (Block 110 - 1 ), monitors that information for intrusion detection information from any of the sensors 12 (Block 110 - 2 ), and controls its outputs 40 and/or 44 according to the control response defined for the sensor 12 from which the intrusion detection information was received (Block 110 - 3 ).
- the control unit configuration data received by the control unit 32 is used by the control unit 32 to determine how it responds to intrusion detection by particular ones of the sensors 12 . While it may exhibit uniform behavior with respect to all sensors 12 , if so configured by the user, the user may also tailor the response of the control unit 32 to each particular sensor 12 and/or to particular groups of sensors 12 . As one example, this feature allows the particular outputs 40 and/or 44 to be exercised when intrusion detection signaling is received from a particular sensor 12 .
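The run-time behavior of Blocks 110-1 through 110-3 can be sketched as a dispatch loop: each sensor report is checked for an intrusion indication, and the outputs named in that sensor's configured control response are de-energized. All data structures here are illustrative assumptions:

```python
def handle_reports(reports, responses, outputs):
    """Run-time sketch of Blocks 110-1..110-3: for each sensor report that
    signals an intrusion, de-energize the outputs named by the control
    response configured for that sensor."""
    for report in reports:                        # Block 110-1: receive info
        if not report["intrusion"]:               # Block 110-2: monitor
            continue
        response = responses.get(report["sensor_id"], {})
        for name in response.get("outputs", []):  # Block 110-3: control
            outputs[name] = False                 # False = de-energized

# Per-sensor control responses and current output states (all hypothetical).
responses = {"12-1": {"outputs": ["OSSD1"]}, "12-2": {"outputs": ["OSSD2"]}}
outputs = {"OSSD1": True, "OSSD2": True}

handle_reports([{"sensor_id": "12-1", "intrusion": True},
                {"sensor_id": "12-2", "intrusion": False}],
               responses, outputs)
```

After the call, only OSSD1 is de-energized, reflecting that the intrusion was reported by sensor 12-1: the same event reported by a different sensor would exercise different outputs.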
- FIGS. 1 and 2 will be understood as representing structural/functional and operational examples for a monitoring apparatus 10 that is configured for detecting intrusions within large areas (or volumes), where the monitored area (or volume) is such that one sensor 12 alone cannot achieve the desired coverage due to shadows in the sensor field of view, or limitations in the sensor field of view. See FIG. 3 .
- the apparatus 10 includes a plurality of sensors 12 which each detect intrusions into respective portions of the monitored area (or volume) 14 .
- the apparatus 10 further includes a control unit 32 that associates intrusions into the respective portions of the monitored area (or volume) to control outputs 40 and/or 44 , which are used for machine control or diagnostic functions.
- the apparatus 10 further includes a communication interface 34 communicatively linking the sensors 12 with the control unit 32 that allows each of the sensors to communicate intrusion status with respect to its configured boundaries 18 to the control unit 32 .
- the boundary/boundaries 18 - x configured for one of the sensors 12 - x may be a line, a contour, or other two-dimensional boundary, or may be a three-dimensional boundary defining a bounded volume (which itself may be defined at least in part by real world objects, such as barriers, walls, etc.).
- the control unit 32 further includes a configuration interface 48 that allows users to configure each sensor 12 and the control unit 32 , with an external device 50 such as a portable computer. Again, it should be noted that configuring the apparatus 10 in an overall sense includes configuring the sensors 12 and configuring the control unit 32 .
- the configuration interface 48 enables the control unit 32 to receive both sensor configuration data and control unit configuration data.
- the control unit 32 includes a configuration program that allows users to create monitoring boundaries 18 , assign them to sensors 12 , and transmit them to individual sensors 12 , or groups of sensors 12 , over the communication interface 34 . That is, sensor configuration data is sent from the user's configuration device 50 (or entered directly into the control unit 32 via the configuration device 50 ), and the control unit 32 then sends that configuration data to the targeted sensors 12 .
- the configuration program or at least a portion of it, resides on the configuration device 50 and communicates with the control unit 32 according to a defined protocol that enables the control unit 32 to parse the incoming data and recognize sensor configuration data parameters and control unit configuration parameters.
- the same or another configuration program includes program instructions that, when executed, allow the user to set the safety parameters of each sensor 12 , such as object detection size and speed of detection (response time). Further, the same or another configuration program includes program instructions that, when executed, allow the user to associate intrusions with respect to configured boundaries 18 to control and/or diagnostic outputs 40 and/or 44 on the control unit 32 . This feature can be understood as allowing the user to map particular sensors 12 to particular ones of the control outputs 40 and/or diagnostic outputs 44 .
- Such mapping comprises identifying which individual signal lines among the outputs 40 and/or 44 will be exercised in response to intrusion detections reported by a particular sensor 12, or defining when/how any or all of the outputs 40 and/or 44 are controlled in response to such intrusion detection reports.
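The sensor-to-output mapping described above can be illustrated with a minimal sketch. The `SensorReport` and `ControlUnit` names, the keying on (sensor, boundary) pairs, and the eight-line output bank are assumptions for illustration, not details from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class SensorReport:
    """Intrusion status reported by one sensor for one configured boundary."""
    sensor_id: int
    boundary_id: int
    intruded: bool

class ControlUnit:
    def __init__(self, n_outputs=8):
        # map (sensor_id, boundary_id) -> list of output line indices
        self.output_map = {}
        self.outputs = [False] * n_outputs

    def map_output(self, sensor_id, boundary_id, output_lines):
        """Associate intrusions on one sensor/boundary with particular lines."""
        self.output_map[(sensor_id, boundary_id)] = list(output_lines)

    def handle_report(self, report):
        """Exercise only the lines mapped to the reporting sensor/boundary."""
        for line in self.output_map.get((report.sensor_id, report.boundary_id), []):
            self.outputs[line] = report.intruded

cu = ControlUnit()
cu.map_output(1, 0, [0, 3])   # sensor 1, boundary 0 drives lines 0 and 3
cu.handle_report(SensorReport(sensor_id=1, boundary_id=0, intruded=True))
print(cu.outputs[0], cu.outputs[3])  # True True
```

This keeps the control unit simple: it reacts to status words rather than raw sensor data, consistent with the division of labor described elsewhere in the disclosure.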
- the apparatus 10 is configured for detecting intrusions within large areas (or volumes), where the monitored area (or volume) contains multiple disjoint or overlapping monitoring areas (or volumes).
- the apparatus 10 includes at least one sensor 12 , which detects intrusions into portions of the monitored zone 14 , which is an area or volume.
- the apparatus 10 includes a control unit 32 that associates intrusions into respective portions of the monitored zone with control outputs 40 or 44 , which are then used for machine control or diagnostic functions.
- a communication interface 34 communicatively links the control unit 32 to the sensors 12 , allowing each sensor 12 to communicate intrusion status with respect to its configured boundaries 18 to the control unit 32 .
- the control unit 32 includes a configuration interface 48 that allows users to configure individual sensors 12 and further to configure the control unit's behavior with respect to those sensors 12, using an external configuration device 50, such as a laptop computer.
- a configuration program (e.g., stored in whole or in part in program/data memory 38) allows users to create monitoring boundaries 18 on a per-sensor or per-sensor-group basis, including assigning particular sensor configurations to particular sensors 12.
- the control unit 32 is configured to transfer the sensor configuration data to the targeted sensors 12 over the communication interface 34 .
- the configuration program allows the user to set the safety parameters of the sensor, such as object detection size and speed of detection (response time), along with associating or otherwise mapping detected intrusions into the respective portions of the monitored zone 14 to particular control and/or diagnostic outputs 40 and 44 .
- Each sensor 12 detects such intrusions into its respective portion of the monitored zone 14 , based on the corresponding boundaries 18 configured for that sensor 12 .
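As a rough illustration of boundary-based detection, a two-dimensional polygonal boundary 18 can be tested with a standard ray-casting (even-odd) check. The function name and coordinates below are illustrative; a real sensor 12 would additionally qualify detections by size and response time, as described above:

```python
def point_in_boundary(point, boundary):
    """Ray-casting test: is a detected object position inside a 2D
    polygonal monitoring boundary (list of (x, y) vertices)?"""
    x, y = point
    inside = False
    n = len(boundary)
    for i in range(n):
        x1, y1 = boundary[i]
        x2, y2 = boundary[(i + 1) % n]
        # count crossings of a ray cast in the +x direction
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_boundary((2, 2), square))  # True
print(point_in_boundary((5, 2), square))  # False
```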
- the teachings herein provide a monitoring system or apparatus 10 that includes one or more sensors 12 (as noted, alternately referred to as “sensor modules” or “sensor heads”), along with a control module 32 .
- the control unit 32 includes a communication interface 34 that communicatively links the control unit 32 to each sensor 12 —see FIG. 4 , illustrating an embodiment where each sensor 12 links directly to the control unit 32 , e.g., using powered or unpowered Ethernet links.
- FIG. 5 illustrates the use of an aggregator, shown as a “switch” or “endspan” 70 , between the control unit proper and the sensors 12 .
- the control unit 32 and its companion aggregator 70 may be regarded as the overall control module. This configuration offers the advantage of standardizing the control unit 32 and its protocol/physical-layer interface with the aggregator 70, while still allowing different models of aggregator 70, each supporting a different type of interface to the sensors 12.
- FIG. 6 illustrates a daisy chain or series of sensors 12 coupled to the control unit 32 .
- such a configuration may be based on, for example, DeviceNet or EtherCAT.
- such configurations may use addressable sensors 12, where the communication protocol between the control unit 32 and the sensor heads 12 supports sensor addresses or sensor IDs, such that configuration data is packaged into messages directed to an identified one of the sensors 12 in the chain.
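Addressed configuration messages of the kind described might be packaged as in the following minimal sketch. The byte layout (a one-byte sensor ID, a two-byte vertex count, then float vertex pairs) is a hypothetical wire format for illustration, not the protocol actually used:

```python
import struct

def pack_config_message(sensor_id, boundary_points):
    """Pack a boundary configuration into a message addressed to one
    sensor in the chain: [sensor_id][point count][x, y floats...]."""
    payload = struct.pack("<BH", sensor_id, len(boundary_points))
    for x, y in boundary_points:
        payload += struct.pack("<ff", x, y)
    return payload

def unpack_config_message(msg):
    """Recover the target sensor ID and boundary vertices from a message."""
    sensor_id, count = struct.unpack_from("<BH", msg, 0)
    pts = [struct.unpack_from("<ff", msg, 3 + 8 * i) for i in range(count)]
    return sensor_id, pts

msg = pack_config_message(2, [(0.0, 0.0), (1.5, 0.0), (1.5, 2.0)])
sid, pts = unpack_config_message(msg)
```

Each sensor in the daisy chain would inspect the address field and act only on messages directed to its own ID.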
- each sensor 12 is configurable with multiple monitoring boundaries 18 , and performs a monitoring function for each configured boundary.
- the control module 32 is connected to the sensor module(s) through a fast communication interface, such as Ethernet, and it includes a number of control outputs 40, some or all of which may be safety-critical control outputs used to control machinery 42 that may be dangerous to people inside the monitoring zone 14.
- the control unit 32 in one or more embodiments further includes a number of non-safety outputs 44 , which are used, for example, to provide diagnostic functionality and/or signaling to external monitoring and control systems 46 . Still further, the control unit 32 in one or more embodiments includes a number of inputs 52 , which may further be used to actuate specific functionality such as additional monitoring cases, modes or reset functions at the control unit 32 .
- the diagnostic input/output (I/O) 44 includes a diagnostic display in one or more embodiments of the control unit 32 .
- the diagnostic display provides diagnostic information about environmental conditions and/or device errors.
- each sensor 12 is configured to monitor at least a portion of the monitoring zone 14, according to the boundaries 18 configured for that sensor 12. Objects are detected subject to any size, persistence, or other "qualification" requirements that prevent false detections. Further, each sensor 12 communicates object intrusion status to the control unit 32, which reacts to the status information from each sensor 12 according to its configured behavior. As between the control unit 32 and the sensors 12, the sensors 12 perform most of the processing, or at least the most complex processing, e.g., laser scanning, 3D imaging/ranging.
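The size/persistence "qualification" mentioned above can be modeled as a simple streak filter: an object must meet a minimum size for several consecutive frames before an intrusion is reported. The class name and the specific thresholds below are illustrative assumptions, not values from the disclosure:

```python
class DetectionQualifier:
    """Require an object to persist for N consecutive frames and exceed
    a minimum size before reporting an intrusion (false-alarm rejection)."""
    def __init__(self, min_size, min_frames):
        self.min_size = min_size
        self.min_frames = min_frames
        self.streak = 0

    def update(self, object_size):
        """Feed one frame's detected object size (None if nothing detected);
        returns True only once the persistence requirement is met."""
        if object_size is not None and object_size >= self.min_size:
            self.streak += 1
        else:
            self.streak = 0
        return self.streak >= self.min_frames

q = DetectionQualifier(min_size=70, min_frames=3)  # e.g., 70 mm over 3 frames
results = [q.update(s) for s in [80, 80, 80, None, 80]]
# an intrusion is reported only on the third consecutive qualifying frame
```

Note that the persistence requirement trades response time against false-detection immunity, which is why the disclosure treats "speed of detection (response time)" as a configurable safety parameter.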
- This aspect of the apparatus 10 allows the control unit 32 to be relatively inexpensive, as was noted earlier.
- the control unit 32 therefore represents a comparatively small portion of the cost of guarding a particular installation, machine or vehicle.
- many sensors 12 can be used with a single control unit 32 without placing heavy requirements on the communication interface between the control unit 32 and the sensors 12 .
- because the control unit 32 associates or otherwise relates multiple monitoring cases to a comparatively small number of outputs 40/44, the wiring burden is reduced in many installations.
- in such systems, simple diagnostic aids may be unhelpful: status indicators may be too numerous to implement practically, or too far away to be visible to the user requiring diagnostic information.
- One solution is to connect an external diagnostic device, such as a PC or video monitor, which allows the user to monitor an image of the zone, superimposed with boundaries and measurement data.
- when the sensor is far away from the monitored boundary, or does not share a vantage point on the zone similar to a typical user's, it becomes difficult for the user to associate the intrusion diagnostic with a physical location in any way that is intuitively easy to understand.
- At least one embodiment of the apparatus 10 contemplated herein solves the above problem by sharing a common coordinate frame among its multiple sensors 12 .
- This common coordinate frame allows the sensors to be referred to a common monitoring boundary 18 or boundaries 18 .
- In the simplest case of two sensors 12, e.g., sensors 12-1 and 12-2, a set of external points in the scene that are visible to both sensors 12 is used to establish a common origin.
- a common origin is established by requiring overlapping fields of view 16 for at least a subset of the sensors 12 .
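Establishing a common origin from scene points visible to both sensors amounts to rigid registration of the two sensor frames. In two dimensions this can be sketched with complex arithmetic, using the standard least-squares rotation estimate; the example below assumes noise-free correspondences and is purely illustrative:

```python
import cmath

def register_frames(pts_a, pts_b):
    """Estimate the rigid transform (rotation + translation) that maps
    points measured in sensor A's frame onto the same physical points
    measured in sensor B's frame. 2D points are complex numbers."""
    ca = sum(pts_a) / len(pts_a)   # centroid of A's measurements
    cb = sum(pts_b) / len(pts_b)   # centroid of B's measurements
    # optimal rotation from the cross-covariance of the centered points
    s = sum((b - cb) * (a - ca).conjugate() for a, b in zip(pts_a, pts_b))
    rot = s / abs(s)               # unit complex number = rotation
    trans = cb - rot * ca
    return rot, trans

# shared landmarks visible to both sensors (illustrative coordinates)
a = [complex(0, 0), complex(1, 0), complex(0, 1)]
rot_true = cmath.exp(1j * 0.5)
b = [rot_true * p + complex(2, 3) for p in a]
rot, trans = register_frames(a, b)
# rot * p + trans now maps any sensor-A coordinate into sensor B's frame
```

With noisy measurements, the same cross-covariance construction generalizes to a least-squares fit over many shared points (and, in 3D, to the analogous matrix formulation).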
- One of the advantages gained through use of the common coordinate frame of reference is that such use allows the boundaries 18 of a given sensor 12 to be viewed by any other sensor 12 .
- This feature allows users of the apparatus 10 to view the boundary 18 of a particular sensor 12 from multiple vantage points. In turn, this ability gives users more intuitive tools for creating boundaries 18 and for diagnostic monitoring.
- FIG. 7 shows a boundary 18 - 1 being monitored by a sensor 12 - 1 .
- Sensor 12-2 also has a view of boundary 18-1 and much of the same zone monitored by sensor 12-1.
- positional adjustment of sensor 12-1, e.g., to a typical user vantage point
- the control unit 32 is configured to provide a user configuration interface and to set or adjust the configured boundaries 18 of the sensors 12 based at least in part on user inputs received via the user configuration interface.
- the control unit 32 is configured to: receive field of view data from a first one of the sensors 12 ; receive or generate data representing a displayed boundary representing the configured boundary 18 of a second one of the sensors 12 as seen from the perspective of the first sensor 12 ; provide the field of view data and the data representing the displayed boundary via the user configuration interface; adjust the data representing the displayed boundary responsive to user inputs; and adjust the configured boundary 18 of the second sensor 12 in accordance with the adjustments made to the displayed boundary.
- the ability to view the configured boundary 18 of one sensor from the perspective of another one of the sensors 12 is used to improve diagnostics (e.g. using the diagnostic display 80 shown in FIG. 7 ).
- This capability is useful for diagnostics because the view of a 3D boundary of a given sensor may be difficult to interpret when seen from the perspective of that given sensor.
- the control unit 32 outputs a video feed from a first sensor 12 showing, from the perspective of the first sensor 12 , the boundary 18 of a second sensor 12 , for the area monitored by the second sensor 12 .
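Overlaying the second sensor's boundary 18 on the first sensor's video feed requires projecting the boundary vertices into the first sensor's image plane. A minimal pinhole-camera sketch follows; the intrinsics (focal length, principal point) and the pose representation are made-up illustrative values, not parameters from the disclosure:

```python
def project_point(p_world, cam_pose, focal=600.0, cx=320.0, cy=240.0):
    """Project a 3D boundary vertex into a sensor's image plane using a
    simple pinhole model (illustrative intrinsics)."""
    rot, trans = cam_pose  # world -> camera: 3x3 row-major rotation, translation
    x = sum(rot[0][k] * p_world[k] for k in range(3)) + trans[0]
    y = sum(rot[1][k] * p_world[k] for k in range(3)) + trans[1]
    z = sum(rot[2][k] * p_world[k] for k in range(3)) + trans[2]
    # perspective divide, then shift to pixel coordinates
    return (focal * x / z + cx, focal * y / z + cy)

# camera aligned with the world frame, looking along +z
identity = ([[1, 0, 0], [0, 1, 0], [0, 0, 1]], [0.0, 0.0, 0.0])
u, v = project_point((0.0, 0.0, 2.0), identity)  # lands at the principal point
```

Because all sensors share one coordinate frame, the same world-frame boundary vertices can be projected through any sensor's pose, which is what makes the multi-vantage diagnostic view possible.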
- the first sensor 12 does not necessarily need to be a safety device, but could be a web or mobile camera, for instance, whose FOV overlaps with the second sensor 12 and can be registered to a common coordinate system.
- video or images of multiple views of a configured boundary 18 can be compared to assess the location and size of shadowed areas (e.g., S 1 in FIG. 7 ).
- Yet another advantage of using a common coordinate reference is that overlapping boundaries 18 can be consolidated, which can in turn lead to the consolidation of safety outputs.
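Boundary consolidation of the kind described can be sketched, for axis-aligned rectangular boundaries, as grouping mutually overlapping regions so each group can share one safety output instead of one output per boundary. This is an illustrative simplification; real boundaries 18 may be arbitrary contours or volumes:

```python
def rects_overlap(r1, r2):
    """Do two axis-aligned rectangles (x1, y1, x2, y2) overlap?"""
    (x1, y1, x2, y2), (u1, v1, u2, v2) = r1, r2
    return x1 < u2 and u1 < x2 and y1 < v2 and v1 < y2

def consolidate(boundaries):
    """Group overlapping rectangular boundaries; each resulting group
    can be mapped to a single consolidated safety output."""
    groups = []
    for r in boundaries:
        # collect every existing group this boundary touches
        merged = [g for g in groups if any(rects_overlap(r, m) for m in g)]
        for g in merged:
            groups.remove(g)
        groups.append(sum(merged, []) + [r])
    return groups

bs = [(0, 0, 2, 2), (1, 1, 3, 3), (10, 10, 12, 12)]
groups = consolidate(bs)  # two groups: the two overlapping rects, and the third
```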
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/298,416 US20120123563A1 (en) | 2010-11-17 | 2011-11-17 | Method and Apparatus for Monitoring Zones |
US15/793,668 US10841539B2 (en) | 2010-11-17 | 2017-10-25 | Method and apparatus for monitoring zones |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US41476110P | 2010-11-17 | 2010-11-17 | |
US13/298,416 US20120123563A1 (en) | 2010-11-17 | 2011-11-17 | Method and Apparatus for Monitoring Zones |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/793,668 Continuation US10841539B2 (en) | 2010-11-17 | 2017-10-25 | Method and apparatus for monitoring zones |
Publications (1)
Publication Number | Publication Date |
---|---|
US20120123563A1 true US20120123563A1 (en) | 2012-05-17 |
Family
ID=45418762
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/298,416 Abandoned US20120123563A1 (en) | 2010-11-17 | 2011-11-17 | Method and Apparatus for Monitoring Zones |
US15/793,668 Active 2032-08-24 US10841539B2 (en) | 2010-11-17 | 2017-10-25 | Method and apparatus for monitoring zones |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/793,668 Active 2032-08-24 US10841539B2 (en) | 2010-11-17 | 2017-10-25 | Method and apparatus for monitoring zones |
Country Status (5)
Country | Link |
---|---|
US (2) | US20120123563A1 (ja) |
EP (1) | EP2641236A1 (ja) |
JP (1) | JP5883881B2 (ja) |
CN (1) | CN103415876B (ja) |
WO (1) | WO2012068329A1 (ja) |
Cited By (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090185719A1 (en) * | 2008-01-21 | 2009-07-23 | The Boeing Company | Modeling motion capture volumes with distance fields |
CN102956084A (zh) * | 2012-09-16 | 2013-03-06 | 中国安全生产科学研究院 | 三维空间安全防护系统 |
US20140044325A1 (en) * | 2012-08-09 | 2014-02-13 | Hologic, Inc. | System and method of overlaying images of different modalities |
JP2014504394A (ja) * | 2010-11-17 | 2014-02-20 | オムロン サイエンティフィック テクノロジーズ, インコーポレイテッド | ゾーンを監視する方法及び装置 |
US20140348385A1 (en) * | 2013-05-21 | 2014-11-27 | Transocean Sedco Forex Ventures Limited | Computer vision collision avoidance in drilling operations |
FR3007176A1 (fr) * | 2013-06-18 | 2014-12-19 | Airbus Operations Sas | Dispositif, systeme et procede d’escorte pour un aeronef au sol |
WO2015126672A1 (en) * | 2014-02-19 | 2015-08-27 | Enlighted, Inc. | Motion tracking |
US20170057088A1 (en) * | 2015-08-31 | 2017-03-02 | Fanuc Corporation | Robot system using a vision sensor |
EP3142088A1 (de) * | 2015-09-11 | 2017-03-15 | Sick Ag | Verfahren zum einstellen von mehreren teilbereichen eines gewünschten schutzbereichs |
US9646376B2 (en) | 2013-03-15 | 2017-05-09 | Hologic, Inc. | System and method for reviewing and analyzing cytological specimens |
US20170252939A1 (en) * | 2014-08-26 | 2017-09-07 | Keith Blenkinsopp | Productivity enhancement for band saw |
EP3217195A1 (de) * | 2016-03-08 | 2017-09-13 | Leuze electronic GmbH + Co KG | Optischer sensor |
IT201600070878A1 (it) * | 2016-07-07 | 2018-01-07 | Datalogic IP Tech Srl | Metodo per configurare una rete di scanner laser di sicurezza all’interno di un’area di lavoro |
US10016898B2 (en) * | 2015-06-30 | 2018-07-10 | Fanuc Corporation | Robot system using a vision sensor |
US10127183B2 (en) * | 2016-06-30 | 2018-11-13 | Fisher Controls International Llc | Systems and methods for provisioning devices operating in industrial automation environments |
US20190007659A1 (en) * | 2017-06-28 | 2019-01-03 | Sick Ag | Sensor for securing a machine |
US20190145577A1 (en) * | 2016-05-12 | 2019-05-16 | Kando Innovation Limited | Enhanced safety attachment for cutting machine |
US20190340909A1 (en) * | 2018-05-02 | 2019-11-07 | Rockwell Automation Technologies, Inc. | Advanced industrial safety notification systems |
US10544898B2 (en) * | 2016-01-12 | 2020-01-28 | Pilz Gmbh & Co. Kg | Safety device and method for monitoring a machine |
WO2020118140A1 (en) * | 2018-12-07 | 2020-06-11 | Schlumberger Technology Corporation | Zone management system and equipment interlocks |
US10798111B2 (en) * | 2016-09-14 | 2020-10-06 | International Business Machines Corporation | Detecting intrusion attempts in data transmission sessions |
US10885758B2 (en) | 2018-11-20 | 2021-01-05 | Transocean Sedeo Forex Ventures Limited | Proximity-based personnel safety system and method |
US10890060B2 (en) | 2018-12-07 | 2021-01-12 | Schlumberger Technology Corporation | Zone management system and equipment interlocks |
US10907466B2 (en) | 2018-12-07 | 2021-02-02 | Schlumberger Technology Corporation | Zone management system and equipment interlocks |
CN112329621A (zh) * | 2020-11-04 | 2021-02-05 | 青岛以萨数据技术有限公司 | 一种异常行为预警数据的处理方法、系统、终端及介质 |
US10969762B2 (en) * | 2018-06-07 | 2021-04-06 | Sick Ag | Configuring a hazard zone monitored by a 3D sensor |
CN113343856A (zh) * | 2021-06-09 | 2021-09-03 | 北京容联易通信息技术有限公司 | 一种图像识别的方法及系统 |
US11126857B1 (en) * | 2014-09-30 | 2021-09-21 | PureTech Systems Inc. | System and method for object falling and overboarding incident detection |
US20210341612A1 (en) * | 2018-03-28 | 2021-11-04 | Nec Corporation | Monitoring control device, monitoring system, monitoring control method, and non-transitory computer-readable medium with program stored therein |
US11174989B2 (en) * | 2018-08-15 | 2021-11-16 | Sick Ag | Sensor arrangement and method of securing a monitored zone |
US11216005B1 (en) * | 2020-10-06 | 2022-01-04 | Accenture Global Solutions Limited | Generating a point cloud capture plan |
US11335182B2 (en) * | 2016-06-22 | 2022-05-17 | Outsight | Methods and systems for detecting intrusions in a monitored volume |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103200393B (zh) * | 2013-04-02 | 2016-03-16 | 天津市亚安科技股份有限公司 | 一种实现视频监控区域扫描的方法及装置 |
US9256944B2 (en) * | 2014-05-19 | 2016-02-09 | Rockwell Automation Technologies, Inc. | Integration of optical area monitoring with industrial machine control |
EP3380865A4 (en) * | 2015-11-25 | 2019-08-07 | VHS IP Pty Ltd | SAFETY DEVICE FOR WORKING PLACE WITH LIDAR |
JP6728842B2 (ja) * | 2016-03-24 | 2020-07-22 | オムロン株式会社 | 光学計測装置 |
TWI682368B (zh) * | 2018-07-03 | 2020-01-11 | 緯創資通股份有限公司 | 利用多維度感測器資料之監控系統及監控方法 |
WO2020041761A1 (en) * | 2018-08-24 | 2020-02-27 | Lutron Technology Company Llc | Occupant detection device |
JP6529062B1 (ja) * | 2019-02-27 | 2019-06-12 | 株式会社 テクノミライ | デジタルアキュレート・セキュリティシステム、方法及びプログラム |
US20220180002A1 (en) * | 2019-03-15 | 2022-06-09 | Omron Corporation | Safety device configuration cloning |
CN109816910A (zh) * | 2019-03-29 | 2019-05-28 | 杭州涂鸦信息技术有限公司 | 一种智能监测设备、系统及门铃 |
CN110111516B (zh) * | 2019-04-17 | 2021-11-23 | 中广核研究院有限公司 | 一种核电站厂区安全保卫方法及系统 |
CN111832357A (zh) * | 2019-04-19 | 2020-10-27 | 苏州涟漪信息科技有限公司 | 一种移动事件检测方法及装置 |
CN110362923B (zh) * | 2019-07-16 | 2021-06-01 | 成都奥伦达科技有限公司 | 基于三维可视域分析的三维监控覆盖率方法及监控安装方法和监控系统 |
WO2021028910A1 (en) * | 2019-08-12 | 2021-02-18 | Nanomotion Ltd | A gimbal apparatus system and method for automated vehicles |
US20210284335A1 (en) * | 2020-03-16 | 2021-09-16 | Asylon, Inc. | Automated alert system using unmanned aerial vehicles |
CN112040131B (zh) * | 2020-09-08 | 2022-08-26 | 湖南两湖机电科技有限公司 | 基于EtherCAT工业以太网总线的智能相机及其控制方法 |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100053330A1 (en) * | 2008-08-26 | 2010-03-04 | Honeywell International Inc. | Security system using ladar-based sensors |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US818353A (en) | 1904-08-04 | 1906-04-17 | James M Cole | Means for preserving foods. |
JPH04234287A (ja) * | 1990-12-28 | 1992-08-21 | Sony Corp | 映像信号切換装置 |
US6081606A (en) * | 1996-06-17 | 2000-06-27 | Sarnoff Corporation | Apparatus and a method for detecting motion within an image sequence |
JPH11165291A (ja) * | 1997-12-02 | 1999-06-22 | Yaskawa Electric Corp | 安全監視装置および方法 |
US7522186B2 (en) * | 2000-03-07 | 2009-04-21 | L-3 Communications Corporation | Method and apparatus for providing immersive surveillance |
US7200246B2 (en) * | 2000-11-17 | 2007-04-03 | Honeywell International Inc. | Object detection |
JP2003025859A (ja) * | 2001-07-11 | 2003-01-29 | Kubota Corp | 作業機 |
WO2003025859A1 (fr) | 2001-09-17 | 2003-03-27 | National Institute Of Advanced Industrial Science And Technology | Dispositif d'interface |
JP3749945B2 (ja) * | 2001-11-27 | 2006-03-01 | 独立行政法人産業技術総合研究所 | 空間マーキング装置 |
JP4100934B2 (ja) * | 2002-02-28 | 2008-06-11 | シャープ株式会社 | 複合カメラシステム、ズームカメラ制御方法およびズームカメラ制御プログラム |
JP4035610B2 (ja) | 2002-12-18 | 2008-01-23 | 独立行政法人産業技術総合研究所 | インタフェース装置 |
JP4568009B2 (ja) * | 2003-04-22 | 2010-10-27 | パナソニック株式会社 | カメラ連携による監視装置 |
JP4301051B2 (ja) * | 2004-03-24 | 2009-07-22 | 三菱電機株式会社 | 港湾監視システム |
JP2006059185A (ja) * | 2004-08-20 | 2006-03-02 | Rise Corp | データ管理システム及び方法 |
JP4205036B2 (ja) * | 2004-10-04 | 2009-01-07 | 日本電信電話株式会社 | 物品管理システム、方法及びプログラム |
JP2006279927A (ja) * | 2005-03-01 | 2006-10-12 | Omron Corp | 監視制御装置、監視システム、監視方法、プログラムおよび記録媒体 |
US8471910B2 (en) * | 2005-08-11 | 2013-06-25 | Sightlogix, Inc. | Methods and apparatus for providing fault tolerance in a surveillance system |
DE102005063217C5 (de) | 2005-12-22 | 2022-08-18 | Pilz Gmbh & Co. Kg | Verfahren zum Konfigurieren einer Überwachungseinrichtung zum Überwachen eines Raumbereichsund entsprechende Überwachungseinrichtung |
JP2007249722A (ja) * | 2006-03-17 | 2007-09-27 | Hitachi Ltd | 物体検知装置 |
CN100446568C (zh) * | 2006-09-15 | 2008-12-24 | 杭州华三通信技术有限公司 | 视频监控设备及方法 |
CN101119478A (zh) * | 2007-05-09 | 2008-02-06 | 上海天卫通信科技有限公司 | 双镜头视频监控中的视角自动配置系统和方法 |
US7965384B2 (en) * | 2007-09-27 | 2011-06-21 | Omron Scientific Technologies, Inc. | Clutter rejection in active object detection systems |
DE102007053812A1 (de) * | 2007-11-12 | 2009-05-14 | Robert Bosch Gmbh | Konfigurationsmodul für ein Videoüberwachungssystem, Überwachungssystem mit dem Konfigurationsmodul, Verfahren zur Konfiguration eines Videoüberwachungssystems sowie Computerprogramm |
US8576282B2 (en) * | 2008-12-12 | 2013-11-05 | Honeywell International Inc. | Security system with operator-side privacy zones |
CN101489120A (zh) * | 2009-01-21 | 2009-07-22 | 北京中星微电子有限公司 | 实现视频监控区域遮挡的系统和装置以及方法 |
CN103415876B (zh) * | 2010-11-17 | 2017-03-22 | 欧姆龙科学技术公司 | 一种用于监控区域的方法和设备 |
- 2011
- 2011-11-17 CN CN201180065135.0A patent/CN103415876B/zh not_active Expired - Fee Related
- 2011-11-17 JP JP2013539996A patent/JP5883881B2/ja not_active Expired - Fee Related
- 2011-11-17 WO PCT/US2011/061121 patent/WO2012068329A1/en active Application Filing
- 2011-11-17 US US13/298,416 patent/US20120123563A1/en not_active Abandoned
- 2011-11-17 EP EP11802190.6A patent/EP2641236A1/en not_active Ceased
- 2017
- 2017-10-25 US US15/793,668 patent/US10841539B2/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100053330A1 (en) * | 2008-08-26 | 2010-03-04 | Honeywell International Inc. | Security system using ladar-based sensors |
Cited By (51)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8452052B2 (en) * | 2008-01-21 | 2013-05-28 | The Boeing Company | Modeling motion capture volumes with distance fields |
US20090185719A1 (en) * | 2008-01-21 | 2009-07-23 | The Boeing Company | Modeling motion capture volumes with distance fields |
US10841539B2 (en) | 2010-11-17 | 2020-11-17 | Omron Scientific Technologies, Inc. | Method and apparatus for monitoring zones |
JP2014504394A (ja) * | 2010-11-17 | 2014-02-20 | オムロン サイエンティフィック テクノロジーズ, インコーポレイテッド | ゾーンを監視する方法及び装置 |
US20140044325A1 (en) * | 2012-08-09 | 2014-02-13 | Hologic, Inc. | System and method of overlaying images of different modalities |
US9076246B2 (en) * | 2012-08-09 | 2015-07-07 | Hologic, Inc. | System and method of overlaying images of different modalities |
CN102956084A (zh) * | 2012-09-16 | 2013-03-06 | 中国安全生产科学研究院 | 三维空间安全防护系统 |
US9646376B2 (en) | 2013-03-15 | 2017-05-09 | Hologic, Inc. | System and method for reviewing and analyzing cytological specimens |
EP3000229A1 (en) * | 2013-05-21 | 2016-03-30 | Transocean Sedco Forex Ventures Limited | Computer vision collision avoidance in drilling operations |
EP3000229A4 (en) * | 2013-05-21 | 2017-05-03 | Transocean Sedco Forex Ventures Limited | Computer vision collision avoidance in drilling operations |
US20160292513A1 (en) * | 2013-05-21 | 2016-10-06 | Transocean Sedco Forex Ventures Limited | Computer vision collision avoidance in drilling operations |
US20140348385A1 (en) * | 2013-05-21 | 2014-11-27 | Transocean Sedco Forex Ventures Limited | Computer vision collision avoidance in drilling operations |
US9396398B2 (en) * | 2013-05-21 | 2016-07-19 | Transocean Secco Forex Ventures Limited | Computer vision collision avoidance in drilling operations |
US10402662B2 (en) * | 2013-05-21 | 2019-09-03 | Transocean Sedco Forex Ventures Limited | Computer vision collision avoidance in drilling operations |
WO2014190081A1 (en) * | 2013-05-21 | 2014-11-27 | Transocean Sedco Forex Ventures Limited | Computer vision collision avoidance in drilling operations |
FR3007176A1 (fr) * | 2013-06-18 | 2014-12-19 | Airbus Operations Sas | Dispositif, systeme et procede d’escorte pour un aeronef au sol |
US10520209B2 (en) | 2014-02-19 | 2019-12-31 | Enlighted, Inc. | Motion tracking |
US9671121B2 (en) | 2014-02-19 | 2017-06-06 | Enlighted, Inc. | Motion tracking |
WO2015126672A1 (en) * | 2014-02-19 | 2015-08-27 | Enlighted, Inc. | Motion tracking |
US20170252939A1 (en) * | 2014-08-26 | 2017-09-07 | Keith Blenkinsopp | Productivity enhancement for band saw |
US10603808B2 (en) * | 2014-08-26 | 2020-03-31 | Kando Innovation Limited | Productivity enhancement for band saw |
US11126857B1 (en) * | 2014-09-30 | 2021-09-21 | PureTech Systems Inc. | System and method for object falling and overboarding incident detection |
US10016898B2 (en) * | 2015-06-30 | 2018-07-10 | Fanuc Corporation | Robot system using a vision sensor |
US9782897B2 (en) * | 2015-08-31 | 2017-10-10 | Fanuc Corporation | Robot system using a vision sensor |
US20170057088A1 (en) * | 2015-08-31 | 2017-03-02 | Fanuc Corporation | Robot system using a vision sensor |
US11333790B2 (en) * | 2015-09-11 | 2022-05-17 | Sick Ag | Method of setting a plurality of part regions of a desired protected zone |
US20170075027A1 (en) * | 2015-09-11 | 2017-03-16 | Sick Ag | Method of setting a plurality of part regions of a desired protected zone |
EP3142088A1 (de) * | 2015-09-11 | 2017-03-15 | Sick Ag | Verfahren zum einstellen von mehreren teilbereichen eines gewünschten schutzbereichs |
US10544898B2 (en) * | 2016-01-12 | 2020-01-28 | Pilz Gmbh & Co. Kg | Safety device and method for monitoring a machine |
EP3217195A1 (de) * | 2016-03-08 | 2017-09-13 | Leuze electronic GmbH + Co KG | Optischer sensor |
US20190145577A1 (en) * | 2016-05-12 | 2019-05-16 | Kando Innovation Limited | Enhanced safety attachment for cutting machine |
US11181232B2 (en) * | 2016-05-12 | 2021-11-23 | Kando Innovation Limited | Enhanced safety attachment for cutting machine |
US11335182B2 (en) * | 2016-06-22 | 2022-05-17 | Outsight | Methods and systems for detecting intrusions in a monitored volume |
US10127183B2 (en) * | 2016-06-30 | 2018-11-13 | Fisher Controls International Llc | Systems and methods for provisioning devices operating in industrial automation environments |
IT201600070878A1 (it) * | 2016-07-07 | 2018-01-07 | Datalogic IP Tech Srl | Metodo per configurare una rete di scanner laser di sicurezza all’interno di un’area di lavoro |
US10798111B2 (en) * | 2016-09-14 | 2020-10-06 | International Business Machines Corporation | Detecting intrusion attempts in data transmission sessions |
US20190007659A1 (en) * | 2017-06-28 | 2019-01-03 | Sick Ag | Sensor for securing a machine |
US20210341612A1 (en) * | 2018-03-28 | 2021-11-04 | Nec Corporation | Monitoring control device, monitoring system, monitoring control method, and non-transitory computer-readable medium with program stored therein |
US20190340909A1 (en) * | 2018-05-02 | 2019-11-07 | Rockwell Automation Technologies, Inc. | Advanced industrial safety notification systems |
US10832548B2 (en) * | 2018-05-02 | 2020-11-10 | Rockwell Automation Technologies, Inc. | Advanced industrial safety notification systems |
US10969762B2 (en) * | 2018-06-07 | 2021-04-06 | Sick Ag | Configuring a hazard zone monitored by a 3D sensor |
US11174989B2 (en) * | 2018-08-15 | 2021-11-16 | Sick Ag | Sensor arrangement and method of securing a monitored zone |
US10885758B2 (en) | 2018-11-20 | 2021-01-05 | Transocean Sedeo Forex Ventures Limited | Proximity-based personnel safety system and method |
US11763653B2 (en) | 2018-11-20 | 2023-09-19 | Transocean Sedco Forex Ventures Limited | Proximity-based personnel safety system and method |
US11238717B2 (en) | 2018-11-20 | 2022-02-01 | Transocean Sedco Forex Ventures Limited | Proximity-based personnel safety system and method |
WO2020118140A1 (en) * | 2018-12-07 | 2020-06-11 | Schlumberger Technology Corporation | Zone management system and equipment interlocks |
US10907466B2 (en) | 2018-12-07 | 2021-02-02 | Schlumberger Technology Corporation | Zone management system and equipment interlocks |
US10890060B2 (en) | 2018-12-07 | 2021-01-12 | Schlumberger Technology Corporation | Zone management system and equipment interlocks |
US11216005B1 (en) * | 2020-10-06 | 2022-01-04 | Accenture Global Solutions Limited | Generating a point cloud capture plan |
CN112329621A (zh) * | 2020-11-04 | 2021-02-05 | 青岛以萨数据技术有限公司 | 一种异常行为预警数据的处理方法、系统、终端及介质 |
CN113343856A (zh) * | 2021-06-09 | 2021-09-03 | 北京容联易通信息技术有限公司 | 一种图像识别的方法及系统 |
Also Published As
Publication number | Publication date |
---|---|
US10841539B2 (en) | 2020-11-17 |
EP2641236A1 (en) | 2013-09-25 |
WO2012068329A1 (en) | 2012-05-24 |
JP2014504394A (ja) | 2014-02-20 |
JP5883881B2 (ja) | 2016-03-15 |
CN103415876A (zh) | 2013-11-27 |
CN103415876B (zh) | 2017-03-22 |
US20180048870A1 (en) | 2018-02-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10841539B2 (en) | Method and apparatus for monitoring zones | |
US9976700B2 (en) | Area monitoring sensor | |
Wang | Collaborative robot monitoring and control for enhanced sustainability | |
EP2685421B1 (en) | Determining objects present in a process control system | |
JP6367102B2 (ja) | 監視システム | |
US10679504B2 (en) | Applications of a plurality of safety laser scanners combined with a camera or mobile computer | |
US9933510B2 (en) | Safety scanner and optical safety system | |
JP6851137B2 (ja) | 安全スキャナ | |
US10436901B2 (en) | Optoelectronic sensor and method for detecting objects | |
JP6866646B2 (ja) | センサ支援システム、端末、センサおよびセンサ支援方法 | |
US20190378264A1 (en) | Method of Securing a Hazard Zone | |
US11815598B2 (en) | Anti-collision and motion monitoring, control, and alerting systems and methods | |
CN102650883A (zh) | 无人飞行载具控制系统及方法 | |
ES2569202T3 (es) | Disposición y el método para hacer funcionar un sistema, un programa informático correspondiente y un medio de almacenamiento legible por ordenador correspondiente | |
CN104034734B (zh) | 一种卷扬机钢缆检测装置及其方法 | |
CN108908402A (zh) | 一种机器人硬件的检测方法及系统 | |
US11586225B2 (en) | Mobile device, mobile body control system, mobile body control method, and program | |
KR101756092B1 (ko) | 애플리케이션 연동 차량용 초음파 센서 모듈 및 그에 의한 차량 탐지 방법 | |
JP6367100B2 (ja) | エリア監視センサ | |
KR102217608B1 (ko) | Qr코드를 이용한 hmi 모니터링 시스템 | |
JP6355543B2 (ja) | 監視システム | |
JP2020205649A (ja) | 安全スキャナ | |
KR20230133982A (ko) | 작업 관리 시스템 및 작업 기계 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: OMRON SCIENTIFIC TECHNOLOGIES, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DRINKARD, JOHN;REEL/FRAME:027573/0122 Effective date: 20111220 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION |