WO2022140460A1 - Identification and monitoring of productivity, health and safety risks in industrial sites

Identification and monitoring of productivity, health and safety risks in industrial sites

Info

Publication number
WO2022140460A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
risk
industrial site
industrial
workers
Prior art date
Application number
PCT/US2021/064714
Other languages
English (en)
Inventor
Lai Him Matthew Man
Mohammad SOLTANI
Seyedfarid Mirahadi
Jiazi Liu
Original Assignee
Procore Technologies, Inc.
Priority date
Filing date
Publication date
Priority claimed from US17/129,355 (published as US20210109497A1)
Application filed by Procore Technologies, Inc.
Priority to AU2021409557A (AU2021409557A1)
Priority to EP21912086.2A (EP4264383A1)
Publication of WO2022140460A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/4183 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by data acquisition, e.g. workpiece identification
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/31 From computer integrated manufacturing till monitoring
    • G05B2219/31449 Monitor workflow, to optimize business, industrial processes
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/31 From computer integrated manufacturing till monitoring
    • G05B2219/31465 Determine which variables of the system to be monitored

Definitions

  • One technical field includes methods for monitoring and tracking persons, physical assets, and deliveries of parts and materials in industrial sites. Another technical field includes methods for determining construction risks that are excessive or greater than typical risks. Another technical field is computer-implemented machine vision. Yet another technical field is artificial intelligence-based processing.
  • Managing large industrial sites often includes considering the safety, efficiency, and accountability of the persons present on the sites. This may also include considering the efficiency and accountability of the activities taking place at the sites and managing construction risks.
  • the term safety refers herein to the safety of workers, bystanders, and pedestrians present at or near the industrial site, while the term efficiency refers to the optimization of resources such as equipment assets in the industrial site.
  • the term accountability refers to the ability to validate the constructed product against the specification of the industrial project and the billing accuracy for the work done.
  • the construction risk usually refers to the construction risks that are excessive or greater than the typical construction risks.
  • Managing large industrial sites may also include comparing the schedules for parts and materials deliveries with the actual needs. This may also include comparing the employee schedules with the workforce needs required to keep the projects on track.
  • Traditionally, safety and efficiency are enforced on industrial sites by managers and supervisors. However, the managers and supervisors cannot observe everything that happens at the sites all the time, especially if the industrial sites are extensive and widespread and many activities occur simultaneously at the site.
  • the systems may include computer servers that communicate via communications networks with many different devices that collect data from the sites. But, due to the complexity and non-uniformity of the collected data, the computerized systems rarely provide the results in a timely fashion.
  • FIG. 1 illustrates an example computer system that may be used to implement an embodiment.
  • FIG. 2 depicts an example map showing an industrial site and locations of video cameras implemented in the site.
  • FIG. 3 depicts a combined display of several video streams provided to a computer from several video cameras.
  • FIG. 4A depicts examples of various risks.
  • FIG. 4B depicts examples of possible reasons for safety risks posed by steel erection activities.
  • FIG. 4C depicts examples of various risks assessment approaches.
  • FIG. 4D depicts examples of detecting safety risks.
  • FIG. 4E depicts an example workflow for identifying and monitoring health and safety risks in industrial sites.
  • FIG. 4F depicts an example workflow integration of a decision support system.
  • FIG. 5 depicts an example workflow executed by a decision support system.
  • FIG. 6 depicts an example diagram for performing persons and equipment detection.
  • FIG. 7 depicts an example diagram for training a machine learning system.
  • FIG. 8 depicts a deployment example of a decision support system.
  • FIG. 9A and FIG. 9B depict example markers that can be affixed to hard hats.
  • FIG. 10A schematically illustrates a top plan view of an example physical site that is equipped with cameras mounted along an outside perimeter of the physical site.
  • FIG. 10B schematically illustrates a top plan view of an example physical site that is equipped with cameras mounted along an inside perimeter of the physical site.
  • FIG. 11 is a block diagram that illustrates an example computer system with which an embodiment may be implemented.
  • a computer-implemented approach for identifying and monitoring productivity, health and safety risks at industrial sites is presented.
  • the approach may include computer-implemented machine learning processes, statistical analysis, computer models, operations research models, and industrial domain analysis; and may be implemented in a distributed computer system executing as a decision support system.
  • industrial sites with which embodiments can be used include construction sites, indoor industrial sites, outdoor industrial sites, underground industrial sites, warehouses, oil & gas facilities, mining sites, shipyards, ports, and the like.
  • a computing device receives a plurality of data inputs from a plurality of data input devices implemented in an industrial site.
  • data input devices include video cameras, digital cameras, sensors, and other devices configured to collect data specific to industrial sites.
  • the data inputs may include video recordings, digital photographs, sensor data, business data such as schedules, production output expectation charts, specifications provided by users or other systems, and the like.
  • the plurality of data inputs further comprises data inputs manually provided by users, managers, and/or service team members.
  • a user may type input information directly on his/her keyboard communicatively coupled with the computing device or associated servers.
  • the user may also import documents to be crawled and/or processed using any optical character recognition (OCR) approach or any context recognition software application.
  • the data input devices may be deployed at different locations of the industrial site.
  • the devices may be configured to collect data from the sites and transmit the collected data to one or more computing devices.
  • Some of the data collection devices may be stationary, while others may be mobile.
  • the stationary data collection devices may be mounted on, for example, poles, walls, gates, doors, ceilings, and the like.
  • the mobile data collection devices may be installed on, for example, trucks, trailers, conveyors, cranes, personal protective equipment (hard hats, vests, boots), equipment and gear worn by contractors and workers present on the industrial site, and the like.
  • the data collection devices may send the collected data to the computer device via one or more communications networks, including Internet-based networks, local-area networks (LANs), wide area networks (WANs), and the like.
  • Some of the data collection devices may be equipped with computational resources sufficient to preliminarily analyze and convert the collected data if needed. After performing the preliminary analysis, the devices may transmit the analyzed data to the computing devices to perform additional analysis, such as the video analytics, image analysis, sensor data analysis, data synthesis, and the like. Some complex computations may be performed in a cloud computing system.
  • the machine learning models may be executed in a cloud (e.g., AWS, GCP, Azure) and/or on an edge device (e.g., NVIDIA Jetson, Google Coral TPU, mobile phone NPUs).
  • the machine learning models are adapted to the corresponding hardware and optimized accordingly.
  • an edge device may be used to perform the computing for lighter tasks, while cloud computing may be used for tasks that involve very heavy computations (e.g., multiple machine learning algorithms working together to perform an activity recognition).
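
As a rough illustration of this edge/cloud split, the sketch below routes tasks by an estimated compute cost; the cost model, the threshold, and the task fields are assumptions made for this example and are not part of the disclosure.

```python
# Hypothetical sketch: route inference tasks to edge hardware or the cloud
# based on an estimated computational cost. Thresholds and task names are
# illustrative assumptions, not values from the disclosure.

EDGE_COST_LIMIT = 50.0  # arbitrary cost units an edge device can handle


def estimate_cost(task: dict) -> float:
    """Rough cost estimate: number of models times frames per second."""
    return task["num_models"] * task["frames_per_second"]


def route_task(task: dict) -> str:
    """Return 'edge' for light tasks, 'cloud' for heavy multi-model tasks."""
    return "edge" if estimate_cost(task) <= EDGE_COST_LIMIT else "cloud"


if __name__ == "__main__":
    ppe_check = {"name": "ppe_detection", "num_models": 1, "frames_per_second": 15}
    activity = {"name": "activity_recognition", "num_models": 4, "frames_per_second": 30}
    print(route_task(ppe_check))   # edge
    print(route_task(activity))    # cloud
```
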
  • the computing device selects a data model from one or more data models.
  • the data models may include machine learning models and may implement neural networks configured to process input data using artificial intelligence approaches.
  • the data models are computer-designed and programmed models trained on training data specific to industrial sites and programmed to detect activities or objects associated with workers or equipment present at the industrial sites.
  • the data models may be implemented as, for example, machine learning models that integrate one or more artificial intelligence approaches. Training of the data model may involve providing, as input, the training data that teaches the model to identify, with a relatively high probability, workers, objects, pieces of equipment, machinery, trucks, excavators, materials, construction elements, etc., that may be present at industrial sites.
  • Selection of the data model, from the plurality of models, may be based on the type of the industrial site, and based on the training that the model was subjected to. For example, if the approach is implemented for construction sites in which the workers and heavy machinery are expected to be present, then the selected model may be the model that has been trained and configured for detecting the workers and heavy machinery at the industrial sites.
  • the computing device applies the plurality of data inputs to the data model to cause the data model to evaluate the inputs and generate output data specifying whether the plurality of data inputs indicates one or more activities or objects associated with the workers or the equipment present at the industrial site.
  • applying, by the computing device, the plurality of data inputs to the data model, and evaluating the data model with those inputs to produce output data, comprises identifying, in the plurality of data inputs (e.g., in a plurality of digital images), the workers working at the industrial site and the pieces of equipment present at the industrial site.
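
The following is a minimal sketch of that evaluation step, assuming a generic detection-model interface; the `Detection` fields, the `model.detect()` call, the label set, and the probability threshold are illustrative assumptions, not a specific library API.

```python
# Hypothetical sketch of applying data inputs (video frames) to a selected
# detection model. The load/detect interface is assumed for illustration.
from dataclasses import dataclass
from typing import List


@dataclass
class Detection:
    label: str         # e.g. "worker", "excavator", "crane"
    probability: float  # detection confidence in [0, 1]
    box: tuple          # (x, y, width, height) in image coordinates


WORKER_OR_EQUIPMENT = {"worker", "excavator", "crane", "truck"}


def apply_model(model, frames) -> List[Detection]:
    """Evaluate the model on each frame and gather all detections."""
    detections: List[Detection] = []
    for frame in frames:
        detections.extend(model.detect(frame))
    return detections


def indicates_workers_or_equipment(detections: List[Detection],
                                   min_probability: float = 0.5) -> bool:
    """Output data: do the inputs indicate workers or equipment at the site?"""
    return any(d.label in WORKER_OR_EQUIPMENT and d.probability >= min_probability
               for d in detections)
```
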
  • one or more characteristics of the one or more activities or objects are determined.
  • the characteristics may indicate, for example, the placement and location of the objects (or persons) in relation to other objects (or persons).
  • the characteristics may indicate the activities or objects that may cause health or safety risks at the industrial site, including the activities or objects that pose an excessive risk at the industrial site or a risk that is greater than a normal risk at the industrial site.
  • examples of the characteristics may include one or more of: a falling-from-height characteristic, a being-struck-by-object characteristic, a tool-usage-based characteristic, a machine-failure-based characteristic, an object-property characteristic, an object-placement characteristic, a lack-of-protection characteristic, an activity-based characteristic, a size-based characteristic, a proximity-based characteristic, a usage-based characteristic, a potential-injury-based characteristic, a failure-based characteristic, a schedule-based characteristic, a quality-based characteristic, a production-based characteristic, a congestion-based characteristic, or a cleanliness-based characteristic.
  • determining whether the one or more characteristics of the one or more activities or objects cause any health or safety risks at the industrial site is based on determining whether a data repository that stores one or more mappings of risk-prone characteristics onto risks posed in the industrial sites includes the entries for the one or more characteristics. If such entries are found in the repository, then the entries are retrieved, and one or more risk identifiers are extracted from the entries and used to indicate that the health or safety risks are posed in the industrial site.
  • the characteristics determined for the activities and objects may also include probabilities with which the activities and objects have been detected in the inputs (e.g., in the provided frames and pictures) and weights associated with the activities and objects. Based on the probabilities and the weights, safety scores may be computed and used to determine the gravity of the risks.
  • the weights may be described as the relational unsafety impact of each activity/object.
  • the risk may be described as a likelihood multiplied by the consequence of the corresponding event. The detection probability plays the role of the likelihood in this definition, and the consequence is the above-mentioned weight, i.e., the magnitude of impact of each activity/object.
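
A minimal sketch of this scoring idea follows; the per-activity weights (consequences), the label names, and the use of the maximum as the aggregate safety score are all illustrative assumptions.

```python
# Minimal sketch of the risk = likelihood x consequence idea described above.
# Per-activity weights (consequences) and the aggregation rule are illustrative
# assumptions, not values taken from the disclosure.

ACTIVITY_WEIGHTS = {
    "worker_on_ladder": 0.8,        # falling-from-height consequence
    "worker_under_crane": 0.9,      # being-struck-by-object consequence
    "worker_missing_hardhat": 0.6,  # lack-of-protection consequence
}


def detection_risk(label: str, probability: float) -> float:
    """Risk of a single detection: detection probability times its weight."""
    return probability * ACTIVITY_WEIGHTS.get(label, 0.0)


def safety_score(detections) -> float:
    """Aggregate per-detection risks; here the maximum drives the score."""
    return max((detection_risk(lbl, p) for lbl, p in detections), default=0.0)


if __name__ == "__main__":
    frame_detections = [("worker_on_ladder", 0.92), ("worker_missing_hardhat", 0.40)]
    print(round(safety_score(frame_detections), 3))  # 0.736
```
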
  • the computing device determines whether the one or more activities or objects cause any health or safety risks at the industrial site.
  • risks that may be determined based on the characteristics of the activities or objects associated with the workers or the equipment may include one or more of: danger zones and restricted areas related to risks, a falling-from-height risk, a being-struck-by-object risk, a tool-usage-based risk, a machine-failure-based risk, an object-property-based risk, an object-placement risk, a lack-of-protection risk, an activity-based risk, a size-based risk, a proximity-based risk, a usage-based risk, a potential-injury-based risk, a failure-based risk, a schedule-based risk, a quality-based risk, a production-based risk, a congestion-based risk, or a cleanliness-based risk.
  • the danger zones may include streets, intersections, safety zones, and the like.
  • the restricted areas may include unloading zones, crane overhead areas, welding zones, barriers, pathways, channels, and the like. Additional risks may include weather-related risks. For example, a crane may need to stand down due to windy conditions, or workers may be exposed to injuries during slippery winter conditions. Risks may also pertain to safety zones that are dynamic. For example, some risks may be caused by the swing radius of an excavator that is operating or moving. Another risk may be caused by a crane in an operating area where the crane is traversing the industrial site, and the like.
  • the computing device computes safety scores based on the probabilities with which the activities and objects have been detected in the inputs (e.g., in the provided frames and pictures) and weights associated with the activities and objects.
  • the safety scores may be used to determine the gravity of the risks.
  • If the computing device determines that the activities or objects cause the health or safety risks, then the computing device generates one or more notifications that indicate the one or more health or safety risks at the industrial site and transmits the one or more notifications to notification recipients.
  • a graphical representation of the one or more health or safety risks posed at the industrial site may be generated and transmitted to a computer display device to cause the computer display device to generate and display, based on the graphical representation, a graphical user interface depicting the one or more health or safety risks posed at the industrial site.
  • audible and/or vibration-based representations of the one or more health or safety risks posed at the industrial site may be generated and transmitted to a computer device to cause the computer device to generate, based on the audible/vibration representation, warnings or alarm signals that represent the one or more health or safety risks posed at the industrial site.
  • the vibration representations may be transmitted to, for example, smartphones to cause the smartphones to vibrate upon receiving a corresponding signal.
  • a computer-implemented approach for identifying and monitoring risks in industrial sites leverages the benefits and outcomes achieved by implementing the site monitoring approaches introduced in the next section.
  • a computer-implemented approach for monitoring activities in industrial sites is presented.
  • computer-implemented machine learning processes, statistical analysis, computer models, operations research models, and industrial domain analysis may be implemented in a distributed computer system executing as a decision support system.
  • a decision support system receives input data collected from an industrial site by specialized data collection devices.
  • the data collection devices include video cameras, digital sensors, and other types of computer-based data collectors.
  • the devices may be deployed at different locations of an industrial site and may be configured to collect data and transmit the collected data to a computer.
  • Some of the data collection devices may be equipped with computational resources sufficient to preliminarily analyze the collected data. After performing the preliminary analysis, the devices may transmit the analyzed data to a computer to perform additional analysis, such as the video analytics and data synthesis. Some complex computations may be performed in a cloud computing system.
  • Some of the data collection devices may be stationary, while others may be mobile.
  • the stationary data collection devices may be mounted on, for example, poles, walls, gates, doors, ceilings, and the like.
  • the mobile data collection devices may be installed on, for example, trucks, trailers, conveyors, cranes, and the like.
  • the data collection devices may send the collected data to a computer via power cables and/or one or more communications networks, including Internet-based networks, local-area networks (LANs), wide area networks (WANs), and the like.
  • Upon receiving the collected data, a computer processes the data to generate output data. The processing may be performed using a machine learning approach, computer modeling, statistical analysis, and other types of data processing.
  • the output data may include automated safety alerts, performance metrics, and records of activities provided for determining the status of the industrial projects.
  • a decision support and data processing method for monitoring activities on industrial sites is configured to employ a machine learning approach to process the data received from a distributed network of sensors.
  • the sensors may provide the data expressed in various data formats.
  • the machine learning system may be programmed to process the collected data, and generate outputs that include activity records, activity metrics, and activity-based alerts.
  • the outputs may be used to generate warnings and alarms that may be used to deter safety violations, corruption, and inefficiencies in using mechanical equipment, industrial materials, and other resources.
  • the warnings and alarms may be particularly useful in managing large-scale industrial sites.
  • a decision support system is configured or programmed to generate visual or audible safety-related warnings and recommendations for industrial sites.
  • the system may be configured or programmed to process collections of video streams captured by digital cameras from multiple camera views. It may also be configured or programmed to process sensor-based measurements and photographs captured by cameras.
  • the collections may be provided to the decision support system to cause the decision support system to generate alerts and warnings.
  • the alerts and warnings may be generated by applying, to the collected data, machine learning processing, computer vision processing, and event detection processing.
  • the data processing of image data may be performed by specialized distributed systems that are configured to model industrial equipment.
  • an approach for monitoring activities in industrial sites includes receiving video stream collections captured by multiple cameras.
  • the collected video frames may be displayed on display devices to provide an enhanced level of situational awareness of the activities taking place in the sites. Displaying the multiple views helps in monitoring the whereabouts of the industrial workers and the industrial equipment. This may also include providing measures for improving safety, efficiency and accountability in the industrial sites, and efficiency in using the machines, materials, and other resources. Furthermore, this may include monitoring the persons as they enter and leave the industrial sites. This may also include using the collected data to verify the workers’ timesheets, workers’ overtime entries, and so forth.

2. EXAMPLE SYSTEM ARCHITECTURE

  • FIG. 1 illustrates an example computer system that may be used to implement an embodiment.
  • a computer system comprises components that are implemented at least partially in hardware, such as one or more hardware processors executing program instructions stored in one or more memories as described herein.
  • all functions described herein are intended to indicate operations that are performed using programming in a special-purpose computer or general-purpose computer, in various embodiments.
  • FIG. 1 illustrates only one of many possible arrangements of components configured to execute the programming described herein. Other arrangements may include fewer or different components, and the division of work between the components may vary depending on the arrangement.
  • FIG. 1 depicts a plurality of data input devices 102A, 102B, 104A, 104B, 106A, 106B, one or more computers 110, one or more application storage devices 120 and one or more data storage devices 130.
  • Data input devices may include one or more video cameras 102A, 102B, one or more digital cameras 104A, 104B, and one or more digital sensors 106A, 106B.
  • the data input devices may also include other devices, such as time-card sensors, audio devices, and others.
  • Data input devices may be configured to collect information about persons, objects and activities taking place in an industrial site.
  • video cameras 102A, 102B may be configured or programmed to record video segments depicting persons, trucks, and cranes present in an industrial site, store the recorded video segments, and transmit the recorded video segments to computer 110.
  • Digital cameras 104A, 104B may be configured or programmed to capture digital images. The images may depict, for example, persons, trucks, and cranes present in an industrial site. The images may be transmitted to computer 110.
  • Digital sensors 106A, 106B may be configured or programmed to detect events indicating entering or leaving the industrial site. The sensors may also associate the events with corresponding timestamps, store the associations between the detected events and the timestamps, and transmit the associations to computer 110.
  • Digital sensors 106A, 106B may include, for example, sensors that are configured to detect markings imprinted on or affixed to hard hats or other apparel that industrial workers wear.
  • the markings may include, for example, quick response (“QR”) codes that are imprinted on the hats, apparel or plates, color-coded stickers attached to the hard hats or vests, or tags affixed to apparel.
  • Color-coded stickers or markers, attached to hard hats, or vests, may be used to identify workers present on an industrial site.
  • Color-coded stickers may be either individual stickers or stickers combined with so called certified safety stickers.
  • a certified safety sticker is a sticker that depicts a number, or code, assigned to the worker once the worker has successfully completed a worker orientation and has been certified to work on the job site. Both a color-coded sticker and a certified safety sticker may be combined into one sticker and used for two purposes: to identify a worker present on the industrial site and to indicate that the worker has been certified.
  • color-coded stickers attached to hard hats, or vests are used to count the worker trades, i.e., to count the workers per trade.
  • trades include plumbers, electricians, carpenters, and so forth.
  • Hard hats that workers wear may also be color-coded, and different colors may be assigned to different worker trades. For example, plumbers may wear red hard hats, while electricians may wear blue hard hats.
  • Color-coded stickers and color-coded hard hats, or vests may be used to count the workers and worker trades present on an industrial site.
  • the counting may be performed in many different ways. For example, if different hard hat colors are assigned to different trades, then a count of worker trades may be computed based on the detected hat colors.
  • a count of worker trades may be computed based on the detected sticker colors.
  • the counts may be computed as a combination of the hat colors and the sticker colors. For instance, if the plumbers wear red hard hats with white stickers, then a count of the plumbers present on the industrial site may be computed based on the detected red hard hats with white stickers.
  • the counts may be computed as a combination of vest colors and the sticker colors. For instance, if the carpenters wear red vests with white stickers, then a count of the carpenters present on the industrial site may be computed based on the detected red vests with white stickers.
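
The trade counting described above could be sketched roughly as follows; the colour-to-trade table only extends the examples in the text (red hard hats for plumbers, blue for electricians, red vests for carpenters), and the white-sticker pairing is an assumption.

```python
# Sketch of counting workers per trade from detected colour combinations.
# The colour-to-trade mapping is illustrative and follows the examples in the
# text; it is not a mapping defined by the disclosure.
from collections import Counter

TRADE_BY_COLOURS = {
    ("red_hardhat", "white_sticker"): "plumber",
    ("blue_hardhat", "white_sticker"): "electrician",
    ("red_vest", "white_sticker"): "carpenter",
}


def count_trades(detections) -> Counter:
    """detections: iterable of (gear_colour, sticker_colour) pairs, one per worker."""
    counts = Counter()
    for gear, sticker in detections:
        trade = TRADE_BY_COLOURS.get((gear, sticker))
        if trade is not None:
            counts[trade] += 1
    return counts


if __name__ == "__main__":
    seen = [("red_hardhat", "white_sticker"),
            ("blue_hardhat", "white_sticker"),
            ("red_hardhat", "white_sticker")]
    print(count_trades(seen))  # Counter({'plumber': 2, 'electrician': 1})
```
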
  • the sensors may also detect markings imprinted on plates of industrial trucks.
  • the sensors may collect the information about the detected markings and transmit the collected information to computer 110 for processing.
  • Computer 110 is programmed to receive data collected by cameras and sensors, and process and analyze the received data. Once the collected data is processed, computer 110 may generate output that may include activity records, activity metrics, and activity-based alerts. The output generated by computer 110 may be used directly or indirectly to manage an industrial site. The output may include, for example, messages, warnings, and alarms indicating safety violations, corruption, or inefficiencies.
  • computer 110 is part of a public or a private cloud system. Access to computer 110 may be secured using credentials. The credentials may be provided to a management team or a system administrator.
  • computer 110 may be a privately-owned computing device that is operated on behalf of a management team. It may be implemented within a local network managed by an industrial site or as an independent device outside the local network of the industrial site.
  • Computer 110 may be configured to execute a plurality of processes designed to monitor activities, machines, and persons. Computer 110 may also be configured to generate output which may include activity records, activity metrics, and activity-based alerts for an industrial site. Output generated by computer 110 may be in the form of warnings, alerts or reports. The warnings, alerts or reports may be used to deter safety and security violations, corruption, and inefficiencies in using machines, materials, or equipment.
  • computer 110 includes an input interface 110A that is configured to receive data from data input devices, such as video cameras 102A, 102B, digital cameras 104A, 104B, and digital sensors 106A, 106B.
  • Computer 110 may also include an output interface 110G for outputting data.
  • Computer 110 may use output interface 110G to, for example, transmit the warnings and alarms to data storage device 130, from which the warnings and alarms may be distributed to a management team.
  • Computer 110 may include a data collector 110B that is configured to receive data from input interface 110A.
  • Data collector 110B may also be configured to translate the received data to a particular data format.
  • data collector 110B may be used to convert data from one format to another. If the data received from video camera 102A is in a format that is different from the format in which computer 110 reads the data, then data collector 110B may convert the data received from video camera 102A to the format that computer 110 is using.
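
A minimal sketch of such a format-normalising collector is shown below, assuming hypothetical JSON and CSV payload formats; the converter names and the internal dict layout are illustrative only.

```python
# Hypothetical sketch of a data collector normalising device payloads into a
# single internal format before further processing. The format names and
# converter functions are assumptions made for illustration only.
import json
from typing import Callable, Dict


def from_json(payload: bytes) -> dict:
    return json.loads(payload.decode("utf-8"))


def from_csv_line(payload: bytes) -> dict:
    device_id, timestamp, value = payload.decode("utf-8").strip().split(",")
    return {"device": device_id, "timestamp": timestamp, "value": float(value)}


CONVERTERS: Dict[str, Callable[[bytes], dict]] = {
    "json": from_json,
    "csv": from_csv_line,
}


def collect(payload: bytes, source_format: str) -> dict:
    """Convert a raw payload into the internal dict format used downstream."""
    try:
        return CONVERTERS[source_format](payload)
    except KeyError:
        raise ValueError(f"unsupported input format: {source_format}")


if __name__ == "__main__":
    print(collect(b'{"device": "cam-102A", "event": "frame"}', "json"))
    print(collect(b"sensor-106A,2021-12-21T10:00:00Z,3.7", "csv"))
```
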
  • Computer 110 may further include a machine learning processor 110C configured to execute a machine learning program, algorithm, or process.
  • the machine learning process may be executed using data provided by any of the data input devices 102A- B, 104A-B, and 106A-B.
  • the machine learning process may be executed to enhance and improve the content of the received data.
  • machine learning processor 110C may be configured to process the data provided by the video cameras, digital cameras, and sensors, and generate output in the form of activity records, activity metrics, and activity-based alerts.
  • Computer 110 may also include a data analyzer 110D.
  • Data analyzer 110D may be configured to execute a computer modeling approach, statistical analysis, and other types of processing to generate additional output data.
  • Computer 110 may also include one or more hardware processors 110E configured to execute instructions stored in memory 110F, and to cause instantiating of data collector 110B, machine learning processor 110C, and data analyzer 110D.
  • computer 110 hosts components configured to provide capabilities for identifying and monitoring health and safety risks.
  • the components may be implemented in hardware, software, or a combination of both, and may include, for example, an activity and object detector 110G, a characteristics analyzer 110H, and a health and safety risks identifier 110I.
  • computer 110 implements other components configured to support the method for identifying and monitoring health and safety risks.
  • Activity and object detector 110G may be configured to receive a plurality of data inputs from a plurality of data input devices in an industrial site.
  • the data input devices may include video cameras, digital cameras, sensors, and other devices configured to collect data specific to industrial sites and may be deployed at different locations of the industrial site.
  • the data inputs may include video recordings, digital photographs, sensor data, and the like.
  • Activity and object detector 110G may also be configured to select a data model from one or more data models that have been trained on training data for industrial sites.
  • the data models may be implemented as, for example, machine learning models that integrate one or more artificial intelligence approaches.
  • Activity and object detector 110G may select the data model based on the type of the industrial site and based on the training that the model was subjected to.
  • Activity and object detector 110G may further be configured to apply the data inputs to the data model to cause the data model to evaluate the inputs and generate output data specifying whether the plurality of data inputs indicates one or more activities or objects associated with the workers or the equipment present at the industrial site.
  • Characteristics analyzer 110H may be configured to determine characteristics of the data output. More specifically, in response to determining that the output data indicate the one or more activities or objects associated with the workers or the equipment present at the industrial site, characteristics analyzer 110H may determine one or more characteristics of the one or more activities or objects. Examples of the characteristics were described above.
  • Health and safety risks identifier 110I may be configured to identify, based on the characteristics, the activities or objects that may cause any health or safety risks at the industrial site.
  • If health and safety risks identifier 110I determines that the activities or objects cause the health or safety risks, then health and safety risks identifier 110I, or any other component configured for this purpose, may generate one or more notifications that indicate the one or more health or safety risks at the industrial site and transmit the notifications to notification recipients.
  • Computer 110 may be communicatively connected to application storage device 120 and data storage device 130.
  • the communications between computer 110 and storage devices 120-130 may be facilitated by power lines, and/or one or more computer networks including, but not limited to, a Local Area Network (LAN), a Wide Area Network (WAN), the Internet, and a company private network.
  • Application storage device 120 may be configured to store program applications and instructions used by data collector 110B, machine learning processor 110C, data analyzer 110D, activity and object detector 110G, characteristics analyzer 110H, and health and safety risks identifier 110I to process received data.
  • Application storage device 120 may be used to store data format translators, neural network models, machine learning models, neural network parameters, specifications of the data input devices, default data models, statistical programs and functions to perform a statistical analysis, and the like.
  • Data storage device 130 may be configured to store data used and generated by computer 110.
  • Data storage device 130 may be implemented in one or more hard disk drives, or other electronic digital data devices configured to store data.
  • Data storage device 130 may include an individual device (as depicted in FIG. 1).
  • Data storage device 130 may be implemented as a device separate from computer 110, as depicted in FIG. 1. Alternatively, data storage device 130 may be implemented as part of computer 110 or may be implemented within the same computer network as computer 110.
  • Data storage device 130 may include one or more databases such as relational databases, hybrid databases, columnar databases, and the like.
  • an approach for monitoring activities on an industrial site includes one or more decision support systems that are programmed or configured to model behavioral characteristics of objects identified in the site.
  • the decision support systems may be implemented as part of machine learning processor 110C.
  • Functions of a decision support system may be defined by specifying inputs and outputs of the system; inputs may include heterogeneous-in-type, real-time data provided by multiple sources to the decision support system.
  • the inputs may be provided by video cameras, digital cameras and sensors deployed in an industrial site.
  • the video cameras and digital cameras may be assigned their own IP addresses and may be part of a computer network.
  • the inputs include specifications of the physical locations of the video and digital cameras deployed in an industrial site.
  • the specifications of the physical locations may include physical addresses, names associated with the physical locations, and the like.
  • the inputs may also include data formats of input and output data.
  • the inputs may specify, for example, the formats for video streams that the decision support system may process.
  • the input may also specify the formats for recording the video stream in a time series format that is specific to the decision support system.
  • Input may specify the format for storing the video stream in the time series format before the data is used to detect events depicted in the images included in the stream. Input may specify the events that are considered critical, and the actions that are to be pursued in response to detecting the events.
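
One way such an input specification might be written down is sketched below as a plain configuration object; every key, format name, event name, and action in it is an illustrative assumption rather than a format defined by the disclosure.

```python
# Illustrative configuration sketch of the kind of input specification
# described above. Every key and value is an assumption made for the example.
INPUT_SPEC = {
    "video_streams": {
        "accepted_formats": ["h264", "mjpeg"],
        "recording": {
            "container": "time_series",  # frames stored as a timestamped series
            "retention_days": 30,
        },
    },
    "critical_events": {
        "worker_in_danger_zone": {"action": "raise_alarm", "notify": ["site_manager"]},
        "missing_hard_hat": {"action": "log_and_warn", "notify": ["safety_officer"]},
    },
}
```
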
  • the inputs of the decision support system may also include specification of a data encoding format for the input data streams. Different encoding schemes offer different tradeoffs in terms of accuracy, errors, computational costs, and communications costs.
  • videos recorded in the 720p format provide a good visual accuracy; however, playing such videos may be expensive in terms of communications bandwidth, storage, power consumption, and computational processing time.
  • using a low-quality format that is sufficient for archival purposes may be insufficient for the machine learning components of the decision support system because the machine learning components may require access to video streams that have high video quality.
  • FIG. 2 depicts an example map showing an industrial site 201 and locations of video cameras implemented on the site.
  • Example site 201 has a rectangular shape and has roads 212, 214.
  • video cameras and digital cameras are shown using circles.
  • Examples of the depicted video cameras are video cameras 202A, 202B, 204A, and 204B.
  • Examples of digital cameras include a camera 206. The locations for the cameras are selected to enable monitoring areas such as entrances, exits, restrooms, offices, delivery and receiving sites, and the like.
  • FIG. 3 depicts a combined display 302, 304 of several video streams provided to a computer from several video cameras.
  • Combined display 302, 304 may include a plurality of display regions, two of which are depicted in FIG. 3. Some of the display regions are used to display video streams from individual cameras, other display regions may be used to display a simplified map of locations of the individual cameras and the respective fields of view captured by the cameras.
  • Combined display 302, 304 includes the frames captured by a plurality of digital video cameras, including the frames captured by digital video cameras 202A, 204A, respectively.
  • the frames may be synchronized using timestamps. Therefore, the frames displayed in combined display 302, 304 may correspond to the frames recorded at the same time.
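
A simple timestamp-based selection, sketched below, illustrates how frames from different cameras could be aligned for the combined display; the `(timestamp, frame_id)` representation and the camera names are assumptions for the example.

```python
# Sketch of timestamp-based synchronisation for the combined display: for a
# chosen reference time, select from each camera the frame recorded closest
# to it. The frame representation is an illustrative assumption.
from typing import Dict, List, Tuple

Frame = Tuple[float, str]  # (unix_timestamp, frame identifier)


def synchronised_frames(streams: Dict[str, List[Frame]],
                        reference_time: float) -> Dict[str, str]:
    """Return, per camera, the frame whose timestamp is nearest the reference."""
    selected = {}
    for camera, frames in streams.items():
        if frames:
            nearest = min(frames, key=lambda f: abs(f[0] - reference_time))
            selected[camera] = nearest[1]
    return selected


if __name__ == "__main__":
    streams = {
        "202A": [(100.0, "a-1"), (100.5, "a-2"), (101.0, "a-3")],
        "204A": [(100.2, "b-1"), (100.7, "b-2")],
    }
    print(synchronised_frames(streams, 100.5))  # {'202A': 'a-2', '204A': 'b-2'}
```
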
  • a decision support system may also be configured to generate outputs such as alarms, warnings, messages, and the like.
  • the outputs may include specifications for displaying the alarms on computer dashboards of a management team in a timely fashion.
  • the outputs may also include the specification for defining a size and a resolution for displaying the alarms on, for example, relatively small displays of portable devices such as smartphones.
  • Additional functions may also include specifications for grouping the alarms.
  • the specifications may, for example, describe how to group the alarms based on corresponding levels of urgency.
  • the levels of urgency may be determined based on characteristics of the alarms. For example, some alarms may be associated with non-critical events; other alarms may be associated with critical events and may require immediate actions.
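
The urgency-based grouping could be sketched as follows; the urgency levels and the event-to-level table are illustrative assumptions.

```python
# Sketch of grouping alarms by urgency before they are pushed to dashboards.
# The urgency levels and event-to-level mapping are illustrative assumptions.
from collections import defaultdict

URGENCY_BY_EVENT = {
    "worker_fall_detected": "critical",
    "vehicle_in_restricted_area": "critical",
    "missing_safety_vest": "warning",
    "housekeeping_issue": "info",
}


def group_alarms(alarms):
    """alarms: iterable of dicts with at least an 'event' key."""
    grouped = defaultdict(list)
    for alarm in alarms:
        level = URGENCY_BY_EVENT.get(alarm["event"], "info")
        grouped[level].append(alarm)
    return dict(grouped)


if __name__ == "__main__":
    incoming = [{"event": "worker_fall_detected", "camera": "202A"},
                {"event": "missing_safety_vest", "camera": "204B"}]
    for level, items in group_alarms(incoming).items():
        print(level, [a["camera"] for a in items])
```
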
  • Functions for outputs may also include generating a set of performance metrics.
  • Performance metrics may be represented using key performance indicators (“KPIs”).
  • KPIs key performance indicators
  • a KPI may be used to measure, for example, efficiency of an industrial process in an industrial site over time.
  • a KPI may be specified in terms of an amount of industrial material moved to and/or from an industrial site.
  • Another KPI may be specified in terms of absenteeism of the workers.
  • a decision support system may analyze the KPIs to diagnose the sources and reasons for inefficiencies. For example, the decision support system may determine bottlenecks that are caused by failing to complete certain prerequisite industrial tasks.
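
Two of the KPIs mentioned above are sketched below as simple computations; the field names and the sample figures are illustrative assumptions.

```python
# Sketch of two KPIs mentioned above: material moved per day and worker
# absenteeism. Field names and sample numbers are illustrative assumptions.


def material_moved_per_day(deliveries) -> float:
    """Total tonnes moved divided by the number of distinct working days."""
    days = {d["date"] for d in deliveries}
    total = sum(d["tonnes"] for d in deliveries)
    return total / len(days) if days else 0.0


def absenteeism_rate(scheduled_workers: int, present_workers: int) -> float:
    """Fraction of scheduled workers who did not show up."""
    if scheduled_workers == 0:
        return 0.0
    return 1.0 - present_workers / scheduled_workers


if __name__ == "__main__":
    deliveries = [{"date": "2021-12-20", "tonnes": 12.0},
                  {"date": "2021-12-20", "tonnes": 8.0},
                  {"date": "2021-12-21", "tonnes": 10.0}]
    print(material_moved_per_day(deliveries))   # 15.0
    print(round(absenteeism_rate(40, 36), 3))   # 0.1
```
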
  • Output functions may further include specifications for processing the received video and sensor data to determine accountability measures for an industrial site.
  • the specification may include instructions for verifying, by authorized agents and personnel, the received data for accountability purposes.
  • the specification may describe how to analyze the received data to determine causes of accidents and losses, and how to audit the presence of contractors on the industrial site.
  • Output functions may also include specifications for processing the received data to determine accuracy measures for an industrial site.
  • the specification may provide for detecting objects depicted in, for example, images provided for an industrial site. This may include detecting depictions of persons, materials, and machines in the images provided by video cameras and digital cameras to a decision support system.
  • the specifications may relate to storage that computer 110 and data storage device 130 are expected to provide to the decision support system.
  • the system features definitions of (1) the amount of accuracy to be achieved by various components of the system, (2) the ability to preserve the privacy of persons, (3) the speed with which the decision support system is expected to generate outputs, (4) the scalability of the system to a large number and a wide variety of different industrial sites (e.g., underground, indoor, outdoor), and (5) the cost of deploying and maintaining the decision support system.
  • a decision support system is programmed or configured to process large amounts of data.
  • the data received from different cameras is analyzed in a distributed fashion across multiple devices. In some embodiments, this may include using a distributed computation system. Due to the real-time nature of the data streams, the storage capacity in memory may be addressed by deploying algorithms that are suitable for the streaming computational model.
  • a decision support system is programmed or configured to process heterogeneous data received from different data sources.
  • the analysis of data provided by heterogeneous data sources may be performed by different components and at different time periods.
  • Even a single data source may feed data into multiple components operating at distinct levels of detail.
  • data from a single source may be provided to the components configured to count persons and to the components configured to detect safety shortcomings.
  • the outputs may be combined in an asynchronous fashion.
  • a decision support system is programmed or configured to remain at least partially operational even if some of its parts or components fail.
  • a decision support system is programmed or configured for maintaining a balance between analyzing the received data by local devices and analyzing the received data by a central server.
  • Finding an optimal balance may include determining whether communications between the local devices and the central server have a negative impact on the efficiency of the decision support system. Some communications between the local devices, such as edge cameras, and computer 110 are performed for archiving video data. However, object detection may be delegated to the local devices to reduce the communications cost.
  • Finding the optimal balance may also include investigating and comparing the overall delay between detecting an object and processing of the information about the detected object.
  • Technical features of the decision support system may also include a technical specification of the system in terms of hardware and software. This may include specifying a hardware-software configuration for computer 110 so that computer 110 is able to execute machine learning processor 110C, data collector 110B, and data analyzer 110D. This may also include providing technical specifications for video cameras, digital cameras, application storage device 120, and data storage device 130.

4. IDENTIFYING AND MONITORING HEALTH AND SAFETY RISKS

  • the computer-implemented approach presented herein comprises a process for identifying and monitoring health and safety risks in industrial sites.
  • the approach may implement computer-implemented machine learning processes, statistical analysis, computer models, operations research models, and industrial domain analysis as a decision support system.
  • FIG. 4E depicts an example workflow for identifying and monitoring health and safety risks in industrial sites.
  • the steps described in FIG. 4E may be performed by one or more components of computer 110, depicted in FIG. 1.
  • some steps may be performed by activity/object detector 110G, other steps may be performed by characteristics analyzer 110H, and yet other steps may be performed by health and safety risks identifier 110I.
  • any two components 110G-110I may perform the steps described in FIG. 4E.
  • each of components 110G-1101 may perform all the steps itself.
  • computer 110 may have a dedicated module (not shown in FIG. 1) configured to perform all the steps described in FIG. 4E.
  • In step 410, the processor receives a plurality of data inputs from a plurality of data input devices installed in an industrial site.
  • the data inputs may include video recordings, digital photographs, sensor data, and the like, collected by the data input devices, such as video cameras, digital cameras, sensors, and other devices configured to collect data specific to industrial sites.
  • the data collection devices may send the collected data, as the data inputs, to the processor via one or more communications networks, including Internet-based networks, LANs, WANs, and the like.
  • In step 420, the processor selects a data model from one or more data models that have been trained on training data for industrial sites and programmed to detect activities or objects associated with the workers or equipment present at the industrial sites. Training of the data model may involve providing, as input, the training data that allows the model to identify (based on video input data, photographs, and the like) the workers, objects, pieces of equipment, materials, construction elements, etc., that may be present in industrial sites.
  • the processor may select the data model based on a variety of criteria, including the type of the industrial site, the specification of the model itself, the reliability of the model, and the like.
  • In step 430, the processor applies the data inputs to the data model to cause the data model to evaluate the data inputs and generate output data specifying whether the inputs describe activities or objects indicative of the presence of workers, equipment, and the like at the industrial site.
  • Applying the data inputs to the data model and evaluating the data model with the data inputs may include identifying, in the inputs, a plurality of digital images that depict the workers working at the industrial site and the pieces of equipment present at the industrial site. It may also include identifying, based on the images, the pieces of equipment depicted in the images, and generating output indicating whether one or more activities or objects associated with workers or the equipment are present at the industrial site. The specific details about the way in which the output is generated depend on the implementation and the type of the machine learning model being used.
  • In step 440, the processor determines whether the output data indicates one or more activities or objects associated with the workers or the equipment present at the industrial site. If it does not, the processor returns to step 410.
  • In step 450, the processor determines, based on the output data, one or more characteristics of the one or more activities or objects.
  • the characteristics of the activities and objects may be the characteristics that may cause health or safety risks at the industrial site, especially those that pose an excessive risk at the industrial site or a risk that is greater than a normal risk at the industrial site. Examples of the characteristics were described above.
  • Identifying the characteristics from the output data may include parsing the output data, and using, for example, image recognition mechanisms to determine the characteristics specific to the object depicted in the parsed output data. For example, if the output data includes an indication of a worker depicted in a corresponding video frame included in the data input, then the processor may parse the output data to determine that a depiction of the worker is included in a corresponding video frame and use the image recognition capabilities to determine one or more characteristics of the depiction of the worker in the corresponding frame.
  • the characteristics in this example may include a worker-size-based characteristic, a worker-placement characteristic, a lack-of-protection characteristic, a worker-activity-based characteristic, a worker-potential-injury-based characteristic, and the like.
  • In step 460, based on the one or more characteristics, the processor determines whether the one or more activities or objects cause any health or safety risks at the industrial site. This may include determining whether a data repository, which stores one or more mappings of characteristics onto risks posed in the industrial sites, includes entries for the one or more characteristics. If the processor determines that such entries are present in the repository, then the processor may retrieve those entries, extract the risk identifiers from the entries, and use the extracted risk identifiers as indications of the health or safety risks posed in the industrial site.
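
A minimal sketch of this repository lookup is shown below; the characteristic names, risk identifiers, and dictionary representation are illustrative assumptions.

```python
# Minimal sketch of the step-460 lookup described above: characteristics
# detected in the output data are matched against a repository that maps
# risk-prone characteristics to risk identifiers. The mapping contents are
# illustrative assumptions.
RISK_REPOSITORY = {
    "falling-from-height": ["RISK_FALL_FROM_HEIGHT"],
    "being-struck-by-object": ["RISK_STRUCK_BY_OBJECT"],
    "lack-of-protection": ["RISK_MISSING_PPE"],
}


def risks_for_characteristics(characteristics):
    """Return the risk identifiers mapped to any of the given characteristics."""
    risk_ids = []
    for characteristic in characteristics:
        risk_ids.extend(RISK_REPOSITORY.get(characteristic, []))
    return risk_ids


if __name__ == "__main__":
    detected = ["falling-from-height", "lack-of-protection"]
    print(risks_for_characteristics(detected))
    # ['RISK_FALL_FROM_HEIGHT', 'RISK_MISSING_PPE']
```
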
  • the risks that may be determined based on the characteristics of the activities or objects associated with the workers or the equipment may include the risks that may be, for example, endangering the safety of the workers, exposing the workers to injuries, causing damages to the equipment, causing damages to the construction site in general, and the like. Other examples of risks were described above.
  • If, in step 470, the processor determines that the activities or objects cause the health or safety risks, then the processor proceeds to step 480. Otherwise, the processor returns to step 410.
  • In step 480, the processor determines actions based on the identified risks.
  • the processor may, for example, generate one or more notifications that indicate the health or safety risks at the industrial site and transmit the notifications to notification recipients, such as managers, security officers, the police, and the like.
  • the processor may, for example, generate a graphical representation of the health or safety risks posed at the industrial site and transmit the representation to a computer display device to cause the computer display device to generate, based on the graphical representation, a GUI depicting the one or more health or safety risks posed at the industrial site.
  • the computer display device may be a device that is operated by a manager, a security officer or any other person in charge of the safety of the site.
  • the processor may, for example, generate an audible representation of the one or more health or safety risks posed at the industrial site.
  • the audible representation may include alarm sounds, warning sounds, and the like.
  • the processor may transmit the audible representation to one or more computer devices to cause the devices to generate, based on the audible representation, signals that represent the one or more health or safety risks posed at the industrial site.
  • Examples of the computer devices configured to generate and emit the signals may include sirens, and the like.
  • FIG. 4A depicts examples of various risks 4100.
  • risks 4100 include time related risks 4202, safety related risks 4302, cost related risks 4402 and quality related risks 4502.
  • Other types of risk may also be identified in construction sites. Any of those risks may be determined using the approach described in FIG. 4E, and in particular, using the approach described in step 460 of FIG. 4E.
  • Each of the types of risks may have associated risk characteristics.
  • the characteristics may provide additional information about the particular risks. They may correspond to the characteristics determined based on the output generated, using for example a machine-learning model, from the input video streams and images depicting the industrial site.
  • a video camera provided a sequence of frames that depicts a worker climbing a ladder.
  • a machine learning model was applied to the sequence of video frames and the model was evaluated based on the frames to generate and provide output data.
  • the output data indicate that a worker was detected in the frames and that the worker was climbing the ladder.
  • computer 110 determines one or more characteristics indicating that the worker was climbing the ladder. Those characteristics may be used to look up a repository of the characteristics of the risks associated with the industrial sites, and if a match is found, then computer 110 may determine that the input data provided from the industrial site indicate a risk associated with the worker climbing the ladder. An example of such a risk may include falling from a height as the worker climbs the ladder. Other examples are described below.
  • Grouping and organization of the risk characteristics may vary.
  • the risk characteristics are arranged hierarchically, and the arrangement depends on the specific aspects of the corresponding risk type.
  • each of the risks 4202, 4302, 4402 and 4502 has a corresponding activity-based characteristic 4204, 4304, 4404, and 4504, respectively.
  • Other grouping may also be implemented.
  • Each of the activity-based characteristics may have associated particular-activity-based characteristics.
  • each of the activity-based characteristics 4204, 4304, 4404, and 4504 has one or more particular-activity-based characteristics.
  • activities characteristic 4304 includes a piling characteristic 4306, an excavation characteristic 4308, a steel erection characteristic 4310, and a formwork/rebar/concrete characteristic 4312. Other characteristics may also be included.
  • safety risk 4302 may have associated activities 4304, which in turn, may be further characterized by piling characteristic 4306, excavation characteristic 4308, steel erection characteristic 4310, and formwork/rebar/concrete characteristic 4312. These characteristics are described herein merely to provide simple examples. In other implementations, additional characteristics may be included.
  • Performing steel erection 4310 may be associated with many risks and situations endangering the workers and the surroundings. These risks may include falling from height 4320, being struck by a flying object 4322, being exposed to hazards due to the tool usage 4324, and a failure of the work platform or the lift equipment 4326. Additional risks may also be included.
  • step 460 of FIG. 4E the processor determines, based on the content of the input frames, (1) activity characteristic 4304 indicating that a worker is working at an industrial site, and (2) steel erection characteristic 4310 indicating that the worker is erecting a steel pole as he sets up the scaffolds, and (3) potential fall from heights characteristic 4320 indicating that, as the worker sets the scaffolds in an upper level of the scaffolding, he might fall, then the processor may determine that those characteristics indicate safety risk 4302 that belongs to risks 4100.
  • step 460 of FIG. 4E the processor determines, based on the content of the input frames, (1) activity characteristic 4304 indicating that a worker is working at an industrial site, and (2) steel erection characteristic 4310 indicating that the worker is erecting a steel pole as he sets up the scaffolds, and (3) struck by a flying object characteristic 4322 indicating that, as the worker sets the scaffolds in an upper level of the scaffolding, he might have been struck by some loose scaffolding, then the processor may determine that those characteristics indicate safety risk 4302 that belongs to risks 4100.
  • the processor determines, based on the content of the input frames, (1) activity characteristic 4304 indicating that a worker is working at an industrial site, and (2) steel erection characteristic 4310 indicating that the worker is erecting a steel pole as he sets up the scaffolds, and (3) hazard due to a tool usage characteristic 4324 indicating that, as the worker sets the scaffolds in an upper level of the scaffolding, he might have dropped one of his tools, then the processor may determine that those characteristics indicate safety risk 4302 that belongs to risks 4100.
  • FIG. 4B depicts examples of possible reasons for safety risks posed by steel erection activities.
  • An example depicted in FIG. 4B builds on the example depicted in FIG. 4A. Both examples are provided merely to illustrate a process of using characteristics of, for example, video frames provided from an industrial site in the process of assessing the potential and actual risks occurring in the industrial site.
  • FIG. 4B illustrates an example of causes for falling from heights 4320 and an example of mitigation factors for the fall. While the example in FIG. 4B illustrates some, not all, causes and mitigation factors, one could easily imagine the additional causes and additional scenarios in which a worker may fall from heights as he tries to erect, for example, steel scaffoldings.
  • a cause 4320A for falling from height 4320 may include a lack of personal fall arrest system (PFAS) 4320B, a wrong mounting of anchorage 4320C, and the like 4320D.
  • mitigation factors 4320M for falling 4320 from heights may include a safety lanyard 4320N, an overhead anchorage 4320O, and the like 4320P.
  • step 460 of FIG. 4E the processor determines, based on the content of the input frames, (1) activity characteristic 4304 indicating that a worker is working at an industrial site, and (2) steel erection characteristic 4310 indicating that the worker is erecting a steel pole as he sets up the scaffolds, and (3) potential fall from heights characteristic 4320 indicating that, as the worker sets the scaffolds in an upper level of the scaffolding, he might fall, then the processor may not only determine that those characteristics indicate safety risk 4302, but also provide information about the causes and mitigation factors indicating that the worker failed to, for example, wear the PFAS (4320B) and that the overhead anchorage (4320O) was improperly installed. This additional information may be very helpful in determining, for example, why the worker fell from heights.
  • performing steel erection 4310 is also associated with potentially being struck by a flying object 4322, being exposed to hazards due to the tool usage 4324, and a failure of the work platform or the lift equipment 4326.
  • a flying object 4322 may have associated causes and mitigation factors.
  • for brevity, those causes and mitigation factors are omitted herein.
  • FIG. 4C depicts examples of various risks assessment approaches.
  • the approaches may be identified and grouped according to different schemes. One of the schemes is depicted in FIG. 4C.
  • risk assessment approaches are grouped according to the following categories: safety risks 4012, schedule risks 4014, quality control risks 4016, production-related risks 4018, productivity-related risks 4020, and other risks 4022. Other grouping of the risk assessment approaches may also be implemented.
  • safety risks 4012 may include various present and future safety risks that may be related to various activities taking place at an industrial site, workers’ movements and positions at the industrial site, objects detected at the industrial site, location-based-relationships between objects and workers at the industrial site, and the like.
  • the safety risks 4012 may be determined (4012B) based on employing various methods. Examples of those methods are described in an element 4012C and include detecting lack of personal protection equipment (PPE), presence of ladders and angles of ladders, presence of dangerous edges, low headroom, locations of workers, workers’ ergonomic injuries, workers’ positions, workers’ activities, sizes of the objects, parts assembly methods, relationships between the objects, and the like. Details about the different ways of detecting safety risks 4012 are described in FIG. 4D.
  • Safety risks 4012 may be detected using different approaches. Examples of those approaches are described in FIG. 4D.
  • FIG. 4D depicts examples of detecting safety risks 4110.
  • the depicted examples are provided merely to illustrate non-limiting examples.
  • safety risks may be detected (4110) based on determining (4112) lack of personal protective equipment (PPE), which may include workman gloves, workman suits, workman shoes, workman masks, workman helmets, workman headsets, and the like. If, while performing certain activities, the workers are expected to wear, for example, helmets, then a lack of depictions of the helmets in the pictures showing the workers at the industrial site may indicate safety risks at the site.
  • Safety risks may be detected (4110) based on determining (4114) presence of objects that block passages. Such objects may include pieces of industrial equipment, materials, boxes, vehicles, and the like, positioned on walking paths, roads, and the like. If, for example, a pile of building material has been placed in the middle of a delivery road, then the depiction of that material on the delivery road in the pictures showing the industrial site may indicate safety risks at the site.
  • Safety risks may also be detected (4110) based on determining (4116) presence of objects that may fall from heights and cause injuries. Those objects may include pieces of industrial equipment, materials, boxes, ladders, tools, and the like, positioned on scaffoldings, walls, and the like. If, for example, a bag of cement has been left on an edge of scaffoldings, then the depiction of that bag on the scaffoldings in the pictures showing the industrial site may indicate safety risks at the site.
  • Safety risks may also be detected (4110) based on determining (4118) presence of objects that may have sharp edges that may cause injuries.
  • objects may include tools with sharp edges, axes, hatchets, knives, pins, and the like, present at an industrial site. If, for example, a knife has been left on an edge of scaffoldings, then the depiction of that knife on the scaffoldings in the pictures showing the industrial site may indicate safety risks at the site.
  • Safety risks may also be detected (4110) based on determining (4120) presence of heavy machinery that may cause injuries.
  • Such machinery may include excavators, loaders, bulldozers, backhoe loaders, cranes, forklifts, and the like, present at an industrial site. If, for example, an excavator is moving along a road across an industrial site, then the depiction of that excavator in the pictures showing the industrial site may indicate safety risks at the site.
  • Safety risks may also be detected (4110) based on determining (4122) presence of workers’ activities that may cause injuries. Such activities may include material handling, construction, scaffolding setting, wall erection, window installation, pouring cement, pouring asphalt, excavating, handling waste material, material shipment and receiving, building manufacturing, and the like. If, for example, a worker is installing windows at an industrial site, then the depiction of the window installation in the pictures showing the industrial site may indicate safety risks at the site.
  • Safety risks may also be detected (4110) based on determining (4122) presence of construction activities that may cause injuries.
  • activities may include material handling, construction, scaffolding setting, wall erection, window installation, pouring cement, pouring asphalt, excavating, handling waste material, material shipment and receiving, building manufacturing, and the like. If, for example, the pictures taken from the industrial site depict excavation, then this may indicate safety risks at the site.
  • a safety score is computed.
  • a safety score is a numerical representation of the total safety risks that presently and/or potentially cause dangerous situations.
  • a safety score may be computed (4012A) using many different approaches. In one approach, the safety score is computed as an average of a sum of weighted safety risks detected based on the input information, such as video frames depicting an industrial site.
  • a safety score may be computed as: Safety Score = (1/N) × Σ (Risk_n × Weight_n), where the sum runs over n = 1, ..., N, and where:
  • N indicates a count of different safety risks identified from the video frames depicting an industrial site;
  • Risk_n indicates a probability that, based on the frames, the frames indeed depict the n-th risk;
  • Weight_n indicates a weight empirically determined for the n-th risk.
  • safety scores may be normalized, which means that the highest score corresponds to 1.0, the lowest score corresponds to 0.0, and all other scores fall within the range [0.0, 1.0]. Hence, in the above example, the safety score values belong to the range [0.0, 1.0], wherein a score value of 0.0 means that there is no safety risk, while a score value of 1.0 means that the risk is at its maximum.
  • Suppose, for example, that two safety risks are detected: an object obstructing a walking passage, with a detection probability of 80% and a weight of 0.4, and an object that may fall from the scaffolding, with a detection probability of 75% and a weight of 0.5. Then: Safety Score = (1/2) × (80% × 0.4 + 75% × 0.5) = 0.3475. Therefore, the safety score in the situation when the object obstructing the walking passage and the object that may fall from the scaffolding are detected is 0.3475.
  • This result may be interpreted according to a corresponding scheme: suppose that the score assessment system implemented the following scheme: if the safety score values are between 0.0 and 0.33, then the risk is low (a green category); if the safety score values are between 0.331 and 0.66, then the risk is medium (a yellow category); and if the safety score values are between 0.661 and 1.0, then the risk is high (a red category).
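  • For illustration only, the following Python sketch reproduces the weighted-average safety score and the green/yellow/red interpretation scheme described above; the function names are illustrative, and the (probability, weight) pairs correspond to the worked example.

```python
def safety_score(risks):
    """Average of probability-weighted risks; risks is a list of (probability, weight) pairs."""
    if not risks:
        return 0.0
    return sum(p * w for p, w in risks) / len(risks)

def category(score):
    """Map a normalized score in [0.0, 1.0] to the green/yellow/red scheme described above."""
    if score <= 0.33:
        return "green (low)"
    if score <= 0.66:
        return "yellow (medium)"
    return "red (high)"

detected = [(0.80, 0.4), (0.75, 0.5)]  # obstruction of a passage, object that may fall
score = safety_score(detected)
print(round(score, 4), category(score))  # 0.3475 yellow (medium)
```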
  • the process of assessing risks (4010) may also include assessing schedule risks 4014. That may include, for example, detecting (4014A) a start time and an end time of certain construction activities, and determining how those times impact the overall construction schedule. For example, if the rebar placement should have started by November 11, 2020, but the workers started placing the rebar on November 22, 2020, then the project of placing rebar is delayed by 11 days. The delay may require adjusting, and potentially delaying, other projects.
  • the system may receive information about the delay from the correspondence between contractors, clients, and/or consultants.
  • the system may detect the changes in the project.
  • the changes may include, for example, the changes in the number of drawing revisions, the number of orders, the amount of extra work, and the measure of the impact that those changes have on the overall schedule and the schedule delay.
  • Handling the schedule risks may also include determining (4014B) material delivery risks. For example, if the rebar placement project is delayed by 11 days, then the project of delivering, for example, cement may have to be delayed.
  • the process of assessing risks may also include assessing quality control risks 4016. This may include assessing the specific characteristics of actual objects (4016A) at an industrial site and determining whether those characteristics correspond to the expected characteristics. For example, this may include assessing whether the objects, such as wood planks, are not bent, not tilted, or not deformed, or whether there are defects of the welds. This may also include assessing whether the objects, such as doors and windows, have the correct color and texture.
  • the objects assessment may be part of a quality control process (4016B).
  • Quality control process (4016B) may also include determining whether the sequence of construction activities has been performed in the correct order and according to the recommended timing. This could include determining whether two coats of paint were applied to the outside walls and whether the coats were applied within a two-day interval.
  • the process of assessing risks (4010) may also include assessing production-related risks 4018. That may include determining (4018A) a production rate, determining (4018B) if the staff requirements are met, and the like. For example, if placing rebar in a construction site is delayed by two days, one may want to determine how the delay in placing the rebar may impact the production rate, and how it impacts the staff requirements and scheduling. As another example, a production line of pre-fabricated wall panels can be monitored with vision-based devices. Delays in the fabrication and movement of each panel from one stage to the next can be monitored and contribute to the calculation of production-related risks.
  • the process of assessing risks (4010) may also include assessing productivity-related risks 4020. That may include determining counts (4020A) of workers required for particular tasks on particular days/weeks, counts of jobs to be performed on particular days/weeks, counts of trucks to be available on particular days/weeks, and the like. If, for example, a particular task requires 50 workers to be present during a particular week, but only 40 workers are available, then the productivity of the project will be negatively impacted.
  • the process of assessing risks may also include assessing other risks 4022. That may include, for example, assessing whether tasks and jobs are properly coordinated (4022A). That may also include determining whether storage of the materials is congested and overfilled (4022B). Furthermore, that may include determining a level of cleanliness in the industrial sites. Another example can be the relationship between a status of installed materials and the next activity being performed on site (e.g., glass windows are installed without proper protective films, and a new welding activity is happening on the floor above). The debris from such welding activities may damage the finished windows on the floor below. Another example is a floor of rooms that is fully finished with fresh paint and polished granite countertops; no one other than an authorized person should enter such a floor, to reduce the risk of damaging the painted finish or scratching the polished granite tops.
  • Those risks may also include being hit by an object or a piece of machinery. For example, this may include being hit by an excavator or other piece of heavy equipment.
  • Some of the risks may be caused by lack of PPE.
  • the workers may become injured if they do not wear hard hats, protective equipment, protective gloves, and the like.
  • Some risks may be caused by certain activities such as demolition. For example, the workers may become injured if they remain in the proximity of demolition at the site and expose themselves to a risk of demolition-related injury.
  • Some risks may be caused by social distancing violations. For example, when the workers fail to obey a social distancing ordinance and fail to keep, for example, a six-foot distance from each other, the workers may expose themselves to health hazards, and the like.
  • FIG. 4F depicts an example workflow integration of a decision support system.
  • the example illustrates various components of the decision support system that are integrated to support the data processing workflow.
  • the depicted components include one or more cameras 402, one or more processing applications 404, an input translation service 408, one or more processing applications 410, a web service 412, and a browser display 414.
  • Output generated by processing application 404 may include one or more JPEG images.
  • the images may be communicated to input translation service 408 via a UDP communications connection, or any other type of communications connection.
  • the components also include a browser thumbnails and legacy display 416.
  • a decision support system integrates components that are configured to detect safety events, perform an efficiency analysis, and perform an accountability analysis. Outputs of the safety and efficiency components may be generated by integrating data from different sensors and by applying various algorithms to the integrated data.
  • a decision support system is programmed to process received data at different time scales.
  • the safety event detection component may generate outputs in real-time with delays as short as possible.
  • the accountability processing components may require receiving sensor measurements in near-real-time; such data may have to be stored in buffers and hard disks at a central server.
  • the component that analyzes the efficiency of the industrial work using many metrics only needs to generate outputs in retrospect. Deferring execution of the components that are not time-sensitive reduces the overall costs by moving the analysis to time periods when there are no persons on the industrial site (e.g., at night and/or during holidays). This is possible using the same data that is stored in the central server for accountability purposes.
  • the components may be tested by executing test cases selected by domain experts from an industrial site. Based on the testing results, the components may be tuned, and eventually integrated into a decision support system.
  • the testing may include testing the overall system with respect to the functions of accuracy, privacy, speed, and scalability.
  • the testing may also include testing the usability of the system by end users, and the system’s resilience to partial failures of components and sensors.
  • FIG. 5 depicts an example workflow executed by a decision support system.
  • FIG. 5, and each other flow diagram in this disclosure, is intended to describe an algorithm or functional process that can be used as a basis of preparing one or more computer programs for the environment of FIG. 1.
  • FIG. 5 is expressed at the same level of detail that is typically used by persons of ordinary skill in the art, to which this disclosure pertains, to communicate among themselves about and understand the general plans, specifications, and algorithms that they will later be capable of using to define specific code, instructions or other details of programming the system of FIG. 1 or other computing elements of the environment of FIG. 1.
  • step 510 the decision support system receives a plurality of data inputs from a plurality of data input devices implemented across an industrial site.
  • step 520 the decision support system synchronizes one or more data inputs, from the plurality of data inputs, based on, for example, timestamps associated with the one or more data inputs.
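  • For illustration only, the following Python sketch shows one possible timestamp-based synchronization of data inputs as described for step 520; the record fields and the bucketing tolerance are assumptions, not part of the disclosure.

```python
from collections import defaultdict

def synchronize(inputs, tolerance_s=1.0):
    """Group data inputs whose timestamps fall within the same tolerance window.

    Each input is assumed to be a dict with 'device_id', 'timestamp' (seconds), and 'payload'.
    """
    buckets = defaultdict(list)
    for item in inputs:
        bucket_key = round(item["timestamp"] / tolerance_s)
        buckets[bucket_key].append(item)
    # Return groups ordered by time; each group holds inputs considered simultaneous.
    return [buckets[k] for k in sorted(buckets)]

frames = [
    {"device_id": "cam-1", "timestamp": 1000.2, "payload": "frame-a"},
    {"device_id": "cam-2", "timestamp": 1000.4, "payload": "frame-b"},
    {"device_id": "cam-1", "timestamp": 1003.1, "payload": "frame-c"},
]
print(synchronize(frames))  # two groups: [frame-a, frame-b] and [frame-c]
```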
  • step 530 the decision support system identifies a data model that has been trained on training data for the industrial site.
  • FIG. 7 depicts an example diagram for training a machine learning system.
  • the depicted training process may be initialized by trainers 744, who communicate with training applications 704 via, for example, an Internet-based network 742.
  • the training applications 704 may be configured to access a machine learning API 706.
  • Machine learning API 706 may be also configured to retrieve data from an object detection database 708.
  • Machine learning API 706 may be configured to receive data from a plate detection database 714, and receive data from, for example, a license plate detector 712. License plate detector 712 may be configured to receive an image stream from a license plate detection camera 724 via an Internet-based network 722.
  • Machine learning API 706 may be used to generate a machine learning model 755. Model 755 may be trained and validated using a training/validation API 710.
  • the trained and validated model may be accessible to users 734 via an Internet-based network 732 and a dashboard core API 702.
  • Dashboard core API 702 may use a core database 703 for storing data.
  • the decision support system applies the one or more data inputs to the data model to generate output data.
  • the output data may include one or more of alarms, warnings, statistical information outputs, or one or more summaries or reports.
  • the decision support system generates and displays a first visual representation of the output data.
  • the visual representation may include a graphical representation of an alarm or a graphical representation of metrics and reports.
  • step 560 the decision support system determines whether one or more new data inputs have been received from any of the plurality of data input devices implemented across the industrial site. If it is determined in step 570, that the new data inputs have been received, then step 510 is performed. Otherwise, step 580 is performed, and the processing of the data inputs ends.
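  • For illustration only, the following Python sketch outlines the overall loop of FIG. 5 described above; the callables passed in are placeholders for the actual services, and the step numbers in the comments follow the description above.

```python
def run_decision_support(receive_inputs, synchronize, identify_model, render):
    """Skeleton of the FIG. 5 workflow; the callables are placeholders for real services."""
    inputs = receive_inputs()                 # step 510: data from devices across the site
    while inputs:                             # steps 560/570: repeat while new inputs arrive
        synced = synchronize(inputs)          # step 520: align inputs, e.g., by timestamp
        model = identify_model(synced)        # step 530: model trained for this industrial site
        output = model(synced)                # alarms, warnings, statistics, reports
        render(output)                        # visual representation of the output data
        inputs = receive_inputs()
    # step 580: processing of the data inputs ends
```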
  • Managing a large-scale industrial site often includes ensuring safety of the persons present on the site and near the site.
  • the traditional approaches for monitoring safety usually involve human visual observations by industrial-site managers. This is often inefficient because it requires an elevated level of mental focus and is prone to errors and distraction.
  • a decision support system is configured to transmit safety protocol messages and alert notifications to industrial-site managers.
  • a safety component of the decision support system cooperates with two subcomponents: a process that is configured to detect humans, industrial equipment and machines, and an event detection process that processes the detected information.
  • a decision support system deploys various algorithms to compensate for the fact that the cameras may have different fields of view and that there might be different visibility conditions.
  • a decision support system implements persons and equipment detection algorithms, a machine learning system, and a computer vision system.
  • the machine learning system allows detecting and processing features included in depictions of the persons detected within an industrial site.
  • labels may be used to track the persons.
  • Labeled data may be used to perform a computer simulation to track and account for the persons.
  • a decision support system is configured to detect persons and equipment by employing a machine learning system.
  • the machine learning system may be trained using a sequence of classifiers to label frames of video streams according to the presence or absence of an object. Even though there is no uniform classifier that is applicable to all possible datasets, a set of certain classifiers may be selected based on availability of the data, availability of the computational power, presence of the computational delay constraints, and empirical probability distribution of the data.
  • FIG. 6 depicts an example diagram for performing persons and equipment detection.
  • the configuration depicted in FIG. 6 includes a public subnet 600 and a private subnet 655.
  • inputs to an object detection component of a decision support system may include digital images from one or more cameras 616A, 616B.
  • the images may be routed via a router 614 to an Internet-based network 612, and then provided via an Internet gateway 696 to an elastic load balancer 694. Additional input may be provided by a time lapse generator 680 that is communicatively coupled to a relational database 672.
  • the images may be provided to a transcoder 692, and then to an instance storage 675.
  • the images may be provided to a recording storage 684, which may provide the images to a lambda-parameter-correction processing 682 to normalize the color hues through the images to compensate for the different gamma characteristics of the image capturing devices. Then the images are stored in a relational database 674.
  • the images may be subjected to reporting 620, streaming 630A, 630, and security processing 660.
  • the core processing is executed in a component 640 which, in addition to receiving the images, also receives data from a time-lapse reader 650 and a weather reader 670.
  • Users 602 may access the streaming process for displaying the generated outputs via an Internet network 604 and an Internet gateway 606.
  • a decision support system employs techniques incorporating neural networks, support vector machines, and combinations of different classifiers through boosting. Due to the complexity of each data point of, for example, a high-resolution image, and the availability of copious amounts of labeled data, the image processing starts with a neural network approach.
  • the neural network approach may incorporate state-of-the-art software packages for deep learning to extract the features automatically rather than manually. This approach provides savings in terms of time and domain expertise.
  • Examples of neural networks may include convolutional neural networks that are efficient in performing image recognition.
  • Detecting persons and equipment may also be performed in private subnet 655 comprising a processing unit 680, a bucket storage 686 and a relational database 689. Private subnet 655 may be used to provide auxiliary support to the public subnet described above.
  • Persons and machinery entering and leaving an industrial site may be tracked using cameras installed along an outside perimeter of the industrial site, along an inside perimeter of the industrial site, or along both perimeters.
  • FIG. 10A schematically illustrates a top plan view of an example physical site that is equipped with cameras mounted along an outside perimeter of the physical site
  • FIG. 10B schematically illustrates a top plan view of an example physical site that is equipped with cameras mounted along an inside perimeter of the physical site.
  • an industrial site 1010 has one or more entrances 1040 and one or more gates 1050. Although not depicted in FIG. 10A, industrial site 1010 may also include doors, windows, fences, tunnels, and the like.
  • one or more cameras 1020A, 1020B, 1020C, 1020D are installed at industrial site 1010 to capture digital images and video streams of persons or activities crossing at the entrances and gates of industrial site 1010.
  • Cameras 1020A, 1020B, 1020C, 1020D in FIG. 10A may correspond to video cameras 616 in FIG. 6.
  • Cameras 1020A, 1020B, 1020C, 1020D may be installed along an outside perimeter of industrial site 1010 at specific points that allow the cameras to capture images of persons or pieces of equipment as they pass through entrances 1040 and/or gates 1050.
  • the cameras are positioned to capture images within a polygon that is spaced apart by a specified distance 1030 outside of industrial site 1010.
  • the cameras may be positioned to capture images of persons or pieces of equipment passing through a virtual boundary that surrounds industrial site 1010.
  • the virtual boundary may be either outside or inside industrial site 1010.
  • An outside boundary may be plotted, for example, one meter outside of industrial site 1010.
  • a collective field of view of the cameras defines a virtual fence 1000 or a virtual boundary around industrial site 1010, and cameras capture images of persons and equipment crossing virtual fence 1000 to either enter industrial site 1010 or leave industrial site 1010.
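  • For illustration only, the following Python sketch shows one way a virtual fence could be evaluated in software: a ray-casting point-in-polygon test applied to successive tracked positions, reporting whether a person or piece of equipment entered or left the fenced area. The coordinates and helper names are hypothetical.

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: True if (x, y) lies inside the polygon given as [(x1, y1), ...]."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def crossing_direction(prev_pos, curr_pos, fence):
    """Return 'enter', 'leave', or None for two successive tracked positions."""
    was_in = point_in_polygon(*prev_pos, fence)
    is_in = point_in_polygon(*curr_pos, fence)
    if not was_in and is_in:
        return "enter"
    if was_in and not is_in:
        return "leave"
    return None

virtual_fence = [(0, 0), (100, 0), (100, 60), (0, 60)]  # illustrative site-plan coordinates
print(crossing_direction((-5, 30), (5, 30), virtual_fence))  # 'enter'
```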
  • FIG. 10A While the cameras in FIG. 10A are installed outside the industrial site, other cameras may be installed inside the industrial site, or both outside and inside.
  • the cameras are installed inside an industrial site 1110.
  • industrial site 1110 has one or more entrances 1140 and one or more gates 1150.
  • industrial site 1110 may also include doors, windows, fences, tunnels, and the like.
  • one or more cameras 1120A, 1120B, 1120C, 1120D are installed along a virtual fence 1100 to capture digital images or video streams of persons or activities occurring in the site.
  • Cameras 1120A, 1120B, 1120C, 1120D in FIG. 10B may correspond to video cameras 616 in FIG. 6.
  • One or more cameras may be installed at corners of industrial site 1010/1110, or along the sides of industrial site 1010/1110.
  • two cameras may be installed at a particular location along a virtual boundary surrounding industrial site 1010/1110: one camera may be installed on a pole one meter above the ground, and another camera may be installed on the same pole two meters above the ground.
  • one camera may be configured to monitor trucks entering and leaving the site, while another camera may be configured to monitor the workers.
  • the cameras are positioned along virtual fence 1100, which is inside industrial site 1110 and separated from the perimeter of industrial site 1110 by a specified distance 1130.
  • the collective field of view of the cameras defines virtual fence 1100 or a virtual boundary inside industrial site 1110, and the cameras capture images of persons and pieces of equipment entering or leaving industrial site 1110 as they cross virtual fence 1100.
  • each person moving inside a virtual or industrial fence is expected to wear a hard hat, apparel, or an item bearing indicia that can be seen in a digital image or digital video frames, and recognizable using digital image recognition techniques.
  • each person working on site 1010/1110 wears a hard hat bearing a barcode, a QR code or other marker that can be recognized in a digital image depicting a person.
  • an image of the person is captured by a camera and transmitted to, for example, computer 110 depicted in FIG. 1.
  • When the image depicts the person entering site 1010/1110, computer 110 processes the image and detects the marker. Based on the detected marker, computer 110 may increment a count of workers present on site 1010/1110.
  • When the image depicts the person leaving site 1010/1110, computer 110 processes the image and detects the marker. Based on the detected marker, computer 110 may decrement the count of workers present on site 1010/1110. This technique allows determining accurate counts of workers working on site 1010/1110 at a given time, as illustrated in the sketch below. A similar technique may be used to count pieces of equipment and to count trucks entering and leaving site 1010/1110.
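  • For illustration only, the following Python sketch shows a marker-driven tally of workers on site of the kind described above; the marker identifiers and the SiteOccupancy class are hypothetical.

```python
class SiteOccupancy:
    """Illustrative tally of workers on site, driven by detected hard-hat markers.

    Increment on an image showing the marker at an entrance (entering);
    decrement on an image showing the marker at a gate or exit (leaving).
    """

    def __init__(self):
        self.count = 0
        self.on_site = set()  # marker IDs currently believed to be on site

    def record(self, marker_id, direction):
        if direction == "enter" and marker_id not in self.on_site:
            self.on_site.add(marker_id)
            self.count += 1
        elif direction == "leave" and marker_id in self.on_site:
            self.on_site.remove(marker_id)
            self.count -= 1

occupancy = SiteOccupancy()
occupancy.record("QR-0042", "enter")
occupancy.record("QR-0042", "leave")
print(occupancy.count)  # 0
```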
  • markers attached to hard hats or overalls that workers are wearing are color-coded.
  • the color-coded markers may uniquely identify the workers or the workers’ trades. Examples of trades may include plumbers, carpenters, electricians, glaziers, and the like.
  • the trades may be assigned different color markers, and the different color markers may be attached to the workers’ hard hats.
  • site 1010 requires that all plumbers working on site 1010 wear hard hats with red markers, and that all carpenters working on site 1010 wear hard hats with white markers.
  • the cameras installed throughout site 1010 capture images of the workers present on site 1010 and transmit the captured images to computer 110.
  • Computer 110 may analyze the images, and detect the markers attached to the hard hats worn by the workers depicted in the images.
  • Computer 110 may analyze the colors of the detected markers and based on the detected colors, computer 110 may determine counts of workers that belong to different trades.
  • hard hats that workers are wearing are color-coded.
  • the color-coded hats may uniquely identify the workers or the workers’ trades.
  • the trades may be assigned different color hats.
  • site 1010 requires that all electricians working on site 1010 wear white hard hats, and that all glaziers working on site 1010 wear red hard hats.
  • the cameras installed throughout site 1010 capture images of the workers present on site 1010 and transmit the captured images to computer 110.
  • Computer 110 may analyze the image, and detect the hard hats worn by the workers depicted in the images.
  • Computer 110 may analyze the colors of the detected hard hats and based on the detected colors, computer 110 may determine counts of workers that belong to different trades.
  • a count of workers that belong to a particular trade may be increased when computer 110 receives an image depicting a worker who belongs to the particular trade and who is entering site 1010/1110.
  • a count of workers that belong to a particular trade may be decreased when computer 110 receives an image depicting a worker who belongs to the particular trade and who is leaving site 1010/1110.
  • counts of workers that belong to different trades may be timestamped, and the timestamped counts may be used to determine how many workers of the different trades worked on site 1010/1110 at a given time.
  • FIG. 9A and FIG. 9B depict examples of markers, or stickers, affixed to hard hats.
  • FIG. 9A depicts an example of black and white markers.
  • FIG. 9B depicts an example of color-coded markers. Other types of markers may also be implemented.
  • the hard hats themselves may be also color-coded.
  • Color-coded markers and color-coded hard hats may be used to count the worker trades present on an industrial site.
  • the counting may be performed in many different ways. For example, if different hard hat colors are assigned to different trades, then a count of worker trades may be computed based on the detected hat colors. However, if different marker colors are assigned to different worker trades, then a count of worker trades may be computed based on the detected marker colors.
  • the counts may be also computed based on a combination of the hard hat colors and the marker colors. For instance, if the electricians wear blue hard hats with black markers, then a count of the electricians present on the industrial site may be computed based on the detected blue hard hats with black markers.
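  • For illustration only, the following Python sketch counts worker trades from detected hard hat and marker colors; the color-to-trade assignments mirror the examples above but, like the data structures, are otherwise hypothetical.

```python
from collections import Counter

# Illustrative color-to-trade assignments; actual assignments are a site policy decision.
TRADE_BY_HAT = {"white": "electrician", "red": "glazier"}
TRADE_BY_HAT_AND_MARKER = {("blue", "black"): "electrician"}

def count_trades(detections):
    """detections: list of dicts with detected 'hat_color' and optional 'marker_color'."""
    counts = Counter()
    for d in detections:
        key = (d.get("hat_color"), d.get("marker_color"))
        trade = TRADE_BY_HAT_AND_MARKER.get(key) or TRADE_BY_HAT.get(d.get("hat_color"))
        if trade:
            counts[trade] += 1
    return counts

print(count_trades([
    {"hat_color": "white"},
    {"hat_color": "blue", "marker_color": "black"},
    {"hat_color": "red"},
]))  # Counter({'electrician': 2, 'glazier': 1})
```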
  • Cameras 1020A, 1020B, 1020C, 1020D may be configured to capture depictions of the workers as the workers enter industrial site 1010 and as the workers leave industrial site 1010.
  • the captured images may be time stamped and provided to a decision support system for processing.
  • the processing may include determining the times at which each individual worker, or a group of workers belonging to the same trade, entered industrial site 1010 and left the site.
  • the processing may also include determining the time periods during which the workers, or the groups of workers, were present on industrial site 1010, and the time periods during which the workers or groups of workers were absent from the site.
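  • For illustration only, the following Python sketch computes time on site from timestamped enter/leave events for a worker or a group of workers; the event format is an assumption.

```python
from datetime import datetime

def time_on_site(events):
    """Sum presence durations, in hours, from timestamped 'enter'/'leave' events.

    events: list of (iso_timestamp, direction) pairs, assumed chronologically ordered.
    """
    total = 0.0
    entered_at = None
    for ts, direction in events:
        t = datetime.fromisoformat(ts)
        if direction == "enter" and entered_at is None:
            entered_at = t
        elif direction == "leave" and entered_at is not None:
            total += (t - entered_at).total_seconds()
            entered_at = None
    return total / 3600.0

events = [("2020-11-22T07:05:00", "enter"), ("2020-11-22T12:00:00", "leave"),
          ("2020-11-22T12:30:00", "enter"), ("2020-11-22T16:35:00", "leave")]
print(round(time_on_site(events), 2))  # 9.0 hours
```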
  • a decision support system may be configured to track the industrial equipment.
  • the equipment may include delivery trucks, loaders, excavators, backhoes, bulldozers, dump trucks, cement mixers, cranes, and others.
  • the tracking may be performed by monitoring and capturing images of license plates or markers attached to the trucks.
  • the trucks may have markers attached at, for example, the license plates or front grilles, and cameras 1020A, 1020B, 1020C, 1020D may be configured to capture images of the markers attached to the trucks.
  • a decision support system is configured to recognize one or more types of events that the system needs to detect or track.
  • the types of the events may be defined by the type of sensors used to provide the event-and-sensor specific measurements.
  • each event type is associated with a corresponding action.
  • An action may correspond to, for example, generating a push-notification alarm, generating a message for an industrial manager, or displaying a message on the manager’s portable device such as a smartphone.
  • a decision support system is configured to detect a variety of events. Examples of the events include a collision event between machineries, a collision between a worker and a piece of equipment, an incident when a piece of equipment encroached on a forbidden zone, and an incident when a worker fell or got injured on the industrial site.
  • a decision support system may be configured to apply different detection algorithms to different types of events. For example, detecting a collision may be performed by employing a combination of geolocation-based algorithms and regression-based algorithms. On the other hand, detecting the equipment failure or an anomaly may be performed by employing a changepoint detection algorithm.
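  • The disclosure does not specify a particular changepoint detection algorithm; for illustration only, the following Python sketch shows a minimal one-sided CUSUM detector applied to a hypothetical equipment sensor series, as one example of the class of algorithms mentioned above.

```python
def cusum_changepoint(samples, target_mean, threshold=5.0, drift=0.5):
    """Return the index at which the cumulative deviation of a reading from its expected
    mean first exceeds the threshold, or None if no changepoint is detected.
    Threshold and drift are illustrative tuning parameters."""
    s = 0.0
    for i, x in enumerate(samples):
        s = max(0.0, s + (x - target_mean) - drift)
        if s > threshold:
            return i
    return None

vibration = [0.1, 0.0, 0.2, 0.1, 2.5, 2.8, 3.0, 3.1]  # hypothetical equipment readings
print(cusum_changepoint(vibration, target_mean=0.1))  # index where the anomaly is flagged
```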
  • a decision support system implements an alert user interface.
  • the user interface may be configured to generate and display visual and audio alerts, as well as providing interactivity mechanisms enabling interactions between users and the decision support system. Enabling the interactions may include generating and providing the functionalities to allow the users to submit inputs to the decision support system, control tasks executed by the decision support system, and receive results from the decision support system.
  • a user interface may be ergonomically adaptable to allow communicating with the decision support system in any type of safe and unsafe situations.
  • a user interface is programmed to display outputs generated by a decision support system in the form that is easy to understand by users.
  • the outputs may be customized to portable devices that have limited display space.
  • FIG. 8 depicts a deployment example of a decision support system.
  • an IP camera 854 is deployed.
  • IP camera 854 may be configured to generate one or more video streams comprising video frames that are captured by the camera and depict persons, assets, and events detected by the camera.
  • the video streams may be communicated from IP camera 854 via an Internet-based network 852 to a local transcoder service 822.
  • Local transcoder service 822 may be configured to process the frames in the video streams and timestamp the frames.
  • Output 842 generated by local transcoder service 822 may be provided via an Internet-based network 862 to users 872.
  • Output 842 may also be stored in a long-term storage 844.
  • Output 842 generated by local transcoder service 822 may also be provided to an image-lambda-correction processor 806 that is configured to compensate for the image hue and saturation for different lambda-based characteristics of image capturing devices.
  • Output generated by image-lambda-correction processor 806 may be provided to a decision support system API 802 and stored in a storage device 804. Users 872 may access decision support system API 802 via Internet-based network 862 described above.
  • a decision support system is configured to receive realtime measurements from sensors strategically placed on an industrial site and return one or more efficiency or productivity metrics and recommendations.
  • the metrics may provide summaries of the progress of the industrial project and allow decision makers to dynamically modify parameters of the industrial project. Providing the functionalities to dynamically modify the project’s parameters may result in significant monetary savings that otherwise would not be attainable.
  • Example recommendations may include recommendations for different industrial resources such as equipment and manpower or resources for specific days and time periods.
  • metrics may include metrics that were derived by applying statistical estimation and regression approaches to data received from cameras, sensors, and a machine learning system. The metrics may also include metrics derived by applying dynamic programming approaches such as Markov decision processes.
  • An efficiency metric for an industrial project may be determined by comparing multiple metrics generated by a decision support system.
  • the comparison may include comparing the pay-rates of workers, the rates of raw material used, the disturbance of road traffic nearby, and the rates of progress of the industrial work. Some of the metrics may be estimated, whereas others may be assessed manually.
  • a proper time scale may be defined.
  • the time scale defines the estimation granularity. For example, the amount of manpower may be estimated for each day, whereas the amount of raw materials may be estimated at an hour-by-hour resolution.
  • a detection algorithm may be used to obtain sample counts of workers detected at different times of the day and by different cameras.
  • a mathematical model with so-called sampling with replacement may be used to estimate a count of the workers present on an industrial site. The approach with the sampling with replacement has many advantages, including preserving the privacy of individual workers and assigning distinct labels to the workers.
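  • The disclosure does not detail the sampling-with-replacement model; for illustration only, the following Python sketch shows one common reading, a bootstrap-style resampling of per-camera sample counts to estimate the expected number of workers on site. The counts, the parameter values, and the returned fields are hypothetical.

```python
import random

def bootstrap_count_estimate(sample_counts, n_resamples=1000, seed=0):
    """Resample the per-camera / per-time worker counts with replacement to estimate
    the expected number of workers on site and a rough spread of that estimate."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_resamples):
        resample = [rng.choice(sample_counts) for _ in sample_counts]
        means.append(sum(resample) / len(resample))
    means.sort()
    return {
        "estimate": sum(means) / len(means),
        "low": means[int(0.025 * n_resamples)],
        "high": means[int(0.975 * n_resamples)],
    }

counts = [38, 41, 40, 44, 39, 42]  # hypothetical detections from different cameras/times
print(bootstrap_count_estimate(counts))
```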
  • Deep excavation is an important part of most industrial projects.
  • a decision support system may measure the amount of dirt-material removed from an industrial site by trucks.
  • a machine learning model that has been trained on historical data may be used. For example, an equipment-detection algorithm and a regression-based computer vision algorithm may be applied to the collected data.
  • a use of raw materials may be estimated using a count of scoops made by an excavator. The counts of scoops may be determined using a computer vision system configured to detect different poses of the excavator and configured to detect events corresponding to forward and backward movements of the excavator.
  • a decision support system may generate an industrial-impact metric. Determining that metric may include counting one or more distinct vehicles that are depicted in video frames captured by cameras pointing toward the roads near the industrial site. By comparing time series of the counts collected at different days, the decision support system may estimate the effect that the industrial project may have on the nearby traffic.
  • a decision support system may determine a metric indicating the impact that the industrial project has on the timeliness of the public busses.
  • cameras may be used to collect video streams for an industrial site, and the decision support system may process the streams to detect one or more public busses, and to compute delays between the timestamps indicating the times when the busses were detected, and the times scheduled for the busses’ arrivals and departures.
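  • For illustration only, the following Python sketch computes the bus delay metric described above from detection timestamps and scheduled times; the identifiers and timestamps are hypothetical.

```python
from datetime import datetime

def bus_delays(detected, scheduled):
    """Compute per-bus delays, in minutes, between detection timestamps and scheduled times.

    detected and scheduled map a bus/run identifier to an ISO timestamp."""
    delays = {}
    for bus_id, sched_ts in scheduled.items():
        if bus_id in detected:
            delta = datetime.fromisoformat(detected[bus_id]) - datetime.fromisoformat(sched_ts)
            delays[bus_id] = delta.total_seconds() / 60.0
    return delays

scheduled = {"route-7-08:15": "2021-06-01T08:15:00"}
detected = {"route-7-08:15": "2021-06-01T08:22:30"}
print(bus_delays(detected, scheduled))  # {'route-7-08:15': 7.5}
```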
  • a decision support system may perform an efficiency analysis at a much slower pace than a safety analysis is usually performed.
  • the efficiency analysis may be performed during otherwise idle computer cycles, and during hours when only a few persons are present on the industrial site (e.g., at night).
  • a decision support system is configured to record data from a plurality of sensors strategically installed throughout an industrial site. Recording the data is customized to provide support for communications and storage requirements, while satisfying security and image quality objectives.
  • a decision support system keeps records of activities on an industrial site as evidence and does so for many reasons.
  • the records may be used to, for example, determine the causes of events such as incidents involving civilians. For instance, it is important to detect the pedestrian who fell or got injured on the site or detect whether a pedestrian got injured because of the industrial equipment.
  • an industrial company is liable for any incidents that happened to the public within the industrial zones.
  • An industrial zone usually includes a few hundred feet of publicly accessible roadway beyond the actual site of the industrial zone.
  • Records of critical incidents may provide evidence to an industrial company and may be used to quickly resolve lawsuits.
  • the records of the incidents may allow the industrial company to prepare their cases before a lawsuit against the company is even filed.
  • the records also allow the industrial company to be proactive and correct problems by providing, for example, safer roads and surroundings along the sites.
  • a decision support system may maintain the records of activities on an industrial site as evidence for auditing purposes. After reviewing the recordings, the industrial site operator may audit the equipment rental fees charged by subcontractors against the actual usage of the equipment. To do so, the operator may review timestamped screenshots depicting drilling rigs, buses, cranes, or excavators entering and leaving the industrial site. Furthermore, the industrial company may cross check the number of workers billed against the number of workers invoiced. The audits may be triggered when, for example, an invoice deviates excessively from the previous invoices, and when significant overtime charges appear on the invoices. The activity records may also be used to prevent corruption through the risk of discovery.
  • a decision support system provides many functionalities that have been sought by major industrial companies.
  • the system is valuable to the companies for many reasons, including improving safety on industrial sites, improving efficiency in tracking persons and equipment, improving accountability of subcontractors, and improving accountability to the city and citizens.
  • a decision support system provides several economic benefits. Some of the direct benefits include the ability to commercialize and export the decision support system technology internationally. Some of the indirect benefits include the ability to address the frequent criticisms of industrial projects: high rate of incidents, inefficiency, and lack of accountability.
  • the techniques described herein are implemented by at least one computing device.
  • the techniques may be implemented in whole or in part using a combination of at least one server computer and/or other computing devices that are coupled using a network, such as a packet data network.
  • the computing devices may be hardwired to perform the techniques or may include digital electronic devices such as at least one application-specific integrated circuit (ASIC) or field programmable gate array (FPGA) that is persistently programmed to perform the techniques or may include at least one general purpose hardware processor programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination.
  • Such computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the described techniques.
  • the computing devices may be server computers, workstations, personal computers, portable computer systems, handheld devices, mobile computing devices, wearable devices, body mounted or implantable devices, smartphones, smart appliances, internetworking devices, autonomous or semi-autonomous devices such as robots or unmanned ground or aerial vehicles, any other electronic device that incorporates hard-wired and/or program logic to implement the described techniques, one or more virtual computing machines or instances in a data center, and/or a network of server computers and/or personal computers.
  • FIG. 11 is a block diagram that illustrates an example computer system with which an embodiment may be implemented.
  • a computer system 1100 and instructions for implementing the disclosed technologies in hardware, software, or a combination of hardware and software are represented schematically, for example as boxes and circles, at the same level of detail that is commonly used by persons of ordinary skill in the art to which this disclosure pertains for communicating about computer architecture and computer systems implementations.
  • Computer system 1100 includes an input/output (I/O) subsystem 1102 which may include a bus and/or other communication mechanism(s) for communicating information and/or instructions between the components of the computer system 1100 over electronic signal paths.
  • the I/O subsystem 1102 may include an I/O controller, a memory controller and at least one I/O port.
  • the electronic signal paths are represented schematically in the drawings, for example as lines, unidirectional arrows, or bidirectional arrows.
  • At least one hardware processor 1104 is coupled to I/O subsystem 1102 for processing information and instructions.
  • Hardware processor 1104 may include, for example, a general-purpose microprocessor or microcontroller and/or a special-purpose microprocessor such as an embedded system or a graphics processing unit (GPU) or a digital signal processor or ARM processor.
  • Processor 1104 may comprise an integrated arithmetic logic unit (ALU) or may be coupled to a separate ALU.
  • Computer system 1100 includes one or more units of memory 1106, such as a main memory, which is coupled to I/O subsystem 1102 for electronically digitally storing data and instructions to be executed by processor 1104.
  • Memory 1106 may include volatile memory such as various forms of random-access memory (RAM) or other dynamic storage devices.
  • Memory 1106 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 1104.
  • Such instructions when stored in non-transitory computer-readable storage media accessible to processor 1104, can render computer system 1100 into a special-purpose machine that is customized to perform the operations specified in the instructions.
  • Computer system 1100 further includes non-volatile memory such as read only memory (ROM) 1108 or other static storage device coupled to I/O subsystem 1102 for storing information and instructions for processor 1104.
  • the ROM 1108 may include various forms of programmable ROM (PROM) such as erasable PROM (EPROM) or electrically erasable PROM (EEPROM).
  • a unit of persistent storage 1110 may include various forms of non-volatile RAM (NVRAM), such as FLASH memory, or solid-state storage, magnetic disk or optical disk such as CD-ROM or DVD-ROM and may be coupled to I/O subsystem 1102 for storing information and instructions.
  • Storage 1110 is an example of a non-transitory computer-readable medium that may be used to store instructions and data which when executed by the processor 1104 cause performing computer-implemented methods to execute the techniques herein.
  • the instructions in memory 1106, ROM 1108 or storage 1110 may comprise one or more sets of instructions that are organized as modules, methods, objects, functions, routines, or calls.
  • the instructions may be organized as one or more computer programs, operating system services, or application programs including mobile apps.
  • the instructions may comprise an operating system and/or system software; one or more libraries to support multimedia, programming or other functions; data protocol instructions or stacks to implement TCP/IP, HTTP or other communication protocols; file format processing instructions to parse or render files coded using HTML, XML, JPEG, MPEG or PNG; user interface instructions to render or interpret commands for a graphical user interface (GUI), command-line interface or text user interface; application software such as an office suite, internet access applications, design and manufacturing applications, graphics applications, audio applications, software engineering applications, educational applications, games or miscellaneous applications.
  • the instructions may implement a web server, web application server or web client.
  • the instructions may be organized as a presentation layer, application layer and data storage layer such as a relational database system using structured query language (SQL) or no SQL, an object store, a graph database, a flat file system or other data storage.
  • Computer system 1100 may be coupled via I/O subsystem 1102 to at least one output device 1112.
  • output device 1112 is a digital computer display. Examples of a display that may be used in various embodiments include a touch screen display or a light-emitting diode (LED) display or a liquid crystal display (LCD) or an e-paper display.
  • Computer system 1100 may include other type(s) of output devices 1112, alternatively or in addition to a display device. Examples of other output devices 1112 include printers, ticket printers, plotters, projectors, sound cards or video cards, speakers, buzzers or piezoelectric devices or other audible devices, lamps or LED or LCD indicators, haptic devices, actuators, or servos.
  • At least one input device 1114 is coupled to I/O subsystem 1102 for communicating signals, data, command selections or gestures to processor 1104.
  • input devices 1114 include touch screens, microphones, still and video digital cameras, alphanumeric and other keys, keypads, keyboards, graphics tablets, image scanners, joysticks, clocks, switches, buttons, dials, slides, and/or various types of sensors such as force sensors, motion sensors, heat sensors, accelerometers, gyroscopes, and inertial measurement unit (IMU) sensors and/or various types of transceivers such as wireless, such as cellular or Wi-Fi, radio frequency (RF) or infrared (IR) transceivers and Global Positioning System (GPS) transceivers.
  • control device 1116 may perform cursor control or other automated control functions such as navigation in a graphical interface on a display screen, alternatively or in addition to input functions.
  • Control device 1116 may be a touchpad, a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 1104 and for controlling cursor movement on display 1112.
  • the input device may have at least two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
  • An input device 1114 may include a combination of multiple different input devices, such as a video camera and a depth sensor.
  • computer system 1100 may comprise an internet of things (IoT) device in which one or more of the output device 1112, input device 1114, and control device 1116 are omitted.
  • the input device 1114 may comprise one or more cameras, motion detectors, thermometers, microphones, seismic detectors, other sensors or detectors, measurement devices or encoders and the output device 1112 may comprise a special-purpose display such as a single-line LED or LCD display, one or more indicators, a display panel, a meter, a valve, a solenoid, an actuator or a servo.
  • input device 1114 may comprise a global positioning system (GPS) receiver coupled to a GPS module that is capable of triangulating to a plurality of GPS satellites, determining and generating geolocation or position data such as latitude-longitude values for a geophysical location of the computer system 1100.
  • Output device 1112 may include hardware, software, firmware, and interfaces for generating position reporting packets, notifications, pulse or heartbeat signals, or other recurring data transmissions that specify a position of the computer system 1100, alone or in combination with other application-specific data, directed toward host 1124 or server 1130 (a minimal sketch of such a recurring report appears after this list).
  • Computer system 1100 may implement the techniques described herein using customized hard-wired logic, at least one ASIC or FPGA, firmware and/or program instructions or logic which when loaded and used or executed in combination with the computer system causes or programs the computer system to operate as a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 1100 in response to processor 1104 executing at least one sequence of at least one instruction contained in main memory 1106. Such instructions may be read into main memory 1106 from another storage medium, such as storage 1110. Execution of the sequences of instructions contained in main memory 1106 causes processor 1104 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
  • Non-volatile media includes, for example, optical or magnetic disks, such as storage 1110.
  • Volatile media includes dynamic memory, such as memory 1106.
  • Common forms of storage media include, for example, a hard disk, solid state drive, flash drive, magnetic data storage medium, any optical or physical data storage medium, memory chip, or the like.
  • Storage media is distinct from but may be used in conjunction with transmission media.
  • Transmission media participates in transferring information between storage media.
  • transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise a bus of I/O subsystem 1102.
  • transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infrared data communications.
  • Various forms of media may be involved in carrying at least one sequence of at least one instruction to processor 1104 for execution.
  • the instructions may initially be carried on a magnetic disk or solid-state drive of a remote computer.
  • the remote computer can load the instructions into its dynamic memory and send the instructions over a communication link such as a fiber optic or coaxial cable or telephone line using a modem.
  • a modem or router local to computer system 1100 can receive the data on the communication link and convert the data to a format that can be read by computer system 1100.
  • a receiver such as a radio frequency antenna or an infrared detector can receive the data carried in a wireless or optical signal and appropriate circuitry can provide the data to I/O subsystem 1102, for example by placing the data on a bus.
  • I/O subsystem 1102 carries the data to memory 1106, from which processor 1104 retrieves and executes the instructions.
  • the instructions received by memory 1106 may optionally be stored on storage 1110 either before or after execution by processor 1104.
  • Computer system 1100 also includes a communication interface 1118 coupled to I/O subsystem 1102.
  • Communication interface 1118 provides a two-way data communication coupling to network link(s) 1120 that are directly or indirectly connected to at least one communication network, such as a network 1122 or a public or private cloud on the Internet.
  • communication interface 1118 may be an Ethernet networking interface, integrated-services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of communications line, for example an Ethernet cable or a metal cable of any kind or a fiber-optic line or a telephone line.
  • Network 1122 broadly represents a local area network (LAN), wide-area network (WAN), campus network, internetwork or any combination thereof.
  • Communication interface 1118 may comprise a LAN card to provide a data communication connection to a compatible LAN, or a cellular radiotelephone interface that is configured to send or receive cellular data according to cellular radiotelephone wireless networking standards, or a satellite radio interface that is configured to send or receive digital data according to satellite wireless networking standards.
  • communication interface 1118 sends and receives electrical, electromagnetic or optical signals over signal paths that carry digital data streams representing various types of information.
  • Network link 1120 typically provides electrical, electromagnetic, or optical data communication directly or through at least one network to other data devices, using, for example, satellite, cellular, Wi-Fi, or BLUETOOTH technology.
  • network link 1120 may provide a connection through a network 1122 to a host computer 1124.
  • network link 1120 may provide a connection through network 1122 or to other computing devices via internetworking devices and/or computers that are operated by an Internet Service Provider (ISP) 1126.
  • ISP 1126 provides data communication services through a world-wide packet data communication network represented as internet 1128.
  • a server computer 1130 may be coupled to internet 1128.
  • Server 1130 broadly represents any computer, data center, virtual machine or virtual computing instance with or without a hypervisor, or computer executing a containerized program system such as DOCKER or KUBERNETES.
  • Server 1130 may represent an electronic digital service that is implemented using more than one computer or instance and that is accessed and used by transmitting web services requests, uniform resource locator (URL) strings with parameters in HTTP payloads, API calls, app services calls, or other service calls (a sketch of such a call appears after this list).
  • Computer system 1100 and server 1130 may form elements of a distributed computing system that includes other computers, a processing cluster, server farm or other organization of computers that cooperate to perform tasks or execute applications or services.
  • Server 1130 may comprise one or more sets of instructions that are organized as modules, methods, objects, functions, routines, or calls. The instructions may be organized as one or more computer programs, operating system services, or application programs including mobile apps.
  • the instructions may comprise an operating system and/or system software; one or more libraries to support multimedia, programming or other functions; data protocol instructions or stacks to implement TCP/IP, HTTP or other communication protocols; file format processing instructions to parse or render files coded using HTML, XML, JPEG, MPEG or PNG; user interface instructions to render or interpret commands for a graphical user interface (GUI), command-line interface or text user interface; application software such as an office suite, internet access applications, design and manufacturing applications, graphics applications, audio applications, software engineering applications, educational applications, games or miscellaneous applications.
  • Server 1130 may comprise a web application server that hosts a presentation layer, application layer and data storage layer such as a relational database system using structured query language (SQL) or NoSQL, an object store, a graph database, a flat file system or other data storage (a minimal sketch of such a data storage layer appears after this list).
  • Computer system 1100 can send messages and receive data and instructions, including program code, through the network(s), network link 1120 and communication interface 1118.
  • a server 1130 might transmit a requested code for an application program through Internet 1128, ISP 1126, local network 1122 and communication interface 1118.
  • the received code may be executed by processor 1104 as it is received, and/or stored in storage 1110, or other non-volatile storage for later execution.
  • the execution of instructions as described in this section may implement a process in the form of an instance of a computer program that is being executed and that consists of program code and its current activity.
  • a process may be made up of multiple threads of execution that execute instructions concurrently.
  • a computer program is a passive collection of instructions, while a process may be the actual execution of those instructions.
  • Several processes may be associated with the same program; for example, instantiating several instances of the same program often means more than one process is being executed. Multitasking may be implemented to allow multiple processes to share processor 1104.
  • computer system 1100 may be programmed to implement multitasking to allow each processor to switch between tasks that are being executed without having to wait for each task to finish.
  • switches may be performed when tasks perform input/output operations, when a task indicates that it can be switched, or on hardware interrupts.
  • Time-sharing may be implemented to allow fast response for interactive user applications by rapidly performing context switches that provide the appearance of multiple processes executing simultaneously.
  • an operating system may prevent direct communication between independent processes, providing strictly mediated and controlled inter-process communication functionality; a sketch of this process model appears after this list.
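
The recurring position reports described above can be pictured with a minimal sketch. The following Python example is only illustrative: the UDP endpoint, the JSON payload fields, and the fixed coordinates are hypothetical stand-ins, since the application does not specify a transport, format, or update rate.

```python
# Minimal sketch of a recurring position "heartbeat"; the UDP endpoint,
# payload fields, and coordinates are hypothetical stand-ins.
import json
import socket
import time

ENDPOINT = ("127.0.0.1", 9000)  # placeholder for host 1124 or server 1130


def read_position():
    """Stand-in for a GPS module; returns fixed example latitude-longitude values."""
    return {"lat": 37.4220, "lon": -122.0841}


def send_heartbeats(count=3, interval_s=1.0):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for _ in range(count):
            packet = {"ts": time.time(), **read_position()}
            sock.sendto(json.dumps(packet).encode("utf-8"), ENDPOINT)
            time.sleep(interval_s)
    finally:
        sock.close()


if __name__ == "__main__":
    send_heartbeats()
```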
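
A web-services request of the kind server 1130 might accept can be sketched with the standard library alone. The base URL, path, and query parameters below are hypothetical; they only illustrate transmitting a URL string with parameters and parsing a JSON response.

```python
# Minimal sketch of a service call using a URL string with parameters;
# the endpoint, path, and parameter names are hypothetical.
import json
from urllib import parse, request

BASE_URL = "http://127.0.0.1:8080/api/v1/risks"  # placeholder service endpoint


def fetch_risks(site_id, since):
    query = parse.urlencode({"site_id": site_id, "since": since})
    # Expects a service listening at BASE_URL that returns a JSON body.
    with request.urlopen(f"{BASE_URL}?{query}", timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    print(fetch_risks("site-42", "2021-12-21T00:00:00Z"))
```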
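
As a minimal illustration of the relational data storage layer mentioned above, the sketch below uses SQLite as a stand-in; the table and columns are hypothetical and exist only to show a basic SQL round trip.

```python
# Minimal sketch of a relational data storage layer using SQLite as a
# stand-in; the table and columns are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory database for illustration
conn.execute(
    "CREATE TABLE risk_events (id INTEGER PRIMARY KEY, site TEXT, severity TEXT)"
)
conn.execute(
    "INSERT INTO risk_events (site, severity) VALUES (?, ?)", ("site-42", "high")
)
for row in conn.execute("SELECT site, severity FROM risk_events"):
    print(row)  # -> ('site-42', 'high')
conn.close()
```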
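
The process model described in the last few items can be illustrated as follows: several processes are instantiated from the same program and exchange data only through an operating-system-mediated channel (here a multiprocessing queue). The task payloads are hypothetical.

```python
# Minimal sketch of several processes instantiated from the same program,
# communicating only through an OS-mediated channel (a multiprocessing Queue).
import multiprocessing as mp
import os


def worker(task_id, results):
    # Each call runs as a separate process executing the same program code.
    results.put((task_id, os.getpid()))


if __name__ == "__main__":
    results = mp.Queue()
    procs = [mp.Process(target=worker, args=(i, results)) for i in range(3)]
    for p in procs:
        p.start()
    for _ in procs:
        task_id, pid = results.get()  # blocks until each worker reports
        print(f"task {task_id} ran in process {pid}")
    for p in procs:
        p.join()
```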

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Factory Administration (AREA)
  • Jib Cranes (AREA)
  • Eye Examination Apparatus (AREA)
  • Control Of Vending Devices And Auxiliary Devices For Vending Devices (AREA)

Abstract

A computer-implemented method for monitoring productivity, health, and safety risks posed by activities, objects, and other signals present at industrial sites comprises: receiving data inputs from input devices at an industrial site; selecting a data model that is programmed to detect activities or objects associated with workers or equipment present at industrial sites; applying the data inputs to the data model to receive output data specifying whether the activities or objects associated with the workers or equipment are present at the industrial site; and, if they are present: determining, based on the output data, characteristics of the activities or objects; determining, based on the characteristics, whether the activities or objects cause productivity, health, or safety risks at the industrial site; and, if so, generating notifications indicating the health or safety risks at the industrial site.
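
Read as a pipeline, the claimed method receives data inputs, applies a selected data model, derives characteristics, assesses risk, and generates notifications. The following sketch is purely illustrative: the detector, the risk rule, and the speed threshold are hypothetical stand-ins, not the application's implementation.

```python
# Illustrative-only sketch of the claimed monitoring loop; the detector,
# risk rule, and threshold are hypothetical stand-ins.
from dataclasses import dataclass


@dataclass
class Detection:
    label: str        # e.g. "worker" or "excavator"
    speed_mps: float  # characteristic derived from the model's output data


def select_model(site_type):
    # Stand-in for selecting a data model programmed to detect activities or objects.
    def model(frame):
        return [Detection("excavator", 4.2)]  # hypothetical output for illustration
    return model


def poses_risk(det):
    # Hypothetical rule: flag equipment moving faster than 3 m/s.
    return det.label == "excavator" and det.speed_mps > 3.0


def monitor(frames, site_type="construction"):
    model = select_model(site_type)
    for frame in frames:
        for det in model(frame):
            if poses_risk(det):
                yield f"ALERT: {det.label} moving at {det.speed_mps} m/s poses a safety risk"


if __name__ == "__main__":
    for notification in monitor(frames=[None]):  # placeholder input data
        print(notification)
```
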
PCT/US2021/064714 2020-12-21 2021-12-21 Identification et surveillance de risques pour la productivité, la santé et la sécurité dans des sites industriels WO2022140460A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2021409557A AU2021409557A1 (en) 2020-12-21 2021-12-21 Identifying and monitoring productivity, health, and safety risks in industrial sites
EP21912086.2A EP4264383A1 (fr) 2020-12-21 2021-12-21 Identification et surveillance de risques pour la productivité, la santé et la sécurité dans des sites industriels

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17/129,355 US20210109497A1 (en) 2018-01-29 2020-12-21 Identifying and monitoring productivity, health, and safety risks in industrial sites
US17/129,355 2020-12-21

Publications (1)

Publication Number Publication Date
WO2022140460A1 (fr)

Family

ID=82160088

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/064714 WO2022140460A1 (fr) 2020-12-21 2021-12-21 Identification et surveillance de risques pour la productivité, la santé et la sécurité dans des sites industriels

Country Status (3)

Country Link
EP (1) EP4264383A1 (fr)
AU (1) AU2021409557A1 (fr)
WO (1) WO2022140460A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170314961A1 (en) * 2012-10-12 2017-11-02 Nec Laboratories America, Inc. A data analytic engine towards the self-management of complex physical systems
JP2018028784A (ja) * 2016-08-17 2018-02-22 富士通株式会社 移動体群検出プログラム、移動体群検出装置、及び移動体群検出方法
US20180183823A1 (en) * 2016-12-28 2018-06-28 Samsung Electronics Co., Ltd. Apparatus for detecting anomaly and operating method for the same
US20200103894A1 (en) * 2018-05-07 2020-04-02 Strong Force Iot Portfolio 2016, Llc Methods and systems for data collection, learning, and streaming of machine signals for computerized maintenance management system using the industrial internet of things
WO2020191004A1 (fr) * 2019-03-18 2020-09-24 Georgia Tech Research Corporation Procédé et système de suivi et d'alerte pour la productivité et la sécurité des travailleurs
US20210109497A1 (en) * 2018-01-29 2021-04-15 indus.ai Inc. Identifying and monitoring productivity, health, and safety risks in industrial sites

Also Published As

Publication number Publication date
AU2021409557A1 (en) 2023-08-10
EP4264383A1 (fr) 2023-10-25

Similar Documents

Publication Publication Date Title
US20210109497A1 (en) Identifying and monitoring productivity, health, and safety risks in industrial sites
US10896331B2 (en) Monitoring activities in industrial sites
De Melo et al. Applicability of unmanned aerial system (UAS) for safety inspection on construction sites
Zhou et al. A multidimensional framework for unmanned aerial system applications in construction project management
Teizer et al. Proximity hazard indicator for workers-on-foot near miss interactions with construction equipment and geo-referenced hazard areas
US11875410B2 (en) Systems and methods for dynamic real-time analysis from multi-modal data fusion for contextual risk identification
Costin et al. RFID and BIM-enabled worker location tracking to support real-time building protocol and data visualization
Edirisinghe Digital skin of the construction site: Smart sensor technologies towards the future smart construction site
Zhou et al. Safety barrier warning system for underground construction sites using Internet-of-Things technologies
Vahdatikhaki et al. Framework for near real-time simulation of earthmoving projects using location tracking technologies
Navon Automated project performance control of construction projects
US20220284366A1 (en) Method and system for managing a crane and/or construction site
CN104808610B (zh) 基于bim建筑信息模型的机械控制系统及方法
US10671948B2 (en) Work cycle management
Ren et al. Real-time anticollision system for mobile cranes during lift operations
Desvignes Requisite empirical risk data for integration of safety with advanced technologies and intelligent systems
Nakanishi et al. A review of monitoring construction equipment in support of construction project management
Abdel-Alim et al. Dynamic labor tracking system in construction project using bim technology
Hatoum et al. The use of Drones in the construction industry: Applications and implementation
EP4264383A1 (fr) Identification et surveillance de risques pour la productivité, la santé et la sécurité dans des sites industriels
Pradhananga Construction site safety analysis for human-equipment interaction using spatio-temporal data
Su Construction crew productivity monitoring supported by location awareness technologies
CN114926147A (zh) 智能轻量化管理平台及方法
Kassem et al. Measuring and improving the productivity of construction’s site equipment fleet: an integrated IoT and BIM system
Alzubi et al. Applications of cyber-physical systems in construction projects

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 21912086
    Country of ref document: EP
    Kind code of ref document: A1

WWE Wipo information: entry into national phase
    Ref document number: 2021912086
    Country of ref document: EP

NENP Non-entry into the national phase
    Ref country code: DE

ENP Entry into the national phase
    Ref document number: 2021912086
    Country of ref document: EP
    Effective date: 20230721

ENP Entry into the national phase
    Ref document number: 2021409557
    Country of ref document: AU
    Date of ref document: 20211221
    Kind code of ref document: A