
Remote crane tracking

Info

Publication number
IL292503B1
Authority
IL
Israel
Prior art keywords
crane
display
operator
trigger event
image
Prior art date
Application number
IL292503A
Other languages
Hebrew (he)
Other versions
IL292503B2 (en)
IL292503A (en)
Inventor
Aviv Carmel
Original Assignee
Sky Line Cockpit Ltd
Crane Cockpit Tech Ltd
Aviv Carmel
Priority date
Filing date
Publication date
Application filed by Sky Line Cockpit Ltd, Crane Cockpit Tech Ltd, Aviv Carmel
Priority to IL292503A
Priority to PCT/IL2023/050415 (WO2023209705A1)
Publication of IL292503A
Publication of IL292503B1
Publication of IL292503B2

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66C CRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C13/00 Other constructional features or details
    • B66C13/18 Control systems or devices
    • B66C13/40 Applications of devices for transmitting control pulses; Applications of remote control devices
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66C CRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C13/00 Other constructional features or details
    • B66C13/18 Control systems or devices
    • B66C13/46 Position indicators for suspended loads or for crane elements
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66C CRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C15/00 Safety gear
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Control And Safety Of Cranes (AREA)
  • Closed-Circuit Television Systems (AREA)

Description

REMOTE CRANE TRACKING

FIELD OF THE INVENTION

The present invention relates generally to the field of remote tracking of conditions and operation of a machine or system, and more specifically to remote tracking of conditions and operation of a crane.
BACKGROUND OF THE INVENTION

The field of remotely tracking the conditions, settings, and operation of systems and devices is constantly progressing, driven by the ongoing development of improved sensing and recording abilities, improved computational abilities, and improved wireless technologies. This allows a growing variety of fields to gradually reduce the number of "men-on-the-spot" who are required to run and supervise the working of a device or system, and to replace them with relevant sensors and computational programs.
However, in fields which involve complex operations requiring a large number of workers on the spot working in highly dynamic settings, particularly when the operations are at a relatively high level of risk, the transition to remotely controlled systems is more complicated and slower to be adopted.
SUMMARY OF THE INVENTION

In accordance with one aspect of the present invention, there is thus provided a remote crane tracking apparatus for remotely controlling and monitoring at least one crane, including a display for displaying to an operator, at a remote cockpit location, a target object of particular interest. The remote crane tracking apparatus further includes at least one image sensor, and a tracking device operational for tracking the target object and providing an image of the target object to the display. The target object may include at least one of: the crane hook; a load to be carried by the at least one crane; a load landing spot; a safety weight of the crane; a display of a scenery of interest; and a view of the crane surroundings as captured by an image sensor disposed at a cockpit-location on the at least one crane. The target object may be maintained in a preferred position on the display, the preferred position being selected by the operator and/or predefined in a controller, wherein the controller is operational for controlling the tracking device and/or the display. The preferred position may be intended to maintain the operator in a healthy body posture. A picture may burst onto the display when the target object is identified in data provided by a particular image sensor of the at least one image sensor. The crane tracking apparatus may be part of a system, which system includes a speaker for sounding crane real-time environmental noises as heard and perceived by a microphone at a cockpit-location of the at least one crane. The at least one image sensor may include a multiplicity of sensors, and the image perceived by a particular image sensor may be displayed as a principal display capturing the principal area of the display, while the image perceived by at least one other sensor of the multiplicity of sensors may be displayed as a secondary display capturing a secondary area of the display. Sound captured by a microphone of the particular image sensor may be played for the operator. The particular image sensor may be selected when the target object is identified as best seen by/closest to the particular sensor.

The crane tracking apparatus may further include Augmented Reality (AR) or Virtual Reality (VR) features for combining with the displayed image, providing data, and pointing at and marking items and locations of interest. The locations of interest may include a load landing spot. The crane tracking apparatus may further include a hopping mobile apparatus configured for being mobilized and deployed for chirurgic command and control tasks. The target object may include a series of objects with regard to which an assignment is required to be conducted.
The series of objects may include the crane safety pins, and the assignment may include a safety check of the status of each of the safety pins.
The target object may include a safety weight sensor, operational for indicating a safety threshold check regarding maximal load and/or maximal spatial positioning of a load. The at least one image sensor may be mounted on at least one of the following: crane cockpit; signaler helmet; UAV; and structures in the vicinity of an area of interest.
In accordance with another aspect of the present invention, there is thus provided a method for remotely controlling and monitoring at least one crane in remote crane tracking, including the procedure of retrieving an image of a target object of particular interest, including tracking the target object by at least one image sensor. The method further includes providing the image of the target object to a display, and displaying the image on the display to an operator at the remote cockpit location. The target object may include at least one of: the crane hook; a load to be carried by the at least one crane; a load landing spot; a safety weight of the crane; a display of a scenery of interest; and a view of the crane surroundings as captured by an image sensor disposed at a cockpit-location on the at least one crane. The method may further include the procedure of maintaining the target object in a preferred position on the display, the preferred position being selected by the operator and/or predefined in a controller, wherein the controller is operational for controlling the tracking device and/or the display. The preferred position may be intended to maintain the operator in a healthy body posture.
The method may further include the procedure of bursting the image onto the display when the target object is identified in data provided by a particular image sensor of the at least one image sensor. The method may further include the procedure of sounding crane real-time environmental noises as heard and perceived by a microphone at a cockpit-location of the at least one crane, using a speaker. The method may further include displaying the image perceived by a particular image sensor as a principal display, capturing the principal area (the majority) of the display, and displaying the image perceived by at least one other image sensor as a secondary display capturing a secondary area of the display. The method may further include the procedure of playing the sound captured by a microphone of the particular image sensor for the operator. The method may further include the procedure of selecting the particular image sensor when the target object is identified as best seen by/closest to the particular sensor.
The method may further include the procedure of combining AR/VR features with the displayed image, for providing data, and pointing at and marking items and locations of interest. The locations of interest may include a load landing spot. The method may further include the procedure of mobilizing and deploying a hopping mobile apparatus for chirurgic command and control tasks.
The target object may include a series of objects with regard to which an assignment is required to be conducted. The series of objects may include the crane safety pins, and the assignment may include a safety check of the status of each of the safety pins. The target object may include a safety weight sensor operational for indicating a safety threshold check regarding maximal load and/or maximal spatial positioning of a load.
The method may further include the procedure of mounting the at least one image sensor on at least one of the following: crane cockpit; signaler helmet; UAV; and structures in the vicinity of an area of interest.
BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings, in which:

Figure 1A is a schematic illustration of a crane tracking apparatus, including a crane and a remote display, constructed and operative according to the present invention;

Figure 1B is a schematic illustration of the crane of the crane tracking apparatus of Figure 1A, including a UAV controlled and maneuvered by an operator;

Figure 2A is a schematic illustration of the display of Figure 1A, divided into a plurality of screens of which at least one is an optimal-settings screen;

Figure 2B is a schematic illustration of the display of Figure 1A, divided into a major screen and a minor screen;

Figure 3A is a schematic illustration of a variety of tracking-sensors mounted on a crane and/or in the vicinity thereof, and in communication with a controller, constructed and operative according to another embodiment of the present invention;

Figure 3B is an illustration of coupled mutual detectors mounted on the crane of Figure 3A and on a neighboring tree;

Figure 3C is a schematic illustration of an automatic intervention response of a controller to a suddenly appearing hazard, operative according to another embodiment of the present invention;

Figure 4 is a schematic illustration of real-time-conditions indicators located in the vicinity of an operator in a remote cockpit of a crane, constructed and operative according to another embodiment of the present invention;

Figures 5A to 5C are sequential live images of a crane shown upon the display, with changing display settings in response to data which is indicative of a potential defect or hazard, constructed and operative according to another embodiment of the present invention, wherein Figure 5A is a visual display of real-time imaging of a scene of interest shown on the display, the contrast between the crane and its background environment being at a baseline level, Figure 5B is a schematic illustration of the live images of Figure 5A, where the contrast level is augmented, and Figure 5C is a schematic illustration of the live images of Figure 5B, where the contrast level is further augmented;

Figure 6A is a schematic illustration of an image of a crane presented on a display, with added augmented reality features, constructed and operative according to another embodiment of the present invention;

Figure 6B is a schematic illustration of an image of a crane presented on a display, with added virtual reality features, constructed and operative according to another embodiment of the present invention; and

Figure 7 is a block diagram of a method for remote tracking of a crane, operative according to another embodiment of the present invention.
DETAILED DESCRIPTION

The present invention addresses the above-mentioned issues by providing a remote crane tracking apparatus and method for tracking and controlling at least one crane from a distance by an operator. The remote crane tracking apparatus includes a display for displaying live imaging of the crane to an operator at a remote cockpit location, a site image sensor for continuously recording the crane and supplying the live imaging to the image display, and a tracking device for automatically centering and focusing the image sensor on a selected target object of particular interest, for tracking the target object. The target object may include, for example, the crane hook, a load carried by the crane, a load landing spot, a feature of interest from the surrounding scenery, and the like. The site image sensor may include a plurality of sensors, which may be positioned at different locations of the crane and the surrounding scenery so that the sensors collectively span the work site area. In a certain embodiment, the site image sensor is configured to capture a trigger event in accordance with a controller functionality. The image display may be configured to switch between images provided by different sensors according to how well they depict the target object, and may even display more than one live imaging at a time. Other sensors may be combined with the image sensor, and AR/VR features may be combined with the display, to enhance the tracking and control abilities of the operator.
The operation of the apparatus and system will now be further explained with reference to the illustrations.
Reference is made to Figure 1A, which is a schematic illustration of a crane tracking apparatus, hereby referenced 100, constructed and operative according to the present invention. Crane tracking apparatus 100 includes controller 110 and display 112, for following and controlling the operation of a crane, referenced 200. Crane 200 includes vertical crane mast 202, jib 204 which is positioned at a top region of crane mast 202 and perpendicular thereto, coupling pins 208 which connect the mast sections of mast 202, and hoist-and-hook 206 which is suspended from jib 204 and is configured to have a load connected to its end and to be extended and retracted according to need. Counterweight 210 is mounted at an end of jib 204 opposite to the end from which hoist-and-hook 206 extends, and is operative to counterbalance a load carried by hoist-and-hook 206.
Two image sensors 220 (e.g., cameras) are installed in the vicinity of crane 200, one mounted on the top of jib 204, in the region substantially correlated to the location of a standard non-remote crane-cockpit, and one placed on a neighboring building 300, and visually record crane 200 and its surroundings for being displayed in real time as live imaging upon display 112, for the use of an operator 150. A tracking device is embodied by a plurality of emitters 222, which are installed in various locations on crane 200, including the distal end of jib 204, the hook of hoist-and-hook 206, and counterweight 210, and allow enhanced tracking of the components on which they are installed by operator 150 by use of receiving unit 224, which is coupled to an image sensor 220. In a certain embodiment, site image sensors are connected to a cloth fastener, or a hook-and-loop fastener, or an elastic band, thereby providing wearability. Other fasteners rendering the image sensors wearable are included within the scope of this invention. Wearable sensors enable detection of trigger events of personnel in designated areas. For example, when a wearable site sensor is implemented as an image sensor, then detection of an image matching a reference image of a designated site area detected by a controller is identified as a trigger event.
The components and devices of apparatus 100 may be based in hardware, software, or combinations thereof. It is appreciated that the functionality associated with each of the devices or components of apparatus 100 may be distributed among multiple devices or components, which may reside at a single location or at multiple locations. For example, the functionality associated with controller 110 may be distributed between multiple processing units. Controller 110 may be part of a server or a remote computer system accessible over a communications medium or network, or may be integrated with other components of apparatus 100, such as incorporated with image sensor 220. Apparatus 100 may optionally include and/or be associated with additional components not shown in Figure 1A, for enabling the implementation of the disclosed subject matter. For example, apparatus 100 may include a memory or storage unit (not shown) for temporary storage of images or other data.

The operation of remote crane tracking apparatus 100 will now be described in general terms, followed by specific examples.
Operator 150 manages and follows the operation of crane 200 by the use of controller 110 and display 112 (respectively). Crane 200 is continuously monitored by at least one image sensor 220, such that the data recorded by image sensor 220 is substantially instantaneously transmitted to controller 110 and translated into a visual image which is displayed upon display 112. Image sensors 220 may be installed at any location on crane 200 or on surrounding features, and positioned so as to track one or more specific components of crane 200 or of the surrounding features (herein: "selected crane component" or "target object"), which are of particular interest to the crane operator 150. Some examples of locations at which image sensors 220 may be positioned include: the position of a non-remote crane cockpit, mast 202, each of the ends of jib 204, hoist-and-hook 206, surrounding building 300, and the like. When the target object includes a plurality of dispersed objects, such as a series of objects, or is operational to move along a trajectory, a plurality of image sensors 220 may be positioned to capture the plurality of objects, or a plurality of locations along the trajectory of the target object, respectively. Image sensors 220 may also be positioned on mobile elements in the vicinity of crane 200, such as trucks and cement-mixers, and particularly may be worn by workers at the building site. For example, a signaler who directs operator 150 in maneuvering an element of crane 200, and continuously adjusts his physical location to be in an optimal position to survey the element of crane 200 which is being maneuvered, may have an image sensor 220 installed on the top of his helmet or on any other part of his attire, allowing operator 150 to observe crane 200 from the perspective of the signaler.
Unmanned Air Vehicles (UAVs) may also include an image sensor 220, and be utilized to survey crane 200 from changing locations. The at least one target object of crane 200, or of the vicinity thereof, may include, by way of a non-limiting example: the crane hook (of hoist-and-hook 206), a load carried by the crane, counterweight 210, safety pins 208, a load landing spot, a display of a scenery of interest, a view of the crane surroundings as captured by an image sensor disposed at a cockpit-location on the at least one crane, etc. The target object may further include a safety weight sensor, which is operational for measuring the weight and/or distance of a crane load, and for indicating a safety threshold check regarding a maximal load and/or maximal spatial positioning of a load.
Operator 150 perceives the real time situation of crane 200 as shown on display 112 and can react accordingly via controller 110. Controller 110 may be operational to manage all operations of crane 200, such as changing the angle of a working arm/jib, i.e., changing the azimuth and/or vertical angle thereof; extending and retracting of a working arm/jib; lowering and collecting of a load hoist; and any other operation of a crane as known in the art, particularly the crane operations which are usually controlled by a crane-cockpit located on the crane.
Controller 110 may also control the activation and operation of image sensors 220, including adjustment of angle, position, focus, magnification, and any other settings of image sensors 220. In a first option, the imaging data (herein "images") recorded and provided by only a single image sensor 220 (herein "currently shown image sensor") is shown on display 112 at any given time. Operator 150 can adjust the settings of display 112, e.g., brightness, contrast, coloring and the like, and/or can adjust any of the settings of currently shown image sensor 220, so as to better perceive or locate a feature of interest shown on display 112. Operator 150 can also shift between the images provided by the various image sensors 220, to survey different components of crane 200 and to improve the view of a particular feature of interest.
Emitters 222 can be installed on any number of various components of crane 200 and the surrounding features, and are configured to each emit a distinct signal which can be identified by a receiving unit 224. Receiving unit 224 recognizes the directionality of the signal emitted by an emitter 222 relative to receiving unit 224. Receiving unit 224 may also recognize the intensity of the signal, which allows it to assess the distance of emitter 222 from receiving unit 224, and, together with the directionality, can allow it to assess the location of emitter 222 relative to receiving unit 224. Receiving unit 224 is in wireless communication with controller 110, and may be fixedly installed at a particular position which is recorded by controller 110, such that any space-related information, e.g., location coordinates, which is defined relative to receiving unit 224 can be translated into absolute location coordinates as defined in controller 110 (herein "absolute location coordinates"). As just one example, the absolute location coordinates may be defined such that the longitudinal and latitudinal coordinates comply with the GPS system, and the altitudinal coordinate is defined relative to sea level.
Any other coordinate system may of course be defined. When controller 110 receives data from receiving unit 224 regarding the signal emitted by emitter 222, controller 110 can compute the position of emitter 222 relative to receiving unit 224, and by extension compute the precise absolute location of emitter 222 and the crane component to which it is coupled.
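By way of a non-limiting illustration only (this sketch is not part of the original disclosure; the free-space signal model and all function names are assumptions), the direction-plus-intensity computation described above might be reduced to code along these lines:

```python
import math

# Assumed free-space model: received intensity falls off with the square of
# distance, so distance can be recovered from the intensity ratio.
def distance_from_intensity(received, emitted, ref_distance=1.0):
    """Estimate emitter distance (m) from received vs. emitted intensity."""
    return ref_distance * math.sqrt(emitted / received)

def emitter_absolute_position(receiver_xyz, azimuth_deg, elevation_deg,
                              received, emitted):
    """Translate a direction + intensity reading at receiving unit 224
    into absolute coordinates of an emitter 222."""
    d = distance_from_intensity(received, emitted)
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    rx, ry, rz = receiver_xyz
    # Spherical-to-Cartesian offset from the receiver's fixed, known position.
    return (rx + d * math.cos(el) * math.sin(az),
            ry + d * math.cos(el) * math.cos(az),
            rz + d * math.sin(el))

# Receiver fixed at known site coordinates 30 m up; emitter heard at 40% of
# its emitted intensity, bearing 45 degrees, 10 degrees above the horizon.
print(emitter_absolute_position((0.0, 0.0, 30.0), 45.0, 10.0, 0.4, 1.0))
```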
Other examples of a tracking device, which provides location-related data of selected crane components, may include, for example: a visually-identifiable-element coupled to the selected crane component, featuring a distinct shape, design, color or other visual feature, which controller 110 can be predefined to identify within visual data provided by visual sensors 220; machine learning of the features of interest; a GPS receiver and a barometric pressure sensor (e.g., InvenSense ICP-10111 barometric sensor) both coupled to the selected crane component, the GPS receiver being operational to compute the longitudinal and latitudinal coordinates of the crane component, and the barometric pressure sensor being operational to compute the altitudinal position of the selected crane component; or any other tracking device or system known in the art.
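For the barometric option, the altitudinal coordinate can be derived from a pressure reading with the standard international barometric formula; a minimal sketch (the function name and example values are illustrative, not from the patent):

```python
def altitude_from_pressure(pressure_pa, sea_level_pa=101325.0):
    """Standard barometric formula: approximate altitude (m) above sea level
    of the selected crane component carrying the pressure sensor."""
    return 44330.0 * (1.0 - (pressure_pa / sea_level_pa) ** (1.0 / 5.255))

# A reading of ~95 kPa corresponds to roughly 540 m above sea level;
# combined with GPS longitude/latitude this yields a full 3-D position.
print(round(altitude_from_pressure(95000.0), 1))
```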
Tracking the location of a selected component of crane 200 by controller 110 can improve the surveillance abilities of operator 150. Controller 110 may be predefined to continuously adjust the position, focus and/or other characteristics of an image sensor 220, so as to maintain the selected crane component in a preferred region of the image provided by image sensor 220, e.g., a substantially central region, a region in the eye line of operator 150, a region dynamically selected by operator 150, or any other predefined region, such that when the selected crane component is in motion it appears on display 112 to be floating at a substantially fixed region of the image, while the surrounding scenery dynamically changes. For example, controller 110 may adjust the angle of image sensor 220, i.e., the direction in which it points, so as to follow the trajectory along which the selected crane component (or corresponding tracking device/element) is moving, and to zoom in to provide an enlarged view of the target, relative to the image size upon detection as a trigger event. This close-up view advantageously provides an operator with additional detail of the surrounding area, facilitating operator control. Alternatively, the preferred region may be defined relative to display 112, and not necessarily in relation to the live imaging provided by image sensor 220. This may be particularly useful when images from a plurality of image sensors 220 are being presented on display 112 and the selected crane component is being tracked by more than one image sensor 220, as will be further explained in relation to Figures 2A and 2B. Furthermore, in some situations the selected crane component may be captured by image sensor 220 but the angle or position of image sensor 220 cannot be adjusted to bring the selected crane component into a preferred region of the image. In such a situation, controller 110 may be operational to digitally bring the selected crane component into the preferred region on display 112. This may be achieved, for example, by enlarging the portion of the image surrounding the selected crane component (i.e., zooming in), and/or by moving the image supplied by image sensor 220 relative to display 112, such that the portion of the image including the selected crane component is located in the preferred region of display 112, and other parts of the live imaging (on the opposite side of the image from the selected crane component) are not presented on display 112. The parts of display 112 which become imageless due to the moving of the live imaging relative to display 112 may be filled with a generic filling, such as a single-colored patch, or may be filled by controller 110 with virtual reality elements, clearly differentiated from the real live imaging which is provided by the image sensor 220, which virtual reality elements may be based on image data supplied by any of image sensors 220.
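A simplified sketch of this centering behaviour is shown below, assuming a hypothetical pan/tilt camera interface and a per-frame pixel location of the tracked component (neither interface is specified in the patent):

```python
from dataclasses import dataclass

@dataclass
class PanTiltCamera:          # hypothetical stand-in for an image sensor 220
    pan_deg: float = 0.0
    tilt_deg: float = 0.0

def recenter(camera, target_px, preferred_px, frame_size,
             fov_deg=(60.0, 40.0), gain=0.5):
    """Nudge the camera so the tracked component drifts toward the
    preferred region of the image (simple proportional control)."""
    # Pixel error between where the target is and where it should appear.
    err_x = target_px[0] - preferred_px[0]
    err_y = target_px[1] - preferred_px[1]
    # Convert pixel error to an angular correction via the field of view.
    camera.pan_deg  += gain * err_x / frame_size[0] * fov_deg[0]
    camera.tilt_deg -= gain * err_y / frame_size[1] * fov_deg[1]

cam = PanTiltCamera()
# Target detected right of centre in a 1920x1080 frame; preferred region
# is the frame centre, so the camera pans right to re-centre it.
recenter(cam, target_px=(1400, 540), preferred_px=(960, 540),
         frame_size=(1920, 1080))
print(cam)
```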
In addition to assisting operator 150 in tracking a selected crane component, maintaining the selected crane component in a preferred position may also allow, or assist, operator 150 in maintaining a healthy body posture while supervising the operation of the various crane components. This is particularly significant when the operation of the selected crane component, which requires scrutiny, is lengthy, requiring substantially continuous examination for a period of, for example: 15 minutes, 30 minutes, 1 hour, 2 hours, 4 hours, or more.
For example, when hoist-and-hook 206 is being operated to successively transfer a plurality of loads from a first position to a second position, operator 150 often tracks the movement of hoist-and-hook 206 throughout the entire operation, moving along a trajectory back-and-forth from the first position to the second position, where the two positions may be at different heights from the ground.
Maintaining hoist-and-hook 206 at a preferred position on display 112, the preferred position being either predefined or selected by operator 150 (optionally, ongoingly selected during the operation of hoist-and-hook 206), may ensure that operator 150 is gazing at a point on display 112 which corresponds to a healthy body posture. Optionally, a healthy body posture may include diversifying the body posture of operator 150 after a lapse of a predetermined time period (e.g., minutes), where in order to assist operator 150 in achieving this body posture, controller 110 may be predefined to relocate the preferred position of the selected crane element on display 112, according to a predefined or a random sequence pattern. Examples of an unhealthy body posture may include, for example, continuously bending the neck of operator 150 downwards, straining the neck upwards, turning the neck to one side, hunching the back of operator 150, and the like.
Alternatively, or additionally, controller 110 may be operational to emphasize the selected crane component upon the image provided by image sensor 220 and presented on display 112, for example by adding a colored-dot to the image in close proximity to the selected crane component, surrounding the selected crane component with a circle, changing the image settings in the immediate vicinity of the selected crane component (e.g., sharpness, coloring, brightness, magnification, etc.), and the like.
It is noted that tracking the selected crane component by controller 110 may be carried out independently of the imaging data which is shown on display 112. Controller 110 may track the location, and motion, of any number of selected components of crane 200 or the surroundings thereof, either through data provided by image sensors 220 or through data provided by any other sensors, without any necessary correlation with the imaging data which is being presented on display 112. This enhances the ability of operator 150 to control and regulate the operation of the crane components, as he can receive information and indications from controller 110 which are not dependent upon and limited by his own perception of crane 200 through display 112.
The image sensors 220 (particularly those which are installed in fixed locations), although optionally dispersed in various regions of crane 200 and the vicinity thereof, may not effectively cover all elements or areas of crane 200 or the vicinity thereof which may be of interest to operator 150, such that operator 150 may not have visual access to an element or area which requires attention. To overcome this issue, a hopping mobile apparatus, which includes an image sensor, may be utilized to be directed to visually record the area of interest.
Reference is now made also to Figure 1B, which is an illustration of UAV 226, which is controlled and maneuvered by operator 150 to scan regions which are not sufficiently scanned by other image sensors 220. UAV 226 includes image sensor 221, which is coupled to the front upper portion of UAV 226. UAV 226 may include more than one image sensor 221, which may face in different directions, conveying live imaging coverage of different sides of UAV 226. This may be particularly useful when UAV 226 is applied by operator 150 to scan a region which may be difficult to access directly with a UAV, such that the forward-facing image sensor 221 may not be effective, but image sensors 221 which are located on other portions of UAV 226 and/or which are facing other directions may be useful instead. UAV 226 may include other features or components in addition to image sensor 221, such as a deployable arm, a voice recorder (e.g., microphone), and the like, and may be utilized by operator 150 (and/or controller 110) to perform chirurgic command and control tasks on crane 200 or components thereof, particularly in regions which operator 150 cannot sufficiently access with the other components of crane 200. The UAV may include, for example, a delivery drone, a helicam, or any other type of UAV, and may be manually controlled and/or autonomously operated.
Reference is now made to Figure 2A, which is an illustration of display 112 divided into a plurality of screens 114, each screen 114 showing a live imaging provided by an individual corresponding one of image sensors 220, where: screen 114A displays a live imaging of the front-end of jib 204 (from which hoist-and-hook 206 extends) provided by an image sensor 220 which is located at the crane-cockpit location of crane 200; screen 114B displays a live imaging of the rear-end of jib 204 (including counterweight 210) provided by an image sensor 220 which is located at the crane-cockpit location of crane 200; screen 114C displays a live imaging of the lateral profile of crane 200, provided by an image sensor 220 which is located on a construction or apparatus, e.g., an unmanned air vehicle (UAV) (not shown), which is positioned laterally to crane 200; and screen 114D displays a live imaging of a particular coupling pin 208, provided by an image sensor 220 which is positioned on a component of crane 200 located above the particular coupling pin 208. Visually-identifiable-element 250, which is another embodiment of the above-mentioned tracking device by being recognized by controller 110 within visual data provided by visual sensors 220, is coupled with hoist-and-hook 206, shown on screens 114B and 114C. The settings of the image displayed on screen 114D, including brightness, coloring, contrast, sharpness, etc., are set so as to be substantially optimal for being viewed and considered by operator 150, while the image settings of the other screens 114A-114C are less optimal (herein "second-rate image settings"). This difference in the settings of screen 114D and the rest of screens 114A-114C is intended to focus the attention of operator 150 on the live imaging that is being shown on screen 114D, herein the "optimal-settings screen", while at the same time maintaining the live imaging which is displayed on screens 114A-114C, herein "secondary screens", available for inspection by operator 150 and at an image quality which is good enough for inspection. A "good enough" quality of the second-rate image settings is not a fixed definition, and may vary according to a variety of changing factors, including, amongst other things, the personal preferences of operator 150, and the importance of the live imaging which is shown on the secondary screens, relative to the optimal-settings screen. For example, when operator 150 wants to generally supervise the operation of crane 200, with no particular interest in a specific component thereof, there may be no difference between the image settings of the various screens, i.e., no specific optimal-settings screen, or the optimal-settings screen may have only marginally better image settings compared to the secondary screens. On the other hand, when a particular component of crane 200 requires the attention of operator 150, e.g., when the component is performing a complex operation or when there has been an indication of a defect in that particular component, the image settings of the secondary screens may be substantially altered, usually to be less convenient for viewing, so as to assist operator 150 to focus his attention on the live imaging of the optimal-settings screen in which the particular component of crane 200 is best depicted.
One or more of screens 114A-114D may be displayed at optimal image settings, i.e., serve as optimal-settings screens, for example when the particular component of crane 200 which requires special attention is shown from different angles on a plurality of screens, or when the target object which requires special attention is a series of objects with regard to which a visual assignment is required to be conducted, each of the series of objects respectively shown on one of screens 114A-114D. Alternatively, when the target object is a series of objects, the objects may be successively inspected by image sensors 220, optionally according to a program predefined in controller 110 or manually by operator 150, such that upon initiation of the sequential inspection program, the direction/position of respective image sensors 220 is successively adjusted to provide a view of each one of the target objects in turn, the currently inspected object being presented in an optimal-settings screen. For example, the target object may be safety pins 208, and the visual assignment may be checking that each of pins 208 is safely installed in its respective position.
It is noted that the safety check may be done by means other than a visual survey, e.g., an audio check, utilizing audio sensors positioned in close proximity to safety pins 208, where an intensified rattling sound captured by a specific audio sensor while a respective safety pin 208 is placed under strain is an indication that the pin 208 is not installed tightly enough; a mechanical check using a mechanical sensor, constantly coupled to a respective safety pin 208 or employed specifically for the safety check by UAV 226, the mechanical sensor being coupled to a deployable arm of UAV 226 and UAV 226 sequentially cruising between the series of objects, the mechanical sensor sensing for unusually intense vibrations of the safety pin 208 as an indication of loosening of pin 208; and the like.
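As a rough sketch of the audio variant (the threshold value, data layout, and names are assumptions added for illustration):

```python
RATTLE_THRESHOLD = 3.0   # assumed intensity units; illustrative value only

def loose_pins(rattle_by_pin):
    """Flag safety pins 208 whose rattling sound under strain exceeds the
    threshold, indicating a pin that is not installed tightly enough."""
    return [pin for pin, intensity in rattle_by_pin.items()
            if intensity > RATTLE_THRESHOLD]

print(loose_pins({"pin_1": 1.2, "pin_2": 4.7, "pin_3": 0.8}))  # ['pin_2']
```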
The choice of which of screens 114A-114D is the optimal-settings screen and which is a secondary screen may be ongoingly made by operator 150.
For example, controller 110 may be predefined to receive a selection of one or more of screens 114A-114D by operator 150, and adjust the image settings of the selected screen according to the selection of operator 150. Operator 150 may define, initially and/or ongoingly, what image settings or characteristics are adjusted and in which direction, e.g., elevating or decreasing an image characteristic, or more generally improving or deteriorating the quality of the displayed image according to predefined settings. Additionally, or alternatively, the selection of which of screens 114A-114D is to be displayed as an optimal-settings screen may be performed automatically by controller 110 according to preset definitions. The preset definitions may be changeable to suit the dynamic requirements of operator 150 in each operating session of crane 200. The preset definitions may be directed to/intended for preserving substantially continuous tracking of a selected component of crane 200, preventing an imminent dangerous situation which is detected by controller 110, performing a sequential examination of different components of crane 200 or of the surrounding vicinity, or any other remote crane-controlling requirements. For example, controller 110 may be preset to render one or more of screens 114A-114D an optimal-settings screen (i.e., displaying the image in the screen with substantially optimal characteristics), upon recognizing visually-identifiable-element 250 within the imaging data of the visual sensor(s) 220 which is being presented on those one or more screens 114A-114D. This may allow operator 150 to easily track the selected component of crane 200 which is coupled with visually-identifiable-element 250, which in this example is hoist-and-hook 206, even when hoist-and-hook 206 is in motion and the screen 114A-114D on which it is being displayed is changing, i.e., hoist-and-hook 206 is being observed by changing image sensors 220. (As mentioned earlier, controller 110 may also be predefined to continuously alter the position of an image sensor 220, so as to follow hoist-and-hook 206.)

Furthermore, controller 110 may be additionally preset to identify the quality at which visually-identifiable-element 250 is being shown on each of screens 114A-114D (i.e., being observed/recorded by each of image sensors 220) and to render optimal settings to the one (or more) of screens 114A-114D which displays visually-identifiable-element 250 with the best quality, or at a quality which is above a predefined threshold. The definition of the "quality of display" of visually-identifiable-element 250 may be according to various criteria of the image presented on screens 114A-114D which can be measured by controller 110. For example, the quality of the image may be defined according to the "size" of element 250 within the presented image, the size being assessed in correlation with the number of pixels which element 250 occupies within the live imaging data provided by image sensor 220. The higher quality image may be defined, for example, in direct relation to the increase in size of visually-identifiable-element 250, namely "the bigger the better"; may be defined in relation to a predefined size range, such that the closer to the size range (or the mid-point thereof) the better; or in any other relation to the size of visually-identifiable-element 250 as measured by controller 110.
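The pixel-count criterion could look something like the following sketch, where each screen's sensor reports a binary mask of element 250 (the mask representation and minimum-quality value are assumptions, not from the patent):

```python
def marker_quality(mask):
    """Quality score = number of pixels occupied by element 250 in the
    sensor's live imaging ("the bigger the better")."""
    return sum(row.count(1) for row in mask)

def pick_optimal_screen(masks_by_screen, min_quality=50):
    """Return the screen showing element 250 best, plus all screens whose
    quality is above the predefined threshold."""
    scored = {s: marker_quality(m) for s, m in masks_by_screen.items()}
    best = max(scored, key=scored.get)
    above = [s for s, q in scored.items() if q >= min_quality]
    return best, above

masks = {
    "114A": [[0, 0], [0, 1]],   # element barely visible (1 pixel)
    "114B": [[1, 1], [1, 1]],   # element fills four pixels
    "114C": [[0, 0], [0, 0]],   # element not visible at all
}
print(pick_optimal_screen(masks, min_quality=3))  # ('114B', ['114B'])
```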
The quality of the display of visually-identifiable-element 250 may be defined according to other factors, some non-limiting examples being the completeness of the display of visually-identifiable-element 250, i.e., what percent of element 250 is being displayed (the higher the better); the sharpness, acutance, resolution, or other imaging characteristic of element 250 or of the whole image; and/or according to any combination of the above factors.

Reference is now made to Figure 2B, which is an illustration of display 112 divided into a major screen 116A and a minor screen 116B, each of screens 116A-116B showing a live imaging provided by an individual corresponding one of image sensors 220. Major screen 116A occupies the majority of the area of display 112 and constitutes a principal display, and minor screen 116B occupies a minor portion of the area of display 112, constituting a minor display. The parameters that may determine which live imaging (i.e., imaging recorded by which image sensor 220) is presented on major screen 116A and which live imaging is presented on minor screen 116B are essentially identical to the various possibilities of parameters that have been elaborated with reference to Figure 2A, regarding which of screens 114A-114D is defined as an optimal-settings screen, and which are defined as secondary screens, respectively. For example, the live imaging presented on major screen 116A may be continuously selected by an operator 150. Alternatively, it may be selected according to parameters which are predefined in controller 110, as explained in detail above. Ongoing thumbnail-like samples of the live imaging provided by the plurality of image sensors 220 may be displayed on a margin of display 112, allowing operator 150 to constantly be aware of what is being viewed by each image sensor 220 and to be able to select from the different possibilities what live imaging is displayed on major screen 116A, and what is displayed on minor screen 116B. The number of minor screens shown on display 112 may be adjusted according to the choice of operator 150, as may the relative sizes of the major screen 116A and the one or more minor screens 116B. For example, operator 150 may select a live imaging recorded by a first image sensor 220, which records a selected crane component from a direct front view, to be displayed on major screen 116A, and may select an additional two live imagings, supplied by two additional image sensors 220, which record the selected crane component from two opposite side views, respectively, to be displayed in two minor screens 116B. The choice of the number of minor screens 116B, and the relative size of the minor screens 116B and the major screen 116A, may also be automatically controlled by controller 110 according to predefined settings, similarly to the preset settings according to which controller 110 may render one or more of screens 114A-114D an optimal-settings screen, as described above.
In some cases, operator 150 may have a "selective interest" in a selected crane component, i.e., the crane component may be of interest only when it is in a particular location, or when the crane component comes into close contact with another component or structure in the vicinity of the crane. One option of addressing selective interest in a selected crane component may include fixedly positioning an image sensor 220(F) to continuously stare towards and record the location or apparatus in the vicinity of which the selected crane component is of interest. The recorded live imaging is continuously provided to controller 110, and controller 110 may be preset to present the live imaging provided by image sensor 220(F) on display 112 immediately upon recognizing the selected crane component within the provided live imaging data. This possibility of suddenly presenting a live imaging upon display 112 in response to a particular trigger (when the suddenly presented live imaging is not necessarily continuous with the live imaging which was previously being shown on display 112, nor related to the crane feature which was previously being tracked or presented) is termed herein "image bursting". When display 112 is divided into/includes screens 114A-114D, or major screen 116A and minor screen(s) 116B, the image bursting may be presented in a dominant screen (e.g., major screen 116A or an optimal-settings screen) or in an auxiliary screen (e.g., in minor screen 116B or secondary screens), according to the predefined settings of controller 110. Image bursting may be further utilized in other situations in which a trigger is predefined in controller 110. For example, when the operator is searching for a particular crane component using a plurality of image sensors 220, upon identifying the particular crane component controller 110 immediately displays the live imaging in which the particular crane component was identified; or when a dangerous situation occurs which involves a particular crane component or surrounding feature, controller 110 immediately displays the live imaging in which the particular crane component was identified.
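A minimal sketch of the image-bursting logic, assuming a simple dictionary-based display state and a hypothetical detector callback (neither data structure appears in the patent):

```python
def on_frame(display, screen_id, frame, detect, trigger_log):
    """Burst the live imaging of `screen_id` onto the dominant screen the
    moment the trigger (e.g., the selected crane component, or a hazard)
    is recognised in that sensor's data."""
    if detect(frame):                      # hypothetical detector callback
        display["dominant"] = screen_id    # image bursting
        trigger_log.append(("burst", screen_id))

display = {"dominant": "116A", "auxiliary": ["116B"]}
log = []
# A detector that fires when the staring sensor 220(F) sees the component:
on_frame(display, "220F", {"component_visible": True},
         lambda f: f["component_visible"], log)
print(display["dominant"], log)   # 220F  [('burst', '220F')]
```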
Reference is now made to Figure 3A, which is an illustration of a variety of tracking-sensors 230 mounted on crane 200 and/or in the vicinity thereof, and in communication with controller 110. Tracking-sensors 230 include, in addition to image sensors 220, the following sensors: wind sensor 232, mounted on jib 204 and configured to detect the force and direction of the wind blowing at the height of jib 204 above ground; vibration/motion sensor 234, coupled with a first coupling pin 208 of crane mast 202 and configured to detect vibrations/movements of coupling pin 208; tension sensor 235, coupled with a second coupling pin 208 and configured to measure the stress/tension being applied on coupling pin 208 by the crane elements whose coupling it is maintaining; radar sensor 236, coupled with hoist-and-hook 206 and configured to continuously detect objects or constructions which are in proximity to hoist-and-hook 206 and measure the distance between hoist-and-hook 206 and the detected object; and direction sensor 238 (i.e., a compass-like sensor), coupled with the distal end of jib 204 and configured to measure the direction of jib 204 relative to the absolute north. Each of tracking-sensors 230 is in wireless communication with controller 110 and may continuously provide controller 110 with the data that the tracking-sensor senses and/or computes.

Reference is now also made to Figure 3B, which illustrates coupled mutual detectors 240, mounted on crane 200 and on a neighboring tree 160. A first mutual detector 240 is positioned near the end of hoist-and-hook 206, and a second mutual detector 240 is positioned on tree 160, at a location which lies within a potential trajectory of hoist-and-hook 206. Mutual detectors 240 are configured to constantly emit and/or receive wireless signals 242 at a specific matching frequency and intensity, usually in an omni-directional pattern, such that when the first and second mutual detectors 240 come within a predefined distance of each other, at least one of detectors 240 will receive the coupling signal 242 emitted by its corresponding detector 240, and will thereby identify the close presence of the corresponding detector 240. Upon identifying the presence of the corresponding detector 240, each detector 240 is configured to provide a signal to controller 110. The signal provided to controller 110 may be of an intensity which corresponds to the intensity of the coupling signal 242 which is received by detector 240, this intensity being correlated with the closeness of the two detectors 240, which allows controller 110 to compute the distance between detectors 240 and their motion (i.e., speed, acceleration) relative to each other. Each type of sensors 230, and detectors 240, may include one or more sensors, which may be installed in various locations of crane 200 and the surroundings thereof to collect information about the surrounding settings in which crane 200 and its components are operating. For example, in a certain embodiment the one or more site sensors are mounted to a crane load, either directly or indirectly, such that upon conveyance of the load the sensor detects the change in location. The resulting signal is processed by a linked processor or controller. When the sensor is implemented as a site image sensor, a pre-conveyance image is used as a reference image and compared regularly to a newly captured image by a controller.
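Assuming an inverse-square propagation model (an assumption; the patent specifies only that the received intensity correlates with closeness), the distance computation for detectors 240 could be sketched as:

```python
import math

def detectors_in_range(received_intensity, emitted_intensity,
                       alert_distance=5.0, ref_distance=1.0):
    """Recover the distance between two coupled detectors 240 from the
    received coupling-signal intensity, and flag when they close within
    a predefined distance. Inverse-square propagation is assumed."""
    distance = ref_distance * math.sqrt(emitted_intensity / received_intensity)
    return distance, distance <= alert_distance

# Hook-side detector hears the tree-side detector at 1/16 of emitted power:
d, too_close = detectors_in_range(1.0 / 16.0, 1.0)
print(f"{d:.1f} m apart, alert={too_close}")   # 4.0 m apart, alert=True
```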
When a threshold non-match between the reference image and the recently captured image is detected, a trigger event is identified.
In embodiments implementing the one or more site sensors as a motion sensor or an accelerometer, initiation of load conveyance is also identified as a trigger event.
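A toy sketch of the reference-image comparison described above, with images as small 2-D lists and an illustrative 20% non-match threshold (both representational choices are assumptions):

```python
def non_match_fraction(reference, current):
    """Fraction of pixels whose value differs between the pre-conveyance
    reference image and the newly captured image."""
    total = diff = 0
    for ref_row, cur_row in zip(reference, current):
        for r, c in zip(ref_row, cur_row):
            total += 1
            diff += (r != c)
    return diff / total

def trigger_event(reference, current, threshold=0.2):
    """Identify a trigger event on a threshold non-match."""
    return non_match_fraction(reference, current) >= threshold

ref = [[0, 0, 0], [0, 0, 0]]
cur = [[0, 9, 9], [0, 0, 0]]   # load has started moving through the scene
print(trigger_event(ref, cur))  # True: 2/6 ≈ 0.33 >= 0.2
```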
Controller 110 receives the information (i.e., data) collected by sensors 230 and may be configured to perform a variety of analyses on the collected data so as to continuously monitor the situation of crane 200 and the components thereof.
Independent thresholds may be defined for the data provided by different sensors and for the products of the analyses performed on the sensed data, such that when a threshold is crossed, controller 110 produces an indicative signal and/or a suitable response. The independent thresholds may be safety thresholds, which are predefined in controller 110 for each of sensors 230. The response of controller 110 to the crossing of a threshold may be to burst a live imaging onto display 112 which shows at least one of the sensors, crane components, and/or risk factors which are related to the threshold which was crossed. At least one of image sensors 220 may be assigned for each sensor/crane component, such that when a safety threshold is crossed, controller 110 diverts the assigned image sensor 220 toward the relevant sensor or crane component. When the sensors or crane components are substantially in fixed positions during the operation of crane 200, the assignment of an image sensor 220 to a crane component/sensor may be based on the fixed location of the image sensor 220 relative to the fixed location of the crane component/sensor. When the sensors or crane components are mobile, however, it is required to continuously track the motion of the crane components/sensors, as has been explained with reference to Figure 1A regarding the tracking device, in order to allow diverting an image sensor 220 to depict them in the event of sensory data which crosses a threshold.
For example, an independent safety threshold may be predefined in controller 110 for a wind intensity (i.e., force) measured by wind sensors 232, which intensity is considered to endanger the stability of crane 200 or of one of the crane components. The wind intensity threshold may be preset in correlation to the direction at/from which the wind is blowing, or to the dimension of jib 204 on which the wind is blowing, e.g., the side-lengthwise dimension; the "front"/"backward" dimension, "front" being the dimension facing the distal tip of jib 204 and "backward" being the respective opposite dimension; the bottom dimension, i.e., upward-blowing wind gusts; etc. When controller 110 receives data from a wind sensor 232 indicative of a particular wind intensity which exceeds the predefined intensity threshold, it may be operational to burst a corresponding live imaging onto display 112. The preset corresponding live imaging may include a live imaging which depicts, for example, wind sensor 232, the side of jib 204 which is being buffeted by the wind, the whole of crane 200 from the side from which the wind is blowing, or from the opposing side (i.e., the side towards which crane 200 might topple), and/or the area beneath or in the vicinity of crane 200 which is endangered by a potential collapse of crane 200 or of one of its components.
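A direction-dependent threshold check of this kind could be sketched as follows (all threshold values and camera names are illustrative assumptions):

```python
# Assumed per-direction wind-force thresholds (N); values are illustrative.
WIND_THRESHOLDS = {"side": 800.0, "front": 1200.0, "back": 1200.0,
                   "bottom": 600.0}

def check_wind(sample, burst):
    """Compare a wind-sensor 232 reading against the threshold preset for
    the dimension of jib 204 the wind is striking, and burst the preset
    corresponding live imaging when it is exceeded."""
    force, direction = sample["force_n"], sample["direction"]
    if force > WIND_THRESHOLDS[direction]:
        # e.g. the side of the jib being buffeted, or the endangered area
        burst(f"camera_facing_{direction}")
        return True
    return False

events = []
check_wind({"force_n": 950.0, "direction": "side"}, events.append)
print(events)   # ['camera_facing_side']
```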
Similar safety-related thresholds may be defined in controller 110 for each of the above-mentioned sensors, and the image sensors 220 may be similarly re-positioned to record a relevant crane component/sensor, etc.
Controller 110 may also be preset to autonomously activate imminent-danger-prevention measures, in cases where a potential danger is identified by controller 110 and is regarded to be too imminent for a timely reaction by operator 150. Reference is now made to Figure 3C, which illustrates an example of an automatic intervention response of controller 110 to a suddenly appearing hazard. Crane 200 is erected in proximity to under-construction building 300.
Deployable platform 310 is positioned in an alcove of building 300 with an opening in the direction of crane 200, such that arm 312 of deployable platform 310 can be deployed outward from building 300, along trajectory A2, in the direction of crane 200, for example for receiving a load which is delivered by crane 200.
Construction worker 320 operates deployable platform 310, e.g., by using a lever (322), to deploy and retract arm 312. Jib 204 of crane 200 carries load 216 using hoist-and-hook 206, and is operational, amongst other movements, to raise and lower the distal end of jib 204 along trajectory A1, i.e., the end of jib 204 from which hoist-and-hook 206 is suspended. As previously described, controller 110 controls the movement of jib 204 and is usually operated by an operator (150). In the illustrated scenario, controller 110 is operated by an operator to produce an operator-signal S1 so as to raise the distal end of jib 204 along trajectory A1, in order to deliver load 216 to an elevated position within building 300. However, as jib 204 is rising towards its intended destination, construction worker 320 pulls lever 322, thereby deploying arm 312 of deployable platform 310 along trajectory A2. Arm 312 extends outward such that it is positioned directly within the intended movement pathway of jib 204. Controller 110 recognizes the imminent collision by analyzing the ongoing data which it receives from sensors disposed on crane 200 (not shown), and identifies that if the ongoing operation of raising jib 204 continues at its current velocity, jib 204 will collide with deployable platform 310 within a very short time period, in which time period an operator (150) would not succeed in responding to an alarm sensory stimulation to prevent the collision. Controller 110 therefore produces an automatic-signal S2 which brings jib 204 to an immediate stop, overriding operator-signal S1 under which jib 204 was previously operating.
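A one-dimensional sketch of this override logic (the reaction-time constant and geometry are illustrative assumptions, not figures from the patent):

```python
OPERATOR_REACTION_TIME_S = 1.5   # assumed minimum time an alert would need

def guard(jib_position_m, jib_velocity_mps, obstacle_position_m):
    """Return the overriding automatic-signal S2 (immediate stop) when the
    time to collision is shorter than the operator could react; otherwise
    let the operator-signal S1 stand."""
    gap = obstacle_position_m - jib_position_m
    if jib_velocity_mps <= 0 or gap <= 0:
        return "S1"                       # moving away or already clear
    time_to_collision = gap / jib_velocity_mps
    return "S2_STOP" if time_to_collision < OPERATOR_REACTION_TIME_S else "S1"

# Arm 312 deploys 0.9 m above the rising jib tip, which moves at 1.0 m/s:
print(guard(42.0, 1.0, 42.9))   # S2_STOP: collision in 0.9 s < 1.5 s
```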
Controller 110 may be operational to provide an automatic-signal intervention response so as to prevent any other type of dangerous situation, and the automatic-signal intervention response may entail activating any component of crane 200 or any other component which is controlled by controller 110. Some further examples include: (a) controller 110 operating a slewing mechanism which controls the azimuth of jib 204, and the vertical angle of jib 204 relative to the horizon, so as to adjust the azimuth and/or vertical angle of jib 204 when a wind blows against jib 204 at a direction and intensity which are above a predefined threshold, for preventing falling over of the crane; (b) controller 110 operating a trolley which is operational to travel back-and-forth along jib 204, and from which trolley load-carrying hoist-and-hook 206 is suspended, such that if a truck, animal, or the like suddenly enters into the immediate course of the trolley, controller 110 produces a signal to halt or retract the trolley; and the like.
It is noted that a person who is physically located in the vicinity of at least one crane 200 (herein "man-on-the-spot"), e.g., a construction worker, a manager of a team of construction workers, a building engineer, and the like, may have a means of communication with operator 150 and/or directly with controller 110, to convey information and/or instructions to operator 150 and controller 110.
The communication of information and/or instructions may of course be in both directions, operator 150 or controller 110 being operational to supply the man-on-the-spot with instructions or notifications of potential threats. For example, in imminent danger situations, as described above, the man-on-the-spot may be enabled to activate an alarm indication in the vicinity of operator 150, to burst an alert message or live image onto display 112 of operator 150, or to activate an imminent-danger-prevention response of controller 110. In order to prevent a clash between the instructions which may be provided by the man-on-the-spot and the operating of crane 200 by operator 150, a restricting action sequence may be defined to limit the intervention of the man-on-the-spot in the operation of crane 200 to situations in which such intervention is deemed crucial. For example, prior to being able to directly activate a crane component via controller 110, the man-on-the-spot may have to activate an alarm indication in the vicinity of operator 150, such that the man-on-the-spot can activate an element of crane 200 only if operator 150 does not respond to the alarm indication within a predefined time-length (e.g., 10 seconds), or if operator 150 grants the man-on-the-spot allowance. Any other settings and/or relations between operator 150 and the man-on-the-spot may be defined.
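A minimal sketch of such a restricting action sequence follows, assuming hypothetical operator_console and controller interfaces; only the 10-second example time-length is taken from the text.

    import time

    ALARM_TIMEOUT_S = 10.0  # the example predefined time-length given above

    def man_on_the_spot_request(controller, operator_console, action):
        """Restricting action sequence: alarm the operator first; execute the
        man-on-the-spot's action only on explicit allowance or on timeout."""
        operator_console.raise_alarm("man-on-the-spot intervention requested")
        deadline = time.monotonic() + ALARM_TIMEOUT_S
        while time.monotonic() < deadline:
            response = operator_console.poll_response()
            if response == "acknowledged":
                return "operator_handles"        # operator took over
            if response == "allow":
                controller.execute(action)       # allowance granted
                return "executed_with_allowance"
            time.sleep(0.1)
        controller.execute(action)               # no response in time
        return "executed_after_timeout"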
The responses initiated by controller 110 in reaction to sensed data crossing a predefined threshold (or the result of a subsequent analysis performed on said sensed data) may further include producing a signal or activating an alarm indicator, intended to inform the operator of the dangerous situation which is represented by the threshold being crossed, and/or to draw the attention of the operator to the dangerous situation. Upon recognition of a dangerous situation or defect by controller 110, controller 110 may be configured to activate one or more alarm stimulations, intended to draw the attention of operator 150 to the hazard/defect which has been recognized. The alarm stimulation may be produced, for example, by at least one alarm indicator which may be directed to stimulate any of the five senses: touch, sight, hearing, smell and taste, and may additionally be directed to cause a psychological reaction. Some non-limiting examples of alarm indicators include: varying the settings of the live imaging depicted on display 112; a visual alert display, e.g., a written message or warning symbol appearing on display 112; automatic focusing on an object of particular interest or requiring particular attention; a flickering light/display; a light-control for flickering the ambient lighting surrounding operator 150; a gas-sprayer for spraying strong-smelling gas; an alarm for sounding a message/sound/alert, such as a beep, a siren, a spoken command, or a scream; a tactile vibrator for vibrating a handle/stick/chair in physical contact interface with operator 150; and an electric current in physical contact interface with operator 150.

A general purpose of the alarm stimulations is to produce a sense of alert and urgency in operator 150. Operator 150 may be able, when setting up controller 110, to define an alarm, or sequence of alarm indicators, which he knows to be particularly effective on him. Alternatively, controller 110 may be configured to record the response time of operator 150 to various alarm indications during operation sessions, which may be diverse both in respect to the bodily sense which the alarm stimulates and in respect to the type of stimulation (e.g., which type of sound, which type of visual stimulation, etc.), and to analyze the collected information/data so as to define a most-effective stimulation for operator 150. This data collection and analysis may be performed for each operator independently, or may be performed for all operators collectively.
Controller 110 may be configured to record the response time and the response quality of operator 150 to various stimulations, i.e., how fast and how accurately operator 150 reacts to each sensory stimulation, during operation sessions, and to analyze the collected information/data so as to define a stimulation scale for operator 150. The quality of the response of operator 150 is as important as the speed of the reaction, as certain sensory stimulations may be found to be too alarming, causing operator 150 excess stress and leading to a misguided or inaccurate reaction. Controller 110 may also be configured to randomly change the stimulations, or the sequence of stimulations, so as to prevent operator 150 from getting used to a repeated sequence.
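A minimal sketch of such a per-operator stimulation scale follows, assuming a hypothetical log layout and an arbitrary weighting between response speed and response quality.

    from collections import defaultdict
    from statistics import mean

    def stimulation_scale(response_log, speed_weight=0.5):
        """response_log: iterable of (stimulation, response_time_s, quality),
        quality in [0, 1]. Returns stimulation names sorted best-first."""
        scores = defaultdict(list)
        for stimulation, rt, quality in response_log:
            # Faster is better, but over-alarming stimulations that degrade
            # response quality (excess stress) are penalized via `quality`.
            speed_term = 1.0 / (1.0 + rt)
            scores[stimulation].append(
                speed_weight * speed_term + (1.0 - speed_weight) * quality)
        return sorted(scores, key=lambda s: mean(scores[s]), reverse=True)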
Sequence activation of different alarm indicators may have the general purpose of creating an intensification of the sensory stimulation imposed upon operator 150. The relative extremity of the various sensory stimulations is a first consideration in planning their sequence of activation. The relative extremity of a sensory stimulation may be evaluated, for example, according to the irritation level of the particular chosen stimulation (e.g., a mellow beeping sound vs. a screechy wailing sound), the stimulation's intensity (sound volume, light intensity, strength of vibration, etc.), and which of the senses it stimulates, wherein it is generally accepted that the average sensitivity scale of the senses (from highest to lowest) is hearing, touch, sight, and smell, at least from the point of view of reaction time (RT) (i.e., hearing has the fastest RT). Each specific sensory stimulation may be assigned a "stimulatory score" in controller 110, for example according to the above-outlined parameters. Various alarm sequence programs may be defined in controller 110 based on the stimulatory score of the different sensory stimulations, such as: a linear progression between sensory stimulations in correlation with a linear increase in danger level; an exponential progression between sensory stimulations in correlation with a linear increase in danger level; a progression of stimulations in a step-intervals function in correlation with a linear increase in danger level; a progression of stimulations directed towards the same sense, e.g., various sounds with progressing stimulatory scores; a progression of stimulations directed towards alternating senses; an accumulation of stimulations; and any combination of the above. An alarm indicator may also produce positive indications, such as an eye-catching visual message which appears upon display 112, informing operator 150 what action is required in order to prevent the potential hazard and/or amend the detected defect.
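For illustration, the following sketch assigns assumed stimulatory scores to assumed stimulation names and maps a normalized danger level to a stimulation under the linear, exponential, and step-intervals programs described above.

    STIMULATIONS = [                      # (name, stimulatory score), ascending
        ("soft_beep", 1),
        ("display_contrast_boost", 2),
        ("ambient_light_flicker", 3),
        ("screechy_wail", 4),
        ("chair_vibration", 5),
    ]

    def select_stimulation(danger_level, program="linear"):
        """Map a normalized danger level in [0, 1] to a stimulation name."""
        n = len(STIMULATIONS)
        if program == "linear":
            idx = round(danger_level * (n - 1))
        elif program == "exponential":
            # Rises slowly at low danger, then steeply near the top.
            idx = round((danger_level ** 2) * (n - 1))
        elif program == "step":
            idx = min(int(danger_level * n), n - 1)  # step-intervals function
        else:
            raise ValueError(f"unknown program: {program}")
        return STIMULATIONS[idx][0]

An accumulation program, by contrast, would activate all stimulations up to the selected index rather than only the one at it.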
The tracking ability of operator 150, controlling the crane from a remote cockpit, may be further enhanced by supplying indications of the real-time conditions of the crane to the remote cockpit. Reference is now made to Figure 4, which illustrates real-time-conditions indicators 380 located in the vicinity of operator 150, intended for increasing the sensory experience of operator 150 during the operation of crane 200, in addition to the imaging shown on display 112 and optionally in conjunction with the features (e.g., the target object) presented in the imaging, to imitate the sensory experience of an operator residing within a non-remote crane cockpit and enhance the tracking abilities of operator 150.
Real-time-conditions indicators 380 include wind-emitter/fan 382 and surround-speaker 384. Preferably, fan 382 includes a plurality of fans which are positioned at at least two different sides of operator 150, and surround-speaker 384 includes a plurality of speakers which are positioned at at least two different sides of operator 150. Controller 110 receives data that is sensed or perceived by wind sensors 332 and sound sensors 334 (i.e., microphones) which are installed in the vicinity of crane 200, and regulates the activation, intensity, directionality and/or other characteristics of real-time-conditions indicators 380 such that they imitate the data received from the sensors. The more positions of fans and/or speakers there are, the better the ability of controller 110 to produce a sensory surround effect which accurately imitates the real-time conditions surrounding the crane.
For example, when controller 110 receives audio data of real-time environmental noises (e.g., an audio signal emitted by a truck driving in reverse) from a microphone 334 disposed on the right-hand side of crane 200, where right-hand side is defined relative to a theoretical operator sitting within a crane cockpit at the top of the crane, controller 110 may be operational to activate surround-speaker 384 to produce a corresponding audio signal. When there is a plurality of speakers, controller 110 may activate the speaker or speakers which are positioned on the corresponding right-hand side of operator 150. This creates a sensory experience for operator 150 which is similar to the sensory experience which operator 150 would have if he were actually located at the top of crane 200.
A similar process is carried out with fan 382, where the wind sensors 332 provide wind data to controller 110 (e.g., intensity and directionality of a gust of wind), and controller 110 activates the corresponding fan or fans to produce a sensory experience for operator 150 which corresponds to real-time conditions.
Alternatively or additionally to the activation of fan 382, controller 110 may activate surround-speaker 384 to produce wind-blowing sounds when a substantial wind is blowing on crane 200. Creating a surrounding sensory experience for operator 150 which corresponds to the real-time conditions may increase operator 150's understanding of the situation of crane 200, and substantially enhance operator 150's awareness of potential threats or dangers to crane 200, even without controller 110 activating alarm indicators. Real-time-conditions indicators 380 may also include additional elements such as a vibrating chair, for imitating vibrations of a non-remote cockpit due to strong wind or movements of the crane, and the like.

Optionally, the sensors installed on crane 200 are installed in the location on crane 200 which would be occupied by a non-remote crane cockpit, and are arranged such that they perceive the surrounding conditions as they would be perceived by an operator within the non-remote cockpit. This option is particularly useful for a crane operator who is experienced in operating a crane from a non-remote cockpit, where the data collected by the sensors allows the real-time-conditions indicators 380 to produce a sensory experience corresponding to what the operator is used to from his experience in a non-remote cockpit. Alternatively, the sensors may be distributed and arranged in any position in the vicinity of crane 200 which allows sensing wind, sounds, or any other relevant surrounding data which can enhance the ability of operator 150 to sense the conditions surrounding crane 200, even if the data sensed by the sensors would not be perceived by operator 150 if he were situated within the non-remote cockpit. This allows operator 150 to acquire enhanced sensory indications which are not limited to the detecting abilities of his own senses, as is the case in a non-remote cockpit, and this may increase his understanding of the surroundings of crane 200 and help him better forecast potential hazards or dangers.

There is a risk that, for an operator experienced with non-remote cockpit operation, these additional sensory indications may be misleading, as he is generally accustomed to perceiving only what he can pick up with his own senses. To overcome this issue, a program may be predefined in controller 110 such that controller 110 at first filters out, i.e., does not express via real-time-conditions indicators 380, data which is beyond the scope of what an operator in a non-remote cockpit would pick up, and gradually adds and expresses data which is received from more distant sensors, such that operator 150 can become accustomed to this broadening of sensory input.
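A minimal sketch of such routing follows; the indicator placements, the sensor record layout, and the radius-broadening schedule are assumptions, not taken from the disclosure.

    INDICATOR_AZIMUTHS = {   # degrees around the operator's seat (assumed)
        "fan_front": 0, "fan_right": 90, "fan_rear": 180, "fan_left": 270,
    }

    def nearest_indicator(source_azimuth_deg):
        """Pick the indicator angularly closest to the sensed source."""
        def angular_distance(a, b):
            d = abs(a - b) % 360
            return min(d, 360 - d)
        return min(INDICATOR_AZIMUTHS,
                   key=lambda name: angular_distance(
                       INDICATOR_AZIMUTHS[name], source_azimuth_deg))

    def route_wind_reading(reading, session_minutes, devices):
        """Drive the closest fan; the accepted sensor radius widens over the
        session so the operator grows accustomed to broadened input."""
        accepted_radius_m = 30 + 5 * session_minutes   # assumed schedule
        if reading["distance_m"] > accepted_radius_m:
            return                       # filtered out for now, per program
        device = nearest_indicator(reading["azimuth_deg"])
        devices[device].set_intensity(min(1.0, reading["speed_mps"] / 20.0))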
Reference is now made to Figures 5A to 5C, which are sequential live images of crane 200 shown upon display 112, with changing display settings in response to data received by controller 110 which is indicative of a potential defect or hazard for crane 200. In Figure 5A a visual display of real-time imaging of a scene of interest is shown on display 112, the scene of interest including crane 200, where the contrast between crane 200 and its background environment is at a baseline level. This baseline contrast level may be representative of a normal view of crane 200 as would be seen by an operator sitting within a non-remote crane cockpit, a contrast level which is suitable for easy tracking of the operation and motion of crane 200 within its background environment. When controller 110 identifies a potential defect or hazard, the contrast level of the image on display 112 is progressively augmented, serving as a visual sensory alarm stimulation of the operator, as is shown in Figure 5B. The contrast between crane 200 and its background environment is increased in comparison to the baseline level shown in Figure 5A. Increasing the contrast between crane 200 and the environment is intended to draw the attention of the operator to the potentially dangerous situation of crane 200 in several ways: by emphasizing the motion of crane 200 in relation to its general background; by causing a visual discomfort to the operator due to the unnaturally contrasted image; and/or by capturing the operator's attention due to the sudden change of contrast. The contrast level of the entire image shown on display 112 may be adjusted. Alternatively, only the contrast of crane 200 (and possibly its proximal surroundings) may be increased, and optionally also the contrast of an object which presents a potential hazard for crane 200. The contrast level may be adjusted gradually, i.e., continuously rising as the hazard increases and/or perseveres and continuously decreasing as the hazard potential decreases, or may be adjusted in leaps according to predefined step levels of hazard increase/hazard time extent. Contrast adjustment by leaps may be more effective in capturing the attention of the operator. If controller 110 detects that the operator has not responded to the increase in contrast by taking action to avert the potential danger and/or amend the defect, controller 110 may further increase the contrast of the image on display 112, as shown in Figure 5C.
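For illustration only, the following sketch computes a contrast setting from a normalized hazard level under both the gradual and the leap modes; the gain and step values are assumed.

    BASELINE_CONTRAST = 1.0
    STEP_LEVELS = [1.0, 1.4, 1.8, 2.4]    # assumed predefined leap levels

    def contrast_for_hazard(hazard_level, mode="leaps", no_response_boost=0.0):
        """hazard_level in [0, 1]; no_response_boost grows while the operator
        fails to react, mirroring the Figure 5B -> 5C escalation."""
        if mode == "gradual":
            contrast = BASELINE_CONTRAST + 1.4 * hazard_level
        else:  # "leaps" may be more effective at capturing attention
            idx = min(int(hazard_level * len(STEP_LEVELS)), len(STEP_LEVELS) - 1)
            contrast = STEP_LEVELS[idx]
        return contrast + no_response_boost

A supervisory loop would call this each cycle and pass the result to the display driver, lowering the value again as the hazard abates.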
Controller 110 may apply other changes to the display settings of the image on display 112, as a visual alarm stimulation of the operator and/or to enhance the focus on crane 200, the display settings including, for example: brightness, sharpness, shading, coloring strength, coloring heat (where hot coloring is substantially yellow, orange and red, and cold coloring is substantially green, blue, indigo and violet), and the like. Each of these display characteristics may be augmented or intensified as a hazard or defect increases/perseveres, and may be reduced or moderated as the risk of hazard is alleviated or the defect remedied.
Further manipulations and adjustments may be applied to the image which is displayed on display 112, to enhance the tracking and controlling abilities of operator 150. Reference is now made to Figure 6A, which illustrates an image of crane 200 presented on display 112, with added augmented reality features 410. Augmented reality features 410 include arrow 412, destination target 414, and parallel border line 416. Augmented reality features 410 appear upon display 112 in addition to the live imaging provided by an image sensor 220, such that the image in which operator 150 tracks crane 200 includes a combination of real physical features from the surroundings of crane 200, and virtual features which are not actually in the surroundings of crane 200. Features 410 may appear on display 112 according to definitions which are preset in controller 110, or may be created or initiated by operator 150. For example, controller 110 may be preset to recognize jib 204 of crane 200 in a live imaging, and to create a parallel border line 416 at a predetermined distance above and below the length of jib 204. Border line 416 may serve as a perimeter, such that operator 150 attempts to ensure that line 416 does not overlap with surrounding buildings or other physical features which appear in display 112, so as to prevent a collision of jib 204 with the surrounding buildings, etc. Arrow 412 and destination target 414 may be marked on display 112 by operator 150 according to each specific crane operation, arrow 412 representing the trajectory which jib 204 is intended to follow so as to reach destination target 414. Any other elements which improve the control and tracking of crane 200 may also be added.
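As an illustrative sketch of border line 416, the following uses OpenCV as an assumed drawing backend to offset two lines parallel to a jib segment already detected in the live image; the detection step itself is outside this sketch.

    import cv2
    import numpy as np

    def draw_jib_border_lines(frame, jib_p1, jib_p2, offset_px=40):
        """Overlay two lines parallel to the detected jib segment, offset
        perpendicular to it, as a collision-warning perimeter."""
        p1 = np.asarray(jib_p1, dtype=float)
        p2 = np.asarray(jib_p2, dtype=float)
        direction = p2 - p1
        length = np.linalg.norm(direction)
        if length == 0:
            return frame                      # degenerate detection; skip
        normal = np.array([-direction[1], direction[0]]) / length
        for sign in (+1, -1):
            a = tuple(map(int, p1 + sign * offset_px * normal))
            b = tuple(map(int, p2 + sign * offset_px * normal))
            cv2.line(frame, a, b, color=(0, 0, 255), thickness=2)
        return frame

An overlap test between these offset lines and detected building contours could then drive the collision warnings described above.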
Reference is now also made to Figure 6B, which illustrates crane 200 presented on display 112 within virtual reality settings 440. Virtual reality settings 440 include building-like pillars 442 and protruding trees 444. Virtual reality settings 440 replace the live imaging which is provided by an image sensor (220) with virtual images, but the virtual images may be based on the live imaging and may correspond with features and/or sections of the live imaging. Building-like pillars 442 and protruding trees 444 are elements which are created virtually and correspond to real buildings and trees, respectively, which are recorded by an image sensor 220 in the vicinity of crane 200. Pillars 442 and trees 444 may be manually added to display 112 by operator 150, or may be added automatically by controller 110 according to preset definitions of analyzing the live imaging data provided by an image sensor (220). An advantage of replacing the real physical live imaging with the virtual pillars 442 and trees 444 within virtual reality settings 440 is that this allows presenting only the features of the surroundings of crane 200 which are relevant for operator 150, i.e., of which operator 150 needs to keep track or has to watch out for. This also allows adjusting the manner in which these features are represented, i.e., the characteristics of these features, to be most convenient for operator 150, for example, by selecting the color, width, shape and other features of pillars 442 and trees 444 so as to be as noticeable as required in the eyes of operator 150. The settings of pillars 442 and trees 444 may be set by operator 150, or may be predefined in controller 110 according to settings which are predetermined to be most effective in representing these features when appearing in the crane vicinity, e.g., buildings and trees. Any other virtual reality features which assist operator 150 in the tracking and control of crane 200 may be added to virtual reality settings 440. In addition, any combination of augmented reality and virtual reality, or alternation between the two, may be applied, according to the selection of operator 150 and/or according to a predefined program of controller 110.
Reference is now made to Figure 7, which is a block diagram of a method for remote tracking of a crane, operative in accordance with an embodiment of the present invention.
In procedure 502, an image of a target object of particular interest is retrieved by at least one image sensor. The image sensor, linked to tracking software and hardware, continuously tracks and visually records the target object of interest, capturing live imaging thereof. The target object may include a selected crane component, such as the crane hook, a load carried by the crane, a counterweight, safety pins, and the like, or may include elements in the crane surroundings, such as a load landing spot, a display of a scenery of interest, a view of the crane surroundings as captured by an image sensor disposed at a cockpit-location on the at least one crane, etc. The target object may further include a safety weight sensor, which is operational for measuring the weight and/or distance of a crane load, and for indicating a safety threshold check regarding a maximal load and/or maximal spatial positioning of a load. The tracking may include automatically centering and focusing the image sensor on the target object, either continuously, intermittently according to a predefined trigger event like a weight reading exceeding a threshold or a time of day, or when triggered to do so by the operator. When the image sensor is not able to maintain the target object in the center or in any other preferred position relative to the image frame, the image sensor may be predefined to maintain the target object anywhere within the image frame. A hopping mobile apparatus, such as a UAV, may be deployed to visually record the target object when it is in a position which is not sufficiently covered by the image sensors which are in fixed positions, the fixed positions of image sensors including, by way of example, a crane cockpit, a hoist-and-hook of a crane, a structure in the vicinity of the crane, and the like. A tracking device allows continuous tracking of the target object, not necessarily in correlation with the live imaging recorded by the image sensors. The tracking device may be operational, for example, to allow automatic recognition of the target object of interest within data provided by the image sensors (i.e., live imaging), or to determine and provide to a controller the absolute physical location of the target object. With reference to Figures 1A and 1B, a live imaging of a target object of crane 200 or of a component thereof is captured by at least one image sensor 220. Image sensor 220 may be positioned on crane mast 202, at an end of jib 204, on hoist-and-hook 206, or any other component of crane 200 or a structure in the vicinity thereof. UAV 226, including image sensor 221, may also be utilized. Tracking device 222 is mounted on the target object to allow tracking by image sensor 220, the target object including, for example, counterweight 210, hoist-and-hook 206, or the distal end of jib 204.
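A minimal sketch of such a centering step follows, assuming a hypothetical pan-tilt (ptz) interface and assumed deadband/gain values.

    DEADBAND_PX = 10          # tolerance before re-positioning (assumed)
    GAIN_DEG_PER_PX = 0.05    # pan/tilt gain in degrees per pixel (assumed)

    def track_step(ptz, target_px, frame_size, preferred_px=None):
        """One tracking step: nudge the pan-tilt sensor so the target stays
        at the preferred position; report whether it is still in frame."""
        w, h = frame_size
        px, py = preferred_px if preferred_px else (w / 2.0, h / 2.0)
        err_x = target_px[0] - px
        err_y = target_px[1] - py
        if abs(err_x) > DEADBAND_PX:
            ptz.pan(GAIN_DEG_PER_PX * err_x)    # positive error -> pan right
        if abs(err_y) > DEADBAND_PX:
            ptz.tilt(-GAIN_DEG_PER_PX * err_y)  # image y grows downward
        # Fallback per the text: if the preferred position cannot be held,
        # it suffices to keep the target anywhere within the image frame.
        return 0 <= target_px[0] < w and 0 <= target_px[1] < h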
In procedure 504, the image of the target object is provided to the controller, which receives the image data provided by the at least one image sensor and presents the live imaging on the display. The controller may receive image data from a plurality of image sensors and may examine the image data according to predefined analyses to determine which image data to present on the display. The controller may receive data from a variety of sensors, and compute an ongoing situation or condition of the at least one crane or of any one of the crane's components. The controller may be predefined, according to the ongoing analyses/computations the controller performs, to adjust the live imaging which is being presented on the display, to supply a variety of indications to the operator, and/or to autonomously control/activate/deactivate different crane components or systems. With reference to Figures 1A, 1B and 3A-3C, controller 110 receives the imaging data provided by image sensors 220, and presents a corresponding image on display 112. Controller 110 may receive data from a plurality of image sensors 220 and examine the provided data according to predefined analyses, for example in order to identify a selected target object within the imaging data. Controller 110 may also receive indications from a variety of tracking sensors 230, including for example wind sensor 232, vibration/motion sensor 234, tension sensor 235, and radar sensor 236. Controller 110 may further receive signals from mutual detectors 240, which may be mounted on a component of crane 200 and a corresponding surrounding feature. Controller 110 may be operational to analyze the situation of crane 200 according to the signals received from image sensors 220, tracking sensors 230 and mutual detectors 240, according to which it presents a selected imaging on display 112, the imaging retrieved from at least one of image sensors 220. Controller 110 may further supply real-time conditions indications to operator 150 via real-time indicators 380. Controller 110 may automatically control/activate/deactivate different components or systems of crane 200 when needed, e.g., when an imminent danger is identified.
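For illustration, the following sketch fuses a few threshold checks of the kind described above into a three-way decision; the threshold values and the readings layout are assumptions.

    THRESHOLDS = {
        "wind_speed_mps": 20.0,      # cf. wind sensor 232 (assumed value)
        "vibration_g": 1.5,          # cf. vibration/motion sensor 234
        "cable_tension_kn": 90.0,    # cf. tension sensor 235
    }

    def evaluate(readings, time_to_impact_s=None, reaction_time_s=1.5):
        """Return one of: 'ok', 'alert_operator', 'autonomous_intervention'."""
        breaches = [k for k, limit in THRESHOLDS.items()
                    if readings.get(k, 0.0) > limit]
        if time_to_impact_s is not None and time_to_impact_s < reaction_time_s:
            return "autonomous_intervention"   # too imminent for the operator
        if breaches:
            return "alert_operator"            # activate alarm indicators
        return "ok"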
In procedure 506, the retrieved image/live imaging is displayed on the display to the operator located in a remote crane cockpit. When the target object of interest is maintained in a preferred position relative to the image, by virtue of adjusting the position of the image sensor, the selected crane component appears to be floating at the preferred position within a dynamic scenery. The preferred position of the selected object of interest may be selected by the operator or predefined in the controller, and may be defined relative to the image frame, i.e., relative to the viewing of the image sensor, or defined relative to the display. The preferred position may be selected so as to enhance the tracking ability of the operator, and/or may be intended to maintain the operator in a healthy body posture. AR/VR features may be combined with the displayed image, for providing data, and for pointing at and marking items and locations of interest, e.g., a load landing spot, which may substantially enhance the tracking and controlling abilities of the operator. With reference to Figures 1A, 5A-5C, 6A and 6B, controller 110 displays a live image on display 112, which is based on the visual data supplied by image sensors 220. Operator 150 may select a preferred position on display 112, and controller 110 may correspondingly adjust the position of image sensors 220 and/or the settings of the image presented on display 112 in order to maintain the selected target object in the preferred position during the crane operation session.
The preferred position may optionally be predefined in controller 110. AR features 410 and/or VR features 440 may be combined with the image on display 112 for enhancing the tracking and controlling abilities of operator 150.
In procedure 508, which is a sub-procedure of procedure 504, an image captured by one of the image sensors is burst onto the display when the controller identifies the object of interest in the captured image. The controller may be predefined to burst the image onto the display whenever it identifies the target object in a captured image, or an improved view of the object relative to the view that is being currently presented on the display. Additionally or alternatively, the controller may receive instructions from the operator during the crane operating session, which define under what conditions to burst an image onto the display, and with regard to identification of which crane component. The controller may also burst an image onto the display when it identifies an imminent danger or a substantial defect which is related to the bursting image. With reference to Figures 1A, 2A, 2B and 5A-5C, controller 110 may burst an image captured by at least one of image sensors 220 onto display 112, when an object of interest, or an imminent danger correlated to the bursting image, is identified by controller 110.
The bursting image may take up the whole display area of display 112, or may take up a portion of the screen, e.g., one of screens 114A-114D, 116A-116B.

In procedure 510, which is a sub-procedure of procedure 506, the controller provides real-time conditions indications for the operator, which are based on data collected from the vicinity of the crane, optionally from the crane-cockpit location. The real-time conditions indications may include, for example, sounding crane real-time environmental noises to the operator, as heard and perceived by a microphone at a cockpit-location of the at least one crane, using a speaker located in the remote crane cockpit. With reference to Figure 4, real-time indicators 380, including fan 382 and speakers 384, convey real-time indications provided to controller 110 by sensors installed in the vicinity of crane 200, such as wind sensor 332 and sound-sensors (microphones) 334, to operator 150 residing in the remote crane cockpit, to produce a comprehensive sensory experience which imitates the sensory experience of an operator in a non-remote crane cockpit.
In procedure 512, which is a sub-procedure of procedure 506, an image perceived by a particular image sensor is presented on the display as a principal display, capturing the principal area of the display, and an image perceived by at least one other image sensor is displayed as a secondary display, capturing a secondary area of the display. The controller may play the sound of the crane surroundings for the operator, which sound is captured by a microphone located in the vicinity of the particular image sensor of the principal display, or in the vicinity of the crane component which the particular image sensor is intended to visually record. The particular image sensor may be selected to be presented on the principal display when the target object is identified as best seen by, or closest to, the particular sensor. The principal display may be differentiated from the at least one secondary display in the aspect of size, i.e., the principal display taking up a larger portion of the display than each of the secondary displays. Additionally or alternatively, the principal display may be differentiated from a secondary display in the aspect of image settings, such that the image shown on the principal display is accentuated relative to the image displayed on the secondary display(s), which may assist the operator in focusing on the principal display. Alternatively, the principal display may overlay secondary displays, or the principal display may flash. With reference to Figures 2A, 2B and 4, major screen 116A constitutes a principal display, capturing the principal area of display 112, and presents an image perceived by a particular image sensor 220, and secondary screen 116B constitutes a secondary display, capturing a secondary area of display 112, and presents an image perceived by at least one other image sensor 220. At least one of screens 114A-114D is rendered an optimal-settings screen by controller 110, either according to predefined settings or upon selection by operator 150. Controller 110 may play the sound of the surroundings of crane 200 to operator 150, which sound is captured by microphone 334 which is correlated with image sensor 220 of the principal display (116A).
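A minimal sketch of this selection and layout follows; the feed records, the display/microphone interfaces, and the brightness values are assumptions.

    def choose_principal(feeds, target_id):
        """feeds: list of dicts with keys 'sensor', 'sees_target' and
        'distance_m' (assumed layout). Closest view of the target wins."""
        candidates = [f for f in feeds if f["sees_target"] == target_id]
        if not candidates:
            return feeds[0]              # fall back to any available feed
        return min(candidates, key=lambda f: f["distance_m"])

    def layout_displays(display, feeds, target_id, microphones):
        principal = choose_principal(feeds, target_id)
        display.assign(area="principal", feed=principal["sensor"],
                       brightness=1.0)   # accentuated image settings
        for feed in feeds:
            if feed is not principal:
                display.assign(area="secondary", feed=feed["sensor"],
                               brightness=0.6)   # toned down in comparison
        # Pair the principal feed with the microphone installed near it.
        display.play_audio(microphones[principal["sensor"]])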
While certain embodiments of the disclosed subject matter have been described, so as to enable one of skill in the art to practice the present invention, the preceding description is intended to be exemplary only. It should not be used to limit the scope of the disclosed subject matter, which should be determined by reference to the following claims.

Claims (30)

CLAIMS
1. A remote, camera tracking system for single display viewing of task execution within a crane work site, the system comprising: a plurality of panning, motion-tracking cameras having a collective sensing capacity spanning a work site area, each of the cameras configured to capture a motion picture segment of a portion of a task during execution, each capture of the motion picture segment responsive to a trigger event; a controller configured to merge the motion picture segments into a single, complete motion picture of the task; and one remote display operative to display the complete motion picture of the task in real time so as to facilitate operator crane control.
2. The system of claim 1, wherein the plurality of cameras includes a camera activated by a trigger event captured by a different camera of the plurality of cameras.
3. The system of claim 1, wherein each of the cameras is activated by a trigger event associated with a respective camera of the plurality of cameras.
4. The system of any one of claims 1-3, wherein at least one of the panning, motion-tracking cameras is further configured to capture the motion picture segment as a close-up view.
5. The system of any one of claims 1-4, wherein the motion picture segment is a motion picture segment of loading, load conveying, or unloading.
6. The system of any one of claims 1-4, wherein the trigger event is a time.
7. The system of any one of claims 1-4, wherein the trigger event is an imminent jib collision.
8. The system of any one of claims 1-4, wherein the trigger event is a threshold load volume.
9. The system of any one of claims 1-4, wherein the trigger event is abutment of a load and crane hook.
10. The system of any one of claims 1-4, wherein the trigger event is implemented as conveyance of a load.
11. The system of any one of claims 1-4, wherein the trigger event is implemented as a captured task execution bordering a field of vision of a camera of the plurality of cameras.
12. The system of any one of claims 1-4, wherein the trigger event is implemented as a captured image having a resolution below a threshold resolution.
13. The system of any one of claims 1-4, wherein the trigger event is implemented as a captured image having a resolution below a threshold image resolution.
14. The system of any one of claims 1-4, wherein the plurality of the panning, motion-tracking cameras includes at least one camera configured to pan one or more deployed crane pins.
15. The system of claim 14, wherein the at least one camera is deployed in an Unmanned Aerial Vehicle (UAV).
16. A method for remote tracking and real-time display of task execution within a crane work site, the method comprising: capturing motion picture segments of a task under execution in a crane work site, the capturing motion picture segments implemented through a plurality of panning, motion-tracking cameras having a collective sensing capacity spanning the work site area, the capturing responsive to one or more trigger events; merging the motion picture segments into a single complete motion picture of the task; and remotely displaying the complete motion picture of the task on a single display in real time so as to facilitate operator crane control.
17. The method of claim 16, wherein each of the cameras is activated by a trigger event associated with a respective camera of the plurality of cameras.
18. The method of claim 16, wherein the plurality of cameras includes a camera activated by a trigger event captured by a different camera of the plurality of cameras.
19. The method of any one of claims 16-18, wherein the capturing motion picture segments includes focusing in on the task under execution.
20. The method of any one of claims 16-19, wherein the motion picture segments are selected from the group consisting of loading, load conveying, and unloading.
21. The method of any one of claims 16-19, wherein the trigger event is a time.
22. The method of any one of claims 16-19, wherein the trigger event is a threshold load volume.
23. The method of any one of claims 16-19, wherein the trigger event is an imminent jib collision.
24. The method of any one of claims 16-19, wherein the trigger event is abutment of a load and crane hook.
25. The method of any one of claims 16-19, wherein the trigger event is implemented as a captured task execution segment bordering a field of vision of a camera of the plurality of cameras.
26. The method of any one of claims 16-19, wherein the trigger event is implemented as an abutment of a load and crane hook.
27. The method of any one of claims 16-19, wherein the trigger event is implemented as a captured image having a resolution below a threshold resolution.
28. The method of any one of claims 16-19, wherein the trigger event is implemented as a captured image having a resolution below a threshold resolution.
29. The method of any one of claims 16-18, wherein the plurality of the panning, motion-tracking cameras includes at least one camera configured to pan one or more deployed crane pins.
30. The method of claim 29, wherein the at least one camera is deployed in a UAV.
IL292503A 2022-04-25 2022-04-25 Remote crane tracking IL292503B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
IL292503A IL292503B2 (en) 2022-04-25 2022-04-25 Remote crane tracking
PCT/IL2023/050415 WO2023209705A1 (en) 2022-04-25 2023-04-24 Remote crane tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
IL292503A IL292503B2 (en) 2022-04-25 2022-04-25 Remote crane tracking

Publications (3)

Publication Number Publication Date
IL292503A IL292503A (en) 2023-11-01
IL292503B1 true IL292503B1 (en) 2024-03-01
IL292503B2 IL292503B2 (en) 2024-07-01

Family

ID=88586101

Family Applications (1)

Application Number Title Priority Date Filing Date
IL292503A IL292503B2 (en) 2022-04-25 2022-04-25 Remote crane tracking

Country Status (1)

Country Link
IL (1) IL292503B2 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130299440A1 (en) * 2012-05-10 2013-11-14 Dale Hermann Crane collision avoidance
CN106516984A (en) * 2016-12-29 2017-03-22 深圳大学 Unmanned tower crane control system based on wireless communication network and implementing method
US20170369288A1 (en) * 2016-06-22 2017-12-28 The Boeing Company Systems and methods for object guidance and collision avoidance
US20180229978A1 (en) * 2013-04-11 2018-08-16 Liebherr-Components Biberach Gmbh Remote-controlled crane
US20190337771A1 (en) * 2018-05-04 2019-11-07 Rowan Companies, Inc. System and Method for Remote Crane Operations on Offshore Unit
US20200048052A1 (en) * 2017-04-03 2020-02-13 Cargotec Patenter Ab Driver assistance system and a method
WO2021229576A2 (en) * 2020-05-14 2021-11-18 Ultrawis Ltd. Systems and methods for remote control and automation of a tower crane

Also Published As

Publication number Publication date
IL292503B2 (en) 2024-07-01
IL292503A (en) 2023-11-01

Similar Documents

Publication Publication Date Title
US20220220697A1 (en) Object detection system and method
KR101723283B1 (en) Worker Behavior Based Safety Management System and Method
US11501619B2 (en) Worksite classification system and method
DE102008001391B4 (en) Fire detection device and method for fire detection
US10469790B2 (en) Control system and method for an aerially moved payload system
JPWO2008029802A1 (en) Driving information providing device
JP2019016836A (en) Monitoring system, information processing unit, information processing method, and program
KR101519974B1 (en) Cctv for sensing neighborhood information and intelligent alert system using the same
CN111686392A (en) Artificial intelligence fire extinguishing system is surveyed to full scene of vision condition
CN114604761A (en) Intelligent tower crane-assisted operation and control safety warning system and method
CN108319892A (en) A kind of vehicle safety method for early warning and system based on genetic algorithm
CN111724557A (en) Electric power operation border crossing early warning system and method
GB2368482A (en) Pose-dependent viewing system
CN112010187B (en) Monitoring method and device based on tower crane
CN114604773A (en) Safety warning auxiliary system and method for intelligent tower crane
CN114604763A (en) Electromagnetic positioning device and method for intelligent tower crane hook guide
EP3923569B1 (en) Remote operation system and remote operation server
CN114604768A (en) Intelligent tower crane maintenance management method and system based on fault identification model
IL292503B2 (en) Remote crane tracking
CN114572845A (en) Intelligent auxiliary robot for detecting working condition of intelligent tower crane and control method thereof
WO2023209705A1 (en) Remote crane tracking
KR20200071560A (en) Farm Experience Safety Management System
CN117115999A (en) Mobile safety inspection method and terminal equipment
CN114468453B (en) Building site safety helmet detecting system based on 5G technique
JP2019218198A (en) Operation support system