WO2023209705A1 - Remote Crane Tracking - Google Patents

Remote Crane Tracking

Info

Publication number
WO2023209705A1
Authority
WO
WIPO (PCT)
Prior art keywords: crane, trigger event, display, operator, image
Application number
PCT/IL2023/050415
Other languages
English (en)
Inventor
Aviv Carmel
Original Assignee
Crane Cockpit Technologies Ltd
Priority claimed from IL292503A
Application filed by Crane Cockpit Technologies Ltd filed Critical Crane Cockpit Technologies Ltd
Publication of WO2023209705A1

Classifications

    • G06T 17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • B66C 13/46: Position indicators for suspended loads or for crane elements
    • B66C 13/54: Operator's stands or cabins
    • B66C 15/045: Safety gear for preventing collisions, e.g. between cranes or trolleys operating on the same track (electrical)
    • B66C 15/065: Arrangements or use of warning devices (electrical)
    • B66C 23/94: Safety gear for limiting slewing movements of jib cranes, derricks, or tower cranes
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • H04N 7/181: Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources

Definitions

  • the present invention relates generally to the field of remote tracking of conditions and operation of a machine or system, and more specifically to remote tracking of conditions and operation of a crane.
  • the field of remotely tracking the conditions, settings, and operation of systems and devices is constantly progressing, by virtue of the continual development of improved sensing and recording abilities, improved computational abilities, and improved wireless technologies.
  • This allows a growing variety of fields to gradually reduce the number of "men-on-the-spot" required to run and supervise the working of a device or system, and to replace them with relevant sensors and computational programs.
  • In fields which involve complex operations requiring a large number of workers on the spot in highly dynamic settings, particularly when the operations carry a relatively high level of risk, the transition to remotely controlled systems is more complicated and slower to be applied.
  • There is provided a remote crane tracking apparatus for remotely controlling and monitoring at least one crane, including a display for displaying to an operator, at a remote cockpit location, a target object of particular interest.
  • the remote crane tracking apparatus further includes at least one image sensor, and a tracking device operational for tracking the target object and providing an image of the target object to the display.
  • the target object may include at least one of: the crane hook; a load to be carried by the at least one crane; a load landing spot; a safety weight of the crane; a display of a scenery of interest; and a view of the crane surroundings as captured by an image sensor disposed at a cockpit-location on the at least one crane.
  • the target object may be maintained in a preferred position on the display, the preferred position being selected by the operator and/or predefined in a controller, wherein the controller is operational to control the tracking device and/or the display.
  • the preferred position may be intended to maintain the operator in a healthy body posture.
  • an image may burst onto the display when the target object is identified in data provided by a particular image sensor of the at least one image sensor.
  • the crane tracking apparatus may be part of a system, which system includes a speaker for sounding real-time environmental noises of the crane as heard and perceived by a microphone at a cockpit-location of the at least one crane.
  • the at least one image sensor may include a multiplicity of sensors; the image perceived by a particular image sensor may be displayed as a principal display occupying the principal area of the display, and the image perceived by at least one other sensor of the multiplicity of sensors may be displayed as a secondary display occupying a secondary area of the display. Sound captured by a microphone of the particular image sensor may be played for the operator.
  • the particular image sensor may be selected when the target object is identified as best seen/closest to the particular sensor.
  • the crane tracking apparatus may further include Augmented Reality (AR) or Virtual Reality (VR) features for combining with the displayed image, providing data, and pointing and marking items and locations of interest.
  • the locations of interest include a load landing spot.
  • the crane tracking apparatus may further include a hopping mobile apparatus configured to be mobilized and deployed for chirurgic (surgical-precision) command and control tasks.
  • the target object may include a series of objects with regard to which an assignment is required to be conducted.
  • the series of objects may include the crane safety pins and the assignment may include safety check of the status of each of the safety pins.
  • the target object may include a safety weight sensor, operational for indicating safety threshold check regarding maximal load and/or maximal spatial positioning of a load.
  • the at least one image sensor may be mounted on at least one of the following: crane cockpit; signaler helmet; UAV; and structures in the vicinity of an area of interest.
  • There is further provided a method for remotely controlling and monitoring at least one crane, including the procedure of retrieving an image of a target object of particular interest, including tracking the target object by at least one image sensor.
  • the method further includes providing the image of the target object to a display, and displaying the image on the display to an operator at the remote cockpit location.
  • the target object may include at least one of: the crane hook; a load to be carried by the at least one crane; a load landing spot; a safety weight of the crane; a display of a scenery of interest; and a view of the crane surroundings as captured by an image sensor disposed at a cockpit-location on the at least one crane.
  • the method may further include the procedure of maintaining the target object in a preferred position on the display, the preferred position being selected by the operator and/or predefined in a controller, wherein the controller is operational to control the tracking device and/or the display.
  • the preferred position may be intended to maintain the operator in a healthy body posture.
  • the method may further include the procedure of bursting the image onto the display when the target object is identified in data provided by a particular image sensor of the at least one image sensor.
  • the method may further include the procedure of sounding crane real time environmental noises as heard and perceived by a microphone at a cockpit-location of the at least one crane, using a speaker.
  • the method may further include displaying the image perceived by a particular image sensor as a principal display occupying the principal (majority) area of the display, and displaying the image perceived by at least one other image sensor as a secondary display occupying a secondary area of the display.
  • the method may further include the procedure of playing the sound captured by a microphone of the particular image sensor for the operator.
  • the method may further include the procedure of selecting the particular image sensor when the target object is identified as best seen/closest to the particular sensor.
  • the method may further include the procedure of combining AR/VR features with the displayed image, for providing data, and pointing and marking items and locations of interest.
  • the locations of interest may include a load landing spot.
  • the method may further include the procedure of mobilizing and deploying a hopping mobile apparatus for chirurgic command and control tasks.
  • the target object may include a series of objects with regard to which an assignment is required to be conducted.
  • the series of objects may include the crane safety pins and the assignment may include a safety check of the status of each of the safety pins.
  • the target object may include a safety weight sensor operational for indicating safety threshold check regarding maximal load and/or maximal spatial positioning of a load.
  • the method may further include the procedure of mounting the at least one image sensor on at least one of the following: crane cockpit; signaler helmet; UAV; and structures in the vicinity of an area of interest.
  • a remote, camera tracking system for single display viewing of task execution within a crane work site, the system including: a plurality of panning, motion-tracking cameras having a collective sensing capacity spanning a work site area, each of the cameras configured to capture a motion picture segment of a portion of a task during execution, each capture of the motion picture segment responsive to a trigger event; a controller configured to merge the motion picture segments into a single, complete motion picture of the task; and one remote display operative to display the complete motion picture of the task in real time so as to facilitate operator crane control.
  • the plurality of cameras includes a camera activated by a trigger event captured by a different camera of the plurality of cameras.
  • each of the cameras is activated by a trigger event associated with a respective camera of the plurality of cameras.
  • At least one of the panning, motion-tracking cameras is further configured to capture the motion picture segment as a close-up view.
  • the motion picture segment is a motion picture segment of loading, load conveying, or unloading.
  • the trigger event is a time.
  • the trigger event is an imminent jib collision.
  • the trigger event is a threshold load volume.
  • the trigger event is abutment of a load and crane hook.
  • the trigger event is implemented as conveyance of a load.
  • the trigger event is implemented as a captured task execution bordering a field of vision of a camera of the plurality of cameras.
  • the trigger event is implemented as a captured image having a resolution below a threshold resolution.
  • the plurality of the panning, motion-tracking cameras includes at least one camera configured to pan one or more deployed crane pins.
  • the at least one camera configured to pan the deployed crane pins is deployed in an Unmanned Aerial Vehicle (UAV).
  • a method for remote tracking and real-time display of task execution within a crane work site including: capturing motion picture segments of a task under execution in a crane work site, the capturing motion picture segments implemented through a plurality of panning, motion-tracking cameras having a collective sensing capacity spanning the work site area, the capturing responsive to one or more trigger events; merging the motion picture segments into a single complete motion picture of the task; and remotely displaying the complete motion picture of the task on a single display in real time so as to facilitate operator crane control.
  • each of the cameras is activated by a trigger event associated with a respective camera of the plurality of cameras.
  • the plurality of cameras includes a camera activated by a trigger event captured by a different camera of the plurality of cameras.
  • the capturing motion picture segments includes focusing in on the task under execution.
  • the motion picture segments are selected from the group consisting of loading, load conveying, and unloading.
  • the trigger event is a time.
  • the trigger event is a threshold load volume.
  • the trigger event is an imminent jib collision.
  • the trigger event is abutment of a load and crane hook.
  • the trigger event is implemented as a captured task execution segment bordering a field of vision of a camera of the plurality of cameras.
  • the trigger event is implemented as a captured image having a resolution below a threshold resolution.
  • the plurality of the panning, motion-tracking cameras includes at least one camera configured to pan one or more deployed crane pins.
  • the at least one camera is deployed in a UAV.
  • Figure 1A is a schematic illustration of a crane tracking apparatus, including a crane and a remote display, constructed and operative according to the present invention;
  • Figure 1B is a schematic illustration of the crane of the crane tracking apparatus of Figure 1A, including a UAV controlled and maneuvered by an operator;
  • Figure 2A is a schematic illustration of the display of Figure 1A, divided into a plurality of screens of which at least one is an optimal-settings screen;
  • Figure 2B is a schematic illustration of the display of Figure 1A, divided into a major screen and a minor screen;
  • Figure 3A is a schematic illustration of a variety of tracking-sensors mounted on a crane and/or in the vicinity thereof, and in communication with a controller, constructed and operative according to another embodiment of the present invention;
  • Figure 3B is an illustration of coupled mutual detectors mounted on the crane of Figure 3A and on a neighboring tree;
  • Figure 3C is a schematic illustration of an automatic intervention response of a controller to a suddenly appearing hazard, operative according to another embodiment of the present invention;
  • Figure 4 is a schematic illustration of real-time-conditions indicators located in the vicinity of an operator in a remote cockpit of a crane, constructed and operative according to another embodiment of the present invention;
  • Figures 5A to 5C are sequential live images of a crane shown upon the display, with changing display settings in response to data which is indicative of a potential defect or hazard, constructed and operative according to another embodiment of the present invention, wherein Figure 5A is a visual display of real-time imaging of a scene of interest shown on the display, the contrast between the crane and its background environment being at a baseline level;
  • Figure 5B is a schematic illustration of the live images of Figure 5A, where the contrast level is augmented;
  • Figure 5C is a schematic illustration of the live images of Figure 5B, where the contrast level is further augmented;
  • Figure 6A is a schematic illustration of an image of a crane presented on a display, with added augmented-reality features, constructed and operative according to another embodiment of the present invention;
  • Figure 6B is a schematic illustration of an image of a crane presented on a display, with added virtual-reality features, constructed and operative according to another embodiment of the present invention;
  • Figure 7 is a block diagram of a method for remote tracking of a crane, operative according to another embodiment of the present invention; and
  • Figure 8 is a flow diagram of processing steps employed in remote tracking, according to a second embodiment of the present invention.
  • the present invention addresses the above-mentioned issues by providing a remote crane tracking apparatus and method for tracking and controlling at least one crane from a distance by an operator.
  • the remote crane tracking apparatus includes a display for displaying live imaging of the crane to an operator at a remote cockpit location separate from the crane, an image sensor for continuously recording the crane and supplying the live imaging to the image display, and a tracking device for automatically centering and focusing the image sensor on a selected target object of particular interest, for tracking the target object.
  • the target object may include, for example, the crane hook, a load carried by the crane, a load landing spot, a feature of interest from the surrounding scenery, and the like.
  • the image sensor may include a plurality of sensors, which may be positioned at different locations of the crane and the surrounding scenery so that the sensors collectively span the work site area.
  • the image display may be configured to switch between images provided by different sensors according to how well they depict the target object and may even display more than one live imaging at a time.
  • Other sensors may be combined with the image sensor and AR/VR features may be combined with the display, to enhance the tracking and control abilities of the operator.
  • an accelerometer is employed as a site sensor in addition to, or instead of, an imager. It should be appreciated that either imagers or accelerometers can be employed as site sensors configured to detect trigger events.
  • Crane tracking apparatus 100 includes controller 110 and display 112, for following and controlling the operation of a crane, referenced 200.
  • Crane 200 includes vertical crane mast 202, jib 204 which is positioned at a top region of crane mast 202 and perpendicular thereto, coupling pins 208 which connect the mast sections of mast 202, and hoist- and-hook 206 which is suspended from jib 204 and is configured to have a load connected to its end and to be extended and retracted according to need.
  • Counterweight 210 is mounted at an end of jib 204 opposite to the end from which hoist-and-hook 206 extends and is operative to counterbalance a load carried by hoist-and-hook 206.
  • Two image sensors 220 (e.g., cameras) are installed on crane 200.
  • a tracking device is embodied by a plurality of emitters 222, which are installed in various locations on crane 200, including the distal end of jib 204, the hook of hoist-and-hook 206, and counterweight 210, and allow enhanced tracking of the components on which they are installed by operator 150 by use of receiving unit 224, which is coupled to an image sensor 220.
  • cameras 220 are implemented as panning, motion tracking cameras operative to track an object as it moves or pans a particular area. Cameras 220 have a collective sensing capacity spanning the entire work site.
  • the motion tracking activity is any one or a combination of loading, load conveying, or unloading.
  • the panning, motion tracking cameras 220 are activated responsively to a trigger event in accordance with configuration choices.
  • cameras 220 are triggered by an event associated with a particular camera.
  • camera 220 can be activated by a particular load or jib movement, or unloading, captured through that camera and recognized through image recognition algorithms processed by controller 110.
  • camera 220 is activated by one or more of these or other trigger events detected by a different camera.
  • Possible trigger events include: a time, an imminent jib collision, a threshold load volume, and an abutment of a crane hook with a load. Abutment of a crane hook with a load generally signals the beginning of loading operations.
  • Other trigger events are related to system functionality, such as a captured task execution bordering a field of vision of a camera, or an image resolution below a configured threshold resolution.
  • Activation in this context refers to initiation of motion picture capture characterized by the capture of a series of images while tracking the motion of a designated object or a series of images captured while the camera pans one or more stationary objects.
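By way of illustration, the trigger-driven activation described above might be organized as in the following minimal Python sketch. The trigger names, the Camera class, and the dispatch helper are hypothetical, not taken from the patent; the point shown is that a trigger event captured through one camera may activate that camera and/or a different camera of the plurality.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Trigger(Enum):
    SCHEDULED_TIME = auto()
    IMMINENT_JIB_COLLISION = auto()
    THRESHOLD_LOAD_VOLUME = auto()
    HOOK_LOAD_ABUTMENT = auto()
    TASK_BORDERING_FOV = auto()
    LOW_IMAGE_RESOLUTION = auto()

@dataclass
class Camera:
    cam_id: str
    subscribed: set = field(default_factory=set)  # trigger types this camera reacts to
    capturing: bool = False                       # True once motion picture capture has begun

def dispatch(trigger: Trigger, source_cam: str, cameras: list) -> None:
    # A trigger event captured through one camera (source_cam) may activate
    # that same camera and/or a different camera of the plurality.
    for cam in cameras:
        if trigger in cam.subscribed and not cam.capturing:
            cam.capturing = True
            print(f"{trigger.name} (captured via {source_cam}) activates {cam.cam_id}")

cams = [Camera("jib-tip", {Trigger.HOOK_LOAD_ABUTMENT}),
        Camera("mast", {Trigger.TASK_BORDERING_FOV, Trigger.LOW_IMAGE_RESOLUTION})]
dispatch(Trigger.HOOK_LOAD_ABUTMENT, source_cam="mast", cameras=cams)
```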
  • the components and devices of apparatus 100 may be based in hardware, software, or combinations thereof. It is appreciated that the functionality associated with each of the devices or components of apparatus 100 may be distributed among multiple devices or components, which may reside at a single location or at multiple locations. For example, the functionality associated with controller 110 may be distributed between multiple processing units. Controller 110 may be part of a server or a remote computer system accessible over a communications medium or network, or may be integrated with other components of apparatus 100, such as incorporated with image sensor 220.
  • Apparatus 100 may optionally include and/or be associated with additional components not shown in Figure 1A, for enabling the implementation of the disclosed subject matter.
  • apparatus 100 may include a memory or storage unit (not shown) for temporary storage of images or other data.
  • controller 110 is configured to merge motion picture segments captured by panning motion tracking cameras 220 into a single motion picture of the task being executed. The merging is implemented through the appropriate motion picture software as is known to those skilled in the art.
  • Real-time display refers to the near-instantaneous display of imagery after motion picture capture. Controller processing delays are deemed negligible and are considered part of real-time display.
  • When cameras 220 are implemented as panning, motion-tracking cameras, the cameras are configured to pan across an array or series of deployed safety pins 208. The panning is captured as a motion picture segment and merged with other motion picture segments to form a complete motion picture of the task performed. Furthermore, panning, motion-tracking cameras 220 are operative to zoom in to capture a close-up motion picture segment of greater detail.
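A minimal sketch of the merging performed by controller 110, assuming each camera's segment is a list of timestamped frames; the Frame class and merge_segments helper are illustrative only, not from the patent. Ordering frames by capture time yields the single, complete motion picture to be displayed in real time.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    t: float          # capture timestamp, seconds
    cam_id: str       # camera that captured the frame
    image: bytes      # encoded frame payload (placeholder)

def merge_segments(segments):
    """Merge per-camera motion picture segments into one timestamp-ordered
    sequence, so a single display shows the task as one continuous movie."""
    frames = [f for seg in segments for f in seg]
    frames.sort(key=lambda f: f.t)
    return frames

seg_a = [Frame(0.00, "cam-A", b""), Frame(0.04, "cam-A", b"")]
seg_b = [Frame(0.02, "cam-B", b""), Frame(0.06, "cam-B", b"")]
movie = merge_segments([seg_a, seg_b])
print([(f.t, f.cam_id) for f in movie])
```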
  • The operation of remote crane tracking apparatus 100 will now be described in general terms, followed by specific examples.
  • Operator 150 manages and follows the operation of crane 200 by the use of controller 110 and display 112 (respectively).
  • Crane 200 is continuously monitored by at least one image sensor 220, such that the data recorded by image sensor 220 is substantially instantaneously transmitted to controller 110 and translated into a visual image which is displayed upon display 112.
  • Image sensors 220 may be installed at any location on crane 200 or on surrounding features, and positioned so as to track one or more specific components of crane 200 or of the surrounding features (herein: "selected crane component” or "target object”), which are of particular interest to the crane operator 150.
  • Locations at which image sensors 220 may be positioned include: the position of a non-remote crane cockpit, mast 202, each of the ends of jib 204, hoist-and-hook 206, surrounding building 300, and the like.
  • a plurality of image sensors 220 may be positioned to capture the plurality of objects, or a plurality of locations along the trajectory of the target object, respectively.
  • Image sensors 220 may also be positioned on mobile elements in the vicinity of crane 200, such as trucks and cement-mixers, and particularly may be worn by workers at the building site.
  • a signaler who directs operator 150 in maneuvering an element of crane 200, and who continuously adjusts his physical location to be in an optimal position to survey the element of crane 200 being maneuvered, may have an image sensor 220 installed on the top of his helmet or on any other part of his attire, allowing operator 150 to observe crane 200 from the perspective of the signaler.
  • Unmanned Aerial Vehicles (UAVs) may also include an image sensor 220, and may be utilized to survey crane 200 from changing locations.
  • the at least one target object of crane 200 or of the vicinity thereof may include, by way of a non-limiting example: the crane hook (of hoist-and-hook 206), a load carried by the crane, counterweight 210, safety pins 208, a load landing spot or location, a display of a scenery of interest, a view of the crane surroundings as captured by an image sensor disposed at a cockpit-location on the at least one crane, etc.
  • the target object may further include a safety weight sensor, which is operational for measuring the weight and/or distance of a crane load, and of indicating a safety threshold check regarding a maximal load and/or maximal spatial positioning of a load. Operator 150 perceives the real time situation of crane 200 as shown on display 112 and can react accordingly via controller 110.
  • Controller 110 may be operational to manage all operations of crane 200, such as changing the angle of a working arm/jib, i.e., changing the azimuth and/or vertical angle thereof; extending and retracting of a working arm/jib; lowering and collecting of a load hoist; and any other operation of a crane as known in the art, particularly the crane operations which are usually controlled by a crane-cockpit located on the crane.
  • Controller 110 may also control the activation and operation of image sensors 220, including adjustment of angle, position, focus, magnification, and any other settings of image sensors 220.
  • the imaging data (herein “images") recorded and provided by only a single image sensor 220 (herein “currently shown image sensor”) is shown on display 112 at any given time.
  • Operator 150 can adjust the settings of display 112, e.g., brightness, contrast, coloring and the like, and/or can adjust any of the settings of currently shown image sensor 220, so as to better perceive or locate a feature of interest shown on display 112.
  • Operator 150 can also shift between the images provided by the various image sensors 220, to survey different components of crane 200 and to improve the view of a particular feature of interest.
  • Emitters 222 can be installed on any number of various components of crane 200 and the surrounding features, and are configured to each emit a distinct signal which can be identified by a receiving unit 224.
  • Receiving unit 224 recognizes the directionality of the signal emitted by an emitter 222 relative to receiving unit 224.
  • Receiving unit 224 may also recognize the intensity of the signal, which allows assessment of the distance of emitter 222 from receiving unit 224, and, together with the directionality, allows assessment of the location of emitter 222 relative to receiving unit 224.
  • Receiving unit 224 is in wireless communication with controller 110, and may be fixedly installed at a particular position which is recorded by controller 110, such that any space-related information, e.g., location coordinates, which is defined relative to receiving unit 224 can be translated into absolute location coordinates as defined in controller 110 (herein “absolute location coordinates").
  • absolute location coordinates may be defined such that the longitudinal and latitudinal coordinates comply with the GPS system, and the altitudinal coordinate is defined relative to Sea Level. Any other coordinate system may of course be defined.
  • When controller 110 receives data from receiving unit 224 regarding the signal emitted by emitter 222, controller 110 can compute the position of emitter 222 relative to receiving unit 224, and by extension compute the precise absolute location of emitter 222 and the crane component to which it is coupled.
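The position computation might proceed as in the following sketch, which assumes a free-space inverse-square relation between emitter distance and received intensity, and a surveyed (known) receiver position. The constants, coordinate convention, and function name are hypothetical, not specified by the patent.

```python
import math

# Receiving unit 224's surveyed absolute position (east, north, up), metres -- assumed known.
RECEIVER_POS = (35.0, 120.0, 40.0)
P_REF = 1.0  # received power at 1 m from an emitter (calibration constant, assumed)

def emitter_absolute_position(azimuth_deg, elevation_deg, rx_power):
    """Estimate an emitter's absolute position from the signal's direction of
    arrival and its received intensity (free-space inverse-square assumption)."""
    distance = math.sqrt(P_REF / rx_power)        # P ~ P_REF / d^2  =>  d = sqrt(P_REF / P)
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    dx = distance * math.cos(el) * math.sin(az)   # east offset from receiver
    dy = distance * math.cos(el) * math.cos(az)   # north offset
    dz = distance * math.sin(el)                  # up offset
    rx, ry, rz = RECEIVER_POS
    return (rx + dx, ry + dy, rz + dz)

# Emitter 50 m away, bearing 45 degrees, 10 degrees above the horizon:
print(emitter_absolute_position(azimuth_deg=45.0, elevation_deg=10.0, rx_power=4.0e-4))
```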
  • a tracking device which provides location related data of selected crane components, may include for example: a visually-identifiable-element coupled to the selected crane component, featuring a distinct shape, design, color or other visual feature, which controller 110 can be predefined to identify within visual data provided by visual sensors 220; machine learning of the features of interest; a GPS receiver and a barometric pressure sensor (e.g., InvenSense ICP-10111 barometric sensor) both coupled to the selected crane component, the GPS receiver being operational to compute the longitudinal and latitudinal coordinates of the crane component, and the barometric pressure sensor being operational to compute the altitudinal position of the selected crane component; or any other tracking device or system known in the art.
  • Controller 110 may be predefined to continuously adjust the position, focus and/or other characteristics of an image sensor 220, so as to maintain the selected crane component in a preferred region of the image provided by image sensor 220, e.g., a substantially central region, a region in the eye line of operator 150, a region dynamically selected by operator 150, or any other predefined region, such that when the selected crane component is in motion it appears on display 112 to be floating at a substantially fixed region of the image, while the surrounding scenery dynamically changes.
  • controller 110 may adjust the angle of image sensor 220, i.e., the direction in which it points, so as to follow the trajectory along which the selected crane component (or corresponding tracking device/element) is moving, and to zoom in, upon detection as a trigger event, to provide an enlarged view of the target relative to the image size.
  • This close-up view advantageously provides an operator with additional detail of the surrounding area facilitating operator control.
  • the preferred region may be defined relative to display 112, and not necessarily in relation to the live-imaging provided by image sensor 220. This may be particularly useful when images from a plurality of image sensors 220 are being presented on display 112 and the selected crane component is being tracked by more than one image sensor 220, as will be further explained in relation to Figures 2A and 2B.
  • controller 110 may be operational to digitally bring the selected crane component into the preferred region on display 112. This may be achieved, for example, by enlarging the portion of the image surrounding the selected crane component (i.e., zooming in), and/or by moving the image supplied by image sensor 220 relative to display 112, such that the portion of the image including the selected crane component is located in the preferred region of display 112, and other parts of the live imaging (on the opposite side of the image from the selected crane component) are not presented on display 112.
  • the parts of display 112 which become imageless due to the moving of the live-imaging relative to display 112 may be filled with a generic filling, such as a single-colored patch, or may be filled by controller 110 with virtual reality elements, clearly differentiated from the real live imaging which is provided by the image sensor 220, which virtual reality elements may be based on image data supplied by any of image sensors 220.
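Digitally bringing the component into the preferred region amounts to choosing a crop window within the sensor frame, as in this sketch. The function name and parameters are illustrative; the returned padding corresponds to the imageless margins described above, to be filled generically or with clearly differentiated virtual-reality elements.

```python
def crop_to_preferred(frame_w, frame_h, target_xy, preferred_xy, view_w, view_h):
    """Compute the crop window within a sensor frame that places the tracked
    component at the operator's preferred on-screen position; margins that
    fall outside the frame are reported so they can be filled generically."""
    tx, ty = target_xy
    px, py = preferred_xy
    left, top = tx - px, ty - py                   # crop origin in frame coordinates
    pad_l, pad_t = max(0, -left), max(0, -top)     # imageless margins to fill
    pad_r = max(0, left + view_w - frame_w)
    pad_b = max(0, top + view_h - frame_h)
    crop = (max(0, left), max(0, top),
            min(frame_w, left + view_w), min(frame_h, top + view_h))
    return crop, (pad_l, pad_t, pad_r, pad_b)

# Target near the frame's lower-right corner, preferred position at screen centre:
crop, padding = crop_to_preferred(1920, 1080, target_xy=(1700, 900),
                                  preferred_xy=(960, 540), view_w=1920, view_h=1080)
print(crop, padding)
```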
  • maintaining the selected crane component in a preferred position may also allow, or assist, operator 150 in maintaining a healthy body posture while supervising the operation of the various crane components. This is particularly significant when the operation of the selected crane component, which requires scrutiny, is lengthy, requiring substantially continuous examination for a period of, for example: 15 minutes, 30 minutes, 1 hour, 2 hours, 4 hours, or more.
  • When hoist-and-hook 206 is being operated to successively transfer a plurality of loads from a first position to a second position, operator 150 often tracks the movement of hoist-and-hook 206 throughout the entire operation, moving along a trajectory back-and-forth from the first position to the second position, where the two positions may be at different heights from the ground. Maintaining hoist-and-hook 206 at a preferred position on display 112, the preferred position being either predefined or selected by operator 150 (optionally, ongoingly selected during the operation of hoist-and-hook 206), may ensure that operator 150 is gazing at a point on display 112 which corresponds to a healthy body posture.
  • a healthy body posture may include diversifying the body posture of operator 150 after a lapse of a predetermined time period (e.g., 10 minutes), where in order to assist operator 150 in achieving this body posture controller 110 may be predefined to relocate the preferred position of the selected crane element on display 112, according to a predefined or a random sequence pattern.
  • Examples of an unhealthy body posture may include, for example, continuously bending the neck of operator 150 downwards, straining the neck upwards, turning the neck to one side, hunching the back of operator 150, and the like.
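A sketch of such a relocation schedule, assuming a predefined sequence of candidate on-screen positions; the positions, period, and names are hypothetical stand-ins for controller 110's predefined or random sequence pattern.

```python
import random

# Assumed candidate preferred positions (pixels) cycled to vary operator posture.
POSTURE_SEQUENCE = [(960, 400), (700, 540), (1220, 540), (960, 680)]

def next_preferred_position(elapsed_s, current, period_s=600, randomize=False):
    """Relocate the preferred on-screen position after `period_s` seconds
    (e.g., 10 minutes) to encourage the operator to vary body posture;
    `current` is assumed to be a member of POSTURE_SEQUENCE."""
    if elapsed_s < period_s:
        return current                              # period not yet elapsed: keep position
    if randomize:
        return random.choice([p for p in POSTURE_SEQUENCE if p != current])
    i = POSTURE_SEQUENCE.index(current)
    return POSTURE_SEQUENCE[(i + 1) % len(POSTURE_SEQUENCE)]

print(next_preferred_position(615, (960, 400)))     # 10+ minutes elapsed -> next position
```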
  • controller 110 may be operational to emphasize the selected crane component upon the image provided by image sensor 220 and presented on display 112, for example by adding a colored-dot to the image in close proximity to the selected crane component, surrounding the selected crane component with a circle, changing the image settings in the immediate vicinity of the selected crane component (e.g., sharpness, coloring, brightness, magnification, etc.), and the like.
  • controller 110 may track the location, and motion, of any number of selected components of crane 200 or the surroundings thereof, either through data provided by image sensors 220 or through data provided by any other sensors, without any necessary correlation with the imaging data which is being presented on display 112. This enhances the ability of operator 150 to control and regulate the operation of the crane components, as he can receive information and indications from controller 110 which are not dependent upon and limited by his own perception of crane 200 through display 112.
  • the image sensors 220 may not effectively cover all elements or areas of crane 200 or the vicinity thereof which may be of interest to operator 150, such that operator 150 may not have visual access to an element or area which requires attention.
  • a hopping mobile apparatus, which includes an image sensor, may be directed to visually record the area of interest.
  • Figure 1B is an illustration of Unmanned Aerial Vehicle (UAV) or drone 226, which is controlled and maneuvered by operator 150 to scan regions which are not sufficiently scanned by other image sensors 220.
  • UAV 226 includes image sensor 221 which is coupled to the front upper portion of UAV 226.
  • UAV 226 may include more than one image sensor 221, which may face in different directions, providing live imaging coverage of different sides of UAV 226. This may be particularly useful when UAV 226 is applied by operator 150 to scan a region which may be difficult to access directly with a UAV, such that the forward-facing image sensor 221 may not be effective, but image sensors 221 which are located on other portions of UAV 226 and/or which face other directions may be useful instead.
  • UAV 226 may include other features or components in addition to image sensor 221, such as a deployable arm, a voice recorder (e.g., microphone), and the like, and may be utilized by operator 150 (and/or controller 110) to perform chirurgic command and control tasks on crane 200 or components thereof, particularly in regions which operator 150 cannot sufficiently access with the other components of crane 200.
  • the UAV may include, for example, a delivery drone, a helicam, or any other type of UAV, and may be manually controlled and/or autonomously operated.
  • Figure 2A is an illustration of display 112 divided into a plurality of screens 114, each screen 114 showing a live imaging provided by an individual corresponding one of image sensors 220, where: screen 114A displays a live imaging of the front end of jib 204 (from which hoist-and-hook 206 extends), provided by an image sensor 220 which is located at the crane-cockpit location of crane 200; screen 114B displays a live imaging of the rear end of jib 204 (including counterweight 210), provided by an image sensor 220 which is located at the crane-cockpit location of crane 200; screen 114C displays a live imaging of the lateral profile of crane 200, provided by an image sensor 220 which is located on a construction or apparatus, e.g., an unmanned aerial vehicle (UAV) (not shown), which is positioned laterally to crane 200; and screen 114D displays a live imaging of a particular coupling pin 208, provided by an image sensor 220 which is positioned on a component of a UAV.
  • Visually-identifiable-element 250, which is another embodiment of the above-mentioned tracking device in that it is recognized by controller 110 within visual data provided by visual sensors 220, is coupled with hoist-and-hook 206, shown on screens 114B and 114C.
  • the settings of the image displayed on screen 114D including brightness, coloring, contrast, sharpness etc., are set so as to be substantially optimal for being viewed and considered by operator 150, while the image-settings of the other screens 114A-114C are less optimal (herein "second-rate image settings").
  • This difference in the settings of screen 114D and the rest of screens 114A-114C is intended to focus the attention of operator 150 on the live imaging that is being shown on screen 114D, herein “optimal-settings screen”, while at the same time maintaining the live imaging which is displayed on screens 114A-114C, herein “secondary screens”, available for inspection by operator 150 and at an image quality which is good enough for inspection.
  • a "good enough" quality of the second-rate image settings is not a fixed definition, and may vary according to a variety of changing factors, including, amongst other things, the personal preferences of the operator 150, and the importance of the live imaging which is shown on the secondary screens, relative to the optimal-settings screen.
  • the optimal-settings screen may have only marginally better image settings compared to the secondary screens.
  • the image settings of the secondary screens may be substantially altered, usually to be less convenient for viewing, so as to assist operator 150 to focus his attention on the live imaging of the optimal-settings screen in which the particular component of crane 200 is best depicted.
  • One or more of screens 114A-114D may be displayed at optimal image settings, i.e., serve as optimal-settings screens, for example when the particular component of crane 200 which requires special attention is shown from different angles on a plurality of screens, or when the target object which requires special attention is a series of objects with regard to which a visual assignment is required to be conducted, each of the series of objects respectively shown on one of screens 114A-114D.
  • the objects may be successively inspected by image sensors 220, optionally according to a program predefined in controller 110 or manually by operator 150, such that upon initiation of the sequential inspection program, the direction/position of respective image sensors 220 is successively adjusted to provide a view of each one of the target objects in turn, the currently inspected object being presented in an optimal- settings screen.
  • the target object may be safety pins 208, and the visual assignment may be checking that each of pins 208 is safely installed in its respective position.
  • the safety check may be done by means other than a visual survey, e.g.: an audio check, utilizing audio sensors positioned in close proximity to safety pins 208, where an intensified rattling sound captured by a specific audio sensor when a respective safety pin 208 is placed under strain is an indication that the pin 208 is not installed tightly enough; a mechanical check using a mechanical sensor, constantly coupled to a respective safety pin 208 or employed specifically for the safety check by UAV 226, the mechanical sensor being coupled to a deployable arm of UAV 226 and UAV 226 sequentially cruising between the series of objects, the mechanical sensor sensing for unusually intense vibrations of the safety pin 208 as an indication of loosening of pin 208; and the like.
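A sequential inspection program of the kind described might be sketched as follows, with hypothetical pin coordinates and placeholder actuation and check routines standing in for the real camera slewing and the visual, audio, or mechanical checks.

```python
import time

# Assumed coordinates of the safety pins 208 to be surveyed in turn.
SAFETY_PINS = {"pin-1": (0.0, 10.0, 55.0), "pin-2": (0.0, 10.0, 45.0)}

def point_sensor_at(sensor_id, xyz):
    print(f"{sensor_id}: slewing to {xyz}")         # placeholder for real actuation

def pin_secure(sensor_id, pin_id):
    return True                                     # placeholder visual/audio/mechanical check

def run_inspection(sensor_id="cam-mast", dwell_s=2.0):
    """Successively direct an image sensor at each safety pin in turn, per a
    predefined program, and record whether the pin appears safely installed."""
    report = {}
    for pin_id, xyz in SAFETY_PINS.items():
        point_sensor_at(sensor_id, xyz)
        time.sleep(dwell_s)                         # hold while the pin occupies the optimal-settings screen
        report[pin_id] = pin_secure(sensor_id, pin_id)
    return report

print(run_inspection(dwell_s=0.0))
```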
  • controller 110 may be predefined to receive a selection of one or more of screens 114A-114D by operator 150, and adjust the image settings of the selected screen according to the selection of operator 150.
  • Operator 150 may define, initially and/or ongoingly, what image settings or characteristics are adjusted and in which direction, e.g., elevating or decreasing an image characteristic, or more generally improving or deteriorating the quality of the displayed image according to predefined settings.
  • the selection of which of screens 114A-114D be displayed as an optimal-settings screen may be performed automatically by controller 110 according to preset definitions.
  • the preset definitions may be changeable to suit the dynamic requirements of operator 150 in each operating session of crane 200.
  • the preset definitions may be directed to/intended for preserving substantially continuous tracking of a selected component of crane 200, preventing an imminent dangerous situation which is detected by controller 110, performing a sequential examination of different components of crane 200 or of the surrounding vicinity, or any other remote crane-controlling requirements.
  • controller 110 may be preset to render one or more of screens 114A-114D an optimal-settings screen (i.e., displaying the image in the screen with substantially optimal characteristics), upon recognizing visually-identifiable-element 250 within the imaging data of the visual sensor(s) 220 which is being presented on those one or more screens 114A-114D.
  • controller 110 may also be predefined to continuously alter the position of an image sensor 220, so as to follow hoist-and-hook 206.
  • controller 110 may be additionally preset to identify the quality at which visually-identifiable-element 250 is being shown on each of screens 114A-114D (i.e., the "quality of display" of element 250).
  • the definition of the "quality of display" of visually-identifiable-element 250 may be according to various criteria of the image presented on screens 114A-114D which can be measured by controller 110.
  • the quality of the image may be defined according to the "size" of element 250 within the presented image, the size being assessed in correlation with the number of pixels which element 250 occupies within the live imaging data provided by image sensor 220.
  • the higher quality image may be defined, for example, in direct relation to the increase in size of visually-identifiable-element 250, namely "the bigger the better”; may be defined in relation to a predefined size range, such that the closer to the size range (or the mid-point thereof) the better; or in any other relation to the size of visually-identifiable-element 250 as measured by controller 110.
  • the quality of the display of visually-identifiable- element 250 may be defined according to other factors, some non-limiting examples being the completeness of the display of visually-identifiable-element 250, i.e., what percent of element 250 is being displayed (the higher the better); the sharpness, acutance, resolution, or other imaging characteristic of element 250 or of the whole image; and/or according to any combination of the above factors.
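One possible scoring of the "quality of display", combining the size, completeness, and sharpness criteria above; the equal weighting, value ranges, and names are assumptions for illustration, not the patent's definition.

```python
def display_quality(pixel_area, visible_fraction, sharpness,
                    size_range=(5_000, 50_000)):
    """Score how well visually-identifiable-element 250 is depicted in one
    sensor's imaging: element size in pixels (scored against a preferred
    range), completeness of the view (0..1), and sharpness (0..1)."""
    lo, hi = size_range
    mid = (lo + hi) / 2
    size_score = max(0.0, 1.0 - abs(pixel_area - mid) / mid)  # closer to the range mid-point is better
    return (size_score + visible_fraction + sharpness) / 3.0

screens = {"114A": (9_000, 0.60, 0.7), "114B": (26_000, 1.00, 0.8),
           "114C": (70_000, 0.90, 0.5), "114D": (31_000, 0.95, 0.9)}
best = max(screens, key=lambda s: display_quality(*screens[s]))
print(f"optimal-settings screen: {best}")           # the others remain secondary screens
```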
  • Figure 2B is an illustration of display 112 divided into a major screen 116A and a minor screen 116B, each of screens 116A-116B showing a live imaging provided by an individual corresponding one of image sensors 220.
  • Major screen 116A occupies the majority of the area of display 112 and constitutes a principal display, while minor screen 116B occupies a minor portion of the area of display 112, constituting a minor display.
  • The live imaging presented on major screen 116A (i.e., the imaging recorded by a particular image sensor 220), and the live imaging presented on minor screen 116B, may be continuously selected by operator 150. Alternatively, they may be selected according to parameters which are predefined in controller 110, as explained in detail above.
  • Ongoing thumbnail-like samples of the live imaging provided by the plurality of image sensors 220 may be displayed on a margin of display 112, allowing operator 150 to constantly be aware of what is being viewed by each image sensor 220, and to select from the different possibilities what live imaging is displayed on major screen 116A and what is displayed on minor screen 116B.
  • the number of minor screens shown on display 112 may be adjusted according to the choice of operator 150, as may the relative sizes of the major screen 116A and the one or more minor screens 116B.
  • operator 150 may select a live imaging recorded by a first image sensor 220 to be displayed on major screen 116A, which records a selected crane component from a direct front view, and may select an additional two live imagings, supplied by two additional image sensors 220, to be displayed in two minor screens 116B, which record the selected crane component from two opposite side views, respectively.
  • the choice of the number of minor screens 116B, and the relative size of the minor screens 116B and the major screen 116A may also be automatically controlled by controller 110 according to predefined settings, similarly to the preset settings according to which controller 110 may render one or more of screens 114A-114D an optimal-settings screen, as described above.
  • operator 150 may have a "selective interest" in a selected crane component, i.e., the crane component may be of interest only when it is in a particular location, or when the crane component comes into close contact with another component or structure in the vicinity of the crane.
  • One option of addressing selective interest in a selected crane component may include fixedly positioning an image sensor 220(F) to continuously stare towards and record the location or apparatus in the vicinity of which the selected crane component is of interest. The recorded live imaging is continuously provided to controller 110, and controller 110 may be preset to present the live imaging provided by image sensor 220(F) on display 112 immediately upon recognizing the selected crane component within the provided live imaging data.
  • This possibility of suddenly presenting a live imaging upon display 112 in response to a particular trigger (when the suddenly presented live imaging is not necessarily continuous with the live imaging which was previously being shown on display 112, nor related to the crane feature which was previously being tracked or presented) is termed herein "image bursting".
  • When display 112 is divided into screens 114A-114D, or into major screen 116A and minor screen(s) 116B, the image bursting may be presented in a dominant screen (e.g., major screen 116A or an optimal-settings screen) or in an auxiliary screen (e.g., minor screen 116B or the secondary screens), according to the predefined settings of controller 110.
  • Image bursting may be further utilized in other situations in which a trigger is predefined in controller 110.
  • For example, when the operator is searching for a particular crane component using a plurality of image sensors 220, upon identifying the particular crane component controller 110 immediately displays the live imaging in which the particular crane component was identified; similarly, when a dangerous situation occurs which involves a particular crane component or surrounding feature, controller 110 immediately displays the live imaging in which the dangerous situation was identified.
  • Tracking sensors 230 include, in addition to image sensors 220, the following sensors: wind sensor 232, mounted on jib 204 and configured to detect the force and direction of the wind blowing at the height of jib 204 above ground; vibration/motion sensor 234, coupled with a first coupling pin 208 of crane mast 202 and configured to detect vibrations/movements of coupling pin 208; tension sensor 235, coupled with a second coupling pin 208 and configured to measure the stress/tension being applied on coupling pin 208 by the crane elements whose coupling it is maintaining; radar sensor 236, coupled with hoist-and-hook 206 and configured to continuously detect objects or constructions which are in proximity to hoist-and-hook 206 and measure the distance between hoist-and-hook 206 and the detected object; and direction sensor 238 (i.e., a compass-like sensor).
  • Each of tracking-sensors 230 is in wireless communication with controller 110 and may continuously provide controller 110 with the data that the tracking-sensor senses and/or computes.
  • Figure 3B illustrates coupled mutual detectors 240, mounted on crane 200 and on neighboring tree 160.
  • a first mutual detector 240 is positioned near the end of hoist-and-hook 206, and a second mutual detector 240 is positioned on tree 160, at a location which lies within a potential trajectory of hoist-and-hook 206.
  • Mutual detectors 240 are configured to constantly emit and/or receive wireless signals 242 at a specific matching frequency and intensity, usually in an omni-directional pattern, such that when the first and second mutual detectors 240 come within a predefined distance of each other, at least one of detectors 240 will receive the coupling signal 242 emitted by its corresponding detector 240, and will thereby identify the close presence of the corresponding detector 240. Upon identifying the presence of the corresponding detector 240, each detector 240 is configured to provide a signal to controller 110.
  • the signal provided to controller 110 may be of an intensity which corresponds to the intensity of the coupling signal 242 which is received by detector 240, this intensity being correlated with the closeness of the two detectors 240, which allows controller 110 to compute the distance between detectors 240 and their motion (i.e., speed, acceleration) relative to each other.
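Under an inverse-square intensity assumption, the distance and relative-motion computations reduce to the following sketch; the calibration constant and function names are illustrative, not values from the patent.

```python
import math

P_REF = 1.0  # power received at 1 m separation (calibration constant, assumed)

def separation(rx_power):
    # Inverse-square assumption: P = P_REF / d^2  =>  d = sqrt(P_REF / P)
    return math.sqrt(P_REF / rx_power)

def closing_speed(power_t0, power_t1, dt):
    """Estimate how fast the two coupled detectors 240 are approaching each
    other from two successive intensity readings taken `dt` seconds apart."""
    return (separation(power_t0) - separation(power_t1)) / dt  # positive = closing

d = separation(1.0e-2)                  # => 10 m apart
v = closing_speed(1.0e-2, 2.5e-2, 1.0)  # 10 m -> ~6.3 m in 1 s: closing at ~3.7 m/s
print(round(d, 1), round(v, 2))
```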
  • Each type of sensor 230 and detector 240 may include one or more sensors, which may be installed in various locations of crane 200 and the surroundings thereof to collect information about the surrounding settings in which crane 200 and its components are operating. Controller 110 receives the information (i.e., data) collected by sensors 230 and may be configured to perform a variety of analyses on the collected data so as to continuously monitor the situation of crane 200 and the components thereof.
  • Independent thresholds may be defined for the data provided by different sensors and for the products of the analyses performed on the sensed data; when one of these thresholds is crossed, controller 110 produces an indicative signal and/or a suitable response.
  • the independent thresholds may be safety thresholds, which are predefined in controller 110 for each of sensors 230.
  • the response of controller 110 to crossing of a threshold may be to burst a live-imaging onto display 112 which shows at least one of the sensors, crane components, and/or risk factors which is related to the threshold which was crossed.
  • At least one of image sensors 220 may be assigned for each sensor/crane component, such that when a safety threshold is crossed controller 110 diverts the assigned image sensor 220 toward the relevant sensor or crane component.
  • the assignment of an image sensor 220 to a crane component/sensor may be based on the fixed location of the image sensor 220 relative to the fixed location of the crane component/sensor.
  • When the sensors or crane components are mobile, however, it is required to continuously track the motion of the crane components/sensors, as has been explained with reference to Figure 1A regarding the tracking device, in order to allow diverting an image sensor 220 to depict them in the event of sensory data which crosses a threshold.
  • an independent safety threshold may be predefined in controller 110 for a wind intensity (i.e., force) measured by wind sensors 232, which intensity is considered to endanger the stability of crane 200 or of one of the crane components.
  • the wind intensity threshold may be preset in correlation to the direction at/from which the wind is blowing, or to the dimension of jib 204 on which the wind is blowing, e.g., the side-lengthwise dimension; the "front"/"backward” dimension, "front” being the dimension facing the distal tip of jib 204 and backward being the respective opposite dimension; the bottom dimension, i.e., upward-blowing wind gusts; etc.
  • When controller 110 receives data from a wind sensor 232 indicative of a particular wind intensity which exceeds the predefined intensity threshold, it may be operational to burst a corresponding live imaging onto display 112.
  • the preset corresponding live imaging may include a live imaging which depicts, for example: wind sensor 232; the side of jib 204 which is being buffeted by the wind; the whole of crane 200 from the side from which the wind is blowing, or from the opposing side (i.e., the side towards which crane 200 might topple); and/or the area beneath or in the vicinity of crane 200 which is endangered by a potential collapse of crane 200 or of one of its components.
  • Similar safety-related thresholds may be defined in controller 110 for each of the above-mentioned sensors, and the image sensors 220 may be similarly repositioned to record a relevant crane component/sensor, etc.
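A threshold-monitoring loop of this kind might look like the following sketch, using hypothetical direction-dependent wind limits, an assumed sensor-to-camera assignment, and a placeholder burst routine.

```python
# Direction-dependent wind safety thresholds, m/s (assumed values).
WIND_THRESHOLDS = {"side": 18.0, "front": 25.0, "back": 25.0, "bottom": 12.0}
# Image sensor assigned to each tracked sensor/crane component (assumed mapping).
ASSIGNED_SENSOR = {"wind-232": "cam-jib-side"}

def burst_onto_display(cam_id, reason):
    print(f"display 112: bursting live imaging of {cam_id} ({reason})")  # placeholder

def check_wind(sensor_id, speed_ms, dimension):
    """Compare a wind reading against the direction-dependent safety threshold
    and, on a crossing, burst the assigned image sensor's live imaging."""
    limit = WIND_THRESHOLDS[dimension]
    if speed_ms > limit:
        burst_onto_display(ASSIGNED_SENSOR[sensor_id],
                           f"wind {speed_ms:.0f} m/s on {dimension} exceeds {limit:.0f} m/s")

check_wind("wind-232", speed_ms=21.0, dimension="side")
```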
  • Controller 110 may also be preset to autonomously activate imminent-danger-prevention measures, in cases where a potential danger is identified by controller 110 and is regarded to be too imminent for a timely reaction by operator 150.
  • Figure 3C illustrates an example of an automatic intervention response of controller 110 to a suddenly appearing hazard.
  • Crane 200 is erected in proximity to under-construction-building 300.
  • Deployable platform 310 is positioned in an alcove of building 300 with an opening in the direction of crane 200, such that arm 312 of deployable platform 310 can be deployed outward from building 300, along trajectory A2, in the direction of crane 200, for example for receiving a load which is delivered by crane 200.
  • Construction worker 320 operates deployable platform 310, e.g., by using a lever (322), to deploy and retract arm 312.
  • Jib 204 of crane 200 carries load 216 using hoist-and-hook 206, and is operational, amongst other movements, to raise and lower the distal end of jib 204 (i.e., the end of jib 204 from which hoist-and-hook 206 is suspended) along trajectory A1.
  • controller 110 controls the movement of jib 204 and is usually operated by an operator (150).
  • controller 110 is operated by an operator to produce an operator-signal S1 so as to raise the distal end of jib 204 along trajectory A1, in order to deliver load 216 to an elevated position within building 300.
  • construction worker 320 pulls lever 322, thereby deploying arm 312 of deployable platform 310 along trajectory A2.
  • Arm 312 extends outward such that it is positioned directly within the intended movement pathway of jib 204.
  • Controller 110 recognizes the imminent collision by analyzing the ongoing data which it receives from sensors disposed on crane 200 (not shown), and identifies that if the ongoing operation of raising jib 204 continues at its current velocity, jib 204 will collide with deployable platform 310 within a very short time period, too short for an operator (150) to respond to an alarm sensory stimulation and prevent the collision. Controller 110 therefore produces an automatic-signal S2 which brings jib 204 to an immediate stop, overriding operator-signal S1 under which jib 204 was previously operating.
  • Controller 110 may be operational to provide an automatic-signal intervention response so as to prevent any other type of dangerous situation, and the automatic-signal intervention response may entail activating any component of crane 200 or any other component which is controlled by controller 110.
  • Some further examples include: (a) controller 110 operating a slewing mechanism which controls the azimuth of jib 204 and the vertical angle of jib 204 relative to the horizon, so as to adjust the azimuth and/or vertical angle of jib 204 when a wind blows against jib 204 at a direction and intensity above a predefined threshold, to prevent the crane from falling over; (b) controller 110 operating a trolley which travels back-and-forth along jib 204 and from which load-carrying hoist-and-hook 206 is suspended, such that if a truck, animal, or the like suddenly enters the immediate course of the trolley, controller 110 produces a signal to halt or retract the trolley; and the like.
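The automatic-intervention decision of Figure 3C reduces, in essence, to comparing a predicted time to collision with a conservative operator reaction time. A hedged sketch follows, with the reaction-time value and all names assumed for illustration only.

```python
# A minimal sketch of the imminent-danger logic: if the predicted time to
# collision is shorter than a conservative operator reaction time, the
# controller issues automatic-signal S2, overriding operator-signal S1.
OPERATOR_REACTION_TIME_S = 2.0   # illustrative assumption

def time_to_collision_s(distance_m: float, closing_speed_mps: float) -> float:
    """Predicted time until contact; infinite if the gap is not closing."""
    if closing_speed_mps <= 0:
        return float("inf")
    return distance_m / closing_speed_mps

def select_signal(distance_m: float, closing_speed_mps: float) -> str:
    """Choose between the operator's command and an automatic stop."""
    ttc = time_to_collision_s(distance_m, closing_speed_mps)
    if ttc < OPERATOR_REACTION_TIME_S:
        return "S2: immediate stop (override)"   # too imminent for the operator
    return "S1: continue operator command"

print(select_signal(distance_m=1.5, closing_speed_mps=1.2))  # -> S2 override
```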
  • a person who is physically located in the vicinity of at least one crane 200 may have a means of communication with operator 150 and/or directly with controller 110, to convey information and/or instructions to operator 150 and controller 110.
  • the communication of information and/or instructions may of course be in both directions, operator 150 or controller 110 being operational to supply the man-on-the-spot with instructions or notifications of potential threats.
  • the man-on-the-spot may be enabled to activate an alarm indication in the vicinity of operator 150, to burst an alert message or live image onto display 112 of operator 150, or to activate an imminent-danger-prevention response of controller 110.
  • a restricting action sequence may be defined to restrict the intervention of the man-on-the-spot in the operation of crane 200 only to situations in which such intervention is deemed crucial.
  • the man-on-the-spot may have to activate an alarm indication in the vicinity of operator 150, such that the man-on-the-spot can activate an element of crane 200 only if operator 150 does not respond to the alarm indication within a predefined time-length (e.g., 10 seconds), or if operator 150 grants the man-on-the-spot permission. Any other settings and/or relations between operator 150 and the man-on-the-spot may be defined.
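A minimal sketch of such a restricting action sequence, assuming the 10-second timeout from the example above; the function signature and flags are illustrative assumptions.

```python
import time

ALARM_TIMEOUT_S = 10.0   # predefined time-length from the example above

def may_intervene(operator_responded: bool, operator_granted: bool,
                  alarm_raised_at: float, now: float) -> bool:
    """True if the man-on-the-spot may activate a crane element."""
    if operator_granted:
        return True          # explicit allowance from operator 150
    if operator_responded:
        return False         # the operator is handling the situation
    return now - alarm_raised_at >= ALARM_TIMEOUT_S   # operator unresponsive

t0 = time.monotonic()
print(may_intervene(False, False, t0, t0 + 12.0))  # True: timeout elapsed
```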
  • the responses initiated by controller 110 in reaction to sensed data crossing a predefined threshold may further include producing a signal or activating an alarm indicator, which is intended to inform the operator of the dangerous situation which is represented by the threshold being crossed, and/or to draw the attention of the operator to the dangerous situation.
  • controller 110 may be configured to activate one or more alarm stimulations, intended to draw the attention of operator 150 to the hazard/defect which has been recognized.
  • the alarm stimulation may be produced, for example, by at least one alarm indicator which may be directed to stimulate any of the five senses: touch, sight, hearing, smell and taste, and may additionally be directed to cause a psychological reaction.
  • alarm indicators include: varying the settings of the live imaging depicted on display 112; a visual alert display, e.g., a written message or warning symbol appearing on display 112; automatic focusing on an object of particular interest/requiring particular attention; a flickering light/display; a light-control for flickering the ambient lighting surrounding operator 150; a gas-sprayer for spraying strong smelling gas; an alarm for sounding a message/sound/alert, such as beeping, siren, a spoken command, screaming; a tactile vibrator for vibrating a handle/stick/chair in physical contact interface with operator 150; and an electric current in physical contact interface with the operator 150.
  • a general purpose of the alarm stimulations is to produce a sense of alert and urgency in operator 150.
  • Operator 150 may be able, when setting up controller 110, to define an alarm, or sequence of alarm indicators, which he knows to be particularly effective on him.
  • controller 110 may be configured to record the response time of operator 150 to various alarm indications during operation sessions, which may be diverse both in respect to the bodily sense which the alarm stimulates and in respect to the type of stimulation (e.g., which type of sound, which type of visual stimulation, etc.), and to analyze the collected information/data so as to define a most-effective stimulation for operator 150. This data collection and analysis may be performed for each operator independently, or may be performed for all operators collectively.
  • Controller 110 may be configured to record the response time and the response quality of operator 150 to various stimulations, i.e., how fast and how accurately operator 150 reacts to each sensory stimulation, during operation sessions, and to analyze the collected information/data so as to define a stimulation scale for operator 150.
  • the quality of the response of operator 150 is important as well as the speed of the reaction, as certain sensory stimulations may be found to be too alarming, so as to cause operator 150 excess stress, leading to a misguided or non-accurate reaction.
  • Controller 110 may also be configured to randomly change the stimulations, or the sequence of stimulations, so as to prevent operator 150 from getting used to a repeated sequence.
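One way to realize this per-operator learning is to log reaction time and response quality per stimulation, then select the fastest stimulation whose quality remains acceptable. The data structures, the 0-to-1 quality scale, and the 0.7 cutoff below are illustrative assumptions.

```python
from collections import defaultdict
from statistics import mean

# stimulation id -> list of (reaction_time_s, quality in [0, 1]) observations
response_log = defaultdict(list)

def record_response(stimulation: str, reaction_s: float, quality: float) -> None:
    """Log one observed operator response to a given alarm stimulation."""
    response_log[stimulation].append((reaction_s, quality))

def most_effective(min_quality: float = 0.7) -> str:
    """Fastest mean reaction among stimulations with acceptable mean quality."""
    candidates = {
        stim: mean(r for r, _ in rows)
        for stim, rows in response_log.items()
        if mean(q for _, q in rows) >= min_quality
    }
    return min(candidates, key=candidates.get)

record_response("beep", 1.8, 0.9)
record_response("siren", 0.9, 0.4)     # fast but over-stressful: poor quality
record_response("flicker", 1.2, 0.8)
print(most_effective())                 # -> "flicker"
```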
  • Sequence activation of different alarm indicators may have the general purpose of creating an intensification of the sensory stimulation imposed upon operator 150.
  • the relative extremity of the various sensory stimulations is a first consideration in planning their sequence of activation.
  • the relative extremity of sensory stimulation may be evaluated, for example, according to the irritation level of the particular chosen stimulation (e.g., a mellow beeping sound vs. a loud siren).
  • each specific sensory stimulation may be assigned a "stimulatory score" in controller 110, for example according to the above-outlined parameters.
  • Various alarm sequence programs may be defined in controller 110 based on the stimulatory score of the different sensory stimulations, such as: a linear progression between sensory stimulations in correlation with linear increase in danger level; an exponential progression between sensory stimulations in correlation with linear increase in danger level; a progression of stimulations in a step intervals function in correlation with linear increase in danger level; a progression of stimulations directed towards the same sense, e.g., various sounds with progressing stimulatory scores; a progression of stimulations directed towards alternating senses; an accumulation of stimulations; and any combination of the above.
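A hedged sketch of such sequence programs, assuming illustrative stimulatory scores and a danger level normalized to the range 0 to 1; none of the numeric values are taken from the disclosure.

```python
# (stimulatory score, indicator), sorted by score; indicators are drawn from
# the examples in the text, the scores are illustrative assumptions.
STIMULATIONS = [
    (1, "visual message"), (2, "mellow beep"), (4, "flickering display"),
    (6, "loud alert"), (8, "chair vibration"), (10, "siren"),
]

def target_score(danger: float, profile: str = "linear") -> float:
    """Map a normalized danger level to a target stimulatory score."""
    max_score = STIMULATIONS[-1][0]
    if profile == "linear":
        return danger * max_score
    if profile == "exponential":
        return (2 ** (danger * 3) - 1) / 7 * max_score   # slow start, sharp end
    if profile == "steps":
        return max_score * (int(danger * 4) / 4)          # quarter-step intervals
    raise ValueError(profile)

def pick_indicator(danger: float, profile: str = "linear") -> str:
    """Strongest indicator whose score does not exceed the target score."""
    score = target_score(danger, profile)
    chosen = [name for s, name in STIMULATIONS if s <= score] or ["visual message"]
    return chosen[-1]

print(pick_indicator(0.3))                  # -> "mellow beep"
print(pick_indicator(0.3, "exponential"))   # milder at the same danger level
```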
  • An alarm indicator may also produce positive indications, such as an eye-catching visual message which appears upon display 112, which informs operator 150 what action is required in order to prevent the potential hazard and/or amend the detected defect.
  • FIG. 4 illustrates real-time-conditions indicators 380 located in the vicinity of operator 150, intended for increasing the sensory experience of operator 150 during the operation of crane 200, in addition to the imaging shown on display 112 and optionally in conjunction with the features (e.g., the target object) presented in the imaging, to imitate a sensory experience of an operator residing within a non-remote crane cockpit and enhance the tracking abilities of operator 150.
  • Real-time-conditions indicators 380 include wind-emitter/fan 382 and surround-speaker 384.
  • fan 382 includes a plurality of fans which are positioned at at least two different sides of operator 150.
  • surround-speaker 384 includes a plurality of speakers which are positioned at at least two different sides of operator 150.
  • Controller 110 receives data that is sensed or perceived by wind sensors 332 and sound sensors 334 (i.e., microphones) which are installed in the vicinity of crane 200, and regulates the activation, intensity, directionality and/or other characteristics of real-time-conditions indicators 380 such that they imitate the data received from the sensors.
  • When controller 110 receives audio data of real-time environmental noises (e.g., an audio signal emitted by a truck driving in reverse) from a microphone 334 disposed on the right-hand side of crane 200, where the right-hand side is defined relative to a theoretical operator sitting within a crane cockpit at the top of the crane, controller 110 may be operational to activate surround-speaker 384 to produce a corresponding audio signal. When there is a plurality of speakers, controller 110 may activate the speaker or speakers positioned on the corresponding right-hand side of operator 150. This creates a sensory experience for operator 150 which is similar to the sensory experience which operator 150 would have if he were actually located at the top of crane 200.
  • real-time environmental noises e.g., an audio signal emitted by a truck driving in reverse
  • controller 110 may be operational to activate surround speaker 384 to produce a corresponding audio signal.
  • controller 110 may activate the speaker or speakers which are positioned in a corresponding right-hand side of operator 150. This creates a sensory experience for
  • a similar process is carried out with fan 382, where the wind sensors 332 provide wind data to controller 110 (e.g., intensity and directionality of a gust of wind), and controller 110 activates the corresponding fan or fans to produce a sensory experience for operator 150 which corresponds to real-time conditions.
  • controller 110 may activate surround-speaker 384 to produce wind-blowing sounds when a substantial wind is blowing on crane 200. Creating a surrounding sensory experience for operator 150 which corresponds to the real-time conditions may increase operator 150's understanding of the situation of crane 200, and substantially enhance operator 150's awareness of potential threats or dangers to crane 200, even without controller 110 activating alarm indicators.
  • Real-time-conditions indicators 380 may also include additional elements such as a vibrating chair, for imitating vibrations of a non-remote cockpit due to strong wind or movements of the crane, and the like.
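The routing of side-specific sensor data to side-specific indicators can be sketched as a simple lookup table; the side convention and the device names are illustrative assumptions.

```python
# Route data from a sensor on a given side of crane 200 to the
# real-time-conditions indicator on the matching side of operator 150.
INDICATORS = {
    ("sound", "left"): "speaker_L", ("sound", "right"): "speaker_R",
    ("wind", "left"): "fan_L",      ("wind", "right"): "fan_R",
}

def drive_indicator(kind: str, crane_side: str, intensity: float) -> None:
    """Replay sensed conditions on the matching side of the remote cockpit."""
    device = INDICATORS[(kind, crane_side)]
    print(f"{device}: intensity {intensity:.2f}")   # stand-in for device control

# A reversing truck heard by a right-hand microphone 334:
drive_indicator("sound", "right", 0.6)
# A gust measured by a left-hand wind sensor 332:
drive_indicator("wind", "left", 0.8)
```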
  • the sensors installed on crane 200 are installed in the location on crane 200 which would be occupied by a non-remote crane cockpit, and are arranged such that they perceive the surrounding conditions as would be perceived by an operator within the non-remote cockpit. This option is particularly useful for a crane operator who is experienced in operating a crane from a non-remote cockpit, where the data collected by the sensors allows the real-time-conditions indicators 380 to produce a sensory experience corresponding to what the operator is used to from his experience in a non-remote cockpit.
  • the sensors may be distributed and arranged in any position in the vicinity of crane 200 which allows sensing wind, sounds, or any other relevant surrounding data which can enhance the ability of operator 150 to sense the conditions surrounding crane 200, even if the data sensed by the sensors would not be perceived by operator 150 if he were situated within the non-remote cockpit.
  • This allows operator 150 to acquire enhanced sensory indications which are not limited to the detecting abilities of his own senses, as is the case in a non-remote cockpit, and this may increase his understanding of the surrounding of crane 200 and help him better forecast potential hazards or dangers.
  • controller 110 may at first filter out, i.e., not express via real-time-conditions indicators 380, data which is beyond the scope of what an operator in a non-remote cockpit would pick up, and gradually add and express data which is received from more distant sensors, such that operator 150 can become accustomed to this broadening of sensory input.
  • Figures 5A to 5C are sequential live images of crane 200 shown upon display 112, with changing display settings in response to data received by controller 110, which is indicative of a potential defect or hazard for crane 200.
  • a visual display of real time imaging of a scene of interest is shown on display 112, the scene of interest including crane 200, where the contrast between crane 200 and its background environment is at a baseline level.
  • This baseline contrast level may be representative of a normal view of crane 200 as would be seen by an operator sitting within a non-remote crane cockpit, a contrast level which is suitable for easy tracking of the operation and motion of crane 200 within its background environment.
  • the contrast level of the image on display 112 is progressively augmented, serving as a visual sensory alarm stimulation of the operator, as is shown in Figure 5B.
  • the contrast between crane 200 and its background environment is increased in comparison to the baseline level shown in Figure 5A.
  • Increasing the contrast between crane 200 and the environment is intended to draw the attention of the operator to the potentially dangerous situation of crane 200 in several ways: by emphasizing the motion of crane 200 in relation to its general background; by causing a visual discomfort to the operator due to the unnaturally contrasted image; and/or by capturing the operator's attention due to the sudden change of contrast.
  • the contrast level of the entire image shown on display 112 may be adjusted.
  • the contrast of crane 200 may be increased, and optionally also the contrast of an object which presents a potential hazard for crane 200.
  • the contrast level may be adjusted gradually, i.e., continuously rising as the hazard increases and/or perseveres and continuously decreasing as the hazard potential decreases, or may be adjusted in leaps according to predefined step levels of hazard increase/hazard time extent. Contrast adjustment by leaps may be more effective in capturing the attention of the operator. If controller 110 detects that the operator has not responded to the increase in contrast by taking action to avert the potential danger and/or amend the defect, controller 110 may further increase the contrast of the image on display 112, as shown in Figure 5C.
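The two adjustment policies, gradual and by leaps, can be sketched as follows, assuming a normalized hazard level and illustrative baseline and maximum contrast values.

```python
BASELINE_CONTRAST = 1.0   # normal view, as in Figure 5A (illustrative value)
MAX_CONTRAST = 3.0        # strongest augmentation, as in Figure 5C (illustrative)

def contrast_gradual(hazard: float) -> float:
    """Continuously rise and fall with the hazard level (0..1)."""
    return BASELINE_CONTRAST + hazard * (MAX_CONTRAST - BASELINE_CONTRAST)

def contrast_stepped(hazard: float, steps: int = 4) -> float:
    """Jump between predefined contrast levels at hazard step intervals."""
    level = round(hazard * steps) / steps
    return BASELINE_CONTRAST + level * (MAX_CONTRAST - BASELINE_CONTRAST)

for h in (0.0, 0.3, 0.6, 0.9):
    print(f"hazard {h:.1f}: gradual {contrast_gradual(h):.2f}, "
          f"stepped {contrast_stepped(h):.2f}")
```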
  • Controller 110 may apply other changes to the display settings of the image on display 112, as a visual alarm stimulation of the operator and/or to enhance the focus on crane 200, the display settings including, for example: brightness, sharpness, shading, coloring strength, and coloring heat (where hot coloring is substantially yellow, orange and red, and cold coloring is substantially green, blue, indigo and violet), and the like.
  • Each of these display characteristics may be augmented or intensified as a hazard or defect increases/perseveres, and may be reduced or moderated as the risk of hazard is alleviated or the defect remedied.
  • FIG. 6A illustrates an image of crane 200 presented on display 112, with added augmented reality features 410.
  • Augmented reality features 410 include arrow 412, destination target 414, and parallel border line 416.
  • Augmented reality features 410 appear upon display 112 in addition to the live imaging provided by an image sensor 220, such that the image in which operator 150 tracks crane 200 includes a combination of real physical features from the surroundings of crane 200, and virtual features which are not actually in the surroundings of crane 200.
  • Features 410 may appear on display 112 according to definitions which are preset in controller 110, or may be created or initiated by operator 150.
  • controller 110 may be preset to recognize jib 204 of crane 200 in a live imaging, and to create a parallel border line 416 at a predetermined distance above and below the length of jib 204.
  • Border line 416 may serve as a perimeter, such that operator 150 attempts to ensure that line 416 does not overlap with surrounding buildings or other physical features which appear in display 112, so as to prevent a collision of jib 204 with the surrounding buildings etc.
  • Arrow 412 and destination target 414 may be marked on display 112 by operator 150 according to each specific crane operation, arrow 412 representing the trajectory which jib 204 is intended to follow so as to reach destination target 414. Any other elements which improve the control and tracking of crane 200 may also be added.
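A simplified two-dimensional sketch of border line 416 and its overlap test, assuming a horizontal jib in image coordinates; the margin value and all names are illustrative assumptions.

```python
# Parallel border lines a fixed distance above and below the detected jib,
# plus a check flagging overlap with a building outline in the frame.
def jib_border_lines(jib_y_px: float, margin_px: float = 40.0):
    """Border lines above and below a (horizontal) jib in image coordinates."""
    return jib_y_px - margin_px, jib_y_px + margin_px

def border_overlaps_building(jib_y_px: float, building_top_y_px: float,
                             margin_px: float = 40.0) -> bool:
    """True if the lower border line dips into a building's bounding box."""
    _, lower = jib_border_lines(jib_y_px, margin_px)
    return lower >= building_top_y_px   # image y grows downward

print(border_overlaps_building(jib_y_px=200.0, building_top_y_px=230.0))  # True
```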
  • Virtual reality settings 440 include building-like pillars 442 and protruding trees 444.
  • Virtual reality settings 440 replace the live imaging which is provided by an image sensor (220) with virtual images, but the virtual images may be based on the live imaging and may correspond with features and/or sections of the live imaging.
  • Building-like pillars 442 and protruding trees 444 are elements which are created virtually and correspond to real buildings and trees, respectively, which are recorded by an image sensor 220 in the vicinity of crane 200.
  • Pillars 442 and trees 444 may be manually added to display 112 by operator 150, or may be added automatically by controller 110 according to preset definitions of analyzing the live imaging data provided by an image sensor (220).
  • An advantage of replacing the real physical live imaging with the virtual pillars 442 and trees 444 within virtual reality settings 440 is that this allows presenting only the features of the surroundings of crane 200 which are relevant for operator 150, i.e., those of which operator 150 needs to keep track or has to watch out for.
  • the settings of the pillars 442 and trees 444 may be set by operator 150, or may be predefined in controller 110 according to settings which are predetermined to be most effective in representing these features when appearing in the crane vicinity, e.g., buildings and trees. Any other virtual reality features may be added to virtual reality settings 440 which assist operator 150 in the tracking and control of crane 200. In addition, any combination of augmented reality and virtual reality, or alternation between the two, may be applied, according to the selection of operator 150 and/or according to a predefined program of controller 110.
  • Figure 7 is a block diagram of a method for remote tracking of a crane, operative in accordance with an embodiment of the present invention.
  • an image of a target object of particular interest is retrieved by at least one image sensor.
  • the image sensor, linked to tracking software and hardware, continuously tracks and visually records the target object of interest, capturing live imaging thereof.
  • the target object may include a selected crane component, such as the crane hook, a load carried by the crane, a counterweight, safety pins, and the like, or may include elements in the crane surroundings, such as a load landing spot, a display of a scenery of interest, a view of the crane surroundings as captured by an image sensor disposed at a cockpit-location on the at least one crane, etc.
  • the target object may further include a safety weight sensor, which is operational for measuring the weight and/or distance of a crane load, and for indicating a safety threshold check regarding a maximal load and/or maximal spatial positioning of a load.
  • the tracking may include automatically centering and focusing the image sensor on the target object, either continuously, intermittently according to a predefined trigger event like a weight reading exceeding a threshold or a time of day, or when triggered to do so by the operator.
  • the image sensor may be predefined to maintain the target object anywhere within the image frame.
  • the image sensor may also be mounted on a hopping mobile apparatus, such as a UAV.
  • a tracking device allows continuous tracking of the target object, not necessarily in correlation with the live-imaging recorded by the image sensors.
  • the tracking device may be operational, for example, to allow automatic recognition of the target object of interest within data provided by the image sensors (i.e., live imaging), or to determine and provide to a controller the absolute physical location of the target object.
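The automatic centering described above can be sketched as a proportional pan/tilt correction toward the frame center; the frame size and gain are illustrative assumptions.

```python
# Compute the target's offset from the frame center and convert it into
# pan/tilt nudges for the image sensor. Gains and names are illustrative.
FRAME_W, FRAME_H = 1920, 1080
GAIN_DEG_PER_PX = 0.02

def pan_tilt_correction(target_px):
    """Pan/tilt step (degrees) that moves the target toward the frame center."""
    dx = target_px[0] - FRAME_W / 2
    dy = target_px[1] - FRAME_H / 2
    return dx * GAIN_DEG_PER_PX, dy * GAIN_DEG_PER_PX

pan, tilt = pan_tilt_correction((1400, 300))
print(f"pan {pan:+.1f} deg, tilt {tilt:+.1f} deg")   # positive pan: swing right
```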
  • a live imaging of a target object of crane 200 or of a component thereof is captured by at least one image sensor 220.
  • Image sensor 220 may be positioned on crane mast 202, at an end of jib 204, on hoist-and-hook 206, or any other component of crane 200 or a structure in the vicinity thereof.
  • UAV 226, including image sensor 221, may also be utilized.
  • Tracking device 222 is mounted on the target object to allow tracking by image sensor 220, the target object including, for example, counterweight 210, hoist-and-hook 206, or the distal end of jib 204.
  • the image of the target object is provided to the controller, which receives the image data provided by the at least one image sensor and presents the live imaging on the display.
  • the controller may receive image data from a plurality of image sensors and may examine the image data according to predefined analyses to determine which image data to present on the display.
  • the controller may receive data from a variety of sensors, and compute an ongoing situation or condition of the at least one crane or of any one of the crane's components.
  • the controller may be predefined, according to the ongoing analyses/computations the controller performs, to adjust the live imaging which is being presented on the display, to supply a variety of indications to the operator, and/or to autonomously control/activate/deactivate different crane components or systems.
  • controller 110 receives the imaging data provided by image sensors 220, and presents a corresponding image on display 112. Controller 110 may receive data from a plurality of image sensors 220 and examine the provided data according to predefined analyses, for example in order to identify a selected target object within the imaging data. Controller 110 may also receive indications from a variety of tracking sensors 230, including for example wind sensor 232, vibration/motion sensor 234, tension sensor 235, and radar sensor 236. Controller 110 may further receive signals from mutual detectors 240, which may be mounted on a component of crane 200 and a corresponding surrounding feature.
  • Controller 110 may be operational to analyze the situation of crane 200 according to the signals received from image sensors 220, tracking sensors 230 and mutual detectors 240, according to which it presents a selected imaging on display 112, the imaging retrieved from at least one of image sensors 220. Controller 110 may further supply real-time conditions indications to operator 150 via real-time indicators 380. Controller 110 may automatically control/activate/deactivate different components or systems of crane 200 when needed, e.g., when an imminent danger is identified.
  • the retrieved image/live imaging is displayed on the display to the operator located in a remote crane cockpit.
  • the target object of interest is maintained in a preferred position relative to the image by virtue of adjusting the position of the image sensor, such that the selected crane component appears to be floating at the preferred position within a dynamic scenery.
  • the preferred position of the selected object of interest may be selected by the operator or predefined in the controller, and may be defined relative to the image frame, i.e., relative to the viewing of the image sensor, or defined relative to the display.
  • the preferred position may be selected so as to enhance the tracking ability of the operator, and/or may be intended to maintain the operator in a healthy body posture.
  • AR/VR features may be combined with the displayed image, for providing data, and pointing and marking items and locations of interest, e.g., a load landing spot, which may substantially enhance the tracking and controlling abilities of the operator.
  • controller 110 displays a live image on display 112, which is based on the visual data supplied by image sensors 220. Operator 150 may select a preferred position on display 112, and controller 110 may correspondingly adjust the position of image sensors 220 and/or the settings of the image presented on display 112 in order to maintain the selected target object in the preferred position during the crane operation session.
  • the preferred position may optionally be predefined in controller 110.
  • AR features 410 and/or VR features 440 may be combined with the image on display 112 for enhancing the tracking and controlling abilities of operator 150.
  • In procedure 508, which is a sub-procedure of procedure 504, an image captured by one of the image sensors is burst onto the display when the controller identifies the object of interest in the captured image.
  • the controller may be predefined to burst the image onto the display whenever it identifies the target object in a captured image, or an improved view of the object relative to the view currently presented on the display. Additionally or alternatively, the controller may receive instructions from the operator during the crane operating session, which define under what conditions to burst an image onto the display, and with regard to identification of which crane component.
  • controller 110 may burst an image onto the display when it identifies an imminent danger or a substantial defect which is related to the bursting image.
  • controller 110 may burst an image captured by at least one of image sensors 220 onto display 112, when an object of interest, or an imminent danger correlated to the bursting image, is identified by controller 110.
  • the bursting image may take up the whole display area of display 112, or may take up a portion of the screen, e.g., one of screens 114A-114D, 116A-116B.
  • In procedure 510, which is a sub-procedure of procedure 506, the controller provides real-time conditions indications for the operator, which are based on data collected from the vicinity of the crane, optionally from the crane-cockpit location.
  • the real-time conditions indications may include, for example, sounding real-time environmental noises of the crane to the operator, as heard and perceived by a microphone at a cockpit-location of the at least one crane, using a speaker located in the remote crane cockpit.
  • real time indicators 380 convey real-time indications provided to controller 110 by sensors installed in the vicinity of crane 200, such as wind sensor 332 and sound-sensors (microphones) 334, to operator 150 residing in the remote crane cockpit, to produce a comprehensive sensory experience which imitates the sensory experience of an operator in a non-remote crane cockpit.
  • In procedure 512, which is a sub-procedure of procedure 506, an image perceived by a particular image sensor is presented on the display as a principal display, capturing the principal area of the display, and an image perceived by at least one other image sensor is displayed as a secondary display, capturing a secondary area of the display.
  • the controller may play the sound of the crane surroundings for the operator, which sound is captured by a microphone located in the vicinity of the particular image sensor of the principal display, or in the vicinity of the crane component which the particular image sensor is intended to visually record.
  • the particular image sensor may be selected to be presented on the principal display when the target object is identified as best seen/closest to the particular sensor.
  • the principal display may be differentiated over the at least one secondary display in the aspect of size, i.e., the principal display taking up a larger portion of the display than each of the secondary displays.
  • the principal display may be differentiated over a secondary display in the aspect of image settings, such that the image shown on the principal display is accentuated relative to the image displayed on the secondary display(s), which may assist the operator in focusing on the principal display.
  • the principal display may overlay secondary displays, or the principal display may flash.
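A hedged sketch of selecting and accentuating the principal display, assuming a simple closest-visible-camera heuristic; the camera names, the scoring rule, and the settings values are illustrative assumptions.

```python
def pick_principal(views):
    """views: {camera_id: (target_visible, distance_to_target_m)}."""
    visible = {cam: dist for cam, (ok, dist) in views.items() if ok}
    return min(visible, key=visible.get)   # closest camera that sees the target

def display_layout(views):
    """Assign principal/secondary areas and accentuate the principal feed."""
    principal = pick_principal(views)
    layout = {cam: {"area": "secondary", "brightness": 0.7} for cam in views}
    layout[principal] = {"area": "principal", "brightness": 1.0}  # accentuated
    return layout

views = {"cam_mast": (True, 35.0), "cam_trolley": (True, 8.0),
         "cam_uav": (False, 5.0)}
print(display_layout(views)["cam_trolley"])   # -> principal, full brightness
```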
  • major screen 116A constitutes the principal display, capturing the principal area of display 112, and presents an image perceived by a particular image sensor 220.
  • secondary display 116B constitutes a secondary display, capturing a secondary area of display 112, and presents an image perceived by at least one other image sensor 220.
  • At least one of screens 114A-114D is rendered an optimal settings screen by controller 110, either according to predefined settings or upon selection by operator 150.
  • Controller 110 may play the sound of the surroundings of crane 200 to operator 150, which sound is captured by microphone 334 which is correlated with image sensor 220 of the principal display (116A).
  • FIG. 8 depicts flow chart 600 of the processing steps employed in a method for remote tracking of a crane, in accordance with a second embodiment.
  • In step 602, a panning, motion-tracking camera captures motion picture segments of a task under execution responsively to a trigger event, as noted above. It should be appreciated that when the camera is in a non-panning or non-tracking state, captured content can still form part of the motion picture segment, in accordance with configuration choices.
  • In step 604, all captured motion picture segments are merged into a single motion picture in controller 110.
  • In step 606, the single motion picture is displayed to a crane operator at a remote location separate from the crane or cranes he is operating. The display is implemented in real time to advantageously provide a single overall motion picture facilitating situational awareness.
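A minimal sketch of the merge of step 604, assuming each trigger event opens a timestamped segment of frames; the segment representation is an assumption for illustration.

```python
def merge_segments(segments):
    """segments: [(start_time_s, [frames])] -> one chronologically ordered list."""
    merged = []
    for _, frames in sorted(segments, key=lambda seg: seg[0]):
        merged.extend(frames)
    return merged

segments = [
    (12.0, ["f12a", "f12b"]),      # segment opened by a later trigger event
    (3.5, ["f3a", "f3b", "f3c"]),  # segment opened by an earlier trigger event
]
print(merge_segments(segments))    # -> ['f3a', 'f3b', 'f3c', 'f12a', 'f12b']
```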


Abstract

A crane worksite tracking system employing a plurality of panning, motion-tracking cameras responsive to various trigger events to capture motion picture segments of task activity within a crane worksite, and to display the motion picture segments as a single, merged motion picture to a remote operator on a single display.
PCT/IL2023/050415 2022-04-25 2023-04-24 Suivi de grue à distance WO2023209705A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
IL292503A IL292503B1 (en) 2022-04-25 2022-04-25 Remote crane tracking
IL292503 2022-04-25
US202363480174P 2023-01-17 2023-01-17
US63/480,174 2023-01-17

Publications (1)

Publication Number Publication Date
WO2023209705A1 true WO2023209705A1 (fr) 2023-11-02

Family

ID=88518044

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2023/050415 WO2023209705A1 (fr) 2022-04-25 2023-04-24 Suivi de grue à distance

Country Status (1)

Country Link
WO (1) WO2023209705A1 (fr)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130299440A1 (en) * 2012-05-10 2013-11-14 Dale Hermann Crane collision avoidance
CN203373016U (zh) * 2013-05-03 2014-01-01 中冶东方工程技术有限公司 A crane hook monitoring and tracking system
CN106516984A (zh) * 2016-12-29 2017-03-22 深圳大学 Unmanned tower crane control system based on a wireless communication network, and implementation method thereof
US20170369288A1 (en) * 2016-06-22 2017-12-28 The Boeing Company Systems and methods for object guidance and collision avoidance
US20180229978A1 (en) * 2013-04-11 2018-08-16 Liebherr-Components Biberach Gmbh Remote-controlled crane
US20190337771A1 (en) * 2018-05-04 2019-11-07 Rowan Companies, Inc. System and Method for Remote Crane Operations on Offshore Unit
US20200048052A1 (en) * 2017-04-03 2020-02-13 Cargotec Patenter Ab Driver assistance system and a method
CN214270032U (zh) * 2021-01-18 2021-09-24 江苏省特种设备安全监督检验研究院 Crane hoisting operation safety monitoring and early-warning system
WO2021229576A2 * 2020-05-14 2021-11-18 Ultrawis Ltd. Systems and methods for remote control and automation of a tower crane


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23795783

Country of ref document: EP

Kind code of ref document: A1