WO2023100533A1 - Image display system, remote operation support system, and image display method - Google Patents


Info

Publication number
WO2023100533A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
index
image display
display system
working
Prior art date
Application number
PCT/JP2022/039550
Other languages
English (en)
Japanese (ja)
Inventor
卓 伊藤
洋一郎 山崎
誠司 佐伯
佑介 上村
友鷹 三谷
Original Assignee
コベルコ建機株式会社 (Kobelco Construction Machinery Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コベルコ建機株式会社
Publication of WO2023100533A1

Classifications

    • E — FIXED CONSTRUCTIONS
    • E02 — HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02F — DREDGING; SOIL-SHIFTING
    • E02F9/00 — Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/26 — Indicating devices
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 — Manipulating 3D models or images for computer graphics

Definitions

  • the present invention relates to a technique for supporting remote control by an operator of a work machine such as a hydraulic excavator.
  • In this conventional technique, the position of the work device is represented by an image formed along the terrain. Because unnecessary positional information other than the predetermined part of the work device is displayed at the same time, it is difficult to grasp the position of the work device with respect to the work target, which may make it difficult to perform the work efficiently.
  • Accordingly, an object of the present invention is to provide a system and the like that can improve the accuracy with which an operator recognizes the positional relationship between a working mechanism constituting a work machine and a target object existing around the work machine.
  • The image display system of the present invention superimposes, on a working environment image representing the situation of a working mechanism constituting a work machine and a target object existing around the work machine, an index image indicating a second index point obtained as a result of projecting a first index point on the working mechanism onto the surface of the target object.
  • Through the working environment image output to the output interface of the remote operation device and the index image superimposed on it, the operator can grasp the position of the working mechanism constituting the work machine and of the second index point on the surface of the target object.
  • The second index point is the result of projecting the first index point onto the surface of the target object and does not follow the surface shape of the target object. It is therefore possible to avoid providing the operator with unnecessary positional information about the working mechanism other than the first index point, thereby improving the accuracy with which the operator recognizes the positional relationship between the working mechanism and the target object.
  • FIG. 1 is an explanatory diagram of the configuration of the composite image display system and the image display system; FIG. 2 is an explanatory diagram of the configuration of the remote operation device; FIG. 3 is an explanatory diagram of the configuration of the work machine; FIG. 4 is an explanatory diagram of the functions of the image display system.
  • FIG. 5 is an explanatory diagram of one display form of the work environment image and the index image; FIGS. 6A to 6C are explanatory diagrams of the relationship between the arm top position and the display mode of the index image.
  • FIGS. 7A to 7C are explanatory diagrams of the relationship between the displacement mode of the working mechanism and the directivity of the index image.
  • Explanatory diagrams of the relationship between the posture mode of the working mechanism and the directivity of the index image.
  • Explanatory diagrams of a display form of a three-dimensional index image.
  • Explanatory diagrams of the relationship between the displacement mode of the working mechanism and the position of the first index point.
  • FIG. 11 is an explanatory diagram of another display form of the work environment image and the index image.
  • Explanatory diagrams of the relationship between the operation mode of the working mechanism and the display mode of the index image.
  • Explanatory diagrams of the relationship between the displacement mode of the working mechanism and the position of the first index point.
  • The composite image display system shown in FIG. 1 includes an image display system 10, a remote operation device 20, and a work machine 40 that is remotely operated through the remote operation device 20.
  • the image display system 10, the remote controller 20 and the work machine 40 are configured to be able to communicate with each other through a network.
  • the mutual communication network of image display system 10 and remote control device 20 and the mutual communication network of image display system 10 and work machine 40 may be the same or different.
  • the image display system 10 is configured by a computer that exists separately from the remote control device 20 and the work machine 40, and includes a database 102, a communication function element 121, and an image processing function element 122.
  • the database 102 stores and holds captured image data and the like.
  • the database 102 may be composed of a database server capable of mutual communication with the image display system 10 .
  • Each functional element is constituted by an arithmetic processing device (for example, a single-core processor, a multi-core processor, or a processor core constituting one), which reads necessary data and software from a storage device such as a memory and executes the arithmetic processing described later on the data in accordance with the software.
  • The remote operation device 20 includes a remote control device 200, a remote input interface 210, a remote output interface 220, and a remote wireless communication device 224.
  • The remote control device 200 is constituted by an arithmetic processing device (for example, a single-core processor, a multi-core processor, or a processor core constituting one), which reads necessary data and software from a storage device such as a memory and executes the arithmetic processing according to the software on the data.
  • the remote input interface 210 includes a remote control mechanism 211.
  • Remote output interface 220 includes a remote image output device 221 .
  • the remote control mechanism 211 includes a traveling operating device, a turning operating device, a boom operating device, an arm operating device, and a bucket operating device.
  • Each operating device has an operating lever that receives a rotating operation.
  • An operating lever (running lever) of the operating device for running is operated to move the lower running body 410 of the work machine 40 .
  • the travel lever may also serve as a travel pedal.
  • A travel pedal fixed to the base or lower end of the travel lever may also be provided.
  • An operation lever (swing lever) of the swing operation device is operated to move a hydraulic swing motor that constitutes the swing mechanism 430 of the work machine 40 .
  • An operating lever (boom lever) of the boom operating device is operated to move the boom cylinder 442 of the work machine 40 .
  • An operating lever (arm lever) of the arm operating device is operated to move the arm cylinder 444 of the work machine 40 .
  • An operating lever (bucket lever) of the bucket operating device is operated to move the bucket cylinder 446 of the work machine 40.
  • each control lever that constitutes the remote control mechanism 211 is arranged around the seat St on which the operator sits.
  • The seat St may be in the form of a high-back chair with armrests, a low-back chair without a headrest, or a chair without a backrest.
  • a pair of left and right travel levers 2110 corresponding to the left and right crawlers are arranged side by side in front of the seat St.
  • One operating lever may serve as a plurality of operating levers.
  • For example, the left operation lever 2111 provided in front of the left frame of the seat St shown in FIG. 2 functions as an arm lever when operated in the longitudinal direction and as a swing lever when operated in the lateral direction.
  • Likewise, the right operation lever 2112 provided in front of the right frame of the seat St shown in FIG. 2 may function as a boom lever when operated in the longitudinal direction and as a bucket lever when operated in the lateral direction.
  • the lever pattern may be arbitrarily changed by an operator's operation instruction.
  • The remote image output device 221 consists of a central remote image output device 2210, a left remote image output device 2211, and a right remote image output device 2212, which have substantially rectangular screens arranged in front of, diagonally forward left of, and diagonally forward right of the seat St, respectively.
  • the shapes and sizes of the respective screens (image display areas) of the central remote image output device 2210, the left remote image output device 2211 and the right remote image output device 2212 may be the same or different.
  • The left remote image output device 2211 is tilted such that its screen and the screen of the central remote image output device 2210 form an inclination angle θ1 (for example, 120° ≤ θ1 ≤ 150°), with the right edge of the output device 2211 adjacent to the left edge of the central remote image output device 2210.
  • The right remote image output device 2212 is tilted such that its screen and the screen of the central remote image output device 2210 form an inclination angle θ2 (for example, 120° ≤ θ2 ≤ 150°), with the left edge of the output device 2212 adjacent to the right edge of the central remote image output device 2210.
  • The inclination angles θ1 and θ2 may be the same or different.
  • the respective screens of the central remote image output device 2210, the left remote image output device 2211, and the right remote image output device 2212 may be parallel to the vertical direction or may be inclined with respect to the vertical direction. At least one of the central remote image output device 2210, the left remote image output device 2211 and the right remote image output device 2212 may be composed of a plurality of divided image output devices.
  • the central remote image output device 2210 may comprise a pair of vertically adjacent image output devices having substantially rectangular screens.
  • The remote image output device 221 may be composed of a single image output device curved or bent so as to surround the seat St.
  • a single image output device may be constituted by, for example, the central remote image output device 2210 .
  • the remote image output device 221 may consist of two image output devices (eg, a central remote image output device 2210 and a left remote image output device 2211 or a right remote image output device 2212).
  • the working machine 40 includes a real machine control device 400 , a real machine input interface 41 , a real machine output interface 42 , a real machine wireless communication device 422 , and a working mechanism 440 .
  • The real machine control device 400 is constituted by an arithmetic processing device (a single-core processor, a multi-core processor, or a processor core constituting one), which reads necessary data and software from a storage device such as a memory and executes the arithmetic processing according to the software on the data.
  • The work machine 40 is, for example, a crawler excavator (construction machine) and, as shown in FIG. 3, includes a lower traveling body 410 and an upper swing body 420.
  • a cab 424 (driver's cab) is provided on the front left side of the upper swing body 420 .
  • a work mechanism 440 is provided in the front central portion of the upper swing body 420 .
  • the real machine input interface 41 includes a real machine operating mechanism 411 , a real machine imaging device 412 , a real machine distance measuring device 414 , and a real machine sensor group 416 .
  • the actual machine operating mechanism 411 includes a plurality of operating levers arranged around a seat arranged inside the cab 424 in the same manner as the remote operating mechanism 211 .
  • the cab 424 is provided with a drive mechanism or a robot that receives a signal corresponding to the operation mode of the remote control lever and moves the actual machine control lever based on the received signal.
  • the actual machine imaging device 412 is installed inside the cab 424, for example, and images the environment including at least part of the working mechanism 440 (for example, the attachment 445) through the front window.
  • The real machine distance measuring device 414 is a device for measuring the real space distance to a target object existing around the work machine 40, and thereby its real space position, and is composed of, for example, a LiDAR and/or a TOF sensor.
  • The real machine sensor group 416 consists of various sensors for measuring the operating state of the work machine 40, including a swing angle sensor for measuring the swing angle of the upper swing body 420 with respect to the lower traveling body 410 and an attitude angle sensor for measuring the attitude angle representing the attitude of the working mechanism 440.
  • the real machine output interface 42 includes a real machine wireless communication device 422 .
  • The working mechanism 440 includes a boom 441 attached to the upper swing body 420 so as to be able to rise and fall, an arm 443 rotatably connected to the tip of the boom 441, and an attachment 445 (for example, a bucket) rotatably connected to the tip of the arm 443.
  • the working mechanism 440 is equipped with a boom cylinder 442, an arm cylinder 444, and a bucket cylinder 446, which are configured by telescopic hydraulic cylinders.
  • the boom cylinder 442 is interposed between the boom 441 and the upper slewing body 420 so as to expand and contract when supplied with hydraulic oil and rotate the boom 441 in the hoisting direction.
  • the arm cylinder 444 is interposed between the arm 443 and the boom 441 so as to expand and contract when supplied with hydraulic oil to rotate the arm 443 about the horizontal axis with respect to the boom 441 .
  • the bucket cylinder 446 is interposed between the attachment 445 and the arm 443 so as to expand and contract when supplied with hydraulic oil to rotate the attachment 445 with respect to the arm 443 about the horizontal axis.
  • FIG. 4 is a flowchart for explaining the functions of the composite image display system and the image display system having the above configuration.
  • In the flowchart, a block labeled "C" followed by a number (for example, "C10") is used to simplify the description; it represents transmission and/or reception of data and denotes a conditional branch in which the processing in the branch direction is executed on the condition that the data is transmitted and/or received.
  • the flowchart is repeated for each control cycle, and after reaching "END", the process returns to "START" and the subsequent processes are executed.
  • An environment confirmation request operation is, for example, an operation such as a tap on the remote input interface 210 by which the operator designates the work machine 40 intended for remote operation. If the determination result regarding this operation is negative (FIG. 4/STEP 210 ... NO), the process returns to START. If the determination result is affirmative (FIG. 4/STEP 210 ... YES), an environment confirmation request is transmitted to the image display system 10 through the remote wireless communication device 224 (FIG. 4/STEP 211).
  • the communication function element 121 transmits the environment confirmation request to the relevant work machine 40 (FIG. 4/C10).
  • the environment confirmation request may be transmitted to work machine 40 without going through image display system 10 .
  • In the work machine 40, the real machine imaging device 412 acquires a captured image including the work object Obj (for example, the ground, earth and sand, materials, and/or buildings), the real machine distance measuring device 414 acquires a three-dimensional image of the work object Obj, and three-dimensional image data representing the three-dimensional image is transmitted to the image display system 10 through the real machine wireless communication device 422 (FIG. 4/STEP 410).
  • a three-dimensional image is an image having the direction and distance to the work object Obj or the real space position of the work object Obj acquired through the actual rangefinder 414 .
  • A "real space position" is defined by coordinate values (for example, latitude, longitude, and altitude) in a real space coordinate system or by coordinate values in a real machine coordinate system (a coordinate system whose position and orientation are fixed with respect to the work machine 40).
  • The real space position of each point forming the point group on the surface of the work object Obj corresponding to each pixel of the three-dimensional image is included as the pixel value of that pixel.
  • The three-dimensional image data may be acquired and transmitted as a combination of data of a captured image obtained through the real machine imaging device 412 (or of a model image equivalent thereto) and distance or real space position data obtained through the real machine distance measuring device 414.
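The notion of a three-dimensional image whose pixel values are real space positions can be sketched as follows. This is a minimal illustration only, not the patent's implementation: the depth-image format and the pinhole-camera intrinsics (fx, fy, cx, cy) are assumptions introduced for the example.

```python
def depth_to_points(depth, fx, fy, cx, cy):
    """Convert a depth image (meters per pixel, rows of floats) from a
    rangefinder into per-pixel 3-D positions in camera coordinates, so
    that each pixel of the "three-dimensional image" carries a real-space
    point as its pixel value. Assumes a pinhole camera model."""
    points = {}
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z is None or z <= 0:
                continue  # no rangefinder return at this pixel
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points[(u, v)] = (x, y, z)
    return points
```

The resulting mapping from pixel (u, v) to a 3-D point is one concrete way to hold the "point group on the surface of the work object" as pixel values.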
  • The captured image including at least the work object Obj need not be acquired by the real machine imaging device 412; it may instead be acquired through an imaging device installed around the work machine 40, an imaging device mounted on an unmanned aerial vehicle, and/or an imaging device of equipment carried by a field worker.
  • the distance or real space position which is the pixel value of the three-dimensional image, may be obtained through a rangefinder installed around work machine 40 and/or a rangefinder mounted on an unmanned aerial vehicle.
  • A stereo camera (a pair of left and right real machine imaging devices 412) mounted on the work machine 40 may also be used to acquire the captured image and the three-dimensional image of the work object Obj.
  • In the image display system 10, when the three-dimensional image data is received by the communication function element 121 (FIG. 4/C11), the image processing function element 122 transmits work environment image data corresponding to the three-dimensional image data to the remote operation device 20 (FIG. 4/STEP 110).
  • The work environment image data is image data representing a work environment image, which includes the captured image data itself that is the basis of the three-dimensional image data (not including the real space position and distance information as pixel values), as well as simulated image data generated based on the captured image data.
  • When the remote operation device 20 receives the work environment image data through the remote wireless communication device 224 (FIG. 4/C21), the remote control device 200 outputs the work environment image corresponding to the work environment image data to the remote image output device 221 (FIG. 4/STEP 212).
  • For example, a work environment image showing the earth and sand that is the target object Obj is output to the remote image output device 221.
  • The real machine sensor group 416 acquires the real space position of the first index point p1 of the working mechanism 440, and data representing the real space position of the first index point p1 is transmitted to the image display system 10 through the real machine wireless communication device 422 (FIG. 4/STEP 412).
  • The process of transmitting the three-dimensional image data (see FIG. 4/STEP 410) and the process of transmitting the data representing the real space position of the first index point p1 (see FIG. 4/STEP 412) may be executed simultaneously as a single process of transmitting a set of data.
  • a point corresponding to the tip (arm top) of the arm 443 is defined as the first index point p1 .
  • The real space position of the first index point p1 defined on the working mechanism 440 is calculated by forward kinematics based on the output signal of the attitude angle sensor constituting the real machine sensor group 416 mounted on the work machine 40 and on the sizes of the components of the working mechanism 440.
  • The attitude angle sensor is configured to output signals corresponding to at least part of the hoisting angle of the boom 441 with respect to the upper swing body 420, the rotation angle of the arm 443 at the joint with the boom 441, and the rotation angle of the attachment 445 at the joint with the arm 443. An arbitrary point on the boom 441, the arm 443, or the attachment 445 constituting the working mechanism 440 may be defined as the first index point p1.
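As a rough illustration of computing the arm-top position by forward kinematics from joint angles and component sizes, here is a planar (2-D) sketch. The link lengths, angle conventions, and the restriction to the boom-arm plane are simplifying assumptions for the example, not the patent's actual method.

```python
import math

def arm_top_position(boom_len, arm_len, boom_angle, arm_angle, base=(0.0, 0.0)):
    """Planar forward kinematics for the boom-arm chain.
    boom_angle: hoisting angle of the boom from horizontal (radians).
    arm_angle: rotation of the arm relative to the boom direction (radians).
    Returns the (x, z) position of the arm top (tip of the arm)."""
    bx, bz = base
    # position of the boom tip (boom-arm joint)
    jx = bx + boom_len * math.cos(boom_angle)
    jz = bz + boom_len * math.sin(boom_angle)
    # arm top: extend from the joint along the arm's absolute direction
    ax = jx + arm_len * math.cos(boom_angle + arm_angle)
    az = jz + arm_len * math.sin(boom_angle + arm_angle)
    return ax, az
```

With both angles zero, a 2 m boom and 1 m arm put the arm top 3 m ahead of the base, matching the intuition that the chain simply extends.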
  • The real space position of the first index point p1 may instead be recognized by the real machine control device 400 based on the three-dimensional image. In this case, image analysis processing (grayscaling processing, edge extraction processing, and/or pattern matching processing, etc.) is applied to the three-dimensional image to recognize one or more pixels corresponding to the first index point p1, and the pixel value of those pixels or the average value of the pixel values is recognized as the real space position of the first index point p1. When the real space position of the first index point p1 is obtained both from the sensors and from the three-dimensional image, one may be corrected based on the other.
  • the real space position of the second index point p 2 is recognized by the image processing functional element 122 based on the real space position or three-dimensional shape of each point forming the point group on the surface of the target object Obj (FIG. 4/STEP 112 ).
  • the second index point p2 is a point resulting from projection of the first index point p1 onto the surface of the target object Obj.
  • the projection direction of the first index point p 1 with respect to the surface of the target object Obj is, for example, the vertical direction.
  • When the work machine 40 is tilted with respect to the vertical axis of real space, the projection direction of the first index point p1 with respect to the surface of the target object Obj may be defined in a direction that is similarly oblique with respect to the vertical axis.
  • the tilt angle of the work machine 40 with respect to the vertical axis is measured by a body tilt angle sensor (for example, a gyro sensor) that constitutes the actual machine sensor group 416 .
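One simple way to realize the vertical projection of the first index point onto a surface represented by a point group is sketched below. The patent does not specify a projection algorithm; the horizontal neighborhood radius is a hypothetical parameter of this sketch.

```python
def project_onto_surface(p1, cloud, radius=0.25):
    """Project p1 = (x, y, z) vertically downward onto a surface described
    by a point cloud: among cloud points whose horizontal distance to p1
    is within `radius`, take the highest and place the second index point
    at p1's horizontal position at that surface height."""
    x1, y1, _ = p1
    heights = [z for (x, y, z) in cloud
               if (x - x1) ** 2 + (y - y1) ** 2 <= radius ** 2]
    if not heights:
        return None  # no surface point found below p1
    return (x1, y1, max(heights))
```

Taking the highest nearby surface point keeps the marker on top of mounded earth rather than inside it; a real implementation might instead fit a local plane or mesh to the point group.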
  • The recognition of the real space position of the first index point p1 by the real machine control device 400 and the transmission of the data representing it may be omitted.
  • In that case, the image processing function element 122 performs image analysis processing (such as grayscaling, edge extraction, and/or pattern matching) on the three-dimensional image to recognize one or more pixels corresponding to the first index point p1 defined on the working mechanism 440, and the pixel value of those pixels or the average value thereof is recognized as the real space position of the first index point p1.
  • Index image data for superimposing an index image M indicating the second index point p2 on the work environment image is transmitted to the remote operation device 20 by the image processing function element 122 (FIG. 4/STEP 114). The index image data includes the real space position of the second index point p2 and/or the pixel position (u, v) corresponding to the second index point p2 in the three-dimensional image or work environment image.
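Converting the real space position of the second index point into a pixel position (u, v) in the image can be sketched with a pinhole-camera model. The camera model and intrinsic parameters are assumptions for this illustration; the patent does not specify how the pixel position is computed.

```python
def to_pixel(point_cam, fx, fy, cx, cy):
    """Pinhole-camera projection of a 3-D point given in camera
    coordinates (z = depth along the optical axis) to a pixel (u, v)."""
    x, y, z = point_cam
    if z <= 0:
        return None  # point is behind the camera
    u = fx * x / z + cx
    v = fy * y / z + cy
    return (u, v)
```

In practice the second index point's real-machine coordinates would first be transformed into the imaging device's coordinate frame before this projection is applied.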
  • When the remote operation device 20 receives the index image data through the remote wireless communication device 224 (FIG. 4/C22), the remote control device 200 outputs the index image M corresponding to the index image data to the remote image output device 221 in a form superimposed on the work environment image (FIG. 4/STEP 214).
  • The process of transmitting the work environment image data (see FIG. 4/STEP 110) and the process of transmitting the index image data (see FIG. 4/STEP 114) from the image display system 10 to the remote operation device 20 may be executed simultaneously as a single data transmission process.
  • In the remote operation device 20, the output processing of the work environment image (see FIG. 4/STEP 212) and the output processing of the index image superimposed on the work environment image (see FIG. 4/STEP 214) may be executed simultaneously as a single image output process.
  • As a result, the index image M indicating the second index point p2 obtained by projecting the first index point p1 onto the surface of the target object Obj (for example, earth and sand around the work machine 40) is output to the remote image output device 221 in a form superimposed on the work environment image.
  • the index image M is an image of a triangular or arrow-shaped figure pointing toward the second index point p2 in the vertical direction in real space.
  • The first index point p1 and the second index point p2 may be superimposed on the captured image, or the superimposed display of the first index point p1 and/or the second index point p2 may be omitted.
  • the remote control device 200 recognizes the operation mode of the remote control mechanism 211, and transmits a remote control command corresponding to the operation mode to the image display system 10 through the remote wireless communication device 224. (FIG. 4/STEP 220).
  • the communication functional element 121 transmits the remote operation instruction to the working machine 40 (FIG. 4/C14).
  • the remote control command may be transmitted to work machine 40 without going through image display system 10 .
  • When the real machine control device 400 receives the remote operation command through the real machine wireless communication device 422 (FIG. 4/C44), the operation of the working mechanism 440 and the like is controlled accordingly (FIG. 4/STEP 420).
  • For example, work is performed in which the attachment 445 scoops up the earth and sand that is the target object Obj in front of the work machine 40, the upper swing body 420 is swung, and the earth and sand are then dumped from the attachment 445.
  • Through the work environment image output to the remote image output device 221 constituting the remote output interface 220 and the index image M superimposed on it, the operator can grasp the positions of the working mechanism 440 (attachment 445) and of the second index point p2 on the surface of the target object Obj (see FIG. 5).
  • The second index point p2 is the result of projecting the first index point p1 onto the surface of the target object Obj, so the operator is not provided with unnecessary positional information about the working mechanism 440 other than the first index point p1, thereby improving the accuracy with which the operator recognizes the positional relationship between the working mechanism 440 and the target object Obj.
  • FIGS. 6A to 6C each show the positional relationship between the arm top and the attachment 445 when excavating the ground.
  • Working mechanism 440 can exert the strongest force under the arm top, which is the connection point for attachment 445 .
  • For excavation, the operator usually brings the tip of the attachment 445 into contact with the ground below the arm top or farther away, and then operates the attachment 445 so that the tip comes below the arm top or closer to the operator than the arm top.
  • the arm top is often more suitable than the tip of the attachment 445 as the position index of the working mechanism 440 .
  • attachment 445 can be replaced with a breaker, grapple, lifting magnet, etc. in addition to the bucket described above, but since the position of the "arm top" does not change even if the attachment 445 is replaced, the same image display can be applied.
  • The index image M is an image that has directivity with respect to the second index point p2 on the surface of the target object Obj, that is, an image that indicates the position by a substantially triangular vertex or an arrow (see FIG. 5). Therefore, while avoiding misunderstanding by the operator of the three-dimensional shape of the surface of the target object Obj, the ease with which the operator recognizes the positional relationship between the working mechanism 440, particularly the attachment 445 on which the first index point p1 is defined, and the target object Obj is improved.
  • In the above embodiment, the image display system 10 and the communication function element 121 and image processing function element 122 constituting it are configured by a computer that exists separately from the remote operation device 20 and the work machine 40. In other embodiments, however, the image display system may be mounted on the remote operation device 20 and/or the work machine 40, and the communication function element 121 and/or the image processing function element 122 may be configured by the remote control device 200 and/or the real machine control device 400. In that case, the communication function in the image display system 10 can be omitted.
  • Although the first index point p1 was defined as the arm top in the above embodiment, it may instead be defined at the tip of the attachment 445. In that case, the index image M showing the second index point p2, obtained by projecting the first index point p1 onto the surface of the target object Obj, improves the operator's recognition accuracy of the positional relationship of the attachment 445 with respect to the contact point on the target object Obj.
  • The first index point p1 may be projected onto the surface of the target object Obj in a direction corresponding to the displacement mode of the attachment (for example, the attachment 445) on which the first index point p1 is arranged in the working mechanism 440, and an index image M showing the resulting second index point p2 may be superimposed on the work environment image and output to the remote image output device 221.
  • The displacement mode of the working mechanism 440 or the attachment is recognized based on the attitude angle sensor and/or the turning angle sensor constituting the actual machine sensor group 416. Alternatively, it may be recognized based on the operation mode of the control lever constituting the remote control mechanism 211.
  • As shown in FIG. 7A, when the attachment 445 is displaced vertically downward, the first index point p1 is projected vertically downward onto the surface of the target object Obj, and an index image M indicating the resulting second index point p2 is superimposed on the work environment image and output to the remote image output device 221.
  • As shown in FIG. 7B, when the attachment 445 is displaced forward and downward as viewed from the work machine 40, the first index point p1 is projected forward and downward of the work machine 40 onto the surface of the target object Obj, and an index image M indicating the resulting second index point p2 is superimposed on the work environment image and output to the remote image output device 221.
  • As shown in FIG. 7C, when the attachment 445 is displaced rearward and downward as viewed from the work machine 40, the first index point p1 is projected rearward and downward of the work machine 40 onto the surface of the target object Obj, and an index image M indicating the resulting second index point p2 is superimposed on the work environment image and output to the remote image output device 221.
  • In this way, the displacement mode of the working mechanism 440 constituting the work machine 40 is conveyed to the operator through the work environment image output to the remote image output device 221 of the remote control device 20 and the index image M superimposed on it.
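The projection of the first index point onto the target surface along the displacement direction can be sketched as follows. This is only an illustrative reconstruction, not the patent's implementation: the function name, the heightmap representation of the surface of the target object Obj, and the ray-marching step size are all assumptions.

```python
import math

def project_index_point(p1, direction, height_fn, step=0.05, max_dist=20.0):
    """March from the first index point p1 along `direction` until the ray
    meets the target surface z = height_fn(x, y); the crossing point is
    returned as the second index point p2 (None if nothing is hit)."""
    x, y, z = p1
    dx, dy, dz = direction
    n = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / n, dy / n, dz / n          # unit projection direction
    for _ in range(int(max_dist / step)):
        if z <= height_fn(x, y):                  # ray crossed the surface
            return (x, y, z)
        x, y, z = x + dx * step, y + dy * step, z + dz * step
    return None
```

With a flat surface at height 0 and a vertically downward displacement (direction (0, 0, -1)), a first index point at (2, 0, 3) projects to a second index point near (2, 0, 0), matching the FIG. 7A case; a forward-downward direction reproduces the FIG. 7B case.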
  • In response to the index image output command, an index image M showing the second index point p2, obtained by projecting the first index point p1 onto the surface of the target object Obj in a direction corresponding to the posture mode of the working mechanism 440 or the attachment 445, may be superimposed on the work environment image and output to the remote image output device 221.
  • As shown in FIG. 8A, when the attachment 445 is in a posture facing vertically downward, an index image M indicating the second index point p2 obtained by projecting the first index point p1 vertically downward onto the surface of the target object Obj is superimposed on the work environment image and output to the remote image output device 221.
  • As shown in FIG. 8C, when the first index point p1 is positioned rearward of the work machine 40 with respect to the surface of the target object Obj, an index image M indicating the second index point p2 obtained by projecting the first index point p1 rearward and downward is superimposed on the work environment image and output to the remote image output device 221.
  • Through the work environment image output to the remote image output device 221 of the remote operation device 20 and the index image M superimposed on it, the operator can easily recognize the position and direction in which the working mechanism 440 or the attachment 445 of the work machine 40 acts on the target object according to its posture, so the operator's recognition accuracy and the work efficiency can be improved.
  • The first index point may be defined on the attachment 445, and while the transmission of a specific remote operation command continues (that is, while the specific operation continues), the position information of the first index point may be held at the position of the first index point p1 immediately before the operation command was transmitted, without being updated.
  • For example, an operation switch for operating a breaker serving as the attachment 445 is provided on the lever of the remote control mechanism. When the breaker is operated through the remote control mechanism 211, the remote control device 20 sends a remote operation command for operating the breaker to the work machine 40, and sends an update stop signal to the image display system 10 so that the position information of the first index point p1 is not updated.
  • The update stop signal is transmitted continuously while the operation switch is pressed, so the position information of the first index point p1 is held at the value it had immediately before the operation switch was operated. When the operation switch is released, the update stop signal is no longer sent to the image display system 10, and updating of the position information of the first index point p1 resumes.
  • According to this configuration, the position information of the first index point is not updated while the attachment 445 is operating. The work environment image output to the remote image output device 221 of the remote control device 20 and the index image M superimposed on it therefore do not shake together with the first index point when the attachment 445 vibrates during operation. This prevents the operator, who performs remote operation while gazing at the index image M, from experiencing motion sickness.
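The update-freezing behavior can be sketched as a small state holder. This is a hypothetical sketch: the class and method names are invented, and a real system would receive the update-stop signal over the communication link described above.

```python
class IndexPointTracker:
    """Holds the displayed position of the first index point p1.

    While an update-stop signal is active (e.g. while the breaker switch
    on the remote control lever is held down), the position reported to
    the renderer stays frozen at its last pre-operation value, so the
    index image M does not shake with the vibrating attachment."""

    def __init__(self):
        self._displayed = None   # last position passed through to display
        self._frozen = False     # True while the update-stop signal is on

    def on_update_stop_signal(self, active):
        self._frozen = active

    def on_sensor_position(self, p1):
        """Feed a freshly sensed p1; return the position to display."""
        if not self._frozen:
            self._displayed = p1
        return self._displayed
```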
  • When the second index point p2 is defined by projecting the first index point p1 onto the surface of the target object Obj in a direction corresponding to the displacement mode or posture mode of the working mechanism 440 or the attachment, a stereoscopic index image M may be output to the remote image output device 221 of the remote control device 20 in the work environment image.
  • When the second index point p2 is defined by projecting the first index point p1 vertically downward onto the surface of the target object Obj (see FIGS. 7A and 8A), as shown in FIG. 9A, a substantially conical index image M whose central axis is parallel to the vertical direction of the real space and whose apex points downward in the real space is superimposed on the work environment image and output to the remote image output device 221.
  • When the second index point p2 is defined by projecting the first index point p1 onto the surface of the target object Obj toward the front and lower side of the work machine 40 (see FIGS. 7B and 8B), as shown in FIG. 9B, a substantially conical index image M whose apex points forward and downward in the real space is superimposed on the work environment image and output to the remote image output device 221.
  • When the second index point p2 is defined by projecting the first index point p1 onto the surface of the target object Obj toward the rear and lower side of the work machine 40 (see FIGS. 7C and 8C), as shown in FIG. 9C, a substantially conical index image M whose apex points rearward and downward in the real space is superimposed on the work environment image and output to the remote image output device 221.
  • Because the index image M output to the remote image output device 221 of the remote control device 20 is a three-dimensional image, the operator can easily grasp the positional relationship between the first index point p1 and the second index point p2 on the surface of the target object Obj.
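Generating the substantially conical index image M oriented by the projection direction could look like the sketch below. This is an assumed geometry-only construction (no rendering); the function names and the apex/base parameterization are invented for illustration.

```python
import math

def _normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def cone_marker(p2, direction, height=0.5, radius=0.2, segments=12):
    """Vertices of a roughly conical index image M: apex at the second
    index point p2, central axis anti-parallel to the projection
    direction, so a vertically downward projection yields a cone whose
    apex points straight down, as in FIG. 9A. Returns (apex, base_ring)."""
    axis = _normalize(tuple(-c for c in direction))      # away from surface
    ref = (1.0, 0.0, 0.0) if abs(axis[0]) < 0.9 else (0.0, 1.0, 0.0)
    u = _normalize(_cross(axis, ref))                    # base-plane vectors
    v = _cross(axis, u)
    center = tuple(p + height * a for p, a in zip(p2, axis))
    ring = [tuple(c + radius * (math.cos(t) * uu + math.sin(t) * vv)
                  for c, uu, vv in zip(center, u, v))
            for t in (2.0 * math.pi * i / segments for i in range(segments))]
    return p2, ring
```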
  • An index image M showing the second index point p2, obtained by projecting onto the surface of the target object Obj a first index point p1 whose position differs according to the displacement mode of the working mechanism 440, may be superimposed on the work environment image and output to the remote image output device 221 of the remote operation device 20.
  • For example, as shown in FIG. 10A, the tip center point of the bucket 445 is defined as the first index point p1.
  • When a counterclockwise turn of the upper swing body 420 relative to the lower traveling body 410, as viewed from above, is estimated, predicted, or measured (see the leftward white arrow), the point at the left end of the tip of the bucket 445 is defined as the first index point p1, as shown in FIG. 10B.
  • When a clockwise turn of the upper swing body 420 relative to the lower traveling body 410, as viewed from above, is estimated, predicted, or measured, the point at the right end of the tip of the bucket 445 is defined as the first index point p1, as shown in FIG. 10C.
  • An index image M showing the second index point p2, obtained by projecting the first index point p1 onto the surface of the target object Obj, is then superimposed on the work environment image and output to the remote image output device 221.
  • In other words, the part of the working mechanism 440 to be watched changes according to the mode of change of its position and/or posture. If the operation turns the upper swing body 420 relative to the lower traveling body 410, that part is the leading edge in the turning direction; if the operation moves the working mechanism 440 away from the machine center, it is the leading edge in the direction away from the machine center.
  • Through the work environment image output to the remote image output device 221 of the remote control device 20 and the index image M superimposed on it, it becomes easier to grasp the positional relationship between the target object Obj and the part to be watched according to the mode of change of the position and/or posture of the working mechanism 440 constituting the work machine 40, so the operator's recognition accuracy can be improved.
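Choosing which bucket-tip point to watch, based on the estimated turning direction of the upper swing body, can be sketched as below. The function name, the sign convention (positive swing rate meaning clockwise as seen from above), and the dead band are assumptions for illustration.

```python
def select_first_index_point(tip_left, tip_center, tip_right, swing_rate,
                             deadband=0.01):
    """Pick the bucket-tip point to use as first index point p1 from the
    estimated swing of the upper body (positive = clockwise seen from
    above). The leading edge in the turning direction is watched, as in
    FIGS. 10A-10C."""
    if swing_rate > deadband:        # clockwise turn: right edge leads
        return tip_right
    if swing_rate < -deadband:       # counterclockwise turn: left edge leads
        return tip_left
    return tip_center                # no significant turn: tip center point
```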
  • A plurality of index images M, each showing one of a plurality of second index points p2 obtained by projecting a plurality of first index points p1 onto the surface of the target object Obj, may be superimposed on the work environment image and output to the remote image output device 221 of the remote operation device 20.
  • For example, the left and right end points of the tip of the attachment 445 may be defined as first index points p11 and p12, respectively, and a plurality of index images M1 and M2, showing the second index points p21 and p22 obtained by projecting the first index points p11 and p12 vertically downward onto the surface of the target object Obj, may be superimposed on the work environment image and output to the remote image output device 221 of the remote operation device 20.
  • The number of first index points p1, and of the corresponding second index points p2, may be three or more.
  • In this way, the operator can grasp the position of each of the plurality of second index points p21 and p22 obtained by projecting the plurality of first index points p11 and p12 onto the surface of the target object Obj, which improves the operator's recognition accuracy of the positional relationship between the working mechanism 440 and the target object Obj.
  • A plurality of index images M, each oriented toward one of a plurality of second index points p2 corresponding to a plurality of first index points p1 whose relative positions change as the posture of the working mechanism 440 changes, may be superimposed on the work environment image and output to the remote image output device 221 of the remote operation device 20.
  • As shown in FIG. 12A, when the pair of constituent members 4451 and 4452 of the attachment 445 (for example, a grapple or crusher) is closed, the tips of the pair of constituent members 4451 and 4452 are defined as first index points p11 and p12, and a plurality of index images M1 and M2, showing the second index points p21 and p22 obtained by projecting the first index points p11 and p12 onto the surface of the target object Obj, is superimposed on the work environment image and output to the remote image output device 221 of the remote operation device 20.
  • As shown in FIG. 12B, when the pair of constituent members 4451 and 4452 of the attachment 445 is opened, the tips of the pair of constituent members 4451 and 4452 are likewise defined as first index points p11 and p12, and a plurality of index images M1 and M2, showing the second index points p21 and p22 obtained by projecting the first index points p11 and p12 onto the surface of the target object Obj, is superimposed on the work environment image and output to the remote image output device 221 of the remote control device 20.
  • Through the work environment image and the plurality of index images output to the remote image output device of the remote operation device 20, the operator can grasp the position of each of the plurality of second index points p21 and p22 obtained by projecting onto the surface of the target object the plurality of first index points p11 and p12 of the working mechanism 440 (for example, of the pair of constituent members 4451 and 4452 of the attachment 445), whose relative positions change as the posture changes. As a result, the operator's recognition accuracy of the positional relationship between the plurality of parts of the working mechanism 440 and the target object Obj, according to the mode of posture change, can be improved.
  • The plurality of parts may consist of, for example, one part on the attachment 445 and another part on the arm 443 and/or the boom 441.
  • For example, the center point of the tip of the attachment 445 and the center point of the tip of the arm 443 may be defined as the first index points p11 and p12, respectively.
  • If the index images M1 and M2 superimposed on the work environment image are made identifiable by color, shape, pattern, or a combination of these, the operator can be made aware of their directional arrangement in the real space and, by extension, of the attitude of the attachment 445 (for example, a bucket).
  • In the above embodiment, the index image M having directivity with respect to the second index point p2 was output to the remote image output device 221 (see FIG. 5); alternatively, an index image M having no directivity with respect to the second index point p2 may be output to the remote image output device 221.
  • For example, an index image M in which a two-dimensional figure of a designated shape, such as a circle or square centered on the second index point p2, is arranged in a posture parallel to the horizontal plane may be output to the remote image output device 221.
  • Likewise, an index image M in which a three-dimensional figure of a designated shape, such as a sphere, cube, or polyhedron centered on the second index point p2, is arranged may be output to the remote image output device 221.
  • When it is estimated, predicted, or measured, based on the operation mode of the remote control mechanism 211 and/or the actual machine operating mechanism 411, or on the output signal of the actual machine sensor group 416, that the upper swing body 420 turns counterclockwise relative to the lower traveling body 410 as viewed from above, a central point is defined as the first index point p1.
  • An index image M showing the second index point p2 obtained by projecting the first index point p1 onto the surface of the target object Obj is superimposed on the work environment image and output to the remote image output device 221.
  • Through the work environment image output to the remote image output device 221 of the remote controller 20 and the index image M superimposed on it, the operator can recognize changes in the position and/or posture of the working mechanism 440 constituting the work machine 40 while taking into account part or all of the communication delay time and the response delay time. The operator's recognition accuracy of the positional relationship of the working mechanism 440 with respect to the target object Obj, in view of the operating environment peculiar to remote control, is thereby improved, and operations can be performed more efficiently.
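Compensating part or all of the communication and response delays when placing the first index point could be sketched as a simple velocity extrapolation. This is an assumed illustration only: the patent does not prescribe this formula, and the function name and the `alpha` weighting are invented.

```python
def predict_index_point(p1, velocity, comm_delay, response_delay, alpha=1.0):
    """Extrapolate the first index point p1 along its current velocity over
    a fraction `alpha` of the total delay, so the superimposed index image
    M anticipates where the working mechanism will actually act."""
    dt = alpha * (comm_delay + response_delay)
    return tuple(p + v * dt for p, v in zip(p1, velocity))
```

With `alpha = 0` the display shows the raw sensed position; with `alpha = 1` it compensates the full delay, matching the "part or all" wording above.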
  • In the image display system of the present invention, it is preferable that the index image having directivity with respect to the second index point is superimposed on the work environment image and output to the output interface of the remote control device. According to this configuration, the index image output to the output interface of the remote control device has directivity with respect to the second index point (that is, it indicates the position of the second index point), so the operator can easily grasp the positional relationship between the first index point of the working mechanism and the second index point on the surface of the target object.
  • Through the work environment image output to the output interface of the remote control device and the index image superimposed on it, the first index point is projected in the direction corresponding to the displacement mode or posture mode of the working mechanism constituting the work machine, so the operator's recognition accuracy of the positional relationship between the first index point of the working mechanism and the second index point on the surface of the target object in the work environment image can be improved.
  • In the image display system of the present invention, it is preferable that the index image showing the second index point, obtained by projecting onto the surface of the target object a first index point whose position differs according to the displacement mode of the working mechanism, is superimposed on the work environment image and output to the output interface of the remote control device. According to this configuration, through the work environment image output to the output interface of the remote control device and the index image superimposed on it, the first index point is displaced to a position corresponding to the displacement mode of the working mechanism constituting the work machine, so the operator's recognition accuracy of the positional relationship between the first index point of the working mechanism and the second index point on the surface of the target object in the work environment image can be improved.
  • In the image display system of the present invention, it is preferable that the index image indicating the second index point is superimposed on the work environment image and output to the output interface of the remote control device. According to this configuration, since the index image is output to the output interface of the remote control device, the operator can easily grasp or infer the positional relationship between the first index point and the second index point obtained by projecting the first index point onto the surface of the target object.
  • In the image display system of the present invention, it is preferable that the first index point is set on the arm top. Since the position of the tip of the attachment is manipulated relative to the position of the arm top, the arm top is suitable as the first index point. Furthermore, since the position of the arm top does not change when the attachment is replaced, the same image display can be applied.
  • In the image display system of the present invention, it is preferable that a plurality of index images, each showing one of a plurality of second index points obtained by projecting each of a plurality of first index points onto the surface of the target object, is superimposed on the work environment image and output to the output interface of the remote control device.
  • According to this configuration, through the work environment image and the plurality of index images output to the output interface of the remote control device, the operator can grasp the position of each of the plurality of second index points obtained by projecting each of the plurality of first index points of the working mechanism onto the surface of the target object. As a result, the operator's recognition accuracy of the positional relationship between the working mechanism and the target object can be improved.
  • In the image display system of the present invention, it is preferable that a plurality of index images, each pointing toward one of the plurality of second index points corresponding to the plurality of first index points whose relative positions change as the posture of the working mechanism changes, is superimposed on the work environment image and output to the output interface of the remote control device.
  • According to this configuration, through the work environment image and the plurality of index images output to the output interface of the remote control device, the operator can grasp the position of each of the plurality of second index points obtained by projecting onto the surface of the target object each of the plurality of first index points whose relative positions change as the posture of the working mechanism changes.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Civil Engineering (AREA)
  • Structural Engineering (AREA)
  • Computer Graphics (AREA)
  • Mining & Mineral Resources (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Component Parts Of Construction Machinery (AREA)
  • Manipulator (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Provided are a system and the like that make it possible to improve the accuracy with which an operator recognizes the positional relationship between a working mechanism constituting a work machine and an object located in the vicinity of the work machine. According to the present invention, an operator can grasp the position of a second index point p2 on the surface of a target object Obj and the position of a working mechanism 440 (attachment 445) constituting a work machine 40, by means of a work environment image output to a remote image output device 221 constituting a remote output interface 220 and an index image M superimposed on the work environment image. The second index point p2 is a result obtained by projecting a first index point p1 onto the surface of the target object Obj.
PCT/JP2022/039550 2021-12-03 2022-10-24 Image display system, remote operation assistance system, and image display method WO2023100533A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021197367 2021-12-03
JP2021-197367 2021-12-03

Publications (1)

Publication Number Publication Date
WO2023100533A1 true WO2023100533A1 (fr) 2023-06-08

Family

ID=86611900

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/039550 WO2023100533A1 (fr) Image display system, remote operation assistance system, and image display method

Country Status (2)

Country Link
JP (1) JP2023083245A (fr)
WO (1) WO2023100533A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019031908A (ja) * 2018-12-04 2019-02-28 住友建機株式会社 Excavator
JP2019157569A (ja) * 2018-03-15 2019-09-19 日立建機株式会社 Construction machine
JP2021038649A (ja) * 2020-10-08 2021-03-11 株式会社小松製作所 Image display system for work machine, remote operation system for work machine, work machine, and image display method for work machine
JP2021050602A (ja) * 2021-01-07 2021-04-01 株式会社小松製作所 Display system for construction machine and control method therefor
JP2021165526A (ja) * 2020-09-29 2021-10-14 株式会社小松製作所 Image display system for work machine


Also Published As

Publication number Publication date
JP2023083245A (ja) 2023-06-15

Similar Documents

Publication Publication Date Title
JP6832548B2 (ja) Image display system for work machine, remote operation system for work machine, work machine, and image display method for work machine
JPWO2019244574A1 (ja) Excavator and information processing device
WO2020003631A1 (fr) Display control device and display control method
JP6867132B2 (ja) Detection processing device for work machine and detection processing method for work machine
JP6947101B2 (ja) Remote operation system and main operation device
JP7420733B2 (ja) Display control system and display control method
JP2024052764A (ja) Display control device and display method
WO2023100533A1 (fr) Image display system, remote operation assistance system, and image display method
JP2023040971A (ja) Remote operation assistance device, remote operation assistance system, and remote operation assistance method
WO2020090898A1 (fr) Display control system, display control method, and remote control system
JP2021099017A (ja) Remote operation device and remote operation system
JP7444094B2 (ja) Remote operation assistance system and remote operation assistance method
JP7441733B2 (ja) Actual machine state monitoring system and actual machine state monitoring method
WO2021124858A1 (fr) Remote control device and remote control system
EP4130400A1 (fr) Remote operation assistance device and remote operation assistance method
US11939744B2 (en) Display system, remote operation system, and display method
WO2021106280A1 (fr) Work assistance server, work assistance method, and work assistance system
JP7390991B2 (ja) Work machine and construction support system
JP7363560B2 (ja) Remote operation device, remote operation assistance server, remote operation assistance system, and remote operation assistance method
WO2023026568A1 (fr) Remote control system and remote control composite system
WO2023136070A1 (fr) Remote operation assistance system and remote operation assistance method
US20210395980A1 (en) System and method for work machine
WO2021240957A1 (fr) Remote operation assistance device, remote operation assistance system, and remote operation assistance method
US20220002977A1 (en) System and method for work machine
JP2021130973A (ja) Information presentation system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22900966

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2022900966

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2022900966

Country of ref document: EP

Effective date: 20240517