WO2019230366A1 - Display system - Google Patents

Display system

Info

Publication number
WO2019230366A1
Authority
WO
WIPO (PCT)
Prior art keywords
main body
controller
optical member
display
display system
Prior art date
Application number
PCT/JP2019/019088
Other languages
French (fr)
Japanese (ja)
Inventor
薫 草深
橋本 直
Original Assignee
京セラ株式会社 (Kyocera Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京セラ株式会社 (Kyocera Corporation)
Publication of WO2019230366A1

Classifications

    • E: FIXED CONSTRUCTIONS
    • E02: HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02F: DREDGING; SOIL-SHIFTING
    • E02F9/00: Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/26: Indicating devices
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays

Definitions

  • This disclosure relates to a display system.
  • a construction machine such as a hydraulic excavator includes a display device that displays information (see Patent Document 1).
  • the display system of the present disclosure is a display system for a construction machine that includes a work machine and a main body unit to which the work machine is attached.
  • the display system includes a display unit, a first optical member, and a controller.
  • the display unit displays parallax images having parallax with each other.
  • the first optical member reflects image light of the parallax image displayed on the display unit with respect to a predetermined position corresponding to an eye position of a driver of the construction machine.
  • the first optical member transmits external light incident on a surface opposite to a surface that reflects the image light.
  • the controller displays a parallax image on the display unit.
  • FIG. 1 is a diagram showing a schematic configuration of an excavator car equipped with a display system in the present embodiment.
  • FIG. 2 is a diagram showing a schematic configuration of the display device shown in FIG. 1.
  • FIG. 3 is a diagram showing an example of the display panel shown in FIG. 2 viewed from the normal direction of the active area.
  • FIG. 4 is a diagram showing the parallax barrier shown in FIG. 2.
  • FIG. 5 is a diagram for explaining a virtual image visually recognized by the driver's left eye on the display panel shown in FIG. 1.
  • FIG. 6 is a diagram for explaining a virtual image visually recognized by the driver's right eye on the display panel shown in FIG. 1.
  • FIG. 7A is a diagram illustrating the relationship between the excavator and the horizontal plane when the angle of the excavator with respect to the horizontal plane is 0 degrees.
  • FIG. 7B is a diagram illustrating the relationship between the excavator and the horizontal plane when the angle of the excavator with respect to the horizontal plane is 10 degrees.
  • FIG. 7C is a diagram illustrating the relationship between the excavator and the horizontal plane when the angle of the excavator with respect to the horizontal plane is −5 degrees.
  • FIG. 8A is a diagram illustrating the relationship between the excavator and the design surface when the angle of the excavator with respect to the horizontal plane is 0 degrees.
  • FIG. 8B is a diagram illustrating the relationship between the excavator and the design surface when the angle of the excavator with respect to the horizontal plane is 10 degrees.
  • FIG. 8C is a diagram illustrating the relationship between the excavator and the design surface when the angle of the excavator with respect to the horizontal plane is −5 degrees.
  • FIG. 9A is a diagram illustrating a state in which the working machine is extended so that the bucket is separated from the main body portion forward.
  • FIG. 9B is a diagram illustrating a state in which the working machine is extended so that the bucket is closer to the ground than the state illustrated in FIG. 9A.
  • FIG. 9C is a diagram illustrating a state where the working machine is extended so that the bucket excavates the ground.
  • FIG. 10A is a diagram illustrating a scene visually recognized by the driver when the work machine is in the state illustrated in FIG. 9A.
  • FIG. 10B is a diagram illustrating a scene visually recognized by the driver when the work machine is in the state illustrated in FIG. 9B.
  • FIG. 10C is a diagram illustrating a scene visually recognized by the driver when the work machine is in the state illustrated in FIG. 9C.
  • FIG. 11 is a diagram illustrating an example of a virtual image that is visible to the driver and indicates a range that can be reached by the work implement.
  • FIG. 12A is a diagram illustrating a state in which the working machine is extended so that the bucket is separated from the main body portion forward.
  • FIG. 12B is a diagram illustrating a state in which the working machine is extended so that the bucket is closer to the main body than the state illustrated in FIG. 12A.
  • FIG. 12C is a diagram illustrating a state in which the working machine is extended so that the bucket is closer to the main body than the state illustrated in FIG. 12B.
  • FIG. 13 is a diagram illustrating an example of a virtual image showing a portion that can be reached by the work implement on the design surface, which is visually recognized by the driver.
  • FIG. 14 is a diagram for explaining the relationship between the position and orientation of the mirror and the region where the image light reaches the windshield.
  • the present disclosure provides a display system capable of improving convenience when a driver operating a construction machine recognizes information.
  • the display system 1 may be mounted on an excavator (construction machine) 2 as shown in FIG.
  • the excavator 2 includes a traveling unit 21, a main body unit 22, and a work machine 23.
  • the traveling unit 21 is, for example, a crawler type traveling device.
  • the traveling unit 21 is connected to the main body 22 below the main body 22.
  • the traveling unit 21 includes drive wheels 211, idle wheels 212, and crawlers 213 wound around the drive wheels 211 and idle wheels 212.
  • the traveling unit 21 moves over the ground by driving the drive wheels 211 so that the crawlers 213 rotate in their extending direction.
  • the main body 22 is connected to the traveling unit 21 on the upper side of the traveling unit 21. Therefore, the main body 22 moves as the traveling unit 21 moves.
  • the main body 22 is provided so as to be able to turn in a plane parallel to the ground.
  • the main body 22 includes a driver seat 221 for the driver to sit on and a windshield 222 provided on the front side of the driver who has seated.
  • the main body unit 22 includes an operation unit such as an operation lever for operating the traveling unit 21 and the work implement 23, respectively.
  • the work machine 23 is attached to the main body 22.
  • the work machine 23 is attached so as to be rotatable around an attachment position to the main body 22 and changes its posture by turning.
  • the work machine 23 includes a boom (first portion) 231, an arm (second portion) 232, and a bucket (third portion) 233.
  • the boom 231 is attached to the side of the main body 22 at one end in the longitudinal direction.
  • the boom 231 is pivotally attached around the attachment position to the main body 22 based on the operation of the driver, and changes its posture by turning.
  • the arm 232 is attached to the boom 231 at one end in the longitudinal direction.
  • the arm 232 is attached so as to be rotatable around an attachment position to the boom 231.
  • the position of the arm 232 is changed by the rotation of the boom 231.
  • the arm 232 changes its posture by rotating.
  • the bucket 233 is attached to the end of the arm 232 opposite to the end attached to the boom 231.
  • the bucket 233 is attached so as to be rotatable around an attachment position to the arm 232.
  • the position of the bucket 233 is changed by the rotation of the boom 231 and the arm 232.
  • the bucket 233 changes its posture by rotating.
  • the display system 1 includes a position detection device 3, an attitude detection device 4, a display device 5, a projection optical system 6, and a drive device 7. A part of the configuration of the display system 1 may be shared with other devices or parts included in the excavator 2.
  • the position detection device 3 detects the position of the position detection device 3.
  • the position of the position detection device 3 corresponds to the position of the excavator 2 on which the position detection device 3 is mounted. Therefore, the position detection device 3 detects the position of the excavator 2 by detecting the position of the position detection device 3.
  • the position detection device 3 includes, for example, a receiver of a GNSS (Global Navigation Satellite System).
  • the position detection device 3 detects the position of the position detection device 3 based on signals received from artificial satellites by the GNSS receiver.
  • the GNSS includes satellite positioning systems such as GPS (Global Positioning System), GLONASS, Galileo, and QZSS (Quasi-Zenith Satellite System).
  • the posture detection device 4 detects the posture of the main body 22 of the excavator 2 on which the posture detection device 4 is mounted.
  • the posture is, for example, an inclination angle with respect to the horizontal plane.
  • the posture detection device 4 may be a tilt sensor, for example.
  • the tilt sensor measures, for example, gravitational acceleration and determines the posture based on the measured gravitational acceleration.
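As a rough illustration of how a tilt sensor can derive posture from gravitational acceleration, the pitch angle can be computed from a measured gravity vector. The axis convention and function below are assumptions for illustration, not part of the disclosure.

```python
import math

def tilt_angle_deg(ax, ay, az):
    # Pitch of the main body relative to the horizontal plane, from a
    # gravitational-acceleration reading (m/s^2). The axis convention
    # (x forward, y lateral, z up) is a hypothetical choice.
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

# A level machine measures gravity only on the z axis, giving 0 degrees.
level = tilt_angle_deg(0.0, 0.0, 9.81)
```

`atan2` keeps the sign of the forward acceleration component, so forward and backward tilts come out with opposite signs, matching the signed inclination angles used later (e.g. +10 and −5 degrees).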
  • the posture detection device 4 detects the posture of the excavator 2.
  • the posture detection device 4 outputs information indicating the posture (hereinafter also referred to as “posture information”) to the display device 5 and the drive device 7.
  • the display device 5 is, for example, a three-dimensional display device. As shown in FIG. 2, the display device 5 includes a communication unit 51, a memory 52, a display unit 53, and a controller 54.
  • the communication unit 51 may receive position information indicating the position of the excavator 2 transmitted from the position detection device 3.
  • the communication unit 51 may receive posture information indicating the posture of the main body 22 transmitted from the posture detection device 4.
  • the communication unit 51 may receive design information from an external device.
  • the design information is information indicating the size and position of the design surface formed by the excavator 2, for example.
  • the memory 52 includes an arbitrary storage device such as a RAM (Random Access Memory) and a ROM (Read Only Memory).
  • the memory 52 may store design information.
  • the display unit 53 displays parallax images having parallax with each other.
  • the display unit 53 includes an irradiator 531, a display panel 532, and a parallax barrier 533 as an optical element.
  • the irradiator 531 can irradiate the display panel 532 in a plane.
  • the irradiator 531 may include a light source, a light guide plate, a diffusion plate, a diffusion sheet, and the like.
  • the irradiator 531 emits irradiation light from a light source and homogenizes it in the surface direction of the display panel 532 using a light guide plate, a diffusion plate, a diffusion sheet, or the like.
  • the irradiator 531 can emit the homogenized light toward the display panel 532.
  • the display panel 532 may display a left eye image (first image) and a right eye image (second image) having parallax with each other.
  • the display panel 532 may be, for example, a transmissive liquid crystal display panel.
  • the display panel 532 has, on the surface of its plate shape, an active area 532a divided into a plurality of partitioned areas along a first direction and a second direction orthogonal to the first direction.
  • a direction orthogonal to the first direction and the second direction is referred to as a third direction.
  • the first direction may be referred to as the horizontal direction.
  • the second direction may be referred to as the vertical direction.
  • the third direction may be referred to as the depth direction.
  • first direction is represented as the x-axis direction
  • second direction is represented as the y-axis direction
  • third direction is represented as the z-axis direction.
  • the active area 532a includes a plurality of sub-pixels arranged in a grid along the horizontal direction and the vertical direction.
  • Each subpixel corresponds to any color of R (Red), G (Green), and B (Blue), and one pixel can be configured by combining the three subpixels of R, G, and B.
  • One pixel may also be referred to as one picture element.
  • the horizontal direction is, for example, a direction in which a plurality of subpixels constituting one pixel are arranged.
  • the vertical direction is, for example, a direction in which sub-pixels of the same color are arranged.
  • the display panel 532 is not limited to a transmissive liquid crystal panel, and other display panels such as an organic EL can be used. When a self-luminous display panel is used as the display panel 532, the display device 5 may not include the irradiator 531.
  • a plurality of subpixels arranged in the active area 532a constitute a subpixel group Pg.
  • the subpixel group Pg is repeatedly arranged adjacent to each other in the horizontal direction.
  • the subpixel group Pg is repeatedly arranged adjacent to a position shifted by one subpixel in the horizontal direction in the vertical direction.
  • the subpixel group Pg includes subpixels arranged in predetermined rows and columns. Specifically, the subpixel group Pg includes (2×n×b) subpixels P1 to P(2×n×b) arranged in succession, b subpixels (b rows) in the vertical direction and 2×n subpixels (2×n columns) in the horizontal direction. In the example shown in FIG. 3, subpixel groups Pg each including twelve subpixels P1 to P12 arranged in succession, one in the vertical direction and twelve in the horizontal direction, are arranged in the active area 532a, and reference numerals are given to only some of the subpixel groups Pg.
  • the subpixel group Pg is a minimum unit for the controller 54 to perform control for displaying an image.
  • the subpixels P1 to P(2×n×b) having the same identification information in all the subpixel groups Pg are controlled by the controller 54 simultaneously and in the same manner. For example, when the controller 54 switches the image displayed on the subpixel P1 from the left eye image (first image) to the right eye image (second image), the images displayed on the subpixels P1 in all the subpixel groups Pg are switched from the left eye image to the right eye image simultaneously.
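The grouping and simultaneous switching described above can be sketched as follows. The 12-subpixel group (n = 6, b = 1) matches FIG. 3, but the data layout itself is an illustrative assumption, not the disclosed implementation.

```python
N_COLS = 12  # 2*n subpixels per group horizontally (n = 6, b = 1)

def assign_images(num_groups, right_eye_indices):
    # Build, for each subpixel group Pg, the eye ('L' or 'R') whose
    # image each subpixel P1..P12 displays. Subpixels sharing an index
    # are switched identically in every group at the same time, so one
    # assignment row describes all groups.
    row = ['R' if i + 1 in right_eye_indices else 'L' for i in range(N_COLS)]
    return [list(row) for _ in range(num_groups)]

# Example: P7..P12 show the right eye image, P1..P6 the left eye image,
# and the pattern is identical in every group.
groups = assign_images(3, right_eye_indices={7, 8, 9, 10, 11, 12})
```

Because the group is the minimum unit of control, switching one index (say P1) is a single update to the shared assignment row rather than a per-group operation.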
  • the parallax barrier 533 is formed by a plane along the active area 532a as shown in FIG.
  • the parallax barrier 533 is arranged at a predetermined distance (gap) g from the active area 532a.
  • the parallax barrier 533 may be positioned on the opposite side of the irradiator 531 with respect to the display panel 532.
  • the parallax barrier 533 may be positioned on the irradiator 531 side of the display panel 532.
  • the parallax barrier 533 defines, for each of a plurality of light-transmitting regions 533b, which are band-like regions extending in a predetermined direction in its plane, the light ray direction that is the propagation direction of the image light IL emitted from the subpixels.
  • the predetermined direction is a direction that forms a predetermined angle other than 0 degree with respect to the vertical direction.
  • a region on the active area 532a that can be visually recognized by the driver's left eye is referred to as a left visible region 532aL (first visible region).
  • a region on the active area 532a that can be visually recognized by the driver's right eye is referred to as a right visible region 532aR (second visible region).
  • the parallax barrier 533 includes a plurality of light shielding surfaces 533a that shield the image light IL.
  • the plurality of light shielding surfaces 533a define a light transmitting region 533b between the light shielding surfaces 533a adjacent to each other.
  • the light transmitting region 533b has a higher light transmittance than the light shielding surface 533a.
  • the light shielding surface 533a has a light transmittance lower than that of the light transmitting region 533b.
  • the light transmissive region 533b is a portion that transmits light incident on the parallax barrier 533.
  • the light transmitting region 533b may transmit light with a transmittance equal to or higher than the first predetermined value.
  • the first predetermined value may be 100%, for example, or a value close to 100%.
  • the light blocking surface 533a is a portion that blocks light incident on the parallax barrier 533 and does not transmit it. In other words, the light shielding surface 533a blocks an image displayed on the display device 5.
  • the light blocking surface 533a may block light with a transmittance equal to or lower than the second predetermined value.
  • the second predetermined value may be 0%, for example, or a value close to 0%.
  • the light transmissive regions 533b and the light shielding surfaces 533a extend in a predetermined direction along the active area 532a, and are alternately and repeatedly arranged in a direction orthogonal to the predetermined direction.
  • the translucent area 533b defines the light ray direction of the image light IL emitted from the subpixel.
  • the parallax barrier 533 may be composed of a film or a plate-like member having a transmittance less than the second predetermined value.
  • the light shielding surface 533a is configured by the film or the plate-like member.
  • the light transmitting region 533b is configured by an opening provided in a film or a plate-like member.
  • the film may be made of resin, or may be made of another material.
  • the plate-like member may be made of resin or metal, or may be made of other materials.
  • the parallax barrier 533 is not limited to a film or a plate-like member, and may be constituted by other types of members.
  • the base material may itself have a light-shielding property, or the base material may contain an additive having a light-shielding property.
  • the parallax barrier 533 may be configured with a liquid crystal shutter.
  • the liquid crystal shutter can control the light transmittance according to the applied voltage.
  • the liquid crystal shutter may be composed of a plurality of pixels and may control the light transmittance in each pixel.
  • the liquid crystal shutter may be formed in an arbitrary shape with a region having a high light transmittance or a region having a low light transmittance.
  • the light transmitting region 533b may be a region having a transmittance equal to or higher than the first predetermined value.
  • the light shielding surface 533a may be a region having a transmittance equal to or lower than the second predetermined value.
  • the parallax barrier 533 defines the light beam direction, which is the propagation direction of the image light IL emitted from the sub-pixel, for each of a plurality of band-shaped regions extending in a predetermined direction in the active area 532a.
  • the image light IL in which the light beam direction is defined is propagated via the projection optical system 6 to a predetermined position corresponding to the eyes of the driver sitting on the driver seat 221 of the main body 22.
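The geometry by which the gap g and the barrier pitch steer the image light IL toward the predetermined eye position follows from similar triangles. The relations below are the standard parallax-barrier design equations with assumed symbol names (interocular distance E, viewing distance d, horizontal subpixel pitch Hp); no numeric values are taken from the disclosure.

```python
def barrier_gap(E, d, n, Hp):
    # Gap g between the parallax barrier and the active area so that,
    # at viewing distance d, each eye (separation E) sees n subpixel
    # columns of horizontal pitch Hp:  E / d = n * Hp / g.
    return n * Hp * d / E

def barrier_pitch(n, Hp, d, g):
    # Horizontal pitch of one transmitting + shielding period so the
    # pattern stays aligned across the panel: Bp / (2*n*Hp) = d / (d+g).
    return 2 * n * Hp * d / (d + g)

# Illustrative numbers, all in mm.
g = barrier_gap(E=65.0, d=1000.0, n=6, Hp=0.05)
Bp = barrier_pitch(n=6, Hp=0.05, d=1000.0, g=g)
```

The pitch Bp comes out slightly smaller than 2×n×Hp, which is what makes the visible bands converge on the two eye positions instead of repeating at the panel's own period.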
  • as shown in FIG. 6, the image light IL of half of the right eye image displayed on the subpixel P7, the entire right eye images displayed on the subpixels P8 to P12, and half of the left eye image displayed on the subpixel P1 is propagated to the right eye of the driver via the projection optical system 6.
  • the right eye of the driver therefore visually recognizes half of the right eye image displayed on the subpixel P7, the entire right eye images displayed on the subpixels P8 to P12, and half of the left eye image displayed on the subpixel P1.
  • the sub-pixel area displaying the left eye image viewed by the driver's left eye is maximized, and the sub-pixel area displaying the right eye image is minimized.
  • the subpixel area displaying the right eye image viewed by the driver's right eye is maximized, and the subpixel area displaying the left eye image is minimized. Therefore, the driver visually recognizes a virtual image of the three-dimensional image in a state where crosstalk is most suppressed.
  • the controller 54 is connected to each component of the display system 1 and can control each component.
  • the components controlled by the controller 54 include a display panel 532.
  • the controller 54 is configured as a processor, for example.
  • the controller 54 may include one or more processors.
  • the processor may include a general-purpose processor that reads a specific program and executes a specific function, and a dedicated processor specialized for a specific process.
  • the dedicated processor may include an ASIC (Application Specific Integrated Circuit).
  • the processor may include a programmable logic device (PLD: Programmable Logic Device).
  • the PLD may include an FPGA (Field-Programmable Gate Array).
  • the controller 54 may be an SoC (System-on-a-Chip) in which one or more processors cooperate, or an SiP (System-in-a-Package).
  • the controller 54 includes a storage unit, and may store various types of information or a program for operating each component of the display system 1 in the storage unit.
  • the storage unit may be configured by, for example, a semiconductor memory.
  • the storage unit may function as a work memory for the controller 54. Details of the controller 54 will be described later.
  • the projection optical system 6 can be configured to include a first optical member 61 and a second optical member 62.
  • the first optical member 61 reflects the image light IL whose propagation direction has been deflected by the second optical member 62, and transmits external light incident on the surface opposite to the surface that reflects the image light IL. Specifically, the first optical member 61 reflects the image light IL emitted from the display device 5 and reflected by the second optical member 62, and propagates it to the left eye and right eye of the driver. As shown in FIG. 1, the windshield 222 of the excavator 2 may also serve as the first optical member 61. When the image light IL is reflected by the first optical member 61 and reaches the eyes, the driver of the excavator 2 can visually recognize the virtual image VI. The first optical member 61 transmits external light incident on the windshield 222 from the side opposite to the second optical member 62.
  • the second optical member 62 deflects the propagation direction of the image light IL emitted from the display device 5. Specifically, the second optical member 62 reflects the image light IL emitted from the display panel 532 via the parallax barrier 533 to reach the first optical member 61.
  • the second optical member 62 may include one or more mirrors and lenses. When the second optical member 62 includes a mirror, for example, the mirror included in the second optical member 62 may be a concave mirror. In FIG. 2, the second optical member 62 is displayed as one mirror. However, the present invention is not limited to this, and the second optical member 62 may be configured by combining one or more mirrors, lenses, and other optical elements.
  • the driving device 7 changes the position and posture of the second optical member 62.
  • the driving device 7 changes the position and posture of the second optical member 62 based on the position of the excavator 2 detected by the position detection device 3 and the position of the work area indicated by the work information stored in the memory 52. Specifically, when the distance between the position of the excavator 2 and the position of the work area is equal to or greater than a predetermined value, the driving device 7 may change the position and posture of the second optical member 62 so that the image light IL reaches an area around the center of the windshield 222. The predetermined value will be described in detail later.
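The distance rule above can be sketched as a simple threshold comparison. The numeric predetermined value and the function name are placeholder assumptions; the disclosure only states that such a value exists.

```python
THRESHOLD_M = 20.0  # placeholder for the "predetermined value" (m)

def windshield_target_region(machine_pos, work_area_pos):
    # Choose where the image light IL should reach on the windshield
    # 222, based on the planar distance between the excavator 2 and
    # the work area (sketch of the rule described above).
    dx = machine_pos[0] - work_area_pos[0]
    dy = machine_pos[1] - work_area_pos[1]
    distance = (dx * dx + dy * dy) ** 0.5
    return "center" if distance >= THRESHOLD_M else "below-center"
```

The driving device 7 would then move the second optical member 62 until the reflected beam lands in the selected region.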
  • when the distance is less than the predetermined value, the driving device 7 may change the position and posture of the second optical member 62 so that the image light IL reaches an area below the area around the center of the windshield 222.
  • the controller 54 displays the parallax image on the display panel 532.
  • the controller 54 causes the display panel 532 to display a parallax image based on the posture detected by the posture detection device 4 and a predetermined position corresponding to the position of the driver's eyes.
  • the predetermined position is a position where the driver's eyes are expected to be positioned when the driver is seated on the main body 22.
  • the controller 54 may acquire posture information indicating the posture of the main body unit 22 detected by the posture detection device 4 and received by the communication unit 51. Based on the posture of the main body 22 with respect to the horizontal plane indicated by the posture information, the controller 54 may display on the display panel 532 a parallax image indicating the horizontal plane, in accordance with the predetermined position corresponding to the eyes of the driver seated on the main body 22.
  • for example, when the inclination angle of the main body portion 22 with respect to the horizontal plane is +10 degrees as shown in FIG. 7B, the controller 54 displays the parallax image so that the driver visually recognizes a virtual plane extending forward and downward compared with the case where the inclination angle is 0 degrees as shown in FIG. 7A. For example, when the inclination angle is −5 degrees as shown in FIG. 7C, the controller 54 displays the parallax image so that the driver visually recognizes a virtual plane extending forward compared with the case shown in FIG. 7A.
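One way to read FIGS. 7A to 7C is that the drawn horizontal plane is counter-rotated by the detected tilt so that it stays level in the world frame. The sketch below is an interpretation under that assumption, not the disclosed algorithm.

```python
import math

def level_direction_in_body_frame(body_tilt_deg):
    # Unit vector of the world-horizontal forward direction expressed
    # in the tilted main-body frame. For +10 degrees of body tilt the
    # displayed plane points forward and downward relative to the cab,
    # consistent with FIG. 7B.
    t = math.radians(-body_tilt_deg)
    return (math.cos(t), math.sin(t))
```

The controller would render the virtual plane along this counter-rotated direction so that, seen through the windshield, it coincides with the true horizontal regardless of how the main body is inclined.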
  • the controller 54 may display a parallax image on the display unit 53 based on the position of the design surface and the position and orientation of the main body unit 22. Specifically, the controller 54 may acquire position information indicating the position of the main body unit 22 detected by the position detection device 3 and received by the communication unit 51. The controller 54 may acquire posture information indicating the posture of the main body unit 22 detected by the posture detection device 4 and received by the communication unit 51. The controller 54 may acquire design information stored in the memory 52. The controller 54 may acquire design information received from an external device by the communication unit 51.
  • the controller 54 may display a parallax image indicating the design surface on the display unit 53 based on the position of the design surface and the position and orientation of the main body unit 22. Specifically, the controller 54 superimposes the parallax image on the display unit 53 so that the driver can visually recognize the virtual image VI of the parallax image indicating the design surface superimposed on a scene visually recognized on the outside of the windshield 222. You may display.
  • for example, when the inclination angle of the main body 22 with respect to the horizontal plane is +10 degrees as shown in FIG. 8B, the controller 54 displays the parallax image so that the driver visually recognizes a design surface extending forward compared with the case where the inclination angle is 0 degrees as shown in FIG. 8A.
  • for example, when the inclination angle is −5 degrees as shown in FIG. 8C, the controller 54 displays the parallax image so that the driver visually recognizes a virtual design surface extending forward compared with the case where the inclination angle is 0 degrees as shown in FIG. 8A.
  • the controller 54 may cause the driver to visually recognize a virtual image VI indicating a reachable range of the work implement 23 by rotating each part of the work implement 23 with the main body portion 22 fixed.
  • as shown in FIG. 9A, when each part of the work machine 23 is rotated so that the bucket 233 is farthest forward from the main body 22, the tip of the bucket 233 reaches the position A. At this time, the driver visually recognizes the actual bucket 233 as shown in FIG. 10A outside the windshield 222.
  • as shown in FIG. 9B, when each part of the work machine 23 is rotated so that the bucket 233 is closer to the ground than in the state shown in FIG. 9A, the tip of the bucket 233 reaches the position B. At this time, the driver visually recognizes the actual bucket 233 as shown in FIG. 10B outside the windshield 222.
  • as shown in FIG. 9C, when each part of the work machine 23 is rotated so that the bucket 233 excavates the ground, the tip of the bucket 233 reaches the position C. At this time, the driver visually recognizes the actual bucket 233 as shown in FIG. 10C outside the windshield 222.
  • the controller 54 may calculate the reachable range of the bucket 233 with the main body 22 fixed, in accordance with the movable ranges of the boom 231, the arm 232, and the bucket 233, based on the position of the main body 22.
  • the controller 54 may display a parallax image indicating the reachable range of the bucket 233 on the display unit 53. Specifically, the controller 54 displays a parallax image on the display unit 53 so that the driver can visually recognize the virtual image VI that indicates the reachable range of the bucket 233 as indicated by a two-dot chain line in FIG. 11. It's okay.
  • the controller 54 may calculate the reachable range of the bucket 233 based on the position of the main body 22 and the posture of the boom 231 while the main body 22 and the boom 231 are fixed. At this time, the controller 54 can calculate the reachable range of the bucket 233 using the movable range of the arm 232 and the bucket 233.
  • The controller 54 may cause the display unit 53 to display a parallax image indicating this reachable range of the bucket 233. Specifically, the controller 54 may display the parallax image on the display unit 53 so that the driver visually recognizes a virtual image VI indicating the reachable range of the bucket 233.
  • The controller 54 may cause the display unit 53 to display a parallax image indicating a design surface based on the position and posture of the main body 22 and the position of the design surface acquired from outside. Specifically, based on the position and posture of the main body 22 and the position of the design surface, the controller 54 may calculate the portion of the design surface that the bucket 233 can reach in accordance with the movable ranges of the parts of the work implement 23.
  • As illustrated in FIG. 12A, the bucket 233 does not reach the region of the design surface that is horizontally distant from the main body 22. As illustrated in FIGS. 12B and 12C, the bucket 233 does reach the region of the design surface that is horizontally close to the main body 22.
  • The controller 54 may cause the display unit 53 to display a parallax image that distinguishes the portion of the design surface that the bucket 233 can reach from the portion that it cannot reach. Specifically, the controller 54 may display the parallax image on the display unit 53 so that the driver's eyes visually recognize a virtual image VI indicating the reachable portion of the design surface, as indicated by the two-dot chain line in FIG. 13.
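One simple way to separate reachable from unreachable portions of a design surface is to approximate the bucket-tip envelope as an annulus around the boom attachment point and test each sampled surface point against it. The radii and the annular approximation below are assumptions for illustration, not part of this disclosure.

```python
import math

MAX_REACH = 9.0   # outer radius of the bucket-tip envelope (m) -- hypothetical
MIN_REACH = 2.5   # inner radius; points closer than this cannot be reached

def classify_surface(points, pivot):
    """Split design-surface sample points into those the bucket tip can and
    cannot reach, using a simple annular approximation of the envelope."""
    px, py = pivot
    reachable, unreachable = [], []
    for x, y in points:
        d = math.hypot(x - px, y - py)
        (reachable if MIN_REACH <= d <= MAX_REACH else unreachable).append((x, y))
    return reachable, unreachable
```

The reachable list would then be rendered as the highlighted (two-dot chain line) portion of the design surface, and the rest left unhighlighted.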
  • The controller 54 may change the position and/or posture of the second optical member 62 so as to change the region of the first optical member 61 that the image light IL reaches. Specifically, the controller 54 may cause the driving device 7 to change the position and angle of the second optical member 62 based on the position of the main body 22. More specifically, the controller 54 may cause the driving device 7 to change the position and/or the angle of the second optical member 62 based on the position of the main body 22 and the position of the work area.
  • The controller 54 may cause the image light IL to be reflected by the central region 61a of the first optical member 61 so that the image light ILa reaches the predetermined position corresponding to the driver's eyes.
  • The predetermined value may be, for example, the maximum distance from the work area at which the main body 22 should be positioned in order to operate the work implement 23 on the work area.
  • The predetermined value may be, for example, the distance between the bucket 233 and the main body 22 when the bucket 233 is horizontally farthest from the main body 22.
  • In this case, the controller 54 may control the driving device 7 so that the second optical member 62 takes the position and posture indicated by the second optical member 62a. Accordingly, the driver's eyes visually recognize the virtual image VIa outside the central region 61a of the first optical member 61.
  • The controller 54 may cause the image light IL to be reflected by the lower region 61b of the first optical member 61 so that the image light ILb reaches the predetermined position corresponding to the driver's eyes.
  • In this case, the controller 54 may control the driving device 7 so as to change the second optical member 62 to the position and posture indicated by the second optical member 62b. Accordingly, the driver's eyes visually recognize the virtual image VIb outside the lower region 61b of the first optical member 61.
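The selection between the two mirror poses described above (62a aiming the image light at the central region 61a, 62b aiming it at the lower region 61b) reduces to comparing the body-to-work-area distance against the predetermined value. A minimal Python sketch of that decision, with hypothetical pose representations:

```python
# Hypothetical poses for the second optical member; the disclosure labels
# these 62a (image light to central region 61a) and 62b (to lower region 61b).
POSE_CENTRAL = {"member_pose": "62a", "region": "61a"}
POSE_LOWER = {"member_pose": "62b", "region": "61b"}

def select_mirror_pose(body_pos, work_area_pos, threshold):
    """Choose the second-optical-member pose from the horizontal distance
    between the main body and the work area; `threshold` is the predetermined
    value (e.g. the maximum working distance of the bucket)."""
    dx = work_area_pos[0] - body_pos[0]
    dy = work_area_pos[1] - body_pos[1]
    distance = (dx * dx + dy * dy) ** 0.5
    # Far from the work area: the driver is expected to travel toward it and
    # look ahead, so reflect the image light off the central region 61a.
    return POSE_CENTRAL if distance >= threshold else POSE_LOWER
```

The chosen pose would then be handed to the driving device 7 to move the second optical member.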
  • As described above, the display system 1 for a construction machine includes the display unit 53, which displays parallax images.
  • The display system 1 also includes the first optical member 61, which reflects the image light IL whose propagation direction has been deflected by the second optical member 62 and transmits external light incident on the surface opposite to the surface that reflects the image light IL, and the controller 54, which causes the display unit 53 to display parallax images. Therefore, when the eyes of the driver of the excavator 2 are located on the side of the first optical member 61 that reflects the image light IL, the driver's eyes visually recognize the virtual image VI formed by the image light IL reflected by the first optical member 61. Since external light transmitted through the first optical member 61 also reaches the driver's eyes, the driver sees the scene on the far side of the first optical member 61. As a result, the driver's eyes visually recognize the virtual image VI of the parallax image displayed on the display unit 53 while viewing the scene outside the excavator 2. If the driver had to move his or her line of sight to a display to check an image during operation of the excavator 2, attention to the work area could be reduced; in the present embodiment, however, the driver can grasp information necessary for operating the excavator 2, such as that shown in the parallax image, without interrupting observation of the outside of the excavator 2.
  • The controller 54 acquires the posture of the main body 22 and displays the parallax image on the display unit 53 based on that posture.
  • When the posture of the main body 22 changes, the position, relative to the driver, of the scene outside the excavator 2 changes.
  • If the parallax image is displayed so that its virtual image VI is superimposed on the scene outside the excavator 2 seen by the driver, a change in the posture of the main body 22 can prevent the virtual image VI from being superimposed at the appropriate position in the scene.
  • Because the controller 54 displays the parallax image based on the posture of the main body 22, the driver can visually recognize the virtual image VI of the parallax image at the appropriate position.
  • The controller 54 displays on the display unit 53 a parallax image indicating a horizontal plane based on the posture of the main body 22. The driver can therefore visually recognize the virtual image VI of a parallax image showing the horizontal plane outside the excavator 2. Thus, even if the main body 22 in which the driver is seated is inclined with respect to the horizontal plane, the driver can recognize the horizontal plane outside the excavator 2 and can accurately operate the work implement 23 with reference to it.
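Drawing a horizontal plane that stays level while the main body is inclined amounts to counter-rotating the drawn line by the body's measured roll. The sketch below shows this for a single horizon line on a 2-D display surface; it is an illustrative simplification (a real system would also correct for pitch and the driver's eye position).

```python
import math

def horizon_endpoints(width, height, body_roll_deg):
    """Endpoints of the virtual-horizon line on a display of the given size:
    the line is centred and counter-rotated by the body's roll so that it
    stays level in the world. Sketch only -- pitch and eye position are
    ignored here."""
    cx, cy = width / 2.0, height / 2.0
    a = math.radians(-body_roll_deg)  # counter-rotate against the roll
    dx = math.cos(a) * width / 2.0
    dy = math.sin(a) * width / 2.0
    return (cx - dx, cy - dy), (cx + dx, cy + dy)
```

With the body level, the line is drawn straight across the display; when the body rolls, the line tilts the opposite way so that its virtual image remains horizontal in the scene.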
  • The controller 54 displays on the display unit 53 a parallax image indicating the area that the work implement 23 can reach by rotating. The driver can therefore visually recognize, superimposed on the scene outside the excavator 2, a virtual image VI of a parallax image indicating the area the work implement 23 can reach when operated. The driver can thus recognize the workable range before operating the work implement 23 and can change the position and/or posture of the excavator 2 as necessary, allowing the excavator 2 to be operated efficiently.
  • The controller 54 causes the display unit 53 to display a parallax image indicating the area that the work implement 23 can reach by rotating the arm 232 while the boom 231 is fixed to the main body 22. The driver can therefore visually recognize, superimposed on the scene outside the excavator 2, a virtual image VI of a parallax image indicating the area the work implement 23 can reach when the arm 232 is rotated with the boom 231 fixed. The driver can thus recognize the workable range before operating the arm 232 and can change the position of the boom 231 as necessary, allowing the excavator 2 to be operated efficiently.
  • The controller 54 displays a parallax image indicating the design surface based on the position and posture of the main body 22 and the position of the design surface. The driver can therefore visually recognize the virtual image VI of a parallax image indicating the design surface outside the excavator 2. Thus, even if the main body 22 in which the driver is seated is tilted, the driver can recognize the design surface outside the excavator 2 and can accurately operate the work implement 23 with reference to it.
  • The controller 54 displays on the display unit 53 a parallax image indicating the portion of the design surface that the work implement 23 can reach by rotating the arm 232 while the boom 231 is fixed to the main body 22. The driver can therefore recognize the workable portion of the design surface before operating the arm 232 and can change the position of the boom 231 as necessary, allowing the excavator 2 to be operated efficiently.
  • The display system 1 further includes the driving device 7, which changes the position and posture of the second optical member 62. The position of the virtual image VI visually recognized by the driver's eyes can therefore be changed.
  • The interior of the main body 22 may not be wide enough for the display device 5 and the second optical member 62 to be arranged so that the image light IL can reach the entire surface of the windshield 222, which also serves as the first optical member 61. Even so, the driver can visually recognize the virtual image VI at an appropriate position because the region of the windshield 222 that the image light IL reaches can be changed.
  • The controller 54 acquires the position of the work area where the work implement 23 is operated, and causes the driving device 7 to change the position and posture of the second optical member 62 based on the distance between the position of the main body 22 and the position of the work area. For example, when this distance is equal to or greater than a predetermined value, the driver is expected to move the excavator 2 toward the work area. In this case, having the image light IL reach the central region 61a of the first optical member 61, which lies in the driver's viewing direction while moving, makes it easier for the driver, who is looking into the distance outside the main body 22, to visually recognize the virtual image VI.
  • When the distance between the position of the main body 22 and the position of the work area is less than the predetermined value, the driver is expected to observe the work area outside the excavator 2. In this case, having the image light IL reach the region of the first optical member 61 corresponding to the height of the work area makes it easier for the driver to visually recognize the virtual image VI.
  • The display device 5 may instead be a two-dimensional display device. In this case, the display unit 53 of the display device 5 does not include the irradiator 531 and the parallax barrier 533.
  • The controller 54 may display a two-dimensional image on the display unit 53 so that the driver can visually recognize the virtual image VI of the two-dimensional image. The two-dimensional image may be, for example, a cross-sectional view showing the work area together with the horizontal plane and the design surface relative to it.
  • In the above embodiment, the subpixel groups Pg are repeatedly arranged in the vertical direction at positions shifted by one subpixel in the horizontal direction, but they may instead be repeatedly arranged in the vertical direction without a horizontal shift. In that case, the light-shielding faces 533a of the parallax barrier 533, extending in the vertical direction, may be arranged at predetermined intervals in the horizontal direction.
  • In the above embodiment, the optical element of the display device 5 is the parallax barrier 533, but the optical element is not limited to this.
  • The optical element included in the display device 5 may instead be a lenticular lens. In this case, the lenticular lens is configured by arranging cylindrical lenses 9 in a plane parallel to the active area 532a.
  • The lenticular lens propagates the image light IL emitted from the subpixels in the left visible region 532aL to the position of the driver's left eye, and propagates the image light IL emitted from the subpixels in the right visible region 532aR to the position of the driver's right eye.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Mining & Mineral Resources (AREA)
  • Civil Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Structural Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Component Parts Of Construction Machinery (AREA)

Abstract

A display system (1) is provided for a construction machine (2) that includes a working machine (23) and a main body (22) having the working machine attached thereto. The display system (1) comprises a display unit (53), a first optical member (61), and a controller (54). The display unit (53) displays disparity images with disparity therebetween. The first optical member (61) reflects the image light (IL) from the disparity images displayed on the display unit (53) to a given position corresponding to the position of the eyes of a driver driving the construction machine (2). The first optical member (61) transmits external light incident on the surface on the opposite side of the reflection surface for the image light. The controller (54) displays the disparity images on the display unit (53).

Description

Display system

Cross-reference of related applications
This application claims priority to Japanese Patent Application No. 2018-103387 filed on May 30, 2018, the entire disclosure of which is incorporated herein by reference.
This disclosure relates to a display system.
Conventionally, a construction machine equipped with a work implement, such as a hydraulic excavator, is known to include a display device that displays information (see Patent Literature 1).
JP 2017-179961 A
The display system of the present disclosure is a display system for a construction machine that includes a work implement and a main body to which the work implement is attached. The display system includes a display unit, a first optical member, and a controller. The display unit displays parallax images having parallax with each other. The first optical member reflects the image light of the parallax images displayed on the display unit toward a predetermined position corresponding to the position of the eyes of a driver of the construction machine. The first optical member transmits external light incident on the surface opposite to the surface that reflects the image light. The controller causes the display unit to display the parallax images.
FIG. 1 is a diagram showing a schematic configuration of an excavator equipped with the display system according to the present embodiment. FIG. 2 is a diagram showing a schematic configuration of the display device shown in FIG. 1. FIG. 3 is a diagram showing an example of the display panel shown in FIG. 2 viewed from the normal direction of the active area. FIG. 4 is a diagram showing the parallax barrier shown in FIG. 2. FIG. 5 is a diagram for explaining a virtual image visually recognized by the driver's left eye on the display panel shown in FIG. 1. FIG. 6 is a diagram for explaining a virtual image visually recognized by the driver's right eye on the display panel shown in FIG. 1. FIG. 7A is a diagram illustrating the relationship between the excavator and the horizontal plane when the angle of the excavator with respect to the horizontal plane is 0 degrees. FIG. 7B is a diagram illustrating the relationship between the excavator and the horizontal plane when the angle of the excavator with respect to the horizontal plane is 10 degrees. FIG. 7C is a diagram illustrating the relationship between the excavator and the horizontal plane when the angle of the excavator with respect to the horizontal plane is -5 degrees. FIG. 8A is a diagram illustrating the relationship between the excavator and the design surface when the angle of the excavator with respect to the horizontal plane is 0 degrees. FIG. 8B is a diagram illustrating the relationship between the excavator and the design surface when the angle of the excavator with respect to the horizontal plane is 10 degrees. FIG. 8C is a diagram illustrating the relationship between the excavator and the design surface when the angle of the excavator with respect to the horizontal plane is -5 degrees. FIG. 9A is a diagram illustrating a state in which the work implement is extended so that the bucket is forward of and away from the main body. FIG. 9B is a diagram illustrating a state in which the work implement is extended so that the bucket is closer to the ground than in the state shown in FIG. 9A. FIG. 9C is a diagram illustrating a state in which the work implement is extended so that the bucket excavates the ground. FIG. 10A is a diagram illustrating the scene visually recognized by the driver when the work implement is in the state shown in FIG. 9A. FIG. 10B is a diagram illustrating the scene visually recognized by the driver when the work implement is in the state shown in FIG. 9B. FIG. 10C is a diagram illustrating the scene visually recognized by the driver when the work implement is in the state shown in FIG. 9C. FIG. 11 is a diagram illustrating an example of a virtual image, visually recognized by the driver, indicating the range that the work implement can reach. FIG. 12A is a diagram illustrating a state in which the work implement is extended so that the bucket is forward of and away from the main body. FIG. 12B is a diagram illustrating a state in which the work implement is extended so that the bucket is closer to the main body than in the state shown in FIG. 12A. FIG. 12C is a diagram illustrating a state in which the work implement is extended so that the bucket is closer to the main body than in the state shown in FIG. 12B. FIG. 13 is a diagram illustrating an example of a virtual image, visually recognized by the driver, indicating the portion of the design surface that the work implement can reach. FIG. 14 is a diagram for explaining the relationship between the position and posture of the mirror and the region of the windshield that the image light reaches.
In conventional display devices, it is desirable to improve the convenience with which a driver operating a construction machine recognizes information.
The present disclosure provides a display system that can improve the convenience with which a driver operating a construction machine recognizes information.
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings.
The display system 1 according to the present embodiment may be mounted on an excavator (construction machine) 2, as shown in FIG. 1. The excavator 2 includes a traveling unit 21, a main body 22, and a work implement 23.
The traveling unit 21 is, for example, a crawler-type traveling device. The traveling unit 21 is connected to the main body 22 below the main body 22. The traveling unit 21 includes drive wheels 211, idler wheels 212, and crawlers 213 wound around the drive wheels 211 and the idler wheels 212. The traveling unit 21 moves over the ground in the extending direction of the crawlers 213 when the drive wheels 211 rotate the crawlers 213 in that direction.
The main body 22 is connected to the traveling unit 21 above it, and therefore moves as the traveling unit 21 moves. The main body 22 is provided so as to be able to swivel in a plane parallel to the ground. The main body 22 includes a driver's seat 221 for the driver to sit on and a windshield 222 provided in front of the seated driver. The main body 22 also includes operation units, such as operation levers, for operating the traveling unit 21 and the work implement 23.
The work implement 23 is attached to the main body 22 so as to be rotatable about its attachment position, and changes its posture by rotating. The work implement 23 includes a boom (first portion) 231, an arm (second portion) 232, and a bucket (third portion) 233.
The boom 231 is attached at one longitudinal end to the side of the main body 22. The boom 231 is attached so as to be rotatable about its attachment position on the main body 22 in response to the driver's operation, and changes its posture by rotating.
The arm 232 is attached at one longitudinal end to the boom 231, and is attached so as to be rotatable about its attachment position on the boom 231. The position of the arm 232 is changed by the rotation of the boom 231, and the arm 232 changes its posture by rotating.
The bucket 233 is attached to the end of the arm 232 opposite the end attached to the boom 231, and is attached so as to be rotatable about its attachment position on the arm 232. The position of the bucket 233 is changed by the rotation of the boom 231 and the arm 232, and the bucket 233 changes its posture by rotating.
The display system 1 includes a position detection device 3, a posture detection device 4, a display device 5, a projection optical system 6, and a driving device 7. Part of the configuration of the display system 1 may be shared with other devices or components of the excavator 2.
The position detection device 3 detects its own position, which corresponds to the position of the excavator 2 on which it is mounted; the position detection device 3 therefore detects the position of the excavator 2 by detecting its own position. The position detection device 3 includes, for example, a receiver of a global navigation satellite system (GNSS), and detects its position based on signals the receiver receives from satellites. Such satellite positioning systems include GPS (Global Positioning System), GLONASS, Galileo, and the Quasi-Zenith Satellite System. When it detects the position of the excavator 2, the position detection device 3 outputs information indicating that position (hereinafter also referred to as "position information") to the display device 5 and the driving device 7.
The posture detection device 4 detects the posture of the main body 22 of the excavator 2 on which it is mounted. The posture is, for example, an inclination angle with respect to the horizontal plane. The posture detection device 4 may be, for example, a tilt sensor, which measures gravitational acceleration and determines the posture based on the measured gravitational acceleration. When it detects the posture of the excavator 2, the posture detection device 4 outputs information indicating the posture (hereinafter also referred to as "posture information") to the display device 5 and the driving device 7.
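The disclosure states only that the tilt sensor determines the posture from measured gravitational acceleration. A common way to do this (shown here as an assumption, not a quotation of the patent) is to compute pitch and roll from a 3-axis accelerometer reading of the gravity vector:

```python
import math

def pitch_roll_from_gravity(ax, ay, az):
    """Pitch and roll (degrees) of the body from a 3-axis accelerometer
    reading of the gravity vector, in a sensor frame with x forward,
    y left, z up. Standard static-tilt approximation; it is only valid
    while the body is not otherwise accelerating."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

The resulting angles would be reported as the posture information sent to the display device 5 and the driving device 7.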
The display device 5 is, for example, a three-dimensional display device. As shown in FIG. 2, the display device 5 includes a communication unit 51, a memory 52, a display unit 53, and a controller 54.
The communication unit 51 may receive the position information indicating the position of the excavator 2 transmitted from the position detection device 3, and may receive the posture information indicating the posture of the main body 22 transmitted from the posture detection device 4. The communication unit 51 may also receive design information from an external device. The design information is, for example, information indicating the dimensions and position of the design surface to be formed by the excavator 2.
The memory 52 includes any storage device, such as a RAM (Random Access Memory) and a ROM (Read Only Memory). The memory 52 may store the design information.
The display unit 53 displays parallax images having parallax with each other. The display unit 53 includes an irradiator 531, a display panel 532, and a parallax barrier 533 as an optical element.
The irradiator 531 can irradiate the surface of the display panel 532. The irradiator 531 may include a light source, a light guide plate, a diffusion plate, a diffusion sheet, and the like. The irradiator 531 emits light from the light source, makes the light uniform in the surface direction of the display panel 532 using the light guide plate, the diffusion plate, the diffusion sheet, and the like, and emits the uniformized light toward the display panel 532.
The display panel 532 may display a left-eye image (first image) and a right-eye image (second image) having parallax with each other. The display panel 532 may be, for example, a transmissive liquid crystal display panel. As shown in FIG. 3, the display panel 532 has, in an active area 532a formed on the surface of the plate-shaped panel, a plurality of partitioned regions partitioned in a first direction and in a second direction orthogonal to the first direction. The direction orthogonal to both the first and second directions is referred to as the third direction. The first direction may be referred to as the horizontal direction, the second direction as the vertical direction, and the third direction as the depth direction, although the three directions are not limited to these. In the drawings, the first, second, and third directions are represented as the x-axis, y-axis, and z-axis directions, respectively.
Each of the partitioned regions corresponds to one subpixel. The active area 532a therefore includes a plurality of subpixels arranged in a grid along the horizontal and vertical directions.
Each subpixel corresponds to one of the colors R (red), G (green), and B (blue), and a set of three subpixels R, G, and B can constitute one pixel. The horizontal direction is, for example, the direction in which the subpixels constituting one pixel are arranged, and the vertical direction is, for example, the direction in which subpixels of the same color are arranged. The display panel 532 is not limited to a transmissive liquid crystal panel; other display panels, such as organic EL panels, may be used. When a self-luminous display panel is used as the display panel 532, the display device 5 need not include the irradiator 531.
The plurality of subpixels arranged in the active area 532a as described above constitute subpixel groups Pg. The subpixel groups Pg are repeatedly arranged adjacent to one another in the horizontal direction and, in the vertical direction, are repeatedly arranged adjacent to positions shifted by one subpixel in the horizontal direction.
 A subpixel group Pg includes subpixels in predetermined rows and columns. Specifically, a subpixel group Pg includes (2 × n × b) subpixels P1 to P(2 × n × b) arranged consecutively, b subpixels (b rows) in the vertical direction and 2 × n subpixels (2 × n columns) in the horizontal direction. In the example shown in FIG. 3, the active area 532a contains subpixel groups Pg each including twelve subpixels P1 to P12 arranged consecutively, one in the vertical direction and twelve in the horizontal direction. In the example shown in FIG. 2, reference signs are given to only some of the subpixel groups Pg.
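The row-and-column arrangement described above can be sketched in a few lines of Python. This is an illustrative model only, not part of the patent; the parameters follow the example with b = 1 and n = 6, and the one-subpixel horizontal shift between vertically adjacent groups is an assumed indexing rule consistent with the description.

```python
def subpixel_id(row, col, n=6, b=1):
    """Return the identification index (1-based, P1..P(2*n*b)) of the
    subpixel at (row, col) of the active area, where vertically adjacent
    subpixel groups Pg are shifted horizontally by one subpixel."""
    width = 2 * n
    group_row = row // b              # which row of groups we are in
    shifted_col = (col - group_row) % width
    local_row = row % b
    return local_row * width + shifted_col + 1

# With b = 1 and n = 6, the first row runs P1..P12 and the next row of
# groups starts one subpixel later.
row0 = [subpixel_id(0, c) for c in range(12)]
row1 = [subpixel_id(1, c) for c in range(12)]
```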
 A subpixel group Pg is the minimum unit by which the controller 54 performs control for displaying an image. The subpixels P1 to P(2 × n × b) having the same identification index in all the subpixel groups Pg are controlled simultaneously and in the same manner by the controller 54. For example, when switching the image displayed on subpixel P1 from the left eye image (first image) to the right eye image (second image), the controller 54 simultaneously switches the images displayed on the subpixels P1 in all the subpixel groups Pg from the left eye image to the right eye image.
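The control rule above can be expressed as a minimal sketch: the controller addresses subpixels by identification index, so reassigning one index switches that subpixel in every group Pg at once. The class and attribute names are illustrative, not from the patent.

```python
class ParallaxPanelController:
    """Toy model of the per-index control described for controller 54."""

    def __init__(self, n=6, b=1):
        self.num_ids = 2 * n * b
        # Image source per identification index:
        # "L" = left eye image (first image), "R" = right eye image (second image).
        self.source = {i: ("L" if i <= n * b else "R")
                       for i in range(1, self.num_ids + 1)}

    def switch(self, subpixel_index, new_source):
        # One assignment affects P<subpixel_index> in ALL subpixel groups Pg
        # simultaneously, since they share the same identification index.
        self.source[subpixel_index] = new_source

ctrl = ParallaxPanelController()
ctrl.switch(1, "R")  # P1 now shows the right eye image in every group Pg
```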
 As shown in FIG. 2, the parallax barrier 533 is formed by a plane along the active area 532a. The parallax barrier 533 is arranged at a predetermined distance (gap) g from the active area 532a. The parallax barrier 533 may be positioned on the side of the display panel 532 opposite to the irradiator 531, or on the irradiator 531 side of the display panel 532.
 As shown in FIG. 4, the parallax barrier 533 defines, for each of a plurality of band-shaped translucent regions 533b extending in a predetermined in-plane direction, the ray direction that is the propagation direction of the image light IL emitted from the subpixels. The predetermined direction is a direction forming a predetermined non-zero angle with the vertical direction. By the parallax barrier 533 thus constraining the image light IL emitted from the subpixels, the regions on the active area 532a that are visible to the driver's eyes are determined. Hereinafter, such a region is referred to as a visible region. The region on the active area 532a visible to the driver's left eye is referred to as the left visible region 532aL (first visible region). The region on the active area 532a visible to the driver's right eye is referred to as the right visible region 532aR (second visible region).
 Specifically, the parallax barrier 533 has a plurality of light-blocking faces 533a that block the image light IL. The light-blocking faces 533a define the translucent regions 533b between adjacent light-blocking faces 533a. The translucent regions 533b have a higher light transmittance than the light-blocking faces 533a, and the light-blocking faces 533a have a lower light transmittance than the translucent regions 533b.
 The translucent regions 533b are the portions that transmit light incident on the parallax barrier 533. The translucent regions 533b may transmit light with a transmittance equal to or higher than a first predetermined value. The first predetermined value may be, for example, 100% or a value close to 100%. The light-blocking faces 533a are the portions that block light incident on the parallax barrier 533 and do not transmit it. In other words, the light-blocking faces 533a block the image displayed by the display device 5. The light-blocking faces 533a may block light with a transmittance equal to or lower than a second predetermined value. The second predetermined value may be, for example, 0% or a value close to 0%.
 The translucent regions 533b and the light-blocking faces 533a extend in a predetermined direction along the active area 532a, and are arranged alternately and repeatedly in the direction orthogonal to the predetermined direction. The translucent regions 533b define the ray direction of the image light IL emitted from the subpixels.
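The alternating stripe pattern described above can be sketched as a binary mask. All geometry here is assumed for illustration (stripe pitch, opening width, and slope are not specified numerically in the patent); the nonzero `slope` models the predetermined non-zero angle between the stripes and the vertical direction.

```python
def barrier_mask(width, height, pitch=6, open_width=3, slope=1):
    """Binary parallax-barrier mask: 1 = translucent region 533b,
    0 = light-blocking face 533a. Stripes repeat with period `pitch`
    across the stripe direction and are tilted by `slope` per row."""
    mask = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            # Position measured across the stripes, shifted per row so the
            # stripes are not parallel to the vertical direction.
            phase = (x + slope * y) % pitch
            mask[y][x] = 1 if phase < open_width else 0
    return mask

m = barrier_mask(12, 4)
```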
 The parallax barrier 533 may be composed of a film or a plate-shaped member having a transmittance less than the second predetermined value. In that case, the light-blocking faces 533a are formed by the film or plate-shaped member, and the translucent regions 533b are formed by openings provided in the film or plate-shaped member. The film may be made of resin or another material. The plate-shaped member may be made of resin, metal, or another material. The parallax barrier 533 is not limited to a film or plate-shaped member and may be composed of other types of members. The base material of the parallax barrier 533 may itself be light-blocking, or the base material may contain a light-blocking additive.
 The parallax barrier 533 may be composed of a liquid crystal shutter. A liquid crystal shutter can control its light transmittance according to the applied voltage. The liquid crystal shutter may be composed of a plurality of pixels and control the light transmittance of each pixel, so that regions of high or low light transmittance can be formed in arbitrary shapes. When the parallax barrier 533 is composed of a liquid crystal shutter, the translucent regions 533b may be regions having a transmittance equal to or higher than the first predetermined value, and the light-blocking faces 533a may be regions having a transmittance equal to or lower than the second predetermined value.
 Thus, the parallax barrier 533 defines, for each of a plurality of band-shaped regions extending in the predetermined direction within the active area 532a, the ray direction that is the propagation direction of the image light IL emitted from the subpixels. The image light IL whose ray direction has been defined is propagated via the projection optical system 6 to predetermined positions corresponding to the eyes of the driver seated in the driver's seat 221 of the main body 22.
 In such a configuration, for example, when the left eye image (first image) is displayed on subpixels P1 to P6 and the right eye image (second image) is displayed on subpixels P7 to P12, the left eye and the right eye each visually recognize a virtual image.
 Specifically, as shown in FIG. 5, the image light IL from half of the left eye image displayed on subpixel P1, the whole of the left eye images displayed on subpixels P2 to P6, and half of the right eye image displayed on subpixel P7 is propagated via the projection optical system 6 to the driver's left eye. As a result, the driver's left eye visually recognizes a virtual image of half of the left eye image displayed on subpixel P1, the whole of the left eye images displayed on subpixels P2 to P6, and half of the right eye image displayed on subpixel P7. In FIGS. 5 and 6, subpixels displaying the left eye image are marked "L", and subpixels displaying the right eye image are marked "R".
 As shown in FIG. 6, the image light IL from half of the right eye image displayed on subpixel P7, the whole of the right eye images displayed on subpixels P8 to P12, and half of the left eye image displayed on subpixel P1 is propagated via the projection optical system 6 to the driver's right eye. As a result, the driver's right eye visually recognizes a virtual image of half of the right eye image displayed on subpixel P7, the whole of the right eye images displayed on subpixels P8 to P12, and half of the left eye image displayed on subpixel P1.
 In this state, the area of the subpixels displaying the left eye image that is visible to the driver's left eye is at its maximum, and the area of the subpixels displaying the right eye image visible to that eye is at its minimum. Likewise, the area of the subpixels displaying the right eye image that is visible to the driver's right eye is at its maximum, and the area of the subpixels displaying the left eye image visible to that eye is at its minimum. The driver therefore visually recognizes the virtual image of a three-dimensional image with crosstalk suppressed to the greatest extent.
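The crosstalk-minimal state described above can be checked numerically with a small sketch. The visible-fraction values (half of P1 and P7 to each eye, the rest whole) follow the example of FIGS. 5 and 6; the function names are illustrative, not from the patent.

```python
# P1-P6 show the left eye image, P7-P12 the right eye image (FIG. 5/6 example).
SOURCE = {p: ("L" if p <= 6 else "R") for p in range(1, 13)}

def visible_fraction(eye):
    """Fraction of each subpixel visible through the barrier to one eye."""
    if eye == "left":
        return {1: 0.5, 2: 1, 3: 1, 4: 1, 5: 1, 6: 1, 7: 0.5}
    return {7: 0.5, 8: 1, 9: 1, 10: 1, 11: 1, 12: 1, 1: 0.5}

def crosstalk(eye):
    """Share of the visible area carrying the image meant for the other eye."""
    want = "L" if eye == "left" else "R"
    vis = visible_fraction(eye)
    total = sum(vis.values())
    wrong = sum(f for p, f in vis.items() if SOURCE[p] != want)
    return wrong / total

# Each eye sees 5.5 subpixels of its own image and 0.5 of the other,
# so crosstalk is 0.5 / 6 for both eyes in this state.
left_ct = crosstalk("left")
right_ct = crosstalk("right")
```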
 The controller 54 is connected to each component of the display system 1 and can control each component. The components controlled by the controller 54 include the display panel 532. The controller 54 is configured, for example, as a processor, and may include one or more processors. The processors may include general-purpose processors that read specific programs to execute specific functions, and dedicated processors specialized for specific processing. A dedicated processor may include an application-specific integrated circuit (ASIC). A processor may include a programmable logic device (PLD), and a PLD may include an FPGA (Field-Programmable Gate Array). The controller 54 may be either an SoC (System-on-a-Chip) or an SiP (System In a Package) in which one or more processors cooperate. The controller 54 may include a storage unit in which various kinds of information, programs for operating the components of the display system 1, and the like are stored. The storage unit may be composed of, for example, a semiconductor memory, and may function as a work memory of the controller 54. Details of the controller 54 are described later.
 As shown in FIG. 2, the projection optical system 6 can be configured to include a first optical member 61 and a second optical member 62.
 The first optical member 61 reflects the image light IL whose propagation direction has been deflected by the second optical member 62, and transmits external light incident on the face opposite to the face that reflects the image light IL. Specifically, the first optical member 61 reflects the image light IL emitted from the display device 5 and reflected by the second optical member 62, and propagates it to the driver's left and right eyes. As shown in FIG. 1, the windshield 222 of the excavator 2 may also serve as the first optical member 61. When the image light reflected by the first optical member 61 reaches the eyes, the driver of the excavator 2 can visually recognize the virtual image VI. The first optical member 61 transmits external light incident on the windshield 222 from the side opposite to the second optical member 62.
 The second optical member 62 deflects the propagation direction of the image light IL emitted from the display device 5. Specifically, the second optical member 62 reflects the image light IL emitted from the display panel 532 through the parallax barrier 533 so that it reaches the first optical member 61. The second optical member 62 may include one or more mirrors and lenses. When the second optical member 62 includes a mirror, the mirror may be, for example, a concave mirror. In FIG. 2, the second optical member 62 is shown as a single mirror; however, the second optical member 62 is not limited to this and may be configured by combining one or more mirrors, lenses, and other optical elements.
 The driving device 7 changes the position and posture of the second optical member 62. For example, the driving device 7 changes the position and posture of the second optical member 62 based on the position of the excavator 2 detected by the position detection device 3 and the position of the work area indicated by the work information stored in the memory 52. Specifically, when the distance between the position of the excavator 2 and the position of the work area is equal to or greater than a predetermined value, the driving device 7 may change the position and posture of the second optical member 62 so that the image light IL reaches the region around the center of the windshield 222. The predetermined value is described in detail later. When the distance between the position of the excavator 2 and the position of the work area is less than the predetermined value, the driving device 7 may change the position and posture of the second optical member 62 so that the image light IL reaches a region below the region around the center of the windshield 222.
 The controller 54 causes the display panel 532 to display parallax images.
 <Display of the horizontal plane based on the posture of the main body>
 The controller 54 causes the display panel 532 to display a parallax image based on the posture detected by the posture detection device 4 and a predetermined position corresponding to the position of the driver's eyes. The predetermined position is the position where the driver's eyes are expected to be when the driver is seated in the main body 22.
 For example, the controller 54 may acquire posture information indicating the posture of the main body 22, detected by the posture detection device 4 and received by the communication unit 51. Based on the posture of the main body 22 with respect to the horizontal plane indicated by the posture information, the controller 54 may cause the display panel 532 to display a parallax image representing the horizontal plane in accordance with the predetermined position where the driver's eyes are expected to be when the driver is seated in the main body 22. Specifically, the controller 54 causes the display panel 532 to display the parallax image so that, from the position of the driver's eyes, the driver can visually recognize a virtual image VI of a virtual plane parallel to the actual horizontal plane, superimposed on the scene outside the windshield 222. The controller 54 may display the parallax image based on the actual position of the driver's eyes rather than on the predetermined position.
 For example, when the inclination angle of the main body 22 with respect to the horizontal plane is +10° as shown in FIG. 7B, the controller 54 displays the parallax image so that the driver visually recognizes a virtual plane extending downward toward the front compared with the case where the inclination angle is 0° as shown in FIG. 7A. For example, when the inclination angle is −5° as shown in FIG. 7C, the controller 54 displays the parallax image so that the driver visually recognizes a virtual plane extending upward toward the front compared with the case shown in FIG. 7A.
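The compensation above reduces to a sign relation: to keep the displayed plane parallel to the true horizontal, its pitch in the cab-fixed frame is the negative of the body's inclination angle. This is a hedged one-line sketch of that relation (the rendering itself is omitted, and the sign convention is an assumption consistent with the FIG. 7 examples).

```python
def rendered_plane_pitch(body_tilt_deg):
    """Pitch of the virtual horizontal plane relative to the main body 22,
    so that it stays parallel to the actual horizontal plane."""
    return -body_tilt_deg

# FIG. 7B: body tilted +10 deg -> plane drawn sloping down toward the front.
# FIG. 7C: body tilted -5 deg  -> plane drawn sloping up toward the front.
pitch_b = rendered_plane_pitch(10.0)
pitch_c = rendered_plane_pitch(-5.0)
```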
 <Display of the design surface based on the position and posture of the main body>
 The controller 54 may cause the display unit 53 to display a parallax image based on the position of the design surface and the position and posture of the main body 22. Specifically, the controller 54 may acquire position information indicating the position of the main body 22, detected by the position detection device 3 and received by the communication unit 51, and posture information indicating the posture of the main body 22, detected by the posture detection device 4 and received by the communication unit 51. The controller 54 may acquire design information stored in the memory 52, or design information received by the communication unit 51 from an external device. Based on the position of the design surface and the position and posture of the main body 22, the controller 54 may cause the display unit 53 to display a parallax image representing the design surface. Specifically, the controller 54 may cause the display unit 53 to display the parallax image so that the driver can visually recognize a virtual image VI of the parallax image representing the design surface, superimposed on the scene visible outside the windshield 222.
 For example, when the inclination angle of the main body 22 with respect to the horizontal plane is +10° as shown in FIG. 8B, the controller 54 displays the parallax image so that the driver visually recognizes the design surface extending downward toward the front compared with the case where the inclination angle is 0° as shown in FIG. 8A. For example, when the inclination angle is −5° as shown in FIG. 8C, the controller 54 displays the parallax image so that the driver visually recognizes the virtual design surface extending upward toward the front compared with the case where the inclination angle is 0° as shown in FIG. 8A.
 <Display of the reachable range of the work machine>
 (With the main body fixed)
 The controller 54 may cause the driver to visually recognize a virtual image VI indicating the range that the work machine 23 can reach by rotating each part of the work machine 23 with the main body 22 fixed. As shown in FIG. 9A, when each part of the work machine 23 is rotated so that the bucket 233 is farthest forward from the main body 22, the tip of the bucket 233 reaches position A. At this time, the driver sees the actual bucket 233 shown in FIG. 10A outside the windshield 222. For example, as shown in FIG. 9B, when each part of the work machine 23 is rotated so that the bucket 233 is closer to the ground than in the state shown in FIG. 9A, the tip of the bucket 233 reaches position B. At this time, the driver sees the actual bucket 233 shown in FIG. 10B outside the windshield 222. For example, as shown in FIG. 9C, when each part of the work machine 23 is rotated so that the bucket 233 excavates the ground from the state shown in FIG. 9B, the tip of the bucket 233 reaches position C. At this time, the driver sees the actual bucket 233 shown in FIG. 10C outside the windshield 222.
 Accordingly, the controller 54 may calculate, based on the position of the main body 22 and in accordance with the movable ranges of the boom 231, the arm 232, and the bucket 233, the range that the bucket 233 can reach with the main body 22 fixed. The controller 54 may display on the display unit 53 a parallax image indicating the range that the bucket 233 can reach. Specifically, the controller 54 may display the parallax image on the display unit 53 so that the driver visually recognizes a virtual image VI indicating the reachable range of the bucket 233, as indicated by the two-dot chain line in FIG. 11.
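One way to compute such a reachable range is to sweep the boom, arm, and bucket joints over their movable ranges with 2-D forward kinematics in the vertical plane. This is an assumed-geometry sketch: the link lengths, joint limits, and sampling step are illustrative and not taken from the patent.

```python
import math

LINKS = [5.7, 2.9, 1.5]                       # boom, arm, bucket lengths (m), assumed
LIMITS = [(-30, 45), (-160, -30), (-130, 0)]  # joint ranges (deg), assumed

def bucket_tip(angles_deg):
    """Planar forward kinematics: tip position for given joint angles."""
    x = z = heading = 0.0
    for length, a in zip(LINKS, angles_deg):
        heading += math.radians(a)
        x += length * math.cos(heading)
        z += length * math.sin(heading)
    return x, z

def reachable_points(step=15):
    """Sample bucket-tip positions over the joints' movable ranges."""
    pts = []
    for a1 in range(LIMITS[0][0], LIMITS[0][1] + 1, step):
        for a2 in range(LIMITS[1][0], LIMITS[1][1] + 1, step):
            for a3 in range(LIMITS[2][0], LIMITS[2][1] + 1, step):
                pts.append(bucket_tip((a1, a2, a3)))
    return pts

pts = reachable_points()
max_reach = max(math.hypot(x, z) for x, z in pts)
# max_reach never exceeds the sum of the link lengths.
```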
 (With the main body and boom fixed)
 The controller 54 may calculate, based on the position of the main body 22 and the posture of the boom 231, the range that the bucket 233 can reach with the main body 22 and the boom 231 fixed. In this case, the controller 54 can calculate the reachable range of the bucket 233 using the movable ranges of the arm 232 and the bucket 233. The controller 54 may display on the display unit 53 a parallax image indicating the range that the bucket 233 can reach. Specifically, the controller 54 may display the parallax image on the display unit 53 so that the driver visually recognizes a virtual image VI indicating the reachable range of the bucket 233.
 <Display of the portion of the design surface that the work machine can reach>
 The controller 54 may cause the display unit 53 to display a parallax image representing the design surface based on the position and posture of the main body 22 and the external design surface. Specifically, based on the position and posture of the main body 22 and the position of the design surface, the controller 54 may calculate, in accordance with the movable ranges of the parts of the work machine 23, the portion of the design surface that the bucket 233 can reach.
 As shown in FIG. 12A, the bucket 233 does not reach the regions of the design surface that are horizontally far from the main body 22. As shown in FIGS. 12B and 12C, the bucket 233 reaches the regions of the design surface that are horizontally close to the main body 22. The controller 54 may display on the display unit 53 a parallax image for distinguishing the portion of the design surface that the bucket 233 can reach from the portion that it cannot reach. Specifically, the controller 54 displays the parallax image on the display unit 53 so that the driver's eyes visually recognize a virtual image VI indicating the portion of the design surface that the bucket 233 can reach, as indicated by the hatched two-dot chain lines in FIG. 13.
 <Changing the position and posture of the second optical member>
 The controller 54 may change the position and/or posture of the second optical member 62 so as to change the region of the first optical member 61 that the image light IL reaches. Specifically, the controller 54 may cause the driving device 7 to change the position and angle of the second optical member 62 based on the position of the main body 22. More specifically, the controller 54 may cause the driving device 7 to change the position and/or angle of the second optical member 62 based on the position of the main body 22 and the position of the work area.
 As shown in FIG. 14, for example, when the distance between the position of the main body 22 and the position of the work area is equal to or greater than the predetermined value, the controller 54 may cause the image light IL to be reflected by the central region 61a of the first optical member 61 so that the image light ILa reaches the predetermined position corresponding to the driver's eyes. The predetermined value may be, for example, the maximum distance from the work area at which the main body 22 should be positioned in order to operate the work machine 23 on the work area. The predetermined value may be, for example, the distance between the bucket 233 and the main body 22 when the bucket 233 is horizontally farthest from the main body 22. In this case, the controller 54 may control the driving device 7 so that the second optical member 62 assumes the position and posture indicated by the second optical member 62a. The driver's eyes then visually recognize the virtual image VIa beyond the central region 61a of the first optical member 61.
 For example, when the distance between the position of the main body 22 and the position of the work area is less than the predetermined value, the controller 54 may cause the image light IL to be reflected by the lower region 61b of the first optical member 61 so that the image light ILb reaches the predetermined position corresponding to the driver's eyes. To this end, the controller 54 may control the driving device 7 to change the second optical member 62 to the position and posture indicated by the second optical member 62b. The driver's eyes then visually recognize the virtual image VIb beyond the lower region 61b of the first optical member 61.
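The switching rule above amounts to a distance threshold. This is a minimal sketch of that rule; the threshold value and the function names are illustrative (the patent leaves the predetermined value to the machine's working geometry, e.g. the maximum horizontal reach of the bucket).

```python
MAX_WORK_DISTANCE = 9.0  # the predetermined value (m); illustrative only

def target_region(body_pos, work_area_pos):
    """Return which region of the first optical member 61 the image light
    should reach: the central region 61a when the work area is far, the
    lower region 61b when it is near."""
    dx = work_area_pos[0] - body_pos[0]
    dy = work_area_pos[1] - body_pos[1]
    distance = (dx * dx + dy * dy) ** 0.5
    return "central" if distance >= MAX_WORK_DISTANCE else "lower"
```

With the illustrative threshold, a work area 12 m away maps to the central region and one 5 m away to the lower region.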
 As described above, according to an embodiment of the present disclosure, it is possible to improve convenience for a driver operating a construction machine in recognizing information. In one embodiment, the display system 1 for a construction machine includes a display unit 53 that displays parallax images. The display system 1 includes a first optical member 61 that reflects the image light IL whose propagation direction has been deflected by the second optical member 62 and transmits external light incident on the face opposite to the face that reflects the image light IL, and a controller 54 that causes the display unit 53 to display the parallax images. Thus, when the eyes of the driver of the excavator 2 are located on the side of the first optical member 61 that reflects the image light IL, the driver's eyes visually recognize the virtual image VI from the image light IL reflected by the first optical member 61. Since external light transmitted through the first optical member 61 also reaches the driver's eyes, the driver sees the scene on the side of the first optical member 61 opposite to the driver. The driver's eyes therefore visually recognize the virtual image VI of the parallax image displayed on the display unit 53 while viewing the scene outside the excavator 2. For example, if the driver shifted the line of sight to a display to check an image shown on it during operation of the excavator 2, attentiveness to the work area could decrease. In the present embodiment, however, the driver can grasp the information shown in the parallax image, for example information necessary for operating the excavator 2, without interrupting observation of the outside of the excavator 2.
 In the present embodiment, the attitude of the main body 22 is acquired, and the parallax image is displayed on the display unit 53 based on that attitude. When the attitude of the main body 22 changes, the position of the scene outside the excavator 2 relative to the driver changes. Consequently, when the parallax image is displayed so that its virtual image VI appears superimposed on the scene outside the excavator 2, a change in the attitude of the main body 22 could cause the virtual image VI to be superimposed at an inappropriate position in the scene. Because the controller 54 displays the parallax image based on the attitude of the main body 22, however, the driver perceives the virtual image VI at the appropriate position.
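The attitude compensation described above can be sketched in a few lines. This is a minimal illustration only, not the patent's implementation; the function name, the single-axis (pitch-only) simplification, and the small-angle geometry are assumptions.

```python
import math

def overlay_shift(pitch_rad: float, virtual_image_distance_m: float) -> float:
    """Vertical shift, in the virtual-image plane, needed to keep an overlay
    anchored to the outside scene when the main body pitches.

    A pitch of pitch_rad rotates the driver's view of the scene, so the
    overlay is shifted by the opposite amount at the virtual-image distance:
    shift = -distance * tan(pitch).
    """
    return -virtual_image_distance_m * math.tan(pitch_rad)
```

With zero pitch no correction is applied; a nose-up pitch shifts the overlay downward so that it stays on the same point of the outside scene.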
 In the present embodiment, the controller 54 displays on the display unit 53 a parallax image indicating a horizontal plane, based on the attitude of the main body 22. The driver can therefore perceive, outside the excavator 2, the virtual image VI of a parallax image showing the horizontal plane. Even if the main body 22 in which the driver is seated is inclined with respect to the horizontal plane, the driver can recognize the horizontal plane outside the excavator 2 and can thus operate the work machine 23 accurately with reference to it.
 In the present embodiment, the controller 54 displays on the display unit 53 a parallax image indicating the area that the work machine 23 can reach by rotating. The driver can therefore perceive, superimposed on the scene outside the excavator 2, the virtual image VI of a parallax image showing the area the work machine 23 can reach when operated. This allows the driver to recognize the workable range before operating the work machine 23 and, if necessary, to change the position and/or orientation of the excavator 2, so the excavator 2 can be operated efficiently.
 In the present embodiment, the controller 54 displays on the display unit 53 a parallax image indicating the area that the work machine 23 can reach by rotating the arm 232 while the boom 231 is held fixed to the main body 22. The driver can therefore perceive, superimposed on the scene outside the excavator 2, the virtual image VI of a parallax image showing that reachable area. This allows the driver to recognize the workable range before moving the arm 232 and, if necessary, to reposition the boom 231, so the excavator 2 can be operated efficiently.
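The reachable area described above follows from plane geometry: with the boom held at a fixed angle, the far end of the arm traces a circular arc about the boom tip. A hedged sketch of sampling that arc in the vertical plane of the work machine (all names, the angle conventions, and the 2-D simplification are illustrative assumptions, not taken from the specification):

```python
import math

def arm_reach_arc(boom_len, arm_len, boom_angle, arm_min, arm_max, n=64):
    """Sample the arc of arm-end positions reachable by rotating only the
    arm (joint angles arm_min..arm_max, measured relative to the boom)
    while the boom is held at boom_angle. Returns (x, z) points in the
    main-body frame; every point lies at distance arm_len from the boom tip.
    """
    bx = boom_len * math.cos(boom_angle)  # boom tip, body frame
    bz = boom_len * math.sin(boom_angle)
    pts = []
    for i in range(n + 1):
        a = arm_min + (arm_max - arm_min) * i / n
        t = boom_angle + a  # arm direction in the body frame
        pts.append((bx + arm_len * math.cos(t), bz + arm_len * math.sin(t)))
    return pts
```

The sampled points could then be projected into the driver's view and rendered as the parallax image of the reachable region.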
 In the present embodiment, the controller 54 displays a parallax image indicating the design surface, based on the position and attitude of the main body 22 and the position of the design surface. The driver can therefore perceive, outside the excavator 2, the virtual image VI of a parallax image showing the design surface. Even if the main body 22 in which the driver is seated is tilted, the driver can recognize the design surface outside the excavator 2 and can thus operate the work machine 23 accurately with reference to it.
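Placing the design surface at the correct apparent position requires transforming its coordinates from the world frame, in which the design data is defined, into the frame of the main body 22 using the detected position and attitude. A simplified sketch (the yaw-then-pitch convention, the omission of roll, and all names are assumptions made for illustration):

```python
import math

def world_to_body(point_w, body_pos, body_yaw, body_pitch):
    """Transform a design-surface point from the world frame into the
    main-body frame, given the body's detected position and attitude.
    The overlay renderer can then draw the surface where it appears
    relative to the driver."""
    # Translate into a frame centered on the main body.
    x = point_w[0] - body_pos[0]
    y = point_w[1] - body_pos[1]
    z = point_w[2] - body_pos[2]
    # Undo the body's yaw (rotation about the vertical axis).
    cy, sy = math.cos(-body_yaw), math.sin(-body_yaw)
    x, y = cy * x - sy * y, sy * x + cy * y
    # Undo the body's pitch (rotation about the body's lateral axis).
    cp, sp = math.cos(-body_pitch), math.sin(-body_pitch)
    x, z = cp * x + sp * z, -sp * x + cp * z
    return (x, y, z)
```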
 In the present embodiment, the controller 54 displays on the display unit 53 a parallax image indicating the portion of the design surface that the work machine 23 can reach by rotating the arm 232 while the boom 231 is held fixed to the main body 22. The driver can therefore recognize, before moving the arm 232, the workable range within the design surface and, if necessary, reposition the boom 231, so the excavator 2 can be operated efficiently.
 In the present embodiment, the display system 1 further includes a drive device 7 that changes the position and orientation of the second optical member 62, allowing the position at which the driver perceives the virtual image VI to be changed. In general, the main body 22 may not be spacious enough for the display device 5 and the second optical member 62 to be arranged so that the image light IL can reach the entire surface of the windshield 222, which also serves as the first optical member 61. By changing the region of the windshield 222 that the image light IL reaches, the driver can be made to perceive the virtual image VI at an appropriate position.
 In the present embodiment, the controller 54 acquires the position of the work area in which the work machine 23 is to be operated and, based on the distance between the position of the main body 22 and the position of the work area, causes the drive device 7 to change the position and orientation of the second optical member 62. For example, when that distance is equal to or greater than a predetermined value, the driver is presumed to be moving the excavator 2 toward the work area; directing the image light IL to the central region 61a of the first optical member 61, which lies in the direction the driver looks while travelling, makes the virtual image VI easy to see for a driver observing the distance beyond the main body 22. When the distance is less than the predetermined value, the driver is presumed to be observing the work area outside the excavator 2; directing the image light IL to the region of the first optical member 61 corresponding to the height of the work area makes the virtual image VI easy to see.
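The distance-based selection described above reduces to simple threshold logic. A hedged sketch, in which the threshold value, the region names, and the height sign convention (negative meaning below eye level, e.g. a digging area) are chosen for illustration only:

```python
def select_projection_region(body_pos, work_area_pos, work_area_height,
                             threshold_m=10.0):
    """Decide which windshield region should receive the image light.
    Far from the work area -> central region (the driver looks ahead while
    travelling); near -> a region matched to the work area's height."""
    dx = body_pos[0] - work_area_pos[0]
    dy = body_pos[1] - work_area_pos[1]
    distance = (dx * dx + dy * dy) ** 0.5
    if distance >= threshold_m:
        return "central"
    return "lower" if work_area_height < 0 else "upper"
```

The returned region name would then be mapped to a target position and orientation of the second optical member 62 and handed to the drive device 7.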
 Although the above embodiment has been described as a representative example, it will be apparent to those skilled in the art that many modifications and substitutions can be made within the spirit and scope of the present disclosure. The present disclosure should therefore not be construed as limited by the above embodiment, and various modifications and changes are possible without departing from the scope of the claims. For example, a plurality of the constituent blocks described in the embodiments and examples can be combined into one, or a single constituent block can be divided.
 In the above embodiment, the display device 5 may instead be a two-dimensional display device. In that case, the display unit 53 of the display device 5 does not include the irradiator 531 or the parallax barrier 533. The controller 54 may cause the display unit 53 to display a two-dimensional image so that the driver perceives the virtual image VI of that image. The two-dimensional image may be, for example, a cross-sectional view showing the work area together with the horizontal plane and the design surface relative to that area.
 In the above embodiment, the sub-pixel groups Pg are arranged so as to repeat in the vertical direction at positions offset by one sub-pixel in the horizontal direction, but they may instead repeat in the vertical direction without a horizontal offset. In that case, the light-shielding faces 533a of the parallax barrier 533 may extend in the vertical direction and be arranged at predetermined intervals in the horizontal direction.
 In the above embodiment, the optical element of the display device 5 is the parallax barrier 533, but it is not limited to this. For example, the optical element of the display device 5 may be a lenticular lens, formed by arraying cylindrical lenses 9 in a plane parallel to the active area 532a. Like the parallax barrier 533, the lenticular lens propagates the image light IL emitted from the sub-pixels in the left visible region 532aL to the position of the driver's left eye, and the image light IL emitted from the sub-pixels in the right visible region 532aR to the position of the driver's right eye.
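With either optical element, the display panel shows a composite image in which left-eye and right-eye pixel columns alternate, so that the barrier's transmissive slits (or the cylindrical lenses) route each set of columns to the matching eye. A minimal sketch of that interleaving, using a one-column-per-view simplification rather than the sub-pixel group Pg layout described in the specification:

```python
def interleave_columns(left_img, right_img):
    """Build a composite parallax image: even pixel columns come from the
    left-eye image, odd columns from the right-eye image. Both inputs are
    equal-sized 2-D lists (rows of pixel values)."""
    height, width = len(left_img), len(left_img[0])
    out = []
    for y in range(height):
        row = [left_img[y][x] if x % 2 == 0 else right_img[y][x]
               for x in range(width)]
        out.append(row)
    return out
```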
DESCRIPTION OF SYMBOLS
1 Display system
2 Excavator
3 Position detection device
4 Attitude detection device
5 Display device
6 Projection optical system
7 Drive device
21 Traveling unit
22 Main body
23 Work machine
51 Communication unit
52 Memory
53 Display unit
54 Controller
61 First optical member
62 Second optical member
211 Drive wheel
212 Idler wheel
213 Crawler
221 Driver's seat
222 Windshield
231 Boom
232 Arm
233 Bucket
531 Irradiator
532 Display panel
532a Active area
532aL Left visible region
532aR Right visible region
533 Parallax barrier
533a Light-shielding face
533b Light-transmitting region
IL Image light
VI Virtual image

Claims (11)

  1.  A display system for a construction machine comprising a work machine and a main body to which the work machine is attached, the display system comprising:
      a display unit configured to display parallax images having parallax with respect to each other;
      a first optical member configured to reflect image light of the parallax images displayed on the display unit toward a predetermined position corresponding to the position of the eyes of a driver of the construction machine, and to transmit external light incident on a surface opposite to the surface that reflects the image light; and
      a controller configured to cause the display unit to display the parallax images.
  2.  The display system according to claim 1, further comprising an attitude detection device configured to detect an attitude of the main body,
      wherein the controller displays the parallax images based on the attitude of the main body detected by the attitude detection device.
  3.  The display system according to claim 2, wherein the controller displays on the display unit the parallax image indicating a horizontal plane, based on the attitude of the main body.
  4.  The display system according to any one of claims 1 to 3, wherein the work machine is attached so as to be rotatable about its attachment position on the main body, and
      the controller displays the parallax image indicating an area reachable by the work machine through its rotation.
  5.  The display system according to any one of claims 1 to 3, wherein the work machine includes a first portion attached to the main body and a second portion attached to the first portion,
      the first portion is rotatable about its attachment position on the main body, and the second portion is rotatable about its attachment position on the first portion, and
      the controller displays, based on an attitude of the first portion, the parallax image indicating an area reachable by the work machine through rotation of the second portion while the first portion is fixed to the main body.
  6.  The display system according to claim 2 or 3, further comprising a position detection device configured to detect a position of the main body,
      wherein the controller displays the parallax image indicating a predetermined design surface located outside the main body, based on the position and attitude of the main body and the design surface.
  7.  The display system according to claim 6, wherein the work machine is attached so as to be rotatable about its attachment position on the main body, and
      the controller displays the parallax image indicating the portion of the design surface reachable by the work machine through its rotation.
  8.  The display system according to claim 6, wherein the work machine includes a first portion attached to the main body and a second portion attached to the first portion,
      the first portion is rotatable about its attachment position on the main body, and the second portion is rotatable about its attachment position on the first portion, and
      the controller displays, based on an attitude of the first portion, the parallax image indicating the area within the design surface reachable by the work machine through rotation of the second portion while the first portion is fixed to the main body.
  9.  The display system according to any one of claims 6 to 8, further comprising:
      a second optical member configured to deflect the propagation direction of the image light of the parallax images displayed on the display unit toward the first optical member; and
      a drive device configured to change the position and orientation of the second optical member,
      wherein the controller causes the drive device to change at least one of the position and the orientation of the second optical member so as to change, based on the position of the main body, the region of the first optical member that the image light reaches.
  10.  The display system according to claim 9, wherein the controller acquires a position of a work area in which the work machine is operated, and causes the drive device to change the position and orientation of the second optical member based on the distance between the position of the main body and the position of the work area.
  11.  The display system according to claim 10, wherein, upon determining that the distance is less than a predetermined value, the controller causes the drive device to change the position and orientation of the second optical member based on the height of the work area.
PCT/JP2019/019088 2018-05-30 2019-05-14 Display system WO2019230366A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-103387 2018-05-30
JP2018103387 2018-05-30

Publications (1)

Publication Number Publication Date
WO2019230366A1 true WO2019230366A1 (en) 2019-12-05

Family

ID=68696653

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/019088 WO2019230366A1 (en) 2018-05-30 2019-05-14 Display system

Country Status (1)

Country Link
WO (1) WO2019230366A1 (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009173195A (en) * 2008-01-25 2009-08-06 Sumitomo (Shi) Construction Machinery Manufacturing Co Ltd Display system for construction machine
JP2011001775A (en) * 2009-06-19 2011-01-06 Caterpillar Sarl Display device for construction machine
WO2012114870A1 (en) * 2011-02-22 2012-08-30 株式会社小松製作所 Hydraulic shovel operability range display device and method for controlling same
WO2014103498A1 (en) * 2012-12-28 2014-07-03 株式会社小松製作所 Construction machinery display system and control method for same
JP2014150304A (en) * 2013-01-31 2014-08-21 Nippon Seiki Co Ltd Display device and display method therefor
WO2015145935A1 (en) * 2014-03-27 2015-10-01 パナソニックIpマネジメント株式会社 Virtual image display device, head-up display system, and vehicle
WO2017043107A1 (en) * 2015-09-10 2017-03-16 富士フイルム株式会社 Projection-type display device and projection control method
WO2017110014A1 (en) * 2015-12-21 2017-06-29 株式会社Jvcケンウッド Image display device, image display method, and control program
US20170344221A1 (en) * 2016-05-31 2017-11-30 Novatron Oy User interface and earth-moving machine
WO2018043301A1 (en) * 2016-09-02 2018-03-08 株式会社小松製作所 Work machine graphics display system



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19811155

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19811155

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP