WO2019230395A1 - Display system - Google Patents

Display system Download PDF

Info

Publication number
WO2019230395A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
image
controller
display system
display panel
Prior art date
Application number
PCT/JP2019/019264
Other languages
French (fr)
Japanese (ja)
Inventor
薫 草深
橋本 直
Original Assignee
Kyocera Corporation (京セラ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kyocera Corporation (京セラ株式会社)
Publication of WO2019230395A1 publication Critical patent/WO2019230395A1/en

Classifications

    • B — Performing operations; transporting
    • B66 — Hoisting; lifting; hauling
    • B66C — Cranes; load-engaging elements or devices for cranes, capstans, winches, or tackles
    • B66C 13/00 — Other constructional features or details
    • B66C 13/16 — Applications of indicating, registering, or weighing devices
    • B66C 13/18 — Control systems or devices
    • B66C 13/46 — Position indicators for suspended loads or for crane elements

Definitions

  • The present invention relates to a display system for a crane.
  • Cranes are used to transport heavy objects during cargo handling to and from transport equipment, and to carry materials during construction work.
  • For a suspended load, that is, a heavy object suspended by a crane, it is required to reduce swinging of the load. Various mechanisms for preventing the swinging of a suspended load have accordingly been proposed (see Patent Document 1).
  • The display system is a display system for a crane, comprising: a display panel that displays a display image; a wireless communication sensor attached to an attachment that includes at least one of a hook and a suspended load of the crane; and a controller that changes the display image based on a wireless signal acquired from the wireless communication sensor.
  • FIG. 1 is a functional block diagram showing the schematic configuration of a display system according to one embodiment. FIG. 2 is a view of the display panel and first optical element of FIG. 1 seen from the optical element side. FIG. 3 shows an example of the posture of the posture visually recognized object displayed on the display panel of FIG. 1. FIG. 4 shows an example of the acceleration displayed on the display panel of FIG. 1. FIG. 5 shows another example of the displayed acceleration. FIG. 6 shows an example of the trajectory displayed together with the acceleration. FIG. 7 is a diagram for explaining the parallax image displayed on the display panel of FIG. 1.
  • a display system 10 includes a wireless communication sensor 11 and a display device 12.
  • the display system 10 is a display system for a crane and may be mounted on the crane.
  • the display system 10 may share part of the configuration with other devices and parts included in the crane.
  • the wireless communication sensor 11 is attached to the attachment 13.
  • the attachment 13 includes at least one of a crane hook 14 and a suspended load 15.
  • the wireless communication sensor 11 has a wireless communication function, and transmits a measured value measured by the wireless communication sensor 11 as a wireless signal to the controller 16 described later.
  • the wireless communication sensor 11 may measure a state value for calculating the motion state of the attachment 13.
  • The state value for calculating the motion state may include, for example, at least one of acceleration, angular velocity, and geomagnetism.
  • the wireless communication sensor 11 may include at least one of an acceleration sensor, an angular velocity sensor, and a geomagnetic sensor, and may include a 6-axis sensor or a 9-axis sensor that combines a plurality of sensors.
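As an illustrative, non-limiting sketch (not part of the original disclosure; the field names, units, and JSON serialization are assumptions for illustration), the state values such a sensor might report to the controller over the wireless link can be modeled as a small record:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SensorReading:
    # One measurement frame from the wireless communication sensor 11.
    # Units are illustrative assumptions: m/s^2, rad/s, microtesla.
    accel: tuple  # (ax, ay, az) from a 3-axis accelerometer
    gyro: tuple   # (gx, gy, gz) from a 3-axis angular-velocity sensor
    mag: tuple    # (mx, my, mz) from a 3-axis geomagnetic sensor

    def to_wireless_payload(self) -> bytes:
        # Serialize the reading for transmission to the controller 16.
        return json.dumps(asdict(self)).encode("utf-8")

reading = SensorReading(accel=(0.1, 0.0, 9.8),
                        gyro=(0.0, 0.02, 0.0),
                        mag=(23.0, 1.5, -40.2))
payload = reading.to_wireless_payload()
```

A 6-axis sensor would populate only `accel` and `gyro`; a 9-axis sensor would populate all three fields.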
  • the display device 12 includes a display unit 17 and a controller 16.
  • the display unit 17 has at least a display panel 18.
  • The display unit 17 may include an irradiator 19 in configurations that employ a display panel 18 of a type that is not self-luminous.
  • the display unit 17 may include a first optical element 20 having a function to be described later and provide a stereoscopic view to the operator.
  • the display unit 17 may be a head-up display (HUD) that has a second optical element 21 having a function to be described later, and allows an operator to visually recognize a virtual image.
  • the display panel 18 may be a display panel such as a transmissive liquid crystal display panel.
  • the display panel 18 forms and displays a display image generated by the controller 16. As shown in FIG. 2, the display panel 18 has a plurality of partitioned areas partitioned in a horizontal direction and a vertical direction by a grid-like black matrix 22 on a plate-like surface.
  • the black matrix 22 has a first black line 23 extending in the vertical direction and a second black line 24 extending in the horizontal direction.
  • a plurality of first black lines 23 are arranged in the horizontal direction at a constant pitch, for example, and a plurality of second black lines 24 are arranged in the vertical direction at a constant pitch, for example.
  • the plurality of first black lines 23 and the plurality of second black lines 24 define a plurality of partition regions.
  • Each partitioned region corresponds to one subpixel.
  • the plurality of subpixels are arranged in a matrix in the horizontal direction and the vertical direction.
  • Each subpixel corresponds to one of the colors R, G, and B, and one pixel may be configured by combining three subpixels, one each of R, G, and B, as a set.
  • One such set of subpixels may be referred to as a single pixel.
  • the display panel 18 may be mounted in the crane so that, for example, the direction in which a plurality of subpixels constituting one pixel are aligned is the horizontal direction. In the present embodiment, the display panel 18 may be mounted in a crane so that the direction in which sub-pixels of the same color are arranged is the vertical direction.
  • the display panel 18 is not limited to a transmissive liquid crystal panel, and other display panels such as an organic EL can also be used.
  • the irradiator 19 is arranged on one surface side of the display panel 18.
  • the irradiator 19 may include a light source, a light guide plate, a diffusion plate, a diffusion sheet, and the like.
  • the irradiator 19 emits irradiation light from a light source, and equalizes the irradiation light in the surface direction of the display panel 18 using a light guide plate, a diffusion plate, a diffusion sheet, or the like.
  • the irradiator 19 emits the uniformed light to the display panel 18.
  • The first optical element 20 defines the light-ray direction of the image light emitted from the display panel 18.
  • The first optical element 20 may be, for example, a parallax barrier or a lenticular lens. The following description assumes a configuration in which a parallax barrier is used as the first optical element 20.
  • the first optical element 20 changes the light beam direction, which is the propagation direction of the image light emitted from the subpixel, for each of the plurality of opening regions 25 extending in a predetermined direction on the display panel 18.
  • the visible range of the image light emitted from the subpixel is determined by the first optical element 20.
  • the first optical element 20 is located on the opposite side of the irradiator 19 with respect to the display panel 18.
  • the first optical element 20 can be located on the irradiator 19 side of the display panel 18.
  • the first optical element 20 has a plurality of light shielding surfaces 26.
  • the light shielding surface 26 shields image light.
  • the plurality of light shielding surfaces 26 define an opening region 25 between the light shielding surfaces 26 adjacent to each other.
  • the opening region 25 has a higher light transmittance than the light shielding surface 26.
  • the light shielding surface 26 has a light transmittance lower than that of the opening region 25.
  • the opening region 25 is a portion that transmits light incident on the first optical element 20.
  • the opening region 25 may transmit light with a transmittance equal to or higher than the first predetermined value.
  • the first predetermined value may be 100%, for example, or a value close to 100%.
  • the light blocking surface 26 is a portion that blocks and does not transmit light incident on the first optical element 20.
  • the light shielding surface 26 blocks an image displayed on the display panel 18.
  • the light blocking surface 26 may block light with a transmittance equal to or lower than the second predetermined value.
  • the second predetermined value may be 0%, for example, or a value close to 0%.
  • The opening areas 25 and the light-shielding surfaces 26 are mounted in the crane so as to be arranged alternately along the horizontal and vertical directions.
  • The edges of the opening areas 25 define the light-ray direction of the image light emitted from the subpixels for each of a plurality of band-like areas extending in a predetermined direction on the surface of the display panel 18.
  • When the line defining an edge of an opening area 25 runs along the vertical direction, moiré is easily perceived in the display image because of errors in the arrangement of the subpixels or in the dimensions of the opening area 25.
  • When the line defining an edge of an opening area 25 extends in a direction at a predetermined angle to the vertical direction, moiré in the display image becomes difficult to perceive regardless of such errors in the subpixel arrangement or the dimensions of the opening area 25.
  • the first optical element 20 may be composed of a film or a plate-like member having a transmittance less than the second predetermined value.
  • the light shielding surface 26 is formed of the film or plate member.
  • the opening area 25 is configured by an opening provided in a film or a plate-like member.
  • The film may be made of resin or of another material.
  • The plate-like member may be made of resin, metal, or another material.
  • The first optical element 20 is not limited to a film or a plate-like member and may be formed of another type of member.
  • The base material itself may have a light-shielding property, or an additive having a light-shielding property may be contained in the base material.
  • the first optical element 20 may be composed of a liquid crystal shutter.
  • the liquid crystal shutter can control the light transmittance according to the applied voltage.
  • the liquid crystal shutter may be composed of a plurality of pixels and may control the light transmittance in each pixel.
  • the liquid crystal shutter may be formed in an arbitrary shape with a region having a high light transmittance or a region having a low light transmittance.
  • The opening region 25 may be a region having a transmittance equal to or higher than the first predetermined value.
  • The light-shielding surface 26 may be a region having a transmittance equal to or lower than the second predetermined value.
  • The first optical element 20 propagates the image light emitted from the subpixels in some of the opening regions 25 to the position of the user's right eye, and propagates the image light emitted from the subpixels in the other opening regions 25 to the position of the user's left eye.
  • the first optical element 20 is located a predetermined distance away from the surface of the display panel 18.
  • the first optical element 20 includes light blocking surfaces 26 arranged in a slit shape.
  • The left eye of the user sees the band-like visible areas on the display panel 18 that correspond to the opening areas 25. Because the image light is blocked by the light-shielding surfaces 26 of the first optical element 20, the left eye of the user cannot see the invisible areas on the display panel 18 that correspond to the light-shielding surfaces 26.
  • the second optical element 21 reflects the image light of the display image displayed on the display panel 18.
  • the second optical element 21 transmits light outside the crane (external light).
  • the second optical element 21 may be a windshield of a crane.
  • the second optical element 21 may be disposed on the optical path of the image light of the display panel 18.
  • the second optical element 21 may be disposed in the image light emission direction of the display panel 18.
  • Alternatively, the second optical element 21 may be disposed in the direction in which the image light of the display panel 18 is reflected by a reflective element.
  • The image light reflected by the second optical element 21 reaches the left eye and the right eye of the operator. In the display device 12, the image light thus travels from the display panel 18 to the operator's left and right eyes along the optical path indicated by the broken line. The operator perceives the image light arriving along this optical path as a virtual image 27.
  • the controller 16 is configured as a processor, for example.
  • the controller 16 may include one or more processors.
  • the processor may include a general-purpose processor that reads a specific program and executes a specific function, and a dedicated processor specialized for a specific process.
  • The dedicated processor may include an application-specific integrated circuit (ASIC).
  • the processor may include a programmable logic device (PLD: Programmable Logic Device).
  • the PLD may include an FPGA (Field-Programmable Gate Array).
  • the controller 16 may be one of SoC (System-on-a-Chip) and SiP (System-In-a-Package) in which one or a plurality of processors cooperate.
  • The controller 16 includes a storage unit, and may store in the storage unit various information or programs for operating each component of the display system 10.
  • the storage unit may be configured by, for example, a semiconductor memory.
  • the storage unit may function as a work memory for the controller 16.
  • the controller 16 is connected to each component of the display device 12, such as the display panel 18, and controls each component. For example, the controller 16 changes the display image based on the wireless signal acquired from the wireless communication sensor 11 and causes the display panel 18 to display the display image.
  • The display image to be changed based on the wireless signal may include, for example, an object whose appearance varies with the viewing direction (hereinafter also referred to as a "posture visually recognized object") and the acceleration acting on the attachment 13.
  • The posture visually recognized object may be a three-dimensional object other than a sphere, for example a plate-like object such as a flat plate.
  • The posture visually recognized object may alternatively be a sphere with an identifier attached to its surface.
  • The identifier may be, for example, parallels or meridians drawn on the sphere, or a color applied to only part of the surface.
  • the controller 16 changes the posture of the posture visually recognized object displayed on the display panel 18 according to the posture of the attachment 13.
  • the controller 16 calculates the posture of the attachment 13 in order to change the posture of the posture visually recognized object.
  • the controller 16 may calculate the posture as an inclination with respect to the standard state of the attachment 13.
  • the standard state of the attachment 13 is, for example, a state where the attachment 13 is lifted by a crane and is stationary.
  • the controller 16 may acquire information other than the radio signal from the crane and use it for calculating the posture. For example, in the configuration in which the wireless communication sensor 11 includes only an acceleration sensor, the controller 16 may acquire the payout length of the crane winch. The controller 16 may calculate the turning radius from the boom of the crane to the attachment 13 based on the feeding length and use it for calculating the posture.
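As a minimal sketch of this supplementary calculation (an assumption about one way the controller might use the payout length, not the patent's stated algorithm; the rigid-pendulum model and variable names are illustrative), the rope payout length can stand in for the swing radius, and the horizontal displacement of the attachment then follows from its tilt angle:

```python
import math

def swing_displacement(payout_length_m: float, tilt_rad: float) -> float:
    # Horizontal displacement of the attachment 13, modeling the wire
    # rope as a rigid pendulum whose length equals the winch payout
    # length (an illustrative simplification).
    return payout_length_m * math.sin(tilt_rad)

# Example: 10 m of rope paid out, 0.1 rad tilt about the pitch axis.
d = swing_displacement(10.0, 0.1)
```

With only an acceleration sensor available, such a geometric model is one way to convert a measured inclination into a position usable for the posture display.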
  • the controller 16 may calculate the inclination of the attachment 13 around at least one axis. In this embodiment, the controller 16 calculates inclinations around the three axes of the pitch axis, the yaw axis, and the roll axis of the crane.
  • the pitch axis of the crane is an axis parallel to the left-right direction of the crane.
  • the yaw axis of the crane is an axis parallel to the vertical direction of the crane.
  • The roll axis of the crane is an axis parallel to the front-rear direction of the crane.
  • the left-right direction, the up-down direction, and the front-rear direction of the crane are directions with respect to the cockpit of the crane.
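One common way to obtain such inclinations from an acceleration measurement alone is a standard gravity-vector calculation, offered here as a hedged sketch of how the controller might proceed rather than as the patent's method (the axis assignments are assumptions):

```python
import math

def pitch_roll_from_accel(ax: float, ay: float, az: float):
    # Estimate the inclination about the pitch and roll axes (radians)
    # from a static accelerometer reading of the gravity vector.
    # Assumed frame: x = front-rear, y = left-right, z = up-down.
    # Yaw cannot be recovered from acceleration alone; it would need
    # the angular-velocity or geomagnetic sensor.
    pitch = math.atan2(ax, math.hypot(ay, az))
    roll = math.atan2(ay, math.hypot(ax, az))
    return pitch, roll

# Standard state: attachment at rest, gravity measured only along z.
p, r = pitch_roll_from_accel(0.0, 0.0, 9.81)
```

In the standard state both inclinations are zero, matching the definition of the standard state as the attachment hanging at rest.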
  • The wireless communication sensor 11 may be attached to the attachment 13 at a predetermined position and in a predetermined orientation, or the actual attachment position and orientation of the wireless communication sensor 11 on the attachment 13 may be detected and used in the calculation.
  • the controller 16 generates a display image of the posture visually recognized object having a posture corresponding to the calculated posture of the attachment 13.
  • In the present embodiment, the posture visually recognized object is a plate-like object; in the standard state, its plate surface is perpendicular to the up-down direction of the crane, and the side surface containing the long sides of the plate surface is perpendicular to the front-rear direction.
  • As illustrated in FIG. 3, the controller 16 generates display images whose shapes differ depending on whether the posture visually recognized object obj in the standard state is viewed, for example, from above or from the right.
  • The controller 16 changes the posture of the posture visually recognized object obj by supplying the generated display image to the display panel 18 for display.
  • the controller 16 may change the depth of the posture visually recognized object obj based on the wireless signal.
  • the controller 16 calculates the inclination around the pitch axis of the attachment 13 based on the radio signal in order to change the depth of the posture visually recognized object obj.
  • the controller 16 further obtains the payout length of the winch of the crane and calculates the turning radius from the crane boom to the attachment 13.
  • the controller 16 calculates the position of the attachment 13 in the longitudinal direction of the crane based on the inclination around the pitch axis of the attachment 13 and the turning radius.
  • the controller 16 generates a display image of the posture visually recognized object obj having a size corresponding to the calculated position.
  • The size corresponding to the position is larger the closer the attachment is to the rear of the crane, and smaller the closer it is to the front.
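The position-dependent sizing described above can be sketched as a simple clamped linear scale factor (the range limits and the linear mapping are illustrative assumptions, not values from the disclosure):

```python
def display_scale(front_rear_pos: float,
                  rear_limit: float = -5.0, front_limit: float = 5.0,
                  min_scale: float = 0.5, max_scale: float = 1.5) -> float:
    # Map the attachment's front-rear position (metres; negative toward
    # the rear of the crane) to a drawing scale for the posture object:
    # larger toward the rear, smaller toward the front.
    t = (front_rear_pos - rear_limit) / (front_limit - rear_limit)
    t = min(max(t, 0.0), 1.0)  # clamp to the displayable range
    return max_scale + t * (min_scale - max_scale)
```

Any monotonically decreasing mapping from front-rear position to size would produce the depth cue described; the linear form is merely the simplest choice.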
  • In a configuration in which the display image is generated as a parallax image, the controller 16 may generate an image that causes the posture visually recognized object obj to be perceived as protruding from, or receding behind, the position at which the virtual image 27 is viewed.
  • The controller 16 changes the depth of the posture visually recognized object obj by supplying the generated display image to the display panel 18 for display.
  • the controller 16 may calculate the magnitude of acceleration acting on the attachment 13 along the longitudinal direction and the lateral direction of the crane.
  • the controller 16 generates a display image that displays the magnitude of the calculated acceleration in the front-rear direction and the left-right direction.
  • The controller 16 presents the current acceleration of the attachment 13 by supplying the generated display image to the display panel 18 for display.
  • The controller 16 may generate a display image in which the acceleration in the front-rear direction is displayed as an arrow 29 extending in the up-down direction on a one-dimensional coordinate system 28 extending in the up-down direction, as illustrated in FIG. 4.
  • The controller 16 may generate a display image in which the acceleration in the left-right direction is displayed as an arrow 31 extending in the left-right direction on a one-dimensional coordinate system 30 extending in the left-right direction.
  • The controller 16 may generate a display image in which the accelerations in the front-rear and left-right directions are displayed as an arrow 33 extending in an arbitrary direction on a two-dimensional coordinate system 32 extending in the up-down and left-right directions, as illustrated in FIG. 5.
  • the controller 16 may generate a display image including the acceleration of the attachment 13 and a guide 34 for canceling the acceleration.
  • The controller 16 may display the acceleration acting on the attachment 13 with its direction of action reversed. In the display images illustrated in FIGS. 4 and 5, for example, reversing the direction of the acceleration corresponds to flipping the arrow indicating the acceleration about the origin.
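Reversing the displayed direction amounts to a sign flip of the acceleration vector about the origin; a trivial sketch (the tuple representation is an assumption for illustration):

```python
def reversed_arrow(accel_xy):
    # Flip the measured acceleration vector about the origin so the
    # displayed arrow points in the direction the operator should act
    # to cancel the swing.
    ax, ay = accel_xy
    return (-ax, -ay)
```

The flipped vector points in the same direction as the corrective force to be applied, which is why the reversed display can be read directly as an operating cue.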
  • The controller 16 may display the trajectory of the acceleration in the display image together with the acceleration of the attachment 13. As illustrated in FIG. 6, the controller 16 may generate a display image in which the trajectories of the accelerations in the front-rear and left-right directions are displayed as a collection of points 35 on the two-dimensional coordinate system 32 extending in the up-down and left-right directions.
  • the controller 16 may generate a parallax image corresponding to the above-described display image in a configuration in which the display unit 17 includes the first optical element 20.
  • a parallax image is an image that allows a three-dimensional image to be perceived by stereoscopic viewing.
  • When the display image represents a three-dimensional object, the image of an arbitrary first point of the three-dimensional object seen by the right eye and the image of that point seen by the left eye are slightly shifted in the left-right direction.
  • A pair of images of a second point of the three-dimensional object adjacent to the first point in the left-right direction, one seen by the right eye and one seen by the left eye, is arranged adjacent to the pair of images corresponding to the first point.
  • The controller 16 generates a parallax image in which the images of the respective points constituting the display image, as viewed from the right eye RE and the left eye LE of an operator at a specific position with respect to the display unit 17, are arranged on the display panel 18 so that each eye can view its corresponding image through the opening areas 25 of the first optical element 20.
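A common way to compose such a parallax image behind a vertical-stripe barrier is to interleave left-eye and right-eye image columns. The following is a generic two-view column interleave, offered as an assumption about the arrangement rather than the patent's specific layout:

```python
def interleave_parallax(left_cols, right_cols):
    # Column-interleave a left-eye and a right-eye image: even-indexed
    # columns come from the left image, odd-indexed columns from the
    # right image, matching alternating barrier opening areas.
    assert len(left_cols) == len(right_cols)
    return [left_cols[i] if i % 2 == 0 else right_cols[i]
            for i in range(len(left_cols))]

# Two 4-column "images", columns labeled by source.
merged = interleave_parallax(["L0", "L1", "L2", "L3"],
                             ["R0", "R1", "R2", "R3"])
# merged -> ["L0", "R1", "L2", "R3"]
```

The barrier then occludes the odd columns from the left eye and the even columns from the right eye, so each eye sees only its own view.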
  • the specific position of the operator with respect to the display unit 17 may be determined in advance as an ideal state.
  • the specific position of the operator with respect to the display unit 17 may be detected using a camera or the like.
  • the display system 10 of the present embodiment configured as described above changes the display image based on a wireless signal acquired from the wireless communication sensor 11 attached to the attachment 13.
  • The operation of reducing the swing of the attachment 13 based only on the visible appearance of the attachment 13 can be difficult.
  • The display system 10 can provide the operator with information for reducing the swing of the suspended load 15, namely the changes in the state of the attachment 13 measured by the wireless communication sensor 11.
  • The display system 10 of the present embodiment further includes a first optical element 20 that defines the light-ray direction of the display image and a second optical element 21 that reflects the image light from the display panel 18, and the display image is a parallax image.
  • The display system 10 thus allows the state of the attachment 13 to be perceived three-dimensionally.
  • By causing the posture visually recognized object obj to be perceived three-dimensionally, the display system 10 enables the operator to recognize the state of the attachment 13 more clearly.
  • The controller 16 displays, as a display image changed according to the posture of the attachment 13, the posture visually recognized object obj, whose appearance differs depending on the viewing direction.
  • When the shape of the suspended load 15, such as a sphere, shows little change in appearance with viewing direction, it is difficult to grasp the swing of the load in the front-rear direction of the crane.
  • Displaying the posture visually recognized object obj makes the change in appearance with respect to pitching clear, so that the operator can easily grasp the pitching of the suspended load 15.
  • the controller 16 changes the depth of the posture visually recognized object obj based on the wireless signal.
  • the display system 10 can allow the operator to easily grasp the pitching of the suspended load 15 that is relatively far from the operator, which is difficult only by visual recognition.
  • the controller 16 displays the acceleration acting on the attachment 13 as a display image.
  • The display system 10 can make the operator recognize the direction and magnitude of the force acting on the attachment 13, which is useful for the operation of suppressing the swing of the suspended load 15.
  • the controller 16 displays the acceleration acting on the attachment 13 with the direction of action reversed.
  • In this way, an acceleration directed in the same direction as the force to be applied to the attachment 13 is displayed, so that the operator can directly recognize the direction in which the force should be applied.
  • the controller 16 displays the acceleration locus.
  • the display system 10 can provide a trajectory of acceleration that reflects the swing reduction operation by the operator. Therefore, the display system 10 can provide information for objectively determining the degree of mastery of the operator's reduction operation.
  • the controller 16 displays a guide 34 for canceling the acceleration of the attachment 13.
  • The display system 10 can thereby make the operator directly recognize the direction in which a force should be applied.
  • In the present embodiment, the display unit 17 is a head-up display that allows the operator to visually recognize the virtual image 27, but it may instead be a display in which the operator views the display panel 18 directly.
  • The display unit 17 may include the first optical element 20 and display the parallax image on the display panel 18 so that the operator can view the image stereoscopically.

Abstract

This display system 10 is a display system designed for cranes. This display system 10 has a display panel 18, a wireless communication sensor 11, and a controller 16. The display panel 18 displays a display image. The wireless communication sensor 11 is attached to an attachment-receiving object 13. The attachment-receiving object 13 includes a hook 14 and/or suspended load 15 of a crane. The controller 16 changes the display image on the basis of wireless signals acquired from the wireless communication sensor 11.

Description

Display system

Cross-reference to related applications

This application claims priority to Japanese Patent Application No. 2018-103170, filed in Japan on May 30, 2018, the entire disclosure of which is incorporated herein by reference.

The present invention relates to a display system for a crane.

Cranes are used to transport heavy objects during cargo handling to and from transport equipment, and to carry materials during construction work. For a suspended load, that is, a heavy object suspended by a crane, it is required to reduce swinging of the load. Various mechanisms for preventing the swinging of a suspended load have accordingly been proposed (see Patent Document 1).

Patent Document 1: Japanese Unexamined Patent Application Publication No. H11-035283
The display system according to a first aspect is a display system for a crane, comprising:
a display panel that displays a display image;
a wireless communication sensor attached to an attachment including at least one of a hook and a suspended load of the crane; and
a controller that changes the display image based on a wireless signal acquired from the wireless communication sensor.
FIG. 1 is a functional block diagram showing the schematic configuration of a display system according to one embodiment. FIG. 2 is a view of the display panel and first optical element of FIG. 1 seen from the optical element side. FIG. 3 shows an example of the posture of the posture visually recognized object displayed on the display panel of FIG. 1. FIG. 4 shows an example of the acceleration displayed on the display panel of FIG. 1. FIG. 5 shows another example of the acceleration displayed on the display panel of FIG. 1. FIG. 6 shows an example of the trajectory displayed together with the acceleration on the display panel of FIG. 1. FIG. 7 is a diagram for explaining the parallax image displayed on the display panel of FIG. 1.
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings.

As shown in FIG. 1, a display system 10 according to an embodiment of the present disclosure includes a wireless communication sensor 11 and a display device 12.

The display system 10 is a display system for a crane and may be mounted on the crane. The display system 10 may share part of its configuration with other devices and parts included in the crane.

The wireless communication sensor 11 is attached to the attachment 13. The attachment 13 includes at least one of a hook 14 of the crane and a suspended load 15. The wireless communication sensor 11 has a wireless communication function and transmits the values it measures, as wireless signals, to the controller 16 described later.

The wireless communication sensor 11 may measure state values for calculating the motion state of the attachment 13. The state values may include, for example, at least one of acceleration, angular velocity, and geomagnetism. In other words, the wireless communication sensor 11 may include at least one of an acceleration sensor, an angular-velocity sensor, and a geomagnetic sensor, and may include a 6-axis or 9-axis sensor combining a plurality of these sensors.
The display device 12 includes a display unit 17 and the controller 16.
The display unit 17 includes at least a display panel 18. In a configuration employing a display panel 18 of a type other than self-luminous, the display unit 17 may include an irradiator 19. The display unit 17 may include a first optical element 20 having a function described later, so as to provide the operator with a stereoscopic view. The display unit 17 may be a head-up display (HUD) that includes a second optical element 21 having a function described later and lets the operator view a virtual image.
The display panel 18 may be, for example, a transmissive liquid-crystal display panel. The display panel 18 forms and displays a display image generated by the controller 16. As shown in FIG. 2, the display panel 18 has, on a plate-like surface, a plurality of partitioned regions divided in the horizontal and vertical directions by a grid-like black matrix 22.
The black matrix 22 has first black lines 23 extending in the vertical direction and second black lines 24 extending in the horizontal direction. In the black matrix 22, the plurality of first black lines 23 are arranged in the horizontal direction at, for example, a constant pitch, and the plurality of second black lines 24 are arranged in the vertical direction at, for example, a constant pitch. The first black lines 23 and the second black lines 24 together define the plurality of partitioned regions.
One subpixel corresponds to each partitioned region. The subpixels are arranged in a matrix in the horizontal and vertical directions. Each subpixel corresponds to one of the colors R, G, and B, and a set of three subpixels, one each for R, G, and B, may constitute one pixel.
In the present embodiment, the display panel 18 may be mounted in the crane so that, for example, the direction in which the subpixels constituting one pixel are lined up is the horizontal direction, and the direction in which subpixels of the same color are lined up is the vertical direction. The display panel 18 is not limited to a transmissive liquid-crystal panel; other display panels, such as an organic EL panel, may also be used.
As shown in FIG. 1, the irradiator 19 is arranged on one surface side of the display panel 18. The irradiator 19 may include a light source, a light-guide plate, a diffusion plate, a diffusion sheet, and the like. The irradiator 19 emits light from the light source, makes the light uniform in the surface direction of the display panel 18 by means of the light-guide plate, diffusion plate, diffusion sheet, and the like, and emits the uniformized light toward the display panel 18.
The first optical element 20 defines the ray direction of the image light of the display image emitted from the subpixels. The first optical element 20 may be, for example, a parallax barrier or a lenticular lens. The first optical element 20 in a configuration employing a parallax barrier is described below.
The first optical element 20 changes, for each of a plurality of opening regions 25 extending in a predetermined direction over the display panel 18, the ray direction in which the image light emitted from the subpixels propagates. The first optical element 20 thereby determines the range within which the image light emitted from each subpixel can be viewed. The first optical element 20 is located on the side of the display panel 18 opposite the irradiator 19, but may instead be located on the irradiator 19 side of the display panel 18.
Specifically, the first optical element 20 has a plurality of light-shielding surfaces 26 that block image light. Mutually adjacent light-shielding surfaces 26 define the opening regions 25 between them. The opening regions 25 have a higher light transmittance than the light-shielding surfaces 26, and the light-shielding surfaces 26 have a lower light transmittance than the opening regions 25.
Each opening region 25 is a portion that transmits light incident on the first optical element 20, and may transmit light with a transmittance equal to or higher than a first predetermined value. The first predetermined value may be, for example, 100% or a value close to 100%. Each light-shielding surface 26 is a portion that blocks light incident on the first optical element 20 and does not transmit it, thereby blocking the image displayed on the display panel 18. The light-shielding surface 26 may block light with a transmittance equal to or lower than a second predetermined value. The second predetermined value may be, for example, 0% or a value close to 0%.
The opening regions 25 and the light-shielding surfaces 26 are arranged so as to alternate in the horizontal and vertical directions. The edges of the opening regions 25 define, for each of a plurality of band-like regions extending in a predetermined direction on the surface of the display panel 18, the ray direction of the image light emitted from the subpixels.
If the lines marking the edges of the opening regions 25 ran along the vertical direction, moire would easily be perceived in the display image because of errors in the subpixel arrangement or in the dimensions of the opening regions 25. When those edge lines instead extend in a direction at a predetermined angle to the vertical, moire is less likely to be perceived in the display image regardless of such errors.
The first optical element 20 may be formed of a film or plate-like member having a transmittance less than the second predetermined value. In this configuration, the light-shielding surfaces 26 are formed by the film or plate-like member itself, and the opening regions 25 are formed by openings provided in it. The film may be made of resin or another material, and the plate-like member may be made of resin, metal, or another material. The first optical element 20 is not limited to a film or plate-like member and may be formed of another kind of member. The base material of the first optical element 20 may itself be light-shielding, or the base material may contain a light-shielding additive.
The first optical element 20 may be formed of a liquid-crystal shutter, which can control its light transmittance according to an applied voltage. The liquid-crystal shutter may consist of a plurality of pixels and control the light transmittance of each pixel, and can thereby form regions of high or low light transmittance in arbitrary shapes. When the first optical element 20 is formed of a liquid-crystal shutter, the opening regions 25 may be regions having a transmittance equal to or higher than the first predetermined value, and the light-shielding surfaces 26 may be regions having a transmittance equal to or lower than the second predetermined value.
The first optical element 20 propagates the image light emitted from the subpixels behind some of the opening regions 25 to the position of the user's right eye, and the image light emitted from the subpixels behind other opening regions 25 to the position of the user's left eye. The first optical element 20 is located a predetermined distance away from the surface of the display panel 18 and includes the light-shielding surfaces 26 arranged in slit form.
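The disclosure does not give numeric relations for this routing of light to the two eyes, but the textbook two-view parallax-barrier geometry can be sketched as follows; the pitch, eye-separation, and distance values are illustrative assumptions only.

```python
def barrier_geometry(subpixel_pitch, eye_separation, viewing_distance):
    """Return (gap, barrier_pitch) for a two-view parallax barrier.

    By similar triangles through one opening region:
      gap (panel to barrier)  g = p * d / E
      barrier pitch           B = 2 * p * d / (d + g)
    so one barrier period is slightly shorter than two subpixel pitches,
    which steers adjacent subpixel columns to the left and right eyes.
    """
    g = subpixel_pitch * viewing_distance / eye_separation
    b = 2.0 * subpixel_pitch * viewing_distance / (viewing_distance + g)
    return g, b

# Illustrative numbers: 0.1 mm subpixel pitch, 65 mm eye separation,
# 700 mm viewing distance (all in millimetres)
gap, pitch = barrier_geometry(0.1, 65.0, 700.0)
```

Note that the computed barrier pitch comes out just under two subpixel pitches, which is what keeps the left-eye and right-eye viewing zones aligned across the whole panel width.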
When image light transmitted through the opening regions 25 of the first optical element 20 reaches the user's eyes, the user's left eye can view the band-like visible regions on the display panel 18 corresponding to the opening regions 25. Because image light is blocked by the light-shielding surfaces 26 of the first optical element 20, the user's left eye cannot view the invisible regions on the display panel 18 corresponding to the light-shielding surfaces 26.
The second optical element 21 reflects the image light of the display image displayed on the display panel 18 while transmitting light from outside the crane (external light). The second optical element 21 may be, for example, the windshield of the crane.
To reflect the image light, the second optical element 21 may be arranged on the optical path of the image light from the display panel 18, for example in the emission direction of that image light. In a configuration in which a reflective element such as a mirror is arranged between the display panel 18 and the second optical element 21 on the optical path, the second optical element 21 may be arranged in the direction in which that reflective element reflects the image light from the display panel 18.
The image light reflected by the second optical element 21 reaches the operator's left and right eyes. The display device 12 thus causes the image light to travel from the display panel 18 to the operator's left and right eyes along the optical path indicated by the broken line. The operator can view the image light arriving along this optical path as a virtual image 27.
The controller 16 is configured, for example, as a processor, and may include one or more processors. The processors may include general-purpose processors that load a specific program to execute a specific function, and dedicated processors specialized for specific processing. The dedicated processors may include an application-specific integrated circuit (ASIC: Application Specific Integrated Circuit). The processors may include a programmable logic device (PLD: Programmable Logic Device), and the PLD may include an FPGA (Field-Programmable Gate Array). The controller 16 may be either an SoC (System-on-a-Chip) or a SiP (System In a Package) in which one or more processors cooperate. The controller 16 may include a storage unit in which various information, programs for operating each component of the display system 10, and the like are stored. The storage unit may be formed of, for example, semiconductor memory, and may function as work memory for the controller 16.
The controller 16 is connected to each component of the display device 12, such as the display panel 18, and controls each of those components. For example, the controller 16 changes the display image based on the wireless signals acquired from the wireless communication sensor 11 and causes the display panel 18 to display it.
The display image changed based on the wireless signals may include, for example, an object whose appearance differs depending on the viewing direction (hereinafter also called a "posture visual-recognition object") and the acceleration acting on the attachment 13.
The posture visual-recognition object may be any three-dimensional object other than a sphere, for example a plate-like object such as a flat plate. Alternatively, the posture visual-recognition object may be a sphere bearing an identifier on its surface. The identifier may be, for example, lines of latitude or longitude on the sphere, or coloring applied to only part of the surface.
The controller 16 changes the posture of the posture visual-recognition object displayed on the display panel 18 in accordance with the posture of the attachment 13. To change the posture of the posture visual-recognition object, the controller 16 calculates the posture of the attachment 13, and may calculate it as an inclination relative to a standard state of the attachment 13. The standard state of the attachment 13 is, for example, the state in which the attachment 13 is lifted by the crane and at rest.
For the posture calculation, the controller 16 may acquire and use information other than the wireless signals from the crane. For example, in a configuration in which the wireless communication sensor 11 includes only an acceleration sensor, the controller 16 may acquire the payout length of the crane's winch, calculate from it the turning radius from the crane's boom to the attachment 13, and use that radius in the posture calculation.
The controller 16 may calculate the inclination of the attachment 13 about at least one axis. In the present embodiment, the controller 16 calculates the inclinations about three axes: the pitch, yaw, and roll axes of the crane.
In the present embodiment, the pitch axis of the crane is an axis parallel to the crane's left-right direction, the yaw axis is an axis parallel to the crane's up-down direction, and the roll axis is an axis parallel to the crane's front-rear direction. The crane's left-right, up-down, and front-rear directions are defined with respect to the crane's operator seat. To calculate the inclinations about the three pitch, yaw, and roll axes, the wireless communication sensor 11 may be attached to the attachment 13 at a predetermined position and in a predetermined orientation, or the actual attachment position and orientation of the wireless communication sensor 11 on the attachment 13 may be detected and used in the calculation.
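As one non-limiting sketch of such a calculation, when the sensor reports acceleration, the usual gravity-vector decomposition yields the tilts about the pitch and roll axes (the yaw angle would additionally require the angular-velocity or geomagnetic sensor). The axis assignment below is an assumption for illustration, not something the disclosure specifies.

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Inclination of the attachment relative to the standard (hanging, at-rest) state.

    Assumed axes: x along the crane's front-rear (roll) axis, y along the
    left-right (pitch) axis, z along the up-down (yaw) axis, so that at rest
    the accelerometer reads (0, 0, g).
    """
    pitch = math.atan2(ax, math.hypot(ay, az))  # tilt about the pitch axis, rad
    roll = math.atan2(ay, math.hypot(ax, az))   # tilt about the roll axis, rad
    return pitch, roll

# At rest the sensor sees only gravity, so both tilts are zero.
p, r = tilt_from_gravity(0.0, 0.0, 9.81)
```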
The controller 16 generates a display image of the posture visual-recognition object in a posture corresponding to the calculated posture of the attachment 13. In the present embodiment, the posture visual-recognition object is predetermined to be a plate-like object whose plate surface, in the standard state, is perpendicular to the crane's up-down direction and whose side face containing the long sides of the plate surface is perpendicular to the front-rear direction. Given this standard state, when the attachment 13 is inclined forward and to the right, the controller 16 generates a display image of the shape obtained by viewing the standard-state posture visual-recognition object obj from above and from the right, as illustrated in FIG. 3. The controller 16 changes the posture of the posture visual-recognition object obj by supplying the generated display image to the display panel 18 for display.
The controller 16 may change the depth of the posture visual-recognition object obj based on the wireless signals. To change the depth, the controller 16 calculates the inclination of the attachment 13 about the pitch axis based on the wireless signals, further acquires the payout length of the crane's winch, and calculates the turning radius from the crane's boom to the attachment 13.
The controller 16 calculates the position of the attachment 13 in the crane's front-rear direction based on the inclination about the pitch axis and the turning radius, and generates a display image of the posture visual-recognition object obj at a size corresponding to the calculated position. A size corresponding to the position means larger the closer the position is to the rear of the crane and smaller the closer it is to the front. In a configuration in which the display image is generated as a parallax image, described later, the controller 16 may generate a suggestive image that makes the posture visual-recognition object obj appear to project forward of, or recede behind, the position where the virtual image 27 is viewed. The controller 16 changes the depth of the posture visual-recognition object obj by supplying the generated display image to the display panel 18 for display.
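The position and size computation above can be sketched as follows. The pendulum model (displacement as radius times the sine of the pitch tilt) and the linear position-to-scale mapping are illustrative assumptions of this description, not values taken from the disclosure.

```python
import math

def front_rear_position(pitch_rad, turning_radius):
    # Displacement of the attachment along the crane's front-rear axis,
    # treating it as a pendulum of length equal to the turning radius.
    # Positive pitch is assumed to mean a swing toward the rear.
    return turning_radius * math.sin(pitch_rad)

def display_scale(x, x_front, x_rear, s_min=0.5, s_max=1.5):
    """Map a front-rear position to a drawing scale for the posture
    visual-recognition object: larger toward the rear (x_rear), smaller
    toward the front (x_front)."""
    t = (x - x_front) / (x_rear - x_front)
    return s_min + t * (s_max - s_min)

# Example: 10-degree pitch tilt with an 8 m turning radius.
x = front_rear_position(math.radians(10.0), 8.0)
```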
The controller 16 may calculate the magnitude of the acceleration acting on the attachment 13 along the crane's front-rear and left-right directions, and generates a display image showing the calculated magnitudes in the front-rear and left-right directions. The controller 16 updates the displayed acceleration of the attachment 13 to the current value by supplying the generated display image to the display panel 18 for display.
As illustrated in FIG. 4, the controller 16 may generate a display image in which the front-rear acceleration is shown as an arrow 29 extending vertically on a vertically extending one-dimensional coordinate system 28, and the left-right acceleration as an arrow 31 extending horizontally on a horizontally extending one-dimensional coordinate system 30. Alternatively, as illustrated in FIG. 5, the controller 16 may generate a display image in which the front-rear and left-right accelerations are shown together as an arrow 33 extending in an arbitrary direction on a two-dimensional coordinate system 32 extending vertically and horizontally. As shown in FIGS. 4 and 5, the controller 16 may generate the display image so that it includes, together with the acceleration of the attachment 13, a guide 34 for cancelling that acceleration.
The controller 16 may display the acceleration acting on the attachment 13 with its direction of action reversed. In an image with the direction of action reversed, the arrow indicating the acceleration in the display images illustrated in FIGS. 4 and 5 is flipped about the origin.
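A minimal sketch of this reversal: the displayed arrow is the measured acceleration vector negated component-wise, which is equivalent to flipping it about the origin of the two-dimensional coordinate system. The (front-rear, left-right) tuple layout is an assumption of this description.

```python
def reversed_arrow(accel_front_rear, accel_left_right):
    """Flip the acceleration arrow about the origin, so that it points in
    the direction in which force should be applied to cancel the swing."""
    return (-accel_front_rear, -accel_left_right)

arrow = reversed_arrow(1.2, -0.4)
```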
Together with the acceleration of the attachment 13, the controller 16 may display the trajectory of the acceleration as a display image. As illustrated in FIG. 6, the controller 16 may generate a display image in which the trajectory of the front-rear and left-right accelerations is shown as a collection of points 35 on the two-dimensional coordinate system 32 extending vertically and horizontally.
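The collection of points 35 can be sketched as a bounded history of recent acceleration samples; the buffer length is an arbitrary choice for illustration, as the disclosure does not specify how many past samples are drawn.

```python
from collections import deque

class AccelerationTrace:
    """Recent (front-rear, left-right) acceleration samples, drawn as the
    collection of points on the two-dimensional coordinate system."""

    def __init__(self, max_points=100):
        # Oldest samples are discarded automatically once the buffer is full.
        self._points = deque(maxlen=max_points)

    def add(self, accel_front_rear, accel_left_right):
        self._points.append((accel_front_rear, accel_left_right))

    def points(self):
        return list(self._points)

trace = AccelerationTrace(max_points=3)
for sample in [(0.0, 0.0), (0.5, 0.1), (0.4, -0.2), (0.1, -0.3)]:
    trace.add(*sample)
```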
In a configuration in which the display unit 17 has the first optical element 20, the controller 16 may generate a parallax image corresponding to the display image described above. A parallax image is an image that allows a three-dimensional image to be perceived through stereoscopic viewing.
In a parallax image, the display image is treated as a solid object: the image of any first point of the solid object as seen by the right eye and the image of that point as seen by the left eye are placed at positions slightly offset from each other in the left-right direction. The pair of right-eye and left-eye images of a second point horizontally adjacent to the first point is placed at a position offset in the left-right direction from the pair of images corresponding to the first point.
As shown in FIG. 7, the controller 16 generates a parallax image in which the images of each point of the display image as seen from the right eye RE and the left eye LE are placed in the right-eye image regions RA and the left-eye image regions LA on the display panel 18, these being the regions that the right eye RE and left eye LE of an operator at a specific position relative to the display unit 17 can view through the opening regions 25 of the first optical element 20. The specific position of the operator relative to the display unit 17 may be predetermined as an ideal state, or may be detected using a camera or the like.
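The placement into right-eye and left-eye image regions can be sketched for the simplest case, assuming the visible regions alternate subpixel column by subpixel column; the actual layout depends on the barrier design and the operator's position.

```python
def compose_parallax_row(left_row, right_row):
    """Compose one row of the parallax image: even-indexed subpixel columns
    are taken from the left-eye image (regions LA), odd-indexed columns
    from the right-eye image (regions RA)."""
    assert len(left_row) == len(right_row)
    return [left_row[i] if i % 2 == 0 else right_row[i]
            for i in range(len(left_row))]

row = compose_parallax_row(["L0", "L1", "L2", "L3"],
                           ["R0", "R1", "R2", "R3"])
```

Applied to every row of the panel, this yields the interleaved image whose columns the first optical element 20 routes separately to the two eyes.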
 以上のような構成の本実施形態の表示システム10は、被取付物13に取付けられる無線通信センサ11から取得する無線信号に基づいて、表示画像を変更する。被取付物13の外観だけによる被取付物13の揺動の低減操作は困難となりうる。一方で、上述のような構成により、表示システム10は、無線通信センサ11が測定する被取付物13の状態の変化という、吊荷15の揺動を低減させるための情報を操作員に供与し得る。 The display system 10 of the present embodiment configured as described above changes the display image based on a wireless signal acquired from the wireless communication sensor 11 attached to the attachment 13. The operation of reducing the swing of the attachment 13 only by the appearance of the attachment 13 can be difficult. On the other hand, with the configuration as described above, the display system 10 provides the operator with information for reducing the swing of the suspended load 15, that is, a change in the state of the attachment 13 measured by the wireless communication sensor 11. obtain.
 また、本実施形態の表示システム10は、表示画像の光線方向を規定する第1光学素子20および表示パネル18からの画像光を反射する第2光学素子21を、さらに備え、且つ表示画像は視差画像である。このような構成により、表示システム10は、被取付物13の状態を3次元状に知覚させ得る。後述するように、表示画像が姿勢視認物体objを含む構成においては、表示システム10は、姿勢視認物体objを3次元状に知覚させることにより、被取付物13の状態をより明確に認識させ得る。 The display system 10 of the present embodiment further includes a first optical element 20 that defines the light beam direction of the display image and a second optical element 21 that reflects the image light from the display panel 18, and the display image has a parallax. It is an image. With such a configuration, the display system 10 can perceive the state of the attachment 13 in a three-dimensional manner. As will be described later, in the configuration in which the display image includes the posture visual recognition object obj, the display system 10 can make the posture visual recognition object obj perceived in a three-dimensional manner, thereby more clearly recognizing the state of the attachment 13. .
 また、本実施形態の表示システム10では、コントローラ16は、見る方向によって外観が異なる姿勢視認物体objを、被取付物13の姿勢に応じて変更した表示画像として表示させる。吊荷15の形状が、例えば、球体のように、見る方向による外観の変化に乏しい場合には、クレーンの前後方向の縦揺れの把握は難しかった。しかし、上述の構成によれば、表示システム10では、姿勢視認物体objの表示により縦揺れに対する外観の変化が明確となるので、操作員に吊荷15の縦揺れを容易に把握させ得る。 Further, in the display system 10 of the present embodiment, the controller 16 displays the posture visually recognized object obj having a different appearance depending on the viewing direction as a display image changed according to the posture of the attachment 13. When the shape of the suspended load 15 is poor in change in appearance depending on the viewing direction, such as a sphere, it is difficult to grasp the longitudinal swing of the crane in the front-rear direction. However, according to the above-described configuration, in the display system 10, the change in appearance with respect to pitching becomes clear by displaying the posture visually recognized object obj, so that the operator can easily grasp the pitching of the suspended load 15.
 また、本実施形態の表示システム10では、コントローラ16は、無線信号に基づいて姿勢視認物体objの奥行きを変更する。このような構成により、表示システム10は、視認だけでは困難な、操作員から比較的遠い位置にある吊荷15の縦揺れを、操作員に容易に把握させ得る。 Further, in the display system 10 of the present embodiment, the controller 16 changes the depth of the posture visually recognized object obj based on the wireless signal. With such a configuration, the display system 10 can allow the operator to easily grasp the pitching of the suspended load 15 that is relatively far from the operator, which is difficult only by visual recognition.
 また、本実施形態の表示システム10では、コントローラ16は、被取付物13に作用する加速度を表示画像として表示させる。このような構成により、表示システム10は、操作員に、吊荷15の揺動を押さえる操作のために有益な、被取付物13に作用している力の方向および大きさなどを認識させ得る。 In the display system 10 of the present embodiment, the controller 16 displays the acceleration acting on the attachment 13 as a display image. With such a configuration, the display system 10 can make the operator recognize the direction and magnitude of the force acting on the attachment 13 that is useful for the operation of suppressing the swing of the suspended load 15. .
 また、本実施形態の表示システム10では、コントローラ16は、被取付物13に作用する加速度を、作用方向を反転して表示させる。吊荷15の揺動を低減するためには、吊荷15に作用する力とは逆方向の力を印加することが望ましい。それゆえ、上述の構成の表示システム10では、被取付物13に付与すべき力と同じ方向を向く加速度が表示されるので、操作員に力を付与すべき方向を直接認識させ得る。 In the display system 10 of the present embodiment, the controller 16 displays the acceleration acting on the attachment 13 with the direction of action reversed. In order to reduce the swing of the suspended load 15, it is desirable to apply a force in the direction opposite to the force acting on the suspended load 15. Therefore, in the display system 10 having the above-described configuration, the acceleration directed in the same direction as the force to be applied to the attachment 13 is displayed, so that the operator can directly recognize the direction in which the force should be applied.
 また、本実施形態の表示システム10では、コントローラ16は加速度の軌跡を表示させる。このような構成により、表示システム10は、操作員による揺動の低減操作が反映された、加速度の軌跡を提供し得る。したがって、表示システム10は、操作員の低減操作の習熟の度合いを客観的に判断する情報を提供し得る。 In the display system 10 of the present embodiment, the controller 16 displays the acceleration locus. With such a configuration, the display system 10 can provide a trajectory of acceleration that reflects the swing reduction operation by the operator. Therefore, the display system 10 can provide information for objectively determining the degree of mastery of the operator's reduction operation.
 In the display system 10 of this embodiment, the controller 16 displays a guide 34 for canceling the acceleration of the attachment 13. With this configuration, the display system 10 lets the operator directly recognize the direction in which to apply force.
 Although the present invention has been described with reference to the drawings and examples, note that those skilled in the art can easily make various changes and modifications based on this disclosure. These variations and modifications are therefore included within the scope of the present invention.
 For example, in this embodiment the display unit 17 is a head-up display that lets the operator view a virtual image 27, but it may instead be a display whose display panel 18 is viewed directly. Also, in this embodiment the display unit 17 includes the first optical element 20 and displays a parallax image on the display panel 18 so that the operator can view the image stereoscopically, but it may be configured for planar viewing.
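The stereoscopic path described above pairs a parallax image on the display panel 18 with the first optical element 20, which directs part of the panel to each eye (left-eye image region LA, right-eye image region RA). A common layout is column-wise interleaving; that layout, and the function below, are assumptions for the sketch, not details taken from the embodiment.

```python
def interleave_parallax(left_cols, right_cols):
    """Interleave left-eye and right-eye image columns into one
    parallax image, alternating column by column.

    Column-wise alternation is an assumed arrangement of regions
    LA and RA on display panel 18; the first optical element 20
    would then restrict each alternate column to the matching eye.
    """
    if len(left_cols) != len(right_cols):
        raise ValueError("left and right images must have equal width")
    out = []
    for left_col, right_col in zip(left_cols, right_cols):
        out.append(left_col)   # column visible to the left eye (LA)
        out.append(right_col)  # column visible to the right eye (RA)
    return out
```

Because each eye sees a slightly different image, the viewer can perceive the displayed object in three dimensions without special glasses; this is the general principle behind parallax-barrier and lenticular displays.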
 Note that this disclosure describes a system having various modules and/or units that perform specific functions. These modules and units are shown schematically in order to explain their functionality briefly, and do not necessarily indicate specific hardware and/or software. In that sense, these modules, units, and other components need only be hardware and/or software implemented so as to substantially perform the specific functions described here. The various functions of different components may be any combination or separation of hardware and/or software, and may each be used separately or in any combination. Input/output or I/O devices or user interfaces, including but not limited to keyboards, displays, touch screens, and pointing devices, can be connected to the system directly or through intervening I/O controllers. Thus, the various aspects of the present disclosure can be implemented in many different ways, all of which are within the scope of the present disclosure.
DESCRIPTION OF SYMBOLS
 10 display system
 11 wireless communication sensor
 12 display device
 13 attachment
 14 hook
 15 suspended load
 16 controller
 17 display unit
 18 display panel
 19 irradiator
 20 first optical element
 21 second optical element
 22 black matrix
 23 first black line
 24 second black line
 25 opening region
 26 light-shielding surface
 27 virtual image
 28 one-dimensional coordinate system extending in the vertical direction
 29 arrow extending in the vertical direction
 30 one-dimensional coordinate system extending in the horizontal direction
 31 arrow extending in the horizontal direction
 32 two-dimensional coordinate system extending in the vertical and horizontal directions
 33 arrow extending in an arbitrary direction
 34 guide
 35 collection of points
 LA left-eye image region
 LE left eye
 obj attitude-indicating object
 RA right-eye image region
 RE right eye

Claims (10)

  1.  A display system for a crane, comprising:
      a display panel configured to display a display image;
      a wireless communication sensor attached to an attachment including at least one of a hook and a suspended load of the crane; and
      a controller configured to change the display image based on a wireless signal acquired from the wireless communication sensor.
  2.  The display system according to claim 1, further comprising:
      a first optical element that defines a ray direction of the display image; and
      a second optical element that reflects image light from the display panel and transmits external light,
      wherein the display image is a parallax image capable of causing a three-dimensional image to be perceived by stereoscopic viewing.
  3.  The display system according to claim 1 or 2, wherein the wireless communication sensor comprises a sensor that measures at least one of acceleration, angular velocity, and geomagnetism.
  4.  The display system according to claim 3, wherein the controller causes the display panel to display, as the display image, an object whose appearance differs depending on the viewing direction, based on the wireless signal.
  5.  The display system according to claim 4, wherein the controller changes the attitude of the object in accordance with the attitude of the attachment.
  6.  The display system according to claim 5, wherein the controller changes the depth of the object based on the wireless signal.
  7.  The display system according to claim 3, wherein the controller causes the display panel to display, as the display image, an acceleration acting on the attachment, based on the wireless signal.
  8.  The display system according to claim 7, wherein the controller displays the acceleration acting on the attachment with its direction of action reversed.
  9.  The display system according to claim 7 or 8, wherein the controller causes the display panel to display the locus of the acceleration as the display image.
  10.  The display system according to any one of claims 7 to 9, wherein the controller causes the display panel to display, as the display image, a guide for canceling the acceleration of the attachment.
PCT/JP2019/019264 2018-05-30 2019-05-15 Display system WO2019230395A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-103170 2018-05-30
JP2018103170 2018-05-30

Publications (1)

Publication Number Publication Date
WO2019230395A1 true WO2019230395A1 (en) 2019-12-05

Family

ID=68696658

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/019264 WO2019230395A1 (en) 2018-05-30 2019-05-15 Display system

Country Status (1)

Country Link
WO (1) WO2019230395A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5129159A (en) * 1974-09-04 1976-03-12 Nippon Steel Corp Buumusentanbu no undokenshutsuhoho oyobi sochi
JPH05147882A (en) * 1991-11-29 1993-06-15 Shimadzu Corp Crane controller
JP2011079648A (en) * 2009-10-08 2011-04-21 Hitachi Plant Technologies Ltd Stationary image display system
US20120255188A1 (en) * 2009-11-20 2012-10-11 Sany Automobile Manufacture Co., Ltd. Hook pose detecting equipment and crane
JP2014174019A (en) * 2013-03-08 2014-09-22 Mitsui Eng & Shipbuild Co Ltd Device and method to measure gravity center of suspended load
JP2014237506A (en) * 2013-06-06 2014-12-18 清水建設株式会社 Construction support device, member information reading device, crane, construction support method, member information reading method and program
JP2015215509A (en) * 2014-05-12 2015-12-03 パナソニックIpマネジメント株式会社 Display apparatus, display method and program
JP2016033064A (en) * 2014-07-31 2016-03-10 あおみ建設株式会社 Underwater work device and underwater work method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19811064; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19811064; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)