JP6424133B2 - Video display system

Info

Publication number
JP6424133B2
Authority
JP
Japan
Prior art keywords
optical element
display system
holographic optical
device
element surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2015087763A
Other languages
Japanese (ja)
Other versions
JP2016208273A (en)
Inventor
木村 真治
中西 美木子
高橋 和彦
油川 雄司
Original Assignee
株式会社NTTドコモ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社NTTドコモ
Priority to JP2015087763A
Publication of JP2016208273A
Application granted
Publication of JP6424133B2
Application status: Active
Anticipated expiration

Description

  The present invention relates to a video display system used for a video phone or a video conference system.

  In recent years, videophone systems and video conference systems have become widespread (for example, Non-Patent Document 1 and Non-Patent Document 2). The videophone system of Non-Patent Document 1 captures video and audio on the transmitting side (receiving side) with a TV camera and microphone and reproduces them on the receiving side (transmitting side) using an Internet-connected TV and a dedicated application. Similarly, the video conference system of Non-Patent Document 2 captures video and audio on the transmitting side (receiving side) with a dedicated camera and microphone and reproduces them on the receiving side (transmitting side) with a monitor or projector, thereby realizing a remote meeting.

Panasonic Corporation, "A lot of things that digital home appliances don't know! How to enjoy TV! Enjoy videophones with family and friends on a big screen!", [Online], Panasonic Corporation, September 3, 2014, [Heisei Era Search on April 7, 2015], Internet <URL: http: //panasonic.jp/blog/viera/2014/09/post-4.html> Panasonic Corporation, "Panasonic's Video Conferencing / Video Conferencing System-HD Com", [online], Panasonic Corporation, [April 7, 2015 Search], Internet <URL: http://panasonic.biz/com / visual />

  In the videophone and video conference systems described above, the position of the face of the receiver (sender) shown on the monitor, TV, screen, or the like does not coincide with the position of the camera that captures the face of the sender (receiver). As a result, the lines of sight of the receiver and the sender do not meet, which hinders natural communication between them. An object of the present invention is therefore to provide a video display system that realizes a videophone or video conference system in which users can communicate naturally without being conscious of the equipment.

  The video display system of the present invention includes a first holographic optical element surface, a photographing device, a second holographic optical element surface, and a projection device.

  The photographing device photographs the reflected light from the first holographic optical element surface and transmits the resulting video to the projection device on the receiving side. The second holographic optical element surface is disposed parallel to the first holographic optical element surface. Suppose that the light imaged on the image sensor of the photographing device instead passed through the first and second holographic optical element surfaces without being reflected or refracted and formed an image on the image sensor of a virtual photographing device; the point representing the position of that virtual photographing device is defined as point T. The projection device is arranged outside the cone that has point T as its apex and contains the boundary of the first holographic optical element surface on its lateral surface, and projects the video photographed by the photographing device on the receiving side onto the second holographic optical element surface.

  According to the video display system of the present invention, it is possible to realize a videophone or a video conference system in which a user can communicate naturally without being aware of the device.

FIG. 1 is a schematic perspective view illustrating the configuration of the video display system according to the first embodiment. FIG. 2 is a schematic side view showing the configuration of the video display system according to the first embodiment. FIG. 3 is a flowchart illustrating the operation of the video display system according to the first embodiment. FIG. 4 is a diagram illustrating an example in which the reflectance of the first HOE surface and the second HOE surface of the video display system of the first embodiment differs for each wavelength.

  Hereinafter, embodiments of the present invention will be described in detail. Components having the same function are given the same reference numerals, and duplicate description is omitted.

  Hereinafter, the configuration and operation of the video display system according to the first embodiment will be described with reference to FIGS. 1, 2, and 3. FIG. 1 is a schematic perspective view showing the configuration of the video display system 1 of the present embodiment. FIG. 2 is a schematic side view showing the configuration of the video display system 1 of the present embodiment. FIG. 3 is a flowchart showing the operation of the video display system 1 of the present embodiment. The video display system 1 of the present embodiment has the same components on the receiving side and the transmitting side. The following description assumes that a user 9-1 and a user 9-2 converse using this system.

  The video display system 1 of the present embodiment includes, as equipment used by the user 9-1, a photographing device 10-1, a first holographic optical element surface 11-1 (hereinafter also referred to as the first HOE surface 11-1), a second holographic optical element surface 12-1 (hereinafter also referred to as the second HOE surface 12-1), and a projection device 13-1. Likewise, the video display system 1 of the present embodiment includes, as equipment used by the user 9-2, a photographing device 10-2, a first holographic optical element surface 11-2 (hereinafter also referred to as the first HOE surface 11-2), a second holographic optical element surface 12-2 (hereinafter also referred to as the second HOE surface 12-2), and a projection device 13-2, all similar to the above. Although not shown, the video display system 1 may include a control device (typically a computer) that controls the photographing device 10-1 (10-2) and the projection device 13-1 (13-2); alternatively, the function of controlling the system may be incorporated in either the photographing device 10-1 (10-2) or the projection device 13-1 (13-2). Since the equipment on the user 9-2 side is identical to that on the user 9-1 side as described above, its description is omitted as appropriate, and the equipment on the user 9-1 side is mainly described below.

  As shown in FIG. 1, the first HOE surface 11-1 can be a horizontally long rectangular surface, for example. In the following description, the four vertices of the first HOE surface 11-1 are referred to as V1, V2, V3, and V4 in the clockwise order from the top left vertex when viewed from the user 9-1. The first HOE surface 11-1 may have a shape other than this, for example, a circular or elliptical surface.

  The first HOE surface 11-1 reflects and condenses incident light at a predetermined angle (S1). Of the light emitted from the user 9-1, a predetermined wavelength component of the light incident on the first HOE surface 11-1 at a predetermined angle is selectively reflected by the first HOE surface 11-1 and condensed onto the lens of the photographing device 10-1. The trajectory of this light is represented by a broken line in FIG. 1. Note that the photographing device 10-1 is installed above the user 9-1, for example near the ceiling, so that it neither obstructs the field of view of the user 9-1 nor appears in the video observed by the user 9-2.

  As described above, the first HOE surface 11-1 is a holographic optical element (hologram). A hologram is known to have wavefront reproducibility and wavelength selectivity. Wavefront reproducibility is the property of reproducing the optical path used during hologram exposure. Wavelength selectivity is the property that only light of the wavelength used during hologram exposure is refracted or reflected, while light of other wavelengths is transmitted. The reflection characteristics of the first HOE surface 11-1 can therefore be designed to select the direction and wavelength component of incident light. By appropriately designing these reflection characteristics, an image of the user 9-1 and the surroundings can be formed on the image sensor of the photographing device 10-1. The photographing device 10-1 photographs the reflected light from the first HOE surface 11-1 and transmits the resulting video to the projection device 13-2, which is the projection device on the receiving side (S2).

  The second HOE surface 12-1 is disposed parallel to the first HOE surface 11-1. As shown in FIG. 1, the second HOE surface 12-1 can be arranged farther from the user than the first HOE surface 11-1, although the arrangement is not limited to the example in FIG. 1. Like the first HOE surface 11-1, the second HOE surface 12-1 may be a horizontally long rectangular surface, or it may have another shape such as a circle or an ellipse. The second HOE surface 12-1 generally has the same size and shape as the first HOE surface 11-1, but it is not limited to this; for example, it may be larger than the first HOE surface 11-1, or its shape may differ from that of the first HOE surface 11-1. Like the first HOE surface 11-1, the second HOE surface 12-1 is designed to reflect incident light selectively according to its direction and wavelength component.

  The projection device 13-1 receives the video photographed by the photographing device 10-2 and projects it onto the second HOE surface 12-1 (S3). The user 9-1 can visually recognize the image projected by the projection device 13-1 onto the second HOE surface 12-1 (for example, an image P-2 of the user 9-2 or of the surroundings of the user 9-2). The second HOE surface 12-1 may be any optical material whose reflected light is visible to the user 9-1 and can be directed so as not to enter the photographing device 10-1; it need not be a holographic optical element. For example, the second HOE surface 12-1 can be replaced with a Fresnel lens, in which case the Fresnel lens reflects incident light at an angle predetermined by the lens design. If the image light projected by the projection device 13-1 is controlled so that it is not reflected toward the photographing device 10-1 or the projection device 13-1, the projected image is not captured by the photographing device 10-1, and the user 9-1 can view a brighter image than with diffuse reflection (such as from a wall).
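
  The flow of steps S1 to S3 can be summarized as a simple per-frame control loop running on each side of the system. The following Python sketch is purely illustrative: the class, its method names, and the link object are assumptions, since the patent specifies only the optical step S1 and the capture/transmit/project steps S2 and S3, not any concrete software interface.

```python
# Illustrative per-frame loop for one side of the system (e.g. the user 9-1 side).
# All names are assumptions; S1 (reflection and condensing by the first HOE
# surface) happens entirely in the optics and therefore does not appear here.

class VideoDisplayEndpoint:
    def __init__(self, camera, projector, link):
        self.camera = camera        # photographing device 10-1
        self.projector = projector  # projection device 13-1
        self.link = link            # connection to the other party's equipment

    def step(self):
        # S2: photograph the light reflected by the first HOE surface and
        # transmit the frame to the projection device on the receiving side.
        frame = self.camera.capture()
        self.link.send(frame)

        # S3: project the frame received from the other party's photographing
        # device onto the second HOE surface.
        remote_frame = self.link.receive()
        self.projector.project(remote_frame)
```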

<Arrangement of Projector 13-1>
The projection device 13-1 is arranged at a position where it does not obstruct use of the system. Specifically, the projection device 13-1 is arranged outside the cone whose apex is point T, the point representing the position of the virtual photographing device (shown by a broken line in the figure), and whose lateral surface contains the boundary of the first HOE surface 11-1 (in the example of FIG. 1, a quadrangular pyramid whose lateral surface contains the sides V1V2, V2V3, V3V4, and V4V1; see FIG. 2). The position of the virtual photographing device is the position the photographing device would occupy under the assumption that the light imaged on the image sensor of the photographing device 10-1 passes through the first and second HOE surfaces 11-1 and 12-1 without being reflected or refracted and forms an image on the image sensor of the virtual photographing device. Arranging the projection device 13-1 at such a position prevents the light emitted from the projection device 13-1 from being reflected by the first HOE surface 11-1 into the photographing device 10-1, and thus prevents the image from looking unnatural to the user 9-2.
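
As a minimal geometric sketch of this placement constraint, the Python function below assumes that the first HOE surface 11-1 is the planar rectangle V1V2V3V4 and treats the cone as the infinite pyramid with apex T passing through that rectangle; the function name and the coordinate representation are illustrative and not part of the patent.

```python
import numpy as np

def is_outside_cone(projector_pos, apex_t, rect_corners):
    """Return True if the projector lies outside the pyramid whose apex is the
    virtual-camera point T and whose lateral surface contains the boundary
    V1-V2-V3-V4 of the first HOE surface (assumed planar, corners in order)."""
    corners = [np.asarray(v, dtype=float) for v in rect_corners]
    v1, v2, _, v4 = corners
    t = np.asarray(apex_t, dtype=float)
    p = np.asarray(projector_pos, dtype=float)

    # Unit normal of the HOE plane.
    normal = np.cross(v2 - v1, v4 - v1)
    normal /= np.linalg.norm(normal)

    # Ray from the apex T through the candidate projector position.
    d = p - t
    denom = np.dot(normal, d)
    if abs(denom) < 1e-12:
        return True               # ray parallel to the HOE plane: outside
    s = np.dot(normal, v1 - t) / denom
    if s <= 0:
        return True               # HOE plane lies behind the ray: outside

    hit = t + s * d               # where the ray pierces the HOE plane

    # Point-in-rectangle test: the hit point must lie on the same side of
    # every edge (consistent sign of the edge cross products).
    signs = [np.dot(np.cross(b - a, hit - a), normal)
             for a, b in zip(corners, corners[1:] + corners[:1])]
    inside_rect = all(x >= 0 for x in signs) or all(x <= 0 for x in signs)

    return not inside_rect        # outside the rectangle means outside the cone
```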

<Projector 13-1>
As shown in FIG. 1, the projection device 13-1 is preferably a short-focus projector. A short-focus projector may also be called an ultra-short-focus projector, and some so-called ultra-short-focus projectors can project an image from a distance of about 10 cm from the screen. By using a short-focus or ultra-short-focus projector as the projection device 13-1, the angle at which the light emitted from the user 9-1 is incident on the first HOE surface 11-1 can be made different from the angle at which the light output from the projection device 13-1 is incident on the first HOE surface 11-1 (the trajectory of the latter is represented by a two-dot chain line in FIG. 1). If these angles differ, the reflection characteristics of the first HOE surface 11-1 can be designed so that only light from the front direction (for example, light from a predetermined direction with an incident angle of 10 degrees or less) is reflected and condensed onto the photographing device 10-1 while all light from other directions is transmitted; the light emitted from the user 9-1 and the surroundings can then be well distinguished from the light output by the projection device 13-1. As a result, the phenomenon in which the output light of the projection device 13-1 enters the photographing device 10-1 and degrades the visibility of the video can be prevented. Furthermore, if the projection device 13-1 is a short-focus or ultra-short-focus projector, the user 9-1 is unlikely to come between the projection device 13-1 and the projection surface even when approaching the projection surface, so the possibility that the output light of the projection device 13-1 is blocked by the user 9-1 is reduced.
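
The angular selectivity described above can be made concrete with a small sketch: reflect only light arriving within a small acceptance angle of the design direction of the first HOE surface (10 degrees, taken from the example in the text) and transmit everything else. The function name and the vector representation are assumptions for illustration only.

```python
import numpy as np

def hoe_reflects(incident_dir, design_dir, acceptance_deg=10.0):
    """True if a ray with direction incident_dir falls within the acceptance
    cone of the first HOE surface around design_dir; such rays are reflected
    and condensed toward the photographing device, all others are transmitted."""
    a = np.asarray(incident_dir, dtype=float)
    b = np.asarray(design_dir, dtype=float)
    cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    angle_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return angle_deg <= acceptance_deg

# Light from the user arrives roughly along the design direction and is reflected,
# while the steep rays of a short-focus projector fall outside the acceptance cone.
print(hoe_reflects([0.0, 0.0, -1.0], [0.0, 0.0, -1.0]))   # True
print(hoe_reflects([0.0, -0.7, -0.7], [0.0, 0.0, -1.0]))  # False (about 45 degrees)
```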

<Wavelength selection on the first and second HOE surfaces>
Hereinafter, wavelength selection on the first and second HOE surfaces will be described with reference to FIG. 4. FIG. 4 is a diagram illustrating an example in which the reflectance of the first HOE surface 11-1 and of the second HOE surface 12-1 of the video display system 1 of the present embodiment differs for each wavelength. As shown by the solid line in FIG. 4, one of the first HOE surface 11-1 and the second HOE surface 12-1 is designed to have a high reflectance only in narrow wavelength bands centered on wavelengths B1, G1, and R1 (narrow bands perceived as blue, green, and red, respectively), while the other surface is designed to have a high reflectance only in narrow wavelength bands centered on wavelengths B2, G2, and R2, which are shifted from B1, G1, and R1 in the positive (longer) wavelength direction (narrow bands perceived as blue, green, and red shifted to the longer wavelength side relative to the blue, green, and red above). This design further improves the quality of the image. As described above, the first HOE surface 11-1 already separates the light it reflects from the light it transmits according to the direction of the rays, so the light of the imaging system and the light of the projection system are broadly distinguished. In addition, as shown in FIG. 4, by making the three primary colors that are reflected by the first HOE surface 11-1 into the photographing device 10-1 different from the three primary colors that are transmitted through the first HOE surface 11-1, projected onto the second HOE surface 12-1, and visually recognized by the user 9-1, the light of the imaging system and the light of the projection system can be distinguished with higher accuracy.
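
The wavelength separation can be illustrated with a short sketch. The band centres and the 10 nm half-width below are assumptions chosen for the example; the patent only requires that the narrow reflection bands of the two surfaces (B1, G1, R1 versus the shifted B2, G2, R2) do not overlap.

```python
# Illustrative band centres in nanometres; the patent does not give concrete
# values, only that the two sets of narrow bands are mutually shifted.
FIRST_HOE_BANDS  = {"B1": 450.0, "G1": 530.0, "R1": 630.0}   # reflected toward the camera
SECOND_HOE_BANDS = {"B2": 465.0, "G2": 545.0, "R2": 645.0}   # reflected toward the user
HALF_WIDTH_NM = 10.0                                         # assumed narrow-band half-width

def reflected_by(bands, wavelength_nm, half_width=HALF_WIDTH_NM):
    """True if the wavelength falls inside any of the surface's narrow bands."""
    return any(abs(wavelength_nm - centre) <= half_width for centre in bands.values())

# A wavelength reflected by the first surface passes through the second, and vice
# versa, which separates imaging-system light from projection-system light.
assert reflected_by(FIRST_HOE_BANDS, 530.0) and not reflected_by(SECOND_HOE_BANDS, 530.0)
assert reflected_by(SECOND_HOE_BANDS, 645.0) and not reflected_by(FIRST_HOE_BANDS, 645.0)
```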

  When the three primary colors of the reflected light are shifted as shown in FIG. 4, it is necessary to correct the color balance of the displayed image by software.
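
  One way to carry out this software correction is a per-pixel 3x3 colour transform, sketched below. The matrix values are placeholders; in practice they would be obtained by calibrating the shifted primaries of the projection path against a reference, which the patent leaves to the implementation.

```python
import numpy as np

# Placeholder correction matrix; real coefficients would come from calibrating
# the shifted primaries (B2, G2, R2) of the projection path.
CORRECTION_MATRIX = np.array([
    [ 1.05, -0.03,  0.00],
    [-0.02,  1.08, -0.04],
    [ 0.00, -0.05,  1.10],
])

def correct_color_balance(frame_rgb):
    """Apply a 3x3 colour-balance correction to an HxWx3 float image in [0, 1]."""
    h, w, _ = frame_rgb.shape
    corrected = frame_rgb.reshape(-1, 3) @ CORRECTION_MATRIX.T
    return np.clip(corrected, 0.0, 1.0).reshape(h, w, 3)
```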

<Modification>
The photographing device 10-1 may be, for example, a smartphone camera. Furthermore, in order to distinguish the light of the imaging system from the light of the projection system with even higher accuracy, the photographing device 10-1 and the projection device 13-1 may be synchronized, a shutter may be provided on the lens of the photographing device 10-1, and a time-division operation may be executed that switches rapidly between the photographing timing of the photographing device 10-1 and the projection timing of the projection device 13-1. That is, image projection by the projection device 13-1 and photographing by the photographing device 10-1 may be executed alternately while switching at a predetermined frame rate. The synchronization and time-division operations may be executed by the control device described above, or by either the photographing device 10-1 or the projection device 13-1 if it has the control function. If the switching rate is set to 120 Hz, for example, 60 Hz can be allocated to each of photographing and projection, ensuring a sufficient frame rate for both.
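
A schematic version of this time-division operation is sketched below. The device objects, their methods, and the simple sleep-based timing are assumptions; the patent requires only that capture and projection alternate at a predetermined switching rate (120 Hz in the example above, giving 60 Hz to each).

```python
import time

SWITCH_HZ = 120                 # switching rate from the example above
SLOT = 1.0 / SWITCH_HZ          # duration of one capture or projection slot

def time_shared_loop(camera, projector, link):
    """Alternate capture and projection so their light never mixes on the HOE
    surfaces. The sleep-based timing is schematic; a real controller would use
    the devices' synchronization signals."""
    while True:
        # Capture slot: projector blanked, camera shutter open.
        projector.blank()
        camera.open_shutter()
        link.send(camera.capture())
        camera.close_shutter()
        time.sleep(SLOT)

        # Projection slot: shutter closed, projector shows the latest frame
        # received from the other party.
        projector.project(link.latest_received())
        time.sleep(SLOT)
```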

<Effect>
According to the video display system 1 of the present embodiment, the use of holographic optical elements makes it possible to photograph the light emitted from a user as if photographing from the front, and to project the video of the other user onto the area in front of the user without that video being reflected by the first HOE surface into the photographing device. This solves the conventional problem that the users' lines of sight do not meet, so the users can communicate naturally without being conscious of the equipment. In addition, by using a projection device for video display, a larger-screen image than that of a conventional videophone or video conference system can be realized at low cost.

<Supplementary note>
The apparatus of the present invention comprises, as a single hardware entity, for example, an input unit to which a keyboard or the like can be connected, an output unit to which a liquid crystal display or the like can be connected, a communication unit to which a communication device (for example, a communication cable) capable of communicating with the outside of the hardware entity can be connected, a CPU (Central Processing Unit, which may include a cache memory, registers, and the like), a RAM and a ROM serving as memories, an external storage device such as a hard disk, and a bus that connects the input unit, the output unit, the communication unit, the CPU, the RAM, the ROM, and the external storage device so that data can be exchanged among them. If necessary, the hardware entity may also be provided with a device (drive) that can read from and write to a recording medium such as a CD-ROM. A general-purpose computer is an example of a physical entity having such hardware resources.

  The external storage device of the hardware entity stores the programs necessary for realizing the functions described above and the data necessary for processing those programs (the storage is not limited to the external storage device; for example, the programs may be stored in a ROM, which is a read-only storage device). Data obtained by the processing of these programs is stored as appropriate in the RAM, the external storage device, or the like.

  In the hardware entity, each program stored in the external storage device (or ROM or the like) and the data necessary for processing each program are read into memory as needed, and are interpreted, executed, and processed by the CPU as appropriate. As a result, the CPU realizes the predetermined functions (the components described above as units, means, and the like).

  The present invention is not limited to the embodiment described above, and can be modified as appropriate without departing from the spirit of the present invention. The processing described in the above embodiment may be executed not only in time series in the order described but also in parallel or individually, according to the processing capability of the apparatus executing the processing or as necessary.

  As described above, when the processing functions of the hardware entity (the apparatus of the present invention) described in the above embodiment are realized by a computer, the processing contents of the functions that the hardware entity should have are described by a program. By executing this program on a computer, the processing functions of the hardware entity are realized on the computer.

  The program describing the processing contents can be recorded on a computer-readable recording medium. The computer-readable recording medium may be of any kind, for example a magnetic recording device, an optical disc, a magneto-optical recording medium, or a semiconductor memory. Specifically, for example, a hard disk device, a flexible disk, or a magnetic tape can be used as the magnetic recording device; a DVD (Digital Versatile Disc), a DVD-RAM (Random Access Memory), a CD-ROM (Compact Disc Read Only Memory), or a CD-R (Recordable)/RW (ReWritable) as the optical disc; an MO (Magneto-Optical disc) as the magneto-optical recording medium; and an EEP-ROM (Electronically Erasable and Programmable Read Only Memory) as the semiconductor memory.

  The program is distributed, for example, by selling, transferring, or lending a portable recording medium such as a DVD or CD-ROM on which the program is recorded. The program may also be distributed by storing it in a storage device of a server computer and transferring it from the server computer to another computer via a network.

  A computer that executes such a program first stores, for example, the program recorded on the portable recording medium or transferred from the server computer in its own storage device. When executing processing, the computer reads the program stored in its own recording medium and executes processing according to the read program. As another form of execution, the computer may read the program directly from the portable recording medium and execute processing according to it, or may sequentially execute processing according to the program transferred from the server computer each time such a transfer occurs. The processing described above may also be executed by a so-called ASP (Application Service Provider) type service that realizes the processing functions only through execution instructions and result acquisition, without transferring the program from the server computer to the computer. Note that the program in this embodiment includes information that is used for processing by an electronic computer and is equivalent to a program (such as data that is not a direct command to the computer but has a property that defines the processing of the computer).

  In this embodiment, a hardware entity is configured by executing a predetermined program on a computer. However, at least a part of these processing contents may be realized by hardware.

Claims (4)

  1. A video display system comprising:
    a first holographic optical element surface;
    a photographing device that photographs reflected light from the first holographic optical element surface and transmits it to a projection device on a receiving side;
    a second holographic optical element surface disposed parallel to the first holographic optical element surface; and
    a projection device that is arranged outside a cone having a point T as its apex and containing the boundary of the first holographic optical element surface on its lateral surface, where the point T is the point representing the position of a virtual photographing device under the assumption that the light imaged on the image sensor of the photographing device passes through the first and second holographic optical element surfaces without being reflected or refracted and forms an image on the image sensor of the virtual photographing device, and that projects a video photographed by a photographing device on the receiving side onto the second holographic optical element surface.
  2. The video display system according to claim 1,
    wherein the first holographic optical element surface reflects incident light in a predetermined wavelength region and transmits incident light in other wavelength regions, and the second holographic optical element surface reflects incident light in a wavelength region different from the wavelength region reflected by the first holographic optical element surface and transmits incident light in other wavelength regions.
  3. The video display system according to claim 1 or 2,
    wherein the projection device is a short-focus projector.
  4. The video display system according to any one of claims 1 to 3,
    wherein image projection by the projection device and photographing by the photographing device are executed alternately while switching at a predetermined frame rate.
JP2015087763A 2015-04-22 2015-04-22 Video display system Active JP6424133B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015087763A JP6424133B2 (en) 2015-04-22 2015-04-22 Video display system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2015087763A JP6424133B2 (en) 2015-04-22 2015-04-22 Video display system

Publications (2)

Publication Number Publication Date
JP2016208273A JP2016208273A (en) 2016-12-08
JP6424133B2 (en) 2018-11-14

Family

ID=57490127

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2015087763A Active JP6424133B2 (en) 2015-04-22 2015-04-22 Video display system

Country Status (1)

Country Link
JP (1) JP6424133B2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106997616A (en) * 2017-03-01 2017-08-01 天津大学 A kind of three-D imaging method and pyramid three-dimensional image forming apparatus

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05328336A (en) * 1992-05-15 1993-12-10 Nippon Telegr & Teleph Corp <Ntt> Display/image pickup device
JPH08340520A (en) * 1995-06-14 1996-12-24 Sharp Corp Video equipment
JPH10213776A (en) * 1997-01-29 1998-08-11 Asahi Glass Co Ltd Display image pickup device
JP3496871B2 (en) * 1998-08-27 2004-02-16 日本電信電話株式会社 Display capturing device and remote interactive device
JP2001359064A (en) * 2000-06-09 2001-12-26 Canon Inc Video phone and method for controlling the video phone
EP2337010A3 (en) * 2002-03-13 2011-11-02 Dolby Laboratories Licensing Corporation High dynamic range display devices
US6882358B1 (en) * 2002-10-02 2005-04-19 Terabeam Corporation Apparatus, system and method for enabling eye-to-eye contact in video conferences
JP4349067B2 (en) * 2003-10-09 2009-10-21 株式会社ニコン Imaging apparatus and camera having the same
JP2007295187A (en) * 2006-04-24 2007-11-08 Canon Inc Projector
JP2014071282A (en) * 2012-09-28 2014-04-21 Dainippon Printing Co Ltd Reflective screen and video image display system
JP2014176042A (en) * 2013-03-13 2014-09-22 Ricoh Co Ltd Communication device, and voice input/output unit control method

Also Published As

Publication number Publication date
JP2016208273A (en) 2016-12-08


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20180202

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20180905

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20180911

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20180925

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20181016

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20181022

R150 Certificate of patent or registration of utility model

Ref document number: 6424133

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150