WO2017213241A1 - Image projection system, image projection device, image projection method, image projection program, and server device - Google Patents

Image projection system, image projection device, image projection method, image projection program, and server device

Info

Publication number
WO2017213241A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
projection
information
unit
image data
Prior art date
Application number
PCT/JP2017/021380
Other languages
English (en)
Japanese (ja)
Inventor
菅原 充
鈴木 誠
欣也 長谷川
Original Assignee
株式会社Qdレーザ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2017111151A external-priority patent/JP6255524B2/ja
Application filed by 株式会社Qdレーザ filed Critical 株式会社Qdレーザ
Priority to US16/306,966 priority Critical patent/US10409146B2/en
Priority to CN201780034880.6A priority patent/CN109313343B/zh
Priority to EP17810414.7A priority patent/EP3470909B1/fr
Publication of WO2017213241A1 publication Critical patent/WO2017213241A1/fr

Links

Images

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G 5/38 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory, with means for controlling the display position
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/02 Viewing or reading apparatus

Definitions

  • the present invention relates to an image projection system, an image projection apparatus, an image projection method, an image projection program, and a server apparatus.
  • conventionally, there is known an image projection apparatus that projects an image light beam based on image data directly onto the retina of a person, causing the person to visually recognize the image represented by the image data without being affected by the function of the person's crystalline lens.
  • in the related art, the entire range of the image or video represented by the input image data is projected as-is onto the retina, so the whole range of the image or video cannot be shown to a person with a limited visual field, such as a person with visual field constriction. For this reason, in the related art, even if a person having a limited field of view wears the image projection apparatus, it may not be possible for that person to visually recognize the portions of an image or video that deserve particular attention.
  • the disclosed technology has been made in view of the above-described circumstances, and its purpose is to provide an image projection system, an image projection apparatus, an image projection method, an image projection program, and a server apparatus capable of projecting an image into the user's field of view.
  • the disclosed technology is an image projection system including an image projection device and a terminal device that communicates with the image projection device. The terminal device includes a projection target holding unit that holds the projection information to be projected by the image projection device, a storage unit in which position information indicating a position on the user's retina where the projection information is projected is stored, a position information acquisition unit for obtaining the position information from the storage unit, an image data generation unit configured to generate image data of an image for projecting the projection information onto the position indicated by the position information, and an image output processing unit configured to output the image data to the image projection apparatus.
  • the image projection device includes a light source for emitting a light beam, an image input unit for inputting the image data from the terminal device, a control unit that generates an image light beam based on the input image data and controls emission of the image light beam from the light source, a scanning mirror for scanning the image light beam, and a projection unit for projecting the image light beam onto the retina of the user's eyeball as the image represented by the image data.
  • Images can be projected into the field of view.
  • FIG. 1 is a diagram for explaining an outline of projection of an image in the first embodiment.
  • the image projection system 100 of the present embodiment includes an image projection device 200 and a terminal device 300.
  • image data for projecting an image in the field of view of the user P based on the information indicating the field of view of the user P wearing the image projector 200 and the projection information. are provided to the image projection apparatus 200.
  • the projection information is the information from which the image projected onto the retina of the user P by the image projection apparatus 200 originates, and may be a symbol such as a character or a number, or may be an image.
  • the image projection apparatus 200 of the present embodiment is a retinal projection head-mounted display using the Maxwellian view.
  • in the Maxwellian view, an image light beam based on image data is first converged at the center of the pupil and then projected onto the retina, so that a person visually recognizes the image represented by the image data without being affected by the function of the person's crystalline lens.
  • the terminal device 300 according to the present embodiment is, for example, a tablet computer or a smartphone, and transmits image data to the image projection device 200.
  • the image projection device 200 projects an image onto the retina of the user P by irradiating the light beam for image based on the image data transmitted from the terminal device 300 onto the retina of the user P.
  • the image projected directly onto the human retina is viewed as the image represented by the image data if the function of the retina is normal. However, when there is a problem in the function of the retina, the optic nerve or the like, the image projected onto the retina is viewed in a mode different from the image represented by the image data.
  • for example, in the case of visual field constriction, only a limited range of the image projected onto the retina is viewed.
  • for example, in the case of visual field loss, the image projected onto the retina is viewed as an image in which the portion corresponding to the lost visual field is missing.
  • therefore, in the present embodiment, information indicating the visual field in the user's retina is acquired in advance by a visual field inspection or the like, and an image is projected into the user's visual field by referring to the information indicating the visual field.
  • specifically, the terminal device 300 generates image data D for projecting the projection information within the field of view of the user P, based on the information indicating the field of view of the user P, the information indicating the visual acuity, and the projection information, and transmits the image data D to the image projector 200.
  • the image projection apparatus 200 projects an image based on the image data D onto the retina of the user P.
  • as with the image 201-P, the image projected onto the retina of the user P is placed in the region corresponding to the field of view of the user P, at a size corresponding to the visual acuity of the user P.
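As a rough illustration of this flow, here is a minimal Python sketch of how a terminal device might decide where and how large to render projection information from a visual field region and an acuity value. It is not taken from the patent; the names (`FieldOfView`, `text_placement`) and the frame size and scaling rule are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class FieldOfView:
    # Bounding box of the usable (non-defect) retinal region, in
    # normalized image coordinates (0.0 .. 1.0). Hypothetical convention.
    x0: float
    y0: float
    x1: float
    y1: float

def text_placement(fov: FieldOfView, acuity: float,
                   frame_w: int = 1280, frame_h: int = 720):
    """Return (x, y, font_px): where to draw the projection information
    so it lands inside the visual field, sized to the measured acuity."""
    px0, py0 = int(fov.x0 * frame_w), int(fov.y0 * frame_h)
    py1 = int(fov.y1 * frame_h)
    # Lower acuity -> larger characters; clamp so the text still fits.
    base_px = 24
    font_px = min(int(base_px / max(acuity, 0.05)), py1 - py0)
    return px0, py0, font_px

# Example: vision restricted to the lower-right quadrant, acuity 0.3.
print(text_placement(FieldOfView(0.5, 0.5, 1.0, 1.0), 0.3))
```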
  • FIG. 2 is a top view of the image projection apparatus.
  • the image projection apparatus 200 of the present embodiment includes a projection unit 210 and a control unit 230.
  • the projection unit 210 of the present embodiment includes a light source 211, a scanning mirror 212, a mirror 213, a mirror 214, a mirror 215, and a projection mirror 216.
  • the light source 211 is disposed on the temple 250 of the eyeglass frame.
  • the light source 211 emits a light beam L of, for example, a single or a plurality of wavelengths under the instruction of the control unit 230.
  • the light ray L is an image light ray for projecting an image on the retina 261 of the eyeball 260 of the user. In the following description, the light ray L is called an image light ray.
  • the light source 211 emits, for example, red laser light (wavelength: about 610 nm to 660 nm), green laser light (wavelength: about 515 nm to 540 nm), and blue laser light (wavelength: about 440 nm to 480 nm).
  • the light source 211 of the present embodiment is realized, for example, as a light source in which RGB (red, green, and blue) laser diode chips, a three-color combining device, and a micro-collimating lens are integrated.
  • the scanning mirror 212 is disposed on the temple 250 of the eyeglass frame.
  • the scanning mirror 212 scans the imaging light beam emitted from the light source 211 in the horizontal direction and the vertical direction.
  • the scanning mirror 212 is, for example, a MEMS (Micro Electro Mechanical System) mirror.
  • the imaging light beam emitted from the light source 211 is reflected by, for example, the mirror 213 and the mirror 214 and enters the scanning mirror 212.
  • the control unit 230 is realized by a processor such as a CPU (central processing unit), together with a RAM (random access memory) and a ROM (read only memory).
  • the processor and the memory may be mounted, for example, on the same substrate as the substrate on which the scanning mirror 212 (MEMS mirror) is mounted.
  • the processor and the memory may be provided in an external device (for example, the terminal device 300 etc.) connected to the image projection device 200.
  • the control unit 230 of the present embodiment controls the projection unit 210.
  • the control unit 230 causes the light source 211 to emit an image light beam based on the input image data.
  • the control unit 230 according to the present embodiment vibrates the scanning mirror 212 (MEMS mirror), scans the imaging light beam emitted from the light source 211, and projects the image on the retina 261.
  • FIG. 3 is an enlarged view of the vicinity of the projection unit of the image projection apparatus.
  • the imaging light beam scanned by the scanning mirror 212 is reflected by the mirror 215 toward the lens 251 of the spectacles-type frame.
  • the projection mirror 216 is disposed on the surface of the lens 251 on the eyeball 260 side, and the imaging light beam scanned by the scanning mirror 212 is incident on the projection mirror 216.
  • the projection mirror 216 is a half mirror having a free-form surface or a composite structure of a free-form surface and a diffractive surface in a region 216 a where the image light beam is incident.
  • the imaging light beam incident on the projection mirror 216 is converged in the vicinity of the pupil 262 of the eyeball 260 and then projected onto the retina 261.
  • the user can thereby recognize the image formed by the image light beam while also visually recognizing the outside world through the see-through lens.
  • FIG. 4 is a diagram for explaining the vibration of the first mirror.
  • FIG. 4 shows the case where the scanning mirror 212 vibrates from point A to point B.
  • as a method of scanning the image light beam with the scanning mirror 212 and projecting an image onto the retina 261, there is a method of scanning light at high speed from the upper left to the lower right of the region onto which the image is projected (for example, raster scanning).
  • in order to scan the image light beam (light beam L), the scanning mirror 212 vibrates in the horizontal direction (first direction) and in the vertical direction (second direction intersecting the first direction) over a range larger than the region H (broken-line range in FIG. 4) in which the image is projected onto the retina 261.
  • the vibration of the scanning mirror 212 is indicated by reference numeral 50.
  • the imaging light beam is scanned at a place where the deflection of the scanning mirror 212 is small.
  • FIG. 4 shows an example in which the imaging light beam is scanned in a rectangular shape, but the present invention is not limited to this case; for example, the imaging light beam may be scanned in a trapezoidal shape.
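The scan geometry described above can be pictured with a small, hypothetical model: the mirror swings sinusoidally beyond the image region, and the laser is gated so that pixels are emitted only while the deflection lies inside region H. The frequencies and the threshold below are illustrative assumptions, not values from the patent.

```python
import math

def mirror_deflection(t: float, f_h: float = 24_000.0, f_v: float = 60.0):
    """Normalized horizontal/vertical deflection of a resonant MEMS mirror
    at time t; both axes are modeled as sinusoids for simplicity."""
    x = math.sin(2.0 * math.pi * f_h * t)
    y = math.sin(2.0 * math.pi * f_v * t)
    return x, y

def inside_region_h(x: float, y: float, limit: float = 0.8) -> bool:
    # Region H is smaller than the full mirror swing: emit the image
    # light beam only while the deflection is within the image region.
    return abs(x) <= limit and abs(y) <= limit

x, y = mirror_deflection(1.0e-5)
print(x, y, inside_region_h(x, y))
```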
  • FIG. 5 is a diagram for explaining an example of the hardware configuration of the terminal device.
  • the terminal device 300 of the present embodiment includes a display operation device 301, a drive device 302, an auxiliary storage device 303, a memory device 304, an arithmetic processing unit 305, and an interface device 306, which are mutually connected by a bus B.
  • the display operation device 301 is a touch panel or the like, and has a display function of displaying information and an input function of inputting information.
  • the interface device 306 includes a LAN card and the like, and is used to connect to a network.
  • the image projection program executed in the terminal device 300 is at least a part of various programs for controlling the terminal device 300.
  • the image projection program is provided, for example, by distributing the recording medium 307 or downloading from the network.
  • various types of recording media can be used as the recording medium 307 on which the image projection program is recorded: media that record information optically, electrically, or magnetically, such as a CD-ROM, a flexible disk, or a magneto-optical disk, and semiconductor memories that record information electrically, such as a ROM or a flash memory.
  • the image projection program is installed from the recording medium 307 to the auxiliary storage device 303 via the drive device 302 when the recording medium 307 storing the image projection program is set to the drive device 302.
  • the image projection program downloaded from the network is installed in the auxiliary storage device 303 via the interface device 306.
  • the auxiliary storage device 303 stores the installed image projection program and also stores necessary files, data, and the like.
  • the memory device 304 reads out and stores the image projection program from the auxiliary storage device 303 when the computer is started.
  • the arithmetic processing unit 305 implements the various processes described later according to the image projection program stored in the memory device 304.
  • the terminal device 300 includes the display operation device 301 in the present embodiment, the present invention is not limited to this.
  • the terminal device 300 may be, for example, a desktop computer or a notebook computer. In that case, the terminal device 300 may have an input device such as a mouse or a keyboard used for inputting information, and an output device such as a display for displaying information, instead of the display operation device 301.
  • FIG. 6 is a diagram for explaining the system configuration of the image projection system of the first embodiment.
  • the image projection system 100 includes an image projection device 200 and a terminal device 300.
  • the image projection device 200 is connected to the terminal device 300 and communicates with the terminal device 300.
  • the image projection apparatus 200 and the terminal apparatus 300 perform communication by wireless, but the present invention is not limited to this.
  • the image projection device 200 and the terminal device 300 may be connected by any method as long as they are connected in a communicable state.
  • the terminal device 300 includes an image generation processing unit 310, an image output processing unit 320, and a visual field and sight information storage unit 330.
  • the image generation processing unit 310 refers to the visual field information and the visual acuity information of the user P stored in the visual field and visual acuity information storage unit 330, and generates image data so that the projection information, which is the information to be projected, is projected into the user's field of view. Details of the image generation processing unit 310 will be described later.
  • the projection information is mainly information that the user requires or is interested in.
  • the image output processing unit 320 outputs the image data received from the image generation processing unit 310 to an external device. Specifically, the image output processing unit 320 of the present embodiment outputs (sends) the image data received from the image generation processing unit 310 to the image projection apparatus 200.
  • the visual field visual acuity information storage unit 330 stores information indicating the visual field of the user P of the image projector 200 and information indicating the visual acuity of the user P.
  • the information indicating the visual field is information indicating the area of the retina of the user P in which the user P can view a projected image.
  • the information indicating the visual acuity in the present embodiment is information indicating the visual acuity of the retina itself. Specifically, it differs from general visual acuity, which changes with, for example, the adjustment of the thickness of the crystalline lens by the ciliary muscle, and instead indicates the function of the macular area of the retina. Details of the visual field and visual acuity information storage unit 330 will be described later.
  • FIG. 7 is a diagram for explaining the function of the image generation processing unit of the first embodiment.
  • the image generation processing unit 310 of the present embodiment includes a projection request reception unit 311, a projection target acquisition unit 312, a visual field visual acuity information acquisition unit 313, and an image data generation unit 314.
  • the projection request receiving unit 311 receives a projection request for the projection information input via the display operation device 301 or the like.
  • the projection target acquisition unit 312 acquires projection information.
  • the projection information may be, for example, content data or the like stored in the auxiliary storage device 303 or the like of the terminal device 300, or content data or the like that the terminal device 300 obtains from an external server, a storage device, or the like.
  • the projection information may be text data or image data including a moving image.
  • when an imaging apparatus is incorporated in the image projection apparatus 200, the projection information may be an image captured by that imaging apparatus.
  • the visual field visual acuity information acquisition unit 313 acquires visual field information of the user P of the image projection apparatus 200 and visual acuity information of the user P from the visual field visual acuity information storage unit 330.
  • the visual field visual acuity information acquisition unit 313 is a position information acquisition unit that acquires information indicating the position where the projection information is projected.
  • the image data generation unit 314 generates image data of an image on which projection information is displayed based on the visual field information, and passes the generated image data to the image output processing unit 320.
  • FIG. 8 is a diagram showing an example of the visual field and sight information storage unit of the first embodiment.
  • the visual field information and the visual acuity information are managed as a table associated with the user ID.
  • the visual field / visual acuity information storage unit 330 of the present embodiment has a user ID, a visual field, and a visual acuity as items of information, and the visual field and the visual acuity correspond to the user ID.
  • the value of the item "user ID" is identification information for identifying the user. In the present embodiment, the user's name or the like may be used instead of the identifier as the information for identifying the user.
  • the value of the item “visual field” associated with the user ID is referred to as visual field information
  • the value of the item “visual acuity” associated with the user ID is referred to as visual acuity information.
  • the value of the item "field of view” is coordinate information indicating the field of view of the user.
  • the value of the item "visual acuity” indicates the visual acuity of the user's retina itself.
  • although the visual field information and the visual acuity information are not associated with each other in the example of FIG. 8, they may be associated with each other in the visual field and visual acuity information storage unit 330. Specifically, in the visual field and visual acuity information storage unit 330, each area indicated by the visual field information may be managed in association with visual acuity information indicating the visual acuity of that area.
  • the visual acuity information of the present embodiment may be information indicating general visual acuity using a Landolt ring or the like.
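The table of FIG. 8 can be pictured as a simple keyed store. The following Python rendering is a hypothetical sketch: the coordinate convention (normalized boxes) and all field names are assumptions made for illustration only.

```python
# One record per user ID: the visual field as coordinate information and
# the visual acuity as a value for the retina itself.
visual_field_acuity_store = {
    "001": {
        # Hypothetical normalized (x0, y0, x1, y1) boxes the user can see.
        "visual_field": [(0.0, 0.0, 1.0, 0.4), (0.3, 0.4, 1.0, 1.0)],
        "visual_acuity": 0.6,
    },
}

def get_position_information(user_id: str):
    """Look up where on the retina projection information can be placed
    for this user, and at what acuity (cf. the position information
    acquisition described above)."""
    record = visual_field_acuity_store[user_id]
    return record["visual_field"], record["visual_acuity"]

print(get_position_information("001"))
```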
  • FIG. 9 is a first diagram showing an example of a field of view. As shown in FIG. 9, missing areas 112, 113, and 114 exist in the field of view 111 of the user P. That is, in the retina of the user P, there is some abnormality at the position corresponding to the defect areas 112, 113, and 114.
  • the defect areas 112, 113, and 114 shown in FIG. 9 are reflected in the projected image as well. Therefore, by projecting the information the user P is interested in onto areas other than the defect areas, the user P can view the information he or she needs or is interested in even though the defect areas exist.
  • the information projected onto the defect areas may be, for example, an image continuous with the image projected onto the non-defect areas, or a single-color image; alternatively, emission of the image light beam may simply be stopped for these areas.
  • the information indicating the visual field of the user may be acquired, for example, as an image of an inspection result indicating the presence or absence of visual field loss obtained from a general dynamic visual field inspection or static visual field inspection, and converted into coordinate information distinguishing the areas where the visual field is missing from the areas where it is not.
  • the dynamic visual field inspection is, for example, a visual field inspection using a Goldmann perimeter, and the static visual field inspection is, for example, a visual field inspection using a Humphrey perimeter.
  • the visual field and visual acuity information storage unit 330 of the present embodiment may be recorded, for example, on a portable recording medium or the like and read from the recording medium by the terminal device 300, or may be downloaded from an external server or the like via a network.
  • the terminal device 300 only needs to be able to refer to the visual field and visual acuity information storage unit 330; the method by which the terminal device 300 acquires it is not particularly limited. Further, since the terminal device 300 only needs to be able to refer to it, the visual field and visual acuity information storage unit 330 may be held in an external device capable of communicating with the terminal device 300.
  • FIG. 10 is a flowchart for explaining the operation of the image generation processing unit of the first embodiment.
  • the image generation processing unit 310 determines, via the projection request receiving unit 311, whether projection information has been selected and a projection request for the projection information has been received (step S1001). In step S1001, when the projection request is not received, the image generation processing unit 310 stands by until the projection request is received.
  • step S1001 when a projection request is received, the image generation processing unit 310 acquires the selected projection information from the projection target acquisition unit 312 (step S1002).
  • the image generation processing unit 310 causes the visual field and visual acuity information acquiring unit 313 to acquire visual field information and visual acuity information of the user who has made the projection request from the visual field and visual acuity information storage unit 330 (step S1003).
  • the image generation processing unit 310 causes the image data generation unit 314 to generate image data to be delivered to the image output processing unit 320 based on the projection information, the visual field information and the visual acuity information (step S1004).
  • the image data generation unit 314 transfers the generated image data to the image output processing unit 320 (step S1005), and ends the process.
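Steps S1001 to S1005 amount to a small pipeline: wait for a request, gather the projection information and the user's field/acuity data, generate the image data, and hand it off for output. The sketch below mirrors that control flow with stub callables; every name in it is hypothetical.

```python
from types import SimpleNamespace

def image_generation_processing(receive_request, get_projection_info,
                                get_field_and_acuity, generate_image,
                                output_image):
    request = None
    while request is None:                       # S1001: wait for a request
        request = receive_request()
    info = get_projection_info(request)          # S1002: projection info
    field, acuity = get_field_and_acuity(request.user_id)   # S1003
    image_data = generate_image(info, field, acuity)        # S1004
    output_image(image_data)                     # S1005: hand off

# Minimal stubs, showing the call order only.
req = SimpleNamespace(user_id="001")
image_generation_processing(
    receive_request=lambda: req,
    get_projection_info=lambda r: "weather: sunny",
    get_field_and_acuity=lambda uid: ([(0.3, 0.4, 1.0, 1.0)], 0.6),
    generate_image=lambda info, field, acuity: {"text": info, "box": field[0]},
    output_image=print,
)
```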
  • FIG. 11 is a first diagram illustrating an example of image data generated by the image generation processing unit. In FIG. 11, the case where the image generation processing unit 310 generates image data with reference to the visual field information and the visual acuity information of the user P having the visual field shown in FIG. 9 will be described.
  • the image generation processing unit 310 generates image data in which projection information is included in the area 115 indicating the field of view of the user P.
  • the image 201-P shown in FIG. 11 is an image in which the projection information 202 is placed in the area 115 indicated by the visual field information stored in the visual field and visual acuity information storage unit 330. At this time, the size of the characters indicating the projection information 202 is set according to the visual acuity information stored in the visual field and visual acuity information storage unit 330.
  • the image generation processing unit 310 of the present embodiment generates image data representing the image 201-P, and causes the image projection device 200 to project the image 201-P on the retina of the user P.
  • in this way, the projection information 202 for which the user P made a projection request can be projected into the visual field of the user P at a size corresponding to the visual acuity of the user P.
  • the image data generation unit 314 generates image data of the image 201-P in which information is projected only to the user's field of view, but the present invention is not limited to this.
  • the image generation processing unit 310 may generate image data of an image showing the projection information itself, and may transmit the image data to the image projection apparatus 200 together with the visual field information.
  • in this case, the image projection apparatus 200 may control the vibration of the scanning mirror 212 based on the visual field information via the control unit 230 to adjust the projection position of the image, thereby projecting the image showing the projection information itself into the user's visual field.
  • specifically, the image generation processing unit 310 generates image data of an image 202G indicating the projection information 202 itself, and causes the image output processing unit 320 to output the image data of the image 202G to the image projection device 200 together with the visual field information.
  • the image projection apparatus 200 may control the vibration of the scanning mirror 212 based on the visual field information to adjust the projection position of the image, and project the image data of the image 202G onto the area indicated by the visual field information.
  • FIG. 12 is a second diagram showing an example of the field of view.
  • the visual field 141 of the user Q is shown.
  • as can be seen from FIG. 12, a defect area 142 exists in the field of view 141. That is, in the retina of the user Q, there is some abnormality at the position corresponding to the defect area 142. Therefore, the projected image is visually recognized by the user Q as an image in which the area (shaded portion) at the position corresponding to the defect area 142 is missing.
  • the image projected on the position corresponding to the area 143 other than the defect area 142 is visually recognized by the user Q.
  • FIG. 13 is a second diagram illustrating an example of image data generated by the image generation processing unit. In FIG. 13, the case where the image generation processing unit 310 generates image data with reference to the visual field information and the visual acuity information of the user Q having the visual field shown in FIG. 12 will be described.
  • the image generation processing unit 310 generates image data in which the projection information 203 is included in the area 143 indicating the field of view of the user Q.
  • An image 201-Q shown in FIG. 13 is an image in which the projection information 203 is included in an area 143 indicated by the visual field information of the user Q.
  • the image generation processing unit 310 of the present embodiment generates image data representing the image 201-Q, and causes the image projection device 200 to project the image 201-Q onto the retina of the user Q.
  • FIG. 14 is a third diagram illustrating an example of image data generated by the image generation processing unit.
  • the image 201-QA shown in FIG. 14 is an image generated by the image generation processing unit 310 when the visual acuity of the user Q is worse than in the example shown in FIG. 13.
  • in the image 201-QA, the characters indicating the projection information 203 are larger than the characters indicating the projection information 203 in the image 201-Q.
  • the size of characters, images, symbols, etc. to be projected on the user's retina can be changed according to the user's vision.
  • FIG. 15 is a first diagram showing an example of how the projection information in the first embodiment looks.
  • FIG. 15 shows how the image 201-Q is projected on the retina of the user Q having the field of view shown in FIG.
  • the image 201-Q is projected to a position corresponding to the non-defective area 143 of the user Q's retina, so that the user Q can visually recognize the projection information 203.
  • FIG. 16 is a second diagram showing an example of how the projection information appears in the first embodiment.
  • FIG. 16 shows how the image 201-QA is projected onto the retina of the user Q having the field of view shown in FIG.
  • the character indicating the projection information 203 is enlarged according to the visual acuity information of the user Q.
  • in this manner, the projection information 203 is projected within the visual field of the user, where the user can visually recognize it.
  • for example, when a term is looked up with an ordinary electronic dictionary, only the portion of the search result image in which the meaning of the term is displayed can be extracted, and the extracted image can be displayed in the field of view; thus, even a person with a limited field of view can use a normal electronic dictionary.
  • the image data generated by the image generation processing unit 310 of the present embodiment also includes moving image data.
  • for example, the moving image data may be projected so that text data flows (scrolls) through the field of view of the user.
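As one hypothetical form such moving image data could take, the snippet below computes a per-frame x position that scrolls a strip of text through the visual field box like a ticker; the frame rate, speed, and pixel box are illustrative assumptions.

```python
def scroll_positions(text_width_px: int, box_x0: int, box_x1: int,
                     fps: int = 30, px_per_s: int = 120):
    """x of the text's left edge for each frame, moving the text
    right-to-left through the visual field box [box_x0, box_x1]."""
    total_px = (box_x1 - box_x0) + text_width_px
    n_frames = int(total_px * fps / px_per_s)
    return [box_x1 - i * px_per_s / fps for i in range(n_frames + 1)]

xs = scroll_positions(400, box_x0=200, box_x1=900)
print(len(xs), xs[0], xs[-1])   # enters at the right edge, exits fully left
```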
  • the second embodiment will be described below.
  • the second embodiment is different from the first embodiment in that the result of a visual field inspection performed using the image projector 200 is used as the visual field information. Therefore, in the following description of the second embodiment, only the differences from the first embodiment are described; components having the same functional configuration as in the first embodiment are given the same reference numerals as in the description of the first embodiment, and their description is omitted.
  • FIG. 17 is a diagram showing an example of a system configuration of an image projection system according to the second embodiment.
  • the image projection system 100A of the present embodiment includes an image projection device 200 and a terminal device 300A.
  • the terminal device 300A of the present embodiment includes an image generation processing unit 310, an image output processing unit 320, a visual field and sight information storage unit 330A, and an inspection processing unit 340.
  • the inspection processing unit 340 of the present embodiment holds inspection image data corresponding to the visual field inspection image G, and passes the inspection image data to the image output processing unit 320.
  • the inspection processing unit 340 of the present embodiment causes the visual field inspection image G to be displayed on the display operation device 301 of the terminal device 300A, receives the input of the inspection result from the user, and stores the inspection result in the visual field and visual acuity information storage unit 330A. Details of the inspection processing unit 340 will be described later.
  • the visual field and visual acuity information storage unit 330A of the present embodiment has a visual field information table 331 and a visual acuity information table 332.
  • in the visual field information table 331, the results of visual field inspections of users wearing the image projection apparatus 200 are stored.
  • in the visual acuity information table 332, the results of visual acuity examinations of users wearing the image projection apparatus 200 are stored. Details of the visual field information table 331 and the visual acuity information table 332 will be described later.
  • FIG. 18 is a diagram for explaining an inspection processing unit of the second embodiment.
  • Each unit shown in FIG. 18 is realized by the arithmetic processing unit 305 of the terminal device 300A reading out and executing the visual field inspection program stored in the memory device 304 or the like.
  • the inspection processing unit 340 of the present embodiment includes an image data holding unit 341, a display control unit 342, an input receiving unit 343, an image data selection unit 344, and an inspection result storage unit 345.
  • the image data holding unit 341 holds image data for visual field inspection corresponding to the image G for visual field inspection and image data for visual acuity inspection corresponding to the image T for visual acuity inspection.
  • when the image data holding unit 341 of the present embodiment receives an inspection start request, it passes the image data corresponding to the inspection to be performed to the image output processing unit 320 and the display control unit 342.
  • in the present embodiment, the image data holding unit 341 holds the inspection image data, but the inspection image data may instead be held by the control unit 230 of the image projection apparatus 200.
  • when the display control unit 342 receives the inspection image data, it causes the display operation device 301 of the terminal device 300A to display an input screen of the inspection result including the inspection image. More specifically, the display control unit 342 displays an input screen of the inspection result including the visual field inspection image G, or an input screen of the inspection result including the visual acuity inspection image T. In addition, the display control unit 342 of the present embodiment may display a selection screen for selecting a visual field test or a visual acuity test in response to a request for performing an examination.
  • the input receiving unit 343 receives inputs corresponding to the various operations performed on the display operation device 301. Specifically, the input receiving unit 343 receives an inspection start request, an input on the input screen of an inspection result, and the like.
  • the image data selection unit 344 selects image data corresponding to the type of examination accepted by the input acceptance unit 343 from the examination image data retained in the image data retention unit 341 and passes the image data to the display control unit 342.
  • the examination result storage unit 345 stores the examination result accepted by the input receiving unit 343 in the visual field and visual acuity information storage unit 330A, in association with the user ID and information indicating the date and time when the input of the examination result was accepted.
  • FIG. 19 is a diagram showing a first example of the visual field inspection image of the second embodiment
  • FIG. 20 is a diagram showing a second example of the visual field inspection image of the second embodiment.
  • the visual field inspection image G illustrated in FIG. 19 is divided into a plurality of regions in each of the vertical direction and the horizontal direction. In other words, the visual field inspection image G is formed of a set of rectangular areas.
  • a number is drawn as an identifier for identifying each area.
  • the identifier of the area is not limited to numbers.
  • the identifier of the area may be, for example, hiragana or kanji, or letters of an alphabet or another language.
  • a fixation point M is formed in the central portion of the visual field inspection image G.
  • in the present embodiment, the mark indicating the fixation point M is "+", but the shape of the mark indicating the fixation point is not limited to this.
  • the mark indicating the fixation point may have any shape as long as it can indicate to the user P the point to be gazed at.
  • among visual impairments, there are types of impairment in which images and the like can be discriminated but characters cannot.
  • the identifier of the area of the present embodiment may not necessarily be a character.
  • an image or the like may be displayed for each area as an area identifier.
  • an image of an apple may be displayed in an area where the identifier is “1” and an image of a car may be displayed in an area where the identifier is “2”.
  • an image of a star-shaped mark may be displayed in an area where the identifier is “1”
  • an image of a heart-shaped mark may be displayed in an area where the identifier is “2”.
  • when the identifier is an image rather than a character as described above, even a user who cannot read characters can distinguish between the visible areas and the invisible areas in the inspection image.
  • although the number of regions included in the visual field inspection image G is 100 in FIG. 19, the number of regions is not limited to this.
  • the number of regions in the visual field inspection image G may be determined according to the size of the region onto which the visual field inspection image G is projected. Further, the number of areas in the visual field inspection image G may be such that the user can understand the area seen by the user and the area not seen by the user.
  • it is preferable that the number of regions in the visual field inspection image G be such that the regions visible to the user and the regions invisible to the user can be appropriately grasped without increasing the burden on the user.
  • the number of the regions may be determined in advance, for example, from the result of repeating the visual field inspection using the image projection system 100A of the present embodiment.
  • although the regions contained in the visual field inspection image G are rectangular in this embodiment, the shape is not limited to this.
  • the shape of the area included in the visual field inspection image G may be circular, elliptical, or square.
  • in the visual field inspection image G1 shown in FIG. 20, the identifier drawn in each area is made larger the farther the area is from the fixation point M in the central portion.
  • in general, the identifier of an area near the fixation point is easily visible to the user P, while the identifier of an area far from the fixation point is less visible to the user P.
  • therefore, in the visual field inspection image G1, the visibility of the identifiers in the peripheral portion can be improved by drawing the identifiers larger the farther they are located from the fixation point M.
  • the visual field test image G1 can also be used to measure the visual acuity distribution of the retina itself.
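One way to picture how images such as G and G1 could be constructed is the layout sketch below: each grid cell gets a numeric identifier and, optionally, identifiers grow with distance from the central fixation point. The grid dimensions, cell size, and scaling rule are illustrative assumptions, not values specified here.

```python
import math

def layout_inspection_grid(rows: int = 10, cols: int = 10, cell: int = 64,
                           scale_with_distance: bool = False):
    """Return (identifier, x, y, size_px) for each cell of a visual field
    inspection grid; with scale_with_distance=True, identifiers farther
    from the central fixation point are drawn larger (as in image G1)."""
    cx, cy = cols * cell / 2, rows * cell / 2
    max_d = math.hypot(cx, cy)
    cells = []
    for r in range(rows):
        for c in range(cols):
            x, y = c * cell + cell / 2, r * cell + cell / 2
            size = cell // 2
            if scale_with_distance:
                d = math.hypot(x - cx, y - cy)
                size = int(size * (1.0 + d / max_d))  # up to ~2x at corners
            cells.append((r * cols + c + 1, x, y, size))
    return cells

print(layout_inspection_grid(scale_with_distance=True)[0])  # cell "1"
```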
  • FIG. 21 is a first diagram showing an example of a vision test image of the second embodiment.
  • in the vision test image T shown in FIG. 21, Landolt rings are arranged in six rows of eleven columns.
  • coordinates indicating the center of each Landolt ring included in the vision test image T may be associated with the vision test image T.
  • the coordinates of the center points P1, P2, and P3 of the Landolt rings 121 to 123 may be output as information for identifying the Landolt ring where the user could not identify the break.
  • FIG. 22 is a second view showing an example of a vision test image of the second embodiment.
  • in the vision test image T1 shown in FIG. 22, Landolt rings are arranged in three rows of five columns, and the Landolt rings of the vision test image T1 are larger than the Landolt rings of the vision test image T shown in FIG. 21.
  • in this way, in the present embodiment, a vision test image is provided for each visual acuity level, containing Landolt rings of a size corresponding to that level.
  • in the present embodiment, the size of the Landolt rings to be projected onto the user's retina is selected, and a vision test image in which Landolt rings of the selected size are arranged is projected onto the user's retina. At this time, the size of the Landolt rings to be projected may be selected by the user, or may be selected by an examination assistant, for example.
  • FIG. 23 is a diagram for explaining a Landolt ring.
  • the Landolt ring is a black ring; the outer diameter of the ring, the width of the arc, and the width of the opening in the ring (the width of the cut) are in the ratio 5 : 1 : 1.
  • for example, a Landolt ring with a diameter of 7.5 mm, an arc width of 1.5 mm, and a cut width of 1.5 mm, viewed from a distance of 5 m, corresponds to a visual acuity of 1.0.
  • in the present embodiment, the image projection apparatus 200 may project the vision test image T with the Landolt rings rendered at the size at which they would be seen from a distance of 5 m.
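These dimensions follow from the usual definition that a decimal visual acuity V corresponds to a gap subtending 1/V minutes of arc. A short sketch of that arithmetic (the function name and return convention are mine, not the patent's):

```python
import math

def landolt_ring_mm(acuity: float, distance_m: float = 5.0):
    """Outer diameter, arc width, and cut width (mm) of a Landolt ring for
    a given decimal acuity at a given viewing distance. The cut subtends
    (1 / acuity) arc minutes; diameter : arc : cut follows the 5 : 1 : 1
    proportion quoted above."""
    gap_arcmin = 1.0 / acuity
    gap_mm = distance_m * 1000.0 * math.tan(math.radians(gap_arcmin / 60.0))
    return 5.0 * gap_mm, gap_mm, gap_mm

# Acuity 1.0 at 5 m gives roughly a 7.5 mm ring with 1.5 mm arc and cut,
# matching the dimensions quoted above (exact values ~7.27 / ~1.45 mm).
print(landolt_ring_mm(1.0))
```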
  • FIG. 24 is a third diagram showing an example of a vision test image of the second embodiment.
  • the vision test images shown in FIG. 24 differ in the position where the Landolt ring is displayed.
  • in the vision test image T11 shown in FIG. 24, one Landolt ring is arranged at the upper left; in the vision test image T12, one Landolt ring is arranged at the lower left; and in the vision test image T13, one Landolt ring is arranged at the upper right.
  • in the example of FIG. 24, one Landolt ring is arranged in each vision test image, but the present invention is not limited to this.
  • a plurality of Landolt rings may be arranged for one vision test image.
  • in the present embodiment, vision test images that differ in the position where the Landolt ring is disposed are sequentially projected onto the user's retina, but the present invention is not limited to this.
  • the vision test image may be projected on the user's retina as a moving image in which the position of the Landolt ring moves.
  • in this way, the present embodiment makes it possible to associate positions on the user's retina with the visual acuity at those positions, so that the distribution of visual acuity across the user's retina can be grasped.
  • FIG. 25 is a fourth diagram showing an example of a vision test image of the second embodiment.
  • in the vision test image T15 shown in FIG. 25, a plurality of Landolt ring images 151 to 156 having different luminances are arranged.
  • the luminance gradually decreases from the image 151 of the Landolt ring to the image 156 of the Landolt ring.
  • by varying the luminance of the Landolt ring images in this way, the luminance required for the user to discriminate an image can be grasped.
  • the Landolt ring may be a black image, or may be an image of a color other than black, for example, blue, red, or green.
  • the present embodiment by changing the color of the image of the Landolt's ring, it is possible to examine the presence or absence of a color vision abnormality together with the vision test.
  • FIG. 26 is a fifth diagram illustrating an example of the vision test image of the second embodiment.
  • the visual acuity test may be performed using an image other than the Landolt's ring.
  • the vision test image T16 shown in FIG. 26 includes an ETDRS (Early Treatment of Diabetic Retinopathy Study) chart.
  • in the ETDRS chart, the difference in target size between rows is 0.1 logMAR unit, and five targets are arranged in each row.
  • the target characters are the Sloan letter set (the ten letters C, D, H, K, N, O, R, S, V, and Z) in the Sloan font.
  • the spacing between adjacent targets is one target width, and the result is scored per target rather than per row.
  • the visual acuity test may be performed using a target other than the Landolt ring.
  • the targets other than the Landolt's ring are, for example, a tumbling E chart etc. in addition to the ETDRS chart.
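For the 0.1 logMAR steps mentioned above, the letter size ratio between adjacent rows is 10^0.1, about 1.259. A one-line illustration:

```python
def etdrs_letter_sizes(base_size: float, rows: int = 5):
    """Letter sizes for successive ETDRS rows: each row differs by
    0.1 logMAR, i.e. a size ratio of 10 ** 0.1 (about 1.259)."""
    return [round(base_size * 10 ** (0.1 * i), 3) for i in range(rows)]

print(etdrs_letter_sizes(1.0))   # [1.0, 1.259, 1.585, 1.995, 2.512]
```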
  • FIG. 27 is a diagram showing an example of an image used for examination of contrast sensitivity
  • FIG. 28 is a diagram showing an example of an image used for examination of astigmatism.
  • the vision test image T of the present embodiment may include, for example, a test image of contrast sensitivity shown in FIG.
  • Images 171 and 172 shown in FIG. 27 are images for projecting a color gradation over the entire field of view.
  • by projecting a color gradation over the user's visual field in this way and having the user input the colors recognizable to him or her as the inspection result, the contrast sensitivity of the user can be measured.
  • instead of projecting a single color-gradation image over the entire visual field, a plurality of images each filling the entire visual field with a single uniform color may be prepared and projected in sequence to realize the color gradation.
  • the vision test image T of the present embodiment may include the image 181 and the image 182 for astigmatism examination shown in FIG. By including these images in the visual acuity examination image T, it can be examined whether the user is astigmatic or not.
  • FIG. 29 is a flowchart illustrating processing of the inspection processing unit of the second embodiment.
  • the terminal device 300A determines whether the input receiving unit 343 has received a request for starting a visual field inspection (step S2901). If the start request has not been received in step S2901, the inspection processing unit 340 proceeds to step S2911 described later.
  • in step S2901, when the start request is received, the inspection processing unit 340 causes the image data selection unit 344 to select and read the visual field inspection image data from the image data held by the image data holding unit 341, and passes the visual field inspection image data to the image output processing unit 320 and the display control unit 342 (step S2902).
  • the image output processing unit 320 transmits the image data for visual field examination to the image projection apparatus 200 (step S2903).
  • in the image projection apparatus 200, the projection unit 210 scans an image light beam based on the visual field inspection image data onto the user's retina, making the visual field inspection image G visible to the user.
  • the display control unit 342 causes the display operation device 301 of the terminal device 300A to display an input screen of an inspection result including the visual field inspection image G based on the inspection image data (step S2904). Details of the inspection result input screen will be described later.
  • the inspection processing unit 340 causes the input receiving unit 343 to determine whether or not the input of the inspection result of the visual field inspection is received on the input screen (step S2905).
  • if the input of the examination result is not accepted in step S2905, the input receiving unit 343 proceeds to step S2908 described later.
  • step S2905 when the input of the examination result is accepted, the input accepting unit 343 determines whether or not an instruction to save the examination result is accepted (step S2906). In step S2906, when the storage instruction is not received, the input receiving unit 343 proceeds to step S2910 described later.
  • step S2906 when the input of the storage instruction is received, the inspection processing unit 340 causes the inspection result storage unit 345 to store the input inspection result in the visual field and vision information storage unit 330A (step S2907), and the processing ends.
  • step S2905 when the input of the test result is not received, the input receiving unit 343 determines whether a predetermined time has elapsed (step S2908). If the predetermined time has not elapsed in step S2908, the input accepting unit 343 returns to step S2905.
  • in step S2908, when the predetermined time has elapsed, the inspection processing unit 340 causes the display control unit 342 to display on the terminal device 300A that the visual field inspection was not performed normally (step S2909), and the process ends.
  • step S2906 when the storage instruction is not received, the input receiving unit 343 determines whether a predetermined time has elapsed (step S2910). In step S2910, when the predetermined time has not elapsed, the input accepting unit 343 returns to step S2906.
  • step S2910 If the predetermined time has elapsed in step S2910, the inspection processing unit 340 proceeds to step S2909.
  • step S2901 the input receiving unit 343 determines whether the start request of the visual acuity inspection is received (step S2911). In step S2911, when the start request for the visual acuity examination is not received, the inspection processing unit 340 returns to step S2901.
  • in step S2911, when the start request for the visual acuity examination is received, the examination processing unit 340 causes the image data selection unit 344 to select and read the visual acuity examination image data from the image data held by the image data holding unit 341, passes the visual acuity examination image data to the image output processing unit 320 and the display control unit 342 (step S2912), and proceeds to step S2903.
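Steps S2905 through S2910 both reduce to waiting for an input until a predetermined time elapses. A generic sketch of that wait-with-timeout pattern follows; the function name, timeout, and polling interval are assumptions.

```python
import time

def wait_for(predicate, timeout_s: float, poll_s: float = 0.1) -> bool:
    """Poll for an input until it arrives or the predetermined time
    elapses (cf. S2905/S2908 and S2906/S2910); True means input arrived."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if predicate():
            return True
        time.sleep(poll_s)
    return False

# If this returns False, report that the inspection was not performed
# normally (step S2909).
print(wait_for(lambda: True, timeout_s=1.0))
```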
  • FIG. 30 is a diagram illustrating an example of an inspection result input screen according to the second embodiment.
  • on the inspection result input screen 101 shown in FIG. 30, the visual field inspection image G is displayed. Further, on the input screen 101, a message 102 prompting selection of the numbers read in the visual field inspection image G and a button 103 for instructing storage of the inspection result are displayed.
  • in the present embodiment, the message 102 is "Please touch the number read by the user", which prompts the user to select the numbers visually recognized in the visual field inspection image G, but the message is not limited to this.
  • the message 102 may be, for example, content for prompting the user to select a number which can not be visually recognized in the visual field inspection image G.
  • the content of the message 102 may be set in advance by the administrator of the image projection system 100A or the like. Also, for example, it may be set by the user whether to select a read digit or to select an unreadable digit.
  • next, the visual field information table 331 and the visual acuity information table 332 stored in the visual field and visual acuity information storage unit 330A of the present embodiment will be described.
  • first, the visual field information table 331-P of the user P will be described with reference to FIGS. 31 to 36.
  • FIG. 31 is a first diagram showing an example of how the visual field inspection image of the second embodiment looks.
  • FIG. 31 shows an example of how the user P having the visual field shown in FIG. 9 looks at the visual field inspection image G.
  • the user P has missing areas 112, 113, 114 (see FIG. 9).
  • the defective areas 112, 113, and 114 are reflected also in the visual field inspection image G.
  • the visual field inspection image G is visually recognized by the user P as an image in which the area (shaded area) at the position corresponding to the defect areas 112, 113, and 114 is missing. Therefore, the number drawn in the area at the position corresponding to the missing area is not visually recognized by the user.
  • in the example of FIG. 31, the areas included in the defect area 112 are the areas with identifiers 1, 2, 11, and 21, and the areas included in the defect area 113 are the areas with identifiers 9, 10, 20, and 30. These numbers therefore cannot be read by the user P.
  • FIG. 32 is a first diagram illustrating an example of an input screen in a state in which a test result is input in the second embodiment.
  • An input screen 101A shown in FIG. 32 shows an example in which the inspection result of the user P is input on the input screen 101.
  • the numbers of the areas included in the respective missing areas 112 to 114 are not selected, and the numbers of the areas other than the missing areas 112 to 114 are selected.
  • on the input screen 101A, a selected area in the visual field inspection image G is displayed brighter than a non-selected area, but the present invention is not limited to this.
  • it suffices that the selected areas and the non-selected areas are displayed in different modes in the visual field inspection image G so that they can be distinguished.
  • the user P has selected 3 to 8, 12 to 19, 22 to 29, 31 to 89, 91 to 99 as the numbers read. Therefore, in the image G for visual field inspection displayed on the input screen 101A, it can be understood that the area in which the number selected by the user P is drawn is the visual field of the user P.
  • the input examination result is stored in the visual field and visual acuity information storage unit 330A as the visual field information table 331-P.
  • FIG. 33 is a first diagram showing an example of the visual field information table of the second embodiment.
  • the visual field information table 331 of the present embodiment is provided for each user, and FIG. 33 shows the visual field information table 331 -P of the user P.
  • the visual field information table 331-P of the present embodiment has, as items of information, a user ID, an examination date, an input time, read numbers, and unread numbers.
  • the item "user ID" is associated with other items.
  • information including the value of each item is referred to as visual field inspection result information.
  • the visual field test result information of the present embodiment corresponds to visual field information indicating the visual field of the user.
  • the value of the item “examination date” is information indicating the date on which the visual field examination was performed.
  • the value of the item “input time” is information indicating the time at which the inspection result of the visual field inspection was input in the terminal device 300A. That is, the value of the item “input time” is information indicating the time when the visual field examination was performed.
  • the value of the item “read number” indicates a number read by the user in the visual field inspection image G.
  • the value of the item “read number” indicates the number drawn in the area selected as the read number in the visual field inspection image G displayed on the inspection result input screen 101.
  • the value of the item “unreadable number” indicates a number that can not be read by the user in the visual field inspection image G.
  • the value of the item “unreadable digit” indicates the numeral drawn in the area not selected as the readable digit in the visual field inspection image G displayed on the inspection result input screen 101.
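  • As a concrete illustration, the following is a minimal Python sketch of such a record; the class name, the assumed division of the inspection image into 100 numbered areas, and the helper property are illustrative assumptions, not part of the embodiment.

    from dataclasses import dataclass, field
    from typing import List

    AREA_IDS = list(range(1, 101))  # assumed: inspection image divided into areas 1-100

    @dataclass
    class VisualFieldResult:
        """One row of the visual field information table 331 (names illustrative)."""
        user_id: str
        exam_date: str        # date the visual field examination was performed
        input_time: str       # time the result was input on the terminal device
        read_numbers: List[int] = field(default_factory=list)

        @property
        def unread_numbers(self) -> List[int]:
            # Numbers not selected on the input screen 101 count as unread.
            return [n for n in AREA_IDS if n not in self.read_numbers]

    # User P's selections from FIG. 32: 3-8, 12-19, 22-29, 31-89, 91-99.
    # The user ID, date, and time below are placeholders.
    read_p = (list(range(3, 9)) + list(range(12, 20)) + list(range(22, 30))
              + list(range(31, 90)) + list(range(91, 100)))
    result_p = VisualFieldResult("001", "2016/10/10", "10:00", read_p)
    print(result_p.unread_numbers)  # -> [1, 2, 9, 10, 11, 20, 21, 30, 90, 100]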
  • FIG. 34 is a second diagram showing an example of how the visual field inspection image of the second embodiment looks.
  • FIG. 34 shows an example of how the visual field inspection image G looks to the user Q having the visual field shown in FIG. 12.
  • The user Q has a defect area 142 (see FIG. 12).
  • The visual field inspection image G is therefore visually recognized by the user Q as an image in which the area (hatched portion) at the position corresponding to the defect area 142 is missing. Accordingly, in the example of FIG. 34, the numbers drawn in the area at the position corresponding to the defect area 142 are numbers that cannot be read by the user Q.
  • FIG. 35 is a second diagram showing an example of the input screen in the state in which the visual field inspection result is input in the second embodiment.
  • The input screen 101B shown in FIG. 35 is an example in which the inspection results of the user Q have been input on the input screen 101.
  • On the input screen 101B, the numbers of the areas included in the defect area 142 are not selected, and the numbers of the areas outside the defect area 142 are selected.
  • In the example of FIG. 35, the user Q has selected 5 to 10, 15 to 20, 25 to 30, 35 to 40, 45 to 50, 55 to 59, and so on as the numbers that were read. Therefore, it can be seen that the areas in which the numbers selected by the user Q are drawn constitute the visual field of the user Q.
  • The input examination results are stored as the visual field information table 331-Q in the visual field/visual acuity information storage unit 330A.
  • FIG. 36 is a second diagram showing an example of the visual field information table of the second embodiment.
  • From the visual field information table 331-Q shown in FIG. 36, it can be seen that the results of the visual field inspections performed by the user Q with the user ID 002 were input at 10:00 on 2016/10/10 and at 18:00 on the other examination date recorded in the table.
  • As described above, in the present embodiment, a visual field inspection can be performed with a simple configuration, and the inspection results can be used as visual field information indicating the user's visual field to generate the visual field information table 331.
  • The visual field information table 331 stored in the terminal device 300A may also be stored, for example, on a server of a medical institution or the like.
  • In the present embodiment, the visual field inspection image G has been described as having a size that covers the visual field of the user, but the present invention is not limited to this.
  • The image projection apparatus 200 may be provided with a mechanism for detecting the moving direction of the pupil.
  • By providing such a mechanism in the image projection apparatus 200, the irradiation direction of the image light beam from the light source 211 can be made to follow the moving direction of the user's pupil.
  • In this way, the user can view the same visual field inspection image G both before and after moving the pupil.
  • The user therefore does not have to keep facing a fixed direction and can undergo the visual field inspection in a free posture.
  • In other words, the user can always view the visual field inspection image G during the inspection regardless of the user's posture or the like, and the accuracy of the inspection can be improved.
  • In the visual field information table 331 of the present embodiment, for example, coordinate information indicating the positions of the areas indicated by the identifiers in the visual field inspection image G may be used instead of the values of the items “read numbers” and “unread numbers”. The visual field information table 331 may also have, as an additional item of information, coordinate information indicating the positions of the areas indicated by the values of the items “read numbers” and “unread numbers”.
  • The coordinate information may be acquired, for example, from the visual field inspection image data, or from a table in which the identifier of each area is associated with coordinate information specifying the corresponding area. Such a table may be provided to the terminal device 300 in advance; a sketch is given below.
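  • A minimal sketch of such an identifier-to-coordinates table, assuming for illustration a 10×10 grid of equally sized areas over the inspection image (the image size and cell layout are assumptions):

    IMAGE_W, IMAGE_H = 1000, 1000   # assumed pixel size of the inspection image
    COLS, ROWS = 10, 10             # assumed 10x10 grid of areas, identifiers 1-100
    CELL_W, CELL_H = IMAGE_W // COLS, IMAGE_H // ROWS

    def area_position(identifier: int) -> tuple:
        """Upper-left corner (x, y) of the area with the given identifier."""
        row, col = divmod(identifier - 1, COLS)
        return (col * CELL_W, row * CELL_H)

    # Table associating each area identifier with its coordinate information.
    coordinate_table = {i: area_position(i) for i in range(1, COLS * ROWS + 1)}
    print(coordinate_table[11])  # -> (0, 100): first area of the second row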
  • The visual field inspection result information may include the image data of the visual field inspection image G in the state in which areas have been selected. That is, in the present embodiment, the image data of the visual field inspection image G with areas selected as shown in FIGS. 32 and 35 may be held as one item of the visual field inspection result information.
  • Further, the brightness (luminance) of the visual field inspection image G may be changed stepwise, and the inspection result may be input at each step.
  • In that case, the item “brightness of the visual field inspection image G” may be added to the visual field information table 331, and the value of this item may be included in the inspection result information.
  • By inputting the inspection result for each brightness of the visual field inspection image G, the user's visual field at each brightness can be specified, for example as sketched below.
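  • A minimal sketch of such a stepwise procedure; the brightness steps and the two stand-in functions are illustrative assumptions, not part of the embodiment:

    def project_inspection_image(brightness: int) -> None:
        # stand-in for projecting the inspection image G at the given brightness
        print(f"projecting inspection image G at {brightness}% brightness")

    def collect_read_numbers() -> list:
        # stand-in for the identifiers entered on the input screen at this step
        return []

    results_by_brightness = {}
    for level in (100, 80, 60, 40, 20):          # assumed brightness steps
        project_inspection_image(level)
        results_by_brightness[level] = collect_read_numbers()
    # Each entry can then be stored as inspection result information together
    # with the item "brightness of the visual field inspection image G".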
  • FIG. 37 is a diagram showing an example of how the visual acuity test image of the second embodiment looks.
  • FIG. 37 shows an example of how the visual acuity test image T looks to the user Q having the visual field shown in FIG. 12.
  • The user Q has the defect area 142. For the user Q, therefore, as shown in FIG. 37, the visual acuity test image T is visually recognized as an image in which the area (shaded portion) at the position corresponding to the defect area 142 is missing, and the Landolt rings drawn in that area are not visually recognized. Accordingly, in the example of FIG. 37, the Landolt rings drawn in the areas included in the defect area 142 are rings for which the user Q cannot determine the direction of the break.
  • FIG. 38 is a diagram showing an example of an input screen in a state where the visual acuity test result is input in the second embodiment.
  • The input screen 101C illustrated in FIG. 38 is an example in which the visual acuity test results of the user Q have been input on the input screen 101.
  • In the present embodiment, visual acuity test images T including Landolt rings of different sizes are sequentially projected onto the retina of the user Q.
  • The user Q reads out the directions of the breaks of the Landolt rings projected onto areas other than the defect area 142, and the test assistant selects, on the input screen 101C, the Landolt rings whose break directions were read by the user.
  • An identifier or the like for specifying each Landolt ring may be provided at the center of or near the Landolt ring.
  • For example, a number serving as an identifier for specifying each Landolt ring may be placed at its center.
  • On the input screen 101C, the Landolt rings in the areas included in the defect area 142 are not selected, and the Landolt rings in the areas outside the defect area 142 are selected.
  • By performing the visual acuity test in this manner, the visual acuity within the visual field of the user Q can be measured. Furthermore, in the present embodiment, a visual field examination as to whether a defect area exists in the visual field of the user Q can be performed simultaneously with the visual acuity test.
  • The results of the visual acuity test are stored in the visual acuity information table 332-Q.
  • FIG. 39 is a first diagram showing an example of the visual acuity information table of the second embodiment.
  • The visual acuity information table 332 of the present embodiment is provided for each user; FIG. 39 shows the visual acuity information table 332-Q of the user Q.
  • The visual acuity information table 332-Q of the present embodiment has, as items of information, a user ID, an examination date, an input time, Landolt rings that could not be discriminated, and Landolt rings that could be discriminated.
  • The item “user ID” is associated with the other items.
  • Information including the values of these items constitutes the visual acuity information.
  • The value of the item “Landolt rings that could not be discriminated” indicates the Landolt rings for which the user could not determine the break in the visual acuity test image T.
  • Specifically, this value is an identifier specifying each Landolt ring that was not selected, on the input screen 101C, as a ring whose break was determined.
  • The value of the item “Landolt rings that could be discriminated” indicates the Landolt rings for which the user could determine the break in the visual acuity test image T.
  • Specifically, this value is an identifier specifying each Landolt ring that was selected, on the input screen 101C, as a ring whose break was determined.
  • The visual acuity information table 332-Q shown in FIG. 39 indicates that the results of the visual acuity test performed by the user Q with the user ID 002 were input at 10:00 on 2016/4/10.
  • In the example described above, the values of the items “Landolt rings that could not be discriminated” and “Landolt rings that could be discriminated” are identifiers specifying the Landolt rings, but the present invention is not limited to this.
  • These values may instead be, for example, the coordinates of the center points of the Landolt rings selected on the input screen 101C, as in the sketch below.
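  • A minimal Python sketch of such a visual acuity record; the class and field names are illustrative assumptions, and the second class shows the coordinate variant described above:

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class VisualAcuityResult:
        """One row of the visual acuity information table 332 (names illustrative)."""
        user_id: str
        exam_date: str
        input_time: str
        discriminated: List[int]       # identifiers of rings whose break was determined
        not_discriminated: List[int]   # identifiers of rings whose break was not determined

    @dataclass
    class VisualAcuityResultByCoords:
        """Variant using center-point coordinates instead of identifiers."""
        user_id: str
        exam_date: str
        input_time: str
        discriminated: List[Tuple[int, int]]
        not_discriminated: List[Tuple[int, int]]

    # Placeholder identifiers for illustration only.
    row = VisualAcuityResult("002", "2016/4/10", "10:00", [5, 6, 7], [1, 2])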
  • FIG. 40 is a second diagram showing an example of the visual acuity information table of the second embodiment.
  • The visual acuity information table 332A-Q shown in FIG. 40 shows an example of visual acuity test results obtained when a moving image in which a single Landolt ring moves is projected onto the retina of the user Q.
  • In this case, the value of the item “Landolt rings that could be discriminated” is coordinate information indicating the regions of the visual acuity test image T in which the Landolt ring was displayed when the user Q determined its break.
  • Likewise, the value of the item “Landolt rings that could not be discriminated” is coordinate information indicating the regions of the visual acuity test image T in which the Landolt ring was displayed when the user Q could not determine its break.
  • The coordinates here are the coordinates of the center point of the Landolt ring.
  • Note that “discriminating the break of a Landolt ring” refers to visually recognizing the Landolt ring, and may include both the case in which the ring is recognized and the direction of the break is correctly determined, and the case in which the ring is recognized but the direction of the break is not correctly determined.
  • Since each area of the visual field inspection image G corresponds to a position on the retina, a visual field test can be performed at each position on the retina by examining whether each area is visible or invisible.
  • Further, in the present embodiment, the visual acuity test image T including the images of the visual targets used for the test is projected directly onto predetermined positions on the retina. Therefore, according to the present embodiment, the visual acuity of the retina itself can be measured. Furthermore, the distribution of visual acuity over the retina can be measured.
  • In the present embodiment, the image projection apparatus 200 has the shape of ordinary eyeglasses, but is not limited to this.
  • For example, it may have the shape of goggles covering both eyes of the user P.
  • In the present embodiment, it is preferable that the area H (see FIG. 4) onto which the image is projected in the image projection apparatus 200 has a size that covers the field of view of the user.
  • A size that covers the user's field of view is, for example, a size in which the image projected onto the retina covers about 60 degrees on the nose side and the upper side, about 70 degrees on the lower side, and about 90 to 100 degrees on the ear side.
  • When the area onto which the image (test image) is projected has a size that covers the field of view of the user (subject), an appropriate visual field inspection can be performed regardless of the state of the user's (subject's) visual field, retina, optic nerve, and the like.
  • Note that the area H onto which the image is projected may also be smaller than “about 60 degrees on the nose side and the upper side, about 70 degrees on the lower side, and about 90 to 100 degrees on the ear side” for one eye.
  • When the user inputs the inspection results himself or herself, the input may be performed after removing the worn image projection apparatus 200, or the user may look at the screen of the terminal device 300A with the eye not undergoing the visual field inspection and perform the input.
  • Alternatively, the user may ask an inspection assistant to perform the input on the terminal device 300A.
  • In that case, the user may read aloud the characters that are visible, thereby conveying to the inspection assistant the areas in which the characters could be read.
  • The third embodiment will be described below with reference to the drawings.
  • The third embodiment differs from the second embodiment in that the functions of the terminal device of the second embodiment are provided in a server outside the terminal device. In the following description of the third embodiment, therefore, components having the same functional configuration as in the second embodiment are given the same reference numerals as in the description of the second embodiment, and their description is omitted.
  • FIG. 41 is a view for explaining the system configuration of the image projection system of the third embodiment.
  • An image projection system 100B of the present embodiment includes an image projection device 200, a terminal device 300B, and a server 400.
  • Although one image projection apparatus 200 and one terminal device 300B are shown in the example of FIG. 41, the image projection system 100B may include any number of image projection apparatuses 200 and terminal devices 300B.
  • The terminal device 300B of the present embodiment receives image data from the server 400 and transmits the received image data to the image projection apparatus 200.
  • The terminal device 300B of the present embodiment also displays screens as instructed by the server 400 and transmits information input on those screens to the server 400.
  • The server 400 of the present embodiment is connected to the terminal device 300B via a network.
  • The server 400 includes a visual field/visual acuity information database 330B, an image generation processing unit 310, an inspection processing unit 340A, and an image data transmission unit 350.
  • In the visual field/visual acuity information database 330B, the visual field information and the visual acuity information input on the terminal device 300B are stored in association with the user ID and with information indicating the date and time at which each piece of information was input. Specifically, when there are a plurality of terminal devices 300B, for example, the visual field information tables 331 and the visual acuity information tables 332 input on each of the terminal devices 300B are stored in the visual field/visual acuity information database 330B for each user ID.
  • The inspection processing unit 340A of the present embodiment causes the display operation device 301 of the terminal device 300B to display input screens for examination results, including the visual field examination image data and the visual acuity examination image data.
  • The input receiving unit 343 receives the information input on the input screens of the terminal device 300B.
  • The image data transmission unit 350 of the present embodiment transmits the visual field examination image data, the visual acuity examination image data, and the image data generated by the image generation processing unit 310 to the terminal device 300B.
  • The terminal device 300B outputs the visual field examination image data, the visual acuity examination image data, and the image data generated by the image generation processing unit 310 to the image projection apparatus 200.
  • Since the inspection processing unit 340A and the image generation processing unit 310 are provided in the server 400 as described above, the processing load on the terminal device 300B can be reduced.
  • In the present embodiment, the image generation processing unit 310 and the inspection processing unit 340A are provided in the server 400, but the present invention is not limited to this.
  • The image generation processing unit 310 and the inspection processing unit 340A may be provided in different servers 400.
  • In that case, the visual field/visual acuity information database 330B may be provided so that the visual field information of each user obtained by the visual field inspection of the inspection processing unit 340A is stored in it and can be referred to by the image generation processing unit 310; a sketch of such per-user storage is given below.
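  • A minimal sketch of this per-user storage; the data layout and function names are assumptions for illustration only:

    from collections import defaultdict

    # visual field/visual acuity information database 330B, keyed by user ID;
    # each record keeps the date and time at which the information was input
    database_330b = defaultdict(list)

    def store_result(user_id: str, kind: str, input_datetime: str, payload: dict) -> None:
        database_330b[user_id].append(
            {"kind": kind, "input": input_datetime, **payload}
        )

    def visual_field_of(user_id: str) -> list:
        # records referred to by the image generation processing unit 310
        return [r for r in database_330b[user_id] if r["kind"] == "visual_field"]

    store_result("002", "visual_field", "2016/10/10 10:00",
                 {"read_numbers": [5, 6, 7], "unread_numbers": [1, 2]})
    print(visual_field_of("002"))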
  • The fourth embodiment will be described below with reference to the drawings.
  • The fourth embodiment differs from the second embodiment in that the server distributes an application including an inspection processing unit, an image generation processing unit, and an image output processing unit to the terminal device. In the following description of the fourth embodiment, therefore, components having the same functional configuration as in the second embodiment are given the same reference numerals as in the description of the second embodiment, and their description is omitted.
  • FIG. 42 is a diagram for explaining the system configuration of an image projection system according to the fourth embodiment.
  • The image projection system 100C of the present embodiment includes an image projection device 200, a terminal device 300C, and a server 400A.
  • The server 400A of the present embodiment has an application distribution unit 410.
  • The server 400A also has an application 420 for realizing the functions of the image generation processing unit 310, the image output processing unit 320, and the inspection processing unit 340.
  • When the application distribution unit 410 receives a distribution request for the application 420 from the terminal device 300C, it distributes the application 420 to the terminal device 300C that transmitted the request.
  • The terminal device 300C to which the application 420 has been distributed functions as the terminal device 300A including the image generation processing unit 310, the image output processing unit 320, and the inspection processing unit 340. Therefore, in the present embodiment, after the application 420 has been distributed, the terminal device 300C alone can perform the visual field inspection, acquire the visual field information and visual acuity information of the user, and provide the image projection apparatus 200 with image data generated based on the acquired visual field information and visual acuity information.
  • In the present embodiment, the application 420 distributed by the server 400A includes the functions of the image generation processing unit 310, the image output processing unit 320, and the inspection processing unit 340, but is not limited to this.
  • The application 420 delivered from the server 400A to the terminal device 300C may include, for example, only the functions of the image generation processing unit 310 and the image output processing unit 320.
  • In that case, the terminal device 300C may receive the distribution of the application from the server 400A and may acquire the visual field information of the user.
  • The fifth embodiment will be described below with reference to the drawings.
  • The fifth embodiment differs from the first embodiment in that the position at which the projection information is projected is set by the user. In the following description of the fifth embodiment, therefore, components having the same functional configuration as in the first embodiment are given the same reference numerals as in the description of the first embodiment, and their description is omitted.
  • FIG. 43 is a diagram for explaining the system configuration of an image projection system according to the fifth embodiment.
  • The image projection system 100D of the present embodiment includes an image projection device 200 and a terminal device 300D.
  • The terminal device 300D of the present embodiment includes an image generation processing unit 310A, an image output processing unit 320, a projection setting unit 355, and a projection position table 360.
  • The image generation processing unit 310A of the present embodiment acquires the projection position information stored in the projection position table 360 instead of the visual field information. Details of the image generation processing unit 310A will be described later.
  • The projection setting unit 355 of the present embodiment sets the projection position of the projection information. Specifically, the projection setting unit 355 causes the display operation device 301 of the terminal device 300D to display a setting screen for setting the projection position of the projection information, and stores the input settings in the projection position table 360.
  • The projection position table 360 stores projection information and projection positions in association with each other for each user ID. Details of the projection position table 360 will be described later.
  • FIG. 44 is a first diagram for explaining the function of the image generation processing unit of the fifth embodiment.
  • The image generation processing unit 310A of the present embodiment has a projection determination unit 311A instead of the projection request reception unit 311, and a projection position acquisition unit 313A instead of the visual field/visual acuity information acquisition unit 313.
  • The projection determination unit 311A determines whether to output the projection information to the image projection apparatus 200. Specifically, the projection determination unit 311A determines whether it is time to have the image projection apparatus 200 project the projection information set in the projection position table 360.
  • The projection target acquisition unit 312 acquires the projection information, such as text data and image data to be projected.
  • The projection position acquisition unit 313A acquires, from the projection position table 360, the information indicating the projection position associated with the projection information.
  • FIG. 45 is a view showing an example of a projection position table of the fifth embodiment.
  • The projection position table 360 of this embodiment has projection information and a projection position as items of information, and the two are associated with each other.
  • The value of the item “projection information” indicates the information to be projected onto the user's retina by the image projection apparatus 200.
  • The value of the item “projection position” is projection position information indicating the position at which the projection information is projected on the user's retina. Specifically, it indicates the coordinates of the upper left point and the lower right point of the rectangular projection area of the projection information. Note that the value of the item “projection position” may instead be the coordinates of the four corners of the rectangular projection area.
  • In the example of FIG. 45, the projection information “incoming call notification” is projected inside the rectangle whose upper left point has the coordinates (x11, y11) and whose lower right point has the coordinates (x21, y21); a sketch of such a table is given below.
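  • A minimal sketch of the projection position table as a mapping; the concrete coordinate values stand in for (x11, y11)-(x21, y21) and are purely illustrative:

    # projection information -> (upper-left point, lower-right point), per user ID
    projection_position_table = {
        "001": {
            "incoming call notification": ((100, 50), (300, 120)),
            "clock": ((600, 50), (760, 120)),
        },
    }

    def get_projection_position(user_id: str, info_name: str):
        """Return the rectangle in which the given projection information is shown."""
        return projection_position_table[user_id][info_name]

    print(get_projection_position("001", "clock"))  # -> ((600, 50), (760, 120))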
  • FIG. 46 is a flowchart for explaining the processing of the projection setting unit of the fifth embodiment.
  • The projection setting unit 355 determines whether a request to set the projection position of projection information has been received (step S4601).
  • When the setting request has not been received in step S4601, the projection setting unit 355 waits until a setting request is received.
  • When the setting request has been received in step S4601, the projection setting unit 355 causes the display operation device 301 of the terminal device 300D to display a selection screen for projection information (step S4602) and receives a selection of projection information on the selection screen (step S4603).
  • Next, the projection setting unit 355 displays a setting screen for the projection position of the selected projection information (step S4604) and receives the setting of the projection position (step S4605).
  • The projection setting unit 355 then associates the selected projection information with position information indicating the set projection position, stores them in the projection position table 360 (step S4606), and ends the processing; a sketch of these steps is given below.
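  • Steps S4604 to S4606 can be summarized in code as follows; a sketch only, with the table structure and argument names assumed:

    def store_projection_setting(table: dict, user_id: str,
                                 selected_info: str, position: tuple) -> None:
        """S4604-S4606: associate the selected projection information with
        position information indicating the set projection position and store it."""
        table.setdefault(user_id, {})[selected_info] = position

    projection_position_table = {}
    store_projection_setting(projection_position_table, "001",
                             "incoming call notification", ((100, 50), (300, 120)))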
  • FIG. 47 is a flowchart for explaining the processing of the image generation processing unit of the fifth embodiment.
  • The image generation processing unit 310A of the present embodiment determines, by means of the projection determination unit 311A, whether it is time to project the projection information (step S4701). Specifically, when the projection information is an incoming call notification, for example, the projection determination unit 311A determines that it is time to project the projection information when the terminal device 300D receives an incoming call.
  • When it is not time to project in step S4701, the projection determination unit 311A waits until that time is reached.
  • When it is time to project in step S4701, the image generation processing unit 310A refers to the projection position table 360 and acquires the projection position information corresponding to the projection information, and causes the projection target acquisition unit 312 to acquire the projection information (step S4702).
  • The image generation processing unit 310A then causes the image data generation unit 314 to generate image data of an image representing the projection information (step S4703).
  • The image data generation unit 314 outputs the generated image data to the image output processing unit 320 (step S4704), and the processing ends.
  • The image data passed to the image output processing unit 320 is transmitted to the image projection apparatus 200 by the image output processing unit 320; a sketch of this flow is given below.
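  • A minimal sketch of steps S4701 to S4704; the event name and the two stand-in functions are assumptions for illustration:

    projection_position_table = {
        "001": {"incoming call notification": ((100, 50), (300, 120))},
    }

    def render_image(info: str, rect) -> dict:
        # stand-in for the image data generation unit 314 (S4703)
        return {"text": info, "rect": rect}

    def send_to_projector(image: dict) -> None:
        # stand-in for the image output processing unit 320 (S4704)
        print("projecting:", image)

    def on_terminal_event(event: str, user_id: str) -> None:
        # S4701: the projection determination unit decides whether it is time
        # to project; here an incoming call is the assumed trigger.
        if event != "incoming_call":
            return
        info = "incoming call notification"
        rect = projection_position_table[user_id][info]   # S4702: position info
        send_to_projector(render_image(info, rect))       # S4703-S4704

    on_terminal_event("incoming_call", "001")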
  • FIG. 48A is a diagram showing an example of a selection screen for projection information according to the fifth embodiment, and FIG. 48B is a diagram showing an example of a setting screen for the projection position of the projection information.
  • On the selection screen 321-1, a message 322 prompting selection of projection information, a list 323 of projection information, and a button 324 for transitioning from the selection screen 321-1 to the setting screen 321-2 are displayed.
  • When projection information is selected on the selection screen 321-1 displayed on the display operation device 301, the screen changes to the setting screen 321-2 shown in FIG. 48B.
  • The setting screen 321-2 displays a message 325 prompting the user to specify a projection position and an image 326 indicating the projection area of the image projection apparatus 200.
  • When, for example, the area 327 of the image 326 is designated as the projection position of the incoming call notification, the projection setting unit 355 of the present embodiment associates the incoming call notification with coordinate information specifying the area 327 and stores them in the projection position table 360.
  • Similarly, when the clock is selected and the area 328 is designated, the projection setting unit 355 associates the clock with coordinate information specifying the area 328 and stores them in the projection position table 360.
  • The projection position may be designated, for example, by the user touching a desired area of the image 326 on the setting screen 321-2 or by tracing around the area with a finger.
  • The terminal device 300D may also hold the visual acuity information of the user, and the image generation processing unit 310A may acquire the visual acuity information of the user together with the projection position and the projection information.
  • In that case, the size of the characters, numbers, symbols, and the like of the projection information may be set according to the visual acuity information.
  • FIG. 49 is a diagram showing an example in which projection information is projected to a designated position.
  • FIG. 49 shows a part of the field of view of the user.
  • In FIG. 49, the incoming call notification is projected so as to be visually recognized at the position of the area 327 of the visual field, and the clock is projected so as to be visually recognized at the position of the area 328.
  • As described above, in the present embodiment, the projection information can be projected onto any position on the user's retina. Therefore, according to the present embodiment, the user does not need to move the line of sight or adjust the focus in order to visually recognize the projected information.
  • With the image projection system 100D of the present embodiment, if the driver of a moving body such as a car wears the image projection apparatus 200 and traffic information or the like is projected onto a position designated in advance, the driver can see the traffic information without moving his or her line of sight.
  • Similarly, if a doctor or the like performing surgery wears the image projection apparatus 200 and the numerical values, information, and the like that must be referred to during the operation are used as projection information, the doctor can see the necessary information without taking his or her eyes off the operative field.
  • The image generation processing unit 310A, the projection setting unit 355, and the projection position table 360 of the present embodiment may be provided in a server outside the terminal device 300.
  • The image generation processing unit 310A and the projection setting unit 355 of the present embodiment may also be distributed to the terminal device 300 as an application from an external server.
  • FIG. 50 is a second diagram for explaining the function of the image generation processing unit of the fifth embodiment.
  • An image generation processing unit 310B illustrated in FIG. 50 includes a projection determination unit 311A, a projection target acquisition unit 312A, a projection position acquisition unit 313A, and an image data generation unit 314, and converts voice input from the outside into character information.
  • The converted character information is used as the projection information.
  • The projection target acquisition unit 312A of the present embodiment has a voice acquisition unit 315 and a character information conversion unit 316.
  • The voice acquisition unit 315 acquires, for example, voice input from a voice input device such as a microphone of the terminal device 300D.
  • The character information conversion unit 316 analyzes the voice acquired by the voice acquisition unit 315 and converts it into character information.
  • In the image generation processing unit 310B of the present embodiment, the character information obtained by the character information conversion unit 316 becomes the projection information. By projecting this projection information onto an arbitrary position in the user's visual field, the user can visually recognize the voice as characters.
  • With this configuration, even a user who has both a hearing impairment and a visual impairment can perceive voice information. Therefore, if this configuration is applied to, for example, audio guidance in an art museum or an audio guide for kabuki, the user can obtain information that would otherwise be difficult to obtain because of the impairment.
  • Furthermore, since the voice information can be projected as character information at any position in the visual field, the character information converted from the voice can be visually recognized without blocking the area the user is watching; a sketch of this pipeline is given below.
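  • A minimal sketch of this voice-to-caption pipeline; both stand-in functions and the fixed caption area are assumptions, and a real system would call an actual speech recognition engine inside to_text:

    CAPTION_AREA = ((100, 800), (900, 880))  # assumed area of the visual field

    def acquire_voice() -> bytes:
        # stand-in for the voice acquisition unit 315 (e.g. microphone input)
        return b"\x00" * 1600

    def to_text(audio: bytes) -> str:
        # stand-in for the character information conversion unit 316
        return "the guide is speaking"

    def project_voice_as_text() -> None:
        text = to_text(acquire_voice())
        image = {"text": text, "rect": CAPTION_AREA}  # projection information
        print("projecting:", image)                   # stand-in for projection

    project_voice_as_text()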
  • The sixth embodiment will be described below with reference to the drawings.
  • The sixth embodiment differs from the first embodiment in the shape of a mirror in the image projection apparatus. In the following description of the sixth embodiment, therefore, only the differences from the first embodiment are described, and components having the same functional configuration as in the first embodiment are given the same reference numerals as in the description of the first embodiment, and their description is omitted.
  • FIG. 51 is a top view of the image projector of the sixth embodiment.
  • The image projection apparatus 200A of the present embodiment can be applied to any of the image projection systems described in the first to fifth embodiments.
  • The image projection device 200A of the present embodiment includes a projection unit 210A and a control unit 230.
  • The projection unit 210A of the present embodiment includes a light source 211, a scanning mirror 212A, a reflection mirror 215A, and a projection mirror 216.
  • The projection unit 210A of this embodiment differs from the projection unit 210 described above in that it does not have the mirror 213 and the mirror 214, has the scanning mirror 212A instead of the scanning mirror 212, and has the reflection mirror 215A instead of the reflection mirror 215.
  • In the following description, the traveling direction of the light beam incident on the projection mirror 216 is taken as the X direction, and the direction orthogonal to the X direction in the projection mirror 216 is taken as the Y direction.
  • The scanning mirror 212A is, for example, a MEMS mirror, and scans the laser beam (light beam) L emitted from the light source 211 two-dimensionally in the horizontal and vertical directions. By scanning the light beam L emitted from the light source 211 two-dimensionally, the scanning mirror 212A obtains projection light for projecting an image onto the retina 261 of the eyeball 260 of the user.
  • The reflection mirror 215A reflects the light beam L scanned by the scanning mirror 212A toward the lens 251.
  • A projection mirror 216 having a free-form surface is provided on the surface of the lens 251 on the eyeball 260 side of the user.
  • The projection mirror 216 projects an image onto the retina 261 by irradiating the retina 261 of the eyeball 260 with the light beam L that has been scanned by the scanning mirror 212A and reflected by the reflection mirror 215A. That is, the user can recognize an image by the afterimage effect of the laser light projected onto the retina 261.
  • The projection mirror 216 is designed so that the convergence position of the light beam L scanned by the scanning mirror 212A is the pupil 262 of the eyeball 260.
  • The light beam L is incident on the projection mirror 216 from almost directly beside it (that is, approximately from the -X direction).
  • This makes it possible to shorten the distance from the reflection mirror 215A to the convergence position at the pupil 262, and the image projection apparatus 200A can therefore be miniaturized.
  • FIG. 52 is a diagram showing an optical path of a light beam in the image projector according to the comparative example.
  • In FIG. 52, the light beams L0 to L2 are light beams scanned in the horizontal direction by the scanning mirror 212A and irradiate the projection mirror 216 from the -X direction.
  • The light beam L0 corresponds to the center of the image, and the light beams L1 and L2 correspond to the edges of the image.
  • The light beams L0 to L2 are reflected by the regions R0 to R2 of the projection mirror 216, respectively.
  • The reflected light beams L0 to L2 converge at the pupil 262 located at the center of the iris 263, pass through the lens 264, and reach the retina 261.
  • The region R0 is the region that reflects the light beam L0 corresponding to the center of the image.
  • The region R1 is the region on the -X direction side of the region R0 (the direction from which the light beams L0 to L2 are incident).
  • The region R2 is the region on the +X direction side of the region R0. The light beams L0 to L2 cross near the pupil 262 for Maxwellian view. However, the in-focus positions F0 to F2 of the respective light beams L0 to L2 deviate from the retina 261.
  • The light beam L0 reflected by the projection mirror 216 is incident on the lens 264 as substantially parallel light and is focused near the retina 261.
  • The light beam L1 reflected by the projection mirror 216 is incident on the lens 264 as diverging light. For this reason, the light beam L1 is focused at a point farther away than the retina 261.
  • The light beam L2 reflected by the projection mirror 216 is incident on the lens 264 as converging light. For this reason, the light beam L2 is focused at a point closer than the retina 261.
  • That is, the in-focus position F1 is farther from the projection mirror 216 than the retina 261, by the distance D1 between the in-focus position F1 and the retina 261.
  • Likewise, the in-focus position F2 is closer to the projection mirror 216 than the retina 261, by the distance D2 between the in-focus position F2 and the retina 261.
  • This is because the projection mirror 216 is a free-form surface, and when the light beams L0 to L2 incident on the projection mirror 216 from the -X direction are made to converge at the pupil 262, the curvatures of the regions R0 to R2 of the projection mirror 216 differ in the X direction and/or an optical path difference arises among the light beams L0 to L2.
  • For example, the region R2 has a larger curvature than the region R1; that is, the region R2 has a larger condensing power than the region R1. Therefore, the in-focus position F2 is closer to the light source than the in-focus position F1.
  • The optical system is substantially symmetrical about the X axis in the Y direction, so the shift of the in-focus position that occurs in the X direction is less likely to occur in the Y direction.
  • In the present embodiment, therefore, the reflection mirror 215A is used as an optical component for correcting this shift of the in-focus positions.
  • FIG. 53 is a diagram showing the optical paths of the light beams in the image projection apparatus of the sixth embodiment, and FIG. 54 is an enlarged view of the vicinity of the reflection mirror in FIG. 53.
  • As shown in FIGS. 53 and 54, the light beams L0 to L2 that irradiate the regions R0 to R2 of the projection mirror 216 are reflected by the regions S0 to S2 of the reflection mirror 215A, respectively.
  • The reflection mirror 215A has a free-form surface.
  • The rest of the configuration is the same as in the comparative example described above, and its description is omitted.
  • FIG. 55 is a perspective view showing the unevenness of the surface of the reflection mirror in the sixth embodiment, and FIG. 56 is a view showing the height Z of the reflection mirror in the X direction.
  • In FIGS. 55 and 56, the X direction and the Y direction correspond to the X direction and the Y direction in the projection mirror 216.
  • The height of the reflection mirror 215A is taken in the Z direction, and the unevenness of the surface of the reflection mirror 215A is shown exaggerated in the Z direction.
  • In the area S0, the surface of the reflection mirror 215A is substantially flat; in the area S1, the surface is concave; and in the area S2, the surface is convex.
  • Accordingly, the condensing power is approximately 0 in the region S0, positive in the region S1, and negative in the region S2. The in-focus position F0 of the light beam L0 therefore does not change from the comparative example.
  • On the other hand, the in-focus position F1 of the light beam L1 is closer to the light source than in the comparative example of FIG. 52, and the in-focus position F2 of the light beam L2 is farther from the light source than in FIG. 52. As a result, the in-focus positions F0 to F2 all lie in the vicinity of the retina 261.
  • The surface of the reflection mirror 215A is a free-form surface whose height Z is expressed by the following equation.
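  • The equation itself is not reproduced in this text. A standard polynomial free-form expansion consistent with the coefficient discussion that follows would take the form (a reconstruction, not the formula as filed):

    Z(X, Y) = \sum_{i} \sum_{j} a_{ij} X^{i} Y^{j}

  where the a_{ij} are the free-form coefficients referred to below.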
  • The condensing power in the Y direction at the projection mirror 216 is symmetrical with respect to the X axis. Therefore, the coefficients a_ij of the terms in which j is odd are set to 0.
  • In the present embodiment, the coefficients a30 and a12 are set to finite values. A free-form surface as shown in FIGS. 55 and 56 can thereby be realized.
  • The coefficients a10 and/or a20 may also be set to finite values. Furthermore, higher-order coefficients may be set to finite values.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)

Abstract

The invention relates to a terminal device (300) comprising: a projection target holding unit for holding projection information to be projected by an image projection device (200); a storage unit storing position information indicating a position at which the projection information is to be projected onto the retina of a user (P); a position information acquisition unit for acquiring the position information from the storage unit; an image data generation unit that generates image data (D) for an image (202G) in which the held projection information is projected at the position indicated by the position information; and an image output processing unit that transmits the image data (D) to the image projection device (200). The image projection device (200) comprises: a light source unit for emitting a light beam; an image input unit for inputting the image data (D) from the terminal device (300); a control unit that generates an image light beam based on the input image data (D) and controls the emission of the image light beam from the light source unit; a scanning mirror for scanning the image light beam; and a projection unit that projects the image light beam onto the retina of an eyeball of the user (P) as the image (202G) indicated by the image data (D).
PCT/JP2017/021380 2016-06-09 2017-06-08 Image projection system, image projection device, image projection method, image projection program, and server device WO2017213241A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/306,966 US10409146B2 (en) 2016-06-09 2017-06-08 Image projection system, image projection device, image projection method, image projection program, and server device
CN201780034880.6A CN109313343B (zh) 2016-06-09 2017-06-08 Image projection system, image projection device, image projection method, and server device
EP17810414.7A EP3470909B1 (fr) 2016-06-09 2017-06-08 Image projection system and image projection method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2016115046 2016-06-09
JP2016-115046 2016-06-09
JP2017111151A JP6255524B2 (ja) 2016-06-09 2017-06-05 Image projection system, image projection device, image projection method, image projection program, and server device
JP2017-111151 2017-06-05

Publications (1)

Publication Number Publication Date
WO2017213241A1 true WO2017213241A1 (fr) 2017-12-14

Family

ID=60578727

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/021380 WO2017213241A1 (fr) 2016-06-09 2017-06-08 Image projection system, image projection device, image projection method, image projection program, and server device

Country Status (1)

Country Link
WO (1) WO2017213241A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010066599A * 2008-09-11 2010-03-25 Brother Ind Ltd Head-mounted display
JP2012108379A * 2010-11-18 2012-06-07 Nec Corp Information display system, device, method, and program
JP2014120963A * 2012-12-18 2014-06-30 Seiko Epson Corp Display device, head-mounted display device, display device control method, and head-mounted display device control method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
See also references of EP3470909A4 *
YASUYUKI MURAI ET AL.: "Development of the fitting method of HMD (eyesight aid) to support visibility of low vision", DAI 14 KAI FORUM ON INFORMATION TECHNOLOGY KOEN RONBUNSHU, vol. 3, 24 August 2015 (2015-08-24), pages 545 - 546, XP055456084 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021111792A1 (fr) * 2019-12-02 2021-06-10 Sony Group Corporation Display device and display system

Similar Documents

Publication Publication Date Title
JP6255524B2 (ja) Image projection system, image projection device, image projection method, image projection program, and server device
US11686945B2 (en) Methods of driving light sources in a near-eye display
JP6255522B2 (ja) Visual field/visual acuity test system, visual field/visual acuity test device, visual field/visual acuity test method, visual field/visual acuity test program, and server device
US10852817B1 (en) Eye tracking combiner having multiple perspectives
Kress Digital optical elements and technologies (EDO19): applications to AR/VR/MR
JP2009268778A (ja) Image display device, image display method, program, and integrated circuit
US20210022606A1 (en) Image processing method, program, image processing device, and ophthalmic system
RU2709114C1 (ru) Способ проверки зрения, устройство для проверки зрения и сервер для скачивания для хранения программы или способа проверки зрения
WO2017213200A1 (fr) Visual field/visual acuity test system, visual field/visual acuity test device, visual field/visual acuity test method, visual field/visual acuity test program, and server device
WO2017213241A1 (fr) Image projection system, image projection device, image projection method, image projection program, and server device
JP3785539B2 (ja) Wide-viewing-area retinal projection display system
JP2010117541A (ja) Beam-scanning display device
US11918286B2 (en) Ophthalmic instrument, management method, and management device
US10771774B1 (en) Display apparatus and method of producing images having spatially-variable angular resolutions
US12073018B2 (en) Multiple gaze dependent illumination sources for retinal eye tracking
JP7133163B2 (ja) Retinal scanning image projection device, retinal scanning image projection method, and retinal scanning image projection system
WO2018016223A1 (fr) Image projection device, image projection system, server, image projection method, and image projection program
EP3893088A1 (fr) Modification du contenu affiché
JP7123403B2 (ja) Image inspection device
US20210121059A1 (en) Ophthalmic instrument, management device, and method for managing an ophthalmic instrument
JP2019097806A (ja) Retinal scanning visual acuity test device, retinal scanning visual acuity test system, and retinal scanning visual acuity test method
Weir et al. Developing a user-centered accessible virtual reality video environment for severe visual disabilities
Kageyama et al. Visual field self-evaluation by free-focus retinal-scanning laser display technology
JP7534420B2 (ja) Augmented and virtual reality display systems for oculometric assessments
JP2019181004A (ja) Image generation device, image generation method, and image generation program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17810414

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017810414

Country of ref document: EP

Effective date: 20190109