WO2018016223A1 - Image projection device, image projection system, server, image projection method, and image projection program - Google Patents

Image projection device, image projection system, server, image projection method, and image projection program

Info

Publication number
WO2018016223A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
correction
image data
laser light
correction information
Prior art date
Application number
PCT/JP2017/021452
Other languages
English (en)
Japanese (ja)
Inventor
菅原 充
Original Assignee
株式会社Qdレーザ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Qdレーザ
Publication of WO2018016223A1

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/02 - Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/02 - Viewing or reading apparatus
    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 21/00 - Projectors or projection-type viewers; Accessories therefor
    • G03B 21/14 - Details
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 - Details of television systems
    • H04N 5/74 - Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • The present invention relates to an image projection apparatus, an image projection system, a server, an image projection method, and an image projection program.
  • There is known an image projection apparatus that projects an imaging light beam based on image data directly onto the retina of a person, thereby causing the person to visually recognize the image represented by the image data without being affected by the function of the person's crystalline lens.
  • The disclosed technology has been made in view of the above circumstances, and aims to provide an image projection apparatus, an image projection system, a server, an image projection method, and an image projection program capable of displaying an image in which the distortion perceived by a user whose visual field is distorted has been corrected.
  • The disclosed technology is an image projection apparatus that includes: a light source that emits laser light; an image input unit that receives input of image data; a control unit that generates image laser light based on the input image data and controls its emission from the light source; a scanning mirror that scans the image laser light; a projection mirror that projects the image laser light onto the retina of the user's eye as the image represented by the image data; and an operation unit for operating a pointer projected together with the image.
  • The control unit includes: a correction image data holding unit that holds correction image data representing a correction image; a correction image data output unit that generates image laser light based on the correction image data and causes the light source to emit it; an operation reception unit that receives operations of the operation unit on the pointer projected onto the user's retina together with the correction image; a correction information generation unit that generates, according to the operations, correction information for correcting the distortion of the image as visually recognized by the user; a correction information holding unit that holds the generated correction information; and an image correction unit that corrects the input image data using the correction information.
  • Thereby, an image is displayed in which the distortion perceived due to a distorted visual field has been corrected.
  • FIG. 1 is a view for explaining an outline of an image projection apparatus in the first embodiment.
  • The image projection apparatus 100 projects the correction image G onto the user's retina in order to generate correction information used for displaying (projecting) an image in which the distortion perceived due to the distortion of the user's visual field has been corrected.
  • the image projection apparatus 100 of the present embodiment has the operation unit 11 and projects the pointer P operated by the operation unit 11 onto the correction image G.
  • the operation unit 11 of the present embodiment is a pointing device, and may be, for example, a track ball or the like disposed on the temple 150 of the eyeglass-type frame.
  • the operation unit 11 according to the present embodiment may be a pointing device other than a trackball, and may be provided as an external device capable of communicating with the image projection apparatus 100.
  • The image projection apparatus 100 of the present embodiment is a retinal-projection head-mounted display using the Maxwellian view.
  • In the Maxwellian view, the imaging light beam (laser light) based on image data is first converged at the center of the pupil and then projected onto the retina, so that the person is made to visually recognize the image represented by the image data without being affected by the function of the person's crystalline lens.
  • An image projected directly onto the human retina is visually recognized as the image represented by the image data if the retina functions normally. However, when there is a problem with the function of the retina, the optic nerve, or the like, the image projected onto the retina is visually recognized differently from the image represented by the image data.
  • For example, when the retina is deformed, the projected image is visually recognized as an image deformed in accordance with the shape of the retina.
  • In the present embodiment, therefore, correction information reflecting the shape (distortion) of the user's retina is obtained in advance, and the image represented by the image data corrected using the correction information is projected onto the user's retina.
  • the correction image G is, for example, an Amsler chart or the like for examining the degree of distortion of the field of view.
  • The correction image G need not be an Amsler chart, and may be any image that contains a grid formed by a plurality of straight lines.
  • The user operates the pointer P with the operation unit 11 and deforms the lines that appear distorted in the correction image G so that they appear straight.
  • From the image data of the correction image G before the operation and the image data of the correction image G after the operation, the image projection apparatus 100 generates and holds correction information for correcting the image that is visually recognized as distorted due to the distortion of the user's visual field.
  • Thereafter, when an image is projected, its distortion is corrected using this correction information.
  • The image projection apparatus 100 of the present embodiment corrects the image data based on the correction information and projects the corrected image data onto the user's retina. In other words, the corrected image data represents an image distorted so as to cancel out the distortion of the user's visual field, so that the distortion is corrected when the image is projected onto the user's retina.
  • In this manner, correction information for correcting an image that is visually recognized as distorted due to distortion of the visual field is generated, and when image data is projected, corrected image data based on the correction information is projected onto the user's retina. Therefore, according to the present embodiment, it is possible to display an image in which the distortion perceived due to the distortion of the visual field has been corrected.
  • FIG. 2 is a top view of the image projection apparatus 100.
  • the image projector 100 according to this embodiment includes a projection unit 110 and a control unit 130.
  • the projection unit 110 of this embodiment includes a light source 111, a scanning mirror 112, a mirror 113, a mirror 114, a mirror 115, and a projection mirror 116.
  • the light source 111 is disposed on the temple 150 of the eyeglass frame.
  • the light source 111 emits a light beam L of, for example, a single or a plurality of wavelengths under the instruction of the control unit 130.
  • the light ray L is an image light ray for projecting an image on the retina 161 of the eye 160 of the user. In the following description, the light ray L is called an image light ray.
  • The light source 111 emits, for example, red laser light (wavelength: about 610 nm to about 660 nm), green laser light (wavelength: about 515 nm to about 540 nm), and blue laser light (wavelength: about 440 nm to about 480 nm).
  • The light source 111 of the present embodiment is realized, for example, as a light source in which RGB (red, green, and blue) laser diode chips, a three-color combining device, and a micro-collimating lens are integrated.
  • the scanning mirror 112 is disposed on the temple 150 of the eyeglass frame.
  • the scanning mirror 112 scans the imaging light beam emitted from the light source 111 in the horizontal direction and the vertical direction.
  • the scanning mirror 112 is, for example, a micro electro mechanical system (MEMS) mirror.
  • the imaging light beam emitted from the light source 111 is reflected by, for example, the mirror 113 and the mirror 114 and enters the scanning mirror 112.
  • the control unit 130 of the present embodiment includes an arithmetic processing unit such as a central processing unit (CPU) and a storage device such as a random access memory (RAM) and a read only memory (ROM). Details of the control unit 130 will be described later.
  • the control unit 130 may be mounted on, for example, the same substrate as the substrate on which the scanning mirror 112 (MEMS mirror) is mounted. Also, the control unit 130 may be provided in an external device connected to the image projection apparatus 100.
  • the control unit 130 of the present embodiment controls the projection unit 110.
  • the control unit 130 causes the light source 111 to emit an image light beam based on the input image data. Further, the control unit 130 of the present embodiment vibrates the scanning mirror 112 (MEMS mirror), scans the image light beam emitted from the light source 111, and projects the image on the retina 161.
  • FIG. 3 is an enlarged view of the vicinity of the projection unit of the image projection apparatus.
  • the imaging light beam scanned by the scanning mirror 112 is reflected by the mirror 115 toward the lens 151 of the eyeglass frame.
  • The projection mirror 116 is disposed on the surface of the lens 151 on the eyeball 160 side, and the imaging light beam scanned by the scanning mirror 112 is incident on the projection mirror 116.
  • the projection mirror 116 is a half mirror which has a free-form surface or a composite structure of a free-form surface and a diffractive surface in a region 116 a on the surface where the imaging light beam is incident. As a result, the imaging light beam incident on the projection mirror 116 is converged in the vicinity of the pupil 162 of the eyeball 160 and then projected onto the retina 161.
  • Thereby, the user can visually recognize the image formed by the imaging light beam and can also visually recognize the external scene through the half mirror (see-through).
  • FIG. 4 is a diagram for explaining the vibration of the scanning mirror.
  • FIG. 4 shows the case where the scanning mirror 112 vibrates from point A to point B.
  • As a method of scanning the imaging light beam with the scanning mirror 112 and projecting an image onto the retina 161, there is a method of scanning light at high speed from the upper left to the lower right of the area onto which the image is projected (for example, raster scanning).
  • In the present embodiment, in order to scan the imaging light beam (light beam L), the scanning mirror 112 vibrates in the horizontal direction (first direction) and in the vertical direction (second direction intersecting the first direction) over a range larger than the region H (the broken-line range in FIG. 4) in which the image is projected onto the retina 161.
  • The vibration of the scanning mirror 112 is indicated by reference numeral 50.
  • The imaging light beam is scanned where the deflection of the scanning mirror 112 is small.
  • FIG. 4 shows an example in which the imaging light beam is scanned in a rectangular shape; the present invention is not limited to this case, and other scanning shapes, such as a trapezoidal scan, may be used.
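As a rough illustration of this scanning scheme, the sketch below emits pixel data only while the scan angles lie inside region H and keeps the laser off near the turning points where the deflection is large. This is a minimal sketch, not the patented implementation; the region size, resolution, and function name are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: the mirror swings over a range larger than region H, and pixel
# data is emitted only while the scan angles lie inside H. H_X, H_Y, WIDTH, HEIGHT
# and the function name are assumptions, not values from the patent.
H_X, H_Y = 0.8, 0.8            # half-extent of region H as a fraction of the full mirror swing
WIDTH, HEIGHT = 640, 480       # resolution of the input image data

def pixel_for_angles(theta_x, theta_y, image):
    """Return the pixel to emit for the instantaneous mirror angles
    (normalized to [-1, 1]), or None when the beam is outside region H."""
    if abs(theta_x) > H_X or abs(theta_y) > H_Y:
        return None                                   # outside H: laser off
    col = int((theta_x / H_X + 1.0) / 2.0 * (WIDTH - 1))
    row = int((theta_y / H_Y + 1.0) / 2.0 * (HEIGHT - 1))
    return image[row, col]

# One horizontal sweep of a sinusoidal scan: emission stops near the turning
# points (|theta_x| close to 1), where the mirror deflection is largest.
image = np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint8)
for theta_x in np.cos(np.linspace(0.0, np.pi, 1000)):
    value = pixel_for_angles(theta_x, 0.0, image)
```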
  • FIG. 5 is a diagram for explaining an example of the hardware configuration of the control unit.
  • the control unit 130 of the present embodiment includes an operation unit 11, a memory 12, a CPU 13 and an interface unit 14 which are mutually connected by a bus B.
  • The operation unit 11 is used to input various types of information to the control unit 130; for example, the content of operations performed with the operation unit 11 is input to the control unit 130.
  • the memory 12 is realized by a ROM, a RAM, and the like, and stores an arithmetic processing result by the CPU 13, an image projection program, and the like.
  • the CPU 13 implements various processes as described later according to the image projection program.
  • the interface unit 14 is used to connect the image projection apparatus 100 to a network.
  • image data to be projected may be input through the interface unit 14.
  • The correction information generation program and the image correction program executed in the image projection apparatus 100 form at least a part of the image projection program for controlling the image projection apparatus 100.
  • The correction information generation program and the image correction program are provided, for example, by distributing a recording medium or by downloading them from a network.
  • Various types of recording media can be used as the recording medium on which the correction information generation program and the image correction program are recorded, such as media that record information optically, electrically, or magnetically (a CD-ROM, a flexible disk, a magneto-optical disk, and the like) and semiconductor memories that record information electrically (a ROM, a flash memory, and the like).
  • FIG. 6 is a diagram for explaining the function of the control unit of the first embodiment.
  • the control unit 130 of the present embodiment includes a correction information generation processing unit 300 and an image correction processing unit 400.
  • the correction information generation processing unit 300 is realized by the CPU 13 of the control unit 130 executing a correction information generation program stored in the memory 12.
  • the image correction processing unit 400 is realized by the CPU 13 of the control unit 130 executing an image correction program stored in the memory 12.
  • the correction information generation program and the image correction program are included in the image projection program.
  • The correction information generation processing unit 300 includes a correction image data holding unit 310, a correction image data output unit 320, an operation reception unit 330, a correction information generation unit 340, and a correction information holding unit 350, and generates correction information for each user according to the user's operations.
  • the correction image data holding unit 310 of the present embodiment holds correction image data indicating the correction image G.
  • the correction image data output unit 320 outputs the correction image data to the projection unit 110.
  • the correction image data output unit 320 generates an image light beam based on the correction image data and causes the light source 111 to emit the light beam.
  • the correction image data output unit 320 changes the position of the image (straight line) to be projected according to the operation received by the operation reception unit 330.
  • the operation receiving unit 330 receives an operation from the operation unit 11 and notifies the correction image data output unit 320 of the content of the operation.
  • When the operation reception unit 330 completes the reception of operations, the correction information generation unit 340 generates correction information from the correction image data after the operation and the correction image data before the operation.
  • the correction information holding unit 350 holds the generated correction information.
  • the image correction processing unit 400 of the present embodiment performs correction based on the correction information on the image data to be projected.
  • the image correction processing unit 400 includes an image data acquisition unit 410, a correction information reading unit 420, an image correction unit 430, and an image data output unit 440.
  • the image data acquisition unit 410 acquires the image data for which the projection request has been received.
  • the image data acquisition unit 410 may acquire image data from, for example, an external server or a recording medium via the interface unit 14.
  • the correction information reading unit 420 reads the correction information held in the correction information holding unit 350.
  • the image correction unit 430 corrects the image data based on the acquired image data and the correction information.
  • the image data output unit 440 outputs the corrected image data to the projection unit 110. In other words, the image data output unit 440 generates an image light beam based on the corrected image data and causes the light source 111 to emit the light beam.
  • FIG. 7 is a flowchart for explaining the processing of the correction information generation processing unit of the first embodiment.
  • the correction information generation processing unit 300 determines whether a request for generation of correction information has been received (step S701). In step S701, when the generation request is not received, the correction information generation processing unit 300 waits until the generation request is received.
  • When the generation request is received in step S701, the correction information generation processing unit 300 causes the correction image data output unit 320 to read out the correction image data held by the correction image data holding unit 310 (step S702).
  • the correction information generation processing unit 300 causes the correction image data output unit 320 to project the correction image data on the user's retina (step S703).
  • the correction information generation processing unit 300 determines, with the operation receiving unit 330, whether or not an operation on the projected correction image G has been received (step S704). If the operation is not accepted in step S704, the process waits until the operation is accepted. At this time, if the correction information generation processing unit 300 does not receive an operation even after a predetermined time has elapsed, the processing may be ended.
  • When an operation is received in step S704, the operation reception unit 330 notifies the correction image data output unit 320 of the content of the operation, and the correction image data output unit 320 changes the projection position of the operated portion in the correction image G (step S705).
  • the correction information generation processing unit 300 determines whether or not the reception of the operation by the operation reception unit 330 is completed (step S706).
  • The operation reception unit 330 may complete the acceptance of operations when a predetermined time has passed without an operation being received, or may complete it upon receiving an operation indicating that the operations are finished.
  • If the acceptance of operations has not been completed in step S706, the correction information generation processing unit 300 returns to step S705.
  • When the acceptance of operations is completed in step S706, the correction information generation processing unit 300 causes the correction information generation unit 340 to generate correction information from the difference between the correction image data before the operation and the correction image data after the operation, holds it in the correction information holding unit 350 (step S707), and ends the process.
  • FIG. 8 is a diagram for explaining generation of correction information.
  • FIG. 8A shows an example of the correction image.
  • FIG. 8B is a view showing an example of how a correction image looks to a user who has a distortion in the visual field.
  • FIG. 8C is a view for explaining the operation on the correction image.
  • the correction image G of the present embodiment is a grid-like image formed by a plurality of straight lines, and has a gaze point S at the central portion.
  • The correction image G projected onto the retina is normally visually recognized as a lattice-like image formed by vertical lines and horizontal lines, as shown in FIG. 8(A).
  • When the retina is deformed, however, the correction image G projected onto the retina is distorted in accordance with the shape of the retina and is visually recognized, for example, as the image shown in FIG. 8(B).
  • Using the operation unit 11, the user corrects the correction image G that is visually recognized in a distorted state so that it forms a grid without distortion for the user.
  • For example, the line L81 is not visually recognized as a straight line but as a distorted line. The user therefore selects the line L81 with the operation unit 11 and deforms it so that it appears to the user as a straight line without distortion.
  • the straight line L81 is selected by the pointer P, and the straight line L81 is deformed into a line L81 'which is perceived as a straight line by the user.
  • the straight line L82 is selected by the pointer P, and the straight line L82 is transformed into a line L82 'that is perceived as a straight line by the user.
  • The correction image G may be projected with a number or other indicator associated with each line so that the line to be selected can be specified.
  • the user can specify the operation target line by selecting this number.
  • In this manner, the distortion of the correction image G that has been visually recognized as shown in FIG. 8(B) is corrected until the distortion becomes acceptable to the user.
  • the correction image G after the operation is an image obtained by deforming the correction image G before the operation.
  • FIG. 9 is a view showing an example of the correction image after the operation.
  • In the correction image Ga after the operation, the lines around the gaze point S are distorted so as to offset the distortion shown in FIG. 8(B).
  • the correction image Ga after the operation is an image corrected so as to eliminate distortion when projected onto the retina of the user who has performed the correction.
  • The correction information generation unit 340 of the present embodiment generates correction information from the correction image G before the operation shown in FIG. 8(A) and the correction image Ga after the operation shown in FIG. 9.
  • The correction information generation unit 340 of the present embodiment stores, as correction information, the coordinates before the operation of each point moved with the pointer P in the correction image G in association with the coordinates of that point after the operation.
  • For example, the correction information generation unit 340 stores the coordinates of the point M1 indicated by the pointer P in association with the coordinates of the point M1a to which the point M1 has been moved by the operation (see FIGS. 8(B) and 8(C)).
  • FIG. 10 is a diagram showing an example of the correction information.
  • the correction information 101 is held by the correction information holding unit 350.
  • the coordinates (x1, y1) of the point M1 before the operation and the coordinates (x1a, y1a) of the point M1a after the point M1 is moved by the operation are stored in association with each other.
  • the correction information generation unit 340 causes the correction information holding unit 350 to hold the coordinates before the operation and the coordinates after the operation for all the points moved by the operation of the pointer P.
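A minimal sketch of this correction information follows, under the assumption that each entry simply pairs a point's coordinates before the operation with its coordinates after the operation; the type and function names are illustrative, not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class CorrectionEntry:
    before: Tuple[float, float]   # coordinates (x1, y1) of a point before the operation
    after: Tuple[float, float]    # coordinates (x1a, y1a) of the same point after the operation

def generate_correction_info(points_before: List[Tuple[float, float]],
                             points_after: List[Tuple[float, float]]) -> List[CorrectionEntry]:
    """Pair each grid point of the correction image before the operation with the
    corresponding point after the operation; only points that were moved are held."""
    return [CorrectionEntry(b, a)
            for b, a in zip(points_before, points_after) if b != a]

# Example with a single moved point M1 -> M1a:
correction_info = generate_correction_info([(120.0, 80.0)], [(118.0, 83.5)])
```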
  • FIG. 11 is a flowchart for explaining the processing of the image correction processing unit according to the first embodiment.
  • the image data acquisition unit 410 determines whether a projection request for an image has been received (step S1101). In step S1101, when the projection request is not received, the image correction processing unit 400 stands by until the projection request is received.
  • When the projection request is received in step S1101, the image data acquisition unit 410 acquires the image data for which the projection request has been received (step S1102). In other words, the image data acquisition unit 410 receives an input of the image data to be projected.
  • the image correction processing unit 400 causes the correction information reading unit 420 to read the correction information 101 held in the correction information holding unit 350 (step S1103). Subsequently, the image correction processing unit 400 causes the image correction unit 430 to correct the acquired image data using the correction information 101 (step S1104).
  • Specifically, in the image indicated by the image data, the image correction unit 430 moves the pixel located at the coordinates of a point before the operation in the correction information 101 to the position given by the coordinates of that point after the operation.
  • For example, the pixel at the position of the point M1 is moved to the point M1a.
  • In the present embodiment, the projection area of the correction image G and the projection area of the image projected in response to the projection request are the same. Therefore, a coordinate in the correction information 101 designates the same point in the projection area of the correction image G and in the projection area of the image projected in response to the projection request.
  • The image correction processing unit 400 passes to the image data output unit 440, as projection image data, the image data corrected so that each pixel specified by a coordinate before the operation included in the correction information 101 is moved to the corresponding coordinate after the operation.
  • The image data output unit 440 outputs the projection image data to the projection unit 110, which projects it onto the user's retina (step S1105), and the process ends.
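A minimal sketch of this per-point correction step follows, reusing the CorrectionEntry structure sketched above. It is not the patented implementation; in practice a dense displacement field would be interpolated between the stored points, which is omitted here.

```python
import numpy as np

def apply_correction(image: np.ndarray, correction_info) -> np.ndarray:
    """Move each pixel at a before-operation coordinate to the corresponding
    after-operation coordinate, producing the projection image data."""
    corrected = image.copy()
    height, width = image.shape[:2]
    for entry in correction_info:               # entries as in the earlier CorrectionEntry sketch
        (x1, y1), (x1a, y1a) = entry.before, entry.after
        if 0 <= int(y1a) < height and 0 <= int(x1a) < width:
            corrected[int(y1a), int(x1a)] = image[int(y1), int(x1)]
    return corrected

# Usage: projection_data = apply_correction(input_image, correction_info)
```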
  • FIG. 12 is a diagram showing an example of projection image data.
  • An image 121A illustrated in FIG. 12A illustrates an example of image data that has received a projection request before correction.
  • FIG. 12B shows an example of how the image 121A appears to the user who generated the correction information 101.
  • An image 121C illustrated in FIG. 12C illustrates an example of an image projected by projection image data in which the image 121A is corrected by the correction information 101.
  • In the present embodiment, projection image data for projecting the image 121C, in which the distortion of the user's visual field is offset using the correction information 101, is generated.
  • the user can visually recognize the image 121C as an image close to the image shown in the image 121A. Therefore, according to the present embodiment, it is possible to display an image obtained by correcting an image that is viewed as distorted due to distortion of the field of view of the user as shown in FIG. 12 (B).
  • the second embodiment will be described below with reference to the drawings.
  • the second embodiment is different from the first embodiment in that the correction information generation processing unit 300 and the image correction processing unit 400 are provided in a server. Therefore, in the following description of the second embodiment, components having the same functional configuration as the first embodiment will be assigned the same reference numerals as in the first embodiment, and the description thereof will be omitted.
  • FIG. 13 is a diagram showing an example of a system configuration of the image projection system of the second embodiment.
  • the image projection system 500 of the present embodiment includes an image projection device 100A and a server 600.
  • the server 600 includes a correction information generation processing unit 300, an image correction processing unit 400, and a correction information database 101A.
  • correction information generation processing unit 300 and the image correction processing unit 400 are the same as in the first embodiment.
  • When the image projection apparatus 100A of the present embodiment receives the correction image data corresponding to the correction image G from the server 600, it projects the correction image G onto the user's retina. Then, upon receiving an operation on the correction image G through the operation unit 11, the image projection apparatus 100A notifies the server 600 of the content of the operation. The server 600 receives this notification and generates correction information.
  • the generated correction information is stored in the correction information database 101A in association with the identifier for identifying the user. Details of the correction information database 101A will be described later.
  • When the image projection apparatus 100A receives a request to project an image, it notifies the server 600 of the request.
  • The server 600 causes the image correction processing unit 400 to acquire the image data for which the projection request was received and to correct it with reference to the correction information in the correction information database 101A that corresponds to the user who made the request. The server 600 then causes the image correction processing unit 400 to output the corrected image data to the image projection apparatus 100A as projection image data.
  • By having the server 600 execute the processing of the correction information generation processing unit 300 and the image correction processing unit 400 in this way, the processing load on the image projection apparatus 100A can be reduced.
  • the image projection apparatus 100A can be shared by a plurality of different users.
  • FIG. 14 is a diagram showing an example of the correction information database.
  • a user ID is associated with correction information.
  • the user ID is an identifier for identifying the user of the image projection apparatus 100A.
  • the correction information may be associated with information indicating the date and time when the correction information was generated. If the information indicating the date and time is associated with the correction information, the image correction processing unit 400 can correct the image data based on the latest correction information.
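A minimal sketch of such a database follows, under the assumption that each record holds a user ID, a generation timestamp, and the correction information; the structure and names are illustrative, not from the patent.

```python
from datetime import datetime
from typing import Dict, List, Tuple

CorrectionRecord = Tuple[datetime, list]                 # (generated_at, correction_info)
correction_db: Dict[str, List[CorrectionRecord]] = {}    # keyed by user ID

def store_correction(user_id: str, correction_info: list) -> None:
    correction_db.setdefault(user_id, []).append((datetime.now(), correction_info))

def latest_correction(user_id: str) -> list:
    """Return the most recently generated correction information for the user,
    which would then be used to correct the image data."""
    return max(correction_db[user_id], key=lambda record: record[0])[1]
```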
  • the third embodiment will be described below with reference to the drawings.
  • the third embodiment is different from the second embodiment in that a correction information generation processing unit 300 is provided in the image projection apparatus, and an image correction processing unit 400 is provided in a server. Therefore, in the following description of the third embodiment, components having the same functional configuration as that of the second embodiment are denoted by the same reference numerals as those of the second embodiment, and the description thereof is omitted.
  • FIG. 15 is a diagram showing an example of a system configuration of an image projection system according to the third embodiment.
  • An image projection system 500A of the present embodiment includes an image projection device 100B and a server 600A.
  • The image projection device 100B includes a correction information generation processing unit 300.
  • the server 600A of the present embodiment has an image correction processing unit 400 and a correction information database 101A.
  • When the correction information generation processing unit 300 generates correction information, the generated correction information is transmitted to the server 600A.
  • the server 600A associates the correction information with the user information of the user corresponding to the correction information, and stores the correction information in the correction information database 101A.
  • the user ID corresponding to the correction information may be an identifier for identifying the image projection apparatus 100B, or may be an ID unique to the user. If the ID is unique to the user, the user ID may be input by the user, for example, when storing the correction information in the correction information database 101A.
  • When the server 600A receives a projection request for image data from the image projection apparatus 100B, it refers to the correction information database 101A, corrects the image data for which the projection request was received based on the correction information corresponding to the user who made the request, and outputs the corrected data to the image projection apparatus 100B as projection image data.
  • the image projection apparatus 100B can generate correction information without communicating with the server 600A. Furthermore, in the present embodiment, by providing the image correction processing unit 400 in the server 600A, it is possible to reduce the load of processing concerning correction of image data in the image projection apparatus 100B.
  • the fourth embodiment will be described below with reference to the drawings.
  • The fourth embodiment differs from the first embodiment in the shape of a mirror in the image projection apparatus. Therefore, in the following description of the fourth embodiment, only the differences from the first embodiment will be described, and for components having the same functional configuration as the first embodiment, the description of the first embodiment applies.
  • FIG. 16 is a top view of the image projector of the fourth embodiment.
  • the image projection apparatus 100B of the present embodiment can be applied to the image projection system described in the first and second embodiments.
  • An image projection apparatus 100B of the present embodiment includes a projection unit 110A and a control unit 130.
  • the projection unit 110A of the present embodiment includes a light source 111, a scanning mirror 112A, a reflection mirror 115A, and a projection mirror 116.
  • The projection unit 110A of this embodiment differs from the projection unit 110 of the first embodiment in that it does not have the mirror 113 and the mirror 114, has the scanning mirror 112A instead of the scanning mirror 112, and has the reflection mirror 115A instead of the mirror 115.
  • In the projection mirror 116, the traveling direction of the light beam incident on the projection mirror 116 is taken as the X direction, and the direction orthogonal to the X direction is taken as the Y direction.
  • the scanning mirror 112A is, for example, a MEMS mirror, and scans the laser beam (light beam) L emitted from the light source 111 in the two-dimensional direction of the horizontal direction and the vertical direction. In addition, the scanning mirror 112A two-dimensionally scans the light beam L emitted from the light source 111 to obtain projection light for projecting an image on the retina 161 of the eye 160 of the user.
  • the reflection mirror 115A reflects the light beam L scanned by the scanning mirror 112A toward the lens 151.
  • a projection mirror 116 having a free-form surface is provided on the surface of the lens 151 on the side of the eye 160 of the user.
  • the projection mirror 116 projects an image on the retina 161 by irradiating the light beam L scanned by the scanning mirror 112A and reflected by the reflection mirror 115A onto the retina 161 of the eyeball 160. That is, the user can recognize an image by the afterimage effect of the laser beam projected onto the retina 161.
  • the projection mirror 116 is designed such that the convergence position of the light beam L scanned by the scanning mirror 112A is the pupil 162 of the eye 160.
  • The light beam L is incident on the projection mirror 116 almost from the side (that is, approximately from the -X direction).
  • Thereby, the distance from the reflection mirror 115A to the convergence position at the pupil 162 can be shortened, and the image projection apparatus 100B can be miniaturized.
  • FIG. 17 is a view showing an optical path of a light beam in the image projector according to the comparative example.
  • The light beams L0 to L2 are light beams scanned in the horizontal direction by the scanning mirror 112A and are irradiated onto the projection mirror 116 from the -X direction.
  • a ray L0 is a ray corresponding to the center of the image
  • rays L1 and L2 are rays corresponding to the edge of the image.
  • the rays L0 to L2 are reflected at the regions R0 to R2 of the projection mirror 116, respectively.
  • The reflected light beams L0 to L2 converge at the pupil 162 located at the center of the iris 163, pass through the lens 164, and reach the retina 161.
  • Region R0 is a region that reflects light ray L0 corresponding to the center of the image.
  • the region R1 is a region from the region R0 in the -X direction (the direction in which the light beams L0 to L2 are incident).
  • the region R2 is a region in the + X direction from the region R0.
  • The light beams L0 to L2 intersect near the pupil 162 so as to realize the Maxwellian view. However, the in-focus positions F0 to F2 of the respective light beams L0 to L2 deviate from the retina 161.
  • The light beam L0 reflected by the projection mirror 116 is incident on the lens 164 as substantially parallel light and is focused near the retina 161.
  • the light beam L1 reflected by the projection mirror 116 enters the lens 164 as diffused light. For this reason, the light beam L1 is focused farther than the retina 161.
  • the light beam L2 reflected by the projection mirror 116 is incident on the lens 164 as convergent light. For this reason, the light ray L2 is focused closer to the retina 161.
  • The in-focus position F1 is farther from the projection mirror 116 than the retina 161; the distance between the in-focus position F1 and the retina 161 is D1.
  • The in-focus position F2 is closer to the projection mirror 116 than the retina 161; the distance between the in-focus position F2 and the retina 161 is D2.
  • The in-focus positions F0 to F2 differ in this way because the projection mirror 116 has a free-form surface: when the light beams L0 to L2 incident on the projection mirror 116 from the -X direction are made to converge at the pupil 162, the curvatures of the regions R0 to R2 of the projection mirror 116 differ in the X direction and/or an optical path difference arises among the light beams L0 to L2.
  • the region R2 has a curvature larger than that of R1. That is, the region R2 has a larger condensing power than R1. Therefore, the in-focus position F2 is closer to the light source than F1.
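As a general mirror-optics relation supporting this reasoning (a supplementary note, not stated in the patent text), a concave mirror region with radius of curvature R has focal length f = R/2, so its condensing power is

```latex
P = \frac{1}{f} = \frac{2}{R}
```

Hence the region R2, having the larger curvature (smaller R), has the larger condensing power and brings the in-focus position F2 closer to the light source than F1.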
  • In the Y direction, the optical system is substantially symmetric with respect to the X axis, so a shift of the in-focus position such as occurs in the X direction is less likely to occur in the Y direction.
  • FIG. 18 is a first diagram for explaining an image projector according to the fourth embodiment.
  • FIG. 18A is a view showing an optical path of a light beam in the image projector according to the fourth embodiment
  • FIG. 18B is an enlarged view of the vicinity of a reflection mirror in FIG. 18A.
  • light beams L0 to L2 applied to the regions R0 to R2 of the projection mirror 116 are reflected at the regions S0 to S2 in the reflection mirror 115A, respectively.
  • the reflection mirror 115A has a free-form surface.
  • the other configuration is the same as that of the above-described comparative example, and the description thereof is omitted.
  • FIG. 19 is a second diagram illustrating the image projector of the fourth embodiment.
  • FIG. 19A is a perspective view showing the unevenness of the surface of the reflection mirror in the fourth embodiment
  • FIG. 19B is a view showing the height Z in the X direction of the reflection mirror.
  • the X direction and the Y direction are directions corresponding to the X direction and the Y direction in the projection mirror 116.
  • the height of the reflection mirror 115A is in the Z direction.
  • the Z direction is shown by enlarging the unevenness of the surface of the reflection mirror 115A.
  • In the region S0, the surface of the reflection mirror 115A is substantially flat; in the region S1, the surface of the reflection mirror 115A is concave; and in the region S2, the surface of the reflection mirror 115A is convex.
  • Accordingly, the condensing power is approximately zero in the region S0, positive in the region S1, and negative in the region S2. Therefore, the in-focus position F0 of the light beam L0 does not change from the comparative example.
  • The in-focus position F1 of the light beam L1 is closer to the light source than in the comparative example of FIG. 17, and the in-focus position F2 of the light beam L2 is farther from the light source than in FIG. 17. Thereby, the in-focus positions F0 to F2 come to lie in the vicinity of the retina 161.
  • The height Z of the surface of the reflection mirror 115A is a free-form surface expressed by the following equation.
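The equation itself did not survive the text extraction; a standard XY-polynomial free-form expression consistent with the coefficient names aij used below (an assumed reconstruction, not necessarily the patent's exact formula) is:

```latex
Z(X, Y) = \sum_{i}\sum_{j} a_{ij}\, X^{i}\, Y^{j}
```

Under this form, symmetry with respect to the X axis corresponds to setting the coefficients of all odd powers of Y to zero.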
  • The condensing power of the projection mirror 116 in the Y direction is symmetric with respect to the X axis. Therefore, the coefficients aij of the terms in which j is odd are set to 0.
  • In the present embodiment, the coefficients a30 and a12 are set to finite values. Thereby, a free-form surface as shown in FIG. 19 can be realized.
  • The coefficients a10 and/or a20 may also be finite values, and higher-order coefficients may be finite values as well.

Landscapes

  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Signal Processing (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Multimedia (AREA)
  • Optics & Photonics (AREA)
  • Mechanical Optical Scanning Systems (AREA)
  • Digital Computer Display Output (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

The present invention relates to an image projection device (100) comprising: a control unit that generates laser light for imaging based on image data and controls emission of the laser light from a light source; a scanning mirror that scans the laser light; a projection mirror that projects the laser light onto the retina of a user as an image represented by the image data; and an operation unit (11) for operating a pointer (P) projected together with the image. The control unit generates imaging laser light based on correction image data and causes the light source to emit it, accepts operations performed with the operation unit (11) on the pointer (P) projected onto the user's retina together with a correction image (G), generates, in accordance with the operations, correction information for correcting the distortion of the image as viewed by the user, corrects input image data using the generated correction information, generates imaging laser light based on the corrected image data, and causes the light source to emit it.
PCT/JP2017/021452 2016-07-20 2017-06-09 Image projection device, image projection system, server, image projection method, and image projection program WO2018016223A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016142128A JP6255450B1 (ja) 2016-07-20 2016-07-20 Image projection device, image projection system, server, image projection method, and image projection program
JP2016-142128 2016-07-20

Publications (1)

Publication Number Publication Date
WO2018016223A1 true WO2018016223A1 (fr) 2018-01-25

Family

ID=60860125

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/021452 WO2018016223A1 (fr) 2016-07-20 2017-06-09 Image projection device, image projection system, server, image projection method, and image projection program

Country Status (2)

Country Link
JP (1) JP6255450B1 (fr)
WO (1) WO2018016223A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102437814B1 (ko) 2018-11-19 2022-08-29 이-비전 스마트 옵틱스, 아이엔씨. Beam steering device
JP7427237B2 (ja) * 2020-02-21 2024-02-05 株式会社Qdレーザ Image projection device, image projection method, and program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9180053B2 (en) * 2013-01-29 2015-11-10 Xerox Corporation Central vision impairment compensation

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010061547A1 * 2008-11-25 2010-06-03 学校法人日本大学 Ophthalmic simulator
JP2013078001A * 2011-09-30 2013-04-25 Seiko Epson Corp Projector and projector control method
JP2016517036A * 2013-03-25 2016-06-09 Ecole Polytechnique Federale de Lausanne (EPFL) Method and apparatus for a multiple exit pupil head-mounted display
JP2015111231A * 2013-05-31 2015-06-18 株式会社Qdレーザ Image projection device and projection device

Also Published As

Publication number Publication date
JP6255450B1 (ja) 2017-12-27
JP2018013566A (ja) 2018-01-25


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17830743

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17830743

Country of ref document: EP

Kind code of ref document: A1