WO2016117005A1 - Projection device, projection method, program, and information storage medium - Google Patents


Info

Publication number
WO2016117005A1
Authority
WO
WIPO (PCT)
Prior art keywords
scanning
light
image
target
frame period
Prior art date
Application number
PCT/JP2015/051230
Other languages
English (en)
Japanese (ja)
Inventor
和弥 笹森
Original Assignee
パイオニア株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パイオニア株式会社 filed Critical パイオニア株式会社
Priority to PCT/JP2015/051230 priority Critical patent/WO2016117005A1/fr
Publication of WO2016117005A1 publication Critical patent/WO2016117005A1/fr

Classifications

    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 26/00 - Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B 26/08 - · for controlling the direction of light
    • G02B 26/10 - · · Scanning systems
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G 3/02 - · by tracing or scanning a light beam on a screen
    • G09G 3/04 - · for presentation of a single character by selection from a plurality of characters, or by composing the character by combination of individual elements, e.g. segments, using a combination of such display devices for composing words, rows or the like, in a frame with fixed character positions
    • G09G 3/06 - · · using controlled light sources
    • G09G 3/12 - · · · using electroluminescent elements
    • G09G 3/14 - · · · · Semiconductor devices, e.g. diodes
    • G09G 3/20 - · for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, no fixed position being assigned to or needed to be assigned to the individual characters or partial characters

Definitions

  • the present invention relates to a technique for scanning laser light with a projection device.
  • Patent Document 1 describes an image projection apparatus that raster scans laser light using a MEMS resonant mirror and projects an image on a screen. This image projection apparatus adjusts the light emission intensity of the laser beam in response to the change in the raster scan speed in the horizontal direction so that the brightness of the projected image displayed on the screen is uniform.
  • Patent Document 1 also discloses that one frame is divided into two fields and interlaced scanning (so-called television interlace) is performed in the even and odd fields in the vertical direction.
  • In television interlacing, there is no point where scanning lines overlap between the even and odd fields.
  • In scanning with a MEMS mirror, however, it is common to draw an image using both the forward and backward movement of the mirror. As a result, points occur where the scanning lines of the even field and the odd field overlap, and the luminance at those points becomes higher than the surroundings.
  • FIG. 1 shows an example in which interlaced scanning is performed in a plurality of fields by a MEMS mirror.
  • FIG. 1A shows scanning lines in the first field
  • FIG. 1B shows scanning lines in the second field.
  • In the frame image, the scanning lines are as shown in FIG. 1C, and the point indicated by the broken-line circle X is scanned in both the first field and the second field, so its brightness increases.
  • a main object of the present invention is to reduce unevenness in luminance of a projected image when an image is projected by scanning with laser light in a plurality of fields.
  • The invention described in the claims is a projection device that projects an image composed of a plurality of continuous frames onto a projection area, comprising: a light source that emits light; control means for controlling the amount of light emitted from the light source based on a target light amount corresponding to an image signal indicating the image to be projected; and scanning means for scanning the light emitted from the light source over the entire projection area during one frame period. The control means sets the target light amount corresponding to the image signal so that a second target light amount, corresponding to an overlapping region scanned a plurality of times during the one frame period, is smaller than a first target light amount corresponding to a non-overlapping region scanned once during the one frame period by the scanning means.
  • The invention described in the claims is also a projection method executed by a projection apparatus that has a light source and scanning means and projects an image composed of a plurality of continuous frames onto a projection area. A control step controls the amount of light emitted from the light source based on a target light amount corresponding to an image signal indicating the image to be projected, and sets that target light amount so that a second target light amount, corresponding to an overlapping region scanned a plurality of times during one frame period, is smaller than a first target light amount corresponding to a non-overlapping region scanned once during the one frame period by the scanning means.
  • The invention described in the claims is further a program executed by a projection apparatus that has a light source, a scanning unit, and a computer and projects an image composed of a plurality of continuous frames onto a projection area. The program causes the computer to execute a control step that sets the target light amount corresponding to the image signal so that a second target light amount, corresponding to an overlapping region scanned a plurality of times during one frame period, is smaller than a first target light amount corresponding to a non-overlapping region scanned once during the one frame period by the scanning unit.
  • FIG. 2 shows a configuration of an image drawing apparatus according to a first embodiment.
  • An example of scanning lines formed by two fields is shown.
  • A method for calculating bit data on a scanning line is described.
  • An example of scanning lines according to the first embodiment is shown.
  • Another example of scanning lines according to the first embodiment is shown.
  • An example of scanning lines according to the second embodiment is shown.
  • A projection device that projects an image composed of a plurality of continuous frames onto a projection area has a light source that emits light, control means for controlling the amount of light emitted from the light source based on a target light amount corresponding to an image signal indicating the image to be projected, and scanning means for scanning the entire projection area with the light emitted from the light source during one frame period. The control means sets the target light amount corresponding to the image signal so that a second target light amount, corresponding to an overlapping area scanned a plurality of times during the one frame period, is smaller than a first target light amount corresponding to a non-overlapping area scanned once during the one frame period by the scanning means.
  • In this projection device, an image composed of a plurality of continuous frames is input as the image to be projected, the amount of light emitted from the light source is controlled based on a target light amount corresponding to an image signal indicating the image, and the light emitted from the light source is scanned over the entire projection area during one frame period. The control means sets the target light amount corresponding to the image signal so that the second target light amount, which corresponds to an overlapping region scanned a plurality of times during one frame period, is smaller than the first target light amount, which corresponds to a non-overlapping region scanned once during one frame period.
  • The control means may set the target light amount so that the combined light amount of the light scanned a plurality of times in the overlapping region during the one frame period substantially matches the light amount of the light scanned only once in the non-overlapping region.
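The light-balancing rule above can be illustrated with a minimal sketch (the helper name and numeric values are illustrative, not taken from the patent): the per-scan target light amount of an overlapping point is divided by the number of scans it receives in a frame, so the combined light matches a single full scan of a non-overlapping point.

```python
def per_scan_target(base_target: float, scan_count: int) -> float:
    """Divide a point's target light amount by the number of times it
    is scanned during one frame, so that the combined light of all
    scans substantially matches one full scan of a non-overlapping
    point.  Hypothetical helper, not part of the patent text."""
    if scan_count < 1:
        raise ValueError("a drawn point must be scanned at least once")
    return base_target / scan_count

# A non-overlapping point keeps its full target light amount;
# a point scanned twice gets half per scan, so the frame total matches.
assert per_scan_target(100.0, 1) == 100.0
assert per_scan_target(100.0, 2) == 50.0
assert per_scan_target(100.0, 2) * 2 == per_scan_target(100.0, 1)
```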
  • The frame may include a plurality of fields, and the scanning means may draw each field by raster scanning the light emitted from the light source and perform interlaced scanning between the plurality of fields.
  • flickering of a projected image can be prevented by performing interlaced scanning in a plurality of fields.
  • control means sets the second target light amount in at least one of the plurality of fields to zero.
  • the scanning unit draws the frame by performing a Lissajous scan of light emitted from a light source.
  • In the projection method, the control step sets the target light amount corresponding to the image signal so that the second target light amount, corresponding to an overlapping region scanned a plurality of times during the one frame period, is smaller than the first target light amount, corresponding to a non-overlapping region scanned once during the one frame period by the scanning means. This method also prevents the luminance of the overlapping region, which is scanned a plurality of times during one frame period, from becoming higher than that of the non-overlapping region.
  • In the program executed by a projection apparatus that has a light source, a scanning unit, and a computer and projects an image composed of a plurality of continuous frames onto a projection area, the computer executes a control step for controlling the amount of light emitted from the light source based on a target light amount corresponding to an image signal indicating the image to be projected, and a scanning step for scanning the light emitted from the light source over the entire projection area during one frame period. The control step sets the target light amount corresponding to the image signal so that the second target light amount, corresponding to an overlapping region scanned a plurality of times during the one frame period, is smaller than the first target light amount, corresponding to a non-overlapping region scanned once during the one frame period by the scanning unit.
  • This program can be stored in a storage medium.
  • FIG. 2 shows a configuration of the image drawing apparatus 1 to which the projection apparatus according to the first embodiment is applied.
  • the image drawing apparatus 1 includes an image signal input unit 2, a video ASIC 3, a frame memory 4, a ROM 5, a RAM 6, a laser driver 7, a MEMS driver 8, and a laser light source unit 9.
  • the image drawing apparatus 1 is used as a light source for a head-up display, for example, and emits light constituting a display image to an optical element such as a combiner.
  • the image signal input unit 2 receives an image signal input from the outside and outputs it to the video ASIC 3.
  • the video ASIC 3 is a block that controls the laser driver 7 and the MEMS driver 8 based on the image signal input from the image signal input unit 2 and the scanning position information Sc input from the MEMS mirror 95, and is configured as an ASIC (Application Specific Integrated Circuit).
  • the video ASIC 3 includes a synchronization / image separation unit 31, a bit data conversion unit 32, a light emission pattern conversion unit 33, and a timing controller 34.
  • the synchronization / image separation unit 31 separates the image data displayed on the screen 11 and the synchronization signal from the image signal input from the image signal input unit 2 and writes the image data to the frame memory 4.
  • the bit data conversion unit 32 reads the image data written in the frame memory 4 and converts it into bit data.
  • the light emission pattern conversion unit 33 converts the bit data converted by the bit data conversion unit 32 into a signal representing the light emission pattern of each laser.
  • the timing controller 34 controls the operation timing of the synchronization / image separation unit 31 and the bit data conversion unit 32.
  • the timing controller 34 also controls the operation timing of the MEMS driver 8 described later.
  • the frame memory 4 stores the image data separated by the synchronization / image separation unit 31.
  • the ROM 5 stores a control program and data for operating the video ASIC 3. Various data are sequentially read from and written into the RAM 6 as a work memory when the video ASIC 3 operates.
  • the laser driver 7 generates a signal for driving a laser diode provided in the laser light source unit 9 based on the signal output from the light emission pattern conversion unit 33.
  • the laser driver 7 includes a red laser driving circuit 71, a blue laser driving circuit 72, and a green laser driving circuit 73.
  • the red laser driving circuit 71 drives the red laser LD1 based on the signal output from the light emission pattern conversion unit 33.
  • the blue laser drive circuit 72 drives the blue laser LD2 based on the signal output from the light emission pattern conversion unit 33.
  • the green laser drive circuit 73 drives the green laser LD3 based on the signal output from the light emission pattern conversion unit 33.
  • the MEMS driver 8 controls the MEMS mirror 95 based on a signal output from the timing controller 34.
  • the MEMS driver 8 includes a servo circuit and a driver circuit.
  • the servo circuit controls the operation of the MEMS mirror 95 based on the signal from the timing controller 34.
  • the driver circuit amplifies the control signal for the MEMS mirror 95 output from the servo circuit to a predetermined level and outputs the amplified signal.
  • the laser light source unit 9 emits laser light based on the drive signal output from the laser driver 7. Specifically, the laser light source unit 9 includes a red laser LD1, a blue laser LD2, a green laser LD3, collimator lenses 91a to 91c, and reflection mirrors 92a to 92c.
  • the red laser LD1 emits red laser light (also referred to as “red laser light Lr”)
  • the blue laser LD2 emits blue laser light (also referred to as “blue laser light Lb”)
  • the green laser LD3 emits green laser light (also referred to as “green laser light Lg”)
  • the collimator lenses 91a to 91c convert the red, blue, and green laser beams Lr, Lb, and Lg into parallel beams and emit them to the reflection mirrors 92a to 92c, respectively.
  • the reflection mirror 92b reflects the blue laser light Lb.
  • the reflection mirror 92c transmits the blue laser light Lb and reflects the green laser light Lg.
  • the reflection mirror 92a transmits the red laser beam Lr and reflects the blue and green laser beams Lb and Lg.
  • the red laser light Lr transmitted through the reflection mirror 92a and the blue and green laser lights Lb and Lg reflected by the reflection mirror 92a are incident on the MEMS mirror 95.
  • the MEMS mirror 95 performs a raster scan on the screen 11 with the laser light L incident from the reflection mirror 92a.
  • the MEMS mirror 95 basically swings under the control of the MEMS driver 8 so as to scan the screen 11 in order to display the image input to the image signal input unit 2, and outputs its scanning position at that time (for example, information such as the angle of the mirror) to the video ASIC 3 as scanning position information Sc.
  • the lasers LD1 to LD3 are examples of the light source of the present invention
  • the video ASIC 3 is an example of the control means of the present invention
  • the MEMS mirror 95 and the MEMS driver 8 are examples of the scanning means of the present invention.
  • one frame image is composed of two field images, and interlaced scanning (so-called interlace in television) is performed in the first field and the second field in the vertical direction. That is, the MEMS mirror 95 draws one frame image by scanning two field images of the first field and the second field on the screen 11.
  • FIG. 3 shows an example in which one frame image is drawn by two fields.
  • the horizontal direction (X direction) is referred to as “main scanning direction”
  • the vertical direction (Y direction) is referred to as “sub scanning direction”.
  • FIG. 3A shows the scanning line F1 in the first field
  • FIG. 3B shows the scanning line F2 in the second field.
  • the laser light L from the MEMS mirror 95 is scanned in the sub-scanning direction while swinging in the main scanning direction.
  • the frame image is composed of a first field scanning line F1 and a second field scanning line F2.
  • the amount of laser light scanned by the MEMS mirror 95 is indicated by the bit data generated by the bit data conversion unit 32.
  • the bit data is data serving as a basis for determining the amount of light to be emitted by the laser in accordance with the scanning position of the MEMS mirror 95.
  • the MEMS mirror 95 is sine-driven in the main scanning direction and scanned at a predetermined speed in the sub-scanning direction. The scanning lines scanned in the main scanning direction are therefore not parallel to the horizontal arrangement of the image data, and the bit data must be calculated with reference to the pixel data corresponding to the trajectory of the scanning line in the image data developed in the frame memory 4.
  • FIG. 4 shows the relationship between the pixel data and the bit data at the scanning position on the scanning line.
  • Each square represents each pixel of the image data developed in the frame memory 4, and the numbers in each pixel are a pixel number and R, G, and B level values.
  • An arrow F indicates the trajectory of the scanning line.
  • the bit data conversion unit 32 refers to the pixel data of the pixels corresponding to the trajectory of the scanning line F and calculates the bit data B1, B2, B3, … at the respective scanning positions.
  • the spot of the laser beam scanned at the scanning position B1 extends over the pixels 1-1, 2-1, and 2-2, so the bit data at B1 is calculated by referring to the pixel data of pixels 1-1, 2-1, and 2-2. Similarly, the bit data at the scanning position B2 is calculated by assuming that 80% of the spot of the laser beam scanned at B2 is included in the pixel 2-2 and 20% in the pixel 1-2.
  • for each scanning position of the MEMS mirror 95, it is determined in advance which pixel data of the image data are added with which weights to calculate the corresponding bit data. In this way, the bit data conversion unit 32 calculates the bit data corresponding to each scanning position.
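The weighted-sum calculation above can be sketched as follows (the function name and dictionary layout are illustrative assumptions; the patent only states that predetermined per-position weights are applied to the covered pixels):

```python
def bit_data_from_pixels(pixel_rgb: dict, weights: dict) -> tuple:
    """Compute the R, G, B bit data for one scanning position as a
    weighted sum of the pixel data the laser spot covers.  `weights`
    maps pixel indices to the fraction of the spot inside each pixel,
    determined in advance for each scanning position."""
    r = sum(pixel_rgb[p][0] * w for p, w in weights.items())
    g = sum(pixel_rgb[p][1] * w for p, w in weights.items())
    b = sum(pixel_rgb[p][2] * w for p, w in weights.items())
    return (r, g, b)

# Scanning position B2 in the text: 80% of the spot in pixel 2-2,
# 20% in pixel 1-2 (illustrative RGB level values).
pixels = {(2, 2): (100, 50, 10), (1, 2): (200, 100, 60)}
weights = {(2, 2): 0.8, (1, 2): 0.2}
assert bit_data_from_pixels(pixels, weights) == (120.0, 60.0, 20.0)
```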
  • the light emission pattern conversion unit 33 converts the bit data thus generated by the bit data conversion unit 32 into a signal representing the light emission pattern of each laser.
  • the scanning speed of the scanning line is fast at the central portion of the image display area and is slow at both ends of the image display area. For this reason, when the same amount of laser light is scanned, the image tends to become brighter at both ends of the image display area.
  • when converting the bit data into a signal representing the light emission intensity of each laser, the light emission pattern conversion unit 33 therefore converts the bit data so that both ends of the image display area become darker. As a result, an image projected from bit data of uniform brightness appears equally bright at the center and at both ends of the image display area.
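One simple model of this edge darkening, under the assumption (not stated explicitly in the patent) that emission intensity is scaled in proportion to the instantaneous scan speed of the sinusoidally driven mirror:

```python
import math

def speed_compensation(phase: float) -> float:
    """For a sinusoidally driven mirror, position ~ sin(phase), so the
    scan speed ~ |cos(phase)|.  Scaling the emission intensity by the
    relative speed darkens the slow edges of the display area so the
    projected brightness stays uniform.  Illustrative model only; the
    patent just states that both ends are made darker."""
    return abs(math.cos(phase))

# Full brightness at the (fast) centre of the sweep, dimmed near the
# (slow) turning points at the edges of the display area.
assert speed_compensation(0.0) == 1.0
assert speed_compensation(math.pi / 3) < speed_compensation(math.pi / 6)
```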
  • the laser light source unit 9 scans the screen 11 with the laser light L having the light amount indicated by the signal output from the light emission pattern conversion unit 33.
  • the amount of laser light emitted based on the bit data at each scanning position on the scanning line is referred to as a “target light amount”. That is, the target light amount is a light amount corresponding to the luminance at each scanning position obtained based on the image data, and has a different value for each scanning position on the scanning line according to the image data.
  • there is a region where the scanning line is scanned only once, in either the first field or the second field, during one frame period (hereinafter referred to as a “non-overlapping region”). There is also a region where the scanning line is scanned in both the first field and the second field, that is, scanned twice in one frame (hereinafter referred to as an “overlapping region”). In the overlapping region scanned twice in one frame, the amount of laser light projected per unit time increases, and the image becomes brighter than in the non-overlapping region.
  • Method A is a method in which the luminance of one of the fields is made zero in the overlapping region.
  • FIG. 5 shows an example of the scanning lines when the luminance of the second-field scanning line is made zero by method A.
  • FIG. 5A shows the scanning lines of the first field
  • FIG. 5B shows the scanning lines of the second field
  • FIG. 5C shows the scanning lines of the frame.
  • in the first field, the laser beam is scanned with a luminance corresponding to the image data over the entire field. In the second field, the laser beam is scanned with a luminance corresponding to the image data in the non-overlapping region, but the luminance of the laser beam is set to zero in the overlapping region.
  • the bit data conversion unit 32 stores the overlapping area in advance. For the overlapping area, the bit data conversion unit 32 calculates bit data from the image data in the first field, and generates bit data with all the color levels set to zero in the second field. On the other hand, for the non-overlapping area, the bit data conversion unit 32 generates bit data from the image data in both the first field and the second field. Then, the light emission pattern conversion unit 33 causes each laser to emit light according to the bit data generated by the bit data conversion unit 32. Thereby, it can prevent that the image of an overlapping area becomes brighter than a non-overlapping area.
  • conversely, the luminance may be set to zero in the overlapping region of the first field, and the laser beam scanned with the luminance corresponding to the image data in the overlapping region of the second field.
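Method A amounts to a simple per-position rule, sketched below with hypothetical names (the patent describes behaviour, not code); fields are numbered 1 and 2:

```python
def method_a(bit_data: float, field: int, in_overlap: bool) -> float:
    """Method A: in the overlapping region, emit the full target light
    amount only in the first field and zero in the second field;
    non-overlapping regions are drawn normally in both fields."""
    if in_overlap and field == 2:
        return 0.0
    return bit_data

assert method_a(80.0, 1, in_overlap=True) == 80.0   # first field: full
assert method_a(80.0, 2, in_overlap=True) == 0.0    # second field: zero
assert method_a(80.0, 2, in_overlap=False) == 80.0  # non-overlap: full
```

Swapping the roles of the two fields, as the text allows, just changes the `field == 2` test to `field == 1`.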
  • Method B is a method of lowering the luminance of one of the fields in the overlapping region. That is, in method B the luminance in the overlapping region is not reduced to zero but is decreased by a predetermined ratio.
  • the bit data conversion unit 32 calculates bit data from the image data in the first field. Also, the bit data conversion unit 32 calculates bit data from the image data even in the overlapping region of the second field, and outputs bit data in which the level of each color of the bit data is darkened by a predetermined ratio. That is, in the second field, the laser light is scanned at a luminance obtained by reducing the luminance corresponding to the image data by a predetermined ratio, instead of uniformly setting the luminance to zero. On the other hand, for the non-overlapping area, the bit data conversion unit 32 generates bit data from the image data in both the first field and the second field.
  • the light emission pattern conversion unit 33 causes each laser to emit light according to the bit data generated by the bit data conversion unit 32.
  • the luminance may be lowered in the overlapping region of the first field, and the laser light may be scanned with the luminance corresponding to the image data in the overlapping region of the second field.
  • Method C is a method in which the luminance of both fields is halved in the overlapping region.
  • FIG. 6 shows an example of the scanning lines when the luminance of the scanning lines in the first and second fields is halved by method C.
  • FIG. 6A shows the scanning lines of the first field
  • FIG. 6B shows a scanning line for the second field
  • FIG. 6C shows a scanning line for the frame.
  • in the overlapping region of the first field, the laser beam is scanned at 1/2 of the luminance corresponding to the image data, and in the overlapping region of the second field the laser beam is likewise scanned at 1/2 of the luminance corresponding to the image data.
  • for the overlapping area, the bit data conversion unit 32 calculates the bit data from the image data in both the first field and the second field and then generates bit data in which the level of each color is multiplied by 0.5. The light emission pattern conversion unit 33 then causes each laser to emit light according to the bit data generated by the bit data conversion unit 32. This also makes flicker in the overlapping area inconspicuous.
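A minimal sketch of method C (hypothetical helper name; RGB tuples are illustrative): halving each colour level in both fields means the two scans of an overlapping point together deliver the same light as one full scan.

```python
def method_c(bit_data: tuple, in_overlap: bool) -> tuple:
    """Method C: multiply the level of each colour by 0.5 in the
    overlapping region, in BOTH fields; non-overlapping bit data
    passes through unchanged."""
    if in_overlap:
        return tuple(v * 0.5 for v in bit_data)
    return bit_data

rgb = (100, 60, 20)
half = method_c(rgb, in_overlap=True)
assert half == (50.0, 30.0, 10.0)
# Two half-intensity scans combine to the original target light amount:
assert tuple(2 * v for v in half) == (100.0, 60.0, 20.0)
```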
  • the light amount of the laser light determined based on the image data when scanning the non-overlapping region corresponds to the “first target light amount” of the present invention, and the light amount of the laser light determined based on the image data when scanning the overlapping region corresponds to the “second target light amount” of the present invention.
  • in the above description, the bit data conversion unit 32 generates bit data whose luminance is corrected in the overlapping region, and the light emission pattern conversion unit 33 causes each laser to emit light according to that bit data. Instead, the bit data conversion unit 32 may calculate the bit data from the image data regardless of whether the region is overlapping or non-overlapping, and the light emission pattern conversion unit 33 may correct the light quantity of each laser in the overlapping area when converting the bit data into a signal indicating the light emission intensity of each laser.
  • in that case, for method A, the light emission pattern conversion unit 33 stores the emission timing of the laser light corresponding to the overlapping region, converts the bit data as it is into a signal representing the emission intensity of each laser in the entire region of the first field and the non-overlapping region of the second field, and generates a signal in which the emission of each laser becomes zero in the overlapping region of the second field. Conversely, the light emission pattern conversion unit 33 may generate a signal in which the light emission of each laser becomes zero in the overlapping area of the first field, and convert the bit data as it is into a signal representing the emission intensity of each laser in the non-overlapping area of the first field and the entire area of the second field.
  • for method B, the light emission pattern conversion unit 33 converts the bit data as it is into a signal representing the light emission intensity of each laser in the entire first-field region and the second-field non-overlapping region, and in the overlapping region of the second field generates a signal in which the emission intensity of each laser indicated by the bit data is reduced by a predetermined ratio. Instead, the light emission pattern conversion unit 33 may generate a signal in which the light emission intensity of each laser indicated by the bit data is reduced by a predetermined ratio in the overlapping region of the first field, and convert the bit data as it is into a signal representing the emission intensity of each laser in the non-overlapping region of the first field and the entire second field.
  • for method C, the light emission pattern conversion unit 33 only has to generate, for both the first field and the second field, a signal indicating 0.5 times the light emission intensity of each laser indicated by the bit data in the overlapping region.
  • the region defined as the overlapping region is set as appropriate: a region where the first-field and second-field scanning lines partially overlap, a region where more than half of them overlap, a region where they overlap entirely, or the like.
  • in the above description, one frame is composed of two fields, but one frame may be composed of three or more fields.
  • in that case, the image drawing apparatus 1 stores in advance the overlapping areas generated by scanning each field, scans the laser beam with the normal light amount based on the image data in one field for each overlapping area, and corrects the light amount before scanning in the other fields.
  • For example, scanning is performed at the luminance corresponding to the image data over the entire first field and in the non-overlapping regions of the second and third fields, while the luminance is set to zero in the overlapping regions of the second and third fields.
  • Alternatively, scanning is performed at the luminance corresponding to the image data over the entire first field and in the non-overlapping regions of the second and third fields, and at a luminance reduced by a predetermined ratio in the overlapping regions of the second and third fields.
  • Alternatively, scanning is performed at the luminance corresponding to the image data in the non-overlapping regions of the first to third fields, and at 1/3 of the luminance of the image data in the overlapping regions of the first to third fields.
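The field-based weighting schemes listed above all amount to the same rule: distribute each pixel's target luminance across the fields whose scanning lines cover it, so that the emission summed over one frame period matches the image data. The sketch below is illustrative only, not the patent's implementation; the function name, the mode names, and the per-pixel list of covering fields are assumptions made for the example.

```python
def field_luminance(pixel_luma, fields_covering, mode="divide"):
    """Distribute one pixel's target luminance over the fields that scan it.

    pixel_luma      -- luminance derived from the image data
    fields_covering -- indices of the fields whose scanning lines pass
                       through this pixel during one frame period
    Returns {field_index: luminance to emit when that field scans the pixel}.
    """
    n = len(fields_covering)
    if n == 1:                        # non-overlapping region: normal light amount
        return {fields_covering[0]: pixel_luma}
    if mode == "first_only":          # full luminance in the first field, zero later
        out = {f: 0.0 for f in fields_covering}
        out[fields_covering[0]] = pixel_luma
        return out
    if mode == "divide":              # split evenly, e.g. 1/3 each over three fields
        return {f: pixel_luma / n for f in fields_covering}
    raise ValueError("unknown mode: " + mode)
```

For a pixel scanned by all three fields, `field_luminance(0.9, [0, 1, 2])` emits one third of the luminance in each field, while `mode="first_only"` corresponds to the variant in which only one field draws the overlapping region at the normal light amount.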
  • The second embodiment is an example in which the present invention is applied to so-called Lissajous scanning.
  • The configuration of the image drawing apparatus according to the second embodiment is the same as that of the first embodiment, and a description thereof is omitted.
  • The MEMS mirror 95 is resonantly driven in both the main scanning direction and the sub-scanning direction, and an image is projected onto the screen 11 by Lissajous scanning of the laser light.
  • An example of the Lissajous scanning lines is shown in FIG. In Lissajous scanning, many overlapping regions are generated in which a scanning line passes twice or more while one frame is drawn.
  • The bit data conversion unit 32 calculates the bit data by referring to the pixel data corresponding to the trajectory of the scanning line in the image data developed in the frame memory 4.
  • As in the first embodiment, the bit data conversion unit 32 stores the overlapping regions in advance.
  • In the overlapping regions, the bit data conversion unit 32 generates bit data calculated from the image data for the first scan, and generates bit data with the level of every color set to zero for the second and subsequent scans.
  • Alternatively, the bit data conversion unit 32 generates bit data in which the level of each color is reduced by a predetermined ratio for the second and subsequent scans of an overlapping region during one frame period.
  • Alternatively, the bit data conversion unit 32 stores in advance the number of times each overlapping region is scanned during one frame period. When calculating the bit data corresponding to an overlapping region, the bit data conversion unit 32 divides the luminance based on the image data by the number of scans. For example, for an overlapping region that is scanned three times during one frame period by the Lissajous scanning, the bit data conversion unit 32 generates bit data so that scanning is performed at 1/3 of the luminance calculated from the image data.
  • In the above description, the bit data conversion unit 32 corrects the luminance in the overlapping regions when generating the bit data. Instead, as described in the first embodiment, the bit data conversion unit 32 may calculate the bit data from the image data regardless of whether the region is overlapping or non-overlapping, and the light emission pattern conversion unit 33 may correct the emission light amount of each laser in the overlapping regions.
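The scan-count correction described above can be sketched in two steps: precompute, from the mirror trajectory, how many times each pixel is crossed during one frame period, then divide the luminance obtained from the image data by that count. This is a sketch under assumed conditions; the sinusoidal trajectory model, the frequencies, and the grid size are arbitrary illustration values, not parameters from the disclosure.

```python
import math
from collections import Counter

def scan_count_map(fx, fy, width, height, samples):
    """Count how many times each pixel is crossed by a Lissajous trajectory
    x = sin(2*pi*fx*t), y = sin(2*pi*fy*t) during one frame period t in [0, 1)."""
    counts = Counter()
    prev = None
    for i in range(samples):
        t = i / samples
        px = int((math.sin(2 * math.pi * fx * t) + 1) / 2 * (width - 1))
        py = int((math.sin(2 * math.pi * fy * t) + 1) / 2 * (height - 1))
        if (px, py) != prev:          # count one visit per pass, not per sample
            counts[(px, py)] += 1
        prev = (px, py)
    return counts

def corrected_luminance(image_luminance, pixel, counts):
    """Divide the image-data luminance by the number of scans of the pixel,
    so the emission summed over one frame period matches the image data."""
    return image_luminance / max(1, counts[pixel])

counts = scan_count_map(fx=5, fy=4, width=16, height=12, samples=5000)
busiest = max(counts, key=counts.get)     # a heavily overlapped pixel
print(busiest, counts[busiest], corrected_luminance(0.9, busiest, counts))
```

Here `busiest` is simply the most frequently crossed pixel in the sketch; in the embodiment the per-region scan counts would be stored in advance by the bit data conversion unit 32 rather than computed on the fly.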
  • The present invention can be used in a projection apparatus that projects an image.
  • 1 Image drawing device, 3 Video ASIC, 7 Laser driver, 8 MEMS driver, 9 Laser light source, 11 Screen, 95 MEMS mirror

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Mechanical Optical Scanning Systems (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

The present invention relates to a projection device into which an image composed of a plurality of continuous frames is input as the image to be projected, and in which the amount of light emitted from a light source is controlled on the basis of a target light amount according to an image signal representing the image. The light emitted from the light source is scanned over the entire projection region during each frame period. A control means sets the target light amounts according to the image signal such that a second target light amount, which is the target light amount corresponding to an overlapping region scanned multiple times per frame period, is lower than a first target light amount, which is the target light amount corresponding to a non-overlapping region scanned once per frame period.
PCT/JP2015/051230 2015-01-19 2015-01-19 Dispositif de projection, procédé de projection, programme et support d'informations WO2016117005A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/051230 WO2016117005A1 (fr) 2015-01-19 2015-01-19 Dispositif de projection, procédé de projection, programme et support d'informations

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/051230 WO2016117005A1 (fr) 2015-01-19 2015-01-19 Dispositif de projection, procédé de projection, programme et support d'informations

Publications (1)

Publication Number Publication Date
WO2016117005A1 true WO2016117005A1 (fr) 2016-07-28

Family

ID=56416572

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/051230 WO2016117005A1 (fr) 2015-01-19 2015-01-19 Dispositif de projection, procédé de projection, programme et support d'informations

Country Status (1)

Country Link
WO (1) WO2016117005A1 (fr)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003005737A * 2001-06-21 2003-01-08 Toshiba Corp Video display device and video display method
JP2007047243A * 2005-08-08 2007-02-22 Seiko Epson Corp Image display device and control method of image display device
JP3172678U * 2008-12-10 2012-01-05 宏瞻科技股▲ふん▼有限公司 Laser projection device
JP2013068859A * 2011-09-26 2013-04-18 Hitachi Media Electoronics Co Ltd Image display device

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190125409A (ko) * 2017-03-03 2019-11-06 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 Mems 스캐닝 디스플레이 디바이스
JP2020510870A (ja) * 2017-03-03 2020-04-09 マイクロソフト テクノロジー ライセンシング,エルエルシー Mems走査ディスプレイデバイス
JP7075939B2 (ja) 2017-03-03 2022-05-26 マイクロソフト テクノロジー ライセンシング,エルエルシー Mems走査ディスプレイデバイス
KR102516979B1 (ko) 2017-03-03 2023-03-31 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 Mems 스캐닝 디스플레이 디바이스
WO2019065484A1 (fr) * 2017-09-28 2019-04-04 パイオニア株式会社 Dispositif de mesure de distance et dispositif de balayage optique
JPWO2019065484A1 (ja) * 2017-09-28 2020-11-12 パイオニア株式会社 測距装置及び光走査装置
EP3674749A4 (fr) * 2017-09-28 2021-04-28 Pioneer Corporation Dispositif de mesure de distance et dispositif de balayage optique
JP2022121521A (ja) * 2017-09-28 2022-08-19 パイオニア株式会社 測距装置及び光走査装置

Similar Documents

Publication Publication Date Title
CN107621748B (zh) 激光投影显示装置
JP5175504B2 (ja) 投写型映像表示装置
JP6118913B2 (ja) 表示装置
JP6623584B2 (ja) 画像生成装置、ヘッドアップディスプレイ
JP6137006B2 (ja) 画像表示装置および画像表示方法
WO2012120589A1 (fr) Dispositif de rendu d'image, programme de contrôle de rendu, et dispositif de détection d'écart d'axe optique
US20070200866A1 (en) Light source performing scanning operation twice, image apparatus using the light source, and method of driving the light source
JP5976925B2 (ja) 投影装置、ヘッドアップディスプレイ、制御方法、プログラム及び記憶媒体
WO2016117005A1 (fr) Dispositif de projection, procédé de projection, programme et support d'informations
WO2014162506A1 (fr) Dispositif d'affichage d'image virtuelle, dispositif de projection, procédé de commande, programme, et support d'enregistrement
JP6311823B2 (ja) 画像表示装置および画像表示方法
WO2015194377A1 (fr) Dispositif de commande de source de lumière et dispositif d'affichage d'image
JP2017083631A (ja) 表示装置、制御方法、プログラム及び記憶媒体
US20080316373A1 (en) Projection type image display device
JP5731660B2 (ja) 投影装置、制御方法及びプログラム
JP4230496B2 (ja) 画像表示装置及び画像表示方法
JP2014007358A (ja) 投影装置、ヘッドアップディスプレイ、制御方法、プログラム及び記憶媒体
WO2016135796A1 (fr) Dispositif de rendu d'image, affichage tête haute, et procédé de réglage de luminance d'image
JP2016099561A (ja) 投影装置、投影方法、プログラム及び記憶媒体
JP2010211149A (ja) 画像表示装置
JP6844350B2 (ja) 投影装置、制御方法、プログラム及び記憶媒体
WO2013145153A1 (fr) Dispositif de dessin d'image
WO2014162503A1 (fr) Dispositif de projection, procédé de commande, programme et support d'enregistrement
JP6606634B2 (ja) 投影装置、制御方法、プログラム及び記憶媒体
JP6721082B2 (ja) 投影装置、投影方法、プログラム及び記憶媒体

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15878701

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 15878701

Country of ref document: EP

Kind code of ref document: A1