WO2016117005A1 - Projection device, projection method, program, and storage medium - Google Patents

Projection device, projection method, program, and storage medium

Info

Publication number
WO2016117005A1
Authority
WO
WIPO (PCT)
Prior art keywords
scanning
light
image
target
frame period
Prior art date
Application number
PCT/JP2015/051230
Other languages
French (fr)
Japanese (ja)
Inventor
Kazuya Sasamori (笹森 和弥)
Original Assignee
Pioneer Corporation (パイオニア株式会社)
Priority date
Filing date
Publication date
Application filed by Pioneer Corporation (パイオニア株式会社)
Priority to PCT/JP2015/051230
Publication of WO2016117005A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00 Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08 Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/10 Scanning systems
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/02 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes by tracing or scanning a light beam on a screen
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/04 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of a single character by selection from a plurality of characters, or by composing the character by combination of individual elements, e.g. segments using a combination of such display devices for composing words, rows or the like, in a frame with fixed character positions
    • G09G3/06 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of a single character by selection from a plurality of characters, or by composing the character by combination of individual elements, e.g. segments using a combination of such display devices for composing words, rows or the like, in a frame with fixed character positions using controlled light sources
    • G09G3/12 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of a single character by selection from a plurality of characters, or by composing the character by combination of individual elements, e.g. segments using a combination of such display devices for composing words, rows or the like, in a frame with fixed character positions using controlled light sources using electroluminescent elements
    • G09G3/14 Semiconductor devices, e.g. diodes
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters

Definitions

  • the present invention relates to a technique for scanning laser light with a projection device.
  • Patent Document 1 describes an image projection apparatus that raster scans laser light using a MEMS resonant mirror and projects an image on a screen. This image projection apparatus adjusts the light emission intensity of the laser beam in response to the change in the raster scan speed in the horizontal direction so that the brightness of the projected image displayed on the screen is uniform.
  • Patent Document 1 also discloses that one frame is divided into two fields and interlaced scanning (so-called television interlace) is performed in the even and odd fields in the vertical direction.
  • In television interlacing, there is no point where the scanning lines of the even and odd fields overlap.
  • In scanning with a MEMS mirror, by contrast, it is common to draw an image using both the forward and backward swings of the MEMS mirror. For this reason, points where the scanning lines of the even field and the odd field overlap occur, and as a result the luminance at those points becomes higher than in the surrounding area.
  • FIG. 1 shows an example in which interlaced scanning is performed in a plurality of fields by a MEMS mirror.
  • FIG. 1A shows scanning lines in the first field
  • FIG. 1B shows scanning lines in the second field.
  • In that case, the scanning lines of the frame image are as shown in FIG. 1C; the point indicated by the broken-line circle X is scanned in both the first field and the second field, and therefore its luminance becomes higher than that of the other points.
  • a main object of the present invention is to reduce unevenness in luminance of a projected image when an image is projected by scanning with laser light in a plurality of fields.
  • The invention described in the claims is a projection device that projects an image composed of a plurality of continuous frames onto a projection area, comprising: a light source that emits light; control means for controlling the amount of light emitted by the light source based on a target light amount corresponding to an image signal indicating the image to be projected; and scanning means for scanning the light emitted by the light source over the entire projection area during one frame period. The control means sets the target light amount corresponding to the image signal so that a second target light amount, which is the target light amount corresponding to an overlapping region scanned a plurality of times during the one frame period, is smaller than a first target light amount, which is the target light amount corresponding to a non-overlapping region scanned once during the one frame period by the scanning means.
  • The invention described in the claims is also a projection method executed by a projection device that has a light source and scanning means and projects an image composed of a plurality of continuous frames onto a projection area, comprising: a control step of controlling the amount of light emitted by the light source based on a target light amount corresponding to an image signal indicating the image to be projected; and a scanning step of scanning the light emitted by the light source over the entire projection area during one frame period. The control step sets the target light amount corresponding to the image signal so that a second target light amount, which is the target light amount corresponding to an overlapping region scanned a plurality of times during the one frame period, is smaller than a first target light amount, which is the target light amount corresponding to a non-overlapping region scanned once during the one frame period by the scanning means.
  • The invention described in the claims is also a program executed by a projection device that has a light source, scanning means, and a computer and projects an image composed of a plurality of continuous frames onto a projection area. The program causes the computer to execute a control step of controlling the amount of light emitted by the light source based on a target light amount corresponding to an image signal indicating the image to be projected, and a scanning step of scanning the light emitted by the light source over the entire projection area during one frame period. The control step sets the target light amount corresponding to the image signal so that a second target light amount, which is the target light amount corresponding to an overlapping region scanned a plurality of times during the one frame period, is smaller than a first target light amount, which is the target light amount corresponding to a non-overlapping region scanned once during the one frame period by the scanning means.
  • FIG. 1 shows an example of interlaced scanning according to the prior art.
  • FIG. 2 shows the configuration of an image drawing apparatus according to the first embodiment.
  • FIG. 3 shows an example of scanning lines formed by two fields.
  • FIG. 4 illustrates a method for calculating bit data on a scanning line.
  • FIG. 5 shows an example of scanning lines according to the first embodiment.
  • FIG. 6 shows another example of scanning lines according to the first embodiment.
  • FIG. 7 shows an example of scanning lines according to the second embodiment.
  • In a preferred embodiment of the present invention, a projection device that projects an image composed of a plurality of continuous frames onto a projection area includes: a light source that emits light; control means for controlling the amount of light emitted by the light source based on a target light amount corresponding to an image signal indicating the image to be projected; and scanning means for scanning the light emitted by the light source over the entire projection area during one frame period. The control means sets the target light amount corresponding to the image signal so that the second target light amount, which is the target light amount corresponding to the overlapping region scanned a plurality of times during the one frame period, is smaller than the first target light amount, which is the target light amount corresponding to the non-overlapping region scanned once during the one frame period by the scanning means.
  • In the above projection device, an image composed of a plurality of continuous frames is input as the image to be projected, and the amount of light emitted by the light source is controlled based on a target light amount corresponding to an image signal indicating that image.
  • The light emitted by the light source is scanned over the entire projection area during one frame period.
  • The control means sets the target light amount corresponding to the image signal so that the second target light amount, which is the target light amount corresponding to an overlapping region scanned a plurality of times during one frame period, is smaller than the first target light amount, which is the target light amount corresponding to a non-overlapping region scanned once during one frame period. This prevents the luminance of the overlapping region from becoming higher than that of the non-overlapping region.
  • In one aspect, the control means sets the target light amounts so that, during the one frame period, the combined light amount of the light scanned a plurality of times over the overlapping region substantially matches the light amount of the light scanned only once over the non-overlapping region.
  • Since the total amount of light scanned during one frame period is then the same in the overlapping and non-overlapping regions, luminance unevenness of the image can be prevented.
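  • As a rough, non-authoritative illustration of this light-budget idea (not code from the patent), the sketch below assumes a hypothetical precomputed per-position scan count and splits the image-derived light amount across the passes:

```python
# Minimal sketch, assuming `scan_count` is a precomputed number of passes the
# scanning means makes over a position during one frame period (1 for the
# non-overlapping region, 2 or more for the overlapping region).

def target_light_amount(image_light_amount, scan_count):
    """Split the per-frame light budget across the passes so the combined
    amount over an overlapping region matches a single pass elsewhere."""
    if scan_count <= 1:
        return image_light_amount               # first target light amount
    return image_light_amount / scan_count      # second (smaller) target light amount
```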
  • In another aspect, the frame is composed of a plurality of fields, and the scanning means draws each field by raster scanning the light emitted by the light source, performing interlaced scanning between the plurality of fields.
  • Performing interlaced scanning over a plurality of fields prevents flickering of the projected image.
  • In a preferred example, the control means sets the second target light amount in at least one of the plurality of fields to zero.
  • In another preferred example, the scanning means draws the frame by Lissajous scanning of the light emitted by the light source.
  • In another preferred embodiment, the control step sets the target light amount corresponding to the image signal so that the second target light amount, which is the target light amount corresponding to an overlapping region scanned a plurality of times during the one frame period, is smaller than the first target light amount, which is the target light amount corresponding to a non-overlapping region scanned once during the one frame period by the scanning means. This method also prevents the luminance of the overlapping region, which is scanned a plurality of times during one frame period, from becoming higher than that of the non-overlapping region.
  • In another preferred embodiment, a program executed by a projection device that has a light source, scanning means, and a computer and projects an image composed of a plurality of continuous frames onto a projection area causes the computer to execute a control step of controlling the amount of light emitted by the light source based on a target light amount corresponding to an image signal indicating the image to be projected, and a scanning step of scanning the light emitted by the light source over the entire projection area during one frame period. The control step sets the target light amount corresponding to the image signal so that the second target light amount, which is the target light amount corresponding to an overlapping region scanned a plurality of times during the one frame period, is smaller than the first target light amount, which is the target light amount corresponding to a non-overlapping region scanned once during the one frame period by the scanning means.
  • This program can be stored in a storage medium.
  • FIG. 2 shows a configuration of the image drawing apparatus 1 to which the projection apparatus according to the first embodiment is applied.
  • As shown in FIG. 2, the image drawing apparatus 1 includes an image signal input unit 2, a video ASIC 3, a frame memory 4, a ROM 5, a RAM 6, a laser driver 7, a MEMS driver 8, and a laser light source unit 9.
  • the image drawing apparatus 1 is used as a light source for a head-up display, for example, and emits light constituting a display image to an optical element such as a combiner.
  • the image signal input unit 2 receives an image signal input from the outside and outputs it to the video ASIC 3.
  • The video ASIC 3 is a block that controls the laser driver 7 and the MEMS driver 8 based on the image signal input from the image signal input unit 2 and the scanning position information Sc input from the MEMS mirror 95, and is configured as an ASIC (Application Specific Integrated Circuit).
  • the video ASIC 3 includes a synchronization / image separation unit 31, a bit data conversion unit 32, a light emission pattern conversion unit 33, and a timing controller 34.
  • the synchronization / image separation unit 31 separates the image data displayed on the screen 11 and the synchronization signal from the image signal input from the image signal input unit 2 and writes the image data to the frame memory 4.
  • the bit data conversion unit 32 reads the image data written in the frame memory 4 and converts it into bit data.
  • the light emission pattern conversion unit 33 converts the bit data converted by the bit data conversion unit 32 into a signal representing the light emission pattern of each laser.
  • the timing controller 34 controls the operation timing of the synchronization / image separation unit 31 and the bit data conversion unit 32.
  • the timing controller 34 also controls the operation timing of the MEMS driver 8 described later.
  • The image data separated by the synchronization/image separation unit 31 is written into the frame memory 4.
  • the ROM 5 stores a control program and data for operating the video ASIC 3. Various data are sequentially read from and written into the RAM 6 as a work memory when the video ASIC 3 operates.
  • the laser driver 7 generates a signal for driving a laser diode provided in the laser light source unit 9 based on the signal output from the light emission pattern conversion unit 33.
  • the laser driver 7 includes a red laser driving circuit 71, a blue laser driving circuit 72, and a green laser driving circuit 73.
  • the red laser driving circuit 71 drives the red laser LD1 based on the signal output from the light emission pattern conversion unit 33.
  • the blue laser drive circuit 72 drives the blue laser LD2 based on the signal output from the light emission pattern conversion unit 33.
  • the green laser drive circuit 73 drives the green laser LD3 based on the signal output from the light emission pattern conversion unit 33.
  • the MEMS driver 8 controls the MEMS mirror 95 based on a signal output from the timing controller 34.
  • the MEMS driver 8 includes a servo circuit and a driver circuit.
  • the servo circuit controls the operation of the MEMS mirror 95 based on the signal from the timing controller 34.
  • the driver circuit amplifies the control signal for the MEMS mirror 95 output from the servo circuit to a predetermined level and outputs the amplified signal.
  • the laser light source unit 9 emits laser light based on the drive signal output from the laser driver 7. Specifically, the laser light source unit 9 includes a red laser LD1, a blue laser LD2, a green laser LD3, collimator lenses 91a to 91c, and reflection mirrors 92a to 92c.
  • the red laser LD1 emits red laser light (also referred to as “red laser light Lr”)
  • the blue laser LD2 emits blue laser light (also referred to as “blue laser light Lb”)
  • The green laser LD3 emits green laser light (also referred to as “green laser light Lg”).
  • the collimator lenses 91a to 91c convert the red, blue, and green laser beams Lr, Lb, and Lg into parallel beams and emit them to the reflection mirrors 92a to 92c, respectively.
  • the reflection mirror 92b reflects the blue laser light Lb.
  • the reflection mirror 92c transmits the blue laser light Lb and reflects the green laser light Lg.
  • the reflection mirror 92a transmits the red laser beam Lr and reflects the blue and green laser beams Lb and Lg.
  • the red laser light Lr transmitted through the reflection mirror 92a and the blue and green laser lights Lb and Lg reflected by the reflection mirror 92a are incident on the MEMS mirror 95.
  • the MEMS mirror 95 performs a raster scan on the screen 11 with the laser light L incident from the reflection mirror 92a.
  • The MEMS mirror 95 basically swings so as to scan over the screen 11 under the control of the MEMS driver 8 in order to display the image input to the image signal input unit 2, and outputs the scanning position information at that time (for example, information such as the mirror angle) to the video ASIC 3 as scanning position information Sc.
  • the lasers LD1 to LD3 are examples of the light source of the present invention
  • the video ASIC 3 is an example of the control means of the present invention
  • the MEMS mirror 95 and the MEMS driver 8 are examples of the scanning means of the present invention.
  • one frame image is composed of two field images, and interlaced scanning (so-called interlace in television) is performed in the first field and the second field in the vertical direction. That is, the MEMS mirror 95 draws one frame image by scanning two field images of the first field and the second field on the screen 11.
  • FIG. 3 shows an example in which one frame image is drawn by two fields.
  • the horizontal direction (X direction) is referred to as “main scanning direction”
  • the vertical direction (Y direction) is referred to as “sub scanning direction”.
  • FIG. 3A shows the scanning line F1 in the first field
  • FIG. 3B shows the scanning line F2 in the second field.
  • the laser light L from the MEMS mirror 95 is scanned in the sub-scanning direction while swinging in the main scanning direction.
  • the frame image is composed of a first field scanning line F1 and a second field scanning line F2.
  • the amount of laser light scanned by the MEMS mirror 95 is indicated by the bit data generated by the bit data conversion unit 32.
  • the bit data is data serving as a basis for determining the amount of light to be emitted by the laser in accordance with the scanning position of the MEMS mirror 95.
  • The MEMS mirror 95 is driven sinusoidally in the main scanning direction while being scanned at a predetermined speed in the sub-scanning direction, so the scanning lines scanned in the main scanning direction are not parallel to the horizontal arrangement of the image data. Therefore, the bit data must be calculated with reference to the pixel data corresponding to the trajectory of the scanning line in the image data developed in the frame memory 4.
  • FIG. 4 shows the relationship between the pixel data and the bit data at the scanning position on the scanning line.
  • Each square represents each pixel of the image data developed in the frame memory 4, and the numbers in each pixel are a pixel number and R, G, and B level values.
  • An arrow F indicates the trajectory of the scanning line.
  • The bit data conversion unit 32 refers to the pixel data of the pixels corresponding to the trajectory of the scanning line F and calculates the bit data B1, B2, B3, ... at the respective scanning positions.
  • For example, the spot of the laser beam scanned at the scanning position B1 extends over the pixels 1-1, 2-1, and 2-2, so the bit data at B1 are calculated by referring to the pixel data of pixels 1-1, 2-1, and 2-2.
  • Similarly, the bit data at the scanning position B2 are obtained by assuming that 80% of the spot of the laser beam scanned at B2 is included in the pixel 2-2 and 20% is included in the pixel 1-2.
  • In other words, it is determined in advance, for each scanning position of the MEMS mirror 95, which pixel data of the image data are weighted and added, and with what weights, to obtain the corresponding bit data. In this way, the bit data conversion unit 32 calculates the bit data corresponding to each scanning position.
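  • A minimal sketch of such a weighted lookup is given below; it is an illustration only, and the per-position pixel indices and weights are assumed to be precomputed as the text describes:

```python
# Hypothetical sketch: bit data at one scanning position as a weighted sum of
# the pixel data covered by the laser spot in the frame memory.

def bit_data_at(frame, spot_weights):
    """frame[y][x] -> (R, G, B) pixel data; spot_weights: list of ((x, y), w)
    pairs, e.g. [((1, 1), 0.8), ((0, 1), 0.2)] for a spot that is 80% inside
    one pixel and 20% inside a neighbouring pixel."""
    r = g = b = 0.0
    for (x, y), w in spot_weights:
        pr, pg, pb = frame[y][x]
        r += w * pr
        g += w * pg
        b += w * pb
    return (r, g, b)
```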
  • the light emission pattern conversion unit 33 converts the bit data thus generated by the bit data conversion unit 32 into a signal representing the light emission pattern of each laser.
  • the scanning speed of the scanning line is fast at the central portion of the image display area and is slow at both ends of the image display area. For this reason, when the same amount of laser light is scanned, the image tends to become brighter at both ends of the image display area.
  • For this reason, when converting the bit data into a signal representing the light emission intensity of each laser, the light emission pattern conversion unit 33 converts the bit data so that both ends of the image display area become darker. As a result, images projected from bit data of the same value have the same brightness at the center and at both ends of the image display area.
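  • As a rough illustration of this speed compensation (an assumption-laden sketch, not the patent's formula), the emission intensity can be scaled by the local scan speed of a sinusoidally driven mirror:

```python
import math

# Sketch assuming the mirror angle follows a sinusoid in the main scanning
# direction, so the spot moves fastest at the centre and slowest at both ends
# of the image display area.

def speed_compensated_intensity(bit_value, phase):
    """phase: 0 at the centre of the swing, approaching +/- pi/2 at the ends."""
    relative_speed = abs(math.cos(phase))   # 1.0 at the centre, smaller at the ends
    return bit_value * relative_speed       # darker where the scan dwells longer
```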
  • the laser light source unit 9 scans the screen 11 with the laser light L having the light amount indicated by the signal output from the light emission pattern conversion unit 33.
  • the amount of laser light emitted based on the bit data at each scanning position on the scanning line is referred to as a “target light amount”. That is, the target light amount is a light amount corresponding to the luminance at each scanning position obtained based on the image data, and has a different value for each scanning position on the scanning line according to the image data.
  • During one frame period, there is a region that is scanned only once, in either the first field or the second field (hereinafter referred to as the “non-overlapping region”).
  • In addition to the non-overlapping region, there is a region that is scanned in both the first field and the second field, that is, a region scanned twice in one frame (hereinafter referred to as the “overlapping region”). In the overlapping region scanned twice in one frame, the amount of laser light projected per unit time increases, and the image becomes brighter than in the non-overlapping region. The present embodiment addresses this using one of the following methods A to C.
  • Method A is a method in which the luminance of one of the fields is set to zero in the overlapping region.
  • FIG. 5 shows an example of the scanning lines when the luminance of the scanning line of the second field is set to zero by method A.
  • FIG. 5A shows the scanning lines of the first field
  • FIG. 5B shows the scanning lines of the second field
  • FIG. 5C shows the scanning lines of the frame.
  • In the first field, the laser beam is scanned with a luminance corresponding to the image data over the entire field.
  • In the second field, the laser beam is scanned with a luminance corresponding to the image data in the non-overlapping region, but the luminance of the laser beam is set to zero in the overlapping region.
  • Specifically, the bit data conversion unit 32 stores the overlapping region in advance. For the overlapping region, the bit data conversion unit 32 calculates bit data from the image data in the first field and generates bit data with all the color levels set to zero in the second field. For the non-overlapping region, the bit data conversion unit 32 generates bit data from the image data in both the first field and the second field. Then, the light emission pattern conversion unit 33 causes each laser to emit light according to the bit data generated by the bit data conversion unit 32. This prevents the image in the overlapping region from becoming brighter than in the non-overlapping region.
  • Conversely, the luminance may be set to zero in the overlapping region of the first field, and the laser beam may be scanned with the luminance corresponding to the image data in the overlapping region of the second field.
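  • A minimal sketch of method A follows; `overlap_positions` and `image_bit_data` are hypothetical helpers standing in for the stored overlapping region and the image-derived levels:

```python
# Method A, sketched: emit the image-derived levels everywhere except the
# overlapping region of the second field, which is set to zero. (Swapping the
# roles of the two fields, as noted above, only changes the field test.)

def method_a_bit_data(pos, field, image_bit_data, overlap_positions):
    if field == 2 and pos in overlap_positions:
        return (0.0, 0.0, 0.0)       # all colour levels zero in the second field
    return image_bit_data(pos)       # luminance from the image data elsewhere
```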
  • Method B is a method of lowering the luminance of one of the fields in the overlapping region. That is, in method B the luminance in the overlapping region is not reduced to zero but is decreased by a predetermined ratio.
  • In method B, the bit data conversion unit 32 calculates bit data from the image data over the entire first field. For the overlapping region of the second field, the bit data conversion unit 32 also calculates bit data from the image data, but outputs bit data in which the level of each color is reduced by the predetermined ratio. That is, in the second field the laser light is scanned at a luminance obtained by reducing the luminance corresponding to the image data by the predetermined ratio, rather than being set uniformly to zero. For the non-overlapping region, the bit data conversion unit 32 generates bit data from the image data in both the first field and the second field.
  • Then, the light emission pattern conversion unit 33 causes each laser to emit light according to the bit data generated by the bit data conversion unit 32.
  • Conversely, the luminance may be lowered in the overlapping region of the first field, and the laser light may be scanned with the luminance corresponding to the image data in the overlapping region of the second field.
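  • A sketch of method B under the same assumptions follows; the value of the predetermined ratio is a placeholder, as the text does not specify it:

```python
# Method B, sketched: the overlapping region of the second field is dimmed by
# a predetermined ratio instead of being zeroed.

DIM_RATIO = 0.3  # hypothetical "predetermined ratio" by which levels are reduced

def method_b_bit_data(pos, field, image_bit_data, overlap_positions):
    r, g, b = image_bit_data(pos)
    if field == 2 and pos in overlap_positions:
        return (r * (1.0 - DIM_RATIO), g * (1.0 - DIM_RATIO), b * (1.0 - DIM_RATIO))
    return (r, g, b)
```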
  • Method C is a method in which the luminance of both fields is halved in the overlapping region.
  • FIG. 6 shows an example of scanning lines when the luminance of the scanning lines in the first and second fields is halved by the method C.
  • FIG. 6A shows a scanning line for the first field
  • FIG. 6B shows a scanning line for the second field
  • FIG. 6C shows a scanning line for the frame.
  • In method C, the laser beam is scanned at half the luminance corresponding to the image data in the overlapping region of the first field, and is likewise scanned at half the luminance corresponding to the image data in the overlapping region of the second field.
  • Specifically, for the overlapping region, the bit data conversion unit 32 calculates the bit data from the image data in both the first field and the second field and then generates bit data in which the level of each color is multiplied by 0.5. Then, the light emission pattern conversion unit 33 causes each laser to emit light according to the bit data generated by the bit data conversion unit 32. This also makes flicker in the overlapping region inconspicuous.
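  • A sketch of method C under the same assumptions follows; unlike methods A and B, the same correction is applied in both fields:

```python
# Method C, sketched: both fields emit half of the image-derived levels in the
# overlapping region, so the two passes together roughly match a single pass
# over the non-overlapping region.

def method_c_bit_data(pos, field, image_bit_data, overlap_positions):
    r, g, b = image_bit_data(pos)
    if pos in overlap_positions:        # `field` is not tested: both fields are halved
        return (r * 0.5, g * 0.5, b * 0.5)
    return (r, g, b)
```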
  • In the above methods, the light amount of the laser light determined based on the image data when scanning the non-overlapping region corresponds to the “first target light amount” of the present invention, and the light amount of the laser light determined based on the image data when scanning the overlapping region corresponds to the “second target light amount” of the present invention.
  • In the above examples, the bit data conversion unit 32 generates bit data whose luminance is corrected in the overlapping region, and the light emission pattern conversion unit 33 causes each laser to emit light according to that bit data. Instead, the bit data conversion unit 32 may calculate the bit data based on the image data regardless of whether the region is overlapping or non-overlapping, and the light emission pattern conversion unit 33 may correct the light amount of each laser for the overlapping region when converting the bit data into a signal indicating the light emission intensity of each laser.
  • For method A, for example, the light emission pattern conversion unit 33 stores the emission timings of the laser light corresponding to the overlapping region; in the entire first field and in the non-overlapping region of the second field it converts the bit data as they are into a signal representing the emission intensity of each laser, while in the overlapping region of the second field it generates a signal in which the emission of each laser becomes zero.
  • Conversely, the light emission pattern conversion unit 33 may generate a signal in which the light emission of each laser becomes zero in the overlapping region of the first field, and may convert the bit data as they are into a signal representing the light emission intensity of each laser in the non-overlapping region of the first field and in the entire second field.
  • For method B, the light emission pattern conversion unit 33 converts the bit data as they are into a signal representing the light emission intensity of each laser in the entire first field and in the non-overlapping region of the second field, and in the overlapping region of the second field it generates a signal in which the emission intensity of each laser indicated by the bit data is reduced by a predetermined ratio. Instead, the light emission pattern conversion unit 33 may generate a signal in which the light emission intensity of each laser indicated by the bit data is reduced by a predetermined ratio in the overlapping region of the first field, and may convert the bit data as they are into a signal representing the emission intensity of each laser in the non-overlapping region of the first field and in the entire second field.
  • For method C, the light emission pattern conversion unit 33 only has to generate, for both the first field and the second field, a signal indicating a light emission intensity of 0.5 times the light emission intensity of each laser indicated by the bit data in the overlapping region.
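  • The alternative placement can be sketched as below; `overlap_timings` is a hypothetical lookup of the stored emission timings, and the scale factor and field set select which of the above variants is reproduced:

```python
# Sketch of correcting at the emission-pattern stage: the bit data carry the
# uncorrected image-derived level, and the drive signal is scaled at emission
# timings stored as overlapping.

def emission_intensity(bit_level, timing, field, overlap_timings,
                       fields_to_correct={2}, scale=0.0):
    """scale 0.0 with fields_to_correct={2} reproduces the method A variant;
    a factor between 0 and 1 reproduces method B; scale 0.5 with
    fields_to_correct={1, 2} reproduces method C."""
    if field in fields_to_correct and timing in overlap_timings:
        return bit_level * scale
    return bit_level                  # bit data converted as-is elsewhere
```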
  • The region treated as the overlapping region is set as appropriate, for example as a region where the scanning lines of the first field and the second field partially overlap, a region where they overlap by more than half, or a region where they overlap completely.
  • one frame is composed of two fields, but one frame may be composed of three or more fields.
  • In that case, the image drawing apparatus 1 stores in advance the overlapping regions produced by scanning the fields; for an overlapping region, it is sufficient to scan the laser beam with the normal light amount based on the image data in one field and to scan with a corrected light amount in the other fields.
  • In the case of method A, scanning is performed at the luminance corresponding to the image data in the entire first field and in the non-overlapping regions of the second and third fields, and the luminance is set to zero in the overlapping regions of the second and third fields.
  • In the case of method B, scanning is performed at the luminance corresponding to the image data in the entire first field and in the non-overlapping regions of the second and third fields, and at a luminance reduced by a predetermined ratio in the overlapping regions of the second and third fields.
  • In the case of method C, scanning is performed at the luminance corresponding to the image data in the non-overlapping regions of the first to third fields, and at 1/3 of the luminance of the image data in the overlapping regions of the first to third fields.
  • the second embodiment is an example in which the present invention is applied to so-called Lissajous scanning.
  • the configuration of the image drawing apparatus according to the second embodiment is the same as that of the first embodiment, and a description thereof will be omitted.
  • the MEMS mirror 95 is resonantly driven in each of the main scanning direction and the sub-scanning direction, and an image is projected onto the screen 11 by performing Lissajous scanning with laser light.
  • An example of the Lissajous scanning lines is shown in FIG. 7. In the Lissajous scan, a large number of overlapping regions, in which a scanning line is scanned twice or more while one frame is drawn, are generated.
  • the bit data conversion unit 32 calculates bit data by referring to pixel data corresponding to the trajectory of the scanning line among the image data developed in the frame memory 4.
  • the bit data conversion unit 32 stores the overlap area in advance as in the first embodiment.
  • When method A is applied, the bit data conversion unit 32 generates bit data calculated from the image data for the first scan of an overlapping region, and generates bit data with all the color levels set to zero for the second and subsequent scans.
  • When method B is applied, the bit data conversion unit 32 generates bit data in which the level of each color is reduced by a predetermined ratio for the second and subsequent scans of an overlapping region during one frame period.
  • When method C is applied, the bit data conversion unit 32 stores in advance the number of scans made over each overlapping region during one frame period. When calculating the bit data corresponding to an overlapping region, it divides the luminance based on the image data by the number of scans. For example, for an overlapping region that is scanned three times during one frame period by the Lissajous scanning, scanning is performed at 1/3 of the luminance calculated based on the image data.
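  • A sketch of this divide-by-scan-count variant follows; the per-position scan counts are assumed to be precomputed, as the text describes:

```python
# Lissajous variant of method C, sketched: the image-derived levels are
# divided by the number of passes over the position in one frame period
# (e.g. 1/3 for a position scanned three times).

def lissajous_bit_data(pos, image_bit_data, scan_counts):
    """scan_counts: hypothetical dict mapping a scanning position to the
    number of times it is scanned during one frame period (default 1)."""
    r, g, b = image_bit_data(pos)
    n = scan_counts.get(pos, 1)
    return (r / n, g / n, b / n)
```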
  • In the above examples, the bit data conversion unit 32 corrects the luminance in the overlapping region when generating the bit data. Instead, as described in the first embodiment, the bit data conversion unit 32 may calculate the bit data based on the image data regardless of whether the region is overlapping or non-overlapping, and the light emission pattern conversion unit 33 may correct the amount of light emitted by each laser in the overlapping region when converting the bit data.
  • the present invention can be used in a projection apparatus that projects an image.
  • 1 Image drawing apparatus, 3 Video ASIC, 7 Laser driver, 8 MEMS driver, 9 Laser light source unit, 11 Screen, 95 MEMS mirror

Abstract

Provided is a projection device in which an image constituted by a plurality of continuous frames is input as the image to be projected, and the amount of light emitted from a light source is controlled on the basis of a target amount of light according to an image signal representing the image. The light emitted from the light source is scanned across the entire projection region during each frame period. A control means sets target amounts of light according to the image signal such that a second target amount of light, which is a target amount of light corresponding to an overlapping region where scanning is performed a plurality of times during one frame period, is less than a first target amount of light, which is a target amount of light corresponding to a non-overlapping region where scanning is performed once during one frame period.

Description

Projection device, projection method, program, and storage medium

The present invention relates to a technique for scanning laser light with a projection device.

Projection devices that display an image by projecting a light beam are known. For example, Patent Document 1 describes an image projection apparatus that raster scans laser light using a MEMS resonant mirror and projects an image onto a screen. This image projection apparatus adjusts the light emission intensity of the laser beam in accordance with changes in the horizontal raster scan speed so that the brightness of the projected image displayed on the screen is uniform.

Patent Document 1: JP 2006-343397 A

Patent Document 1 also discloses dividing one frame into two fields and performing interlaced scanning (so-called interlacing in television) in the vertical direction in the even and odd fields. In television interlacing, however, there is no point where the scanning lines of the even and odd fields overlap. In scanning with a MEMS mirror, by contrast, it is common to draw an image using both the forward and backward swings of the MEMS mirror. As a result, points where the scanning lines of the even field and the odd field overlap occur, and the luminance at those points becomes higher than in the surrounding area.

FIG. 1 shows an example in which interlaced scanning is performed over a plurality of fields by a MEMS mirror. FIG. 1(A) shows the scanning lines in the first field, and FIG. 1(B) shows the scanning lines in the second field. In this case, the scanning lines of the frame image are as shown in FIG. 1(C); the point indicated by the broken-line circle X is scanned in both the first field and the second field, and therefore has a higher luminance than the other points.

Examples of the problem to be solved by the present invention include the above. A main object of the present invention is to reduce luminance unevenness of a projected image when the image is projected by scanning laser light over a plurality of fields.
The invention described in the claims is a projection device that projects an image composed of a plurality of continuous frames onto a projection area, comprising: a light source that emits light; control means for controlling the amount of light emitted by the light source based on a target light amount corresponding to an image signal indicating the image to be projected; and scanning means for scanning the light emitted by the light source over the entire projection area during one frame period. The control means sets the target light amount corresponding to the image signal so that a second target light amount, which is the target light amount corresponding to an overlapping region scanned a plurality of times during the one frame period, is smaller than a first target light amount, which is the target light amount corresponding to a non-overlapping region scanned once during the one frame period by the scanning means.

The invention described in the claims is also a projection method executed by a projection device that has a light source and scanning means and projects an image composed of a plurality of continuous frames onto a projection area, comprising: a control step of controlling the amount of light emitted by the light source based on a target light amount corresponding to an image signal indicating the image to be projected; and a scanning step of scanning the light emitted by the light source over the entire projection area during one frame period. The control step sets the target light amount corresponding to the image signal so that a second target light amount, which is the target light amount corresponding to an overlapping region scanned a plurality of times during the one frame period, is smaller than a first target light amount, which is the target light amount corresponding to a non-overlapping region scanned once during the one frame period by the scanning means.

The invention described in the claims is also a program executed by a projection device that has a light source, scanning means, and a computer and projects an image composed of a plurality of continuous frames onto a projection area. The program causes the computer to execute a control step of controlling the amount of light emitted by the light source based on a target light amount corresponding to an image signal indicating the image to be projected, and a scanning step of scanning the light emitted by the light source over the entire projection area during one frame period. The control step sets the target light amount corresponding to the image signal so that a second target light amount, which is the target light amount corresponding to an overlapping region scanned a plurality of times during the one frame period, is smaller than a first target light amount, which is the target light amount corresponding to a non-overlapping region scanned once during the one frame period by the scanning means.

FIG. 1 shows an example of interlaced scanning according to the prior art. FIG. 2 shows the configuration of an image drawing apparatus according to the first embodiment. FIG. 3 shows an example of scanning lines formed by two fields. FIG. 4 illustrates a method for calculating bit data on a scanning line. FIG. 5 shows an example of scanning lines according to the first embodiment. FIG. 6 shows another example of scanning lines according to the first embodiment. FIG. 7 shows an example of scanning lines according to the second embodiment.
In a preferred embodiment of the present invention, a projection device that projects an image composed of a plurality of continuous frames onto a projection area includes: a light source that emits light; control means for controlling the amount of light emitted by the light source based on a target light amount corresponding to an image signal indicating the image to be projected; and scanning means for scanning the light emitted by the light source over the entire projection area during one frame period. The control means sets the target light amount corresponding to the image signal so that a second target light amount, which is the target light amount corresponding to an overlapping region scanned a plurality of times during the one frame period, is smaller than a first target light amount, which is the target light amount corresponding to a non-overlapping region scanned once during the one frame period by the scanning means.

In the above projection device, an image composed of a plurality of continuous frames is input as the image to be projected, and the amount of light emitted by the light source is controlled based on a target light amount corresponding to an image signal indicating that image. The light emitted by the light source is scanned over the entire projection area during one frame period. The control means sets the target light amount corresponding to the image signal so that the second target light amount, corresponding to an overlapping region scanned a plurality of times during one frame period, is smaller than the first target light amount, corresponding to a non-overlapping region scanned once during one frame period. This prevents the luminance of the overlapping region, which is scanned a plurality of times during one frame period, from becoming higher than that of the non-overlapping region.

In one aspect of the above projection device, the control means sets the target light amounts so that, during the one frame period, the combined light amount of the light scanned a plurality of times over the overlapping region substantially matches the light amount of the light scanned only once over the non-overlapping region. In this aspect, the total amount of light scanned during one frame period is the same in the overlapping and non-overlapping regions, so luminance unevenness of the image can be prevented.

In another aspect of the above projection device, the frame is composed of a plurality of fields, and the scanning means draws each field by raster scanning the light emitted by the light source, performing interlaced scanning between the plurality of fields. In this aspect, performing interlaced scanning over a plurality of fields prevents flickering of the projected image.

In a preferred example, the control means sets the second target light amount in at least one of the plurality of fields to zero.

In another preferred example, the scanning means draws the frame by Lissajous scanning of the light emitted by the light source.

In another preferred embodiment of the present invention, a projection method executed by a projection device that has a light source and scanning means and projects an image composed of a plurality of continuous frames onto a projection area includes: a control step of controlling the amount of light emitted by the light source based on a target light amount corresponding to an image signal indicating the image to be projected; and a scanning step of scanning the light emitted by the light source over the entire projection area during one frame period. The control step sets the target light amount corresponding to the image signal so that the second target light amount, corresponding to an overlapping region scanned a plurality of times during the one frame period, is smaller than the first target light amount, corresponding to a non-overlapping region scanned once during the one frame period by the scanning means. This method also prevents the luminance of the overlapping region from becoming higher than that of the non-overlapping region.

In another preferred embodiment of the present invention, a program executed by a projection device that has a light source, scanning means, and a computer and projects an image composed of a plurality of continuous frames onto a projection area causes the computer to execute a control step of controlling the amount of light emitted by the light source based on a target light amount corresponding to an image signal indicating the image to be projected, and a scanning step of scanning the light emitted by the light source over the entire projection area during one frame period. The control step sets the target light amount corresponding to the image signal so that the second target light amount, corresponding to an overlapping region scanned a plurality of times during the one frame period, is smaller than the first target light amount, corresponding to a non-overlapping region scanned once during the one frame period by the scanning means. By executing this program on a computer, the above projection device can be realized. This program can be stored in a storage medium.

Hereinafter, preferred embodiments of the present invention will be described with reference to the drawings.
[First Embodiment]

(Configuration of the image drawing apparatus)

FIG. 2 shows the configuration of an image drawing apparatus 1 to which the projection device according to the first embodiment is applied. As shown in FIG. 2, the image drawing apparatus 1 includes an image signal input unit 2, a video ASIC 3, a frame memory 4, a ROM 5, a RAM 6, a laser driver 7, a MEMS driver 8, and a laser light source unit 9. The image drawing apparatus 1 is used, for example, as a light source for a head-up display, and emits light constituting a display image toward an optical element such as a combiner.
The image signal input unit 2 receives an image signal input from the outside and outputs it to the video ASIC 3.

The video ASIC 3 is a block that controls the laser driver 7 and the MEMS driver 8 based on the image signal input from the image signal input unit 2 and the scanning position information Sc input from the MEMS mirror 95, and is configured as an ASIC (Application Specific Integrated Circuit). The video ASIC 3 includes a synchronization/image separation unit 31, a bit data conversion unit 32, a light emission pattern conversion unit 33, and a timing controller 34.

The synchronization/image separation unit 31 separates the image data to be displayed on the screen 11 and the synchronization signal from the image signal input from the image signal input unit 2, and writes the image data to the frame memory 4.

The bit data conversion unit 32 reads the image data written in the frame memory 4 and converts it into bit data.

The light emission pattern conversion unit 33 converts the bit data converted by the bit data conversion unit 32 into a signal representing the light emission pattern of each laser.

The timing controller 34 controls the operation timing of the synchronization/image separation unit 31 and the bit data conversion unit 32. The timing controller 34 also controls the operation timing of the MEMS driver 8 described later.

The image data separated by the synchronization/image separation unit 31 is written into the frame memory 4. The ROM 5 stores a control program and data for operating the video ASIC 3. Various data are sequentially read from and written to the RAM 6, which serves as a work memory when the video ASIC 3 operates.
 レーザドライバ7は、発光パターン変換部33が出力する信号に基づき、レーザ光源部9に設けられるレーザダイオードを駆動する信号を生成する。レーザドライバ7は、赤色レーザ駆動回路71と、青色レーザ駆動回路72と、緑色レーザ駆動回路73と、を備える。 The laser driver 7 generates a signal for driving a laser diode provided in the laser light source unit 9 based on the signal output from the light emission pattern conversion unit 33. The laser driver 7 includes a red laser driving circuit 71, a blue laser driving circuit 72, and a green laser driving circuit 73.
 赤色レーザ駆動回路71は、発光パターン変換部33が出力する信号に基づき、赤色レーザLD1を駆動する。青色レーザ駆動回路72は、発光パターン変換部33が出力する信号に基づき、青色レーザLD2を駆動する。緑色レーザ駆動回路73は、発光パターン変換部33が出力する信号に基づき、緑色レーザLD3を駆動する。 The red laser driving circuit 71 drives the red laser LD1 based on the signal output from the light emission pattern conversion unit 33. The blue laser drive circuit 72 drives the blue laser LD2 based on the signal output from the light emission pattern conversion unit 33. The green laser drive circuit 73 drives the green laser LD3 based on the signal output from the light emission pattern conversion unit 33.
 MEMSドライバ8は、タイミングコントローラ34が出力する信号に基づきMEMSミラー95を制御する。MEMSドライバ8は、サーボ回路と、ドライバ回路と、を備える。サーボ回路は、タイミングコントローラ34からの信号に基づき、MEMSミラー95の動作を制御する。ドライバ回路は、サーボ回路が出力するMEMSミラー95の制御信号を所定レベルに増幅して出力する。 The MEMS driver 8 controls the MEMS mirror 95 based on a signal output from the timing controller 34. The MEMS driver 8 includes a servo circuit and a driver circuit. The servo circuit controls the operation of the MEMS mirror 95 based on the signal from the timing controller 34. The driver circuit amplifies the control signal for the MEMS mirror 95 output from the servo circuit to a predetermined level and outputs the amplified signal.
 レーザ光源部9は、レーザドライバ7から出力される駆動信号に基づいて、レーザ光を出射する。具体的には、レーザ光源部9は、赤色レーザLD1と、青色レーザLD2と、緑色レーザLD3と、コリメータレンズ91a~91cと、反射ミラー92a~92cと、を備える。 The laser light source unit 9 emits laser light based on the drive signal output from the laser driver 7. Specifically, the laser light source unit 9 includes a red laser LD1, a blue laser LD2, a green laser LD3, collimator lenses 91a to 91c, and reflection mirrors 92a to 92c.
 赤色レーザLD1は赤色のレーザ光(「赤色レーザ光Lr」とも呼ぶ。)を出射し、青色レーザLD2は青色のレーザ光(「青色レーザ光Lb」とも呼ぶ。)を出射し、緑色レーザLD3は緑色のレーザ光(「緑色レーザ光Lg」とも呼ぶ。)を出射する。コリメータレンズ91a~91cは、それぞれ、赤色、青色及び緑色のレーザ光Lr、Lb、Lgを平行光にして、反射ミラー92a~92cに出射する。反射ミラー92bは、青色レーザ光Lbを反射する。反射ミラー92cは、青色レーザ光Lbを透過させ、緑色レーザ光Lgを反射する。反射ミラー92aは、赤色レーザ光Lrを透過させ、青色及び緑色のレーザ光Lb、Lgを反射する。こうして反射ミラー92aを透過した赤色レーザ光Lr及び反射ミラー92aで反射された青色及び緑色のレーザ光Lb、Lgは、MEMSミラー95に入射される。 The red laser LD1 emits red laser light (also referred to as “red laser light Lr”), the blue laser LD2 emits blue laser light (also referred to as “blue laser light Lb”), and the green laser LD3. Green laser light (also referred to as “green laser light Lg”) is emitted. The collimator lenses 91a to 91c convert the red, blue, and green laser beams Lr, Lb, and Lg into parallel beams and emit them to the reflection mirrors 92a to 92c, respectively. The reflection mirror 92b reflects the blue laser light Lb. The reflection mirror 92c transmits the blue laser light Lb and reflects the green laser light Lg. The reflection mirror 92a transmits the red laser beam Lr and reflects the blue and green laser beams Lb and Lg. The red laser light Lr transmitted through the reflection mirror 92 a and the blue and green laser lights Lb and Lg reflected by the reflection mirror 92 a are incident on the MEMS mirror 95.
 The MEMS mirror 95 raster-scans the laser light L incident from the reflection mirror 92a over the screen 11. To display the image input to the image signal input unit 2, the MEMS mirror 95 oscillates under the control of the MEMS driver 8 so as to scan across the screen 11, and it outputs its scanning position information at that time (for example, the mirror angle) to the video ASIC 3 as scanning position information Sc.
 In the above configuration, the lasers LD1 to LD3 are an example of the light source of the present invention, the video ASIC 3 is an example of the control means of the present invention, and the MEMS mirror 95 and the MEMS driver 8 are an example of the scanning means of the present invention.
 (Scanning method)
 Next, the scanning method using the MEMS mirror 95 is described. In this embodiment, one frame image is composed of two field images, and interlaced scanning in the vertical direction (the interlace used in television) is performed between the first field and the second field. That is, the MEMS mirror 95 draws one frame image by scanning the two field images, the first field and the second field, on the screen 11.
 FIG. 3 shows an example in which one frame image is drawn from two fields. In the figure, the horizontal direction (X direction) is called the "main scanning direction" and the vertical direction (Y direction) is called the "sub-scanning direction".
 FIG. 3(A) shows the scanning lines F1 of the first field, and FIG. 3(B) shows the scanning lines F2 of the second field. As shown in FIGS. 3(A) and 3(B), the laser light L from the MEMS mirror 95 is swept along the sub-scanning direction while oscillating in the main scanning direction. As shown in FIG. 3(C), the frame image is composed of the first-field scanning lines F1 and the second-field scanning lines F2.
 Next, the amount of laser light scanned in this way is described. The amount of laser light scanned by the MEMS mirror 95 is given by the bit data generated by the bit data conversion unit 32. The bit data is the basis for determining how much light the lasers should emit at each scanning position of the MEMS mirror 95. Because the MEMS mirror 95 is driven sinusoidally in the main scanning direction while being swept at a predetermined speed in the sub-scanning direction, the scanning lines traced in the main scanning direction are not parallel to the horizontal rows of the image data. The bit data therefore has to be computed by referring to the pixels of the image data stored in the frame memory 4 that lie along the trajectory of the scanning line.
 FIG. 4 shows the relationship between pixel data and the bit data at scanning positions on a scanning line. Each square represents a pixel of the image data stored in the frame memory 4, and the numbers inside each pixel are the pixel number and the R, G, and B level values. The arrow F indicates the trajectory of the scanning line. The bit data conversion unit 32 refers to the pixel data of the pixels lying along the trajectory of the scanning line F and calculates the bit data B1, B2, B3, and so on.
 Specifically, in the example of FIG. 4, the spot of the laser light scanned at scanning position B1 straddles pixels 1-1, 2-1, and 2-2, so the pixel data of pixels 1-1, 2-1, and 2-2 are referred to when calculating the bit data for B1. If 70% of the spot at scanning position B1 falls on pixel 2-1, 20% on pixel 1-1, and 10% on pixel 2-2, the bit data for B1 is computed as the weighted average B1(R, G, B) = (0, 26, 0). Similarly, if 80% of the spot at scanning position B2 falls on pixel 2-2 and 20% on pixel 1-2, the bit data for B2 is computed as B2(R, G, B) = (0, 204, 0).
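 By way of illustration only, the weighted average described above can be written as a short routine. The following Python sketch is not part of the disclosed embodiment; the function name, the pixel levels, and the coverage weights are assumptions chosen merely to show the computation, not the values of FIG. 4.

    # Minimal sketch of the weighted-average bit-data computation (illustrative only).
    # Each scanning position records which pixels its laser spot covers and by how much.

    def bit_data(pixels, coverage):
        """pixels: dict pixel_id -> (R, G, B); coverage: dict pixel_id -> weight summing to 1."""
        r = sum(coverage[p] * pixels[p][0] for p in coverage)
        g = sum(coverage[p] * pixels[p][1] for p in coverage)
        b = sum(coverage[p] * pixels[p][2] for p in coverage)
        return (round(r), round(g), round(b))

    # Hypothetical pixel levels; the figure's actual values are not reproduced here.
    pixels = {"1-1": (0, 10, 0), "2-1": (0, 30, 0), "2-2": (0, 20, 0)}
    coverage_B1 = {"2-1": 0.7, "1-1": 0.2, "2-2": 0.1}  # 70% / 20% / 10% spot coverage
    print(bit_data(pixels, coverage_B1))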
 Because the trajectory of the MEMS mirror 95 is fixed for the first field and for the second field, it is determined in advance which pixel data of the image data are used, and with what weights, to compute the bit data for each scanning position of the MEMS mirror 95. In this way, the bit data conversion unit 32 calculates the bit data corresponding to each scanning position.
 The light emission pattern conversion unit 33 converts the bit data generated by the bit data conversion unit 32 into signals representing the emission pattern of each laser. Because the MEMS mirror 95 is driven sinusoidally in the main scanning direction, the scanning speed is highest at the center of the image display area and lowest at its two ends. If laser light of the same power were scanned everywhere, the image would therefore tend to be brighter toward the ends of the image display area. To correct this, when converting the bit data into signals representing the emission intensity of each laser, the light emission pattern conversion unit 33 converts them so that the output becomes darker toward the ends of the image display area. As a result, an image projected from bit data of equal brightness has the same brightness at the center and at the ends of the image display area.
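 One way this edge darkening could be realized, sketched below in Python, is to scale the emitted power in proportion to the local horizontal scan speed of the sinusoidal drive. This is only an assumption for illustration: the embodiment states only that the ends are made darker, and the gain floor and function names below are hypothetical.

    import math

    # Possible edge-brightness compensation, assuming emitted power is scaled in
    # proportion to the local horizontal scan speed of a sinusoidal mirror drive.
    # phase runs from -pi/2 (left edge) to +pi/2 (right edge); x = A * sin(phase).

    def edge_gain(phase, floor=0.1):
        """Relative gain at this horizontal phase.

        Speed |dx/dt| is proportional to |cos(phase)|: maximal at the center and
        near zero at the turnaround points, so the gain is clamped to a floor
        (the visible image area normally excludes the extreme turnarounds)."""
        return max(abs(math.cos(phase)), floor)

    def compensated_intensity(bit_level, phase):
        return bit_level * edge_gain(phase)

    print(compensated_intensity(204, 0.0))        # center: full level
    print(compensated_intensity(204, math.pi / 3))  # toward the edge: dimmed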
 The laser light source unit 9 scans the screen 11 with laser light L of the light amount indicated by the signals output from the light emission pattern conversion unit 33. Here, the amount of laser light emitted on the basis of the bit data at each scanning position on a scanning line is called the "target light amount". That is, the target light amount is the light amount corresponding to the brightness of each scanning position obtained from the image data, and it takes a different value for each scanning position on the scanning line depending on the image data.
 (Interlaced scanning)
 Next, interlaced scanning in this embodiment is described. When the light beam is interlace-scanned by the MEMS mirror 95, there are regions crossed by a scanning line only once per frame period, in either the first field or the second field (hereinafter called "non-overlapping regions"), and regions crossed by scanning lines in both the first field and the second field, that is, regions scanned twice per frame (hereinafter called "overlapping regions"). In an overlapping region, which is scanned twice per frame, the amount of laser light projected per unit time increases, so the image becomes brighter there than in the non-overlapping regions.
 In this embodiment, this is addressed by one of the following methods.
 (1) Method A
 Method A sets the brightness of one of the fields to zero in the overlapping region. FIG. 5 shows an example of scanning lines when the brightness of the second-field scanning lines is set to zero by method A. FIG. 5(A) shows the scanning lines of the first field, FIG. 5(B) those of the second field, and FIG. 5(C) those of the frame. As shown, in the first field the laser light is scanned with the brightness determined by the image data. In the second field, the laser light is scanned with the brightness determined by the image data in the non-overlapping region, but its brightness is set to zero in the overlapping region.
 Specifically, the bit data conversion unit 32 stores the overlapping region in advance. For the overlapping region, the bit data conversion unit 32 computes the bit data from the image data in the first field, and in the second field it generates bit data with all color levels set to zero. For the non-overlapping region, the bit data conversion unit 32 generates the bit data from the image data in both the first and second fields. The light emission pattern conversion unit 33 then makes each laser emit according to the bit data generated by the bit data conversion unit 32. This prevents the image in the overlapping region from becoming brighter than in the non-overlapping region.
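 A minimal Python sketch of method A follows, assuming a precomputed flag indicating whether a scanning position lies in the overlapping region; the names are hypothetical and the sketch is illustrative only.

    # Sketch of method A: in the overlapping region, the second field's bit data is zeroed.
    # `in_overlap` comes from a precomputed lookup of scanning positions in the overlap
    # (the embodiment stores the overlapping region in advance).

    def method_a(bit_rgb, field_index, in_overlap):
        """bit_rgb: (R, G, B) computed from the image data for this scanning position."""
        if in_overlap and field_index == 2:   # either field may be the zeroed one
            return (0, 0, 0)
        return bit_rgb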
 Instead of setting the brightness to zero in the overlapping region of the second field as shown in FIG. 5, the brightness may be set to zero in the overlapping region of the first field, and the laser light may be scanned with the brightness corresponding to the image data in the overlapping region of the second field.
 (2) Method B
 Method B lowers the brightness of one of the fields in the overlapping region. That is, in method B the brightness in the overlapping region is not set to zero but is reduced by a predetermined ratio.
 Specifically, for the overlapping region, the bit data conversion unit 32 computes the bit data from the image data in the first field. It also computes bit data from the image data for the overlapping region of the second field, but outputs bit data in which each color level is darkened by a predetermined ratio. That is, in the second field the brightness is not uniformly set to zero; instead, the laser light is scanned with the brightness corresponding to the image data reduced by the predetermined ratio. For the non-overlapping region, the bit data conversion unit 32 generates the bit data from the image data in both the first and second fields. The light emission pattern conversion unit 33 then makes each laser emit according to the bit data generated by the bit data conversion unit 32. In the overlapping region of the second field, light of the same hue as in the first field is thus still scanned, albeit at lower brightness. This makes flicker less noticeable while also making it less noticeable that the overlapping region is brighter than the non-overlapping region.
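 A corresponding sketch for method B, again with assumed names and an assumed value for the predetermined ratio, which the embodiment does not fix:

    # Sketch of method B: in the overlapping region, the second field keeps the image
    # hue but each color level is scaled down by a predetermined ratio.

    DIM_RATIO = 0.3  # hypothetical value; the embodiment only says "a predetermined ratio"

    def method_b(bit_rgb, field_index, in_overlap, ratio=DIM_RATIO):
        if in_overlap and field_index == 2:   # the dimmed field could equally be the first
            return tuple(round(c * ratio) for c in bit_rgb)
        return bit_rgb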
 Instead of lowering the brightness in the overlapping region of the second field, the brightness may be lowered in the overlapping region of the first field, and the laser light may be scanned with the brightness corresponding to the image data in the overlapping region of the second field.
 (3) Method C
 Method C halves the brightness of both fields in the overlapping region. FIG. 6 shows an example of scanning lines when the brightness of the first- and second-field scanning lines is halved by method C. FIG. 6(A) shows the scanning lines of the first field, FIG. 6(B) those of the second field, and FIG. 6(C) those of the frame. As shown, in the overlapping region of the first field the laser light is scanned at half the brightness determined by the image data, and in the overlapping region of the second field the laser light is likewise scanned at half that brightness.
 Specifically, the bit data conversion unit 32 computes the bit data from the image data in both the first and second fields, and then, for the overlapping region, generates bit data with every color level multiplied by 0.5. The light emission pattern conversion unit 33 then makes each laser emit according to the bit data generated by the bit data conversion unit 32. This makes flicker in the overlapping region less noticeable.
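 A corresponding sketch for method C, illustrative only, with the same assumed names as above:

    # Sketch of method C: both fields are scanned in the overlapping region at half
    # the image-data brightness, so their combined light roughly matches one pass.

    def method_c(bit_rgb, in_overlap):
        if in_overlap:
            return tuple(round(0.5 * c) for c in bit_rgb)
        return bit_rgb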
 In the above methods A to C, the laser light amount determined from the image data when scanning a non-overlapping region corresponds to the "first target light amount" of the present invention, and the laser light amount determined from the image data when scanning an overlapping region corresponds to the "second target light amount" of the present invention.
 In methods A to C above, the bit data conversion unit 32 generates bit data whose brightness is corrected in the overlapping region, and the light emission pattern conversion unit 33 makes each laser emit according to that bit data. Alternatively, the bit data conversion unit 32 may compute the bit data from the image data regardless of whether a position lies in an overlapping or non-overlapping region, and the light emission pattern conversion unit 33 may correct the light amount of each laser in the overlapping region when it converts the bit data into signals representing the emission intensity of each laser.
 Specifically, in the case of method A, the light emission pattern conversion unit 33 stores the laser emission timings corresponding to the overlapping region; for the whole of the first field and the non-overlapping region of the second field it converts the bit data as-is into signals representing the emission intensity of each laser, and for the overlapping region of the second field it generates signals that set the emission of each laser to zero. Alternatively, the light emission pattern conversion unit 33 may generate signals that set the emission of each laser to zero in the overlapping region of the first field, and convert the bit data as-is into emission intensity signals for the non-overlapping region of the first field and the whole of the second field.
 In the case of method B, the light emission pattern conversion unit 33 converts the bit data as-is into signals representing the emission intensity of each laser for the whole of the first field and the non-overlapping region of the second field, and for the overlapping region of the second field it generates signals in which the emission intensity of each laser indicated by the bit data is reduced by a predetermined ratio. Alternatively, it may generate signals in which the emission intensity indicated by the bit data is reduced by the predetermined ratio in the overlapping region of the first field, and convert the bit data as-is into emission intensity signals for the non-overlapping region of the first field and the whole of the second field.
 In the case of method C, the light emission pattern conversion unit 33 may, for both the first field and the second field, generate signals in the overlapping region that indicate an emission intensity of 0.5 times the emission intensity of each laser indicated by the bit data.
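 This alternative placement of the correction can be pictured as a per-field gain table indexed by emission timing (that is, by scanning position), applied when the emission-intensity signal is produced rather than when the bit data is generated. The table layout and names in the sketch below are assumptions, illustrative only.

    # Sketch of the alternative: the bit data is left uncorrected, and a per-field gain
    # table indexed by emission timing is applied at the emission-intensity stage.
    # gain_table[field_index][timing] is 1.0 outside the overlap; inside it, it is
    # 0.0, a predetermined ratio, or 0.5 depending on which of methods A to C is used.

    def to_emission_signal(bit_rgb, field_index, timing, gain_table):
        g = gain_table[field_index][timing]
        return tuple(c * g for c in bit_rgb)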
 The region defined as the overlapping region may be a region where the first-field and second-field scanning lines overlap even partially, a region where they overlap by half or more, or a region where they overlap completely, and is set as appropriate in the design.
 (Modification)
 In the first embodiment above, one frame is composed of two fields, but one frame may be composed of three or more fields. In this case as well, the image drawing apparatus 1 stores in advance the overlapping regions produced by scanning the fields and, for an overlapping region, scans laser light of the normal light amount based on the image data in one field and scans with a corrected light amount in the other fields.
 For example, consider the case where one frame is composed of three fields. With method A, scanning is performed at the brightness corresponding to the image data over the whole of the first field and over the non-overlapping regions of the second and third fields, and the brightness is set to zero in the overlapping regions of the second and third fields. With method B, scanning is performed at the brightness corresponding to the image data over the whole of the first field and over the non-overlapping regions of the second and third fields, and at a brightness reduced by a predetermined ratio in the overlapping regions of the second and third fields. With method C, scanning is performed at the brightness corresponding to the image data in the non-overlapping regions of the first to third fields, and at one third of the image-data brightness in the overlapping regions of the first to third fields.
 [Second Embodiment]
 The second embodiment is an example in which the present invention is applied to so-called Lissajous scanning. The configuration of the image drawing apparatus according to the second embodiment is the same as in the first embodiment, so its description is omitted.
 In the second embodiment, the MEMS mirror 95 is resonantly driven in both the main scanning direction and the sub-scanning direction, and an image is projected onto the screen 11 by Lissajous-scanning the laser light. FIG. 7 shows an example of Lissajous scanning lines. In Lissajous scanning, many overlapping regions arise in which a scanning line passes two or more times while one frame is drawn.
 In the case of Lissajous scanning as well, the bit data conversion unit 32 computes the bit data by referring to the pixels of the image data stored in the frame memory 4 that lie along the trajectory of the scanning line.
 Specifically, the bit data conversion unit 32 stores the overlapping regions in advance, as in the first embodiment. In the method corresponding to method A of the first embodiment, the bit data conversion unit 32 generates, for an overlapping region, the bit data computed from the image data on the first pass, and bit data with all color levels set to zero on the second and subsequent passes.
 In the method corresponding to method B of the first embodiment, the bit data conversion unit 32 generates, on the second and subsequent passes over an overlapping region during one frame period, bit data in which each color level is lowered by a predetermined ratio.
 In the method corresponding to method C of the first embodiment, the bit data conversion unit 32 stores in advance how many times each overlapping region is scanned during one frame period. When computing the bit data corresponding to an overlapping region, the bit data conversion unit 32 calculates bit data in which the brightness based on the image data is divided by the number of passes. For example, for an overlapping region that is scanned three times during one frame period by the Lissajous scan, the bit data conversion unit 32 scans at one third of the brightness computed from the image data.
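 The per-region pass counts used by this method can be tabulated offline from the fixed Lissajous trajectory. The Python sketch below is illustrative only: the frequency ratio, grid resolution, and sampling step are assumed values, and it ignores the finite spot size by assigning each sample to a single grid cell.

    import math

    # Offline sketch: count how many times a fixed Lissajous trajectory passes through
    # each cell of a coarse grid during one frame, then derive the method-C scale 1/n.
    FX, FY = 17, 15        # horizontal / vertical resonant frequency ratio (assumed)
    W, H = 64, 48          # grid resolution used to tabulate overlaps (assumed)
    SAMPLES = 20000        # samples over one frame period (assumed)

    visits = {}
    prev = None
    for i in range(SAMPLES):
        t = 2 * math.pi * i / SAMPLES
        cell = (int((math.sin(FX * t) * 0.5 + 0.5) * (W - 1)),
                int((math.sin(FY * t) * 0.5 + 0.5) * (H - 1)))
        if cell != prev:                      # count a pass only when the cell changes
            visits[cell] = visits.get(cell, 0) + 1
            prev = cell

    def method_c_scale(cell):
        """Method-C brightness factor: 1/n for a cell passed n times per frame."""
        return 1.0 / max(visits.get(cell, 1), 1)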
 In the description above, the bit data conversion unit 32 corrects the brightness in the overlapping regions when generating the bit data. Alternatively, as described in the first embodiment, the bit data conversion unit 32 may compute the bit data from the image data regardless of whether a position lies in an overlapping or non-overlapping region, and the light emission pattern conversion unit 33 may correct the light amount of each laser in the overlapping regions when it converts the bit data into signals representing the emission intensity of each laser.
 Industrial Applicability
 The present invention can be used in a projection device that projects an image.
 1 Image drawing apparatus
 3 Video ASIC
 7 Laser driver
 8 MEMS driver
 9 Laser light source unit
 11 Screen
 95 MEMS mirror

Claims (8)

  1.  A projection device that projects an image composed of a plurality of consecutive frames onto a projection region, comprising:
     a light source that emits light;
     control means for controlling the amount of light emitted by the light source based on a target light amount corresponding to an image signal indicating the image to be projected; and
     scanning means for scanning the light emitted by the light source over the entire projection region during one frame period,
     wherein the control means sets the target light amount corresponding to the image signal such that a second target light amount, which is the target light amount corresponding to an overlapping region scanned a plurality of times during the one frame period by the scanning means, is smaller than a first target light amount, which is the target light amount corresponding to a non-overlapping region scanned once during the one frame period.
  2.  The projection device according to claim 1, wherein the control means sets the target light amounts such that, during the one frame period, the combined light amount of the light scanned a plurality of times over the overlapping region substantially matches the light amount of the light scanned once over the non-overlapping region.
  3.  The projection device according to claim 1 or 2, wherein the frame is composed of a plurality of fields, and the scanning means draws each field by raster-scanning the light emitted by the light source and performs interlaced scanning between the plurality of fields.
  4.  The projection device according to claim 3, wherein the control means sets the second target light amount in at least one of the plurality of fields to zero.
  5.  The projection device according to claim 1 or 2, wherein the scanning means draws the frame by Lissajous-scanning the light emitted by the light source.
  6.  A projection method executed by a projection device that has a light source and scanning means and projects an image composed of a plurality of consecutive frames onto a projection region, the method comprising:
     a control step of controlling the amount of light emitted by the light source based on a target light amount corresponding to an image signal indicating the image to be projected; and
     a scanning step of scanning the light emitted by the light source over the entire projection region during one frame period,
     wherein the control step sets the target light amount corresponding to the image signal such that a second target light amount, which is the target light amount corresponding to an overlapping region scanned a plurality of times during the one frame period by the scanning means, is smaller than a first target light amount, which is the target light amount corresponding to a non-overlapping region scanned once during the one frame period.
  7.  A program executed by a projection device that has a light source, scanning means, and a computer and projects an image composed of a plurality of consecutive frames onto a projection region, the program causing the computer to execute:
     a control step of controlling the amount of light emitted by the light source based on a target light amount corresponding to an image signal indicating the image to be projected; and
     a scanning step of scanning the light emitted by the light source over the entire projection region during one frame period,
     wherein the control step sets the target light amount corresponding to the image signal such that a second target light amount, which is the target light amount corresponding to an overlapping region scanned a plurality of times during the one frame period by the scanning means, is smaller than a first target light amount, which is the target light amount corresponding to a non-overlapping region scanned once during the one frame period.
  8.  A storage medium storing the program according to claim 7.