WO2011102299A1 - Control device and projection-type image display device - Google Patents

Control device and projection-type image display device

Info

Publication number
WO2011102299A1
Authority
WO
WIPO (PCT)
Prior art keywords
focus
unit
evaluation value
screen
image
Prior art date
Application number
PCT/JP2011/052913
Other languages
French (fr)
Japanese (ja)
Inventor
史教 水野
直幸 二村
俊朗 中莖
善光 野口
Original Assignee
三洋電機株式会社 (Sanyo Electric Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 三洋電機株式会社 (Sanyo Electric Co., Ltd.)
Publication of WO2011102299A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/53Means for automatic focusing, e.g. to compensate thermal effects
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/142Adjusting of projection optics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3191Testing thereof
    • H04N9/3194Testing thereof including sensor feedback

Definitions

  • the present invention relates to a focus adjustment technique in a projection display apparatus.
  • Patent Document 1 discloses a liquid crystal projector that outputs a video signal of a focus adjustment pattern on a screen.
  • In Patent Document 1 described above, the user needs to operate an adjustment switch to project the focus adjustment pattern in order to display focus in-focus information on the screen. It would be convenient if the user's intention to adjust the focus manually could be detected, and the focus in-focus information projected and displayed on the screen, without any special operation such as pressing a dedicated button or selecting an item from a predetermined menu.
  • The present invention has been made in view of such circumstances, and an object thereof is to provide a technique for automatically projecting and displaying focus in-focus information on a screen when a user starts a manual focus adjustment operation.
  • An aspect of the present invention is a control device mounted on a projection display apparatus that includes a projection unit that projects an image on a screen via a lens, an imaging unit that images the screen, and a focus adjustment unit that is provided in the projection unit and is operated manually.
  • This device includes: a focus evaluation value calculation unit that acquires a captured image captured by the imaging unit and calculates, as a focus evaluation value, a characteristic value that changes according to the focus state of the image projected on the screen; an evaluation value change detection unit that determines, from a change in the focus evaluation value, that the focus adjustment unit has been operated by the user; and a focus adjustment assist unit that, when it is determined that the focus adjustment unit has been operated, projects and displays focus in-focus information for assisting focus adjustment together with the image.
  • Another aspect of the present invention is a projection display apparatus. This apparatus includes a projection unit that projects an image on a screen via a lens, an imaging unit for imaging the screen, and the control device described above.
  • According to the present invention, focus in-focus information can be projected and displayed on the screen when the user starts a manual focus adjustment operation.
  • FIG. 1 is a diagram showing the positional relationship between a projection display apparatus according to one embodiment of the present invention and a screen. FIG. 2 is a diagram showing the configuration of that projection display apparatus, and subsequent flowcharts explain its operation.
  • FIG. 8A is a diagram illustrating an example of a relationship between a zoom operation detection image projected on a screen and an imaging range of an imaging unit.
  • FIG. 8B is a diagram showing another example of the relationship between the zoom operation detection image projected on the screen and the imaging range of the imaging unit. FIG. 9 is a diagram showing an example of the change in the focus evaluation value during focus adjustment.
  • FIG. 10A is a diagram illustrating an example of a graph at the start of focus adjustment.
  • FIG. 10B is a diagram illustrating an example of a graph in a case where the focus evaluation value is smaller than the maximum value in the period in which the target value to be taken as the focus evaluation value is not detected.
  • FIG. 10C is a diagram illustrating an example of a graph in the case where the focus evaluation value decreases in the period after the target value to be taken as the focus evaluation value is detected.
  • FIG. 10D is a diagram illustrating an example of a graph in the case where the focus evaluation value increases in a period after the target value to be taken as the focus evaluation value is detected. FIG. 11 is a diagram showing a table in which combinations of the focus evaluation value and the target value are associated with display colors of the focus state information. FIG. 12 is a diagram showing the state transitions of the display color of the focus state information. FIG. 13 is a flowchart mainly showing the flow of processing of the control unit according to the second embodiment.
  • FIG. 14A is a diagram showing a pattern image mainly used for focus adjustment.
  • FIG. 14B is a diagram illustrating an example of a combination of a pattern image used for focus adjustment and a pattern image used for zoom operation detection. FIG. 15 is a diagram showing the positional relationship between a projection display apparatus according to a third embodiment of the present invention and a screen. FIG. 16 is a diagram showing an example of a captured image captured by the imaging unit in the positional relationship shown in FIG. 15. FIG. 17 is a diagram showing the configuration of the projection display apparatus according to the third embodiment. FIG. 18 is a diagram showing how a detection area is set in the screen area in the captured image.
  • FIG. 25A shows vertical trapezoidal distortion correction.
  • FIG. 25B shows horizontal trapezoidal distortion correction. A further figure shows how the test pattern for trapezoidal distortion detection appears in the screen area of the captured image.
  • FIG. 1 is a diagram showing a positional relationship between a projection display apparatus 200 and a screen 300 according to a first embodiment of the present invention.
  • the projection display apparatus 200 according to the first embodiment includes an imaging unit 30 for photographing the direction of the screen 300.
  • the imaging unit 30 is installed such that the optical axis center thereof and the optical axis center of the projection light projected from the projection display apparatus 200 have a parallel relationship, for example.
  • the screen 300 faces the projection display apparatus 200.
  • Focus adjustment of the projection display apparatus 200 is performed by manually moving a focus ring provided at the front of the lens. In order to assist this focus adjustment, the projection display apparatus 200 projects and displays focus in-focus information on the screen 300. The user can adjust the focus appropriately by moving the focus ring while referring to the focus in-focus information.
  • FIG. 2 is a diagram showing a configuration of the projection display apparatus 200.
  • the projection display apparatus 200 includes a projection unit 10, an imaging unit 30, and a control unit 100.
  • the control unit 100 includes a focus evaluation unit 40, a video signal evaluation unit 50, a focus target calculation unit 60, a focus adjustment assist unit 70, a video signal setting unit 82, and an image memory 84.
  • The configuration of the control unit 100 can be realized in hardware by the CPU, memory, and other LSIs of an arbitrary computer, and in software by a program loaded into memory; the figure depicts functional blocks realized by their cooperation. Accordingly, those skilled in the art will understand that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof.
  • the projection unit 10 projects an image on screen 300.
  • the projection unit 10 includes a light source 11, a light modulation unit 12, and a focus lens 13.
  • As the light source 11, a halogen lamp having a filament-type electrode structure, a metal halide lamp having an electrode structure for generating an arc discharge, a xenon short-arc lamp, a high-pressure mercury lamp, an LED lamp, or the like can be employed.
  • the light modulator 12 modulates the light incident from the light source 11 in accordance with the video signal set by the video signal setting unit 82.
  • As the light modulator 12, a DMD (Digital Micromirror Device) can be employed.
  • the DMD includes a plurality of micromirrors corresponding to the number of pixels, and generates desired video light by controlling the direction of each micromirror according to each pixel signal.
  • the focus lens 13 adjusts the focal position of the light incident from the light modulator 12.
  • the focus lens 13 is provided with a focus ring 15, and the lens position is moved on the optical axis by manually rotating the focus ring.
  • the image light generated by the light modulation unit 12 is projected onto the screen 300 via the focus lens 13. Any device other than the focus ring may be used as long as the lens position is moved on the optical axis.
  • the imaging unit 30 captures the screen 300 and a projected image projected on the screen 300 as a main subject.
  • the imaging unit 30 includes a solid-state imaging device 31 and a signal processing circuit 32.
  • As the solid-state imaging device 31, a CMOS (Complementary Metal Oxide Semiconductor) image sensor, a CCD (Charge Coupled Device) image sensor, or the like can be employed.
  • the signal processing circuit 32 performs various signal processing such as A / D conversion and conversion from the RGB format to the YUV format on the signal output from the solid-state imaging device 31, and outputs the signal to the control unit 100.
  • the focus target calculation unit 60 acquires the video signal being projected and displayed on the screen from the video signal setting unit 82. Then, an evaluation value of the input video signal is calculated by performing the same calculation as that of the focus evaluation value calculation unit 42 described later. Similar to the focus evaluation value calculation unit 42, the calculation may be performed on the entire image, or may be performed for each divided area in the image.
  • The calculated evaluation value of the input video signal is used as a target value for the degree of focus. Therefore, the evaluation value of the input video signal is sent to the focus adjustment assist unit 70, and when the focus assist function described later is on, it is superimposed and displayed on the screen together with the focus evaluation value for the captured image as focus in-focus information. The user can obtain an indication of the degree of focus by comparing the evaluation value of the input signal displayed as focus in-focus information with the focus evaluation value for the captured image.
  • Alternatively, an evaluation value representing the degree of focus may be calculated from both evaluation values and displayed.
  • the focus adjustment assist unit 70 sets the focus assist function to a standby state in response to an instruction from the video signal evaluation unit 50. Then, in accordance with an instruction from the focus evaluation unit 40, the focus assist function is turned on or off.
  • The "focus assist function" is a function for superimposing focus in-focus information, which assists focus adjustment by manual operation of the focus ring, on at least a part of the image projected on the screen.
  • The start of manual focus adjustment by the user is detected and the focus in-focus information is superimposed and displayed; the end of manual focus adjustment is detected and the display of the focus in-focus information is ended.
  • The video displayed together with the focus in-focus information may be the video signal as it is, or a dedicated focus pattern may be displayed.
  • In this embodiment, the focus in-focus information consists of the focus evaluation value of the captured image and the focus target value (that is, the evaluation value of the input video signal). Thereby, manual adjustment of the focus ring can be assisted.
  • the video signal evaluation unit 50 acquires the video signal being projected and displayed on the screen from the video signal setting unit 82.
  • the video signal evaluation unit 50 includes an input signal evaluation value calculation unit 52 and an input signal evaluation value determination unit 54.
  • the input signal evaluation value calculation unit 52 calculates an input signal evaluation value for determining whether or not to set the focus assist function to the standby state.
  • As the input signal evaluation value, any characteristic value that changes depending on the presence or absence of motion in the video, such as a high-frequency component, contrast information, or luminance information of the video signal, can be used. Any method may be used to calculate the high-frequency component or contrast from the video signal.
  • the input signal evaluation value may be calculated from the entire screen, or may be calculated for each region by dividing the screen into a plurality of regions.
  • the input signal evaluation value is preferably calculated at a predetermined time interval or every predetermined number of frames.
  • The input signal evaluation value determination unit 54 determines whether or not to place the focus assist function in the standby state based on changes in the input signal evaluation value. If it is determined from the input signal evaluation value that the video signal being projected on the screen is a moving image, or that the still image has just been switched, manual focus adjustment is difficult, so the focus assist function is left off. When it is determined from the input signal evaluation value that the video signal being projected on the screen is a still image, the focus adjustment assist unit 70 is instructed to place the focus assist function in the standby state.
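  • The following is a minimal sketch, not taken from the patent, of how such a still-image decision could be made from a history of input signal evaluation values; the window length and threshold are illustrative assumptions.

```python
from collections import deque

def is_still_image(history, new_value, window=5, rel_thresh=0.02):
    """Decide whether the projected video looks like a still image by checking
    that the input signal evaluation value has stayed nearly constant over the
    last `window` samples. Window and threshold are placeholder values."""
    history.append(new_value)
    if len(history) < window:
        return False                       # not enough samples yet
    recent = list(history)[-window:]
    spread = max(recent) - min(recent)
    return spread <= rel_thresh * (abs(recent[0]) + 1e-6)

# Usage sketch: keep a bounded history of periodically computed evaluation values.
history = deque(maxlen=32)
```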
  • the focus evaluation unit 40 acquires a captured image from the signal processing circuit 32.
  • the focus evaluation unit 40 includes a focus evaluation value calculation unit 42 and an evaluation value change detection unit 44.
  • the focus evaluation value calculation unit 42 calculates a focus evaluation value for detecting the start and end of manual focus adjustment by the user.
  • As the focus evaluation value, any characteristic value that changes according to the focus state of the image projected on the screen, such as a high-frequency component, contrast information, or luminance information of the captured image, can be used. Any method may be used to calculate the high-frequency component or contrast from the captured image.
  • the focus evaluation value is calculated for each area from the entire captured image or by dividing the captured image into a plurality of areas.
  • the focus evaluation value is preferably calculated at a predetermined time interval or every predetermined number of frames.
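  • As an illustration only (the patent does not specify a particular measure), the following sketch computes a simple gradient-based contrast score per divided region of a grayscale captured frame; the 3 x 3 division and the scoring function are assumptions.

```python
import numpy as np

def focus_evaluation_values(gray, rows=3, cols=3):
    """Divide a grayscale captured frame into rows x cols regions and return a
    high-frequency (contrast) score per region; higher means sharper."""
    h, w = gray.shape
    scores = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            block = gray[r*h//rows:(r+1)*h//rows, c*w//cols:(c+1)*w//cols].astype(np.float32)
            # Mean absolute difference between neighbouring pixels: a crude
            # stand-in for "high-frequency component / contrast information".
            dx = np.abs(np.diff(block, axis=1)).mean()
            dy = np.abs(np.diff(block, axis=0)).mean()
            scores[r, c] = dx + dy
    return scores
```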
  • the evaluation value change detection unit 44 detects the start of manual focus adjustment by the user based on the focus evaluation value when the focus assist function is in the standby state. Specifically, when there is no change in the focus evaluation value, the standby state of the focus assist function is maintained assuming that the focus ring is not moved. When only the focus evaluation value of a part of the plurality of areas changes, it is determined that a person or hand has crossed the front of the projector, and the standby state of the focus assist function is maintained. When the focus evaluation value changes in the same increase / decrease direction in all areas, it is determined that the focus ring is moved, and the focus adjustment assist unit 70 is instructed to turn on the focus assist function.
  • the evaluation value change detection unit 44 detects the end of manual focus adjustment by the user based on the focus evaluation value. In other words, after the focus assist function is turned on, if there is no change in the focus evaluation value for a predetermined period (for example, 5 to 10 seconds), it is determined that the adjustment by the focus ring has been completed and the focus assist function is turned off. The focus adjustment assist unit 70 is instructed to
  • During manual adjustment using the focus ring, a phenomenon may occur in which the focus evaluation value changes more strongly in some regions than in others. Even in such a case, since the direction of change of the focus evaluation value is the same in all regions (that is, it increases in all regions or decreases in all regions), the start of manual adjustment with the focus ring can still be detected.
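  • A minimal sketch of this all-regions, same-direction test, assuming per-region scores such as those produced by the sketch above; the relative threshold is an illustrative assumption, not a value from the patent.

```python
import numpy as np

def detect_focus_ring_operation(prev_scores, curr_scores, rel_thresh=0.05):
    """Return True when the focus ring appears to be in use: every region's
    focus evaluation value changes in the same direction by more than
    rel_thresh. Changes limited to part of the screen (e.g. a hand passing
    in front of the projector) return False."""
    delta = (curr_scores - prev_scores) / (np.abs(prev_scores) + 1e-6)
    significant = np.abs(delta) > rel_thresh
    if not significant.all():                  # change only in part of the screen
        return False
    return bool((delta > 0).all() or (delta < 0).all())   # same direction everywhere
```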
  • the image memory 84 holds image data to be projected on the screen 300.
  • the image data is supplied from a video reproduction device such as a personal computer or a DVD player via an external interface (not shown).
  • the video signal setting unit 82 sets a video signal based on the image data held in the image memory 84 in the light modulation unit 12.
  • the start and end of manual focus adjustment using the focus ring 15 by the user is detected based on the following characteristics.
  • When the focus ring is operated, the focus state of the entire screen changes, and it is rare for only a part of the screen to change greatly. Further, the focus state changes relatively slowly, with a steep change only in the vicinity of the in-focus position. On the other hand, when a person or a hand crosses in front of the projector, the focus state changes greatly in only a part of the screen. In addition, when the video signal is switched, the focus state changes abruptly.
  • In view of these characteristics, the projection display apparatus 200 captures the image projected on the screen with a camera and determines, from changes in the focus evaluation value calculated from the captured image, whether or not the focus ring is being moved, that is, the start and end of manual focus adjustment.
  • FIG. 3 is a flowchart for mainly explaining the operation of the video signal evaluation unit 50.
  • the input signal evaluation value calculation unit 52 acquires a video signal being projected and displayed from the video signal setting unit 82 (S10), and calculates an input signal evaluation value for the entire screen or for each divided area (S12).
  • When there is no change in the input signal evaluation value within a predetermined period (for example, several seconds) (N in S14), it is determined that the video signal is a still image, and the focus adjustment assist unit 70 is instructed to place the focus assist function in the standby state (S20).
  • FIG. 4 is a diagram for explaining the change in the input signal evaluation value and the state of the focus assist function.
  • FIG. 4A shows the time change of the input signal evaluation value
  • FIG. 4B shows the state of the focus assist function.
  • FIG. 5 is a flowchart for mainly explaining the operation of the focus evaluation unit 40.
  • the focus evaluation value calculation unit 42 acquires a captured image and calculates a focus evaluation value for each divided area (S52).
  • The evaluation value change detection unit 44 observes the focus evaluation values over a predetermined period (for example, several seconds) (S54). If no change in the focus evaluation value is observed in any area, or a change is observed only in areas below a predetermined ratio (Y in S54), the standby state of the focus assist function is maintained (S56).
  • When a change in the focus evaluation value is observed in areas at or above the predetermined ratio (N in S54), it is determined that manual focus adjustment has started, and the focus adjustment assist unit 70 is instructed to turn on the focus assist function (S58). In response, focus in-focus information is projected and displayed on at least a part of the projection screen.
  • The evaluation value change detection unit 44 determines that the focus ring is being moved while the focus evaluation value keeps changing (N in S60). When no change in the focus evaluation value is observed for a predetermined period (Y in S60), it is determined that the manual focus adjustment by the user has ended, and the focus adjustment assist unit 70 is instructed to turn off the focus assist function (S62). In response, the focus in-focus information disappears from the projection screen.
  • In this way, a focus evaluation value is calculated for each of the divided areas, and when the focus evaluation value does not change in any area, or changes only in a part of the areas, the change is not treated as a focus ring operation.
  • the evaluation value change detection unit 44 may divide the screen into a plurality of regions and refer to the input signal evaluation values calculated for each region. In this way, the focus evaluation value of the captured image can be determined while referring to the local characteristics of the input signal, so that the determination accuracy can be improved.
  • FIG. 6 is a diagram for explaining the change of the focus evaluation value and the state of the focus assist function.
  • FIG. 6A shows the temporal change of the focus evaluation value
  • FIG. 6B shows the state of the focus assist function.
  • Initially, the focus assist function is in the standby state.
  • While the focus evaluation value does not change, the standby state is maintained.
  • When the focus evaluation value of only a part of the regions fluctuates, it is not regarded as a focus ring operation, and the standby state of the focus assist function is maintained.
  • When the focus evaluation values of all the regions change, it is determined that the focus ring is being moved, and the focus assist function is turned on. In response, focus in-focus information is projected and displayed on at least a part of the projection screen.
  • When the focus evaluation value has not changed over the predetermined period D, it is determined that the operation of the focus ring has been completed, and the focus assist function is turned off in period t9. In response, the focus in-focus information disappears from the projection screen.
  • According to the projection display apparatus 200, it can be determined whether or not the user is manually moving the focus ring.
  • When the focus ring is operated, focus in-focus information is automatically superimposed on the projection screen, and when the focus ring operation stops, the focus in-focus information disappears from the projection screen. Therefore, the user does not need to operate a dedicated button or select a function from a menu screen in order to display the focus in-focus information.
  • the focus evaluation value is calculated for the entire captured image or for each divided region.
  • the focus evaluation value calculation unit may detect a screen area from the captured image by a well-known method, and calculate the focus evaluation value only within the screen area. As a result, the focus evaluation value is calculated only within the screen where the focus is changed by the operation of the focus ring, and therefore the focus ring operation can be detected more accurately.
  • An inclination determination unit that determines whether the screen 300 is inclined based on the captured image may be provided. When it is determined that the screen is inclined, the focus point when the focus ring is operated varies depending on the position in the screen. In this case, it is preferable that the focus evaluation value calculation unit calculates the focus evaluation value only in a region closer to the back focus in the screen.
  • Second embodiment. In a projection display apparatus, automatically performing the various adjustments necessary for display requires a motor, its control device, and the like, which can increase the cost of the apparatus.
  • On the other hand, manually turning the focus ring to adjust the focus can be a cumbersome task for the user. Therefore, if the current focus state can be projected and displayed on the screen while the user adjusts the focus manually, the convenience of focus adjustment can be improved while keeping the cost of the apparatus down.
  • the outline of the second embodiment will be described.
  • The projection display apparatus according to the second embodiment resets the focus adjustment auxiliary information presented to the user when a zoom operation by the user is detected.
  • FIG. 7 is a diagram schematically illustrating a functional configuration of the projection display apparatus 1200 according to the second embodiment.
  • the projection display apparatus 1200 includes a projection unit 10, an imaging unit 30, and a control unit 1100.
  • the control unit 1100 includes a focus evaluation value calculation unit 142, a zoom operation detection unit 144, a focus state information setting unit 150, a video signal setting unit 82, and an image memory 84.
  • the projection unit 10, the imaging unit 30, the video signal setting unit 82, and the image memory 84 have the same configuration as that described with reference to FIG.
  • the projection unit 10 includes a light source 11, a light modulation unit 12, a lens 13, a zoom ring 14, and a focus ring 15.
  • the lens 13 adjusts the focal length of the light incident from the light modulation unit 12 and adjusts the focus.
  • the lens 13 includes a zoom lens for moving the focal length and a focusing lens for adjusting the focus.
  • the lens 13 is provided with a zoom ring 14 and a focus ring 15. When the user manually rotates the zoom ring 14 or the focus ring 15, the lens position moves on the optical axis.
  • the image light generated by the light modulation unit 12 is projected onto the screen 300 via the lens 13. Note that any device other than the zoom ring 14 and the focus ring 15 may be used as long as the lens position is moved on the optical axis.
  • The focus evaluation value calculation unit 142 acquires a captured image captured by the imaging unit 30 and, by image analysis, calculates as the focus evaluation value a characteristic value that changes according to the focus state of the video adjusted by the focus ring 15.
  • the focus evaluation value is used for the focus assist function.
  • The "focus assist function" is a function for superimposing focus state information, which assists focus adjustment by manual operation of the focus ring 15, on at least a part of the image projected on the screen 300. Details of the focus state information will be described later.
  • the focus state information is superimposed and displayed during manual focus adjustment by the user.
  • As the video displayed together with the focus state information, the input video signal may be used as it is, or a dedicated focus pattern may be displayed.
  • As the focus evaluation value, any characteristic value that changes in accordance with the focus state of the image projected on the screen, such as a high-frequency component, contrast information, or luminance information of the captured image, can be used.
  • To calculate a high-frequency component or contrast from the captured image, any image analysis method, such as Fourier transform, multi-resolution analysis, or edge extraction, can be used.
  • the focus evaluation value is calculated for each area from the entire captured image or by dividing the captured image into a plurality of areas.
  • the focus evaluation value is preferably calculated at a predetermined time interval or every predetermined number of frames.
  • the zoom operation detection unit 144 determines whether or not the zoom ring 14 is operated by the user by acquiring a captured image captured by the imaging unit 30 and analyzing the image. This can be realized, for example, by extracting an edge component in the image and detecting an increase / decrease in the edge component.
  • FIG. 8 is a diagram for explaining an example of the principle of detecting the presence or absence of a zoom operation.
  • the example shown in FIG. 8 is an example of a case where a zoom operation detection video is projected on the screen 300 and the video is acquired and analyzed in order to detect a zoom operation by the user.
  • FIG. 8A is a diagram illustrating an example of a relationship between a zoom operation detection image projected on the screen 300 and an imaging range of the imaging unit 30.
  • FIG. 8A shows that the vertical stripe pattern 500 is projected on the screen 300 and the imaging unit 30 images the imaging range 400.
  • FIG. 8B is a diagram illustrating another example of the relationship between the zoom operation detection image projected on the screen 300 and the imaging range of the imaging unit 30. In FIG. 8B, compared with FIG. 8A, the pattern 500 is projected at a reduced size as a result of the user's zoom operation. Note that the size of the screen 300 and the imaging range 400 of the imaging unit 30 are the same as in FIG. 8A.
  • The zoom operation detection unit 144 detects the presence or absence of a zoom operation by counting the number of vertical stripes projected on the screen 300 and examining how that number fluctuates. For example, in the example shown in FIG. 8A, seven black vertical stripes are projected on the screen 300. Suppose the user operates the zoom ring 14 to reduce the pattern 500, resulting in the state shown in FIG. 8B. In the example shown in FIG. 8B, the number of black vertical stripes projected on the screen 300 changes from seven to nine. In this way, the zoom operation detection unit 144 can detect the presence or absence of a zoom operation by counting the vertical stripes projected on the screen 300 and examining the variation in their number. Counting the black vertical stripes can be realized using, for example, general edge extraction and threshold processing.
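  • As an illustration only, the sketch below counts dark vertical stripes in a grayscale captured frame by thresholding the column-averaged intensity profile; the threshold value and the grayscale input are assumptions rather than details from the patent.

```python
import numpy as np

def count_vertical_stripes(gray, dark_thresh=128):
    """Count dark vertical stripes by thresholding the column-averaged
    intensity profile and counting bright-to-dark transitions."""
    profile = gray.mean(axis=0)                    # horizontal intensity profile
    dark = profile < dark_thresh                   # True over dark stripes
    starts = np.logical_and(dark[1:], ~dark[:-1])  # a stripe begins at each rise of "dark"
    return int(starts.sum()) + int(dark[0])

def zoom_operation_detected(prev_count, curr_count):
    """A change in the stripe count (e.g. 7 -> 9 in the example above)
    indicates that the zoom ring has been operated."""
    return curr_count != prev_count
```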
  • the focus state information setting unit 150 sets focus state information displayed on the screen 300 as the above-described focus assist function.
  • the focus state information setting unit 150 includes a focus evaluation value storage unit 152, a focus state determination unit 154, and a graph database 156.
  • the configuration of the focus state information setting unit 150 will be described with reference to FIGS. 9, 10, 11, and 12.
  • FIG. 9 is a diagram showing an example of a change in focus evaluation value during focus adjustment. In the example shown below, it is assumed that the focus evaluation value increases as the focus state is improved.
  • At the start, the focus evaluation value storage unit 152 stores f0, acquired from the focus evaluation value calculation unit 142, as the initial value of the focus evaluation value. The focus evaluation value storage unit 152 also stores f0 as the provisional maximum value fm of the focus evaluation value.
  • The "provisional maximum value fm" is the largest focus evaluation value calculated by the focus evaluation value calculation unit 142 so far.
  • The user starts focus adjustment at time t1. From time t1 to t2, the user adjusts in the wrong direction, that is, the direction in which the focus becomes worse, so the focus evaluation value f falls below the initial value f0. In the following, f written without a subscript denotes the current value of the focus evaluation value.
  • The focus evaluation value storage unit 152 stores the current value f of the focus evaluation value separately from the initial value f0 of the focus evaluation value.
  • At time t2, the user starts adjusting the focus in the correct direction.
  • The current value f of the focus evaluation value increases, reaching the initial value f0 at time t3.
  • The focus evaluation value storage unit 152 updates the provisional maximum value fm with the focus evaluation value f each time f exceeds it.
  • The focus evaluation value f reaches the peak value f1 at time t4 and then decreases. This is because, if focus adjustment continues past the in-focus position, the image goes out of focus again and becomes blurred. At time t5 the user reverses the focus adjustment, and at time t6 the focus evaluation value f returns to the peak value f1. The focus adjustment ends at time t6.
  • The focus state determination unit 154 calculates the average value fa of the focus evaluation values calculated so far and stores the result in the focus evaluation value storage unit 152.
  • The focus state determination unit 154 also detects the peak value f1 of the focus evaluation value described above and stores it in the focus evaluation value storage unit 152 as the maximum value fM of the focus evaluation value.
  • The focus state determination unit 154 tracks the amount of change in the focus evaluation value, and sets the focus evaluation value f at the point where the change turns from increasing to decreasing as the maximum value fM of the focus evaluation value. Since the maximum value fM is the focus evaluation value in the in-focus state, it becomes the target value that the focus evaluation value should take.
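  • The following is a minimal sketch, under the simplifying assumption of a reasonably clean evaluation-value stream, of tracking the initial value f0, the provisional maximum fm, the average fa, and the peak fM (target value) as described above; the class and its names are illustrative, not from the patent.

```python
class FocusTargetTracker:
    """Tracks the stream of focus evaluation values and detects the peak
    (maximum value fM) that serves as the target value."""
    def __init__(self, f0):
        self.initial = f0           # f0
        self.provisional_max = f0   # fm
        self.maximum = None         # fM (target value), unknown until a peak is seen
        self.prev = f0
        self.history = [f0]

    def update(self, f):
        self.history.append(f)
        if f > self.provisional_max:
            self.provisional_max = f
        # The change turned from increasing to decreasing: the previous value,
        # if it was the running maximum, is taken as the peak fM.
        if self.maximum is None and f < self.prev and self.prev >= self.provisional_max:
            self.maximum = self.prev
        self.prev = f
        return self.maximum

    def average(self):
        return sum(self.history) / len(self.history)   # fa
```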
  • the magnitude relationship between the current value f of the focus evaluation value and the target value of the focus evaluation value is the above-described focus state information. Therefore, the focus state determination unit 154 determines the display mode of the graph in order to express the focus state information as a graph having a color indicating the degree of focus.
  • FIG. 10 is a diagram illustrating an example of a graph representing focus state information.
  • FIG. 10A is a diagram illustrating an example of the graph at the start of focus adjustment, that is, during the period from t0 to t1 in FIG. 9. The graph is superimposed on the image projected on the screen. The graph is a rectangle whose long side runs horizontally with respect to the image, and the focus evaluation value is associated with the scale marks of the graph; the colored area of the graph increases or decreases according to changes in the focus evaluation value. In the example of FIG. 10A, three quarters of the graph area is colored. In the following figures, a region drawn with diagonal hatching represents a yellow region, and a region drawn with a diagonal lattice represents a red region.
  • The scale intervals of the graph are wider at the left end than at the right end, so the colored area increases or decreases coarsely at the left end and finely at the right end (a sketch of this mapping follows the description of FIG. 10).
  • FIG. 10B is a diagram illustrating an example of the graph in the case where the focus evaluation value f is smaller than the provisional maximum value fm during the period in which the maximum value fM described above, that is, the target value that the focus evaluation value should take, has not yet been detected.
  • When the focus evaluation value f continues to increase during the period in which the maximum value fM, that is, the target value that the focus evaluation value should take, has not been detected (from t3 to t4 in FIG. 9), the graph is the same as in the example of FIG. 10A.
  • FIG. 10C is a diagram illustrating an example of the graph when the focus evaluation value f decreases after the target value has been detected, that is, during the period from t4 to t5 in FIG. 9. The display color of the colored area, which covers three quarters of the graph, changes from yellow to red.
  • FIG. 10D is a diagram illustrating an example of the graph when the focus evaluation value f increases after the target value has been detected, that is, during the period from t5 to t6 in FIG. 9. Nearly the entire area of the graph is yellow. Although not shown, when the focus evaluation value f becomes equal to the maximum value fM, which is the target value, the whole graph turns blue to notify the user that focus adjustment is complete. In this way, the focus state information serves as an aid when the user performs manual focus adjustment.
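  • A possible way to map the focus evaluation value to the colored fraction of the bar, with coarse gradations near the left end and fine ones near the right end, is sketched below; the square-root style mapping is purely an assumption for illustration.

```python
def bar_fill_fraction(f, f_target, gamma=0.5):
    """Map the current focus evaluation value f to the fraction of the bar that
    is colored. Normalising by the target value and applying an exponent < 1
    makes the fill change coarsely at the low (left) end and finely near the
    high (right) end."""
    if f_target <= 0:
        return 0.0
    ratio = max(0.0, min(1.0, f / f_target))
    return ratio ** gamma
```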
  • FIG. 11 is a diagram showing a table in which combinations of focus evaluation values and target values are associated with display colors and display amounts of focus state information.
  • the table shown in FIG. 11 is stored in the graph database 156, and the focus state determination unit 154 determines focus state information to be displayed with reference to this table.
  • The focus state determination unit 154 acquires the focus evaluation values stored in the focus evaluation value storage unit 152 and calculates their change. First, it examines the magnitude relation between the current value f of the focus evaluation value and the average value fa, and determines whether the value is increasing, constant, or decreasing as categorized in the table shown in FIG. 11. The focus state determination unit 154 then determines whether the maximum value fM, which is the target value of the focus evaluation value, has been detected. If the maximum value fM has not been detected, the focus state determination unit 154 examines the magnitude relation between the provisional maximum value fm and the current value f of the focus evaluation value, and the magnitude relation between the initial value f0 of the focus evaluation value and the provisional maximum value fm.
  • If the maximum value fM has been detected, the focus state determination unit 154 examines the magnitude relation between the maximum value fM and the current value f of the focus evaluation value, and the magnitude relation between the initial value f0 of the focus evaluation value and the maximum value fM. By examining the above, the focus state determination unit 154 can determine the focus state information to be displayed from the table stored in the graph database 156. Whether or not the maximum value fM has been detected can be determined by examining a flag (MaxFlag) held in a work memory (not shown). This flag is initialized to 0, and the focus state determination unit 154 sets it to 1 when the maximum value fM is detected.
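  • Since the table of FIG. 11 itself is not reproduced in this text, the sketch below only illustrates the general shape of such a lookup: a color is chosen from the magnitude relations and a MaxFlag-style indicator. The specific branches are assumptions, not the patented mapping.

```python
def focus_display_color(f, fa, f0, fm, fM=None):
    """Choose a display color for the focus state information from the
    magnitude relations described above. fM is None until the target value
    has been detected (MaxFlag == 0)."""
    if fM is not None:                      # MaxFlag == 1: target value known
        if f >= fM:
            return "blue"                   # target reached: adjustment complete
        return "yellow" if f >= fa else "red"   # improving vs. moving away
    # MaxFlag == 0: judge against the provisional maximum and the initial value.
    return "yellow" if f >= fm or f >= f0 else "red"
```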
  • FIG. 12 is a diagram illustrating the state transition of the display color of the focus state information.
  • The possible states of the display color of the focus state information are denoted by a combination of "ST" (an abbreviation of "state") and a number.
  • When processing starts, the evaluation value initialization state ST10 is entered.
  • In ST10, the evaluation value first calculated by the focus evaluation value calculation unit 142 is stored in the focus evaluation value storage unit 152 as the provisional maximum value fm and the initial value f0.
  • Once the provisional maximum value fm and the initial value f0 have been stored in the focus evaluation value storage unit 152, the state shifts to the yellow display state ST12. From the yellow display state ST12, it is possible to shift to the yellow display state ST12 itself, the red display state ST16, the blue display state ST14, the graph erasing state ST20, or the zoom change detection state ST18.
  • the numbers surrounded by circles in FIG. 12 correspond to the numbers surrounded by circles in the table of FIG.
  • The transition from the yellow display state ST12 to the blue display state ST14 occurs in the following two cases: when the current value f of the focus evaluation value is greater than the average value fa, the maximum value fM has been detected, and the current value f equals the maximum value fM; or when the current value f of the focus evaluation value equals the average value fa, the maximum value fM has been detected, and the current value f equals the maximum value fM.
  • When the zoom operation detection unit 144 detects that the zoom ring 14 has been operated by the user, the state shifts from the yellow display state ST12, the blue display state ST14, or the red display state ST16 to the zoom change detection state ST18. As a result, operations such as initialization of the focus evaluation value f are performed again, which is advantageous in that the reliability of the focus evaluation value can be ensured.
  • Finally, the graph erasing state ST20 is entered and the focus adjustment ends.
  • FIG. 13 is a flowchart mainly showing a processing flow of the control unit 1100 according to the second embodiment.
  • the processing in this flowchart starts when, for example, the projection display apparatus 1200 is turned on and the imaging unit 30 captures an image projected on the screen 300.
  • The focus evaluation value calculation unit 142 acquires a captured image captured by the imaging unit 30 and, by image analysis, calculates as the focus evaluation value a characteristic value that changes according to the focus state of the video adjusted by the focus ring 15.
  • the calculated value is stored in the focus evaluation value storage unit 152 (S110).
  • the focus state determination unit 154 acquires the focus evaluation value stored in the focus evaluation value storage unit 152 and calculates the change (S112).
  • The focus state determination unit 154 determines the focus state information to be displayed by referring to a table in which display colors of the focus state information are associated with combinations of the magnitude relation between the current focus evaluation value f and the average value fa, the magnitude relation between the maximum value fM (or the provisional maximum value fm) and the current focus evaluation value f, and the magnitude relation between the maximum value fM (or the provisional maximum value fm) and the initial value f0 (S114).
  • The video signal setting unit 82 acquires the focus state information from the focus state determination unit 154 as a graph having a color indicating the degree of focus, and transmits the graph to the projection unit 10 to update the graph displayed on the screen 300 (S116).
  • If the current focus evaluation value f does not match the maximum value fM, focus adjustment is not complete (N in S118), and the process is repeated from step S110 through S116 described above. If the current focus evaluation value f matches the maximum value fM, focus adjustment is complete (Y in S118); the focus state determination unit 154 notifies the video signal setting unit 82 to erase the graph, and the graph is deleted (S120). When the graph is deleted in step S120, the processing in this flowchart ends.
  • Operation with the above configuration is as follows.
  • the user projects an image on the screen 300 using the projection display apparatus 1200 according to the second embodiment, and operates the focus ring 15 to adjust the focus.
  • The focus state determination unit 154 analyzes the focus evaluation values calculated by the focus evaluation value calculation unit 142 and stored in the focus evaluation value storage unit 152, and projects the current focus state onto the screen 300 as a colored graph.
  • the user refers to the graph as an auxiliary means for focus adjustment.
  • According to the projection display apparatus 1200, an aid can be provided when the user performs manual focus adjustment. Since the current focus state is displayed as a colored graph, the user can intuitively grasp the focus state.
  • the focus state determination unit tracks the change amount of the focus evaluation value, and calculates the focus evaluation value when the change amount changes from increase to decrease as the target value that the focus evaluation value should take.
  • the target value to be taken as the focus evaluation value may be calculated by acquiring an input signal of an image projected on the screen 300 and analyzing the image. This can be realized by the focus evaluation value calculation unit 142 acquiring an input signal from the image memory 84 via the video signal setting unit 82, and directly analyzing the acquired input signal to calculate a target value.
  • When the change amount is tracked, the maximum value cannot be determined until the change turns from increasing to decreasing; only a provisional maximum value can be calculated up to that point, and the point where the change in the focus evaluation value turns from increase to decrease is detected as the maximum value. In this case, however, the focus must once pass beyond the correct in-focus state, so focus adjustment can be inefficient. This can be avoided by setting the target value of the focus evaluation value from the input signal of the image projected on the screen 300 and analyzing that signal, which is also advantageous in that a more accurate target value can be set than when analyzing the captured image captured by the imaging unit 30.
  • The focus state determination unit 154 tracks the amount of change in the focus evaluation value. When the calculated amount of change in the focus evaluation value exceeds a predetermined threshold, the focus evaluation value in question may be discarded from the focus evaluation value storage unit 152.
  • the predetermined threshold is a reference value for the change amount of the focus evaluation value for checking the reliability as the focus evaluation value. For example, when a situation occurs such as a hand in front of the camera or a person passing in front of the camera during focus assist, a correct focus evaluation value cannot be calculated, and the amount of change in the focus evaluation value increases. As a result, a malfunction may occur. Therefore, those situations are reproduced in advance by experiments, and a predetermined threshold is set. This is advantageous in that the above malfunction can be prevented.
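  • A minimal sketch of this reliability check, assuming a relative-change threshold; the threshold value is a placeholder and would in practice be tuned experimentally as described above.

```python
def filter_focus_value(prev_f, new_f, storage, change_thresh=0.5):
    """Discard a focus evaluation value whose relative change exceeds a
    threshold (for example, when a hand passes in front of the camera)."""
    change = abs(new_f - prev_f) / (abs(prev_f) + 1e-6)
    if change > change_thresh:
        return prev_f           # reject: keep the last reliable value
    storage.append(new_f)       # accept: store for averaging and peak tracking
    return new_f
```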
  • the shape of the graph is not limited to a rectangle.
  • It may be another shape, such as a circle or a polygon such as a triangle.
  • the graph is a circle, it can be realized as a pie graph in which the focus evaluation value and the angle are associated with each other.
  • When the graph is a polygon such as a triangle, it can be realized by associating the focus evaluation value with the filled area of the graph.
  • the zoom operation detection unit 144 determines whether or not the zoom operation is performed by acquiring the vertical stripe pattern imaged by the imaging unit 30 and analyzing the image.
  • Various modes are conceivable for the image to be analyzed. These modes are described below.
  • FIG. 14 is a diagram showing an example of a pattern image used for image analysis.
  • FIG. 14A is a diagram showing a pattern image mainly used for focus adjustment.
  • the checkered pattern as shown in FIG. 14 (a) contains many high-frequency components because edge components appear periodically, which is convenient for calculating the focus evaluation value.
  • The zoom operation detection unit 144 can also detect a zoom operation by analyzing the image of the checkered pattern. For example, if a horizontal profile of the pattern is computed and its edge components are counted, the processing is the same as with the vertical stripe pattern described above, and a profile in any direction can be used. This is advantageous in that the calculation of the focus evaluation value and the determination of the presence or absence of a zoom operation can be performed simultaneously.
  • FIG. 14B is a diagram illustrating an example of a combination of a pattern image used for focus adjustment and a pattern image used for zoom operation detection.
  • In FIG. 14B, the checkered pattern image used for calculating the focus evaluation value is surrounded by the vertical stripe pattern image used for determining the zoom operation.
  • Since the image used for calculating the focus evaluation value and the image used for determining the zoom operation are arranged in spatially separate regions, the calculation of the focus evaluation value and the determination of the presence or absence of a zoom operation can be executed simultaneously, which is advantageous.
  • Alternatively, a checkered pattern image used to calculate the focus evaluation value and a vertical stripe pattern image used to determine the zoom operation may be projected and displayed alternately in a time-division manner. The image switching cycle needs to be slow enough not to cause visual discomfort to the user and fast enough not to interfere with calculation of the focus evaluation value and determination of the zoom operation, so it is determined experimentally; for example, the cycle is 1 second. Since the focus evaluation value calculation unit 142 and the zoom operation detection unit 144 can each analyze an image suited to its operation, this is advantageous in that the accuracy of the focus evaluation value calculation and of the zoom operation determination can be improved.
  • Alternatively, the zoom operation detection unit 144 may detect the presence or absence of a zoom operation by acquiring and analyzing the graph representing the focus state information shown in FIG. 10. As shown in FIG. 10, the graph representing the focus state information has vertical scale marks.
  • The zoom operation detection unit 144 can detect the presence or absence of a zoom operation by measuring the interval between the scale marks and examining its variation. There is then no need to project a dull dedicated image for detecting the zoom operation, which is advantageous in that the tedium the user feels during focus adjustment can be reduced.
  • When a human face appears in the projected image, the zoom operation detection unit 144 can detect the presence or absence of a zoom operation by detecting the face and examining the variation in its size. This can be realized using a general image recognition technique such as pattern matching or color analysis. Alternatively, when characters are projected on the screen 300, the zoom operation detection unit 144 can detect a zoom operation by detecting the characters and examining the change in their size. This can be realized, for example, using a general-purpose character recognition technique.
  • FIG. 15 is a diagram showing the positional relationship between the projection display apparatus 2200 and the screen 300 according to the third embodiment of the present invention.
  • a projection display apparatus 2200 according to the third embodiment includes an imaging unit 30 for photographing the direction of the screen 300.
  • the imaging unit 30 is installed so that the optical axis center thereof is parallel to the optical axis center of the projection light projected from the projection display apparatus 2200.
  • Unlike in the first embodiment, the screen 300 does not directly face the projection display apparatus 2200; its right side is inclined toward the rear.
  • FIG. 16 illustrates an example of a captured image PuI captured by the imaging unit 30 in the positional relationship illustrated in FIG. 15.
  • The captured image includes a projection image PrI, on which a test pattern is drawn, projected from the projection display apparatus 2200.
  • the image SI of the screen 300 is also captured in the captured image PuI. Since the amount of light projected at each position in the screen 300 is inversely proportional to the square of the distance from the light source, the luminance level gradually decreases from the left side to the right side of the image SI on the screen 300.
  • autofocus processing for focusing on a suitable position within the screen 300 will be described.
  • FIG. 17 is a diagram showing a configuration of a projection display apparatus 2200 according to the third embodiment.
  • The projection display apparatus 2200 includes a projection unit 10, a lens driving unit 20, an imaging unit 30, and a control device 2100.
  • the control device 2100 includes a screen position detection unit 240, a detection area setting unit 250, an autofocus adjustment unit 260, a video signal setting unit 82, an image memory 84, and a drive signal setting unit 86.
  • The configuration of the control device 2100 can be realized in hardware by the CPU, memory, and other LSIs of an arbitrary computer, and in software by a program loaded into memory; the figure depicts functional blocks realized by their cooperation. Therefore, those skilled in the art will understand that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof.
  • the projection unit 10 projects an image on screen 300.
  • the projection unit 10 includes a light source 11, a light modulation unit 12, and a focus lens 13.
  • As the light source 11, a halogen lamp having a filament-type electrode structure, a metal halide lamp having an electrode structure for generating an arc discharge, a xenon short-arc lamp, a high-pressure mercury lamp, an LED lamp, or the like can be employed.
  • the light modulator 12 modulates the light incident from the light source 11 in accordance with the video signal set by the video signal setting unit 82.
  • As the light modulator 12, a DMD (Digital Micromirror Device) can be employed.
  • the DMD includes a plurality of micromirrors corresponding to the number of pixels, and generates desired video light by controlling the direction of each micromirror according to each pixel signal.
  • the focus lens 13 adjusts the focal position of the light incident from the light modulator 12.
  • the lens position of the focus lens 13 is moved on the optical axis by the lens driving unit 20.
  • the image light generated by the light modulation unit 12 is projected onto the screen 300 via the focus lens 13.
  • the lens driving unit 20 moves the position of the focus lens 13 in accordance with the driving signal set from the driving signal setting unit 86.
  • As the lens driving unit 20, a stepping motor, a voice coil motor (VCM), a piezoelectric element, or the like can be employed.
  • the imaging unit 30 captures the screen 300 and the projected image projected on the screen 300 as a main subject.
  • the imaging unit 30 includes a solid-state imaging device 31 and a signal processing circuit 32.
  • As the solid-state imaging device 31, a CMOS (Complementary Metal Oxide Semiconductor) image sensor, a CCD (Charge Coupled Device) image sensor, or the like can be employed.
  • the signal processing circuit 32 performs various signal processing such as A / D conversion and conversion from the RGB format to the YUV format on the signal output from the solid-state imaging device 31, and outputs the signal to the control device 2100.
  • More specifically, the data is output to the screen position detection unit 240 and the detection area setting unit 250.
  • the screen position detection unit 240 detects the position of the screen shown in the image picked up by the image pickup unit 30. More specifically, the screen position detection unit 240 detects the positions of the four sides (upper side, lower side, left side, and right side) of the screen 300 captured in the captured image by extracting edges in the captured image. The detected screen position is set in the detection area setting unit 250. The screen position is specified by, for example, vertex coordinates of four corners.
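  • One possible reading of this screen position detection, sketched below with numpy only (the thresholding rule and the corner heuristic are illustrative assumptions, not the patent's method), is to treat the bright region of the captured image as the screen and pick its four extreme points as the corner vertices:

```python
import numpy as np

def detect_screen_corners(gray):
    """Very rough stand-in for the screen position detection unit 240.

    Assumes the screen is the bright region of the captured grayscale image.
    After thresholding, the four corners are picked as the pixels that
    extremise x+y and x-y (a quick heuristic for a roughly quadrilateral
    blob).  Returns corners as (x, y) in the order TL, TR, BR, BL.
    """
    mask = gray > gray.mean() + gray.std()          # crude screen/background split
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        raise ValueError("no screen-like region found")
    s, d = xs + ys, xs - ys
    tl = (xs[s.argmin()], ys[s.argmin()])
    br = (xs[s.argmax()], ys[s.argmax()])
    tr = (xs[d.argmax()], ys[d.argmax()])
    bl = (xs[d.argmin()], ys[d.argmin()])
    return np.array([tl, tr, br, bl], dtype=float)
```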
  • the detection area setting unit 250 sets a central area in the screen area detected by the screen position detection unit 240 as a detection area.
  • FIG. 18 is a diagram illustrating a state in which the detection area DA is set in the screen area SA in the captured image PuI according to the third embodiment.
  • FIG. 19 is a diagram for explaining an example of a detection area DA setting method according to the third embodiment.
  • the detection area setting unit 250 acquires the position of the screen area SA from the screen position detection unit 240, and sets the center area as the detection area DA.
  • the position and size of the detection area DA are determined as follows, for example.
  • the detection area setting unit 250 sets the maximum rectangle IR inscribed in the screen area SA.
  • the rectangle IR is divided into three equal parts in the vertical direction and the horizontal direction.
  • The middle rectangle among the nine rectangles thus formed is set as the detection area DA.
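  • A minimal sketch of this 3×3 construction follows, assuming the (TL, TR, BR, BL) corner array from the previous sketch and approximating the maximum inscribed rectangle IR by an axis-aligned rectangle bounded by the innermost edges of the quadrilateral (finding the true maximal inscribed rectangle would need a more careful search):

```python
import numpy as np

def center_detection_area(corners):
    """Sketch of the detection area setting illustrated in FIG. 19.

    `corners` is the (TL, TR, BR, BL) array of the screen area SA.  An
    axis-aligned rectangle bounded by the innermost edges stands in for the
    inscribed rectangle IR; it is split 3x3 and the middle cell is returned
    as the detection area DA in the form (x0, y0, x1, y1).
    """
    tl, tr, br, bl = corners
    left   = max(tl[0], bl[0])
    right  = min(tr[0], br[0])
    top    = max(tl[1], tr[1])
    bottom = min(bl[1], br[1])
    w, h = (right - left) / 3.0, (bottom - top) / 3.0
    return (left + w, top + h, left + 2 * w, top + 2 * h)
```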
  • FIG. 19 illustrates an example in which horizontal trapezoidal distortion is generated in the screen area SA due to the screen 300 tilting in the left-right direction.
  • The detection area DA can be set by the above algorithm even when vertical trapezoidal distortion occurs, or when both horizontal and vertical trapezoidal distortion occur. With this algorithm, the detection area DA can be set by a simple calculation regardless of the position and size of the screen area SA.
  • the size of the detection area DA set in the center area of the screen area SA is not limited to the size calculated by the above algorithm.
  • The size of the detection area DA may be set to a value derived by the designer based on experiments and simulations, for example a value at which the detection accuracy of the trapezoidal distortion detection test pattern described later and/or the subjective image quality evaluation value is maximized.
  • the size of the detection area DA that is actually set is a normalized value according to the size of the screen area SA.
  • the size of the detection area DA may be adjusted according to the specifications (for example, S / N ratio, resolution) of the camera mounted on the projection display apparatus 2200. For example, when the S / N ratio is low, the influence of noise increases, so that the size of the detection area DA needs to be relatively large in order to maintain the autofocus adjustment accuracy. On the other hand, since the influence of noise is small when the S / N ratio is high, the autofocus adjustment accuracy is maintained even if the size of the detection area DA is reduced. Similarly, when the resolution is low, the size of the detection area DA needs to be relatively large. The designer may perform the experiment and simulation for each camera having different specifications.
  • the autofocus adjustment unit 260 adjusts the focus using a contrast detection method.
  • the video signal setting unit 82 reads out a test pattern for auto focus adjustment from the image memory 84 and causes the projection unit 10 to project the test pattern.
  • the test pattern is formed of, for example, a stripe pattern or a checker flag pattern.
  • the imaging unit 30 images the test pattern projected on the screen 300.
  • The autofocus adjustment unit 260 determines the position of the focus lens 13 based on the sharpness of the detection area, set by the detection area setting unit 250, in a plurality of images captured by the imaging unit 30 at a plurality of lens positions.
  • the configuration of the autofocus adjustment unit 260 will be described more specifically.
  • the autofocus adjustment unit 260 includes a high-pass filter 261, an integration unit 262, and a lens position determination unit 263.
  • The high pass filter 261 extracts, from the image signal in the detection region, high frequency components exceeding a predetermined threshold, and supplies the extracted high frequency components to the integrating unit 262.
  • the high pass filter 261 may extract high frequency components in the horizontal direction, or may extract high frequency components in both the horizontal direction and the vertical direction.
  • the integrating unit 262 integrates the high frequency components extracted by the high pass filter 261 at each lens position, and supplies the integrated high frequency component to the lens position determining unit 263.
  • When the high pass filter 261 extracts high frequency components in both the horizontal and vertical directions, the integrating unit 262 adds both high frequency components.
  • the lens position determining unit 263 determines the position of the focus lens 13 from which the maximum integrated value is detected among the plurality of integrated values supplied from the integrating unit 262 as the in-focus position.
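  • The high-pass / integration / maximum chain can be sketched as follows; the horizontal first difference and the threshold value are stand-ins chosen for illustration, not the filter actually specified here:

```python
import numpy as np

def sharpness(gray, da, threshold=8.0):
    """Contrast-AF evaluation value: high-pass filtering plus integration.

    `da` is the (x0, y0, x1, y1) detection area.  A horizontal first
    difference stands in for the high pass filter 261; only components above
    `threshold` are kept, mirroring "exceeding a predetermined threshold",
    and their sum plays the role of the integrating unit 262.
    """
    x0, y0, x1, y1 = (int(v) for v in da)
    roi = gray[y0:y1, x0:x1].astype(float)
    hp = np.abs(np.diff(roi, axis=1))          # horizontal high-frequency content
    return hp[hp > threshold].sum()

def best_lens_position(images_by_step, da):
    """Lens position determination: pick the lens step whose image is sharpest."""
    scores = {step: sharpness(img, da) for step, img in images_by_step.items()}
    return max(scores, key=scores.get)
```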
  • FIG. 20 is a diagram for explaining the process of determining the focus position of the focus lens 13.
  • The autofocus adjustment unit 260 instructs the video signal setting unit 82 to project a test pattern, and sets in the drive signal setting unit 86 a control signal for sequentially moving the focus lens 13 in a predetermined step width from the near side to the far side, or from the far side to the near side.
  • the video signal setting unit 82 sets the video signal of the test pattern in the light modulation unit 12, and the drive signal setting unit 86 sets the drive signal corresponding to the control signal in the lens driving unit 20.
  • The autofocus adjustment unit 260 calculates the sharpness (the integrated value described above can be used) of the test pattern imaged at each lens position of the focus lens 13. This sharpness increases as the focus lens 13 approaches the in-focus position; when it peaks and then starts to decrease, the autofocus adjustment unit 260 determines the lens position immediately before the decrease as the in-focus position.
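  • A sketch of this scan-and-stop behaviour is shown below; `move_lens` and `capture` are hypothetical hooks standing in for the drive signal setting unit 86 and the imaging unit 30, and `sharpness` is the evaluation function from the earlier sketch:

```python
def scan_for_focus(move_lens, capture, da, steps):
    """Hill-climbing focus scan in the spirit of FIG. 20 (a sketch only).

    The lens is stepped through `steps`; as soon as the sharpness falls
    after having risen, the previous step is returned as the in-focus
    position.  `move_lens(step)` drives the lens, `capture()` returns a
    grayscale frame, and `da` is the detection area.
    """
    best_step, best_score = None, -1.0
    for step in steps:
        move_lens(step)
        score = sharpness(capture(), da)
        if score < best_score:          # past the peak: stop and go back
            return best_step
        best_step, best_score = step, score
    return best_step                    # peak was at the end of the range
```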
  • the image memory 84 holds image data to be projected on the screen 300.
  • the image data is supplied from a PC or the like via an external interface (not shown).
  • The image memory 84 also holds the test pattern projected during autofocus adjustment.
  • the video signal setting unit 82 sets a video signal based on the image data held in the image memory 84 in the light modulation unit 12.
  • the drive signal setting unit 86 sets a drive signal for moving the focus lens 13 to the lens position instructed from the autofocus adjustment unit 260 in the lens drive unit 20.
  • According to the third embodiment, by setting the detection area in the central area of the screen, the focus can be adjusted to a suitable position on the screen surface, that is, to the central area of the screen, even when the screen is tilted. In addition, the amount of calculation can be reduced by detecting the sharpness of each image using the image signal of the detection area instead of the image signal of the entire captured image, so the autofocus adjustment time can be shortened.
  • FIG. 21 is a diagram showing a configuration of a projection display apparatus 3200 according to the fourth embodiment of the present invention.
  • the projection display apparatus 3200 according to the fourth embodiment has a configuration in which a side length measurement unit 245 is added to the projection display apparatus 2200 according to the third embodiment.
  • the side length measurement unit 245 measures the lengths of the two opposite sides of the screen area detected by the screen position detection unit 240.
  • The side length measurement unit 245 can detect the inclination of the screen 300 from the lengths of these two sides. More specifically, the lengths of the left and right sides of the screen area are measured to detect the horizontal tilt of the screen 300. The ratio of the lengths of the left side and the right side may be calculated to detect the degree of inclination; the larger the ratio, the larger the inclination. Similarly, the lengths of the upper and lower sides of the screen area are measured to detect the vertical tilt of the screen 300.
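  • The side-length and tilt-ratio computation is straightforward from the corner coordinates; the sketch below assumes the (TL, TR, BR, BL) corner array used in the earlier sketches:

```python
import numpy as np

def side_lengths(corners):
    """Lengths of the four sides of the screen area from its corners (TL, TR, BR, BL)."""
    tl, tr, br, bl = corners
    top    = np.linalg.norm(tr - tl)
    bottom = np.linalg.norm(br - bl)
    left   = np.linalg.norm(bl - tl)
    right  = np.linalg.norm(br - tr)
    return top, bottom, left, right

def horizontal_tilt_ratio(corners):
    """Ratio of the longer to the shorter of the left/right sides (>= 1).

    The farther this ratio is above 1, the larger the horizontal inclination
    of the screen; 1 means the left and right sides appear equally long.
    """
    _, _, left, right = side_lengths(corners)
    return max(left, right) / min(left, right)
```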
  • the detection area setting unit 250 moves the detection area set as the central area of the screen area by a predetermined distance in the direction of the short side of the two sides measured by the side length measurement unit 245.
  • FIG. 22 is a diagram illustrating a state in which the detection area DA is set in the screen area SA in the captured image PuI according to the fourth embodiment.
  • The detection area DA shown in FIG. 22 is moved to the right relative to the detection area DA shown in FIG. 18. Since the length of the right side of the screen area SA is shorter than the length of the left side, the detection area setting unit 250 moves the detection area DA from the central area toward the right side. In FIG. 22, the upper side and the lower side of the screen area SA are drawn with equal lengths, so the detection area DA is not moved in the vertical direction.
  • the distance by which the detection area DA is moved from the central area in the direction of the short side may be set to a value derived by the designer based on experiments and simulations. This value may be a value when the detection accuracy and / or subjective image quality evaluation value of the trapezoidal distortion detection test pattern, which will be described later, is maximized in the experiment or simulation. As described above, the distance may be adjusted according to the specifications (for example, S / N ratio, resolution) of the camera mounted on the projection display apparatus 3200. Note that the distance actually moved in the direction of the short side is a normalized value according to the size of the screen area.
  • the detection area setting unit 250 may adaptively change the distance for moving the detection area DA according to the degree of inclination of the screen 300 detected by the side length measurement unit 245. For example, the detection area setting unit 250 increases the distance by which the detection area DA is moved as the inclination increases.
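  • A possible form of this adaptive shift is sketched below; `base_shift` is a made-up tuning constant standing in for the experimentally derived distance mentioned above, the `side_lengths` helper comes from the previous sketch, and only the horizontal case is handled:

```python
def shifted_detection_area(corners, da, base_shift=0.08):
    """Fourth-embodiment behaviour, roughly: move DA toward the shorter side.

    `corners` is the (TL, TR, BR, BL) screen-area array, `da` the central
    detection area (x0, y0, x1, y1).  The shift grows with the left/right
    length ratio, i.e. with the detected inclination; a real device would
    use values tuned by experiment, as the text notes.
    """
    top, bottom, left, right = side_lengths(corners)
    x0, y0, x1, y1 = da
    width = max(c[0] for c in corners) - min(c[0] for c in corners)
    ratio = max(left, right) / min(left, right)
    dx = base_shift * (ratio - 1.0) * width     # grows with the inclination
    if right < left:                            # right side shorter -> shift right
        return (x0 + dx, y0, x1 + dx, y1)
    return (x0 - dx, y0, x1 - dx, y1)
```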
  • According to the fourth embodiment, even when the screen 300 is tilted, the focus can be adjusted to a suitable position in the screen plane.
  • In experiments, the detection accuracy of the trapezoidal distortion detection test pattern described later was higher when the detection area was set at a position slightly shifted from the central area of the screen in the direction away from the light source than when it was set in the central area of the screen, while no large difference arose in the subjective image quality evaluation value. Focusing on a position shifted farther from the light source than the central region of the screen is therefore an effective method.
  • In the fourth embodiment described above, the detection area is set at a position slightly shifted from the central area of the screen in the direction away from the light source, but the detection area may be set at another position.
  • For example, the detection area DA may be moved to an area including one of the corners of the screen area SA.
  • In this case, the detection area DA is moved to the corner at which the shorter sides intersect when each pair of opposite sides of the screen area SA is compared.
  • The reason the detection area DA is moved in the direction of the shorter side is that, if the detection area were instead set near the longer side, where the luminance is higher than near the shorter side, the in-focus position of the focus lens 13 would be pulled toward the longer side, that is, toward the side nearer the light source, and this must be avoided.
  • the destination of the detection area may be set to an optimal position based on the position of the projection display apparatus, the distance between the projection display apparatus and the screen, and the like.
  • the autofocus adjustment unit 260 may cause the projection unit 10 to project a test pattern for autofocus adjustment in the central detection region or the detection region after movement.
  • Focus information for assisting the user in adjusting the focus may be projected by the projection unit 10 onto a screen area other than the detection area.
  • the focus information includes a focus adjustment bar, button, focus evaluation value, character information, and the like. The user can finely adjust the focus by operating the bar or button with the mouse or the laser pointer while referring to the focus evaluation value or the like.
  • FIG. 24 is a diagram showing a configuration of a projection display apparatus 4200 according to the fifth embodiment of the present invention.
  • the projection display apparatus 4200 according to the fifth embodiment has a configuration in which a trapezoidal distortion correction unit 270 is added to the projection display apparatus 2200 according to the third embodiment.
  • The trapezoidal distortion correction unit 270 detects distortion in the shape of a test pattern for trapezoidal distortion detection by extracting edges in the image captured by the imaging unit 30, and corrects the video signal so that the distortion is canceled.
  • the trapezoidal distortion occurs when the screen 300 and the projection display apparatus 4200 are not facing each other. For example, when the optical axis of the projection light is shifted upward, a trapezoidal distortion in which the upper portion of the screen 300 swells occurs.
  • The video signal setting unit 82 reads a test pattern for trapezoidal distortion detection from the image memory 84 and causes the projection unit 10 to project it.
  • the test pattern is formed of, for example, a quadrangle (square, rectangle, parallelogram, rhombus, etc.).
  • the imaging unit 30 images the test pattern projected on the screen 300.
  • The trapezoidal distortion correction unit 270 detects the trapezoidal distortion based on the shape of the test pattern reflected in the captured image, and corrects the video signal to be set in the video signal setting unit 82 so that the trapezoidal distortion is canceled.
  • the video signal setting unit 82 sets the video signal after the trapezoidal distortion correction supplied from the trapezoidal distortion correction unit 270 to the light modulation unit 12 during a period in which the trapezoidal distortion correction function by the trapezoidal distortion correction unit 270 is valid.
  • When the projection display apparatus 4200 is started up, the trapezoidal distortion correction unit 270 performs the trapezoidal distortion detection process described above after the focus adjustment by the autofocus adjustment unit 260 is completed.
  • the configuration of the trapezoidal distortion correction unit 270 will be described more specifically.
  • the trapezoidal distortion correction unit 270 includes an edge extraction unit 271, a distortion detection unit 272, and a video signal correction unit 273.
  • the edge extraction unit 271 detects an edge from the captured image in which the test pattern is captured. At this time, the edge extraction unit 271 extracts an edge within the screen area set by the screen position detection unit 240. The edge is extracted at a location where a luminance level change exceeding a predetermined threshold has occurred.
  • The distortion detection unit 272 specifies, from the vertex coordinates of the test pattern identified by edge extraction, how much the pattern bulges in the vertical and horizontal directions.
  • the video signal correction unit 273 corrects the video signal to be set in the video signal setting unit 82 so that the swelling of the test pattern projected on the screen 300 is cancelled.
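  • The bulge amounts that the distortion detection works from can be expressed as simple differences of opposite side lengths of the detected test-pattern quadrangle; the sign convention below is an illustrative choice, not one specified by the patent:

```python
import numpy as np

def keystone_bulge(pattern_corners):
    """Rough stand-in for the distortion detection unit 272.

    `pattern_corners` are the (TL, TR, BR, BL) vertices of the projected
    quadrangle found by edge extraction.  Positive `horizontal` means the
    top edge is wider than the bottom edge; positive `vertical` means the
    left edge is taller than the right edge.  These signed amounts are what
    the video signal correction would try to cancel.
    """
    tl, tr, br, bl = (np.asarray(p, dtype=float) for p in pattern_corners)
    top_w, bottom_w = np.linalg.norm(tr - tl), np.linalg.norm(br - bl)
    left_h, right_h = np.linalg.norm(bl - tl), np.linalg.norm(br - tr)
    return {"horizontal": top_w - bottom_w, "vertical": left_h - right_h}
```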
  • FIG. 25 is a diagram for explaining trapezoidal distortion correction.
  • FIG. 25A shows vertical trapezoidal distortion correction
  • FIG. 25B shows horizontal trapezoidal distortion correction.
  • In the vertical trapezoidal distortion correction of FIG. 25(a), the video signal correction unit 273 gradually scales down the horizontal width of the video to be projected from its upper end to its lower end so that a quadrangle whose aspect ratio is maintained is projected onto the screen 300.
  • In the horizontal trapezoidal distortion correction of FIG. 25(b), where the left part of the projection image PrI is swollen, the video signal correction unit 273 gradually scales down the vertical width of the video to be projected from its left end to its right end so that a quadrangle whose aspect ratio is maintained is projected onto the screen 300.
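  • The "gradually scale down the width row by row" idea of FIG. 25(a) can be sketched as follows; the scale factors are example values, nearest-neighbour resampling is used only to keep the sketch dependency-free, and a real implementation would derive the factors from the detected bulge and interpolate properly:

```python
import numpy as np

def prescale_rows(frame, top_scale=0.85, bottom_scale=1.0):
    """Gradually rescale each row's width, in the spirit of FIG. 25(a).

    `frame` is an (H, W) or (H, W, C) array.  Row 0 is shrunk to
    `top_scale` of the full width and the factor ramps linearly to
    `bottom_scale` at the last row; shrunken rows are centred and padded
    with black.
    """
    h, w = frame.shape[:2]
    out = np.zeros_like(frame)
    for y in range(h):
        s = top_scale + (bottom_scale - top_scale) * y / max(h - 1, 1)
        new_w = max(int(round(w * s)), 1)
        src_x = np.clip((np.arange(new_w) / s).astype(int), 0, w - 1)
        x0 = (w - new_w) // 2                   # centre the shrunken row
        out[y, x0:x0 + new_w] = frame[y, src_x]
    return out
```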
  • FIG. 26 is a diagram illustrating a state in which a trapezoidal distortion detection test pattern TP is reflected in the screen area SA in the captured image PuI according to the fifth embodiment.
  • This captured image PuI is focused on the left area of the screen area SA. That is, a case is shown in which the autofocus adjustment is performed using the image signal of the entire captured image or the entire screen area SA without setting the detection area described in the third and fourth embodiments. In this case, the focus is focused on the left area of the screen area SA that is closer to the light source than the center area of the screen area SA.
  • the edge extraction unit 271 may not be able to extract a part of the shape of the test pattern TP from the captured image PuI. If it cannot be extracted, the accuracy of the trapezoidal distortion correction is reduced.
  • In contrast, when the focus has been adjusted using the detection area described in the third or fourth embodiment, the edge extraction unit 271 can extract the shape of the entire test pattern from the captured image, and the accuracy of the trapezoidal distortion correction is improved.
  • According to the fifth embodiment, performing trapezoidal distortion detection after the autofocus adjustment of the third or fourth embodiment therefore improves the accuracy of the trapezoidal distortion correction.
  • If the trapezoidal distortion detection is executed before the autofocus adjustment, or after an autofocus adjustment that uses the image signal of the entire captured image or the entire screen area, the accuracy of the trapezoidal distortion correction may be lowered, because the shape of the entire test pattern may not be extracted from the captured image.
  • The detection area setting unit 250 may adaptively change the size of the detection area according to the degree of inclination of the screen 300 detected by the side length measurement unit 245; for example, it reduces the size of the detection area as the inclination increases. When the inclination is large, the focus tends to settle on the side of the detection area nearer the light source, so the smaller the detection area, the closer the focus can be brought to the center of the entire screen area.
  • a shape correction unit may be provided instead of the trapezoidal distortion correction unit 270.
  • the shape correction unit executes shape correction such as well-known four-point correction (fitting correction).
  • the video signal setting unit 82 causes the projection unit 10 to project a test pattern for four-point correction, such as when the projection display apparatus is activated.
  • the test pattern is, for example, a white rectangle.
  • the imaging unit 30 images the test pattern projected on the screen 300.
  • The shape correction unit corrects the video signal to be set in the video signal setting unit 82 so that the four vertices of the test pattern shown in the captured image coincide with the four vertices of the screen.
  • the video signal setting unit 82 sets the corrected video signal supplied from the shape correction unit in the light modulation unit 12.
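  • Geometrically, mapping the four detected vertices of the test pattern onto the four vertices of the screen amounts to estimating a homography. The sketch below shows only that geometric core, with hypothetical coordinates; a real device would still have to relate camera coordinates to projector coordinates before warping the video signal.

```python
import numpy as np

def homography(src, dst):
    """3x3 homography H with dst ~ H @ src, from four (x, y) correspondences.

    A small direct-linear-transform solve with numpy; only the geometric
    core of four-point correction, not the full correction pipeline.
    """
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

# Example: map the detected corners of the white test pattern onto the
# detected corners of the screen (both sets of coordinates are hypothetical).
pattern = [(90, 40), (620, 70), (600, 470), (70, 430)]
screen  = [(100, 60), (600, 60), (600, 460), (100, 460)]
H = homography(pattern, screen)
p = H @ np.array([90, 40, 1.0])
print(p[:2] / p[2])                 # ~ (100, 60): the first corner maps correctly
```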
  • FIG. 27 is a diagram for explaining the four-point correction.
  • FIG. 27 shows a captured image in a state where the projected image (for example, a white test pattern) PrI protrudes outside the screen 300. As indicated by the arrows in the figure, the shape of the video is adjusted so that the four vertices of the projection image PrI coincide with the corresponding four vertices of the screen 300.
  • the shape correction unit may select and execute either trapezoidal distortion correction or four-point correction based on the shape characteristics of the projection image PrI.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Geometry (AREA)
  • Automatic Focus Adjustment (AREA)
  • Projection Apparatus (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

When manual focus adjustment operations are initiated by a user, focusing information is automatically projected and displayed on a screen. A projection unit (10) projects an image on a screen (300) via a focus lens (13). An imaging unit (30) images the screen. A manually operated focus-ring (15) is provided on the projection unit (10). A focus evaluation value calculation unit (42) acquires an image which has been imaged by the imaging unit (30), and calculates, as a focus evaluation value, a characteristic value which changes according to the focus state of the image projected on the screen. When the focus evaluation value changes, an evaluation value change detection unit (44) determines that the focus-ring (15) is being operated by a user. When it is determined that the focus-ring (15) is being operated, a focus adjustment assist unit (70) combines the focusing information, which is for supporting focus adjustment, with the image and projects and displays the result.

Description

制御装置および投写型映像表示装置Control device and projection display device
 本発明は、投写型映像表示装置におけるフォーカス調整技術に関する。 The present invention relates to a focus adjustment technique in a projection display apparatus.
 スクリーンに映像を投写する投写型映像表示装置(以下適宜、プロジェクタと表記する)の中には、ユーザがフォーカスリングを操作するマニュアルフォーカス調整を補助するための情報を投写するものがある。例えば、特許文献1には、スクリーン上にフォーカス調整パターンの映像信号を出力させる液晶プロジェクタが開示されている。 Some projection-type image display devices that project images onto a screen (hereinafter referred to as projectors as appropriate) project information for assisting manual focus adjustment in which a user operates a focus ring. For example, Patent Document 1 discloses a liquid crystal projector that outputs a video signal of a focus adjustment pattern on a screen.
特開平04-051227号公報Japanese Unexamined Patent Publication No. 04-051227
 フォーカス合焦情報をスクリーンに表示させようとする場合、上述の特許文献1では、ユーザが調整スイッチを操作してフォーカス調節パターンを投写させる必要がある。このようなユーザによる専用ボタンの操作や、または所定のメニューからの選択などの特別な動作なしに、ユーザのマニュアルフォーカス調整の意図を察知して、スクリーン上にフォーカス合焦情報を投写表示することができれば、便利である。 In order to display the focus focusing information on the screen, in Patent Document 1 described above, the user needs to operate the adjustment switch to project the focus adjustment pattern. Without any special operation such as the operation of a dedicated button by the user or selection from a predetermined menu, the user's intention of manual focus adjustment is detected and focus focus information is projected and displayed on the screen. If you can, it is convenient.
 本発明はこうした状況に鑑みなされたものであり、その目的は、ユーザがマニュアルによるフォーカス調整動作を開始したときに、フォーカス合焦情報を自動的にスクリーン上に投写表示する技術を提供することにある。 The present invention has been made in view of such circumstances, and an object of the present invention is to provide a technique for automatically projecting and displaying focus in-focus information on a screen when a user starts a manual focus adjustment operation. is there.
 本発明のある態様は、スクリーンにレンズを介して映像を投写する投写部と、スクリーンを撮像するための撮像部と、投写部に設けられ手動で操作されるフォーカス調整部と、備える投写型映像表示装置に搭載される制御装置である。この装置は、撮像部により撮像された撮像画像を取得し、スクリーンに投写された映像のフォーカス状態に応じて変化する特性値をフォーカス評価値として算出するフォーカス評価値算出部と、フォーカス評価値が変化しているとき、ユーザによりフォーカス調整部が操作されていると判定する評価値変化検出部と、フォーカス調整部が操作されていると判定されると、フォーカス調整を補助するためのフォーカス合焦情報を映像と合わせて投写表示させるフォーカス調整アシスト部と、を備える。 An aspect of the present invention provides a projection image that includes a projection unit that projects an image on a screen via a lens, an imaging unit that images the screen, and a focus adjustment unit that is provided in the projection unit and is manually operated. It is a control device mounted on the display device. This apparatus acquires a captured image captured by an imaging unit, calculates a characteristic value that changes according to a focus state of an image projected on a screen as a focus evaluation value, and a focus evaluation value An evaluation value change detection unit that determines that the focus adjustment unit is operated by the user and a focus focus for assisting focus adjustment when it is determined that the focus adjustment unit is operated. A focus adjustment assist unit that projects and displays information together with the image.
 本発明の別の態様は、投写型映像表示装置である。この装置は、スクリーンにレンズを介して映像を投写する投写部と、スクリーンを撮像するための撮像部と、上述した制御部と、を備える。 Another aspect of the present invention is a projection display apparatus. This apparatus includes a projection unit that projects an image on a screen via a lens, an imaging unit for imaging the screen, and the control unit described above.
 本発明によれば、ユーザによるマニュアルフォーカス調整動作の開始時に、フォーカス合焦情報をスクリーンに投写表示することができる。 According to the present invention, focus focus information can be projected and displayed on the screen at the start of a manual focus adjustment operation by the user.
本発明の一実施形態に係る投写型映像表示装置とスクリーンとの位置関係を示す図である。It is a figure which shows the positional relationship of the projection type video display apparatus concerning one Embodiment of this invention, and a screen. 一実施形態に係る投写型映像表示装置の構成を示す図である。It is a figure which shows the structure of the projection type video display apparatus concerning one Embodiment. 映像信号評価部の動作を説明するフローチャートである。It is a flowchart explaining operation | movement of a video signal evaluation part. 入力信号評価値の変化とフォーカスアシスト機能の状態との関係を説明する図である。It is a figure explaining the relationship between the change of an input signal evaluation value, and the state of a focus assist function. フォーカス評価部の動作を説明するフローチャートである。It is a flowchart explaining operation | movement of a focus evaluation part. フォーカス評価値の変化とフォーカスアシスト機能の状態との関係を説明する図である。It is a figure explaining the relationship between the change of a focus evaluation value, and the state of a focus assist function. 本発明の第2実施形態に係る投写型映像表示装置の機能構成を模式的に示す図である。It is a figure which shows typically the function structure of the projection type video display apparatus concerning 2nd Embodiment of this invention. 図8(a)は、スクリーンに投写されたズーム操作検出用の映像と撮像部の撮像範囲との関係の一例を示す図である。図8(b)は、スクリーンに投写されたズーム操作検出用の映像と撮像部の撮像範囲との関係を示す別の例を示す図である。FIG. 8A is a diagram illustrating an example of a relationship between a zoom operation detection image projected on a screen and an imaging range of an imaging unit. FIG. 8B is a diagram showing another example of the relationship between the zoom operation detection image projected on the screen and the imaging range of the imaging unit. フォーカス調整時におけるフォーカス評価値の変動の一例を示す図である。It is a figure which shows an example of the fluctuation | variation of the focus evaluation value at the time of focus adjustment. 図10(a)は、フォーカス調整開始時のグラフの一例を示す図である。図10(b)は、フォーカス評価値の取るべき目標値が未検出の期間において、フォーカス評価値がそれまでの最大値よりも小さな値となる場合のグラフの一例を示す図である。図10(c)は、フォーカス評価値の取るべき目標値が検出された後の期間において、フォーカス評価値が下降する場合のグラフの一例を示す図である。図10(d)は、フォーカス評価値の取るべき目標値が検出された後の期間において、フォーカス評価値が上昇する場合のグラフの一例を示す図である。FIG. 10A is a diagram illustrating an example of a graph at the start of focus adjustment. FIG. 10B is a diagram illustrating an example of a graph in a case where the focus evaluation value is smaller than the maximum value in the period in which the target value to be taken as the focus evaluation value is not detected. FIG. 10C is a diagram illustrating an example of a graph in the case where the focus evaluation value decreases in the period after the target value to be taken as the focus evaluation value is detected. FIG. 10D is a diagram illustrating an example of a graph in the case where the focus evaluation value increases in a period after the target value to be taken as the focus evaluation value is detected. フォーカス評価値および目標値の組み合わせとフォーカス状態情報の表示色とが対応づけられたテーブルを示す図である。It is a figure which shows the table with which the combination of the focus evaluation value and target value, and the display color of focus state information were matched. フォーカス状態情報の表示色の状態遷移を示す図である。It is a figure which shows the state transition of the display color of focus state information. 第2実施形態に係る、主に制御部の処理の流れを示すフローチャートである。It is a flowchart which mainly shows the flow of a process of the control part based on 2nd Embodiment. 図14(a)は、主にフォーカス調整に利用されるパターン画像を示す図である。図14(b)は、フォーカス調整に利用されるパターン画像とズーム操作検出に利用されるパターン画像との組み合わせの例を示す図である。FIG. 14A is a diagram showing a pattern image mainly used for focus adjustment. FIG. 14B is a diagram illustrating an example of a combination of a pattern image used for focus adjustment and a pattern image used for zoom operation detection. 
本発明の第3実施形態に係る投写型映像表示装置と、スクリーンとの位置関係を示す図である。It is a figure which shows the positional relationship of the projection type video display apparatus concerning 3rd Embodiment of this invention, and a screen. 図15に示した位置関係において、撮像部により撮像された撮像画像の一例を示す図である。It is a figure which shows an example of the captured image imaged by the imaging part in the positional relationship shown in FIG. 第3実施形態に係る投写型映像表示装置の構成を示す図である。It is a figure which shows the structure of the projection type video display apparatus concerning 3rd Embodiment. 第3実施形態に係る撮像画像内のスクリーン領域内に、検出領域が設定される様子を示す図である。It is a figure which shows a mode that a detection area | region is set in the screen area | region in the captured image which concerns on 3rd Embodiment. 第3実施形態に係る検出領域の設定方法の一例を説明するための図である。It is a figure for demonstrating an example of the setting method of the detection area which concerns on 3rd Embodiment. フォーカスレンズの合焦位置の決定処理について説明するための図である。It is a figure for demonstrating the determination process of the focus position of a focus lens. 本発明の第4実施形態に係る投写型映像表示装置の構成を示す図である。It is a figure which shows the structure of the projection type video display apparatus concerning 4th Embodiment of this invention. 第4実施形態に係る撮像画像内のスクリーン領域内に、検出領域が設定される様子を示す図である。It is a figure which shows a mode that a detection area | region is set within the screen area | region in the captured image which concerns on 4th Embodiment. スクリーン領域のいずれかの角を含む領域に検出領域を移動させる様子を示す図である。It is a figure which shows a mode that a detection area | region is moved to the area | region containing any corner | angular part of a screen area | region. 本発明の第5実施形態に係る投写型映像表示装置の構成を示す図である。It is a figure which shows the structure of the projection type video display apparatus concerning 5th Embodiment of this invention. 台形歪み補正を説明するための図である。図25(a)は、垂直台形歪み補正を示し、図25(b)は、水平台形歪み補正を示す。It is a figure for demonstrating trapezoid distortion correction. FIG. 25A shows vertical trapezoidal distortion correction, and FIG. 25B shows horizontal trapezoidal distortion correction. 第5実施形態に係る撮像画像内のスクリーン領域内に、台形歪み検出用のテストパターンが写っている様子を示す図である。It is a figure which shows a mode that the test pattern for trapezoid distortion detection is reflected in the screen area | region in the captured image which concerns on 5th Embodiment. 四点補正を説明するための図である。It is a figure for demonstrating four-point correction | amendment.
第1実施形態
 図1は、本発明の第1実施形態に係る投写型映像表示装置200とスクリーン300との位置関係を示す図である。第1実施形態に係る投写型映像表示装置200は、スクリーン300方向を撮影するための撮像部30を備える。撮像部30は、その光軸中心と、投写型映像表示装置200から投写される投写光の光軸中心とが例えば平行な関係になるよう、設置される。図1では、スクリーン300が投写型映像表示装置200に対して正対している。
First Embodiment FIG. 1 is a diagram showing a positional relationship between a projection display apparatus 200 and a screen 300 according to a first embodiment of the present invention. The projection display apparatus 200 according to the first embodiment includes an imaging unit 30 for photographing the direction of the screen 300. The imaging unit 30 is installed such that the optical axis center thereof and the optical axis center of the projection light projected from the projection display apparatus 200 have a parallel relationship, for example. In FIG. 1, the screen 300 faces the projection display apparatus 200.
 投写型映像表示装置200は、レンズの前に設けられたフォーカスリングを手動で動かしてフォーカス調節を実施する。このフォーカス調節を補助するために、投写型映像表示装置200は、フォーカス合焦情報をスクリーン300上に投写表示する。ユーザは、このフォーカス合焦情報を参照しながらフォーカスリングを動かして、適切なフォーカス調節をすることができる。 The projection display apparatus 200 performs focus adjustment by manually moving a focus ring provided in front of the lens. In order to assist this focus adjustment, the projection display apparatus 200 projects and displays focus focusing information on the screen 300. The user can adjust the focus appropriately by moving the focus ring while referring to the focus focusing information.
 図2は、投写型映像表示装置200の構成を示す図である。投写型映像表示装置200は、投写部10、撮像部30および制御部100を備える。制御部100は、フォーカス評価部40、映像信号評価部50、フォーカス目標算出部60、フォーカス調整アシスト部70、映像信号設定部82および画像メモリ84を備える。 FIG. 2 is a diagram showing a configuration of the projection display apparatus 200. As shown in FIG. The projection display apparatus 200 includes a projection unit 10, an imaging unit 30, and a control unit 100. The control unit 100 includes a focus evaluation unit 40, a video signal evaluation unit 50, a focus target calculation unit 60, a focus adjustment assist unit 70, a video signal setting unit 82, and an image memory 84.
 制御部100の構成は、ハードウェア的には、任意のコンピュータのCPU、メモリ、その他のLSIで実現でき、ソフトウェア的にはメモリにロードされたプログラムなどによって実現されるが、ここではそれらの連携によって実現される機能ブロックを描いている。したがって、これらの機能ブロックがハードウェアのみ、ソフトウェアのみ、またはそれらの組合せによっていろいろな形で実現できることは、当業者には理解されるところである。 The configuration of the control unit 100 can be realized in terms of hardware by a CPU, memory, or other LSI of an arbitrary computer, and can be realized in terms of software by a program loaded in the memory. Depicts functional blocks realized by. Accordingly, those skilled in the art will understand that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof.
 投写部10は、スクリーン300に映像を投写する。投写部10は、光源11、光変調部12およびフォーカスレンズ13を含む。光源11には、フィラメント型の電極構造を有するハロゲンランプ、アーク放電を発生させる電極構造を有するメタルハライドランプ、キセノンショートアークランプ、高圧型の水銀ランプ、LEDランプなどを採用することができる。 Projection unit 10 projects an image on screen 300. The projection unit 10 includes a light source 11, a light modulation unit 12, and a focus lens 13. As the light source 11, a halogen lamp having a filament-type electrode structure, a metal halide lamp having an electrode structure for generating arc discharge, a xenon short arc lamp, a high-pressure mercury lamp, an LED lamp, or the like can be employed.
 光変調部12は、映像信号設定部82から設定される映像信号に応じて、光源11から入射される光を変調する。光変調部12には、例えばDMD(Digital Micromirror Device)を採用することができる。DMDは、画素数に対応した複数のマイクロミラーを備え、各マイクロミラーの向きが各画素信号に応じて制御されることにより、所望の映像光を生成する。 The light modulator 12 modulates the light incident from the light source 11 in accordance with the video signal set by the video signal setting unit 82. For example, a DMD (Digital Micromirror Device) can be used as the light modulator 12. The DMD includes a plurality of micromirrors corresponding to the number of pixels, and generates desired video light by controlling the direction of each micromirror according to each pixel signal.
 フォーカスレンズ13は、光変調部12から入射される光の焦点位置を調整する。フォーカスレンズ13にはフォーカスリング15が設けられており、このフォーカスリングを手動で回転させることによりそのレンズ位置が光軸上で移動される。光変調部12により生成された映像光は、フォーカスレンズ13を介してスクリーン300に投写される。なお、レンズ位置を光軸上で移動させるものであれば、フォーカスリング以外の任意の装置でよい。 The focus lens 13 adjusts the focal position of the light incident from the light modulator 12. The focus lens 13 is provided with a focus ring 15, and the lens position is moved on the optical axis by manually rotating the focus ring. The image light generated by the light modulation unit 12 is projected onto the screen 300 via the focus lens 13. Any device other than the focus ring may be used as long as the lens position is moved on the optical axis.
 撮像部30は、スクリーン300およびスクリーン300に投影された投影画像を主な被写体として撮像する。撮像部30は、固体撮像素子31および信号処理回路32を含む。固体撮像素子31には、CMOS(Complementary Metal Oxide Semiconductor)イメージセンサやCCD(Charge Coupled Devices)イメージセンサなどを採用することができる。信号処理回路32は、固体撮像素子31から出力される信号に対して、A/D変換や、RGBフォーマットからYUVフォーマットへの変換などの各種信号処理を施し、制御部100に出力する。 The imaging unit 30 captures the screen 300 and a projected image projected on the screen 300 as a main subject. The imaging unit 30 includes a solid-state imaging device 31 and a signal processing circuit 32. As the solid-state imaging device 31, a CMOS (Complementary Metal Oxide Semiconductor) image sensor, a CCD (Charge Coupled Devices) image sensor, or the like can be employed. The signal processing circuit 32 performs various signal processing such as A / D conversion and conversion from the RGB format to the YUV format on the signal output from the solid-state imaging device 31, and outputs the signal to the control unit 100.
 フォーカス目標算出部60は、映像信号設定部82から、スクリーンに投写表示中の映像信号を取得する。そして、後述するフォーカス評価値算出部42と同様の演算を行って、入力映像信号の評価値を算出する。フォーカス評価値算出部42と同様に、画像全体に対して演算を行ってもよいし、または画像内の分割された領域毎に演算を行ってもよい。 The focus target calculation unit 60 acquires the video signal being projected and displayed on the screen from the video signal setting unit 82. Then, an evaluation value of the input video signal is calculated by performing the same calculation as that of the focus evaluation value calculation unit 42 described later. Similar to the focus evaluation value calculation unit 42, the calculation may be performed on the entire image, or may be performed for each divided area in the image.
 算出された入力映像信号の評価値は、フォーカス度合いの目標値として利用される。したがって、入力映像信号の評価値は、フォーカス調整アシスト部70に送られる。そして、後述するフォーカスアシスト機能がオンであるときに、フォーカス合焦情報として撮像画像に対するフォーカス評価値とともにスクリーン上に重畳表示される。ユーザは、フォーカス合焦情報として表示された、入力信号の評価値と、撮像画像に対するフォーカス評価値とを比較することで、フォーカスの合焦度合いの目安を得ることができる。 The calculated evaluation value of the input video signal is used as a target value for the focus degree. Therefore, the evaluation value of the input video signal is sent to the focus adjustment assist unit 70. Then, when a focus assist function described later is on, it is superimposed and displayed on the screen together with the focus evaluation value for the captured image as focus in-focus information. The user can obtain an indication of the focus degree of focus by comparing the evaluation value of the input signal displayed as the focus focusing information with the focus evaluation value for the captured image.
 なお、入力信号の評価値と、撮像画像に対するフォーカス評価値とを表示する代わりに、またはこれとともに、両評価値を用いてフォーカスの合焦度合いを表す評価値を算出し、これを表示するようにしてもよい。 Instead of displaying the evaluation value of the input signal and the focus evaluation value for the captured image, or together with this, the evaluation value representing the focus degree of focus is calculated and displayed using both evaluation values. It may be.
 フォーカス調整アシスト部70は、映像信号評価部50からの指示に応じて、フォーカスアシスト機能をスタンバイ状態にする。そして、フォーカス評価部40からの指示に応じて、フォーカスアシスト機能をオンまたはオフにする。 The focus adjustment assist unit 70 sets the focus assist function to a standby state in response to an instruction from the video signal evaluation unit 50. Then, in accordance with an instruction from the focus evaluation unit 40, the focus assist function is turned on or off.
 ここで「フォーカスアシスト機能」とは、フォーカスリングのマニュアル操作によるフォーカス調整を補助するためのフォーカス合焦情報を、スクリーン上の投影画面の少なくとも一部として重畳表示する機能である。本実施形態では、ユーザによるマニュアルフォーカス調整の開始を検出してフォーカス合焦情報を重畳表示するとともに、マニュアルフォーカス調整の終了を検出してフォーカス合焦情報の表示を終了する。また、フォーカス合焦情報とともに表示される映像は、映像信号をそのまま使用してもよいし、またはフォーカス専用のパターンを表示してもよい。 Here, the “focus assist function” is a function for superimposing and displaying focus in-focus information for assisting focus adjustment by manual operation of the focus ring as at least a part of the projection screen on the screen. In the present embodiment, the start of the manual focus adjustment by the user is detected and the focus focus information is superimposed and displayed, and the end of the manual focus adjustment is detected and the display of the focus focus information is ended. Also, the video displayed together with the focus focusing information may use the video signal as it is, or may display a focus-dedicated pattern.
 フォーカス合焦情報は、撮像画像のフォーカス評価値およびフォーカス目標値(すなわち、入力映像信号の評価値)である。これにより、フォーカスリングのマニュアル調節を補助することができる。 The focus focusing information is a focus evaluation value and a focus target value (that is, an evaluation value of the input video signal) of the captured image. Thereby, manual adjustment of the focus ring can be assisted.
 映像信号評価部50は、映像信号設定部82からスクリーンに投写表示中の映像信号を取得する。映像信号評価部50は、入力信号評価値算出部52および入力信号評価値判定部54を含む。 The video signal evaluation unit 50 acquires the video signal being projected and displayed on the screen from the video signal setting unit 82. The video signal evaluation unit 50 includes an input signal evaluation value calculation unit 52 and an input signal evaluation value determination unit 54.
 入力信号評価値算出部52は、フォーカスアシスト機能をスタンバイ状態にするか否かを判定するための入力信号評価値を算出する。入力信号評価値としては、映像信号の高周波成分、コントラスト情報、輝度情報などの、映像内での動きの有無に応じて変化する任意の特性値を用いることができる。映像信号から高周波成分またはコントラストを算出するには、任意の手法を用いることができる。入力信号評価値は、画面の全体から算出してもよいし、画面を複数の領域に分割して領域毎に算出してもよい。入力信号評価値は、所定の時間間隔で、または所定数のフレーム毎に算出されることが好ましい。 The input signal evaluation value calculation unit 52 calculates an input signal evaluation value for determining whether or not to set the focus assist function to the standby state. As the input signal evaluation value, any characteristic value that changes depending on the presence or absence of motion in the video, such as a high-frequency component, contrast information, and luminance information of the video signal can be used. Any method can be used to calculate the high-frequency component or contrast from the video signal. The input signal evaluation value may be calculated from the entire screen, or may be calculated for each region by dividing the screen into a plurality of regions. The input signal evaluation value is preferably calculated at a predetermined time interval or every predetermined number of frames.
 入力信号評価値判定部54は、入力信号評価値の変化に基づき、フォーカスアシスト機能をスタンバイするか否かを判定する。入力信号評価値に基づき、スクリーンに投写表示中の映像信号が動画であるか、または静止画が切り替わっていると判定された場合、マニュアルフォーカス調整は困難であるから、フォーカスアシスト機能をオフのままとする。入力信号評価値に基づき、スクリーンに投写表示中の映像信号が静止画であると判定された場合、フォーカスアシスト機能をスタンバイ状態にするようにフォーカス調整アシスト部70に指示する。 The input signal evaluation value determination unit 54 determines whether or not to standby the focus assist function based on the change in the input signal evaluation value. Based on the input signal evaluation value, if it is determined that the video signal being projected and displayed on the screen is a moving image or a still image has been switched, manual focus adjustment is difficult, so the focus assist function remains off. And When it is determined that the video signal being projected and displayed on the screen is a still image based on the input signal evaluation value, the focus adjustment assist unit 70 is instructed to set the focus assist function to the standby state.
 フォーカス評価部40は、信号処理回路32から撮像画像を取得する。フォーカス評価部40は、フォーカス評価値算出部42および評価値変化検出部44を含む。 The focus evaluation unit 40 acquires a captured image from the signal processing circuit 32. The focus evaluation unit 40 includes a focus evaluation value calculation unit 42 and an evaluation value change detection unit 44.
 フォーカス評価値算出部42は、ユーザによるマニュアルフォーカス調整の開始および終了を検出するためのフォーカス評価値を算出する。フォーカス評価値としては、撮像画像の高周波成分、コントラスト情報、輝度情報など、スクリーンに投写された映像のフォーカス状態に応じて変化する任意の特性値を用いることができる。撮像画像から高周波成分またはコントラストを算出するには、任意の手法を用いることができる。フォーカス評価値は、撮像画像全体から、または撮像画像を複数の領域に分割して領域毎に算出される。フォーカス評価値は、所定の時間間隔で、または所定数のフレーム毎に算出されることが好ましい。 The focus evaluation value calculation unit 42 calculates a focus evaluation value for detecting the start and end of manual focus adjustment by the user. As the focus evaluation value, any characteristic value that changes according to the focus state of the image projected on the screen, such as a high-frequency component of the captured image, contrast information, and luminance information, can be used. Any method can be used to calculate the high-frequency component or contrast from the captured image. The focus evaluation value is calculated for each area from the entire captured image or by dividing the captured image into a plurality of areas. The focus evaluation value is preferably calculated at a predetermined time interval or every predetermined number of frames.
 評価値変化検出部44は、フォーカスアシスト機能がスタンバイ状態にあるとき、フォーカス評価値に基づきユーザによるマニュアルフォーカス調整の開始を検出する。具体的には、フォーカス評価値の変化がない場合は、フォーカスリングが動かされていないものとしてフォーカスアシスト機能のスタンバイ状態を維持する。複数の領域のうち一部の領域のフォーカス評価値のみが変化した場合は、プロジェクタの前方を人や手が横切ったものと判断して、フォーカスアシスト機能のスタンバイ状態を維持する。全ての領域でフォーカス評価値が同じ増減方向に変化した場合は、フォーカスリングが動かされているものと判断して、フォーカスアシスト機能をオンにするようにフォーカス調整アシスト部70に指示する。 The evaluation value change detection unit 44 detects the start of manual focus adjustment by the user based on the focus evaluation value when the focus assist function is in the standby state. Specifically, when there is no change in the focus evaluation value, the standby state of the focus assist function is maintained assuming that the focus ring is not moved. When only the focus evaluation value of a part of the plurality of areas changes, it is determined that a person or hand has crossed the front of the projector, and the standby state of the focus assist function is maintained. When the focus evaluation value changes in the same increase / decrease direction in all areas, it is determined that the focus ring is moved, and the focus adjustment assist unit 70 is instructed to turn on the focus assist function.
 さらに、評価値変化検出部44は、フォーカス評価値に基づきユーザによるマニュアルフォーカス調整の終了を検出する。すなわち、フォーカスアシスト機能がオンにされた後、所定の期間(例えば5~10秒)フォーカス評価値に変化が見られない場合、フォーカスリングによる調整が終了したと判断して、フォーカスアシスト機能をオフにするようにフォーカス調整アシスト部70に指示する。 Furthermore, the evaluation value change detection unit 44 detects the end of manual focus adjustment by the user based on the focus evaluation value. In other words, after the focus assist function is turned on, if there is no change in the focus evaluation value for a predetermined period (for example, 5 to 10 seconds), it is determined that the adjustment by the focus ring has been completed and the focus assist function is turned off. The focus adjustment assist unit 70 is instructed to
 なお、フォーカスリング15が動かされた場合、投影画像の画角はわずかではあるが変化する。したがって、フォーカス評価値にはある程度の許容変化量を設定しておくことが好ましい。 Note that when the focus ring 15 is moved, the angle of view of the projected image changes slightly. Therefore, it is preferable to set a certain amount of allowable change in the focus evaluation value.
 また、投写表示されている映像が、その一部にのみパターンがありその他はベタ画像である場合には、フォーカスリングによるマニュアル調整の最中に、一部の領域のフォーカス評価値の変化度合いが、その他の領域のフォーカス評価値の変化度合いよりも大きくなる現象が起こることがある。このようなときでも、全ての領域におけるフォーカス評価値の増減方向は等しい(すなわち、全ての領域で増加するか、または全ての領域で減少する)ので、フォーカスリングによるマニュアル調整の開始を検出できることに変わりはない。 Also, if the projected image has a pattern only on a part of it and the other part is a solid image, the degree of change in the focus evaluation value of a part of the area during manual adjustment using the focus ring In some cases, a phenomenon may occur in which the degree of change in the focus evaluation value in other regions becomes larger. Even in such a case, since the increase / decrease direction of the focus evaluation value in all the regions is the same (that is, increases in all regions or decreases in all regions), it is possible to detect the start of manual adjustment by the focus ring. There is no change.
 画像メモリ84は、スクリーン300に投写すべき画像データを保持する。当該画像データは、図示しない外部インタフェースを介して、パーソナルコンピュータやDVDプレーヤ等の映像再生装置から供給される。映像信号設定部82は、画像メモリ84に保持される画像データにもとづく映像信号を光変調部12に設定する。 The image memory 84 holds image data to be projected on the screen 300. The image data is supplied from a video reproduction device such as a personal computer or a DVD player via an external interface (not shown). The video signal setting unit 82 sets a video signal based on the image data held in the image memory 84 in the light modulation unit 12.
 本実施形態に係る投写型映像表示装置200では、ユーザによるフォーカスリング15を用いたマニュアルフォーカス調整の開始および終了を、以下のような特性に基づいて検出している。 In the projection display apparatus 200 according to the present embodiment, the start and end of manual focus adjustment using the focus ring 15 by the user is detected based on the following characteristics.
 すなわち、フォーカスリングを動かした場合、画面全体のフォーカス状態が変化することが多く、画面の一部のみのフォーカス状態が大きく変化することは少ない。また、フォーカス状態は比較的緩やかに変化し、フォーカスが合っている付近でのみ急峻な変化がある。これに対し、プロジェクタの前を人や手が横切った場合は、画面の一部のみでフォーカス状態は大きく変化する。また、映像信号が切り替わった場合には、フォーカス状態は急峻に変化する。 That is, when the focus ring is moved, the focus state of the entire screen often changes, and the focus state of only a part of the screen rarely changes greatly. Further, the focus state changes relatively slowly, and there is a steep change only in the vicinity of the focus. On the other hand, when a person or hand crosses in front of the projector, the focus state changes greatly only with a part of the screen. In addition, when the video signal is switched, the focus state changes abruptly.
 このような特性に基づき、投写型映像表示装置200は、スクリーンに投写表示される映像をカメラにより撮像し、撮像画像から算出されたフォーカス評価値の変化の様子から、フォーカスリングが動かされているか否か、すなわちマニュアルフォーカス調整の開始および終了を判断する。 Based on such characteristics, the projection display apparatus 200 captures an image projected and displayed on the screen with a camera, and determines whether the focus ring is moved based on a change in the focus evaluation value calculated from the captured image. No, that is, the start and end of manual focus adjustment are determined.
 図3は、主に映像信号評価部50の動作を説明するフローチャートである。まず、入力信号評価値算出部52は、映像信号設定部82から投写表示中の映像信号を取得し(S10)、画面全体または分割された領域毎に入力信号評価値を算出する(S12)。所定の期間(例えば数秒)内に入力信号評価値に変化がない場合(S14のN)、映像信号が静止画であると判定し、フォーカスアシスト機能をスタンバイ状態にするようにフォーカス調整アシスト部70に指示する(S20)。入力信号評価値に変化がある場合(S14のY)、映像信号が動画であるか、または静止画が切り替わっているものと判定し、フォーカスアシスト機能をオフのままにする(S16)。その後、所定の期間内に入力信号評価値に変化が見られない場合は(S18のY)、映像信号が静止画に切り替わったと判定し、フォーカスアシスト機能をスタンバイ状態にするようにフォーカス調整アシスト部70に指示する(S20)。 FIG. 3 is a flowchart for mainly explaining the operation of the video signal evaluation unit 50. First, the input signal evaluation value calculation unit 52 acquires a video signal being projected and displayed from the video signal setting unit 82 (S10), and calculates an input signal evaluation value for the entire screen or for each divided area (S12). When there is no change in the input signal evaluation value within a predetermined period (for example, several seconds) (N in S14), it is determined that the video signal is a still image, and the focus adjustment assist unit 70 is set so that the focus assist function is in a standby state. (S20). If there is a change in the input signal evaluation value (Y in S14), it is determined that the video signal is a moving image or a still image is switched, and the focus assist function is kept off (S16). Thereafter, when no change is observed in the input signal evaluation value within a predetermined period (Y in S18), it is determined that the video signal has been switched to a still image, and the focus adjustment assist unit is set so that the focus assist function is in a standby state. 70 is instructed (S20).
 図4は、入力信号評価値の変化とフォーカスアシスト機能の状態とを説明する図である。図4(a)は入力信号評価値の時間変化を示し、図4(b)はフォーカスアシスト機能の状態を示している。 FIG. 4 is a diagram for explaining the change in the input signal evaluation value and the state of the focus assist function. FIG. 4A shows the time change of the input signal evaluation value, and FIG. 4B shows the state of the focus assist function.
 図4(a)において、期間t1、t3、t5では、入力信号評価値の変化がない。したがって、この期間は映像信号が静止画であると判断され、フォーカスアシスト機能はスタンバイ状態にされる。期間t2では、入力信号評価値が急激に変化する。この期間は静止画が切り替わったものと判断され、フォーカスアシスト機能はオフにされる。期間t4では、各領域の入力信号評価値がランダムに変動している。この期間は映像信号が動画であると判断され、フォーカスアシスト機能はオフにされる。 4A, there is no change in the input signal evaluation value in the periods t1, t3, and t5. Therefore, during this period, it is determined that the video signal is a still image, and the focus assist function is set to the standby state. In the period t2, the input signal evaluation value changes rapidly. During this period, it is determined that the still image has been switched, and the focus assist function is turned off. In the period t4, the input signal evaluation value in each region varies randomly. During this period, it is determined that the video signal is a moving image, and the focus assist function is turned off.
 図5は、主にフォーカス評価部40の動作を説明するフローチャートである。まず、フォーカスアシスト機能がスタンバイ状態にあるか否かを判定する(S50)。スタンバイ状態のとき(S50のY)、フォーカス評価値算出部42は撮像画像を取得し、分割された領域毎にフォーカス評価値を算出する(S52)。評価値変化検出部44は、所定の期間(例えば数秒)内でのフォーカス評価値を観測する(S54)。そして、全ての領域でフォーカス評価値に変化が見られないか、または所定の割合以下の領域でフォーカス評価値に変化がある場合(S54のY)、フォーカスアシスト機能のスタンバイ状態を維持する(S56)。所定の割合以上の領域でフォーカス評価値に変化が見られる場合(S54のN)、マニュアルフォーカス調整が開始されていると判定し、フォーカスアシスト機能をオンにするようにフォーカス調整アシスト部70に指示する(S58)。これに応じて、フォーカス合焦情報が投写画面の少なくとも一部に投写表示される。 FIG. 5 is a flowchart for mainly explaining the operation of the focus evaluation unit 40. First, it is determined whether or not the focus assist function is in a standby state (S50). In the standby state (Y in S50), the focus evaluation value calculation unit 42 acquires a captured image and calculates a focus evaluation value for each divided area (S52). The evaluation value change detection unit 44 observes the focus evaluation value within a predetermined period (for example, several seconds) (S54). If no change in the focus evaluation value is observed in all areas, or there is a change in the focus evaluation value in an area below a predetermined ratio (Y in S54), the standby state of the focus assist function is maintained (S56). ). When a change in the focus evaluation value is observed in an area of a predetermined ratio or more (N in S54), it is determined that manual focus adjustment has been started, and the focus adjustment assist unit 70 is instructed to turn on the focus assist function. (S58). In response to this, focus information is projected and displayed on at least a part of the projection screen.
 評価値変化検出部44は、フォーカス評価値が変化している間は(S60のN)、フォーカスリングが動かされていると判断する。所定の期間、フォーカス評価値に変化が見られなくなると(S60のY)、ユーザによるマニュアルフォーカス調整が終了したと判定し、フォーカスアシスト機能をオフにするようにフォーカス調整アシスト部70に指示する(S62)。これに応じて、フォーカス合焦情報が投写画面から消える。 The evaluation value change detection unit 44 determines that the focus ring is being moved while the focus evaluation value is changing (N in S60). When no change in the focus evaluation value is observed for a predetermined period (Y in S60), it is determined that the manual focus adjustment by the user has ended, and the focus adjustment assist unit 70 is instructed to turn off the focus assist function ( S62). In response to this, the focus focusing information disappears from the projection screen.
 In S52 to S54 described above, when the focus evaluation value is calculated for each divided area and it is judged whether the focus evaluation value changes in no area or only in some of the areas, the evaluation value change detection unit 44 may also refer to input-signal evaluation values calculated by dividing the screen into a plurality of areas. In this way, the focus evaluation value of the captured image can be judged while referring to the local characteristics of the input signal, so the judgment accuracy can be improved.
 FIG. 6 is a diagram explaining changes in the focus evaluation value and the state of the focus assist function. FIG. 6(a) shows the focus evaluation value over time, and FIG. 6(b) shows the state of the focus assist function.
 In the following description, the focus assist function is assumed to be in the standby state. In FIG. 6(a), during period t6 the focus evaluation value does not change, so the standby state is maintained. During period t7, only the focus evaluation values of some areas fluctuate, so this is not regarded as an operation of the focus ring and the standby state of the focus assist function is maintained. During period t8, the focus evaluation values of all areas fluctuate, so it is determined that the focus ring is being moved and the focus assist function is turned on. In response, the focus focusing information is projected and displayed on at least a part of the projection screen. Thereafter, because the focus evaluation value does not change over the predetermined period D, it is determined that the operation of the focus ring has finished, and the focus assist function is turned off in period t9. In response, the focus focusing information disappears from the projection screen.
 As described above, the projection display apparatus according to the first embodiment can determine whether the user is moving the focus ring manually. When the user moves the focus ring, the focus focusing information is automatically superimposed on the projection screen, and when the focus ring operation stops, the focus focusing information disappears from the projection screen. The user therefore does not need to operate a dedicated button or select a function from a menu screen in order to display the focus focusing information.
 The first embodiment described above is an example. It will be understood by those skilled in the art that various modifications to the combination of its constituent elements and processes are possible, and that such modifications are also within the scope of the present invention.
 In the first embodiment, the focus evaluation value is calculated for the entire captured image or for each divided area. Alternatively, the focus evaluation value calculation unit may detect the screen area within the captured image by a well-known method and calculate the focus evaluation value only within that screen area. In this case the focus evaluation value is calculated only inside the screen, where the focus actually changes with the focus ring operation, so the focus ring operation can be detected more accurately.
 An inclination determination unit that determines, based on the captured image, whether the screen 300 is inclined may also be provided. When the screen is determined to be inclined, the in-focus point obtained when the focus ring is operated differs depending on the position within the screen. In this case, the focus evaluation value calculation unit preferably calculates the focus evaluation value only in the region of the screen that lies toward the back-focus side.
Second embodiment.
 In a projection display apparatus, automatically performing the various adjustments required for display calls for a motor, its control circuitry, and so on, which can raise the cost of the apparatus. On the other hand, adjusting the focus by manually operating the focus ring can be a cumbersome task for the user. If the current focus state can be projected and displayed on the screen while the user adjusts the focus manually, the convenience of focus adjustment can therefore be improved while keeping the cost of the apparatus down.
 An object of the second embodiment of the present invention is to provide an aid for the user when performing a manual focus adjustment operation.
 An outline of the second embodiment follows. The projection display apparatus according to the second embodiment refreshes the focus adjustment assistance information presented to the user, triggered by detection of a zoom operation by the user.
 FIG. 7 schematically shows the functional configuration of a projection display apparatus 1200 according to the second embodiment. The projection display apparatus 1200 includes a projection unit 10, an imaging unit 30, and a control unit 1100. The control unit 1100 includes a focus evaluation value calculation unit 142, a zoom operation detection unit 144, a focus state information setting unit 150, a video signal setting unit 82, and an image memory 84. The projection unit 10, imaging unit 30, video signal setting unit 82, and image memory 84 have the same configuration as described with reference to FIG. 1, so part of their description is omitted.
 The projection unit 10 includes a light source 11, a light modulation unit 12, a lens 13, a zoom ring 14, and a focus ring 15. The lens 13 adjusts the focal length and the focus of the light incident from the light modulation unit 12. Although not shown, the lens 13 includes a zoom lens for changing the focal length and a focusing lens for adjusting the focus. The lens 13 is provided with the zoom ring 14 and the focus ring 15. When the user manually rotates the zoom ring 14 or the focus ring 15, the corresponding lens position moves along the optical axis. The image light generated by the light modulation unit 12 is projected onto the screen 300 through the lens 13. Any device other than the zoom ring 14 and the focus ring 15 may be used as long as it moves the lens position along the optical axis.
 The focus evaluation value calculation unit 142 acquires the image captured by the imaging unit 30 and analyzes it, thereby calculating, as a focus evaluation value, a characteristic value that changes according to the focus state of the image adjusted by the focus ring 15. The focus evaluation value is used by the focus assist function.
 The "focus assist function" is a function that superimposes focus state information, which assists the focus adjustment performed by manual operation of the focus ring 15, on at least a part of the image projected on the screen 300. Details of the focus state information are described later. In the second embodiment, the focus state information is superimposed during manual focus adjustment by the user. The image displayed together with the focus state information may be the video signal used as it is, or a dedicated focus pattern may be displayed.
 As the focus evaluation value, any characteristic value that changes with the focus state of the image projected on the screen can be used, such as the high-frequency component of the captured image, contrast information, or luminance information. To calculate the high-frequency component or the contrast from the captured image, any image analysis technique can be used, for example Fourier transform, multi-resolution analysis, or edge extraction. The focus evaluation value is calculated from the entire captured image, or per area by dividing the captured image into a plurality of areas. The focus evaluation value is preferably calculated at predetermined time intervals or every predetermined number of frames.
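 One way to obtain such a characteristic value is sketched below, assuming OpenCV is available: the variance of the Laplacian of the captured frame, a standard sharpness measure based on high-frequency content, computed for the whole frame or per divided area. The particular metric and the 3x3 grid are illustrative choices, not values prescribed by the patent.

```python
# Sketch of a possible focus evaluation value: Laplacian variance of the
# captured frame. Higher values mean more high-frequency energy, i.e. a
# sharper projected image.

import cv2
import numpy as np

def focus_evaluation_value(gray_frame: np.ndarray) -> float:
    """Sharpness of one (sub)image as the variance of its Laplacian."""
    lap = cv2.Laplacian(gray_frame, cv2.CV_64F)
    return float(lap.var())

def focus_evaluation_per_area(gray_frame: np.ndarray, rows: int = 3, cols: int = 3):
    """Evaluation value per divided area, as described in the text."""
    h, w = gray_frame.shape
    return [[focus_evaluation_value(gray_frame[r*h//rows:(r+1)*h//rows,
                                                c*w//cols:(c+1)*w//cols])
             for c in range(cols)] for r in range(rows)]
```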
 The zoom operation detection unit 144 acquires the image captured by the imaging unit 30 and analyzes it to determine whether the zoom ring 14 is being operated by the user. This can be achieved, for example, by extracting edge components in the image and detecting increases or decreases in those edge components.
 FIG. 8 is a diagram for explaining one example of the principle of detecting whether a zoom operation is in progress. In the example shown in FIG. 8, an image for zoom-operation detection is projected onto the screen 300 and then captured and analyzed in order to detect a zoom operation by the user. FIG. 8(a) shows an example of the relationship between the zoom-operation detection image projected onto the screen 300 and the imaging range of the imaging unit 30: a vertically striped pattern 500 is projected onto the screen 300, and the imaging unit 30 captures the imaging range 400.
 FIG. 8(b) shows another example of the relationship between the zoom-operation detection image projected onto the screen 300 and the imaging range of the imaging unit 30. Compared with FIG. 8(a), the pattern 500 in FIG. 8(b) is projected at a reduced size because of the user's zoom operation. The size of the screen 300 and the imaging range 400 of the imaging unit 30 are unchanged from FIG. 8(a).
 The zoom operation detection unit 144 detects whether a zoom operation has occurred by counting the number of vertical stripes projected on the screen 300 and monitoring changes in that count. In the example of FIG. 8(a), seven black vertical stripes are projected on the screen 300. Suppose the user operates the zoom ring 14 and shrinks the pattern 500 to the state of FIG. 8(b); the number of black vertical stripes projected on the screen 300 then changes from seven to nine. In this way, the zoom operation detection unit 144 can detect a zoom operation by counting the vertical stripes projected on the screen 300 and monitoring their variation. Counting the black vertical stripes can be realized, for example, with ordinary edge extraction and thresholding.
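 A minimal sketch of this stripe-counting idea follows, assuming a single horizontal scan line and a fixed brightness threshold; both choices are illustrative simplifications rather than values from the patent.

```python
# Count black vertical stripes along one scan line and flag a zoom
# operation when the count changes between frames.

import numpy as np

def count_vertical_stripes(gray_frame: np.ndarray, dark_threshold: int = 80) -> int:
    """Count dark-stripe runs along the middle row of the frame."""
    profile = gray_frame[gray_frame.shape[0] // 2, :]
    dark = profile < dark_threshold
    # A stripe starts wherever a dark run begins (False -> True transition).
    starts = np.count_nonzero(~dark[:-1] & dark[1:]) + int(dark[0])
    return int(starts)

def zoom_operation_detected(prev_count: int, curr_count: int) -> bool:
    """A change in the stripe count is taken as evidence of a zoom operation."""
    return curr_count != prev_count
```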
 The focus state information setting unit 150 sets the focus state information displayed on the screen 300 by the focus assist function described above. The focus state information setting unit 150 includes a focus evaluation value storage unit 152, a focus state determination unit 154, and a graph database 156. The configuration of the focus state information setting unit 150 is described below with reference to FIGS. 9, 10, 11, and 12.
 FIG. 9 shows an example of how the focus evaluation value varies during focus adjustment. The following example assumes that the better the focus, the larger the focus evaluation value.
 Assume that the projection display apparatus 1200 is powered on at time t0 and an image is displayed on the screen 300. In this state the image is normally out of focus, and the image captured by the imaging unit 30 is a blurred image with little high-frequency content. Between times t0 and t1 no focus adjustment is made, so the focus evaluation value does not change. The focus evaluation value storage unit 152 stores the value f0 obtained from the focus evaluation value calculation unit 142 as the initial value of the focus evaluation value. The focus evaluation value storage unit 152 also stores f0 as the provisional maximum value fm of the focus evaluation value. The "provisional maximum value fm" is the largest focus evaluation value calculated by the focus evaluation value calculation unit 142 so far.
 At time t1 the user starts adjusting the focus. Between times t1 and t2 the user adjusts in the wrong direction, that is, the direction in which the focus becomes worse, so the focus evaluation value f falls below the initial value f0. Hereafter, f without a subscript denotes the current value of the focus evaluation value. The focus evaluation value storage unit 152 stores the current value f each time it is calculated, separately from the initial value f0.
 At time t2 the user starts adjusting the focus in the correct direction. At time t3 the current focus evaluation value f has risen back to the initial value f0. After time t3 the focus evaluation value f keeps rising, so the focus evaluation value storage unit 152 updates the provisional maximum value fm with the value of f each time.
 As a result of the user continuing to adjust the focus in the correct direction, the focus evaluation value f reaches its peak value f1 at time t4 and then decreases. This is because continuing the adjustment after the image is already in focus makes it go out of focus again, blurring the image. At time t5 the user reverses the adjustment, and at time t6 the focus evaluation value f again reaches the peak value f1. The focus adjustment ends at time t6.
 The focus state determination unit 154 computes the average value fa of the focus evaluation values calculated so far and stores the result in the focus evaluation value storage unit 152. The focus state determination unit 154 also detects the peak value f1 of the focus evaluation value described above and stores it in the focus evaluation value storage unit 152 as the maximum value fM of the focus evaluation value. Specifically, the focus state determination unit 154 tracks the change in the focus evaluation value and takes the focus evaluation value f at the point where the change turns from increasing to decreasing as the maximum value fM. Since fM is the focus evaluation value in the in-focus state, it becomes the target value that the focus evaluation value should reach.
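 The bookkeeping just described (initial value f0, provisional maximum fm, average fa, and peak detection for fM) can be sketched as follows. The comparison of three consecutive samples without any smoothing is an assumed simplification.

```python
# Track f0, fm, fa and detect the peak fM (where the change turns from
# increasing to decreasing). Variable names follow the text.

class FocusEvaluationTracker:
    def __init__(self, f0: float):
        self.f0 = f0          # initial value
        self.fm = f0          # provisional maximum
        self.fM = None        # detected maximum (target value), None until found
        self.values = [f0]    # history, from which the average fa is computed

    @property
    def fa(self) -> float:
        return sum(self.values) / len(self.values)

    def update(self, f: float) -> None:
        prev = self.values[-1]
        self.values.append(f)
        if f > self.fm:
            self.fm = f
        # Change turned from increasing to decreasing: the previous sample
        # was a peak, so record it as the target value fM.
        if self.fM is None and len(self.values) >= 3:
            before = self.values[-3]
            if prev >= before and f < prev:
                self.fM = prev
```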
 The relationship in magnitude between the current focus evaluation value f and the target value of the focus evaluation value constitutes the focus state information described above. To express the focus state information as a graph whose color indicates the degree of focus, the focus state determination unit 154 determines how that graph is displayed.
 FIG. 10 shows an example of the graph representing the focus state information. FIG. 10(a) shows an example of the graph at the start of focus adjustment, corresponding to the period from t0 to t1 in FIG. 9. The graph is projected on top of the image projected on the screen. The graph is a rectangle whose long side runs horizontally with respect to the image; the focus evaluation value is associated with the scale of the graph, and the color of the graph and the size of the colored region increase or decrease as the focus evaluation value changes. In the example of FIG. 10(a), three quarters of the graph is colored. In FIG. 10, regions drawn with diagonal hatching represent yellow regions, and regions drawn with a diagonal grid represent red regions. Also, in FIG. 10 the scale divisions at the left end of the graph are wider than at the right end, so the colored region grows or shrinks in large steps there, whereas at the right end the colored region changes in fine steps. As the focus approaches the correct state, the user can therefore see the focus state in more detail, which is advantageous in terms of convenience.
 FIG. 10(b) shows an example of the graph during the period before the maximum value fM of the focus evaluation value, that is, the target value the focus evaluation value should reach, has been detected and while the focus evaluation value f is smaller than the provisional maximum value fm; this corresponds to the period from t1 to t2 in FIG. 9. The length of the yellow region in the graph changes according to the value of the focus evaluation value f. Although not shown, during the period before the maximum value fM (the target value) has been detected and while the focus evaluation value f keeps rising (the period from t3 to t4 in FIG. 9), the graph is the same as in the example of FIG. 10(a).
 FIG. 10(c) shows an example of the graph when the focus evaluation value f is falling in the period after the target value of the focus evaluation value f has been detected; this corresponds to the period from t4 to t5 in FIG. 9. The display color of the colored region occupying three quarters of the graph changes from yellow to red.
 FIG. 10(d) shows an example of the graph when the focus evaluation value f is rising in the period after the target value of the focus evaluation value f has been detected; this corresponds to the period from t5 to t6 in FIG. 9. Almost the entire graph becomes yellow. Although not shown, when the focus evaluation value f becomes equal to the maximum value fM, which is the target value, the entire graph turns blue to notify the user that the focus adjustment is complete. In this way, the focus state information serves as an aid when the user performs manual focus adjustment.
 FIG. 11 shows a table in which combinations of the focus evaluation value and the target value are associated with the display color of the focus state information and the amount displayed. The table shown in FIG. 11 is stored in the graph database 156, and the focus state determination unit 154 refers to this table to determine the focus state information to be displayed.
 Specifically, the focus state determination unit 154 acquires the focus evaluation values stored in the focus evaluation value storage unit 152 and computes their change. First, it compares the current value f of the focus evaluation value with the average value fa and determines whether the value is in the rising, steady, or falling state of the table shown in FIG. 11. Next, the focus state determination unit 154 determines whether the maximum value fM, the target value of the focus evaluation value, has been detected. If the maximum value fM has not been detected, the focus state determination unit 154 compares the provisional maximum value fm with the current value f, and the initial value f0 with the provisional maximum value fm.
 If the maximum value fM has been detected, the focus state determination unit 154 compares the maximum value fM with the current value f, and the initial value f0 with the maximum value fM. By examining these relationships, the focus state determination unit 154 can determine the focus state information to be displayed from the table stored in the graph database 156. Whether the maximum value fM has been detected can be determined by checking a flag (MaxFlag), held in a work memory (not shown), that indicates whether the maximum value fM has been detected. This flag is initialized to 0 and set to 1 when the focus state determination unit 154 detects the maximum value fM.
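 A heavily simplified sketch of this decision is shown below. The actual mapping lives in the FIG. 11 table, which is not reproduced in the text, and the full decision also consults the f0-versus-fm and f0-versus-fM relationships; the colors returned here are illustrative assumptions only.

```python
# Simplified color decision: MaxFlag corresponds to "fM has been detected".

from typing import Optional

def display_color(f: float, fa: float, fM: Optional[float], eps: float = 1e-3) -> str:
    max_flag = fM is not None            # MaxFlag
    if max_flag and abs(f - fM) < eps:
        return "blue"                    # target reached: adjustment complete
    if f < fa - eps:
        # Falling; once the target is known this means moving away from focus.
        return "red" if max_flag else "yellow"
    return "yellow"                      # rising or steady: keep guiding the user
```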
 FIG. 12 shows the state transitions of the display color of the focus state information. In FIG. 12, the states the display color of the focus state information can take are labeled with a combination of ST (for State) and a number. When the control unit 1100 starts and the focus evaluation value calculation unit 142 calculates the focus evaluation value f, the evaluation value initialization state ST10 is entered. In the evaluation value initialization state ST10, the first evaluation value calculated by the focus evaluation value calculation unit 142 is stored in the focus evaluation value storage unit 152 as the provisional maximum value fm and the initial value f0.
 When the provisional maximum value fm and the initial value f0 have been stored in the focus evaluation value storage unit 152, the state moves to the yellow display state ST12. From the yellow display state ST12, transitions are possible to the yellow display state ST12 itself, the red display state ST16, the blue display state ST14, the graph erase state ST20, and the zoom change detection state ST18.
 The circled numbers in FIG. 12 correspond to the circled numbers in the table of FIG. 11. For example, the transition from the yellow display state ST12 to the blue display state ST14 occurs in the following two cases: when the current focus evaluation value f is larger than the average value fa, the maximum value fM has been detected, and the current value f becomes equal to the maximum value fM; or when the current focus evaluation value f is equal to the average value fa, the maximum value fM has been detected, and the current value f becomes equal to the maximum value fM.
 When the user operates the zoom ring 14, the size of the image projected on the screen 300 changes, so the focus evaluation value varies. The provisional maximum value fm and the maximum value fM obtained up to that point therefore become meaningless. Accordingly, when the zoom operation detection unit 144 detects that the zoom ring 14 has been operated by the user, this triggers a transition from the yellow display state ST12, the blue display state ST14, or the red display state ST16 to the zoom change detection state ST18. As a result, operations such as the initialization of the focus evaluation value f are performed again. This is advantageous in that the reliability of the focus evaluation value is preserved.
 In the yellow display state ST12, the blue display state ST14, and the red display state ST16, if the user forcibly ends the focus adjustment, the state moves to the graph erase state ST20 and the focus adjustment ends.
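 The zoom-triggered reset and the forced-termination path can be sketched as a small state machine. Only the transitions explicitly described in the text are modeled; the return from ST18 to re-initialization is an assumption, and the color-to-color transitions governed by FIGS. 11 and 12 are omitted.

```python
# Minimal state machine for the reset/termination transitions around ST18 and ST20.

from enum import Enum, auto

class AssistState(Enum):
    ST10_INIT = auto()      # evaluation value initialization
    ST12_YELLOW = auto()
    ST14_BLUE = auto()
    ST16_RED = auto()
    ST18_ZOOM = auto()      # zoom change detected
    ST20_ERASED = auto()    # graph erased

def next_state(state: AssistState, zoom_detected: bool, user_abort: bool) -> AssistState:
    color_states = {AssistState.ST12_YELLOW, AssistState.ST14_BLUE, AssistState.ST16_RED}
    if state in color_states and zoom_detected:
        return AssistState.ST18_ZOOM       # fm and fM are discarded
    if state in color_states and user_abort:
        return AssistState.ST20_ERASED     # user forcibly ends focus adjustment
    if state == AssistState.ST18_ZOOM:
        return AssistState.ST10_INIT       # assumed: re-run initialization after the reset
    return state
```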
 FIG. 13 is a flowchart mainly showing the processing flow of the control unit 1100 according to the second embodiment. The processing in this flowchart starts, for example, when the projection display apparatus 1200 is powered on and the imaging unit 30 captures the image projected on the screen 300.
 The focus evaluation value calculation unit 142 acquires the image captured by the imaging unit 30 and analyzes it, calculates as a focus evaluation value a characteristic value that changes according to the focus state of the image adjusted by the focus ring 15, and stores it in the focus evaluation value storage unit 152 (S110). The focus state determination unit 154 acquires the focus evaluation values stored in the focus evaluation value storage unit 152 and computes their change (S112).
 The focus state determination unit 154 determines the focus state information to be displayed by referring to a table that associates the display color of the focus state information with combinations of the relationship between the current focus evaluation value f and the average value fa, the relationship between the maximum value fM or provisional maximum value fm and the current focus evaluation value f, and the relationship between the maximum value fM or provisional maximum value fm and the initial value f0 (S114). The video signal setting unit 82 acquires the focus state information from the focus state determination unit 154 as a graph colored to indicate the degree of focus and sends it to the projection unit 10 to update the graph displayed on the screen 300 (S116).
 If the current focus evaluation value f and the maximum value fM do not match and the focus adjustment has not finished (N in S118), the processing of steps S110 to S116 described above is repeated. When the current focus evaluation value f matches the maximum value fM and the focus adjustment is complete (Y in S118), the focus state determination unit 154 notifies the video signal setting unit 82 that the graph is to be erased, and the graph is erased (S120). When the graph has been erased in step S120, the processing in this flowchart ends.
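 The S110 to S120 loop can be sketched as follows. All helpers are passed in as parameters; capture_frame, evaluate, render_graph, and erase_graph are stand-ins for the corresponding hardware and display paths (not names defined by the patent), and the tracker and color decision correspond to the illustrative sketches given earlier.

```python
# Sketch of the main focus-assist loop (steps S110-S120).

def focus_assist_loop(capture_frame, evaluate, tracker, decide_color,
                      render_graph, erase_graph, eps: float = 1e-3):
    while True:
        f = evaluate(capture_frame())                 # S110: compute focus evaluation value
        tracker.update(f)                             # S112: compute change, track fm / fM
        color = decide_color(f, tracker.fa, tracker.fM)   # S114: consult the FIG. 11 table
        render_graph(color, f)                        # S116: update the graph on the screen
        if tracker.fM is not None and abs(f - tracker.fM) < eps:   # S118: target reached?
            erase_graph()                             # S120: erase the graph
            return
```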
 The operation of the above configuration is as follows. The user projects an image onto the screen 300 using the projection display apparatus 1200 according to the second embodiment and operates its focus ring 15 to adjust the focus. The focus state determination unit 154 analyzes the focus evaluation values calculated by the focus evaluation value calculation unit 142 and stored in the focus evaluation value storage unit 152, and the current focus state is projected onto the screen 300 as a colored graph. The user refers to that graph as an aid to focus adjustment.
 As described above, the projection display apparatus 1200 according to the second embodiment can provide an aid for the user when performing a manual focus adjustment operation. Since the current focus state is displayed as a colored graph, the user can grasp the focus state intuitively.
 The above description covers the case where the focus state determination unit tracks the change in the focus evaluation value and takes the focus evaluation value at the point where the change turns from increasing to decreasing as the target value the focus evaluation value should reach. Alternatively, the target value may be calculated by acquiring the input signal of the image projected on the screen 300 and analyzing it. This can be realized by having the focus evaluation value calculation unit 142 acquire the input signal from the image memory 84 via the video signal setting unit 82 and analyze the acquired input signal directly to calculate the target value.
 When the target value is derived by tracking the change in the focus evaluation value and taking the value at the point where the change turns from increasing to decreasing, the maximum cannot be fixed until that turning point occurs, so only a provisional maximum may be available. Detecting the turning point from increasing to decreasing as the maximum works, but it requires the focus to pass through the correct state and go out of focus once, so the focus adjustment may not be performed efficiently. These problems can be avoided by setting the target value of the focus evaluation value from an analysis of the input signal of the image projected on the screen 300, which is advantageous in that a more accurate target value can be set than when analyzing the image captured by the imaging unit 30.
 In the above description, the focus state determination unit 154 tracks the change in the focus evaluation value. The focus state determination unit 154 may also discard, from the focus evaluation value storage unit 152, a focus evaluation value whose calculated change exceeds a predetermined threshold. Here the predetermined threshold is a reference value for the amount of change used to judge the reliability of a focus evaluation value. For example, if a hand is held in front of the camera or a person walks in front of it during focus assist, a correct focus evaluation value cannot be calculated, the change in the focus evaluation value becomes large, and a malfunction may result. Such situations are therefore reproduced in advance by experiment and the predetermined threshold is determined accordingly. This is advantageous in that such malfunctions can be prevented.
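 The reliability check just described can be sketched as a simple outlier filter: a new focus evaluation value is discarded when its change from the previously accepted value exceeds a threshold. The threshold below is a placeholder; the patent specifies only that it is determined in advance by experiment.

```python
# Discard focus evaluation values whose change exceeds an experiment-derived limit.

MAX_PLAUSIBLE_CHANGE = 0.5   # relative change limit (placeholder value)

def accept_evaluation_value(prev_f: float, new_f: float,
                            max_change: float = MAX_PLAUSIBLE_CHANGE):
    """Return (accepted, value): the new value is dropped if it jumps too far,
    e.g. because a hand or a person briefly blocked the camera."""
    if abs(new_f - prev_f) > max_change * max(abs(prev_f), 1e-6):
        return False, prev_f     # discard the outlier, keep the previous value
    return True, new_f
```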
 In the above description, the graph representing the focus state information is a rectangle whose long side runs horizontally with respect to the image projected on the screen, but the shape of the graph is not limited to a rectangle. It may be, for example, a circle or a polygon such as a triangle. When the graph is a circle, it can be realized as a pie chart in which the focus evaluation value is associated with an angle. A polygon such as a triangle can likewise be realized by associating the focus evaluation value with the filled area of the graph.
 The above description covers the case where the zoom operation detection unit 144 determines whether a zoom operation is in progress by acquiring and analyzing the vertically striped pattern captured by the imaging unit 30, but various other images can be used for the analysis. These variations are described below.
 FIG. 14 shows examples of pattern images used for the image analysis. FIG. 14(a) shows a pattern image used mainly for focus adjustment. A checkered pattern such as that in FIG. 14(a) contains many high-frequency components because edge components appear periodically, which is convenient for calculating the focus evaluation value described above. The zoom operation detection unit 144 can also detect a zoom operation by analyzing this checkered pattern: for example, computing a profile in the horizontal direction of the pattern and extracting its edge components works in the same way as with the vertically striped pattern described above, and a profile in any direction can be used. This is advantageous in that the calculation of the focus evaluation value and the determination of whether a zoom operation is in progress can be performed at the same time.
 FIG. 14(b) shows an example of combining a pattern image used for focus adjustment with a pattern image used for zoom-operation detection. Specifically, it is an image in which a checkered pattern used for calculating the focus evaluation value is surrounded by a vertically striped pattern used for detecting the zoom operation. By spatially dividing the image used for calculating the focus evaluation value and the image used for detecting the zoom operation in this way, the calculation of the focus evaluation value and the determination of whether a zoom operation is in progress can be performed at the same time, which is advantageous.
 Alternatively, the checkered image used for calculating the focus evaluation value and the vertically striped image used for detecting the zoom operation may be projected alternately in a time-division manner. The switching period of the images must be slow enough not to give the user a sense of visual discomfort, yet fast enough not to hinder the calculation of the focus evaluation value and the detection of the zoom operation, so it should be determined experimentally; a period of one second is one example. Since the focus evaluation value calculation unit 142 and the zoom operation detection unit 144 can each analyze an image suited to its own operation, this is advantageous in that the accuracy of the focus evaluation value calculation and of the zoom-operation determination can be improved.
 The zoom operation detection unit 144 may also detect a zoom operation by acquiring and analyzing the graph representing the focus state information shown in FIG. 10. As shown in FIG. 10, the graph representing the focus state information has scale marks in the vertical direction. The zoom operation detection unit 144 can detect a zoom operation by measuring the spacing of these scale marks and monitoring its variation. This avoids projecting a dull image solely for zoom-operation detection and is advantageous in that it can reduce the boredom the user may feel during focus adjustment.
 When an image containing a human face is projected on the screen 300, the zoom operation detection unit 144 can detect a zoom operation by detecting the face and monitoring changes in its size. This can be realized using ordinary image recognition techniques such as pattern matching or color analysis. Alternatively, when characters are projected on the screen 300, the zoom operation detection unit 144 can detect a zoom operation by detecting the characters and monitoring changes in their size. This can be realized, for example, using general-purpose character recognition techniques.
Third embodiment.
 In autofocus adjustment using captured images, a commonly used approach is the contrast detection method: the contrast of images captured at each of a plurality of lens positions is calculated, and the lens position at which the contrast is greatest is taken as the in-focus position.
 In projector autofocus adjustment, the focus must be placed on the screen surface. However, when the screen does not directly face the projector, that is, when the screen is inclined, the distance between the projector and each position on the screen surface is not uniform. In that case the focus may be adjusted to a region of the screen surface that is close to the projector, and the image in a region far from the projector may be blurred. Furthermore, when the screen is strongly inclined, even the image in the central region of the screen surface may be blurred.
 An object of the third embodiment of the present invention is to provide a technique that, in autofocus adjustment using the contrast detection method, brings a suitable position on the screen surface into focus even when the screen is inclined.
 FIG. 15 shows the positional relationship between a projection display apparatus 2200 according to the third embodiment of the present invention and the screen 300. The projection display apparatus 2200 according to the third embodiment includes an imaging unit 30 for photographing in the direction of the screen 300. The imaging unit 30 is installed so that its optical axis is parallel to the optical axis of the projection light projected from the projection display apparatus 2200. In FIG. 15, the screen 300 does not directly face the projection display apparatus 2200, and its right side is tilted away from the apparatus.
 FIG. 16 shows an example of a captured image PuI captured by the imaging unit 30 in the positional relationship shown in FIG. 15. The captured image PuI contains a projected image PrI, projected from the projection display apparatus 2200, on which a test pattern is drawn. The captured image PuI also contains an image SI of the screen 300. Since the amount of light projected onto each position on the screen 300 is inversely proportional to the square of the distance from the light source, the luminance level gradually decreases from the left side to the right side of the image SI of the screen 300. The autofocus processing for bringing a suitable position on the surface of the screen 300 into focus is described below.
 FIG. 17 shows the configuration of the projection display apparatus 2200 according to the third embodiment. The projection display apparatus 2200 includes a projection unit 10, a lens driving unit 20, an imaging unit 30, and a control device 2100. The control device 2100 includes a screen position detection unit 240, a detection area setting unit 250, an autofocus adjustment unit 260, a video signal setting unit 82, an image memory 84, and a drive signal setting unit 86.
 In terms of hardware, the configuration of the control device 2100 can be realized with the CPU, memory, and other LSIs of any computer; in terms of software, it is realized by programs loaded into memory and the like, but here the functional blocks realized by their cooperation are depicted. Those skilled in the art will therefore understand that these functional blocks can be realized in various forms by hardware alone, software alone, or a combination of the two.
 The projection unit 10 projects an image onto the screen 300. The projection unit 10 includes a light source 11, a light modulation unit 12, and a focus lens 13. As the light source 11, a halogen lamp having a filament-type electrode structure, a metal halide lamp having an electrode structure that generates an arc discharge, a xenon short-arc lamp, a high-pressure mercury lamp, an LED lamp, or the like can be employed.
 The light modulation unit 12 modulates the light incident from the light source 11 according to the video signal set by the video signal setting unit 82. For example, a DMD (Digital Micromirror Device) can be employed as the light modulation unit 12. A DMD includes a plurality of micromirrors corresponding to the number of pixels and generates the desired image light by controlling the orientation of each micromirror according to the corresponding pixel signal.
 The focus lens 13 adjusts the focal position of the light incident from the light modulation unit 12. The lens position of the focus lens 13 is moved along the optical axis by the lens driving unit 20. The image light generated by the light modulation unit 12 is projected onto the screen 300 through the focus lens 13.
 The lens driving unit 20 moves the position of the focus lens 13 according to the drive signal set by the drive signal setting unit 86. A stepping motor, a voice coil motor (VCM), a piezoelectric element, or the like can be employed as the lens driving unit 20.
 The imaging unit 30 captures, as its main subjects, the screen 300 and the image projected onto the screen 300. The imaging unit 30 includes a solid-state image sensor 31 and a signal processing circuit 32. As the solid-state image sensor 31, a CMOS (Complementary Metal Oxide Semiconductor) image sensor, a CCD (Charge Coupled Devices) image sensor, or the like can be employed. The signal processing circuit 32 applies various kinds of signal processing, such as A/D conversion and conversion from RGB format to YUV format, to the signal output from the solid-state image sensor 31 and outputs the result to the control device 2100. In this embodiment, the output goes to the screen position detection unit 240 and the detection area setting unit 250.
 The screen position detection unit 240 detects the position of the screen within the image captured by the imaging unit 30. More specifically, the screen position detection unit 240 detects the positions of the four sides of the screen 300 (top, bottom, left, and right) in the captured image by extracting edges in that image. The detected screen position is set in the detection area setting unit 250. The screen position is specified, for example, by the vertex coordinates of the four corners.
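 One possible edge-based implementation of this corner detection is sketched below using OpenCV (OpenCV 4 call signatures). The Canny thresholds and the "largest convex 4-vertex contour" heuristic are illustrative assumptions; the patent only specifies edge extraction and output of the four corner coordinates.

```python
# Find the screen's four corners as the largest convex quadrilateral contour.

import cv2
import numpy as np

def detect_screen_corners(gray_frame: np.ndarray):
    """Return the four corner points of the detected screen, or None."""
    edges = cv2.Canny(gray_frame, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    best = None
    for cnt in contours:
        approx = cv2.approxPolyDP(cnt, 0.02 * cv2.arcLength(cnt, True), True)
        if len(approx) == 4 and cv2.isContourConvex(approx):
            if best is None or cv2.contourArea(approx) > cv2.contourArea(best):
                best = approx
    return None if best is None else best.reshape(4, 2)
```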
 The detection area setting unit 250 sets the central region of the screen area detected by the screen position detection unit 240 as the detection area.
 FIG. 18 shows how a detection area DA is set within the screen area SA in the captured image PuI according to the third embodiment. FIG. 19 is a diagram for explaining one example of how the detection area DA is set in the third embodiment. As shown in FIG. 18, the detection area setting unit 250 obtains the position of the screen area SA from the screen position detection unit 240 and sets its central region as the detection area DA. The position and size of the detection area DA are determined, for example, as follows.
 As shown in FIG. 19, the detection area setting unit 250 sets the largest rectangle IR inscribed in the screen area SA. It then divides the rectangle IR into three equal parts vertically and three equal parts horizontally. Of the nine rectangles formed in this way, the middle rectangle is set as the detection area DA.
 FIG. 19 depicts an example in which the screen 300 is tilted in the left-right direction so that horizontal keystone (trapezoidal) distortion occurs in the screen area SA, but the detection area DA can be set by the same algorithm when the screen 300 is tilted vertically and vertical keystone distortion occurs in the screen area SA, or when both horizontal and vertical keystone distortion occur. With this algorithm, the detection area DA can be set by a simple calculation regardless of the position and size of the screen area SA.
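 The following sketch shows the corresponding geometry: approximate the inscribed rectangle IR of the screen quadrilateral, split it into a 3x3 grid, and keep the central cell as DA. The inner-rectangle approximation (clipping to the innermost corner coordinates) is an assumption made for brevity; the patent simply calls for the largest rectangle inscribed in the screen area.

```python
# Compute the central detection area DA from the four screen corners.

def detection_area(corners):
    """corners: (top-left, top-right, bottom-right, bottom-left) as (x, y),
    with y increasing downward. Returns (x0, y0, x1, y1) of the central cell."""
    tl, tr, br, bl = corners
    left   = max(tl[0], bl[0])
    right  = min(tr[0], br[0])
    top    = max(tl[1], tr[1])
    bottom = min(bl[1], br[1])
    w, h = right - left, bottom - top
    # Middle cell of the 3x3 split of the (approximated) inscribed rectangle.
    return (left + w / 3, top + h / 3, left + 2 * w / 3, top + 2 * h / 3)
```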
 なお、スクリーン領域SAの中央領域に設定される検出領域DAのサイズは、上記アルゴリズムにより算出されるサイズに限定されるものではない。検出領域DAのサイズは、設計者が実験やシミュレーションにもとづき導き出した値に設定されてもよい。当該値は、実験やシミュレーションにおいて、後述する台形歪み検出用のテストパターンの検出精度および/または主観画質の評価値が最大となるときの値であってもよい。なお、実際に設定される検出領域DAのサイズは、スクリーン領域SAのサイズに応じて、正規化された値となる。 Note that the size of the detection area DA set in the center area of the screen area SA is not limited to the size calculated by the above algorithm. The size of the detection area DA may be set to a value derived by the designer based on experiments and simulations. The value may be a value when the detection accuracy of a trapezoidal distortion detection test pattern and / or the evaluation value of the subjective image quality, which will be described later, is maximized in experiments and simulations. Note that the size of the detection area DA that is actually set is a normalized value according to the size of the screen area SA.
 また、検出領域DAのサイズは、投写型映像表示装置2200に搭載されるカメラのスペック(たとえば、S/N比、解像度)に応じて、調整されてもよい。たとえば、S/N比が低い場合、ノイズの影響が大きくなるため、オートフォーカス調整精度を維持するには、検出領域DAのサイズを比較的大きくする必要がある。一方、S/N比が高い場合、ノイズの影響が小さいため、検出領域DAのサイズを小さくしても、オートフォーカス調整精度が維持される。同様に、解像度が低い場合も、検出領域DAのサイズを比較的大きくする必要がある。設計者は、スペックの異なるカメラごとに、上記実験やシミュレーションを行ってもよい。 Also, the size of the detection area DA may be adjusted according to the specifications (for example, S / N ratio, resolution) of the camera mounted on the projection display apparatus 2200. For example, when the S / N ratio is low, the influence of noise increases, so that the size of the detection area DA needs to be relatively large in order to maintain the autofocus adjustment accuracy. On the other hand, since the influence of noise is small when the S / N ratio is high, the autofocus adjustment accuracy is maintained even if the size of the detection area DA is reduced. Similarly, when the resolution is low, the size of the detection area DA needs to be relatively large. The designer may perform the experiment and simulation for each camera having different specifications.
 図18に戻る。オートフォーカス調整部260は、コントラスト検出法を用いて、フォーカスを合わせる。投写型映像表示装置2200の起動時や、ユーザ操作によりオートフォーカス調整が指示されたとき、映像信号設定部82は、画像メモリ84からオートフォーカス調整用のテストパターンを読み出し、投写部10に投写させる。当該テストパターンは、たとえば、ストライプパターンやチェッカーフラグパターンで形成される。撮像部30は、スクリーン300に投影されたテストパターンを撮像する。 Return to FIG. The autofocus adjustment unit 260 adjusts the focus using a contrast detection method. When the projection display apparatus 2200 is activated or when auto focus adjustment is instructed by a user operation, the video signal setting unit 82 reads out a test pattern for auto focus adjustment from the image memory 84 and causes the projection unit 10 to project the test pattern. . The test pattern is formed of, for example, a stripe pattern or a checker flag pattern. The imaging unit 30 images the test pattern projected on the screen 300.
 オートフォーカス調整部260は、複数のレンズ位置にて、撮像部30によりそれぞれ撮像された複数の画像内における、スクリーン位置検出部240により設定された検出領域の鮮明度をもとに、レンズの位置を決定する。以下、オートフォーカス調整部260の構成をより具体的に説明する。 The autofocus adjustment unit 260 is configured to determine the position of the lens based on the sharpness of the detection area set by the screen position detection unit 240 in a plurality of images respectively captured by the imaging unit 30 at a plurality of lens positions. To decide. Hereinafter, the configuration of the autofocus adjustment unit 260 will be described more specifically.
 オートフォーカス調整部260は、ハイパスフィルタ261、積算部262およびレンズ位置決定部263を含む。ハイパスフィルタ261は、上記検出領域内の画像信号の、所定の閾値を超える高周波成分を抽出して、その抽出した高周波成分を積算部262に供給する。ハイパスフィルタ261は、水平方向に高周波成分を抽出してもよいし、水平方向および垂直方向の両方向に高周波成分を抽出してもよい。 The autofocus adjustment unit 260 includes a high-pass filter 261, an integration unit 262, and a lens position determination unit 263. The high pass filter 261 extracts a high frequency component exceeding a predetermined threshold value of the image signal in the detection region, and supplies the extracted high frequency component to the integrating unit 262. The high pass filter 261 may extract high frequency components in the horizontal direction, or may extract high frequency components in both the horizontal direction and the vertical direction.
 The integration unit 262 integrates, for each lens position, the high-frequency components extracted by the high-pass filter 261 and supplies the result to the lens position determination unit 263. When the high-pass filter 261 extracts high-frequency components in both the horizontal and vertical directions, the integration unit 262 sums the two. The lens position determination unit 263 determines, as the in-focus position, the position of the focus lens 13 at which the largest of the integrated values supplied from the integration unit 262 was detected.
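 As a concrete illustration, the evaluation carried out by the high-pass filter 261, the integration unit 262, and the lens position determination unit 263 can be sketched roughly as follows. This is a minimal sketch assuming the captured frame and the detection area are available as NumPy arrays; the function names, the first-difference filter, and the threshold value are illustrative choices, not the patent's implementation.

```python
import numpy as np

def focus_evaluation(frame, detection_area, threshold=10.0):
    """Contrast-based focus score for one captured frame.

    frame: 2-D array of luminance values (captured image).
    detection_area: (top, left, height, width) of the detection area DA.
    threshold: only gradient magnitudes above this count as "high frequency".
    """
    top, left, h, w = detection_area
    roi = frame[top:top + h, left:left + w].astype(float)

    # Simple high-pass filtering: horizontal and vertical first differences.
    hp_horizontal = np.abs(np.diff(roi, axis=1))
    hp_vertical = np.abs(np.diff(roi, axis=0))

    # Keep only components exceeding the threshold, then integrate (sum) them.
    score = hp_horizontal[hp_horizontal > threshold].sum()
    score += hp_vertical[hp_vertical > threshold].sum()
    return score

def best_lens_position(frames_by_position, detection_area):
    """Pick the lens position whose frame has the largest integrated value."""
    return max(frames_by_position,
               key=lambda pos: focus_evaluation(frames_by_position[pos],
                                                detection_area))
```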
 FIG. 20 is a diagram for explaining the process of determining the in-focus position of the focus lens 13. When the autofocus function is enabled, the autofocus adjustment unit 260 instructs the video signal setting unit 82 to project the test pattern and sets, in the drive signal setting unit 86, a control signal for moving the focus lens 13 sequentially in predetermined steps from the near side to the far side or from the far side to the near side. The video signal setting unit 82 sets the video signal of the test pattern in the light modulation unit 12, and the drive signal setting unit 86 sets a drive signal corresponding to the control signal in the lens driving unit 20.
 The autofocus adjustment unit 260 calculates the sharpness of the test pattern captured at each position of the focus lens 13 (the integrated value described above can be used). This sharpness increases as the focus lens 13 approaches the in-focus position. When the sharpness passes its peak and turns to a decrease, the autofocus adjustment unit 260 determines the immediately preceding lens position as the in-focus position.
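 The peak-detection loop described above can be sketched as follows. Here move_lens_to, capture_frame, and score_fn are hypothetical callbacks standing in for the lens driving unit 20, the imaging unit 30, and the sharpness calculation (for example, the focus_evaluation() sketch above bound to the detection area); the loop is an illustrative reading of the procedure rather than code from the patent.

```python
def autofocus_hill_climb(lens_positions, move_lens_to, capture_frame, score_fn):
    """Step the focus lens and stop one step after the sharpness starts to fall.

    lens_positions: ordered positions from the near side to the far side.
    move_lens_to / capture_frame: hardware access callbacks (assumed).
    score_fn: sharpness measure applied to a captured frame.
    """
    previous_score = None
    previous_position = None
    for position in lens_positions:
        move_lens_to(position)
        score = score_fn(capture_frame())
        if previous_score is not None and score < previous_score:
            # Sharpness peaked one step earlier; treat that position as in focus.
            move_lens_to(previous_position)
            return previous_position
        previous_score, previous_position = score, position
    # Sharpness never turned down; the last position examined is the best found.
    return previous_position
```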
 Returning to FIG. 18, the image memory 84 holds the image data to be projected on the screen 300. The image data is supplied from a PC or the like via an external interface (not shown). In the present embodiment, the image memory 84 also holds the test pattern projected during autofocus adjustment. The video signal setting unit 82 sets, in the light modulation unit 12, a video signal based on the image data held in the image memory 84. The drive signal setting unit 86 sets, in the lens driving unit 20, a drive signal for moving the focus lens 13 to the lens position instructed by the autofocus adjustment unit 260.
 As described above, according to the third embodiment, setting the detection area in the central region of the screen makes it possible to focus on a suitable position on the screen surface, namely the central region of the screen, even when the screen is tilted. In addition, detecting the sharpness of each image using the image signal of the detection area rather than that of the entire captured image reduces the amount of computation, so the autofocus adjustment time can be shortened.
Fourth Embodiment.
 FIG. 21 is a diagram showing the configuration of a projection display apparatus 3200 according to a fourth embodiment of the present invention. The projection display apparatus 3200 according to the fourth embodiment is obtained by adding a side length measurement unit 245 to the projection display apparatus 2200 according to the third embodiment.
 The side length measurement unit 245 measures the lengths of two opposite sides of the screen area detected by the screen position detection unit 240 and can detect the tilt of the screen 300 from those lengths. More specifically, it measures the lengths of the left and right sides of the screen area to detect the horizontal tilt of the screen 300. The ratio of the lengths of the left and right sides may be calculated to estimate the degree of tilt; the larger the ratio, the larger the tilt. Similarly, the lengths of the upper and lower sides of the screen area are measured to detect the vertical tilt of the screen 300.
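 A minimal sketch of this side-length comparison, under the assumption that the two side lengths have already been measured in pixels from the detected screen area; the function name and return convention are illustrative.

```python
def side_tilt_ratio(side_a_length, side_b_length):
    """Tilt measure from two opposite sides of the detected screen area.

    Returns (ratio, shorter), where ratio >= 1.0 is the longer side divided by
    the shorter side (1.0 means no tilt in that direction) and shorter names
    which argument is shorter, i.e. the side farther from the light source.
    """
    if side_a_length <= 0 or side_b_length <= 0:
        raise ValueError("side lengths must be positive")
    ratio = max(side_a_length, side_b_length) / min(side_a_length, side_b_length)
    shorter = "a" if side_a_length < side_b_length else "b"  # arbitrary when equal
    return ratio, shorter

# Example: left side 480 px, right side 400 px in the captured image.
ratio, shorter = side_tilt_ratio(480, 400)   # ratio == 1.2, shorter == "b"
```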
 The detection area setting unit 250 moves the detection area set in the central region of the screen area by a predetermined distance in the direction of the shorter of the two sides measured by the side length measurement unit 245.
 FIG. 22 is a diagram showing how the detection area DA is set within the screen area SA in the captured image PuI according to the fourth embodiment. The detection area DA shown in FIG. 22 has been moved to the right relative to the detection area DA shown in FIG. 18. Because the right side of the screen area SA is shorter than its left side, the detection area setting unit 250 moves the detection area DA from the central region toward the right side. FIG. 22 depicts an example in which the upper and lower sides of the screen area SA have equal lengths, so the detection area DA is not moved vertically; if those lengths differ, it is also moved vertically.
 The distance by which the detection area DA is moved from the central region toward the shorter side may be set to a value derived by the designer through experiments or simulations. This value may be the value at which the detection accuracy of the keystone-distortion detection test pattern described later and/or the subjective image quality evaluation score is maximized in those experiments or simulations. As described above, the distance may also be adjusted according to the specifications of the camera mounted on the projection display apparatus 3200 (for example, its S/N ratio and resolution). The distance actually moved toward the shorter side is a value normalized according to the size of the screen area.
 The detection area setting unit 250 may also adaptively change the distance by which the detection area DA is moved according to the degree of tilt of the screen 300 detected by the side length measurement unit 245. For example, the detection area setting unit 250 increases the movement distance as the tilt increases.
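 A minimal sketch of such an adaptive shift along the horizontal direction. The base shift and gain constants are made-up illustrative values, since the patent leaves the concrete distance to experiments and simulations; the distances are expressed as fractions of the screen-area width.

```python
def shift_detection_area_x(center_x, screen_width, left_len, right_len,
                           base_shift=0.05, gain=0.10):
    """Shift the detection area DA horizontally toward the shorter side.

    base_shift and gain are normalized fractions of the screen-area width
    (illustrative values, not from the patent). The shift grows with the
    ratio of the longer side to the shorter side, i.e. with the tilt.
    """
    longer, shorter = max(left_len, right_len), min(left_len, right_len)
    if shorter <= 0 or left_len == right_len:
        return center_x                      # no measurable tilt, no shift
    ratio = longer / shorter
    shift = (base_shift + gain * (ratio - 1.0)) * screen_width
    # Move toward the side that appears shorter in the captured image.
    return center_x + shift if right_len < left_len else center_x - shift
```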
 As described above, according to the fourth embodiment, setting the detection area at a position slightly shifted from the central region of the screen in the direction farther from the light source makes it possible to focus on a suitable position on the screen surface even when the screen is tilted.
 In experiments by the present inventors, setting the detection area at a position slightly shifted from the central region of the screen in the direction farther from the light source yielded higher detection accuracy for the keystone-distortion detection test pattern described later than setting the detection area in the central region of the screen, while no significant difference arose in the subjective image quality evaluation score. Focusing on a position shifted farther from the light source than the central region of the screen is therefore an effective technique.
 The fourth embodiment describes setting the detection area at a position slightly shifted from the central region of the screen in the direction farther from the light source, but the detection area may be set at other positions. For example, as shown in FIG. 23, when each side of the screen area is divided into thirds so that the screen area is split into nine regions, the detection area DA may be moved to a region that includes one of the corners of the screen area SA. In FIG. 23, the detection area DA is moved to the corner where the shorter of each pair of opposite sides of the screen area SA meet. The detection area DA is moved toward the shorter sides because, if it were set closer to the longer sides, the luminance variation within the detection area would be larger than when it is set closer to the shorter sides, pulling the in-focus position of the focus lens 13 toward the longer sides; this must be avoided.
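 A minimal sketch of selecting the corner cell of the three-by-three grid from the measured side lengths; the tie-handling (falling back to the centre row or column) is an illustrative assumption.

```python
def corner_cell_for_detection_area(left_len, right_len, top_len, bottom_len):
    """Pick the 3x3 grid cell containing the corner where the shorter sides meet.

    Returns (row, col) with row 0 = top and col 0 = left. When a pair of
    opposite sides is equal, the centre row or column is used, i.e. no shift
    in that direction.
    """
    if right_len < left_len:
        col = 2            # right side is shorter -> right column
    elif left_len < right_len:
        col = 0            # left side is shorter -> left column
    else:
        col = 1
    if bottom_len < top_len:
        row = 2            # bottom side is shorter -> bottom row
    elif top_len < bottom_len:
        row = 0
    else:
        row = 1
    return row, col
```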
 The destination of the detection area may also be set to an optimal position based on, for example, the position of the projection display apparatus and the distance between the projection display apparatus and the screen.
 In the third and fourth embodiments, the autofocus adjustment unit 260 may cause the projection unit 10 to project the test pattern for autofocus adjustment into the central detection area or the detection area after movement. In this case, focus information for assisting the user's focus adjustment may be projected by the projection unit 10 into the screen area outside the detection area. The focus information includes, for example, focus adjustment bars and buttons, focus evaluation values, and text information. The user can then fine-tune the focus by operating the bars and buttons with a mouse or a laser pointer while referring to the focus evaluation value and the like.
Fifth Embodiment.
 FIG. 24 is a diagram showing the configuration of a projection display apparatus 4200 according to a fifth embodiment of the present invention. The projection display apparatus 4200 according to the fifth embodiment is obtained by adding a trapezoidal (keystone) distortion correction unit 270 to the projection display apparatus 2200 according to the third embodiment.
 The trapezoidal distortion correction unit 270 detects distortion in the shape of the keystone-distortion detection test pattern by detecting edges in the image captured by the imaging unit 30, and corrects the video signal so that the distortion is cancelled. Trapezoidal distortion occurs when the screen 300 and the projection display apparatus 4200 do not face each other squarely. For example, when the optical axis of the projection light is shifted upward, a trapezoidal distortion in which the upper portion bulges appears on the screen 300.
 When the projection display apparatus 4200 starts up, or when adjustment of the trapezoidal distortion is instructed by a user operation, the video signal setting unit 82 reads a test pattern for trapezoidal distortion adjustment from the image memory 84 and causes the projection unit 10 to project it. The test pattern is formed of, for example, a quadrilateral (a square, rectangle, parallelogram, rhombus, or the like). The imaging unit 30 captures the test pattern projected on the screen 300. The trapezoidal distortion correction unit 270 detects the trapezoidal distortion based on the shape of the test pattern in the captured image and corrects the video signal to be set in the video signal setting unit 82 so that the distortion is cancelled. While the trapezoidal distortion correction function of the trapezoidal distortion correction unit 270 is enabled, the video signal setting unit 82 sets, in the light modulation unit 12, the distortion-corrected video signal supplied from the trapezoidal distortion correction unit 270.
 At start-up of the projection display apparatus 4200, the trapezoidal distortion correction unit 270 executes the trapezoidal distortion detection process described above after the focus adjustment by the autofocus adjustment unit 260 has finished. The configuration of the trapezoidal distortion correction unit 270 is described more specifically below.
 The trapezoidal distortion correction unit 270 includes an edge extraction unit 271, a distortion detection unit 272, and a video signal correction unit 273. The edge extraction unit 271 detects edges in the captured image containing the test pattern. In doing so, the edge extraction unit 271 extracts edges within the screen area set by the screen position detection unit 240. An edge is extracted at a location where the luminance level changes by more than a predetermined threshold.
 The distortion detection unit 272 determines, from the vertex coordinates of the test pattern identified by the edge extraction, in which direction (up, down, left, or right) and to what extent the pattern bulges. The video signal correction unit 273 corrects the video signal to be set in the video signal setting unit 82 so that the bulge of the test pattern projected on the screen 300 is cancelled.
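 A minimal sketch of the edge-based outline extraction performed by the edge extraction unit 271 and the vertex identification used by the distortion detection unit 272, assuming the captured frame is available as a NumPy luminance array and a boolean mask of the detected screen area is given. The gradient threshold and the extreme-point corner estimate are illustrative simplifications, not the patent's implementation.

```python
import numpy as np

def test_pattern_corners(frame, screen_mask, threshold=30.0):
    """Roughly locate the four corners of the projected quadrilateral.

    frame: 2-D luminance array of the captured image.
    screen_mask: boolean array, True inside the detected screen area SA.
    Edges are taken where the luminance gradient magnitude exceeds threshold,
    and the corners are estimated from extreme edge points.
    """
    gy, gx = np.gradient(frame.astype(float))
    edges = (np.hypot(gx, gy) > threshold) & screen_mask
    ys, xs = np.nonzero(edges)
    if len(xs) == 0:
        return None  # pattern could not be extracted; correction would degrade
    pts = np.stack([xs, ys], axis=1).astype(float)
    # Corners approximated as the edge points extreme in (x + y) and (x - y).
    s, d = pts.sum(axis=1), pts[:, 0] - pts[:, 1]
    return {"top_left": pts[s.argmin()], "bottom_right": pts[s.argmax()],
            "top_right": pts[d.argmax()], "bottom_left": pts[d.argmin()]}
```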
 FIG. 25 is a diagram for explaining the trapezoidal distortion correction. FIG. 25(a) shows vertical keystone correction, and FIG. 25(b) shows horizontal keystone correction. In FIG. 25(a), the upper part of the projected image PrI bulges. The video signal correction unit 273 gradually scales down the horizontal width of the video from the upper end toward the lower end of the image to be projected so that a quadrilateral with its aspect ratio maintained is projected onto the screen 300. In FIG. 25(b), the left part of the projected image PrI bulges. The video signal correction unit 273 gradually scales down the vertical height of the video from the left end toward the right end of the image to be projected so that a quadrilateral with its aspect ratio maintained is projected onto the screen 300.
 FIG. 26 is a diagram showing a keystone-distortion detection test pattern TP appearing within the screen area SA in the captured image PuI according to the fifth embodiment. In this captured image PuI, the focus is on the left part of the screen area SA. That is, the figure shows a case in which autofocus adjustment was performed using the image signal of the entire captured image or of the entire screen area SA, without setting the detection area described in the third and fourth embodiments. In that case, the focus is placed on the left part of the screen area SA, which is closer to the light source than the central region of the screen area SA.
 When the keystone-distortion detection test pattern TP is then projected onto the screen 300 and captured by the imaging unit 30, the left part of the test pattern TP appears sharp in the captured image PuI, as shown in FIG. 26, but its right part is out of focus (indicated by the dotted line in FIG. 26). In this case, the edge extraction unit 271 may be unable to extract part of the shape of the test pattern TP from the captured image PuI, and if it cannot, the accuracy of the trapezoidal distortion correction decreases.
 In contrast, when the detection area described in the third and fourth embodiments is set and autofocus adjustment is performed using the image signal of that detection area, the focus is placed on the central region of the screen area SA or on a position slightly shifted to the right of it. When the keystone-distortion detection test pattern is subsequently projected onto the screen 300 and captured by the imaging unit 30, the entire test pattern appears in the captured image without being out of focus. The edge extraction unit 271 can therefore extract the shape of the entire test pattern from the captured image, and the accuracy of the trapezoidal distortion correction improves.
 As described above, according to the fifth embodiment, executing the keystone distortion detection after the autofocus adjustment described in the third and fourth embodiments improves the accuracy of the trapezoidal distortion correction. If, instead, the keystone distortion detection were executed before autofocus adjustment, or after autofocus adjustment using the image signal of the entire captured image or the entire screen area, the accuracy of the trapezoidal distortion correction could decrease, because the shape of the entire test pattern might not be extractable from the captured image.
 The present invention has been described above based on the embodiments. These embodiments are merely examples; it will be understood by those skilled in the art that various modifications are possible in the combinations of their constituent elements and processing steps, and that such modifications also fall within the scope of the present invention.
 In the fourth embodiment, the detection area setting unit 250 may adaptively change the size of the detection area according to the degree of tilt of the screen 300 detected by the side length measurement unit 245. For example, the detection area setting unit 250 reduces the size of the detection area as the tilt increases. When the tilt is large, the focus within the detection area is pulled away from its center toward the side closer to the light source. The smaller the detection area, therefore, the closer the focus can be brought to the center of the entire screen area.
 In the fifth embodiment, a shape correction unit may be provided instead of the trapezoidal distortion correction unit 270. The shape correction unit performs shape correction such as the well-known four-point correction (fitting correction). At start-up of the projection display apparatus, for example, the video signal setting unit 82 causes the projection unit 10 to project a test pattern for four-point correction; the test pattern is, for example, a white rectangle. The imaging unit 30 captures the test pattern projected on the screen 300. After the focus adjustment by the autofocus adjustment unit 260 has finished, the shape correction unit corrects the video signal to be set in the video signal setting unit 82 so that the four vertices of the test pattern in the captured image coincide with the four vertices of the screen. The video signal setting unit 82 sets the corrected video signal supplied from the shape correction unit in the light modulation unit 12.
 FIG. 27 is a diagram for explaining the four-point correction. FIG. 27 is a captured image showing a projected image (for example, a white test pattern) PrI projected off the screen 300. As indicated by the arrows in the figure, the shape of the video is adjusted so that the four vertices of the projected image PrI coincide with the corresponding four vertices of the screen 300.
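 Four-point correction amounts to warping the source frame with the projective transform (homography) that maps the four vertices of the projected pattern onto the four detected screen vertices. A minimal sketch using the standard direct linear transform; the coordinate values in the usage example are made up, and the patent does not prescribe this particular construction.

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve for the 3x3 homography H with dst ~ H @ src (DLT method).

    src, dst: four (x, y) point pairs, e.g. the four vertices of the projected
    test pattern PrI and the four detected screen vertices.
    """
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    a = np.asarray(rows, dtype=float)
    # The null space of A (via SVD) gives the homography up to scale.
    _, _, vt = np.linalg.svd(a)
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]

# Example: map the four pattern corners onto the four screen corners.
pattern = [(40, 30), (600, 50), (620, 470), (30, 450)]
screen = [(0, 0), (640, 0), (640, 480), (0, 480)]
H = homography_from_points(pattern, screen)
```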
 The shape correction unit may also select and execute either the trapezoidal distortion correction or the four-point correction based on features of the shape of the projected image PrI.
 DESCRIPTION OF REFERENCE NUMERALS: 10 projection unit, 11 light source, 12 light modulation unit, 13 focus lens, 20 lens driving unit, 30 imaging unit, 40 focus evaluation unit, 42 focus evaluation value calculation unit, 44 evaluation value change detection unit, 50 video signal evaluation unit, 52 input signal evaluation value calculation unit, 54 input signal evaluation value determination unit, 60 focus target calculation unit, 70 focus adjustment assist unit, 82 video signal setting unit, 100, 1100, 2100, 3100, 4100 control unit, 144 zoom operation detection unit, 150 focus state information setting unit, 152 focus evaluation value storage unit, 154 focus state determination unit, 156 graph database, 200, 1200, 2200, 3200, 4200 projection display apparatus, 240 screen position detection unit, 245 side length measurement unit, 250 detection area setting unit, 260 autofocus adjustment unit, 261 high-pass filter, 262 integration unit, 263 lens position determination unit, 270 trapezoidal distortion correction unit, 271 edge extraction unit, 272 distortion detection unit, 273 video signal correction unit, 300 screen.

Claims (15)

  1.  A control device mounted in a projection display apparatus that comprises a projection unit that projects video onto a screen via a lens, an imaging unit for imaging the screen, and a manually operated focus adjustment unit provided in the projection unit, the control device comprising:
     a focus evaluation value calculation unit that acquires a captured image captured by the imaging unit and calculates, as a focus evaluation value, a characteristic value that changes according to the focus state of the video projected on the screen;
     an evaluation value change detection unit that determines that the focus adjustment unit is being operated by a user when the focus evaluation value is changing; and
     a focus adjustment assist unit that, when it is determined that the focus adjustment unit is being operated, causes focus in-focus information for assisting focus adjustment to be projected and displayed together with the video.
  2.  The control device according to claim 1, wherein the evaluation value calculation unit divides the captured image into a plurality of regions and calculates the focus evaluation value for each region, and
     the evaluation value change detection unit determines that the focus adjustment unit is being operated by the user according to the number of regions in which the focus evaluation value is changing.
  3.  The control device according to claim 1 or 2, wherein the evaluation value change detection unit determines that the user's operation of the focus adjustment unit has ended when the focus evaluation value stops changing, and
     the focus adjustment assist unit ends the projection display of the focus in-focus information.
  4.  The control device according to any one of claims 1 to 3, further comprising:
     an input signal evaluation value calculation unit that acquires the video signal projected onto the screen and calculates, as an input signal evaluation value, a characteristic value that changes according to the presence or absence of motion in the video; and
     an input signal evaluation value determination unit that determines that the video is a still image when the input signal evaluation value does not change over a predetermined period,
     wherein the focus adjustment assist unit causes the focus in-focus information to be projected and displayed when the video is determined to be a still image.
  5.  The control device according to claim 1, further comprising:
     a zoom operation detection unit that determines whether a zoom adjustment unit is being operated by the user by acquiring and analyzing a captured image captured by the imaging unit; and
     a focus state determination unit that derives focus state information by tracking changes in the focus evaluation value acquired from the focus evaluation value calculation unit,
     wherein the focus state determination unit restarts the tracking of changes in the focus evaluation value when triggered by a determination by the zoom operation detection unit that the zoom adjustment unit is being operated.
  6.  The control device according to claim 5, further comprising a video signal setting unit that sets the focus state information as video to be projected and displayed together with the video,
     wherein the zoom operation detection unit determines that the zoom adjustment unit is being operated by the user by acquiring and analyzing an image of the focus state information projected on the screen.
  7.  The control device according to claim 5, further comprising a video signal setting unit that projects and displays, as the video, an image in which an image used for calculating the focus evaluation value is surrounded by an image used for determining the zoom operation.
  8.  The control device according to claim 5, further comprising a video signal setting unit that alternately projects and displays, as the video and in a time-division manner, video used for calculating the focus evaluation value and video used for determining the zoom operation.
  9.  A control device to be mounted in a projection display apparatus comprising a projection unit that projects video onto a screen via a lens and an imaging unit for imaging the screen, the control device comprising:
     a screen position detection unit that detects the position of the screen appearing in an image captured by the imaging unit;
     a detection area setting unit that sets, as a detection area, a central region within the screen area detected by the screen position detection unit; and
     a focus adjustment unit that determines the position of the lens based on the sharpness of the detection area in a plurality of images respectively captured by the imaging unit at a plurality of lens positions.
  10.  The control device according to claim 9, wherein the detection area setting unit moves the set detection area within the screen area.
  11.  The control device according to claim 10, further comprising a side length measurement unit that measures the lengths of two opposite sides of the screen area detected by the screen position detection unit,
     wherein the detection area setting unit moves the detection area by a predetermined distance in the direction of the shorter of the two sides measured by the side length measurement unit.
  12.  The control device according to any one of claims 9 to 11, wherein the focus adjustment unit projects, via the projection unit, a test pattern for focus adjustment into the detection area.
  13.  The control device according to claim 12, wherein the focus adjustment unit projects, via the projection unit, focus information for assisting focus adjustment by the user into the screen area outside the detection area.
  14.  The control device according to any one of claims 9 to 13, further comprising a shape correction unit that detects distortion in the shape of a test pattern by detecting edges in the image captured by the imaging unit and corrects the video signal so that the distortion is cancelled,
     wherein the shape correction unit executes the shape distortion detection process after the focus adjustment by the focus adjustment unit has finished.
  15.  A projection display apparatus comprising:
     a projection unit that projects video onto a screen via a lens;
     an imaging unit for imaging the screen; and
     the control device according to any one of claims 1 to 14.
PCT/JP2011/052913 2010-02-19 2011-02-10 Control device and projection-type image display device WO2011102299A1 (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
JP2010034578 2010-02-19
JP2010-034578 2010-02-19
JP2010082783 2010-03-31
JP2010-082783 2010-03-31
JP2010-122787 2010-05-28
JP2010122787 2010-05-28
JP2011-016140 2011-01-28
JP2011016140A JP2012008522A (en) 2010-02-19 2011-01-28 Control device and projection type image display device

Publications (1)

Publication Number Publication Date
WO2011102299A1 true WO2011102299A1 (en) 2011-08-25

Family

ID=44482886

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/052913 WO2011102299A1 (en) 2010-02-19 2011-02-10 Control device and projection-type image display device

Country Status (2)

Country Link
JP (1) JP2012008522A (en)
WO (1) WO2011102299A1 (en)


Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5849474B2 (en) * 2011-07-06 2016-01-27 株式会社リコー Imaging device
JP6056159B2 (en) * 2012-03-08 2017-01-11 セイコーエプソン株式会社 Projector and method of processing image in projector
JP2014029357A (en) * 2012-07-31 2014-02-13 Mega Chips Corp Imaging device, imaging adjustment method and program
JP5679016B2 (en) * 2013-08-05 2015-03-04 セイコーエプソン株式会社 Projector and projector control method
JP6324030B2 (en) * 2013-11-15 2018-05-16 キヤノン株式会社 Projection type image display apparatus and control method thereof
JP6503756B2 (en) * 2014-02-14 2019-04-24 セイコーエプソン株式会社 Projector and projector control method
JP6387125B2 (en) * 2017-01-23 2018-09-05 株式会社メガチップス Imaging apparatus, imaging adjustment method, and program
JP7012142B2 (en) 2018-03-01 2022-01-27 富士フイルム株式会社 Focus adjustment operation detection device, focus adjustment operation detection method and focus adjustment operation detection program, and image pickup device main body and image pickup device
CN111010556B (en) 2019-12-27 2022-02-11 成都极米科技股份有限公司 Method and device for projection bi-directional defocus compensation and readable storage medium
WO2022163207A1 (en) * 2021-01-29 2022-08-04 富士フイルム株式会社 Focusing assitance device, focusing assitance method, and focusing assitance program

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0451227A (en) * 1990-06-19 1992-02-19 Sharp Corp Focusing pattern projecting system for liquid crystal projector
JPH08292496A (en) * 1995-04-24 1996-11-05 Sony Corp Method and device for automatic focus and zoom adjustment of projector
JPH10111532A (en) * 1996-10-03 1998-04-28 Seiko Epson Corp Projection display device
JP2001083600A (en) * 1999-09-10 2001-03-30 Toshiba Corp Projection video display device
JP2001183740A (en) * 1999-12-27 2001-07-06 Nikon Corp Picture input/output device and projection display device
JP2003015177A (en) * 2001-06-29 2003-01-15 Hitachi Ltd Image pickup device
JP2004334117A (en) * 2003-05-12 2004-11-25 Sanyo Electric Co Ltd Liquid crystal projector
JP2005024741A (en) * 2003-06-30 2005-01-27 Casio Comput Co Ltd Projector and projection method
JP2005114901A (en) * 2003-10-06 2005-04-28 Matsushita Electric Ind Co Ltd Projection type video display device
JP2006010791A (en) * 2004-06-23 2006-01-12 Seiko Epson Corp Automatic focusing for projector
JP2006091109A (en) * 2004-09-21 2006-04-06 Nikon Corp Projector device, cellular phone and camera
JP2008145465A (en) * 2006-12-06 2008-06-26 Sigma Corp Method of adjusting depth of field and user interface for photographing apparatus
JP2008233462A (en) * 2007-03-20 2008-10-02 Seiko Epson Corp Projector and projection method for projector
JP2009163220A (en) * 2007-12-14 2009-07-23 Canon Inc Image pickup apparatus


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017085802A1 (en) * 2015-11-18 2017-05-26 日立マクセル株式会社 Image projection device
CN108292084A (en) * 2015-11-18 2018-07-17 麦克赛尔株式会社 Image projection device
US10409031B2 (en) 2015-11-18 2019-09-10 Maxell, Ltd. Image projection apparatus
CN113542733A (en) * 2021-06-25 2021-10-22 苏州智瞳道和显示技术有限公司 Method and system for adjusting and measuring definition of optical machine

Also Published As

Publication number Publication date
JP2012008522A (en) 2012-01-12

Similar Documents

Publication Publication Date Title
WO2011102299A1 (en) Control device and projection-type image display device
JP3761563B2 (en) Projection image automatic adjustment method for projector system and projector
JP4232042B2 (en) Projection control system, projector, program, information storage medium, and projection control method
JP5493340B2 (en) Projection display apparatus and arrangement relation detection method
JP6343910B2 (en) Projector and projector control method
JP5796286B2 (en) Projector and projector control method
JP2010130225A (en) Projection display device and adjustment method for projection
JP6503756B2 (en) Projector and projector control method
US8434879B2 (en) Control device and projection-type video-image display device
JP6330292B2 (en) Projector and projector control method
US20110279738A1 (en) Control device and projection video display device
JP2011176629A (en) Controller and projection type video display device
JP5556150B2 (en) Projector and control method thereof
CN112668569A (en) Projection type image display device
JP5461452B2 (en) Control device and projection display device
JP2010146328A (en) Projector, and method and program for controlling the same
JP4689948B2 (en) projector
JP2009253575A (en) Projector, program, and storage medium
JP2011199717A (en) Projection type display device and image display method
WO2011142197A1 (en) Control device and projection video display device
JP2008242087A (en) Installation position adjustment system for front projection type projector
JP6596872B2 (en) Projector and control method
JP5845566B2 (en) Projector and projector control method
JP5845565B2 (en) Projector and projector control method
JP5233613B2 (en) Projection display apparatus and arrangement relation detection method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11744586

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11744586

Country of ref document: EP

Kind code of ref document: A1