WO2006046444A1 - Image interpolation device and display device - Google Patents
Image interpolation device and display device (画像補間装置、および表示装置)
- Publication number
- WO2006046444A1 (PCT/JP2005/019182; JP2005019182W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pixel
- interpolation
- image
- display
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4007—Scaling of whole images or parts thereof, e.g. expanding or contracting based on interpolation, e.g. bilinear interpolation
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
- G09G3/34—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
- G09G3/36—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source using liquid crystals
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/003—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/39—Control of the bit-mapped memory
- G09G5/391—Resolution modifying circuits, e.g. variable screen formats
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
- H04N1/393—Enlarging or reducing
- H04N1/3935—Enlarging or reducing with modification of image resolution, i.e. determining the values of picture elements at new relative positions
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2300/00—Aspects of the constitution of display devices
- G09G2300/04—Structural and physical details of display devices
- G09G2300/0439—Pixel structures
- G09G2300/0452—Details of colour pixel setup, e.g. pixel composed of a red, a blue and two green components
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0407—Resolution change, inclusive of the use of different resolutions for different screen areas
- G09G2340/0421—Horizontal resolution change
Definitions
- Image interpolation device and display device
- The present invention relates to a technique for interpolating the pixel value of a pixel using the pixel values of pixels located in the vicinity of that pixel.
- JP-A-6-186526 and JP-A-2000-137443 disclose a display device that can display two screens simultaneously on a single liquid crystal display (LCD). By using this display device, for example, different screens can be displayed for people in the driver's seat and the passenger seat.
- Japanese Patent Application Laid-Open No. 11-331876 and Japanese Patent Application Laid-Open No. 9-46622 disclose a display device that can display two types of images on the same screen at the same time.
- Japanese Unexamined Patent Application Publication No. 2004-104368 discloses a technique for solving this problem.
- In this conventional technique, interpolation data is created by calculating an average or a weighted average of a plurality of pixel data around the position where pixel data is to be interpolated.
- Because the interpolation data is created uniformly by averaging or weighted averaging, even a pixel that characterizes the image has its brightness level smoothed with that of the surrounding pixels, so that the pixels characterizing the image are rendered indistinctly.
- For this reason, there has been a problem that the image quality deteriorates remarkably with the resolution conversion of the image data.
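- To see why uniform averaging blurs a characteristic pixel, consider a small numerical example (the values and weights below are assumptions chosen only for illustration): a bright text pixel sitting between dark background pixels is pulled toward the background.

```python
# Assumed 8-bit values: a bright character pixel between dark background pixels.
pixels = [30, 220, 30]                # the middle pixel characterizes the image
weights = [0.25, 0.5, 0.25]           # an illustrative weighted average
interpolated = sum(p * w for p, w in zip(pixels, weights))
print(interpolated)                   # 125.0 -> the character pixel is washed out
```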
- The present invention has been made to solve at least the above-described problems of the prior art, and an object thereof is to provide an image interpolation device and a display device capable of suppressing image quality degradation accompanying resolution conversion of image data.
- In the image interpolation device and the display device according to the present invention, the feature amount of a pixel located in the interpolation target region is calculated with respect to that pixel and its neighboring pixels, and the pixel value of the interpolation pixel is determined based on whether or not the pixel is characteristic in the image.
- As a result, the image interpolation device and the display device according to the present invention suppress deterioration in image quality accompanying resolution conversion of the image data and maintain the characteristics of the original image.
- FIG. 1 is a conceptual diagram of a display device according to an embodiment of the present invention.
- FIG. 2 is a perspective view showing that the display device shown in FIG. 1 is mounted on a car.
- FIG. 3 is a cross-sectional view of the display unit shown in FIG.
- FIG. 4 is a schematic view of the structure of the display panel as viewed from the front.
- FIG. 5 is a circuit diagram showing an outline of a TFT substrate.
- FIG. 6 is a block diagram of the display device shown in FIG.
- FIG. 7 is a block diagram of the image output unit 211 shown in FIG.
- FIG. 8 is a block diagram of control unit 200 shown in FIG.
- FIG. 9 is a block diagram of memory 218 shown in FIG.
- FIG. 10 is a block diagram of the configuration of the image interpolation apparatus according to the first embodiment.
- FIG. 11A is an explanatory diagram for explaining a two-screen display mode.
- FIG. 11B is an explanatory diagram for explaining the two-view display mode.
- FIG. 12 is an explanatory diagram for explaining calculation of a feature amount.
- FIG. 13 is a conceptual diagram for explaining the processing contents of the feature amount calculation unit and the image interpolation processing unit.
- FIG. 14 is a conceptual diagram for explaining processing contents of a resolution conversion processing unit and a display control processing unit.
- FIG. 15A is an explanatory diagram for explaining a specific example of the image interpolation processing (part 1).
- FIG. 15B is an explanatory diagram for explaining a specific example of the image interpolation processing (part 2).
- FIG. 16 is a flowchart of image interpolation processing.
- FIG. 17 is an explanatory diagram for explaining a modification of the display device shown in FIG.
- FIG. 18 is an explanatory diagram for explaining a modification of the display device shown in FIG. 1.
Explanation of symbols
- VRAM Video RAM
- FIG. 1 is a conceptual diagram of a display device according to the present invention.
- 1 is the first image source
- 2 is the second image source
- 3 is the first image data from the first image source
- 4 is the second image data from the second image source.
- 5 is a display control unit
- 6 is display data
- 7 is a display unit (such as a liquid crystal panel)
- 8 is a first display image based on the first image source 1
- 9 is a second display image based on the second image source 2
- 10 is an observer (user) located on the left side with respect to the display unit 7
- 11 is an observer (user) located on the right side with respect to the display unit 7.
- FIG. 1 shows that, according to the relative positions of the observers 10 and 11 with respect to the display unit 7, in other words according to the viewing angle with respect to the display unit 7, the observer 10 can see the first display image 8 and the observer 11 can see the second display image 9 substantially simultaneously, and that each display image 8, 9 can be viewed over the entire display surface of the display unit 7.
- The first image source 1 is, for example, a movie image from a DVD player or a received image from a television receiver.
- The second image source 2 is, for example, a map or route guidance image from a car navigation device. The first image data 3 and the second image data 4 are supplied to the display control unit 5 and processed so that they can be displayed on the display unit 7 substantially simultaneously.
- the display unit 7 supplied with the display data 6 from the display control unit 5 is configured by a liquid crystal panel or the like having a parallax barrier described later.
- Half of the total number of pixels in the horizontal direction of the display unit 7 is used to display the first display image 8 based on the first image source 1, and the other half is used to display the second display image 9 based on the second image source 2.
- The observer 10 located on the left side with respect to the display unit 7 sees only the pixels corresponding to the first display image 8; the second display image 9 is blocked by the parallax barrier formed on the surface of the display unit 7 and is virtually invisible.
- Likewise, the observer 11 located on the right side of the display unit 7 sees only the pixels corresponding to the second display image 9, while the first display image 8 is substantially blocked by the parallax barrier and is invisible.
- As the parallax barrier, for example, the configurations disclosed in JP-A-10-123462 and JP-A-11-84131 can be applied.
- FIG. 2 is a perspective view showing an example of mounting the multi-view display device according to the present invention on a vehicle.
- 12 is a passenger seat
- 13 is a driver's seat
- 14 is a windshield
- 15 is an operating section
- 16 is a speaker.
- the display unit 7 of the multi-view display device in FIG. 1 is arranged in the dashboard portion at the substantially center between the driver's seat 13 and the passenger seat 12 as shown in FIG.
- Various operations on the multi-view display device are performed by operation of a touch panel (not shown) integrally formed on the surface of the display unit 7, the operation unit 15, or an infrared or wireless remote controller (not shown).
- a speaker 16 is arranged at each door of the vehicle, and sounds, warning sounds, and the like linked to the display image are output.
- the observer 11 in FIG. 1 sits in the driver's seat 13 and the observer 10 sits in the passenger seat 12.
- The image that can be seen from the first viewing direction (driver's seat side) with respect to the display unit 7 is, for example, a map image of a car navigation device, while the image that can be seen substantially simultaneously from the second viewing direction (passenger seat side) is, for example, a received television image or a DVD movie image.
- Therefore, the passenger in the passenger seat 12 can enjoy television or a DVD at the same time that the driver in the driver's seat 13 receives driving assistance from the car navigation. Since each image is displayed using, for example, the entire 7-inch screen, the screen size is not reduced as in conventional multi-window display. In other words, optimal information and content are provided to the driver and the passenger as if each had a dedicated, independent display.
- FIG. 3 is a schematic diagram of a cross-sectional structure of the display unit 7.
- 100 is a liquid crystal panel
- 101 is a backlight
- 102 is a polarizing plate placed on the backlight side of the liquid crystal panel
- 103 is a polarizing plate placed on the front side of the light emitting direction of the liquid crystal panel
- 104 is a TFT (Thin Film Transistor) substrate
- 105 is a liquid crystal layer
- 106 is a color filter substrate
- 107 is a glass substrate
- 108 is a parallax barrier.
- The liquid crystal panel 100 includes a pair of substrates in which the liquid crystal layer 105 is sandwiched between the TFT substrate 104 and the color filter substrate 106 disposed opposite thereto, together with the parallax barrier 108 and the glass substrate 107 disposed on the front surface on the light emitting direction side; the whole is sandwiched between the two polarizing plates 102 and 103 and is disposed slightly apart from the backlight 101.
- the liquid crystal panel 100 includes pixels configured with RGB colors (three primary colors).
- Each pixel of the liquid crystal panel 100 is displayed and controlled separately for left side (passenger seat side) display and right side (driver seat side) display.
- the display pixel on the left side (passenger seat side) is blocked from being displayed on the right side (driver seat side) by the parallax barrier 108 and is visible from the left side (passenger seat side).
- the display pixel on the right side (driver's seat side) is blocked from being displayed on the left side (passenger seat side) by the parallax barrier 108 and is visible from the right side (driver's seat side). This makes it possible to provide different displays for the driver's seat and passengers.
- navigation map information can be given to the driver, and at the same time, a DVD movie or the like can be shown to the passenger.
- a configuration in which different images are displayed in a plurality of directions such as three directions is also possible.
- The viewing angle may also be varied by configuring the parallax barrier itself with a liquid crystal shutter or the like that can be electrically driven.
- FIG. 4 is a schematic view of the structure of the display panel as viewed from the front; FIG. 3 corresponds to a cross section taken along line A-A' in FIG. 4.
- 109 is a pixel for left side (passenger seat side) display
- 110 is a pixel for right side (driver seat side) display.
- FIGS. 3 and 4 show a part of the liquid crystal panel 100 in which, for example, 800 pixels are arranged in the horizontal direction and 480 pixels in the vertical direction.
- the left (passenger side) display pixel 109 and the right (driver's side) display pixel 110 are grouped in the vertical direction and are arranged alternately.
- The parallax barriers 108 are arranged at intervals in the horizontal direction and are uniform in the vertical direction. Accordingly, when the display panel is viewed from the left side, the parallax barrier 108 covers the right side pixels 110 and the left side pixels 109 can be seen. Similarly, when the display panel is viewed from the right side, the parallax barrier 108 covers the left side pixels 109 and the right side pixels 110 can be seen. Near the front, both the left side pixels 109 and the right side pixels 110 are visible, so the left display image and the right display image appear to substantially overlap.
- The left side pixels 109 and the right side pixels 110 alternately arranged in FIG. 4 may each be composed of a single one of the RGB colors per column as shown in FIG. 4, or may be configured so that multiple RGB colors are mixed.
- FIG. 5 is a circuit diagram showing an outline of the TFT substrate 104.
- 111 is a display panel driving unit
- 112 is a scanning line driving circuit
- 113 is a data line driving circuit
- 114 is a TFT element
- 115 to 118 are data lines
- 119 to 121 are scanning lines
- 122 is a pixel electrode
- 123 is a sub-pixel.
- A plurality of sub-pixels 123 are formed, each taking as a unit a region bounded by the data lines 115 to 118 and the scanning lines 119 to 121.
- In each sub-pixel, a pixel electrode 122 for applying a voltage to the liquid crystal layer 105 and a TFT element 114 for switching it are formed.
- the display panel drive unit 111 controls the drive timing of the scanning line drive circuit 112 and the data line drive circuit 113.
- the scanning line driving circuit 112 performs selection of the TFT element 114, and the data line driving circuit 113 controls the voltage applied to the pixel electrode 122.
- Based on the combined data of the first image data and the second image data, or on the individual first and second image data, the first pixel data (for left image display) is transmitted, for example, to the data lines 115 and 117, and the second pixel data (for right image display) is transmitted to the data lines 116 and 118.
- FIG. 6 is a block diagram showing an outline of the display device according to the present invention.
- 124 is a touch panel
- 200 is a control unit
- 201 is a CD/MD playback unit
- 202 is a radio reception unit
- 203 is a TV reception unit
- 204 is a DVD playback unit
- 205 is an HD (Hard Disk) playback unit
- 206 is Navigation unit
- 207 is a distribution circuit
- 208 is a first image adjustment circuit
- 209 is a second image adjustment circuit
- 210 is an audio adjustment circuit
- 211 is an image output unit
- 214 is a selector
- 215 is an operation unit
- 216 is a remote control transmission / reception unit
- 217 is a remote control
- 218 is a memory
- 219 is an external audio/video input unit
- 220 is a camera
- 221 is a brightness detection means
- 222 is an occupant detection means
- the display unit 7 includes a touch panel 124, a liquid crystal panel 100, and a backlight 101.
- The liquid crystal panel 100 of the display unit 7 can display an image seen from the driver's side as the first viewing direction and an image seen from the passenger's side as the second viewing direction virtually simultaneously.
- the display unit 7 may be a flat panel display other than a liquid crystal panel, such as an organic EL display panel, a plasma display panel, a cold cathode flat panel display, or the like.
- The control unit 200 distributes the images and sounds of the various sources (CD/MD playback unit 201, radio reception unit 202, TV reception unit 203, DVD playback unit 204, HD playback unit 205, and navigation unit 206) via the distribution circuit 207.
- Images are distributed to the first image adjustment circuit 208 and the second image adjustment circuit 209, and sound is distributed to the sound adjustment circuit 210.
- In the first and second image adjustment circuits 208 and 209, the brightness, color tone, contrast, and the like are adjusted, and the adjusted images are displayed on the display unit 7 by the image output unit 211.
- The sound adjustment circuit 210 adjusts the distribution to each speaker, the volume, and the sound, and the adjusted sound is output from the speakers 16.
- FIG. 7 is a block diagram showing an outline of the image output unit 211.
- 226 is a first write circuit
- 227 is a second write circuit
- 228 is a VRAM (Video RAM).
- The image output unit 211 includes, for example, a first write circuit 226, a second write circuit 227, a VRAM (Video RAM) 228, and the display panel drive unit 111, as shown in FIG. 7.
- The first write circuit 226 writes the image data corresponding to the odd-numbered columns of the image data adjusted by the first image adjustment circuit 208 (that is, the image data for the first display image 8 in FIG. 1) into the corresponding area of the VRAM 228, and the second write circuit 227 writes the image data corresponding to the even-numbered columns of the image data adjusted by the second image adjustment circuit 209 (that is, the image data for the second display image 9 in FIG. 1) into the corresponding area of the VRAM 228.
- The display panel drive unit 111 is a circuit for driving the liquid crystal panel 100, and drives the corresponding pixels of the liquid crystal panel 100 based on the composite image stored in the VRAM 228 (the composite of the first image data and the second image data). Since the image data is written into the VRAM 228 so as to correspond to the multi-view display in which the first image data and the second image data are combined, a single drive circuit suffices, and its operation is the same as that of the drive circuit of a normal liquid crystal display device. As another configuration, a first display panel drive circuit and a second display panel drive circuit may be used to drive the corresponding pixels of the liquid crystal panel based on the respective first and second image data without combining them.
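- As a rough software sketch of this column-by-column interleaving into the VRAM 228 (the function name and data layout below are illustrative assumptions, not the actual write-circuit hardware):

```python
def combine_into_vram(first_image, second_image):
    """Sketch: interleave two frames column by column, as the first write
    circuit (odd columns of the first image) and the second write circuit
    (even columns of the second image) do when writing into the VRAM 228.

    first_image, second_image: lists of rows; each row is a list of pixel
    values (e.g. (R, G, B) tuples) of equal width.
    """
    composite = []
    for row_a, row_b in zip(first_image, second_image):
        # 0-based even indices stand in for the "odd-numbered columns"
        row = [row_a[x] if x % 2 == 0 else row_b[x] for x in range(len(row_a))]
        composite.append(row)
    return composite
```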
- the navigation unit 206 includes a map information storage unit that stores map information used for navigation.
- The navigation unit 206 obtains information from the VICS information reception unit 212 and the GPS information reception unit 213, and can create and display an image for navigation and for its operation.
- The TV reception unit 203 receives analog TV broadcast waves and digital TV broadcast waves picked up by the antenna via the selector 214.
- FIG. 8 is a block diagram showing an outline of the control unit 200.
- 229 is an interface
- 230 is a CPU
- 231 is a program storage unit
- 232 is a data storage unit.
- The control unit 200 controls the distribution circuit 207 and the various sources, and causes the selected two sources or single source to be displayed.
- the control unit 200 also causes the display unit 7 to display an operation menu display for controlling these various sources.
- The control unit 200 includes a microprocessor and the like, and includes a CPU 230 that comprehensively controls each part and each circuit in the display device.
- The CPU 230 is provided with a program storage unit 231 consisting of ROM for holding the various programs necessary for the operation of the display device, and a data storage unit 232 consisting of RAM for holding various data.
- The ROM, RAM, and the like may be built into the CPU or provided externally.
- the ROM may be an electrically rewritable nonvolatile memory such as a flash memory.
- The user can control the various sources described above through the touch panel 124 mounted on the surface of the display unit 7, through the switches provided around the display unit 7, or through an input or selection operation such as voice recognition via the operation unit 215. Further, an input or selection operation may be performed with the remote control 217 via the remote control transmission/reception unit 216.
- The control unit 200 performs control, including control of the various sources, in accordance with operations on the touch panel 124 and the operation unit 215. The control unit 200 is also configured to be able to control the volume of each of the speakers 16 provided in the vehicle as shown in FIG. 2.
- the control unit 200 also stores various setting information such as image quality setting information, programs, and vehicle information in the memory 218.
- FIG. 9 is a block diagram showing an outline of the memory 218.
- 233 is a first screen RAM
- 234 is a second screen RAM
- 235 is an image quality setting information storage means
- 236 is an environment adjustment value holding means.
- The memory 218 includes a first screen RAM 233 and a second screen RAM 234 into which the image quality adjustment values set by the user can be written, respectively.
- the image quality setting information storage unit 235 and the environmental adjustment value holding unit 236 are configured by an electrically rewritable non-volatile memory such as a flash memory or a battery-backed volatile memory.
- An image from the rear monitoring camera 220 connected to the external audio/video input unit 219 may be displayed on the display unit 7.
- A video camera, a game machine, or the like may also be connected to the external audio/video input unit 219.
- The control unit 200 can change settings such as the localization position of output images and sounds based on the information detected by the brightness detection means 221 (for example, a vehicle light switch or a light sensor) or the occupant detection means 222 (for example, a pressure sensor provided in a seat).
- Reference numeral 223 denotes a rear display unit provided for the rear seat of the vehicle, which can display, via the image output unit 211, either the same image as that displayed on the display unit 7 or one of the images for the driver's seat or the passenger seat.
- The control unit 200 can also display toll information from the ETC vehicle-mounted device 250.
- The control unit 200 may also control the communication unit 225 for wireless connection with a mobile phone or the like so that a display related to it is shown.
- image interpolation processing in the display device will be described.
- the image interpolation processing is executed in the display device by the display control unit 5 in the conceptual diagram in FIG. 1, and by the first image adjustment circuit and the second image adjustment circuit in the block diagram in FIG.
- Here, the image interpolation device is described with particular attention to the portion related to the image interpolation processing.
- FIG. 10 is a block diagram of the configuration of the image interpolation apparatus according to the first embodiment. As shown in the figure, this image interpolation device 310 is connected to an AV unit 320 and a navigation unit 330.
- the AV unit 320 is a DVD player that reads out a video signal or the like stored in a DVD disc (not shown) and outputs it to the image interpolation device 310. Specifically, a DVD video display request is made based on an instruction from a vehicle occupant, and image data of the DVD video is output to the image interpolation device 310.
- The AV unit 320 is not limited to a DVD player; a compact disc player, a hard disk, a radio, a TV, or the like may also be used.
- The navigation unit 330 is a device that performs route guidance based on preset route information and the position information of the host vehicle. Specifically, based on the planned route information of the own vehicle set by a vehicle occupant (for example, the driver) and the position information transmitted from artificial satellites and acquired by the GPS receiver, a navigation video is created, and the image data of the created navigation video is output to the image interpolation device 310.
- The display unit 317 in the image interpolation device 310 displays the DVD video output by the AV unit 320 and the navigation video output by the navigation unit 330.
- the resolution of the display unit 317 is 800 × 480
- the resolution of the image data of the DVD video is 800 × 480
- the resolution of the image data of the navigation video is 800 × 480
- the two-screen display mode divides the screen of the display unit 317 into left and right so that the driver's side and passenger's side passengers can view two images from both sides.
- The DVD video v1 and the navigation video v2 are displayed side by side, and this display format is suitable for cases where both the driver's side and passenger's side occupants want to view both images.
- The two-view display mode provides a parallax optical device (for example, a parallax barrier) on the display unit 317 so that the occupants on the driver's side and the passenger's side can view different images.
- This display format is suitable for preventing the driver from viewing the DVD video while driving.
- image interpolation processing is performed only on the image data of the navigation video.
- The image interpolation processing target is limited to the navigation video because, when the image data of the navigation video is subjected to resolution conversion, pixels of characters, symbols, and the like are easily lost, so that the contents of the video easily become unintelligible.
- Needless to say, image interpolation processing may be performed on both the DVD video v1 and the navigation video v2.
- The image interpolation device 310 calculates, for a pixel located in the interpolation target region, a feature amount of the pixel with respect to that pixel and its neighboring pixels, and determines the pixel value of the interpolation pixel based on the calculated feature amounts of the pixels located in the interpolation target region.
- Specifically, in the original image data of the navigation video v2, the image interpolation device 310 calculates, for each pixel located in the interpolation target region, a feature amount of the pixel with respect to that pixel and its neighboring pixels.
- Here, the interpolation target region is set to two dots consisting of a pair of an odd dot and an even dot, and the range of neighboring pixels referred to when determining the pixel value of the interpolation pixel is set to one dot on the right side of the interpolation target region.
- the calculation of the feature amount is performed for each component of the RGB digital signal.
- Such a feature amount is an index representing how much the pixel value of a pixel located in the interpolation target region deviates from the other pixel located in the interpolation target region and from the neighboring pixels. Specifically, it is calculated by obtaining the absolute value of the difference between the pixel value of the target pixel located in the interpolation target region and the average of the pixel values of the pixels located in the interpolation target region and the neighboring pixels. A large feature amount indicates a pixel that changes greatly compared with the neighboring pixels, and a small feature amount indicates a pixel with little change from the neighboring pixels.
- As shown in FIG. 12, for the "R" component of the original RGB digital signal, the image interpolation device 310 first calculates the absolute value of the difference between the pixel value "P1" of "pixel 1" located in the interpolation target region A and the average of the pixel values "P1" and "P2" of "pixel 1" and "pixel 2" located in the interpolation target region A and the pixel value "P3" of the neighboring "pixel 3", thereby obtaining the feature amount |P1 - (P1 + P2 + P3)/3| of "pixel 1".
- Similarly, the feature amount |P2 - (P1 + P2 + P3)/3| of "pixel 2" is calculated.
- the feature values of “pixel 1” and “pixel 2” in the digital signals “G” and “B” are similarly calculated.
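- The feature amount calculation described above can be sketched as follows; the function name and the 8-bit sample values are assumptions made only for illustration.

```python
def feature_amount(target_value, region_values, neighbor_values):
    """Absolute deviation of the target pixel value from the average of the
    pixels in the interpolation target region plus the referenced neighbors."""
    local = list(region_values) + list(neighbor_values)
    average = sum(local) / len(local)
    return abs(target_value - average)

# Interpolation target region A of FIG. 12, one colour component (values assumed):
P1, P2, P3 = 200, 40, 35                  # "pixel 1", "pixel 2", neighbouring "pixel 3"
f1 = feature_amount(P1, (P1, P2), (P3,))  # |P1 - (P1 + P2 + P3)/3|
f2 = feature_amount(P2, (P1, P2), (P3,))  # |P2 - (P1 + P2 + P3)/3|
```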
- The image interpolation device 310 likewise calculates the feature amounts of "pixel 3" and "pixel 4" located in the interpolation target region B, of "pixel 5" and "pixel 6" located in the interpolation target region C, and so on up to "pixel m" and "pixel n" located in the interpolation target region N.
- So that the pixel value of a pixel that characterizes the image in the original image is preferentially adopted as the pixel value of the interpolation pixel, the image interpolation device 310 determines the pixel value of the interpolation pixel based on the calculated feature amounts of the pixels located in the interpolation target region. Specifically, the pixel value of a pixel whose feature amount exceeds the threshold among the pixels located in the interpolation target region is determined as the pixel value of the interpolation pixel.
- In this way, the pixel value of a pixel that characterizes the image in the original image is preferentially adopted as the pixel value of the interpolation pixel, so that image quality deterioration due to resolution conversion of the image data can be suppressed. Moreover, since the pixel values of the pixels that characterize the image in the original image are adopted without averaging or weighted averaging, the characteristics of the original image are easily maintained.
- For example, when the feature amount of "pixel 1" exceeds the threshold, the pixel value "P1" of "pixel 1" is determined as the pixel value of the interpolation pixel for the interpolation target region A. Note that when the feature amounts of both "pixel 1" and "pixel 2" are equal to or greater than the threshold, it is preferable to adopt the pixel value of the pixel having the larger feature amount as the pixel value of the interpolation pixel.
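- A minimal sketch of this decision rule follows, assuming an arbitrary threshold value (the description does not fix a concrete number) and using the averaging fallback described later as the below-threshold case.

```python
THRESHOLD = 32  # assumed value; the text does not fix a concrete threshold

def interpolation_pixel_value(region_values, feature_amounts):
    """Adopt the value of the pixel with the largest feature amount when that
    feature amount reaches the threshold; otherwise fall back to averaging the
    pixels of the interpolation target region (one of the fallbacks described
    later in the text)."""
    best = max(range(len(region_values)), key=lambda i: feature_amounts[i])
    if feature_amounts[best] >= THRESHOLD:
        return region_values[best]                  # keep the characteristic pixel
    return sum(region_values) / len(region_values)  # smooth otherwise
```

- With the example values used above, interpolation_pixel_value([P1, P2], [f1, f2]) would return P1, since the feature amount of "pixel 1" is the larger one and exceeds the assumed threshold.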
- Unlike the conventional technique, in which interpolation data obtained by averaging or weighted averaging a plurality of pixel data around the position where pixel data is interpolated is created uniformly and the pixels that characterize the image are therefore blurred, here the pixel value of a pixel whose feature amount exceeds an appropriate threshold, regarded as a pixel that characterizes the image in the original image, is determined as the pixel value of the interpolation pixel.
- As a result, the characteristics of the original image, that is, the image before the resolution conversion is performed, can be easily maintained.
- FIG. 10 is a block diagram illustrating the configuration of the image interpolation apparatus according to the first embodiment.
- As shown in the figure, this image interpolation device 310 includes an image data input unit 311, an image data input control unit 312, a feature amount calculation unit 313, an image interpolation processing unit 314, a resolution conversion processing unit 315, a display control processing unit 316, and a display unit 317.
- The image data input unit 311 is a processing unit that, based on an instruction from the image data input control unit 312, inputs the image data output by the AV unit 320 and/or the navigation unit 330 to the feature amount calculation unit 313.
- In the first embodiment, the resolution of the DVD video image is 800 × 480, and the resolution of the navigation video image is likewise 800 × 480.
- The image data input control unit 312 is a processing unit that controls the number of input systems of image data input from the image data input unit 311 to the feature amount calculation unit 313 in response to display requests from the AV unit 320 and/or the navigation unit 330.
- Specifically, when a display request is received from the AV unit 320, it instructs the image data input unit 311 to input the image data of the DVD video v1; when a display request is received from the navigation unit 330, it instructs the image data input unit 311 to input the image data of the navigation video v2; and when display requests are received from both, it instructs the image data input unit 311 to input the image data of both the DVD video v1 and the navigation video v2.
- The feature amount calculation unit 313 is a processing unit that, based on the image data input from the image data input unit 311, calculates the feature amount of a pixel located in the interpolation target region with respect to that pixel and its neighboring pixels. Specifically, as shown in FIG. 12, for the "R" component of the original RGB digital signal, the feature amount |P1 - (P1 + P2 + P3)/3| of "pixel 1" located in the interpolation target region A is obtained as the absolute value of the difference between the pixel value "P1" of "pixel 1" and the average of the pixel values "P1" and "P2" of "pixel 1" and "pixel 2" located in the interpolation target region A and the pixel value "P3" of the neighboring "pixel 3"; the feature amount of "pixel 2" is obtained in the same way.
- The feature amount calculation unit 313 likewise calculates the feature amounts of "pixel 3" and "pixel 4" located in the interpolation target region B, of "pixel 5" and "pixel 6" located in the interpolation target region C, and so on up to "pixel m" and "pixel n" located in the interpolation target region N. For the reason described above, although both the DVD video v1 and the navigation video v2 are input, the feature amount calculation unit 313 and the image interpolation processing unit 314 here perform the image interpolation processing only on the image data of the navigation video.
- The image interpolation processing unit 314 is a processing unit that determines the pixel value of the interpolation pixel based on the feature amounts of the pixels located in the interpolation target region. Specifically, among the pixels located in the interpolation target region, the pixel having the largest feature amount calculated by the feature amount calculation unit 313 is extracted, and if the feature amount of the extracted pixel is equal to or greater than a threshold value, the pixel value of that pixel is determined as the pixel value of the interpolation pixel.
- For example, the image interpolation processing unit 314 compares the feature amount |P1 - (P1 + P2 + P3)/3| of "pixel 1" with the feature amount |P2 - (P1 + P2 + P3)/3| of "pixel 2"; if the feature amount of "pixel 1" is the larger and is equal to or greater than the threshold, the pixel value "P1" of "pixel 1" is determined as the pixel value of the interpolation pixel for the interpolation target region A.
- In this way, the pixel value of a pixel that characterizes the image in the original image can be preferentially adopted as the pixel value of the interpolation pixel.
- The resolution conversion processing unit 315 is a processing unit that performs resolution conversion on the plurality of pieces of image data in which the interpolation pixels have been interpolated by the image interpolation processing unit 314. For example, when the image data of the DVD video v1 and the image data of the navigation video v2 input from the image interpolation processing unit 314 are to be displayed on the display unit 317 in the two-view display mode, their RGB digital signals have a dot arrangement as shown in the figure.
- In this case, the resolution conversion processing unit 315 performs 1/2 horizontal resolution conversion on the image data of the DVD video v1 by thinning out the "G" of the odd-numbered dots and the "R" and "B" of the even-numbered dots.
- Similarly, the resolution conversion processing unit 315 performs 1/2 horizontal resolution conversion on the image data of the navigation video v2 by thinning out the "R" and "B" of the odd-numbered dots and the "G" of the even-numbered dots.
- The display control processing unit 316 is a processing unit that performs control so that the image data subjected to the resolution conversion by the resolution conversion processing unit 315 is rearranged into a predetermined display form (in the first embodiment, the two-view display form) and displayed. Specifically, the RGB digital signals of the DVD video v1 and the navigation video v2 whose resolution has been converted by the resolution conversion processing unit 315 are rearranged into the dot arrangement shown in FIG. 14, that is, the digital signals "R", "B", "G" of the DVD video and the digital signals "G", "R", "B" of the navigation video are each displayed every other sub-pixel, arranged alternately.
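- The combined effect of the 1/2 horizontal thinning and the rearrangement can be sketched at the sub-pixel level as follows; the flat R, G, B, R, G, B, ... layout and the choice of which parity comes from which source are simplifying assumptions, so the exact odd/even assignment may differ from FIG. 14.

```python
def two_view_row(dvd_row, navi_row):
    """Sketch: build one displayed scan line for the two-view mode.

    dvd_row, navi_row: flat lists of sub-pixel values laid out R, G, B, R, G, B, ...
    Every other sub-pixel of the output comes from the DVD image and the rest
    from the navigation image, which is equivalent to each source giving up
    half of its horizontal resolution.
    """
    assert len(dvd_row) == len(navi_row)
    return [dvd_row[i] if i % 2 == 0 else navi_row[i] for i in range(len(dvd_row))]
```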
- As a result, for the viewer in the right direction with respect to the display unit 317 (the occupant in the driver's seat), the "G" of a given dot is not lit while its "R" and "B" are lit.
- The image pixel data generated by these processes is as shown in FIG. 15A, in which original pixels are extracted from the even pixel columns or the odd pixel columns over one frame.
- As shown in FIG. 15B, high-frequency components indicating large changes between pixels are retained, so that there is no significant deterioration in image quality and a certain level of good visibility can be ensured.
- Furthermore, by controlling so that a plurality of pieces of image data, including at least one piece of image data subjected to resolution conversion, are displayed in a predetermined display form, it becomes possible to display multiple pieces of image data in various display forms on a single display.
- FIG. 16 is a flowchart showing the procedure of the image interpolation process.
- This image interpolation processing is started when a display request for the DVD video and a display request for the navigation video are received from the AV unit 320 and the navigation unit 330 and the setting is a coexistence display mode such as the two-screen display or the two-view display.
- As shown in FIG. 16, when a DVD video display request and a navigation video display request are received from the AV unit 320 and the navigation unit 330 (Yes in step S601), the image data input unit 311 inputs the image data of the DVD video v1 and the navigation video v2 to the feature amount calculation unit 313 for each input system (step S602).
- the feature amount calculation unit 313 sequentially calculates the feature amounts of the pixels located in the interpolation target region based on the image data input from the image data input unit 311 (step S603). Subsequently, the image interpolation processing unit 314 extracts a pixel having the largest feature amount calculated by the feature amount calculation unit 313 from among the pixels located in the interpolation target region (step S604).
- If the feature amount of the extracted pixel is equal to or greater than the threshold (Yes in step S605), the image interpolation processing unit 314 determines the pixel value of that pixel as the pixel value of the interpolation pixel (step S606). On the other hand, if the feature amount of the extracted pixel is less than the threshold (No in step S605), the pixel value of each pixel located in the interpolation target region is determined as the pixel value of the interpolation pixel (step S607).
- When the pixel values of the interpolation pixels have been determined for all the interpolation target regions (Yes at step S608), the image interpolation processing unit 314 generates image data in which the pixel value of each interpolation pixel is reflected (step S609). If the pixel values of the interpolation pixels have not yet been determined for all the interpolation target regions (No at step S608), the processing from steps S603 to S607 is repeated until they have been determined for all the interpolation target regions.
- Subsequently, the resolution conversion processing unit 315 performs 1/2 horizontal resolution conversion processing on the navigation video image data in which the interpolation pixels have been interpolated by the image interpolation processing unit 314 and on the DVD video image data, respectively (step S610).
- Then, the display control processing unit 316 rearranges the navigation video image data and the DVD video image data that have been subjected to the 1/2 horizontal resolution conversion processing by the resolution conversion processing unit 315 into a predetermined display form and causes them to be displayed (step S611).
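- The loop over interpolation target regions (steps S603 to S608) can be pictured with the following self-contained sketch for one colour plane of the navigation video; the threshold value and the averaging fallback are assumptions, since the flowchart leaves them open.

```python
def interpolate_plane(plane, threshold=32):
    """Sketch of steps S603-S609 for one colour plane of the navigation video.

    plane: list of rows, each a list of pixel values. Each interpolation
    target region is a pair of horizontally adjacent dots; the neighbourhood
    is the single dot to its right (clamped at the row end).
    Returns one interpolation pixel value per region.
    """
    result = []
    for row in plane:
        out_row = []
        for x in range(0, len(row) - 1, 2):               # region = (x, x + 1)
            p1, p2 = row[x], row[x + 1]
            p3 = row[x + 2] if x + 2 < len(row) else p2   # right-hand neighbour
            avg = (p1 + p2 + p3) / 3
            f1, f2 = abs(p1 - avg), abs(p2 - avg)         # feature amounts (S603)
            value, feature = (p1, f1) if f1 >= f2 else (p2, f2)   # largest (S604)
            # S605-S607: keep the characteristic pixel, otherwise smooth (assumed fallback)
            out_row.append(value if feature >= threshold else (p1 + p2) / 2)
        result.append(out_row)
    return result
```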
- When both the navigation video and the DVD video have ended (Yes in step S612), the processing ends; if they have not both ended (No at step S612), the processing from steps S602 to S611 is repeated.
- As described above, according to the first embodiment, the pixel having the largest feature amount is extracted from the pixels located in the interpolation target region, and if the feature amount of the extracted pixel is equal to or greater than the threshold, the pixel value of that pixel is determined as the pixel value of the interpolation pixel. Therefore, a pixel that characterizes the image in the original image is extracted and its pixel value is more preferentially adopted as the pixel value of the interpolation pixel, so that deterioration in image quality due to resolution conversion of the image data can be suppressed more effectively.
- In the first embodiment, the image interpolation processing according to the present invention is performed when a plurality of display requests (that is, display requests for the DVD video and the navigation video) are received.
- However, the present invention is not limited to this and can be applied in the same way regardless of whether the number of display requests is single or plural.
- In particular, for image data that needs resolution conversion (for example, when resolution conversion suitable for a relatively small display unit such as that of a mobile phone is required), a greater effect can be obtained by applying this image interpolation processing even when only a single display request is made.
- In the first embodiment, the pixel having the largest feature amount calculated by the feature amount calculation unit 313 is extracted from the pixels located in the interpolation target region, and if the feature amount of the extracted pixel is less than the threshold, the pixel value of each pixel located in the interpolation target region is determined as the pixel value of the interpolation pixel.
- However, the present invention is not limited to this; when the feature amount of the extracted pixel is less than the threshold, a pixel value obtained by averaging the pixel values of the pixels located in the interpolation target region may be determined as the pixel value of the interpolation pixel.
- Alternatively, when the pixel having the largest feature amount calculated by the feature amount calculation unit 313 is extracted and its feature amount is less than the threshold, the difference in pixel value between the pixels located in the interpolation target region may be calculated, and if the absolute value of that difference is equal to or greater than a threshold, a pixel value obtained by averaging the pixel values of the pixels located in the interpolation target region may be determined as the pixel value of the interpolation pixel.
- For example, the pixel value difference between "pixel 1" and "pixel 2" located in the interpolation target region A is calculated, and if the absolute value of this difference (that is, |P1 - P2|) is equal to or greater than the threshold "THRESH", the pixel value "(P1 + P2)/2", which is the average of the pixel values of "pixel 1" and "pixel 2" located in the interpolation target region A, is determined as the pixel value of the interpolation pixel. The same image interpolation processing is then performed for the interpolation target region B, the interpolation target region C, and so on.
- In this way, by calculating the pixel value difference between the pixels located in the interpolation target region and, when the absolute value of that difference is equal to or greater than the threshold, adopting the average of the pixel values of the pixels located in the interpolation target region as the pixel value of the interpolation pixel, large differences in brightness level that occur locally can be smoothed out, and image quality deterioration due to resolution conversion of the image data can be effectively suppressed.
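- A short sketch of this variant's fallback, again with an assumed threshold; what happens when the difference stays below the threshold is not spelled out above, so the sketch simply keeps an original pixel value.

```python
THRESH = 32  # assumed value, named "THRESH" as in the example above

def fallback_interpolation_value(p1, p2):
    """Variant fallback used when no pixel of the target region is judged
    characteristic: smooth out locally large brightness steps by averaging."""
    if abs(p1 - p2) >= THRESH:
        return (p1 + p2) / 2   # average the pair
    return p1                  # below the threshold: keep an original pixel value (assumption)
```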
- In the first embodiment, the case has been described in which the video signal input to the image interpolation device 310 is an RGB-format signal; however, the present invention is not limited to this and can be applied in the same way even when a video signal of another format, such as the YC format, is input.
- the constituent elements of the illustrated devices are functionally conceptual, and need not be physically configured as illustrated.
- The specific form of distribution and integration of each device is not limited to that shown in the figures, and all or a part thereof can be functionally or physically distributed or integrated in arbitrary units according to various loads and usage conditions.
- Each processing function performed in each device can be realized by a CPU and a program analyzed and executed by that CPU, or can be realized as hardware by wired logic.
- In the first embodiment, the description has been given by taking as examples the two-screen display mode, in which two screens are displayed on a single display, and the two-view display mode, in which two different images are output in two directions; however, the present invention can also be implemented as a multi-screen display form that displays three or more screens or a multi-directional display form that outputs different images in three or more directions.
- The use of the present invention is not limited to in-vehicle applications; the present invention is also applicable to display devices for other uses, such as home use.
- the image interpolation apparatus and display apparatus according to the present invention are useful for image interpolation, and are particularly suitable for resolution conversion while maintaining the characteristics of the original image.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/666,506 US20070297692A1 (en) | 2004-10-29 | 2005-10-19 | Image Interpolation Device and Display Device |
EP05795572A EP1816638A4 (en) | 2004-10-29 | 2005-10-19 | IMAGE INTERPOLATION DEVICE AND DISPLAY DEVICE |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-316906 | 2004-10-29 | ||
JP2004316906 | 2004-10-29 | ||
JP2005265690A JP2006154759A (ja) | 2004-10-29 | 2005-09-13 | 画像補間装置、および表示装置 |
JP2005-265690 | 2005-09-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006046444A1 true WO2006046444A1 (ja) | 2006-05-04 |
Family
ID=36227686
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/019182 WO2006046444A1 (ja) | 2004-10-29 | 2005-10-19 | 画像補間装置、および表示装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20070297692A1 (ja) |
EP (1) | EP1816638A4 (ja) |
JP (1) | JP2006154759A (ja) |
KR (1) | KR20070083592A (ja) |
WO (1) | WO2006046444A1 (ja) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4255032B2 (ja) * | 2007-03-15 | 2009-04-15 | 富士通テン株式会社 | 表示装置及び表示方法 |
JP5293923B2 (ja) * | 2008-01-09 | 2013-09-18 | 株式会社リコー | 画像処理方法及び装置、画像表示装置並びにプログラム |
CA2753422A1 (en) * | 2009-02-24 | 2010-09-02 | Manufacturing Resources International, Inc. | System and method for displaying multiple images/videos on a single display |
US8896684B2 (en) * | 2010-06-24 | 2014-11-25 | Tk Holdings Inc. | Vehicle display enhancements |
JP5872764B2 (ja) * | 2010-12-06 | 2016-03-01 | 富士通テン株式会社 | 画像表示システム |
KR101531879B1 (ko) * | 2012-01-17 | 2015-06-26 | 티.비.티. 주식회사 | 적외선 영상 처리 장치 |
JP6244542B2 (ja) * | 2013-07-12 | 2017-12-13 | パナソニックIpマネジメント株式会社 | 投写型映像表示装置および投写型映像表示装置の制御方法 |
DE102015202846B4 (de) | 2014-02-19 | 2020-06-25 | Magna Electronics, Inc. | Fahrzeugsichtsystem mit Anzeige |
US10324297B2 (en) | 2015-11-30 | 2019-06-18 | Magna Electronics Inc. | Heads up display system for vehicle |
US10401621B2 (en) | 2016-04-19 | 2019-09-03 | Magna Electronics Inc. | Display unit for vehicle head-up display system |
KR102711203B1 (ko) * | 2019-03-25 | 2024-09-27 | 현대자동차주식회사 | 차량용 내비게이션 장치 및 그의 영상 표시 방법과 그를 포함하는 차량 |
CN114519967B (zh) * | 2022-02-21 | 2024-04-16 | 北京京东方显示技术有限公司 | 源驱动装置及其控制方法、显示系统 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH01290373A (ja) * | 1988-05-18 | 1989-11-22 | Matsushita Electric Ind Co Ltd | 画像信号処理装置 |
JPH0380375A (ja) | 1989-08-24 | 1991-04-05 | Mitsubishi Heavy Ind Ltd | 画像データ処理装置 |
JPH04199477A (ja) * | 1990-11-29 | 1992-07-20 | Toshiba Corp | 画像処理装置 |
JP2002135569A (ja) * | 2000-10-20 | 2002-05-10 | Matsushita Electric Ind Co Ltd | 画像処理装置 |
JP2004104368A (ja) | 2002-09-06 | 2004-04-02 | Sony Corp | 画像データ処理方法、画像データ処理プログラム及び立体画像表示装置 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2321815A (en) * | 1997-02-04 | 1998-08-05 | Sharp Kk | Autostereoscopic display with viewer position indicator |
DE69732820T2 (de) * | 1996-09-12 | 2006-04-13 | Sharp K.K. | Parallaxeschranke und Anzeigevorrichtung |
GB2317291A (en) * | 1996-09-12 | 1998-03-18 | Sharp Kk | Observer tracking directional display |
US6624863B1 (en) * | 1997-06-28 | 2003-09-23 | Sharp Kabushiki Kaisha | Method of making a patterned retarder, patterned retarder and illumination source |
US6055103A (en) * | 1997-06-28 | 2000-04-25 | Sharp Kabushiki Kaisha | Passive polarisation modulating optical element and method of making such an element |
JP3388150B2 (ja) * | 1997-08-29 | 2003-03-17 | シャープ株式会社 | 立体画像表示装置 |
DE19808982A1 (de) * | 1998-03-03 | 1999-09-09 | Siemens Ag | Aktivmatrix-Flüssigkristallanzeige |
US7277121B2 (en) * | 2001-08-29 | 2007-10-02 | Sanyo Electric Co., Ltd. | Stereoscopic image processing and display system |
CN1500345B (zh) * | 2001-12-28 | 2010-06-02 | 索尼公司 | 显示设备和控制方法 |
JP3953434B2 (ja) * | 2003-03-20 | 2007-08-08 | 株式会社ソフィア | 画像表示装置 |
GB2399653A (en) * | 2003-03-21 | 2004-09-22 | Sharp Kk | Parallax barrier for multiple view display |
JP2006195415A (ja) * | 2004-12-13 | 2006-07-27 | Fujitsu Ten Ltd | 表示装置及び表示方法 |
-
2005
- 2005-09-13 JP JP2005265690A patent/JP2006154759A/ja active Pending
- 2005-10-19 EP EP05795572A patent/EP1816638A4/en not_active Withdrawn
- 2005-10-19 US US11/666,506 patent/US20070297692A1/en not_active Abandoned
- 2005-10-19 KR KR1020077006823A patent/KR20070083592A/ko not_active Application Discontinuation
- 2005-10-19 WO PCT/JP2005/019182 patent/WO2006046444A1/ja active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH01290373A (ja) * | 1988-05-18 | 1989-11-22 | Matsushita Electric Ind Co Ltd | 画像信号処理装置 |
JPH0380375A (ja) | 1989-08-24 | 1991-04-05 | Mitsubishi Heavy Ind Ltd | 画像データ処理装置 |
JPH04199477A (ja) * | 1990-11-29 | 1992-07-20 | Toshiba Corp | 画像処理装置 |
US5280546A (en) | 1990-11-29 | 1994-01-18 | Kabushiki Kaisha Toshiba | Image processing apparatus for variably magnifying image and controlling image density |
JP2002135569A (ja) * | 2000-10-20 | 2002-05-10 | Matsushita Electric Ind Co Ltd | 画像処理装置 |
JP2004104368A (ja) | 2002-09-06 | 2004-04-02 | Sony Corp | 画像データ処理方法、画像データ処理プログラム及び立体画像表示装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1816638A4 * |
Also Published As
Publication number | Publication date |
---|---|
US20070297692A1 (en) | 2007-12-27 |
EP1816638A1 (en) | 2007-08-08 |
KR20070083592A (ko) | 2007-08-24 |
EP1816638A4 (en) | 2009-08-26 |
JP2006154759A (ja) | 2006-06-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006046444A1 (ja) | 画像補間装置、および表示装置 | |
JP4255032B2 (ja) | 表示装置及び表示方法 | |
KR100869673B1 (ko) | 표시제어장치 및 표시장치 | |
JP2006195415A (ja) | 表示装置及び表示方法 | |
KR100896030B1 (ko) | 차량 탑재용 표시 장치 | |
US20070291172A1 (en) | Display Control Apparatus and Display Apparatus | |
JP2006184859A (ja) | 表示制御装置、及び表示装置 | |
JP4215782B2 (ja) | 表示装置、および表示装置の音声調整方法 | |
WO2006049213A1 (ja) | 映像信号処理方法、映像信号処理装置、及び、表示装置 | |
JP4308219B2 (ja) | 車載用表示装置 | |
WO2006043721A1 (ja) | 表示装置 | |
JP2006154756A5 (ja) | ||
JP5201989B2 (ja) | 表示装置及び表示装置の取付方法 | |
JP2006154754A (ja) | 表示制御装置、及び、表示装置 | |
JP4781059B2 (ja) | 表示装置及び出力制御装置 | |
JP2006301573A (ja) | 表示装置および表示方法 | |
JP2008102084A (ja) | 表示装置及び目的地設定方法 | |
JP2007034247A (ja) | 表示装置 | |
JP2006259761A (ja) | 車両用表示装置および表示方法 | |
JP4023815B2 (ja) | 表示装置 | |
JP2009204862A (ja) | 映像信号処理装置、表示装置、及び、映像信号処理方法 | |
JP4963166B2 (ja) | 表示装置 | |
JP2009069838A (ja) | 表示装置及び表示方法 | |
JP2007025617A (ja) | コントラスト調整装置、コントラスト調整方法、及び表示装置 | |
JP2006293350A (ja) | 表示装置及び表示方法 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| AK | Designated states | Kind code of ref document: A1. Designated state(s): AE AG AL AM AT AU AZ BA BB BG BW BY BZ CA CH CN CO CR CU CZ DK DM DZ EC EE EG ES FI GB GD GE GM HR HU ID IL IN IS KE KG KM KP KZ LC LK LR LS LT LU LV LY MA MG MK MN MW MX MZ NA NG NI NZ OM PG PH PL PT RO RU SC SD SE SK SL SM SY TJ TM TN TR TT TZ UA US UZ VC VN YU ZA ZM
| AL | Designated countries for regional patents | Kind code of ref document: A1. Designated state(s): BW GH GM KE LS MW MZ NA SD SZ TZ UG ZM ZW AM AZ BY KG MD RU TJ TM AT BE BG CH CY DE DK EE ES FI FR GB GR HU IE IS IT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW MR NE SN TD TG
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
| WWE | Wipo information: entry into national phase | Ref document number: 1020077006823. Country of ref document: KR
| WWE | Wipo information: entry into national phase | Ref document number: 200580037420.6. Country of ref document: CN
| NENP | Non-entry into the national phase | Ref country code: DE
| WWE | Wipo information: entry into national phase | Ref document number: 2005795572. Country of ref document: EP
| WWE | Wipo information: entry into national phase | Ref document number: 11666506. Country of ref document: US
| WWP | Wipo information: published in national office | Ref document number: 2005795572. Country of ref document: EP
| WWP | Wipo information: published in national office | Ref document number: 11666506. Country of ref document: US