WO2022255147A1 - Display device, display system, and display driving method - Google Patents
Display device, display system, and display driving method
- Publication number
- WO2022255147A1 (PCT/JP2022/021117)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- image data
- display
- pixels
- partial
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/37—Details of the operation on graphic patterns
- G09G5/377—Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
Definitions
- the present disclosure relates to a display device, a display system, and a display driving method for displaying images.
- Some display devices, for example, generate a frame image based on a low-resolution whole image and a high-resolution partial image, and display the generated frame image (for example, Patent Document 1).
- a display device includes a receiving circuit, a display section, and a display driving circuit.
- the receiving circuit is configured to be capable of receiving first image data representing a whole image at a first resolution and second image data representing a first partial image that corresponds to a portion of the whole image and has a second resolution higher than the first resolution.
- the display section has a plurality of pixels.
- the display driving circuit is configured to perform first driving, in which the plurality of pixels are driven in units of a first number of pixels based on the first image data, and second driving, in which, based on the second image data, two or more pixels provided in a region corresponding to the first partial image among the plurality of pixels are driven in units of a second number of pixels smaller than the first number.
- a display system includes an image generation device and a display device.
- the image generation device is configured to be capable of transmitting first image data representing a whole image at a first resolution and second image data representing a first partial image that corresponds to a portion of the whole image and has a second resolution higher than the first resolution.
- a display device includes a receiving circuit, a display section, and a display driving circuit.
- the receiving circuit is configured to be able to receive the first image data and the second image data.
- the display section has a plurality of pixels.
- the display driving circuit is configured to perform first driving, in which the plurality of pixels are driven in units of a first number of pixels based on the first image data, and second driving, in which, based on the second image data, two or more pixels provided in a region corresponding to the first partial image among the plurality of pixels are driven in units of a second number of pixels smaller than the first number.
- a display driving method includes: transmitting first image data representing a whole image at a first resolution and second image data representing a first partial image that corresponds to a portion of the whole image and has a second resolution higher than the first resolution; receiving the first image data and the second image data; performing first driving, in which a plurality of pixels are driven in units of a first number of pixels based on the first image data; and performing second driving, in which, based on the second image data, two or more pixels provided in a region corresponding to the first partial image among the plurality of pixels are driven in units of a second number of pixels smaller than the first number.
- the reception circuit receives the first image data and the second image data.
- the first image data is data representing the entire image of the first resolution.
- the second image data is data representing a first partial image having a second resolution higher than the first resolution and corresponding to a portion of the entire image.
- based on the first image data, first driving is performed to drive the plurality of pixels in units of the first number of pixels.
- based on the second image data, second driving is performed to drive, in units of a second number of pixels smaller than the first number, two or more pixels provided in the region corresponding to the first partial image among the plurality of pixels.
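The two driving modes described above can be sketched in code. This is an illustrative model only, not the patented circuit; the 2x2 block size (first number = 4, second number = 1) and all names are assumptions chosen to match the example in the description.

```python
# Illustrative sketch of the two driving modes (assumed 2x2 blocks):
# first driving writes one low-resolution P1 value per block of pixels,
# second driving overwrites the P2 region pixel by pixel.
def drive_frame(p1, p2, p2_origin, panel_w, panel_h):
    """p1: (panel_h//2) x (panel_w//2) low-res values; p2: high-res values."""
    panel = [[0] * panel_w for _ in range(panel_h)]
    # First driving: each P1 value drives a first number (4) of pixels.
    for y in range(panel_h):
        for x in range(panel_w):
            panel[y][x] = p1[y // 2][x // 2]
    # Second driving: each P2 value drives a second number (1) of pixels
    # inside the region corresponding to the partial image.
    oy, ox = p2_origin
    for y, row in enumerate(p2):
        for x, v in enumerate(row):
            panel[oy + y][ox + x] = v
    return panel
```

For example, `drive_frame([[1, 2], [3, 4]], [[9]], (0, 0), 4, 4)` fills a 4x4 panel in 2x2 blocks and then overwrites only the top-left pixel with the high-resolution value.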
- FIG. 1 is a block diagram showing a configuration example of a display system according to an embodiment of the present disclosure;
- FIG. 2 is an explanatory diagram showing an example of a display image of the head mounted display shown in FIG. 1;
- FIGS. 2A and 2B are explanatory diagrams showing an example of a whole image and a partial image generated by the image generating device shown in FIG. 1;
- FIG. 5 is an explanatory diagram showing an example of image data of a whole image according to a reference example;
- FIG. 4 is an explanatory diagram showing an example of image data of a whole image and a partial image;
- FIG. 2 is an explanatory diagram showing an example of image data transmitted by the image generation device shown in FIG. 1;
- FIG. 2 is a block diagram showing one configuration example of a display panel shown in FIG. 1;
- FIG. 2 is an explanatory diagram showing an example of display operation in the display system shown in FIG. 1;
- FIG. 2 is an explanatory diagram showing an example of display operation in the head mounted display shown in FIG. 1;
- FIG. 4 is an explanatory diagram showing an example of pixel driving;
- FIG. 11 is another explanatory diagram showing an example of driving a pixel;
- FIG. 3 is another explanatory diagram showing an example of the display operation in the head mounted display shown in FIG. 1;
- FIG. 11 is a block diagram showing a configuration example of a display system according to a modification;
- FIG. 13 is an explanatory diagram showing an example of a display image of the head mounted display shown in FIG. 12;
- FIG. 11 is a block diagram showing a configuration example of a display system according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of driving pixels according to another modified example;
- FIG. 11 is a table showing one characteristic example of a display system according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of a whole image and a partial image according to another modified example;
- FIG. 11 is a table showing one characteristic example of a display system according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of a whole image and a partial image according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of image data of a whole image and a partial image according to another modified example;
- FIG. 11 is a block diagram showing a configuration example of a display system according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of driving pixels according to another modified example;
- FIG. 11 is a table showing one characteristic example of a display system according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of display operation in a head mounted display according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of a whole image and a partial image according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of image data of a whole image and a partial image according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of display operation in a head mounted display according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of a whole image and a partial image according to another modified example;
- FIG. 11 is a table showing one characteristic example of a display system according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of image data of a whole image and a partial image according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of display operation in a head mounted display according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of driving pixels according to another modified example;
- FIG. 11 is an explanatory diagram showing an operation example of a display system according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of display operation in a head mounted display according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of display operation in a head mounted display according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of display operation in a head mounted display according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of a display image of a head mounted display according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of a display image of a head mounted display according to another modified example;
- FIG. 10 is an explanatory diagram showing an example of a whole image and a partial image generated by an image generation device according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of image data of a whole image and a partial image according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of display operation in a display system according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of display operation in a head mounted display according to another modified example;
- FIG. 11 is a table showing one characteristic example of a display system according to another modified example;
- FIG. 11 is a block diagram showing a configuration example of a display system according to another modified example;
- FIG. 11 is an explanatory diagram showing an example of image data of a whole image and a partial image according to another modified example;
- FIG. 11 is a block diagram showing a configuration example of a display system according to another modified example;
- FIG. 11 is a table showing an operation example of a display system according to another modified example;
- FIG. 11 is another table showing an operation example of the display system according to another modified example;
- FIG. 11 is another table showing an operation example of the display system according to another modified example;
- FIG. 11 is a block diagram showing a configuration example of a display panel according to another modified example;
- FIG. 44 is a circuit diagram showing a configuration example of a pixel shown in FIG. 43;
- FIG. 44 is a circuit diagram showing another configuration example of the pixel shown in FIG. 43;
- FIG. 44 is a circuit diagram showing another configuration example of the pixel shown in FIG. 43;
- FIG. 44 is a circuit diagram showing another configuration example of the pixel shown in FIG. 43;
- FIG. 44 is a circuit diagram showing another configuration example of the pixel shown in FIG. 43;
- FIG. 44 is a circuit diagram showing another configuration example of the pixel shown in FIG. 43;
- FIG. 44 is a circuit diagram showing another configuration example of the pixel shown in FIG. 43;
- FIG. 44 is a circuit diagram showing another configuration example of the pixel shown in FIG. 43;
- FIG. 44 is a circuit diagram showing another configuration example of the pixel shown in FIG. 43;
- FIG. 44 is a circuit diagram showing another configuration example of the pixel shown in FIG. 43;
- FIG. 11 is a perspective view showing an external configuration of a head mounted display according to an application example;
- FIG. 11 is a perspective view showing an external configuration of another head mounted display according to an application example;
- FIG. 11 is a front view showing the external configuration of a digital still camera according to another application example;
- FIG. 11 is a rear view showing the external configuration of a digital still camera according to another application example;
- FIG. 11 is a rear view showing an external configuration of a television device according to another application example;
- FIG. 11 is a rear view showing the external configuration of a smartphone according to another application example;
- FIG. 11 is an explanatory diagram showing a configuration example of a vehicle according to another application example;
- FIG. 11 is another explanatory diagram showing a configuration example of a vehicle according to another application example.
- FIG. 1 shows a configuration example of a display system (display system 1) according to an embodiment. Since the display device and the display driving method according to the embodiment of the present disclosure are embodied by this embodiment, they are described together.
- the display system 1 includes an image generation device 10 and a head mounted display 20.
- the display system 1 is used for augmented reality (AR) and virtual reality (VR).
- the display system 1 is configured to perform foveated rendering, in which the focused area is drawn with high resolution and the other areas are drawn with low resolution.
- the display system 1 communicates between the image generation device 10 and the head mounted display 20 using an interface such as HDMI (registered trademark) (High-Definition Multimedia Interface) or MIPI (registered trademark) (Mobile Industry Processor Interface). In this example, this communication is performed by wired communication, but it is not limited to this and may be performed by wireless communication.
- the head mounted display 20 displays an image based on the image signal SP transmitted from the image generating device 10.
- An acceleration sensor 22 (described later) of the head-mounted display 20 detects movements such as orientation of the head-mounted display 20 .
- An eye-tracking sensor 23 (described later) of the head-mounted display 20 detects the orientation of the eyes of the user wearing the head-mounted display 20, thereby detecting which part of the displayed image the user is looking at.
- the head-mounted display 20 supplies a detection signal SD containing these detection results to the image generation device 10 .
- the image generation device 10 generates an image (whole image P1) according to the direction of the head mounted display 20 based on the detection result of the acceleration sensor 22.
- the image generation device 10 generates an image (partial image P2) including the portion of the entire image P1 that the user is looking at, based on the detection result of the eye tracking sensor 23 .
- the resolution of the partial image P2 when displayed on the head-mounted display 20 is higher than the resolution of the full image P1.
- the image generation device 10 then transmits to the head mounted display 20 an image signal SP including image data DT1 representing the entire image P1 and image data DT2 representing the partial image P2.
- the image generation device 10 is configured to generate an image to be displayed on the head mounted display 20.
- the image generation device 10 has an image generation circuit 11 , a transmission circuit 12 and a reception circuit 13 .
- the image generation circuit 11 is configured to generate an image to be displayed on the head mounted display 20 by performing predetermined processing such as rendering processing.
- based on the detection result of the acceleration sensor 22 included in the data supplied from the receiving circuit 13, the image generating circuit 11 generates a whole image P1 showing the scenery in the virtual space according to the orientation of the head mounted display 20.
- based on the detection result of the eye tracking sensor 23 included in the data supplied from the reception circuit 13, the image generation circuit 11 generates a partial image P2 showing the part of the whole image viewed by the user.
- FIG. 2 shows an example of a display image P20 displayed on the head-mounted display 20.
- The display image P20 includes an image of a person 9.
- the user wearing the head-mounted display 20 is viewing the image of this person 9 .
- the eye tracking sensor 23 of the head-mounted display 20 detects which part of the display image P20 the user is looking at by detecting the orientation of the user's eyes. Based on the detection result of the eye tracking sensor 23 included in the data supplied from the receiving circuit 13, the image generation circuit 11 determines, out of the entire region R1 of the display image P20, a partial region R2 including the portion viewed by the user. In this example, the size of the partial region R2 in the horizontal direction (the horizontal direction in FIG. 2) is half that of the entire region R1, and its size in the vertical direction is likewise half.
- the image generation circuit 11 generates a full image P1 related to the entire region R1 and a partial image P2 related to the partial region R2.
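The determination of the partial region R2 from the gaze point can be sketched as follows. This is a hypothetical helper, not the patent's algorithm: the centering-on-gaze behavior, clamping to the bounds of R1, and the half-width/half-height sizing (consistent with the example above) are all assumptions.

```python
# Hypothetical helper: choose partial region R2 (half the width and
# height of R1) centered on the detected gaze point, clamped so that
# R2 stays inside the whole region R1.
def partial_region(gaze_x, gaze_y, r1_w, r1_h):
    r2_w, r2_h = r1_w // 2, r1_h // 2
    x0 = min(max(gaze_x - r2_w // 2, 0), r1_w - r2_w)
    y0 = min(max(gaze_y - r2_h // 2, 0), r1_h - r2_h)
    return x0, y0, r2_w, r2_h
```

A gaze point in the center of a 100x100 region yields `(25, 25, 50, 50)`; a gaze point in a corner yields a region clamped flush against that corner.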
- FIG. 3 shows an example of the entire image P1 and the partial image P2 generated by the image generation circuit 11.
- the entire image P1 is a low-resolution image of the entire region R1 (FIG. 2).
- the partial image P2 is a high-resolution image of the partial region R2.
- the partial image P2 is a high-resolution image of the image of the partial area R2 in the whole image P1.
- each pixel in the full image P1 corresponds to four pixels PIX (described later) in the head mounted display 20, and each pixel in the partial image P2 corresponds to one pixel PIX in the head mounted display 20.
- the number of pixels in the full image P1 and the number of pixels in the partial image P2 are made equal to each other.
- Each of the full image P1 and the partial image P2 is also called a sub-frame image.
- FIG. 4A schematically shows image data when displaying only the entire image without performing foveated rendering.
- This is the image data of a high-resolution whole image, used when writing pixel values to the plurality of pixels PIX in the head-mounted display 20.
- FIG. 4B schematically represents image data of a full image P1 and a partial image P2 according to the present technology.
- each pixel in the entire image P1 corresponds to four pixels PIX.
- the number of pixels in the horizontal direction (the number of horizontal pixels) of the whole image P1 is 50% of the number of horizontal pixels of the high-resolution whole image, and the number of pixels in the vertical direction (the number of vertical pixels) of the whole image P1 is 50% of the number of vertical pixels of the high-resolution whole image. That is, the horizontal pixel number ratio of the whole image P1 is 50%, and the vertical pixel number ratio of the whole image P1 is 50%. Therefore, the image data amount of the whole image P1 is 1/4 of the image data amount of the high-resolution whole image.
- the area of the partial region R2 is 1/4 of the area of the entire region R1.
- the number of horizontal pixels of the partial image P2 is 50% of the number of horizontal pixels of the high-resolution whole image, and the number of vertical pixels of the partial image P2 is 50% of the number of vertical pixels of the high-resolution whole image. That is, the horizontal pixel number ratio of the partial image P2 is 50%, and the vertical pixel number ratio of the partial image P2 is 50%. Therefore, the image data amount of the partial image P2 is 1/4 of the image data amount of the high-resolution whole image.
- the horizontal pixel number ratios of the entire image P1 and the partial image P2 are equal to each other, and the vertical pixel number ratios of the entire image P1 and the partial image P2 are equal to each other. Also, the total data amount of the entire image P1 and the partial image P2 is half the data amount of the high-resolution entire image.
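The data-amount arithmetic above can be checked with a worked example. The 2000x2000 panel size below is purely illustrative; the patent does not specify concrete dimensions.

```python
# Worked example of the data-amount reasoning (assumed dimensions).
full_w, full_h = 2000, 2000          # high-resolution whole image
full_pixels = full_w * full_h

# Whole image P1: 50% horizontal and 50% vertical pixel-number ratio.
p1_pixels = (full_w // 2) * (full_h // 2)

# Partial image P2: region R2 is 1/4 the area of R1, rendered at full
# resolution, so it also has 50% x 50% of the high-resolution pixel count.
p2_pixels = (full_w // 2) * (full_h // 2)

assert p1_pixels == full_pixels // 4
assert p2_pixels == full_pixels // 4
# Transmitting P1 + P2 costs half the data of the high-resolution image.
assert p1_pixels + p2_pixels == full_pixels // 2
```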
- the image generation circuit 11 generates such a full image P1 and a partial image P2. Then, the image generation circuit 11 supplies the generated image data of the entire image P1, the image data of the partial image P2, and the data on the position of the partial image P2 in the entire image P1 to the transmission circuit 12.
- the transmission circuit 12 ( FIG. 1 ) is configured to generate an image signal SP based on the data supplied from the image generation circuit 11 and transmit this image signal SP to the head mounted display 20 . Specifically, based on the image data of the entire image P1, the transmission circuit 12 generates the image data DT1 representing the entire image P1, and the image data of the partial image P2 and the data about the position of the partial image P2. Based on this, image data DT2 representing this partial image P2 is generated. Then, the transmission circuit 12 transmits the image signal SP including the image data DT1 and the image data DT2 to the head mounted display 20.
- FIG. 5 schematically shows an example of the image signal SP.
- the transmission circuit 12 transmits the image data DT1 and the image data DT2 in a time division manner. Specifically, the transmission circuit 12 alternately transmits image data DT1 representing the entire image P1 and image data DT2 representing the partial image P2.
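The alternating transmission can be sketched as a simple framing model. This is an assumed abstraction of the time-division scheme in FIG. 5, not the actual signal format; the function name and tuple framing are illustrative.

```python
# Minimal sketch (assumed framing) of time-division transmission:
# the transmitter alternates whole-image data DT1 and partial-image
# data DT2 on a single link, DT1 first in each frame period.
def interleave_subframes(dt1_frames, dt2_frames):
    """Alternate DT1 and DT2 sub-frames into one transmit stream."""
    stream = []
    for dt1, dt2 in zip(dt1_frames, dt2_frames):
        stream.append(("DT1", dt1))
        stream.append(("DT2", dt2))
    return stream

stream = interleave_subframes(["P1_f0", "P1_f1"], ["P2_f0", "P2_f1"])
# → [("DT1", "P1_f0"), ("DT2", "P2_f0"), ("DT1", "P1_f1"), ("DT2", "P2_f1")]
```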
- the receiving circuit 13 ( FIG. 1 ) is configured to receive the detection signal SD transmitted from the head mounted display 20 .
- the receiving circuit 13 supplies the image generating circuit 11 with the data on the detection result of the acceleration sensor 22 and the detection result of the eye tracking sensor 23 included in the detection signal SD.
- the head mounted display 20 has a receiving circuit 21, an acceleration sensor 22, an eye tracking sensor 23, a processor 24, a transmitting circuit 25, a display controller 26, and a display panel 27.
- the receiving circuit 21 is configured to receive the image signal SP transmitted from the image generating device 10 .
- the receiving circuit 21 supplies the processor 24 with the image data DT1 representing the entire image P1 and the image data DT2 representing the partial image P2 included in the image signal SP.
- the acceleration sensor 22 is configured to detect movements such as the orientation of the head mounted display 20 .
- A 6-axis inertial sensor, for example, can be used as the acceleration sensor 22.
- the display system 1 can generate the entire image P1 according to the orientation of the head mounted display 20 in the virtual space.
- the eye tracking sensor 23 is configured to detect the orientation of the eyes of the user wearing the head mounted display 20 .
- the display system 1 can detect which part of the display image the user is looking at, and can generate a high-resolution partial image P2 including the part the user is looking at.
- the processor 24 is configured to control the operation of the head mounted display 20, and includes, for example, a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit). Specifically, the processor 24 performs predetermined image processing based on, for example, the image data DT1 and DT2 supplied from the receiving circuit 21, and supplies the image data of the entire image P1 included in the image data DT1, the image data of the partial image P2 included in the image data DT2, and the data about the position of the partial image P2 included in the image data DT2 to the display controller 26. The processor 24 also supplies the detection result of the acceleration sensor 22 and the detection result of the eye tracking sensor 23 to the transmission circuit 25 and causes the transmission circuit 25 to transmit these detection results.
- the transmission circuit 25 is configured to transmit the detection signal SD including the detection result of the acceleration sensor 22 supplied from the processor 24 and the detection result of the eye tracking sensor 23 to the image generation device 10 .
- the display controller 26 is configured to control the operation of the display panel 27 based on the image data of the full image P1, the image data of the partial image P2, and the data about the position of the partial image P2 supplied from the processor 24.
- the display panel 27 is configured to display an image based on control by the display controller 26.
- the display panel 27 is an organic EL (Electro Luminescence) display panel in this example. Note that the display panel 27 is not limited to this, and may be, for example, a liquid crystal display panel.
- FIG. 6 shows a configuration example of the display panel 27.
- the display panel 27 has a pixel array 31 , a pixel signal generation circuit 32 and a scanning circuit 33 .
- the pixel array 31 has multiple signal lines SGL, multiple control lines CTL, and multiple pixels PIX.
- the plurality of signal lines SGL extend in the vertical direction (vertical direction in FIG. 6) and are arranged in parallel in the horizontal direction (horizontal direction in FIG. 6).
- Each of the plurality of signal lines SGL supplies pixel signals generated by the pixel signal generating circuit 32 to the pixels PIX.
- the plurality of control lines CTL extend in the horizontal direction (horizontal direction in FIG. 6) and are arranged in parallel in the vertical direction (vertical direction in FIG. 6). Each of the plurality of control lines CTL supplies control signals generated by the scanning circuit 33 to the pixels PIX.
- a plurality of pixels PIX are arranged in a matrix in the pixel array 31 .
- Each of the plurality of pixels PIX is controlled based on the control signal supplied via the control line CTL, and the pixel signal supplied via the signal line SGL is written. Thereby, each of the plurality of pixels PIX is configured to emit light with luminance according to the written pixel signal.
- a row of pixels PIX arranged in the horizontal direction forms a pixel line L, as shown in FIG. 6.
- the pixel signal generation circuit 32 is configured to generate pixel signals based on image data to be displayed and apply the generated pixel signals to each of the plurality of signal lines SGL.
- the scanning circuit 33 is configured to generate control signals and apply the generated control signals to each of the plurality of control lines CTL, thereby scanning the plurality of pixels PIX in scanning units of one or a plurality of pixel lines L.
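- As an illustration of the write mechanism described above, the following Python sketch models a pixel array in which the scanning circuit selects one or more pixel lines L as a scanning unit and all selected lines latch the values currently applied to the signal lines SGL. The class and function names are our own, not from the patent.

```python
# Minimal, illustrative model of a pixel array driven line-sequentially.
# The scanning circuit selects a scanning unit of `unit` pixel lines L;
# every selected line latches the values applied to the signal lines SGL.

class PixelArray:
    def __init__(self, width, height):
        self.width, self.height = width, height
        # One stored pixel signal per pixel PIX, addressed [row][col].
        self.pix = [[0] * width for _ in range(height)]

    def scan_write(self, first_line, unit, signal_values):
        """Write `signal_values` (one per signal line SGL) into the
        `unit` pixel lines starting at `first_line`."""
        for row in range(first_line, first_line + unit):
            for col in range(self.width):
                self.pix[row][col] = signal_values[col]

panel = PixelArray(width=8, height=8)
# Scan with two pixel lines L as the scanning unit: both selected
# lines receive the same pixel signals in a single scan step.
panel.scan_write(first_line=0, unit=2, signal_values=[7] * 8)
print(panel.pix[0][3], panel.pix[1][3], panel.pix[2][3])  # 7 7 0
```

- Because one scan step fills two pixel lines L at once, half as many scan steps cover the panel, which is the mechanism behind the halved operating frequency discussed later.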
- the receiving circuit 21 corresponds to a specific example of "receiving circuit” in the present disclosure.
- the entire image P1 corresponds to a specific example of the "whole image” in the present disclosure.
- the image data DT1 corresponds to a specific example of "first image data” in the present disclosure.
- the partial image P2 corresponds to a specific example of "partial image” in the present disclosure.
- the image data DT2 corresponds to a specific example of "second image data” in the present disclosure.
- the pixel array 31 corresponds to a specific example of the "display section” in the present disclosure.
- the display controller 26, the pixel signal generation circuit 32, and the scanning circuit 33 correspond to a specific example of "display driving circuit” in the present disclosure.
- the eye tracking sensor 23 corresponds to a specific example of "first sensor” in the present disclosure.
- the acceleration sensor 22 corresponds to a specific example of “second sensor” in the present disclosure.
- the transmission circuit 25 corresponds to a specific example of "transmission circuit” in the present disclosure.
- the receiving circuit 13 of the image generation device 10 receives the detection signal SD transmitted from the head-mounted display 20, and supplies the data on the detection result of the acceleration sensor 22 and the detection result of the eye tracking sensor 23 included in this detection signal SD to the image generation circuit 11.
- the image generating circuit 11 generates the entire image P1 showing the scenery in the virtual space according to the orientation of the head mounted display 20, based on the detection result of the acceleration sensor 22 included in the data supplied from the receiving circuit 13. Further, based on the detection result of the eye tracking sensor 23 included in that data, the image generation circuit 11 generates the partial image P2 showing the part of the scenery viewed by the user.
- the transmission circuit 12 generates image data DT1 representing the entire image P1 based on the image data of the entire image P1, and generates image data DT2 representing the partial image P2 based on the image data of the partial image P2 and the data about the position of the partial image P2. Then, the transmission circuit 12 transmits the image signal SP including the image data DT1 and the image data DT2 to the head mounted display 20.
- the receiving circuit 21 of the head-mounted display 20 receives the image signal SP transmitted from the image generating device 10, and provides the image data DT1 representing the entire image P1 and the image data DT2 representing the partial image P2 included in this image signal SP to the processor 24.
- the processor 24 performs predetermined image processing based on the image data DT1 and DT2 supplied from the receiving circuit 21, and supplies the image data of the entire image P1 included in the image data DT1, the image data of the partial image P2 included in the image data DT2, and the data about the position of the partial image P2 to the display controller 26.
- the display controller 26 controls the operation of the display panel 27 based on the image data of the full image P1, the image data of the partial image P2, and the data on the position of the partial image P2 supplied from the processor 24.
- The display panel 27 displays images under the control of the display controller 26.
- the acceleration sensor 22 detects the movement of the head mounted display 20, such as a change in its orientation.
- the eye tracking sensor 23 detects the orientation of the eyes of the user wearing the head mounted display 20 .
- the processor 24 supplies the detection result of the acceleration sensor 22 and the detection result of the eye tracking sensor 23 to the transmission circuit 25 .
- the transmission circuit 25 transmits a detection signal SD including the detection result of the acceleration sensor 22 and the detection result of the eye tracking sensor 23 supplied from the processor 24 to the image generation device 10 .
- the head mounted display 20 generates a display image P20 based on the image data DT1 and the image data DT2 supplied in a time division manner.
- FIG. 7 shows an example of the display operation in the head mounted display 20.
- The head-mounted display 20 alternately receives the image data DT1 representing the entire image P1 and the image data DT2 representing the partial image P2.
- the display controller 26 controls the display panel 27 so that the plurality of pixels PIX are driven in units of four pixels PIX, based on the image data of the entire image P1 included in the image data DT1. Thereby, the display panel 27 displays the display image P21 including the low-resolution whole image P1.
- the display controller 26 controls the display panel 27, based on the image data of the partial image P2 and the data on the position of the partial image P2 included in the image data DT2, so that the plurality of pixels PIX arranged in the region corresponding to the partial image P2 are driven in units of one pixel PIX. Thereby, the display panel 27 displays the display image P22 including the high-resolution partial image P2.
- the head-mounted display 20 repeats the operation when receiving the image data DT1 and the operation when receiving the image data DT2.
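- The alternating drive described above can be sketched as follows. This is a simplified model under our own naming, not the patent's implementation: DT1 drives the panel in units of 2×2 pixels PIX to produce the display image P21, and DT2 then rewrites only the region of the partial image P2 in units of one pixel PIX to produce the display image P22.

```python
# Sketch of the two drive modes. `panel` stands in for the stored
# pixel signals of the display panel 27.

def drive_whole(panel, whole):
    # Each value of the low-resolution whole image P1 is written to a
    # 2x2 block of pixels PIX (four pixels driven as one unit).
    for r, row in enumerate(whole):
        for c, v in enumerate(row):
            for dr in (0, 1):
                for dc in (0, 1):
                    panel[2 * r + dr][2 * c + dc] = v

def drive_partial(panel, partial, top, left):
    # Only the region of the partial image P2 is rewritten, one pixel
    # PIX at a time; pixels outside the region keep the P21 values.
    for r, row in enumerate(partial):
        for c, v in enumerate(row):
            panel[top + r][left + c] = v

panel = [[0] * 4 for _ in range(4)]
drive_whole(panel, [[1, 2], [3, 4]])          # display image P21
drive_partial(panel, [[9, 9], [9, 9]], 0, 2)  # display image P22
print(panel[0])  # [1, 1, 9, 9]
print(panel[3])  # [3, 3, 4, 4]
```

- Note that no frame memory holding a composited image is needed: the panel's own pixels retain the P21 values outside the rewritten region, mirroring the point made later about omitting the frame memory.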
- FIG. 8 shows an example of a more detailed display operation in the head-mounted display 20, in which (A) shows the waveform of the synchronization signal Vsync, (B) shows the input image, (C) shows the operation of the display panel 27, and (D) shows a display image displayed on the display panel 27.
- a pair of the full image P1 and the partial image P2 is supplied with a period T.
- Each of the full image P1 and the partial image P2 is supplied at a period Ts.
- At timing t11, a pulse of the synchronization signal Vsync is generated (FIG. 8(A)).
- the receiving circuit 21 of the head mounted display 20 receives image data DT1 representing the entire image P1 ((B) in FIG. 8). Since the input image is the entire image P1, the display controller 26 controls the plurality of pixels PIX in the display panel 27 to drive four pixels PIX as a unit based on the image data of the entire image P1.
- FIG. 9 shows the operation of driving the pixel PIX.
- the scanning circuit 33 scans the plurality of pixels PIX using two pixel lines L as a scanning unit US.
- the pixel signal generation circuit 32 applies the same pixel signal to two signal lines SGL adjacent to each other. As a result, the same pixel signal is written to the four pixels PIX in the two pixel lines L selected. In this manner, the display panel 27 drives a plurality of pixels PIX with four pixels PIX as a unit UD.
- the scanning circuit 33 sequentially scans two pixel lines L from the bottom to the top of the pixel array 31 in this example.
- the operating frequency can be halved and the power consumption can be reduced as compared with the case where the sequential scanning is performed with one pixel line L as the scanning unit.
- the pixel PIX to which the pixel signal is written emits light for a predetermined period after the pixel signal is written in this example.
- the display panel 27 displays the display image P21 (FIG. 8(D)).
- At timing t12, a pulse of the synchronization signal Vsync is generated (FIG. 8(A)).
- the receiving circuit 21 of the head mounted display 20 receives the image data DT2 representing the partial image P2 ((B) in FIG. 8). Since the input image is the partial image P2, the display controller 26 controls the display panel 27, based on the image data of this partial image P2 and the data about the position of this partial image P2, so that among the plurality of pixels PIX, the plurality of pixels PIX arranged in the area corresponding to the partial image P2 are driven in units of one pixel PIX.
- FIG. 10 shows the operation of driving the pixel PIX.
- the scanning circuit 33 scans a plurality of pixels PIX using one pixel line L as a scanning unit US. Further, the pixel signal generation circuit 32 applies a plurality of pixel signals to a plurality of signal lines SGL related to a region corresponding to the partial image P2 among the plurality of signal lines SGL. As a result, in one selected pixel line L, a plurality of pixel signals are written to a plurality of pixels PIX related to the area corresponding to the partial image P2. On the other hand, pixel signals are not written to the plurality of pixels PIX related to areas other than the area corresponding to the partial image P2. In this manner, the display panel 27 drives a plurality of pixels PIX with one pixel PIX as a unit UD.
- the scanning circuit 33 sequentially scans the area corresponding to the partial image P2 in the pixel array 31 with one pixel line L as the scanning unit US.
- the scanning speed is half the scanning speed in the period from timing t11 to t12 because the sequential scanning is performed with one pixel line L as the scanning unit US in this period.
- the operating frequency can be halved and the power consumption can be reduced as compared with the case of performing sequential scanning from the bottom to the top of the display panel 27, for example.
- In this example, the pixels PIX to which the pixel signals are written emit light for a predetermined period after the pixel signals are written. Further, the pixels PIX in the pixel lines L near the bottom, to which no pixel signals were written, emit light in the same period as the pixels PIX to which the pixel signals were written first, and the pixels PIX in the pixel lines L near the top, to which no pixel signals were written, emit light in the same period as the pixels PIX to which the pixel signals were written last. Thus, the display panel 27 displays the display image P22 (FIG. 8(D)). Of the display image P22, the image in the area other than the partial region R2 is written during the period from timing t11 to t12, and the image in the partial region R2 is written during the period from timing t12 to t13.
- the head mounted display 20 displays an image based on the pair of the entire image P1 and partial image P2 included in the image data DT1 and image data DT2.
- the head-mounted display 20 first displays the display image P21 based on the full image P1, and then redisplays the image of the area corresponding to the partial image P2 in the display image P21 based on the partial image P2, thereby displaying the display image P22.
- By observing the display image P21, the user grasps the entire image, and by observing the display image P22, the user grasps the details of the image in the partial region R2. From the viewpoint of latency, the timing at which the user grasps the entire image is important.
- the latency of the head-mounted display 20 is, for example, the time ⁇ t from the timing t11 at which the input of the image data DT1 is started until the pixel PIX at the central position in the vertical direction of the display panel 27 starts emitting light.
- the display system 1 repeats the operation from timing t11 to t13 after this.
- the receiving circuit 21 of the head mounted display 20 receives image data DT1 representing the entire image P1 ((B) in FIG. 8). Based on this image data DT1, the head-mounted display 20 displays the display image P21 (FIGS. 8(C) and 8(D)) in the same manner as during the period from timing t11 to t12.
- the receiving circuit 21 of the head mounted display 20 receives image data DT2 representing the partial image P2 ((B) in FIG. 8). Based on this image data DT2, the head-mounted display 20 displays the display image P22 (FIGS. 8(C) and 8(D)) in the same manner as during the period from timing t12 to t13.
- FIG. 11 shows an example of another display operation in the head mounted display 20.
- In this example, the light emission operation indicated by hatching in FIG. 11(C) is different from that in the example of FIG. 8. That is, in the example of FIG. 8, the display panel 27 emits light according to the timing of line-sequential scanning, but in this example, the pixels PIX in the entire area emit light at the same timing.
- the receiving circuit 21 of the head mounted display 20 receives image data DT1 representing the entire image P1 (FIG. 11(B)). Since the input image is the entire image P1, the display controller 26 controls the plurality of pixels PIX in the display panel 27 to drive four pixels PIX as a unit based on the image data of the entire image P1.
- the scanning circuit 33 sequentially scans two pixel lines L from the bottom to the top of the pixel array 31 as a scanning unit US (FIG. 11(C)). Then, as indicated by hatching in FIG. 11C, at the timing when the pixel signals are written to all the pixels PIX in the display panel 27, the plurality of pixels PIX emit light at the same timing over a predetermined period. Thus, the display panel 27 displays the display image P21 (FIG. 11(D)).
- the receiving circuit 21 of the head mounted display 20 receives image data DT2 representing the partial image P2 (FIG. 11(B)). Since the input image is the partial image P2, the display controller 26 selects a portion of the plurality of pixels PIX on the display panel 27 based on the image data of this partial image P2 and the data about the position of this partial image P2. A plurality of pixels PIX arranged in an area corresponding to the image P2 are controlled to be driven in units of one pixel PIX.
- the scanning circuit 33 sequentially scans a plurality of pixels PIX related to a region corresponding to the partial image P2 among the plurality of pixels PIX, using one pixel line L as a scanning unit US (FIG. 11(C)). Then, as indicated by hatching in FIG. 11C, after the timing at which the pixel signals are written to all the pixels PIX in the display panel 27, the plurality of pixels PIX emit light at the same timing over a predetermined period. Thus, the display panel 27 displays the display image P22 (FIG. 11(D)).
- the head mounted display 20 displays an image based on the pair of the entire image P1 and partial image P2 included in the image data DT1 and image data DT2.
- the latency of the head-mounted display 20 is, for example, the time Δt from the timing t11 when the input of the image data DT1 is started until half of the display panel 27 is scanned.
- In this manner, the display system 1 receives the first image data (image data DT1) representing the low-resolution whole image P1 and the second image data (image data DT2) representing the high-resolution partial image P2. Then, based on the first image data (image data DT1), a first drive is performed in which the plurality of pixels PIX are driven in units of four pixels PIX, and based on the second image data (image data DT2), a second drive is performed in which two or more pixels provided in a region corresponding to the partial image P2 among the plurality of pixels PIX are driven in units of one pixel PIX.
- the display system 1 first displays the display image P21 based on the entire image P1, and then redisplays the image of the area corresponding to the partial image P2 in the display image P21 based on the partial image P2, so that the display image P22 can be displayed.
- By observing the display image P21, the user grasps the entire image, and by observing the display image P22, the user grasps the details of the image in the partial region R2.
- the display system 1 can improve image quality.
- In the display system 1, based on the second image data (image data DT2), the second driving is performed in which two or more pixels provided in the region corresponding to the partial image P2 among the plurality of pixels PIX are driven in units of one pixel PIX.
- the display system 1 can display, for example, the part viewed by the user with high resolution, so that the image quality can be improved.
- In the display system 1, the first image data (image data DT1) representing the low-resolution whole image P1 and the second image data (image data DT2) representing the high-resolution partial image P2 corresponding to a part of the whole image P1 are received.
- Thereby, the image data amount of the image data DT1 and the image data amount of the image data DT2 can each be made smaller than the image data amount of the high-resolution whole image.
- the transmission band in signal transmission of the image signal SP from the image generation device 10 to the head mounted display 20 can be reduced.
- the transmission band can be reduced in this way in the display system 1, the image data DT1 and the image data DT2 can be transmitted in a short time. As a result, the display system 1 can increase the frame rate, thereby improving the image quality.
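- The data-amount argument above can be checked with a short calculation. The 50% horizontal and vertical pixel-number ratios are the ones used in this embodiment; the concrete frame resolution below is only a hypothetical example.

```python
# Pixel-count comparison: the whole image P1 and the partial image P2
# each use 50% of the horizontal and 50% of the vertical pixel counts
# of a high-resolution frame, so each carries 1/4 of its pixels.

W, H = 1920, 1080            # hypothetical full-resolution frame size
full = W * H
dt1 = (W // 2) * (H // 2)    # low-resolution whole image P1
dt2 = (W // 2) * (H // 2)    # high-resolution partial image P2

print(dt1 / full, dt2 / full)   # 0.25 0.25
print((dt1 + dt2) / full)       # 0.5
```

- Each of DT1 and DT2 is thus a quarter of a full frame, and even the pair together is only half a full frame, which is why the transmission band can be reduced and the frame rate raised.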
- In the display system 1, the first driving, in which the plurality of pixels PIX are driven in units of four pixels PIX based on the first image data (image data DT1), and the second driving, in which two or more pixels provided in the region corresponding to the partial image P2 among the plurality of pixels PIX are driven in units of one pixel PIX based on the second image data (image data DT2), are performed. Accordingly, the display system 1 does not necessarily require a frame memory, unlike the case where the whole image P1 and the partial image P2 are synthesized in advance and the synthesized image is displayed. When the frame memory is omitted in this way, for example, the circuit configuration can be simplified and the cost can be reduced.
- As described above, in the present embodiment, the first image data indicating the low-resolution overall image and the second image data indicating the high-resolution partial image corresponding to a part of the overall image are received. Then, based on the first image data, the first driving is performed in which the plurality of pixels are driven in units of four pixels, and based on the second image data, the second driving is performed in which two or more pixels provided in the region corresponding to the partial image among the plurality of pixels are driven in units of one pixel. Thereby, image quality can be improved.
- In the present embodiment, the second driving is performed in which two or more pixels provided in a region corresponding to a partial image among a plurality of pixels are driven in units of one pixel based on second image data, so the image quality can be improved.
- the first image data indicating the low-resolution whole image and the second image data indicating the high-resolution partial image corresponding to a part of the whole image are received.
- the image generating circuit 11 generates the partial image P2 based on the detection result of the eye tracking sensor 23, but it is not limited to this. This modification will be described in detail below.
- FIG. 12 shows a configuration example of the display system 1A.
- the display system 1A includes an image generation device 10A and a head mounted display 20A.
- the image generation device 10A has an image generation circuit 11A.
- the image generating circuit 11A generates the entire image P1 showing the scenery in the virtual space according to the orientation of the head mounted display 20, based on the detection result of the acceleration sensor 22 included in the data supplied from the receiving circuit 13. Further, the image generation circuit 11A generates a partial image P2 including a portion where the image changes in the entire image P1.
- FIG. 13 shows an example of the entire image P1.
- the image of person 9 is moving within whole image P1.
- the image generation circuit 11A determines a partial region R2 containing a changing image based on the entire image P1. Then, the image generating circuit 11A generates a partial image P2 related to the partial area R2. That is, since the user is likely to see the changing portion of the displayed image, the image generating circuit 11A generates the partial image P2 for the changing portion. Then, the image generation circuit 11A supplies the generated image data of the entire image P1, the image data of the partial image P2, and the data on the position of the partial image P2 in the entire image P1 to the transmission circuit 12.
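- The text does not fix how the image generation circuit 11A finds the changing portion; one minimal, assumed approach is to take the bounding box of the pixels that differ between two consecutive whole images P1, as sketched below (the function name and frame representation are our own).

```python
# Choose the partial region R2 as the bounding box of the pixels that
# changed between the previous and current whole images P1.

def changed_region(prev, curr):
    """Return (top, left, bottom, right) bounding the changed pixels,
    or None when the two frames are identical."""
    rows = [r for r in range(len(curr)) if prev[r] != curr[r]]
    if not rows:
        return None
    cols = [c for c in range(len(curr[0]))
            for r in rows if prev[r][c] != curr[r][c]]
    return rows[0], min(cols), rows[-1], max(cols)

prev = [[0] * 4 for _ in range(4)]
curr = [row[:] for row in prev]
curr[1][2] = 5   # e.g. the image of person 9 moved into these pixels
curr[2][1] = 5
print(changed_region(prev, curr))  # (1, 1, 2, 2)
```

- The partial image P2 would then be the high-resolution content of this box, transmitted together with its position, exactly as in the main embodiment.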
- the head mounted display 20A (FIG. 12) has a receiving circuit 21, an acceleration sensor 22, a processor 24A, a transmitting circuit 25, a display controller 26, and a display panel 27. That is, head mounted display 20A according to the present modification is obtained by omitting eye tracking sensor 23 and replacing processor 24 with processor 24A in head mounted display 20 (FIG. 1) according to the above-described embodiment.
- the processor 24A, for example, performs predetermined image processing based on the image data DT1 and DT2 supplied from the receiving circuit 21, and supplies the image data of the entire image P1 included in the image data DT1, the image data of the partial image P2 included in the image data DT2, and the data on the position of the partial image P2 included in the image data DT2 to the display controller 26.
- the processor 24A also supplies the detection result of the acceleration sensor 22 to the transmission circuit 25 and causes the transmission circuit 25 to transmit this detection result.
- the present invention is not limited to this; for example, so-called video see-through may be realized by also displaying an image captured by the image sensor 28B.
- This display system 1B includes a head mounted display 20B.
- the head mounted display 20B has an image sensor 28B and a processor 24B.
- the image sensor 28B is configured, for example, to capture an image in front of the user wearing the head mounted display 20B.
- the processor 24B performs predetermined image processing based on the image data DT1 and DT2 supplied from the receiving circuit 21 and the captured image generated by the image sensor 28B, and supplies the image data of the entire image, the image data of the partial image, and the data about the position of the partial image to the display controller 26.
- In the above embodiment, the plurality of pixels PIX in the display panel 27 are driven in units of four pixels PIX, but the present invention is not limited to this; various numbers of pixels PIX can be driven as a unit.
- For example, two (2×1) pixels PIX arranged in the horizontal direction may be driven as a unit, eight (4×2) pixels PIX including four pixels PIX in the horizontal direction and two pixels PIX in the vertical direction may be driven as a unit, or 32 (8×4) pixels PIX including eight pixels PIX in the horizontal direction and four pixels PIX in the vertical direction may be driven as a unit.
- Also, two (1×2) pixels PIX arranged in the vertical direction may be driven as a unit, eight (2×4) pixels PIX including two pixels PIX in the horizontal direction and four pixels PIX in the vertical direction may be driven as a unit, or 32 (4×8) pixels PIX including four pixels PIX in the horizontal direction and eight pixels PIX in the vertical direction may be driven as a unit.
- Further, four (2×2) pixels PIX including two pixels PIX in the horizontal direction and two pixels PIX in the vertical direction may be driven as a unit, 16 (4×4) pixels PIX including four pixels PIX in the horizontal direction and four pixels PIX in the vertical direction may be driven as a unit, or 64 (8×8) pixels PIX including eight pixels PIX in the horizontal direction and eight pixels PIX in the vertical direction may be driven as a unit.
- In these examples, an even number of pixels PIX is driven as a unit, but the invention is not limited to this; for example, an odd number of pixels PIX may be driven as a unit.
- Assuming that the band usage rate is 100% when one pixel PIX is driven as a unit, when four (2×2) pixels PIX are driven as a unit as in the above embodiment, the number of pixels in the entire image P1 is reduced to 1/4, so the band usage rate is 25%. Further, for example, when 16 (4×4) pixels PIX are driven as a unit, the number of pixels in the entire image P1 is reduced to 1/16, so the band usage rate is 6.25%.
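- The band usage rates quoted above follow directly from the size of the driving unit, as this small helper (our own naming) shows.

```python
# Relative to driving every pixel PIX individually (100%), driving in
# units of w x h pixels PIX reduces the number of transmitted pixels
# of the whole image P1 to 1/(w*h).

def band_usage(w, h):
    """Band usage rate in percent for a w x h driving unit."""
    return 100.0 / (w * h)

print(band_usage(1, 1))   # 100.0
print(band_usage(2, 2))   # 25.0
print(band_usage(4, 4))   # 6.25
```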
- By observing the display image P21, the user grasps the entire image, and by observing the display image P22, the user grasps the details of the image in the focused partial region R2.
- In the display system according to this modification, for example, by appropriately setting the number of pixels PIX used as the driving unit, it is possible to reduce the transmission band while suppressing deterioration in image quality.
- the horizontal pixel number ratio of the partial image P2 is set to 50%, and the vertical pixel number ratio of the partial image P2 is set to 50%.
- the present invention is not limited to this.
- the horizontal pixel number ratio and the vertical pixel number ratio of the partial image P2 can be set to various values.
- the horizontal pixel number ratio of the partial image P2 may be larger than the vertical pixel number ratio, considering that the human field of vision is wide in the horizontal direction.
- FIGS. 19 to 21 show examples in which the horizontal pixel number ratio of the partial image P2 is 100% and the vertical pixel number ratio is 50%.
- FIG. 20 schematically shows the image data of the entire image P1 and the partial image P2, and FIG. 21 shows an example of the display operation.
- In this example, the number of pixels in the horizontal direction (number of horizontal pixels) of the partial image P2 is 100% of the number of horizontal pixels of the high-resolution whole image, and the number of pixels in the vertical direction (number of vertical pixels) of the partial image P2 is 50% of the number of vertical pixels of the high-resolution whole image.
- the whole image P1 is the same as in the above embodiment (FIG. 4B).
- During the period from timing t31 to t32, the receiving circuit 21 of the head mounted display 20 receives the image data DT1 representing the entire image P1 as shown in FIG. 20 ((B) in FIG. 21). Based on this image data DT1, the head-mounted display 20 displays a display image P21 (FIGS. 21(C) and (D)) in the same manner as in the above embodiment (FIGS. 8 and 9). Further, during the period from timing t32 to t33, the receiving circuit 21 receives image data DT2 representing the partial image P2 as shown in FIG. 20 ((B) in FIG. 21). Based on this image data DT2, the head mounted display 20 displays the display image P22 (FIGS. 21(C) and (D)) in the same manner as in the above embodiment (FIGS. 8 and 10).
- image data DT1 representing the entire image P1 is transmitted during the period from timing t31 to t32, and image data DT2 representing the partial image P2 is transmitted during the period from timing t32 to t33.
- the image data amount of the partial image P2 is larger than the image data amount of the entire image P1, so the transmission band is determined by the image data DT2 representing the partial image P2. Therefore, when the horizontal pixel number ratio of the partial image P2 is 100% and the vertical pixel number ratio is 50%, the band usage rate is 100% as shown in FIG.
- FIGS. 22 to 24 show examples in which the partial image P2 has a horizontal pixel number ratio of 75% and a vertical pixel number ratio of 50%.
- FIG. 23 schematically shows the image data of the entire image P1 and the partial image P2, and FIG. 24 shows an example of the display operation.
- In this example, the number of pixels in the horizontal direction (number of horizontal pixels) of the partial image P2 is 75% of the number of horizontal pixels of the high-resolution whole image, and the number of pixels in the vertical direction (number of vertical pixels) of the partial image P2 is 50% of the number of vertical pixels of the high-resolution whole image.
- the whole image P1 is the same as in the above embodiment (FIG. 4B).
- During the period from timing t41 to t42, the receiving circuit 21 of the head mounted display 20 receives the image data DT1 representing the entire image P1 as shown in FIG. 23 ((B) in FIG. 24). Based on this image data DT1, the head mounted display 20 displays a display image P21 (FIGS. 24(C) and (D)) in the same manner as in the above embodiment (FIGS. 8 and 9). Further, during the period from timing t42 to t43, the receiving circuit 21 receives image data DT2 representing the partial image P2 as shown in FIG. 23 ((B) in FIG. 24). Based on this image data DT2, the head-mounted display 20 displays a display image P22 (FIGS. 24(C) and (D)) in the same manner as in the above embodiment (FIGS. 8 and 10).
- the image data amount of the partial image P2 is larger than the image data amount of the entire image P1, so the transmission band is determined by the image data DT2 representing the partial image P2. Therefore, when the horizontal pixel number ratio of the partial image P2 is 75% and the vertical pixel number ratio is 50%, the band usage rate is 75% as shown in FIG. In this example, the transmission band can be reduced.
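- The 100% and 75% figures in these two examples can be reproduced with the following sketch. It assumes, as suggested by the timing charts, that the image data DT1 and DT2 each occupy half of the period T and that 100% corresponds to transmitting one high-resolution frame per period T, so the required band is set by the larger of the two per-slot data amounts; this normalization is our reading, not stated verbatim in the text.

```python
# Band-usage arithmetic for the partial-image ratio examples. Each
# argument is a (horizontal_ratio, vertical_ratio) pair relative to
# the high-resolution frame.

def band_usage(whole_hv, partial_hv):
    wh, wv = whole_hv        # pixel-number ratios of the whole image P1
    ph, pv = partial_hv      # pixel-number ratios of the partial image P2
    per_slot = max(wh * wv, ph * pv)  # larger data amount decides the band
    return 100.0 * per_slot / 0.5     # each slot lasts half the period T

print(band_usage((0.5, 0.5), (1.00, 0.5)))  # 100.0 (FIGS. 19-21 case)
print(band_usage((0.5, 0.5), (0.75, 0.5)))  # 75.0  (FIGS. 22-24 case)
```

- Because the whole image P1 at 50%×50% needs only half of the partial image's slot budget, band is left over in the DT1 slot, which is what the next modification exploits to raise the resolution of P1.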
- the horizontal pixel number ratio of the partial image P2 is larger than the vertical pixel number ratio, but it is not limited to this.
- the vertical pixel number ratio of the partial image P2 may be made larger than the horizontal pixel number ratio.
- In this example, there is a margin in the transmission band during the period from timing t31 to t32. This remaining transmission band during the period from timing t31 to t32 can be used effectively to increase the resolution of the entire image P1, as described below.
- FIGS. 27 to 29 show examples in which the resolution of the entire image P1 is increased when the horizontal pixel number ratio of the partial image P2 is 100% and the vertical pixel number ratio is 50%. FIG. 27 schematically shows the image data of the whole image P1 and the partial image P2, FIG. 28 shows an example of the display operation, and FIG. 29 shows the operation of driving the pixels PIX.
- In this example, the number of pixels in the horizontal direction (horizontal pixel number) of the whole image P1 is 100% of the horizontal pixel number of the high-resolution whole image, and the number of pixels in the vertical direction (vertical pixel number) of the whole image P1 is 50% of the vertical pixel number of the high-resolution whole image.
- the horizontal resolution of the entire image P1 is doubled compared to the example of FIG.
- the receiving circuit 21 of the head mounted display 20 receives the image data DT1 representing the entire image P1 as shown in FIG. 27 ((B) in FIG. 28). Since the input image is the entire image P1, the display controller 26 controls the plurality of pixels PIX on the display panel 27 to be driven in units of two pixels PIX based on the image data of the entire image P1.
- the scanning circuit 33 scans the plurality of pixels PIX with two pixel lines L as the scanning unit US. Also, the pixel signal generation circuit 32 applies a plurality of pixel signals to the plurality of signal lines SGL. As a result, the same pixel signal is written to the two pixels PIX in the two selected pixel lines L. In this way, the display panel 27 drives the plurality of pixels PIX with two pixels PIX as a unit UD.
- the scanning circuit 33 sequentially scans two pixel lines L at a time from the bottom to the top of the pixel array 31 in this example. Then, as indicated by hatching in FIG. 28(C), the pixel PIX to which the pixel signal has been written emits light for a predetermined period after the writing. Thus, the display panel 27 displays the display image P21 (FIG. 28(D)).
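The unit-based driving described above amounts to replicating each value of the received image data over a block of panel pixels. The following is a minimal behavioral sketch (a hypothetical helper, not the patent's circuitry), where a scanning unit of two pixel lines with full horizontal resolution corresponds to a 2 x 1 unit:

```python
def drive_in_units(image, unit_rows, unit_cols):
    """Write each image-data value to every pixel in a unit_rows x unit_cols
    block of panel pixels, modeling the same pixel signal being applied to
    all pixels in the scanning unit."""
    panel = []
    for row in image:
        # replicate each value horizontally within the unit
        expanded = [value for value in row for _ in range(unit_cols)]
        # replicate the expanded line over the pixel lines of the unit
        for _ in range(unit_rows):
            panel.append(list(expanded))
    return panel

# Whole image P1 with 100% horizontal / 50% vertical resolution: each value
# drives two vertically adjacent pixels (two pixel lines per scanning unit).
print(drive_in_units([[1, 2], [3, 4]], unit_rows=2, unit_cols=1))
# [[1, 2], [1, 2], [3, 4], [3, 4]]
```

The same helper with `unit_rows=2, unit_cols=2` models the four-pixel units of the base embodiment.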
- the receiving circuit 21 receives the image data DT2 representing the partial image P2 as shown in FIG. 27 (FIG. 28(B)). Based on this image data DT2, the head mounted display 20 displays the display image P22 (FIGS. 28(C) and (D)) in the same manner as in the above embodiment (FIGS. 8 and 10).
- the image data DT1 representing the entire image P1 and the image data DT2 representing the partial image P2 are alternately transmitted, but the present invention is not limited to this.
- image data to be transmitted may be changed based on the detection result of the acceleration sensor 22 or the detection result of the eye tracking sensor 23.
- for example, suppose the user changes the orientation of the head significantly.
- the acceleration sensor 22 detects movements such as changes in the orientation of the head mounted display 20.
- the transmission circuit 25 of the head mounted display 20 transmits the detection signal SD including the detection result of the acceleration sensor 22 to the image generation device 10.
- the receiving circuit 13 of the image generation device 10 receives this detection signal SD, and the image generation circuit 11 continues to repeatedly generate the whole image P1 during the period Tmotion in which the orientation of the head mounted display 20 continues to change, based on the detection result of the acceleration sensor 22.
- the transmission circuit 12 generates image data DT1 representing the entire image P1 based on the image data of the entire image P1, and transmits an image signal SP including the image data DT1. That is, the transmission circuit 12 of the image generation device 10 continues to repeatedly transmit the image data DT1 representing the entire image P1 during the period Tmotion.
- the receiving circuit 21 of the head mounted display 20 receives this image data DT1. Then, the head mounted display 20 displays a display image P21 including the entire image P1 based on the image data DT1. That is, the head mounted display 20 continues to repeatedly display the display image P21 including the entire image P1 during the period Tmotion. When the user stops turning the head, the period Tmotion ends, and the display system 1 again alternately transmits the image data DT1 representing the entire image P1 and the image data DT2 representing the partial image P2. As a result, when the user moves the head, the user can view the display image P21 including the entire image P1 according to the movement of the head with a short latency, thereby reducing the possibility that the user will feel motion sickness.
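The switching behavior of this modification can be summarized as a small state function. This is a behavioral sketch with hypothetical names, assuming only two kinds of payload (DT1 for the whole image, DT2 for the partial image):

```python
def next_payload(head_moving, last_payload):
    """During Tmotion (head orientation still changing) keep sending the
    whole image P1; otherwise alternate whole and partial images."""
    if head_moving:
        return "DT1"  # whole image P1, short latency over the full field of view
    return "DT2" if last_payload == "DT1" else "DT1"

sent, payload = [], "DT2"
for moving in (True, True, False, False, False):
    payload = next_payload(moving, payload)
    sent.append(payload)
print(sent)  # ['DT1', 'DT1', 'DT2', 'DT1', 'DT2']
```

Once the motion stops, the sequence falls back to the alternating DT1/DT2 pattern of the base embodiment.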
- the display system 1 repeatedly transmits the image data DT1 representing the entire image P1 when the user greatly changes the orientation of the head, but the present invention is not limited to this.
- for example, the image data DT1 representing the entire image P1 may be repeatedly transmitted based on other detection results.
- the display system 1 repeatedly transmits image data DT1 representing the entire image P1 based on the detection result of the eye tracking sensor 23.
- one piece of image data DT1 and one piece of image data DT2 are alternately transmitted, but the invention is not limited to this.
- for example, one piece of image data DT1 and a plurality of pieces of image data DT2 may be transmitted alternately. The number of pieces of image data DT2 may also be changed. An example of alternately transmitting one piece of image data DT1 and three pieces of image data DT2 will be described in detail below.
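The one-DT1-to-N-DT2 transmission pattern can be sketched as a simple cyclic schedule (an illustrative helper, not part of the embodiment):

```python
from itertools import cycle, islice

def transmission_schedule(n_partial, length):
    """One piece of image data DT1 followed by n_partial pieces of DT2,
    repeated cyclically for `length` subframes."""
    pattern = ["DT1"] + ["DT2"] * n_partial
    return list(islice(cycle(pattern), length))

print(transmission_schedule(3, 8))
# ['DT1', 'DT2', 'DT2', 'DT2', 'DT1', 'DT2', 'DT2', 'DT2']
```

With `n_partial=1` this reduces to the alternating pattern of the base embodiment.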
- FIG. 31 shows an example of the display operation in the head mounted display 20 according to this modification, where (A) shows the waveform of the synchronization signal Vsync, (B) shows the input image, (C) shows the operation of the display panel 27, and (D) shows a display image displayed on the display panel 27.
- the receiving circuit 21 of the head mounted display 20 receives the image data DT1 representing the entire image P1 (FIG. 31(B)). Based on this image data DT1, the head mounted display 20 displays the display image P21 (FIGS. 31(C) and 31(D)) in the same manner as in the above embodiment (FIGS. 8 and 9). Further, during the period from timing t62 to t63, the receiving circuit 21 receives image data DT2 representing the partial image P2 (FIG. 31(B)). Based on this image data DT2, the head-mounted display 20 displays the display image P22 (FIGS. 31(C) and (D)) in the same manner as in the above embodiment (FIGS. 8 and 10).
- the receiving circuit 21 receives image data DT2 representing the partial image P2 (FIG. 31(B)). Based on this image data DT2, the head-mounted display 20 displays the display image P22 (FIGS. 31(C) and (D)) in the same manner as in the above embodiment (FIGS. 8 and 10). Further, during the period from timing t64 to t65, the receiving circuit 21 receives image data DT2 representing the partial image P2 (FIG. 31(B)). Based on this image data DT2, the head-mounted display 20 displays the display image P22 (FIGS. 31(C) and (D)) in the same manner as in the above embodiment (FIGS. 8 and 10).
- FIG. 32 shows an example of another display operation in the head mounted display 20 according to this modified example.
- the pixels PIX in the entire area of the display panel 27 emit light at the same timing.
- the partial image P2 is generated more frequently than in the above-described embodiment in which the entire image P1 and the partial image P2 are alternately generated.
- the image generation circuit 11 does not need to perform rendering processing for the region outside the partial image P2. Therefore, in this modified example, power consumption can be further reduced.
- the image generation circuit 11 can reduce the number of times the partial images P2 are generated when, for example, the three partial images P2 are the same image, so power consumption can be reduced.
- the image generation circuit 11 generates the whole image P1, which is an image of the entire scenery corresponding to the orientation of the head mounted display 20 in the virtual space, based on the detection result of the acceleration sensor 22 included in the data supplied from the receiving circuit 13.
- the image generation circuit 11 generates, based on the detection result of the eye tracking sensor 23 included in the data supplied from the receiving circuit 13, partial images P2 and P3 showing the portion viewed by the user.
- FIG. 33 shows an example of a display image P20 displayed on the head mounted display 20.
- the image generation circuit 11 determines partial regions R2 and R3, each including the portion viewed by the user, out of the entire region R1 of the display image P20.
- the size of the partial region R2 in the horizontal direction is half the size of the entire region R1 in the horizontal direction
- the size of the partial region R2 in the vertical direction is half the size of the entire region R1 in the vertical direction. That is, the area of the partial region R2 is 1/4 of the area of the entire region R1.
- the horizontal size of the partial region R3 is half the horizontal size of the partial region R2, and the vertical size of the partial region R3 is half the vertical size of the partial region R2. That is, the area of the partial region R3 is 1/4 of the area of the partial region R2. In this example, the central position of the partial region R3 is the same as the central position of the partial region R2.
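The geometry of the nested regions R2 and R3 can be sketched as follows. The display size and gaze point are hypothetical example values; the sketch only assumes that each region halves both dimensions of the enclosing one and shares its center:

```python
def centered_region(center, width, height):
    """Axis-aligned region of the given size centered at `center`,
    returned as (x, y, width, height)."""
    cx, cy = center
    return (cx - width / 2, cy - height / 2, width, height)

# Hypothetical 1920x1080 entire region R1 with the gaze at its center:
gaze = (960, 540)
r2 = centered_region(gaze, 1920 * 0.5, 1080 * 0.5)    # half of R1 per axis
r3 = centered_region(gaze, 1920 * 0.25, 1080 * 0.25)  # half of R2 per axis
print(r2)  # (480.0, 270.0, 960.0, 540.0)  -> area = R1 / 4
print(r3)  # (720.0, 405.0, 480.0, 270.0)  -> area = R2 / 4
```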
- FIG. 34 shows an example of the entire image P1 and partial images P2 and P3 generated by the image generation circuit 11.
- the entire image P1 is a low-resolution image of the entire area R1 (FIG. 33).
- the partial image P2 is a medium-resolution image of the partial region R2.
- the partial image P3 is a high-resolution image of the partial area R3.
- each pixel in the full image P1 corresponds to 16 pixels PIX in the head mounted display 20
- each pixel in the partial image P2 corresponds to 4 pixels PIX in the head mounted display 20.
- each pixel in the partial image P3 corresponds to one pixel PIX in the head mounted display 20.
- the number of pixels in the full image P1, the number of pixels in the partial image P2, and the number of pixels in the partial image P3 are made equal to each other.
- FIG. 35 schematically represents the image data of the entire image P1 and the partial images P2 and P3.
- the image data shown as a whole in FIG. 35 represents image data of a high-resolution whole image that can be used when writing a plurality of pixel values to the plurality of pixels PIX in the head mounted display 20.
- the number of pixels in the horizontal direction (horizontal pixel number) of the whole image P1 is 25% of the number of horizontal pixels of the high-resolution whole image shown in FIG. 35, and the number of pixels in the vertical direction (vertical pixel number) of the whole image P1 is 25% of the number of vertical pixels of the high-resolution whole image. That is, the horizontal pixel number ratio of the whole image P1 is 25%, and the vertical pixel number ratio of the whole image P1 is 25%.
- the number of pixels in the horizontal direction (horizontal pixel number) of the partial image P2 is 25% of the number of horizontal pixels of the high-resolution whole image
- the number of pixels in the vertical direction (vertical pixel number) of the partial image P2 is 25% of the number of vertical pixels of the high-resolution whole image. That is, the horizontal pixel number ratio of the partial image P2 is 25%, and the vertical pixel number ratio of the partial image P2 is 25%.
- the number of pixels in the horizontal direction (horizontal pixel number) of the partial image P3 is 25% of the number of horizontal pixels of the high-resolution whole image
- the number of pixels in the vertical direction (vertical pixel number) of the partial image P3 is 25% of the number of vertical pixels of the high-resolution whole image. That is, the horizontal pixel number ratio of the partial image P3 is 25%, and the vertical pixel number ratio of the partial image P3 is 25%.
- the horizontal pixel number ratios of the entire image P1 and partial images P2, P3 are equal to each other, and the vertical pixel number ratios of the entire image P1 and partial images P2, P3 are equal to each other.
- the total data amount of the entire image P1 and the partial images P2 and P3 is 3/16 of the data amount of the high-resolution entire image.
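The 3/16 figure follows directly from the pixel number ratios above; a short check, using data amounts relative to the high-resolution whole image:

```python
# Relative data amounts of the three subframes (ratios from the text above):
p1 = 0.25 * 0.25   # whole image P1: 25% x 25%
p2 = 0.25 * 0.25   # partial image P2: 25% x 25%
p3 = 0.25 * 0.25   # partial image P3: 25% x 25%

total = p1 + p2 + p3
print(total)            # 0.1875, i.e. 3/16 of the high-resolution image
print(total == 3 / 16)  # True
```

This is the basis of the roughly 19% bandwidth utilization cited for the three-subframe modification.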
- FIG. 36 shows an example of the display operation in the head mounted display 20 according to this modified example.
- Head-mounted display 20 cyclically receives image data DT1 representing entire image P1, image data DT2 representing partial image P2, and image data DT3 representing partial image P3 in this order.
- the display controller 26 controls the plurality of pixels PIX on the display panel 27 to be driven in units of 16 pixels PIX based on the image data of the entire image P1 included in the image data DT1. Thereby, the display panel 27 displays the display image P21 including the low-resolution whole image P1.
- the display controller 26 controls, based on the image data of the partial image P2 and the data on the position of the partial image P2 included in the image data DT2, the plurality of pixels PIX arranged in the region of the display panel 27 corresponding to the partial image P2 to be driven in units of four pixels PIX.
- among the plurality of pixels PIX on the display panel 27, the pixels PIX in the area other than the area corresponding to the partial image P2 maintain their display.
- the display panel 27 displays a display image P22 including the medium-resolution partial image P2.
- the display controller 26 controls, based on the image data of the partial image P3 and the data on the position of the partial image P3 included in the image data DT3, the plurality of pixels PIX arranged in the region of the display panel 27 corresponding to the partial image P3 to be driven in units of one pixel PIX.
- among the plurality of pixels PIX on the display panel 27, the pixels PIX in the area other than the area corresponding to the partial image P3 maintain their display. Thereby, the display panel 27 displays the display image P23 including the high-resolution partial image P3.
- FIG. 37 shows an example of a more detailed display operation in the head mounted display 20.
- the receiving circuit 21 of the head mounted display 20 receives image data DT1 representing the entire image P1 (FIG. 37(B)). Since the input image is the entire image P1, the display controller 26 controls the plurality of pixels PIX in the display panel 27 to be driven in units of 16 pixels PIX based on the image data of this entire image P1.
- the scanning circuit 33 scans the plurality of pixels PIX using four pixel lines L as the scanning unit. Also, the pixel signal generation circuit 32 applies the same pixel signal to four signal lines SGL adjacent to each other. As a result, the same pixel signal is written to the 16 pixels PIX in the four selected pixel lines L. Thus, the display panel 27 drives the plurality of pixels PIX in units of 16 pixels PIX.
- the scanning circuit 33 sequentially scans eight pixel lines L from the bottom to the top of the pixel array 31 in this example. Then, as indicated by hatching in FIG. 37C, at the timing when the pixel signals are written to all the pixels PIX in the display panel 27, the plurality of pixels PIX emit light at the same timing over a predetermined period. Thus, the display panel 27 displays the display image P21 (FIG. 37(D)).
- the receiving circuit 21 of the head mounted display 20 receives the image data DT2 representing the partial image P2 (FIG. 37(B)). Since the input image is the partial image P2, the display controller 26 controls, based on the image data of this partial image P2 and the data about the position of this partial image P2, the plurality of pixels PIX arranged in the area of the display panel 27 corresponding to the partial image P2 to be driven in units of four pixels PIX.
- the scanning circuit 33 scans the plurality of pixels PIX using two pixel lines L as the scanning unit. Further, the pixel signal generation circuit 32 applies the same pixel signal to every two signal lines SGL among the plurality of signal lines SGL related to the area corresponding to the partial image P2. As a result, the same pixel signal is written to the four pixels PIX in the two selected pixel lines L. Thus, the display panel 27 drives the plurality of pixels PIX in units of four pixels PIX.
- the scanning circuit 33 sequentially scans the area corresponding to the partial image P2 in the pixel array 31 with two pixel lines L as scanning units. Then, as indicated by hatching in FIG. 37C, at the timing when the pixel signals are written to all the pixels PIX in the display panel 27, the plurality of pixels PIX emit light at the same timing over a predetermined period. Thus, the display panel 27 displays the display image P22 (FIG. 37(D)).
- the receiving circuit 21 of the head mounted display 20 receives the image data DT3 representing the partial image P3 (FIG. 37(B)). Since the input image is the partial image P3, the display controller 26 controls, based on the image data of this partial image P3 and the data about the position of this partial image P3, the plurality of pixels PIX arranged in the area of the display panel 27 corresponding to the partial image P3 to be driven in units of one pixel PIX.
- the scanning circuit 33 scans a plurality of pixels PIX using one pixel line L as a scanning unit. Further, the pixel signal generation circuit 32 applies a plurality of pixel signals to a plurality of signal lines SGL related to a region corresponding to the partial image P3 among the plurality of signal lines SGL. In this manner, the display panel 27 drives a plurality of pixels PIX in units of one pixel PIX.
- the scanning circuit 33 sequentially scans the area corresponding to the partial image P3 in the pixel array 31 with one pixel line L as a scanning unit. Then, as indicated by hatching in FIG. 37C, at the timing when the pixel signals are written to all the pixels PIX in the display panel 27, the plurality of pixels PIX emit light at the same timing over a predetermined period. Thus, the display panel 27 displays the display image P23 (FIG. 37(D)).
- FIG. 38 shows the band usage rate in the display system 1 according to the above embodiment and the display system 1 according to this modification.
- in the above embodiment, the total data amount of the entire image P1 and the partial image P2 is 50% of the data amount of the high-resolution entire image, as shown in FIG. 38, so the bandwidth utilization is 50%.
- when the number of subframes is three as in this modified example, the total data amount of the entire image P1 and the partial images P2 and P3 is 3/16 of the data amount of the high-resolution entire image, as shown in FIG. 38, so the bandwidth utilization is about 19%.
- since the bandwidth usage rate is lower, the operating frequency can be reduced, and power consumption can be reduced accordingly.
- FIG. 39 shows a configuration example of the display system 1C.
- the display system 1C includes an image generation device 10C and a head mounted display 20C.
- the image generation device 10C has an image generation circuit 11C.
- the image generation circuit 11C has an image compression circuit 19C.
- the image compression circuit 19C is configured, for example, to compress one or both of the full image P1 and the partial image P2.
- the image compression circuit 19C can set whether or not to perform compression for each subframe.
- the image compression circuit 19C can set the compression ratio for each subframe when performing compression. For example, when MIPI is used for communication, VESA (Video Electronics Standards Association)-DSC (Display Stream Compression) in MIPI can be used.
- the head mounted display 20C has a processor 24C.
- the processor 24C has an image restoration circuit 29C.
- the image restoration circuit 29C is configured to restore the image compressed by the image compression circuit 19C.
- the image compression circuit 19C compresses the partial image P2, thereby reducing the amount of image data of the partial image P2, as shown in FIG. 40, compared to the uncompressed case.
- in FIG. 40, shading indicates compressed images.
- the image compression circuit 19C compresses the partial image P2 at a compression rate of 50%. As a result, the transmission band can be reduced.
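The effect of compressing only the larger subframe can be shown with the same band-usage model as before, extended with a per-subframe compression ratio. The ratios below are illustrative (P2 at 75% x 50% as in the earlier example, compressed to 50%), not values fixed by the embodiment:

```python
def band_usage_with_compression(subframes):
    """Subframes given as (h_ratio, v_ratio, compression_ratio); the common
    slot size is set by the largest compressed payload."""
    payloads = [h * v * c for h, v, c in subframes]
    return len(subframes) * max(payloads)

# P1 uncompressed (50% x 50%); P2 at 75% x 50%, compressed to 50%:
uncompressed = band_usage_with_compression([(0.5, 0.5, 1.0), (0.75, 0.5, 1.0)])
compressed = band_usage_with_compression([(0.5, 0.5, 1.0), (0.75, 0.5, 0.5)])
print(uncompressed, compressed)  # 0.75 0.5
```

Compressing P2 shrinks its payload below P1's, so the band is again set by P1 and drops from 75% to 50%.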
- the compressed partial image P2 is restored by the image restoration circuit 29C.
- the head mounted display 20C displays an image based on the restored partial image P2 in the same manner as described above.
- the head mounted display 20C is provided with the acceleration sensor 22 and the eye tracking sensor 23, but the present invention is not limited to this; these sensors need not be provided.
- This display system 1D includes an image generation device 10D and a head mounted display 20D.
- the image generation device 10D has an image generation circuit 11D.
- the image generation circuit 11D generates a full image P1 to be displayed on the head mounted display 20D. Further, the image generation circuit 11D generates a partial image P2 including a portion where the image changes in the entire image P1.
- the head mounted display 20D has a processor 24D.
- the processor 24D performs predetermined image processing based on the image data DT1 and DT2 supplied from the receiving circuit 21, and supplies the image data of the entire image P1 included in the image data DT1, the image data of the partial image P2 included in the image data DT2, and the data on the position of the partial image P2 included in the image data DT2 to the display controller 26.
- the red image (R), green image (G), and blue image (B) of the full image P1 are low-resolution images. Based on these images, the display panel 27 performs a display drive operation for the entire image P1 in units of four pixels PIX arranged in two rows and two columns. Also, the red image (R), the green image (G), and the blue image (B) of the partial image P2 are high-resolution images. Based on these images, the display panel 27 performs a display drive operation for the partial image P2 in units of one pixel PIX.
- the bandwidth usage rate in this case is 50%, as in the case of the above embodiment (FIG. 4).
- the luminance image (Y), the first color difference image (U), and the second color difference image (V) of the entire image P1 are low resolution images.
- based on these images, the display controller 26 generates a low-resolution red image (R), green image (G), and blue image (B), and the display panel 27 performs, based on the generated images, a display drive operation for the entire image P1 using four pixels PIX arranged in two rows and two columns as a unit.
- the luminance image (Y), the first color difference image (U), and the second color difference image (V) of the partial image P2 are high-resolution images.
- based on these images, the display controller 26 generates a high-resolution red image (R), green image (G), and blue image (B), and the display panel 27 performs, based on the generated images, a display drive operation for the partial image P2 in units of one pixel PIX.
- the bandwidth usage rate in this case is 50%, as in the case of the above embodiment (FIG. 4).
- the luminance image (Y), the first color difference image (U), and the second color difference image (V) of the entire image P1 are low-resolution images.
- based on these images, the display controller 26 generates a low-resolution red image (R), green image (G), and blue image (B), and the display panel 27 performs, based on the generated images, a display drive operation for the entire image P1 using four pixels PIX arranged in two rows and two columns as a unit.
- the luminance image (Y) of the partial image P2 is a high resolution image
- the first color difference image (U) and the second color difference image (V) are low resolution images.
- based on these images, the display controller 26 generates a high-resolution red image (R), green image (G), and blue image (B), and the display panel 27 performs, based on the generated images, a display drive operation for the partial image P2 in units of one pixel PIX.
- the band utilization rate in this case is 37.5%.
- the luminance image (Y) of the entire image P1 is a low-resolution image
- the first color difference image (U) and the second color difference image (V) are images with an even lower resolution.
- the pixels of the first color difference image (U) and the second color difference image (V) correspond to eight (4 ⁇ 2) pixels PIX.
- based on these images, the display controller 26 generates a low-resolution red image (R), green image (G), and blue image (B), and the display panel 27 performs, based on the generated images, a display drive operation for the entire image P1 using four pixels PIX arranged in two rows and two columns as a unit.
- the luminance image (Y) of the partial image P2 is a high resolution image
- the first color difference image (U) and the second color difference image (V) are medium resolution images.
- the pixels of the first color difference image (U) and the second color difference image (V) correspond to two (2 ⁇ 1) pixels PIX.
- based on these images, the display controller 26 generates a high-resolution red image (R), green image (G), and blue image (B), and the display panel 27 performs, based on the generated images, a display drive operation for the partial image P2 in units of one pixel PIX.
- the band utilization rate in this case is 33.3%.
- the luminance image (Y) of the entire image P1 is a low-resolution image
- the first color difference image (U) and the second color difference image (V) are images with an even lower resolution.
- the pixels of the first color difference image (U) and the second color difference image (V) correspond to 16 (4 ⁇ 4) pixels PIX.
- based on these images, the display controller 26 generates a low-resolution red image (R), green image (G), and blue image (B), and the display panel 27 performs, based on the generated images, a display drive operation for the entire image P1 using four pixels PIX arranged in two rows and two columns as a unit.
- the luminance image (Y) of the partial image P2 is a high resolution image
- the first color difference image (U) and the second color difference image (V) are low resolution images.
- based on these images, the display controller 26 generates a high-resolution red image (R), green image (G), and blue image (B), and the display panel 27 performs, based on the generated images, a display drive operation for the partial image P2 in units of one pixel PIX.
- the band utilization rate in this case is 25%.
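The YUV band utilization figures above (37.5%, 33.3%, 25%) can be reproduced by summing plane data amounts. This is an illustrative check: each plane is given as (h_ratio, v_ratio) relative to one full-resolution plane, a full-resolution RGB frame (three full planes) counts as 1.0, and the partial image P2 covers the quarter-area region, so its full-resolution Y plane also comes out to (0.5, 0.5):

```python
def yuv_band_usage(p1_planes, p2_planes):
    """Total data amount of the two subframes relative to a high-resolution
    RGB frame (three full-resolution planes = 1.0)."""
    def amount(planes):
        return sum(h * v for h, v in planes.values()) / 3
    return amount(p1_planes) + amount(p2_planes)

low, lower, lowest = (0.5, 0.5), (0.25, 0.5), (0.25, 0.25)

# P2 luminance high-res over the quarter area, all chrominance low-res:
u1 = yuv_band_usage({"Y": low, "U": low,    "V": low},
                    {"Y": low, "U": lowest, "V": lowest})
# chrominance one step coarser (e.g. one U/V pixel per 4x2 or 2x1 block):
u2 = yuv_band_usage({"Y": low, "U": lower,  "V": lower},
                    {"Y": low, "U": lower,  "V": lower})
# chrominance two steps coarser in both subframes:
u3 = yuv_band_usage({"Y": low, "U": lowest, "V": lowest},
                    {"Y": low, "U": lowest, "V": lowest})
print(round(u1 * 100, 1), round(u2 * 100, 1), round(u3 * 100, 1))
# 37.5 33.3 25.0
```

Sending luminance at high resolution while subsampling chrominance trades almost no perceived sharpness for a markedly lower band usage, which is the point of these YUV variants.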
- FIG. 43 shows a configuration example of the display panel 27E.
- the display panel 27E has a pixel array 31E, a pixel signal generation circuit 32, a scanning circuit 33, and a drive circuit 34E.
- the pixel array 31E has multiple signal lines SGL, multiple control lines CTL, multiple control lines WSEN, and multiple pixels PIX.
- the plurality of control lines WSEN extend in the vertical direction (the vertical direction in FIG. 43) and are arranged side by side in the horizontal direction (the horizontal direction in FIG. 43).
- Each of the plurality of control lines WSEN supplies control signals generated by the drive circuit 34E to the pixels PIX.
- the drive circuit 34E generates a control signal and applies the generated control signal to a plurality of control lines WSEN, thereby selecting a pixel PIX to which the pixel signal generated by the pixel signal generation circuit 32 is written, out of the plurality of pixels PIX. It is configured to control which pixels PIX are written to.
- FIG. 44 shows a configuration example of the pixel PIX.
- a pixel array having the pixels PIX has a control line WSL. Control lines CTL shown in FIG. 43 include this control line WSL.
- the pixel PIX has transistors MN01 to MN03, a capacitor C01, and a light emitting element EL.
- the transistors MN01 to MN03 are N-type MOSFETs (Metal Oxide Semiconductor Field Effect Transistors).
- the transistor MN01 has a gate connected to the control line WSEN, a drain connected to the signal line SGL, and a source connected to the drain of the transistor MN02.
- the transistor MN02 has a gate connected to the control line WSL, a drain connected to the source of the transistor MN01, and a source connected to the gate of the transistor MN03 and the capacitor C01.
- One end of the capacitor C01 is connected to the source of the transistor MN02 and the gate of the transistor MN03, and the other end is connected to the source of the transistor MN03 and the anode of the light emitting element EL.
- the transistor MN03 has a gate connected to the source of the transistor MN02 and one end of the capacitor C01, a drain connected to the power supply line VCCP, and a source connected to the other end of the capacitor C01 and the anode of the light emitting element EL.
- the light emitting element EL is, for example, an organic EL light emitting element, and has an anode connected to the source of the transistor MN03 and the other end of the capacitor C01, and a cathode connected to the power supply line Vcath.
- the voltage across the capacitor C01 is set based on the pixel signal supplied from the signal line SGL by turning on the transistors MN01 and MN02.
- the transistor MN03 causes a current corresponding to the voltage across the capacitor C01 to flow through the light emitting element EL.
- the light emitting element EL emits light based on the current supplied from the transistor MN03.
- the pixel PIX emits light with luminance according to the pixel signal.
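The write-then-emit behavior of this pixel can be modeled with the standard square-law MOSFET saturation equation. This is a behavioral sketch only; the transconductance parameter k and threshold voltage v_th are illustrative values, not taken from the patent:

```python
def el_current(v_stored, v_th=0.5, k=1e-4):
    """Drive current of transistor MN03 for the gate-source voltage stored
    on capacitor C01, using the square-law saturation model
    I = (k/2) * (Vgs - Vth)^2. k and v_th are illustrative values."""
    v_ov = max(v_stored - v_th, 0.0)  # overdrive voltage; cut off below Vth
    return 0.5 * k * v_ov ** 2

# A larger pixel-signal voltage written through MN01/MN02 stores a larger
# voltage on C01, so MN03 feeds more current to the light emitting element EL:
for v in (0.4, 1.0, 2.0):
    print(v, el_current(v))
```

The monotonic current-versus-voltage relation is what lets the written pixel signal set the emission luminance.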
- the receiving circuit 21 of the head mounted display 20 receives image data DT1 representing the entire image P1 (FIG. 8(B)). Since the input image is the entire image P1, the display controller 26 controls the plurality of pixels PIX in the display panel 27E to drive four pixels PIX as a unit based on the image data of the entire image P1. As shown in FIG. 9, during the timings t11 to t12, the scanning circuit 33 of the display panel 27E scans the plurality of pixels PIX with two pixel lines L as the scanning unit US. The drive circuit 34E activates (high level) all the control lines WSEN.
- the pixel signal generation circuit 32 applies the same pixel signal to two signal lines SGL adjacent to each other. As a result, the same pixel signal is written to the four pixels PIX in the two pixel lines L selected. In this manner, the display panel 27E drives a plurality of pixels PIX with four pixels PIX as a unit UD.
- the receiving circuit 21 of the head mounted display 20 receives the image data DT2 representing the partial image P2 (FIG. 8(B)). Since the input image is the partial image P2, the display controller 26 controls, based on the image data of this partial image P2 and the data about the position of this partial image P2, the plurality of pixels PIX arranged in the area of the display panel 27E corresponding to the partial image P2 to be driven in units of one pixel PIX.
- as shown in FIG. 10, the scanning circuit 33 of the display panel 27E scans the plurality of pixels PIX with one pixel line L as the scanning unit US.
- the drive circuit 34E activates (high level) the plurality of control lines WSEN related to the area corresponding to the partial image P2, and deactivates (low level) the other plurality of control lines WSEN.
- the pixel signal generation circuit 32 applies a plurality of pixel signals to a plurality of signal lines SGL related to a region corresponding to the partial image P2 among the plurality of signal lines SGL. As a result, in one selected pixel line L, a plurality of pixel signals are written to a plurality of pixels PIX related to the area corresponding to the partial image P2.
- the display panel 27E drives a plurality of pixels PIX with one pixel PIX as a unit UD.
- the configuration of the pixel PIX is not limited to the example of FIG. 44. Some examples are given below.
- FIG. 45 shows another configuration example of the pixel PIX.
- a pixel array having the pixels PIX has a control line WSL, a control line DSL, and a control line AZSL.
- Control lines CTL shown in FIG. 43 include these control lines WSL, DSL and AZSL.
- This pixel PIX has transistors MP11 and MP12, capacitors C11 and C12, transistors MP13 to MP15, and a light emitting element EL.
- the transistors MP11 to MP15 are P-type MOSFETs.
- the transistor MP11 has a gate connected to the control line WSEN, a source connected to the signal line SGL, and a drain connected to the source of the transistor MP12.
- the transistor MP12 has a gate connected to the control line WSL, a source connected to the drain of the transistor MP11, and a drain connected to the gate of the transistor MP14 and the capacitor C12.
- One end of the capacitor C11 is connected to the power supply line VCCP, and the other end is connected to the capacitor C12, the drain of the transistor MP13, and the source of the transistor MP14.
- One end of capacitor C12 is connected to the other end of capacitor C11, the drain of transistor MP13, and the source of transistor MP14, and the other end is connected to the drain of transistor MP12 and the gate of transistor MP14.
- the transistor MP13 has a gate connected to the control line DSL, a source connected to the power supply line VCCP, and a drain connected to the source of the transistor MP14, the other end of the capacitor C11, and one end of the capacitor C12.
- the transistor MP14 has a gate connected to the drain of the transistor MP12 and the other end of the capacitor C12, a source connected to the drain of the transistor MP13, the other end of the capacitor C11, and one end of the capacitor C12, and a drain connected to the anode of the light emitting element EL and the source of the transistor MP15.
- the transistor MP15 has a gate connected to the control line AZSL, a source connected to the drain of the transistor MP14 and the anode of the light emitting element EL, and a drain connected to the power supply line VSS.
- the voltage across the capacitor C12 is set based on the pixel signal supplied from the signal line SGL by turning on the transistors MP11 and MP12.
- the transistor MP13 is turned on and off based on the signal on the control line DSL.
- the transistor MP14 causes a current corresponding to the voltage across the capacitor C12 to flow through the light emitting element EL while the transistor MP13 is on.
- the light emitting element EL emits light based on the current supplied from the transistor MP14.
- the pixel PIX emits light with luminance according to the pixel signal.
- the transistor MP15 is turned on and off based on the signal on the control line AZSL. While the transistor MP15 is on, the voltage of the anode of the light emitting element EL is initialized by setting it to the voltage of the power supply line VSS.
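As a rough illustration of the write/emit sequence just described (not a circuit simulation), the following Python model treats the capacitor C12 as a sample-and-hold element and the drive transistor MP14 as an idealized voltage-to-current converter; the class name, the boolean controls, and the linear gain are all assumptions:

```python
# Illustrative behavioral model (not the actual analog circuit) of the pixel
# in FIG. 45: capacitor C12 samples the pixel signal while WSEN and WSL are
# both active, the drive transistor MP14 sources a current set by the held
# voltage while DSL is active, and AZSL resets the anode to VSS.
class PixelModel:
    def __init__(self):
        self.v_c12 = 0.0    # voltage held across capacitor C12
        self.anode = 0.0    # anode voltage of light emitting element EL

    def write(self, v_signal, wsen, wsl):
        if wsen and wsl:    # transistors MP11 and MP12 both on
            self.v_c12 = v_signal
        return self.v_c12

    def emit(self, dsl, gain=2.0):
        # while MP13 is on, the EL current is a function of the held voltage
        return gain * self.v_c12 if dsl else 0.0

    def reset(self, azsl, v_vss=0.0):
        if azsl:            # MP15 on: anode tied to power supply line VSS
            self.anode = v_vss

px = PixelModel()
px.write(0.8, wsen=True, wsl=True)
current = px.emit(dsl=True)
```

A pixel whose WSEN control line is held inactive ignores the write, which is what allows the panel to restrict updates to the region of the partial image.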
- FIG. 46 shows another configuration example of the pixel PIX.
- a pixel array having the pixels PIX has a control line WSL, a control line DSL, and a control line AZSL.
- Control lines CTL shown in FIG. 43 include these control lines WSL, DSL and AZSL.
- This pixel PIX has transistors MN21 and MN22, a capacitor C21, transistors MN23 to MN25, and a light emitting element EL.
- the transistors MN21 to MN25 are N-type MOSFETs.
- the transistor MN21 has a gate connected to the control line WSEN, a drain connected to the signal line SGL, and a source connected to the drain of the transistor MN22.
- the transistor MN22 has a gate connected to the control line WSL, a drain connected to the source of the transistor MN21, and a source connected to the gate of the transistor MN24 and the capacitor C21.
- One end of the capacitor C21 is connected to the source of the transistor MN22 and the gate of the transistor MN24, and the other end is connected to the source of the transistor MN24, the drain of the transistor MN25 and the anode of the light emitting element EL.
- the transistor MN23 has a gate connected to the control line DSL, a drain connected to the power supply line VCCP, and a source connected to the drain of the transistor MN24.
- the gate of transistor MN24 is connected to the source of transistor MN22 and one end of capacitor C21, the drain is connected to the source of transistor MN23, and the source is connected to the other end of capacitor C21, the drain of transistor MN25, and the anode of light emitting element EL.
- the transistor MN25 has a gate connected to the control line AZSL, a drain connected to the source of the transistor MN24, the other end of the capacitor C21 and the anode of the light emitting element EL, and a source connected to the power supply line VSS.
- the voltage across the capacitor C21 is set based on the pixel signal supplied from the signal line SGL by turning on the transistors MN21 and MN22.
- the transistor MN23 is turned on and off based on the signal on the control line DSL.
- the transistor MN24 causes a current corresponding to the voltage across the capacitor C21 to flow through the light emitting element EL while the transistor MN23 is on.
- the light emitting element EL emits light based on the current supplied from the transistor MN24.
- the pixel PIX emits light with luminance according to the pixel signal.
- the transistor MN25 is turned on and off based on the signal on the control line AZSL. While the transistor MN25 is on, the voltage of the anode of the light emitting element EL is initialized by setting it to the voltage of the power supply line VSS.
- FIG. 47 shows another configuration example of the pixel PIX.
- a pixel array having this pixel PIX has a control line WSL, a control line DSL, and control lines AZSL1 and AZSL2.
- Control lines CTL shown in FIG. 43 include control lines WSL, DSL, AZSL1 and AZSL2.
- This pixel PIX has transistors MP31 and MP32, a capacitor C31, transistors MP33 to MP36, and a light emitting element EL.
- the transistors MP31 to MP36 are P-type MOSFETs.
- the transistor MP31 has a gate connected to the control line WSEN, a source connected to the signal line SGL, and a drain connected to the source of the transistor MP32.
- the transistor MP32 has a gate connected to the control line WSL, a source connected to the drain of the transistor MP31, and a drain connected to the gate of the transistor MP33, the source of the transistor MP34, and the capacitor C31.
- One end of the capacitor C31 is connected to the power supply line VCCP, and the other end is connected to the drain of the transistor MP32, the gate of the transistor MP33, and the source of the transistor MP34.
- the transistor MP34 has a gate connected to the control line AZSL1, a source connected to the drain of the transistor MP32, the gate of the transistor MP33, and the other end of the capacitor C31, and a drain connected to the drain of the transistor MP33 and the source of the transistor MP35.
- the transistor MP35 has a gate connected to the control line DSL, a source connected to the drains of the transistors MP33 and MP34, and a drain connected to the source of the transistor MP36 and the anode of the light emitting element EL.
- the transistor MP36 has a gate connected to the control line AZSL2, a source connected to the drain of the transistor MP35 and the anode of the light emitting element EL, and a drain connected to the power supply line VSS.
- the voltage across the capacitor C31 is set based on the pixel signal supplied from the signal line SGL by turning on the transistors MP31 and MP32.
- the transistor MP35 is turned on and off based on the signal on the control line DSL.
- the transistor MP33 causes a current corresponding to the voltage across the capacitor C31 to flow through the light emitting element EL while the transistor MP35 is on.
- the light emitting element EL emits light based on the current supplied from the transistor MP33.
- the pixel PIX emits light with luminance according to the pixel signal.
- the transistor MP34 is turned on and off based on the signal on the control line AZSL1. While transistor MP34 is on, the drain and gate of transistor MP33 are connected to each other.
- the transistor MP36 is turned on and off based on the signal on the control line AZSL2. During the period in which the transistor MP36 is on, the voltage of the anode of the light emitting element EL is initialized by setting it to the voltage of the power supply line VSS.
- FIG. 48 shows another configuration example of the pixel PIX.
- a pixel array having this pixel PIX has control lines WSL1 and WSL2, a control line DSL, control lines AZSL1 and AZSL2, signal lines SGL1 and SGL2, capacitors C48 and C49, and a transistor MP49.
- the control lines CTL shown in FIG. 43 include control lines WSL1, WSL2, DSL, AZSL1 and AZSL2.
- Signal lines SGL shown in FIG. 43 include signal lines SGL1 and SGL2.
- One end of the capacitor C48 is connected to the signal line SGL1, and the other end is connected to the power supply line VSS.
- One end of the capacitor C49 is connected to the signal line SGL1, and the other end is connected to the signal line SGL2.
- the transistor MP49 is a P-type MOSFET, and has a gate connected to the control line WSL2, a source connected to the signal line SGL1, and a drain connected to the signal line SGL2.
- the pixel PIX has transistors MP41 and MP42, a capacitor C41, transistors MP43 to MP46, and a light emitting element EL.
- the transistors MP41 to MP46 are P-type MOSFETs.
- the transistor MP41 has a gate connected to the control line WSEN, a source connected to the signal line SGL2, and a drain connected to the source of the transistor MP42.
- the transistor MP42 has a gate connected to the control line WSL1, a source connected to the drain of the transistor MP41, and a drain connected to the gate of the transistor MP43 and the capacitor C41.
- One end of the capacitor C41 is connected to the power supply line VCCP, and the other end is connected to the drain of the transistor MP42 and the gate of the transistor MP43.
- the transistor MP43 has a gate connected to the drain of the transistor MP42 and the other end of the capacitor C41, a source connected to the power supply line VCCP, and a drain connected to the sources of the transistors MP44 and MP45.
- the transistor MP44 has a gate connected to the control line AZSL1, a source connected to the drain of the transistor MP43 and a source of the transistor MP45, and a drain connected to the signal line SGL2.
- the transistor MP45 has a gate connected to the control line DSL, a source connected to the drain of the transistor MP43 and the source of the transistor MP44, and a drain connected to the source of the transistor MP46 and the anode of the light emitting element EL.
- the transistor MP46 has a gate connected to the control line AZSL2, a source connected to the drain of the transistor MP45 and the anode of the light emitting element EL, and a drain connected to the power supply line VSS.
- the voltage across the capacitor C41 is set based on the pixel signal supplied from the signal line SGL1 via the capacitor C49 by turning on the transistors MP41 and MP42.
- the transistor MP45 is turned on and off based on the signal on the control line DSL.
- the transistor MP43 causes a current corresponding to the voltage across the capacitor C41 to flow through the light emitting element EL while the transistor MP45 is on.
- the light emitting element EL emits light based on the current supplied from the transistor MP43.
- the pixel PIX emits light with luminance according to the pixel signal.
- the transistor MP44 is turned on and off based on the signal on the control line AZSL1. While the transistor MP44 is on, the drain of the transistor MP43 and the signal line SGL2 are connected to each other.
- the transistor MP46 is turned on and off based on the signal on the control line AZSL2. While the transistor MP46 is on, the voltage of the anode of the light emitting element EL is initialized by setting it to the voltage of the power supply line VSS.
- FIG. 49 shows another configuration example of the pixel PIX.
- a pixel array having this pixel PIX has a control line WSL, a control line DSL, and control lines AZSL1 and AZSL2.
- Control lines CTL shown in FIG. 43 include control lines WSL, DSL, AZSL1 and AZSL2.
- This pixel PIX has transistors MP51 to MP54, a capacitor C51, transistors MP55 to MP60, and a light emitting element EL.
- the transistors MP51-MP60 are P-type MOSFETs.
- the transistor MP51 has a gate connected to the control line WSEN, a source connected to the signal line SGL, and a drain connected to the source of the transistor MP52.
- the transistor MP52 has a gate connected to the control line WSL, a source connected to the drain of the transistor MP51, and a drain connected to the drain of the transistor MP53 and the source of the transistor MP54.
- the transistor MP53 has a gate connected to the control line DSL, a source connected to the power supply line VCCP, and a drain connected to the drain of the transistor MP52 and the source of the transistor MP54.
- the gate of transistor MP54 is connected to the source of transistor MP55, the drain of transistor MP57 and capacitor C51, the source is connected to the drains of transistors MP52 and MP53, and the drain is connected to the sources of transistors MP58 and MP59.
- Capacitor C51 may include two capacitors connected in parallel with each other.
- the transistor MP55 has a gate connected to the control line AZSL1, a source connected to the gate of the transistor MP54, a drain of the transistor MP57 and the other end of the capacitor C51, and a drain connected to the source of the transistor MP56.
- the transistor MP56 has a gate connected to the control line AZSL1, a source connected to the drain of the transistor MP55, and a drain connected to the power supply line VSS.
- the transistor MP57 has a gate connected to the control line WSL, a drain connected to the gate of the transistor MP54, a source of the transistor MP55 and the other end of the capacitor C51, and a source connected to the drain of the transistor MP58.
- the transistor MP58 has a gate connected to the control line WSL, a drain connected to the drain of the transistor MP57, and a source connected to the drain of the transistor MP54 and the source of the transistor MP59.
- the transistor MP59 has a gate connected to the control line DSL, a source connected to the drain of the transistor MP54 and the source of the transistor MP58, and a drain connected to the source of the transistor MP60 and the anode of the light emitting element EL.
- the transistor MP60 has a gate connected to the control line AZSL2, a source connected to the drain of the transistor MP59 and the anode of the light emitting element EL, and a drain connected to the power supply line VSS.
- the voltage across the capacitor C51 is set based on the pixel signal supplied from the signal line SGL by turning on the transistors MP51, MP52, MP54, MP58, and MP57.
- the transistors MP53 and MP59 are turned on and off based on the signal on the control line DSL.
- the transistor MP54 causes a current corresponding to the voltage across the capacitor C51 to flow through the light emitting element EL while the transistors MP53 and MP59 are on.
- the light emitting element EL emits light based on the current supplied from the transistor MP54.
- the pixel PIX emits light with luminance according to the pixel signal.
- the transistors MP55 and MP56 are turned on and off based on the signal on the control line AZSL1. While the transistors MP55 and MP56 are on, the voltage of the gate of the transistor MP54 is initialized by setting it to the voltage of the power supply line VSS.
- the transistor MP60 is turned on and off based on the signal on the control line AZSL2. While the transistor MP60 is on, the voltage of the anode of the light emitting element EL is initialized by setting it to the voltage of the power supply line VSS.
- FIG. 50 shows another configuration example of the pixel PIX.
- a pixel array having the pixels PIX has control lines WSENN and WSENP, control lines WSNL and WSPL, a control line AZL, and a control line DSL.
- Control line WSEN shown in FIG. 43 includes control lines WSENN and WSENP.
- Control lines CTL shown in FIG. 43 include control lines WSNL, WSPL, AZL and DSL.
- the signal on the control line WSENN and the signal on the control line WSENP are signals inverted from each other.
- the signal on the control line WSNL and the signal on the control line WSPL are signals inverted from each other.
- the pixel PIX has transistors MN61, MP62, MN63, MP64, capacitors C61, C62, transistors MN65 to MN67, and a light emitting element EL.
- Transistors MN61, MN63, MN65 to MN67 are N-type MOSFETs, and transistors MP62 and MP64 are P-type MOSFETs.
- the transistor MN61 has a gate connected to the control line WSENN, a drain connected to the signal line SGL and the source of the transistor MP62, and a source connected to the drains of the transistors MP62, MN63 and MP64.
- the transistor MP62 has a gate connected to the control line WSENP, a source connected to the signal line SGL and the drain of the transistor MN61, and a drain connected to the sources of the transistors MN61, MN63, and MP64.
- the transistor MN63 has a gate connected to the control line WSNL, a drain connected to the source of the transistor MN61, the drain of the transistor MP62, and the source of the transistor MP64, and a source connected to the drain of the transistor MP64, the capacitors C61 and C62, and the gate of the transistor MN65.
- the transistor MP64 has a gate connected to the control line WSPL, a source connected to the source of the transistor MN61, the drain of the transistor MP62, and the drain of the transistor MN63, and a drain connected to the source of the transistor MN63, the capacitors C61 and C62, and the gate of the transistor MN65.
- Capacitor C61 is configured using, for example, an MOM (Metal Oxide Metal) capacitor, and has one end connected to the source of the transistor MN63, the drain of the transistor MP64, the capacitor C62, and the gate of the transistor MN65, and the other end connected to the power supply line VSS2.
- capacitor C61 may be configured using, for example, a MOS capacitor or an MIM (Metal Insulator Metal) capacitor.
- Capacitor C62 is configured using a MOS capacitor, for example, and has one end connected to the source of transistor MN63, the drain of transistor MP64, one end of capacitor C61, and the gate of transistor MN65, and the other end connected to power supply line VSS2.
- the capacitor C62 may be configured using, for example, an MOM capacitor or an MIM capacitor.
- the transistor MN65 has a gate connected to the source of the transistor MN63, the drain of the transistor MP64, and one end of each of the capacitors C61 and C62, a drain connected to the power supply line VCCP, and a source connected to the drains of the transistors MN66 and MN67.
- the transistor MN66 has a gate connected to the control line AZL, a drain connected to the sources of the transistors MN65 and MN67, and a source connected to the power supply line VSS1.
- the transistor MN67 has a gate connected to the control line DSL, a drain connected to the source of the transistor MN65 and a drain of the transistor MN66, and a source connected to the anode of the light emitting element EL.
- In the pixel PIX, the voltage across the capacitors C61 and C62 is set based on the pixel signal supplied from the signal line SGL by turning on at least one of the transistors MN61 and MP62 and at least one of the transistors MN63 and MP64.
- the transistor MN67 is turned on and off based on the signal on the control line DSL.
- the transistor MN65 causes a current corresponding to the voltage across the capacitors C61 and C62 to flow through the light emitting element EL while the transistor MN67 is on.
- the light emitting element EL emits light based on the current supplied from the transistor MN65.
- the pixel PIX emits light with luminance according to the pixel signal.
- the transistor MN66 may be turned on and off based on the signal on the control line AZL. Further, the transistor MN66 may function as a resistive element having a resistance value according to the signal on the control line AZL. In this case, transistors MN65 and MN66 form a so-called source follower circuit.
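Under the source-follower interpretation noted above, the EL drive node tracks the gate voltage of the transistor MN65 minus a threshold drop, with MN66 pulling down toward VSS1. A toy model with assumed, idealized parameters:

```python
# Toy model (assumed, idealized) of the MN65/MN66 source follower in FIG. 50:
# the output node follows the gate voltage of MN65 minus a threshold drop,
# clamped below by the pull-down toward power supply line VSS1.
def source_follower(v_gate, v_th=0.5, v_vss1=0.0):
    """Idealized source-follower output for an N-type drive transistor."""
    return max(v_gate - v_th, v_vss1)

v_out = source_follower(1.5)   # output follows the capacitor voltage
```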
- FIG. 51 shows an example of the appearance of the head mounted display 110.
- the head-mounted display 110 has, for example, ear hooks 112 on both sides of an eyeglass-shaped display 111 to be worn on the user's head.
- the technology according to the above embodiments and the like can be applied to such a head mounted display 110 .
- FIG. 52 shows an example of the appearance of another head mounted display 120.
- the head-mounted display 120 is a transmissive head-mounted display having a body portion 121 , an arm portion 122 and a lens barrel portion 123 .
- This head mounted display 120 is attached to glasses 128 .
- the body section 121 has a control board and a display section for controlling the operation of the head mounted display 120 .
- the display section emits image light for a display image.
- the arm portion 122 connects the body portion 121 and the lens barrel portion 123 and supports the lens barrel portion 123 .
- the lens barrel section 123 projects the image light supplied from the body section 121 via the arm section 122 toward the user's eyes via the lens 129 of the spectacles 128 .
- the technology according to the above embodiments and the like can be applied to such a head mounted display 120 .
- Although the head mounted display 120 is a so-called light guide plate type head mounted display, it is not limited to this, and may be, for example, a so-called birdbath type head mounted display.
- This birdbath type head-mounted display includes, for example, a beam splitter and a partially transparent mirror. The beam splitter outputs light encoded with image information toward the mirror, which reflects the light toward the user's eye. Both the beam splitter and the partially transparent mirror are partially transparent, which allows light from the surrounding environment to reach the user's eyes.
- the digital still camera 130 is a lens-interchangeable single-lens reflex camera, and has a camera body 131, a photographing lens unit 132, a grip portion 133, a monitor 134, and an electronic viewfinder 135.
- the photographing lens unit 132 is an interchangeable lens unit, and is provided near the center of the front surface of the camera body 131.
- the grip portion 133 is provided on the front left side of the camera body 131, and the photographer grips this grip portion 133.
- the monitor 134 is provided on the left side of the center of the rear surface of the camera body 131.
- the electronic viewfinder 135 is provided above the monitor 134 on the back of the camera body 131. By looking through the electronic viewfinder 135, the photographer can view the optical image of the subject guided from the photographing lens unit 132 and determine the composition.
- the technology according to the above embodiments and the like can be applied to the electronic viewfinder 135 .
- FIG. 54 shows an example of the appearance of the television device 140.
- Television apparatus 140 has image display screen portion 141 including front panel 142 and filter glass 143 .
- the technology according to the above embodiments and the like can be applied to the video display screen unit 141 .
- FIG. 55 shows an example of the appearance of smartphone 150 .
- the smartphone 150 has a display unit 151 that displays various types of information, and an operation unit 152 that includes buttons and the like for receiving operation input by the user.
- the technology according to the above embodiments and the like can be applied to this display unit 151 .
- FIGS. 56A and 56B show a configuration example of a vehicle to which the technology of the present disclosure is applied.
- FIG. 56A shows an example of the interior of the vehicle 200 viewed from the rear, and FIG. 56B shows an example of the interior viewed from the left rear of the vehicle.
- the vehicle of FIGS. 56A and 56B has a center display 201, a console display 202, a head-up display 203, a digital rear mirror 204, a steering wheel display 205, and a rear entertainment display 206.
- the center display 201 is arranged on the dashboard 261 at a location facing the driver's seat 262 and the front passenger's seat 263 .
- FIG. 56A shows an example of a horizontally elongated center display 201 extending from the driver's seat 262 side to the front passenger's seat 263 side, but the screen size and arrangement location of the center display 201 are not limited to this.
- Center display 201 can display information detected by various sensors. As a specific example, the center display 201 can display an image captured by an image sensor, an image showing the distance to obstacles in front of and to the side of the vehicle measured by a ToF sensor, and the body temperature of a passenger detected by an infrared sensor.
- Center display 201 can be used, for example, to display at least one of safety-related information, operation-related information, lifelogs, health-related information, authentication/identification-related information, and entertainment-related information.
- Safety-related information is information based on sensor detection results, such as detection of dozing off, detection of looking away, detection of mischief by children in the car, seatbelt wearing status, and detection of an occupant left behind.
- the operation-related information is information of a gesture related to the operation of the occupant detected using a sensor. Gestures may include operations of various facilities in the vehicle, such as operations of an air conditioner, a navigation device, an AV (Audio Visual) device, a lighting device, and the like.
- the lifelog includes lifelogs of all occupants. For example, the lifelog includes activity records of each occupant. By acquiring and storing the lifelog, it is possible to check what kind of condition an occupant was in when an accident occurred.
- Health-related information includes the occupant's body temperature detected using a temperature sensor and the occupant's health condition inferred based on the detected body temperature. Alternatively, the occupant's health condition may be inferred based on the occupant's face imaged by an image sensor. Also, the occupant's health condition may be estimated based on the content of the occupant's responses obtained by holding a conversation with the occupant using automated voice.
- Authentication/identification-related information includes information on functions such as a keyless entry function that performs face authentication using a sensor, and a function of automatically adjusting the seat height and position through face recognition.
- the entertainment-related information includes operation information of the AV apparatus by the occupant detected by the sensor, content information suitable for display to the occupant detected and recognized by the sensor, and the like.
- the console display 202 can be used, for example, to display lifelog information.
- Console display 202 is located near shift lever 265 on center console 264 between driver's seat 262 and passenger's seat 263 .
- the console display 202 is also capable of displaying information sensed by various sensors. Also, the console display 202 may display an image of the surroundings of the vehicle captured by an image sensor, or an image showing the distance to obstacles around the vehicle.
- the head-up display 203 is virtually displayed behind the windshield 266 in front of the driver's seat 262 .
- the head-up display 203 can be used to display at least one of safety-related information, operation-related information, lifelogs, health-related information, authentication/identification-related information, and entertainment-related information, for example. Since the head-up display 203 is often placed virtually in front of the driver's seat 262, it is suitable for displaying information directly related to the operation of the vehicle, such as the vehicle speed, the remaining fuel, and the remaining battery level.
- the digital rear mirror 204 can display not only the rear of the vehicle, but also the state of the passengers in the rear seats, so it can be used, for example, to display the lifelog information of the passengers in the rear seats.
- the steering wheel display 205 is arranged near the center of the steering wheel 267 of the vehicle.
- Steering wheel display 205 can be used, for example, to display at least one of safety-related information, operation-related information, lifelogs, health-related information, authentication/identification-related information, and entertainment-related information.
- For example, the steering wheel display 205 can display lifelog information such as the driver's body temperature, and information regarding the operation of AV equipment and air conditioning equipment.
- the rear entertainment display 206 is attached to the rear side of the driver's seat 262 and the front passenger's seat 263, and is for viewing by passengers in the rear seats.
- Rear entertainment display 206 can be used, for example, to display at least one of safety-related information, operation-related information, lifelogs, health-related information, authentication/identification-related information, and entertainment-related information.
- the rear entertainment display 206 may display, for example, information relating to the operation of AV equipment and air conditioning equipment, or the result of measuring the body temperature of a passenger in the rear seats with a temperature sensor.
- the technology according to the above embodiments and the like can be applied to the center display 201, the console display 202, the head-up display 203, the digital rear mirror 204, the steering wheel display 205, and the rear entertainment display 206.
- Although the present technology has been applied to a head-mounted display in the above embodiments and the like, it is not limited to this, and can be applied to various electronic devices capable of displaying images, such as monitors and projectors.
- This technology can be applied not only to the closed systems shown in the above embodiments and the like, but also to video see-through systems and mixed reality systems.
- This technology can also be applied to various simulators such as flight simulators, gaming, and projection mapping.
- This technology can be configured as follows. According to the present technology having the following configuration, image quality can be improved.
- (1) A display device including: a receiving circuit capable of receiving first image data representing a whole image of a first resolution, and second image data representing a first partial image corresponding to a portion of the whole image and having a second resolution higher than the first resolution; a display unit having a plurality of pixels; and a display drive circuit capable of performing first driving of driving the plurality of pixels in units of a first number of pixels based on the first image data, and second driving of driving, based on the second image data, two or more pixels provided in a region corresponding to the first partial image among the plurality of pixels in units of a second number of pixels smaller than the first number.
- (2) The display device according to (1), wherein the receiving circuit is capable of receiving the second image data after receiving the first image data, and the display drive circuit is capable of performing the second driving after performing the first driving.
- (3) The display device according to (1) or (2), including: a first sensor capable of detecting which portion of the display area of the display unit the user is observing; and a transmission circuit capable of transmitting the detection result of the first sensor to an image generating device capable of generating the first image data and the second image data, wherein the first partial image is an image corresponding to the detection result of the first sensor.
- (4) The display device according to (1) or (2), wherein the receiving circuit is capable of receiving a plurality of pieces of the second image data, and the plurality of first partial images represented by the plurality of pieces of the second image data are different from each other.
- (5) The display device according to any one of (1) to (4), wherein the receiving circuit is capable of alternately receiving the first image data and one or more pieces of the second image data.
- (6) The display device according to any one of (1) to (4), wherein the receiving circuit is capable of alternately receiving the first image data and the second image data in a first period, and of continuously receiving, in a second period, the first image data out of the first image data and the second image data.
- (7) The display device according to any one of (1) to (6), including: a second sensor capable of detecting a change in the posture of the display device; and a transmission circuit capable of transmitting the detection result of the second sensor to an image generation device capable of generating the first image data and the second image data.
- (9) The display device according to any one of (1) to (7), wherein the receiving circuit is further capable of receiving third image data representing a second partial image corresponding to a portion of the first partial image and having a third resolution higher than the second resolution, and the display driving circuit is capable of performing, based on the third image data, third driving of driving two or more pixels of the plurality of pixels provided in a region corresponding to the second partial image, in units of a third number of pixels smaller than the second number. (10) The display device according to any one of (1) to (9), wherein a data amount of the first image data and a data amount of the second image data are equal to each other.
- (11) The display device according to any one of (1) to (10), further comprising a decompression circuit capable of decompressing compressed image data out of the first image data and the second image data, wherein at least one of the first image data and the second image data is compressed.
- (12) The display device according to any one of (1) to (10), further comprising a decompression circuit capable of decompressing compressed image data out of the first image data and the second image data, wherein both the first image data and the second image data are compressed, and the compression rate of the first image data and the compression rate of the second image data differ from each other.
- (13) A display system comprising: an image generation device capable of transmitting first image data representing an entire image of a first resolution, and second image data representing a first partial image corresponding to a portion of the entire image and having a second resolution higher than the first resolution; and a display device, wherein the display device includes: a receiving circuit capable of receiving the first image data and the second image data; a display unit having a plurality of pixels; and a display driving circuit capable of performing, based on the first image data, first driving of driving the plurality of pixels in units of a first number of pixels, and, based on the second image data, second driving of driving two or more pixels of the plurality of pixels provided in a region corresponding to the first partial image, in units of a second number of pixels smaller than the first number. (14)
- the display device includes a first sensor capable of detecting which part of the display area of the display unit the user is observing; a transmission circuit capable of transmitting a detection result of the first sensor to the image generation device;
- the image generation device is capable of receiving the detection result of the first sensor transmitted from the transmission circuit, is capable of generating the first partial image based on the detection result of the first sensor, and is capable of generating the second image data representing the first partial image; the display system according to (13) above.
- (15) The display system according to (13), wherein the image generation device is capable of generating the first partial image by detecting a portion of the entire image where the image changes, and is capable of generating the second image data representing the first partial image.
- (16) the display device includes a second sensor capable of detecting a change in attitude of the display device; and a transmission circuit capable of transmitting a detection result of the second sensor to the image generation device;
- the image generation device is capable of receiving the detection result of the second sensor transmitted from the transmission circuit, and is capable of deciding, based on the detection result of the second sensor, which of the first image data and the second image data to transmit;
- the display system according to any one of (13) to (15) above.
- (17) A display driving method comprising: transmitting first image data representing an entire image of a first resolution, and second image data representing a first partial image corresponding to a portion of the entire image and having a second resolution higher than the first resolution; receiving the first image data and the second image data; performing, based on the first image data, first driving of driving a plurality of pixels in units of a first number of pixels; and performing, based on the second image data, second driving of driving, in units of a second number of pixels smaller than the first number, two or more pixels provided in a region corresponding to the first partial image.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
- Optics & Photonics (AREA)
Abstract
Description
1. Embodiment
2. Application Examples
[Configuration Example]
FIG. 1 illustrates a configuration example of a display system (display system 1) according to an embodiment. A display device and a display driving method according to embodiments of the present disclosure are embodied by this embodiment and are therefore described together.
Next, the operation and workings of the display system 1 of the present embodiment are described.
First, an overview of the overall operation of the display system 1 is described with reference to FIG. 1. The receiving circuit 13 of the image generation device 10 receives the detection signal SD transmitted from the head-mounted display 20, and supplies data on the detection result of the acceleration sensor 22 and the detection result of the eye-tracking sensor 23 contained in the detection signal SD to the image generation circuit 11. Based on the detection result of the acceleration sensor 22 contained in the data supplied from the receiving circuit 13, the image generation circuit 11 generates an entire image P1 showing the scenery in the virtual space corresponding to the orientation of the head-mounted display 20. Based on the detection result of the eye-tracking sensor 23 contained in the data supplied from the receiving circuit 13, the image generation circuit 11 also generates a partial image P2 showing the portion of that scenery that the user is looking at. The transmission circuit 12 generates image data DT1 representing the entire image P1 based on the image data of the entire image P1, and generates image data DT2 representing the partial image P2 based on the image data of the partial image P2 and data on the position of the partial image P2. The transmission circuit 12 then transmits an image signal SP containing the image data DT1 and the image data DT2 to the head-mounted display 20.
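The flow above, a low-resolution entire image P1 plus a full-resolution crop P2 around the gaze point, can be sketched roughly as follows. This is an illustrative model only; the helper names (`make_dt1`, `make_dt2`) and the data layout are our own assumptions, not part of the embodiment.

```python
# Illustrative sketch: deriving DT1/DT2 from one source frame.
# A frame is a 2-D list of pixel values; gaze is (x, y) in frame coordinates.

def make_dt1(frame, factor=2):
    """Low-resolution entire image P1: keep every `factor`-th pixel."""
    return [row[::factor] for row in frame[::factor]]

def make_dt2(frame, gaze, size):
    """High-resolution partial image P2: a full-resolution crop around the gaze,
    together with its position, as DT2 carries position data for P2."""
    gx, gy = gaze
    x0 = min(max(gx - size // 2, 0), len(frame[0]) - size)
    y0 = min(max(gy - size // 2, 0), len(frame) - size)
    crop = [row[x0:x0 + size] for row in frame[y0:y0 + size]]
    return {"pos": (x0, y0), "pixels": crop}

frame = [[x + 10 * y for x in range(8)] for y in range(8)]
dt1 = make_dt1(frame)             # 4x4 entire image at half resolution
dt2 = make_dt2(frame, (5, 5), 4)  # 4x4 full-resolution crop near the gaze
```

The transmission circuit would then pack `dt1` and `dt2` (with its position) into the image signal SP.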
The head-mounted display 20 generates a display image P20 based on the image data DT1 and the image data DT2 supplied in a time-division manner.
As described above, in the present embodiment, first image data representing a low-resolution entire image and second image data representing a high-resolution partial image corresponding to a portion of the entire image are received. Then, based on the first image data, first driving is performed that drives the plurality of pixels in units of four pixels, and, based on the second image data, second driving is performed that drives two or more pixels provided in a region corresponding to the partial image in units of one pixel. This makes it possible to improve image quality.
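The two driving modes above can be modeled in software as follows: the first driving writes one value per 2×2 block (four pixels as a unit), and the second driving overwrites the partial-image region pixel by pixel. A minimal sketch with hypothetical names; in the device this is done by the display driving circuit, not in software.

```python
def first_driving(p1, unit=2):
    """First driving: each low-resolution value of P1 is replicated over a
    `unit` x `unit` block of pixels (four pixels per unit when unit=2)."""
    panel = []
    for row in p1:
        expanded = [v for v in row for _ in range(unit)]
        panel.extend([list(expanded) for _ in range(unit)])
    return panel

def second_driving(panel, p2, pos):
    """Second driving: drive the partial-image region one pixel at a time."""
    x0, y0 = pos
    for dy, row in enumerate(p2):
        for dx, v in enumerate(row):
            panel[y0 + dy][x0 + dx] = v
    return panel

p1 = [[1, 2], [3, 4]]                             # low-resolution entire image
panel = first_driving(p1)                         # 4x4 panel of 2x2 blocks
panel = second_driving(panel, [[9, 9], [9, 9]], (0, 0))  # high-res overwrite
```

The panel thus shows coarse blocks everywhere except the partial-image region, which is driven at full resolution.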
In the above embodiment, the image generation circuit 11 generates the partial image P2 based on the detection result of the eye-tracking sensor 23, but this is not restrictive. This modification is described in detail below.
In the above embodiment, the image generated by the image generation circuit 11 is displayed, but this is not restrictive. For example, as in a display system 1B shown in FIG. 14, so-called video see-through may be realized by additionally displaying an image captured by an image sensor 28B. The display system 1B includes a head-mounted display 20B. The head-mounted display 20B has the image sensor 28B and a processor 24B.
In the above embodiment, when the entire image P1 is displayed, the plurality of pixels PIX of the display panel 27 are driven in units of four pixels PIX, but this is not restrictive. Instead, as shown in FIGS. 15 and 16, for example, various numbers of pixels PIX can be used as the driving unit. For example, the driving unit may be two (2×1) pixels PIX arranged side by side in the horizontal direction, eight (4×2) pixels PIX consisting of four pixels PIX in the horizontal direction and two in the vertical direction, or 32 (8×4) pixels PIX consisting of eight pixels PIX in the horizontal direction and four in the vertical direction. The driving unit may also be two (1×2) pixels PIX arranged side by side in the vertical direction, eight (2×4) pixels PIX consisting of two pixels PIX in the horizontal direction and four in the vertical direction, or 32 (4×8) pixels PIX consisting of four pixels PIX in the horizontal direction and eight in the vertical direction. The driving unit may further be four (2×2) pixels PIX consisting of two pixels PIX in each direction, 16 (4×4) pixels PIX consisting of four pixels PIX in each direction, or 64 (8×8) pixels PIX consisting of eight pixels PIX in each direction. Although the driving units in this example contain even numbers of pixels PIX, this is not restrictive; for example, odd numbers of pixels PIX may also be used as the driving unit.
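The variable m×n driving units enumerated above can be expressed generically: a unit of (4, 2) means four pixels horizontally and two vertically share one value. The helper below is a hypothetical illustration of that parameterization, not circuitry from the embodiment.

```python
def drive_in_units(values, unit):
    """Expand a low-resolution image so each value covers a w x h pixel block,
    where unit = (horizontal pixels, vertical pixels) per driving unit."""
    w, h = unit
    panel = []
    for row in values:
        expanded = [v for v in row for _ in range(w)]   # repeat horizontally
        panel.extend([list(expanded) for _ in range(h)])  # repeat vertically
    return panel

lowres = [[1, 2], [3, 4]]
a = drive_in_units(lowres, (2, 1))  # 2x1 units: 2 rows of 4 pixels
b = drive_in_units(lowres, (4, 2))  # 4x2 units: 4 rows of 8 pixels
```

Any of the unit sizes named in the text (2×1, 4×2, 8×4, 1×2, 2×4, 4×8, 2×2, 4×4, 8×8, or odd sizes) drops into the same parameter.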
In the above embodiment, as shown in FIG. 4B, the horizontal pixel-count ratio of the partial image P2 is set to 50% and its vertical pixel-count ratio is set to 50%, but this is not restrictive. Instead, various values can be used for the horizontal and vertical pixel-count ratios of the partial image P2.
In the above embodiment, the image data DT1 representing the entire image P1 and the image data DT2 representing the partial image P2 are transmitted alternately, but this is not restrictive. For example, as shown in FIG. 30, the image data to be transmitted may be switched based on the detection result of the acceleration sensor 22 or the detection result of the eye-tracking sensor 23. This display system 1 basically transmits the image data DT1 representing the entire image P1 and the image data DT2 representing the partial image P2 alternately. In this example, the user turns the head significantly during a period Tmotion. The acceleration sensor 22 detects movement such as the orientation of the head-mounted display 20, and the transmission circuit 25 of the head-mounted display 20 transmits a detection signal SD containing the detection result of the acceleration sensor 22 to the image generation device 10. The receiving circuit 13 of the image generation device 10 receives the detection signal SD, and, based on the detection result of the acceleration sensor 22, the image generation circuit 11 keeps generating the entire image P1 repeatedly during the period Tmotion in which the orientation of the head-mounted display 20 continues to change. The transmission circuit 12 generates the image data DT1 representing the entire image P1 based on the image data of the entire image P1, and transmits an image signal SP containing the image data DT1. That is, during the period Tmotion, the transmission circuit 12 of the image generation device 10 keeps transmitting the image data DT1 repeatedly. The receiving circuit 21 of the head-mounted display 20 receives the image data DT1, and the head-mounted display 20 displays a display image P21 containing the entire image P1 based on the image data DT1. That is, during the period Tmotion, the head-mounted display 20 keeps displaying the display image P21 containing the entire image P1 repeatedly. When the user stops turning the head, the period Tmotion ends, and the display system 1 again transmits the image data DT1 representing the entire image P1 and the image data DT2 representing the partial image P2 alternately. As a result, when the user moves the head, the user can see the display image P21 containing the entire image P1 corresponding to the head movement with short latency, which reduces the risk of the user feeling motion sickness.
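The switching behavior of this modification can be summarized as a small scheduler: while the attitude is changing (period Tmotion), only DT1 is sent; otherwise DT1 and DT2 alternate. The function name and the boolean motion flags are our own illustrative abstraction of the acceleration-sensor decision.

```python
def schedule(motion_flags):
    """Return the per-slot image-data sequence: 'DT1' only while moving,
    alternating 'DT1'/'DT2' while still."""
    seq, send_dt2 = [], False
    for moving in motion_flags:
        if moving:
            seq.append("DT1")   # short-latency entire image during Tmotion
            send_dt2 = False    # restart the alternation with DT1 afterwards
        else:
            seq.append("DT2" if send_dt2 else "DT1")
            send_dt2 = not send_dt2
    return seq

out = schedule([False, False, True, True, False, False])
```

The two `True` slots in the middle model the period Tmotion, during which only the entire image P1 is transmitted.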
In the above embodiment, one piece of image data DT1 and one piece of image data DT2 are transmitted alternately, but this is not restrictive; for example, one piece of image data DT1 and a plurality of pieces of image data DT2 may be transmitted alternately. The number of pieces of image data DT2 may also be made changeable. An example in which one piece of image data DT1 and three pieces of image data DT2 are transmitted alternately is described in detail below.
In the above embodiment, two subframes (the entire image P1 and the partial image P2) are provided, but this is not restrictive; three or more subframes may be provided instead. An example with three subframes is described in detail below.
In the above embodiment, the image data of the entire image P1 and the image data of the partial image P2 are transmitted without compression, but this is not restrictive; for example, they may be compressed. A display system 1C according to this modification is described in detail below.
In the above embodiment, the head-mounted display 20C is provided with the acceleration sensor 22 and the eye-tracking sensor 23, but this is not restrictive; these sensors may be omitted, as in a display system 1D shown in FIG. 41, for example. The display system 1D includes an image generation device 10D and a head-mounted display 20D. The image generation device 10D has an image generation circuit 11D. The image generation circuit 11D generates the entire image P1 to be displayed on the head-mounted display 20D. The image generation circuit 11D also generates a partial image P2 containing a portion of the entire image P1 where the image changes. The head-mounted display 20D has a processor 24D. The processor 24D performs, for example, predetermined image processing based on the image data DT1 and DT2 supplied from the receiving circuit 21, and supplies the image data of the entire image P1 contained in the image data DT1, the image data of the partial image P2 contained in the image data DT2, and data on the position of the partial image P2 contained in the image data DT2 to the display controller 26.
The above embodiment can be applied to various color schemes, as shown in FIGS. 42A to 42C.
In the above embodiment, the display panel 27 shown in FIG. 6 is used, but this is not restrictive. A display panel 27E according to this modification is described in detail below.
Two or more of these modifications may also be combined.
Next, application examples of the display systems described in the above embodiment and modifications are described.
FIG. 51 illustrates an example of the appearance of a head-mounted display 110. The head-mounted display 110 has, for example, ear hook sections 112 for wearing on the user's head on both sides of an eyeglass-shaped display section 111. The technology according to the above embodiment and other examples can be applied to such a head-mounted display 110.
FIG. 52 illustrates an example of the appearance of another head-mounted display 120. The head-mounted display 120 is a see-through head-mounted display having a main body section 121, an arm section 122, and a lens barrel section 123, and is mounted on eyeglasses 128. The main body section 121 has a control board for controlling the operation of the head-mounted display 120 and a display section. The display section emits image light of a display image. The arm section 122 couples the main body section 121 to the lens barrel section 123 and supports the lens barrel section 123. The lens barrel section 123 projects the image light supplied from the main body section 121 via the arm section 122 toward the user's eye through a lens 129 of the eyeglasses 128. The technology according to the above embodiment and other examples can be applied to such a head-mounted display 120.
FIGS. 53A and 53B illustrate an example of the appearance of a digital still camera 130; FIG. 53A shows a front view, and FIG. 53B shows a rear view. The digital still camera 130 is an interchangeable-lens single-lens reflex camera and has a camera main body section (camera body) 131, a photographing lens unit 132, a grip section 133, a monitor 134, and an electronic viewfinder 135. The photographing lens unit 132 is an interchangeable lens unit and is provided approximately at the center of the front of the camera main body section 131. The grip section 133 is provided on the left side of the front of the camera main body section 131 and is held by the photographer. The monitor 134 is provided to the left of approximately the center of the rear of the camera main body section 131. The electronic viewfinder 135 is provided above the monitor 134 on the rear of the camera main body section 131. By looking through the electronic viewfinder 135, the photographer can view the optical image of the subject guided from the photographing lens unit 132 and determine the composition. The technology according to the above embodiment and other examples can be applied to the electronic viewfinder 135.
FIG. 54 illustrates an example of the appearance of a television apparatus 140. The television apparatus 140 has a video display screen section 141 including a front panel 142 and a filter glass 143. The technology according to the above embodiment and other examples can be applied to the video display screen section 141.
FIG. 55 illustrates an example of the appearance of a smartphone 150. The smartphone 150 has a display section 151 that displays various kinds of information and an operation section 152 including buttons that receive operation inputs from the user. The technology according to the above embodiment and other examples can be applied to the display section 151.
FIGS. 56A and 56B illustrate a configuration example of a vehicle to which the technology of the present disclosure is applied; FIG. 56A shows an example of the interior of a vehicle 200 viewed from the rear of the vehicle 200, and FIG. 56B shows an example of the interior viewed from the left rear of the vehicle 200.
a receiving circuit capable of receiving first image data representing an entire image of a first resolution, and second image data representing a first partial image corresponding to a portion of the entire image and having a second resolution higher than the first resolution;
a display unit having a plurality of pixels; and
a display driving circuit capable of performing, based on the first image data, first driving of driving the plurality of pixels in units of a first number of pixels, and, based on the second image data, second driving of driving two or more pixels of the plurality of pixels provided in a region corresponding to the first partial image, in units of a second number of pixels smaller than the first number.
A display device comprising the above.
(2)
The receiving circuit is capable of receiving the second image data after receiving the first image data, and
the display driving circuit is capable of performing the second driving after performing the first driving.
The display device according to (1) above.
(3)
The display device further includes: a first sensor capable of detecting which part of the display area of the display unit the user is observing; and
a transmission circuit capable of transmitting a detection result of the first sensor to an image generation device capable of generating the first image data and the second image data,
wherein
the first partial image is an image corresponding to the detection result of the first sensor.
The display device according to (1) or (2) above.
(4)
The receiving circuit is capable of receiving a plurality of pieces of the second image data, and
the plurality of first partial images respectively represented by the plurality of pieces of second image data differ from one another.
The display device according to (1) or (2) above.
(5)
The receiving circuit is capable of alternately receiving the first image data and one or more pieces of the second image data.
The display device according to any one of (1) to (4) above.
(6)
The receiving circuit is capable of alternately receiving the first image data and the second image data in a first period, and is capable of successively receiving, in a second period, the first image data out of the first image data and the second image data.
The display device according to any one of (1) to (4) above.
(7)
The display device further includes: a second sensor capable of detecting a change in attitude of the display device; and
a transmission circuit capable of transmitting a detection result of the second sensor to an image generation device capable of generating the first image data and the second image data,
wherein
the second period corresponds to a period in which the attitude of the display device is changing.
The display device according to (6) above.
(8)
The second number is 1.
The display device according to any one of (1) to (7) above.
(9)
The receiving circuit is further capable of receiving third image data representing a second partial image corresponding to a portion of the first partial image and having a third resolution higher than the second resolution, and
the display driving circuit is capable of performing, based on the third image data, third driving of driving two or more pixels of the plurality of pixels provided in a region corresponding to the second partial image, in units of a third number of pixels smaller than the second number.
The display device according to any one of (1) to (7) above.
(10)
A data amount of the first image data and a data amount of the second image data are equal to each other.
The display device according to any one of (1) to (9) above.
(11)
A decompression circuit capable of decompressing compressed image data out of the first image data and the second image data is further provided, and
at least one of the first image data and the second image data is compressed.
The display device according to any one of (1) to (10) above.
(12)
A decompression circuit capable of decompressing compressed image data out of the first image data and the second image data is further provided, and
both the first image data and the second image data are compressed, and the compression rate of the first image data and the compression rate of the second image data differ from each other.
The display device according to any one of (1) to (10) above.
(13)
An image generation device capable of transmitting first image data representing an entire image of a first resolution, and second image data representing a first partial image corresponding to a portion of the entire image and having a second resolution higher than the first resolution; and
a display device,
wherein the display device includes:
a receiving circuit capable of receiving the first image data and the second image data;
a display unit having a plurality of pixels; and
a display driving circuit capable of performing, based on the first image data, first driving of driving the plurality of pixels in units of a first number of pixels, and, based on the second image data, second driving of driving two or more pixels of the plurality of pixels provided in a region corresponding to the first partial image, in units of a second number of pixels smaller than the first number.
A display system comprising the above.
(14)
The display device further includes: a first sensor capable of detecting which part of the display area of the display unit the user is observing; and
a transmission circuit capable of transmitting a detection result of the first sensor to the image generation device,
wherein
the image generation device is capable of receiving the detection result of the first sensor transmitted from the transmission circuit, is capable of generating the first partial image based on the detection result of the first sensor, and is capable of generating the second image data representing the first partial image.
The display system according to (13) above.
(15)
The image generation device is capable of generating the first partial image by detecting a portion of the entire image where the image changes, and is capable of generating the second image data representing the first partial image.
The display system according to (13) above.
(16)
The display device further includes: a second sensor capable of detecting a change in attitude of the display device; and
a transmission circuit capable of transmitting a detection result of the second sensor to the image generation device,
wherein
the image generation device is capable of receiving the detection result of the second sensor transmitted from the transmission circuit, and is capable of deciding, based on the detection result of the second sensor, which of the first image data and the second image data to transmit.
The display system according to any one of (13) to (15) above.
(17)
Transmitting first image data representing an entire image of a first resolution, and second image data representing a first partial image corresponding to a portion of the entire image and having a second resolution higher than the first resolution;
receiving the first image data and the second image data;
performing, based on the first image data, first driving of driving a plurality of pixels in units of a first number of pixels; and
performing, based on the second image data, second driving of driving two or more pixels of the plurality of pixels provided in a region corresponding to the first partial image, in units of a second number of pixels smaller than the first number.
A display driving method including the above.
Claims (17)
- A display device comprising: a receiving circuit capable of receiving first image data representing an entire image of a first resolution, and second image data representing a first partial image corresponding to a portion of the entire image and having a second resolution higher than the first resolution; a display unit having a plurality of pixels; and a display driving circuit capable of performing, based on the first image data, first driving of driving the plurality of pixels in units of a first number of pixels, and, based on the second image data, second driving of driving two or more pixels of the plurality of pixels provided in a region corresponding to the first partial image, in units of a second number of pixels smaller than the first number.
- The display device according to claim 1, wherein the receiving circuit is capable of receiving the second image data after receiving the first image data, and the display driving circuit is capable of performing the second driving after performing the first driving.
- The display device according to claim 1, further comprising: a first sensor capable of detecting which part of the display area of the display unit the user is observing; and a transmission circuit capable of transmitting a detection result of the first sensor to an image generation device capable of generating the first image data and the second image data, wherein the first partial image is an image corresponding to the detection result of the first sensor.
- The display device according to claim 1, wherein the receiving circuit is capable of receiving a plurality of pieces of the second image data, and the plurality of first partial images respectively represented by the plurality of pieces of second image data differ from one another.
- The display device according to claim 1, wherein the receiving circuit is capable of alternately receiving the first image data and one or more pieces of the second image data.
- The display device according to claim 1, wherein the receiving circuit is capable of alternately receiving the first image data and the second image data in a first period, and is capable of successively receiving, in a second period, the first image data out of the first image data and the second image data.
- The display device according to claim 6, further comprising: a second sensor capable of detecting a change in attitude of the display device; and a transmission circuit capable of transmitting a detection result of the second sensor to an image generation device capable of generating the first image data and the second image data, wherein the second period corresponds to a period in which the attitude of the display device is changing.
- The display device according to claim 1, wherein the second number is 1.
- The display device according to claim 1, wherein the receiving circuit is further capable of receiving third image data representing a second partial image corresponding to a portion of the first partial image and having a third resolution higher than the second resolution, and the display driving circuit is capable of performing, based on the third image data, third driving of driving two or more pixels of the plurality of pixels provided in a region corresponding to the second partial image, in units of a third number of pixels smaller than the second number.
- The display device according to claim 1, wherein a data amount of the first image data and a data amount of the second image data are equal to each other.
- The display device according to claim 1, further comprising a decompression circuit capable of decompressing compressed image data out of the first image data and the second image data, wherein at least one of the first image data and the second image data is compressed.
- The display device according to claim 1, further comprising a decompression circuit capable of decompressing compressed image data out of the first image data and the second image data, wherein both the first image data and the second image data are compressed, and the compression rate of the first image data and the compression rate of the second image data differ from each other.
- A display system comprising: an image generation device capable of transmitting first image data representing an entire image of a first resolution, and second image data representing a first partial image corresponding to a portion of the entire image and having a second resolution higher than the first resolution; and a display device, the display device including: a receiving circuit capable of receiving the first image data and the second image data; a display unit having a plurality of pixels; and a display driving circuit capable of performing, based on the first image data, first driving of driving the plurality of pixels in units of a first number of pixels, and, based on the second image data, second driving of driving two or more pixels of the plurality of pixels provided in a region corresponding to the first partial image, in units of a second number of pixels smaller than the first number.
- The display system according to claim 13, wherein the display device further includes: a first sensor capable of detecting which part of the display area of the display unit the user is observing; and a transmission circuit capable of transmitting a detection result of the first sensor to the image generation device, and the image generation device is capable of receiving the detection result of the first sensor transmitted from the transmission circuit, generating the first partial image based on the detection result of the first sensor, and generating the second image data representing the first partial image.
- The display system according to claim 13, wherein the image generation device is capable of generating the first partial image by detecting a portion of the entire image where the image changes, and generating the second image data representing the first partial image.
- The display system according to claim 13, wherein the display device further includes: a second sensor capable of detecting a change in attitude of the display device; and a transmission circuit capable of transmitting a detection result of the second sensor to the image generation device, and the image generation device is capable of receiving the detection result of the second sensor transmitted from the transmission circuit, and deciding, based on the detection result of the second sensor, which of the first image data and the second image data to transmit.
- A display driving method comprising: transmitting first image data representing an entire image of a first resolution, and second image data representing a first partial image corresponding to a portion of the entire image and having a second resolution higher than the first resolution; receiving the first image data and the second image data; performing, based on the first image data, first driving of driving a plurality of pixels in units of a first number of pixels; and performing, based on the second image data, second driving of driving two or more pixels of the plurality of pixels provided in a region corresponding to the first partial image, in units of a second number of pixels smaller than the first number.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280037924.1A CN117378002A (zh) | 2021-06-03 | 2022-05-23 | 显示装置、显示系统及显示驱动方法 |
JP2023525737A JPWO2022255147A1 (ja) | 2021-06-03 | 2022-05-23 | |
EP22815897.8A EP4350682A1 (en) | 2021-06-03 | 2022-05-23 | Display device, display system, and display drive method |
KR1020237039160A KR20240015633A (ko) | 2021-06-03 | 2022-05-23 | 표시 장치, 표시 시스템 및 표시 구동 방법 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-093716 | 2021-06-03 | ||
JP2021093716 | 2021-06-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022255147A1 true WO2022255147A1 (ja) | 2022-12-08 |
Family
ID=84323314
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/021117 WO2022255147A1 (ja) | 2021-06-03 | 2022-05-23 | 表示装置、表示システム、および表示駆動方法 |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP4350682A1 (ja) |
JP (1) | JPWO2022255147A1 (ja) |
KR (1) | KR20240015633A (ja) |
CN (1) | CN117378002A (ja) |
WO (1) | WO2022255147A1 (ja) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012077238A1 (ja) * | 2010-12-10 | 2012-06-14 | 富士通株式会社 | 立体視動画像生成装置、立体視動画像生成方法、立体視動画像生成プログラム |
US20140347363A1 (en) * | 2013-05-22 | 2014-11-27 | Nikos Kaburlasos | Localized Graphics Processing Based on User Interest |
US20190237021A1 (en) * | 2016-12-01 | 2019-08-01 | Shanghai Yunyinggu Technology Co., Ltd. | Zone-based display data processing and transmission |
US20190287495A1 (en) * | 2018-03-16 | 2019-09-19 | Magic Leap, Inc. | Depth based foveated rendering for display systems |
JP2019197224A (ja) | 2016-02-09 | 2019-11-14 | 株式会社ソニー・インタラクティブエンタテインメント | 映像表示システム |
JP2020021083A (ja) * | 2017-06-30 | 2020-02-06 | エルジー ディスプレイ カンパニー リミテッド | 表示装置及びそのゲート駆動回路 |
WO2020116348A1 (ja) * | 2018-12-04 | 2020-06-11 | 株式会社ソニー・インタラクティブエンタテインメント | 画像処理装置、画像処理システム、画像処理方法、及びプログラム |
JP2020523688A (ja) * | 2017-06-09 | 2020-08-06 | 株式会社ソニー・インタラクティブエンタテインメント | フォービエイテッド(中心窩)レンダリングシステムのための時間的スーパーサンプリング |
JP2021093716A (ja) | 2019-09-26 | 2021-06-17 | パナソニック株式会社 | 端末、基地局、及び、制御方法 |
Also Published As
Publication number | Publication date |
---|---|
CN117378002A (zh) | 2024-01-09 |
KR20240015633A (ko) | 2024-02-05 |
EP4350682A1 (en) | 2024-04-10 |
JPWO2022255147A1 (ja) | 2022-12-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107797280B (zh) | 个人沉浸式显示装置及其驱动方法 | |
CN110322818B (zh) | 显示装置及操作方法 | |
US10398976B2 (en) | Display controller, electronic device, and virtual reality device | |
JP5782787B2 (ja) | 表示装置および表示方法 | |
CN109427283B (zh) | 图像产生方法和使用该方法的显示装置 | |
JP2022517991A (ja) | 眼追跡に基づく動的レンダリング時間ターゲット化 | |
US10699673B2 (en) | Apparatus, systems, and methods for local dimming in brightness-controlled environments | |
US7091929B2 (en) | Method and apparatus for displaying images | |
WO2022255147A1 (ja) | 表示装置、表示システム、および表示駆動方法 | |
KR102422036B1 (ko) | 낮은 레이턴시 가상 현실을 구동 및 보상하는 표시장치 | |
WO2023100468A1 (ja) | 表示装置、表示システム、および表示方法 | |
WO2023084950A1 (ja) | 表示装置、表示システム、および表示駆動方法 | |
KR20170134147A (ko) | 디스플레이 컨트롤러, 전자 기기, 및 가상 현실 장치 | |
US9137522B2 (en) | Device and method for 3-D display control | |
WO2023176166A1 (ja) | 表示装置及び電子機器 | |
WO2023243474A1 (ja) | 表示装置 | |
WO2024048221A1 (ja) | 表示装置 | |
WO2023181652A1 (ja) | 表示装置 | |
WO2023182097A1 (ja) | 表示装置及びその駆動方法 | |
WO2024101213A1 (ja) | 表示装置 | |
WO2023013247A1 (ja) | 表示装置、電子機器、及び表示制御方法 | |
WO2024048268A1 (ja) | 表示装置、電子機器及び表示装置の駆動方法 | |
WO2023189312A1 (ja) | 表示装置 | |
WO2023007819A1 (ja) | 表示装置 | |
US11972718B2 (en) | Display device, electronic apparatus, and moving body |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22815897 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023525737 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18559455 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280037924.1 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022815897 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022815897 Country of ref document: EP Effective date: 20240103 |