WO2017138458A1 - Video Display System - Google Patents
Video Display System (映像表示システム)
- Publication number
- WO2017138458A1 (application PCT/JP2017/004065, JP2017004065W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display
- display device
- images
- video
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/25—Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
- H04N21/266—Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
- H04N21/2662—Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate of the video stream based on the client capabilities
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/39—Control of the bit-mapped memory
- G09G5/391—Resolution modifying circuits, e.g. variable screen formats
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/02—Viewing or reading apparatus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/003—Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
- G09G5/006—Details of the interface to the display terminal
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/37—Details of the operation on graphic patterns
- G09G5/377—Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/23418—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0147—Head-up displays characterised by optical features comprising a device modifying the resolution of the displayed image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0407—Resolution change, inclusive of the use of different resolutions for different screen areas
- G09G2340/0428—Gradation resolution change
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/045—Zooming at least part of an image, i.e. enlarging it or shrinking it
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2350/00—Solving problems of bandwidth in display systems
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/18—Use of a frame buffer in a display terminal, inclusive of the display panel
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/04—Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/003—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
Definitions
- The present invention relates to a video display system that displays video on a display device worn on the user's head, to an image processing device and a display device that constitute the video display system, and to a corresponding image processing method and program.
- A technique called foveated rendering is known, in which a region of interest in an image (a region the user is assumed to be looking at) is drawn at a higher resolution than the other regions. With such a technique, an image whose region of interest is rendered at high resolution can be generated with a smaller processing load than drawing the entire image at high resolution.
- However, the image finally displayed on the display device must have a high resolution matching that of the region of interest. Furthermore, a display device worn on the head, such as a head-mounted display, is expected to display video at a relatively high frame rate. Displaying video at both high resolution and high frame rate therefore requires transmitting image data at a high data rate from the image processing apparatus that generates the display image to the display apparatus.
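To make the bandwidth pressure concrete, the following sketch computes the raw (uncompressed) data rate for an assumed head-mounted panel. The resolution, color depth, and frame rate are illustrative assumptions, not figures from the patent.

```python
# Rough illustration of why full-resolution transmission is costly.
# All numbers below are assumed for illustration; the patent does not
# specify a panel resolution, color depth, or frame rate.

def uncompressed_bandwidth_gbps(width, height, bits_per_pixel, fps):
    """Raw video bandwidth in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

# e.g. a hypothetical 2160x1200 HMD panel at 90 fps with 24-bit color:
bw = uncompressed_bandwidth_gbps(2160, 1200, 24, 90)
print(f"{bw:.2f} Gbit/s")  # 5.60 Gbit/s
```

Even before stereo rendering or overheads, the raw rate is on the order of several gigabits per second, which motivates transmitting the multi-resolution pre-combination images instead of a full frame.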
- The present invention was made in view of the above circumstances. One of its objects is to keep the amount of transmitted data relatively small when displaying, on a display device, an image whose region of interest has a high resolution.
- a video display system, an image processing apparatus, a display apparatus, an image processing method, and a program are provided.
- The video display system according to the present invention includes a display device worn on the user's head and an image processing device that supplies the video displayed by the display device.
- The image processing device generates a plurality of pre-combination images that have different resolutions and are used for display by the display device, and transmits each of them to the display device.
- The display device includes a reception unit that receives the plurality of pre-combination images transmitted by the image processing device, and a display control unit that displays a display image obtained by synthesizing those pre-combination images.
- The image processing apparatus according to the present invention supplies video to a display device worn on the user's head.
- It includes an image generation unit that generates a plurality of pre-combination images having different resolutions, used for the composition of the display image displayed by the display device, and a transmission unit that transmits each of the pre-combination images to the display device.
- The display device according to the present invention is worn on the user's head and is connected to an image processing device that supplies it with video.
- It includes a receiving unit that receives a plurality of pre-combination images having different resolutions, and a display control unit that displays a display image obtained by combining those pre-combination images.
- The image processing method according to the present invention supplies video to a display device worn on the user's head.
- The method includes generating a plurality of pre-combination images having different resolutions, used for the composition of the display image displayed by the display device, and transmitting each of them to the display device.
- The program according to the present invention controls an image processing apparatus that supplies video to a display device worn on the user's head.
- It causes the image processing apparatus to function as an image generation unit that generates a plurality of pre-combination images having different resolutions, used for the composition of the display image displayed by the display device, and as a transmission unit that transmits each of them to the display device.
- This program may be provided by being stored in a computer-readable non-transitory information storage medium.
- FIG. 1 is a block diagram showing the overall configuration of a video display system according to an embodiment of the present invention. Another drawing is a functional block diagram showing the functions implemented by the video display system according to this embodiment.
- FIG. 1 is a block diagram showing the configuration of a video display system 1 according to an embodiment of the present invention.
- the video display system 1 includes an image processing device 10, an operation device 20, a relay device 30, and a display device 40.
- the image processing device 10 is a device that generates and supplies an image to be displayed by the display device 40, and may be, for example, a home game machine, a portable game machine, a personal computer, a smartphone, a tablet, or the like. As illustrated in FIG. 1, the image processing apparatus 10 includes a control unit 11, a storage unit 12, and an interface unit 13.
- The control unit 11 includes at least one processor such as a CPU, and performs various types of information processing by executing programs stored in the storage unit 12. Specific examples of the processing performed by the control unit 11 in this embodiment are described later.
- the storage unit 12 includes at least one memory device such as a RAM, and stores a program executed by the control unit 11 and data processed by the program.
- the interface unit 13 is an interface for data communication between the operation device 20 and the relay device 30.
- The image processing apparatus 10 is connected to the operation device 20 and the relay device 30, either by wire or wirelessly, via the interface unit 13.
- the interface unit 13 may include a multimedia interface such as HDMI (High-Definition Multimedia Interface: registered trademark) in order to transmit video and audio supplied by the image processing device 10 to the relay device 30.
- the interface unit 13 includes a data communication interface such as Bluetooth (registered trademark) or USB.
- The image processing apparatus 10 receives various types of information from the display device 40 via the relay device 30, and transmits control signals and the like, through the data communication interface. It also accepts operation signals transmitted from the operation device 20 through the same interface.
- The operation device 20 is, for example, a home game console controller or a keyboard, and accepts operation input from the user.
- the operation device 20 transmits a signal indicating the content of the operation input received from the user to the image processing apparatus 10.
- The relay device 30 is connected to the display device 40; it receives the image data supplied from the image processing device 10 and transmits it to the display device 40. At this time, the relay device 30 may, as necessary, apply a correction process to the supplied image data that cancels the distortion caused by the optical system of the display device 40, and output the corrected image data. Besides image data, the relay device 30 relays various other information exchanged between the image processing device 10 and the display device 40, such as audio data and control signals. In the present embodiment, the relay device 30 is assumed to exchange data with the display device 40 by wireless communication.
- The display device 40 displays video corresponding to the image data received from the relay device 30, allowing the user to view it.
- The display device 40 is worn on the user's head and used to view video with both eyes. That is, the display device 40 forms an image in front of each of the user's right and left eyes.
- the display device 40 may be configured to be able to display a stereoscopic video using binocular parallax.
- the display device 40 includes a processing unit 41, a communication interface 42, a buffer memory 43, a video display element 44, an optical element 45, and a rear camera 46.
- The processing unit 41 includes an integrated circuit and the like; it generates a frame image (display image) based on the image data received from the image processing device 10 via the relay device 30, and supplies it to the video display element 44. By repeating this processing at a predetermined frame rate, the processing unit 41 causes the video display element 44 to display video.
- the communication interface 42 is an interface for performing data communication with the relay device 30 and includes an antenna, a communication circuit, and the like for wireless communication.
- the image data received by the communication interface 42 from the relay device 30 is temporarily stored in the buffer memory 43.
- the processing unit 41 generates a frame image based on the image data stored in the buffer memory 43.
- the video display element 44 is, for example, an organic EL display panel or a liquid crystal display panel, and displays a video corresponding to the video signal supplied from the processing unit 41.
- the video display element 44 displays two videos, a left-eye video and a right-eye video.
- the video display element 44 may be a single display element that displays the left-eye video and the right-eye video side by side, or may be configured by two display elements that display each video independently. Further, a known smartphone or the like may be used as the video display element 44.
- the display device 40 may be a retinal irradiation type (retinal projection type) device that directly projects an image on a user's retina.
- the video display element 44 may be configured by a laser that emits light and a MEMS (Micro Electro Mechanical Systems) mirror that scans the light.
- the optical element 45 is a hologram, a prism, a half mirror, or the like, and is disposed in front of the user's eyes.
- The optical element 45 transmits or refracts the image light emitted from the video display element 44 so that it enters the user's left and right eyes.
- The left-eye video displayed by the video display element 44 enters the user's left eye via the optical element 45, and the right-eye video enters the user's right eye via the optical element 45.
- the user can view the left-eye video with the left eye and the right-eye video with the right eye while the display device 40 is mounted on the head.
- the rear camera 46 is arranged at a position where the rear side of the display device 40 (that is, the user side) can be photographed, and photographs the left and right eyes of the user. An image captured by the rear camera 46 is transmitted to the image processing apparatus 10 via the relay apparatus 30.
- the image processing apparatus 10 functionally includes an attention position specifying unit 51, an image generating unit 52, and an image data transmitting unit 53. These functions are realized when the control unit 11 executes a program stored in the storage unit 12. This program may be provided to the image processing apparatus 10 via a communication network such as the Internet, or may be provided by being stored in a computer-readable information storage medium such as an optical disk.
- the processing unit 41 of the display device 40 functionally includes a timing control unit 54, an image data selection unit 55, and a display control unit 56. Some or all of these functions may be realized by software, or may be realized by hardware such as an electronic circuit.
- the attention position specifying unit 51 specifies the position (attention position) in the display area where the user who is using the display device 40 is paying attention. Specifically, the attention position specifying unit 51 acquires an image captured by the rear camera 46 from the display device 40, and specifies the user's line-of-sight direction by analyzing the acquired captured image. The position in the display area corresponding to this line-of-sight direction is the position of interest.
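As a rough illustration of mapping an identified line-of-sight direction to a position in the display area, the sketch below uses a simple perspective model. The function name, angle convention, and field-of-view parameters are all assumptions for illustration; the patent does not specify how this mapping is computed.

```python
import math

# Hypothetical mapping from estimated gaze angles to a pixel position in the
# display area. Angles (0, 0) mean the user is looking at the screen centre;
# the perspective (tangent) model and the FOV parameters are assumptions.

def attention_position(yaw_deg, pitch_deg, width, height, fov_h_deg, fov_v_deg):
    """Return (x, y) pixel coordinates of the attention position."""
    x = width / 2 + (width / 2) * math.tan(math.radians(yaw_deg)) \
        / math.tan(math.radians(fov_h_deg / 2))
    y = height / 2 - (height / 2) * math.tan(math.radians(pitch_deg)) \
        / math.tan(math.radians(fov_v_deg / 2))
    # Clamp to the display area so the partial images stay on screen.
    return (min(max(x, 0), width - 1), min(max(y, 0), height - 1))

print(attention_position(0, 0, 1920, 1080, 100, 100))  # (960.0, 540.0)
```

The clamped output can then be used directly as the centre of the partial-image regions described below.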
- the image generation unit 52 generates an image to be displayed on the display device 40.
- the image generation unit 52 draws an image showing a state in a virtual three-dimensional space in which various objects are arranged.
- However, the image generation unit 52 is not limited to this and may draw various other images.
- Here it is assumed that the display device 40 displays the same frame image as both the left-eye and right-eye video. Note that the video display system 1 according to the present embodiment may realize stereoscopic display using parallax by displaying different frame images on the left and right; in that case, the image processing apparatus 10 and the display apparatus 40 may perform the processes described below in parallel for the left and right frame images.
- The image generation unit 52 generates a plurality of images having different resolutions for each frame image displayed by the display device 40. These images depict the same content at different resolutions; the higher an image's resolution, the narrower the area it covers. As described later, these images are combined in the display device 40 into one frame image and displayed by the video display element 44.
- a plurality of images generated by the image generation unit 52 and finally combined into one frame image are referred to as pre-combination images.
- the image generation unit 52 generates three types of pre-combination images, that is, the entire image P0, the first partial image P1, and the second partial image P2 in order of increasing resolution.
- The entire image P0 is an image assumed to be displayed across the entire display area of the display device 40, and is drawn at a lower resolution than the first partial image P1 and the second partial image P2.
- the first partial image P1 is an image drawn with a resolution higher than that of the entire image P0 for a partial region in the entire image P0.
- The second partial image P2 is an image drawn at a higher resolution than the first partial image P1, covering a region narrower than that of the first partial image P1. The resolution of the second partial image P2 is assumed to be equal to the resolution of the frame image actually displayed by the display device 40.
- Each partial image draws a region of the entire image P0 centered on the attention position.
- For example, if the attention position is in the upper right of the display area, the first partial image P1 and the second partial image P2 are drawn for the upper-right area of the entire image P0.
- FIGS. 4 and 5 illustrate the case where the attention position is the center of the entire image P0, so that each partial image corresponds to a central region of the entire image P0.
- Among the plurality of pre-combination images drawn by the image generation unit 52, the wider the area an image covers, the lower its resolution. Therefore, the total data amount of the entire image P0, the first partial image P1, and the second partial image P2 is smaller than that of the finally displayed frame image (that is, an image in which the entire area covered by the entire image P0 is drawn at a resolution equivalent to that of the second partial image P2).
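The saving can be made concrete with assumed sizes. In the sketch below, each of the three pre-combination images happens to be transmitted at the same pixel size while covering progressively smaller regions; this arrangement and all sizes are assumptions for illustration, since the patent does not fix them.

```python
# Sketch of the data cost of three pre-combination images versus one
# full-resolution frame. The concrete sizes are assumptions; the patent
# only requires that wider coverage implies lower resolution.

FRAME_W, FRAME_H = 1920, 1080  # final frame, same resolution as P2's region

# (transmitted width, transmitted height) of each pre-combination image:
P0 = (480, 270)  # whole display area at 1/4 linear resolution
P1 = (480, 270)  # half-size region around the attention position at 1/2 resolution
P2 = (480, 270)  # quarter-size region at full resolution

pre_combination_pixels = sum(w * h for w, h in (P0, P1, P2))
full_frame_pixels = FRAME_W * FRAME_H

print(pre_combination_pixels / full_frame_pixels)  # 0.1875
```

Under these assumptions, the three images together carry under a fifth of the pixels of a full-resolution frame, which is the transmission saving the description refers to.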
- The image data transmission unit 53 transmits the pre-combination image data generated by the image generation unit 52 to the relay device 30. At this time, the image data transmission unit 53 transmits the entire image P0, the first partial image P1, and the second partial image P2 individually, as they are, before combination. Compared to transmitting a high-resolution frame image after synthesis, this reduces the amount of data that must be transmitted from the image processing device 10 to the display device 40 to display one frame image.
- the image data transmission unit 53 transmits various information (hereinafter referred to as additional information) indicating how the display device 40 should synthesize the pre-combination image together with the image data of the pre-combination image.
- Specifically, for each of the plurality of pre-combination images, the image data transmission unit 53 attaches as additional information the position of the pre-combination image within the frame image, the size it occupies in the frame image, its enlargement ratio, and any other parameters needed to process the pre-combination image, and transmits them together.
- FIG. 6 is a diagram schematically showing the contents of the data transmitted by the image data transmission unit 53, with the additional information described above attached to each pre-combination image. The position information shown in FIG. 6 is an example of additional information indicating the position of the pre-combination image within the frame image: it gives the frame-image coordinates of the pre-combination image's upper-left pixel.
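A minimal sketch of such per-image additional information might look as follows. The field names and example values are assumptions, mirroring the items the description lists (position in the frame, occupied size, enlargement ratio):

```python
from dataclasses import dataclass

# Hypothetical container for the additional information attached to each
# pre-combination image. Field names are assumptions for illustration.

@dataclass
class AdditionalInfo:
    x: int       # frame-image x coordinate of the image's upper-left pixel
    y: int       # frame-image y coordinate of the image's upper-left pixel
    width: int   # size occupied in the frame image (pixels)
    height: int
    scale: int   # enlargement ratio to apply before composition

# e.g. a low-resolution whole image transmitted at 480x270 but occupying
# the full 1920x1080 frame would need a 4x enlargement:
info_p0 = AdditionalInfo(x=0, y=0, width=1920, height=1080, scale=4)
print(info_p0.scale)  # 4
```

The display device can read these fields to place and enlarge each received image without any knowledge of how it was generated.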
- the image data transmitted by the image data transmission unit 53 is input to the display device 40 via the relay device 30.
- The communication interface 42 of the display device 40 receives the pre-combination image data from the relay device 30 and temporarily stores it in the buffer memory 43. Since the buffer memory 43 stores the received pre-combination image data as it is, it need not have a capacity large enough to hold an entire frame image displayed by the video display element 44.
- the timing control unit 54 supplies a synchronization signal for controlling the display timing of the frame image to the image data selection unit 55 and the display control unit 56.
- the image data selection unit 55 and the display control unit 56 can execute processing in synchronization with each other by operating at a timing corresponding to the synchronization signal supplied by the timing control unit 54.
- the image data selection unit 55 and the display control unit 56 perform a process of synthesizing the pre-combination image received from the image processing apparatus 10 and generating a frame image for display. Specifically, the image data selection unit 55 determines, for each pixel in the frame image, a pre-combination image that is used to determine the pixel value of the pixel. Basically, the image data selection unit 55 selects a pre-combination image with the highest resolution among the pre-combination images that cover the position of the pixel of interest as a use target. For example, for the pixels in the region corresponding to the second partial image P2 at the center of the frame image, the second partial image P2 is selected as the use target.
- For pixels in regions not covered by either partial image, the entire image P0 is selected as the image to use.
- the image data selection unit 55 performs such selection by referring to the position and size information included in the additional information transmitted by the image data transmission unit 53 together with the pre-combination image. Then, for each pixel in the frame image, the image data selection unit 55 reads the corresponding pixel value information in the selected pre-combination image from the buffer memory 43 and outputs it to the display control unit 56.
- the display control unit 56 determines the pixel value of each pixel in the frame image based on the image data of the pre-combination image selected by the image data selection unit 55, and supplies a video signal for displaying a frame image composed of the determined pixel values to the video display element 44. In this way, a frame image formed by combining a plurality of pre-combination images can be displayed on the video display element 44. Note that the display control unit 56 enlarges a low-resolution pre-combination image before combining it; the enlargement ratio at this time is determined by the enlargement ratio parameter in the additional information as illustrated in FIG.
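The selection-and-enlargement step described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the `PreImage` class, its field names, and nearest-neighbour sampling are all assumptions standing in for the real image data, position/size additional information, and enlargement-ratio parameter.

```python
from dataclasses import dataclass

@dataclass
class PreImage:
    pixels: list      # pixels[y][x] at this image's native resolution
    x: int            # top-left corner of this image within the frame
    y: int
    scale: int        # enlargement ratio applied when placed in the frame
    resolution: int   # rank used for selection (higher = sharper)

    def covers(self, fx, fy):
        h = len(self.pixels) * self.scale
        w = len(self.pixels[0]) * self.scale
        return self.x <= fx < self.x + w and self.y <= fy < self.y + h

    def sample(self, fx, fy):
        # nearest-neighbour enlargement by the scale factor
        return self.pixels[(fy - self.y) // self.scale][(fx - self.x) // self.scale]

def synthesize(pre_images, width, height):
    frame = [[None] * width for _ in range(height)]
    for fy in range(height):
        for fx in range(width):
            # pick the sharpest pre-combination image covering this pixel
            best = max((p for p in pre_images if p.covers(fx, fy)),
                       key=lambda p: p.resolution)
            frame[fy][fx] = best.sample(fx, fy)
    return frame
```

Center pixels thus come from a sharp partial image while the periphery falls back to the enlarged low-resolution whole image, matching the behaviour described in the text.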
- when combining the plural types of pre-combination images, the display control unit 56 may execute image processing, such as blurring the boundary portion, so that the seam between the images becomes inconspicuous. When such image processing is executed, its content may be determined using parameters included in the additional information transmitted from the image data transmission unit 53.
- FIG. 7 shows an example of a frame image obtained by combining the three types of pre-combination images exemplified in FIGS.
- this frame image has the same resolution as the second partial image P2, but its peripheral portion is an enlarged version of the low-resolution whole image P0. Since the region assumed to attract the user's attention is generated from the high-resolution second partial image P2, the user perceives the frame image as being displayed at high resolution.
- the display device 40 transmits a captured image of the rear camera 46 to the image processing device 10 (S1).
- the attention position specifying unit 51 of the image processing apparatus 10 specifies the attention position of the user using the captured image transmitted in S1 (S2).
- the image generation unit 52 generates three types of pre-combination images using the information on the attention position specified in S2 (S3). Thereafter, the image data transmission unit 53 adds additional information to the three generated pre-combination images and transmits them to the display device 40 (S4).
- the communication interface 42 of the display device 40 stores the image data of the pre-combination image transmitted in S4 in the buffer memory 43 (S5).
- the processing unit 41 generates a frame image by combining the three types of pre-combination images stored in the buffer memory 43 in S5 (S6), and displays it on the video display element 44 (S7).
- the video display system 1 displays video by repeatedly executing the above-described processing and updating the frame image.
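The repeated per-frame processing of steps S1 through S7 can be sketched as a single function. This is a minimal sketch under stated assumptions: the seven callables and their names are illustrative stand-ins for the device operations the text describes, not an API from the patent.

```python
def run_frame(capture, locate, generate, send, store, synthesize, show):
    """One iteration of the frame-update loop (steps S1-S7)."""
    captured = capture()            # S1: display transmits the rear-camera image
    gaze = locate(captured)         # S2: image processor finds the attention position
    pre_images = generate(gaze)     # S3: three pre-combination images are drawn
    packets = send(pre_images)      # S4: images plus additional information are sent
    buffered = store(packets)       # S5: display buffers the received image data
    frame = synthesize(buffered)    # S6: display combines them into a frame image
    return show(frame)              # S7: frame image is shown on the display element
```

Calling `run_frame` repeatedly corresponds to updating the frame image to display video; as the text notes next, S1 and S2 may run less often than the rest.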
- in the above description, the captured image of the rear camera 46 is transmitted from the display device 40 to the image processing device 10, and the attention position is updated, each time the frame image is updated; however, the attention position specifying unit 51 may update the attention position at longer time intervals. In that case, the captured image of the rear camera 46 need not be transmitted every time the frame image is updated, and the processes of S1 and S2 described above are executed less frequently.
- as described above, the image processing apparatus 10 transmits a plurality of types of pre-combination images with different resolutions, and the display apparatus 40 combines these pre-combination images to display the frame image. Accordingly, the amount of data transmitted to display one frame image can be reduced compared to the case where the combined frame image itself is transmitted from the image processing device 10 to the display device 40. Therefore, an image with high resolution and a relatively high frame rate can be displayed on the display device 40 without securing a large communication band between the image processing device 10 and the display device 40.
- in the above description, the first partial image P1 and the second partial image P2 are rectangular images that have the same shape as the whole image P0 but differ from it in size.
- the partial image is not limited to such a shape, and may be a shape different from the entire display area, such as a square or a circle.
- the image processing apparatus 10 may designate the shape type of the partial image from a plurality of candidates. In this case, information specifying the shape type is included in the additional information.
- the display device 40 refers to this additional information and identifies which region in the frame image the received partial image corresponds to.
- the shape of the entire image P0 itself is not limited to a rectangle, and may be an image having various shapes such as a circle according to the display characteristics of the display device 40.
- the position of the partial image in the frame image is determined according to the position of interest at which the user is looking. Therefore, when the user's line-of-sight direction changes, the position of the partial image in the entire image also changes according to the change.
- the embodiment of the present invention is not limited to this, and the target position may be a fixed position such as the center of the display area. In this case, the attention position specifying unit 51 is not necessary.
- in the above description, the pre-combination images used to synthesize one frame image include two types of partial images. However, the present invention is not limited to this: one type of partial image may be used, or more than two types may be used.
- the amount of data to be transmitted can be saved by determining the resolution and size of each partial image so that the higher resolution partial image corresponds to a narrower region in the frame image.
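A back-of-envelope comparison illustrates the saving. All image sizes here are assumptions chosen for illustration; the patent does not specify concrete resolutions.

```python
def pixels(width, height):
    return width * height

# baseline: the combined frame image sent whole
full_frame = pixels(1920, 1080)

# three pre-combination images: a strongly reduced whole image plus two
# partial images, each sharper but covering a narrower region of the frame
pre_images = (pixels(480, 270)      # whole image P0 at 1/4 linear resolution
              + pixels(640, 360)    # first partial image P1, mid resolution
              + pixels(480, 270))   # second partial image P2, full resolution but narrow

savings = 1 - pre_images / full_frame
print(f"{pre_images} vs {full_frame} pixels transmitted ({savings:.0%} saved)")
```

The key constraint is the one stated in the text: the sharper a partial image, the narrower the frame region it should cover, so its pixel count stays small.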
- in the above description, the synthesis process is performed after all three pre-combination images have been received. However, the processing unit 41 may execute the synthesis process on the already-received portions in parallel with the reception of the pre-combination images by the communication interface 42.
- the processing unit 41 extracts and combines the stored parts and supplies them to the video display element 44 as video signals.
- when the communication interface 42 receives a subsequent portion of a pre-combination image, it stores that portion by overwriting data that the processing unit 41 has already processed.
- with this configuration, the buffer memory 43 only needs a capacity capable of storing part of each pre-combination image, and need not hold the entire data of all the pre-combination images at once.
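The overwrite-after-processing idea can be sketched with a small fixed-capacity buffer. This is an illustrative model only; the class name, the capacity check, and the error behaviour are assumptions, not the patent's buffer design.

```python
from collections import deque

class BlockBuffer:
    """Fixed-capacity buffer holding only the not-yet-synthesized blocks."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = deque()

    def receive(self, block):
        # a newly received block reuses the slot of an already-processed one,
        # so the buffer never needs room for every pre-combination image at once
        if len(self.blocks) >= self.capacity:
            raise BufferError("synthesis has fallen behind reception")
        self.blocks.append(block)

    def consume(self):
        # synthesis takes the oldest block, freeing its slot for reuse
        return self.blocks.popleft()
```

Reception and synthesis alternating on this buffer is what lets the capacity stay far below the total size of all pre-combination images.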
- the image data transmission unit 53 divides each pre-combination image into a plurality of transmission units (hereinafter referred to as blocks) and transmits the image data in units of the blocks.
- the processing unit 41 generates a frame image in order from the uppermost line to the lowermost line, with pixel columns (lines) arranged in the horizontal direction in the frame image as processing units.
- the uppermost line of the frame image is generated based on the entire image P0 as shown in FIG.
- at this stage, the data of the first partial image P1 and the second partial image P2 are not used.
- the image data transmission unit 53 divides each pre-combination image into a plurality of blocks, and transmits the blocks in the order required in the frame image synthesis processing.
- the block in this case may be one line of pixels arranged in the horizontal direction in each pre-combination image, or a plurality of such lines.
- specifically, the image data transmission unit 53 first transmits the blocks constituting the uppermost portion of the entire image P0 (the blocks corresponding to position coordinates Y3 to Y2), then the blocks of the entire image P0 and the first partial image P1 corresponding to position coordinates Y2 to Y1, then the blocks of each pre-combination image corresponding to position coordinate Y1 and below, and so on, so that the blocks constituting each image reach the display device 40 in the order required.
- thereby, the display device 40 can start generating and displaying the frame image from the top in parallel with receiving the partial image data.
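The ordering rule described above can be sketched as a sort over transmission blocks. The tuple layout and the concrete Y coordinates are illustrative assumptions; only the ordering criteria come from the text: blocks needed at earlier frame lines go first, and (anticipating the fallback discussed below) a lower-resolution image's block may precede its same-position refinement.

```python
def transmission_order(blocks):
    """Order blocks so those needed earliest during top-to-bottom synthesis
    go first; on ties, send the lower-resolution image's block first so a
    displayable fallback always precedes its refinement.

    Each block is (image_name, first_frame_line_needed, resolution_rank).
    """
    return sorted(blocks, key=lambda b: (b[1], b[2]))

blocks = [("P2", 6, 2), ("P0", 0, 0), ("P1", 4, 1), ("P0", 4, 0), ("P0", 8, 0)]
order = transmission_order(blocks)
```

Re-running the sort after the attention position moves is what "changing the transmission order according to the target position" amounts to: the partial images' `first_frame_line_needed` values shift, and their blocks move earlier or later in the queue.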
- the image data transmission unit 53 changes the transmission order of each block in accordance with the change of the target position.
- for example, when the target position moves upward, the positions of the first partial image P1 and the second partial image P2 relative to the entire image P0 also move upward.
- FIG. 9 shows an example of a frame image obtained by combining three types of pre-combination images in this case.
- in this case, compared with before the movement, the upper blocks of the first partial image P1 and of the second partial image P2 are required sooner after frame image generation starts. Therefore, when the target position moves upward, these blocks are transmitted preferentially. With such control, the frame image can be generated without delay.
- the low-resolution pre-combination images may be preferentially transmitted.
- when synthesizing the portions from position coordinates Y2 to Y1, the corresponding blocks of both the entire image P0 and the first partial image P1 are basically necessary. Here, the corresponding block of the entire image P0 may be transmitted preferentially, followed by the block of the first partial image P1 for the same position. Then, even if transmission of the first partial image's block is delayed by a communication delay or the like, a frame image can still be generated and displayed, albeit at low resolution, using the entire image P0.
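The fallback just described can be sketched as a small selection function. The block-name scheme and the `arrived` dictionary are illustrative assumptions standing in for the display device's record of which blocks have been received.

```python
def pick_block(candidates, arrived):
    """candidates: block names for one frame region, ordered sharpest-first.

    Returns the sharpest block that has actually been received, falling back
    to the low-resolution whole-image block when refinements are late."""
    for name in candidates:
        if arrived.get(name):
            return name
    raise LookupError("not even the low-resolution block has arrived")
```

Because the whole image P0's block is transmitted first, the low-resolution fallback is normally always available by the time its frame lines are synthesized.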
- in the above description, the image generation unit 52 draws pre-combination images depicting the state of a three-dimensional space. However, the image generation unit 52 is not limited to this, and may generate the pre-combination images based on a high-resolution image prepared in advance.
- for example, when there is video data whose frame images have the same resolution as those displayed by the display device 40, the image generation unit 52 uses, as the whole image P0, an image obtained by greatly reducing the resolution of the entire frame; as the first partial image P1, an image obtained by clipping a predetermined region of the frame and slightly reducing its resolution; and as the second partial image P2, an image obtained by clipping, at the original resolution, a predetermined region narrower than the first partial image P1.
- the image data transmission unit 53 transmits each pre-combination image generated in this way to the display device 40. As a result, even when high-resolution video is reproduced, the amount of image data transmitted from the image processing device 10 to the display device 40 can be reduced.
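The derivation of the three images from a prepared high-resolution frame can be sketched as follows. Lists of pixel rows stand in for real image data, and the crop regions, scale factors, and function names are all illustrative assumptions.

```python
def downscale(img, factor):
    # naive nearest-neighbour reduction: keep every factor-th row and column
    return [row[::factor] for row in img[::factor]]

def crop(img, x, y, w, h):
    return [row[x:x + w] for row in img[y:y + h]]

def make_pre_images(frame, p1_region, p2_region):
    """Derive P0, P1, P2 from one high-resolution frame image."""
    x1, y1, w1, h1 = p1_region
    x2, y2, w2, h2 = p2_region
    p0 = downscale(frame, 4)                        # whole frame, strongly reduced
    p1 = downscale(crop(frame, x1, y1, w1, h1), 2)  # wider clip, mildly reduced
    p2 = crop(frame, x2, y2, w2, h2)                # narrow clip, original resolution
    return p0, p1, p2
```

Each derived image is then transmitted separately, and the display device reassembles them as described earlier.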
- the relay device 30 and the display device 40 are connected by wireless communication.
- the embodiment of the present invention is not limited to this, and the image processing device 10 and the display device 40 may be connected via various types of communication lines.
- the relay device 30 is not necessarily required, and the image processing device 10 and the display device 40 may be directly connected.
- 1 video display system, 10 image processing device, 11 control unit, 12 storage unit, 13 interface unit, 30 relay device, 40 display device, 41 processing unit, 42 communication interface, 43 buffer memory, 44 video display element, 45 optical element, 51 attention position specifying unit, 52 image generation unit, 53 image data transmission unit, 54 timing control unit, 55 image data selection unit, 56 display control unit.
Claims (8)
- A video display system comprising a display device worn on a user's head and an image processing device that supplies video to be displayed by the display device, wherein the image processing device includes: an image generation unit that generates a plurality of pre-combination images used for display by the display device and having mutually different resolutions; and a transmission unit that transmits each of the plurality of pre-combination images to the display device; and the display device includes: a reception unit that receives the plurality of pre-combination images transmitted by the image processing device; and a display control unit that displays a display image obtained by combining the plurality of pre-combination images.
- The video display system according to claim 1, wherein the plurality of pre-combination images include a partial image corresponding to a partial range within the display image; the transmission unit transmits to the display device, together with the partial image, additional information specifying the position within the display image to which the partial image corresponds; and the display control unit generates the display image by combining the plurality of pre-combination images so that the partial image corresponds to the position specified by the additional information.
- The video display system according to claim 2, wherein the image processing device further includes an attention position specifying unit that specifies an attention position, within the display area of the display device, at which the user is gazing; and the image generation unit generates the partial image so that the partial image corresponds to a position within the display image determined according to the attention position.
- The video display system according to claim 3, wherein the transmission unit divides each of the plurality of pre-combination images into a plurality of transmission units, transmits the images to the display device transmission unit by transmission unit, and changes the transmission order of the plurality of transmission units constituting the plurality of pre-combination images according to the attention position.
- An image processing device that supplies video to a display device worn on a user's head, the image processing device comprising: an image generation unit that generates a plurality of pre-combination images having mutually different resolutions and used for combining the display image displayed by the display device; and a transmission unit that transmits each of the plurality of pre-combination images to the display device.
- A display device worn on a user's head, the display device being connected to an image processing device that supplies video to the display device and comprising: a reception unit that receives a plurality of pre-combination images, having mutually different resolutions, transmitted by the image processing device; and a display control unit that displays a display image obtained by combining the plurality of pre-combination images.
- An image processing method for supplying video to a display device worn on a user's head, the method comprising: a step of generating a plurality of pre-combination images having mutually different resolutions and used for combining the display image displayed by the display device; and a step of transmitting each of the plurality of pre-combination images to the display device.
- A program for controlling an image processing device that supplies video to a display device worn on a user's head, the program causing the image processing device to function as: an image generation unit that generates a plurality of pre-combination images having mutually different resolutions and used for combining the display image displayed by the display device; and a transmission unit that transmits each of the plurality of pre-combination images to the display device.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020187022916A KR20180102125A (ko) | 2016-02-09 | 2017-02-03 | 영상 표시 시스템 |
EP17750181.4A EP3416392A4 (en) | 2016-02-09 | 2017-02-03 | VIDEO DISPLAY SYSTEM |
CN201780009524.9A CN108605148B (zh) | 2016-02-09 | 2017-02-03 | 视频显示系统 |
JP2017566919A JP6563043B2 (ja) | 2016-02-09 | 2017-02-03 | 映像表示システム |
US16/060,999 US10810701B2 (en) | 2016-02-09 | 2017-02-03 | Video display system |
US16/992,208 US11270410B2 (en) | 2016-02-09 | 2020-08-13 | Video display system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016022622 | 2016-02-09 | ||
JP2016-022622 | 2016-02-09 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/060,999 A-371-Of-International US10810701B2 (en) | 2016-02-09 | 2017-02-03 | Video display system |
US16/992,208 Continuation US11270410B2 (en) | 2016-02-09 | 2020-08-13 | Video display system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017138458A1 true WO2017138458A1 (ja) | 2017-08-17 |
Family
ID=59563286
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/004065 WO2017138458A1 (ja) | 2016-02-09 | 2017-02-03 | 映像表示システム |
Country Status (6)
Country | Link |
---|---|
US (2) | US10810701B2 (ja) |
EP (1) | EP3416392A4 (ja) |
JP (2) | JP6563043B2 (ja) |
KR (1) | KR20180102125A (ja) |
CN (1) | CN108605148B (ja) |
WO (1) | WO2017138458A1 (ja) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107111866B (zh) * | 2014-12-22 | 2021-01-05 | 交互数字Ce专利控股公司 | 用于基于对象检测生成外推图像的方法和装置 |
US10732989B2 (en) * | 2017-02-09 | 2020-08-04 | Yanir NULMAN | Method for managing data, imaging, and information computing in smart devices |
WO2021061350A1 (en) * | 2019-09-27 | 2021-04-01 | Apple Inc. | Managing devices having additive displays |
CN113163214A (zh) * | 2020-01-22 | 2021-07-23 | 华为技术有限公司 | 一种视频处理方法及其装置 |
TWI736328B (zh) * | 2020-06-19 | 2021-08-11 | 宏碁股份有限公司 | 頭戴式顯示裝置及應用其之畫面顯示方法 |
CN114974165A (zh) * | 2021-02-22 | 2022-08-30 | 联咏科技股份有限公司 | 显示驱动器集成电路、图像处理器及其操作方法 |
KR20240015633A (ko) | 2021-06-03 | 2024-02-05 | 소니 세미컨덕터 솔루션즈 가부시키가이샤 | 표시 장치, 표시 시스템 및 표시 구동 방법 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006054830A (ja) * | 2004-08-16 | 2006-02-23 | Sony Corp | 画像圧縮通信方法及び装置 |
JP2007174568A (ja) * | 2005-12-26 | 2007-07-05 | Sanyo Electric Co Ltd | 符号化方法 |
JP2007235314A (ja) * | 2006-02-28 | 2007-09-13 | Sanyo Electric Co Ltd | 符号化方法 |
JP2007274621A (ja) * | 2006-03-31 | 2007-10-18 | Victor Co Of Japan Ltd | 画像伝送システム及び画像伝送方法 |
WO2015015584A1 (ja) * | 2013-07-31 | 2015-02-05 | 日立マクセル株式会社 | 映像伝送システム、送信装置、および受信装置 |
US20150054913A1 (en) * | 2013-08-21 | 2015-02-26 | Jaunt Inc. | Image stitching |
JP2016012843A (ja) * | 2014-06-30 | 2016-01-21 | 株式会社リコー | 伝送管理システム、伝送システム、伝送管理方法、伝送方法、及びプログラム |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5320534A (en) * | 1990-11-05 | 1994-06-14 | The United States Of America As Represented By The Secretary Of The Air Force | Helmet mounted area of interest (HMAoI) for the display for advanced research and training (DART) |
JPH09233471A (ja) * | 1996-02-27 | 1997-09-05 | Fujitsu Ltd | 画像情報圧縮符号化装置 |
US6417867B1 (en) | 1999-05-27 | 2002-07-09 | Sharp Laboratories Of America, Inc. | Image downscaling using peripheral vision area localization |
US7081870B2 (en) | 2001-05-09 | 2006-07-25 | Hewlett-Packard Development Company, L.P. | Wearable display and method of displaying images using a wearable display |
US6985529B1 (en) | 2002-01-07 | 2006-01-10 | Apple Computer, Inc. | Generation and use of masks in MPEG video encoding to indicate non-zero entries in transformed macroblocks |
JP2003333593A (ja) * | 2002-05-15 | 2003-11-21 | Fuji Photo Film Co Ltd | 画像データ送信装置および画像データ受信装置 |
JP2004056335A (ja) * | 2002-07-18 | 2004-02-19 | Sony Corp | 情報処理装置および方法、表示装置および方法、並びにプログラム |
US7495638B2 (en) * | 2003-05-13 | 2009-02-24 | Research Triangle Institute | Visual display with increased field of view |
KR100739686B1 (ko) * | 2004-08-13 | 2007-07-13 | 경희대학교 산학협력단 | 영상 코딩 방법, 코딩 장치, 영상 디코딩 방법 및 디코딩장치 |
MX2007012564A (es) | 2005-04-13 | 2007-11-15 | Nokia Corp | Codificacion, almacenamiento y senalizacion de informacion de escalabilidad. |
KR101255226B1 (ko) | 2005-09-26 | 2013-04-16 | 한국과학기술원 | 스케일러블 비디오 코딩에서 다중 roi 설정, 복원을위한 장치 및 방법 |
KR100751290B1 (ko) * | 2006-03-31 | 2007-08-23 | 한국과학기술연구원 | 헤드 마운티드 디스플레이용 영상 시스템 |
JP5194776B2 (ja) * | 2007-12-21 | 2013-05-08 | 株式会社リコー | 情報表示システム、情報表示方法およびプログラム |
US8493390B2 (en) * | 2010-12-08 | 2013-07-23 | Sony Computer Entertainment America, Inc. | Adaptive displays using gaze tracking |
US8184069B1 (en) | 2011-06-20 | 2012-05-22 | Google Inc. | Systems and methods for adaptive transmission of data |
JP6007600B2 (ja) | 2012-06-07 | 2016-10-12 | ソニー株式会社 | 画像処理装置、画像処理方法およびプログラム |
US10514541B2 (en) | 2012-12-27 | 2019-12-24 | Microsoft Technology Licensing, Llc | Display update time reduction for a near-eye display |
JP6112878B2 (ja) | 2013-01-28 | 2017-04-12 | オリンパス株式会社 | ウェアラブル型表示装置及びプログラム |
US9727991B2 (en) | 2013-03-01 | 2017-08-08 | Microsoft Technology Licensing, Llc | Foveated image rendering |
CN105075271A (zh) | 2013-04-08 | 2015-11-18 | 索尼公司 | 利用shvc的关注区域可伸缩性 |
WO2017138458A1 (ja) | 2016-02-09 | 2017-08-17 | 株式会社ソニー・インタラクティブエンタテインメント | 映像表示システム |
- 2017
- 2017-02-03 WO PCT/JP2017/004065 patent/WO2017138458A1/ja active Application Filing
- 2017-02-03 US US16/060,999 patent/US10810701B2/en active Active
- 2017-02-03 EP EP17750181.4A patent/EP3416392A4/en active Pending
- 2017-02-03 JP JP2017566919A patent/JP6563043B2/ja active Active
- 2017-02-03 KR KR1020187022916A patent/KR20180102125A/ko not_active Application Discontinuation
- 2017-02-03 CN CN201780009524.9A patent/CN108605148B/zh active Active
- 2019
- 2019-07-23 JP JP2019135407A patent/JP2019197224A/ja active Pending
- 2020
- 2020-08-13 US US16/992,208 patent/US11270410B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3416392A4 * |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019197224A (ja) * | 2016-02-09 | 2019-11-14 | 株式会社ソニー・インタラクティブエンタテインメント | 映像表示システム |
US10810701B2 (en) | 2016-02-09 | 2020-10-20 | Sony Interactive Entertainment Inc. | Video display system |
US11270410B2 (en) | 2016-02-09 | 2022-03-08 | Sony Interactive Entertainment Inc. | Video display system |
JPWO2019102676A1 (ja) * | 2017-11-21 | 2020-12-24 | ソニー株式会社 | 情報処理装置、情報処理方法、およびプログラム |
JP7196856B2 (ja) | 2017-11-21 | 2022-12-27 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、およびプログラム |
WO2021065631A1 (ja) * | 2019-09-30 | 2021-04-08 | 株式会社ソニー・インタラクティブエンタテインメント | 画像データ転送装置および画像圧縮方法 |
JP2021057770A (ja) * | 2019-09-30 | 2021-04-08 | 株式会社ソニー・インタラクティブエンタテインメント | 画像データ転送装置および画像圧縮方法 |
WO2021065632A1 (ja) * | 2019-09-30 | 2021-04-08 | 株式会社ソニー・インタラクティブエンタテインメント | 画像データ転送装置、画像表示システム、および画像圧縮方法 |
WO2021065633A1 (ja) * | 2019-09-30 | 2021-04-08 | 株式会社ソニー・インタラクティブエンタテインメント | 画像データ転送装置および画像圧縮方法 |
JP7491676B2 (ja) | 2019-09-30 | 2024-05-28 | 株式会社ソニー・インタラクティブエンタテインメント | 画像データ転送装置および画像圧縮方法 |
JP7496677B2 (ja) | 2019-09-30 | 2024-06-07 | 株式会社ソニー・インタラクティブエンタテインメント | 画像データ転送装置、画像表示システム、および画像圧縮方法 |
WO2024084819A1 (ja) * | 2022-10-21 | 2024-04-25 | パナソニックIpマネジメント株式会社 | 画像生成装置 |
Also Published As
Publication number | Publication date |
---|---|
US20180365800A1 (en) | 2018-12-20 |
JPWO2017138458A1 (ja) | 2018-09-13 |
CN108605148A (zh) | 2018-09-28 |
US11270410B2 (en) | 2022-03-08 |
CN108605148B (zh) | 2022-02-11 |
US20200372607A1 (en) | 2020-11-26 |
JP6563043B2 (ja) | 2019-08-21 |
EP3416392A4 (en) | 2019-09-18 |
US10810701B2 (en) | 2020-10-20 |
EP3416392A1 (en) | 2018-12-19 |
KR20180102125A (ko) | 2018-09-14 |
JP2019197224A (ja) | 2019-11-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6563043B2 (ja) | 映像表示システム | |
JP6511386B2 (ja) | 情報処理装置および画像生成方法 | |
JP6576536B2 (ja) | 情報処理装置 | |
US10277814B2 (en) | Display control method and system for executing the display control method | |
JP6474278B2 (ja) | 画像生成システム、画像生成方法、プログラム及び情報記憶媒体 | |
US11960086B2 (en) | Image generation device, head-mounted display, and image generation method | |
US10650507B2 (en) | Image display method and apparatus in VR device, and VR device | |
JP2017069718A (ja) | 撮像装置、情報処理装置、表示装置、情報処理システム、画像データ送出方法、および画像表示方法 | |
JP2011010126A (ja) | 画像処理装置、画像処理方法 | |
US11039124B2 (en) | Information processing apparatus, information processing method, and recording medium | |
JP2018133746A (ja) | 画像処理装置、画像処理システム、画像処理方法及びプログラム | |
JP6687751B2 (ja) | 画像表示システム、画像表示装置、その制御方法、及びプログラム | |
JP6591667B2 (ja) | 画像処理システム、画像処理装置、及びプログラム | |
US20220113794A1 (en) | Display device and image display method | |
WO2016190193A1 (ja) | 情報処理装置、出力制御装置、情報処理システム、および動画データ出力方法 | |
JP6442619B2 (ja) | 情報処理装置 | |
WO2019235228A1 (ja) | 情報処理装置、情報処理方法、およびプログラム | |
US20190313077A1 (en) | Virtual reality environment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17750181 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017566919 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20187022916 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020187022916 Country of ref document: KR |
|
NENP | Non-entry into the national phase |
Ref country code: DE |