US20160165208A1 - Systems for providing image or video to be displayed by projective display system and for displaying or projecting image or video by a projective display system - Google Patents

- Publication number: US20160165208A1
- Authority: US (United States)
- Prior art keywords: subsystem, projection, video, image, image data
- Legal status: Abandoned (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
- H04N13/349—Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking
- H04N9/3173—Constructional details wherein the projection device is specially adapted for enhanced portability
- H04N13/0048
- G02B27/01—Head-up displays
- G03B21/145—Housing details, e.g. position adjustments thereof
- G03B21/28—Reflectors in projection beam
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
- H04N13/0059
- H04N13/0242
- H04N13/0282
- H04N13/0445
- H04N13/0459
- H04N13/161—Encoding, multiplexing or demultiplexing different image signal components
- H04N13/194—Transmission of image signals
- H04N13/243—Image signal generators using stereoscopic image cameras using three or more 2D image sensors
- H04N13/282—Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems
- H04N13/363—Image reproducers using image projection screens
- H04N9/3141—Constructional details thereof
- H04N9/3155—Modulator illumination systems for controlling the light source
- H04N9/3179—Video signal processing therefor
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
- H04N9/3194—Testing thereof including sensor feedback
- G02B2027/011—Head-up displays comprising device for correcting geometrical aberrations, distortion
- G02B2027/0118—Head-up displays comprising devices for improving the contrast of the display
- G02B2027/014—Head-up displays comprising information/image processing systems
Definitions
- the present invention relates generally to projective display, and more particularly, to a transmission system including a capture and transmitting system and a display and receiving system that is capable of generating images of a subject and having the images displayed by a projective display system.
- a stereo display is a display device capable of conveying depth perception to the viewer and reproducing real-world viewing experiences.
- the stereo display can be implemented with different technologies.
- Stereoscopic display technology has the disadvantage that the viewer must be positioned at a well-defined spot to experience the 3D visual effect, and the disadvantage that both the effective horizontal pixel count and the luminance viewable by each eye are reduced by one half.
- glasses-free stereoscopic display is desirable, but current glasses-free stereoscopic displays lead to a poor user experience.
- Holographic display technology offers a great viewing experience, but its cost and size are too high for mobile devices.
- a system for providing image or video to be displayed by a projective display system comprises: an encoding subsystem and a packing subsystem.
- the encoding subsystem is configured to encode at least one image or video of a subject to generate encoded image data.
- the packing subsystem is coupled to the encoding subsystem, and configured to pack the encoded image data with projection configuration information regarding the projective display system to generate packed image data.
- the projective display system comprises a projection source device and a projection surface; the projection source device projects the image or video onto the projection surface according to the packed image data.
- a system for displaying or projecting image or video by a projective display system comprises: a de-packing subsystem, a decoding subsystem and a display subsystem.
- the de-packing subsystem is configured to derive packed image data and de-pack the derived packed image data to obtain encoded image data and projection configuration information.
- the decoding subsystem is coupled to the de-packing subsystem, and configured to decode the encoded image data to generate at least one image or video.
- the display subsystem is coupled to the decoding subsystem, and configured to project or display the at least one image or video by a projection source component of the projective display system according to the projection configuration information.
- FIG. 1 illustrates a projective display system in a single-view configuration according to one embodiment of the present invention.
- FIG. 2A illustrates a projective display system in a multi-view configuration according to one embodiment of the present invention.
- FIG. 2B illustrates source areas for displaying source images on a display panel of a projective display system according to one embodiment of the present invention.
- FIG. 3A illustrates a capture and transmitting system according to one embodiment of the present invention.
- FIG. 3B illustrates a display and receiving system according to one embodiment of the present invention.
- FIG. 4 illustrates an implementation of a capture subsystem according to one embodiment of the present invention.
- FIG. 5 illustrates combination patterns for captured images/videos.
- FIG. 6 illustrates a flowchart regarding how the capture subsystem captures images or videos according to one embodiment of the present invention.
- FIG. 7 illustrates a flowchart regarding how the captured images/videos or render images/videos are encoded according to one embodiment of the present invention.
- FIG. 8 illustrates projection configuration information to be packed with encoded image data according to one embodiment of the present invention.
- FIG. 1 illustrates a projective display system in a single-view configuration according to one embodiment of the present invention.
- a projective display system 100 comprises at least one of a base 110 , a projection source device 120 , a projection surface 130 and an optional optical adjustment unit 140 .
- the projection source device 120 may be rotatably mounted, for example, on the base 110 .
- the projection source device 120 may be placed in a carrier, which may be rotatably mounted, for example, on the base 110 .
- the projection source device 120 may be a portable device, such as a smartphone, a tablet, a touch-controlled device, or any other electronic device with a display panel or a projection source component.
- the projection source device 120 comprises a projection source component 122 and a projection processor 124 .
- the projection source component 122 is configured to project at least one source image.
- the projection source component 122 may be an organic light-emitting diode display panel, a liquid crystal display panel, or any other passive or active display panel.
- the projection source component 122 may be a solid-state (laser or LED) light or any other type of light source.
- the projection processor 124 is configured to adaptively control an image adjustment on an input image to generate the source image.
- One purpose of the projection processor 124 is to maintain a projection quality of the projective display system 100 .
- the projection processor 124 could be implemented with a general-purpose processor or dedicated hardware. Please note that the position of the projection processor 124 in FIG. 1 is just for the purpose of illustration rather than a limitation.
- the projection surface 130 could be made of transflective material or non-opaque material (e.g. transparent/semitransparent material.)
- the projection surface 130 may be rotatably attached to the base 110 .
- the projection surface 130 could be flat or curved.
- the projection surface 130 is configured to mirror or partially reflect the source image that is projected from a first side of the projection surface 130 to form a virtual image on a second side that is opposite to the first side, thereby forming a stereo viewing effect.
- part of the intensity of the source image may be projected through the projection surface 130 , and the rest may be reflected by the projection surface 130 , such that the projection surface 130 partially reflects the source image.
- a user may see the virtual image displayed on the projection surface 130 or floating behind the projection surface 130 , thereby forming a stereo viewing effect, especially for a source image with 3D effect.
- the optical adjustment element 140 may be rotatably attached to and detachable from the base 110 .
- the optical adjustment element 140 may optically adjust forming of the virtual image on the projection surface 130 .
- the optical adjustment element 140 could be a single lens or a compound lens.
- FIG. 2A illustrates the projective display system 100 in a multi-view configuration according to another embodiment of the present invention.
- the base 110 of the projective display system 100 may be placed at the center of the projection source component 122 of the projection source device 120 .
- a polyhedron may be formed by the projection surface 130 and may be put on the top of the base 110 .
- multiple optional optical adjustment units 140 may be attached to different sides of the surface of the base 110 .
- the projection surface 130 has four viewable sides P 1 -P 4 .
- the polyhedral projection surface 130 may be formed by folding the projection surface 130 of FIG. 1 or by combining parts of the projection surface 130 of FIG. 1 together. Please note that the shape and the number of viewable sides of the projection surface 130 illustrated by FIG. 2A are not limitations of the present invention. Multiple viewable sides of the projection surface 130 allow the virtual images to be viewed from different sides.
- the viewable sides P 1 -P 4 respectively correspond to four source areas SA 1 -SA 4 on the projection source component 122 of the projection source device 120 (the source area SA 4 is not shown). Source images shown on the source areas SA 1 -SA 4 are mirrored or partially reflected by the four viewable sides P 1 -P 4 , respectively, thereby forming virtual images for users' eyes.
- Each of the optical adjustment elements 140 is detachable from the base 110 .
- the optical adjustment elements 140 can optically adjust forming of the virtual image on the projection surface 130 .
- the optical adjustment element 140 could be a single lens or a compound lens.
- FIG. 2B illustrates a layout of source areas for displaying the source images on the projection source component 122 when the projective display system 100 is in a multi-view configuration.
- the arrangement of the source areas may be associated with the shape of the projection surface 130 .
- the illustrated arrangement in FIG. 2B is not a limitation of the present invention.
- a processing system for the projective display system 100 may include a capture and transmitting system 300 as illustrated by FIG. 3A and a display and receiving system 400 as illustrated by FIG. 3B .
- the capture and transmitting system 300 and the display and receiving system 400 could be implemented in a single device or in different devices.
- the capture and transmitting system 300 may include a capture subsystem 310 , an encoding subsystem 320 , and a packing subsystem 330 .
- Packed image data generated by the capture and transmitting system 300 may be stored in a storage device 340 or be sent to a channel coding and modulation device 350 or any other device. Afterwards, the coded and modulated data stream may be sent to the display and receiving system 400 via wired or wireless transmission.
- the display and receiving system 400 includes a display subsystem 410 , a decoding subsystem 420 , and a de-packing subsystem 430 .
- the display and receiving system 400 receives the packed data generated by the capture and transmitting system 300 through the storage device 340 or wired/wireless transmission.
- a channel decoding and demodulation device 600 channel-decodes and demodulates the packed image data that is processed by the channel coding and modulation device 350 .
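The symmetric data path between the capture and transmitting system 300 and the display and receiving system 400 described above can be sketched as follows. This is an illustrative outline only: the stage functions are hypothetical stand-ins for the subsystems 310-330 and 410-430, not the patent's implementation.

```python
def transmit(frames, config, encode, pack, channel_encode):
    """Capture-and-transmitting side (300): encode, pack, channel-code."""
    return channel_encode(pack(encode(frames), config))

def receive(stream, channel_decode, depack, decode):
    """Display-and-receiving side (400): channel-decode, de-pack, decode."""
    encoded, config = depack(channel_decode(stream))
    return decode(encoded), config

# Round trip with trivial stand-in stages:
ident = lambda x: x
stream = transmit(["IMG1"], {"sides": 4}, ident, lambda d, c: (d, c), ident)
frames, config = receive(stream, ident, ident, ident)
assert frames == ["IMG1"] and config == {"sides": 4}
```

The point of the sketch is that the receiving side applies the inverse stages in reverse order, so the projection configuration information survives the trip alongside the encoded image data.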
- the capture subsystem 310 of the capture and transmitting system 300 may include one or multiple camera devices. Please refer to FIG. 4 .
- the capture subsystem 310 may comprise camera devices of different electronic devices 310 _ 1 - 310 _ 4 .
- the camera devices of the electronic devices 310 _ 1 - 310 _ 4 capture images or videos of the subject from different capturing views CV 1 -CV 4 .
- images or videos IMG 1 -IMG 4 corresponding to different views CV 1 -CV 4 may be combined by the capture subsystem 310 to generate one single image or video in patterns (a)-(c) as shown in FIG. 5 .
- in pattern (a) of FIG. 5, the images or videos IMG 1 -IMG 4 corresponding to views CV 1 -CV 4 are horizontally arranged in one image or video.
- in pattern (b), the images or videos IMG 1 -IMG 4 corresponding to views CV 1 -CV 4 are arranged as a 2×2 square in one image or video.
- in pattern (c), the images or videos IMG 1 -IMG 4 corresponding to views CV 1 -CV 4 are vertically arranged in one image or video.
- images or videos IMG 1 -IMG 4 may be stored or sent separately as shown in FIG. 5 ( d ) but not combined into a single image or video.
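As an illustration of the combination patterns (a)-(d) of FIG. 5, the following sketch arranges four equally sized per-view frames with NumPy. The function name, array shapes, and pattern labels are assumptions for the example, not taken from the patent.

```python
import numpy as np

def combine_views(views, pattern):
    """Combine per-view frames (equal H x W x C arrays) into one frame.

    pattern 'a': horizontal strip, 'b': 2x2 square, 'c': vertical strip,
    'd': keep the views separate (stored or sent individually).
    """
    if pattern == "a":                      # side by side
        return np.concatenate(views, axis=1)
    if pattern == "b":                      # 2x2 square
        top = np.concatenate(views[:2], axis=1)
        bottom = np.concatenate(views[2:], axis=1)
        return np.concatenate([top, bottom], axis=0)
    if pattern == "c":                      # stacked vertically
        return np.concatenate(views, axis=0)
    if pattern == "d":                      # no combination
        return list(views)
    raise ValueError(f"unknown pattern {pattern!r}")

views = [np.full((120, 160, 3), i, dtype=np.uint8) for i in range(4)]
assert combine_views(views, "a").shape == (120, 640, 3)
assert combine_views(views, "b").shape == (240, 320, 3)
assert combine_views(views, "c").shape == (480, 160, 3)
```

Pattern (d) simply returns the frames unmerged, matching the option of storing or sending IMG 1 -IMG 4 separately.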
- the camera devices of the electronic devices 310 _ 1 - 310 _ 4 may utilize wide-angle or fish-eye lenses to cover a wide scene.
- the number of the views for capturing the subject can be determined according to the number of the viewable sides of the projection surface 130 .
- the number of the viewable sides is four.
- the capture subsystem 310 could capture the images or videos of the subject in four different views.
- the capture subsystem 310 may include a single camera device.
- in that case, the single camera device of the capture subsystem 310 needs to shoot the subject several times, once from each view, to satisfy the projective display system 100 in the multi-view configuration.
- the capture subsystem 310 could generate images or videos by utilizing a graphic engine to render 3D objects in single view or multiple views.
- FIG. 6 illustrates a flowchart regarding image/video capture of the present invention.
- the flow starts.
- the user may select to enable the projection function of the projective display system 100 .
- the projection surface 130 in the embodiment of FIG. 1 is in the single-view configuration, while the projection surface 130 in the embodiment of FIG. 2A is in the multi-view configuration.
- the determination of the step 630 may be based on information inputted by the user. In some other embodiments, the determination of the step 630 may be performed automatically based on information related to the projection surface 130 .
- magnetic hinges may be used to connect different sides of a collapsible projection surface 130 .
- By detecting the magnetic hinges, the number of viewable sides of the projection surface 130 in the multi-view configuration can be obtained.
- if the projection surface 130 is in the single-view configuration, the flow goes to step 640, where the camera device(s) of the capture subsystem 310 is instructed to shoot once in front of the subject. If the projection surface 130 is in the multi-view configuration, the flow may go to step 650, where it is determined whether the subject is still. If yes, the flow may go to step 660; otherwise, the flow may go to step 674, where a notice message may be shown to remind the user that the capture failed, because it is not easy to derive images of a moving subject in multiple views.
- in step 660, it is determined whether the depth information of the subject has been obtained. If yes, the flow may go to step 672, where depth synthesis is performed; by the depth synthesis, the images of the subject in other views can be generated according to the depth information. If no in step 660, the flow may go to step 670, where multiple shots are conducted by the camera device(s) to capture the images or videos of the subject in multiple views. Afterwards, the flow may go to step 680, where multiple-view synthesis is performed. The multiple-view synthesis may combine the images or videos captured from different views into one, or adjust the relative positions of the images or videos captured from different views on the projection source component 122. For example, images or videos captured from different views could be combined into a single image or video like the patterns shown by FIG. 5 or FIG. 2A.
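The decision flow of FIG. 6 described above (steps 630-680) can be summarized in code. The callback names (shoot_once, depth_synthesis, and so on) are hypothetical placeholders standing in for the camera and synthesis operations.

```python
def capture_flow(multi_view, subject_still, has_depth,
                 shoot_once, shoot_multi, depth_synthesis, view_synthesis):
    """Decision flow of FIG. 6; callbacks stand in for camera/synthesis ops."""
    if not multi_view:                     # step 630 -> 640: single shot
        return shoot_once()
    if not subject_still:                  # step 650 -> 674: capture fails
        raise RuntimeError("subject is moving: multi-view capture failed")
    if has_depth:                          # step 660 -> 672: depth synthesis
        views = depth_synthesis(shoot_once())
    else:                                  # step 660 -> 670: multiple shots
        views = shoot_multi()
    return view_synthesis(views)           # step 680: multi-view synthesis

# Dry run with stand-in callbacks:
out = capture_flow(True, True, True,
                   shoot_once=lambda: "front",
                   shoot_multi=lambda: ["v1", "v2", "v3", "v4"],
                   depth_synthesis=lambda img: [img] * 4,
                   view_synthesis=lambda vs: vs)
assert out == ["front"] * 4
```

Note how the depth-synthesis branch avoids the multiple physical shots: one frontal shot plus depth information is enough to generate the remaining views.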
- the encoding subsystem 320 of the capture and transmitting system 300 encodes the captured images generated by the capture subsystem 310 with respect to a single view or multiple views.
- FIG. 7 illustrates a flowchart regarding image/video encoding performed by the encoding subsystem 320 .
- the flow starts.
- the user may select to enable the projection function of the projective display system 100 .
- it may be determined whether the projection surface 130 is in the single-view configuration or in the multi-view configuration in step 730 .
- the determination of the step 730 can be based on information inputted by the user. In some other embodiments, the determination of the step 730 may be performed automatically based on information related to the projection surface 130 .
- magnetic hinges may be used to connect different sides of a collapsible projection surface 130 .
- By detecting the magnetic hinges, the number of viewable sides of the projection surface 130 in the multi-view configuration can be obtained.
- in step 750, it is determined whether the projection processor 124 of the projection source device 120 has been applied before. If the projection processor 124 has been applied, this means the projection processor 124 has obtained the depth information of the subject.
- if yes in step 750, the flow may go to step 760, in which shape/depth coding is performed: the projection processor 124 provides the shape/depth mapping information to the encoding subsystem 320 for performing the shape/depth encoding; otherwise, the flow may go to step 770, in which texture coding is performed.
- the shape/depth coding performed in step 760 may include MPEG-4 shape coding.
- the texture coding may comprise JPEG, GIF, PNG, and still profile of MPEG-1, MPEG-2, MPEG-4, WMV, AVS, H.261, H.263, H.264, H.265, VP6, VP8, VP9, any other texture coding or combination thereof.
- the texture coding may comprise motion JPEG, MPEG-1, MPEG-2, MPEG-4, WMV, AVS, H.261, H.263, H.264, H.265, VP6, VP8, VP9, any other texture coding or combination thereof.
- the decoding subsystem 420 in the display and receiving system 400 is used to decode the encoded data generated by the encoding subsystem 320 based on how the data is encoded as described above.
- the packing subsystem 330 of the capture and transmitting system 300 packs the encoded image data with projection configuration information.
- the projection configuration information may be in the form of H.264 SEI (Supplemental Enhancement Information) or any other configuration format.
- the projection configuration information includes at least one of the number of the viewable sides of the projection surface 130 and the enablement of the projection function of the projective display system 100 .
- An example of the information to be packed with the encoded image data is illustrated in FIG. 8 .
- The UUID (universally unique identifier) identifies the following fields, which carry information of the number of the viewable sides of the projection surface 130 and information of the enablement of the projection function of the projective display system 100 .
- the UUID is 128 bits long.
- the field corresponding to the number of the viewable sides of the projection surface 130 is one byte long.
- the field corresponding to the enablement of the projection function of the projective display system 100 is also one byte long.
- this is just one possible way to implement packing the projection configuration information with encoded image data, rather than a limitation of the present invention.
- the de-packing subsystem 430 of the display and receiving system 400 derives the packed image data from the storage device 340 or from the wired/wireless transmission between the capture and transmitting system 300 and the display and receiving system 400 . Accordingly, the de-packing subsystem 430 de-packs the received packed image data to derive the encoded image data (which will be decoded by the decoding subsystem 420 ) and the projection configuration information.
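A minimal sketch of the packing performed by the packing subsystem 330 and the inverse de-packing of subsystem 430, assuming the FIG. 8 layout (a 128-bit UUID followed by two one-byte fields). The UUID value and function names are invented for illustration, and a real system would embed this payload as an H.264 SEI user-data message rather than a raw byte prefix.

```python
import struct
import uuid

# Hypothetical identifier chosen for illustration; a real bitstream would
# define its own 128-bit UUID for this payload.
PROJ_CFG_UUID = uuid.UUID("12345678-1234-5678-1234-567812345678")

def pack_projection_config(encoded_data, num_sides, projection_enabled):
    """Prefix encoded image data with the FIG. 8 layout: 128-bit UUID,
    one byte for the number of viewable sides, one byte for the
    projection-function enable flag."""
    payload = PROJ_CFG_UUID.bytes + struct.pack(
        "BB", num_sides, int(projection_enabled))
    return payload + encoded_data

def unpack_projection_config(packed):
    """Inverse operation, as performed by the de-packing subsystem."""
    assert packed[:16] == PROJ_CFG_UUID.bytes, "unknown payload UUID"
    num_sides, enabled = struct.unpack("BB", packed[16:18])
    return packed[18:], num_sides, bool(enabled)

data, sides, enabled = unpack_projection_config(
    pack_projection_config(b"\x00\x01\x02", 4, True))
assert (data, sides, enabled) == (b"\x00\x01\x02", 4, True)
```

Because both fields are single bytes, this scheme supports up to 255 viewable sides, far more than the four sides P 1 -P 4 of the FIG. 2A embodiment.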
- The display subsystem 410 of the display and receiving system 400 is configured to display captured or rendered images/videos provided by the capture subsystem 310 on one or more source areas, such as source areas SA 1 -SA 4 on the projection source component 122 .
- the number of the source areas on which the display subsystem 410 displays the images is determined according to the number of the viewable sides of the projection surface 130 (which could be provided by the de-packing subsystem 430 ).
- the display subsystem 410 has the images/videos shown on four source areas SA 1 -SA 4 in response to the four viewable sides P 1 -P 4 of the projection surface 130 .
- the display subsystem 410 could have the images/videos shown on fewer source areas of the projection source component 122 , such as two or three.
- the display subsystem 410 could project or display identical or different images thereon. This depends on the number of views for capturing the subject and the number of the viewable sides of the projection surface.
- the display subsystem 410 projects or displays duplicated captured images/videos on the different source areas of the projection source component 122 of the projection source device 120 .
- the display subsystem 410 may have identical images/videos shown on all the source areas SA 1 -SA 4 .
- alternatively, the display subsystem 410 has each of the images/videos displayed on one of the source areas of the projection source component 122 .
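One possible policy for assigning images/videos to source areas, as described above, might look like this; the function name and the duplication rule are illustrative assumptions rather than the patent's method.

```python
def assign_source_areas(images, num_sides):
    """Map captured/decoded images to source areas SA1..SAn.

    A single image is duplicated across all viewable sides; with one
    image per view, each image goes to its own source area.
    """
    if len(images) == 1:
        return {f"SA{i + 1}": images[0] for i in range(num_sides)}
    if len(images) == num_sides:
        return {f"SA{i + 1}": img for i, img in enumerate(images)}
    raise ValueError("view count does not match viewable sides")

assert assign_source_areas(["front"], 4) == {
    "SA1": "front", "SA2": "front", "SA3": "front", "SA4": "front"}
areas = assign_source_areas(["v1", "v2", "v3", "v4"], 4)
assert areas["SA3"] == "v3"
```

The number of sides would come from the de-packed projection configuration information, so the sender's capture setup and the receiver's projection surface stay consistent.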
- Circuits in the embodiments of the invention may include functions implemented as software executed by a processor, as hardware circuits or structures, or as a combination of both.
- the processor may be a general-purpose or dedicated processor.
- the software may comprise programming logic, instructions or data to implement certain function for an embodiment of the invention.
- the software may be stored in a medium accessible by a machine or computer-readable medium, such as read-only memory (ROM), random-access memory (RAM), magnetic disk (e.g., floppy disk and hard drive), optical disk (e.g., CD-ROM) or any other data storage medium.
- the media may store programming instructions in a compressed and/or encrypted format, as well as instructions that may have to be compiled or installed by an installer before being executed by the processor.
- an embodiment of the invention may be implemented as specific hardware components that contain hard-wired logic, field programmable gate array, complex programmable logic device, or application-specific integrated circuit, for performing the recited function, or by any combination of programmed general-purpose computer components and custom hardware component.
Abstract
A system for providing image or video to be displayed by a projective display system includes an encoding subsystem and a packing subsystem. The encoding subsystem is configured to encode at least one image or video of a subject to generate encoded image data. The packing subsystem is coupled to the encoding subsystem, and configured to pack the encoded image data with projection configuration information regarding the projective display system to generate packed image data. The projective display system comprises a projection source device and a projection surface; the projection source device projects the image or video onto the projection surface according to the packed image data.
Description
- This application claims the benefit of U.S. Provisional Application No. 62/003,260, filed on May 27, 2014, and U.S. Provisional Application No. 62/034,952, filed on Aug. 8, 2014. The entire contents of the related applications are incorporated herein by reference.
- According to a first aspect of the present invention, a system for providing image or video to be displayed by a projective display system is provided. The system comprises an encoding subsystem and a packing subsystem. The encoding subsystem is configured to encode at least one image or video of a subject to generate encoded image data. The packing subsystem is coupled to the encoding subsystem, and configured to pack the encoded image data with projection configuration information regarding the projective display system to generate packed image data. The projective display system comprises a projection source device and a projection surface; the projection source device projects the image or video to the projection surface according to the packed image data.
- According to a second aspect of the present invention, a system for displaying or projecting image or video by a projective display system comprises: a de-packing subsystem, a decoding subsystem and a display subsystem. The de-packing subsystem is configured to derive packed image data and de-pack the derived packed image data to obtain encoded image data and projection configuration information. The decoding subsystem is coupled to the de-packing subsystem, and configured to decode the encoded image data to generate at least one image or video. The projection display subsystem is coupled to the decoding subsystem, and configured to project or display the at least one image or video by a projection source component of the projective display system according to the projection configuration information.
- These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
-
FIG. 1 illustrates a projective display system in a single-view configuration according to one embodiment of the present invention. -
FIG. 2A illustrates a projective display system in a multi-view configuration according to one embodiment of the present invention. -
FIG. 2B illustrates source areas for displaying source images on a display panel of a projective display system according to one embodiment of the present invention. -
FIG. 3A illustrates a capture and transmitting system according to one embodiment of the present invention. -
FIG. 3B illustrates a display and receiving system according to one embodiment of the present invention. -
FIG. 4 illustrates an implementation of a capture subsystem according to one embodiment of the present invention. -
FIG. 5 illustrates combination patterns for captured images/videos. -
FIG. 6 illustrates a flowchart regarding how the capture subsystem captures images or videos according to one embodiment of the present invention. -
FIG. 7 illustrates a flowchart regarding how the captured images/videos or render images/videos are encoded according to one embodiment of the present invention. -
FIG. 8 illustrates projection configuration information to be packed with encoded image data according to one embodiment of the present invention. - Certain terms are used throughout the following descriptions and claims to refer to particular system components. As one skilled in the art will appreciate, manufacturers may refer to a component by different names. This document does not intend to distinguish between components that differ in name but not in function. In the following discussion and in the claims, the terms “include”, “including”, “comprise”, and “comprising” are used in an open-ended fashion, and thus should be interpreted to mean “including, but not limited to . . . ” The terms “couple” and “coupled” are intended to mean either an indirect or a direct electrical connection. Thus, if a first device couples to a second device, that connection may be through a direct electrical connection, or through an indirect electrical connection via other devices and connections.
-
FIG. 1 illustrates a projective display system in a single-view configuration according to one embodiment of the present invention. As shown in the figure, a projective display system 100 comprises at least one of a base 110, a projection source device 120, a projection surface 130 and an optional optical adjustment unit 140. In some embodiments, the projection source device 120 may be rotatably mounted, for example, on the base 110. In some other embodiments, the projection source device 120 may be placed in a carrier, which may be rotatably mounted, for example, on the base 110. The projection source device 120 may be a portable device, such as a smartphone, a tablet, a touch-controlled device, or any other electronic device with a display panel or a projection source component. The projection source device 120 comprises a projection source component 122 and a projection processor 124. - The
projection source component 122 is configured to project at least one source image. In some embodiments, the projection source component 122 may be an organic light-emitting diode display panel, a liquid crystal display panel, or any other passive or active display panel. In some other embodiments, the projection source component 122 may be a solid-state (laser or LED) light or any other type of light source. The projection processor 124 is configured to adaptively control an image adjustment on an input image to generate the source image. One purpose of the projection processor 124 is to maintain a projection quality of the projective display system 100. The projection processor 124 could be implemented with a general-purpose processor or dedicated hardware. Please note that the position of the projection processor 124 in FIG. 1 is for the purpose of illustration rather than limitation. The projection surface 130 could be made of transflective material or non-opaque material (e.g., transparent/semitransparent material). The projection surface 130 may be rotatably attached to the base 110. Also, the projection surface 130 could be flat or curved. The projection surface 130 is configured to mirror or partially reflect the source image that is projected from a first side of the projection surface 130 to form a virtual image on a second side that is opposite to the first side, thereby forming a stereo viewing effect. In detail, part of the intensity of the source image may pass through the projection surface 130, and the remaining intensity may be reflected by the projection surface 130, such that the projection surface 130 partially reflects the source image. As a result, a user may see the virtual image displayed on the projection surface 130 or floating behind the projection surface 130, thereby forming a stereo viewing effect, especially for a source image with a 3D effect. - The
optical adjustment element 140 may be rotatably attached to and detachable from the base 110. The optical adjustment element 140 may optically adjust forming of the virtual image on the projection surface 130. In various embodiments of the present invention, the optical adjustment element 140 could be a single lens or a compound lens. -
FIG. 2A illustrates the projective display system 100 in a multi-view configuration according to another embodiment of the present invention. In this embodiment, the base 110 of the projective display system 100 may be placed at the center of the projection source component 122 of the projection source device 120. A polyhedron may be formed by the projection surface 130 and may be put on top of the base 110. In addition, multiple pieces of the optional optical adjustment units 140 are optionally attached to different sides of the surface of the base 110. - The
projection surface 130 has four viewable sides P1-P4. The polyhedral projection surface 130 may be formed by folding the projection surface 130 of FIG. 1 or by combining parts of the projection surface 130 of FIG. 1 together. Please note that the shape and the number of viewable sides of the projection surface 130 illustrated by FIG. 2A are not limitations of the present invention. Multiple viewable sides of the projection surface 130 allow the virtual images to be viewed from different sides. The viewable sides P1-P4 respectively correspond to four source areas SA1-SA4 on the projection source component 122 of the projection source device 120 (the source area SA4 is not shown). Source images shown on the source areas SA1-SA4 are mirrored or partially reflected by the four viewable sides P1-P4, respectively, thereby forming virtual images to users' eyes. - Each of the
optical adjustment elements 140 is detachable from the base 110. Hence, in various embodiments of the present invention, not every optical adjustment element 140 shown in FIG. 2A is attached to the base 110. When they are attached to the base, the optical adjustment elements 140 can optically adjust forming of the virtual image on the projection surface 130. In various embodiments of the present invention, the optical adjustment element 140 could be a single lens or a compound lens. -
FIG. 2B illustrates a layout of source areas for displaying the source images on the projection source component 122 when the projective display system 100 is in a multi-view configuration. The arrangement of the source areas may be associated with the shape of the projection surface 130. The illustrated arrangement in FIG. 2B is not a limitation of the present invention. - According to one embodiment of the present invention, a processing system for the
projective display system 100 is provided. The processing system of the present invention may include a capture and transmitting system 300 as illustrated by FIG. 3A and a display and receiving system 400 as illustrated by FIG. 3B. The capture and transmitting system 300 and the display and receiving system 400 could be implemented in a single device or in different devices. - The capture and transmitting
system 300 may include a capture subsystem 310, an encoding subsystem 320, and a packing subsystem 330. Packed image data generated by the capture and transmitting system 300 may be stored in a storage device 340 or be sent to a channel coding and modulation device 350 or any other device. Afterwards, the coded and modulated data stream may be sent to the display and receiving system 400 via wired or wireless transmission. - The display and receiving
system 400 includes a display subsystem 410, a decoding subsystem 420, and a de-packing subsystem 430. The display and receiving system 400 receives the packed data generated by the capture and transmitting system 300 through the storage device 340 or wired/wireless transmission. A channel decoding and demodulation device 600 channel-decodes and demodulates the packed image data that was processed by the channel coding and modulation device 350. - Operations of the capture and transmitting
system 300 and the display and receiving system 400 will be illustrated later in further detail. - The
capture subsystem 310 of the capture and transmitting system 300 may include one or multiple camera devices. Please refer to FIG. 4. In this embodiment, the capture subsystem 310 may comprise camera devices of different electronic devices 310_1-310_4. The camera devices of the electronic devices 310_1-310_4 capture images or videos of the subject from different capturing views CV1-CV4. In some embodiments, the images or videos IMG1-IMG4 corresponding to different views CV1-CV4 may be combined by the capture subsystem 310 to generate one single image or video in patterns (a)-(c) as shown in FIG. 5. In pattern (a), the images or videos IMG1-IMG4 corresponding to views CV1-CV4 are horizontally arranged in one image or video. In pattern (b), the images or videos IMG1-IMG4 corresponding to views CV1-CV4 are arranged as a 2×2 square in one image or video. In pattern (c), the images or videos IMG1-IMG4 corresponding to views CV1-CV4 are vertically arranged in one image or video. Alternatively, in some other embodiments, the images or videos IMG1-IMG4 may be stored or sent separately as shown in FIG. 5 (d) but not combined into a single image or video. - In one embodiment, the camera devices of the electronic devices 310_1-310_4 may utilize wide-angle or fish-eye lenses to cover a wide scene.
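As a rough sketch of the combination patterns (a)-(c) above — horizontal strip, 2×2 square, and vertical strip — the following illustrative function (its name and frame representation are assumptions, not part of the disclosure) arranges four equal-sized frames, each given as a list of pixel rows:

```python
def combine_views(views, pattern):
    """Combine four equal-sized per-view frames into one frame,
    following patterns (a)-(c) of FIG. 5. Each frame is a list of rows."""
    def hcat(frames):   # place frames side by side
        return [sum((f[r] for f in frames), []) for r in range(len(frames[0]))]
    def vcat(frames):   # stack frames top to bottom
        return [row for f in frames for row in f]
    if pattern == "a":  # IMG1..IMG4 in one horizontal strip
        return hcat(views)
    if pattern == "b":  # IMG1..IMG4 as a 2x2 square
        return vcat([hcat(views[:2]), hcat(views[2:])])
    if pattern == "c":  # IMG1..IMG4 in one vertical strip
        return vcat(views)
    raise ValueError("unknown pattern: " + pattern)
```

Pattern (d), by contrast, would simply keep the four frames separate.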
- The number of the views for capturing the subject can be determined according to the number of the viewable sides of the
projection surface 130. For example, in the embodiment ofFIG. 2 , the number of the viewable sides is four. Thus, thecapture subsystem 310 could capture the images or videos of the subject in four different views. - The
capture subsystem 310 may include a single camera device. In such a case, the single camera device of the capture subsystem 310 needs to shoot the subject several times, once from each view, to satisfy the projective display system 100 in the multi-view configuration. - In one embodiment, the
capture subsystem 310 could generate images or videos by utilizing a graphic engine to render 3D objects in a single view or multiple views. -
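The choice between these capture paths follows the FIG. 6 decision flow described next. As a minimal illustrative sketch of that decision logic (argument names and returned labels are hypothetical, not from the disclosure):

```python
def plan_capture(multi_view, subject_still, depth_available):
    """Choose a capture action following the FIG. 6 decision flow.

    Arguments mirror the checks at steps 630, 650 and 660.
    """
    if not multi_view:
        return "single_shot"              # step 640: one shot in front of the subject
    if not subject_still:
        return "notify_capture_failure"   # step 674: subject moving, warn the user
    if depth_available:
        return "depth_synthesis"          # step 672: synthesize other views from depth
    return "multi_shot_and_view_synthesis"  # steps 670 + 680
```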
FIG. 6 illustrates a flowchart regarding image/video capture of the present invention. At step 610, the flow starts. At step 620, the user may select to enable the projection function of the projective display system 100. It is determined whether the projection surface 130 is in a single-view configuration or in a multi-view configuration in step 630. For example, the projection surface 130 in the embodiment of FIG. 1 is in the single-view configuration, while the projection surface 130 in the embodiment of FIG. 2A is in the multi-view configuration. In some embodiments, the determination of step 630 may be based on information inputted by the user. In some other embodiments, the determination of step 630 may be performed automatically based on information related to the projection surface 130. For example, in some embodiments of the present invention, magnetic hinges may be used to connect different sides of a collapsible projection surface 130. By detecting the magnetic hinges, the information about the number of viewable sides of the projection surface 130 in the multi-view configuration can be obtained. - If the
projection surface 130 is in the single-view configuration, the flow goes to step 640, where the camera device(s) of the capture subsystem 310 is instructed to shoot once in front of the subject. If the projection surface 130 is in the multi-view configuration, the flow may go to step 650, where it is determined whether the subject is still. If yes, the flow may go to step 660; otherwise, the flow may go to step 674, where a notice message may be shown to remind the user of the failure of the capturing, because when the subject is moving, it is not easy to derive images of the subject from multiple views. - At
step 660, it is determined whether the depth information of the subject is obtained. If yes, the flow may go to step 672, where depth synthesis is performed. Through depth synthesis, the images of the subject in other views can be generated according to the depth information. If the result is no in step 660, the flow may go to step 670, where multiple shots are conducted by the camera device(s) to capture the images or videos of the subject in multiple views. Afterwards, the flow may go to step 680, where the multiple-view synthesis is performed. The multiple-view synthesis may combine the images or videos captured from different views into one, or adjust the relative positions of the images or videos captured from different views on the projection source component 122. For example, images or videos captured from different views could be combined into a single image or video like the patterns shown by FIG. 5 or FIG. 2A. - The
encoding subsystem 320 of the capture and transmitting system 300 encodes the captured images generated by the capture subsystem 310 with respect to a single view or multiple views. FIG. 7 illustrates a flowchart regarding image/video encoding performed by the encoding subsystem 320. At step 710, the flow starts. At step 720, the user may select to enable the projection function of the projective display system 100. Then, it may be determined whether the projection surface 130 is in the single-view configuration or in the multi-view configuration in step 730. The determination of step 730 can be based on information inputted by the user. In some other embodiments, the determination of step 730 may be performed automatically based on information related to the projection surface 130. For example, in some embodiments of the present invention, magnetic hinges may be used to connect different sides of a collapsible projection surface 130. By detecting the magnetic hinges, the information about the number of viewable sides of the projection surface 130 in the multi-view configuration can be obtained. - If the
projection surface 130 is in the single-view configuration, the flow may go to step 750; if the projection surface 130 is in the multi-view configuration, the flow may go to step 740, where multi-view coding may be conducted to remove data redundancy between captured images/videos (by a camera device) or rendered images/videos (by a graphic engine) with respect to multiple views. In step 750, it is determined whether the projection processor 124 of the projection source device 120 has been applied before. If the projection processor 124 of the projection source device 120 has been applied, this means the projection processor 124 has obtained the depth information of the subject. Therefore, if the result is yes in step 750, the flow may go to step 760, in which shape/depth coding is performed, where the projection processor 124 provides the shape/depth mapping information to the encoding subsystem 320 for performing the shape/depth encoding; otherwise, the flow may go to step 770, in which texture coding is performed. - In one embodiment, the shape/depth coding performed in
step 760 may include MPEG-4 shape coding. In the case of image encoding, the texture coding may comprise JPEG, GIF, PNG, and the still profile of MPEG-1, MPEG-2, MPEG-4, WMV, AVS, H.261, H.263, H.264, H.265, VP6, VP8, VP9, any other texture coding, or a combination thereof. In the case of video encoding, the texture coding may comprise Motion JPEG, MPEG-1, MPEG-2, MPEG-4, WMV, AVS, H.261, H.263, H.264, H.265, VP6, VP8, VP9, any other texture coding, or a combination thereof. - The
decoding subsystem 420 in the display and receiving system 400 is used to decode the encoded data generated by the encoding subsystem 320 based on how the data is encoded as described above. - After the captured or rendered images are encoded by the
encoding subsystem 320 to generate the encoded image data, the packing subsystem 330 of the capture and transmitting system 300 packs the encoded image data with projection configuration information. The projection configuration information may be in the form of H.264 SEI (Supplemental Enhancement Information) or any other configuration format. In one embodiment, the projection configuration information includes at least one of the number of the viewable sides of the projection surface 130 and the enablement of the projection function of the projective display system 100. An example of the information to be packed with the encoded image data is illustrated in FIG. 8. - The
projection surface 130 and information of the enablement of the projection function of theprojective display system 110. In this embodiment, UUID is 128-bit long, the field corresponding to the number of the viewable sides of theprojection surface 130 is one-byte long, and the field corresponding to the enablement of the projection function of theprojective display system 110 is also one-byte long. However, this is just one possible way to implement packing the projection configuring information with encoded image data, rather than a limitation of the present invention. - The
de-packing subsystem 430 of the display and receiving system 400 derives the packed image data from the storage device 340 or from the wired/wireless transmission between the capture and transmitting system 300 and the display and receiving system 400. Accordingly, the de-packing subsystem 430 de-packs the received packed image data to derive the encoded image data (which will be decoded by the decoding subsystem 420) and the projection configuration information. -
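As a concrete sketch of the packing layout described above — a 16-byte (128-bit) UUID followed by two one-byte fields, then the encoded payload — a pack/de-pack pair might look like the following. The UUID value, function names, and field order are assumptions for illustration; the disclosure fixes only the field sizes:

```python
import struct
import uuid

# Hypothetical identifier; the disclosure only requires a 128-bit UUID.
PROJ_CFG_UUID = uuid.UUID("12345678-1234-5678-1234-567812345678")

def pack(encoded_data, viewable_sides, projection_enabled):
    """Prefix encoded image data with the UUID and two one-byte config fields."""
    header = PROJ_CFG_UUID.bytes + struct.pack(
        "BB", viewable_sides, 1 if projection_enabled else 0)
    return header + encoded_data

def depack(packed):
    """Recover (viewable_sides, enabled, encoded_data) from packed bytes."""
    assert packed[:16] == PROJ_CFG_UUID.bytes, "unrecognized payload"
    sides, enabled = struct.unpack("BB", packed[16:18])
    return sides, bool(enabled), packed[18:]
```

In an H.264 SEI realization, such a header would travel in a user-data SEI message alongside the coded bitstream rather than as a raw prefix.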
The display subsystem 410 of the display and receiving system 400 is configured to display captured or rendered images/videos provided by the capture subsystem 310 on one or more source areas, such as source areas SA1-SA4 on the projection source component 122. The number of the source areas on which the display subsystem 410 displays the images is determined according to the number of the viewable sides of the projection surface 130 (which could be provided by the de-packing subsystem 430). For example, in the embodiment of FIG. 2A, the display subsystem 410 has the images/videos shown on four source areas SA1-SA4 in response to the four viewable sides P1-P4 of the projection surface 130. However, if the projection surface 130 has fewer viewable sides in another embodiment, the display subsystem 410 could have the images/videos shown on fewer source areas of the projection source component 122, such as two or three. - For different source areas of the
projection source component 122, the display subsystem 410 could project or display identical or different images thereon. This depends on the number of views for capturing the subject and the number of the viewable sides of the projection surface. -
display subsystem 410 projects or displays duplicated captured images/videos on the different source areas of theprojection source component 122 of theprojection source device 100. For example, if the image is captured in single view, the number of views of capturing the subject is 1. Hence, if the captured image is projected to theprojection surface 130 inFIG. 2 , which has 4 viewable sides, thedisplay subsystem 310 may have all the images/videos shown on the source areas SA1-SA4 identical. In addition, when the number of views of capturing the subject is identical to the number of the viewable points of the projection surface, thedisplay subsystem 310 has each of images/videos display on one of the source areas of theprojection source component 122. - Reference in the specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least an implementation. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment. Thus, although embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that claimed subject matter may not be limited to the specific features or acts described. Rather, the specific features and acts are disclosed as sample forms of implementing the claimed subject matter.
- Circuits in the embodiments of the invention may include functions that may be implemented as software executed by a processor, hardware circuits or structures, or a combination of both. The processor may be a general-purpose or dedicated processor. The software may comprise programming logic, instructions or data to implement certain functions for an embodiment of the invention. The software may be stored in a medium accessible by a machine or computer-readable medium, such as read-only memory (ROM), random-access memory (RAM), magnetic disk (e.g., floppy disk and hard drive), optical disk (e.g., CD-ROM) or any other data storage medium. In one embodiment of the invention, the media may store programming instructions in a compressed and/or encrypted format, as well as instructions that may have to be compiled or installed by an installer before being executed by the processor. Alternatively, an embodiment of the invention may be implemented as specific hardware components that contain hard-wired logic, a field programmable gate array, a complex programmable logic device, or an application-specific integrated circuit for performing the recited functions, or by any combination of programmed general-purpose computer components and custom hardware components.
- Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.
Claims (15)
1. A system for providing image or video to be displayed by a projective display system, comprising:
an encoding subsystem, configured to encode at least one image or video of a subject to generate encoded image data; and
a packing subsystem, coupled to the encoding subsystem, and configured to pack the encoded image data with projection configuration information regarding the projective display system to generate packed image data,
wherein the projective display system comprises a projection source device and a projection surface, and the projection source device projects the image or video to the projection surface according to the packed image data.
2. The system of claim 1, further comprising a capture subsystem, configured to generate the at least one image or video of a subject in a single view or multiple views.
3. The system of claim 2, wherein the capture subsystem comprises a plurality of camera devices.
4. The system of claim 2, wherein the camera devices are disposed on different electronic devices.
5. The system of claim 2, wherein the capture subsystem captures a plurality of the images or videos of the subject and combines the images or videos in a specific arrangement pattern.
6. The system of claim 2, wherein the capture subsystem captures a plurality of images or videos of the subject and the encoding subsystem encodes the images or the videos to reduce data redundancy between the images or videos.
7. The system of claim 1, further comprising a graphic engine configured to render at least one object in a single view or multiple views to generate the at least one image or video.
8. The system of claim 1, wherein the encoding subsystem performs shape/depth encoding to encode the at least one image or video.
9. The system of claim 1, wherein the projection configuration information indicates the number of viewable sides of a projection surface of the projective display system or enablement of projection function of the projective display system.
10. A system for displaying or projecting image or video by a projective display system, comprising:
a de-packing subsystem, configured to derive packed image data and de-pack the derived packed image data to obtain encoded image data and projection configuration information;
a decoding subsystem, coupled to the de-packing subsystem, and configured to decode the encoded image data to generate at least one image or video; and
a projection display subsystem, coupled to the decoding subsystem, and configured to project or display the at least one image or video by a projection source component of the projective display system according to the projection configuration information, and a projection surface of the projective display system mirrors or partially reflects the at least one projected or displayed image.
11. The system of claim 10, wherein the projection configuration information indicates the number of viewable sides of a projection surface of the projective display system or enablement of projection function of the projective display system.
12. The system of claim 10, wherein the decoding subsystem decodes the encoded image data to generate a plurality of images or videos of a subject with respect to different views, and the display subsystem simultaneously displays the images or videos on a plurality of different areas of the projection source component.
13. The system of claim 10, wherein when a number of views of images of the subject is smaller than the number of viewable sides of a projection surface of the projective display system, the projection display subsystem projects or displays an identical image on several areas of the display panel.
14. The system of claim 10, wherein the de-packing subsystem derives the packed image data from a storage device.
15. The system of claim 10, wherein the de-packing subsystem derives the packed image data from a wired/wireless transmission between the capture and transmitting system and the display and receiving system.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/904,700 US20160165208A1 (en) | 2014-05-27 | 2015-05-27 | Systems for providing image or video to be displayed by projective display system and for displaying or projecting image or video by a projective display system |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462003260P | 2014-05-27 | 2014-05-27 | |
US201462034952P | 2014-08-08 | 2014-08-08 | |
US14/904,700 US20160165208A1 (en) | 2014-05-27 | 2015-05-27 | Systems for providing image or video to be displayed by projective display system and for displaying or projecting image or video by a projective display system |
PCT/CN2015/079979 WO2015180653A1 (en) | 2014-05-27 | 2015-05-27 | Systems for providing image or video to be displayed by projective display system and for displaying or projecting image or video by a projective display system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160165208A1 true US20160165208A1 (en) | 2016-06-09 |
Family
ID=54698113
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/904,699 Abandoned US20160156887A1 (en) | 2014-05-27 | 2015-05-27 | Projection processor for projective display system |
US14/905,290 Abandoned US20160165197A1 (en) | 2014-05-27 | 2015-05-27 | Projection processor and associated method |
US14/902,591 Active US10136114B2 (en) | 2014-05-27 | 2015-05-27 | Projection display component and electronic device |
US14/904,700 Abandoned US20160165208A1 (en) | 2014-05-27 | 2015-05-27 | Systems for providing image or video to be displayed by projective display system and for displaying or projecting image or video by a projective display system |
Family Applications Before (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/904,699 Abandoned US20160156887A1 (en) | 2014-05-27 | 2015-05-27 | Projection processor for projective display system |
US14/905,290 Abandoned US20160165197A1 (en) | 2014-05-27 | 2015-05-27 | Projection processor and associated method |
US14/902,591 Active US10136114B2 (en) | 2014-05-27 | 2015-05-27 | Projection display component and electronic device |
Country Status (3)
Country | Link |
---|---|
US (4) | US20160156887A1 (en) |
CN (4) | CN105474638A (en) |
WO (4) | WO2015180647A1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10241386B2 (en) * | 2017-03-20 | 2019-03-26 | The Eli Whitney Museum, Inc. | Flat folding projection device |
WO2020185963A1 (en) | 2019-03-11 | 2020-09-17 | IKIN, Inc. | Portable terminal accessory device for holographic projection and user interface |
USD988277S1 (en) * | 2021-06-17 | 2023-06-06 | IKIN, Inc. | Portable holographic projection device |
USD994011S1 (en) | 2021-06-16 | 2023-08-01 | IKIN, Inc. | Holographic projection device |
US11792311B2 (en) | 2018-07-30 | 2023-10-17 | IKIN, Inc. | Portable terminal accessory device for holographic projection and user interface |
USD1009969S1 (en) | 2021-06-17 | 2024-01-02 | IKIN, Inc. | Holographic device housing |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105474638A (en) * | 2014-05-27 | 2016-04-06 | 联发科技股份有限公司 | Systems for providing image or video to be displayed by projective display system and for displaying or projecting image or video by projective display system |
CN105791882B (en) * | 2016-03-22 | 2018-09-18 | 腾讯科技(深圳)有限公司 | Method for video coding and device |
TWI653563B (en) * | 2016-05-24 | 2019-03-11 | 仁寶電腦工業股份有限公司 | Projection touch image selection method |
US10386648B2 (en) * | 2016-08-08 | 2019-08-20 | Innolux Corporation | Image display system |
US20190349556A1 (en) * | 2017-02-01 | 2019-11-14 | Sharp Kabushiki Kaisha | Projection suitability detection system, projection suitability detection method, and non-transitory medium |
CN108732772B (en) * | 2017-04-25 | 2020-06-30 | 京东方科技集团股份有限公司 | Display device and driving method thereof |
TWI656359B (en) * | 2017-05-09 | 2019-04-11 | 瑞軒科技股份有限公司 | Device for mixed reality |
WO2019161570A1 (en) * | 2018-02-26 | 2019-08-29 | 神画科技(深圳)有限公司 | Projector and temperature compensation method for trapezoidal correction thereof |
CN112783456A (en) * | 2019-11-08 | 2021-05-11 | 上海博泰悦臻电子设备制造有限公司 | Screen projection connection realization method and screen projection system |
US20210304357A1 (en) * | 2020-03-27 | 2021-09-30 | Alibaba Group Holding Limited | Method and system for video processing based on spatial or temporal importance |
CN114660916A (en) * | 2022-03-16 | 2022-06-24 | 李�杰 | Multi-angle holographic image display system and method |
Family Cites Families (71)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5851060A (en) | 1995-09-13 | 1998-12-22 | Nikon Corporation | Projective display device |
JPH10138794A (en) | 1996-11-06 | 1998-05-26 | Denso Corp | Head-up display device |
DE10105482A1 (en) * | 2001-02-07 | 2002-08-08 | Design & Dev Ct Gmbh & Co Kg | Portable projection screen has stand formed from plates which are hinged to protect screen when in transit |
RU2227314C1 (en) * | 2002-09-17 | 2004-04-20 | Самсунг Электроникс Ко., Лтд. | Optical system of projection tv receiver |
JP2004177654A (en) * | 2002-11-27 | 2004-06-24 | Fuji Photo Optical Co Ltd | Projection picture display device |
JP2005055812A (en) | 2003-08-07 | 2005-03-03 | Olympus Corp | Projector device and desk having the projector device fitted therein |
CN103235472B (en) | 2004-04-01 | 2015-11-04 | Mdh全息股份有限公司 | For projector equipment and the method for pepper's ghost illusion |
ATE497190T1 (en) | 2004-04-01 | 2011-02-15 | Musion Systems Ltd | PROJECTION APPARATUS AND METHOD FOR THE PEPPER'S GHOST ILLUSION |
CN102053472B (en) | 2004-04-01 | 2013-08-28 | 穆森系统有限公司 | Projection equipment and method for illusion of Peppers' ghost |
US7396133B2 (en) * | 2004-12-06 | 2008-07-08 | N-Lighten Technologies | System and method for self-aligning collapsible display |
JP2006267222A (en) | 2005-03-22 | 2006-10-05 | Toshiba Corp | Projection screen and image projection system |
US7843449B2 (en) | 2006-09-20 | 2010-11-30 | Apple Inc. | Three-dimensional display system |
US20080018591A1 (en) * | 2006-07-20 | 2008-01-24 | Arkady Pittel | User Interfacing |
CN101183342A (en) * | 2006-11-14 | 2008-05-21 | 明基电通股份有限公司 | Projection device control system |
KR20080044370A (en) * | 2006-11-16 | 2008-05-21 | 최해용 | Stereoscopic screen |
CN101311769A (en) * | 2007-05-24 | 2008-11-26 | 创盟科技股份有限公司 | Reflection imaging system |
CN101398596A (en) | 2007-09-25 | 2009-04-01 | 海尔集团公司 | Projector |
US20090219985A1 (en) | 2008-02-28 | 2009-09-03 | Vasanth Swaminathan | Systems and Methods for Processing Multiple Projections of Video Data in a Single Video File |
CN101256672B (en) * | 2008-03-21 | 2011-10-12 | 北京中星微电子有限公司 | Object image depth restruction apparatus based on video camera apparatus as well as projecting apparatus thereof |
WO2009119808A1 (en) | 2008-03-27 | 2009-10-01 | 三洋電機株式会社 | Projection video display device |
US7869204B2 (en) * | 2008-09-15 | 2011-01-11 | International Business Machines Corporation | Compact size portable computer having a fully integrated virtual keyboard projector and a display projector |
JP5349010B2 (en) | 2008-11-06 | 2013-11-20 | 三菱電機株式会社 | Rear projection display |
CN102224455B (en) * | 2008-12-10 | 2014-10-08 | 株式会社尼康 | Projection device |
CN102662297B (en) * | 2009-01-08 | 2015-08-05 | 日立麦克赛尔株式会社 | Inclination projection optics system and use the projection type video display device of this system |
US8194001B2 (en) | 2009-03-27 | 2012-06-05 | Microsoft Corporation | Mobile computer device display postures |
US8330673B2 (en) | 2009-04-02 | 2012-12-11 | GM Global Technology Operations LLC | Scan loop optimization of vector projection display |
WO2011021352A1 (en) | 2009-08-19 | 2011-02-24 | パナソニック株式会社 | Information-processing device provided with a projector |
CN201594167U (en) | 2009-11-20 | 2010-09-29 | 江苏惠通集团有限责任公司 | Portable projecting apparatus for miniature terminal device |
JP5511360B2 (en) * | 2009-12-22 | 2014-06-04 | キヤノン株式会社 | Image display device |
US20130278631A1 (en) | 2010-02-28 | 2013-10-24 | Osterhout Group, Inc. | 3d positioning of augmented reality information |
EP2372512A1 (en) * | 2010-03-30 | 2011-10-05 | Harman Becker Automotive Systems GmbH | Vehicle user interface unit for a vehicle electronic device |
US8322863B1 (en) | 2010-06-18 | 2012-12-04 | Samuel Seungmin Cho | Apparatus and method for automated visual distortion adjustments for a portable projection device |
JP5488306B2 (en) | 2010-07-29 | 2014-05-14 | 船井電機株式会社 | projector |
US8540379B2 (en) | 2010-10-15 | 2013-09-24 | Panasonic Corporation | Image display device and information processing apparatus including the same |
CN102109750A (en) | 2011-01-20 | 2011-06-29 | 杨绿 | 360-degree floating three-dimensional phantom imaging system |
US10083639B2 (en) * | 2011-02-04 | 2018-09-25 | Seiko Epson Corporation | Control device for controlling image display device, head-mounted display device, image display system, control method for the image display device, and control method for the head-mounted display device |
JP5551639B2 (en) | 2011-03-11 | 2014-07-16 | パナソニック株式会社 | Image display device |
US8988512B2 (en) | 2011-04-14 | 2015-03-24 | Mediatek Inc. | Method for adjusting playback of multimedia content according to detection result of user status and related apparatus thereof |
JPWO2012153450A1 (en) * | 2011-05-11 | 2014-07-31 | パナソニック株式会社 | Moving image transmitting apparatus and moving image transmitting method |
US8764206B2 (en) * | 2011-05-23 | 2014-07-01 | 360Brandvision, Inc. | Accessory for reflecting an image from a display screen of a portable electronic device |
JP5847924B2 (en) | 2011-06-08 | 2016-01-27 | エンパイア テクノロジー ディベロップメント エルエルシー | 2D image capture for augmented reality representation |
US20130147686A1 (en) | 2011-12-12 | 2013-06-13 | John Clavin | Connecting Head Mounted Displays To External Displays And Other Communication Networks |
CN103164024A (en) * | 2011-12-15 | 2013-06-19 | 西安天动数字科技有限公司 | Somatosensory interactive system |
CN102520900B (en) | 2011-12-20 | 2015-03-25 | Tcl集团股份有限公司 | Multilayer display device and display control method thereof |
CN202548527U (en) * | 2012-01-19 | 2012-11-21 | 郭振 | Wide-angle naked-eye stereoscopic projection quasi-holographic imaging system |
KR101320052B1 (en) * | 2012-01-31 | 2013-10-21 | 한국과학기술연구원 | 3-dimensional display apparatus using extension of viewing zone width |
CN103293838A (en) | 2012-02-22 | 2013-09-11 | 光宝电子(广州)有限公司 | Minitype projecting device and method for prolonging play time of minitype projecting device and enabling image quality to be optimal |
GB2499793B (en) * | 2012-02-28 | 2018-01-17 | Bae Systems Plc | A deployment apparatus for a partially reflective combiner. |
JP5970887B2 (en) * | 2012-03-19 | 2016-08-17 | 株式会社リコー | Image processing system, image processing apparatus and display apparatus used therefor, and image processing method in image processing system |
US9717981B2 (en) | 2012-04-05 | 2017-08-01 | Microsoft Technology Licensing, Llc | Augmented reality and physical games |
CN103428472B (en) | 2012-05-18 | 2016-08-10 | 郑州正信科技发展股份有限公司 | A kind of full interactive exchange communication method in kind based on collaborative perception and device thereof |
CN202535428U (en) * | 2012-05-22 | 2012-11-14 | 广东欧珀移动通信有限公司 | Clamshell mobile phone structure with projection screen |
US8888295B2 (en) | 2012-07-02 | 2014-11-18 | Disney Enterprises, Inc. | Reflective surface tensioning and cleaning system for pepper's ghost illusion |
JP6213812B2 (en) | 2012-07-31 | 2017-10-18 | Tianma Japan株式会社 | Stereoscopic image display apparatus and stereoscopic image processing method |
CN108495103B (en) | 2012-11-13 | 2021-05-18 | 联想(北京)有限公司 | Electronic equipment |
CN203101796U (en) | 2012-12-14 | 2013-07-31 | 王奕盟 | Integrated transparent-screen projection apparatus |
JP6136246B2 (en) | 2012-12-25 | 2017-05-31 | セイコーエプソン株式会社 | Projector and projector control method |
WO2014103072A1 (en) * | 2012-12-28 | 2014-07-03 | 楽天株式会社 | Access control system, access control method, mobile terminal, method for controlling mobile terminal, recording medium on which program for controlling mobile terminal is recorded, and program for controlling mobile terminal |
US20160062117A1 (en) * | 2013-04-01 | 2016-03-03 | Pioneer Corporation | Virtual image display device |
JP6175866B2 (en) | 2013-04-02 | 2017-08-09 | 富士通株式会社 | Interactive projector |
JP2014230051A (en) * | 2013-05-22 | 2014-12-08 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
CN103309048A (en) * | 2013-06-18 | 2013-09-18 | 彭波 | Method and device for stereo laser imaging and autostereoscopic displaying |
CN203422528U (en) | 2013-07-05 | 2014-02-05 | 张家港康得新光电材料有限公司 | Projection-type naked-eye stereo display system |
EP3045970B1 (en) | 2013-09-13 | 2020-06-17 | Maxell, Ltd. | Projection-type image display device |
CN103595996B (en) * | 2013-12-02 | 2016-05-04 | 南京航空航天大学 | A kind of multiview data transmission method that is applicable to real three-dimensional display system |
CN104714627B (en) | 2013-12-11 | 2018-07-06 | 联想(北京)有限公司 | The method and electronic equipment of a kind of information processing |
US20150268838A1 (en) * | 2014-03-20 | 2015-09-24 | Institute For Information Industry | Methods, systems, electronic devices, and non-transitory computer readable storage medium media for behavior based user interface layout display (build) |
CN105474638A (en) | 2014-05-27 | 2016-04-06 | 联发科技股份有限公司 | Systems for providing image or video to be displayed by projective display system and for displaying or projecting image or video by projective display system |
CN105223761B (en) | 2014-07-01 | 2017-05-24 | 中强光电股份有限公司 | Projection device and illumination system |
US9483080B2 (en) * | 2014-09-26 | 2016-11-01 | Intel Corporation | Electronic device with convertible touchscreen |
US10334214B2 (en) | 2015-06-30 | 2019-06-25 | Motorola Mobility Llc | Method and apparatus configured for combined vibratory and projection functions |
2015
- 2015-05-27 CN CN201580001601.7A patent/CN105474638A/en active Pending
- 2015-05-27 CN CN201580001603.6A patent/CN105474071A/en active Pending
- 2015-05-27 WO PCT/CN2015/079942 patent/WO2015180647A1/en active Application Filing
- 2015-05-27 WO PCT/CN2015/079940 patent/WO2015180646A1/en active Application Filing
- 2015-05-27 WO PCT/CN2015/079979 patent/WO2015180653A1/en active Application Filing
- 2015-05-27 WO PCT/CN2015/079933 patent/WO2015180645A1/en active Application Filing
- 2015-05-27 US US14/904,699 patent/US20160156887A1/en not_active Abandoned
- 2015-05-27 US US14/905,290 patent/US20160165197A1/en not_active Abandoned
- 2015-05-27 CN CN201580001751.8A patent/CN105531625B/en active Active
- 2015-05-27 US US14/902,591 patent/US10136114B2/en active Active
- 2015-05-27 CN CN201580001608.9A patent/CN105519098A/en active Pending
- 2015-05-27 US US14/904,700 patent/US20160165208A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US20160182872A1 (en) | 2016-06-23 |
US20160165197A1 (en) | 2016-06-09 |
US10136114B2 (en) | 2018-11-20 |
WO2015180653A1 (en) | 2015-12-03 |
CN105474071A (en) | 2016-04-06 |
CN105531625B (en) | 2018-06-01 |
WO2015180645A1 (en) | 2015-12-03 |
CN105531625A (en) | 2016-04-27 |
WO2015180646A1 (en) | 2015-12-03 |
CN105474638A (en) | 2016-04-06 |
US20160156887A1 (en) | 2016-06-02 |
WO2015180647A1 (en) | 2015-12-03 |
CN105519098A (en) | 2016-04-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20160165208A1 (en) | Systems for providing image or video to be displayed by projective display system and for displaying or projecting image or video by a projective display system | |
US20210329222A1 (en) | System and method for creating a navigable, three-dimensional virtual reality environment having ultra-wide field of view | |
US20200336725A1 (en) | Systems, methods and apparatus for compressing video content | |
EP3249922A1 (en) | Method, apparatus and stream for immersive video format | |
US7660472B2 (en) | System and method for managing stereoscopic viewing | |
US11363250B2 (en) | Augmented 3D entertainment systems | |
US11882267B2 (en) | Adapting video images for wearable devices | |
US20190251735A1 (en) | Method, apparatus and stream for immersive video format | |
CN109478344A (en) | Method and apparatus for composograph | |
TW201016013A (en) | Method and system for encoding a 3D video signal, encoder for encoding a 3-D video signal, encoded 3D video signal, method and system for decoding a 3D video signal, decoder for decoding a 3D video signal | |
US20160353058A1 (en) | Method and apparatus to present three-dimensional video on a two-dimensional display driven by user interaction | |
US20190266802A1 (en) | Display of Visual Data with a Virtual Reality Headset | |
US9495795B2 (en) | Image recording device, three-dimensional image reproducing device, image recording method, and three-dimensional image reproducing method | |
CN113906761A (en) | Method and apparatus for encoding and rendering 3D scene using patch | |
US20230042874A1 (en) | Volumetric video with auxiliary patches | |
US20230283759A1 (en) | System and method for presenting three-dimensional content | |
US20120294374A1 (en) | Conditional replenishment for three-dimensional images with block-based spatial thresholding | |
US20220377302A1 (en) | A method and apparatus for coding and decoding volumetric video with view-driven specularity | |
KR20220109433A (en) | Method and apparatus for encoding and decoding multi-viewpoint 3DoF+ content | |
Rocha et al. | An overview of three-dimensional videos: 3D content creation, 3D representation and visualization | |
US11968349B2 (en) | Method and apparatus for encoding and decoding of multiple-viewpoint 3DoF+ content | |
US20220345681A1 (en) | Method and apparatus for encoding, transmitting and decoding volumetric video | |
Balogh et al. | HoloVizio-True 3D display system | |
JP2023507586A (en) | Method and Apparatus for Encoding, Decoding, and Rendering 6DOF Content from 3DOF Components | |
WO2021259686A1 (en) | A method and apparatus for encoding and decoding volumetric content in and from a data stream |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: MEDIATEK INC., TAIWAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIU, TSU-MING;CHANG, CHIH-KAI;JU, CHI-CHENG;AND OTHERS;REEL/FRAME:037470/0395; Effective date: 20150528 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |